MSc Thesis

Single Neuron Functional Analysis - Information Theoretical Approach

This thesis investigates the computational capabilities of individual neurons by analyzing their input-output relationships through information theory. Using mutual information and entropy as key metrics, the research quantifies how neuronal structure and biophysical properties influence a neuron's capacity to process and transmit information.
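
The two metrics named above can be illustrated with a minimal sketch (illustrative code, not taken from the thesis): Shannon entropy computed over non-overlapping binary "words" of a discretized spike train, one common way to quantify the variability of a neuron's output.

```python
import numpy as np

def word_entropy(spikes, word_len=3):
    """Shannon entropy (bits) over non-overlapping binary spike words.

    `spikes` is a binary array (1 = spike in a time bin). The train is cut
    into consecutive words of `word_len` bins, and entropy is estimated
    from the empirical word distribution (a simple plug-in estimator).
    """
    n = (len(spikes) // word_len) * word_len          # drop trailing partial word
    words = np.asarray(spikes[:n]).reshape(-1, word_len)
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# A perfectly regular train repeats one word, so it carries 0 bits per word;
# an irregular train spreads probability over many words and scores higher.
regular = np.tile([1, 0, 0], 20)
rng = np.random.default_rng(0)
irregular = rng.integers(0, 2, 60)
print(word_entropy(regular))     # 0 bits: every word is identical
print(word_entropy(irregular))   # > 0 bits: many distinct words
```

The word length and binning here are arbitrary illustrative choices; the thesis's actual estimators may differ.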

Author: Nitzan Luxembourg
Degree: Master of Science (MSc)
Field: Brain Sciences: Computation & Information Processing
Institution: The Hebrew University of Jerusalem
Year: 2020-2024

Key Research Components

  • Information theoretical analysis of single neuron computation
  • Computational modeling of Rat Layer 5b and Human Layer 2/3 pyramidal cells
  • Comparison of detailed biophysical models with simplified models and analogous artificial neural networks
  • Analysis of dendritic morphology and synaptic distribution effects
  • Classification of neurons on spectrum from "rigid" to "fluid" functionality

Research Areas

  • Information Theory
  • Single Neuron Modeling
  • Mutual Information
  • Computational Neuroscience
  • Neuronal Complexity
  • Neural Networks

Abstract

Individual neurons represent fundamental computational units in the brain, each capable of complex information processing through their intricate morphological and biophysical properties. This thesis employs information theory to quantify and understand how a neuron's structure—including dendritic morphology, synaptic distribution, and ion channel conductance—directly influences its computational capabilities.

The research utilizes detailed computational models of Rat Layer 5b and Human Layer 2/3 pyramidal cells, comparing them to simplified models and analogous Artificial Neural Networks.

Through systematic experiments modulating neuronal resting potential and synaptic strength, the study reveals how these factors significantly alter firing rate, entropy, and overall computational behavior. The research proposes a functional spectrum where neurons with high mutual information are "fluid" and adaptable for learning, while those with low mutual information are "rigid," providing stable representations.

This information-theoretic framework offers a novel classification system for neuronal function and provides fundamental insights into the diversity of computation in the brain. The core contribution demonstrates that a neuron's computational power can be effectively measured by the amount of information it transmits from inputs to output.
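
The "fluid vs. rigid" distinction can be sketched with a toy plug-in estimate of the mutual information I(X;Y) between a discretized input drive and a binary output (illustrative code with hypothetical names, not the thesis's estimator): an output that tracks its input transmits many bits, while an output insensitive to its input transmits none.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits for two discrete sample arrays."""
    xs, xi = np.unique(np.asarray(x), return_inverse=True)
    ys, yi = np.unique(np.asarray(y), return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (xi, yi), 1)                 # empirical joint counts
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)         # marginal P(X)
    py = joint.sum(axis=0, keepdims=True)         # marginal P(Y)
    nz = joint > 0                                # skip zero cells in the log
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
drive = rng.integers(0, 4, 5000)                  # discretized input intensity
fluid = (drive >= 2).astype(int)                  # output tracks the input
rigid = np.ones_like(drive)                       # output ignores the input
print(mutual_information(drive, fluid))           # ~1 bit: input split in half
print(mutual_information(drive, rigid))           # exactly 0 bits
```

In this caricature the "fluid" unit conveys about one bit per sample while the "rigid" unit conveys none; the thesis applies the analogous measurement to detailed biophysical neuron models rather than toy thresholds.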