Engineering Theory of Neuronal Shape
Thursday, April 24, 2008 - 9:30am - 10:10am
Dmitri Chklovskii (Cold Spring Harbor Laboratory)
The human brain is a network of roughly a hundred billion neurons, each communicating with several thousand others. Because the wiring for neuronal communication draws on limited space and energy resources, evolution had to optimize their use. This principle of minimizing wiring cost explains many features of brain architecture, including the placement and shape of many neurons. However, the shapes of some neurons and their synaptic properties remained unexplained. This led us to a second principle, maximization of the brain's ability to store information, which can be expressed as maximization of entropy. Combining the two principles, in a manner analogous to the minimization of free energy in statistical physics, provides a systematic view of brain architecture, necessary to explain brain function.
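The free-energy analogy in the closing sentence can be written schematically as follows; the cost functional and the trade-off parameter are illustrative assumptions, not notation from the talk:

```latex
% Statistical physics: equilibrium minimizes the free energy
%   F = E - T S  (energy minus temperature times entropy).
% Analogously, an optimal brain architecture might minimize
%   \mathcal{F} = C_{\mathrm{wiring}} - \lambda\, S,
% where C_{\mathrm{wiring}} is the total wiring cost,
% S is the entropy (information-storage capacity),
% and \lambda > 0 sets the trade-off between the two principles.
\[
  \text{architecture}^{*}
  \;=\;
  \arg\min_{\text{architecture}}
  \Bigl( C_{\mathrm{wiring}} - \lambda\, S \Bigr)
\]
```

Here minimizing wiring cost alone recovers the first principle, and in the limit of negligible wiring cost the objective reduces to entropy (capacity) maximization, the second principle.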