Knowing Neurons
Big Ideas

Learning from Disorder: The Paradox of Information in the Brain

In Dante’s Inferno, the fifth circle of Hell is a place where the wrathful fight each other for eternity.  Similarly, I often consider YouTube comments to be an extracanonical circle of Hell where the trolls fight each other for eternity.  You might, then, imagine my surprise when I found many thoughtful comments expressing wonder and intrigue on a YouTube video of brain activity in a zebrafish.  In the video, which showcases a cutting-edge brain imaging technique, neurons throughout the brain of the fish light up like fireflies and dance with purpose. At one point in the video, the entire brain glows with a brilliant glimmer.  “I think the zebrafish had an epiphany at 0:19,” one commenter writes. “I think the fish might have gotten startled,” another person chimes in.  “Just imagine how a human brain would look using this same technique,” says another commenter, “the moment of first love, fear, or anguish, it would all seem like a continuum of electrical stimulus, but each event gradually carving out our personality and perception.”

Does a lightbulb moment really look like a lightbulb in the brain?  Does thinking harder or experiencing deep emotions like love, fear, or anguish light up more neurons?  If the meme about only using 10 percent of our brains were true, would using 100 percent of our brain logically result in a stroke of genius?

Probably not.

To investigate how cells in the brain light up in response to learning, a group led by Takaki Komiyama used a similar imaging technique to study neurons in mice.  Continuing a line of inquiry started by the late Walter J. Freeman III, Komiyama and colleagues imaged the part of the brain that responds to smells — known as the olfactory bulb — as the mice learned odors.  While Freeman had performed similar experiments with electrodes measuring voltage, this group visualized the activity of the olfactory bulb with a fluorescent dye that glows when calcium rushes into a neuron.  Using this technique, Komiyama’s group observed the activity of mitral cells — excitatory cells that send odor information to the cerebral cortex — and granule cells — inhibitory cells with local connections.

Mitral cells receive direct input from sensory neurons and relay output concerning odors to the olfactory cortex.  Granule cells modulate mitral cells through local, inhibitory connections.  (Image adapted from Sakamoto et al., 2014.)

When a mouse was exposed to a new odor, all mitral cells lit up in a nearly unanimous glow.  However, after several days of odor learning, only a few mitral cells ultimately responded to the odor.  Even stranger, if the mouse was put under anesthesia, virtually all mitral cells again responded in unison, despite the mouse being unconscious.  Inhibitory granule cells, which do not send information directly to the cortex, showed the opposite pattern of activity to mitral cells: novel odors and odors presented under anesthesia elicited little response, whereas odor learning gave rise to unanimous activity.

It seems like a paradox: why do cells that carry odor information to the cortex respond less as a group to odors the mouse has learned and more to odors that the mouse does not recognize?  In the phantasmagoria of the brain, shouldn’t ignorance be dark and knowledge bright?

Enter information theory. Once regarded as nebulous and subjective, information is now treated by scientists as a fundamental quantity, like mass or energy.  This revolution, which began with the first information theorist, Claude Shannon, defines information as a reduction in uncertainty.  Imagine you are playing hangman with a friend.  Not knowing the word your friend has picked, you might play it safe and choose a vowel.  But since vowels are common, learning that there is an “e” in your friend’s word doesn’t offer many clues.  Suppose instead you take a risk and guess a rare letter like “z” or “x.”  Discovering an “x” narrows down the possible words considerably!  For this reason, the letter “x” carries more information than a vowel like “e.”

To quantify this reduction in uncertainty, Shannon measured information in bits by considering the probability that a state — such as a letter or a Chinese character — might occur.  Because there are many more characters in Chinese script than there are letters in the English alphabet, each Chinese character is relatively rare and carries more information.  By contrast, a fair coin, with two equally likely states — heads and tails — carries exactly one bit of information, like the answer to a yes or no question.  Importantly, signals can carry information without meaning: information theory considers only probability, not semantics.
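Shannon’s measure of the information carried by a single outcome is its surprisal, the negative base-2 logarithm of its probability.  A minimal sketch of the hangman intuition, using assumed, approximate English letter frequencies for illustration:

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon information (surprisal) of an outcome with probability p, in bits."""
    return -math.log2(p)

# A fair coin flip: two equally likely outcomes, so exactly 1 bit.
print(surprisal_bits(0.5))      # 1.0

# Assumed approximate English letter frequencies:
# 'e' is common (~12.7%), 'x' is rare (~0.15%).
print(surprisal_bits(0.127))    # ~3 bits
print(surprisal_bits(0.0015))   # ~9.4 bits: the rare letter is far more informative
```

Note that only the probabilities enter the calculation; the letters themselves, and what they mean, play no role.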


To carry information, a brain must be able to select from a large repertoire of possible states.  A brain whose neurons all fire in unison is like an alphabet with only two letters: on (firing) and off (not firing).  Indeed, a loss of consciousness coincides with many neurons acting in unison during sleep, anesthesia, and seizures, reducing the informational content of the brain.  By contrast, when only a small number of neurons respond to a stimulus, they form a rare state and reduce uncertainty much like the letter “x” in the game of hangman.  Over the course of learning an odor, inhibitory granule cells appear to sculpt such rare states from the distributed activity of mitral cells by silencing all but a select few.  This strategy is known as sparse coding, because information is encoded with few action potentials.  Sparse coding saves the brain energy and might seem to contradict Freeman’s observation that meaning is conveyed over large spatial patterns of activity.  Yet, sparse coding does not mean that silent cells are unimportant!  If we ignore the silent cells, we are left with a smaller alphabet and information is lost.  An even larger repertoire of states is allowed if we consider four dimensions, three of space and one of time.  In further agreement with Freeman’s findings, Komiyama’s group noted that mitral cell activity changes over a short interval of time immediately following the odor sniff.
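The repertoire argument above can be made concrete by counting states.  A sketch, assuming a hypothetical population of 100 mitral cells: if all cells fire in lockstep, the population has only two states, but a sparse code in which exactly k cells respond has “100 choose k” possible states.

```python
import math

N = 100  # hypothetical population of 100 mitral cells

# All cells firing in unison: the population is either "on" or "off" -- 2 states, 1 bit.
unison_bits = math.log2(2)

# A sparse code with exactly k active cells: "N choose k" distinguishable states.
def sparse_code_bits(n: int, k: int) -> float:
    return math.log2(math.comb(n, k))

print(unison_bits)                # 1.0
print(sparse_code_bits(100, 5))   # ~26 bits from only 5 active cells
print(sparse_code_bits(100, 50))  # ~96 bits: the repertoire peaks near half active
```

Even a handful of active cells, drawn from a large population, carries far more information than the whole population glowing at once — and ignoring the silent cells (shrinking N) shrinks the repertoire.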

Written alphabets are structured, and information generally implies order.  Yet Shannon named his measure of information entropy, a term that means disorder.  Because disordered systems can be arranged in a larger number of ways than ordered systems, they have a larger repertoire of states and carry more information.  Indeed, mitral cells progress from a highly ordered state — all cells in agreement — to a highly disordered state — cells responding disharmoniously — over the course of learning.  Similarly, your bedroom probably drifts from a tidy, organized state to a disheveled mess with time.  In fact, the second law of thermodynamics tells us that the disorder of any closed system increases with time unless work is done to organize it.  In some cases, however, a passive process of collecting information about parts of a system can be used to organize it, rather than an active process of doing work (Footnote 1).  This seems to undermine the second law!  Imagine organizing a large kennel of dogs by breed by passively opening a partition in the kennel rather than actively chasing individual dogs.  The epiphany, however, that collecting information itself creates disorder elsewhere allowed twentieth-century physicists to breathe a huge, collective sigh of relief: the second law has not been violated.
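The link between disorder and information can be seen directly in Shannon’s formula, H = −Σ p·log₂(p).  A minimal sketch, comparing a perfectly ordered ensemble (one certain state) to a maximally disordered one (all states equally likely):

```python
import math

def entropy_bits(probs) -> float:
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Ordered: every cell responds the same way, so one certain state -- zero entropy.
print(entropy_bits([1.0]))        # 0.0

# Disordered: four equally likely response patterns -- maximal entropy, 2 bits.
print(entropy_bits([0.25] * 4))   # 2.0
```

The certain, orderly ensemble carries no information at all; spreading probability across many states is exactly what makes a signal informative.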

The psychologist Carl Jung once said,

“In all chaos there is a cosmos, in all disorder a secret order.”

Likewise, in all order there is a void, a great dearth of information.  As neuroscience becomes increasingly driven by visual images that galvanize funding, the public takes delight in flashy videos of the brain lighting up.  But for the zebrafish we visited at the beginning of this article, her brightest moment may have also been her dimmest.  Order is simple and easy to accept.  What takes courage is to embrace the chaos and accept disorder.



Edelman, Gerald M., and Giulio Tononi. A universe of consciousness: How matter becomes imagination. Basic Books, 2000.

Keller, Philipp J., and Misha B. Ahrens. “Visualizing whole-brain activity and development at the single-cell level using light-sheet microscopy.” Neuron 85.3 (2015): 462-483.

Kato, Hiroyuki K., et al. “Dynamic sensory representations in the olfactory bulb: modulation by wakefulness and experience.” Neuron 76.5 (2012): 962-975.

Mitchell, Melanie. Complexity: A guided tour. Oxford University Press, 2009.

Sakamoto, Masayuki, Ryoichiro Kageyama, and Itaru Imayoshi. “The functional significance of newly born neurons integrated into olfactory bulb circuits.” Frontiers in Neuroscience 8 (2014): 121.



  1. The physicist James Clerk Maxwell once asked how a demon might be able to “cheat” by organizing a chamber of gas into hot and cold regions through a passive process of opening a partition in the chamber rather than an active process of doing work on individual gas molecules.  The answer?  By collecting information about gas molecules before they cross the partition, the demon dissipates energy, sowing further disorder.


  • Joel Frohlich

    Joel Frohlich is a postdoc studying consciousness in the lab of Martin Monti at UCLA. He is interested in using brain activity recorded with EEG to infer when a person is conscious. Joel earned his PhD from UCLA in 2018 studying EEG markers of neurodevelopmental disorders in the lab of Shafali Jeste. You can also check out Joel's blog Consciousness, Self-Organization, and Neuroscience on Psychology Today. For more about Joel's research and writing, please visit Joel's website at



2 thoughts on “Learning from Disorder: The Paradox of Information in the Brain”

  • Anonymouse

    This is a fun article. If you wanted to follow this with a really great one, you could think about how this relates to the brain’s development over time, in particular to how changes in brain activity and morphology have been interpreted as signs of cognitive decline, even though a very different conclusion should be drawn from the available evidence – here are two cool papers that could get you started:

    Ramscar, Michael, et al. “The myth of cognitive decline: Non‐linear dynamics of lifelong learning.” Topics in cognitive science 6.1 (2014): 5-42.

    Ramscar, Michael, et al. “Learning is not decline: The mental lexicon as a window into cognition across the lifespan.” The Mental Lexicon 8.3 (2013): 450-481.

  • Really great article.

    An open question: does unison represent greater harmony? Socrates thought that harmony could allow no dissonance — but Aristotle called this “the error of Socrates.” Confucius, too, was very careful to declare that harmony was not “sameness.” Entropy and harmony are not two sides of the same axis…
