Over the past few decades, the neurochemical dopamine has earned the reputation of being the brain’s reward molecule. This image is built on observations that when animals and humans experience surges of dopamine, they feel rewarded and motivated to pursue more of the experience or substance that triggered the release. Researchers theorize that this neurochemical lies at the root of an ancient system that evolved to make us feel gratified, and thus more likely to approach situations and objects that might satisfy our needs — anything from nutrition to more sophisticated desires for social approval or even money, a universal ticket for access to most resources.
But aside from rewarding sensations, there is evidence that elevated dopamine levels produce another interesting phenomenon: people become more impulsive. One group frequently affected by this is Parkinson’s disease patients, who suffer from a fundamental dopamine deficiency and are therefore treated with medications that either boost dopamine production (e.g., L-DOPA) or stimulate dopamine receptors much as natural dopamine release would (e.g., apomorphine). Clinicians and researchers have observed that, during the course of such treatment, patients can become prone to compulsive gambling, shopping, and eating habits. They are not the only ones: individuals exposed to substances like alcohol and cocaine, which temporarily enhance dopamine release in the brain, are also known for making rash choices.
One key similarity between the decisions of people under the influence of these recreational drugs and those on dopaminergic medications is an apparent lack of foresight, driven by a craving for short-term gratification. In fact, most animals, including humans, prefer instantaneous rewards to those that require waiting. Yet most of us constantly tread the line between giving in to the appeal of immediate rewards and working towards long-term goals. Of course, some people are motivated by short-term prospects more strongly than others. This raises the question of whether such individual variation might be explained by inherent differences in the dopamine systems of our brains.
So how might dopamine be involved when we weigh our options and make decisions? And how might this neurochemical influence our tendency to make impulsive choices, both throughout life and when we are under the influence of various substances? To understand this, let’s look at what we know about what excites dopamine-releasing cells. One brain region that researchers frequently use to examine this question is the ventral tegmental area (VTA), a collection of cells hidden deep within the brain that provides one of its main sources of dopamine.
The image below is taken from an experiment in which researchers recorded the activity of hundreds of dopaminergic neurons in the VTA of monkeys as they received rewards, such as droplets of juice or fruit morsels.
Decades ago, such recordings revealed that dopamine neurons are extremely sensitive to unexpected rewards, and that the release of dopamine from these cells is fundamental for encouraging the brain to learn about situations in which animals are likely to obtain useful rewards in the future. Importantly, dopaminergic activity increases as animals encounter rewards of greater magnitudes. This enables the neurochemical to send stronger teaching signals to the brain, shaping our fundamental tendency to be more motivated by large rewards.
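The “teaching signal” idea described above is often formalized as a reward prediction error: dopamine activity tracks the difference between the reward received and the reward expected. A minimal sketch of this standard model follows; the learning rate and reward values are illustrative choices, not numbers from the studies discussed.

```python
# Minimal sketch of a reward-prediction-error "teaching signal"
# (Rescorla-Wagner style update). Parameters are illustrative.

def update_value(expected, reward, alpha=0.1):
    """Return the prediction error and the updated expectation."""
    prediction_error = reward - expected          # dopamine-like signal
    expected = expected + alpha * prediction_error
    return prediction_error, expected

expected = 0.0
for trial in range(50):
    error, expected = update_value(expected, reward=1.0)

# After repeated pairings the reward becomes fully predicted,
# so the prediction error (the dopamine burst) shrinks toward zero.
print(round(expected, 3), round(error, 3))
```

Note that a larger `reward` produces a larger initial prediction error, matching the observation that dopaminergic activity scales with reward magnitude.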
But hold on a second. If I were to ask whether you’d prefer $50 handed to you right now or in two months, you would probably rather have the money this instant, right? A monkey would make the same choice if you replaced the money with juice. The fact that a two-month delay makes the $50 seem less enticing makes it clear that our decisions aren’t driven purely by the size of a reward. Instead, time is part of the equation that determines how much a reward is worth to us.
Whenever animals decide between possible actions, they need to take into account that different actions can lead to rewards which differ both in size and in the time-scale over which they are obtainable. Should the squirrel eat this acorn now, or perhaps bury it for later, hungrier times? In doing so, it would have to accept the risk that the acorn might be stolen in the meantime. Should the human being who held a handful of grain some 12,000 years ago have eaten it immediately, or perhaps buried it in the soil just in case it yielded a greater reward several months later (and started the agricultural revolution)? Who knows whether there might be a drought or a flood during the wait? Delaying can be risky, which is why the brain needs to account for the passage of time when we weigh our options. Accordingly, both our own experiences and experimental evidence point to the fact that the subjective values of rewards become reduced, or discounted, in accordance with how long we need to wait for them. This phenomenon, referred to as temporal discounting, is what makes all animals, including us, innately drawn towards immediate gratification, whose perceived value hasn’t been corroded by the anticipated passage of time. The responses of dopamine neurons also appear to discount rewards with time, which raises the possibility that their activity underpins the psychological phenomenon of discounting.
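One common way the behavioral literature formalizes temporal discounting is the hyperbolic model, in which subjective value falls off as value = amount / (1 + k × delay), with k capturing how steeply an individual discounts. The sketch below assumes this form and an illustrative k; neither is a specific claim from the studies described here.

```python
# Hyperbolic temporal discounting (a standard model from the behavioral
# literature). The discount rate k below is an illustrative assumption.

def discounted_value(amount, delay_days, k=0.01):
    """Subjective value of a reward after a given wait, hyperbolic model."""
    return amount / (1 + k * delay_days)

immediate = discounted_value(50, 0)    # $50 right now keeps its full value
delayed = discounted_value(50, 60)     # $50 after a two-month wait

print(immediate, round(delayed, 2))    # the delayed $50 feels worth less
```

The same formula with a larger k makes value collapse faster with delay, which is how a single parameter can summarize an individual’s impulsiveness.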
In one experiment, researchers recorded from a deep-brain cluster of dopamine cells in monkeys, with the goal of examining how their activity, and the resulting dopamine release, was affected by requiring the monkeys to delay the gratification of a tasty juice reward. So how can you tell a monkey how long it needs to wait? In this particular experiment, two images flashed up on the screen at the start of each trial, representing two different amounts of juice, and the monkey was required to look at one of these pictures to indicate which amount it preferred. Importantly, alongside these pictures, the screen also displayed an image signalling that the monkey’s choice would be followed by a certain waiting period. Given that the monkeys practiced this task for a shocking 20,000 trials (!) before the researchers even began recording from their dopamine cells, there were plenty of opportunities for them to learn to expect a particular waiting time for their chosen reward whenever they saw a particular image.
The experiment revealed that, as rewards were expected to arrive with increasing delays, the anticipatory activity of dopamine cells became weaker, dampening the release of the neurochemical. A monkey’s dopamine neurons, much like the monkey itself, are less excited when satisfaction needs to be delayed. Researchers have also examined how delaying rewards affects human brains, using functional magnetic resonance imaging (fMRI). This technique, which measures the oxygen-richness of blood flowing to various brain regions, is used to infer levels of activity across the brain: researchers assume that the more active a group of neurons is, the more rapidly it needs to be supplied with energy and oxygen through the bloodstream. Using fMRI, studies have found that when humans anticipate monetary rewards, the responses of several brain regions targeted by dopamine decline as money is offered at increasing delays.
Given the importance of dopamine for driving us to seek rewards, the system’s subdued reaction to delayed gratification might be at the root of why delays make rewards lose their appeal to us. Indeed, the same fMRI study discovered that the more an individual’s brain response deteriorated with longer reward delays, the more likely that person was to be impatient and decide on a short-term reward.
So can altering the workings of the dopamine system sway us away from being willing to wait and compel us to make impulsive choices? To answer this question, it’s important to figure out how to conduct a scientific investigation of impulsiveness. Experiments have approached this by asking participants, whether monkeys, humans, or rats, to make choices between immediate and delayed rewards. Of course, participants need to be asked the right questions: requiring them to simply choose between the same reward now versus later means they would invariably pick the quicker reward, making it impossible to quantitatively measure impulsiveness (since every decision would be the same).
Thus, experimental participants are usually asked to choose between smaller immediate rewards and larger delayed rewards. This creates situations in which decisions are no longer obvious, leaving room for variety of choice. For instance, would a person prefer to have $20 transferred to their bank account now, or $40 after a 6-month waiting period? How about if the $40 reward required a 12-month wait? To examine how delays reduce people’s estimates of how much rewards are worth, researchers look for the point at which a person is equally likely to choose the short-term option as the long-term one. For example, if a person is essentially indifferent between receiving $20 immediately and $80 in 12 months, we can reasonably assume that they regard these two rewards as equally valuable. For this person, delaying an $80 reward by 12 months apparently reduces the money to roughly 25% of its original subjective value, since this option seems no more or less attractive than the immediate $20. The point at which an individual becomes indifferent to their options thus tells us how a particular delay degrades the subjective value of a reward. Using this fact, researchers have measured indifference points for various delays to find just how much rewards become discounted as people need to wait increasingly longer. These points, which together make up a so-called temporal discount function, reveal ample diversity in levels of impulsiveness in the normal human population. Let’s look at three such functions obtained from three very different individuals.
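If we assume the common hyperbolic form of discounting (retained value = 1/(1 + k × delay)), a single indifference point is enough to solve for an individual’s discount rate k. A sketch using the $20-now versus $80-in-12-months example above; the hyperbolic form itself is an assumption, not a claim from the studies described.

```python
# Inferring a hyperbolic discount rate k from an indifference point.
# If $20 now feels equal to $80 in 12 months, the delayed reward retains
# 25% of its value: 0.25 = 1 / (1 + k * 12). We solve this for k.

def discount_rate_from_indifference(immediate, delayed, delay):
    """Solve immediate/delayed = 1 / (1 + k * delay) for k."""
    fraction = immediate / delayed      # fraction of value retained
    return (1 / fraction - 1) / delay

k = discount_rate_from_indifference(20, 80, delay=12)  # delay in months
print(k)  # 0.25 per month for this hypothetical individual
```

Repeating this at several delays yields the set of points that make up a person’s temporal discount function.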
The temporal discount function in the middle (AVERAGE) shows a relatively steady decline in the subjective values of rewards that require increasingly longer waits. We can see that if a reward is only available after 180 days (last point on the plot), it degrades to roughly 50% of the original value it had when it was available immediately. This means that this particular individual is no more likely to prefer getting $100 transferred to their bank account 180 days from now than getting $50 this instant. This level of temporal discounting appears to be representative of the average human being.
On the other hand, the temporal discount function on top reveals a remarkably patient individual, for whom the value of a reward is virtually unaffected by delay. This person’s strategy is extremely focused on long-term gains, as their decisions are fundamentally driven by greater rewards, irrespective of waiting time.
Finally, the function on the bottom shows an individual who is virtually incapable of deferring satisfaction, even if this means missing out on substantially greater rewards later. Based on this person’s extremely steep discount function, we can see that postponing a reward by a month degrades it to only 10% of its original value. Clearly, this person feels that waiting is rarely worthwhile. Though most of us would find it surprising, and perhaps unreasonable, this individual would be no more likely to pick a $100 bank transfer 180 days from now than $10 handed to them immediately. It’s safe to say that this person is very impulsive, consistently preferring short-term gratification over the prospect of long-term reward. In fact, this person isn’t so different from someone under the influence of drugs that enhance dopamine release, as researchers have found that such substances fundamentally alter the rate at which delayed rewards lose their subjective value.
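Assuming the same hyperbolic form (retained fraction = 1/(1 + k × delay)), the two extreme profiles just described imply very different discount rates. The sketch below plugs in the numbers quoted above; the model itself is an assumption for illustration.

```python
# Comparing discount rates implied by the article's example numbers,
# under a hyperbolic discounting assumption: retained = 1 / (1 + k * delay).

def k_from_retained(fraction, delay_days):
    """Solve fraction = 1 / (1 + k * delay_days) for k (per day)."""
    return (1 / fraction - 1) / delay_days

k_average = k_from_retained(0.50, 180)    # 50% of value left at 180 days
k_impulsive = k_from_retained(0.10, 30)   # 10% of value left at 30 days

# The impulsive individual's k is dozens of times larger than average.
print(round(k_average, 4), round(k_impulsive, 2))
```

A single number per person, estimated this way, is what lets researchers compare impulsiveness across individuals and across drug conditions.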
This was observed in an experiment in which healthy human participants were asked to choose between short-term and long-term rewards shortly after being given a single dose of L-DOPA, a drug used to boost dopamine production in patients with Parkinson’s disease. Remarkably, for several hours after consuming L-DOPA, participants became much heavier discounters of delayed rewards and shifted their preferences towards short-term rewards, compared to when they were not under the influence of the drug and had simply been given multivitamin pills. In essence, making these individuals wait for their monetary rewards made the money lose its subjective value faster than when dopamine levels were not given a boost.
Importantly, participants didn’t make their decisions any quicker than normal. This tells us that they became more impulsive not because they just didn’t think properly before deciding, but because they did think about it and still found the immediate rewards more appealing. This finding says something fundamental about why drugs which enhance dopamine release can cause us to behave as if we have little foresight — it appears that they make our brains react as if rewards which require some time are simply not worth the wait.
Of course, dopamine-enhancing drugs are far from the only driving force behind impulsive behavior. The three temporal discount functions above reveal just how widely humans can vary in their impulsive tendencies, although I don’t need to cite any studies to convince you of this. Researchers have some evidence that intrinsic differences in impulsiveness could, to some extent, be explained by stable differences in the workings of various components of the dopamine system. Particularly important are the D2 receptors. These receptors are quite special, as they are located both on the dendrites of postsynaptic cells and on the axon terminals of presynaptic cells. Sitting on the terminals of dopamine-releasing neurons allows these receptors to detect nearby concentrations of the neurochemical and continuously adjust the amount of dopamine being released by the cell, thus preventing neurochemical build-up. This makes D2 receptors essential for keeping a lid on dopamine release in the brain.
Given their importance, it may be unsurprising that natural variation in the function of these receptors can influence the tendency to make impulsive choices. This might in fact be the case for intrinsically impulsive individuals, as researchers have found that their brains have either fewer or less functional D2 receptors, which is somewhat equivalent to having rusty brakes on dopamine release. Accordingly, the same research reported that impulsive brains release disproportionately high amounts of dopamine in response to natural rewards or dopamine-enhancing drugs, such as cocaine or amphetamine. Interestingly, similarly reduced D2 receptor function and inflated dopamine responses have been observed in the brains of impulsive lab rats.
The situation appears even more exaggerated in individuals with psychopathy, a personality disorder characterized by extremely manipulative, remorseless, and impulsive behavior that drives individuals to seek gratification at all costs. The reputation that psychopaths have for being impulsive goes hand in hand with the observation that they discount rewards with time especially harshly. Of course, psychopathy is a complex disorder rooted in multiple biological disruptions. Having said that, we have reason to believe that this pathologically impulsive behavior might be rooted in a hypersensitive dopamine system. Researchers have found that psychopaths who score highest on impulsivity measures release abnormally high quantities of dopamine when anticipating rewards or consuming drugs. It’s possible that these individuals behave impulsively because their dopamine release in response to instantaneous rewards is so strong that very few delayed rewards could compare.
In truth, researchers are far from understanding how dopamine cells gain access to information which allows them to regulate how the brain perceives the values of rewards. Where do we even begin trying to understand how and where the brain represents time? In the past, researchers have proposed that the brain might be clocking time using certain neurons which behave like pacemakers, ‘ticking’ rhythmically and accumulating information about the passage of time. However, such a pacemaker would likely only allow the brain to register time over the briefest timescales, ranging from milliseconds to seconds. This would be of little help in situations where our choices rely on the ability to comprehend weeks’, months’, or even years’ worth of time! Researchers are, unfortunately, far from figuring out how our neurons represent such vast stretches of time. But based on the evidence we do have, it’s undeniable that time is an essential component of most decisions our brains ever need to make.
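The pacemaker-accumulator proposal mentioned above can be sketched as a counter that accumulates noisy ‘ticks’ and reads out elapsed time. The tick rate and noise level below are arbitrary assumptions, chosen only to show why such a mechanism works for seconds but degrades over long intervals.

```python
# Sketch of a pacemaker-accumulator timing model: a unit "ticks" at some
# rate and a counter accumulates (noisy) ticks as an estimate of elapsed
# time. Tick rate and noise are illustrative assumptions.

import random

def estimate_interval(true_seconds, tick_rate_hz=10.0, noise=0.1, seed=0):
    """Accumulate noisy ticks over an interval, then read out elapsed time."""
    rng = random.Random(seed)
    ticks = 0.0
    n_steps = int(true_seconds * tick_rate_hz)
    for _ in range(n_steps):
        # each expected tick registers with some jitter in its contribution
        ticks += 1 + rng.gauss(0, noise)
    return ticks / tick_rate_hz   # convert accumulated ticks back to seconds

print(round(estimate_interval(3.0), 2))  # noisy estimate near 3 seconds
```

Because the jitter accumulates with every tick, the absolute error of such a counter grows with the interval being timed, which is one intuition for why a pure pacemaker scheme struggles with delays of weeks or months.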
Written by Sofia Deleniv.
Buckholtz, J. W. et al. (2010). Dopaminergic network differences in human impulsivity. Science 329. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3161413/
Buckholtz, J. W. et al. (2010). Mesolimbic dopamine reward system hypersensitivity in individuals with psychopathic traits. Nature Neuroscience 13, 419-421. http://www.nature.com/neuro/journal/v13/n4/full/nn.2510.html
Carlezon, W. A. and Chartoff, E. H. (2007). Intracranial self-stimulation (ICSS) in rodents to study the neurobiology of motivation. Nature Protocols. 2, 2987–2995. http://www.nature.com/nprot/journal/v2/n11/pdf/nprot.2007.441.pdf
D’Ardenne, K. et al. (2008). BOLD responses reflecting dopaminergic signals in the human ventral tegmental area. Science 319, 1264-1267. http://science.sciencemag.org/content/319/5867/1264.full.pdf
Dougherty, D. M. et al. (2000). Effects of moderate and high doses of alcohol on attention, impulsivity, discriminability, and response bias in immediate and delayed memory task performance. Alcoholism: Clinical and Experimental Research 24, 1702-1711. http://onlinelibrary.wiley.com/doi/10.1111/j.1530-0277.2000.tb01972.x/abstract
Fibiger, H. C. et al. (1987). The role of dopamine in intracranial self-stimulation of the ventral tegmental area. Journal of Neuroscience 7, 3888-3895. http://www.jneurosci.org/content/7/12/3888.full.pdf
Kable, J. W. and Glimcher, P. W. (2007). The neural correlates of subjective value during intertemporal choice. Nature Neuroscience 10, 1625-1633. http://www.nature.com/neuro/journal/v10/n12/full/nn2007.html
Kobayashi, S. and Schultz, W. (2008). Influence of reward delays on responses of dopamine neurons. Journal of Neuroscience 28, 7837-7846. http://www.jneurosci.org/content/28/31/7837.full.pdf+html
Kohls, G. et al. (2013). The nucleus accumbens is involved in both the pursuit of social reward and the avoidance of social punishment. Neuropsychologia 51, 2062-2069. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3799969/pdf/nihms512175.pdf
Mendez, I. A. et al. (2010). Self-administered cocaine causes long-lasting increases in impulsive choice in a delay discounting task. Behavioral Neuroscience 124, 470-477. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2976632/pdf/nihms248427.pdf
Pine, A. et al. (2010). Dopamine, time, and impulsivity in humans. Journal of Neuroscience 30, 8888-8896. http://www.jneurosci.org/content/30/26/8888.full
Richards, J. B. et al. (1997). Determination of discount functions in rats with an adjusting-amount procedure. Journal of the Experimental Analysis of Behavior 67, 353-366.
Rodriguez, M. L. and Logue, A. W. (1988). Adjusting delay to reinforcement: comparing choice in pigeons and humans. Journal of Experimental Psychology 14, 105-117.
Saddoris, M. P. et al. (2015). Mesolimbic dopamine dynamically tracks, and is causally linked to, discrete aspects of value-based decision making. Biological Psychiatry 77, 903-911. http://www.sciencedirect.com/science/article/pii/S0006322314008336
Sato, A. (2008). Temporal and probability discounting in individuals with antisocial personality disorder traits. Japanese Journal of Personality 17, 50-59. https://www.jstage.jst.go.jp/article/personality/17/1/17_1_50/_article
Schultz, W. (1998) Predictive reward signal of dopamine neurons. Journal of Neurophysiology 80, 1-27. http://jn.physiology.org/content/jn/80/1/1.full.pdf
Schultz, W. (2010). Dopamine signals for reward value and risk: basic and recent data. Behavioral and Brain Functions 6:24. http://behavioralandbrainfunctions.biomedcentral.com/article/10.1186/1744-9081-6-24
Simon, N. W. et al. (2007). Cocaine exposure causes long-term increases in impulsive choice. Behavioral Neuroscience 121, 543-549.
Uhl, G. (2007). Premature poking: impulsivity, cocaine and dopamine. Nature Medicine 13, 413-414. http://www.nature.com/nm/journal/v13/n4/full/nm0407-413.html
Vezina, P. et al. (2002) Sensitization of midbrain dopamine neuron reactivity promotes the pursuit of amphetamine. Journal of Neuroscience 22, 4654-4662. http://www.jneurosci.org/content/22/11/4654.full.pdf
Vytlacil, J. et al. (2014). An approach for identifying brainstem dopaminergic pathways using resting state functional MRI. PLoS ONE 9(1). http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0087109
Weintraub, D. (2009). Impulse control disorders in Parkinson’s disease: prevalence and possible risk factors. Parkinsonism & Related Disorders 15, 110-113. http://www.ncbi.nlm.nih.gov/pubmed/20082968
Wittmann, M. and Paulus, M. P. (2007). Decision making, impulsivity and time perception. Trends in Cognitive Sciences. http://koso.ucsd.edu/~martin/WittmannTimeReview2008.pdf
Images made by Jooyeun Lee.