Posts Tagged ‘Dopamine’

Cheap? Yes. Fake? Not at all.  It’s another genetic study of the placebo effect, and it highlights the fact that our brains are not static input-output machines built from scratch according to a genetic blueprint.  Rather, what we expect and believe matters … a lot.

How does it work?  Nobody knows for sure, but dopamine has been implicated in the synaptic mechanisms that register the fulfillment or violation of expectations.  For example, if you believe a certain something will happen … and something better happens, your brain produces a burst of dopamine.  If something worse happens, you get a dip in dopamine.  Your expectations and beliefs influence your dopamine levels.
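This fulfillment-or-violation logic is often formalized as a reward prediction error, the update rule at the heart of temporal-difference models of dopamine signaling.  A minimal sketch (illustrative numbers, not from any study):

```python
# Reward prediction error: a dopamine-like signal = outcome - expectation.
# The expectation is then nudged toward the outcome (a Rescorla-Wagner-style update).

def prediction_error(expected, outcome):
    """Positive when the outcome beats expectation (dopamine burst),
    negative when it falls short (dopamine dip)."""
    return outcome - expected

def update_expectation(expected, outcome, learning_rate=0.1):
    """Move the expectation a fraction of the way toward the outcome."""
    return expected + learning_rate * prediction_error(expected, outcome)

expected = 0.5                                # believe a modest reward is coming
burst = prediction_error(expected, 1.0)       # better than expected -> +0.5
dip = prediction_error(expected, 0.0)         # worse than expected  -> -0.5
expected = update_expectation(expected, 1.0)  # expectation creeps upward
```

Over repeated trials the expectation converges on the typical outcome, and the prediction error (the “dopamine signal” here) shrinks toward zero.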

Apparently, some of us metabolize dopamine faster than others, which may be related to a weaker vs. stronger placebo response.  For example, my rs4680 GG fast-metabolizing genotype suggests that the “medicine in my mind” is not very strong.  But, on the other hand, I do watch A LOT of Grey’s Anatomy.

Read Full Post »

Political power must feel pretty good … especially if you have deep-seated personal insecurities and can conveniently use the notoriety of your office to indulge in a sense of superiority and vanity.  Many brain systems develop slowly during childhood, and traits like inflated ego, interpersonal hostility and impulsivity can emerge very early in development.  Instantaneous electronic “boner-to-picture-to-internet” hand-held technology just makes it that much easier to get busted once you’ve become a full-grown asshole.

Here’s a small insight into how this unfortunate developmental pathway might unfold … from a small-scale genetic study of variation in an intracytoplasmic loop of the dopamine D4 receptor (DRD4) and its relationship to infidelity:

[DRD4] 7R+ individuals exhibit augmented anticipatory desire response to stimuli signaling dopaminergic incentives, such as food, alcohol, tobacco, gambling, and opiates. Although it is as yet speculative, these associations suggest that 7R+ individuals may allocate greater attention to appetitive rewards, contributing to the behavioral differences in promiscuity and infidelity observed here.

Neither the first nor the last gene-Twitter interaction to have gone badly for someone …

More on the DRD4 and social bonding genes

Read Full Post »


Hands shake and wobble as the decades pass … more so in some.

A recently evolved “T” allele (rs12720208) in the 3′ untranslated region (3′ UTR) of the FGF20 gene has been implicated in the risk of Parkinson’s Disease … namely by creating a wobbly G:U base pair between microRNA-433 (miR-433) and the FGF20 transcript.  Since the normal function of miR-433 is to repress translation of proteins such as FGF20, it is suspected that carriers of the PD-risk “T” allele make relatively more FGF20 … which, in turn, leads to the production of higher levels of alpha-synuclein (the main component of Lewy body fibrils, a pathological marker of diseases such as PD).  This newly evolved T allele has also been associated with brain structural differences in healthy individuals.
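The base-pairing logic is simple enough to sketch: RNA duplexes tolerate G:U wobble pairs, but a wobble is thermodynamically weaker than a Watson-Crick pair, so a single C-to-U change in the target can loosen miRNA binding.  A toy classifier along these lines (the sequences below are hypothetical, not the actual miR-433 binding site):

```python
# Classify each base pair in an RNA:RNA duplex as Watson-Crick ("WC"),
# G:U wobble ("wobble"), or mismatch.  miRNA is read 5'->3' against a
# target read 3'->5', so opposing bases line up positionally.

WATSON_CRICK = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G")}
WOBBLE = {("G", "U"), ("U", "G")}

def pair_type(mirna_base, target_base):
    if (mirna_base, target_base) in WATSON_CRICK:
        return "WC"
    if (mirna_base, target_base) in WOBBLE:
        return "wobble"
    return "mismatch"

def duplex_profile(mirna, target):
    """Classify every opposing base pair along the duplex."""
    return [pair_type(m, t) for m, t in zip(mirna, target)]

# A C->U change in the target (the transcribed PD-risk "T" allele)
# turns one strong G:C pair into a weak G:U wobble.
mirna    = "GGAGC"   # hypothetical miRNA fragment, 5'->3'
target_C = "CCUCG"   # "C" allele target, 3'->5': all Watson-Crick
target_U = "CCUUG"   # "T" (U in RNA) allele target: one wobble pair
print(duplex_profile(mirna, target_C))
print(duplex_profile(mirna, target_U))
```

With the “C” allele every position pairs Watson-Crick; with the “U” allele one position drops to a wobble, weakening the duplex just as described above.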

My hands will shake and wobble as the decades pass … but not because I carry the G:U wobble pairing between miR-433:FGF20.  My 23andMe profile shows that I carry 2 C alleles and will produce the thermodynamically favorable G:C pairing.  Something to keep in mind as I lose my mind in the decades to come.


Read Full Post »

Perhaps someday, but it’s complicated.  This is because the brain is not a simple input-output device.  If you step on a thumbtack, it hurts … but it can hurt more if you are feeling sad and lonely, and much less if you are in love and just won the lottery.  Expectations and memories matter, and so our genotype reflects the development of the brain systems used for processing emotions, memories and expectations (which is to say, um, pretty much the whole brain).

This paper explored this question using a shoulder-exercise soreness assay and the COMT genotype, and found that:

Participants that endorsed cognitions consistent with pain catastrophizing and had a genetic predisposition to low COMT enzyme activity had significantly higher pain intensity and pressure pain ratings when compared with groups with 1 or no risk factors.

“Pain catastrophizing” is a measure of how much a person ruminates (is unable to suppress or divert attention away from pain-related thoughts), focuses on and exaggerates the threat value of painful stimuli, and/or feels helpless and unable to cope with the adversity of painful stimuli.  It may be the most important aspect of coping with pain … an understanding that your perspective modulates your pain.
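The comparison against “groups with 1 or no risk factors” amounts to a simple additive risk score over two factors: high catastrophizing and a low-activity COMT genotype.  A sketch of how such binning might look (the cutoff and field names here are hypothetical, not the paper’s actual scoring):

```python
# Bin participants by how many of two risk factors they carry:
# (1) a high pain-catastrophizing score, (2) a low-activity COMT genotype.

HIGH_CATASTROPHIZING_CUTOFF = 20   # hypothetical scale cutoff
LOW_ACTIVITY_GENOTYPES = {"AA"}    # rs4680 A (met) allele = lower COMT activity

def risk_factor_count(catastrophizing_score, rs4680_genotype):
    count = 0
    if catastrophizing_score >= HIGH_CATASTROPHIZING_CUTOFF:
        count += 1
    if rs4680_genotype in LOW_ACTIVITY_GENOTYPES:
        count += 1
    return count

participants = [
    {"score": 25, "genotype": "AA"},  # both risk factors
    {"score": 25, "genotype": "GG"},  # one risk factor
    {"score": 5,  "genotype": "GG"},  # no risk factors
]
groups = [risk_factor_count(p["score"], p["genotype"]) for p in participants]
print(groups)  # [2, 1, 0]
```

Pain ratings would then be compared across the 0-, 1-, and 2-risk-factor groups, which is exactly the contrast quoted above.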

This may be worth noting given the  “dramatic increase in accidental deaths associated with the use of prescription opioids and also an increasing average daily morphine equivalent dose” despite the finding that “there is no clear evidence that long-term opiate therapy for chronic back pain is efficacious”.

Read Full Post »


Everyone has a birthday, right?  It’s the day you (your infant self) popped into the world and started breathing.  But what about the day “you” were born – that is, “you” in the more philosophical, Jungian, spiritual, social, etc. kind of way – when you became aware of being in some ways apart from others and the world around you?  In her 1997 paper, “The Basal Ganglia and Cognitive Pattern Generators”, Professor Ann Graybiel writes,

The link between intent and action may also have a quite specific function during development. This set of circuits may provide part of the neural mechanism for building up cognitive patterns involving recognition of the self. It is well documented that, as voluntary motor behaviors develop and as feedback about the consequences of these behaviors occurs, the perceptuomotor world of the infant develops (Gibson 1969). These same correlations among intent, action, and consequence also offer a simple way for the young organism to acquire the distinction between actively initiated and passively received events. As a result, the infant can acquire the recognition of self as actor. The iterative nature of many basal ganglia connections and the apparent involvement of the basal ganglia in some forms of learning could provide a mechanism for this development of self-awareness.

As Professor Graybiel relates the “self” to function in the basal ganglia and the so-called cortico-basal-ganglia-thalamic loops – a set of parallel circuits that help to properly filter internal mental activity into specific actions and executable decisions – I got a kick out of a paper that describes how development of the basal ganglia can go awry for cells that are born at certain times.

Check out the paper, “Modular patterning of structure and function of the striatum by retinoid receptor signaling” by Liao et al.  It reveals that mice that lack a certain retinoic acid receptor gene (RARbeta) show defective neurogenesis in the late-born cells that make up a part of the basal ganglia (the striatum) known as striosomes.  Normally, the authors say, retinoic acid helps to expand a population of late-born striosomal cells, but in the RARbeta mutant mice the rostral striosomes remain under-developed.  When given dopaminergic stimulation, these mutant mice showed slightly less grooming and more stereotypic behaviors.

So when was “my self’s” birthday?  Was it when these late-born striosomal cells were, umm, born?  Who knows, but I’m glad my retinoic acid system was intact.


Read Full Post »

DON’T tell the grant funding agencies, but, in at least one way, the effort to relate genetic variation to individual differences in cognitive function is a totally intractable waste of money.

Let’s say we ask a population of folks to perform a task – perhaps a word memory task – and then we use neuroimaging to identify the areas of the brain that (i) were associated with performance of the task, and (ii) were not only associated with performance, but were also associated with genetic variation in the population.  Indeed, there are already examples of just this type of “imaging-genetic” study in the literature.  Such studies form a crucial translational link in understanding how genes (whose biochemical functions are most often studied in animal models) relate to human brain function (usually studied with cognitive psychology). However, do these genes relate to just this task? What if subjects were recalling objects? or feelings?  What if subjects were recalling objects / experiences / feelings / etc. from their childhoods?  Of course, there are thousands of common cognitive operations one’s brain routinely performs, and, hence, thousands of experimental paradigms that could be used in such “imaging-genetic” gene association studies.  At more than $500/hour (some paradigms last up to 2 hours) in imaging costs, the translational genes-to-cognition endeavor could get expensive!

DO tell the grant funding agencies that this may not be a problem any longer.

The recent paper by Liu and colleagues, “Prefrontal-Related Functional Connectivities within the Default Network Are Modulated by COMT val158met in Healthy Young Adults” [doi: 10.1523/jneurosci.3941-09.2010], suggests an approach that may simplify matters.  Their approach still involves genotyping (in this case for rs4680) and neuroimaging.  However, instead of performing a specific cognitive task, the team asks subjects to lie in the scanner – and do nothing.  That’s right – nothing – just lie still with eyes closed, let the mind wander, and don’t think about anything in particular – for a mere 10 minutes.  Huh?  What the heck can you learn from that?

It turns out that one can learn a lot.  This is because the neural pathways that the brain uses when you are actively doing something (say, a word-recall task) are largely intact even when you are doing nothing.  Your brain does not “turn off” when you are lying still with your eyes closed and drifting in thought.  Rather, your brain slips into a kind of default pattern, described in studies of “default networks” or “resting-state networks”, in which wide-ranging brain circuits remain dynamically coupled and actively exchange neural information.  One really great paper that describes these networks is a free-and-open article by Hagmann et al., “Mapping the Structural Core of Human Cerebral Cortex” [doi: 10.1371/journal.pbio.0060159], from which I’ve lifted their Figure 1 above.  The work by Hagmann et al. and others shows that the brain has a sort of “connectome” with thousands of “connector hubs”, or nodes, that remain actively coupled (meaning that if one node fires, the other node will fire in a synchronized way) both when the brain is at rest and when it is actively performing cognitive operations.  In a few studies, it seems that the strength of functional coupling in certain brain areas at rest is correlated (positively and negatively) with the activation of these areas when subjects are performing a specific task.

In the genetic study reported by Liu and colleagues, genotype (N=57) at the dopaminergic COMT gene correlated with differences in the functional connectivity (synchronization of firing) of nodes in the prefrontal cortex.  This result is eerily similar to results found for a number of specific tasks (N-back, Wisconsin Card Sorting, gambling, etc.) in which COMT genotype was correlated with differential activation of the frontal cortex during the task.  So it seems that one imaging paradigm (lie still and rest for 10 minutes) provided insights comparable to several lengthy (and diverse) activation tasks.  If so, might it provide a more direct route to linking genetic variation with cognitive function?
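“Functional connectivity” between two nodes boils down, in the simplest case, to correlating their activity time series.  A minimal sketch with simulated signals (not the pipeline Liu and colleagues used):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two "node" time series that share a common driving signal
# (coupled nodes), plus an independent third node (uncoupled).
n_timepoints = 200
common = rng.normal(size=n_timepoints)
node_a = common + 0.5 * rng.normal(size=n_timepoints)
node_b = common + 0.5 * rng.normal(size=n_timepoints)
node_c = rng.normal(size=n_timepoints)

def functional_connectivity(x, y):
    """Pearson correlation between two time series."""
    return np.corrcoef(x, y)[0, 1]

print(functional_connectivity(node_a, node_b))  # high: coupled nodes
print(functional_connectivity(node_a, node_c))  # near zero: uncoupled
```

A resting-state study computes such correlations across many node pairs and then asks whether their strength differs by genotype group.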

Liu and colleagues do not comment on this proposition directly, nor do they seem to be over-interpreting their results in the way I have editorialized things here.  They very thoughtfully point out the ways in which the networks they’ve identified are similar to, and different from, the published findings of others.  Certainly, this study and others like it are the first steps in what might be a promising new direction!


Read Full Post »


This year, my 5-year-old son and I have passed many afternoons sitting on the living room rug learning to read.  While he ever so gradually learns to decode words, e.g., “C-A-T”, sound by sound, letter by letter, I can’t help but marvel at the human brain and wonder what is going on inside.  In case you have forgotten, learning to read is hard – damn hard.  The act of linking sounds with letters, grouping letters into words, and then mapping words onto meanings requires a lot of effort from the child (and from the parent, to keep the discomfort-averse child in one place).  Recently, I asked him if he could spell word pairs such as “MOB & MOD”, “CAD & CAB”, “REB & RED”, etc., and, as he slowly sounded out each sound/letter, he informed me that “they are the same, daddy”.  Hence, I realized that he was having trouble – not with the sound-to-letter correspondence, or the grouping of the letters, or the meaning, or handwriting – but rather with just hearing and discriminating the -B vs. -D sounds at the end of the word pairs.  Wow, OK, this was a much more basic aspect of literacy – just being able to hear the sounds clearly.  So it is, apparently, for many bright and enthusiastic children who experience difficulty in learning to read.  Without the basic perceptual tools to hear “ba” as different from “da” or “pa” or “ta”, the typical schoolday is for naught.

With this in mind, the recent article, “Genetic determinants of target and novelty-related event-related potentials in the auditory oddball response” [doi:10.1016/j.neuroimage.2009.02.045] caught my eye.  The research team of Jingyu Liu and colleagues asked healthy volunteers just to listen to a soundtrack of meaningless beeps, tones, whistles etc.  The participants typically would hear a long stretch of the same sound eg. “beep, beep, beep, beep” with a rare oddball “boop” interspersed at irregular intervals.  The subjects were instructed to simply press a button each time they heard an oddball stimulus.  Easy, right?  Click here to listen to an example of an “auditory oddball paradigm” (though not one from the Liu et al., paper).  Did you hear the oddball?  What was your brain doing? and what genes might contribute to the development of this perceptual ability?
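The ERP itself is recovered by averaging many stimulus-locked EEG epochs: random background activity cancels while the stimulus-evoked response survives.  A toy version with simulated data (the epoch length, noise level, and amplitudes are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate 100 oddball trials: each epoch is heavy noise plus a fixed
# evoked response (a crude late positive "P3"-like bump).
n_trials, n_samples = 100, 60
evoked = np.zeros(n_samples)
evoked[35:45] = 1.0   # late positive deflection after stimulus onset
epochs = evoked + rng.normal(scale=2.0, size=(n_trials, n_samples))

# Single trials are buried in noise, but averaging across trials
# shrinks the noise by ~1/sqrt(n_trials) and leaves the ERP standing.
erp = epochs.mean(axis=0)
print(erp[35:45].mean())   # close to the true bump amplitude
print(erp[:20].mean())     # baseline close to zero
```

Component peaks like N1, N2b, P3a, and P3b are then measured from this averaged waveform, separately for standard and oddball stimuli.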

The researchers sought to answer this question by screening 41 volunteers at 384 single nucleotide polymorphisms (SNPs) in 222 genes selected for their metabolic function in the brain.  The team used electroencephalogram (EEG) recordings of brain activity to measure differences in activity for “boop” vs. “beep” type stimuli – specifically, at certain times before and after stimulus onset – described by the so-called N1, N2b, P3a and P3b component peaks in the event-related potential (ERP) waveforms.  Genotype data (coded as 1, 0, -1 for aa, aA, AA) and EEG data were plugged into the team’s home-grown parallel independent components analysis (ICA) pipeline (generously provided freely here), and several positives were then evaluated for their relationships in biochemical signal transduction pathways (using the Ingenuity Pathway Analysis toolkit).  A very novel and sophisticated analytical method, for certain!

The results showed that certain waveforms, localized to certain areas of the scalp were significantly associated with the perception of various oddball “boop”-like stimuli.  For example, the early and late P3 ERP components, located over the frontal midline and parieto-occipital areas, respectively, were associated with the perception of oddball stimuli.  Genetic analysis showed that several catecholaminergic SNPs such as rs1800545 and rs521674 (ADRA2A), rs6578993 and rs3842726 (TH) were associated with both the early and late P3 ERP component as well as other aspects of oddball detection.

Both of these genes are important in the synaptic function of noradrenergic and dopaminergic synapses. Tyrosine hydroxylase, in particular, is a rate-limiting enzyme in catecholamine synthesis.  Thus, the team has identified some very specific molecular processes that contribute to individual differences in perceptual ability.  In addition to the several other genes they identified, the team has provided a fantastic new method to begin to crack open the synaptic complexities of attention and learning.  See, I told you learning to read was hard!


Read Full Post »
