Posts Tagged ‘Gene expression’

Who knew?  Reinius and colleagues have discovered where she’s kept it stashed away … in 85 brain-expressed genes they refer to as a conserved sexual signature … tsk tsk naughty.  Ladies, you can skip over the parts about macho men with rippling muscles and power tools … ’cause apparently, what really turns Mother Nature on is polyamine biosynthesis.

Read Full Post »

Hey Yin, we have a genome and a brain … what’s the relationship?

I dunno, Yang.  Let’s focus on variation.  Genome sequence variation can vary with the brain … and … gene expression can vary with the brain … and … genome sequence variation can vary with gene expression … but … here’s a paper showing that gene expression is under the control of genome sequence variation. Purrrr.

Hey Yin, the correlation between genome sequence variation and gene expression confuses me.  I mean, gene expression can change if the environment changes right?  Doesn’t this confound research that uses genome sequence variation?


Thanks for the pic, noyfb.


Read Full Post »

Myelin Repair Foundation Logo
Image by Myelin Repair Foundation via Flickr

from Ye et al., 2009:

HDAC1/2 genes encode proteins that modify the epigenome (make it less accessible for gene expression).

When HDAC1/2 function around the HES5 and ID2/4 genes (repressors of white matter development), the epigenetic changes (less acetylation of chromatin) help to repress the repressors.

This type of epigenetic repression of gene expression (genes that repress white matter development) is essential for white matter development.
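The double-negative logic here (repressing the repressors lets development proceed) can be caricatured in a couple of lines of Python – purely an illustrative toy for the logic described above, not a biological model:

```python
# Caricature of the double-negative logic above: HDAC1/2 activity silences the
# repressor genes (HES5, ID2/4), so white matter development can proceed.
# Purely illustrative, not a model of the actual biology.

def white_matter_develops(hdac_active: bool) -> bool:
    repressors_on = not hdac_active  # HES5/ID2/4 stay expressed if chromatin stays acetylated
    return not repressors_on         # development proceeds only when the repressors are off

print(white_matter_develops(True))   # True – HDAC1/2 repress the repressors
print(white_matter_develops(False))  # False – repressors stay on, development is blocked
```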


Read Full Post »

Modified drawing of the neural circuitry of th...
Image via Wikipedia

You already know this, but when you are stressed out (chronic stress), your brain doesn’t work very well.  That’s right – just when you need it most – your brain has a way of letting you down!

Here are a few things that happen to the very cells (in the hippocampus) that you rely on:

reorganization within mossy fiber terminals
loss of excitatory glutamatergic synapses
reduction in the surface area of postsynaptic densities
marked retraction of thorny excrescences
alterations in the lengths of the terminal dendritic segments of pyramidal cells
reduction of the dorsal anterior CA1 area volume

Thanks brain!  Thanks neurons for abandoning me when I need you most!  According to this article, these cellular changes lead to, “impaired hippocampal involvement in episodic, declarative, contextual and spatial memory – likely to debilitate an individual’s ability to process information in new situations and to make decisions about how to deal with new challenges.” UGH!

Are our cells making these changes for a reason?  Might it be better for cells to remodel temporarily rather than suffer permanent, life-long damage?  Perhaps.  Perhaps there are molecular pathways that can lead to the reversal of these allostatic stress adaptations?

Check out this recent paper: “A negative regulator of MAP kinase causes depressive behavior” [doi 10.1038/nm.2219].  The authors have identified a gene – MKP-1, a phosphatase that normally dephosphorylates various MAP kinases involved in cellular growth – that, when inactivated in mice, produces animals that are resistant to chronic unpredictable stress.  Although it’s known that MKP-1 is needed to limit immune responses associated with multi-organ failure during bacterial infections, the authors suggest:

“pharmacological blockade of MKP-1 would produce a resilient or anti-depressant response to stress”

Hmmm … so Mother Nature is using the same gene to regulate the immune response (turn it off so that it doesn’t damage the rest of the body) and to regulate synaptic growth (turn it off – which is something we DON’T want to do when we’re trying to recover from chronic stress)?  Mother Nature gives us MKP-1 so we can survive an infection, but the same gene prevents us from recovering (finding happiness) after stress?

Of course, we do not need to rely only on pharmacological solutions.  Exercise & social integration are cited by these authors as the top 2 non-medication strategies.


Read Full Post »

remember a day before today
Image by DerrickT via Flickr

Most cells in your adult body are “terminally differentiated” – meaning that they have developed from stem cells into the final liver, or heart, or muscle or endothelial cell that they were meant to be.  From that point onward, cells are able to “remember” to stay in this final state – in part – via stable patterns of DNA methylation that reinforce the regulation of “the end state” of gene expression for that cell.  As evidence for this role of DNA methylation, it has been observed that levels of DNA methyl transferase (DNMT) decline once cells are fully differentiated – so the cells cannot readily modify or disrupt their established patterns of methylation.

NOT the case in the brain! Even though neurons in the adult brain are fully differentiated, levels of methyl transferases DO NOT decline.  Why not? After all, we wouldn’t want our neurons to turn into liver cells, or big toe cells, would we?

One hypothesis, suggested by David Sweatt and colleagues, is that neurons have more important things to “remember”.  They suggest in their free and open research article, “Evidence That DNA (Cytosine-5) Methyltransferase Regulates Synaptic Plasticity in the Hippocampus” [doi: 10.1074/jbc.M511767200] that:

DNA methylation could have lasting effects on neuronal gene expression and overall functional state. We hypothesize that direct modification of DNA, in the form of DNA (cytosine-5) methylation, is another epigenetic mechanism for long term information storage in the nervous system.

By measuring methylated vs. unmethylated DNA in the promoters of the reelin and BDNF genes and relating this to electrophysiological measures of synaptic plasticity, the research team finds correlations between methylation status and synaptic plasticity.  More specifically, they find that zebularine (an inhibitor of DNMT) CAN block long-term potentiation (LTP), but does NOT block baseline synaptic transmission or the ability of synapses to fire in a theta-burst pattern (needed to induce LTP).

This suggests that the epigenetic machinery used for DNA methylation may have a role in the formation of cellular memory – but not in the same sense as in other cells in the body – where cells remember to remain in a terminally differentiated state.

In the brain, this epigenetic machinery may help cells remember stuff that’s more germane to brain function … you know … our memories and stuff.


Read Full Post »

One day, each of us may have the dubious pleasure of browsing our genomes.  What will we find?  Risk for this?  Risk for that?  Protection for this? and that?  Fast-twitching muscles & wet ear wax?  Certainly.  Some of the factors will give us pause, worry and many restless nights.  Upon these genetic variants we will likely wonder, “why me?” and, indeed, “why my parents (and their parents) and so on?”

Why the heck, if a genetic variant is associated with poor health, is it still floating around in human populations?

A complex question, made more so by the fact that our modern office-bound, get-married-when-you’re-30, live-to-90+ lifestyle is so dramatically different from that of our ancestors. In the area of mental health, there are perhaps a few such variants – notably the dreaded APOE E4 allele – that are worth losing sleep over, though perhaps only after you have lived beyond 40 or 50 years of age.

Another variant that might be worth consideration – from cradle to grave – is the so-called 5HTTLPR, a short stretch of concatenated DNA repeats that sits in the promoter region of the 5-HTT gene and – depending on the number of repeats – can regulate the transcription of 5HTT mRNA.  Much has been written about the unfortunateness of this “short-allele” structural variant in humans – mainly that when the region is “short”, containing 14 repeats, folks tend to be more anxious and at risk for anxiety disorders.  Folks with the “long” (16-repeat) variant tend to be less anxious and even show a pattern of brain activity wherein the activity of the contemplative frontal cortex is uncorrelated with the emotionally active amygdala.  Thus, 5HTTLPR “long” carriers are less likely to be influenced, distracted or have their cognitive processes disrupted by activity in emotional centers of the brain.

Pity me, a 5HTTLPR “short”/”short”  who greatly envies the calm, cool-headed, even-tempered “long”/”long” folks and their uncorrelated PFC-amygdala activity.  Where did their genetic good fortune come from?

Klaus Peter Lesch and colleagues say the repeat-containing LPR DNA may be the remnant of an ancient viral insertion or transposable DNA element insertion that occurred some 40 million years ago.  In their article entitled, “The 5-HT transporter gene-linked polymorphic region (5-HTTLPR) in evolutionary perspective: alternative biallelic variation in rhesus monkeys“, they demonstrate that the LPR sequences are not found in primates outside our simian cousins (baboons, macaques, chimps, gorillas, orangutans).  More recently, the ancestral “short” allele at the 5HTTLPR acquired some additional variation, giving rise to the “long” allele, which can be found in chimps, gorillas, orangutans and ourselves.

So I missed out on inheriting “CCCCCCTGCACCCCCCAGCATCCCCCCTGCACCCCCCAGCAT” (2 extra repeats of the ancient viral insertion) which could have altered the entire emotional landscape of my life.  Darn, to think too, that it has been floating around in the primate gene pool all these years and I missed out on it.  Drat!
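In code, the short/long distinction boils down to counting repeat units in the promoter. Here’s a toy Python sketch using the repeat unit quoted above and the 14/16 thresholds from the post – with the simplifying (and biologically false) assumption that every repeat copy is an identical sequence, and with made-up allele strings:

```python
# Toy sketch: classify a 5HTTLPR allele by counting repeat units.
# The unit sequence and the 14 ("short") / 16 ("long") repeat counts follow the
# post; real 5HTTLPR repeats are not perfectly identical copies, and the
# allele sequences below are fabricated for illustration.

UNIT = "CCCCCCTGCACCCCCCAGCAT"  # one copy of the ~21-bp repeat unit quoted above

def count_units(promoter_seq: str) -> int:
    """Count non-overlapping copies of the repeat unit."""
    return promoter_seq.count(UNIT)

def classify_allele(promoter_seq: str) -> str:
    n = count_units(promoter_seq)
    if n >= 16:
        return "long"
    elif n <= 14:
        return "short"
    return "intermediate"

# Hypothetical alleles built by concatenating the unit 14 or 16 times
short_allele = UNIT * 14
long_allele = short_allele + UNIT * 2   # the "2 extra repeats" lamented above

print(classify_allele(short_allele))  # short
print(classify_allele(long_allele))   # long
```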


Read Full Post »

Cinematicode wall
Image by Smeerch via Flickr

As far as science movies go, the new movie, “To Age or Not To Age” seems like a lot of fun.  The interview with Dr. Leonard Guarente suggests that the sirtuin genes play a starring role in the film.  Certainly, an NAD+-dependent histone deacetylase makes for a sexy movie star – especially when it is able to sense diet and metabolism and establish the overall lifespan of an organism.

One comment in the movie trailer, by Aubrey de Grey, suggests that humans may someday be able to push the physiology of aging to extreme ends.  That studies of transgenic mice over-expressing SIRT1 showed physiological properties of calorie-restricted (long lived) mice – even when fed ad libitum – suggests that something similar might be possible in humans.

Pop a pill and live it up at your local Denny’s for the next 100 years?  Sounds nice (& a lot like grad school).

Just a few twists to the plot here.  It turns out that – in the brain – SIRT1 may not function as it does in the body.  Here’s a quote from a research article “Neuronal SIRT1 regulates endocrine and behavioral responses to calorie restriction” that inactivated SIRT1 just in the brain:

Our findings suggest that CR triggers a reduction in Sirt1 activity in hypothalamic neurons governing somatotropic signaling to lower this axis, in contrast with the activation of Sirt1 by CR in many other tissues. Sirt1 may have evolved to positively regulate the somatotropic axis, as it does insulin production in β cells, to control mammalian health span and life span in an overarching way. However, the fact that Sirt1 is a positive regulator of the somatotropic axis may complicate attempts to increase murine life span by whole-body activation of this sirtuin.

To a limited extent, it seems that – in the brain – SIRT1 normally promotes aging.  Therefore, developing “pills” that are activators of SIRT1 would be good for the body, but might somehow be counteracted by what the brain would do.  Who’s in charge anyway?  Mother Nature will not make it easy to cheat her! Another paper published recently also examined the role of SIRT1 in the brain and found that – normally – SIRT1 enhances neuronal plasticity (by blocking the expression of a micro-RNA, miR-134, that binds to the mRNA of, and inhibits the translation of, synaptic plasticity proteins such as CREB).

So, I won’t be first to line up for SIRT1 “activator” pills (such as Resveratrol), but I might pop a few if I’m trying to learn something new.


Read Full Post »

Mother Nature
Image via Wikipedia

The current buzz about GWAS of longevity, and GWAS in general, has stirred up many longstanding inconvenient issues that arise when trying to interpret the results of very large, expensive and worthwhile genetic studies.  It seems that Mother Nature does not give up her secrets without a fight.

One of the most common “inconvenient issues” is the fact that so many of the SNPs that come out of these studies are located far away from protein-encoding exons.  This ubiquitous observation is almost always followed with, “well, maybe it’s in linkage disequilibrium with a more functional SNP” or something along these lines – wherein the authors get an automatic pass.  OK by me.

Another “inconvenient issue” is the fact that many of these SNPs are of minimal effect and don’t exactly add up or interact to account for the expected heritability.  This problem of “missing heritability” is a big one (see some new insights in the latest issue of Nature Genetics) leading many to suspect that the effects of genes are dependent on complex interactions with each other and the environment.

A recent paper, “A map of open chromatin in human pancreatic islets” [doi:10.1038/ng.530] by Gaulton and colleagues caught my eye because it seems to shed light on both of these particular inconvenient issues.  The authors find that the diabetes risk variant rs7903146 in the TCF7L2 gene is both located in an intron and subject to epigenetic regulation (our sedentary, high-fat, high-stress lives can potentially interact with the genome by causing epigenetic change).

It appears that the T-allele of the intronic rs7903146 is correlated with a more open, transcription-prone form of DNA/chromatin than is the C-allele. The authors confirmed this using both chromatin mapping and gene expression assays on pancreatic islet cells harvested from non-diabetic donors and islet cell-lines.  The results suggest that the risk-conferring T-allele of this intronic SNP may be driving expression (gain-of-function) of the TCF7L2 gene.  What types of environmental stimuli might also impact the opening and closing of chromatin at this location?

This type of interplay of environment, genome and epigenome is probably rampant in the area of brain and behavior – so perhaps the study of diabetes will provide some clues to the many GWAS SNPs that are far away from exons. More on the genetics of epigenetics here.


Read Full Post »

Nucleosome structure.
Image via Wikipedia

A pointer to the NOVA program on epigenetics, “Ghost in Your Genes” (YouTube link here).  Fantastic footage.  Great intro to epigenetics and so-called trans-generational effects and the inheritance of epigenetic marks – which, in some cases, are left by adverse or stressful experience.  A weird, wild, game-changing concept indeed – that my grandchildren could inherit epigenetic changes induced in my genome by adverse experience.


Read Full Post »

Where's Waldo in Google Maps?
Image by Si1very via Flickr

In an earlier post on Williams Syndrome, we delved into the notion that sometimes a genetic variant can lead to enhanced function – such as certain social behaviors in the case of WS.  A mechanism that is thought to underlie this phenomenon has to do with the way in which information processing in the brain is widely distributed, and that sometimes a gene variant can impact one processing pathway while leaving another pathway intact, or even upregulated.  In the case of Williams Syndrome, relatively intact ventral stream (“what”) processing but disrupted dorsal stream (“where”) processing leads to weaker projections to the frontal cortex and amygdala, which may facilitate gregarious and prosocial (a lack of fear and inhibition) behavior.  Other developmental disabilities may differentially disrupt these 2 visual information processing pathways.  For instance, developmental dyspraxia contrasts with WS as it differentially disrupts the ventral stream processing pathway.

A recent paper by Woodcock and colleagues, “Dorsal and ventral stream mediated visual processing in genetic subtypes of Prader–Willi syndrome” [doi:10.1016/j.neuropsychologia.2008.09.019], asks how another developmental disability – Prader-Willi syndrome – might differentially influence the development of these information processing pathways.  PWS arises from the lack of expression (via deletion or uniparental disomy) of a cluster of paternally expressed genes in the 15q11-13 region (normally the gene on the maternally inherited chromosome is silent, or imprinted – related post here).  By comparing PWS children to matched controls, the team reports evidence showing that PWS children who carry the deletion are slightly more impaired in a task that depends on the dorsal “where” pathway, whilst showing some sparing or relative strength in the ventral “what” pathway.


Read Full Post »

Image by theloushe via Flickr

** PODCAST accompanies this post**

I have a little boy who loves to run and jump and scream and shout – a lot.  And by this, I mean running – at full speed and smashing his head into my gut,  jumping – off the couch onto my head,  screaming – spontaneous curses and R-rated body parts and bodily functions.  I hope you get the idea.  Is this normal? or (as I oft imagine) will I soon be sitting across the desk from a school psychologist pitching me the merits of an ADHD diagnosis and medication?

Of course, when it comes to behavior, there is not a distinct line one can cross from normal to abnormal.  Human behavior is complex, multi-dimensional and greatly interpreted through the lens of culture.  Our present culture is highly saturated by mass-marketing, making it easy to distort a person’s sense of “what’s normal” and create demand for consumer products that folks don’t really need (e.g. psychiatric diagnoses? medications?).  Anyhow, it’s tough to know what’s normal.  This is an important issue to consider for those (mass-marketing hucksters?) who might be inclined to promote genetic data as “hard evidence” for illness, disorder or abnormality of some sort.

With this in mind, I really enjoyed a recent paper by Stollstorff et al., “Neural response to working memory load varies by dopamine transporter genotype in children” [doi:10.1016/j.neuroimage.2009.12.104] who asked how the brains of healthy children functioned, even though they carry a genotype that has been widely associated with the risk of ADHD.  Healthy children who carry genetic risk for ADHD. Hmm, might this be my boy?

The researchers looked at a 9- vs. 10-repeat VNTR polymorphism in the 3′-UTR of the dopamine transporter gene (DAT1).  This gene – which encodes the very protein that is targeted by so many ADHD medications – influences the re-uptake of dopamine from the synaptic cleft.  In the case of 10/10 genotypes, it seems that DAT1 is more highly expressed, thus leading to more re-uptake and hence less dopamine in the synaptic cleft.  Generally, dopamine is needed to enhance the signal/noise of neurotransmission, so – at the end of the day – the 10/10 genotype is considered less optimal than the 9/9-repeat genotype.  As noted by the researchers, the ADHD literature shows that the 10-repeat allele, not the 9-repeat, is most often associated with ADHD.

The research team asked these healthy children (typically developing children between 7 and 12 years of age) to perform a so-called N-back task, which requires that children remember words that are presented to them one at a time.  Each time a new word is presented, the children had to decide whether that word was the same as the previous word (1-back) or the previous, previous word (2-back).  It’s a maddening task and places an extreme demand on neural circuits involved in active maintenance of information (frontal cortex) as well as inhibition of irrelevant information that occurs during updating (basal ganglia circuits).
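The matching rule at the heart of the n-back task is simple to state in code. Here’s a minimal sketch – the word stream is invented, and real experiments also control lures, timing and response windows:

```python
# Minimal sketch of the n-back matching rule described above: a trial is a
# "match" when the current word equals the word presented n trials earlier.
# The word stream below is invented for illustration.

def nback_targets(stream, n):
    """Return, for each trial, whether it is an n-back match."""
    return [i >= n and stream[i] == stream[i - n] for i in range(len(stream))]

words = ["cat", "dog", "cat", "cat", "fish", "cat"]
print(nback_targets(words, 2))  # → [False, False, True, False, False, True]
```

Trial 2 is a match because “cat” also appeared 2 trials earlier; trial 3 is not, because 2 trials back was “dog” – which is exactly the updating-and-inhibition demand the post describes.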

As the DAT1 protein is widely expressed in the basal ganglia, the research team asked: where in the brain is variation in DAT1 (9- vs. 10-repeat) associated with neural activity? And where is there a further difference between 1-back and 2-back?  Indeed, the team finds that brain activity in many regions of the basal ganglia (caudate, putamen, substantia nigra & subthalamic nucleus) was associated with genetic variation in DAT1.  Neat!  The gene may be exerting an influence on brain function (and behavior) in healthy children, even though they do not carry a diagnosis.  Certainly, genes are not destiny, even though they do influence brain and behavior.

What was cooler to me, though, is the way the investigators examined the role of genetic variation in the 1-back (easy, low-load condition) vs. 2-back (harder, high-load condition) tasks.  Their data show that there was less of an effect of genotype on brain activation in the easy task.  Rather, only when the task was hard did it become clear that the basal ganglia of the 10/10 carriers lacked the brain activation needed to perform the more difficult task.  Thus, the investigators reveal that the genetic risk may not be immediately apparent under conditions where heavy “loads” or demands are not placed on the brain.  Cognitive load matters when interpreting genetic data!

This result made me think that genes in the brain might be a lot like genes in muscles.  Individual differences in muscle strength are not associated with genotype when kids are lifting feathers.  Only when kids are actually training and using their muscles, might one start to see that some genetically advantaged kids have muscles that strengthen faster than others.  Does this mean there is a “weak muscle gene” – yes, perhaps.  But with the proper training regimen, children carrying such a “weak muscle gene” would be able to gain plenty of strength.

I guess it’s off to the mental and physical gyms for me and my son.

** PODCAST accompanies this post** also, here’s a link to the Vaidya lab!


Read Full Post »

Diagram to illustrate Minute Structure of the ...
Image via Wikipedia

For a great many reasons, research on mental illness is focused on the frontal cortex.  It’s just a small part of the brain, and certainly, many things can go wrong in other places during brain/cognitive development, but it remains a robust finding that when the frontal cortex is not working well, individuals have difficulties in regulating thoughts and emotions.  Life is difficult enough to manage, let alone without a well-functioning frontal cortex.  So it’s no surprise that many laboratories look very closely at how this region develops prenatally and during childhood.

One of the more powerful genetic methods is the analysis of gene expression via microarrays (here is a link to a tutorial on this technology).  When this technology is coupled with extremely careful histological analysis and dissection of cortical circuits in the frontal cortex, it becomes possible to link changes in gene expression with the physiological properties of specific cells and local circuits in the frontal cortex. The reason this is an exciting pursuit is that the mammalian neocortex is organized in a layered fashion wherein 6 major layers have different types of connectivity and functionality.  The developmental origins of this functional specificity are thought to lie in a process known as radial migration (here is a video of a neuron as it migrates radially and finds its place in the cortical hierarchy).  As cells are queued out of the ventricular zone and begin their migration to the cortical surface, they are exposed to all sorts of growth factors and morphogens that help them differentiate and form the proper connectivities.  Thus, the genes that regulate this process are of keen interest for understanding normal and abnormal cognitive development.

Here’s an amazing example of this – 2 papers entitled, “Infragranular gene expression disturbances in the prefrontal cortex in schizophrenia: Signature of altered neural development?” [doi:10.1016/j.nbd.2009.12.013] and “Molecular markers distinguishing supragranular and infragranular layers in the human prefrontal cortex [doi:10.1111/j.1460-9568.2007.05396.x] both by Dominique Arion and colleagues.  In both papers, the authors ask, “what genes are differentially expressed in different layers of the cortex?”.  This is a powerful line of inquiry since the different layers of cortex are functionally different in terms of their connectivity.  For example, layers II-III (the so-called supragranular layers) are known to connect mainly to other cortical neurons – which is different functionally than layers V-VI (the so-called infragranular layers) that connect mainly to the striatum (layer V) and thalamus (layer VI).  Thus, if there are genes whose expression is unique to a layer, then one has a clue as to how that gene might contribute to normal/abnormal information processing.

The authors hail from a laboratory at the University of Pittsburgh that is well known for many years of work on fine-scaled histological analysis of the frontal cortex, and used a method called laser capture microdissection, in which post-mortem sections of human frontal cortex (area 46) were cut to separate the infragranular layers from the supragranular layers.  The mRNA from these tissue sections was then used for DNA microarray hybridization.  Various controls, replicate strategies and in-situ tissue hybridizations were then employed to validate the initial microarray results.

In the first paper, where the authors compare infra- vs. supragranular layers, they report that 40 genes were more highly expressed in the supragranular layers (HOP, CUTL2 and MPPE1 were among the most enriched) and 29 genes were more highly expressed in the infragranular layers (ZNF312, CHN2, HS3ST2 were among the most enriched).  Other differentially expressed genes included several that have previously been implicated in cortical layer formation, such as RLN, TLX-NR2E1, SEMA3E, PCP4, SERPINE2, NR2F2/ARP1, PCDH8, WIF1, JAG1, MBP.  Amazing!! A handful of genes that seem to label subpopulations of projection neurons in the frontal cortex.  Polymorphic markers for these genes would surely be powerful tools for imaging-genetic studies on cognitive development.
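At its simplest, this kind of layer comparison amounts to screening, gene by gene, for expression differences between the dissected layers. Here’s a toy fold-change screen in Python – the gene names come from the post, but the expression values and the 2-fold cutoff are invented for illustration and are not the authors’ actual statistics (which involved proper replication and validation):

```python
# Toy differential-expression screen: flag genes whose mean expression differs
# between supragranular and infragranular samples by at least 2-fold.
# Gene names follow the post; the expression values are fabricated.

from math import log2
from statistics import mean

supra = {"HOP": [9.1, 8.7, 9.4], "ZNF312": [2.0, 2.2, 1.9], "GAPDH": [5.0, 5.1, 4.9]}
infra = {"HOP": [2.1, 2.4, 2.0], "ZNF312": [8.8, 9.0, 9.3], "GAPDH": [5.0, 4.8, 5.2]}

def layer_enriched(supra, infra, min_log2_fc=1.0):
    """Return {gene: log2 fold change (supra/infra)} for genes past the cutoff."""
    hits = {}
    for gene in supra:
        fc = log2(mean(supra[gene]) / mean(infra[gene]))
        if abs(fc) >= min_log2_fc:
            hits[gene] = round(fc, 2)
    return hits

print(layer_enriched(supra, infra))
# positive log2FC → supragranular-enriched (HOP); negative → infragranular (ZNF312);
# GAPDH, near-equal in both layers, is filtered out
```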

In the second paper, the authors compare infra- vs. supragranular gene expression in post-mortem brains from patients with schizophrenia and healthy matched controls. Using the same methods, the team reports both supra- and infragranular gene expression changes in schizophrenia (400 & 1,200 differences, respectively) – more than 70% of the differences appearing to be reductions in gene expression in schizophrenia. Interestingly, the team reports that the genes that were differentially expressed in the infragranular layers provided sufficient information to discriminate between cases and controls, whilst the gene expression differences in the supragranular layers did not.  More to the point, the team finds that 51 genes that were differentially expressed between infra- and supragranular layers were also differentially expressed in cases vs. controls (many of these are also found to be associated in population genetic association studies of schizophrenia vs. controls as well!).  Thus, the team has identified layer- (function-) specific genes that are associated with schizophrenia.  These genes, especially the ones enriched in the infragranular layers, seem to be at the crux of a poorly functioning frontal cortex.

The authors point to 3 such genes (SEMA3E, SEMA6D, SEMA3C) who happen to be members of the same gene family – the semaphorin gene family.  This gene family is very important for neuronal guidance (during radial migration), morphology, pruning and other processes where cell shape and position are regulated.  The authors propose that the semaphorins might act as “integrators” of various forms of wiring during development and in adulthood.  More broadly, the authors provide a framework to understand how the development of connectivity in the frontal cortex is regulated by genetic factors – indeed, many suspected genetic risk factors play a role in the developmental pathways the authors have focused on.


Read Full Post »

One of the complexities in beginning to understand how genetic variation relates to cognitive function and behavior is that – unfortunately – there is no gene for “personality”, “anxiety”, “memory” or any other type of “this” or “that” trait.  Most genes are expressed rather broadly across the entire brain’s cortical layers and subcortical systems.  So, just as there is no single brain region for “personality”, “anxiety”, “memory” or any other type of “this” or “that” trait, there can be no such gene.  In order for us to begin to understand how to interpret our genetic make-up, we must learn how to interpret genetic variation via its effects on cells and synapses – that go on to function in circuits and networks.  Easier said than done?  Yes, but perhaps not so intractable.

Here’s an example.  One of the most well-studied circuits/networks/systems in the field of cognitive science are the so-called basal-ganglia-thalamocortical loops.  These loops have been implicated in a great many forms of cognitive function involving the regulation of everything from movement, emotion and memory to reasoning ability.  Not surprisingly, neuroimaging studies on cognitive function almost always find activations in this circuitry.  In many cases, the data from neuroimaging and other methodologies suggest that one portion of this circuitry – the frontal cortex – plays a role in the representation of such aspects as task rules, relationships between task variables and associations between possible choices and outcomes.  This would be sort of like the “thinking” part of our mental life, where we ruminate on all the possible choices we have and the ins and outs of what each choice has to offer.  Have you ever gone into a Burger King and – even though you’ve known for 20 years what’s on the menu – you freeze up and become lost in thought just as it’s your turn to place your order?  Your frontal cortex is at work!

The other aspect of this circuitry is the subcortical basal ganglia, which seems to play the downstream role of processing all that ruminating activity going on in the frontal cortex and filtering it down into a single action.  This is a simple fact of life – we can be thinking about dozens of things at a time, but we can only DO 1 thing at a time.  Alas, we must choose something at Burger King and place our order.  Indeed, one of the hallmarks of mental illness seems to be that this circuitry functions poorly – which may be why individuals have difficulty in keeping their thoughts and actions straight – the thinking clearly and acting clearly aspect of healthy mental life.  Certainly, in neurological disorders such as Parkinson’s Disease and Huntington’s Disease, where this circuitry is damaged, the ability to think and move one’s body in a coordinated fashion is disrupted.

Thus, there are at least 2 main components to a complex system/circuit/network that is involved in many aspects of learning and decision making in everyday life.  Therefore, if we wanted to understand how a gene – one that is expressed in both portions of this circuitry – influenced our mental life, we would have to interpret its function in relation to each specific portion of the circuitry.  In other words, the gene might affect the prefrontal (thinking) circuitry in one way and the basal-ganglia (action-selection) circuitry in a different way.  Since we’re all familiar with the experience of walking into a Burger King and seeing folks perplexed and frozen as they stare at the menu, perhaps it’s not too difficult to imagine that a gene might differentially influence the ruminating process (hmm, what shall I have today?) and the action selection (I’ll take the #3 combo) aspect of this everyday occurrence (for me, usually 2 times per week).

Nice idea, you say, but does it flow from solid science?  Well, check out the recent paper from Cindy M. de Frias and colleagues, “Influence of COMT Gene Polymorphism on fMRI-assessed Sustained and Transient Activity during a Working Memory Task” [PMID: 19642882].  In this paper, the authors probed the function of a single genetic variant (rs4680, the Methionine/Valine variant of the dopamine-metabolizing COMT gene) on cognitive functions that preferentially rely on the prefrontal cortex as well as mental operations that rely heavily on the basal ganglia.  As an added bonus, the team also probed the function of the hippocampus – yet another set of circuits/networks that is important for healthy mental function.  OK, so here is one gene functioning within 3 separable (yet connected) neural networks!

The team focused on the well-studied Methionine/Valine variant of the dopamine-metabolizing COMT gene, which is broadly expressed across the pre-frontal (thinking) part of the circuitry, the basal-ganglia (action-selection) part of the circuitry and the hippocampus.  The team performed a neuroimaging study wherein participants (11 Met/Met and 11 Val/Val subjects) had to view a series of words presented one at a time and respond when they recalled that a word matched the word presented 2 trials beforehand (a so-called “n-back task“).  In this task, each of the 3 networks/circuits (frontal cortex, basal ganglia and hippocampus) is doing somewhat different computations – and has different needs for dopamine (hence COMT may be doing different things in each network).  In the prefrontal cortex, according to a theory proposed by Robert Bilder and colleagues [doi:10.1038/sj.npp.1300542], the need is for long temporal windows of sustained neuronal firing – known as tonic firing (the neuronal correlate of trying to “keep in mind” all the different words that you are seeing).  The authors predicted that under conditions of tonic activity in the frontal cortex, dopamine release promotes extended tonic firing and that Met/Met individuals should produce enhanced tonic activity.  Indeed, when the authors looked at their data and asked, “where in the brain do we see COMT gene associations with extended firing?” they found such associations in the frontal cortex (frontal gyrus and cingulate cortex)!
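
The 2-back matching rule at the heart of this task is simple enough to sketch in code.  Below is a minimal Python sketch; the word sequence is invented for illustration and is not from the study:

```python
def two_back_matches(words):
    """Return the trial indices at which a word matches the word shown 2 trials earlier."""
    return [i for i in range(2, len(words)) if words[i] == words[i - 2]]

# Hypothetical trial sequence (not from the study); a participant should
# respond on trials 2 ("cat") and 6 ("sun").
trials = ["cat", "dog", "cat", "sun", "sun", "dog", "sun"]
print(two_back_matches(trials))  # -> [2, 6]
```

Keeping the last 2 words “in mind” while updating on every new word is exactly the mix of sustained (tonic) maintenance and transient (phasic) updating that the study tries to tease apart.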

Down below, in the subcortical networks, a different type of cognitive operation is taking place.  Here the cells/circuits are involved in the action selection (press a button) of whether the word is a match and in the working-memory updating of each new word.  Instead of prolonged, sustained “tonic” neuronal firing, the cells rely on fast, transient “phasic” bursts of activity.  Here, the modulatory role of dopamine is expected to be different, and the Bilder et al. theory predicts that COMT Val/Val individuals would be more efficient at modulating the fast, transient form of cell firing required here.  Similarly, when the research team explored their genotype and brain activity data and asked, “where in the brain do we see COMT gene associations with transient firing?” they found such associations in the right hippocampus.

Thus, what can someone who carries the Met/Met genotype at rs4680 say to their fellow Val/Val lunch-mate next time they visit a Burger King?  “I have the gene for obesity”? or impulsivity? or “this” or “that”?  Perhaps not.  The gene influences different parts of each person’s neural networks in different ways: the Met/Met carrier has the advantage in pondering (perhaps more prone to annoyingly gazing at the menu forever) whilst the Val/Val carrier has the advantage in action selection (perhaps ordering promptly, but not getting the best burger and fries combo).


Read Full Post »

Last year I dug a bit into the area of epigenetics (indexed here) and learned that the methylation (CH3) and acetylation (OCCH3) of genomic DNA & histones, respectively, can have dramatic effects on the structure of DNA and its accessibility to transcription factors – and hence – gene expression.  Many of the papers I covered suggested that the environment can influence the degree to which these so-called “epigenetic marks” are covalently bonded onto the genome during early development.  Thus, the thinking goes, the early environment can modulate gene expression in ways that are long-lasting – even transgenerational.  The idea is a powerful one to be sure.  And a scary one as well, as parents who read this literature may fret that their children (and grandchildren) can be epigenetically scarred by early nutritional, physical and/or psycho-social stress.  I must admit that, as a parent of young children myself, I began to wonder if I might be negatively influencing the epigenome of my children.

I’m wondering how much physical and/or social stress is enough to cause changes in the epigenome.  Does the concern about epigenetics apply only to exposure to severe stress, or to run-of-the-mill forms of stress as well?  How much do we know about this?

This year, I hope to explore this line of inquiry further.  For starters, I came across a fantastic paper by Fraga et al., entitled “Epigenetic differences arise during the lifetime of monozygotic twins” [doi:10.1073/pnas.0500398102].  The group carries out a remarkably straightforward and time-honored approach – a twin study – to ask how much identical twins differ at the epigenetic level.  Since identical twins have the same genome sequence, any differences in their physiology, behavior, etc. are, strictly speaking, due to the way in which the environment (from the uterus to adulthood) shapes their development.  Hence, the team of Fraga et al. can compare the amount and location of methyl (CH3) and acetyl (OCCH3) groups to see whether the environment has differentially shaped the epigenome.

An analysis of some 40 identical twin pairs, aged 3 to 74 years, showed that – YES – the environment, over time, does seem to shape the epigenome (in this case, of lymphocytes).  The most compelling evidence for me was in Figure 4, where the team used a method known as Restriction Landmark Genomic Scanning (RLGS) to compare patterns of methylation in a genome-wide manner.  Using this analysis, the team found that the epigenomes of the oldest twin pairs had about 2.5 times as many differences as those of the youngest twin pairs.  These methylation differences also correlated with gene expression differences (older pairs had more gene expression differences), and the individuals who showed the lowest levels of methylation also had the highest levels of gene expression.  Furthermore, the team found that twin pairs who lived apart and had more differences in life history were more likely to have epigenetic differences.  Finally, measures of histone acetylation were consistent with this gradient of epigenetic change over time and life-history distance.
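
As a back-of-the-envelope illustration of that 2.5-fold result, one can score the fraction of sites at which two methylation profiles disagree.  The CpG calls below are invented for illustration – real RLGS data are genome-wide restriction-fragment patterns, not simple 0/1 vectors:

```python
def epigenetic_distance(twin_a, twin_b):
    """Fraction of CpG sites at which two methylation profiles differ (1 = methylated)."""
    assert len(twin_a) == len(twin_b)
    return sum(a != b for a, b in zip(twin_a, twin_b)) / len(twin_a)

# Invented methylation calls over 8 CpG sites for two twin pairs.
young_pair = ([1, 0, 1, 1, 0, 1, 0, 0], [1, 0, 1, 1, 0, 1, 0, 1])  # 1 site differs
older_pair = ([1, 0, 1, 1, 0, 1, 0, 0], [0, 0, 1, 0, 1, 1, 0, 1])  # 4 sites differ

print(epigenetic_distance(*young_pair))  # -> 0.125
print(epigenetic_distance(*older_pair))  # -> 0.5
```

The same kind of per-site comparison, scaled to the whole genome, is what lets one say that older pairs have drifted further apart epigenetically than younger pairs.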

Thus it seems that, as everyday life progresses, the epigenome changes too.  So, perhaps, one does not need extreme forms of stress to leave long-lasting epigenetic marks on the genome?  Is this true during early life (where the team did not see many differences between pairs)?  And in the brain (the team focused mainly on lymphocytes)?  Are the differences between twins due to the creation of new environmentally-mediated marks, or to the faulty passage of existing marks from dividing cell to cell over time?  It will be fun to seek out information on this.


Read Full Post »

Some quick sketches that might help put the fast-growing epigenetics and cognitive development literature into context.  Visit the University of Utah’s Epigenetics training site for more background!

The genome is just the A,G,T,C bases that encode proteins and other mRNA molecules.  The “epi”genome comprises various modifications to the DNA – such as methylation (at C residues) – and the acetylation of histone proteins.  These changes help the DNA form various secondary and tertiary structures that can facilitate or block the interaction of DNA with the transcriptional machinery.

When DNA is highly methylated, it generally is less accessible for transcription and hence gene expression is reduced.  When histone proteins (purple blobs that help DNA coil into a compact shape) are acetylated, the DNA is much more accessible and gene expression goes up.
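
A cartoon of that rule in Python – the 50/50 weighting and the numbers are arbitrary illustration, not measurements; real chromatin accessibility is nothing like a simple linear score:

```python
def accessibility(methylation, acetylation):
    """Toy 0-1 "openness" score: methylation closes chromatin, acetylation opens it.

    Inputs are fractions in [0, 1]; the 50/50 weighting is an arbitrary illustration.
    """
    score = (1 - methylation) * 0.5 + acetylation * 0.5
    return max(0.0, min(1.0, score))

print(round(accessibility(methylation=0.9, acetylation=0.1), 2))  # -> 0.1 (largely silenced)
print(round(accessibility(methylation=0.1, acetylation=0.9), 2))  # -> 0.9 (largely expressed)
```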

We know that proper epigenetic regulation is critical for cognitive development because mutations in MeCP2 – a protein that binds to methylated C residues – lead to Rett syndrome.  MeCP2 is normally responsible for binding to methylated DNA and recruiting histone de-acetylases (HDACs) to help DNA coil and condense into a closed form that is inaccessible for gene expression (related post here).

When DNA is accessible for gene expression, then it appears that – during brain development – there are relatively more synaptic spines produced (related post here).  Is this a good thing? Rett syndrome would suggest that – NO – too many synaptic spines and too much excitatory activity during brain development may not be optimal.  Neither is too little excitatory (too much inhibitory) activity and too few synaptic spines.  It is likely that you need just the right balance (related post here). Some have argued (here) that autism & schizophrenia are consequences of too many & too few synapses during development.

The sketch above illustrates a theoretical conjecture – not a scenario that has been verified by extensive scientific study.  It tries to explain why epigenetic effects can, in practice, be difficult to disentangle from true genetic effects (changes in the A,G,T,C sequence).  This is because – for one reason – a mother’s experience (extreme stress, malnutrition, chemical toxins) can, based on some evidence, exert an effect on the methylation of her child’s genome.  Keep in mind that methylation is normal and widespread throughout the genome during development.  However, in this scenario, if the daughter’s behavior or physiology were influenced by such methylation, then she could, in theory, upon reaching reproductive age, expose her developing child to an environment that leads to altered methylation (shown here of the granddaughter’s genome).  Thus, an epigenetic change would look much like a genetic variant being passed from one generation to the next, but such a genetic variant need not exist (related post here, here) – as it’s an epigenetic phenomenon.  Genes such as BDNF have been the focus of many genetic/epigenetic studies (here, here) – however, much, much more work remains to determine and understand just how much stress/malnutrition/toxin exposure is enough to cause such multi-generational effects.  Disentangling the interaction of genetics with the environment (and its influence on the epigenome) is a complex task, and it is very difficult to prove the conjecture/model above, so be sure to read the literature and popular press on these topics carefully.


Read Full Post »

We are all familiar with the notion that genes are NOT destiny and that the development of an individual’s mind and body occurs in a manner that is sensitive to the environment (e.g. children who eat lots of healthy food grow bigger and stronger than those who have little or no access to food).  In the case of the brain, one of the ways in which the environment gets factored into development is via so-called “sensitive periods”, where certain parts of the brain transiently rely on sensory experience in order to develop.  Children born with cataracts, for example, will have much better vision if the cataracts are removed in the first few weeks of life rather than later on.  This is because the human visual system has a “sensitive period” early in development where it is extra-sensitive to visual input, after which the function and connectivity of various parts of the system are – somewhat permanently – established for the rest of the person’s life.  Hence, if there is little visual input (cataracts) during the sensitive period, then the visual system is somewhat permanently unable to process visual information – even if the cataracts are subsequently removed.  (To learn more about this topic, visit Pawan Sinha’s lab at M.I.T. and his Project Prakash intervention study on childhood blindness.)

What the heck is an “in”sensitive period then?   Well, whereas visual input is clearly a “good thing” for the sensitive period of visual development, perhaps some inputs are “bad” and it may be useful to shield or protect the brain from exposure.  Maybe some environmental inputs are “bad” and one would not want the developing brain to be exposed to them and say, “OK, this (bad stuff) is normal“.  As a parent, I am constantly telling my children that the traffic-filled street is a “bad place” and, like all parents, I would not want my children to think that it was OK to wander into the street.  Clearly, I want my child to recognize the car-filled street as a “bad thing”.

In the developing brain, it turns out that there are some “bad things” that one would NOT like (the brain) to get accustomed to.  Long-term exposure to glucocorticoids is one example – well-known to cause a type of neuronal remodelling in the hippocampus, that is associated with poor cognitive performance (visit Bruce McEwen’s lab at Rockefeller University to learn more about this).  Perhaps an “in”sensitive period – where the brain is insensitive to glucocorticoids – is one way to teach the brain that glucocorticoids are “bad” and DO NOT get too familiar with them (such a period does actually occur during early post-natal mammalian development).  Of course, we do need our brains to mount an acute stress response, if and when, we are being threatened, but it is also very important that the brain learn to TURN-OFF the acute stress response when the threat has passed – an extensive literature on the deleterious effects of chronic exposure to stress bears this out.  Hence, the brain needs to learn to recognize the flow of glucocorticoids as something that needs to be shut down.

OK, so our developing brain needs to learn what/who is “good vs. bad”.  Perhaps sensitive and insensitive periods help to reinforce this learning – and also – to cement learning into the system in a sort of permanent way (I’m really not sure if this is the consensus view, but I’ll try and podcast interview some of the experts here asap).  In any case, in the case of the visual system, it is clear that the lack of visual input during the sensitive period has long lasting consequences.  In the case of the stress response, it is also clear that if there is untoward stress early in development, one can be (somewhat) destined to endure a lifetime of emotional difficulty.  Previous posts here, here, here cover research on behavioral/genomic correlates of early life stress.

Genes meet environment in the epigenome during sensitive and insensitive periods?

As stated at the outset – genes are not destiny.  The DNA cannot encode a system that knows who/what is good vs. bad, but rather can only encode a system of molecular parts that assemble to learn these contingencies on the fly.  During sensitive periods, cells in the visual system are more active and fire more profusely.  This extra firing leads to changes in gene expression that (somewhat) permanently set the connectivity, strength and sensitivity of visual synapses.  The expression of neuroligins, neurexins, integrins and all manner of extracellular proteins that stabilize synaptic connections is a well-known target of activity-induced gene expression.  Hence the environment “interacts” with the genome via neuronal firing, which induces gene expression, which – in turn – feeds back and modulates neuronal firing.  Environment –> neuronal firing –> gene expression –> modified neuronal firing.  OK.

Similarly, in the stress response system, the environment induces changes in the firing of cells in the hypothalamus which leads (through a series of intermediates) to the release of glucocorticoids.  Genes induced during the firing of hypothalamic cells and by the release of glucocorticoid can modify the organism’s subsequent response to stressful events.  Environment –> neuronal firing –> gene expression –> modified neuronal firing.  OK.
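
That loop can be sketched as a toy simulation – every constant below is invented; the only point is the feedback structure, in which activity-induced gene expression feeds back to damp subsequent firing:

```python
def simulate_loop(env_input, steps=5, plasticity=0.3):
    """Environment drives firing; firing induces gene expression; expression damps firing."""
    firing, expression = env_input, 0.0
    history = []
    for _ in range(steps):
        expression += plasticity * firing        # neuronal firing induces gene expression
        firing = env_input / (1.0 + expression)  # expressed gene products feed back on firing
        history.append((round(firing, 3), round(expression, 3)))
    return history

for firing, expression in simulate_loop(env_input=1.0):
    print(f"firing={firing}  expression={expression}")
```

Run it and firing settles downward step by step – a crude picture of how a constant environmental input can be progressively re-tuned by the gene expression it provokes.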

Digging deeper into the mechanism by which neuronal firing induces gene expression, we find an interesting twist.  Certainly there is a mechanism – well studied for many decades – wherein neuronal firing causes Ca++ release, which activates gene expression of neuroligins, neurexins, integrins and all manner of extracellular proteins that stabilize synaptic connections.  But there is another mechanism that can permanently mark certain genes and alter their levels of expression in a long-lasting manner: the so-called epigenetic mechanisms such as DNA methylation and histone acetylation.  As covered here and here, for instance, Michael Meaney’s lab has shown that DNA CpG methylation of various genes can vary in response to early-life stress and/or maternal care.  In some cases, females who were poorly cared for may, in turn, be rather lousy mothers themselves as a consequence of these epigenetic markings.

A new research article, “Dynamic DNA methylation programs persistent adverse effects of early-life stress” by Chris Murgatroyd and colleagues [doi:10.1038/nn.2436], explores these mechanisms in great detail.  The team explored the expression of the arginine vasopressin (AVP) peptide – a gene which is important for healthy social interaction and social-stress responsivity.  Among many other interesting results, the team reports that early-life stress (using a mouse model) leads to lower levels of methylation in the 3rd CpG island, which is located downstream in a distal gene-expression-enhancer region.  In short, more early-life stress was correlated with less methylation and more AVP expression, which is known to potentiate the release of glucocorticoids (a bad thing).  The team reports that the methyl-binding MeCP2 protein, encoded by the gene that underlies Rett syndrome, acts as a repressor of AVP expression – which would normally be a good thing, since it would keep AVP levels (and hence glucocorticoid levels) down.  But unfortunately, early-life stress removes the very methyl groups to which MeCP2 binds, and the team also reports that parvocellular neuronal depolarization leads to phosphorylation (on serine residue #438) of MeCP2 – a form of MeCP2 that is less accessible to its targets.  So, in a manner similar to other examples, early-life stress can have long-lasting effects on gene expression via an epigenetic mechanism – here disabling an otherwise protective mechanism that would shield the organism from the effects of stress.  Much as in the case of Rett syndrome (as covered here), it seems that when MeCP2 is bound, it silences gene expression – which would seem to be a good thing in the case of AVP.

So who puts these epigenetic marks on chromosomes and why?

I’ll try and explore this further in the weeks ahead.  One intriguing idea about why methylation has been co-opted among mammals has to do with the notion of parent-offspring conflict.  According to David Haig, one of the experts on this topic, males have various incentives to cause their offspring to be large and fast-growing, while females have incentives to combat the genomic tricks that males use, and to keep their offspring smaller and more manageable in size.  The literature clearly shows that genes marked or methylated by fathers (paternally imprinted genes) tend to be growth promoters, and that maternally imprinted genes tend to be growth inhibitors.  One might imagine that maternally methylated genes might have an impact on maternal care as well.

Lastly, the growth-promoting/inhibiting effects of paternal/maternal genes and gene markings are now starting to be discussed in the context of autism/schizophrenia, which have been associated with synaptic under-/over-growth, respectively.

Building a brain is already tough enough – but to have to do it amidst an eons-old battle between maternal and paternal genomes.  Sheesh!  More on this to come.


Read Full Post »

Image by Sbrimbillina via Flickr

Here’s a gene whose relationship to mental function is very straightforward.  If you hold your breath, your blood pH falls (more CO2 leads to more free H+ protons dissolved in your bloodstream).  You may also become anxious – or worse – if you are forced to hold your breath.  How does this process work?
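
The pH–CO2 relationship at work here is the Henderson-Hasselbalch equation for the bicarbonate buffer: pH = 6.1 + log10([HCO3-] / (0.03 × pCO2)).  A quick sketch with standard textbook constants (the example values are illustrative):

```python
import math

# Henderson-Hasselbalch for the bicarbonate buffer: [HCO3-] in mmol/L,
# pCO2 in mmHg; 0.03 is the CO2 solubility coefficient, 6.1 the pKa.
def blood_ph(hco3=24.0, pco2=40.0):
    return 6.1 + math.log10(hco3 / (0.03 * pco2))

print(round(blood_ph(pco2=40.0), 2))  # -> 7.4  (normal arterial pH)
print(round(blood_ph(pco2=60.0), 2))  # -> 7.22 (holding your breath: CO2 up, pH down)
```

More CO2 means a smaller ratio inside the log, hence a lower pH – and, as the paper below shows, a brain sensor that notices exactly that.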

Ziemann et al., in their new paper, “The Amygdala Is a Chemosensor that Detects Carbon Dioxide and Acidosis to Elicit Fear Behavior” [doi 10.1016/j.cell.2009.10.029], show that the acid-sensing ion channel-1a (ASIC1a) gene encodes a proton-sensing Na+ and Ca++ channel – built to activate dendritic spines and drive neuronal activity when it senses H+.  Mice that lack this gene are not sensitive to higher CO2 levels, but when the protein is restored in the amygdala, the mice show fearful behavior in response to higher CO2 levels.  Mother Nature has provided a very straightforward way – ASIC1a activation of our fear center – of letting us know that rising CO2 (and falling oxygen) is a BAD thing!


Read Full Post »

*** PODCAST accompanies this post ***

Nowadays, it seems that genomics is spreading beyond the rarefied realm of science and academia into the general, consumer-based popular culture.  Quelle surprise!?  Yes, the era of the personal genome is close at hand, even as present technology provides – directly to the general consumer public – a genome-wide sampling of many hundreds of thousands of single nucleotide variants.  As curious early adopters begin to surf their personal genomic information, one might wonder how they, and homo sapiens in general, will ultimately utilize their genome information.  Interestingly, some have already adapted the personal genome to facilitate what homo sapiens loves to do most – that is, to interact with one another.  They are at the vanguard of a new and hip form of social interaction known as “personal genome sharing”.  People connecting in cyberspace – via haplotype or sequence alignment – initiating new social contacts with distant cousins (of which there may be many tens of thousands at 5th cousins and beyond).  Sharing genes that regulate the social interaction of sharing genes, as it were.

A broader view of social genes, within the context of our neo-Darwinian synthesis, however, shows that the relationship between the genome and social behavior can be rather complex.  When genes contribute directly to the fitness of an organism (e.g. sharper tooth and claw), it is relatively straightforward to explain how novel fitness-conferring genetic variants increase in frequency from generation to generation.  Even when genetic variants are selfish – that is, when they subvert the recombination or gamete-production machinery, in some cases to the detriment of their individual host – they can still readily spread through populations.  However, when a new genetic variant confers a fitness benefit to unrelated individuals by enhancing a cooperative or reciprocally-altruistic form of social interaction, it becomes more difficult to explain how such a novel genetic variant can take hold and spread in a large, randomly mating population.  Debates on the feasibility of natural selection acting “above the level of the individual” seem settled against this proposition.  However, even in the face of such difficult population-genetic conundrums, research on the psychology, biology and evolutionary genetics of social interactions continues unabated.  Like our primate and other mammalian cousins, with whom homo sapiens shares some 90-99% genetic identity, we are an intensely social species – as our literature, poetry, music and cinema, not to mention the more recent twittering, myspacing, facebooking and genome-sharing, demonstrate.

Indeed, many of the most compelling examples of genetic research on social interactions are those that reveal the devastating impacts on psychological development and function when social interaction is restricted.  In cases of maternal and/or peer-group social separation stress, the effects on gene expression in the brain are dramatic and lead to long-lasting consequences for human emotional function.  Studies on loneliness by John Cacioppo and colleagues reveal that even the perception of loneliness is aversive enough to raise arousal levels, which may have adaptive value.  A number of specific genes have been shown to interact with a history of neglect or maltreatment in childhood and, subsequently, increase the risk of depression or emotional lability in adulthood.  Clearly then, despite the difficulties in explaining how new “social genes” arise and take hold in populations, the human genome has been shaped over evolutionary time to function optimally within the context of a social group.

From this perspective, a new paper, “Oxytocin receptor genetic variation relates to empathy and stress reactivity in humans” by Sarina Rodrigues and colleagues [doi.org/10.1073/pnas.0909579106] may be of broad interest as a recent addition to a long-standing, but now very rapidly growing, flow of genetic research on genes and social interactions.  The research team explored just a single genetic variant in the gene encoding the receptor for a small neuropeptide known as oxytocin, a protein with well-studied effects on human social interactions.  Intra-nasal administration of oxytocin, for example, has been reported to enhance eye-gaze, trust, generosity and the ability to infer the emotional state of others.  In the Rodrigues et al., study, a silent G to A change (rs53576) within exon 3 of the oxytocin receptor (OXTR) gene is used to subgroup an ethnically diverse population of 192 healthy college students who participated in assessments for pro-social traits such as the “Reading the Mind in the Eyes” (RMET) test of empathetic accuracy as well as measures of dispositional empathy.  Although an appraisal of emotionality in others is not a cooperative behavior per se, it has been demonstrated to be essential for healthy social function.  The Rodrigues et al., team find that the subgroup of students who carried the GG genotype were more accurate and able to discern the emotional state of others than students who carried the A-allele.  Such molecular genetic results are an important branching point to further examine neural and cognitive mechanisms of empathy as well as long-standing population genetic concerns of how new genetic variants like the A-allele of rs53576 arose and managed to take-hold in human populations.

Regarding the latter, there are many avenues for inquiry, but oxytocin’s role in the regulation of the reproductive cycle and social behavior stands out as an ideal target for natural selection.  Reproductive and behavioral-genetic factors that influence the ritualized interactions between males and females have been demonstrated to be targets of natural selection during the process of speciation.  New variants can reduce the cross-mating of closely related species who might otherwise mate and produce sterile or inviable hybrid offspring.  So-called pre-mating speciation mechanisms are an efficient means, therefore, to ensure that reproduction leads to fit and fertile offspring.  In connection with this idea, reports of an eye-gaze assessment similar to the RMET test used by Rodrigues et al. revealed that women’s pupils dilate more widely to photos of men they were sexually attracted to during the most fertile period of the menstrual cycle, thus demonstrating a viable link between social preference and reproductive biology.  However, in the Rodrigues et al. study, it was the G-allele that was associated with superior social appraisal, and this allele is not the novel allele but rather the ancestral allele carried by chimpanzees, macaques and orangutans.  Therefore, it does not seem that the novel A-allele would have been targeted by natural selection in this type of pre-mating social-interaction scenario.  Might other aspects of OXTR function provide more insight, then?  Rodrigues et al. explore the role of the gene beyond the social-interaction dimension and note that OXTR is widely expressed in limbic circuitry and also plays a broader modulatory role in many aspects of emotional reactivity.  For this reason, they sought to assess the stress responsivity of the participants via changes in heart rate elicited by the unpredictable onset of an acoustic startle.
The results show that the A-allele carriers showed greater stress reactivity and also greater scores on a 12-point scale of affective reactivity.  Might greater emotional vigilance in the face of adversity confer a fitness advantage for A-allele carriers? Perhaps this could be further explored.

Regarding the neural and cognitive mechanisms of empathy and other pro-social traits, the Rodrigues et al. strategy demonstrates that when human psychological research includes genetic information, it can more readily be informed by a wealth of non-human animal models.  Comparison of genotype-phenotype correlations at the behavioral, physiological, anatomical and cellular levels across different model systems is one general strategy for generating hypotheses about how a gene like OXTR mediates and moderates cognitive function – and also why it (and human behavior) evolved.  For example, mice that lack the OXTR gene show higher levels of aggression and deficits in social recognition memory.  In humans, genetic associations of the A-allele with autism and social loneliness form possible translational bridges.  In other areas of human psychology, such as attention and inhibition, several genetic variants correlate with specific mental operations and areas of brain activation.  The psychological construct of inhibition, once debated purely from a behavioral psychological perspective, is now better understood to be carried out by a collection of neural networks that function in the lateral frontal cortex as well as the basal ganglia and frontal midline.  Individual differences in the activation of these brain regions have been shown to relate to genetic differences in a number of dopaminergic genes, whose function in animal models is readily linked to the physiologic function of specific neural circuits and types of synapses.  In the area of social psychology, where such neuroimaging-genetic studies are just getting underway, the use of “hyper-scanning” – a method that involves the simultaneous neuroimaging of two or more individuals playing a social game (the prisoner’s dilemma) – reveals co-activation of dopamine-rich brain areas when players are able to make sound predictions of other participants’ choices.
These types of social games can model specific aspects of reciprocal social interactions such as trust, punishment, policing, sanctions etc. that have been postulated to support the evolution of social behavior via reciprocal altruism.  Similar imaging work showed that intra-nasal administration of oxytocin potently reduced amygdala activation and decreased amygdala coupling to brainstem regions implicated in autonomic and behavioural manifestations of fear.  Such recent examples affirm the presence of a core neural circuitry involved in social interaction whose anatomical and physiological properties can be probed using genetic methods in human and non-human populations.

Although there will remain complexities in explaining how new “social genes” can arise and move through evolutionary space and time (let alone cyberspace!), the inter-flows of genetic data and social psychological function in homo sapiens will likely increase.  The rising tide should inevitably force both psychologists and evolutionary biologists to break out of long-standing academic silos and work together to construct coherent models that are consistent with cognitive-genetic findings as well as population-genetic and phylogenetic data.  Such efforts will depend heavily on a foundation of psychological research into “social genes” in the manner illustrated by Rodrigues et al.

*** PODCAST accompanies this post *** Thanks again, Dr. Rodrigues!!!


Read Full Post »

Where da rodents kick it
Image by Scrunchleface via Flickr

A recent genome-wide association study (GWAS) identified the 3′ region of the liver- (not brain-) expressed PECR gene (rs7590720(G) and rs1344694(T)) on chromosome 2 as a risk factor for alcohol dependence.  These results, reported by Treutlein et al. in “Genome-wide Association Study of Alcohol Dependence,” were based on a population of 487 male inpatients and a follow-up re-test in a population of 1,024 male inpatients and 996 control participants.

The authors also asked whether lab rats that – given the choice between water-based and ethanol-spiked beverages over the course of 1 year – came to prefer alcohol showed differential gene expression compared to non-preferring rats.  Among a total of 542 genes found to be differentially expressed in the amygdala and caudate nucleus of alcohol-preferring vs. non-preferring rat strains, a mere 3 genes – that is, the human orthologs of these 3 genes – also showed significant association with alcohol dependence in the human populations.  Here are the “rat genes” (i.e., human orthologs that show differential expression in rats and association with alcohol dependence in humans): rs1614972(C) in the alcohol dehydrogenase 1C (ADH1C) gene, rs13273672(C) in the GATA binding protein 4 (GATA4) gene, and rs11640875(A) in the cadherin 13 (CDH13) gene.
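
The cross-species filtering logic described above – rat differential expression intersected with human association hits – can be sketched in a few lines.  This is a toy illustration: the gene sets below are truncated placeholders standing in for the full 542-gene differential-expression list and the GWAS results, not actual data from the paper.

```python
# Sketch of the rat/human convergence filter from Treutlein et al.
# Placeholder gene sets: in reality the rat list has 542 entries.

# Human orthologs of genes differentially expressed in alcohol-preferring
# vs. non-preferring rat strains (amygdala and caudate nucleus).
rat_de_orthologs = {"ADH1C", "GATA4", "CDH13", "HES5", "ID2"}

# Genes with SNPs associated with alcohol dependence in the human GWAS
# (PECR was the top human hit but was not differentially expressed in rats).
human_gwas_genes = {"PECR", "ADH1C", "GATA4", "CDH13"}

# The convergent "rat genes": differentially expressed in rats AND
# associated with alcohol dependence in humans.
convergent = rat_de_orthologs & human_gwas_genes
print(sorted(convergent))
```

Run as written, this recovers the 3-gene overlap the post describes: ADH1C, CDH13 and GATA4, but not PECR, which appears only on the human side.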

My 23andMe profile gives a mixed AG at rs7590720 and a mixed GT at rs1344694, while I show a mixed CT at rs1614972, CT at rs13273672 and AG at rs11640875.  Boooring! A middling heterozygote at all 5 alcohol preference/dependence loci.   Were these the loci for chocolate preference/dependence, I would be a full risk-bearing homozygote.
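
If you have your own 23andMe raw-data export, a lookup like this takes only a few lines of Python.  A minimal sketch: the tab-separated rsid/chromosome/position/genotype layout matches the raw download format, but the risk-allele table is assembled here from the SNPs quoted in the post, and the file path is an illustrative assumption.

```python
# Count risk alleles for the alcohol-dependence SNPs discussed above
# in a 23andMe-style raw genotype file (tab-separated: rsid, chromosome,
# position, genotype; lines beginning with '#' are comments).

RISK_ALLELES = {
    "rs7590720": "G",   # PECR 3' region (Treutlein et al.)
    "rs1344694": "T",   # PECR 3' region
    "rs1614972": "C",   # ADH1C
    "rs13273672": "C",  # GATA4
    "rs11640875": "A",  # CDH13
}

def load_genotypes(path):
    """Parse a 23andMe-style raw file into a {rsid: genotype} dict."""
    calls = {}
    with open(path) as fh:
        for line in fh:
            if line.startswith("#"):
                continue
            fields = line.rstrip("\n").split("\t")
            if len(fields) == 4:
                rsid, _chrom, _pos, genotype = fields
                calls[rsid] = genotype
    return calls

def risk_allele_counts(calls):
    """Copies of each risk allele carried (0, 1, or 2); '--' = no call."""
    return {rsid: calls.get(rsid, "--").count(allele)
            for rsid, allele in RISK_ALLELES.items()}
```

Feeding in the genotypes quoted above (AG, GT, CT, CT, AG) returns a count of 1 at every locus – the "middling heterozygote" result.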



Read Full Post »

Gravestone of Samuel Coleridge-Taylor, Wallington
Image by sludgegulper via Flickr

Few events are as hard to understand as the loss of a loved one to suicide – a fatal confluence of factors that are oft scrutinized – but whose analysis can provide little comfort to family and friends.  To me, one frightening and vexing aspect of what is known about the biological roots of depression, anxiety, impulsivity and other mental traits and states associated with suicide, is the way in which early life (even prenatal) experience can influence events in later life.  As covered in this blog here and here, there appear to be very early interactions between emotional experience in early life and the methylation of specific points in the genome.  Such methylation – often referred to as epigenetic marks – can regulate the expression of genes that are important for synaptic plasticity and cognitive development.

The recent paper, “Alternative Splicing, Methylation State, and Expression Profile of Tropomyosin-Related Kinase B in the Frontal Cortex of Suicide Completers,” is one example of a link between epigenetic marks and suicide.  The team of Ernst et al. examined gene expression profiles from the frontal cortex and cerebellum of 28 males lost to suicide and 11 ethnically matched control participants.  Using a subject-by-subject comparison method described as “extreme value analysis,” the team identified 2 Affymetrix probes – 221794_at and 221796_at, both specific to the NTRK2 (TRKB) gene – that showed significantly lower expression in several areas of the frontal cortex.  The team also found that these probes were specific to exon 16, which is found only in the TRKB.T1 isoform – an isoform expressed only in astrocytes.

Further analysis showed that there were no genetic differences in the promoter region of this gene that would explain the expression differences, but that there were 2 methylation sites (epigenetic differences) whose methylation status correlated with expression levels (P=0.01 and 0.004).  As a control, DNA methylation at these sites was not correlated with TRKB.T1 expression when DNA and RNA were taken from the cerebellum (a useful control, since the cerebellum is not thought to be directly involved in the regulation of mood).

In the case of TRKB.T1 expression, the team reports that more methylation at these 2 sites in the promoter region is associated with less TRKB.T1 expression in the frontal cortex.  Where and when are these marks laid down?  Are they reversible?  How can we know or suspect what is happening to our epigenome (you can’t measure this by spitting into a cup as with current genome sequencing methods)? To me, the team has identified an important clue from which such follow-up questions can be addressed.  Now that they have a biomarker, they can help us begin to better understand our complex and often difficult emotional lives within a broader biological context.
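
The core statistical move here – correlating promoter methylation with expression across subjects – can be sketched with a plain Pearson correlation.  The subject values below are invented to illustrate the inverse relationship the team reports (more methylation, less TRKB.T1 expression); they are not data from the paper.

```python
# Toy sketch: Pearson correlation of promoter CpG methylation (fraction)
# against TRKB.T1 expression across subjects. Values are hypothetical.

from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient for two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical subjects: as methylation rises, expression falls.
methylation = [0.10, 0.25, 0.40, 0.55, 0.70, 0.85]
expression  = [9.1, 8.0, 6.6, 5.2, 4.1, 2.9]

r = pearson(methylation, expression)
print(f"r = {r:.3f}")  # strongly negative, consistent with repression
```

A strongly negative r on data like this is the pattern the frontal-cortex samples showed; the cerebellum control corresponds to running the same calculation and finding no such correlation.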


Read Full Post »
