
Posts Tagged ‘Gene’

Twin studies have long suggested that genetic variation contributes to both healthy and disordered mental life.  The problem, however – some 10 years now since the full genome sequence era began – has been finding the actual genes that account for this heritability.

It sounds simple on paper – just collect lots of folks with disorder X and compare their genomes to those of a demographically matched healthy control population.  Voila!  Whatever is different is a candidate for genetic risk.  Apparently, not so.

The missing heritability problem that clouds the birth of the personal genomes era refers to the baffling inability to find enough common genetic variants that can account for the genetic risk of an illness or disorder.

There are any number of reasons for this … (i) even though any given MZ or DZ twin pair shares genetic variants that predispose them toward similar brains and mental states, different MZ and DZ pairs may carry different types of rare genetic variation, thus diluting out any shared patterns of variation when large pools of cases and controls are compared … (ii) also, the way that the environment interacts with common risk-promoting genetic variation may be quite different from person to person – making it hard to find variation that is similarly risk-promoting in large pools of cases and controls … and many others, I’m sure.

One research group recently asked whether the type of common genetic variation (SNP vs. CNV) might inform the search for the missing heritability.  The authors of the recent paper, “Genome-wide association study of CNVs in 16,000 cases of eight common diseases and 3,000 shared controls” [doi:10.1038/nature08979], looked at an alternative to the usual SNP markers – so-called common copy number variants (CNVs) – and asked whether these markers might provide a stronger accounting of genetic risk.  While a number of previous papers in the mental health field have indeed shown associations with CNVs, this massive study (some 3,432 CNV probes in 2,000 or so cases and 3,000 controls) did not reveal an association with bipolar disorder.  Furthermore, the team reports that common CNVs are already in fairly strong linkage disequilibrium with common SNPs, and so may not have reached any farther into the abyss of rare genetic variation than previous GWAS efforts.
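To make the linkage disequilibrium point concrete, here is a minimal Python sketch of how one marker can “tag” another – the haplotype counts are hypothetical, not data from the paper:

```python
# Toy illustration: compute r^2 between a biallelic SNP and a biallelic CNV
# (deletion/insertion) from phased haplotype counts. If r^2 is high, the CNV
# carries little association signal beyond what the SNP already provides.

def r_squared(haplotype_counts):
    """haplotype_counts: dict mapping (snp_allele, cnv_allele) -> count."""
    total = sum(haplotype_counts.values())
    p_a = sum(c for (s, _), c in haplotype_counts.items() if s == 'A') / total
    p_del = sum(c for (_, v), c in haplotype_counts.items() if v == 'del') / total
    p_joint = haplotype_counts.get(('A', 'del'), 0) / total
    d = p_joint - p_a * p_del                       # classic LD coefficient D
    return d ** 2 / (p_a * (1 - p_a) * p_del * (1 - p_del))

# Hypothetical counts in which the deletion almost always rides on the same
# haplotype as the SNP's 'A' allele:
counts = {('A', 'del'): 480, ('A', 'ins'): 20, ('G', 'del'): 10, ('G', 'ins'): 490}
print(round(r_squared(counts), 2))  # ~0.88 -> the SNP effectively tags the CNV
```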

Disappointing perhaps, but a big step forward nonetheless!  What will the personal genomes era look like if we all have different forms of rare genetic variation?


Read Full Post »

One of the complexities in beginning to understand how genetic variation relates to cognitive function and behavior is that – unfortunately – there is no gene for “personality”, “anxiety”, “memory” or any other type of “this” or “that” trait.  Most genes are expressed rather broadly across the entire brain’s cortical layers and subcortical systems.  So, just as there is no single brain region for “personality”, “anxiety”, “memory” or any other type of “this” or “that” trait, there can be no such gene.  In order for us to begin to understand how to interpret our genetic make-up, we must learn how to interpret genetic variation via its effects on cells and synapses, which go on to function in circuits and networks.  Easier said than done?  Yes, but perhaps not so intractable.

Here’s an example.  One of the most well-studied circuits/networks/systems in the field of cognitive science is the so-called basal-ganglia-thalamocortical loop.  These loops have been implicated in a great many forms of cognitive function involving the regulation of everything from movement, emotion and memory to reasoning ability.  Not surprisingly, neuroimaging studies of cognitive function almost always find activations in this circuitry.  In many cases, the data from neuroimaging and other methodologies suggest that one portion of this circuitry – the frontal cortex – plays a role in the representation of such aspects as task rules, relationships between task variables and associations between possible choices and outcomes.  This would be sort of like the “thinking” part of our mental life, where we ruminate on all the possible choices we have and the ins and outs of what each choice has to offer.  Have you ever gone into a Burger King and – even though you’ve known for 20 years what’s on the menu – you freeze up and become lost in thought just as it’s your turn to place your order?  Your frontal cortex is at work!

The other aspect of this circuitry is the subcortical basal ganglia, which seems to play the downstream role of processing all that ruminating activity going on in the frontal cortex and filtering it down into a single action.  This is a simple fact of life – we can be thinking about dozens of things at a time, but we can only DO one thing at a time.  Alas, we must choose something at Burger King and place our order.  Indeed, one of the hallmarks of mental illness seems to be that this circuitry functions poorly – which may be why individuals have difficulty keeping their thoughts and actions straight – the thinking-clearly and acting-clearly aspects of healthy mental life.  Certainly, in neurological disorders such as Parkinson’s Disease and Huntington’s Disease, where this circuitry is damaged, the ability to think and move one’s body in a coordinated fashion is disrupted.

Thus, there are at least 2 main components to this complex system/circuit/network that is involved in many aspects of learning and decision making in everyday life.  Therefore, if we wanted to understand how a gene – one that is expressed in both portions of this circuitry – influenced our mental life, we would have to interpret its function in relation to each specific portion of the circuitry.  In other words, the gene might affect the prefrontal (thinking) circuitry in one way and the basal-ganglia (action-selection) circuitry in a different way.  Since we’re all familiar with the experience of walking into a Burger King and seeing folks perplexed and frozen as they stare at the menu, perhaps it’s not too difficult to imagine that a gene might differentially influence the ruminating process (hmm, what shall I have today?) and the action-selection (I’ll take the #3 combo) aspect of this everyday occurrence (for me, usually 2 times per week).
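Here is a deliberately cartoonish sketch of that two-stage division of labor – the menu items, values and noise levels are all invented, and real basal-ganglia circuitry is vastly more subtle than a softmax:

```python
import math
import random

# Stage 1 ("frontal cortex"): assign a noisy value to every candidate choice.
def frontal_evaluation(options):
    return {name: value + random.gauss(0, 0.3) for name, value in options.items()}

# Stage 2 ("basal ganglia"): filter many candidate values down to ONE action,
# here via a softmax draw -- many thoughts in, a single order out.
def basal_ganglia_select(values, temperature=0.5):
    weights = {name: math.exp(v / temperature) for name, v in values.items()}
    draw = random.random() * sum(weights.values())
    running = 0.0
    for name, w in weights.items():
        running += w
        if draw <= running:
            return name
    return name  # guard against floating-point round-off

menu = {'#1 combo': 1.0, '#3 combo': 1.2, 'salad': 0.4}
print(basal_ganglia_select(frontal_evaluation(menu)))
```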

Nice idea you say, but does the idea flow from solid science?  Well, check out the recent paper from Cindy M. de Frias and colleagues, “Influence of COMT Gene Polymorphism on fMRI-assessed Sustained and Transient Activity during a Working Memory Task” [PMID: 19642882].  In this paper, the authors probed the function of a single genetic variant (rs4680, the Methionine/Valine variant of the dopamine-metabolizing COMT gene) on cognitive functions that preferentially rely on the prefrontal cortex as well as mental operations that rely heavily on the basal ganglia.  As an added bonus, the team also probed the function of the hippocampus – yet a different set of circuits/networks that are important for healthy mental function.  OK, so here is one gene functioning within 3 separable (yet connected) neural networks!

The team focused on this well-studied Methionine/Valine variant of the dopamine-metabolizing COMT gene, which is broadly expressed across the prefrontal (thinking) part of the circuitry, the basal-ganglia part of the circuitry (action-selection) and the hippocampus.  The team performed a neuroimaging study wherein participants (11 Met/Met and 11 Val/Val) had to view a series of words presented one at a time and respond if they recalled that a word was a match to the word presented 2 trials beforehand (a so-called “n-back task“).  In this task, each of the 3 networks/circuits (frontal cortex, basal ganglia and hippocampus) is doing somewhat different computations – and has different needs for dopamine (hence COMT may be doing different things in each network).  In the prefrontal cortex, according to a theory proposed by Robert Bilder and colleagues [doi:10.1038/sj.npp.1300542], the need is for long temporal windows of sustained neuronal firing – known as tonic firing (the neuronal correlate of trying to “keep in mind” all the different words that you are seeing).  The authors predicted that under conditions of tonic activity in the frontal cortex, dopamine release promotes extended tonic firing and that Met/Met individuals should produce enhanced tonic activity.  Indeed, when the authors looked at their data and asked, “where in the brain do we see COMT gene associations with extended firing?”, they found such associations in the frontal cortex (frontal gyrus and cingulate cortex)!
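For readers unfamiliar with the task, here is a minimal sketch of a 2-back trial stream (word list, trial count and match rate invented, not taken from the paper):

```python
import random

# Generate a 2-back word stream: on some trials, plant a word that matches
# the word shown two trials earlier; the participant responds to matches.
def two_back_stream(words, n_trials=10, match_rate=0.3):
    stream = []
    for t in range(n_trials):
        if t >= 2 and random.random() < match_rate:
            stream.append(stream[t - 2])          # planted 2-back match
        else:
            stream.append(random.choice(words))
    return stream

stream = two_back_stream(['cat', 'dog', 'tree', 'lamp', 'road'])
for t, word in enumerate(stream):
    is_match = t >= 2 and word == stream[t - 2]
    print(t, word, 'MATCH -> respond' if is_match else '')
```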

Down below, in the subcortical networks, a different type of cognitive operation is taking place.  Here the cells/circuits are involved in the action selection (press a button) of whether the word is a match and in the working-memory updating of each new word.  Instead of prolonged, sustained “tonic” neuronal firing, the cells rely on fast, transient “phasic” bursts of activity.  Here, the modulatory role of dopamine is expected to be different, and the Bilder et al. theory predicts that COMT Val/Val individuals would be more efficient at modulating the fast, transient form of cell firing required.  Similarly, when the research team explored their genotype and brain activity data and asked, “where in the brain do we see COMT gene associations with transient firing?”, they found such associations in the right hippocampus.

Thus, what can someone who carries the Met/Met genotype at rs4680 say to their fellow Val/Val lunch-mate next time they visit a Burger King?  “I have the gene for obesity”? or “impulsivity”? or “this” or “that”?  Perhaps not.  The gene influences different parts of each person’s neural networks in different ways: the Met/Met carrier has the advantage in pondering (perhaps more prone to annoyingly gazing at the menu forever) whilst the Val/Val carrier has the advantage in action selection (perhaps ordering promptly, but not getting the best burger-and-fries combo).


Read Full Post »

Last year I dug a bit into the area of epigenetics (indexed here) and learned that the methylation (-CH3) and acetylation (-COCH3) of genomic DNA & histones, respectively, can have dramatic effects on the structure of DNA and its accessibility to transcription factors – and hence – gene expression.  Many of the papers I covered suggested that the environment can influence the degree to which these so-called “epigenetic marks” are covalently bonded onto the genome during early development.  Thus, the thinking goes, the early environment can modulate gene expression in ways that are long-lasting – even transgenerational.  The idea is a powerful one to be sure.  And a scary one as well, as parents who read this literature may fret that their children (and grandchildren) can be epigenetically scarred by early nutritional, physical and/or psycho-social stress.  I must admit that, as a parent of young children myself, I began to wonder if I might be negatively influencing the epigenome of my children.

I’m wondering how much physical and/or social stress is enough to cause changes in the epigenome.  Does the concern about epigenetics apply only to exposure to severe stress?  Or to run-of-the-mill forms of stress?  How much do we know about this?

This year, I hope to explore this line of inquiry further.  For starters, I came across a fantastic paper by Fraga et al., entitled, “Epigenetic differences arise during the lifetime of monozygotic twins” [doi:10.1073/pnas.0500398102].  The group carries out a remarkably straightforward and time-honored approach – a twin study – to ask how much identical twins differ at the epigenetic level.  Since identical twins have the same genome sequence, any differences in their physiology, behavior etc. are, strictly speaking, due to the way in which the environment (from the uterus to adulthood) shapes their development.  Hence, the team of Fraga et al. can compare the amount and location of methyl (-CH3) and acetyl (-COCH3) groups to see whether the environment has differentially shaped the epigenome.

An analysis of some 40 identical twin pairs, aged 3 to 74 years, showed that – YES – the environment, over time, does seem to shape the epigenome (in this case of lymphocytes).  The most compelling evidence for me was seen in Figure 4, where the team used a method known as Restriction Landmark Genomic Scanning (RLGS) to compare patterns of methylation in a genome-wide manner.  Using this analysis, the team found that the oldest twin pairs had about 2.5 times as many differences as did the epigenomes of the youngest twin pairs.  These methylation differences also correlated with gene expression differences (older pairs also had more gene expression differences), and the individual who showed the lowest levels of methylation also had the highest levels of gene expression.  Furthermore, the team finds that twin pairs who lived apart and had more differences in life history were more likely to have epigenetic differences.  Finally, measures of histone acetylation seemed consistent with the gradient of epigenetic change over time and life-history distance.
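The core logic of the comparison can be sketched in a few lines – a toy drift model with invented flip rates, nothing resembling the actual RLGS pipeline:

```python
import random

# Score each of 1,000 CpG sites as methylated (1) or not (0), let each twin's
# epigenome drift independently by random gains/losses of methyl marks, and
# count discordant sites between co-twins at different ages.

def drift(profile, years, flips_per_year=2):
    p = list(profile)
    for _ in range(years * flips_per_year):
        i = random.randrange(len(p))
        p[i] ^= 1                     # random gain or loss of a methyl mark
    return p

def discordant_sites(twin_a, twin_b):
    return sum(a != b for a, b in zip(twin_a, twin_b))

shared_start = [random.randint(0, 1) for _ in range(1000)]
young_pair = (drift(shared_start, 3), drift(shared_start, 3))
old_pair = (drift(shared_start, 50), drift(shared_start, 50))
print('young pair discordance:', discordant_sites(*young_pair))
print('old pair discordance  :', discordant_sites(*old_pair))
```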

Thus it seems that, as everyday life progresses, the epigenome changes too.  So, perhaps, one does not need extreme forms of stress to leave long-lasting epigenetic marks on the genome?  Is this true during early life (where the team did not see many differences between pairs)?  and in the brain (the team focused mainly on lymphocytes)?  Are the differences between twins due to the creation of new environmentally-mediated marks or the faulty passage of existing marks from dividing cell-to-cell over time?  Will be fun to seek out information on this.


Read Full Post »

Some quick sketches that might help put the fast-growing epigenetics and cognitive development literature into context.  Visit the University of Utah’s Epigenetics training site for more background!

The genome is just the A, G, T, C bases that encode proteins and other mRNA molecules.  The “epi”genome comprises various modifications to the DNA – such as methylation (at C residues) – and the acetylation of histone proteins.  These changes help the DNA form various secondary and tertiary structures that can facilitate or block the interaction of DNA with the transcriptional machinery.

When DNA is highly methylated, it generally is less accessible for transcription and hence gene expression is reduced.  When histone proteins (purple blobs that help DNA coil into a compact shape) are acetylated, the DNA is much more accessible and gene expression goes up.

We know that proper epigenetic regulation is critical for cognitive development because mutations in MeCP2 – a protein that binds to methylated C residues – lead to Rett syndrome.  MeCP2 is normally responsible for binding to methylated DNA and recruiting histone de-acetylases (HDACs) to help DNA coil and condense into a closed form that is inaccessible for gene expression (related post here).

When DNA is accessible for gene expression, then it appears that – during brain development – relatively more synaptic spines are produced (related post here).  Is this a good thing?  Rett syndrome would suggest that – NO – too many synaptic spines and too much excitatory activity during brain development may not be optimal.  Neither is too little excitatory (too much inhibitory) activity and too few synaptic spines.  It is likely that you need just the right balance (related post here).  Some have argued (here) that autism & schizophrenia are consequences of too many & too few synapses, respectively, during development.

The sketch above illustrates a theoretical conjecture – not a scenario that has been verified by extensive scientific study.  It tries to explain why epigenetic effects can, in practice, be difficult to disentangle from true genetic effects (changes in the A, G, T, C sequence).  This is because – for one reason – a mother’s experience (extreme stress, malnutrition, chemical toxins) can – based on some evidence – exert an effect on the methylation of her child’s genome.  Keep in mind that methylation is normal and widespread throughout the genome during development.  However, in this scenario, if the daughter’s behavior or physiology were to be influenced by such methylation, then she could, in theory, when reaching reproductive age, expose her developing child to an environment that leads to altered methylation (shown here of the granddaughter’s genome).  Thus, an epigenetic change would look much like a genetic variant being passed from one generation to the next, but such a genetic variant need not exist (related post here, here) – as it’s an epigenetic phenomenon.  Genes such as BDNF have been the focus of many genetic/epigenetic studies (here, here) – however, much, much more work remains to determine and understand just how much stress/malnutrition/toxin exposure is enough to cause such multi-generational effects.  Disentangling the interaction of genetics with the environment (and its influence on the epigenome) is a complex task, and it is very difficult to prove the conjecture/model above, so be sure to read the literature and popular press on these topics carefully.
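To see why this pattern can masquerade as inheritance, here is a deliberately crude simulation of the conjecture – the thresholds and stress values are entirely invented:

```python
# One locus, no DNA sequence change anywhere. A stressed (methylated) mother
# creates a stressful early environment for her child, which re-methylates
# the same locus in the child -- so the mark reappears each generation and
# looks, from the outside, like a transmitted genetic variant.

def next_generation(mother_methylated, base_stress=0.1, added_stress=0.6):
    stress_exposure = base_stress + (added_stress if mother_methylated else 0.0)
    return stress_exposure > 0.5      # arbitrary methylation threshold

state = True                          # the grandmother experienced stress
for generation in ('daughter', 'granddaughter', 'great-granddaughter'):
    state = next_generation(state)
    print(generation, '->', 'methylated' if state else 'unmethylated')
```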


Read Full Post »

[Image: Tao Te Ching – via Wikipedia]

In previous posts, we have explored some of the basic molecular (de-repression of chromatin structure) and cellular (excess synaptogenesis) consequences of mutations in the MeCP2 gene – a.k.a. the gene whose loss of function gives rise to Rett syndrome.  One of the more difficult aspects of understanding how a mutation in a lowly gene can give rise to changes in cognitive function is bridging the conceptual gap between the biochemical functions of a gene product and its effects on neural network structure and dynamics.  Sure, we can readily acknowledge that neural computations underlie our mental life and that these neurons are simply cells that link up in special ways – but just what is it about the “connecting up part” that goes wrong during developmental disorders?

In a recent paper entitled, “Intact Long-Term Potentiation but Reduced Connectivity between Neocortical Layer 5 Pyramidal Neurons in a Mouse Model of Rett Syndrome” [doi: 10.1523/jneurosci.1019-09.2009], Vardhan Dani and Sacha Nelson explore this question in great detail.  They address it by directly measuring the strength of neural connections between pyramidal cells in the somatosensory cortex of healthy and MeCP2-mutant mice.  In earlier reports, MeCP2-mutant neurons showed weaker neurotransmission and weaker plasticity (an ability to change the strength of interconnection – often estimated by a property known as “long-term potentiation” (LTP – see video)).  In this paper, the authors examined the connectivity of cortical cells using an electrophysiological method known as patch clamp recording and found that early in development, LTP induction was comparable in healthy and MeCP2-mutant animals – and remained so even once the animals were old enough to show cognitive symptoms.  During these early stages of development, there were also no differences in baseline neurotransmission between cortical cells in normal and MeCP2 mice.  Hmmm – no differences?  Yes, during the early stages of development there were no differences between genetic groups – however – once the team examined later stages of development (4 weeks of age), it was apparent that the MeCP2 animals had weaker amplitudes of cortical-cortical excitatory neurotransmission.  Closer comparisons of when the baseline and LTP deficits occurred suggested that the LTP deficits are secondary to the weakened baseline neurotransmission and connectivity in the developing cortex of MeCP2 animals.

So it seems that MeCP2 can alter the excitatory connection strength of cortical cells.  In the discussion of the paper, the authors point out the importance of a proper balance of inhibition and excitation (yin and yang, if you will) in the construction or “connecting up part” of neural networks.  Just as Rett syndrome may arise due to such a problem in the proper linking-up of cells – which use their excitatory and inhibitory connections to establish balanced feedback loops – so too may other developmental disorders such as autism, Down’s syndrome and fragile X-linked mental retardation arise from an improper balance of inhibition and excitation.


Read Full Post »

[Image: MECP2 – via Wikipedia]

The cognitive and emotional impairments in the autism spectrum disorders can be difficult for parents and siblings to understand and cope with.  Here are some graphics and videos that might assist in understanding how genetic mutations and epigenetic modifications can lead to the various forms of social withdrawal commonly observed in children with autism spectrum disorders.

In this post, the focus is just on the MecP2 gene – where mutations are known to give rise to Rett Syndrome – one of the autism spectrum disorders.  I’ll try and lay out some of the key steps in the typical bare-bones-link-infested-blogger fashion – starting with mutations in the MecP2 gene.  Disclaimer: there are several fuzzy areas and leaps of faith in the points and mouse-model evidence below, and there are many other genes associated with various aspects of autism spectrum disorders that may or may not work in this fashion.  Nevertheless, it still seems one can begin to pull a mechanistic thread from gene to social behavior.  Stay tuned for more on this topic.

1. The MecP2 gene encodes a protein that binds to 5-methylcytosine – very simply, a regular cytosine residue with an extra methyl group added at position 5.  Look at the extra -CH3 group on the cytosine residue in the picture at right.  See?  That’s a 5-methylcytosine residue – and it pairs in the DNA double helix with guanosine (G) in the same fashion as does the regular cytosine residue (C).  OK, now, mutations in the MecP2 gene – such as those found at Arginine residue 133 and Serine residue 134 – impair the ability of the protein to bind to these 5-methylcytosine residues.  The figure at left illustrates this, and shows how the MecP2 protein lines up with the bulky yellow 5-methylcytosine residues in the blue DNA double helix during binding.

2. When the MecP2 protein is bound to the methylated DNA, it serves as a binding site for another type of protein – an HDAC, or histone deacetylase (along with other proteins; see p. 172, section 5.3 of the online book “Chromatin Structure and Gene Expression“).  The binding of the eponymously named HDACs leads to the “de-acetylation” of proteins known as histones.  The movie below illustrates how histone “de-acetylation” leads to the condensation of DNA structure and the repression, or shutting down, of gene expression (when the DNA is tightly coiled, it is inaccessible to transcription factors).  Hence: DNA methylation leads (via MecP2 and HDAC binding) to repression of gene expression.


3. When mutated forms of MecP2 cannot bind, the net result is MORE acetylation and MORE gene expression.  As covered previously here, this may not be a good thing during brain development, since more gene expression can induce the formation of more synapses and – possibly – lead to neural networks that fail to grow and mature in the “normal” fashion.  The figure at right suggests that neural networks with too many synapses may not be appropriately connected and may be locked in to sub-optimal architectures.  Evidence for excessive synaptogenesis is abundant within the autism spectrum disorders.  Neuroligins – a class of genes that have been implicated in autism – are known to function in cell & synaptic adhesion (open access review here), and can alter the balance of excitation/inhibition when mutated – which seems consistent with this heuristic model of neural networks that can be too adhesive or sticky.
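A hedged toy model of steps 1-3 – the gene names are real, but the logic is a cartoon:

```python
# Methylated DNA binds MeCP2; MeCP2 recruits an HDAC; the HDAC de-acetylates
# histones and closes chromatin. A binding-deficient MeCP2 mutant (e.g. at
# Arg133) never docks, so the repression step is lost and expression stays up.

def expression_state(dna_methylated, mecp2_binds_methyl_c):
    if dna_methylated and mecp2_binds_methyl_c:
        return 'LOW  (MeCP2 -> HDAC -> condensed chromatin)'
    return 'HIGH (open chromatin; downstream: excess synaptogenesis?)'

print('wild-type MeCP2 :', expression_state(True, True))
print('mutant MeCP2    :', expression_state(True, False))
```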

4. Cognitive and social impairment can result from poorly functioning neural networks containing, but not limited to, the amygdala.  The normal development of neural networks containing the frontal cortex and amygdala is important for proper social and emotional function.  The last piece of the puzzle, then, would be to find evidence for developmental abnormalities in these networks and to show that such abnormalities mediate social and/or emotional function.  Such evidence is abundant.

Regarding the effects of MecP2, however, we can consider the work of Adachi et al., who were able to delete the MecP2 gene – just in the amygdala – of an (albeit adult) mouse.  Doing so led to the disruption of various emotional behaviors – BUT NOT – of the various social interaction deficits that are observed when MecP2 is deleted in the entire forebrain.  This was also the case when the team infused HDAC inhibitors into the amygdala, suggesting that loss of transcriptional repression in the adult amygdala may underlie the emotional impairments seen in some autism spectrum disorders.  Hence, such emotional impairments (anxiety etc.) might be treatable in adults (more on this result later and its implications for gene therapy).

Whew!  Admittedly, the more you know – the more you don’t know.  True here, but still amazing to see the literature starting to interlink across human-genetic, mouse-genetic, human-functional-imaging levels of analysis. Hoping this rambling was helpful.


Read Full Post »

[Image: Violinist marionette performs – by eugene via Flickr]

The homunculus (argument) is a pesky problem in cognitive science – a little guy who might suddenly appear when you propose a mechanism for decision making, spontaneous action, forethought etc. – and would take credit for the origination of the neural impulse.  While there are many mechanistic models of decision making that have slain the little bugger – by invoking competition between past experience and memory as the source of new thoughts and ideas – one must always tread lightly, I suppose, to ensure that cognitive mechanisms are based completely in neural properties, devoid of a homuncular source.

Still, the human mind must begin somewhere.  After all, it’s just a ball of cells initially, and then a tube, and then some more folds, layers, neurogenesis and neural migration etc. before maturing – miraculously – into a child that one day looks at you and says, “momma” or “dada”.  How do these neural networks come into being?  Who or what guides their development toward that unforgettable “momma (dada)” moment?  A somewhat homuncular “genetic program” – whose instructions we can attribute to millions of years of natural selection?  Did early hominid babies say “momma (dada)”?  Hmmm.  Seems like we might be placing a lot of faith in the so-called “instructions” provided by the genome, but who am I to quibble.

On the other hand, you might find that the recent paper by Akhtar et al., “Histone Deacetylases 1 and 2 Form a Developmental Switch That Controls Excitatory Synapse Maturation and Function” [doi:10.1523/jneurosci.0097-09.2009], may change the way you think about cognitive development.  The team explores the function of two very important epigenetic regulators of gene expression – histone deacetylases 1 and 2 (HDAC1, HDAC2) – on the functionality of synapses in early-developing and mature mice.  By epigenetic, I refer to the role of these genes in regulating chromatin structure, not in direct, site-specific DNA binding.  The way the HDAC genes work is by de-acetylating – removing acetyl groups from – histone proteins.  Acetyl groups neutralize the positive charge of histone lysine residues, weakening the electrostatic attraction between the histones and the negatively charged phosphate backbone of DNA.  So when histone proteins carry acetyl groups, they do NOT bind DNA tightly, and the DNA molecule is more open and exposed to the binding of transcription factors that activate gene expression.  Thus if one (as Akhtar et al. do) turns off a de-acetylating HDAC gene, the resulting animal has a genome that is more open and exposed to transcription factor binding and gene expression.  Less HDAC = more gene expression!

What were the effects on synaptic function?  To summarize, the team found that in early development (neonatal mouse hippocampal cells), cells where the HDAC1 or HDAC2 genes were turned off (either through pharmacologic blockers or via partial deletion of the gene(s) via lentivirus introduction of Cre recombinase) had more synapses and more synaptic electrical activity than did hippocampal cells from control animals.  Keep in mind that the HDACs are located in the nucleus of the neuron and the synapses are far, far away.  Amazingly – they are under the control of an epigenetic regulator of gene expression; hence, ahem, “epigenetic puppetmasters”.  In adult cells, the knockdown of HDACs did not show the same effects on synaptic formation and activity.  Rather, cells where HDAC2 was shut down showed less synaptic formation and activity (HDAC1 had no effect).  Again, it is amazing to see effects on synaptic function regulated at such vast distances.  Neat!
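The developmental switch reduces to a small lookup table – the directions are as summarized above from the paper, the wording is mine:

```python
# HDAC knockdown outcomes on hippocampal synapses (per Akhtar et al., as
# summarized in this post; 'knockdown' = pharmacologic block or Cre deletion).
effects = {
    ('neonatal', 'HDAC1'): 'more synapses, more activity',
    ('neonatal', 'HDAC2'): 'more synapses, more activity',
    ('adult',    'HDAC1'): 'no detectable change',
    ('adult',    'HDAC2'): 'fewer synapses, less activity',
}
for (stage, gene), outcome in effects.items():
    print(f'{gene} knockdown, {stage:8s} -> {outcome}')
```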

The authors suggest that the epigenetic regulatory system of HDAC1 & 2 can serve to regulate the overall levels of synaptic formation during early cognitive development.  If I understand their comments in the discussion, this may be because you don’t necessarily want to have too many active synapses during the formation of a neural network.  Might such networks be prone to excitotoxic damage, or perhaps to being locked in to inefficient circuits?  The authors note that HDACs interact with MecP2, a gene associated with Rett Syndrome – a developmental disorder (in many ways similar to autism) where the neural networks underlying cognitive development in children fail to progress to support higher, more flexible forms of cognition.  Surely the results of Akhtar et al. must be a key to understanding and treating these disorders.

Interestingly, here, the controller of these developmental phenotypes is not a “genetic program” but rather an epigenetic one, whose effects are widespread across the genome and heavily influenced by the environment.  So no need for a homunculus here.


Read Full Post »

[Image: Lonely child – by kodomut via Flickr]

For humans, there are few sights more heart-wrenching than an orphaned child (or any orphaned vertebrate for that matter).  Isolated, cold, unprotected, vulnerable – what could the cold, hard calculus of natural selection – “red in tooth and claw” – possibly have to offer these poor, vulnerable unfortunates?

So I wondered while reading “Functional CRH variation increases stress-induced alcohol consumption in primates” [doi:10.1073/pnas.0902863106].  In this paper, the authors considered the role of a C-to-T change at position -248 in the promoter of the corticotropin-releasing hormone (CRH or CRF) gene.  Using nuclear extracts from hypothalamic cells, they demonstrate that this C-to-T nucleotide change disrupts protein-DNA binding, and, using transcriptional reporter assays, that the T-allele drives higher levels of transcription after forskolin stimulation.  Presumably, the biochemical differences conferred by the T-allele can have a physiological role and alter the wider functionality of the hypothalamic-pituitary-adrenal (HPA) axis, in which the CRH gene plays a critical role.

The authors ask whether primates (rhesus macaques) who differ in genotype (C/C vs. C/T) show any differences in physiological stress reactivity – as predicted by differences in the activity of the CRH promoter.  As a stressor, the team used a form of brief separation stress and found that there were no differences in HPA function (assessed by ACTH and cortisol levels) in animals who were reared by their mothers.  However, when the stress paradigm was performed on animals who were reared without a mother (with access to play with other age-matched macaques), there were significant differences in HPA function between the 2 genetic groups (T-allele carriers showing greater release of stress hormones).  Further behavioral assessments found that the peer-reared animals who carried the T-allele explored their environment less when socially separated as adults (again, no C vs. T differences in maternally reared animals).  In a separate assessment, the T-carriers showed a preference for sweetened alcohol vs. sweetened water in ad lib consumption.
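The pattern of results is a textbook gene-by-environment interaction; schematically (the numbers below are invented – only the pattern follows the paper):

```python
# Genotype effects appear only in peer-reared (motherless) animals: relative
# stress-hormone responses to brief separation, illustrative values only.
stress_response = {
    ('mother-reared', 'C/C'): 1.0,
    ('mother-reared', 'C/T'): 1.0,   # no genotype effect with a mother
    ('peer-reared',   'C/C'): 1.3,
    ('peer-reared',   'C/T'): 1.9,   # T-allele amplifies the stress response
}
for (rearing, genotype), level in stress_response.items():
    print(f'{rearing:13s} {genotype}: {level:.1f}x baseline ACTH/cortisol')
```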

One way of summarizing these findings could be to say that having no mother is a bad thing (more stress reactivity) and having the T-allele just makes it worse!  Another way could be to say that the T-allele enhances the self-protection behaviors (less exploration could be advantageous in the wild?) that arise from being orphaned.  Did mother nature (a.k.a. natural selection) provide the macaque with a boost of self-preservation (in the form of a T-allele that enhances emotional/behavioral inhibition)?  I’m not sure, but it will be fun to report on further explorations of this query.  Click here for an interview with the corresponding author, Dr. Christina Barr.

—p.s.—

The authors touch on previous studies (here and here) that explored natural selection on this gene in primates and point out that humans and macaques both have 2 major haplotype clades (which have perhaps been maintained in a yin-yang sort of fashion over the course of primate evolution) and that humans have a C-to-T change (rs28364015) which would correspond to position -201 in the macaque (position 68804715 on macaque chr. 8), and which could be readily tested for similar functionality in humans.  In any case, the T-allele is rare in macaques, so it may be that few orphaned macaques ever endure the full T-allele experience.  In humans, the T-allele at rs28364015 seems more common.

Nevertheless, this is yet another – complicated – story of how genome variation is not destiny, but rather a potentiator of life experience – for better or worse.  Related posts on genes and early development (MAOA-here), (DAT-here), (RGS2-here), or just click the “development tag“.


Read Full Post »

**PODCAST accompanies this post**  In the brain, as in other aspects of life, timing is everything.  On an intuitive level, it’s pretty clear that, since neurons have to work together in widely distributed networks, they have a lot of incentive to talk to each other in a rhythmic, organized way.  Think of a choir that sings together vs. a cacophony of kids in a cafeteria – which would you rather have as your brain?  A technical way of saying this could be, “Clustered bursting oscillations, with in-phase synchrony within each cluster, have been proposed as a binding mechanism.  According to this idea, neurons that encode a particular stimulus feature synchronize in the same cluster.”  A less technical way of saying this was first uttered by Carla Shatz, who said, “Neurons that fire together wire together” and “Neurons that fire apart wire apart“.  So it seems that the timing and synchronicity of neural firing – the rushing “in” of Na+ ions and rushing “out” of K+ ions that occur during the depolarization and repolarization cycles of an action potential take only a few milliseconds – is something that neurons would want to control tightly.

With this premise in mind, it is fascinating to ponder some recent findings reported by Huffaker et al. in their research article, “A primate-specific, brain isoform of KCNH2 affects cortical physiology, cognition, neuronal repolarization and risk of schizophrenia” [doi: 10.1038/nm.1962].  Here, the research team has identified a gene, KCNH2, that is both differentially expressed in the brains of schizophrenia patients vs. healthy controls and contains several SNP genetic variants (rs3800779, rs748693, rs1036145) that are associated with the disorder in multiple patient populations.  Furthermore, the team finds that the risk-associated SNPs are associated with greater expression of an isoform of KCNH2 – a kind of special isoform – one that is expressed in humans and other primates, but not in rodents (rodents carry a frame-shift nucleotide change that renders the ATG start codon out of frame, leaving their copy non-expressed).  Last I checked, primates and rodents shared a common ancestor many millennia ago.  Very neat – since some have suggested that newer evolutionary innovations might still have some kinks that need to be worked out.

In any case, the research team shows that the 3 SNPs are associated with a variety of brain parameters such as hippocampal volume, hippocampal activity (declarative memory task) and activity in the dorsolateral prefrontal cortex (DLPFC). The main suggestion of how these variants in KCNH2 might lead to these brain changes and risk for schizophrenia comes from previous findings that mutations in this gene screw up the efflux of K+ ions during the repolarization phase of an action potential.  In the heart (where KCNH2 is also expressed) this has been shown to lead to a form of “long QT syndrome“.  Thus, the team explores this idea using primary neuronal cell cultures and confirms that greater expression of the primate isoform leads to non-adaptive, quickly deactivating, faster firing patterns, presumably due to the extra K+ channels. 
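To build intuition for the repolarization argument, here is a minimal (and emphatically not the paper’s) integrate-and-fire cartoon, in which a stronger/faster repolarizing K+ current simply shortens the after-spike recovery period and so raises the firing rate:

```python
# Leaky integrate-and-fire neuron under constant drive. k_strength stands in
# for the speed/strength of KCNH2-mediated K+ efflux after a spike; all
# numbers are arbitrary illustration, not fitted to any data.

def count_spikes(k_strength, t_steps=2000, dt=0.1):
    v, spikes, refractory = -65.0, 0, 0
    for _ in range(t_steps):
        if refractory > 0:
            refractory -= 1
            continue
        v += (-(v + 65.0) + 20.0) * dt           # leak toward rest + drive
        if v >= -50.0:                           # threshold crossed: spike!
            spikes += 1
            v = -65.0                            # reset membrane potential
            refractory = int(50 / k_strength)    # faster K+ efflux -> shorter recovery
    return spikes

print('normal KCNH2 isoform :', count_spikes(k_strength=1.0), 'spikes')
print('risk ("fast") variant:', count_spikes(k_strength=2.0), 'spikes')
```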

The authors hint that fast & extended spiking – in the context of human cognition – is thought to be a good thing, since it’s needed to allow the binding of multiple input streams.  However, in this case, the variants seem to have pushed the process to a non-adaptive extreme.  Perhaps there is a seed of an interesting evolutionary story here, since the innovation (longer, extended firing in the DLPFC) that allows humans to ponder so many ideas at the same time may have some legacy non-adaptive genetic variation still floating around in the human lineage.  Just a speculative muse – but fun to consider in a blog post.

In any case, the team has substantiated a very plausible mechanism for how the genetic variants may give rise to the disorder.  A scientific tour-de-force if there ever was one.

On a personal note, I checked my 23andMe profile and found that while rs3800779 and rs748693 were not assayed, rs1036145 was, and I – boringly – am a middling G/A heterozygote.  In this article, the researchers find that the A/As showed smaller right-hippocampal grey matter volume, but the G/As were not different from the G/Gs.  During a declarative memory task, the G/Gs showed little or no change in hippocampal activity, while the A/A and G/A groups showed changes – but only in the left hippocampus.  In the N-back task (a working memory task), the A/As showed more changes in brain activation in the right DLPFC compared to the G/Gs and G/As.

For further edification, here is a video showing the structure of the KCNH2-type K+ channel.  Marvel at the tiny pore that allows red K+ ions to leak through during the repolarization phase of an action potential.   **PODCAST accompanies this post**


Read Full Post »

[Image: Visualization of a DTI measurement… – via Wikipedia]

Within the genetic news flow there is often, and rightly so, much celebration when a gene for a disease is identified.  This is indeed an important first step, but the slogging from that point to a treatment – and the many small breakthroughs along the way – can often go unnoticed.  One reason why these 2nd (3rd, 4th, 5th …) steps are so difficult is that, in some cases, folks who carry “the gene” variant for a particular disorder do not, in fact, display symptoms of the disorder.

Huh?  One can carry the risk variant – or many risk variants – and not show any signs of illness?  Yes, this is an example of what geneticists refer to as variable penetrance: the notion of carrying a mutation but not outwardly displaying the mutant phenotype.  This is one of the main reasons why genes are not deterministic, but much more probabilistic, in their influence on human development.

Of course, in the brain, such complexities exist – perhaps even more so.  For example, take the neurological condition known as dystonia, a movement disorder that, according to the Dystonia Medical Research Foundation, “causes the muscles to contract and spasm involuntarily. The neurological mechanism that makes muscles relax when they are not in use does not function properly. Opposing muscles often contract simultaneously as if they are “competing” for control of a body part. The involuntary muscle contractions force the body into repetitive and often twisting movements as well as awkward, irregular postures.”  Presently there are more than a dozen genes and/or chromosomal loci associated with dystonia – two of the major genes, DYT1 and DYT6, having been identified as factors in early-onset forms of dystonia.  Now, as we enter the era of personal genomes, an individual can assess their (own, child’s, preimplantation embryo’s!) genetic risk for such rare genetic variants – whose effects may not be visible until age 12 or older.  In the case of DYT1, a rare mutation (a GAG deletion at position 946, which causes the loss of a glutamate residue in the torsin A protein) gives rise to dystonia in about 30-40% of carriers.  So, how might these genes work, and why do some individuals develop dystonia and others not?  Indeed, these are the complexities that await in the great expanse between gene identification and treatment.
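Incomplete penetrance is easy to state in code – a tiny simulation using the ~30-40% figure quoted above (the cohort size and exact rate are invented):

```python
import random

PENETRANCE = 0.35                 # rough midpoint of the quoted 30-40% range
carriers = 10_000                 # hypothetical cohort of GAG-deletion carriers
affected = sum(random.random() < PENETRANCE for _ in range(carriers))
print(f'{affected} of {carriers} carriers manifest dystonia '
      f'({100 * affected / carriers:.1f}%) -- most carriers never do')
```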

An inspection of the molecular aspects of torsin A (DYT1) shows that it is a member of the AAA family of adenosine triphosphatases and is related to the Clp protease/heat shock family of genes that help to properly fold polypeptide chains as they are secreted from the endoplasmic reticulum of the cell – a sort of handyman, general-purpose gene (expressed in almost every tissue in the body) that sits on an assembly line and hammers away to help make sure that proteins have the right shape as they come off.  Not much of a clue for dystonia – hmm.  Similarly, the THAP domain-containing, apoptosis-associated protein 1 (THAP1) gene (a.k.a. DYT6) is also expressed widely in the body and seems to function as a DNA-binding protein that regulates aspects of cell cycle progression and apoptosis.  Also not much of an obvious clue to dystonia – hmm, hmm.  Perhaps you can now see why the identification of “the gene” – something worth celebrating – can just leave you aghast at how much more you don’t know.

That these genes influence an early developmental form of the disorder suggests a possible developmental role for these rather generic cogs in the cellular machinery.  But where?  how?  & why an effect in some folks and not others?  To these questions comes an amazing analysis of DYT1 and DYT6 carriers in the article entitled, “Cerebellothalamocortical Connectivity Regulates Penetrance in Dystonia” by Argyelan and colleagues [doi: 10.1523/JNEUROSCI.2300-09.2009].  In this article, the research team uses a method called diffusion tensor imaging (sensitive to white matter microstructure) to examine brain structure and function among individuals who carry the mutations but either DO or DO NOT manifest the symptoms (as well as a group of unaffected non-carriers).  By looking at white matter tracts (superhighways of neural traffic) throughout the brain, the team was able to ask whether some tracts were different between the groups.  In this way, the team can begin to better understand the causal pathway between these run-of-the-mill genes (torsin A and THAP1) and the complex pattern of muscle spasms that arises from their mutations.

To get right to the findings, the team discovered that in one particular tract – a superhighway known as the “cerebellar outflow pathway in the white matter of lobule VI, adjacent to the dentate nucleus” (not as quaint as Route 66) – participants who DO manifest dystonia had less tract integrity and connectivity than those who DO NOT manifest it, and than healthy controls (who have the most connectivity there).  Subsequent measures of resting-state blood flow confirmed that the disruptions in white matter tracts were correlated with cerebellar outflow to the thalamus and – more importantly – with activity in areas of the motor cortex.  The correlations were such that individuals who DO manifest dystonia had greater activity in the motor cortex (this is what dystonia really comes down to – too much activity in the motor cortex).

Thus the team was able to query gene carriers using their imaging methods and zero in on “where in the brain” these generic proteins exert a detrimental effect.  This seems to me to be a huge step forward in understanding how a run-of-the-mill gene can alter brain function in such a profound way.  Now that they’ve found the likely circuit (is it the white matter per se or the neurons?), more focus can be applied to how this circuit develops – and can be repaired.


Read Full Post »

[Image: Backyard trampoline – by Kevin Steele via Flickr]

For more than a decade, we’ve known that at least 95% of the human genome is junk – or junque – if you’re offended by the thought that “you” emerged from a single cell whose genome is mostly a vast pile of crap – or crappe – if you insist.  Hmmm, what is this crap?  It turns out to be a lot of random repeating sequences and a massive collection of evolutionary artifacts left over from the evolution of earlier genomes – mainly bits of retroviruses that once inserted themselves irreversibly into our ancestors’ genomes.  One subset of this material – can we upgrade it from crappe to “relic” now? – is something we’ve labelled “autonomously mobile DNA sequences”, or more specifically, “long interspersed nuclear elements (LINEs or L1s)”.  This class of DNA relic comprises more than 15% of the human genome (that’s about 3-5x more than the relevant genomic sequence from which you emerge) and retains the ability to copy itself – via an RNA intermediate – and insert the copy into new places in the genome.  This has been observed to happen in the germ line of humans, and a few L1 insertions are even responsible for genetic forms of human disease (for example, in the factor VIII gene, giving rise to haemophilia).  The mechanism of transposition – or “jumping”, as these elements are sometimes called “jumping genes” – involves the assembly of a certain type of transcriptional, transport and reverse-transcription (RNA back to DNA) apparatus that is known to be available in stem cells, but hardly ever in somatic cells.

Except, it would seem, for the brain – which, as we’ve covered here before, keeps its precious neurons and glia functioning under separate rules.  Let’s face it: if a liver cell dies, you just replace it without notice, but if neurons die, so do your childhood memories.  So it’s not too surprising, perhaps, that brain cells have special ‘stem-cell-like’ rules for keeping themselves youthful.  This seems to be borne out again in a paper entitled, “L1 retrotransposition in human neural progenitor cells” by Coufal et al. [doi:10.1038/nature08248].  Here the team shows that L1 elements are able to transpose themselves in neural stem cells and that there are more L1 elements (about 80 copies more per cell) in the hippocampus than in liver or heart cells.  So apparently the hippocampus, which does seem to contain a niche of stem cells, permits the transposition or “jumping” of L1 elements in a way that the liver and heart do not.  Sounds like a fun place to be a gene!
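For a flavor of how a relative copy-number difference like this can be estimated, here is a hypothetical qPCR-style calculation using the standard ΔΔCt method (the Ct values below are invented; only the “hippocampus carries extra L1 copies” conclusion comes from the paper):

```python
# Relative quantity = 2^(-ddCt), where ddCt compares the L1 signal (target)
# to a single-copy reference gene, in a test tissue vs. a calibrator tissue.

def relative_quantity(ct_target, ct_reference, ct_target_cal, ct_reference_cal):
    ddct = (ct_target - ct_reference) - (ct_target_cal - ct_reference_cal)
    return 2 ** (-ddct)

# calibrator = liver; test tissue = hippocampus (Ct values are placeholders)
fold = relative_quantity(ct_target=14.8, ct_reference=22.0,
                         ct_target_cal=15.0, ct_reference_cal=22.0)
print(f'hippocampal L1 content ~ {fold:.2f}x liver')
```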


Read Full Post »

[Image: A column of the cortex – by Ethan Hein via Flickr]

Here’s a new addition to a rapidly growing list of findings for the valine-to-methionine substitution in the COMT gene (rs4680).  The paper, “Effects of the Val158Met catechol-O-methyltransferase polymorphism on cortical structure in children and adolescents” by Shaw and colleagues at the NIMH [doi:10.1038/mp.2008.121], reports that when genotype was used as a regressor for cortical thickness measures in children (8-14 years of age), significant associations were found in the right inferior frontal gyrus and the right superior/middle temporal gyrus (in both areas, the Met/Met group had thicker cortex).  The team notes that the findings in the frontal cortex were expected – many others have found associations of COMT with this brain area using other imaging modalities.  The temporal lobe findings, however, are something new.  The researchers offer no speculation on the mechanisms/implications of this new finding, but the two brain regions are known to be interconnected – perhaps supporting aspects of language, memory and/or other cognitive processes?
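The statistical move here is just a regression of thickness on allele count; a toy version with invented numbers (the paper’s actual analysis is vertex-wise and covaried, unlike this sketch):

```python
# Ordinary least-squares slope of cortical thickness (mm) on Met-allele count
# (Val/Val=0, Val/Met=1, Met/Met=2). Data below are fabricated for illustration.

met_alleles = [0, 0, 1, 1, 1, 2, 2, 2, 0, 2, 1, 0]
thickness_mm = [2.61, 2.58, 2.70, 2.66, 2.72, 2.81, 2.79, 2.84, 2.55, 2.80, 2.68, 2.60]

n = len(met_alleles)
mean_x = sum(met_alleles) / n
mean_y = sum(thickness_mm) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(met_alleles, thickness_mm))
         / sum((x - mean_x) ** 2 for x in met_alleles))
print(f'~{slope * 1000:.0f} um of extra thickness per Met allele (toy data)')
```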

Perhaps the findings provide a clue to an important role that genes may play in the development of cognitive function.


Read Full Post »

[Image: labyrinthine circuit board lines – by quapan via Flickr]

Amidst a steady flow of upbeat research news in the behavioral-genetics literature, there are many inconvenient, uncomfortable, party-pooping sentiments that are more often left unspoken.  I mean, it’s a big jump – from gene to behavior – and just too easy to spoil the mood by reminding your colleagues that, “well, everything is connected to everything” or “that gene association holds only for that particular task“.  Such may often have been the case in the past decade, when the so-called imaging-genetics literature emerged to parse out a role for genetic variation in the structure and functional activation of the brain using various neuroimaging methods.  Sure, the 5HTT-LPR was associated with amygdala activation during a face-matching task, but what about other tasks (and imaging modalities) and other brain regions that express this gene?  How could anyone (let alone NIMH) make sense out of all of those – not to mention the hundreds of other candidate genes poised for imaging-genetics research?

With this in mind, it is a pleasure to meet the spoiler-of-spoilers!  Here is a research article that examines a few candidate genetic polymorphisms and compares their findings across multiple imaging modalities.  In his article, “Neural Connectivity as an Intermediate Phenotype: Brain Networks Under Genetic Control” [doi: 10.1002/hbm.20639], Andreas Meyer-Lindenberg examines the DARPP32, 5HTT and MAOA genes and asks whether their associations with aspects of brain structure/function are in any way consistent across different neuroimaging modalities.  Amazingly, the answer seems to be yes.

For example, he finds that DARPP32 variants are consistently associated with the striatum and prefrontal-striatal connectivity – even though the data were collected using voxel-based morphometry, fMRI in separate tasks, and an analysis of functional connectivity.  Similarly, both the 5HTT and MAOA gene promoter repeats showed consistent findings within a medial prefrontal and amygdala circuit across these various modalities.

This type of finding – if it holds up to the spoilers & party poopers – could radically simplify the understanding of how genes influence cognitive function and behavior.  As suggested by Meyer-Lindenberg, “features of connectivity often better account for behavioral effects of genetic variation than regional parameters of activation or structure.”  He suggests that dynamic causal modeling of resting state brain function may be a powerful approach to understand the role of a gene in a rather global, brain-wide sort of way.  I hope so and will be following this cross-cutting “connectivity” approach in much more detail!


Read Full Post »

[Image: Human chromosome 15 – via Wikipedia]

One way to organize the great and growing body of research into autism is via a sort of ‘top-down’ vs. ‘bottom-up’ perspective.  From the ‘top-down’, one can read observational research that carefully catalogs the many & varied social and cognitive attributes associated with autism.  Oftentimes, these behavioral studies are coupled with neurochemical or neuroimaging studies that test whether variation in such biomarkers is correlated with aspects of autism.  In this manner, the research aims to dig down into the physiology and biochemistry of the developing brain to find out what is different and which differences might predict the onset of autistic traits.  At the deepest biological level – the bedrock, so to speak – are a number of genetic variations that have been correlated with autism.  These genetic variants permit another research strategy – a ‘bottom-up’ strategy that allows investigators to ask, “what goes wrong when we manipulate this genetic variant?”  While proponents of each strategy are painfully aware of the limitations of their own strategy – oft on the barbed end of commentary from the other side – it is especially exciting when the ‘top-down’ and ‘bottom-up’ methods find themselves meeting in agreement in the middle.

So is the case with Nakatani et al., “Abnormal Behavior in a Chromosome-Engineered Mouse Model for Human 15q11-13 Duplication Seen in Autism” [doi: 10.1016/j.cell.2009.04.024], who created a mouse carrying a 6.3-megabase duplication of a region that luckily happens to be remarkably conserved – in gene identity and order – with the 15q11-13 region in humans, a region that, when duplicated, is found in about 5% of cases of autism.  [click here for maps of mouse-human synteny/homology on human chr15]  Thus the team was able to engineer mice with the duplication and ask, “what goes wrong?” and “does it resemble autism in any kind of meaningful way (after all, these are mice we’re dealing with)?”

Well, the results are rather astounding to me.  Most amazing is the expression of a small nucleolar RNA (snoRNA) – SNORD115 (mouse-HBII52) – that functions in the nucleolus of the cell and plays a role in the alternative splicing of exon Vb of the 5HT2C receptor.  The team found that the editing of 5HTR2C was altered in the duplication mice and also that Ca++ signalling was increased when the 5HTR2C receptors were stimulated in the duplication mice (compared to controls).  Thus, a role for altered serotonin function – a longstanding finding of the ‘top-down’ approach – was met midway and affirmed by this ‘bottom-up’ approach!  Also included in the paper are descriptions of the aberrant social behaviors of the mice via a 3-chambered social interaction test, where duplication mice were rather indifferent to a stranger mouse (wild-type mice often will hang out with each other).
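For context, a 3-chambered test boils down to a simple preference score; a sketch with invented times (only the indifference pattern reflects the paper’s finding):

```python
# Sociability index: +1 = all time with the stranger mouse, 0 = indifferent,
# -1 = all time with the empty cage/object. Times (seconds) are made up.

def sociability_index(time_with_stranger, time_with_object):
    total = time_with_stranger + time_with_object
    return (time_with_stranger - time_with_object) / total

print('wild-type      :', round(sociability_index(220, 110), 2))  # prefers the stranger
print('15q duplication:', round(sociability_index(160, 155), 2))  # ~indifferent
```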

Amazing stuff!

Another twist to the story is the way in which the 15q11-13 region displays a phenomenon known as genomic imprinting, whereby only the mother’s or the father’s copy of a gene is expressed.  For example, the authors show that the mouse duplication is ‘maternally imprinted’, meaning that pups do not express the copy of the duplication that comes from the mother (its expression is shut down via epigenetic mechanisms that involve – wait for it – snoRNAs!), so the effects that they report are only from mice who obtained the duplication from their fathers.  So, if you by chance were wondering why it’s so tough to sort out the genetic basis of autism – here’s one reason why.  On top of this, the 5HTR2C gene is located on the X-chromosome, which complicates the story even more in terms of sorting out the inheritance of the disorder.

Further weird & wild is the fact that the UBE3A gene (paternally imprinted) – mutations in which cause Angelman Syndrome – sits in this region, as does the SNRPN gene (maternally imprinted), which encodes a protein that influences alternative RNA splicing and also gives rise to Prader-Willi syndrome.  Thus, this tiny region of the genome, which carries so-called “small” RNAs, can influence a multitude of developmental disabilities.  Certainly a region of the genome that merits further study!!


Read Full Post »