
Posts Tagged ‘DNA’


Most cells in your adult body are “terminally differentiated” – meaning that they have developed from stem cells into the final liver, heart, muscle or endothelial cell that they were meant to be.  From that point onward, cells are able to “remember” to stay in this final state – in part – via stable patterns of DNA methylation that reinforce the regulation of “the end state” of gene expression for that cell.  As evidence for this role of DNA methylation, it has been observed that levels of DNA methyltransferase (DNMT) decline once cells are fully differentiated, so that they cannot modify or disrupt their patterns of methylation.

NOT the case in the brain! Even though neurons in the adult brain are fully differentiated, levels of methyltransferases DO NOT decline.  Why not? After all, we wouldn’t want our neurons to turn into liver cells, or big toe cells, would we?

One hypothesis, suggested by David Sweatt and colleagues, is that neurons have more important things to “remember”.   They suggest in their free and open research article, “Evidence That DNA (Cytosine-5) Methyltransferase Regulates Synaptic Plasticity in the Hippocampus” [doi: 10.1074/jbc.M511767200] that:

DNA methylation could have lasting effects on neuronal gene expression and overall functional state. We hypothesize that direct modification of DNA, in the form of DNA (cytosine-5) methylation, is another epigenetic mechanism for long term information storage in the nervous system.

By measuring methylated vs. unmethylated DNA in the promoter of the reelin and BDNF genes and relating this to electrophysiological measures of synaptic plasticity, the research team finds correlations between methylation status and synaptic plasticity.  More specifically, they find that zebularine (an inhibitor of DNMT) CAN block long-term potentiation (LTP), but does NOT block baseline synaptic transmission or the ability of synapses to fire in a theta-burst pattern (needed to induce LTP).
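For readers who want a feel for the kind of measurement involved, here is a minimal sketch – my own illustration, not the authors’ pipeline – that computes the fraction of methylated CpG calls in a promoter (say, reelin) from bisulfite-style counts and correlates it with an LTP measure.  The slice data, counts and function names are all invented for illustration.

```python
# Hypothetical sketch: relate promoter methylation to an LTP measure.
# All numbers below are invented for illustration only.
from scipy.stats import pearsonr

def methylation_fraction(methylated_calls, total_calls):
    """Fraction of CpG calls that were methylated in a promoter region."""
    return methylated_calls / total_calls

# One entry per hippocampal slice: (methylated calls, total calls, LTP % above baseline)
slices = [
    (120, 400, 165.0),
    (310, 420, 128.0),
    (95,  380, 172.0),
    (260, 410, 140.0),
]

fractions = [methylation_fraction(m, t) for m, t, _ in slices]
ltp = [l for _, _, l in slices]

r, p = pearsonr(fractions, ltp)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```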

This suggests that the epigenetic machinery used for DNA methylation may have a role in the formation of cellular memory – but not in the same sense as in other cells in the body – where cells remember to remain in a terminally differentiated state.

In the brain, this epigenetic machinery may help cells remember stuff that’s more germane to brain function … you know … our memories and stuff.


Read Full Post »


just a pointer to: Genetic Future’s pointer to the recent article, “Family become first to have DNA sequenced for non-medical reasons“.    The father suggests, “it will be ethically improper if you don’t have your children sequenced“.

Early days.


Read Full Post »

Twin studies have long suggested that genetic variation plays a part in healthy and disordered mental life.  The problem, however – some 10 years now since the full genome sequence era began – has been finding the actual genes that account for this heritability.

It sounds simple on paper – just collect lots of folks with disorder X and look at their genomes in reference to a demographically matched healthy control population.  Voila! Whatever is different is a candidate for genetic risk.  Apparently, not so.

The missing heritability problem that clouds the birth of the personal genomes era refers to the baffling inability to find enough common genetic variants that can account for the genetic risk of an illness or disorder.

There are any number of reasons for this … (i) even as any given MZ and DZ twin pair shares genetic variants that predispose them toward similar brains and mental states, it may be the case that different MZ and DZ pairs have different types of rare genetic variation, thus diluting out any similar patterns of variation when large pools of cases and controls are compared …  (ii) also, the way that the environment interacts with common risk-promoting genetic variation may be quite different from person to person – making it hard to find variation that is similarly risk-promoting in large pools of cases and controls … and many others I’m sure.

One research group recently asked whether the type of common genetic variation (SNP vs. CNV) might inform the search for the missing heritability.  The authors of the recent paper, “Genome-wide association study of CNVs in 16,000 cases of eight common diseases and 3,000 shared controls” [doi:10.1038/nature08979] looked at an alternative to the usual SNP markers – so-called common copy number variants (CNVs) – and asked if these markers might provide a stronger accounting for genetic risk.  While a number of previous papers in the mental health field have indeed shown associations with CNVs, this massive study (some 3,432 CNV probes in 2,000 or so cases and 3,000 controls) did not reveal an association with bipolar disorder.  Furthermore, the team reports that common CNV variants are already in fairly strong linkage disequilibrium with common SNPs and so perhaps may not have reached any farther into the abyss of rare genetic variation than previous GWAS studies.
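To make the linkage-disequilibrium point concrete, here is a minimal sketch – my own illustration, not taken from the paper – of the standard r² calculation between two biallelic markers, say a common CNV treated as present/absent and a nearby tag SNP.  The haplotype and allele frequencies are invented.

```python
# Illustrative r^2 (linkage disequilibrium) between two biallelic markers,
# e.g. a common CNV (present/absent) and a nearby tag SNP.
# Frequencies below are invented for illustration.

def ld_r_squared(p_ab, p_a, p_b):
    """r^2 between two biallelic loci.

    p_ab : frequency of the haplotype carrying allele A and allele B
    p_a  : frequency of allele A at locus 1
    p_b  : frequency of allele B at locus 2
    """
    d = p_ab - p_a * p_b                      # coefficient of disequilibrium
    return d ** 2 / (p_a * (1 - p_a) * p_b * (1 - p_b))

# Example: CNV-deletion allele at 30%, SNP minor allele at 32%,
# and they co-occur on 29% of haplotypes.
print(ld_r_squared(p_ab=0.29, p_a=0.30, p_b=0.32))  # high r^2 => the SNP tags the CNV well
```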

Disappointing perhaps, but a big step forward nonetheless!  What will the personal genomes era look like if we all have different forms of rare genetic variation?


Read Full Post »


If you’ve started to notice the arrival of spring blossoms, you may have wondered, “how do the blossoms know when it’s spring?”  Well, it turns out that it’s not the temperature but rather that plants sense the length of the daylight cycle in order to synchronize their own life cycles with the seasons.  According to the Wikipedia entry on photoperiodism, “Many flowering plants use a photoreceptor protein, such as phytochrome or cryptochrome, to sense seasonal changes in night length, or photoperiod, which they take as signals to flower.”

It turns out that humans are much the same. Say wha?!

Yep, as the long-ago descendants of single cells who had to eke out a living during day (when the sun emits mutagenic UV radiation) and night cycles, our very own basic molecular machinery that regulates transcription, translation, replication and a host of other cellular functions is remarkably sensitive – entrained – in a clock-like fashion to the rising and setting sun.  This is because, in our retinas, there are light-sensing cells that send signals to the suprachiasmatic nucleus (SCN), which then – via the pineal gland – secretes systemic hormones such as melatonin that help synchronize cells and organs in your brain and body.  When this process is disrupted, folks can feel downright lousy, as seen in seasonal affective disorder (SAD), delayed sleep phase syndrome (DSPS) and other circadian rhythm disorders.

If you’re skeptical, consider the effects of genetic variation in genes that regulate our circadian rhythms, often called “clock” genes – very ancient genes that keep our cellular clocks synchronized with each other and the outside environment.  Soria et al., have a great paper entitled, “Differential Association of Circadian Genes with Mood Disorders: CRY1 and NPAS2 are Associated with Unipolar Major Depression and CLOCK and VIP with Bipolar Disorder” [doi: 10.1038/npp.2009.230] wherein they reveal that normal variation in these clock genes is associated with mood regulation.

A few of the highlights reported are rs2287161 in the CRY1 gene,  rs11123857 in the NPAS2 gene, and rs885861 in the VIPR2 gene – where the C-allele, G-allele and C-allele, respectively, were associated with mood disorders.

I’m not sure how one would best interpret genetic variation of such circadian rhythm genes.  Perhaps they index how much a person’s mood could be influenced by changes or disruptions to the normal rhythm??  Not sure.  My 23andMe data shows the non-risk AA genotype for rs11123857 (the others are not covered by 23andMe).


Read Full Post »

According to Wikipedia, “Jean Philippe Arthur Dubuffet (July 31, 1901 – May 12, 1985) was one of the most famous French painters and sculptors of the second half of the 20th century.”  “He coined the term Art Brut (meaning “raw art,” oftentimes referred to as ‘outsider art’) for art produced by non-professionals working outside aesthetic norms, such as art by psychiatric patients, prisoners, and children.”  From this interest, he amassed the Collection de l’Art Brut, a sizable collection of artwork, of which more than half was painted by artists with schizophrenia.  One such painting that typifies this style is entitled General view of the island Neveranger (1911), by Adolf Wölfli, a psychiatric patient.

Obviously, Wölfli was a gifted artist, despite whatever psychiatric diagnosis was suggested at the time.  Nevertheless, clinical psychiatrists might be quick to point out that such work reflects the presence of an underlying thought disorder (loss of abstraction ability, tangentiality, loose associations, derailment, thought blocking, overinclusive thinking, etc., etc.) – despite the undeniable aesthetic beauty in the work.  As an ardent fan of such art, I began to wonder just how “well ordered” my own thoughts might be.  Given to being rather forgetful and distractible, I suspect my thinking process is just sufficiently well ordered to perform the routine tasks of day-to-day living, but perhaps not a whole lot more so.  Is this bad or good?  Who knows.

However, Krug et al., in their recent paper, “The effect of Neuregulin 1 on neural correlates of episodic memory encoding and retrieval” [doi:10.1016/j.neuroimage.2009.12.062] do note that the brains of unaffected relatives of persons with mental illness show subtle differences in various patterns of activation.  It seems that when individuals are using their brains to encode information for memory storage, unaffected relatives show greater activation in areas of the frontal cortex compared to unrelated subjects.  This so-called encoding process during episodic memory is very important for a healthy memory system and its dysfunction is correlated with thought disorders and other aspects of cognitive dysfunction.  Krug et al. proceed to explore this encoding process further and ask if a well-known schizophrenia risk variant (rs35753505 C vs. T) in the neuregulin-1 gene might underlie this phenomenon.  To do this, they asked 34 TT, 32 TC and 28 CC individuals to perform a memory (of faces) game whilst lying in an MRI scanner.

The team reports that there were indeed differences in brain activity during both the encoding (storage) and retrieval (recall) portions of the task – differences that were correlated with genotype, with the CC risk genotype associated with more (hyper-)activation.  Some of the brain areas that were hyperactivated during encoding and associated with the CC genotype were the left middle frontal gyrus (BA 9), the bilateral fusiform gyrus and the left middle occipital gyrus (BA 19).  The left middle occipital gyrus showed gene-associated hyperactivation during recall.  So it seems that healthy individuals can carry risk for mental illness and that their brains may actually function slightly differently.
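For readers curious how a genotype-by-activation comparison like this is typically set up, here is a minimal, hypothetical sketch – not the authors’ analysis – of a one-way ANOVA on an ROI activation estimate (e.g., a beta value from the left middle frontal gyrus) across the three rs35753505 genotype groups.  The beta values are simulated.

```python
# Hypothetical sketch of a genotype-group comparison of ROI activation.
# Beta values below are simulated; in practice they would come from a
# first-level fMRI model for each subject.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
beta_tt = rng.normal(loc=0.20, scale=0.15, size=34)  # 34 TT subjects
beta_tc = rng.normal(loc=0.30, scale=0.15, size=32)  # 32 TC subjects
beta_cc = rng.normal(loc=0.45, scale=0.15, size=28)  # 28 CC subjects (risk genotype)

f_stat, p_value = f_oneway(beta_tt, beta_tc, beta_cc)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```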

As an ardent fan of Art Brut, I confess I hoped I would carry the CC genotype, but alas, my 23andme profile shows a boring TT genotype.  No wonder my artwork sucks.  More on NRG1 here.


Read Full Post »

If you’re a coffee drinker, you may have noticed the new super-sized portions available at Starbucks.  On this note, it is worth remembering that caffeine is a potent psychoactive substance, too much of which can turn your buzz into a full-blown panic disorder.  The Diagnostic and Statistical Manual for psychiatry outlines a number of caffeine-related conditions mostly involving anxieties that can arise when the natural alertness-promoting effects are pushed to extremes.  Some researchers have begun to explore the way the genome interacts with caffeine and it is likely that many genetic markers will surface to explain some of the individual differences in caffeine tolerance.

Here’s a great paper, “Association between ADORA2A and DRD2 Polymorphisms and Caffeine-Induced Anxiety” [doi: 10.1038/npp.2008.17] wherein polymorphisms in the adenosine A2A receptor (ADORA2A encodes the protein that caffeine binds to and antagonizes) – as well as the dopamine D2 receptor (DRD2 encodes a protein whose downstream signals are normally counteracted by A2A receptors) – show associations with anxiety after the consumption of 150 mg of caffeine (about an average cup of coffee – much less than the super-size, super-rich cups that Starbucks sells).  The variants rs5751876 (T-allele), rs2298383 (T-allele) and rs4822492 (G-allele) from the ADORA2A gene, as well as rs1110976 (-/G genotype) from the DRD2 gene, were associated with significant increases in anxiety in a test population of 102 otherwise-healthy, light-to-moderate regular coffee drinkers.

My own 23andMe data only provides a drop of information suggesting I’m protected from the anxiety-promoting effects.  Nevertheless, I’ll avoid the super-sizes.
rs5751876 (T-allele) – C/C – less anxiety
rs2298383 (T-allele) – not covered
rs4822492 (G-allele) – not covered
rs1110976 (-/G genotype) – not covered
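As a side note, this kind of lookup is easy to automate against a 23andMe raw-data download, which is a tab-separated text file of rsid, chromosome, position and genotype.  The sketch below is my own illustration: the file name and the risk-allele table are assumptions, and matching a single risk letter against the genotype string is a deliberate simplification (indel genotypes like -/G are coded differently in real exports).

```python
# Minimal sketch: check a 23andMe raw-data export for reported risk alleles.
# The file name and the risk-allele table are illustrative assumptions.
RISK_ALLELES = {
    "rs5751876": "T",   # ADORA2A
    "rs2298383": "T",   # ADORA2A
    "rs4822492": "G",   # ADORA2A
    "rs1110976": "G",   # DRD2 (-/G)
}

def load_genotypes(path):
    """Parse a 23andMe raw file: tab-separated rsid, chromosome, position, genotype."""
    genotypes = {}
    with open(path) as handle:
        for line in handle:
            if line.startswith("#"):
                continue  # skip header comments
            rsid, _chrom, _pos, genotype = line.rstrip("\n").split("\t")
            genotypes[rsid] = genotype
    return genotypes

genotypes = load_genotypes("genome_export.txt")  # hypothetical file name
for rsid, risk in RISK_ALLELES.items():
    call = genotypes.get(rsid, "not covered")
    if call == "not covered":
        flag = "not covered"
    else:
        flag = "carries risk allele" if risk in call else "no risk allele found"
    print(f"{rsid}: {call} ({flag})")
```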


Read Full Post »

DON’T tell the grant funding agencies, but, in at least one way, the effort to relate genetic variation to individual differences in cognitive function is a totally intractable waste of money.

Let’s say we ask a population of folks to perform a task – perhaps a word memory task – and then we use neuroimaging to identify the areas of the brain that (i) were associated with performance of the task, and (ii) were not only associated with performance, but were also associated with genetic variation in the population.  Indeed, there are already examples of just this type of “imaging-genetic” study in the literature.  Such studies form a crucial translational link in understanding how genes (whose biochemical functions are most often studied in animal models) relate to human brain function (usually studied with cognitive psychology). However, do these genes relate to just this task? What if subjects were recalling objects? or feelings?  What if subjects were recalling objects / experiences / feelings / etc. from their childhoods?  Of course, there are thousands of common cognitive operations one’s brain routinely performs, and, hence, thousands of experimental paradigms that could be used in such “imaging-genetic” gene association studies.  At more than $500/hour (some paradigms last up to 2 hours) in imaging costs, the translational genes-to-cognition endeavor could get expensive!

DO tell the grant funding agencies that this may not be a problem any longer.

The recent paper by Liu and colleagues “Prefrontal-Related Functional Connectivities within the Default Network Are Modulated by COMT val158met in Healthy Young Adults” [doi: 10.1523/jneurosci.3941-09.2010] suggests an approach that may simplify matters.  Their approach still involves genotyping (in this case for rs4680) and neuroimaging.  However, instead of performing a specific cognitive task, the team asks subjects to lie in the scanner – and do nothing.  That’s right – nothing – just lie still with eyes closed, let the mind wander and not think about anything in particular – for a mere 10 minutes.  Hunh?  What the heck can you learn from that?

It turns out that one can learn a lot.  This is because the neural pathways that the brain uses when you are actively doing something (a word recall task) are largely intact even when you are doing nothing.  Your brain does not “turn off” when you are lying still with your eyes closed and drifting in thought.  Rather, your brain slips into a kind of default pattern, described in studies of “default networks” or “resting-state networks” where wide-ranging brain circuits remain dynamically coupled and actively exchange neural information.  One really great paper that describes these networks is a free-and-open article by Hagmann et al., “Mapping the Structural Core of Human Cerebral Cortex” [doi: 10.1371/journal.pbio.0060159], whose Figure 1 is well worth a look.  The work by Hagmann et al., and others shows that the brain has a sort of “connectome” where there are thousands of “connector hubs” or nodes that remain actively coupled (meaning that if one node fires, the other node will fire in a synchronized way) when the brain is at rest and when the brain is actively performing cognitive operations.  In a few studies, it seems that the strength of functional coupling in certain brain areas at rest is correlated (positively and negatively) with the activation of these areas when subjects are performing a specific task.

In the genetic study reported by Liu and colleagues, they found that genotype (N=57) at the dopaminergic COMT gene correlated with differences in the functional connectivity (synchronization of firing) of nodes in the prefrontal cortex.  This result is eerily similar to results found for a number of specific tasks (N-back, Wisconsin Card Sorting, Gambling, etc.) where COMT genotype was correlated with the differential activation of the frontal cortex during the task.  So it seems that one imaging paradigm (lie still and rest for 10 minutes) provided comparable insights to several lengthy (and diverse) activation tasks.  Perhaps this is the case. If so, might it provide a more direct route to linking genetic variation with cognitive function?
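For a sense of what “functional connectivity” means computationally, here is a minimal sketch – my own illustration, not the authors’ pipeline – that correlates a seed region’s resting-state time series with another region’s, Fisher z-transforms the correlations, and compares the z values between two genotype groups.  The signals, group sizes and coupling values are synthetic.

```python
# Minimal sketch of seed-based resting-state functional connectivity,
# compared between two genotype groups. All signals are synthetic.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)

def connectivity_z(seed_ts, target_ts):
    """Pearson correlation between two ROI time series, Fisher z-transformed."""
    r = np.corrcoef(seed_ts, target_ts)[0, 1]
    return np.arctanh(r)

def synthetic_subject(coupling):
    """Two 200-timepoint ROI signals sharing a common component."""
    common = rng.normal(size=200)
    seed = common + rng.normal(scale=1.0, size=200)
    target = coupling * common + rng.normal(scale=1.0, size=200)
    return connectivity_z(seed, target)

# e.g. val/val vs. met-carrier groups with different simulated prefrontal coupling
group_a = [synthetic_subject(coupling=0.8) for _ in range(28)]
group_b = [synthetic_subject(coupling=0.5) for _ in range(29)]

t, p = ttest_ind(group_a, group_b)
print(f"t = {t:.2f}, p = {p:.4f}")
```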

Liu and colleagues do not comment on this proposition directly, nor do they seem to be over-interpreting their results in the way I have editorialized things here.  They very thoughtfully point out the ways in which the networks they’ve identified are similar to, and different from, the published findings of others.  Certainly, this study and the other one like it are the first in what might be a promising new direction!


Read Full Post »

Last year I dug a bit into the area of epigenetics (indexed here) and learned that the methylation (CH3) and acetylation (COCH3) of genomic DNA & histones, respectively, can have dramatic effects on the structure of DNA and its accessibility to transcription factors – and hence – gene expression.  Many of the papers I covered suggested that the environment can influence the degree to which these so-called “epigenetic marks” are covalently bonded onto the genome during early development.  Thus, the thinking goes, the early environment can modulate gene expression in ways that are long-lasting – even transgenerational.  The idea is a powerful one to be sure.  And a scary one as well, as parents who read this literature may fret that their children (and grandchildren) can be epigenetically scarred by early nutritional, physical and/or psycho-social stress.  I must admit that, as a parent of young children myself, I began to wonder if I might be negatively influencing the epigenome of my children.

I’m wondering how much physical and/or social stress is enough to cause changes in the epigenome?  Does the concern about epigenetics only apply to exposure to severe stress?  or run of the mill forms of stress?  How much do we know about this?

This year, I hope to explore this line of inquiry further.  For starters, I came across a fantastic paper by Fraga et al., entitled, “Epigenetic differences arise during the lifetime of monozygotic twins” [doi:10.1073/pnas.0500398102].   The group carries out a remarkably straightforward and time-honored approach – a twin study – to ask how much identical twins differ at the epigenetic level.  Since identical twins have the same genome sequence, any differences in their physiology, behavior etc. are, strictly speaking, due to the way in which the environment (from the uterus to adulthood) shapes their development.  Hence, the team of Fraga et al. can compare the amount and location of methyl (CH3) and acetyl (COCH3) groups to see whether the environment has differentially shaped the epigenome.

An analysis of some 40 identical twin pairs aged 3 to 74 years showed that – YES – the environment, over time, does seem to shape the epigenome (in this case, of lymphocytes).  The most compelling evidence for me was seen in Figure 4, where the team used a method known as Restriction Landmark Genomic Scanning (RLGS) to compare patterns of methylation in a genome-wide manner.  Using this analysis, the team found that older twin pairs had about 2.5 times as many differences as did the epigenomes of the youngest twin pairs.  These methylation differences also correlated with gene expression differences (older pairs also had more gene expression differences), and they found that the individual who showed the lowest levels of methylation also had the highest levels of gene expression.  Furthermore, the team finds that twin pairs who lived apart and had more differences in life history were more likely to have epigenetic differences.  Finally, measures of histone acetylation seemed consistent with the gradient of epigenetic change over time and life-history distance.
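To illustrate the kind of relationship reported – this is not the authors’ RLGS analysis, and every number below is invented – here is a minimal sketch correlating twin-pair age with the number of sites at which the pair’s methylation calls differ.

```python
# Hypothetical sketch: does the number of methylation differences within a
# twin pair grow with age? Data are invented for illustration.
from scipy.stats import spearmanr

def count_differences(methylation_a, methylation_b):
    """Number of sites where the two twins' methylation calls disagree."""
    return sum(a != b for a, b in zip(methylation_a, methylation_b))

# (pair age, twin A calls, twin B calls) with 1 = methylated, 0 = unmethylated
pairs = [
    (3,  [1, 0, 1, 1, 0, 1], [1, 0, 1, 1, 0, 1]),
    (25, [1, 0, 1, 1, 0, 1], [1, 1, 1, 1, 0, 1]),
    (50, [1, 0, 1, 1, 0, 1], [0, 1, 1, 1, 0, 0]),
    (74, [1, 0, 1, 1, 0, 1], [0, 1, 0, 1, 1, 0]),
]

ages = [age for age, _, _ in pairs]
diffs = [count_differences(a, b) for _, a, b in pairs]

rho, p = spearmanr(ages, diffs)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```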

Thus it seems that, as everyday life progresses, the epigenome changes too.  So, perhaps, one does not need extreme forms of stress to leave long-lasting epigenetic marks on the genome?  Is this true during early life (where the team did not see many differences between pairs)?  and in the brain (the team focused mainly on lymphocytes)?  Are the differences between twins due to the creation of new environmentally-mediated marks or the faulty passage of existing marks from dividing cell-to-cell over time?  Will be fun to seek out information on this.


Read Full Post »


In their forecast “The World in 2010” special issue, the Economist points to “The looming crisis in human genetics” wherein scientists will reluctantly acknowledge that, even with super-cheap genome sequencing tools, we may not soon understand how genetic variation contributes to complex illness.  The argument is a valid one to be sure, but only time will tell.

A paper I read recently reminded me of the long hard slog ahead in the area of genomics and psychiatric illness.  The authors in “Association of the Glutamate Transporter Gene SLC1A1 With Atypical Antipsychotics–Induced Obsessive-compulsive Symptoms” [Kwon et al., (2009) Arch Gen Psychiatry 66(11)] are trying to do something very important.  They would like to understand why certain (most) psychiatric medications have adverse side-effects and how to steer patients clear of them.  This is because, nowadays, a patient learns only via a drawn-out trial-and-error ordeal which medications offer benefits that outweigh the costs.

Specifically, the authors focused their efforts on so-called obsessive-compulsive symptoms that can arise from treatment with atypical antipsychotic medications.  Working from 3 major medical centers (Samsung Medical Center, Seoul National University Hospital and Asan Medical Center) Kwon et al., were able to cobble together a mere 40 patients who display these particular adverse side-effects and matched them with 54 patients based on several demographic and medication-based criteria.  Keep in mind that most genetic studies use upwards of 1,000 samples and still – hardly – are able to obtain significant effects.

Nevertheless, the authors note that the glutamate transporter gene (SLC1A1 or EAAC1) is a most logical candidate gene, being located in a region mapped for obsessive-compulsive disorder risk and also a gene that appears to be down-regulated in response to atypical anti-psychotic treatment (particularly clozapine).  A series of statistical association tests for 10 SNPs in this gene reveals that two SNPs (rs2228622 and rs3780412) and a 3-SNP haplotype (the A/C/G haplotype at rs2228622-rs3780413-rs3780412) showed modestly significant association (about 4-fold higher risk) with the adverse symptoms.
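For readers unfamiliar with how such an association is quantified in a small sample, here is a minimal sketch – with invented counts, not the paper’s data – computing an odds ratio and a Fisher’s exact p-value from a 2x2 carrier-by-symptom table.

```python
# Illustrative odds-ratio / Fisher's exact test for a carrier-vs-symptom table.
# Counts are invented; the real study compared 40 symptomatic vs. 54 matched patients.
from scipy.stats import fisher_exact

#                 carriers   non-carriers
# OC symptoms        18           22
# no OC symptoms     10           44
table = [[18, 22],
         [10, 44]]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.3f}")  # roughly a 3-4x higher odds
```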

To me, this is a very noteworthy finding.  A lot of work went into a very important problem – perhaps THE most pressing problem for patients on anti-psychotic medications today – and the results, while only of modest significance, are probably biologically valid.  The authors point out that rs2228622 and rs3780412 have previously been associated with OCD in other studies.

But when you compare these modest results (that these authors fought hard to obtain) with the big promises of the genomic era (as noted in the Economist article), well then, the results seem rather diminutive.  Will all patients who carry the risk haplotype be steered away from atypical antipsychotics?  Will big pharma (the authors of this paper disclose a great many ties to big pharma) support the fragmentation of their blockbuster drug markets into a hundred sub-populations?  I doubt it.  But some doctors and patients will experiment and continue to explore this avenue of inquiry – and it will take a long time to work out.  Better check back in 2020.


Read Full Post »

We are all familiar with the notion that genes are NOT destiny and that the development of an individual’s mind and body occurs in a manner that is sensitive to the environment (e.g. children who eat lots of healthy food grow bigger and stronger than those who have little or no access to food).  In the case of the brain, one of the ways in which the environment gets factored into development is via so-called “sensitive periods” where certain parts of the brain transiently rely on sensory experience in order to develop.  Children born with cataracts, for example, will have much better vision if the cataracts are removed in the first few weeks of life rather than later on.  This is because the human visual system has a “sensitive period” early in development where it is extra-sensitive to visual input, after which the function and connectivity of various parts of the system are – somewhat permanently – established for the rest of the person’s life.  Hence, if there is little visual input (cataracts) during the sensitive period, then the visual system is somewhat permanently unable to process visual information – even if the cataracts are subsequently removed.  (To learn more about this topic, visit Pawan Sinha’s lab at M.I.T. and his Project Prakash intervention study on childhood blindness.)

What the heck is an “in”sensitive period then?   Well, whereas visual input is clearly a “good thing” for the sensitive period of visual development, perhaps some environmental inputs are “bad”, and it may be useful to shield or protect the developing brain from them – one would not want the developing brain to be exposed to them and conclude, “OK, this (bad stuff) is normal“.  As a parent, I am constantly telling my children that the traffic-filled street is a “bad place” and, like all parents, I would not want my children to think that it was OK to wander into the street.  Clearly, I want my child to recognize the car-filled street as a “bad thing”.

In the developing brain, it turns out that there are some “bad things” that one would NOT like (the brain) to get accustomed to.  Long-term exposure to glucocorticoids is one example – well-known to cause a type of neuronal remodelling in the hippocampus that is associated with poor cognitive performance (visit Bruce McEwen’s lab at Rockefeller University to learn more about this).  Perhaps an “in”sensitive period – where the brain is insensitive to glucocorticoids – is one way to teach the brain that glucocorticoids are “bad” and NOT to get too familiar with them (such a period does actually occur during early post-natal mammalian development).  Of course, we do need our brains to mount an acute stress response, if and when we are being threatened, but it is also very important that the brain learn to TURN OFF the acute stress response when the threat has passed – an extensive literature on the deleterious effects of chronic exposure to stress bears this out.  Hence, the brain needs to learn to recognize the flow of glucocorticoids as something that needs to be shut down.

OK, so our developing brain needs to learn what/who is “good vs. bad”.  Perhaps sensitive and insensitive periods help to reinforce this learning – and also – to cement learning into the system in a sort of permanent way (I’m really not sure if this is the consensus view, but I’ll try and podcast interview some of the experts here asap).  In any case, in the case of the visual system, it is clear that the lack of visual input during the sensitive period has long lasting consequences.  In the case of the stress response, it is also clear that if there is untoward stress early in development, one can be (somewhat) destined to endure a lifetime of emotional difficulty.  Previous posts here, here, here cover research on behavioral/genomic correlates of early life stress.

Genes meet environment in the epigenome during sensitive and insensitive periods?

As stated at the outset – genes are not destiny.  The DNA cannot encode a system that knows who/what is good vs. bad, but rather can only encode a system of molecular parts that can assemble to learn these contingencies on the fly.  During sensitive periods, cells in the visual system are more active and fire more profusely.  This extra firing leads to changes in gene expression in ways that (somewhat) permanently set the connectivity, strength and sensitivity of visual synapses.  The expression of neuroligins, neurexins, integrins and all manner of extracellular proteins that stabilize synaptic connections are well-known targets of activity-induced gene expression.  Hence the environment “interacts” with the genome via neuronal firing, which induces gene expression, which – in turn – feeds back and modulates neuronal firing.  Environment –> neuronal firing –> gene expression –> modified neuronal firing.  OK.

Similarly, in the stress response system, the environment induces changes in the firing of cells in the hypothalamus which leads (through a series of intermediates) to the release of glucocorticoids.  Genes induced during the firing of hypothalamic cells and by the release of glucocorticoid can modify the organism’s subsequent response to stressful events.  Environment –> neuronal firing –> gene expression –> modified neuronal firing.  OK.

Digging deeper into the mechanism by which neuronal firing induces gene expression, we find an interesting twist.   Certainly there is a mechanism – well studied for many decades – wherein neuronal firing causes Ca++ release, which activates gene expression of neuroligins, neurexins, integrins and all manner of extracellular proteins that stabilize synaptic connections.  There is another mechanism that can permanently mark certain genes and alter their levels of expression – in a long-lasting manner.  These are so-called epigenetic mechanisms such as DNA methylation and acetylation.  As covered here and here, for instance, Michael Meaney’s lab has shown that DNA CpG methylation of various genes can vary in response to early-life stress and/or maternal care. In some cases, females who were poorly cared for may, in turn, be rather lousy mothers themselves as a consequence of these epigenetic markings.

A new research article, “Dynamic DNA methylation programs persistent adverse effects of early-life stress” by Chris Murgatroyd and colleagues [doi:10.1038/nn.2436] explores these mechanisms in great detail.  The team explored the expression of the arginine vasopressin (AVP) peptide – a gene which is important for healthy social interaction and social-stress responsivity.  Among many other interesting results, the team reports that early life stress (using a mouse model) leads to lower levels of methylation in the 3rd CpG island, which is located downstream in a distal gene-expression-enhancer region.  In short, more early-life stress was correlated with less methylation and more AVP expression, which is known to potentiate the release of glucocorticoids (a bad thing).   The team reports that the methyl-binding MeCP2 protein, encoded by the gene that underlies Rett syndrome, acts as a repressor of AVP expression – which would normally be a good thing, since it would keep AVP levels (and hence glucocorticoid levels) down.  But unfortunately, early-life stress removes the very methyl groups to which MeCP2 binds, and the team also reports that parvocellular neuronal depolarization leads to phosphorylation (on serine residue #438) of MeCP2 – a form of MeCP2 that is less accessible to its targets.  So, in a manner similar to other examples, early life stress can have long-lasting effects on gene expression via an epigenetic mechanism – disabling an otherwise protective mechanism that would shield the organism from the effects of stress.  Much as in the case of Rett syndrome (as covered here), it seems that when MeCP2 is bound it silences gene expression – which, in the case of AVP, would seem to be a good thing.

So who puts these epigenetic marks on chromosomes and why?

I’ll try and explore this further in the weeks ahead.  One intriguing idea about why methylation has been co-opted among mammals has to do with the idea of parent-offspring conflict.  According to David Haig, one of the experts on this topic, males have various incentives to cause their offspring to be large and fast growing, while females have incentive to combat the genomic tricks that males use, and to keep their offspring smaller and more manageable in size.  The literature clearly shows that genes that are marked or methylated by fathers (paternally imprinted genes) tend to be growth-promoting genes and that maternally imprinted genes tend to be growth inhibitors.  One might imagine that maternally methylated genes might have an impact on maternal care as well.

Lastly, the growth promoting/inhibiting effects of paternal/maternal genes and gene markings are now starting to be discussed somewhat in the context of autism and schizophrenia, which have been associated with synaptic over- and under-growth, respectively.

Building a brain is already tough enough – but to have to do it amidst an eons-old battle between maternal and paternal genomes.  Sheesh!  More on this to come.


Read Full Post »


pointer to: Eye-on-DNA’s post on last night’s episode of “Lopez Tonight” where Larry David shared the unveiling of his “Ancestry-by-DNA” results.  He was a good sport and it was great to see science as FUN.  His results made me wonder, though, whether such ancestry tests are reliable.


Read Full Post »


File this story under “the more you know, the more you don’t know” or simply under “WTF!”  The new paper, “Microduplications of 16p11.2 are associated with schizophrenia” [doi:10.1038/ng.474] reveals that a short stretch of DNA on chromosome 16p11.2 is – very rarely – duplicated and – more rarely – deleted.  In an analysis of 8,590 individuals with schizophrenia, 2,172 with developmental delay or autism, 4,822 with bipolar disorder and 30,492 controls, the microduplication of 16p11.2 was strongly associated with schizophrenia, bipolar disorder and autism, while the reciprocal microdeletion was strongly associated with developmental delay or autism – but not with schizophrenia or bipolar disorder.

OK, so the title of my post is misleading (hey, it’s a blog) since there are clearly many additional factors that contribute to the developmental outcome of autism vs. schizophrenia, but this stretch of DNA seems to hold clues about early development of brain systems that go awry in both disorders.  Here is a list of the brain-expressed genes in this 600 kbp region (in order from telomere side to centromere side): SPN, QPRT, C16orf54, MAZ, PRRT2, C16orf53, MVP, CDIPT, SEZ6L2, ASPHD1, KCTD13, TMEM219, TAOK2, HIRIP3, INO80E, DOC2A, FLJ25404, FAM57B, ALDOA, PPP4C, TBX6, YPEL3, GDPD3, MAPK3, CORO1A.

Any guess as to which one(s) are the culprits?  I’ll go with HIRIP3 given its role in chromatin structure regulation – and the consequent regulation of under- (schiz?)/over- (autism) growth of synapses. What an amazing mystery to pursue.


Read Full Post »


The cognitive and emotional impairments in the autism spectrum disorders can be difficult for parents and siblings to understand and cope with.  Here are some notes that might assist in understanding how genetic mutations and epigenetic modifications can lead to various forms of social withdrawal commonly observed in children with autism spectrum disorders.

In this post, the focus is just on the MecP2 gene – where mutations are known to give rise to Rett Syndrome – one of the autism spectrum disorders.  I’ll try and lay out some of the key steps in the typical bare-bones-link-infested-blogger-fashion – starting with mutations in the MecP2 gene.  Disclaimer: there are several fuzzy areas and leaps of faith in the points and mouse model evidence below, and there are many other genes associated with various aspects of autism spectrum disorders that may or may not work in this fashion.  Nevertheless, it still seems one can begin to pull a mechanistic thread from gene to social behavior.  Stay tuned for more on this topic.

1. The MecP2 gene encodes a protein that binds to 5-methylcytosine – very simply, a regular cytosine residue with an extra methyl (-CH3) group added at position 5.  A 5-methylcytosine residue pairs in the DNA double helix with guanosine (G) in the same fashion as does the regular cytosine residue (C).  Mutations in the MecP2 gene – such as those found at arginine residue 133 and serine residue 134 – impair the ability of the protein to bind to these 5-methylcytosine residues; in structural views of the complex, the MecP2 protein lines up with the bulky 5-methylcytosine residues in the DNA double helix during binding.

2. When the MecP2 protein is bound to methylated DNA, it serves as a binding site for another type of protein – an HDAC, or histone deacetylase (along with other proteins; see p. 172, section 5.3 of the online book “Chromatin Structure and Gene Expression”).  The binding of the eponymously named HDACs leads to the de-acetylation of proteins known as histones.  Histone de-acetylation leads to the condensation of DNA structure and the repression, or shutting down, of gene expression (when the DNA is tightly coiled, it is inaccessible to transcription factors).  Hence: DNA methylation leads (via MecP2 and HDAC binding) to repression of gene expression.


3. When mutated forms of MecP2 cannot bind, the net result is MORE acetylation and MORE gene expression. As covered previously here, this may not be a good thing during brain development, since more gene expression can induce the formation of more synapses and – possibly – lead to neural networks that fail to grow and mature in the “normal” fashion. One suggestion is that neural networks with too many synapses may not be appropriately connected and may be locked in to sub-optimal architectures.  Evidence for excessive synaptogenesis is abundant within the autism spectrum disorders.  Neuroligins – a class of genes that have been implicated in autism – are known to function in cell & synaptic adhesion (open access review here), and can alter the balance of excitation/inhibition when mutated – which seems consistent with this heuristic model of neural networks that can be too adhesive or sticky.
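To keep steps 1–3 straight, here is a deliberately cartoonish sketch – my own teaching aid, not a biochemical model – of the logic: methylation plus functional MeCP2 recruits HDAC and represses the gene, while a binding-deficient MeCP2 variant (the residue-133 and -134 examples mentioned above are used here only as illustrative labels) leaves the gene de-repressed.

```python
# Cartoon logic of steps 1-3: methylated DNA + functional MeCP2 -> HDAC
# recruitment -> repression. A binding-deficient variant leaves expression high.
# This is a teaching aid, not a quantitative model.

BINDING_DEFICIENT_VARIANTS = {"R133C", "S134C"}  # illustrative Rett-associated labels

def expression_level(promoter_methylated: bool, mecp2_variant: str) -> str:
    mecp2_binds = promoter_methylated and mecp2_variant not in BINDING_DEFICIENT_VARIANTS
    if mecp2_binds:
        # MeCP2 bound -> HDAC recruited -> histones de-acetylated -> chromatin closed
        return "low (repressed)"
    # No MeCP2 bound -> histones stay acetylated -> chromatin open
    return "high (de-repressed)"

print(expression_level(promoter_methylated=True,  mecp2_variant="wild-type"))  # low
print(expression_level(promoter_methylated=True,  mecp2_variant="R133C"))      # high
print(expression_level(promoter_methylated=False, mecp2_variant="wild-type"))  # high
```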

4. Cognitive and social impairment can result from poorly functioning neural networks containing, but not limited to, the amygdala. The normal development of neural networks containing the frontal cortex and amygdala is important for proper social and emotional function.  The last piece of the puzzle then would be to find evidence for developmental abnormalities in these networks and to show that such abnormalities mediate social and/or emotional function.  Such evidence is abundant.

Regarding the effects of MecP2, however, we can consider the work of Adachi et al., who were able to delete the MecP2 gene – just in the amygdala – of an (albeit adult) mouse.  Doing so led to the disruption of various emotional behaviors – BUT NOT – of the social interaction deficits that are observed when MecP2 is deleted in the entire forebrain.  The same was seen when the team infused HDAC inhibitors into the amygdala, suggesting that loss of transcriptional repression in the adult amygdala may underlie the emotional impairments seen in some autism spectrum disorders.  Hence, such emotional impairments (anxiety etc.) might be treatable in adults (more on this result and its implications for gene therapy later).

Whew!  Admittedly, the more you know – the more you don’t know.  True here, but still amazing to see the literature starting to interlink across human-genetic, mouse-genetic, human-functional-imaging levels of analysis. Hoping this rambling was helpful.


Read Full Post »


The homunculus (argument) is a pesky problem in cognitive science – a little guy who might suddenly appear when you propose a mechanism for decision making, spontaneous action, forethought etc. – and would take credit for the origination of the neural impulse.  While there are many mechanistic models of decision making that have slain the little bugger – by invoking competition between past experience and memory as the source of new thoughts and ideas – one must always tread lightly, I suppose, to ensure that cognitive mechanisms are based completely in neural properties devoid of a homuncular source.

Still, the human mind must begin somewhere.  After all, it’s just a ball of cells initially, and then a tube and then some more folds, layers, neurogenesis and neural migration etc. before maturing – miraculously – into a child that one day looks at you and says, “momma” or “dada”.  How do these neural networks come into being?  Who or what guides their development toward that unforgettable “momma (dada)” moment?  A somewhat homuncular “genetic program” – whose instructions we can attribute to millions of years of natural selection?  Did early hominid babies say “momma (dada)”?  Hmmm. Seems like we might be placing a lot of faith in the so-called “instructions” provided by the genome, but who am I to quibble.

On the other hand, you might find that the recent paper by Akhtar et al., “Histone Deacetylases 1 and 2 Form a Developmental Switch That Controls Excitatory Synapse Maturation and Function” [doi:10.1523/jneurosci.0097-09.2009] may change the way you think about cognitive development.  The team explores the function of two very important epigenetic regulators of gene expression – histone deacetylases 1 and 2 (HDAC1, HDAC2) – on the functionality of synapses in early developing mice and mature animals.  By epigenetic, I refer to the role of these genes in regulating chromatin structure and not via direct, site-specific DNA binding.  The way the HDAC genes work is by de-acetylating histones – removing acetyl groups – which restores the positive charge on histone lysine residues and strengthens their attraction to the negatively charged phosphate backbone of DNA.  When the histone proteins carry such acetyl groups, that positive charge is neutralized and they do NOT bind DNA as tightly, so the DNA molecule is more open and exposed to binding of transcription factors that activate gene expression.  Thus if one (as Akhtar et al. do) turns off a de-acetylating HDAC gene, then the resulting animal has a genome that is more open and exposed to transcription factor binding and gene expression.  Less HDAC = more gene expression!

What were the effects on synaptic function?  To summarize, the team found that in early development (neonatal mouse hippocampal cells) cells where the HDAC1 or 2 genes were turned off (either through pharmacologic blockers or via partial deletion of the gene(s) via lentivirus introduction of Cre recombinase) had more synapses and more synaptic electrical activity than did hippocampal cells from control animals.  Keep in mind that the HDACs are located in the nucleus of the neuron and the synapses are far, far away.  Amazingly – they are under the control of an epigenetic regulator of gene expression;  hence, ahem, “epigenetic puppetmasters”.  In adult cells, the knockdown of HDACs did not show the same effects on synaptic formation and activity.  Rather the cells where HDAC2 was shut down showed less synaptic formation and activity (HDAC1 had no effect).  Again, it is amazing to see effects on synaptic function regulated at vast distances.  Neat!

The authors suggest that the epigenetic regulatory system of HDAC1 & 2 can serve to regulate the overall levels of synaptic formation during early cognitive development.  If I understand their comments in the discussion, this may be because you don’t necessarily want to have too many active synapses during the formation of a neural network.   Might such networks be prone to excitotoxic damage or perhaps to being locked in to inefficient circuits?  The authors note that HDACs interact with MecP2, a gene associated with Rett Syndrome – a developmental disorder (in many ways similar to autism) where neural networks underlying cognitive development in children fail to progress to support higher, more flexible forms of cognition.  Surely the results of Akhtar et al., must be a key to understanding and treating these disorders.

Interestingly, here, the controller of these developmental phenotypes is not a “genetic program” but rather an epigenetic one, whose effects are widespread across the genome and heavily influenced by the environment.  So no need for a homunculus here.


Read Full Post »


For humans, there are few sights more heart-wrenching than an orphaned child (or any orphaned vertebrate for that matter).  Isolated, cold, unprotected, vulnerable – what could the cold, hard calculus of natural selection – “red in tooth and claw” – possibly have to offer these poor, vulnerable unfortunates?

So I wondered while reading, “Functional CRH variation increases stress-induced alcohol consumption in primates” [doi:10.1073/pnas.0902863106].  In this paper, the authors considered the role of a C-to-T change at position -248 in the promoter of the corticotropin releasing hormone (CRH or CRF) gene.  Its biochemical role was examined using nuclear extracts from hypothalamic cells to demonstrate that this C-to-T nucleotide change disrupts protein-DNA binding, and, using transcriptional reporter assays, that the T-allele showed higher levels of transcription after forskolin stimulation.  Presumably, biochemical differences conferred by the T-allele can have a physiological role and alter the wider functionality of the hypothalamic-pituitary-adrenal (HPA) axis, in which the CRH gene plays a critical role.

The authors ask whether primates (rhesus macaques) who differ in genotype (CC vs. CT) show any differences in physiological stress reactivity – as predicted by differences in the activity of the CRH promoter.  As a stressor, the team used a form of brief separation stress and found that there were no differences in HPA function (assessed by ACTH and cortisol levels) in animals who were reared by their mothers.  However, when the stress paradigm was performed on animals who were reared without a mother (but with access to play with other age-matched macaques), there were significant differences in HPA function between the 2 genetic groups (T-allele carriers showing greater release of stress hormones).  Further behavioral assessments found that the peer-reared animals who carried the T-allele explored their environment less when socially separated as adults (again, no C vs. T differences in maternally reared animals).  In a separate assessment, the T-carriers showed a preference for sweetened alcohol vs. sweetened water in ad lib consumption.
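The design here is a classic gene-by-environment (2x2) layout: genotype (CC vs. CT) crossed with rearing condition (mother- vs. peer-reared).  As a hypothetical sketch of how such an interaction is usually tested – with invented numbers, not the study’s data – here is a two-way ANOVA on a cortisol measure using statsmodels.

```python
# Hypothetical gene-by-environment (genotype x rearing) interaction test on
# cortisol levels. All values are invented for illustration.
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(2)
rows = []
for genotype in ("CC", "CT"):
    for rearing in ("mother", "peer"):
        # simulate a cortisol bump only for peer-reared T-carriers
        bump = 1.5 if (genotype == "CT" and rearing == "peer") else 0.0
        for _ in range(20):
            rows.append({"genotype": genotype,
                         "rearing": rearing,
                         "cortisol": rng.normal(loc=5.0 + bump, scale=1.0)})

df = pd.DataFrame(rows)
model = ols("cortisol ~ C(genotype) * C(rearing)", data=df).fit()
print(anova_lm(model, typ=2))  # look at the genotype:rearing interaction term
```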

One way of summarizing these findings, could be to say that having no mother is a bad thing (more stress reactivity) and having the T-allele just makes it worse!  Another way could be to say that the T-allele enhances the self-protection behaviors (less exploration could be advantageous in the wild?) that arise from being orphaned.  Did mother nature (aka. natural selection) provide the macaque with a boost of self-preservation (in the form of a T-allele that enhances emotional/behavioral inhibition)?  I’m not sure, but it will be fun to report on further explorations of this query.  Click here for an interview with the corresponding author, Dr. Christina Barr.

—p.s.—

The authors touch on previous studies (here and here) that explored natural selection on this gene in primates and point out that humans and macaques both have 2 major haplotype clades (which have perhaps been maintained in a yin-yang sort of fashion over the course of primate evolution) and that humans have a C-to-T change (rs28364015) which would correspond to position -201 in the macaque (position 68804715 on macaque chr. 8), and which could be readily tested for similar functionality in humans.  In any case, the T-allele is rare in macaques, so it may be the case that few orphaned macaques ever endure the full T-allele experience.  In humans, the T-allele at rs28364015 seems more common.

Nevertheless, this is yet another – complicated – story of how genome variation is not destiny, but rather a potentiator of life experience – for better or worse.  Related posts on genes and early development (MAOA-here), (DAT-here), (RGS2-here), or just click the “development tag“.


Read Full Post »


Within the genetic news flow, there is often, and rightly so, much celebration when a gene for a disease is identified.  This is indeed an important first step, but often the slogging from that point to a treatment – and the many small breakthroughs along the way – can go unnoticed. One reason why these 2nd (3rd, 4th, 5th …) steps are so difficult is that, in some cases, folks who carry “the gene” variant for a particular disorder do not, in fact, display symptoms of the disorder.

Huh? One can carry the risk variant – or many risk variants – and not show any signs of illness?  Yes, this is an example of what geneticists refer to as variable penetrance, or the notion of carrying a mutation but not outwardly displaying the mutant phenotype.  This is one of the main reasons why genes are not deterministic, but much more probabilistic, in their influence on human development.

Of course, in the brain, such complexities exist, perhaps even more so.  For example, take the neurological condition known as dystonia, a movement disorder that, according to the Dystonia Medical Research Foundation, “causes the muscles to contract and spasm involuntarily. The neurological mechanism that makes muscles relax when they are not in use does not function properly. Opposing muscles often contract simultaneously as if they are “competing” for control of a body part. The involuntary muscle contractions force the body into repetitive and often twisting movements as well as awkward, irregular postures.”  Presently there are more than a dozen genes and/or chromosomal loci that are associated with dystonia – two of the major genes, DYT1 and DYT6, having been identified as factors in early onset forms of dystonia.  Now as we enter the era of personal genomes, an individual can assess their (own, child’s, preimplantation embryo’s!) genetic risk for such rare genetic variants – whose effects may not be visible until age 12 or older.  In the case of DYT1, this rare mutation (a GAG deletion at position 946 which causes the loss of a glutamate residue in the torsin A protein) gives rise to dystonia in about 30-40% of carriers.  So, how might these genes work, and why do some individuals develop dystonia while others do not?  Indeed, these are the complexities that await in the great expanse between gene identification and treatment.

An inspection of the molecular aspects of torsin A (DYT1) shows that it is a member of the AAA family of adenosine triphosphatases and is related to the Clp protease/heat shock family of genes that help to properly fold polypeptide chains as they are secreted from the endoplasmic reticulum of the cell – a sort of handyman, general-purpose gene (expressed in almost every tissue in the body) that sits on an assembly line and hammers away to help make sure that proteins have the right shape as they come off their assembly lines.  Not much of a clue for dystonia – hmm.  Similarly, the THAP domain-containing, apoptosis-associated protein 1 (THAP1) gene (a.k.a. DYT6) is also expressed widely in the body and seems to function as a DNA-binding protein that regulates aspects of cell cycle progression and apoptosis.  Also not much of an obvious clue to dystonia – hmm, hmm.  Perhaps you can now see why the identification of “the gene” – something worth celebrating – can just leave you aghast at how much more you don’t know.

That these genes influence an early developmental form of the disorder suggests a possible developmental role for these rather generic cogs in the cellular machinery.  But where? how? & why an effect in some folks and not others?  To these questions comes an amazing analysis of DYT1 and DYT6 carriers in the article entitled, “Cerebellothalamocortical Connectivity Regulates Penetrance in Dystonia” by Argyelan and colleagues [doi: 10.1523/JNEUROSCI.2300-09.2009]. In this article, the research team uses a method called diffusion tensor imaging (sensitive to white matter density) to examine brain structure and function among individuals who carry the mutations but either DO or DO NOT manifest the symptoms. By looking at white matter tracts (superhighways of neural traffic) throughout the brain, the team was able to ask whether some tracts were different in the 2 groups (as well as a group of unaffected non-carriers).  In this way, the team can begin to better understand the causal pathway between these run-of-the-mill genes (torsin A and thap1) and the complex pattern of muscle spasms that arise from their mutations.

To get right to the findings, the team discovered that in one particular tract – a superhighway known as the “cerebellar outflow pathway in the white matter of lobule VI, adjacent to the dentate nucleus” (not as quaint as Route 66) – participants who DO manifest dystonia had less tract integrity and connectivity than those who DO NOT manifest, and than healthy controls (who have the most connectivity there).  Subsequent measures of resting-state blood flow confirmed that the disruptions in white matter tracts were correlated with cerebellar outflow to the thalamus and – more importantly – with activity in areas of the motor cortex.  The correlations were such that individuals who DO manifest dystonia had greater activity in the motor cortex (this is what dystonia really comes down to — too much activity in the motor cortex).
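As a rough sketch of how a “connectivity correlates with downstream activity” claim is usually checked – this is my own illustration, not the authors’ analysis, and every value is invented – one can correlate each carrier’s tract integrity (e.g., fractional anisotropy in the cerebellar outflow pathway) with a resting measure of motor cortex activity.

```python
# Hypothetical sketch: correlate white-matter tract integrity with motor
# cortex activity across mutation carriers. All values are invented.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n_carriers = 20

# Fractional anisotropy (FA) in the cerebellar outflow pathway (lower = more disrupted)
fa = rng.normal(loc=0.45, scale=0.05, size=n_carriers)

# Simulated resting motor cortex activity that rises as FA falls
motor_activity = 2.0 - 2.5 * fa + rng.normal(scale=0.08, size=n_carriers)

r, p = pearsonr(fa, motor_activity)
print(f"r = {r:.2f}, p = {p:.4f}")  # negative r: less tract integrity, more motor activity
```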

Thus the team was able to query gene carriers using their imaging methods and zero in on “where in the brain” these generic proteins exert a detrimental effect.  This seems to me to be a huge step forward in understanding how a run-of-the-mill gene can alter brain function in such a profound way.  Now that they’ve found the likely circuit (is it the white matter per se or the neurons?), more focus can be applied to how this circuit develops – and can be repaired.


Read Full Post »


For more than a decade, we’ve known that at least 95% of the human genome is junk – or junque – if you’re offended by the thought that “you” emerged from a single cell whose genome is mostly a vast pile of crap – or crappe – if you insist.  Hmmm, what is this crap?  It turns out to be a lot of random repeating sequences and a massive collection of evolutionary artifacts left over from the evolution of earlier genomes – mainly bits of retroviruses that once inserted themselves irreversibly into our ancestors’ genomes.  One subset of this type of sequence – can we upgrade it from crappe to “relic” now? – is something we’ve labelled “autonomously mobile DNA sequences” or, more specifically, “long interspersed nuclear elements (LINEs or L1s)”.  This class of DNA relic comprises more than 15% of the human genome (that’s about 3-5x more than the relevant genomic sequence from which you emerge) and retains the ability to pick itself up out of the genome – via an RNA intermediate – and insert itself into new places in the genome.  This has been observed to happen in the germ line of humans, and a few L1 insertions are even responsible for genetic forms of human disease (for example, in the factor VIII gene, giving rise to haemophilia).  The mechanism of transposition – or “jumping”, as these elements are sometimes called “jumping genes” – involves the assembly of a certain type of transcriptional, transport and reverse-transcription (RNA back to DNA) apparatus that is known to be available in stem cells, but hardly ever in somatic cells.

Except, it would seem, for the brain – which, as we’ve covered here before, keeps its precious neurons and glia functioning under separate rules.  Let’s face it, if a liver cell dies, you just replace it without notice, but if neurons die, so do your childhood memories.  So it’s not too surprising, perhaps, that brain cells have special ‘stem-cell-like’ rules for keeping themselves youthful.  This seems to be borne out again in a paper entitled, “L1 retrotransposition in human neural progenitor cells” by Coufal et al., [doi:10.1038/nature08248].  Here the team shows that L1 elements are able to transpose themselves in neural stem cells and that there are more L1 elements (about 80 copies more per cell) in the hippocampus than in liver or heart cells.  So apparently the hippocampus, which does seem to contain a niche of stem cells, permits the transposition or “jumping” of L1 elements in a way that the liver and heart do not.  Sounds like a fun place to be a gene!


Read Full Post »