
Posts Tagged ‘DNA’

[Image: “remember a day before today” by DerrickT via Flickr]

Most cells in your adult body are “terminally differentiated” – meaning they have developed from stem cells into the final liver, heart, muscle or endothelial cell they were meant to be.  From that point onward, cells are able to “remember” to stay in this final state – in part – via stable patterns of DNA methylation that reinforce the gene-expression program of that end state.  As evidence for this role of DNA methylation, it has been observed that levels of DNA methyltransferase (DNMT) decline once cells are fully differentiated, so the cells can no longer modify or disrupt their methylation patterns.

NOT the case in the brain! Even though neurons in the adult brain are fully differentiated, levels of methyltransferases DO NOT decline.  Why not? After all, we wouldn’t want our neurons to turn into liver cells, or big toe cells, would we?

One hypothesis, suggested by David Sweatt and colleagues, is that neurons have more important things to “remember”.  They suggest in their free and open research article, “Evidence That DNA (Cytosine-5) Methyltransferase Regulates Synaptic Plasticity in the Hippocampus” [doi: 10.1074/jbc.M511767200] that:

DNA methylation could have lasting effects on neuronal gene expression and overall functional state. We hypothesize that direct modification of DNA, in the form of DNA (cytosine-5) methylation, is another epigenetic mechanism for long term information storage in the nervous system.

By measuring methylated vs. unmethylated DNA in the promoters of the reelin and BDNF genes and relating this to electrophysiological measures of synaptic plasticity, the research team finds correlations between methylation status and synaptic plasticity.  More specifically, they find that zebularine (an inhibitor of DNMT) can block long-term potentiation (LTP), but does not block baseline synaptic transmission or the ability of synapses to fire in a theta-burst pattern (needed to induce LTP).
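At the raw-data level, measuring promoter methylation boils down to counting how many cytosines at CpG sites survive bisulfite conversion (which turns unmethylated C into T, but leaves methylated C intact).  Here is a toy sketch of that counting step – the function name, reads and positions are made up for illustration, not taken from the paper:

```python
def methylation_fraction(reads, cpg_positions):
    """Fraction of CpG sites still read as 'C' after bisulfite treatment.

    Bisulfite converts unmethylated C -> T, so a 'C' at a known CpG
    position in a sequenced read implies that cytosine was methylated.
    """
    methylated = 0
    total = 0
    for read in reads:
        for pos in cpg_positions:
            base = read[pos]
            if base == "C":        # protected from conversion -> methylated
                methylated += 1
                total += 1
            elif base == "T":      # converted -> unmethylated
                total += 1
    return methylated / total if total else 0.0

# Fabricated reads covering two CpG sites (string positions 1 and 4)
reads = ["ACGTC", "ATGTT", "ACGTC", "ACGTT"]
print(methylation_fraction(reads, [1, 4]))  # 5 of 8 sites methylated -> 0.625
```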

This suggests that the epigenetic machinery used for DNA methylation may have a role in the formation of cellular memory – but not in the same sense as in other cells in the body – where cells remember to remain in a terminally differentiated state.

In the brain, this epigenetic machinery may help cells remember stuff that’s more germane to brain function … you know … our memories and stuff.


[Image: the structure of part of a DNA double helix, via Wikipedia]

Just a pointer to Genetic Future’s pointer to the recent article, “Family become first to have DNA sequenced for non-medical reasons”.  The father suggests, “it will be ethically improper if you don’t have your children sequenced”.

Early days.


Twin studies have long suggested that genetic variation underlies both healthy and disordered mental life.  The problem, however – some 10 years now into the full-genome-sequence era – has been finding the actual genes that account for this heritability.

It sounds simple on paper – just collect lots of folks with disorder X and compare their genomes to those of a demographically matched healthy control population.  Voilà! Whatever differs is a candidate for genetic risk.  Apparently, not so.
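The “whatever differs” step is, at its simplest, an allele-count association test run at each marker.  A bare-bones sketch of the standard 2x2 chi-square on allele counts follows – the counts are invented, and real GWAS pipelines add quality control, covariates and multiple-testing correction on top of this:

```python
def allelic_chi_square(case_alt, case_ref, ctrl_alt, ctrl_ref):
    """Chi-square statistic for a 2x2 table of allele counts: cases vs. controls."""
    table = [[case_alt, case_ref], [ctrl_alt, ctrl_ref]]
    row_totals = [sum(r) for r in table]
    col_totals = [case_alt + ctrl_alt, case_ref + ctrl_ref]
    n = sum(row_totals)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical marker: risk allele on 30% of case chromosomes, 25% of controls
stat = allelic_chi_square(case_alt=300, case_ref=700, ctrl_alt=250, ctrl_ref=750)
# stat > 3.84 corresponds to p < 0.05 on 1 degree of freedom
```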

The missing heritability problem that clouds the birth of the personal genomes era refers to the baffling inability to find enough common genetic variants that can account for the genetic risk of an illness or disorder.

There are any number of reasons for this … (i) even though any given MZ or DZ twin pair shares genetic variants that predispose them toward similar brains and mental states, different pairs may carry different types of rare genetic variation, diluting out any shared pattern when large pools of cases and controls are compared … (ii) the way the environment interacts with common risk-promoting genetic variation may also differ from person to person, making it hard to find variation that is similarly risk-promoting across large pools of cases and controls … and many others, I’m sure.

One research group recently asked whether the type of common genetic variation (SNP vs. CNV) might inform the search for the missing heritability.  The authors of the recent paper, “Genome-wide association study of CNVs in 16,000 cases of eight common diseases and 3,000 shared controls” [doi:10.1038/nature08979] looked at an alternative to the usual SNP markers – so-called common copy number variants (CNVs) – and asked whether these markers might provide a stronger accounting of genetic risk.  While a number of previous papers in the mental health field have indeed shown associations with CNVs, this massive study (some 3,432 CNV probes in roughly 2,000 bipolar cases and 3,000 controls) did not reveal an association with bipolar disorder.  Furthermore, the team reports that common CNVs are already in fairly strong linkage disequilibrium with common SNPs, and so may not reach any farther into the abyss of rare genetic variation than previous GWAS efforts.

Disappointing perhaps, but a big step forward nonetheless!  What will the personal genomes era look like if we all have different forms of rare genetic variation?


[Image: “Crocus (cropped)” by noahg. via Flickr]

If you’ve started to notice the arrival of spring blossoms, you may have wondered, “how do the blossoms know when it’s spring?”  Well, it turns out that it’s not the temperature; rather, plants sense the length of the daylight cycle in order to synchronize their own life cycles with the seasons.  According to the photoperiodism entry on Wikipedia, “Many flowering plants use a photoreceptor protein, such as phytochrome or cryptochrome, to sense seasonal changes in night length, or photoperiod, which they take as signals to flower.”

It turns out that humans are much the same. Say wha?!

Yep, as the long-ago descendants of single cells that had to eke out a living through day (when the sun emits mutagenic UV radiation) and night cycles, our very own basic molecular machinery – regulating transcription, translation, replication and a host of other cellular functions – is remarkably sensitive, entrained in a clock-like fashion, to the rising and setting sun.  This is because, in our retinas, there are light-sensing cells that send signals to the suprachiasmatic nucleus (SCN), which then – via the pineal gland – secretes systemic hormones such as melatonin that help synchronize cells and organs in your brain and body.  When this process is disrupted, folks can feel downright lousy, as seen in seasonal affective disorder (SAD), delayed sleep phase syndrome (DSPS) and other circadian rhythm disorders.

If you’re skeptical, consider the effects of genetic variation in genes that regulate our circadian rhythms – often called “clock” genes – very ancient genes that keep our cellular clocks synchronized with each other and with the outside environment.  Soria et al. have a great paper entitled, “Differential Association of Circadian Genes with Mood Disorders: CRY1 and NPAS2 are Associated with Unipolar Major Depression and CLOCK and VIP with Bipolar Disorder” [doi: 10.1038/npp.2009.230] wherein they reveal that normal variation in these clock genes is associated with mood regulation.

A few of the highlights reported are rs2287161 in the CRY1 gene,  rs11123857 in the NPAS2 gene, and rs885861 in the VIPR2 gene – where the C-allele, G-allele and C-allele, respectively, were associated with mood disorders.

I’m not sure how best to interpret genetic variation in such circadian rhythm genes.  Perhaps it indexes how much a person’s mood can be influenced by changes or disruptions to the normal rhythm?  Not sure.  My 23andMe data shows the non-risk AA genotype for rs11123857 (the others are not covered by 23andMe).
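For the curious, checking your own raw data for markers like these is just a file scan: the 23andMe raw download is a tab-separated text file with rsid, chromosome, position and genotype columns, and comment lines starting with “#”.  A minimal sketch of the lookup – the sample lines below, including the position number, are fabricated:

```python
def lookup_genotypes(raw_lines, rsids):
    """Pull genotypes for selected rsids out of a 23andMe-style raw data file.

    Each data line is tab-separated: rsid, chromosome, position, genotype.
    Lines starting with '#' are comments and are skipped.
    """
    wanted = set(rsids)
    found = {}
    for line in raw_lines:
        if line.startswith("#"):
            continue
        rsid, chrom, pos, genotype = line.rstrip("\n").split("\t")
        if rsid in wanted:
            found[rsid] = genotype
    return {rsid: found.get(rsid, "not covered") for rsid in rsids}

raw = [
    "# This data file is in 23andMe's raw-download format (made-up content)",
    "rs11123857\t2\t100000000\tAA",
]
print(lookup_genotypes(raw, ["rs11123857", "rs2287161"]))
# -> {'rs11123857': 'AA', 'rs2287161': 'not covered'}
```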


According to Wikipedia, “Jean Philippe Arthur Dubuffet (July 31, 1901 – May 12, 1985) was one of the most famous French painters and sculptors of the second half of the 20th century.”  “He coined the term Art Brut (meaning “raw art,” often referred to as ‘outsider art’) for art produced by non-professionals working outside aesthetic norms, such as art by psychiatric patients, prisoners, and children.”  From this interest, he amassed the Collection de l’Art Brut, a sizable collection of artwork, more than half of which was painted by artists with schizophrenia.  One painting that typifies this style is General view of the island Neveranger (1911) by Adolf Wölfli, a psychiatric patient.

Obviously, Wölfli was a gifted artist, despite whatever psychiatric diagnosis was suggested at the time.  Nevertheless, clinical psychiatrists might be quick to point out that such work reflects an underlying thought disorder (loss of abstraction ability, tangentiality, loose associations, derailment, thought blocking, overinclusive thinking, etc.) – despite the undeniable aesthetic beauty of the work.  As an ardent fan of such art, it made me wonder just how “well ordered” my own thoughts might be.  Given to being rather forgetful and distractible, I suspect my thinking process is just sufficiently well ordered to perform the routine tasks of day-to-day living, but perhaps not a whole lot more.  Is this bad or good?  Who knows.

However, Krug et al., in their recent paper, “The effect of Neuregulin 1 on neural correlates of episodic memory encoding and retrieval” [doi:10.1016/j.neuroimage.2009.12.062], note that the brains of unaffected relatives of persons with mental illness show subtle differences in various patterns of activation.  It seems that when individuals are using their brains to encode information for memory storage, unaffected relatives show greater activation in areas of the frontal cortex than unrelated subjects.  This so-called encoding process during episodic memory is very important for a healthy memory system, and its dysfunction is correlated with thought disorders and other aspects of cognitive dysfunction.  Krug et al. explore this encoding process further and ask whether a well-known schizophrenia risk variant (rs35753505, C vs. T) in the neuregulin-1 gene might underlie this phenomenon.  To do this, they asked 34 TT, 32 TC and 28 CC individuals to perform a memory (of faces) game while lying in an MRI scanner.

The team reports that there were indeed differences in brain activity during both the encoding (storage) and retrieval (recall) portions of the task – differences that were correlated with genotype, and in which the CC risk genotype was associated with more (hyper-) activation.  Some of the brain areas hyperactivated during encoding in association with the CC genotype were the left middle frontal gyrus (BA 9), the bilateral fusiform gyrus and the left middle occipital gyrus (BA 19).  The left middle occipital gyrus also showed genotype-associated hyperactivation during recall.  So it seems that healthy individuals can carry risk for mental illness and that their brains may actually function slightly differently.
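Statistically, “activation differed by genotype” usually comes down to a one-way ANOVA of some activation measure (e.g., a per-subject beta weight from the fMRI model) across the TT/TC/CC groups.  A rough sketch of the F statistic – not the authors’ actual analysis pipeline, and the beta values below are invented:

```python
def one_way_f(groups):
    """F statistic for a one-way ANOVA across genotype groups.

    Each element of `groups` is a list of per-subject activation values.
    """
    values = [x for g in groups for x in g]
    n, k = len(values), len(groups)
    grand_mean = sum(values) / n
    group_means = [sum(g) / len(g) for g in groups]
    # Between-group variance vs. within-group variance
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical beta weights for TT, TC and CC carriers (CC hyperactivating)
tt, tc, cc = [0.9, 1.1, 1.0], [1.1, 1.3, 1.2], [1.5, 1.7, 1.6]
f_stat = one_way_f([tt, tc, cc])  # large F -> genotype groups differ
```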

As an ardent fan of Art Brut, I confess I hoped I would carry the CC genotype, but alas, my 23andMe profile shows a boring TT genotype.  No wonder my artwork sucks.  More on NRG1 here.


If you’re a coffee drinker, you may have noticed the new super-sized portions available at Starbucks.  With that in mind, it may be worth noting that caffeine is a potent psychoactive substance, too much of which can turn your buzz into a full-blown panic disorder.  The Diagnostic and Statistical Manual of psychiatry outlines a number of caffeine-related conditions, mostly involving anxieties that can arise when its natural alertness-promoting effects are pushed to extremes.  Some researchers have begun to explore how the genome interacts with caffeine, and it is likely that many genetic markers will surface to explain some of the individual differences in caffeine tolerance.

Here’s a great paper, “Association between ADORA2A and DRD2 Polymorphisms and Caffeine-Induced Anxiety” [doi: 10.1038/npp.2008.17] wherein polymorphisms in the adenosine A2A receptor gene (ADORA2A encodes the protein that caffeine binds to and antagonizes) – as well as the dopamine D2 receptor gene (DRD2 encodes a protein whose downstream signals are normally counteracted by A2A receptors) – show associations with anxiety after the consumption of 150 mg of caffeine (about an average cup of coffee – much less than the super-size, super-rich cups that Starbucks sells).  The variants rs5751876 (T-allele), rs2298383 (T-allele) and rs4822492 (G-allele) in the ADORA2A gene, as well as rs1110976 (-/G genotype) in the DRD2 gene, showed significant associations with increased anxiety in a test population of 102 otherwise-healthy, light-to-moderate regular coffee drinkers.

My own 23andMe data only provides a drop of information suggesting I’m protected from the anxiety-promoting effects.  Nevertheless, I’ll avoid the super-sizes.
rs5751876 (T-allele) – C/C: less anxiety
rs2298383 (T-allele) – not covered
rs4822492 (G-allele) – not covered
rs1110976 (-/G genotype) – not covered


DON’T tell the grant funding agencies, but, in at least one way, the effort to relate genetic variation to individual differences in cognitive function is a totally intractable waste of money.

Let’s say we ask a population of folks to perform a task – perhaps a word memory task – and then use neuroimaging to identify the areas of the brain that (i) were associated with performance of the task, and (ii) were not only associated with performance, but were also associated with genetic variation in the population.  Indeed, there are already examples of just this type of “imaging-genetic” study in the literature.  Such studies form a crucial translational link in understanding how genes (whose biochemical functions are most often studied in animal models) relate to human brain function (usually studied with cognitive psychology).  However, do these genes relate only to this task?  What if subjects were recalling objects, or feelings?  What if subjects were recalling objects, experiences or feelings from their childhoods?  Of course, there are thousands of common cognitive operations one’s brain routinely performs, and hence thousands of experimental paradigms that could be used in such “imaging-genetic” gene association studies.  At more than $500/hour in imaging costs (and some paradigms last up to 2 hours), the translational genes-to-cognition endeavor could get expensive!
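To put a number on “expensive”, here is a quick back-of-envelope using the figures in the text plus some assumed study parameters – the sample size and paradigm count are my guesses, not from any real budget:

```python
scanner_rate_per_hour = 500   # dollars, figure from the text
hours_per_paradigm = 2        # upper end quoted in the text
n_subjects = 100              # assumed subjects per imaging-genetic study
n_paradigms = 1000            # "thousands" of candidate cognitive tasks

# Cost of scanning one cohort on every candidate paradigm
total_cost = scanner_rate_per_hour * hours_per_paradigm * n_subjects * n_paradigms
print(f"${total_cost:,}")  # $100,000,000
```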

DO tell the grant funding agencies that this may not be a problem any longer.

The recent paper by Liu and colleagues, “Prefrontal-Related Functional Connectivities within the Default Network Are Modulated by COMT val158met in Healthy Young Adults” [doi: 10.1523/jneurosci.3941-09.2010], suggests an approach that may simplify matters.  Their approach still involves genotyping (in this case for rs4680) and neuroimaging.  However, instead of administering a specific cognitive task, the team asks subjects to lie in the scanner – and do nothing.  That’s right – nothing – just lie still with eyes closed, let the mind wander, and not think about anything in particular – for a mere 10 minutes.  Huh?  What the heck can you learn from that?

It turns out that one can learn a lot.  This is because the neural pathways the brain uses when you are actively doing something (a word recall task) are largely intact even when you are doing nothing.  Your brain does not “turn off” when you are lying still with your eyes closed, drifting in thought.  Rather, it slips into a kind of default pattern, described in studies of “default networks” or “resting-state networks”, in which wide-ranging brain circuits remain dynamically coupled and actively exchange neural information.  One really great paper that describes these networks is a free and open article by Hagmann et al., “Mapping the Structural Core of Human Cerebral Cortex” [doi: 10.1371/journal.pbio.0060159], from which I’ve lifted their Figure 1 above.  The work by Hagmann et al. and others shows that the brain has a sort of “connectome” of thousands of “connector hubs” or nodes that remain actively coupled (meaning that if one node fires, the other fires in a synchronized way) both when the brain is at rest and when it is actively performing cognitive operations.  In a few studies, it seems that the strength of functional coupling in certain brain areas at rest is correlated (positively and negatively) with the activation of those areas when subjects are performing a specific task.
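“Dynamically coupled” has a concrete meaning here: the functional connectivity between two nodes is typically just the Pearson correlation of their activity time series, and a resting-state network is read off the matrix of all pairwise correlations.  A toy sketch with fabricated three-node time series:

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def connectivity_matrix(roi_timeseries):
    """Pairwise correlations of ROI time series = a functional connectivity matrix."""
    k = len(roi_timeseries)
    return [[pearson(roi_timeseries[i], roi_timeseries[j]) for j in range(k)]
            for i in range(k)]

# Made-up signals: node 1 tracks node 0; node 2 runs opposite to both
ts = [[1, 2, 3, 4], [2, 4, 6, 8], [4, 3, 2, 1]]
m = connectivity_matrix(ts)
# m[0][1] is ~1.0 (coupled nodes); m[0][2] is ~-1.0 (anticorrelated)
```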

In the genetic study reported by Liu and colleagues, genotype (N=57) at the dopaminergic COMT gene correlated with differences in the functional connectivity (synchronization of firing) of nodes in the prefrontal cortex.  This result is eerily similar to results found for a number of specific tasks (N-back, Wisconsin Card Sorting, gambling, etc.) where COMT genotype was correlated with differential activation of the frontal cortex during the task.  So it seems that one imaging paradigm (lie still and rest for 10 minutes) provided insights comparable to several lengthy (and diverse) activation tasks.  Perhaps this is the case.  If so, might it provide a more direct route to linking genetic variation with cognitive function?

Liu and colleagues do not comment on this proposition directly, nor do they seem to over-interpret their results in the way I have editorialized here.  They very thoughtfully point out the ways in which the networks they’ve identified are similar to, and different from, the published findings of others.  Certainly, this study and others like it are the first in what might be a promising new direction!

