
Posts Tagged ‘Biology’

3rd Dalai Lama,
Image via Wikipedia

Just a few excerpts from a lecture by the renowned social psychologist Paul Ekman, who is known for his work on the biology of human emotion.  Here he relates conceptual bridges between the writings of Charles Darwin and HH The Dalai Lama.  Ekman notes that both Darwin and HH The Dalai Lama intuit the existence of an organic, natural source of compassion wherein humans are compelled to relieve the suffering of others so that the discomfort we feel when seeing others in pain can be relieved.  HH The Dalai Lama further suggests that while these emotions are spontaneous, compassion can be enhanced through PRACTICE!

Seems that science and ancient traditions can have a fascinating way of re-informing each other.


Read Full Post »

Nucleosome structure.
Image via Wikipedia

A pointer to the NOVA program on epigenetics, “Ghost in Your Genes” (YouTube link here).  Fantastic footage.  A great intro to epigenetics, so-called trans-generational effects, and the inheritance of epigenetic marks – which, in some cases, are left by adverse or stressful experience.  A weird, wild, game-changing concept indeed – that my grandchildren could inherit epigenetic changes induced in my genome by adverse experience.


Read Full Post »

Twin studies have long suggested that genetic variation is a part of healthy and disordered mental life.  The problem however – some 10 years now since the full genome sequence era began – has been finding the actual genes that account for this heritability.

It sounds simple on paper – just collect lots of folks with disorder X and look at their genomes in reference to a demographically matched healthy control population.  Voila! Whatever is different is a candidate for genetic risk.  Apparently, it’s not so simple.

The missing heritability problem that clouds the birth of the personal genomes era refers to the baffling inability to find enough common genetic variants that can account for the genetic risk of an illness or disorder.

There are any number of reasons for this … (i) even though any given MZ or DZ twin pair shares genetic variants that predispose them toward similar brains and mental states, different MZ and DZ pairs may carry different types of rare genetic variation, diluting out any shared patterns of variation when large pools of cases and controls are compared …  (ii) also, the way the environment interacts with common risk-promoting genetic variation may differ from person to person – making it hard to find variation that is similarly risk-promoting across large pools of cases and controls … and many others, I’m sure.

One research group recently asked whether the type of common genetic variation (SNP vs. CNV) might inform the search for the missing heritability.  The authors of the recent paper, “Genome-wide association study of CNVs in 16,000 cases of eight common diseases and 3,000 shared controls” [doi:10.1038/nature08979] looked at an alternative to the usual SNP markers – so-called common copy number variants (CNVs) – and asked whether these markers might provide a stronger accounting for genetic risk.  While a number of previous papers in the mental health field have indeed shown associations with CNVs, this massive study (some 3,432 CNV probes in 2,000 or so cases and 3,000 controls) did not reveal an association with bipolar disorder.  Furthermore, the team reports that common CNVs are already in fairly strong linkage disequilibrium with common SNPs, and so may not have reached any farther into the abyss of rare genetic variation than previous GWAS studies.
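The linkage-disequilibrium point can be made concrete with a toy calculation (all frequencies are invented for illustration, not taken from the paper): for two biallelic markers – say a SNP and a common CNV coded as presence/absence – the squared correlation r² between them measures how redundant the second marker is.

```python
# Toy illustration of linkage disequilibrium (LD) between two biallelic
# markers, e.g. a SNP allele "A" and a CNV allele "B".
# Haplotype and allele frequencies below are invented for illustration.

def ld_stats(pAB, pA, pB):
    """Return D and r^2 for two biallelic loci, given the frequency of
    the A-B haplotype (pAB) and the marginal allele frequencies pA, pB."""
    D = pAB - pA * pB
    r2 = D**2 / (pA * (1 - pA) * pB * (1 - pB))
    return D, r2

# Strong LD: the CNV allele almost always rides on the same haplotype
# as the tagging SNP allele.
D, r2 = ld_stats(pAB=0.28, pA=0.30, pB=0.30)
print(f"D = {D:.3f}, r^2 = {r2:.2f}")
```

When r² is high, genotyping the CNV tells you little that a nearby tagging SNP hadn’t already – which is essentially the authors’ argument for why the CNV scan did not dig deeper than earlier SNP-based GWAS.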

Disappointing perhaps, but a big step forward nonetheless!  What will the personal genomes era look like if we all have different forms of rare genetic variation?


Read Full Post »

Crocus (cropped)
Image by noahg. via Flickr

If you’ve started to notice the arrival of spring blossoms, you may have wondered, “How do the blossoms know when it’s spring?”  Well, it turns out that it’s not the temperature; rather, plants sense the length of the daylight cycle in order to synchronize their own life cycles with the seasons.  According to the photoperiodism entry on Wikipedia, “Many flowering plants use a photoreceptor protein, such as phytochrome or cryptochrome, to sense seasonal changes in night length, or photoperiod, which they take as signals to flower.”

It turns out that humans are much the same. Say wha?!

Yep, as the long-ago descendants of single cells that had to eke out a living through day (when the sun emits mutagenic UV radiation) and night cycles, our very own basic molecular machinery that regulates transcription, translation, replication and a host of other cellular functions is remarkably sensitive – entrained, in a clock-like fashion – to the rising and setting sun.  This is because, in our retinas, there are light-sensing cells that send signals to the suprachiasmatic nucleus (SCN), which then – via the pineal gland – secretes systemic hormones such as melatonin that help synchronize cells and organs in your brain and body.  When this process is disrupted, folks can feel downright lousy, as seen in seasonal affective disorder (SAD), delayed sleep phase syndrome (DSPS) and other circadian rhythm disorders.

If you’re skeptical, consider the effects of genetic variation in genes that regulate our circadian rhythms, often called “clock” genes – very ancient genes that keep our cellular clocks synchronized with each other and the outside environment.  Soria et al., have a great paper entitled, “Differential Association of Circadian Genes with Mood Disorders: CRY1 and NPAS2 are Associated with Unipolar Major Depression and CLOCK and VIP with Bipolar Disorder” [doi: 10.1038/npp.2009.230] wherein they reveal that normal variation in these clock genes is associated with mood regulation.

A few of the highlights reported are rs2287161 in the CRY1 gene,  rs11123857 in the NPAS2 gene, and rs885861 in the VIPR2 gene – where the C-allele, G-allele and C-allele, respectively, were associated with mood disorders.

I’m not sure how one would best interpret genetic variation of such circadian rhythm genes.  Perhaps they index how much a person’s mood could be influenced by changes or disruptions to the normal rhythm??  Not sure.  My 23andMe data shows the non-risk AA genotype for rs11123857 (the others are not covered by 23andMe).
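For the curious, pulling genotypes like these out of a raw-data export is a few lines of scripting.  A minimal sketch – the two data rows below are invented stand-ins for a real file, though the tab-separated rsid/chromosome/position/genotype layout (with “#” comment lines) matches 23andMe’s raw download:

```python
# Look up the circadian-gene SNPs from the Soria et al. paper in a
# 23andMe-style raw-data export.  SAMPLE is an invented two-row stand-in.

SAMPLE = """# rsid\tchromosome\tposition\tgenotype
rs11123857\t2\t101436368\tAA
rs4680\t22\t19951271\tAG
"""

CLOCK_SNPS = {"rs2287161": "CRY1", "rs11123857": "NPAS2", "rs885861": "VIPR2"}

def lookup(lines, rsids):
    """Return {rsid: genotype} for every requested rsid present in the file."""
    found = {}
    for line in lines:
        if line.startswith("#") or not line.strip():
            continue  # skip comment/blank lines
        rsid, _chrom, _pos, genotype = line.rstrip("\n").split("\t")
        if rsid in rsids:
            found[rsid] = genotype
    return found

genotypes = lookup(SAMPLE.splitlines(), CLOCK_SNPS)
for rsid, gene in sorted(CLOCK_SNPS.items()):
    print(gene, rsid, genotypes.get(rsid, "not covered"))
```

Against a real export you would read the lines from the downloaded file instead of the SAMPLE string.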


Read Full Post »

We hope, that you choke, that you choke.
Image by Corrie… via Flickr

Coping with fear and anxiety is difficult.  At times when one’s life, livelihood or loved ones are threatened, we naturally heighten our senses and allocate our emotional and physical resources for conflict.  At times when all is well, and resources, relationships and relaxation time are plentiful, we should unwind and enjoy the moment.  But most of us don’t.  Our prized cognitive abilities to remember, relive and ruminate on the bad stuff out there are just too well developed – and we suffer, some more than others (see Robert Sapolsky’s book “Why Zebras Don’t Get Ulcers” and related video lecture – hint: they don’t get ulcers because they don’t have the cognitive ability to ruminate on past events).  Such may be the flip side to our (Homo sapiens) super-duper cognitive abilities.

Nevertheless, we try to understand our fears and anxieties and their bio-social-psychological bases.  A recent paper entitled “A Genetically Informed Study of the Association Between Childhood Separation Anxiety, Sensitivity to CO2, Panic Disorder, and the Effect of Childhood Parental Loss” by Battaglia et al. [Arch Gen Psychiatry. 2009;66(1):64-71] brought to mind many of the complexities in beginning to understand how some individuals come to suffer more emotional anguish than others.  The research team addressed a set of emotional difficulties that psychiatrists categorize as “panic disorder”, involving sudden attacks of fear, sweating, racing heart, shortness of breath, etc., which can begin to occur in early adulthood.

Right off the bat, it seems that one of the difficulties in understanding such emotional states is the set of conventions (important for $$ billing purposes) used to describe the relationship between “healthy” and “illness” or “disorder”.  I mean, honestly, who hasn’t experienced what could be described as a mild panic attack once or twice?  I have, but perhaps that doesn’t amount to a disorder.  A good read on the conflation of normal stress responses and disordered mental states is “Transforming Normality into Pathology: The DSM and the Outcomes of Stressful Social Arrangements” by Allan V. Horwitz.

Another difficulty in understanding how and why someone might experience such a condition has to do with the complexities of their childhood experience (not to mention genes).  Child development and mental health are inextricably related, yet the relationship is hard to understand.  Certainly, the function of the adult brain is the product of countless developmental unfoldings that build upon one another, and certainly there is ample evidence that when healthy development is disrupted in a social or physical way, the consequences can be very unfortunate and long-lasting.  Yet our ability to make sense of how and why an individual is having mental and/or emotional difficulty is limited.  It’s a complex, interactive and emergent set of processes.

What I liked about the Battaglia et al. article was the way in which they acknowledged all of these complexities and – using a multivariate twin study design – tried to objectively measure the effects of genes and environment (early and late) as well as a candidate biological pathway (sensitivity to carbon dioxide).  The team gathered 346 twin pairs (an equal mix of MZ and DZ) and assessed aspects of early and late emotional life as well as sensitivity to the inhalation of 35% CO2 (which feels rather like suffocating and is known to activate fear circuitry, perhaps via the ASIC1a gene).  The basic notion was to parse out the correlations between early emotional distress and adult emotional distress as well as with a very specific physiological response (fear elicited by breathing CO2).  If there were no correlation or covariation between early and late distress (or the physiological response), then perhaps these processes are not underlain by any common mechanism.

However, the team found that there was covariation between early-life emotion (criteria for separation anxiety disorder) and adult emotion (panic disorder) as well as the physiological fear response elicited by CO2.  Indeed, there seems to be a common, or continuous, set of processes whose disruption early in development can manifest as emotional difficulty later in development.  Furthermore, the team suggests that the underlying unifying or core process is heavily regulated by a set of additive genetic factors.  Lastly, the team finds that the experience of parental loss in childhood increased the strength of the covariation between early emotion, late emotion and CO2 reactivity (though not via an interaction with genetic variation).  The authors note several limitations and caution against over-interpreting these data – which are from the largest such study of its kind to date.
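The logic behind “additive genetic factors” in twin designs can be sketched with Falconer’s classic approximation: MZ twins share essentially all segregating variants and DZ twins about half, so twice the MZ–DZ correlation gap estimates the additive genetic component.  A back-of-envelope version (the correlations are invented for illustration, not Battaglia et al.’s estimates):

```python
# Falconer-style decomposition of a trait's variance from twin correlations.
# r_mz and r_dz are invented illustration values.

def falconer(r_mz, r_dz):
    """Approximate variance components from MZ/DZ twin correlations:
    A (additive genetic), C (shared environment), E (unique environment)."""
    A = 2 * (r_mz - r_dz)   # MZ twins share ~2x the additive variance of DZ
    C = r_mz - A            # whatever MZ similarity A doesn't explain
    E = 1 - r_mz            # residual, twin-specific influences
    return A, C, E

A, C, E = falconer(r_mz=0.60, r_dz=0.35)
print(f"A = {A:.2f}, C = {C:.2f}, E = {E:.2f}")
```

Real multivariate twin models (like the one in the paper) fit these components jointly across several measures to see whether one shared A factor drives the covariation, but the arithmetic above is the intuition underneath.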

For individuals who are tangled in persistent ruminations and emotional difficulties, I don’t know if these findings help.  They seem to bear out some of the cold, cruel logic of life and evolution – that our fear systems are great at keeping us alive when we’ve had adverse experience in childhood, but not necessarily happy.  On the other hand, the covariation is weak, so there is no such destiny in life, even when dealt unfortunate early experience AND genetic risk.  I hope that learning about the science might help folks cope with such cases of emotional distress.


Read Full Post »

DON’T tell the grant funding agencies, but, in at least one way, the effort to relate genetic variation to individual differences in cognitive function is a totally intractable waste of money.

Let’s say we ask a population of folks to perform a task – perhaps a word memory task – and then we use neuroimaging to identify the areas of the brain that (i) were associated with performance of the task, and (ii) were not only associated with performance, but were also associated with genetic variation in the population.  Indeed, there are already examples of just this type of “imaging-genetic” study in the literature.  Such studies form a crucial translational link in understanding how genes (whose biochemical functions are most often studied in animal models) relate to human brain function (usually studied with cognitive psychology). However, do these genes relate to just this task? What if subjects were recalling objects? or feelings?  What if subjects were recalling objects / experiences / feelings / etc. from their childhoods?  Of course, there are thousands of common cognitive operations one’s brain routinely performs, and, hence, thousands of experimental paradigms that could be used in such “imaging-genetic” gene association studies.  At more than $500/hour (some paradigms last up to 2 hours) in imaging costs, the translational genes-to-cognition endeavor could get expensive!
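A quick back-of-envelope using the ballpark figures above makes the point (the subject count per paradigm is my own invented assumption):

```python
# Rough cost arithmetic for scanning many subjects on many task paradigms.
# rate and duration come from the text; subject/paradigm counts are
# invented assumptions for illustration.

rate_per_hour = 500        # imaging cost, $/hour (from the text)
hours_per_paradigm = 1.5   # paradigms run "up to 2 hours"
n_subjects = 100           # assumed subjects per paradigm
n_paradigms = 1000         # "thousands of experimental paradigms"

total = rate_per_hour * hours_per_paradigm * n_subjects * n_paradigms
print(f"${total:,.0f}")
```

Even with conservative numbers, the task-by-task route runs into the tens of millions of dollars – hence the appeal of a single paradigm that generalizes.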

DO tell the grant funding agencies that this may not be a problem any longer.

The recent paper by Liu and colleagues “Prefrontal-Related Functional Connectivities within the Default Network Are Modulated by COMT val158met in Healthy Young Adults” [doi: 10.1523/jneurosci.3941-09.2010] suggests an approach that may simplify matters.  Their approach still involves genotyping (in this case for rs4680) and neuroimaging.  However, instead of performing a specific cognitive task, the team asks subjects to lie in the scanner – and do nothing.  That’s right – nothing – just lie still with eyes closed, let the mind wander and not think about anything in particular – for a mere 10 minutes.  Hunh?  What the heck can you learn from that?

It turns out that one can learn a lot.  This is because the neural pathways that the brain uses when you are actively doing something (a word recall task) are largely intact even when you are doing nothing.  Your brain does not “turn off” when you are lying still with your eyes closed and drifting in thought.  Rather, your brain slips into a kind of default pattern, described in studies of “default networks” or “resting-state networks,” where wide-ranging brain circuits remain dynamically coupled and actively exchange neural information.  One really great paper that describes these networks is a free-and-open article by Hagmann et al., “Mapping the Structural Core of Human Cerebral Cortex” [doi: 10.1371/journal.pbio.0060159] from which I’ve lifted their Figure 1 above.  The work by Hagmann et al. and others shows that the brain has a sort of “connectome” with thousands of “connector hubs” or nodes that remain actively coupled (meaning that if one node fires, the other node will fire in a synchronized way) both when the brain is at rest and when it is actively performing cognitive operations.  In a few studies, it seems that the strength of functional coupling in certain brain areas at rest is correlated (positively and negatively) with the activation of these areas when subjects are performing a specific task.
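For a sense of what “dynamically coupled” means operationally: resting-state functional connectivity between two nodes is typically just the Pearson correlation of their activity time courses.  A synthetic sketch (the signals are invented; real studies use preprocessed BOLD time series):

```python
# Two "nodes" driven by a shared slow signal are functionally coupled;
# a third node of pure noise is not.  All signals are synthetic.
import math
import random

random.seed(0)

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

common = [math.sin(t / 8.0) for t in range(240)]       # shared slow drive
node_a = [c + random.gauss(0, 0.3) for c in common]    # coupled node
node_b = [c + random.gauss(0, 0.3) for c in common]    # coupled node
node_c = [random.gauss(0, 1.0) for _ in range(240)]    # uncoupled node

print("A-B coupling:", round(pearson(node_a, node_b), 2))
print("A-C coupling:", round(pearson(node_a, node_c), 2))
```

The COMT result then amounts to asking whether such pairwise coupling values differ systematically between genotype groups.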

In the genetic study reported by Liu and colleagues, they found that genotype (N=57) at the dopaminergic COMT gene correlated with differences in the functional connectivity (synchronization of firing) of nodes in the prefrontal cortex.  This result is eerily similar to results found for a number of specific tasks (N-back, Wisconsin Card Sorting, gambling, etc.) where COMT genotype was correlated with differential activation of the frontal cortex during the task.  So it seems that one imaging paradigm (lie still and rest for 10 minutes) provided comparable insights to several lengthy (and diverse) activation tasks.  Perhaps this is the case.  If so, might it provide a more direct route to linking genetic variation with cognitive function?

Liu and colleagues do not comment on this proposition directly, nor do they seem to be over-interpreting their results in the way I have editorialized things here.  They very thoughtfully point out the ways in which the networks they’ve identified are similar to, and different from, the published findings of others.  Certainly, this study and the others like it are among the first in what might be a promising new direction!


Read Full Post »

silver copy of a 1930 penny
Image via Wikipedia

In their forecast “The World in 2010” special issue, The Economist points to “The looming crisis in human genetics,” wherein scientists will reluctantly acknowledge that, even with super-cheap genome-sequencing tools, we may not soon understand how genetic variation contributes to complex illness.  The argument is a valid one, to be sure, but only time will tell.

A paper I read recently reminded me of the long, hard slog ahead in the area of genomics and psychiatric illness.  The authors of “Association of the Glutamate Transporter Gene SLC1A1 With Atypical Antipsychotic-Induced Obsessive-Compulsive Symptoms” [Kwon et al., (2009) Arch Gen Psychiatry 66(11)] are trying to do something very important.  They would like to understand why certain (most) psychiatric medications have adverse side-effects and how to steer patients clear of them.  This is because, nowadays, a patient learns only through a drawn-out trial-and-error ordeal which medications offer benefits that outweigh the costs.

Specifically, the authors focused their efforts on so-called obsessive-compulsive symptoms that can arise from treatment with atypical antipsychotic medications.  Working from 3 major medical centers (Samsung Medical Center, Seoul National University Hospital and Asan Medical Center), Kwon et al. were able to cobble together a mere 40 patients who display these particular adverse side-effects and matched them with 54 patients based on several demographic and medication-based criteria.  Keep in mind that most genetic studies use upwards of 1,000 samples and are still hardly able to obtain significant effects.

Nevertheless, the authors note that the glutamate transporter gene (SLC1A1, or EAAC1) is a most logical candidate gene, being located in a region mapped for obsessive-compulsive disorder risk and also appearing to be down-regulated in response to atypical antipsychotic treatment (particularly clozapine).  A series of statistical association tests for 10 SNPs in this gene reveal that two SNPs (rs2228622 and rs3780412) and a 3-SNP haplotype (the A/C/G haplotype at rs2228622-rs3780413-rs3780412) showed modestly significant association (about 4-fold higher risk) with the adverse symptoms.
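For readers wondering where a figure like “4-fold higher risk” comes from: it is usually the odds ratio from a 2×2 allele-count table, cases vs. controls.  A sketch with invented counts (not the Kwon et al. data):

```python
# Odds ratio and 95% CI for a case-control 2x2 allele-count table.
# The counts below are invented for illustration.
import math

def odds_ratio_ci(a, b, c, d):
    """2x2 table: a,b = risk/other allele counts in cases;
    c,d = risk/other allele counts in controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf's SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# e.g. risk allele on 30 of 80 case chromosomes vs. 10 of 90 control ones
or_, lo, hi = odds_ratio_ci(30, 50, 10, 80)
print(f"OR = {or_:.1f}  (95% CI {lo:.1f}-{hi:.1f})")
```

With only 40 cases, the confidence interval stays wide even for a large odds ratio – one reason results like these are called “modestly significant.”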

To me, this is a very noteworthy finding.  A lot of work went into a very important problem – perhaps THE most pressing problem for patients on anti-psychotic medications today – and the results, while only of modest significance, are probably biologically valid.  The authors point out that rs2228622 and rs3780412 have previously been associated with OCD in other studies.

But when you compare these modest results (that these authors fought hard to obtain) with the big promises of the genomic era (as noted in the Economist article), well then, the results seem rather diminutive.  Will all patients who carry the risk haplotype be steered away from atypical antipsychotics?  Will big pharma (the authors of this paper disclose a great many ties to big pharma) support the fragmentation of their blockbuster drug markets into a hundred sub-populations?  I doubt it.  But some doctors and patients will experiment and continue to explore this avenue of inquiry – and it will take a long time to work out.  Better check back in 2020.


Read Full Post »

Where da rodents kick it
Image by Scrunchleface via Flickr

A recent GWAS study identified the 3′ region of the liver- (not brain) expressed PECR gene (rs7590720(G) and rs1344694(T)) on chromosome 2 as a risk factor for alcohol dependency.  These results, as reported by Treutlein et al., in “Genome-wide Association Study of Alcohol Dependence” were based on a population of 487 male inpatients and a follow-up re-test in a population of 1024 male inpatients and 996 control participants.

The authors also asked whether lab rats – given the choice between water and ethanol-spiked beverages over the course of 1 year – show differential gene expression depending on whether they prefer alcohol or not.  Among a total of 542 genes found to be differentially expressed in the amygdala and caudate nucleus of alcohol-preferring vs. non-preferring rat strains, a mere 3 – that is, the human orthologs of these 3 genes – also showed significant association with alcohol dependency in the human populations.  Here are the “rat genes” (i.e., human homologs that show differential expression in rats and association with alcohol dependency in humans): rs1614972(C) in the alcohol dehydrogenase 1C (ADH1C) gene, rs13273672(C) in the GATA binding protein 4 (GATA4) gene, and rs11640875(A) in the cadherin 13 (CDH13) gene.
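The cross-species filtering step boils down to a set intersection: genes differentially expressed in the rat strains, intersected with human orthologs that reached association.  A toy sketch with stand-in gene lists (the real rat list had 542 entries; the extra gene names here are invented fillers):

```python
# Cross-species convergence as a set intersection.  The three overlapping
# genes are the ones named in the post; the other entries are fillers.

rat_diff_expressed = {"ADH1C", "GATA4", "CDH13", "FTO", "ACTB"}
human_associated = {"ADH1C", "GATA4", "CDH13", "DRD2"}

convergent = sorted(rat_diff_expressed & human_associated)
print(convergent)
```

Simple as it looks, this kind of convergence filter is a common way to boost confidence that a statistical association is biologically real.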

My 23andMe profile gives a mixed AG at rs7590720 and a mixed GT at rs1344694, while I show a mixed CT at rs1614972, CT at rs13273672 and AG at rs11640875.  Boooring!  A middling heterozygote at all 5 alcohol preference/dependency loci.  Were these the loci for chocolate preference/dependency, I would be a full risk-bearing homozygote.

 


Read Full Post »

My iPod with video
Image by juanpol via Flickr

It was a great pleasure to speak with Professor Garet Lahvis from the Department of Behavioral Neuroscience at the Oregon Health and Science University and learn more about how the biology of empathy and social behaviors in general can be approached with animal models that are suitable for genetic studies.  The podcast is HERE, and the post on his lab’s recent paper, “Empathy Is Moderated by Genetic Background in Mice,” is HERE.  Thank you again, Dr. Lahvis!


Read Full Post »

According to Joseph LeDoux, “One of the most important contributions of modern neuroscience has been to show that the nature/nurture debate operates around a false dichotomy: the assumption that biology, on one hand, and lived experience, on the other, affect us in fundamentally different ways” (ref).  Indeed.  While I know not where the current debate stands, I’d like to point to a fantastic example of just how inextricably linked the genome is to the environment.  In their recent paper, “A Biological Function for the Neuronal Activity-Dependent Component of Bdnf Transcription in the Development of Cortical Inhibition” [doi:10.1016/j.neuron.2008.09.024]  Hong et al., ask what happens when you take away the ability of a given gene to respond to the environment.  This is not a traditional “knockout” experiment – where the gene is inactivated from the moment of conception onwards – but rather a much more subtle type of experimental manipulation.  What happens when you prevent nurture from exerting an effect on gene expression?

The team focused on the BDNF gene, whose transcription can be initiated from any one of eight promoter sites (I–VIII).  These sites vary in activity depending on the phase of development and/or the tissue or type of cell – all of which makes for a complex set of instructions able to turn the BDNF gene on and off in precise developmental and/or tissue-specific ways.  In the case of promoter IV, it appears to be triggered in the cortex in response to the Ca++ release that occurs when neurons are firing – a phenomenon called “neuronal activity-dependent transcription” – a top example of how the environment can influence gene expression.  Seeing as how BDNF promoter IV is important for this type of environment-induced gene expression, the team asked what happens when you disable this particular promoter?

To do this, the team constructed mice – living, breathing animals, mind you – that carry mutations in several of the calcium (Ca++) response elements in the promoter IV region.  They introduced point mutations so that the Ca++-sensitive protein CREB could not bind to the promoter and activate gene expression.  OK, so what happens?

Firstly, the team reports that the mutant mice are more or less indistinguishable from controls in appearance, gait, growth rate, brain size and can also reproduce and transmit the mutations.  WOW! Is that one strike AGAINST nurture? The team then shows that BDNF levels are more than 50% reduced in cultured neurons, but that levels of other immediate early genes are NOT affected (as expected).  In living animals, the effects were similar when they asked how much gene expression occurs in the sensory cortex when animals are exposed to light (after an extended period of darkness).  OK, so there are few effects, so far, other than lower levels of nurture-induced BDNF expression – hmmm. Looking more closely however, the team found that the mutant mice generated lower levels of inhibitory neuron activity – as measured by the firing of miniature inhibitory postsynaptic currents.  Follow-on results showed that the total number of inhibitory neurons (parvalbumin and NPY + GABAergic cells) was no different than controls and so it would seem that the activity dependence of BDNF is important for the maintenance of inhibitory synapses.

Hence, the team has found that what “nurture” does (via the BDNF promoter IV in this case) is to exert an effect on the connectivity of inhibitory neurons.  Wow, thanks mother nurture!  Although it may seem like an obscure role for something as important as THE environment, the team points out that the relative balance of excitation-to-inhibition (yin-yang as covered here for Rett syndrome) is crucial for proper cognitive development.

To explore the notion of inhibition/excitation balance further, check out this video lecture (TED link), where Michael Merzenich describes this imbalance as a “signal-to-noise” problem wherein some children’s brains are rather noisy (due to any number of genetic/environmental reasons – such as, perhaps, poorly maintained inhibitory connections).  This can make it harder to develop and function in life.  Perhaps someday, genetic/environment research like that of Hong and colleagues will inform the rehabilitative strategies developed by Merzenich.


Read Full Post »

Am having a wonderful time reading “Your Inner Fish” by Professor Neil Shubin – an exploration into the deep evolutionary roots of the human body.  Amazed to contemplate the embryonic structures known as the branchial arches, or gill arches – which we share with sharks! – and the role of the gcm2 gene that is expressed in these arches and controls salt balance in humans and fish.  Pharyngula has a wonderful post on this!!

Hoping to find more deep evolutionary roots of mind and brain.


Read Full Post »

Violinist marionette performs
Image by eugene via Flickr

The homunculus (argument) is a pesky problem in cognitive science – a little guy who might suddenly appear when you propose a mechanism for decision making, spontaneous action or forethought  etc. – and would take credit for the origination of the neural impulse.  While there are many mechanistic models of decision making that have slain the little bugger – by invoking competition between past experience and memory as the source of new thoughts and ideas – one must always tread lightly, I suppose, to be wary that cognitive mechanisms are based completely in neural properties devoid of a homuncular source.

Still, the human mind must begin somewhere.  After all, it’s just a ball of cells initially, and then a tube, and then some more folds, layers, neurogenesis and neural migration, etc., before maturing – miraculously – into a child that one day looks at you and says, “momma” or “dada”.  How do these neural networks come into being?  Who or what guides their development toward that unforgettable “momma (dada)” moment?  A somewhat homuncular “genetic program” – whose instructions we can attribute to millions of years of natural selection?  Did early hominid babies say “momma (dada)”?  Hmmm.  Seems like we might be placing a lot of faith in the so-called “instructions” provided by the genome, but who am I to quibble.

On the other hand, you might find that the recent paper by Akhtar et al., “Histone Deacetylases 1 and 2 Form a Developmental Switch That Controls Excitatory Synapse Maturation and Function” [doi:10.1523/jneurosci.0097-09.2009] may change the way you think about cognitive development.  The team explores the effects of two very important epigenetic regulators of gene expression – histone deacetylases 1 and 2 (HDAC1, HDAC2) – on the functionality of synapses in early developing mice and mature animals.  By epigenetic, I refer to the role of these genes in regulating chromatin structure rather than acting via direct, site-specific DNA binding.  The way the HDAC enzymes work is by de-acetylating histones – removing acetyl groups.  Acetyl groups neutralize the positive charge of histone lysine residues, weakening the histones’ electrostatic grip on the negatively charged phosphate backbone of DNA; so when histone proteins carry acetyl groups, they do NOT bind DNA tightly, and the DNA molecule is more open and exposed to the binding of transcription factors that activate gene expression.  Thus if one (as Akhtar et al. do) turns off a de-acetylating HDAC gene, the resulting animal has a genome that is more open and exposed to transcription factor binding and gene expression.  Less HDAC = more gene expression!

What were the effects on synaptic function?  To summarize, the team found that in early development (neonatal mouse hippocampal cells), cells where the HDAC1 or HDAC2 genes were turned off (either through pharmacologic blockers or via partial deletion of the gene(s) by lentiviral introduction of Cre recombinase) had more synapses and more synaptic electrical activity than did hippocampal cells from control animals.  Keep in mind that the HDACs are located in the nucleus of the neuron while the synapses are far, far away.  Amazingly – the synapses are under the control of an epigenetic regulator of gene expression; hence, ahem, “epigenetic puppetmasters”.  In adult cells, the knockdown of HDACs did not show the same effects on synapse formation and activity.  Rather, cells where HDAC2 was shut down showed less synapse formation and activity (HDAC1 knockdown had no effect).  Again, it is amazing to see effects on synaptic function regulated from such a distance.  Neat!

The authors suggest that the epigenetic regulatory system of HDAC1 and 2 can serve to regulate the overall levels of synapse formation during early cognitive development.  If I understand their comments in the discussion, this may be because you don’t necessarily want to have too many active synapses during the formation of a neural network.  Might such networks be prone to excitotoxic damage, or perhaps to being locked into inefficient circuits?  The authors note that HDACs interact with MeCP2, a gene associated with Rett Syndrome – a developmental disorder (in many ways similar to autism) in which the neural networks underlying cognitive development in children fail to progress to support higher, more flexible forms of cognition.  Surely the results of Akhtar et al. must be a key to understanding and treating these disorders.

Interestingly, here the controller of these developmental phenotypes is not a “genetic program” but rather an epigenetic one, whose effects are widespread across the genome and heavily influenced by the environment.  So no need for a homunculus here.


Read Full Post »

Kali
Image via Wikipedia

Joseph LeDoux‘s book, “Synaptic Self: How Our Brains Become Who We Are” opens with his recounting of an incidental glance at a t-shirt, “I don’t know, so maybe I’m not” (a play on Descartes’ cogito ergo sum) that prompted him to explore how our brain encodes memory and how that leads to our sense of self.  More vividly, Elizabeth Wurtzel, in “Prozac Nation” recounts, “Nothing in my life ever seemed to fade away or take its rightful place among the pantheon of experiences that constituted my eighteen years. It was all still with me, the storage space in my brain crammed with vivid memories, packed and piled like photographs and old dresses in my grandmother’s bureau. I wasn’t just the madwoman in the attic — I was the attic itself. The past was all over me, all under me, all inside me.” Both authors, like many others, have shared their personal reflections on the fact that – to put it far less eloquently than LeDoux and Wurtzel – “we” or “you” are encoded in your memories, which are “saved” in the form of synaptic connections that strengthen and weaken and morph through age and experience.  Furthermore, such synaptic connections, and the myriad biochemical machinery that constitutes them, are forever modulated by mood, motivation and your pharmacological concoction du jour.

Well, given that my “self” or “who I think of as myself” or ” who I’m aware of at the moment writing this blog post” … you get the neuro-philosophical dilemma here … hangs ever so tenuously on the biochemical function of a bunch of tiny little proteins that make up my synaptic connections – perhaps I should get to know these little buggers a bit better.

OK, how about a gene known as kalirin – which is named after the multiple-handed Hindu goddess Kali whose name, coincidentally, means “force of time (kala)” and is today considered the goddess of time and change (whoa, very fitting for a memory gene huh?).  The imaginative biochemists who dubbed kalirin recognized that the protein was multi-handed and able to interact with lots of other proteins.  In biochemical terms, kalirin is known as a “guanine nucleotide exchange factor” – basically, just a helper protein who helps to activate someone known as a Rho GTPase (by helping to exchange the spent GDP for a new, energy-laden GTP) who can then use the GTP to induce changes in neuronal shape through effects on the actin cytoskeleton.  Thus, kalirin, by performing its GDP-GTP exchange function, helps the actin cytoskeleton to grow.  The video below shows how the actin cytoskeleton grows and contracts – very dynamically – in dendrites that carry synaptic spines – whose connectivity is the very essence of “self”.  Indeed, there is a lot of continuing action at the level of the synapse and its connection to other synapses, and kalirin is just one of many proteins that work in this dynamic, ever-changing biochemical reaction that makes up our synaptic connections.
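To make the GEF mechanism concrete, here is a toy steady-state sketch of why speeding up GDP-to-GTP exchange leaves more of a Rho GTPase in its active, GTP-bound state. This is not from the paper; the two-state simplification and the rate constants are invented for illustration:

```python
# Toy steady-state model of a Rho GTPase activation cycle (illustration only;
# the rate constants are made up, not measured values for kalirin).
# A GEF like kalirin accelerates GDP release and exchange, so more of the
# GTPase pool ends up in the active, GTP-bound state.

def fraction_active(k_exchange, k_hydrolysis):
    """Steady-state fraction of GTPase in the active (GTP-bound) state.

    k_exchange:   rate of GDP -> GTP exchange (boosted by a GEF)
    k_hydrolysis: rate of GTP hydrolysis back to GDP (boosted by a GAP)
    """
    return k_exchange / (k_exchange + k_hydrolysis)

basal = fraction_active(k_exchange=0.1, k_hydrolysis=1.0)     # no GEF help
with_gef = fraction_active(k_exchange=2.0, k_hydrolysis=1.0)  # GEF speeds exchange

print(round(basal, 3), round(with_gef, 3))
```

In this toy model, raising the exchange rate from a tenth of the hydrolysis rate to twice it pushes the active fraction from roughly 9% to 67%, a cartoon of how a GEF like kalirin can tip the balance toward cytoskeletal growth.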

In their paper, “Kalirin regulates cortical spine morphogenesis and disease-related behavioral phenotypes” [doi: 10.1073/pnas.0904636106], Michael Cahill and colleagues put this biochemical model of kalirin to the test by examining a mouse whose version of kalirin has been inactivated.  Although the mice born with this inactivated form are able to live, eat and breed, they do have significantly less dense patterns of dendritic spines in layer V of the frontal cortex (not in the hippocampus, however, even though kalirin is expressed there).  Amazingly, the deficits in spine density could be rescued by subsequent over-expression of kalirin.  Hmm, perhaps a kalirin medication in the future?  Further behavioral analyses revealed deficits in memory that are dependent on the frontal cortex (working memory) but not the hippocampus (reference memory), which seems consistent with the synaptic spine density findings.

Lastly, the authors point out that human kalirin gene expression and variation have been associated with several neuro-psychiatric conditions such as schizophrenia, ADHD and Alzheimer’s Disease.   All of these disorders are particularly cruel in the way they can deprive a person of their own self-perception, self-identity and dignity.  It seems that kalirin is a goddess I plan on getting to know better.  I hope she treats “me” well in the years to come.


Read Full Post »

Lonely child
Image by kodomut via Flickr

For humans, there are few sights more heart-wrenching than an orphaned child (or any orphaned vertebrate for that matter).  Isolated, cold, unprotected, vulnerable – what could the cold, hard calculus of natural selection – “red in tooth and claw” – possibly have to offer these poor, vulnerable unfortunates?

So I wondered while reading, “Functional CRH variation increases stress-induced alcohol consumption in primates” [doi:10.1073/pnas.0902863106].  In this paper, the authors considered the role of a C-to-T change at position -248 in the promoter of the corticotropin releasing hormone (CRH or CRF) gene.  Its biochemical role was examined in two ways: nuclear extracts from hypothalamic cells demonstrated that the C-to-T nucleotide change disrupts protein-DNA binding, and transcriptional reporter assays showed that the T-allele drives higher levels of transcription after forskolin stimulation.  Presumably, the biochemical differences conferred by the T-allele can have a physiological role and alter the wider functionality of the hypothalamic-pituitary-adrenal (HPA) axis, in which the CRH gene plays a critical role.

The authors ask whether primates (rhesus macaques) who differ in genotype (CC vs. CT) show any differences in physiological stress reactivity – as predicted by differences in the activity of the CRH promoter.  As a stressor, the team used a form of brief separation stress and found that there were no differences in HPA function (assessed by ACTH and cortisol levels) in animals who were reared by their mothers.  However, when the stress paradigm was performed on animals who were reared without a mother (though with access to play with other age-matched macaques), there were significant differences in HPA function between the 2 genetic groups (T-alleles showing greater release of stress hormones).  Further behavioral assessments found that the peer-reared animals who carried the T-allele explored their environment less when socially separated as adults (again, no C vs. T differences in maternally reared animals).  In a separate assessment, the T-carriers showed a preference for sweetened alcohol vs. sweetened water in ad lib consumption.
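The logic of that result is a classic gene-by-environment interaction: the genotype effect appears only under the adverse rearing condition. Here is a minimal sketch, with invented hormone values standing in for the ACTH and cortisol measures:

```python
# Toy illustration of the gene x rearing interaction reported for CRH -248 C/T.
# The numbers are invented for illustration; units are arbitrary
# "stress hormone level" units, not values from the paper.
group_means = {
    ("mother-reared", "CC"): 10.0,
    ("mother-reared", "CT"): 10.5,   # ~no genotype effect with a mother present
    ("peer-reared",   "CC"): 12.0,
    ("peer-reared",   "CT"): 18.0,   # T-allele effect emerges under early-life stress
}

def genotype_effect(rearing):
    """CT minus CC difference in mean hormone level for one rearing condition."""
    return group_means[(rearing, "CT")] - group_means[(rearing, "CC")]

# The interaction is a difference of differences: a large positive value means
# the T-allele matters much more in peer-reared (motherless) animals.
interaction = genotype_effect("peer-reared") - genotype_effect("mother-reared")
print(interaction)
```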

One way of summarizing these findings could be to say that having no mother is a bad thing (more stress reactivity) and having the T-allele just makes it worse!  Another way could be to say that the T-allele enhances the self-protection behaviors (less exploration could be advantageous in the wild?) that arise from being orphaned.  Did mother nature (a.k.a. natural selection) provide the macaque with a boost of self-preservation (in the form of a T-allele that enhances emotional/behavioral inhibition)?  I’m not sure, but it will be fun to report on further explorations of this query.  Click here for an interview with the corresponding author, Dr. Christina Barr.

—p.s.—

The authors touch on previous studies (here and here) that explored natural selection on this gene in primates and point out that humans and macaques both have 2 major haplotype clades (which have perhaps been maintained in a yin-yang sort of fashion over the course of primate evolution) and that humans have a C-to-T change (rs28364015) which would correspond to position -201 in the macaque (position 68804715 on macaque chr. 8) and could be readily tested for similar functionality in humans.  In any case, the T-allele is rare in macaques, so it may be the case that few orphaned macaques ever endure the full T-allele experience.  In humans, the T-allele at rs28364015 seems more common.

Nevertheless, this is yet another – complicated – story of how genome variation is not destiny, but rather a potentiator of life experience – for better or worse.  Related posts on genes and early development (MAOA-here), (DAT-here), (RGS2-here), or just click the “development tag“.


Read Full Post »

English: Visualization of a DTI measurement of...
Image via Wikipedia

Within the genetic news flow, there is often, and rightly so, much celebration when a gene for a disease is identified.  This is indeed an important first step, but often the slogging from that point to a treatment – and the many small breakthroughs along the way – can go unnoticed.  One reason why these 2nd (3rd, 4th, 5th …) steps are so difficult is that, in some cases, folks who carry “the gene” variant for a particular disorder do not, in fact, display symptoms of the disorder.

Huh? One can carry the risk variant – or many risk variants – and not show any signs of illness?  Yes, this is an example of what geneticists refer to as variable penetrance: the notion of carrying a mutation but not outwardly displaying the mutant phenotype.  This is one of the main reasons why genes are not deterministic, but much more probabilistic, in their influence on human development.

Of course, in the brain, such complexities exist, perhaps even more so.  For example, take the neurological condition known as dystonia, a movement disorder that, according to the Dystonia Medical Research Foundation, “causes the muscles to contract and spasm involuntarily. The neurological mechanism that makes muscles relax when they are not in use does not function properly. Opposing muscles often contract simultaneously as if they are “competing” for control of a body part. The involuntary muscle contractions force the body into repetitive and often twisting movements as well as awkward, irregular postures.”  Presently there are more than a dozen genes and/or chromosomal loci that are associated with dystonia – two of the major genes, DYT1 and DYT6, having been identified as factors in early-onset forms of dystonia.  Now, as we enter the era of personal genomes, an individual can assess their (own, child’s, preimplantation embryo’s!) genetic risk for such rare genetic variants – whose effects may not be visible until age 12 or older.  In the case of DYT1, a rare mutation (a GAG deletion at position 946, which causes the loss of a glutamate residue in the torsin A protein) gives rise to dystonia in about 30-40% of carriers.  So, how might these genes work, and why do some individuals develop dystonia while others do not?  Indeed, these are the complexities that await in the great expanse between gene identification and treatment.
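That 30-40% figure invites some quick back-of-the-envelope risk arithmetic. The sketch below assumes a dominantly acting mutation and takes the 35% midpoint as the penetrance; both are illustrative simplifications, not clinical figures:

```python
# Back-of-envelope risk arithmetic for a variably penetrant mutation.
# The 30-40% penetrance range comes from the text above; the dominant-acting
# assumption and the 35% midpoint are illustrative simplifications only.
penetrance = 0.35          # midpoint of the quoted 30-40% range
transmission = 0.5         # chance a heterozygous carrier parent passes the allele on

# Risk that a carrier's child both inherits the allele AND manifests dystonia:
offspring_risk = transmission * penetrance

# Fraction of carriers who never manifest (the "variable penetrance" puzzle):
carrier_unaffected = 1 - penetrance

print(round(offspring_risk, 3), round(carrier_unaffected, 2))
```

So even with a 50% transmission chance, the prior risk to a carrier's child of actually manifesting dystonia comes out under 20% in this toy calculation, and roughly two thirds of carriers themselves never manifest.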

An inspection of the molecular aspects of torsin A (DYT1) shows that it is a member of the AAA family of adenosine triphosphatases and is related to the Clp protease/heat shock family of genes that help to properly fold poly-peptide chains as they are secreted from the endoplasmic reticulum of the cell – a sort-of handyman, general-purpose gene (expressed in almost every tissue in the body) that sits on an assembly line and hammers away to help make sure that proteins have the right shape as they come off their assembly lines.  Not much of a clue for dystonia – hmm.  Similarly, the THAP domain containing, apoptosis associated protein 1 (THAP1) gene (a.k.a. DYT6) is also expressed widely in the body and seems to function as a DNA binding protein that regulates aspects of cell cycle progression and apoptosis.  Also not much of an obvious clue to dystonia – hmm, hmm.  Perhaps you can now see why the identification of “the gene” – something worth celebrating – can just leave you aghast at how much more you don’t know.

That these genes influence an early developmental form of the disorder suggests a possible developmental role for these rather generic cogs in the cellular machinery.  But where? how? & why an effect in some folks and not others?  To these questions comes an amazing analysis of DYT1 and DYT6 carriers in the article entitled, “Cerebellothalamocortical Connectivity Regulates Penetrance in Dystonia” by Argyelan and colleagues [doi: 10.1523/JNEUROSCI.2300-09.2009]. In this article, the research team uses a method called diffusion tensor imaging (sensitive to white matter integrity) to examine brain structure and function among individuals who carry the mutations but either DO or DO NOT manifest the symptoms. By looking at white matter tracts (super highways of neural traffic) throughout the brain, the team was able to ask whether some tracts were different in the 2 groups (as well as in a group of unaffected non-carriers).  In this way, the team can begin to better understand the causal pathway between these run-of-the-mill genes (torsin A and thap1) and the complex pattern of muscle spasms that arise from their mutations.

To get right to the findings, the team discovered that in one particular tract – a superhighway known as the “cerebellar outflow pathway in the white matter of lobule VI, adjacent to the dentate nucleus” (not as quaint as Route 66) – those participants who DO manifest dystonia had less tract integrity and connectivity compared to those who DO NOT manifest and to healthy controls (who have the most connectivity there).  Subsequent measures of resting-state blood flow confirmed that the disruptions in white matter tracts were correlated with cerebellar outflow to the thalamus and – more importantly – with activity in areas of the motor cortex.  The correlations were such that individuals who DO manifest dystonia had greater activity in the motor cortex (this is what dystonia really comes down to — too much activity in the motor cortex).
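In sketch form, the group comparison behind this finding looks something like the following (the fractional anisotropy values are invented placeholders, not the study's data):

```python
# Sketch of the kind of three-group comparison behind the DTI finding:
# fractional anisotropy (FA, a common DTI tract-integrity measure) in the
# cerebellar outflow tract. All FA values below are invented placeholders.
from statistics import mean

fa = {
    "manifesting":     [0.38, 0.40, 0.36, 0.39],  # carriers WITH dystonia
    "non-manifesting": [0.44, 0.46, 0.45, 0.43],  # carriers WITHOUT dystonia
    "controls":        [0.49, 0.50, 0.48, 0.51],  # non-carriers
}

means = {group: mean(values) for group, values in fa.items()}

# The reported ordering of tract integrity: controls > non-manifesting > manifesting
assert means["controls"] > means["non-manifesting"] > means["manifesting"]
print(means)
```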

Thus the team was able to query gene carriers using their imaging methods and zero in on “where in the brain” these generic proteins exert a detrimental effect.  This seems to me to be a huge step forward in understanding how a run-of-the-mill gene can alter brain function in such a profound way.  Now that they’ve found the likely circuit (is it the white matter per se or the neurons?), more focus can be applied to how this circuit develops – and how it can be repaired.


Read Full Post »

1/365 [dazed & confused]
Image by PhotoJonny via Flickr

pointer to: Daniel MacArthur and Neil Walker’s (@ Genetic Future blog) in-depth coverage of various critiques of the recent back-to-back-to-back Nature magazine trifecta (covered here) on GWAS results for schizophrenia.  Rough going for the global consortia and a major f**king bummer for folks like myself who have been hoping that these vast studies would provide a solid basis for genome-based cognitive intervention strategies in the future.  Some of the discussion in the comments section points to the weakness in the diagnostic criteria, which is a topic also covered here recently.

Perhaps there is hope in the brain systems / imaging-based approaches that are taking off as genome technology spreads into cognitive and imaging science. Tough to scan tens of thousands of people, however. Double F**K!

I guess DSM-based psychiatric genetics is just about dead for the time being.  The announcement of the soon-to-shutter deCODE Genetics and its 5-year stock price captures the failure of this endeavor.

[chart: deCODE Genetics 5-year stock price]


Read Full Post »

Backyard trampoline
Image by Kevin Steele via Flickr

For more than a decade, we’ve known that at least 95% of the human genome is junk – or junque – if you’re offended by the thought that “you” emerged from a single cell whose genome is mostly a vast pile of crap – or crappe – if you insist.  Hmmm, what is this crap?  It turns out to be a lot of random repeating sequences and a massive collection of evolutionary artifacts left over from the evolution of earlier genomes – mainly bits of retroviruses that once inserted themselves irreversibly into our ancestors’ genomes.  One subset of this type of – can we upgrade it from crappe to “relic” now? – is something we’ve labelled “autonomously mobile DNA sequences” or, more specifically, “long interspersed nuclear elements (LINEs or L1s)”.  This class of DNA relic comprises more than 15% of the human genome (that’s about 3-5x more than the relevant genomic sequence from which you emerge) and retains the ability to pick itself up out of the genome – via an RNA intermediate – and insert itself into new places in the genome.  This has been observed to happen in the germ line of humans, and a few L1 insertions are even responsible for genetic forms of human disease (for example, in the factor VIII gene, giving rise to haemophilia).  The mechanism of transposition – or “jumping”, as these elements are sometimes called “jumping genes” – involves the assembly of a certain type of transcriptional, transport and reverse-transcription (RNA back to DNA) apparatus that is known to be available in stem cells, but hardly ever in somatic cells.

Except, it would seem, for the brain – which, as we’ve covered here before, keeps its precious neurons and glia functioning under separate rules.  Let’s face it: if a liver cell dies, you just replace it without notice, but if neurons die, so do your childhood memories.  So it’s not too surprising, perhaps, that brain cells have special ‘stem-cell-like’ rules for keeping themselves youthful.  This seems to be borne out again in a paper entitled, “L1 retrotransposition in human neural progenitor cells” by Coufal et al., [doi:10.1038/nature08248].  Here the team shows that L1 elements are able to transpose themselves in neural stem cells and that there are more L1 elements (about 80 copies more per cell) in the hippocampus than in liver or heart cells.  So apparently the hippocampus, which does seem to contain a niche of stem cells, permits the transposition or “jumping” of L1 elements in a way that the liver and heart do not.  Sounds like a fun place to be a gene!
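As a rough illustration of how such a tissue-by-tissue copy-number comparison can be made, here is a toy delta-Ct style qPCR calculation. The Ct values are invented, and the real study's assay design surely differs:

```python
# Toy delta-Ct calculation of the sort used to compare element copy number
# between tissues by quantitative PCR. All Ct values below are invented;
# the actual study's method and numbers differ (the text above reports
# roughly 80 extra L1 copies per hippocampal cell).

def relative_quantity(ct_target, ct_reference):
    """Amount of target relative to a reference gene, assuming perfect
    doubling per PCR cycle: 2 ** -(Ct_target - Ct_reference)."""
    return 2 ** -(ct_target - ct_reference)

# Lower Ct = more starting template (the signal crosses threshold earlier).
hippocampus = relative_quantity(ct_target=20.0, ct_reference=26.0)
liver = relative_quantity(ct_target=21.0, ct_reference=26.0)

fold_difference = hippocampus / liver
print(fold_difference)  # with these toy Cts: twice the L1 content in hippocampus
```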


Read Full Post »

Antigen presentation stimulates T cells to bec...
Image via Wikipedia

It’s not often that Nature magazine publishes a triple-back-to-back-to-back, so take note if you’re interested in the genetics of mental illness. The 3 papers – [doi:10.1038/nature08185] involving 3,322 individuals with schizophrenia and 3,587 controls, [doi:10.1038/nature08186] 4,999 cases and 15,555 controls and [doi:10.1038/nature08192] 8,008 cases and 19,077 controls – are as massive and powerful as any genome-wide effort to date.  The results?  Overall, a common result showing linkage to the major histocompatibility complex or so-called ‘MHC genes’ located on chromosome 6.  What do these genes do? And what’s the relevance to mental illness?

Here’s a quickie immunology primer on the biological function of the major histocompatibility genes.  They encode proteins whose molecular function is to display short peptides on the surface of aptly named antigen presenting cells in the immune system (think of your hand as an MHC protein holding onto an apple (the short peptide) and holding it out, or presenting it, to someone (a helper T-cell)).  This act of “presentation” is done so that the helper T-cells can determine whether such peptides are “self” or “non-self”.  If such displayed peptides are non-self (such as from a virus, endotoxin or bacterium), then the helper T-cells will sound the alarm and initiate a T- or B-cell based immune response aimed specifically at the offending invader.  The movies below show the MHC proteins in their place, displaying antigen peptides on the cell surface for binding with a helper T-cell.
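The presentation logic described above can be caricatured in a few lines of code: a deliberately naive sketch in which a helper T-cell simply checks a displayed peptide against a fixed "self" set. The peptide strings are arbitrary stand-ins, and real self/non-self discrimination involves thymic selection, binding affinities and much more:

```python
# Cartoon of the "presentation" logic described above. An antigen-presenting
# cell displays a peptide; the helper T-cell flags anything outside its
# learned "self" repertoire. Purely illustrative - the peptide strings are
# arbitrary stand-ins, not a real self-repertoire.
SELF_PEPTIDES = {"AAGLQDMTK", "GILQNPRST"}  # invented "self" set

def helper_t_cell_response(displayed_peptide):
    """Return True (sound the alarm) if the displayed peptide looks foreign."""
    return displayed_peptide not in SELF_PEPTIDES

print(helper_t_cell_response("AAGLQDMTK"))  # self -> no alarm
print(helper_t_cell_response("VIRALPEPT"))  # foreign -> alarm, immune response
```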


So, what does this have to do with mental illness? Although there are other non-immunological genes interspersed among the MHC genes, there is good reason to begin to explore the role of external infection and early development.  The authors of one paper note that,  “Schizophrenia patients are more likely, compared to the general population, to have been born in the winter or the spring. Although infections such as influenza and measles have been proposed as a possible mechanism for this distortion, a clear association between infectious agents and schizophrenia has not been demonstrated.”

The more we know, the more we don’t know.  Hopefully more early environment data will be analyzed.


Read Full Post »

labyrinthine circuit board lines
Image by quapan via Flickr

Amidst a steady flow of upbeat research news in the behavioral-genetics literature, there are many inconvenient, uncomfortable, party-pooping sentiments that are more often left unspoken.  I mean, it’s a big jump – from gene to behavior – and just too easy to spoil the mood by reminding your colleagues that, “well, everything is connected to everything” or “that gene association holds only for that particular task“.  Such may often have been the case in the past decade, when the so-called imaging-genetics literature emerged to parse out a role for genetic variation in the structure and functional activation of the brain using various neuroimaging methods.  Sure, the 5HTT-LPR was associated with amygdala activation during a face matching task, but what about other tasks (and imaging modalities) and other brain regions that express this gene?  How could anyone (let alone NIMH) make sense out of all of those – not to mention the hundreds of other candidate genes poised for imaging-genetic research?

With this in mind, it is a pleasure to meet the spoiler-of-spoilers!  Here is a research article that examines a few candidate genetic polymorphisms and compares their findings across multiple imaging modalities.  In his article, “Neural Connectivity as an Intermediate Phenotype: Brain Networks Under Genetic Control” [doi: 10.1002/hbm.20639], Andreas Meyer-Lindenberg examines the DARPP32, 5HTT and MAOA genes and asks whether their associations with aspects of brain structure/function are in any way consistent across different neuroimaging modalities.  Amazingly, the answer seems to be yes.

For example, he finds that DARPP32 variation is consistently associated with the striatum and with prefrontal-striatal connectivity – even as the data were collected using voxel-based morphometry, fMRI in separate tasks, and an analysis of functional connectivity.  Similarly, both the 5HTT and MAOA gene promoter repeats showed consistent findings within a medial prefrontal-amygdala circuit across these various modalities.

This type of finding – if it holds up to the spoilers & party poopers – could radically simplify the understanding of how genes influence cognitive function and behavior.  As suggested by Meyer-Lindenberg, “features of connectivity often better account for behavioral effects of genetic variation than regional parameters of activation or structure.”  He suggests that dynamic causal modeling of resting state brain function may be a powerful approach to understand the role of a gene in a rather global, brain-wide sort of way.  I hope so and will be following this cross-cutting “connectivity” approach in much more detail!
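One simple way to quantify the cross-modality consistency at issue here would be to rank-correlate a gene's regional effect sizes across two imaging modalities. This is a toy sketch; the regions, effect sizes and the choice of Spearman correlation are illustrative assumptions, not the method of the article under review:

```python
# Toy sketch: quantify "consistency across modalities" by rank-correlating a
# variant's regional effect sizes measured with two different imaging methods.
# All effect sizes below are invented for illustration.

def spearman(xs, ys):
    """Spearman rank correlation via Pearson on ranks (assumes no ties)."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Effect of a hypothetical variant in 5 brain regions, by two modalities:
vbm_effects = [0.8, 0.1, 0.5, 0.9, 0.2]   # e.g. voxel-based morphometry
fmri_effects = [0.7, 0.2, 0.4, 0.8, 0.1]  # e.g. task fMRI activation

print(spearman(vbm_effects, fmri_effects))
```

A rank correlation near 1 would say the variant "hits" the same circuit no matter how you image it, which is the kind of convergence Meyer-Lindenberg reports.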


Read Full Post »

Human chromosome 15
Image via Wikipedia

One way to organize the great and growing body of research into autism is via a sort-of ‘top-down’ vs. ‘bottom-up’ perspective.  From the ‘top-down’, one can read observational research that carefully catalogs the many & varied social and cognitive attributes that are associated with autism.  Oftentimes, these behavioral studies are coupled with neurochemical or neuroimaging studies that test whether variation in such biomarkers is correlated with aspects of autism.  In this manner, the research aims to dig down into the physiology and biochemistry of the developing brain to find out what is different and what differences might predict the onset of autistic traits.  At the deepest biological level – the bedrock, so to speak – are a number of genetic variations that have been correlated with autism.  These genetic variants permit another research strategy – a ‘bottom-up’ strategy that allows investigators to ask, “what goes wrong when we manipulate this genetic variant?”  While proponents of each strategy are painfully aware of the limitations of their own strategy – oft on the barbed-end of commentary from the other side – it is especially exciting when the ‘top-down’ and ‘bottom-up’ methods find themselves meeting in agreement in the middle.

So is the case with Nakatani et al., “Abnormal Behavior in a Chromosome-Engineered Mouse Model for Human 15q11-13 Duplication Seen in Autism” [doi: 10.1016/j.cell.2009.04.024], who created a mouse that carries a 6.3 megabase duplication of a region that luckily happens to be remarkably conserved in terms of gene identity and order with the 15q11-13 region in humans – a region that, when duplicated, is found in about 5% of cases of autism.  [click here for maps of mouse-human synteny/homology on human chr15]  Thus the team was able to engineer mice with the duplication and ask, “what goes wrong?” and “does it resemble autism in any kind of meaningful way (after all, these are mice we’re dealing with)?”

Well, the results are rather astounding to me.  Most amazing is the expression of a small nucleolar RNA (snoRNA) – SNORD115 (mouse MBII-52) – which functions in the nucleolus of the cell and plays a role in the alternative splicing of exon Vb of the 5HT2C receptor.  The team then found that the editing of 5HTR2C was altered in the duplication mice and also that Ca++ signalling was increased when the 5HTR2C receptors were stimulated in the duplication mice (compared to controls).  Thus, a role for altered serotonin function – which has been a longstanding finding in the ‘top-down’ approach – was met midway and affirmed by this ‘bottom-up’ approach!  Also included in the paper are descriptions of the aberrant social behaviors of the mice via a 3-chambered social interaction test, where duplication mice were rather indifferent to a stranger mouse (wild-type mice often will hang out with each other).

Amazing stuff!

Another twist to the story is the way in which the 15q11-13 region displays a phenomenon known as genomic imprinting, whereby only the mother’s or the father’s portion of the chromosome is expressed.  For example, the authors show that the mouse duplication is ‘maternally imprinted’, meaning that pups do not express the copy of the duplication that comes from the mother (its expression is shut down via epigenetic mechanisms that involve – wait for it – snoRNAs!), so the effects that they report are only from mice who obtained the duplication from their fathers.  So, if you by chance were wondering why it’s so tough to sort out the genetic basis of autism – here’s one reason why.  On top of this, the 5HTR2C gene is located on the X-chromosome, which complicates the story even more in terms of sorting out the inheritance of the disorder.
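The parent-of-origin bookkeeping can be easier to follow as a few lines of code. This toy function (entirely an illustration) encodes the convention that an "imprinted" parental copy is the silenced one:

```python
# Toy model of the parent-of-origin logic described above. By convention,
# "maternally imprinted" means the maternally inherited copy is silenced,
# so only the paternal copy is expressed (and vice versa). Illustration only.

def expressed_copies(maternal_has_dup, paternal_has_dup, imprinting):
    """Return which inherited duplication(s) are actually expressed."""
    expressed = []
    if maternal_has_dup and imprinting != "maternally imprinted":
        expressed.append("maternal")
    if paternal_has_dup and imprinting != "paternally imprinted":
        expressed.append("paternal")
    return expressed

# Duplication inherited from the mother at a maternally imprinted locus: silent.
print(expressed_copies(True, False, "maternally imprinted"))
# The same duplication inherited from the father: expressed, hence a phenotype.
print(expressed_copies(False, True, "maternally imprinted"))
```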

Further weird & wild is the fact that the UBE3A gene (paternally imprinted), the genetic cause of Angelman Syndrome, sits in this region – as does the SNRPN gene (maternally imprinted), which encodes a protein that influences alternative RNA splicing and also gives rise to Prader-Willi syndrome.  Thus, this tiny region of the genome, which carries so-called “small” RNAs, can influence a multitude of developmental disabilities.  Certainly, a region of the genome that merits further study!!


Read Full Post »