This essay by STEVE DRURY reviews recent research in genetics and psychology, and argues that it can deepen our understanding of the forms of alienation experienced by humans through the succession of social systems in which they have lived since Palaeolithic times
What people are now is very different from what they were before about 10,000 years ago, when agriculture emerged. Before then, humanity lived more as part of the natural world, taking only what was necessary by gathering easily-had foodstuffs and increasing their protein intake by hunting. There is little, if any, evidence suggesting that they stored produce, but plenty to show that people were continually on the move – on a continent-wide, even global scale.
This simple truism – that people are very different – masks huge social changes: the tendency to remain in fixed locations and growing communities following the so-called “agricultural revolution” of the late Stone Age; storing grain surpluses; and having living protein supplies tamed and on the hoof. Central to how people have changed is the expropriation of surplus produce that inexorably led to class society and to what Marx saw as the downward spiral of “the isolated individual in civil society”, through the alienation inherent in such expropriation and growing control of the majority by minority groups, both economically and culturally.
Here I hope to explore evidence for the relationship between these changes and the inner world of individuals. That inner world has interacted with the world at large through these changes in society. The study of psychological characteristics that set an increasing number of humans apart from the rest can throw light on the way that all humans live. The discussion centres on recent discoveries in the fields of genetics and psychology, and the links between them. They appear to have uncovered wholly unexpected natural factors at work that may revolutionise our general view of people who experience deep inner turmoil.
Madness: its treatment and mistreatment
When I, and quite possibly the majority of people, encounter others who speak in what seems to be a very irrational fashion, or behave in ways that are extremely unusual, my reaction is likely to go through a gamut of alarm, sympathy, annoyance and other emotions, depending on their actual behaviour. If I do not know them personally, I will most likely beat a hasty retreat: they are “not right in the head” or “under the influence”. What are considered today as mental illnesses, behavioural disorders, extremes of mood, personality defects, or simply and crudely madness, have hitherto been the subject of what often amounts to treatment by “trial and error”, or worse.
Diagnosis of such conditions is largely in the hands of psychologists and psychiatrists. More often than not the process of deciding on a “label” for unusual psychological conditions excludes those who show signs of such conditions and their carers. In fact there is a sizeable and continually updated technical tome, the notorious Diagnostic and Statistical Manual of Mental Disorders (DSM), used by many psychiatrists to pigeon-hole “mad” people in one or more of a large range of categories.[1] Treatment is mainly by “prescriptions” of drugs, procedures, therapies and interventions. (A well-known exception to such “professionalism” was R.D. Laing, who challenged the general outlook in psychiatry that considered mental illness as a biological phenomenon without regard for social, intellectual and cultural dimensions.)
The conditions have a common feature: ways of seeing one’s social surroundings that differ from what is regarded, by most psychologists and psychiatrists who adhere to the “official” line, as a “normal view”. Considering that the dominant features of social relationships in the era of globalised capital are various aspects of alienation, the concept of “normality” seems distinctly debatable.[2] Consequently, the “mind sciences” largely remain in the embarrassing position of being able to describe but not to explain.
Early in the twenty-first century it is a brave psychologist or psychiatrist who would claim that “cures” are in sight: there certainly are none even guessed at for autism. “Cures” would inevitably be defined as “making patients normal”, thereby avoiding real issues through circularity. Locked into that is the risk of driving people to be what others want them to be, madness emerging from both going along with such coercion and at the same time resisting it.
Historically, there has been a move from summary execution for showing “signs of witchcraft or demonic possession” (although recent cases of just that have hit the headlines), through disposing of individuals in madhouses,[3] then asylums – perhaps with some given licence to beg on the streets – and a slow development of means of treatment rather than physical restraint. Among these has been lobotomy, for which its inventor António Egas Moniz shared the Nobel Prize for Physiology or Medicine in 1949, sentencing tens of thousands to a subdued, almost vegetable-like existence. Following the inevitable discrediting of that procedure, much the same control was achieved through powerful mood-stabilising drugs, such as lithium – still widely used – and Largactil (chlorpromazine, sold in the US as Thorazine), the “liquid cosh”.
Conditions such as depression and bipolarity – less alarming than those regarded as needing legalised confinement, although they can presage breakdown, which does – became the focus of various schools of non-invasive psychoanalysis and behavioural therapy, initially among sections of society that could afford it. Once it became clear that some disorders were associated with a variety of brain chemicals that govern nerve linkages to sites associated with pleasure and the regulation of mood, appetite and sleep – neurotransmitters, the most familiar being serotonin and dopamine – the road seemed open to chemical cures, at least for lesser manifestations. In fact, the road was towards vast profits for the pharmaceutical companies, who could invent means of chemical intervention. By the time the patent for the anti-depression drug Prozac (the chemical fluoxetine) expired in 2001, its sales had accounted for US$1 billion in profits for its manufacturer Eli Lilly over the previous year alone. For a sizeable number of those taking it, Prozac brought unexpected and wholly unwelcome side-effects, including an increase in the risk of suicide in young people.
In North America and Europe at the start of the twenty-first century, the gamut of mentally-linked conditions has reached epidemic proportions: at any one time about 4% of the population of rich countries is thought to be debilitated by them. Surveys in the US suggest that almost half the population will at some stage in their lives seek medical treatment for one debilitating psychological condition or another.
According to the US National Alliance on Mental Illness (NAMI), one in 17 Americans is currently considered to have a serious psychiatric condition, the proportion rising to one in six for people with incomes below the poverty line. Of those unable to afford insurance cover, only 32% receive any treatment and less than 12% get adequate help. It is not only income, or rather a relative deficit of it, that determines affliction: more and more young people are diagnosed, and suicides – 90% of which are attributable to “mental illness” – are among the top three causes of death in the US for under-25s.[4] The statistics suggest that the broad spectrum of disorders has as much a social as a genetic or pathological source.
The role of genes
My own reactions and culturally ingrained prejudices concerning “madness” are now, thankfully, in a state of flux for a good reason: as well as the clear correlation with social factors, those whom we label as “mad” may well have been viewed very differently earlier in human evolution. This possibility is raised both by advances in psychological study and by the technology that allows the genetic make-up of individuals to be decoded quickly, accurately and at progressively lower cost.
Some conditions, such as the wide spectrum of bipolar disorders and autism, are now known to be in part inherited. Yet links through family trees are no longer seen as being directly caused solely by “depression-”, “bipolar-” or “paranoia genes”, etc. Instead, what is emerging from genetic and neurological research is a complex interplay between genetics, the action of chemicals such as neurotransmitters in the brain, environmental factors and early development – not “nature versus nurture”, but processes at the level of the cell, the entire organism, its whole life and the conditions of that life.[5]
The powerful social stigma long associated with anything that smacked of “madness”, or even mood instability, poses a question for researchers, among whom support is growing for the idea of a genetic connection interacting with a variety of environmental factors: if such people are “bad and dangerous to know”, as well as being “mad”, how come there are so many of them? Put another way: why did natural selection not stamp out genes that confer, at least in part, the disadvantages faced by those afflicted? Had such social stigma and disadvantage existed throughout human evolution, it would seem likely that natural selection would have eliminated any genes responsible for mental illness and it would be a rarity.[6] A few examples reaching the popular scientific press hint at what could be a surprising answer.
Penny Spikins of the Department of Archaeology at the University of York, UK, suggests that an explanation may lie with social mechanisms for accepting and integrating “different minds” very early in the prehistory of fully modern humans, when what we might consider very odd behaviour and ideas carried important benefits.[7] Spikins examines the evidence for autism in the cultural record, including what may have been the work of autistic “savants” with extraordinary drawing abilities.
Stone age art, autism and trance
The cave art of south-west Europe, and even small engravings on bone or ivory of about the same age (roughly 35,000 to 15,000 years ago), seem astonishingly realistic, ranking with works by skilled painters of the modern era. Ancient cave art is generally found in the deepest recesses of cavern systems, and must have been made under very poor lighting conditions, which further enhances our awe of its creators. But equally amazing are drawings made by “savants” who are blessed with an ability to recall and sketch with unerring accuracy scenes that they may have witnessed only briefly. Spikins notes the close resemblances in style between ancient art and that of savants – the latter often work very quickly, and the cave artists left signs that they too were able to draw quickly. Other savants are astonishingly skilful at musical mimicry: such a skill could once have been valued for something as everyday as calling birds and mammals to increase hunting efficiency.
The “attention to detail, exceptional memory, a thirst for knowledge and narrow obsessive focus” (Spikins, see note 7) linked with less extreme autistic conditions, such as Asperger’s, would also have had an important social value, for instance by drawing creative insight from patterns in the natural and social worlds that would have been overlooked by most people. It is increasingly recognised that many major cultural advances in recorded history stem from people who were not “ordinary”.
As well as such material aspects, “odd” behaviour may contribute to the social binding effect of shared beliefs. Shamans among recent and a few still-active hunter-gatherer groups help other members to make sense of their world through myths and ritual, using spiritual means to embed vital information in everyone’s memories. Documented shamanism provides solid evidence of mood anomalies among shamans: hearing voices; entering trances; experiencing vivid abstract visions – and also displaying astonishing powers of mimicry. Shamans among the San of Botswana and Namibia almost literally “become” the regular prey animals of their hunters, such as the eland, a large antelope.
As well as the exquisite realism of some cave art, there are innumerable examples of apparently meaningless shapes, such as dots and concentric circles, patterns of parallel and intersecting straight lines, what appear to be allegories, and astonishing abstractions, such as the “X-ray” pictures showing bones within living animals in ancient and contemporary Australian aboriginal art. They have parallels with illusions associated with what used to be termed “schizophrenia”, and with the visions of those in self-induced trances or epileptic fits. A rare but weird presence in ancient cave art is the half-human, half-animal figures known as theriomorphs: perhaps another link to shamanism and trance-illusions of shape-shifting.
Moodiness – it depends where you live
Studies of genetics in humans and other primates have revealed a gene (SERT) that governs the movement of serotonin – a neurotransmitter linked to variations in mood – and that appears to have a link with inherited disorders. The gene comes in two forms, “short” and “long”; since everyone carries two copies, one from each parent, the possible pairings are “long-long”, “long-short” and “short-short”. Apart from rhesus monkeys, no other primates carry the short variety. A “long-long” combination seems to confer some “immunity” from depression, whereas the “short-short” and “long-short” pairings have been thought to make people more prone to very low moods. But this relationship looks very different when viewed on a world scale.
Mapping the prevalence of “short” SERT reveals that its supposed link to depressive tendencies breaks down for the population of China, of whom about 80% are estimated to carry it. China has one of the lowest incidences of depression (~5%)[8] whereas the US population, with a mid-range proportion of “short” carriers, has the world’s highest proportion of people diagnosed with depression (~22%). Possibly this is an artifact of under- and over-diagnosis respectively, but another personality survey (unconnected with genetics) of 100,000 members of cultural groups across the world, made by Dutch anthropologist Geert Hofstede[9] in the 1970s, suggests a different perspective. Hofstede developed a questionnaire aimed at ranking people’s cultural outlook on a “collectivism-individualism” spectrum and his survey, among other features, suggests a dominance of collectivist outlooks among “third world” people, China ranking among the highest, whereas advanced economies are characterised by individualist outlooks, the US and Australia coming out top for those.[10]
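A note on what a “carrier” percentage like that means in genetic terms: because each person has two copies of the gene, the fraction of carriers follows from how common the “short” form is in the population. The sketch below is a minimal illustration of that arithmetic, assuming Hardy-Weinberg equilibrium; the allele frequencies in it are invented for the example, not figures from the studies cited.

```python
# Illustrative only: genotype proportions for a two-form gene such as "short"/"long" SERT
# under Hardy-Weinberg equilibrium. The allele frequencies used below are assumptions made
# for this sketch, not measured values from the studies discussed in the text.

def genotype_proportions(short_allele_freq):
    """Expected fractions of short-short, long-short and long-long pairings."""
    q = short_allele_freq      # frequency of the "short" form among all gene copies
    p = 1.0 - q                # frequency of the "long" form
    return {"short-short": q * q, "long-short": 2 * p * q, "long-long": p * p}

def carrier_fraction(short_allele_freq):
    """Fraction of people carrying at least one "short" copy."""
    props = genotype_proportions(short_allele_freq)
    return props["short-short"] + props["long-short"]

if __name__ == "__main__":
    for q in (0.40, 0.55, 0.70):   # assumed allele frequencies, for illustration only
        print(f"short-form frequency {q:.2f}: "
              f"{carrier_fraction(q):.0%} carry at least one short copy; "
              f"short-short homozygotes {genotype_proportions(q)['short-short']:.0%}")
```

With an assumed “short”-form frequency of about 0.55, roughly 80% of people would carry at least one copy, yet fewer than a third would be “short-short” – a reminder that carrier percentages and genotype pairings are not the same thing.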
Judged by the so-called “molecular clock” approach to dating the appearance of mutations, the “short” SERT variant seems to be recent – relative, that is, to the 5 to 7 million years since the common ancestor shared by humans and chimpanzees, our nearest genetic relatives. It appeared among our ancestors about 100,000 years ago, just before some humans are reckoned to have begun to diffuse from Africa to the rest of the world – a journey no doubt fraught with difficulty and privation. Yet a gene variant that now links with depression among the more economically favoured of their descendants grew to near-universality among those most distant from Africa, in east Asia, and does not seem to blight them with depression, perhaps because they still maintain a dominantly communal outlook. One interpretation is that it is less the gene than the outlook that matters in this case: alienation happens to exploit the depressive aspect of “short” SERT’s function – perhaps less efficient transport of serotonin – which does not much matter in a more cohesive, communal society.
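For readers unfamiliar with the “molecular clock”: the underlying idea is simply that mutations accumulate at a roughly constant average rate, so the amount of difference between two DNA sequences is proportional to the time since their lineages split. The sketch below illustrates only that basic arithmetic; the mutation rate and divergence figures in it are assumptions chosen to land in the human-chimpanzee range quoted above, and the actual dating of the “short” SERT variant rests on more elaborate population-genetic methods.

```python
# Minimal illustration of molecular-clock arithmetic: if mutations accumulate at a roughly
# constant average rate, the divergence between two DNA sequences is proportional to the
# time since their lineages split. Both the rate and the divergence below are assumed
# values for the sketch, not measurements.

MUTATION_RATE_PER_SITE_PER_YEAR = 0.5e-9   # assumed average rate (order of magnitude only)

def divergence_time_years(fraction_of_sites_differing):
    """Estimate the time since two lineages split.

    Mutations accumulate along both lineages, so divergence d is roughly 2 * rate * time,
    giving time of roughly d / (2 * rate).
    """
    return fraction_of_sites_differing / (2 * MUTATION_RATE_PER_SITE_PER_YEAR)

if __name__ == "__main__":
    # An assumed 0.6% sequence difference yields about six million years -
    # the same order as the human-chimpanzee split quoted in the text.
    print(f"{divergence_time_years(0.006):,.0f} years since divergence")
```

Run as written, the assumed 0.6% divergence and rate give a split time of about six million years, of the same order as the human-chimpanzee figure in the text.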
A problem for some, an advantage for others
The “pleasure-reward” neurotransmitter dopamine has a link with, among other conditions, attention-deficit hyperactivity disorder (ADHD): that condition is associated with people who carry a variant of a gene, DRD4-7R, that regulates the function of one of the dopamine receptors in the nervous system. In developed countries, people – especially children – who have this variant often manifest ADHD as a debilitating and disturbing condition. Yet DRD4-7R is found in 80% of native people in the Amazon basin of South America, who clearly thrived with it during their migration to, and colonisation of, this vast and biologically complex region. It was a journey made at the end of the last ice age via the whole of North America, the Bering Straits and Arctic north-east Asia, under extreme climatic conditions and through many ecosystems. The very high levels of energetic activity associated with ADHD may well have been an advantage in such conditions, outweighing the variant’s “down-side”.
Perhaps, as Spikins suggests, other genes associated with maladaptive behavioural traits, which we label “madness”, have a “Jekyll-Hyde” duality. Today’s overwhelmingly dominant form of society, ruled by capital, with its inherent drive to alienation, perhaps creates a one-sided pathology from the opposition of tendencies bound up with carriers of such genes. Through their now almost-vanished social relations, ancestors of such individuals may once have given society an intellectual diversity rich enough for it to adapt to all that the natural world might throw at it, turning the adversity of the huge and rapid environmental changes of the last 200,000 years to their societies’ advantage. That such potential still remains in the gene pool may serve humanity well in the face of future disruptions.
Another gene involved in brain development, neuregulin 1, is also attracting the attention of researchers who are questioning old assumptions about mental states. Neuregulin 1 has been linked to a slightly increased risk of schizophrenia, but it is not a simple connection. A mutation that changed one letter in the DNA sequence of the gene (in effect, one of the bases – adenine, cytosine, guanine and thymine – that make up DNA’s ACGT coding) decreases the amount of a protein made in the brain, thereby apparently enhancing tendencies to psychosis, poor memory and sensitivity to criticism. Half of all Europeans carry one copy of the mutation, and 15% carry two.
A remarkably large proportion of people capable of high creativity share that ability with a downside of mental “disorders” – Isaac Newton, Leonardo da Vinci, Vincent van Gogh and Albert Einstein are often cited as examples. Yet there are multitudes who suffer the “disorders” with little sign of intellectual or artistic talent. Studies at Semmelweis University in Budapest throw some light on whether or not the “mad artist” or “mad scientist” are myths.[11] Researcher Szabolcs Kéri advertised for volunteers who claimed high levels of creativity, and analysed their DNA in the region of neuregulin 1. They were given practical tests of creativity, results from which were combined with their actual creative achievements – a patent, or a book, for instance – to give a score.
The 12% of volunteers with two copies of the mutation achieved somewhat higher scores than those with one or no copies, and a single copy was in turn linked with significantly greater creativity than the unmutated neuregulin 1 gene. Another remarkable feature was that volunteers with two copies of the mutation were no more likely than others to exhibit paranoia, odd speech patterns or inappropriate emotions.
So, on the one hand, the gene mutation implicated in some disorders did correlate with self-admitted creativity, but on the other it showed no special link to signs of instability among the volunteers. It seems that, somehow, the mutation is able to bring out creativity in some carriers, but psychoses in others. Yet these volunteers showed higher measures of intelligence – better education and encouragement? – while other studies of families carrying the neuregulin 1 variant, and showing high incidence of schizophrenia, revealed lower intelligence – poor education and discouragement? Perhaps the contrast is between those more able to cope with and communicate creatively the feelings induced by their genetic proclivity, and those unable to express themselves: perhaps another clear hint of a social side to the dichotomy.
The romantic idea of madness and genius being two sides of the same coin seems just part of the story, and, for most sufferers, madness – it seems to me – is just that and that alone. Nonetheless, the neuregulin 1 mutation is so common as to suggest it hasn’t simply slipped under the radar of natural selection. What if there was no social stigma against its worse effects, and all the carriers were free to express what are now “off the wall” and largely unacceptable thoughts and feelings (unless they are exhibited wittily or aesthetically)?
Hidden potential and hidden risks: the role of environment
Other research that throws light on the possible background to the dichotomy of “affliction vs celebration” is explored in a recent article by David Dobbs.[12] He reports on research by psychologists at the Hebrew University of Jerusalem, who used simple experiments to probe the willingness of pre-school children to share treats. It turned out that those most likely to express that kind of generosity carried the DRD4-7R gene variant – the very variant also associated with ADHD, although ADHD was not necessarily evident among the children shown to be most willing to share. That is not the only discovery of a gene with a sort of polarity of expression. The “short” SERT gene variant discussed earlier – which affects the efficiency with which serotonin is transported – hints at a similar duality. Possessing it does not guarantee being prone to depression, it merely raises the risk, and people from different cultures are affected to dramatically different extents.
The issue of environment, in the broadest sense, is addressed by studies by a team led by Avshalom Caspi and Terrie Moffitt at King’s College London. They found that the depressive effect of the “short” SERT variant emerged only if subjects had suffered poor-quality childhoods or periods of great stress as adults.[13] This finding was similar to one from an earlier study by Caspi and Moffitt,[14] in which they showed that people who were abused as children were more likely to show violent sociopathic behaviour as adults if they carried a variant of a gene (MAOA) that lessens its activity in breaking down serotonin, dopamine and other neurotransmitters – potential factors in mood – concluding: “These findings may partly explain why not all victims of maltreatment grow up to victimize others, and they provide epidemiological evidence that genotypes can moderate children’s sensitivity to environmental insults.” Caspi and Moffitt’s work spurred similar studies by others, which further confirmed that some gene variants confer vulnerability to environmental stresses in their psychological expression.
Earlier work in the 1990s on seemingly unrelated physiological issues finds an echo in the “vulnerability gene” model that is used to explain some psychological conditions. It began with measurements of changes in heart rate, blood pressure and levels of the stress hormone cortisol in pre-school children from a range of home conditions, in response to environmental stress. The observations were compared with the children’s record of asthma attacks. Children with low reactions to stress showed rates of asthma close to the average for children in general. One outcome was the unexpected discovery that, among children who showed high reactions to stress, some were more prone to asthma than their less stress-sensitive peers whereas others were less prone. The less asthmatic had less stressful home environments than those with a greater tendency to asthma attacks. It seemed, and was confirmed in other studies, that some children – dubbed “dandelion children” – were not especially sensitive to their environment, whether “good” or “bad”, and fared about the same irrespective of their home conditions. Yet there was another group – “orchid children” – on whom home environment had a much more marked effect, taking their range of responses outside that of the “dandelion children”: their reactions ranged from extremely adverse to highly positive, across the spectrum from “worst” to “best” parenting respectively.[15]
Such findings led researchers to suspect not only a genetic difference between “dandelions” and “orchids”, but also that there were environmental effects on the function of any gene variants responsible, i.e. a “nature-nurture” dialectic affecting gene variants capable of a range of outcomes (said to show high “plasticity”). With the increased ease of partial DNA analysis for variants of genes known to influence neurotransmitters, such as SERT, DRD4, MAOA and others, it has become possible to test such a hypothesis. Jay Belsky of Birkbeck, University of London, and the University of California, Davis, led a 12-year study of more than 1,500 adolescents enrolled in a US health project, basing the research on five genes that regulate dopamine, on behavioural measures of the subjects such as their levels of self-control, and on data about the level of care from, and involvement by, their mothers.[16]
The results were interesting, to say the least. Girls in the study showed no effects of the variants with high plasticity as regards their levels of self-control – they showed “dandelion” tendencies. Boys, on the other hand, showed a remarkable confirmation of the “dandelion vs orchid” hypothesis for this behavioural measure: those with only one such variant, or none, showed more or less the same moderate self-control irrespective of the quality of maternal parenting; those carrying two or three variants clearly showed the effects of plasticity, with lower self-control than those with one or none under poor-quality parenting and higher self-control under high-quality parenting; and those with four or five variants showed the effect to extremes.
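One way to picture the “plasticity” described here is as a gene-environment interaction: in a simple statistical model, the number of “plasticity” variants a boy carries does not set his self-control directly but determines how steeply self-control responds to parenting quality. The toy sketch below illustrates only that idea – its coefficients are invented for the example and it does not reproduce Belsky and Beaver’s actual analysis.

```python
# Toy illustration of the "dandelion vs orchid" idea as a gene-environment interaction.
# Self-control is modelled as a baseline plus a term whose slope grows with the number of
# "plasticity" gene variants carried. All coefficients are invented for illustration;
# this is not Belsky and Beaver's statistical model.

def predicted_self_control(parenting_quality, plasticity_variants):
    """parenting_quality runs from -1 (poor) to +1 (good); plasticity_variants from 0 to 5."""
    baseline = 0.5                             # moderate self-control for everyone
    sensitivity = 0.1 * plasticity_variants    # more variants -> steeper response to parenting
    return baseline + sensitivity * parenting_quality

if __name__ == "__main__":
    for variants in (0, 1, 3, 5):
        poor = predicted_self_control(-1.0, variants)
        good = predicted_self_control(+1.0, variants)
        print(f"{variants} plasticity variants: self-control {poor:.2f} with poor parenting, "
              f"{good:.2f} with good parenting")
```

Carriers of no variants come out flat, like “dandelions”, while those with four or five swing from the lowest to the highest predicted self-control depending on parenting, like “orchids” – mirroring the pattern the study reports for boys.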
In the broad context of self-control among male adolescents there is clear evidence of genetic and environmental interplay separating “dandelions” from “orchids”, but the results are yet to be replicated or demonstrated for other measures of behaviour and personality. Yet the study appears to knock on the head any notion of single genetic factors determining behaviour and psychology, and suggests that home environment exerts a powerful and unexpected influence on “orchids”.
Psychiatrists since the days of Carl Jung have recognised a group of between about 15% and 20% of the population who are highly sensitive to subtle stimuli and who are often over-aroused by them. They are prone to shyness, social anxiety, inhibition, even social phobia, as well as innate fearfulness, introversion, and so on. Individuals with such a highly sensitive personality (HSP) are an obvious target for probing the possible influence of the “dandelion vs orchid” effect. A team of Chinese and US neuroscientists found a correlation between HSP characteristics and 10 gene variants that affect dopamine levels – clearly a highly complex range of possible interactions.[17] Part of the study assessed environmental factors, finding a significant effect from recent stressful events, but it was unable to resolve influences from earlier in life, such as parenting: it is work in progress.[18]
Implications for distant ancestors
Autism, Asperger’s Syndrome and other conditions involve to some extent a deep, inward-looking “other-worldliness”. Should genetic mutations be linked with such conditions, as seems likely, then once they entered the genome of some humans the gene variants would have placed those who carried them at the ends of a variety of behavioural spectra. Yet the survival and spread of the variants to a large minority of living human populations may imply that they once conferred “fitness” on their carriers, in the sense of Darwin’s natural selection – or at least had no adverse effect on the ability of individual carriers to reproduce and pass genes through succeeding generations. In the case of the Stone Age artists whose creations can be interpreted as evidence for schizophrenia or autism, they were almost certainly highly valued individuals. Shamans in surviving hunter-gatherer groups seem to come into that category. By helping people through trance and ritual to come to terms with harsh conditions and maintain social cohesion, they become revered by their communities.
Since about 10,000 years ago, the growing ability of settled societies to create a storable surplus – and, increasingly, to risk and suffer its expropriation by a minority – has progressively alienated the majority from the products of their labour, from the natural world, from their fellows and society in general, and inwardly from their true selves. This fundamental shift negated the social forms that prevailed previously: the more egalitarian and communally oriented outlooks of nomadic hunter-gatherers. The “dandelion-orchid” hypothesis helps to explain how the down-sides associated with some gene variants linked to neurotransmitters are increasingly activated through alienation, except in a lucky, tiny minority whose conditions of life allow them to blossom intellectually and aesthetically.
One of the most shocking outcomes of capital confronting peoples who until recently lived as hunter-gatherers in egalitarian and highly supportive groups is epidemics of mental suffering as devastating as the smallpox and other diseases visited on them through that contact. In much the same way that an almost complete lack of immunity to viral and bacterial infections common to the “civilised” world decimated the aboriginal peoples of Australia, the Americas, the high Arctic and the remaining hunter-gatherers of Africa, the shock when capital’s “norm” of alienation descended caused a plague of “madness”, alcoholism and now drug abuse that blights many remaining communities. It may be that the San, Inuit, Hadza, Lakota, the many indigenous groups of Australia and others, having avoided 10 millennia of class society and alienation, carry a higher proportion of “orchid” genes than the rest of humanity. Those 10,000 years have seen more recent gene variants appear and spread through the descendants of early agriculturalists and herders, variants that confer “fitness” in various contexts associated with agricultural produce, such as lactose and alcohol tolerance. It is not unreasonable to surmise that far older double-edged genes may, on the contrary, have declined in the dominant population since they no longer confer fitness on most carriers under capital’s social order: they may make the less economically and socially fortunate carriers “unfit” for a newer, alien context.
At some time in the more distant past, when all humans lived as nomadic hunter-gatherers, “double-edged” gene variants that bear on personality appeared through mutation. A new diversity of mental abilities conferred by them on their carriers would have helped humans to travel through every known climatic and ecological zone during the last hundred millennia, to survive the conditions they met and to make rapid advances in culture and understanding. Such mutations are now commonplace in the world population, but in a social world turned on its head their potentialities likewise have been negated to cause inner torment and sometimes blind lashing out against a social wasteland created by the instigator of alienation – capital. It seems to me that psychiatric conditions as they express themselves nowadays are illnesses, since people do suffer immensely when they strike. But unlike ailments due to biological or physical factors, being “mad” bears hallmarks of being severely wounded by a society, and fundamentally by an economic system, that is increasingly inhuman and demonstrably capable of turning benefit to unbearable burden.
[1] The Diagnostic and Statistical Manual of Mental Disorders (DSM), published by the American Psychiatric Association (APA), is used worldwide and makes a tidy profit for the APA of around US$5 million a year thanks to its tightly guarded copyrights on definitions.
[3] The first was London’s Bethlem Royal Hospital, founded in the 13th century and named after the priory of the Order of the Star of Bethlehem; it gave rise to the English word “bedlam”, meaning uproar and confusion.
[4] “Editorial: A neglect of mental illness”, Scientific American, v. 306 (March 2012). Based on data from http://www.nami.org/template.cfm?section=about_mental_illness
[5] Sometimes referred to as evo-devo (from “evolution and development”), but more formally as epigenetics – the study of heritable changes in the way genes express themselves, as a result of processes other than changes in the coding of genes in DNA – with “epi-” (from Greek) conveying the essence of over, above and beyond genetics. This is far from the genetic determinism favoured by some popularisers of the primacy of genes, for instance Richard Dawkins.
[6] Darwin in his On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life (usually abbreviated to The Origin of Species) introduced the idea that individual organisms which survive to maturity and produce offspring, and whose offspring do the same from generation to generation, are “fit” as regards the totality of their surroundings. In the course of all the characteristics of individuals within a species being pitted against their environment, those characteristics that confer “fitness” are retained in later generations. Any characteristics which make individuals “unfit” gradually disappear from the species’ population: hence “natural selection”. We now know that this process operates through genes that are “encoded” in DNA, each of which signals the body to produce various molecules that enable it to function physically and neurologically. Variants of genes arise through mutation, for instance through miscopying of the code during cell reproduction or, at the other extreme, as a result of cosmic rays.
[7] Spikins, P., “Autism, the Integrations of ‘Difference’ and the Origins of Modern Human Behaviour”, Cambridge Archaeological Journal 19:2 (2011), pp. 179–201. Reviewed by Kate Ravilious in the 5 November 2011 issue of New Scientist.
[13] Caspi, A. et al., “Influence of life stress on depression: moderation by a polymorphism in the 5-HTT Gene”, Science, v. 301 (2003), pp. 386-389.
[14] Caspi, A. et al., “Role of genotype in the cycle of violence in maltreated children”, Science. v. 297 (2002), pp. 851-854.
[15] Boyce, W.T. & Ellis, B.J., “Biological sensitivity to context: I. An evolutionary–developmental theory of the origins and functions of stress reactivity”, Development and Psychopathology 17 (2005), pp. 271–301.
[16] Belsky, J. & Beaver, K.M., “Cumulative-genetic plasticity, parenting and adolescent self-regulation”, Journal of Child Psychology and Psychiatry, v. 52 (2011), pp. 619-626.
[17] Chen C. et al., “Contributions of dopamine-related genes and environmental factors to highly sensitive personality: A multi-step neuronal system-level approach”, PLoS ONE 6(7): e21636.
[18] For an interesting summary of epigenetics in the wider context of this discussion see: Holmes, R., “Ice age survival: clues in fossil DNA”, New Scientist, 4 February 2012, pp. 8-9.