Saturday, February 25, 2023

Drought Causes Cultural Breakdown

What happened to the Hittites, and the late Bronze Age?

Climate change is already causing wars and migration- misery on a vast scale. The global South takes the heat, while the global North keeps making it, pumping out the CO2. Can we adapt, or is the human population going to decrease, either gently or not so gently, as conditions deteriorate? The answer is not at all clear. The adaptation measures taken by the rich world involve highly contentious politics, and uncertain technology that, at best, requires a great deal more resource extraction. The poor, on the other hand, are left either to try developing (if they can maintain good political and economic governance) to join the rich in their extractive ways (China, India), or to migrate en masse to rich countries (Africa, Central America). All this is going to get worse, not better, since we are still at peak CO2 emissions and only beginning the process of global heating.

Our emissions of CO2 are still going up, not down. Therefore climate change will be getting worse, faster. Conflict is one likely outcome.


Well, migrations and dislocation have happened before. Over the last millennium, it was cold temperatures, not hot, that correlated with conflict. Epic migrations occurred in the declining days of the Roman Empire, when the Huns drove a domino series of migrations of Germanic tribes that fought their way throughout Europe. What drove the Huns out of the Asian steppe is unknown, however. Jared Diamond wrote of several other cultures that met their end after exhausting their resources and technologies. A recent paper added one more such case- the Hittites of the late Bronze Age.

The Hittites were a big deal in their time (1700 to 1200 BCE, very roughly), running what is now eastern and southern Turkey, and occasionally Syria and points south. They were an early offshoot of the Indo-European migrations, and had a convulsive (though not very well understood) history of rises and falls, mostly due to their political dynamics. At the height of Hittite power, they fought Egypt directly at the battle of Kadesh (1274 BCE), which occurred just a little north of current-day Lebanon. This was the complex frontier between Assyria / Babylon, the Hittites, and Egypt. Egyptian history is full of expeditions- military, economic, and diplomatic- through the Levant.

The Hittites were artists as well as warriors.

The Hittites were also one of several communities around the Mediterranean that shared in the late Bronze Age collapse. This is the epic time that saw the Greek siege of Troy (~1200 BCE) and the "Sea Peoples'" invasion of Egypt. Its causes and details remain a long-standing historical mystery, but its scale was vast. Greece entered a dark age that lasted from 1200 to the 800s BCE. North Africa, the Balkans, Turkey, the Levant, and the Caucasus all declined. Assyria and Egypt were weakened, but did not collapse. The latest paper uses tree-ring data from junipers around the former Hittite capital, in what is now central Turkey, to more precisely date a severe drought that may have caused this collapse. Drought is just the kind of cause that would have been widespread enough and foundational enough to destroy the regional economies and prompt migrations and wars. Wars - there are always wars, but no single war would have caused the collapse of cultures on such a wide scale, including a weakening of Egypt. Plagues are also not a great candidate, since they do not harm a society's resource base, only its population. Such population reductions typically benefit the survivors, who rebuild in short order.

Moisture levels inferred from tree ring data, with lower values drier. There are three consecutive catastrophic years dated to 1198-1196 BCE in this region, which is around the ancient Hittite capital. The ensuing decade was also unusually dry and likely poor for agriculture. The 20% and 6.25% drought levels are defined by comparison to wider sampling, including modern data.


The drought these authors identified and located with precision was extraordinary. They note that, using modern data for indexing, the 20% level (representing about 30 cm of annual rain) is the minimum viable threshold for growing wheat. The 6.25% level is far below that and represents widespread crop failure. They developed two types of data from the tree rings, drawn from 18 individual trees whose rings spanned about a thousand years across the second millennium BCE. First is the width of the rings themselves, whose data are shown above. Second is the carbon-13 isotope ratio, a separate index of dryness, based on the isotopic discrimination that plants exercise over CO2 uptake under different climatic conditions.
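To make the threshold logic concrete, here is a minimal Python sketch using invented ring-width values; only the 20% and 6.25% levels are taken from the paper, and the actual dendrochronology is far more involved.

```python
import numpy as np

# Invented ring-width indices standing in for annual moisture; the paper's
# real series comes from 18 junipers spanning the second millennium BCE.
rng = np.random.default_rng(0)
ring_width = rng.normal(loc=1.0, scale=0.2, size=1000)

# Threshold levels from the paper's framing: the 20th percentile roughly marks
# the minimum viable rainfall for wheat; 6.25% marks widespread crop failure.
wheat_minimum = np.percentile(ring_width, 20)
crop_failure = np.percentile(ring_width, 6.25)
print(f"thresholds: wheat {wheat_minimum:.2f}, failure {crop_failure:.2f}")

# Consecutive failure-level years are the real signal: three in a row (as
# dated to 1198-1196 BCE) would have been agriculturally catastrophic.
failure_years = np.flatnonzero(ring_width < crop_failure)
runs = np.split(failure_years, np.flatnonzero(np.diff(failure_years) != 1) + 1)
print(f"{len(failure_years)} failure years; longest run: {max(len(r) for r in runs)}")
```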

The same tree rings that provided the inferences above from their geometry (width) also provided carbon-13 isotope data that led to a similar conclusion, though with much less precision. High proportions of C13 indicate a drier climate, here sustained around 1200 BCE.

The paper shows three consecutive years at the 6.25% level of rainfall, starting at 1198 BCE. The ensuing decade was also harshly dry. All this correlates with cuneiform texts found in the Levant- letters from the Hittites bemoaning their drought and begging for assistance. But everyone in the region was in a similar position. The Hittite culture never recovered.

So drought is now a leading hypothesis for the ultimate cause of the late Bronze Age collapse around many parts of the Mediterranean, with Greece and Anatolia particularly affected. While it is reasonable to imagine that such conditions would lead to desperation, migration, and war, there is no direct link yet. The nature and origin of the Sea Peoples who attacked Egypt remain unknown, for instance. The reasons for the siege of Troy are lost to myth. The Iliad never mentions drought, nor would Troy have been in a much better position than Mycenaean Greece, climatically speaking. But the consequences of geopolitical shifts in alignment can be unpredictable, as we continue to experience today. It is exciting (as well as sobering) to get a glimpse into this cloudy history- into a vast swath of human experience that built great cultures and suffered epic defeats.


Saturday, February 18, 2023

Everything is Alive, but the Gods are all Dead

Barbara Ehrenreich's memoir and theological ruminations in "Living with a Wild God".

It turns out that everyone is a seeker. Somewhere there must be something or someone to tell us the meaning of life- something we don't have to manufacture with our own hands, but rather can go into a store and buy. Atheists are just as much seekers as anyone else, only they never find anything worth buying. The late writer Barbara Ehrenreich was such an atheist, as well as a remarkable writer and intellectual who wrote a memoir of her formation. Unusually and fruitfully, it focuses on those intense early and teen years when we are reaching out with both hands to seize the world- a world that is maddeningly just beyond our grasp, full of secrets and codes it takes a lifetime and more to understand. Religion is the ultimate hidden secret, the greatest mystery which has been solved in countless ways, each of them conflicting and confounding.

Ehrenreich's tale is more memoir than theology, taking us on a tour through a dysfunctional childhood with alcoholic parents and tough love. A story of growth, striking out into the world, and sad coming-to-terms with the parents who each die tragically. But it also turns on a pattern of mystical experiences that she keeps having, throughout her adult life, which she ultimately diagnoses as dissociative states where she zones out and has a sort of psychedelic communion with the world.

"Something peeled off the visible world, taking with it all meaning, inference, association, labels, and words. I was looking at a tree, and if anyone had asked, that's what I would have said I was doing, but the word "tree" was gone, along with all the notions of tree-ness that had accumulated in the last dozen years or so since I had acquired language. Was it a place that was suddenly revealed to me? Or was it a substance- the indivisible, elemental material out of which the entire known and agreed-upon world arises as a fantastic elaboration? I don't know, because this substance, this residue, was stolidly, imperturbably mute. The interesting thing, some might say alarming, was that when you take away all the human attributions- the words, the names of species, the wisps of remembered tree-related poetry, the fables of photosynthesis and capillary action- that when you take all this this away, there is still something left."

This is not very hard to understand as a neurological phenomenon: a transient disconnection of just the kind of brain areas she mentions- those that do all the labeling, name-calling, and boxing-in. In schizophrenia, this runs to the pathological, but Ehrenreich does not regard her own episodes as pathological at all, as they were always quite brief. But obviously, the emotional impact and weirdness of the experience- that is something else altogether, and something that humans have been inducing with drugs, and puzzling over, forever.


As a memoir, the book is very engaging. As a theological quest, however, it doesn't work as well, because the mystical experience is, as noted above, resolutely meaningless. It neither compels Ehrenreich to take up Christianity, as after a Pauline conversion, nor any other faith or belief system. It offers a peek behind the curtain, but, stripped of meaning as this view is, Ehrenreich is perhaps too skeptical or too bereft of imagination to supply another meaning, whether of her own making or one available from the conventional array of sects and religions. So while the experiences are doubtless mystical, one cannot call them religious, let alone god-given, because Ehrenreich hasn't interpreted them that way. This hearkens back to the writings of William James, who declined to assign general significance to mystical experiences, while freely admitting their momentous and convincing nature to those who experienced them.

Only in one brief section (which had clearly been originally destined for an entirely different book) does she offer a more interesting and insightful analysis. There, Ehrenreich notes that the history of religion can be understood as a progressive bloodbath of deicide. At first, everything is alive and sacred, to an animist mind. Every leaf and grain of sand holds wonders. Every stream and cloud is divine. This is probably our natural state, which a great deal of culture has been required to stamp out of us. Next is a hunting kind of religion, where deities are concentrated in the economic objects (and social patterns) of the tribe- the prey animals, the great plants that are eaten, and perhaps the more striking natural phenomena and powerful beasts. But by the time of paganism, the pantheon is cut down still more and tamed into a domestic household, with its soap-opera dramas and an increasingly tight focus on the major gods- the head of the family, as it were. 

Monotheism comes next, doing away with all the dedicated gods of the ocean, of medicine, of amor and war, etc., cutting the cast down to one. One, which is inflated to absurd proportions with all-goodness, all-power, all-knowledge, etc. A final and terrifying authoritarianism, probably patterned on the primitive royal state. This is the phase when the natural world is left in the lurch, as an undeified and unprotected zone where human economic greed can run rampant, safe in the belief that the one god is focused entirely on man's doings, whether for good or for ill, not on those of any other creature or feature of the natural world. A phase when even animals, who are so patently conscious, can, through the narcissism of primitive science and egoistic religion, be deemed mere mechanisms without feeling. This process doesn't even touch on the intercultural deicide committed by colonialism and conquest.

This in turn invites the last deicide- that by rational people who toss aside this now-cartoonish super-god, and return to a simpler reverence for the world as we naturally respond to it, without carting in a lot of social power-and-drama baggage. It is the cultural phase we are in right now, but the transition is painfully slow, uneven, and drawn-out. For Ehrenreich, there are plenty of signs- in the non-linear chemical phenomena of her undergraduate research, in the liveliness of quantum physics even into the non-empty vacuum, in the animals who populate our world and are perhaps the alien consciousnesses that we should be seeking in place of the hunt through outer space, and in our natural delight in, and dreams about, nature at large. So she ends the book as atheist as ever, but hinting that perhaps the liveliness of the universe around us holds some message that we are not the only thinking and sentient beings.

"Ah, you say, this is all in your mind. And you are right to be skeptical; I expect no less. It is in my mind, which I have acknowledged from the beginning is a less than perfect instrument. but this is what appears to be the purpose of my mind, and no doubt yours as well, its designed function beyond all the mundane calculations: to condense all the chaos and mystery of the world into a palpable Other or Others, not necessarily because we love it, and certainly not out of any intention to "worship" it. But because ultimately we may have no choice in the matter. I have the impression, growing out of the experiences chronicled here, that it may be seeking us out." 

Thus the book ends, and I find it a rather poor ending. It feels ripped from an X-Files episode, highly suggestive and playing into all the Deepak Chopra-style mystical tropes of cosmic consciousness. That is, if this passage really means much at all. Anyhow, the rest of the trip is well worth it, and it is appropriate to return to the issue of the mystical experience, which is here handled with such judicious care and restraint. Where imagination could have run rampant, the coolly scientific view (Ehrenreich had a doctorate in biology) is that the experiences she had, while fascinating and possibly book-proposal-worthy, did not force a religious interpretation. This is radically unlike the treatment of such matters in countless other hands, needless to say. Perhaps our normal consciousness should not be automatically valued less than rarer and more esoteric states, just because it is common, or because it is even-tempered.


  • God would like us to use "they".
  • If you are interested in early Christianity, Gnosticism is a good place to start.
  • Green is still an uphill battle.

Saturday, February 11, 2023

A Gene is Born

Yes, genes do develop out of nothing.

The "intelligent" design movement has long made a fetish of information. As science has found, life relies on encoded information for its genetic inheritance and the reliable expression of its physical manifestations. The ID proposition is, quite simply, that all this information could not have developed out of a mindless process, but only through "design" by a conscious being. Evidently, Darwinian natural selection still sticks on some people's craw. Michael Behe even developed a pseudo-mathematical theory about how, yes, genes could be copied mindlessly, but new genes could never be conjured out of nothing, due to ... information.

My understanding of information science equates information with a loss of entropy, and sets a minimal energy cost to create, compute, or transmit information- the Shannon and Landauer limits. A quite different concept comes from physics, in the form of information conservation in places like black holes. This form of information is really the implicit information of the wave functions and states of physical matter, not anything encoded or transmitted in the sense of biology or communication. Physical state information may be indestructible (and un-create-able) on this principle, but coded information is an entirely different matter.
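To put rough numbers on these limits, here is a toy Python example of my own (not from any of the papers discussed): the Shannon entropy of a short symbol stream, and the Landauer bound on the energy required to erase that much information.

```python
import math
from collections import Counter

def shannon_entropy_bits(seq: str) -> float:
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A DNA string with all four bases equally likely carries up to 2 bits/base.
dna = "ATGCGTACGTTAGCATGCAA"
h = shannon_entropy_bits(dna)
print(f"entropy: {h:.3f} bits/base, {h * len(dna):.1f} bits total")

# Landauer's principle: erasing one bit dissipates at least kT*ln(2) joules.
k_B, T = 1.380649e-23, 300.0   # Boltzmann constant (J/K), room temperature (K)
print(f"minimum energy to erase one bit at {T:.0f} K: {k_B * T * math.log(2):.2e} J")
```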

In a parody of scientific discussion, intelligent design proponents are hosted by the once-respectable Hoover Institution for a discussion about, well, god.

So the fecundity that life shows in creating new genes out of existing genes, (duplications), and even making whole-chromosome or whole-genome duplications, has long been a problem for creationists. Energetically, it is easy to explain as a mere side-effect of having plenty of energy to work with, combined with error-prone methods of replication. But creationistically, god must come into play somewhere, right? Perhaps it comes into play in the creation of really new genes, like those that arise from nothing, such as at the origin of life?

A recent paper discussed human genes that have arisen from essentially nothing over our recent evolutionary history. It drew on prior work in yeast that elegantly laid out a spectrum, or life cycle, of genes, from birth to death. It turns out that there is an active literature on the birth of genes, which shows that, just like duplication processes, it is entirely natural for genes to develop out of humble, junky precursors. And no information theory needs to be wheeled in to show that this is possible.

Yeast provides the tools to study novel genes in some detail, with rich genetics and lots of sequenced relatives, near and far. Portrayed here is the general life cycle of a gene, from birth out of non-gene DNA sequences (left), through the key step of translation, to becoming a subject of normal natural selection ("Exposed") for some function. But if that function decays or is replaced, the gene may also die by mutation, becoming a pseudogene, and eventually just more genomic junk.

The death of genes is quite well understood. The databases are full of "pseudogenes" that are very similar to active genes, but are disabled for some reason, such as a truncation somewhere, or loss of the reading frame due to a point mutation or splicing mutation. Their annotation status is dynamic, as some are later found to be active after all, under obscure conditions or at some low level. Our genomes are also full of transposons and retroviruses that have died in this fashion, by mutation.
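As a minimal sketch of how little it takes to kill a gene, consider a hypothetical toy sequence in which a single point mutation converts an internal codon to a stop, truncating the protein:

```python
# Toy example with an invented sequence; only the stop codons are real biology.
STOP = {"TAA", "TAG", "TGA"}

def translate_to_stop(dna: str) -> int:
    """Return the number of codons translated before hitting a stop."""
    for i in range(0, len(dna) - 2, 3):
        if dna[i:i+3] in STOP:
            return i // 3
    return len(dna) // 3

gene = "ATGCAGGAAGTTCTGCGCAAATTTGGC"    # intact reading frame, 9 codons
pseudo = gene[:6] + "TAA" + gene[9:]    # one point mutation: GAA -> TAA (stop)

print(translate_to_stop(gene))     # full-length: 9 codons
print(translate_to_stop(pseudo))   # truncated:   2 codons
```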

Duplications are also well understood; some have, over evolutionary time, given rise to huge families of related proteins, such as kinases, odorant receptors, or zinc-finger transcription factors. But the hunt for genes that have developed out of non-gene material is a relatively new area, due to its technical difficulty. Genome annotators were originally content to pay attention to genes that coded for a hundred amino acids or more, and to ignore everything else. That became untenable when a huge variety of non-coding RNAs came on the scene. Occasional cases of very small protein-coding genes also turned up in work that found them by their functional effects.

As genome annotation progressed, it became apparent that, while a huge proportion of genes are conserved between species (or belong to families of related proteins), other genes have no relatives at all, and will never provide information by this highly convenient route of computer analysis. They are orphans, and must either have been so heavily mutated since divergence that their relationships have become unrecognizable, or have arisen recently (that is, since the evolutionary divergence from the related species used for sequence comparison) from novel sources that provide no clue about their function. Finer analysis of ever more closely related species is often informative in these cases.

The recent paper on human novel genes makes the finer point that splicing and export from the nucleus constitute the major threshold between junk genes and "real" genes. Once an RNA gets out of the nucleus, any reading frame it may have will be translated and exposed to selection. So the acquisition of splicing signals is a key step, in their argument, to get a randomly expressed bit of RNA over the threshold.

A recent paper provided a remarkable example of novel gene origination. It uncovered a series of 74 human genes that are not shared with macaque (which they took as their reference), have a clear path of origin from non-coding precursors, and in some cases have significant biological effects on human development. They point to a gradual process whereby promiscuous transcription from the genome gave rise by chance to RNAs that acquired splice sites, which piped them into the nuclear export machinery and out to the cytoplasm. Once there, they could be translated over whatever small coding region they might possess, after which selection could operate on their small protein products. A few appear to have gained enough function to encourage expansion of the coding region, resulting in growth of the gene and entrenchment as part of the developmental program.
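How plausible is it that promiscuous transcripts carry translatable reading frames at all? A back-of-the-envelope simulation (my own illustration, not the paper's method) shows that short open reading frames are routine in random sequence:

```python
import random

random.seed(42)
STOP = {"TAA", "TAG", "TGA"}

def longest_orf_codons(dna: str) -> int:
    """Longest ATG-to-stop stretch, scanning all three reading frames."""
    best = 0
    for frame in range(3):
        codons = [dna[i:i+3] for i in range(frame, len(dna) - 2, 3)]
        start = None
        for j, codon in enumerate(codons):
            if codon == "ATG" and start is None:
                start = j
            elif codon in STOP and start is not None:
                best = max(best, j - start)
                start = None
    return best

# In 2 kb of random sequence, ORFs of a few dozen codons appear by chance:
# raw material for translation once an RNA reaches the cytoplasm.
seq = "".join(random.choice("ACGT") for _ in range(2000))
print(f"longest chance ORF: {longest_orf_codons(seq)} codons")
```

With stop codons occupying only 3 of the 64 possible codons, open stretches of a few dozen codons are expected by chance- plenty of raw material for selection once the RNA is exported and translated.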

Brain "organoids" grown from genetically manipulated human stem cells. On left is the control, in middle is where ENSG00000205704 was deleted, and on the right is where ENSG00000205704 is over-expressed. The result is very striking, as an evolutionarily momentous effect of a tiny and novel gene.

One gene, "ENSG00000205704" is shown as an example. Where in macaque, the genomic region corresponding to this gene encodes at best a non-coding RNA that is not exported from the nucleus, in humans it encodes a spliced and exported mRNA that encodes a protein of 107 amino acids. In humans it is also highly expressed in the brain, and when the researchers deleted it in embryonic stem cells and used those cells to grow "organoids", or clumps of brain-like tissue, the growth was significantly reduced by the knockout, and increased by the over-expression of this gene. What this gene does is completely unknown. Its sequence, not being related to anything else in human or other species, gives no clue. But it is a classic example of gene that arose from nothing to have what looks like a significant effect on human evolution. Does that somehow violate physics or math? Nothing could be farther from the truth.

  • Will nuclear power get there?
  • What the heck happened to Amazon shopping?

Saturday, February 4, 2023

How Recessive is a Recessive Mutation?

Many relationships exist between mutation, copy number, and phenotype.

The traditional setup of Mendelian genetics is that an allele of a gene is either recessive or dominant. Blue eyes are recessive to brown eyes, for the simple reason that blue arises from the absence of an enzyme, due to a loss of function mutation. So having some of that enzyme, from even one "brown" copy of that gene, is dominant over the defective "blue" copy. You need two "blue" alleles to have blue eyes. This could be generalized to most genes, especially essential genes, where lacking both copies is lethal, while having one working copy will get you through, and cover for a defective copy. Most gene mutations are, by this model, recessive. 
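As a toy sketch of that textbook logic (a simplification- real eye color involves several loci):

```python
from itertools import product

# Hypothetical single-locus model: "B" = functional enzyme allele (brown),
# "b" = loss-of-function allele (blue). One working copy suffices,
# so only the bb genotype gives blue eyes.
def phenotype(genotype: tuple[str, str]) -> str:
    return "brown" if "B" in genotype else "blue"

parent1, parent2 = ("B", "b"), ("B", "b")   # two brown-eyed heterozygotes
offspring = list(product(parent1, parent2))  # the four Punnett-square outcomes
blue = sum(phenotype(g) == "blue" for g in offspring)
print(f"{blue}/{len(offspring)} offspring genotypes give blue eyes")  # 1/4
```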

But most loci and mutations implicated in disease don't really work like that. Some recent papers delved into the genetics of such mutations and observed that their recessiveness was all over the map- a spectrum, really, of effects from fully recessive to dominant, with most in the middle ground. This is informative for clinical genetics, but also for evolutionary studies, suggesting that evolution is not, after all, blind to the majority of mutations, which are mostly deleterious, exist most of the time in the heterozygous (single-copy) state, and would be wholly recessive by the usual assumption.

The first paper describes a large study of the Finnish population, which benefited from several advantages. First, Finns have a good health system with thorough records, which are housed in a national biobank; the study used 177,000 health records and 83,000 variants in coding regions of genes collected from sequencing studies. Second, the Finnish population is relatively small and has experienced bottlenecks from smaller founding populations, which amplifies the prevalence of variants that those founders carried. That allows those variants to rise to higher rates of appearance, especially in the homozygous state, which generally causes more noticeable disease phenotypes. Both the detectability and the statistics were powered by this higher incidence of some deleterious mutations (while others, naturally, would have been rarer than the world-wide average, or absent altogether).
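A rough Wright-Fisher-style simulation (a simplified illustration of drift through a bottleneck, not the paper's model) shows how a rare allele passed through a small founder group ends up at widely varying frequencies, sometimes far above its starting point:

```python
import numpy as np

rng = np.random.default_rng(1)

def founder_drift(n_founders: int, p0: float, final_size: int,
                  generations: int = 40) -> float:
    """Allele frequency after a founding bottleneck and population growth."""
    p, n = p0, n_founders
    for _ in range(generations):
        p = rng.binomial(2 * n, p) / (2 * n)   # binomial sampling of 2N alleles
        n = min(final_size, n * 2)             # population regrows after founding
    return p

# A rare variant (0.5%) run through a 500-person founder group many times:
# some replicates lose it, others carry it to several times its old frequency.
# Homozygote incidence scales as p^2, so even modest increases matter.
freqs = [founder_drift(n_founders=500, p0=0.005, final_size=100_000)
         for _ in range(20)]
print([round(p, 4) for p in freqs])
```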

Third, the authors emphasize that they searched for various levels of recessive effect, contrary to the usual practice of just assuming a linear effect. A linear model says that one copy of a mutation has half the effect of two copies- true sometimes, but not most of the time, especially in more typical cases of recessive effect where one copy has a good deal less effect, if not zero. Returning to eye color: if one looks in detail, there are many shades of eyes, even of blue eyes, so it is evident that the alleles affecting eye color are various, and express to different degrees (have various penetrance, in the parlance). While complete recessiveness happens frequently, it is not the most common case, since we do not routinely express excess amounts of protein from our genes, making the loss of one copy noticeable, to some degree, most of the time. This is why the lack of a whole chromosome, or an excess of a whole chromosome, has generally devastating consequences. Trisomies of only three chromosomes are viable (that is, not lethal), and even those confer severe syndromes.

A population proportion plot vs. age of disease diagnosis for three different diseases and an associated genetic variant. In blue is the normal ("wild-type") case, in yellow the heterozygote, and in red the homozygote with two variant alleles. For "b", the total lack of XPA causes skin cancer with juvenile onset, and the homozygous case is not shown. The Finnish data allowed detection of rather small recessive effects from variations that are common in that population. For instance, "a" shows the barely discernible advancement of age of diagnosis for a disease (hearing loss) that in the homozygous state is universal by age 10, caused by mutations in GJB2.

The second paper looked more directly at the fitness cost of variations over large populations, in the heterozygous state. The authors looked at loss-of-function (LOF) mutations in over 17,000 genes, studying their rates of appearance in, and loss from, human populations, as well as in pedigrees. These rates were turned, by a modeling system, into fitness costs, stated in percentage terms versus wild type. A fitness cost of 1% is pretty mild (though highly significant over longer evolutionary time), while a fitness cost of 10% is quite severe, and one of 100% is immediately lethal and would never be observed in the population. For example, a mutation that is seen rarely, and that persists in pedigrees for only a couple of generations, implies a fitness cost of over 10%.

They come up with a parameter "hs", which is the fitness cost "s" of losing both copies of a gene, multiplied by "h", a measure of the dominance of the mutation in a single copy.
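This appears to be the standard population-genetics parameterization, in which the three genotype fitnesses are 1, 1 - hs, and 1 - s. A quick sketch with made-up numbers:

```python
def genotype_fitnesses(s: float, h: float) -> tuple[float, float, float]:
    """Standard convention: w(+/+) = 1, w(+/-) = 1 - h*s, w(-/-) = 1 - s."""
    return 1.0, 1.0 - h * s, 1.0 - s

# Fully recessive (h = 0): the heterozygote pays nothing.
print(genotype_fitnesses(s=0.10, h=0.0))   # (1.0, 1.0, 0.9)

# Additive or "linear" (h = 0.5): one copy costs half of two.
print(genotype_fitnesses(s=0.10, h=0.5))   # (1.0, 0.95, 0.9)

# Partial dominance (h = 0.2): the middle ground where most genes fall.
print(genotype_fitnesses(s=0.10, h=0.2))   # (1.0, 0.98, 0.9)
```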


In these graphs, human genes are stacked along the Y axis, sorted by their computed "hs" fitness cost in the heterozygous state. Error bars are in blue, showing that this is naturally a rather error-prone exercise in estimation. But what is significant is that most genes lie somewhere along the spectrum, with very few having negligible effects (bottom) and many having highly significant effects (top). Genes on the X chromosome are naturally skewed to much higher significance when mutated, since in males there is no other copy, and even in females, one X chromosome is (randomly) inactivated to provide dosage compensation- that is, to match the male dosage of production of X genes- which results in much higher penetrance for females as well.


So the bottom line is that while diploidy helps to hide a lot of variation in sexual organisms, and in humans in particular, it does not hide it completely. We are each estimated to receive, at birth, about 70 new mutations, of which roughly 1/1000 are the kind of total loss of gene function studied here. This work then estimates that 20% of those mutations have a severe fitness effect of over 10%, meaning that about one in seventy zygotes carries such a new mutation (not counting anything inherited from its parents) and will suffer ill effects immediately, even though it has a wild-type copy of that gene as well.
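The arithmetic behind "one in seventy", spelled out:

```python
new_mutations_per_zygote = 70        # average new mutations at birth
loss_of_function_fraction = 1 / 1000 # fraction that totally disable a gene
severe_fraction = 0.20               # estimated share with >10% fitness cost

rate = new_mutations_per_zygote * loss_of_function_fraction * severe_fraction
print(f"severe new LOF mutations per zygote: {rate:.3f}")  # 0.014
print(f"about 1 in {1 / rate:.0f} zygotes")                # about 1 in 71
```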

Humans, like other organisms, carry a large mutational load that is constantly under surveillance by natural selection. The fact that severe mutations routinely have significant effects even in the heterozygous state is both good and bad news. Good in the sense that natural selection has more to work with, and can gradually whittle down their frequency without necessarily waiting for the chance of two copies meeting in an unfortunate homozygous state. But bad in the sense that it adds to our overall phenotypic variation and health difficulties a whole set of deficiencies that, while typically minor individually, are also legion.