Saturday, April 24, 2021

Way Too Much Dopamine

Schizophrenia and associated delusions/hallucinations as a Bayesian logic defect of putting priors over observations, partly due to excess dopamine or sensitivity to dopamine.

It goes without saying that our brains are highly tuned systems, both through evolution and through development. They are constantly active, with dynamic coalitions of oscillatory synchrony and active anatomical connection that appear to create our mental phenomena, conscious and unconscious. Neurotransmitters have long been talismanic keys to this kingdom, there being relatively few of them, with somewhat distinct functions. GABA, dopamine, serotonin, glutamate, acetylcholine are perhaps the best known, but there are dozens of others. Each transmitter tends to have a theme associated with it, like GABA being characteristic of inhibitory neurons, and glutamate the most common excitatory neurotransmitter. Each tends to have drugs associated with it as well, often from natural sources. Psilocybin stimulates serotonin receptors, for instance. Dopamine is central to reward pathways, making us feel good. Cocaine raises dopamine levels, making us feel great without having done anything particularly noteworthy.

As is typical, scientists thought they had found the secret to the brain when they found neurotransmitters and the variety of drugs that affect them. New classes of drugs like serotonin reuptake inhibitors (imipramine, Prozac) and dopamine receptor antagonists (haloperidol) took the world by storm. But they didn't turn out to have the surgical effects that were touted. Neurotransmitters function all over the brain, and while some have major themes in one area or another, they might be doing very different things elsewhere, and not overlap very productively with a particular syndrome such as depression or schizophrenia. Which is to say that such major syndromes are not simply tuning problems of one neurotransmitter or another. Messing with transmitters turned out to be a rather blunt medical instrument, if a helpful one.

All this comes to mind with a recent report of the connection between dopamine and hallucinations. As noted above, dopamine antagonists are widely used as antipsychotics (following the dopamine hypothesis of schizophrenia), but the premier hallucinogens are serotonin activators, such as psilocybin and LSD, though their mode of action remains not fully worked out. (Indeed, ketamine, another hallucinogen, inhibits glutamate receptors.) There is nothing neat here, except that nature, and the occasional chemical accident, have uncovered amazing ways to affect our minds. Insofar as schizophrenia is characterized by over-active dopamine activity in some areas (though with a curious lack of joy, so the reward circuitry seems to have been left out), and involves hallucinations which are reduced by dopamine antagonists, a connection between dopamine and hallucinations makes sense.

"... there are multiple genes and neuronal pathways that can lead to psychosis and that all these multiple psychosis pathways converge via the high-affinity state of the D2 receptor, the common target for all antipsychotics, typical or atypical." - Wiki


So what do they propose? These researchers came up with a complex system to fool mice into pressing a lever based on uncertain (auditory) stimuli. If the mouse really thought the sound had happened, it would wait around longer for the reward, giving researchers a measure of its internal confidence in a signal that may never have actually been presented. The researchers presented a joint image and sound, but sometimes left out the sound, producing what they claim to be a hallucinated perception of the sound. Thus the mice, amid all this confusion, generated some hallucinations in the form of confidently acting as though something good was coming their way. Ketamine increased this presumed hallucination rate, suggestively. The experiment was then to squirt some extra dopamine into their brains (via new-fangled optogenetic methods, which are highly controllable in time and space) at a key area known to be involved in schizophrenia: the striatum, a key interface between the cortex and the lower/inner areas of the brain involved in motion, emotion, reward, and cognition.
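To make this readout concrete, here is a toy simulation- my own sketch, with made-up numbers rather than the paper's actual task parameters- in which an internal "evidence" variable drives both the lever press and the willingness to wait, and a hallucination-like event is scored as a high-confidence report on a trial where no tone was played:

```python
# Toy version of the confidence readout (illustrative numbers only, not the
# paper's parameters): "evidence" drives both the report and the wait time,
# and a hallucination-like event is a confident report on a no-tone trial.
import random

def hallucination_rate(n_trials=10_000, p_tone=0.5, noise_sd=1.0, bias=0.0, seed=1):
    random.seed(seed)
    confident_false_alarms = no_tone_trials = 0
    for _ in range(n_trials):
        tone_played = random.random() < p_tone
        # Internal evidence: the tone (if present) plus sensory noise plus any
        # extra top-down drive (the "bias" knob).
        evidence = (1.0 if tone_played else 0.0) + random.gauss(0, noise_sd) + bias
        reports_tone = evidence > 0.5            # lever press
        wait_time = max(0.0, evidence - 0.5)     # longer wait = more confidence
        if not tone_played:
            no_tone_trials += 1
            if reports_tone and wait_time > 0.5:
                confident_false_alarms += 1
    return confident_false_alarms / no_tone_trials

print(hallucination_rate(bias=0.0))  # baseline rate of confident false alarms
print(hallucination_rate(bias=0.5))  # extra top-down drive raises the rate
```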

Normal perception is composed of a balance of bottom-up observation and top-down organization. Too much of either one is problematic, sometimes hallucinatory.

This exercise did indeed mimic the action of a general dose of ketamine, increasing false assumptions, aka hallucinations, and confirming that dopamine is involved there. The experiment relates to a very abstract body of work on Bayesian logic in cognition, recognizing that perception rests on modeling. We need to have some model of the world before we can fit new observations into it, and we continually update this model by "noticing" salient "news" which differs from our current model. In the parlance, we use observation to update our priors to more accurate posterior probability distributions. The idea is that, in the case of hallucination, the top-down model is out-weighing (or making up for a lack of) bottom-up observation, running amok, and thus exposing errors in this otherwise carefully tuned Bayesian system. One aspect of the logic is that some evaluation needs to be made of the salience of a new bit of news. How much does it differ from what is current in the model? How reliable is the observation? How reliable is the model? The systems gone awry in schizophrenia appear to mess with all these key functions, awarding salience to unimportant things and great reliability to shaky models of reality.
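A minimal numerical sketch- mine, not the researchers' model- may help make the prior-versus-evidence idea concrete. Treat perception as Bayesian inference about whether a tone was present, and let a single "prior weight" parameter stand in for a top-down model that counts for more than the evidence warrants; overweight the prior, and an ambiguous, tone-less observation gets endorsed with high confidence:

```python
# Minimal sketch of prior-vs-evidence weighting in perception (not the paper's
# model). A listener infers whether a tone was present from one noisy
# observation; "prior_weight" stands in for an over-confident top-down model.
import math

def posterior_tone_present(observation, prior_tone=0.5, prior_weight=1.0,
                           noise_sd=1.0, signal_mean=1.0):
    """P(tone | observation), with the prior odds raised to a 'weight' exponent."""
    def gauss(x, mu):
        # Likelihood under Gaussian noise centered on mu (normalization cancels).
        return math.exp(-0.5 * ((x - mu) / noise_sd) ** 2)
    like_tone = gauss(observation, signal_mean)
    like_none = gauss(observation, 0.0)
    # Over-weighting the prior: exponentiating it multiplies the log prior odds,
    # mimicking a top-down model that outweighs the bottom-up evidence.
    w_tone = (prior_tone ** prior_weight) * like_tone
    w_none = ((1 - prior_tone) ** prior_weight) * like_none
    return w_tone / (w_tone + w_none)

# A weak, ambiguous observation on a trial where no tone was actually played:
obs = 0.4
print(posterior_tone_present(obs, prior_tone=0.7, prior_weight=1.0))  # ~0.68, uncertain
print(posterior_tone_present(obs, prior_tone=0.7, prior_weight=4.0))  # ~0.96, "heard it"
```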

Putting neurotransmitters together with much finer anatomical specification is surely a positive step towards figuring out what is going on, even if this mouse model of hallucination is rather sketchy. So this new work constitutes a tiny step in the direction of boring, anatomically and chemically, into one tiny aspect of this vast syndrome, and into the interesting area of mental construction of perceptions.


  • Another crisis of overpopulation.
  • And another one.
  • Getting China to decarbonize will take a stiff carbon price, about $500 to $1000/ton.

Various policy scenarios of decarbonization in China, put into a common cost framework of carbon pricing (y-axis). Some policies are a lot more efficient than others. 

Saturday, April 17, 2021

Zooming In On The Genome

Better sequencing methods bring the human genome to higher resolution and accuracy.

Most of the progress in DNA sequencing over the last two decades has come in what is known as "short read" sequencing. The dominant company, Illumina, produces massive machines that crank out huge amounts of data, but in the form of tiny individual reads, only about 90 bases long. That means that there is a lot of work on the back end for data analysis to piece everything together. And given the frequent occurrence in our genomes of repeats and repetitive sequences of many forms and sizes, these reads are simply too short to make full sense of them. No amount of assembly magic can make up for a lack of long-range information.
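A toy example makes the repeat problem concrete (an illustration of the principle, not a real assembler): if a repeated sequence is longer than the read length, two genomes that differ only in how their unique segments are arranged around the repeat copies yield exactly the same collection of short reads, so no downstream computation can tell them apart.

```python
# Why short reads cannot resolve long repeats: two genomes that differ only in
# the order of unique segments between repeat copies produce identical read sets.
repeat = "ATATATATAT"            # 10 bp repeat, longer than the 5 bp "reads" below
a, b = "GGCCG", "TTAAC"          # unique sequences caught between repeat copies
genome_1 = repeat + a + repeat + b + repeat
genome_2 = repeat + b + repeat + a + repeat   # unique blocks swapped

def reads(genome, k=5):
    """All overlapping k-base reads, as a sorted multiset."""
    return sorted(genome[i:i + k] for i in range(len(genome) - k + 1))

print(reads(genome_1) == reads(genome_2))   # True: the read sets are indistinguishable
```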

So there has been a second movement of "next generation" sequencing methods, pursuing long reads of tens of thousands of bases. Several methods exist, but the leader is Pacific Biosciences (PacBio), whose method tacks down a single polymerase into a special optical well and then uses fluorescence to detect each individual nucleobase incorporation as the polymerase chugs away at the given template. This is not a fool-proof process, being at the single-molecule level. While the Illumina system greatly amplifies the DNA and thus raises the signal (which is also ultimately fluorescence-based) to a high and reliable level, these long-read methods tend to have lower reliability. A recent paper described a way around this, featuring a long-read system that was used to analyze 34 human genomes to collect new information about large-scale structure and variation.

The "weird trick" that PacBio uses is to circularize templates of about 15,000 bases, and then drive the polymerase reaction described above around those circles upwards of fifty times. This allows multiple passes around the same DNA to make up (in volume/repetition) for the inherent error rate of each individual pass. Reads of this size are big enough to span most forms of repetition and low-complexity sequence in our genomes, or at least cover enough landmarks/variants that one repeat can be distinguished from others. Indeed, these researchers could even figure out, based on allelic variants peppered through the genome, which parent each sequence came from, assembling each of the subject's two copies of each chromosome separately. All this makes it possible to assemble whole genomes with unprecedented accuracy (~1 error in a half-million bases), especially in terms of long-range features and variations.
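A back-of-the-envelope calculation shows why the repeated passes pay off. This is a simplification- it assumes independent errors and a simple per-base majority vote, rather than PacBio's actual consensus algorithm, which also has to handle insertions and deletions- but it captures the principle:

```python
# How repeated passes around a circular template rescue a noisy read: assume
# each pass miscalls a given base independently with probability p, and the
# consensus takes a simple majority vote across n passes. (A simplification;
# real consensus calling also models indels and base qualities.)
from math import comb

def majority_error(p, n):
    """P(more than half of n independent passes are wrong at one position)."""
    k_needed = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_needed, n + 1))

for n in (1, 5, 15, 51):
    print(f"{n:>2} passes: per-base consensus error ~ {majority_error(0.10, n):.2e}")
# With a 10% per-pass error rate, a few dozen passes drive the consensus error
# far below the ~1-in-500,000 figure quoted above; real errors are not fully
# independent, but the direction of the effect is the same.
```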

And that has been a growing area of interest in human genetics- variations in structure that lead to extra copies of genes, insertions, deletions, and altered regulatory contexts. It is a frontier that has had to wait for these new techniques, while millions of single nucleotide variants have been piling up. Cancers are notorious, for instance, for being caused by accidental fusions of two distant genes, whereby some developmental or cell cycle gene is put under novel and (usually) high activation by some other gene's regulatory region. Down syndrome is caused by duplication of a whole chromosome, to trisomy. Smaller deletions and duplications have significant effects as well, naturally.

The new paper digs up twice as many structural variants as previous analyses (and does so from only 37 human genomes, compared to the >2000 genomes used by other analyses): 107,590 insertions/deletions over 50 bp in size; 316 inversions; 2.3 million insertions/deletions under 50 bp in size; and 15.8 million single nucleotide variants. Many of these count as normal alleles, of long standing in the human genome, just difficult to piece together previously. In non-gene regions, these variants may have little effect. Per individual, and in comparison to the current reference human genome, they claim to see 24,653 large structural variants, 794,406 small insertions/deletions under 50 bases, and 3,895,274 single nucleotide variants. This is quite a lot in a genome of 3 billion bases, amounting to about 0.1% of all individual positions varying relative to the reference, plus almost a million other re-arrangements, deletions, etc.
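For what it's worth, the summary arithmetic checks out against the per-individual tallies quoted above (nothing here but the numbers already given):

```python
# Simple arithmetic check of the per-individual figures quoted above.
snvs         = 3_895_274
small_indels = 794_406
large_svs    = 24_653
genome_size  = 3_000_000_000

print(f"SNVs as a fraction of the genome: {snvs / genome_size:.2%}")      # ~0.13%
print(f"Other indels and rearrangements:  {small_indels + large_svs:,}")  # ~819,000
```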

An example of one transposon (top) which, the current paper found, has jumped several times in succession: from chromosome 3 to chromosome 1, then from that landing spot to two other locations on chromosome 1 and one spot on chromosome 17. Each jump brought along a bit of extra DNA from the originating locus.

The vast majority of these mutations arose from repair events, where the DNA broke and was then fixed either by repair using the other homolog sequence as a reference (~65% of cases) or by simple blunt-end rejoining, with a few percent coming from errors that happen during replication, particularly of repetitive sequences. Another source of mutation is the movement of mobile genetic elements, which encode their own apparatus of transposition to new locations. These researchers found ~10,000 such elements that were not present or not identified in the human reference genome (because this is what is generally called "junk"). Their detailed data, in comparison to outside references like the chimpanzee genome, allowed them to assess the ages and relationships of these mobile elements. Most are old fossils, no longer active due to mutation. But others have clear and recent lineages, and are still giving rise to mutations, even causing cancers. One can imagine that genome editing could eventually turn these off permanently, reducing one source of cancer and birth defect risk.

Close-up view of one part of chromosome 3, cytological band q29. Even in this small population sample (individual haplotypes listed down the left side, bottom), there is a flurry of structural variations, including inversions and duplications. (CNP = copy number polymorphism.) At top left is a map of genes located here in the reference sequence (hg38). The light arrow shows the direction of transcription, and the heavy vertical lines are the exons. For example, TNK2 is a protein kinase that relays signals inside cells, is active during brain development, and can be an oncogene when activated or overexpressed, as well as having a role in cell entry by some viruses.

An additional analysis looked for trait loci associated with the newly found structural variants. As can be surmised from the sample genomic location diagrammed above, this kind of jumbling of the genome is likely to have functional consequences, either by breaking genes apart, joining them to foreign regulatory regions, or by duplicating or deleting them, in part or whole. The researchers found that roughly half of the structural variants that map to known trait loci (quantitative trait loci, or QTLs) were newly identified in this study. So while the accuracy increase may not seem like a lot, it can have profound consequences.

The count of structural variants that differ by population. Superpopulation (region) count in light color, and population-specific in dark color.

Lastly, this new fund of variation data allows another look at human ancestry. As we know, the bulk of human variation remains in Africa, and that is reflected in structural variation as well as other forms of variation. Populations elsewhere are far less diverse, due to the small groups that founded those populations from the mother-continent, and perhaps also through the new selective pressures that swept those populations, either positively or negatively. Twenty years after the original human genome was published, it continues to be a clinical and research goldmine, but also requires ongoing work to bring to complete accuracy- something this work gets us much closer to.


Saturday, April 10, 2021

We Are Still Poisoning the World

Anthropogenic environmental poisonings, intentional or not.

We have an EPA and long-standing bureaucracies of environmental review, so our environment should be clean, right? Well, rivers may not be burning anymore, but that doesn't mean things are as tidy as they may look. Humanity has proven capable of inventing and selling innumerable chemicals, creating a situation that is far more complex than any kind of precautionary testing or policy making can address. Shocking issues have arisen in recent years that remind us that there is a great deal more to do if we are serious about caring for the biosphere.

6PPD is engineered to react with ozone to become 6PPD-quinone. That protects tire rubber, but kills salmon.

A recent paper showed that the decline of salmon all over the Western US is attributable to a completely unanticipated source. An obscure chemical from automobile and truck tires, N-(1,3-dimethylbutyl)-N'-phenyl-p-phenylenediamine (6PPD), reacts with ozone to form an incredibly toxic compound, which is killing salmon exposed to roadway runoff. This chemical makes up about 1% of tire formulas, and is meant to react with ozone to protect the rest of the rubber in the tire from degradation. The yearly rate of tire rubber degradation and emission is about a pound per person, leaving a great deal of this poison in the environment. So ... does anyone care? A web search for 6PPD yields very little news, despite this being a clear environmental crisis.

Putting the tire problem in perspective. LC50 is the lethal concentration where half of organisms die.


An even more complicated story came up recently from the southeast of the US, where eagles are dying from a mysterious neurological syndrome. It turns out that an invasive water weed, hydrilla, responds vigorously to agricultural nutrient runoff pollution, and accumulates bromine, of all things. This in turn leads to a bloom of a cyanobacterium, Aetokthonos hydrillicola, which grows on the hydrilla and produces, as cyanobacteria are wont to do, various toxins- in this case a highly brominated amino acid derivative that causes the eagles' lethal neurological disease.

AETX, a heavily brominated derivative of the amino acid tryptophan. This is a toxin, causing myelinopathy in eagles after it bio-accumulates in the food chain from ducks and other aquatic browsers that eat the cyanobacteria-infested hydrilla.


But this is just the tip of the iceberg, based on incredibly painstaking work by chemists newly armed with today's analytical tools, looking at particularly dramatic cases of dead wildlife. What about the lead in firearm ammunition, which litters the countryside and shatters into poisonous shrapnel in its targets? What of the intentional poisonings by farmers and ranchers that are killing condors in the Andes? What of the landfills and coal ash heaps, and whatever leaches out of them? And what of the mountains of plastic that are increasingly filling the planet's waterways and oceans? They are not just physical nuisances, but leach out an uncountable array of obscure chemicals.

These are slow-motion Chernobyls, which need to be taken seriously and mitigated by a more precautionary approach to new products, a life-cycle approach to collecting and reprocessing existing products, and more investment in cleaning and protecting the environment.


  • Not quite alive, not quite dead ... what happens after you get that RNA vaccine injection. 
  • Bach, slow and easy.
  • Notes on antitrust.
  • Are we heading towards a "managed" democracy?
  • Carbon tax, now! Or maybe carbon neutral products at slightly higher prices?

Saturday, April 3, 2021

Gears Within Gears Within Gears

What the Antikythera mechanism says about the technology and culture of Western antiquity.

A recent paper has laid out a complete reconstruction of the Antikythera mechanism, an astronomical computer made sometime around 200 to 50 BCE. It is a machine of breathtaking scope and ambition, far beyond what the ancient world was thought capable of- a detailed model of the motions of all the planets of the day, the sun, and the moon (with phases), plus, on the back, detailed predictions of solar eclipses and, in true sports page fashion, schedules for the most popular panhellenic games. All this was available decades in advance, through a true modeling computer that the user could wind through at will, forward or back. As a bonus, an instruction manual was inscribed on the back.

An artist's rendition of the researchers' proposals for the front face of the Antikythera computer. The moon (black and white) revolves around the earth at center, with phases reflected in its rotation. Outer rings and pointers successively portray the Sun, Mercury, Venus, the Earth date, Mars, Jupiter, and Saturn. A knob at the side would have allowed the user to run it.

To understand the detailed mechanisms and proposals of this research group, watch their video. No one would have known about this technology had not some sponge divers found a ~75 BCE wreck off the coast of Antikythera in 1900. One of the many artifacts was a lump of bronze, with a clearly mechanism-like structure but heavily corroded. Indeed only a third of the original was ever found, and it has taken all this time, including recent intensive X-ray examinations of its inner workings and inscriptions, to figure out its full splendor.

A partial rendition of the computer's inner workings, with gears labeled by their numbers of teeth. Eccentric motions were provided by bars connected to gear-mounted pins offset from the gear's pivot center.

One of the more puzzling aspects of this device, on a purely technological level, is how the precise machining was done, all in bronze. The front face displays at least nine separately moving dials or pointers, each driven by one of a set of nested tubes making up the central shaft of the computer. Could the Hellenistic Greeks cast bronze to this kind of uniformity and thinness? Or did they have lathes with such precision? What can be assumed is that this mechanism is not alone, and must have been the product of an ongoing tradition of precision device manufacture- a long technological evolution that accumulated the numerous ingenious solutions and remarkable miniaturization evident in this device. How could such a tradition have otherwise so thoroughly evaded historians and archeologists? And what became of this tradition into Roman times?

The astronomy that this device is based on has its roots with the Babylonians, who were avid readers of the skies and their many cycles. The gears within accord with various grand cycles with which key events, like the position of the moon or planets, recur with regularity. For example, the Saros cycle is the period after which the Earth, Moon, and Sun line up again for an eclipse, recurring every 18 years and 11 1/3 days. The gearing in the Antikythera mechanism goes through five different pairings to arrive at the 940/4237 ratio between turns of the Saros pointer and turns of the yearly drive, so that the pointer sweeps its four-turn spiral dial once per 18-year cycle. The Greeks naturally contributed their own astronomical theories, such as a conviction in the regularity and circularity of the planetary orbits, despite their wayward motion.
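The arithmetic behind that ratio can be reconstructed from the two cycles involved- a sketch of the ratios, not of the actual gear train. The Metonic relation gives 235 lunar (synodic) months per 19 years, and the Saros dial is a four-turn spiral covering 223 months, so the Saros pointer should advance 4/223 of a revolution per month:

```python
# Reconstructing the Saros ratio quoted above from the underlying cycles.
from fractions import Fraction

months_per_year   = Fraction(235, 19)   # Metonic cycle: 235 lunar months = 19 years
pointer_per_month = Fraction(4, 223)    # Saros dial: 4 pointer turns = 223 months

pointer_turns_per_year = months_per_year * pointer_per_month
print(pointer_turns_per_year)              # 940/4237, the ratio the gear train realizes
print(float(4 / pointer_turns_per_year))   # ~18.03 years for one sweep of the 4-turn spiral
```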

All this tells us strongly that these ancient people were every bit as inventive and thoughtful as we are. But they had very different cultural conceptions and resources to work with. The most frustrating aspect to this amazing story is why this inventiveness did not lead to a more general technological revolution, instead sputtering out with the fall of Rome and the fallow Dark Ages, before technological development resumed at a high level during the later Middle Ages.

I think the answer needs to be put down to the class relations and nature of work in the ancient world. Capitalism certainly was not lacking. Antiquity was just as capitalistic as modern times, with an extremely free business sector able to finance wide-ranging trade and manufacturing operations, and merchants occupying pride of place in Greek and Roman fora- the malls of their time. The story of Crassus extorting Romans out of their burning properties in return for fighting the fire tells you all you need to know about the nature of capitalism in those times. It was red in tooth and claw. But that was not enough to foster technological development on a broad basis.

Rather, slavery and vast inequality made work a degraded, mean affair, beneath the dignity of aristocrats. Their minds were on government, law, military conquest, art, and philosophy. Anything but practical affairs of efficiency, improved production, and technological advancement. Work was secondary to power relations- the essence of a slave society. We have seen this in the Southern culture of the US under slavery. It was obsessed with pursuit of "refinements" and honor. Even though labor was expensive in the form of slaves, and its management a social imperative of the highest order, the idea of supplementing it or replacing it via technology does not seem to have been a high priority. The cotton gin was invented, not by a Southerner, but by Eli Whitney of Massachusetts.

It is a general problem of highly unequal societies that the maintenance (and defense) of inequality takes up a large part of the society's mental space, particularly among its educated elite, crowding out the motivations that, in a more egalitarian society- where all participate in work and all are eager to adopt improved methods in the work they share- promote the development and propagation of technological advancements. (Think of the revolutions in mechanized agriculture in midwestern America in the late 1800s.) In scholarship as well, the segregation of abstract philosophy and other written forms and stores of knowledge into ivory towers, as was common in Hellenistic culture, reflected the same cultural stratification and the same lack of concern with the day-to-day drudgery that formed the rather static basis of economic existence.