Saturday, May 14, 2016

Dissection of an Enhancer

Enhancers provide complex, combinatorial control of gene expression in eukaryotes. Can we get past the cartoons?

How can humans get away with having no more genes than a nematode or a potato? It isn't about size, but how you use what you've got. And eukaryotes use their genes with exquisite subtlety, controlling them from DNA sequences called enhancers that can be up to a million base pairs away. Over the eons, countless layers of regulatory complexity have piled onto the gene expression system, more elements of which come to light every year. But the most powerful control over genes comes from these modular cassettes peppered over the local DNA, to which regulatory proteins bind to form complexes that can either activate or repress expression. These proteins are themselves expressed from yet other genes, whose own regulation forms a complex network, or cascade, of control.

When genome sequencing progressed to the question of what makes people different, and especially what accounts for differences in disease susceptibility, researchers quickly came up with a large number of mutations from GWAS, or genome-wide association studies, in data from large populations. But these mutations gave little insight into the diseases of interest, because the effect of each one was very weak- otherwise the populations studied would have been afflicted rather than, as they typically were, normal. A slight change in disease susceptibility coming from a mutation somewhere in the genome is not likely to be informative until we have a much more thorough understanding of the biological pathway of that disease.

This is one reason why biology is still going on, a decade and a half after the human genome was sequenced. The weak-effect mutations noted above often lie far away from any gene, and figuring out what they do is difficult- because of their weakness, because of their often uninformative position, and because of the complexity of disease pathways and the relevant environmental effects.

Part of the problem comes down to a need to understand enhancers better, since they play such an important role in gene expression. Many sequencing projects study only the exome, which comprises the protein-coding bits of the genome, and thus ignore regulatory regions completely. But even when the entire genome is studied, enhancers are maddening subjects, since they are so darned degenerate- a technical term for being under-specified, with lots of noise in the data. DNA-binding proteins tend to bind short sites, typically seven to ten nucleotides, of quite variable composition. And if helped by a neighboring protein, they may bind a quite different site altogether. Such short sequences are naturally very common around the genome, so which ones are real, and which are decoys, among the tens or hundreds of thousands of base pairs around a gene? Who knows?
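This degeneracy can be made concrete with a toy motif scan. The position weight matrix below is entirely made up (real matrices come from measured binding data, e.g. for Dorsal or Twist), but it shows the standard log-odds scoring approach, and why a seven-base motif with fuzzy positions turns up matches all over a genome:

```python
import math

# Hypothetical position weight matrix for a short, degenerate binding
# site: one dict of base probabilities per motif position. These numbers
# are illustrative only, not from any real regulator.
PWM = [
    {'A': 0.1, 'C': 0.1, 'G': 0.7, 'T': 0.1},
    {'A': 0.1, 'C': 0.1, 'G': 0.7, 'T': 0.1},
    {'A': 0.4, 'C': 0.1, 'G': 0.4, 'T': 0.1},  # degenerate: A or G
    {'A': 0.7, 'C': 0.1, 'G': 0.1, 'T': 0.1},
    {'A': 0.1, 'C': 0.1, 'G': 0.1, 'T': 0.7},
    {'A': 0.1, 'C': 0.1, 'G': 0.1, 'T': 0.7},
    {'A': 0.1, 'C': 0.7, 'G': 0.1, 'T': 0.1},
]
BACKGROUND = 0.25  # uniform background base frequency

def score(window):
    """Log-odds score of a window against the motif vs background."""
    return sum(math.log2(col[base] / BACKGROUND)
               for col, base in zip(PWM, window))

def scan(sequence, threshold=4.0):
    """Return (position, window, score) for all windows above threshold."""
    w = len(PWM)
    hits = []
    for i in range(len(sequence) - w + 1):
        s = score(sequence[i:i + w])
        if s >= threshold:
            hits.append((i, sequence[i:i + w], round(s, 2)))
    return hits

seq = "ATGGGATTCTACGGAATTCCA"
print(scan(seq))  # two distinct windows score equally well
```

Even in this 21-base toy sequence, two different windows match the motif equally well; over tens of thousands of base pairs, matches this strong occur by chance constantly, which is why sequence alone cannot say which sites are functional.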

Thus molecular biologists have been content to do rather crude analyses: deleting pieces of DNA around a specific gene, measuring the target gene's expression, and marking off sites of repression and enhancement from those results. Then they present a cartoon:

The Drosophila Runt locus, with its various control regions (enhancers) mapped out at top on the genomic locus, and below, the proto-segmental stripes in the embryo within which each enhancer contributes to activating expression. The locus spans 80,000 base pairs, of which the coding region is the tiny set of exons marked in blue at top with "Run".

This is a huge leap of knowledge, but it is hardly the kind of quantitative data that allows computational prediction and modeling of biology throughout the relevant regulatory pathways, let alone for other genes to which some of the same regulatory proteins bind. That would require a whole other level of data about protein-DNA binding propensities, effects from other interacting proteins, and the like, put on a quantitative basis. Which is what a recent paper begins to do.
"The rhomboid (rho) enhancer directs gene expression in the presumptive neuroectoderm under the control of the activator Dorsal, a homolog of NF-κB. The Twist activator and Snail repressor provide additional essential inputs"
A Drosophila early embryo, stained for gene expression of Rhomboid, in red. The expression patterns of the regulators Even-skipped (stripes) and Snail (ventral, or left) are both stained in green. The dorsal (back) direction is right, ventral (belly) is left, and the graph is of Rhomboid expression over the ventral->dorsal axis. The enhancer of the Rhomboid gene shown at top has its individual regulator sites colored as green (Dorsal), red (Snail) and yellow (Twist). 

Their analysis focused on one enhancer of one gene, the Rhomboid gene of the fruit fly, which directs embryonic gene expression just dorsal to the midline, shown above in red. The Snail regulator is a repressor of transcription, while Dorsal and Twist are both activators. A few examples of deleting some of these sites are shown below, along with plots of Rhomboid expression along the ventral/dorsal axis.

Individual regulator binding sites within the Rhomboid enhancer (B, boxes), featuring different site models (A) for each regulator. The fact that one regulator such as Dorsal can bind to widely divergent sites, such as DL1 and DL2/3, suggests the difficulty of finding such sites computationally in the genome. B shows how well the models match the actual sequence at sites known to be bound by the respective regulators.

Plots of ventral->dorsal expression of Rhomboid after various mutations of its Dorsal/Twist/Snail enhancer. Black is the wild-type case, blue is the mutant data, and red is the standard error.

It is evident that the Snail sites, especially the middle one, play an important role in restricting Rhomboid expression to the dorsal side of the embryo. This makes sense from the region of Snail expression shown previously, which is restricted to the ventral side, and from Snail's activity, which is repression of transcription.
"Mutation of any single Dorsal or Twist activator binding site resulted in a measurable reduction of peak intensity and retraction of the rho stripe from the dorsal region, where activators Dorsal and Twist are present in limiting concentrations. Strikingly, despite the differences in predicted binding affinities and relative positions of the motifs, the elimination of any site individually had similar quantitative effects, reducing gene expression to approximately 60% of the peak wild-type level"

However, when they removed pairs of sites and other combinations, the effects became dramatically non-linear, necessitating more complex modelling. In all, they tested 38 variants of this one enhancer by removing various sites, and generated 120 candidate models (using a machine learning system) of how the sites might cooperate in various non-linear ways.
"Best overall fits were observed using a model with cooperativity values parameterized in three 'bins' of 60 bp (scheme C14) and quenching in four small 25 or 35 bp bins (schemes Q5 and Q6)."
Example of data from some models (Y-axis) run on each of the 38 enhancer mutants (X-axis). Blue indicates a better fit between model and data.

What they found was that each factor needed to be modelled a bit differently. The cooperativity of the Snail repressor was quite small: while its (four) different sites differ in their effect on expression, they seem to act independently. In contrast, the activators were quite cooperative, an effect that was essentially unlimited in distance, at least over the local enhancer. Whether cooperation can extend to other enhancer modules, of which there can be many, is an interesting question.
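The flavor of such models can be sketched in a few lines. The functional forms and parameter values below are illustrative stand-ins, not the paper's fitted parameters, but they capture the two ingredients just described: cooperativity among bound activators, and short-range quenching by a bound repressor:

```python
# Toy thermodynamic-style model of enhancer output (illustrative only).
# Each site: (position_bp, kind, binding_weight), where binding_weight
# stands in for regulator concentration x site affinity.
sites = [
    (10, 'activator', 1.5),   # e.g. a Dorsal-like site
    (60, 'activator', 1.0),   # e.g. a Twist-like site
    (85, 'repressor', 2.0),   # e.g. a Snail-like site
]

COOP = 2.0          # cooperativity bonus per additional bound activator
QUENCH_RANGE = 35   # bp within which a repressor quenches an activator
QUENCH = 0.8        # fraction of an activator's input removed when quenched

def expression(sites):
    """Sum activator inputs (occupancy boosted by cooperativity), then
    apply short-range quenching from any nearby bound repressor."""
    activators = [(p, w) for p, k, w in sites if k == 'activator']
    repressors = [(p, w) for p, k, w in sites if k == 'repressor']
    n_act = len(activators)
    total = 0.0
    for pos, w in activators:
        occ = w / (1.0 + w)                       # saturating occupancy
        occ = min(1.0, occ * (1 + COOP * (n_act - 1) * 0.1))  # cooperativity
        for rpos, rw in repressors:               # distance-limited quenching
            if abs(pos - rpos) <= QUENCH_RANGE:
                occ *= 1 - QUENCH * (rw / (1 + rw))
        total += occ
    return total

wild_type = expression(sites)
no_snail = expression([s for s in sites if s[1] != 'repressor'])
print(wild_type < no_snail)  # removing the repressor raises output
```

The real models are far richer (binned cooperativity parameters, sequence-derived affinities, spatially varying regulator concentrations), but the non-linearity already shows up here: deleting a site changes not just its own term but the cooperativity and quenching felt by its neighbors.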

The proof of their pudding was the extension of their models to other enhancers: using the best models they came up with, in a general form, to predict expression from other enhancers that share the same regulators.

Four other enhancers (Ventral nervous system defective [vnd], Twist, and Rhomboid from two other species of Drosophila) are scored for modeled expression (red) over the dorsal-ventral axis, with actual expression in black.

The modeling turns out pretty decent, though half the cases are the same Rhomboid gene enhancer from related Drosophila species, which do not present a very difficult test. Could this model be extended to other regulators? Can their conclusion about the cooperativity of repressors vs activators be generalized? Probably not, or not very strongly. It is likely that similar studies would need to be carried out for most major classes of regulators to accumulate the basic data that would allow more general and useful prediction.

And that leaves the problem of finding the sites themselves, which this paper didn't deal with, but which is increasingly addressable with modern genomic technologies. There is a great deal yet to do! This work is a small example of the increasing use of modeling in biology, and the field's tip-toeing progress towards computability.

  • Seminar on the genetics of Parkinson's.
  • Whence conservatism?
  • Krugman on the phony problem of the debt.
  • Did the medievals have more monetary flexibility?
  • A man for our time: Hume, who spent his own time in "theological lying".
  • Jefferson's moral economics.
  • Trump may be an idiot, just not a complete idiot.
  • Obama and Wall Street, cont...
  • The deal is working.. a little progress in Iran.
  • More annals of pay for performance.
  • Corruption at the core of national security.
  • China's investment boom came from captive savings, i.e. state financial control.

Saturday, May 7, 2016

A Son of Hamas Turns His Back

Review of the documentary, the Green Prince. Spoiler alert.

In one of the more bizarre twists of the Palestinian drama, the son of a Hamas leader turned into a tireless worker for the Shin Bet from about 1997 to 2007. Now he lives in the US, at undisclosed locations. This film is essentially a memoir of this story, with two people talking to the camera: Mosab Hassan Yousef, the son, and Gonen Ben Yitzhak, his Israeli intelligence handler.

The format was oddly compelling, because the people are compelling- intelligent and dedicated. But to what? Yousef was raised in the West Bank, the eldest son in a leading family, and became his father's right hand. His father was one of the main people you would hear screaming on the news, preaching publicly about the evils of Israel, the righteousness of Islam and the Intifada, and the need for Hamas to run things in the West Bank as well as Gaza. As Hamas goes, he was not the most extreme, but neither was he a member of the Palestinian Authority- the Palestinian patsies.

Father Hassan Yousef at a Hamas rally.

So turning to the Shin Bet was unthinkable in tribal terms. But when Yousef had his first experience in prison, courtesy of an Israeli checkpoint where he was found with some guns, he had a chance to compare tribes. While the Israelis were harsh, they had limits and operated under some kind of lawful system.

The Hamas cell in the prison, however, was brutally sadistic. Yousef describes the killing of scores of putative spies and informants in horrific fashion, with scant evidence. For an idealistic youth, it presented a problem, especially in contrast to the idealized version of the Palestinian cause that he had grown up with. Where at first he didn't take the offer from the Shin Bet seriously, now he had second thoughts. What if his idealism was more about non-violence, peace, and saving lives than about tribal competition?

There follows a lengthy career relaying information from his position at the center of Hamas with his father to the core of Shin Bet, preventing attacks, preventing assassinations, and also, in essence, dictating his father's fate. A central conundrum of intelligence work like this is how to use the informant's information without giving away his or her identity. To maintain Yousef's cover for a decade bespeaks very careful work on all sides.

But the larger issue remains untouched. While Yousef comes off as heroic and idealistic, the Israeli occupation of the West Bank is no more justified by Israel's lawful and partial restraint (or by its relentless stealing of land) than it is by the bottomless resentment and madness of Hamas. Treat people like prisoners and animals, and they often act that way. Moreover, Israel holds total control. They need no "partners" to resolve their doomed and immoral occupation. They only need to get out, and get their settlers out.


  • Muslims are screwing up the Netherlands and Europe generally.
  • Obama and Wall Street. Next showing: Hillary and Wall Street.
  • Do Republicans know anything about growth?
  • The Saudis are hurting.
  • Another business that "cares" for its customers.
  • Another case of pay for performance.
  • Non-competes run amok. "The Treasury Department has found that one in seven Americans earning less than $40,000 a year is subject to a non-compete. This is astonishing, and shows how easily businesses abuse their power over employees."
  • Our medical system is so dysfunctional and complex that error is third leading cause of death.
  • It almost makes you nostalgic for Richard Nixon.
  • Feel the heart, and the Bern.
  • Deflation and below-target monetary growth is a policy mistake.
  • Will extreme Christians let go of politics, at long last?
  • A little brilliant parenting.

Sunday, May 1, 2016

Audio Perception and Oscillation

Brains are reality modeling machines, which isolate surprising events for our protection and delectation. Does music have to be perpetually surprising, to be heard?

Imagine the most boring thing imaginable. Is it sensory deprivation? More likely it will be something more active, like a droning lecturer, a chattering relative, or driving in jammed traffic. Meditation can actually be very exciting (just think of Proust!), and sensory deprivation generates fascinating thought patterns and ideas. LSD and similar drugs heighten such internal experiences to the point that they can become life-altering. Which indicates an interesting thing about the nature of attention- that it is a precious resource that feels abused not when it is let loose, but when it is confined to some task we are not interested in, and particularly, one we are learning nothing from.

Music exists, obviously, not to bore us but to engage us on many levels, from the physical to the meditative and profound. Yet it is fundamentally based on the beat, which would seem a potentially boring structure. Beats alone can be music, hypnotically engaging, but typically the real business of music is to weave around the beat fascinating patterns whose charm lies in a tension between surprise and musical sense, such as orderly key shifts and coherent melody.

Why is all this attractive? Our brains are always looking ahead, forecasting what comes next. Their first rule is ... be prepared! Perception is a blend of getting new data from the environment and fitting it into models of what should be there. This has several virtues. First, it provides understanding, since only by mapping to structured models of reality are new data understandable. Second, it reduces the amount of data processing, since only changes need to be attended to. And third, it focuses effort on changing or potentially changing data, which are naturally what we need to be paying attention to anyhow ... the stuff about the world that is not boring.

"Predictive coding is a popular account of perception, in which internal representations generate predictions about upcoming sensory input, characterized by their mean and precision (inverse variance). Sensory information is processed hierarchically, with backward connections conveying predictions, and forward connections conveying violations of these predictions, namely prediction errors." 
"It is thus hypothesised that superficial cell populations calculate prediction errors, manifest as gamma-band oscillations (>30 Hz), and pass these to higher brain areas, while deep cell populations [of cortical columns] encode predictions, which manifest as beta band oscillations (12–30 Hz) and pass these to lower brain areas." 
"In the present study, we sought to dissociate and expose the neural signatures of four key variables in predictive coding and other generative accounts of perception, namely surprise, prediction error, prediction change and prediction precision. Here, prediction error refers to absolute deviation of a sensory event from the mean of the prior prediction (which does not take into account the precision of the prediction). We hypothesised that surprise (over and above prediction error) would correlate with gamma oscillations, and prediction change with beta oscillations."

A recent paper (and review) looked at how the brain perceives sound, particularly how it computes the novelty of a sound relative to an internal prediction. Prediction in the brain is known to resemble a Bayesian process, in which new information is constantly added to adjust an evolving model.
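For a Gaussian belief about tone frequency, such Bayesian updating takes a very compact form; the numbers below are illustrative, not the paper's:

```python
# Conjugate Gaussian update of a belief about tone frequency (Hz).
def update(prior_mean, prior_var, obs, obs_var):
    """Return posterior mean and variance after one observation."""
    k = prior_var / (prior_var + obs_var)   # Kalman-style gain
    post_mean = prior_mean + k * (obs - prior_mean)
    post_var = (1 - k) * prior_var
    return post_mean, post_var

mean, var = 440.0, 100.0          # prior belief: ~440 Hz, fairly uncertain
for tone in [452.0, 448.0, 455.0]:
    mean, var = update(mean, var, tone, obs_var=50.0)
    print(round(mean, 1), round(var, 1))
```

Each tone pulls the predicted mean toward the data while the variance shrinks- precision grows- which is exactly the pair of quantities (mean and precision) that the predictive coding account says the brain tracks.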

The researchers circumvented the problems of low-resolution fMRI imaging by working with volunteers undergoing brain surgery for epilepsy, who allowed the researchers to study parts of their brains- the auditory cortex- for purposes completely unrelated to their medical needs. They allowed the researchers not only to record from the surfaces of their brains, but also to insert electrodes into their auditory cortexes to sample the cortical layers at various depths. It is well known that the large sheet of the cortex does significantly different things in its different layers.

Frequencies of tones (dots) played to experimental subjects, over time.

The three subjects were played a series of tones at different frequencies, and had to do nothing in return- no task at all. The experiment was merely to record the brain's own responses at different positions and levels of the auditory cortex, paying attention to the various frequencies of oscillating electrical activity. The point of the study was to compare the data coming out with statistical models that they generated separately from the same stimuli- ideal models of Bayesian inference for what one would expect to hear next, given the sequence so far.

Electrode positions within the auditory areas of the subjects' brains.

Unfortunately, their stimulus was not quite musical, but followed a rather dull algorithm: "For each successive segment, there is a 7/8 chance that that segment’s f [frequency] value will be randomly drawn from the present population, and a 1/8 chance that the present population will be replaced, with new μ [mean frequency] and σ [standard deviation of the frequency] values drawn from uniform distributions."
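In code, that generating process is only a few lines. The uniform ranges below are assumptions, since the paper's exact bounds are not quoted here:

```python
import random

# Sketch of the stimulus algorithm: each tone's frequency is drawn from
# a Gaussian "population"; with probability 1/8 the population itself is
# replaced with new mean and spread. Ranges are assumed, not the paper's.
def tone_sequence(n, seed=0):
    rng = random.Random(seed)
    mu = rng.uniform(500, 2000)      # population mean frequency (Hz)
    sigma = rng.uniform(20, 200)     # population spread (Hz)
    tones = []
    for _ in range(n):
        if rng.random() < 1 / 8:     # 1/8 chance: replace the population
            mu = rng.uniform(500, 2000)
            sigma = rng.uniform(20, 200)
        tones.append(rng.gauss(mu, sigma))
    return tones

seq = tone_sequence(50)
print(len(seq), round(min(seq)), round(max(seq)))
```

Dull as it sounds, this design is statistically convenient: a listener (or model) must continually re-estimate both the mean and the spread, and occasionally detect that the whole population has changed- which is what lets surprise, prediction error, change, and precision be teased apart.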

Correlations were then calculated between the observed and predicted signals, giving data like the following:

Prediction error and surprise are closely correlated, but the experimenters claim that surprise is better correlated with the gamma-band brain waves observed (B).

The difference between observation and prediction, and between surprise and prediction error. Surprise takes into account the spread of the data, i.e. whether uncertainty has changed as well as the mean predicted value.

What they found was that, as others have observed, the highest-frequency oscillations in the brain correlate with novelty- surprise about how perceptions are lining up with expectations. The experimenters' surprise (S) measure and prediction error (Xi) are very closely related, so both correlate with each other and with the gamma-wave signal. The surprise measure is slightly better correlated, however.
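The distinction between the two measures is easy to state in code: prediction error ignores the predicted spread, while surprise- negative log probability under the predictive distribution- depends on it, so the same deviation is more surprising when the prediction was tight:

```python
import math

# Prediction error vs surprise for a Gaussian prediction N(mu, sigma^2).
def prediction_error(x, mu, sigma):
    return abs(x - mu)              # sigma unused: spread is ignored

def surprise(x, mu, sigma):
    # -log N(x | mu, sigma^2), in nats
    return 0.5 * math.log(2 * math.pi * sigma**2) + (x - mu)**2 / (2 * sigma**2)

# Same 20 Hz deviation, different predicted precision:
x, mu = 460.0, 440.0
print(prediction_error(x, mu, 10.0) == prediction_error(x, mu, 40.0))  # True
print(surprise(x, mu, 10.0) > surprise(x, mu, 40.0))                   # True
```

This is why the two signals are nearly, but not perfectly, correlated in the data: they diverge exactly when the predicted precision changes.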

On the other hand, they observed that beta oscillations (~20 Hz) were correlated with changes in the predicted values. They hypothesized that beta oscillations are directed downward in the processing system, to shape and update the predictions being used at the prior levels.

Lastly, they find that the ~10 Hz alpha oscillations (and related bands) correlate with the uncertainty or precision of the predicted values. And theta oscillations at ~6 Hz were entrained to the sound stimulus itself, hitting when the next sound was expected, rather than encoding a derived form of the stimulus.

It is all a bit neat, and the conclusions are dredged out of very small signals, as far as is shown. But the idea that key variables of cognition and data processing are separated into different oscillatory bands in the auditory cortex is very attractive, has quite a bit of precedent, and is certainly an hypothesis that can and should be pursued by others in greater depth. The computational apparatus of the brain is very slowly coming clear.
"These are exciting times for researchers working on neural oscillations because a framework that describes their specific contributions to perception is finally emerging. In short, the idea is that comparatively slow neural oscillations, known as “alpha” and “beta” oscillations, encode the predictions made by the nervous system. Therefore, alpha and beta oscillations do not communicate sensory information per se; rather, they modulate the sensory information that is relayed to the brain. Faster “gamma” oscillations, on the other hand, are thought to convey the degree of surprise triggered by a given sound."

  • Bill Mitchell on the Juncker regime.
  • Who exactly is corrupt in Brazil, and how much?
  • There are too many people.
  • But not enough debt.
  • The fiscal "multiplier" is not constant.
  • Population has outstripped our willingness to build and develop.
  • What's going on in the doctor's strike?
  • Schiller on lying in business, Gresham's dynamics, and marketing.
  • Lying in religion.
  • Stiglitz on economics: "The strange thing about the economics profession over the last 35 year is that there has been two strands: One very strongly focusing on the limitations of the market, and then another saying how wonderful markets were."
  • Should banks be public institutions?
  • Does democratic socialism have a future in Russia?
  • A Sandersian / Keynesian stimulus is only effective if the Fed plays along.
  • Science yearns to be free.
  • Trump's brush with bankruptcy and friends in high places.

Saturday, April 23, 2016

Locating Abstractions in the Brain

The most human part of the brain is also the murkiest and least understood. Visualization studies of what is going on in the frontal cortex.

While it was in vogue, the lobotomy operation was used to treat in the neighborhood of 100,000 people in the mid twentieth century, rendering them more manageable- something that has since been more easily achieved with drugs. From the Wiki page:
"The purpose of the operation was to reduce the symptoms of mental disorder, and it was recognized that this was accomplished at the expense of a person's personality and intellect. British psychiatrist Maurice Partridge, who conducted a follow-up study of 300 patients, said that the treatment achieved its effects by 'reducing the complexity of psychic life'. Following the operation, spontaneity, responsiveness, self-awareness and self-control were reduced. Activity was replaced by inertia, and people were left emotionally blunted and restricted in their intellectual range."

What is odd is that for such a massive disruption to the brain, the effects were diffuse and hard to understand (though in fairness, the methods used were hardly uniform). "The first bilateral lobectomy of a human subject was performed by the American neurosurgeon Walter Dandy in 1930. The neurologist Richard Brickner reported on this case in 1932, relating that the recipient, known as 'Patient A', while experiencing a flattening of affect, had suffered no apparent decrease in intellectual function and seemed, at least to the casual observer, perfectly normal."

Some effects were that subjects no longer dreamed; they also lost their theory of mind, or the ability to empathize with others. Some entered a stupor or started suffering seizures. There were various intellectual and personality deficits- one became a "smiling, lazy and satisfactory patient with the personality of an oyster". Five percent died. One subject recalled:
"It took a great deal of effort to keep an abstraction in mind. For example, in talking with the speech therapist I would begin to give a definition of an abstract concern, but as I held it in mind it would sort of fade, and chances were that I'd end up giving a simplified version rather than one at the original level of conception. It was as though giving an abstraction required so much of my addled intelligence that halfway through the definition I would run out of the energy available to me and regress to a more concrete answer. Something like this happened again and again."

An irony is that the Soviet Union took the lead in banning the procedure, "Doctors in the Soviet Union concluded that the procedure was 'contrary to the principles of humanity' and 'through lobotomy an insane person is changed into an idiot.'"

Modern brain scanning allows researchers to peer into the frontal lobes and start figuring out what goes on there. A recent paper described some early work in that direction, devising simple tasks to differentiate levels of abstract thought and mapping where they happen, using fMRI. The researchers managed to map separate zones in the frontal cortex that handle temporal (time-shifting) abstractions, category-switching abstractions, and feature-attention control.

The subjects were presented with points that, over several frames, added up to a diagram (C): a star with letters on the outside, with a color applied. Several rules were imposed, such as: if the color was purple, the letters were to be assembled into a word across the star (TABLET, in this case); if the color was orange, the subject was to just trace the points of the star with her eyes. Then delay rules were added, asking whether the trial was the same type or a different type than the one before. Or the subject was given a new diagram but asked to maintain their place in the old diagram, to be recalled later. Then distraction periods were added in between to test for memory retention. It all begins to look like an intelligence test of the subject's ability to keep ideas and rules in mind successfully.

Test design, in part. C shows the basic image presented to the subject, which would have included color as well, and varied in the shape and text presented. The points of the star were not presented all at once, but fed out one point at a time. B shows the combined tests that were devised. For instance, the restart test asked the subject not to delay their analysis, but presented them with a new diagram and asked them to resolve the color and text by the agreed rules.

The tests were designed to separate three topics of thought, and were added together in various combinations to allow the researchers to run combinatorial tests. The upshot was that they were able to map the three tasks to different parts of the frontal cortex:

Distinct mappings of each task to its region. Handling time delay and abstraction occupies the very front of the brain (rostral), while simpler abstractions- keeping track of the local context of a task, or attending to selected features of an image/task- occupy precincts farther back (caudal). This is in addition to separate zones in the mid-brain.
"Regressing these measures onto activation revealed a clear gradient such that caudal LPFC [lateral prefrontal cortex] was related to current, but not future processing, while rostral LPFC was related to future, but not current processing, with mid LPFC showing activity related to both current and future processing "


They end up with a beautiful depiction of the regions of the brain where their various tasks took place. Unfortunately, fMRI imaging technology remains very crude, in time and space, so their task breakdown was similarly crude to suit. It will probably take new technology to go to deeper detail on what is going on in the human frontal cortex- the part of the brain most responsible for making us human, but also, since it handles abstractions farthest from detailed concrete processing, the most nebulous and hard to define.

  • Inequality isn't just a bleeding heart issue, but an investment and prosperity issue.
  • Solow on labor power and inequality.
  • Tax complexity isn't entirely the government's fault, but another dividend of corruption.
  • Retirement is another big front in the inequality debate.
  • Utopia now and then.
  • Globalization is a problem.
  • Some problems with supply side theory. Perhaps taxes make people work harder.
  • Pay is a complicated construct.
  • We need more debt.
  • But perhaps less bail.

Saturday, April 16, 2016

Euhemerization

People making gods, as usual- and the mythical nature of Jesus.

All aspects of the existence and nature of Jesus are a matter of theory, not fact. So much of the early literature about him is forged, made-up, laced by myth and parable, and templated by religious traditions, philosophical preconceptions and political exigencies, that the nature of (or existence of) the actual, historical Jesus is a matter of speculation and inference at best.

Bart Ehrman wrote an exasperated book about the evidence for the historical Jesus, affirming, despite his own lack of conventional faith, and through his dedicated scholarship in the field, that the consensus position of Christians and scholars is correct. The problem of the thinness of the evidence remains, however, since all of it comes from internal (Christian) and late (not contemporaneous) sources. This is not unusual or unexpected for a Roman of this era, outside the very highest levels of emperors and writers, but it hardly allows a solid case either pro or con. A great deal turns, for instance, on one's interpretation of the word "brother", since Paul, in letters that are widely agreed to be reasonably authentic, refers to James as a brother of Jesus. If this means a biological brother, then Jesus, by this chain of evidence, really existed. Whether his mother was a perpetual virgin is another matter, of course! Or was James a spiritual brother, as the common usage has been in many religious communities? Ehrman, as an expert, comes down clearly on the biological side.

Myth, or just mythic?


Both cases, for and against the historicity of Jesus, are thus circumstantial, based on the credibility of scraps of evidence, or the credibility of a counter-story elaborated by the mythicists, where Jesus begins as a deity who is brought down to earth (euhemerized) for a variety of motives that are quite understandable, and precedented by similar gods and god-men before and since. Casting one's god as a real person makes the provenance and stability of his teachings more secure than that of a deity that communicates through revelation, and could do so again at any time. And stories are easy to make up and write down. A recent talk by Richard Carrier makes this case with gusto.

I am not going to rehash the arguments here, but only say that the pro-historical case, while certainly traditional, popular, and even likely, is, even by Bart Ehrman's telling, hung on very thin threads of internal evidence- on texts whose transmission to us is an endless story of copying, re-copying, correction, obfuscation, politics, and forgery. The early Christian era is a fascinating period of political and archetypal turmoil. No path is straight, least of all that of the texts that purport to tell the story. Take, for instance, the case of Marcion, who supposedly collected the letters of Paul and devised the first Christian canon. Marcion is thought to have written a good bit of it himself, and founded a theology that was very popular in its day, only to ripen into heresy later on at the hands of what comes down to us as orthodoxy.

The project of making Christianity's hodge-podge of scriptures fit the orthodox story as it evolved through the centuries is mind-bogglingly complicated and obviously ongoing, given the many versions of the Bible and of Christianity that are still running around. The process is reminiscent of the paradox of Islam, where those who take its origins and scripture most seriously are the most righteous and violent, whereas those who merge into more mature traditions, as they ripened through time into human, and typically humane, institutions, are much more resistant to the fundamentalist call.

Getting back to the foundations, what is the precedent for euhemerization such as what happened to the person or entity we call Jesus? And for its complement, apotheosis? These days, the traffic between heaven and earth has hit some kind of traffic jam. But in antiquity, it was far more common for people such as kings and emperors to become gods, and also for gods to come down to earth, in tales such as the Homeric epics. Divinity was assumed to exist, and divine beings were pretty much formed in the image of ourselves, at our most powerful. Both the Jewish god(s) and the Greek gods were distinguished by their power much more than their knowledge, let alone their emotional wisdom or kindness.

Even farther back, the template is of course the family, and the trauma of death. The death of any person, let alone a powerful, archetypal person like a parent, is unimaginable. How can life stop cold, how can existence simply end? Impossible. We have thus come up with a rich set of rationalizations and theologies of additional existence. They typically involve the movement of people (souls) from this world to some other invisible world, where they look back with fondness to what is still the important place, our world.

But then comes the important question of whether and how this spiritual world, if it is to have any ongoing function for us, interacts with ours. Our souls clearly have some modus operandi by which they co-function with our living bodies, mortal though they are. Likewise, spirits and gods must have some way back into the world if we wish to involve them in our dramas. Thus we end up with a rich literature of heroic journeys to heaven (or the underworld) and back, gods taking up disguises as women or men (or animals), throwing thunderbolts, causing natural cataclysms, etc.

It is only the higher psychological and philosophical sophistication of our age that has slowed down this traffic, though it peeks out of our unconscious in the endless array of super-hero movies, not to mention a majority of the country that still holds fast to some version of the traditional theological stories.

Let us close with a couple of quotes from Thomas Paine speaking of the Christian believer, vs a true deist, from his deist book, "The Age of Reason":
"Yet, with all this strange appearance of humility, and this contempt for human reason, he ventures into the boldest presumptions. He finds fault with everything. His selfishness is never satisfied; his ingratitude is never at an end. He takes on himself to direct the Almighty what to do, even in the government of the universe. He prays dictatorially. When it is sunshine, he prays for rain, and when it is rain, he prays for sunshine. He follows the same idea in everything that he prays for; for what is the amount of all his prayers, but an attempt to make the Almighty change his mind, and act otherwise than he does? It is as if he were to say -- thou knowest not so well as I."
"The Bible of the creation is inexhaustible in texts. Every part of science, whether connected with the geometry of the universe, with the systems of animal and vegetable life, or with the properties of inanimate matter, is a text as well for devotion as for philosophy -- for gratitude, as for human improvement. It will perhaps be said, that if such a revolution in the system of religion takes place, every preacher ought to be a philosopher. Most certainly, and every house of devotion a school of science."

  • Shadows from the past: Hillary and Honduras, one reason for a new influx of refugees to the US.
  • Freedom for me, but not for thee.
  • Who pays for corporate taxes? Is corporate power and capital mobility so great that they can off-load all costs onto workers and taxpayers? "We need also to account for the financial, administrative, and strategic costs of tax avoidance." Maybe we need stronger international governance.
  • Should central banks be unaccountable?
  • Lobbying and corruption is by far the best investment.
  • Stiglitz on negative rates... too little too late.
  • Mice who stutter!
  • The national debt is not a problem, at all.

Sunday, April 10, 2016

Who am I? Mechanics of Cell Identity




How do neurons in the fly know which segment they are in?

Organismal development is a biological mystery that is being gradually unravelled in labs all over the world in that heroic endeavor called "normal science", the pedestrian counterpart to the Kuhnian revolutions termed paradigm shifts. That the endogenous materials and genetic code of the egg/embryo generate the later adult forms has been known ever since scientists gave up vitalistic and other religious ideas about our biology. But how that happens ... approaching that question has taken lots of modern technology and persistence.

Fruit flies are the leading model system for embryonic and organismal development, due to their marriage of complex body plans, simple experimental handling, and extraordinarily deep genetics. After almost a century of productive study, a revolution happened in the 1980s in fruit fly genetics, following new mutant screens that uncovered some of the most basic mechanisms in body plan development. The genes found and analyzed during this period established a basic paradigm that has extended to all metazoans that have segmented body plans. Do we have segments? Yes, our backbone is a testament to our segmented ancestors.

The fly is built out of segments, whose cells know where/what they are by virtue of special genes expressed in them- the homeotic genes. The major genes of the fly homeotic complexes are, in order, Labial, Proboscipedia, Deformed, Sex combs reduced, Antennapedia, Ultrabithorax, Abdominal-A, and Abdominal-B.
The theme of these studies was that a series of genes, typically regulators of the expression of other genes, are turned on in sequence during development to identify progressively finer regions of the developing body. So at first, the two ends of the egg cell or syncytium are set as different, then some gross regions are defined, and later on, each segment (and each side of each segment) expresses a few key genes that identify its cells, so that another cell, say a nerve cell migrating through the area, can tell exactly where it is. Each protein is expressed in a gradient within its zone, allowing the next regulator in the process to detect which end of that gradient it lies in, and thus whether to turn on or not. Late in this genetic series are the Hox genes, which are notorious for the complexity of their own regulation, for their ability, when mutated, to transform the identity of some segments entirely into other ones, and for the linear relationship between their chromosomal position and the locations on the body where they are individually expressed.
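The threshold-reading logic described above can be sketched in a few lines of code. This is a purely illustrative toy (the gradient shape, threshold values, and fate names are all invented), not a model of any actual fly gene:

```python
import math

# Toy "French flag" sketch of gradient reading: cells sense an exponentially
# decaying morphogen and use two thresholds to pick one of three identities.
# All numbers and fate names here are illustrative, not measured fly values.

def assign_identities(n_cells, hi=0.66, lo=0.33):
    fates = []
    for i in range(n_cells):
        level = math.exp(-3.0 * i / (n_cells - 1))  # anterior-high gradient
        if level > hi:
            fates.append("anterior")
        elif level > lo:
            fates.append("middle")
        else:
            fates.append("posterior")
    return fates

fates = assign_identities(12)
print(fates)
```

The point is only that a smooth gradient plus sharp response thresholds suffices to carve a field of identical cells into discrete zones; the real system layers many such readouts on top of one another.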

Progressive genetic specification of the fly embryo body plan, dividing it up into segments.  Gradients of one gene product allow the next gene product to detect the sides of its compartment and thus refine its cellular and body identity to a finer level.

A recent paper took up this adventure in the area around the head and neck, asking how embryonic nerve cells (neuroblast stem cells) originating in segments 4 to 6 know who they are and where to go. While one might not think that an animal head has segments at all, in embryological and molecular terms, heads encompass about 7 segments (in the fly), which go through very messy convolutions into the complex mature structure. In comparison, body segments are far more orderly. Indeed, the central thoracic segment appears to be the default state, needing no Hox gene expression to develop normally:
"While thoracic identities seem to represent a ground state (T2, no input of Hox genes), identities of consecutive posterior segments are established by adding the function of Bx-C Hox genes Ultrabithorax (Ubx), abdominal-A (abdA) and Abdominal-B (AbdB), an evolutionary highly conserved phenomenon described as posterior dominance or prevalence of Hox genes. The terminal abdominal neuromeres A8-A10 exhibit a progressively derived character regarding size and composition. In these segments, NB [neuroblast, or neuronal stem cell] patterns and segmental identities are controlled by combined action of the Hox gene AbdB and the ParaHox gene caudal."

Map of the Drosophila head region, stained to show the Engrailed gene product. This is a homeotic segment polarity gene, expressed on one side of each segment throughout the embryo at this stage. At bottom is a map, coding the different segments accounted for within the head: red- antenna segment; purple- ocular segment; orange- intercalary segment; brown- labral segment; black- mandibular segment; green- maxillary segment; blue- labial segment; gray- first thoracic segment. In the ensuing figure, the embryo is squashed to lay out the segments better.

The head segments likewise require extensive input from the Hox genes to keep their identities distinct. The researchers use a series of mutants to figure out how the local (segments 4 to 6) neuronal stem cells respond to missing genetic homeotic inputs. To do this, they use a few morphological characteristics and gene markers (assays for a gene whose expression is restricted to a certain lineage or cell type, in this case antibodies specific to the respective proteins) to identify the neuroblasts or stem cells they are interested in.

Stem neurons in three segments are stained with a combination of gene expression probes: Eagle in green, Runt in red, and Engrailed in blue. Note how combined expression renders some key cells aqua (green + blue) or yellow (green + red). Other diagnostic genes used for cell identification, which are all known to have developmental roles, are Deadpan, Deformed, Repo, Even-skipped, Eyeless, Sex combs reduced, Proboscipedia, and Gooseberry. The segments, from front [top] to back, are mandibular (mad), maxillary (max) and labial (lab). In back of the labial segment is the first thoracic segment. This stage of development (12) is quite early, well before the first larva forms.

Many figures of embryos later, stained for the expression of various proteins, in flies mutated for various key homeotic genes, and analyzed for the presence of notable cells at various stages, the authors draw several conclusions about the genetic influences that determine the identity and existence of neurons in these head segments, some of which will go on to contribute to the adult fly's brain. First, the maxillary segment, including its neuronal stem cells, expresses Deformed and Sex combs reduced from the Hox genes, while the next labial segment expresses Labial, but not in its neuronal cells. These seem to be the principal determinants of segmental identity. Yet when Deformed is mutated, only about half the cells are transformed from maxillary identity to a labial or thoracic identity. Only when another homeotic gene is also mutated, either Antennapedia or Labial, is the transformation more complete.

The curious thing about this is that neither Antennapedia nor Labial is normally expressed in the maxillary head segment, so the effect of their mutation must not be what the researchers term cell-autonomous. These other genes must be acting from some distance away, instead of directly via their own expression in the cells being affected. This gets these researchers quite excited, and they track down some of the mechanism behind this extra cell fate specification.
"We identify the secreted molecule Amalgam (Ama) as a downstream target of the Antennapedia-Complex Hox genes labial, Dfd, Sex combs reduced and Antennapedia. In conjunction with its receptor Neurotactin (Nrt) and the effector kinase Abelson tyrosine kinase (Abl), Ama is necessary in parallel to the cell-autonomous Dfd pathway for the correct specification of the maxillary identity of NB6-4. Both pathways repress CyclinE (CycE) and loss of function of either of these pathways leads to a partial transformation (40%), whereas simultaneous mutation of both pathways leads to a complete transformation (100%) of NB6-4 segmental identity."

Summary of findings, where Deformed is the main, local homeotic specifier for the maxillary segment neurons. But additional help comes from the next-door labial segment which expresses the homeotic gene Sex combs reduced, which influences expression in turn of the diffusible protein Amalgam, which helps the nearby maxillary segment keep its identity, via repression of the gene cyclin E. Interestingly, the Amalgam gene is located in the homeotic cluster right next to Deformed.


So what had originally been thought of as a fully cell-autonomous system, whereby each homeotic gene or combination thereof dictates the identity of cells in each respective segment where it is itself expressed, turns out to be a bit more messy, with neighbor effects that refine the identity code. Obviously this is getting into the deep weeds of developmental biology, but at the same time it is an outstanding example of where the field is today, filling in ever-finer details of how development happens, using sophisticated techniques and backbreaking amounts of work.



Saturday, April 2, 2016

We Have Been Energy Hogs For a Billion Years

Mitochondria and the origins of eukaryotes.

Last week, we read about the origins of one important characteristic of eukaryotic cells- sex. But there are many more properties that distinguish eukaryotic cells from their bacterial forebears. These include the compartmentalized organelles like mitochondria, chloroplasts, nuclei, golgi, lysosomes, and the endoplasmic reticulum, a vastly expanded and junk-laden nuclear genome with introns, numerous new families of proteins, larger ribosomes, linear DNA with telomeres, separated transcription and translation, and centrosomes / cilia, among others. The mystery is how these many innovative characters all came to happen in one lineage that left no other discernible branches or traces, making the divide between the two forms of life truly gaping and hard to reconstruct. A paper from 2011 provides an illuminating attempt to explain some of these mysteries. Incidentally, it is well-written, and rewards a direct read.

Perhaps one of the most complex, yet at the same time simple, characteristics is the mitochondrion. Though some eukaryotes live without them, such lineages are all known to have evolved from mitochondrion-containing ancestors. Thus mitochondria are truly part of the original equipment of eukaryotes, as far as we currently know. Mitochondria are the descendants of bacteria completely distinct from the proto-eukaryotic host cell, (whether archaeal or something else), and became endosymbionts, whether by some dramatic engulfment / phagocytosis or something more cooperative and intricate. Thus the origin of the mitochondrion is simple- just take in a bacterial partner- even as the organelle and its effects on the host are highly complex. Indeed, mitochondria still have a tiny residual genome, encoding 37 genes in humans, most of which are tRNAs. While the vast majority of its proteins are encoded far away in the nucleus, the mitochondrion runs its own replication, transcription, and translation apparatus, complete with tRNAs and charging enzymes, to make 13 proteins of its own, using a genome 1/200,000 the size of the host cell's.
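That 1/200,000 figure is easy to check on the back of an envelope, using the well-known sizes of the human mitochondrial genome (16,569 bp) and the haploid nuclear genome (roughly 3.2 billion bp):

```python
# Back-of-envelope check of the "1/200,000" ratio quoted above: the human
# mitochondrial genome is 16,569 bp, against a haploid nuclear genome of
# roughly 3.2e9 bp.
mt_genome_bp = 16_569
nuclear_genome_bp = 3.2e9

ratio = nuclear_genome_bp / mt_genome_bp
print(f"nuclear : mitochondrial genome size is about {ratio:,.0f} : 1")
```

The ratio comes out near 190,000 to 1, so "1/200,000" is the right order of magnitude.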

Mammalian cell with its nucleus labelled "N" and mitochondria labeled "M".

The current paper proposes that the partnership with mitochondria was the first step on the long road to the eukaryotic cell. The reason is that it is the one enabling change that unleashes all the others, by vastly increasing the energy available to the combined entity. The author goes through a lengthy calculation of the DNA carrying capacity of bacteria, which is clearly limiting and causes a ceaseless competition among bacteria to shorten and streamline their genomes, which account for roughly 2% of cellular metabolism simply for DNA replication, quite apart from all the other costs of gene expression, repair, etc.  Once the host cell convinced the endosymbiont to give up its excess energy (ATP) in return for safety and free food, the race was on to a very profitable division of labor.

If in the combined cell, each mitochondrion supplies the energy equivalent of a bacterium, but with only the genome of an influenza virus, the efficiencies of scale are substantial, perhaps transformative, enabling much larger cells and much larger central genomes. On the other hand, the eukaryotic cell has just as much protein, mRNA and other gene expression apparatus (by mass and energy) as the bacterial cell, (if not more), so the author's focus on the energy available per gene, which results in startling quantitative contrasts between the two domains, is not terribly persuasive.

Example of bacteria (Clostridium) in size comparison to human epithelial cells. The bar is 20 micrometers.

More persuasive is the advantage in membrane area. Bacteria, like mitochondria, manage their short-term energy via a proton motive force over their plasma membranes. Food sources are oxidized, generating electrons which power the pumping of protons out of the cell/organelle. That power, stored much like a battery, is used as needed by ATP-synthesizing machines that run off the power of letting those protons back in. Making a bacterium larger is a losing proposition since the cytoplasmic volume rises by the cube as the surface area rises by the square. While elongation is one solution, and many bacteria are filamentous, spiral, and other long-ish shapes, this poses other obvious problems of safety and internal management. The eukaryotic cell escaped all that by making of the mitochondrion an endlessly replicable internal energy unit, limited only by the host's ability to gather dinner on whatever scale it chooses to operate. And sometimes, operating on a large scale is very profitable.
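The cube-versus-square argument is simple geometry: for a spherical cell, the membrane area available per unit of cytoplasm falls off as 1/r, so doubling the radius halves the power-generating surface per unit of volume served. A quick sketch, with illustrative radii:

```python
import math

# Surface-to-volume scaling for a spherical cell: volume grows as r^3 while
# membrane area grows only as r^2, so area per unit volume falls as 3/r.
# Radii below are illustrative, in micrometers.

def area_per_volume(radius_um):
    area = 4 * math.pi * radius_um ** 2
    volume = (4 / 3) * math.pi * radius_um ** 3
    return area / volume  # simplifies to 3 / radius_um

for r in (1, 2, 10):  # roughly bacterium-sized up to eukaryote-sized
    print(f"r = {r:>2} um -> membrane area per unit volume = {area_per_volume(r):.2f}")
```

A tenfold increase in radius leaves only a tenth of the membrane per unit of cytoplasm, which is exactly the trap that internalized, replicable mitochondria escape.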

This hypothesis leads to an interesting theory about the early phases of the symbiosis. However it happened, the earliest mitochondria were fully bacterial, conferring the membrane area advantage, but not the genome streamlining advantage. Given that many mitochondria can exist in each eukaryotic cell (up to several thousand), the advantage of minimizing the infrastructure and energy needed for the maintenance of each one is clear. In the first place, many of the free-living functions would have quickly become unnecessary. Mitochondria today have a complement of about 1,000 proteins, far fewer than the ~5,000 proteins found in free-living bacteria. Getting rid of those and the DNA encoding them is a huge savings.

Second, this early mitochondrion would be constantly exposing the host to its own DNA, and the combined entity would gain a streamlining advantage every time the mitochondrion lost a gene that was integrated into the host genome. Putting aside the challenges of transporting all the proteins needed in the mitochondrion back from the host's expression apparatus, which are substantial, every time the mitochondrion lost a gene and had that function supplied externally instead, it became that much more efficient in terms of the genome it was carrying around, in addition to regulatory advantages from being centrally managed by the host.

Comparison of genome sizes, on a log scale. All eukaryotes have larger genomes than all bacteria. Mitochondria, at 16Kb, now fall into the viral range rather than the bacterial range.

Thus a sort of snow-balling process of mitochondrial genome miniaturization took place, which had wide-ranging effects. The author speculates that controlling this exposure to external (mitochondrial) DNA, especially its primitive introns, may have led to the nuclear membrane as a form of protection and process management, which in turn created the space for new forms of eukaryotic regulation, like the spliceosomal processing that takes place during exit from the nucleus, and the myriad proteins that are specifically shuttled in and out of the nucleus for regulatory control. Overall, the union of a tiny mitochondrion and a central host genome provides a quantum leap of efficiency, compared to what is possible by scaling up single bacteria in any conceivable way (whether by invaginating their membranes, and / or multiplying their genomes to serve larger surfaces and volumes).

This in turn allowed energetic room for all sorts of new innovations and what a bacterium would regard as waste. A host genome full of introns and other junk DNA, a cytoplasm full of new cytoskeletal proteins devoted to shape control and internal cargo carrying, systems for internal membrane and vesicle management, and diploidy: carrying a full extra copy of its nuclear genome around, as part of the new sexual reproduction cycle. Also:
" The last eukaryotic common ancestor (LECA) had already increased its genetic repertoire by some 3000 novel gene families [over that of the presumed ancestral bacteria]."

Finally, the fact that this series of innovations seems to have happened only once and left no other lineages from along the way makes for a remarkable gap in the evolutionary record, far more profound than that observed around the Cambrian explosion of metazoan life. This paper is very eloquent about the many ways that prokaryotes are trapped in what might be called a version of fiscal austerity, always cutting spending, scrimping on infrastructure, and seeking efficiency über alles. That is no way to live! That any of them found a way out, to the endless vistas of higher complexity and cooperation that now cover the earth with beautiful, rich life, is worthy of wonder and gratitude.

  • smORFs, another genomic frontier.
  • But he's a winner, right?
  • What is an undue burden? Who knows? And does the state have an interest in unused sperm?
  • Whom do investment bankers work for?
  • Let's make the FIRE sector pay its way.
  • Corporate profits are sky-high. Is that investment, or rent?
  • What defines the middle class?
  • Is suffering an excuse for being  gullible?
  • No one seems to understand national debt, after all this time.
  • Green tip of the week: let's have fewer conferences.
  • Barney Frank: You have to vote.
  • Work is fundamentally important, perhaps more than trade.
  • Republicans in the forefront of bad government.

Saturday, March 26, 2016

Why have sex?

On the origins of meiosis.

Of the many innovations that occurred during the evolution of eukaryotes, one of the most mysterious and powerful was the development of meiosis and sexual reproduction. A paper from a few years back delved into how this process might first have started, given that it depends on several complex innovations. (And another paper more recently).

But first, why is sex so durable, evolutionarily speaking, resulting in an Earth where all large, complex life forms engage in it? The costs are quite severe, after all: partners must be found, which can be a particular trial for sessile plants, corals, etc.; selection for fitness can be waylaid by mate selection and other sexual games; parents give up half their genes in creating each child, compared to creating complete copies as they would by traditional, clonal reproduction; likewise, a sexual population has to have males, which doubles the resource needs compared to a clonal population where every member is functionally female.

The answer lies in population genetics. A clonal population, such as most bacteria, can generate mutations and adapt to external conditions; the development of antibiotic resistance is notorious. But evolution is a parallel process, where the whole population is tested, and many variants are more or less successful. Everyone has deleterious mutations mixed in amongst the beneficial ones, especially since the deleterious ones are always more numerous. So in a clonal population, any good mutation that occurs will be trapped in its current genome and have to compete against all the other clones with all their good and bad mutations. Another good mutation will have to fight the same battle, without the chance to team up with the first one, unless the first one has already taken over the population. In this way many beneficial mutations are lost, especially those with weak effects, which are the majority, naturally.

Even worse, bad mutations tend to pile up in each clone lineage, since there is no way to get rid of them. Each mother gives the full complement of her mutations, both inherited and those that happened during her life, to her offspring. In humans, we accumulate about 175 mutations per individual before reproduction, of which roughly 1 or 2 are detectably deleterious. While astonishing repair processes have evolved to keep such errors to a minimum, there are always some, and in a clonal lineage, they are always building up, despite the ongoing selection against those which add up to worse than average problems, a process termed "Muller's ratchet".

What happens in a sexual population? Well, it is critical to realize that it behaves much more like a population- an evolutionary village, so to speak- than like a set of competing clonal lineages. Individual alleles are recombined around and mixed up among offspring, so that there is far more diversity within the population, which allows, stochastically, for good mutations/alleles to come together in some offspring, and deleterious mutations/alleles to come together in others. Given selection, where the latter die and the former flourish, the system enables far more effective use of the opportunities provided by mutations throughout the population than clonality does.
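A toy simulation can make this concrete. The sketch below is a cartoon with arbitrary parameters, not a serious population-genetic model; it tracks the mean number of deleterious mutations per genome in a clonal versus a freely recombining population, and in most runs the clonal population ends up carrying the heavier load:

```python
import random

# Cartoon of Muller's ratchet: genomes are bit-lists of deleterious alleles,
# fitness is multiplicative, (1-s)^(number of bad alleles). Parameters are
# arbitrary illustrative choices, not biological measurements.

def evolve(pop_size=100, n_loci=50, mu=0.4, s=0.05, gens=150, sexual=False, seed=1):
    rng = random.Random(seed)
    pop = [[0] * n_loci for _ in range(pop_size)]
    for _ in range(gens):
        weights = [(1 - s) ** sum(g) for g in pop]  # fitness-proportional parenting
        new_pop = []
        for _ in range(pop_size):
            if sexual:
                p1, p2 = rng.choices(pop, weights=weights, k=2)
                child = [rng.choice(pair) for pair in zip(p1, p2)]  # free recombination
            else:
                child = list(rng.choices(pop, weights=weights, k=1)[0])  # pure cloning
            if rng.random() < mu:  # recurrent deleterious mutation
                child[rng.randrange(n_loci)] = 1
            new_pop.append(child)
        pop = new_pop
    return sum(sum(g) for g in pop) / pop_size  # mean deleterious load

clonal_load = evolve(sexual=False)
sexual_load = evolve(sexual=True)
print(f"mean deleterious load: clonal {clonal_load:.2f} vs sexual {sexual_load:.2f}")
```

Recombination shuffles alleles among genetic backgrounds, so selection can discard the bad ones without discarding whole genomes; that is the effect this comparison is meant to illustrate.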

The autobiography of Julius Erving, "Dr. J", provides a graphic example. He is extraordinarily gifted in all respects, including writing. Yet his brother Marky was sickly and died very young. The difference is tragic on a human level, but routine on a genetic level. Embryos with especially deleterious alleles will frequently die before birth, hiding the true rate of this genetic "sweeping" mechanism.

Sex is so powerful that bacteria have developed several mechanisms for doing it on a small scale, such as extending pili to partners so that they can exchange limited amounts of DNA. This is how antibiotic resistance spreads around so quickly, and it helps bacteria exchange out some of their accumulated mutations by homologous repair (which is common among all organisms), though at the cost of bringing in parasitic DNA elements like transposons and viruses. But bacteria have never developed the full monty: fusion of whole genomes with total remixture and sharing out to subsequent progeny, let alone obligatory sex before reproduction. Eukaryotic sex involves the fusion of complete, haploid genomes, which recombine pervasively, both by way of the independent assortment of whole chromosomes and by smaller sub-chromosomal recombination events, to create unique, new haploid genomes, which are then sent out as new gametes.
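The remixture described above (independent assortment of whole chromosomes, plus crossing-over within them) can be sketched schematically. The allele names here are invented placeholders, and one crossover per chromosome is an arbitrary simplification:

```python
import random

# Schematic of one meiotic product: chromosomes assort independently, and a
# single crossover per chromosome swaps segments between the two homologs.
# Allele names (A1, A2, ...) are invented placeholders, not real loci.

def make_gamete(maternal, paternal, rng):
    gamete = []
    for chrom_m, chrom_p in zip(maternal, paternal):
        x = rng.randrange(1, len(chrom_m))  # one crossover point
        recombinants = [chrom_m[:x] + chrom_p[x:], chrom_p[:x] + chrom_m[x:]]
        gamete.append(rng.choice(recombinants))  # which homolog leads is random
    return gamete

rng = random.Random(42)
mom = [["A1", "B1", "C1"], ["D1", "E1"]]
dad = [["A2", "B2", "C2"], ["D2", "E2"]]
gamete = make_gamete(mom, dad, rng)
print(gamete)
```

Every gamete keeps one allele per locus, yet almost every gamete is a combination that existed in neither parent, which is the diversity-generating trick at the heart of sex.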

Bacteria share small amounts of DNA through conjugation pili.

The rise of this process is a bit hard to understand because there are several complex events needed, none of which makes immediate sense by itself. First is the meiotic division, which uses most of the tools of normal mitotic cell division, plus some more (suppression of sister separation in the first division, and then suppression of DNA replication in the second division) to turn a diploid genome into a carefully reduced haploid gamete genome. Second is the part of the process called synapsis, where all chromosomes from the two parents align along their full length. Third is the recombination that is obligatory between these homologs in order to keep them attached at initial stages, and to interchange segments of the respective genomes. This is not to mention the fusion of gamete cells and other paraphernalia of sex, which are less innovative from a molecular, and probably evolutionary, standpoint.

Basic model of meiosis, in comparison to mitosis.

The proposal made in this paper, by a luminary in the field, after whom the Holliday junction is named, is that synapsis of homologous chromosomes may have been the leading event in the development of meiosis and sex, and is understandable as a solution to a completely different problem. Quite apart from the recombination that happens when bacteria encounter DNA coming in from other cells, homologous recombination also happens after replication and before division to repair damage, of which replication is a frequent cause. But one of the hallmarks of eukaryotes is size- big cells and big genomes- they are the SUVs of cellular biology. As genomes grew, the chances of making an error in this internal recombination, with all the repetitive DNA and duplicated genes lying about, grew rapidly as well, and posed a serious danger of creating new damage. This made it preferable to set up a whole-genome alignment process, i.e. true synapsis, and to confine it to the interval between replication and division.
"To sum up, we propose that the selection pressures for homolog synapsis and the origins of meiosis were to improve recombinational accuracy and to restrict it to a safe interval, while retaining its short-term (repair) benefits. A cell lineage that had evolved this capability for diploid cells would be less error-prone in transmitting its genetic material."

In the original setting of haploid, single cells, as the first proto-eukaryotes undoubtedly were, this would have revolutionized post-replication homologous DNA repair, making the process far more systematic and reliable. Indeed, eukaryotes still have their DNA aligned during most of the cell cycle, though not in the elaborate synaptonemal complex now found in the first phase of meiosis. Since these organisms were not diploid, but habitually haploid, the subsequent division would have already resembled the second division of meiosis, back to the haploid state.
"In principle, the molecular evolution of a new cohesin molecule that specifically promoted homolog pairing might have provided the crucial trigger for meiosis."

The next step to true sexuality by this model was to adopt the practice of mating between separate cells (which could have been related to the partial genetic exchange common among bacteria) to generate a diploid where truly different alleles over entire genomes are combined in one cell. While this could have worked on its own, (and exists today as parasexual cycles in fungi, where reduction back to a haploid set of chromosomes is more or less random), the addition of DNA replication, as well as synapsis and recombination as above, at this step would have necessitated a special variant of mitosis to become the first (reductional) division of meiosis, whereby the replicated homologs from the two parents align in a novel four-chromosome bundle, which then segregates precisely in half, though with random polarity, prior to a second division.
"In many unicellular eukaryotes, haploid sex-cell fusion leads promptly to nuclear fusion, which immediately triggers meiosis, thus regenerating the haploid state. In contrast, in more complex, multicel- lular eukaryotes, meiosis is greatly delayed following the initial fusion of sex cells, taking place much later in the life cycle, during gametogenesis."

Subsequently, organisms would adopt the diploid state as the organismal default (not so difficult in microorganisms), which has powerful effects on genetic diversity, since it allows recessive alleles some breathing space to survive and even to fill strategic niches in the population. With the advent of multicellularity, the meiotic divisions could be rescheduled to now-vestigial haploid cells as part of a special gamete-generating process. Clearly there is a lot to chew on with this model, and a need to flesh it out and gather evidence to support the many inferred steps. But it is a highly interesting idea for the stepwise development of a process that is now notoriously complex on the molecular as well as all other levels.

A problem, however, is that modern eukaryotic cells, though they use aligned sisters in a highly regulated fashion for post-replication DNA repair, do not use anything like true synapsis. That makes it difficult to suppose that synapsis ever played a role in this process, or that it was the leading element of the other steps of meiosis. One might counter that haploid cell fusion with parasexual reduction was perhaps the first step in the sequence (possibly even originating in a predatory setting), after which replication, synapsis, diploidy, and the special homolog-separating division of meiosis I were developed to better clean up the mess. Anyhow, next week we will delve into another theory about the origins of eukaryotes.

Overall, sex is a machine for speeding up evolution, generating a quantum leap of accessible genetic diversity within a species / population that allows bad genes to be left behind without discarding everything else in those genomes, and good genes to be concentrated in the winners. Do individuals benefit, or their genes, or the species as a whole? There is an element of group selection intrinsic to this rationale for sex, since the benefits accrue down the line to the genes of the species and the species as a whole, not to the individuals involved now, especially not those stuck with the short end of the genetic stick.

  • An unusual meiotic short-circuit & genome mixing pathway in yeast cells.
  • Women, and women's work, are not highly valued. Another lesson in the social construction of pay decisions.
  • Seniors (and the rich) are doing relatively well, at least vs young workers afflicted by a terrible job market. "I suspect that most Fed policymakers receive relatively little input on the economy from people who are younger than 40."
  • Does the US have any place for lower class workers?
  • Or does it have the education and institutions to take them to a higher class?
  • Mainstreaming MMT.
  • Repression is hard work, and Trump, a relief.

Saturday, March 19, 2016

Keynes Would be Spinning in His Grave

If he could see what is going on in the Euro Zone.

There are deep structural and policy problems in the Euro Zone which have prevented or muted its recovery from the recession. The policy problem, similar to ours, is a lack of stimulus spending to put an end to general deflation and recession. The structural problem is that the individual countries, while sharing the same monetary unit, cannot adjust their levels of general economic activity relative to each other, or have independent monetary policy, except through trade. And their trade relationships are hopelessly unequal, with Germany in the lead, and peripheral countries like Greece and Spain uncompetitive, at least in fixed Euro terms.

While the going was good, Germany was willing to lend the money which flowed in through trade back out to the southern and peripheral countries. But these were only loans, not fiscal transfers as would happen in a more integrated country or zone. So then we had the drama of creditors asking for the money back, arranging for debt forgiveness, bailouts, etc., all while the debtor countries were on their knees.

In essence, the Euro Zone operates like an old-fashioned gold standard international system. A common unit of exchange and value is kept stable across supposedly independent countries which each have their own policies of trade, employment, corruption, public services, etc. This unit is not really shared out from the central bank on a per-country basis, let alone to correct specific trade imbalances, but generated in response to economic growth in sum across the whole zone, managing issuance and interest rates with the overarching goal of low inflation (since the Germans are running the ECB, more or less). Thus each country needs to accumulate what is to them a fixed unit of exchange through trade, lest they run into chronic debt to the other countries. Each country is moreover on the hook for its various economic disasters like insolvent banks, unemployment, and social unrest.
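The gold-standard logic can be made concrete with a toy simulation, not from the post and with all numbers and the adjustment rule purely hypothetical: a country running a chronic trade deficit either accumulates debt year after year under a fixed shared currency, or sees its floating currency depreciate, which cheapens its exports and shrinks the deficit over time.

```python
def simulate(floating: bool, years: int = 10) -> float:
    """Toy model: cumulative debt of a deficit country over `years`.

    All figures are arbitrary illustrative units; the 10% annual
    depreciation rule is an assumption, not an economic estimate.
    """
    exchange_rate = 1.0   # deficit country's currency vs. its trading partners
    base_deficit = 20.0   # annual trade deficit at currency parity
    debt = 0.0
    for _ in range(years):
        # a cheaper currency makes exports more competitive, shrinking the deficit
        debt += base_deficit * exchange_rate
        if floating:
            exchange_rate *= 0.9  # depreciation as the adjustment valve
    return round(debt, 1)

fixed_debt = simulate(floating=False)      # 200.0: the same deficit every year
floating_debt = simulate(floating=True)    # 130.3: depreciation erodes the deficit
print(fixed_debt, floating_debt)
```

In the fixed regime the deficit recurs undiminished and debt piles up linearly, while the floating regime lets the price level adjust, which is precisely the valve denied to the Euro Zone's peripheral countries.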

Here is what Keynes wrote about this kind of system, as opposed to a system of floating exchange between sovereign nations:
"Never in history was there a method devised of such efficiency for setting each country's advantage at variance with its neighbours' as the international gold (or, formerly, silver) standard. For it made domestic prosperity directly dependent on a competitive pursuit of markets and a competitive appetite for the precious metals. When by happy accident the new supplies of gold and silver were comparatively abundant, the struggle might be somewhat abated. But with the growth of wealth and the diminishing marginal propensity to consume, it has tended to become increasingly internecine."
"I have pointed out in the preceeding chapter that, under the system of domestic laissez-faire and an international gold standard such as was orthodox in the latter half of the nineteenth century, there was no means open to a government whereby to mitigate economic distress at home except through the competitive struggle for markets. For all measures helpful to a state of chronic or intermittent under-employment were ruled out, except measures to improve the balance of trade on income account."
".. those statesmen were moved by a common sense and a correct apprehension of the true course of events, who believed that if a rich country were to neglect the struggle for markets its prosperity would droop and fail. But if nations could learn to provide themselves with full employment by their domestic policy (and, we must add, if they can also attain equilibrium in the trend of their population), there need be no important economic forces calculated to set the interest of one country against that of its neighbors. There would still be room for the international division of labor and for international lending in appropriate conditions. But there would no longer be a pressing motive why one country need force its wares on another or repulse the offerings of its neighbor, not because this was necessary to enable it to pay for what it wished to purchase, but with the express object of upsetting the equilibrium of payments so as to develop a balance of trade in its own favor. International trade would cease to be what it is, namely a desperate expedient to maintain employment at home by forcing sales on foreign markets and restricting purchases, which, if successful, will merely shift the problem of unemployment to the neighbor which is worsted in the struggle, but a willing and unimpeded exchange of goods and services in conditions of mutual advantage."

One might add that this need for markets and trade also drove European countries toward colonialism which was so destructive, especially in the latter phases as laggards like Belgium, Italy and Germany got into the game. It is a wonder (of a negative kind) how, eighty years after Keynes's lessons were introduced in response to the Great Depression, and after several postwar decades during which they were put to such prosperous use, we, and especially his own continent, are struggling in the mire of older orthodoxies that he had laid to rest. I was about to note that we can at least be thankful that none of the current leaders advocate a return to an actual gold standard, but that turns out to be incorrect.