Saturday, May 28, 2016

The Housing Crisis- or is it a Transportation Crisis?

Bustling areas of the US are in the grips of a housing, transportation, and homelessness crisis.

While tracts of empty houses remain from the recent near-depression in areas like Florida, Nevada, and Detroit, other areas suffer from the opposite problem. Average detached house prices are at a million dollars in the San Francisco Bay Area. While this number is proudly trumpeted by the local papers for their satisfied home-owning constituents, the news for others is not so good. Houses are priced far above construction cost, clearly unaffordable for average workers, and rents are rising to unaffordable levels as well. How did we get here?

One of the great ironies is that environmentalism has allied with other status quo forces to stall development for decades. Existing homeowners have little interest in transforming their sprawly neighborhoods into denser, more efficient urban centers. Then they pat themselves on the back for preserving open space and small-town ambiance, along with inflated property values. Public officials have been stymied by Proposition 13 and other low-tax movements from funding infrastructure to keep up with population growth. Local roads are now frequently at a standstill, making zoning for more housing essentially unthinkable. Add in a drought, and the policy response to growth is to hide one's head in the sand.

Then ... a scene from Dark Passage.

There is a basic public policy failure to connect population and business growth with the necessary supporting structures- a failure of planning. No new highway has been built for decades, even as the population of the Bay Area has increased by 10% since just 2000, the number of cars has increased even more, and the population of the state has doubled since 1970. How was that supposed to work?

Now ... at the bay bridge.

An alternative approach would have been to limit population growth directly, perhaps via national immigration restrictions or encouragement for industry to move elsewhere. But that doesn't seem attractive to our public officials either, nor is it very practical. In a tragedy of common action, people flock to an attractive area, but eventually end up being driven away based on how crowded and unbearable the area becomes. A Malthusian situation, not from lack of food, but of other necessities. But with modern urban design & planning, it doesn't have to be that way- just look at Singapore, Hong Kong, New York, and other metropolises.

In the post-war era, the US, and California in particular, built infrastructure ahead of growth, inviting businesses to a beautiful and well maintained state. But once one set of roads was built, and a great deal of settled activity accumulated around them, expansion became increasingly difficult. Now that a critical mass of talent and commercial energy is entrenched and growing by network forces, the contribution from the state has descended to negligible, even negative levels, as maintenance is given short shrift, let alone construction of new capacity, for roads, housing development, water, and sewer infrastructure. Prop 13 was, in retrospect, the turning point.

It is in miniature the story of the rise, decline, and fall of civilization. For all the tech innovation, the Bay Area is showing sclerosis at the level of public policy- an inability to deal with its most basic problems. The major reason is that the status quo has all the power. Homeowners have a direct financial interest in preventing further development, at least until the difficulties become so extreme as to result in mass exodus. One hears frequently of trends of people getting out of the area, but it never seems to have much effect, due to the area's basic attractiveness. Those who can afford to be here are also the ones investing in and founding businesses that keep others coming in their wake.

The post-war era was characterized by far more business influence on government (especially by developers, the ultimate bogey-men for the environmentalists and other suburban status-quo activists), even while the government taxed businesses and the wealthy at far higher levels. Would returning to that system be desirable? Only if our government bodies can't get their own policy acts together. The various bodies that plan our infrastructure (given that the price signal has been cancelled by public controls on development) have been far too underfunded and hemmed in by short-sighted status quo interests- to which the business class, typically more interested in growth and labor availability than in holding on to precious property values, is an important counter-weight.

The problem is that we desperately need more housing to keep up with population, to keep housing affordable, and ultimately also to resolve the large fraction of homelessness that can be addressed by basic housing affordability. But housing alone, without a full package of more transportation and other services, makes no sense on its own. So local planning in areas like the Bay Area needs a fundamental reset, offering residents better services first (more transit, cleared-up roads) before allowing more housing. Can we build more roads? Or a new transit system? We desperately need another bridge across the bay, for example, and a vastly expanded BART system, itself a child of the post-war building boom, now fifty years old.

BART system map, stuck in time.

Incidentally, one can wonder why telecommuting hasn't become more popular, but the fact that a region like the Bay Area has built up a concentration of talent that is so enduring and growing despite all the problems of cost, housing, and transportation speaks directly to the benefits of (or at least the corporate desire for) corporeal commuting. Indeed, it is common for multinational companies to set up branches in the area to take advantage of the labor pool willing to appear in person, rather than trying to lure talent to less-connected areas cybernetically or otherwise.

One countervailing argument to more transit and road development is that the housing crisis and the existing road network have motivated commuters to live in ever farther-flung outlying areas, even as far as Stockton. Thus building more housing first, in dense, central areas, might actually reduce traffic, by bringing those commuters back to their work places. This does not seem realistic, unfortunately. One has to assume that any housing increment will lead to more people, cars, and traffic, not less. There is no way to channel housing units to only those people who will walk to work, or take transit, etc., especially in light of the poor options currently available. The only way to relieve the transportation gridlock is to make using the system dramatically more costly, or to provide more transportation- especially, more attractive transit options.

Another argument is that building more roads just leads to more usage and sprawl. This is true to some extent, but the solution is not to make the entire system dysfunctional in hopes of pushing marginal drivers off the road or out of the area in despair. A better solution, if building more capacity is out of the question, is to take aim directly at driving by raising its price. The gas tax is far too low, and the California carbon tax (we have one, thankfully!) is also too low. There is already talk of making electric vehicle drivers pay some kind of higher registration or per-mile fee to offset their lack of gas purchases, but that seems rather premature and counter-productive from a global warming perspective. To address local problems, tolls could be instituted, not just at bridges as they are now, but at other areas where congestion is a problem, to impose costs across the board on users, as well as to fund improvements. This would also address the coming wave of driverless cars, which threatens to multiply road usage yet further.

In the end, housing and transportation are clearly interlinked, on every level. Each of us lives on a street, after all. Solving one problem, such as homelessness and the stratospheric cost of housing, requires taking a step back, looking at the whole system, and addressing root causes, which come down to zoning, transportation, money, and the quality and ambition of our planning.



  • Hey- how about those objective, absolutely true values?
  • Bernie has some mojo, and doing some good with it.
  • We know nothing ... at the State department.
  • The Fed is getting ready to make another mistake.
  • For the umpteenth time, we need more fiscal policy.
  • Cheating on taxes, the Trump way.
  • Yes, Trump is this stupid, and horrible.
  • Another disaster from Hillary Clinton's career.
  • Corporations are doing well the old-fashioned way, through corruption.
  • What happens when labor is too cheap, and how trade is not so peaceful after all.

Saturday, May 21, 2016

Tinier and Tinier- Advances in Electron Microscopy

Phase contrast and phase plates for electrons;  getting to near-atomic resolution.

Taken for granted today in labs around the world, phase contrast light microscopy won a Nobel prize in 1953. It is a fascinating manipulation of light to enhance the visibility of objects that may be colorless, but have a refractive index different from the medium. This allowed biologists especially to see features of cells while they were still alive, rather than having to kill and stain them. But it has been useful for mineralogy and other fields as well.

Optical phase contrast apparatus. The bottom ring blocks all but that ring of light from coming into the specimen from below, while the upper ring captures that light, dimming it and shifting its phase.

Refraction of light by a sample has very minor effects in normal bright field microscopy, but does two important things for phase contrast microscopy. It bends the light slightly, like a drink bends the image of a straw, and secondly, it alters the wave phase of the light as well, retarding it slightly relative to the unaffected light. Ultimately, these are both effects of slowing light down in a denser material.

The phase contrast microscope takes advantage of both properties. Rings are strategically placed both before and after the sample, so that the direct light is channeled into a cone that, after passing the sample, is intercepted by the phase plate. This plate both dims the direct light, so that it does not compete as heavily with the scarcer refracted light, and, more importantly, phase-retards the direct light by 90 degrees.

Light rotational phase relationships in phase contrast. The phase plate shifts the direct (bright) light from -u- to -u1-. Light that has gone through the sample and been refracted is -p-, which interferes far more effectively with -u1- (or -u2-, an alternate method) than with the original -u-, generating -p1- or -p2-, respectively.

The diagram above shows the phase relationships of light in phase contrast. The direct light is u on the right diagram, and p is the refracted and phase-shifted light from the specimen. d is the vector difference in phase between them. Interference between the two light sources, given their slight phase difference, is also slight and gives very little contrast. But if the direct light is phase shifted by 90 degrees, either in the negative (original method, left side u1) or positive direction (right, u2), then adding the d vector via interference with the refracted light has much more dramatic effects, resulting in the phase contrast effect. Phase shifting is done with special materials, such as specifically oriented quartz.
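
The arithmetic is simple enough to check numerically. Below is a toy phasor sketch (the amplitudes, the 0.1-radian specimen shift, and the 30% dimming are numbers of my choosing, not from any instrument): for a weak phase object, bright field intensity differs from the background only at second order in the phase shift, while the 90-degree plate makes the difference first-order.

```python
import numpy as np

# Toy phasor model of Zernike phase contrast. For a weak phase object the
# transmitted light is u*exp(i*phi) ~ u + i*phi*u: the direct wave u plus
# a scattered component p roughly 90 degrees out of phase with it.
phi = 0.1                  # small specimen-induced phase shift (radians)
u = 1.0 + 0j               # direct (undiffracted) wave
p = 1j * phi * u           # weakly scattered wave, ~90 degrees out of phase

# Bright field: intensity differs from background only at second order
bright = abs(u + p) ** 2 / abs(u) ** 2        # = 1 + phi**2, ~1.01

# Phase plate: retard the direct light by 90 degrees and dim it (to 30%
# amplitude here), so the scattered wave now interferes at first order.
a = 0.3
u1 = a * u * np.exp(-1j * np.pi / 2)
contrast = abs(u1 + p) ** 2 / abs(u1) ** 2    # = (1 - phi/a)**2, ~0.44

print(f"bright field, relative intensity:   {bright:.3f}")
print(f"phase contrast, relative intensity: {contrast:.3f}")
```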

Example of the dramatic enhancement possible with optical phase contrast.

A recent paper reviews methods for generating phase contrast for electron microscopy, which, with its far smaller wavelength, is able to resolve much finer details, and also revolutionized biology when it was invented, sixty years ago. But transmission electron microscopy is bedeviled, just as light microscopy was, by poor contrast in many specimens, particularly biological ones, where the atomic composition is all very light-weight: carbons, oxygens, hydrogens, etc, with little difference from the water medium or the various cellular or protein constituents. Elaborate staining procedures using heavy metals have been used, but it would be preferable to image flash-frozen and sectioned samples more directly. Thus a decades-long quest to develop an electron analogue of phase contrast imaging, and a practical electron phase plate in particular.

Electrons have wave properties just as light does, but their wavelengths are far smaller and somewhat harder to manipulate. It turns out that a thin plate of randomly deposited carbon, with a hole in the middle, plus electrodes to bleed off absorbed electrons and even bias the voltage to manipulate them, is enough to do the trick. Why the hole? This is where the un-shifted electrons come through (which mostly also do not interact significantly with the specimen), which then interfere with the refracted and shifted ones coming through the carbon plate outside. This has the effect of emphasizing those electrons phase-shifted by the specimen which escape the destructive interference.
"A cosine-type phase-contrast transfer function emerges when the phase-shifted scattered waves interfere with the non-phase-shifted unscattered waves, which passed through the center hole before incidence onto the specimen."

The upshot is that one can go from the image on the right to the one on the left- an amazing difference.
Transmission electron microscopy of a bacterium. Normal is right, phase contrast is left.

At a more molecular scale, one can see individual proteins better, here the GroEL protein chaperone complex, which is a barrel-shaped structure inside of which other proteins are encouraged to fold properly.
Transmission electron microscopy of individual GroEL complexes, normal on left, phase contrast on right. 



Saturday, May 14, 2016

Dissection of an Enhancer

Enhancers provide complex, combinatorial control of gene expression in eukaryotes. Can we get past the cartoons?

How can humans get away with having no more genes than a nematode or a potato? It isn't about size, but how you use what you've got. And eukaryotes use their genes with exquisite subtlety, controlling them from DNA sequences called enhancers that can be up to a million base pairs away. Over the eons, countless levels of regulatory complexity have piled onto the gene expression system, more elements of which come to light every year. But the most powerful control over genes comes from modular cassettes (called enhancers) peppered over the local DNA to which regulatory proteins bind to form complexes that can either activate or repress expression. These proteins themselves are expressed from yet other genes and regulatory processes that form a complex network or cascade of control.

When genome sequencing progressed to the question of what makes people different, and especially what accounts for differences in disease susceptibility, researchers quickly came up with a large number of mutations from GWAS, or genome-wide association studies, in data from large populations. But these mutations gave little insight into the diseases of interest, because the effect of each mutation was very weak- otherwise the carriers, who were typically normal, would have been afflicted. A slight change in disease susceptibility coming from a mutation somewhere in the genome is not likely to be informative until we have a much more thorough understanding of the biological pathway of that disease.

This is one reason why biology is still going on, a decade and a half after the human genome was sequenced. The weak-effect mutations noted above are often far away from any gene, and figuring out what they do is rather difficult, because of their weakness, their often uninformative position, and the complexity of disease pathways and the relevant environmental effects.

Part of the problem comes down to a need to understand enhancers better, since they play such an important role in gene expression. Many sequencing projects study the exome, which comprises the protein-coding bits of the genome, and thus ignore regulatory regions completely. But even if the entire genome is studied, enhancers are maddening subjects, since they are so darned degenerate- which is a technical term for being under-specified, with lots of noise in the data. DNA-binding proteins tend to bind to short sites, typically of seven to ten nucleotides, with quite variable/noisy composition. But if helped by a neighbor, they may bind to a quite different site. Who knows? Such short sequences are naturally very common around the genome, so which ones are real, and which are decoys, among the tens or hundreds of thousands of basepairs around a gene? Again, who knows?
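
To make the decoy problem concrete, here is a back-of-the-envelope sketch (the consensus motif and all the numbers are invented for illustration) counting how often a short, degenerate site turns up in random DNA by chance alone:

```python
import random

# Back-of-envelope: how often does a short, degenerate consensus site
# appear in random DNA? The 7-bp motif below is invented (R = A/G,
# W = A/T in IUPAC code); only one strand is scanned, for simplicity.
random.seed(0)
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "R": "AG", "W": "AT"}
motif = "GGRWWTC"

def matches(window, motif):
    return all(base in IUPAC[m] for base, m in zip(window, motif))

genome = "".join(random.choice("ACGT") for _ in range(100_000))
hits = sum(matches(genome[i:i + len(motif)], motif)
           for i in range(len(genome) - len(motif) + 1))

# Expected rate: product over positions of (allowed bases / 4),
# here (1/4)**4 * (1/2)**3 = 1/2048, so ~50 hits per 100 kb by chance.
expected = len(genome) * (1 / 4) ** 4 * (1 / 2) ** 3
print(f"expected ~{expected:.0f} chance matches, found {hits}")
```

Dozens of chance matches per hundred kilobases- before any question of chromatin, neighbors, or cooperativity- is the haystack in which the real sites must be found.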

Thus molecular biologists have been content to do very crude analyses, deleting pieces of DNA around a specific gene, measuring a target gene's expression, and marking off sites of repression and enhancement using those results. Then they present a cartoon:

Drosophila Runt locus, with its various control regions (enhancers) mapped out at top on the genomic locus, and the proto-segmental stripes in the embryo within which each enhancer contributes to activate expression below. The locus spans 80,000 basepairs, of which the coding region is the tiny set of exons marked at top in blue with "Run".

This is a huge leap of knowledge, but is hardly the kind of quantitative data that allows computational prediction and modeling of biology throughout the relevant regulatory pathways, let alone for other genes to which some of the same regulatory proteins bind. That would require a whole other level of data about protein-DNA binding propensities, effects from other interacting proteins, and the like, put on a quantitative basis. Which is what a recent paper begins to do.
"The rhomboid (rho) enhancer directs gene expression in the presumptive neuroectoderm under the control of the activator Dorsal, a homolog of NF-κB. The Twist activator and Snail repressor provide additional essential inputs"
A Drosophila early embryo, stained for gene expression of Rhomboid, in red. The expression patterns of the regulators Even-skipped (stripes) and Snail (ventral, or left) are both stained in green. The dorsal (back) direction is right, ventral (belly) is left, and the graph is of Rhomboid expression over the ventral->dorsal axis. The enhancer of the Rhomboid gene shown at top has its individual regulator sites colored as green (Dorsal), red (Snail) and yellow (Twist). 

Their analysis focused on one enhancer of one gene, the Rhomboid gene of the fruit fly, which directs embryonic gene expression just dorsal to the midline, shown above in red. The Snail regulator is a repressor of transcription, while Dorsal and Twist are both activators. A few examples of deleting some of these sites are shown below, along with plots of Rhomboid expression along the ventral/dorsal axis.

Individual regulator binding sites within the Rhomboid enhancer (B, boxes), featuring different site models (A) for each regulator. The fact that one regulator such as Dorsal can bind to widely divergent sites, such as DL1 and DL2/3, suggests the difficulty of finding such sites computationally in the genome. B shows how well the models match the actual sequence at sites known to be bound by the respective regulators.

Plots of ventral-> dorsal expression of Rhomboid after various mutations of its Dorsal / Twist/ Snail enhancer. Black is the wild-type case, blue is the mutant data, and red is the standard error.

It is evident that the Snail sites, especially the middle one, play an important role in restricting Rhomboid expression to the dorsal side of the embryo. This makes sense from the region of Snail expression shown previously, which is restricted to the ventral side, and from Snail's activity, which is repression of transcription.
"Mutation of any single Dorsal or Twist activator binding site resulted in a measurable reduction of peak intensity and retraction of the rho stripe from the dorsal region, where activators Dorsal and Twist are present in limiting concentrations. Strikingly, despite the differences in predicted binding affinities and relative positions of the motifs, the elimination of any site individually had similar quantitative effects, reducing gene expression to approximately 60% of the peak wild-type level"

However, when they removed pairs of sites and other combinations, the effects became dramatically non-linear, necessitating more complex modelling. In all they tested 38 variations of this one enhancer by taking out various sites, and generated 120 hypothetical models (using a machine learning system) of how they might cooperate in various non-linear ways.
"Best overall fits were observed using a model with cooperativity values parameterized in three 'bins' of 60 bp (scheme C14) and quenching in four small 25 or 35 bp bins (schemes Q5 and Q6)."
Example of data from some models (Y-axis) run on each of the 38 mutated enhancer data (X-axis). Blue is better fit between the model and the data.

What they found was that each factor needed to be modelled a bit differently. The cooperativity of the Snail repressor was quite small. While the (four) different sites differ in their effect on expression, they seem to act independently. In contrast, the activators were quite cooperative, an effect that was essentially unlimited in distance, at least over the local enhancer. Whether cooperation can extend to other enhancer modules, of which there can be many, is an interesting question.
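
To give a flavor of what such a model looks like, here is a minimal fractional-occupancy sketch- all site positions, affinities, gradient shapes, and parameter values below are invented for illustration, not the paper's fitted numbers- combining simple equilibrium binding, distance-binned activator cooperativity, and short-range quenching by a Snail-like repressor:

```python
import numpy as np

# Minimal fractional-occupancy sketch: activators bind with equilibrium
# occupancy, bound activators within a distance bin cooperate, and bound
# repressor sites within a short range quench. All positions, affinities,
# and parameters are invented, not the paper's fitted values.
ACTIVATORS = [(10, 1.0), (60, 0.6), (140, 0.4)]   # (bp position, affinity)
REPRESSORS = [(30, 5.0), (120, 5.0)]              # Snail-like sites

def occ(K, c):
    # equilibrium occupancy of a single site: K*c / (1 + K*c)
    return K * c / (1 + K * c)

def output(act, sna, coop=2.0, coop_bp=60, quench_bp=35, q=0.95):
    total = 0.0
    for p1, k1 in ACTIVATORS:
        theta = occ(k1, act)
        for p2, k2 in ACTIVATORS:              # binned cooperativity
            if p2 != p1 and abs(p2 - p1) <= coop_bp:
                theta = min(1.0, theta * (1 + (coop - 1) * occ(k2, act)))
        for p2, k2 in REPRESSORS:              # short-range quenching
            if abs(p2 - p1) <= quench_bp:
                theta *= 1 - q * occ(k2, sna)
        total += theta
    return total

# Ventral (x=0) to dorsal (x=1): the activators fade dorsally, the Snail-
# like repressor sits in the ventral-most zone, so the output forms a
# lateral, rho-like stripe.
for x in np.linspace(0, 1, 6):
    act = np.exp(-2 * x)
    sna = 1.0 if x < 0.25 else 0.0
    print(f"x={x:.1f}  output={output(act, sna):.2f}")
```

Even this toy version reproduces the qualitative logic: a lateral stripe appears because the repressor quenches the ventral side while the activators fade dorsally.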

Proof of their pudding was in the extension of their models to other enhancers, using the best models they came up with in a general form to predict expression from other enhancers that share the same regulators.

Four other enhancers (Ventral nervous system defective [vnd], Twist, and Rhomboid from two other species of Drosophila) are scored for the modeled expression (red) over the dorsal-ventral axis, and actual expression in black.

The modeling turns out pretty decent, though half the cases are the same Rhomboid gene enhancer from related Drosophila species, which do not present a very difficult test. Could this model be extended to other regulators? Can their conclusion about the cooperativity of repressors vs activators be generalized? Probably not, or not very strongly. It is likely that similar studies would need to be carried out for most major classes of regulators to accumulate the basic data that would allow more general and useful prediction.

And that leaves the problem of finding the sites themselves, which this paper didn't deal with, but which is increasingly addressable with modern genomic technologies. There is a great deal yet to do! This work is a small example of the increasing use of modeling in biology, and the field's tip-toeing progress towards computability.

  • Seminar on the genetics of Parkinson's.
  • Whence conservatism?
  • Krugman on the phony problem of the debt.
  • Did the medievals have more monetary flexibility?
  • A man for our time: Hume, who spent his own time in "theological lying".
  • Jefferson's moral economics.
  • Trump may be an idiot, just not a complete idiot.
  • Obama and Wall Street, cont...
  • The deal is working.. a little progress in Iran.
  • More annals of pay for performance.
  • Corruption at the core of national security.
  • China's investment boom came from captive savings, i.e. state financial control.

Saturday, May 7, 2016

A Son of Hamas Turns His Back

Review of the documentary, the Green Prince. Spoiler alert.

In one of the more bizarre twists of the Palestinian drama, the son of a Hamas leader turned into a tireless worker for the Shin Bet from about 1997 to 2007. Now he lives in the US, at undisclosed locations. This film is essentially a memoir of this story, with two people talking to the camera: Mosab Hassan Yousef, the son, and Gonen Ben Yitzhak, his Israeli intelligence handler.

The format was oddly compelling, because the people are compelling- intelligent and dedicated. But to what? Yousef was raised in the West Bank, the eldest son in a leading family, and became his father's right hand. His father was one of the main people you would hear screaming on the news, preaching publicly about the evils of Israel, the righteousness of Islam and the Intifada, and the need for Hamas to run things in the West Bank as well as Gaza. As Hamas goes, he was not the most extreme, but neither was he a member of the Palestinian Authority- the Palestinian patsies.

Father Hassan Yousef at a Hamas rally.

So turning to the Shin Bet was unthinkable in tribal terms. But when Yousef had his first experience in prison, courtesy of an Israeli checkpoint where he was found with some guns, he had a chance to compare tribes. While the Israelis were harsh, they had limits and operated under some kind of lawful system.

The Hamas cell in the prison, however, was brutally sadistic. Yousef describes the killing of scores of putative spies and informants in horrific fashion, with scant evidence. For an idealistic youth, it presented a problem, especially in contrast to the idealized version of the Palestinian cause that he had grown up with. Where at first he didn't take the offer from the Shin Bet seriously, now he had second thoughts. What if his idealism was more about non-violence, peace, and saving lives than about tribal competition?

There follows a lengthy career relaying information from his position at the center of Hamas with his father to the core of Shin Bet, preventing attacks, preventing assassinations, and also, in essence, dictating his father's fate. A central conundrum of intelligence work like this is how to use the informant's information without giving away his or her identity. To maintain Yousef's cover for a decade bespeaks very careful work on all sides.

But the larger issue remains untouched. While Yousef comes off as heroic and idealistic, the Israeli occupation of the West Bank is no more justified by Israel's lawful and partial restraint (or by its relentless stealing of land) than it is by the bottomless resentment and madness of Hamas. Treat people like prisoners and animals, and they often act that way. Moreover, Israel holds total control. They need no "partners" to resolve their doomed and immoral occupation. They only need to get out, and get their settlers out.


  • Muslims are screwing up the Netherlands and Europe generally.
  • Obama and Wall Street. Next showing: Hillary and Wall Street.
  • Do Republicans know anything about growth?
  • The Saudis are hurting.
  • Another business that "cares" for its customers.
  • Another case of pay for performance.
  • Non-competes run amok. "The Treasury Department has found that one in seven Americans earning less than $40,000 a year is subject to a non-compete. This is astonishing, and shows how easily businesses abuse their power over employees."
  • Our medical system is so dysfunctional and complex that error is third leading cause of death.
  • It almost makes you nostalgic for Richard Nixon.
  • Feel the heart, and the Bern.
  • Deflation and below-target monetary growth is a policy mistake.
  • Will extreme Christians let go of politics, at long last?
  • A little brilliant parenting.

Sunday, May 1, 2016

Audio Perception and Oscillation

Brains are reality modeling machines, which isolate surprising events for our protection and delectation. Does music have to be perpetually surprising, to be heard?

Imagine the most boring thing imaginable. Is it sensory deprivation? More likely it will be something more active, like a droning lecturer, a chattering relative, or driving in jammed traffic. Meditation can actually be very exciting (just think of Proust!), and sensory deprivation generates fascinating thought patterns and ideas. LSD and similar drugs heighten such internal experiences to the point that they can become life-altering. Which indicates an interesting thing about the nature of attention- that it is a precious resource that feels abused not when it is let loose, but when it is confined to some task we are not interested in, and particularly, that we are learning nothing from.

Music exists, obviously, not to bore us but to engage us on many levels, from the physical to the meditative and profound. Yet it is fundamentally based on the beat, which would seem a potentially boring structure. Beats alone can be music, hypnotically engaging, but typically the real business of music is to weave around the beat fascinating patterns whose charm lies in a tension between surprise and musical sense, such as orderly key shifts and coherent melody.

Why is all this attractive? Our brains are always looking ahead, forecasting what comes next. Their first rule is ... be prepared! Perception is a blend of getting new data from the environment and fitting it into models of what should be there. This has several virtues. First, it provides understanding, since only by mapping to structured models of reality are new data understandable. Second, it reduces the amount of data processing, since only changes need to be attended to. And third, it focuses effort on changing or potentially changing data, which are naturally what we need to be paying attention to anyhow ... the stuff about the world that is not boring.

"Predictive coding is a popular account of perception, in which internal representations generate predictions about upcoming sensory input, characterized by their mean and precision (inverse variance). Sensory information is processed hierarchically, with backward connections conveying predictions, and forward connections conveying violations of these predictions, namely prediction errors." 
"It is thus hypothesised that superficial cell populations calculate prediction errors, manifest as gamma-band oscillations (>30 Hz), and pass these to higher brain areas, while deep cell populations [of cortical columns] encode predictions, which manifest as beta band oscillations (12–30 Hz) and pass these to lower brain areas." 
"In the present study, we sought to dissociate and expose the neural signatures of four key variables in predictive coding and other generative accounts of perception, namely surprise, prediction error, prediction change and prediction precision. Here, prediction error refers to absolute deviation of a sensory event from the mean of the prior prediction (which does not take into account the precision of the prediction). We hypothesised that surprise (over and above prediction error) would correlate with gamma oscillations, and prediction change with beta oscillations."

A recent paper (and review) looked at how the brain perceives sound, particularly how it computes the novelty of a sound relative to an internal prediction. Prediction in the brain is known to resemble a Bayesian process where new information is constantly added to adjust an evolving model.
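
For the uninitiated, the simplest version of such an update- a Gaussian belief about the next tone's frequency, nudged by each new observation- looks like the following sketch (all numbers invented):

```python
# Gaussian belief about the next tone, updated tone by tone (a Kalman-style
# conjugate Bayesian update; all numbers invented).
mu, var = 1000.0, 400.0 ** 2       # prior mean and variance (Hz)
obs_var = 100.0 ** 2               # assumed observation noise

for tone in (950.0, 980.0, 1500.0):
    k = var / (var + obs_var)      # gain: how far the new tone pulls the belief
    mu = mu + k * (tone - mu)
    var = (1 - k) * var            # each tone also sharpens the belief
    print(f"heard {tone:6.0f} Hz -> believe {mu:6.0f} Hz (sd {var ** 0.5:5.0f})")
```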

The researchers circumvented the problems of low-resolution fMRI imaging by using volunteers undergoing brain surgery for epilepsy, who allowed these researchers to study separate parts of their brains- the auditory cortex- for purposes completely unrelated to their medical needs. They also allowed the researchers not only to record from the surfaces of their brains, but to stick electrodes into their auditory cortexes to sample the cortical layers at various depths. It is well-known that the large sheet of the cortex does significantly different things in its different layers.

Frequencies of tones (dots) given to experimental subjects, over time.

The three subjects were played a series of tones at different frequencies, and had to do nothing in return- no task at all. The experiment was merely to record the brain's own responses at different positions and levels of the auditory cortex, paying attention to the various frequencies of oscillating electrical activity. The point of the study was to compare the data coming out with statistical models that they generated separately from the same stimuli- ideal models of Bayesian inference for what one would expect to hear next, given the sequence so far.

Electrode positions within the auditory areas of the subjects' brains.

Unfortunately, their stimulus was not quite musical, but followed a rather dull algorithm: "For each successive segment, there is a 7/8 chance that that segment’s f [frequency] value will be randomly drawn from the present population, and a 1/8 chance that the present population will be replaced, with new μ [mean frequency] and σ [standard deviation of the frequency] values drawn from uniform distributions."
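
That generative rule is easy to sketch in code. Only the 7/8 vs 1/8 switching comes from the quoted description; the Gaussian populations and the frequency and spread ranges below are my assumptions:

```python
import random

# Sketch of the stimulus generator quoted above. Only the 7/8 vs 1/8
# switching rule comes from the paper's description; the Gaussian
# populations and the frequency/spread ranges are assumptions.
def make_stimulus(n_segments=100, mu_range=(500, 2000), sigma_range=(10, 200)):
    mu = random.uniform(*mu_range)
    sigma = random.uniform(*sigma_range)
    tones = []
    for _ in range(n_segments):
        if random.random() < 1 / 8:
            # replace the population: draw a new mean and spread
            mu = random.uniform(*mu_range)
            sigma = random.uniform(*sigma_range)
        # otherwise (7/8 of the time) keep the present population; either
        # way, the segment's frequency is drawn from the current one
        tones.append(random.gauss(mu, sigma))
    return tones

print(["%.0f" % f for f in make_stimulus(10)])
```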

Correlations were calculated between the observed and predicted signals, giving data like the following:

Prediction error and surprise are closely correlated, but the experimenters claim that surprise is better correlated to the gamma band brain waves observed (B).

The difference between observation and prediction, and between surprise and prediction error. Surprise apparently takes into account the spread of the data, i.e. if uncertainty has changed as well as the mean predicted value.

What they found was that, as others have observed, the highest frequency oscillations in the brain correlate with novelty- surprise about how perceptions are lining up with expectations. The experimenters' surprise (S) measurement and prediction error (Xi) are very closely related, so they both correlate with each other and with the gamma wave signal. The surprise measure is slightly better correlated, however.
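
The distinction is worth making concrete: under a Gaussian prediction, prediction error ignores the predicted spread, while surprise- the negative log probability of the tone- depends on it. A sketch with invented numbers:

```python
import math

# Prediction error vs surprise under a Gaussian prediction N(mu, sigma^2).
# Numbers are invented; surprise is the negative log probability in nats.
def prediction_error(x, mu):
    return abs(x - mu)

def surprise(x, mu, sigma):
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + (x - mu) ** 2 / (2 * sigma ** 2)

x, mu = 1100.0, 1000.0            # heard 1100 Hz, predicted 1000 Hz
for sigma in (50.0, 200.0):       # a confident vs a vague prediction
    print(f"sigma={sigma:5.0f}:  error={prediction_error(x, mu):.0f} Hz,"
          f"  surprise={surprise(x, mu, sigma):.2f} nats")
```

The same 100 Hz miss is more surprising when the prediction was confident.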

On the other hand, they observed that beta oscillations (~20 Hz) were correlated with changes in the predicted values. They hypothesized that beta oscillations are directed downward in the processing system, to shape and update the predictions being used at the prior levels.

Lastly, they find that the ~10 Hz alpha oscillations (and related bands) correlate with the uncertainty or precision of the predicted values. And theta oscillations at ~6 Hz were entrained to the sound stimulus itself, hitting when the next sound was expected, rather than encoding a derived form of the stimulus.

It is all a bit neat, and the conclusions are dredged out of very small signals, as far as is shown. But the idea that key variables of cognition and data processing are separated into different oscillatory bands in the auditory cortex is very attractive, has quite a bit of precedent, and is certainly an hypothesis that can and should be pursued by others in greater depth. The computational apparatus of the brain is very slowly coming clear.
"These are exciting times for researchers working on neural oscillations because a framework that describes their specific contributions to perception is finally emerging. In short, the idea is that comparatively slow neural oscillations, known as “alpha” and “beta” oscillations, encode the predictions made by the nervous system. Therefore, alpha and beta oscillations do not communicate sensory information per se; rather, they modulate the sensory information that is relayed to the brain. Faster “gamma” oscillations, on the other hand, are thought to convey the degree of surprise triggered by a given sound."

  • Bill Mitchell on the Juncker regime.
  • Who exactly is corrupt in Brazil, and how much?
  • There are too many people.
  • But not enough debt.
  • The fiscal "multiplier" is not constant.
  • Population has outstripped our willingness to build and develop.
  • What's going on in the doctor's strike?
  • Schiller on lying in business, Gresham's dynamics, and marketing.
  • Lying in religion.
  • Stiglitz on economics: "The strange thing about the economics profession over the last 35 year is that there has been two strands: One very strongly focusing on the limitations of the market, and then another saying how wonderful markets were."
  • Should banks be public institutions?
  • Does democratic socialism have a future in Russia?
  • A Sandersian / Keynesian stimulus is only effective if the Fed plays along.
  • Science yearns to be free.
  • Trump's brush with bankruptcy and friends in high places.

Saturday, April 23, 2016

Locating Abstractions in the Brain

The most human part of the brain is also the murkiest and least understood. Visualization studies of what is going on in the frontal cortex.

While it was in vogue, the lobotomy operation was used to treat in the neighborhood of 100,000 people in the mid twentieth century, rendering them more manageable- something that has since been more easily achieved with drugs. From the Wiki page:
"The purpose of the operation was to reduce the symptoms of mental disorder, and it was recognized that this was accomplished at the expense of a person's personality and intellect. British psychiatrist Maurice Partridge, who conducted a follow-up study of 300 patients, said that the treatment achieved its effects by 'reducing the complexity of psychic life'. Following the operation, spontaneity, responsiveness, self-awareness and self-control were reduced. Activity was replaced by inertia, and people were left emotionally blunted and restricted in their intellectual range."

What is odd is that for such a massive disruption to the brain, the effects were diffuse and hard to understand (though in fairness, the methods used were hardly uniform). "The first bilateral lobectomy of a human subject was performed by the American neurosurgeon Walter Dandy in 1930. The neurologist Richard Brickner reported on this case in 1932, relating that the recipient, known as 'Patient A', while experiencing a flattening of affect, had suffered no apparent decrease in intellectual function and seemed, at least to the casual observer, perfectly normal."

Among the reported effects: some subjects no longer dreamed, and some lost their theory of mind, the ability to empathize with others. Some entered a stupor or started suffering seizures. There were various intellectual and personality deficits- one became a "smiling, lazy and satisfactory patient with the personality of an oyster". Five percent died. One subject mentioned:
"It took a great deal of effort to keep an abstraction in mind. For example, in talking with the speech therapist I would begin to give a definition of an abstract concern, but as I held it in mind it would sort of fade, and chances were that I'd end up giving a simplified version rather than one at the original level of conception. It was as though giving an abstraction required so much of my addled intelligence that halfway through the definition I would run out of the energy available to me and regress to a more concrete answer. Something like this happened again and again."

An irony is that the Soviet Union took the lead in banning the procedure, "Doctors in the Soviet Union concluded that the procedure was 'contrary to the principles of humanity' and 'through lobotomy an insane person is changed into an idiot.'"

Modern brain scanning allows researchers to peer into the frontal lobes and start figuring out what is going on there. A recent paper described some early work in that direction, devising simple tasks to differentiate levels of abstract thought and mapping where they happen, using fMRI. They manage to map separate zones in the frontal cortex that handle temporal / time shifting abstractions, category switching abstractions, and feature attention control.

The subjects were presented with points that, over several frames, added up to a diagram (C): a star with letters on the outside, with a color applied. There were several rules imposed, such as: if the color setting was purple, the letters were supposed to be assembled to form a word across the star (TABLET, in this case); if the color was orange, the subject was supposed to just trace the points of the star with her eyes. Then delay rules were added, asking whether the trial was the same type or a different type than the one before. Or the subject was given a new diagram but asked to maintain their place in the old diagram, to be recalled later. Then distraction periods were added in between to test for memory retention. It all begins to look like an intelligence test, for the subject's ability to keep ideas and rules in mind successfully.

Test design, in part. C shows the basic image presented to the subject, which would have included color as well, and varied the shape and text presented. The points of the star were not presented all at once, but fed out one point at a time. B shows the combined tests that were devised. For instance, the restart test asked the subject not to delay their analysis, but simply presented a new diagram and asked them to resolve the color and text diagram by the agreed rules.

The tests were designed to separate three topics of thought, and were added together in various combinations to allow the researchers to run combinatorial tests. The upshot was that they were able to map the three tasks to different parts of the frontal cortex:

Distinct mappings of each task to its region. Handling time delay and abstraction occupies the very front of the brain (rostral), while simpler abstractions, keeping track of the local context of a task or attending to selected features of an image/task, occupy precincts farther back (caudal). This is in addition to separate zones in the mid-brain.
"Regressing these measures onto activation revealed a clear gradient such that caudal LPFC [lateral prefrontal cortex] was related to current, but not future processing, while rostral LPFC was related to future, but not current processing, with mid LPFC showing activity related to both current and future processing "


They end up with a beautiful depiction of the regions of the brain where their various tasks took place. Unfortunately, fMRI imaging technology remains very crude, in time and space, so their task breakdown was similarly crude to suit. It will probably take new technology to go to deeper detail on what is going on in the human frontal cortex- the part of the brain most responsible for making us human, but also, since it handles abstractions farthest from detailed concrete processing, the most nebulous and hard to define.

  • Inequality isn't just a bleeding heart issue, but an investment and prosperity issue.
  • Solow on labor power and inequality.
  • Tax complexity isn't entirely the government's fault, but another dividend of corruption.
  • Retirement is another big front in the inequality debate.
  • Utopia now and then.
  • Globalization is a problem.
  • Some problems with supply side theory. Perhaps taxes make people work harder.
  • Pay is a complicated construct.
  • We need more debt.
  • But perhaps less bail.

Saturday, April 16, 2016

Euhemerization

People making gods, as usual- and the mythical nature of Jesus.

All aspects of the existence and nature of Jesus are a matter of theory, not fact. So much of the early literature about him is forged, made-up, laced by myth and parable, and templated by religious traditions, philosophical preconceptions and political exigencies, that the nature of (or existence of) the actual, historical Jesus is a matter of speculation and inference at best.

Bart Ehrman wrote an exasperated book about the evidence for the historical Jesus, affirming, despite his own lack of conventional faith, and through his dedicated scholarship in the field, that the consensus position of Christians and scholars is correct. The problem of the thinness of the evidence remains, however, since all the evidence comes from internal (Christian) and late (not contemporaneous) sources. This is not unusual or unexpected for any Roman of this time, other than the very highest levels of emperors and writers, but hardly allows a solid case either pro or con. A great deal turns, for instance, on one's interpretation of the word "brother", since Paul, in letters that are widely agreed to be reasonably authentic, refers to James as a brother of Jesus. If this means a biological brother, it means that Jesus, by this chain of evidence, really existed biologically. Whether his mother was a perpetual virgin is another matter, of course! Or was James a spiritual brother, as the common usage has been in many religious communities? Ehrman, as an expert, comes down clearly on the biological side.

Myth, or just mythic?


Both cases, for and against the historicity of Jesus, are thus circumstantial, based on the credibility of scraps of evidence, or the credibility of a counter-story elaborated by the mythicists, where Jesus begins as a deity who is brought down to earth (euhemerized) for a variety of motives that are quite understandable, and precedented by similar gods and god-men before and since. Casting one's god as a real person makes the provenance and stability of his teachings more secure than that of a deity that communicates through revelation, and could do so again at any time. And stories are easy to make up and write down. A recent talk by Richard Carrier makes this case with gusto.

I am not going to rehash the arguments here, but only say that the pro-historical case, while certainly traditional, popular, and even likely, is, even by Bart Ehrman's telling, hung on very thin threads of internal evidence, on texts whose transmission to us is an endless story of copying, re-copying, correction, obfuscation, politics, and forgery. The early Christian times are a fascinating period of political and archetypal turmoil. No path is straight, least of all the texts that purport to tell the story. Take, for instance, the case of Marcion, who supposedly collected letters of Paul and devised the first Christian canon. Marcion is thought to have written a good bit of it himself, and founded a theology that was very popular in its day, only to ripen into heresy later on at the hands of what comes down to us as orthodoxy.

The project of making Christianity's hodge-podge of scriptures fit the orthodox story as it evolved through the centuries is mind-bogglingly complicated and obviously ongoing, given the many versions of the Bible and of Christianity that are still running around. The process is reminiscent of the paradox of Islam, where those who take its origins and scripture most seriously are the most righteous and violent, whereas those who merge into more mature traditions, as they ripened through time into human, and typically humane, institutions, are much more resistant to the fundamentalist call.

Getting back to the foundations, what is the precedent for euhemerization such as what happened to the person or entity we call Jesus? And for its complement, apotheosis? These days, the traffic between heaven and earth has hit some kind of traffic jam. But in antiquity, it was far more common for people such as kings and emperors to become gods, and also for gods to come down to earth, in tales such as the Homeric epics. Divinity was assumed to exist, and divine beings were pretty much formed in the image of ourselves, at our most powerful. Both the Jewish god(s) and the Greek gods were distinguished by their power much more than their knowledge, let alone their emotional wisdom or kindness.

Even farther back, the template is of course the family, and the trauma of death. The death of any person, let alone a powerful, archetypal person like a parent, is unimaginable. How can life stop cold, how can existence simply end? Impossible. We have thus come up with a rich set of rationalizations and theologies of additional existence. They typically involve the movement of people (souls) from this world to some other invisible world, where they look back with fondness to what is still the important place, our world.

But then comes the important question of whether and how this spiritual world, if it is to have any ongoing function for us, interacts with ours. Our souls clearly have some modus operandi by which they co-function with our living bodies, mortal though they are. Likewise, spirits and gods must have some way back into the world if we wish to involve them in our dramas. Thus we end up with a rich literature of heroic journeys to heaven (or the underworld) and back, gods taking up disguises as women or men (or animals), throwing thunderbolts, causing natural cataclysms, etc.

It is only the higher psychological and philosophical sophistication of our age that has slowed down this traffic, though it peeks out of our unconscious in the endless array of super-hero movies, not to mention a majority of the country that still holds fast to some version of the traditional theological stories.

Let us close with a couple of quotes from Thomas Paine speaking of the Christian believer, vs a true deist, from his deist book, "The Age of Reason":
"Yet, with all this strange appearance of humility, and this contempt for human reason, he ventures into the boldest presumptions. He finds fault with everything. His selfishness is never satisfied; his ingratitude is never at an end. He takes on himself to direct the Almighty what to do, even in the govemment of the universe. He prays dictatorially. When it is sunshine, he prays for rain, and when it is rain, he prays for sunshine. He follows the same idea in everything that he prays for; for what is the amount of all his prayers, but an attempt to make the Almighty change his mind, and act otherwise than he does? It is as if he were to say -- thou knowest not so well as I."
"The Bible of the creation is inexhaustible in texts. Every part of science, whether connected with the geometry of the universe, with the systems of animal and vegetable life, or with the properties of inanimate matter, is a text as well for devotion as for philosophy -- for gratitude, as for human improvement. It will perhaps be said, that if such a revolution in the system of religion takes place, every preacher ought to be a philosopher. Most certainly, and every house of devotion a school of science."

  • Shadows from the past: Hillary and Honduras, one reason for a new influx of refugees to the US.
  • Freedom for me, but not for thee.
  • Who pays for corporate taxes? Is corporate power and capital mobility so great that they can off-load all costs onto workers and taxpayers? "We need also to account for the financial, administrative, and strategic costs of tax avoidance." Maybe we need stronger international governance.
  • Should central banks be unaccountable?
  • Lobbying and corruption is by far the best investment.
  • Stiglitz on negative rates... too little too late.
  • Mice who stutter!
  • The national debt is not a problem, at all.

Sunday, April 10, 2016

Who am I? Mechanics of Cell Identity

How do neurons in the fly know which segment they are in?

Organismal development is a biological mystery that is being gradually unravelled in labs all over the world in that heroic endeavor called "normal science". Which is the pedestrian counterpart to the Kuhnian revolutions termed paradigm shifts. That the endogenous materials and genetic code of the egg/embryo generate the later adult forms has been known ever since scientists gave up vitalistic and other religious ideas about our biology. But how that happens ... approaching that question has taken lots of modern technology and persistence.

Fruit flies are the leading model system for embryonic and organismal development, due to their marriage of complex body plans, simple experimental handling, and extraordinarily deep genetics. After almost a century of productive study, a revolution happened in the 1980s in fruit fly genetics, following new mutant screens that uncovered some of the most basic mechanisms in body plan development. The genes found and analyzed during this period established a basic paradigm that has extended to all metazoans that have segmented body plans. Do we have segments? Yes, our backbone is a testament to our segmented ancestors.

The fly is built out of segments, whose cells know where/what they are by virtue of special genes expressed in them- the homeotic genes. The major genes of the fly homeotic complexes are, in order, Labial, Proboscipedia, Deformed, Sex combs reduced, Antennapedia, Ultrabithorax, Abdominal-A, and Abdominal-B.
The theme of these studies was that a series of genes, typically regulators of the expression of other genes, are turned on in sequence during development to identify progressively finer regions of the developing body. So at first, the two ends of the egg cell or synctium are set as different, then some gross regions are defined, and later on, each segment (and each side of each segment) expresses a few key genes that identify its cells, so that another cell, say a nerve cell migrating through the area, can tell exactly where it is. Each protein is expressed in a gradient within its zone, allowing the next regulator in the process to detect which end of that gradient it lies in, and thus whether to turn on or not. Late in this genetic series are the Hox genes, which are notorious for the complexity of their own regulation, for their ability, when mutated, to transform the identity of some segments entirely into other ones, and for the linear relationship between their chromosomal position and the locations on the body where they are individually expressed.
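
As a cartoon of that cascade, consider the following sketch (gradient shapes and thresholds are invented for illustration), in which each tier of regulators reads the previous gradient against a threshold to carve the axis into finer domains:

```python
import numpy as np

# French-flag cartoon: each tier turns on where the previous tier's product
# clears a threshold, subdividing the axis. Gradient shapes and thresholds
# are invented for illustration.
x = np.linspace(0, 1, 10)              # anterior (0) -> posterior (1)
maternal = np.exp(-3 * x)              # maternal gradient from one pole

gap = (maternal > 0.3).astype(float)   # a gap-gene-like anterior domain
# A pair-rule-like stripe: on where the gap gene is on AND the maternal
# signal is below a second threshold, i.e. the posterior end of the domain.
stripe = gap * (maternal < 0.6)

for xi, m, g, s in zip(x, maternal, gap, stripe):
    print(f"x={xi:.2f}  maternal={m:.2f}  gap={g:.0f}  stripe={s:.0f}")
```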

Progressive genetic specification of the fly embryo body plan, dividing it up into segments.  Gradients of one gene product allow the next gene product to detect the sides of its compartment and thus refine its cellular and body identity to a finer level.

A recent paper took up this adventure in the area around the head and neck, asking how embryonic nerve cells (neuroblast stem cells) originating in segments 4 to 6 know who they are and where to go. While one might not think that an animal head has segments at all, in embryological and molecular terms heads encompass about 7 segments (in the fly), which go through very messy convolutions into the complex mature structure. In comparison, body segments are far more orderly. Indeed, the central thoracic segment appears to be the default state, needing no Hox gene expression to develop normally:
"While thoracic identities seem to represent a ground state (T2, no input of Hox genes), identities of consecutive posterior segments are established by adding the function of Bx-C Hox genes Ultrabithorax (Ubx), abdominal-A (abdA) and Abdominal-B (AbdB), an evolutionary highly conserved phenomenon described as posterior dominance or prevalence of Hox genes. The terminal abdominal neuromeres A8-A10 exhibit a progressively derived character regarding size and composition. In these segments, NB [neuroblast, or neuronal stem cell] patterns and segmental identities are controlled by combined action of the Hox gene AbdB and the ParaHox gene caudal."

Map of the Drosophila head region, stained to show the Engrailed gene product. This is a homeotic segment polarity gene, expressed on one side of each segment throughout the embryo at this stage. At bottom is a map, coding the different segments accounted for within the head: red- antenna segment; purple- ocular segment; orange- intercalary segment; brown- labral segment; black- mandibular segment; green- maxillary segment; blue- labial segment; gray- first thoracic segment. In ensuing figure, the embryo is squashed to lay out the segments better.

The head segments likewise require extensive input from the Hox genes to keep their identities distinct. The researchers use a series of mutants to figure out how the local (segments 4 to 6) neuronal stem cells respond to missing genetic homeotic inputs. To do this, they use a few morphological characteristics and gene markers (assays for a gene whose expression is restricted to a certain lineage or cell type, in this case antibodies specific to the respective proteins) to identify the neuroblasts or stem cells they are interested in.

Stem neurons in three segments are stained with a combination of gene expression probes: Eagle in green, Runt in red, and Engrailed in blue. Note how combined expression renders some key cells aqua (green + blue) or yellow (green + red). Other diagnostic genes used for cell identification, which are all known to have developmental roles, are Deadpan, Deformed, Repo, Even-skipped, Eyeless, Sex combs reduced, Proboscipedia, and Gooseberry. The segments, from front [top] to back, are mandibular (mad), maxillary (max) and labial (lab). In back of the labial segment is the first thoracic segment. This stage of development (12) is quite early, well before the first larva forms.

Many figures of embryos later, stained for the expression of various proteins, in flies mutated for various key homeotic genes, and analyzed for the presence of notable cells at various stages, the authors draw several conclusions about the genetic influences that determine the identity and existence of neurons in these head segments, some of which will go on to contribute to the adult fly's brain. First, the maxillary segment, including its neuronal stem cells, expresses Deformed and Sex combs reduced from the Hox genes, while the next labial segment expresses Labial, but not in its neuronal cells. These seem to be the principal determinants of segmental identity. Yet when Deformed is mutated, only about half the cells are transformed from maxillary identity to a labial or thoracic identity. Only when another homeotic gene is also mutated, either Antennapedia or Labial, is the transformation more complete.

The curious thing about this is that neither Antennapedia nor Labial are normally expressed in the maxillary head segment, so the effect of their mutation must not be what the researchers term cell-autonomous. These other genes must be acting from some distance away, instead of directly via their own expression in the cells being affected. This gets these researchers quite excited, and they track down some of the mechanism behind this extra cell fate specification.
"We identify the secreted molecule Amalgam (Ama) as a downstream target of the Antennapedia-Complex Hox genes labial, Dfd, Sex combs reduced and Antennapedia. In conjunction with its receptor Neurotactin (Nrt) and the effector kinase Abelson tyrosine kinase (Abl), Ama is necessary in parallel to the cell-autonomous Dfd pathway for the correct specification of the maxillary identity of NB6-4. Both pathways repress CyclinE (CycE) and loss of function of either of these pathways leads to a partial transformation (40%), whereas simultaneous mutation of both pathways leads to a complete transformation (100%) of NB6-4 segmental identity."

Summary of findings, where Deformed is the main, local homeotic specifier for the maxillary segment neurons. But additional help comes from the next-door labial segment which expresses the homeotic gene Sex combs reduced, which influences expression in turn of the diffusible protein Amalgam, which helps the nearby maxillary segment keep its identity, via repression of the gene cyclin E. Interestingly, the Amalgam gene is located in the homeotic cluster right next to Deformed.


So what had originally been thought of as a fully cell-autonomous system, whereby each homeotic gene or combination thereof dictates the identity of cells in each respective segment where it is itself expressed, turns out to be a bit more messy, with neighbor effects that refine the identity code. Obviously this is getting into the deep weeds of developmental biology, but at the same time is an outstanding example of where the field is today, filling in ever-finer details of how development happens, using sophisticated techniques and backbreaking amounts of work.