Saturday, March 1, 2014

Jesus: miracle, midrash, or myth?

Did Jesus really exist? A review of Robert Price's "The Christ-Myth Theory and Its Problems".

What?! Doesn't everyone agree on this most central historical fact? The fact from which our very historical time is measured? Indeed, wasn't Jesus blond-haired and blue-eyed? Well, no, and the reason is that the evidence for the existence of Jesus in any historical sense is extremely thin to non-existent. Biblical scholar and professor Robert Price weighs the evidence, and comes down very tentatively on the myth side of the equation. I will be following his analysis, more or less, below. But the fact of the matter is that we have so little to go on that either position is equally valid, and equally tenuous- Jesus might have existed, might not ... no one really knows. Indeed it would be accurate to say that we know that the Jesus we know is mostly myth. The only question is where in the low figures the percentage of reality is: 20%, 10%, or 0%?

The Jesus as myth hypothesis posits that the gospel writers were furiously filling an entirely blank biography with an amalgam of Old Testament rewrites, (similar to the Jewish practice of midrash), Homeric themes, and new archetypal and theological material. Whether the subject was historically real or not was, as frequently in the ancient world, (and today!), not of the highest concern, once the community had fastened onto its inverted Jewish Messiah story. Even today, the insistence of the political right wing in the US on its "facts" is an object lesson in real-time myth-making. And the Messiah itself was such a common theme in this tumultuous time, both in the Jewish world, and in the Roman world generally, that a miracle-working, dying and rising superman was easy to conjure, whatever the historical seed may or may not have been. Many others have raked over this territory far better than I, so take this as an appetizer of sorts for the critical analysis of others.

But let's get to the main points of the case- the evidence, and lack thereof.

  • The historical traces.
Aside from the New Testament, we have virtually no mentions of Jesus. The few that exist come decades after his time, were in some cases inserted by unnamed later authors, and in any case merely repeat the Jesus story as it was current among Christians of the time, with no detailed or independent information:
Josephus, writing ~93 CE
"Festus was now dead, and Albinus was but upon the road; so he assembled the sanhedrin of judges, and brought before them the brother of Jesus, who was called Christ, whose name was James, and some others; and when he had formed an accusation against them as breakers of the law, he delivered them to be stoned."
Tacitus, writing ~116 CE
"Consequently, to get rid of the report, Nero fastened the guilt and inflicted the most exquisite tortures on a class hated for their abominations, called Christians by the populace. Christus, from whom the name had its origin, suffered the extreme penalty during the reign of Tiberius at the hands of one of our procurators, Pontius Pilatus, and a most mischievous superstition, thus checked for the moment, again broke out not only in Judæa, the first source of the evil, but even in Rome, where all things hideous and shameful from every part of the world find their centre and become popular."

Pilate was a true historical figure, attested by archeology as well as historical writings. But Tacitus is evidently referring to the story as told by the Christian followers, so the connection to Jesus is here one of hearsay, as is the passage by Josephus. Now, the absence of evidence is in no way conclusive. Even though the miracles attributed to him, the quasi-revolution fostered by him, and the brutal judgement by his community and execution by Rome might well have excited some kind of contemporary commentary, none has come to light. Nor is any likely ever to come to light, considering how fervently such material has already been sought.

  • The name. 
Jesus is a form of the Hebrew Joshua, meaning Yahweh saves ... a savior. While this was a reasonably common name, "The works of Josephus refer to at least twenty different people with the name Jesus... ", it is awfully convenient for the presumptive Messiah to have this name. One hypothesis might be that Mary received her vision of why she happened to be miraculously pregnant, and had Joseph choose this propitious name. Or it might have been a coincidence, or a miracle. Or there may have been many valid messiahs in Palestine at this time, of whom only those named Jesus rose above the noise or took the mission onto themselves. A bit like someone in Mexico named "Jesus" becoming the next religious founder. Or, it might have been applied after the fact to a mythical savior, along with the equally honorific "Christus", meaning messiah. Critical historical analysis, in which Price gives a few lessons, tends to favor the easier hypotheses over the outlandish, convoluted, or coincidental.

The hometown of "Nazareth" is hardly more helpful, since it is not clear that such a town existed at the time. It is quite possibly yet another power-name to go with "Christ" and "Jesus", this one meaning "branch", another reference to the messiah. "Jeremiah 23:5: 'Behold, the days are coming,' declares the Lord, 'When I will raise up for David a righteous Branch. And He will reign as King and act wisely and do justice and righteousness in the Land.'"

  • The Epistles.
The earliest writings within Christianity are regarded as whichever of the Pauline Epistles are genuine, from about the 50s CE. These paint a far different picture of Jesus than the storybook Gospels. Paul never claims to have met Jesus, other than by a vision, and refers to him always in a sort of Homeric formula, as Christ crucified, or our Lord, etc. Price mentions that if Paul had possessed the various sayings of Jesus thought to have been current in the community via the Q source or gospel of Thomas material, he would surely have used them in his various arguments recorded in the epistles. But there are no quotes at all, indeed no biographical Jesus at all, other than indirectly in references to Jesus's brother James, which we will get to below. (For a contrary view..) Perhaps this was just Paul's style, and a mark of his "outsider" status vs the Jerusalem church, but it is hard to square with a personality cult, like the one that developed around Muhammad, for instance. The vast Hadith collection, much of it thought to be fabricated, grew up in Muhammad's wake, and it hardly made a difference whether the material was genuine or not. But Paul's ignorance of such a tradition indicates that it may have arisen later, just in time for the gospel writers, or been distrusted as a source by Paul.

  • The Gospels.
The four gospels are great artistic achievements, certainly when transmitted through the language of the King James committee. But where did they come from, and what were they really saying? Firstly, the authors are unknown, as the canonical names were applied by others. They were written in the 70s-80s CE, except for John, which comes later by a couple of decades. I won't even deal with the contradictions among them, which are legion despite being partly derived from some common sources.

Price notes that each of the gospels tells a very archetypal story. Each anecdote has a lesson, each episode a moral. It is not history in the conventional or modern sense, since the story is there to push the theology rather than say simply what happened. Jesus becomes the archetypal hero, with plenty of precedent, both ancient and modern. Born of a virgin, precocious, foretold in countless ways from the Old Testament, possessing special powers and insights, disbelieved, becoming a king (if in an inverted way), then brought down, only to rise again as the scapegoat for all sins. The Jews had long been on the lookout for a messiah, and the wider Roman world indulged in many similar hero-mystery religions.

Price spends most of the book going literally chapter and verse through the New Testament to dredge up the models that inform each passage. Most come from the Old Testament, though some also come from Homer or Euripides. Many of the comparisons seem rather strained to me, but there are also quite convincing sections. For instance, a long section of Luke is passage for passage pretty much a re-casting of Deuteronomy. A few examples of Price's comments:

Deuteronomy 8:1-3 / Luke 10:38-42
"Luke has created the story of Mary and Martha as a commentary on Deuteronomy 8:3, 'Man does live by bread alone, but... man lives by every word that proceeds from the mouth of the lord.' Luke has opposed the contemplative Mary who hungers for Jesus' ('the lord's') 'words' with the harried Martha ('Lady of the house', hence an ideal, fictive character), whose preoccupation with domestic chores, especially cooking and serving, threatens to crowd out spiritual sustenance (cf. Deuteronomy 8:11-14). It is not unlikely that the passage is intended to comment in somne way on the issue of celibate women and their various roles in the church of Luke's day (cf. 1 Timothy 5:3-16)."

Deuteronomy 8:4-20 / Luke 11:1-13
"Deuteronomy compares the discipline meted out to Israel by God with the training a father gives his son, then remind the reader of the fatherly provision of God for his children in the wilderness and promises of security, prosperity, and sufficient food in their new land. Luke matches this with his version of he Q Lord's Prayer, sharing the same general themes of fatherly provision and asking God to spare his children 'the test', recalling the 'tests' sent upon the people by God in the wilderness. Luke adds the Q material about God giving good gifts to his children (Luke 11:9-13/Matthew 7:7-11), certainly the point of the Deuteronomy text, together with his own parable of the Importunate Friend, which (like its twin, the parable of the Unjust Judge, 18:1-8, also uniquely Lukan) urges the seeker not to give up praying 'How long, O Lord?'"

Deuteronomy 13:12-18 / Luke 12:54-13:5
"Whole judgement of his people.. Whole cities lapsing into pagan apostasy are to be eliminated, destroyed, Deuteronomy mandates, with nothing ever to be rebuilt on their desolation, so seriously does Israel's God take spiritual infidelity. No less gravely does the Lukan Jesus take the lack of repentance on the part of the Galileans and Jews. Past tragedies and atrocities will be seen as the mere beginning of the judgments to fall like the headsman's axe on an unrepentant people. Of course, the Lukan Jesus prophesies long after the fact, referring to the bloody triumph of Rome in Galilee and Judea culminating in 73 CE."

The idea is that the New Testament is a sort of midrash, a common form of Jewish literature, where homilies are given on scripture themes, sometimes with only glancing or metaphorical reference to the source. Old wine into new bottles, so to speak. One might argue that Jesus himself delivered all these homilies in structured form, commenting on Torah passages and stories, in sequence, as he preached through the land, with scribal listeners taking careful note. But the much likelier hypothesis is that the structure as well as the content came much later, in the quiet of the authorial chamber, with the relevant scrolls at hand.

Hindu traditions are full of this kind of thing, (though generally oral, not scribal), as gods make multiple re-appearances, (avatars), each one provided with related, but different, stories. No one wonders whether any of these characters were "really real" or not. The human need for transcendent, not to say magical, heroic drama seems universal and insatiable. Harry Potter comes to mind also, including the vast fan literature it has generated. The Jewish community in its many sects and off-shoots was very active in this respect, to the point that one can imagine a fresh hero derived from the messianic and prophetic strains of the Old Testament, who spends his (fictive) time fulfilling OT prophecies and updating lesson after lesson out of the OT. And the Hellenized proto-Christians took off with it, in perhaps unexpected and unintended directions.

The Jesus Seminar was a conclave of biblical scholars who attempted a sort of Jeffersonian re-write / redaction of the bible, casting stones on all the less believable material (miscellaneous miracles, infancy stories), while keeping the moral sayings and teachings, as presumptively "genuine". But Price (who was a member of this august body!) points out that this hardly addresses the basic question of believability, let alone historicity. It is like taking the Superman story and deleting the flying-through-the-air parts, and thinking that what you have left is more truthful. No, the whole story was of a piece. An archetypal piece that had a purpose for its time, taking the form of history, but not necessarily being history.

Two other examples that come to mind are Islam and Mormonism. Islam would be amenable to the Jesus Seminar approach, since Muhammad is certainly a historical figure. Here it makes sense to separate, say, his night flight to Jerusalem from his marriage to nine-year-old Aisha bint Abu Bakr. The latter, quite believable. The former, not so much. Mormonism, on the other hand, is fabricated from top to bottom. Not that Joseph Smith was not a historical person, but that the book of Mormon is a work of utter fantasy, concocted from Bible bits, completely made-up history, and portentous language. This type of thing seems endemic to the human condition, cropping up again in Scientology even more recently. The ancient world had even more porous relations between factual and fantasy history, and even sci-fi dystopia / analogy / futurology, as the book of Revelation makes clear. Heroes can be made to order.

A small further example is the birthday of Jesus, i.e. Christmas. This is a total fabrication, merely the co-optation of the existing Saturnalia by the new religion, with no knowledge whatsoever of the true birth date. Yet this too is taken as "gospel" by plenty of people.

  • James, brother of Jesus.
Then there is the reputed brother of Jesus, James, who is substantially better-attested historically, leading the early Jerusalem church, with plenty of tangles with Paul, among others. Price has some fun with the Catholic somersaults on the nature of James, since by its interpretation, Mary was a perpetual virgin, and thus Jesus having a brother was a no-no. But he is called by Paul and others the brother of Jesus. This is perhaps the biggest single problem with the myth hypothesis- the one thread that best testifies to the reality of Jesus himself. But "brother" is a notoriously flexible term. The medieval monasteries were rife with them, and Price offers that James was perhaps a follower of higher grade than the rest in some other respect, as was later reflected by his temporal leadership, and was thus inducted, whether contemporaneously or latterly, into the innermost circle of the heroic mystery. One has to admit this interpretation is quite strained, given how Paul (and then Josephus, as above) refers to James as the lord's brother very casually in passing.

  • Analogous to climate heating denial?
Lastly, one has to ask whether this myth hypothesis is just headstrong denialism- the last gasp of the dedicated atheist. Price points out, however, that the Jesus myth theory is largely unrelated to atheism per se. Jesus could easily have been real, and done all the Seminar-approved things, and there still not be a god. Conversely, god could exist, yet Jesus not be his messiah, as the Jews have long maintained, or not have existed at all. There are plenty of other gods to choose from, after all.

It is certainly cantankerous, even in this skeptical age, to point out that the reality of Jesus is far from secure. And as everyone points out, the vast majority, even of Biblical scholars, take the opposing position. But the vast majority of Biblical scholars are both believing Christians and have a vested interest in their subject. So a majority here does not count for as much as one might think.

A comparison with climate denialism is instructive. On that front, the majority is led by scholars working with far more data, much of which is contemporary, public, and reproducible. Their interaction with the historical record is far more dynamic, as new forms of evidence, like tree rings, stalagmite rings, fossil coral, isotope analyses, etc., allow us all to peer ever farther and more accurately into the instructive past. What a difference from the Biblical scholars (or ourselves) ruminating over their feelings about this or that passage!

Importantly, in contrast to the case of Christianity, it is the climate denialists who bear the metaphorical cross of motive in this case, since they are often paid by the very industries whose economic interest (indeed existence) lies in denying what has been patently obvious for over a century- that CO2 is a greenhouse gas, that our atmosphere acts as a greenhouse, and that the biosphere is currently being decimated by (geologically) rapid heating.

While climate scientists have a coherent theory of their topic, (and their deniers really do not), the Jesus myth theorists have yet to come up with a detailed theory of which community among the first century Jews had the motivations and materials to generate the tradition that was taken up by the apostles and Paul, as described more or less in Acts, before it was so nicely and systematically elaborated in the gospels. Robert Price takes a few stabs at this issue, invoking Marcion as a key generator and organizer of early gospel material, with certain theological visions and axes to grind. Nevertheless, key data is missing from the story's origin period, perhaps necessarily so, since any heresies have been well and truly expunged from the record by this point. What we have is heavily sanitized and twice-told tales from many decades after, and little else.

At any rate, one should appreciate that, whether entirely mythical or not quite entirely mythical, there is precious little to nothing known about Jesus, once all the encrustations are pared away and one takes a careful and skeptical look at what is left. Our contemporary knowledge of the rapidity with which myths can grow, from seeds either fictional or factual, and the enthusiasm people show in augmenting them and expressing their own views through them, should be a big piece of the historical & critical approach we bring to bear on this question. The Jewish messianism & eschatology that constructed early Christianity could have arisen either from a community of writers consulting their many sources for appropriate passages and prophecies, or, quite a bit less plausibly, from a remarkably inspired (and scholarly) single person simultaneously embodying and preaching a precise set of midrashes based on Torah themes, brought up to date for the Hellenized, post Roman-conquest Middle East.

Saturday, February 22, 2014

Freaky speaking- what we know about stuttering

Review of "Out with it", by Katherine Preston, with supplementary data.

Yes, I stutter. And it is a royal drag. Preston's memoir tells her story of the trials and travails of not being able to face the world with normal ease and confidence. Stuttering is especially odd because speech is such a rich medium, conveying emotion and status and so much else along with the explicit information. All that gets garbled up if one is fighting to get every other syllable out.

The first half of Preston's book is outstanding, portraying her trials in very affecting and articulate terms. The second half is where I (and she, until she drastically rewrote the project) expected to learn the most up-to-date science about stuttering, and to branch out to the stories of other people. But unfortunately, it kept being about her, bringing the reader pretty much up to the very moment of Preston's life in the midst of manuscript writing.

Anyhow, let me fill in here with some of what the book should have conveyed. Stuttering is, as Preston relates, a maddeningly protean condition, coming in a wide spectrum of forms and severity. It is affected by emotional tenor, and remits after pretty much any novel therapy, but then returns, afflicting the subject with guilt as well as disfluency. Is it caused by mean parents? Is it strictly a genetic and brain development condition? Well, some of each, but mostly the latter. Preston recounts having the most supportive and healthy parents possible, but one gets the sense that other cases get contributions at least in part from the family dynamic.

At base, stuttering won't happen without a biological predisposition, which is known to be highly heritable (as well as male-biased). 82% heritable in a recent twin study, in fact. Unlike other disorders whose genetics have been vague, full of false turns and bad statistics, a few genes have successfully been linked to stuttering, including GNPTAB, GNPTG, and NAGPA, which all function in a pathway important to lysosomes, the cell's recycling centers. They encode enzymes needed to tag the roughly 40-50 lysosomal enzymes, which collectively break down fats, proteins, and other molecules so that the cell can get rid of its waste. Lack of the tag leads the enzymes to end up mis-addressed and secreted outside the cell, and thus to lysosomes that can't do their jobs. The stuttering mutations and their effects are only partial, though. Far more severe diseases are caused by more severe mutations- the type II mucolipidoses, which are fatal.
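
For a sense of where a number like that 82% comes from: classical twin studies estimate heritability with Falconer's formula, which doubles the gap between identical-twin and fraternal-twin trait correlations. A minimal sketch in Python- the correlation values below are purely illustrative, not taken from the actual study:

  # Falconer's formula: h2 = 2 * (r_MZ - r_DZ)
  # r_MZ: trait correlation among identical (monozygotic) twin pairs
  # r_DZ: trait correlation among fraternal (dizygotic) twin pairs
  r_mz = 0.80  # hypothetical value, for illustration only
  r_dz = 0.39  # hypothetical value, for illustration only
  h2 = 2 * (r_mz - r_dz)
  print(f"heritability estimate: {h2:.0%}")  # -> heritability estimate: 82%

The logic: identical twins share essentially all their genes, fraternal twins about half, so the excess similarity of identical twins, doubled, approximates the fraction of trait variance attributable to genes.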

How all this leads to brain-specific issues, let alone speech-specific issues, is quite unknown. But the genetics is not going to lie, so there must be some mechanism by which, say, some neurons, at some stage of development, might be more sensitive to this internal deficiency than the rest of the cells of the body.. etc. etc. Perhaps enough lysosomal proteins leak out of the cell by the external secretion pathway (which is the default, when proteins are incorrectly targeted) that they mess up neuronal pathfinding or myelination during development. One can fill in the tech-talk ad libitum at the moment.

But these three genes only account for about 10% of the genetic ingredients of stuttering, so others, a few of which are known, may be more informative as to the mechanism. One is FOXP2, a transcription regulator which is known to be responsible for other, far more severe, speech deficits when more heavily damaged, and to be a target of evolutionary change in the recent human lineage, perhaps relating to speech acquisition among other things. Another is CNTNAP2, which operates just downstream of FOXP2 in the same pathway. But it has to be said that, in light of the general theory that stuttering is a developmental brain deficit, there is little likelihood that any of these genes / molecules will lead to some chemical cure. They had their effect back during development, and that cake is baked, so to speak. Incidentally, one paper maintains that "... a mouse model of stuttering may be possible", which sort of boggles the mind!

Proceeding to the anatomical level, there has been quite a bit of brain scanning work on stutterers recently, with a wide range of targets and findings. The networks in play are speech recognition, in the auditory cortex, then Broca's area, more related to speech production, and of course the general motor system, which comprises the cerebral motor cortex running over the midline of the brain surface from ear to ear, and its outputs through the spinal cord, plus important modulatory motor systems like the basal ganglia and the cerebellum.

Basic surface brain map, including the auditory speech area (Wernicke's area) and the speech production area (Broca's area), which are heavily tied to each other, before leading to later motor areas in the motor cortex, cerebellum, brain stem, larynx, etc. Broca's area is heavily lateralized, being larger on the left than the right. Broca's area can also be referred to as Brodmann area 44/45, the inferior frontal gyrus of the cortex, and as the pars triangularis / pars opercularis.

The most frequent finding is that the left side of the cortex, where Broca's area is notoriously lateralized, is short-changed a bit, and that the right side takes over somewhat more physical gray matter and speech functions. This leads to a hypothesis that reduced lateral dominance leads to a sort of speech train wreck, where both sides of the brain are trying to run one mouth, as it were, and not doing a very good job of it.

Example of differential scanning of the brain's anatomy, in this case diffusion tensor imaging of white matter tracts connecting to or from Broca's area. The significant density difference indicates that the conduits between Broca's area and others are deficient in stutterers. Connection to the incoming auditory areas is particularly deficient.

But many studies have been done, finding variations in many of the various areas involved in speech, and disagreeing in certain respects. One gets the sense that the natural variability of people's brains, combined with the low numbers of subjects one can use for this kind of study, and perhaps the relatively small effects, makes it difficult to reach definite conclusions, though the field is still young. A brief bibliography:
  • Evidence of Left Inferior Frontal–Premotor Structural and Functional Connectivity Deficits in Adults Who Stutter.
  • Atypical brain torque in boys with developmental stuttering.
  • Resting-state brain activity in adult males who stutter.
  • Functional brain activation differences in stuttering identified with a rapid fMRI sequence.
  • Motor excitability evaluation in developmental stuttering: a transcranial magnetic stimulation study.
  • Brain activity in adults who stutter: Similarities across speaking tasks and correlations with stuttering frequency and speaking rate.
  • Atypical caudate anatomy in children who stutter.
  • Using Brain Imaging to Unravel the Mysteries of Stuttering.
  • Corpus callosum differences associated with persistent stuttering in adults.
  • Computational modeling of stuttering caused by impairments in a basal ganglia thalamo-cortical circuit involved in syllable selection and initiation.
  • Stuttering: a dynamic motor control disorder

On the bright side, treating Parkinson's disease by the novel method of deep brain stimulation in the thalamus has, in people who happened to also stutter, led to alleviation of stuttering, according to a couple of papers (though it also made it worse in others). What is the ventral intermediate nucleus of the thalamus? It seems to sit between cortical inputs and the cortical motor system, so it is involved in learning and regulation of motor behavior.

Given its negative effects, why is stuttering as prevalent as it is, for as long as it has been, from the earliest historical records? I think, like with many other conditions, it is a matter of balancing selection, whereby some of its genetic ingredients, when not all concentrated in a fully stuttering phenotype, correlate generally with high reactivity and fast reflexes. Which can have positive aspects, in a past world if not this one.

It is also worth noting how stuttering is one more example of the absence of a soul. Pending more thorough research, all signs point to it being a circuitry problem where developmental deficiencies cause some lack of coordination. No demon, soul, or higher power need be, or can be, invoked.

  • Synanon and stuttering.
  • Post-Christian, with a little nostalgia.
  • The anthropocene will be (or has already been) distinguished by the death of most other large life forms. And countless not so large ones.
  • The IRS- another GOP whipping boy, starved to fail. Just like the post office.
  • Public libraries totally rock!
  • Gains slipping away in Afghanistan.
  • Money, meaning, and happiness.
  • Billionaires are not, typically, your friend. There is a class war, and they are winning.
  • Social security, on the other hand, is, and helped enormously in the recession.
  • Fracking for thee, but not for me.
  • How do banks work? Still a matter of some controversy.
  • What, exactly, is "public" about Facebook's corporate structure? Z-berg gets to spend the public's money.
  • This week in the WSJ "A taxpayer needed a taxable income of $307,000 to enter the top 1%, a figure that hardly qualifies as "rich" today, especially in cities like New York, Chicago, Los Angeles or San Francisco."

Saturday, February 15, 2014

A curious culture

The Muslim encounter with the West; More from Bernard Lewis's "What went wrong?"

The last time I reviewed Bernard Lewis's book, "What went wrong? The clash between Islam and Modernity in the Middle East", I strongly supported a hypothesis he made in passing: that the treatment of Muslim women is perhaps the biggest problem of the contemporary Islamic world- that the patriarchal system of systematic disenfranchisement, sequestration, non-personhood, illiteracy, and non-education perpetuates not only a vast cultural deficit among women, but also among men, who are, after all, all raised by women.

Here I will take up a second thread from his book. That is the relative strength of the religious traditions within the Islamic and other cultures. Lewis lays out the unique strengths of Islam as follows:
"The children of Israel fled from bondage, and wandered for 40 years in the wilderness before they were permitted to enter the promised land. Their leader Moses had only a glimpse, and was not himself permitted to enter. Jesus was humiliated and crucified, and his followers suffered persecution and martyrdom for centuries, before they were finally able to win over the ruler, and to adapt the state, its language, and its institutions, to their purpose. Muhammad achieved victory and triumph in his own lifetime. He conquered his promised land, and created his own state, of which he himself was the supreme sovereign. As such, he promulgated laws, dispensed justice, levied taxes, raised armies, made war, and made peace. In a word, he ruled, and the story of his decisions and actions as ruler is sanctified in Muslim scripture and amplified in Muslim tradition."

The contrast with Christianity is particularly sharp. Christianity, as Nietzsche bitterly pointed out, is a loser religion. Jesus was tortured and killed by the Romans. He has never returned like he said he would. And if he ever does return, it will be hell on earth, as we are told in Revelation. Christianity had to be built on extreme cognitive dissonance, which had several effects. First was constant fission into sects and conflicting ideologies. If the core story is so unbelievable and requires such ideological gymnastics for palatability, it will naturally lead to conflicting interpretations and continuing dissatisfaction with any reigning interpretation. This was particularly evident in the early times of Christianity with the constant strife over the canon, the creed, etc. And then it broke out all over again in the Reformation. There has been no reformation in Islam.

The second effect was a durable separation from the state. While medieval popes behaved more or less like full-fledged states, Christianity mostly fit the more traditional shamanistic role of advisor and arbiter of power, not the holder of power directly. Its internal doctrine was basically non-worldly, indeed highly impractical, and its model of Jesus was the epitome of the non-powerful, non-ruler. A giver of riddles and dreamy ideals more than a tough Machiavellian. The Catholic church built this puzzle into an institution that invaded everyone's lives, took confessions, trafficked in the body and blood of its totem, made and unmade rulers, but never achieved what came naturally in the Islamic world- the full totalitarianism of the union of religious and temporal power.

As Lewis points out, the solution to the first problem in Christian Europe was the development of secularism and the civil society as a neutral zone among warring religions, giving up the totalitarian scope of most religions up until that point, in this case the ideological totalitarianism, if not the temporal. No such transition occurred in Islam, which constitutes the manual of state, law, religion, morals, and a generally complete world view for its adherents. But this manual of state never underwent the kind of critique that happened during the enlightenment under Locke, Mill, Rousseau, et al. Or even underwent, as Europe did, centuries of gradual evolution of parliaments, the language of individual rights vs the state, and similar legal developments descended from Rome and in some instances from Christianity. So when the technology of modern state control entered the Islamic world, we ended up with lots of bad dictatorships, not democracies.

The excruciating developments in Egypt, where modern, democratic impulses have been smothered under the same old military model of strong-man government, dedicated to the proposition that the only loyal opposition is a dead opposition, go to show how deep the cultural differences remain. The Egyptian government is hardly Islamic in any theocratic sense. It replaced an apparently more fundamentalist Islamic government. But contemporary fundamentalism is a false measure of authenticity, as it is merely a relatively modern reaction to the West and Westernization. The military dictatorship model is probably more traditional and durable in the Islamic world, going back to Muhammad himself, and certainly his successors. After all, that was the core of the Sunni-Shia split: should the most powerful actor run the state and the religion, or should the most theologically / genealogically appropriate inheritor from Muhammad be given the keys? Sunnis have always chosen the former- a very practical choice, in a way.

But medieval stasis in political philosophy is hardly the worst of it. There is stasis in many other aspects of the culture, only glossed over by the fabulous wealth of the Muslim petro-states. There is a simple lack of interest in other cultures, in translations from other literatures, in science, in diplomacy, in art, in ideas that come from secular sources. While Europe's competitive ferment and legacy from Rome eventually generated endless inquisitiveness that is now institutionalized in our universities, the grand Islamic schools of learning always "learn" about the same old thing ... the Koran. And not even using the critical tools that have blown up the study of ancient texts elsewhere.


  • Gratitude, Afghan style. Just which side is the government on?
  • Drug control can work, with public support and moderate policies.
  • Affirmative action- coopting and false-carding the black middle class?
  • Yes, an atheist world would be (will be) wonderful.
  • Brains age rationally- learning less, executing more.
  • Yes, Dorothy, crime really is criminal. But does anyone have legal standing to fight it?
  • Fossil fuel is so over.
  • Or not .. without a high carbon tax, no other action will work. BP projects renewables at 7% of consumption in 2035. Is that acceptable?
  • Social security needs to be increased. Because entitlements are ... good.
  • In Europe, will festering economic failure turn into political disaster?
  • Unearned money makes people conservative and mean.
  • Martin Wolf for redistribution, and for robots.
  • This week in the WSJ: "Reforming that public-school monopoly is the litmus test of seriousness on income inequality." It is truly incredible how WSJ columnists, who presumably are the intelligent creators of wealth and public good, can be so self-centered and blind. But I guess wealth does that.
  • Image of the week- religion in the US.

Saturday, February 8, 2014

Being and B.S.

Review of Martin Heidegger's Being and Time.

Martin Heidegger was a philosopher of the interwar and post-World War 2 period, and one of the founders of the continental school of modern philosophy which has headed into deconstruction and postmodernism. He was foundational for existentialism, though he did not coin the term, and is thought by many to be a leading or even the leading philosopher of the 20th century. His personal fixation was the question of being, to which he devoted what is deemed his greatest work, or even "towering achievement": "Being and Time".

In the development of modern philosophy, Heidegger stands against positivism and the whole analytical school, so I thought it worthwhile to read up on his oeuvre. Surely something is lost in translation, but one does what one can. I can do no better than provide a few quotes, from a translation by Joan Stambaugh, 1977.

At the outset, he tries to forestall doubters:
"It is said that 'Being' is the most universal and the emptiest concept. As such it resists every attempt at definition. Nor does this most universal and thus undefinable concept need any definition. Everybody uses it constantly and also already understands what he means by it. Thus what made ancient philosophizing uneasy and kept it so by virtue of its obscurity has become obvious, clear as day; and this to the point that whoever pursues it is accused of an error of method."

And in the same vein...
" 'Being' is the self-evident concept. 'Being' is used in all knowing and predicating, in every relation to being and every relation to one's self, and the expression is understandable 'without further ado'. Everybody understands 'The sky is blue,' 'I am happy,' and similar statements. But this average comprehensibility only demonstrates incomprehensibility. It shows that an enigma lies a priori in every relation and being toward beings as beings. The fact that we live already in an understanding of Being and that the meaning of Being is at the same time shrouded in darkness proves the fundamental necessity of recapitulating the question of the meaning of 'Being.'"

He then discusses the origins of a scientific field from a vague intuition to a metaphysical speculation, till finally it becomes a well-defined discipline, with methods, laws, theories, etc. Or at least I imagine that is what he is driving at.
"Being is always the Being of a being. The totality of beings can, with respect to its various domains, become the field where definite areas of knowledge- for example, history, nature, space, life, human being, and so on- can in their turn become thematic objects of scientific investigations. Scientific research demarcates and first establishes these areas of knowledge in rough and ready fashion. The elaboration of the area in its fundamental structures is in a way already accomplished by prescientific experience and interpretation of the domain of Being to which the area of knowledge is itself confined. The resulting 'fundamental concepts' comprise the guidelines for the first disclosure of the area. Whether or not the importance of the research always lies in such establishment of concepts, it true progress comes about not so much in collecting results and storing them in 'handbooks' as in being forced to ask questions about the basic constitution of each area, those questions being chiefly a reaction to increasing knowledge in each area."

Now we get into some heavy weather...
"The ontic priority of the question of Being. 
Science in general can be defined as the totality of fundamentally coherent true propositions. This definition is not complete, nor does it get at the meaning of science. As ways in which man behaves, sciences have this being's (man's) kind of Being. We are defining this being terminologically as Dasein. Scientific research is neither the sole nor the primary kind of Being of this being that is possible. Moreover, Dasein itself is distinctly different from other beings. We must make this distinct difference visible in a preliminary way. Here the discussion must anticipate the subsequent analyses which only later will become really demonstrative. 
Dasein is a being that does not simply occur among other beings. Rather, it is ontically distinguished by the fact that in its Being this being is concerned about its very being. Thus it is constitutive of the Being of Dasein to have, in its very Being, a relation of Being to this Being. And this in turn means that Dasein understands itself in its Being in some way and with explicitness. It is proper to this being that it be disclosed to itself with and through its Being. Understanding of Being is itself a determination of Being of Dasein. The ontic distinction of Dasein lies in the fact that it is ontological."

He seems to be trying to establish a conscious and self-reflective being as a special case of the general case of "being". In German, "dasein" means being (sein) there (da), which does not seem to add very much ... it is "existence" in any case, here or there.

Anyhow, one can imagine pages and pages of this, leading nowhere, and get a thorough sense of this text. It shares with its descendant postmodernism (not to mention its cousin theology) a sort of linguistic propulsiveness (with plenty of italics) and conviction of purpose without actually saying anything. Whether one agrees that, as Heidegger says, "The concept of 'Being' is rather the most obscure of all", he makes whatever it is less clear rather than more. It is a flood of sophism and pomposity that has led generations of all-too-serious students to strain their eyes and waste their talents, while setting itself up as some kind of tribunal of the highest, metaphysical kind over other fields.

  • Free markets for thee, but not for me.
  • Financial criminals reward each other with pay raises. And sycophantic press. And the uniquely powerful incentives to loot your own bank.
  • Workers of the world will not unite.
  • Yet unemployment is the worst fate of all.
  • NASA is a happy-talk disaster zone.
  • Edward Snowden's background.. how he reacted to army atmosphere: "Few of his new army colleagues, he maintained, shared his sense of noble purpose, or his desire to help oppressed citizens throw off their chains. Instead, his superiors merely wanted to shoot people. Preferably Muslims. ‘Most of the people training us seemed pumped up about killing Arabs, not helping anyone,’ he says."
  • PIMCO guru pushes MMT: deficits create money and credit, which we need to support growth. Don't pay attention to all the mistakes I made last year, though, and the year before that, and ...
  • This week in the Wall $treet Journal: "But the lesson from Europe is that the environmentalists who have been relentlessly hawking renewables are the real deniers." This piece makes a valid point, despite its hypocritical evasion of the appalling conservative denial of climate heating generally ... which is that transitioning to renewable energy is costly and difficult. Which is why we need a big carbon tax sooner, not later.

Saturday, February 1, 2014

Fins are not fingers

The evolution of arms and fingers from fish fins, a story of genetic redeployment.

There is still a great deal to learn about how our bodies and minds rise out of our genetic code. Despite a growing flood of genomic data- and we are right on the verge of the $1000 genome, meaning that everyone in the developed world will shortly have their genome sequenced as a matter of medical routine- a vast gulf remains between the concrete knowledge we now have about the two ends of the process: genotype and phenotype.

One of the great American scientists of the 20th century was Edward Lewis of Caltech, who studied the developmental genetics of fruit flies, focusing on mutations that affected their body plan. In one example, he developed mutants whose third thoracic segment, instead of growing tiny winglets called halteres, grew full wings, just like their second thoracic segment. They were a little like dragonflies. This led Lewis on a long path to characterize such "homeotic" mutations, (which transform body parts), and to a Nobel prize.

It is now known that the main gene Lewis studied, "Ultrabithorax", encodes a transcription regulator that sits in the middle of a large developmental network or cascade of transcription regulators. The process starts from the glimmerings of polarity determination in the mother's egg, and proceeds through successively finer divisions of space and identity within the growing embryo until we get to the ultimate effector genes that direct neuron production and migration, or muscle development, or one of a thousand other cell types that generate our tissues.

The genes that Lewis studied are collectively termed "hox" genes, short for homeobox- a DNA sequence found in all these genes whose mutations cause homeotic transformations, encoding a characteristic DNA-binding protein motif that is only subtly altered in each one. They are all related because they are all evolutionary copies of a single ancestor.

These genes sit in the middle of the developmental cascade, and have themselves vast upstream regulatory regions, to gather regulatory information from earlier stages in the process. Segmentation has happened by the time they come into action, and the homeotic genes integrate the data about which segment their cell is in, and, if conditions are right, turn on expression of their encoded regulatory protein, thereby providing input to all the downstream genes that actually prompt the development of that segment's proper parts, be they wings, legs, antennae, arms, etc.



Hox genes occur in tandem clusters, and the clusters themselves have been duplicated during evolution. In the diagram above, (from wikipedia), sea urchins, at the top, have something like the original cluster of eleven hox genes, color coded by their position in the cluster, which also relates to the position along the body axis where they are expressed (at right). Fruit flies, at the bottom, lost a few copies, and gained a few others, but retain basically the same system. Fish and tetrapods, in the middle, duplicated the entire set, copying whole clusters to various chromosomes, and lost individual hox gene units along the way. This elaboration allowed more complicated body plans to develop, with the example of fingers being a new use of the hox code, added onto the basic body trunk segment-by-segment code. The head and brain are another place where the hox system has been re-used in tetrapods.

One confusing element of the field is that in tetrapods, the hox A and D clusters are partly redundant. Each can, on its own, direct formation of arm and fingers, and both need to be deleted to eliminate the arm. So the researchers in today's paper mix and match from both clusters to make their various points.
"During mammalian limb development, the activity of both HoxA and HoxD gene clusters is essential and the absence of these two loci leads to rudimentary and truncated appendages."

In the embryonic hand, expression of many Hox D genes, from d9 to d13, is required to specify tissues during development, as is that of a few of the Hox A genes. They have overlapping patterns rather than some neat, digital(!) code, this being messy biology, but through mutation and other studies, researchers have pieced together some information about which gene of the tandem arrays does what. The genes have some individual characteristics, but much of their regulation is collective, directed from enormous regions on both sides of the cluster, comprising over three million base pairs of DNA.
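
To give a concrete, if cartoonish, picture of what such an overlapping code looks like, here is a toy model in Python of nested hox expression domains along the limb's near-to-far axis. The onset positions are invented for illustration; only the nesting principle- each successive gene switching on further out, so that distal tissue expresses more of the cluster- reflects the real biology:

  # Toy model: nested hox expression along the limb, 0.0 (shoulder) to 1.0 (fingertip).
  # The onset boundaries below are made up for illustration, not measured values.
  onsets = {"hoxd9": 0.10, "hoxd10": 0.30, "hoxd11": 0.50, "hoxd12": 0.70, "hoxd13": 0.85}

  def expressed_at(position):
      # Each gene is "on" from its onset out to the tip, so the domains nest:
      # cells further out express a superset of what cells closer in express.
      return [gene for gene, onset in onsets.items() if position >= onset]

  print(expressed_at(0.20))  # upper arm: ['hoxd9']
  print(expressed_at(0.95))  # hand: all five genes

The combination of genes that is on, rather than any single gene, marks the position- consistent with the collective, region-wide regulation described above.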

The Hox D locus, on human chromosome 2. It contains eight distinct hox genes, (numbered black boxes at bottom), flanked by enormous control regions on either side which drive expression of some cluster genes in the hand (blue) and some in the arm (red), responding to transcription regulators earlier in the cascade of developmental patterning and differentiation. What are those fancy-looking blue and red cubic designs? They reflect a separate study where the authors physically tested which DNA was close to which other DNA in embryonic cell chromosomes. And they found that the right and left regions form their own knotted-up domains, each hooking up with the central hox D genes, but not touching anything on the opposite side.

A recent paper is one of a pair finding that the two clusters, hox D and hox A, are both flanked by very large regulatory regions, which in fish show only slight differentiation, one directing slightly more distal (towards the outside) expression than the other. The large regulatory region downstream (red), which originally specified expression in fish fins, has pretty much retained the same function in tetrapods, specifying the arm.

But the large regulatory region on the other side (blue) in fish only adds a little bit of extra expression to some cluster members towards the outside of the limb. In tetrapods, however, it specialized to direct expression of hox D genes in the hand, quite exclusive of expression anywhere else. The basic finding is that fish fins are not proto-fingers, really, but are related principally to our arms. The fingers arose from a mostly new regulatory program established by the blue areas in the genome shown above. And the wrist ... that is specified in the gap, partly by the lack of hox expression. It is interesting to note as an aside that the hox B and hox C clusters seem to have regulatory control only from one side, not both sides.

Inference of the paper: the hand-specifying regulatory regions of hox D and hox A (blue) developed from earlier regions (yellow) that had relatively minor roles in fish, and which specified the margin of the fin, rather than a separate structure.

What is some of their evidence? Well, first, let's see some of the native expression of mouse hox A genes:

Expression of individual genes from the mouse hox A cluster, showing finger-specific expression for 9, 10, 11as, and 13. The exception of hox A11 is striking, as a departure from the hand-specific pattern of its nearby siblings, and in its well-defined zeugopod, or lower-arm expression pattern.

One obvious experiment was to transplant the fish hox DNA into mice to ask where it gets expressed. It always gets expressed in the same place- where the arm expression happens, at the base of the limb bud, not where finger expression happens. This makes the case pretty strongly that finger expression and development was, as one might imagine, a novel evolutionary development.

Mouse embryonic limb buds showing the expression of a transgenic zebrafish hox A cluster, with the regulatory regions and genes it contains, as labeled. They all get expressed in the near, or arm, region, not in the finger region. This was true no matter which regulatory region of the zebrafish hox A cluster was used, whether the upstream or the downstream side.

Even more striking, the researchers show expression patterns in complete embryos. Below is a stage E11.5 mouse embryo with transgenic fish hox A13, driven by the fish regulatory region corresponding to what would be the hand/finger-specifying region in tetrapods. Its expression appears in many areas of the body, but not in the fingers, as the mouse's own hox A13 does. It is worth noting that in vertebrates, the hox genes are used all over again in specifying brain region development, which does not happen in flies. It is a common theme- that through the accumulation of regulatory complexity, the same genes can be re-used many times to create ever more elaborate phenotypes.


As you can see from the genome locus diagram a few figures above, the regulatory regions controlling the hox D genes are far, far larger than the protein-coding genes themselves. Complexity of control is the theme in all genomes, especially ours. These regions contain many little modular bits of DNA that bind various other transcriptional regulators operating from upstream in the developmental cascade, allowing a progressive, step-by-step (though in actuality stochastic and mix-and-match) evolutionary process whereby the silk purse of our present bodies is made out of the sow's ear of a few thousand ancient genes.

  • 23 & me genetic testing- another front in privacy and big data.
  • Example of another paper on limb formation, in the transcription regulator cascade of development.
  • Creationism map.
  • The POTUS with the SOTUS- does work pay the worker, or only the CEO?
  • These kids just don't understand religion!
  • The patent backstory to the Google, Motorola, and Nortel deals.
  • Fascism, American style- corporations and the blacklist.
  • Economic quote of the week, from John Schmitt:
"Workers today are a lot older than they were in the 1960s or the 1970s, and they are enormously better-educated than they were in the 1960s or 1970s. The fact that most workers are doing barely better, and some workers are doing worse than their counterparts from 40 or 50 years ago … suggest that the problem is that the way the economy converts people’s skills, people’s experience, people’s education and their training, into good jobs is what has deteriorated over this period. Not people’s underlying skills, or work experience, or education."

Saturday, January 25, 2014

Surveillance, politeness, and privacy

Is the NSA killing us or protecting us?

Surveillance as a general social principle. We are always watching each other, and it is the primordial way of being in society. In the old days, gossip was the principal method of leveraging surveillance into social power and enforcement. Now we happily surveil each other by facebook, twitter, and google earth, and leave comments. The issue in our new surveillance environment is not the existence of surveillance per se, but the asymmetry and invasiveness of surveillance. Do we know who is watching, what they are watching, and when they are watching? Are they harming us? Can we turn it off?

Traditionally, social surveillance is always mutual. You see me at the same time I see you- having a meal together, talking, hunting. The power of this mutual observation and interaction is immense, policing our behavior so as to enforce "normal" standards, alert for any deviation, political or moral lapse, for novel signals of fashion, disease, innovation, threat, etc. Religion is its purest expression- including extensive, in-depth thought policing.

Some people stand up well to all this observation, some don't. The pervasive social pressure has enormous effects on our mental health, causing depression, suicide, peer pressure, status anxiety, etc.- one of the great, if not the greatest, motive forces of politics and social life in general. One point of etiquette is to relieve people of this anxiety, leaving their private affairs politely out of the conversation, even as the observation goes silently on. The essence of privacy is not that we are not observed, but that we are not held to account or bullied about it beyond endurance.

The totalitarian societies were a sort of reversion back to the small town mode of intense surveillance, with a total invasion of privacy and violation of civilized etiquette in the bargain, using all this information against people at their most vulnerable points. But in large societies we have typically adapted to a much looser model of toleration & privacy, where due to the sheer numbers and sheer density, more observation and more diversity must be accommodated than humans are typically comfortable with. So we keep a small community of close relationships and mutual close surveillance, amid a large cloud of anonymous and little-noticed passers-by.

Big data has changed all this, bringing the intimacy of small town surveillance, where the local store clerk, for instance, knew what everyone bought, to the global stage. Some embrace the facebook-i-zation of personal surveillance. The question is mostly whether we can turn off portions of this surveillance that we do not like, or which we collectively deem asymmetrically unfair and invasive, or corrupt and incompetent. For instance, our credit cards provide entree to all our purchases to faceless corporations who diligently mine them for scraps of sales leads, and sell them off to their "partners". It is a seamy, disreputable business, and not at all voluntary.

If they had reasons of state, and a secret court looking over their shoulders, I would be far more amenable. But they don't. Credit cards are not an optional institution in today's world, so this surveillance is essentially involuntary, and extremely asymmetric. Its typical results, however, are modestly annoying, rather than invasive or life-threatening, so the cost has to date been borne without too much complaint. And the monitoring of all our web comings and goings.. well, it is not far from George Orwell's Telescreens of 1984, which monitor everyone with an unblinking eye.

What to do? The NSA portion of this is relatively inconsequential, really. The average person's degree of invasion from its practices is essentially nil, though surely mistakes have happened and caused great individual harm. The government's no-fly list is an example of a relatively more open program plagued with error and invasiveness.

But the flood of other personal data rushing into corporate and other unknown hands is far more serious. The Target incident, in which tens of millions of accounts were stolen, the ongoing traffic in social security numbers, identity theft, false tax claims, endless spam, targeted come-ons, etc., all point to a "system" in crisis. We have let our virtual selves contain ever more important data without vetting, and without any serious legal structure. Sure, the companies in question have a stake in customer faith, and thus in their own prudence & etiquette. But their transparency is nonexistent and their failures clearly frequent. We have no idea, and even they have little idea, what has been stolen or bartered away into the nether worlds of cybercrime.

Even biometrics hold out little hope. A fingerprint or iris scan can be forged, as can any other piece of data. We are trapped in a data whirlwind, where only we ourselves, in person and with competent memories, can completely attest to identity. So we are back to the personal, one-to-one world of rich and personal information that we began with.

I don't think it is enough to hark back to the privacy provisions of the constitution and take an absolutist position that divides harsh restrictions on government surveillance from a wild-west atmosphere in the private realm, papered over with the concept of so-called "voluntary" participation. We need new law in this new realm, to enforce competent collection and safeguarding of information by all entities that collect big data, (with business-ending penalties for flagrant breaches), and to match its social effects and invasiveness with public oversight.


  • Drone war- the surveillance that kills.
  • Are scandal and blackmail the currency of NSA surveillance? That is not at all clear.
  • Intensive spying by big Pharma.
  • The $1000 genome is upon us.
  • Why are we stuck in a Reagan-era-rut in policy towards Latin America?
  • Long hours.. are not productive if you are creative and/or competent.
  • In Afghanistan, ".. the security situation throughout the country has gotten worse, not better, since the 2009 election."
  • Martin Luther King and the job guarantee.
  • A better union model, from Germany.
  • Buñuel does the conference call.
  • Generate your own scholarly, postmodern web page.
  • The expert's humorous guide to science fiction.
  • Brain asymmetry- just the facts, please.
  • As usual, companies can break the law, and contribute to the rot of public services.
  • Europe's youth- twisting in the wind. Even on film.
  • Martin Wolf: The megabanks are still too big to fail. Bigger and bail-i-er than ever, actually. In his review of Bernanke's tenure, he misses one critical failure- the failure to explain clearly to congress that withdrawing continued fiscal support was criminal. Monetary policy cannot replace, and has not replaced, straight spending.
  • Economic cartoon of the week, Toles on trickle-down, Keynes, and the GOP's plans for the poor.

Saturday, January 18, 2014

The problem with positivism

"Positivism states that all authentic knowledge allows verification and that all authentic knowledge assumes that the only valid knowledge is scientific."

What is truth? A simpler, and more frequently used, word could hardly be imagined, but philosophers differ over it, probably because of sentimental attachments to beliefs that may not be true. In the hands of theologians, idealists, and artists, truth often stands for "something I believe". If a novel stirs a deep emotion, it is true, even while it is false. If an artwork reflects and expresses a facet of the human condition in a surprising or powerful way, it is true. And if a belief in a deity is beautiful, socially bonding, and morally edifying, it is also true. At least one athlete is the truth.

This definitional issue remains quite confusing and misleading. The subjective uses of "truth" have little to do with canonical correspondence truth, (i.e. the equation of thought and reality), in that what corresponds to the feeling of truth is a feeling it agrees with, not a condition of the outside world. Subjective states surely deserve recognition of their existence and texture. But the word truth may not be the best way to describe them.

In contrast, science and the law take a more blinkered view. If something is true, it actually happened, or is part of the real world verified by observation and continually available for re-observation and/or other forms of close analysis. While the sciences are edging into regions traditionally part of the humanities, they still regard truth as objective, and separate from personal state, wishes, ideology, etc. The DNA reads one way, and not another. The defendant was at the scene of the crime, or not. Evidence may not exist, and the truth may not be known, but that does not impair the idea of truth- its definition and possibility.

In this regard, our minds are truth engines, working very hard to model reality with accuracy. Eyesight is the most dramatic example, bringing us incredibly rich and accurate scenes with no apparent effort. But on more abstract levels too, we are constantly trying to figure things out, particularly other people, the object of so much of our intuitive acuity. But there are limits.. we have no intuitive grasp of physics on any large or small scale, nor is our introspection particularly effective. The self is a black box that we struggle our whole lives to understand.

And one tool of all this modeling is imagination, which both consciously and unconsciously conjures all sorts of worlds and images, sometimes as hypotheses to be pursued, sometimes as warnings to be avoided. Unfortunately, (or perhaps fortunately), the line between sober analysis and imagination is not all that clear, leading to the establishment of the scientific method as a general and organized way for communities of people to figure out the difference, in fields where real truth is at least conceivable.

This was the hope of the positivists: to put all knowledge on this same footing, by setting verificationist, empirical standards for knowledge and truth, and keeping all else outside the door. They tried to define everything else as "nonsense", or as not meaningful. But unfortunately, most of human experience happens in far more nebulous realms of subjective experience, vague judgements, and hopeful propositions, which are often very highly meaningful indeed. So this re-definitional part of the project was as futile as it was repugnant.

For instance, not even the most airy metaphysical questions are entirely meaningless, contrary to one of the propositions of positivism. Rather, their resolution, after thousands of years of speculation, typically does not lie with the speculators. Philosophers provide the service of keeping some of these questions alive, at least in the academy, and of trying out various intuitive solutions to them. But the remaining problems of philosophy are clearly ones where both data and intuition are lacking. Whether data ever arrives is the main question. Whether intuition will ever resolve them is much less of a question.

More technically, the word positivism signifies positive proof, and by various skeptical arguments, (such as Hume's and the problem of induction generally), and by historical experience, it is clear that proof (i.e. verificationism) is a mirage in science, not to mention other fields. The most that can be hoped for is a provisional model of reality that doesn't violate too many observations- a coherentist model of truth.

So Karl Popper, for instance, who was altogether sympathetic to positivism, came out with his falsificationist principle, in opposition to the verificationist principle of positivism- becoming formally an anti-positivist, or at least a post-positivist. But even falsificationism is too stringent, since a contradictory observation can as easily be erroneous as damning. Judgement and interpretation are always called for, on the appropriate level of analysis.

A positivist temple, with Auguste Comte out front.
My take on all this is that positivism was overly ambitious. The point can be well-taken without setting up a new altar to absolute truth. All truth, on our level, is probabilistic, and exists on a spectrum from the precise and well-attested to the hearsay and ludicrous. That is what the contemporary Bayesian revolution in statistics, and in science generally, is getting at, and what was lost in the positivists' rather extreme, utopian project, for which they were bickered out of existence. Far larger lies and absurdities, however, were (and are) rampant in the field of philosophy than the shades of truth-i-ness found in the scientific literature or the history of science. To wit, a quote from Nietzsche:
"The other idiosyncrasy of philosophers is no less dangerous; it consists in confusing the last and the first things. They place that which makes its appearance last ... the 'highest concept', that is to say, the most general, the emptiest, the last cloudy streak of evaporating reality, at the beginning as the beginning. This again is only their manner of expressing their veneration: the highest thing must not have grown out of the lowest, it must not have grown at all ... thus they attain to their stupendous concept 'God'. The last, most attenuated and emptiest thing is postulated as the first thing, as the absolute cause, as 'ens realissimum'. Fancy humanity having to take the brain diseases of morbid cobweb spinners seriously! - And it has paid dearly for having done so."
-Quoted by Max Horkheimer, in Eclipse of Reason.
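
Returning to the Bayesian point above: the core move of that revolution is to treat belief as a probability, updated with each new piece of evidence, never reaching absolute proof or absolute refutation. Here is a minimal sketch in Python- the prior and likelihood numbers are invented purely for illustration, not drawn from any real analysis:

# Bayes' theorem: posterior = P(E|H) * prior / P(E)
def update(prior, p_e_given_h, p_e_given_not_h):
    numerator = p_e_given_h * prior
    marginal = numerator + p_e_given_not_h * (1 - prior)
    return numerator / marginal

belief = 0.5  # start agnostic about hypothesis H
# each observation: (P(evidence | H true), P(evidence | H false))
for p_e_h, p_e_not_h in [(0.8, 0.3), (0.7, 0.4), (0.9, 0.2)]:
    belief = update(belief, p_e_h, p_e_not_h)
    print(round(belief, 3))  # prints roughly 0.727, 0.824, 0.955

Belief climbs toward, but never reaches, certainty- a spectrum of credence rather than positive proof, which is just the moral the positivists missed.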

  • Some atheist basics.
  • Big surprise- conformists tend to go to church. Where their children are taught...
  • Superior vaccine delivery and activation.
  • Full review of the Robert Gates memoir.
  • Reflections on a past basic income and job guarantee scheme.
  • How discrimination works. And the key importance of learning on the job.
  • Europe's elites are screwing up again. Though they are hardly alone.
  • To Bill O'Reilly, a 40% pay increase is "not a big deal".
  • Born to not run... subpoenas will be flying.
  • Evil, climate change, and collective action.
  • Robots, jobs, and the second machine age. But the problem is not technological, it is economic and political.
  • This week in the Wall $treet Journal, on how the FCC should let CEOs run the internet: "... the FCC should drop its pursuit of net-neutrality rules altogether.... Next, the FCC should unequivocally restate its commitment to the multi-stakeholder model of resolving network-management challenges and Internet governance."
  • Economic graph of the week; we are bumping along at bottom, in terms of overall employment:

Saturday, January 11, 2014

Sympathetic vibrations: speech waves and brain waves

Brain waves sync up with perceived speech, pointing to possible functions.

What do brain waves do? They are a prominent feature of live, working brains, and change markedly under different conditions, especially sleep and epilepsy. They seem like a natural analog to the CPU clocking that is so essential in artificial computers, but are clearly more chaotic, slower, and diverse. They seem to make up a moderately important dimension of brain processing, combining with the other, more fundamental dimensions of anatomical organization and electrical/chemical pathway conduction to make up brain activity.

A recent paper makes the comment that.. "A large number of invasive and non-invasive neurophysiological studies provide converging evidence that cortical oscillations play an important role in gating information flow in the human brain, thereby supporting a variety of cognitive processes including attention, working memory, and decision-making."

So what does "gating" mean? That is a bit hard to say. In artificial computers, the clock cycle is essential to quantize the computations, so that each transistor and each computation is given a chance to do its thing in a defined time, then rests so that other elements can catch up, keeping the whole computational process in logical register. Brains may need a similar service, but it is clearly far messier, since individual neurons take orders from no one- they seem to fire almost chaotically. While rhythmicity is a property of individual neurons, brain waves (aka cortical or electrical oscillations) are very much a mass phenomenon, only biasing the behavior of individual neurons, not ruling them outright.

An attractive place to look for their function is in auditory cognition, especially speech recognition, since speech and cortical oscillations each comprise a multi-frequency mix of related rhythms, though the range of sound frequencies (~30 Hz to ~15,000 Hz) is substantially wider than the range of electrical brain oscillations (a few Hz to maybe 150 Hz). Maybe they map to each other in some discernible way? As the authors state:
"The similarity in the hierarchical organisation of cortical oscillations and the rhythmic components of speech suggests that cortical oscillations at different frequencies might sample auditory speech input at different rates. Cortical oscillations could therefore represent an ideal medium for multiplexed segmentation and coding of speech. The hierarchical coupling of oscillations (with fast oscillations nested in slow oscillations) could be used to multiplex complementary information over multiple time scales for example by separately encoding fast (e.g., phonemic) and slower (e.g., syllabic) information and their temporal relationships."

Basically, the authors had 22 subjects listen to about seven minutes of speech, played either forward or backward, and at the same time used magnetoencephalography (MEG), i.e. a ginormous machine that detects slight magnetic fields emanating from the head, to track superficial brain waves. MEG is somewhat more sensitive than EEG, which is done with electrodes pasted onto the head. Then they fed both data streams into a correlating procedure (below), and looked for locations where the two oscillations were related.
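
For the curious, here is a rough sketch of what one ingredient of such a correlating procedure might look like: comparing the phase of the low-frequency speech envelope with the phase of a band-passed MEG channel, via a phase-locking value. This is emphatically not the authors' actual pipeline, just an illustrative toy- the sampling rate and the synthetic data are assumptions:

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250  # sampling rate in Hz (assumed)

def bandpass(x, low, high, fs, order=4):
    # zero-phase band-pass filter
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def phase_locking(x, y):
    # phase-locking value: 1 = phases perfectly aligned, 0 = unrelated
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

# stand-ins for real data: a speech amplitude envelope, and one MEG
# channel that partly follows it
rng = np.random.default_rng(0)
speech_env = rng.standard_normal(fs * 60)
meg_channel = speech_env + rng.standard_normal(fs * 60)

# theta band (4-8 Hz), roughly the rhythm of syllables
plv = phase_locking(bandpass(speech_env, 4, 8, fs),
                    bandpass(meg_channel, 4, 8, fs))
print(plv)

In the real analysis, comparisons of this general kind are made at every brain location and frequency band, with statistics to separate true entrainment from chance alignment.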

Procedure of analysis- each waveform stream was deconstructed and correlated, to find locations in the brain where electromagnetic surface waves reflect speech waves.

They found several instances of correlation. Two were in the low-frequency delta (1-2 Hz) and theta (4-8 Hz) rhythms, which directly entrain with the speech rhythm. Two more were in the 20 and 50 Hz range, where the amplitude of these gamma rhythms correlated with the phase of the lower-frequency speech rhythms- a somewhat indirect correlation. The locations of these brain wave correlations were, naturally, over the auditory and speech centers of the brain:

Location of brain waves, of various frequency bands, that correlated with speech patterns. This is a map of significant results, mapped onto each hemisphere (right hemisphere shown on the right side). Note the significant differences between the hemispheres.

"In sum, this comprehensive analysis revealed two distinct speech tracking mechanisms in the brain. First, low-frequency speech modulations entrain (that is, align the phase of) delta and theta oscillations in the auditory cortex. Second, low-frequency speech modulations also entrain the amplitude dynamics of gamma oscillations."


Speech trace (A) shown with a superimposed dotted line (cosine) of the theta brain wave of the listener. In B, the brain is shown, with locations of 3-7 Hz entrainment labeled in red, specifically entrainment that differed significantly between the forward and backward speech renditions. C shows the overall cross-correlation data, for both hemispheres, with signals at 20 and 48 Hz, at least in one hemisphere. This tracked not overall speech, but the correlation with speech starts and stops, showing close phase tracking.

The phase entrainment shifted position when successive speech elements (stops/starts for sentences and words) arrived, showing that the system tracks the input quite carefully.

Most intriguingly, the authors found that backward speech was significantly less correlated with brain waves than forward speech. This indicates some top-down control, whereby the intelligibility of the speech stream is broadcast back to lower levels of the auditory processing apparatus, fine-tuning expectations of the next event via stronger rhythmic alignment.

They also found differences between the hemispheres, with the low-frequency correlations stronger in the right hemisphere, and the gamma-wave correlations stronger in the left, which contains the primary language areas in most people (such as Broca's and Wernicke's areas).

"Our study supports emerging models of speech perception that emphasise the role of brain oscillations. Hierarchically organised brain oscillations may sample continuous speech input at rates of prominent speech rhythms (prosody, syllables, phonemes) and represent a first step in converting a continuous auditory stream to meaningful internal representations."

One can imagine that brain waves assist processing in several ways. When unified over large areas of the brain, they might enforce regimented processing, (i.e. transfer of neuronal signals from one cell / layer / module to the next, in ways that constitute signal processing from raw to more abstract representations), which could make that processing more efficient and also better able to affect other areas of the brain, such as those supporting consciousness. In auditory processing, the advantage of lining up processing with the actual signal should be clear enough. Brain waves could also reduce the chatter in the system that seems universal in other brain studies. Do they "carry" signals themselves? Not really- just as the computer clock cycle doesn't tell us what the computer happens to be doing, but facilitates the detailed processing flowing through its innumerable wires and junctions.


  • A better review of the same paper.
  • Test your hearing.
  • Religion, tribalism, hate, love, etc. etc...
  • But some still insist upon religion. And "definitively refute" atheism. And finish up with C. S. Lewis. Hmmm. 
  • The Onion refutes it a little better.
  • And becoming an atheist.. not so easy.
  • Economic wreckers and capitalist running dogs in our midst.
  • Turns out, Republicans do favor redistribution, after all.
  • Managing the job guarantee.
  • 4K TVs work wonders as monitors.
  • The India diplomatic row is an example of why worker protections and minimum wage protections are so important... the system worked.
  • Satanists.. performing a public service.
  • Yes, he is a bully.
  • Inheritance is increasingly significant, so death taxes are more important than ever.
  • Economists have no idea what they are doing.
  • Economic graph of the week, on unemployment.

Saturday, January 4, 2014

An American Marco Polo: Josiah Harlan

Quaker, ruler of Gujrat in the northern Punjab, General of Afghanistan, all-around schemer and adventurer.

The adventures of Marco Polo (1254-1324) are famous, mostly because they were so well recorded. He followed the briefly open silk road during the heyday of Kublai Khan, travelling all over the Far East, and ruling briefly in the Khan's service in China. But when he returned to Venice, he was overtaken by the vortex of local politics, and was imprisoned together with a gifted writer who helped put his extraordinary, yet quite accurate, tales into clear and compelling prose- tales that came to be disbelieved after the silk road closed up again with the dissolution of the Mongol empire.

Unfortunately, Josiah Harlan (1799-1871) had no practiced ghost-writer, and was so politically vociferous in his anti-imperial writings that his lengthy memoir never heard the clang of a printing press. Nevertheless, his story has obvious parallels with Polo's, and contains interesting lessons for our own brushes with imperialism.

The book is "The man who would be King: the first American in Afghanistan", by Ben MacIntyre. Harlan was born into a mecantile family, for whom he shipped out to Canton and points east as "supercargo", or manager and sales agent for a ship's mechandise. Hearing from afar that his recent fiancé had married another, he decided to never come back, and gave himself up to what he seems to have wanted to do anyhow, which was follow a life of adventure in the East, following the trails of Alexander the Great, the British Imperialists, etc. It is interesting to note that while most venturesome energy in the US was directed Westward, Harlan had been bitten, via brother Richard and lengthy immersion in Greek and Roman history, with the bug of the old world and its more exotic precincts.

Eventually, he hired on with the British East India Company as a doctor, a trade in which he had no expertise whatsoever, and gained familiarity with India and its frontiers. But his eventual aim, once formulated, was to become a ruler somewhere, preferably Afghanistan, whose ever-volatile political system seemed ripe for just his kind of energy and interloping adventure. So he started playing politics, offering his services to those out of power (an exiled former king of Afghanistan) to scheme against those in power. (Cut to a long march into, then out of, Afghanistan... and a decade-long interlude in the service of a Punjabi Maharaja, eventually governing one of his districts.)

Over time, he finally gained entrance to the inner circle of Afghanistan's rulers, and his appreciation of their merits increased markedly, causing him to switch sides away from the exiled ruler. Unfortunately, just after Harlan was appointed general by the Afghan ruler Dost Mohammed Khan and had conducted a remarkable and immensely arduous expedition north to parley with and / or defeat the various Uzbek and Hazara chiefs around Mazar-e Sharif, the British decided they wanted to rule Afghanistan. How dare they?!

As is by now well known, the British army marched into Afghanistan in vast force, easily defeated the locals, and settled into what they thought was another India-style posting, with polo and partying. But not for long... these locals were not obsequious farmers and caste-ridden hierarchs, amenable to foreign rule. No, the Afghans are freedom-loving, highly martial, fissiparous, and blessed with a religion that prizes power and warfare, and with a mountainous country ideal for guerrilla warfare. Only a single Englishman escaped alive.

The British had also placed their bets on Harlan's previous employer- the exiled king Shah Shujah, who was in every way a bad bet as their puppet: cruel, out-of-touch, and incompetent. Harlan astonished the British with his very existence and high position, and during their occupation, argued feverishly for better administration:

"I have seen this country, sacred to the harmony of hallowed solitude, desecrated by the rude intrusion of senseless stranger boots, vile in habits, infamous in vulgar tastes, the prompt and apathetic intruments of master minds, callous leaders in the sangiunary march of heeless conquests, who crushed the feeble heart and hushed the merry voice of mirth, hilarity, and joy." 
"To subdue and crush the masses of a nation by military force, when all are unanimous in the determination to be free, is to attempt the imprisonment of a whole people: all such projects must be temporary and transient, and terminate in a catastrophe that force has ever to dread from vigorous, ardent, concentrated vengeance of a nation outraged, oppressed, and insulted, and desperate with the blind fury of a determined and unanimous will."

In short, he urged the British to buy off the major tribes with plenty of bribes, and to include them in the government. Harlan eventually made his way back to the US and retired to a farm, where he kept scheming- to establish camels in the US military, to transplant Afghan grapes, and to write vast books. He raised a regiment for the Civil War, and died lonely and destitute in that haven of adventurers, San Francisco. It is a remarkable biography, under-appreciated in American history.

How are we doing in the present day? We are bribing the Afghans copiously.. check. We have a ruler in Hamid Karzai who is not incompetent or excessively cruel, but isn't exactly an historic statesman, either. Check. Will he be able to peaceably retire to his fruit orchards in Afghanistan when his term is up and the US continues to melt away? When the foreign money dries up? Our program for Afghanistan requires some deep cultural change, in that elections are supposed to determine who holds power, and merit is supposed to determine who occupies the civil service. But the culture has never been democratic- rather, it is thoroughly aristocratic, with patronage / clientage the vital transmission mechanism. The heads of families and tribes are the only people whose votes count, competing endlessly among each other for position. Can the two systems merge into a working state?

The US experiment has gone longer and better than the Russian, let alone the British, occupations. But whether it sticks in a final, cultural sense, is impossible to tell, and on that everything hangs.


  • Kansas: infra-red Aynrandistan?
  • A libertarian rethink.
  • Do all the wrong people admit being wrong?
  • More on the middle class and inequality.
  • Ella in some serious scat. And with Mel Tormé.
  • State of finance, 2014.
  • Big data + free market + corporate oligopoly + no more privacy = another disaster.
  • Are unions the answer to the disappearing middle class?
  • This week in the Wall Street Journal: "In a republic, if majorities can change laws or rules however they please, you're on the road to life with no rules and no laws."
  • Again, money is a far greater danger to the Republic than snooping as it is currently done, despite the year of Snowden, etc.
  • Economics graph of the week. Whose money is pegged to whom?
Countries pegged more or less to either the dollar (green) or the Euro (blue).