Saturday, February 8, 2014

Being and B.S.

Review of Martin Heidegger's Being and Time.

Martin Heidegger was a philosopher of the interwar and post-World War II period, and one of the founders of the continental school of modern philosophy, which has since headed into deconstruction and postmodernism. He is closely associated with the term existentialism, and is thought by many to be a leading, or even the leading, philosopher of the 20th century. His personal fixation was the question of Being, to which he devoted what is deemed his greatest work, or even his "towering achievement": "Being and Time".

In the development of modern philosophy, Heidegger stands against positivism and the whole analytical school, so I thought it worthwhile to read up on his oeuvre. Surely something is lost in translation, but one does what one can. I can do no better than provide a few quotes, from a translation by Joan Stambaugh, 1977.

At the outset, he tries to forestall doubters:
"It is said that 'Being' is the most universal and the emptiest concept. As such it resists every attempt at definition. Nor does this most universal and thus undefinable concept need any definition. Everybody uses it constantly and also already understands what he means by it. Thus what made ancient philosophizing uneasy and kept it so by virtue of its obscurity has become obvious, clear as day; and this to the point that whoever pursues it is accused of an error of method."

And in the same vein...
" 'Being' is the self-evident concept. 'Being' is used in all knowing and predicating, in every relation to being and every relation to one's self, and the expression is understandable 'without further ado'. Everybody understands 'The sky is blue,' 'I am happy,' and similar statements. But this average comprehensibility only demonstrates incomprehensibility. It shows that an enigma lies a priori in every relation and being toward beings as beings. The fact that we live already in an understanding of Being and that the meaning of Being is at the same time shrouded in darkness proves the fundamental necessity of recapitulating the question of the meaning of 'Being.'"

He then discusses the origins of a scientific field, from a vague intuition through metaphysical speculation, until finally it becomes a well-defined discipline, with methods, laws, theories, etc. Or at least I imagine that is what he is driving at.
"Being is always the Being of a being. The totality of beings can, with respect to its various domains, become the field where definite areas of knowledge- for example, history, nature, space, life, human being, and so on- can in their turn become thematic objects of scientific investigations. Scientific research demarcates and first establishes these areas of knowledge in rough and ready fashion. The elaboration of the area in its fundamental structures is in a way already accomplished by prescientific experience and interpretation of the domain of Being to which the area of knowledge is itself confined. The resulting 'fundamental concepts' comprise the guidelines for the first disclosure of the area. Whether or not the importance of the research always lies in such establishment of concepts, it true progress comes about not so much in collecting results and storing them in 'handbooks' as in being forced to ask questions about the basic constitution of each area, those questions being chiefly a reaction to increasing knowledge in each area."

Now we get into some heavy weather...
"The ontic priority of the question of Being. 
Science in general can be defined as the totality of fundamentally coherent true propositions. This definition is not complete, nor does it get at the meaning of science. As ways in which man behaves, sciences have this beings (man's) kind of Being. We are defining this being terminologically as Dasein. Scientific research is neither the sole nor the primary kind of Being of this being that is possible. Moreover, Dasein itself is distinctly different from other beings. We must make this distinct difference visible in a preliminary way. Here the discussion must anticipate the subsequent analyses which only later will become really demonstrative. 
Dasein is a being that does not simply occur among other beings. Rather, it is ontically distinguished by the fact that in its Being this being is concerned about its very being. Thus it is constitutive of the Being of Dasein to have, in its very Being, a relation of Being to this Being. And this in turn means that Dasein understands itself in its Being in some way and with explicitness. It is proper to this being that it be disclosed to itself with and through its Being. Understanding of Being is itself a determination of Being of Dasein. The ontic distinction of Dasein lies in the fact that it is ontological."

He seems to be trying to establish a conscious and self-reflective being as a special case of the general case of "being". In German, "Dasein" means being (sein) there (da), which does not seem to add very much ... it is "existence" in any case, here or there.

Anyhow, one can imagine pages and pages of this, leading nowhere, and get a thorough sense of this text. It shares with its descendant postmodernism (not to mention its cousin theology) a sort of linguistic propulsiveness (with plenty of italics) and conviction of purpose without actually saying anything. Whether or not one agrees with Heidegger that "The concept of 'Being' is rather the most obscure of all", he makes whatever it is less clear rather than clearer. It is a flood of sophism and pomposity that has led generations of all-too-serious students to strain their eyes and waste their talents, while setting itself up as some kind of tribunal of the highest, metaphysical kind over other fields.

  • Free markets for thee, but not for me.
  • Financial criminals reward each other with pay raises. And sycophantic press. And the uniquely powerful incentives to loot your own bank.
  • Workers of the world will not unite.
  • Yet unemployment is the worst fate of all.
  • NASA is a happy-talk disaster zone.
  • Edward Snowden's background... how he reacted to army atmosphere: "Few of his new army colleagues, he maintained, shared his sense of noble purpose, or his desire to help oppressed citizens throw off their chains. Instead, his superiors merely wanted to shoot people. Preferably Muslims. ‘Most of the people training us seemed pumped up about killing Arabs, not helping anyone,’ he says."
  • PIMCO guru pushes MMT: deficits create money and credit, which we need to support growth. Don't pay attention to all the mistakes I made last year, though, and the year before that, and ...
  • This week in the Wall $treet Journal: "But the lesson from Europe is that the environmentalists who have been relentlessly hawking renewables are the real deniers." This piece makes a valid point- that transitioning to renewable energy is costly and difficult- despite its hypocritical evasion of the appalling conservative denial of climate heating generally. Which is why we need a big carbon tax sooner, not later.

Saturday, February 1, 2014

Fins are not fingers

The evolution of arms and fingers from fish fins, a story of genetic redeployment.

There is still a great deal to learn about how our bodies and minds rise out of our genetic code. Despite a growing flood of genomic data- and we are right on the verge of the $1000 genome, meaning that everyone in the developed world will shortly have their genome sequenced as a matter of medical routine- a vast gulf remains between the concrete knowledge we now have about the two ends of the process: genotype and phenotype.

One of the great American scientists of the 20th century was Edward Lewis of Caltech, who studied the developmental genetics of fruit flies, focusing on mutations that affected their body plan. In one example, he developed mutants whose third thoracic segment, instead of growing tiny winglets called halteres, grew full wings, just like their second thoracic segment. They were a little like dragonflies. This led Lewis on a long path to characterize such "homeotic" mutations, (which transform body parts), and to a Nobel prize.

It is now known that the main gene Lewis studied, "Ultrabithorax", encodes a transcription regulator that sits in the middle of a large developmental network, or cascade, of transcription regulators. The process starts from the glimmerings of polarity determination in the mother's egg, and proceeds through successively finer divisions of space and identity within the growing embryo until we get to the ultimate effector genes that direct neuron production and migration, or muscle development, or one of a thousand other cell types that generate our tissues.

The genes that Lewis studied are collectively termed "hox" genes, short for homeobox, itself short for the DNA-binding motif found in all of these genes whose mutations cause homeotic transformations. The motif has a characteristic DNA and protein sequence, only subtly altered in each gene. They are all related because they are all evolutionary copies of a single ancestor.

These genes sit in the middle of the developmental cascade, and have themselves vast upstream regulatory regions, which gather regulatory information from earlier stages in the process. Segmentation has already happened by the time they come into action, and the homeotic genes integrate the data about which segment their cell is in and, if conditions are right, turn on expression of their encoded regulatory protein, thereby providing input to all the downstream genes that actually prompt the development of that segment's proper parts, be they wings, legs, antennae, arms, etc.
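To make that regulatory logic concrete, here is a minimal sketch. Everything in it- the segment names, the rule for when the Ubx-like regulator is on, the downstream "programs"- is invented for illustration; it is a cartoon of the cascade just described, not the actual genetic circuitry.

```python
# Toy sketch of a homeotic switch: a hox-like regulator reads which segment a
# cell sits in and, if functional, redirects the downstream effector program.
# Segment names, rules, and programs are invented for illustration only.

SEGMENT_PROGRAMS = {
    "T2": "wing",       # second thoracic segment: wing program
    "T3": "haltere",    # third thoracic segment: haltere program (needs Ubx)
}

def ubx_is_expressed(segment: str) -> bool:
    """Ultrabithorax-like regulator: on in T3, off in T2 (toy rule)."""
    return segment == "T3"

def appendage_program(segment: str, ubx_functional: bool = True) -> str:
    """Return the downstream program a segment runs.

    If the Ubx-like gene is mutated (ubx_functional=False), T3 falls back to
    the default wing program, i.e. the classic homeotic transformation that
    Lewis characterized.
    """
    if segment == "T3" and not (ubx_functional and ubx_is_expressed(segment)):
        return SEGMENT_PROGRAMS["T2"]          # haltere -> wing transformation
    return SEGMENT_PROGRAMS.get(segment, "none")

if __name__ == "__main__":
    print(appendage_program("T3"))                        # haltere
    print(appendage_program("T3", ubx_functional=False))  # wing (four-winged fly)
```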



Hox genes occur in tandem clusters, and the clusters themselves have been duplicated during evolution. In the diagram above, (from wikipedia), sea urchins, at the top, have something like the original cluster of eleven hox genes, color coded by their position in the cluster, which also relates to the position along the body axis where they are expressed (at right). Fruit flies, at the bottom, lost a few copies, and gained a few others, but retain basically the same system. Fish and tetrapods, in the middle, duplicated the entire set, copying whole clusters to various chromosomes, and lost individual hox gene units along the way. This elaboration allowed more complicated body plans to develop, with the example of fingers being a new use of the hox code, added onto the basic body trunk segment-by-segment code. The head and brain are another place where the hox system has been re-used in tetrapods.
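To keep the bookkeeping straight, here is a deliberately simplified toy of that history in code: one ancestral cluster, duplicated wholesale into A-D copies, each copy then losing individual members. The gene counts and the specific losses below are illustrative only, not an accounting of any real genome.

```python
# Toy model of hox cluster evolution in tetrapods: whole-cluster duplication
# followed by piecemeal loss of individual genes. All losses are invented.

ancestral = [f"hox{i}" for i in range(1, 12)]          # one cluster of 11 genes

losses = {                                             # hypothetical losses per copy
    "A": {"hox8"},
    "B": {"hox10", "hox11"},
    "C": {"hox1", "hox2"},
    "D": {"hox5", "hox6", "hox7"},
}
clusters = {name: [g for g in ancestral if g not in lost]
            for name, lost in losses.items()}

for name, genes in clusters.items():
    print(f"hox{name}: {len(genes)} genes  {genes}")

# Redundancy falls out naturally: every ancestral position survives in at
# least one cluster copy, even though no single copy retains them all.
survivors = {g for genes in clusters.values() for g in genes}
print("positions retained somewhere:", len(survivors), "of", len(ancestral))
```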

One confusing element of the field is that in tetrapods, the hox A and D clusters are partly redundant. Each can, on its own, direct formation of arm and fingers, and both need to be deleted to eliminate the arm. So the researchers in today's paper mix and match from both clusters to make their various points.
"During mammalian limb development, the activity of both HoxA and HoxD gene clusters is essential and the absence of these two loci leads to rudimentary and truncated appendages."

In the embryonic hand, expression of many Hox D genes, from d9 to d13, is required to specify tissues during development, as is expression of a few of the Hox A genes. They have overlapping patterns rather than some neat, digital(!) code, this being messy biology, but through mutation and other studies, researchers have pieced together some information about which gene of the tandem arrays does what. The genes have some individual characteristics, but much of their regulation is collective, directed from enormous regions on both sides of the cluster, comprising over three million base pairs of DNA.

The Hox D locus, on human chromosome 2. It contains eight distinct hox genes, (numbered black boxes at bottom), flanked by enormous control regions on either side which drive expression of some cluster genes in the hand (blue) and some in the arm (red), responding to transcription regulators earlier in the cascade of developmental patterning and differentiation. What are those fancy-looking blue and red cubic designs? They reflect a separate study in which the authors physically tested which DNA was close to which other DNA in embryonic cell chromosomes. They found that the right and left regions each form their own knotted-up domain, each hooking up with the central hox D genes, but not touching anything on the opposite side.
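As a toy illustration of what "each side forms its own domain" means in such chromosome-contact data, here is a sketch built around a small fabricated contact matrix; the bin names and numbers are made up, and real analyses use far larger matrices with careful normalization.

```python
import numpy as np

# Fabricated contact matrix: regions to the left and right of a central gene
# cluster each contact the cluster frequently, but rarely contact each other.
bins = ["left_reg_1", "left_reg_2", "hoxD_cluster", "right_reg_1", "right_reg_2"]
contacts = np.array([
    [50, 40, 30,  2,  1],
    [40, 50, 35,  1,  2],
    [30, 35, 50, 32, 28],
    [ 2,  1, 32, 50, 45],
    [ 1,  2, 28, 45, 50],
])

center = bins.index("hoxD_cluster")
left = list(range(0, center))
right = list(range(center + 1, len(bins)))

left_to_cluster = contacts[left, center].mean()
right_to_cluster = contacts[right, center].mean()
cross_side = contacts[np.ix_(left, right)].mean()

print(f"left regions -> cluster:  {left_to_cluster:.1f}")
print(f"right regions -> cluster: {right_to_cluster:.1f}")
print(f"left <-> right contacts:  {cross_side:.1f}")   # much lower: two separate domains
```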

A recent paper is one of a pair finding that the two clusters, hox D and hox A, are both flanked by very large regulatory regions which in fish show only slight differentiation, one directing slightly more distal (towards the outside) expression than the other (red). The large downstream regulatory region (red), which originally specified expression in fish fins, has pretty much retained the same function in tetrapods, specifying the arm.

But the large regulatory region on the other side (blue) in fish only adds a little extra expression to some cluster members towards the outside of the limb. In tetrapods, however, it specialized to direct expression of hox D genes in the hand, to the near exclusion of expression anywhere else. The basic finding is that fish fins are not proto-fingers, really, but are related principally to our arms. The fingers arose from a mostly new regulatory program established by the blue areas in the genome shown above. And the wrist ... that is specified in the gap, partly by the lack of hox expression. It is interesting to note as an aside that the hox B and hox C clusters seem to have regulatory control only from one side, not both sides.

The inference of the paper is that the hand-specifying regulatory regions of hox D and hox A (blue) developed from earlier regions (yellow) that had relatively minor roles in fish, and which specified the margin of the fin rather than a separate structure.

What is some of their evidence? Well, first, let's see some of the native expression of mouse hox A genes:

Expression of individual genes from the mouse hox A cluster, showing finger-specific expression for 9, 10, 11as, and 13. The exception of hox A11 is striking, both as a departure from the hand-specific pattern of its nearby siblings and in its well-defined zeugopod (lower arm) expression pattern.

One obvious experiment was to transplant the fish hox DNA into mice to ask where it gets expressed. It always gets expressed in the same place- where the arm expression happens, at the base of the limb bud, not where finger expression happens. This makes the case pretty strongly that finger expression and development was, as one might imagine, a novel evolutionary development.

Mouse embryonic limb buds showing the expression of a transgenic zebrafish hox A cluster, with the regulatory regions and genes it contains, including each of those labeled. They all get expressed in the near, or arm, region, not in the finger region. This was true no matter which regulatory region of the zebrafish hox A cluster was used, whether the upstream or the downstream side.

Even more striking, the researchers show expression patterns in complete embryos. Below is a stage E11.5 mouse embryo with transgenic fish hox A13, driven by the fish regulatory region corresponding to what would be the hand/finger-specifying region on tetrapods. Its expression appears in many areas of the body, but not in the fingers, as the mouse's own hox A13 does. It is worth noting that in vertebrates, the hox genes are used all over again in specifying brain region development, which does not happen in flies. It is a common theme- that through the accumulation of regulatory complexity, the same genes can be re-used many times to create ever more elaborate phenotypes.


As you can see from the genome locus diagram a few figures above, the regulatory regions controlling the hox D genes are far, far larger than the protein-coding genes themselves. Complexity of control is the theme in all genomes, especially ours. These regions contain many little modular bits of DNA that bind various other transcriptional regulators operating from upstream in the developmental cascade, allowing a progressive, step-by-step (though in actuality stochastic and mix-and-match) evolutionary process whereby the silk purse of our present bodies is made out of the sow's ear of a few thousand ancient genes.

  • 23 & me genetic testing- another front in privacy and big data.
  • Example of another paper on limb formation, in the transcription regulator cascade of development.
  • Creationism map.
  • The POTUS with the SOTUS- does work pay the worker, or only the CEO?
  • These kids just don't understand religion!
  • The patent backstory to the Google, Motorola, and Nortel deals.
  • Fascism, American style- corporations and the blacklist.
  • Economic quote of the week, from John Schmitt:
"Workers today are a lot older than they were in the 1960s or the 1970s, and they are enormously better-educated than they were in the 1960s or 1970s. The fact that most workers are doing barely better, and some workers are doing worse than their counterparts from 40 or 50 years ago … suggest that the problem is that the way the economy converts people’s skills, people’s experience, people’s education and their training, into good jobs is what has deteriorated over this period. Not people’s underlying skills, or work experience, or education."

Saturday, January 25, 2014

Surveillance, politeness, and privacy

Is the NSA killing us or protecting us?

Surveillance as a general social principle. We are always watching each other, and it is the primordial way of being in society. In the old days, gossip was the principal method of leveraging surveillance into social power and enforcement. Now we happily surveil each other via facebook, twitter, and google earth, and leave comments. The issue in our new surveillance environment is not the existence of surveillance per se, but the asymmetry and invasiveness of surveillance. Do we know who is watching, what they are watching, and when they are watching? Are they harming us? Can we turn it off?

Traditionally, social surveillance is always mutual. You see me at the same time I see you- having a meal together, talking, hunting. The power of this mutual observation and interaction is immense, policing our behavior so as to enforce "normal" standards, alert for any deviation, political or moral lapse, for novel signals of fashion, disease, innovation, threat, etc. Religion is its purest expression- including extensive, in-depth thought policing.

Some people stand up well to all this observation, some don't. The pervasive social pressure has enormous effects on our mental health, causing depression, suicide, peer pressure, status anxiety, and the like- one of the great, if not the greatest, motive forces of politics and social life in general. One point of etiquette is to relieve people of this anxiety, leaving their private affairs politely out of the conversation, even as the observation goes silently on. The essence of privacy is not that we are not observed, but that we are not held to account or bullied about it beyond endurance.

The totalitarian societies were a sort of reversion back to the small town mode of intense surveillance, with a total invasion of privacy and violation of civilized etiquette in the bargain, using all this information against people at their most vulnerable points. But in large societies we have typically adapted to a much looser model of toleration & privacy, where due to the sheer numbers and density, more observation and more diversity must be accommodated than humans are naturally comfortable with. So we keep a small community of close relationships and mutual close surveillance, amid a large cloud of anonymous and little-noticed passers-by.

Big data has changed all this, bringing the intimacy of small town surveillance, where the local store clerk, for instance, knew what everyone bought, to the global stage. Some embrace the facebook-i-zation of personal surveillance. The question is mostly whether we can turn off portions of this surveillance that we do not like, or which we collectively deem asymmetrically unfair and invasive, or corrupt and incompetent. For instance, our credit cards provide faceless corporations entree to all our purchases, which they diligently mine for scraps of sales leads, and sell off to their "partners". It is a seamy, disreputable business, and not at all voluntary.

If they had reasons of state, and a secret court looking over their shoulders, I would be far more amenable. But they don't. Credit cards are not an optional institution in today's world, so this surveillance is essentially involuntary, and extremely asymmetric. Its typical results, however, are modestly annoying, rather than invasive or life-threatening, so the cost has to date been borne without too much complaint. And the monitoring of all our web comings and goings ... well, it is not far from George Orwell's telescreens of 1984, which monitor everyone with an unblinking eye.

What to do? The NSA portion of this is relatively inconsequential, really. The average person's degree of invasion from its practices is essentially nil, though surely mistakes have happened and caused great individual harm. The government's no-fly list is an example of a relatively more open program plagued with error and invasiveness.

But the flood of other personal data rushing into corporate and other unknown hands is far more serious. The Target incident where tens of millions of accounts were stolen, the ongoing traffic in social security numbers, identity theft, false tax claims, endless spam, targeted come-ons, and the rest all point to a "system" in crisis. We have let our virtual selves contain ever more important data without vetting anything, or building any serious legal structure. Sure, the companies in question have a stake in customer faith and thus their own prudence & etiquette. But their transparency is nonexistent and their failures clearly frequent. We have no idea, and even they have little idea, what has been stolen or bartered away into the nether worlds of cybercrime.

Even biometrics hold out little hope. A fingerprint or iris scan can be forged, as can any other piece of data. We are trapped in a data whirlwind, where it is only ourselves, in person and with competent memories, that can completely attest to identity. So we are back to the personal, one-to-one world of rich and personal information that we began with.

I don't think it is enough to hark back to the privacy provisions of the constitution and take an absolutist position that divides harsh restrictions on government surveillance from a wild-west atmosphere in the private realm, papered over with the concept of so-called "voluntary" participation. We need new law in this new realm, to enforce competence of information collection and safeguarding on all entities that collect big data, (with business-ending penalties for flagrant breaches), and to match its social effects and invasiveness with public oversight.


  • Drone war- the surveillance that kills.
  • Is scandal and blackmail the currency of NSA surveillance? That is not at all clear.
  • Intensive spying by big Pharma.
  • The $1000 genome is upon us.
  • Why are we stuck in a Reagan-era-rut in policy towards Latin America?
  • Long hours.. are not productive if you are creative and/or competent.
  • In Afghanistan, ".. the security situation throughout the country has gotten worse, not better, since the 2009 election."
  • Martin Luther King and the job guarantee.
  • A better union model, from Germany.
  • Buñuel does the conference call.
  • Generate your own scholarly, postmodern web page.
  • The expert's humorous guide to science fiction.
  • Brain asymmetry- just the facts, please.
  • As usual, companies can break the law, and contribute to the rot of public services.
  • Europe's youth- twisting in the wind. Even on film.
  • Martin Wolf: The megabanks are still too big to fail. Bigger and bail-i-er than ever, actually. In his review of Bernanke's tenure, he misses one critical failure- the failure to explain clearly to congress that withdrawing continued fiscal support was criminal. Monetary policy cannot replace, and has not replaced, straight spending.
  • Economic cartoon of the week, Toles on trickle-down, Keynes, and the GOP's plans for the poor.

Saturday, January 18, 2014

The problem with positivism

"Positivism states that all authentic knowledge allows verification and that all authentic knowledge assumes that the only valid knowledge is scientific."

What is truth? A simpler, and more frequently used, word could hardly be imagined, but philosophers differ over it, probably because of sentimental attachments to beliefs that may not be true. In the hands of theologians, idealists, and artists, truth often stands for "something I believe". If a novel stirs a deep emotion, it is true, even while it is false. If an artwork reflects and expresses a facet of the human condition in a surprising or powerful way, it is true. And if a belief in a deity is beautiful, socially bonding, and morally edifying, it is also true. At least one athlete is the truth.

This definitional issue remains quite confusing and misleading. The subjective uses of "truth" have little to do with canonical correspondence truth, (i.e. the equation of the thought and reality), in that what corresponds to the feeling of truth is a feeling it agrees with, not a condition of the outside world. Subjective states surely deserve recognition of their existence and texture. But the word truth may not be the best way to describe them.

In contrast, science and the law take a more blinkered view. If something is true, it actually happened, or is part of the real world verified by observation and continually available for re-observation, and / or other forms of close analysis. While the sciences are edging into regions traditionally part of the humanities, they still regard truth as objective, and separate from personal state, wishes, ideology, etc. The DNA reads one way, and not another. The defendant was at the scene of the crime, or not. Evidence may not exist, and the truth may not be known, but that does not impair the idea of truth- its definition and possibility.

In this regard, our minds are truth engines, working very hard to model reality with accuracy. Eyesight is the most dramatic example, bringing us incredibly rich and accurate scenes with no apparent effort. But on more abstract levels too, we are constantly trying to figure things out, particularly other people, the object of so much of our intuitive acuity. But there are limits ... we have no intuitive grasp of physics at very large or very small scales, nor is our introspection particularly effective. The self is a black box that we struggle our whole lives to understand.

And one tool of all this modeling is imagination, which both consciously and unconsciously conjures all sorts of worlds and images, sometimes as hypotheses to be pursued, sometimes as warnings to be avoided. Unfortunately, (or perhaps fortunately), the line between sober analysis and imagination is not all that clear, leading to the establishment of the scientific method as a general and organized way for communities of people to figure out the difference, in fields where real truth is at least conceivable.

This was the hope of the positivists: to put all knowledge on this same footing, by setting verificationist, empirical standards for knowledge and truth, and keeping all else outside the door. They tried to define everything else as "nonsense", or as not meaningful. But unfortunately, most of human experience happens in far more nebulous realms of subjective experience, vague judgements, and hopeful propositions. Which are often very highly meaningful indeed. So this re-definitional part of the project was as futile as it was repugnant.

For instance, not even the most airy metaphysical questions are entirely meaningless, though positivism proposes exactly that. Rather, their resolution, after thousands of years of speculation, does not typically lie with the speculators. Philosophers provide the service of keeping some of these questions alive, at least in the academy, and of trying out various intuitive solutions to them. But the remaining problems of philosophy are clearly ones where both data and intuition are lacking. Whether data ever arrives is the main question. Whether intuition will ever resolve them is much less of a question.

More technically, the word positivism signifies positive proof, and by various skeptical arguments, (such as Hume's and the problem of induction generally), and by historical experience, it is clear that proof (i.e. verificationism) is a mirage in science, not to mention other fields. The most that can be hoped for is a provisional model of reality that doesn't violate too many observations- a coherentist model of truth.

So Karl Popper, for instance, who was altogether sympathetic to positivism, came out with his falsificationist principle, in opposition to the verificationist principle of positivism- becoming formally an anti-positivist, or at least a post-positivist. But even falsificationism is too stringent, since a contradictory observation can as easily be erroneous as damning. Judgement and interpretation are always called for, on the appropriate level of analysis.

A positivist temple, with Auguste Comte out front.
My take on all this is that positivism was overly ambitious. The point can be well-taken without setting up a new altar to absolute truth. All truth is, on our level, probabilistic, and exists on a spectrum from the precise and well-attested to the hearsay and ludicrous. That is what the contemporary Bayesian revolution in statistics and science generally is getting at, and what was lost in the positivists' rather extreme, utopian project, for which they were bickered out of existence. Far larger lies and absurdities, however, were (and are) rampant in the field of philosophy than the shades of truth-i-ness found in the scientific literature or the history of science. To wit, a quote from Nietzsche:
"The other idiosyncrasy of philosophers is no less dangerous; it consists in confusing the last and the first things. They place that which makes its appearance last ... the 'highest concept', that is to say, the most general, the emptiest, the last cloudy streak of evaporating reality, at the beginning as the beginning. This again is only their manner of expressing their veneration: the highest thing must not have grown out of the lowest, it must not have grown at all ... thus they attain to their stupendous concept 'God'. The last, most attenuated and emptiest thing is postulated as the first thing, as the absolute cause, as 'ens realissimum'. Fancy humanity having to take the brain diseases of morbid cobweb spinners seriously! - And it has paid dearly for having done so."
-Quoted by Max Horkheimer, in Eclipse of Reason.
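As an aside on the probabilistic, graded notion of truth mentioned above, here is a minimal sketch of Bayesian updating; the prior and likelihood numbers are arbitrary, chosen only to show how confidence accumulates without ever amounting to "proof".

```python
# Minimal sketch of Bayesian updating: belief in a hypothesis is a probability
# that shifts with each observation, rather than an all-or-nothing verification.
# The prior and likelihoods below are arbitrary numbers for illustration.

def update(prior: float, p_obs_if_true: float, p_obs_if_false: float) -> float:
    """Return P(hypothesis | observation) via Bayes' rule."""
    numerator = p_obs_if_true * prior
    return numerator / (numerator + p_obs_if_false * (1.0 - prior))

belief = 0.5                      # start agnostic
for _ in range(3):                # three independent supporting observations
    belief = update(belief, p_obs_if_true=0.8, p_obs_if_false=0.3)

print(f"belief after three observations: {belief:.2f}")   # well-attested, never proven
```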

  • Some atheist basics.
  • Big surprise- conformists tend to go to church. Where their children are taught...
  • Superior vaccine delivery and activation.
  • Full review of the Robert Gates memoir.
  • Reflections on a past basic income and job guarantee scheme.
  • How discrimination works. And the key importance of learning on the job.
  • Europe's elites are screwing up again. Though they are hardly alone.
  • To Bill O'Reilly, a 40% pay increase is "not a big deal".
  • Born to not run... subpoenas will be flying.
  • Evil, climate change, and collective action.
  • Robots, jobs, and the second machine age. But the problem is not technological, it is economic and political.
  • This week in the Wall $treet Journal, on how the FCC should let CEOs run the internet: "... the FCC should drop its pursuit of net-neutrality rules altogether.... Next, the FCC should unequivocally restate its commitment to the multi-stakeholder model of resolving network-management challenges and Internet governance."
  • Economic graph of the week; we are bumping along at bottom, in terms of overall employment:

Saturday, January 11, 2014

Sympathetic vibrations: speech waves and brain waves

Brain waves sync up with perceived speech, pointing to possible functions.

What do brain waves do? They are a prominent feature of live, working brains, and change markedly under different conditions, especially sleep and epilepsy. They seem like a natural analog to the CPU clocking that is so essential in artificial computers, but are clearly more chaotic, slower, and diverse. They seem to make up a moderately important dimension of brain processing, combining with the more fundamental dimensions of anatomical organization and electrical/chemical pathway conduction to constitute brain activity.

A recent paper makes the comment that.. "A large number of invasive and non-invasive neurophysiological studies provide converging evidence that cortical oscillations play an important role in gating information flow in the human brain, thereby supporting a variety of cognitive processes including attention, working memory, and decision-making."

So what does "gating" mean? That is a bit hard to say. In artifical computers, the clock cycle is essential to quantize the computations so that each transistor and each computation is given a chance to do its thing in a defined time, then rests so that other elements can catch up to it, keeping the whole computational process in logical register. Brains may need a similar service, but clearly it is far messier, since individual neurons take orders from no one- they seem to fire almost chaotically. While rhythmicity is a property of individual neurons, brain waves (aka cortical or electrical oscillations) are very much a mass phenomenon, only biassing the behavior of individual neurons, not ruling them outright.

An attractive place to look for their function is in auditory cognition, especially speech recognition, since each domain features a multi-frequency mix of related rhythms, though the range of sound frequencies (~30 Hz to ~15,000 Hz) is substantially wider than the range of electrical brain oscillations (a few Hz to maybe 150 Hz). Maybe they map to each other in some discernible way? As the authors state:
"The similarity in the hierarchical organisation of cortical oscillations and the rhythmic components of speech suggests that cortical oscillations at different frequencies might sample auditory speech input at different rates. Cortical oscillations could therefore represent an ideal medium for multiplexed segmentation and coding of speech. The hierarchical coupling of oscillations (with fast oscillations nested in slow oscillations) could be used to multiplex complementary information over multiple time scales for example by separately encoding fast (e.g., phonemic) and slower (e.g., syllabic) information and their temporal relationships."

Basically, the authors had subjects (22 of them) listen to about seven minutes of speech, played either forward or backward, and at the same time used magnetoencephalography (MEG), i.e. a ginormous machine that detects slight magnetic fields emanating from the head, to track superficial brain waves. MEG is somewhat more sensitive than EEG, which is done with electrodes pasted onto the head. Then they fed both data streams into a correlating procedure (below), and looked for locations where the two oscillations were related.

Procedure of analysis- each waveform stream was deconstructed and correlated, to find locations in the brain where electromagnetic surface waves reflect speech waves.
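To give a feel for the kind of computation involved, here is a hedged sketch: band-limit a simulated sensor trace into the theta range and correlate it with a simulated slow speech rhythm. The signals, frequencies, and the plain correlation measure are stand-ins of my own; the actual study worked with source-localized MEG and proper coherence statistics.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Sketch of speech-brain coupling analysis on synthetic stand-in signals:
# a ~4 Hz "speech rhythm" and a noisy sensor trace partially entrained to it.

fs = 200.0                                  # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)                # one minute of data
rng = np.random.default_rng(0)

speech_rhythm = np.sin(2 * np.pi * 4 * t)                    # ~4 Hz syllabic rate
meg = 0.5 * speech_rhythm + rng.standard_normal(t.size)      # partially entrained sensor

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

theta_meg = bandpass(meg, 3, 7, fs)         # theta-band component of the sensor

r = np.corrcoef(speech_rhythm, theta_meg)[0, 1]
print(f"correlation of speech rhythm with theta-band MEG: r = {r:.2f}")
```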

They found several instances of correlation. Two were in the low frequency (1-2, 4-8 Hz) delta and theta rhythms, which directly entrain with the speech rhythm. Two more were in the 20 and 50 Hz range, where the amplitude of these gamma rhythms correlated with the phase of the lower frequency speech rhythms, a somewhat indirect correlation. The locations of these brain wave correlations were naturally over the auditory and speech centers of the brain:

Location of brain waves, of various frequency bands, that correlated with speech patterns. This is a map of significant results, mapped to each hemisphere, with the right hemisphere shown on the right. Note the significant differences between the hemispheres.

"In sum, this comprehensive analysis revealed two distinct speech tracking mechanisms in the brain. First, low-frequency speech modulations entrain (that is, align the phase of) delta and theta oscillations in the auditory cortex. Second, low-frequency speech modulations also entrain the amplitude dynamics of gamma oscillations."


Speech trace (A) shown with a superimposed dotted line (cosine) of the theta brain wave of the listener. In B, the brain is shown, with locations of 3-7 Hz entrainment labeled in red, specifically entrainment that differed significantly between the forward and backward speech renditions. C shows the overall cross-correlation data, for both hemispheres, with signals at 20 and 48 Hz, at least on one hemisphere. This tracked not overall speech, but the correlation with speech starts and stops, showing close phase tracking.

The phase entrainment shifted position when successive speech elements (stops/starts for sentences and words) arrived, showing that the system tracks the input quite carefully.

Most intriguingly, the authors found that backward speech was significantly less correlated with brain waves than forward speech. This indicates some top-down control, where intelligibility of the speech stream is broadcast back to lower levels of the auditory processing apparatus to fine-tune expectations of the next event, via stronger rhythmic alignment.

They also found differences between the hemispheres, with the low-frequency correlations stronger in the right hemisphere, and the gamma-wave correlations stronger in the left, which contains the primary language areas in most people (such as Broca's and Wernicke's areas).

"Our study supports emerging models of speech perception that emphasise the role of brain oscillations. Hierarchically organised brain oscillations may sample continuous speech input at rates of prominent speech rhythms (prosody, syllables, phonemes) and represent a first step in converting a continuous auditory stream to meaningful internal representations."

One can imagine that brain waves assist processing in several ways. When unified over large areas of the brain, they might enforce regimented processing, (i.e. transfer of neuronal signals from one cell / layer / module to the next, in ways that constitute signal processing from raw to more abstract representations), which could make it more efficient and also better able to affect other areas of the brain, such as consciousness. In auditory processing, the advantage in lining up processing with the actual signal should be clear enough. They could also reduce chatter in the system, which seems universal in other brain studies. Do they "carry" signals themselves? Not really, just as the computer clock cycle doesn't tell us what the computer happens to be doing, but facilitates the detailed processing flowing through its innumerable wires and junctions.


  • A better review of the same paper.
  • Test your hearing.
  • Religion, tribalism, hate, love, etc. etc...
  • But some still insist upon religion. And "definitively refute" atheism. And finish up with C. S. Lewis. Hmmm. 
  • The onion refutes it a little better.
  • And becoming an atheist.. not so easy.
  • Economic wreckers and capitalist running dogs in our midst.
  • Turns out, Republicans do favor redistribution, after all.
  • Managing the job guarantee.
  • 4K TVs work wonders as monitors.
  • The India diplomatic row is an example of why worker protections and minimum wage protections are so important... the system worked.
  • Satanists.. performing a public service.
  • Yes, he is a bully.
  • Inheritance is increasingly significant, so death taxes are more important than ever.
  • Economists have no idea what they are doing.
  • Economic graph of the week, on unemployment.

Saturday, January 4, 2014

An American Marco Polo: Josiah Harlan

Quaker, ruler of Gujrat in the northern Punjab, General of Afghanistan, all-around schemer and adventurer.

The adventures of Marco Polo (1254-1324) are famous, mostly because they were so well recorded. He followed the briefly open silk road during the heyday of Kublai Khan, travelling all over the far East, and ruling briefly in the Khan's service in China. But when he returned to Venice, he was overtaken by the vortex of local politics, and was co-imprisoned with a gifted writer who helped put his extraordinary, yet quite accurate, tales into clear and compelling prose. Tales that came to be disbelieved after the silk road closed up again with the dissolution of the Mongol empire.

Unfortunately, Josiah Harlan (1799-1871) had no practiced ghost-writer, and was so politically vociferous in his anti-imperial writings that his lengthy memoir never heard the clang of a printing press. Nevertheless, his story has obvious parallels with Polo's, and contains interesting lessons for our own brushes with imperialism.

The book is "The man who would be King: the first American in Afghanistan", by Ben MacIntyre. Harlan was born into a mecantile family, for whom he shipped out to Canton and points east as "supercargo", or manager and sales agent for a ship's mechandise. Hearing from afar that his recent fiancé had married another, he decided to never come back, and gave himself up to what he seems to have wanted to do anyhow, which was follow a life of adventure in the East, following the trails of Alexander the Great, the British Imperialists, etc. It is interesting to note that while most venturesome energy in the US was directed Westward, Harlan had been bitten, via brother Richard and lengthy immersion in Greek and Roman history, with the bug of the old world and its more exotic precincts.

Eventually, he hired on with the British East India Company as a doctor, a role for which he had no expertise whatsoever, and gained familiarity with India and its frontiers. But his eventually formulated aim was to become a ruler somewhere, preferably Afghanistan, whose ever-volatile political system seemed ripe for just his kind of energy and interloping adventure. So he started playing politics, offering his services to those out of power (an exiled former king of Afghanistan) to scheme against those in power. (Cut to a long march into, then out of, Afghanistan ... and a decade-long interlude in the service of a Punjabi Maharaja, eventually governing one of his districts.)

Over time, he finally gained entrance to the inner circle of Afghanistan's rulers, and his appreciation for their merits increased markedly, causing him to switch sides from the exiled ruler. Unfortunately, just after Harlan was appointed general by the Afghan ruler Dost Mohammed Khan and conducted a remarkable and immensely arduous expedition north to parlay with and / or defeat the various Uzbek and Hazara chiefs around Mazar-e Sharif, the British decided they wanted to rule Afghanistan. How dare they?!

As is by now well known, the British army marched into Afghanistan in vast force, easily defeated the locals, and settled into what they thought was another India-style posting, with polo and partying. But not for long... these locals were not obsequious farmers and caste-ridden hierarchs, amenable to foreign rule. No, the Afghans are freedom-loving, highly martial, fissiparous, and blessed with a religion that prizes power and warfare, and with a mountainous country ideal for guerilla warfare. Only a single Englishman escaped alive.

The British had also placed their bets on Harlan's previous employer- the exiled king Shah Shujah, who was in every way a bad bet as their puppet: cruel, out-of-touch, and incompetent. Harlan astonished the British with his very existence and high position, and during their occupation, argued feverishly for better administration:

"I have seen this country, sacred to the harmony of hallowed solitude, desecrated by the rude intrusion of senseless stranger boots, vile in habits, infamous in vulgar tastes, the prompt and apathetic intruments of master minds, callous leaders in the sangiunary march of heeless conquests, who crushed the feeble heart and hushed the merry voice of mirth, hilarity, and joy." 
"To subdue and crush the masses of a nation by military force, when all are unanimous in the determination to be free, is to attempt the imprisonment of a whole people: all such projects must be temporary and transient, and terminate in a catastrophe that force has ever to dread from vigorous, ardent, concentrated vengeance of a nation outraged, oppressed, and insulted, and desperate with the blind fury of a determined and unanimous will."

In short, he urged the British to buy off the major tribes with plenty of bribes, and include them in the government. Harlan ended up making his way back to the US and retired to a farm, where he kept scheming- to establish camels in the US military, to transplant Afghan grapes, and to write vast books. He raised a regiment for the Civil War, and died lonely and destitute in that haven of adventurers, San Francisco. It is a remarkable biography, under-appreciated in American history.

How are we doing in the present day? We are bribing the Afghans copiously ... check. We have a ruler in Hamid Karzai who is not incompetent or excessively cruel, but isn't exactly an historic statesman, either. Check. Will he be able to peaceably retire to his fruit orchards in Afghanistan when his term is up and the US continues to melt away? When the foreign money dries up? Our program for Afghanistan requires some deep cultural change, in that elections are supposed to determine who has power, and merit is supposed to determine who occupies the civil service. But the culture has never been democratic- rather, it is thoroughly aristocratic, with patronage / clientage the vital transmission mechanism. The heads of families and tribes are the only people whose votes count, competing endlessly among each other for position. Can the two systems merge into a working state?

The US experiment has gone longer and better than the Russian, let alone the British, occupations. But whether it sticks in a final, cultural sense, is impossible to tell, and on that everything hangs.


  • Kansas: infra-red Aynrandistan?
  • A libertarian rethink.
  • Do all the wrong people admit being wrong?
  • More on the middle class and inequality.
  • Ella in some serious scat. And with Mel Tormé.
  • State of finance, 2014.
  • Big data + free market + corporate oligopoly + no more privacy = another disaster.
  • Are unions the answer to the disappearing middle class?
  • This week in the Wall Street Journal: "In a republic, if majorities can change laws or rules however they please, you're on the road to life with no rules and no laws."
  • Again, money is a far greater danger to the Republic than snooping as it is currently done, despite the year of Snowden, etc.
  • Economics graph of the week. Whose money is pegged to whom?
Countries pegged more or less to either the dollar (green) or the Euro (blue).

Saturday, December 28, 2013

Why the middle class is so great

Working, but not desperately working.

Last week's post about robots and what happens when humans don't need to do anything raises a basic issue of economics- what is the point of an economic system? Is its point to rape the earth of its biological and mineral resources? Is it to justify the dominance of one class over another? Is it to afford a select few the leisure to do absolutely nothing? Is it to distribute goods of all kinds in accordance with each person's value? Or is it to make as many humans as happy as possible given the means at our disposal, whether those means are limited or unlimited?

People may idolize idleness, as the golden reward after a life of toil, a social right based on the exploitation of lesser classes, or the utopia of a roboticized world. But however arrived at, it is not, in fact, conducive to happiness. Not after the first day or two. Nor does entertainment fill the void on its own. Constant entertainment, without the fundamental bass of productive endeavor, palls quickly. Perhaps competition is the ultimate occupation, coming to the fore when other needs have been sated. But competition is also ultimately destructive when not carefully channeled to productive or legitimate ends.

So one point of an economic system is to keep people productively occupied. Hopefully with work they find interesting and fulfilling, but at any rate with something by which they can be and feel useful to others. If abundance is the norm, with sustenance a given, then this may be the *only* point of an economic system. While some people have the fiery self-motivation and talent to create their own productive (or predatory) path- say, in the arts, or in entrepreneurial business- most people need more of a push.

Thus our system of capitalism, where one's income depends on some kind of service rendered to others, judged by the labor market, does its immense work of matching roles and enrollees admirably, for the most part. But it is hardly the last word. If due to the high productivity or ultimate roboticization of the economy as a whole, the vast majority of people do not have to work for sustenance, what then? Should the system allow vast riches to flow to the few so that the rest, while they could be amply provided for with little effort, end up scrambling for low-paying or even non-existent positions in a terrifying game of musical chairs?

Clearly, it is much worse to have no work, due to the system malfunction of unemployment, than to suffer the blight of excess riches. Both waste the energies of a person who could be of service to her fellows. Unemployment adds existential and social terror. Low-paid work is less than optimal, tending to trap people in menial tasks that should be automated, and to keep them (and their children) from the education and other cultural resources which would make them capable of greater service to their fellows, not to mention just basic happiness and flourishing. If menial tasks absolutely have to be done, that is bad enough. But why pay poorly for them as well?

My point is that the ideal economic system is one that generates the broadest and largest middle class. This is the economically (and psychologically) optimal condition, not only by way of political bromides aimed at the majority (if the middle class is the majority), but in an objective sense. It is the class that has the incentive to work diligently at productive jobs, the education to be maximally productive, and the means to enrich its communities and its children to generate still better future conditions along the same lines. The possible / prospective services that can keep everyone employed are absolutely boundless- they do not have to be "stuff" manufactured on an assembly line or drilled out of the ground. They can be philosophizing, teaching, music making, writing, street performing.. we just have to find a way to organize payment. It is sort of unfortunate that so much human creativity has over the last decade moved to the internet and become simultaneously far more productive than ever and far worse-paid, if paid at all. But that is a question of business models and what, perhaps, the public sector can do to either directly pay for creative works, or to alter the rules to make them financially sustainable.

Historically, high culture and high education were the preserve of the elite, and if a revolution generated more egalitarian economic conditions, equality would be achieved at a low level, not a high one. Great cultures of antiquity were built on brutal inequality. But after the French Revolution, and much more so in our current developed and wealthy age, this rule has been turned on its head. Cultural leadership is not a matter of wealth at all, and movement after movement of popular music and other arts rise from humble roots, not wealthy ones. Economic leadership comes from our meritocratic educational and corporate structures, not from the skull and bones (and blood) elites of old. The idea that the wealthy provide some special service that enriches us culturally or economically is completely defunct. So the only remaining rationale for wealth is simple just deserts- the reward of special and individual service to others, through leadership, invention, innovation, thought, and the like. Which pretty much leaves the bulk of the financial industry by the wayside, not to mention the lucky (ducky!) inheritors of wealth.

This is not a revolutionary manifesto, just a statement of principle about the point of our economic and political community. As we look ahead to the imperatives of global warming and other dire environmental issues, some observers advocate "degrowth". But this is misguided. Getting off fossil fuels doesn't mean we have to wear hairshirts and eat soy wafers. It doesn't mean that everyone can't or shouldn't be employed doing useful things for each other. The means of economic organization and the distribution of its rewards are separate issues from what it is that we have to distribute.


  • Keynes on the future mix of consumption, leisure, investment. At any rate, involuntary unemployment is the worst possible, and unnecessary, outcome.
  • But it is perfectly fine with corporations: Krugman.
  • And underemployment is the norm in capitalism.
  • Robert Reich on inequality.
  • Carbon needs to stay in the ground... and would need to be written off.
  • Wealth makes us into jerks. Yeah, we built it!
  • Perspectives on the evolution of capitalism- the constant battle between regulatory stabilization and financial innovation, aka destabilization. Such as under surrender monkey Alan Greenspan.
  • For all of the UK's austerity madness, it still has better prospects than the Euro union.
  • Bernanke: good or bad?
  • On the reason for the season.
  • Economic quote of the week, from Bill Mitchell:
"The US economy has stacks of idle capacity so increasing net public spending will bring real resources back into productive use rather than straining the price level. That doesn’t mean I support leaving the tax structure as it is. But I would consider that question quite differently from the aims of improving the fortunes of the poor. 
Further, I would also declare most speculative financial activity to be illegal (given it is unproductive and destabilising) and that, alone, would put a major dent in the incomes of the uber-rich in the US and force them to get a real job."

Saturday, December 21, 2013

A robot conundrum

Can we use them, once they are usable?

The trajectory of future technology is pretty clear. We already talk to computers, and they talk back. Soon they will be driving us around on the open roads. Where will it all end? What I want a robot for is to do my laundry and dust the house. What will it take to get there? Honestly, I think it will take consciousness.

Robots will eventually do all kinds of jobs. But to do general tasks like health care and home maintenance, how smart will they have to be? Very smart. And this leads to a conundrum- if robots are smart enough to fix up around the house, hang the Christmas lights, mow the yard, and do all things we don't want to do, won't they have the consciousness that gives them rights against being callously exploited to do just those jobs?

A great deal of intelligence would be required to be a general helper robot. We consider chimpanzees highly intelligent, and worthy of extensive protections of a humane character. But they are not nearly intelligent enough to be a handy helper around the house. Quite the opposite, in fact. Of course they were never designed for such a role, and in contrast naturally share much more of our emotional makeup, which leads to our mutual empathy. But still, it is hard to imagine that the requisite intelligence could happen without some modest dose of emotion and an all-around empathy-engendering sense of self, of purpose, of social facility- in short, human-ness.

A robot-heavy future is portrayed in Isaac Asimov's The Naked Sun, whose planet Solaria has a tiny population of humans, served by countless robots, affording all possible luxury. Robots run all factories, including those making more robots, keep house (i.e. mansions), attend all needs, serve as police, and raise children, all under the most tenuous supervision. It is a little far-fetched to imagine such possibilities without also considering that these robots have a consciousness that reaches as far as that of their masters, comprehending the workings of the world they are in, including their own role in it.

Would they have emotions regarding self-worth, self-determination, freedom? One would imagine that this comes with the territory of consciousness. Such beings must look out for their own interests to some degree, to be able to function. They need to experience pain or some analog in order to avoid damage. They need to function socially, interpreting the unavoidably complex and conflicting needs of humans in order to serve them. Jeeves comes to mind. They may not need to be quite as greedy, competitive, and self-centered as humans are, but some of such emotion is required for independent and useful agency.

Also, at a certain high level of functioning, we may not be able to tell very clearly what such beings feel inside, subjectively. Even if we think they are programmed to feel no social pain at being a servile class, or annoyance at being ordered about by, say, an ignorant or even mischievous child, their complexity at that level is likely to make it impossible to know for sure. Many complicated computer systems are already well beyond anyone's full understanding, leading to the many software contracting fiascos in the news.

Ironically, Asimov's Solaria is a sort of hell where the humans neurotically isolate themselves from each other and tend to lose their capabilities and interests- a common syndrome of the overly well-off. So in such a future, it's no good, either for the robots or for us.

  • Krugman on the progressive program, rolling back the feudal class war.
  • Guess which debtors get the shaft, and which creditors get first dibs?
  • Fifth graders must not think for themselves!
  • Will CEOs and other corporate officers presiding over crime be prosecuted?
  • Gun nuts deserve intensive Freudian therapy.
  • Religion is not good for politics, nor is politics good for religion.
  • Yes, she really was a welfare queen, and so much more.
  • The US biomedical research establishment is unsustainable and cruel to trainees. STEM shortages are, incidentally, a "myth".
  • Annals of feudalism- temp workers.
  • Economics graph of the week, from Martin Wolf, Financial Times. Where does all that superior US productivity go? Not to ordinary people, let alone to public services.

Saturday, December 14, 2013

Alta California, filibusters, and the exceptional nation

A little California history. Mentioning General Vallejo, by Alan Rosenus, and 75 Years in California, by William Heath Davis.

Is the US exceptional, and exceptionally good? Are we the exceptional nation just because we have the biggest navy, or for some more positive attribute? Are we generous, or greedy? Do we confer democracy and good government on other nations and stand as a beacon of hope to the downtrodden, or do we confer kleptocracies and rob the downtrodden of the little mite they have through pernicious trade deals and relentless consumerism?

We have shown many faces to the world over the years, and the recent JFK assassination anniversary was a chance to reflect on some of them. Oswald was apparently incited to some degree by the (true) stories he had heard in Mexico of the plots carried out by the US to assassinate Fidel Castro of Cuba, over and above the Bay of Pigs invasion. Our history in Latin America generally is a rather uninspiring one of rampant meddling and empowerment of the worst elements available. If one looks up the term "filibuster" on Wikipedia, one is met with a cavalcade of such instances of "manifest destiny", where Americans tried, with more or less success, to take over various Latin states, which must have seemed ripe for the picking, in an imperialist kind of way. For filibustering is unlawful predation, hostage taking, free-booting, meddling, etc. in another country, only later becoming that parliamentary gridlocking device.

The history of California is a fine example of this tradition. I have been reading two books: "General Vallejo", an excellent biography by Alan Rosenus, and "Seventy-Five Years in California", a beautifully written and very detailed memoir by William Heath Davis, an early merchant.

Spain set up a trail of missions up the California coast starting in 1769, enslaving the native Indians with the Catholic church's one-way ticket to heaven. The missions functioned as ranchos where the padres were in charge, each with a small military detachment to maintain control and a vast flock of "conversos" to do the work- who, incidentally, died like flies from the treatment, the novel diseases, the diet, etc. Mexico revolted from Spain in 1821, and the departments of Baja California and Alta California came under Mexican control; the missions were divided up and granted, typically, to former military officers. Such grants gradually encroached inland, past the coastal areas to which the missions were originally confined. General Mariano Vallejo, who commanded the Presidio at San Francisco in 1833, among other posts, was granted large ranchos in the Sonoma area, north of the bay.

The rancheros slaughtered a portion of their stock each fall for hides and fat alone, leaving the rest of the carcasses, which attracted bears, which in turn gave rise to the excitement of roping and killing bears. California now has no grizzly bears, and maybe 30,000 black bears.

Mexico's hold over California was remarkably tenuous. Its own post-revolutionary government was tumultuous and unstable in the extreme, so its capacity to pay for, or pay attention to, the far-away province of Alta California was meagre. Mexicans looked down on their rustic Northern brethren, who used their enormous ranchos to run thousands of cattle and horses, their hides and tallow being pretty much the sole export of the province in the pre-US period, along with the furs of wild animals such as otters. The rancheros carried on the padres' practice of enslaving the native Americans, paying them solely in clothes and food, which was sometimes served from common troughs.

Indeed, it was a close-run thing whether California was going to side with the South or the North in the brewing Civil War. However, the predominant cultural influence from the US came from Boston, whose merchants (including William Davis) had traded up the coast since the Mexican accession (Richard Dana's Two Years Before the Mast is another great book in this historical literature), and married into the Californio social system.

"The native Californians [Californios, not Indians] were about the happiest and most contented people I ever saw, as also were the early foreigners who settled among them and intermarried with them, adopted their habits and customs, adn became, as it were, a part of themselves." - William Heath Davis, 75 years in California.

The exception was Sutter's fort. John Sutter was a Swiss/German adventurer and ne'er-do-well who, after various failures around the world, arrived in California (1839) with a small German entourage and enough charm to buy up Fort Ross, the Russian outpost north of San Francisco, which was shutting down for lack of otters, which they had hunted to extinction. Sutter promised payment in goods (to be sent to Sitka, the remaining Russian outpost), to be raised on his land grant near what is now Sacramento, also obtained with a good bit of charm from the Mexican authorities. While far from the coast, Sutter's fort (equipped with the materiel from Fort Ross) was still on navigable waters (the American and Sacramento rivers), and strategically placed at the foothills of the Sierras to intercept immigrants coming overland from the East. It soon became a hotbed of Americans and pro-American sentiment.
"Having accomplished my purpose of landing Captain Sutter at the junction of the American and Sacramento rivers with his men and his freight, the following morning we left him there, and headed the two vessels for Yerba Buena [now San Francisco]. As we moved away Captain Sutter gave us a parting salute of nine guns- the first ever fired at that place- which produced a most remarkable effect. As the heavy report of the guns and the echoes dies away, the camp fo the little party was surrounded by hundreds of the Indians, who were excited and astonished at the unusual sound. A large number of deer, elk, and other animals on the plains were startled, running to and fro, stoping to listen, their heads raised, full of curiosity and wonder, seeming attracted and fascinated to the spot, while from the interior of the adjacent wood the howls of wolves and coyotes filled the air, and immense flocks of water fowl flew wildly about over the camp. 
Standing on the deck of the 'Isabel' I witnessed this remarkable sight, which filled me with astonishment and admiration, and made an indelible impression on my mind. This salute was the first echo of civilization in the primitive wilderness so soon to become populated, and developed into a great agricultural and commercial center."

Enter John C. Fremont, Major in the US army, whose assignment was to find the source of the Arkansas river. While the US was heading to war with Mexico over Texas, government policy at the time was to be as nice as possible to the Californians and not give any cause for grievance. But greed and glory were overwhelming temptations, and Fremont, who was evidently a persuasive and charismatic figure, led his troop of some 50 soldiers on surveys through the West, into Oregon, and down into California. There he began agitating for a takeover of California, under what seems to have been a general sense of imperialism, manifest destiny, ambition, greed, etc., and perhaps competition with the other imperial powers of England and France. At first he kept the US out of it by not using his own soldiers, instead inciting a rabble of malcontents around Sutter's fort to start the proceedings.

Led by William Ide and the stuttering Ezekiel Merritt, this posse descended on General Vallejo's ranch in Sonoma in June, 1846 and took him prisoner, back to Sutter's fort. As Vallejo was the leading figure of Northern California at the time, this essentially decapitated local resistance, in case any was contemplated, which it was not. The Californios had had several revolutions against their governors from Mexico, and other political disagreements, but never was blood shed or manners forgotten. In contrast, Vallejo and several other prisoners were treated poorly, losing a great deal of weight, and the Anglo rabble stole countless horses and other livestock throughout the area. Along the way, they proclaimed a somewhat comical "California Republic", complete with a flag whose mascot was mocked as looking more like a pig than a bear. Fremont took increasing control, and on a foray out to Marin county, ordered three Californios captured in San Rafael to be shot in cold blood.

The original "bear" flag of the bear flag revolt.
It all created a great deal of bad blood between the Anglos and the Californios, and was completely unnecessary, as the direction of the political winds had long been clear. Leading Californios, especially Vallejo, were pro-American, favored development and competent government for the state, and preferred the nearby republican power to a European imperial monarchy such as England or France. Indeed, U.S. Commodore Thomas Jones had captured the capital of Alta California, Monterey, in 1842, holding it for a day before, amid a flurry of apologies, lowering the US flag once again when it was made clear that his belief that war had been declared between the two countries was in error. Not a shot had been fired, let alone a drop of blood spilled.

As it happened, while the Bear Flag revolt was developing, war had indeed broken out between the US and Mexico over the Texas territory. US Commodore John Sloat pulled into Monterey on July 7, 1846 and this time proclaimed California a US possession for good, and without any trouble. Upon meeting Fremont, he chewed him out for his filibustering, against orders. Eventually Fremont was court-martialed for his various departures from orders and policy, but was let off the hook through his political connections and returned to service in the Civil War, only to be dismissed again by President Lincoln for corruption and insubordination. Vallejo, for his part, was eventually impoverished through a combination of bad business decisions, excess generosity, chicanery by Anglo partners, and finally a callous decision by the Supreme Court against some of his land claims.

The adventures and meddling of the sort that Fremont engaged in are by no means isolated in US history, under either official or unofficial auspices. The exceptional nation, with manifest destiny, muscular Christianity, a white man's burden, family values, and occupying a shining city on a hill, can do whatever it takes to remake the world in our image, which is naturally the best image imaginable.

Surely we have very beneficial things to offer others. But looking back, we have also run brutally roughshod over so much in the drive to conquer- natural, native, and foreign- that some grief and humility are also called for.