Saturday, January 25, 2014

Surveillance, politeness, and privacy

Is the NSA killing us or protecting us?

Surveillance as a general social principle. We are always watching each other; it is the primordial way of being in society. In the old days, gossip was the principal method of leveraging surveillance into social power and enforcement. Now we happily surveil each other via facebook, twitter, and google earth, and leave comments. The issue in our new surveillance environment is not the existence of surveillance per se, but its asymmetry and invasiveness. Do we know who is watching, what they are watching, and when they are watching? Are they harming us? Can we turn it off?

Traditionally, social surveillance is always mutual. You see me at the same time I see you- having a meal together, talking, hunting. The power of this mutual observation and interaction is immense, policing our behavior so as to enforce "normal" standards, alert for any deviation, political or moral lapse, for novel signals of fashion, disease, innovation, threat, etc. Religion is its purest expression- including extensive, in-depth thought policing.

Some people stand up well to all this observation, some don't. The pervasive social pressure has enormous effects on our mental health, causing depression, suicide, peer pressure, status anxiety, etc., and is one of the great, if not the greatest, motive forces of politics and social life in general. One point of etiquette is to relieve people of this anxiety, leaving their private affairs politely out of the conversation, even as the observation goes silently on. The essence of privacy is not that we are not observed, but that we are not held to account or bullied about it beyond endurance.

The totalitarian societies were a sort of reversion back to the small-town mode of intense surveillance, with a total invasion of privacy and violation of civilized etiquette in the bargain, using all this information against people at their most vulnerable points. But in large societies we have adapted to a much looser model of toleration & privacy, where, due to sheer numbers and density, more observation and more diversity must be accommodated than humans are typically comfortable with. So we keep a small community of close relationships and mutual close surveillance, amid a large cloud of anonymous and little-noticed passers-by.

Big data has changed all this, bringing the intimacy of small-town surveillance, where the local store clerk, for instance, knew what everyone bought, to the global stage. Some embrace the facebook-i-zation of personal surveillance. The question is mostly whether we can turn off portions of this surveillance that we do not like, or which we collectively deem asymmetrically unfair and invasive, or corrupt and incompetent. For instance, our credit cards give faceless corporations entree to all our purchases, which they diligently mine for scraps of sales leads and sell off to their "partners". It is a seamy, disreputable business, and not at all voluntary.

If they had reasons of state, and a secret court looking over their shoulders, I would be far more amenable. But they don't. Credit cards are not an optional institution in today's world, so this surveillance is essentially involuntary, and extremely asymmetric. Its typical results, however, are modestly annoying, rather than invasive or life-threatening, so the cost has to date been borne without too much complaint. And the monitoring of all our web comings and goings.. well, it is not far from George Orwell's Telescreens of 1984, which monitor everyone with an unblinking eye.

What to do? The NSA portion of this is relatively inconsequential, really. The average person's degree of invasion from their practices is essentially nil, though surely mistakes have happened and caused great individual harm. The government's no-fly list is an example of a relatively more open program plagued with error and invasiveness.

But the flood of other personal data rushing into corporate and other unknown hands is far more serious. The Target incident, where tens of millions of accounts were stolen, the ongoing traffic in social security numbers, identity theft, false tax claims, endless spam, and targeted come-ons, etc.. all point to a "system" in crisis. We have let our virtual selves accumulate ever more important data without vetting anything, or building any serious legal structure. Sure, the companies in question have a stake in customer faith and thus in their own prudence & etiquette. But their transparency is nonexistent and failures are clearly frequent. We have no idea, and even they have little idea, what has been stolen or bartered away into the nether worlds of cybercrime.

Even biometrics hold out little hope. A fingerprint or iris scan can be forged, as can any other piece of data. We are trapped in a data whirlwind, where it is only ourselves, in person and with competent memories, that can completely attest to identity. So we are back to the personal, one-to-one world of rich and personal information that we began with.

I don't think it is enough to hark back to the privacy provisions of the constitution and take an absolutist position that divides harsh restrictions on government surveillance from a wild-west atmosphere in the private realm, papered over with the concept of so-called "voluntary" participation. We need new law in this new realm, to enforce competence of information collection and safe-guarding on all entities that collect big data, (with business-ending penalties for flagrant breaches), and to match its social effects and invasiveness with public oversight.


  • Drone war- the surveillance that kills.
  • Is scandal and blackmail the currency of NSA surveillance? That is not at all clear.
  • Intensive spying by big Pharma.
  • The $1000 genome is upon us.
  • Why are we stuck in a Reagan-era-rut in policy towards Latin America?
  • Long hours.. are not productive if you are creative and/or competent.
  • In Afghanistan, ".. the security situation throughout the country has gotten worse, not better, since the 2009 election."
  • Martin Luther King and the job guarantee.
  • A better union model, from Germany.
  • Buñuel does the conference call.
  • Generate your own scholarly, postmodern web page.
  • The expert's humorous guide to science fiction.
  • Brain asymmetry- just the facts, please.
  • As usual, companies can break the law, and contribute to the rot of public services.
  • Europe's youth- twisting in the wind. Even on film.
  • Martin Wolf: The megabanks are still too big to fail. Bigger and bail-i-er than ever, actually. In his review of Bernanke's tenure, he misses one critical failure- the failure to explain clearly to congress that withdrawing continued fiscal support was criminal. Monetary policy cannot replace straight spending, and has not.
  • Economic cartoon of the week, Toles on trickle-down, Keynes, and the GOP's plans for the poor.

Saturday, January 18, 2014

The problem with positivism

"Positivism states that all authentic knowledge allows verification and that all authentic knowledge assumes that the only valid knowledge is scientific."

What is truth? A simpler, and more frequently used, word could hardly be imagined, but philosophers differ over it, probably because of sentimental attachments to beliefs that may not be true. In the hands of theologians, idealists, and artists, truth often stands for "something I believe". If a novel stirs a deep emotion, it is true, even while it is false. If an artwork reflects and expresses a facet of the human condition in a surprising or powerful way, it is true. And if a belief in a deity is beautiful, socially bonding, and morally edifying, it is also true. At least one athlete is the truth.

This definitional issue remains quite confusing and misleading. The subjective uses of "truth" have little to do with canonical correspondence truth, (i.e. the agreement of thought with reality), in that what the feeling of truth corresponds to is another feeling it agrees with, not a condition of the outside world. Subjective states surely deserve recognition of their existence and texture. But truth may not be the best word to describe them.

In contrast, science and the law take a more blinkered view. If something is true, it actually happened, or is part of the real world, verified by observation and continually available for re-observation and / or other forms of close analysis. While the sciences are edging into regions traditionally part of the humanities, they still regard truth as objective, and separate from personal state, wishes, ideology, etc. The DNA reads one way, and not another. The defendant was at the scene of the crime, or not. Evidence may not exist, and the truth may not be known, but that does not impair the idea of truth- its definition and possibility.

In this regard, our minds are truth engines, working very hard to model reality with accuracy. Eyesight is the most dramatic example, bringing us incredibly rich and accurate scenes with no apparent effort. But on more abstract levels too, we are constantly trying to figure things out, particularly other people, the object of so much of our intuitive acuity. But there are limits.. we have no intuitive grasp of physics at very large or small scales, nor is our introspection particularly effective. The self is a black box that we struggle our whole lives to understand.

And one tool of all this modeling is imagination, which both consciously and unconsciously conjures all sorts of worlds and images, sometimes as hypotheses to be pursued, sometimes as warnings to be avoided. Unfortunately, (or perhaps fortunately), the line between sober analysis and imagination is not all that clear, leading to the establishment of the scientific method as a general and organized way for communities of people to figure out the difference, in fields where real truth is at least conceivable.

This was the hope of the positivists: to put all knowledge on this same footing, by setting verificationist, empirical standards for knowledge and truth, and keeping all else outside the door. They tried to define everything else as "nonsense", or as not meaningful. But unfortunately, most of human experience happens in far more nebulous realms of subjective experience, vague judgements, and hopeful propositions. Which are often very highly meaningful indeed. So this re-definitional part of the project was as futile as it was repugnant.

For instance, not even the most airy metaphysical questions are entirely meaningless, contrary to one of the propositions of positivism. Rather, their resolution, after thousands of years of speculation, does not typically lie with the speculators. Philosophers provide the service of keeping some of these questions alive, at least in the academy, and of trying out various intuitive solutions to them. But the remaining problems of philosophy are clearly ones where both data and intuition are lacking. Whether data ever arrives is the main question. Whether intuition will ever resolve them is much less of a question.

More technically, the word positivism signifies positive proof, and by various skeptical arguments, (such as Hume's and the problem of induction generally), and by historical experience, it is clear that proof (i.e. verificationism) is a mirage in science, not to mention other fields. The most that can be hoped for is a provisional model of reality that doesn't violate too many observations- a coherentist model of truth.
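To make the contrast concrete, here is a toy sketch (my own illustration, nothing the positivists themselves proposed) of this provisional notion of truth: Bayesian updating, where each observation nudges a degree of belief toward certainty without ever delivering positive proof.

```python
# Toy Bayesian updating: belief is provisional, never "proven".
# All numbers here are made up for illustration.

def update(prior, p_obs_if_true, p_obs_if_false):
    """One application of Bayes' rule: P(H | observation)."""
    evidence = prior * p_obs_if_true + (1 - prior) * p_obs_if_false
    return prior * p_obs_if_true / evidence

belief = 0.5  # start agnostic about hypothesis H
# Ten observations, each three times likelier if H is true.
for _ in range(10):
    belief = update(belief, p_obs_if_true=0.6, p_obs_if_false=0.2)
    print(round(belief, 4))
# The posterior climbs toward 1 but never reaches it: a model of
# reality that merely hasn't violated too many observations yet.
```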

So Karl Popper, for instance, who was altogether sympathetic to positivism, came out with his falsificationist principle, in opposition to the verificationist principle of positivism- becoming formally an anti-positivist, or at least a post-positivist. But even falsificationism is too stringent, since a contradictory observation can as easily be erroneous as damning. Judgement and interpretation are always called for, on the appropriate level of analysis.

A positivist temple, with Auguste Comte out front.
My take on all this is that positivism was overly ambitious. The point can be well-taken without setting up a new altar to absolute truth. All truth is, on our level, probabilistic, and exists on a spectrum from the precise and well-attested to the hearsay and ludicrous. That is what the contemporary Bayesian revolution in statistics and science generally is getting at, and what was lost in the positivists' rather extreme, utopian project, over which they were bickered out of existence. Far larger lies and absurdities, however, were (and are) rampant in the field of philosophy than the shades of truth-i-ness found in the scientific literature or the history of science. To wit, a quote from Nietzsche:
"The other idiosyncrasy of philosophers is no less dangerous; it consists in confusing the last and the first things. They place that which makes its appearance last ... the 'highest concept', that is to say, the most general, the emptiest, the last cloudy streak of evaporating reality, at the beginning as the beginning. This again is only their manner of expressing their veneration: the highest thing must not have grown out of the lowest, it must not have grown at all ... thus they attain to their stupendous concept 'God'. The last, most attenuated and emptiest thing is postulated as the first thing, as the absolute cause, as 'ens realissimum'. Fancy humanity having to take the brain diseases of morbid cobweb spinners seriously! - And it has paid dearly for having done so."
-Quoted by Max Horkheimer, in Eclipse of Reason.

  • Some atheist basics.
  • Big surprise- conformists tend to go to church. Where their children are taught...
  • Superior vaccine delivery and activation.
  • Full review of the Robert Gates memoir.
  • Reflections on a past basic income and job guarantee scheme.
  • How discrimination works. And the key importance of learning on the job.
  • Europe's elites are screwing up again. Though they are hardly alone.
  • To Bill O'Reilly, a 40% pay increase is "not a big deal".
  • Born to not run... subpoenas will be flying.
  • Evil, climate change, and collective action.
  • Robots, jobs, and the second machine age. But the problem is not technological, it is economic and political.
  • This week in the Wall $treet Journal, on how the FCC should let CEOs run the internet: "... the FCC should drop its pursuit of net-neutrality rules altogether.... Next, the FCC should unequivocally restate its commitment to the multi-stakeholder model of resolving network-management challenges and Internet governance."
  • Economic graph of the week; we are bumping along at bottom, in terms of overall employment:

Saturday, January 11, 2014

Sympathetic vibrations: speech waves and brain waves

Brain waves sync up with perceived speech, pointing to possible functions.

What do brain waves do? They are a prominent feature of live, working brains, and change markedly under different conditions, especially sleep and epilepsy. They seem like a natural analog to the CPU clocking that is so essential in artificial computers, but are clearly more chaotic, slower, and diverse. They seem to make up a moderately important dimension of brain processing, combining with the other, more fundamental dimensions of anatomical organization and electrical/chemical pathway conduction to make up brain activity.

A recent paper makes the comment that.. "A large number of invasive and non-invasive neurophysiological studies provide converging evidence that cortical oscillations play an important role in gating information flow in the human brain, thereby supporting a variety of cognitive processes including attention, working memory, and decision-making."

So what does "gating" mean? That is a bit hard to say. In artifical computers, the clock cycle is essential to quantize the computations so that each transistor and each computation is given a chance to do its thing in a defined time, then rests so that other elements can catch up to it, keeping the whole computational process in logical register. Brains may need a similar service, but clearly it is far messier, since individual neurons take orders from no one- they seem to fire almost chaotically. While rhythmicity is a property of individual neurons, brain waves (aka cortical or electrical oscillations) are very much a mass phenomenon, only biassing the behavior of individual neurons, not ruling them outright.

An attractive place to look for their function is in auditory cognition, especially speech recognition, since both speech and brain oscillations are hierarchical mixes of related rhythms, though the range of sound frequencies is substantially wider (~30 Hz to ~15,000 Hz) than the range of electrical brain oscillations (a few Hz to maybe 150 Hz). Maybe they map to each other in some discernible way? As the authors state:
"The similarity in the hierarchical organisation of cortical oscillations and the rhythmic components of speech suggests that cortical oscillations at different frequencies might sample auditory speech input at different rates. Cortical oscillations could therefore represent an ideal medium for multiplexed segmentation and coding of speech. The hierarchical coupling of oscillations (with fast oscillations nested in slow oscillations) could be used to multiplex complementary information over multiple time scales for example by separately encoding fast (e.g., phonemic) and slower (e.g., syllabic) information and their temporal relationships."

Basically, the authors had subjects (22 of them) listen to about seven minutes of speech, played either forward or backward, and at the same time used magnetoencephalography (MEG), i.e. a ginormous machine that detects slight magnetic fields emanating from the head, to track superficial brain waves. MEG is somewhat more sensitive than EEG, which is done with electrodes pasted onto the head. Then they fed both data streams into a correlating procedure (below), and looked for locations where the two oscillations were related.

Procedure of analysis- each waveform stream was deconstructed and correlated, to find locations in the brain where electromagnetic surface waves reflect speech waves.
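In spirit, the correlation step can be approximated with off-the-shelf signal processing. Here is a hedged sketch using synthetic stand-ins for the speech envelope and an MEG channel (the authors' actual pipeline is considerably more elaborate): compute the spectral coherence between the two signals and look for a peak at the syllable rate.

```python
import numpy as np
from scipy.signal import coherence

fs = 200.0                       # sample rate, Hz
t = np.arange(0, 60, 1 / fs)     # one minute of synthetic data

# Stand-in "speech envelope": a noisy 5 Hz, syllable-rate rhythm.
speech_env = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)

# Stand-in "MEG channel": partially entrained to the envelope.
meg = 0.6 * np.sin(2 * np.pi * 5 * t + 0.3) + np.random.randn(t.size)

f, Cxy = coherence(speech_env, meg, fs=fs, nperseg=1024)

# A coherence peak near 5 Hz would indicate theta-band entrainment.
band = (f > 3) & (f < 8)
print("peak coherence in 3-8 Hz band: %.2f at %.1f Hz"
      % (Cxy[band].max(), f[band][Cxy[band].argmax()]))
```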

They found several instances of correlation. Two were in the low frequency (1-2, 4-8 Hz) delta and theta rhythms, which directly entrain with the speech rhythm. Two more were in the 20 and 50 Hz range, where the amplitude of these gamma rhythms correlated with the phase of the lower frequency speech rhythms, a somewhat indirect correlation. The locations of these brain wave correlations were naturally over the auditory and speech centers of the brain:

Location of brain waves, of various frequency bands, that correlated with speech patterns. This is a map of significant results, mapped to each hemisphere. Note significant differences between the hemispheres (right hemisphere shown on the right).

"In sum, this comprehensive analysis revealed two distinct speech tracking mechanisms in the brain. First, low-frequency speech modulations entrain (that is, align the phase of) delta and theta oscillations in the auditory cortex. Second, low-frequency speech modulations also entrain the amplitude dynamics of gamma oscillations."


Speech trace (A) shown with a superimposed dotted line (cosine) of the theta brain wave of the listener. In B, the brain is shown, with locations of 3-7 Hz entrainment labeled in red, specifically entrainment that differed significantly between the forward and backward speech renditions. C shows the overall cross-correlation data for both hemispheres, with signals at 20 and 48 Hz in at least one hemisphere. This tracked not overall speech, but the correlation with speech starts and stops, showing close phase tracking.

The phase entrainment shifted position when successive speech elements (stops/starts for sentences and words) arrived, showing that the system tracks the input quite carefully.

Most intriguingly, the authors found that backward speech was significantly less correlated with brain waves than forward speech. This indicates some top-down control, where intelligibility of the speech stream is broadcast back to lower levels of the auditory processing apparatus to fine-tune expectations of the next event, via stronger rhythmic alignment.

They also found differences between the hemispheres, with the low-frequency correlations stronger in the right hemisphere, and the gamma-wave correlations stronger in the left, which contains the primary language areas in most people (such as Broca's and Wernicke's areas).

"Our study supports emerging models of speech perception that emphasise the role of brain oscillations. Hierarchically organised brain oscillations may sample continuous speech input at rates of prominent speech rhythms (prosody, syllables, phonemes) and represent a first step in converting a continuous auditory stream to meaningful internal representations."

One can imagine that brain waves assist processing in several ways. When unified over large areas of the brain, they might enforce regimented processing, (i.e. transfer of neuronal signals from one cell / layer / module to the next, in ways that constitute signal processing from raw to more abstract representations), which could make it more efficient and also better able to affect other areas of the brain, such as consciousness. In auditory processing, the advantage in lining up processing with the actual signal should be clear enough. They could also reduce chatter in the system, which seems universal in other brain studies. Do they "carry" signals themselves? Not really, just as the computer clock cycle doesn't tell us what the computer happens to be doing, but facilitates the detailed processing flowing through its innumerable wires and junctions.


  • A better review of the same paper.
  • Test your hearing.
  • Religion, tribalism, hate, love, etc. etc...
  • But some still insist upon religion. And "definitively refute" atheism. And finish up with C. S. Lewis. Hmmm. 
  • The Onion refutes it a little better.
  • And becoming an atheist.. not so easy.
  • Economic wreckers and capitalist running dogs in our midst.
  • Turns out, Republicans do favor redistribution, after all.
  • Managing the job guarantee.
  • 4K TVs work wonders as monitors.
  • The India diplomatic row is an example of why worker protections and minimum wage protections are so important... the system worked.
  • Satanists.. performing a public service.
  • Yes, he is a bully.
  • Inheritance is increasingly significant, so death taxes are more important than ever.
  • Economists have no idea what they are doing.
  • Economic graph of the week, on unemployment.

Saturday, January 4, 2014

An American Marco Polo: Josiah Harlan

Quaker, ruler of Gujrat in the northern Punjab, General of Afghanistan, all-around schemer and adventurer.

The adventures of Marco Polo (1254-1324) are famous, mostly because they were so well recorded. He followed the briefly open silk road during the heyday of Kublai Khan, travelling all over the far East, and ruling briefly in the Khan's service in China. But when he returned to Venice, he was overtaken by the vortex of local politics, and was co-imprisoned with a gifted writer who helped put his extraordinary, yet quite accurate, tales into clear and compelling prose. Tales that came to be disbelieved after the silk road closed up again with the dissolution of the Mongol empire.

Unfortunately, Josiah Harlan (1799-1871) had no practiced ghost-writer, and was so politically vociferous in his anti-imperial writings that his lengthy memoir never heard the clang of a printing press. Nevertheless, his story has obvious parallels with Polo's, and contains interesting lessons for our own brushes with imperialism.

The book is "The man who would be King: the first American in Afghanistan", by Ben MacIntyre. Harlan was born into a mecantile family, for whom he shipped out to Canton and points east as "supercargo", or manager and sales agent for a ship's mechandise. Hearing from afar that his recent fiancé had married another, he decided to never come back, and gave himself up to what he seems to have wanted to do anyhow, which was follow a life of adventure in the East, following the trails of Alexander the Great, the British Imperialists, etc. It is interesting to note that while most venturesome energy in the US was directed Westward, Harlan had been bitten, via brother Richard and lengthy immersion in Greek and Roman history, with the bug of the old world and its more exotic precincts.

Eventually, he hired on with the British East India Company as a doctor, for which he had no expertise whatsoever, and gained familiarity with India and its frontiers. But the aim he eventually formulated was to become a ruler somewhere, preferably Afghanistan, whose ever-volatile political system seemed ripe for just his kind of energy and interloping adventure. So he started playing politics, offering his services to those out of power (an exiled former king of Afghanistan) to scheme against those in power. (Cut to a long march into, then out of, Afghanistan... and a decade-long interlude in the service of a Punjabi maharaja, eventually governing one of his districts.)

Over time, he finally gained entrance to the inner circle of Afghanistan's rulers, and his appreciation for their merits increased markedly, causing him to switch sides from the exiled ruler. Unfortunately, just after Harlan was appointed general by the Afghan ruler Dost Mohammed Khan and conducted a remarkable and immensely arduous expedition north to parley with and / or defeat the various Uzbek and Hazara chiefs around Mazar-e Sharif, the British decided they wanted to rule Afghanistan. How dare they?!

As is by now well known, the British army marched into Afghanistan in vast force, easily defeated the locals, and settled into what they thought was another India-style posting, with polo and partying. But not for long... these locals were not obsequious farmers and caste-ridden hierarchs, amenable to foreign rule. No, the Afghans are freedom-loving, highly martial, fissiparous, and blessed with a religion that prizes power and warfare, and with a mountainous country ideal for guerilla warfare. Only a single Englishman escaped alive.

The British had also placed their bets on Harlan's previous employer- the exiled king Shah Shujah, who was in every way a bad bet as their puppet: cruel, out-of-touch, and incompetent. Harlan astonished the British with his very existence and high position, and during their occupation, argued feverishly for better administration:

"I have seen this country, sacred to the harmony of hallowed solitude, desecrated by the rude intrusion of senseless stranger boots, vile in habits, infamous in vulgar tastes, the prompt and apathetic intruments of master minds, callous leaders in the sangiunary march of heeless conquests, who crushed the feeble heart and hushed the merry voice of mirth, hilarity, and joy." 
"To subdue and crush the masses of a nation by military force, when all are unanimous in the determination to be free, is to attempt the imprisonment of a whole people: all such projects must be temporary and transient, and terminate in a catastrophe that force has ever to dread from vigorous, ardent, concentrated vengeance of a nation outraged, oppressed, and insulted, and desperate with the blind fury of a determined and unanimous will."

In short, he urged the British to buy off the major tribes with plenty of bribes, and include them in the government. Harlan ended up making his way back to the US and retired to a farm, where he kept scheming: to establish camels in the US military, to transplant Afghan grapes, and to write vast books. He raised a regiment for the Civil War, and died lonely and destitute in that haven of adventurers, San Francisco. It is a remarkable biography, under-appreciated in American history.

How are we doing in the present day? We are bribing the Afghans copiously.. check. We have a ruler in Hamid Karzai who is not incompetent or excessively cruel, but isn't exactly an historic statesman, either. Check. Will he be able to peaceably retire to his fruit orchards in Afghanistan when his term is up and the US continues to melt away? When the foreign money dries up? Our program for Afghanistan requires some deep cultural change, in that elections are supposed to determine who holds power, and merit who occupies the civil service. But the culture has never been democratic, rather thoroughly aristocratic, with patronage / clientage the vital transmission mechanism. The heads of families and tribes are the only people whose votes count, competing endlessly among each other for position. Can the two systems merge into a working state?

The US experiment has gone longer and better than the Russian, let alone the British, occupations. But whether it sticks in a final, cultural sense, is impossible to tell, and on that everything hangs.


  • Kansas: infra-red Aynrandistan?
  • A libertarian rethink.
  • Do all the wrong people admit being wrong?
  • More on the middle class and inequality.
  • Ella in some serious scat. And with Mel Tormé.
  • State of finance, 2014.
  • Big data + free market + corporate oligopoly + no more privacy = another disaster.
  • Are unions the answer to the disappearing middle class?
  • This week in the Wall Street Journal: "In a republic, if majorities can change laws or rules however they please, you're on the road to life with no rules and no laws."
  • Again, money is a far greater danger to the Republic than snooping as it is currently done, despite the year of Snowden, etc.
  • Economics graph of the week. Whose money is pegged to whom?
Countries pegged more or less to either the dollar (green) or the Euro (blue).