Saturday, March 30, 2013

Making the web pay, one cent at a time

The internet has killed arts & media funding, just as we need more of it. What to do?

What do we want an economy for? Isn't it to give us more of what we want, and less of what we don't? But somehow through the last few decades, our collective aims have devolved into providing the financial gambling industry more money and keeping the poor down. It doesn't have to be that way.

We are a very rich country with lots of unemployed people. More and more of our basic needs are filled ever more efficiently. Wouldn't you think that arts and other forms of culture would be a bigger part of our lives than they currently are? Yes, there are industries of film, TV, and music, but they are being hammered by the disappearance of their gatekeeper functions, replaced by the wide-open, share-everything internet.

One would think that one form of employment we could all agree on is the performing, teaching, and propagation of music and other arts- some of the most positive experiences possible. But ironically, just as digital technology made spreading music and visual art (and recipes, and cranky opinions) easier than ever, the same process has rendered it economically perilous. When there is no gatekeeper, no restrictions, no scarcity, there is no income, by the typical business model. The same applies to the news media, likewise being destroyed by free information.

Music hasn't become less culturally important or desirable, but it has become markedly less profitable. The only franchise really left is live performance, which has undeniable scarcity.

Other countries make much more generous government-sponsored provision for the arts. Yet in the US, the measly amount sent to the National Endowment for the Arts and the public broadcasters is perpetually under threat from what Bobby Jindal calls "the stupid party". Certainly one option is to expand those avenues for funding.

But I think a more powerful way is to finally implement a concept that has been knocking around the internet for a long time- micropayments. If every download, every listen, every view, and every complete pageview were worth a cent, then the economics of our media lives would be transformed.

Bob Cringely recently wrote a post about how the economics of his own blog were just not working out. Even with 10 million page views on a typical posting, his ad rates are miserable. When was the last time you clicked on an internet ad? But if each of his readers contributed one cent after reading a full posting, he would be rolling in money, at very little individual cost to his readership. Heck, even I could earn a couple of beers out of such a scheme!
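A quick back-of-the-envelope, using just the figures in the paragraph above (the view count and the one-cent rate are the only inputs):

```python
# Hypothetical micropayment revenue for a single popular posting.
views_per_post = 10_000_000   # page views cited for a typical posting
cent = 0.01                   # one cent per completed read, in dollars

revenue_per_post = views_per_post * cent
print(f"${revenue_per_post:,.0f} per post")  # $100,000 per post
```

Even if only a tenth of those viewers read to the end, the post still earns five figures, which is the whole point: trivial individual cost, transformative aggregate revenue.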

We have been addled by the advertising model of media funding. It is an appalling way to conduct our media lives in aesthetic terms, and inefficient, and has empowered some of the most socially destructive actors, culturally and politically (think of all those greenwashing ads by oil companies). Even public broadcasting is being gradually eroded by its exposure to advertising, since its model of having its viewers/listeners pay voluntarily (after relentless hectoring) doesn't work very well either.

If a heavy web surfer visits, say, two hundred sites per day and fully reads, say, fifty pages, that amounts to fifty cents spent. Add thirty songs listened to and ten videos watched, and it all comes to ninety cents- call it a dollar a day, which seems like a very acceptable cost structure for the user.
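In code, the reader-side arithmetic (the counts and the one-cent rate are just the figures assumed above):

```python
# Daily cost to a heavy user under a flat one-cent micropayment.
cent = 1          # price in cents per fully-read page, song, or video

pages_read = 50   # only completed reads are charged, not all 200 site visits
songs = 30
videos = 10

daily_cents = cent * (pages_read + songs + videos)
print(f"{daily_cents} cents/day")  # 90 cents a day, roughly $27 a month
```

Note that visits abandoned partway through cost nothing under this scheme, which is what creates the design pressure on providers discussed below.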

Likewise, hooking up iTunes to a micropayment scheme on a per-play basis could fully unleash the internet for music, allowing all music to be open everywhere, and paid on the basis of actual use and enjoyment. The same for Youtube. Viral videos would be paid in appropriate fashion, by the masses who enjoy them.

So, rather than complicated and imprecise pay walls as they currently exist, a much better solution would be to reconstruct the internet on a broad micropayment foundation. Providers would choose whether to demand standard micropayments for their content (and thus be part of a very low-threshold paywall). Users would be blocked from those sites if they had not enabled an overall micropayment system on their browser with an associated account. If the rate is set low enough, i.e. one cent, I think it would be a no-brainer for everyone to participate. I think that the buy-in would be rapid and universal, and would transform our media landscape.

One benefit would be that advertising would be subject to new pressures. If providers only get paid after a user reads their post fully (scrolling down to the end, or paging to the end, or watching to the end), then having ads which clutter up the page and slow down readers would be selected against, and would only survive if they paid more than the users being turned away. Multi-page posts might be a thing of the past, among many other sins of design.

It is time to take back the internet for its users, and away from the corporations that are muddying its waters while inadvertently bleeding so many other industries dry. Micropayments did not take off at the beginning of the internet, since the network was so small, and advertising seemed an easier method. But now might be a better time to bring that idea back.

  • And playing music is good for your health.
  • Another thing we could do if congress weren't constipated.
  • And yet another thing- super easy taxes.
  • Corruption is pervasive- "rent", in economist's lingo.
  • Workers being abused and killed- what century are we living in?
  • What's a free market?
  • The mortgage actors were irrational, not rational.
  • Evolution, feelings, love, and pain.
  • Religion- kicking the addiction.
  • A rough road out of Afghanistan.
  • Economics graph of the week. Median income is the lowest it has been in a decade or more. So while corporate profits are at an all time high, and jobs are trickling back, it looks like those jobs don't pay very well. More on the same...

Saturday, March 23, 2013

Mammals, rising from the ashes

Yes, placental mammals diversified after the KT boundary

The asteroid hypothesis for dinosaur extinction hit the world of paleontology like a meteor, and has been a constant source of amazement, nitpicking, and doubt since. The fossil record is patchy, so even if dinosaurs really died out at the Cretaceous/Tertiary (K/T) boundary formed by this impact in the geological layers, not all species would have representatives neatly recovered right below the boundary, but not above. The record would present a more sporadic view of some dying out earlier, depending on their general abundance. But over time it has become increasingly certain that the impact was indeed cataclysmic, and while some groups of dinosaurs were in decline beforehand, most gave up the ghost right at the boundary.

Example of the K/T boundary, from Starkville, New Mexico. One signature is a high level of the element iridium, common in asteroids. 

Likewise for their thankful successors- the mammals- there have been long-standing disputes about when the major classes originated, especially the diversified descendants of the placental mammals- whether most of them were already present as the dinosaurs strode the earth, and what they really gained from the asteroid impact.

A couple of years ago, a thorough molecular study placed the divergence of placental mammals into its major lineages, like bats, whales, primates, herbivores, etc, well before the K/T impact boundary, at least 100 million years ago.

An older mammalian phylogeny made from molecular evidence. Note the K/T boundary marked by the shaded boundary at ~66 million years ago, which was recently honed to just a 100,000 year window- 66.043 ± 0.043 Ma. This tree has most major lineages splitting well prior to that boundary.

But recently a paper came out that used both classical anatomical/cladistic methods and molecular methods to revise the story back to what had been thought for quite a long time- that placental mammals split into their various modern subgroups only after, but relatively soon after, the K/T impact. Within five million years, as these authors have it. At this time, there was a vast explosion of lineages that led to all the modern types we know and love, and others that went extinct. In comparison, very few major lineages separated in the last forty million years.

Unfortunately, their phylogenetic tree diagram is such an overstuffed mess as to be unpresentable. But just scrunch the one above in the horizontal scale, and you pretty much get the idea, although both groups agree that the split between marsupial and placental mammals happened much farther back- about 180 million years ago. The divergence between the two views is remarkable, and comes from the interesting fact that the molecular "clock" doesn't tick very evenly.

Molecular phylogenetics uses comparisons of protein sequences. You can have your pick over a wide spectrum of sequences, from genes like immune system components that change very quickly (in an evolutionary sense), to others like histones or ribosomes that change very slowly. So you have your choice of clocks ticking at different speeds. With modern sequencing technology, it is relatively easy to collect data for your selected proteins from many species, and crunch them computationally to align the sequences and judge their divergence.
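The divergence-judging step can be sketched in a few lines. The toy below computes the raw proportion of differing sites between two already-aligned sequences, then applies the Jukes-Cantor correction (the simplest of the many models available for this step); the sequences themselves are made up for illustration:

```python
import math

def p_distance(seq_a, seq_b):
    """Fraction of aligned sites that differ (assumes equal-length, gap-free alignment)."""
    diffs = sum(a != b for a, b in zip(seq_a, seq_b))
    return diffs / len(seq_a)

def jukes_cantor(p):
    """Correct the observed fraction for multiple substitutions at the same site."""
    return -0.75 * math.log(1 - 4 * p / 3)

p = p_distance("ACGTACGTACGTACGT", "ACGTACGAACGTACGA")  # 2 of 16 sites differ
d = jukes_cantor(p)  # corrected divergence is a bit larger than the raw 0.125
```

The correction matters more the deeper the split: at large divergences many sites have been hit repeatedly, so the raw count badly understates the true number of changes.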

It is very quantitative in a way, and thus attractive over the old ways of comparing the slow change of tooth shapes over a fossil series, or skull shapes, or ear bones, or ankle bone shapes, etc. But there is a catch- that the only sequences we have are modern, so we have an enormous and perilous estimate to make when we want to translate a sequence divergence into a divergence in actual biological history. (And this isn't the only catch- there are a blizzard of possible mathematical techniques and associated theories/models available for the basic alignment comparisons and other steps, which have taken a long time to shake out.)

These estimates are calibrated using divergence times of well-understood fossils that track reasonably well to species we are familiar with in sequence terms (say, sheep and horses). But this calibration becomes rapidly more hazy as we go back in time, combining the uncertainty of the winding sequence history since divergence with the uncertainty of the fossil record. So it is easiest to stick closer to home (i.e. recent evolutionary times) and project those "calibration" clock rates back to more ancient events mathematically.
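The calibration-and-projection logic works roughly like this; every number below is invented for illustration, not drawn from either study:

```python
# Molecular-clock calibration: infer a rate from a fossil-dated split,
# then project it onto a pair with no usable fossil record.
calib_divergence = 0.10   # corrected sequence divergence between a calibration pair
calib_time_myr = 50.0     # fossil-dated split, millions of years ago (hypothetical)

# Divergence accumulates along both branches, hence the factor of 2.
rate = calib_divergence / (2 * calib_time_myr)   # per site per Myr per lineage

unknown_divergence = 0.26
estimated_split = unknown_divergence / (2 * rate)   # projects to 130 Myr

# The catch described above: if the true rate doubled during a radiation,
# the same divergence really accumulated in half the projected time.
```

This is why a rate sped up by an evolutionary radiation pushes molecular estimates too far back: the projection silently assumes the calibrated rate held steady all the way down.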

However it is apparent that evolution is not a constant process, and that especially during periods of promiscuous evolutionary radiation, sequence variation (as reified in founded lineages) may speed up, causing a molecular-only analysis to cast events substantially farther back in time than they really occurred. We also know that different lineages have different evolutionary rates at non-radiating times, making simple calibration rather perilous. The only remedy is to put more weight on studying the fossil record, which is what these current authors seem to have done.

Characters on the skull used by the authors to compare fossil mammals for lineage membership determination- just an example of the methods used to establish relatedness among real fossils, in contrast and comparison with molecular relationships.

This issue of the wayward molecular clock applies even more strongly to the advent of eukaryotes, estimates of which range from less than 1 to 2 billion years ago. The revolutionary nature of this transition can hardly be overemphasized, generating the enormous and enormously complex eukaryotic cell out of the symbiosis of two or more bacterial cells. Many new systems arose de novo (meiosis, nuclei, intermediate filaments, the Golgi and endoplasmic reticulum). It also involved a long gestation in evolutionary obscurity, followed by an astonishing radiation to a multitude of forms most various, including all animals.

So evolution remains a story in progress, with a fair amount of its history still shrouded in misty uncertainty, and only gradually coming into focus as more data come in and are more carefully analyzed.

The green line is projected labor force, given population growth and proportion employed during good (more or less!) conditions prior to 2008. The blue line is actual labor force participation in real numbers. The gap stands at about 12 million people. What could 12 million people be doing for us?

Saturday, March 16, 2013

Invasion of the stone cold killers

How one virus docks and enters.

These are wonderful times in biology. Advances in technology make routine what was before extremely arduous or impossible. An example is the study of structures at atomic and near-atomic scales. The data bank of biological atomic structures- of proteins, mostly, but also of DNA and RNAs- has ~90,000 entries, including increasingly complicated and large structures.

Combining atomic structures (deduced from X-ray crystallography, typically) with more gross-level imaging by electron microscopy has become possible as well. A recent paper described the large-scale structure of a bacterial virus as it docks and then injects its payload of DNA. Quite reminiscent of a space landing craft, really!

Example images of viral docking, in three stages. Stage A, the virus is positioned on the surface. Stage B, it has inserted its syringe-like channel. Stage C, its head is empty, after all its DNA has entered the cell. Note that this is not to scale- these cells are not full-size E. coli, but miniaturized vesicles derived from them.

This paper clarified the first parts of this docking story, showing where the tail fibers are at different stages of infection, at reasonably high resolution. The method was a lot of electron microscopy at the highest possible resolution, of hundreds of viruses, which were then averaged together to form smoothed-out pictures of higher resolution than any single one alone.
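The averaging trick is plain signal processing: noise that is independent from particle to particle cancels toward zero, so the mean of N aligned images is roughly sqrt(N) times cleaner than any single one. A toy demonstration with synthetic data standing in for micrographs:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny synthetic "structure": a bright square on a dark background.
truth = np.zeros((8, 8))
truth[2:6, 2:6] = 1.0

# 400 noisy "micrographs" of the same, pre-aligned particle.
noisy = truth + rng.normal(0.0, 0.5, size=(400, 8, 8))

single_error = np.abs(noisy[0] - truth).mean()            # one image: very noisy
averaged_error = np.abs(noisy.mean(axis=0) - truth).mean()  # mean of 400: ~20x cleaner
```

The real workflow also has to rotationally and translationally align the particles before averaging, which is the hard part; the cancellation of noise afterwards is the easy part shown here.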

Averaged image of T7 viruses, showing its tail, internal core, and tail fibers around the outside (C is the view from bottom). In A, a viral mutant was used that has no tail fibers.

This virus (the T7 bacteriophage, well-studied in molecular terms) first detects and binds to a bacterial cell (the usual lab specimen of E. coli) using its tail fibers, shown in yellow:

Schematic interpretation of the data, with tail fibers in tucked position, and the detailed structure of the end of the tail fiber, which binds specifically to bacterial surface LPS, superimposed at scale.

These yellow spokes are the landing gear, nicely tucked up against the body of the virus, which is a sort of ordered crystal of coat proteins enclosing a DNA cargo (of only 39937 nucleotides). One oddity is that, while there are six tail fibers, the body of the virus is an icosahedron with five-fold symmetry. So the fibers can't be very tightly bound to the body. The ends of these fibers seem free enough to bind their receptors whether tucked against the virus body or extended out. They bind a common and large molecule on the surface of bacteria like E. coli, called lipopolysaccharide, or LPS. This binding is very loose, however, so the virus can roll around a bit before settling down on its target.

Once a fiber binds, it flips out into an extended shape, and other fibers can then bind and flip out as well. When all six have bound, the virus is well-attached to the ill-fated bacterium, and is also properly positioned for the next step, which is the descent of the central (red) channel and injection of the viral DNA.

Data and schematics of T7 in the docked position, before and after tail insertion. OM is the bacterial outer membrane, IM is the bacterial inner membrane. PG is the peptidoglycan cell wall layer between the two membranes.

What happens next is a little more mysterious. The binding of the tail fibers to the bacterial surface- even all six- does not seem to signal the viral core to fire its payload. The central red tail does not seem to have a specific receptor that it binds to or recognizes either, as far as I can read. Rather, it seems to expose enzymes perhaps like lysozyme, that begin to degrade the bacterial surface. This results in a little indentation of the surface (see the image above). At some point, a large core complex of proteins inside the virus body is signalled to come out, forming a pore that spans both the outer and inner bacterial envelopes / membranes (parts C & D, above).

Authors' interpretation of the docking and insertion sequence. The DNA enters quite slowly, actually, taking several minutes and happening in three stages, pulled successively by viral proteins/internal pressure, then the host RNA polymerase, and lastly the RNA polymerase encoded by T7 itself.

An interesting aspect of what happens next is that the viral DNA doesn't just shoot into the bacterial cell. It carries sequences that would be susceptible to one of the bacteria's defense mechanisms- the so-called "restriction" enzymes that cut foreign DNA. So the virus feeds in just a little of its DNA, (850 nucleotides), protecting it with special proteins. That DNA holds three promoters, which attract the resident bacterial RNA polymerase, which through its action of transcription physically pulls more of the viral DNA into the cell. One of these genes is a specific inhibitor of bacterial restriction enzymes. Later on, after T7's own RNA polymerase is transcribed and synthesized, (and after many host proteins have been destroyed), it comes back to pull the rest of the viral DNA into the cell at a speedier rate. The whole cycle of infection can be done within half an hour.

A diabolical mechanism, to be sure. But viruses like this one are being considered as next-generation antibiotics, to take up the slack after our penicillin-related antibiotics wear out due to overuse and evolved resistance. So don't be surprised if you end up swallowing some of these viruses as medicine someday.


Saturday, March 9, 2013

Socialism with English characteristics

Can't we all live together?

I've been watching the first few episodes of Downton Abbey, and honestly, it reminds me of Glee, with its machine-gunned melodrama. One is left in a tizzy trying to follow it all. But one plot line in episode 4 was intriguing, where the head housekeeper Mrs. Hughes entertains a marriage proposal from an old flame, a widowed farmer from Scotland. She says no, with reluctance and some wistfulness for the freedoms he offered.

But she does live in a castle, after all- not a bad spot, even if one is downstairs in service. It made me think about the ecosystem the show portrays, vs the pioneer homesteading ethic we so commonly hold in the US, madly rushing into debt and overwork to stake our claim on a single-family home, proudly isolated from the rest of humanity.

The Edwardian estate was a communal living arrangement, with a large population cheek-by-jowl, all serving each other's needs. Washing was done, meals cooked, grounds kept, bills and employees paid. Very few communal living schemes have had such durability, certainly not more idealistic ones forsaking hierarchy in the name of communism or socialism.

The right/Republican end of the human temperament spectrum certainly has an important point to make in this respect, that hierarchy is an essential part of the human condition, putting the order into "the social order", putting each member in a place with a known role, from which they hopefully have a structured route for ascent depending on luck and talent, as evaluated by other members of the collective.

The conceptual revolutions of the enlightenment, and finally Marxism, began to see this hierarchy as intrinsically oppressive. As a matter of humanity, no person had the right to order others around, or to be fussed and dressed by a simpering valet- certainly not by birth. The capitalist and estate proprietor was seen not as the orchestrator of a complex community that provided roles and sustenance for all its members, but a parasite who skimmed off profits and labor-value from the powerless worker-bees who, if only they could shake off the chains of a socially programmed hierarchy, could earn their full value and enjoy a life of personal freedom.

But sometimes that life is less rich than one within the communal hive, however constricted. Even if the servants get only a fraction of the good food, and rudimentary dormitories, even if they experience the daily sting of inequality, they have other creature comforts and social comforts that might well make up for it. Some of our greatest literature has arisen out of the complex communities of the estate. (Though admittedly, rarely from downstairs.)

In our day, the corporation is perhaps the most dominant social community, leaving its members to a freely atomized personal life, even as it imposes hierarchy of a very traditional sort on their daily working lives. The recent dustup over Yahoo's renunciation of telecommuting touches on the heart of this power structure- how closely should the corporation dominate its slice of our lives?

Is this modern division of work and "life" the final solution, or will the future bring other modes of social organization? Work is a central human value, so even if we don't need to do anything- once robots take care of all our mundane needs- we will still live in hierarchies of some kind, and work to make each other's lives better, in some, hopefully higher, more artistic, more humane, way. Perhaps that is what Marcel Proust saw in his fin de siècle time, running from one Parisian Salon to the next, scrambling up the social ladder with no greater goal than to be loved.


  • Temperament, politics, and social order, continued...
  • Suicide bombings OK.. or not OK?
  • Afghanistan- the good news.
  • As markets go up, labor keeps getting screwed.
  • In Japan, finally the prospect of an exit to normality, via more spending.
  • We, too, need a fiscal kick in the pants.
  • Our libraries, our homeless shelters.
  • Economic quote of the week, from Friedrich Hayek, on equilibrium methods and assumptions in economics:
"The assumption of a perfect market then means nothing less than that all the members of the community, even if they are not supposed to be strictly omniscient, are at least supposed to know automatically all that is relevant for their decisions. It seems that that skeleton in our cupboard, the 'economic man', whom we have exorcised with prayer and fasting, has returned through the back door in the form of a quasi-omniscient individual.
The statement that, if people know everything, they are in equilibrium is true simply because that is how we define equilibrium."
...
"Clearly there is here a problem of the Division of Knowledge which is quite analogous to, and at least as important as, the problem of the division of labour. But while the latter has been one of the main subjects of investigation ever since the beginning of our science, the former has been as completely neglected, although it seems to me to be the really central problem of economics as a social science."

Saturday, March 2, 2013

What's next for Apple?

Robots.

With Apple being the most valuable company, more or less, one has to wonder how it can top its previous tricks. Bob Cringely asked this question, and got me thinking.

Honestly, it is hard to expect any growth at all. Given its already enormous size, the most likely path goes down, not up towards a future where Apple would become lord and master of the business world.

Nevertheless, what could be next? How can they top their megahits of iPod, iPhone, and iPad? They have made a science of striking a consumer category ripe for computer-i-zation with a fully-formed solution that transcends what others have imagined and melds design, computerization, and cloud services into all-too convenient ecosystems. As an example of bad timing, Apple first offered a tablet computer way back in 1993- the Newton. It was an amazing achievement, but far before its time. Only much later did Steve Jobs pull the trigger a second time, and the iPad arrived.

What other areas of our lives are ripe for this kind of treatment? Where have nascent computer-consumer interfaces been lurking, ready to inspire an Apple-style invasion and makeover?

I think it is robots. Admittedly, invading this area would be extremely ambitious, involving far more moving parts than Apple is used to dealing with. But the field seems to be in a perfect state. There is an active industrial sector with usable, worked-out technologies. There is a nascent consumer sector, in a few specialized niches- cleaning and telepresence robots (which are even based on the iPad). Autonomous cars seem to be imminent as well.

The potential is clearly very high. Just as desktop computers took over areas of our lives and previously separate jobs that no one imagined, (secretaries, music production, guitar hero, postal service, phone calling), so robots will doubtless work their way into innumerable areas of our lives, limited only by their intelligence and design.

For running a household, the electrical revolution has left a great deal undone. Compared to having a house full of servants, having a washing machine leaves quite a lot to be desired. So does having a microwave, compared to having a cook. So does having a vacuum cleaner, compared to having a maid. Could robots bridge this gap between dumb machinery and true service?

It is going to be a very long road, but with real uses already here, it is the kind of market that may be ready for exploitation, and for unimaginable growth.

Could robots take over these tasks without also taking over and ruling the world? I think so, given that we design them. But in any case, it is a world we are headed towards, and Apple would be well-positioned to help design it.


  • Oh, robots do interpretive dance, too.
  • Is the thrill gone at Apple?
  • Henry Ford- another visionary / megalomaniac, building the future.
  • Telecommuting rocks.
  • Christian martyrs: a mess of legend.
  • Christian church- right or wrong, the tribe is most important.
  • IBM, Circa 1970's-80's. "Every IBM employee’s ambition is apparently to become a manager, and the company helps them out in this area by making management the company’s single biggest business." Also ... how the world ended up using the "quick and dirty operating system".
  • Why priests? ... indeed.
  • The sequester ... is what it's like being ruled by the stupid party.
  • Solow on debt. And some MMT corrections.
  • Economics graph of the week- sectoral balances in the US, since 1952. Note how the household sector was in a highly unusual deficit from roughly 1998 up till the crisis.


Saturday, February 23, 2013

Riding into the dawn of history, with the Indo-Europeans

Review of David Anthony's "The horse, the wheel, and language."

The nineteenth century experienced a rush of excitement about the linguistic consanguinity among the Indo-European languages, and thus the cultures of India, Iran, and Europe. The most exotic cultures turned out to be joined by an invisible history, their farthest-flung representatives at last re-bridged by the English colonial project.

By the next century, this connection curdled into the Aryan theory of racial distinction from the semitic and other "races", buttressing with pseudoscience the already virulent antisemitism of Christian Europe.

Anthony's book brings the whole story up to date, covering vast ground in archeology and linguistics that gives us more information about the Indo-Europeans, resolving many of the big questions about them, principally where and when they originated.



The first thing to say is that whatever the Aryans were at the outset, they soon lost themselves (genetically speaking) in the vast bodies of humanity they collided with. Their dominance, based on the military advantages of horse-riding, chariot-fighting, and outstanding metalwork, gave their language a privileged status. But they practiced a client-patron form of rule, and accepted into their culture whoever conformed to the cult (best represented in the Rig Veda (~3500 ybp, or years before present), but also reflected strongly in the Roman culture), so by the time of the Nazis, there was no such thing as an Aryan.
"The Rig Veda (of India) and the Avesta (of Iran) agreed that the essence of their shared parental Indo-Iranian identity was linguistic and ritual, not racial. If a person sacrificed to the right gods in the right way using the correct forms of the traditional hymns and poems, that person was an Aryan. Otherwise, the individual was a Dasyu, again not a racial or ethnic label but a ritual and linguistic one- a person who interrupted the cycle of giving between gods and humans, and therefore a person who threatened cosmic order, r'ta (Rig Veda) or asa (Avesta)."

One of the most interesting observations from the linguistics is the inexorable change languages undergo. The English of only 1,000 years ago is unrecognizable to us. So not only can one make conclusions of common origin based on linguistic similarities, one can make negative conclusions from a lack of similarity and also rough timing conclusions about when branches split off from each other.

For instance, the Indo-European languages have a lot in common- many root terms and core concepts, like horses, gods, wheels, wool, carts, portable wealth, dogs, milk, and much else. They have such a strong core that they cannot have diverged more than 5 or 6 thousand years ago. This core must have functioned as a coherent cultural language for a group that couldn't have occupied a terribly large territory originally, given the technologies of the time, or persisted for a very long time in that early state, yet which subsequently spread like wildfire to all corners of the western ancient world, and may have significantly influenced the rise of Chinese culture as well.
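Classic glottochronology- a contested method, and not necessarily the one used for the estimates cited here- turns this intuition into arithmetic: if core vocabulary is retained at a roughly fixed rate per millennium, the fraction of cognates two languages still share dates their split. A sketch using Swadesh's traditional retention figure:

```python
import math

RETENTION = 0.86  # Swadesh's classic per-millennium retention rate for core vocabulary

def split_age_millennia(shared_cognate_fraction):
    """Both languages shed vocabulary independently, hence the factor of 2."""
    return math.log(shared_cognate_fraction) / (2 * math.log(RETENTION))

# Two languages still sharing 30% of their core vocabulary
# would have split roughly four millennia ago under this model.
age = split_age_millennia(0.30)
```

The method's weakness is the same one that afflicts the molecular clock: retention rates are not actually constant across languages or eras, so the figure is a rough bound, not a date.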

The linguistics point to some key innovations- the domestication of horses (estimated about 7000 ybp), horseback riding (~6200 ybp), the invention of the wheel (estimated at ~5700 ybp), and the use of wheels on light, one- or two-person war chariots (estimated at ~4000 ybp). There is also the adoption of long-haired (mutant) sheep for wool production, and entry into the bronze age proper, developing arsenic and tin alloys with copper into a regular industry. All these steps were evidently first taken in the Pontic steppes, the regions North of and around the Black and Caspian seas.

Why? Well, the first reason is that this is one of the few regions horses remained in the wild. People in this area had already domesticated pigs, goats, sheep, and cattle. But horses are another kettle of fish! All domestic horses trace their ancestry on the male side to a single male progenitor, indicating the difficulty of establishing a domestic herd. Horses were hunted commonly for food, one of those "pre-adaptations" one hears about so often in the evolutionary stories. So it would naturally be in this region as well that horses were first domesticated for food, then ridden for better management, and lastly ridden and harnessed for many other purposes.

While Anthony's expertise generally leads to grievous over-writing on the archeological issues he is most familiar with, his work on bit wear is quite significant, finally figuring out how to tell whether horse remains show signs of riding, by way of the slight damage done to the teeth by horses chewing on bits placed in front of their molars. This is how he roughly dates the advent of horse riding to about 6200 ybp in the Kazakh to Caucasus area. One can only imagine how daunting the prospect of riding the first horse must have been, and how bizarre one's first sight of another human on horseback.

Anthony describes an interesting process where the productivity of the steppes was transformed by the horse and wheeled carts, from a wasteland where herders could visit only on brief forays from the river valleys, into a perpetually productive zone where they could nomadically herd as far afield as they pleased. A bit like how the iron plow transformed the farming (and destruction) of the steppe / prairies in later times. This economic change also introduced the possibility of vast accumulation (and vast differences) in wealth, stored on the hoof as livestock. Which then fostered a cultural transformation towards much more differentiated status hierarchy / patriarchy, where the rich were buried in very labor-intensive monuments.

It obviously also knitted together large regions not previously in contact, between these steppe areas and the more urban areas to their south, and across the steppes even over to China. Again, this is more than a little like the later rise of the Mongols, who ranged even more widely using the same technology of horse-based nomadism to make of Central Asia a highway of conquest and trade- the silk road. Anthony highlights a large amount of Russian archeology from the last forty years that has not been very accessible to or appreciated by the West, and focuses particularly on one culture at the southern end of the Urals, the Sintashta, which seems to embody the ur-Indo-European culture.

Sintashta grave, with metalwork, horse remains, and chariot remains. At lower right are horse bit cheek pieces, whose knobs are believed to have been placed inward against the horse's lips, giving the driver/rider even more control with a very light touch.

These people (about 4000 ybp) built compact, strongly palisaded encampments, filled with bronze workshops. They buried a small proportion of their population in kurgan graves, which were a steppe specialty of a large circular built-up mound with a central grave, often structurally supported with wooden bulwarks. These particular graves contained a good deal of bronze, and war chariots. Most interestingly, they contained horse-intensive sacrifices eerily similar to the central horse sacrifice ritual described in the Rig Veda, with heads and feet arranged artistically around an overturned pot. (The rest of the animals were served in the feast; unfortunately, I cannot offer an image of such a grave that does it justice).
"Similarities between the ritual excavated at Sintashta and Arkaim and those described later in the Rig Veda have solved, for many, the problem of Indo-Iranian origins."

So there we have it, the origin in time and space of the Indo-Europeans, more or less. What they brought to the rest of the world, in addition to their language, continues to ring down the ages. Roman culture was a typical example. The patron-client system was honed to perfection in Rome, making of the paterfamilias practically a god, to be revived periodically via the public parade of his death mask. Women were of vanishingly little account, on the other hand. This culture was starkly patriarchal. When the Romans invaded Britain, they were astonished to see war chariots being driven about- something the Romans had only heard about in the Greek epics of a bygone age, despite their own carefully tended "sporting" rituals of chariot racing and other manly feats of brutality and human sacrifice.

In our own day, we remain inheritors of many of these traditions, struggling still to overcome patriarchy, colonialist tendencies, a large cast of sky-gods, and our love of speedy chariots. Perhaps our love of technological innovation will start to solve some of these problems, rather than feeding greed and power-lust as it has so often in the past.

Wotan takes leave of Brunhilde, Konrad Dielitz, 1892.

  • A contrary view of the origins. The Anatolian branch is accepted by both sides as particularly early, but whether it was the immediate precursor to most of the rest of the Indo-European family is in question.
  • Excel and the London Whale.
  • Martin Wolf gets MMT religion.
  • "The Economist"- another right wing rag.
  • What's so bad about corruption?
  • Chromium- another study in corruption.
  • The triple pane way to climate control.
  • The postal way to simple banking.
  • "As the door revolves" ...
  • Gay blackmail in the Vatican? I wouldn't want to be pope either.
  • No wonder we watch Downton Abbey- we are now a class-ridden, immobile society.
  • Economic quote of the week:
"... high unemployment we know depresses wage growth throughout the wage scale, but more so for the bottom than the middle and the middle than the top."

Saturday, February 9, 2013

Eisenhower: hidden hand, or itchy trigger finger?


Review of Evan Thomas's "Ike's Bluff"

I picked up, from my library's wonderful "new" rack, a biography of Dwight Eisenhower that focuses on his foreign policy as president. The H-bomb was created while he was in office, multiplying the conundrum (and unthinkability) of pulling the trigger. These bombs can be made at literally any scale desired, given enough materials.

The conundrum it presented Eisenhower was that he wanted to save money and reduce the size of the military by brandishing various nuclear threats, but he didn't actually want to use them. The author tries to make the case that Eisenhower was perhaps the only person respected and feared enough to pull off such a bluff persuasively, and to do so, he knew that he had to tell no one of his inmost decision, whether or not to drop the bomb. It was a lonely position.

So the country practiced its duck-and-cover drills, built its bomb shelters, and employed the "Bland" corporation to devise ever more esoteric, even shamanic, rationales for mutual assured destruction.

But this kind of bluffing wears thin pretty fast. A few crises further along (at least), it became clear that no one really wanted to use nuclear weapons under any non-existential circumstances, (or even then), so we went back to fighting wars the old-fashioned way, with guns and proxy fighters in far-away countries. The Korean war had already demonstrated all this, so the idea that anyone, including the Russians, took Eisenhower seriously when he dropped his various hints about using nuclear weapons is really a bit hard to swallow.

Thomas works valiantly to make Eisenhower look commanding and wise in his conduct of all these policies, heading an administration noted for its ostensible blandness while a happy and prosperous country took its cue from its chief and went golfing. Internationally as well, Eisenhower was mostly respected and even loved, as America was still the colossus, leading the way both morally and materially out of World War II.

Retrospectively, there are definitely high points, such as Eisenhower's brutal string-pulling to march the English and French out of their Suez adventure. It is not well remembered that less than a decade after Israel declared itself a country, it launched an unprovoked war on Egypt, after Egypt started buying arms from Russia and nationalized the Suez Canal. Israel's attack plan was hatched in secret with Britain and France, who were supposed to magnanimously broker a peace where the European powers would take back ownership of the canal- for the good of all, no doubt. The US was also playing for influence in the Mideast, and, with some lip service to anti-colonial principles, used its dominant financial position to destroy England's exchange rate and starve it for oil.

Another positive was the U2 spy plane program. Eisenhower was receptive to technological advances, and was intimately involved in the approval and running of this plane that flew at 70,000 feet, twice the height of today's commercial airliners, and (for a time), beyond the Soviet air defenses as well. The U2 gathered immensely useful photo reconnaissance, making it clear to Eisenhower that the Soviets were far behind the US in nuclear armaments. There was no bomber gap, and no missile gap.

But did Eisenhower tell his jittery countrymen? No. I find this very hard to understand. The rationale was that such announcements would betray the U2 program, so its findings needed to be kept secret. But it was not as though we had no other capabilities, and couldn't just generally state that we had, through various means, a very good idea of Soviet capabilities. It would have been very positive for US leadership on all fronts to make it clear that we had no doubt of our overall position in this arms race, preventing the kind of domestic fears, divisive politics, and foreign adventurism that happened through this time.

Unfortunately, the blowup over the U2 after Gary Powers was shot down derailed the blossoming detente that Eisenhower was pursuing with Khrushchev, whose sight-seeing trip through the US had been so successful in thawing the cold war. But it really wasn't Eisenhower's fault, in my view. The more information was available, the better, both for us and for the general stability of the world, as Eisenhower had earlier advocated with his "open skies" program- which would later be implemented in various test ban and arms control agreements.

Eisenhower also refused to get involved in Vietnam, letting the French face annihilation (decolonialization?) at Dien Bien Phu. On the other hand, Eisenhower coined the fateful "domino" theory of communist expansion in Southeast Asia, and started US support for the disastrous Ngo Dinh Diem in Vietnam. Would he have gone back in later on, when the South came under mortal danger? It is doubtful that he would have done nothing, but, having extricated us from Korea, I think he would have been very reluctant to escalate into another quagmire on China's doorstep.

On the other hand, Eisenhower let the CIA run horribly amok, staging coups and other more or less amateur operations all over the world. Meddling freely without getting the US into a shooting war seems to have been the theme of the Eisenhower presidency. Coups in Iran and Guatemala, attempted coups in Indonesia and Syria. This theme continued through the cold war, with the Bay of Pigs and the overthrow of Allende in Chile by the Nixon administration. Our record of picking rulers for other countries has been, to say the least, poor (cough... Karzai). And the eventual blowback from our meddling in Iran in particular has been epic in scale and duration.

So on the whole, I find the Eisenhower foreign policy to fall short of the model of far-seeing statesmanship. Notably unsuccessful in his own stated goal of reducing the military-industrial complex, Eisenhower did keep the US out of wars small, large, and apocalyptic. But then we were the most powerful nation by far, making that task a little less difficult. His enthusiasm for covert operations was not only damaging by its direct effects, but infected future administrations with misguided bravado and sullied the US's reputation into the present day.



  • Conversely, how has Hillary done?
  • Meanwhile, we are slipping into our own rogue policy problems.
  • Filibuster, still killing democracy and government.
  • Geithner, stabilizer of finance, blind to the long term.
  • Car of the future?
  • Bill Black continues on the toxic and ideological pusillanimity of our current elites.
  • A glorious anniversary- the income tax is 100 years old.
  • The university is so yesterday.
  • But Facebook is evil.
  • Austerity- cover for class war. And for pro-cyclical futility.
  • Dell deal is a tax dodge.
  • Why some people just aren't very good at lying, or even BS. (Or religion.)
  • Economic graph of the week:



Saturday, February 2, 2013

Guns- a love story

The archetypal and psychological valence of guns.

It is pretty clear that our gun debate has little to do with reason. Even less with the constitution, militias, self-government, or self-protection. It is about psychology.

In this fallen age, when the old patriarchal verities are in decline, men marry other men, women run for president, bromances dot the cinema, and sources of virility are thin on the ground, guns stand forth as an undisputed fount of macho manliness, projecting potent globules far and wide.

Far more than a simple tool, guns are civilizationally transformative, and blatantly symbolic of male power. Owning a gun is deeply personal, conferring on the owner and the home the combined potency of fire and phallic symbologies. Can you say "pump action"? Not to mention the flirtation with killing and murder, the most potent act of all. One never forgets that a gun is in the house.

Therefore, taking guns away from people conjures up castration anxieties, prompting endless rationalizations of how many armed intruders one might have to battle with one's pleasingly long AR-15 with plenty of juice in its thirty or ninety-round clips, holsters, and bandoleers.

Unfortunately, once we are in this kind of twilight territory of the archetypes, the attraction of guns for unstable, embittered losers looms even larger than it does for the usual red-blooded male. Which then leads to the occasional complete breakdown, enacting a fantasy of glorious retribution for all the belittling, emasculating affronts that the world, and especially females, have heaped upon this frustrated male.

Obviously, then, the answer is to test each gun owner with a simple, if paradoxical, question: may I take your gun away? If the answer is no, then the gun really does need to be taken away. Can our social collective transcend this psychological difficulty? Can the superego rule the id, to continue with this Freudian line? We shall see.


  • What I am talking about..
  • Department of injustice.
  • Treasury gave $37 million to bankers bank accounts.
  • Is crony capitalism now hopelessly entrenched?
  • From the Atlantic- the most knowledgeable investors and analysts have no idea what lurks inside the banks.
  • Are things not getting worse all the time?
  • Reform, or serfdom for a select group of immigrants?
  • The coming storm in Afghanistan.
  • Who is running the show? Afghan conference of clerics to condemn suicide bombing... cancelled by the Taliban. Islam seems more than a little conflicted.
  • Cats kill billions of birds in the US each year. Feral cats need to die.
  • The wild and wooly world of atheism.
  • Economics quote of the week, from Bill Black:
"The liar’s loans “crisis” of 1990-1992 in Orange County, California was stopped in its tracks without any expensive failures because we (the OTS West Region) realized that such loans inherently would lead to endemic fraud and losses."
  • Bonus economic graph of the week: GDP has been dragged down by declining government spending, of all things. 

Saturday, January 26, 2013

WaMu-ellujah! Forgetfulness and greed


The failure of Washington Mutual, and a new paradigm of corporate responsibility.

A blurb on the back of "The Lost Bank", by Kirsten Grind, calls it "entertaining". If having your teeth pulled is entertaining ... perhaps. It is a tragedy of a classic sort, except that the main protagonist (the CEO) rides off rich as sin, leaving behind a landscape of smoldering properties, empty bank accounts, and angry mobs.

Washington Mutual began as a well-loved local bank of the northwest, focusing on customer service. It barely survived the S&L crisis, but thrived thereafter, buying up like-minded banks and integrating them into its low-key, family-oriented atmosphere. The key transition during this era was from a beloved CEO, Lou Pepper, to Kerry Killinger, a more MBA-style, somewhat inscrutable, and socially awkward banker. It turned out that he had enormous ambition and energy, however, and drove the bank to new heights of acquisition, all based on its retail banking prowess, honed through many years of takeovers and careful back-office integration.

Eventually, the company outgrew its small-town values, became the sixth-largest bank in the US, and hosted blowout parties featuring such inspirational messages as WaMu-ellujah! Along the way, the company acquired mysterious loan issuers in Southern California that reaped ever-increasing profits, while keeping the details murky. For a bank, loans equal assets. This means that if customers can be convinced to take out loans far beyond their ability to pay, the loan is, at least in the short term, booked as a larger asset for the bank.

Washington Mutual spent a long time torn between its addiction to these profits, and its underwriting standards, which gradually fell by the board. Risk officers came and went like ghosts in the night. Killinger became a rock star, by the standards of the business press, at least. As S&L crisis veteran Bill Black explains with regularity, the easiest way to rob a bank is to own one and run a control fraud on it. The money is guaranteed. (Short term, at least.)

The question that the author does not address is the most difficult one- how much of this was conscious, and how much unconscious? How much was the bubbly atmosphere where collapsing house prices were inconceivable, real estate assessors were bullied into marking to the desired price rather than to value, ratings agencies colluded with their bank customers to mark toilet paper as AAA, and real estate bundled-security investors trusted that someone else was minding the store?

Or how much was a conscious decision by Killinger and his lieutenants to throw the core banking concept of underwriting standards into the circular file and let the money roll in? How much was the negligence of auditors, investors, regulators, analysts, and journalists to connect the flow of profits to the sewer of underwriting that Washington Mutual was tending in secret?

We are not told, and it might take a psychoanalyst to address properly. In any case, we have heard a great deal about "moral hazard", a concept that applies here in spades. The institution of Washington Mutual was destroyed, absorbed into the ever-growing colossus of JPMorgan. But the executives responsible for all the bad decisions and betrayed standards and trusts ... they kept their loot and retired in comfort.

This is a problem- that corporations and their officers pay insufficient attention to the long term. Only corporations that thumb their noses at conventional business advice and analyst pressures (think Apple) have an inner culture strong enough to ignore short-term temptations and to focus on the long term. The default criterion is quarterly earnings, by which executives live and die, and get the bonuses and options whose time horizon is a few years at most. Take the money and run has become the normal way of business in the US, and it is damaging not only in rare crises, but in broader cultural terms.

Washington Mutual was mainlining the drug of greed- raking in money from its shady lending operations, and getting patted on the back in the bargain for extending credit to previously underserved communities(!) Internal risk assessment was subverted, long-standing cultural norms overridden. The regulatory establishment likewise lost its mind, captured by criminal amnesia (especially considering the S&L crisis had blown through less than two decades before) and subject to the same Minsky cycle of forgetfulness as the bankers.

What to do? I think the answer is to create a societal mechanism to extend clawbacks deep into the ill-gotten gains of the executives and others profiting from this kind of toxic, antisocial activity. We need to think beyond simple criminal accountability. Corporations are organized to have limited liability. And as far as shareholders, that is perfectly fine. They takes their risks, and they gets their rewards. But when it comes to the general public interest, no such liability protection is appropriate. If corporations are to be people, they can not be immune from basic duties to the public interest, especially in view of their vastly expanded influence on that public interest, relative to the scope available to an individual citizen, due to their size and organization.

Just as lawyers are deemed officers of the court, and are held to some professional standards that go beyond fulfilling the letter of the law, corporate executives should be deemed officers of the state, with extra duties and liabilities that extend beyond the letter of the law. Prime among them would be general and long-term liability for all their gains beyond a base salary of perhaps the median for their company. All other monies would be subject to long-term review and surrender based on legislative findings of culpable irresponsibility and harm to the public good.

Now, all businesses are more or less culpable of cutting corners, transforming public goods into private gains, impairing markets, and the like. This proposal wouldn't be aimed at typical businesses. On the other hand, there are many corporate endeavors that are socially positive but are not sufficiently rewarded in the market. Those who invent the lasers, microchips, diagnostic tests, and other great things do not always get their due, and might be beneficiaries of some of the funds gathered from the malefactors held to account by the above process.

Obviously, this is a highly problematic type of proposal. Now that corruption is standard practice in the revolving door between business and government, how could we possibly expect this nakedly political process to work any better than the criminal process that has already so obviously failed to bring financial criminals to account? Would this new process of accountability not skirt the protections of the rule of law and lead to hyper-political battles that leave the country even more corrupt, and well-connected but disastrous businessmen even better off than before?

I do not have good answers. All I know is that it is fundamentally unjust and wrong to see someone like Mr. Killinger hide his ill-gotten gains behind a curtain of legalism, when the whole idea of law is to create a just society, and the whole idea of business is to render people useful to each other, not only for a New York minute, but over the long term.




Saturday, January 19, 2013

How do we get five digits?

Hox genes control the number of digits (fingers and toes) in an interesting way.

Once the trail of animal development research was blazed in the fruit fly, mammalian investigators eagerly followed, using similar methods and looking at related genes. Some of the most interesting have been the Hox genes, which control patterning at a very high level- the identities of segments in flies, and the identities and numbers of related body areas in mammals (vertebrae, ribs, limbs, digits, etc.)

Genomic diagram of Hox genes in various organisms. A rough phylogeny is on the left, and diagrams of where some of the genes are expressed and have their effects are on the right. The middle shows the clusters of Hox genes, where the entire cluster has been quadruplicated in the vertebrate lineage, creating A, B, C, and D clusters of genes 1 to 13, though a few are missing here and there due to later loss. Note the general rule of linear expression of Hox genes from front to back in the organism, coordinate with genomic position.

The wiki page on Hox genes supplies this graphic of the Hox gene clusters of various species, related by a rough phylogenetic tree (left; tetrapods are us). Each colored box represents one protein-coding gene, positioned roughly as it appears in the genome (not to scale). Note that vertebrates picked up a quadruplication of the entire Hox cluster, after which a few individual genes were later lost. This expanded the body plan repertoire of this lineage substantially- a significant evolutionary event. The original Hox cluster was incidentally already the result of long-ago duplication of a single gene encoding a transcription regulatory protein. All Hox proteins have very similar structures.

Hox stands for homeobox, which stands for homeotic transforming transcriptional regulator protein containing a diagnostic protein sequence that binds to DNA and was called a "box" of sequence, due to its appearance in sequence alignments. And homeotic? That is not kinky at all, but refers to genetic effects on the body plan, i.e. the transformation of one part of the body into one "like" another via mutation, from the Greek for similar. Hox proteins all have their effects by binding to other genes and controlling their expression, though unfortunately little is known about these details, at least in mice, since this end of things rapidly becomes extremely complex.

Another part of the story of digit control is a different DNA-binding protein, Gli3. When mutated, Gli3 is known to cause polydactyly- typically the production of a sixth digit- as well as many other malformations (see image at bottom, left side). Gli3's activity (the details of which are largely unknown) is controlled by a gradient of another protein, called sonic hedgehog (Shh), in the developing limb bud- which at last is a protein that forms an actual physical gradient in the tissues of a developing limb.

Shh protein forms gradients that help direct development of body patterns during early embryonic times. But it can't do its job alone.

A recent paper showed that these two systems, the Shh/Gli3 system and the Hox system (specifically Hoxa13 & Hoxd11-13, the last genes in the tetrapod clusters above, dark blue) interact to generate the five-finger pattern. The last ingredient believed to be involved is another gradient-forming protein, fibroblast growth factor 1 (Fgf1), presumed to be downstream of the various Hox regulators. The researchers speculate that these two gradients, of Fgf1 and Shh, are controlled by different genetic inputs, and interact to create patterns. In this case it is fingers, but in other organisms, similar processes are thought to make zebra stripes, wing patterns, shells, etc.

One molecular gradient can provide some information about where to put things, but probably not anything very consistent or detailed. The interesting part of this story, though the actual biology is unfortunately not yet well developed, is that the combination of two molecular gradients generates far more interesting possibilities. This was, intriguingly enough, pointed out by one of the greatest mathematicians of all time, Alan Turing, who, taking time off from inventing the computer, provided the mathematical foundations of two-component chemical systems in which the two substances diffuse at different rates and react with each other as they spread across a field- and which can, somewhat counter-intuitively, create amazingly stable and interesting patterns.

Abstract model of a Turing wave 2-component system, resolving itself over time spontaneously from a homogenous solution into a complex binary pattern.
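Turing's two-component idea is easy to see in simulation. Below is a minimal 1-D sketch using the Gray-Scott model, a standard textbook reaction-diffusion system- to be clear, this is an illustration of the general Turing mechanism, not the actual Shh/Fgf biology (whose molecular details, as noted, remain unknown), and all parameter values are the conventional demo choices, not anything measured.

```python
import numpy as np

def turing_pattern(n=200, steps=5000, du=0.16, dv=0.08, f=0.035, k=0.065, seed=0):
    """Simulate a 1-D Gray-Scott reaction-diffusion system, a classic
    two-component Turing model. The two chemicals u and v diffuse at
    different rates (du != dv)- the key ingredient Turing identified
    for spontaneous pattern formation- and react via u + 2v -> 3v,
    while u is fed in at rate f and v decays at rate f + k."""
    rng = np.random.default_rng(seed)
    u = np.ones(n)           # substrate, initially everywhere
    v = np.zeros(n)          # activator, initially absent
    # seed a small perturbed region in the middle, plus a little noise
    mid = slice(n // 2 - 10, n // 2 + 10)
    u[mid], v[mid] = 0.5, 0.25
    u += 0.01 * rng.standard_normal(n)
    for _ in range(steps):
        # discrete Laplacians with periodic boundaries (dx = dt = 1)
        lu = np.roll(u, 1) + np.roll(u, -1) - 2 * u
        lv = np.roll(v, 1) + np.roll(v, -1) - 2 * v
        uvv = u * v * v      # the autocatalytic reaction term
        u += du * lu - uvv + f * (1 - u)
        v += dv * lv + uvv - (f + k) * v
    return u, v
```

Plotting the returned v (e.g. with matplotlib) shows the nearly homogeneous starting state resolving into discrete, stable pulses- a toy version of a field of tissue resolving into discrete digit primordia.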

Naturally, these researchers made mutations to look at the effects of these genes. The complete deletion of Hoxa13 turns out to be lethal in early embryos. On the other hand, they find that the Hox code is rather complicated, such that deletion of Hox members d11 to d13 causes added effects that mimic or accentuate deletion of a13. In early embryos, they can see the hand region setting up dramatically more fingers (marked by staining for the protein Sox9, a marker of pre-cartilage/bone formation) as they delete either of the gradient-forming or responding genes Gli3 and Hox*.

Early embryonic (day 12.5) limb-buds stained for finger primordia, from mice mutated for various genes, as noted. "+" is wild-type for one or both genes, while "-" is deleted or otherwise inactivated. While deletion of Gli3 alone has some effect on finger number, only with the added deletion of the Hox genes do finger numbers become truly uncontrolled.

There is something amazing going on here. By the time all these genes are deleted (-, or other non-"+" variants) for Gli3, Hoxa13, Hoxd11, Hoxd12, and Hoxd13, there are no individual digits left. The whole zone has turned into a smooth non-digital mess. Lesser amounts of the Hox genes in particular lead to dramatically rising numbers of digits.

Mice mutated for various genes as noted, now at birth, stained for cartilage (blue) and bone (red).

In roughly the same amount of tissue, many different numbers of digits can develop, based on a few genetic alterations.  Clearly the researchers are hot on the trail of how this pattern develops and will be looking for the particular components downstream of the Hox genes that carry out its regulatory directions, especially the protein or other chemical that forms the counter-gradient to Shh. It is a common theme in biology, that most of the action lies in complex layers of regulation (i.e. management) so that the ultimate actors can toe their lines with precision even in a variable genetic and external environment.

"It reports recent cases such as the US iconic firm Caterpillar which “reported record profits last year” but “insisted on a six-year wage freeze for many of its blue-collar workers”. That is not an isolated case. Indeed it is the norm and is one of the defining characteristics of the neo-liberal era that has dominated economic policy making over the last three decades.
... Everybody should benefit from productivity growth – that is what we call a society."

Saturday, January 12, 2013

Our parents give us meaning

We crave meaning in the approval of the parent, real and imagined.

How many times do you hear... if only my father had said I was doing OK, if only my father had said he loved me, if only I had a chance to show my mother how well I was doing before she died...? We grow up competing for our parents' attention and utterly dependent on it. If there is one sure influence from childhood, it is the frame of reference and attitudes of the parents. Sometimes this happens in reverse, by rebellion, but inevitably we later become our parents, so enmeshed in their world that leaving entirely is not an option.

And when they are gone? What then? The Romans, Japanese, and many other cultures made cults of their parents (in patriarchal fashion, just the male line). The parent is called to an altar where their judgement, forgiveness, boons, and advice are sought. Their gifts to us and ongoing effects are recognized. Their utter absence is so inconceivable that prayer starts to make sense. After all, they are so much a part of us that even if we are mumbling to ourselves, we speak to them too.

But why keep a different flame at every altar and hearth? They are ultimately the same supervisory concept, and a culture gains solidarity from giving them the same name. God. It is funny how, no matter the theological complexity and reasoned mystery of one's god, it is never "it", but always "Him" (or in outré cases, "Her").

The model never strays far from the father/mother model, which makes it immensely powerful- as a way to acculturate children with concepts the actual parents are not strong enough to convey, as a way to sanction whatever the reigning powers want to do, as a way to comfort and soothe adults who remain children deep inside. It goes to the extreme of denying death itself, as if putting our heads under the covers will make the horror go away.

And of course, it gives the deepest meaning to those who believe most "deeply". Who see the universe as a machine to give them meaning through the imagined directives of the invisible father, who gives them the most arduous tasks, attends to their most minute needs, and gives them the most glorious rewards. It almost makes you wonder just how far that great principle of neoteny can go- how far humans can go by refusing to grow up. For creating meaning is the true task of the adult.