
Saturday, March 16, 2024

Ideologies of Work

Review of Elizabeth Anderson's "Hijacked: How Neoliberalism Turned the Work Ethic Against Workers, and How Workers Can Take It Back."

We live by the sweat of our brow, through work. At least that has been the story since we were thrown out of the Garden of Eden, where we had previously foraged without effort. By the time of the Puritans, work had been re-valued as being next to godliness, in what became known as the Puritan work ethic. Elizabeth Anderson takes this as her point of departure in a fascinating historical study of the winding (and mostly descending) road that attitudes toward work took down the centuries, in the perennial battle between workers and the parasites who have found ways to avoid sweating, yet eat just the same ... or better.

Anderson trots through all the classical economists and philosophers, down to John Stuart Mill and Marx, showing two main threads of thought. First is the progressive thread, in which the Puritans can (curiously) be classed, as can Adam Smith. They value work as a culturally meaningful activity, not just a means of sustenance. They think everyone should work, and criticize anyone, high or low, who shirks this responsibility. Genteel landowners who spend their time hunting rather than improving their estates are just as culpable as drunkards and other able-bodied peasants who fail to do their share. Learning and innovation are highly valued, not just as ameliorating the lot of those making the improvements, but as raising the wealth of, and standard of living for, all.

In contrast is the conservative thread. Anderson herself describes it trenchantly:

"From the conservative perspective, however, poverty reflected an individual's failure to fulfill the demands of the work ethic. Society is at fault solely in establishing institutions that violate natural law in promoting vice through provisions such as the Poor Law. Conservatives agreed that the Poor Law must therefore be abolished or radically reformed. If poverty is caused by the vice of the poor, the remedy for poverty must be to force the poor to practice virtue, to live up to the demands of the work ethic. Conservatives differed somewhat on which virtue was most necessary for the poor to practice. Priestley focused on frugality, Bentham on industry, Malthus on chastity, Paley on contentment (understood as the opposite of covetous envy of the rich). Thus, Priestley hoped to convert poor workers into virtuous bourgeois citizens through a legally mandated individual savings plan. Bentham favored a workfare system that turned the working poor into imprisoned debt peons of capitalist entrepreneurs. Malthus advocated leaving the poor to starvation, disease and destitution, but offered them the hope that they could rescue themselves by postponing marriage and children. Burke and Whately agreed with Malthus, but attempted to put a liberal-Tory paternalist veneer on their view. ...

"The moral accounting that assigns responsibilities to individuals without regard- and even in inverse proportion- to the means they have to fulfill them remains a touchstone of conservative thought to the present day. ...

"The ideology of the conservative work ethic is distinguished by a harsh orientation toward ordinary workers and the poor, and an indulgent one toward the 'industrious' rich- those who occupy themselves with making money, either through work or investment of their assets, regardless of whether their activities actually contribute to social welfare. In practice, this orientation tends to slide into indulgence toward the rich, whether or not they are industrious even in this morally attenuated sense. ...

"Here lies a central contradiction of the conservative work ethic. All the conservatives claimed that the key to overcoming poverty was to make the poor bourgeois in attitude. All they needed to do was adopt the work ethic, or be forced to adopt it, along with the spirit of competitive emulation, the desire to better others in the race for riches and ensure that one's children not fall beneath the standard of living in which they were raised. Poverty was proof that they hadn't adopted bourgeois virtues and aspirations. This presupposed that the poor suffered from no deficit in opportunities. The path to prosperity was open; the poor were simply failing to take it. Yet we have seen that, Priestley partially excepted, conservative policies knowingly reduced the opportunities of the poor to acquire or retain property, work for themselves, or escape precarity."


My major critique of Anderson's analysis is that putting all this conflict and history into the frame of the work ethic is inappropriate and gives the work ethic far more weight than it merits. Firstly, everyone thinks of themselves as working. The most sedentary rentier doubtless thinks of his or her choosing among investments as of critical importance to the health and future of the nation. Even his or her shopping choices express taste and support a "better" sort of business, in that way performing work towards a better community. The English royals probably see themselves as doing essential cultural work, in their choice of hats and their preservation of cherished traditions. Parenting, community associations, and political agitation can all, to an expansive mind, be construed as "work". And indeed some of our greater artistic and other accomplishments come from the labors of wealthy people who were entirely self-directed rather than grubbily employed. All this implies that a work ethic can be accommodated in all sorts of ways if markets are not going to be the standard, as they hardly can be in any philosophical or moral system of a work ethic. This makes work ethics rather subjective and flexible, as Anderson implicitly demonstrates through the centuries.

However, a more serious problem with Anderson's analysis is that it leaves out the ethic of power. Her presentation laments the sad misuse that the work ethic has been subjected to over the years, (by conservatives), without focusing on the reason why, which is that a whole other ethic was at work, in opposition to the work ethic. And that is the power ethic, which values domination of others and abhors work as commonly understood. Or, at best, it construes the organization of society for the benefit of a leisured upper crust as work of momentous, even civilizational, significance. Nietzsche had a field day calling us to recognize and embrace the power ethic, and not hide it under sweeter-smelling mores like the Christian or work ethics.


Anderson does helpfully discuss in passing the feudal background to the Puritan work ethic, where the Norman grandees and their progeny parcelled out the land among themselves, spent their time warring against each other (in England or in France), and lived high off the labors of their serfs/peasants. No thought was given to improvement, efficiency, or better ways to organize the system. Conservatism meant that nothing (god-willing) would change, ever. Even so, the work of politics, of war, and of religious ideology was never done, and the wealthy could easily see themselves as crucial to the maintenance of a finely-balanced cultural and economic system.

Anderson also notes that the original rationale of the gentry, if one must put it in an economic frame, was that they were responsible for military support of the king and country, and thus needed to have large estates with enough surplus in people, livestock, horses, and food to field small armies. When this rationale disappeared with the ascendancy of Parliament and general (at least internal) peace, they became pure rentiers, and uncomfortably subject to the critique of the Puritan work ethic, which they naturally countered with one of their own devising. And that was essentially a restatement of the power ethic, that the rich can do as they please and the poor should be driven as sheep to work for the rich. And particularly that wealth is a signifier of virtue, implying application of the work ethic, (maybe among one's forebears, and perhaps more by plunder than sweat, but ... ), or transcending it via some other virtues of nobility or class.

But in Locke and Adam Smith's day, as today, the sharpest and most vexing point of the work ethic is not the role of the rich, but that of the poor. By this time, enclosure of lands was erasing the original version of the job guarantee- that is, access to common lands- and driving peasants to work for wages, either for landowners or industrialists. How to solve extreme poverty, which was an ever more severe corollary of capitalism and inequality? Is it acceptable to have homeless people sleeping on the streets? Should they be given work? money? social services? education? Do the poor need to be driven to work by desperation and starvation? Or is the lash of work not needed at all, and lack of wealth the only problem? Malthus was doggedly pessimistic, positing that population growth will always eat up any gains in efficiency or innovation. Thus it requires the predatory power of the gentry to enable society to accumulate anything in the way of capital or cultural goods, by squelching the poor in sufficient misery that they will not over-reproduce.

The progressive thread took a much more sanguine view of work and the poor. And here one can note that much of this discussion revolves around "natural" laws. Is the population law of Malthus true? Or is the natural communitarian tendency of humans also a natural law, leading to mutual help, spontaneous community formation, and self-regulation? Are some people "naturally" superior to others? Is a hierarchical and domineering social system "natural" and necessary? Adam Smith, in Anderson's reading, took a consistently pro-worker attitude, inveighing against oppressive practices of employers, collusion of capital, and cruel government policies. Smith had faith that, given a fair deal and decent education, all workers would strive to the best of their abilities to better their own condition, work diligently, and thereby benefit the community as well as themselves.


For the story of Eden is fundamentally wrong. Humans have always worked, and indeed valued work. Looking outside the window at a squirrel trying to get into the bird feeder ... is to see someone working with enthusiasm and diligence. That is our natural state. The only problem was that, as human civilization progressed, power relations, and then even more- industrialization- generated work that was not only cruel and oppressive, but meaningless. The worker, forced to work for others instead of him- or herself, and routinized into a factory cog, became fully alienated from it. How to get workers to do it, nevertheless? Obviously, having a work ethic is not a full solution, unless it is of a particularly astringent and dogmatic (or tyrannical) sort. Thus the dilemma of capitalist economies. For all its trumpeting of the "natural laws" of competition and the "freedom" of employers to exploit and of workers to be fired, capitalism violates our true natures in fundamental ways.

So the question should be, as Anderson eventually alludes to, do we have a life ethic that includes work, rather than just a work ethic? She states plainly that the most important product of the whole economic system is ... people. Their reproduction, raising, education, and flourishing. It is not consumption products that should be the measure of economic policy, but human happiness. And a major form of human happiness is doing meaningful work, including the domestic work of the family. The world of Star Trek is even alluded to in Anderson's last chapter- one where no one works for subsistence, but rather, people work for fulfillment. And they do so with zeal.

Anderson sees great potential in the more progressive forms of the work ethic, and in the social democratic political systems that implemented them after World War 2. She argues that this is the true legacy of Marxism (and of Thomas Paine, interestingly enough) and expresses the most durable compromise between market and capital-driven corporate structures and a restored work ethic. Some amount of worker participation in corporate governance, for instance, is a fundamental reform that would, in the US, make corporations more responsive to their cultural stakeholders, and work more meaningful to workers. Tighter regulation is needed throughout the private economy to make work more humane for the very low-paid, giving workers better pay and more autonomy- real freedom. More public goods, such as free education to university levels, and better provision for the poor, especially in the form of a job guarantee, would make life bearable for many more people. For my part, inheritance seems a key area where the ethics of dignified work and equal opportunity run up against completely unjust and artificial barriers. In America, no one should be born rich, and everyone should grow and express themselves by finding a place in the world of work.


  • Annals of capitalist control.
  • Corporations and the royal we.
  • More equal societies are better societies.
  • The Stepford wife.
  • The Supreme Court is dangerously wrong.

Saturday, February 10, 2024

How the Supreme Court Should Rule in the Colorado Ballot Case

There is one path forward that can salvage the court's standing.

The US Supreme Court is sinking to unusual depths of corruption and illegitimacy. Bush v. Gore was a step down in its ability to manage the rules of our political and legal system, where it made a hasty and, it claimed, one-time-only carve-out for its favored candidate, leading to almost a decade of tragically bad policy and poor government. Then came Citizens United, another step downward, opening firehoses of secret money from the wealthy, using the fig leaf of "free speech" to cover the corruption of politics with money. Then came the overturning of Roe, deeming women unworthy of rights that are far more basic and intimate than those enumerated in the Bill of Rights. And most recently has come the drumbeat of reports of corruption among the right-wing justices, who appear to regard themselves as too dignified to abide by the laws and norms they hold others to.

Now it is faced with a case that tests the very core of the court's ability to do its job. What does the constitution mean? Does the fourteenth amendment mean what it says, and if so, should it be enforced? A great deal of commentary suggests, probably correctly, that this court is desperately looking for a way out of this legal conundrum that allows it to do nothing and avoid overturning any apple carts. That would not, however, be a wise course.

To recap, the Colorado case was brought by voters who sought to bar Donald Trump from the Colorado primary and general election ballots, due to his participation in the insurrection of January 6, 2021. The fourteenth amendment to the federal constitution bars such participants from federal and state offices. The Colorado Supreme Court ultimately agreed, sending the case to the US Supreme Court. The congressional report on the January 6 events makes the record of those events quite clear. It uses the word "insurrection" several times, as do many of its sources, and it is crystal clear about the participation by and culpability of Donald Trump in those events. 


The question is really about how the Constitutional provision should be brought into execution, being worded without a lot of explicit legal structure. One thing it does say is that congress can relieve its prohibition in individual cases by two-thirds votes of each house. But it leaves unsaid who should adjudicate the question of fitness for office, as is also the case for the more basic qualifications such as age and citizenship. Trump had previously, and ironically, dabbled in these same legal waters by casting doubt on the citizenship of Barack Obama. But since no one with half a brain took him seriously, the issue never entered the legal sphere.

Well, the worst course would be to let the clear language of the constitution lie inert and find some technicality by which to do nothing. What I would suggest instead is that the court recognize that there needs to be a national adjudicating power to decide this question in the case of candidates for national office (and indeed for any office whose qualifications are mentioned in the constitution). And that power should be itself, the US Supreme Court. The court might invite the legislative branch to provide more regular methods of fact-finding, (or even a clarifying amendment to the constitution), but given the constitution's clear intent, history, and logic, (not to mention the general Article III clauses putting all questions arising from the constitution in its hands), the court should take upon itself the power to say that the buck stops at its door. And naturally, in consequence, that Trump merits disqualification, on the facts of the January 6 events as found by the lower courts, and on his position as an officer, indeed the paramount officer, of the United States.

This solution would neatly take over from the states the responsibility of saying that any national candidate meets or does not meet the various qualifications set forth in the constitution. Such cases could begin in state courts, as this one did, but would need to go to the US Supreme Court for final decision. This solution would hold Trump to account for his actions, a principle that Republicans have, at least  traditionally, cherished. This solution would also go some way to removing the stain of the Bush v Gore decision, and establish a new and clear element of constitutional jurisprudence, in setting forth who adjudicates the qualifications of national political candidates. In fact, this function can be tied to the practice of having the chief justice of the United States administer the oath of office to the incoming president. It would be proper for the court to be the institution that decides on the basic fitness tests, and thus who in general may take the oath, while the people decide the political outcome of the election, among fit candidates.

I am no legal scholar, but the merits of this solution seem pretty clear. On an earlier occasion, the court summarily took on the task of determining the constitutionality of laws. This role was not explicitly set out in the text, but was a logical consequence of the structure that the constitution set up. Here likewise, the logic of the constitution indicates strongly that the final word on the fitness of candidates for national office must rest with, not the voters, not the states, and not the legislative or, heaven forbid, the executive branch, but with the federal judicial branch, of which the US Supreme Court is the head.

An alternative, and perhaps more likely, solution, is for this court to state all the principles above, but then hold that in its judgement, Donald Trump is fit for office after all. Maybe it will deem the insurrection just a little insurrection, and not the kind of big insurrection that would turn a jurist's head (despite the over one thousand charges filed, and hundreds of federal convictions so far). Or maybe it will deem Trump insufficiently involved in the insurrection to merit disqualification. What it cannot do is deem him not an officer of the federal government- that would be beyond belief. The pusillanimous, partisan sophistry of this alternative would not go over well, needless to say. Many would ask whether Clarence Thomas, himself virtually a participant in the insurrection at one remove, should have recused himself. Minds would be blown, but few would be surprised, since for this court, expectations could hardly be lower. Going against its partisan grain would, on the other hand, be a signal and heartening achievement.

This second approach would at least resolve the legal questions, but at the cost of further erosion of the court's legitimacy, given that the events of January 6 are so well documented, and the constitutional peril that Trump poses so obvious. For the whole point of having a Supreme Court which takes on tough issues and plugs logical holes in our constitution is that it also takes some care to plug them well, and preserve our democracy in the process.


  • What happens when the Supreme Court gives in to politics?
  • One state, one system.
  • A solar energy insurrection in Puerto Rico.
  • Democratic inequality is related to wealth inequality.
  • More on the court case- ballots vs office holding.

Saturday, February 3, 2024

Spiritual Resources for the Religiously Abstemious

Nones are now a plurality in the US. What are we supposed to do?

The Pew Research Center recently came out with polling that shows a significantly changed religious landscape in the US. Over the last couple of decades, while the religious right has been climbing the greasy pole of political power, gaining seats on the Supreme Court, and agitating for a return to patriarchy, their pews have been emptying. The religiously unaffiliated, or "nones", comprise 28% of the US population now, almost double the level two decades ago.

One has only to see the rabid support evangelicals give their orange-haired messiah to understand what has been turning people off. Or glance over the appalling chronicle of sexual abuse unearthed in the Catholic church. Maybe the horsemen of the Atheist apocalypse have had something to do with it. Russia under Putin is strenuously demonstrating that the same system can be just as cruel with or without religion. But these patterns of gross institutional, moral, and intellectual failure, and their ensuing critiques, are hardly new. Luther made a bit of hay out of the abuses of the Catholic church, Voltaire, among many other thinkers, ridiculed the whole religious enterprise, and Hitler was a forerunner of Trump in leaning on religion, at least early in his career, despite being a rather token Christian himself (other than in the antisemitism, of course). What is new now?

A dramatic rise in numbers of people with no religious affiliation and little interest, from Pew polling.

I am not sure, frankly. Europe has certainly been leading the way, showing that declining religion is quite compatible with prosperous and humane culture. But perhaps this phenomenon is part of the general isolation and atomization of US culture, and thus not such a good thing. It used to be that a community was unthinkable without a church (or several) to serve as the central hub. Churches served to validate the good and preach to the bad. They sponsored scout troops, weddings, charitable events and dinners, and committees and therapeutic encounters of all sorts. They were socially essential, whether one believed or not. That leaders of society also led the churches knit the whole circle together, making it easy to believe that something there was indeed worth believing, whether it made sense or not.

Now, the leadership of society has moved on. We are mesmerized by technology, by entertainment, and sports, perhaps to a degree that is new. The capitalist system has found ways to provide many of the services we used to go to churches for, to network, to get psychotherapy, to gossip, and most of all, to be entertained. Community itself is less significant in the modern, suburban, cocooned world. Successful churches meet this new world by emphasizing their social offerings in a mega-church community, with a dash of charismatic, but not overly intellectually taxing, preaching. Unfortunately, megachurches regularly go through their own crises of hypocrisy and leadership, showing that the caliber of religious leaders, whatever their marketing skills, has been declining steadily.

The "nones" are more apathetic than atheistic, but either way, they are not great material for making churches or tightly knit communities. Skeptical, critical, or uninterested, they are some of the least likely social "glues". Because, frankly, it takes some gullibility and attraction to the core human archetypes and drama to make a church, and it takes a lot of positive thinking to foster a community. I would promote libraries, arts institutions, non-profits, and universities as core cultural hubs that can do some of this work, fostering a learning and empathetic culture. But we need more.

As AI takes over work of every sort, and more people have more time on their hands, we are facing a fundamental reshaping of society. One future is that a few rich people rake off all the money, and the bulk of the population descends into poverty and joblessness, unneeded in a society where capitalism has become terminally capital-intensive, with little labor required. Another future is where new forms of redistribution are developed, either by bringing true competition to bear on AI-intensive industries so that they can not take excess profits, or by thorough regulation for the public good, including basic income schemes, public goods, and other ways to spread wealth broadly. 


The latter system would free resources for wider use, so that a continuing middle class economy could thrive, based on exchanges that are now only luxuries, like music, personal services, teaching, sports, counseling. The destruction of the music recording industry by collusion of music labels and Spotify stands as a stark lesson in how new technology and short-sighted capitalism can damage our collective culture, and the livelihood of a profession that is perhaps the avatar of what an ideal future would look like, culturally and economically.

All this is to say that we face a future where we should, hopefully, have more resources and time, which would in principle be conducive to community formation and a life-long culture of learning, arts, and personal enrichment, without the incessant driver of work. The new AI-driven world will have opportunities for very high level work and management, but the regular hamburger flippers, baristas, cabbies, and truck drivers will be a thing of the past. This is going to put a premium on community hubs and new forms of social interaction. The "nones" are likely to favor (if not build) a wide range of such institutions, while leaving the church behind. It is a mixed prospect, really, since we will still be lacking a core institution that engages with the whole person in an archetypal, dream-like fantasy of hope and affirmation. Can opera do that work? I doubt it. Can Hollywood? I doubt that as well, at least as it applies to a local community level that weaves such attractions together with service and personal connection.


  • Those very highly moral religious people.
  • Molecular medicine is here.
  • Why do women have far more autoimmune syndromes?
  • What to do about Iran.
  • "As we’ll see, good old-fashioned immortality has advantages that digital immorality cannot hope to rival." ... I am not making this up!


Saturday, January 20, 2024

The Tragedy of Daniel Boone

Pathfinding and hunting his way through the paradise the Indians had built.

Daniel Boone is (or used to be) one of the most iconic / archetypal figures in US history and popular consciousness. His remains have been fought over, his life mythologized and serialized, and his legacy cherished as heroic and exemplary. It all began with his trusty rifle, with which he was the surest shot. He was a pathfinder, never lost in the vast wilderness he explored and helped settle. And he was a steadfast leader of men, rescuer of damsels in distress, and killer of Indians. What's not to admire? His definitive biography, by John Faragher, paints a more ambivalent picture, however.

Boone loved the woods- loved hunting, loved nature, and loved solitude. Given those talents and tendencies, he naturally strayed from the borderlands of North Carolina into the mountains, becoming a full time hunter and trapper. In a couple of early forays into what we now know as Kentucky, he hunted on a commercial basis, wasting the animals to pile up hundreds of pelts, which his employees / colleagues processed in camp. 

The biography emphasizes that what Boone found in Kentucky was a paradise- lush and full of game. The region, believe it or not, was full of not just deer and beaver, but bear and buffalo. It is the kind of Eden that had been encountered by Europeans many times over in the "New World". Fisheries of unimaginable richness, skies full of birds, forests as far as the eye could see. Kentucky was not an uninhabited Eden, however- it was the cherished hunting ground of native Cherokee and Shawnee, among others, who saw exactly what Boone saw, but responded to it differently. Not with plunder and destruction, but with care and stewardship.

Boone blindly shot away, and then followed his cultural programming further by leading his family and many others across the mountains to found Boonesborough, building a fort and defending it against numerous Indian attacks. The biography notes that Boone's parents had ten children, and he had ten children, and his children had similar sized families. One can imagine where that kind of reproduction leads, to desperate expansion and heedless use of resources. While acknowledged as the pioneer of Kentucky settlement, Boone was no businessman, and all his grasping for land in the speculative rush that developed in his wake came to naught. He was sloppy in his paperwork and was outlawyered and out-cheated at every turn. One may see the personality type of his adversary in the current senior senator from Kentucky, Mitch McConnell. Boone was all too honest and simple, having been raised a Quaker.

Portrayal of the siege of a stockade, not unlike that of Boonesborough, as Native Americans try to drive off the cloud of locusts denuding their land.

The game had been hunted out, the people had become unfriendly and dense underfoot, and Boone's property and business schemes had all fallen apart. In despair over what he had wrought in Kentucky, Boone pulled up stakes and moved out to the next frontier, near St. Louis. An extremely late hunting trip has him heading through what is now Yellowstone Park, reliving for the last time the kind of Eden that Native Americans had nurtured with their respect for the value and cycles of nature, and even more, with their light footprint as small populations.

European culture and immigrants have accomplished wonderful things in America. But decimating its natural wonders, resources, and native peoples is not one of them. Daniel Boone was caught up in the economics of inexorable population growth and the need to make a "business model" out of hunting and trapping. Well, what comes of that is not pretty, and not at all sustaining of what had brought him into the woods in the first place.


Saturday, January 6, 2024

Damned if You do, Damned if You Don't

The Cherokee trail of tears, and the Palestinian conundrum.

History is a long and sad tale of conflict, interspersed with better times when people can put their animosities aside. Just as economics deals in scarcity and its various solutions, history likewise turns on our inevitable drive towards overpopulation, with resulting scarcity and conflict. Occasionally, special technological, spiritual, organizational achievements- or catastrophes- may allow periods of free population growth with its attendant buoyant mood of generosity. But more commonly, groups of people covet each other's resources and plot ways to get them. This was one of the lessons of Malthus and Darwin, who addressed the deeper causes of what we see as historical events.

The "New World" provided Europeans with an unprecedented release for their excess populations, especially the malcontented, the desperate, and the ambitious. They rhapsodized about the "virgin" lands that lay open, generally dismissing the numerous and well-organized natives present all over these lands, as "savages", occupying a lower technological and theological level of existence. There were plenty of rationalizations put forth, like Christianizing the natives, or "civilizing" them. But the hypocrisy of these formulations becomes clear when you consider the fate of the Cherokees, one of the "five civilized tribes". 

By the early 1800's, a couple of centuries of contact had already passed under the bridge, (as narrated by Pekka Hämäläinen in "Indigenous continent"), and native Americans were all integrated to various degrees in trading networks that brought them European goods like guns, pots, knives, and novel practices like horse riding. The Cherokees, occupying the lower Appalachians and piedmont between what is now Georgia and Alabama, were more integrated than most, adopting European farming, living, schooling, and governing practices. They even owned African American slaves, and wrote themselves a US-modeled constitution in 1827, in the script devised by the scholar Sequoyah.

Did this "progress" toward assimilation with the European culture help them? Far from it! Their excellence in farming, literacy, and government raised fears of competition in the white colonists, and the Georgia state government lobbied relentlessly for their removal. Andrew Jackson finally obliged. He pressured the Cherokees to re-open their status as a settled nation, devised a removal treaty with a minority party, and then sent all the Cherokees in the region (about 16,000) off on the Trail of Tears, to the barren lands of Oklahoma. These Cherokees lost roughly a quarter of their population along the way, in a brutal winter. Compare this with the partition of India, where about twelve percent of the refugees are thought to have perished, out of roughly 16 million total.

A small part of the annals of ethnic cleansing, US edition. Needless to say, the "Indian territory" ended up a lot smaller than originally promised.
 

Georgia was thus ethnically cleansed, and does not seem to experience a great deal of regret about it. The logic of power is quite simple- the winner gets the land and spoils. The loser is lucky to not be killed. That the Europeans were significantly more powerful than their native antagonists doesn't change the logic, though it might appeal to our empathy and nostalgia in retrospect. The Cherokees and other Native Americans might have been accepted into US society. They might have been given one or two states for their sovereign governments, as the Mormons managed. There were a lot of possibilities that might have made us a more interesting and diverse nation. But at the same time, most Native Americans participated fully in the politics of power, terrorizing each other, making slaves of each other, and killing each other. They were not innocents. So the fact that they came up against a stronger power was hardly a novelty, though in this case that power was blundering and cruel, shared very few of their cultural coordinates, and was highly hypocritical about its own.

All this comes to mind when viewing the Israeli-Palestinian conflict. Israel won the major Middle East wars that so dramatically emasculated the Palestinians, first in the civil war that left Jordan and Egypt in charge of the Palestinian areas, then in the 1967 war that left all these areas in Israeli hands. But what to do with them? On founding, Israel was a liberal, New Testament kind of country, with humanist values and lefty kibbutzim. The then-recent Holocaust also caused a bit of hesitance when it came to either killing or exiling the losing Palestinians. Indeed, given that its neighbors Jordan and Egypt lost these wars, it would have made some sense at that time to deport all the Palestinians, of whom there were about one to two million. But rather than do that, or make a firm border, Israel immediately started encroaching into Palestinian territory with security areas and "settlements", and has set up an ever more elaborate, though selectively porous and self-serving, security and boundary system.

Both sides have a schizophrenic reaction to the other. On the Palestinian side, the psychology of losing has meant quietism and acquiescence by some, but resentment and militancy by others. Both lead to a spiral of worse treatment, the weakness of the former inviting abuse, and the desperate depredations of the latter inciting revenge, "security" measures, and tighter occupation. The provocations by each side are unendurable, and thus the situation deteriorates. Yet, in the end, Israel has all the power and the responsibility to come up with a long term solution. Over the decades, Israel has morphed from its founding ethos into something much more conservative and Old Testament, less beholden to the humanistic ideals of the post-WW2 period. The wanton killing, starvation, and collective punishment of Gaza makes visible this moral breakdown.

The Palestinians can't win either way, either through Hamas's implacable hatred and impotent attacks, or through the acquiescence of the Palestinian National Authority, which, in thanks for its good behavior, has received the creeping expansion of Israeli "settlements" on its land. These now take up, according to a detailed map, about 1/3 to 1/2 of the land of the West Bank. Overall, the options are: 1) to expel the Palestinians outright, which appears to be, for Gaza at least, where Israeli policy is heading, (made more ironic by the realization by historians that the Biblical Exodus never actually took place), or 2) to continue to muddle along in a torturous occupation with creeping dispossession, or 3) to grant Palestine some kind of autonomy and statehood. Assimilation, (4), long dreamt of by some, seems impossible for a state that is fundamentally an ethnic (or theological) state, and whose whole raison d'etre is ethnic separation, not to even mention the preferences of the Palestinians. Though perhaps assimilation without voting rights, in a sort of semi-slavery or apartheid, is something the Israelis would be attracted to? Perhaps insignia will need to be worn by all Palestinians, sewn to their clothing?

Map of the West Bank of the Jordan, color coded by Palestinian marginal control in brown, and settler/Israeli control in red.

What should happen? Indigenous Americans were infected, decimated, hunted down, translocated, re-educated, and confined to a small and very remote system of reservations. Hopefully we have progressed a little since then, as a largely European civilization, which is putatively shared by Israel. Thus the only way forward, as is recognized by everyone outside Israel, is the two-state solution, including a re-organization of the Palestinian territories into a final, clearly demarcated, and contiguous state. Israel's current political system will never get there. But we can help the process along in a few ways.

First, it is disappointing to see our current administration shipping arms to Israel at a furious pace, only to see them used to kill thousands of innocent, if highly resentful, civilians. Israel has plenty of its own money to buy whatever it needs elsewhere. We need to put some limitations on our military and other aid relationships, to motivate change. (Though that raises the question of Israel's increasingly cozy relationship with Russia). Second, we should recognize Palestine as a state, and bring forward its integration into the international system. This will not resolve its borders or myriad security and territory issues vis-à-vis Israel, but it would helpfully motivate things in that direction. Israel has constantly cried wolf about the lack of a credible partner to negotiate with, but that is irrelevant. Israel is perfectly capable of building the walls it needs to keep Palestinians at bay. But then it wants pliant workers as well, and a peaceful neighbor, security vis-à-vis Jordan and Egypt, territorial encroachments, and many other things that are either destructive, or need to be negotiated.

By far the most constructive thing that could be done is to freeze and re-organize the Jewish settlements and other paraphernalia that have metastasized all over the West Bank. There is no future without a decent and fair solution in territory, which is the third big thing we need to press- our own detailed territorial plan for Palestine. For one thing, Israel could easily vacate the whole corridor / valley facing Jordan. That would give a consolidated Palestine a working border with a country that is now peaceful, quite well run, and friendly to both sides. There are countless possible maps. We just need to decide on one that is reasonably fair and force it on both sides, which are each, still after all these years, apparently unwilling to imagine a true peace. This means principally forcing it on Israel, which has been the dominant and recalcitrant party the entire time.

The Cherokees are now one of the largest indigenous populations in the US, at roughly a quarter million, with their own territory of about seven thousand square miles in Oklahoma. They have internal and partial sovereignty, which means that they deal with their own affairs on a somewhat independent basis, but otherwise are largely subject to most laws of the enclosing governments. The Cherokees could easily have been assimilated into the US. Only racism stood in the way, in a mindset that had long descended into a blind and adversarial disregard of all native Americans as "others", (the irony!), competitive with, and less than, the newly arrived immigrants. We could have done much better, and one would like to think that, a hundred or a hundred and fifty years on, we would have.

In the end, the West (read as European civilization, as developed out of the ashes of World War 2) is either for or against wars of aggression, ethnic cleansing, apartheid, and human rights. Israel has won its wars, but never faced up to its responsibilities to the conquered Palestinians, and has tried to have it both ways, to be viewed by the world as a modern, enlightened state, even as it occupies and slowly strangles the people it defeated decades ago. 


  • Slovenly strategic thinking. But really, visionless long-term politics.
  • One Gazan speaks.
  • Settler colonialism.
  • Who's the victim?
  • Shades of Scientology ... the murky networks of the deep evangelical state.
  • In California, solar still makes sense.

Saturday, December 23, 2023

How Does Speciation Happen?

Niles Eldredge and the theory of punctuated equilibrium in evolution.

I have been enjoying "Eternal Ephemera", which is an end-of-career memoir/intellectual history from a leading theorist in paleontology and evolution, Niles Eldredge. In this genre, often of epic proportions and scope, the author takes stock of the historical setting of his or her work and tries to put it into the larger context of general intellectual progress, (yes, as pontifically as possible!), with maybe some gestures towards future developments. I wish more researchers would write such personal and deeply researched accounts, of which this one is a classic. It is a book that deserves to be in print and more widely read.

Eldredge's claim to fame is punctuated equilibrium, the theory (or, perhaps better, observation) that evolution occurs much more haltingly than in the majestic gradual progression that Darwin presented in "Origin of Species". This is an observation that comes straight out of the fossil record. And perhaps the major point of the book is that the earliest biologists, even before Darwin, but also including Darwin, knew about this aspect of the fossil record, and were thus led to concepts like catastrophism and "etagen". Only Lamarck had a steadfastly gradualist view of biological change, which Darwin eventually took up, while replacing Lamarck's mechanism of intentional/habitual change with that of natural selection. Eldredge unearths tantalizing and, to him, supremely frustrating, evidence that Darwin was fully aware of the static nature of most fossil series, and even recognized the probable mechanism behind it (speciation in remote, peripheral areas), only to discard it for what must have seemed a clearer, more sweeping theory. But along the way, the actual mechanism of speciation got somewhat lost in the shuffle.

Punctuated equilibrium observes that most species recognized in the fossil record do not gradually turn into their descendants, but are replaced by them. Eldredge's subject of choice is trilobites, which have a long and storied record for almost 300 million years, featuring replacement after replacement, with species averaging a few million years duration each. It is a simple fact, but one that is a bit hard to square with the traditional / Darwinian and even molecular account of evolution. DNA is supposed to act like a "clock", with constant mutational change through time. And natural selection likewise acts everywhere and always... so why the stasis exhibited by species, and why the apparently rapid evolution in between replacements? That is the conundrum of punctuated equilibrium.
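
To make the "clock" idea concrete, here is the standard back-of-the-envelope dating logic, offered as a textbook sketch rather than anything from Eldredge's book. It assumes a constant substitution rate, which is exactly the assumption that the fossil pattern of stasis and punctuation calls into question.

```latex
% Molecular-clock dating sketch (textbook assumption of a constant rate).
% d: observed divergence between two lineages, in substitutions per site
% k: substitution rate per site per year
% t: time since their common ancestor (both branches accumulate change)
\[
  d \approx 2kt \qquad\Longrightarrow\qquad t \approx \frac{d}{2k}
\]
% Example: d = 0.02 and k = 10^{-9} per site per year gives
% t \approx 0.02 / (2 \times 10^{-9}) = 10^{7} years, i.e. about 10 million years.
```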

There have been a lot of trilobites. This comes from a paper about their origin during the Cambrian explosion, arguing that only about 20 million years was enough for their initial speciation (bottom of image).

The equilibrium part, also termed stasis, is seen in the current / recent world as well as in the fossil record. We see species such as horses, bison, and lions that are identical to those drawn in cave paintings. We see fossils of animals like wildebeest that are identical to those living, going back millions of years. And we see unusual species in recent fossils, like saber-toothed cats, that have gone extinct. We do not typically see animals that have transformed over recent geological history from one (morphological) species into another, or really, into anything very different at all. A million years ago, wildebeest seem to have split off a related species, the black wildebeest, and that is about it.

But this stasis is only apparent. Beneath the surface, mutations are constantly happening and piling up in the genome, and selection is relentlessly working to ... do something. But what? This is where the equilibrium part comes in, positing that wide-spread, successful species are so hemmed in by the diversity of ecologies they participate in that they occupy a very narrow adaptive peak, which selection works to keep the species on, resulting in apparent stasis. It is a very dynamic equilibrium. The constant gene flow among all parts of the population that keeps the species marching forward as one gene pool, despite the ecological variability, makes it impossible to adapt to new conditions that do not affect the whole range. Thus, paradoxically, the more successful the species, and the more prominent it is in the fossil record, the less change will be apparent in those fossils over time.

The punctuated part is that these static species in the fossil record eventually disappear and are replaced by other species that are typically similar, but not the same, and do not segue from the original in a gradual way that is visible in the fossil record. No, most species and locations show sudden replacement. How can this be so if evolution by natural selection is true? As above, wide-spread species are limited in what selection can do. Isolated populations, however, are more free to adapt to local conditions. And if one of those local conditions (such as arctic cold) happens to be what later happens to the whole range (such as an ice age), then it is more likely that a peripherally (pre-)adapted population will take over the whole range, than that the resident species adapts with sufficient speed to the new conditions. Range expansion, for the peripheral species, is easier and faster than adaptation, for the wide-ranging originating species.

The punctuated equilibrium proposition came out in the 1970's, and naturally followed theories of speciation by geographic separation that had previously come out (also resurrected from earlier ideas) in the 1930's to 1950's, but which had not made much impression (!) on paleontologists. Paleontologists are always grappling with the difficulties of the record, which is partial, and does not preserve a lot of what we would like to know, like behavior, ecological relationships, and mutational history. But they did come to agree that species stasis is a real thing, not just, as Darwin claimed, an artifact of the incomplete fossil record. Granted- if we had fossils of all the isolated and peripheral locations, which is where speciation would be taking place by this theory, we would see the gradual change and adaptation taking place. So there are gaps in the fossil record, in a way. But as long as we look at the dominant populations, we will rarely see speciation taking place before our eyes, in the fossils.

So what does a molecular biologist have to say about all this? As Darwin insisted early in "Origin", we can learn quite a bit from domesticated animals. It turns out that wild species have a great amount of mostly hidden genetic variation. This is apparent whenever one is domesticated and bred for desired traits. We have bred dogs, for example, to an astonishingly wide variety of traits. At the same time, we have bred them out to very low genetic diversity. Many breeds are saddled with genetic defects that can not be resolved without outbreeding. So we have in essence exchanged the vast hidden genetic diversity of a wild species for great visible diversity in the domesticated species, combined with low genetic diversity.

What this suggests is that wild species have great reservoirs of possible traits that can be selected for the purposes of adaptation under selective conditions. Which suggests that speciation in range edges and isolated environments can be very fast, as the punctuated part of punctuated equilibrium posits. And again, it reinforces the idea that during equilibrium with large populations and ranges, species have plenty of genetic resources to adapt and change, but spend those resources reinforcing / fine tuning their core ecological "franchise", as it were.

In population genetics, it is well known that mutations arise and fix (that is, spread to 100% of the population on both alleles) at the same rate no matter how large the population, in theory. That is to say- bigger populations generate more mutations, but correspondingly hide them better in recessive form (if deleterious) and for neutral mutations, take much longer to allow any individual mutation to drift to either extinction or fixation. Selection against deleterious mutations is more relentless in larger populations, while relaxed selection and higher drift can allow smaller populations to explore wider ranges of adaptive space, perhaps finding globally higher (fitness) peaks than the parent species could find.
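
The arithmetic behind that claim is the classic neutral-theory calculation, sketched below as a textbook aside (not something specific to Eldredge's argument): the larger mutational input of a big population is exactly cancelled by each new mutation's smaller chance of drifting to fixation.

```latex
% Neutral substitution rate in a diploid population of size N, with neutral
% mutation rate \mu per gene copy per generation (classic neutral-theory result).
\[
  \underbrace{2N\mu}_{\text{new neutral mutations per generation}}
  \times
  \underbrace{\frac{1}{2N}}_{\text{fixation probability of each new mutation}}
  = \mu
\]
% N cancels, so the long-run substitution rate equals the mutation rate,
% independent of population size; any single mutation that does fix still
% takes on the order of 4N generations, far longer in large populations.
```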

Eldredge cites some molecular work that claims that at least twenty percent of sequence change in animal lineages is due specifically to punctuational events of speciation, and not to the gradual background accumulation of mutations. What could explain this? The actual mutation rate is not at issue, (though see here), but the numbers of mutations retained, perhaps due to relaxed purifying selection in small populations, and founder effects and positive selection during the speciation process. This kind of phenomenon also helps to explain why the DNA "clock" mentioned above is not at all regular, but quite variable, making an uneven guide to dating the past.

Humans are another good example. Our species is notoriously low in genetic diversity, compared to most wild species, including chimpanzees. It is evident that our extremely low population numbers (over prehistoric time) have facilitated speciation, (that is, the fixation of variants which might be swamped in bigger populations), which has resulted in a bewildering branching pattern of different hominid forms over the last few million years. That makes fossils hard to find, and speciation hard to pinpoint. But now that we have taken over the planet with a huge population, our bones will be found everywhere, and they will be largely static for the foreseeable future, as a successful, wide-spread species (barring engineered changes). 

I think this all adds up to a reasonably coherent theory that reconciles the rest of biology with the fossil record. However, it remains frustratingly abstract, given the nature of fossils that rarely yield up the branching events whose rich results they record.


Saturday, November 25, 2023

Are Archaea Archaic?

It remains controversial whether the archaeal domain of life is 1 or 4.5 billion years old. That is a big difference!

Back in the 1970's, the nascent technologies of molecular analysis and DNA sequencing produced a big surprise- that hidden in the bogs and hot springs of the world are micro-organisms so extremely different from known bacteria and protists that they were given their own domain on the tree of life. These are now called the archaea, and in addition to being deeply different from bacteria, they were eventually found to be the progenitors of the eukaryotic cell- the third (and greatest!) domain of life that arose later in the history of the biosphere. The archaeal cell contributed most of the nuclear, informational, membrane management, and cytoskeletal functions, while one or more assimilated bacteria (most prominently the future mitochondrion and chloroplast) contributed most of the metabolic functions, as well as membrane lipid synthesis and peroxisomal functions.

Carl Woese, who discovered and named archaea, put his thumb heavily on the scale with that name, (originally archaebacteria), suggesting that these new cells were not just an independent domain of life, totally distinct from bacteria, but were perhaps the original cell- that is, the LUCA, or last universal common ancestor. All this was based on the sequences of rRNA genes, which form the structural and catalytic core of the ribosome, and are conserved in all known life. But it has since become apparent that sequences of this kind, which were originally touted as "molecular clocks", or even "chronometers", are nothing of the kind. They bear the traces of mutations that happen along the way, and, being highly important and conserved, do not track the raw mutation rate, (which itself is not so uniform either), but rather the rate at which change is tolerated by natural selection. And this rate can be wildly different at different times, as lineages go through crises, bottlenecks, adaptive radiations, and whatever else happened in the far, far distant past.

Carl Woese, looking over filmed spots of 32P labeled ribosomal RNA from different species, after size separation by electrophoresis. This is how RNAs were analyzed, back in 1976, and such rough analysis already suggested that archaea were something very different from bacteria.

There since has been a tremendous amount of speculation, re-analysis, gathering of more data, and vitriol in the overall debate about the deep divergences in evolution, such as where eukaryotes come from, and where the archaea fit into the overall scheme. Compared with the rest of molecular biology, where experiments routinely address questions productively and efficiently due to a rich tool chest and immediate access to the subject at hand, deep phylogeny is far more speculative and prone to subjective interpretation, sketchy data, personal hobbyhorses, and abusive writing. A recent symposium in honor of one of its more argumentative practitioners made that clear, as his ideas were being discarded virtually at the graveside.

Over the last decade, estimates of the branching date of archaea from the rest of the tree of life have varied from 0.8 to 4.5 Gya (billion years ago). That is a tremendous range, and is a sign of the difficulty of this field. The frustrations of doing molecular phylogeny are legion, just as the temptations are alluring. Firstly, there are very few landmarks in the fossil record to pin all this down. There are stromatolites from roughly 3.5 Gya, which pin down the first documented life of any kind. Second are eukaryotic fossils, which start, at the earliest, about 1.5 Gya. Other microbial fossils pin down occasional sub-groups of bacteria, but archaea are not represented in the fossil record at all, being hardly distinguishable from bacteria in their remains. Then we get the Cambrian explosion of multicellular life, roughly 0.5 Gya. That is pretty much it for the fossil record, aside from the age of the moon, which is about 4.5 Gya and gives us the baseline of when the earth became geologically capable of supporting life of any kind.

The molecules of living organisms, however, form a digital record of history. Following evolutionary theory, each organism descends from others, and carries, in mutated and altered form, traces of that history. We have parts of our genomes that vary with each generation, (useful for forensics and personal identification), we have other parts that show how we changed and evolved from other apes, and we have yet other areas that vary hardly at all- that carry recognizable sequences shared with all other forms of life, and presumably with LUCA. This is a real treasure trove, if only we can make sense of it.

But therein lies the rub. As mentioned above, these deeply conserved sequences are hardly chronometers. So for all the data collection and computer wizardry, the data itself tells a mangled story. Rapid evolution in one lineage can make it look much older than it really is, confounding the whole tree. Over the years, practitioners have learned to be as judicious as possible in selecting target sequences, while getting as many as possible into the mix. For example, concatenating the sequences of 50-odd ribosomal proteins can give more, and more diverse, data than the two long-ish ribosomal RNAs alone. But the proteins have their problems as well, since some are much less conserved than others, and some were lost or gained along the way.
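
For the concatenation step itself, here is a minimal sketch (hypothetical file and taxon names; the alignments and the tree inference would come from standard tools) of how per-protein alignments are stitched into a single "supermatrix":

```python
# Minimal sketch of the concatenation ("supermatrix") step: per-protein
# alignments are stitched together, taxon by taxon, before tree building.
# File names and taxon labels below are hypothetical.

from collections import defaultdict

def read_fasta(path):
    """Return {taxon_name: aligned_sequence} from a FASTA alignment file."""
    seqs, name = {}, None
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                name = line[1:].split()[0]
                seqs[name] = []
            elif name:
                seqs[name].append(line)
    return {n: "".join(parts) for n, parts in seqs.items()}

def concatenate(alignment_files, taxa):
    """Concatenate per-gene alignments; pad with gaps where a gene is missing."""
    supermatrix = defaultdict(str)
    for path in alignment_files:
        aln = read_fasta(path)
        length = len(next(iter(aln.values())))      # columns in this alignment
        for taxon in taxa:
            supermatrix[taxon] += aln.get(taxon, "-" * length)
    return dict(supermatrix)

# Hypothetical inputs: one aligned FASTA per universal ribosomal protein.
files = ["rpL2.aln.fasta", "rpL3.aln.fasta", "rpS12.aln.fasta"]
taxa = ["E_coli", "Sulfolobus", "Homo_sapiens"]
# matrix = concatenate(files, taxa)   # {taxon: long concatenated sequence}
```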

A partisan of the later birth of archaea provides a phylogenetic tree with countless microbial species, and one bold claim: "inflated" distances to the archaeal and eukaryotic stems. This is given as the reason that the archaea (lower part of the diagram, including eukaryotes, termed "archaebacteria") look very ancient, when really they just sped away from their originating bacterial parent (the red bacteria), estimated at about 1 Gya. This tree is based on an aligned concatenation of 26 universally conserved ribosomal protein sequences (51 from eukaryotes), with custom adjustments.

So there has been a camp claiming that the huge apparent molecular distance between the archaea and other cells is just such an artifact of fast evolution. Just as the revolution that led to the eukaryotic cell involved a lot of molecular change, including the co-habitation of countless proteins that had never seen each other before, duplications / specializations, and many novel inventions, whatever process led to the archaeal cell (from a pre-existing bacterial cell) might also have caused the key molecules we use to look into this deep time to mutate much more rapidly than is true elsewhere in the vast tree of life. What are the reasons for thinking so? There is the general disbelief / unwillingness to accept someone else's work; evidence like possible horizontal transfers of genes from chloroplasts to basal archaea; some large sequence deletion features that can be tracked through these lineages and interpreted to support late origination; some papering over of substantial differences in membrane and metabolic systems; and there are plausible (via some tortured logic) candidates for an originating, and late-evolving, bacterial parent.

This thread of argument puts the origin of eukaryotes roughly at 0.8 Gya, which is, frankly, uncomfortably close to the origination of multicellular life, and gives precious little time for the bulk of eukaryotic diversity- which, as shown above, exists largely at the microbial level- to develop. (Note that "Animalia" in the tree above is a tiny red blip among the eukaryotes.) All this is quite implausible, even to a casual reader, and makes this project hard to take seriously, despite its insistent and voluminous documentation.

Parenthetically, there was a fascinating paper that used the evolution of the genetic code itself to make a related point, though without absolute time attributions. The code bears hallmarks of some amino acids being added relatively late (tryptophan, histidine), while others were foundational from the start (glycine, alanine), when the code may have consisted of two RNA bases (or even one) rather than three. All of this took place long before LUCA, naturally. This broad analysis of genetic code usage argued that bacteria tend to use a more ancient subset of the code, which may reflect their significantly more ancient position on the tree of life. While the full code was certainly in place by the time of LUCA, there may still, at that time, have been a bias in the inherited genome / pool of proteins against the relatively novel amino acids. This finding implies that the archaea originated later than the bacteria, by some unspecified but significant amount of time.
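
As a toy illustration of this kind of usage analysis (the groupings and sequences below are invented for the sketch, not the paper's actual data or method), one can score sets of proteins by how much they favor the putatively late amino acids over the earliest ones:

```python
# Toy illustration of scoring protein sequences by their use of amino acids
# thought to be late additions to the genetic code versus the earliest ones.
# Groupings and sequences here are illustrative only.

EARLY = set("GA")    # glycine, alanine: foundational amino acids
LATE  = set("WH")    # tryptophan, histidine: late additions to the code

def late_vs_early_bias(proteins):
    """Return (late_fraction, early_fraction) across a collection of proteins."""
    counts = {"early": 0, "late": 0, "total": 0}
    for seq in proteins:
        for aa in seq:
            counts["total"] += 1
            if aa in EARLY:
                counts["early"] += 1
            elif aa in LATE:
                counts["late"] += 1
    total = counts["total"] or 1
    return counts["late"] / total, counts["early"] / total

# Hypothetical mini-proteomes; a real analysis would use thousands of
# sequences per genome.
bacterial_like = ["MGAAGLAGVAAG", "MAGGAAVLGGA"]
archaeal_like  = ["MHWAGHLAWVHG", "MWHHAGLWAVG"]

print(late_vs_early_bias(bacterial_like))   # lower late-amino-acid fraction
print(late_vs_early_bias(archaeal_like))    # higher late-amino-acid fraction
```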

So, attractive as it would be to demote the archaea from their perch as super-ancient organisms, given their small sizes, small genomes, specialization in extreme environments, and peripheral ecological position relative to bacteria, that turns out to be difficult to do. I will turn, then, to a very recent paper that gives what I think is a much more reasoned and plausible picture of the deeper levels of the tree of life, and the best general picture to date. This paper is based on the protein sequences of the rotary ATPases, which are universal and were present in LUCA, despite their significant complexity. Indeed, the more we learn about LUCA, the more complete and complex this ancestor turns out to be. Our mitochondria use a (bacterial) F-type ATPase to synthesize ATP from the food-derived proton gradient. Our lysosomes use an (archaeal) V-type ATPase to drive protons into / acidify the lysosome in exchange for ATP. These are related, derived from one distant ancestor, and each was apparently already present in LUCA. Additionally, each ATPase is composed of two types of subunits, one catalytic and one non-catalytic, which originated from an ancient protein duplication, also prior to LUCA. The availability of these molecular cousins / duplications provides helpful points of comparison throughout, particularly for locating the root of the evolutionary tree.
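
Here is a minimal sketch of that rooting trick, with a toy topology and hypothetical taxon labels, using Biopython's Phylo module: because the catalytic and non-catalytic subunit families split before LUCA, each family can serve as an outgroup to root the other.

```python
# Sketch of rooting by paralog outgroup (toy topology, hypothetical labels).
# B = bacteria, A = archaea, E = eukaryotes; _c = catalytic subunit,
# _n = non-catalytic subunit of the ATPase.

from io import StringIO
from Bio import Phylo   # Biopython

# Combined tree containing both pre-LUCA paralog families.
newick = "((B_c,(A_c,E_c)),(B_n,(A_n,E_n)));"
tree = Phylo.read(StringIO(newick), "newick")

# Root the whole tree using the non-catalytic family as the outgroup; the
# branching order within the catalytic family can then be read off as rooted.
tree.root_with_outgroup("B_n", "A_n", "E_n")
Phylo.draw_ascii(tree)
```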

Phylogenetic trees based on the ATP synthase enzymes that are present in all forms of life. On the left is shown the general tree, with branch points of key events / lineages. On the right are shown sub-trees for the major types of ATP synthase: catalytic subunit (c) or non-catalytic (n), F-type (common in bacteria) or V-type (common in archaea). Note how congruent these trees are. At bottom right, in the tiny print, is a guide to absolute time and the various last common ancestors.

This paper also works quite hard to pin the molecular data to the fossil and absolute time record, which is not always provided. The bottom line is that, by this tree, the archaea arise quite early (see above), coincident with or within about 0.5 Gy of LUCA, which was bacterial, at roughly 4.4 Gya. The bacterial and archaeal last common ancestors are dated to 4.3 and 3.7 Gya, respectively. The (fused) eukaryotic last common ancestor dates to about 1.9 Gya, with the proto-mitochondrion's individual last common ancestor among the bacteria some time before that, at roughly 2.4 Gya.

This timeline makes sense on many fronts. First, it provides a realistic time frame for the formation and diversification of eukaryotes. It puts their origin right around the great oxidation event (about 2 to 2.4 Gya), when oxygen became dominant in earth's atmosphere- a precondition for the usefulness of mitochondria to what were otherwise anaerobic archaeal cells. It places the origin of archaea (LACA) a substantial stretch after the origin of bacteria, which agrees with the critics' points above that bacteria are the truly basal lineage of all life, and that archaea, while highly different and pretty archaic, also share a lot of characteristics with bacteria, perhaps more so with certain early lineages than with others that came later. The distinction between LUCA and the last common bacterial ancestor (LBCA) is a technical one, given the trees they were working from, and the two are not, given the age ranges presented (see figure above), significantly different.

I believe this field is settling down, and though this paper, working from only a subset of the most ancient sequences plus fossil set-points, is hardly the last word, it appears to represent a consensus view and is the best picture to date of the deepest and most significant waypoints in the deep history of life. This is what comes from looking through microscopes, and finding entire invisible worlds that we had no idea existed. Genetic sequencing is another level over that of microscopy, looking right at life's code, and at its history, if darkly. What we see in the macroscopic world around us is only the latest act in a drama of tremendous scale and antiquity.


Saturday, October 14, 2023

America as Hegemon

The imperial track record is not good, but the hegemonic track record isn't all that bad.

I was recently visiting the USS Hornet, a WW2-era aircraft carrier now turned into a museum on San Francisco Bay. Soon after, it was Fleet Week, when the US Navy pays a visit to the Bay Area in force, capped by a Blue Angels air show. An appalling display of naked militarism, granted. But also an occasion to reflect on our world-wide empire, the nature of American power, the competence of our military, and the state of things internationally.

It was a little weird, seeing decades-old technology swooping up and down the bay, which has been, beneath this benevolent protection, so restlessly advancing the technological frontier in totally different directions- computers, phones, applications, streaming, social media. Which trends are more important for America's place in the world? Which technologies rule? What are we doing with all this military hardware? I tend to have pretty conservative views on all this, that the US is right to stick with the post-WW2 consensus that our military should be as strong as possible, and partner with like-minded countries around the world to advance the vision of that era, of human rights and democracy for all. 

When we have tried to do this task directly, in Vietnam, Iraq, and Afghanistan, however, it has generally turned out very badly. The Iraq war was misconceived from the start, and went downhill from there. Despite the laudable aim of sparing the Iraqi people from the continued depredations of Saddam Hussein, the lying and the incompetence at all levels made the cure far worse than the disease, with anarchy and hundreds of thousands dead. But let's write that one off as a George Bush-as-decider blunder.

The Afghanistan debacle is more painful to contemplate, in some ways, in what it says about our fundamental incompetence as an imperial power. Its rationale was straightforward, international support widespread, and our power there absolute in the opening acts of the takeover. Yet with all those advantages, we ended up, twenty years later, turning tail and watching our hand-built Afghan military melt away even before we left the country. The Russians had, frankly, a better record in their imperial Afghan turn.

It is an appalling track record, really. We evidently, and thankfully, do not have the advantage of ruthlessness that ancient Rome enjoyed, or that modern-day spoilers like Russia and Iran enjoy. But nor, apparently, do we have the advantage of friendly relations, favorable hearts & minds, and good intelligence. We were constantly led astray by "friends" with all kinds of personal vendettas and agendas. We pride ourselves on our independence from the rest of the world, and thus know little about it, which means that we go into these settings woefully unprepared, besotted by whatever ideological issue du jour is fashionable in the US. Our priorities in Afghanistan seemed to be to hold elections and educate women. But were those the right aims? And even if so, were they carried out with any kind of wisdom, sense of priorities, and proper preparation?

Most concretely, our military relationship was a disaster. The US military tried to remake the new Afghan military in its own image and graft onto it its own systems and capabilities, creating a dependence that caused immediate failure when Afghans caught wind that we were really, actually, going to leave. This was an incredible result, especially after the US military had been responsible for "training" countless militaries all over the world for decades.

What on earth were we doing? As with the intelligence failures, the military failures came from some fundamental inability to understand the problem at hand and work with the society as it existed. Instead of creating a sustainable, right-sized, and politically viable force, we just assumed we were the good guys and that anything we did was good. There was an intrinsic tension between leaving the society as it was, thereby just funding a reboot of a Taliban-like (or Northern Alliance-like) force to keep the country pacified, and forcing some change, on social, political, economic, and technological levels, by changing the form of government and associated institutions. The US clearly did not invade Afghanistan to keep everything the same. But by overreaching, we essentially achieved nothing, allowing precisely the group we dethroned to come back into power, and casting the country back into its pre-invasion economic and social abyss. At least, thanks to other technological bequests of the US and the West, the Afghans now have cell phones.

So our military and other institutions do not come off well in any of their recent engagements. It is a case of losing every battle, while winning the war. For we still enjoy a hegemonic position, not thanks to our incompetent and technology-bedazzled military, but thanks to our friends, with whom we still lead the world. The core groups of the anglophone countries, NATO, and the East Asian alliances with Japan, South Korea, and Taiwan remain the heart of the developed world, enjoying peaceful relations, democracy, and prosperous economies. China is advancing mightily to displace that grouping, but cannot do so alone, and has little hope of doing so with steadfast friends like Russia and North Korea by its side.


Tiers of development. Blue is the developed world, yellow the middle-tier (developing), and red, the lower tiers of development (desperately developing, one might say).

The advantages of joining this developed core are so evident that one wonders why it is under threat, both from spoiler countries like Russia, and from endogenous authoritarians in the US, Poland, Hungary, India, and elsewhere. Two decades ago, we were looking at the end of history, when a futuristic society of peace and contentment would inherit the post-cold war earth, Russia would join NATO, and we would live happily ever after. But democracy is a cultural pattern that not everyone can easily understand, especially people who run (or want to run) undemocratic countries. As our framers understood so well, sovereign power is dangerous, and needs to be diluted among publicly competing branches, candidates, officers, and voters for it to be durably controlled, a bit like an atomic chain reaction. It takes wisdom and humility to figure that out and abide by such fundamental (constitutional) rules.

It is tempting to take that power directly in hand, to satisfy a burning desire to "do something". In the US, a Republican minority has progressively lost its commitment to popular rule and the viability of contemporary governmental institutions. This is, incidentally, only possible because of their special relationship with sources of money and of media influence, without which they would have little popular purchase. In China, the communist party figured that, despite its own history of ravaging its country, it had developed a stable enough system of governance, and had obtained implicit popular support ... reflecting either brainwashing or acquiescence ... that it did not need actual elections or Western-style divided government. And in Russia, the bitterness of its descent into kleptocracy, under the poisoned banner of "capitalism", combined with various snubs from the West and general historical and cultural distance, rendered the idea of becoming a Western country too much to bear.

Each authoritarian system has, like an unhappy family, its own reasons, while the happy families of the West seem to think along similar lines almost involuntarily, at least until some authoritarian mountebank comes along to solve all our problems by doing away with our safeguards. We are in a grand race to find out which systems are more stable: those that rely on one person, such as the aging Vladimir Putin, for their decisions, or those that rely on popular will and a controlling set of institutions. The lessons of history could not be more stark, telling us that the former is the bigger crapshoot. Sometimes it turns out well, but more often not. That is why liberalism and deliberative democracy developed in the first place.

There remains a great deal of middle ground around the world. The Muslim countries, for example, form a middle tier of populous and developing countries comprising, between Pakistan, Egypt, Indonesia, Bangladesh, Turkey, Iran, the Gulf states, and others, well over a billion people. Our wars in Iraq and Afghanistan didn't help our relations there, but on the other hand, China is hardly making itself loved either, with its extermination campaign in Xinjiang. The cultural patterns of the Islamic world make it a particularly hard sell for Western democracy versus authoritarianism. Thus the brief Arab Spring came to a painful and inglorious end, mostly in whimpers, sometimes in horror. The liberalization process took a long time in the West as well, measured perhaps from the French revolution, through the revolutions of 1848, culminating in the aftermath of World War 2, with developmental delays in the Eastern European deep freeze. Ideas and new social patterns take a long time to take root, even when templates (Switzerland, the US, ancient Greece) are at hand.

The American hegemony is little more than an agreement among like-minded and friendly nations to maintain their democratic systems, their prosperous (if environmentally rapacious and unsustainable) economies, and to largely offload their military responsibilities on the US. Whether those responsibilities have been well-stewarded is certainly doubtful. But up to this point, the agreement has been highly successful, mostly because the US has been a willing, stable, and vigorous anchor. Can the EU take our place? It is conceivable, but the EU is structurally less decisive. Bodies like the UN or the G20 are even less capable, in any executive sense. So, until we come up with something better, with a hot war against Russia and a cold one developing against China, and while other cultures are slowly chewing over their various problems with authoritarianism, it is critical that the US remain that anchor for the democratic developed world.


Saturday, September 30, 2023

Are we all the Same, or all Different?

Refining diversity.

There has been some confusion and convenient logic around diversity. Are we all the same? Conventional wisdom makes us legally the same, and the same in terms of rights, in an ever-expanding program of level playing fields- race, gender, gender preference, neurodiversity, etc. At the same time, conventional wisdom treasures diversity, inclusion, and difference. Educational conventional wisdom assumes all children are the same, and deserve the same investments and education, until some magic point when diversity flowers, and children pursue their individual dreams, applying to higher educational institutions, or not. But here again, selectiveness and merit are highly contested- should all ethnic groups be equally represented at universities, or are we diverse on that plane as well?

It is quite confusing, under the new political correctness program, to tell who is supposed to be the same and who different, in what ways, and for what ends. Some acute social radar is called for to navigate this woke world, and one can sympathize, though not too much, with those who are sick of it and want to go back to simpler times of shameless competition- black and white.

The fundamental tension is that a society needs some degree of solidarity and cohesion to satisfy our social natures and to get anything complex done. At the same time, Darwinian and economic imperatives have us competing with each other at all levels- among nations, ethnicities, states, genders, families, work groups, individuals. We are wonderfully sensitive to infinitesimal differences, which form the soul of Darwinian selection. Woke efforts clearly try to separate differences that are essential and involuntary, (which should in principle be excluded from competition), from those that are not fixed, such as personal virtue and work ethic, thus forming the proper field of education and competition.

But that is awfully abstract. Reducing that vague principle to practice is highly fraught. Race, insofar as it can be defined at all, is clearly an essential trait. So race should not be a criterion for any competitive aspect of the society- job hunting, education, customer service. But what about "diversity", and what about affirmative action? Should the competition be weighted a little to make up for past wrongs? How about intelligence? Intelligence is heritable, but we can't call it essential, lest virtually every form of competition in our society be brought to a halt. Higher education and business, and the general business of life, are extremely competitive on the field of intelligence- who can con whom, who can come up with great ideas, write books, do the work, and manage others.

These impulses towards solidarity and competition define our fundamental political divides, with Republicans glorying in the unfairness of life and the success of the rich, while Democrats want everyone to get along, with care for the unfortunate and oppressed. Our social schizophrenia over identity and empathy is expressed in the crazy politics of today. And Republicans reflect contemporary identity politics as well, just in their twisted, white-centric way. We are coming apart socially, and losing key cooperative capacity, which puts our national project in jeopardy. We can grant that the narratives and archetypes that have glued the civic culture together have been fantasies- that everyone is equal, or that the founding fathers were geniuses who selflessly wrought the perfect union. But at the same time, the new mantras of diversity have dangerous aspects as well.


Each side, in archetypal terms, is right and each is an essential element in making society work. Neither side's utopia is either practical or desirable. The Democratic dream is for everyone to get plenty of public services and equal treatment at every possible nexus of life, with morally-informed regulation of every social and economic harm, and unions helping to run every workplace. In the end, there would be little room for economic activity at all- for the competition that undergirds innovation and productivity, and we would find ourselves poverty-stricken, which was what led other socialist/communist states to drastic solutions that were not socially progressive at all.

On the other hand is a capitalist utopia where the winners take not just hundreds of billions of dollars, but everything else, such as the freedom of workers to organize or resist, and political power as well. The US would turn into a moneyed class system, just like the old "nobility" of Europe, with serfs. It is the Nietzschean, Randian ideal of total competition, enabling winners to oppress everyone else in perpetuity, and, into the bargain, write themselves into the history books as gods.

These are not (and were not, historically) appetizing prospects, and we need the tension of mature and civil political debate between them to find a middle ground that is both fertile and humane. Nature is, as in so many other things, an excellent guide. Cooperation is a big theme in evolution, from the assembly of the eukaryotic cell from prokaryotic precursors, to its wondrous elaboration into multicellular bodies and later into societies such as our own and those of the insects. Cooperation is the way to great accomplishments. Yet competition is the baseline that is equally essential. Diversity, yes, but it is competition and selection among that diversity, together with cooperative enterprise, that turn the downward trajectory of entropy and decay (as dictated by physics and time) into flourishing progress.

