
Saturday, April 20, 2024

The Impossibility of Morality

We have dark sides and do bad things. How come we all think we are good people?

Part of our political, and temperamental, divide revolves around how seriously to take morality: how idealistic to be about goodness, how hard to try, or whether to be more realistic about our dark side. For all the platitudes and commandments, the sad fact is that morality is impossible, so the question is perhaps more how intensively we blind ourselves to darkness than how dark we will actually be.

Weird, right? But the closer you look, the more impossible it is to follow any system of morality. There are Jains who will not hurt a fly, let alone eat meat. But plants have feelings too. And our guts contain astronomical numbers of organisms in a roiling dance of macabre death. What about them? Existence as a human is unavoidably destructive. Simpler moral systems preach kindness to others. But again, existence requires feeding one's own fire, and that must come at cost to something, or someone. Every trade is unequal, even if voluntary. Employees are notoriously exploited to give more than their fair share. The Earth is relentlessly exploited. There is no end to our appetites, as long as we are alive.

Psychologically, we build up defenses to say that we are no worse than others, that we are good people. Even if we are bad people, we say that we have been driven to crime, and it is no worse than the rich people who thoughtlessly abuse others. Or if we are a presidential candidate, we say that we are saving the world, and making America great, and the subject of cruel witch hunts. Self-defense is one more essential part of living, even if it comes at the expense of seeing the world clearly. Unflattering visions of our way of life are rejected and repressed, the more so if they come as criticism from others.

Defensive blindness is integral to "modern" life. The agriculture and food processing industry keeps the slaughterhouses hidden, the feedlots and inhumane poultry coops under wraps. The less we know, the better we feel. Money is the ultimate screen against the squeamishness and qualms of existence, shielding us from the rapacious mining that our electronics drive in tropical forests, the slave labor that makes our clothes, and countless other immoral and destructive processes we are ultimately responsible for. Clear consciousness of all this would make the whole system collapse.

Protesters carrying the pine tree flag of Christian nationalism, while doing good things for the country.

Religions offer their own forms of defense. Confession in the Catholic church is a classic way to touch the darkness, but then to be absolved and feel good again. Exorcisms are offered as well. Protestant approaches tend to focus more on works, like community service, or in fringier precincts, on sermons of self-glorification. Everyone who is reborn in Christ is part of the club, and though a sinner, is also good, glorious, and heaven-bound. Possibly, even, in the Mormon system, himself a god. How they engage with moral darkness varies tremendously by religion, but the common need is to control it, in ourselves and others, sufficiently that our self-image of goodness and light can be preserved.

The extensive repression of moral darkness leads to the countervailing temptation to take another peek at it, under controlled conditions. It is the inspiration for much art- the detective thriller, the horror movie, the general apparatus of drama. Without darkness, there is no interest or light. And people differ markedly in their approach to such material. The more liberal and optimistic tend to focus on the light side, not the dark side, and do so politically as well. They have more moral idealism and hope, which means they have more repression of darker tendencies. Kumbaya is sung. Conversely, the more "realistic", conservative attitude scoffs at the do-gooder idealism of the left, and sees darkness around every corner- in foreigners, in sexual transgression and expression, in fluid social systems, in change itself. It recognizes that moral aspiration is futile- that the woke trend of recent times, with its bending over backwards to every minority group, every micro-aggression, every insect and animal, and the climate, sets an impossible and futile bar- and holds that sticking to basics and tradition will get us further than such refusal to recognize the dark reality of human existence.

These valences are apparent in the Palestinian dilemma. As the Palestinians were expelled from Israel during its establishment, the Jews proclaimed a right for Jews all over the world to come to Israel. Meanwhile, the UN created a right of return for Palestinians, to the very same land that formed Israel. It was the ultimate expression of bleeding heart unrealism, and has led (in part) to the existentially stuck misery of Palestinians for all these decades, as the UN took it upon itself to nurture an absurd dream of return and set up a now-permanent refugee apparatus of feeding, schools, and health care, all of which fuels the seething anger and terrorist dreams of ever-growing generations of Palestinians.

Another example is the US war in Vietnam- a curious and tragic mix of blindness, idealism, and realism. We wanted to help the (South) Vietnamese defend themselves from communism. In light of what happened in North Korea in the ensuing decades, this was not a bad goal. North Korea is moral darkness incarnate- a cruel and criminal dictatorship. But once the enormity of the task became clear, the moral realists took charge, with the aim of bombing Vietnam and its neighboring countries into submission. But even such extreme measures failed, leaving us with the ashes of horrible means used in the service of a futile goal. The US media was increasingly unwilling to hide the horrors, bringing into American consciousness all this darkness, which turned out to be unbearable.

So, is it better to blind ourselves to the darkness, and risk destruction and error, or better to be realistic, explore it, even celebrate it, as the Homeric epics do, and gird ourselves to deal with it, and deal it out to others? As in most things, societies are probably best off with a mix of perspectives. This mix is perennially expressed in our political spectrum, though of late the right seems to have gotten caught up in a peculiar reaction against the pieties of the left. As the left has gained the cultural and governmental high ground, as shown by the triumph of gay rights, ever-increasing concern for racial minorities, and a rising tide of official movement on environmental concerns, the right has turned apoplectic. They seem to be saying, "We love our trucks, we won the continent fair and square, and we won the racial contest as well"- leave us to our spoils, and don't be so concerned about "fairness"; life isn't fair or moral, but goes to the darkest, baddest winner. (One can hear echoes of the Confederate South in all this clearly enough.) Those on the left who are besotted with woke-ness and fairness will be singing a different tune when they are not at the top of the heap anymore, in their well-gentrified, rich and safe neighborhoods.

Perhaps this portrayal is extreme, but extreme concern for moral fairness within a society can blind us to other issues, such as the competitive underpinnings of life, both within and versus other societies, and the ultimate impossibility of being totally fair, or moral, as historical actors. A balance of moral idealism and realism about unavoidable dark aspects is needed, but not in a conflict that tears the society apart. That depends on communication between the two sides, and less totalizing certainty from each side's respective mechanisms that repress doubt and screen (or valorize, in extreme cases) various different aspects of darker morality. Religion is notorious for reshaping its adherents' realities and protecting them psychologically from their own evil actions. But left wing certainty functions similarly, with its echo chambers and pieties. So, as usual, deeper insight is needed, mostly of our own blind spots and what they are hiding, but also of how such mechanisms work across the spectrum.


Saturday, April 13, 2024

The Shadow War

We are in a new world-wide cold war. And ironically, the many new technologies from the West have given autocratic states extraordinary new powers. 

Paul Theroux had a remarkable passage in one of his travel books, as he was passing through Myanmar, a military dictatorship then and now, that illuminated attitudes towards China and from China. 

"I heard lots of praise for the United States in distancing itself from the regime, and lots of blame for China and Russia and Singapore in supporting it- China especially. But China's prosperity, its need for oil and wood and food, had created a new dynamic. China had no interest in any country's developing democratic institutions; on the contrary, it was a natural ally of repressive regimes. When the World Bank withheld funds from an African country because it was corrupt and tyrannous, demanding that it hold an election before it could qualify for aid, China would appear with money- 'rogue aid,' with no strings attached, and got the teak, the food, and the drugs." - Ghost Train to the Eastern Star, 2008


The world seems headed into another cold war, definitely rhyming with the last cold war. It is highly unfortunate, and testament to some defects in US management of the post-cold war era, to the surprising durability, even attractiveness, of authoritarian systems, and to the many weaknesses of democratic systems. This new cold war, which I will call the shadow war, features Russia and China as the main poles of opposition to democratic and developed countries, mostly in the West, but including many others. This time around, China is the stronger power by far, and both Russia and especially China are quite advanced in their development, so that the West no longer has a monopoly in any particular technology or kind of organization. China has adopted all the magic of capitalist market mechanisms to grow its wealth, and stolen (or forced the transfer of) huge amounts of technology and knowledge to make itself a leader in all sorts of industries.

The West has lately begun to wake up to the problem. Our hope that capitalism was somehow related to, or a leading wedge for, democracy has been dashed several times over. Instead of China turning into Hong Kong, it is Hong Kong that is turning into China. Not only is capitalism, as has been tirelessly pointed out from the left, amoral and indifferent to human rights, (as we already knew from slavery in the US), but democracy is also far more fragile than we had hoped, requiring a wide range of civic understandings, media practices, and forms of education that are far from universal, or natural. We had, in the windup to the cold war, seen many countries make slow and fraught transitions to democracy (Philippines, Taiwan, South Korea, and Eastern Europe), but have more recently seen countries backtrack into autocracy (Russia, Hungary).

Naturally, the war in Ukraine has put the most urgent point on this conflict, where Russia, which is to say its autocratic leader, felt that the existence of an independent and democratic Ukraine next door was too much to bear. Now, China also tells us that it loves its brothers in Taiwan so much that re-unification will come, no matter what the Taiwanese themselves might want. Love certainly takes some strange forms!

But it is a much broader issue, spanning the globe, and the depths of human psychology. On one list of countries ranked by democratic governance, the median country is Armenia, with a "hybrid regime" and scores of roughly five out of ten. This is not a great situation, where half the world, in rough terms, lives in various states of miserable, oppressive government. And as the quote above suggests, the authoritarians have in some ways the stronger hand. What happened?

We in the West had thought that democracy was the natural harbor of all peoples- the end of history, indeed. But in the first place, people power is a very limited power, if whoever has power is authoritarian enough to use tanks against it. And in the second place, democracy is not natural in many cultures. The Muslim culture, for instance, for all its virtues, has a fundamentally patriarchal and tribal governance model, with little room for democracy, though there are, traditionally, various forms of freedom, for men at least. So however attractive democracy is in theoretical terms, and as a model in the West that people from authoritarian countries like to vacation to... as a cultural pattern, it is not universal. And authoritarian patterns are hardly foreign to the West either. The Catholic church is an example of the preserved archetypes of patriarchy and authoritarian strong-man rule.

The Chinese dream is highly militaristic, and rather threatening.


But more deeply, the archetypes we have of leadership and politics are authoritarian- the king, the hero. Jungian psychology, aside from its focus on archetypes, deals in the shadow, which is our real needs and instincts, insofar as they run counter to our surface goodness and conscious ego construction. A person like Donald Trump exemplifies all these trends. Why on earth are we still saddled with this sociopath after a decade of drama-queenity? He clearly touches a lot of people's archetypal conceptions of strength and heroism. His powers of psychological projection, reflexively rejecting his own shadow, are immense. He is rubber, others are glue. And his fundamental bond with his followers, by licensing their shadow sides of hate and violence, makes his every pronouncement right no matter what. We in the US are facing a cataclysmic political season, trying to repress the shadow of humanity, which is so amply expressed around the world in political / power systems that follow the logic of strength, ending up in states of terror.

Modern technology hasn't helped, either. After a brief flush of excitement about the ability of social media to amplify people power, especially across the Muslim world, it all went to pot as the shallowness and disorganization of such movements became apparent. The powers of databases, personal identification, surveillance, and media manipulation have been much more useful to authoritarian governments than to their antagonists, making state terror more effective than ever. Authoritarian countries now control their internet and media environments with great precision, increasingly project their twisted narratives abroad, and even hunt down dissidents outside their borders using the new information tools. So while information may want to be free, it doesn't really have a say in the matter- those with power do.

What to do about it? We in the West have lost control of our media environments. While we are waking up, to some extent, to the malevolent media from abroad, domestic media is controlled by money, which in the current environment of yawning wealth inequality, political fissiparousness, and clickbait "business models" is just as crazy and corrupt. So there should be two approaches to this. One is to strengthen quality media, like PBS and its cohorts, with more offerings and deeper reporting. The other is to restrict how corporations can control media. The right to individual free speech can be preserved while making corporations more sensitive to social goods. The Dominion case against FOX was a small example of the powers available. Liability for lying should be a broader effort in the law, specifically against corporations, which are creatures of the state, not natural persons. We need to recognize the deep psychological powers we are up against in preserving enlightened, respectful civil government and discourse.

Obviously getting our own house in order, against the atavistic forces of political authoritarianism, is the first order of business. Abroad, paradoxically, we need to project strength as a democratic and developed community, holding the line in Ukraine and Taiwan, and against all sorts of authoritarian encroachments, until temperatures are lowered, and the current nationalist fevers abate. For what China has right now is an imperialist fever. It has been weak for so long and surrounded by so many unfriendly countries, that one can understand that it sees its recent economic prosperity as a special opportunity to recover a leading position in its neighborhood, militarily and politically as well as economically. That would be fine if it were not also trying to subvert free political systems and prop up tyrannical ones. There are good reasons why its neighbors are fearful of China.

Like in the last cold war, I think time plays a key role. We have to believe that democracies, for all their weaknesses, are better, and are seen as better, by people around the world. While today's authoritarian powers may have greater durability than those of the communist era due to their embrace of, rather than flouting of, market principles and modern technologies, they are ultimately fragile and subject to the opinions of their own people. Putin will not last forever. Xi will not last forever. (The Kim regime of North Korea may, however, last forever!) Change is the Achilles heel of authoritarian conservatism. So we are in for a very long haul, to keep spreading people power and peace internationally.


Saturday, March 16, 2024

Ideologies of Work

Review of Elizabeth Anderson: "Hijacked: How neoliberalism turned the work ethic against workers, and how workers can take it back."

We live by the sweat of our brow, through work. At least that has been the story after we were thrown out of the garden of Eden, where we had previously foraged without effort. By the time of the Puritans, work had been re-valued as being next to godliness, in what became known as the Puritan work ethic. Elizabeth Anderson takes this as her point of departure in a fascinating historical study of the winding (and mostly descending) road that attitudes toward work took down the centuries, in the perennial battle between workers and parasites who have found ways to avoid sweating, yet eat just the same ... or better.

Anderson trots through all the classical economists and philosophers, down to John Stuart Mill and Marx, showing two main threads of thought. First is the progressive thread, in which the Puritans can (curiously) be classed, as can Adam Smith. They value work as both a cultural and meaningful activity, not just a means of sustenance. They think everyone should work, and criticize anyone, high or low, who shirks this responsibility. Genteel landowners who spend their time hunting rather than improving their estates are just as culpable as drunkards and other able-bodied peasants who fail to do their share. Learning and innovation are highly valued, as not just ameliorating the lot of those making improvements, but at the same time raising the wealth of, and standard of living for, all.

In contrast is the conservative thread. Anderson herself describes it trenchantly:

"From the conservative perspective, however, poverty reflected an individual's failure to fulfill the demands of the work ethic. Society is at fault solely in establishing institutions that violate natural law in promoting vice through provisions such as the Poor Law. Conservatives agreed that the Poor Law must therefore be abolished or radically reformed. If poverty is caused by the vice of the poor, the remedy for poverty must be to force the poor to practice virtue, to live up to the demands of the work ethic. Conservatives differed somewhat on which virtue was most necessary for the poor to practice. Priestley focused on frugality, Bentham on industry, Malthus on chastity, Paley on contentment (understood as the opposite of covetous envy of the rich). Thus, Priestley hoped to convert poor workers into virtuous bourgeois citizens through a legally mandated individual savings plan. Bentham favored a workfare system that turned the working poor into imprisoned debt peons of capitalist entrepreneurs. Malthus advocated leaving the poor to starvation, disease and destitution, but offered them the hope that they could rescue themselves by postponing marriage and children. Burke and Whately agreed with Malthus, but attempted to put a liberal-Tory paternalist veneer on their view. ...

"The moral accounting that assigns responsibilities to individuals without regard- and even in inverse proportion- to the means they have to fulfill them remains a touchstone of conservative thought to the present day. ...

"The ideology of the conservative work ethic is distinguished by a harsh orientation toward ordinary workers and the poor, and an indulgent one toward the 'industrious' rich- those who occupy themselves with making money, either through work or investment of their assets, regardless of whether their activities actually contribute to social welfare. In practice, this orientation tends to slide into indulgence toward the rich, whether or not they are industrious even in this morally attenuated sense. ...

"Here lies a central contradiction of the conservative work ethic. All the conservatives claimed that the key to overcoming poverty was to make the poor bourgeois in attitude. All they needed to do was adopt the work ethic, or be forced to adopt it, along with the spirit of competitive emulation, the desire to better others in the race for riches and ensure that one's children not fall beneath the standard of living in which they were raised. Poverty was proof that they hadn't adopted bourgeois virtues and aspirations. This presupposed that the poor suffered from no deficit in opportunities. The path to prosperity was open; the poor were simply failing to take it. Yet we have seen that, Priestley partially excepted, conservative policies knowingly reduced the opportunities of the poor to acquire or retain property, work for themselves, or escape precarity."


My major critique of Anderson's analysis is that putting all this conflict and history into the frame of the work ethic is inappropriate and gives the work ethic far more weight than it merits. Firstly, everyone thinks of themselves as working. The most sedentary rentier doubtless thinks of his or her choosing among investments as of critical importance to the health and future of the nation. Even his or her shopping choices express taste and support a "better" sort of business, in that way performing work towards a better community. The English royals probably see themselves as doing essential cultural work, in their choice of hats and their preservation of cherished traditions. Parenting, community associations, and political agitation can all, to an expansive mind, be construed as "work". And indeed some of our greater artistic and other accomplishments come from the labors of wealthy people who were entirely self-directed rather than grubbily employed. All this implies that a work ethic can be accommodated in all sorts of ways if markets are not going to be the standard, as they hardly can be in any philosophical or moral system of a work ethic. This makes work ethics rather subjective and flexible, as Anderson implicitly demonstrates through the centuries.

However, a more serious problem with Anderson's analysis is that it leaves out the ethic of power. Her presentation laments the sad misuse that the work ethic has been subjected to over the years, (by conservatives), without focusing on the reason why, which is that a whole other ethic was at work, in opposition to the work ethic. And that is the power ethic, which values domination of others and abhors work as commonly understood. Or, at best, it construes the organization of society for the benefit of a leisured upper crust as work of momentous, even civilizational, significance. Nietzsche had a field day calling us to recognize and embrace the power ethic, and not hide it under sweeter-smelling mores like the Christian or work ethics.


Anderson does helpfully discuss in passing the feudal background to the Puritan work ethic, where the Norman grandees and their progeny parcelled out the land among themselves, spent their time warring against each other (in England or in France), and lived high off the labors of their serfs/peasants. No thought was given to improvement, efficiency, or better ways to organize the system. Conservatism meant that nothing (god-willing) would change, ever. Even so, the work of politics, of war, and of religious ideology was never done, and the wealthy could easily see themselves as crucial to the maintenance of a finely-balanced cultural and economic system.

Anderson also notes that the original rationale of the gentry, if one must put it in an economic frame, was that they were responsible for military support of the king and country, and thus needed to have large estates with enough surplus in people, livestock, horses, and food to field small armies. When this rationale disappeared with the ascendancy of parliament and general (at least internal) peace, they became pure rentiers, and uncomfortably subject to the critique of the Puritan work ethic, which they naturally countered with one of their own devising. And that was essentially a restatement of the power ethic, that the rich can do as they please and the poor should be driven as sheep to work for the rich. And particularly that wealth is a signifier of virtue, implying application of the work ethic, (maybe among one's forebears, and perhaps more by plunder than sweat, but ... ), or transcending it via some other virtues of nobility or class.

But in Locke and Adam Smith's day, as today, the sharpest and most vexing point of the work ethic is not the role of the rich, but that of the poor. By this time, enclosure of lands was erasing the original version of the job guarantee- that is, access to common lands- and driving peasants to work for wages, either for landowners or industrialists. How to solve extreme poverty, which was an ever more severe corollary of capitalism and inequality? Is it acceptable to have homeless people sleeping on the streets? Should they be given work? money? social services? education? Do the poor need to be driven to work by desperation and starvation? Or is the lash of work not needed at all, and lack of wealth the only problem? Malthus was doggedly pessimistic, positing that population growth will always eat up any gains in efficiency or innovation. Thus it requires the predatory power of the gentry to enable society to accumulate anything in the way of capital or cultural goods, by squelching the poor in sufficient misery that they will not over-reproduce.

The progressive view of work and the poor took a much more sanguine view. And here one can note that much of this discussion revolves around "natural" laws. Is the population law of Malthus true? Or is the natural communitarian tendency of humans also a natural law, leading to mutual help, spontaneous community formation, and self-regulation? Are some people "naturally" superior to others? Is a hierarchical and domineering social system "natural" and necessary? Adam Smith, in Anderson's reading, took a consistently pro-worker attitude, inveighing against oppressive practices of employers, collusion of capital, and cruel government policies. Smith had faith that, given a fair deal and decent education, all workers would strive to the best of their abilities to better their own condition, work diligently, and thereby benefit the community as well as themselves.


For the story of Eden is fundamentally wrong. Humans have always worked, and indeed valued work. Looking outside the window at a squirrel trying to get into the bird feeder ... is to see someone working with enthusiasm and diligence. That is our natural state. The only problem was that, as human civilization progressed, power relations, and then even more- industrialization- generated work that was not only cruel and oppressive, but meaningless. The worker, forced to work for others instead of him- or herself, and routinized into a factory cog, became fully alienated from it. How to get workers to do it, nevertheless? Obviously, having a work ethic is not a full solution, unless it is of a particularly astringent and dogmatic (or tyrannical) sort. Thus the dilemma of capitalist economies. For all their trumpeting of the "natural laws" of competition and "freedom" for employers to exploit and workers to be fired, capitalism violates our true natures in fundamental ways.

So the question should be, as Anderson eventually alludes to, do we have a life ethic that includes work, rather than just a work ethic? She states plainly that the most important product of the whole economic system is ... people. Their reproduction, raising, education, and flourishing. It is not consumption products that should be the measure of economic policy, but human happiness. And a major form of human happiness is doing meaningful work, including the domestic work of the family. The world of Star Trek is even alluded to in Anderson's last chapter- one where no one works for subsistence, but rather, people work for fulfillment. And they do so with zeal.

Anderson sees great potential in the more progressive forms of the work ethic, and in the social democratic political systems that implemented them after World War 2. She argues that this is the true legacy of Marxism (and of Thomas Paine, interestingly enough) and expresses the most durable compromise between market and capital-driven corporate structures and a restored work ethic. Some amount of worker participation in corporate governance, for instance, is a fundamental reform that would, in the US, make corporations more responsive to their cultural stakeholders, and work more meaningful to workers. Tighter regulation is needed throughout the private economy to make work more humane for the very low-paid, giving workers better pay and more autonomy- real freedom. More public goods, such as free education to university levels, and better provision for the poor, especially in the form of a job guarantee, would make life bearable for many more people. For my part, inheritance seems a key area where the ethics of the dignified work and equal opportunity run up against completely unjust and artificial barriers. In America, no one should be born rich, and everyone should grow and express themselves by finding a place in the world of work.



Saturday, February 10, 2024

How the Supreme Court Should Rule in the Colorado Ballot Case

There is one path forward that can salvage the court's standing.

The US Supreme Court is sinking to unusual depths of corruption and illegitimacy. Bush v. Gore was a step down in its ability to manage the rules of our political and legal system, where it made a hasty and, it claimed, one-time-only carve-out for its favored candidate, leading to almost a decade of tragically bad policy and poor government. Then came Citizens United, another step downward, opening firehoses of secret money from the wealthy, using the fig leaf of "free speech" to cover the corruption of politics with money. Then came the overturning of Roe, deeming women unworthy of rights that are far more basic and intimate than those enumerated in the Bill of Rights. And most recently have come the drumbeat of reports of corruption among the right-wing justices, who appear to regard themselves as too dignified to abide by the laws and norms they hold others to.

Now it is faced with a case that tests the very core of the court's ability to do its job. What does the constitution mean? Does the fourteenth amendment mean what it says, and if so, should it be enforced? A great deal of commentary suggests, probably correctly, that this court is desperately looking for a way out of this legal conundrum that allows it to do nothing and avoid overturning any apple carts. That would not, however, be a wise course.

To recap, the Colorado case was brought by voters who sought to bar Donald Trump from the Colorado primary and general election ballots, due to his participation in the insurrection of January 6, 2021. The fourteenth amendment to the federal constitution bars such participants from federal and state offices. The Colorado Supreme Court ultimately agreed, sending the case to the US Supreme Court. The congressional report on the January 6 events makes the record of those events quite clear. It uses the word "insurrection" several times, as do many of its sources, and it is crystal clear about the participation by and culpability of Donald Trump in those events. 


The question is really about how the Constitutional provision should be brought into execution, being worded without a lot of explicit legal structure. One thing it does say is that congress can relieve its prohibition in individual cases by two-thirds votes of each house. But it leaves unsaid who should adjudicate the question of fitness for office, as is also the case for the more basic qualifications such as age and citizenship. Trump had previously, and ironically, dabbled in these same legal waters by casting doubt on the citizenship of Barack Obama. But since no one with half a brain took him seriously, the issue never entered the legal sphere.

Well, the worst course would be to let the clear language of the constitution lie inert and find some technicality by which to do nothing. What I would suggest instead is that the court recognize that there needs to be a national adjudicating power to decide this question in the case of candidates for national office (and indeed for any office whose qualifications are mentioned in the constitution). And that power should be itself, the US Supreme Court. The court might invite the legislative branch to provide more regular methods of fact-finding (or even a clarifying amendment to the constitution), but given the constitution's clear intent, history, and logic, (not to mention the general Article III clauses putting all questions arising from the constitution in its hands), the court should take upon itself the power to say that the buck stops at its door. And naturally, in consequence, that Trump merits disqualification, on the facts of the January 6 events as found by the lower courts, and on his position as an officer, indeed the paramount officer, of the United States.

This solution would neatly take over from the states the responsibility of saying that any national candidate meets or does not meet the various qualifications set forth in the constitution. Such cases could begin in state courts, as this one did, but would need to go to the US Supreme Court for final decision. This solution would hold Trump to account for his actions, a principle that Republicans have, at least traditionally, cherished. This solution would also go some way to removing the stain of the Bush v Gore decision, and establish a new and clear element of constitutional jurisprudence, in setting forth who adjudicates the qualifications of national political candidates. In fact, this function can be tied to the practice of having the chief justice of the United States administer the oath of office to the incoming president. It would be proper for the court to be the institution that decides on the basic fitness tests, and thus who in general may take the oath, while the people decide the political outcome of the election, among fit candidates.

I am no legal scholar, but the merits of this solution seem pretty clear. On an earlier occasion (Marbury v. Madison), the court summarily took on the task of determining the constitutionality of laws. This role was not explicitly set out in the text, but was a logical consequence of the structure that the constitution set up. Here likewise, the logic of the constitution indicates strongly that the final word on the fitness of candidates for national office must rest with, not the voters, not the states, and not the legislative or, heaven forbid, the executive branch, but with the federal judicial branch, of which the US Supreme Court is the head.

An alternative, and perhaps more likely, solution is for this court to state all the principles above, but then hold that in its judgement, Donald Trump is fit for office after all. Maybe it will deem the insurrection just a little insurrection, and not the kind of big insurrection that would turn a jurist's head (despite the over one thousand charges filed, and hundreds of federal convictions so far). Or maybe it will deem Trump insufficiently involved in the insurrection to merit disqualification. What it can not do is deem him not an officer of the federal government- that would be beyond belief. The pusillanimous, partisan sophistry of this alternative would not go over well, needless to say. Many would ask whether Clarence Thomas, himself virtually a participant in the insurrection at one remove, should have recused himself. Minds would be blown, but few would be surprised, since for this court, expectations could hardly be lower. Going against its partisan grain would, on the other hand, be a signal and heartening achievement.

This second approach would at least resolve the legal questions, but at the cost of further erosion of the court's legitimacy, given that the events of January 6 are so well documented, and the constitutional peril that Trump poses so obvious. For the whole point of having a Supreme Court which takes on tough issues and plugs logical holes in our constitution is that it also takes some care to plug them well, and preserve our democracy in the process.


  • What happens when the Supreme Court gives in to politics?
  • One state, one system.
  • A solar energy insurrection in Puerto Rico.
  • Democratic inequality is related to wealth inequality.
  • More on the court case- ballots vs office holding.

Saturday, February 3, 2024

Spiritual Resources for the Religiously Abstemious

Nones are now a plurality in the US. What are we supposed to do?

The Pew Research Center recently came out with polling that shows a significantly changed religious landscape in the US. Over the last couple of decades, while the religious right has been climbing the greasy pole of political power, gaining seats on the Supreme Court, and agitating for a return to patriarchy, their pews have been emptying. The religiously unaffiliated, or "nones", comprise 28% of the US population now, almost double the level of two decades ago.

One has only to see the rabid support evangelicals give their orange-haired messiah to understand what has been turning people off. Or glance over the appalling chronicle of sexual abuse unearthed in the Catholic church. Maybe the horsemen of the Atheist apocalypse have had something to do with it. Russia under Putin is strenuously demonstrating that the same system can be just as cruel with or without religion. But these patterns of gross institutional, moral, and intellectual failure, and their ensuing critiques, are hardly new. Luther made a bit of hay out of the abuses of the Catholic church; Voltaire, among many other thinkers, ridiculed the whole religious enterprise; and Hitler was a forerunner of Trump in leaning on religion, at least early in his career, despite being a rather token Christian himself (other than in the antisemitism, of course). What is new now?

A dramatic rise in numbers of people with no religious affiliation and little interest, from Pew polling.

I am not sure, frankly. Europe has certainly been leading the way, showing that declining religion is quite compatible with prosperous and humane culture. But perhaps this phenomenon is part of the general isolation and atomization of US culture, and thus not such a good thing. It used to be that a community was unthinkable without a church (or several) to serve as the central hub. Churches served to validate the good and preach to the bad. They sponsored scout troops, weddings, charitable events and dinners, and committees and therapeutic encounters of all sorts. They were socially essential, whether one believed or not. That leaders of society also led the churches knit the whole circle together, making it easy to believe that something there was indeed worth believing, whether it made sense or not.

Now, the leadership of society has moved on. We are mesmerized by technology, by entertainment, and sports, perhaps to a degree that is new. The capitalist system has found ways to provide many of the services we used to go to churches for, to network, to get psychotherapy, to gossip, and most of all, to be entertained. Community itself is less significant in the modern, suburban, cocooned world. Successful churches meet this new world by emphasizing their social offerings in a mega-church community, with a dash of charismatic, but not overly intellectually taxing, preaching. Unfortunately, megachurches regularly go through their own crises of hypocrisy and leadership, showing that the caliber of religious leaders, whatever their marketing skills, has been declining steadily.

The "nones" are more apathetic than atheistic, but either way, they are not great material for making churches or tightly knit communities. Skeptical, critical, or uninterested, they are some of the least likely social "glues". Because, frankly, it takes some gullibility and attraction to the core human archetypes and drama to make a church, and it takes a lot of positive thinking to foster a community. I would promote libraries, arts institutions, non-profits, and universities as core cultural hubs that can do some of this work, fostering a learning and empathetic culture. But we need more.

As AI takes over work of every sort, and more people have more time on their hands, we are facing a fundamental reshaping of society. One future is that a few rich people rake off all the money, and the bulk of the population descends into poverty and joblessness, unneeded in a society where capitalism has become terminally capital-intensive, with little labor required. Another future is where new forms of redistribution are developed, either by bringing true competition to bear on AI-intensive industries so that they can not take excess profits, or by thorough regulation for the public good, including basic income schemes, public goods, and other ways to spread wealth broadly. 


Such a latter system would free resources for wider use, so that a continuing middle class economy could thrive, based on exchanges that are now only luxuries, like music, personal services, teaching, sports, counseling. The destruction of the music recording industry by collusion of music labels and Spotify stands as a stark lesson in how new technology and short-sighted capitalism can damage our collective culture, and the livelihood of a profession that is perhaps the avatar of what an ideal future would look like, culturally and economically.

All this is to say that we face a future where we should, hopefully, have more resources and time, which would in principle be conducive to community formation and a life-long culture of learning, arts, and personal enrichment, without the incessant driver of work. The new AI-driven world will have opportunities for very high level work and management, but the regular hamburger flippers, baristas, cabbies, and truck drivers will be a thing of the past. This is going to put a premium on community hubs and new forms of social interaction. The "nones" are likely to favor (if not build) a wide range of such institutions, while leaving the church behind. It is a mixed prospect, really, since we will still be lacking a core institution that engages with the whole person in an archetypal, dream-like fantasy of hope and affirmation. Can opera do that work? I doubt it. Can Hollywood? I doubt that as well, at least as it applies to a local community level that weaves such attractions together with service and personal connection.


  • Those very highly moral religious people.
  • Molecular medicine is here.
  • Why do women have far more autoimmune syndromes?
  • What to do about Iran.
  • "As we’ll see, good old-fashioned immortality has advantages that digital immorality cannot hope to rival." ... I am not making this up!


Saturday, January 20, 2024

The Tragedy of Daniel Boone

Pathfinding and hunting his way through the paradise the Indians had built.

Daniel Boone is (or used to be) one of the most iconic / archetypal figures in US history and popular consciousness. His remains have been fought over, his life mythologized and serialized, and his legacy cherished as heroic and exemplary. It all began with his trusty rifle, with which he was the surest shot. He was a pathfinder, never lost in the vast wilderness he explored and helped settle. And he was a steadfast leader of men, rescuer of damsels in distress, and killer of Indians. What's not to admire? His definitive biography, by John Faragher, paints a more ambivalent picture, however.

Boone loved the woods- loved hunting, loved nature, and loved solitude. Given those talents and tendencies, he naturally strayed from the borderlands of North Carolina into the mountains, becoming a full time hunter and trapper. In a couple of early forays into what we now know as Kentucky, he hunted on a commercial basis, wasting the animals to pile up hundreds of pelts, which his employees / colleagues processed in camp. 

The biography emphasizes that what Boone found in Kentucky was a paradise- lush and full of game. The region, believe it or not, was full of not just deer and beaver, but bear and buffalo. It is the kind of eden that had been encountered by Europeans many times over in the "New World". Fisheries of unimaginable richness, skies full of birds, forests as far as the eye could see. Kentucky was not an uninhabited eden, however- it was the cherished hunting ground of native Cherokee and Shawnee, among others, who saw exactly what Boone saw, but responded to it differently. Not with plunder and destruction, but with care and stewardship.

Boone blindly shot away, and then followed his cultural programming further by leading his family and many others across the mountains to found Boonesborough, building a fort and defending it against numerous Indian attacks. The biography notes that Boone's parents had ten children, and he had ten children, and his children had similar-sized families. One can imagine where that kind of reproduction leads: to desperate expansion and heedless use of resources. While acknowledged as the pioneer of Kentucky settlement, Boone was no businessman, and all his grasping for land in the speculative rush that developed in his wake came to naught. He was sloppy in his paperwork and was outlawyered and out-cheated at every turn. One may see the personality type of his adversaries in the current senior senator from Kentucky, Mitch McConnell. Boone was all too honest and simple, having been raised a Quaker.

Portrayal of the siege of a stockade, not unlike that of Boonesborough, as Native Americans try to drive off the cloud of locusts denuding their land.

The game had been hunted out, the people had become unfriendly and dense underfoot, and Boone's property and business schemes had all fallen apart. In despair over what he had wrought in Kentucky, Boone pulled up stakes and moved out to the next frontier, near St. Louis. An extremely late hunting trip has him heading through what is now Yellowstone park, reliving for the last time the kind of eden that Native Americans had nurtured with their respect for the value and cycles of nature, and even more, with their light footprint as small populations.

European culture and immigrants have accomplished wonderful things in America. But decimating its natural wonders, resources, and native peoples is not one of them. Daniel Boone was caught up in the economics of inexorable population growth and the need to make a "business model" out of hunting and trapping. Well, what comes of that is not pretty, and does not sustain what had brought him into the woods to start with.


Saturday, January 6, 2024

Damned if You do, Damned if You Don't

The Cherokee trail of tears, and the Palestinian conundrum.

History is a long and sad tale of conflict, interspersed with better times when people can put their animosities aside. Just as economics deals in scarcity and its various solutions, history likewise turns on our inevitable drive towards overpopulation, with resulting scarcity and conflict. Occasionally, special technological, spiritual, or organizational achievements- or catastrophes- may allow periods of free population growth with their attendant buoyant mood of generosity. But more commonly, groups of people covet each other's resources and plot ways to get them. This was one of the lessons of Malthus and Darwin, who addressed the deeper causes of what we see as historical events.

The "New World" provided Europeans with an unprecedented release for their excess populations, especially the malcontented, the desperate, and the ambitious. They rhapsodized about the "virgin" lands that lay open, generally dismissing the numerous and well-organized natives present all over these lands, as "savages", occupying a lower technological and theological level of existence. There were plenty of rationalizations put forth, like Christianizing the natives, or "civilizing" them. But the hypocrisy of these formulations becomes clear when you consider the fate of the Cherokees, one of the "five civilized tribes". 

By the early 1800's, a couple of centuries of contact had already gone under the bridge, (as narrated by Pekka Hämäläinen in "Indigenous Continent"), and native Americans were all integrated to various degrees in trading networks that brought them European goods like guns, pots, knives, and novel practices like horse riding. The Cherokees, occupying the lower Appalachians and piedmont between what is now Georgia and Alabama, were more integrated than most, adopting European farming, living, schooling, and governing practices. They even owned African American slaves, and wrote themselves a US-modeled constitution in 1827, in the script devised by the scholar Sequoyah.

Did this "progress" toward assimilation with European culture help them? Far from it! Their excellence in farming, literacy, and government raised fears of competition in the white colonists, and the Georgia state government lobbied relentlessly for their removal. Andrew Jackson finally obliged. He pressured the Cherokees to re-open their status as a settled nation, devised a removal treaty with a minority party, and then sent all the Cherokees in the region (about 16,000) off on the Trail of Tears, to the barren lands of Oklahoma. These Cherokees lost roughly a quarter of their population along the way, in a brutal winter. Compare this with the partition of India, where about twelve percent of the refugees are thought to have perished, out of roughly 16 million in total.

A small part of the annals of ethnic cleansing, US edition. Needless to say, the "Indian territory" ended up a lot smaller than originally promised.
 

Georgia was thus ethnically cleansed, and does not seem to experience a great deal of regret about it. The logic of power is quite simple- the winner gets the land and spoils. The loser is lucky to not be killed. That the Europeans were significantly more powerful than their native antagonists doesn't change the logic, though it might appeal to our empathy and nostalgia in retrospect. The Cherokees and other Native Americans might have been accepted into US society. They might have been given one or two states for their sovereign governments, as the Mormons managed. There were a lot of possibilities that might have made us a more interesting and diverse nation. But at the same time, most Native Americans participated fully in the politics of power, terrorizing each other, making slaves of each other, and killing each other. They were not innocents. So the fact that they came up against a stronger power was hardly a novelty, though in this case that power was blundering and cruel, shared very few of their cultural coordinates, and was highly hypocritical about its own.

All this comes to mind when viewing the Israeli-Palestinian conflict. Israel won the major Middle East wars that so dramatically emasculated the Palestinians, first in the civil war that left Jordan and Egypt in charge of the Palestinian areas, then in the 1967 war that left all these areas in Israeli hands. But what to do with them? On founding, Israel was a liberal, New Testament kind of country, with humanist values and lefty kibbutzim. The then-recent Holocaust also caused a bit of hesitance when it came to either killing or exiling the losing Palestinians. Indeed, given that its neighbors Jordan and Egypt lost these wars, it would have made some sense at that time to deport all the Palestinians, of which there were about one to two million. But rather than do that, or make a firm border, Israel immediately started encroaching into Palestinian territory with security areas and "settlements", and has set up an ever more elaborate, though selectively porous and self-serving, security and boundary system.

Both sides have a schizophrenic reaction to the other. On the Palestinian side, the psychology of losing has meant quietism and acquiescence by some, but resentment and militancy by others. Both lead to a spiral of worse treatment, the weakness of the former inviting abuse, and the desperate depredations of the latter inciting revenge, "security" measures, and tighter occupation. The provocations by each side are unendurable, and thus the situation deteriorates. Yet, in the end, Israel has all the power and the responsibility to come up with a long term solution. Over the decades, Israel has morphed from its founding ethos into something much more conservative and Old Testament, less beholden to the humanistic ideals of the post-WW2 period. The wanton killing, starvation, and collective punishment of Gaza makes visible this moral breakdown.

The Palestinians can't win either way, either through Hamas's implacable hatred and impotent attacks, or through the acquiescence of the Palestinian National Authority, which, in thanks for its good behavior, has received the creeping expansion of Israeli "settlements" on its land. These now take up, according to a detailed map, about 1/3 to 1/2 of the land of the West Bank. Overall, the options are: 1) to expel the Palestinians outright, which appears to be, for Gaza at least, where Israeli policy is heading, (made more ironic by the realization by historians that the Biblical Exodus never actually took place), or 2) to continue to muddle along in a torturous occupation with creeping dispossession, or 3) to grant Palestine some kind of autonomy and statehood. Assimilation, (4), long dreamt of by some, seems impossible for a state that is fundamentally an ethnic (or theological) state, and whose whole raison d'etre is ethnic separation, not to even mention the preferences of the Palestinians. Though perhaps assimilation without voting rights, in a sort of semi-slavery or apartheid, is something the Israelis would be attracted to? Perhaps insignia will need to be worn by all Palestinians, sewn to their clothing?

Map of the West Bank of the Jordan, color coded by Palestinian marginal control in brown, and settler/Israeli control in red.

What should happen? Indigenous Americans were infected, decimated, hunted down, translocated, re-educated, and confined to a small and very remote system of reservations. Hopefully we have progressed a little since then, as a largely European civilization, which is putatively shared by Israel. Thus the only way forward, as is recognized by everyone outside Israel, is the two-state solution, including a re-organization of the Palestinian territories into a final, clearly demarcated, and contiguous state. Israel's current political system will never get there. But we can help the process along in a few ways.

First, it is disappointing to see our current administration shipping arms to Israel at a furious pace, only to see them used to kill thousands of innocent, if highly resentful, civilians. Israel has plenty of its own money to buy whatever it needs elsewhere. We need to put some limitations on our military and other aid relationships, to motivate change. (Though that raises the question of Israel's increasingly cozy relationship with Russia). Second, we should recognize Palestine as a state, and bring forward its integration into the international system. This will not resolve its borders or myriad security and territory issues vis-à-vis Israel, but it would helpfully motivate things in that direction. Israel has constantly cried wolf about the lack of a credible partner to negotiate with, but that is irrelevant. Israel is perfectly capable of building the walls it needs to keep Palestinians at bay. But then it wants pliant workers as well, and a peaceful neighbor, security vis-à-vis Jordan and Egypt, territorial encroachments, and many other things that are either destructive, or need to be negotiated.

By far the most constructive thing that could be done is to freeze and re-organize the Jewish settlements and other paraphernalia that have metastasized all over the West Bank. There is no future without a decent and fair solution in territory, which is the third big thing we need to press- our own detailed territorial plan for Palestine. For one thing, Israel could easily vacate the whole corridor / valley facing Jordan. That would give a consolidated Palestine a working border with a country that is now peaceful, quite well run, and friendly to both sides. There are countless possible maps. We just need to decide on one that is reasonably fair and force it on both sides, which are each, still after all these years, apparently unwilling to imagine a true peace. This means principally forcing it on Israel, which has been the dominant and recalcitrant party the entire time.

The Cherokees are now one of the largest indigenous populations in the US, at roughly a quarter million, with their own territory of about seven thousand square miles in Oklahoma. They have internal and partial sovereignty, which means that they deal with their own affairs on a somewhat independent basis, but otherwise are largely subject to most laws of the enclosing governments. The Cherokees could easily have been assimilated into the US. Only racism stood in the way, in a mindset that had long descended into a blind and adversarial disregard of all native Americans as "others", (the irony!), competitive with and less than, the newly arrived immigrants. We could have done much better, and one would like to think that, a hundred or a hundred and fifty years on, we would have.

In the end, the West (read as European civilization, as developed out of the ashes of World War 2) is either for or against wars of aggression, ethnic cleansing, apartheid, and human rights. Israel has won its wars, but never faced up to its responsibilities to the conquered Palestinians, and has tried to have it both ways, to be viewed by the world as a modern, enlightened state, even as it occupies and slowly strangles the people it defeated decades ago. 


  • Slovenly strategic thinking. But really, visionless long-term politics.
  • One Gazan speaks.
  • Settler colonialism.
  • Who's the victim?
  • Shades of Scientology ... the murky networks of the deep evangelical state.
  • In California, solar still makes sense.

Saturday, December 23, 2023

How Does Speciation Happen?

Niles Eldredge and the theory of punctuated equilibrium in evolution.

I have been enjoying "Eternal Ephemera", which is an end-of-career memoir/intellectual history from a leading theorist in paleontology and evolution, Niles Eldredge. In this genre, often of epic proportions and scope, the author takes stock of the historical setting of his or her work and tries to put it into the larger context of general intellectual progress, (yes, as pontifically as possible!), with maybe some gestures towards future developments. I wish more researchers would write such personal and deeply researched accounts, of which this one is a classic. It is a book that deserves to be in print and more widely read.

Eldredge's claim to fame is punctuated equilibrium, the theory (or, perhaps better, observation) that evolution occurs much more haltingly than in the majestic gradual progression that Darwin presented in "Origin of Species". This is an observation that comes straight out of the fossil record. And perhaps the major point of the book is that the earliest biologists, even before Darwin, but also including Darwin, knew about this aspect of the fossil record, and were thus led to concepts like catastrophism and "etagen". Only Lamarck had a steadfastly gradualist view of biological change, which Darwin eventually took up, while replacing Lamarck's mechanism of intentional/habitual change with that of natural selection. Eldredge unearths tantalizing and, to him, supremely frustrating, evidence that Darwin was fully aware of the static nature of most fossil series, and even recognized the probable mechanism behind it (speciation in remote, peripheral areas), only to discard it for what must have seemed a clearer, more sweeping theory. But along the way, the actual mechanism of speciation got somewhat lost in the shuffle.

Punctuated equilibrium observes that most species recognized in the fossil record do not gradually turn into their descendants, but are replaced by them. Eldredge's subject of choice is trilobites, which have a long and storied record spanning almost 300 million years, featuring replacement after replacement, with species averaging a few million years duration each. It is a simple fact, but one that is a bit hard to square with the traditional / Darwinian and even molecular account of evolution. DNA is supposed to act like a "clock", with constant mutational change through time. And natural selection likewise acts everywhere and always... so why the stasis exhibited by species, and why the apparently rapid evolution in between replacements? That is the conundrum of punctuated equilibrium.

There have been a lot of trilobites. This comes from a paper about their origin during the Cambrian explosion, arguing that only about 20 million years was enough for their initial speciation (bottom of image).

The equilibrium part, also termed stasis, is seen in the current / recent world as well as in the fossil record. We see species such as horses, bison, and lions that are identical to those drawn in cave paintings. We see fossils of animals like wildebeest that are identical to those living, going back millions of years. And we see unusual species in recent fossils, like saber-toothed cats, that have gone extinct. We do not typically see animals that have transformed over recent geological history from one (morphological) species into another, or really, into anything very different at all. A million years ago, wildebeest seem to have split off a related species, the black wildebeest, and that is about it.

But this stasis is only apparent. Beneath the surface, mutations are constantly happening and piling up in the genome, and selection is relentlessly working to ... do something. But what? This is where the equilibrium part comes in, positing that wide-spread, successful species are so hemmed in by the diversity of ecologies they participate in that they occupy a very narrow adaptive peak, which selection works to keep the species on, resulting in apparent stasis. It is a very dynamic equilibrium. The constant gene flow among all parts of the population that keeps the species marching forward as one gene pool, despite the ecological variability, makes it impossible to adapt to new conditions that do not affect the whole range. Thus, paradoxically, the more successful the species, and the more prominent it is in the fossil record, the less change will be apparent in those fossils over time.

The punctuated part is that these static species in the fossil record eventually disappear and are replaced by other species that are typically similar, but not the same, and do not segue from the original in a gradual way that is visible in the fossil record. No, most species and locations show sudden replacement. How can this be so if evolution by natural selection is true? As above, wide-spread species are limited in what selection can do. Isolated populations, however, are more free to adapt to local conditions. And if one of those local conditions (such as arctic cold) happens to be what later happens to the whole range (such as an ice age), then it is more likely that a peripherally (pre-)adapted population will take over the whole range, than that the resident species adapts with sufficient speed to the new conditions. Range expansion, for the peripheral population, is easier and faster than adaptation is for the wide-ranging original species.

The punctuated equilibrium proposition came out in the 1970's, and naturally followed theories of speciation by geographic separation that had previously come out (also resurrected from earlier ideas) in the 1930's to 1950's, but which had not made much impression (!) on paleontologists. Paleontologists are always grappling with the difficulties of the record, which is partial, and does not preserve a lot of what we would like to know, like behavior, ecological relationships, and mutational history. But they did come to agree that species stasis is a real thing, not just, as Darwin claimed, an artifact of the incomplete fossil record. Granted- if we had fossils of all the isolated and peripheral locations, which is where speciation would be taking place by this theory, we would see the gradual change and adaptation taking place. So there are gaps in the fossil record, in a way. But as long as we look at the dominant populations, we will rarely see speciation taking place before our eyes, in the fossils.

So what does a molecular biologist have to say about all this? As Darwin insisted early in "Origin", we can learn quite a bit from domesticated animals. It turns out that wild species have a great amount of mostly hidden genetic variation. This is apparent whenever one is domesticated and bred for desired traits. We have bred dogs, for example, to an astonishingly wide variety of traits. At the same time, we have bred them out to very low genetic diversity. Many breeds are saddled with genetic defects that cannot be resolved without outbreeding. So we have in essence exchanged the vast hidden genetic diversity of a wild species for great visible diversity in the domesticated species, combined with low genetic diversity.

What this suggests is that wild species have great reservoirs of possible traits that can be selected for the purposes of adaptation under selective conditions. Which suggests that speciation in range edges and isolated environments can be very fast, as the punctuated part of punctuated equilibrium posits. And again, it reinforces the idea that during equilibrium with large populations and ranges, species have plenty of genetic resources to adapt and change, but spend those resources reinforcing / fine tuning their core ecological "franchise", as it were.

In population genetics, it is well known that neutral mutations arise and fix (that is, spread to 100% of the population, on both alleles) at the same rate no matter how large the population, in theory. That is to say, bigger populations generate more mutations, but correspondingly hide them better in recessive form (if deleterious), and, for neutral mutations, take much longer to allow any individual mutation to drift to either extinction or fixation. Selection against deleterious mutations is more relentless in larger populations, while relaxed selection and higher drift can allow smaller populations to explore wider ranges of adaptive space, perhaps finding globally higher (fitness) peaks than the parent species could find.
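The arithmetic behind that first claim is worth making explicit. This is a toy calculation with an assumed mutation rate, not data from any real species: in a diploid population the population size cancels out of the substitution rate.

```python
# Neutral theory arithmetic: in a diploid population of size N with
# per-site, per-generation mutation rate mu, about 2*N*mu new mutations
# arise each generation, and each neutral one fixes with probability
# 1/(2*N). The N's cancel, so the long-run substitution rate is just mu.

def substitution_rate(N, mu):
    new_mutations = 2 * N * mu    # neutral mutations entering the population
    p_fix = 1 / (2 * N)           # fixation probability of a new neutral allele
    return new_mutations * p_fix  # expected fixations per generation

mu = 1e-8  # assumed rate, for illustration only
for N in (100, 10_000, 1_000_000):
    print(f"N = {N:>9}: substitution rate = {substitution_rate(N, mu):.2e}")
```

The same rate comes out for every population size, which is why, in theory, sequence divergence can serve as a clock regardless of demography.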

Eldredge cites some molecular work that claims that at least twenty percent of sequence change in animal lineages is due specifically to punctuational events of speciation, and not to the gradual background accumulation of mutations. What could explain this? The actual mutation rate is not at issue, (though see here), but the numbers of mutations retained, perhaps due to relaxed purifying selection in small populations, and founder effects and positive selection during the speciation process. This kind of phenomenon also helps to explain why the DNA "clock" mentioned above is not at all regular, but quite variable, making an uneven guide to dating the past.

Humans are another good example. Our species is notoriously low in genetic diversity, compared to most wild species, including chimpanzees. It is evident that our extremely low population numbers (over prehistoric time) have facilitated speciation, (that is, the fixation of variants which might be swamped in bigger populations), which has resulted in a bewildering branching pattern of different hominid forms over the last few million years. That makes fossils hard to find, and speciation hard to pinpoint. But now that we have taken over the planet with a huge population, our bones will be found everywhere, and they will be largely static for the foreseeable future, as a successful, wide-spread species (barring engineered changes). 

I think this all adds up to a reasonably coherent theory that reconciles the rest of biology with the fossil record. However, it remains frustratingly abstract, given the nature of fossils that rarely yield up the branching events whose rich results they record.


Saturday, November 25, 2023

Are Archaea Archaic?

It remains controversial whether the archaeal domain of life is 1 or 4.5 billion years old. That is a big difference!

Back in the 1970's, the nascent technologies of molecular analysis and DNA sequencing produced a big surprise- that hidden in the bogs and hot springs of the world are micro-organisms so extremely different from known bacteria and protists that they were given their own domain on the tree of life. These are now called the archaea, and in addition to being deeply different from bacteria, they were eventually found to be the progenitors of the eukaryotic cell- the third (and greatest!) domain of life that arose later in the history of the biosphere. The archaeal cell contributed most of the nuclear, informational, membrane management, and cytoskeletal functions, while one or more assimilated bacteria (most prominently the future mitochondrion and chloroplast) contributed most of the metabolic functions, as well as membrane lipid synthesis and peroxisomal functions.

Carl Woese, who discovered and named archaea, put his thumb heavily on the scale with that name, (originally archaebacteria), suggesting that these new cells were not just an independent domain of life, totally distinct from bacteria, but were perhaps the original cell- that is, the LUCA, or last universal common ancestor. All this was based on the sequences of rRNA genes, which form the structural and catalytic core of the ribosome, and are conserved in all known life. But it has since become apparent that sequences of this kind, which were originally touted as "molecular clocks", or even "chronometers", are nothing of the kind. They bear the traces of mutations that happen along the way, and, being highly important and conserved, do not track the raw mutation rate, (which itself is not so uniform either), but rather the rate at which change is tolerated by natural selection. And this rate can be wildly different at different times, as lineages go through crises, bottlenecks, adaptive radiations, and whatever else happened in the far, far distant past.
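The clock problem can be put in numbers with a toy calculation (illustrative rates and dates, not real rRNA figures): under a strict clock, the distance between two lineages is twice the rate times the divergence time, so a rate burst confined to one branch makes the split look far older than it is.

```python
# Strict-clock dating: pairwise distance d = 2 * r * t, so t = d / (2 * r).
def inferred_age_gy(distance, rate):
    # rate in substitutions per site per billion years (illustrative units)
    return distance / (2 * rate)

rate = 0.01       # assumed background rate
true_split = 1.0  # the lineages actually diverged 1.0 Gya

# If both lineages keep the background rate, the clock recovers the truth.
steady = 2 * rate * true_split
print(inferred_age_gy(steady, rate))   # 1.0 Gy

# If one lineage tripled its accumulated change (a bottleneck, a radiation,
# relaxed constraint), total distance is 1x + 3x = 4x the per-branch
# expectation, and the same formula doubles the apparent age.
bursty = 4 * rate * true_split
print(inferred_age_gy(bursty, rate))   # 2.0 Gy
```

This is exactly the distortion invoked below: a lineage that "sped away" from its parent looks much more anciently diverged than it really is.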

Carl Woese, looking over filmed spots of 32P labeled ribosomal RNA from different species, after size separation by electrophoresis. This is how RNAs were analyzed, back in 1976, and such rough analysis already suggested that archaea were something very different from bacteria.

There since has been a tremendous amount of speculation, re-analysis, gathering of more data, and vitriol in the overall debate about the deep divergences in evolution, such as where eukaryotes come from, and where the archaea fit into the overall scheme. Compared with the rest of molecular biology, where experiments routinely address questions productively and efficiently due to a rich tool chest and immediate access to the subject at hand, deep phylogeny is far more speculative and prone to subjective interpretation, sketchy data, personal hobbyhorses, and abusive writing. A recent symposium in honor of one of its more argumentative practitioners made that clear, as his ideas were being discarded virtually at the graveside.

Over the last decade, estimates of the branching date of archaea from the rest of the tree of life have varied from 0.8 to 4.5 Gya (billion years ago). That is a tremendous range, and is a sign of the difficulty of this field. The frustrations of doing molecular phylogeny are legion, just as the temptations are alluring. Firstly, there are very few landmarks in the fossil record to pin all this down. There are stromatolites from roughly 3.5 Gya, which pin down the first documented life of any kind. Second are eukaryotic fossils, which start, at the earliest, about 1.5 Gya. Other microbial fossils pin down occasional sub-groups of bacteria, but archaea are not represented in the fossil record at all, being hardly distinguishable from bacteria in their remains. Then we get the Cambrian explosion of multicellular life, roughly 0.5 Gya. That is pretty much it for the fossil record, aside from the age of the moon, which is about 4.5 Gya and gives us the baseline of when the earth became geologically capable of supporting life of any kind.

The molecules of living organisms, however, form a digital record of history. Following evolutionary theory, each organism descends from others, and carries, in mutated and altered form, traces of that history. We have parts of our genomes that vary with each generation, (useful for forensics and personal identification), we have other parts that show how we changed and evolved from other apes, and we have yet other areas that vary hardly at all- that carry recognizable sequences shared with all other forms of life, and presumably with LUCA. This is a real treasure trove, if only we can make sense of it.

But therein lies the rub. As mentioned above, these deeply conserved sequences are hardly chronometers. So for all the data collection and computer wizardry, the data itself tells a mangled story. Rapid evolution in one lineage can make it look much older than it really is, confounding the whole tree. Over the years, practitioners have learned to be as judicious as possible in selecting target sequences, while getting as many as possible into the mix. For example, adding up the sequences of 50-odd ribosomal proteins can give more and better data than assembling the 2 long-ish ribosomal RNAs. They provide more and more diverse data. But they have their problems as well, since some are much less conserved than others, and some were lost or gained along the way. 
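The concatenation approach mentioned here can be sketched in miniature. This is a toy with hypothetical gene names and made-up aligned sequences; real analyses align dozens of ribosomal proteins with dedicated tools, then join them end-to-end into one "supermatrix" with one long row per species.

```python
# Per-gene alignments: each gene maps species -> aligned sequence,
# where "-" marks an alignment gap. (Toy data, not real proteins.)
alignments = {
    "rpl2": {"E_coli": "MKL-A", "Haloferax": "MKLQA", "Yeast": "MRLQA"},
    "rps3": {"E_coli": "GAVV",  "Haloferax": "GS-V",  "Yeast": "GSIV"},
}

# Concatenate the genes in a fixed order to build the supermatrix:
# every species gets one row, and all rows have the same length.
supermatrix = {
    species: "".join(alignments[gene][species] for gene in sorted(alignments))
    for species in ("E_coli", "Haloferax", "Yeast")
}

for species, row in supermatrix.items():
    print(species, row)
```

Tree-building software then treats each row as one long character matrix, which is why many shorter proteins can give more, and more diverse, signal than two long rRNAs.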

A partisan of the later birth of archaea provides a phylogenetic tree with countless microbial species, and one bold claim: "inflated" distances to the archaeal and eukaryotic stems. This is given as the reason that archaea (lower part of the diagram, including eukaryotes, termed "archaebacteria"), looks very ancient, but really just sped away from its originating bacterial parent, (the red bacteria), estimated at about 1 Gya. This tree is based on an aligned concatenation of 26 universally conserved ribosomal protein sequences, (51 from eukaryotes), with custom adjustments.

So there has been a camp that claims that the huge apparent / molecular distance between the archaea and other cells is just such a chimera of fast evolution. Just as the revolution that led to the eukaryotic cell involved a lot of molecular change, including the co-habitation of countless proteins that had never seen each other before, duplications / specializations, and many novel inventions, whatever process led to the archaeal cell (from a pre-existing bacterial cell) might also have caused the key molecules we use to look into this deep time to mutate much more rapidly than is true elsewhere in the vast tree of life. What are the reasons? There is the general disbelief / unwillingness to accept someone else's work, and evidence like possible horizontal transfers of genes from chloroplasts to basal archaea, some large sequence deletion features that can be tracked through these lineages and interpreted to support late origination, some papering over of substantial differences in membrane and metabolic systems, and there are plausible (via some tortured logic) candidates for an originating, and late-evolving, bacterial parent.

This thread of argument puts the origin of eukaryotes roughly at 0.8 Gya, which is, frankly, uncomfortably close to the origination of multicellular life, and gives precious little time for the bulk of eukaryotic diversity to develop, which exists largely, as shown above, at the microbial level. (Note that "Animalia" in the tree above is a tiny red blip among the eukaryotes.) All this is quite implausible, even to a casual reader, and makes this project hard to take seriously, despite its insistent and voluminous documentation.

Parenthetically, there was a fascinating paper that used the evolution of the genetic code itself to make a related point, though without absolute time attributions. The code bears hallmarks of some amino acids being added relatively late (tryptophan, histidine), while others were foundational from the start (glycine, alanine), when it may have consisted of two RNA bases (or even one) rather than three. All of this took place long before LUCA, naturally. This broad analysis of genetic code usage argued that bacteria tend to use a more ancient subset of the code, which may reflect their significantly more ancient position on the tree of life. While the full code was certainly in place by the time of LUCA, the inherited genome / pool of proteins may still, at that time, have carried a bias against the relatively novel amino acids. This finding implies that the time of archaeal origination was later than the origination of bacteria, by some unspecified but significant amount.

So, attractive as it would be to demote the archaea from their perch as super-ancient organisms, given their small sizes, small genomes, specialization in extreme environments, and peripheral ecological position relative to bacteria, that turns out to be difficult to do. I will turn, then, to a very recent paper that gives what I think is a much more reasoned and plausible picture of the deeper levels of the tree of life, and the best general picture to date. This paper is based on the protein sequences of the rotary ATPases that are universal, and were present in LUCA, despite their significant complexity. Indeed, the more we learn about LUCA, the more complete and complex this ancestor turns out to be. Our mitochondrion uses a (bacterial) F-type ATPase to synthesize ATP from the food-derived proton gradient. Our lysosomes use an (archaeal) V-type ATPase to drive protons into / acidify the lysosome in exchange for ATP. These are related, derived from one distant ancestor, and each was likely present in LUCA. Additionally, each ATPase is composed of two types of subunits, one catalytic, and one non-catalytic, which originated from an ancient protein duplication, also prior to LUCA. The availability of these molecular cousins / duplications provides helpful points of comparison throughout, particularly for locating the root of the evolutionary tree.

Phylogenetic trees based on ATP synthase enzymes that are present in all forms of life. On left is shown the general tree, with branch points of key events / lineages. On right are shown sub-trees for the major types of the ATP synthase, whether catalytic subunit (c), non-catalytic (n), F-type, common in bacteria, or V type, common in archaea. Note how congruent these trees are. At bottom right in the tiny print is a guide to absolute time, and the various last common ancestors.

This paper also works quite hard to pin the molecular data to the fossil and absolute time record, which is not always done. The bottom line is that archaea by this tree arise quite early, (see above), co-incident with or within about 0.5 Gy of LUCA, which was bacterial, at roughly 4.4 Gya. The bacterial and archaeal last common ancestors are dated to 4.3 and 3.7 Gya, respectively. The (fused) eukaryotic last common ancestor dates to about 1.9 Gya, with the proto-mitochondrion's individual last common ancestor among the bacteria some time before that, at roughly 2.4 Gya.

This time line makes sense on many fronts. First, it provides a realistic time frame for the formation and diversification of eukaryotes. It puts their origin right around the great oxidation event, which is when oxygen became dominant in earth's atmosphere, (about 2 to 2.4 Gya), which was a precondition for the usefulness of mitochondria to what are otherwise anaerobic archaeal cells. It places the origin of archaea (LACA) a substantial stretch after the origin of bacteria, which agrees with the critic's points above that bacteria are the truly basal lineage of all life, and archaea, while highly different and pretty archaic, also share a lot of characteristics with bacteria, and perhaps more so with certain early lineages than with others that came later. The distinction between LUCA and the last common bacterial ancestor (LBCA) is a technical one given the trees they were working from, and is not, given the ranges of age presented, (see figure above), significant.

I believe this field is settling down, and though this paper, working from only a subset of the most ancient sequences plus fossil set-points, is hardly the last word, it appears to represent a consensus view and is the best picture to date of the deepest and most significant waypoints in the deep history of life. This is what comes from looking through microscopes, and finding entire invisible worlds that we had no idea existed. Genetic sequencing is another level over that of microscopy, looking right at life's code, and at its history, if darkly. What we see in the macroscopic world around us is only the latest act in a drama of tremendous scale and antiquity.