Saturday, January 25, 2025

The Climate is Changing

Fires in LA, and a puff of smoke in DC.

An ill wind has blown into Washington, a government of whim and spite, eager to send out the winged monkeys to spread fear and kidnap the unfortunate. The order of the day is anything that dismays the little people. The wicked witch will probably have melted away by the time his most grievous actions come to their inevitable fruition: besmirching and belittling our country, and impoverishing the world. Much may pass without too much harm, but the climate catastrophe is already here, burning many out of their homes, as though they were made of straw. Immoral and spiteful contrariness on this front will reap the judgement and hatred of future generations.

But haven't the biosphere and the climate always been in flux? Such is the awful refrain from the right, a heartless conservatism that parrots greedy, mindless propaganda. In truth, Earth has been blessed with slowness. The tectonic plates make glaciers look like race cars, and the slow dance of Earth's geology has ruled the evolution of life over the eons, allowing precious time for incredible biological diversification that covers the globe with its lush results.

A stretch of relatively unbroken rain forest, in the Amazon.

Past crises on Earth have been instructive. Two of the worst were the end-Permian extinction event, about 252 million years ago (mya), and the end-Cretaceous extinction event, about 66 mya. The latter was caused by a meteor impact, so it was a very sudden event- a shock to the whole biosphere. Following the initial impact and global fires, the collision is thought to have lofted sun-shielding dust and sulfur aerosols, with possible acidification, lasting for years. However, it did not have very large effects on CO2, the main climate-influencing gas.

On the other hand, the end-Permian extinction event, which was significantly more severe than the end-Cretaceous event, was a more gradual affair, caused by intense volcanic eruptions in what is now Siberia. Recent findings show that this was a huge CO2 event, turning the climate of Earth upside down. CO2 went from about 400 ppm, roughly where we are currently, to 2500 ppm. The only habitable regions were the poles, while the tropics were all desert. But the kicker is that this happened over the surprisingly short (geologically speaking) time of about 80,000 years. CO2 then stayed high for the next roughly 400,000 years, before returning slowly to its former equilibrium. The rate of rise was roughly 2.6 ppm per 100 years, yet that change killed off roughly 90% of all species on Earth.

The momentous analysis of the end-Permian extinction event, in terms of CO2, species, and other geological markers, including sea surface temperature (SST). This was the paper in which the geological brevity of the event was first revealed.

Compare this to our current trajectory, where atmospheric CO2 has risen from about 280 ppm at the dawn of the industrial age to 420 ppm now. That is a rate of maybe 100 ppm per 100 years, and rising steeply. It is a rate far too high for many species, and certainly for the process of evolution itself, to keep up with, tuned as it is to geologic time. As yet, this Anthropocene extinction event is not quite at the level of either the end-Permian or end-Cretaceous events. But we are getting there, going far faster than the former, and creating a more CO2-based long-term climate mess than the latter. While we may hope to forestall nuclear war, and thus a closer approximation to the end-Cretaceous event, it is not looking good for the biosphere, purely from a CO2 and warming perspective. And that puts aside the many other plagues we have unleashed: invasive species; pervasive pollution by fertilizers, plastics, and other forever chemicals; and the commandeering of all the best land for farming, urbanization, and other unnatural uses.
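
To put numbers on that comparison, here is a back-of-envelope calculation using the round figures quoted above (a sketch; all values are approximate):

```python
# Rough comparison of CO2 rise rates, using the approximate round
# numbers quoted in the text- not precise data.

permian_rate = (2500 - 400) / 80_000 * 100   # ppm per century, end-Permian
modern_rate = (420 - 280) / 250 * 100        # ppm per century, industrial-era average
recent_rate = 2.5 * 100                      # ppm per century, at the recent ~2.5 ppm/yr pace

print(f"end-Permian:        ~{permian_rate:.1f} ppm per 100 years")     # ~2.6
print(f"industrial average: ~{modern_rate:.0f} ppm per 100 years")      # ~56
print(f"recent pace:        ~{recent_rate:.0f} ppm per 100 years")      # ~250
print(f"recent vs Permian:  ~{recent_rate / permian_rate:.0f}x faster") # ~95x
```

Even the long-run industrial average is more than twenty times the end-Permian rate, and the current pace is roughly a hundred times faster.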

CO2 concentrations, along with emissions, over recent time.

We are truly out of Eden now, and the only question is whether we have the social, spiritual, and political capacity to face up to it. For the moment, obviously not. Something disturbed about our media landscape, and perhaps our culture generally, has sent us for succor, not to the Wizard who makes things better, but to the Wicked Witch of the East, who delights in lies, cruelty and destruction.


Saturday, January 18, 2025

Eking Out a Living on Ammonia

Some archaeal microorganisms have developed sophisticated nano-structures to capture their food: ammonia.

Earth's nitrogen cycle is a bit unheralded, but critical to life nonetheless. Gaseous nitrogen (N2) is all around us, but inert, given its extraordinary chemical stability. It can be broken down by lightning, but little else. It must have been very early in the history of life that the nascent chemical-biological life forms tapped out the geologically available forms of nitrogen, despite being dependent on nitrogen for countless critical aspects of organic chemistry, particularly of nucleic acids, proteins, and nucleotide cofactors. The race was then on to establish a way to capture it from the abundant, if tenaciously bound, dinitrogen of the air. It was thus very early bacteria that developed a way (heavily dependent, unsurprisingly, on catalytic metals like molybdenum and iron) to fix nitrogen, meaning breaking up the triple N≡N bond and making ammonia (NH3), or ammonium (NH4+). From there, the geochemical cycle of nitrogen is all downhill, with organic nitrogen being oxidized to nitric oxide (NO), nitrite (NO2-), and nitrate (NO3-), followed finally by denitrification back to N2. Microorganisms obtain energy from all of these steps; some live exclusively by oxidizing nitrite, much as we oxidize carbon with oxygen to make CO2, while others respire nitrate in place of oxygen.
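
As a quick reference for this redox ladder, here is a minimal sketch (in Python, using the species named in the paragraph above) of the nitrogen oxidation states involved:

```python
# Nitrogen oxidation states along the cycle described above. Fixation
# (N2 -> NH3) is the costly, triple-bond-breaking step; the oxidations
# back up toward nitrate are the energy-yielding steps microbes exploit.
nitrogen_species = {
    "NH3 / NH4+ (ammonia / ammonium)": -3,
    "N2 (dinitrogen gas)": 0,
    "NO (nitric oxide)": +2,
    "NO2- (nitrite)": +3,
    "NO3- (nitrate)": +5,
}

for species, state in sorted(nitrogen_species.items(), key=lambda kv: kv[1]):
    print(f"{state:+3d}  {species}")
```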

Nitrosopumilus, as imaged by the authors, showing its corrugated exterior, a layer entirely composed of ammonia-collecting elements (which can be hexameric or pentameric). Insets show an individual hexagonal complex, in face-on and transverse views. Note also the amazing resolution of other molecules, such as the ribosomes floating about.

A recent paper looked at one of these denizens beneath our feet, an archaeal species that lives on ammonia, converting it to nitrite (NO2-). It is a dominant microbe in this niche- in the oceans, in soils, and in sewage treatment plants. The irony is that we spend prodigious amounts of fossil fuel fixing huge amounts of nitrogen for fertilizer- most of which is wasted, and which today exceeds the entire global budget of naturally fixed nitrogen. We are then faced with excess and damaging amounts of nitrogen in our effluent, which is processed in complex treatment plants by our friends the microbes, down the chain of oxidized states and back to gaseous N2.

Calculated structure of the ammonia-attracting pore. At right are various close-up views including the negatively charged amino acids (D, E) concentrated at the grooves of the structure, and the pores where ammonium can transit to the cell surface. 

The Nitrosopumilus genus is so successful because it has a remarkable way to capture ammonia from the environment, a way that is roughly two hundred times more efficient than that of its bacterial competitors. Its surface is covered by a curious array of hexagons, which turn out to be ammonia capture sites. In effect, its skin is a (relatively) enormous chemical antenna for ammonia, which is naturally at low concentration in sea water. These authors did a structural study, using modern single-particle electron microscopy methods, to show that these hexagons have intensely negatively charged grooves and pores, to which positively charged ammonium ions are attracted. Within this outer shell, but still outside the cell membrane, enzymes at the cell surface transform the captured ammonium into other species such as hydroxylamine; this maintains the ammonium concentration gradient toward the cell surface, and the products are then pumped inside.
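
Why a whole-body antenna works so well can be sketched with the classic physics of diffusion-limited capture (the Smoluchowski limit and the Berg-Purcell patch result). The numbers below are rough assumptions for illustration only- none of them come from the paper:

```python
# Illustrative only: diffusion-limited capture of ammonium by a tiny cell.
# All numbers are rough assumptions, not values from the paper.
import math

D = 2e-9    # m^2/s, approximate diffusion coefficient of NH4+ in water
a = 0.2e-6  # m, assumed cell radius (Nitrosopumilus cells are very small)
c = 10e-9 * 1000 * 6.022e23   # assumed 10 nM ambient ammonium, in molecules/m^3

# Smoluchowski limit: the most any cell-sized sphere can capture by diffusion.
J_max = 4 * math.pi * D * a * c
print(f"perfect absorber: ~{J_max:.0f} molecules/s")

# Berg-Purcell: N small absorbing sites of radius s on the sphere approach
# that limit while covering only a fraction of the surface- which is why
# coating the whole cell in capture sites pays off.
s = 2e-9    # m, assumed effective radius of one capture site
for N in (100, 10_000, 1_000_000):
    J = J_max * (N * s) / (N * s + math.pi * a)
    print(f"{N:>9,} sites: {J / J_max:.0%} of the diffusion limit")
```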

Cartoon model of the ammonium attraction and transit mechanisms of this cell wall. 

It is a clever nano-material and micro-energetic system for concentrating a specific chemical- a method that might inspire human applications for other chemicals that we might need- chemicals whose isolation demands excessive energy, or whose geologic abundance may not last forever.


Saturday, January 11, 2025

A Housing Guarantee

A proposal for an updated poorhouse.

I agree with MMT economists who propose a job guarantee. That would put a floor on the labor market with an offer to anyone who wants to work for a low but living wage, probably set below the minimum wage mandated for the private sector. State and local governments would run cleanups, environmental restoration, and care operations as needed, requiring basic discipline and effort, but no further skills. But they could put higher-skilled workers, as they come along, to more beneficial, complex tasks.

Similarly, I think we could offer a housing guarantee, putting a floor on homelessness and misery. In the state of California, homelessness is out of control, and we have not found solutions, despite a great deal of money spent. Housing in the private market is extremely expensive, far out of reach of those with even median incomes. The next level down is housing vouchers and public housing, of which there are not enough to go around, and which are extremely expensive. And below that are shelters, which are heavily adverse settings: they lack privacy, are chaotic and unpleasant, are meant to be temporary, and can be closed much of the time. They also do not have enough space.

A local encampment under the freeway, temporarily approved during the pandemic.

As uncompassionate as it sounds, it is unacceptable, and should be illegal, for public spaces to be commandeered by the homeless for their private needs. Public spaces have many purposes, specifically not including squatting and vagrancy. It is a problem in urban areas, because that is where people are, and where many services exist at the intersection of public and private spaces- food, bathrooms, opportunities to beg, get drugs, etc. Just because we have been, as governments and citizens, neglectful of our public spaces does not mean we should give them over to anyone who wants to camp on them. I was recently at San Francisco city hall and the beautiful park surrounding it. But at lunch time, I realized that there was nowhere to sit. The plague of homelessness had rendered park benches untenable. We deserve to keep these public spaces functional, and that means outlawing the use of public spaces by the homeless. At the same time, provision must be made for the homeless, who by this policy would have nowhere to go in fully zoned areas. Putting them on buses to the next town, as some jurisdictions do, is not a solution either. As a rich country, we can do more for the homeless even while we preserve public spaces.

I think we need to rethink the whole lower end of housing and shelter, to make it a more regular, accessible, and acceptable way to catch those who need housing at a very basic level. The model would be a sort of cross between a hostel, an SRO (single room occupancy hotel), and army barracks. It would be publicly funded, and provide a private room as well as food, all for free. It would not throw people out, or lock them in.

This poorhouse would not demand work, though it would offer centralized services for finding jobs and other places to live. It would be open to anyone, including runaway teens, battered women, tourists, etc. It would be a refuge for anyone, for any reason, on an unlimited basis. The space and the food would be very basic, motivating clients to seek better accommodation. It would be well-policed, and its clients would have to behave themselves. The next step down in the ladder of indigent care would not be homelessness, which would be outlawed in areas offering this kind of poorhouse, but institutionalization, in increasingly stringent settings for either criminal or mental issues.

Such a poorhouse might become a community center, at least for the indigent. It would be quite expensive, but given the level of inequality and the lack of care for people in various desperate straits, we need to furnish a humane level of existence between the market housing system and institutionalization. Why not give everyone a house? That is neither financially practical, nor would it coexist well with the market housing system. Certainly, more housing needs to be built, and everything done to bring prices down. But to address the current issues, stronger housing policy is needed.

Why not go back to a public housing model? It turned out that public housing was somewhat unrealistic, promising far more than it could deliver. It promised fully functional neighborhoods and housing, pretty much the equivalent of market housing, but without the ongoing discipline- whether from the market, via the residents' private financial responsibility, or from the programs, via their bureaucratic structures and funding- needed to follow through over the long term. The public authorities generally took a hands-off approach to residents and their environment, in line with the (respectful) illusion that this was the equivalent of market housing. And the long term is what counts in housing, since it is ever in need of repair and renovation, not to mention careful use and protection by its residents. Building is one thing, but maintaining is something quite different, and requires carefully thought-out incentives.

With a public poorhouse model, the premises and residents are extensively policed. Individual rooms may descend to squalor, but the whole is built, run, and maintained by the public authorities with intensive surveillance and intervention, keeping the institution as a whole functioning and growing as needed for its mission. There is going to be a sliding scale of freedom vs. public involvement, via financing and policing. The less functional a person is, the more control they will have to accept. We cannot wash our hands of the homeless by granting them "freedom" to thrash about in squalor and make dumps of public spaces.


  • Or you could join the squid game.
  • Economic policy should not be about efficiency alone, let alone rewarding capital and management, but about long-term cultural and environmental sustainability.
  • Could AI do biology?
  • Carter was an evangelical. But that was a different time.

Saturday, January 4, 2025

Drilling Into the Transcriptional Core

Machine learning helps to tease out the patterns of DNA at promoters that initiate transcription.

One of the holy grails of molecular biology is the understanding of transcriptional initiation. While there are many levels of regulation in cells, the initiation of transcription is perhaps, of all of them, the most powerful. An organism's ability to keep most genes off, to turn on the genes needed to build particular tissues, and to regulate others in response to urgent needs is the very soul of how multicellular organisms operate. The decision to transcribe a gene into its RNA message (mRNA) represents a large investment, as that transcript can last hours or more, and during that time be translated into a great many protein copies. Additionally, this process identifies where, in the otherwise featureless landscape of genomic DNA, genes are located- another significant problem, and one that took molecular biologists a long time to figure out.

Control over transcription is generally divided into two conceptual and physical regions- enhancers and promoters. Enhancers are typically far from the start site of transcription, and are modules of DNA sequences that bind innumerable regulatory proteins, which collectively tune initiation in both fine and rough ways. Promoters, in contrast, are at the core, and straddle the start site of transcription (TSS, for short). They feature a much more limited set of motifs in the DNA sequence. The promoter is the site where the proteins bound to the various enhancers converge and encourage the formation of a "preinitiation complex", which includes the RNA polymerase that actually carries out transcription, plus a lot of ancillary proteins. The RNA polymerase cannot find a promoter or initiate on its own. It requires direction by the regulatory proteins and their promoter targets before finding its proper landing place. So the study of promoter initiation and regulation has a very long history, as a critical part of the central flow of information in molecular biology, from DNA to protein.

A schematic of a promoter, where initiation of transcription of Gene A happens, with the start site (+1) right at the boundary of the orange and green colors. At this location, the RNA polymerase will melt the DNA strands and start synthesizing an RNA strand, using the (bottom) template strand of the DNA. Regulatory proteins bound to enhancers far away in the genomic DNA bend through space to activate proteins bound at the core promoter, which load the polymerase and initiate this process.

A recent paper provided a novel analysis of promoter sequences, using machine learning to derive a relatively comprehensive account of the relevant sequences. Heretofore, many promoters had been dissected in detail, and several key features found. But many human promoters had none of them, showing that our knowledge was incomplete. This new approach started strictly from empirical data- the genome sequence, plus large experimental compilations of nascent RNAs, as they are expressed in various cells, mapped to the precise base where they initiated- that is, their respective TSSs. These were all loaded into a machine learning model that was supplemented with explanatory capabilities. That is, it was not just a black box, but gave interpretable results useful to science, in the form of small sequence signatures that it found to be needed to make particular promoters work. These signatures presumably bind particular proteins that are the operational engines of regulatory integration and promoter function.
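
The general shape of such an approach can be seen in a minimal convolutional model- a sketch of the strategy only, not the authors' architecture: one-hot DNA goes in, a per-base initiation score comes out, and the first-layer filters can afterwards be inspected as candidate motifs.

```python
# A minimal sketch of interpretable TSS modeling- not the authors' model.
# One-hot DNA in, per-base initiation score out; the first convolutional
# layer acts as a bank of motif scanners whose weights can be inspected.
import torch
import torch.nn as nn

class PromoterNet(nn.Module):
    def __init__(self, n_motifs=64, motif_len=15):
        super().__init__()
        self.motifs = nn.Conv1d(4, n_motifs, motif_len, padding="same")
        self.head = nn.Conv1d(n_motifs, 1, kernel_size=1)

    def forward(self, x):              # x: (batch, 4, length) one-hot DNA
        return self.head(torch.relu(self.motifs(x))).squeeze(1)

model = PromoterNet()
x = torch.zeros(1, 4, 1000)            # dummy 1 kb one-hot sequence window
x[0, 0, :] = 1.0                       # pretend the window is all 'A'
print(model(x).shape)                  # torch.Size([1, 1000]): one score per base

# After training against measured TSS profiles, model.motifs.weight can be
# aligned and summarized to recover motifs such as TATA- one common strategy
# for making such models interpretable.
```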

The TATA motif, found about 30 base pairs upstream of the transcription start site in many promoters. This is a motif view, where the statistical prevalence of the base is reflected in the height of the letter (top, in color), and its converse is reflected below in gray. Regular patterns like this in DNA usually mean that some protein binds to the site- in this case, TFIID.


For example, the grand-daddy of them all is the TATA box, which dates back to bacteria / archaea and was easily dug up by this machine learning system. The composition of the TATA box is shown above in graphical form, where the probability of occurrence (of a base in the DNA) is reflected in the height of the base over the axis line. A few G/C bases surround a central motif of T/A, and the TSS is typically 30 base pairs downstream. What happens here is that one of the central proteins of the RNA polymerase positioning complex, TFIID, binds strongly to this sequence and bends the DNA by roughly ninety degrees, forming a launchpad of sorts for the polymerase, which later finds and opens the DNA at the transcription start site. TFIID and the TATA box are well known, so it certainly is reassuring that this algorithmic method recovered them. TATA boxes are common at regulated promoters, being highly receptive to regulation by enhancer protein complexes. This is in contrast to more uniformly expressed (housekeeping) genes, which typically use other promoter DNA motifs and, incidentally, tend to have much less precise TSS positions. They might have start sites that range over a hundred base pairs, more or less stochastically.
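
As an aside on how the letter heights in such a logo are computed: each position's base frequencies are scaled by that position's information content, in bits. A minimal sketch, with made-up aligned sequences:

```python
# How sequence-logo letter heights (as in the TATA figure above) are
# computed: per-position base frequencies scaled by information content.
# The aligned example sequences here are made up for illustration.
import math
from collections import Counter

seqs = [
    "GTATAAAAGG",
    "CTATATAAGC",
    "GTATAAATGG",
    "CTATATATGC",
]

for pos in range(len(seqs[0])):
    counts = Counter(s[pos] for s in seqs)
    freqs = {b: counts[b] / len(seqs) for b in "ACGT" if counts[b]}
    # information content = 2 bits (the max for DNA) minus Shannon entropy
    entropy = -sum(f * math.log2(f) for f in freqs.values())
    info = 2 - entropy
    # in a logo, each letter is drawn with height = frequency * info
    heights = {b: round(f * info, 2) for b, f in freqs.items()}
    print(pos, heights)
```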

The main advance of this paper was to find more DNA sites, and new types of sites, which collectively account for the positioning and activation of all promoters in humans. Instead of the previously known three or four factors, they found nine major DNA sequences, plus a smattering of weaker patterns, which they combine into a predictive model that matches empirical data. Most of these DNA sequences were previously known, but not as parts of core promoters. For example, one is called YY1, because it binds the YY1 protein, which has long been appreciated as a transcriptional repressor acting from enhancer positions. But now it turns out to also be a core promoter participant, identifying and turning on a class of promoters that, like most of the new-found sequence elements, tend to operate genes that are not heavily regulated, but rather universally expressed, with delocalized start sites.

Motifs and initiator elements found by the current work. Each motif, presumably matched by a protein that binds it, gets its own graph relating the motif location (at 0 on the X axis) to the start sites of transcription it directs- which for TATA are about 30 base pairs downstream. Most of the newly discovered motifs are bi-directional, directing start sites and transcription both upstream and downstream. This wastes a lot of effort, as the upstream transcripts are typically quickly discarded. The NFY motif has an interesting pattern of 10.5 bp periodicity (one helical turn of DNA) in its directed start sites, which suggests that the protein complex that binds this site hugs one side of the DNA quite closely, setting up start sites on that side of the helix.

Secondly, these authors find that most of the new sequences they identify have bidirectional effects. That is, they set up promoters that fire in both directions, with start sites typically about forty base pairs both downstream and upstream of the binding site. This explains a great deal of transcription data derived from new sequencing technologies, which shows that many promoters fire in both directions, even though the "upstream" or non-gene-side transcript tends to be short-lived.


Overview of the new results, summarized by type of DNA sequence pattern. The total machine learning prediction was composed of predictions for larger motifs, which were the dominant pattern, plus a small contribution from "initiators", which comprise a few patterns right at the start site, plus a large but diffuse contribution from tiny trinucleotide patterns, such as the CG-containing patterns known to mark active genes and to be targets of regulatory DNA methylation.


A third finding was the set of trinucleotide motifs that serve as a sort of fudge factor for their machine learning model, filling in details to make the match to empirical data come out better. The motif length was set more or less arbitrarily, but these small patterns play a big part in the model fit. They note that one common example is the CG pattern, among the stronger of the trinucleotide motifs. This dinucleotide is known as CpG, and is the target of chemical methylation of DNA by regulatory enzymes, which helps to mark and regulate genes. The current work suggests that there may be more systems of this kind yet to be discovered, which play a modulating role in gene/promoter selection and activation.
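
For a concrete sense of how CG-richness is quantified, here is the classic CpG observed/expected ratio over a DNA window (a standard formula; the example sequences are made up):

```python
# The classic CpG observed/expected ratio (Gardiner-Garden & Frommer),
# often used to flag the CG-rich promoter regions mentioned above.
# The example sequences below are made up for illustration.
def cpg_obs_exp(seq: str) -> float:
    seq = seq.upper()
    n = len(seq)
    c, g = seq.count("C"), seq.count("G")
    cg = sum(1 for i in range(n - 1) if seq[i:i + 2] == "CG")
    expected = c * g / n if n else 0.0
    return cg / expected if expected else 0.0

print(cpg_obs_exp("CGCGGGCCGCGTACGCGCGC"))  # CG-rich: well above the ~0.6 island cutoff
print(cpg_obs_exp("ATTATAATGCATTAAATTAA"))  # CG-poor: near zero
```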

The accuracy of this new learning and modeling system exemplifies some of the strengths of AI, of which machine learning is a sub-discipline. When there is a lot of data available, and a problem that is well defined and on the verge of solution (like the protein folding problem), then AI, or these machine learning methods, can push the field over the edge to a solution. AI / ML are powerful ways to explore a defined solution space for optimal results. They are not "intelligent" in the normal sense of the word (at least not yet), which would imply having generalized world models that would allow them to range over large areas of knowledge, solve undefined problems, and exercise common sense.