Showing posts with label consciousness. Show all posts

Saturday, November 14, 2020

Are Attention and Consciousness the Same?

Not really, though what consciousness is in physical terms remains obscure.

A little like fusion power, the quest for a physical explanation of consciousness has been frustratingly unrewarding. The definition of consciousness is fraught to start with, and since consciousness is by all reasonable hypotheses a chemical process well-hidden in the murky, messy, and mysterious workings of the brain, studying it is maddeningly difficult in every technical sense as well. A couple of recent papers provide some views of just how far away the prospect of a solution is, based on analyses of the visual system, one in humans, the other in monkeys.

Vision provides both the most vivid form of consciousness, and a particularly well-analyzed system of neural processing, from retinal input through lower level computation at the back of the brain and onwards through two visual "streams" of processing to conscious perception (the ventral stream in the inferior temporal lobe) and action-oriented processing (in the posterior parietal lobe). It is at the top of this hierarchy that things get a bit vague. Consciousness has not yet been isolated, and how it could be remains unclear. Is attention the same as consciousness, or different? How can related activities like unconscious high-level vision processing, conscious reporting, pressing buttons, etc. be separated from pure consciousness? They all happen in the brain, after all. Or do those activities compose consciousness?

A few landmarks in the streams of visual processing. V1 is the first level of visual processing, after pre-processing by the retina and lateral geniculate nucleus. Processing then divides into the two streams: the ventral stream ends up in the inferotemporal lobe, where consciousness and memory seem to be fed, while the dorsal stream to the inferior parietal lobule and nearby areas feeds action guidance in the vicinity of the motor cortex.

In the first paper, the authors jammed a matrix of electrodes into the brains of macaques, near the "face cells" of the inferotemporal cortex of the ventral stream. The macaques were presented with a classic binocular rivalry test, with a face shown to one eye, and something else shown to the other eye. Nothing was changed on the screen, nor in the head orientation of the macaque, but their conscious perception alternated (as would ours) between one image and the other. It is thought to be a clever way to isolate perceptual distinctions from lower level visual processing, which stays largely constant- each eye processes each scene fully, before higher levels make the choice of which one to focus on consciously. (But see here). It has been thought that by the time processing reaches the very high level of the face cells, they activate only when a face is being consciously perceived. But that was not the case here. The authors found that these cells, when sampled more densely than has been possible before, show activity corresponding to both images. The face could be read out using one filter on these neurons, while another filter on a large fraction (1/4 to 1/3) of the same population could read out the non-face image. So by this work, this level of visual processing in the inferotemporal cortex is biased by conscious perception to concentrate on the conscious image, but not exclusively- the cells are not entirely representative of consciousness. This suggests that whatever consciousness is takes place somewhere else, or at a selective ensemble level of particular oscillations or other spike coding schemes.

"We trained a linear decoder to distinguish between trial types (A,B) and (A,C). Remarkably, the decoding accuracy for distinguishing the two trial types was 74%. For comparison, the decoding accuracy for distinguishing (B, A) versus (C, A) from the same cell population was 88%. Thus, while the conscious percept can be decoded better than the suppressed stimulus, face cells do encode significant information about the latter. ... This finding challenges the widely-held notion that in IT cortex almost all neurons respond only to the consciously perceived stimulus."
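The kind of readout described above can be sketched with a toy simulation- synthetic "face cells" that respond strongly to the perceived face and weakly, but above noise, to the suppressed stimulus, decoded by a simple nearest-centroid linear readout. All cell counts, tuning strengths, and noise levels here are invented for illustration; this is not the paper's actual analysis pipeline:

```python
import random

random.seed(0)

N_CELLS, N_TRIALS = 50, 200

# Invented tuning: each "face cell" carries a strong face signal plus a
# weaker, but nonzero, signal about the suppressed non-face stimulus.
face_w = [random.gauss(0, 1.0) for _ in range(N_CELLS)]
suppressed_w = [random.gauss(0, 0.4) for _ in range(N_CELLS)]

def trial(suppressed_is_B):
    """One population response: face signal + weak suppressed signal + noise."""
    s = 1.0 if suppressed_is_B else -1.0
    return [fw + s * sw + random.gauss(0, 1.0)
            for fw, sw in zip(face_w, suppressed_w)]

# Trial types (A, B) vs (A, C): same percept A, different suppressed stimulus
data = [(trial(lbl), lbl) for lbl in [True, False] * (N_TRIALS // 2)]
train, test = data[:150], data[150:]

def centroid(rows):
    return [sum(col) / len(rows) for col in zip(*rows)]

cB = centroid([x for x, lbl in train if lbl])
cC = centroid([x for x, lbl in train if not lbl])

def predict(x):
    # Nearest-centroid readout- one of the simplest linear decoders
    dB = sum((a - b) ** 2 for a, b in zip(x, cB))
    dC = sum((a - b) ** 2 for a, b in zip(x, cC))
    return dB < dC

acc = sum(predict(x) == lbl for x, lbl in test) / len(test)
print(f"decoding accuracy for the suppressed stimulus: {acc:.2f}")
```

Even though the suppressed-stimulus signal is much weaker per cell than the face signal, pooling it across the population lets the decoder recover it well above chance- which is the logic behind the paper's 74% figure.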

 

The second paper used EEG on human subjects to test their visual and perceptual response to disappearing images and filled-in zones. We have areas in our visual field where we are physically blind (the blind spot, where the optic nerve exits the retina), and where higher levels of the visual system "fill in" parts of the visual scene to make our conscious perception seem smooth and continuous. The experimenters came up with a forbiddingly complex visual presentation system of calibrated dots and high-frequency snow whose purpose was to pit visual attention against conscious perception. When attention is directed to the blind spot, that is precisely when the absence of an image there becomes apparent. This allowed the experimenters to ask whether a typical neural signature of high-level visual processing (the steady-state visually evoked potential, or SSVEP) reflects conscious perception, as believed, or attention or other phenomena. They presented and removed image features all over the scene, including blind spot areas. What they found was that the SSVEP signal was heightened as attention was directed to the invisible areas- exactly the opposite of what would be expected if the signal were tied to actual conscious visual perception. This suggested that this particular signal is not a neural correlate of consciousness, but one of attention, and perhaps of surprise / contrast, instead.

So where are the elusive neural correlates of consciousness? Papers like these refine what and where it might not be. It seems increasingly unlikely that "where" is the right question to ask. Consciousness is graded, episodic, extinguishable in sleep, heightened and lowered by various experiences and drugs. So it seems more like a dynamic but persistent pattern of activity than a locus, let alone a homunculus. And what exactly that activity is... a Nobel prize surely awaits someone on that quest.


  • Unions are not a good thing ... sometimes.
  • Just another debt con.
  • Incompetent hacks and bullies. An administration ends in character.
  • Covid and the superspreader event.
  • Outgoing Secretary of State is also a deluded and pathetic loser.
  • But others are getting on board.
  • Bill Mitchell on social capital, third-way-ism, "empowerment", dogs, bones, etc.
  • Chart of the week: just how divided can we be?

Saturday, July 25, 2020

The Gift

How to be thankful, without anyone to be thankful to.

Remember back when Barack Obama told business leaders that "you didn't build that"? He meant that they didn't build all the public goods that their businesses relied on- the roads, the legal system, the military defense, the regulatory bodies creating fair playing fields, the educational system. Businesses make it their business to be as myopic as possible, feeding off "business models" that foist as much cost onto others- workers, the government, the environment- as amorally possible. That is the only way to survive.

We all are a little like that, with tunnel vision focused on what we need, what we can get, and what we can do. Sometimes it is all one can do merely to survive in a world that seems so difficult, competitive, even hostile. But at the same time, who and what are "we"? Is our next need the full measure of our place in reality? Our focus on doing and on agency is a highly misleading aspect of consciousness. It presupposes a gazillion things that we have no agency over, couldn't even if we tried, and couldn't understand in any case. We didn't make our bodies, for one thing. This biology that we think we are so familiar with is, to biologists, incredibly inscrutable. The trillions of cells, billions of neurons, gajillions of molecules, all work away in obscurity to make us go. But are we thankful? Rarely. We didn't make them. We don't even understand them, and a century or two ago, we really, really didn't understand them. They are utterly alien. Yet they are also us.

The story goes similarly with everything else about us- the flow of time and fate, the universe we live in. All these are, at a fundamental level, still hardly understood. Where did all the energy of the big bang come from? What did it expand into? Why did it cool into the particles of physics? Are there other universes? No idea. And even if we had an idea, we weren't there and didn't make it happen. We are recipients, not actors, in this most vast drama. We should not be so distracted by the competitive social systems we live in, and the pressing difficulties of life, as to forget that we, as the conscious "I" of an individual human, are mysterious feathers floating on rivers of unplumbed unconscious depths, in a rich forest of abundance, on a planet mild and pleasant, in a universe that rendered these provisions in fantastic plenitude, to us and possibly to countless other worlds as well.
The lilies of the field, well, they toil quite hard, actually, in their own way. But that may not be apparent to the homilist, and took some science to figure out.

There needn't have been an intention behind all this- to conjure a cosmos, and evolve life. Indeed, it is rather unlikely given the little we do know. At any rate, we have speculated long and hard enough to know that more speculation isn't going to get us very far, or obtain any brownie points. We are, regardless, the beneficiaries of these gifts. This is a, perhaps the, fundamental religious feeling- thankfulness for the infinite powers and entities that we embody, experience, and rely on, yet have precious little understanding of- the mysterium tremendum.

Does this all imply god? No. God is a rather pathetically inferred solution to, or better yet, an anthropomorphization of, this mystery. As social beings, and products of families, we in a primitive state might naturally ascribe the vast mysteries that undergird our existence and far outstrip our conceptions to a personified father figure (or mother, if one's society happens to be matriarchal). No error could be more obvious. Science has served to push the boundaries of mystery a little farther out, from a choking fog where virtually everything is obscure, to a view that goes billions of light-years across the universe. What all this has shown is, that as far as we can see, mechanism is the rule. Our bodies are mechanisms. The universe is a mechanism. Diseases are not the vengeance of jealous gods, nor is the weather. The inference of god has not held up well over time- not well at all. Yet that does not mean that we shouldn't be thankful for the gifts we receive, which are so rich on our life-giving planet. Nor that we shouldn't strive to pass them on rather than destroying them in the current moment of greed, by our thoughtless overpopulation and immiseration of this world.

  • Another soul eaten by the president.
  • And his base... the truly demented.
  • The ideology of business naturally shoots itself in the foot.
  • Failure of public management angers some.

Sunday, June 14, 2020

The Music in my Head

Brain waves correlate with each other, and with cognitive performance.

This is a brief return to brain waves. Neural oscillations are low-frequency synchronized activity of many neurons, from the low single Hz to 50 or 100 Hz. They do not broadcast information themselves, as a radio station might. Rather, they seem to represent coalitions of neurons being ganged together into a temporary unified process. They seem to underlie attention, integration of different sensory and cognitive modes, and perhaps subjective "binding". A recent paper provides a rare well-written presentation of the field, along with a critical approach to correlations between waves coming from different places in the brain. They also find that strength of oscillatory coupling correlates with cognitive performance.

The issue they were trying to approach was the validity of cross-frequency coupling, where a rhythm at one frequency is phase-coupled with one at a higher frequency, at some integer ratio of frequencies, like 1:2. (In other words, harmony was happening). Such entrainment would allow fundamentally different cognitive processes to relate to each other. They study two different types of correlation- the straight frequency coupling as above, and a phase-amplitude coupling where the amplitude of a higher frequency oscillation is shaped to follow a lower frequency wave. This resembles AM radio, where the high-frequency radio signal "carries" a sound signal encoded in its rising and falling amplitude, though its frequency is completely stable. This latter form of coupling was more difficult to find and analyze, and in the end it failed to show significant functional consequences, at least in this initial work.
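The AM-radio analogy can be sketched directly: below, a gamma-like carrier's amplitude is modulated by a theta-like wave, and a crude check confirms the coupling by comparing gamma amplitude near theta peaks versus theta troughs. The frequencies and modulation depth are invented for illustration, not taken from the paper:

```python
import math

FS = 1000                # sample rate, Hz
F_LOW, F_HIGH = 6, 60    # theta-like and gamma-like frequencies (assumed)
DURATION = 2.0           # seconds

n = int(FS * DURATION)
t = [i / FS for i in range(n)]

# Phase-amplitude coupling: the gamma carrier's amplitude follows the theta
# wave, exactly as an AM radio carrier is shaped by the audio signal.
signal = [(1.0 + 0.8 * math.sin(2 * math.pi * F_LOW * ti))
          * math.sin(2 * math.pi * F_HIGH * ti)
          for ti in t]

# Crude coupling check: mean gamma amplitude near theta peaks vs troughs
theta = [math.sin(2 * math.pi * F_LOW * ti) for ti in t]
peak_amp = [abs(s) for s, th in zip(signal, theta) if th > 0.9]
trough_amp = [abs(s) for s, th in zip(signal, theta) if th < -0.9]

peak_mean = sum(peak_amp) / len(peak_amp)
trough_mean = sum(trough_amp) / len(trough_amp)
print(f"mean gamma amplitude at theta peaks:   {peak_mean:.2f}")
print(f"mean gamma amplitude at theta troughs: {trough_mean:.2f}")
```

Real analyses quantify this with measures like the modulation index over many phase bins, but the principle is the same: sort the fast signal's amplitude by the slow signal's phase and see if it varies systematically.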

Cartoons of cross-frequency couplings (CFC, aka harmonies) that were investigated.

The authors' first goal was to isolate gold-standard couplings, whose participating waves come from different locations in the brain, and do not (respectively) resemble contaminating similar waves inherent in the complementary location. After isolating plenty of such cases, they then asked where such phenomena tend to take place, and do they correlate with function, like performance on tests. They used resting brains, instead of any particular task setting. This makes the study more reproducible and comparable to others, but obviously fails to offer much insight into waves as they are (if they are) used for critical task performance. Resting brains have an ongoing hum of activity, including a well-known network of waves and frequency couplings called the default mode network. Going past the authors' statistical analysis of maximally valid correlations, they found a variety of "hubs" of cross-frequency coupling, which had an interesting nature:

"In α:β and α:γ CFS, the α LF hubs were observed in PFC and medial regions that belong to the default mode network [29] or to control and salience networks in the functional parcellation based on fMRI BOLD signal fluctuations [94–96]. This is [in] line with many previous studies that have found α oscillations in these regions to be correlated with attentional and executive functions [14–19]. In contrast, the β and γ HF hubs were found in more posterior regions such as the SM region and the occipital and temporal cortices, where β and γ oscillations are often associated with sensory processing"
Alpha (α), beta (β), and gamma (γ) are different frequency bands of neural oscillations. CFS stands for cross-frequency synchronization, the phase-phase form of cross-frequency coupling (CFC). LF stands for the low frequency partner of the coupling, while HF stands for the high frequency partner or source. PFC stands for the prefrontal cortex, which seems to be a locus of relatively low frequency brain waves, while the sensori-motor (SM) regions are loci of higher-frequency activity. This is interesting, as our brains are generally filters that gather lots of information (high frequency) which is then winnowed down and characterized into more abstract, efficient representations, which can operate at lower frequency.

And does more correlation in the resting state mean better brain performance when doing tasks? These authors claim that yes, this is the case:

"CFS between θ–α with β–γ oscillations (θ–α: β–γ CFS) and CFS between β and γ oscillations (β:γ CFS) showed significant positive correlations with scores on Trail-Making Tests (TMTs), which measure visual attention, speed of processing, and central executive functions, as well as with Zoo Map Tests, which measure planning ability (p < 0.05, Spearman rank correlation test, Fig 8). Intriguingly, negative correlations with the test scores were observed for CFS of α and β oscillations with higher frequencies (α–β:γ) and for γ:Hγ CFS in the Digits Tests measuring WM performance.
...
These results suggest that in a trait-like manner, individual RS CFC brain dynamics are predictive of the variability in behavioral performance in separately measured tasks, which supports the notion that CFC plays a key functional role in the integration of spectrally distributed brain dynamics to support high-level cognitive functions."

RS refers to the resting state, of the subjects and their brains.
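The statistic behind these brain-behavior claims, Spearman rank correlation, is easy to compute: rank both variables, then take the Pearson correlation of the ranks. Here is a minimal stdlib sketch on invented data- per-subject coupling strengths and hypothetical Trail-Making Test scores, not values from the paper:

```python
import math

def rankdata(xs):
    """Assign average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Hypothetical per-subject data: resting-state coupling strength vs test score
coupling = [0.12, 0.31, 0.18, 0.42, 0.25, 0.38, 0.09, 0.29]
tmt_score = [55, 71, 60, 80, 66, 69, 50, 74]

rho = spearman(coupling, tmt_score)
print(f"Spearman rho = {rho:.3f}")
```

Because only ranks matter, the measure is robust to outliers and to any monotonic rescaling of either variable- a sensible choice for noisy EEG-derived coupling estimates.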

It is exciting to see this kind of work being done, gaining insight into information processing in the brain. It is an opaque, alien organ. Though it houses our most intimate thoughts and feelings, how it does so remains one of the great tasks of humanity to figure out.

Saturday, February 29, 2020

Greedy, Hateful, Lustful Bastards

The shadow in Jungian psychology. Our motive force, but also our deepest secret.

As the Buddhists know very well, this thing we call the "I" is not a single thing, and may not be anything at all. It certainly isn't a coherent story of perseverance and triumph. The deeper you go, the less identifiable and singular it is, since we knit together vast numbers and scales of activity, from the reactions of metabolism to the synapsing of neurons and the drive for social success, even to communal and shared culture, into this being entitled "I". Even on the psychological level, there are myriad unconscious elements, making the quest to know one's self a life-long and generally unsuccessful endeavor, for those who are so inclined.

In Freudian psychology, the contents of the unconscious (referred to sometimes as the subconscious) are uniformly bleak. It is the realm of lusts and drives, a Pandora's box to be kept firmly repressed, in order for its custodian to be a functioning member of society. But the effort of repression is draining and costly, leading to a sort of hydraulic theory of the unconscious, where the more material there is to repress, the more effort is required, to the point that people "break down" from the strain. Likewise, releases of pressure through swearing, or watching violent films, or thrill-seeking and similar forms of "fun" relieve some strain, and help maintain the proper psychological pressure.

Jungian psychology sees the unconscious as a much larger and varied entity. It forms the basis of our positive as well as negative motivations, and operates, among many levels, at a level of archetypal symbology that is richly descriptive and informative when allowed expression via dreams, free association, and creative activities like writing and visual arts. It includes our intuition, and can be tremendously healing, persistently giving us images / glimmers of needed changes and goals.

Tibetan Buddhism hosts a large collection of monster and shadow figures. This is Palden Lhamo, who is a protector, but a wrathful one who rides through a lake of blood, spreading death and destruction to Tibet's enemies. Not enough to keep out the Chinese, unfortunately.

But even in Jungian psychology, the unconscious has a dark side- the shadow, which comprises the motivations we try to deny or hide. But we cannot get rid of them- they are always with us and part of us. The greed, hate, and lust that undeniably drive us, but which we do not want as part of our persona- our face to the world. In the theatrical presentation of the self, we are good, virtuous, and respectful. Repression is the order of the day. While much of Jungian psychology is devoted to interpreting positive messages from the unconscious, managing the negative and the dark is very much a focus as well, as these aspects are universal and persistent. It is the work of consciousness to integrate the shadow into the ego / personality, in a controlled and accepting way.

One particular specialty of the shadow is projection, causing us to consciously reject bad traits in ourselves by ascribing them to others. Our president is a master of projection, insulting others, accusing them of the very things he himself is guilty of, as a way of keeping himself sane and narcissistically coherent. Why anyone else puts up with it is hard to fathom, but then certain bloggers have similar problems of casting stones from glass houses. There are also collective projections, like the concept of hell. An important goal of depth psychology is to come to a mature accommodation with all of one's own facets, in order to be able to withdraw projections of this sort, to own one's behavior, good and bad, and thus to master the shadow, without giving up its motivating virtues.

Another way to engage with the shadow is to indulge it to a controlled extent, as happens in bacchanals, carnivals, video games, and Trump rallies. Giving free rein to our dark side is, in the hydraulic sense, very free-ing, re-creational, and possibly even an ecstatic experience. But it must be carefully bounded and controlled. It is no way to run a positive life or culture. One can grade various cultures and their religions on a sort of shadow scale, from the carnage of the Aztecs and Nazis to the perhaps unrealistic compassion of Buddhist culture as in pre-invasion Tibet. Many religions have shown shadow aspects, such as the duality of Zoroastrianism and Manichaeism, and the jihads and crusades of the Islamic and Christian varieties. The happiest societies seem to have the least shadow aspect- places like the Scandinavian countries, with their increasingly mild secularity, and pre-invasion Tibet. In contrast, the unhappiest societies are heavily driven by shadow, like the Islamic countries of today, who not only valorize violence, but mix in plenty of "honor" and misogyny as well.

I think the lesson is that the hydraulic theory of controlled shadow release is not correct, rather, that more repression is better, when done consistently and intelligently. Releasing the shadow is bad, whatever the dose. The Buddhist technologies of meditation and cultivation in ways of charity, compassion, and love are clearly successful in cultivating a wider society that reflects those values. Conversely, having a president whose tastes tend to beauty pageants and WWE, and whose modus tweeterandi is hate, fosters a society that will be experiencing the opposite values.

Saturday, October 19, 2019

The Participation Mystique

How we relate to others, things, environments.

We are all wrapped up in the impeachment drama now, wondering what could be going on with a White House full of people who have lost their moral compasses, their minds. Such drama is an exquisite example of participation mystique, on our part as we look on in horror as the not very bright officials change their stories by the day, rats start to leave the sinking ship, and the president twists in the wind. We might not sympathize, but we recognize, and voyeuristically participate in, the emotions running and the gears turning.

Carl Jung took the term participation mystique from the anthropologist Lucien Lévy-Bruhl. The original conception was rather derogatory, describing the animism common among primitive people- that they project anthropomorphic and social characters onto objects in the landscape, thus setting up mystical connections with rocks, mountains, streams, etc. Are such involvements characteristic of children and primitive people, but not of us moderns? Hardly. Modern people have distancing and deadening mechanisms to manage our mental involvement with projected symbologies, foremost among which is the scientific mindset. But our most important and moving experiences partake of identification with another- thing or person, joining our mental projection with their charisma, whatever that might be.

Participation mystique remains difficult to define and use as a concept, despite books being written about it. But I would take it as any empathetic or identification feelings we have toward things and people, by which the boundaries in between become blurred. We have a tremendous mental power to enter into other's feelings, and we naturally extend such participation (or anthropomorphism) far beyond its proper remit, to clouds, weather events, ritual objects, etc. This is as true today with new age religions and the branding relationships that every company seeks to benefit from, as it is in the more natural setting of imputing healing powers to special pools of water, or standing in awe of a magnificent tree. Such feelings in relation to animals have had an interesting history, swinging from intense identification on the part of typical hunters and cave painters, to an absurd dismissal of any soul or feeling by scientistic philosophers like Descartes, and back to a rather enthusiastic nature worship, nature film-making, and a growing scientific and philosophical appreciation of the feelings and moral status of animals in the present day.

Participation mystique is most directly manipulated and experienced in the theater, where a drama is specifically constructed to draw our sympathetic feelings into its world, which may have nothing to do with our reality, or with any reality, but is drenched in the elements of social drama- tension, conflict, heroic motivations, obstacles. If you don't feel for and with Jane Eyre as she grows from abused child, to struggling adult, to lover, to lost soul, and finally to triumphant partner, your heart is made of stone. We lend our ears, but putting it psychologically, we lend a great deal more, with mirror neurons hard at work.

All this is involuntary and unconscious. Not that it does not affect our conscious experience, but the participation mystique arises as an automatic response from brain levels that we doubtless share with many other animals. Seeing squirrels chase each other around a tree gives an impression of mutual involvement and drama that is inescapable. Being a social animal requires this kind of participation in each other's feelings. So what of the psychopath? He seems to get these participatory insights, indeed quite sensitively, but seems unaffected- his own feelings don't mirror, but rather remain self-centered. He uses his capabilities not to sympathize with, but to manipulate, those around him. His version of participation mystique is a truncated half-experience, ultimately lonely and alienating.

And what of science, philosophy and other ways we systematically try to escape the psychology of subjective identification and participation? As mentioned above in the case of animal studies, a rigid attitude in this regard has significantly retarded scientific progress. Trying to re-establish objectively what is so obvious subjectively is terribly slow, painstaking work. Jane Goodall's work with chimpanzees stands as a landmark here, showing the productive balance of using both approaches at once. But then when it comes to physics and the wide variety of other exotic phenomena that can not be plausibly anthropomorphized or participated in via our subjective identification, the policy of rigorously discarding all projections and identifications pays off handsomely, and it is logic alone that can tell us what reality is.

  • The Democratic candidates on worker rights.
  • Was it trade or automation? Now that everything is made in China, the answer should be pretty clear.
  • On science.
  • Turns out that Google is evil, after all.
  • Back when some Republicans had some principles.
  • If all else fails, how about some nice culture war?
  • What is the IMF for?
  • #DeleteFacebook
  • Graphic: who is going to tax the rich? Who is pushing a fairer tax system overall? Compare Biden with Warren carefully.

Saturday, October 5, 2019

High Intelligence is Highly Overrated by Highly Intelligent People

AI, the singularity, and watching way too much science fiction: Review of Superintelligence by Nick Bostrom.

How far away is the singularity? That is the point when machine intelligence exceeds human intelligence, after which it is thought that this world will no longer be ours to rule. Nick Bostrom, a philosopher at Oxford, doesn't know when this will be, but is fearful of its consequences, since, if we get it wrong, humanity's fate may not be a happy one.

The book starts strongly, with some well argued and written chapters about the role of intelligence in humanity's evolution, and the competitive landscape of technology today that is setting the stage for this momentous transition. But thereafter, the armchair philosopher takes over, with tedious chapters of hairsplitting and speculation about how fast or slow the transition might be, how collaborative among research groups, and especially, how we could pre-out-think these creations of ours, to make sure they will be well-disposed to us, aka "the control problem".

Despite the glowing blurbs from Bill Gates and others on the jacket, I think there are fundamental flaws with this whole approach and analysis. One flaw is a failure to distinguish between intelligence and power. Our president is a moron. That should tell us something about this relationship. It is not terribly close- the people generally acknowledged as the smartest in history have rarely been the most powerful. This reflects a deeper flaw, which is, as usual, a failure to take evolution and human nature seriously. The "singularity" is supposed to furnish something out of science fiction- a general intelligence superior to human intelligence. But Bostrom and others seem to think that this means a fully formed human-like agent, and those are two utterly different things. Human intelligence takes many forms, and human nature is composed of many more things than intelligence. Evolution has strained for billions of years to form our motivations in profitable ways, so that we follow others when necessary, lead them when possible, define our groups in conventional ways that lead to warfare against outsiders, etc., etc. Our motivational and social systems are not the same as our intelligence system, and to think that anyone making an AI with general intelligence capabilities will, will want to, or even can, just reproduce the characteristics of human motivation to tack on and serve as its control system, is deeply mistaken.

The fact is that we have AI right now that far exceeds human capabilities. Any database is far better at recall than humans are, to the point that our memories are atrophying as we compulsively look up every question we have on Wikipedia or Google. And any computer is far better at calculations, even complex geometric and algebraic calculations, than we are in our heads. That has all been low-hanging fruit, but it indicates that this singularity is likely to be something of a Y2K snoozer. The capabilities of AI will expand and generalize, and transform our lives, but unless weaponized with explicit malignant intent, it has no motivation at all, let alone the motivation to put humanity into pods for its energy source, or whatever.

People-pods, from the Matrix.

The real problem, as usual, is us. The problem is the power that accrues to those who control this new technology. Take Mark Zuckerberg for example. He stands at the head of a multinational megacorporation that has inserted its tentacles into the lives of billions of people, all thanks to modestly intelligent computer systems designed around a certain kind of knowledge of social (and anti-social) motivations. All in the interests of making another dollar. The motivations for all this do not come from the computers. They come from the people involved, and the social institutions (of capitalism) that they operate in. That is the real operating system that we have yet to master.

  • Facebook - the problem is empowering the wrong people, not the wrong machines.
  • Barriers to health care.
  • What it is like to be a psychopathic moron.

Saturday, August 24, 2019

Incarnation and Reincarnation

Souls don't reincarnate. Heck, they don't even exist. But DNA does.

What a waste it is to die. All that work and knowledge, down the drain forever. But life is nothing if not profligate with its gifts. Looking at the reproductive strategies of insects, fish, and pollen-spewing trees, among many others, gives a distinct impression of easy come, easy go. Life is not precious, but dime-a-dozen, or less. Humanity proves it all over again with our rampant overpopulation, cheapening what we claim to hold so dear, not to mention laying the rest of the biosphere to waste.

But we do cherish our lives subjectively. We have become so besotted with our minds and intelligence that it is hard to believe, (and to some it is unimaginable), that the machinery will just cease- full stop- at some point, with not so much as a whiff of smoke. Consciousness weaves such a clever web of continuous and confident experience, carefully blocking out gaps and errors, that we are lulled into thinking that thinking is not of this world- magical if not supernatural. Believing in souls has a long and nearly universal history.

Reincarnation in the popular imagination, complete with a mashup of evolution. At least there is a twisty ribbon involved!

Yet we also know it is physical- it has to be something going on in our heads, otherwise we would not be so loath to lose them. Well, lose them we do when the end comes. But it is not quite the end, since our heads and bodies are reincarnations- they come from somewhere, and that somewhere is the DNA that encodes us. DNA incarnates through biological development, into the bodies that are so sadly disposable. And then that DNA is transmitted to new carnate bodies, and re-incarnates all over again in novel combinations through the wonder of sex. It is a simple, perhaps trite, idea, but offers a solid foundation for the terms (and archetypes) that have been so abused through theological and new-age history.

Saturday, June 15, 2019

Can Machines Read Yet?

Sort of, and not very well.

Reading- such a pleasure, but never time enough to read all that one would like, especially in technical fields. Scholars, even scientists, still write out their findings in prose- which is the richest form of communication, but only if someone else has the time and interest to read it. The medical literature is, at the flagship NCBI Pubmed resource, at about 30 million articles in abstract and lightly annotated form. Its partner, PMC, has 5.5 million articles in full text. This represents a vast trove of data which no one can read through, yet which tantalizes with its potential to generate novel insights, connections, and comprehensive and useful models, were we only able to harvest it in some computable form.

That is one of the motivations for natural language processing, or NLP, one of many subfields of artificial intelligence. What we learn with minimal effort as young children, machines have so far been unable to truly master, despite decades of effort and vast computational power. Recent advances in "deep learning" have made great progress in pattern parsing, and learning from large sets of known texts, resulting in the ability to translate one language to another. But does Google Translate understand what it is saying? Not at all. Understanding has taken strides in constricted areas, such as phone menu interactions, and Siri-like services. As long as the structure is simple, and has key words that tip off meaning, machines have started to get the hang of verbal communication.
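The keyword-driven style of "understanding" in these constricted domains can be sketched in a few lines. This is a toy illustration, with intents and trigger words invented for the purpose- real systems are statistical, but the principle of key words tipping off meaning is the same:

```python
# Toy sketch of constrained-domain "understanding": meaning is tipped off
# by keywords, with no real comprehension behind it. The intent names and
# trigger phrases below are invented for illustration.
INTENTS = {
    "check_balance": {"balance", "account", "how much"},
    "transfer": {"transfer", "send", "move"},
    "agent": {"agent", "human", "representative"},
}

def classify(utterance: str) -> str:
    text = utterance.lower()
    # Score each intent by how many of its trigger phrases appear.
    scores = {intent: sum(kw in text for kw in kws)
              for intent, kws in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(classify("I'd like to transfer money"))    # matches "transfer"
print(classify("What is the meaning of life?"))  # no keywords: unknown
```

As long as users stay on script, this works surprisingly well; the moment structure or vocabulary drifts, it fails completely- which is exactly the gap between pattern matching and understanding.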

But dealing with extremely complex texts is another matter entirely. NLP projects directed at the medical literature have been going on for decades, with relatively little to show, since the complexity of the corpus far outstrips the heuristics used to analyze it. These papers are, indeed, often very difficult for humans to read. They are frequently written by non-English speakers, or just bad writers. And the ideas being communicated are also complex, not just the language. The machines need to have a conceptual apparatus ready to accommodate such material, or better yet, to learn within such a space. Recall how perception likewise needs an ever-expanding database / model of reality. Language processing is obviously a subfield of such perception. These issues raise a core question of AI- is general intelligence needed to fully achieve NLP?


I think the answer is yes- the ability to read human text with full understanding assumes a knowledge of human metaphors, general world conditions, and specific facts and relations from all areas of life which amounts to general intelligence. The whole point of NLP, as portrayed above, is not to spew audio books from written texts, (which is already accomplished, in a quite advanced way), but to understand what it is reading fully enough to elaborate conceptual models of the meaning of what those texts are about. And to do so in a way that can be communicated back to us humans in some form, perhaps diagrams, maps, and formulas, if not language.

The intensive study of NLP processing over the Pubmed corpus reached a fever pitch in the late 2000's, but has been quiescent for the last few years, generally for this reason. The techniques that were being used- language models, grammar, semantics, stemming, vocabulary databases, etc.- had fully exploited the technology of the time, but still hit a roadblock. Precision could be pushed to roughly 80% for specific tasks, like picking out the interactions of known molecules, or linking diseases with genes mentioned in the texts. But general understanding was and remains well out of reach of these rather mechanical techniques. This is not to suggest any kind of vitalism in cognition, but only that we have another technical plateau to reach, characterized by the unification of learning, rich ontologies (world models), and language processing.
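To give a flavor of how mechanical those techniques are, a disease-gene linking task of the kind mentioned above can be caricatured as vocabulary-based co-occurrence mining. The gene and disease vocabularies and the abstract below are stand-ins, not real curated resources:

```python
# Rough sketch of vocabulary-based relation mining over abstracts:
# flag sentence-level co-occurrence of a known gene and a known disease.
# Vocabularies and the sample abstract are invented for illustration.
GENES = {"BRCA1", "TP53", "APOE"}
DISEASES = {"breast cancer", "alzheimer"}

def cooccurrences(abstract: str):
    pairs = set()
    for sentence in abstract.split("."):
        s = sentence.lower()
        genes = {g for g in GENES if g.lower() in s}
        diseases = {d for d in DISEASES if d in s}
        pairs |= {(g, d) for g in genes for d in diseases}
    return pairs

text = ("Mutations in BRCA1 are associated with breast cancer. "
        "APOE variants modify alzheimer risk.")
print(cooccurrences(text))
```

A pipeline like this happily links a gene to a disease even when the sentence says "was NOT associated with"- which is roughly why precision plateaus well short of understanding.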

The new neural network methods (tensorflow, etc.) promise to provide the latter part of the equation, sensitive language parsing. But from what I can see, the kind of model we have of the world, with infinite learnability, depth, spontaneous classification capability, and related-ness, remains foreign to these methods, despite the several decades of work lavished on databases in all their fascinating iterations. That seems to be where more work is needed, to get to machine-based language understanding.


  • What to do about media pollution?
  • Maybe ideas will matter eventually in this campaign.
  • Treason? Yes.
  • Stalinist confessions weren't the only bad ones.
  • Everything over-the-air ... the future of TV.

Sunday, June 2, 2019

Backward and Forward... Steps to Perception

Perception takes a database, learning, and attention.

We all know by now that perception is more than simply being a camera, getting visual input from the world. Cameras see everything, but they recognize nothing, conceptualize nothing. Perception implies categorization of that input into an ontology that makes hierarchical sense of the world, full of inter-relationships that establish context and meaning. In short, a database is needed- one that is dynamically adaptive to allow learning to slice its model of reality into ever finer and more accurate categories.

How does the brain do that? The work of Karl Friston has been revolutionary in this field, though probably not well-enough appreciated and admittedly hard for me and others not practiced in mathematical statistics to understand. A landmark paper is "A theory of cortical responses", from 2005. This argues that the method of "Empirical Bayes" is the key to unlock the nature of our mental learning and processing. Bayesian statistics seems like mere common sense. The basic proposition is that the likelihood of some thing is related to our naive model (hypothesis) of its likelihood arrived at prior to any evidence or experience, combined with evidence expressed in a way that can weight or alter that model. Iterate as needed, and the model should improve with time. What makes this a statistical procedure, rather than simple common sense? If one can express the hypothesis mathematically, and the evidence likewise, in a way that relates to the hypothesis, then the evaluation and the updating from evidence can be done in a mechanical way.
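The iterative logic of Bayes can be made concrete with a minimal numerical sketch. The hypothesis (a biased coin) and the sequence of evidence are invented for illustration- the point is only the mechanical prior-to-posterior update, repeated as evidence arrives:

```python
# Minimal Bayesian updating: a naive prior, mechanically revised by evidence.
# Hypothesis: a coin is biased toward heads (p = 0.8) vs fair (p = 0.5).
# The flip sequence is invented for illustration.
def update(prior_biased, heads):
    # Likelihood of this observation under each hypothesis.
    like_biased = 0.8 if heads else 0.2
    like_fair = 0.5
    evidence = like_biased * prior_biased + like_fair * (1 - prior_biased)
    return like_biased * prior_biased / evidence  # Bayes' rule

belief = 0.5  # naive prior: 50/50 on "biased"
for flip in [True, True, False, True, True]:  # mostly heads
    belief = update(belief, flip)
    print(f"P(biased) = {belief:.3f}")
```

Each flip weights the model up or down; with mostly-heads evidence the belief in "biased" climbs well above the naive 0.5, and would climb further with more data. Iterate as needed, and the model improves with time- that is the whole procedure.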

Friston postulates that the brain is such a machine, which studiously models the world, engaging in what statisticians call "expectation maximization", which is to say, progressive improvement in the detail and accuracy of its model, driven by inputs from sensory and other information. An interesting point is that sensory input really functions as feedback to the model, rather than the model functioning as an evaluator of the inputs. We live in the model, not in our senses. The overall mechanism works assiduously to reduce surprise, which is a measure of how inputs differ from the model. Surprise drives both attention and learning.

Another interesting point is the relationship between inference and learning. The model exists to perform inference- that is the bottom-up process of judging the reality and likely causes of some event based on the internal model, activated by input-driven attention. We see a ball fall down, and are not surprised because our model is richly outfitted with calculations of gravitation, weight, etc. We infer that it has weight, and no learning is required. But suppose it is a balloon that floats up instead of falling- a novel experience? The cognitive dissonance represents surprise, which prompts higher-level processing and downward, top-down alterations to the model to allow for lighter-than-air weights. Our inferences about the causes may be incorrect. We may resort to superstition rather than physics for the higher-level inference or explanation. But in any case, the possibility of rising balls would be added to our model of reality, making us less surprised in the future.
The brain as a surprise-minimizing machine. Heading into old age, we are surprised by nothing, whether by great accumulated experience or by a closure against new experiences, and thus reach a stable / dead state. 
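The falling-ball / rising-balloon story can be compressed into a toy surprise-driven update rule. Everything here- the one-number "model", the learning rate- is a drastic simplification for illustration, but it captures the loop: predict, compare to input, and let the prediction error correct the model top-down:

```python
# Sketch of surprise-driven learning: the model predicts, sensory input
# acts as feedback, and prediction error ("surprise") corrects the model.
# The one-dimensional model and learning rate are illustrative only.
model = {"expected_direction": -1.0}  # things fall: predict downward motion
LEARNING_RATE = 0.3

def observe(actual_direction):
    prediction = model["expected_direction"]
    error = actual_direction - prediction  # the surprise signal
    # Top-down correction: nudge the model toward what was observed.
    model["expected_direction"] += LEARNING_RATE * error
    return abs(error)

print(observe(-1.0))  # a ball falls: zero surprise, no learning
print(observe(+1.0))  # a balloon rises: large surprise, model shifts
print(model["expected_direction"])  # model now admits rising objects
```

Note the asymmetry: the expected event passes through with no model change at all, while the surprising one triggers an update- exactly the sense in which inference is cheap and learning is reserved for prediction failures.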

This brings up the physiology of what is going on in the brain, featuring specialization, integration, and recurrent networks with distinct mechanisms of bottom-up and top-down connection. Each sensory mode has its specialized processing system, with sub-modules, etc. But these only work by working together, both in parallel, cross-checking forms of integration, and by feeding into higher levels that integrate their mini-models (say for visual motion, or color assignment) into more general, global models.
"The cortical infrastructure supporting a single function may involve many specialized areas whose union is mediated by functional integration. Functional specialization and integration are not exclusive; they are complementary. Functional specialization is only meaningful in the context of functional integration and vice versa."

But the real magic happens thanks to the backward connections. Friston highlights a variety of distinctions between the forward and backward (recurrent) connections:

Forward connections serve inference, which is the primary job of the brain most of the time. They are regimented, sparsely connected, topographically organized, (like in the regular striations of the visual system). They are naturally fast, since speed counts most in making inferences. On the molecular level, forward connections use fast voltage-gated receptors, AMPA and GABA.

Backward connections, in contrast, serve learning and top-down modulation/attention. They are slow, since learning does not have to obey the rapid processing of forward signals. They tend to occupy and extend to complementary layers of the cortex vs the forward-connecting cells. They use NMDA receptors, which are roughly 1/10 as fast in response as the receptors used in forward synapses. They are diffuse and highly elaborated in their projections. And they extend widely, not as regimented as the forward connections. This allows many different later effects (e.g. error detection) to modulate the inference mechanism. And surprisingly, they far outnumber the forward connections:
"Furthermore, backward connections are more abundant. For example, the ratio of forward efferent connections to backward afferents in the lateral geniculate is about 1 : 10. Another distinction is that backward connections traverse a number of hierarchical levels whereas forward connections are more restricted."

Where does the backward signal come from, in principle? In the brain, error = surprise. Surprise expresses a violation of the expectation of the internal model, and is accommodated by a variety of responses. An emotional response may occur, such as motivation to investigate the problem more deeply. More simply, surprise would induce backward correction in the model that predicted wrongly, whether that is a high-level model of our social trust network, or something at a low level like reaching for a knob and missing it. Infants spend a great deal of time reaching, slowly optimizing their models of their own capabilities and the context of the surrounding world.
"Recognition is simply the process of solving an inverse problem by jointly minimizing prediction error at all levels of the cortical hierarchy. The main point of this article is that evoked cortical responses can be understood as transient expressions of prediction error, which index some recognition process. This perspective accommodates many physiological and behavioural phenomena, for example, extra classical RF [receptive field] effects and repetition suppression in unit recordings, the MMN [mismatch negativity] and P300 in ERPs [event-related potentials], priming and global precedence effects in psychophysics. Critically, many of these emerge from the same basic principles governing inference with hierarchical generative models."

This paper came up due to a citation from current work investigating this model specifically with non-invasive EEG methods. It is clear that the model cited and outlined above is very influential, if not the leading model of general cognition and brain organization. It also has clear applications to AI, as we develop more sophisticated neural network programs that can categorize and learn, or, more adventurously, neuromorphic chips that model neurons on a physical rather than abstract basis and show impressive computational and efficiency characteristics.

Monday, April 8, 2019

That's Cool: Adolescent Brain Development

Brain power and integration increases with development, particularly in the salience network and in the wakeful, attentive beta waves.

We see it happen, but it is still amazing- the mental powers that come on line during child development. Neurobiologists are starting to look inside and see what is happening mechanistically- in anatomical connectivity, activity networks, and brain wave patterns. Some recent papers used fMRI and magnetoencephalography to look at activity correlations and wave patterns over adolescent development. While the methods and analyses remain rather abstruse and tentative, it is clear that such tendencies as impulsivity and cognitive control can be associated with observations about stronger brain wave activity at higher frequencies, lower activity at lower frequencies, and inter-network integration.

An interesting theme in the field is the recognition that the brain is organized not only physically, in various crinkles, folds, nodules, etc., and by functional areas like the motor and sensory cortexes or Broca's area, involved in speech production, but also in connectivity "networks" that can cross anatomical boundaries, yet show coherence, being coordinately activated much more densely inside than outside the network. An example is the default mode network (DMN, or task-negative network), which is active when adults are just resting, not attending to anything in particular, but also not asleep. This is an example of the brain being "on" despite little conscious mental work being done. It may be our unconscious at work or play, much as it is during sleep, on a much longer leash. As one might imagine for this kind of daydreaming activity, it is strongly self-focused, full of memories, feelings, social observations, and future plans. Anatomically, the DMN extends over much of the brain, from the frontal lobes to the temporal and parietal lobes, touching on regions associated with the functions mentioned, like the hippocampus, involved in memory, and temporoparietal areas, involved in sociality / theories of mind. There are roughly twenty such networks currently recognized, which activate during different mental functions, and they provide some answers to the question of how different brain areas are harnessed together for key functions typical of mental activity.

Two networks relevant to this current work are the salience network (SN) and the cingulo-opercular network (CN or CO). The latter is active during chronic attention- our state of being awake and engaged for hours at a time, termed tonic alertness. (This contrasts with phasic alertness, which is much shorter-term / sporadic and reactive.) It is one of several task-positive networks that function in attention and focus. The salience network spans cortical (anterior insula and dorsal anterior cingulate) and subcortical areas (amygdala and central striatum), binding together locations that play roles in salience- assigning value to new events, and reacting to unusual ones. It can then entrain other brain networks to take control over attention, behavior, thoughts, etc.

fMRI studies of the activity correlations between brain networks. The cingulo-opercular and salience network connections (gray) take a large jump in connectivity to other regions in early adolescence. At the same time, fronto-parietal network connections (yellow), characteristic of frontal control and inhibition of other networks, take a dive, before attaining higher levels going into adulthood.

Here we get to brain waves, or oscillations. Superimposed on the constant activity of the brain are several frequencies of electrical activity, from the super-slow delta waves (~ 1Hz) of sleep to the super-fast gamma waves (~50 Hz) which may or may not correlate with attention and perception. The slower waves seem to correlate with development, growth, and maintenance, while the faster waves correlate with functions such as attention and behavior. Delta waves are thought to function during the deepest sleep in resetting memories and other brain functions, and decline sharply with age, being pervasive in infants, and disappearing by old age. Faster waves such as theta (5-9 Hz), alpha (8-12), and beta (14-26 Hz) correlate with behavior and attention, and are generally thought to help bind brain activities together, rather than transmitting information as radio waves might. Attention is a clear example, where large brain regions are bound by coordinated waves, depending on what is being attended to. Thus the "spotlight of attention" is characterized both by the activation of selected relevant brain areas, and also by their binding via phase-locked neural oscillations. These are naturally highly variable and jumbled as time goes on, reflecting the saccadic nature of our mental lives.
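The notion of "binding via phase-locked oscillations" has a standard quantitative face: the phase-locking value (PLV), which is 1 when two signals keep a fixed phase relationship and near 0 when their phases are unrelated. A toy sketch with idealized phase traces (real analyses extract phase from recorded signals by filtering, which is omitted here):

```python
# Toy illustration of phase locking between two oscillations.
# PLV = |mean of exp(i * phase_difference)|: 1.0 for a constant phase lag
# ("bound" regions), near 0 for unrelated phases. Signals are idealized.
import cmath
import math
import random

def plv(phase_a, phase_b):
    n = len(phase_a)
    return abs(sum(cmath.exp(1j * (a - b))
                   for a, b in zip(phase_a, phase_b)) / n)

t = [i / 1000 for i in range(1000)]                 # 1 s sampled at 1 kHz
theta = [2 * math.pi * 6 * x for x in t]            # 6 Hz oscillation phase
locked = [p + 0.5 for p in theta]                   # same rhythm, fixed lag
random.seed(0)
unlocked = [random.uniform(0, 2 * math.pi) for _ in t]  # unrelated phases

print(round(plv(theta, locked), 3))    # 1.0: perfectly phase-locked
print(round(plv(theta, unlocked), 3))  # near 0: no binding
```

The key property is that a constant lag still counts as locked- binding does not require the regions to peak simultaneously, only to keep a consistent phase relationship over time.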

One of the papers above focused on theta and beta waves, finding that adolescents showed a systematic move from lower to higher frequencies. While fMRI scans of non-oscillatory network activity showed greater integration with age, studies of oscillations showed that the main story was de-coupling, mainly at the lower frequencies. What this all seems to add up to is a reduction of impulsivity, via reduced wave/phase coupling, especially between the salience and other networks, at the same time as control over other networks is more integrated and improved, via increased connectivity. So control by choice goes up, while involuntary reactivity goes down. It is suggested that myelination of axons, as part of brain development along with pruning of extra cells and connections, makes long-range connections faster, enabling greater power in these higher-frequency binding/coordination bands.

Brain wave phase coordination between all areas of the brain, measured by frequency and age. Low frequencies associated with basal arousal, motor activity, and daydreaming are notably less correlated in adults, while beta-range frequencies about 25 Hz, associated with focused attention, are slightly more correlated. 

Is this all a little hand-wavy at this point? Yes indeed- that is the nature of a field just getting to grips with perhaps the most complicated topic of all. But the general themes of oscillations as signs/forms of coordination and binding, and active sub-networks as integrating units of brain/mental activity on top of the anatomical and regional units are interesting developments that will grow in significance as more details are filled in.

Saturday, November 24, 2018

The Problem With Atheism

Bernard Mandeville and the impossibility of getting along without lying.

We live in a little cloud of lies. From the simplest white lie and social protocol for hiding unpleasantness, to the universal belief that one's own family, city, and country are better than the other ones, untruth is pervasive, and also essential. Vanity, optimism, a standard set of cognitive biases.. all are opposed to the reality principle. The economic commonplaces which are, as Keynes noted, unknowingly derived from some defunct economist. Our unconscious is resolutely irrational. Euphemism, humor, and swearing are ways to refer to truths that are difficult to bring up in straightforward fashion. But the more serious truths are the more deeply hidden. Such as death, the final stop on everyone's trip. Full-on honesty and truth? No one wants that, or could live with it.


Many thinkers have plumbed these depths, from Machiavelli to Freud. Bernard Mandeville was one, profiled in a recent BBC podcast. His most enduring (and brief) work was the Fable of the Bees, which portrays a society much like Britain's, rife with greed, ambition, corruption, and crime. Due to moralist complaints, god decides to make this hive moral and good, upon which everything promptly goes to pot. The economy, previously held up by a love of luxury, collapses. Courts and lawyers have nothing to do, and clothing fashions fail to change. The traders leave the seas for lack of demand, and the military succumbs for lack of population. The hive ends up resembling one truly composed of bees, and goes off to live in a hollow tree, never to be heard from again.
"Those, that were in the Wrong, stood mute,
And dropt the patch'd vexatious Suit.
On which, since nothing less can thrive,
Than Lawyers in an honest Hive."
... 
"Do we not owe the Growth of Wine
To the dry, crooked, shabby Vine?
To the dry, crooked, shabby Vine?
Which, whilst its shutes neglected stood,
Choak'd other Plants, and ran to Wood;
But blest us with his Noble Fruit;
As soon as it was tied, and cut:
So Vice is beneficial found,
When it's by Justice lopt and bound;"

His point, naturally, was that vice is both natural and to some extent the underpinning of national greatness and economic vitality (given some beneficial management). Greed is good, as is irrational optimism and ambition. Mandeville was also a famous anticlericalist in his day, but that is another story. It was a classic contrarian point, that what we fight tooth and nail to vanquish or hide has, in reality, a role to play in the national character and success, for all its embarrassment. And that we routinely lie, to ourselves above all, to hide the truth of reality so that we can go on our way from one day to the next.
"My aim is to make Men penetrate into their Consciences, and be searching without Flattery into the true Motives of their Actions, learn to know themselves."
- Bernard Mandeville, in Free Thoughts on Religion, the Church, and National Happiness.

What is our most florid and communal lie, but religion? This is the salve of social togetherness, moral self righteousness, and imaginary immortality. It is the finely tuned instrument that addresses alike our private fears and social needs. And atheists know it is completely, utterly wrong! But what is the point of saying so? Religions have been corrupt, abusive, greedy and murderous from time immemorial- they have many faults. But untruth is not a flaw.. it is the reigning feature of this imaginative confection, providing the credulous a full belief system to support a positive and hopeful self-image, (not to mention conventional authority!), so important to happiness, providing the more skeptical an endless labyrinth of theological puzzles, while providing even the most skeptical or apathetic a social institution to call home.

So why go around ripping the clothes from believers, crying that their cherished narratives of meaning are senseless- that they should go forth theologically naked? It is a serious question for atheists, going to the heart of our project. For Freud, after all, repression had a positive function, and was not to be comprehensively cleared away, root and branch, only pruned judiciously. Lying is indeed integral to mature social functioning. Clearly, untruth is not, by itself, an unacceptable portion of the human condition. This implies that atheists need to be generally gentle in approach, and selective in what they address directly- the most significant outrages and injustices perpetrated by religions, of which there is no shortage. When religions invade the territory of science, making bone-headed proclamations about biology and geology, that clearly crosses such a line. And likewise when religions insinuate themselves into governmental institutions, bent on seeking power to foist their beliefs and neuroses on others.

The so-called arrogance of atheists consists of their opposing / exposing the cherished verities of others as false. Such arrogance is of course not unknown among religious believers and zealots either, and for much more modest cause. The secular state settlement of the West has forced religions to forego armed conflict and state violence in the pursuit of their truths and enemies. Atheists should take a page from this success to lead by example and humor, rather than frontal assault, even rhetorically.


  • BBC to continue spouting religion.
  • Silicon valley has its religion as well- a sort of Stockholm syndrome.
  • But lies in politics.. is there no limit?
  • Hate is in the textbooks, in Saudi Arabia.
  • Euro countries are not independent.
  • 5G to rule them all.
  • Heredity counts for a lot.. more than parenting.
  • The labor market could run much, much better.

Saturday, August 11, 2018

Was Jung an Atheist?

Short answer: yes. Understanding religion, and believing in it, are two different things.

Jung was highly sympathetic to religion- Christianity in particular- seeking to explain its psychology and origins, and even to replicate it. There is an old joke among Jungians. A child asks her Analyst parent.. are we Christians? And the parent answers: "heavens no, we are Jungians!" While Freud was a rather vociferous atheist, Jung took a much more ambiguous, understanding approach to religion. Rather than a pack of lies, it was a truth, just not about the cosmos. What makes Jungians distinct is their respect for the power and psychology of religion, which they are generally obsessed with, and devoted to understanding. They are more anthropologists of religion than disparagers.

It is common for god and religion in general to embody the psyche of its practitioners. Even atheists take god's name in vain, to express strong emotions. Intellectuals customarily make of god whatever most interests them. Einstein and Spinoza took god to be the universe. Jung took it to be the self. While religion touches on many archetypes and psychic complexes, the nexus around which it all revolves is the self. Am I saved? Will I live ever after? Am I good? Is anyone? What is the meaning of my life? Jung took these questions to be significant and deep, not just the superficial reflections of repressed sexuality. Indeed, his view of the unconscious was much more positive than Freud's, seeing it as a fount of deep insight and healing, whose therapeutic power is not just the exposure and extinguishing of childhood traumas and instinctive conflicts. The unconscious has its own perceptual apparatus and methods of communication (symbols, images), which can be seen as an autonomous entity within ourselves. I.e. god.

This is why symbology and ritual are so much more important in religion than is theology. All the Western attempts to rationalize the concept of god are so much wasted effort, not only because they are intellectually bankrupt due to the non-existence of the cosmic god they posit. They operate on a typically intellectual level that is totally inappropriate to the subject at hand.

An image painted by Jung, from his Red Book. The unconscious holds dark shadows as well as  compassion.

God is indeed real and an autonomous thing, at the same time it is a psychological construct, arising from our own selves and depths. The psychological concepts that Jung fostered, about an immense and fertile unconscious, which partakes not only of individual concerns, but of communal and cosmic ones, represents a significant and irreversible step in our understanding of religion and its panoply of symbols, motivations, gods, and other artistic paraphernalia.

Late in his career, Jung offered an interpretation of the evolution of Christianity, in "Answer to Job". God, as the manifestation of Israel's unconscious longings and strivings, is in the Pentateuch a thin-skinned, and fickle tyrant. He is immature, and when Job calls him to moral account for the Trumpian way he has toyed with his devoted subject, all god can do is blow up in an insulting twitter-esque rage. This exchange raises to consciousness the primitive nature of the god-concept in this culture, and rankles for several hundred years, at which point the solution becomes to make a better man of god by making him (notionally) into a real man. So, Pinocchio-like, he comes to Earth as Jesus, does good deeds, expresses some compassion, (though unimaginable ego seeps through in the commands for followership and claims of overlordship), and then ritually offers his self-sacrifice to assure us that he has really changed his ways and is now meek as a lamb.

Another self-explanatory image from the Red Book.

Obviously, this made a pretty modest impression on Jews at the time and since. But the combination of monotheism and a quasi-charitable, egalitarian form of god, leavened by Greek gnosticism and other intellectual additions, spread like wildfire through a West enervated by the relentless brutality of Roman civilization, and its fractured spiritual resources.

Many gods have come and gone, as cultures evolve and elaborate new images of themselves and their ideals. While Jung dabbled in some mysticism along the way, and was frustratingly ambiguous and unscientific in his writings on the subject, he laid what we can take as a very trenchant foundation for understanding religion as a psychological phenomenon. In this he followed the lead of William James, who recognized that it is a special area, so heavily subjective that philosophy has little hold. Like other freelance religious practitioners, Jungian analysts today split their time between writing books of uplift and psychological insight, and listening to clients bring up their difficulties, whether shallow or deep. They provide spiritual solace to the lost, while trying to heal the larger culture by bringing to consciousness the powers, compassion, and insight that lie within.

  • The planet is burning.
  • Workers, citizens, unite!
  • An emotion in every chord.
  • How China beat the recession- classic Keynes.
  • What makes unemployed farmers so much better than other unemployed people?
  • And why is the Labor party giving up on labor?
  • Resignation- an excellent precedent!
  • A difference between just desserts and business models.

Saturday, June 2, 2018

How Big is Your Working Memory?

Evanescent working memory may be defined by gamma brain waves, whose number is limited by the capacity of theta waves containing them.

Human working memory is sadly minuscule. We can keep only a handful of things in immediate mind at a time, like a new telephone number. How luxurious it is, in comparison, to program a computer with its gigabytes of RAM, which can be consulted instantaneously! Humans have lots of intermediate and long-term memory, which is accessible quite rapidly. But working memory is a special class, happening (as far as we know) without any neural cell alterations, rather purely on the electro-chemical level. A recent paper pursued the theory that working memory is mechanistically constituted by the encoding of gamma electrochemical rhythms within the theta rhythm cycle, a bit like AM radio carries sound amplitudes encoded in its carrier wave.

This theory (more generally reviewed here) implies that individual gamma waves mark different bits of content, which is somewhat difficult to understand at first. Neural oscillations have come to be seen as entraining selected networks across the brain, allowing attentive synchronization and binding of content from various anatomical regions. The waves do not carry the content; the anatomy does. But the network actuated by each separate peak of the gamma oscillation could be different, thus "carrying" different information, even though the wave itself is simply a timing and separation device. This mechanism of enclosing a set of distinct gamma patterns (typically running at roughly 40 Hz) within a theta wave (typically running at a much slower 5 Hz, but ranging from 3 to 8 Hz) is already understood in the case of place cell firing/encoding in the hippocampus, so it is not a stretch to think, as many seem to, that it is also responsible for working memory in many different subsystems of the brain. In that place cell system, the encoding is not only differentiated by gamma cycle, but time-compressed, such that a physical traversal of a space that takes a second may be encoded by gamma peaks only tens of milliseconds apart. So there is true encoding of information going on here.
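As a toy numerical illustration (my sketch, not anything from the paper): the nesting amounts to a fast gamma oscillation whose amplitude is modulated by the phase of a slower theta carrier, with each theta cycle enclosing a fixed number of gamma cycles that could each, in principle, index a distinct item.

```python
import numpy as np

# Toy synthesis of theta-gamma nesting (an illustrative signal, not real EEG):
# gamma amplitude is modulated by theta phase, so gamma bursts ride the theta
# peaks, as in phase-amplitude coupling.
fs = 1000                          # sampling rate, samples per second
t = np.arange(0, 1.0, 1 / fs)     # one second of "recording"
theta_hz, gamma_hz = 5, 40        # typical frequencies mentioned in the text

theta = np.sin(2 * np.pi * theta_hz * t)
envelope = (theta + 1) / 2                           # 0..1, tracks theta phase
gamma = envelope * np.sin(2 * np.pi * gamma_hz * t)  # theta-modulated gamma
signal = theta + 0.5 * gamma                         # composite trace

# Each theta cycle encloses gamma_hz / theta_hz complete gamma cycles, each of
# which could in principle tag a distinct network state ("item").
slots = gamma_hz // theta_hz
print(slots)   # 8
```

The ratio of the two frequencies, not the waveform itself, is what sets the nominal number of "slots" per theta cycle.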

Experimental protocol. Subjects were asked to memorize a pattern of colored dots for roughly a second. They were randomly directed to memorize the left-side set (experiment) or the right-side set (control). The number of dots they successfully recalled as staying the same or changing was the measured outcome.

A recent paper (review) sought to support this hypothesis by using transcranial alternating current stimulation (TACS) to entrain the theta rhythm in the visual cortex to a faster or slower pace than normal, while asking subjects to memorize visual features. TACS is a very interesting technique, different from the transcranial magnetic stimulation you may have heard of before: its AC currents are specifically designed to alter neural oscillations, not general activity. The authors found that slowing down the theta rhythm slightly increased memory capacity, consistent with the theory that the slower theta wave could fit in more gamma waves. Conversely, speeding up the theta rhythm significantly cut the subjects' memory capacity. Amazing! Control experiments that sent the TACS current over a more superficial path from the left visual field, or, even better, that asked for memorization in the right visual field rather than the left one that was being stimulated, showed no significant effect.

Theory of the experiment. If the theta rhythm is slowed down (left), more gamma waves, and thus distinct working memory engrams, could be enclosed within each theta wave, increasing effective memory capacity.

The authors mention that the difference in induced theta rates (between 4 and 7 Hz) would theoretically have allowed about two more or fewer gamma cycles to be enclosed, and thus two more items held in working memory at 4 Hz than at 7 Hz. The effect size was very small (about 0.8 items more or fewer were memorizable), but the experimental intervention was rather diffuse and blunt as well. This kind of work helps give specific shape to our models of what oscillations do- how they can organize information transfer and binding within the brain, while not themselves really carrying anything in their waveforms.
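The two-item figure can be checked with back-of-envelope arithmetic (a toy calculation of mine, not the authors' model): if capacity scales as some effective per-item rate g divided by the theta frequency, then gaining two items between 7 Hz and 4 Hz theta pins down g.

```python
# Back-of-envelope check of the two-item difference (toy arithmetic only).
# Assume capacity = g / theta for some effective per-item rate g (Hz).
# A gain of 2 items going from 7 Hz to 4 Hz theta implies:
#   g * (1/4 - 1/7) = 2
g = 2 / (1 / 4 - 1 / 7)
print(round(g, 1))   # 18.7

# Note that this implied per-item rate is well below the ~40 Hz gamma rate
# quoted above, hinting that each "item" may span several gamma cycles
# rather than exactly one.
```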

Data. Memory retention (in terms of items remembered, vs average) is plotted on the right vs the induced theta current. See the paper for controls.