Current psychology is wedded to its history with much stronger ties than any other science. As a laboratory investigation, psychology is only a century old; as a body of insights, observations, and hypotheses, it is the oldest science in the world. -Julian Jaynes
Unlike Pt1 and Pt2, this article is not a response to a particular Less Wrong post, but instead aims to offer an antidote to the basic assault that such “reality heads” usually direct at myth and religion. In a previous post I showed how Sam Harris made a horse’s ass out of himself by trying to deride the notion of a “soul” while at the same time claiming that primitives and myth-makers simply had no clue how cosmically special and complex the brain was. Since the mind, though more complex than a galaxy, is not suitably special to qualify for a term like “soul,” the genome is typically the only thing a “reality head” will be willing to admit vis-à-vis his essence. Accordingly, I want to explore the pervasive mythological theme of personal responsibility endemic to reincarnation and metempsychosis, as well as to notions like “blood guilt” and “sins of the father.”
The skeptic or rationalist can hew down all of these silly stories simply by claiming that they are absurd if interpreted literally, word for word, but this presumes to know exactly what “soul” and “transmigration” refer to, were meant to refer to, and could refer to. That is, it assumes the words are meaningless and proceeds to invoke ridiculous imagery from Raiders of the Lost Ark, and so forth. These people get into a little more trouble when asked if the myths are true metaphorically, but this usually doesn’t delay them much from a cynical dismissal. However, with the discovery of epigenetics (see the documentary “The Ghost in Your Genes”), this cluster of myths now appears to be quite literally true to a certain extent. The way we behave and the environment we choose to occupy will change our epigenetics and subsequently the expression of our genes, but most importantly, the expression of our children’s genes (sometimes up to three generations down the line). That is, we are to a certain extent the stewards of our own genome, though we are only now scratching the surface of such transgenerational epigenetic inheritance phenomena. My claim here is that the above-mentioned myths might be a sort of opaque, primitive realization of this truth. It doesn’t really matter if the myths were after-the-fact rationalizations of a manifest drive, for this is far superior to their being delusions completely untethered from reality. When someone discards myth as such, they fail to see that such myths are precisely the phenomena that the mind sciences need to explain, and that they constitute some of the best evidence we have available for understanding the brain and its evolution.
There are plenty of inherited motives and instincts that help regulate the quality of a given gene pool: fitness markers, the incest taboo, infanticide of deformed infants, etc. However, these do not focus on one’s own genetic essence as the myths in question do. It doesn’t matter that bicameral man was unaware of the science of epigenetics or that his fanciful stories might correlate well with this future science: it is still reasonable to claim that these people were “aware” of a certain fact and that the various associated fables and myths, however bizarre-sounding, contain an important kernel of truth. Furthermore, if this inherited drive to preserve one’s genomic integrity exists, it could have provided the first step towards the creation of ideas like “Free Will” and other memes that fundamentally change the human operating system. Myths, poems, and metaphors are generative and are not meant to have a fixed meaning or operational definition.
The wonderful thing about poetry, religious or otherwise, is that the poet isn’t aware of the whole range of interpretations and meanings of his work; that the work doesn’t just call for interpretation, but demands multiple interpretations, perhaps ad infinitum. In fact, “reason” and “science” were born in this process of interpreting ambiguous, cryptic art, just as history was born from mythology. For science-minded rationalists to dismiss myth is quite silly, when you think about it, for their own coveted mental “software” evolved from these beta-versions, which accordingly could not fail to contain some truth. I must invoke the “No Miracles Argument” here if anyone disagrees with me on this point, for it would simply be miraculous for history, philosophy, law, and science to have emerged from total and utter nonsense produced by psychosis-addled artists, poets, and bards. No, it has always been art that has reached out into the unknown to find or create meaning, with rational tools playing a game of catch-up that has often proven Ockham’s Razor to be far too dull and slow a tool for the task. In epigenetics we see a case where Ockham’s Razor has caught up, providing a plausible interpretation of mythical themes involving responsibility for one’s soul and the souls of one’s children. Before epigenetics, these myths were usually interpreted as referring to the mind or personality as soul or essence.
The mass of myths dealing with reincarnation of an individual into one or another life form, its status dependent on how this person has lived his or her life, attests to the awareness in the experience of the race that the individual does have some responsibility for how he or she lives. Sartre’s argument that we invent ourselves by virtue of the multitude of our choices may be overstated, but its partial truth must nevertheless be admitted. -Rollo May
But epigenetics adds that we also partially invent or curate our own genetic inheritance and that of our children! However, there are two deeper layers to this general line of argument I want to explore:
1) The human brain has evolved in conjunction with culture, such that changes in culture, like the invention of new words and ideas, actually have an effect on the future evolution of the brain, which in turn affects which ideas are likely to run “natively” and which are likely to be invented easily, and so on, in a feedback loop. Therefore, choosing to follow ancient rituals and perpetuate old cultural practices will have an effect on the phenotypes of your tribe down the line. The genetic quality of your tribe will change depending on whether you endorse a mythology that celebrates inquiry versus one that celebrates blind obedience, or one that celebrates the truly human versus one that denigrates it.
2) As Julian Jaynes argued, what we call “consciousness” is the product of language, culture, and socialization, not strictly biology per se, meaning that the stories you decide to teach your children, and the fidelity with which your actions reflect those stories, will change what kind of consciousness they enjoy. Therefore, whether you identify the idea of a “soul” with your genetic essence or with your particular brand of self-awareness “software,” you simply must admit that modern science has begun to pour meaning back into the very myths that it largely set out to annihilate.
Though the scientific narrative is certainly “less wrong” than myth, I hasten to ask whether it is “more right.” This narrative says a whole lot less than myth and, while perhaps “less wrong,” does not support the cavalier disparagement that someone like Sam Harris heaps on myth, claiming that the history of religion is a “carnival of errors,” as if the history of science could not also be aptly described as such (see PMI). I’m sure the reader has heard of “physics envy” before, but I would like to coin a new term, “psychology envy,” to describe the jealousy that physicists and the like should have for the world’s oldest science, for unlike ideas in any other science, ideas in psychology have the unique trait of actually changing the phenomenon in question. Psychology (in this general sense) is far more powerful than physics, for it gave birth to physics and remains fertile still.
Inventing an idea like the indivisible atom does not change atomic structure or behavior, while inventing an idea like “wrestling with the gods” changes the software and thus the hardware of the human brain. An entire people (the Israelites) were changed by such an idea and proceeded to develop a culture of rational discourse, textual criticism, and thoughtful questioning. In fact, the very name “Israel” means “wrestles with god.” I think this idea has done more to promote “rationality” than any list of logical fallacies or cognitive biases ever has. One should wrestle with the gods, wrestle with his own biases and prejudices, instead of discarding or ignoring them in deference to the maxims of logic. That is a better formula for “rationality,” for it actually acknowledges and attempts to incorporate the “irrational.” But no, let’s just discard the story of Jacob because nobody has seen asexual, winged people with halos flying around trumpeting messages from god or wrestling with mortals? In this case Scientism fails to get the good stuff out of actual “science” (see the work of Jaynes, Persinger, Ramachandran, or the journalist John Geiger for an explanation of the “third man” phenomenon).
Julian Jaynes emphasizes that there is no common pattern of growth between the disparate sciences:
My point is that the history, philosophy, and sociology of one science should not be modeled on that of another, that there is no such thing as normal scientific progress, no one pattern of scientific activity, no one criterion of excellence though there may be of aesthetic satisfaction, that there is no one ‘scientific’ method, and no one way of scientific history.
This doesn’t stop the “reality head” from asserting the superiority of “science” over and above any other organization of knowledge, as if it were one “thing,” method, or process. Crucially, these people miss the fact that the history of psychology is uniquely relevant to the science of psychology, as well as to the history of every other science. Hell, 17th-century terms from physics are entirely anthropomorphic: originally ‘force’ referred to muscular strength, ‘inertia’ to an idle and unemployable person, and ‘acceleration’ to the hastening of one’s steps. Need I analyze the term ‘attraction’? I’ll repeat and expand the quote we started with:
Current psychology is wedded to its history with much stronger ties than any other science. As a laboratory investigation, psychology is only a century old; as a body of insights, observations, and hypotheses, it is the oldest science in the world. Moreover, its history is not a musty attic of intellectual bric-a-brac and mildewed curiosa, as are often found in the history of chemistry or neurology, for example. -Julian Jaynes
This should be earth-shattering to any of you rationalists out there, for it means that the mentality described in the “Iliad” is better evidence about the reality of the physical brain than the idea of the atom is evidence about the structure of the material world. The “Iliad” is both a model and a fact; a theory and evidence. “After all, what is a fine lie? Simply that which is its own evidence,” wrote Oscar Wilde. Bizarre as much of the “Iliad” is to our modern minds, it captures, describes, and teaches a certain form of mentality quite accurately; a mentality that actually existed and can exist again, for instance in a psychotic break (see Jonathan Shay). Furthermore, the growing body of myths and religious revelations quite literally altered the physical states of human brains, forcing them to run certain software and be prone to certain further religious (psychological) epiphanies, whereas the idea of the indivisible atom was just an idea bandied about for a couple of millennia, then discarded for a more accurate model once the idea could be sufficiently probed. I think psychology should be the queen of the sciences, for every idea ever spoken or written down is a fact or piece of data that we can use to better understand the human mind and brain, while this improved understanding will help illuminate the source and limitations of every idea or theory whatsoever.
Those addled with physics envy should remember that most physicists do not share their limited purview. Heisenberg once said that our mechanisms and technology make us “uncertain in the impulses of the spirit.” That’s right, spirit! He did not refer to it as arbitrary cultural/linguistic software running on a clumsy, intuition-addled meat-computer. Furthermore, is this not the raison d’être of Less Wrong: to make us “uncertain in the impulses of the spirit”?
Perhaps I am being unfair, but my fear about that particular blog is that it obsesses over physics, math, statistics, and basically everything but real psychology, yet purports to be telling us deep truths about our selves. It is meant to “promote rationality,” but fails to see that the irrational is not overcome by rationality; it must be fully expressed and integrated, ordered and curated, if it is to stop interfering with rational decision-making. But more importantly, the “irrational” is precisely what grounds the majority of our actual decisions in life! Unless you are taking a math test or have found yourself in a cognitive science laboratory, you are going to be trying to make rational choices about your irrational drives, motives, and dreams. “Reason” is there to order the Will and arbitrate disputes between the passions, not to replace them with infallible logical dictates or a Bayesian panacea.
One cannot protect the self and become stronger, smarter, and more emotionally stable simply by downloading the latest Baloney Detection Kit. This assumes a sort of mechanical/technological solution to spiritual troubles, while taking no responsibility for cutting the reader off from anything approaching the spiritual, a term the Less Wrongian perhaps reserves only for god-like machine intelligences. The most endearing feature of the blog in question is its unwritten agenda of “nerd rehabilitation,” but sadly, it all too often just strokes the nerd for his strengths and encourages him to ignore his weaknesses. The nerd does not need more technology, logic, rationality, or behavioral economics; he needs a martial arts class, a caring therapist, a musical instrument to play, and perhaps a creative writing class. (In fairness, Eliezer Yudkowsky seems to have the creative writing element down just fine.) It was no less a logician than Aristotle who said that educating the head but not the heart is no education at all! For my part, I am quite wary of obsessions with technology, especially when they are mixed with a thinly veiled longing for immortality.
This means that technology will be clung to, believed in, and depended on far beyond its legitimate sphere, since it also serves as a defense against our fears of irrational phenomenon. -Ernest Becker
Note added May 18: One should read Thomas Carlyle in order to thoroughly purge any cynical scientism from his soul. This paragraph is a good summary of my sentiments on the matter:
For Paganism, therefore, we have still to inquire, Whence came that scientific certainty, the parent of such a bewildered heap of allegories, errors, and confusions? How was it, what was it? Surely it were a foolish attempt to pretend ‘explaining,’ in this place, or in any place, such a phenomenon as that far-distant distracted cloudy imbroglio of Paganism,–more like a cloudfield than a distant continent of firm land and facts! It is no longer a reality, yet it was once. We ought to understand that this seeming cloudfield was once a reality; that not poetic allegory, least of all that dupery and deception, was the origin of it. Men, I say, never did believe idle songs, never risked their soul’s life on allegories; men in all times, especially in early earnest times, have had an instinct for detecting quacks, for detesting quacks. Let us try if, leaving out both the quack theory and the allegory one, and listening with affectionate attention to that far-off confused rumour of the Pagan ages, we cannot ascertain so much as this at least, That there was a kind of fact at the heart of them; that they too were not mendacious and distracted, but in their own poor way true and sane!