(Pareidolia + Paranoia)
Reflections on the novel Delusions of Clarity, and the perplexities faced by its psychologist protagonist. (April 2019)
Some years ago, a news story told of a freshly whitewashed church wall that had been streaked by rain, creating the image of a man’s face that some locals presumed to be Jesus. Further investigation proved that there was indeed the likeness of a long-haired, bearded man on the wall. The workers had painted over a Willie Nelson concert poster.
Of course, context matters. Had the image been seen on the side of a music hall instead of a church, it might have been identified more accurately.
In most cases, however, the perceiving of faces in ordinary things (pareidolia) arises from the imagination, as our minds are primed by evolution for facial recognition. If something approached you in the primeval darkness, you needed to quickly ascertain whether it was a bear or Uncle Grog, which might have been difficult, given that Uncle Grog was likely a big, hairy guy wearing an animal skin.
The specific propensity toward facial recognition is part of a larger predisposition to search for patterns and meaning in things in general. Some suggest that this tendency contributes to the creation of conspiracy theories, where people perceive malevolent intention behind otherwise random events, making connections that don’t exist.
That potential linkage matters greatly for a psychologist trying to excise a patient’s conspiracy theory. It is well established that once you perceive a face in something, you will continue to see it afterward. Take the clever logo of the Pittsburgh Zoo (www.pittsburghzoo.org). At first glance, you see only the tree. Upon closer inspection, you discern the faces of the gorilla and the lion. More importantly, you won’t fail to see the gorilla or lion thereafter. They remain indelible.
So we need to wonder, perhaps worry, whether a conspiracy theorist, having once perceived a malevolent face in the fabric of events, can be taught not to see it. Or, to put it simply, can a conspiracy theorist ever be cured?
Here we must distinguish between the personal and cultural variants. The personal type sees conspiracies everywhere. The cultural type spies them only on the other side of the tribal fence. But the two share a resistance to having their illusions dispelled. So the question of remedy applies to both. Is there any hope?
In the opening example, the subsequent cleaning of the poster provided unambiguous clarity. Seeing is believing, and that resolved the confusion. But the misperception originated from the converse. Believing is seeing. We see things not as they are, but as we are.
Preventing such errors is generally preferable to correcting them afterward. Suspending judgment is often well-advised when you have the luxury of time.
But when it’s really, really dark, and that thing is moving toward you quickly, you need to know now. Is it a bear or Uncle Grog?
So how shall we respond to such a quandary, one demanding immediate action in the absence of solid information? You can be a smart person but still discover to your dismay that wisdom flees from urgency.
The Flavors of Conspiracy
Today, conspiracy theories pop up more copiously than disturbed grasshoppers in late summer. And modern communications technology allows such theories to spread faster than ever. Lest we forget, this is not a natural occurrence such as a storm or locust plague, but a distinctly human phenomenon that depends on individual people deliberately deciding to pass along a specific theory, or, more likely, a specific tale.
What is their motivation? The inclination to believe and spread such accounts can vary based on the sort of person who does so. There are two broad types of conspiracy theorists: the individual type and the tribalist type.
The individual type is non-selective, embracing a wide variety of conspiracy theories, even when they are contradictory. Accepted theories may include political conspiracy theories as well as apolitical theories. The tendency arises from a core personality trait whose severity depends on the person’s position on the paranoid-thinking continuum, ranging from being mildly suspicious to seeing malevolent intention behind everything that happens. With this view, there are no random events, only bad guys up to no good. Or, more likely, big bad institutions beating up on the helpless little guy. Corporations, governments, and powerful secret societies are all villains and sometimes they collaborate on their villainy.
The tribalist type of conspiracy theorist, by contrast, is highly selective, believing only those theories that pertain to someone outside the ingroup, especially in a polarized circumstance where one group stands in opposition to another, as in partisan politics and the culture wars. Here the tendency to believe arises from ingroup thinking and affiliation. Its severity depends upon the degree of loyalty to the group and on the level of perceived threat to the group, which ranges from the exaggerated to the imaginary.
A hybrid third type combines elements of the previous two. This can be seen often in anti-Semitic tropes in which the fear of secret, all-powerful institutions (first type) blends with the fear of threatening outsiders (second type).
Do these distinctions matter? Perhaps, if one is proposing potential remedies. The individual type of conspiracy theorist can benefit from a course of therapy designed to reduce paranoid thinking, a course which may include certain prescription medications. The tribalist, ingroup-type conspiracy theorist can benefit from therapy that reduces fealty to ingroup thinking and lowers the perception of threat from outsiders. In milder cases, a course in critical thinking may suffice.
Unfortunately, the common thread among all types is that none of them perceives an ailment in need of remedy, so none seeks one. It’s the unresolvable paradox of misperception. You don’t seek corrective clarity when you believe you already possess it.
The situation is reminiscent of The Country of the Blind, a short story by H.G. Wells, in which a tribe of sightless people determines that the vision of a sighted stranger is a disability that must be corrected by removing his eyes.
On February 26, 2015, the Internet went totally insane over something trivial. (I know, hard to believe.) The dress. Was it blue and black? Or was it white and gold? The picture and subsequent debate went viral, being argued all over the world, spawning numerous analyses and scientific investigations. The Wikipedia article on it cites almost fifty separate sources. (en.wikipedia.org/wiki/The_dress)
The affair was unusual in that the dispute was serendipitous and rooted in the complexities of color perception and lighting. By contrast, the art world has long had fun deliberately creating works of visual ambiguity, such as the familiar rabbit-duck illusion. (en.wikipedia.org/wiki/Rabbit-duck_illusion)
Not surprisingly, the realm of storytelling also exhibits a fondness for ambiguity, both generally and of this specific form, where people observe the same thing but interpret it differently.
The best-known example can be found in the Akira Kurosawa film Rashomon, based on the short story In a Grove by Ryunosuke Akutagawa. The narrative revolves around characters who provide varying accounts of a rape and murder. Subsequent discussion of the tale’s concept inspired a term for it—the Rashomon effect.
So how do we determine the objective truth when our only witnesses are unreliable narrators? That problem presents two corollary questions. How do misperceptions arise? How can they be corrected? Observers finding parallels in contemporary politics and cultural conflicts might suggest that such discordances could be lessened if we were warier of our own cocksureness. Intelligence alone confers no immunity. In fact, it often bolsters our ability to defend our misperceptions.
Objective truth does exist. The dress, for example, was blue and black.
If you had revealed the details of your alien abduction back in the conformist Fifties, that admission might have earned you a stay in a psychiatric hospital and possibly a lobotomy. That same disclosure in the Seventies and onward might get you a book contract and possibly a talk show.
What happened in between? The Sixties.
The counter-culture assault on traditional mores and folkways was the Big Bang of nonconformity that loosened social standards for attire, behavior, speech, and thinking. For better and for worse, the Great Unfettering generated a broader acceptance of alternative ideas, some enlightening, and some bizarre. The latter included a wide range of paranormal beliefs and wild conspiracy theories.
Should we care? Does it matter what people believe so long as they remain law-abiding, self-supporting citizens?
When does a peculiar, improbable belief constitute a serious break with reality? When does eccentricity cross the line into genuine psychological disorder? Can that line be defined by objective standards that stand the test of time, or is “normal behavior” merely a protean cultural artifact that follows fashion?
The answers to those questions may be hazy, but one thing remains clear. After more than a half century, the Great Unfettering remains in full bloom and its effects will be lasting.
Rewriting Self-Told Stories
Much has been written about how stories provide coherence and identity to a tribe or culture, but less has been written about the importance of stories to a coherent sense of self. Every person is a self-employed troubadour with an appreciative audience of one. Our self-told stories affirm our attitudes and justify our actions. They provide a narrative arc that yields a sense of meaning and purpose, the idea that it is all headed somewhere.
A good psychologist knows that to properly understand someone, you must listen to their inner stories. And if you would presume to mend someone, you must rewrite some of their harmful stories and replace them with better, healthier ones. Sing a more melodious song. Be a more audible troubadour than the one they hold within. Some will listen, and others will resist, seeing it as an attack on their identity. You can’t win them all. Some stories are written in stone.
Hamlet vs. Alexander
Imagine a mystery novel featuring Alexander the Great as the investigator. What might that be like? His handling of the Gordian Knot suggests a certain impatience with untangling things, so we can assume the story would be brief. Very brief.
Young Alex was a man of action, not deliberation, which is sometimes good, and sometimes not. And those opposing concepts set up a dynamic tension in how we handle our affairs, both at the personal level and at the larger level of nations.
A rational person should be a person of deliberation, but not too much so. We don’t want to be a dithering Hamlet at crunch time, but we prefer to have complete information before taking significant action, lest a grave mistake be made. The problem here is that perfect information is rarely available.
Impulsive people are more of the Alexander mold, their sword hand ever-ready to solve knotty problems. They are primed to act because they understand that opportunities for decisive action are often fleeting, that one must seize the moment before it disappears.
So how do we determine when the threshold of knowledge has been met? How do we know when it’s time to let the arrow fly?
Action yields consequences. So does inaction.
The Dimness of Brightness
We don’t see things as they are, but as we are. Our perception is influenced by our emotions. If you’re smart, you already know that. And if you’re really smart, you assume that your awareness of the foible protects you from it. And that assumption of immunity likely makes you less vigilant and more susceptible, which is not smart.
It seems counterintuitive that being intelligent can make you unwise, but there are several good reasons for it. One is that your intelligence improves your ability to argue your intuitive position, not just to others, but to yourself. When you believe you have arrived at your position by reasoning, you don’t see it as intellectual arrogance. You see it simply as being correct. Besides, you’re a nice person. Affable and self-effacing. It’s just not possible for you to be imperiously dogmatic. You’re just plain right, and unassailably so.
Another problem is proximity. It is easier to dispense wisdom than to live wisely because there is some clarifying distance between you and the recipient of your sage advice. But when your face is shoved into your own mess, vision gets impaired. This recalls the familiar biblical caution that you should not presume to remove a speck from someone else’s eye when you fail to see the plank in your own.
But if you happen to be a psychologist, you must presume. Removing specks from the mind’s eye is part of the job description. It’s what you do. And if perfection were a prerequisite for being a therapist, there would be none. So solving someone else’s problems while overlooking your own is a constant occupational hazard.
But you can’t acknowledge that in front of your patient. To build trust, you must sound confident. Speak with authority. Entertain no doubts. But what if such doubts exist?
For example, a psychologist would be reflexively inclined to dispel a patient’s fears. But what if those fears had some basis in reality? Even paranoids have real enemies. How would you know without investigating?
To distinguish between fantasy and reality, you need to establish the objective truth of the matter, nail down the facts. But facts can be elusive and subject to interpretation.
Furthermore, if you step outside your domain to investigate, then you are no longer a spectator viewing it all from the scenic overlook. You’ve just jumped into the roiling cauldron, and your own actions are contributing to the boil. And just like that, there you are, down with the rest of them, vision obscured, seeing things not as they are, but as you are.