Anything Can Be Interpreted in Any Way
Inherent ambiguity, and the role of creativity in dealing with it.

I can’t remember a time when I didn’t want to know the Truth. I wanted to be someone who doesn’t stop at surface appearances and digs down to find out how things really are, even if it’s inconvenient or painful. But how did I know I was getting close to paydirt? A signal I often used was pain. Unconsciously, I leapt from “I want to know what’s right even if it’s inconvenient or painful” to “if it’s inconvenient or painful, it must be right.” I knew there would be some collateral damage with this approach, but at least I wasn’t going to be that one bozo who doesn’t see the Truth that is obvious to everyone else.

A few years ago, at work, I called a meeting to discuss a new software project, and I invited a talented junior developer even though I knew he was on call (meaning it was his week to handle operational problems, so he might be pulled away at any time). At one point in the meeting, I saw that he was engrossed in his laptop, and I worried that he was working a production issue and that the meeting was distracting him from it. So I interrupted the discussion and said to him, “You know, you don’t have to be here if you don’t want.” He closed his laptop, sat up straight, and looked ahead at the whiteboard with a serious expression on his face. It took me a moment to realize that he thought I had criticized his participation in the meeting. Time dilated while my mind raced through a simulation of possible responses, until I became overwhelmed and gave up on salvaging the interaction. I carried on with the meeting and found myself reflecting on it later.

That experience showed me what my pain-based heuristic must look like from the outside. The most obvious takeaway was that it was not just unfair but self-validating. Then it occurred to me that no amount of skill, tact, or benevolence could ever be enough to avoid being misunderstood or having unintended impact. This was a disillusionment, because it turns out the pain-based heuristic contained a hidden grandiosity: if anything that goes wrong is my fault, then I have the power to prevent things from going wrong, if only I can someday get good enough. It had now become clear that this plan would never work. Finally, I realized that on some level it wasn’t even meaningful to ask whether the developer had taken what I’d said the wrong way. What if an independent committee of reasonable people watched a video clip of the meeting and came to the same conclusion as the developer about what I meant? If there were a genuine Truth that he missed, is there any practical sense in which it would matter?


Evolution has been described as a “punctuated equilibrium” that unfolds in forward lurches. At a casual glance, species seem stable from generation to generation, but these quiet stretches are interleaved with phases of rapid adaptation, like an earthquake reshaping geography. As it happens, science itself seems to progress in this way. Thomas Kuhn blew the lid off of this in his mid-20th-century book, “The Structure of Scientific Revolutions” (which has become arguably the most cited philosophy book of all time). Before Kuhn, the prevailing view, championed by another 20th-century philosopher, Karl Popper, was that science advances through falsification: scientists make progress by attempting to disprove hypotheses. This view appeals to intuition because we can visualize it as stripping away false assumptions to reveal the gleaming Truth hidden beneath. But Kuhn noticed that when you look at scientists actually doing science, they are generally trying to prove hypotheses, not disprove them. They are trying to build, not strip away. Kuhn argued that science undergoes periods of relative stability and elaboration punctuated by episodes of rapid change in which a new framework, or paradigm, overtakes the old. And the change is usually contested by the established order; paradigms tend to fully shift, Kuhn wrote, only once the last holdouts have died.

The tidy, cheerful science textbooks we grew up with belie the chaos of their origins. Even today, the leading edge of science runs through a messy and fallible peer-review process, rife with retractions and reproducibility gaps. These themes of challenge and reversal appear in literature, too. George Orwell’s “Animal Farm” was rejected by at least four publishers before making it into print; “The Catcher in the Rye,” “Moby Dick,” and “The Great Gatsby” are just a few others I read in school that were initially rebuffed by the gatekeepers of culture.

It’s natural to believe that science (and literature, and evolution) is progressing along a linear path toward an ever more accurate or mature or fit condition. We may assume that the latest paradigm overcomes all previous errors and deficiencies. But Kuhn observed that every scientific paradigm suffers from anomalies — observations that cannot be explained within the framework. Growing intolerance for the anomalies of the current paradigm is part of what drives the shift to a new one, but the new paradigm necessarily has anomalies of its own, even if they are not yet fully understood. Moreover, competing paradigms are inherently incommensurable. No evidence can prove one paradigm is better than another, because each paradigm interprets all available evidence differently. You have to get inside a paradigm before you have access to rules and standards and even the possibility of discourse. Everyone, even the most experienced and influential among us, must operate within a paradigm. And any paradigm, no matter how dominant today, may eventually fall out of favor, just as any species may eventually go extinct.

If there really were a sense in which science edges ever closer to Truth, we would expect findings that remain constant from one paradigm to the next, with scientists engaged in refining that durable understanding. But that is not what we see. Instead, there is complete turnover, because every paradigm reinterprets all of the earlier findings. For example, Newton’s laws become a special case within the Einsteinian relativistic framework (Newton’s laws may remain useful in an engineering context, but they are no longer valid for making progress in science). Moreover, we would expect gradual transitions from one paradigm to the next, with new ideas extending the old, but instead we get conflict and revolution — not extension but replacement. It is hard to reconcile these observations with the notion that science is homing in on an inevitable, final result that we could call Truth.


Rules have a curious property. We run into inherent ambiguity both in formulating a rule from examples and in judging whether a new example fits an existing rule. Wittgenstein’s rule-following paradox asserts that any course of action can be made out to accord with any rule. I was explaining this to a friend who doubted it, so I asked him to state a rule. He said, “You should wash the dishes after dinner.” I replied, “OK, I wash the dishes the next day. It’s still ‘after.’ How long after dinner is too late to count as following the rule?” “Oh,” he said. Making a rule airtight would mean adding so many caveats that it would become impractical, eventually matching the level of detail of the world itself. Which means there is always some degree of freedom in interpreting a rule. Of course, this is why judges and juries and lawyers exist: to argue about what a “reasonable person” would do or assume in contested matters. But if there is ambiguity over whether an example matches a rule, it begins to look even more difficult to ever know the truth about nontrivial subjects such as what constitutes great literature.
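The first kind of ambiguity — inferring a rule from examples — can be made concrete with a small sketch in Python (my own illustration, not Wittgenstein’s): two rules that agree on every example seen so far and then come apart, so no finite set of examples can settle which rule was “really” being followed.

```python
# Two candidate "rules" for the sequence 2, 4, 6, 8, ...
def rule_a(n):
    # The obvious rule: the nth term is 2 * n.
    return 2 * n

def rule_b(n):
    # A deliberately contrived rule. The product (n-1)(n-2)(n-3)(n-4)
    # is zero for n = 1..4, so rule_b matches rule_a on every example
    # we have seen, then diverges from the fifth term onward.
    return 2 * n + (n - 1) * (n - 2) * (n - 3) * (n - 4)

examples = [1, 2, 3, 4]
print([rule_a(n) for n in examples])  # [2, 4, 6, 8]
print([rule_b(n) for n in examples])  # [2, 4, 6, 8] -- indistinguishable so far
print(rule_a(5), rule_b(5))           # 10 34 -- the rules come apart
```

Infinitely many such rules fit the same examples; the dishes argument is the everyday version of the same problem.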

Here is a thought exercise. The next time you are in a meeting where your organization’s Director or Vice President is reviewing a proposal, if it is getting a favorable response, imagine a parallel universe in which the meeting goes badly. Notice the weaker points in the paper that the leader could have fixated on if he or she chose. Or, if the paper is getting torn to shreds, find the positive aspects the leader could have praised, and flip the tone of the meeting to approval in your mind. You will really get this point once you can convincingly imagine any outcome of any meeting given the same participants and inputs. When I was naive in my early career, I believed that a paper would get a reception according to its merits, in all but the most pathological scenarios. With enough years of experience, I came to assume that a paper will get a reception reflecting the reviewer’s predisposition, in all but the most pathological scenarios. (This doesn’t mean that merit doesn’t matter; it just means merit is not enough.)

In psychology, there is a body of research on the subject of cognitive dissonance. When people are confronted with evidence that contradicts what they believe, they go to great lengths to contort the interpretation of that evidence to make it compatible with their existing beliefs or behaviors. From “Mistakes Were Made (but Not by Me)” by Carol Tavris and Elliot Aronson:

To test this observation, Elliot predicted that if people go through a great deal of pain, discomfort, effort, or embarrassment to get something, they will be happier with that “something” than if it came to them easily. For behaviorists, this was a preposterous prediction. Why would people like anything associated with pain? But for Elliot, the answer was obvious: self-justification. The cognition “I am a sensible, competent person” is dissonant with the cognition “I went through a painful procedure to achieve something” — say, join a group — “that turned out to be boring and worthless.” Therefore, a person would distort his or her perceptions of the group in a positive direction, trying to find good things about it and ignoring the downside.

[…]

And so Elliot and his colleague Judson Mills conducted [a controlled] experiment. Stanford students were invited to join a group that would be discussing the psychology of sex, but to qualify for admission, they first had to fulfill an entrance requirement. Some of the students were randomly assigned to a severely embarrassing initiation procedure: they had to recite, out loud to the experimenter, lurid, sexually explicit passages from Lady Chatterley’s Lover and other racy novels. (For conventional 1950s students, this was a painfully embarrassing thing to do.) Others were randomly assigned to a mildly embarrassing initiation procedure: reading aloud sexual words from the dictionary.

After the initiation, each of the students listened to an identical tape recording of a discussion allegedly being held by the group of people they had just joined. Actually, the audiotape was prepared in advance so that the discussion was as boring and worthless as it could be. The discussants talked haltingly, with long pauses, about the secondary sex characteristics of birds — changes in plumage during courtship, that sort of thing. The taped discussants hemmed and hawed, frequently interrupted one another, and left sentences unfinished.

Finally, the students rated the discussion on a number of dimensions. Those who had undergone only a mild initiation saw the discussion for what it was, worthless and dull, and they correctly rated the group members as being unappealing and boring. One guy on the tape, stammering and muttering, admitted that he hadn’t done the required reading on the courtship practices of some rare bird, and the mild-initiation listeners were annoyed by him. What an irresponsible idiot! He didn’t even do the basic reading! He let the group down! Who’d want to be in a group with him? But those who had gone through a severe initiation rated the discussion as interesting and exciting and the group members as attractive and sharp. They forgave the irresponsible idiot. His candor was refreshing! Who wouldn’t want to be in a group with such an honest guy? It was hard to believe that they were listening to the same tape recording. Such is the power of dissonance.

Not only does it seem possible to construe a piece of evidence to support any conclusion; the human propensity to do exactly this turns out to be well studied. The phenomenon has been observed in people justifying ownership of a five-million-dollar Stradivarius violin even when blind listening panels could not distinguish its sound from that of more modern instruments, justifying the decision to buy one house over another, justifying the decision to keep smoking despite evidence that it damages health, and so on.


The conclusion that’s hard to escape is: If there is such a thing as objective Truth, it seems inaccessible from any available route. Even the most respected authorities of our time can only offer us their opinions, and history has often overruled such experts in due course. People can, and will, disagree about even the most basic conclusions — and there is no objective remedy. Anything can be interpreted in practically any way. And choice of interpretation, in most cases, appears to be a matter of taste.

Epistemology has just taken a hard left turn and crashed into aesthetics. Truth was always Beauty wearing Groucho Marx glasses. Even in science, Occam’s Razor remains the flag marking the high ground, but what is “the simpler explanation is to be preferred” if not an aesthetic criterion? If paradigms are incommensurable, is the moment one overtakes another so different from a fashion trend?

The idea that we might be free to believe whatever we want seems to flirt with psychosis. But, for all we have the tools to know, maybe no one has ever truly been in touch with reality to begin with. Most of us are leery of “alternative facts” and “fake news,” but maybe, in some sense, there is no other kind.

The idea that we can just pick whatever interpretations we like sounds a bit like the premise of the entire self-help genre. Many of the classic inspirational speakers try to convince us to adopt empowering views in place of self-defeating ones. But a guru’s belief system, however appealing, doesn’t always take hold outside the lectures or books. That gap may be what keeps the follower coming back, and clinging to the guru’s world can start to look like living in a fantasy. The gap also shows that some agent of stability holds back the anarchy. Even though multiple different views on a subject can be held by sincere and competent observers, it’s difficult for one person to switch among them. We seem to have an immune system against potentially disruptive beliefs, so new interpretations that are not consistent with what we already believe are not really available to us. Even if we want to believe them, they don’t stick.

The agent of stability for our beliefs must be something akin to Kuhn’s idea of paradigm. It’s the lens or filter through which we interpret what we experience. This filter is made up of the ideas that we have taken as axiomatic, like religious and political convictions. For the most part, this filter is invisible, like water to fish who swim in it.

Perhaps the reason our filter is not easy to modify is that interpretation is not necessarily a conscious act we can override. Even an action as simple as reading text is a kind of interpretation. Apprehending written (and spoken) language is mostly involuntary — you can’t choose not to understand words in front of you in a language you know how to read (or at least, I can’t). It is possible to learn to read new languages, of course, but it takes training and practice. So thinking we can top-down our way into a preferred filter or interpretation may be like thinking we can will ourselves into reading an unfamiliar language. Such a task requires an investment of effort, just as a paradigm shift in a scientific community comes only after a period of crisis.

But even if it is difficult, changing our personal filter or paradigm must be possible, and maybe inevitable, just as our sense of taste can develop and mature.


One way to internalize these ideas is to conclude that there is no Truth. Another take is that there are as many Truths as there are observers. This is a framing that gives access to a potentially helpful insight.

My original heuristic — “if it’s inconvenient or painful, it must be right” — was motivated by an attempt at humility, to avoid attributing error to others. The trouble, it seems, was coupling this with the assumption that error must lie somewhere, and therefore the error must be mine. But perhaps it is unnecessary and counterproductive to assume that one of two conflicting views must be wrong. Those of us who prefer harmony and coherence (in other words, presumably, all of us) may like to believe that one day humanity will converge in agreement on all matters of consequence. But even if that someday happens, right now millions of people around the world operate in mental schemas that are all but irreconcilable with our own. These people get through their days successfully. Their sky does not fall just because it is a different color from ours. There’s no obvious sign that there will ever be a reckoning to round up all dissonant views and bring them into alignment.

Doing away with the idea of wrongness entirely might seem a bridge too far. But nothing about this model suggests we have to like or endorse all possible alternative views, or that we can’t or shouldn’t take action to protect ourselves from violence. Extreme cases aside, the vast majority of the time, differences of opinion are sincere and not directly threatening. They represent something more like votes for the most appealing or salutary interpretation out of the many available that fit our shared experiences.

[Image: “Someone is wrong on the internet”]

So maybe we are not obligated to find and fix what is wrong in the world, any more than scientists are primarily engaged in falsification, or any more than we have been charged with a duty to seek out and denounce ugly paintings. Our task, instead, might be to gently but patiently demonstrate what seems right.

