By Craig Axford | Canada
Imagine you’ve just watched Star Wars or a Harry Potter movie with a friend. As you throw your empty popcorn bucket into the trash and head for the exit, your friend asks you if you believe the movie is true.
Perhaps book clubs are more your cup of tea. After reading The Da Vinci Code, which everyone in your group agrees was a real page-turner, your club discovers in its midst someone who thought they were reading a scholarly historical work and insists the code really exists.
To be fair, someone having the opposite reaction would be just as far off base, even if not as obviously so at first glance. Suppose that, instead of insisting the movie was a documentary of some sort, your companion had concluded the movie was false, citing as proof the fact that faster-than-light travel is impossible or that the artificial gravity enabling everyone to walk about the decks of the starship at 1G seems implausible. We might concede that they are technically correct yet still reasonably conclude that they had missed the story’s point.
We find such literal true or false dichotomies ridiculous when it comes to the arts. Even the sciences, properly understood, deal in probabilities rather than absolute certainty. Yet we have no difficulty making such absolute claims about our religious myths. These stories, we insist, must either be true or false.
. . .
This situation is largely the fault of those insisting their religion is factually true. In taking this position they often push even those with nuanced views on the subject into the opposite corner. When we insist it’s all or nothing, we can’t blame the opposition when we find the door to communication and compromise closed.
Having consistently had that door shut in their face, doubters and disbelievers are increasingly resorting to mockery and derision. It’s an understandable stance to take when you’re talking about people who insist dinosaurs walked the earth with humans and that two of every living thing could actually fit on a boat. In addition, after centuries’ worth of both physical and emotional abuse heaped upon doubters (as well as believers), constitutionally protected freedom of expression is a hard opportunity to pass up when it becomes available.
All that said, fundamentally what we’re dealing with here are stories, and the purpose of a good story isn’t to convince us of its historical or scientific accuracy. Its function is to draw us in and cause us to lose ourselves for a while as we experience its telling. A good story ideally leads us to suspend disbelief, which is a very different thing from either belief or disbelief.
Suspending disbelief is the act of setting the choice between truth and falsehood aside. In this state of mind, we are not evaluating what we are reading, hearing, or seeing to determine its compatibility with reality as such. We are not engaged in analytical thinking or looking to poke holes in the tale any more than we are unconditionally accepting it as factual. In a state of suspended disbelief, all such considerations disappear from consciousness while we “become lost” in the pages of a good book or “take a journey” with Frodo and Sam across Middle Earth from our seats in the theater.
It’s usually understood going into these experiences that we are leaving reality behind for a while. Sometimes storytellers will even explicitly invite us to suspend disbelief before the story has begun. Such signals to drop our guard, if done right, are readily followed. However, had George Lucas opened his first Star Wars movie with the words “Recently, in a solar system near us” instead of “A long time ago in a galaxy far, far away…” the implicit invitation to believe in even just the possibility of what was to follow would likely have ruined the whole story.
. . .
A society’s most powerful myths — the ones that ultimately shape and come to define its culture — are only superficially about the characters and events depicted in them. These are merely the vehicles for conveying deeper lessons. But anyone who has attended a religious service recently knows very well that the question ‘what did you get out of the story?’ rarely if ever comes up. Even in our very secular age, doubt, or even a willing suspension of disbelief, is still largely unwelcome at Friday, Saturday, or Sunday services.
Consider for a moment the historian Jennifer Michael Hecht’s description of the story of Job, a story with which even those raised in non-religious Western households are at least vaguely familiar. Was there really a man named Job? Does God really exist and did He really make a bet with Satan that facilitated Job’s suffering? From Hecht’s perspective, those kinds of questions are at best secondary:
There is something grand about a story that tries to reconcile human beings to loss, to letting go of the things that the universe has allowed us to amass and keep for a while — including the idea that after we lose everything, there is a good chance we’ll get it all back someday. Could the Job author have been satisfied with this as a parable of divine justice? It is not a parable of divine justice. It is a parable of resignation to a world-making force that has no justice as we understand justice. God comes off sounding like a metaphor for the universe: violent and chaotic yet bountiful and marvelous. The Job story is a story of doubt. God’s list brings Job back into the fold, but the fight has transformed the fold. With Job, that paradigm of a just God was carried to an extreme that immediately identified the problem with the idea: the world is not just. If justice exists, the Book of Job concludes, it does so in a way inconceivable to humanity. Job asked deep questions and they have lingered for millennia. ~ Jennifer Michael Hecht, Doubt: A History
Job is a proxy for everyone who has experienced a deep and powerful loss. Whether you’re an atheist, agnostic, unaffiliated but “spiritual”, or a regular churchgoer, the problem of suffering remains central to the human experience. Whether an individual who went by the name of Job and lived in a particular time and place ever actually existed is so far beside the point that one must conclude that anybody who insists upon it is, like a person fixated on the reality of Lucas’ far away galaxy, seriously out of touch with reality.
Wrestling with the issues raised by the story of Job, and others like it from a variety of traditions, requires a willingness to avoid making the literal truth or falsehood of the story the place where we take our stand. That leaves suspending disbelief as the only way we can get to the heart of the matter. Suspending disbelief allows us to maintain a healthy skepticism without allowing it to interfere with our experience of the story. We aren’t accepting the story on blind faith, but we aren’t dwelling on its lack of historic or scientific veracity either. We can acknowledge factual problems if circumstances demand it, then quickly find our way back to the message without lingering for too long with the irrelevancies.
There is a morality play going on here, not a history lesson. Whether intentionally or not, when a believer insists that we have a debate about whether dinosaurs actually walked the Garden of Eden with Adam and Eve or Jesus really did walk on water, they are making the story the end instead of the means. That is something our myths were never intended to be.
. . .
In all fairness, the development of writing shares much of the blame for our literalism. For most of human history, we lacked any means of confirming whether or not the stories being told around the campfire were the same from one telling to the next, let alone from one generation to another. There were no audio or video recordings available to make sure a storyteller was adhering to the original version, let alone anyone around to take notes.
Since memory was all people had to go on — an unreliable record-keeper under the best of circumstances — the best anyone concerned with fidelity could hope for was that any major changes made to the sacred tribal myths would be noticed by those who had heard them before. However, even assuming people wanted to catch them, minor additions and subtractions were impossible to consistently detect. This combination of small but intentional creative changes and unintentional memory lapses built up like mutations over time. Some went over like lead balloons with their audience and were quickly dropped while others were powerful and popular enough to become long-term features.
Storytelling, like evolution, is a process. In oral cultures, this was intuitively understood. The meaning and knowledge embedded within the story rather than the words themselves tended to take precedence. Comparing a modern society that has the ability to not only write, but also create a real-time audio and visual record of its existence down to the minutest detail, to an oral culture for whom stories are not merely a source of identity but a matter of survival is more like comparing apples to coconuts than it is apples to oranges.
Writing provided a mechanism for ensuring consistency unlike anything humanity had encountered before and it transformed how we approach both our myths and our physical environment in ways we never could have anticipated in advance. Of course, stories were still alterable, but as long as the original text or something very close to it survived new versions could be compared to the old and even subtle differences could be readily detected.
At that point, our sacred stories began to both literally and figuratively be seen as chiseled in stone and many of our traditions ceased to be living. Increasingly, the goal was to preserve them through a kind of textual mummification. It was in this context that the written word was sanctified and the story it recorded came to be seen as historical.
. . .
“Symbols are only the vehicle of communication; they must not be mistaken for the final term, the tenor, of their reference,” Joseph Campbell wrote in his classic work The Hero With A Thousand Faces. “No matter how attractive or impressive they may seem, they remain but convenient means, accommodated to the understanding.” Campbell concludes by reminding us that “Mistaking a vehicle for its tenor may lead to the spilling not only of valueless ink, but of valuable blood.”
We should take mythology seriously, but not too seriously. A decent level of respect rather than a reverential posture is what’s called for. Modern technology enables us to compare notes and police each other for consistency, but in the context of storytelling, there’s no opportunity for either fun or learning in that. The same technology also gives us an opportunity to play with our myths: to find humor and fresh interpretations that reveal themselves best through the use of contemporary language and references.
Consider Jonathan Goldstein’s reinterpretation of the story of Adam, Eve, and our loss of innocence in the Garden of Eden. Such a retelling is only possible when the storyteller sees the text as living rather than dead. It’s both humorous and evocative without demanding either belief or disbelief. It would be difficult for a listener to come away from Goldstein’s reimagining of the opening chapters of Genesis with a desire to storm the next local school board meeting demanding Intelligent Design be given equal time with evolution. Likewise, anyone insisting the story isn’t true after hearing Goldstein’s version would also be missing the mark by quite a wide margin.
The Abrahamic traditions, in particular, have consistently doubled down on belief, generally insisting that any who would darken the doorway of their institutions be willing to profess their faith in the word as it is written. Failure to do so often means ostracism, excommunication, or far worse.
But these religions don’t have many chips left to play. Nor has the modern world dealt the literalists in their midst a particularly strong hand. The best play at this point is to fold and acknowledge humanity’s myths are now, as they have always been, a means of fostering meaning and spreading wisdom rather than a mechanism for describing the physical universe or communicating historical events to future generations.
In the closing pages of Myths To Live By, Joseph Campbell said it best. As is so often the case when it comes to mythology, he deserves the final word:
The difficulty faced today by Christian thinkers in this regard follows from their doctrine of the Nazarene as the unique historical incarnation of God; and in Judaism, likewise, there is the no less troublesome doctrine of a universal God whose eye is on but one Chosen People of all in his created world. The fruit of such ethnocentric historicism is poor spiritual fare today; and the increasing difficulties of our clergies in attracting gourmets to their banquets should be evidence enough to make them realize that there must be something no longer palatable about the dishes they are serving. These were good enough for our fathers, in the tight little worlds of knowledge of their days, when each little civilization was a thing more or less to itself. But consider that picture of the planet Earth that was taken from the surface of the moon!