Without a doubt, police brutality is a serious issue that the country must grapple with. After all, in the United States, police kill 10 times as many people as mass shooters do; even the FBI grants that more than half of those killings are unjustified. It doesn’t stop at murder, either. With nearly a quarter of the world’s prison population despite having only 4% of the world’s overall population, the country has the highest incarceration rate on Earth, and its police officers have arrested countless people to fill those prisons. Many of these individuals have not committed a crime, and an astonishing 86% of prisoners committed no crime with a victim. Despite this, many will claim that the police who put these people behind bars are doing no wrong and should face no accountability. After all, “they’re just doing their jobs”.
By Craig Axford | Canada
Imagine you’ve just watched Star Wars or a Harry Potter movie with a friend. As you throw your empty popcorn bucket into the trash and head for the exit, your friend asks you if you believe the movie is true.
Perhaps book clubs are more your cup of tea. After reading The Da Vinci Code, which everyone in your group agrees was a real page-turner, your club discovers in its midst someone who thought they were reading a scholarly historical work and insists the code really exists.
To be fair, someone having the opposite reaction would be just as far off base, even if perhaps not as obviously so at first glance. If, instead of insisting the movie was a documentary of some sort, your companion had concluded the movie was false, citing as proof the fact that faster-than-light travel is impossible or that the artificial gravity enabling everyone to walk about the decks of the starship at 1G seems implausible, we might find ourselves conceding that they are technically correct yet still reasonably conclude that they had missed the story’s point.
We find such literal true or false dichotomies ridiculous when it comes to the arts. Even the sciences, properly understood, deal in probabilities rather than absolute certainty. Yet we have no difficulty making such absolute claims about our religious myths. These stories, we insist, must either be true or false.
. . .
This situation is largely the fault of those insisting their religion is factually true. In taking this position they often push even those with nuanced views on the subject into the opposite corner. When we insist it’s all or nothing, we can’t blame the opposition when we find the door to communication and compromise closed.
Having consistently had that door shut in their face, doubters and disbelievers are increasingly resorting to mockery and derision. It’s an understandable stance to take when you’re talking about people who insist dinosaurs walked the earth with humans and two of every living thing can actually fit on a boat. In addition, after the centuries’ worth of both physical and emotional abuse that has been heaped upon doubters (as well as believers), constitutionally protected freedom of expression is a hard opportunity to pass up when it becomes available.
All that said, fundamentally what we’re dealing with here are stories, and the purpose of a good story isn’t to convince us of its historical or scientific accuracy. Its function is to draw us in and cause us to lose ourselves for a while as we experience its telling. A good story ideally leads us to suspend disbelief, which is a very different thing from either belief or disbelief.
Suspending disbelief is the act of setting the choice between truth and falsehood aside. In this state of mind, we are not evaluating what we are reading, hearing, or seeing to determine its compatibility with reality as such. We are not engaged in analytical thinking or looking to poke holes in the tale any more than we are unconditionally accepting it as factual. In a state of suspended disbelief, all such considerations disappear from consciousness while we “become lost” in the pages of a good book or “take a journey” with Frodo and Sam across Middle Earth from our seats in the theater.
It’s usually understood going into these experiences that we are leaving reality behind for a while. Sometimes storytellers will even explicitly invite us to suspend disbelief before the story has even begun. Such signals to drop our guard, if done right, are readily followed. However, had George Lucas opened his first Star Wars movie with the words “Recently, in a solar system near us” instead of “A long time ago in a galaxy far, far away…” the implicit invitation to believe in even just the possibility of what was to follow would likely have ruined the whole story.
. . .
A society’s most powerful myths — the ones that ultimately shape and come to define its culture — are only superficially about the characters and events depicted in them. These are merely the vehicles for conveying deeper lessons. But anyone who has attended a religious service recently knows very well that the question ‘what did you get out of the story?’ rarely if ever comes up. Even in our very secular age, doubt, or even a willing suspension of disbelief, is still largely unwelcome at Friday, Saturday, or Sunday services.
Consider for a moment the historian Jennifer Michael Hecht’s description of the story of Job, a story with which even those raised in non-religious Western households are at least vaguely familiar. Was there really a man named Job? Does God really exist and did He really make a bet with Satan that facilitated Job’s suffering? From Hecht’s perspective, those kinds of questions are at best secondary:
There is something grand about a story that tries to reconcile human beings to loss, to letting go of the things that the universe has allowed us to amass and keep for a while — including the idea that after we lose everything, there is a good chance we’ll get it all back someday. Could the Job author have been satisfied with this as a parable of divine justice? It is not a parable of divine justice. It is a parable of resignation to a world-making force that has no justice as we understand justice. God comes off sounding like a metaphor for the universe: violent and chaotic yet bountiful and marvelous. The Job story is a story of doubt. God’s list brings Job back into the fold, but the fight has transformed the fold. With Job, that paradigm of a just God was carried to an extreme that immediately identified the problem with the idea: the world is not just. If justice exists, the Book of Job concludes, it does so in a way inconceivable to humanity. Job asked deep questions and they have lingered for millennia. ~ Jennifer Michael Hecht, Doubt: A History
Job is a proxy for everyone who has experienced a deep and powerful loss. Whether you’re an atheist, agnostic, unaffiliated but “spiritual”, or a regular churchgoer, the problem of suffering remains central to the human experience. Whether an individual who went by the name of Job and lived in a particular time and place ever actually existed is so far beside the point that one must conclude that anybody who insists upon it is, like a person fixated on the reality of Lucas’ far away galaxy, seriously out of touch with reality.
Wrestling with the issues raised by the story of Job, and others like it from a variety of traditions, requires a willingness to avoid making the literal truth or falsehood of the story the place where we take our stand. That leaves suspending disbelief as the only way we can get to the heart of the matter. Suspending disbelief allows us to maintain a healthy skepticism without allowing it to interfere with our experience of the story. We aren’t accepting the story on blind faith, but we aren’t dwelling on its lack of historic or scientific veracity either. We can acknowledge factual problems if circumstances demand it, then quickly find our way back to the message without lingering for too long with the irrelevancies.
There is a morality play going on here, not a history lesson. Whether intentionally or not, when a believer insists that we have a debate about whether dinosaurs actually walked the Garden of Eden with Adam and Eve or Jesus really did walk on water, they are making the story the end instead of the means. That is something our myths were never intended to be.
. . .
In all fairness, the development of writing shares much of the blame for our literalism. For most of human history, we lacked any means of confirming whether or not the stories being told around the campfire were the same from one telling to the next, let alone from one generation to another. There were no audio or video recordings available to make sure a storyteller was adhering to the original version, let alone anyone around to take notes.
Since memory was all people had to go on — an unreliable record-keeper under the best of circumstances — the best anyone concerned with fidelity could hope for was that any major changes made to the sacred tribal myths would be noticed by those who had heard them before. However, even assuming people wanted to catch them, minor additions and subtractions were impossible to consistently detect. This combination of small but intentional creative changes and unintentional memory lapses built up like mutations over time. Some went over like lead balloons with their audience and were quickly dropped while others were powerful and popular enough to become long-term features.
Storytelling, like evolution, is a process. In oral cultures, this was intuitively understood. The meaning and knowledge embedded within the story rather than the words themselves tended to take precedence. Comparing a modern society that has the ability to not only write, but also create a real-time audio and visual record of its existence down to the minutest detail, to an oral culture for whom stories are not merely a source of identity but a matter of survival is more like comparing apples to coconuts than it is apples to oranges.
Writing provided a mechanism for ensuring consistency unlike anything humanity had encountered before, and it transformed how we approach both our myths and our physical environment in ways we never could have anticipated in advance. Of course, stories were still alterable, but as long as the original text or something very close to it survived, new versions could be compared to the old and even subtle differences could be readily detected.
At that point, our sacred stories began to both literally and figuratively be seen as chiseled in stone and many of our traditions ceased to be living. Increasingly, the goal was to preserve them through a kind of textual mummification. It was in this context that the written word was sanctified and the story it recorded came to be seen as historical.
. . .
“Symbols are only the vehicle of communication; they must not be mistaken for the final term, the tenor, of their reference,” Joseph Campbell wrote in his classic work The Hero With A Thousand Faces. “No matter how attractive or impressive they may seem, they remain but convenient means, accommodated to the understanding.” Campbell concludes by reminding us that “Mistaking a vehicle for its tenor may lead to the spilling not only of valueless ink, but of valuable blood.”
We should take mythology seriously, but not too seriously. A decent level of respect rather than a reverential posture is what’s called for. Modern technology enables us to compare notes and police each other for consistency, but in the context of storytelling, there’s no opportunity for either fun or learning in that. The same technology also gives us an opportunity to play with our myths: to find humor and fresh interpretations that reveal themselves best through the use of contemporary language and references.
Consider Jonathan Goldstein’s reinterpretation of the story of Adam, Eve, and our loss of innocence in the Garden of Eden. Such a retelling is only possible when the storyteller sees the text as living rather than dead. It’s both humorous and evocative without demanding either belief or disbelief. It would be difficult for a listener to come away from Goldstein’s reimagining of the opening chapters of Genesis with a desire to storm the next local school board meeting demanding Intelligent Design be given equal time with evolution. Likewise, anyone insisting the story isn’t true after hearing Goldstein’s version would also be missing the mark by quite a wide margin.
The Abrahamic traditions, in particular, have consistently doubled down on belief, generally insisting that any who would darken the doorway of their institutions be willing to profess their faith in the word as it is written. Failure to do so often means ostracism, excommunication, or far worse.
But these religions don’t have many chips left to play. Nor has the modern world dealt the literalists in their midst a particularly strong hand. The best play at this point is to fold and acknowledge humanity’s myths are now, as they have always been, a means of fostering meaning and spreading wisdom rather than a mechanism for describing the physical universe or communicating historical events to future generations.
In the closing pages of Myths To Live By, Joseph Campbell said it best. As is so often the case when it comes to mythology, he deserves the final word:
The difficulty faced today by Christian thinkers in this regard follows from their doctrine of the Nazarene as the unique historical incarnation of God; and in Judaism, likewise, there is the no less troublesome doctrine of a universal God whose eye is on but one Chosen People of all in his created world. The fruit of such ethnocentric historicism is poor spiritual fare today; and the increasing difficulties of our clergies in attracting gourmets to their banquets should be evidence enough to make them realize that there must be something no longer palatable about the dishes they are serving. These were good enough for our fathers, in the tight little worlds of knowledge of their days, when each little civilization was a thing more or less to itself. But consider that picture of the planet Earth that was taken from the surface of the moon!
By Adam Burdzy | United States
“Trade school? Is that, like, a place where you learn how to trade things?”
Every parent thinks that his or her child is smart in their own special way. This leads to Mom and Dad pushing Johnny to go to a nice university, where he will learn how to think, how to be a productive citizen, and most importantly, how to be a lawyer, doctor or even a professor.
Johnny decides to apply to a school that he seems to like, and is lucky enough to get accepted. Little do his parents know, Johnny doesn’t have what it takes to become a doctor, he can’t argue his way to becoming a lawyer, and he has no people skills, so there is no way that he will become a professor. But he tries anyway, and in his third year, he drops out. Not only did he waste three years of his life, but now he is in massive debt. After realizing he isn’t good enough for these kinds of careers, Johnny becomes depressed and has to work two jobs, the McDonald’s morning shift and the Taco Bell night shift, just to make enough money to get by.
This is something that occurs too often. Only about half of the people who enroll in a college or university will graduate. Johnny should have gone to trade school. Trade school is the university for people who aren’t suited to the career choices offered by colleges. With the rise of labor jobs in the United States, we need more people to fill these shoes. These people are the backbone of society. Sure, some of the jobs offered may not seem that pleasant. In the end, however, the benefits of attending a trade school outweigh those of attending a college.
To graduate from a trade school, you need about two years of schooling. To receive a bachelor’s degree, you need four years of schooling. And to make matters worse, many students will then apply to another school to get their master’s degree, which takes another two years. Most people with a bachelor’s degree won’t be able to find a job as easily as a person with a master’s degree. So you just spent six years of your life, compared to two.
Not only do you save time in a trade school, but you also save money. Lots of money. To graduate with a bachelor’s degree, you will spend anywhere from $100,000 to $200,000 over those four years. Most of this money will come from student loans or from outside scholarships. Trade school, on the other hand, costs about $33,000 for two years, roughly the average amount that students pay for a single year at a university.
At a trade school, you get more opportunities to become whatever you want, because you focus on one specific career. These career options are vast and can vary from construction work to being a commercial pilot. The wide array of choices, and the fact that you focus on one specific skill, allows you to dedicate your time to becoming what you want, and you don’t need to take all those classes that universities require even when they don’t relate to your major or minor. If you are pursuing a biology major, there is no reason you should be required to take an introduction to 1700s literature, unless you really want to.
Before you go to college or send your kids to college, take into account the benefits of attending a trade school. I am not against universities, and if the job you want requires a masters degree, then, by all means, go to college. However, if you are undecided like many college students are, then visit a trade school, find out about their work programs, and inform yourself before you make a decision that could cost you hundreds of thousands of dollars.
Get awesome merchandise. Help 71 Republic end the media oligarchy. Donate today to our Patreon, which you can find here. Thank you very much for your support!
According to NPR, only 55% of families receiving SNAP (Supplemental Nutrition Assistance Program) benefits have at least one family member who is working. In fifteen U.S. states, the other 45% of recipients may be making the economically rational choice by not working.
According to a 2013 Cato Institute study, there are 15 states in which welfare recipients actually make more than full-time minimum wage workers. This study was conducted using a single mother with two children as its example welfare package. The welfare recipients also make more than a starting-wage secretary in 39 states and more than a first-year teacher in 11 states.
According to the study, welfare recipients will make double what a 40-hour-work-week minimum wage worker will make in most of these states.
The states (in order of smallest pay discrepancy to largest) include Pennsylvania, North Dakota, Nevada, California, Maryland, Vermont, Wyoming, Rhode Island, Connecticut, New York, New Hampshire, Washington D.C., New Jersey, Massachusetts, and Hawaii. Data used for the study has been updated to today’s numbers. Alabama, Louisiana, and South Carolina were not part of the study as they did not (and still do not) have a state-mandated minimum wage. Tennessee and Mississippi have joined the list as they no longer have a state-mandated minimum wage.
The top ten of the previously mentioned 15 states have welfare recipients making $10 an “hour” more than their working counterparts. The hourly “wage” is determined by taking total weekly welfare benefits and dividing them by forty hours.
The top eight states have their welfare recipients making over $20 an hour, or over $43,330 in pre-tax wage equivalent.
Perhaps one of the most interesting “states” is Washington D.C. Of the 15 on this list, it had the highest minimum wage at $12.50 an hour. Its welfare recipients make $11.93 an “hour” more than full-time minimum wage workers.
Topping the list, though, is Hawaii. A single mother with two kids will make $19.88 an “hour” more than a minimum wage worker in the state. Hawaii’s 2017 minimum wage was $9.25 an hour, which means its welfare recipients made $29.13 an “hour”. This is about the average wage of an electrical engineering technician. It is very much worth noting that Hawaii has the second highest state personal income tax in the country at eleven percent.
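The hourly figures above reduce to simple arithmetic. Here is a minimal sketch of the conversion the study describes (weekly benefits divided by a 40-hour week), using Hawaii’s published numbers; the 52-week year used for annualizing is an assumption on my part, not a figure from the study:

```python
HOURS_PER_WEEK = 40
WEEKS_PER_YEAR = 52  # assumed for annualizing; not stated in the study

def hourly_equivalent(weekly_benefits: float) -> float:
    """Convert total weekly welfare benefits into the study's hourly 'wage'."""
    return weekly_benefits / HOURS_PER_WEEK

# Hawaii, per the figures above: $9.25 minimum wage plus a $19.88/hour gap
hawaii_hourly = 9.25 + 19.88                       # matches the article's $29.13
annual = hawaii_hourly * HOURS_PER_WEEK * WEEKS_PER_YEAR

print(f"${hawaii_hourly:.2f}/hour, about ${annual:,.0f}/year")
```

Annualized, $29.13 an “hour” comes out to roughly $60,590, which is where the “$60,000 a year” figure below comes from.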
This study is an obvious example of why welfare benefits need to be lowered. What motivation is there to get a job when you can make $60,000 a year in welfare benefits in Hawaii?