Author: Craig Axford

I'm a US citizen, but for most of the past decade I've been living in British Columbia, Canada. I have degrees in both anthropology and environmental studies from the University of Victoria. I am currently working on a master's degree in environmental management at Royal Roads University. I've been a Green Party candidate for the US Congress as well as an organizer for the DNC. I've also served as the program director for a local non-profit environmental group.

Making Room for Reflection in the Learning Process

By Craig Axford | Canada

Stillness is the sort of thing we don’t really appreciate until we have felt its absence for a while. The chance to sit and reflect upon our experiences is essential to integrating the learning we squeeze from them into our lives. That means occasionally putting our lives on pause, which is hardly a fashionable thing to do these days.

A few days ago, I completed the first of three residencies required by my master’s program. Each residency is three weeks in duration and, if the first one is any indication, they will all involve long days in the classroom followed by more hours of work on various projects stretching well into the evening.

This recent pedagogical sprint still leaves me rather dazed. Hundreds of PowerPoint slides, numerous spur-of-the-moment classroom readings, and one major high-pressure group assignment left little room for anything like reflection. Figuring out how, why, and when to incorporate the learning of the past 21 days will likely require much more time than the residency itself took from my life. Even then, much of the program’s content will at best be only dimly remembered.

Education, at least in the form typically preceded by the qualifier “public”, usually strives for efficiency. Getting the most information possible to the greatest number in the shortest time supposedly maximizes social benefit at minimum cost. The slow, contemplative pace the academy was once known for is now seen as an indicator of waste.

But personal downtime is as valuable as time spent behind a desk listening while the professor clicks through her slides. So is time spent discussing the concepts being presented and debating their merits with others. Learning is not a passive process; nor can it be rushed like a download over a high-speed Internet connection.

The word contemplation derives from the Latin templum, which translates as “a place for observation.” Temple likewise traces its roots to this Latin noun. By adding the prefix con to templum we literally have the phrase “with(in) a place for observation.” That’s a door no program, no matter how well designed, can force us to walk through. However, how our educational and corporate institutions operate can disincentivize making the effort.

Contemplation is not synonymous with the kind of instantaneous and often faulty observations we associate with witnessing a car accident or the rushed decision-making forced upon us by often arbitrary deadlines. A certain degree of intentionality is built into it.

I don’t blame the university, my professors, or the students I was studying with for the pace of my recent experience and the stress that it imposed. My program is designed for working adults, most of whom are either already working in the field they are studying or have some background in it. Few if any of my instructors or my classmates enjoy an abundance of spare time. However, that our lives have become so busy is all the more reason to put contemplation on the calendar.

We live in a culture that insists upon interrupting us at regular intervals. Making time for reflection and to play with the ideas we encounter is essential to getting the most from our experiences. That making room for contemplation requires more effort than it used to is no excuse for failing to do so.


The Lost Art of Suspending Disbelief

By Craig Axford | Canada

Imagine you’ve just watched Star Wars or a Harry Potter movie with a friend. As you throw your empty popcorn bucket into the trash and head for the exit, your friend asks you if you believe the movie is true.

Perhaps book clubs are more your cup of tea. After reading The Da Vinci Code, which everyone in your group agrees was a real page-turner, your club discovers in its midst someone who thought they were reading a scholarly historical work and insists the code really exists.

To be fair, someone having the opposite reaction would be just as far off base, even if perhaps not as obviously so at first glance. If, instead of insisting the movie was a documentary of some sort, your companion had concluded the movie was false, citing as proof the fact that faster-than-light travel is impossible or that the artificial gravity enabling everyone to walk about the decks of the starship at 1G seems implausible, we might find ourselves conceding that they are technically correct yet still reasonably conclude that they had missed the story’s point.

We find such literal true or false dichotomies ridiculous when it comes to the arts. Even the sciences, properly understood, deal in probabilities rather than absolute certainty. Yet we have no difficulty making such absolute claims about our religious myths. These stories, we insist, must either be true or false.

. . .

This situation is largely the fault of those insisting their religion is factually true. In taking this position they often push even those with nuanced views on the subject into the opposite corner. When we insist it’s all or nothing, we can’t blame the opposition when we find the door to communication and compromise closed.

Having consistently had that door shut in their face, doubters and disbelievers are increasingly resorting to mockery and derision. It’s an understandable stance to take when you’re talking about people who insist dinosaurs walked the earth with humans and that two of every living thing can actually fit on a boat. In addition, after the centuries’ worth of both physical and emotional abuse that has been heaped upon doubters (as well as believers), constitutionally protected freedom of expression is a hard opportunity to pass up when it becomes available.

All that said, fundamentally what we’re dealing with here are stories, and the purpose of a good story isn’t to convince us of its historical or scientific accuracy. Its function is to draw us in and cause us to lose ourselves for a while as we experience its telling. A good story ideally leads us to suspend disbelief, which is a very different thing from either belief or disbelief.

Suspending disbelief is the act of setting the choice between truth and falsehood aside. In this state of mind, we are not evaluating what we are reading, hearing, or seeing to determine its compatibility with reality as such. We are not engaged in analytical thinking or looking to poke holes in the tale any more than we are unconditionally accepting it as factual. In a state of suspended disbelief, all such considerations disappear from consciousness while we “become lost” in the pages of a good book or “take a journey” with Frodo and Sam across Middle Earth from our seats in the theater.

It’s usually understood going into these experiences that we are leaving reality behind for a while. Sometimes storytellers will explicitly invite us to suspend disbelief before the story has even begun. Such signals to drop our guard, if done right, are readily followed. However, had George Lucas opened his first Star Wars movie with the words “Recently, in a solar system near us” instead of “A long time ago in a galaxy far, far away…”, the implicit invitation to believe in even just the possibility of what was to follow would likely have ruined the whole story.

. . .

A society’s most powerful myths — the ones that ultimately shape and come to define its culture — are only superficially about the characters and events depicted in them. These are merely the vehicles for conveying deeper lessons. But anyone who has attended a religious service recently knows very well that the question ‘what did you get out of the story?’ rarely if ever comes up. Even in our very secular age, doubt, or even a willing suspension of disbelief, is still largely unwelcome at Friday, Saturday, or Sunday services.

Consider for a moment the historian Jennifer Michael Hecht’s description of the story of Job, a story with which even those raised in non-religious Western households are at least vaguely familiar. Was there really a man named Job? Does God really exist and did He really make a bet with Satan that facilitated Job’s suffering? From Hecht’s perspective, those kinds of questions are at best secondary:

There is something grand about a story that tries to reconcile human beings to loss, to letting go of the things that the universe has allowed us to amass and keep for a while — including the idea that after we lose everything, there is a good chance we’ll get it all back someday. Could the Job author have been satisfied with this as a parable of divine justice? It is not a parable of divine justice. It is a parable of resignation to a world-making force that has no justice as we understand justice. God comes off sounding like a metaphor for the universe: violent and chaotic yet bountiful and marvelous. The Job story is a story of doubt. God’s list brings Job back into the fold, but the fight has transformed the fold. With Job, that paradigm of a just God was carried to an extreme that immediately identified the problem with the idea: the world is not just. If justice exists, the Book of Job concludes, it does so in a way inconceivable to humanity. Job asked deep questions and they have lingered for millennia. ~ Jennifer Michael Hecht, Doubt: A History

Job is a proxy for everyone who has experienced a deep and powerful loss. Whether you’re an atheist, agnostic, unaffiliated but “spiritual”, or a regular churchgoer, the problem of suffering remains central to the human experience. Whether an individual who went by the name of Job and lived in a particular time and place ever actually existed is so far beside the point that one must conclude that anybody who insists upon it is, like a person fixated on the reality of Lucas’ faraway galaxy, seriously out of touch with reality.

Wrestling with the issues raised by the story of Job, and others like it from a variety of traditions, requires a willingness to avoid making the literal truth or falsehood of the story the place where we take our stand. That leaves suspending disbelief as the only way we can get to the heart of the matter. Suspending disbelief allows us to maintain a healthy skepticism without allowing it to interfere with our experience of the story. We aren’t accepting the story on blind faith, but we aren’t dwelling on its lack of historical or scientific veracity either. We can acknowledge factual problems if circumstances demand it, then quickly find our way back to the message without lingering too long over the irrelevancies.

There is a morality play going on here, not a history lesson. Whether intentionally or not, when a believer insists that we have a debate about whether dinosaurs actually walked the Garden of Eden with Adam and Eve or Jesus really did walk on water, they are making the story the end instead of the means. That is something our myths were never intended to be.

. . .

In all fairness, the development of writing shares much of the blame for our literalism. For most of human history, we lacked any means of confirming whether or not the stories being told around the campfire were the same from one telling to the next, let alone from one generation to another. There was no one around to take notes, let alone any audio or video recording to make sure a storyteller was adhering to the original version.

Since memory was all people had to go on — an unreliable record-keeper under the best of circumstances — the best anyone concerned with fidelity could hope for was that any major changes made to the sacred tribal myths would be noticed by those who had heard them before. However, even assuming people wanted to catch them, minor additions and subtractions were impossible to consistently detect. This combination of small but intentional creative changes and unintentional memory lapses built up like mutations over time. Some went over like lead balloons with their audience and were quickly dropped, while others were powerful and popular enough to become long-term features.

Storytelling, like evolution, is a process. In oral cultures, this was intuitively understood. The meaning and knowledge embedded within the story rather than the words themselves tended to take precedence. Comparing a modern society that has the ability to not only write, but also create a real-time audio and visual record of its existence down to the minutest detail, to an oral culture for whom stories are not merely a source of identity but a matter of survival is more like comparing apples to coconuts than it is apples to oranges.

Writing provided a mechanism for ensuring consistency unlike anything humanity had encountered before, and it transformed how we approach both our myths and our physical environment in ways we never could have anticipated. Of course, stories were still alterable, but as long as the original text or something very close to it survived, new versions could be compared to the old and even subtle differences could be readily detected.

At that point, our sacred stories began to both literally and figuratively be seen as chiseled in stone and many of our traditions ceased to be living. Increasingly, the goal was to preserve them through a kind of textual mummification. It was in this context that the written word was sanctified and the story it recorded came to be seen as historical.

. . .

“Symbols are only the vehicle of communication; they must not be mistaken for the final term, the tenor, of their reference,” Joseph Campbell wrote in his classic work The Hero With A Thousand Faces. “No matter how attractive or impressive they may seem, they remain but convenient means, accommodated to the understanding.” Campbell concludes by reminding us that “Mistaking a vehicle for its tenor may lead to the spilling not only of valueless ink, but of valuable blood.”

We should take mythology seriously, but not too seriously. A decent level of respect rather than a reverential posture is what’s called for. Modern technology enables us to compare notes and police each other for consistency, but in the context of storytelling, there’s no opportunity for either fun or learning in that. The same technology also gives us an opportunity to play with our myths: to find humor and fresh interpretations that reveal themselves best through the use of contemporary language and references.

Consider Jonathan Goldstein’s reinterpretation of the story of Adam, Eve, and our loss of innocence in the Garden of Eden. Such a retelling is only possible when the storyteller sees the text as living rather than dead. It’s both humorous and evocative without demanding either belief or disbelief. It would be difficult for a listener to come away from Goldstein’s reimagining of the opening chapters of Genesis with a desire to storm the next local school board meeting demanding Intelligent Design be given equal time with evolution. Likewise, anyone insisting the story isn’t true after hearing Goldstein’s version would also be missing the mark by quite a wide margin.

The Abrahamic traditions, in particular, have consistently doubled down on belief, generally insisting that any who would darken the doorway of their institutions be willing to profess their faith in the word as it is written. Failure to do so often means ostracism, excommunication, or far worse.

But these religions don’t have many chips left to play. Nor has the modern world dealt the literalists in their midst a particularly strong hand. The best play at this point is to fold and acknowledge humanity’s myths are now, as they have always been, a means of fostering meaning and spreading wisdom rather than a mechanism for describing the physical universe or communicating historical events to future generations.

In the closing pages of Myths To Live By, Joseph Campbell said it best. As is so often the case when it comes to mythology, he deserves the final word:

The difficulty faced today by Christian thinkers in this regard follows from their doctrine of the Nazarene as the unique historical incarnation of God; and in Judaism, likewise, there is the no less troublesome doctrine of a universal God whose eye is on but one Chosen People of all in his created world. The fruit of such ethnocentric historicism is poor spiritual fare today; and the increasing difficulties of our clergies in attracting gourmets to their banquets should be evidence enough to make them realize that there must be something no longer palatable about the dishes they are serving. These were good enough for our fathers, in the tight little worlds of knowledge of their days, when each little civilization was a thing more or less to itself. But consider that picture of the planet Earth that was taken from the surface of the moon!



A Healthy Dose of Collectivism is in Our Individual Interest

Craig Axford | Canada

During the summer of the 2012 presidential campaign, President Barack Obama asserted that America was the product of a lot of collective hard work.

If you were successful, somebody along the line gave you some help. There was a great teacher somewhere in your life. Somebody helped to create this unbelievable American system that we have that allowed you to thrive. Somebody invested in roads and bridges. If you’ve got a business, you didn’t build that. Somebody else made that happen. The Internet didn’t get invented on its own. Government research created the Internet so that all the companies could make money off the Internet.

The point is, is that when we succeed, we succeed because of our individual initiative but also because we do things together. There are some things, just like fighting fires, we don’t do on our own. I mean, imagine if everybody had their own fire service. That would be a hard way to organize fighting fires. ~ President Obama, Roanoke, Virginia, July 2012

This lack of deference to the American myth of the rugged individual able to overcome virtually any obstacle through a combination of hard work and perseverance was duly noted by Obama’s 2012 opponent, Mitt Romney. The key phrase from Obama’s speech used by the GOP in their counterattack was “you didn’t build that.” Romney replied personally, arguing that “What he’s [Obama] saying is that if someone has succeeded, if they built something, he’s saying they didn’t really build it — no, it was the government, it was the government that takes responsibility.”

Well, of course, government research is responsible for a great many of the things that Americans currently enjoy, but that wasn’t really Obama’s point. His reference to government involvement in the creation of the Internet aside, his larger argument was that without a lot of collective effort on the part of countless fellow citizens, all the things we like to think of as our own individual accomplishments wouldn’t be possible.

Barack Obama was hardly the first president to recognize that civilization is a group endeavor for which no single person can take credit. Our individual successes are made possible by these collective exertions. Given that Mitt Romney had made his fortune in finance rather than by swinging a sledgehammer or working on an assembly line, Abraham Lincoln might have reminded him that his wealth was largely the product of other people’s hard work:

Labor is prior to and independent of capital. Capital is only the fruit of labor, and could never have existed if labor had not first existed. Labor is the superior of capital, and deserves much the higher consideration. Capital has its rights, which are as worthy of protection as any other rights. Nor is it denied that there is, and probably always will be, a relation between labor and capital producing mutual benefits. The error is in assuming that the whole labor of community exists within that relation. ~ Abraham Lincoln, First Annual Message to Congress [December 3, 1861]

The controversy surrounding Obama’s 2012 comments is reminiscent of the earlier attacks leveled against Hillary Clinton for her 1996 book It Takes A Village. In a stunning example of judging a book by its title, conservatives attacked the then-First Lady for promoting the state as a substitute for parents and family.

Mrs. Clinton’s book title was derived from an African saying, “It takes a village to raise a child.” No doubt the former first lady thought it would be uncontroversial to suggest that communities, as well as families, play a critical role in a child’s well-being and happiness. She, like others before and since, underestimated American resistance to anything that challenges its cult of individuality.

A willingness to work hard is a valuable attribute for an individual to possess. Certainly, it increases their chances of reaching and maintaining at least some degree of comfort and status. But, as has been pointed out many times before, if hard work alone guaranteed success, every poor mother who has to haul her family’s daily water supply in buckets from distant wells or watering holes would be a millionaire. Both social and physical infrastructure matter, and nobody can build or maintain them alone. That’s as true in the so-called “advanced” nations of the planet as it is in the least developed.

Americans are fond of thinking of the poor as lazy. Unfortunately for that notion, the data show that most of them do work, and work hard, which represents another assault on the myth that it’s their work ethic that makes the successful worthy of their relative wealth and status. “Today, 41.7 million laborers — nearly a third of the American workforce — earn less than $12 an hour,” according to one recent New York Times article, “and almost none of their employers offer health insurance.”

So if it wasn’t all their hard work that made the rich and famous rich and famous, or even made the middle class middle class, what was it? Showing up, punching the clock, putting in a solid eight hours or maybe a little overtime is arguably necessary, but it has proven over and over again to be far from sufficient.

If we live anywhere in the developed world, we should count ourselves fortunate to have been born in, or to have successfully immigrated to, a country where hauling our drinking water in buckets is no longer required for our survival. For those of us born in such a nation, it’s pure luck that landed us in a place where we had access to things such as safe drinking water and a public education system. In the US, the only developed country without universal healthcare, being born into a family covered by health insurance provided us with another lucky break. To paraphrase Barack Obama, as children we certainly had nothing to do with building any of those advantages or the many more we inherited.

It should be obvious that being randomly selected by an indifferent universe to begin life in a place with well-paved roads, educational opportunities, healthcare, supermarkets and, more recently, the Internet, gives certain individuals a huge head start from day one. If you happen to have been driven home from the hospital by a chauffeur and had a trust fund established for you before your first birthday, your odds of success rise much further still. If anyone thinks, upon graduating from Harvard or Yale, that this sort of head start had nothing to do with their degree or the school they received it from, that is, to be frank, an indication that their Ivy League education was probably wasted on them.

Try for a moment to imagine sustaining any of these benefits of civilization without the labor of the “nearly a third of the American workforce” currently scraping by on $12 an hour or less. Many of them are in the healthcare industry. Others clean the homes of far wealthier families so both parents can work without either having to worry much about housecleaning or the other basic chores that are daily routines for their poorer neighbors on the other side of town. Hundreds of thousands if not millions of these workers are in the construction trade busy building the homes, schools, and roads of tomorrow. Yes, some are fast-food workers, of which a rapidly shrinking minority are teenagers, but why should that matter? Work is work and even the president is rather fond of KFC.

By now some readers will no doubt be mumbling furiously to themselves about all the effort doctors, lawyers, scientists, and business executives put into their education. Doesn’t this effort deserve to be rewarded? If you’re one of them, you aren’t wrong; you’re just not looking at the big picture.

Picture yourself earning a diploma or degree under the following conditions: To begin with, you must hunt and gather all the food it takes to sustain yourself as you complete your studies. Then there’s the small matter of making the clothes you must wear each day. Before you can even begin to think about studying for your finals you will need to build yourself a shelter, dig a latrine (or invent indoor plumbing complete with a steady water supply and a reliable sewage system), dig a well, build a kiln to fire all your plates and cups in, mine the minerals you’ll need for your utensils (to say nothing of your laptop, which you’ll need to build from scratch). Of course, you’ll also need to cut down a few trees and convert them into paper so you can hand copy all your textbooks (or perhaps you can barter a copy from some earlier student who somehow made it this far).

Hopefully, you get the picture. An awful lot of people make their living taking care of all the necessities for us so we can focus on pursuing our aspirations. Pride is the wrong response given this reality. We didn’t single-handedly overcome the challenges life throws in our way to get where we are today. None of us “pulled ourselves up by our bootstraps.” We all stand upon the shoulders of millions of others, both living and dead, who sacrificed to make it possible for us to pursue our dreams — dreams they believed would ultimately benefit them and their posterity as much as or more than they would benefit you or me. The proper attitude in this light is one of gratitude and humility.

When companies like Amazon and Walmart fail to pay the men and women that show up to work each day a living wage and when politicians stand before the cameras to defend their “right” to do so, they aren’t just being unjust; they’re actively undermining the very social contract that made our world possible. When people dismiss the working poor as lazy they aren’t merely wrong; they’re lying to themselves in order to justify their own privileged place in our social hierarchy.

Such narratives may be self-serving in the short term, but in the long run they’re deadly myths that can undo entire nations. In light of current events, one is forced to wonder just how many more Americans need to be driven into poverty before the ideology of the self-made man (or woman) is finally recognized for the civil-society-killing cancer that it is.


“The Once And Future Liberal”-How Does the Left Move Forward?

Craig Axford | United States

The upcoming 2018 midterms may be about to prove that Donald Trump has been good for the left and the Democratic Party, at least in the short term. However, he’s still a far cry from a cure for what ails it. His abusive style and bull-in-a-china-shop approach to governance have merely provided a shot of adrenaline to an institution that has been showing increasing signs of exhaustion for decades.

Trump has consistently given the appearance of an easy foil that, like the ancient Sirens, has the perpetual potential to lure America’s left onto the rocks. Adrenaline wears off quickly once we’re convinced the crisis has passed. Between the danger of Trump fatigue and the very real chance that the Democratic Party will once again decide to take a collective nap as soon as the current administration has been dealt with, midterm victories and success in 2020 could prove short-lived. While the left sleeps off the bad trip of the Trump era, we can be sure that other far more savvy demagogues will be busy working to seize upon America’s discontent to launch their own attempts to take power.

The essayist and Columbia University humanities professor Mark Lilla picked up his pen and wrote The Once and Future Liberal, a short but powerful antidote to the American left’s malaise. Unfortunately, despite his obvious understanding of the problem and of how we got here, Lilla makes for at best a mildly reassuring read.

While the efforts at organizing by those who commonly refer to themselves as “the resistance” have potential, Lilla warns us that these efforts need to lead us somewhere other than simply removing Trump from office and winning the elections of 2018 and 2020. “So it’s encouraging to see how quickly liberals have organized to resist Trump,” Lilla writes. “But resistance is by nature reactive, it is not forward-looking.”

Lilla does not dismiss or treat lightly the left’s post-1960s habit of ignoring down-ballot races. Its increasingly presidential focus all but ceded school boards, city councils, state legislatures, and even governorships to the Republican Party. By 2016 Democrats were in worse shape than at any time since the 1920s. Indeed, the Obama years were particularly bad ones for the Democratic Party, with losses far exceeding those experienced under any previous Democratic president.

Lilla isn’t the first to chastise Democrats for putting most of their eggs in the presidential basket and, unfortunately, he probably won’t be the last. We still occasionally hear commentators remind Democrats to pay attention more often than just once every four years, but oddly the party that supposedly believes most in government continues to treat local and state races as largely unimportant.

With regard to the vision question, there’s some movement around issues like universal health care. Senator Sanders has demonstrated that ideas like Medicare for all and a tuition-free education can generate a high enough turnout in at least some districts to win elections and enough energy to fill large arenas virtually anywhere.

But there’s still an elephant in the room by the name of identity politics and the left simply doesn’t know how to navigate around it without upsetting its fragile ego. Indeed, the left has spent decades nurturing that ego by fostering an environment in which debates are increasingly seen as synonymous with confrontation and more attention is paid to policing speech than to regulating corporations or reporting campaign donations.

Identity politics, according to Lilla, represents the brand of individualism the left adopted to counter the Reagan revolution’s own distinct identification with rugged ‘pull yourself up by your bootstraps’ libertarian individualism. America doesn’t have citizens so much as it has individuals, interests, and groups that identify themselves this way or that.

“The most important lesson is this,” Lilla tells us on the opening page of the third and final chapter of his short treatise, “that for two generations America has been without a political vision of its destiny. There is no conservative one, there is no liberal one. There are just two tired individualistic ideologies intrinsically incapable of discerning the common good and drawing the country together to secure it under present circumstances.”

Lilla isn’t wrong. The problem, as I see it, is that this describes America throughout most of its history. It has never shown much interest in abandoning this character flaw. It has always been a nation that preferred to see its people’s isolated dreams as a substitute for an overarching philosophy that saw the whole as greater than the sum of its parts.

The periods when it has enjoyed a “vision of its destiny” have been the exception rather than the rule. The only reason to think that America might be ready to enter one of these exceptional periods now is that it again finds itself in a crisis. It’s always been an emergency of fairly significant proportions that’s precipitated the emergence of such a shared vision in the past. This vision lingers for a while after the crisis has passed but it inevitably fades within a generation or so.

On the opening page of his book’s first chapter, Lilla himself recognizes this very American tendency by providing two quotes from two very different men separated by nearly two centuries:

I see an immense crowd of similar and equal men who spin restlessly around themselves, seeking vulgar little pleasures to fill their souls. Living apart, each is like a foreigner to the fate of others. His children and friends are for him the entire human race. As for his fellow citizens, he is next to them but does not see them, he touches them but does not feel them. He exists only in and for himself, alone. And though he may still have a family, he no longer has a country. ~ Alexis de Tocqueville

My ideal citizen is the self-employed, homeschooling, IRA-owning guy with a concealed-carry permit. Because that person doesn’t need the goddamn government for anything. ~ Grover Norquist

While Bill Clinton’s rhetoric is certainly imbued with greater empathy than Grover Norquist’s, his 1992 campaign nonetheless seemed designed to prove de Tocqueville’s point regarding America’s true character. Lincoln’s emancipatory vision or FDR’s commitment to fairness and economic justice were the sorts of things the country would only swallow after two years of civil war or 20-plus percent unemployment. Even then, Bill Clinton and his centrist fellow travelers warned Democrats that articulating grand ideas was risky at best in the post-Reagan era and that they would be wise to steer clear of them if they wanted to win elections.

Clinton won in 1992, but in 1994 the GOP took the House for the first time in four decades and the rest, as they say, is history. Democrats have been out of power in the House and Senate more often than not ever since. In spite of these mounting losses, however, they’ve generally just kept doubling down on Bill Clinton’s insistence on moderation. In lieu of a grand vision for the country, the “first black president” together with his fellow baby boomers ardently embraced identity politics and small initiatives that could be fairly quickly undone by the next Republican president.

Lilla’s suggestion for revitalizing the left is a radical departure from identity politics, though it is by no means a new or radical idea: bring back the concept of citizenship. Citizens are part of a community, whereas individuals are merely unbonded social atoms that keep bumping into one another, sometimes with great force.

The only adversary left is ourselves. And we have mastered the art of self-sabotage. At a time when we liberals need to speak in a way that convinces people from very different walks of life, in every part of the country, that they share a common destiny and need to stand together, our rhetoric encourages self-righteous narcissism. At a moment when political consciousness and strategizing need to be developed, we are expending our energies on symbolic dramas over identity. ~ Mark Lilla

Lilla doesn’t argue that the left should abandon the minorities that have struggled or are still struggling to gain access to everything from voting rights to the use of the bathroom but he does believe the left needs to reframe the way we discuss these problems. Equal treatment under the law is a human rights issue first and foremost. The word human is all-inclusive. Identity politics, on the other hand, demands equality by drawing attention to what we are that others are not, inviting potential allies to make some other concern their top priority on the grounds that they cannot possibly understand our own. No wonder Steve Bannon openly hopes the left will be stupid enough to continue meandering drunkenly down this divisive road.

Lilla is part of a small but (hopefully) growing group of liberal thinkers arguing that all anyone ultimately needs to understand is that the dignity and worth we all possess entitle each of us to equality under the law. This is not a difficult concept to grasp. It does not require a degree in gender studies or regular staff meetings to address our unconscious biases.

Neither the Rev. Martin Luther King Jr. nor the abolitionists 100 years before him described the problem in the narrow language of minority rights or the angry hopelessness of those who claim that people outside their cherished tribe simply can’t get it. King, like the suffragettes and abolitionists before him, was simply demanding that everyone be given an equal opportunity to sit at humanity’s table.

Lilla calls upon liberalism to return to a larger, more inclusive rhetoric that excludes no one: a liberalism that embraces diversity not because it has a list of interests and identity groups that need to be checked off but because it recognizes everyone’s humanity. True liberalism doesn’t care about the color of your skin, your gender, or your sexual orientation. Humanity and character are the only things that matter. Liberalism embraces Martin Luther King’s dream. Identity politics rejects it.

It remains to be seen whether Lilla and others like him will be heard. A small but vocal segment of the Democratic Party seems to enjoy spending its time getting mad at professors who don’t share its particular worldview or typing angry tweets about Google employees who wrote a memo most of them never bothered to read. None of this fosters dialogue and compassion, let alone brings America any closer to providing health care to all its citizens, eliminating the growing burden of student debt, reforming the justice system, or providing an income to a working class facing increasing pressure from automation. Such debates are as divisive in their own way as Trumpism is.

Mark Lilla’s book is worthy of the few hours it takes to read. His argument needs thoughtful consideration and debate within liberal circles everywhere. Unfortunately, it’s hard for this Democrat to ignore his personal experience of the past four decades. The signs that America’s left — a movement that is already centrist by contemporary Western democratic standards — will respond to the need to abandon identity politics in favor of the more inclusive and shared commitment that citizenship demands are tentative at best.


From The Information Age To The Era Of Intellectual Laziness

Craig Axford | Canada

In a 2013 column published in the Huffington Post entitled Why Public Schools Don’t Teach Critical Thinking, retired high-school teacher Frank Breslin lamented the state of modern American education:

The minds of children need room to breathe, to be inspired by vision, and the health-bringing balm of many perspectives. They need exercise, play, and relaxation; in short, they need a sound body and spirit to have a sound mind. Rather than spending their magical years entombed in cram-school dungeons that prepare them for impossibly difficult tests, children need old-fashioned schools where every day they can learn something new in classrooms that echo with laughter and joy!

Unfortunately, it’s government policy to make sure schools are anything but the kind of places Breslin envisioned for students. By emphasizing standardized testing that evaluates how many predetermined facts a student can memorize rather than their capacity to conduct research and pursue their own lines of inquiry, America has created a citizenry increasingly predisposed to simply accept whatever they read uncritically. Now it is paying dearly for following that path.

At a time when we often bemoan the inability of Democrats and Republicans to work together on much of anything, it’s worth remembering that the pursuit of standardized testing has been a thoroughly bipartisan undertaking; President George W. Bush’s No Child Left Behind (NCLB) legislation passed in 2001 with strong bipartisan support. In the speech he delivered before the students of Ohio’s Hamilton High School prior to signing the NCLB legislation, President Bush spoke of the importance of “accountability” and made it clear that a strong emphasis on testing was key to determining whether or not schools were meeting expectations:

The first way to solve a problem is to diagnose it. And so, what this bill says, it says every child can learn. And we want to know early, before it’s too late, whether or not a child has a problem in learning. I understand taking tests aren’t fun. Too bad. We need to know in America. We need to know whether or not children have got the basic education.

When President Obama took office, he initially doubled down on standardized testing. He too was, at least at first, clearly convinced that what was needed was a more objective measurement of a student’s knowledge. Though President Obama’s “Race To The Top” initiative did call upon states to “develop standards and assessments that don’t simply measure whether students can fill in a bubble on a test, but whether they possess 21st century skills like problem-solving and critical thinking, entrepreneurship and creativity,” it still placed an extremely strong emphasis upon standardization to ensure these goals were being achieved.

However, by 2015, President Obama was doing a mea culpa on standardized testing. He announced the amount of time spent in the classroom preparing for tests should be limited. In one of the more reflective moments of his presidency, Obama stated, “When I look back on the great teachers who shaped my life, what I remember isn’t the way they prepared me to take a standardized test.” He went on to admit that “too much testing, and from teachers who feel so much pressure to teach to a test that it takes the joy out of teaching and learning,” had caused more harm than good.

. . .

We live in a culture that places a high value on efficiency. Understandably, we want the next generation to have a firm grasp on certain basic skills that are essential to any real chance of success in the modern world. Reading, writing, and arithmetic — commonly referred to as “the 3 Rs” — are at the top of the list.

Unfortunately, the mastery of these skills doesn’t guarantee that a student has also learned how to put them to good use. While the United States has achieved a reasonably high literacy rate, increasingly people are using their ability to read and write to kill hours each day on social media rather than becoming informed citizens or otherwise enriching their lives.

According to a study just released by the American Psychological Association, the use of digital media by teens increased dramatically between 2006 and 2016. Jean M. Twenge, professor of psychology at San Diego State University and the lead author of the study, states that social media use during leisure hours doubled among high school seniors during that period. Among 10th graders, usage increased by 75%, while among 8th graders it increased by 68%.

“In the mid-2010s, the average American 12th-grader reported spending approximately two hours a day texting, just over two hours a day on the internet — which included gaming — and just under two hours a day on social media,” Twenge is quoted saying on the science website Science Daily. “That’s a total of about six hours per day on just three digital media activities during their leisure time.”

According to the same Science Daily story, the steep rise in digital media usage has been associated with an even more extreme drop in the use of print media. The article states:

The decline in reading print media was especially steep. In the early 1990s, 33 percent of 10th-graders said they read a newspaper almost every day. By 2016, that number was only 2 percent. In the late 1970s, 60 percent of 12th-graders said they read a book or magazine almost every day; by 2016, only 16 percent did. Twelfth-graders also reported reading two fewer books each year in 2016 compared with 1976, and approximately one-third did not read a book (including e-books) for pleasure in the year prior to the 2016 survey, nearly triple the number reported in the 1970s.

Perhaps these trends wouldn’t be nearly as disconcerting if the rise in digital media use and the associated decline in the use of printed material weren’t also coming at a time when so many members of the same generation were exhibiting such difficulty discerning between reliable news stories and “fake” news.

In a study coincidentally released just two weeks after the 2016 presidential election, Stanford University researchers reported students at all levels exhibited extremely poor skills when it came to conducting research and evaluating content online. According to the study’s executive summary, “Our ‘digital natives’ may be able to flit between Facebook and Twitter while simultaneously uploading a selfie to Instagram and texting a friend. But when it comes to evaluating information that flows through social media channels, they are easily duped.”

The Stanford study involved 7,804 subjects from middle school through university age. The sample comprised students from 12 states, including students from elite universities that rejected over 90% of their applicants and from public institutions with high acceptance rates. Students were given age-appropriate problems to evaluate and research online, including identifying reasons to doubt the accuracy of content, assessing evidence, and verifying various claims. The results were not encouraging.

The Stanford team’s assessment of middle schoolers found that “More than 80% of students believed that the native advertisement, identified by the words ‘sponsored content,’ was a real news story.” Among high school students shown a post entitled “Fukushima Nuclear Flowers” with a picture of what appear to be white daisies exhibiting what were alleged to be various “birth defects,” the students “ignored key details, such as the source of the photo. Less than 20% of students constructed ‘mastery’ responses, or responses that questioned the source of the post or the source of the photo. On the other hand, nearly 40% of students argued that the post provided strong evidence because it presented pictorial evidence about conditions near the power plant.” The caption gave no indication where the photo was actually taken.

University undergrads from three different universities were shown a tweet announcing “new polling” on NRA members’ views on background checks for potential gun purchasers. According to the Stanford study, “Results indicated that students struggled to evaluate tweets. Only a few students noted that the tweet was based on a poll conducted by a professional polling firm and explained why this would make the tweet a stronger source of information.” Only a third of the students paid any attention to the agendas of MoveOn.org or the Center for American Progress and how that might influence the content.

When it came to undergraduate students, researchers also noted “An interesting trend that emerged” from their tests. Over 50% of “students failed to click on the link provided within the tweet.” In addition, “Some of these students did not click on any links and simply scrolled up and down within the tweet.” Others tried to investigate, but searched using the CAP acronym for the Center for American Progress provided in the tweet. This type of search “did not produce useful information.”

. . .

The use of fake news to influence the election of 2016 reveals it isn’t just our young adults who lack the skills to detect and resist misinformation. Many of their parents and grandparents also lack the critical thinking and research skills necessary to place information in context and separate the wheat from the chaff. In many respects, the most troubling aspect of this problem isn’t our apparent gullibility but our ongoing refusal to do much if anything about it.

The focus on standardized testing is a symptom of an education system designed to teach students what to think rather than how to think. Memorization, not research skills and hands-on learning, became even more of a focus as successive governments drank the testing Kool-Aid. Time-consuming experiments and other projects were dropped to make room for lessons that drilled the right answers into students. Arts programs that fostered creativity and instilled an appreciation for culture were cut or eliminated altogether in the name of efficiency. Our schools became factories that mimicked the routinized schedules of the workplace while denying students the chance to ask questions, challenge the ideas being presented to them, and figure things out for themselves.

It doesn’t have to be this way. A recent episode of the BBC World Service podcast The Documentary highlighted work being done to determine the best approaches for instilling in children a basic grasp of what qualifies as evidence and the importance of understanding the basis of the claims they will inevitably hear from salesmen, politicians, and even family members over the course of their lives. The program, entitled You Can Handle The Truth, doesn’t just reveal how successful such efforts can be but how much delight children actually take in learning how to unmask poorly supported assertions and outright falsehoods.

The program’s host, the British statistician Sir David Spiegelhalter, traveled to Uganda to see the results of these efforts for himself. Researchers and educators in that country had been working with a Norwegian team on educational materials designed to teach elementary age students how to make more informed health choices.

The young Ugandan students were given a comic book that depicted individuals confronting a number of difficult choices. Among the most popular comic book characters is a parrot that, as parrots are known to do, repeats back everything it hears unquestioningly. Over the course of the school year, students discussed the various scenarios described within the book with their teachers and learned the importance of asking those making a claim what the basis for it was and how to better evaluate the answers they heard in response.

The Ugandan program involved 10,000 students from 120 schools. Sixty of the schools were placed in a control group. Students at these institutions received no additional instruction. In the remaining 60 schools, students participated in lessons and activities designed to provide them with basic critical thinking skills. At the end of the year, students from all 120 schools were tested and the differences between the control group and the test group assessed.

The results of that testing revealed the program had produced the desired effect, and in a big way. All students were given 24 problems to solve or evaluate. Thirteen right answers were considered a passing grade. The 24 questions presented to students were new and had not appeared as problems in the critical thinking curriculum itself.

In the control group, 27% of the students passed the test. In the intervention group, 69% received a passing score. Even teachers in the two groups were tested. Among the control group’s teachers, 87% passed, while the intervention group saw 98% of the teachers get a passing grade.

One of the problems the researchers anticipated but never encountered is one that will likely sound familiar to Americans: parents becoming upset as their children begin coming home from school with tough questions about cherished beliefs and cultural practices. Uganda is a country with a rich history of folk remedies and superstition. Researchers feared that having children go off to school in the morning happily accepting particular family or cultural traditions, only to return in the evening wondering about the basis for the claims surrounding grandma’s famous herbal remedy, could turn parents against their efforts.

However, Ugandan parents, at least so far, haven’t made a fuss. Their children are excited to be learning and take delight in being empowered to question their elders about things that have been taken for granted for generations. To everyone’s surprise, parents and other family members don’t seem to mind.

Sir David Spiegelhalter also took a trip to California for his BBC program. That state is currently considering legislation that would mandate media literacy education. Spiegelhalter paid a visit to one California classroom where students were asked to research various theories about who or what sank the battleship Maine in Havana’s harbor on February 15, 1898. The sinking of the Maine ultimately led the United States into a war with Spain.

American parents aren’t likely to get too upset if their children conclude an American battleship that sank over 100 years ago went down due to an accident instead of a Spanish mine, as was widely assumed at the time, but it’s hard to imagine many of them remaining silent when it comes to climate change, evolution, vaccines, or race relations. They haven’t so far. In just the past year, Mark Twain and Harper Lee were targeted by the school board in Duluth, Minnesota, because their books contained language that might make students feel “humiliated or marginalized.”

One of the appeals of the reading, writing, and arithmetic mantra is that learning these skills, at least in theory, doesn’t require teachers to raise too many questions or address contemporary controversies. Once a kid has the capacity to read, it’s just assumed they will figure it all out for themselves as an adult when and if they choose to. But learning to read is about more than just memorizing the alphabet and passing a spelling test. It’s about knowing how to think too.

The California media literacy bill failed on its first attempt in 2017, but it’s back again this year. If it passes, implementation will certainly be carefully watched to see what kind of impact it has on students being thrown into the sea of digital technologies we’ve created. Will they sink or swim? One thing is certain, however; it increasingly appears as though everything is riding upon their capacity to keep their heads above water.

In the August 27 issue of the New York Times, the author Thomas Chatterton Williams reviews two new books hitting the shelves: The Splintering of the American Mind and The Coddling of the American Mind. As the titles suggest, their authors rue the polarization, hypersensitivity, and inability to cope with controversy that now grip Americans right across the political spectrum.

But what got my attention wasn’t Williams’s assessment of these newly published works so much as the closing paragraph of his review. It was clearly more about us than about either of the books he had just shared his thoughts on. Williams concludes:

What both of these books make clear from a variety of angles is that if we are going to beat back the regressive populism, mendacity and hyperpolarization in which we are currently mired, we are going to need an educated citizenry fluent in a wise and universal liberalism. This liberalism will neither play down nor fetishize identity grievances, but look instead for a common and generous language to build on who we are more broadly, and to conceive more boldly what we might be able to accomplish in concert. Yet as the tenuousness of even our most noble and seemingly durable civil rights gains grows more apparent by the news cycle, we must also reckon with the possibility that a full healing may forever lie on the horizon. And so we will need citizens who are able to find ways to move on despite this, without letting their discomfort traumatize or consume them. If the American university is not the space to cultivate this strong and supple liberalism, then we are in deep and lasting trouble.

The anti-democratic forces that are currently so vocal in the United States would no doubt frame the kind of educational goals Williams identifies as some sort of conspiracy to destroy their movement, and they would be right. They will claim that any attempt to instill in children critical thinking skills and an understanding of the nation’s history, laws, and aspirations is biased because these efforts fail to treat their own anti-intellectual, unscientific, and undemocratic points of view as worthy of equal time. Again, they will be correct.

Freedom of speech means everyone gets to express themselves. However, it does not mean that every idea deserves equal press coverage, or even any press coverage at all. Thinking is hard work precisely because it requires us to critically evaluate the concepts we’re exposed to. That evaluation determines not only what is and isn’t worthy of our time and attention, but also which ideas have the potential to threaten or enrich our lives and those of our fellow citizens. There are sound methods for making these determinations that have proven themselves over and over again, but they can’t do us any good if we refuse to learn them.

