The libertarian tradition has been slowly but steadily growing in the United States since the 1970s. From Rothbard to Gary Johnson and from Ron Paul to John McAfee, the movement has been kept alive. Yet obviously, the libertarian social order doesn't yet exist. The theoretical foundation is already here: libertarians know, broadly speaking, what they want. The pragmatics of libertarianism, though, are still in their infancy. Chomsky seems to think this is because libertarians believe in a senseless utopia.
“This illustration depicts NASA’s exoplanet hunter, the Kepler space telescope. The agency announced on Oct. 30, 2018, that Kepler has run out of fuel and is being retired within its current and safe orbit, away from Earth. Kepler leaves a legacy of more than 2,600 exoplanet discoveries.” Credits: NASA/Wendy Stenzel.
Craig Axford | United States
We live in an age of discovery far beyond any other our species has experienced so far, yet we hardly seem to notice. We live in an era of staggering loss, but we seem paralyzed by the immensity of the problem. Had Charles Dickens foreseen the early 21st century, he might very well have reconsidered his opening line in A Tale of Two Cities.
Over this past week, two news stories drove home the point that we’re living in an extraordinary time. The first broke on October 29th. The World Wildlife Fund (WWF) announced the results of a report indicating that between 1970 and 2014, global wildlife populations had declined by a staggering 60%. Even if their estimates are off by half, a 30% decline over such a relatively brief period would still be alarming.
The second story, coming just one day after the first, was NASA’s announcement that its Kepler space telescope had run out of fuel and would not be continuing its stunningly successful search for exoplanets. Kepler had discovered more than 2,600 planets orbiting other stars during its lifetime, further dislocating humanity from its perceived place at the center of the universe. By revealing “that 20 to 50 percent of the stars visible in the night sky are likely to have small, possibly rocky, planets similar in size to Earth, and located within the habitable zone of their parent stars”, Kepler seems to support those convinced that we are unlikely to be the only place in the universe where life has emerged.
The tension these two stories represent stirs something deep within me, and not just because they arrived within 24 hours of each other. Because of their coincidental relationship to my own personal arrival on this planet, they each, in their own way, reflect the seemingly conflicting currents of history that have become increasingly evident with age.
I was born just one month before Neil Armstrong and Buzz Aldrin set foot on the Moon. I also entered this world just a few months before the WWF’s baseline year of 1970. So the 60% decline in wildlife populations and the nearly 28,000% increase in the number of known planets during my lifetime are jarring, to say the least.
As I’ve said elsewhere, I’m not fond of adopting either optimism or pessimism as a default outlook. Going through life either perpetually cheerful or gloomy seems like avoiding confronting the world on its own terms, even if the avoidance is often an unconscious one. Even terrible news for us is good news for somebody. If you and a bunch of your coworkers get laid off, odds are the company’s shareholders are happy. Even a corpse can be a reason to celebrate if you’re a bacterium or a vulture.
I’m also not too keen on the way we often describe ourselves as a species. We tend to point to our impact upon the planet as though it were an indication of either genius or stupidity, leaving little room for the vast landscape of complexity and nuance that lies between these two extreme assessments. It’s just trade-offs all the way down.
As the Kepler telescope and all the other probes we’ve sent into space demonstrate, we aren’t idiots. That said, as the WWF study reminds us, scaling up our civilization to this point has also too often been an ad hoc operation that fails to consider all the possible consequences of our actions or quickly correct for them once those costs have become clear.
The progress paradox refers to a curious phenomenon that social scientists have documented over and over again: there is often an inverse relationship between objective improvements in human well-being and people’s reported overall happiness. While those living in extreme poverty will report significant gains in personal life satisfaction following increases in income and access to resources, these gains don’t continue to follow a linear trajectory as income continues to grow. Instead, people’s happiness growth curve begins to flatten once their basic needs are satisfied. For many living in the wealthiest nations on the planet, the curve has even taken a U-turn.
In a recent article published in the October 2018 issue of Science, researchers Carol Graham, Kate Laffan, and Sergio Pinto cite both the United States and China as strong examples of the progress paradox. “The United States has one of the wealthiest economies in the world,” the authors state, “yet life expectancy is falling owing to deaths driven by suicides and drug and alcohol overdose. This particularly affects Caucasians with less than a college education.”
In China, which “is perhaps the most successful example of rapid growth and poverty reduction in modern history,” with GDP increasing “fourfold between 1990 and 2005” and life expectancy during the same period skyrocketing by more than 6 years, life satisfaction nonetheless dropped significantly as the nation’s middle class ballooned and overall health improved. Graham, Laffan, and Pinto report that there too “suicide increased, reaching one of the highest rates in the world.”
In China’s case, however, it wasn’t those lacking an education but those with one who made up “the unhappiest cohorts” surveyed. While they “benefited from the growing economy,” they also had to endure “long working hours and a lack of sleep and leisure time.”
It’s difficult to appreciate all the new planets being unveiled by instruments like the Kepler space telescope when our lives here on Earth don’t even allow us to get enough sleep. Furthermore, all our city lights are blocking out the stars that our ancestors previously enjoyed: stars that we can no longer see without first traveling great distances deep into the heart of one of the few remaining desolate landscapes large enough for us to escape the nearly omnipresent urban glow.
This rapid scaling up of our civilization without regard to its toll on the individual psyche is also happening without much regard to its toll on nature as a whole. Our inability to find the time to spend even just a few hours each week outside smelling the roses, let alone spending a leisurely weekend in the woods now and then, is directly connected to our failure to find the political will to protect the environment upon which all life, including our own, depends.
In his book On Trails, the author Robert Moor writes, “We can travel at the speed of sound and transmit information at the speed of light, but deep human connection still cannot move faster than the comparatively lichenous rate at which trust can grow.” As with individual connections to one another, so it is with connections to our wider world. Slowing down enough to observe and build a relationship with the earth can only happen at a “lichenous rate”.
We cannot continue to pull ourselves out toward the stars and toward an ecological crash simultaneously. Sooner or later the lights will need to be dimmed not only for survival’s sake but so that our children can again see what it is we are reaching for. Reaching into the heavens can sustain our spirits and bring us the wisdom we need to carry on, but only if we take the time to look at what we’re finding there. Ultimately, even our loftiest achievements are still grounded here on Planet Earth.
Follow Craig on Twitter or read him at Medium.com
By Ryan Lau | @agorists
Enlightenment-era philosopher John Locke was a vocal supporter of the idea of rights. His famous works outlined life, liberty, and property as the three basic natural rights in the world. Though these rights were granted by an all-powerful force (nature or a creator), a government would protect them. However, Locke’s perception of the very idea of rights is simply inaccurate. In the grand scheme of things, a right to perform an action means very little, as it cannot stop an ensuing consequence from occurring.
First, it is worth noting that government is a downright awful guardian of rights. Inherently, the state takes away both the liberty and the property of nearly every individual it claims to protect. When it signs into law a bill regarding a victimless act, the state usurps liberty. And, when that state takes time and money from the people via conscription and taxation to execute and enforce said law, it usurps property. Thus, with nearly every action it takes, a state is in violation of two of the three Lockean principles. This, of course, throws a wrench into the idea of a government protecting rights.
Now, if a government is not the solution, what is? Surely, there must be a way to guard these rights. After all, they have been touted as the cornerstones of a free society for hundreds of years. Yet, as stated above, the allegedly free society’s function relies on restricting the very rights it claims to protect. This progression of thought leads many, including me, to abandon the notion of a successful state, instead believing that an anarchist community will best guard rights.
Alas, a society without rulers will clearly have its flaws, too. In the absence of police and prisons, some people will be able to infringe more upon the rights of others. Simply stated, the existence of a right will never stop someone from infringing upon it. The idea of a right is actually quite similar to the idea of a gun-free zone. If a shooter has an intent of murder, then a sign that tells them they cannot shoot will in no way prevent them from doing so. Though the sign has a good intention, it does nothing, as the gunman has a stronger motive.
The exact same concept applies to the idea of a right to life. Sure, all humans, according to Locke, have a right to life. Yet, that right seemingly dissolves when the gunman pulls his trigger. The right to life, in itself, does no more to actually guard lives than does a gun-free zone sign. In fact, it may be less effective, as the sign may be a slight crime deterrent in a few instances. Hence, a society without a state operates only marginally better than one with a state, when both claim protection of rights as an ultimate goal. Sadly, this renders the very idea of rights to be insignificant to a society’s mode of function.
If not rights, then what should determine the workings of a society? In short, the answer is based on morality and on true, informed consent. More specifically, it involves ensuring, on a local level, that every individual is treated in an acceptable manner, by their own standards. It is wrong to assume that a singular definition of “right” will work for a large group of people. In fact, such an assumption may be one of very few objective wrongs in this world. Such an assumption allows for the great inhumanity of misunderstanding.
In the vast world we humans live in, it is impossible to count the sheer number of cultures, ideologies, and philosophies that exist. That number is in constant flux, rising with every birth and falling with every death. How, then, can we ever trust a state to seek the interest of all of them? The thought is a naive impossibility, especially with the state’s inherent tendency to rob. A single anarchist idea will fail in nearly the same way. It simply does not come even close to representing the vast scope of ideas present in the world. The only idea that can truly guard the subjective needs of all is no idea at all.
Without a designated philosophy, a written or unwritten code of ethics, individuals can be free to form their own. Yet, unlike with a state, or even an anarchist community, a true lack of designation allows people to create multiple unions with those of differing values. In a state, trade barriers often limit the access people have to those bound by other states. In anarchist communities, strict economic and social guidelines may do the same. It is only when no community is given preference that all can thrive at once.
In such a realm, may some violations of individual standards still occur? Of course they will. Such is human nature, and the imperfect state of our planet. Yet, when we abandon the universal concept of rights, and instead focus on the needs of the individual, we move away from imperfection. In the gunman scenario before, imagine the scene occurring in a hospital bed. The victim is terminally ill, yet the hospital’s policy prohibits a swift end to the victim’s suffering. Now, the gunman is no evil force; he is rather trying to meet the needs of the sick man. Objective rights would state that the gunman is evil, and violating the sick man’s right to life. Yet, voluntary action and individual need trump the very concept of rights in every situation concerning an individual’s own self.
Objective standards for a society are an incredibly dangerous chasm, into which most of us have fallen. Rights are merely a long-standing manifestation of this chasm. Yet hope still exists for the world, and by moving away from a preference for objective standards, we begin to return to a moral existence.
Craig Axford | United States
In his famous 1893 essay The Significance of the Frontier in American History, the historian Frederick Jackson Turner lamented the 1890 Census Report’s conclusion that the “frontier line” beyond which large tracts of unbroken land could still be found had ceased to exist. “Therefore,” the report concluded, the frontier would “no longer have a place in the census reports.”
Turner believed, not without good reason, that America’s character was substantially if not entirely a product of its first century of westward expansion. He summed his thesis up early, writing in the second paragraph:
The peculiarity of American institutions is, the fact that they have been compelled to adapt themselves to the changes of an expanding people — to the changes involved in crossing a continent, in winning a wilderness, and in developing at each area of this progress out of the primitive economic and political conditions of the frontier into the complexity of city life.
With the frontier more than a century and a quarter behind us, at least according to the US Census Bureau, American institutions were perhaps never as challenged by the physical presence of a frontier as individuals are now by its absence. Without a landscape to test and define us, we are left to shape our own lives without the former environmental constraints imposed by a hostile natural world that needed taming. That’s a great luxury few before us have enjoyed, but not one that comes without personal cost.
Modern humanity has largely forgotten that not so long ago Mother Nature was a much greater imminent threat than it is today. We set aside “wilderness areas” and engage in activities like skydiving in large part because contemporary society is so safe it’s now necessary to seek out opportunities to experience a little bit of danger. From vaccination and seatbelts to chlorinated water and coffee cups with temperature warnings, civilization has successfully marshaled its resources to protect us from disease and injury. Distances that brought the pioneers of the 19th century weeks of hardship we typically travel in a weekend in air-conditioned comfort with time to spare for camping, hiking, mountain biking, or rafting. If we return to the office from these excursions Monday morning with a couple of visible scratches we feel we’ve proved our courage to our often envious co-workers.
The obstacles we must overcome no longer exist out there. Now it’s our own internal demons that we must conquer. The threats these pose are more subtle than mountain ranges or vast expanses of desert. They play upon our capacity for self-deception and our skill as architects of elaborate rationalizations. Toying with our emotions, they cause us to fear the other while assuring us that our own faults are actually strengths in disguise.
The unknown remains, as it always will. But on our home planet, the undiscovered places tend to be nooks and crannies rather than lost cities or unexplored canyons. To the extent we pursue it, the thrill of discovery is now much more personal than it is public. Those looking for fulfillment climb the peak because they have never climbed it before, knowing full well that hundreds if not thousands ascended it before them. It is their own curiosity more than humanity’s that they’re attempting to satisfy.
But for most of us, the experiences we settle for typically perform a much baser function. Instead of seeking meaning and sharing what we learn from the search, we record experiences as a means of keeping score. Selfies taken here and there serve to advertise things we got to do that others we know perhaps didn’t. Our smartphones provide both the soundtrack and the camera for a movie about ourselves we hope will get more clicks than whatever the proverbial Joneses may have posted. Many of us no longer even bother to edit the content we share, speaking whatever pops into our head or photographing ourselves whenever the mood strikes. Even our “leaders” are now increasingly getting in on the act. What we write isn’t as important as how often we write and how many people we get to follow us while we do it.
In a race that’s won by the person or group receiving the most attention, easy and shallow pastimes are a far more efficient means of generating material than activities requiring effort, planning, research and other forms of deep engagement. Unfortunately, attention will always be a poor substitute for meaningful relationships and banality will never be as fulfilling as pursuits that expand our horizons.
Frederick Jackson Turner concluded that the frontier had rendered “Movement” the “dominant fact” of American history during the country’s first century of nationhood. He argued that “the American energy will continually demand a wider field for its exercise…in spite of environment, and in spite of custom, each frontier did indeed furnish a new field of opportunity, a gate of escape from bondage of the past; and freshness, and confidence, and scorn of older society, impatience of its restraints and its ideas, and indifference to its lessons, have accompanied the frontier.”
Today there is no longer an opportunity to escape to fresh unsettled territory, but there are still frontiers galore for each of us to explore. In the absence of blank places on the map enticing us onward, we are faced with empty spaces within ourselves. It is our fear and ignorance that we must strive to overcome to find our “new field of opportunity.” For each of us, this frontier will offer somewhat different challenges and take unique shapes. But if we can transcend the easy narcissistic fixes that consumerism and social media invite us to indulge, who knows what we might be able to discover that’s truly worthy of sharing along the way.
I’ve never been much of a winter person. I don’t like having to get all bundled up to go outside, and camping in frigid temperatures for me usually means a miserable sleepless night.
I have gone snowshoeing a couple of times, and alpine skiing a few additional times. I briefly organized monthly trips to a local nordic center. Each full moon the center would line the groomed trail around a nearby frozen lake with luminaries. The moonlight reflecting off the snow gave the whole world a kind of silver aura which seemed particularly magical after a couple of hot toddies.
But if I’m being honest, those monthly trips were more social events than moonlit escapes into nature. It wasn’t the kind of thing I ever seemed inclined to do on my own. Winter for me, as for so many people, is spent predominantly indoors. It takes special circumstances to lure me outside for any significant length of time from December to March, and I always reserve the right to cancel on account of the weather.
Experiences accumulate like fat during the other three seasons of the year. During the winter this stored energy is burnt off in various essays and a few other creative pursuits, for better or for worse. Time is spent trying to stretch the supply of experiential material hopefully accumulated during the warmer part of the year, when the only item of clothing not technically optional was a pair of hiking boots. Winter is ideally made for research, typing up and reviewing notes, and scribbling short grey days and long cold nights away.
. . .
I have a table at a local pub that I use to get through the colder months. Generally, I visit it only one day a week, though two is not unheard of. Weekdays are preferable to weekends. Mondays or Tuesdays are the best.
The bar is usually pretty empty early in the week. Just about everyone else has a liver exhausted from a weekend of over-indulgence and is back to their regular 9–5 routine. I’m blessed (or cursed) with a routine that is out of sync with most of the employed world, and so can’t tax my liver on the same schedule.
Regardless, I like having the pub almost exclusively to myself. Sometimes there’s just me and the staff, huddling nearby for their weekly meeting. There’s a plug for the computer, and I bring a backpack filled with notes and books to fill the afternoon while casually eavesdropping to learn which beers and liquors were most popular over the course of the previous week.
The TV in the corner behind the bar provides a mild visual distraction, though I can’t hear it over the music. The setting is familiar, but not too familiar. The bartenders have come to know me and to expect I’ll be staying a while. They are polite but respectfully keep their distance knowing the only interruptions I expect or will long endure are those necessary to keep the pints coming. It’s as though I’m a fixture, and I like it that way.
Though my regular Monday and/or Tuesday table would seem about as far removed from nature as one could get — especially being situated, as it is, at a window overlooked by a busy city sidewalk — it plays a similar role. The pub, like treks into the nearby mountains and deserts, provides the occasional necessary change of scenery to keep the creative process on track.
It is often said that the discipline a daily routine imposes is essential to every would-be writer, artist, scholar, and scientist. But everything is poison at a certain dose. Creativity requires breaks from the usual surroundings, even if these changes are themselves a predictable part of a regular schedule. If five or six days are spent working largely alone staring at the same four walls, introducing the mildly unpredictable ruckus of a pub and some different faces to look at now and then can lubricate the gears a little. Of course, the beer helps too.
. . .
But now it’s mid-March. In just a few days spring officially gets underway. This may qualify as a fifth season of the year: the anticipatory season. Thoughts increasingly turn toward the chance to really get away. The eyes begin to scan the nearby mountains more and more to assess how much longer the snow-pack will interfere with the chance to go for a hike.
Text messages with particular friends during these final days of winter inevitably include the possibility of near future day hikes and camping trips. We know that many of these will never take place. A lack of time and resources will squeeze the possibility out of most of these schemes before they have a chance to become reality.
But that’s not the point. Just imagining weekends or even whole weeks away is itself a kind of mini vacation. Researching new places to go on the internet and sharing the findings with friends who likewise find sitting around campfires, scrambling over rocks, or climbing unfamiliar mountain peaks intoxicating brings its own rewards. Indulging these fantasies is essential to maintaining the outward appearance, which polite society generally expects, that we are fully engaged with reality.
But at least a few of these fantasies will materialize into experiences more tangible than bucket lists and daydreams. “Wisdom often wanders,” Robert Moor writes in his book On Trails: An Exploration. “St. Augustine, Siddhartha, Li Po, Thomas Merton, Maya Angelou — the insight of each was deepened by wild and meandering youth.” I’m not exactly youthful anymore. Nor do I claim to be particularly wise. That said, I’m looking forward to leaving this winter’s pub table for another season of wandering of both the imaginary and real variety. Perhaps in the process, I’ll get lucky and gain some wisdom that will make its way to paper next pub season.
Cover Image by author along Canada’s West Coast Trail in Pacific Rim National Park