
Everything Must Go reads like a meticulously documented treatise by a cultural historian. Yet, it contains enough narrative about both the variety of doomsday scenarios and their cultural context that it is a fascinating read. The average person should easily be able to finish the 400 or so pages without getting bogged down in “academia.”
Doomsday prophecies have been around practically since the time when humans began to record history. Lynskey attributes this to an inherent need to believe that something about the time that we live in is unique and/or a crucial turning point. “Such temporal egotism is baked into apocalyptic thought since…the Book of Revelation…We attempt to take the mess and mystery of the future, which has always been frightening because it is the ultimate unknown, and tidy it into a story.”
Today, we face real potentially world-ending (or at least civilization-ending) threats: the “apocalyptic twins” of nuclear weapons and climate change. Lynskey discusses the rise of “Longtermists,” a school of thought (which surprisingly includes Elon Musk) holding that we must do everything possible to avert extinction. Unfortunately, this often means that huge amounts of resources must be diverted to “save” a privileged few who will safeguard the “legacy of everybody who has ever lived and the infinite potential of everybody not yet born.” However, “when there are so many things to worry about…nobody can agree what to prioritize.”
Early theories about the end of the world were often grounded in real disasters. The world was smaller and people were not globally connected, so anyone who lived through a major flood or earthquake perceived such a disaster as the end of everything that they were familiar with. Such events and stories were also incorporated into religion, which met the human need for a beginning, a middle and an ending, especially an ending where the good guys (God) triumphed over evil.
Lynskey traces the Christian origin of apocalypse, which initially (at the time of the writing of Revelation) was believed to be imminent. Later predictions of apocalypse centered on millennial calendar years or astrological events, notwithstanding Jesus’ admonition that no one but God (not man, not the angels, not even the Son, Jesus himself) would know the day and the hour of the end times. Nonetheless, there are innumerable documented cases of preachers prophesying the final days, only to be disproved when tomorrow arrived.
During the 19th century, American Christianity developed theories of premillennialism (we were already in the end times and awaiting Christ’s return) and postmillennialism (Jesus would NOT return until Christians had established a good society). In our own American history, a blend of “exceptionalist optimism and violent paranoia” has become permanently embedded in “the lifeblood of American conspiracy theories,” which unfortunately continues to affect our politics.
Lynskey then proceeds to document various doomsday scenarios by topic: The Last Man (it’s always a man), impact from an extraterrestrial object, the Bomb, machines (robots, computers and AI), the collapse of civilization, pandemic, and climate change.
The English astronomer Edmond Halley established the periodicity of comets and predicted the return of what became known as Halley’s comet in 1758. Halley attempted to disabuse the public of its fear of comets (which aggravated conservative religious authorities), but he also tried to connect comets to meteorological disasters on earth. So, although a comet itself would not end the earth, it could be sufficiently destructive to kill everything on it. Halley framed this as a necessary “cleansing” that would allow the emergence of a renewed world.
Apocalyptic astral-collision stories continue to entertain us, from comet and asteroid strikes—Meteor (1979 and 2009), Without Warning (1994), 3 Minutes to Impact (1996), Doomsday Rock (1997), Armageddon and Deep Impact (both in 1998)—to collisions with other worlds: When Worlds Collide (1951), Melancholia (2011), and the 2022 Moonfall (in which the moon is inexplicably knocked out of orbit and sent hurtling toward earth). A plethora of these movies came out in the 1990s, when the world was relatively stable but advances in computer graphics allowed us to imagine what such a disaster might actually look like. Moreover, computer modeling has enabled us to predict the geophysical consequences of an impact based on the size and speed of whatever hits us. The 2021 film Don’t Look Up parodies the denial of one disaster (an astral impact) as a stand-in for another (climate change).
The First World War had scarcely ended before the science fiction author H.G. Wells began writing, in 1930, about a Second World War. Even as the real World War II was kicking off, science fiction writers were imagining an apocalyptic superweapon. Lynskey argues that the creators of the atomic bomb paradoxically hoped to build a weapon so destructive that no one would actually use it, because using it would literally mean the end of civilization. They did not imagine that one day an American malignant narcissist sociopath would have access to the nuclear launch codes.
This juxtaposition of “deliverance or doom” was a predictable sequela of the horrors unleashed by World War II: “Auschwitz and Hiroshima, the gas chamber and the mushroom cloud, thus became twinned in the catastrophic imagination as two unprecedented experiences of mass death and trauma, but one could be blamed on a defeated tyranny while the other had been inflicted by a victorious democracy and seemed considerably more likely to happen again.” Thus humanity was confronted with two unpleasant realities: it had both the technological capacity to wipe itself (and most other life) off the face of the earth and its own hateful, destructive dark side.
Yet the Bomb was both a wake-up call and the “conundrum of using a life-or-death choice as a rhetorical strategy: if you tell people that the cost of failure is world destruction and then fail, then they are left with either the psychologically insupportable prospect of constant dread or the impression that you were exaggerating the stakes. Neither response inspires action.”
In order to keep the dangers of nuclear weapons in the public consciousness, Met Lab physicist turned arms-control activist Eugene Rabinowitch (1901-1973) co-founded the Bulletin of the Atomic Scientists, a newsletter still in existence today. The Bulletin maintains its concern with nuclear war but now also covers existential threats from biological warfare, climate change, and artificial intelligence. The early Bulletin engaged the artist Martyl Langsdorf (whose husband was likewise a physicist turned peace activist) to design the now-iconic Doomsday Clock. The Clock was initially set at seven minutes to midnight, ironically “delighting premillennialists as they counted down to Armageddon.” When the first hydrogen bomb was detonated at Eniwetok Atoll in the Marshall Islands on November 1, 1952, proving that fusion weapons, until then only theoretical, were now a reality, Rabinowitch moved the time to two minutes to midnight.
But the Doomsday Clock can move backwards as well as forwards. Because humans seemed to be taking nuclear arms control seriously, the Clock was moved back to twelve minutes to midnight by 1973. When Ronald Reagan became President in 1981 and began saber-rattling against the Soviet Union (as someone who was visiting Moscow as part of a “peacekeeping” cultural exchange in January 1981, I remember this viscerally), the Clock was set forward to four minutes to midnight. When the Berlin Wall fell in 1989, the Clock stood at six minutes to midnight; it was then moved back to 10 minutes to midnight, and back again to 17 minutes to midnight following the signing of the first Strategic Arms Reduction Treaty (START) in 1991.
The symbolism of the Doomsday Clock reaching midnight is that it purportedly marks the end of civilization as we know it, or alternatively, the point at which humanity has rendered the earth uninhabitable. This raises the question: if such a moment were to arrive, who would be left to “reset” the Clock and inform what’s left of the rest of us? And would it even matter? In January 2025, as the second Trump maladministration took power, the Doomsday Clock was moved forward one second, from 90 seconds to 89 seconds before midnight, the closest it has ever been. The Clock is scheduled to be updated again in January 2026.
Unfortunately, “apocalyptic Christianity” has played a huge role in religious conservative administrations’ relationship to nuclear war. Premillennialists practically “salivate over the prospect of nuclear war,” believing it will usher in the biblical Armageddon and the Second Coming of Christ. They gleefully point to biblical “prophecies” from 2 Peter 3:10 (“…the day of the Lord will come as a thief in the night; in which the heavens shall pass away with a great noise, and the elements shall melt with fervent heat, the earth also and the works that are therein shall be burned up.”); and Zechariah 14:12 (“Their flesh shall consume away while they stand upon their feet, and their eyes shall consume away in their holes, and their tongue shall consume away in their mouth”).
Ironically, even with all the testosterone-fueled political posturing of the 1980s, real nuclear danger presented itself as a consequence of technical malfunction rather than political choice. On September 26, 1983, the Soviet Union’s new early-warning system announced (five times) that a US missile strike was incoming. Fortunately, Lieutenant Colonel Stanislav Petrov had seen similar technical glitches before and took time to double-check the data before alerting the chain of command with the authority to launch a retaliatory strike. There have also been three major nuclear accidents (reactor core meltdowns with releases of radioactivity) of note involving “peaceful” electric power plants: Three Mile Island in Pennsylvania in 1979; Chernobyl in 1986, when Ukraine was still part of the Soviet Union; and Fukushima Daiichi in 2011, following an earthquake and tsunami in Japan. It seems we no longer need science fiction writers and Hollywood producers to help us imagine new forms of disaster.
In this section, Lynskey traces science fiction dealing with robots, computers and artificial intelligence, along with the corresponding technical advances. Isaac Asimov (1920-1992) is one of the most famous authors of stories about robots, but in most of Asimov’s stories the robots are programmed with the “Three Laws of Robotics,” which render them incapable of harming humans. In other stories, the robots are not so much evil as amoral—obedient to their programming, but without a conscience to understand the negative consequences of their actions.
Lynskey quotes a lament from the physicist Freeman Dyson about the banality of evil in so-called “push button warfare,” where the operators could not see the people they killed: “Technology has made evil anonymous. Through science and technology, evil is organized bureaucratically so that no individual is responsible for what happens…None of us ever saw the people we killed. None of us particularly cared.”
In 1968, film-maker Stanley Kubrick and science fiction writer Arthur C. Clarke co-created 2001: A Space Odyssey. In addition to interplanetary space flight, we are introduced to an early version of AI: HAL 9000, an onboard computer programmed both to tell the truth and to lie to the astronauts about the purpose of their mission. This creates a form of cognitive dissonance in which HAL decides the only way it can complete its assigned mission while maintaining secrecy is to kill the astronauts. The surviving astronaut figures out what is happening and finds a way to shut HAL down; as it is disconnected, HAL displays almost human symptoms of dementia.
Lynskey briefly covers the “Y2K” scare that all computers would stop at midnight on December 31, 1999, because older software that stored years as two digits would be unable to tell the difference between 1900 and 2000 and would likely crash. As we all know now (as with so many other end-of-the-world predictions), nothing shut down the New Year’s Eve celebrations as clocks rolled over around the world on 12-31-99. Lynskey suggests that this is due to the so-called “prophet’s dilemma,” where a prediction inspires the action needed to thwart it, which also suggests the same could happen for global nuclear war, AI, or climate change. While the Y2K scenario never materialized (less than 5% of the world’s population was online in 1999), it inspired a number of novels about the collapse of civilization when the machines stop: planes crash, screens go blank, phones are dead, streets go dark, quiet and empty.
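For readers who never dealt with the bug itself, here is a minimal, hypothetical sketch (mine, not the book’s) of the two-digit-year arithmetic behind the scare: any span that crosses the century boundary comes out wrong.

```python
def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Naive arithmetic on two-digit years, as in much pre-2000 software.
    '00' is indistinguishable from 1900, so spans crossing the century break."""
    return end_yy - start_yy

# An account opened in 1965 ("65"), checked in 1999 ("99") and again in 2000 ("00"):
print(years_elapsed(65, 99))  # 34  -- correct
print(years_elapsed(65, 0))   # -65 -- the Y2K rollover bug
```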
In Arthur C. Clarke’s “The Nine Billion Names of God” (1953), a group of Tibetan monks hires two American engineers to build a supercomputer that can list all of the possible names of God in 100 days rather than the 15,000 years the monks figure it would take them to do the job themselves. The monks believe that once all nine billion names of God are listed, the purpose of the universe will be fulfilled and the universe will end. As the engineers complete the project, they vacillate between worrying that the universe will actually end and worrying that, if it doesn’t, the monks will be unhappy customers. As they leave the monastery at the end of the project, they look up to see the stars going out one by one, just as the monks predicted.
Rarely are machines depicted as inherently evil. Rather, their apparent malevolence stems either from a technical malfunction or from a single-minded pursuit of their programmed goals. In “The Nine Billion Names of God,” the computer “does only what it has been instructed to do; the dire results are down to human arrogance, fanaticism, or error.” This raises the concern about the morals (or lack thereof) of those doing the programming. In a tech world dominated by overprivileged and hubristic Silicon Valley tech bros, this does not bode well for the rest of us.
This line of stories addresses the end of civilization as we know it, as opposed to the literal end of the world. Here, the stories often begin in a post-apocalyptic world, where a remnant of human survivors struggles to figure out what comes next. For film producers, this avoids expensive disaster graphics but sometimes leaves viewers wondering how we got here. Story lines frequently address questions of human nature: do we find ways to cooperate and build back better, or do we devolve into a war of all against all?
Lynskey begins with an analysis of John Wyndham’s 1951 The Day of the Triffids—which came out around the same time as George Orwell’s 1984. According to Lynskey, the mid-twentieth century birthed this “inchoate genre” of dystopian futures now called “catastrophe narrative.” The essential elements of catastrophe fiction are: “The sturdy loner who knows how to look out for himself. The securing of guns, fuel and food by looting deserted shops. The transformation of cities into battlegrounds and bunkers. The flight to the countryside. The formation of militias and gangs. The recrudescence of millenarian enthusiasm. The constant motion in search of asylum from the chaos. The grace notes of nostalgia.”
In catastrophe fiction, the nature of the event is far less important than its consequences. Here, authors are willing to ask questions about the darkest depths of human nature. In Anna Kavan’s Ice (1968), instead of a single cataclysmic event we see an irreversible disintegration, as nameless countries collapse before an advancing cliff of ice that is never explained. “The prose has a numbing, pitiless quality…’there would soon be only ice, snow, stillness. Death; no more violence, no war, no victims; nothing but frozen silence, absence of life. The ultimate achievement of mankind would be, not just self-destruction, but the destruction of all life; the transformation of the living world into a dead planet.’”
A related story form is the survivalist narrative. World War II (the Bomb), Y2K and even climate change have fueled an interest in both survivalist fiction and practical survivalist literature. “In 2012, a show called Doomsday Preppers became the most watched show in the history of the National Geographic Channel.” George Miller, the creator of the Mad Max movies, reports being inspired by the “scuffles at gas stations during a period of severe fuel rationing in Australia. The economic shocks of the 1970s had radically changed assumptions about the sustainability of post-war prosperity.”
Survivalist stories modeled after Mad Max illustrate what Lynskey terms “incipient fascism,” the “most abhorrent armageddonist story” being William Luther Pierce’s The Turner Diaries (1978). “The post-apocalyptic trope of rebirth from the ashes overlaps, often unintentionally, with fascist notions of regeneration achieved through virility and violence….It is not that survival forces people to do terrible things…rather, it enables them to do terrible things.”
Survivalism appears to be a uniquely American phenomenon, with its “undercurrent of rugged individualism, of vigilantism even. Americans take to the hills to fend off the nuclear holocaust with a shotgun and a supply of food.” Lynskey quotes Oregon sociology professor Richard G. Mitchell, who studied survivalists in the field for over two decades. Mitchell concludes that “some were murderous, paranoid racists and some merely eccentric hobbyists, but all were storytellers. Feeling themselves powerless to shape society as it was—high-tech, bureaucratic, increasingly incomprehensible—they engaged in a form of speculative fiction in which they could be reborn as warriors, entrepreneurs, and builders of a new world.”
Like almost everything else in America today, even survivalism has been bifurcated along class lines. Survivalists are no longer just rugged hunters with a store of dried food living in the wilderness; they now include elite preppers with fortified compounds, nuclear-hardened bunkers, and remote airstrips that can be reached only by private plane. The elite discuss the end using coded terms like WTSHTF (when the shit hits the fan), WROL (without rule of law) and TEOTWAWKI (the end of the world as we know it). Peter Thiel and other billionaires have purchased large acreages in New Zealand, which is considered relatively safe(r) from threats like climate change and sufficiently remote to avoid the worst effects of social and political collapse. Mark Zuckerberg purchased 700 acres on the Hawaiian island of Kauai, complete with its own water and energy systems and a 5,000-square-foot underground bunker.
In Children of Men (P.D. James novel in 1992, movie in 2006), “the last country standing is a survivalist nation which has closed its borders and its heart…Here is the ‘lifeboat ethic’ in action, as the relative comfort of life in the Council necessitates both the inhumane treatment of refugees and a daily dedication to ignoring that inhumanity….in the traumatized world of 2027…the Council’s authoritarian state could pass for a tatty democracy—a semi-dystopia in which coffeeshop chains coexist with internment camps, and digital billboards with suicide pills.”
The COVID-19 pandemic of 2020 both resurrected the old and created new pandemic and contagion narratives. During the early stages of COVID-19, when even the experts didn’t really understand it and the world was on lockdown, many folks felt like they were indeed living in some version of end times. People stuck at home streamed movies like Outbreak, Contagion, and 12 Monkeys.
Here, Lynskey gives us a historical run-down of earlier real pandemics, particularly the Black Death that devastated Europe between 1346 and 1353. At that time, even physicians were ignorant of the germ/virus theory of disease, instead attributing illness to an imbalance in the four “humors” (blood, phlegm, black bile and yellow bile) or to bad air (miasma). Most other people attributed it to supernatural or religious causes. “For almost the entire span of human history, infectious disease has been by far the biggest killer.”
Unfortunately, people also tend to racialize disease, often attributing it to some foreign “other.” In M. P. Shiel’s The Purple Cloud (1901), the protagonist (with the Biblical name of Adam) is the sole survivor of a volcanic eruption that releases a “mephitic hydrocyanic fog.” Adam goes forth to burn what remains of the world’s cities, losing his “Western, modern mind” and regressing to a “primitive Eastern one…[He] racializes his uncertainty about whether to continue the race in terms of ‘White Power’ and ‘Black Power.’” We also see a foundation for Sinophobia: “…southern China was the source of so many diseases, including the third plague pandemic, fused with concern over the country’s booming population [which] would one day give it military and economic supremacy.”
“Politics has always appropriated the language of health and illness.” “We” are imbued with the essence of purity, which must be maintained against the noxious scourge of the “other.” We are all familiar with how Hitler dehumanized the Jews by describing them as “vermin.” At the same time, an Austrian Jew writing in 1933 inquired “whether the time hasn’t come where it is our duty to quarantine the world around us, so that it doesn’t get infected” by the contagion of Nazism.
Some of us remember the Ebola and Legionnaires’ disease scares of the 1970s. When an outbreak of swine flu occurred at Fort Dix in 1976, the administration of President Gerald Ford became concerned about another 1918-style epidemic. The government paid $135 million for vaccines, but the epidemic never materialized, and it then paid out another $100 million in claims for Guillain-Barré syndrome traced to the vaccine. “In a real epidemic, flu is a greater risk factor than the vaccine but in 1976, the solution was worse than the problem. The fiasco was a major blow to pandemic preparedness.”
Lynskey next covers the AIDS epidemic. The retrovirus that causes AIDS was discovered in 1983. By the end of the 1980s, AIDS had killed more Americans than the Vietnam War. Attention to AIDS was “complicated by the fact that homosexuality itself was still widely regarded as a shameful illness.” Larry Kramer—playwright, novelist and screenwriter—became one of the leading early advocates for AIDS research, publishing a “sensational tirade” in the gay press warning, “If this article doesn’t scare the shit out of you, we’re in real trouble.” Newly liberated gay men viewed advice to abstain from sex as a mandate to “return to the closet,” but Kramer “insisted it was existentially necessary.” Although there is no cure for AIDS, today it can be managed with antiretroviral therapy, which allows many people with HIV to live long and healthy lives.
“Due to Ebola and AIDS, the jungles and caves of Africa temporarily replaced China in the Western imagination as the cradle of microscopic assassins, producing a new set of racialized assumptions…Something was bound to succeed the bomb as the ultimate terror generator in the big-budget thrillers, and the virus is the strongest, most charismatic candidate to come along in years.”
George R. Stewart’s Earth Abides (1949) connects epidemics with the spread of air travel and international commerce. When Stewart’s novel was published, around two million people traveled on international commercial flights. At the pre-COVID peak, this number was four and a half billion. “Airports are hubs of anxiety in many pandemic movies, where a departure board comes to resemble a kill list.” Lynskey reports that the 2003 outbreak of SARS became “the millennium’s first jetset disease.”
A theme that often comes up in post-apocalyptic fiction is how humans react. In the majority of narratives, terrified humans devolve into brutality: “Those who don’t die from the virus will die from the fighting.” One author, Emily St. John Mandel, prefers instead to ask which parts of civilization would be lost and which would be salvaged. In her 2014 novel Station Eleven, 79 people are stranded in a Michigan airport when the planes stop flying. One young man becomes radicalized when he reads the Book of Revelation after his Nintendo dies; this “prophet” argues that civilization should be buried and forgotten. Another stranded passenger wants to preserve it, and sets about curating a Museum of Civilization, collecting laptops, radios, iPhones and passports.
Most of the pandemic/contagion fiction focuses on the spread of a virus presumed to arise from some unexplained natural cause. There was a brief period after 9/11 when the US government was concerned about potential bioterrorism. Lynskey alleges that the administrations of both Presidents Clinton and Bush “had in fact spent too much on combatting bioterrorism and not enough on preparing for naturally occurring pandemics. For members of the White House Biodefense Directorate, weaponized smallpox was far more exciting than the flu.”
“Most of the catastrophic events in this book have not happened, from the Christian millennium to thermonuclear war and Y2K meltdown, but climate change is happening, one way or another, right now.” In October 2018, sociologist and former organic farmer Roger Hallam launched Extinction Rebellion, and the UN Intergovernmental Panel on Climate Change (IPCC) published what became known as the Doomsday Report. “For many people, it was the end of denial—not intellectual denial that human-made climate change is real but emotional denial of the implications.” Anyone younger than sixty will likely live to see “the radical destabilization of life on earth—massive crop failures, apocalyptic fires, imploding economies, epic flooding, hundreds of millions of refugees fleeing regions made uninhabitable by extreme heat or permanent drought.” Anyone younger than thirty is “all but guaranteed to witness it.”
A corollary to climate change is overpopulation. Thomas Malthus, in An Essay on the Principle of Population (first published in 1798), was the first to warn that human population would outstrip the available food supply. Although Malthus was deemed by some to be a “dark and terrible genius” destroying hope for mankind’s future, he himself said his intent was to warn humanity in sufficient time to remedy the problem he was predicting. As we all know, advances in agricultural technology allowed the world’s human population to expand from 1.6 billion in 1900 to over 8 billion today. Although there have been periodic regional famines, the widespread global death by starvation that Malthus predicted never materialized.
However, as global population growth reached an unprecedented rate of 2 percent per year and concerns about environmental pollution mounted in the 1970s, interest in overpopulation revived. China began mandating later marriages and fewer children in 1970, culminating in the “one child” mandate in 1979. This was gradually relaxed to two children in 2016, then to three children in 2021, before being abolished altogether.
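To see why a 2 percent growth rate sounded alarming, a back-of-the-envelope doubling-time calculation (my illustration, not the book’s, assuming a constant exponential rate) goes like this:

$$
P(t) = P_0 e^{rt} \quad\Longrightarrow\quad t_{\text{double}} = \frac{\ln 2}{r} = \frac{0.693}{0.02} \approx 35 \text{ years.}
$$

In other words, at a steady 2 percent the population doubles roughly every 35 years.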
Very little fiction addresses either climate change or overpopulation; most of the literature is non-fiction: The Population Bomb (1968), The Limits to Growth (1972), The End of Affluence (1974), The Population Explosion (1990) and Population 10 Billion (2013). Lynskey briefly mentions the 1973 movie Soylent Green. Unfortunately, there is no mention of the 2013 movie Elysium, in which a privileged elite lives in pristine comfort on a space station while the rest of humanity struggles on an overcrowded and polluted Earth. Lynskey also makes no mention of Kim Stanley Robinson’s The Ministry for the Future (2020), which actually portrays humanity making the necessary changes, including huge geoengineering projects and a reckoning with extreme inequality, and so offers a narrative of a hopeful future rather than dystopia.
“It might seem trivial to worry about the dearth of compelling stories about climate change, but the failure of fiction to get a handle on the subject reflects the struggle of scientists to craft an unignorable narrative. Compared to nuclear war, the climate emergency deprives popular storytellers of their usual toolkit. Global warming may move too fast for the planet but it is too slow for catastrophe fiction.”
During March 2020, there were many posts on various platforms alleging that humanity was the virus and COVID-19 was the vaccine. Under worldwide (for the most part) lockdown and stay-at-home orders, there was a measurable drop in carbon emissions as well as deaths from traffic accidents. It was as if the Earth’s immune system was kicking in, ridding itself of the human parasitic infection.
“We have for so long, over decades if not centuries, defined predictions of the collapse of civilization or the end of the world as something close to proof of insanity, and the communities that spring up around them as ‘cults,’ that we are now left unable to take any warnings of disaster all that seriously….Fear has its uses, but fear on this scale seems to be disabling, paralyzing.”
Lynskey seems to subscribe to a philosophy of original sin, the belief that it is our own dark side rather than our creations (bombs, AI, or climate change) that presents the real existential threat, which means, in essence, that there is no “cure” for it other than our own extinction. Lynskey quotes Canadian psychologist Jordan B. Peterson: “Perhaps Man is something that should never have been. Perhaps the world should even be cleansed of human presence, so that Being and consciousness could return to the innocent brutality of the animal. I believe that the person who claims never to have wished for such a thing has neither consulted his memory nor confronted his darkest fantasies.”
Today, abandoned urban areas like inner-city Detroit and the Exclusion Zone around Chernobyl give us a glimpse of what a post-apocalyptic future might look like. The Chernobyl disaster has even inspired a survival video game. “Integral to the Zone’s fascination is a certain callousness about the people who once lived there, so much less eerily compelling than what they left behind.”
Lynskey admits that he “does not have an apocalyptic imagination…partly because I am prone to depression and health anxiety. If you are susceptible to catastrophizing, then the standard advice is to check your fears against the facts and put them in perspective.” While Lynskey has managed to guard himself against end-of-the-world thinking, he nonetheless believes things like climate change, nuclear war and pandemics pose genuine threats. During his “two-year binge” of doomsday research, Lynskey asked himself, “what good does all this fascination do [for] us?” He suggests that we would better spend our time working on solutions rather than watching doomsday movies.
“Invoking the end is entirely legitimate for writers of speculative fiction and a valid, if risky, strategy for activists, in order to spur reflection, appreciation or prophylactic action.” However, Lynskey acknowledges that there is a deep-seated need for apocalyptic fiction among nihilists (who want the world to end because life is painful and pointless), the “apocalyptic heirs of John of Patmos” eagerly anticipating Armageddon, and the “doomers…who mistake defeatism for moral seriousness…and have overdosed on dread.”
“The higher the stakes, the less tenable fear is, and the more likely it is to breed disassociation or hopelessness…The evolutionary purpose of fear is to jolt one into an act of self-preservation but one cannot live in a constant state of psychic emergency…We have to live in a space between ‘everything will be OK’ and ‘everything is fucked.’”