Our Man in Hong Kong is Tim Wiseman, an American, a professional man who works for a multinational company. Two years ago, after happily living and working in Colorado Springs for years, he accepted an overseas assignment with his firm in Asia. A self-described Bumpkin, he began writing an email journal of his experiences, exploits, trials, and triumphs (surely there were some) as a stranger in a strange land. Here is one of his humorous and informative Bumpkin Reports, presented in complete and unadulterated form. Caution: The subject matter has been known to cause gustatory and olfactory distress. You’ve been warned.
The Bumpkin Report Volume 49.5 – Durian Tasting Adventure
A Photo Adventure of Durian, The King of Fruits.
My friend Natalie at work regularly introduces me to local foods, especially fruits and vegetables. Last week she said, “Tim, it’s Durian season again. I’m taking you out for a proper hand selected fresh Durian.”
Okay Natalie. I’m ready!
So after work on Friday she takes me to the upscale market (rather like a Whole Foods). She then spends five minutes consulting with the Durian specialist to determine which of the three varieties will be least toxic for this Bumpkin. I notice the signs have descriptors like Creaminess and Richness. These are lies. The real translation is Pastiness and Stinky-ness.
After some deliberation the package is selected and we are given plastic gloves to wear because the smell is so strong.
I grab a large bottle of Snapple Raspberry Tea to wash it down and Natalie laughs at me.
We step over to the open seating tables and she proceeds to unwrap the Durian. It is in FOUR layers of plastic. Not kidding.
At this point I am breathing through my mouth to avoid smelling the fruit.
We don our gloves as Natalie tries to describe the texture of Durian. I’m a little concerned when she uses phrases like “toothpaste” and “stiff yogurt.”
What appears to be a dried-mango texture is quickly replaced by something slimy and squishy that I can feel through the gloves.
The first bite is too hard to describe. Even holding my breath, I can smell the rancid dirty-sock/garlic overtones. The texture is like a semi-spoiled banana. It’s unpleasantly pasty and somewhat stringy. Based on texture alone…I’m out.
The aftertaste lingers and lingers. It’s not as rancid as the initial bite but is still overpowering and DOES NOT go away.
I’m still breathing through my mouth because I know if I smell it, I will gag.
I try two more bites before giving up and drinking the entire Snapple.
Natalie loves it. She wraps up the leftovers to take home.
Two days later I am STILL burping up Durian. Honestly, I’ll eat 10 more fish eyes and a dozen chicken feet before I ever try this again.
Natalie captured this photo journey. Any blurry photos are because she is laughing so hard.
I don’t think Durian should be named the King of Fruits. I think it should be named the Madman Genocidal Dictator of fruits.
Many thanks to the Bumpkin, Tim Wiseman, for sharing this daring Durian tasting adventure. Look for more Bumpkin Reports in the months to come.
Here is a list of questions someone asked children about dating and marriage. It has circulated on the Internet, but it crossed my desk decades ago. I can’t be certain of its provenance, but it was attributed to one Curtis Singmaster of The Salisbury School in Salisbury, Connecticut. If that’s true, then thanks to Curtis for the hilarious responses children gave to these questions.
Question: How do you know whom to marry?
Answer: You’ve got to find somebody who likes the same stuff. Like, if you like sports, she should like it that you like sports, and she should keep the chips and dip coming. (Alan, age 10)
Answer: No person really decides before they grow up who they’re going to marry. God decides it all way before, and you get to find out later who you’re stuck with. (Kristen, age 10)
Question: What is the right age to get married?
Answer: Twenty-three is the best age because you’ve known the person FOREVER by then. (Camille, age 10)
Question: Can a stranger tell if two people are married?
Answer: You might have to guess based on whether they seem to be yelling at the same kids. (Derrick, age 8)
Question: What do you think your mom and dad have in common?
Answer: Both don’t want any more kids. (Lori, age 8)
Question: What do most people do on a date?
Answer: Dates are for having fun, and people should use them to get to know each other. Even boys have something to say if you listen long enough. (Lynnette, age 8)
Answer: On the first date, they just tell each other lies, and that usually gets them interested enough to go for a second date. (Martin, age 10)
Question: What would you do on a first date that was turning sour?
Answer: I’d run home and play dead. The next day I would call all the newspapers and make sure they wrote about me in all the dead columns. (Craig, age 9)
Question: When is it okay to kiss someone?
Answer: When they’re rich. (Pam, age 7)
Answer: The law says you have to be eighteen, so I wouldn’t want to mess with that. (Curt, age 7)
Answer: The rule goes like this: If you kiss someone, then you should marry them and have kids with them. It’s the right thing to do. (Howard, age 8)
Question: Is it better to be married or single?
Answer: It’s better for girls to be single but not for boys. Boys need someone to clean up after them. (Anita, age 9)
Question: How would the world be different if people didn’t get married?
Answer: There sure would be a lot of kids to explain, wouldn’t there? (Kelvin, age 8)
Question: How would you make a marriage work?
Answer: Tell your wife that she looks pretty even if she looks like a dump truck. (Ricky, age 10)
The late Professor Joseph Campbell said, “As you proceed through life, following your own path, birds will shit on you. Don’t bother to brush it off. Getting a comedic view of your situation gives you spiritual distance. Having a sense of humor saves you.”
If you see something humorous, email it to me and I’ll consider adding it to the Reflections.blog. Email it to firstname.lastname@example.org.
Photo credits: (1) Boy and girl outside: Hwahl3 at Dreamstime. (2) Boy and girl; girl pushing up nose: Ron Sumners at Dreamstime. (3) Child offering kiss: Suzanne Tucker at Dreamstime.
The progress toward social liberalism may have been impeded since Trump took office, but it is inevitable, despite those who oppose it. Had Rush Limbaugh been on the radio in 1918, we can imagine him loudly denouncing the women’s suffrage movement. “Women should not be allowed to vote,” he might have argued, “because their role is in the home. The fairer sex is concerned with children and local issues. They don’t understand the loftier matters of state, government, war, or politics, nor should we ask them to.” He would have been joined in his opposition by Sean Hannity, Tucker Carlson, Steve Bannon, Laura Ingraham, and other firebrand conservatives who would use their media platforms to impede social progress.
Denying women the right to vote would be incomprehensible in 21st century America, but until 1920, when the Nineteenth Amendment to the U.S. Constitution was ratified, women’s suffrage was a hot social issue in this country, and many conservatives—some waving their Bibles and quoting scripture—vehemently opposed giving women the right to vote. Prior to 1920, the opposition to women’s suffrage was as strident and vocal as Donald Trump ranting about immigrants at one of his rallies. The Antis, as opponents of women’s suffrage were known, offered these kinds of arguments:
Only a minority of women want the vote. The majority are happy not having it.
Doubling the number of voters would lead to more corrupt voting practices.
Women have been making advances without having the vote; therefore, they don’t need it.
Women should focus on what suits them: education, reform, and charity.
Women already have important roles in society. Giving them the vote would force them to sacrifice their higher interests, namely the family.
Because women are excused from public service requiring the use of force (e.g., police, military), they would be irresponsible voters.
Despite these lame arguments, the Nineteenth Amendment passed. Women have now had the vote in this country for nearly a century, yet as recently as 2015 conservative extremist Ann Coulter said on the radio program “Free Speech” with fellow conservative Gavin McInnes that women should not have the right to vote. Coulter argued that if women were not able to vote, “We’d never have to worry about another Democrat president.” In Coulter’s opinion, women, especially single women, “are voting stupidly.” If their right to vote were taken away, these women “would finally be silenced.”
American society, like all societies, will always have people who oppose social change. It upsets the status quo. It challenges conservatives’ value system and threatens their identity. It enables debate that may threaten the legitimacy of their ideology. It frightens them because they don’t want to lose or share the privileges or special status they and their kind have enjoyed. Allowing any social change may open the door to even more social changes, and they worry that the world they are comfortable with will end. In the worst of cases, they are not just concerned with protecting their own social status, they actively seek to deny others the opportunity to enjoy equal status.
Some opponents of social change find moral justification for their opposition in the Bible, the Koran, or another religious text. They invoke God because no one can envision a higher authority. If the Creator of the Universe opposes women’s suffrage, or racial equality, or gay marriage, their reasoning goes, then who are we to challenge God’s will? These arguments are spurious and illogical on numerous grounds, but when you have no better arguments to make, invoking the name of God may be, if not your best option, then your option of last resort.
Social changes do not come easily. Many people advocated women’s suffrage well before the mid-nineteenth century, but the movement’s formal origin can be traced to the Seneca Falls Convention in 1848. However, the movement did not achieve its objective until 1920—seventy-two years later. The struggle for racial equality has lasted even longer, and many would argue that it still has not been achieved. Unquestionably, racial prejudice remains to a greater or lesser degree in some Americans’ hearts and minds, but even in the relatively short span of my lifetime, I have seen considerable change in how African Americans are perceived in this country and treated by those in mainstream society. The full extent of Martin Luther King’s dream may not yet have been achieved, but considerable progress has been made since the 1960s, and today it would be as unthinkable to return to segregation and Jim Crow laws as it would be to return to slavery.
The women’s suffrage movement and the struggle for racial equality illustrate my principal thesis: that social liberalism is inevitable, although not without struggle, sacrifice, and fierce, highly opinionated resistance. Social change does not come easily—but it comes. Social conservatives initially resist change—and some will furiously, even violently resist it—but social progress is inevitable. Like the arrow of time, it moves in only one direction if we view progress on the scale of decades and centuries rather than years. Reversals will occur, as they are occurring now under the Trump administration, and societies may backslide now and then, but as a macro view of human history shows, in the long run the arrow of social progress does not reverse. Rights and freedoms gained can be taken away—but when that happens, as it did in Cambodia under Pol Pot, the setback lasts for a relatively brief period in the longer scale of human history before rights and freedoms are restored and social progress continues. The pace of progress varies by culture, and one might argue that progress will never occur in North Korea and other repressive countries, but I would counter that we have to view progress on a macro scale. Social progress will come to North Korea and Syria and countries like them, though perhaps not in our lifetimes.
LGBT Rights and Gay Marriage
In Trump’s America, the battle lines between social conservatism and social liberalism are currently drawn around LGBT rights and gay marriage. Our society is somewhere in mid-struggle with these issues, but progress continues. The acceptance of gay and lesbian relationships has been steadily increasing over the past 25 years. In 1991, for instance, more than two-thirds of Americans said that sexual relations between two adults of the same sex were always wrong. By 2008, just over one-half of Americans had the same opinion. In a 2013 PEW survey, 60 percent of Americans believed that homosexuality should be accepted, and that number has since risen to 63 percent. Attitudes about homosexuality began changing around 1970, and acceptance has trended upward since.
Attitudes about LGBT people vary considerably from culture to culture. Acceptance is greater in more affluent and less religious societies. Acceptance also varies by generation. Millennials are more accepting of homosexuality than baby boomers, and the generation following millennials is likely to be even more tolerant. Younger people are more exposed to diversity, particularly in school and through social media, television, and films, where favorable depictions of gays are now more commonplace. Despite Trump’s on-again, off-again support for LGBT people serving in the military and the reactionary attitudes of his followers, acceptance of homosexuality has become more mainstream. It’s difficult to envision social forces that could reverse that trend.
Likewise, attitudes toward gay marriage have evolved considerably since 2001, when 57 percent of Americans opposed it. By 2017, only 32 percent opposed it while 62 percent were supportive. Support for gay marriage is highest among people unaffiliated with a religion, slightly lower among white Protestants and Catholics, and lowest among white evangelical Protestants. As expected, more than 70 percent of Democrats and independents favor gay marriage, while only 40 percent of Republicans are supportive (although that number continues to rise).
In 2015, the Supreme Court ruled in a 5-4 decision that gay marriage was legal in all fifty states. When that ruling was made, thirty-seven states and the District of Columbia had already legalized gay marriage, but the issue remained contentious, and several Republican contenders for the 2016 presidential nomination, among them Ted Cruz of Texas and Scott Walker of Wisconsin, favored a constitutional amendment banning gay marriage. It is conceivable that a groundswell of conservatism could result in a reversal of this Supreme Court ruling now that we have a more conservative court, but public support for gay marriage is strongest among younger generations, and all those young voters are likely to continue supporting this socially progressive development. Despite conservative religious opposition, this social advance seems likely to have passed the point of no return, and if a more conservative Supreme Court reverses the 2015 ruling, the rising tide of support for gay marriage will eventually restore this right.
Less contentious today is the idea of interracial marriage, but half a century ago it was not only frowned upon but illegal in some states. In 1959, Richard Loving, a white man, and Mildred Jeter, who was part black and part Native American, were sentenced to one year in prison for marrying. Their marriage violated the State of Virginia’s anti-miscegenation statutes, including the Racial Integrity Act of 1924, which forbade marriage between whites and people of color. In its landmark 1967 Loving v. Virginia decision, the Supreme Court declared that race-based restrictions on marriage were unconstitutional.
In 1967, only 3 percent of married couples consisted of spouses from different races. By 2015, that number had jumped to 17 percent. In 2017, 29 percent of Asian newlyweds born outside the U.S. and 27 percent of Latino newlyweds born outside the U.S. were married to spouses of another race. For Asians and Latinos who were born in the U.S., the rates of interracial marriage were even higher: 46 percent of Asians and 39 percent of Latinos. Today, 18 percent of black newlyweds and 11 percent of white newlyweds are marrying spouses from other races.
In a PEW Research study of attitudes toward interracial couples, non-black respondents were asked if they would oppose having a close relative marry someone who is black. In 1990, 63 percent said they would oppose it; in 2016, that number had shrunk to 14 percent—a remarkable attitudinal shift in just twenty-six years. We can see evidence for this shift on television, where a number of current ads depict interracial couples. What was once taboo is becoming more commonplace. Even the majority of Republicans today say interracial marriage does not matter. By 2014, there were more than five million interracial married couples in the United States, a number that will continue to rise, as will the number of children with biracial heritage. At some point in the future, our country will have more people with mixed-race heritage than people born from parents of the same race.
Women in the Workplace and the Military
We have to take a longer view to appreciate the strides women have made in the workplace and the military. In the 19th century, women had few workplace choices and were largely confined to the homestead raising children. The Civil War gave many the opportunity to work as nurses, just as the Crimean War had given that opportunity to British women. During this era, women who worked outside the home were primarily teachers, dressmakers, and domestic servants. In the twentieth century more opportunities opened up, but it wasn’t until World War II that the dam really burst. With so many men off to war, women filled many jobs previously open only to men, including jobs in industry. My mother and her older sister worked in an airplane plant in California during the war, part of the Rosie the Riveter generation. The situation regressed a bit in the “Father Knows Best” era in the 1950s when men, returning from war, took back their old jobs and women were expected to tie their aprons back on and keep house and raise children.
Today, while women still earn less for doing the same jobs as men, the gap is slowly narrowing, and women make up an increasing percentage of the workforce. In 2013, more women entering the workforce held at least a four-year college degree than men did, and that gap is still growing. Forbes reported in 2008 that more than 11 million women were enrolled in college, compared to just over 8 million men. A PEW Research study found that in 2012, 71 percent of recent female high school graduates had enrolled in college, compared to just 61 percent of men. Today, women make up nearly half of the nation’s workforce, and about 40 percent of them are in management, leadership, or professional roles. With more women than men earning college degrees, women will eventually outnumber men in jobs requiring a higher education.
American women have always served in the military, a few in combat roles disguised as men in the Revolutionary and Civil Wars. But women were prohibited from officially serving in combat roles until 1994, although they made up nearly 14 percent of service members in all branches. Even after 1994, women’s roles in combat were restricted. However, in 2015, the Pentagon announced that women would now be allowed to serve in front-line ground combat roles. Progress in women’s equality has been steady, albeit with grudging social acceptance.
However, that acceptance demonstrates a fundamental shift in societal attitudes, and it’s difficult to imagine the circumstances that would compel our society to move in retrograde—to prohibit women from combat, to restrict the types of jobs women can hold, to reverse decades of progress, and, despite Ann Coulter, to take away women’s right to vote.
The Inevitability of Social Liberalism
In 2018, we are witnessing a resurgence of conservatism and with it a resurgence of white power hate groups, racism, and xenophobia—energized and condoned by Donald Trump and Steve Bannon (under the guise of nativists and nationalists) and their media trumpets, particularly the crews at Breitbart, Fox News, and other ultra-conservative media operations.
They are funded by a few radically conservative billionaires and their families, like the Mercers and the Hunts, who are pouring hundreds of millions of dollars into socially regressive causes and political campaigns. But no matter how loud and powerful they are, the sheer number of mainstream Americans who make choices about whom and what they’ll accept, how they’ll live, and what they’ll support forms a rising tide of social change that can be retarded but not denied.
It is possible to deny people rights they once had. We saw it in Russia during the Bolshevik Revolution and in Nazi Germany in the 1930s, and as in Russia and Germany, it usually occurs during periods of extraordinary national upheaval, when a minority that assumes power seeks to retain it by controlling and restricting the masses. But social attitudes and values, once planted, can be suppressed for only so long. Eventually, societies evolve beyond the temporary denials of rights once held—a process that may also require extraordinary upheaval. Human progress moves inexorably forward, though it may span more than a few lifetimes.
Social regressionists like Trump, Bannon, Limbaugh, Hannity, Carlson, Coulter, and Ingraham may work hard to reverse social progress. They may achieve some successes along the way, but ultimately they are like pimples on the ass of progress because, as I have shown in this discussion of women’s progress, racial equity, and LGBT rights, social liberalism is inevitable. It occurs as successive generations become more accepting of diversity and more willing to extend equal rights to all people, regardless of gender, race, religious preference, or sexual orientation. It occurs when the bastions of the Old Order crumble as the hatred and privilege that support them are exposed and eroded. It occurs as the edifice of intolerance is crushed by the weight of tectonic demographic shifts that the powerful are powerless to resist.
Sources: U.S. Department of Labor, PEW Research Center, NORC at the University of Chicago, The Williams Institute, Madamenoire, and Forbes magazine.
In the opening of George Romero’s cult classic Night of the Living Dead (1968), a young man and woman are driving through a desolate countryside to visit a cemetery. As Barbara and her brother Johnny are laying a cross wreath on their father’s grave, they are stalked by an old man who lurches awkwardly toward them, a twisted look on his face. After the man attacks Barbara, Johnny tries to defend her and is killed. Then the maniac chases Barbara to an old farmhouse, where she finds refuge with another man, Ben, who kills several zombies before boarding up the house and trying to save them from a growing horde of the undead that is surrounding the house and trying to break in. Later, when a mass of zombies attacks the house, Barbara is killed by her dead brother.
Night of the Living Dead embodies one of the darkest themes in human pop culture—the living pursued by the dead, isolated and nearly defenseless against an unreasoning enemy we don’t understand whose sole aim is to kill us or devour our flesh. Ghouls, vampires, werewolves, demons, zombies, and the gigantic nuclear creatures of the 1950’s—in our collective nightmares, they are the instruments of our destruction, monsters who attack without provocation, as powerful and soulless as they are uncompromising and devoid of humanity. Among our fears, we fear the unfamiliar and the irrational. We fear losing our freedom and our sense of well-being. We fear being pursued by forces stronger than ourselves. We fear not knowing where safety lies. In short, we fear threats we do not understand and cannot control.
In the dark of night when we are most vulnerable, these fears manifest themselves in our nightmares, which are stories our subconscious creates to bring our fears into awareness, to let us play them out in frightening scenarios that evaporate when we awaken. Most of us have had the standard nightmares—being naked in public, being trapped or lost, being late for something important, being betrayed by a loved one, falling or being injured, or failing a test (popular among student dreamers). These nightmare themes are universal and probably common to all eras, but the stuff of nightmares also reflects the age in which we live. The monsters each age imagines, the monsters that gain broad social currency, are the archetypes for what the people of the age most fear. In the Eighteenth Century, the vampire archetype emerged in part from a clash between heathens and Christians in which the blood-sucking ghoul emerges from the grave to try to steal the Christian maiden’s soul.[i] Vampires also had their genesis in the fear of premature burial, which was a common fear at that time. The dead were supposed to remain dead, not reanimate, perhaps in some putrefied form, which would scare the dickens out of anyone. The religious theme is largely lost in Bram Stoker’s landmark novel, Dracula (1897), which appeared at the end of the Victorian era, but his novel depicts the erotic overtones that are central to Victorian angst—the fear of unwelcome seduction. As cast here, and in the 1931 film starring Bela Lugosi,
Count Dracula is a suave European aristocrat who steals into a young woman’s bedroom while she is asleep and defenseless and penetrates her body with his long, sharp teeth (phallic image intended). He deposits a fluid (saliva) in her, which causes her body and soul to change. What Dracula represents was anathema to the early Victorians (circa 1840), who valued proper behavior and social advancement, but the later Victorians (Stoker’s audience) rebelled against conservative social norms, and Dracula was, in a sense, their guilty pleasure—embodying both their fear of seduction and, at the daring end of that era, the shameless pursuit of it.[ii]
The early-to-mid Nineteenth Century was the Romantic period in the arts, an era of unbridled emotional expression, and among the prominent writers of that age was Edgar Allan Poe, not the inventor of the horror story but one of its most eminent practitioners. However, the High Priestess of horror in the Romantic era was Mary Shelley, author of Frankenstein; or, The Modern Prometheus (1818). In Greek mythology Prometheus was the creator of mankind. He stole fire from Mount Olympus and gave it to humanity, and for that and other transgressions he was punished by Zeus. In Shelley’s tale, Dr. Frankenstein uses electricity to give life to a manlike Creature fashioned from dead human flesh. Frankenstein’s crime was to usurp God’s power, as Prometheus had done, and the Creature he animated became a murderer and an outcast disgusted by his own image. What’s notable about Frankenstein is that it paralleled the rise of modern medicine, humankind’s growing mastery of nature through science, in which physicians could heal the sick through increasingly scientific methods and bring the dead back to life through, among other means, electricity (e.g., the defibrillator). Advances in medical science during the past two centuries have brought physicians closer and closer to the role of Prometheus.
Frankenstein represents our fear of scientific advancement without moral restraint, and this fear really blew up (pun intended) after the development of the atomic bomb and its use at the end of World War II. In the 1950’s, the adolescence of the nuclear age, our universal nightmare was our fear of the unintended consequences of nuclear power. We imagined that radiation from nuclear testing would genetically alter creatures we had little reason to fear otherwise and turn them into monsters, including ants (Them!, 1954), crabs (Attack of the Crab Monsters, 1957), spiders (Tarantula, 1955), snails (The Monster that Challenged the World, 1957), octopi (It Came From Beneath the Sea, 1955), prehistoric beasts from the ocean (Godzilla, 1954), and many other species in films about irradiated nature grown large, vicious, and deadly.
The monsters we conjure reflect our fears. Werewolves emerged from our fear of being contaminated by nature and losing our humanity in the process. Specifically, it’s our fear of animal-borne diseases like rabies, of being bitten by infected animals and becoming rabid and uncontrollably vicious ourselves. But in a larger sense, it’s our fear of losing possession of our higher faculties and having our baser passions and instincts unleashed. Having been bitten or infected, we lose ourselves and become monsters, Jekylls turned permanently into Hydes. At the mildest level, this fear is revealed in our concern with athletes using performance-enhancing drugs, making them stronger and more powerful than they should be. At the more extreme level, we see it in our paranoia about genetically altered crops, cloning (especially of humans), and the prospect of doctors using genetic manipulation to create designer babies. As with Frankenstein, we fear science run amok as well as nature’s ability to strip us of our humanity. Mummies (The Mummy, 1932) illustrate our fear of digging up the past and disturbing the dead, of not leaving well enough alone, while invisible beings (The Invisible Man, 1933; The Entity, 1982; and Poltergeist, 1982) portray our fear of the unknown and unseen, both externally (malevolent spirits) and internally (the beast within—depicted best in 1956’s sci-fi classic, Forbidden Planet, but also in 2000’s Hollow Man).
The latest manifestation of our collective fear is the zombie—a mindless, infected, rotting ghoul whose sole compulsion is to kill and eat living people. Zombies have their origins in Africa as far back as the 1600’s and later in Haiti, where zombies were usually depicted as mindless drones, virtual slaves who, while grotesque, were not always aggressive. With Night of the Living Dead, however, zombies became flesh-eating ghouls, which is the characterization of them in almost every subsequent zombie film and story. The zombie as a monster archetype is not new, but zombies have rapidly grown in popularity and significance in the past fifteen years. As Michael J. Totten noted in a Dallas News article, “It’s probably no coincidence that the zombie craze began barely a year after the Sept. 11, 2001, terrorist attacks, with Danny Boyle’s hit film 28 Days Later. . . . The fascination with the zombie apocalypse, I believe, is a cultural reflection of the new age of anxiety that opened on 9/11, with its fear of social collapse.”[iii]
Our world changed instantly and dramatically with the terrorist attacks on 9/11 in much the same way our parents’ and grandparents’ world changed overnight with the Japanese attack on Pearl Harbor in 1941. Our angst about these attacks is not just that we were attacked and suffered horrendous loss of life but that our collective sense of safety and security could vanish so suddenly. In an instant, our world was more dangerous and threatening, our government weaker and less able to protect us, and our enemies more vicious and determined. What the zombie apocalypse represents, then, is an almost total collapse of the safe and secure world most of us have known and expect to continue. If a Walking Dead–type of zombie apocalypse ever actually happened, the world we know would collapse. We would lose not only the basic social, political, and economic means by which we live our lives—government, law and order, borders, banks and the monetary system, communication, commerce (and the resupply of basic goods), social and infrastructure services, access to resources, jobs and opportunities, and so on—but also our psychological safety net and the fabric of social discourse—our shared values, the principles of civilized behavior we normally adhere to, and the trust we have in others to treat us with equanimity and respect. As Totten observes, in the post-apocalyptic dystopia that would follow the collapse of society, we would confront not only a zombie horde but also “bands of desperate and sometimes predatory survivors competing with one another for dwindling supplies of food, ammunition, and defensible shelter. Everyone left alive [would learn] that distrust is essential.”[iv] Following a zombie apocalypse, every survivor would be reduced to seeking and competing for the lowest levels of Maslow’s Hierarchy of Needs—water, food, shelter, sleep, clothing, and physical safety.
In zombie books and films of the past few decades, people become zombies through a variety of causes: a biological contagion (28 Days Later, 2002, and World War Z, 2013), an irradiated space probe (Night of the Living Dead, 1968), a manufactured virus set loose by sabotage (Resident Evil, 2002), a comet (Night of the Comet, 1984), and an unknown pathogen (the wildly popular series The Walking Dead, which opened in 2010 and is now in its seventh season). Once transformed, the new zombies literally lose themselves. They have no volition, no free will, no family or interpersonal bonds, no capacity to make conscious choices, and no ability to love—or hate for that matter. Lacking reason and emotion, they become mindless creatures driven by base instincts, human in form only. It is this complete loss of self that is most terrifying about the prospect of being attacked and overrun by the walking dead.
Fighting zombies would also be terrifying because zombies are utterly unlike traditional human foes. In familiar human conflicts killing is a means to an end, not the end itself. The enemy wants to conquer territory or prevent you from conquering territory, or they want your resources, or they want to weaken you so you can’t oppose them elsewhere. But zombies have no goals or motives except to kill or eat you or convert you by biting you and turning you into one of them. In traditional conflicts, the enemy can sometimes be reasoned with. They may see that further conflict is pointless and seek a truce, or they may be willing to negotiate, or, seeing that they are about to be annihilated, they may simply surrender. But zombies are mindless and cannot be reasoned with. They are hell-bent on your destruction, and the only way to stop them is to kill them. They murder indiscriminately, and their methods of murder are primitive. Alarmingly, the zombie who
tries to kill you may recently have been a friend or neighbor, someone you knew and liked and trusted. Even worse, it could be someone you loved—your brother or sister, son or daughter, your spouse or your parents.
Does any of this sound familiar? We’ve been seeing it on the evening news and reading about it in newspapers for years. A man or woman who appeared to be an ordinary family member, neighbor, colleague, or friend suddenly opens fire on a crowd of strangers or sets off bombs in a nightclub or subway or at a sporting event? An invading army that murders relentlessly in the most terrifying ways? A horde of killers overrunning territory, leaving mass casualties in its wake? People in those stricken territories having to convert or be killed? Cities rendered uninhabitable? Monuments and historic treasures looted or destroyed? The zombies envisioned in Night of the Living Dead and The Walking Dead are a physiological impossibility—you can’t reanimate decomposing flesh—but ISIS is very real and very deadly.
The so-called Islamic State is the most virulent and horrendous collection of murderers the world has seen since the Nazis operated Auschwitz and the Japanese invaded Nanking. Like the zombie hordes, ISIS murders people without remorse or moral restraint. They have crucified people, burned and buried people alive (including children), conducted mass beheadings, planted people’s severed heads on spikes, televised executions, raped captured women, sold women and children as sex slaves, and given captured people the choice of either converting to their perverted form of Islam or being killed. In their wanton disregard for human life and contempt for anyone who does not share their beliefs, ISIS has demonstrated that they want not only to conquer other people but to consume them as well. Their intent, they say, is to extend their caliphate throughout the Arab world and then to all other countries. They want to turn the entire world into an Islamic state governed by their strict interpretation of Sharia law. You only have to look at images of the cities they’ve destroyed, the museums and irreplaceable art they’ve demolished, and the mass graves of their victims to glimpse what a horrifying prospect that would be.
In an article entitled “What Makes Zombies So Frightening?” Andrew Bloom observed that “The survivors in a zombie film are not simply waiting for the storm to pass so things can return to normal. They’re trying to figure out what kind of life they can have in a world where they’re under a constant, mortal threat.”[v] We have seen this existential despair in the breakdown of society in territories ISIS has conquered. The people ISIS hasn’t murdered during its invasions live in constant fear. Some are executed immediately after an ISIS takeover. Some join ISIS, just as some people overrun by zombies are bitten and become zombies themselves. Others are raped and sold into slavery. Some huddle in their homes, not knowing when death will come for them, trying to make sense of the new order outside their homes, fearful of appearing at odds with the new governors of their lives, living in a time bomb without knowing how much time remains. Those who can, flee, leaving behind their possessions and their former lives, including some family members. It is difficult, in fact, to look at fictional films of areas overrun by zombies and see much difference between them and news videos of areas overrun by ISIS. The degrees of horror and social destruction are practically indistinguishable.
Later in his article, Bloom makes the following observation: “More frightening than the fall of society is the fall of the self. Zombies, with their insatiable hunger and single-minded focus, represent the idea of our base instincts taking over our better natures. They are, in fact, pure instinct, a tribute to and a caution against the mindless beast lurking beneath the surface.”[vi] It goes without saying that ISIS’s brutal brand of murder, rape, and sexual slavery and their videotaping and broadcasting of executions rank among the most vile acts human beings can commit. Furthermore, one of ISIS’s most disturbing weapons is the suicide bomber, which they’ve employed in Iraq and Syria in abundance and in other parts of the world through lone wolf attacks (e.g., San Bernardino, Nice, and Orlando) and suicide teams like the ones that staged attacks in Paris, Brussels, Istanbul, and Dhaka. Throughout human history, soldiers have known they may die once they enter a conflict, but that doesn’t negate their instinct for self-preservation. Indeed, throughout the animal kingdom the instinct for self-preservation is a primal driver. By brainwashing young people into wearing suicide vests and blowing themselves up in crowds, or by exploiting individuals who are vulnerable, isolated, lonely, or mentally ill, or by coercing them into sacrificing their lives, ISIS has, I would argue, effectively stolen their suicide bombers’ selves. Once these disturbed individuals have followed orders and committed themselves to that self-destructive course of action, they have lost their freedom and volition, denied their normal human hopes and desires, and sacrificed themselves in the ultimate self-denying act. In effect, they have become zombies, more of the walking dead. Moreover, since Islamic law prohibits suicide, they have committed a major sin against God, something their ISIS handlers conveniently ignore.
ISIS is a plague on humanity. It is a manifestation of humankind’s darkest and basest nature, the beast within that we have striven for thousands of years to subjugate and control through civilization, humanity, laws, mutual respect,[vii] and faith in a Higher Power. We cannot negotiate with ISIS or find compromise with them, just as we could not negotiate with zombies if they were a real threat. The only way to save ourselves from the ISIS horde is to kill them, and that takes its own toll because, as Friedrich Nietzsche warned, “Beware that, when fighting monsters, you yourself do not become a monster . . . for when you gaze long into the abyss, the abyss gazes also into you.” Our struggle in this post-apocalyptic world is to maintain our humanity and not descend into the inhuman madness ISIS inhabits. “Decency is still possible,” Michael Totten writes regarding the survivors’ struggles in The Walking Dead, “but ruthlessness is needed as well.”[viii]
The zombie archetype is the monster of our age. In our nightmares, the zombie apocalypse dramatized by numerous zombie films and by AMC’s hit series The Walking Dead represents the total breakdown of society and humankind’s descent into the darkest pits of our nature. It represents the awful struggle we would have if our society suddenly disintegrated and the challenges we would face not only to survive but to maintain our humanity and our moral center. Zombies would be a horrific enemy and threat to our existence. But we don’t need Hollywood to conjure the monsters of a zombie apocalypse. They’re already here.[ix]
[i] This is why in most vampire stories and films the vampire is afraid of and recoils from the Christian cross.
[ii] The Victorian era saw the invention and rise of photography and, at the end of the era, motion pictures. Not surprisingly, many of the tintypes, daguerreotypes, stereoscopes, and moving pictures produced around the turn of the century featured nudes or partially clothed models, sexual situations, and mild porn.
[vii] The Golden Rule is a simple and longstanding example of a moral principle we have adopted to live with each other peacefully. Indeed, the Golden Rule, which is the foundation of mutual respect, is one of the moral bases of civilization. In virtually everything ISIS does, they violate this fundamental principle of civilization.
[viii] Totten, Ibid. Some people have argued that our use of unmanned drones and the consequent collateral damage of innocent civilians who are killed by drone strikes is inhumane and a violation of the laws of war. Drone supporters argue that it’s a safer way for us to conduct this war on terror and that if we had boots on the ground instead of drones civilian casualties would still occur. This ideological conflict reflects the tension between our desire to remain decent, on the one hand, and the need to be ruthless, on the other.
[ix] As I write this, ISIS’s second-in-command, Abu Mohammad al-Adnani, has reportedly been killed in a drone strike. Al-Adnani was one of ISIS’s chief strategists and spokesmen, and there is some thought that, with his death and ISIS’s recent defeats on the battlefield, the Islamic State will decline and could soon be defeated and dissolved. But this is unlikely. The conditions that brought about ISIS have not changed, and ISIS still has a lot of money and supporters. The group is likely to be more resilient than people expect and will remain a plague on humanity for years to come.
When I was a boy not much older than Ralphie in A Christmas Story, one of my prized possessions was a Daisy Red Ryder BB gun. My father came from a rural family where guns were commonplace for hunting, and it was assumed I would follow in that tradition. The BB gun was my apprenticeship for the shotgun I was to inherit from my grandfather when I got old enough. As the oldest child, I had my own bedroom, and that room had a large closet with a slanted ceiling. It served as my target range. I had dozens of plastic toy soldiers, and I arranged them in menacing formations on the floor of my closet and spent hours shooting them down. The closet was handy because most of the BBs I shot stayed in there, so it was easy to gather them and reload after an hour of target practice.
Over weeks and months of happily mowing down toy soldiers, I became so proficient that I never missed. So I began hiding them behind obstacles—shoes, wadded-up socks, my sister’s wood blocks, anything that would partially conceal the enemy. At first, their torsos were exposed. Then that became too easy, so I started concealing all but their heads. And after a while I never missed those shots either. My brother had some hollow plastic dinosaurs that were open at their bellies. I would place them around my bedroom at various angles to my line of sight. If I shot the dinosaurs on their sides they would be knocked backwards, but that wasn’t much fun. So I learned to angle my shots off the floor and topple the beasts as BBs ricocheted up underneath them.
As a kid, I learned how to hold a rifle, steady myself, control my breathing, and squeeze the trigger without losing my aim. I became an expert marksman. With a BB gun.
Then I joined the Army. I already knew how to shoot, but firing real rifles was different, of course. Real rifles kicked when you pulled the trigger. I had to learn to hold the butt of my M-14 into my shoulder so it wouldn’t bruise me when it fired, and I had to learn to squeeze the trigger without flinching at the explosive bang when the primer ignited the propellant and shot the bullet through the barrel. If you don’t anticipate that sudden bang, you’ll jerk the trigger and miss your target. Learning to fire a real rifle expertly took hours of training and experience, and I became proficient with that weapon, too.
But no amount of training can prepare you for what happens in combat, which I experienced as a member of the 101st Airborne Division in Vietnam. When the shooting starts, your adrenaline spikes and you are hyper-alert, but initially you don’t know what’s going on. Everything is chaotic, especially at night. Unless you initiated the shooting, for a few moments you don’t know who’s firing or where the shots are coming from. Getting your bearings is tough because you’re frightened—at the sudden noise, at the chaos around you, at the bullets flying around, and at the real prospect that you may not survive this. Combat is frightening enough when your side is prevailing, but it’s incredibly worse when the enemy is—when guys around you are getting hit, when your lines are being overrun, and when explosions from rockets and grenades are added to rifle fire. Being a good shot doesn’t matter much in the chaos of a firefight, especially with sudden, in-your-face, close-in fighting, because you don’t have time to aim your rifle, steady yourself, control your breathing, and squeeze the trigger. No. You put your rifle on automatic and spray bullets in the direction of the enemy hoping to kill or wound your adversary before he kills you.
Actual combat is mind-numbing and scary as hell. Don’t let the Rambos or superheroes on the movie screen convince you otherwise. Everyone is scared in combat unless they’re insane or numb to combat because they’ve experienced it too much (which is the same as being insane).
I’m writing this article after the Parkland, Florida, shooting at Marjory Stoneman Douglas High School, where fourteen students and three teachers were killed by a troubled former student with an AR-15-type assault rifle. President Trump said afterwards that if he had been at the scene he would have rushed into the school even if he didn’t have a weapon. That bit of bravado was spoken like someone who’s never faced that situation and never been in combat. Self-proclaimed heroes like Trump are absolutely and completely clueless.
Trump also said that if some of the teachers there had been armed they would have “shot the hell out of that kid.” Don’t bet on it, Donald. There were armed deputies outside Stoneman, and they didn’t confront the gunman. You think Mrs. Pratt, who teaches freshman English, is going to rush into a hallway Rambo-style and take out the bad guy? Ain’t no way, hot shot.
Hardcore gun nuts believe that the solution to gun violence is to arm everyone. One right-wing spokesman on CNN said that you never hear of bad guys shooting up pawn shops or biker bars because the patrons and operators of those places would shoot back. Their solution to school gun violence is the same as Donald Trump’s: arm the teachers.
Imagine a society where nearly everyone is packing. A lunatic walks into a school or a mall or a movie theatre and starts shooting. Then a half dozen people in the vicinity pull out their handguns and start firing at the shooter—or at the person they think started shooting first. But when seven people are firing weapons, how will any of them know who was the original bad guy? It’s as likely that some of those well-intentioned good Samaritans will shoot each other. Moreover, with all those bullets flying around some innocent bystanders will be shot and killed. In the chaos of armed conflict, well-intentioned people will not know what’s going on, will be bewildered and terrified, and are likely to make bad decisions that kill innocent people.
In the combat situations I experienced, I was working with men who were trained to fire weapons and knew what to expect when the shooting started. But even in that circumstance the outcomes are not always what you expect. After I rotated out of Vietnam, a captain in my unit who took my place leading ambush patrols was killed by one of his own men who mistook him for an enemy soldier in the dark of night. In the fog of battle, even well-trained people make mistakes. Friendly fire deaths happen routinely in wars—even though the combatants are all soldiers and presumably know what they’re doing. Does anyone believe that mistakes wouldn’t happen in a high school when teachers start firing weapons and students are scrambling for safety through their classrooms and hallways?
You can’t be well-trained enough to always avoid accidents when the shooting starts. In the chaos of a firefight, in the noise and confusion of a live shooter incident, it’s difficult enough to figure out what’s happening, to control your adrenalin and fear, to know which way to run, and to act responsibly with a firearm, especially if you’re not a professional—a soldier, a police officer, a SWAT team member, a Navy Seal.
Arming teachers is only a solution to people like Donald Trump and his NRA buddies who have cinematic fantasies of armed conflict and imagine Everyman and Everywoman are Rambos-in-waiting. Recently, Republican Representative Ralph Norman from South Carolina held a town hall meeting in a diner and pulled out a loaded Smith & Wesson .38-caliber handgun while talking to constituents. He explained himself by saying, “Given the scenario that if someone had walked into that diner and began to fire a weapon, I told them I would be able to defend myself and them as well.” “I’m not going to be a Gabby Giffords,” he added. No, Ralph, you certainly are not. Nor are you Rambo.
I know how to shoot. I learned how in my bedroom shooting toy soldiers with a BB gun, and I re-learned it for real in Vietnam. In firefights I was simultaneously confused, determined, terrified, anxious, excited, cowardly, and brave—all in the space of minutes that passed like hours—and I never want to do it again. I want my grandchildren to be safe in their schools, and my wife safe at work, and myself safe at the movies, and all of us safe in our homes. But if the price of that safety is arming everyone, if it means carrying a concealed weapon myself, if it means risking accidental death because some well-intentioned souls around me start firing at the presumed bad guy and hit me or my loved ones by mistake, then I’d rather pass.
Let’s do thorough background checks before someone can purchase a weapon, including at gun shows. Let’s require gun safety courses for every gun owner. Let’s prevent the troubled, the mentally ill, and the unstable from owning weapons. Let’s outlaw bump stocks and high-capacity magazines. Let’s require the police and FBI to investigate dangerous postings on social media. Let’s prevent anyone under 21 from owning an assault rifle. Let’s treat schools like the safe havens they should be and make sure they’re secure by controlling access and having well-trained security officers on duty. Let’s end the insanity of living in a country where mass shootings are commonplace and children are murdered in their classrooms. But let’s not arm Mrs. Pratt and her fellow teachers, Donald. That’s not the solution.
In 1978, a neo-Nazi group led by white supremacist Frank Collin planned a march through Skokie, Illinois, a suburb of Chicago in which more than sixteen percent of the population were Jewish Holocaust survivors. Their plan met with widespread condemnation, and the City of Skokie tried to prevent the Nazis from marching. In a highly controversial move, the American Civil Liberties Union (ACLU) sided with the Nazis, arguing that they had a First Amendment right to freedom of speech and freedom of assembly. In the legal fight that followed, the Illinois Supreme Court, the United States Court of Appeals, and the United States Supreme Court agreed that Skokie could not prevent the Nazis from marching, that doing so would violate their First Amendment rights.
More recently, white nationalist Richard Spencer was denied permission to speak at Ohio State University and Penn State University, and his attempt to speak at the University of Florida was disrupted by demonstrators protesting his racist views. In April 2017, conservative provocateur Ann Coulter had a planned speech cancelled at the University of California-Berkeley when the campus became concerned about public safety. As Berkeley Chancellor Nicholas Dirks explained, “This is a university, not a battlefield. We must make every effort to hold events at a time and location that maximizes the chances that First Amendment rights can be successfully exercised and that community members can be protected. While our commitment to freedom of speech and expression remains absolute, we have an obligation to heed our police department’s assessment of how best to hold safe and successful events.”
Coulter used the occasion to skewer the left: “It’s sickening when a radical thuggish institution like Berkeley can so easily snuff out the cherished American right to free speech.” Similarly, when protesters disrupted Richard Spencer’s speech at the University of Florida, he shouted, “You are trying to stifle our free speech.” Spencer, president of the National Policy Institute, an alt-right think tank that promotes white supremacy, supports President Trump and has argued for a separate all-white nation.
Radical conservatives like Spencer and Coulter have a keen grasp of marketing and know that their radical, racist views will spark protests, which will draw the media attention they seek, build their brands, help them sell more books, and generate more contributions to them and their causes. Do they care if their events are cancelled or disrupted by protesters? The jaundiced view says no. They get what they want just by causing public controversy.
Are their free speech rights really being violated? That’s an interesting question.
Freedom of Speech—and Its Limitations
In America, the Constitution guarantees us the right to freedom of speech. However, that right is not without some limitations. Freedom of speech allows you to voice your opinions. It gives you the right to burn the American flag (symbolic free speech) in protest and to kneel during the national anthem (yes, former pro quarterback Colin Kaepernick is allowed to take a knee). If you’re a student, it allows you to wear black armbands to protest a war.
However, you can’t say something that would harm others (like shouting “fire” in a crowded theater) and you can’t create or distribute obscene materials. If you’re a student, you can’t make an obscene speech at a school event, advocate the use of illegal drugs at those events, or, if you go to a private school, print articles in a student newspaper the administration does not approve of. However, public school publications have more free speech protection.
When you belong to an organization, your free speech rights depend on whether your organization is public or private. Within reason, companies can prohibit what their employees say. You could be fired, for instance, if you worked for a beauty salon and openly called a customer ugly or if you used racial slurs. Google did not violate a male engineer’s free speech rights when it fired him for writing that women were not biologically fit for roles in technology.
Public institutions like schools, police departments, and government agencies cannot restrict an employee’s free speech rights unless they determine that an employee’s remarks impact their ability to do their jobs. The FBI agent on Robert Mueller’s staff who wrote in a text message that President Trump is an idiot was exercising his free speech rights; however, Mueller removed the agent from his staff because the remark may have impacted his impartiality in investigating the Russia connection to the Trump campaign. But could that agent be fired from his job for making that remark? Probably not, although you could argue that it showed a lack of judgment and professionalism.
Social media platforms like Facebook, Twitter, Pinterest, and Instagram are private organizations, so they can legally curtail what their members post online, and they can bar individuals from being members if those individuals violate the platform’s policies or decency guidelines. Freedom of speech allows you to speak your mind, but it doesn’t guarantee you a public pulpit like Facebook. And bigots like Richard Spencer are allowed to voice their racist and inflammatory views as long as they don’t advocate or incite violence or harm to others. However, that doesn’t mean you have to listen to them. Often, the best response to someone who’s saying something you don’t agree with is to ignore them, and to exercise your right to disagree.
Free Speech in Higher Education—a Liberal Crisis
In 1994 controversial political scientist Charles Murray published a book entitled The Bell Curve in which he argued that African Americans, Latinos, women, and the poor are genetically inferior to white males. His arguments have been thoroughly debunked, but he clings to his beliefs and occasionally appears on college campuses to promote his views. In March 2017, he appeared at Middlebury College in Vermont and was met with strong student opposition. Protesting his ideas, between 100 and 150 students shouted down Murray and would not let him speak. When he moved to another room, students pulled fire alarms, and several masked protesters attacked Murray when he left the building and injured faculty interviewer Allison Stanger, who is a liberal. She had prepared tough questions for Murray but was never able to ask them.
The Murray-Middlebury incident illustrates a disturbing pattern on campuses. While saying they espouse freedom of speech, liberal students on many campuses try to shut down speakers whose views they disagree with, which is precisely contrary to free speech. Some universities have created “safe spaces,” where students can go and not be exposed to ideas that would offend them, and faculty members are asked to give “trigger warnings” if they’re about to say something in class that might upset students.
Steps like these take political correctness to an absurd extreme. College campuses should be venues where a robust exchange of ideas and viewpoints is encouraged and where people with contrary views can speak without fear of violence or recrimination. Sixty-seven students at Middlebury College were sanctioned for their behavior, but none were expelled or suspended.
No matter which side of the ideological divide you’re on, freedom of speech means that people with views opposite your own should nonetheless be allowed to speak. You don’t have to agree with them. You can question their views. You can offer evidence that contradicts them. Or you can conclude that they are fools and just ignore them. But to shout them down and not allow them to speak is a violation of their right to free speech.
We are currently going through a crisis on campuses where liberal students don’t want to hear ideas that conflict with their own. The ideas that Ann Coulter, Richard Spencer, and Charles Murray espouse are repugnant and often offensive, but in America they have the right to express their views, and the students or authorities who try to prevent them from speaking are wrong.