Four Questions: Professor's Book Probes Brainwashing, Freedom
Scott Selisker, UA assistant professor of English and author of the book "Human Programming: Brainwashing, Automatons and American Unfreedom," discusses how decades of cultural and sociological influences have shaped the ways people think — or don't think.
Many political scholars and pundits have called the 2016 election cycle the most tumultuous and hostile in recent memory.
The divide between Democrats and Republicans is wider than ever, and the divisions within these parties have become increasingly vicious. People on opposite sides of an issue struggle mightily to find common ground due in large part to a lack of trust. A common tactic used to discredit opposition viewpoints is a simple three-word phrase:
"You're being brainwashed."
The concept of brainwashing and mental manipulation has been a key component of dystopian films and novels for decades. From "The Manchurian Candidate" to "A Clockwork Orange" to "1984" to "The Hunger Games," the removal of a person's ability to think freely is among the most frequently revisited themes in popular entertainment.
Scott Selisker, an assistant professor of English at the University of Arizona, argues that these cultural and mass media influences have done more than almost any other factor to shape the current discourse surrounding terrorism, politics and foreign relations.
Selisker's new book, "Human Programming: Brainwashing, Automatons and American Unfreedom," released on Aug. 1 through University of Minnesota Press, dissects these literary, cinematic and scientific representations of the programmed mind and connects them to uniquely American concepts of freedom versus unfreedom in hopes of broadening people's understanding of why they think the way they think.
Q: How has writing a book on brainwashing changed your perspective on the current election cycle?
A: We have a long history in America of worrying that media influence, psychological manipulation or even a charismatic would-be dictator might undermine some fundamentals of the democratic process. At the same time, the ideal of democracy depends on nominally free-thinking individuals choosing their leaders thoughtfully.
This summer, I've had several occasions to recall the psychological manipulation strategies I learned about in my research. Every successful cult leader of the 1960s and 1970s, for instance, sought out disaffected people and convinced them that he alone could turn their lives around, and that everyone else was lying to them. I've also thought a lot about the rhetoric of the term "brainwashing" during this election season, where I've seen a lot of talk in the media and on social media about brainwashing, "drinking the Kool-Aid" (a term borrowed from a tragic cult suicide in 1978), Bernie Bots, sheeple and so forth.
Q: What role does the idea of "brainwashing" play in American political conversations? Does social media have an effect on how we perceive others' views?
A: The term "brainwashing" comes from the time of the Korean War, when Americans speculated about the thought reform regime in communist China, and later the techniques used on the American POWs in Korea who went on to criticize the war, and even in a few cases to renounce the U.S. and refuse to come home after the war was over. It's such an evocative term that it caught on almost immediately as a way to describe someone's views as rote, robotic or even unthinkable.
We see a lot more of this rhetoric in the new millennium, with the advent of openly partisan cable news networks, and now with the phenomenon of social media "bubbles" where users often see largely the views of those who agree with them ideologically. Many people openly mistrust those they disagree with as mindless slaves to propaganda. I'm sure many readers have seen arguments between left-wing and right-wing social media users, too, where some variant on "drinking the Kool-Aid" has been thrown around, and it usually doesn't do much to change people's minds. A trick I learned from teaching first-year composition years ago is that when you want to persuade your audience to take your own views seriously, you have to start — and sometimes it's a challenge! — by finding some common ground, some shared value, between yourself and your interlocutor.
Q: Is the concept of "Human Programming" inherently binary (freedom versus unfreedom), or are the degrees by which an individual is influenced by their own personal experiences and entertainment choices — those shades of gray — a driving force in your research?
A: I think it's a really interesting trick of perspective: We all imagine ourselves as free-thinking individuals who've arrived at our own opinions naturally, but we're quick to imagine those we deeply disagree with as unthinking dupes who are blind to the ways they've been manipulated. Of course the reality is in between, for all of us. And yes, my book is all about the ways that, both domestically and in terms of international conflicts, "freedom" and "unfreedom" have been described as much more black-and-white than they really are.
Q: Is there a relationship between a person's awareness of cultural/media influences and that person's ability to think autonomously? Or is our "programming" so deeply hard-wired into cultural and political discourse that it's impossible to differentiate autonomy from influence?
A: It's very difficult to differentiate autonomy from influence in the sphere of political opinion — are any of our ideas and opinions truly ours and ours alone? But even if it's impossible to be free from the limitations of our own perspectives, we can always choose to try to broaden our horizons, to read and take seriously the range of ideas that we have access to. That's one place that humanities and social science education comes in — these are disciplines that teach us how to evaluate sources, to think critically about our own assumptions, and to acknowledge and be intellectually generous toward opposing points of view.