April 26, 2018

Facts, Beliefs, and the Brain: How Propaganda, Ideology, and Donald Trump Inhabit the Group Mind

http://www.inquiriesjournal.com/articles/1662/facts-beliefs-and-the-brain-how-propaganda-ideology-and-donald-trump-inhabit-the-group-mind

By Anthony R. Brunello
2017, VOL. 9 NO. 10

In the human experience, political ideology and propaganda have played powerful roles in forging group identity. In the evolution of the human species, beliefs have been as powerful as facts and truths. Knowledge of this research and political reality can help us to understand contemporary politics, and why lies continue to shape political discourse, and also why populist messages are resilient, even when they are wrong.

In The Atlantic magazine (March 11, 2017), Caitlin Flanagan wrote an essay arguing that despite a blizzard of satire lampooning President Donald Trump coming from late night comics and cable television shows, the result has not (so far) diminished the faith of Trump’s base of followers. Titled “How Late Night Comedy Alienated Conservatives, Made Liberals Smug and Fueled the Rise of Trump,” the essay contended, to the contrary, that the cacophony of mocking comedy has deepened polarization and hardened President Trump’s support. Flanagan said,

“Though aimed at blue-state sophisticates, these shows are an unintended but powerful form of propaganda for conservatives. When Republicans see these harsh jokes—which echo down through morning news shows and the chattering day’s worth of viral clips, along with those of Jimmy Kimmel, Stephen Colbert and Seth Meyers—they don’t just see a handful of comics mocking them. They see HBO, Comedy Central, TBS, ABC, CBS and NBC. In other words they see exactly what Donald Trump has taught them: that the entire media landscape loathes them, their values, their family, and their religion. It is hardly a reach for them to further imagine that the legitimate news shows on these channels are run by similarly partisan players—nor is it at all illogical. No wonder so many of Trump’s followers are inclined to believe only the things that he or his spokespeople tell them directly—everyone else on the tube thinks they’re a bunch of trailer-park, Oxy-snorting half-wits who divide their time between retweeting Alex Jones fantasies and ironing Klan hoods,” (2017).


Flanagan has a point and it turns out that what she has observed might be coded into the human condition. Public criticism of the president and his followers may have the effect of “circling their wagons” and strengthening group identity in part due to human evolution. In the field of political propaganda this result is akin to watching the use of anchoring myths backfiring in sensational and unwanted ways.

Frustrated Progressives and angry liberals have been asking themselves when the 2016 Trump voters will wake up, show some “buyer’s remorse” and begin to withdraw their support for the President. Such thinking is not only perhaps unhealthy; what it anticipates is not likely to happen any time soon. What liberals believe is obvious to them will likely not be obvious at all to the Trump supporter sold on the vision of “Making America Great Again.” That slogan means many different things to different people, which is part of its virtue as a political slogan. No one needs to articulate the vision specifically as long as the slogan binds the group together in a shared sense of identity. The voters who supported President Trump will probably stick with their “man” for a long time—and many of them forever.

The author of the book The Sixth Extinction (2014), Elizabeth Kolbert, wrote an article for the New Yorker magazine in February 2017 entitled “Why Facts Don’t Change Our Minds: New Discoveries about the Human Mind Show the Limitations of Reason,” (New Yorker, February 27, 2017). In this article Kolbert explains why it is very difficult to change the minds of the ideologically convinced. From the study of the intersection between propaganda and ideology we know that of all the forms of persuasion in rhetoric and communication, the most difficult kind is known as “response changing persuasion.” Response changing persuasion involves “asking people to switch from one attitude to another…People are reluctant to change; thus to convince them to do so, the persuader has to relate the change to something in which the persuadee already believes,” (Jowett and O’Donnell, pp. 38-39). Finding the right “something” is not easy to do. One of the greatest challenges in communication is to find a way to change a person’s mind once it has been made up. The difficulty is increased when the subject is a matter of personal belief, and thus the individual wants to believe in their world view or ideology, no matter what the facts actually say. Ultimately, when it comes to political ideology, changing people’s patterns of beliefs requires skill, patience, tenacity and luck.

A Professor of Archaeology at St. Lawrence University named Peter Neal Peregrine (writing in The Conversation, February 22, 2017) noted that there exists a distinction between two common modes through which human beings determine what we call facts. In our modern times, the predominant mode of understanding “facts” has been through Science. Professor Peregrine argues [in the article titled “Seeking Truth from Alternative Facts”], for example, that the claim about the “massive and unprecedented” size of the crowd at President Trump’s Inauguration was viewed as silly by most observers because, from a scientific perspective, the claim was empirically false. Science does not employ alternative facts (Ball, 2017). Science (we believe) makes judgments based on established bodies of method, theory and logical argument. In the end, the “alternative facts” that claimed the “largest audience to ever witness an Inauguration—ever” were materially false because of what was observable and measurable. A scientific perspective helps determine the “truth” (such as we may know it) by empirical observation, measurement and scientific methods, which always maintain the prospect of falsifiability: that a theory or observation may be disproven. In his own scientific research, Professor Peregrine readily admits, for example, that sometimes two archaeologists can look at the same artifact and be uncertain whether it is a stone or some ancient tool. To make a determination, archaeologists will apply careful rules, methods and measurements according to their scientific discipline. In the end, the marshaling of material evidence will tip the decision on the “truth” one way or another—until it may be disproven.

Science versus Authority

In 2017, the Trump administration has often operated within a different and older tradition for marking what is determined to be true. This method, Peregrine suggests, in contrast to Science, is known as the argument from authority. Prior to the rise of Science in the Enlightenment and Scientific Revolution, the authority of those with power determined the nature of Truth. In the realm of propaganda, we see the importance under such circumstances of the propagation of ideas, so that the facts or the truths we accept are largely determined by what we may believe, by faith, and by what the authorities or the “Powers that Be” may tell us to believe. The rule of authority goes back to long before the Middle Ages and deep into the human experience.

The Enlightenment (the 17th and 18th centuries) gave the world Science as we know it today. The scientific method was a human creation, and it was aimed at challenging the venerable modes of judging truth, especially as they related to the natural world and the lives of people. For millennia human beings judged between competing claims of truth based upon whatever the people in power said was true. There was no separation between facts and values; Shamans, Kings, Emperors, Prophets or Popes ordained the truth. What anyone might have seen, measured or reasoned did not matter. [It is instructive to recall the story of the scientist Galileo and his struggle with the Roman Catholic Church authorities over his telescopic discoveries about the solar system. In the end Galileo’s science did win, but not without a fight]. In the book Ignorance: How It Drives Science (2012), Stuart Firestein exposed a significant distinction between Science and Authority, and that is the substantial role that ignorance plays in forging scientific endeavor and discovery. Knowing what you do not know, and then working through it, is the way science progresses. In Firestein’s view, the scientific method is only a part of the story: the marshaling of data and testing leading to facts is, by itself, simply a process that corroborates or disproves theories. Much of science is also about intuition and serendipity, and in the end, facts remain disinterested. As Firestein says, “Thoroughly conscious ignorance is…the prelude to discovery,” (p. 57). What this means is that ignorance is the inspiration of both our imaginations and our potential discoveries. In contrast, the guidance of “authority” stifles imagination in the cold hands of power and interests.

Gunther Stent’s introductory essay to James Watson’s book The Double Helix (Watson’s personal account of the discovery of the structure of DNA) is illustrative of the modes of scientific inquiry. Perhaps few discoveries have had more impact on civilization than the double helix. Stent wrote that, “Just as the Renaissance sprang from the confrontation of the Christian West with the Muslim East, so molecular biology sprang from the confrontation of genetics with biochemistry,” (p. xi). In other words, ideas and new ways of thinking are born in competition and struggles over competing forms of truth. Science does not move in a hierarchical fashion but, at its best, is a contest of ideas. In our times, the battle of Science versus Authority was a competition many in the West believed to have been largely settled. The methods for determining truth before the time of the Renaissance (1300-1700) were established on Authority, and those with social, political and economic power determined the truth. The Renaissance sparked a revolution against Authority. The story of the discovery of DNA as told by James Watson shows the interplay of human intuition, personality and empirical observation. Science evolved to explode the arguments of Authority; ultimately the truth is a matter of conscious ignorance, experimentation, measurement and proof.

The claims often made by Donald Trump and his administration about facts and reality thus have an old-world quality. Professor Peregrine asked a very sincere question: if we believe that those with alternative facts are empowered to shape the truth based simply on their authority, are we, as a civilization, moving backward in time, past the Enlightenment and into the Middle Ages? Scientific data, no matter how carefully collected and measured, do not carry much weight against arguments based on authority. A good comparison is Evolution versus Creationism. Creationists claim the Earth and all life were created by God, and their accounts of this are based on authority, and especially on their sense of the authority of religious belief. Therefore, no matter how high the biologist may pile the scientific data concerning Evolution and the science of genetics, it is next to impossible to challenge the authority of Creationism. The beliefs of Creationists may remain impervious. What a scientific view calls a false claim can be, in the eyes of the true believer, absolutely true.

True Believers

In his classic 1951 work The True Believer, Eric Hoffer explained what he called “the peculiarities common to all mass movements.” As Hoffer said:

“All mass movements generate in their adherents a readiness to die and a proclivity for united action; all of them irrespective of the doctrine they preach and the program they project, breed fanaticism, enthusiasm, fervent hope, hatred, intolerance; all of them are capable of releasing a powerful flow of activity in certain departments of life; all of them demand blind faith and single-hearted allegiance,” (Preface, p. 10).


Eric Hoffer also pointedly observed that:

“Things that are not are indeed mightier than things that are. In all ages men have fought most desperately for beautiful cities yet to be built and gardens yet to be planted. Satan did not digress to tell all he knew when he said: ‘All that a man hath will he give for his life.’ All he hath—yes. But he sooner dies than yield aught of that which he hath not yet,” (p. 73).


Hoffer’s argument was quite simple: human imagination, and the possibility that illusion may control human perception and cognition, is a real and dynamic force in human history. Beliefs are truly the “stuff that dreams are made of,” and people will die for what they believe—even if it is wrong—because they believe it to be true. Humanity may look back over the history of the glories and the tragedies and often ask why. The answer lies partly in our brains—and it may also lie in the natural power of Fear, which generates hatred, illusion and anxiety.

Eric Hoffer made a most compelling suggestion when he wrote: “Passionate hatred can give meaning and purpose to an empty life. Thus people haunted by the purposelessness of their lives try to find a new content not only by dedicating themselves to a holy cause but also by nursing a fanatical grievance. A mass movement offers them unlimited opportunities for both,” (p. 92). Hoffer was right, and it is not simply the frailty of the human heart—but it rides in our genes and our biology, too. The current populist wave of anger against the establishment in America, Britain, France and across the West is partly inspired by a sense of grievance and fear, and in those conditions, facts are less important by far than what people prefer to believe.

In Ignorance: How It Drives Science (2012), Firestein observed: “Because you see, the single biggest problem with understanding the brain is having one. Not that it isn’t smart enough. It isn’t reliable,” (p. 125). Human beings are vulnerable to fanatical grievances because of the fear of what is not, or the desire to control what has not happened, or the drive for things that do not yet exist. The human brain is powerful and yet unreliable. People will act on the fear of what they don’t know and cannot predict. Fear is deeply embedded in the evolutionary development of the human species, and fear helps human communities survive. But it is the easiest passion to manipulate in the human heart, and the most dangerous. Fear can bring out the very worst in human behavior and thinking. People will easily fear and hate all those who stand in the way of what is not, and of what they believe they want. Human beings can fall prey to believing that something that is not has been taken away from them by some other. Such fantasies, for example, helped inspire the Holocaust. Dreams, beliefs—and, worse, lies—can all become living nightmares.

The Human Brain: Powerful and Unreliable

The social scientific and biological evidence clearly suggests that human beings do not easily change their minds or beliefs once they are established. Elizabeth Kolbert (New Yorker, February 2017) cites several psychological studies beginning in 1975 at Stanford University. In 1975, undergraduate students were given pairs of suicide notes. Each pair held one note written by a random individual and another by someone who was a real victim of suicide. In the experiment some students found they had a gift for correctly identifying the real suicide notes as opposed to the fake notes. Other students found, in contrast, that they were terrible at the task. Of course, all of the experimental scores were fictions. All of the suicide notes had come from the coroner’s office, and the students who had been told they were generally correct were, on average, no more successful than the students told that they had guessed wrong. The deception was revealed in the second phase of the experiment, when the students were asked to estimate how many suicide notes they had actually identified correctly. The students in the original high-score group believed that they had done very well; meanwhile the low-score group believed that they had done worse than the other students. The students drew these conclusions even after they had been told the truth—that the scores were fabricated and that neither group had in fact performed better than the other. The students tended to accept the false results.

In Kolbert’s article she discusses similar experiments from the 1970s and ’80s, and the results are the same: “Even after evidence for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” (Kolbert, p. 3). The Stanford studies became famous, and thousands of such experimental studies over time confirmed similar results. As Kolbert said: “Reasonable-seeming people are often totally irrational,” (p. 3). In our times, given the notion that we are navigating a post-fact, post-truth environment, this understanding is all the more significant. Even so, as with the definition of Political Power, the real question is why: how and why do people act this way? Political Power is understood to be generated by human relationships. These human relationships are built upon perceptions about Motives and Resources (Burns, 1978). People assess the motives and resources of one another, and followers choose to follow leaders based on those perceptions. Propaganda seeks to shape and manipulate human perceptions. The factors most central to fomenting political power are thus naturally embedded in propaganda. The development of propaganda as a tool was certainly no accident.

A new book titled The Enigma of Reason, by Hugo Mercier and Dan Sperber (2017), may hold some clues. Reason [as we understand the human trait] likely evolved among human beings and human communities living on the African savannah. In her book The Sixth Extinction (2014), Kolbert carefully identified the one trait that allowed humanity to work and hunt cooperatively. This trait—related to the construction of our mouths and tongues—is the ability to communicate. The primary factor in human success, and in our ability to out-compete other animals and species, is this cooperative behavior. Kolbert links this factor to the long time line leading to a current, human-inspired “Sixth Extinction.” On our own we would not, and will not, survive. Mercier and Sperber’s research lends further support to this idea: “Humans’ biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and almost as difficult to sustain. …‘Reason is an adaptation to the hypersocial niche humans have evolved for themselves’,” (Kolbert, 2017). Habits of thinking that we might regard as strange or weird turn out to be very clever from a “social interaction” standpoint.

To illustrate this point, Kolbert (2017) uses the concept of confirmation bias. Confirmation bias refers to the tendency people have to accept information that supports their preexisting beliefs and to reject any and all new information that comes into conflict with those beliefs. Stanford University again provided the research which repeatedly confirms the concept. A classic Stanford study conducted by C.G. Lord, Lee Ross and Mark Lepper in 1979 dealt with capital punishment. A group of students was gathered, half of whom supported the death penalty while the other half did not. The students were shown two studies. One study provided data to support the deterrence argument—that capital punishment deters crime and murder; the other study provided data that called deterrence into question. As you might guess, the students who supported capital punishment thought the pro-deterrence data were credible, while the students who originally opposed capital punishment viewed the anti-deterrence data as credible. In fact, students who began the study pro-capital punishment ended it even more in favor of the death penalty than before the experiment, and those who opposed the death penalty also became more fervent in their opposition (Lord, Ross and Lepper, 1979). Why? The answer was confirmation bias.

Mercier and Sperber’s evolutionary research suggests that “myside” bias played a role in human “hyper sociability,” (Kolbert, p. 5). For human individuals, being a free rider is frequently a positive choice—getting everything we can with as little invested as possible is basically rational for the individual. The problem is that free-rider behavior in groups is a catastrophe. Because human beings must live in groups to survive, hyper-social qualities began to be selected for over the expanse of our evolution, and this selection process led to the irony of confirmation bias, or what can also be called “myside” bias (Mercier and Sperber, 2017). One would think that confirmation bias—only agreeing with what my group believes, or with what I have always believed, in the face of facts to the contrary—would be dangerous. After all, adapting in response to new data would generally be wise—unless not doing so performed some adaptive function.

The Adaptive Functions of “My-Side Bias”

As it turns out, Mercier and Sperber show that “myside bias” had an “adaptive function”: it works within groups to protect us as individuals from getting hurt, or “screwed,” by other members of our own group. Once again, according to Kolbert: “Humans…aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting weaknesses. Almost invariably, the positions we’re blind about are our own,” (2017). Living in small groups as hunter-gatherers, our ancestors’ main concern was with their social standing, and with ensuring that they were not the folks risking their lives on the hunt while others relaxed back at the cave. Reasoning clearly was less important than winning arguments. Our ancestors, for example, were not concerned about the deterrent effects of capital punishment. They did not deal with theory, false news, fake stories, social networks or Twitter. Reason may seem to fail us today because, as Mercier and Sperber argue, we live in an environment that has changed more quickly than human natural selection could adapt (Mercier and Sperber). What if human beings are unsuited to deal with the technological world of communication and ideas that exists today? Is it possible that the evolutionary traits that helped humanity succeed are working against us in this new environment?

Steven Sloman and Philip Fernbach, in their book The Knowledge Illusion: Why We Never Think Alone (2017), also believe that sociability has played the predominant role in how the human mind functions and malfunctions. According to Elizabeth Kolbert (pp. 6-7), these cognitive scientists demonstrate that people believe they know much more than they actually do about the world. In recent studies at Yale, students could not explain the functions of flush toilets, zippers, cylinder locks, and much more. We laugh when we say that we cannot use most of the electronics we have in our homes, but perhaps it should be a cause for some concern. Our persistent belief that we know more than we do essentially relies on other people. Whether we are talking about flush toilets, iPods, smartphones, digital televisions or laptop computers, human beings have been relying on the knowledge and expertise of other people since we began hunting in groups. We blissfully delude ourselves that we know how things work when in fact we are actually collaborating with other people. People share knowledge and understandings in such a way that we cannot perceive where our ignorance ends and our authentic knowledge actually begins.

By and large, people do not have clear borders between their ideas and beliefs and their actual knowledge. It is a kind of confusion, but one essential to human progress. We use tools, but we do not have to invent new tools every time we wish to use them. Can you imagine if every generation had to re-invent the wheel, or the shovel, rake, or even the spoon? We need not learn the principles of an internal combustion engine to drive an automobile, nor understand digital electronics to program a television set. The main problems arise, according to the research of Sloman and Fernbach, when people move into the terrain of politics. [Here you can see the functionality of ideology as a filter; ideology simplifies a complicated world and allows (or calls) people to take action and make choices that they otherwise lack the knowledge to make]. Consider the difference between flushing a toilet and selecting a policy to manage illegal immigration, or to invade another country like Afghanistan. We assume knowledge enough to vote and make choices about politicians and leaders who argue for policies dealing with faraway countries like North Korea, or laws and policies like immigration policy, the Affordable Care Act, tax reform, criminal justice reform, military strategy, and much more. Even so, as people engage in making decisions about these issues, they may know far less about those things than about the operation of a zipper. In fact, Sloman and Fernbach’s research says that, “As a rule strong feelings about issues do not emerge from deep understanding,” (Sloman and Fernbach, 2017). Our dependence on the minds and arguments of others becomes dominant in our thinking processes. Our thinking becomes the thinking of those we agree with, which—despite their lack of knowledge—seems to support what we think. In the end, as Elizabeth Kolbert said: “If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration,” (Kolbert, 2017, p. 7).

A community of “knowledge” becomes dangerous when we continually express opinions without data and analysis. This is what we often refer to as the “narrative,” and the process concerns the manner in which stories are shaped and then which stories come to dominate the social and historical landscape. For a long time, Science has been a system for correcting our worst inclinations as human beings. Returning to the questions posed by Peter Peregrine, we may shudder to ask: are we re-entering a pre-Enlightenment world? Are we returning to the Middle Ages, when authority and the group mind blocked the rise of the new and scientific ideas that changed the world? Scientific inquiry has no place for confirmation bias. Access to empirical truths and material data probably accounts for the massive success of science over the last 300 years—and it is why we need a Political Science today more than ever.

The Illusion of Knowledge

Elizabeth Kolbert also examined the research of Jack and Sara Gorman (2016) in their book Denying to the Grave: Why We Ignore the Facts That Will Save Us. The Gormans studied persistent beliefs that are not only demonstrably false but possibly deadly. For example, the conviction held by many people that vaccines are hazardous, and may lead to birth defects and disease, persists despite the massive protective effect of vaccinations over time. Without the polio vaccine, for example, the world would still be battling that painful, disfiguring disease. “Anti-vaxxers” (as they are called) are obstinate in their beliefs although there is no scientific link between immunization and birth defects, seizures or autism; indeed, the scientific evidence points the other way. The real hazard we face in society is not being vaccinated, and that is something we can empirically measure.

So, what does America do when it has a President who has shared the “anti-vaxxer” world view? Suggestively, the Gormans’ research found that this kind of self-destructive thinking has some adaptive value in human biology. Apparently, human beings get a rush of dopamine—a physiological reward—when they process information that supports their pre-existing beliefs. We should not be surprised that blasting out our own opinions actually “feels good,” and it may account for why people become so agitated when confronted by opinions that challenge their own. In other words, it feels good to “stick to our guns,” even when we are as wrong as sin and suffering. (This perhaps helps to explain so many ruined family gatherings.)

Common sense and science tell us that vaccines are good for kids, and handguns are not. The data are clear. Having said that, providing people with the information that owning a gun does not make you safer, or that vaccines save lives, will not change minds; such information will be discounted. In another example, many folks believe that the economy always does better under Republican Presidents. In fact, between 1949 and 2009, growth in US gross domestic product was higher under Democratic Presidents. In addition, Democratic Presidents have been more likely to reduce inequality, and since 1945 they have contributed less to the national debt as a percentage of GDP. Many people believe that “the government spends a lot of money on welfare,” but in fact only 8% of the total 2014 US budget was devoted to “welfare” benefits. Many people believe violent crime is on the rise, but violent crime has fallen by 50% since 1991; murder rates dropped by 49% between 1993 and 2010, and general violent crime has fallen by 72% since 1993. So why do people prefer to believe what is not true, and ignore empirical truths that would not only make us feel safer but would actually improve our security? (Hochschild, 2017)

The answer lies in part in our human desire to belong. Our group identity pushes us to seek comfortable answers that ensure group acceptance and make us feel good. It is also much easier not to change beliefs. This behavior is reinforced because human beings have learned that we do not really need to know how things work in order to get along and succeed. The brain seeks the easiest way to accomplish any goal, which, generally speaking, has been a good thing in human evolution. We also know from long historical experience that emotion and passion can often defeat scientific understanding. Science requires our brains to work harder; actually understanding and knowing the truth is challenging. Kellyanne Conway’s “alternative facts,” for example, are a good fit for the current times. Alternative facts—and acting on passions, authority, anger and fear—are faster and easier, and will always be tempting to people. In politics this has always been true, and it is something philosophers have warned about for millennia. In fact, it is one of the main things Madison warned about in Federalist No. 51: “But what is government itself, but the greatest of all reflections on human nature? If men were angels, no government would be necessary,” (Madison, The Federalist, 2001). As Kolbert said: “These days it can feel as if the entire country has been given over to a vast psychological experiment being run by no one, or by Steve Bannon. Rational agents would be able to think their way to a solution. But, on this matter (Kolbert says) the literature is not reassuring,” (Kolbert, 2017).

There are many ways to define political ideology, but among the best is to think of ideologies as “patterns of belief.” Ideologies are calls to action, dreams, delusions and nightmares, but they are always a pattern of beliefs organized in such a way as to define the world people see and understand. Ideologies filter out the competing ideas of the political world and simplify complex reality, becoming not only our calls to action but our personal identities (Love, 2006). Political ideology in a time when truth is malleable means that just as we might have been ready to embrace the Universal Declaration of Human Rights as a template for a world of decency in the future, we may be equally poised to fall prey to lies, deceits, or propaganda of the darkest sort in the service of interests who do not seek the public good.

Conclusion

Professor Larry J. Sabato of the Crystal Ball (University of Virginia, Political Science) analyzed polling data in April 2017 that confirmed the following: "Voters who supported President Donald Trump in last year's election have few regrets as Trump enters his 100th day in office…," (www.centerforpolitics.org, April 27, 2017). The data (scientifically gathered) confirm that voters who elected Trump show a 93% approval rating, with 42% in "strong approval" and 51% in "somewhat approve." In addition, 70% of these Trump voters believe the country is now "on the right track," and two-thirds believe the economy has improved since Trump took office. The polling data did reflect that many respondents are concerned about America, using words like "upheaval," "polarization," "chaotic," and "volatile" to describe the country at this time, but it is hard to know what form these feelings may take in each respondent's mind. Breakdowns in the "strong approval" category still show the same characteristics as in the November 2016 election among Trump voters: 44% of "strong approvers" were men and 39% were women, and respondents over 65 years old were the only age group in which "strong approvers" outnumbered "somewhat approvers" (around 48-46%). Meanwhile, "strong approvers" of Trump narrowly outnumbered "somewhat approvers" only among those with a high school education or less; Trump voters with some college education were likelier to "somewhat approve" than to "strongly approve." Trump voters' approval at the 100-day mark remained strong, and they may not change their minds any time soon.

The search for truth relies on the belief that understanding and wisdom matter; that observation and analysis matter; that in politics, lives and passions matter, but so too do knowledge, expertise, and intelligence. One key insight from the study of propaganda and political ideology is an appreciation of how difficult it is to change people's embedded beliefs. Community has made the human species successful, and over time evolution has selected for the traits that support group identity. The human being needs to belong to the group as much as to be a discrete individual, and the research shows that group coherence, even when errors in judgment or opinion are dangerous, has aided human survival and dominance. Hence, human beings do not change their minds easily, even when they are wrong, and more importantly, even after the evidence of error is shown to them. People created and employ political ideology to make political choices and to operate successfully within their social groups. Evolution and biology help us to understand, in part, the powerful role of ideology in organizing human hyper-sociability into integrated patterns of belief. Political ideology makes our emotional and unreasonable beliefs appear rational and reasonable. With the aid of propaganda, political ideology works within the group to shape and influence beliefs and to bind groups together in action. Political ideology is endemic to the success of any human community; it is more than a pattern of beliefs or a movement, it is the binding force of political life itself.

References

Arendt, Hannah. (2004). The Origins of Totalitarianism. New York: Random House Inc. [1951].

Aristotle. (1940). “The Rhetoric,” in The Basic Works of Aristotle. New York: Random House.

Ball, Molly. (2017). “Kellyanne’s Alternative Universe,” The Atlantic, April 2017.


Burns, James MacGregor. (1978). Leadership. New York: Harper/Colophon Books.

Davidson, Cathy N. (2011). Now You See It: How Technology and Brain Science Will Transform Schools and Business for the 21st Century. Middlesex: Penguin Books.

Firestein, Stuart. (2012). Ignorance: How It Drives Science. New York: Oxford University Press.

Gorman, Sara E., and Jack M. Gorman. (2017). Denying to the Grave: Why We Ignore the Facts that Will Save Us. New York: Oxford University Press.

Hamilton, Alexander, James Madison, and John Jay. (2001). The Federalist (Gideon Edition). George W. Carey and James McClellan, eds. Indianapolis: Liberty Fund. [1787-1788]

Hochschild, Arlie Russell. (2016). Strangers in Their Own Land: Anger and Mourning on the American Right. New York: The New Press.

Hoffer, Eric. (1951). The True Believer. New York: Mentor Books.

Inglehart, Ronald and Pippa Norris. (2016). "Trump, Brexit, and the Rise of Populism: Economic Have-Nots and Cultural Backlash," Harvard Kennedy School Faculty Research Working Papers, RWP16-026, August.

Jowett, Garth S. and Victoria O’Donnell, (2015). Propaganda and Persuasion, 6th ed. Thousand Oaks: Sage Publications.

Kolbert, Elizabeth. (2014). The Sixth Extinction: An Unnatural History. New York: Henry Holt and Co.

Kolbert, Elizabeth. (2017). "Why Facts Don't Change Our Minds: New Discoveries About the Human Mind Show the Limitations of Reason." The New Yorker. February 2017.

Lord, Charles G., Lee Ross, and Mark Lepper. (1979). "Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence." Journal of Personality and Social Psychology, Vol. 37 (11), November 1979, 2098-2109.

Love, Nancy S. (2006). Understanding Dogmas and Dreams, 2nd edition. Washington D.C.: CQ Press.

Mercier, Hugo and Dan Sperber. (2017). The Enigma of Reason. Cambridge: Harvard University Press.

Nussbaum, Martha C. (2012). The New Religious Intolerance: Overcoming the Politics of Fear in an Anxious Age. Cambridge: Harvard University Press.

Peregrine, Peter Neal. (2017). “Seeking Truth Among ‘Alternative Facts’.” The Conversation. (February 23, 2017). https://theconversation.com/seeking-truth-among-alternative-facts.

Sabato, Larry J. (2017). "Center for Politics Poll Takes Temperature of Trump Voters at 100 Day Mark," Crystal Ball. April 27, 2017. http://www.centerforpolitics.org.

Sloman, Steven and Philip Fernbach. (2017). The Knowledge Illusion: Why We Never Think Alone. New York: Riverhead Books.

Watson, James. (1980). The Double Helix: A Personal Account of the Discovery of the Structure of DNA. Edited by Gunther Stent. New York: W.W. Norton Company.

Compliance with Ethical Standards

This manuscript is the original research of the author, and no funding or grant support was received or accepted. All conclusions and opinions are the responsibility of the author alone, and there is no conflict of interest. This article does not contain studies with human participants or animals performed by the author.
