“In the Eye of the Beholder: Parochial Altruism, Radicalization, and Extremism” By Zoey Reeve [The Evolution Institute]

“In the Eye of the Beholder: Parochial Altruism, Radicalization, and Extremism

By Zoey Reeve

Zoey Reeve has a background in Psychology, Terrorism Studies and Political Science, and is a VOX-Pol Fellow.  Her research focuses on the social-evolutionary psychology of radicalization and terrorism in both online and offline spheres.

https://evolution-institute.org/in-the-eye-of-the-beholder-parochial-altruism-radicalization-and-extremism/

(…)

However, this stance inhibits our capacity to understand the radicalization process because it exceptionalizes people on the basis of what can be, admittedly, a set of rather exceptional behaviors (i.e. suicide terrorism), though often also increasingly includes unexceptional behaviors (i.e. providing funding, logistics, or even just online support for certain groups). Radicalization and extremism are thus little more than labels. The ‘in the eye of the beholder’ philosophy is a luxury that some cannot afford, and perhaps many are unable to stomach. But it leaves us better equipped to understand why (some) people may engage in what we currently think of as extremism and violent extremism because it looks to normal psychological processes and mechanisms that are involved in the radicalization process, rather than focusing on the qualities that we have labeled as exceptional.

One such psychological mechanism is Parochial Altruism. Parochial altruism is the propensity for humans to engage in costly-to-self behavior to protect group members from non-group members.2 One (of many) causes of death in ancestral times was outgroups. Whether due to resource encroachment, the spread of disease and parasites, or overt aggression, the mere presence of outgroups would have been enough to trigger parochial altruism. Parochial altruistic responses include fear, withdrawal or fleeing, withholding benefits/resources, and overt hostility and aggression. Presuming that an individual belongs to a sufficiently important group, perceptions of threat to that group will stir parochial altruism in modern humans, despite these conditions being unlikely to manifest in the potential existential threat that may have occurred during ancestral times. This is known as mismatch.3”

“The Evolutionary Psychology of Mass Mobilization: How Disinformation and Demagogues Coordinate Rather Than Manipulate” by Michael Bang Petersen [Current Opinion in Psychology, 20 February 2020]

“The Evolutionary Psychology of Mass Mobilization: How Disinformation and Demagogues Coordinate Rather Than Manipulate

Michael Bang Petersen

Current Opinion in Psychology

Available online 20 February 2020

https://www.sciencedirect.com/science/article/pii/S2352250X20300208

Highlights

• Violent mobilization is often attributed to manipulation from, for example, demagogues.

• The human mind contains psychological defenses against manipulation, also in politics.

• Mass mobilization requires that the attention of group members is coordinated.

• Demagogues and disinformation can be explained as tools for achieving coordination.

• Mobilized individuals are predisposed for conflict rather than manipulated into conflict.

Large-scale mobilization is often accompanied by the emergence of demagogic leaders and the circulation of unverified rumors, especially if the mobilization happens in support of violent or disruptive projects. In those circumstances, researchers and commentators frequently explain the mobilization as a result of mass manipulation. Against this view, evolutionary psychologists have provided evidence that human psychology contains mechanisms for avoiding manipulation and new studies suggest that political manipulation attempts are, in general, ineffective. Instead, we can understand decisions to follow demagogic leaders and circulate fringe rumors as attempts to solve a social problem inherent to mobilization processes: The coordination problem. Essentially, these decisions reflect attempts to align the attention of individuals already disposed for conflict.

(…)

In this review, I ask: What are the psychological processes underlying large-scale mobilization of individuals for conflict-oriented projects? The focus is on the specific psychological role fulfilled by (a) strong leaders, (b) propaganda and (c) fringe beliefs in the context of successful mobilization processes. Understanding this role is of essential importance in current political climates where we witness a combination of political conflict, the emergence of populist leaders and concerns about the circulation of “fake news” on social media platforms.

A frequently-cited perspective is that large-scale mobilization for conflict-oriented projects reflects the use of propaganda by demagogues to manipulate the opinions of lay individuals by exploiting their reasoning deficiencies. Here, I review the emerging evidence for an alternative perspective, promoted especially within evolutionary psychology, which suggests that the primary function of leaders and information-circulation is to coordinate individuals already predisposed for conflict (1, 2**, 3). As reviewed below, human psychology contains sophisticated defenses against manipulation (4**) and, hence, it is extremely difficult to attain large-scale mobilization without the widespread existence of prior beliefs that such mobilization is beneficial. Furthermore, a range of counter-intuitive features about demagogues, disinformation and distorted beliefs is readily explained by a coordination perspective.

(…)

In general, leadership and followership evolved to solve coordination problems (21, 27) and there are reasons to expect that authoritarian leaders will solve these coordination problems to the benefit of those who seek aggression (19). Authoritarian leaders often have aggressive personalities themselves and, hence, are more likely to choose this focal point rather than others. Also, authoritarian leaders are more likely to aggressively enforce collective action, thereby also providing a solution to the free-rider problem. Consistent with this coordination-for-aggression perspective on preferences for dominant leaders, such leader preferences are specifically predicted by feelings of anger rather than, for example, fear (28, 29, 30), suggesting that people decide to follow dominant leaders to commit to an offensive strategy against the target group (28).

This perspective also explains highly counter-intuitive features of the appeal of demagogues. If followers search for the optimal leader to solve conflict-related problems of coordination, they will seek out candidates who are willing to violate normative expectations by engaging in obvious lying (31**) and who display a personality oriented towards conflict, even if such personalities would be considered unappealing under other circumstances (2**).

(…)

Another propaganda tactic is moralistic in nature. Thus, in less violent forms of group-based conflict, including in the context of modern social media discussions, an often-used tactic is to direct attention towards a group’s or person’s violation of moral principles. Moral principles are effective tools for large-scale coordination because they suggest that the target behavior is universally relevant (1, 34*, 35). Consistent with the coordination perspective, however, recent research suggests that the motivations to broadcast such violations can reflect attempts to mobilize others for self-interested causes. Thus, the airing of such moral principles, referred to as moral grandstanding, is strongly motivated by status-seeking (36*) and there is increasing evidence that the acceptance of moral principles shifts flexibly with changes in self-interest (37).

(…)

Consistent with this, recent evidence shows that political affiliation is a strong predictor of statements of belief in fringe stories such as conspiracy theories and “fake news” (3, 42**).

(…)

Overall, the effects of the coordination problem on mobilization processes are dual. On the one hand, the existence of the coordination problem means that groups and societies can be stable even if they contain large minority segments of individuals who share disruptive, violent or prejudiced views. On the other hand, the existence of the coordination problem also implies that this stability can be quickly undermined if coordination is suddenly achieved. Not because people are manipulated; but because a sufficient number of them direct attention to a particular set of preferences simultaneously.”

“Find something morally sickening? Take a ginger pill” – Jessica Tracy [aeon]

“Find something morally sickening? Take a ginger pill

Jessica Tracy

is a professor of psychology and a Sauder Distinguished Scholar at the University of British Columbia in Vancouver. She is the director of the Self and Emotion Lab at UBC, and an associate editor at the Journal of Personality and Social Psychology. She is also the author of Take Pride: Why the Deadliest Sin Holds the Secret to Human Success (2016).

https://aeon.co/ideas/find-something-morally-sickening-take-a-ginger-pill

(…)

“This gap in scientific knowledge led my former graduate student Conor Steckler to come up with a brilliant idea. As those prone to motion sickness might know, ginger root can reduce nausea. Steckler suggested we feed people ginger pills, then ask them to weigh in on morally questionable scenarios – behaviours such as peeing in a public pool, or buying a sex doll that looks like one’s receptionist. If people’s moral beliefs are wrapped up in their bodily sensations, then giving them a pill that reduces some of those sensations might reduce how wrong those behaviours seem.

In my psychology lab at the University of British Columbia, we filled empty gel capsules with either ginger powder or sugar (for randomly assigned control participants); in a double-blind design, neither the participants nor the researchers running the study knew who received which pill. After swallowing their pills and waiting 40 minutes for them to metabolise, participants were asked to read scenarios describing a range of possible moral infractions, and tell us how morally wrong they believed each to be. Sure enough, as we reported in an article in the Journal of Personality and Social Psychology in 2019, we found the predicted difference. Those who ingested ginger decided that some of those violations, such as someone peeing in your swimming pool, were not so wrong after all. Blocking their nausea changed our participants’ moral beliefs.

(…)

The violations that were affected by ginger, in contrast, centred on maintaining the purity of one’s own body. These transgressions are ones that have, historically, carried a high likelihood of transmitting disease. As a result, it is evolutionarily adaptive for us to feel disgusted by, and consequently avoid, close contact with dead bodies, human faeces and certain unsafe sex practices. Throughout human evolutionary history, moralising these behaviours, along with others that protect the sanctity of the body, might have been a useful way for societies to shield their members from dangerous germs they had no cognitive awareness of. According to the psychologist Jonathan Haidt and his colleagues, in many cultures this presumably adaptive tendency morphed into a broader ethic that uses concepts such as purity, sanctity and sin to discourage behaviours perceived to cause some manner of bodily degradation. In many cultures, these rules have stretched far beyond their original adaptive purposes; today, across the globe, societies regulate individuals’ purity-related behaviours by invoking morality in ways that sometimes do – but just as often do not – lead to actual health or social benefits.

We were able to shift people’s sanctity beliefs simply by giving them ginger. A moral view that changes on the basis of how nauseous we feel is probably not one that we want to put a lot of stake in.”

***

“The physiological basis of psychological disgust and moral judgments.

Tracy, Jessica L.; Steckler, Conor M.; Heltzel, Gordon

Tracy, J. L., Steckler, C. M., & Heltzel, G. (2019). The physiological basis of psychological disgust and moral judgments. Journal of Personality and Social Psychology, 116(1), 15–32. https://doi.org/10.1037/pspa0000141

Abstract

To address ongoing debates about whether feelings of disgust are causally related to moral judgments, we pharmacologically inhibited spontaneous disgust responses to moral infractions and examined effects on moral thinking. Findings demonstrated, first, that the antiemetic ginger (Zingiber officinale), known to inhibit nausea, reduces feelings of disgust toward nonmoral purity-offending stimuli (e.g., bodily fluids), providing the first experimental evidence that disgust is causally rooted in physiological nausea (Study 1). Second, this same physiological experience was causally related to moral thinking: ginger reduced the severity of judgments toward purity-based moral violations (Studies 2 and 4) or eliminated the tendency for people higher in bodily sensation awareness to make harsher moral judgments than those low in this dispositional tendency (Study 3). In all studies, effects were restricted to moderately severe purity-offending stimuli, consistent with preregistered predictions. Together, findings provide the first evidence that psychological disgust can be disrupted by an antiemetic and that doing so has consequences for moral judgments. (PsycINFO Database Record (c) 2018 APA, all rights reserved)”
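As a rough illustration of the between-subjects design described above (a hypothetical sketch with simulated ratings and invented group sizes and effect size, not the authors’ data or analysis code), the core comparison amounts to contrasting wrongness ratings across the two randomly assigned pill groups:

```python
# Hypothetical sketch: simulated moral-wrongness ratings for a ginger group
# and a placebo group, compared with Welch's t-test. Group sizes, rating
# scale, and effect size are invented for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_per_group = 100
# Simulated ratings on a 1-7 wrongness scale; assume ginger slightly lowers
# ratings of moderately severe purity violations (illustrative effect only).
placebo = np.clip(rng.normal(loc=5.2, scale=1.0, size=n_per_group), 1, 7)
ginger = np.clip(rng.normal(loc=4.8, scale=1.0, size=n_per_group), 1, 7)

# Welch's t-test for the difference between the two randomized groups.
t, p = stats.ttest_ind(ginger, placebo, equal_var=False)
print(f"placebo mean = {placebo.mean():.2f}, ginger mean = {ginger.mean():.2f}")
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```

Welch’s test is used here only because the two groups’ variances need not be assumed equal; the published studies report their own preregistered analyses.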

“Partisanship predicts belief in fake news more strongly than conspiracy mentality, study finds” [PsyPost]

“Partisanship predicts belief in fake news more strongly than conspiracy mentality, study finds

By ERIC W. DOLAN February 6, 2020

https://www.psypost.org/2020/02/partisanship-predicts-belief-in-fake-news-more-strongly-than-conspiracy-mentality-study-finds-55464

(…)

Supporters of the current Hungarian prime minister, Viktor Orbán, were more likely to rate the pro-government fake news as coming from an independent source and were more likely to believe that the pro-government fake news was real. But Orbán supporters were less likely to view the anti-government fake news as real.

Opponents of Orbán’s government, on the other hand, were more likely to rate the anti-government fake news as coming from an independent source and were more likely to believe that the anti-government fake news was real, but were more skeptical of the pro-government fake news.

Conspiracy mentality, a measure of one’s propensity to endorse conspiracy theories, was only weakly linked to belief in anti-government fake news.

“Despite fake news and conspiracy theories often being mentioned interchangeably, our research revealed that they do not necessarily overlap. We focused on wish-fulfilling political fake news, which was unrelated to the general mentality to believe in conspiracy theories. Therefore, our research suggests that pipedream fake news is processed like any other information,” Faragó explained.

(…)

“The perception of the source is also important in the evaluation process: if the news is consistent with our beliefs, we more likely think that the news was written by an independent journalist, but if the news contradicts our viewpoint, we assume that it is biased and part of political propaganda,” Faragó added.

“When we read news, the satisfaction with the economic situation is also an important factor: if we are satisfied with the economy and the political management, we will trust pro-government news more, even if it is fake, and regard opposition news as political propaganda.”

“Therefore, we should read news that come from our own side even more critically,” Faragó said.”

***

We only believe in news that we doctored ourselves: The connection between partisanship and political fake news.

Faragó, Laura; Kende, Anna; Krekó, Péter

https://psycnet.apa.org/record/2019-57441-001

Abstract

In this research we aimed to explore the importance of partisanship behind the belief in wish-fulfilling political fake news. We tested the role of political orientation, partisanship, and conspiracy mentality in the acceptance of pro- and anti-government pipedream fake news. Using a representative survey (N = 1,000) and a student sample (N = 382) in Hungary, we found that partisanship predicted belief in political fake news more strongly than conspiracy mentality, and these connections were mediated by the perceived credibility of source (independent journalism vs. political propaganda) and economic sentiment. Our findings suggest that political bias can be more important in predicting acceptance of pipedream political fake news than conspiracy mentality. (PsycINFO Database Record (c) 2019 APA, all rights reserved)

“‘Why We’re Polarized’ shows how media, emotion, politicians and more are dividing Americans” By Dan Hopkins [On Ezra Klein’s Why We’re Polarized]

“‘Why We’re Polarized’ shows how media, emotion, politicians and more are dividing Americans

Ezra Klein explains the political science for you.

By Dan Hopkins

Jan. 29, 2020 at 9:45 a.m. GMT-3

https://www.washingtonpost.com/politics/2020/01/29/why-were-polarized-shows-how-media-emotion-politicians-more-are-dividing-americans/

Few books are as well-matched to the moment of their publication as Ezra Klein’s “Why We’re Polarized.” President Trump has just become the third president ever to be impeached — and the first one where the votes have fallen almost perfectly along party lines. Klein’s careful book explains how different groups of Americans can see politics through such different lenses, examining how various psychological mechanisms allow committed partisans to rationalize almost anything their party does.

Klein first came to my attention during the 2009-2010 health-care debate, when his Washington Post blog “Wonkblog” was the go-to site for understanding the political and policy dynamics surrounding that legislation. This book fully displays the attributes that have made Klein’s journalism so successful.

The book is undeniably wonky, in the best sense of the word. Klein is an astute reader of political science and social psychology, disciplines he takes seriously. I should disclose that I occasionally wrote for Wonkblog several years ago. But given how much Klein has done to elevate political and social science, it’s hard to find a political scientist not in his debt.

Klein’s book starts with the psychological underpinnings of polarization, and then looks at ways that today’s media landscape and political institutions generate feedback loops that amplify it. In this view, polarization is self-reinforcing. Political elites divide over a question, and then citizens, picking up on those divisions, follow the natural grooves of human psychology by dividing themselves into increasingly meaningful groups. Those emerging divisions, in turn, heighten politicians’ incentives to accentuate their divisions. Thick with insight, the book is especially compelling on how today’s media environment fosters identity-infused content.

But Klein may have incorporated certain lessons from contemporary political science too well — picking up its blind spots and inheriting my discipline’s collective overemphasis on political psychology.”

***

“Why We’re Polarized

Simon & Schuster, 2020.

Ezra Klein

https://www.amazon.com.br/Why-Were-Polarized-Ezra-Klein/dp/1797107658

“America’s political system isn’t broken. The truth is scarier: it’s working exactly as designed. In this book, journalist Ezra Klein reveals how that system is polarizing us—and how we are polarizing it—with disastrous results.

“The American political system—which includes everyone from voters to journalists to the president—is full of rational actors making rational decisions given the incentives they face,” writes political analyst Ezra Klein. “We are a collection of functional parts whose efforts combine into a dysfunctional whole.”

In Why We’re Polarized, Klein reveals the structural and psychological forces behind America’s descent into division and dysfunction. Neither a polemic nor a lament, this book offers a clear framework for understanding everything from Trump’s rise to the Democratic Party’s leftward shift to the politicization of everyday culture.

America is polarized, first and foremost, by identity. Everyone engaged in American politics is engaged, at some level, in identity politics. Over the past fifty years in America, our partisan identities have merged with our racial, religious, geographic, ideological, and cultural identities. These merged identities have attained a weight that is breaking much in our politics and tearing at the bonds that hold this country together.

Klein shows how and why American politics polarized around identity in the twentieth century, and what that polarization did to the way we see the world and one another. And he traces the feedback loops between polarized political identities and polarized political institutions that are driving our system toward crisis.

This is a revelatory book that will change how you look at politics, and perhaps at yourself.”

“Conflict Changes How People View God” | Psychological Science

“Conflict Changes How People View God

Psychological Science

Nava Caluori, Joshua Conrad Jackson, Kurt Gray, and Michele Gelfand

First Published January 28, 2020

https://doi.org/10.1177/0956797619895286

https://journals.sagepub.com/doi/abs/10.1177/0956797619895286

Abstract

Religion shapes the nature of intergroup conflict, but conflict may also shape religion. Here, we report four multimethod studies that reveal the impact of conflict on religious belief: The threat of warfare and intergroup tensions increase the psychological need for order and obedience to rules, which leads people to view God as more punitive. Studies 1 (N = 372) and 2 (N = 911) showed that people’s concern about conflict correlates with belief in a punitive God. Study 3 (N = 1,065) found that experimentally increasing the salience of conflict increases people’s perceptions of the importance of a punitive God, and this effect is mediated by people’s support for a tightly regulated society. Study 4 showed that the severity of warfare predicted and preceded worldwide fluctuations in punitive-God belief between 1800 CE and 2000 CE. Our findings illustrate how conflict can change the nature of religious belief and add to a growing literature showing how cultural ecologies shape psychology.”

Strategy, Evolution, and War: From Apes to Artificial Intelligence by Kenneth Payne (Georgetown University Press, 2018)

“Strategy, Evolution, and War: From Apes to Artificial Intelligence

by Kenneth Payne

Georgetown University Press, 2018.

https://www.amazon.com.br/Strategy-Evolution-War-Artificial-Intelligence/dp/1626165807

Kenneth Payne is a senior lecturer in the School of Security Studies at King’s College, London. He is also a senior member of St Antony’s College, Oxford University, having earlier been a visiting fellow in the Department of International Relations there. Payne’s research is broadly in the field of political psychology and strategic studies. He is the author of two previous books, The Psychology of Strategy: Exploring Rationality in the Vietnam War and The Psychology of Modern Conflict.

This book is about the psychological and biological bases of strategy making in war as they have evolved in humans over our history as a species. The book is also a cautionary preview of how Artificial Intelligence (AI) will revolutionize strategy more than any development in the last three thousand years of military history. Machines will make important decisions about war on both sides, and they may do so without input from humans. Kenneth Payne describes strategy as an evolved package of conscious and unconscious behaviors with roots in our primate ancestry. Human-made strategy is influenced by emotion as well as reason, with both positive and negative results. The strategic implications of AI are profound because they depart radically from the biological basis of human intelligence. Rather than being just another tool of war, AI will exponentially speed up decision-making, make choices humans might not make, and force faster actions and reactions. This book is a fascinating examination of the psychology of strategy-making from prehistoric times, through the ancient world, and into the modern age. It also offers a concerning preview of a future when humans cede at least some control over their destiny.”

“An Evolutionary Explanation for Unscientific Beliefs” by Brandon Bretl

“An Evolutionary Explanation for Unscientific Beliefs

written by Brandon Bretl

https://quillette.com/2020/01/13/an-evolutionary-explanation-for-unscientific-beliefs

Brandon Bretl is a research fellow and PhD candidate in the department of educational psychology at the University of Kansas. His current research is focused on explaining how political ideology and other cultural factors influence cognitive development during adolescence. You can follow him on Twitter @BrandonBretl

As it turns out, the theory of evolution by natural selection provides a strong explanation for how and why some people don’t believe evolution by natural selection has ever taken place. I initially thought the problem was a matter of knowledge and the standards people have for what constitutes knowledge, but eventually it became clear that holders of anti-scientific beliefs (from William Jennings Bryan of the Scopes Monkey Trial to modern-day conspiracy theorists) typically root their convictions in moral obligation.

To understand morality from an evolutionary point of view, one needs to realize that humans have always existed in groups. Often, these groups compete with one another, and this means group-level selection pressures have influenced individual traits, including psychological traits. For example, if two tribes come into conflict with one another, the tribe with members better able to cooperate will prevail. Thus, psychological phenomena such as empathy, concern for fairness and reciprocity, in-group loyalty, and respect for hierarchy have a selective advantage in contexts of group competition, and we immediately recognize the lack of these traits as psychopathy. In other words, most humans have innate tendencies that guide moral development—participating in fair exchanges and seeing moral violators punished are both inherently pleasurable, whereas witnessing injustice and suffering are inherently uncomfortable (just as sweet tastes are innately pleasurable and bitter tastes are innately aversive, even for infants).

Thus, what we consider moral and why we consider it moral are not arbitrary nor are they solely guided by social learning. Our moral intuitions are rooted in natural selection’s answers to social problems that have consistently arisen throughout our evolutionary past. Nonetheless, what we readily recognize as moral is dependent on a wide range of conceptual abilities that must be flexible enough to adapt to cultural contexts and be utilized correctly in specific social circumstances (for instance, empathy for an in-group member’s loss but pleasure in an enemy’s loss), so a significant part of our moral intuitions are dependent on learning and social experience as well.

(…)

Prior to science and accurate causal models for natural phenomena, cultures themselves evolved through trial and error, relying on superstition, myth, and tradition to perpetuate survival-enhancing knowledge and skills. In such circumstances, an ability to override rational thoughts in favor of conformity could have a reproductive benefit. We can see evidence of such conformity biases in neuroimaging studies—when participants change their opinions to match with others, the part of the brain involved in feelings of pleasure and happiness becomes more active. We’re also more likely to imitate and learn from high-prestige individuals, which is why high-prestige individuals are paid so much to market products (a phenomenon known as the “prestige effect”). Finally, in the most extreme cases, high-emotional arousal can completely shut down a person’s rational faculties.

(…)

 If you doubt that these psychological mechanisms exist, try using evidence to convince a creationist that evolution by natural selection occurs or a climate change denier that human-induced climate change is real, or a die-hard Cowboys fan that the Green Bay Packers are a better football team. Most likely you already have tried something like this, and you know very well that it is futile, enraging, and sometimes even traumatic.

(…)

Appeals will be made to moral foundations such as care, harm, justice, respect for authority, in-group loyalty, cleanliness, purity, or sanctity, either implicitly or explicitly. It is this moral framing that stimulates the emotional response, not the other way around; and this moral framing is designed to take advantage of some of our most deeply evolved psychological traits.”

“It’s the network, stupid: Study offers fresh insight into why we’re so divided” [by Jennifer Ouellette – Ars Technica]

“It’s the network, stupid: Study offers fresh insight into why we’re so divided”

Social perception bias might simply be an emergent property of our social networks.

JENNIFER OUELLETTE – 1/4/2020, 5:07 PM

https://arstechnica.com/science/2020/01/its-the-network-stupid-study-offers-fresh-insight-into-why-were-so-divided

Social perception bias is best defined as the all-too-human tendency to assume that everyone else holds the same opinions and values as we do. That bias might, for instance, lead us to over- or under-estimate the size and influence of an opposing group. It tends to be especially pronounced when it comes to contentious polarizing issues like race, gun control, abortion, or national elections.

Researchers have long attributed this and other well-known cognitive biases to innate flaws in individual human thought processes. But according to a paper published last year in Nature Human Behaviour [https://www.nature.com/articles/s41562-019-0677-4], social perception bias might best be viewed as an emergent property of our social networks. This research, in turn, could lead to effective strategies to counter that bias by diversifying social networks.

(…)

The team was surprised to find that the survey results closely matched the model’s predictions. Specifically, “People who were surrounded by people similar to them think that their group is larger than it really is, and people who have more diverse social circles think their group is smaller than it really is,” Galesic told Ars. “These biases are exaggerated with the relative size of the majority and minority groups.””

***

“Homophily and minority-group size explain perception biases in social networks

Eun Lee, Fariba Karimi, Claudia Wagner, Hang-Hyun Jo, Markus Strohmaier & Mirta Galesic

Nature Human Behaviour, volume 3, pages 1078–1087 (2019)

https://www.nature.com/articles/s41562-019-0677-4

Abstract

People’s perceptions about the size of minority groups in social networks can be biased, often showing systematic over- or underestimation. These social perception biases are often attributed to biased cognitive or motivational processes. Here we show that both over- and underestimation of the size of a minority group can emerge solely from structural properties of social networks. Using a generative network model, we show that these biases depend on the level of homophily, its asymmetric nature and on the size of the minority group. Our model predictions correspond well with empirical data from a cross-cultural survey and with numerical calculations from six real-world networks. We also identify circumstances under which individuals can reduce their biases by relying on perceptions of their neighbours. This work advances our understanding of the impact of network structure on social perception biases and offers a quantitative approach for addressing related issues in society.”
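A minimal toy sketch of the mechanism the abstract describes (not the authors’ generative model; the node count, homophily level, and edge probabilities below are invented): in a homophilic random graph, nodes that estimate group sizes from their own neighbourhoods tend to overestimate their own group and underestimate the other.

```python
# Hypothetical sketch: a toy homophilic random network illustrating how
# local neighbourhoods alone can bias perceived group sizes.
import numpy as np

rng = np.random.default_rng(1)

n = 500            # number of nodes (assumed)
minority_frac = 0.2
homophily = 0.8    # weight favouring same-group ties (assumed)
base_p = 0.05      # baseline edge probability (assumed)

groups = (rng.random(n) < minority_frac).astype(int)  # 1 = minority member

# Edges are more likely within groups than between groups.
same = groups[:, None] == groups[None, :]
p_edge = np.where(same, base_p * 2 * homophily, base_p * 2 * (1 - homophily))
adj = rng.random((n, n)) < p_edge
adj = np.triu(adj, 1)
adj = (adj | adj.T).astype(int)  # symmetric adjacency, no self-loops

# Each node estimates the minority share from its own neighbourhood.
neighbor_counts = adj.sum(axis=1)
minority_neighbors = adj @ groups
perceived = np.divide(minority_neighbors, neighbor_counts,
                      out=np.zeros(n, dtype=float), where=neighbor_counts > 0)

print(f"true minority share: {groups.mean():.2f}")
print(f"majority nodes' mean perceived minority share: {perceived[groups == 0].mean():.2f}")
print(f"minority nodes' mean perceived minority share: {perceived[groups == 1].mean():.2f}")
```

With the assumed homophily of 0.8, majority nodes see mostly majority neighbours and so perceive the minority as smaller than it is, while minority nodes perceive it as larger, mirroring the pattern Galesic describes.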

“How does your body respond to feelings of moral outrage? It depends on your politics” [From University of Southern California/Medical Xpress]

“How does your body respond to feelings of moral outrage? It depends on your politics”

by University of Southern California

https://medicalxpress.com/news/2020-01-body-moral-outrage-politics.html

When you see someone being unfair, disloyal or uncaring toward others, do you feel a sense of moral outrage in the form of a twisting stomach, pounding heart or flushing face? And is it possible that your body’s response depends on your political affiliation?

Researchers with the University of Southern California Brain and Creativity Institute (BCI) set out to examine how and where emotions associated with violations of moral concerns are experienced in the body, and whether political orientation plays a role.

“Our study finds that liberals and conservatives feel moral violations in different areas of their bodies, interpret them as distinct complex feelings and make different moral and political judgements,” said Morteza Dehghani, assistant professor of psychology and computer science at the BCI and the USC Dornsife College of Letters, Arts and Sciences. “This was particularly true for perceptions of feelings of loyalty and purity.”

The research was published today in Psychological Science.

Liberals and conservatives: Wired differently?

Prior research has shown liberals and conservatives rely on different moral foundations and react differently to violations of morals. The authors say their study is the first to indicate that political orientation influences where and how violations of specific moral concerns—including care, fairness, purity, loyalty and authority—are felt in the body. For example, liberals feel violations of purity in their crotch area, chest and slightly in their heads while conservatives feel these violations almost exclusively—and very strongly—in their heads.

***

“Body Maps of Moral Concerns

Mohammad Atari, Aida Mostafazadeh Davani, Morteza Dehghani

First Published January 8, 2020, Research Article

https://doi.org/10.1177/0956797619895284

https://journals.sagepub.com/doi/full/10.1177/0956797619895284

Abstract

It has been proposed that somatosensory reaction to varied social circumstances results in feelings (i.e., conscious emotional experiences). Here, we present two preregistered studies in which we examined the topographical maps of somatosensory reactions associated with violations of different moral concerns. Specifically, participants in Study 1 (N = 596) were randomly assigned to respond to scenarios involving various moral violations and were asked to draw key aspects of their subjective somatosensory experience on two 48,954-pixel silhouettes. Our results show that body patterns corresponding to different moral violations are felt in different regions of the body depending on whether individuals are classified as liberals or conservatives. We also investigated how individual differences in moral concerns relate to body maps of moral violations. Finally, we used natural-language processing to predict activation in body parts on the basis of the semantic representation of textual stimuli. We replicated these findings in a nationally representative sample in Study 2 (N = 300). Overall, our findings shed light on the complex relationships between moral processes and somatosensory experiences.

(…)

Whether moral judgment is a product of reason or emotion has been an ongoing debate among philosophers and psychologists for decades. When moral psychology separated itself from moral philosophy, it almost exclusively focused on reasoning rather than on affective aspects of morality. The first empirical attempts in moral psychology started by examining cognitive-developmental components of understanding fairness and rules (Kohlberg, 1971; Piaget, 1948). But subsequently, as the field expanded, there was an increasing interest in the affective components of morality. Accumulating evidence suggests that emotion can ensue from, amplify, or directly cause moral judgment (Avramova & Inbar, 2013). Irrespective of the exact nature of the relationship between emotion and morality, distinct emotions are known to be associated with specific moral concerns as well as moral violations (Haidt, 2003). Emotions are neural and somatic events that have the evolutionary function of preparing an organism to respond adaptively to a change in social or physical circumstances (Darwin, 1872). Once emotions are induced, individuals can consciously experience them by constructing a feeling, that is, generating a conscious mental experience (Damasio, 1999). Constructing feelings of emotions depends on brain systems that map and regulate body responses (Damasio & Carvalho, 2013). Both classic and modern theories of emotion postulate that interoception—the sensing of physiological feedback from the body and its visceral organs—is essential for emotional experience (Damasio, 1999; James, 1994; Schachter & Singer, 1962). The link between interoception and emotion continues to be supported by various studies. For example, Barrett, Quigley, Bliss-Moreau, and Aronson (2004) found that arousal focus, the extent to which individuals emphasize the changes of feelings in their verbal reports of experienced emotion, is related to interoceptive sensitivity. Individuals who were sensitive to their heartbeat change in response to emotion-arousal images reported more intense emotional experiences compared with less sensitive individuals (Barrett et al., 2004), supporting the association between body feedback and emotional states.”
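The abstract mentions using natural-language processing to predict body-part activation from the semantic content of the scenarios. A hypothetical sketch of that general idea (an invented mini-dataset and a simple TF-IDF plus ridge-regression pipeline, not the authors’ stimuli or modelling pipeline):

```python
# Hypothetical sketch: predicting a coarse body-region activation profile
# from the text of a moral-violation scenario. Scenarios and activation
# scores below are invented for illustration only.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Toy scenarios with made-up activation scores for three body regions
# (head, chest, stomach), each in [0, 1].
scenarios = [
    "someone cheats a stranger out of money",
    "someone eats food that fell on a dirty floor",
    "someone betrays a close friend's secret",
    "someone disrespects a judge in court",
]
activations = np.array([
    [0.6, 0.3, 0.2],   # head, chest, stomach (invented values)
    [0.2, 0.3, 0.8],
    [0.4, 0.7, 0.3],
    [0.7, 0.4, 0.1],
])

# Text is mapped to TF-IDF features; ridge regression predicts all three
# body-region scores at once (multi-output regression).
model = make_pipeline(TfidfVectorizer(), Ridge(alpha=1.0))
model.fit(scenarios, activations)

new_scenario = ["someone drinks from a cup a sick person used"]
print(model.predict(new_scenario))  # predicted [head, chest, stomach] profile
```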

“Why Evangelicals Are Hardwired to Believe Certain Falsehoods” – by Bobby Azarian [Mind In The Machine]

“Why Evangelicals Are Hardwired to Believe Certain Falsehoods”

This brain quirk makes gaslighting particularly easy.

Posted Dec 31, 2019

Bobby Azarian Ph.D.
Mind In The Machine

https://www.psychologytoday.com/us/blog/mind-in-the-machine/201912/why-evangelicals-are-hardwired-believe-certain-falsehoods

(…)

One reason Trump supporters believe him comes from a basic fact about the brain: it takes more mental effort to reject an idea as false than to accept it as true. In other words, it’s easier to believe than to not.

This fact is based on a landmark study [https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0007272] published in the journal PLOS ONE in 2009, which asked the simple question, how is the brain activated differently during a state of belief compared to a state of disbelief? To test this, participants were asked whether or not they believed in a series of statements while their brain activity was being imaged by an fMRI scanner. Some sentences were simple and fact-based (California is larger than Rhode Island), while others were more abstract and subjective (God probably does not exist). The results showed the activation of distinct but often overlapping brain areas in the belief and disbelief conditions.

While these imaging results are complicated to interpret, the electrical patterns also showed something that was fairly straightforward. Overall, there was greater brain activation that persisted for longer during states of disbelief. Greater brain activation requires more cognitive resources, of which there is a limited supply. What these findings show is that the mental process of believing is simply less work for the brain, and therefore often favored. The default state of the human brain is to accept what we are told, because doubt takes effort. Belief, on the other hand, comes easily.

This troubling finding makes sense from an evolutionary standpoint. If children questioned every single fact they were being taught, learning would occur at a rate so slow that it would be a hindrance. But this fact could be just as easily applied to both the political left and right.

For Christian fundamentalists, being taught to suppress critical thinking begins at a very early age. It is the combination of the brain’s vulnerability to believing unsupported facts and aggressive indoctrination that create the perfect storm for gullibility. Due to the brain’s neuroplasticity, or ability to be sculpted by lived experiences, evangelicals literally become hardwired to believe far-fetched statements.”

***

Harris S, Kaplan JT, Curiel A, Bookheimer SY, Iacoboni M, et al. (2010) The Neural Correlates of Religious and Nonreligious Belief. PLOS ONE 5(1).

https://doi.org/10.1371/annotation/7f0b174d-ab93-4844-8305-1de22836aab8

“Can Empathic Concern Actually Increase Political Polarization?” – By Scott Barry Kaufman

From Beautiful Minds

Can Empathic Concern Actually Increase Political Polarization?

New research suggests that those who display the most concern for others are also the most socially polarized

By Scott Barry Kaufman on November 6, 2019

https://blogs.scientificamerican.com/beautiful-minds/can-empathic-concern-actually-increase-political-polarization/

One recent survey found that among those who are highly engaged in politics, 70% of Democrats and 62% of Republicans say they are “afraid” of the other party, and a near majority of Democrats and Republicans report being angry with the opposing party and see the opposing party as a threat to the nation’s well-being.

Obama has proposed that a major source of this political conflict is an “empathy gap”. But what if the reality is far more complex, and empathy in certain circumstances is actually the problem?

(…)

While empathic concern is often assumed to be a universal good, there are many cases in which empathy does not live up to its promise. Even those who score high on psychological tests of empathy aren’t always empathic.* After all, empathy is hard work. As a result, people often choose to avoid empathy, viewing it as just not worth the effort.

One important factor is the nature of the relationship with another person. Research shows that the suffering of a perceived member of an outgroup dampens the empathic response compared to empathic concern for an ingroup member’s suffering.

(…)

What about within the realm of politics? Are we all just treating politics as though it were one big sports game? In this extremely partisan climate, it certainly seems so. As political psychologist Lilliana Mason put it, “a partisan behaves more like a sports fan than like a banker choosing an investment. Partisans feel emotionally connected to the welfare of the party; they prefer to spend time with other members of the party; and when the party is threatened, they become angry and work to help conquer the threat, even if they disagree with some of the issue positions taken by the party.”

“Strength of conviction won’t help to persuade when people disagree” [Medical Xpress/University College London]

“Strength of conviction won’t help to persuade when people disagree

by University College London

https://medicalxpress.com/news/2019-12-strength-conviction-wont-people.html

If you disagree with someone, it might not make any difference how certain they say they are, as during disagreement your brain’s sensitivity to the strength of people’s beliefs is reduced, finds a study led by UCL and City, University of London.

The brain scanning study, published in Nature Neuroscience, reveals a new type of confirmation bias that can make it very difficult to alter people’s opinions.

“We found that when people disagree, their brains fail to encode the quality of the other person’s opinion, giving them less reason to change their mind,” said the study’s senior author, Professor Tali Sharot (UCL Psychology & Language Sciences).

(…)

Professor Sharot added: “Opinions of others are especially susceptible to the confirmation bias, perhaps because they are relatively easy to dismiss as subjective. Because humans make the vast majority of decisions—including professional, personal, political and purchase decisions—based on information received from others, the identified bias in using the strength of others’ opinions is likely to have a profound effect on human behaviour.”

***

Confirmation bias in the utilization of others’ opinion strength

Andreas Kappes, Ann H. Harvey, Terry Lohrenz, P. Read Montague & Tali Sharot

Nature Neuroscience (2019)

Abstract

Humans tend to discount information that undermines past choices and judgments. This confirmation bias has significant impact on domains ranging from politics to science and education. Little is known about the mechanisms underlying this fundamental characteristic of belief formation. Here we report a mechanism underlying the confirmation bias. Specifically, we provide evidence for a failure to use the strength of others’ disconfirming opinions to alter confidence in judgments, but adequate use when opinions are confirmatory. This bias is related to reduced neural sensitivity to the strength of others’ opinions in the posterior medial prefrontal cortex when opinions are disconfirming. Our results demonstrate that existing judgments alter the neural representation of information strength, leaving the individual less likely to alter opinions in the face of disagreement.”

“To Fight Polarization, Ask, “How Does That Policy Work?”” – by Alex Chesterfield and Kate Coombs

[From Behavioral Scientist]

To Fight Polarization, Ask, “How Does That Policy Work?”

By Alex Chesterfield and Kate Coombs

(…)

Alex’s experience reflects an increasingly split United Kingdom and United States, where ideological and political polarization (defined as the division of attitudes, typically along a single dimension) has evolved into a new “phenomenon of animosity,” according to political scientist Shanto Iyengar and colleagues. That phenomenon is affective polarization—when ordinary people “increasingly dislike and distrust those from the other party.” Research from 2010 shows, for instance, that nearly half of Republicans, and about one third of Democrats, said they would feel “somewhat or very unhappy” if their child married a member of the opposing party. This was around 5 percent in the 1960s.

(…)

In one U.S. study, when researchers asked participants of various political affiliations to explain how different policies, such as instituting a cap-and-trade system, would bring about specific outcomes, like reduced carbon emissions, they moderated their attitudes and reduced their donations to relevant advocacy groups. In contrast, the participants who simply provided a list of reasons that they supported a policy did not. These findings, the researchers conclude, could imply a connection between polarized attitudes and overly simplistic mental models for how policies actually work.

(…)

Alex Chesterfield
Alex Chesterfield is a behavioral scientist working in financial services. She is an associate of the Depolarization Project, based in Stanford, California. Previously, she was an elected local government councillor in the U.K. and has an M.Sc. from UCL in cognitive and decision science.

Kate Coombs
Kate Coombs is a behavioural scientist working in financial services. She holds an MSc in Cognitive and Decision Sciences from University College London.”

Jonathan Haidt Explains How Social Media Drives Polarization

“In a time of heightened political tension, Jonathan Haidt has a good idea of what’s driving this polarized atmosphere around the world. He is a social psychologist who believes social media has transformed in recent years to become an “outrage machine,” spreading anger and toxicity. He sits down with Hari to discuss this difficult problem and what the possible solutions could be.”

https://www.pbs.org/wnet/amanpour-and-company/video/jonathan-haidt-explains-how-social-media-drives-polarization/