Shell Shock & PTSD: What’s the Difference?

Introduction

“Men who went to war and came back broken inside—they called it shell shock.”

This line from a World War I diary captures the confusion and anguish surrounding a poorly understood condition. Today, we recognise these symptoms under the broader term post-traumatic stress disorder (PTSD). Understanding the evolution from “shell shock” to PTSD highlights how society’s perception of trauma has matured and informs modern treatment approaches. This article briefly explores the historical context, symptoms, treatments, and societal impacts of shell shock and PTSD, revealing their similarities and key differences.

Historical Overview of “Shell Shock”


The term “shell shock” originated during World War I as soldiers experienced unexplained symptoms after exposure to the horrors of combat. The condition was initially thought to result from physical injuries caused by shell explosions; common symptoms included trembling, nightmares, paralysis, and an inability to focus or function.

Early treatments varied widely, from rest and basic counselling to controversial interventions like electric shock therapy. As cases increased, many began to suspect that prolonged exposure to battlefield stress played a significant role in these conditions. However, understanding was limited, and responses were inconsistent.

Some soldiers were stigmatised, labelled as cowards, or even punished for what was seen as a lack of discipline. Despite these challenges, the recognition of shell shock marked a turning point in acknowledging the psychological toll of war.

The Emergence of PTSD

The concept of shell shock evolved over decades, influenced by experiences in subsequent conflicts. During World War II, similar symptoms were termed “combat stress reaction” or “battle fatigue”.

However, it was not until 1980 that the American Psychiatric Association (APA) formally recognised PTSD in the third edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-III). This recognition was primarily influenced by research on Vietnam War veterans, Holocaust survivors, and sexual trauma victims.

Unlike shell shock, PTSD acknowledges that trauma could result from various experiences beyond combat, including natural disasters, accidents, and interpersonal violence. The DSM criteria for PTSD have been revised multiple times since its introduction, reflecting ongoing research and evolving understanding of trauma responses.

Today, PTSD is recognised as a common condition, with lifetime prevalence rates estimated at 4% for American men and 10% for American women.

Key Differences Between Shell Shock and PTSD

The distinctions between shell shock and PTSD reflect advances in both medical science and societal attitudes toward trauma. Shell shock was considered specific to World War I soldiers and often linked to physical injuries or battlefield stress.

By contrast, PTSD is a formal diagnosis encompassing a wide range of traumas, from combat to civilian experiences. While shell shock symptoms like paralysis, trembling, and mutism were prominent, PTSD is marked by flashbacks, avoidance, and hyperarousal.

Treatments have also evolved significantly; early responses to shell shock were inconsistent and sometimes punitive, whereas modern PTSD care includes evidence-based therapies like cognitive-behavioural therapy (CBT), eye movement desensitisation and reprocessing (EMDR), and medication.

Continuities between Shell Shock and PTSD

Despite differences in terminology and understanding, shell shock and PTSD share common ground in highlighting the deep and lasting impact of trauma. Both conditions recognise that exposure to extreme stress or life-threatening situations can have profound psychological impacts.

Common symptoms persist across both diagnoses, including nightmares, hypervigilance, and emotional numbing. The struggle to reintegrate into civilian life after traumatic experiences remains a challenge for veterans of all eras.

Both conditions have forced society and medical professionals to confront the invisible wounds of war and other traumas. The long-term impacts of untreated psychological trauma, whether labelled as shell shock or PTSD, can be devastating and enduring. Recognition of these continuities has been crucial in developing effective treatments and support systems.

It underscores the importance of early intervention and the need for ongoing research to better understand and address the complex nature of trauma responses, regardless of the terminology used to describe them.

Modern Implications

The journey from shell shock to PTSD offers valuable lessons for addressing trauma in military and civilian populations alike. Understanding the history of trauma-related conditions can help reduce stigma and encourage those affected to seek help.

Today, cutting-edge therapies such as ACT (Acceptance and Commitment Therapy), EMDR, CBT and medication provide effective options for managing PTSD. These advances reflect society’s growing commitment to holistic care.

Social workers, particularly those trained through programmes like a Masters in Social Work online, are well-equipped to provide trauma-informed interventions that integrate historical understanding with modern approaches.

Summary

Acknowledging the universal nature of trauma fosters a sense of shared humanity, helping individuals and communities heal together. By learning from the past and applying innovative treatments, we can continue to improve outcomes for trauma survivors and honour their resilience.

The evolution from shell shock to PTSD reveals society’s growing understanding of trauma and its effects. While the conditions differ in historical context and clinical definition, they share a common thread: the profound impact of traumatic experiences. Recognising this journey reminds us of the importance of addressing trauma compassionately, ensuring that those who endure it are never left to face it alone.

An Overview of Psychological Inertia

Introduction

Psychological inertia is the tendency to maintain the status quo (or default option) unless a psychological motive compels one to intervene or to reject that option.

Psychological inertia is similar to the status quo bias, with one important distinction: psychological inertia involves inhibiting any action at all, whereas the status quo bias involves avoiding any change that would be perceived as a loss.

Research into psychological inertia is limited, particularly into its causes, but it has been observed to affect decision-making by leading individuals to choose or prefer the default option automatically, even when a more beneficial option is available, unless they are motivated to reject it. For example, psychological inertia may cause individuals to hold on to their investments longer than they should, despite information telling them otherwise, so that they suffer greater losses than they would have if they had disinvested earlier.

Psychological inertia has also been shown to be relevant in health, crime, and the workplace.

Refer to Knowledge Inertia and Social Inertia.

Loss Aversion vs Psychological Inertia

David Gal and Derek Rucker have suggested that psychological inertia could be a more suitable explanation than loss aversion for phenomena such as the status quo bias and the endowment effect.

Status Quo Bias

The psychological inertia account asserts that individuals remain at the status quo because they lack a psychological motive to change their behaviour, not because they have weighed up the losses and gains of the decision. Both explanations were tested in a study by David Gal in which subjects were asked to imagine that they owned a quarter minted in either Denver or Philadelphia. They were then given the choice of exchanging their coin for one minted in the other city, assuming insignificant time and effort involved in the process. It was found that 85% of participants chose to retain their original coin, which the inertia account explains as remaining at the status quo. The loss aversion account, however, cannot explain this decision, as it offers no reason for a propensity towards the status quo when the two options are of equivalent value.

Endowment Effect

The endowment effect, i.e. the greater value placed on objects that are owned than on those that are not, has commonly been attributed to loss aversion. This was demonstrated in Daniel Kahneman’s 1990 study, in which participants who were given a mug demanded, on average, around seven dollars to part with it, whereas individuals who were not given a mug were only willing to pay, on average, around three dollars for the same mug. This was taken to show that losses exert a greater impact than gains. However, the result could also be seen as evidence for psychological inertia: the participants were provided with identical objects and, being indifferent between them, maintained the status quo because there was no incentive to trade.
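
The asymmetry in the mug study can be expressed as a simple willingness-to-accept versus willingness-to-pay ratio. The calculation below is only a toy illustration using the round dollar figures quoted above; it is not part of the original study’s analysis.

```python
# Toy illustration of the endowment-effect gap, using the approximate
# averages quoted above (not the 1990 study's exact data).

wta = 7.0  # willingness to accept: owners' average asking price ($)
wtp = 3.0  # willingness to pay: non-owners' average offer ($)

# Owners demanded more than twice what non-owners would pay for the
# identical mug.
ratio = wta / wtp
print(f"WTA/WTP ratio: {ratio:.2f}")  # → WTA/WTP ratio: 2.33
```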

Inability to Break with Tradition

The 1998 article “Psychological Inertia” by James Kowalick refers to a company whose president was displeased that company management had little knowledge of what was going on in the manufacturing department. The management team was not approachable and looked down on employees who were not managers. “Remaining behind the sacred doors of one’s managerial office had become quite a tradition.” To address this issue, the president asked each manager to present a manufacturing procedure in detail at the staff meeting while the other managers asked penetrating questions. As a result, within a short time, managers were on the production floor learning the procedures. This form of psychological inertia represents “cultural and traditional programming”.

Examples and Applications

Health

Avolition is understood as a core symptom of schizophrenia, but its drivers are unclear. One possible driver is psychological inertia. It has been argued that because individuals with schizophrenia may be less able to convert their preferences into actions, they may display an increased tendency to maintain a current state, even when they attribute greater value to another available option. These individuals therefore display greater levels of psychological inertia, and since this process inhibits action, its presence could drive avolition. James Gold found that the motivational impairments of schizophrenia may be associated with abnormalities in estimating the “cost” of effortful behaviour, leading to increased psychological inertia, which in turn could lead to increased avolition.

However, research into the links between psychological inertia and schizophrenia is limited, as is research into their relationship to avolition. For example, research is needed to explore whether differences in levels of psychological inertia in individuals with schizophrenia occur only when high levels of inertia must be engaged, or only when the individual displays a high level of avolition. Research has shown, however, that differences in levels of psychological inertia among individuals with schizophrenia are not due only to avolition but could also be caused by attention deficits or action-readiness deficits.

Crime

Psychological inertia is believed to be one explanatory factor in crime continuity, that is, the persistence of criminal behaviour. Glenn Walters’ psychological inertia theorem states that crime continuity is partly caused by cognitive factors that account for the continuity in behaviour between past and future criminality; it derives from his broader ‘lifestyle theory’ model, which explains the overall development of a criminal lifestyle. Walters’ theorem is based upon Newton’s law of inertia, which states that a body will remain in motion until acted upon by an outside force; here the body in motion is criminal behaviour. Within this theorem, Walters identifies six slow-changing variables that, when combined, link past criminality with future criminality. These six cognitive variables are:

  • Criminal thinking (antisocial attitudes and irrational thought patterns)
  • Positive outcome expectancies for crime (belief that crime will have specific positive outcomes)
  • Attribution biases (tendency to view the world as hostile and others as malicious)
  • Efficacy expectations (lack of confidence in one’s ability to avoid criminal opportunities in the future)
  • Goals (i.e. focus on short-term goals which becomes detrimental to long-term goals)
  • Values (pursuit of self-indulgent pleasure and immediate gratification)

The psychological inertia theorem argues that criminal involvement gives rise to these six cognitive variables which then encourage further offending behaviour.

Theories surrounding the expectation of behavioural continuity are a topic of debate in the criminal justice community, but the conventional wisdom that past behaviour is the best predictor of future behaviour has generally led to:

“an expectation that offenders with histories of criminal violence in the community are at increased risk for disruptive conduct in prison [and] has been operationalized as a routine component in prison risk classifications”.

Workplace

Psychological inertia has been found to be prevalent in change management within the workplace, because change away from the status quo, which may bring new roles and responsibilities, causes individuals to feel anxiety and fear. A variety of interventions have been suggested to overcome this inertia, including providing fuller information, such as explaining the benefits that the change will bring, so that people feel less anxious and more motivated to carry out the change.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Psychological_inertia >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

An Overview of Rationalisation (Defence Mechanism)

Introduction

Rationalisation is a defence mechanism (ego defence) in which apparent logical reasons are given to justify behaviour that is motivated by unconscious instinctual impulses. It is an attempt to find reasons for behaviours, especially one’s own. Rationalisations are used to defend against feelings of guilt, maintain self-respect, and protect oneself from criticism.

Rationalisation happens in two steps:

  • A decision, action, judgement is made for a given reason, or no (known) reason at all.
  • A rationalisation is performed, constructing a seemingly good or logical reason, as an attempt to justify the act after the fact (for oneself or others).

Rationalisation encourages irrational or unacceptable behaviour, motives, or feelings and often involves ad hoc hypothesising. This process ranges from fully conscious (e.g. presenting an external defence against ridicule from others) to mostly unconscious (e.g. creating a block against internal feelings of guilt or shame). People rationalise for various reasons, sometimes because they think they know themselves better than they do. Rationalisation may diverge from the original, deterministic explanation of the behaviour or feeling in question.

Brief History

Quintilian and classical rhetoric used the term colour for the presenting of an action in the most favourable possible perspective. Laurence Sterne in the eighteenth century took up the point, arguing that, were a man to consider his actions, “he will soon find, that such of them, as strong inclination and custom have prompted him to commit, are generally dressed out and painted with all the false beauties [color] which, a soft and flattering hand can give them”.

DSM Definition

According to the DSM-IV, rationalisation occurs “when the individual deals with emotional conflict or internal or external stressors by concealing the true motivations for their own thoughts, actions, or feelings through the elaboration of reassuring or self-serving but incorrect explanations”.

Examples

Individual

Rationalisation can be used to avoid admitting disappointment: “I didn’t get the job that I applied for, but I really didn’t want it in the first place.”

Egregious rationalisations intended to deflect blame can also take the form of ad hominem attacks or DARVO. Some rationalisations take the form of a comparison. Commonly, this is done to lessen the perception of an action’s negative effects, to justify an action, or to excuse culpability:

  • “At least [what occurred] is not as bad as [a worse outcome].”
  • In response to an accusation: “At least I didn’t [worse action than accused action].”
  • As a form of false choice: “Doing [undesirable action] is a lot better than [a worse action].”
  • In response to unfair or abusive behaviour from a separate individual or group to the person: “I must have done something wrong if they treat me like this.”

Based on anecdotal and survey evidence, John Banja states that the medical field features a disproportionate amount of rationalisation invoked in the “covering up” of mistakes. Common excuses made are:

  • “Why disclose the error? The patient was going to die anyway.”
  • “Telling the family about the error will only make them feel worse.”
  • “It was the patient’s fault. If he wasn’t so (sick, etc.), this error wouldn’t have caused so much harm.”
  • “Well, we did our best. These things happen.”
  • “If we’re not totally and absolutely certain the error caused the harm, we don’t have to tell.”
  • “They’re dead anyway, so there’s no point in blaming anyone.”

In 2018, Muel Kaptein and Martien van Helvoort developed a model, called the Amoralisations Alarm Clock, that covers all existing amoralisations in a logical way. Amoralisations, also called neutralisations, or rationalisations, are defined as justifications and excuses for deviant behaviour. Amoralisations are important explanations for the rise and persistence of deviant behaviour. There exist many different and overlapping techniques of amoralisations.

Collective

  • Collective rationalisations are regularly constructed for acts of aggression, based on exaltation of the in-group and demonisation of the opposite side: as Fritz Perls put it, “Our own soldiers take care of the poor families; the enemy rapes them”.
  • Celebrity culture can be seen as rationalising the gap between rich and poor, powerful and powerless, by offering participation to both dominant and subaltern views of reality.

Criticism

Some scientists criticise the notion that brains are wired to rationalise irrational decisions, arguing that evolution would select against spending extra nutrients on mental processes that do not improve decisions, such as rationalising decisions that would have been taken anyway. These scientists argue that rationalising causes one to learn less, not more, from one’s mistakes. They also criticise the hypothesis that rationalisation evolved as a means of social manipulation, noting that if rational arguments were deceptive, there would be no evolutionary advantage in responding to them, which would make them ineffective and incapable of being selected for.

Psychoanalysis

Refer to Psychoanalysis.

Ernest Jones introduced the term “rationalisation” to psychoanalysis in 1908, defining it as “the inventing of a reason for an attitude or action the motive of which is not recognized” – an explanation which (though false) could seem plausible. The term (Rationalisierung in German) was taken up almost immediately by Sigmund Freud to account for the explanations offered by patients for their own neurotic symptoms.

As psychoanalysts continued to explore the glossed-over unconscious motives, Otto Fenichel distinguished different sorts of rationalisation: the justifying of irrational instinctive actions on the grounds that they were reasonable or normatively validated, and the rationalising of defensive structures whose purpose is unknown on the grounds that they have some quite different but somehow logical meaning.

Later psychoanalysts are divided between a positive view of rationalisation as a stepping stone on the way to maturity, and a more destructive view of it as splitting feeling from thought, and so undermining the powers of reason.

Cognitive Dissonance

Leon Festinger highlighted in 1957 the discomfort caused to people by awareness of their inconsistent thought. Rationalisation can reduce such discomfort by explaining away the discrepancy in question, as when people who take up smoking after previously quitting decide that the evidence for it being harmful is less than they previously thought.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Rationalization_(psychology) >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

An Overview of the McNamara Fallacy

Introduction

The McNamara fallacy (also known as the quantitative fallacy), named for Robert McNamara, the US Secretary of Defence from 1961 to 1968, involves making a decision based solely on quantitative observations (or metrics) and ignoring all others. The reason given is often that these other observations cannot be proven.

“But when the McNamara discipline is applied too literally, the first step is to measure whatever can be easily measured. The second step is to disregard that which can’t easily be measured or given a quantitative value. The third step is to presume that what can’t be measured easily really isn’t important. The fourth step is to say that what can’t be easily measured really doesn’t exist. This is suicide.” (Daniel Yankelovich, “Interpreting the New Life Styles”, Sales Management, 1971).

The quote originally referred to McNamara’s ideology during the two months that he was president of Ford Motor Company, but has since been interpreted to refer to his attitudes during the Vietnam War.

Refer to the Streetlight Effect.

Examples in Warfare

Vietnam War

The McNamara fallacy is often considered in the context of the Vietnam War, in which enemy body counts were taken to be a precise and objective measure of success. War was reduced to a mathematical model: By increasing estimated enemy deaths and minimising one’s own, victory was assured. Critics such as Jonathan Salem Baskin and Stanley Karnow noted that guerrilla warfare, widespread resistance, and inevitable inaccuracies in estimates of enemy casualties can thwart this formula.
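
The body-count model described above can be sketched as a decision rule that consumes only what is countable. This is a hypothetical illustration of the fallacy’s structure; the function name and numbers are invented, and nothing here reflects real data.

```python
# Hypothetical sketch of a metric-only assessment in the spirit of the
# body-count model: success is declared purely from a kill ratio.

def metric_only_assessment(enemy_losses: int, own_losses: int) -> str:
    """Declare 'progress' from the ratio of countable losses alone.

    Anything without a number (morale, popular support, the accuracy
    of the casualty estimates themselves) simply has no input here,
    which is precisely the fallacy.
    """
    ratio = enemy_losses / max(own_losses, 1)  # avoid division by zero
    return "progress" if ratio > 1 else "setback"

# The rule reports 'progress' even if every unmeasured factor is
# collapsing, because unmeasured factors cannot affect the output.
print(metric_only_assessment(enemy_losses=200, own_losses=50))  # → progress
```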

US Air Force Brigadier General Edward Lansdale reportedly told McNamara, who was trying to develop a list of metrics to allow him to scientifically follow the progress of the war, that he was not considering the feelings of the common rural Vietnamese people. McNamara wrote it down on his list in pencil, then erased it and told Lansdale that he could not measure it, so it must not be important.

McNamara’s interest in quantitative figures is also seen in Project 100,000, also known as McNamara’s Folly: by lowering admission standards to the military, enlistment was increased. Key to this decision was the idea that one soldier is, in the abstract, more or less equal to another, and that with the right training and superior equipment, he would factor positively in the mathematics of warfare. Inductees of the project died at three times the rate of soldiers who met the earlier standards.

Global War on Terror

Donald Rumsfeld, US Secretary of Defence under George W. Bush, sought to prosecute wars with better data, clear objectives, and achievable goals. Writes Jon Krakauer:

“… the sense of urgency attached to the mission came from little more than a bureaucratic fixation on meeting arbitrary deadlines so missions could be checked off a list and tallied as ‘accomplished’. This emphasis on quantification has always been a hallmark of the military, but it was carried to new heights of fatuity during Donald Rumsfeld’s tenure at The Pentagon. Rumsfeld was obsessed with achieving positive ‘metrics’ that could be wielded to demonstrate progress in the Global War on Terror.” (Jon Krakauer, Where Men Win Glory).

In Modern Clinical Trials

There has been discussion of the McNamara fallacy in the medical literature. In particular, it has been invoked to describe the inadequacy of using progression-free survival (PFS) as the sole primary endpoint in clinical trials of agents for metastatic solid tumours: PFS is chosen simply because it is readily measurable, while failing to capture outcomes that are more meaningful, such as overall survival or overall quality of life.

In Competitive Admissions Processes

In competitive admissions processes – such as those used for graduate medical education – evaluating candidates using only numerical metrics results in ignoring non-quantifiable factors and attributes which may ultimately be more relevant to the applicant’s success in the position.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/McNamara_fallacy >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

An Overview of the Streetlight Effect

Introduction

The streetlight effect, or the drunkard’s search principle, is a type of observational bias that occurs when people only search for something where it is easiest to look. Both names refer to a well-known joke:

A policeman sees a drunk man searching for something under a streetlight and asks what the drunk has lost. He says he lost his keys and they both look under the streetlight together. After a few minutes the policeman asks if he is sure he lost them here, and the drunk replies, no, and that he lost them in the park. The policeman asks why he is searching here, and the drunk replies, “this is where the light is”.

The anecdote is attributed to Nasreddin. According to Idries Shah, this tale is used by many Sufis, commenting upon people who seek exotic sources for enlightenment. Outside of the Nasreddin corpus, the anecdote goes back at least to the 1920s, and has been used metaphorically in the social sciences since at least 1964, when Abraham Kaplan referred to it as “the principle of the drunkard’s search”. Noam Chomsky, for instance, uses the tale as a picture of how science operates:

“Science is a bit like the joke about the drunk who is looking under a lamppost for a key that he has lost on the other side of the street, because that’s where the light is. It has no other choice.”

Refer to the McNamara Fallacy – Erroneous reasoning based solely on numeric metrics.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Streetlight_effect >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

An Overview of the Cognitive Miser

Introduction

In psychology, the human mind is considered to be a cognitive miser due to the tendency of humans to think and solve problems in simpler and less effortful ways rather than in more sophisticated and effortful ways, regardless of intelligence. Just as a miser seeks to avoid spending money, the human mind often seeks to avoid spending cognitive effort. The cognitive miser theory is an umbrella theory of cognition that brings together previous research on heuristics and attributional biases to explain when and why people are cognitive misers.

The term cognitive miser was first introduced by Susan Fiske and Shelley Taylor in 1984. It is an important concept in social cognition theory and has been influential in other social sciences such as economics and political science.

Simply put, people are limited in their capacity to process information, so they take shortcuts whenever they can.

Assumption

The metaphor of the cognitive miser assumes that the human mind is limited in time, knowledge, attention, and cognitive resources. People do not usually think rationally or cautiously, but use cognitive shortcuts to make inferences and form judgements. These shortcuts include the use of schemas, scripts, stereotypes, and other simplified perceptual strategies instead of careful thinking. For example, people tend to draw correspondent inferences, believing that behaviours correspond to, or are representative of, stable underlying characteristics.

Background

The Naïve Scientist and Attribution Theory

Before Fiske and Taylor’s cognitive miser theory, the predominant model of social cognition was the naïve scientist. First proposed in 1958 by Fritz Heider in The Psychology of Interpersonal Relations, this theory holds that humans think and act with dispassionate rationality whilst engaging in detailed and nuanced thought processes for both complex and routine actions. In this way, humans were thought to think like scientists, albeit naïve ones, measuring and analysing the world around them. Applying this framework to human thought processes, naïve scientists seek the consistency and stability that comes from a coherent view of the world and need for environmental control.

In order to meet these needs, naïve scientists make attributions. Thus, attribution theory emerged from the study of the ways in which individuals assess causal relationships and mechanisms. Through the study of causal attributions, led by Harold Kelley and Bernard Weiner amongst others, social psychologists began to observe that subjects regularly demonstrate several attributional biases including but not limited to the fundamental attribution error.

The study of attributions had two effects: it created further interest in testing the naïve scientist and opened up a new wave of social psychology research that questioned its explanatory power. This second effect helped to lay the foundation for Fiske and Taylor’s cognitive miser.

Stereotypes

According to Walter Lippmann’s arguments in his classic book Public Opinion, people are not equipped to deal with complexity. Attempting to observe things freshly and in detail is mentally exhausting, especially amid busy affairs. The term stereotype is thus introduced: people have to reconstruct a complex situation on a simpler model before they can cope with it, and this simpler model can be regarded as a stereotype. Stereotypes are formed from outside sources identified with people’s interests, and they can be reinforced because people are most impressed by facts that fit their existing philosophy.

In Lippmann’s view, moreover, people are told about the world before they see it. Their behaviour is based not on direct and certain knowledge, but on pictures made by or given to them. Hence, the influence of external factors cannot be neglected in shaping people’s stereotypes.

“The subtlest and most pervasive of all influences are those which create and maintain the repertory of stereotypes.”

That is to say, people live in a second-hand world of mediated reality, in which the simplified model for thinking (i.e. stereotypes) can be created and maintained by external forces. Lippmann suggested that the public “cannot be wise”, since they can easily be misled by an oversimplified reality that is consistent with their pre-existing pictures in the mind, and any disturbance of existing stereotypes will seem like “an attack upon the foundation of the universe”.

Although Lippmann did not directly define the term cognitive miser, stereotypes serve an important function in simplifying people’s thinking. As a cognitive simplification, stereotyping supports the economical management of reality; without it, people would be overwhelmed by the complexity of the real world. Stereotyping has since become a standard topic in sociology and social psychology.

Heuristics

Much of the cognitive miser theory is built upon work done on heuristics in judgment and decision-making, most notably the results Amos Tversky and Daniel Kahneman published in a series of influential articles. Heuristics can be defined as the “judgmental shortcuts that generally get us where we need to go—and quickly—but at the cost of occasionally sending us off course.” In their work, Kahneman and Tversky demonstrated that people rely on different types of heuristics, or mental shortcuts, in order to save time and mental energy. However, relying on heuristics instead of detailed analysis, such as the information processing employed by Heider’s naïve scientist, makes biased information processing more likely. Some of these heuristics include:

  • Representativeness heuristic (the inclination to assign specific attributes to an individual the more closely he or she matches the prototype of that group).
  • Availability heuristic (the inclination to judge the likelihood of an event by the ease with which examples of it come to mind).
  • Anchoring and adjustment heuristic (the inclination to overweight an initial piece of information and then adjust one’s answer insufficiently away from this anchor).
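The anchoring-and-adjustment pattern above can be sketched as a toy numeric model. The anchor values, the true value, and the adjustment factor below are illustrative assumptions, not empirical estimates from the heuristics literature:

```python
# Toy model of the anchoring-and-adjustment heuristic (illustrative only).
# Assumption: people start from an anchor and adjust toward the true value,
# but the adjustment is insufficient (factor < 1), leaving the estimate
# biased toward the anchor.

def anchored_estimate(anchor, true_value, adjustment=0.6):
    """Adjust only part of the way from the anchor toward the true value."""
    return anchor + adjustment * (true_value - anchor)

# Two groups estimate the same quantity (true value 100) from different anchors.
low = anchored_estimate(anchor=20, true_value=100)    # about 68.0
high = anchored_estimate(anchor=200, true_value=100)  # about 140.0
# Both estimates land on their anchor's side of the truth.
```

In this sketch the bias is simply the gap between the estimate and the true value; a larger adjustment factor (closer to 1) would model a more effortful, less anchored judgment.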

The frequency with which Kahneman, Tversky, and other attribution researchers found that individuals employ mental shortcuts to make decisions and assessments laid important groundwork for the overarching idea that individuals and their minds act efficiently rather than analytically.

Cognitive Miser Theory

The wave of research on attributional biases done by Kahneman, Tversky, and others effectively ended the dominance of Heider’s naïve scientist within social psychology. Fiske and Taylor, building upon the prevalence of heuristics in human cognition, offered their theory of the cognitive miser. It is, in many ways, a unifying theory of ad hoc decision-making, which suggests that humans engage in economically prudent thought processes instead of acting like scientists who rationally weigh costs and benefits, test hypotheses, and update expectations based upon the results of the discrete experiments that are our everyday actions. In other words, humans are more inclined to act as cognitive misers, using mental shortcuts to make assessments and decisions about issues and ideas of which they know very little, including issues of great salience. Fiske and Taylor argue that acting as a cognitive miser is rational given the sheer volume and intensity of information and stimuli humans take in. Given individuals’ limited information-processing capabilities, people try to adopt strategies that simplify complex problems. Cognitive misers usually do this in two ways: by disregarding part of the information to reduce their own cognitive load, or by overusing one kind of information to avoid the burden of finding and processing more.

Other psychologists also argue that humans’ cognitively miserly tendency is a primary reason why “humans are often less than rational”. On this view, evolution has made the brain extremely frugal in allocating and using cognitive resources. The basic principle is to save mental energy wherever possible, even when one is required to “use your head”. Unless the cognitive environment meets certain criteria, we will, by default, try to avoid thinking as much as possible.

Implications

The implications of this theory raise important questions about both cognition and human behaviour. In addition to streamlining cognition in complicated, analytical tasks, the cognitive miser approach is also used when dealing with unfamiliar issues and issues of great importance.

Politics

Voting behaviour in democracies is an arena in which the cognitive miser is at work. Acting as a cognitive miser should lead those with expertise in an area to more efficient information processing and streamlined decision-making. However, as Lau and Redlawsk note, acting as a cognitive miser who employs heuristics can have very different results for high-information and low-information voters. They write:

“…cognitive heuristics are at times employed by almost all voters, and that they are particularly likely to be used when the choice situation facing voters is complex… heuristic use generally increases the probability of a correct vote by political experts but decreases the probability of a correct vote by novices.”

In democracies, where no vote is weighted more or less because of the expertise behind its casting, low-information voters acting as cognitive misers can make choices with broad and potentially deleterious consequences for a society.

Samuel Popkin argues that voters make rational choices by using information shortcuts that they receive during campaigns, usually using something akin to a drunkard’s search. Voters use small amounts of personal information to construct a narrative about candidates. Essentially, they ask themselves this:

“Based on what I know about the candidate personally, what is the probability that this presidential candidate was a good governor? What is the probability that he will be a good president?”

Popkin’s analysis is based on one main premise: voters use low information rationality gained in their daily lives, through the media and through personal interactions, to evaluate candidates and facilitate electoral choices.

Economics

Cognitive miserliness could also be one of the contributors to the prisoner’s dilemma in game theory. To save cognitive energy, cognitive misers tend to assume that other people are similar to themselves: habitual co-operators assume that most others are co-operators, and habitual defectors assume that most others are defectors. Experimental research has shown that, because co-operators offer to play more often and fellow co-operators more often accept their offers, co-operators can earn a higher expected payoff than defectors when certain boundary conditions are met.
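The payoff claim above can be illustrated with a toy numeric model of an optional prisoner’s dilemma. All parameters below (the payoff matrix, the population mix, the outside-option payoff, and the accept rates that encode each type’s assumption that others resemble it) are illustrative assumptions, not figures from the cited research:

```python
# Toy optional one-shot prisoner's dilemma (illustrative values only).
# A game is played only if both parties accept; otherwise each gets an
# outside-option payoff. Co-operators, assuming others are like them,
# accept offers often; defectors, making the same projection, accept rarely.

R, S, T, P = 3, 0, 5, 1   # mutual co-op, sucker, temptation, mutual defection
OUTSIDE = 1.0             # payoff when no game takes place
p_coop = 0.6              # assumed fraction of co-operators in the population
accept = {"C": 0.8, "D": 0.3}  # probability each type agrees to play

def expected_payoff(me):
    """Average payoff of a player of type `me` against a random partner."""
    total = 0.0
    for partner, share in (("C", p_coop), ("D", 1 - p_coop)):
        play = accept[me] * accept[partner]          # both must accept
        if me == "C":
            game = R if partner == "C" else S
        else:
            game = T if partner == "C" else P
        total += share * (play * game + (1 - play) * OUTSIDE)
    return total

print(expected_payoff("C"))  # about 1.672
print(expected_payoff("D"))  # about 1.576; co-operators come out ahead here
```

Shrinking the co-operator share or raising the defectors’ accept rate flips the ordering, which is one way of reading the “boundary conditions” mentioned above.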

Mass Communication

A lack of public support for emerging technologies is commonly attributed to a lack of relevant information and to low scientific literacy among the public. Known as the knowledge deficit model, this point of view rests on the idealistic assumptions that educating for science literacy can increase public support of science, and that the focus of science communication should be increasing scientific understanding among the lay public. However, the relationship between information and attitudes towards scientific issues is not empirically supported.

Based on the assumption that human beings are cognitive misers who tend to minimise cognitive costs, low-information rationality was introduced as an empirically grounded alternative for explaining decision-making and attitude formation. Rather than drawing on an in-depth understanding of scientific topics, people make decisions based on other shortcuts or heuristics, such as ideological predispositions or cues from mass media, owing to the subconscious compulsion to use only as much information as necessary. The less expertise citizens initially have on an issue, the more likely they are to rely on these shortcuts. Further, people spend less cognitive effort buying toothpaste than picking a new car, and that difference in information-seeking is largely a function of the costs involved.

The cognitive miser theory thus has implications for persuading the public: attitude formation is a competition between people’s value systems and predispositions (their own interpretive schemata) on an issue and the way public discourses frame it. Framing theory suggests that the same topic will be interpreted differently by audiences if the information is presented in different ways, and changes in audience attitudes are closely connected with re-framing the issue. In this sense, effective communication can be achieved when media provide audiences with cognitive shortcuts or heuristics that resonate with underlying audience schemata.

Risk Assessment

The metaphor of the cognitive miser can also help people draw lessons about risk, the possibility that an undesirable state of reality may occur. People apply a number of shortcuts or heuristics when judging the likelihood of an event, because the rapid answers provided by heuristics are often right. Yet these shortcuts can overlook certain pitfalls. A practical example of the cognitively miserly way of thinking, in the context of a risk assessment of the Deepwater Horizon explosion, is presented below.

  • People have trouble imagining how small failings can pile up to form a catastrophe;
  • People tend to get accustomed to risk. Due to the seemingly smooth current situation, people unconsciously adjust their acceptance of risk;
  • People tend to over-express their faith and confidence in backup systems and safety devices;
  • People assume that complicated technical systems are matched by equally sophisticated governing structures;
  • When concerned with a certain issue, people tend to spread good news and hide bad news; and
  • People tend to think alike if they are in the same field (see also: echo chamber), regardless of their position in a project’s hierarchy.

Psychology

The theory that human beings are cognitive misers also sheds light on dual process theory in psychology. Dual process theory proposes that there are two types of cognitive process in the human mind. Daniel Kahneman described these as intuition (System 1) and reasoning (System 2), respectively.

When processing with System 1, which starts automatically and without control, people expend little to no effort but can generate complex patterns of ideas. When processing with System 2, people actively consider how best to distribute mental effort to process data accurately, and can construct thoughts in an orderly series of steps. These two cognitive processing systems are not separate and interact with each other. Here is an example of how people’s beliefs are formed under the dual process model:

  • System 1 generates suggestions for System 2, with impressions, intuitions, intentions or feelings;
  • If System 1’s proposal is endorsed by System 2, those impressions and intuitions will turn into beliefs, and the sudden inspiration generated by System 1 will turn into voluntary actions;
  • When everything goes smoothly (as is often the case), System 2 adopts the suggestions of System 1 with little or no modification. Herein lies a window for bias to form, as System 2 may be trained to misjudge the accuracy of data derived from observations gathered via System 1.

The reasoning process can be activated to help with the intuition when:

  • A question arises for which System 1 does not generate an answer; or
  • An event is detected that violates the model of the world that System 1 maintains.

Conflicts also exist in this dual process. A brief example provided by Kahneman: when we try not to stare at an oddly dressed couple at the neighbouring table in a restaurant, our automatic System 1 reaction makes us stare at them, and conflict emerges as System 2 tries to control this behaviour.

The dual processing system can also produce cognitive illusions. System 1 operates automatically, taking the easiest shortcut, but often errs, and System 2 may have no clue that an error has occurred. Errors can be prevented only through enhanced monitoring by System 2, which costs considerable cognitive effort.

Limitations

Omission of Motivation

The cognitive miser theory did not originally specify the role of motivation. Fiske’s subsequent research recognised this omission from the metaphor of the cognitive miser: motivation does affect the activation and use of stereotypes and prejudices.

Updates and Later Research

Motivated Tactician

People tend to use heuristic shortcuts when making decisions. But a problem remains: although these shortcuts cannot match effortful thought in accuracy, people need some criterion for choosing the most adequate shortcut. Kruglanski proposed that people are a combination of naïve scientists and cognitive misers: flexible social thinkers who choose between multiple cognitive strategies (i.e. speed/ease versus accuracy/logic) based on their current goals, motives, and needs.

Later models suggest that the cognitive miser and the naïve scientist are two poles of social cognition that are too monolithic. Instead, Fiske, Taylor, Arie W. Kruglanski, and other social psychologists offer an alternative account of social cognition: the motivated tactician. According to this theory, people employ either shortcuts or thoughtful analysis depending on the context and salience of a particular issue. In other words, this theory suggests that humans are, in fact, both naïve scientists and cognitive misers. In this sense, people allocate their cognitive effort strategically rather than passively choosing the most effortless shortcuts, and they can therefore decide to be naïve scientists or cognitive misers depending on their goals.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Cognitive_miser >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

What is a Motivated Tactician?

Introduction

In social psychology, a motivated tactician is someone who shifts between quick-and-dirty cognitively economical tactics and more thoughtful, thorough strategies when processing information, depending on the type and degree of motivation. Such behaviour is a type of motivated reasoning. The idea has been used to explain why people use stereotyping, biases and categorisation in some situations, and more analytical thinking in others.

Brief History

After much research on categorisation and other cognitive shortcuts, psychologists began to describe human beings as cognitive misers: the view that a need to conserve mental resources causes people to use shortcuts in thinking about stimuli, rather than motivations and urges shaping how humans think about their world. Stereotypes and heuristics were used as evidence of the economical nature of human thinking. In recent years, the work of Fiske and Neuberg (1990), Higgins and Molden (2003), Molden and Higgins (2005), and others has led to recognition of the importance of motivational thinking, as contemporary research studies the role of motivation within cognitive processes instead of setting cognition against motivation. Current research does not deny that people are cognitively miserly in certain situations, but it acknowledges that thorough analytic thought does occur in others.

Using this perspective, researchers have begun to describe human beings as “motivated tacticians” who are tactical about how many cognitive resources they use, depending on their intent and level of motivation. Given the complex nature of the world and the occasional need for quick thinking, it would be detrimental for a person to be methodical about everything, while other situations require more focus and attention. Considering human beings as motivated tacticians has become popular because it takes both kinds of situation into account. This concept also considers, and continues to study, what motivates people to use more or fewer mental resources when processing information about the world. Research has found that intended outcome, relevance to the individual, culture, and affect can all influence the way a person processes information.

Goal-Oriented Motivational Thinking

The most prominent explanation of motivational thinking is that the person’s desired outcome motivates him to use more or less cognitive resources while processing a situation or thing. Researchers have divided preferred outcomes into two broad categories:

  • Directional outcomes; and
  • Non-directional outcomes.

The preferred outcome provides the motivation for the level of processing involved.

Individuals motivated by directional outcomes have the intention of accomplishing a specific goal. These goals can range from appearing smart, courageous, or likeable to affirming positive thoughts and feelings about something or someone they are close to or find likeable. Someone motivated by non-directional outcomes, by contrast, may wish to make the most logical and clear decision. Whether a person is motivated by directional or non-directional outcomes depends on the situation and the person’s goals. Confirmation bias is an example of thought processing motivated by directional outcomes: the goal is to affirm previously held beliefs, so one uses less thorough thinking in order to reach it. A person motivated to get the best education, who researches information on colleges and visits schools, is motivated by a non-directional outcome. Evidence for outcome-influenced motivation is illustrated by research on self-serving bias. According to Miller (1976, pp. 901-906):

“Independent of expectancies from prior success or failure, the more personally important a success is in any given situation, the stronger is the tendency to claim responsibility for this success but to deny responsibility for failure.”

Motivation Based on Strategy

Though outcome-based motivation is the most prominent approach to motivated thinking, there is evidence that a person can also be motivated by a preferred strategy of processing information. Rather than being an alternative, this idea is actually a complement to the outcome-based approach. Its proponents hold that a person prefers a specific method of information processing because it usually yields the results they wish to receive, which relates back to the intended outcome as the primary motivation. “Strategy of information processing” means whether a person makes a decision using bias, categories, or analytical thinking. Whether the method is best suited to the situation, or is the most thorough, matters less to the person than its likelihood of yielding the intended result. People feel that their preferred strategy simply “feels right”, and what makes a heuristic or method feel “right” is that it accomplishes the desired goal (e.g. affirming positive beliefs of self-efficacy).

Other Motivations and Approaches

There has been limited research on motivated tactical thinking outside Western countries. One theory experts have suggested is that a person’s culture could play a large role in his or her motivations. Nations like the United States are considered individualistic, while many Asian nations are considered collectivistic. An individualist places importance on the self and is motivated by individual reward and affirmation, while a collectivist sees the world as more group- or culture-based. The difference between the two ways of thinking could affect motivation in information processing. For example, instead of being motivated by self-affirmation, a collectivist would be motivated by more group-affirming goals.

Another theory is that emotions can affect the way a person processes information. Forgas (2000) has stated that current mood can determine the information processing as well as thoroughness of thought. He also mentioned that achieving a desired emotion can influence the level to which information is processed.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Motivated_tactician >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

An Overview of Motivated Forgetting

Introduction

Motivated forgetting is a theorised psychological behaviour in which people may forget unwanted memories, either consciously or unconsciously. It can be considered a defence mechanism, since defence mechanisms are coping techniques used to reduce anxiety arising from unacceptable or potentially harmful impulses. Defence mechanisms are not to be confused with conscious coping strategies.

Thought suppression is a method by which people protect themselves by blocking the recall of these anxiety-arousing memories. For example, if something reminds a person of an unpleasant event, the mind may steer towards unrelated topics. This can induce forgetting without a deliberate intention to forget, yet the action remains motivated. There are two main classes of motivated forgetting: psychological repression, which is unconscious, and thought suppression, which is a conscious form of excluding thoughts and memories from awareness.

Refer to An Overview of Motivated Reasoning and Emotional Reasoning.

Brief History

Neurologist Jean-Martin Charcot was the first to research hysteria as a psychological disorder, in the late 19th century. Sigmund Freud, Joseph Breuer, and Pierre Janet continued the research that Charcot began. These three determined that hysteria was an intense emotional reaction to some form of severe psychological disturbance, and they proposed that incest and other sexual traumas were its most likely cause. The treatment that Freud, Breuer, and Janet agreed upon, named the talking cure, was a method of encouraging patients to recover and discuss their painful memories. During this time, Janet coined the term dissociation, referring to a lack of integration among various memories; he used it to describe the way in which traumatising memories are stored separately from other memories.

The idea of motivated forgetting began with the philosopher Friedrich Nietzsche in 1894. Nietzsche and Sigmund Freud held similar views on the repression of memories as a form of self-preservation. Nietzsche wrote that man must forget in order to move forward, and he held that this process is active: we forget specific events as a defence mechanism.

The publication of Freud’s famous paper, “The Aetiology of Hysteria”, in 1896 led to much controversy regarding the topic of these traumatic memories. Freud stated that neuroses were caused by repressed sexual memories, which suggested that incest and sexual abuse must be common throughout upper and middle class Europe. The psychological community did not accept Freud’s ideas, and years passed without further research on the topic.

It was during World War I and World War II that interest in memory disturbances was piqued again. During this time, many cases of memory loss appeared among war veterans, especially those who had experienced shell shock. Hypnosis and drugs became popular treatments for hysteria during the war. The term post-traumatic stress disorder (PTSD) was introduced upon the appearance of similar cases of memory disturbance among veterans of the Korean War. Forgetting, or the inability to recall a portion of a traumatic event, was considered a key factor in the diagnosis of PTSD.

Ann Burgess and Lynda Holmstrom looked into trauma-related memory loss in rape victims during the 1970s, which prompted a large outpouring of stories related to childhood sexual abuse. It took until 1980 to determine that memory loss from all severe traumas involved the same set of processes.

The False Memory Syndrome Foundation (FMSF) was created in 1992 as a response to the large number of memories claimed to be recovered. The FMSF was created to oppose the idea that memories could be recovered using specific techniques; instead, its members believed that the “memories” were actually confabulations created through the inappropriate use of techniques such as hypnosis.

Theories

There are many theories which are related to the process of motivated forgetting.

The main theory, the motivated forgetting theory, suggests that people forget things because they do not want to remember them, or for some other particular reason. Painful and disturbing memories are made unconscious and very difficult to retrieve, but still remain in storage. Retrieval suppression (the use of inhibitory control to prevent memories from being recalled into consciousness) is one way in which we can stop the retrieval of unpleasant memories through cognitive control. This theory was tested by Anderson and Green using the Think/No-Think paradigm.

The decay theory is another theory of forgetting, referring to the loss of memory over time. When information enters memory, neurons are activated, and the memory is retained as long as those neurons remain active. Activation can be maintained through rehearsal or frequent recall; if it is not, the memory trace fades and decays. This usually occurs in short-term memory. Decay theory is controversial among modern psychologists. Bahrick and Hall disagree with it: they reported that people can remember algebra learnt at school even years later, and that a refresher course brought their skill back to a high standard relatively quickly. These findings suggest that there may be more to forgetting in human memory than trace decay.

Another theory of motivated forgetting is interference theory, which posits that subsequent learning can interfere with and degrade existing memories. This theory was tested by giving participants ten nonsense syllables. Some participants then slept after viewing the syllables, while the others carried on their day as usual. Those who stayed awake recalled the syllables poorly, while the sleeping participants remembered them better, plausibly because the sleeping subjects experienced no interference during the retention interval while the other subjects did. There are two types of interference. Proactive interference occurs when an old, already-learned task interferes with learning a new one; research shows that students who study similar subjects at the same time often experience this. Retroactive interference occurs when learning a new task causes a previously learnt task to be forgotten.

The Gestalt theory of forgetting, originating in Gestalt psychology, suggests that memories are forgotten through distortion, a phenomenon also linked to false memory syndrome. This theory states that when memories lack detail, other information is filled in to make the memory whole, leading to the incorrect recall of memories.

Criticisms

The term recovered memory, also known in some cases as a false memory, refers to the theory that some memories can be repressed by an individual and later recovered. Recovered memories are often used as evidence in cases where the defendant is accused of sexual or some other form of child abuse and the accuser has recently recovered a repressed memory of the abuse. This has created much controversy, and as the use of such evidence in the courts rises, the question has arisen as to whether recovered memories actually exist. In an effort to determine the factuality of false memories, several laboratories have developed paradigms to test whether false repressed memories can be purposefully implanted in a subject. One result was the verbal paradigm, under which someone presented with a number of words associated with a single non-presented word is likely to falsely remember that word as having been presented.

Similar to the verbal paradigm is fuzzy-trace theory, which holds that one encodes two separate things about a memory: the actual information itself and the semantic information surrounding it (the gist). If we are given a series of semantic details surrounding a false event, such as a time and location, we are more likely to falsely remember the event as having occurred. Tied to that is source monitoring theory, which, among other things, holds that emotionally salient events tend to strengthen the memory formed of the event while weakening our ability to remember the source of the event. Source monitoring is centred in the anterior cingulate cortex.

Repressed memory therapy has come under heavy criticism because it is said to follow techniques very similar to those used to purposefully implant a memory in an adult: asking questions about the gist of an event, creating imagery around that gist, and attempting to recover the event from there. Compounded with the fact that most repressed memories are emotionally salient, the likelihood of source confusion is high: one might assume that a child abuse case one merely heard about actually happened to oneself, remembering it with the imagery established through the therapy.

Repression

The idea of psychological repression was developed in 1915 as an automatic defensive mechanism based on Sigmund Freud’s psychoanalytic model in which people subconsciously push unpleasant or intolerable thoughts and feelings into their unconscious.

When situations or memories occur that we are unable to cope with, we push them away. Repression is a primary ego defence mechanism that many psychotherapists readily accept. Numerous studies have supported the psychoanalytic claim that memories of murder, childhood trauma, and sexual abuse can be repressed for a period of time and then recovered in therapy.

Repressed memories can influence behaviour unconsciously, manifesting themselves in our discussions, dreams, and emotional reactions. An example of repression would include a child who is abused by a parent, who later has no recollection of the events, but has trouble forming relationships. Freud suggested psychoanalysis as a treatment method for repressed memories. The goal of treatment was to bring repressed memories, fears and thoughts back to the conscious level of awareness.

Suppression

Thought suppression refers to conscious and deliberate efforts to curtail one’s thoughts and memories. Suppression is goal-directed and includes conscious strategies for forgetting, such as intentional context shifts. For example, someone thinking of unpleasant thoughts, ideas that are inappropriate at the moment, or images that may instigate unwanted behaviours may try to think of anything else in order to push the unwanted thought out of consciousness.

In order to suppress a thought, one must:

  1. Plan to suppress the thought; and
  2. Carry out that plan by suppressing all other manifestations of the thought, including the original plan.

Thought suppression seems to entail a state of knowing and not knowing all at once. It can be assumed that thought suppression is a difficult and even time-consuming task. Even when thoughts are suppressed, they can return to consciousness with minimal prompting, which is why suppression has also been associated with obsessive-compulsive disorder.

Directed Forgetting

Suppression encompasses the term directed forgetting, also known as intentional forgetting. This term refers to forgetting which is initiated by a conscious goal to forget. Intentional forgetting is important at the individual level: suppressing an unpleasant memory of a trauma or a loss that is particularly painful.

The directed forgetting paradigm is a psychological procedure in which information can be forgotten upon instruction. There are two methods: the item method and the list method. In both, participants are instructed to forget some items (the to-be-forgotten items) and to remember others (the to-be-remembered items). The paradigm was originally conceived by Robert Bjork, and the Bjork Learning and Forgetting Lab and members of the Cogfog group performed much important research using it in subsequent years.

In the item method of directed forgetting, participants are presented with a series of randomly ordered to-be-remembered and to-be-forgotten items. After each item, an instruction is given to either remember it or forget it. After the study phase, participants are given a test of all the words presented, having been unaware that they would be tested on the to-be-forgotten items. Recall of the to-be-forgotten words is often significantly impaired compared with the to-be-remembered words, and the directed forgetting effect has also been demonstrated on recognition tests. For this reason, researchers believe that the item method affects episodic encoding.
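Under the simplifying assumption that a forget cue merely weakens encoding, the item-method procedure above can be sketched as a toy simulation (the function name and probabilities are illustrative, not values from the literature):

```python
import random

def item_method_trial(n_items=1000, p_remember=0.8, p_forget=0.4, seed=0):
    """Simulate one item-method study/test cycle and return recall rates.

    Each item is followed by a Remember (R) or Forget (F) cue at study;
    the surprise test probes ALL items. The encoding probabilities are
    illustrative assumptions, not empirical values.
    """
    rng = random.Random(seed)
    hits = {"R": 0, "F": 0}
    totals = {"R": 0, "F": 0}
    for _ in range(n_items):
        cue = rng.choice("RF")                  # post-item instruction
        p = p_remember if cue == "R" else p_forget
        totals[cue] += 1
        if rng.random() < p:                    # encoded items are later recalled
            hits[cue] += 1
    return {c: hits[c] / totals[c] for c in "RF"}

rates = item_method_trial()
print(f"Remember-cued recall rate: {rates['R']:.2f}")
print(f"Forget-cued recall rate:   {rates['F']:.2f}")
```

On any reasonably large item set, the simulated recall rate for forget-cued items comes out lower than for remember-cued items, mirroring the impairment the paradigm demonstrates.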

In the list method, the instruction to forget is given only once, after half of the list has been presented, rather than after each item. The participants are told that the first list they studied was just a practice list and that they should focus their attention on the upcoming list. After the study phase for the first list, a second list is presented. A final test is then given, sometimes for only the first list and other times for both lists, in which participants are asked to remember all the words they studied. Participants told to forget the first list remember less from that list and more from the second list. List-method directed forgetting thus demonstrates the ability to intentionally reduce memory retrieval. To support this account, researchers conducted an experiment in which they asked participants to record two unique events that happened to them each day in a journal over a five-day period. After these five days, the participants were instructed either to remember or to forget the events of those days. They then repeated the process for another five days, after which they were told to remember all the events from both weeks, regardless of the earlier instructions. Participants in the forget group recalled the first week worse than the second week.
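The characteristic cost/benefit pattern of the list method can likewise be sketched as a toy simulation. The retrieval probabilities below are illustrative assumptions only; they model the forget cue as impairing list-1 retrieval while freeing resources for list 2:

```python
import random

def list_method_trial(n_per_list=200, forget_group=True, seed=1):
    """Toy sketch of the list-method procedure for one participant group.

    The mid-experiment forget cue is modelled as lowering list-1 retrieval
    (the cost) and boosting list-2 retrieval (the benefit). All
    probabilities are illustrative assumptions.
    """
    rng = random.Random(seed)
    p_list1 = 0.45 if forget_group else 0.65   # forget cue impairs list 1
    p_list2 = 0.70 if forget_group else 0.55   # ...and helps list 2
    list1 = sum(rng.random() < p_list1 for _ in range(n_per_list))
    list2 = sum(rng.random() < p_list2 for _ in range(n_per_list))
    return list1 / n_per_list, list2 / n_per_list

f1, f2 = list_method_trial(forget_group=True)
r1, r2 = list_method_trial(forget_group=False)
print(f"Forget group:   list 1 = {f1:.2f}, list 2 = {f2:.2f}")
print(f"Remember group: list 1 = {r1:.2f}, list 2 = {r2:.2f}")
```

The forget group shows worse list-1 and better list-2 recall than the remember group, the same pattern reported for the journal experiment above.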

There are two theories that can explain directed forgetting: retrieval inhibition hypothesis and context shift hypothesis.

The retrieval inhibition hypothesis states that the instruction to forget hinders retrieval of the list-one items. On this view, directed forgetting only reduces the retrieval of the unwanted memories rather than causing permanent damage: intentionally forgotten items are difficult to recall, but are recognised if presented again.

The context shift hypothesis suggests that the instruction to forget mentally separates the to-be-forgotten items, placing them in a different context from the second list. The subject’s mental context changes between the first and second lists, and the context of the second list persists at test, impairing recall of the first list.

Psychogenic Amnesia

Motivated forgetting encompasses psychogenic amnesia, which refers to an inability to remember past experiences or personal information due to psychological factors rather than biological dysfunction or brain damage.

Psychogenic amnesia is not part of Freud’s theoretical framework. The memories still exist, buried deeply in the mind, but can resurface at any time, either on their own or when the person is exposed to a trigger in their surroundings. Psychogenic amnesia is generally found where there is profound and surprising forgetting of whole chunks of one’s personal life, whereas motivated forgetting includes more day-to-day cases in which people forget unpleasant memories in a way that would not call for clinical evaluation.

Psychogenic Fugue

Psychogenic fugue, a form of psychogenic amnesia, is a DSM-IV Dissociative Disorder in which people forget their personal history, including who they are, for a period of hours to days following a trauma. A history of depression as well as stress, anxiety or head injury could lead to fugue states. When the person recovers they are able to remember their personal history, but they have amnesia for the events that took place during the fugue state.

Neurobiology

Motivated forgetting occurs as a result of activity within the prefrontal cortex, as discovered by testing subjects while recording functional MRI of their brains. The regions implicated include the anterior cingulate cortex, the intraparietal sulcus, the dorsolateral prefrontal cortex, and the ventrolateral prefrontal cortex. These areas are also associated with stopping unwanted actions, which supports the hypothesis that the suppression of unwanted memories and of unwanted actions follows a similar inhibitory process. These regions are also known to carry out executive functions within the brain.

The anterior cingulate cortex has functions linked to motivation and emotion. The intraparietal sulcus possesses functions that include coordination between perception and motor activities, visual attention, symbolic numerical processing, visuospatial working memory, and determining the intent in the actions of other organisms. The dorsolateral prefrontal cortex plans complex cognitive activities and processes decision making.

The other key brain structure involved in motivated forgetting is the hippocampus, which is responsible for the formation and recollection of memories. When the process of motivated forgetting is engaged, meaning that we actively attempt to suppress our unwanted memories, the prefrontal cortex exhibits higher activity than baseline, while suppressing hippocampal activity at the same time. It has been proposed that the executive areas which control motivation and decision-making lessen the functioning of the hippocampus in order to stop the recollection of the selected memories that the subject has been motivated to forget.

Examples

War

Motivated forgetting has been a crucial aspect of psychological study relating to such traumatising experiences as rape, torture, war, natural disasters, and homicide. Some of the earliest documented cases of memory suppression and repression relate to veterans of the Second World War. The number of cases of motivated forgetting was high during war times, mainly due to factors associated with the difficulties of trench life, injury, and shell shock. At the time that many of these cases were documented, there were limited medical resources to deal with many of these soldiers’ mental well-being. There was also a weaker understanding of the aspects of memory suppression and repression.

Case of a Soldier (1917)

The repression of memories was the treatment prescribed by many doctors and psychiatrists, and was deemed effective for managing these memories. Unfortunately, many soldiers’ traumas were much too vivid and intense to be dealt with in this manner, as described in the journal of Dr. Rivers. One soldier, who entered the hospital after losing consciousness from a shell explosion, is described as having a generally pleasant demeanour, disrupted by sudden onsets of depression occurring approximately every ten days. This intense depression, leading to suicidal feelings, rendered him unfit to return to war. It soon became apparent that these symptoms were due to the patient’s repressed thoughts and apprehensions about returning to war. Rivers suggested that the patient face his thoughts and allow himself to deal with his feelings and anxieties. Although this left the soldier in a significantly less cheery state, he experienced only one more minor bout of depression.

Abuse

Many cases of motivated forgetting have been reported in regard to recovered memories of childhood abuse. Abuse, particularly when perpetrated by relatives or figures of authority, can lead to memory suppression and repression for varying lengths of time. One study indicates that 31% of abuse victims were aware of at least some forgetting of their abuse, and a synthesis of seven studies showed that between one eighth and one quarter of abuse victims experience periods of complete unawareness (amnesia) of the incident or series of events. Factors associated with forgetting abuse include younger age at onset, threats or intense emotions, more types of abuse, and a greater number of abusers. Cued recovery has been shown in 90% of cases, usually with one specific event triggering the memory: the return of incest memories, for example, has been brought on by television programmes about incest, the death of the perpetrator, the abuse of the subject’s own child, and seeing the site of the abuse. In a study by Herman and Schatzow, confirming evidence was found for the same proportion of individuals with continuous memories of abuse as for those with recovered memories: 74% of cases in each group were confirmed. The cases of Mary de Vries and Claudia are examples of confirmed recovered memories of sexual abuse.

Legal Controversy

Motivated forgetting and repressed memories have become a very controversial issue within the court system. Courts are currently dealing with historical cases, in particular a relatively new phenomenon known as historic child sexual abuse (HCSA). HCSA refers to allegations of child abuse having occurred several years prior to the time at which they are being prosecuted.

Unlike most American states, Canada, the United Kingdom, Australia and New Zealand have no statute of limitations limiting the prosecution of historical offences. Legal decision-makers in each case must therefore evaluate the credibility of allegations that may go back many years. Since it is nearly impossible to provide physical evidence for many of these historical abuse cases, it is extremely important to weigh the credibility of the witness and of the accused when deciding the guilt of the defendant.

One of the main arguments against the credibility of historical allegations, involving the retrieval of repressed memories, is found in false memory syndrome. False memory syndrome claims that through therapy and the use of suggestive techniques, clients mistakenly come to believe that they were sexually abused as children.

In the United States, the statute of limitations requires that legal action be taken within three to five years of the incident of interest. Exceptions are made for minors, where the child has until they reach eighteen years of age.

There are many factors related to the age at which child abuse cases may be brought, including bribes, threats, dependency on the abuser, and the child’s ignorance of their state of harm. All of these factors may lead a person who has been harmed to require more time to bring their case, as may the repression or suppression of memories of the abuse, as in the cases of Jane Doe and Jane Roe. In 1981, the statute was adjusted to make an exception for individuals who were not consciously aware that their situation was harmful. This exception, called the discovery rule, is applied by the court as the judge of the case deems necessary.

Psychogenic Amnesia

Severe cases of trauma may lead to psychogenic amnesia, or the loss of all memories occurring around the event.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Motivated_forgetting >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

An Overview of Emotional Reasoning

Introduction

Emotional reasoning is a cognitive process by which an individual concludes that their emotional reaction proves something is true, despite contrary empirical evidence. Emotional reasoning creates an ’emotional truth’, which may be in direct conflict with the inverse ‘perceptional truth’. It can create feelings of anxiety, fear, and apprehension in existing stressful situations, and as such, is often associated with or triggered by panic disorder or anxiety disorder. For example, even though a spouse has shown only devotion, a person using emotional reasoning might conclude, “I know my spouse is being unfaithful because I feel jealous.”

This process amplifies the effects of other cognitive distortions. For example, a student may feel insecure about their understanding of test material even though they are capable of answering the questions. If said student acts on their insecurity about failing the test, they might make the assumption that they misunderstand the material and therefore may guess answers randomly, causing their own failure in a self-fulfilling prophecy.

Emotional reasoning is related to other similar concepts, such as: motivated reasoning, in which individuals reach conclusions from bias rather than empirical motivations; emotional intelligence, which concerns the ways individuals use their emotions to understand situations and information and to reach conclusions; and cognitive distortion or cognitive deficiency, in which individuals misinterpret situations or make decisions without considering the range of consequences.

Refer to Motivated Reasoning and Motivated Forgetting.

Origin

Emotional reasoning, as a concept, was first introduced by psychiatrist Aaron Beck. It was included as a part of Beck’s broader research topic: cognitive distortions and depression. To counteract cognitive distortions, Beck developed a type of therapy formally known as cognitive therapy, which became associated with cognitive-behavioural therapy.

Emotional reasoning has been attributed to automatic thinking: Beck believed that it stemmed from negative thoughts that are uncontrollable and occur without effort, an account that has been commonly accepted over the years. More recently, a newer explanation holds that an “activating agent”, a sensory trigger from the environment, increases emotional arousal, which in turn inhibits certain areas of the brain. The combination of increased emotional arousal and the inhibition of parts of the brain leads to emotional reasoning.

Examples

The following are simple examples of emotional reasoning.

Emotion             | Facts                                                                  | False Conclusion
I feel jealous      | My spouse is apparently faithful and loving.                           | My spouse is unfaithful, because I wouldn’t feel jealous if my spouse were faithful and loving.
I feel lonely       | My friends and family seem to like me and normally treat me well.      | I am unlovable, because I wouldn’t feel lonely if I were lovable.
I feel guilty       | Neither I nor anyone around me is aware of any wrong I’ve done.        | I did something wrong, because I wouldn’t feel guilty unless I had done something wrong.
I feel angry at her | I can’t think of anything upsetting she did or any harm she caused me. | She did something wrong, because I wouldn’t feel angry at her unless she had done something wrong.
I feel stupid       | My academic and professional success is typical or better.             | I am stupid, because I wouldn’t feel stupid or doubt my proven abilities unless I really was stupid.

Treatment

Before seeking professional help, individuals can influence the effect that emotional reasoning has on them through their coping style. A proactive, problem-focused coping style is more effective at reducing stress and deterring stressful events, and good social support also leads to lower psychological stress. If an individual seeks professional help, a psychologist will often use cognitive-behavioural therapy to teach the patient to challenge their cognitive distortions, including emotional reasoning. In this approach, the automatic thoughts that drive emotional reasoning are identified, studied, and reasoned through by the patient; in doing so, the psychologist hopes to change the patient’s automatic thoughts and reduce their stress levels. Cognitive-behavioural therapy is generally regarded as the most effective treatment for emotional reasoning.

Most recently, a new therapeutic approach uses the RIGAAR method to reduce emotional stress. RIGAAR is an abbreviation for: rapport building, information gathering, goal setting, accessing resources, agreeing strategies and rehearsing success.

Reducing emotional arousal is also suggested by the human givens approach in order to counter emotional reasoning. High emotional arousal inhibits brain regions necessary for logical complex reasoning. With less emotional arousal, cognitive reasoning is less affected and it is easier for the subject to disassociate reality from emotions.

Factors

Cognitive schemas are one factor that causes emotional reasoning. A schema is built from how we look at the world and from our real-life experiences, and it helps us remember the important things or events in our lives. Schemas are the result of learning, including classical and operant conditioning. For example, an individual can develop a schema that terrorists and spiders are very dangerous. Based on their schemas, people change what they think and how they are biased in the way they perceive things: the information-processing biases of a schema affect how a person thinks and remembers, and how they understand experiences and information. These biases make a person’s schema automatically access similar schema content; a person with a rat phobia, for example, is more likely to visualise or perceive a rat being near them. Schemas also connect easily with schema-central stimuli: when depressed people start to think about negative things, it can be very difficult for them to think of anything positive.

Regarding memory bias, a schema can shape an individual’s recollections to produce schema-congruent memories. For example, if individuals hold a schema that they are intelligent, failure-related recollections are less likely to be retained, and they become likely to recall positive past events. Schemas also bias the way individuals interpret information; in other words, a schema alters their understanding of it. For example, when people decline to help low-self-esteem children solve a maths problem, the children may conclude that they are too stupid to learn how to solve it, rather than that the other people were too busy to help.

Reduction Techniques

Techniques for reducing emotional reasoning include:

  • Validity testing: Patients defend their thoughts and ideas using objective evidence to support their assumptions. If they cannot, they may be engaging in emotional reasoning.
  • Cognitive reversal: Patients recall a difficult situation from their past and work with the therapist to address and correct the problem. This can prepare the patient for similar situations so that they do not revert to emotional reasoning.
  • Guided discovery: The therapist asks the patient a series of questions designed to help them realise their cognitive distortions.
  • Writing in a journal: Patients form a habit of recording the situations they face, the emotions and thoughts they experience, and their responses or behaviours. The therapist and patient then analyse how the patient’s maladaptive thought patterns influence their behaviour.
  • Homework: Once the patient can perform self-recovery and apply the insights gained in therapy, they are tasked with reviewing sessions and reading related books to consolidate their thoughts and behaviours, which are recorded and reviewed at the next therapy session.
  • Modelling: The therapist uses role-playing to act out different responses to imagined situations so that patients can understand and model the behaviour.
  • Systematic positive reinforcement: A behaviour-oriented therapist uses a reward system (systematic positive reinforcement) to motivate patients and reinforce specific behaviours.

Negative memories and stressful life circumstances can trigger depression, and unresolved life experiences are a main factor in causing it; people who engage in emotional reasoning are more prone to depression. Emotion-focused therapy (EFT) is a form of psychotherapy that can help people find a positive perspective on their emotional processes. EFT is a research-based treatment whose goal is emotional change. Two alternative treatments are cognitive-behavioural therapy (CBT), which emphasises changing self-defeating thoughts and behaviours, and interpersonal therapy (IPT), which emphasises improving people’s skills for interacting with others.

EFT operates on the understanding that a person’s development is influenced by emotional memories and experiences. The purpose of the therapy is to change the emotional process by resurfacing painful emotional experiences and bringing them into awareness. This helps patients differentiate between what they currently experience and the influence of past experiences on how they feel, which can result in greater self-awareness of what they want in their life and enable better decision-making through reduced emotional reasoning. Another purpose of EFT is to promote emotional intelligence: the ability to understand one’s emotions, perceive emotional information, and control one’s behaviour when responding to problems.

Emotion-focused coping is a way of managing one’s emotions to reduce stress and to reduce the chance of emotional reasoning. Cognitive therapy helps patients recognise their negative thought patterns about themselves and about events, revise those patterns, and change their behaviour. Cognitive-behavioural therapy helps individuals perform well on cognitive tasks and rethink their situation in a way that benefits them; the treatment proceeds through learning and through changing maladaptive emotions, thoughts, and behaviours.

Implications

If left untreated, emotional reasoning can have debilitating effects, the most common being depression. However, it has the potential to be useful when appraising the outside world rather than ourselves: how one feels when assessing an object, person, or event can be an instinctual survival response and a way of adapting to the world.

“The amygdala buried deep in the limbic system serves as an early warning device for novelty, precisely so that attention can be mobilized to alert the mind to potential danger and to prepare for a potential of flight or fight.”

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Emotional_reasoning >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

An Overview of Motivated Reasoning

Introduction

Motivated reasoning (motivational reasoning bias) is a cognitive and social response in which individuals, consciously or sub-consciously, allow emotion-loaded motivational biases to affect how new information is perceived. Individuals tend to favour evidence that coincides with their current beliefs and reject new information that contradicts them, despite contrary evidence.

Motivated reasoning overlaps with confirmation bias. Both favour evidence supporting one’s beliefs, at the same time dismissing contradictory evidence. However, confirmation bias is mainly a sub-conscious (innate) cognitive bias. In contrast, motivated reasoning (motivational bias) is a sub-conscious or conscious process by which one’s emotions control the evidence supported or dismissed. For confirmation bias, the evidence or arguments can be logical as well as emotional.

Motivated reasoning can be classified into two categories:

  1. Accuracy-oriented (non-directional), in which the motive is to arrive at an accurate conclusion, irrespective of the individual’s beliefs; and
  2. Goal-oriented (directional), in which the motive is to arrive at a particular conclusion.

Refer to Motivated Forgetting, Emotional Reasoning, and Motivated Tactician.

Definitions

Motivated reasoning, confirmation bias and cognitive dissonance are closely related. Both motivated reasoning and confirmation bias favour evidence supporting one’s beliefs, at the same time dismissing contradictory evidence. Motivated reasoning (motivational bias) is an unconscious or conscious process by which personal emotions control the evidence that is supported or dismissed. However, confirmation bias is mainly an unconscious (innate, implicit) cognitive bias, and the evidence or arguments utilised can be logical as well as emotional. More broadly, it is feasible that motivated reasoning can moderate cognitive biases generally, including confirmation bias.

Individual differences such as political beliefs can moderate the emotional/motivational effect. In addition, social context (groupthink, peer pressure) also partly controls the evidence utilised for motivated reasoning, particularly in dysfunctional societies. Social context moderates emotions, which in turn moderate beliefs.

Motivated reasoning differs from critical thinking, in which beliefs are assessed with a sceptical but open-minded attitude.

Cognitive Dissonance

Individuals are compelled to initiate motivated reasoning to lessen the amount of cognitive dissonance they feel. Cognitive dissonance is the feeling of psychological and physiological stress and unease between two conflicting cognitive and/or emotional elements (such as the desire to smoke, despite knowing it is unhealthy). According to Leon Festinger, there are two paths individuals can engage in to reduce the amount of distress: the first is altering behaviour or cognitive bias; the second, more common path is avoiding or discrediting information or situations that would create dissonance.

Research suggests that reasoning away contradictions is psychologically easier than revising feelings. Emotions tend to colour how “facts” are perceived. Feelings come first, and evidence is used in service of those feelings. Evidence that supports what is already believed is accepted; evidence which contradicts those beliefs is not.

Mechanisms: Cold and Hot Cognition

The notion that motives or goals affect reasoning has a long and controversial history in social psychology. This is because supportive research could be reinterpreted in entirely cognitive non-motivational terms (the hot versus cold cognition controversy). This controversy existed because of a failure to explore mechanisms underlying motivated reasoning.

Early research on how humans evaluated and integrated information supported a cognitive approach consistent with Bayesian probability, in which individuals weighted new information using rational calculations (“cold cognition”). More recent theories endorse these cognitive processes as only partial explanations of motivated reasoning, but have also introduced motivational or affective (emotional) processes (“hot cognition”).

Kunda Theory

Ziva Kunda reviewed research and developed a theoretical model to explain the mechanism by which motivated reasoning results in bias. Motivation to arrive at a desired conclusion provides a level of arousal, which acts as an initial trigger for the operation of cognitive processes. To participate in motivated reasoning, either consciously or subconsciously, an individual first needs to be motivated. Motivation then affects reasoning by influencing the knowledge structures (beliefs, memories, information) that are accessed and the cognitive processes used.

Lodge–Taber Theory

Milton Lodge and Charles Taber introduced an empirically supported model in which affect is intricately tied to cognition, and information processing is biased toward support for positions that the individual already holds. Their model has three components:

  • On-line processing, in which, when called on to make an evaluation, people instantly draw on stored information which is marked with affect;
  • A component by which affect is automatically activated along with the cognitive node to which it is tied; and
  • A “heuristic mechanism” for evaluating new information, which triggers a reflection on “How do I feel?” about this topic. This process results in a bias towards maintaining existing affect, even in the face of disconfirming information.

This theory is developed and evaluated in their book The Rationalizing Voter (2013). David Redlawsk (2002) found that the timing of when disconfirming information was introduced played a role in determining bias. When subjects encounter incongruity during an information search, the automatic assimilation and update process is interrupted. This results in one of two outcomes:

  • Subjects may enhance attitude strength in a desire to support existing affect (resulting in degradation in decision quality and potential bias); or
  • Subjects may counter-argue existing beliefs in an attempt to integrate the new data.

This second outcome is consistent with research on how processing occurs when one is tasked with accuracy goals.

To summarise, the two models differ in that Kunda identifies a primary role for cognitive strategies such as memory processes, and the use of rules in determining biased information selection, whereas Lodge and Taber identify a primary role for affect in guiding cognitive processes and maintaining bias.

Neuroscientific Evidence

A neuroimaging study by Drew Westen and colleagues does not support the use of cognitive processes in motivated reasoning, lending greater support to affective processing as a key mechanism in supporting bias. This study, designed to test the neural circuitry of individuals engaged in motivated reasoning, found that motivated reasoning “was not associated with neural activity in regions previously linked with cold reasoning tasks [Bayesian reasoning] nor conscious (explicit) emotion regulation”.

This neuroscience data suggests that “motivated reasoning is qualitatively distinct from reasoning when people do not have a strong emotional stake in the conclusions reached.” However, if a strong emotion was attached during a previous round of motivated reasoning and is present again when the individual reaches a conclusion, a strong emotional stake becomes attached to that conclusion, and any new information regarding it will cause motivated reasoning to recur. This can create pathways within the neural network that further ingrain the individual’s reasoned beliefs along networks similar to those where logical reasoning occurs, causing the strong emotion to recur each time the individual is confronted with contradictory information. Lodge and Taber refer to this as affective contagion: instead of “infecting” other individuals, the emotion “infects” the individual’s own reasoning pathways and conclusions.

Categories

Motivated reasoning can be classified into two categories:

  1. Accuracy-oriented (non-directional), in which the motive is to arrive at an accurate conclusion, irrespective of the individual’s beliefs; and
  2. Goal-oriented (directional), in which the motive is to arrive at a particular conclusion.

Politically motivated reasoning, in particular, is strongly directional.

Despite their differences in information processing, an accuracy-motivated and a goal-motivated individual can reach the same conclusion, and messages aimed at either motivation can move beliefs in the desired direction. The distinction matters for crafting effective communication: those who are accuracy-motivated respond better to credible evidence tailored to their community, while those who are goal-oriented feel less threatened when the issue is framed to fit their identity or values.

Accuracy-Oriented (Non-Directional) Motivated Reasoning

Several works on accuracy-driven reasoning suggest that when people are motivated to be accurate, they expend more cognitive effort, attend to relevant information more carefully, and process it more deeply, often using more complex rules.

Kunda asserts that accuracy goals delay the process of coming to a premature conclusion, in that accuracy goals increase both the quantity and quality of processing—particularly in leading to more complex inferential cognitive processing procedures. When researchers manipulated test subjects’ motivation to be accurate by informing them that the target task was highly important or that they would be expected to defend their judgments, it was found that subjects utilised deeper processing and that there was less biasing of information. This was true when accuracy motives were present at the initial processing and encoding of information. In reviewing a line of research on accuracy goals and bias, Kunda concludes, “several different kinds of biases have been shown to weaken in the presence of accuracy goals”. However, accuracy goals do not always eliminate biases and improve reasoning: some biases (e.g. those resulting from using the availability heuristic) may be resistant to accuracy manipulations. For accuracy to reduce bias, the following conditions must be present:

  • Subjects must possess appropriate reasoning strategies.
  • They must view these as superior to other strategies.
  • They must be capable of using these strategies at will.

However, the latter two conditions imply that accuracy goals involve a conscious process of utilising cognitive strategies in motivated reasoning. This construct is called into question by neuroscience research concluding that motivated reasoning is qualitatively distinct from reasoning in which there is no strong emotional stake in the outcomes. Moreover, accuracy-oriented individuals, though assumed to use “objective” processing, can differ in how they update on new information, depending on how much faith they place in a given piece of evidence and on their ability to detect misinformation; both factors can produce beliefs that diverge from scientific consensus.

Goal-Oriented (Directional) Motivated Reasoning

Directional goals enhance the accessibility of knowledge structures (memories, beliefs, information) that are consistent with desired conclusions. According to Kunda, such goals can lead to biased memory search and belief construction mechanisms. Several studies support the effect of directional goals in selection and construction of beliefs about oneself, other people and the world.

Cognitive dissonance research provides extensive evidence that people may bias their self-characterisations when motivated to do so. For example, in one study, subjects induced to believe that extroversion was beneficial came to view themselves as more extroverted. Other biases, such as confirmation bias, the prior attitude effect, and disconfirmation bias, can also contribute to goal-oriented motivated reasoning.

Michael Thaler of Princeton University conducted a study finding that men are more likely than women to engage in performance-motivated reasoning, owing to a gender gap in beliefs about personal performance. A second study led to the conclusion that both men and women are susceptible to motivated reasoning, but that certain motivated beliefs differ by gender.

The motivation to achieve directional goals could also influence which rules (procedural structures, such as inferential rules) are accessed to guide the search for information. Studies also suggest that evaluation of scientific evidence may be biased by whether the conclusions are in line with the reader’s beliefs.

Even under goal-oriented motivated reasoning, however, people are not at liberty to conclude whatever they want merely because they want it. They tend to draw a conclusion only if they can muster supportive evidence: they search memory for beliefs and rules that could support the desired conclusion, or construct new beliefs to logically support their desired goals.

Case Studies

Smoking

An individual trying to quit smoking might engage in motivated reasoning to convince themselves to keep smoking: focusing on information that makes smoking seem less harmful while discrediting evidence that emphasises its dangers. Individuals in such situations initiate motivated reasoning to lessen the cognitive dissonance they feel. This can make quitting harder and lead to continued smoking, even though they know it harms their health.

Political Bias

Peter Ditto and his students conducted a meta-analysis in 2018 of studies relating to political bias. Their aim was to assess which US political orientation (left/liberal or right/conservative) was more biased and initiated more motivated reasoning. They found that both political orientations are susceptible to bias to the same extent. The analysis was disputed by Jonathan Baron and John Jost, to whom Ditto and colleagues responded. Reviewing the debate, Stuart Vyse concluded that the answer to the question of whether US liberals or conservatives are more biased is: “We don’t know.”

On 22 April 2011, The New York Times published a series of articles attempting to explain the Barack Obama citizenship conspiracy theories. One of these articles, by political scientist David Redlawsk, explained these “birther” conspiracies as an example of politically motivated reasoning. Despite ample evidence that President Barack Obama was born in the US state of Hawaii, many people continue to believe that he was not born in the US, and therefore that he was an illegitimate president. Similarly, many people believe he is a Muslim (as was his father), despite ample lifetime evidence of his Christian beliefs and practice (as was true of his mother). Subsequent research by others suggested that political partisan identity was more important for motivating “birther” beliefs than for some other conspiracy beliefs, such as 9/11 conspiracy theories.

Climate Change

Despite a scientific consensus on climate change, citizens are divided on the topic, particularly along political lines. A significant segment of the American public holds fixed beliefs, either because they are not politically engaged or because their beliefs are strong and unlikely to change. Liberals and progressives generally believe, based on extensive evidence, that human activity is the main driver of climate change. By contrast, conservatives are generally much less likely to hold this belief, and a subset believes that there is no human involvement and that the reported evidence is faulty (or even fraudulent). A prominent explanation is political directional motivated reasoning: conservatives are more likely to reject new evidence that contradicts their long-established beliefs. In addition, some highly directional climate deniers not only discredit scientific information on human-induced climate change but also seek contrary evidence, leading to a posterior belief of greater denial.
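The idea of a “posterior belief of greater denial” can be sketched as a toy Bayesian update in which the motivated reasoner assigns a biased likelihood to contradictory evidence, treating a new scientific report as something a fraudulent establishment would publish anyway. The probabilities below are hypothetical, chosen only to illustrate the direction of the effect, not drawn from any study.

```python
# Toy Bayesian illustration (hypothetical numbers): a motivated reasoner
# who treats contradictory evidence as probably fraudulent can end up
# with a *stronger* posterior belief in denial.

def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' rule: P(H | E) for hypothesis H ('the reported evidence is faulty')."""
    numerator = p_evidence_given_h * prior
    return numerator / (numerator + p_evidence_given_not_h * (1 - prior))

prior = 0.6  # prior belief that the reported evidence is faulty

# Unbiased reading: a new scientific report is far more likely
# if the evidence is sound than if it is faulty.
unbiased = posterior(prior, p_evidence_given_h=0.2, p_evidence_given_not_h=0.8)

# Motivated reading: the same report is judged likely under H as well,
# because "that is exactly what they would publish".
motivated = posterior(prior, p_evidence_given_h=0.9, p_evidence_given_not_h=0.8)

print(unbiased)   # belief in denial falls below the prior
print(motivated)  # belief in denial rises above the prior
```

The asymmetry lies entirely in the likelihood assigned to the contradictory evidence; with the same prior and the same report, the biased likelihood pushes the posterior toward greater denial.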

A study by Robin Bayes and colleagues of the human-induced climate change views of 1,960 members of the Republican Party found that appeals to both accuracy and directional motives can move beliefs in the desired direction, but only in the presence of politically motivated messages congruent with the induced beliefs.

Social Media

Social media serves many purposes and is a major channel for spreading opinions; for many people it is a primary source of information, much of which is opinionated or biased. Motivated reasoning shapes how such information spreads: “However, motivated reasoning suggests that informational uses of social media are conditioned by various social and cultural ways of thinking”. Because ideas and opinions circulate so freely, motivated reasoning and bias easily colour what people find when searching for answers, or simply for facts, online or through any news source.

COVID-19

In the context of the COVID-19 pandemic, people who refuse to wear masks or get vaccinated may engage in motivated reasoning to justify their beliefs and actions. They may reject scientific evidence that supports mask-wearing and vaccination and instead seek out information that supports their pre-existing beliefs, such as conspiracy theories or misinformation. This can lead to behaviours that are harmful to both themselves and others.

In a 2020 study, Van Bavel and colleagues explored the concept of motivated reasoning as a contributor to the spread of misinformation and resistance to public health measures during the COVID-19 pandemic. Their results indicated that people often engage in motivated reasoning when processing information about the pandemic, interpreting it to confirm their pre-existing beliefs and values. The authors argue that addressing motivated reasoning is critical to promoting effective public health messaging and reducing the spread of misinformation. They suggested several strategies, such as reframing public health messages to align with individuals’ values and beliefs, using trusted sources to convey information, and creating social norms that support public health behaviours.

Outcomes and Tackling Strategies

The outcomes of motivated reasoning derive from “a biased set of cognitive processes—that is, strategies for accessing, constructing, and evaluating beliefs. The motivation to be accurate enhances use of those beliefs and strategies that are considered most appropriate, whereas the motivation to arrive at particular conclusions enhances use of those that are considered most likely to yield the desired conclusion.” Careful or “reflective” reasoning has been linked to both overcoming and reinforcing motivated reasoning, suggesting that reflection is not a panacea, but a tool that can serve rational or irrational purposes depending on other factors. For example, when people are forced to think analytically about something complex that they lack adequate knowledge of (e.g. a new study on meteorology, with no degree in the subject), there is no directional shift in thinking, and their existing conclusions are more likely to be supported with motivated reasoning. Conversely, if they are given a simpler test of analytical thinking that confronts their beliefs (e.g. judging implausible headlines as false), motivated reasoning is less likely to occur and a directional shift in thinking may result.

Hostile Media Effect

Research on motivated reasoning has tested both accuracy goals (i.e. reaching correct conclusions) and directional goals (i.e. reaching preferred conclusions), and the results confirm that such goals shape perceptions, decision-making, and estimates. These findings have far-reaching consequences: when confronted with a small amount of information contrary to an established belief, an individual is motivated to reason the new information away, contributing to a hostile media effect. If this pattern continues over an extended period of time, the individual becomes ever more entrenched in their beliefs.

Tipping Point

However, recent studies have shown that motivated reasoning can be overcome. “When the amount of incongruency is relatively small, the heightened negative affect does not necessarily override the motivation to maintain [belief].” There is evidence, though, of a theoretical “tipping point” at which the amount of incongruent information received by the motivated reasoner turns certainty into anxiety, and this anxiety about being incorrect may lead to a change of opinion for the better.
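One way to picture this tipping-point dynamic is a toy model in which small amounts of incongruent evidence are reasoned away (and even entrench the belief), while accumulated evidence past a threshold forces updating. Every parameter and value here is hypothetical, chosen purely to illustrate the dynamic described above, not taken from the literature.

```python
# Illustrative toy model (hypothetical parameters): a reasoner discounts
# incongruent evidence until its accumulated weight crosses a "tipping
# point", after which anxiety overrides belief maintenance.

def update_belief(belief, evidence_weights, tipping_point=3.0, entrench=0.009):
    """belief: confidence in a proposition, in [0, 1].
    evidence_weights: positive numbers, each the strength of one
    piece of incongruent (belief-contradicting) evidence."""
    accumulated = 0.0
    for w in evidence_weights:
        accumulated += w
        if accumulated < tipping_point:
            # Motivated reasoning: the new information is reasoned away,
            # and the belief is slightly reinforced (entrenchment).
            belief = min(1.0, belief + entrench)
        else:
            # Past the tipping point: certainty turns into anxiety,
            # and ordinary updating resumes.
            belief = max(0.0, belief - 0.1 * w)
    return belief

# A few weak challenges entrench the belief...
print(update_belief(0.8, [0.5, 0.5]))   # ends slightly above 0.8
# ...but sustained incongruent evidence eventually erodes it.
print(update_belief(0.8, [1.0] * 6))    # ends well below 0.8
```

The threshold is the only mechanism separating the two runs: the same updating rule produces entrenchment before the tipping point and revision after it, mirroring the pattern described in the hostile media effect section above.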

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Motivated_reasoning >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.