An Overview of Neuroplasticity

Introduction

Neuroplasticity, also known as neural plasticity or simply plasticity, is the ability of neural networks in the brain to change through growth and reorganisation: the brain can rewire its neural connections and come to function in ways that differ from its prior state. This process can occur in response to learning new skills, experiencing environmental changes, recovering from injuries, or adapting to sensory or cognitive deficits. Such adaptability highlights the dynamic, ever-evolving nature of the brain, even into adulthood. These changes range from individual neuronal pathways making new connections to system-wide adjustments such as cortical remapping and changes in neural oscillation. Other forms of neuroplasticity include homologous area adaptation, cross-modal reassignment, map expansion, and compensatory masquerade. Examples of neuroplasticity include circuit and network changes resulting from learning a new ability, information acquisition, environmental influences, pregnancy, caloric intake, practice and training, and psychological stress.

Neuroplasticity was once thought by neuroscientists to manifest only during childhood, but research in the latter half of the 20th century showed that many aspects of the brain can be altered (or are “plastic”) even through adulthood. Furthermore, starting from the primary stimulus–response sequence of simple reflexes, an organism’s capacity to correctly detect alterations within itself and its context depends on the concrete architecture of its nervous system, which develops in a particular way already during gestation. Adequate nervous system development equips us as human beings with all necessary cognitive functions. The physicochemical properties of the mother–foetus bio-system affect the neuroplasticity of the embryonic nervous system in its ecological context. Nevertheless, the developing brain exhibits a higher degree of plasticity than the adult brain. Activity-dependent plasticity can have significant implications for healthy development, learning, memory, and recovery from brain damage.

Brief History

Origin

The term plasticity was first applied to behaviour in 1890 by William James in The Principles of Psychology, where it was used to describe “a structure weak enough to yield to an influence, but strong enough not to yield all at once”. The first person to use the term neural plasticity appears to have been the Polish neuroscientist Jerzy Konorski.

One of the first experiments providing evidence for neuroplasticity was conducted in 1793 by the Italian anatomist Michele Vicenzo Malacarne, who paired animals, trained one of each pair extensively for years, and then dissected both. Malacarne discovered that the cerebella of the trained animals were substantially larger than those of the untrained animals. While these findings were significant, they were eventually forgotten. In 1890, William James proposed in The Principles of Psychology that the brain and its function are not fixed throughout adulthood, but the idea was largely neglected. Up until the 1970s, neuroscientists believed that the brain’s structure and function were essentially fixed throughout adulthood.

While the brain was commonly understood as a nonrenewable organ in the early 1900s, the pioneering neuroscientist Santiago Ramón y Cajal used the term neuronal plasticity to describe nonpathological changes in the structure of adult brains. Based on his renowned neuron doctrine, Cajal first described the neuron as the fundamental unit of the nervous system that later served as an essential foundation to develop the concept of neural plasticity. Many neuroscientists used the term plasticity to explain the regenerative capacity of the peripheral nervous system only. Cajal, however, used the term plasticity to reference his findings of degeneration and regeneration in the adult brain (a part of the central nervous system). This was controversial, with some like Walther Spielmeyer and Max Bielschowsky arguing that the CNS cannot produce new cells.

The term has since been broadly applied:

Given the central importance of neuroplasticity, an outsider would be forgiven for assuming that it was well defined and that a basic and universal framework served to direct current and future hypotheses and experimentation. Sadly, however, this is not the case. While many neuroscientists use the word neuroplasticity as an umbrella term, it means different things to different researchers in different subfields … In brief, a mutually agreed-upon framework does not appear to exist.

Research and Discovery

In 1923, Karl Lashley conducted experiments on rhesus monkeys that demonstrated changes in neuronal pathways, which he concluded were evidence of plasticity. Despite this, and other research that suggested plasticity, neuroscientists did not widely accept the idea of neuroplasticity.

Inspired by the work of Nicolas Rashevsky, McCulloch and Pitts proposed the artificial neuron in 1943, with a learning rule whereby new synapses are produced when neurons fire simultaneously. This idea was later extensively discussed in The Organization of Behavior (Hebb, 1949) and is now known as Hebbian learning.
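Hebb’s postulate (“cells that fire together wire together”) can be expressed as a simple weight update: each synapse is strengthened in proportion to the correlated activity of its pre- and postsynaptic neurons. The sketch below is a minimal illustration of that rule, not a biological model; the function name, the learning rate, and the toy activity patterns are all invented for this example.

```python
import numpy as np

def hebbian_update(weights, pre, post, lr=0.1):
    """Strengthen each synapse in proportion to correlated pre/post activity.

    weights: (n_post, n_pre) synaptic weight matrix
    pre:     (n_pre,) presynaptic firing rates
    post:    (n_post,) postsynaptic firing rates
    lr:      learning rate
    """
    # Hebb's postulate: delta_w is proportional to post * pre
    # (an outer product when updating a whole layer at once).
    return weights + lr * np.outer(post, pre)

# Two presynaptic neurons; only the first fires together with the postsynaptic cell.
w = np.zeros((1, 2))
for _ in range(10):
    pre = np.array([1.0, 0.0])   # neuron 0 active, neuron 1 silent
    post = np.array([1.0])       # postsynaptic neuron fires simultaneously
    w = hebbian_update(w, pre, post)

print(w)  # the co-active synapse has grown; the silent one stays at zero
```

Repeated co-activation strengthens only the synapse whose presynaptic neuron fires with the postsynaptic cell, which is the essence of the rule Hebb described.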

In 1945, Justo Gonzalo concluded from his research on brain dynamics that, contrary to the activity of the projection areas, the “central” cortical mass (more or less equidistant from the visual, tactile and auditory projection areas) would be a “manoeuvring mass”, rather unspecific or multisensory, with the capacity to increase neural excitability and reorganise activity by means of plasticity properties. As a first example of adaptation he gives seeing upright with reversing glasses in the Stratton experiment, and especially several first-hand brain injury cases in which he observed dynamic and adaptive properties in the resulting disorders, in particular in the inverted-perception disorder [e.g. see pp 260–62 Vol. I (1945), p 696 Vol. II (1950)]. He stated that a sensory signal in a projection area would be only an inverted and constricted outline, magnified by the increase in recruited cerebral mass and re-inverted, through an effect of brain plasticity, in more central areas, following a spiral growth.

Marian Diamond of the University of California, Berkeley, produced the first scientific evidence of anatomical brain plasticity, publishing her research in 1964.

Other significant evidence was produced in the 1960s and after, notably from scientists including Paul Bach-y-Rita, Michael Merzenich along with Jon Kaas, as well as several others.

In the 1960s, Paul Bach-y-Rita invented a device, tested on a small number of people, in which a person sat in a chair embedded with nubs that vibrated in patterns translating images captured by a camera, allowing a form of vision via sensory substitution.

Studies in people recovering from stroke also provided support for neuroplasticity, as regions of the brain that remained healthy could sometimes take over, at least in part, functions that had been destroyed; Shepherd Ivory Franz did work in this area.

Eleanor Maguire documented changes in hippocampal structure associated with acquiring knowledge of London’s layout in local taxi drivers. A redistribution of grey matter was indicated in London taxi drivers compared to controls. This work on hippocampal plasticity not only interested scientists, but also engaged the public and media worldwide.

Michael Merzenich is a neuroscientist who has been one of the pioneers of neuroplasticity for over three decades. He has made some of “the most ambitious claims for the field – that brain exercises may be as useful as drugs to treat diseases as severe as schizophrenia – that plasticity exists from cradle to the grave, and that radical improvements in cognitive functioning – how we learn, think, perceive, and remember are possible even in the elderly.” Merzenich’s work was influenced by a crucial discovery made by David Hubel and Torsten Wiesel in their work with kittens. The experiment involved sewing one eye shut and recording the cortical brain maps. Hubel and Wiesel saw that the portion of the kitten’s brain associated with the shut eye was not idle, as expected. Instead, it processed visual information from the open eye. It was “…as though the brain didn’t want to waste any ‘cortical real estate’ and had found a way to rewire itself.”

This implied neuroplasticity during the critical period. However, Merzenich argued that neuroplasticity could occur beyond the critical period. His first encounter with adult plasticity came when he was engaged in a postdoctoral study with Clinton Woosley. The experiment was based on observation of what occurred in the brain when one peripheral nerve was cut and subsequently regenerated. The two scientists micromapped the hand maps of monkey brains before and after cutting a peripheral nerve and sewing the ends together. Afterwards, the hand map in the brain that they expected to be jumbled was nearly normal. This was a substantial breakthrough. Merzenich asserted that, “If the brain map could normalize its structure in response to abnormal input, the prevailing view that we are born with a hardwired system had to be wrong. The brain had to be plastic.” Merzenich received the 2016 Kavli Prize in Neuroscience “for the discovery of mechanisms that allow experience and neural activity to remodel brain function.”

Neurobiology

There are different ideas and theories about what biological processes allow neuroplasticity to occur. The core of the phenomenon lies in synapses and in how the connections between neurons change as a function of neuronal activity. It is widely agreed that neuroplasticity takes many forms, as it results from a variety of pathways. These pathways, mainly signalling cascades, allow for gene expression alterations that lead to neuronal changes, and thus to neuroplasticity.

A number of other factors are thought to play a role in the biological processes underlying the changing of neural networks in the brain. These include the regulation of synapses via phosphorylation, the role of inflammation and inflammatory cytokines, proteins such as Bcl-2 proteins and neurotrophins, and energy production via mitochondria.

JT Wall and J Xu have traced the mechanisms underlying neuroplasticity. Reorganisation is not cortically emergent but occurs at every level of the processing hierarchy; this produces the map changes observed in the cerebral cortex.

Types

Christopher Shaw and Jill McEachern (eds), in Toward a Theory of Neuroplasticity, state that there is no all-inclusive theory that overarches the different frameworks and systems in the study of neuroplasticity. However, researchers often describe neuroplasticity as “the ability to make adaptive changes related to the structure and function of the nervous system.” Correspondingly, two types of neuroplasticity are often discussed: structural neuroplasticity and functional neuroplasticity.

Structural Neuroplasticity

Structural plasticity is often understood as the brain’s ability to change its neuronal connections. Based on this type of neuroplasticity, new neurons are constantly produced and integrated into the central nervous system throughout the lifespan. Researchers today use multiple cross-sectional imaging methods (e.g. magnetic resonance imaging (MRI) and computerised tomography (CT)) to study structural alterations of the human brain. Studies of this type of neuroplasticity often examine the effect of various internal or external stimuli on the brain’s anatomical reorganisation. Changes in grey matter proportion or in synaptic strength are considered examples of structural neuroplasticity. Of the two types, structural neuroplasticity is currently the more heavily investigated within neuroscience.

Functional Neuroplasticity

Functional plasticity refers to the brain’s ability to alter and adapt the functional properties of networks of neurons. It can occur in four known ways:

  • Homologous area adaptation.
  • Map expansion.
  • Cross-modal reassignment.
  • Compensatory masquerade.

Homologous Area Adaptation

Homologous area adaptation is the assumption of a particular cognitive process by a homologous region in the opposite hemisphere. For instance, through homologous area adaptation, a cognitive task is shifted from a damaged part of the brain to its homologous area on the opposite side of the brain. Homologous area adaptation is a type of functional neuroplasticity that usually occurs in children rather than adults.

Map Expansion

In map expansion, cortical maps related to particular cognitive tasks expand through frequent exposure to stimuli. Map expansion has been demonstrated experimentally; for example, changes in the functional connectivity of the brain were observed in individuals learning spatial routes.

Cross-Modal Reassignment

Cross-modal reassignment involves the reception of novel input signals by a brain region that has been deprived of its default input.

Compensatory Masquerade

Functional plasticity through compensatory masquerade occurs when the brain uses different cognitive processes to perform an already established cognitive task.

Changes in the brain associated with functional neuroplasticity can occur in response to two different types of events:

  • Previous activity (activity-dependent plasticity), to acquire memory; or
  • Malfunction or damage of neurons (maladaptive plasticity), to compensate for a pathological event.

In the latter case, functions from one part of the brain transfer to another part based on the demand to restore behavioural or physiological processes.

Regarding physiological forms of activity-dependent plasticity, those involving synapses are referred to as synaptic plasticity. The strengthening or weakening of synapses that results in an increase or decrease of the firing rate of neurons is called long-term potentiation (LTP) or long-term depression (LTD), respectively; both are considered examples of synaptic plasticity associated with memory. The cerebellum is a typical structure with combinations of LTP/LTD and redundancy within its circuitry, allowing plasticity at several sites. More recently it has become clearer that synaptic plasticity can be complemented by another form of activity-dependent plasticity involving the intrinsic excitability of neurons, referred to as intrinsic plasticity. This, as opposed to homeostatic plasticity, does not necessarily maintain the overall activity of a neuron within a network but contributes to encoding memories.

Many studies have also indicated functional neuroplasticity at the level of brain networks, where training alters the strength of functional connections. However, a recent study argues that these observed changes should not be directly attributed to neuroplasticity, since they may be rooted in the brain network’s systemic requirement for reorganisation.
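As a toy illustration of LTP and LTD, the sketch below uses a simple threshold rule loosely inspired by rate-based plasticity models (such as the BCM family): a synapse is potentiated when postsynaptic activity exceeds a threshold and depressed when it falls below it, scaled by presynaptic activity. The function name, the threshold, and all rates here are illustrative assumptions, not a claim about the underlying biology.

```python
def update_synapse(w, pre_rate, post_rate, theta=0.5, lr=0.1):
    """Potentiate (LTP) when postsynaptic activity exceeds theta;
    depress (LTD) when it falls below, scaled by presynaptic activity."""
    return w + lr * pre_rate * (post_rate - theta)

w = 1.0
# Strong pre/post pairing drives the weight up (LTP-like change).
w_ltp = update_synapse(w, pre_rate=1.0, post_rate=0.9)
# Weak postsynaptic response drives the weight down (LTD-like change).
w_ltd = update_synapse(w, pre_rate=1.0, post_rate=0.1)
print(w_ltp, w_ltd)
```

The same rule thus captures both directions of synaptic change described above: whether a synapse strengthens or weakens depends on how postsynaptic activity compares with the threshold.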

Applications and Examples

The adult brain is not entirely “hard-wired” with fixed neuronal circuits. There are many instances of cortical and subcortical rewiring of neuronal circuits in response to training as well as in response to injury.

There is ample evidence for the active, experience-dependent re-organisation of the synaptic networks of the brain involving multiple inter-related structures including the cerebral cortex. The specific details of how this process occurs at the molecular and ultrastructural levels are topics of active neuroscience research. The way experience can influence the synaptic organisation of the brain is also the basis for a number of theories of brain function including the general theory of mind and neural Darwinism. The concept of neuroplasticity is also central to theories of memory and learning that are associated with experience-driven alteration of synaptic structure and function in studies of classical conditioning in invertebrate animal models such as Aplysia.

There is evidence that neurogenesis (the birth of new brain cells) occurs in the adult rodent brain—and such changes can persist well into old age. The evidence for neurogenesis is mainly restricted to the hippocampus and olfactory bulb, but research has revealed that other parts of the brain, including the cerebellum, may be involved as well. However, the degree of rewiring induced by the integration of new neurons into established circuits is not known, and such rewiring may well be functionally redundant.

Treatment of Brain Damage

A surprising consequence of neuroplasticity is that brain activity associated with a given function can be transferred to a different location; this can result from normal experience and also occurs during recovery from brain injury. Neuroplasticity provides the scientific basis for treating acquired brain injury with goal-directed experiential therapeutic programs in the context of rehabilitation approaches to the functional consequences of the injury.

Neuroplasticity is gaining popularity as a theory that, at least in part, explains improvements in functional outcomes with physical therapy post-stroke. Rehabilitation techniques supported by evidence that suggests cortical reorganisation as the mechanism of change include constraint-induced movement therapy, functional electrical stimulation, treadmill training with body-weight support, and virtual reality therapy. Robot-assisted therapy is an emerging technique, also hypothesised to work by way of neuroplasticity, though there is currently insufficient evidence to determine the exact mechanisms of change when using this method.

One group has developed a treatment that includes injections of progesterone in brain-injured patients. “Administration of progesterone after traumatic brain injury and stroke reduces edema, inflammation, and neuronal cell death, and enhances spatial reference memory and sensory-motor recovery.” In a clinical trial, a group of severely injured patients had a 60% reduction in mortality after three days of progesterone injections. However, a study published in the New England Journal of Medicine in 2014, detailing the results of a multi-centre NIH-funded phase III clinical trial of 882 patients, found that treating acute traumatic brain injury with progesterone provides no significant benefit compared with placebo.

Binocular Vision

For decades, researchers assumed that humans had to acquire binocular vision, in particular stereopsis, in early childhood or they would never gain it. In recent years, however, successful improvements in persons with amblyopia, convergence insufficiency or other stereo vision anomalies have become prime examples of neuroplasticity; binocular vision improvements and stereopsis recovery are now active areas of scientific and clinical research.

Phantom Limbs

In the phenomenon of phantom limb sensation, a person continues to feel pain or sensation within a part of the body that has been amputated. This is surprisingly common, occurring in 60–80% of amputees. An explanation is based on the concept of neuroplasticity: the cortical maps of the removed limb are believed to have become engaged with the area around them in the postcentral gyrus. As a result, activity within the surrounding cortex is misinterpreted by the area of the cortex formerly responsible for the amputated limb.

The relationship between phantom limb sensation and neuroplasticity is a complex one. In the early 1990s V.S. Ramachandran theorized that phantom limbs were the result of cortical remapping. However, in 1995 Herta Flor and her colleagues demonstrated that cortical remapping occurs only in patients who have phantom pain. Her research showed that phantom limb pain (rather than referred sensations) was the perceptual correlate of cortical reorganisation. This phenomenon is sometimes referred to as maladaptive plasticity.

In 2009, Lorimer Moseley and Peter Brugger carried out an experiment in which they encouraged arm amputee subjects to use visual imagery to contort their phantom limbs into impossible configurations. Four of the seven subjects succeeded in performing impossible movements of the phantom limb. This experiment suggests that the subjects had modified the neural representation of their phantom limbs and generated the motor commands needed to execute impossible movements in the absence of feedback from the body.

Chronic Pain

Individuals who have chronic pain experience prolonged pain at sites that may have been previously injured but are otherwise currently healthy. This phenomenon is related to neuroplasticity through a maladaptive reorganisation of the nervous system, both peripherally and centrally. During the period of tissue damage, noxious stimuli and inflammation cause an elevation of nociceptive input from the periphery to the central nervous system. Prolonged nociception from the periphery then elicits a neuroplastic response at the cortical level that changes the somatotopic organisation for the painful site, inducing central sensitisation. For instance, individuals experiencing complex regional pain syndrome demonstrate a diminished cortical somatotopic representation of the contralateral hand as well as decreased spacing between the hand and the mouth. Additionally, chronic pain has been reported to significantly reduce grey matter volume in the brain globally, and more specifically in the prefrontal cortex and right thalamus. However, following treatment, these abnormalities in cortical reorganisation and grey matter volume resolve, as do the symptoms. Similar results have been reported for phantom limb pain, chronic low back pain, and carpal tunnel syndrome.

Meditation

A number of studies have linked meditation practice to differences in cortical thickness or density of gray matter. One of the most well-known studies to demonstrate this was led by Sara Lazar, from Harvard University, in 2000. Richard Davidson, a neuroscientist at the University of Wisconsin, has led experiments in collaboration with the Dalai Lama on effects of meditation on the brain. His results suggest that meditation may lead to change in the physical structure of brain regions associated with attention, anxiety, depression, fear, anger, and compassion as well as the ability of the body to heal itself.

Artistic Engagement and Art Therapy

There is substantial evidence that artistic engagement in a therapeutic environment can create changes in neural network connections as well as increase cognitive flexibility. In one 2013 study, researchers found evidence that long-term, habitual artistic training (e.g. musical instrument practice, purposeful painting, etc.) can “macroscopically imprint a neural network system of spontaneous activity in which the related brain regions become functionally and topologically modularized in both domain-general and domain-specific manners”. In simple terms, brains repeatedly exposed to artistic training over long periods develop adaptations to make such activity both easier and more likely to spontaneously occur.

Some researchers and academics have suggested that artistic engagement has substantially altered the human brain throughout our evolutionary history. D.W. Zaidel, adjunct professor of behavioural neuroscience and contributor at VAGA, has written that “evolutionary theory links the symbolic nature of art to critical pivotal brain changes in Homo sapiens supporting increased development of language and hierarchical social grouping”.

Music Therapy

There is evidence that music-supported therapy can promote neuroplasticity in patients recovering from brain injuries. Music-supported therapy has been used in stroke rehabilitation: in a one-month study, stroke patients participating in music-supported therapy showed a significant improvement in motor control of their affected hand. In adults developing brain atrophy and cognitive decline, playing a musical instrument such as the piano, or listening to music, can increase grey matter volume in areas such as the caudate nucleus, Rolandic operculum, and cerebellum. Evidence also suggests that music-supported therapy can improve cognitive performance, well-being, and social behaviour in patients recovering from damage to the orbitofrontal cortex (OFC) and from mild traumatic brain injury. Neuroimaging after music-supported therapy revealed functional changes in OFC networks, with improvements observed in both task-based and resting-state fMRI analyses.

Fitness and Exercise

Aerobic exercise increases the production of neurotrophic factors (compounds that promote growth or survival of neurons), such as brain-derived neurotrophic factor (BDNF), insulin-like growth factor 1 (IGF-1), and vascular endothelial growth factor (VEGF). Exercise-induced effects on the hippocampus are associated with measurable improvements in spatial memory. Consistent aerobic exercise over a period of several months induces marked clinically significant improvements in executive function (i.e. the “cognitive control” of behaviour) and increased gray matter volume in multiple brain regions, particularly those that give rise to cognitive control. The brain structures that show the greatest improvements in gray matter volume in response to aerobic exercise are the prefrontal cortex and hippocampus; moderate improvements are seen in the anterior cingulate cortex, parietal cortex, cerebellum, caudate nucleus, and nucleus accumbens. Higher physical fitness scores (measured by VO2 max) are associated with better executive function, faster processing speed, and greater volume of the hippocampus, caudate nucleus, and nucleus accumbens.

Deafness and Loss of Hearing

Due to hearing loss, the auditory cortex and other association areas of the brain in deaf and hard-of-hearing people undergo compensatory plasticity. The auditory cortex, usually reserved for processing auditory information in hearing people, is redirected to serve other functions, especially vision and somatosensation.

Deaf individuals have enhanced peripheral visual attention, better motion change but not colour change detection ability in visual tasks, more effective visual search, and faster response time for visual targets compared to hearing individuals. Altered visual processing in deaf people is often found to be associated with the repurposing of other brain areas including primary auditory cortex, posterior parietal association cortex (PPAC), and anterior cingulate cortex (ACC). A review by Bavelier et al. (2006) summarizes many aspects on the topic of visual ability comparison between deaf and hearing individuals.

Brain areas that serve a function in auditory processing are repurposed to process somatosensory information in congenitally deaf people. They show higher sensitivity in detecting frequency changes in vibration above threshold, and higher and more widespread activation in the auditory cortex under somatosensory stimulation. However, speeded responses to somatosensory stimuli have not been found in deaf adults.

Cochlear Implant

Neuroplasticity is involved in the development of sensory function. The brain is born immature and then adapts to sensory inputs after birth. In the auditory system, congenital hearing loss, a rather frequent inborn condition affecting about 1 in 1000 newborns, has been shown to affect auditory development, and implantation of a sensory prosthesis activating the auditory system has prevented the deficits and induced functional maturation of the auditory system. Owing to a sensitive period for plasticity, there is also a sensitive period for such intervention, within the first 2–4 years of life. Consequently, in prelingually deaf children, early cochlear implantation as a rule allows the children to learn their mother language and acquire acoustic communication.

Blindness

Due to vision loss, the visual cortex in blind people may undergo cross-modal plasticity, and therefore other senses may gain enhanced abilities. Alternatively, the opposite could occur, with the lack of visual input weakening the development of other sensory systems. Supporting the former idea, one study found that the right posterior middle temporal gyrus and superior occipital gyrus show more activation in blind than in sighted people during a sound-movement detection task. Several studies support the latter idea, finding weakened abilities in audio distance evaluation, proprioceptive reproduction, the threshold for visual bisection, and judging minimum audible angle.

Human Echolocation

Human echolocation is a learned ability for humans to sense their environment from echoes. This ability is used by some blind people to navigate their environment and sense their surroundings in detail. Studies in 2010 and 2011 using functional magnetic resonance imaging techniques have shown that parts of the brain associated with visual processing are adapted for the new skill of echolocation. Studies with blind patients, for example, suggest that the click-echoes heard by these patients were processed by brain regions devoted to vision rather than audition.

Attention Deficit Hyperactivity Disorder

Reviews of MRI and electroencephalography (EEG) studies on individuals with ADHD suggest that the long-term treatment of ADHD with stimulants, such as amphetamine or methylphenidate, decreases abnormalities in brain structure and function found in subjects with ADHD, and improves function in several parts of the brain, such as the right caudate nucleus of the basal ganglia, left ventrolateral prefrontal cortex (VLPFC), and superior temporal gyrus.

In Early Child Development

Neuroplasticity is most active in childhood as a part of normal human development, and can also be seen as an especially important mechanism for children in terms of risk and resiliency. Trauma is considered a great risk, as it negatively affects many areas of the brain and puts a strain on the sympathetic nervous system through constant activation. Trauma thus alters the brain’s connections such that children who have experienced trauma may be hypervigilant or overly aroused. However, a child’s brain can cope with these adverse effects through the actions of neuroplasticity.

Neuroplasticity in children is described in four categories covering a wide variety of neuronal functioning: adaptive plasticity, impaired plasticity, excessive plasticity, and plasticity that renders the brain vulnerable to injury.

There are many examples of neuroplasticity in human development. For example, Justine Ker and Stephen Nelson looked at the effects of musical training on neuroplasticity and found that musical training can contribute to experience-dependent structural plasticity: changes in the brain that occur based on experiences unique to an individual, such as learning multiple languages, playing a sport, or doing theatre. A study by Hyde in 2009 showed that changes in the brains of children could be seen after as little as 15 months of musical training. Ker and Nelson suggest this degree of plasticity in children’s brains can “help provide a form of intervention for children… with developmental disorders and neurological diseases.”

In Animals

In a single lifespan, individuals of an animal species may encounter various changes in brain morphology. Many of these differences are caused by the release of hormones in the brain; others are the product of evolutionary factors or developmental stages. Some changes occur seasonally in species to enhance or generate response behaviours.

Seasonal Brain Changes

Changing brain behaviour and morphology to suit other seasonal behaviours is relatively common in animals. These changes can improve the chances of mating during breeding season. Examples of seasonal brain morphology change can be found within many classes and species.

Within the class Aves, black-capped chickadees experience an increase in the volume of their hippocampus and in the strength of neural connections to the hippocampus during fall months. These morphological changes within the hippocampus, which are related to spatial memory, are not limited to birds, as they can also be observed in rodents and amphibians. In songbirds, many song control nuclei in the brain increase in size during mating season. Among birds, changes in brain morphology that influence song patterns, frequency, and volume are common. Gonadotropin-releasing hormone (GnRH) immunoreactivity, or the reception of the hormone, is lowered in European starlings exposed to longer periods of light during the day.

The California sea hare, a gastropod, has more successful inhibition of egg-laying hormones outside of mating season due to increased effectiveness of inhibitors in the brain. Changes to the inhibitory nature of regions of the brain can also be found in humans and other mammals. In the amphibian Bufo japonicus, part of the amygdala is larger before breeding and during hibernation than it is after breeding.

Seasonal brain variation occurs within many mammals. Part of the hypothalamus of the common ewe is more receptive to GnRH during breeding season than at other times of the year. Humans experience a change in the “size of the hypothalamic suprachiasmatic nucleus and vasopressin-immunoreactive neurons within it” during the fall, when these parts are larger. In the spring, both reduce in size.

Traumatic Brain Injury Research

A group of scientists found that if a small stroke (an infarction) is induced by obstructing blood flow to a portion of a monkey’s motor cortex, movement of the affected body part can later be evoked by stimulating areas adjacent to the damaged brain area. In one study, intracortical microstimulation (ICMS) mapping techniques were used in nine normal monkeys. Some underwent ischemic-infarction procedures and the others, ICMS procedures. The monkeys with ischemic infarctions retained more finger flexion during food retrieval, and after several months this deficit returned to preoperative levels. With respect to the distal forelimb representation, “postinfarction mapping procedures revealed that movement representations underwent reorganization throughout the adjacent, undamaged cortex.” Understanding the interaction between damaged and undamaged areas provides a basis for better treatment plans for stroke patients. Current research includes tracking the changes that occur in the motor areas of the cerebral cortex as a result of a stroke, so that the events occurring during the brain’s reorganisation can be ascertained. Treatment plans that may enhance recovery from stroke, such as physiotherapy, pharmacotherapy, and electrical-stimulation therapy, are also being studied.

Jon Kaas, a professor at Vanderbilt University, has been able to show “how somatosensory area 3b and ventroposterior (VP) nucleus of the thalamus are affected by longstanding unilateral dorsal-column lesions at cervical levels in macaque monkeys.” Adult brains have the ability to change as a result of injury, but the extent of the reorganisation depends on the extent of the injury. His recent research focuses on the somatosensory system, which involves a sense of the body and its movements using many senses. Usually, damage to the somatosensory cortex results in impaired body perception. Kaas’ research project is focused on how these systems (somatosensory, cognitive, and motor) respond with plastic changes resulting from injury.

One recent line of neuroplasticity research involves work done by a team of doctors and researchers at Emory University, specifically Donald Stein and David Wright. Stein noticed that female mice seemed to recover from brain injuries better than male mice, and that at certain points in the oestrus cycle females recovered better still. This difference may be attributed to levels of progesterone, with higher levels leading to faster recovery from brain injury in mice. The resulting progesterone treatment was described as the first in 40 years to show significant results against traumatic brain injury while incurring no known side effects and being cheap to administer. However, subsequent clinical trials showed that progesterone offers no significant benefit for traumatic brain injury in human patients.

Ageing

Transcriptional profiling of the frontal cortex of persons ranging from 26 to 106 years of age defined a set of genes with reduced expression after age 40, and especially after age 70. Genes that play central roles in synaptic plasticity were the most significantly affected by age, generally showing reduced expression over time. There was also a marked increase in cortical DNA damage, likely oxidative DNA damage, in gene promoters with aging.

Reactive oxygen species appear to have a significant role in the regulation of synaptic plasticity and cognitive function. However, age-related increases in reactive oxygen species may also lead to impairments in these functions.

Multilingualism

Multilingualism has a beneficial effect on people’s behaviour and cognition. Numerous studies have shown that people who study more than one language have better cognitive function and flexibility than people who speak only one. Bilinguals are found to have longer attention spans, stronger organisational and analytical skills, and a better theory of mind than monolinguals. Researchers have found that the effect of multilingualism on cognition is due to neuroplasticity.

In one prominent study, neurolinguists used a voxel-based morphometry (VBM) method to visualise the structural plasticity of brains in healthy monolinguals and bilinguals. They first investigated the differences in density of grey and white matter between the two groups and the relationship between brain structure and age of language acquisition. The results showed that grey-matter density in the inferior parietal cortex was significantly greater in multilinguals than in monolinguals. The researchers also found that early bilinguals had a greater density of grey matter in the same region than late bilinguals. The inferior parietal cortex is a brain region highly associated with language learning, which corresponds with the study’s VBM result.

Recent studies have also found that learning multiple languages not only restructures the brain but also boosts its capacity for plasticity. One such study found that multilingualism affects not only the grey matter but also the white matter of the brain. White matter is made up of myelinated axons, which are strongly associated with learning and communication. Neurolinguists used a diffusion tensor imaging (DTI) scanning method to compare white-matter integrity between monolinguals and bilinguals. Increased myelination in white-matter tracts was found in bilingual individuals who actively used both languages in everyday life. The demand of handling more than one language requires more efficient connectivity within the brain, resulting in greater white-matter density in multilinguals.

While it is still debated whether these changes in the brain result from genetic disposition or environmental demands, much evidence suggests that the environmental and social experience of early multilinguals affects structural and functional reorganisation in the brain.

Novel Treatments of Depression

Historically, the monoamine-imbalance hypothesis of depression played a dominant role in psychiatry and drug development. However, while traditional antidepressants cause a quick increase in noradrenaline, serotonin, or dopamine, there is a significant delay in their clinical effect and often an inadequate treatment response. As neuroscientists pursued this avenue of research, clinical and preclinical data across multiple modalities began to converge on pathways involved in neuroplasticity. They found a strong inverse relationship between the number of synapses and the severity of depression symptoms, and discovered that, in addition to their neurotransmitter effect, traditional antidepressants improved neuroplasticity, but over a significantly protracted time course of weeks or months. The search for faster-acting antidepressants found success with ketamine, a well-known anaesthetic agent, which was found to have potent antidepressant effects after a single infusion owing to its capacity to rapidly increase the number of dendritic spines and to restore aspects of functional connectivity. Additional neuroplasticity-promoting compounds with therapeutic effects that were both rapid and enduring have been identified, including serotonergic psychedelics, cholinergic scopolamine, and other novel compounds. To differentiate traditional antidepressants focused on monoamine modulation from this new category of fast-acting antidepressants that achieve therapeutic effects through neuroplasticity, the term psychoplastogen was introduced.

Transhumanism and Bodyhacking

Bodyhacking, situated at the intersection of technology and biology, represents efforts to enhance human capabilities beyond natural limitations. Innovations in this field include vests that convert sound into vibrations for individuals with hearing impairments and advanced prosthetics capable of integrating with neural signals to mimic natural movement. Bodyhacking also encompasses sensory augmentation, such as implants that enable new forms of perception or interaction. While these developments demonstrate the potential to improve quality of life and expand human ability, they also raise ethical questions regarding accessibility, safety, and the integration of technology into the human body.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Neuroplasticity >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

An Overview of Situationism (in Psychology)

Introduction

Within the person–situation debate, situationism is the theory that changes in human behaviour are determined by the situation rather than by the traits a person possesses. Behaviour is believed to be influenced by external, situational factors rather than internal traits or motivations. Situationism therefore challenges the positions of trait theorists such as Hans Eysenck or Raymond B. Cattell. The debate is ongoing, with experimental evidence marshalled for both positions.

Brief History and Conceptions

Situationists believe that thoughts, feelings, dispositions, and past experiences and behaviours do not determine what someone will do in a given situation; rather, the situation itself does. Situationists tend to assume that character traits are distinctive, meaning that they do not completely disregard the idea of traits but suggest that situations have a greater impact on behaviour than those traits do. Situationism is also influenced by culture, in that the extent to which people believe situations impact behaviours varies between cultures. Situationism has been perceived as arising in response to trait theories, correcting the notion that everything we do stems from our traits. However, situationism has also been criticised for ignoring individuals’ inherent influences on behaviour. Many experiments support the theory, as shown in the sources below and in this article itself; however, these experiments do not test what people would do in situations that are forced or rushed, even though many mistakes arise from rushing or from lapses of concentration. Because situationism can be looked at in many different ways, it likewise needs to be tested and experimented on in many different ways.

Criticisms for Situationism

While situationism has become an increasingly popular theory in the field of philosophy, some wonder why it never quite garnered the same attention in psychology. One reason could be the criticism, put forth by psychologists, that just because a personality effect does not account for the entirety of an observed behaviour, there is no reason to believe the rest is determined by situational effects. Rather, many psychologists believe that trait–situation interactions are more likely responsible for observed behaviours; that is, we cannot attribute behaviour to just personality traits or just situational effects, but rather to an interaction between the two. Additionally, the popularity of the Big Five factor model of personality within psychology has overshadowed situationism. Because this model identifies specific personality traits and claims they can explain an individual’s behaviour and decisions, situationism has become somewhat obsolete.

Experimental Evidence

Evidence For Situationism

Many studies have produced evidence supporting situationism. One notable situationist study is Philip Zimbardo’s Stanford prison experiment, considered among the most unethical because the participants were deceived and were physically and psychologically abused. Zimbardo wanted to discover two things: whether prison guards abuse prisoners because of their nature or because of the power and authority the situation grants them, and whether prisoners act violently because of their nature or because of being placed in a secluded and violent environment. To carry out the experiment, Zimbardo gathered 24 college men and paid them 15 dollars a day to live for two weeks in a mock prison. The participants were told that they had been chosen to be a guard or a prisoner because of their personality traits, but in fact they were randomly assigned. The prisoners were booked, given prison clothes and no possessions, and assigned a number by which they were referred to, with the intent of further dehumanising them.

Within the first night, the prisoner and guard dynamics began to take hold. The guards started waking the prisoners in the middle of the night for count, yelling at and ridiculing them. The prisoners in turn developed hostility towards the guards and began having prison-related conversations. By the second day, the guards were abusing the prisoners by forcing them to do push-ups, and the prisoners rebelled by removing their caps and numbers and barricading themselves in their cells with their mattresses against the doors. As the days passed, the relationship between guards and prisoners became extremely hostile: the prisoners fought for their independence, and the guards fought to strip them of it.

There were many cases in which prisoners began breaking down psychologically, starting with prisoner 8612. One day after the experiment started, prisoner 8612 had anxiety attacks and asked to leave. He was told, “You can’t leave. You can’t quit.” He then went back to the prison and “began to act ‘crazy,’ to scream, to curse, to go into a rage that seemed out of control.” After this, he was sent home. The other prisoner who broke down was 819, who was told to rest in a room. When Zimbardo went to check on him, he recalled, “what I found was a boy crying hysterically while in the background his fellow prisoners were yelling and chanting that he was a bad prisoner, that they were being punished because of him.” Zimbardo then allowed him to leave, but he said he could not because he had been labelled a bad prisoner, to which Zimbardo responded, “Listen, you are not 819. My name is Dr. Zimbardo, I am a psychologist, and this is not a prison. This is just an experiment and those are students, just like you. Let’s go.” By Zimbardo’s account, the prisoner stopped crying suddenly, looked up “like a small child awakened from a nightmare,” and said, “OK, let’s go.”

The guards also developed extremely abusive relations with the prisoners. Zimbardo claimed there were three types of guards: those who followed all the rules but got the job done, those who felt bad for the prisoners, and those who were extremely hostile and treated the prisoners like animals. This last type showed the behaviour of actual guards and seemed to forget they were college students; they got into their roles faster and seemed to enjoy tormenting the prisoners. On Thursday night, six days into the experiment, Zimbardo described the guards’ behaviour as “sadistic” and decided to close down the study early.

This study showed how ordinary people can completely dissociate from who they are when their environment changes: regular college students turned into broken-down prisoners and sadistic guards.

Studies investigating bystander effects also support situationism. For example, in 1973, Darley and Batson conducted a study where they asked students at a seminary school to give a presentation in a separate building. They gave each individual participant a topic, and would then tell a participant that they were supposed to be there immediately, or in a few minutes, and sent them on their way to the building. On the way, each participant encountered a confederate who was on the ground, clearly in need of medical attention. Darley and Batson observed that more participants who had extra time stopped to help the confederate than those who were in a hurry. Helping was not predicted by religious personality measures, and the results therefore indicate that the situation influenced their behaviour.

A third well-known study supporting situationism is an obedience study, the Milgram experiment. Stanley Milgram devised his obedience study to explain the phenomenon of obedience, specifically the Holocaust: he wanted to understand how people follow orders, and how likely people are to do immoral things when ordered to by figures of authority. Milgram recruited 40 men through a newspaper ad to take part in a study at Yale University. The men were between 20 and 50 years old and were paid $4.50 for showing up. In the study, each participant was assigned to be a “teacher” and a confederate was assigned to be a “learner”; the participant was unaware that the learner was a confederate. The teachers were told that the learners had to memorise word pairs, and that every incorrect answer would be punished with a shock of increasing voltage. The voltages ranged from 15 to 450 volts, and to convince participants the shocks were real, the experimenters administered a genuine 45-volt shock to each participant. The participant would test the learner and, for each incorrect answer, would have to shock the learner at an increasing voltage. The shocks were not actually administered, but the participant believed they were. When the shocks reached 300 volts, the learner began to protest and show discomfort. Milgram expected participants to stop the procedure, but 65% of them, 26 of the 40, continued to completion, administering shocks that could have been fatal, even while uncomfortable or upset. Although most participants continued, they showed distressed reactions while administering the shocks, such as hysterical laughter. Participants felt compelled to obey the experimenter, the authority figure present in the room, who continued to encourage them throughout the study.

Evidence against Situationism

The core evidence cited for situationism is that personality traits have a very weak relationship to behaviour, whereas situational factors usually have a stronger impact. Against this, people are able to describe the character traits of those close to them, such as friends and family, which suggests that stable, recognisable traits do exist.

In addition, other studies show related trends. For example, twin studies have shown that identical twins share more traits than fraternal twins, implying a genetic basis for behaviour, which directly contradicts the situationist view that behaviour is determined by the situation. Observing one instance of extroverted or honest behaviour is commonly taken to predict similarly honest or extroverted behaviour in other situations; yet when many people are observed across a range of situations, the correlation between trait-related behaviours is about .20 or less, whereas people intuitively assume the correlation is around .80. This gap suggests that behaviour depends more on the characteristics and circumstances of the immediate situation than observers expect.
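The gap between a correlation of .20 and the intuitively assumed .80 is easy to underestimate. A minimal sketch, using a hand-rolled Pearson coefficient and purely hypothetical figures rather than data from any cited study, shows the shared variance each value implies:

```python
# Illustrative only: the numbers are the round figures quoted in the
# person-situation literature, not data from any particular experiment.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Shared variance is the square of the correlation coefficient, so the
# practical difference between r = .20 and r = .80 is 4% vs 64%.
observed, assumed = 0.20, 0.80
print(f"r = {observed:.2f} -> trait explains {observed ** 2:.0%} of variance")
print(f"r = {assumed:.2f} -> trait explains {assumed ** 2:.0%} of variance")
```

Because shared variance is the square of r, a cross-situational correlation of .20 accounts for only about 4% of behavioural variance, versus 64% for the assumed .80, which is why the figure is taken as evidence that traits predict far less than observers expect.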

These recent challenges to the Traditional View have not gone unnoticed. Some have attempted to modify the Traditional View to insulate it from these challenges, while others have tried to show how these challenges fail to undermine the Traditional View at all. For example, Dana Nelkin (2005), Christian Miller (2003), Gopal Sreenivasan (2002), and John Sabini and Maury Silver (2005), among others, have argued that the empirical evidence cited by the Situationists does not show that individuals lack robust character traits.

Current Views: Interactionism

In addition to the debate between trait influences and situational influences on behaviour, a psychological model of “interactionism” exists, which is a view that both internal dispositions and external situational factors affect a person’s behaviour in a given situation. This model emphasizes both sides of the person-situation debate, and says that internal and external factors interact with each other to produce a behaviour. Interactionism is currently an accepted personality theory, and there has been sufficient empirical evidence to support interactionism. However, it is also important to note that both situationists and trait theorists contributed to explaining facets of human behaviour.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Situationism_(psychology) >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

An Overview of Regulatory Focus Theory

Introduction

Regulatory focus theory (RFT) is a theory of goal pursuit formulated by Columbia University psychology professor and researcher E. Tory Higgins regarding people’s motivations and perceptions in judgement and decision-making processes. RFT examines the relationship between a person’s motivation and the way in which they go about achieving their goal. RFT posits two separate and independent self-regulatory orientations: prevention and promotion (Higgins, 1997).

This psychological theory, like many others, is applied in communication, specifically in the subfields of nonverbal communication and persuasion. Chronic regulatory focus is measured using the Regulatory Focus Questionnaire (Higgins et al., 2001) or the Regulatory Strength measure. Momentary regulatory focus can be primed or induced.

Background

Regulatory Fit Theory

To understand RFT, it is important to understand another of E. Tory Higgins’ theories: regulatory fit theory. When a person believes that there is “fit”, they will involve themselves more in what they are doing and “feel right” about it. Regulatory fit should not directly affect the hedonic experience of a thing or occasion, but should influence a person’s assurance in their reaction to the object or event.

Regulatory fit theory suggests that a match between orientation to a goal and the means used to approach that goal produces a state of regulatory fit that both creates a feeling of rightness about the goal pursuit and increases task engagement (Higgins, 2001, 2005). Regulatory fit intensifies responses, such as the value of a chosen object, persuasion, and job satisfaction.

Regulatory fit does not increase the assessment of a decision; instead when someone feels “right” about their decision, the experience of “correctness and importance” is transferred to the ensuing assessment of the chosen object, increasing its superficial worth. Research suggests that the “feeling right” experience can then sway retrospective or prospective evaluations. Regulatory fit can be manipulated incidentally (outside the context of interest) or integrally (within the context of interest).

Definition

RFT refers to when a person pursues a goal in a way that maintains the person’s own personal values and beliefs, also known as regulatory orientation. This theory operates on the basic principle that people embrace pleasure but avoid pain, and they then maintain their regulatory fit based on this standard.

Regulatory focus is, at bottom, the way in which someone approaches pleasure and avoids pain. An individual’s regulatory focus concentrates on desired end-states and the approach motivation used to move from the current state to the desired end-state. The theory distinguishes two such foci. The first is a promotion focus on hopes and accomplishments, also known as gains; this focus is concerned with higher-level gains such as advancement and accomplishment.

Another focus is the prevention-focus based on safety and responsibilities, also known as non-losses. This focus emphasizes security and safety by following the guidelines and the rules.

These two regulatory focuses regulate the influences that a person would be exposed to in the decision-making process, and determine the different ways they achieve their goal, as discussed by RFT. An individual’s regulatory orientation is not necessarily fixed. While individuals have chronic tendencies towards either promotion or prevention, these preferences may not hold for all situations. Furthermore, a specific regulatory focus can be induced.

The value taken from interaction and goal attainment can be either positive or negative. The decision has positive value when people attempt to attain their goal in a way that fits their regulatory orientation and it will have negative value when people attempt to attain their goal in a way that does not fit their regulatory orientation. Regulatory fit allows value to be created by intensifying the commitment, based on one of the regulatory focus orientations. Making choices and fulfilling objectives are considered as activities, and with any activity, people can be more or less involved. When this involvement is strong, it can intensify the feelings and values about this activity, and the approach to the activity determines whether they are or are not satisfied with the outcome and method of achieving the outcome.

This theory has noteworthy implications for increasing the value of life. For example, in interpersonal conflict, if each person experiences “fit”, each will be satisfied with and committed to the outcome. In the broad sense, for people to appreciate their own lives, they need to be satisfied with, and “feel right” about, what they are doing and the way they are doing it. If the experience is not satisfying, it is known as “non-fit”, and they will not reach their desired goal.

Goal Attainment and Motivation

Regulatory focus theory, according to Higgins, views motivation in a way that allows an understanding of the foundational ways we approach a task or a goal. Different factors can motivate people during goal pursuit, and we self-regulate our methods and processes along the way. RFT proposes that motivational strength is enhanced when the manner in which people work toward a goal sustains their regulatory orientation. Achieving a goal in a way that is consistent with a person’s regulatory orientation lends the event a sense of importance and creates a greater sense of commitment to the goal. The more strongly an individual is engaged (i.e. involved, occupied, fully engrossed) in an activity, the more intense the motivational force experienced. Engagement matters both for attaining the goal and for sustaining motivation toward it: it serves as an intensifier of the directional component of the value experience. An individual who is strongly engaged in goal pursuit will experience a positive target more positively and a negative target more negatively.

Individuals can pursue different goals with diverse regulatory orientations and in different ways. There are two kinds of regulatory orientation that people use to attain their goals: promotion-focus orientation and prevention-focus orientation. These terms derive from E. Tory Higgins’s theory of regulatory focus, in which he proposes that people regulate their goal-oriented behaviour in two distinct ways, coined promotion-focus orientation and prevention-focus orientation.

E. Tory Higgins uses this example: there is Student A and Student B, and they both have the shared goal to make an A in a class they are both taking in college. Student A uses a promotion-focus orientation which slants them towards achieving their goal and towards advancement, growth and life accomplishment. This would cause Student A to view the goal as an ideal that satisfies their need for accomplishment. Student B uses a prevention-focus orientation where the goal is something that should be realised because it fulfils their need for security, protection and prevention of negative outcomes. Student A uses an eager approach where they read extra materials to obtain their goal of an A. Student B uses a vigilant approach where they become more detail oriented and pay careful attention to completing all of the course requirements.

Both forms of regulatory orientation can serve to fulfil goals, but the choice of orientation is based on individual preferences and style. When people pursue their goal with the focus that fits their regulatory orientation, they are likely to pursue it more eagerly and aggressively than if they were using the other focus. In this example each student has a different style, and each feels more comfortable pursuing the goal in their own way. The outcome would have been different had the students been given an undesirable choice.

When people make decisions, they often envision the possible “pleasure or pain” of the outcomes that each focus orientation may produce. A person imagining making a pleasing choice is more likely to engage in promotion-focus orientation, because envisioning the possible outcome of success maintains eagerness about the outcome without placing importance on vigilance. A person imagining the possible pain of an undesirable choice maintains more vigilance but less eagerness.

A person with promotion-focus orientation is more likely to remember the occasions where the goal is pursued by using eagerness approaches and less likely to remember occasions where the goal is pursued by vigilance approaches. A person with prevention-focus orientation is more likely to remember events where the goal is pursued by means of vigilance than if it was pursued using eagerness approaches.

Application

Regulatory Focus Theory and Persuasion

When relating regulatory focus theory to persuasion, it is important to remember that RFT is a goal-attainment theory, and that RFT can spawn feelings of rightness/wrongness which in turn may produce formulations for judgements.

Feelings of rightness give an individual greater commitment to the incoming information and help them avoid endangering their regulatory fit, which in turn supports their regulatory focus and their acceptance of a probable motive to change. If a person experiences feelings of wrongness, they will suffer negative emotions and deem the experience and information a threat to their regulatory fit, and therefore to their regulatory focus and their goal.

Studies have applied fit and focus to consumer purchasing, health advisories, and social policy issues, demonstrating their applicability. To be persuaded is to change one’s prior feelings, actions, and/or beliefs on a matter to agree with the persuader.

The “fit” involved in RFT plays a large role in such issues and stories because it can help an individual receive and review the experience during a particular message delivery. Positive reinforcement and feelings of rightness while decoding the message create a stronger engagement with processing the message; negative reinforcement and feelings of wrongness lessen the engagement and attachment.

Researchers found that targeting the two regulatory focus orientations, and their corresponding types of fit, is an effective way to strengthen persuasive appeal. They introduced a manner of persuasion in which the framing of the message, rather than its content, was what upheld or disrupted a person’s regulatory fit and followed the pattern of logic used in regulatory orientation.

Lee and Aaker (2004) conducted an experiment testing whether to frame information around prevention-focus or promotion-focus concerns. The study used an advertisement for a grape juice drink, presented in two versions: one addressing prevention-focus concerns (disease prevention) and one addressing promotion-focus concerns (energy enhancement). In doing so, they demonstrated that rather than trying to know each individual recipient’s qualities, one need only identify the relevant focus (prevention or promotion) and then frame the message so that it creates that “rightness”.

Some may confuse RFT with regulatory fit, regulatory relevance, message matching, and source attractiveness in such an example. The extent of similarity between RFT and these closely related theories makes it hard to clarify when RFT is applicable or apparent in the persuasion process.

Regulatory Focus Theory and Nonverbal Communication

RFT can be a useful framework for understanding the effects of nonverbal cues in persuasion and impression formation. Regulatory fit theory suggests that the effect of a cue cannot be understood without considering what the cue means given a recipient’s focus orientation.

Nonverbal cues can be used by the message source to vary the delivery style of a given message, specifically to convey eagerness or vigilance, in a way that produces regulatory fit in message recipients of different focus orientations.

Advancement implies eager movement forward, so eagerness is conveyed by gestures that involve animated, broad opening movements such as hand movements projecting outward, forward leaning body positions, fast body movement, and fast speech rate. Caution implies vigilant carefulness, so vigilance should be conveyed by gestures that show precision like slightly backward-leaning body positions, slower body movement, and slower speech rate.

An eager nonverbal delivery style will result in greater message effectiveness for promotion-focus recipients than for prevention-focus recipients, while the opposite is true for a vigilant nonverbal style.

Various aspects may contribute to whether a message’s persuasive element succeeds. One is the effect of nonverbal cues and their association with persuasive appeals based on the message recipient’s motivational regulatory orientation, which shapes the recipient’s impression of the source during impression formation.

Research has found that nonverbal cues are an essential element of most persuasive appeals. RFT creates the background that allows a prediction for when and for whom a nonverbal cue can have an effect on persuasion. When nonverbal cues and signals are used appropriately, they increase the effectiveness of persuasion.

Moral Judgement

RFT has also been applied within moral psychology to the topic of moral judgment, contrasting the notions of “oughts” and “ideals.”

References

  • Cesario, J.: “Regulatory fit and persuasion: Basic principles and remaining questions”, Social and Personality Psychology Compass, 2(1).
  • Higgins, E. T.: “Making a Good Decision: Value From Fit”, American Psychologist, 55(11): 1217.
  • Higgins, E. T. (2005): “Value From Regulatory Fit”, Current Directions in Psychological Science, 14(4): 209–213. doi:10.1111/j.0963-7214.2005.00366.x.
  • Kruglanski, A. W., Pierro, A., & Higgins, E. T. (2007): “Regulatory Mode and Preferred Leadership Styles: How Fit Increases Job Satisfaction”, Basic and Applied Social Psychology, 29(2): 137–149. doi:10.1080/01973530701331700.
  • Avnet, T.: “Locomotion, assessment, and regulatory fit: Value transfer from ‘how’ to ‘what’”, Journal of Experimental Social Psychology, 39(5): 525.
  • Manczak, E. M., Zapata-Gietl, C., & McAdams, D. P. (2014): “Regulatory focus in the life story: Prevention and promotion as expressed in three layers of personality”, Journal of Personality and Social Psychology, 106(1): 169–181. doi:10.1037/a0034951. PMID 24377362.
  • Higgins, E. T.: “Achievement orientations from subjective histories of success: Promotion pride versus prevention pride”, European Journal of Social Psychology, 31(1): 4.
  • Higgins, E. T. (1997): “Beyond pleasure and pain”, American Psychologist, 52(12): 1281.
  • Spiegel, S.: “How regulatory fit enhances motivational strength during goal pursuit”, European Journal of Social Psychology, 34(1): 40.
  • Larsen, R., & Buss, D. (2009): Personality Psychology: Domains of Knowledge About Human Nature (4th ed.), p. 388.
  • Vaughn, A.: “‘This story is right on’: The impact of regulatory fit on narrative engagement and persuasion”, European Journal of Social Psychology, 39: 448.
  • Cesario, J., & Higgins, E. T. (2008): “Making Message Recipients ‘Feel Right’: How Nonverbal Cues Can Increase Persuasion”, Psychological Science, 19(5).
  • Cornwell, J. F. M., & Higgins, E. T. (2015): “The ‘Ought’ Premise of Moral Psychology and the Importance of the Ethical ‘Ideal’”, Review of General Psychology, 19(3): 311–328. doi:10.1037/gpr0000044.
  • Cornwell, J. F. M., & Higgins, E. T. (2015): “Approach and avoidance in moral psychology: Evidence for three distinct motivational levels”, Personality and Individual Differences, 86: 139–149. doi:10.1016/j.paid.2015.06.012.
  • Cornwell, J. F. M., & Higgins, E. T. (2016): “Eager feelings and vigilant reasons: Regulatory focus differences in judging moral wrongs”, Journal of Experimental Psychology: General, 145(3): 338–355. doi:10.1037/xge0000136. PMC 4755905. PMID 26726912.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Regulatory_focus_theory >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

What is Personal Construct Theory?

Introduction

Within personality psychology, personal construct theory (PCT) or personal construct psychology (PCP) is a theory of personality and cognition developed by the American psychologist George Kelly in the 1950s. The theory addresses the psychological reasons for actions. Kelly proposed that individuals can be psychologically evaluated according to similarity–dissimilarity poles, which he called personal constructs (schemas, or ways of seeing the world). The theory is considered by some psychologists as a forerunner of theories of cognitive therapy.

From the theory, Kelly derived a psychotherapy approach, as well as a technique called the repertory grid interview, that helped his patients to analyse their own personal constructs with minimal intervention or interpretation by the therapist. The repertory grid was later adapted for various uses within organisations, including decision-making and interpretation of other people’s world-views. The UK Council for Psychotherapy, a regulatory body, classifies PCP therapy within the experiential subset of the constructivist school.

Principles

A main tenet of PCP theory is that a person’s unique psychological processes are channelled by the way they anticipate events. Kelly believed that anticipation and prediction are the main drivers of our mind. “Every man is, in his own particular way, a scientist”, said Kelly: people are constantly building up and refining theories and models about how the world works so that they can anticipate future events. People start doing this at birth (for example, a child discovers that if they start to cry, their mother will come to them) and continue refining their theories as they grow up.

Kelly proposed that every construct is bipolar, specifying how two things are similar to each other (lying on the same pole) and different from a third thing, and they can be expanded with new ideas. (More recent researchers have suggested that constructs need not be bipolar.) People build theories—often stereotypes—about other people and also try to control them or impose on others their own theories so as to be better able to predict others’ actions. All these theories are built up from a system of constructs. A construct has two extreme points, such as “happy–sad,” and people tend to place items at either extreme or at some point in between. People’s minds, said Kelly, are filled up with these constructs at a low level of awareness.

A given person, set of persons, event, or circumstance can be characterised fairly precisely by the set of constructs applied to it and by the position of the thing within the range of each construct. For example, Fred may feel that he is neither happy nor sad (an example of a construct); he feels he is between the two. However, he feels he is more clever than he is stupid (another construct). A baby may have a preverbal construct of what behaviours may cause their mother to come to them. Constructs can be applied to anything people put their attention to, and constructs also strongly influence what people fix their attention on. People can construe the same reality differently by applying different constructs. Hence, determining a person’s system of constructs would go a long way towards understanding them, especially the person’s essential constructs, which represent their very strong and unchangeable beliefs and their self-construal.

Kelly did not use the concept of the unconscious; instead, he proposed the notion of “levels of awareness” to explain why people did what they did. He identified “construing” as the highest level and “preverbal” as the lowest level of awareness.

Some psychologists have suggested that PCT is not a psychological theory but a metatheory because it is a theory about theories.

Therapy Approach

Kelly believed in a non-invasive or non-directive approach to psychotherapy. Rather than having the therapist interpret the person’s psyche, which would amount to imposing the doctor’s constructs on the patient, the therapist should just act as a facilitator of the patient finding his or her own constructs. The patient’s behaviour is then mainly explained as ways to selectively observe the world, act upon it and update the construct system in such a way as to increase predictability. To help the patient find his or her constructs, Kelly developed the repertory grid interview technique.

Kelly explicitly stated that each individual’s task in understanding their personal psychology is to put in order the facts of his or her own experience. Then the individual, like the scientist, is to test the accuracy of that constructed knowledge by performing those actions the constructs suggest. If the results of their actions are in line with what the knowledge predicted, then they have done a good job of finding the order in their personal experience. If not, then they can modify the construct: their interpretations or their predictions or both. This method of discovering and correcting constructs is roughly analogous to the general scientific method that is applied in various ways by modern sciences to discover truths about the universe.

The Repertory Grid

The repertory grid serves as part of various assessment methods to elicit and examine an individual’s repertoire of personal constructs. There are different formats such as card sorts, verbally administered group format, and the repertory grid technique.

The repertory grid itself is a matrix where the rows represent constructs found, the columns represent the elements, and cells indicate with a number the position of each element within each construct. There is software available to produce several reports and graphs from these grids.

To build a repertory grid for a patient, Kelly might first ask the patient to select about seven elements (although there are no fixed rules for the number), whose nature might depend on whatever the patient or therapist is trying to discover: for instance, “Two specific friends, two work-mates, two people you dislike, your mother and yourself”, or something of that sort. Then three of the elements would be selected at random, and the therapist would ask: “In relation to … (whatever is of interest), in which way are two of these people alike but different from the third?” The answer is sure to indicate one of the extreme points of one of the patient’s constructs. The patient might say, for instance, that Fred and Sarah are very communicative whereas John is not. Further questioning would reveal the other end of the construct (say, introvert) and the positions of the three characters between the extremes. Repeating the procedure with different sets of three elements ends up revealing several constructs the patient might not have been fully aware of.
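As a rough illustration, the grid and the triadic elicitation step described above can be sketched in code. This is a hypothetical sketch, not part of Kelly’s method: the class name, the element names, and the 1–5 rating scale are all illustrative assumptions.

```python
def elicit_construct(triad):
    """In a real interview the therapist asks how two elements of the
    triad are alike but different from the third; here we only build
    the question that would be posed (illustrative wording)."""
    a, b, c = triad
    return f"In which way are {a} and {b} alike but different from {c}?"

class RepertoryGrid:
    """Matrix described in the text: rows are bipolar constructs,
    columns are elements, cells hold each element's position on the
    construct's scale (assumed here to run 1-5 between the poles)."""
    def __init__(self, elements):
        self.elements = list(elements)
        self.constructs = []   # (left pole, right pole) per row
        self.ratings = []      # one list of ints per construct row

    def add_construct(self, left_pole, right_pole, ratings):
        # One rating per element, in column order.
        assert len(ratings) == len(self.elements)
        self.constructs.append((left_pole, right_pole))
        self.ratings.append(list(ratings))

    def position(self, element, construct_index):
        # Cell lookup: the element's position within the construct's range.
        return self.ratings[construct_index][self.elements.index(element)]

elements = ["Fred", "Sarah", "John", "mother", "self"]
grid = RepertoryGrid(elements)
print(elicit_construct(("Fred", "Sarah", "John")))
# Construct elicited from that triad in the text:
# communicative (1) ... introvert (5); the numbers are invented.
grid.add_construct("communicative", "introvert", [1, 1, 5, 2, 3])
print(grid.position("John", 0))  # John sits at the introvert pole
```

Software for repertory grids builds reports and graphs from exactly this kind of constructs-by-elements matrix.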

In the book Personal Construct Methodology, researchers Brian R. Gaines and Mildred L.G. Shaw noted that they “have also found concept mapping and semantic network tools to be complementary to repertory grid tools and generally use both in most studies” but that they “see less use of network representations in PCP studies than is appropriate”. They encouraged practitioners to use semantic network techniques in addition to the repertory grid.

Organisational Applications

PCP has always been a minority interest among psychologists. During the last 30 years, it has gradually gained adherents in the US, Canada, the UK, Germany, Australia, Ireland, Italy and Spain. While its chief fields of application remain clinical and educational psychology, there is an increasing interest in its applications to organisational development, employee training and development, job analysis, job description and evaluation. The repertory grid is often used in the qualitative phase of market research, to identify the ways in which consumers construe products and services.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Personal_construct_theory >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

An Overview of Neuroepigenetics

Introduction

Neuroepigenetics is the study of how epigenetic changes to genes affect the nervous system. These changes may affect underlying conditions such as addiction, cognition, and neurological development.

Mechanisms

Neuroepigenetic mechanisms regulate gene expression in the neuron. Often, these changes take place due to recurring stimuli. Neuroepigenetic mechanisms involve proteins or protein pathways that regulate gene expression by adding, editing or reading epigenetic marks such as methylation or acetylation. Some of these mechanisms include ATP-dependent chromatin remodelling, LINE1, and prion protein-based modifications. Other silencing mechanisms include the recruitment of specialised proteins that methylate DNA such that the core promoter element is inaccessible to transcription factors and RNA polymerase. As a result, transcription is no longer possible. One such protein pathway is the REST co-repressor complex pathway. There are also several non-coding RNAs that regulate neural function at the epigenetic level. These mechanisms, along with neural histone methylation, affect arrangement of synapses, neuroplasticity, and play a key role in learning and memory.

Methylation

DNA methyltransferases (DNMTs) are involved in regulating the electrophysiological landscape of the brain through methylation of CpGs. Several studies have shown that inhibition or depletion of DNMT1 activity during neural maturation leads to hypomethylation of the neurons by removing the cell’s ability to maintain methylation marks in the chromatin. This gradual loss of methylation marks leads to changes in the expression of crucial developmental genes that may be dosage sensitive, leading to neural degeneration. This was observed in mature neurons in the dorsal portion of the mouse prosencephalon, where there were significantly greater amounts of neural degeneration and poor neural signalling in the absence of DNMT1. Despite poor survival rates amongst the DNMT1-depleted neurons, some of the cells persisted throughout the lifespan of the organism. The surviving cells reaffirmed that the loss of DNMT1 led to hypomethylation in the neural cell genome. These cells also exhibited poor neural functioning. In fact, a global loss of neural functioning was observed in these model organisms, with the greatest amounts of neural degeneration occurring in the prosencephalon.

Other studies showed a similar trend for DNMT3a and DNMT3b. However, these DNMTs add new methyl marks on unmethylated DNA, unlike DNMT1. Like DNMT1, the loss of DNMT3a and 3b resulted in neuromuscular degeneration two months after birth, as well as poor survival rates amongst the progeny of the mutant cells, even though DNMT3a does not regularly function to maintain methylation marks. This conundrum was addressed by other studies, which recorded rare loci in mature neurons where DNMT3a acted as a maintenance DNMT. The Gfap locus, which codes for the formation and regulation of the cytoskeleton of astrocytes, is one locus where this activity is observed. The gene is regularly methylated to downregulate glioma-related cancers. DNMT inhibition leads to decreased methylation and increased synaptic activity.

Several studies show that the methylation-related increase or decrease in synaptic activity occurs due to the upregulation or downregulation of receptors at the neurological synapse. Such receptor regulation plays a major role in many important mechanisms, such as the ‘fight or flight’ response. The glucocorticoid receptor (GR) is the most studied of these receptors. During stressful circumstances, a signalling cascade begins from the pituitary gland and terminates due to a negative feedback loop from the adrenal gland. In this loop, an increase in the level of the stress response hormone results in an increase in GR; an increase in GR results in a decreased cellular response to the hormone. Methylation of the I7 exon within the GR locus has been shown to lead to a lower level of basal GR expression in mice. These mice were more susceptible to high levels of stress than mice with lower levels of methylation at the I7 exon. Up-regulation or down-regulation of receptors through methylation leads to changes in the synaptic activity of the neuron.

Hypermethylation, CpG Islands, and Tumour Suppressing Genes

CpG islands (CGIs) are regulatory elements that can influence gene expression by allowing or interfering with transcription initiation or enhancer activity. CGIs are generally interspersed with the promoter regions of the genes they affect and may affect more than one promoter region. In addition, they may include enhancer elements and be separate from the transcription start site. Hypermethylation at key CGIs can effectively silence expression of tumour-suppressing genes and is common in gliomas. Tumour-suppressing genes are those which inhibit a cell’s progression towards cancer. These genes are commonly associated with important functions that regulate cell-cycle events. For example, the PI3K and p53 pathways are affected by CGI promoter hypermethylation; this includes the promoters of the genes CDKN2/p16, RB, PTEN, TP53 and p14ARF. Importantly, glioblastomas are known to have a high frequency of methylation at CGIs/promoter sites. For example, Epithelial Membrane Protein 3 (EMP3) is a gene involved in cell proliferation as well as cellular interactions. It is also thought to function as a tumour suppressor, and in glioblastomas it is silenced via hypermethylation. Furthermore, reintroduction of the gene into EMP3-silenced neuroblasts results in reduced colony formation as well as suppressed tumour growth. In contrast, hypermethylation of promoter sites can also inhibit the activity of oncogenes and prevent tumorigenesis. Oncogenic pathways such as the transforming growth factor (TGF)-beta signalling pathway stimulate cells to proliferate. In glioblastomas, overactivity of this pathway is associated with aggressive forms of tumour growth. Hypermethylation of PDGF-B, the TGF-beta target, inhibits uncontrolled proliferation.

Hypomethylation and Aberrant Histone Modification

Global reduction in methylation is implicated in tumorigenesis. More specifically, wide spread CpG demethylation, contributing to global hypomethylation, is known to cause genomic instability leading to development of tumours. An important effect of this DNA modification is its transcriptional activation of oncogenes. For example, expression of MAGEA1 enhanced by hypomethylation interferes with p53 function.

Aberrant patterns of histone modifications can also take place at specific loci and ultimately manipulate gene activity. In terms of CGI promoter sites, methylation and loss of acetylation occur frequently at H3K9. Furthermore, H3K9 dimethylation and trimethylation are repressive marks which, along with bivalent differentially methylated domains, are hypothesised to make tumour-suppressing genes more susceptible to silencing. Abnormal presence or lack of methylation in glioblastomas is strongly linked to genes which regulate apoptosis, DNA repair, cell proliferation, and tumour suppression. One of the best-known examples of genes affected by aberrant methylation that contributes to the formation of glioblastomas is MGMT, a gene involved in DNA repair which encodes the protein O6-methylguanine-DNA methyltransferase. Methylation of the MGMT promoter is an important predictor of the effectiveness of alkylating agents that target glioblastomas. Hypermethylation of the MGMT promoter causes transcriptional silencing and is found in several cancer types including glioma, lymphoma, breast cancer, prostate cancer, and retinoblastoma.

Neuroplasticity

Neuroplasticity refers to the ability of the brain to undergo synaptic rearrangement in response to recurring stimuli. Neurotrophin proteins, amongst other factors, play a major role in synaptic rearrangement. Depletion of the neurotrophin BDNF, or of BDNF signalling, is one of the main factors in developing diseases such as Alzheimer’s disease, Huntington’s disease, and depression. Neuroplasticity can also occur as a consequence of targeted epigenetic modifications such as methylation and acetylation. Exposure to certain recurring stimuli leads to demethylation of particular loci and remethylation in a pattern that produces a response to that particular stimulus. Like histone readers, erasers and writers also modify histones, by removing and adding modifying marks respectively. An eraser, neuroLSD1, is a modified version of the original Lysine Demethylase 1 (LSD1) that exists only in neurons and assists with neuronal maturation. Although both versions of LSD1 share the same target, their expression patterns are vastly different and neuroLSD1 is a truncated version of LSD1. NeuroLSD1 increases the expression of immediate early genes (IEGs) involved in cell maturation. Recurring stimuli lead to differential expression of neuroLSD1, leading to rearrangement of loci. The eraser is also thought to play a major role in the learning of many complex behaviours and is one way through which genes interact with the environment.

Neurodegenerative Diseases

Alzheimer’s Disease

Alzheimer’s disease (AD) is a neurodegenerative disease known to progressively affect memory and incite cognitive degradation. Epigenetic modifications, both globally and on specific candidate genes, are thought to contribute to the aetiology of this disease. Immunohistochemical analyses of post-mortem brain tissues across several studies have revealed global decreases in both 5-methylcytosine (5mC) and 5-hydroxymethylcytosine (5hmC) in AD patients compared with controls. However, conflicting evidence has shown elevated levels of these epigenetic markers in the same tissues. Furthermore, these modifications appear to be affected early on in tissues associated with the pathophysiology of AD. The presence of 5mC at the promoters of genes is generally associated with gene silencing. 5hmC, the oxidised product of 5mC generated by ten-eleven translocation (TET) enzymes, is thought to be associated with activation of gene expression, though the mechanisms underlying this activation are not fully understood.

Regardless of variations in results of methylomic analysis across studies, it is known that the presence of 5hmC increases with differentiation and aging of cells in the brain. Furthermore, genes which have a high prevalence of 5hmC are also implicated in the pathology of other age related neurodegenerative diseases, and are key regulators of ion transport, neuronal development, and cell death. For example, over-expression of 5-Lipoxygenase (5-LOX), an enzyme which generates pro-inflammatory mediators from arachidonic acid, in AD brains is associated with high prevalence of 5hmC at the 5-LOX gene promoter region.

Amyotrophic Lateral Sclerosis

DNA modifications at different transcriptional sites have been shown to contribute to neurodegenerative diseases. These include harmful transcriptional alterations, such as those found in motor neuron functionality, associated with amyotrophic lateral sclerosis (ALS). Degeneration of upper and lower motor neurons, which contributes to muscle atrophy in ALS patients, is linked to chromatin modifications among a group of key genes. One important site regulated by epigenetic events is the hexanucleotide repeat expansion in C9orf72 within chromosome 9p21. Hypermethylation of the C9orf72-related CpG islands is associated with repeat expansion in ALS-affected tissues. Overall, silencing of the C9orf72 gene may result in haploinsufficiency, and may therefore influence the presentation of disease. The activity of chromatin modifiers is also linked to the prevalence of ALS. DNMT3A is an important methylating agent and has been shown to be present throughout the central nervous systems of those with ALS. Furthermore, over-expression of this de novo methyltransferase is also implicated in the cell death of motor-neuron analogues.

Mutations in the FUS gene, that encodes an RNA/DNA binding protein, are causally linked to ALS. ALS patients with such mutations have increased levels of DNA damage. The protein encoded by the FUS gene is employed in the DNA damage response. It is recruited to DNA double-strand breaks and catalyses recombinational repair of such breaks. In response to DNA damage, the FUS protein also interacts with histone deacetylase I, a protein employed in epigenetic alteration of histones. This interaction is necessary for efficient DNA repair. These findings suggest that defects in epigenetic signalling and DNA repair contribute to the pathogenesis of ALS.

Neuro-oncology

A multitude of genetic and epigenetic changes in the DNA profiles of brain cells are thought to be linked to tumorigenesis. These alterations, along with changes in protein function, have been shown to induce uncontrolled cell proliferation, expansion, and metastasis. While genetic events such as deletions, translocations, and amplification give rise to activation of oncogenes and deactivation of tumour-suppressing genes, epigenetic changes silence or up-regulate these same genes through key chromatin modifications.

Neurotoxicity

Neurotoxicity refers to damage to the central or peripheral nervous system caused by chemical, biological, or physical exposure to toxins. Neurotoxicity can occur at any age and its effects may be short-term or long-term, depending on the mechanism of action of the neurotoxin and the degree of exposure.

Certain metals are considered essential due to their role in key biochemical and physiological pathways, while the remaining metals are characterised as nonessential. Nonessential metals serve no purpose in any biological pathway, and the presence and accumulation of most of them in the brain can lead to neurotoxicity. When found inside the body, these nonessential metals compete with essential metals for binding sites, upset antioxidant balance, and, as they accumulate in the brain, can lead to harmful side effects such as depression and intellectual disability. An increase in nonessential heavy metal concentrations in air, water, food sources, and household products has increased the risk of chronic exposure.

Acetylation, methylation and histone modification are some of the most common epigenetic markers. While these changes do not directly affect the DNA sequence, they can alter the accessibility of genetic components, such as promoter or enhancer regions, necessary for gene expression. Studies have shown that long-term maternal exposure to lead (Pb) contributes to decreased methylation in areas of the foetal epigenome, for example the interspersed repetitive sequences (IRSs) Alu1 and LINE-1. The hypomethylation of these IRSs has been linked to increased risk for cancers and autoimmune diseases later in life. Additionally, studies have found a relationship between chronic prenatal Pb exposure and neurological diseases, such as Alzheimer’s and schizophrenia, as well as developmental issues. Furthermore, the acetylation and methylation changes induced by overexposure to lead result in decreased neurogenesis and neuron differentiation ability, and consequently interfere with early brain development.

Overexposure to essential metals can also have detrimental consequences for the epigenome. For example, when manganese, a metal normally used by the body as a cofactor, is present at high concentrations in the blood, it can negatively affect the central nervous system. Studies have shown that accumulation of manganese leads to dopaminergic cell death and consequently plays a role in the onset of Parkinson’s disease (PD). A hallmark of Parkinson’s disease is the accumulation of α-synuclein in the brain. Increased exposure to manganese leads to the downregulation of protein kinase C delta (PKCδ) through decreased acetylation, resulting in misfolding of the α-synuclein protein, which allows aggregation and triggers apoptosis of dopaminergic cells.

Research

The field has only recently seen a growth in interest and research, due to technological advancements that facilitate better resolution of the minute modifications made to DNA. However, even with significant advances in technology, studying the biology of neurological phenomena such as cognition and addiction comes with its own set of challenges. Biological study of cognitive processes, especially in humans, has many ethical caveats. Some procedures, such as brain biopsies of Rett syndrome patients, usually call for a fresh tissue sample that can only be obtained from the brain of a deceased individual. In such cases, the researchers have no control over the age of the brain tissue sample, thereby limiting research options. In the case of addiction to substances such as alcohol, researchers use mouse models to mirror the human version of the disease, even though mouse models do not translate very well to humans. Moreover, the mouse models are administered greater volumes of ethanol than humans normally consume in order to obtain more prominent phenotypes. Therefore, while model organisms and tissue samples provide an accurate approximation of the biology of neurological phenomena, these approaches do not provide a complete and precise picture of the exact processes underlying a phenotype or a disease.

Neuroepigenetics has also remained underdeveloped due to the controversy surrounding the classification of genetic modifications in mature neurons as epigenetic phenomena. The discussion arises because neurons do not undergo mitosis after maturation, yet the conventional definition of epigenetic phenomena emphasises heritable changes passed on from parent to offspring. However, various epigenetic marks, such as DNA methylation placed by DNA methyltransferases (DNMTs), are established in neurons, and these marks regulate gene expression throughout the life span of the neuron. The modifications heavily influence gene expression and the arrangement of synapses within the brain. Finally, although not inherited, most of these marks are maintained throughout the life of the cell once they are placed on chromatin.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Neuroepigenetics >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

An Overview of the Transtheoretical Model

Introduction

The transtheoretical model of behaviour change is an integrative theory of therapy that assesses an individual’s readiness to act on a new healthier behaviour, and provides strategies, or processes of change to guide the individual. The model is composed of constructs such as: stages of change, processes of change, levels of change, self-efficacy, and decisional balance.

The transtheoretical model is also known by the abbreviation “TTM” and sometimes by the term “stages of change”, although this latter term is a synecdoche, since the stages of change are only one part of the model along with processes of change, levels of change, etc. Several self-help books (Changing for Good (1994), Changeology (2012), and Changing to Thrive (2016)) and articles in the news media have discussed the model. In 2009, an article in the British Journal of Health Psychology called it “arguably the dominant model of health behaviour change, having received unprecedented research attention, yet it has simultaneously attracted exceptional criticism”.

Brief History and Core Constructs

James O. Prochaska of the University of Rhode Island, Carlo DiClemente, and colleagues developed the transtheoretical model beginning in 1977. It is based on analysis and use of different theories of psychotherapy, hence the name “transtheoretical”. Prochaska and colleagues refined the model on the basis of research that they published in peer-reviewed journals and books.

Stages of Change

This construct refers to the temporal dimension of behavioural change. In the transtheoretical model, change is a “process involving progress through a series of stages”:

  • Precontemplation (“not ready”) – “People are not intending to take action in the foreseeable future, and can be unaware that their behaviour is problematic”.
  • Contemplation (“getting ready”) – “People are beginning to recognize that their behaviour is problematic, and start to look at the pros and cons of their continued actions”.
  • Preparation (“ready”) – “People are intending to take action in the immediate future, and may begin taking small steps toward behaviour change”.
  • Action – “People have made specific overt modifications in modifying their problem behaviour or in acquiring new healthy behaviours”.
  • Maintenance – “People have been able to sustain action for at least six months and are working to prevent relapse”.
  • Termination – “Individuals have zero temptation and they are sure they will not return to their old unhealthy habit as a way of coping”.

In addition, the researchers conceptualised “Relapse” (recycling) which is not a stage in itself but rather the “return from Action or Maintenance to an earlier stage”.

The quantitative definition of the stages of change (see below) is perhaps the model’s best-known feature. However, it is also one of the most critiqued, even in the field of smoking cessation, where it was originally formulated. It has been argued that such a quantitative definition (e.g. a person is in Preparation if they intend to change within a month) does not reflect the nature of behaviour change, that it has no better predictive power than simpler questions (e.g. “do you have plans to change…?”), and that it has problems regarding its classification reliability.

Communication theorist and sociologist Everett Rogers suggested that the stages of change are analogues of the stages of the innovation adoption process in Rogers’ theory of diffusion of innovations.

Details of Each Stage

The standard time frame associated with each stage is:

  • Precontemplation – change intended more than 6 months away
  • Contemplation – change intended within the next 6 months
  • Preparation – change intended within the next month
  • Action – change underway now
  • Maintenance – change sustained for at least 6 months
  • Relapse – can occur at any time
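This time-based staging rule can be sketched as a small function. The function name, its boolean inputs, and the threshold handling below are illustrative only, not part of any validated TTM staging instrument (real algorithms also probe past quit attempts and may distinguish Termination):

```python
def classify_stage(intends_within_6_months, intends_within_30_days,
                   has_changed, months_since_change=None):
    """Classify a TTM stage of change from standard time-based answers.

    Simplified illustration of the quantitative stage definitions:
    Action covers the first 6 months after a change, Maintenance after.
    """
    if has_changed:
        if months_since_change is not None and months_since_change >= 6:
            return "Maintenance"
        return "Action"
    if intends_within_30_days:
        return "Preparation"
    if intends_within_6_months:
        return "Contemplation"
    return "Precontemplation"

print(classify_stage(False, False, False))    # Precontemplation
print(classify_stage(True, True, False))      # Preparation
print(classify_stage(False, False, True, 8))  # Maintenance
```

Note that a person in Preparation also intends to change within 6 months, so the 30-day question is checked first.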

Stage 1: Precontemplation (Not Ready)

People at this stage do not intend to start the healthy behaviour in the near future (within 6 months), and may be unaware of the need to change. People here learn more about healthy behaviour: they are encouraged to think about the pros of changing their behaviour and to feel emotions about the effects of their negative behaviour on others.

Precontemplators typically underestimate the pros of changing, overestimate the cons, and often are not aware of making such mistakes.

One of the most effective steps that others can help with at this stage is to encourage them to become more mindful of their decision making and more conscious of the multiple benefits of changing an unhealthy behaviour.

Stage 2: Contemplation (Getting Ready)

At this stage, participants are intending to start the healthy behaviour within the next 6 months. While they are usually now more aware of the pros of changing, the cons are still roughly equal to the pros. This ambivalence about changing can cause them to keep putting off taking action.

People here learn about the kind of person they could be if they changed their behaviour and learn more from people who behave in healthy ways.

Others can influence and help effectively at this stage by encouraging them to work at reducing the cons of changing their behaviour.

Stage 3: Preparation (Ready)

People at this stage are ready to start taking action within the next 30 days. They take small steps that they believe can help them make the healthy behaviour a part of their lives. For example, they tell their friends and family that they want to change their behaviour.

People in this stage should be encouraged to seek support from friends they trust, tell people about their plan to change the way they act, and think about how they would feel if they behaved in a healthier way. Their number one concern is: when they act, will they fail? They learn that the better prepared they are, the more likely they are to keep progressing.

Stage 4: Action (Current Action)

People at this stage have changed their behaviour within the last 6 months and need to work hard to keep moving ahead. These participants need to learn how to strengthen their commitments to change and to fight urges to slip back.

People in this stage progress by being taught techniques for keeping up their commitments such as substituting activities related to the unhealthy behaviour with positive ones, rewarding themselves for taking steps toward changing, and avoiding people and situations that tempt them to behave in unhealthy ways.

Stage 5: Maintenance (Monitoring)

People at this stage changed their behaviour more than 6 months ago. It is important for people in this stage to be aware of situations that may tempt them to slip back into doing the unhealthy behaviour—particularly stressful situations.

It is recommended that people in this stage seek support from and talk with people whom they trust, spend time with people who behave in healthy ways, and remember to engage in healthy activities (such as exercise and deep relaxation) to cope with stress instead of relying on unhealthy behaviour.

Relapse (Recycling)

Relapse in the TTM specifically applies to individuals who successfully quit smoking or using drugs or alcohol, only to resume these unhealthy behaviours. Individuals who attempt to quit highly addictive behaviours such as drug, alcohol, and tobacco use are at particularly high risk of a relapse. Achieving a long-term behaviour change often requires ongoing support from family members, a health coach, a physician, or another motivational source. Supportive literature and other resources can also be helpful to avoid a relapse from happening.

Processes of Change

The 10 processes of change are “covert and overt activities that people use to progress through the stages”.

To progress through the early stages, people apply cognitive, affective, and evaluative processes. As people move toward Action and Maintenance, they rely more on commitments, counterconditioning, rewards, environmental controls, and support.

Prochaska and colleagues state that their research related to the transtheoretical model shows that interventions to change behaviour are more effective if they are “stage-matched”, that is, “matched to each individual’s stage of change”.

In general, for people to progress they need:

  • A growing awareness that the advantages (the “pros”) of changing outweigh the disadvantages (the “cons”) – the TTM calls this decisional balance.
  • Confidence that they can make and maintain changes in situations that tempt them to return to their old, unhealthy behaviour – the TTM calls this self-efficacy.
  • Strategies that can help them make and maintain change – the TTM calls these processes of change.

The ten processes of change include:

  1. Consciousness-raising (Get the facts) — increasing awareness via information, education, and personal feedback about the healthy behaviour.
  2. Dramatic relief (Pay attention to feelings) — feeling fear, anxiety, or worry because of the unhealthy behaviour, or feeling inspiration and hope when hearing about how people are able to change to healthy behaviours.
  3. Self-re-evaluation (Create a new self-image) — realising that the healthy behaviour is an important part of who they want to be.
  4. Environmental re-evaluation (Notice your effect on others) — realizing how their unhealthy behaviour affects others and how they could have more positive effects by changing.
  5. Social liberation (Notice public support) — realising that society is supportive of the healthy behaviour.
  6. Self-liberation (Make a commitment) — believing in one’s ability to change and making commitments and re-commitments to act on that belief.
  7. Helping relationships (Get support) — finding people who are supportive of their change.
  8. Counterconditioning (Use substitutes) — substituting healthy ways of acting and thinking for unhealthy ways.
  9. Reinforcement management (Use rewards) — increasing the rewards that come from positive behaviour and reducing those that come from negative behaviour.
  10. Stimulus control (Manage your environment) — using reminders and cues that encourage healthy behaviour and avoiding places that do not.

Health researchers have extended Prochaska and DiClemente’s 10 original processes of change with an additional 21 processes. In the first edition of Planning Health Promotion Programmes, Bartholomew et al. (2006) summarised the processes that they identified in a number of studies; however, their extended list of processes was removed from later editions of the text, perhaps because the list mixes techniques with processes. There are unlimited ways of applying processes. The additional strategies of Bartholomew et al. were:

  1. Risk comparison (Understand the risks) – comparing risks with similar dimensional profiles: dread, control, catastrophic potential and novelty
  2. Cumulative risk (Get the overall picture) – processing cumulative probabilities instead of single incident probabilities
  3. Qualitative and quantitative risks (Consider different factors) – processing different expressions of risk
  4. Positive framing (Think positively) – focusing on success instead of failure framing
  5. Self-examination related to risk (Be aware of your risks) – conducting an assessment of risk perception, e.g. personalisation, impact on others
  6. Re-evaluation of outcomes (Know the outcomes) – emphasising positive outcomes of alternative behaviours and re-evaluating outcome expectancies
  7. Perception of benefits (Focus on benefits) – perceiving advantages of the healthy behaviour and disadvantages of the risk behaviour
  8. Self-efficacy and social support (Get help) – mobilising social support; skills training on coping with emotional disadvantages of change
  9. Decision making perspective (Decide) – focusing on making the decision
  10. Tailoring on time horizons (Set the time frame) – incorporating personal time horizons
  11. Focus on important factors (Prioritise) – incorporating personal factors of highest importance
  12. Trying out new behaviour (Try it) – changing something about oneself and gaining experience with that behaviour
  13. Persuasion of positive outcomes (Persuade yourself) – promoting new positive outcome expectations and reinforcing existing ones
  14. Modelling (Build scenarios) – showing models to overcome barriers effectively
  15. Skill improvement (Build a supportive environment) – restructuring environments to contain important, obvious and socially supported cues for the new behaviour
  16. Coping with barriers (Plan to tackle barriers) – identifying barriers and planning solutions when facing these obstacles
  17. Goal setting (Set goals) – setting specific and incremental goals
  18. Skills enhancement (Adapt your strategies) – restructuring cues and social support; anticipating and circumventing obstacles; modifying goals
  19. Dealing with barriers (Accept setbacks) – understanding that setbacks are normal and can be overcome
  20. Self-rewards for success (Reward yourself) – feeling good about progress; reiterating positive consequences
  21. Coping skills (Identify difficult situations) – identifying high risk situations; selecting solutions; practicing solutions; coping with relapse

While most of these processes and strategies are associated with health interventions such as stress management, exercise, healthy eating, smoking cessation and other addictive behaviour, some of them are also used in other types of interventions such as travel interventions. Some processes are recommended in a specific stage, while others can be used in one or more stages.

Decisional Balance

This core construct “reflects the individual’s relative weighing of the pros and cons of changing”. Decision making was conceptualised by Janis and Mann as a “decisional balance sheet” of comparative potential gains and losses. Decisional balance measures, the pros and the cons, have become critical constructs in the transtheoretical model. The pros and cons combine to form a decisional “balance sheet” of comparative potential gains and losses. The balance between the pros and cons varies depending on which stage of change the individual is in.

Sound decision making requires the consideration of the potential benefits (pros) and costs (cons) associated with a behaviour’s consequences. TTM research has found the following relationships between the pros, cons, and the stage of change across 48 behaviours and over 100 populations studied.

  • The cons of changing outweigh the pros in the Precontemplation stage.
  • The pros surpass the cons in the middle stages.
  • The pros outweigh the cons in the Action stage.

The evaluation of pros and cons is part of the formation of decisional balance. During the change process, individuals gradually increase the pros and decrease the cons forming a more positive balance towards the target behaviour. Attitudes are one of the core constructs explaining behaviour and behaviour change in various research domains. Other behaviour models, such as the theory of planned behaviour (TPB) and the stage model of self-regulated change, also emphasise attitude as an important determinant of behaviour. The progression through the different stages of change is reflected in a gradual change in attitude before the individual acts.
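The idea of a decisional “balance sheet” can be illustrated with a toy calculation. The function, item names, and rating scale below are hypothetical, not a validated decisional-balance inventory:

```python
def decisional_balance(pros, cons):
    """Mean pros rating minus mean cons rating on a shared scale.

    Toy illustration only: validated TTM inventories use standardised
    item sets and typically convert raw scores to T-scores.
    """
    return sum(pros) / len(pros) - sum(cons) / len(cons)

# Hypothetical 1-5 ratings for someone contemplating change:
pros = [4, 4, 3]   # e.g. health benefits, appearance, family approval
cons = [4, 3, 4]   # e.g. effort, cost, social pressure
print(decisional_balance(pros, cons))  # 0.0 - pros and cons in balance
```

A balance near zero mirrors the ambivalence described for the Contemplation stage; as people progress, the pros rise and the cons fall, pushing the balance positive.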

Due to the use of decisional balance and attitude, travel behaviour researchers have begun to combine the TTM with the TPB. Forward uses the TPB variables to better differentiate the stages: all TPB variables (attitude, perceived behavioural control, descriptive and subjective norm) show a gradually increasing positive relationship with stage of change for bike commuting. As expected, intention or willingness to perform the behaviour increases by stage. Similarly, Bamberg uses various behaviour models, including the transtheoretical model, the theory of planned behaviour, and the norm-activation model, to build the stage model of self-regulated behaviour change (SSBC). Bamberg claims that his model is a solution to criticism raised towards the TTM. Empirical studies in travel, dietary, and environmental research suggest that the SSBC might be a future path for TTM-based research.

Self-Efficacy

This core construct is “the situation-specific confidence people have that they can cope with high-risk situations without relapsing to their unhealthy or high-risk habit”. The construct is based on Bandura’s self-efficacy theory and conceptualises a person’s perceived ability to perform a task as a mediator of performance on future tasks. In his research, Bandura established that greater levels of perceived self-efficacy lead to greater changes in behaviour. Similarly, Ajzen notes the similarity between the concepts of self-efficacy and perceived behavioural control. This underlines the integrative nature of the transtheoretical model, which combines various behaviour theories. A change in the level of self-efficacy can predict a lasting change in behaviour if there are adequate incentives and skills. The transtheoretical model employs an overall confidence score to assess an individual’s self-efficacy. Situational temptations assess how tempted people are to engage in a problem behaviour in a certain situation.

Levels of Change

This core construct identifies the depth or complexity of presenting problems according to five levels of increasing complexity. Different therapeutic approaches have been recommended for each level as well as for each stage of change. The levels are:

  • Symptom/situational problems: e.g., motivational interviewing, behaviour therapy, exposure therapy
  • Current maladaptive cognitions: e.g., Adlerian therapy, cognitive therapy, rational emotive therapy
  • Current interpersonal conflicts: e.g., Sullivanian therapy, interpersonal therapy
  • Family/systems conflicts: e.g., strategic therapy, Bowenian therapy, structural family therapy
  • Long-term intrapersonal conflicts: e.g., psychoanalytic therapies, existential therapy, Gestalt therapy

In one empirical study of psychotherapy discontinuation published in 1999, measures of levels of change did not predict premature discontinuation of therapy. Nevertheless, in 2005 the creators of the TTM stated that it is important “that both therapists and clients agree as to which level they attribute the problem and at which level or levels they are willing to target as they work to change the problem behavior”. 

Psychologist Donald Fromme, in his book Systems of Psychotherapy, adopted many ideas from the TTM, but in place of the levels of change construct, Fromme proposed a construct called contextual focus, a spectrum from physiological microcontext to environmental macrocontext: “The horizontal, contextual focus dimension resembles TTM’s Levels of Change, but emphasizes the breadth of an intervention, rather than the latter’s focus on intervention depth.”

Outcomes of Programmes

The outcomes of the TTM computerised tailored interventions administered to participants in pre-Action stages are outlined below.

Stress Management

A national sample of pre-Action adults was provided a stress management intervention. At the 18-month follow-up, a significantly larger proportion of the treatment group (62%) was effectively managing their stress when compared to the control group. The intervention also produced statistically significant reductions in stress and depression and an increase in the use of stress management techniques when compared to the control group. Two additional clinical trials of TTM programmes by Prochaska et al. and Jordan et al. also found significantly larger proportions of treatment groups effectively managing stress when compared to control groups.

Adherence to Antihypertensive Medication

Over 1,000 members of a New England group practice who were prescribed antihypertensive medication participated in an adherence to antihypertensive medication intervention. The vast majority (73%) of the intervention group who were previously pre-Action were adhering to their prescribed medication regimen at the 12-month follow-up when compared to the control group.

Adherence to Lipid-Lowering Drugs

Members of a large New England health plan and various employer groups who were prescribed a cholesterol-lowering medication participated in an adherence to lipid-lowering drugs intervention. More than half of the intervention group (56%) who were previously pre-Action were adhering to their prescribed medication regimen at the 18-month follow-up. Additionally, only 15% of those in the intervention group who were already in Action or Maintenance relapsed into poor medication adherence, compared to 45% of the controls. Further, participants at risk because of insufficient physical activity and an unhealthy diet were given only stage-based guidance. The treatment group doubled the control group in the percentage reaching Action or Maintenance at 18 months for physical activity (43%) and diet (25%).

Depression Prevention

Participants were 350 primary care patients experiencing at least mild depression but not involved in treatment or planning to seek treatment for depression in the next 30 days. Patients receiving the TTM intervention experienced significantly greater symptom reduction during the 9-month follow-up period. The intervention’s largest effects were observed among patients with moderate or severe depression, and who were in the Precontemplation or Contemplation stage of change at baseline. For example, among patients in the Precontemplation or Contemplation stage, rates of reliable and clinically significant improvement in depression were 40% for treatment and 9% for control. Among patients with mild depression, or who were in the Action or Maintenance stage at baseline, the intervention helped prevent disease progression to Major Depression during the follow-up period.

Weight Management

Five hundred and seventy-seven overweight or moderately obese adults (BMI 25-39.9) were recruited nationally, primarily from large employers. Those randomly assigned to the treatment group received a stage-matched multiple-behaviour-change guide and a series of tailored, individualised interventions for three health behaviours that are crucial to effective weight management: healthy eating (i.e. reducing calorie and dietary fat intake), moderate exercise, and managing emotional distress without eating. Up to three tailored reports (one per behaviour) were delivered based on assessments conducted at four time points: baseline, 3, 6, and 9 months. All participants were followed up at 6, 12, and 24 months. Multiple imputation was used to estimate missing data, and generalised linear estimating equations (GLEE) were then used to examine differences between the treatment and comparison groups.

At 24 months, those who were in a pre-Action stage for healthy eating at baseline and received treatment were significantly more likely to have reached Action or Maintenance than the comparison group (47.5% vs. 34.3%). The intervention also affected a related but untreated behaviour: fruit and vegetable consumption. Over 48% of those in the treatment group in a pre-Action stage at baseline progressed to Action or Maintenance for eating at least 5 servings a day of fruit and vegetables, as opposed to 39% of the comparison group. Individuals in the treatment group who were in a pre-Action stage for exercise at baseline were also significantly more likely to reach Action or Maintenance (44.9% vs. 38.1%). The treatment also had a significant effect on managing emotional distress without eating, with 49.7% of those in a pre-Action stage at baseline moving to Action or Maintenance versus 30.3% of the comparison group.

The groups differed on weight lost at 24 months among those in a pre-Action stage for healthy eating and exercise at baseline: 30% of those randomised to the treatment group lost 5% or more of their body weight vs. 16.6% in the comparison group. Coaction of behaviour change occurred and was much more pronounced in the treatment group, which lost significantly more weight than the comparison group. This study demonstrates the ability of TTM-based tailored feedback to improve healthy eating, exercise, management of emotional distress, and weight on a population basis. The treatment produced the highest population impact to date on multiple health-risk behaviours.

The effectiveness of the use of this model in weight management interventions (including dietary or physical activity interventions, or both, and also combined with other interventions) for overweight and obese adults was assessed in a 2014 systematic review. The results revealed that there is inconclusive evidence regarding the impact of these interventions on sustainable (one year or longer) weight loss. However, this approach may produce positive effects on physical activity and dietary habits, such as increases in both exercise duration and frequency and in fruit and vegetable consumption, along with reduced dietary fat intake, although these findings rest on very low-quality scientific evidence.

Criticisms

In 2009, an article in the British Journal of Health Psychology called the TTM “arguably the dominant model of health behaviour change, having received unprecedented research attention, yet it has simultaneously attracted exceptional criticism”, and said “that there is still value in the transtheoretical model but that the way in which it is researched needs urgently to be addressed”. Depending on the field of application (e.g. smoking cessation, substance abuse, condom use, diabetes treatment, obesity and travel) somewhat different criticisms have been raised.

In a systematic review, published in 2003, of 23 randomised controlled trials, the authors found that “stage based interventions are no more effective than non-stage based interventions or no intervention in changing smoking behaviour”. However, it was also noted that stage-based interventions are often used and implemented inadequately in practice. Thus, criticism is directed towards the use rather than the effectiveness of the model itself. A review of interventions targeting smoking cessation in pregnancy found that stage-matched interventions were more effective than non-matched interventions; one reason for this was the greater intensity of stage-matched interventions. The use of stage-based interventions for smoking cessation in mental illness also proved to be effective. Further studies, e.g. a randomised controlled trial published in 2009, found no evidence that a TTM-based smoking cessation intervention was more effective than a control intervention not tailored to stage of change. The study claims that those not wanting to change (i.e. precontemplators) tend to be responsive to neither stage-based nor non-stage-based interventions. Since stage-based interventions tend to be more intensive, they appear to be most effective at targeting contemplators and above, rather than precontemplators. A 2010 systematic review of smoking cessation studies under the auspices of the Cochrane Collaboration found that “stage-based self-help interventions (expert systems and/or tailored materials) and individual counselling were neither more nor less effective than their non-stage-based equivalents”. A 2014 Cochrane systematic review concluded that research on the use of TTM stages of change “in weight loss interventions is limited by risk of bias and imprecision, not allowing firm conclusions to be drawn”.

The main criticism concerns the “arbitrary dividing lines” that are drawn between the stages. West claimed that a more coherent and distinguishable definition of the stages is needed. In particular, the fact that the stages are bound to a specific time interval is perceived as misleading. Additionally, the effectiveness of stage-based interventions differs depending on the behaviour. A continuous version of the model has been proposed, in which each process is first increasingly used and then decreases in importance as smokers make progress along some latent dimension. This proposal suggests using the processes without reference to stages of change.

West claimed that the model “assumes that individuals typically make coherent and stable plans”, when in fact they often do not. However, the model does not require that all people make a plan: for example, the SAMHSA document Enhancing Motivation for Change in Substance Use Disorder Treatment, which uses the TTM, also says: “Don’t assume that all clients need a structured method to develop a change plan. Many people can make significant lifestyle changes and initiate recovery from SUDs without formal assistance”.

Within research on prevention of pregnancy and sexually transmitted diseases, a systematic review from 2003 concludes that “no strong conclusions” can be drawn about the effectiveness of interventions based on the transtheoretical model. Again, this conclusion is reached due to the inconsistency of use and implementation of the model. This study also confirms that the better stage-matched the intervention, the more effective it is at encouraging condom use.

Within the health research domain, a 2005 systematic review of 37 randomised controlled trials claims that “there was limited evidence for the effectiveness of stage-based interventions as a basis for behaviour change”. Studies which focused on increasing physical activity levels through active commuting, however, showed that stage-matched interventions tended to have slightly more effect than non-stage-matched interventions. Since many studies do not use all constructs of the TTM, additional research has suggested that interventions become more effective the better they are tailored to all core constructs of the TTM in addition to stage of change. In diabetes research, the “existing data are insufficient for drawing conclusions on the benefits of the transtheoretical model” as related to dietary interventions. Again, studies with slightly different designs, e.g. using different processes, proved effective in predicting stage transitions in the intention to exercise among patients with diabetes.

TTM has gained greater popularity in research on physical activity, due to the increasing problems associated with unhealthy diets and sedentary living, e.g. obesity and cardiovascular problems. A 2011 Cochrane systematic review found that there is little evidence to suggest that using the transtheoretical model stages of change (TTM SOC) method is effective in helping obese and overweight people lose weight. There were only five studies in the review, two of which were later dropped as irrelevant because they did not measure weight. Earlier, in a 2009 paper, the TTM was considered useful in promoting physical activity; in that study, however, the algorithms and questionnaires that researchers used to assign people to stages of change lacked the standardisation needed to be compared empirically or validated.

Similar criticism regarding standardisation and consistency in the use of TTM is also raised in a 2017 review of travel interventions. With regard to travel interventions, only the stages of change and sometimes the decisional balance constructs are included. The processes used to build the interventions are rarely stage-matched, and shortcuts are taken by classifying participants into a pre-action stage, which collapses the precontemplation, contemplation and preparation stages, and an action/maintenance stage. More generally, TTM has been criticised within various domains due to limitations in the research designs. For example, many studies supporting the model have been cross-sectional, whereas longitudinal data would allow for stronger causal inferences. Another point of criticism is raised in a 2002 review, where the model’s stages were characterised as “not mutually exclusive”; furthermore, there was “scant evidence of sequential movement through discrete stages”. While research suggests that movement through the stages of change is not always linear, a 1996 study of smoking cessation demonstrated that the probability of forward stage movement is greater than the probability of backward stage movement. Due to variations in use, implementation and research design, data confirming TTM are ambiguous. More care has to be taken in using a sufficient number of constructs, trustworthy measures, and longitudinal data.
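The pre-action shortcut described above can be sketched as a simple mapping. The stage names are the standard TTM stages of change; the function itself is purely illustrative, not taken from any of the reviewed studies:

```python
# Illustrative sketch of the criticised shortcut: collapsing the five TTM
# stages of change into a binary pre-action vs. action/maintenance grouping.
TTM_STAGES = ["precontemplation", "contemplation", "preparation",
              "action", "maintenance"]

def grouped_stage(stage):
    """Map a TTM stage of change onto the coarse two-way grouping."""
    if stage in ("precontemplation", "contemplation", "preparation"):
        return "pre-action"
    if stage in ("action", "maintenance"):
        return "action/maintenance"
    raise ValueError(f"unknown TTM stage: {stage}")

print(grouped_stage("contemplation"))  # pre-action
print(grouped_stage("maintenance"))    # action/maintenance
```

The criticism is precisely that such a coarse grouping discards the stage-specific information that stage-matched interventions are supposed to exploit.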

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Transtheoretical_model >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

What is Common Factors Theory?

Introduction

Common factors theory, a theory guiding some research in clinical psychology and counselling psychology, proposes that different approaches and evidence-based practices in psychotherapy and counselling share common factors that account for much of the effectiveness of a psychological treatment. This is in contrast to the view that the effectiveness of psychotherapy and counselling is best explained by specific or unique factors (notably, particular methods or procedures) that are suited to treatment of particular problems.

However, according to one review, “it is widely recognized that the debate between common and unique factors in psychotherapy represents a false dichotomy, and these factors must be integrated to maximize effectiveness.” In other words, “therapists must engage in specific forms of therapy for common factors to have a medium through which to operate.” Common factors is one route by which psychotherapy researchers have attempted to integrate psychotherapies.

Brief History

Saul Rosenzweig started the conversation on common factors in an article published in 1936 that discussed some psychotherapies of his time. John Dollard and Neal E. Miller’s 1950 book Personality and Psychotherapy emphasized that the psychological principles and social conditions of learning are the most important common factors. Sol Garfield (who would later go on to edit many editions of the Handbook of Psychotherapy and Behaviour Change with Allen Bergin) included a 10-page discussion of common factors in his 1957 textbook Introductory Clinical Psychology.

In the same year, Carl Rogers published a paper outlining what he considered to be common factors (which he called “necessary and sufficient conditions”) of successful therapeutic personality change, emphasizing the therapeutic relationship factors which would become central to the theory of person-centred therapy. He proposed the following conditions necessary for therapeutic change: psychological contact between the therapist and client, incongruence in the client, genuineness in the therapist, unconditional positive regard and empathic understanding from the therapist, and the client’s perception of the therapist’s unconditional positive regard and empathic understanding.

In 1961, Jerome Frank published Persuasion and Healing, a book entirely devoted to examining the common factors among psychotherapies and related healing approaches. Frank emphasized the importance of the expectation of help (a component of the placebo effect), the therapeutic relationship, a rationale or conceptual scheme that explains the given symptoms and prescribes a given ritual or procedure for resolving them, and the active participation of both patient and therapist in carrying out that ritual or procedure.

After Lester Luborsky and colleagues published a literature review of empirical studies of psychotherapy outcomes in 1975, the idea that all psychotherapies are effective became known as the Dodo bird verdict, referring to a scene from Alice’s Adventures in Wonderland quoted by Rosenzweig in his 1936 article; in that scene, after the characters race and everyone wins, the Dodo bird says, “everybody has won, and all must have prizes.” Luborsky’s research was an attempt (and not the first attempt, nor the last one) to disprove Hans Eysenck’s 1952 study on the efficacy of psychotherapy; Eysenck found that psychotherapy generally did not seem to lead to improved patient outcomes. A number of studies after 1975 presented more evidence in support of the general efficacy of psychotherapy, but the question of how common and specific factors could enhance or thwart therapy effectiveness in particular cases continued to fuel theoretical and empirical research over the following decades.

The landmark 1982 book Converging Themes in Psychotherapy gathered a number of chapters by different authors promoting common factors, including an introduction by Marvin R. Goldfried and Wendy Padawer, a reprint of Rosenzweig’s 1936 article, and further chapters (some of them reprints) by John Dollard and Neal E. Miller, Franz Alexander, Jerome Frank, Arnold Lazarus, Hans Herrman Strupp, Sol Garfield, John Paul Brady, Judd Marmor, Paul L. Wachtel, Abraham Maslow, Arnold P. Goldstein, Anthony Ryle, and others. The chapter by Goldfried and Padawer distinguished between three levels of intervention in therapy:

  1. Theories of change (therapists’ theories about how change occurs);
  2. Principles or strategies of change; and
  3. Therapy techniques (interventions that therapists suppose will be effective).

Goldfried and Padawer argued that while therapists may talk about their theories using very different jargon, there is more commonality among skilled therapists at the (intermediate) level of principles or strategies. Goldfried and Padawer’s emphasis on principles or strategies of change was an important contribution to common factors theory because they clearly showed how principles or strategies can be considered common factors (they are shared by therapists who may espouse different theories of change) and specific factors (they are manifested in particular ways within different approaches) at the same time. Around the same time, James O. Prochaska and colleagues, who were developing the transtheoretical model of change, proposed ten “processes of change” that categorized “multiple techniques, methods, and interventions traditionally associated with disparate theoretical orientations,” and they stated that their processes of change corresponded to Goldfried and Padawer’s level of common principles of change.

In 1986, David Orlinsky and Kenneth Howard presented their generic model of psychotherapy, which proposed that five process variables are active in any psychotherapy: the therapeutic contract, therapeutic interventions, the therapeutic bond between therapist and patient, the patient’s and therapist’s states of self-relatedness, and therapeutic realisation.

In 1990, Lisa Grencavage and John C. Norcross reviewed accounts of common factors in 50 publications, with 89 common factors in all, from which Grencavage and Norcross selected the 35 most common factors and grouped them into five areas: client characteristics, therapist qualities, change processes, treatment structure, and therapeutic relationship. In the same year, Larry E. Beutler and colleagues published their systematic treatment selection model, which attempted to integrate common and specific factors into a single model that therapists could use to guide treatment, considering variables of patient dimensions, environments, settings, therapist dimensions, and treatment types. Beutler and colleagues would later describe their approach as “identifying common and differential principles of change”.

In 1992, Michael J. Lambert summarised psychotherapy outcome research and grouped the factors of successful therapy into four areas, ordered by hypothesized percent of change in clients as a function of therapeutic factors: first, extratherapeutic change (40%), those factors that are qualities of the client or qualities of his or her environment and that aid in recovery regardless of his or her participation in therapy; second, common factors (30%) that are found in a variety of therapy approaches, such as empathy and the therapeutic relationship; third, expectancy (15%), the portion of improvement that results from the client’s expectation of help or belief in the rationale or effectiveness of therapy; fourth, techniques (15%), those factors unique to specific therapies and tailored to treatment of specific problems. Lambert’s research later inspired a book on common factors theory in the practice of therapy titled The Heart and Soul of Change.
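Lambert’s hypothesised breakdown can be tabulated as a quick check that the four groups account for all of the change (the figures are those given above; the dictionary itself is only an illustration):

```python
# Lambert's (1992) hypothesized percentages of client change
# attributable to each group of therapeutic factors.
lambert_1992 = {
    "extratherapeutic change": 40,
    "common factors": 30,
    "expectancy (placebo)": 15,
    "techniques": 15,
}

print(sum(lambert_1992.values()))  # 100
```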

In the mid-1990s, as managed care in mental health services became more widespread in the United States, more researchers began to investigate the efficacy of psychotherapy in terms of empirically supported treatments (ESTs) for particular problems, emphasizing randomised controlled trials as the gold standard of empirical support for a treatment. In 1995, the American Psychological Association’s Division 12 (clinical psychology) formed a task force that developed lists of empirically supported treatments for particular problems such as agoraphobia, blood-injection-injury type phobia, generalised anxiety disorder, obsessive–compulsive disorder, panic disorder, etc. In 2001, Bruce Wampold published The Great Psychotherapy Debate, a book that criticised what he considered to be an overemphasis on ESTs for particular problems, and he called for continued research in common factors theory.

In the 2000s, more research began to be published on common factors in couples therapy and family therapy.

In 2014, a series of ten articles on common factors theory was published in the APA journal Psychotherapy. The articles emphasized the compatibility between ESTs and common factors theory, highlighted the importance of multiple variables in psychotherapy effectiveness, called for more empirical research on common factors (especially client and therapist variables), and argued that individual therapists can do much to improve the quality of therapy by rigorously using feedback measures (during treatment) and outcome measures (after termination of treatment). The article by Stefan G. Hofmann and David H. Barlow, two prominent researchers in cognitive behavioural therapy, pointed out how their recent shift in emphasis from distinct procedures for different diagnoses to a transdiagnostic approach was increasingly similar to common factors theory.

Models

There are many models of common factors in successful psychotherapy process and outcome. Already in 1990, Grencavage and Norcross identified 89 common factors in a literature review, which showed the diversity of models of common factors. To be useful for purposes of psychotherapy practice and training, most models reduce the number of common factors to a handful, typically around five. Frank listed six common factors in 1971 and explained their interaction. Goldfried and Padawer listed five common strategies or principles in 1982: corrective experiences and new behaviours, feedback from the therapist to the client promoting new understanding in the client, expectation that psychotherapy will be helpful, establishment of the desired therapeutic relationship, and ongoing reality testing by the client. Grencavage and Norcross grouped common factors into five areas in 1990. Lambert formulated four groups of therapeutic factors in 1992. Joel Weinberger and Cristina Rasco listed five common factors in 2007 and reviewed the empirical support for each factor: the therapeutic relationship, expectations of treatment effectiveness, confronting or facing the problem (exposure), mastery or control experiences, and patients’ attributions of successful outcome to internal or external causes.

Terence Tracey and colleagues modified the common factors of Grencavage and Norcross and used them to develop a questionnaire, which they gave to 16 board-certified psychologists and 5 experienced psychotherapy researchers; they then analysed the responses and published the results in 2003. Their multidimensional scaling analysis represented the results on a two-dimensional graph, with one dimension representing hot versus cool processing (roughly, closeness and emotional experience versus technical information and persuasion) and the other dimension representing therapeutic activity. Their cluster analysis represented the results as three clusters: the first related to bond (roughly, therapeutic alliance), the second related to information (roughly, the meanings communicated between therapist and client), and the third related to role (roughly, a logical structure so that clients can make sense of the therapy process).

In addition to these models that incorporate multiple common factors, a number of theorists have proposed and investigated single common factors, common principles, and common mechanisms of change, such as learning. In one example, at least three independent groups have converged on the conclusion that a wide variety of different psychotherapies can be integrated via their common ability to trigger the neurobiological mechanism of memory reconsolidation.

Empirical Research

While many models of common factors have been proposed, they have not all received the same amount of empirical research. There is general consensus on the importance of a good therapeutic relationship in all forms of psychotherapy and counselling.

Factor                            % of Variability in Outcome
Common Factors
  Goal Consensus/Collaboration      11.5
  Empathy                            9.0
  Alliance                           7.5
  Positive Regard/Affirmation        7.3
  Congruence/Genuineness             5.7
  Therapist Differences              5.0
Specific Ingredients
  Treatment Differences            < 1.0

Research by Laska et al., 2014.

A review of common factors research in 2008 suggested that 30% to 70% of the variance in therapy outcome was due to common factors. A summary of research in 2014 suggested that 11.5% of the variance in therapy outcome was due to the common factor of goal consensus/collaboration, 9% to empathy, 7.5% to the therapeutic alliance, 7.3% to positive regard/affirmation, 5.7% to congruence/genuineness, and 5% to therapist factors. In contrast, treatment method accounted for roughly 1% of outcome variance.

Alan E. Kazdin has argued that psychotherapy researchers must not only find statistical evidence that certain factors contribute to successful outcomes; they must also be able to formulate evidence-based explanations for how and why those factors contribute to successful outcomes, that is, the mechanisms through which successful psychotherapy leads to change. Common factors theory has been dominated by research on psychotherapy process and outcome variables, and there is a need for further work explaining the mechanisms of psychotherapy common factors in terms of emerging theoretical and empirical research in the neurosciences and social sciences, just as earlier works (such as Dollard and Miller’s Personality and Psychotherapy or Frank’s Persuasion and Healing) explained psychotherapy common factors in terms of the sciences of their time.

One frontier for future research on common factors is automated computational analysis of clinical big data.

Criticisms

There are several criticisms of common factors theory, for example:

  • That common factors theory dismisses the need for specific therapeutic techniques or procedures,
  • That common factors are nothing more than a good therapeutic relationship, and
  • That common factors theory is not scientific.

Some common factors theorists have argued against these criticisms. They state that:

  • The criticisms are based on a limited knowledge of the common factors literature;
  • A thorough review of the literature shows that a coherent treatment procedure is a crucial medium for the common factors to operate;
  • Most models of common factors define interactions between multiple variables (including but not limited to therapeutic relationship variables); and
  • Some models of common factors provide evidence-based explanations for the mechanisms of the proposed common factors.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Common_factors_theory >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

What is the Molecular and Behavioural Neuroscience Institute?

Introduction

The Molecular and Behavioural Neuroscience Institute at the University of Michigan (UM) is an interdisciplinary research institute, which played a key role in the development of general systems theory. Formerly the Mental Health Research Institute, over the years it developed a specific interest in neuroscience and biological psychiatry and was subsequently renamed in the new millennium.

Background

The institute was established as the Mental Health Research Institute at the University of Michigan in 1955 with the goal of “applying scientific methods to the study of human behavior.” It became known in the 1950s for employing some of the initial members of the Society for General Systems Research (SGSR), such as biologist and founding director of the institute James Grier Miller, mathematician Anatol Rapoport, physicist John Platt, urban planner Richard L. Meier, economist Kenneth Boulding, and neurophysiologist Ralph Gerard, among others like Margaret Mead and Richard F. Ericson.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Molecular_and_Behavioral_Neuroscience_Institute >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

What is the Mental Research Institute?

Introduction

The Palo Alto Mental Research Institute (MRI) is one of the founding institutions of brief and family therapy. Founded by Don D. Jackson and colleagues in 1958, MRI has been one of the leading sources of ideas in the area of interactional/systemic studies, psychotherapy, and family therapy.

Overview

According to an article in the Psychotherapy Networker on Jay Haley (a Research Associate at MRI in the 1960s) MRI “became the go-to place for any therapist who wanted to be on the cutting edge of psychotherapy research and practice. Fostering a climate of almost untrammelled experimentalism, MRI started the first formal training program in family therapy, produced some of the seminal early papers and books in the field, and became a place where some of the field’s leading figures – Paul Watzlawick, Richard Fisch, Jules Riskin, Virginia Satir, Salvador Minuchin, R.D. Laing, Irvin D. Yalom, Cloe Madanes – came to work or just hang out”.

As of 1967, the Brief Therapy Centre at MRI presented an innovative model for a comprehensive approach to brief psychotherapy, a model which has in turn influenced subsequent brief therapy approaches throughout the world. The Brief Therapy Centre was founded by Dick Fisch, John Weakland, and Paul Watzlawick. Continuing applied research and theory development have extended the use of interactional concepts to community, school and business settings. Thousands of professionals from the US and many other countries have attended MRI training programmes.

Mission Statement

The Mental Research Institute (MRI), established in 1958 by Donald deAvila Jackson, is a small, independent, multi-disciplinary, non-profit corporation:

  • Devoted to conducting and encouraging scientific research based on new ways of looking at how people behave,
  • Dedicated to benefit the human community worldwide through training, clinical and consultative services
  • Committed to extending a tradition of innovation and openness towards new paradigms of change.

The focus of MRI is to explore and to encourage the use of an interactional approach to further understand and more effectively resolve human problems from the family to all other levels of social organisation.

Books on MRI

  • The Interactional View: Studies at the Mental Research Institute, Palo Alto, 1965–1974, edited by Weakland, J., and Watzlawick, P. (1979) New York: WW Norton.
  • Propagations: Thirty years of Influence from the Mental Research Institute, Weakland, J., & Ray, W. (1995). New York: Haworth Press.
  • The bibliography of associates of MRI lists over 1000 journal and book publications.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Mental_Research_Institute >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.

What is Binding and Retrieval in Action Control?

Introduction

Binding and Retrieval in Action Control (BRAC) is a theoretical framework to explain basic psychological functions at the intersection of perception and motor control. It takes a cognitive approach by capturing how events are represented in the cognitive system. Its two core mechanisms – binding and retrieval of feature codes – explain a variety of observations in basic psychological experiments within a compact and parsimonious framework.

Binding and Retrieval

Many influential theories have proposed that the human cognitive system represents events in terms of distributed feature codes. For instance, the colour and the shape of an object in the visual field give rise to neural activity in distinct brain areas. This distributed activity has to be synchronised to create a unified visual impression of the object. In other words: distributed features are bound into integrated representations.

Graphical summary of the BRAC framework.

Crucially, the BRAC framework suggests that such bindings persist in time. They further integrate features from distinct events, such as features relating to the current stimulation, the agent’s motor response, and corresponding effects of this response. The BRAC framework imports the theoretical concept of common coding of sensory and action events in a shared representational format, allowing for direct interactions and associations of perceptual and action features.

Compound representations of such features are labelled event files. Once such a binding has formed, re-encountering any of its features will retrieve the previously stored event file. Because these event files contain features of a previous response, such retrieval provides an efficient shortcut by recycling previously used feature codes.

The BRAC framework emphasizes that binding and retrieval are separate mechanisms. They can therefore be subject to different influences, as shown above. These influences can stem from top-down and bottom-up factors alike. Disentangling these separable contributions of binding and retrieval is a major goal of current work inspired by the BRAC framework.
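As a rough illustration (a toy sketch, not a model from the BRAC literature), binding and retrieval of event files can be caricatured as storing and looking up sets of feature codes:

```python
# Toy sketch of event-file binding and retrieval (hypothetical code):
# co-occurring stimulus and response features are bound into a compound
# representation; re-encountering any single feature retrieves it.

class EventFileStore:
    def __init__(self):
        self.event_files = []  # each event file is a frozenset of feature codes

    def bind(self, *features):
        """Integrate co-occurring features into one event file."""
        self.event_files.append(frozenset(features))

    def retrieve(self, feature):
        """Re-encountering a single feature retrieves every event file
        that contains it, most recent first."""
        return [ef for ef in reversed(self.event_files) if feature in ef]

store = EventFileStore()
store.bind("red", "circle", "left_keypress")  # perceiving + responding binds features
matches = store.retrieve("red")               # seeing "red" again triggers retrieval
print("left_keypress" in matches[0])          # the bound response is retrieved too
```

The point of the sketch is only the asymmetry it makes visible: binding happens once, at encoding, while retrieval can be triggered later by any partial match.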

Experimental Observations

The BRAC framework highlights the sequential dependency of human actions. Corresponding binding and retrieval effects have been observed in a range of experimental setups, including prime-probe experiments and sequential choice reaction tasks. Key measures in these studies have been reaction times and error rates for speeded responses. These performance measures vary as a joint function of feature sequences for responses and corresponding stimulation: Stimulus repetitions (as compared to stimulus changes) from one occasion to the next facilitate response repetitions rather than response changes. The picture below shows an idealised pattern of results as predicted by the BRAC framework.

Idealised results from a behavioural experiment that measures performance (reaction times) in successive choice responses as a function of response sequence and stimulus sequence for two successive responses.
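The predicted interaction can be illustrated with hypothetical reaction times (the numbers are invented for illustration; only the pattern matters): full repetitions and full changes are fast, while partial repetitions are slow because a retrieved event file contains a now-conflicting feature.

```python
# Hypothetical mean reaction times (ms) for the four sequence conditions,
# illustrating the interaction predicted by the BRAC framework.
rts = {
    ("stim_repeat", "resp_repeat"): 450,  # full repetition: retrieval helps
    ("stim_repeat", "resp_change"): 520,  # partial repetition: retrieved response conflicts
    ("stim_change", "resp_repeat"): 510,  # partial repetition: retrieved stimulus conflicts
    ("stim_change", "resp_change"): 470,  # full change: no conflicting retrieval
}

def binding_effect(rts):
    """Interaction contrast (partial-repetition cost): positive values mean
    stimulus repetition benefits response repetitions more than response changes."""
    benefit_stim_repeat = rts[("stim_repeat", "resp_change")] - rts[("stim_repeat", "resp_repeat")]
    benefit_stim_change = rts[("stim_change", "resp_change")] - rts[("stim_change", "resp_repeat")]
    return benefit_stim_repeat - benefit_stim_change

print(binding_effect(rts))  # 110
```

A positive interaction contrast of this kind is the standard empirical signature of binding and retrieval in prime-probe and sequential choice-reaction experiments.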

Current Research

A key question of current research on the BRAC framework concerns the relation of short-term binding on the one hand, and long-term learning of stable associations on the other hand. Further unresolved questions pertain to the moment that binding takes place, and to possible bottom-up and top-down influences on both binding and retrieval.

This page is based on the copyrighted Wikipedia article < https://en.wikipedia.org/wiki/Binding_and_Retrieval_in_Action_Control >; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.