Reason, rather than emotion, may guide our moral feelings, according to a study conducted by psychology professor Jean Decety and psychology graduate student Keith Yoder. Their study, published in March in The Journal of Neuroscience, was the first to assess how individual differences in justice sensitivity, a personality trait, affect activity across brain regions.
Decety and Yoder examined how differences in justice sensitivity shape brain activity when people evaluate moral situations.
In the experiment, subjects filled out surveys to assess their justice sensitivity, or how readily they perceive a situation as unjust and how strongly they respond to the injustice, as well as their empathy levels. Then subjects viewed morally charged, everyday interpersonal interactions and labeled each scenario as good, bad, or neither. As subjects viewed the interactions, they underwent a functional magnetic resonance imaging (fMRI) scan, a brain imaging technique that allows researchers to see regions of brain activity.
Yoder described two types of sensitivity to injustice. “One of the ways you can collapse scores is by looking at how sensitive people are to injustice when they are the victim. So that’s self-oriented justice sensitivity.… The other way to view it is when another person is the victim of injustice. And that’s other-oriented justice sensitivity,” Yoder said.
Depending on whether their justice sensitivity is self-oriented or other-oriented, people vary in how intensely they react to injustice and in which types of situations provoke a strong reaction.
The researchers found that other-oriented justice sensitivity positively correlated with activity levels in key brain regions during morally “bad” scenarios, whereas self-oriented justice sensitivity did not correlate with activity in any region. This may be because the subjects were viewing rather than participating in these scenarios.
“[Self-oriented justice sensitivity] actually didn’t predict hardly anything in the brain…That’s not to say it’s not important…but all of these judgments are about third-party interactions that you don’t have any information about, so they’re not necessarily your relatives; it’s not expected that you necessarily identify with any of these people,” Yoder said.
Yoder and Decety found that the level of sensitivity to justice that is oriented toward others correlated with brain activity in regions involved in reason-based rather than emotion-based processing. This may indicate that reason plays a greater role than emotion when evaluating moral conflict.
“It doesn’t seem the case that when people are very sensitive to justice it’s because they are more emotionally motivated. It seems to be more something very cool, and cold,” Decety said.
The researchers interpreted this finding to indicate that while emotions may play a role in moral evaluation, humans do not simply respond to injustice by getting upset. People with heightened justice sensitivity likely rely on rational understanding to interpret something as morally wrong and react accordingly.
Empathy is a personality trait distinct from one's sensitivity to justice. In contrast to other-oriented justice sensitivity, individual differences in empathy did not correlate with any neural responses. This finding strengthens the scientists' argument that moral evaluation is less rooted in emotion than may have been previously thought.
“That [empathy did not correlate with brain activity] was really interesting and fits along again with the idea that this isn’t about emotion processing necessarily, it’s about understanding justice maybe as a high level [concept],” Yoder said.
In contrast to the morally bad scenarios, brain activity during morally good scenarios did not correlate with other-oriented justice sensitivity. This may be because bad experiences make a stronger impression.
“When we talk about fairness and justice, bad actions are more salient. That’s why you’re going to find the differences you see,” Decety said.
The researchers found that good actions, but not bad ones, were associated with activity in brain regions implicated in reward, implying that humans find observing good actions rewarding.
“Usually good actions are something that is agreeable to observe. People like it,” Decety said.
Brain activity also followed distinct time courses for good and bad scenarios, with key brain regions responding faster to bad actions than to good ones.
“This difference in the time course really makes it look like we rapidly extract information about the intentionality of harmful interactions. And so, there is greater recruitment for the harmful interactions…But the good actions, especially for the people who are high on justice sensitivity, take a little bit longer to develop and there’s more elaboration involved, and it is very likely the case that they are spending more time thinking about and reflecting on those actions. And so that’s why we think this time course is important to look at,” Yoder said.
Yoder and Decety interpreted their findings as an inspiring humanitarian message that humans use a rational understanding of right and wrong, rather than emotional reactions, to treat people fairly.
“Apparently, those of us who are very sensitive to justice, it is based more on reason than affect or emotion,” Decety said. “And if this is the case, we need to cultivate reason over emotion. And it’s nice. It’s a good message. It’s a very important message.”