We all know we should eat better, but it’s hard work to cut out saturated fats and processed sugars completely. Most of us try to achieve some kind of balance: if we’ve been eating fruit, veg, and bean salads all day, we may allow ourselves the luxury of a doughnut or ice-cream sundae in the evening. Conversely, if we know that we’ve been hitting the cakes heavily in recent days, we may try to restore our dietary balance with some simple soups and maybe even a trip to the gym.
This kind of accounting, of balancing the books, also seems to operate in the moral domain. Some studies have shown, for example, that asking people to recall an unethical or immoral thing they’ve done primes them to be more ethical or moral in the near future: they’ll donate more money to charity, for example. (Like remembering the cookies you scoffed, and atoning for your sin by eating an apple for your next snack.) Reciprocally, asking people to recall a time when they behaved in a morally praiseworthy way decreases subsequent moral behaviour: they’ll donate less money to charitable causes afterwards.
One idea to explain this is that people try to maintain a sense of their own moral identity somewhere between amoral sinner and moral saint. When we remember that we’ve been good, our moral identity gets a boost, perhaps making us feel like we’ve gone above and beyond the basic requirements of moral behaviour as we temporarily rise towards moral sainthood — and this surplus of moral capital gives us license to be a bit more selfish, a bit less altruistic, as on balance we’ll still be pretty moral folk. By contrast, when we recall being bad we feel this moral deficit and try to make amends by getting some credit in the ‘moral goodness’ ledger.
However, not all studies have found this pattern of behaviour. In fact, some have found the opposite: when people recall their past immoral actions, they’re more likely to behave immorally (or, at least, less morally) in the future, and vice versa. One study found that reminding people of their past environmental activities fostered stronger pro-environmental attitudes. These results suggest that people strive for moral consistency rather than moral balancing. (Akin to thinking “Well, I blew my diet with that pizza, so I may as well go the whole hog and have this key lime pie for dessert” or “I’ve resisted sugary foods all day, so I won’t let myself down now by eating this chocolate!”)
A possible explanation for these discordant findings (sometimes people seem to go for moral balancing, while at other times they show moral consistency) is provided by a new paper in Psychological Science by Gert Cornelissen, Michael Bashshur, Julian Rode and Marc Le Menestrel. Their findings suggest that whether you go in for moral balancing or moral consistency depends on the kind of ethical framework, or mindset, you use in making moral decisions. What, you may ask, is an ethical mindset? Essentially, it’s the underlying ethical theory you use to determine whether actions are good or bad (though this need not be an explicitly articulated or philosophically sophisticated theory; it’s enough that you behave as if you were following it).
Western ethics has developed two particularly influential frameworks for thinking about morality. The first, consequentialism, holds that whether an action is good or bad depends on its outcomes. In utilitarian versions of consequentialism, this usually means the action’s impact on the happiness or well-being of other people, with the goal of creating the greatest happiness for the greatest number. The other big ethical theory, deontology, stresses the importance of respecting people as ends in themselves, and of following moral rules and codes of conduct out of a sense of ethical duty.
Cornelissen and colleagues wanted to see whether the tendency to opt for moral balancing versus moral consistency might have anything to do with whether people tend more towards consequentialist thinking, or deontological ethics. (Again, this isn’t a question about whether people have thought about the relative merits of consequentialism versus deontology, or even heard of these ideas — what matters is whether their moral judgments reflect either a consequentialist or deontological mindset.)
In the first of three studies, 86 undergraduates were asked to give their answer to a version of the well-known trolley dilemma: “A runaway trolley is headed for five people who will be killed if it proceeds on its present course. The only way to save them is to hit a switch that will turn the trolley onto an alternate set of tracks where it will kill one person instead of five.” People who said they would flip the switch were classified as having a consequentialist mindset (they went for the option that led to the preservation of more lives, and therefore more well-being), while those who opted to leave things alone were classified as having a deontological mindset.
After this moral dilemma, participants were asked to recall a recent instance of their own behaviour that was either ethical or unethical. The same participants then played, in pairs, a round of the Dictator Game, in which one player (the decider) was given 10 coins, each worth 0.50 euros, and told that they could split this windfall with the other player as they wished. (In reality, no one actually played the game face to face with another person: everyone was told the rules of the game, and also that they would be the decider. They were then asked how much they would give the receiver.)
In this study, recalling an ethical or unethical behaviour affected how generous people were in the Dictator Game, but in different ways for consequentialists and deontologists. After reflecting on recent unethical behaviour, consequentialists gave just under 4 coins to the other player, on average; but when they played the game after recalling ethical behaviour, this dropped to just over 2 coins. The deontologists showed the opposite pattern: those who remembered an ethical act gave 3 coins or more on average, while those who thought about a bad deed gave under 2. In other words, the deontologists acted as they recalled having acted in the past, whether that was good or bad.
In a second study, the researchers tried to manipulate people into a consequentialist or deontological mindset. To do this, they provided 107 undergraduates with some text defining ethical behaviour in either consequentialist or deontological terms. This had the intended effect: compared with people not primed in any way, those in the consequentialist group were more likely to flip the switch in the trolley dilemma, and those in the deontological group less so. Next, the participants were asked to recall either ethical or unethical behaviour, and then to explain who was hurt/helped (for people being primed for consequentialism), or which moral rule or principle was violated (for those being inducted into a deontological mindset).
As before, people put into a consequentialist mindset gave away more coins after recalling bad behaviour than good, while those put into a deontological frame of mind showed the opposite pattern. (A control group gave at a level between the highs and lows of the other two groups.)
A third and final set of experiments repeated the procedure just described, but instead of playing the Dictator Game, participants were given a set of puzzles. They were told that they would be paid for each puzzle solved, and that they would mark their own performance, thus creating an opportunity for cheating. (The researchers, meanwhile, could work out how much each participant had overstated their performance; that is, how much they had cheated.) After the puzzle task, participants’ moral self-image was assessed with a moral identity scale, which asks people to rate themselves as more or less honest, caring, compassionate, fair, friendly, generous, hardworking, helpful and kind than the person they would like to be.
The results of this last experiment were a little less clear-cut. As expected from the previous results, people in a consequentialist mindset cheated less after recalling a bad deed and more after remembering a good one; people in a deontological mindset did the reverse. However, the proposed link between the ethicality of remembered actions, moral self-image and subsequent behaviour was more complicated. Recalling a past unethical act while in a consequentialist mindset damaged moral self-image, which people then tried to rectify with a good deed; likewise, when they recalled an ethical deed, their moral self-image got a boost, which liberated them from feeling that they had to be moral again.
However, this was not observed in the deontological condition. When these participants recalled a good or bad act and then acted honestly or cheated, it wasn’t because their moral self-image had been affected. So something other than managing moral self-image must underlie the connection between recalling good or bad deeds and being honest or cheating in self-marked tests among those with a deontological mindset, an issue for future research to address.
These new findings add to a growing body of research on the psychological differences between people who gravitate towards consequentialist thinking and those for whom deontology seems more natural or compelling. A lot of this work has tied these two ethical mindsets to personality styles, painting consequentialism in an unflattering light in the process. Psychopaths are more utilitarian in their moral judgments. Brain-damaged patients with deficits in processing social emotions also show a greater inclination towards consequentialism. And among non-clinical populations, people who make consequentialist judgments tend to score higher on measures of psychoticism and psychopathy, higher on Machiavellianism (the willingness to manipulate others), and lower on meaning-of-life scales. Consequentialists also seem to score higher on measures of anger [6–8].
Yet these latest findings do not suggest that either mindset leads to more or less moral behaviour per se. On the one hand, consequentialists may flip-flop their behaviour to maintain a relatively defensible moral self-image, letting their good deeds make up for their bad ones and vice versa. Deontologists, on the other, may be more consistent in their moral behaviour, but this can just as easily be consistently bad as consistently good.
1. Merritt, A. C., Effron, D. A. & Monin, B. (2010). Moral self-licensing: when being good frees us to be bad. Social and Personality Psychology Compass 4, 344–357. DOI: 10.1111/j.1751-9004.2010.00263.x
2. Jordan, J., Mullen, E. & Murnighan, J. K. (2011). Striving for the moral self: the effects of recalling past moral actions on future moral behaviour. Personality and Social Psychology Bulletin 37, 701–713. DOI: 10.1177/0146167211400208
3. Cornelissen, G., Bashshur, M. R., Rode, J. & Le Menestrel, M. (2013). Rules or consequences? The role of ethical mind-sets in moral dynamics. Psychological Science (OnlineFirst) DOI: 10.1177/0956797612457376
4. Glenn, A. L., Koleva, S., Iyer, R., Graham, J. & Ditto, P. H. (2010) Moral identity in psychopathy. Judgment and Decision Making 5, 497–505.
5. Koenigs, M. et al. (2007). Damage to the prefrontal cortex increases utilitarian moral judgements. Nature 446, 908–911.
6. Bartels, D. M. & Pizarro, D. (2011) The mismeasure of morals: antisocial personality traits predict utilitarian responses to moral dilemmas. Cognition 121, 154–161.
7. Wiech, K., Kahane, G., Shackel, N., Farias, M., Savulescu, J. & Tracey, I. (2013) Cold or calculating? Reduced activity in the subgenual cingulate cortex reflects decreased emotional aversion to harming in counterintuitive utilitarian judgment. Cognition 126, 364–372.
8. Choe, S. Y. & Min, K.-H. (2011) Who makes utilitarian judgments? The influences of emotions on utilitarian judgments. Judgment and Decision Making 6, 580–592.