How Research Into Moral Thinking May Affect Management Strategies
A study of people’s moral perceptions in daily life, reported last fall in the journal Science, may bring a fresh perspective to workplace ethics. As the study illustrates, people sometimes emphasize very different sets of moral concepts when assessing a situation, and in a way that seems to correlate with their political affiliations. We would hardly recommend surveying employees about their political views, which could raise concerns for a number of reasons, but sensitivity to the potential diversity of employees’ political perspectives could motivate a more nuanced approach in certain kinds of investigations, and guide how we explain ethical concerns and mediate disputes. More broadly, the study may encourage a more trusting work culture, which is, in my view, the best strategy for avoiding strife and litigation in the workplace.
The research tracked a few thousand study subjects as they went about their lives over a three-day period. At five randomly selected times each day, the subjects were contacted and asked to report any moral or immoral acts (judged by their own standards) that had occurred in the prior hour, who had committed them, who the target of the act was, and other details. Political affiliation turned out to be a better predictor than religious views of how subjects perceived certain moral situations. For example, politically liberal subjects were more likely to describe moral or immoral acts in terms the researchers summarized as “Fairness/Unfairness”, “Liberty/Oppression”, or “Honesty/Dishonesty”, while politically conservative subjects more often described moral events in terms the researchers summarized as “Loyalty/Disloyalty”, “Authority/Subversion”, and “Sanctity/Degradation”. (Other notions of moral conduct, such as self-discipline or caring for or harming others, were applied more or less equally by subjects of all political affiliations.)
The study therefore tends to corroborate a notion gaining traction in some circles: that differences in political ideology correspond to differences in the moral values we emphasize. As a result, sensitivity to the potential range of another person’s moral priorities might be helpful when discussing ethical concerns or conducting investigations. For example, presenting concerns in terms of loyalty to or betrayal of a cause may resonate more with some, while others may be more receptive to arguments based on integrity or candor. In addition, since internal complaints are sometimes motivated by a sense of moral outrage, it may be helpful to understand whether the underlying concern is that the complainant was misled, humiliated, or treated unfairly, or something else altogether. If nothing else, such an approach may help persuade the complainant that the investigator is listening to their point of view.
Although the study was silent on this point, one might ask whether these different moral frameworks correlate in some way with authority or social status: people already in positions of power may have more reason to value loyalty in others, while those with little control over their work environment may be more preoccupied with honesty, fairness, and autonomy. In any event, it’s well known that a single-minded emphasis on any one of these values can invite disaster. The problems with overemphasis on loyalty, for example, are nicely illustrated in the classic study of both liberal and conservative presidential administrations compiled in Irving Janis’s Groupthink. For those unfamiliar with this work, it describes how group dynamics lured very capable presidential teams into catastrophic blunders, from the Bay of Pigs to Watergate. Of particular interest here, many of these groups discouraged and suppressed “disloyal” dissenting views that later turned out to be correct.
Further, to no one’s surprise, the study published in Science also suggests that our moral thinking consists mostly of dwelling on our own excellent moral qualities and gossiping critically about others: study participants tended to report more of their own good deeds than bad deeds, and reported more gossip about others’ bad deeds than good deeds. However, when describing events going on around them, participants reported acts that they considered moral more often than those they considered immoral – suggesting, perhaps, that when our self-interested biases are removed, more good deeds are taking place (or at least being noticed) than bad.
The fact that people may be kinder than we realize will hopefully rattle a preconception, common among both employees and managers, that the researchers described as the “holier-than-thou” effect: our predictions about our own moral behavior are rosy, while our predictions about others are far more cynical. Understanding that this attitude arises from certain cognitive biases, and does not necessarily reflect the way people really behave, may help us place more confidence in our colleagues and, with time, relax the all-too-human reflex to take offense simply because offense is possible, rather than because such a reaction is warranted by what we know of the people involved. It’s true that neglect, insensitivity, ignorance, a misplaced sense of duty, and our other failings can all cause real harm; but I think most managers, and most employees too, would prefer to keep anger as a last resort.
Promoting a culture of trust and understanding is a recurring theme in my trainings for clients on anti-discrimination, diversity, and inclusion. For one thing, engaging with people who come from different moral perspectives may help us identify and resolve problems earlier, before they become expensive. Further, although policies are important, and may be critical to establishing certain defenses in litigation, no policy can take the place of a workplace culture of mutual trust and respect, in which people have learned, by listening to and getting to know each other, that others are often better intentioned than we may casually assume.