Do Humans Have a “Moral Sense”?

What drives us to treat others as we want to be treated? Is it a skill we must learn, or are we born with some sense of morality? If we are, how far can it take us?

In a note to a group of young people, Mark Twain once advised, “Always do right. This will gratify some people and astonish the rest.”

As appealing as the thought of astonishing people may be, defining what is right is not as easy as it sounds. Entire fields of study, known by such terms as ethics and moral philosophy, are devoted to the topic. Yet countless philosophers writing libraries of books over thousands of years have not improved on the central tenet of Jesus’ Sermon on the Mount: “Treat others as you want to be treated.”

Philosopher and theologian Albert Schweitzer expressed a similar notion: “A man is truly ethical only when he obeys the compulsion to help all life which he is able to assist, and shrinks from injuring anything that lives.” To some degree, we understand what it means to help or injure others because we can conceive of what makes us feel helped or injured.

But is our sense of how we want to be treated enough to answer all the moral questions that come our way? Personal preferences and cultural differences obviously influence how we want to be treated; so what drives us to do right by others? Is it something we simply have to learn, or are we born with some sense of morality? And if we are, how far does it take us?

Some of these questions may seem best left to philosophers and theologians, but we all face moral dilemmas. Devout or skeptic, on some level we all want to see ourselves in a good light. We’d like to think we can tell right from wrong—that we are, in fact, moral—and most would accept that a belief in God is not a prerequisite. A quick read of the daily news confirms as much: some who are religious don’t have what might be called high moral standards, and some with high moral standards aren’t religious. This is not to say that religious texts don’t have much to offer in terms of moral codes; they do. The trouble is that some who claim to believe in them don’t necessarily live by them. The Bible’s Ten Commandments, for instance, were summed up by Jesus in two: love God, and love your neighbor. Yet even among those who claim to obey the first, there’s often little evidence of the second.

Nevertheless, religious texts have historically helped inform society’s moral standards. Even if their precepts have never been universally upheld, they’ve made considerable contributions to the moral discourse.

In recent decades, moral psychologists have entered the discussion, focusing mainly on how people decide what’s right and wrong, while philosophers and theologians are more concerned with what is right and wrong. There are, of course, common interests across these groups, and one is the question of motivation. What keeps us on the straight and narrow? What makes people want to determine and do the right thing?

You could say it all starts with our inborn need for emotional connection. Researchers who study motivation find that emotion works in tandem with our thoughts in prompting moral action. Jesus seems to have acknowledged this connection when He said, “If you love me, you will keep my commandments” (John 14:15). From this perspective, love (a multidimensional emotion that includes such feelings as compassion and loyalty) manifests as showing care, concern and faithfulness.

Motivating Morality

We are social and emotional beings from birth, which means we have a natural drive to connect with others both socially and emotionally. Our capacity for empathy allows us to do this, which is why neuroscience has long explored how it may be hardwired in the human brain. “Mirror neurons” caused quite a stir after their discovery by Italian scientists in the 1990s, and continued research has shed further light on their role. These neurons were found to fire not only when we perform an action ourselves but also when we watch someone else perform the same action. This observation has led many scientists to conclude that they’ve found the seat of our brain’s ability to empathize—to identify with someone else’s feelings and experiences.

Simply put, mirror neurons may give us a window into the inner worlds of others so we can connect empathetically on both a social and an emotional level. And this window seems to open early—from infancy, in fact.

Paul Bloom is a Yale psychologist whose interest in moral behavior made him curious about how much of our moral sense might be innate. He has studied babies and young children to see how our sense of right and wrong develops. Over his career, he and his colleagues have collected an impressive body of evidence showing that babies (from about three months old) and young children do have something we could call a moral sense. For instance, they can tell whether someone is being kind or cruel; they don’t like to see people suffer and will try to alleviate their pain; they value fairness, though they see it in simple terms (everyone gets the same number of raisins); and their sense of justice calls for rewarding good actions and punishing bad ones.

“Our innate goodness is limited, however,” writes Bloom, “sometimes tragically so. . . . We are by nature indifferent, even hostile, to strangers; we are prone toward parochialism and bigotry. Some of our instinctive emotional responses, most notably disgust, spur us to do terrible things, including acts of genocide.”

“A moral sense . . . is not the same as an impulse to do good and avoid doing evil. Rather, it is the capacity to make certain types of judgments—to distinguish between good and bad, kindness and cruelty.”

Paul Bloom, Just Babies: The Origins of Good and Evil

Some have tried to counter these tendencies by calling on people to expand their moral circles—the groups they consider worthy of moral concern. As Bloom pointed out, we can be parochial and indifferent to strangers, even while showing concern for our families and close friends. How far our moral circles extend varies from person to person due to a number of complex dynamics. But to simplify the idea, we can picture the ripples made by a pebble in a pond, with ourselves at the center. Closest to us is the family circle, then more circles extend out to represent friends, our community, the nation we live in, all human beings—and potentially even animals, all living things and the universe itself.

As we expand our moral circles, we might find ourselves behaving in ways that benefit even those yet to be born (by doing what we can to address existential threats, for instance). But at some point, for most people, a more inward-looking force begins to kick in, and we may find ourselves unable (or unwilling) to expand our moral circles any further.

There are, however, ways to nurture our willingness to expand them. Compassion meditation is one. Brain scans show that it can produce measurable changes in the brain areas involved in focused attention and empathy, helping us better imagine the perspectives of others and make appropriate moral judgments. We can practice compassion meditation by picturing the care and concern we feel for a close loved one and then extending that same feeling to someone further out in the expanding ripples of our moral circles. Exercising our minds in this way, by focusing on the emotions that support our better intentions, can help us cultivate love, respect, compassion, gratitude and an overall sense of social belonging—some of the many positive emotions that motivate productive behavior.

Clearly, however, there’s much more to motivating moral behavior than simply generating positive emotions. After all, fringe groups—from incels to terrorists—recruit and motivate their followers by offering them a sense of belonging that’s often missing in their usual environment. Our need to belong is powerful. We have a strong desire to please those we respect and care about, and to know where we stand in their estimation. We want to feel that they approve of our behavior—so much so, in fact, that we sometimes alter it to stay in their good graces. This is one of the mechanisms behind peer pressure, and again, it can work for us or against us.

“It is emotionally taxing to violate social and moral rules.”

Jesse Prinz, “The Emotional Basis of Moral Judgments”

The same can be said of the negative emotions linked to moral behavior. Disgust, embarrassment, guilt, shame, sadness, remorse—each of these can motivate behavior positively or negatively. And while “transcendent” emotions such as awe and veneration are often thought of in religious terms, as though they could inspire only morally sound behavior, it’s easy to think of examples where they have led to decidedly amoral or immoral acts. Hero-worship and self-righteousness, for instance, have been driving forces in war and genocide throughout the ages.

Still, research on psychopaths underscores the fact that even negative emotions play a key role in moral judgment. Because they rarely experience such negative emotions as fear and sadness themselves, psychopaths have difficulty recognizing them in others. As a consequence, they’re unable to empathize with another’s distress, which explains why they don’t feel genuine guilt or remorse when their behavior causes others pain. Without empathy, the concept of “wrong” means little beyond, perhaps, “prohibited by law.” The internal motivation to do right is missing when we can’t imagine, and care about, our actions’ consequences for others’ emotional states.

Empathy, then, helps us leverage our emotions so we can “treat others as we want to be treated.” But acting on that empathy often requires another well-studied trait: self-control.

Self-Control: The Empathy Factor

Researchers have found ample evidence of a link between empathy and self-control. Related brain regions are associated with the two traits, and each is shaped by our relationships with childhood caregivers. In other words, these traits, which seem so fundamental to human morality, involve both nature and nurture. Just as good-quality relationships strengthen the brain centers responsible for empathy and self-control, they also enlarge our moral identity. We look to those who love and care for us to set appropriate behavioral boundaries and to teach the finer points of conscientious, responsible character.

Along the way, we can make the mistake of thinking that self-control means suppressing emotion. It can sometimes seem that emotions are the enemy to be conquered, the weak part of our system that opens us up to a slippery slope of irresponsible behavior. We typically hear about the prefrontal cortex as the home of self-regulation, the whistleblower that stops impulsiveness dead in its tracks. But as the psychopath studies cited above suggest, our capacity for emotion is key to our capacity for empathy—and empathy is a pathway to the moral judgments that shape our behavior.

The takeaway is clear: the prefrontal cortex isn’t the only structure we use to regulate ourselves or to choose long-term rewards over immediate ones.

One group of researchers, focusing on an area of the brain known for its role in social processes and active empathy, among other functions, has found something very interesting. This area is activated not only when we regulate our behavior out of consideration for other people (as one might naturally expect) but also when we regulate it out of consideration for our future self. Empathy, of course, requires being able to imagine someone else’s perspective; delayed gratification, a key aspect of self-control, requires the same thing, except that the “someone else” can also be the future version of our self, a person whose needs and perspectives will differ from those of our present self.

When we think about some of our moral failures, which happen even though we want to see ourselves and be seen by others as moral people, we can see why the link between empathy and self-control is so important. Of course, despite every compassionate intention we can muster, we sometimes behave in ways that aren’t in sync with our beliefs. We hurt not only those we don’t know but also those we love—people we have strong emotional reasons to protect and care for.

Cognitive dissonance is the discomfort we feel when our beliefs and behaviors don’t align. A similar dissonance occurs when our moral intentions and behaviors don’t align. A group of behavioral scientists who study morality found that “people often transgress even when they recognize that their actions are morally ‘wrong.’” This happens, for instance, when a stronger emotion overrides the one on which we base our moral standard. Our standard may be that it’s wrong to lie to a friend, but fear that the truth could make them think less of us might tempt us to ignore that standard in a given moment. Then, through strategies these researchers call “moral disengagement,” we soothe the dissonance we feel by justifying our actions, perhaps by minimizing or otherwise misrepresenting the consequences of our behavior. In other instances, we might even dehumanize or blame others to justify our problematic behavior.

Blame and the Accountability Factor

Where we place blame makes a big difference in aligning our moral beliefs with our behaviors and judgments. Those who have been abused often internalize blame when the blame isn’t theirs at all. Abusers, eager to evade guilt, use the techniques just mentioned to accomplish that: they dehumanize their victims (“they deserved it; they were asking for it”), misrepresent the consequences (“they’ll get over it; they didn’t die”), or shift the blame (“it’s not my fault; it was the way they were dressed”). Rarely do perpetrators accept accountability by looking at their own actions through the same lens others would. Sometimes observers make a similar mistake, blaming victims for their victimization and excusing the perpetrator. Errors like these are moral failures not only of individuals but also of the society that enables them.

So if we’re to hold people accountable for moral wrongs, what are the building blocks of accountability?

Moral philosophers Brendan Dill and Stephen Darwall begin by looking at how observers and perpetrators react to moral wrongdoing. Critical attitudes such as contempt and disdain lead to shame, they note, causing us to internalize blame in a way that rarely brings about change. Guilt, on the other hand, is a personal acknowledgment of wrongdoing and is much more likely to lead us to accept accountability. Blame is involved in both cases; the difference lies in how it’s expressed and in the emotion it evokes. Some emotions are better at motivating moral behavior than others.

When we know we’ve committed a moral wrong and objectively charge ourselves with that wrong, guilt is the emotion that motivates us to hold the wrongdoer (ourselves) accountable.

“How can a perpetrator hold himself accountable? By regarding his actions as condemnable in the same way that an outside party would, and responding appropriately to this fact.”

Brendan Dill and Stephen Darwall, “Moral Psychology as Accountability”

Related to accountability and accepting blame are such concepts as remorse and forgiveness. These emotions can motivate moral behaviors that include reparation and reconciliation—drives that we might also consider innate. While babies may not understand the finer points of remorse and forgiveness, they do actively seek repair when they sense a broken connection with their caregivers. They can become quite agitated and anxious when a caregiver seems angry or unresponsive, often reaching out in an active attempt to restore that connection.

As adults, we clearly face many obstacles in our quest for sound moral judgment, but there’s enough evidence to suggest that humans are indeed born with a rudimentary moral sense. If so, this would seem to weaken arguments for moral relativism and to elevate words such as values and ethics to a status well above taste or preference. This is a crucial distinction. Among other things, it allows us to see modern examples of man’s inhumanity to man as the horrors they are—as betrayals of humanity, rather than as merely the chosen practices of a particular culture that can be left to develop as it will.

It should be clear from our own life experience, as well as from the events we see around us every day, that our inborn moral sense doesn’t give us all the answers we need to navigate our lives. To repeat Bloom’s point, quoted earlier: “Our innate goodness is limited, . . . sometimes tragically so.” But imperfect as it is, that sense should at least tell us that there are right and wrong ways to treat one another, and it should motivate us to commit ourselves to determining which is which.