Morality is essentially a suite of psychological mechanisms that enable us to cooperate. But, biologically at least, we only evolved to cooperate in a tribal way. Individuals who were more cooperative with those around them could outcompete others who were not. However, we have the capacity to take a step back from this and ask what a more global morality would look like. Why are the lives of people on the other side of the world worth any less than those in my immediate community? Going through that reasoning process can allow our moral thinking to do something it never evolved to do.
When you share your moral common sense with people in your locality, that helps you to form a community. But those gut reactions differ between groups, making it harder to get along with other groups.
When there is a conflict, which group's sense of right and wrong should prevail? If a morality is a system that allows individuals to form a group and to get along with each other, then the challenge is to devise a system that allows different groups to get along - what I call a meta-morality.
We now have a better biological and psychological understanding of our moral thinking. The idea that we should do whatever maximizes happiness sounds very reasonable, but it often conflicts with our gut reactions. Philosophers have spent the last century or so finding examples where our intuitions run counter to this idea and have taken them as signals that something is wrong with the philosophy. But when you look at the psychology behind those examples, they become less compelling. An alternative conclusion is that our gut reactions are not always reliable.
Since functional brain imaging first emerged, we have learned that there aren't very many brain regions uniquely responsible for specific tasks; most complex tasks engage many if not all of the brain's major networks. So it is fairly hard to make general psychological inferences just from brain data.
When you are thinking about whether you have an obligation to try to save people's lives, you don't usually ask, well, how close by are they? But understanding what we are reacting to can change the way we think about the problem. If, biologically, morality evolved to help us get along with individuals in our community, it makes sense that we have heartstrings that can be tugged - and that they are not going to be tugged very hard from far away. But does that make sense morally? From a more reflective perspective, that may just be a cognitive glitch.
Utilitarianism is inherently pragmatic - in fact, I prefer to call it "deep pragmatism." Humans have real limitations, obligations, and frailties, so the best policy is to set reasonable goals, given your limitations. Just try to be a little less tribalistic.