THE NEW SCIENCE OF MORAL COGNITION: THE STATE OF THE ART [PART II]

Antonio Olivera-La Rosa, Ph.D. (Human Evolution and Cognition Group (IFISC-CSIC), University of the Balearic Islands, Palma de Mallorca, Spain) & Jaume Rosselló, Ph.D. (Department of Psychology, University of the Balearic Islands, Palma de Mallorca, Spain)

Moral judgment understood as an evaluation driven by innate principles

The first approach to the innateness of moral content argues that we are born with a moral faculty akin to the language faculty. Accordingly, it has been proposed that moral judgments are structured on a set of implicit principles that constitute a "Universal Moral Grammar" (Hauser, 2006), understood as an innate device for the acquisition of morality (Mikhail, 2007). In other words, the human mind is born equipped with a set of domain-specific rules, principles, and concepts that can produce a wide range of mental representations. These implicit principles determine the deontological status of an infinite assortment of acts (and non-acts; see Mikhail, 2007). As a result, moral intuitions are structured on the psychological guidelines that constitute the moral faculty.

For instance, it is argued that, although domain-general mechanisms underlie the moral faculty, some cognitive mechanisms are moral-specific (Cushman, Young, & Hauser, 2006). These authors propose that such mechanisms "translate" general principles into specific moral judgments, where each principle is understood as "a single factor that, when varied in the context of a moral dilemma, consistently produces divergent moral judgments" (Cushman, Young, & Hauser, 2006, p. 1082). Using this approach, they found support for three particular moral principles. According to the action principle, people judge harm caused by action as morally worse than equivalent harm caused by omission. According to the intention principle, people judge intended harm as morally worse than merely foreseen harm. Finally, according to the contact principle, people judge harm involving physical contact as morally worse than harm caused without contact. (A toy illustration of how these factors combine appears below.)
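As a purely illustrative aid, and not part of Cushman, Young, and Hauser's (2006) study, the logic of these three principles can be sketched in a few lines of Python. The additive, equally weighted scoring below is our own simplifying assumption for illustration; the original work claims only that each factor, varied on its own, consistently shifts judgments in one direction.

    from dataclasses import dataclass

    # Toy encoding of the three principles of harm (Cushman, Young, & Hauser, 2006).
    # The additive scoring scheme is a simplifying assumption, not the authors' model.

    @dataclass
    class HarmScenario:
        by_action: bool     # harm brought about by action rather than omission
        intended: bool      # harm intended rather than merely foreseen
        with_contact: bool  # harm involving physical contact

    def relative_wrongness(s: HarmScenario) -> int:
        """Higher score = predicted to be judged morally worse, other things equal."""
        return sum([
            s.by_action,     # action principle
            s.intended,      # intention principle
            s.with_contact,  # contact principle
        ])

    # Varying a single factor produces divergent judgments,
    # e.g. intended harm vs. merely foreseen harm:
    intended_harm = HarmScenario(by_action=True, intended=True, with_contact=False)
    foreseen_harm = HarmScenario(by_action=True, intended=False, with_contact=False)
    assert relative_wrongness(intended_harm) > relative_wrongness(foreseen_harm)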
Research conducted by Knobe (2010) offers an interesting counterpoint to this perspective. Knobe has found evidence suggesting that the "moral status" of an action (that is, whether it is judged as morally right or wrong) influences the perceived intentionality of the action being judged. For instance, Knobe and his team found that the same action was judged as intentional or unintentional depending on whether it was seen as wrong or right, respectively.

Likewise, a growing body of studies in the field of neuroscience suggests that there might be unconscious principles underlying moral judgments. Consider the following scenario: a runaway trolley will kill five people if it continues on its present course. The only way to avoid this tragedy is to hit a switch that will divert the trolley onto a side track, where it will run over, and of course kill, one person instead of the initial five. Is it morally acceptable to hit the switch? (Greene, Sommerville, Nystrom, Darley, & Cohen, 2001, p. 2105). Studies on this dilemma show a strong tendency to immediately consider the affirmative response morally acceptable (Greene et al., 2001; Greene, Nystrom, Engell, Darley, & Cohen, 2004).

Interestingly, responses are quite different when participants are asked to evaluate a close variant of the trolley dilemma, the "footbridge dilemma," in which all variables were controlled to be identical to those of the trolley dilemma except one: in order to stop the trolley and save the five people, participants must push a large man off a footbridge instead of hitting a switch. Despite the obvious similarities, results show that people respond in the opposite way: they tend to immediately judge it impermissible to push one man to his death in order to save five (Greene et al., 2001).

What makes it morally acceptable to sacrifice one life in order to save five in the first case but not in the second? For Greene and collaborators (2001), the main distinction between the two situations is that the mere thought of pushing someone to certain death with one's own hands, in an "up close and personal" manner, is likely to be more emotionally salient than the "impersonal" thought of hitting a switch, even though both responses have similar consequences. It is noteworthy that, although the explanatory validity of this distinction has been seriously questioned (Kahane et al., 2011; McGuire, Langdon, Coltheart, & Mackenzie, 2009), there appears to be something about the actions in the footbridge and switch dilemmas that elicits different responses.

Moral judgments understood as an automatic-affective evaluative process

The possibility that the evaluation of the two types of dilemmas engages dissociable processing systems has been proposed as an explanation for this phenomenon. Neuroimaging studies have reported activity in several brain regions during the evaluation of moral events (Moll & Schulkin, 2009), which indicates that moral judgment involves several brain areas working in an integrated manner. Some of these areas are associated with emotional processes and others with rational processing, a fact that has fueled the debate about the respective functions of rational and emotional processes in moral judgment.

For example, Greene (2009) proposes a dual-process theory of moral judgment, according to which automatic emotional responses drive characteristically deontological judgments, whereas controlled cognitive processes drive utilitarian judgments. Thus, Greene claims that moral cognition functions like a camera: there is an "automatic" mode (emotions and intuitions) and a "manual" mode (conscious reasoning). Depending on the situation being judged, one setting can be more efficient than the other. As a general rule, the automatic mode is more efficient in everyday situations to which we are to some extent habituated; conversely, in novel situations that require more flexible responses, the manual mode is more efficient. These differentiated processes can come into conflict in moral situations where a rational evaluation clearly favors the "right" response but the implications of that choice elicit a negative emotional reaction (Greene et al., 2004). Supporting this claim, a neuropsychological study by Koenigs et al. (2007) found that patients with damage to the ventromedial prefrontal cortex made about five times more utilitarian judgments than control subjects.

This dual conception of moral cognition is widely shared among moral psychologists. Moreover, a recent body of research favors the characterization of the typical moral judgment as an automatic process.
For example, Jonathan Haidt (2001) assembled a substantial body of evidence supporting his central claim that most moral judgments are caused by moral intuitions. Based on this conception, Haidt (2001) proposed the Social Intuitionist Model of moral judgment (SIM), which essentially captures the interaction between moral intuitions, moral judgments, and moral reasoning. On this view, in daily life affect-laden intuitions drive moral judgments, whereas moral reasoning, when it occurs, follows these intuitions in a post hoc manner. From this perspective, moral judgment is much like aesthetic judgment: in the presence of a moral event, we experience an instant feeling of approval or disapproval (Haidt, 2001). Nevertheless, moral reasoning plays an important "social" role in moral cognition, being very common in conversation and moral decision making (Haidt & Bjorklund, 2007). In particular, moral arguments should be understood as attempts to trigger the right intuitions in others. As a consequence, moral discussions can be understood as processes in which two or more people are engaged in a battle to push each other's emotional buttons.

The characterization of moral judgment as a response resulting from intuitive-affective processes rests on two central claims. The first is that people often have the feeling that something is wrong but find it extremely difficult to produce reasons that justify their evaluation. Haidt (2001) identified this cognitive phenomenon as "moral dumbfounding": lacking a genuine understanding of the grounds of a given moral judgment, people tend to search for plausible explanations of why anyone in a similar situation would have proceeded in the same way. In such situations, people intuitively "know" whether something is right or wrong but, faced with the lack of a logical account of their response, tend to rationalize a justification for their initial intuition. In other words, we are often unaware of the cognitive processes that influence our moral judgments because the "moral mind" acts more like a lawyer trying to build a case than like a judge searching for the truth (Haidt, 2001):

People have quick and automatic moral intuitions and, when called upon to justify these intuitions, they generate post-hoc justifications out of a priori moral theories. They do not realize that they are doing this. (…). Rather, people are searching for plausible theories about why they might have done what they did. Moral arguments are therefore like shadow-boxing matches: each contestant lands heavy blows to the opponent's shadow, then wonders why he doesn't fall down (Haidt, 2001, pp. 12-13).

The second claim supporting the characterization of moral judgments as automatic-affective evaluative processes is the sensitivity of moral judgments to affective influences. For instance, there is evidence suggesting that disgust exerts a special influence on moral judgments (Eskine, Kacinik, & Prinz, 2011; Eskine, Kacinik, & Webster, 2012; Schnall, Haidt, Clore, & Jordan, 2008; Olivera La Rosa & Rosselló, 2012, 2013). The reverse of this pattern also appears to hold: Ritter and Preston (2011) found that disgust toward rejected religious beliefs was eliminated when participants were allowed to wash their hands.
Moreover, there is evidence that both the cognitive concept and the physical sensation of cleanliness can make moral judgments less severe (Schnall, Benton, & Harvey, 2008) and can reduce the upsetting consequences of immoral behavior (Zhong & Liljenquist, 2006).

Moral norms understood as psychologically constrained cultural constructions

The affective-intuitive approach to morality is largely sustained by the claim that moral beliefs and motivations ultimately derive from moral emotions. These emotions are understood as evaluations (good or bad) of persons or actions, with the particularity that the object evaluated can be the self or another. Thus, Haidt (2003) proposes that moral emotions can be divided into other-condemning emotions (such as contempt, anger, and disgust), self-condemning emotions (shame, embarrassment, and guilt), other-praising emotions (gratitude, admiration, and elevation), and self-praising emotions (pride and self-satisfaction). These emotions are typically triggered by the perception of a moral violation and normally motivate actions directed at re-establishing the "broken" moral value (Nichols, 2008).

A distinctive feature of moral emotions is that their subjective experience is especially sensitive to cultural factors and social dynamics. The fact that certain moral emotions are associated with certain social situations across different cultures suggests that there may be common psychological foundations underlying the development of moral systems. For instance, Haidt and Joseph (2004) argue that we are born with a "first moral draft" constituted of (at least) five sets of affect-laden intuitions, each of which is easily triggered by the perception of a corresponding set of moral situations. In other words, the human mind has evolved a sort of "social receptors" or "moral buds" (Haidt & Joseph, 2004, p. 57) that are sensitive to the recognition of social patterns (such as actions, relationships, or intentions) and can "translate" the perception of these patterns into emotional states.

Further, it is argued that evolutionary pressures structured the human mind to intuitively develop concerns about five moral foundations (Haidt & Joseph, 2004). Harm/care is associated with the emotion of compassion and with concern for the suffering of others, including virtues such as caring and kindness. Fairness/reciprocity involves concerns about unfair treatment, inequity, and abstract notions of justice; moral violations within this domain are associated with the emotion of anger. In-group/loyalty is associated with emotions of group pride and rage against traitors, together with concerns derived from group membership. Authority/respect involves concerns related to social order and the obligations derived from hierarchical relationships, concerns that are mediated by the emotion of fear. Lastly, purity/sanctity involves concerns about physical and spiritual contagion, including virtues of chastity, wholesomeness, sanctity, and control of desires, and is regulated by the emotion of disgust. (A schematic summary of this mapping appears below.)

Thus, Haidt and Bjorklund (2007) argue that moral development should be understood as a process of externalization: our mind has evolved five moral foundations that function as "learning modules" which, working together with cultural elements, facilitate the emergence of moral knowledge.
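For readers who find a schematic summary helpful, the foundation-to-emotion mapping described above can be compressed into a small lookup table. The Python sketch below is our own summary for illustration, not an artifact of Haidt and Joseph's (2004) theory; the labels and groupings simply restate the text.

    # Schematic summary of the five moral foundations (Haidt & Joseph, 2004).
    # Purely illustrative: this table restates the mapping described in the text.
    MORAL_FOUNDATIONS = {
        "harm/care":            {"emotion": "compassion",
                                 "concerns": "suffering of others; caring, kindness"},
        "fairness/reciprocity": {"emotion": "anger",
                                 "concerns": "unfair treatment, inequity, justice"},
        "in-group/loyalty":     {"emotion": "group pride, rage against traitors",
                                 "concerns": "obligations of group membership"},
        "authority/respect":    {"emotion": "fear",
                                 "concerns": "social order, hierarchical obligations"},
        "purity/sanctity":      {"emotion": "disgust",
                                 "concerns": "physical and spiritual contagion; "
                                             "chastity, control of desires"},
    }

    def triggered_emotion(foundation: str) -> str:
        """Return the characteristic emotion tied to a given moral foundation."""
        return MORAL_FOUNDATIONS[foundation]["emotion"]

    # Example: a purity violation is predicted to elicit disgust.
    assert triggered_emotion("purity/sanctity") == "disgust"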
Moreover, an important aspect of this theory is that each moral foundation is understood as largely independent from an evolutionary perspective. That is, each set of psychological mechanisms (moral emotions and intuitions) can be explained as having been shaped by different selective social pressures. This hypothesis is derived from the fact that four of the foundations (all but purity/sanctity) appear to be built on psychological mechanisms that are present in non-human primates (Haidt & Joseph, 2004).

These findings call attention to the significant influence of emotional processes on moral life. For instance, it has been proposed that the moral dimension of rules is psychologically grounded in moral emotions (Nichols, 2008). Like Greene (2009) and Haidt and Joseph (2004), Nichols believes that we have evolved an innate psychological predisposition to experience negative affective responses in the presence of actions that involve another's suffering. On his approach, this aversive mechanism constitutes the "emotional support" for the emergence and transmission of moral norms: for a moral norm to have "cultural fitness," there must be some emotional congruence between the content of the norm and its implications. Affective mechanisms therefore appear to constitute an important factor mediating the moral/conventional distinction.

Rozin, Markwith, and Stoess (1997) proposed the concept of moralization to explain the phenomenon by which objects or activities that were originally neutral acquire a moral status. For example, they found that participants who reported avoiding meat for moral reasons found meat more disgusting and offered more reasons in support of their position. Along the same lines, Rozin and Singh (1999) found that participants' disgust measures were highly correlated with their negative moral judgments of smokers, suggesting that disgust toward smoking is linked to strong beliefs that smoking is immoral.

Conclusion

In summary, the approaches reviewed above suggest that emotional processes play a motivational role at the normative level of morality. This claim implies that there are no rigid parameters constraining moral norms, only innate predispositions that can potentially shape the content of those norms. As Sripada (2007) points out, although there are "high-level themes" in the content of moral norms that are nearly ubiquitous among moral systems (such as harm, incest, helping, sharing, social justice, and group defense), the specific rules that operate within each theme are culturally idiosyncratic and highly variable. The innateness of moral systems should therefore be understood in terms of a set of social preparedness, a "universal menu of moral categories" (Prinz, 2007, p. 381) that constrains the construction and functioning of moral systems. In this context, the cuisine analogy proposed by Haidt and Bjorklund (2007) is illustrative: although cuisines are unique cultural products, they are built on an innate sensory system that includes five different taste receptors on the tongue. These biological structures constrain cuisines, and our preferences, while at the same time allowing a wide range of creativity in the final products. In short, the human mind is endowed with "conceptual moral seeds" that are typically externalized through individual development if the right "weather" (the cultural inputs) does its part.

The present review has some limitations. Due to the breadth of the topic, several approaches were not considered in the current discussion. For instance, morality has been a major theme in Western philosophy.
Although the discussion of philosophical approaches to the moral domain certainly exceeds the scope of this review, it is worth noting that recent findings from neuroscientific and clinical studies have provided new insights into traditional philosophical debates. In this regard, Damasio's (2004) research strongly suggests that the human mind is essentially embodied (as Spinoza believed), which implies that body states often precede higher-order mental processes, and not the other way around (as Descartes claimed). In addition, further studies on clinical populations with affect-related impairments and dysfunctions can provide key insights into the influence of affective variables on the moral domain. Along these lines, further research is needed to address the specific role of emotional processes in moral judgments. Moreover, future studies should test whether the influence of incidental affect on moral judgments is indeed moral-specific or whether it extends to other types of affective judgments (e.g., aesthetic judgments).

References

Bargh, J. A. (1994). The four horsemen of automaticity: Awareness, efficiency, intention, and control in social cognition. In R. S. Wyer, Jr., & T. K. Srull (Eds.), Handbook of social cognition (2nd ed., pp. 1-40). Hillsdale, NJ: Erlbaum.
Brosnan, S. F., & de Waal, F. (2003). Monkeys reject unequal pay. Nature, 425, 297-299. doi: 10.1038/nature01963
Cushman, F., Young, L., & Hauser, M. D. (2006). The role of conscious reasoning and intuition in moral judgment: Testing three principles of harm. Psychological Science, 17(12), 1082-1089. doi: 10.1111/j.1467-9280.2006.01834.x
Damasio, A. R. (1994). Descartes' error: Emotion, rationality and the human brain. New York: Grosset/Putnam.
Damasio, A. R. (2004). Looking for Spinoza: Joy, sorrow and the feeling brain. Random House.
De Waal, F. (1996). Good natured: The origins of right and wrong in humans and other animals. Cambridge, MA: Harvard University Press.
De Waal, F. (2007). Putting the altruism back into altruism: The evolution of empathy. Annual Review of Psychology, 59, 279-300. doi: 10.1146/annurev.psych.59.103006.093625
Eskine, K. J., Kacinik, N. A., & Prinz, J. J. (2011). A bad taste in the mouth: Gustatory disgust influences moral judgments. Psychological Science, 22, 295-299. doi: 10.1177/0956797611398497
Eskine, K. J., Kacinik, N. A., & Webster, G. D. (2012). The bitter truth about morality: Virtue, not vice, makes a bland beverage taste nice. PLoS ONE, 7(7), e41159. doi: 10.1371/journal.pone.0041159
Flack, J. C., & de Waal, F. (2000). 'Any animal whatever': Darwinian building blocks of morality in monkeys and apes. Journal of Consciousness Studies, 7, 1-29.
Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M., & Cohen, J. D. (2001). An fMRI investigation of emotional engagement in moral judgment. Science, 293, 2105-2108. doi: 10.1126/science.1062872
Greene, J. D., Nystrom, L. E., Engell, A. D., Darley, J. M., & Cohen, J. D. (2004). The neural bases of cognitive conflict and control in moral judgment. Neuron, 44, 389-400. doi: 10.1016/j.neuron.2004.09.027
Greene, J. D. (2009). Dual-process morality and the personal/impersonal distinction: A reply to McGuire, Langdon, Coltheart, and Mackenzie. Journal of Experimental Social Psychology, 45(3), 581-584. doi: 10.1016/j.jesp.2009.01.003
Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108, 814-834. doi: 10.1037/0033-295X.108.4.814
Haidt, J. (2003). The moral emotions. In R. J. Davidson, K. R. Scherer, & H. H. Goldsmith (Eds.), Handbook of affective sciences (pp. 852-870). Oxford: Oxford University Press.
Haidt, J., & Bjorklund, F. (2007). Social intuitionists answer six questions about moral psychology. In W. Sinnott-Armstrong (Ed.), Moral psychology, Vol. 2: The cognitive science of morality (pp. 181-217). Cambridge, MA: MIT Press.
Haidt, J., & Joseph, C. (2004). Intuitive ethics: How innately prepared intuitions generate culturally variable virtues. Daedalus, 133, 55-66. doi: 10.1162/0011526042365555
Haidt, J., & Kesebir, S. (2010). Morality. In S. T. Fiske, D. T. Gilbert, & G. Lindzey (Eds.), Handbook of social psychology (5th ed., pp. 797-832). Hoboken, NJ: Wiley.
Hauser, M. D. (2006). Moral minds: How nature designed our universal sense of right and wrong. New York: HarperCollins.
Kahane, G., Wiech, K., Shackel, N., Farias, M., Savulescu, J., & Tracey, I. (2011). The neural basis of intuitive and counterintuitive moral judgment. Social Cognitive and Affective Neuroscience, 7(4), 393-402. doi: 10.1093/scan/nsr005
Knobe, J. (2010). Action trees and moral judgment. Topics in Cognitive Science, 2(3), 555-578. doi: 10.1111/j.1756-8765.2010.01093.x
Koenigs, M., Young, L., Adolphs, R., Tranel, D., Cushman, F., Hauser, M., & Damasio, A. (2007). Damage to the prefrontal cortex increases utilitarian moral judgements. Nature, 446, 908-911. doi: 10.1038/nature05631
Kohlberg, L. (1969). Stage and sequence: The cognitive-developmental approach to socialization. In D. A. Goslin (Ed.), Handbook of socialization theory and research (pp. 347-480). Chicago: Rand McNally.
McGuire, J., Langdon, R., Coltheart, M., & Mackenzie, C. (2009). A reanalysis of the personal/impersonal distinction in moral psychology research. Journal of Experimental Social Psychology, 45(3), 577-580. doi: 10.1016/j.jesp.2009.01.002
Mikhail, J. (2007). Universal moral grammar: Theory, evidence and the future. Trends in Cognitive Sciences, 11, 143-152. doi: 10.1016/j.tics.2006.12.007
Moll, J., & Schulkin, J. (2009). Social attachment and aversion in human moral cognition. Neuroscience & Biobehavioral Reviews, 33(3), 456-465. doi: 10.1016/j.neubiorev.2008.12.001
Nadal, M., Barceló-Coblijn, L., Olivera, A., Christensen, J. F., Rincón-Ruíz, C., & Cela-Conde, C. (2009). Darwin's legacy: A comparative approach to the evolution of human derived cognitive traits. Ludus Vitalis, 15(32), 145-172.
Nichols, S. (2008). Moral rationalism and empirical immunity. In W. Sinnott-Armstrong (Ed.), Moral psychology, Vol. 3: The neuroscience of morality (pp. 395-407). Cambridge, MA: MIT Press.
Öhman, A. (1987). Psychophysiology of emotion: An evolutionary-cognitive perspective. In P. K. Ackles, J. R. Jennings, & M. G. H. Coles (Eds.), Advances in psychophysiology (Vol. 2, pp. 79-127). Greenwich, CT: JAI Press.
Olivera La Rosa, A., & Rosselló-Mir, J. (2012). Shocking moral judgments. LAP LAMBERT Academic Publishing.
Olivera La Rosa, A., & Rosselló, J. (2013). On the relationships between disgust and morality: A critical review. Psicothema, 25(2), 222-226.
Parkinson, C., Sinnott-Armstrong, W., Koralus, P. E., Mendelovici, A., McGeer, V., & Wheatley, T. (2011). Is morality unified? Evidence that distinct neural systems underlie moral judgments of harm, dishonesty, and disgust. Journal of Cognitive Neuroscience, 23(10), 3162-3180. doi: 10.1162/jocn_a_00017
Piaget, J. (1932/1965). The moral judgement of the child. New York: Free Press.
Preston, S. D., & de Waal, F. (2002). Empathy: Its ultimate and proximate bases. Behavioral and Brain Sciences, 25(1), 1-71. doi: 10.1017/S0140525X02000018
Ritter, R. S., & Preston, J. L. (2011). Gross gods and icky atheism: Disgust responses to rejected religious beliefs. Journal of Experimental Social Psychology, 47, 1225-1230. doi: 10.1016/j.jesp.2011.05.006
Rozin, P., & Singh, L. (1999). The moralization of cigarette smoking in the United States. Journal of Consumer Psychology, 8, 321-337.
Rozin, P., Markwith, M., & Stoess, C. (1997). Moralization and becoming a vegetarian: The transformation of preferences into values and the recruitment of disgust. Psychological Science, 8, 67-73. doi: 10.1111/j.1467-9280.1997.tb00685.x
Schnall, S., Benton, J., & Harvey, S. (2008). With a clean conscience: Cleanliness reduces the severity of moral judgments. Psychological Science, 19, 1219-1222. doi: 10.1111/j.1467-9280.2008.02227.x
Schnall, S., Haidt, J., Clore, G. L., & Jordan, A. H. (2008). Disgust as embodied moral judgment. Personality and Social Psychology Bulletin, 34, 1096-1109. doi: 10.1177/0146167208317771
Sripada, C. (2007). Nativism and moral psychology: Three models of the innate structure that shapes the content of moral norms. In W. Sinnott-Armstrong (Ed.), Moral psychology, Vol. 1: The evolution of morality: Adaptations and innateness (pp. 319-343). Cambridge, MA: MIT Press.
Tomasello, M., Call, J., & Hare, B. (2003). Chimpanzees understand psychological states – the question is which ones and to what extent. Trends in Cognitive Sciences, 7, 153-156. doi: 10.1016/S1364-6613(03)00035-4
Tse, P. U. (2008). Symbolic thought and the evolution of human morality. In W. Sinnott-Armstrong (Ed.), Moral psychology, Vol. 1: The evolution of morality: Adaptation and innateness (pp. 269-297). Cambridge, MA: MIT Press.
Van Wolkenten, M., Brosnan, S. F., & de Waal, F. (2007). Inequity responses of monkeys modified by effort. Proceedings of the National Academy of Sciences USA, 104, 18854-18859. doi: 10.1073/pnas.0707182104
Wilson, E. O. (1975). Sociobiology: The new synthesis. Cambridge, MA: Harvard University Press.
Zhong, C. B., & Liljenquist, K. A. (2006). Washing away your sins: Threatened morality and physical cleansing. Science, 313, 1451-1452. doi: 10.1126/science.1130726