1. Introduction
In moral psychology, the investigation of the determinants of moral judgment has become a central focus, attracting scholars from diverse disciplines. Over the past decade, research aimed at unraveling the complexities of moral decision-making has grown markedly, with particular emphasis on moral judgments: the evaluative appraisals individuals form in response to transgressions against ethical norms. This literature review examines the multifaceted factors that shape moral judgment [1].
Moral decision-making is highly intricate because of the interplay among sociocultural, cognitive, and neurological dimensions. These three pillars form the foundation of people's moral compass, profoundly influencing the choices individuals make and shaping the ethical frameworks they adopt [2]. A clear understanding of the nuances of these influences is crucial for navigating the complexities of morality, particularly in an era characterized by diverse cultural perspectives and rapid advances in cognitive and neurological research.
This literature review, guided by the premise that understanding moral judgment requires exploring sociocultural, cognitive, and neurological factors, engages with several seminal studies. These studies, each focusing on a distinct dimension of the moral decision-making process, collectively contribute to the broader discourse on morality while offering unique insights into the web of influences that shape people's ethical compass. It is important, however, to recognize the complexities and limitations of each study. While these investigations contribute significantly to current understanding, they also invite further inquiry, underscoring the need for ongoing research in moral psychology. The dynamic nature of societal norms, the evolving landscape of cognitive science, and the continuous refinement of neuroscientific methodologies all point to the need for a comprehensive and evolving account of the factors influencing moral judgments.
In the following sections, this review examines the specific contributions of each study, assessing their methodologies, findings, and limitations. Through this analysis, the author aims to construct a detailed narrative that highlights the distinct influences of sociocultural, cognitive, and neurological factors and points to the interconnectedness of these dimensions in shaping people's moral judgments. Integrating these perspectives not only strengthens theoretical understanding but also has practical applications in psychology, ethics, policy-making, and related fields.
2. Influence Factors
2.1. Sociocultural
Sociocultural influences on moral judgment encompass a range of elements, including cultural norms, religious beliefs, social institutions, and historical context [3-5]. Understanding the intricate dynamics of these factors is essential for unraveling the tapestry of moral reasoning and judgment in different societies. This exploration examines the profound ways in which sociocultural forces mold and influence moral decision-making, contributing to the rich diversity of ethical frameworks that define human societies worldwide.
To examine the relationship between sociocultural factors and moral judgments, a group of researchers conducted a study designed to elucidate this connection [3].
The study involved a diverse group of participants, including men and women from a range of demographic backgrounds with respect to age, religion, education, and occupation. A total of 659 participants completed the study, which analyzed their responses to a carefully constructed set of moral dilemmas. Participants were recruited through an internet-based research platform, with inclusion limited to those who fully completed a comprehensive demographic survey, provided their opinions on the moral scenarios, and responded accurately to control scenarios designed to assess attention and comprehension.
The participants were separated into two primary groups: the Russian sample, which consisted of 89 males (aged 16-69) and 238 females (aged 16-58), and the Western English-speaking sample, comprising 191 males (aged 10-85) and 141 females (aged 14-66). Each group was further divided into five age brackets: 10-19, 20-24, 25-34, 35-44, and 45-85 years old.
The experimental procedure involved participants visiting a dedicated website (moral.wjh.harvard.edu), where they followed on-screen instructions to complete a comprehensive demographic questionnaire covering gender, age, religion, education, and political affiliation. Participants were then presented with 32 moral scenarios in random order. These scenarios examined decision-making situations in which individuals had to choose whether to sacrifice one person to save others. Distinctions were drawn between actions and omissions, intended means and foreseen side effects, and contact versus no contact. Two supplementary scenarios were included as controls to present situations unrelated to morality and to check participants' understanding of the instructions. One representative scenario involved a character named Luke who had to decide whether to operate a switch at a railroad station, posing a moral dilemma. Participants rated the acceptability of Luke's actions on a Likert scale ranging from "prohibited" at one end to "obligatory" at the other, with "permissible" in the middle.
The data analysis was comprehensive. For each participant, moral permissibility ratings (MPRs) were computed across the 30 test scenarios, eighteen controlled pairs of scenarios were compared using paired-sample t-tests, and extreme judgments (both utilitarian and non-utilitarian) and their distribution across participant groups were examined. Reliability was checked with Cronbach's alpha. Statistical analyses were run in IBM SPSS 20 and included tests of normality, univariate general linear models (three-way ANOVA), various post hoc tests for group comparisons, assessments of homogeneity of variance, and in-group comparisons. The dynamics of MPRs across age groups were explored using Jonckheere trend tests, and Pearson's correlation analysis established associations between variables, with effect sizes estimated.
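To make the analysis pipeline concrete, the sketch below shows how the core steps described above (per-participant MPRs, a Cronbach's alpha reliability check, and a paired-samples comparison of matched scenarios) could be reproduced in Python. This is an illustrative sketch only, not the authors' SPSS syntax; the column names, the 1-7 Likert coding, and the simulated ratings are assumptions.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Simulated ratings: one row per participant, one column per test scenario
# (assumed coding: 1 = "prohibited", 4 = "permissible", 7 = "obligatory").
rng = np.random.default_rng(0)
ratings = pd.DataFrame(rng.integers(1, 8, size=(659, 30)),
                       columns=[f"scenario_{i + 1}" for i in range(30)])

# Moral permissibility rating (MPR): mean rating across the 30 test scenarios.
mpr = ratings.mean(axis=1)

# Cronbach's alpha as a reliability check across scenarios.
k = ratings.shape[1]
alpha = (k / (k - 1)) * (1 - ratings.var(axis=0, ddof=1).sum()
                         / ratings.sum(axis=1).var(ddof=1))

# Paired-samples t-test for one controlled pair of scenarios
# (e.g., action vs. omission versions of the same dilemma).
t, p = stats.ttest_rel(ratings["scenario_1"], ratings["scenario_2"])
print(f"mean MPR = {mpr.mean():.2f}, alpha = {alpha:.2f}, t = {t:.2f}, p = {p:.3f}")
```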
The study examined the complexities of participants' responses to moral dilemmas, aiming to understand the influences of gender, age, and cultural factors on moral judgments. Its comprehensive methodology, combining scenario presentation with extensive statistical analysis, strengthens the robustness of the research. The findings support the universality thesis of moral judgment, revealing consistency in judgments among Russian and Western participants with respect to three proposed morally relevant principles. Cross-cultural variations nevertheless exist: Western participants emphasized harm avoidance and fairness, whereas Russian participants prioritized loyalty and group cohesion. This suggests that cultural socialization influences individuals' moral judgments.
The study also identifies disparities in variability within cultural cohorts, with Russian participants exhibiting more diverse moral judgments than Western participants. This challenges assumptions about homogeneity within Western respondents. The research carries significant implications for understanding moral development across cultures, emphasizing the importance of cultural sensitivity in interventions and policies promoting ethical conduct.
Furthermore, the study contributes to the understanding of the interplay of culture, socialization, and moral judgment. It highlights the need for further research into the specific cultural factors and socialization processes contributing to cross-cultural disparities. Future investigations could examine how these variations in moral judgment affect individuals' behavior in real-world contexts, such as organizational or political settings.
The study under discussion has made notable contributions, but its validity is compromised by several significant limitations. Firstly, the use of internet-based surveys may introduce sampling bias, as online participants may not represent the entire population being studied. This could result in skewed and less generalizable findings. Additionally, relying on volunteers might attract individuals with specific inclinations, affecting the study's generalizability.
The study's classification of participants as "Western" and "Russian" oversimplifies cultural diversity, neglecting variations within these regions. Likert scales, while common, have subjective limitations, and assuming equal intervals between points may not be accurate. The study's assumption of the universal applicability of selected moral principles across cultures ignores the complexities of moral reasoning in diverse contexts, requiring a more qualitative exploration.
The use of a cross-sectional design limits the establishment of causal relationships between sociocultural factors and moral judgments, necessitating consideration of longitudinal or experimental designs. The study overlooks influential confounding variables such as socioeconomic status and religious beliefs. Limited demographic information collection ignores nuanced factors contributing to a comprehensive understanding of participants' backgrounds.
The study's reliance on specific moral dilemmas may influence participant responses, and a more diverse range of scenarios could capture a broader spectrum of moral judgments. Although acknowledging variability within cultural groups, the study does not thoroughly investigate it, and adherence to "Nature Journal" criteria raises concerns about potential publication bias. A more comprehensive consideration of negative or inconclusive findings is indicated.
In conclusion, this study provides support for the universality thesis of moral judgment while also highlighting significant cross-cultural variations in the emphasis placed on different moral principles. It offers valuable insights into the influential role of cultural socialization in shaping individuals' moral judgments and decision-making processes. Despite its limitations, these findings contribute to a fuller understanding of the complexities involved in moral development and underscore the need for cultural sensitivity when promoting moral judgment.
2.2. Cognitive
The intricate process of moral judgment is not solely the product of external societal influences; it is also deeply intertwined with the cognitive machinery of the human mind [1]. Cognitive factors play a pivotal role in shaping how individuals perceive, evaluate, and respond to moral dilemmas. The intricate interplay of cognitive processes, such as reasoning, decision-making, and emotional responses, constitutes the lens through which individuals navigate the complexities of moral choices. This exploration sheds light on the cognitive mechanisms that underlie ethical decision-making and contribute to the diverse array of moral perspectives observed across individuals and cultures. [6,7]
To investigate the connection between cognitive factors and moral decision-making, Daniel M. Bartels and David A. Pizarro conducted a study on the psychological traits associated with utilitarian preferences, challenging the notion that utilitarian responses represent optimal moral judgment [7].
The research considers the traits of those who endorse utilitarian solutions, connecting them with characteristics observed in clinical populations, and explores two potential routes to utilitarian preferences: one involving rational deliberation and another related to a reduced aversion to causing harm.
The study involved 208 undergraduates and explored the factors influencing individuals' inclination toward utilitarian solutions in moral dilemmas that entailed sacrificing one person for the greater good. Participants were presented with 14 such dilemmas and also completed assessments measuring psychopathic personality traits, Machiavellianism, and perceived lack of meaning in life. Psychopathy was evaluated in terms of reduced empathy and a tendency to seek excitement, while the Machiavellianism measure assessed cynicism and manipulative tendencies. Both constructs share characteristics such as cognitive detachment, aggression, and an inclination toward engaging in or justifying deception, although previous studies have shown that, while correlated, they are distinct constructs [8]. Additionally, perceptions of life meaninglessness were evaluated using the No Meaning Scale.
The study's findings indicated that individuals who scored higher on measures of psychopathy, lack of meaning, and Machiavellianism displayed a stronger preference for utilitarian options in moral dilemmas. The correlations between these predictor variables and average utilitarian preferences were statistically significant, with male participants scoring higher on psychopathy, lack of meaning, and Machiavellianism scales. Additionally, there was an observed correlation between social desirability and the predictor variables.
Multiple regression analyses, after accounting for gender and social desirability, revealed strong associations between utilitarian preferences and psychopathy, lack of meaning, and Machiavellianism. The distinctive predictive power of psychopathy and Machiavellianism stood out relative to the other factors. Overall, the study suggests that specific personality traits, particularly psychopathy and Machiavellianism, are linked to a heightened propensity to endorse utilitarian solutions in moral dilemmas.
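As an illustration of the kind of regression reported above, the following sketch regresses average utilitarian preference on the three trait measures while controlling for gender and social desirability. It is a hypothetical reconstruction using statsmodels with simulated data; the variable names and effect sizes are assumptions, not the study's materials.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 208  # sample size matching the study

# Simulated trait scores and controls (assumed names, standardized scales).
df = pd.DataFrame({
    "psychopathy": rng.normal(size=n),
    "machiavellianism": rng.normal(size=n),
    "no_meaning": rng.normal(size=n),
    "male": rng.integers(0, 2, size=n),
    "social_desirability": rng.normal(size=n),
})
# Simulated average utilitarian preference across the 14 dilemmas.
df["utilitarian"] = (0.4 * df["psychopathy"] + 0.3 * df["machiavellianism"]
                     + 0.2 * df["no_meaning"] + rng.normal(size=n))

# Regress utilitarian preference on the trait measures while controlling
# for gender and social desirability, as in the analysis described above.
model = smf.ols("utilitarian ~ psychopathy + machiavellianism + no_meaning"
                " + male + social_desirability", data=df).fit()
print(model.summary())
```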
The study has notable drawbacks that limit the robustness and generalizability of its findings. First, the reliance on a sample of 208 undergraduates raises concerns about representativeness, limiting the applicability of the results to a broader and more diverse population. Cultural and demographic factors are overlooked, diminishing the study's capacity to account for variation in moral judgments across different backgrounds. The use of footbridge-like moral dilemmas may oversimplify the complexity of real-world ethical decision-making, potentially compromising external validity. Self-report measures, particularly for psychopathy, Machiavellianism, and perceived life meaninglessness, introduce the risk of social desirability bias, affecting the reliability of participants' responses. The cross-sectional design lacks a longitudinal perspective, limiting insight into how these relationships unfold over time. The study's narrow focus on a specific set of personality traits also neglects other potentially influential factors in moral decision-making, and the assumption that a utilitarian framework is universally applicable, without consideration of alternative ethical perspectives, raises doubts about the generalizability of the findings. Finally, the use of hypothetical scenarios and self-reported preferences makes it difficult to determine how these responses translate into real-world moral behavior.
Nevertheless, the study engages the philosophical debate over moral principles and decisions, particularly the contrast between deontological and utilitarian approaches [7]. The discussion highlights cognitive aspects of moral judgment, especially in sacrificial dilemmas, where utilitarian options are often viewed as less than ideal. The paper proposes that adopting a utilitarian framework as a normative benchmark may lead to classifying a considerable number of individuals as morally incorrect.
2.3. Neurology
As the study of morality progresses, researchers have expanded their lens to encompass the captivating realm of neuroscience. The intricate neural networks and biochemical processes within the human brain offer invaluable insights into the foundations of moral judgments [9]. Neuroscience plays a pivotal role in unraveling the enigmas surrounding ethical decision-making, illuminating how people’s brains process information, emotions, and social cues to shape moral perspectives [10,11]. This exploration delves into the burgeoning field of neuroethics, scrutinizing how the architecture and functioning of the brain influence moral judgments and providing a profound comprehension of the biological underpinnings that contribute to diverse moral reasoning across individuals and societies.
The previous study only briefly addressed the neural underpinnings of human cognition [12]. In contrast, the following study examines the neural correlates of moral judgments from both the perspective of the individual involved (first person, actor) and that of an outside observer (third person) [13].
A group of sixteen predominantly right-handed individuals took part in moral judgment tasks. They assessed a total of 72 ethics-related statements presented from either a first-person or a third-person perspective, with the emotional impact and significance of the stimuli carefully controlled throughout the experiment. The functional magnetic resonance imaging (fMRI) analysis employed a block design with four conditions, each repeated eight times during scanning: narratives from a personal perspective, narratives from an external perspective, non-moral content, and scrambled content. While their brains were being scanned, participants provided intuitive ratings indicating whether they perceived each sentence as morally "right" or "wrong", and the neural activity associated with these moral evaluations was recorded.
Behavioral results indicated notable differences in moral judgments between 1st and 3rd person perspectives. The fMRI findings demonstrated distinct patterns of neural activation for each perspective. When making judgments from a first-person viewpoint, the anterior medial prefrontal cortex (PFC), posterior cingulate cortex (PCC), and temporoparietal junction (TPJ) were observed to be active. Conversely, when making judgments from a third-person perspective, the PFC, lingual gyrus, middle occipital gyrus, and hippocampus exhibited activation. The analysis also revealed shared activation in the PFC across both perspectives. Region of interest (ROI) analyses confirmed the overall main effects in PFC, precuneus, TPJ, and hippocampus, providing insights into the neural mechanisms underlying different moral perspectives.
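For readers unfamiliar with block-design fMRI analysis, the sketch below illustrates the general logic of fitting a per-voxel general linear model and computing a first-person versus third-person contrast. It is a simplified, hypothetical example with simulated data; the block timings, the omission of HRF convolution, and the array shapes are assumptions and do not reflect the study's actual acquisition or preprocessing.

```python
import numpy as np

n_scans, n_voxels = 160, 500          # assumed dimensions for illustration
rng = np.random.default_rng(2)

# Boxcar regressors: 1 during blocks of each condition, 0 elsewhere
# (hemodynamic response convolution omitted for brevity).
first_person = np.zeros(n_scans)
third_person = np.zeros(n_scans)
for start in range(0, n_scans, 40):
    first_person[start:start + 10] = 1        # 1PP block
    third_person[start + 20:start + 30] = 1   # 3PP block

X = np.column_stack([first_person, third_person, np.ones(n_scans)])  # design + intercept
Y = rng.normal(size=(n_scans, n_voxels))      # simulated voxel time series

# Per-voxel least-squares fit and the 1PP > 3PP contrast.
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
contrast = np.array([1.0, -1.0, 0.0])
effect = contrast @ beta                      # one contrast estimate per voxel
print(effect.shape)
```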
The study should be acknowledged for its limitations. Firstly, the findings may not be easily applicable to a wider population due to the small sample size of only sixteen participants. To improve the external validity of the study, it would be beneficial to include a larger and more diverse group of participants. Additionally, the study relies on fMRI technology, which measures neural activity but does not provide a direct and exhaustive understanding of cognitive and emotional processes. The interpretation of brain activation patterns remains somewhat speculative, and caution is needed in attributing specific mental states solely based on neural activity. The study's focus on hypothetical moral scenarios presented in isolation from real-world contexts might not fully capture the complexity of moral decision-making in dynamic, socially embedded situations. Furthermore, the study employs self-reported intuitive judgments, which can be influenced by individual differences in interpretation and subjective experiences. A more comprehensive examination of moral reasoning would benefit from integrating behavioral measures and exploring the ecological validity of the experimental design. Finally, the researchers appropriately recognize the importance of caution when applying neurobiological knowledge to real-life situations, highlighting the ethical concerns associated with utilizing neuroscience and neurotechnology in comprehending morality.
In brief, the research indicates that different patterns and mechanisms are involved in making moral judgments from a first-person perspective (1PP) compared with a third-person perspective (3PP), providing valuable insight into the decision-making processes underlying morally significant behaviors. The findings have implications for understanding behavior in morally charged situations, such as "Good Samaritan" acts and bystander effects. At the same time, the authors emphasize caution in interpreting the results. They advocate critically assessing how neuroscience and neurotechnology contribute to the understanding of cognition, emotion, and behavior, and stress the need for careful consideration when applying such information to real-world circumstances.
2.4. Connection
Moral decision-making is undoubtedly influenced by multiple interacting factors, and sociocultural and cognitive influences often operate together. This connection was examined in one further study [14].
The objective of this research was to explore the nature of moral perception by investigating the impact of gender, education, and religious belief on decision-making in ethical situations. The study involved 50 males and 50 females who were presented with a range of situations assessing their reactions to non-moral dilemmas, objective moral dilemmas, and subjective moral dilemmas involving emotionally charged decisions. The findings suggest no noticeable gender differences in utilitarian responses to non-moral and objective moral dilemmas; however, males gave considerably more utilitarian responses when confronted with subjective moral dilemmas. Notably, cultural factors such as education and religion did not appear to affect performance on the moral judgment tasks. The results suggest that the assessment of personal moral dilemmas involves distinct cognitive-emotional processes in men and women, potentially indicating differences in underlying neural mechanisms, and these gender-related factors could help explain real-world disparities between genders in various domains.
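To illustrate the kind of gender comparison described above, the sketch below runs an independent-samples t-test on utilitarian responses within each dilemma type. The data are simulated and the column names are assumptions; this is not the authors' analysis code.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(3)
n_per_group = 50  # 50 males and 50 females, as in the study

# Simulated utilitarian-response scores for each sex and dilemma type.
records = []
for sex in ("male", "female"):
    for dilemma in ("non_moral", "objective_moral", "subjective_moral"):
        for score in rng.normal(loc=0.5, scale=0.15, size=n_per_group):
            records.append({"sex": sex, "dilemma": dilemma, "utilitarian": score})
df = pd.DataFrame(records)

# Compare males and females separately within each dilemma type.
for dilemma, sub in df.groupby("dilemma"):
    male = sub.loc[sub["sex"] == "male", "utilitarian"]
    female = sub.loc[sub["sex"] == "female", "utilitarian"]
    t, p = stats.ttest_ind(male, female)
    print(f"{dilemma}: t = {t:.2f}, p = {p:.3f}")
```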
3. Conclusion
In conclusion, the exploration of factors influencing moral judgment presented in this review underscores the profound complexity of the subject. The studies examined provide valuable insights into the interplay of sociocultural, cognitive, and neurological elements in shaping people's moral decisions. In a world marked by diverse perspectives and cultural nuances, understanding the foundations of moral judgment becomes increasingly crucial. Both individuals and the broader community must acknowledge the importance of this issue: moral judgments not only guide personal behavior but also contribute to the fabric of social interactions and the development of ethical frameworks. Continued discussion and research in this realm therefore deserve heightened attention. Fostering a deeper understanding of the multifaceted nature of moral judgment can cultivate a more informed and empathetic society. This involves acknowledging cross-cultural variations, appreciating the impact of cognitive factors, and delving into the intricate neural mechanisms at play. Continuous research and dialogue on moral judgment will not only improve the comprehension of human behavior but also foster the development of more comprehensive and culturally aware ethical frameworks. This literature review aims to acknowledge the significance of this issue and to foster a shared dedication to ongoing exploration.
References
[1]. Malle, B. F. (2020). Moral Judgments. Annual Review of Psychology, 72(1).
[2]. Ellemers, N., van der Toorn, J., Paunov, Y., & van Leeuwen, T. (2019). The Psychology of Morality: A Review and Analysis of Empirical Studies Published From 1940 Through 2017. Personality and Social Psychology Review, 23(4), 332–366.
[3]. Arutyunova, K. R., Znakov, V. V., Hauser, M. D., & Alexandrov, Yu. I. (2013). Moral Judgments in Russian Culture: Universality and Cultural Specificity. Journal of Cognition and Culture, 13(3-4), 255–285.
[4]. Banerjee, K., Huebner, B., & Hauser, M. (2010). Intuitive Moral Judgments are Robust across Variation in Gender, Education, Politics and Religion: A Large-Scale Web-Based Study. Journal of Cognition and Culture, 10(3-4), 253–281.
[5]. Bentahila, L., Fontaine, R., & Pennequin, V. (2021). Universality and Cultural Diversity in Moral Reasoning and Judgment. Frontiers in Psychology, 12.
[6]. Greene, J. D., Nystrom, L. E., Engell, A. D., Darley, J. M., & Cohen, J. D. (2004). The Neural Bases of Cognitive Conflict and Control in Moral Judgment. Neuron, 44(2), 389–400.
[7]. Bartels, D. M., & Pizarro, D. A. (2011). The mismeasure of morals: Antisocial personality traits predict utilitarian responses to moral dilemmas. Cognition, 121(1), 154–161.
[8]. Paulhus, D. L., & Williams, K. M. (2002). The Dark Triad of personality: Narcissism, Machiavellianism and psychopathy. Journal of Research in Personality, 36(6), 556–563.
[9]. Yoder, K. J., & Decety, J. (2018). The neuroscience of morality and social decision-making. Psychology, Crime & Law, 24(3), 279–295.
[10]. Blasi, A. (1980). Bridging moral cognition and moral action: A critical review of the literature. Psychological Bulletin, 88(1), 1–45.
[11]. Christensen, J. F., & Gomila, A. (2012). Moral dilemmas in cognitive neuroscience of moral decision-making: A principled review. Neuroscience & Biobehavioral Reviews, 36(4), 1249–1264.
[12]. Greene, J. D. (2014). The Cognitive Neuroscience of Moral Judgment and Decision Making. The MIT Press EBooks.
[13]. Avram, M., Hennig-Fast, K., Bao, Y., Pöppel, E., Reiser, M., Blautzik, J., Giordano, J., & Gutyrchik, E. (2014). Neural correlates of moral judgments in first- and third-person perspectives: implications for neuroethics and beyond. BMC Neuroscience, 15(1).
[14]. Fumagalli, M., Ferrucci, R., Mameli, F., Marceglia, S., Mrakic-Sposta, S., Zago, S., Lucchiari, C., Consonni, D., Nordio, F., Pravettoni, G., Cappa, S., & Priori, A. (2009). Gender-related differences in moral judgments. Cognitive Processing, 11(3), 219–226.