
3. Part three: How to promote moral competence

3.3. Framework to change System 1 thinking

Pedersen (2009) argues that capabilities such as moral language and mindfulness (of which it can be claimed that they are also "conscious" in the usual sense of the word, since they are parts of conscious experience) influence how individuals unconsciously construe (ethical) problems. Based on this perspective, this section further elaborates on the question of how we can systematically leverage System 1 thinking in order to improve ethical decision-making.

[Figure: Framework to change System 1 thinking — two strategies lead to more "ethical" System 1 judgments: (1) change of environment; (2) de-biasing cognitive barriers through conscious "moral" experiences and overriding "unethical" System 1 judgments with System 2 thought.]

3.3.1. Change of environment

One strategy for improving biased System 1 judgments focuses on "environmental" and "contextual" factors, so that System 1 thinking will lead to "good" results (Milkman, Chugh & Bazerman, 2008). Rather than trying to push System 1 thinking "harder" towards more conscious, principle-based System 2 thinking, the change of environment strategy - discussed, amongst others, by Sunstein & Thaler (2003) - suggests that in order to improve biased decision-making, we must "leverage our automatic cognitive processes and turn them to our advantage".

The change of environment approach calls for designing the situations in which choices are made so as to increase the probability that individuals make "wise" (i.e. ethical) decisions and construe and solve ethical problems in a rich manner. Several authors have contributed to the question of what must be changed in order to promote more ethical decisions, for example through empirical studies on the influence of the presence of a code of conduct on ethical behavior.

Gino, Ayal & Ariely (2009) highlight the social process component of the change of environment perspective by showing that when people observe an in-group member behaving dishonestly, they are more likely to engage in dishonesty themselves, but the effect disappears when observing an out-group member. Shu, Gino & Bazerman (2011) assert more generally that "providing the opportunity to cheat leads to increased moral disengagement" as compared to a control condition that does not allow cheating. The authors claim that making morality salient not only reduces cheating in these studies, it also keeps individuals' judgments scrupulous, a claim supported by having participants read an honor code prior to a problem-solving task, which led to a reduction in moral disengagement. Ayal & Gino (2011) conclude that when the opportunity costs of acting ethically are high25, a higher share of individuals crosses ethical boundaries and acts unethically, while maintaining a positive image of themselves as "ethical" individuals.

A remaining issue of the change of environment approach is, however, that individuals do not like to admit that they are susceptible to cognitive biases and therefore feel uncomfortable acknowledging their egocentrism and errors in judgment (and hence also in designing "favorable" situations), even to themselves, as Milkman, Chugh & Bazerman (2008) highlight. Humans place (too) much trust in their intuition (which goes along with the decision aid "listen to your heart"26), since they believe that their intuitions are "natural" and impossible to change. This paper, however, argues that System 1 intuitions can be changed, and that part of the solution to reducing cognitive biases depends on individuals acknowledging their biases (i.e. recognizing the need for specific training and for designing favorable environments) and committing consciously to "improving their unconsciousness".

25 This perspective strongly resembles Zsolnai's model of the "Moral Economic Man", which will be referred to at a later point in this thesis.

3.3.2. Changing the unconscious through conscious experiences

The argument brought forward in this thesis is that moral competence can be promoted through real-life exposure to situations where cognitive biases are challenged - that is, unconscious processes such as moral disengagement can be counteracted through the (from a normative perspective) "right" experiences. This includes mindful, conscious engagement with diverse theories (as well as engaging in a diverging learning style), as proposed by Pedersen (2009), in order to promote a multi-perspective exploration of problems.

Improving or changing the outcome of System 1 thinking is particularly important for decision-making biases that individuals do not like to admit, or do not believe, they are susceptible to.

For instance, as highlighted by Nosek, Greenwald & Banaji (2007), many individuals are susceptible to implicit racial biases but feel uncomfortable acknowledging this fact, even to themselves. In this context, conscious efforts to "do better" on implicit bias tests are often found to be futile and are not believed to change the intuitive judgment itself. This paper claims that through the "right" conscious experiences it is possible to align our System 1 judgments more and more with our conscious normative views. Referring to the racial bias, this means that by exposing ourselves to situations where the bias is active, and by consciously discussing (and writing down) our normative view of these situations and biases, we can promote unconscious "proactiveness"27.

This latter claim is derived from organizational development theory. Organizational theory has developed an approach to deal with a company's "reactiveness" to ethical challenges and unconscious "blindness" to ethical challenges. This paper presents a framework from organizational development theory whose insights are believed to be helpful in describing the process by which individuals' moral System 1 competence can be promoted. The framework believed to be useful for dealing with individuals' unconscious reactiveness is Egon Zehnder International's process diagram of phases of organizational capability development. Egon Zehnder International (2009) describes the different phases of organizational capability development through which companies facing sustainability and CSR challenges (such as, for example, the implementation of a climate change adaptation strategy) can evolve over time: from being unconsciously reactive to being consciously reactive and consciously proactive, before reaching the desired state, an unconsciously proactive orientation (Figure 5 and Appendix).

26 The perspective of this paper suggests that one should only listen to one's heart if one's intuitions are aligned with one's normative views.

27 One saying in line with this claim is that stereotyping occurs as a consequence if you know seven or fewer people "well" enough from a certain group (i.e. you do not engage deeply enough in conscious discussions and experiences with a certain group to change your bias towards them).

A company evolves through three distinct transitions between these stages, each of which requires different levels of organizational capability to address sustainability-related issues. This paper suggests that the same may apply to individuals becoming more proactive with regard to cognitive biases: in order to reach the desired state of being unconsciously proactive (in a normative sense), individuals should engage in consciously proactive behavior to improve their unconscious proactive capabilities.

Figure 5: Phases of organizational capability development (Egon Zehnder International, 2009)

In other words, it is claimed that we must commit ourselves to applying the descriptive knowledge we possess about how we learn and develop in order to realize the normative views we hold of how much better decision-making can be.

Consequently, in order to reduce cognitive biases and their consequences (such as, amongst others, bounded ethicality, racism and sexism), we must gather "the right" conscious experiences, designed on the basis of descriptive knowledge about human boundedness. Bartsch, Cole & Wright (2005) write that by experiencing real situations, we obtain knowledge which is activated in a similar, new situation and which can lead to a "new" intuitive reaction.

This view can be combined with the previous argument on the use of analogy, in the sense that we must commit ourselves to making consciously proactive ethical views a "habit". The work of Moran, Ritov & Bazerman (2008) supports this idea, as they assert that people who are encouraged to see and understand the common moral principle


underlying a set of seemingly unrelated tasks generally demonstrate an improved ability to discover solutions in a different task that relies on the same underlying principle (in Milkman, Chugh & Bazerman, 2008). In other words, instead of making cost-benefit analysis a habit, we need to consciously commit to making ethical approaches to problem-solving and cognitive bias reduction mechanisms a habit. As an example of such an exposure to cognitive biases, a community outreach program is discussed in chapter 3.5.