An Explanation of How Biases Affect Rationality
In his book The Robot’s Rebellion: Finding Meaning in the Age of Darwin, Keith E. Stanovich presents something he calls the acceptance principle. This principle states that when people analyze problems, they tend to immediately accept the framework of the problem as it is presented to them and consequently evaluate the outcomes in terms of that frame. Stanovich offers several detailed examples in his argument concerning the flaws of human rationality. He presents a common definition of rationality and then argues, through examples, that humans do not use their rationality in this way most of the time. Stanovich briefly mentions confirmation bias in this discussion, but he never fully explains the different cognitive biases that have been researched. Therefore, the acceptance principle, or the concept of how biases affect our rationality in decision making, will be the target concept of this paper. The base concepts will be the cognitive biases that have been presented in lecture.
The first bias to be discussed is distinction bias. This bias refers to the human tendency to identify differences between two options more clearly when viewing the options at the same time rather than separately (Hsee, 1998). For example, consider two hoodies: one produced by Nike and the other a no-name brand, both made from the same material and of the same thickness. In the store, consumers will often perceive the Nike hoodie to be of higher quality, and therefore better able to perform its function (providing warmth), than the no-name hoodie. However, if a group of consumers were given both Nike hoodies and no-name hoodies and asked to rate them on warmth, the ratings would probably be about equal. That is, when each hoodie is considered alone, without the other beside it, the two are perceived as almost identical in form and function; neither can be ‘better’ than the other. When a consumer in a store is deciding between the Nike hoodie and the no-name hoodie, however, the difference in brand is far more salient than it would be if the two hoodies were observed in separate situations.
Another common bias is confirmation bias: the tendency to seek out, interpret, focus on, and remember information in a way that confirms what one already believes to be true (Plous, 1993). Confirmation bias is a major hindrance to our ability to accept new information about the world and grow as human beings. If we always focus on information that confirms our own beliefs, even when those beliefs are false, it becomes difficult to accept a competing view, even if that view is true. Confirmation bias can fuel heated emotional debates and severely inhibits human rationality, since it is quite illogical to ignore information that contradicts what you believe to be true, especially when what you believe is false. Examples of confirmation bias appear everywhere in day-to-day life and in almost every major political or social issue. Consider, for example, debates about the existence of supernatural beings or phenomena (ghosts, hauntings, and the like). Those who believe in ghosts or the supernatural will tend to read books and articles on ‘documented’ hauntings and ghost stories, which further confirms their belief that ghosts are real. Meanwhile, an individual who does not believe in the supernatural will spend their time reading about scientific explanations for the strange events that lead others to believe in supernatural intervention.
The two biases discussed above are only a fraction of the many cognitive biases identified in the literature. Other common examples include the negativity bias, the omission bias, and the outcome bias. The negativity bias is the tendency for individuals to remember negative events much more prominently than positive events. The omission bias arises when comparing two scenarios: if an individual’s action will cause harm, but the same amount of harm will be caused if the individual takes no action, the path of inaction is judged by the individual to be the more moral choice, even though both scenarios lead to the same amount of harm. The outcome bias is the tendency to judge decisions made in the past by their eventual outcomes rather than by the information that was available at the time the decision was made.
In The Robot’s Rebellion, Stanovich states that people often violate rationality principles in every aspect of their day-to-day lives, and he identifies biases as part of the cause. This paper has explained in more detail how some of those biases affect human decision-making processes. Biases undermine the notion that humans usually operate in a rational manner because they show just how susceptible to faulty thinking we are. Biases, along with other threats to our rationality (which Stanovich calls TASS), operate in the background of the human mind during the decision-making process and heavily influence the outcomes of our decisions. Since decisions often must be made quickly, it can be difficult to ‘turn off’ these biases, and so they will continue to influence human rationality. Learning to rise above our biases takes time, concentration, and knowledge of what these biases are and how they operate; identifying the biases in our rationality is the first step toward overcoming them.
Hsee, C. K. (1998). Less is better: When low-value options are valued more highly than high-value options. Journal of Behavioral Decision Making, 11(2), 107-121.
Plous, S. (1993). The Psychology of Judgment and Decision Making. Philadelphia: Temple University Press.