The human brain is complex, especially as a subject of study in decision-making and problem solving. The brain uses precise methods of compression to distinguish the most important features of incoming sensory data, yet because of human error these methods are not perfect. Humans take in large quantities of sensory data each day, amounting to terabytes' worth. Most of what a person sees within the day is erased from memory, but tiny pieces of data remain. These are converted into symbolic form, which connects to the person's experiences whenever they are encountered again.
When sensory data is abstracted into symbols and those symbols are later retrieved from long-term memory, certain biasing effects arise. "Biases also operate when the symbols are invoked and manipulated for cognitive operations." These effects shape our belief systems; two prominent examples are representation and anchoring. Anchoring refers to the process by which people form beliefs around an initial reference point, the anchor, and interpret every piece of incoming data in relation to it, even when the anchor is largely irrelevant. Representation "occurs when people expect their outputs to resemble the generating process."
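The anchoring idea described above can be made concrete with a toy sketch. The model below is purely illustrative, an assumption of this text rather than any formal model from the psychology literature: the final estimate starts at the anchor and adjusts only part of the way toward the true value, so it stays biased toward whatever anchor a person happened to start from.

```python
def anchored_estimate(anchor, true_value, adjustment=0.5):
    """Toy anchoring-and-adjustment model: the estimate moves only
    a fraction of the way from the anchor toward the true value,
    so it remains biased toward the (possibly irrelevant) anchor."""
    return anchor + adjustment * (true_value - anchor)

# Two people estimate the same quantity (true value 100),
# starting from different, arbitrary anchors.
low_start = anchored_estimate(anchor=10, true_value=100)    # 55.0
high_start = anchored_estimate(anchor=400, true_value=100)  # 250.0
```

Both estimates miss the true value of 100, and each misses in the direction of its own anchor, which is the signature of the anchoring bias.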
Yet representation is not always efficient or true, mainly because of human bias. Representation and anchoring are examples of heuristics, more commonly described as "rules of thumb" that humans use to reason in cognitively economical ways. These are built into the human brain and are the same for everyone, just as we all have a pair of hands and a pair of eyes. The study of heuristics began in the late 1960s and early 1970s with Amos Tversky and Daniel Kahneman, who focused their research on human judgment.
Their account of heuristics replaced models of rational judgment and algorithmic calculation: they theorized that judgment under uncertainty rests on a limited number of heuristics rather than on more complicated methods. The heuristics framework became accepted and spread across almost all fields of knowledge: economics, medicine, law, psychology, and political science. The work was revolutionary in its time because "it simultaneously questioned the descriptive adequacy of ideal models of judgment and offered a cognitive alternative that explained human error without invoking motivated irrationality."
Kahneman and Tversky's work revolved around the assumption of "bounded rationality." Their studies showed that humans are very limited in information processing and prone to erroneous judgment, and that earlier models of judgment did not fit humans because they were much simpler than what really happens in human decision making. Yet even after wide acceptance and a move away from the older rational decision-making models, in which humans were thought always to choose the best option by computing probabilities, heuristics are still seen to have inconsistencies and to be laden with biases.
The concept of heuristics gives a structured way of solving problems that takes human brain function and capacity into account, which inevitably makes the process easier. Compared with the old model of thinking, in which humans were assumed to know probability and always choose the best option by probability computation, heuristics give a deeper understanding of the human condition. Heuristics fail, however, when presented with data outside their "domain of expertise," that is, outside what has already been encountered and calculated. Biases are the key source of error when heuristics are used for problem solving.
A cognitive bias is defined as "any of a wide range of observer effects identified in cognitive science and social psychology including very basic statistical, social attribution, and memory errors that are common to all human beings." Biases directly related to decision making and problem solving even affect scientific methods that were specifically designed to eliminate them. Biases in heuristics are difficult to notice for three reasons. First, the human thinking process used to judge and assess problems is itself full of biases.
Second, biases are so common and widespread that they are difficult to notice. Third, decisions made through heuristics feel good and therefore satisfy the person, regardless of whether they are right or wrong. According to a University of Pennsylvania Law School research paper, principal findings in behavioral economics and cognitive psychology over the years have shown that humans "deviate from ideal precepts of rationality in many settings, showcasing inconsistent judgment in the face of framing and other formal manipulations of the presentation of problems."
In the research paper "Heuristics and Biases in Thinking About Tax," the authors suggest that citizens, especially in the United States, suffer from a wide range of biases in understanding the basic features of tax-law design and reform, including the perceptual biases more commonly studied in private markets, such as the evaluation of "risky choice" and consumer finances. The paper's main goal was to show that in evaluating the country's tax systems, citizens are vulnerable to a wide range of heuristics and biases, which lead to inconsistent judgment and evaluation.
The prevalence of these biases shows that there is indeed room for "skillful" politicians and facile political systems to "manipulate public opinion, and that tax system design will reflect a certain volatility on account of the possibility of eliciting preference reversals through purely formal rhetorical means." Because of the inconsistencies and biases of heuristics, decision theorists have studied this phenomenon more closely. It grew into a respected field, founded by Kahneman and Tversky, commonly known as "heuristics and biases."
Heuristics may work well in problem solving, but they can also turn into harmful biases. One example is framing: viewing a need in the real world as a "problem" you can work on solving; its counterpart bias is mistaking your view of the problem for the real need. The status quo heuristic, "business as usual" or "if it ain't broke, don't fix it," may incur a bias against anything new. Cognitive overconfidence amounts to decisiveness and a refusal to be haunted by doubt, which may lead to the bias of self-delusion.
The prudent estimation heuristic means making "conservative estimates," which may lead to missed opportunities, a danger especially acute in group problem solving. The most likely scenario heuristic avoids wasting time on possibilities that probably will not happen, but its bias is that rare events can be the most important. Guessing at patterns means quickly spotting the trend or the big picture, with a corresponding bias of "outguessing randomness" and seeing patterns that do not exist.
The last example is recall ability, or availability, which implies that if an idea does not fit the obvious data, it is surely suspect. The corresponding bias is that non-obvious things can be the most important, or even the most common. These heuristics are common in everyday life, and such rules of thumb do help in assessing situations such as business deals, economic decisions, or day-to-day domestic problems. It is well known, however, that these heuristics can fail predictably; these failures are also called "hidden traps," into which a person falls by succumbing to the counterpart bias.
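The availability bias just described can also be sketched as a toy model. The code below is an illustrative assumption of this text, not a model from the literature: a person judges how common an event is by counting only the instances that come to mind easily, modeled here as the most recent memories, and so overestimates anything that happens to be recent and memorable.

```python
def availability_estimate(memories, window=5):
    """Toy availability heuristic: judge an event's frequency by
    counting only the most recent, easily recalled memories,
    ignoring the full record."""
    recent = memories[-window:]
    return recent.count(True) / len(recent)

# Full record: the event occurred on 2 of 10 days (true rate 0.2),
# but both occurrences happen to be recent and easy to recall.
record = [False] * 8 + [True, True]
biased = availability_estimate(record)        # 0.4 (recent window only)
true_rate = record.count(True) / len(record)  # 0.2 (full record)
```

The recalled-window estimate (0.4) is double the true rate (0.2) simply because the occurrences were recent, which is the predictable failure, or "hidden trap," of relying on what is easiest to remember.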
It is a given that heuristics bring about inconsistencies and biases, but there are methods of control. For the framing heuristic, for example, the advice is not to accept the initial framing automatically, to strive for objective, neutral framing, and to challenge other people's framings. Such remedies to biased heuristics generally help in problem solving, whatever stage of the problem the person is at.