Appendix 3: Fallacies, Biases, and Archetypes

A3.1 Logical Fallacies

Logical fallacies are flaws in reasoning. These flaws undermine objectivity and have negative consequences for communication, problem resolution, and fostering an environment of teamwork. Logical fallacies are different from the cognitive biases we discussed in the chapter: a logical fallacy is an error in a logical argument, while cognitive biases are limitations in thinking and knowledge. These limitations originate in memory, social attribution, and errant calculations. Below is a list of logical fallacies that you may find in the workplace.

■ Strawman: the exaggeration or misrepresentation of another person’s argument, which makes it easier to present one’s own position as well reasoned and reasonable.

■ False cause: connecting one thing to another, real or perceived, and claiming the former is the cause of the latter; correlation is not causation. Sometimes these events are merely coincidental.

■ Appeal to emotion: the attempt to move the discussion from its merits to evoking an emotional response, rather than making a valid or compelling argument.

■ Gambler’s fallacy: the belief that runs occur in statistically independent phenomena, such as the flipping of a coin. Each flip is an independent event with 50/50 odds; if you have a run of 10 heads, the probability of heads on the next flip is still 50/50.

■ Bandwagon: the appeal to popularity, or the fact that others do something, as an attempted form of validation.

■ Appeal to authority: because an authority says something, it therefore must be true. This is not an argument but a way to shut down an argument. There are experts, and what they say may be true, but simply assuming something is valid because somebody viewed as an expert said it does not make it so.

■ Fallacy fallacy: the presumption that because a claim has been poorly stated or argued, or a fallacy has been committed, the entire claim must be wrong.

■ Slippery slope: if we do x, then it will lead to y. In this fallacy the issue at hand is avoided and extrapolated to an extreme hypothetical without strong supporting evidence.

■ Ad hominem: attacking a person’s character, appearance, or personal traits rather than the merits of the argument, in order to undermine that argument.

■ Composition: the assumption that what applies to one part of something must apply to all other parts of that thing, or that what applies to the whole must apply to its parts.

■ No true Scotsman: The making of appeals to purity as a way to dismiss relevant criticisms or flaws in the argument.

■ Genetic: the judgment of something as good or bad on the basis of where or from whom the idea originates.

■ Tu quoque: answering criticism with criticism, also known as the appeal to hypocrisy. It is a way of shifting the conversation and deflecting attention so that one need not defend against the other’s argument.

■ Personal incredulity: because you do not understand something or find it difficult to comprehend, you act as if it is probably not true.

■ Special pleading: clinging to a way of thinking in the face of evidence that debunks your perspective, moving the goalposts to allow the prior argument to remain.

■ Black or white: the proposition that only two alternatives are possible when many more exist. Also known as the false dilemma.

■ Begging the question: a circular argument in which the conclusion is included in the premise, originating from an assumption so deeply held that the assumption is deemed valid.

■ Appeal to nature: the argument that because something is natural it is therefore valid, justified, and inevitable, good or ideal.

■ Loaded question: loaded question fallacies are particularly effective at derailing rational debates because of their inflammatory nature - the recipient of the loaded question is compelled to defend themselves and may appear flustered or on the back foot.

■ Burden of proof: the claim that the burden of proof lies not with the person making the assertion but with someone else to disprove it. Just because others are unable or unwilling to disprove an assertion does not make it valid.

■ Ambiguity: the use of double meaning or ambiguity of language to mislead, misdirect, or misrepresent the truth; obfuscation.

■ Anecdotal: the use of personal experience, no matter how isolated, as evidence rather than sound arguments and compelling evidence.

■ The Texas sharpshooter: using cherry-picked data to fit your argument, or finding a pattern to fit a presumption, rather than developing a hypothesis based upon the evidence.

■ Middle ground: the claim that a compromise, or middle point, between two extremes must be the truth.
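The gambler’s fallacy above lends itself to a quick numerical check. The sketch below is a minimal simulation (assuming a fair coin and an arbitrary run length of five): it estimates the probability of heads immediately following a run of heads, which stays at 50/50.

```python
import random

random.seed(42)  # reproducible run

# Simulate many fair coin flips; True represents heads.
flips = [random.random() < 0.5 for _ in range(1_000_000)]

RUN = 5
# Collect the outcome of every flip that was preceded by RUN heads in a row.
after_run = [flips[i] for i in range(RUN, len(flips)) if all(flips[i - RUN:i])]

p_heads = sum(after_run) / len(after_run)
print(f"P(heads | {RUN} heads in a row) ≈ {p_heads:.3f}")  # stays near 0.5
```

Each flip is generated independently, so the long run of heads carries no information about the next outcome; the estimate lands near 0.5 regardless of the run length chosen.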

There are many more fallacies and impediments to clear thinking than the selection just provided; this is but a brief view. Clear thinking is more difficult than we realize, and many times we are not even aware of our lapses. These fallacies can make it difficult to learn, and to distribute any learning.

A3.2 Cognitive Bias

In the previous section we discussed teaching and how teaching is just the facilitation of experiences that produce some form of behavioral modification. This section deals with the processing of those experiences. Cognitive bias is the use of past experiences, or the remembrance of past experiences (how we remember them may be inaccurate or colored by our position at the time), to determine our actions in current situations. While this helps us process new experiences quickly by relating them to previous ones, it comes with the potential for inaccurate assessments or interpretations of a situation. Another part of cognitive bias is its use of the individual’s own likes and dislikes; this emotional context can, and commonly does, color our decisions. We all have our own perspective on every situation. I like to use the example of the angle of view of a quarter: if we look at a quarter standing on its edge, we could assume it is a line; if we look at it from the side while it is lying down, it could appear to be a dash; and if we look at it from the front, we see it is a quarter. While the quarter never actually changed, the position (perspective) of the individual viewing it can have a drastic effect on what they think they are seeing.

For example, let’s consider a few of those biases, starting with confirmation bias. Confirmation bias impacts product development and project management in many ways. We seek information that supports what we already believe, and when we find it, we stop looking and proceed or make some decision. The problem, as pointed out by the philosopher Karl Popper, is that finding positive evidence does not confirm what you believe to be true; it just gives you the illusion that it is true. Hypotheses are explored for veracity by finding information that refutes our thinking and the hypothesis (falsification). Evidence that confirms does not mean what we believe is in fact true. There are many cases where a product passes testing, all appears well, and yet the product has catastrophic problems in the field, leading to recalls and legal action.

There is a tendency to over-interpret small runs of data without understanding the sampling and what it means. Those reviewing the data can see “patterns” in the data and take action on what appears to be a pattern; however, the truth is there is no pattern, or our view of the pattern is incorrect. Bad data and misinterpreted data amount to the same thing: failure.
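The over-reading of small runs can be made concrete. The sketch below is illustrative only (a pure-noise process with a true mean of 10.0 and arbitrary sample sizes): it compares how widely the averages of small samples scatter versus large ones.

```python
import random
import statistics

random.seed(7)  # reproducible run

# The "process" is pure noise: readings with a true mean of 10.0.
def reading():
    return random.gauss(10.0, 1.0)

# Averages of many small samples scatter far more widely than averages of
# large samples, which is why short runs of data so easily suggest a shift
# or pattern that is not really there.
small_means = [statistics.mean(reading() for _ in range(3)) for _ in range(10_000)]
large_means = [statistics.mean(reading() for _ in range(300)) for _ in range(10_000)]

print(f"spread of 3-point averages:   {statistics.stdev(small_means):.3f}")
print(f"spread of 300-point averages: {statistics.stdev(large_means):.3f}")
```

With only three readings, an average can drift well away from the true value, inviting a story about a trend; with three hundred readings, the same process looks flat.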

In general, people are optimistic, in my experience. We occasionally find the “Marvin the Robot” types from The Hitchhiker’s Guide to the Galaxy by the late, great Douglas Adams, but by and large those are exceptions to the rule. This optimism becomes a negative thing with optimism bias, when we are trying to determine what will likely happen. We delude ourselves about the probability of success and the risks associated with our work and the product. We set our project up without due diligence regarding the associated risks; that is, there is little thinking about what can go wrong. Similarly, when it comes time to launch the product, in the absence of information telling us there could be a defect (see confirmation bias), we put on our rose-colored glasses, and off we launch, believing nothing can go wrong.

Lastly, we will review survivor bias. We are subject to survivor bias when we review only our successes to find a common theme, thinking that if we do what those earlier projects did, we should be successful. The problem comes when we look only at the success stories. It is possible that the things we attribute to those projects’ successes were also present in the failing projects, and something else drove the successful projects to success. Without looking at the failures, we never know. My friend Kim Pries used to say, “If we walk into Toyota’s bathroom and it is tiled in blue, does that mean that if we tile our bathrooms in blue, we will have a successful project and company?”

The impact of cognitive biases can be helpful or extremely detrimental to any progress or process. This is especially true for teaching and learning, where the basic premise is providing an experience to modify a behavior. Getting past these biases requires a safe environment, one where thoughts can be explored without retribution. As we will discuss in the next section, Communication, the more inferences applied to a topic, the more likely there is to be some error in its assessment. If a new experience is related to an incorrect prior experience, or the recipient’s perspective is not that of the individual delivering the experience, what is learned from the experience most probably will not be what was intended. We all know diversity can produce great ideas; that premise would seem to counter the idea that cognitive biases create the potential for issues. It is not diversity or cognitive bias that creates anything; it is how they are used that creates the gain or loss. We should view both as tools. The tool neither makes nor breaks how the work is done; how it is employed determines that. Understanding that diversity and cognitive biases exist, and engaging these different perspectives to determine the reality of a situation, can provide a better starting point and path to the goal at hand.

A3.3 Heuristics

Heuristics are mental shortcuts used to reduce cognitive load; a rule of thumb is an example of a heuristic. These shortcuts allow us to arrive at a reasonably accurate conclusion without taking the mental load and time to calculate or decide perfectly. They are developed through the experiences of the individual, and as such it is not likely that any two individuals’ shortcuts will be perfectly congruent; different experiences produce different mental shortcuts. Heuristics can sometimes lead to biases.

A3.3.1 Availability Heuristic

The availability heuristic is born out of what comes to mind immediately when a decision is required, drawing from experiences and knowledge that are relevant to the decision at hand.

“Are there more words that begin with “r” or that have “r” as their third letter?” To answer this question, you can’t help but bring specific words to mind. Words that begin with “r” are easy to think of; words that have “r” as their third letter are harder to think of, so many people answer this question with “words that begin with ‘r’” when in fact, that’s the wrong answer.[1] [2]

A3.3.2 Representativeness Heuristic

A related shortcut, the affect heuristic, is making choices based on the emotions present at the time of the decision; the mood impacts the decision results. The representativeness heuristic, in contrast, judges likelihood by resemblance to a prototype or stereotype. “Linda the bank teller” is one of the most famous examples. It comes from the work of Kahneman and Tversky. In this problem, you are told a little bit about Linda and then asked what her profession is likely to be. Linda is described as an avid protester who went to an all-girls college. She is an environmentalist, politically liberal, etc. (I’m making up these details, but the information that subjects got in this study is quite similar.) Basically, she’s described in such a way that you can’t help but think she must be a feminist, because the prototype/stereotype in your head is that women like Linda are feminists. So when people are asked whether Linda is more likely to be a bank teller (working for The Man!) or a feminist bank teller, most people say the latter, even though that makes no sense in terms of probability. In this case, people used a shortcut involving a stereotype to answer the question, and they ignored the actual likelihoods.
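The “Linda” answer violates a basic rule of probability: a conjunction can never be more probable than either of its parts. A minimal sketch of the rule, using made-up numbers (not figures from the original study):

```python
# Conjunction rule: P(A and B) <= P(A) for any events A and B.
# All numbers below are illustrative, not from Kahneman and Tversky's study.
p_teller = 0.05                  # P(Linda is a bank teller)
p_feminist_given_teller = 0.90   # even if nearly every such teller is a feminist...
p_both = p_teller * p_feminist_given_teller

assert p_both <= p_teller        # ...the conjunction is still the less likely option
print(f"P(teller) = {p_teller:.3f}  P(feminist teller) = {p_both:.3f}")
```

However strong the stereotype, the set of feminist bank tellers is a subset of bank tellers, so its probability cannot be larger.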

While availability has more to do with memory of specific instances, representativeness has more to do with memory of a prototype, stereotype, or average.

A3.4 Archetypes

An archetype is a model of an observed system or behavior that is readily identified by the observer as belonging to a specific category or classification.

A3.4.1 Systems Archetype

System archetypes are patterns of behavior of a system. Systems expressed by circles of causality therefore have similar structures. Identifying a system archetype and finding its leverage point enables efficient change in a system. Examples of system archetypes, along with possible solutions to the problems they describe, are provided below. A fundamental property of nature is that no cause can affect the past; system archetypes do not imply that current causes affect past effects.

A3.4.1.1 Circles of Causality

Circles of causality is the concept that all events have some preceding cause: an observable phenomenon is the result of some other system or subsystem event or events. For every action there is some effect; it may be largely imperceptible, and we may not know where to look, but there is some impact. Think of everything we do as a net where everything is connected, perhaps not immediately or appreciably, but connected.

A3.4.1.2 Reinforcing Feedback

Reinforcing feedback (or amplifying feedback) accelerates the given trend of a process. If the trend is ascending, the reinforcing (positive) feedback will accelerate the growth. If the trend is descending, it will accelerate the decline. An avalanche falling, or a snowball rolling downhill, is an example of a reinforcing feedback process.
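The acceleration in both directions is easy to see in a toy model. In the sketch below, each step’s change is proportional to the current state (the 10% gain per step and starting value are arbitrary illustrations):

```python
# Reinforcing feedback: the change at each step is proportional to the
# current state, so growth compounds and decline compounds likewise.
def reinforce(state, gain, steps):
    trajectory = [state]
    for _ in range(steps):
        state += gain * state  # the feedback amplifies the state itself
        trajectory.append(state)
    return trajectory

growth = reinforce(100.0, 0.10, 10)    # snowball rolling downhill
decline = reinforce(100.0, -0.10, 10)  # the same loop running in reverse

print(f"ascending trend after 10 steps:  {growth[-1]:.1f}")   # 259.4
print(f"descending trend after 10 steps: {decline[-1]:.1f}")  # 34.9
```

The same loop structure produces either compounding growth or compounding collapse; only the sign of the gain differs.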

A3.4.1.3 Balancing Feedback

Balancing feedback (or stabilizing feedback) operates whenever a goal state exists. A balancing process intends to reduce the gap between a current state and a desired state. The balancing (negative) feedback adjusts the present state toward the desired target regardless of whether the trend is descending or ascending. An example of the balancing feedback process is staying upright on a bicycle while riding.
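Balancing feedback can be sketched the same way. In this minimal model (the 30% gain per step and the goal of 50 are arbitrary), each step closes part of the remaining gap, so the state converges on the goal from either side:

```python
# Balancing feedback: each step closes a fraction of the gap between the
# current state and the goal, whichever side of the goal we start on.
def balance(state, goal, gain, steps):
    for _ in range(steps):
        state += gain * (goal - state)  # the feedback shrinks the gap
    return state

print(f"starting below the goal: {balance(0.0, 50.0, 0.3, 20):.2f}")   # converges toward 50
print(f"starting above the goal: {balance(90.0, 50.0, 0.3, 20):.2f}")  # converges toward 50
```

Whether the trend begins below or above the target, the negative feedback drives the state to the same goal, just as a rider continually corrects toward upright.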

A3.4.1.4 Limits to Growth

Growth is produced by a reinforcing feedback process until the system reaches its peak. The halt of this growth is caused by limits inside or outside the system. However, if the limits are not properly recognized, the former methods continue to be applied, only more and more aggressively. This results in the contrary of the desired state: a decline of the system. The solution lies in weakening or eliminating the cause of the limitation. Examples: dieting, learning foreign languages.
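The limits-to-growth pattern is commonly illustrated with a logistic model: a reinforcing loop drives growth in proportion to the current state, while a balancing loop throttles it as the state nears a carrying capacity. A minimal sketch (all parameter values are illustrative):

```python
# Limits to growth: a reinforcing loop (growth proportional to the state)
# coupled with a balancing loop (growth fades near the carrying capacity).
def limits_to_growth(state, rate, capacity, steps):
    trajectory = [state]
    for _ in range(steps):
        state += rate * state * (1 - state / capacity)  # logistic update
        trajectory.append(state)
    return trajectory

traj = limits_to_growth(1.0, 0.5, 100.0, 40)
print(f"early on, growth looks exponential: {traj[5]:.1f}")
print(f"later, the limit takes over:        {traj[-1]:.1f}")  # plateaus near 100
```

Pushing the rate harder does not move the plateau; only changing the capacity term (the limit itself) does, which is the archetype’s point about addressing the limiting factor rather than the growth engine.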

A3.4.1.5 Shifting the Burden

The problem is handled by a simple solution with immediate effect, thereby “healing the symptoms.” The primary source of the problem is overlooked, because its remedy is demanding and has no immediate outcome. The origin of the problem should be identified and solved over the long run, during which the addiction to the symptomatic remedy decreases. Examples: drug addiction, paying debts by borrowing.

A3.4.1.6 Eroding Goals

A variant of the shifting-the-burden archetype. As current problems need to be handled immediately, the long-term goals continuously decline. It can be avoided by sticking to the vision. Examples: balancing the public debt, sliding limits of environmental pollution.

A3.4.1.7 Escalation

This archetype can be seen as a non-cooperative game in which both players suppose that only one of them can win. They respond to the actions of the other player to “defend themselves.” The aggression grows and can result in self-destructive behavior. The vicious circle can be broken by one agent ceasing to react defensively and turning the game into a cooperative one. Example: an arms race.

A3.4.1.8 Success to the Successful

Two people or activities need the same limited resources. As one becomes more successful, more resources are assigned to it. The other becomes less and less successful due to lacking resources, thus “proving” the rightness of the decision to support the first. Problems occur if the competition is unhealthy and interferes with the goals of the whole system. The two activities or agents might be decoupled, the competing elements eliminated, or perhaps they should receive a balanced amount of resources. Examples: two products at one company, work vs. family.

A3.4.1.9 Tragedy of the Commons

Agents use a common limited resource to profit individually. As the use of the resource is not controlled, the agents would like to continuously raise their benefits.

The resource is therefore used more and more, and the revenues of the agents are decreasing. The agents are intensifying their exploitation until the resource is completely used up or seriously damaged. To protect common resources some form of regulation should be introduced. Example: fish stocks (The Fishing Game).
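The collapse dynamic can be sketched in a few lines. This toy model is illustrative only (the stock size, regrowth rate, and escalation factor are all made-up parameters): each agent harvests a little more every round from a regenerating but finite common resource, and with no regulation the stock is driven to exhaustion.

```python
# Each agent harvests a little more every round from a regenerating but
# finite common resource; with no regulation, the stock collapses.
def commons(stock, agents, harvest, escalation, regen, rounds):
    capacity = stock
    history = [stock]
    for _ in range(rounds):
        stock += regen * stock * (1 - stock / capacity)  # natural regrowth
        stock -= agents * harvest                        # total extraction
        harvest *= escalation                            # everyone takes a bit more
        stock = max(stock, 0.0)                          # a stock cannot go negative
        history.append(stock)
    return history

h = commons(stock=1000.0, agents=10, harvest=5.0, escalation=1.15,
            regen=0.10, rounds=30)
print(f"stock after 30 rounds: {h[-1]:.1f}")  # exhausted with these parameters
```

Capping the per-agent harvest (a stand-in for regulation) keeps extraction below the regrowth rate and lets the stock persist, which is the remedy the archetype prescribes.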

A3.4.1.10 Fixes that Fail

In the fixes-that-fail archetype, the problem is solved by some fix (a specific solution) with an immediate positive effect. Nonetheless, the “side effects” of this solution turn up in the future, and the best remedy then seems to be to apply the same solution again. Examples: saving costs on maintenance, paying interest with other loans (carrying their own interest).

A3.4.1.11 Growth and Underinvestment

The limit to growth here is the current production capacity. It can be removed by sufficient investment in new capacity. If the investment is not aggressive enough (or is too low), the capacities become overloaded, the quality of services declines, and demand decreases. This archetype is especially important in capacity planning; for example, a small but growing company.

A3.4.1.12 Unintended Consequences

Unintended consequences are those things that arise out of actions we take that we did not anticipate. Experience suggests these unintended consequences originate from incomplete thinking or singular perspective about the topic under consideration. We do not take the time to explore the consequences beyond our immediate consideration.

To be sure, there are things we do not know, or will not know, until we undertake specific actions. There are ways to minimize the probability and the ensuing damage by using multiple perspectives, and by not making large sweeping changes but rather performing tests and exploring what can happen without putting the entire department or organization at risk.

Include multiple perspectives when considering or evaluating the alternatives at hand and the change we should undertake. This multiple-perspective approach can help moderate biases and can certainly generate additional ideas on the topic through conversations that help us discover underlying assumptions and relevant experiences. In this way we may be able to ascertain some of the consequences of our decision before acting, perhaps saving us unnecessary time and undue stress. Experience suggests that many of these unintended consequences could have been predicted if time and effort were first spent exploring the proposed idea. It is important not to stigmatize failure. Failure is where we discover something new, perhaps something in opposition to what we believed to be true. The fear of unintended consequences should not prohibit exploration. Some things cannot be known until effort is expended.

Another appropriate approach that is less risky is to devise small scale experiments to learn about our actions. For example, the Total Quality Management approach devised by Shewhart and Deming advocates this incremental exploration via experiments and a review of the results. These results provide more information about the subject we are exploring, and help us to understand these consequences better, or perhaps open our eyes to things we had not considered previously.

  • [1] Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13(1), 1–17.
  • [2]