Cognitive Biases

Biases are cognitive distortions. With a less negative connotation, they are also called heuristics, snap judgments, or mental shortcuts - many names for similar phenomena that significantly influence our perception, opinion formation, memory, and decision-making, often through unconscious mechanisms.

Psychology now describes well over a hundred (!) such thought patterns. They are not good or bad, helpful or harmful per se. Above all, they are one thing: human.

On this page we present - with deliberately concise explanations and examples - some of the most important biases.

Anchoring effect

When making decisions, people are influenced by information (e.g. numbers) that they hear first - regardless of whether this information has anything to do with the actual decision.
Example: In a salary negotiation, whoever names an initial sum first sets the anchor. The other party then orients itself to this figure - consciously or not.

Availability heuristic

The easier it is for a person to recall a piece of information (e.g. an event), the more its frequency or importance is overestimated.
Example: If a person often reads reports about bank robberies in a short period of time, the general probability of such crimes is often estimated to be higher than objective statistics would indicate.

Bandwagon or follower effect

This bias describes the tendency to adopt the opinions or decisions of others - and the larger the group, the stronger the pull.
Example: Many voters tend to cast their ballots for politicians they believe will win in the end.

Mere Exposure Effect

Merely perceiving a piece of information or an object repeatedly can by itself lead to a more positive evaluation.
Example: The more familiar a person becomes, the more attractive and likeable they can appear.

Social desirability effect

Respondents tend to give answers that they believe will meet with social approval.
Example: In pre-election surveys, many citizens state centrist parties as their preference, but then actually vote for extreme parties (left or right).

Not-invented-here Syndrome

The devaluation of ideas that did not originate or were not developed within one's own company.
Example: External consulting firms receive little consideration or recognition when they challenge existing strategies.

Clustering illusion

This describes the human tendency to ascribe meaning to random patterns (which are bound to occur in sufficiently large data sets).
In short: recognizing patterns where there are none at all.
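A small simulation illustrates the point: even in purely random coin flips, seemingly "meaningful" streaks appear. The flip count and seed below are arbitrary illustrative choices:

```python
import random

random.seed(42)

# 200 fair coin flips -- pure chance, no underlying pattern
flips = [random.choice("HT") for _ in range(200)]

# Find the longest run of identical outcomes
longest = run = 1
for prev, cur in zip(flips, flips[1:]):
    run = run + 1 if cur == prev else 1
    longest = max(longest, run)

print(longest)  # long streaks arise by chance alone
```

In 200 fair tosses, streaks of six or more identical outcomes are entirely typical - yet observers readily read intent or a "hot hand" into them.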

Confirmation Bias

Describes the tendency to take in and perceive information that matches one's own worldview - or, in other words, to select and interpret information so that it confirms one's own expectations.
Example: Internet users take into account only the information on social media channels that matches their attitudes (or prejudices). People thus hear and read only what they want to hear and see.

Priming

Describes the effect of activating implicit memory, for example with words or pictures.
Example: If people are confronted with a series of terms from the context of "aging/getting old", some subsequently move more slowly than before.

Framing

People perceive the same message differently depending on how it is presented - even if the actual content has not changed.
Example: If large corporations are repeatedly described with terms such as "giant" or "industry giant," many people react with more sympathy for the smaller (supposedly weaker) competitors.

Attribution error

This is the tendency of observers to locate the cause of another person's actions in that person alone - instead of (also) in the external circumstances. The reason: an agent's personality and behavior are more easily noticed than the overall situation.
Example: The person who bumps into you in a crowded pedestrian zone is assumed to have a bad character, instead of asking whether the collision was unintentional and simply due to the situation.

Correlation and causality

If the change of one variable causes the change of another, this is called causality: there is a cause-effect relationship. If there is only a statistical connection (without cause and effect), it is a mere correlation.
Example of causality: Rising temperatures in summer lead to more ice cream consumption - one leads to the other.
Example of correlation: sunburns and ice cream consumption in summer. Both usually occur together, but the common cause of both is something else: the more intense solar radiation.
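The sunburn/ice-cream example can be sketched numerically. The temperature range, the linear effect sizes, and the noise levels below are invented purely for illustration:

```python
import random

random.seed(0)

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical daily summer temperatures: the hidden common cause
temps = [random.uniform(15, 35) for _ in range(100)]

# Both quantities depend on temperature, not on each other
ice_cream = [2.0 * t + random.gauss(0, 5) for t in temps]
sunburns  = [0.5 * t + random.gauss(0, 2) for t in temps]

r = pearson(ice_cream, sunburns)
print(round(r, 2))  # strongly positive, yet neither causes the other
```

The two series correlate strongly even though the data-generating code never lets one influence the other - the shared driver (temperature) produces the statistical link.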

Halo effect

When a single feature of a person is perceived as so dominant that other features or characteristics of that person are hardly noticed anymore. The dominant characteristic alone then leads to the overall judgment about this person.
Example: Good-looking people are often assumed to have a good character or higher intelligence.

Blind Spot Bias

Some people believe that they themselves are immune to distorting influences: THEY always act rationally, and only the OTHERS fall prey to thinking errors.

Conservatism Bias

This is the tendency to prefer existing evidence or information over new evidence - especially if the latter might change the status quo. It corresponds to a fundamental unwillingness to revise existing judgments or evaluations.
Example: Evidence that the earth is a sphere was long dismissed because it challenged the established worldview.

Pro-innovation or pro-change bias

Innovations or new developments are generally considered to be better or more advantageous than existing ones.
Example: In high-tech industries, innovations are always perceived rather positively. Here, more than anywhere else, standing still is seen as a step backward.

Selective perception

Stimuli or information to which we currently pay more attention are perceived more strongly. Others are more likely to be ignored.
Example: Anyone currently preoccupied with classic cars will "suddenly" notice such vehicles more frequently on the streets.

Outcome Bias

If decisions are evaluated "only" by their results - rather than by the quality of the decision at the time it was made - this is often too short-sighted. The end does not always justify the means.
Example: Winning the lottery does not necessarily mean that playing the lottery is always a good idea.

Ostrich effect

This tendency corresponds to the famous "sticking one's head in the sand" of the ostrich - especially to avoid subjectively difficult information or developments.
Example: The rapid developments of new chatbots are ignored in the hope that everything will remain as it is.

Information Bias

Too much information can also lead to wrong decisions or assessments.
Example: Hearing about burglaries in a large city again and again can lead to a misjudgment of the actual burglary probability; only official statistics would give a reliable picture.

Recency effect

The so-called recency bias favors current events over past information. One reason is that the newer information is more easily available.
Example: If the stock market is doing well today, some traders slip into believing that it will always be this way.

Salience bias (salience effect)

If some feature of a product stands out (is salient), people are readily influenced by it and transfer this salience to the whole.
Example: The above-average display size in some Tesla vehicles leads people to conclude that the entire car is technically superior.

Stereotyping

Everyone knows stereotypes: FC Bayern fans are arrogant and yoga practitioners are esoteric. Yet every person is a unique individual, so one should resist making hasty judgments about someone's character traits.

Survivorship Bias

Here, successes or their probabilities are systematically overestimated and failures underestimated. For example, successful entrepreneurs are presented more prominently in the media than less successful ones. This can lead to entrepreneurship being seen as less risky than it actually is.
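The media-coverage effect described above can be sketched with a small simulation. The 10 % success rate and the 5 % coverage rate for failures are purely illustrative assumptions:

```python
import random

random.seed(1)

# Hypothetical cohort: each startup succeeds with 10% probability
startups = [random.random() < 0.10 for _ in range(10_000)]

# The media covers every success but only a small fraction of failures,
# so an observer who sees only covered firms overestimates the odds
covered = [s for s in startups if s or random.random() < 0.05]

true_rate = sum(startups) / len(startups)
observed_rate = sum(covered) / len(covered)
print(f"true: {true_rate:.0%}, observed among covered: {observed_rate:.0%}")
```

Filtering on survival (here: media coverage) multiplies the apparent success rate several times over, even though the underlying odds never changed.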

Hindsight bias

Having "known it all along" afterwards does not mean the outcome was actually foreseeable. In retrospect, people often overestimate their own earlier judgment, because in the present everything is already known.
Example: If a project has failed, people want to have known about the excessive risks already at the beginning of the project ("but nobody wanted to hear my opinion").

Dunning-Kruger effect

Less competent people often overestimate themselves while UNDERestimating others. Their self-image of their own ability exceeds reality, so they are unable to judge themselves objectively.
Examples: There are enough of these, you just have to look around :-)

Negativity bias

Negative events, thoughts or feelings are subjectively perceived and weighted more strongly than their positive counterparts. Criticism therefore has a stronger effect than praise, warnings affect us more strongly than positive forecasts, and a possible (monetary) loss counts for more than a nominally equal gain. The overweighting of the negative is primarily evolutionary in origin.
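The loss/gain asymmetry can be sketched with the loss-aversion parameter from prospect theory. The factor 2.25 is a commonly cited empirical estimate (Kahneman & Tversky); the linear value function is a deliberate simplification for illustration:

```python
# Losses are weighted roughly twice as heavily as equal gains;
# lambda = 2.25 is an illustrative, commonly cited estimate.
LOSS_AVERSION = 2.25

def subjective_value(amount):
    """Perceived value of a monetary change (simplified linear form)."""
    return amount if amount >= 0 else LOSS_AVERSION * amount

# A 50/50 bet: win 100 or lose 100 -- objectively fair, subjectively bad
expected_feeling = 0.5 * subjective_value(100) + 0.5 * subjective_value(-100)
print(expected_feeling)  # -62.5: the loss looms larger than the gain
```

Although the bet's expected monetary value is exactly zero, its expected subjective value is clearly negative - which is why many people decline nominally fair gambles.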
