Internet marketing: what do dragons and data have in common?

10.3.2025

Let's find out.


The first thing is to know the enemy. Imagine that we see a dragon and that it is quite real.

What do you notice first about the dragon? Also, is it dangerous?


The dragon in the presentation is from Game of Thrones. I think even from House of the Dragon. So we know that they are quite powerful creatures.

Does this dragon seem threatening to you?

The thing is, when we see these dragons, we tend to generalize, to use stereotypes and myths about dragons.

What does this mean? We have a mental representation of a dragon as a quite threatening creature. This particular dragon doesn't necessarily have bad intentions, but because of how we imagine dragons, we perceive it as a threat.

We use these myths as shortcuts because it is safer to assume that the dragon is dangerous: to save our lives we must fight it or find a place to hide. When you actually see a dragon, your brain screams danger, and when you see danger, you think of escape strategies and strategies for fighting the dragon.

Now let's put the dragons aside for a moment and look at this.

The text on this slide doesn't have much meaning by itself; it was taken from one of the books. It is an example of data in which a block of text has only a few bold words while the rest looks like gibberish.

This text will not burn you alive, right? But the process of analyzing it is similar to fighting a dragon: you need a strategy for processing data quickly, especially if you need to remember it.

Now let's imagine such a situation:

We are preparing for an outbreak of an unusual disease that is expected to kill exactly 600 people. We have two alternative programs that can be used to fight this disease.

  1. If Program A is adopted, 200 people will be rescued.
  2. If Program B is adopted, there is about a 33 percent probability that all 600 people will be rescued and about a 67 percent probability that no one will be rescued.

Which program do you choose?

In our audience poll, Option B won; Option A received two of the five votes.

Now imagine that we have the same scenario and we have to choose one program.

  1. If Program C is adopted, 400 people will die.
  2. If Program D is adopted, there is about a 33 percent probability that no one will die and about a 67 percent probability that 600 people will die.

This thought experiment shows how we choose and make decisions, and how the wording of a question can change the answer.

When you compare all these programs, you can see that Program A is actually the same as Program C, just worded differently. In both cases, 200 people are saved; Program C simply states that 400 people will die. It is the same outcome, but Program A is worded a little more positively (people will be saved), while Program C is worded in terms of people dying.
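The equivalence of the four programs can be checked with a little expected-value arithmetic. Below is a minimal Python sketch, assuming the standard one-third/two-thirds probabilities from this classic experiment:

```python
# Expected number of survivors (out of 600) under each program,
# assuming the standard one-third / two-thirds probabilities.
TOTAL = 600

def expected_saved(outcomes):
    """outcomes: list of (probability, people_saved) pairs."""
    return sum(p * saved for p, saved in outcomes)

program_a = expected_saved([(1.0, 200)])              # "200 will be saved"
program_b = expected_saved([(1/3, 600), (2/3, 0)])    # gamble on saving everyone
program_c = expected_saved([(1.0, TOTAL - 400)])      # "400 will die" = 200 saved
program_d = expected_saved([(1/3, TOTAL), (2/3, 0)])  # "no one dies" vs "all die"

# All four expected values come out to (approximately) 200 people saved.
print(program_a, program_b, program_c, program_d)
```

Statistically, the four programs are identical; only the framing differs, yet the framing reliably shifts which option people choose.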

A negative connotation can affect how people perceive, research, analyze and make decisions based on data.


In addition, the first option in each scenario gives exact numbers: 200 people will be saved, which means 400 will die.

In the second option of each scenario, there is only a probability: everyone may be saved, or no one may be. Options A and C seem less risky because, although fewer people are saved, there is 100% certainty that they will be saved. Options B and D, in contrast, create uncertainty about who will die and who will be saved, which makes them feel much riskier. The researchers who conducted this experiment found that participants given a choice between Programs A and B tended to choose Program A, preferring the certainty of saving 200 people.


Those given a choice between Programs C and D were more willing to take risks, often choosing Program D despite its higher risk.

The formulation of the question also plays a key role. When we focus on a positive outcome, such as saving people, the framing leads to a preference for options that guarantee some people will definitely survive. Conversely, when we focus on people dying, attention shifts toward avoiding that outcome, leading to different decision-making patterns.

Cognitive errors

Let's now discuss some of the biases that can affect our thinking, analysis and decision-making.

The framing effect

In the experiment we just conducted, the main bias at play is the framing effect.

The framing effect is a cognitive bias in which people make decisions based on whether options are presented with positive or negative connotations. In general, we tend to make risk-averse choices when options are presented positively; as the example shows, people prefer the certainty of saving 200 people to the uncertainty of potentially saving more people or none at all. We also tend to avoid choices that involve potential losses when problems are presented negatively.

Confirmation bias

Another common bias is confirmation bias: the tendency to seek out, interpret and favor information that supports or confirms our prior beliefs and values.


One example of this bias is the belief that focusing on long-tail keywords related to topics such as eco-friendly printing gadgets is the best SEO strategy.


When analyzing SEO data, you may interpret it in a way that confirms this belief, even if the data suggests that shorter, high-volume keywords perform better. This is a clear example of confirmation bias: ignoring evidence that contradicts your preconceived notions because you are convinced that your chosen strategy is the best.
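One way to see the mechanism is to look at what happens when you analyze only the subset of data that fits your belief. The sketch below uses made-up keyword numbers (the keywords, volumes and conversion rates are all assumptions for illustration):

```python
# Hypothetical keyword data: (keyword, monthly_volume, conversion_rate).
# Filtering only for evidence that fits the belief
# ("long-tail keywords work best") hides the full picture.
keywords = [
    ("eco-friendly printing gadgets",          90, 0.05),
    ("sustainable office printer accessories", 60, 0.04),
    ("printer",                            40_000, 0.01),
    ("printing gadgets",                    8_000, 0.02),
]

def expected_conversions(rows):
    return sum(volume * cr for _, volume, cr in rows)

# Keep only keywords with three or more words ("long tail").
long_tail = [row for row in keywords if len(row[0].split()) >= 3]
all_rows = keywords

print(expected_conversions(long_tail))  # only the confirming subset
print(expected_conversions(all_rows))   # short, high-volume keywords dominate
```

Looking only at the long-tail subset makes the strategy look reasonable; the full dataset tells a very different story.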

The availability heuristic

Another bias to consider is the availability heuristic, which occurs when people rely on the immediate examples that come to mind when evaluating a topic, concept, method or decision. For example, if someone has recently watched Game of Thrones and then encounters a dragon in real life (assuming this is possible), they may immediately perceive it as threatening, since dragons in the series are portrayed as dangerous and destructive. This perception is reinforced because the series' representation of dragons is fresh in their memory. Conversely, if they had recently watched How to Train Your Dragon or Tabaluga, where dragons are portrayed as friendly or harmless, their perception would likely be different.


Content consumed recently can have a strong impact on our decision-making and thought processes due to its availability in our memory.

In the context of SEO, if you've recently read a case study of a successful blog-focused SEO strategy, you may overestimate the effectiveness of using blog content on your own site. This overestimation occurs because the information is fresh in your mind, making it more readily available when you make SEO strategy decisions.

Algorithmic bias

There is also algorithmic bias, which can be particularly important for SEO specialists.


Algorithmic bias refers to systematic and repeatable errors in a computer system that produce unfair outcomes, such as favoring one category over another in unintended ways.

To give a simpler example, imagine we live in a kingdom where dragons are an everyday sight. We are developing a software system to identify and categorize dragons as dangerous or not, based on characteristics such as size, color and behavioral patterns. The system is designed to help us stay safe and quickly identify different types of dragons. The historical data used to train the system comes from reports by people who saw dragons in another part of the country, where dragons are more common. There, red dragons were often seen on burning ships and were labeled as dangerous. As a result, the system learns to associate red scales with danger. However, this association may not always be accurate; there may be red dragons in the south that are completely harmless and pose no threat.

This illustrates how an algorithm can develop biases based on incomplete or skewed data, leading to associations that are not universally true.

The algorithm's performance is not error-free. It reflects the biases present in the data on which it was trained, which can affect how we react and behave in different situations.
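The dragon example can be sketched in a few lines of code. Here the "model" is just a frequency count over sighting reports; all the data is invented for illustration:

```python
from collections import Counter

# Hypothetical sighting reports used as training data: (color, labeled_dangerous).
# The reports come mostly from one region where red dragons were seen
# attacking ships, so the labels for red dragons are skewed.
reports = [
    ("red", True), ("red", True), ("red", True), ("red", False),
    ("green", False), ("green", False), ("blue", False), ("blue", True),
]

def train(reports):
    """Learn P(dangerous | color) as a simple frequency estimate."""
    totals, dangerous = Counter(), Counter()
    for color, label in reports:
        totals[color] += 1
        dangerous[color] += label
    return {color: dangerous[color] / totals[color] for color in totals}

model = train(reports)

# The model now flags any red dragon as probably dangerous (P = 0.75),
# including harmless red dragons that never appeared in the reports.
print(model["red"], model["green"], model["blue"])
```

The model is not "wrong" about its training data; it faithfully reproduces the skew in the reports, which is exactly how biased data turns into biased predictions.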

Anchoring effect

Now let's consider another experiment.

Suppose you were asked to estimate the result of a multiplication sequence without using a calculator. For example, the sequence might be presented in ascending order: 1 × 2 × 3, all the way up to 8. When the sequence is presented this way, people tend to estimate the result as relatively low, around 50, 60 or 100. However, when the sequence is reversed, starting with 8 and going down to 1, the estimates are much higher. This difference shows how the presentation of data affects our perception and judgment.

The actual answer, which requires a calculator to solve accurately, shows that our intuitions can be quite misleading depending on how the problem is formulated. This is another example of cognitive bias, illustrating the impact of how information is organized and presented on our brains.

The actual result of multiplying 1 × 2 × 3 all the way up to 8 is 40,320, a surprisingly high number. Calculating it exactly in your head is difficult, so estimation becomes necessary. When people see a multiplication sequence that starts with small numbers, they tend to estimate a much smaller result than when the sequence starts with a large number. This happens because of the anchoring effect.
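A quick check confirms that the order of multiplication changes the estimates people give, but not the actual result:

```python
from math import prod

ascending = prod(range(1, 9))       # 1 × 2 × 3 × … × 8
descending = prod(range(8, 0, -1))  # 8 × 7 × 6 × … × 1

print(ascending, descending)  # both are 40320
```

Only the starting anchor differs between the two presentations; the product is identical.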


The anchoring effect is a cognitive error in which an individual's judgments or decisions are heavily influenced by the first information encountered - the "anchor."


In this example, when the sequence starts with smaller numbers (such as 1, 2, 3), it creates a low anchor, which makes the final estimate lower. Conversely, when the sequence starts with a larger number (such as 8), the anchor is higher, leading to a higher estimate of the total. This initial anchor, even when it is arbitrary, has a strong influence on how we perceive and process the rest of the information.

The anchoring effect is not limited to numbers or math problems. When reading a text, for example, we tend to focus on the information presented at the beginning and end, often overlooking what is in the middle. This insight is particularly useful when writing emails, especially marketing or persuasive ones. If you want to influence someone's decision or emphasize a particular choice, it is best to present that information at the very beginning: people are more likely to focus on and remember the first piece of information they encounter. The anchoring effect shows how initial information can set the tone for subsequent judgments, whether in numerical estimation, decision-making or the interpretation of written content.

The anchoring effect is particularly interesting because it goes beyond numerical estimates. It also affects the way we interpret questions, often acting as a subtle suggestion. Even with experience, it is difficult to be completely immune to it, because it works automatically in our brains. While we can become more aware of our biases, for example by deliberately paying attention to information in the middle of a sequence, the anchoring effect is a cognitive shortcut our brains use and cannot always be avoided.

Another everyday example of the anchoring effect is shopping. You may see a product marked with a price that seems much lower than the "original" price. This "original" price serves as an anchor, making the new, lower price seem like a good bargain. However, the "sale" price may actually have been the intended selling price all along.


This tactic takes advantage of the anchoring effect, making people believe they are getting a bargain and encouraging them to buy. Marketers can use this understanding of anchoring to present data or prices in a way that convinces customers to make purchases.

Heuristics

Let's take a look at how we make decisions and how our brain functions, focusing on the psychology of decision-making. There are many theories about how our brain works. One theory relevant to this discussion is the concept of the brain as an "adaptive toolbox." This theory suggests that our brains are organized to use various tools, known in psychology as heuristics, to solve problems and make decisions. Heuristics are mental shortcuts that are quick and efficient, allowing us to process complex information and make decisions without having to consciously analyze every detail.

When we encounter complex data, we need ways to break it down into manageable chunks for effective analysis and processing.


Heuristics help us interpret data and make decisions more efficiently. However, these shortcuts can also lead to biases, such as the anchoring effect. For example, when choosing between two similarly priced products, heuristics can help us make a quick decision based on a single distinguishing characteristic, such as brand reputation or a perceived discount, even if the actual differences between the products are minimal.

In summary, while heuristics let us navigate the complexities of everyday decision-making more effectively, they can also introduce biases that influence our judgments. Understanding these biases and how they work can help us make more informed decisions, whether in personal and professional choices or in marketing strategies.

When we buy something like shampoo, we don't have the luxury of spending an hour deciding which one to buy. Our brains rely on a kind of mental algorithm or heuristics to quickly compare options and make a decision.

However, this approach involves a trade-off. Heuristics simplify complex information, which can lead to biases.

Stereotypes and biases

In addition to cognitive biases, stereotypes can also influence our decision-making processes.

Stereotypes can shape our thinking and lead us to make decisions based on generalized beliefs rather than concrete evidence. It is important to remember that our decision-making process can also be influenced by personal preferences, personality traits and cultural background. These factors, while not directly related to this particular presentation, play a significant role in shaping how we make decisions. Depending on these individual factors (even when faced with the same data or situation), different people may make very different decisions.
In summary, the way we make decisions is influenced not only by cognitive biases and stereotypes, but also by a combination of heuristics, personal experiences, personality, culture and many other factors. Therefore, even when presented with the same problem or data, individuals may come to different conclusions.

Here are the definitions of some key terms:

  • Bias: an inclination or prejudice for or against a person or group, often in a way considered unfair. A bias can also be a preconceived tendency or inclination that is not based on reason. For example, the belief that "all dragons are dangerous and should be killed" is a form of bias.
  • Stereotype: a simplified and standardized concept or image that represents a group or a set of characteristics associated with that group. Unlike a bias, which can be more general, a stereotype is a concrete mental representation. For example, the stereotype that "all dragons are killers" is a fixed, well-defined image of how dragons are perceived.

The main difference between a bias and a stereotype is that biases are tendencies or inclinations that can influence judgment in a broader sense, while stereotypes are specific, rigid sets of characteristics applied to groups.


Both, however, can significantly affect how we view the world and make decisions.

When we talk about myths, we mean not just traditional stories but also widely held false beliefs or ideas that influence decision-making. Myths, like biases and stereotypes, serve as mental shortcuts that simplify decision-making, but they can also mislead us. For example, the myth that "higher price means higher quality" often influences purchasing decisions, even when the actual value does not justify the higher price. Myths, stereotypes and biases create filters through which we view the world. In fields such as marketing or product management, it is crucial to identify and challenge these myths to avoid bad decisions.

Tips for avoiding biases, stereotypes and misconceptions

To mitigate biases, stereotypes and the impact of myths on decision-making, consider the following strategies.

Collect data from various sources

Make sure the data used is representative of different groups, situations or perspectives. For example, when researching SEO techniques, rely on multiple credible sources instead of one to avoid a biased or incomplete perspective.

Define clear, objective criteria for data analysis

Use algorithms, indicators or standard methodologies to reduce subjective interpretation. In recruitment, for example, a scoring system can help objectively compare candidates based on specific criteria rather than subjective opinions.
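As a sketch, such a scoring system might look like this in Python (the criteria and weights are assumptions for illustration, not a recommended rubric):

```python
# Each criterion is scored 0-5 by the reviewer, then combined with
# fixed weights, so every candidate is compared on identical criteria.
WEIGHTS = {"experience": 0.4, "skills_match": 0.4, "portfolio": 0.2}

def score(candidate: dict) -> float:
    """Weighted sum of per-criterion scores; same rubric for everyone."""
    return sum(WEIGHTS[criterion] * candidate[criterion] for criterion in WEIGHTS)

alice = {"experience": 4, "skills_match": 5, "portfolio": 3}
bob = {"experience": 5, "skills_match": 3, "portfolio": 4}

print(score(alice), score(bob))  # candidates become directly comparable
```

The point is not the particular weights but that the criteria are fixed in advance, so the comparison does not shift with the reviewer's subjective impressions.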

Conduct data audits to identify gaps

Regularly analyze your data to check for overrepresentation or underrepresentation of specific demographic groups, regions or behaviors. This helps create a balanced basis for decision-making.
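A data audit can start as something very simple, such as counting how each group is represented. A minimal sketch (the records, the `region` field and the 50% threshold are all assumptions):

```python
from collections import Counter

# Hypothetical dataset: check whether any single region dominates it.
records = [
    {"region": "north"}, {"region": "north"}, {"region": "north"},
    {"region": "north"}, {"region": "south"}, {"region": "east"},
]

counts = Counter(record["region"] for record in records)
shares = {region: n / len(records) for region, n in counts.items()}

# Flag any group that accounts for more than half of the data.
overrepresented = [region for region, share in shares.items() if share > 0.5]
print(shares, overrepresented)  # "north" dominates this sample
```

Any conclusion drawn from this dataset would mostly reflect the "north" region; the audit makes that visible before decisions are made.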

Use blind techniques

Blinding involves hiding key information from the analyst that may introduce bias. In recruiting, for example, removing identifying information (such as names or education) from resumes helps focus on skills and experience, reducing bias based on unrelated factors.
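In code, blinding can be as simple as dropping sensitive fields before anyone reviews the record. A minimal sketch (the field names are assumptions):

```python
# Fields hidden from the reviewer because they could introduce bias.
BLINDED_FIELDS = {"name", "photo", "school", "age"}

def blind(application: dict) -> dict:
    """Return a copy of the application without the blinded fields."""
    return {k: v for k, v in application.items() if k not in BLINDED_FIELDS}

application = {
    "name": "J. Rivera",
    "school": "Somewhere University",
    "skills": ["SEO", "analytics"],
    "years_experience": 6,
}

print(blind(application))  # only skills and experience remain visible
```

The reviewer then scores what is left, which keeps the evaluation focused on skills and experience.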

Encourage diverse teams and perspectives

Diverse teams bring different perspectives that can help identify biases and stereotypes. Shared decision-making often leads to a more comprehensive understanding of the data or problem.

Regularly challenge assumptions

Actively question the assumptions behind data analysis and decision-making processes. Ask whether certain beliefs or heuristics (such as "higher price means better quality") are justified by the data or are just cognitive shortcuts.

Conclusions

Understanding biases, stereotypes and myths in the decision-making process is essential. Cognitive shortcuts simplify everyday choices, but can lead to unfair or erroneous results. Using a variety of data sources, defining objective criteria, conducting regular audits, practicing blind analysis, and promoting awareness and diversity can lead to more informed decisions.

This approach is essential in data analysis, marketing, recruiting, product development and strategic planning.