You need to make an unbiased, rational decision about something important. You do your research, make lists of pros and cons, consult experts and trusted friends. When it’s time to decide, will your decision really be objective?

Maybe not.

That’s because you’re analyzing information using the complex cognitive machine that has also processed each one of your life experiences. And over the course of your life, like every person on the planet, you have developed a few subtle cognitive biases. Those biases influence what information you pay attention to, what you remember about past decisions, and which sources you decide to trust as you research your options.

A cognitive bias is a flaw in your reasoning that leads you to misinterpret information from the world around you and to come to an inaccurate conclusion. Because you are flooded with information from millions of sources throughout the day, your brain develops ranking systems to decide which information deserves your attention and which information is important enough to store in memory. It also creates shortcuts meant to cut down on the time it takes for you to process information. The problem is that the shortcuts and ranking systems aren’t always perfectly objective because their architecture is uniquely adapted to your life experiences.

Researchers have catalogued over 175 cognitive biases. Here’s a brief summary of some of the most common biases that can affect your everyday life:

Actor-observer bias

Actor-observer bias is a difference between how we explain other people’s actions and how we explain our own. People tend to say that another person did something because of their character or some other internal factor. By contrast, people usually attribute their own actions to external factors like the circumstances they were in at the time.

In one 2007 study, researchers showed two groups of people a simulation of a car swerving in front of a truck, almost causing an accident. One group saw the event from the perspective of the swerving driver, and the other group witnessed the near-wreck from the perspective of the trailing driver. Those who saw it from the swerving driver's (the actor's) perspective rated the move as much less risky than those who watched from the trailing motorist's (the observer's) perspective.

Anchoring bias

Anchoring bias is the tendency to rely heavily on the first information you learn when you are evaluating something. In other words, what you learn early in an investigation often has a greater impact on your judgment than information you learn later.

In one study, for example, researchers gave two groups of participants written background information about a person in a photograph, then asked them to describe how they thought that person was feeling. People who read negative background information tended to infer negative feelings, and people who read positive background information tended to infer positive feelings. Their first impressions heavily influenced their ability to infer emotions in others.

Attentional bias

Attentional biases probably evolved in human beings as a survival mechanism. To survive, animals have to notice and avoid threats. Of the millions of bits of information that bombard the senses daily, people have to spot the ones that might be important for their health, happiness, and safety. This highly tuned survival skill becomes a bias when you focus too much of your attention on one kind of information while disregarding other kinds.

Practical examples: Ever notice how you see food everywhere when you’re hungry, or baby product ads everywhere when you’re trying to conceive? An attentional bias can make it seem that you’re surrounded by more of those stimuli than usual, but you’re probably not. You’re just more aware of them. Attentional bias can present particular challenges for people with anxiety disorders, who may fix more of their attention on stimuli that seem threatening and overlook information that might calm their fears.

Availability heuristic

Another common bias is the tendency to give greater credence to ideas that come to mind easily. If you can immediately think of several facts that support a judgment, you may be inclined to think that judgment is correct.

For example, if a person sees multiple headlines about shark attacks in a coastal area, that person might come to believe that the risk of a shark attack is higher than it actually is.

The American Psychological Association points out that when information is readily available around you, you’re more likely to remember it. Information that is easy to access in your memory seems more reliable.

Confirmation bias

Similarly, people tend to seek out and interpret information in ways that confirm what they already believe. Confirmation bias leads people to ignore or discount information that conflicts with their beliefs. This tendency may be more prevalent than ever, since many people now get their news from social media platforms that track “likes” and searches, feeding users information based on their apparent preferences.

Dunning-Kruger effect

Psychologists describe this bias as the inability to recognize your own lack of competence in an area. Research has shown that some people express a high degree of confidence about something they’re actually not very skilled at doing. This bias exists in all sorts of areas, from recreational card-playing to medical examinations.

False consensus effect

Just as people sometimes overestimate their own skill, they also overestimate the degree to which other people agree with their judgments and approve of their behaviors. People tend to think that their own beliefs and actions are common, while other people’s behaviors are more deviant or uncommon. One interesting note: false consensus beliefs appear in numerous cultures around the world.

Functional fixedness

When you see a hammer, you’re likely to view it as a tool for pounding nail heads. That function is what hammers were designed to fulfill, so the brain efficiently affixes the function to the word or picture of a hammer. But functional fixedness doesn’t just apply to tools. People can develop a kind of functional fixedness with respect to other human beings, especially in work environments. Hannah = IT. Alex = marketing.

The problem with functional fixedness is that it can strictly limit creativity and problem solving. One way researchers have found to overcome functional fixedness is to train people how to notice every feature of an object or problem.

In a 2012 study, participants were trained in a two-step process known as generic parts technique. The first step: list an object’s (or a problem’s) parts. The second step: uncouple the part from its known use. The classic example is to break a candle into wax and wick. Next, uncouple wick from how it works in the candle, describing it instead as string, which opens new possibilities for its use. Study participants who used this method solved 67 percent more problems than people who did not use it.

Halo effect

If you are under the influence of the halo effect, your general impression of a person is being unduly shaped by a single characteristic.

One of the most influential characteristics? Beauty. People routinely perceive attractive people as more intelligent and conscientious than their actual academic performance indicates.

Misinformation effect

When you remember an event, your perception of it can be altered if you later receive misinformation about it. In other words, learning something new about an event you witnessed can change how you remember it, even if what you are told is irrelevant or untrue.

This form of bias has huge implications for the validity of witness testimony. Researchers have recently uncovered an effective way to reduce this bias. If witnesses practice repeating self-affirmations, especially ones that focus on the strength of their judgment and memory, misinformation effects decrease, and they tend to recall events more accurately.

Optimism bias

An optimism bias may cause you to believe that you are less likely than other people to experience hardships, and more likely to experience success. Researchers have found that whether people are making predictions about their future wealth, relationships, or health, they usually overestimate their chances of success and underestimate the likelihood of negative outcomes. That’s because we update our beliefs selectively: we revise our expectations when something turns out well, but far less often when things turn out badly.

Self-serving bias

When something goes wrong in your life, you may have a tendency to blame an outside force for causing it. But when something goes wrong in someone else’s life, you might wonder whether that person was somehow to blame, if an internal characteristic or flaw caused their problem. In the same way, a self-serving bias might cause you to credit your own internal qualities or habits when something good comes your way.

Cognitive biases can affect your decision-making skills, limit your problem-solving abilities, hamper your career success, damage the reliability of your memories, challenge your ability to respond in crisis situations, increase anxiety and depression, and impair your relationships.

Can you eliminate cognitive biases entirely? Probably not. The human mind seeks efficiency, which means that much of the reasoning behind our daily decision-making relies on nearly automatic processing. But researchers think we can get better at recognizing the situations in which our biases are likely to operate, and take steps to uncover and correct them. Here’s how to mitigate the effects of bias:

  • Learn. Studying cognitive biases can help you recognize them in your own life and counteract them once you’ve sussed them out.
  • Question. If you’re in a situation where you know you may be susceptible to bias, slow your decision-making and consider expanding the range of reliable sources you consult.
  • Collaborate. Assemble a diverse group of contributors with varying areas of expertise and life experience to help you consider possibilities you might otherwise overlook.
  • Remain blind. To cut down on the chances that you’ll be influenced by gender, race, or other easily stereotyped considerations, keep yourself and others from accessing information on those factors.
  • Use checklists, algorithms, and other objective measures. They may help you focus on relevant factors and reduce the likelihood that you’ll be influenced by irrelevant ones.

Cognitive biases are flaws in your thinking that can lead you to draw inaccurate conclusions. They can be harmful because they cause you to focus too much on some kinds of information while overlooking other kinds.

It’s probably unrealistic to think that you can eliminate cognitive biases, but you can improve your ability to spot the situations where you’ll be vulnerable to them. By learning more about how they work, slowing your decision-making process, collaborating with others, and using objective checklists and processes, you can reduce the chances that cognitive biases will lead you astray.