In a previous life I was a computer nerd. We programmers have a saying - garbage in, garbage out (or GIGO). It doesn’t matter if your code is perfect: if the data you feed it is faulty, the output will be bad as well.
The same holds true for each of us - there is no way great decisions can emerge from our thinking if the information that goes into it is bad. Garbage in - garbage out.
When I was in charge of software development, it was essential to ensure data validity, and the earlier in the process we could detect errors, the better. This fell under the domain of “quality assurance (QA).”
I have since discovered that we can apply QA practices to ourselves. By doing so, we can improve our ability to make decisions that better help us achieve our goals and align with our values. In this article I will focus on data QA; in a future article I will cover applying QA to our decision-making processes.
The information we use in our decision making has three elements: 1) the information itself, 2) our belief about the validity of the information, and 3) the amount of certainty we have about that belief.
If you have followed my content, you know that subconscious illusions in the form of biases, shortcuts, and impulses can cause errors in all three elements. Our personal QA system must detect these errors whenever possible before we let them into our internal decision-making software (remember GIGO).
1: Raw data sources
We consume information through reading, watching, and engaging in conversation. How can you ensure you are receiving accurate information? You can’t. But you can increase the probability of valid data by using discernment in your information sources. When possible, ask these questions before consuming information:
- Is the source respected for honesty and integrity?
- Is the source trustworthy - if so, why or why not?
- Is the source ideology-based rather than objective? If the majority of their content supports one ideology over another, be wary.
- Does the source make money from your attention? If so, they are often more interested in tweaking you emotionally than providing accurate information. Most “media” falls into this category.
- Is the information supported by other credible sources?
The more important the information is to your worldview or decision making, the more vital it is to question its veracity.
2: The belief filter
Have you heard that we only use a small percentage of our brain? That’s a cool bit of information and makes us feel good knowing that we have a lot of untapped potential up there that we may be able to access at some point. Unfortunately, it is false - demonstrably false and easily debunked. While it is true that we have a lot of untapped potential, we use almost all of our brain.
As with that myth, we love information that either confirms a belief we already have (confirmation bias) or aligns with a “lizard brain” preference for that belief. Your personal QA system can catch errors at this stage by looking out for these subconscious preferences when assessing your belief about a piece of information:
Status quo - Does the information align with what you currently believe or how you behave? Does it support doing things the way you are currently doing them? If so, take a moment to review whether it is truly accurate.
Speed and convenience - Does believing the information make your decisions and/or actions quicker and easier? For example, if you believe that your current job is the best choice for you even though you are miserable and feel underpaid, maybe that belief stems from the fact that staying is easier than looking for a new job. I’m all for quick and easy, but if ease is the driving factor behind a choice that does not align with your goals and values, it’s faulty data.
Exposure - Studies show that the more we are exposed to an idea or piece of information, the more likely we are to believe it. You may not have believed that immunizations cause autism the first time you heard it, but by the fifth time you heard it, you felt it must be true. One of the dangers of the information age is that it is easy to be repeatedly exposed to the same idea, especially if we use the same information sources (or types of sources) again and again.
We love to feel certain about our beliefs. Our brain rewards us with the fun chemical, dopamine, when we feel we know something for sure, so we tend to be more certain about our beliefs than logic would dictate. The truth is, everything is a probability. The probability that 1+1=2 is close to 100%, so you can be secure in your certainty on that one. What about the belief that buying a new car will make you happier? Or that your supervisor is actively plotting against you?
In the software QA system I helped develop, we flagged data that was likely, though not certain, to be in error. For example, if we received a patient record showing a male with a breast cancer diagnosis, we would flag it. The record could be accurate, since males do account for about 1% of all breast cancer cases, but if we found an unusually high percentage of such cases, a systemic data error became the more likely explanation and we would do further research.
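For the programmers among my readers, that flag-and-escalate logic can be sketched in a few lines of Python. This is a minimal illustration, not the actual system I worked on; the ~1% base rate comes from the statistic above, while the escalation threshold and the record format are assumptions made up for the example:

```python
# Sketch of "flag unlikely data; escalate if the pattern repeats".
# The ~1% base rate is from the article; the 5x escalation threshold
# and the dict-based record format are assumptions for illustration.

BASE_RATE = 0.01              # roughly 1% of breast cancer cases are male
ESCALATE_AT = 5 * BASE_RATE   # a flag rate well above the base rate is suspicious

def flag_record(record):
    """Flag a record that is possible but statistically unlikely."""
    return record["sex"] == "M" and record["diagnosis"] == "breast cancer"

def review_batch(records):
    """Return (flagged_records, escalate) for a batch of patient records."""
    cases = [r for r in records if r["diagnosis"] == "breast cancer"]
    flagged = [r for r in cases if flag_record(r)]
    # One flagged record may be genuine; a high *rate* of them suggests
    # a systemic data error worth investigating further.
    rate = len(flagged) / len(cases) if cases else 0.0
    return flagged, rate > ESCALATE_AT

records = [
    {"sex": "M", "diagnosis": "breast cancer"},
    {"sex": "F", "diagnosis": "breast cancer"},
    {"sex": "F", "diagnosis": "breast cancer"},
]
flagged, escalate = review_batch(records)
print(len(flagged), escalate)  # 1 of 3 cases flagged, well above 1% -> escalate
```

The point mirrors the personal version: a single surprising data point is only flagged, not rejected, and you escalate to deeper investigation only when the improbable becomes common.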
The key is to avoid jumping to certainty when evaluating information. Assess probabilities instead and you will be better prepared in case the data is inaccurate. Then you can act accordingly.
Implementing the system
Adopting a personal QA system takes some work. It doesn’t need to be applied rigorously to all information you consume, but when that data can influence a decision that has a large impact on you or others, take the extra step to ensure the data is accurate, or at least understand the probability that it may not be.
It won’t take long for your personal QA system to become second nature, and once the habits are formed you will feel confident that your decision-making system is not suffering from GIGO.
Think well and be well!
- Steve Haffner, decision performance expert
Want to learn more about improving your thinking and decision making?
Click here for my free book, 7 Strategies for Making Better Decisions