Sep 25, 2016

The Misinformation Age


‘A Field Guide to Lies’ by the neuroscientist Daniel Levitin lays out the many ways in which each of us can be fooled and misled—and the modes of critical thinking we will need to overcome this.

 

Wall Street Journal

By SAMUEL ARBESMAN

Sept. 16, 2016

 

The phrase “too good to be true” telegraphs a sense of both surprise and concern. It flags something we should either look into more carefully or immediately swerve away from. But how do we know how to respond properly? How do we navigate the many claims, facts, statistics and arguments that bombard us daily? How do we make sure our little mental alarm bells go off consistently and accurately? In other words, how do we think critically in our modern world?

While it is easy to say that we value critical thinking, I don’t think people are really wired that way. It is far simpler to take a claim at face value than to delve into its veracity, and we do not live in a constant state of careful thought, reading and consuming information vigilantly. We should instead think of critical thinking as a muscle: The more we use it, the stronger it gets and the more natural its use becomes. If you train for a marathon, you can run a mile; if you regularly grapple with data-riddled documents, seeing through a talking head should be a breeze.

“A Field Guide to Lies” by the neuroscientist Daniel Levitin lays out the many ways in which each of us can be fooled and misled by numbers and logic, as well as the modes of critical thinking we will need to overcome this. You will learn how to think critically about numerical facts, how to recognize the host of cognitive biases that we often fall prey to, and even how to evaluate the reliability of a website. Mr. Levitin tells us how to think about averages (the mean can be deceiving, such as in very uneven distributions, like investment returns). He also explains how to read graphs (pay attention to the axes; when they don’t start at zero, something might be fishy).
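The point about deceptive averages is easy to make concrete. In a quick sketch (my own illustration, with invented figures, not an example from the book), one outsized investment return drags the mean far above what a typical year actually looked like, while the median stays put:

```python
from statistics import mean, median

# Hypothetical yearly returns (%): mostly modest, plus one outlier year.
returns = [2, 3, 1, 4, 2, 3, 2, 95]

print(mean(returns))    # 14.0 -- "an average return of 14%" sounds great
print(median(returns))  # 2.5  -- the typical year was far more humble
```

The same uneven-distribution caution applies to incomes, house prices and anything else where a few extreme values can pull the mean away from the typical case.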

We are often sloppy when thinking, for example, about the field of medicine, from how we test for disease to how we think about a disease’s presence in the population. The chance of a positive test result given that you actually have a disease, such as a certain type of cancer, is not the same as the chance that you have the disease given a positive test result. Depending on the numbers, the likelihood of disease can be overestimated many times over. For instance, if a disease occurs in 1 in 100 people, a test detects it 90% of the time, and someone without the disease still tests positive 9% of the time, then about 9 in 10 positive results will actually be false positives. Physicians themselves fall prey to this error: One study that Mr. Levitin cites found that 90% of doctors make it.
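The base-rate arithmetic behind that claim can be checked directly. Here is a short sketch using the numbers above (the population size is arbitrary; only the rates matter):

```python
# Base-rate arithmetic with the review's numbers, per 100,000 people.
population = 100_000
prevalence = 1 / 100     # 1 in 100 people have the disease
sensitivity = 0.90       # the test detects the disease 90% of the time
false_pos_rate = 0.09    # healthy people still test positive 9% of the time

sick = population * prevalence                 # 1,000 people
healthy = population - sick                    # 99,000 people
true_positives = sick * sensitivity            # 900
false_positives = healthy * false_pos_rate     # 8,910

# Probability that a positive result is actually a false alarm:
p_false = false_positives / (true_positives + false_positives)
print(round(p_false, 2))  # 0.91 -- about 9 in 10 positives are false
```

The intuition: because healthy people vastly outnumber sick ones, even a small false-positive rate generates far more false alarms than true detections.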

These logical failings can arise in more mundane situations. For instance, a lack of critical thinking can lead to problems in how we think about coincidences. As Mr. Levitin notes, we register coincidences when they happen, such as when a friend calls just as we are thinking about him. But we don’t note when we think about him and he doesn’t call, when we don’t think about him and he calls, or when he doesn’t call and we aren’t thinking about him at all. We focus on one of these situations and exclude the rest, making it difficult to see the larger picture and judge how surprising the coincidence really is. Researchers have even examined the experience of encountering a new word and then hearing or seeing it again soon after; with a statistical mind-set, this “coincidence” appears far less mysterious.
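That selective attention can be made concrete with a toy simulation (my own illustration, not Mr. Levitin’s, and the daily probabilities are invented): treat “thinking of a friend” and “the friend calling” as independent events each day, and tally all four cells of the table rather than just the memorable one.

```python
import random

random.seed(1)
days = 100_000
p_think, p_call = 0.05, 0.02  # assumed daily probabilities

counts = {"think&call": 0, "think only": 0, "call only": 0, "neither": 0}
for _ in range(days):
    think = random.random() < p_think
    call = random.random() < p_call
    if think and call:
        counts["think&call"] += 1
    elif think:
        counts["think only"] += 1
    elif call:
        counts["call only"] += 1
    else:
        counts["neither"] += 1

# The "spooky" cell occurs at roughly p_think * p_call of the time --
# rare on any given day, but inevitable over enough days, and exactly
# what chance alone predicts.
print(counts["think&call"] / days)
```

Over many days the coincidence cell fills in at about the rate independence predicts; it only feels uncanny because the other three cells go unnoticed.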

Some readers might have seen forays into parts of this territory elsewhere, from navigating our cognitive biases to thinking in terms of Bayesian probability (updating our beliefs as new information arrives). For other readers, this book might feel like eating your vegetables: something you recognize you need to be familiar with and conversant in, but only if you are forced to learn it. I’d recommend this vegetable eating anyway; it will help you consume healthy information more regularly rather than the misinformation that is all around us. Ultimately Mr. Levitin is advocating a scientific mind-set toward the world around us and the information within it, constantly querying what we encounter with a skeptical and critical eye.

http://www.wsj.com/articles/the-misinformation-age-1474059411

 
