Correlation and causation warrant plenty of suspicion, because researchers, and consumers of their research, routinely conflate the two. Tyler Vigen compiled some funny examples of misleading statistics to prove this exact point. One of his graphs depicts a compelling correlation between the number of people who drowned by falling into a pool and the number of films Nicolas Cage appeared in.
Another shows a correlation between cheese consumption and the number of people who died by becoming entangled in their bedsheets. Does one cause the other? Probably not.

Data visualizations turn raw numbers into visual representations of key relationships, trends, and patterns.
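It is easy to reproduce this effect yourself. The short Python sketch below uses made-up numbers (not Vigen's actual figures) to generate two unrelated series that both happen to drift upward over time, and then shows that they come out strongly correlated anyway:

```python
# Two independent, purely synthetic series that both trend upward over time.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1999, 2010)

cage_films = np.linspace(1, 4, len(years)) + rng.normal(0, 0.5, len(years))
pool_drownings = np.linspace(90, 120, len(years)) + rng.normal(0, 5, len(years))

# Pearson correlation is high simply because both series trend the same way.
r = np.corrcoef(cage_films, pool_drownings)[0, 1]
print(f"Correlation between two unrelated series: r = {r:.2f}")
```

A high correlation coefficient here says nothing about one series driving the other; it only says that both happen to move in the same direction over the same period.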
One popular example from the news is the Terri Schiavo case, a right-to-die legal case in the U.S. A glance at the widely circulated graph of polling on the case suggests that, compared to Republicans and Independents, roughly three times more Democrats agreed with the court's decision. But the truncated chart, with its Y-axis starting at 50 instead of 0, distorts the data and leads you to an exaggerated conclusion about one group.
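To see how much the axis choice alone changes the story, here is a small matplotlib sketch. The percentages are illustrative placeholders, not the actual poll results:

```python
# Compare the same data on a truncated Y-axis versus a full Y-axis.
import matplotlib.pyplot as plt

groups = ["Democrats", "Republicans", "Independents"]
agree_pct = [62, 54, 54]  # hypothetical "agree with the court" percentages

fig, (ax_truncated, ax_full) = plt.subplots(1, 2, figsize=(8, 3))

# Left: axis starts at 50, so an 8-point gap looks like a 3x difference.
ax_truncated.bar(groups, agree_pct)
ax_truncated.set_ylim(50, 65)
ax_truncated.set_title("Truncated axis (misleading)")

# Right: axis starts at 0, showing the groups are actually fairly close.
ax_full.bar(groups, agree_pct)
ax_full.set_ylim(0, 100)
ax_full.set_title("Full axis (honest)")

plt.tight_layout()
plt.show()
```

Same numbers, two very different impressions; the only thing that changed is where the Y-axis starts.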
When evaluating a chart like this, examine the intervals and scales: check for uneven increments and odd measurement choices, such as raw counts used where percentages would be appropriate. Also look for the complete context, and for other comparative graphs, to see how similar data is usually measured and represented. Misleading statistics create shocking headlines that attract swarms of traffic but provide flawed insights at best. Instead of helping you navigate around detours, potholes, and pitfalls, they knowingly or unknowingly steer you right into them.
Research is expensive and time-consuming, so check who is sponsoring it, weigh their bias on the topic, and consider how they might benefit from the results. Are they a B2C company with a product to sell? A consulting service?

An ideal sample size depends on many factors, like your company and the goals of your project. Using a third-party calculator helps you reliably assess your sample size without having to work out the statistics on your own: users enter their expected conversion rate and the minimum percent change they want to be able to detect.
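If you want to sanity-check what such a calculator is doing, the sketch below shows one common approximation: sizing a two-proportion A/B test at 95% confidence and 80% power. The baseline rate and lift in the example are hypothetical inputs, not recommendations:

```python
# Approximate per-variant sample size for an A/B test on conversion rates,
# using the standard two-proportion z-test formula.
from statistics import NormalDist

def sample_size_per_variant(baseline_rate: float, relative_lift: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 for 80% power
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p2 - p1) ** 2) + 1

# Example: a 5% baseline conversion rate and a hoped-for 20% relative lift.
print(sample_size_per_variant(0.05, 0.20))  # roughly 8,000+ visitors per variant
```

Plugging in your own numbers makes it clear why a test run on a few dozen visitors rarely proves anything.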
The way you word survey questions can also be a source of misleading statistics. A recent UK study found that the way you phrase a question directly affects how a person answers it.
One example is survey questions that ask for confirmation, such as "Wouldn't you agree that this feature saves you time?" Essentially, you are including the answer in the question. Check your surveys for manipulative wording that might lead respondents toward a particular answer; influential phrasing can be subtle. Catch leading language by asking co-workers to review surveys before you send them to customers, and ask which parts of your questions, if any, suggest how they should respond. Confirmation bias is when you have a set result you want or expect to see, so you look only at data that affirms your belief.
Companies are most susceptible to this phenomenon when a single employee is giving a presentation. Whether the employee realizes it or not, they may not be providing a full picture of the data due to their own views, and that can lead to poorly informed decisions. Imagine, for example, a product manager who believes the Favorites feature is not worth further investment. To support her claim, she shows that very few customer support calls mention the feature.
As it turns out, she was looking at calls from only the last six months. When the product team analyzes support calls from long-term customers, it sees a much higher percentage of callers raising issues with the Favorites feature.
Everyone has unconscious biases, but not everyone has the same ones. If an employee comes to you with a proposal, have another team member review the project idea and the presentation. Each person approaches the data differently, so someone is likely to notice if the data is skewed toward one perspective.
Offer training to help employees become aware of their biases. This is especially important when it comes to internal hiring and employee development decisions. Training can help teams avoid misleading statistics that could negatively affect business decisions ranging from product features to team diversity. As mentioned at the beginning of this article, roughly a third of surveyed scientists admitted to questionable research practices, including withholding analytical details and modifying results!
It becomes hard to believe any analysis! Insightful graphs and charts are built from a few basic but essential elements. Whatever type of data visualization you choose, it must clearly convey what is being measured, on what scale, and over what range. Absent these elements, visual data representations should be taken with a grain of salt, keeping in mind the common data visualization mistakes people make. Intermediate data points should also be identified, and context given, wherever that adds value to the information presented.
With the increasing reliance on automated tools to compare large numbers of data points, following these visualization best practices matters more than ever.
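As a quick reference, here is a minimal matplotlib sketch of a chart that carries those essentials. The sign-up figures and the source note are placeholder values used only to show the labeling:

```python
# A deliberately plain but fully labeled chart: title, axis labels, units,
# a zero-anchored Y-axis, and a note on where the data came from.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr"]
signups = [120, 135, 150, 180]  # hypothetical monthly sign-ups

fig, ax = plt.subplots()
ax.plot(months, signups, marker="o")

ax.set_title("New account sign-ups per month")  # what is being measured
ax.set_xlabel("Month")                          # the range covered
ax.set_ylabel("Sign-ups (count)")               # the unit
ax.set_ylim(0, 200)                             # scale anchored at zero
fig.text(0.01, 0.01, "Source: internal CRM export (placeholder)")

plt.show()
```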
The last of our most common examples of misused statistics and misleading data is, perhaps, the most serious. Purposeful bias is the deliberate attempt to influence data findings without even feigning professional accountability, and it most often takes the form of data omissions or adjustments. Selective bias is slightly more discreet, and easy to miss for anyone who does not read the fine print.
It usually comes down to the sample of people surveyed. For instance, the nature of the group matters: asking a class of college students about the legal drinking age, or a group of retired people about the elderly care system, will skew the answers. Another way of creating misleading statistics, closely linked to the choice of sample, is the size of that sample. When an experiment or survey is run on an insignificant sample size, not only are the results unusable, but the usual way of presenting them, as percentages, becomes outright misleading.
Providing only the percentage of change, without the total numbers or the sample size, is just as misleading: a "50% increase" can mean going from two customers to three just as easily as from 20,000 to 30,000. Likewise, the sample size you need is influenced by the kind of question you ask, the statistical significance you require (a clinical study versus a business study), and the statistical technique you use. If you perform a quantitative analysis, very small sample sizes are usually invalid.
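A quick way to see the problem is to put a confidence interval around a percentage taken from a small sample. The sketch below uses a simple normal-approximation interval, purely for illustration:

```python
# 95% confidence interval for an observed proportion (normal approximation).
from math import sqrt

def proportion_ci(successes: int, n: int, z: float = 1.96):
    p = successes / n
    margin = z * sqrt(p * (1 - p) / n)
    return max(0.0, p - margin), min(1.0, p + margin)

# "60% of respondents agree" sounds identical in a headline either way...
print(proportion_ci(6, 10))      # n = 10   -> roughly (0.30, 0.90)
print(proportion_ci(600, 1000))  # n = 1000 -> roughly (0.57, 0.63)
```

Both samples report that "60% agree", but the ten-person version is statistically compatible with anything from about 30% to 90%.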
Misleading statistics in the media are quite common. In September 2015, during a U.S. congressional hearing on Planned Parenthood funding, Rep. Jason Chaffetz presented a chart that, based on its structure, appears to show the number of abortions growing substantially over the period covered while the number of cancer screenings substantially decreased.
The intent is to convey a shift in focus from cancer screenings to abortion, and the plotted lines make the yearly number of abortions appear greater than the yearly number of cancer screenings. Yet closer examination reveals that the chart has no defined Y-axis, which means there is no definable justification for the placement of the two lines relative to each other.
Politifact, a fact-checking website, reviewed Rep. Chaffetz's chart and re-plotted the same information on a clearly defined scale. Once placed within such a scale, it becomes evident that while the number of cancer screenings has in fact decreased, it still far outnumbers the quantity of abortion procedures performed yearly. As such, this is a prime misleading statistics example, and some could argue bias, considering that the chart originated not from the Congressman but from Americans United for Life, an anti-abortion group.
A notorious courtroom example is the Sally Clark case. After both of her infant sons died suddenly, an expert witness testified that the chance of two children in the same family dying of sudden infant death syndrome was about one in 73 million, a figure obtained by squaring the single-death probability and thereby wrongly treating the two deaths as independent. Based on this argument, among some other overlooked factors, Sally Clark was convicted. A statistician later showed that, if you instead start from the fact that two sudden unexpected deaths had occurred and compare the possible explanations, the chances of Sally Clark's innocence were roughly two in three.
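The underlying error is often called the prosecutor's fallacy: a tiny probability of the evidence assuming innocence is not the same as a tiny probability of innocence given the evidence. The sketch below uses rough placeholder rates, chosen only to reproduce the roughly two-in-three figure quoted above; they are not the actual trial statistics:

```python
# Prosecutor's fallacy, illustrated: compare competing explanations for the
# same observed evidence instead of quoting one small probability in isolation.
p_double_sids = 1 / 100_000    # assumed rate of two natural cot deaths per family
p_double_murder = 1 / 200_000  # assumed rate of a double infant murder per family

# Given that two infant deaths did occur, weigh the two explanations directly.
p_innocent = p_double_sids / (p_double_sids + p_double_murder)
print(f"P(natural causes | two deaths) = {p_innocent:.2f}")  # 0.67, i.e. two in three
```

Both assumed rates are tiny, yet once you condition on the deaths having happened, the natural explanation comes out as the more likely one.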
When looking at statistics, consider the source of the data, whether it comes from a sample or a controlled experiment, and all the other factors that tie into the analysis. Look out for the tricks used to distort the truth and deliberately steer readers toward a preconceived conclusion. Make sure the data is accurate, and treat the truth as the highest priority, whether you are the viewer or the one collecting the data.

Wendy is a data-oriented marketing geek who loves to read detective fiction or try new baking recipes.
She writes articles on the latest industry updates and trends.