Statistics Done Wrong by Alex Reinhart is an awesome guide that points out holes in the reasoning of many scientific studies. I initially bookmarked it to double-check myself: when dealing with survey and request data, it's easy to present the picture you want to find instead of what is actually there. However, Statistics Done Wrong deals with issues far beyond the data I pull for departmental reports.
Each error is explained with a hypothetical and/or real example of a study that contained statistical errors. This makes it much easier for a layperson to wrap their mind around a subject that affects the authors of the very studies and articles they read. As an example: after expounding on the many ways scientists err in their quest to prove or disprove a hypothesis, Reinhart includes an excellent example of double-checking:
courtesy of John Ioannidis and Jonathan Schoenfeld. They studied the question “Is everything we eat associated with cancer?”[1] After choosing fifty common ingredients out of a cookbook, they set out to find studies linking these ingredients to cancer rates – and found 216 studies on forty different ingredients. Of course, most of the studies disagreed with each other. Most ingredients had multiple studies claiming they both increased and decreased the risk of getting cancer.
Another interesting tidbit: turning right on red wasn't broadly permitted until the 1970s, when it was adopted to save gas during the fuel crisis, and the safety studies used to justify it were too underpowered to detect the resulting increase in accidents.