There are many news articles and blog posts that discuss scientific findings. Headlines touting the next best diet or the newest superfood are posted every day, so what should you believe?
Have you ever wondered whether these authors are summarizing the research accurately? Not everyone has a background in nutrition or science; I mean, have you ever attempted to read a 58-page research article?
This publication from the European Food Safety Authority (EFSA) about an insecticide that may harm bees was mentioned in a couple of blogs earlier this month. Civil Eats posted EU Steps Up for Bees and U.S. Backtracks, which discussed the research and its impacts here and in the EU.
Paul Towers, a guest blogger for Civil Eats, is also the Organizing and Media Director at Pesticide Action Network North America. While Towers may be biased against pesticides and insecticides, he did a decent job of providing his audience with an accurate portrait of what the EFSA study was about and its effects.
A reading from the Harvard School of Public Health provides questions readers should ask when reading an article that discusses scientific findings. While that reading focuses on diet and nutrition research, similar questions apply to this pesticide/insecticide research.
The first question: Are they simply reporting the results of a single study? If so, where does it fit in with other studies on the topic? Towers says:
“several recent reports, including one from the European Food Safety Agency (EFSA), indicate that three neonicotinoid insecticides pose an unacceptable hazard to honey bees. Additionally, EFSA found that the industry-sponsored science — upon which regulatory agencies’ claim of safety have relied — are fatally flawed.”
While ‘several’ studies do not carry much weight compared to hundreds, EFSA also found that the industry-sponsored science was flawed. No further detail was given, but we can assume this study will carry more weight than the industry-sponsored ones.
Second question: how large was the study? While my eyes glazed over for most of my read-through, exact numbers were given for many of the studies. The research included different risk assessments at different tiers: some lasted three years at three different locations, while others ran just a month with fewer than 100 samples. Still, there was enough data reported for the study to seem reliable and credible.
Third question: did the study look at real disease endpoints? In nutrition research, this question asks whether the study ran long enough to capture long-term results and accurate relationships. The EFSA insecticide study discussed data gaps that occurred because most of the research used honey bees. While Towers never mentioned this in his post, honey bees are the type of bee used for commercial pollination, and their populations are in serious decline.
While Towers did not go into much detail about the study, he drew the same conclusions that I did when taking a deeper look at it. I find that blog posts can sometimes disseminate information to the public more accurately, because their authors are passionate about the topics they write about and make sure to gather as much credible information as possible, whereas news outlets employ journalists from all different backgrounds who are paid when they attract an audience.