To me, one of the most interesting aspects of working in agriculture is the ongoing research and the way that research is applied in the field (or the corral). There’s a good deal of science behind many of the products beef producers use regularly, whether it’s vaccines or new forage varieties. There’s science behind many on-farm practices, too, although other factors often come into play (animal welfare and needs, economics, efficiency, a producer’s own strengths, how that practice affects the rest of the operation, etc.).
There is, of course, no shortage of snake oil and misinformation in the world, either. I’m most often exposed to it by someone trying to sell me something horse-related or, like most of the population, by someone pushing a health product. In an ideal situation, here is how my thought process goes before I actually buy into whatever they’re selling:
1. Prove that there is a problem, and that it is likely to affect me (or my mules).
2. Prove what is causing the problem.
3. Prove that this recommendation is the only or the best solution.
4. Prove that it’s worth paying for or worth changing practices.
To me, proof is not “it worked for some stranger on Facebook.” Slightly more interesting, but still inconclusive, is someone saying “it worked for me.” How do I know that one case wasn’t a fluke? Show me the science, or at least the logic, I say.
This all sounds simple and surefire on paper, but we all know the real world is more complicated. There are all kinds of things that could derail this process.
One complication is that the science is sometimes contradictory. Ron Clarke has an interesting piece on the debate around chronic wasting disease research. There are researchers who claim it is not caused by prions, but by bacteria. No doubt everyone would be thrilled if there was a treatment or vaccine in the works for the diseases caused by rogue prions. But it seems to me that wishful thinking has helped drive that branch of research into transmissible spongiform encephalopathies (TSEs), although some readers may disagree.
The thing is, scientists on both sides of the issue are convincing, Clarke writes. “Little wonder we confuse readers when we always stress that science is good.”
Steve Dittmer’s column touches on conflict of interest issues in research by delving into the process of drafting dietary guidelines in the U.S. The National Academies of Sciences, Engineering, and Medicine has reviewed the whole process, and recommended taking into account financial and non-financial conflicts of interest when choosing members of the Dietary Guidelines Advisory Committee (which is the group that creates America’s version of the food guide). Published research, academic or professional advancement and organizational memberships all constitute non-financial conflicts of interest.
But that doesn’t mean scrubbing that committee of all conflicts of interest is necessarily the best course.
“It may not be possible to eliminate conflicts of interest but transparency allows them to be managed and limited,” Dittmer writes. After all, he adds, the people with the needed expertise might also have some sort of conflict of interest.
Even setting aside the potential for conflicts of interest, the research on a given topic can be contradictory for any number of reasons. The world is a big, complicated place, with many factors that can’t be controlled or accounted for as they would be in a lab.
One of my favourite podcasts is a show called Science Vs. It starts with a question and goes through the scientific evidence on either side of the issue. A recent episode looked at the research into moderate alcohol consumption and health. There have been some studies linking moderate alcohol consumption and heart health benefits. However, it turns out that in some of those studies, the non-drinkers were actually “sick quitters” — people who had quit drinking alcohol because of serious health issues. The moderate drinkers were, of course, quite healthy in comparison, but the results likely would have been different if they’d studied people who’d never touched a drop. How often does something like that happen in these studies, I wondered after listening to it.
Even more depressing, research is showing a strong correlation between alcohol consumption and several types of cancer. The official word right now is that there is no safe amount of alcohol when it comes to cancer risk. But before you dump that Malbec, you should know that even the researchers who are finding links between cancer and alcohol have a favourite drink, at least in this podcast. It just goes to show there’s more to life than avoiding every risk, I suppose.
Given all this complex, contradictory information, it’s no wonder people are confused. As soon as you add emotion or urgency to a decision, it’s tougher. And if science doesn’t have a clear solution, or the solution is more than a person can stomach, there’s always someone else happy to sell a more palatable answer.
I imagine scientific reasoning must be like learning to shoot hoops. You’re not going to sink every shot, but the more you practice the technique, the better your aim. Eventually the process becomes like muscle memory. We all know people like this — they might be, for example, veterinarians, producers, nurses, teachers, or mechanics. Whether it’s because of their jobs or their natural inclinations, they have a much better chance of sinking a free throw under pressure than the rest of us (and hopefully they can admit that sometimes they miss, too).
As for all the writers covering science-type topics in this magazine and others, I will be sure to toast your hard work with my next glass of Malbec.