You've probably had a review of the
scientific method in every science class you've ever taken. It's about
research and about conducting GOOD research. Because, let's be honest,
research results are only as good as the experimental procedures. If you
have a flawed experimental design, your results mean NOTHING! Designing a
good research study is extremely difficult to do; you must account for all kinds of
variables... And, if you are anything like me, once you start your research
project, you will discover a lot of variables you didn't even think of before.
In one of my undergraduate classes, we had to
conduct research experiments, first with a group and then one all on our
own. Being the hungry, serious students we were, our group
decided we'd much rather use our class time to go out to lunch (in our defense,
the course was from noon to 1:00 every day), so we designed our experiment
around the amount of food people ate at a buffet versus at a non-buffet. Hey,
no one said this was groundbreaking research! So, it wasn't until we got 'in the field' (i.e., out to lunch) that we realized we had never set an operational definition of what 'a lot' of food was. Were we going to count plates? No, that wouldn't have been very accurate; buffet plates are small and sit-down plates are typically larger. Could we weigh plates? Probably,
but we weren't brave enough to take on that task. And, plates don't
always weigh the same, so the results of that wouldn't have been accurate
anyway. We ended up completing our experiment the best we could, probably estimating the amount of food someone ate (again, that is so subjective; what is a lot of food to someone else may not seem like that much to me). We did
this experiment probably 15 years ago, so honestly, I've forgotten the exact
results, but I believe we concluded that people who went to a buffet ate a higher
quantity of food than those who did not. Shocking, right?! (Please note my sarcasm!) Bottom line: designing a quality experiment is tough to do!
Another thing these modules bring to mind is
that the media doesn't really care about the research design; generally, they care about what will sell. Case in point: this article regarding the timeline of the vaccine-autism link controversy: http://theweek.com/article/index/242395/autism-and-vaccines-a-timeline-of-the-dubious-theory-and-the-ongoing-debate#axzz33KiIZUCZ The article states that Andrew Wakefield, the head researcher, "misrepresented or altered the medical histories of all 12 of the patients" involved in the groundbreaking 1998 study. The editor-in-chief of the journal that published the report said Wakefield's work "seems to be a deliberate attempt to create an impression that there was a link by falsifying the data."
Unfortunately, the retraction of the research
didn't seem to make nearly the splash that the initial, erroneous reports
did.
I am not prepared to hash out the pros and
cons of vaccinations; I'm not that educated on the topic. My point
is (I'm getting there, I promise!) that research that is the 'loudest' or most
publicized by the media isn't necessarily the most sound. Let's be
honest, with the internet we can pretty much find anything to back up just
about any position we may have on any given subject. It is up to the consumer (that's YOU!) to evaluate whether the claims seem legit.
Another example: Anyone cruise through
Facebook and see those advertisements that say you can lose 5 lbs. in three days? As consumers (and scientists), we have to be skeptical.
Does that claim make sense? I guess, sort of. You could lose
5 lbs. of (water) weight in three days. But if we think just a tad
deeper, we realize that those results will go out the window once we drink any
liquids, so it's not like the user would lose 5 lbs. of fat, which the consumer
may think they are signing up for. If my calculations are correct, and that's a big IF because remember, math and I aren't really friends, it would take a deficit of something like 17,500 calories to lose 5 lbs. of fat (see the quick back-of-the-envelope check below). What does this company have to gain from such claims? Obviously, they want to gain our business. My point (again, I will get there) is that, aside from evaluating research, we must also evaluate who is
doing the research and what they have to gain from the research results they
are publishing.
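For anyone who wants to check my math, here's a quick back-of-the-envelope sketch. It assumes the commonly cited rule of thumb of roughly 3,500 calories per pound of body fat, which is an estimate, not an exact number, so treat the output as a sanity check rather than gospel.

```python
# Back-of-the-envelope check of the "lose 5 lbs. in three days" claim.
# Assumption: the widely cited (and only approximate) figure of about
# 3,500 calories per pound of body fat.
CALORIES_PER_POUND_OF_FAT = 3500

pounds_claimed = 5
days = 3

total_deficit = pounds_claimed * CALORIES_PER_POUND_OF_FAT  # 17,500 calories
daily_deficit = total_deficit / days                        # ~5,833 calories per day

print(f"Total calorie deficit needed: {total_deficit}")
print(f"Deficit needed each day for {days} days: {daily_deficit:.0f}")
```

A deficit of nearly 6,000 calories a day is more than most people even eat in a day, which is why the only way that claim 'works' is as water weight.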
I'm reminded of something my grad school stats professor once said: "Liars can figure and figures can lie."
Any research or claims that you've come across that
you are skeptical about?