Or do they and I just can't find them?
It'd be nice to be able to easily track down the actual paper.
Is there a reason, copyright or otherwise, they don't include them?
asked by Jeff__1 (15236)
on March 29, 2012
at 11:19 AM
The story in your link is from a press release given out by the Endocrine Society. The study results were presented at a conference and had not yet been published in any journal.
Sometimes such studies never end up being published. I don't think the study in the story ever was. A number of the news stories on sciencedaily are from such conference presentations.
Scientific societies write these stories themselves and pass them out to news organisations to raise their profile and generate publicity.
Jakubowicz has only one similar follow-up study published on PubMed, and as you can see it was published four years after that original story.
on April 16, 2012
at 09:08 PM
I agree with the conference talk not being published, but I also want to chime in with this:
Most people in the general public are not well versed enough in research methods to properly critique an article. I think that (besides profit) articles are often limited to certain audiences because of the risk of misinterpretation or misguided use. Even if the reader is a "scientist" in a different field, he or she might not be aware of the variables that play a significant role in an area outside their expertise.

For example, a physicist (who understands the importance of methodology) might not be aware that early education interventions are almost always done with low-income populations, because that's where the funding goes (in the US). He might read about a method "found" to significantly improve language skills in article X, and become frustrated when his child doesn't show an increase in words over a few months, because article X told him it would work!! What the parent might not realize is that his child's baseline scores and those of the children in the study were significantly different. The "ceiling" (highest possible score) is closer for his child, meaning he can only improve so much, while the lower-income children in this magical study improved by leaps and bounds because they started so much lower.
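To make the ceiling-effect point concrete, here's a minimal sketch with made-up numbers (the score ceiling and the two baseline scores are purely hypothetical, not from any study):

```python
CEILING = 100  # hypothetical maximum score on the language measure

def max_possible_gain(baseline_score):
    """Room left to improve before hitting the test's ceiling."""
    return CEILING - baseline_score

# A child who already scores 92 can gain at most 8 points, while a
# study participant starting at 40 has 60 points of headroom, so the
# same intervention can look far more dramatic in the study sample.
print(max_possible_gain(92))  # 8
print(max_possible_gain(40))  # 60
```

The arithmetic is trivial on purpose: the size of a reported gain is bounded by how far below the ceiling participants started, which is exactly what the abstract usually won't tell you.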
You mentioned linking the abstract. The abstract alone still won't provide enough information. Abstracts are very brief and cover only the main points. They don't go into depth about limitations, attrition, or dropout rates, and they provide little information about the participants included in the study. By reading only the abstract, you get just a glimpse of the entire study. The authors might write in the conclusion that the limitations are far too great to make ANYTHING of the results due to sample size, lack of participant diversity, poor generalizability, etc. This might not make it into the abstract.
A study might report an INCREASE in heart disease in, say... those who consume a meat-heavy diet. BUT an important question to address before blasting it on the evening news is: what was the p value? Was the result statistically significant? Is there practical significance in reducing meat consumption? What method of analysis did the study use? Could they only do basic comparisons, or did they use more advanced analysis like regression? Did they use control variables? Are there moderator variables (such as "does heart disease in heavy meat eaters DEPEND on whether the person also ate lots of vegetables vs. few vegetables")? Glancing at the few lines of the abstract or results section is not enough.
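The moderator idea can be sketched in a few lines of Python. All the disease rates below are invented for illustration only; the point is just that the apparent "meat effect" changes size depending on vegetable intake:

```python
# Hypothetical heart-disease rates (%) by meat consumption and
# vegetable intake -- made-up numbers, not from any real study.
rates = {
    ("heavy_meat", "few_veg"): 18.0,
    ("heavy_meat", "many_veg"): 9.0,
    ("light_meat", "few_veg"): 10.0,
    ("light_meat", "many_veg"): 8.0,
}

def meat_effect(veg_level):
    """Rate difference attributable to heavy meat within one vegetable group."""
    return rates[("heavy_meat", veg_level)] - rates[("light_meat", veg_level)]

# The meat "effect" is 8 points among low-vegetable eaters but only
# 1 point among high-vegetable eaters: vegetable intake moderates it.
print(meat_effect("few_veg"))   # 8.0
print(meat_effect("many_veg"))  # 1.0
```

A headline that averages over both groups would report a single "meat increases heart disease" number and hide exactly the dependence the comment is warning about.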
I believe a major reason there is so much misinformation floating around is lay people (e.g. reporters) trying to make a big deal out of a single study. They fail to properly interpret the study or to note who the participants were, where it was conducted, etc. All of that matters. I don't know what the solution is, but I don't think simply providing the public with access to journals will solve things. In fact, it might create the opposite problem, as people take conclusions out of context or misinterpret findings.
on March 29, 2012
at 07:09 AM
Maybe that would make it too obvious that what's reported in science journalism doesn't accurately reflect what's reported in the study?
Normally you could Google the names of the study authors, since articles usually mention what journal the research was published in. In this case, however, it just says the research was presented at a meeting of The Endocrine Society. I think you can find it, though: 'Meal timing and composition influence ghrelin levels, appetite scores and weight loss maintenance in overweight and obese adults', found by searching for 'Jakubowicz' on PubMed.
on March 29, 2012
at 12:18 PM
I'm seeing a "Story Source" cited at the very bottom of the article. Looks like they referenced all the material they had.
on March 29, 2012
at 10:47 AM
Because most news sites aren't in the business of giving you links out of their own domain. When I read scientific journals, typically the only references in a paper that have links are the ones from the same publisher.
Yeah, it would be nice to have links to everything, but it's still a business...