A scan through health sites makes it apparent that many physical conditions are continually being linked to genes or biomarkers. A study led by an expert in scientific study design at the Stanford University School of Medicine suggests that reported associations between genes and diseases are often exaggerated, which can lead professionals to wrong conclusions when the initial findings are not backed by larger studies.
Examples of the studies cited include the link between a BRCA1 mutation and colon cancer, between C-reactive protein in the blood and cardiovascular disease, and between homocysteine levels and vascular disease. According to the scientists, once an association of a gene or a biomarker with a disease is reported and cited, the link tends to keep being perceived that way. Each new study builds on previous studies and citations used as references, and landmark studies that are mentioned often become accepted results, with little heed paid to studies that contradict the analysis.
“The exaggeration is likely the result of statistical vagaries coupled with human nature and the competitive nature of scientific publication. No research finding has no uncertainty; there are always fluctuations. This is not fraud or poor study design, it’s just statistical expectation. Some results will be stronger, some will be weaker. But scientific journals and investigators like to publish big associations,” said John Ioannidis, MD, DSc, chief of the Stanford Prevention Research Center.
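To make that point concrete, here is a minimal sketch in Python of the kind of statistical fluctuation Ioannidis describes. The true effect size, the noise level and the number of studies are assumed values chosen purely for illustration, not figures from the paper; the sketch simply shows how picking the most striking of many noisy estimates inflates the apparent association.

```python
import math
import random

# Illustration only: many small studies estimate the same modest true effect,
# each with sampling fluctuation. If only the most striking result gets
# published and cited, the reported effect looks much larger than the truth.

random.seed(0)
TRUE_LOG_RR = 0.10   # assumed true effect, i.e. a relative risk of about 1.1
NOISE_SD = 0.15      # assumed study-to-study sampling fluctuation (log scale)

estimates = [random.gauss(TRUE_LOG_RR, NOISE_SD) for _ in range(50)]

average_rr = math.exp(sum(estimates) / len(estimates))  # what repeated studies converge to
headline_rr = math.exp(max(estimates))                  # the single "biggest association"

print(f"true relative risk        ~ {math.exp(TRUE_LOG_RR):.2f}")
print(f"average over all studies  ~ {average_rr:.2f}")
print(f"most extreme single study ~ {headline_rr:.2f}")
```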
As part of the study, 35 widely cited analyses were examined. The larger follow-up studies generally showed weaker associations between the biomarker and disease risk than the original reports, and only about one in five of the associations increased patients’ relative risk for a disease by more than 1.37 (a 37 percent increase in risk) when re-examined in the largest studies.
These 35 highly cited studies were published between 1991 and 2006 in 10 well-known biomedical journals, and each was subsequently cited by at least 400 papers, some by thousands. It emerged that the majority of these heavily cited studies suggested stronger effects than were observed in the largest analyses of the same markers and outcomes. The scientists also observed that meta-analyses were more likely to be accurate than small pilot studies.
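The gap between a small pilot study and a large follow-up can be shown with a short simulation. The group sizes, baseline risk and true relative risk below are assumptions made for the example, not data from the JAMA analysis; the point is only that the same underlying effect is estimated far more erratically in a small sample.

```python
import random

# Illustration only: simulate a cohort of marker carriers vs. non-carriers and
# compare the relative risk observed in a small pilot with a large follow-up.

rng = random.Random(1)

def estimated_relative_risk(n_per_group, true_rr, baseline_risk=0.05):
    """Simulate one cohort study and return the observed relative risk."""
    carrier_cases = sum(rng.random() < baseline_risk * true_rr for _ in range(n_per_group))
    control_cases = sum(rng.random() < baseline_risk for _ in range(n_per_group))
    return (carrier_cases / n_per_group) / max(control_cases / n_per_group, 1e-9)

print("small pilot study  (200 per group):", round(estimated_relative_risk(200, true_rr=1.2), 2))
print("large follow-up (50,000 per group):", round(estimated_relative_risk(50_000, true_rr=1.2), 2))
```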
The scientists further suggest that investigators should follow scientific methods rigorously and that verification and replication of results is crucial before any study is accepted as fact. This could be achieved by implementing a system of continual review and reassessment of each reported link between a biomarker and a disease. Just as flipping a coin repeatedly reveals the true ratio of heads to tails, the real strength of an association should become apparent through ongoing reassessment. The scientists conclude that validating original published findings with subsequent large-scale evidence is vital for progress in the field of biomarkers and risk association.
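The coin-flip analogy itself can be run as a quick simulation; the number of flips and the reporting checkpoints below are arbitrary choices made for illustration.

```python
import random

# The coin-flip analogy: early on, the observed proportion of heads can sit far
# from the true 0.5, but continual reassessment over more flips pulls it toward
# the real value, much like re-checking a biomarker-disease link against
# ever-larger studies.

rng = random.Random(42)
heads = 0
for flips in range(1, 10_001):
    heads += rng.random() < 0.5
    if flips in (10, 100, 1_000, 10_000):
        print(f"after {flips:>6} flips: proportion of heads = {heads / flips:.3f}")
```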
The study is published in the Journal of the American Medical Association.