The New Peer Review: Why ‘Unbiased’ Science Is Now Often Misleading

Jennifer Margulis
Joe Wang
8/24/2022 | Updated: 10/12/2022

Peer-reviewed scientific publishing works like this: A scientist or a team of scientists has a scientific question and designs and conducts an experiment to try to answer it. The experiment may take months, years, or even decades.

Once the scientists have collected and analyzed the experiment’s results, they write up their findings and draw conclusions based on the knowledge already accepted in the field, their new discovery, and their educated speculation about what remains unknown.

Then, they submit their article to a scientific journal in their field of study.

When a journal editor receives the article, the editor reads it carefully and either rejects it outright or sends it to other recognized experts in the field, who weren’t involved with the study, to review the findings and the write-up. Once the experts weigh in, the editor decides whether to reject or accept the paper, in most cases with notes asking the authors to revise their submission.

Peer reviewers will often ask the researchers insightful questions or query parts of the findings in the paper. These queries help the researchers refine their ideas, review their findings, and double-check that their data, and their analyses, are correct.

This sometimes lengthy peer-review process is meant to ensure that journals publish scientific articles that make a real contribution to our understanding of the field, whether it’s chemistry, biology, physics, social science, or any other subject.

2.6 Million Studies a Year

On the order of 2.6 million scientific studies are published every year, according to the National Center for Science and Engineering Statistics. This explosion in published science means there may be as many as 30,000 peer-reviewed journals providing scientists with an outlet for their findings. The result: It’s increasingly difficult to distinguish good science from bad science.

Good science is work that has a high level of integrity and transparency, is conducted in an unbiased way, and leads to findings that can be replicated by other scientists.

Bad science is often ego-driven or industry-sponsored: published not for the good of advancing knowledge or helping people, but to mislead the public, often for financial gain. For-profit industries have used, and continue to use, bad science to convince consumers to buy their products.

Even more damaging may be the role of ideology, when certain conclusions become a matter of political allegiance rather than scientific merit.

Three American researchers took advantage of this weakness to troll academia, getting seven hoax studies published in sociology journals.

One study, “Dog Parks Are Petri Dishes for Canine ‘Rape Culture,’” made it into the journal Gender, Place and Culture.

In the process, the researchers learned something disturbing:

“What appears beyond dispute is that making absurd and horrible ideas sufficiently politically fashionable can get them validated at the highest level of academic grievance studies,” James Lindsay, one of the researchers, said in a video about the project.

Junk Science

Recent history shows how “junk science” can have negative repercussions that harm human and planetary health.

For example, in 1948, a husband-and-wife team at Harvard University, Olive Watkins Smith and George Van Siclen Smith, published an article asserting that a synthetic hormone, diethylstilbestrol (DES), not only prevented miscarriage but also made a normal pregnancy “more normal.”

Drug manufacturers copied and distributed the Smiths’ study to thousands of medical doctors to encourage them to prescribe DES.

The Harvard research was shoddy at best: The Smiths used a sample of pregnant women too small to support statistically significant conclusions, and they had no control group. The Smiths also failed to disclose that their research was funded by the drug industry.

Largely based on this junk science, an estimated 5 million to 10 million pregnant women in America took DES. Yet DES was neither helpful nor benign. It caused miscarriage, as well as an aggressive hormone-induced reproductive cancer in teenagers whose mothers had taken it. DES was banned for use in pregnancy in 1971.

A more famous example started in the 1950s when the tobacco industry began a sophisticated public relations campaign to counteract the peer-reviewed science that showed that smoking was harmful to human health.

Though it was known by 1953 that smoking caused lung cancer, industry-sponsored science so effectively muddied the scientific waters that the connection wasn’t acknowledged by public health authorities until the early 1990s.

More recently, in the 1990s, when biologist Tyrone Hayes discovered that a common pesticide, atrazine, was so endocrine-disrupting that it turned male frogs into females, Syngenta, the company that makes the pesticide, did everything it could to keep that information from the public. Two class-action lawsuits revealed that Syngenta’s goal was to publicly discredit Hayes in order to make environmentalists question the validity of his research.

Publishing poorly designed studies that couldn’t be replicated proved an effective strategy for keeping the Environmental Protection Agency from regulating Syngenta’s $14 billion a year in pesticide and seed sales. In 2014, as reported by The New Yorker, Syngenta was giving research money to 400 academic institutions around the world.

‘Sneer Review’

The research that scientists publish affects their job prospects, livelihood, reputation, and even friendships. Given the explosion of scientific publications, it’s easy to see how the peer-review process can go awry.

The Epoch Times spoke with a professor who spent more than 25 years in a top 10 medical school. This scientist asked not to be named, for fear of reprisal.

“I call it sneer review,” the scientist said. “There is tremendous bias. Reviewers ignore data that doesn’t fit with what they already believe.”

The scientist said that certain fields have fewer problems with special interests than others, and certain topics—including the safety of modern medicine and, especially, the safety of vaccines—tend to push ideological buttons.

“The idea in science should be that we just push toward finding out the answer. We have a hypothesis, we ask questions, we test the hypotheses, we collect more data,” the scientist said. “That’s how we move forward. But when it gets polarized, the sneer-review phenomenon starts to happen. Then it becomes a more ideological confrontation.

“People will try to publish total nonsense for ideological reasons.”

When Ideology Drives Decisions

When peer-reviewed studies have the potential to harm multibillion-dollar industries, they often get retracted, several scientists told The Epoch Times.

“Follow the silenced science,” said James Lyons-Weiler, CEO and director of the Institute for Pure and Applied Knowledge (IPAK). He has published more than 50 peer-reviewed studies on a variety of topics and recently had a controversial study retracted.

It’s especially difficult to get research that calls vaccine safety into question published in the first place, Lyons-Weiler said, and when such studies do appear, they are often summarily retracted by controversy-averse editors.

“They tend to be retracted after critique by anonymous critics,” he said. “This is a problematic new development. The journals are retracting based on criticism from anonymous reviewers, instead of publishing the critique and allowing the authors to rebut. That means the critics’ comments are not peer-reviewed.”

The retraction may come a week after the science is published, or more than 10 years later.

Canceling Critics, a Technique to Silence Science

Peter Gotzsche, a Danish medical doctor who worked for the pharmaceutical industry for almost a decade, saw firsthand how his bosses would manipulate data that didn’t fit the industry’s agenda. Largely as a result of that frustration, Gotzsche co-founded the Cochrane Collaboration, a nonprofit initiative with the explicit goal of keeping bias out of science.

For years, the Cochrane Collaboration was considered the gold standard of unbiased information, and Gotzsche, who has himself published more than 50 peer-reviewed articles and eight books, was hailed as a crusader for scientific integrity.

In September 2018, however, Gotzsche was voted off Cochrane’s board (six in favor, five opposed, and one abstention). This move led four board members to resign in protest. He was also fired from his position as director of the Nordic Cochrane Center and was suspended from the hospital where he worked.

Gotzsche told journalist and documentary filmmaker Bert Ehgartner that he believed his dismissal came because he and two co-authors criticized a Cochrane review that found “high-certainty evidence” that a vaccine against human papillomavirus (HPV) protected women and girls from cervical precancer. Gotzsche critiqued the review, pointing out that Cochrane had excluded almost half the trials and ignored glaring safety signals about the HPV vaccine.

A hero of scientific integrity to many, Gotzsche is now being ostracized by his colleagues and characterized as an “industry scold.”

“A new scientific truth does not triumph by convincing its opponents and making them see the light,” German physicist Max Planck famously wrote in his 1950 autobiography, “but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

According to Lyons-Weiler, science continues to move forward even without funerals. IPAK is currently engaged in a second phase of a study examining the health outcomes of vaccinated versus unvaccinated children. This time, it has the participation of medical doctor Russell Blaylock, a neurosurgeon who has warned against the toxicity of aluminum in vaccines for more than two decades.

In the meantime, do the problems with peer review mean we should reject new scientific findings? Not necessarily. But watch for the warning signs, and ask the question: Who is David and who is Goliath?

The discerning reader, whether scientist, academic, ethicist, journalist, or layperson, will understand that any asserted scientific “fact” or “conclusion” must be combined with common sense, a healthy skepticism, and a closer look at those who stand to profit.

Jennifer Margulis, Ph.D., is an award-winning journalist and author of “Your Baby, Your Way: Taking Charge of Your Pregnancy, Childbirth, and Parenting Decisions for a Happier, Healthier Family.” A Fulbright awardee and mother of four, she has worked on a child survival campaign in West Africa, advocated for an end to child slavery in Pakistan on prime-time TV in France, and taught post-colonial literature to nontraditional students in inner-city Atlanta. Learn more about her at JenniferMargulis.net