100 Psychology Studies Replicated, More Than Half Failed

Posted: Aug 28 2015, 4:37am CDT | by Sumayah Aamir, in News | Latest Science News

  • Huge hurdles encountered in replicating the results of psychology studies

Scientists encountered huge hurdles in replicating results of psychology studies.

One hundred published psychological findings were put to the test, but fewer than half could be properly replicated. The result casts doubt on the reliability of the original findings and has fueled skepticism about the repeatability of important psychology experiments.

The study was carried out by 270 researchers spread across five continents. The results are published in the Aug. 28 issue of the journal Science. Twenty-two students and faculty from the University of Virginia were among the co-authors.

"For years there has been concern about the reproducibility of scientific findings, but little direct, systematic evidence. This project is the first of its kind and adds substantial evidence that the concerns are real and addressable," said Brian Nosek, a U.Va. psychology professor and coordinator of the study.

Apprehension about the repeatability of psychological findings had been building for years, but there was little direct, systematic evidence either way about whether results could be reproduced.

Now it seems the suspicions were justified: most of the studies examined could not be reliably reproduced, suggesting weaknesses in how the original experiments were designed, analyzed, or reported.

"With this project we established an initial estimate of the rate of reproducibility in psychology, and identified some evidence of possible influences on reproducibility," said Anup Gampa, a Reproducibility Project team member and Ph.D. candidate at U.Va. "This sets the stage for new research to examine how to improve reproducibility."

These issues demand attention; otherwise psychology is on thin ice as a science. This project is the first open, systematic probe of its kind into the matter.

And it has surfaced questionable practices in how some psychology studies are conducted. Ideally, results should be reproducible; when they are not, the problem should be addressed with serious intent.

"Scientific evidence does not rely on trusting the authority of the person who made the discovery," said Reproducibility Project team member Angela Attwood, a psychology professor at the University of Bristol. "Rather, credibility accumulates through independent replication and elaboration of the ideas and evidence."

A hallmark of science is that its experiments must be verifiable. When a claim is falsified, scientific knowledge evolves to account for it. Science does not rest on blind faith; it is built on observation and stringent standards.

If the results of an experiment cannot be repeated, they are effectively invalid as evidence. By this standard, psychology has not yet fully escaped the orbit of the humanities and is not a full-fledged hard science; a great deal of methodological cleanup remains before it becomes one.

"A replication team must have a complete understanding of the methodology used for the original research, and shifts in the context or conditions of the research could be unrecognized but important for observing the result," Elizabeth Gilbert, a Reproducibility Project team member and Ph.D. candidate at U.Va., said.

Psychology findings are published in journals that vary widely in selectivity: some accept articles only reluctantly, while others publish with ease. Of the 100 studies tested, the researchers successfully reproduced fewer than half.
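To make the tally above concrete, here is a minimal sketch of one common criterion for counting a replication as successful: the replication attempt finds a statistically significant effect (p < .05) in the same direction as the original. The function name and the study records below are hypothetical illustrations, not data or code from the Reproducibility Project, which also weighed other criteria such as effect sizes.

```python
def replicated(original_effect, replication_effect, replication_p, alpha=0.05):
    """True if the replication is significant and matches the original's direction."""
    same_direction = (original_effect > 0) == (replication_effect > 0)
    return replication_p < alpha and same_direction

# Hypothetical study records: original effect, replication effect, replication p-value.
studies = [
    {"orig": 0.42, "rep": 0.21, "p": 0.03},   # weaker but significant: counts
    {"orig": 0.30, "rep": 0.05, "p": 0.40},   # not significant: fails
    {"orig": -0.25, "rep": 0.10, "p": 0.01},  # wrong direction: fails
]

successes = sum(replicated(s["orig"], s["rep"], s["p"]) for s in studies)
print(f"{successes}/{len(studies)} replicated ({successes / len(studies):.0%})")
# → 1/3 replicated (33%)
```

Under a criterion like this, even a genuine effect can fail to replicate if the follow-up study is underpowered, which is one reason the project's authors caution against reading any single failed replication as a refutation.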

This is a sobering commentary on the state of psychological science, a field in which, too often, anything goes. A measure of openness and repeatability must be built into experiments if they are to retain their truth value.

Otherwise the social sciences are fighting a losing battle against bombastic nonsense. Improvements in methodology are needed; it will be quite a struggle, but it will have been worth it in the end.

"The findings demonstrate that reproducing original results may be more difficult than is presently assumed, and interventions may be needed to improve reproducibility," said Johanna Cohoon, a project coordinator with the Charlottesville-based Center for Open Science.

"Efforts include increasing transparency of original research materials, code and data so that other teams can more accurately assess, replicate and extend the original research, and pre-registration of research designs to increase the robustness of the inferences drawn from the statistical analyses applied to research results," said Denny Borsboom, a project team member from the University of Amsterdam who was involved in the creation of the Transparency and Openness Promotion Guidelines, recently published in Science.

The Author

Sumayah Aamir
Sumayah Aamir (Google+) has deep experience in analyzing the latest trends.
