Community Guest edition by
Apart from the usual commercial superficiality and thinly veiled marketing, my LinkedIn feed has changed over the last year. With a background in physics and semiconductor fabrication, and today serving customers in biotech, chemistry and life science, I see a lot of content from people in or around the global scientific community. And something is picking up.
Poor reproducibility. Selective reporting. Predatory publishing.
According to a recent survey in Nature, more than 70% of researchers have tried and failed to reproduce another scientist’s experiments, and more than half have failed to reproduce their own. The survey also hints at the underlying causes:
“The survey asked scientists what led to problems in reproducibility. More than 60% of respondents said that each of two factors — pressure to publish and selective reporting — always or often contributed. More than half pointed to insufficient replication in the lab, poor oversight or low statistical power. A smaller proportion pointed to obstacles such as variability in reagents or the use of specialized techniques that are difficult to repeat. But all these factors are exacerbated by common forces, says Judith Kimble, a developmental biologist at the University of Wisconsin–Madison: competition for grants and positions, and a growing burden of bureaucracy that takes away from time spent doing and designing research.” – Monya Baker, Nature, Vol. 533, p. 452, 26 May 2016
Many fields of science, from psychology to cancer biology, seem to be dealing with similar problems. A recent article by journalist Ed Yong sums up one example, decades of early research on the genetics of depression built on nonexistent foundations, and provides clues to the cause:
“Beyond a few cases of outright misconduct, these practices are rarely done to deceive. They’re an almost inevitable product of an academic world that rewards scientists, above all else, for publishing papers in high-profile journals—journals that prefer flashy studies that make new discoveries over duller ones that check existing work. People are rewarded for being productive rather than being right, for building ever upward instead of checking the foundations. These incentives allow weak studies to be published. And once enough have amassed, they create a collective perception of strength that can be hard to pierce”. – Ed Yong, The Atlantic
In a not-so-distant future, this race might raise the question of why we should trust science at all:
“People ask, well, if scientists are publishing crap, why should we believe global warming and evolution?” – Matthew Keller, associate professor, University of Colorado.
I will argue that in uncertain times, it must be in the interest of the general public to maintain a strong, well-funded scientific community, free of political interference, that can provide impartial, fact-based answers to the big questions we humans face in these years. Easier said than done.
While the reproducibility and mass-publication mega-trends are hard to battle as an individual researcher, more humble efforts to combat some of these problems have been suggested in my LinkedIn feed.
Put simply: go digital in the lab.
Such a simplified statement requires elaboration. In my LinkedIn feed, the digital lab is a big thing, at least from a vendor perspective. But if the central records of an experiment are kept on paper, based on manual readings of equipment values, supplemented by electronic data scattered across multiple sources, then replication becomes hard, especially once the person who performed the original experiment has left the lab. If simple experimental conditions and metrics are not monitored and cannot be controlled, how can high-level outcomes of an entire study be reproduced consistently?
In my daily work, it is quite common to visit laboratories where scales are read manually, pipetting is manual with manual calibration, temperature readings are taken by hand, and pH meters rely on infrequent manual calibrations. Instruments are rarely connected to a computer and almost never record data directly in a dedicated database system. Instrument communication is based on all sorts of proprietary protocols, and getting data out is often done with a USB drive and manual notes. Most work is manual rather than automated, and SOPs are non-existent. The modern lab has a long way to go in terms of digitalization.
Less than 10% of researchers are using an electronic lab notebook. – labfolder
Even starting with something as simple as a central lab notebook, many labs still have a long way to go, even though old-fashioned Laboratory Information Management Systems have been around for decades. With modern offerings such as labfolder, LabArchives, RSpace, Benchling and SciNote, the lab notebook should be electronic by now, but lab users do not see a problem. Culture still eats strategy for breakfast. When all generated data is recorded on a user-friendly, accessible platform, unintentional cherry-picking of results becomes harder, and sharing afterwards comes at no extra cost.
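To make the idea concrete, here is a minimal sketch of what a structured, machine-readable experiment record might look like. This is plain Python with an invented schema and made-up field values, not the API or data model of any of the vendors mentioned above:

```python
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class ExperimentRecord:
    """A hypothetical electronic lab notebook entry (illustrative schema only)."""
    experiment_id: str
    operator: str
    instrument: str          # which instrument, and when it was last calibrated
    reagent_lots: dict       # reagent variability is a known reproducibility factor
    readings: list           # raw measurements, recorded at capture time
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# Example entry with made-up values
record = ExperimentRecord(
    experiment_id="EXP-2019-042",
    operator="jdoe",
    instrument="pH-meter-3 (last calibrated 2019-06-01)",
    reagent_lots={"buffer": "LOT-7781"},
    readings=[{"metric": "pH", "value": 7.38}],
)

# A structured record can be searched, shared and audited later,
# unlike a handwritten note that leaves the lab with its author.
print(json.dumps(asdict(record), indent=2))
```

Even this toy schema captures the things the Nature survey flagged as hard to repeat: who did the work, on which instrument, with which reagent lots.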
As a small example, in our production lab, we use the Eupry system for temperature logging — a neat little box automatically sending temperature readings to the cloud over Wi-Fi. Due to the warm summers in Denmark the last couple of years, the temperature sometimes drifts out of specification. Because we know this, we can take appropriate measures. Simple things, but it helps us with the quality control to maintain high reproducibility.
This is why the merger of the instrument integrator cobuslab and the cloud-based notebook labfolder into LABFORWARD should raise a few eyebrows among people interested in the lab of the future. The promise of fully autonomous digitalization, from instrumentation and robots via cables and protocols to cloud data storage and one electronic lab notebook unifying all data, is a compelling vision of Industry 4.0 for the lab.
Such digitalization efforts must be part of the solution to the fundamental issues facing research in these years, with more than half of the researchers in the Nature survey agreeing that there is a significant ‘crisis’ of reproducibility. Digitalization cannot stand alone, but together with other initiatives, the trust once bestowed upon the scientific community can be reinstated.
Data is the value created by any lab.
Let us make it count.
About the author: Emil Højlund-Nielsen has an academic background as a researcher and project manager in materials science, using micro- and nanotechnology principles. His PhD investigated color metasurfaces in collaboration with LEGO and the car company FIAT, as a way to use structural nano-color in plastic products instead of traditional pigments, which fade over time. The project won the EU excellence award in 2018, and his results were published in Nano Letters and Nature Nanotechnology. Today, his work is cited more than 450 times in the literature. In 2015, he co-founded the startup Copenhagen Nanosystems, where he has been CEO for more than three years. The startup combines plastic labware for life science and biotech with nanotechnology to expand the functionality of simple existing laboratory equipment. The company has invented a plastic cuvette with nanotechnology and cloud-based software, making it easy to analyze biological samples and document the results.