Reproducible and transparent research practices in published neurology research
Research Integrity and Peer Review volume 5, Article number: 5 (2020)
Abstract
Background
The objective of this study was to evaluate the nature and extent of reproducible and transparent research practices in neurology publications.
Methods
The NLM catalog was used to identify MEDLINE-indexed neurology journals. A PubMed search of these journals was conducted to retrieve publications over a 5-year period from 2014 to 2018. A random sample of publications was extracted. Two authors conducted data extraction in a blinded, duplicate fashion using a pilot-tested Google form. This form prompted data extractors to determine whether publications provided access to items such as study materials, raw data, analysis scripts, and protocols. In addition, we determined if the publication was included in a replication study or systematic review, was preregistered, had a conflict of interest declaration, specified funding sources, and was open access.
Results
Our search identified 223,932 publications meeting the inclusion criteria, from which 400 were randomly sampled. Only 389 articles were accessible, yielding 271 publications with empirical data for analysis. Our results indicate that 9.4% provided access to materials, 9.2% provided access to raw data, 0.7% provided access to the analysis scripts, 0.7% linked the protocol, and 3.7% were preregistered. A third of sampled publications lacked funding or conflict of interest statements. No publications from our sample were included in replication studies, but a fifth were cited in a systematic review or meta-analysis.
Conclusions
Currently, published neurology research does not consistently provide information needed for reproducibility. The implications of poor research reporting can both affect patient care and increase research waste. Collaborative intervention by authors, peer reviewers, journals, and funding sources is needed to mitigate this problem.
Background
Scientific advancement is hampered by potential research flaws, such as the lack of replication; poor reporting; selective reporting bias; low statistical power; and inadequate access to materials, protocols, analysis scripts, and experimental data [1,2,3]. These factors may undermine the rigor and reproducibility of published research. Substantial evidence suggests that a large proportion of scientific evidence may be false, unreliable, or irreproducible [4,5,6,7,8]. Estimates of irreproducible research in the preclinical sciences range from 50 to 90% [9], a figure substantiated by recent surveys of scientists: roughly 70% of surveyed scientists reported being unable to replicate another scientist's experiment, and 90% agreed that scientific research is currently experiencing a "reproducibility crisis" [7].
Reproducibility is vital for scientific advancement because it enhances the credibility of novel scientific discoveries and mitigates erroneous findings. One review discussed potential pitfalls in fMRI reproducibility, such as scanner settings, consistency of cognitive tasks, and analysis methods [10]. Boekel et al. replicated five fMRI studies measuring a total of 17 structural brain-behavior correlations; after reanalysis, only one of the 17 correlations was successfully replicated [11]. Thus, practices related to transparency and reproducibility can be improved within fMRI and other neurology research.
Adopting open science in neurology would help mitigate irreproducible research, such as studies on brain-behavior correlation. Open science practices, such as data sharing, open access publication, sharing of protocols and methods, and study preregistration, promote transparency and reproducibility [12]. For example, preregistering a study helps guard against selective outcome reporting [13]. Selective outcome reporting occurs when discrepancies exist between outcome measures prespecified in trial registries or research protocols and the outcomes listed in the published report [14]. An audit of randomized clinical trials published in neurology journals found 180 outcome inconsistencies across 180 trials, with most inconsistencies favoring changes in accordance with statistically significant results. Additionally, only 55% of neurology trials were prospectively registered [15], an indication that neurology researchers are not adhering to transparency and reproducibility practices early in research planning. Reproducible research and open science practices are widely endorsed by a large proportion of authors; despite this support, evidence suggests that authors infrequently implement them [16,17,18].
Given the recent attention to the reproducibility crisis in science, further investigation is warranted to ensure the existence of reproducible and transparent research in the field of neurology. Here, we examine key transparency- and reproducibility-related research practices in the published neurology literature. Our findings from this investigation may serve as a baseline to measure future progress regarding transparency and reproducibility-related practices.
Methods
This observational, cross-sectional study used the methodology proposed by Hardwicke et al. [3], with modifications. We reported this study in accordance with the guidelines for meta-epidemiological methodology research [19] and, when pertinent, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) [20]. Our study did not use any human subjects or patient data and, as such, did not require institutional review board approval prior to initiation. We used the Open Science Framework to host our protocol, materials, training video, and study data in a publicly available database (https://osf.io/n4yh5/). This study was part of a comprehensive investigation of reproducibility across multiple clinical specialties.
Journal and publication selection
On June 25, 2019, one investigator (D.T.) searched the National Library of Medicine (NLM) catalog for all journals using the subject terms tag "Neurology [ST]." The inclusion criteria required that journals publish English, full-text manuscripts and be indexed in the MEDLINE database. The final list of included journals was created by extracting the electronic international standard serial number (ISSN) or the linking ISSN, if necessary. PubMed was searched with the list of journal ISSNs on June 25, 2019, to identify all publications. We then limited our publication sample to those published between January 1, 2014, and December 31, 2018. Four hundred publications within this period were randomly sampled for data extraction; the remaining publications were available but not needed (https://osf.io/wvkgc/).
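For illustration only, a random draw of 400 publications could be produced with a short script along the following lines. This is a minimal sketch, not the authors' actual procedure; the file names and seed are assumptions made for this example.

```python
import random

# Assumed input: one PMID per line, exported from the PubMed search
# (hypothetical file name; the authors' actual records are hosted on OSF).
with open("neurology_pubmed_2014_2018.txt") as f:
    pmids = [line.strip() for line in f if line.strip()]

random.seed(2019)                    # fixed seed so the draw can be repeated
sample = random.sample(pmids, 400)   # 400 publications, without replacement

with open("random_sample_400.txt", "w") as f:
    f.write("\n".join(sample))
```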
To estimate the required sample size for our study, we used OpenEpi 3.0 (openepi.com). We selected data availability as our primary outcome measure, given its importance for reproducibility [3]. Our estimated parameters included a population size of 223,932 publications; a hypothesized frequency of 18.5% for data availability in the population (based on data obtained by Hardwicke et al.); a confidence limit of 5%; and a design effect of 1, appropriate for simple random sampling. Based on these parameters, a 95% confidence level would require a sample size of 232. From our previous studies [21, 22], we estimated that approximately 40% of studies would be excluded following screening. Thus, a random sample of 400 publications with a hypothesized attrition rate of 40% would yield a final, minimum sample of 240 for analysis. Previous investigations, upon which this study is based, included random samples of 250 publications in the social sciences and 150 publications in the biomedical sciences; our sample size therefore exceeds those used in previous investigations.
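The sample size of 232 follows from the standard finite-population formula for estimating a proportion, which OpenEpi implements for simple random sampling. The sketch below reproduces the calculation under the parameters stated above; the function name and the rounding up to the next integer are our assumptions.

```python
import math

def sample_size(N, p, d=0.05, z=1.96, deff=1.0):
    """Sample size for estimating a proportion in a finite population
    (standard formula: n = DEFF*N*z^2*p(1-p) / (d^2*(N-1) + z^2*p(1-p)))."""
    numerator = deff * N * z**2 * p * (1 - p)
    denominator = d**2 * (N - 1) + z**2 * p * (1 - p)
    return math.ceil(numerator / denominator)

# Parameters reported in the text: 223,932 publications, 18.5% hypothesized
# frequency of data availability, 5% confidence limit, design effect of 1.
print(sample_size(N=223_932, p=0.185))  # -> 232
```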
Extraction training
Prior to data extraction, two investigators (S.R. and J.P.) completed in-person training designed and led by another investigator (D.T.). The training sessions included reviewing the protocol, study design, data extraction form, and likely locations of the necessary information within example publications. The two trainees then extracted data from two sample publications in the same duplicate, blinded fashion used for data acquisition in this study and met to reconcile any discrepancies. After the two sample publications were completed, the investigators extracted data and reconciled differences for the first 10 of the 400 included neurology publications. This process ensured interrater reliability prior to analyzing the remaining 390 publications. A final reconciliation meeting was conducted, with a third investigator (D.T.) available to resolve disputes, although this was not needed.
Data extraction
After completing the training, the same two investigators extracted data from the randomly sampled publications between June 3, 2019, and June 10, 2019, using a pilot-tested Google form. This form was based on the one used by Hardwicke et al., with modifications [3]. We recorded the 5-year impact factor and the most recent year's impact factor rather than the impact factor for a specific year. The available study design options were expanded to include case series, cohort studies, secondary analyses, chart reviews, and cross-sectional analyses. Last, we specified funding sources, such as hospital, private/industry, non-profit, university, or mixed, instead of restricting the criteria to public or private.
Assessment of reproducibility and transparency characteristics
This study used the methodology of Hardwicke et al. [3] for analyses of transparency and reproducibility of research, with modifications. Full publications were examined for funding disclosures, conflicts of interest, available materials, data, protocols, and analysis scripts. Publications were coded into two categories: those with and those without empirical data. Publications without empirical data (e.g., editorials, reviews, news, simulations, or commentaries without reanalysis) were analyzed only for conflict of interest statements, open access, and funding, because protocols, data sets, and other reproducibility items were not relevant to them. Case studies and case series were classified as empirical studies; however, questions pertaining to the availability of materials, data, protocols, and registration were excluded for these designs, in line with recommendations from a previous study [18]. Data extraction criteria for each study design are outlined in Table 1.
Publication citations included in research synthesis and replication
For both empirical and nonempirical studies, we recorded the impact factor of each journal by searching for the publication title on the Web of Science (https://webofknowledge.com). For empirical studies, we used the Web of Science to determine whether each study in our sample was cited in a meta-analysis, systematic review, or replication study. The Web of Science lists the studies that cited the queried publication and provides the title, abstract, and a link to the full-text article, permitting evaluation of whether the queried article was included in data synthesis. Extraction was performed by both investigators in a duplicate, blinded fashion.
Assessment of open access
Important core components of publications necessary for reproducibility are only available within the full text of a manuscript. To determine the public’s access to each publication’s full text, we systematically searched the Open Access Button (https://openaccessbutton.org), Google, and PubMed. First, we searched the title and DOI using the Open Access Button to determine if the publication was available for public access. If this search returned no results or had an error, then we searched the publication title on Google or PubMed and reviewed the journal website to determine if the publication was available without a paywall.
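The authors performed this open-access check manually through the Open Access Button, Google, and PubMed. As an illustrative alternative only, the same question can be asked programmatically; the sketch below uses the Unpaywall REST API as a stand-in, which is not the service used in this study, and the DOI and contact email shown are placeholders.

```python
import requests

def is_open_access(doi: str, email: str) -> bool:
    """Query the Unpaywall REST API for the open-access status of a DOI."""
    url = f"https://api.unpaywall.org/v2/{doi}"
    resp = requests.get(url, params={"email": email}, timeout=30)
    resp.raise_for_status()
    return bool(resp.json().get("is_oa"))

# Placeholder DOI and contact email, for illustration only.
print(is_open_access("10.1186/s41073-020-0091-5", "researcher@example.org"))
```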
Statistical analysis
Microsoft Excel was used to report statistics for each category of our analysis. In particular, we used Excel functions to calculate our study characteristics, results, and 95% confidence intervals.
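The paper does not state which interval formula was applied in Excel. Assuming the common normal-approximation (Wald) interval for a proportion, the calculation looks like the following sketch; the function name and example figures are ours.

```python
import math

def proportion_ci(successes: int, n: int, z: float = 1.96):
    """Point estimate and 95% Wald confidence interval for a proportion."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# Example: 25 of 271 publications provided access to raw data (9.2%).
p, lo, hi = proportion_ci(25, 271)
print(f"{p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```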
Results
Journal and publication selection
After searching the National Library of Medicine catalog, 490 neurology journals were eligible for analysis. After screening against the inclusion criteria, 299 journals remained, yielding 223,932 publications, from which we randomly sampled 400 (https://osf.io/qfy7u/). Eleven publications were inaccessible, leaving 389 publications for analysis. Of these 389 publications, 291 provided analyzable empirical data, and 98 were excluded because they did not contain characteristics measurable for reproducibility. Of the 291 publications eligible for analysis, an additional 20 case studies and case series were excluded, as they are not reproducible. Our final analysis was based on 271 publications with measurable reproducibility characteristics (Fig. 1 and Table 1).
Sample characteristics
Of the eligible publications, the median 5-year impact factor was 3.5 (interquartile range (IQR) 2.6–4.9), although impact factors were inaccessible for 17 publications. The USA was the most common country of the primary authors (32.6%, 127/389) and the most common country of publication (56.6%, 220/389). Of the 389 accessible publications, 32.1% (125/389) did not report a funding source, and 25.7% (100/389) reported funding from mixed sources (Table 2).
Of the 400 randomly sampled publications, 77.2% were behind a paywall (227/400), and only 57.1% were available to the public via the Open Access Button (168/400). Approximately half of the analyzed publications stated that they had no conflicts of interest (55.5%, 216/389), and 32.4% did not report whether or not conflicts of interest existed (126/389). Humans were the focus of 51.2% of the analyzed publications (199/389). Additional sample characteristics are available in Supplemental Tables 1, 2, and 3.
Reproducibility-related characteristics
Among the 271 publications with empirical data that were analyzed, only 3.7% provided preregistration statements or claimed to be preregistered (10/271), and just 0.7% provided access to the protocol (2/271). Only 9.4% provided access to a materials list (24/255), 9.2% provided access to the raw data (25/271), and just two articles provided the analysis script (0.7%, 2/271). Not a single publication claimed to be a replication study. Additional characteristics are available in Supplemental Tables 1, 2, and 3.
Discussion
Our analysis demonstrates inadequate reproducibility practices within published neurology and neuroscience research. We found that few publications contained data or materials availability statements, and even fewer contained a preregistration statement, made the protocol available, or included an analysis script. Our overall finding, that a majority of neurology publications lack the information necessary to be reproduced and transparent, is comparable to findings in the social and preclinical sciences [3, 5, 23,24,25,26]. Here, we discuss prominent reproducibility and transparency indicators that were lacking in our sample and present recommendations and practices to help improve neurology research.
First, data and materials availability is essential for reproducing research. Without source data, corroborating the results is nearly impossible; without a detailed description of materials, conducting the experiment becomes a guessing game. Less than 10% of publications in our sample reported either a data or a materials availability statement. Efforts toward data sharing in neurological research originated with brain mapping and neuroimaging but have spread to other areas within the specialty to improve reproducibility, transparency, and data aggregation [27]. Although data sharing poses challenges, steps have been taken in fMRI studies [28, 29]. fMRI data are complex and cumbersome to handle but can be managed with software such as Automatic Analysis [30], C-BRAIN [31], and the NeuroImaging Analysis Kit [32]. Furthermore, these data can be hosted on online repositories, such as the National Institute of Mental Health Data Archive [33], Figshare [34], and other National Institutes of Health repositories [35]. Although researchers may take these steps voluntarily, journals, the final arbiters of research publications, can require such practices. A study of neurology and neurosurgery journals found that less than half had a data availability policy, and approximately 20% of articles from those journals reported source data [36]. An analysis of nearly 50,000 PLOS ONE publications found that only 20% included a data sharing statement, and that open access to raw data increased once a data sharing policy was enacted [37]. Based on this evidence, journals and funders should consider implementing and enforcing data sharing policies that, at a minimum, require a statement detailing whether data are available and, if so, where they are located. For example, the journal Neurology has endorsed the International Committee of Medical Journal Editors policy of requiring a data sharing statement and encourages open access [38,39,40]. If other neurology journals follow suit, an environment of transparency and reproducibility may be established.
Second, preregistration practices were uncommon among neurology researchers. Preregistration prior to conducting an experiment safeguards against selective outcome reporting, a form of bias that affects the quality of research in neurology. For example, when an outcome deemed "not significant" is selectively removed from a randomized controlled trial (RCT), the validity of the RCT may be questioned. Previous studies have already established outcome reporting bias as an issue within neurology, noting that only 40% of analyzed RCTs were preregistered and, therefore, had prespecified their analyses [15]. The same study found outcome reporting inconsistencies that often favored statistically significant results [15]. JAMA Neurology, The Lancet Neurology, and Neurology all require the preregistration of clinical trials prior to study commencement in accordance with the International Committee of Medical Journal Editors (ICMJE) [41]. Only The Lancet Neurology mentions registration of other study designs, such as observational studies, and only "encourages the registration of all observational studies on a WHO-compliant registry" [42,43,44]. The ICMJE notes that although non-trial study designs lack a researcher-prespecified intervention, it recommends preregistering all study types to discourage selective reporting and selective publication of results [41]. On ClinicalTrials.gov alone, almost 65,000 observational studies have been preregistered, comprising 21% of all registered studies [45]. Encouraging the preregistration of clinical trials and observational studies alike will increase transparency, increase the evidence available for systematic reviews and meta-analyses, and improve reproducibility [46, 47].
Moving forward
We propose the following solutions to promote reproducible and transparent research practices in neurology. With regard to journals, we first recommend requiring open data sharing upon submission or, at a minimum, a statement from the authors explaining why open data sharing does not apply to their study. Many open data repositories are available, including the Open Science Framework (https://osf.io/), opendatarepository.org, and others listed at re3data.org. Second, we recommend that journals and funding providers consider incentivizing reproducible research practices. For example, the Open Science Framework awards "badges" for open research practices, such as open data sharing, materials availability, and preregistration [48]. If one or more of these practices do not apply to a particular study, a statement to that effect should still qualify for the award. One neuroscience journal, the Journal of Neurochemistry, has already implemented open science badges with considerable success [49].
With regard to researchers, greater awareness and education are necessary to encourage transparent and reproducible practices. Organizations such as the Global Biological Standards Institute have committed to improving the reproducibility of life sciences research through multiple methods, including training and educating researchers in effective trial design [50, 51]. The institute's president has called for and implemented training programs aimed at teaching students, postdoctoral fellows, and principal investigators the importance of robust study design [50]. Additionally, we propose that medical schools and residency programs incorporate classes and didactic programs detailing proper experimental design with an emphasis on reproducible scientific practices. Research education should be a pillar of medical education, as physicians play an important role in guiding evidence-based healthcare. We anticipate that these recommendations, if implemented, will improve reproducibility within neurology and, as a result, the quality of research produced within this specialty.
Strengths and limitations
We believe our methodology is robust and has many strengths, including blinded, duplicate data extraction. Additionally, our protocol and data are available online to encourage reproducibility and transparency. However, we acknowledge a few limitations. First, we recognize that not all publications (e.g., clinical trials with protected patient data) can readily share their data and materials, although we feel a statement to that effect should still be reported; such justification was not always provided. Second, we did not contact authors to obtain data, materials, or analysis scripts and relied only on published materials for extraction. Had we contacted the authors, source data, materials, and protocols might have been available, but the goal of this study was to examine readily available, published indicators of reproducibility. Finally, the scope of this study is limited to PubMed-indexed journals in neurology, and the results of this cross-sectional study may not be generalizable beyond this scope.
Conclusions
In summary, improvement is needed to incorporate reproducibility factors in neurology publications. Such necessary improvement is attainable. Authors, journals, and peer-reviewers all have a part to play in developing an improved community of patient-centered neurology researchers. Reproducibility is paramount in evidence-based medicine to corroborate findings and ensure physicians have the highest quality evidence upon which to base patient care.
Availability of data and materials
All protocols, materials, and raw data are available online via bioRxiv (BIORXIV/2019/763730).
Abbreviations
- RCT: Randomized controlled trial
References
Wicherts JM, Borsboom D, Kats J, Molenaar D. The poor availability of psychological research data for reanalysis. Am Psychol. 2006;61:726–8.
Iqbal SA, Wallach JD, Khoury MJ, Schully SD, Ioannidis JPA. Reproducible research practices and transparency across the biomedical literature. PLoS Biol. 2016;14:e1002333.
Hardwicke TE, Wallach JD, Kidwell M, Ioannidis J. An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014–2017) [Internet]; 2019. Available from: https://doi.org/10.31222/osf.io/6uhg5.
Ioannidis JPA. Why most published research findings are false. PLoS Med. 2005;2:e124.
Button KS, Ioannidis JPA, Mokrysz C, Nosek BA, Flint J, Robinson ESJ, et al. Power failure: why small sample size undermines the reliability of neuroscience. Nat Rev Neurosci. 2013;14:365–76.
Peng R. The reproducibility crisis in science: a statistical counterattack. Significance. 2015;12:30–2.
Baker M. 1,500 scientists lift the lid on reproducibility. Nature. 2016;533:452–4.
Begley CG, Ioannidis JPA. Reproducibility in science. Circ Res. 2015;116:116–26. Available from: https://doi.org/10.1161/circresaha.114.303819.
Freedman LP, Cockburn IM, Simcoe TS. The economics of reproducibility in preclinical research. PLoS Biol. 2015;13:e1002165.
Bennett CM, Miller MB. How reliable are the results from functional magnetic resonance imaging? Ann N Y Acad Sci. 2010;1191:133–55.
Boekel W, Wagenmakers E-J, Belay L, Verhagen J, Brown S, Forstmann BU. A purely confirmatory replication study of structural brain-behavior correlations. Cortex. 2015;66:115–33.
Banks GC, Field JG, Oswald FL, O’Boyle EH, Landis RS, Rupp DE, et al. Answers to 18 questions about open science practices. J Bus Psychol. 2019;34:257–70.
Nosek BA, Ebersole CR, DeHaven AC, Mellor DT. The preregistration revolution. Proc Natl Acad Sci U S A. 2018;115:2600–6.
Mathieu S, Boutron I, Moher D, Altman DG, Ravaud P. Comparison of registered and published primary outcomes in randomized controlled trials. JAMA. 2009;302:977–84.
Howard B, Scott JT, Blubaugh M, Roepke B, Scheckel C, Vassar M. Systematic review: outcome reporting bias is a problem in high impact factor neurology journals. PLoS One. 2017;12:e0180986.
Harris JK, Johnson KJ, Carothers BJ, Combs TB, Luke DA, Wang X. Use of reproducible research practices in public health: a survey of public health analysts. PLoS One. 2018;13:e0202447.
Stupple A, Singerman D, Celi LA. The reproducibility crisis in the age of digital medicine. NPJ Digit Med. 2019;2:2.
Wallach JD, Boyack KW, Ioannidis JPA. Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017. PLoS Biol. 2018;16:e2006930.
Murad MH, Wang Z. Guidelines for reporting meta-epidemiological methodology research. Evid Based Med. 2017;22:139–42.
Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gøtzsche PC, Ioannidis JPA, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. J Clin Epidemiol. 2009;62:e1–34.
Kinder NC, Weaver MD, Wayant C, Vassar M. Presence of “spin” in the abstracts and titles of anaesthesiology randomised controlled trials. Br J Anaesth. 2019:e13–4. Available from: https://doi.org/10.1016/j.bja.2018.10.023.
Checketts JX, Riddle J, Zaaza Z, Boose MA, Whitener JH, Vassar MB. An evaluation of spin in lower extremity joint trials. J Arthroplast. 2019;34:1008–12.
Prinz F, Schlange T, Asadullah K. Believe it or not: how much can we rely on published data on potential drug targets? Nat Rev Drug Discov. 2011;10:712.
Begley CG, Ellis LM. Raise standards for preclinical cancer research. Nature. 2012:531–3. Available from: https://doi.org/10.1038/483531a.
Tsilidis KK, Panagiotou OA, Sena ES, Aretouli E, Evangelou E, Howells DW, et al. Evaluation of excess significance bias in animal studies of neurological diseases. PLoS Biol. 2013;11:e1001609.
Begley CG, Ioannidis JPA. Reproducibility in science: improving the standard for basic and preclinical research. Circ Res. 2015;116:116–26.
Ferguson AR, Nielson JL, Cragin MH, Bandrowski AE, Martone ME. Big data from small data: data-sharing in the “long tail” of neuroscience. Nat Neurosci. 2014:1442–7. Available from: https://doi.org/10.1038/nn.3838.
Nichols TE, Das S, Eickhoff SB, Evans AC, Glatard T, Hanke M, et al. Best practices in data analysis and sharing in neuroimaging using MRI. Nat Neurosci. 2017;20:299–303.
Borghi JA, Van Gulick AE. Data management and sharing in neuroimaging: practices and perceptions of MRI researchers. PLoS One. 2018;13:e0200562.
Cusack R, Vicente-Grabovetsky A, Mitchell DJ, Wild CJ, Auer T, Linke AC, et al. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML. Front Neuroinform. 2014;8:90.
CBRAIN | Home. Available from: http://www.cbrain.ca/. Cited 14 Aug 2019.
NITRC: NeuroImaging Analysis Kit (NIAK): Tool/Resource Info. Available from: https://www.nitrc.org/projects/niak. Cited 14 Aug 2019.
NDA. Available from: https://nda.nih.gov/. Cited 2019 Aug 14.
Kraker P, Lex E, Gorraiz J, Gumpenberger C, Peters I. Research data explored II: the anatomy and reception of figshare. arXiv [cs.DL]. 2015; Available from: http://arxiv.org/abs/1503.01298.
NIH Data Sharing Repositories. U.S. National Library of Medicine; 2013; Available from: https://www.nlm.nih.gov/NIHbmic/nih_data_sharing_repositories.html. Cited 14 Aug 2019.
Johnson JN, Hanson KA, Jones CA, Grandhi R, Guerrero J, Rodriguez JS. Data sharing in neurosurgery and neurology journals. Cureus. 2018;10:e2680.
Federer LM, Belter CW, Joubert DJ, Livinski A, Lu Y-L, Snyders LN, et al. Data sharing in PLOS ONE: an analysis of data availability statements. PLoS One. 2018;13:e0194768.
Taichman DB, Sahni P, Pinborg A, Peiperl L, Laine C, James A, et al. Data sharing statements for clinical trials: a requirement of the International Committee Of Medical Journal Editors. Ethiop J Health Sci. 2017;27:315–8.
Baskin PK, Gross RA. The new neurology: redesigns, short articles for print, full articles online, and data availability policies. Neurology. 2017;89s:2026–8.
Research Policies and Guidelines | American Academy of Neurology Journals. Available from: https://www.neurology.org/research-policies-and-guidelines. Cited 14 Aug 2019.
International Committee of Medical Journal Editors. Recommendations for the conduct, reporting, editing, and publication of scholarly work in medical journals. 2016.
Instructions for Authors | JAMA Neurology | JAMA Network. Available from: https://jamanetwork.com/journals/jamaneurology/pages/instructions-for-authors. Cited 15 Aug 2019.
Research Policies and Guidelines | American Academy of Neurology Journals. Available from: https://www.neurology.org/research-policies-and-guidelines. Cited 15 Aug 2019.
Information for Authors. The Lancet Neurology. Available from: https://els-jbs-prod-cdn.literatumonline.com/pb/assets/raw/Lancet/authors/tln-info-for-authors-1564564413020.pdf. Cited 15 Aug 2019.
Trends, Charts, and Maps - ClinicalTrials.gov. Available from: https://clinicaltrials.gov/ct2/resources/trends. Cited 15 Aug 2019.
Dal-Ré R, Ioannidis JP, Bracken MB, Buffler PA, Chan A-W, Franco EL, et al. Making prospective registration of observational research a reality. Sci Transl Med. 2014;6:224cm1.
Ioannidis JPA. The importance of potential studies that have not existed and registration of observational data sets. JAMA. 2012;308:575–6.
Blohowiak BB, Cohoon J, de-Wit L, Eich E, Farach FJ, Hasselman F, et al. Badges to acknowledge open practices: OSF; 2013. Available from: https://osf.io/tvyxz/wiki/home/.
Jones J. Connecting research with results: open science badges. 2018. Available from: https://www.wiley.com/network/researchers/licensing-and-open-access/connecting-research-with-results-open-science-badges. Cited 19 Jul 2019.
One way to fix reproducibility problems: train scientists better. The Scientist Magazine®. Available from: https://www.the-scientist.com/news-opinion/one-way-to-fix-reproducibility-problems-train-scientists-better-30577. Cited 19 Jul 2019.
Our Mission - Global Biological Standards Institute. Global Biological Standards Institute. Available from: https://www.gbsi.org/about/who-we-are/. Cited 14 Aug 2019.
Acknowledgements
Not applicable
Funding
This study was funded through the 2019 Presidential Research Fellowship Mentor – Mentee Program at Oklahoma State University Center for Health Sciences.
Author information
Contributions
All authors read and approved the final manuscript. Individual roles are detailed in the “Methods” section.
Authors’ information
Not applicable
Ethics declarations
Ethics approval and consent to participate
Not applicable
Consent for publication
Not applicable
Competing interests
The authors declare that they have no competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Additional file 1: Table S1.
Additional Characteristics of Reproducibility in Neurology Studies.
Additional file 2: Table S2.
Additional Characteristics of Reproducibility in Neurology Studies.
Additional file 3: Table S3.
Additional Characteristics of Reproducibility in Neurology Studies.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
About this article
Cite this article
Rauh, S., Torgerson, T., Johnson, A.L. et al. Reproducible and transparent research practices in published neurology research. Res Integr Peer Rev 5, 5 (2020). https://doi.org/10.1186/s41073-020-0091-5