Abstract

Sharing research data from public funding is an important topic, especially during global emergencies such as the COVID-19 pandemic, when policies that enable the rapid sharing of research data are needed. Our aim is to discuss and review the revised Draft of the OECD Recommendation Concerning Access to Research Data from Public Funding. The Recommendation is grounded in ethical scientific practice, but to make it applicable in real settings we suggest several enhancements that would make it more actionable. In particular, the constant maintenance of provided software stipulated by the Recommendation is virtually impossible even for commercial software. Other major concerns are insufficient clarity regarding how to finance data repositories in joint private-public investments, inconsistencies between data security and user-friendly access, too little focus on the reproducibility of submitted data, risks related to the mining of large data sets, and the protection of sensitive (particularly personal) data. In addition, we identify several risks and threats that need to be considered when designing and developing data platforms that implement the Recommendation (e.g., not only descriptions of the data formats but also the data collection methods should be available). Furthermore, the uneven readiness of some countries for the practical implementation of the proposed Recommendation poses a risk of its delayed or incomplete implementation.

Authors and Affiliations

Lech Madeyski 1
Tomasz Lewowski 1
Barbara Kitchenham 2

  1. Faculty of Computer Science and Management, Wroclaw University of Science and Technology, ul. Wybrzeze Wyspianskiego 27, 50-370 Wroclaw, Poland
  2. School of Computing and Mathematics, Keele University, Keele, Staffordshire, ST5 5BG, UK
