Tuesday, February 5, 2019, KOH-B-10.
Open to all members of UZH: students, researchers, administrative and technical staff
No registration required.
Get information on issues of reproducibility, learn about solutions and offers at UZH. Take the plunge and practice with experts from CRS in hands-on workshops. Presentations and workshops by experts from across UZH as well as invited speakers:
Joachim Wagner, chief editor of The International Journal for Re-Views in Empirical Economics, and Nathalie Le Bot, senior editor at Nature, will give the publisher's perspective on replication and reproducibility.
Organized by the Center for Reproducible Science (CRS)
Software Carpentry Workshop on February 7 and 8, 2019
This workshop aims to help researchers get their work done in less time and with less pain by teaching them basic research computing skills. This hands-on workshop will cover basic concepts and tools, including program design, version control, data management, and task automation. Participants will be encouraged to help one another and to apply what they have learned to their own research problems.
9:00 - 9:30
By Leonhard Held (CRS)
Introduction to the CRS, history of the reproducibility crisis, and overview of the day
9:30 - 10:00
By Marco Steenbergen (CRS)
Using Qualitative Comparative Analysis (QCA) as a case study, the first half of this lecture shows that reproducibility is a problem in qualitative social science. Numerous decisions enter the approach, many of them having direct consequences for subsequent findings. In many cases, the choices made are highly subjective but often hidden. This hampers reproducibility. The second half of this lecture focuses on practices that could improve reproducibility. Those practices go beyond QCA and can also inform other forms of qualitative inquiry.
10:00 - 10:30
10:30 - 10:45
By Martina Grunow, managing editor of The International Journal for Re-Views in Empirical Economics (program change: Joachim Wagner, editor-in-chief, had to cancel)
Non-reproducible research and the lack of replication studies drive the current replication crisis afflicting economics. By ignoring the scientific instrument of replication studies, economics risks getting lost in a crowd of unsystematic empirical results. In light of the publish-or-perish culture of today's academia, the replication crisis can be understood as a problem of incentives. Strengthening incentives for replication studies can help to foster reproducible research. The International Journal for Re-Views in Empirical Economics (IREE) is dedicated to the publication of replication studies in empirical economics. IREE is specifically designed to overcome these incentive problems and to encourage researchers to conduct replications.
10:45 - 11:00
By Nathalie Le Bot, senior editor at Nature
Scientists, institutions, funders and journals all have a part to play in tackling issues with the reproducibility of research. Nature Research journals have taken substantive steps to improve the robustness of what they publish, implementing concrete measures to increase the transparency of reporting and the reproducibility of published results across all areas of science. I will present an overview of the steps taken across journals, from implementing field-specific reporting checklists to mandating the deposition of specific datasets and protocols in open-access platforms.
11:00 - 11:15
By Lawrence Rajendran, Science Matters/CRS/King's College
Because the current academic reward system focuses mostly on the publication of novel and positive research findings, which are then rewarded through funding and other career prospects, reproducibility, one of the fundamental aspects of the scientific endeavour, has been neglected over the past decades. To address this gap, we are establishing an open access journal on “Reproducibility” in science, exclusively dedicated to publishing observations that either confirm or contradict previously published findings, thereby providing a platform for researchers who want to improve the state of reproducibility in science. We plan to link all observations to the original findings and create visualisations of the network of “confirmatory edges” and “contradictory edges” between published observations, enabling the natural emergence of an honest scientific narrative. Apart from its unusual content, the new journal will also leverage innovative digital tools and blockchain-empowered publishing strategies to exploit the full potential of preprints and crowdsourced peer review of these findings.
11:15 - 11:30
By André Hoffmann, Eva-Marie Lang (HBZ)
For more than ten years, the Main Library has supported the implementation of UZH’s Open Access (OA) policy with training, financial support and services such as the OA repository ZORA and the OA journal platform HOPE. With the implementation of the national OA strategy for Switzerland and other international initiatives, Open Access will play an increasingly important role in the dissemination of research results in the near future.
In the second part of this talk, we present the data management support provided by the Main Library. With ongoing digitalization, the amount of research data is growing faster than ever, and reproducibility strongly depends on the thorough handling of this data. The Main Library addresses the new tasks researchers are currently confronted with and helps them manage their data successfully, ensuring its long-term preservation as well as its legitimate re-use. For more information, visit us at www.hbz.uzh.ch/de/open-access-und-open-science.html
11:30 - 12:00
By Martina Grunow, Nathalie Le Bot, Lawrence Rajendran, André Hoffmann
12:00 - 13:15
13:15 - 13:45
By Regina Grossmann (CTC), Ulrike Held (EBPI, CRS) & Stefanie von Felten (EBPI)
Medical research involving humans has a long history of regulations and guidelines, such as the Declaration of Helsinki and the ICH guidelines including Good Clinical Practice (GCP). High ethical and scientific standards have been established to guarantee validity and generalizability of research data.
Planning a clinical study requires that a study protocol be written and approved by an independent ethics committee, and the statistical analysis of the data (especially from randomised clinical trials, RCTs) needs to be pre-specified. Pre-registration of studies as well as quality control and quality assurance measures are additional means to achieve reproducible results. In this talk, we also present reporting guidelines implemented by medical journals and discuss their endorsement.
13:45 - 14:00
By Paulin Jirkof (Animal Welfare UZH)
Reproducibility issues in animal studies have become a much-discussed topic in the scientific community. The ethical construct commonly used to justify the use of animals in research is that of the “greater good” that can be achieved with the results of animal experimentation. Results of animal-based research are the foundation of and reason for further animal experiments, and are used to provide efficacy and safety determinations for clinical studies in humans. This talk will give a short overview of the ethical and animal welfare considerations that arise when a study using animals is not reproducible, and of how the scientific community is trying to tackle the reproducibility issue.
14:00 - 14:15
By Simon Schwab (CRS)
Open science principles are part of good research practice. This talk will give a short overview of the most important principles, such as pre-registration, open methods, and data sharing. The presentation will also address why the focus must shift from significant results to the moment before the analysis: the research question and the methods.
14:15 - 14:45
By Abraham Bernstein (DSI, CRS)
The digitalization of science is well under way. On the one hand, scientific processes are increasingly automated and artificial intelligence techniques are “invading” almost every research domain. On the other hand, technology allows collaboration at a previously unprecedented scale. Based on examples from computer science and related domains, this talk introduces a number of these innovations and reflects on their risks and opportunities for ensuring the reproducibility of results.
14:45 - 15:15
15:15 - 15:30
By Lars Malmström (S3IT)
Reproducible computing and data analytics can be challenging. At S3IT, we have developed iPortal, an integrated data and workflow manager that enables users to upload and analyze their data in a web browser. iPortal workflows achieve reproducibility by using Singularity containers and generate Jupyter notebooks that can be extended by the user.
15:30 - 16:05
By Nicolas Langer (Institute of Psychology, CRS)
Please bring a laptop!
Flexibility in how you analyse your data can invalidate statistical inferences. Importantly, you can employ questionable research practices like “p-hacking” without knowing you are doing it. Deciding to stop an analysis because the results are significant? Measuring three dependent variables and using the one that “works”? Excluding participants who don’t respond to your manipulation? All justified in exploratory research. However, the scientific community tends to forget the distinction between exploratory and confirmatory research, presenting exploratory results as confirmatory and post-hoc rationales as predictions. As well as being dishonest, this makes for unreliable science. There is a solution: preregistration. Declare the details of your method and your analysis in advance: sample size, exclusion criteria, dependent variables, directional predictions. Preregistration is easy, and there is no single, universally accepted way to do it. In this hands-on presentation you will learn about different ways to preregister a study.
To access the different preregistration options, you need to create a free account on OSF: https://osf.io/
We are going to create a preregistration together. The materials for the preregistration are here: https://osf.io/cp98d/
For a small experiment, you will need the following Google Forms link: https://goo.gl/forms/hZfEK4vt6dIzjPFu1
16:05 - 16:40
By Carolin Strobl (CRS)
Please bring a laptop!
Sample size calculation is an important prerequisite for planning scientific studies. This presentation will review the statistical basics of how sample size and other factors determine the power of significance tests (bring your laptop to use an online app for illustration). Afterwards, sample size calculation for standard methods like the t-test and regression is illustrated by means of the R package pwr (install R and the pwr package on your laptop to follow the example actively; it is also fine to just watch). Finally, the presentation will discuss critical choices that need to be made for sample size calculation in practice and give a short outlook on sample size calculation for more advanced methods like multilevel models.
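The core calculation behind such tools can be sketched in a few lines. The workshop itself uses the R package pwr; the following is only a minimal stdlib Python illustration using the normal approximation for a two-sided, two-sample comparison (so it slightly underestimates the exact t-based answer), and the function names are our own:

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_ppf(p):
    """Inverse standard normal CDF by bisection (fine for illustration)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def n_per_group(d, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group to detect a
    standardized effect size d with a two-sided two-sample test."""
    z_alpha = norm_ppf(1.0 - alpha / 2.0)  # critical value, e.g. 1.96
    z_beta = norm_ppf(power)               # power quantile, e.g. 0.84
    return 2.0 * ((z_alpha + z_beta) / d) ** 2

# A "medium" effect (d = 0.5) at 80% power needs roughly 63
# participants per group under this approximation.
print(round(n_per_group(0.5)))
```

The same inputs in R (`pwr::pwr.t.test(d = 0.5, power = 0.8)`) give about 64 per group; the small difference is the t-distribution correction the approximation omits.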
16:40 - 16:55
16:55 - 17:30
By Leonhard Held, Manuela Ott, Charlotte Micheloud, Samuel Pawel (CRS)
Please bring a laptop!
It is conventionally thought that a replication of a significant experiment will have a high probability of again reaching statistical significance. However, even in the absence of publication bias or other biases in the original study, this "replication probability" is substantially lower than expected. We will illustrate through examples how it depends on the p-value and on the uncertainty of the effect estimate from the original study. This in turn implies that the replication sample size has to be quite large to ensure a significant replication result with high probability. We will illustrate this with a Shiny app that participants can use to design their own replication studies.
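A back-of-the-envelope version of this effect can be computed directly. The sketch below (stdlib Python, function names our own, not the workshop's Shiny app) uses the simplest possible assumptions: the observed effect is the true effect and the replication has the same sample size, so the replication z-statistic is centred at the original one. Even under these ideal conditions the replication probability is modest:

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_ppf(p):
    """Inverse standard normal CDF by bisection (fine for illustration)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def replication_probability(p, alpha=0.05):
    """Probability that an equally sized replication is significant,
    assuming the observed effect equals the true effect."""
    z_orig = norm_ppf(1.0 - p / 2.0)       # z-value of the original result
    z_crit = norm_ppf(1.0 - alpha / 2.0)   # two-sided critical value
    return norm_cdf(z_orig - z_crit)

# An original result with p = 0.05 has only about a 50% chance of
# replicating at the 5% level, even under these ideal assumptions.
print(round(replication_probability(0.05), 2))
```

Accounting for the uncertainty of the original effect estimate, as the talk does, lowers these probabilities further; this sketch only shows the best case.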
17:30 - 18:05
By Simon Schwab, Eva Furrer (CRS)
Please bring a laptop!
R Notebooks can greatly improve the reproducibility of data processing and statistical analyses. Notebooks are electronic documents containing code chunks that can be executed independently and interactively, with output visible directly next to the code. We will give a brief introduction to R Notebooks and R Markdown and demonstrate their technicalities and strengths. Afterwards, participants themselves analyze an example dataset with a provided R Notebook to produce publication-ready figures and tables. Reproducible analysis pipelines not only strengthen the credibility and transparency of research, but also improve the efficiency of the researcher when an analysis needs to be modified or redone, for example during the peer-review process.
18:05 - 19:00