TReQ: 6 Conclusion

16 December, 2021

Reading time: 11 minutes

This is the final video in our series on improving the transparency, reproducibility, and quality (TReQ) of applied research.

TReQ (Improving the transparency, reproducibility and quality of your research): 6 Conclusion

Presented by Gesche Huebner, Michael Fell and Nicole Watson

This video explains when tools like preregistration of studies, reporting guidelines, preprints, and open data and code should be used in the research process, and why using them benefits both you as a researcher and science and society in general. It also introduces the TReQlist, which you can use as a helpful checklist as well as to report your use of these tools. This is video 6 of 6 from our team at University College London.

TReQ conclusion: video transcript

[Introduction slide 1: title “TReQ: improving the transparency, reproducibility and quality of your research”]

[Introduction slide 2: title “Video Six: Conclusion”]

Gesche Huebner: In these videos we have just spoken about why we need to improve the transparency, reproducibility, and quality of our research. And we have described four tools to do just that. Let’s now bring it all together and show how the tools fit into the research process. It can be helpful to think about the research process as a circle.

[Diagram slide with a circle of seven arrows with gaps between them. This diagram repeats throughout the video until Michael Fell begins presenting]

Gesche Huebner: We start with an idea, we generate a research question and depending on study types, some hypotheses.

[The text “generate ideas/hypotheses” appears in the first gap between the circle of arrows]

Gesche Huebner: We then design our study before we collect our data and then analyse it.

[The text “Design study”, “Collect, record and store data” and “Analyse data” appear as Gesche mentions them in order in the second, third and fourth gaps between the circle of arrows]

Gesche Huebner: We interpret findings and then write them up for publication.

[The text “Interpret data”, “Write up results” and “Publish” appear as Gesche mentions them in order in the fifth, sixth and seventh gaps between the circle of arrows. The gaps between the arrows are now all filled with text to show the circular nature of the research process]

Gesche Huebner: We then go back to the drawing board and start again with a research question, with ideas and hypotheses. At various stages of the research process we might go wrong both intentionally and unintentionally. For example, we might design a study poorly.

[The diagram now adds text around the outside of the circle of arrows in time with Gesche mentioning them, to show where things can go wrong in the research process. The text “Poor study design” appears next to the text “Design study”]

Gesche Huebner: Maybe you just forget to collect some important covariates. It’s also very easy to make mistakes when it comes to collecting the data, as simple as forgetting to turn on a recording device.

[The text “Mistakes in data collection” appears next to the text “Collect, record and store data”]

Gesche Huebner: When it comes to analysing data, it’s quite easy to make mistakes, such as overwriting your own data in your script. Sometimes we might be tempted to play around with the data and try different types of analysis until we find very nice significant effects.

[Two small double sided arrows with the text “Manipulation” written above them now appear to link the text “Analyse data” and “Interpret data” together, and to link the text “Analyse data” and “Collect, record and store data” together]
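One lightweight safeguard against the "overwriting your own data in your script" mistake mentioned above is to treat the raw data file as read-only and write every derived dataset to a separate path. Here is a minimal sketch in Python (the file names and the cleaning step are hypothetical illustrations, not from the video):

```python
import csv
from pathlib import Path

# Hypothetical layout: raw data is only ever read, never written to.
RAW = Path("data/raw/survey.csv")
DERIVED = Path("data/derived/survey_clean.csv")


def clean_responses(rows):
    """Example cleaning step: drop rows with a missing 'tariff' field."""
    return [row for row in rows if row.get("tariff")]


def run_pipeline(raw_path=RAW, derived_path=DERIVED):
    # Open the raw file in read mode only, so the script cannot
    # accidentally overwrite the original data.
    with open(raw_path, newline="") as f:
        reader = csv.DictReader(f)
        fieldnames = reader.fieldnames
        rows = list(reader)

    cleaned = clean_responses(rows)

    # Write the cleaned version to a *different* path,
    # leaving the raw data untouched.
    derived_path.parent.mkdir(parents=True, exist_ok=True)
    with open(derived_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(cleaned)
```

Keeping raw and derived data in separate folders like this also makes it easier to share the raw data and the exact cleaning script later on, which ties in with the open data and code practices discussed below.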

Gesche Huebner: When it comes to publishing findings we often omit important details in the description of our studies.

[The text “Insufficient detail” appears next to the text “Write up results”]

Gesche Huebner: And finally, we might not even be tempted to publish our findings if we have null results.

[The text “Publication bias” appears next to the text “Publish”]

Gesche Huebner: Often we don’t make our data and code available, which means others cannot scrutinise our work.

[The text “Closed data and code” also appears next to the text “Publish”]

Nicole Watson: The tools we have discussed can be used at different phases of the research process and can help overcome many of these limitations. Pre-analysis plans should be done early on, ideally at the study design phase or at the very latest before you start doing your analysis.

[An outline of a circle appears inside the circle of arrows. The text “Pre-registration” appears in this circle next to the text “Design study”.]

Nicole Watson: Thinking through your analysis early on can help you design a better study in the first place and make sure that you’re collecting all of the data that you’ll need.

[The text “Poor study design” that is on the outside of the circle of arrows fades away]

Nicole Watson: To give just a brief example, when I was writing my pre-analysis plan for a study that I’m doing about future energy tariffs, I realised that the way I phrased the question meant that I would know whether a respondent had ever switched their energy tariff, but I’d have no idea how long ago they switched. Thinking through this beforehand meant that I could write a better question and make sure that I had all of the data I needed when I came to do my analysis later on. When we write our pre-analysis plans, we also specify the aims of the study and, where appropriate, any hypotheses we might be testing. This limits the amount of manipulation that can be done later on, such as hypothesising after the results are known or p-hacking, as mentioned in the first video.

[The two small double sided arrows with the text “Manipulation” written above them fade away]

Nicole Watson: Reporting guidelines are typically used when you’re writing up the study to give you structure and ensure that you’re including sufficient details.

[The text “Reporting guidelines” appears in the circle next to the text “Write up results”. The text “Insufficient detail” on the outside of the text “Reporting guidelines” fades away]

Nicole Watson: But it can also be useful to familiarise yourself with the reporting guidelines that you’re going to be using before you go and do the study. For example, some reporting guidelines require that you take note of the number of participants at key stages of the study. For example, how many people you contacted, how many dropped out and how many actually completed the study. Being aware that you’re going to need to write this up can help you make sure that you’re keeping track of all the information that you’ll need. Preprints are usually shared around the time that you submit your article to a journal, but it can be useful to share them sooner, for example, as a working paper.

[The text “Preprints” appears in the circle next to the arrow linking the text “Write up results” and “Publish”]

Nicole Watson: They play a really important role in mitigating publication bias, and also getting your findings out there into the world sooner so that they can be used by other scientists.

[The text “Publication bias” on the outside of the text “Publish” fades away]

Nicole Watson: As always, it’s really important to check that the journal you’re submitting to allows for preprints before you share them. Sharing data and code is usually done at the end of the study but it can sometimes be useful to share them sooner.

[The text “Open data and code” appears in the circle next to the text “Publish”]

Nicole Watson: It’s really worthwhile building data and code sharing into your workflow right from the beginning, to benefit from better data and code management, and also increased efficiency.

[The text “Closed data and code” on the outside of the text “Publish” fades away. We are now left with the circle of arrows showing the circular research process, and the outline of a circle within the circle of arrows with text describing the different TReQ tools placed at their relevant points in the circle. The diagram now fades]

Nicole Watson: Thinking about the kind of metadata that you will have to include, and also the requirements of your chosen repository can really help improve the efficiency of this process. By sharing our data and code we’re letting other scientists benefit from the valuable resources we produce, and also allowing our findings to be scrutinised and built upon.

Michael Fell: We’ve developed a checklist, called the ‘TReQlist’, that you can use to show how you’ve employed tools for transparency in your own work, for example, by including it as an appendix in your papers.

[Slide containing an image of the ‘TReQlist’ document. As Michael describes the different sections of the ‘TReQlist’, the image zooms in and scrolls to show the relevant sections he describes]

Michael Fell: For each of the four tools, there are a couple of questions to answer, and you’re asked to insert links as appropriate. It starts with pre-registration, asking whether you’ve pre-registered the study. If so, there are some further questions to answer, such as whether you did this before data collection and whether any deviations from the pre-registration are noted in the manuscript. The second tool is reporting guidelines, where you’re asked to indicate which one, if any, you’ve used. The next section is around open data and code: whether you’ve shared your data and code, and, if the data are shared, you’re asked to indicate where the metadata has been uploaded and whether the data follow the FAIR principles. Finally, you’re asked whether you’ve published a preprint and, if so, where. So the ‘TReQlist’ has got two main purposes in our view.

[Slide titled “The “TReQlist” has two main purposes:”

  • It helps you as a researcher ensure that you have used and reported on these tools for improving research practices
  • It helps others to quickly see which tools you have used, with links to the relevant details]

Michael Fell: First is to help you as a researcher ensure that you’ve used and reported on these tools for improving research practices. And the second is so that others can easily see which tools you’ve used with links to the relevant details. So hopefully this shows how everything hangs together. And one important thing to emphasise is that throughout these videos we’ve put a lot of emphasis on what individuals can do in their role to improve research practices.

[Slide with an animated icon depicting a person with four arrows leading away from them and title “These videos have focused on the role of individuals in improving scientific practices”]

Michael Fell: But of course, there are many other actors in the field of academia that might’ve contributed to the development and maintenance of an academic world which favours novel and impressive findings over considerations around transparency, reproducibility, and quality.

[Slide with an animated icon depicting a large group of people and title “There are many actors in academia that have contributed to an academic world that favours impressive findings”]

Michael Fell: We’ve all heard the saying ‘publish or perish’: journals preferring exciting new findings, and funders and universities wanting to see outstanding publication records.

[Slide with an animated icon depicting two hands nourishing a plant and title “We need to see a more holistic change in our research culture”]

Michael Fell: We need to see a more holistic change in our research culture with all of these actors having an important role to play, whether that’s by requiring data sharing or more highly valuing replication studies. Otherwise there’s a risk that this move towards more transparent, open, reproducible science becomes purely tokenistic.

[Slide with an animated icon depicting a knight chess piece and title “Otherwise there’s a risk this move towards more transparent, open, reproducible science becomes purely tokenistic”]

Michael Fell: Thanks so much for watching these videos. We’ve tried to convey how important transparency, reproducibility, and quality are both to us as researchers and to science in general. And we hope that we’ve given you the tools you need to employ these TReQ approaches in your own work.

[Closing slide 1: title “TReQ: improving the transparency, reproducibility and quality of your research”]

[Closing slide 2: title “For links to further resources and more about us visit: bit.ly/TReQtools”]

[Closing slide 3: UCL logo, CREDS: Centre for Research into Energy Demand Solutions logo with title “Supported by”]

Banner photo credit: Joel Filipe on Unsplash