TReQ: 1 Principles

04 January, 2022

Reading time: 8 minutes

This video introduces the TReQ (Transparency, Reproducibility, Quality) principles and explains their importance and general benefits.

TReQ: 1 Improving the transparency, reproducibility and quality of your research

Presented by Gesche Huebner, Michael Fell and Nicole Watson

Working in applied research? Interested in learning more about why and how to make your research more transparent, reproducible and high quality? This video introduces the TReQ (Transparency, Reproducibility, Quality) principles and explains their importance and general benefits. We also reflect on the extra challenges that applied research poses. This is video 1 of 6 from our team at University College London.

TReQ introduction: video transcript

[Introduction slide 1: TReQ: Improving the transparency, reproducibility and quality of your research]

[Introduction slide 2: Video One: Introduction]

Gesche Huebner: Welcome to this series of videos about how to improve the transparency, reproducibility, and quality of your research, called TReQ for short. My name is Gesche Huebner and with my colleagues Nicole Watson and Mike Fell, I will be speaking about how and why to introduce TReQ into your research.

[Slide titled “Why TReQ is important”]

Gesche Huebner: I will start with a little example. You might have heard of the paper “Growth in a Time of Debt”…

[Slide containing an image of the paper “Growth in a Time of Debt” by Carmen M. Reinhart and Kenneth S. Rogoff]

Gesche Huebner: …that showed that countries with a high level of debt had lower economic growth. This paper was highly influential in justifying economic austerity policies. However, when another team of researchers requested access to the data, and was granted it, they realised that some mistakes had been made in the analysis that actually changed the conclusions. So could this mistake have been avoided if the data had been shared from the outset? Could greater transparency have avoided this problem? This is just one example, and we will speak in detail about how using transparency, reproducibility, and quality in research can help to create better science.

[Slide titled “What we will cover in these videos”]

Gesche Huebner: In these videos we will also discuss why TReQ can be particularly challenging in applied multidisciplinary research like energy, where the three of us work.

[Slide titled “Four tools to improve the TReQ of your research”:

  • Pre-registration of studies
  • Reporting guidelines
  • Preprints
  • Open data and code]

Gesche Huebner: We will take you through four tools you can use to improve the TReQ of your research: pre-registration of studies, reporting guidelines, preprints, and open data and open code. For each we will cover how the approach is used and what the benefits are for you as a researcher, but also for science more generally. Alongside these, we will also discuss commonly cited drawbacks. In the final video we’ll draw it all together and show you how you can implement the TReQ tools into your workflow.

[Slide containing an image of “the TReQlist”, a checklist created by Gesche Huebner, Michael Fell and Nicole Watson to accompany this course]

Gesche Huebner: We will also share the ‘TReQlist’ with you: a checklist which you can use to document how you’ve been using tools to promote the TReQ of your own research.

[Slide titled “Why is transparency so important?”]

Nicole Watson: One thing I think we can all agree on is that research needs to be as high quality as possible. If we’re to solve big societal challenges, such as climate change,

[Footage of a flooded town is shown]

Nicole Watson: biodiversity loss,

[Footage showing a forest where trees have been recently cut down is shown]

Nicole Watson: pandemics,

[Footage showing a young man on a train using his phone and wearing a face mask is shown]

Nicole Watson: and more, we need the evidence underpinning the solutions that we propose to be valid and reliable. A big part of that is allowing other researchers to check our work and describing studies in enough detail to see exactly how they were done.

[Definition slide titled “Transparency: ‘The principle that every […] scientist should make the essential components of his or her work visible to fellow scholars*’”. Text at the bottom of the slide attributes this citation to Moravcsik, A. (2014)]

Nicole Watson: This, or the principle that every scientist should make the essential components of their work visible to other scholars, is what we mean when we talk about transparency. So why is transparency so important?

[Slide with an animated icon depicting a signpost and title “Transparency allows other researchers to see exactly how your study was carried out”]

Nicole Watson: First of all, it allows other researchers to see exactly how your study was carried out. This can be really important for helping to spot errors and biases, and also thinking about what other situations you might expect or not expect your findings to apply to.

[Slide with an animated icon depicting a flowchart leading to a large tick and title “It also means other research teams can go out and do your study again themselves”]

Nicole Watson: Secondly, it means that other research teams can go out and do your study again themselves.

[Definition slide with title “Reproducibility: the idea that independent studies testing the same thing should obtain broadly similar results”]

Nicole Watson: The idea that independent studies testing the same thing should produce broadly the same results is known as reproducibility. If a finding is reproducible, we can have more confidence that it’s robust, and it wasn’t just a one-off fluke. You might have heard about the reproducibility crisis, where results in medicine, psychology, and economics were found not to reproduce when the studies were done again by independent teams.

[Slide titled “What are questionable research practices?”]

Michael Fell: So looking at the length of lots of research papers, you might be forgiven for thinking that levels of transparency are already pretty high. But there are actually plenty of ways that important details can be lost or glossed over in their reporting.

[Definition slide with title “HARKing: hypothesising after results are known”]

Michael Fell: One example is HARKing, or hypothesising after results are known. Here, a researcher might run an experiment, get the results, then report them as if they were always what they expected.

[Definition slide with title “P-Hacking: when researchers tweak their analysis to make sure that statistically significant results are obtained”]

Michael Fell: Another is p-hacking, where researchers tweak aspects of their analysis to make sure that statistically significant results are obtained. These and other so-called questionable research practices are likely driven, at least in part, by publication bias.

[Definition slide with title “Publication Bias: the failure to publish the results of a study on the basis of the direction or strength of the study findings”]

Michael Fell: This is the failure to publish the results of a study on the basis of the direction or strength of the study findings.

[Slide titled “Important limitations to consider”]

Michael Fell: While the idea of making our work more transparent and reproducible sounds sensible, it isn’t always easy. And this can be especially the case in applied multidisciplinary subject areas like energy, water, food, and so on.

[Slide with three white animated icons depicting a green energy battery, water droplets with arrows round them to show circular water supply, and a young plant growing from the soil on a plain turquoise background and title “It’s hard to agree on good practice approaches that work for everyone in applied multidisciplinary areas of research”.]

Michael Fell: Because of the wide variety of different approaches and methods used it can be hard to agree on a set of good practice approaches that work for everyone.

[Slide with an animated white icon depicting an electric car on a plain turquoise background and title “There’s a recognition that some findings will go out of date”.]

Michael Fell: Also, there’s a recognition that some findings will go out of date. For example, we’d all expect that people’s concerns about electric vehicles today would be quite different to what they’re likely to be in five years’ time.

[Slide with an animated icon depicting four people with different skin tones and title “In some areas of research it might not make sense to think in terms of reproducibility”.]

Michael Fell: In some areas of research, like ethnography for example, it might not even make sense to think in terms of reproducibility at all. And finally, research in these areas is often based around large, expensive, time-consuming trials that can’t easily be rerun to test the results.

[Slide with an animated icon depicting time passing on a clock with title “Large, expensive, time-consuming trials cannot be easily re-run to test results”.]

Michael Fell: Nevertheless, transparency, quality, and where relevant, reproducibility, remain important and achievable in applied multidisciplinary research areas. In our TReQ recommendations we’ve focused on approaches that are applicable in a broad range of areas that are flexible in terms of how they can be used, and have got a low barrier to entry, that is, they’re easy to pick up and get started with. So let’s jump in.

[Closing slide 1: title “Up Next: Pre-Registration Of Studies”]

[Closing slide 2: title “TReQ: improving the transparency, reproducibility and quality of your research”]

[Closing slide 3: title “For links to further resources and more about us visit: bit.ly/TReQtools”]

[Closing slide 4: UCL logo, CREDS: Centre for Research into Energy Demand Solutions logo with title “Supported by”]

Banner photo credit: JJ Ying on Unsplash