HEOR Articles

Dear Health Economics and Outcomes Researcher: It’s Time We Had the Transparency Talk

 

Shirley V. Wang, PhD, MSc, Brigham and Women’s Hospital, Harvard Medical School, Boston, MA, USA

 

One of the most pressing challenges facing both public health and science in general is the growing spread of misinformation. In an environment where public trust in research is being eroded, simply doing rigorous work is no longer enough. We also need to make that rigor visible.

To combat misinformation and protect the public’s trust, we must show our work clearly, honestly, and consistently. We must make transparency an expectation.

 

Transparency isn’t just about reproducibility. It’s about credibility.

But we need to be thoughtful in how we pursue and promote openness. Language around “transparency” and “open science” can be misused or repurposed to question legitimate findings or to apply pressure that undermines—rather than strengthens—scientific integrity. Our goal should not be performative openness, but purposeful transparency: practices that make our methods understandable, our reasoning clear, and our decisions traceable.

Other disciplines such as psychology,1 economics,2 and cancer biology3 had to stumble very publicly before meaningful change could begin. Clinical trials had to come under massive pressure4 from legislators and journal editors before embracing preregistration and open reporting. But our field remains at the starting line—aware enough to know that we should do better, but still hesitant to make transparency a default.

 

Transparency doesn’t require perfection. It requires intention.

We have a rapidly growing set of resources designed to support transparency in our field:

  • Protocol registration platforms that are built for real-world evidence (RWE).
  • Templates like the HARmonized Protocol template to Enhance Reproducibility5,6 (HARPER) that bring structure and clarity to our studies.
  • Transparency statements7 that help tell the full story behind a study: what was planned, what was amended, and what was done.
  • Infrastructure for sharing code,8,9 tools,10-13 and logic,14 even when data must stay private.

I would argue that in our field, the challenge isn’t a lack of tools. It is integrating them into our everyday work. But culture change doesn’t come from toolkits or templates. It starts with people.

Change can begin with small, deliberate choices made by you and me. And when each of us commits to clarity and openness, the whole research community moves forward together. But if we wait for the perfect conditions to get started, we never will. The time to start building better habits is now.

 

What counts as “doing transparency right” today?

1. Preregistering protocols. Whether it’s the Open Science Framework-hosted Real-World Evidence Registry (https://osf.io/registries/rwe/), the Heads of Medicines Agencies-European Medicines Agency Catalogue (https://catalogues.ema.europa.eu/), or ClinicalTrials.gov, the point isn’t which platform; it’s that you’re using one. Preregistration doesn’t mean your study is locked in stone. It simply creates a transparent, traceable starting point. We all know that data can lead us in new directions. Amendments are not only acceptable—they are expected. The key is to document those changes so that others can follow your reasoning as clearly as your results.

2. Using templates that make rigor visible. HARPER5,6 is a deliverable of a joint ISPE/ISPOR task force and is now endorsed by global organizations such as the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use,15 the European Network of Centres for Pharmacoepidemiology and Pharmacovigilance,16 the Council for International Organizations of Medical Sciences,17 the National Institute for Health and Care Excellence,18 and the Centers for Medicare & Medicaid Services.19 Protocol templates such as HARPER aren’t just templates; they are thinking tools. They encourage clear articulation of study decisions, estimands, potential biases, and assessment of data quality.

3. Making analyses reproducible even without sharing data. We know patient privacy and data use agreements make sharing raw data tricky. But you can still share clearly annotated and documented analytic code, synthetic data, diagnostics, and workflow logic. Transparency isn’t all-or-nothing. Just be sure to avoid dumping disorganized spaghetti scripts and cryptic outputs. Usability is what makes sharing valuable.

4. Proudly declaring your transparency efforts. The new transparency statement framework7 makes this easier: What did you register? What’s open? Where can people find it? Even if parts of your study can’t be open, you can still make your process visible.
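The kind of shareable, documented workflow described in point 3 can be sketched in a few lines. This is a hypothetical illustration, not code from any cited study: the cohort schema, function names, and synthetic event rates are all invented. The idea is that the analytic logic runs end to end on synthetic data with the same schema as the private extract, so reviewers can trace the workflow without touching patient records.

```python
"""Illustrative sketch: a documented analytic script whose logic can be
shared even when the real data cannot. All names and rates are hypothetical."""
import random


def load_synthetic_cohort(n=1000, seed=42):
    """Stand-in for the private data pull: emits a synthetic cohort with the
    same schema (exposed: 0/1, event: 0/1) so the full workflow is runnable."""
    rng = random.Random(seed)  # fixed seed makes the synthetic run reproducible
    cohort = []
    for _ in range(n):
        exposed = rng.random() < 0.5
        # Synthetic event rates chosen arbitrarily for illustration only.
        event = rng.random() < (0.20 if exposed else 0.10)
        cohort.append({"exposed": int(exposed), "event": int(event)})
    return cohort


def risk_difference(cohort):
    """Documented estimand: risk of event in exposed minus risk in unexposed."""
    exposed = [r for r in cohort if r["exposed"]]
    unexposed = [r for r in cohort if not r["exposed"]]
    risk_exp = sum(r["event"] for r in exposed) / len(exposed)
    risk_unexp = sum(r["event"] for r in unexposed) / len(unexposed)
    return risk_exp - risk_unexp


if __name__ == "__main__":
    cohort = load_synthetic_cohort()
    print(f"Risk difference (synthetic data): {risk_difference(cohort):.3f}")
```

In practice, the synthetic generator would sit alongside the real, access-controlled data pull, and the shared repository would include the annotated code plus instructions for running it on the synthetic data.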


What stands in the way?

It’s not a lack of awareness. Most of us know that transparency is important.

The real barriers are inertia and hesitation: the feeling that this is yet another step in a process that already takes considerable time and effort; the fear of being scooped or losing a competitive edge; the discomfort of committing too early.

Yes, it can feel like more work up front. But that investment pays off. When you start with a clear, transparent protocol, you reduce confusion later. You streamline analysis, facilitate cleaner documentation, and make peer review smoother by minimizing ambiguity and making it easier to retrace your own steps weeks or months later. Transparency isn’t just good science; it’s good project management. It also makes life easier for your team. Clear protocols and documented decisions mean fewer misunderstandings, less rework, and easier onboarding for collaborators.

And for your future self? It’s a gift. When you revisit a study 6 months later (or have to explain it to a reviewer or regulator), you’ll be glad that everything was clearly laid out from the start.

 

What if we flipped the perspective?

What if preregistration and transparency weren’t extra burdens but protective measures? What if sharing your study protocol upfront actually strengthened your credibility? What if transparency stopped being seen as overhead and started being valued as evidence of a rigorous, well-planned process?


Concrete steps, collective momentum

If you’re an early career researcher? Start here.

  • Preregister your next protocol. The platform matters less than the act itself.
  • Use the HARPER5,6 template. Even if your funder doesn’t require it, your future self will thank you.
  • Include a transparency statement in your next manuscript. Even partial openness is valuable.

If you’re a supervisor, journal editor, or policy maker?

Your example speaks louder than any checklist. Model transparency. Be the first to register. Reward transparency in peer review. Ask about reproducibility at the next team meeting. Help normalize the behavior that we all say we value.

 

Institutional support is growing

The good news is that momentum is building. ISPOR—The Professional Society for Health Economics and Outcomes Research—and the International Society for Pharmacoepidemiology have established joint task forces focused on transparency and reproducibility in RWE. The society journals Value in Health and Pharmacoepidemiology and Drug Safety have revised author submission guidance to strongly encourage transparency statements.

These efforts are more than symbolic. They send a clear message that open, reproducible research is not only encouraged but celebrated. And as more institutions and communities begin to recognize and celebrate open science practices, that recognition will become a powerful driver of change—helping to shift transparency from an individual choice to a shared expectation.

 

Building a culture of transparency is a team effort.

I encourage you to join the movement and proudly display the open science practices you have used in your next study. Let’s do this together. One study, one protocol, one statement at a time.

 

References

  1. Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015;349(6251):aac4716. doi:10.1126/science.aac4716
  2. Chang AC, Li P. Is economics research replicable? Sixty published papers from thirteen journals say “usually not.” Finance and Economics Discussion Series 2015-083. Washington, DC: Board of Governors of the Federal Reserve System; 2015. doi:10.17016/FEDS.2015.083
  3. Errington TM, Mathur M, Soderberg CK, et al. Investigating the replicability of preclinical cancer biology. eLife. 2021;10:e71601. doi:10.7554/eLife.71601
  4. Zarin DA, Tse T, Williams RJ, Rajakannan T. Update on trial registration 11 years after the ICMJE policy was established. N Engl J Med. 2017;376(4):383-391. doi:10.1056/NEJMsr1601330
  5. Wang SV, Pottegård A, Crown W, et al. HARmonized Protocol Template to Enhance Reproducibility of hypothesis evaluating real-world evidence studies on treatment effects: a good practices report of a joint ISPE/ISPOR task force. Pharmacoepidemiol Drug Saf. 2022. doi:10.1002/pds.5507
  6. Wang SV, Pottegård A, Crown W, et al. HARmonized Protocol Template to Enhance Reproducibility of Hypothesis Evaluating Real-World Evidence Studies on Treatment Effects: A Good Practices Report of a Joint ISPE/ISPOR Task Force. Value Health. 2022;25(10):1663-1672. doi:10.1016/j.jval.2022.09.001
  7. Wang SV, Pottegård A. Building transparency and reproducibility into the practice of pharmacoepidemiology and outcomes research. Am J Epidemiol. 2024;193(11):1625-1631. doi:10.1093/aje/kwae087
  8. Weberpals J, Wang SV. The FAIRification of research in real-world evidence: a practical introduction to reproducible analytic workflows using Git and R. Pharmacoepidemiol Drug Saf. 2024. doi:10.1002/pds.5740
  9. Tazare J, Wang SV, Gini R, et al. Sharing is caring? International Society for Pharmacoepidemiology review and recommendations for sharing programming code. Pharmacoepidemiol Drug Saf. 2024;33(9):e5856. doi:10.1002/pds.5856
  10. Sentinel Initiative. Routine Querying System. https://www.sentinelinitiative.org/sentinel/surveillance-tools/routine-querying-tools/routine-querying-system. Updated January 24, 2020. Accessed February 14, 2020.
  11. Raventós B, Català M, Du M, et al. IncidencePrevalence: an R package to calculate population-level incidence rates and prevalence using the OMOP common data model. Pharmacoepidemiol Drug Saf. 2024;33(1):e5717. doi:10.1002/pds.5717
  12. Dernie F, Corby G, Robinson A, et al. Standardised and Reproducible Phenotyping Using Distributed Analytics and Tools in the Data Analysis and Real World Interrogation Network (DARWIN EU). Pharmacoepidemiol Drug Saf. 2024;33(11):e70042. doi:10.1002/pds.70042
  13. Wang SV, Verpillat P, Rassen JA, Patrick A, Garry EM, Bartels DB. Transparency and reproducibility of observational cohort studies using large healthcare databases. Clin Pharmacol Ther. 2016;99(3):325-332. doi:10.1002/cpt.329
  14. Matthewman J, Andresen K, Suffel A, et al. Checklist and guidance on creating codelists for routinely collected health data research. NIHR Open Res. 2024;4:20. doi:10.3310/nihropenres.13550.2
  15. International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use. General Principles on Plan, Design, and Analysis of Pharmacoepidemiological Studies That Utilize Real-World Data for Safety Assessment of Medicines M14. https://database.ich.org/sites/default/files/ICH_M14_Step3_DraftGuideline_2024_0521.pdf. Endorsed on May 21, 2024. Accessed July 8, 2025.
  16. ENCePP. ENCePP Guide on Methodological Standards in Pharmacoepidemiology. Revision 11. https://encepp.europa.eu/encepp-toolkit/methodological-guide_en. Published July 2023. Accessed July 8, 2025.
  17. Real-world data and real-world evidence in regulatory decision making. CIOMS Working Group report draft. https://cioms.ch/wp-content/uploads/2020/03/CIOMS-WG-XIII_6June2023_Draft-report-for-comment.pdf. Draft June 6, 2023. Accessed July 8, 2025.
  18. NICE real-world evidence framework. https://www.nice.org.uk/corporate/ecd9/resources/nice-realworld-evidence-framework-pdf-1124020816837. Published June 23, 2022. Accessed July 8, 2025.
  19. Proposed Guidance Document: Study Protocols That Use Real-World Data. Centers for Medicare & Medicaid Services. https://www.cms.gov/medicare-coverage-database/view/medicare-coverage-document.aspx?mcdid=39. Published January 17, 2025. Accessed March 6, 2025.