
Pink Sheet – Real-World Database Studies: Prepare For A Long Journey, IQVIA Advises


Executive Summary

Rigorous planning and a multidisciplinary team including database experts and healthcare specialists with local knowledge are vital to effective real-world evidence, white paper shows.

 

“Real-world database studies are a long journey,” IQVIA Real World Solutions UK associate principal Paola Nasuti observed in an interview with the Pink Sheet.

 

“The data sciences have evolved over time,” she said, reflecting the “constant evolution in the way we access data, the way we analyze them, the way we conduct the studies.” Ten years ago, “we were analyzing claims data, reimbursement data,” she recalled. Today “we are conducting external comparator studies. We are using these data in more novel ways.”

 

With the multiplicity of data available, it is “very important to use the right methods,” Nasuti said, and to “answer specific research questions with a high degree of rigor.”

 

Nasuti is an author of a recent white paper from IQVIA, “Real World Database Studies: Eight Key Steps To Success.” The consultants take a regulatory science approach, focusing on the process and precepts that should produce retrospective database studies with “robust, reliable results.” (See graphic.)

 

IQVIA is a key participant in efforts to incorporate real-world data (RWD) and real-world evidence (RWE) into regulatory decision-making. Company researchers are working with the US Food and Drug Administration, the European Medicines Agency and many initiatives “to improve the way we conduct studies, to improve the transparency around how these studies are conducted, the methods we used, and the way we report the findings in the public domain,” Nasuti noted.

 

“It’s still a work in progress,” she commented.

 

The 21st Century Cures Act directs the FDA to develop policy on use of RWE in regulatory decision-making – an ambitious regulatory science initiative that includes an effort to replicate clinical trials using RWD from claims data along with multi-pronged efforts to work through standards and processes with stakeholders.

 

The agency released a framework for assessing RWE for effectiveness in December 2018, but draft guidance is not expected until December 2021. (Also see “Real-World Evidence: US FDA Framework Emphasizes Data Fitness And Study Quality” – Pink Sheet, 9 Dec, 2018.)

 

The framework is particularly hesitant about the utility of retrospective observational studies using RWD. The FDA noted that the literature offers some examples where observational and randomized trials reached similar conclusions, but weighed them against countervailing examples “when effects identified in observational studies could not be reproduced in randomized trials or when the effect sizes differed in direction or magnitude.” (Also see “US FDA Is Hesitant About Using Observational Studies In Real-World Evidence Framework” – Pink Sheet, 6 Dec, 2018.)

 

Some themes that align with the recommendations of the IQVIA white paper have already arisen in the FDA’s limited experience with RWE. At a recent meeting on RWE hosted by the Friends of Cancer Research and Alexandria Real Estate Equities, Aetion president and chief scientific officer Jeremy Rassen identified data issues as a major stumbling block to adoption of RWE.

 

“If you read the reviews, a lot of the criticisms are about the data or the generalizability of the populations – missing data, selection bias, things like that – but also about the way that the data are analyzed,” Rassen observed. “These issues are really important to work through.” (Also see “Real-World Evidence: Sponsors Look To US FDA Drug Reviews For Potential Pitfalls” – Pink Sheet, 7 Oct, 2019.)

Not A Clinical Trial

“We need to be mindful that we cannot make the database studies too complex, too much something like clinical trials,” Nasuti said. “We should not overcomplicate the objective of the study, which is to improve the quality of drug development.” That goal “is not going to be met if these studies become too complex.”

 

Real-world database studies “need to be different from clinical trials,” she said. “The key objective is to re-use data that are already collected for other purposes,” like medical claims and pharmacy records.

 

In fact, the “growing access to different types of data sources” is one of the biggest challenges facing the RWD field, she said.

 

“We have more and exciting new types of data,” she said. “I think the challenge is to make sure that we understand the data we are using.” She pointed out that diagnostic and drug use codes often differ between locations.

 

“That’s why we highlight in the white paper the right collaboration with the right partner,” she said.

 

Nasuti highlighted I-O Optimise, a multinational real-world research initiative in lung cancer, as an example. It’s an “innovative study leveraging multiple, different real-world data sources and involving multi-disciplinary teams of experts across Europe” that was described in a March 2019 paper by Simon Ekman of Karolinska University Hospital, Sweden, et al., published in Future Medicine.

 

“The research objectives of this program span from epidemiology, standard of care, safety, patient-reported outcomes to exploring health-related quality of life of patients with lung cancer,” she reported.

 

I-O Optimise is a collaboration between Bristol-Myers Squibb Co., IQVIA, and real-world data source (RWDS) owners. An external scientific committee comprising a multidisciplinary team of experts, including clinicians/oncologists, epidemiologists, health economists and RWDS owners, “provide continuous insights and guidance on the latest medical and epidemiologic knowledge, to ensure deployment of the most rigorous methodological approaches during data analyses and to ensure that research outputs are independently verified,” the Future Medicine article explained.

 

I-O Optimise had considered a total of 594 RWDS for inclusion by the October 2018 cut-off date for the article; 173 made it to the shortlist. Only 36 were selected for initial assessment, and only 12 were selected for full assessment. Reasons for exclusion included lack of continuous data collection, overlap with other data sources, limited sample size, unsuitable database structure or content, and refusal to participate.

 

“It was, and continues to be, important to acknowledge the challenges that could be faced when conducting analyses across these RWDS,” Ekman et al. concluded. “Indeed, some of the main concerns related to the conduct of multi-data source initiatives appear to be related to incomplete/missing data and methodological differences.”

 

The authors look outside oncology for another example, pointing to the “recently established Real world Outcomes across the Alzheimer’s Disease spectrum for better care: Multi-modal data Access Platform (ROADMAP) initiative, designed to optimize real-world data generation in Alzheimer’s disease,” where “a noted challenge is the lack of standardized outcomes across the different data sources.”

The Importance Of A Good Foundation

“The first important step is to understand the rationale for conducting the study,” the IQVIA white paper advises. “The study objectives should be clearly defined and documented in scientific prose as a hypothesis that can be tested or proven, with a description of how this might be achieved through an epidemiologic study.”

 

“It is important to bear in mind that if the detail contained in the objectives is insufficient, the detail contained in the methodology will be equally insufficient, leaving the project vulnerable in terms of the overall strategy,” the white paper says.

The next step, assessment of the suitability of an RWDS for a study, “calls for both knowledge of the database and a clear vision of the study design, data components and operational requirements,” the paper states.

 

“Before selecting a database, the team should give careful consideration to the diverse and heterogeneous nature of RWD and uniqueness in both the content (e.g., parameters available, diagnostic coding) and context of the source (e.g., healthcare setting, purpose of data collection, geographic representativeness, duration of patient enrollment in the database).”

 

“Consulting a database expert who has a deep understanding of the sources can inform this process and help to place the database within the context of the healthcare environment,” the paper advises.
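As a purely illustrative sketch of the kind of content and context criteria the white paper describes, the Python snippet below screens hypothetical candidate data sources against a required coding system, availability of drug exposure records and a minimum follow-up duration. The field names, thresholds and source names are invented for the example and are not drawn from the IQVIA paper.

```python
# Illustrative only: a rough feasibility screen over hypothetical candidate
# real-world data sources, of the kind a suitability assessment might formalize.
from dataclasses import dataclass

@dataclass
class DataSource:
    name: str
    coding_system: str            # content: e.g., "ICD-10" diagnostic coding
    has_drug_exposure: bool       # content: pharmacy/dispensing records present
    setting: str                  # context: e.g., "claims", "primary care"
    country: str                  # context: geographic representativeness
    median_followup_years: float  # context: duration of patient enrollment

def is_suitable(ds: DataSource, required_coding: str, min_followup: float) -> bool:
    """Crude screen; a real assessment relies on a database expert's judgment."""
    return (
        ds.coding_system == required_coding
        and ds.has_drug_exposure
        and ds.median_followup_years >= min_followup
    )

candidates = [
    DataSource("Source A", "ICD-10", True, "claims", "DE", 4.5),
    DataSource("Source B", "ICD-9", True, "hospital", "US", 2.0),
    DataSource("Source C", "ICD-10", False, "primary care", "UK", 6.0),
]

shortlist = [ds.name for ds in candidates
             if is_suitable(ds, required_coding="ICD-10", min_followup=3.0)]
print(shortlist)  # ['Source A']
```

In practice, as the paper stresses, this screening is a judgment call informed by a database expert rather than a mechanical filter.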

 

With rationale and data sources established, a study protocol can be designed, “built on formal epidemiological principles.”

All RWD Is Local

The fourth step, applying for ethics approval, “can be a lengthy process,” the white paper cautions. Different countries have different ethical requirements for retrospective database studies, so “it is important to consider any local rules and regulations.” Local representation is necessary for multidisciplinary teams designing and overseeing RWD studies.

 

The next steps on the IQVIA matrix, building the statistical analysis plan (SAP) and extracting the data, turn to specialists with local or specific knowledge. The SAP “should be written by an experienced statistician who is familiar with the database,” the white paper says. “A data preparation plan should be created, providing an unambiguous set of instructions to the programming team to operationalize the study objectives.”

 

Data extraction should then be “executed by a database expert with sufficient knowledge of the structure, content and context of the database as well as a solid understanding of the healthcare system from which the data is derived.”
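For illustration only, the sketch below shows what one such extraction instruction might look like once operationalized in code: selecting patients with a qualifying diagnosis code and a minimum period of continuous enrollment before their index date. The table layout, column names, code list and enrollment rule are hypothetical and do not come from the white paper or any specific database.

```python
# Illustrative cohort-extraction step over toy claims and enrollment tables.
import pandas as pd

claims = pd.DataFrame({
    "patient_id": [1, 1, 2, 3, 3],
    "diag_code": ["C34.1", "I10", "C34.9", "I10", "E11.9"],   # ICD-10 codes
    "service_date": pd.to_datetime(
        ["2018-02-01", "2018-06-15", "2019-01-10", "2018-03-05", "2019-07-20"]),
})
enrollment = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "enroll_start": pd.to_datetime(["2017-01-01", "2018-12-01", "2017-06-01"]),
    "enroll_end": pd.to_datetime(["2019-12-31", "2019-03-01", "2019-12-31"]),
})

QUALIFYING_PREFIX = "C34"   # hypothetical lung cancer diagnosis code prefix
MIN_BASELINE_DAYS = 365     # hypothetical continuous-enrollment requirement

# Index date = first qualifying diagnosis per patient.
diagnosed = claims[claims["diag_code"].str.startswith(QUALIFYING_PREFIX)]
index_dates = (diagnosed.groupby("patient_id")["service_date"]
               .min().rename("index_date").reset_index())

# Keep patients with enough enrollment history before the index date.
cohort = enrollment.merge(index_dates, on="patient_id")
cohort = cohort[(cohort["index_date"] - cohort["enroll_start"]).dt.days
                >= MIN_BASELINE_DAYS]
print(cohort[["patient_id", "index_date"]])   # patient 1 only
```

The real data preparation plan, as the paper notes, would spell out such rules against the specific structure and coding of the chosen database.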

 

Analysis of RWD studies is typically “faster than clinical trials,” Nasuti commented. The white paper suggests a range “between a few weeks and a few months in most cases.” Nonetheless, “each step in a retrospective database study can take considerable time.”

 

The white paper repeatedly emphasizes the range of expertise needed to analyze and report real-world database studies. “The reporting of real-world studies requires various skillsets and should be based on feedback from a multi-disciplinary team of data scientists,” the white paper says.

 

The research team should have “a deep understanding of the content (e.g., parameters available, diagnostic codes used) and context (e.g., local healthcare setting, purpose of data collection) of the data source,” IQVIA wrote. “Especially in the case of multi-country, multi-database studies, database experts with country-specific experience should also be involved to ensure correct interpretation of differences across geographies.”

Invest In Planning Time

The IQVIA white paper responds to the need for “rigorous methods” in the burgeoning RWD and RWE fields, Nasuti indicated.

 

Given the emphasis on planning in the eight-step rubric devised by IQVIA, it is not surprising that Nasuti identified as a key challenge for the field the need “to make sure that before conducting any kind of database studies we allow enough time for planning.”

 

“Data are not always accessible,” she cautioned. In Europe especially, countries have imposed restrictions on how healthcare data can be used. “These require a lot of time, a lot of planning.” The work of multidisciplinary RWD study teams starts “well in advance” of a study.

 

https://pink.pharmaintelligence.informa.com/PS141284/RealWorld-Database…