Clinical study designs are becoming increasingly complex. A growing number of studies use adaptive designs and require decisions during study conduct. At the same time, there is growing demand for more data, a larger variety of data types, and faster decision making. During study conduct, clinical scientists are under high time pressure as they manage a multitude of tasks, such as medical data review, signal detection at the clinical study and project level, preliminary analyses, preparation for database closure, and work on publications and presentations.
Above all, clinical scientists are expected to drive innovation in pharmaceutical drug development with new clinical study designs, new assays and new ways to look at data. Innovative thinking requires time and a mind at ease, a contradictory requirement in the busy world of clinical studies. Since the evaluation of data is very often done with sub-optimal tools and processes requiring a lot of manual work, clinical scientists have neither the time nor the space needed to become creative and develop new ideas.
An effective and streamlined data flow from data capture to decision making can support the scientists in their core responsibility to innovate drug development. The specific deliverables of an improved data flow must focus on two aspects:
- Early and speedy access to quality data during study conduct including integrated data displays and the ability to pool data across studies and projects.
- Flexibility to manage changing study designs and incorporate changes to studies during setup and conduct.
Such improvements need to be achieved against the backdrop of high economic pressure for further improved operational efficiency, while maintaining continuously high levels of data quality and regulatory compliance.
RETHINKING THE FLOW OF DATA
Addressing the scientists’ needs according to these requirements is a tall order. It requires a comprehensive approach looking at the systems, data standards and business processes in a combined fashion. Standardization is the common thread running through all of these, because it offers re-usability and reduces time and effort.
Specifically, there are three topics that require consideration:
- Simplifying the Data Flow and Tools for Clinical Studies: The data flow and the involved tools need to be redesigned for seamless data transfers between systems and across functions.
- Providing Speedy Access to Study Data: For each ongoing study, early access to quality study data is required. In addition, the data flow needs to allow for the speedy implementation of study amendments at any time during the study.
- Standardizing Data Formats and Displays: On a project level, the key requirement is to implement integrated data views across multiple studies with minimal manual effort.
Finally, all functions involved need to be absolutely clear on their contribution and responsibilities across the entire data flow. In addition, there needs to be a clear distinction between mandatory process steps and deliverables versus areas where flexibility is possible and welcome.
SIMPLIFYING THE DATA FLOW AND TOOLS FOR CLINICAL STUDIES
The key design principles for the future system landscape were to minimize the number of tools and databases, eliminate redundant data storage where possible, and use the same tools or platforms across functions. The different options for the data flow need to be reduced to one preferred way of working: on-line EDC data capture and access to clinical data via a graphical data review tool.
Tools and Platforms:
- A data management tool for all clinical trials in exploratory development.
- A platform to store clinical data as the single, cross-functional repository and to be used for all clinical data.
- Data extraction and upload into the repository to be managed via programs with shared responsibilities between Data Management and Statistics.
- An interactive data review tool during study conduct and for scientific decision making.
- A web-based EDC as the single data flow for all studies in exploratory development.
- Data to be uploaded continuously into the data repository, starting with the first subject enrolled until database closure.
- Clinical scientists should be offered access to the SDTM datasets during study conduct.
- CDISC/CDASH to be implemented as standard for data capture.
- CDISC/SDTM to be implemented as standard for data extraction.
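As a loose illustration of the CDASH-to-SDTM extraction step listed above, the sketch below transposes one CDASH-style vital-signs record (one row per visit, one column per measurement) into the vertical SDTM VS domain layout (one row per measurement). The field names follow CDISC conventions, but the mapping helper, study identifier and values are invented for illustration; this is not a validated mapping.

```python
# Illustrative sketch only: map a CDASH-style captured record into
# SDTM-style VS rows. Field names follow CDISC conventions; the helper
# and the study values are hypothetical.

CDASH_TO_VSTESTCD = {
    "SYSBP": "SYSBP",   # systolic blood pressure
    "DIABP": "DIABP",   # diastolic blood pressure
    "PULSE": "PULSE",   # pulse rate
}

def cdash_to_sdtm_vs(record, studyid="STUDY-XXX"):
    """Turn one captured vitals record into vertical SDTM-style VS rows."""
    rows = []
    for field, testcd in CDASH_TO_VSTESTCD.items():
        if field in record:
            rows.append({
                "STUDYID": studyid,
                "DOMAIN": "VS",
                "USUBJID": record["SUBJID"],
                "VSTESTCD": testcd,
                "VSORRES": record[field],
                "VISIT": record["VISIT"],
            })
    return rows

captured = {"SUBJID": "1001", "VISIT": "BASELINE",
            "SYSBP": 120, "DIABP": 80, "PULSE": 64}
vs_rows = cdash_to_sdtm_vs(captured)   # three vertical VS rows
```

In a production flow, a step like this would be generated from the global standards library rather than hand-written per study, so that every study emits the same SDTM layout.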
PROVIDING SPEEDY ACCESS TO STUDY DATA
A key requirement for clinical scientists is early and speedy access to study data. This can be greatly supported by the use of global data standards. A Gartner report showed that CDISC data standards can reduce the time for study setup by up to 80% and the time for data extraction into a usable format by up to 50%. Such time savings translate directly into thinking time and space for scientists' decision making.
The redesigned data flow offers a variety of components for early and speedy access to study data.
- The use of a pre-defined global library in the web-based EDC will enable faster eCRF and database design, leading to a significant reduction in study start-up times.
- Data extraction programs and graphical displays need to be frontloaded and developed prior to first subject enrollment. When the first subject arrives, the study-specific machinery is ready to go, so that data arrive quickly in the repository and are available via displays.
STANDARDIZING DATA FORMATS AND DISPLAYS
Data standardization supports fast database setup and enables a fast data flow during study conduct. Beyond that, standards are extremely valuable when it comes to integrated analyses reaching across studies. Finally, standards are a strong enabler for presenting data in an interpretable fashion: downstream tools rely on finding variables with standardized names and types, and scientists become accustomed to this nomenclature.
For the re-designed data flow, CDISC data standards play a key role:
- The study specific databases will be built from standardized e-Forms according to the CDASH definitions.
- Data will be extracted into a standardized data model (SDTM) which serves all downstream users of the data.
- A global data model captures not only the variable names and types but also hosts descriptions and other metadata helpful for the correct usage of the data.
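A minimal sketch of what such a global data model could look like in practice, with a check of one record against it. The variable names and labels follow SDTM conventions, but the model structure and the checking helper are illustrative assumptions, not an actual define.xml or repository implementation.

```python
# Hypothetical sketch of a global data model: alongside each variable's
# name and type it hosts a label (description) -- the kind of metadata
# that tells downstream users how to interpret the data correctly.

VS_MODEL = {
    "USUBJID":  {"type": str, "label": "Unique Subject Identifier"},
    "VSTESTCD": {"type": str, "label": "Vital Signs Test Short Name"},
    "VSORRES":  {"type": str, "label": "Result or Finding in Original Units"},
}

def check_record(record, model):
    """List the ways one data record violates the model's metadata."""
    problems = []
    for var, meta in model.items():
        if var not in record:
            problems.append("missing variable: " + var)
        elif not isinstance(record[var], meta["type"]):
            problems.append(var + ": expected " + meta["type"].__name__)
    return problems

ok = check_record(
    {"USUBJID": "1001", "VSTESTCD": "SYSBP", "VSORRES": "120"}, VS_MODEL)
```

Because every study feeds the same model, a check like this can run identically across all studies in a project, which is what makes integrated, cross-study data views cheap to build.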
NEW RESPONSIBILITIES FOR CLINICAL SCIENCE
Early and speedy access to clinical data during study conduct is a privilege which comes with responsibilities.
In order to work with data, the clinical scientists need to acquaint themselves with the concept of data models. As a prerequisite to data exploration, the meaning and interpretation of the variables in data sets need to be understood.
When receiving data early during study conduct, it needs to be understood that the data are not clean. This should not cause friction in a team but should be understood by all parties involved.
Clinical scientists need to apply the concept of data exploration: first comes a question, then the data are explored using an adequate tool to get an answer to the question. Following the format of question and answer should help to look at data in a structured manner, without getting lost in a jungle of data.
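The question-then-answer pattern can be shown with a toy example: state one question, then filter the data to answer exactly that question. The rows mimic an SDTM VS extract; the subject IDs, values and the 140 mmHg threshold are invented for illustration.

```python
# Toy illustration of "question first, then explore" on SDTM-style rows.
# Subject IDs, values and the threshold are hypothetical.

vs = [
    {"USUBJID": "1001", "VSTESTCD": "SYSBP", "VSSTRESN": 142, "VISIT": "WEEK 2"},
    {"USUBJID": "1002", "VSTESTCD": "SYSBP", "VSSTRESN": 118, "VISIT": "WEEK 2"},
    {"USUBJID": "1003", "VSTESTCD": "SYSBP", "VSSTRESN": 151, "VISIT": "WEEK 2"},
]

# Question: which subjects had systolic blood pressure above 140 mmHg
# at week 2?
answer = sorted(r["USUBJID"] for r in vs
                if r["VSTESTCD"] == "SYSBP"
                and r["VISIT"] == "WEEK 2"
                and r["VSSTRESN"] > 140)
# answer == ["1001", "1003"]
```

Starting from the question keeps the exploration bounded: the filter encodes the question, and the result is the answer, rather than an open-ended browse through the dataset.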
The key elements to enable scientific innovation in drug development are
- Early and speedy access to study data in a useable format, and
- Time and space for scientists to work with the data.
The daily transactions in drug development, however, frequently do not provide room for both data availability and thinking time. Correspondingly, an improved data flow, facilitated by an integrated data management platform coupled with data visualization tools, can encourage innovation while maintaining overall efficiency and regulatory compliance.
At MaxisIT, we clearly understand strategic priorities within clinical R&D, and this resonates with our experience of implementing solutions that improve the clinical development portfolio through an integrated, platform-based approach. The platform delivers timely access to study-specific as well as standardized and aggregated clinical trial operations and patient data, and it enables efficient trial oversight via remote monitoring, statistically assessed controls, data quality management, clinical reviews, and statistical computing. Moreover, it provides capabilities for planned-versus-actual trending and optimization, as well as for fraud detection and risk-based monitoring. MaxisIT’s Integrated Technology Platform is a purpose-built solution that helps the pharmaceutical and life sciences industry by empowering business stakeholders with integrated computing and self-service dashboards in a strategically externalized enterprise environment, with a major focus on core clinical operations data and clinical information assets. This allows improved control over an externalized, CRO- and partner-driven clinical ecosystem, and it enables in-time decision support, continuous monitoring of regulatory compliance, and greater operational efficiency at a measurable rate.