Resolve Clinical Development Data Challenges with RBQM

By Suvarnala Mathangi | Date: July 31, 2022 | Blog | 0 Comment(s)

Clinical trials must be properly monitored by sponsors, not only to protect trial participants but also to ensure the integrity of trial data for regulatory compliance. Because it adapts the intensity of oversight to the level of risk, Risk-Based Quality Management (RBQM) can be far more effective than exhaustive on-site monitoring. This is why sponsors see RBQM as the way forward: by focusing monitoring on critical data elements, it helps ensure data quality, improve patient safety, and contain the rising costs of drug development.

It’s time to retire cumbersome practices that drive up drug development costs and extend project timelines, with teams traveling to every study site to check and verify every data point, whether critical or not. Monitoring today can be centralized and remote. Submitted data can be verified remotely to identify missing-data trends, outliers, and even potential protocol deviations without on-site visits. With RBQM, it’s easy to spot unusual data distributions at any site, high screen-failure or withdrawal rates, high levels of eligibility violations, and delays in reporting critical data elements. Integrated RBQM solution providers help improve monitoring processes, enabling real-time analysis and improvements in overall quality and compliance.

Evolving RBQM Solutions

An effective, cloud-based integrated solution offers sponsors the ability to gain oversight into clinical investigations by integrating with multiple EDC, CTMS, Safety, Healthcare, PV and document management systems (DMS) and by enabling the standardization and storage of data. The solution’s data analytics capabilities should offer sponsors a range of insights into trial data, including predictive analytics with a dashboard for metrics, key performance indicators, key risk indicators, configurable thresholds, triggers, alerts, escalations, and workflows. This will drive proactive risk mitigation and actionable outcomes and enable timely decision-making. It can also enable course correction throughout the monitoring cycle. Sponsors can now enhance patient safety and data quality, as the solution enables effective protocol design, reduces costs, and offers the ability to strategically adjust oversight in proportion to the level of risk.
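The configurable thresholds, triggers, and alerts described above can be sketched in a few lines. This is a minimal illustration, not any particular vendor’s API; the KRI names and limits below are invented for the example.

```python
# Minimal sketch of configurable KRI thresholds driving alerts.
# KRI names and limits are illustrative assumptions, not a real study config.

KRI_THRESHOLDS = {
    "screen_failure_rate": 0.30,   # alert if > 30% of screened subjects fail
    "query_rate_per_crf": 2.0,     # alert if > 2 open queries per CRF page
    "sae_reporting_lag_days": 3,   # alert if SAEs are reported later than 3 days
}

def evaluate_site(site_id, metrics, thresholds=KRI_THRESHOLDS):
    """Compare a site's observed metrics to configured thresholds."""
    alerts = []
    for kri, limit in thresholds.items():
        value = metrics.get(kri)
        if value is not None and value > limit:
            alerts.append((site_id, kri, value, limit))
    return alerts

site_metrics = {"screen_failure_rate": 0.42, "query_rate_per_crf": 1.1,
                "sae_reporting_lag_days": 5}
for site, kri, value, limit in evaluate_site("SITE-001", site_metrics):
    print(f"{site}: {kri} = {value} exceeds threshold {limit}")
```

In a real system the thresholds would be study-specific configuration, and each breach would feed an escalation workflow rather than a print statement.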

Automating and Optimizing Monitoring Activities

Trial monitoring accounts for a substantial portion of total study costs in the drug development cycle, and sponsors have long explored ways to reduce it through more efficient study management. This calls for monitoring functionality that goes beyond the central data collection capabilities of EDC and CTMS systems. With the shift toward risk-based monitoring, technology now helps collect and track critical data elements, with real-time data standardization and analytics that provide timely insights and significantly reduce trial costs and timelines. The powerful insights delivered by modern analytical tools and technologies are driving more effective and accurate monitoring and risk mitigation. Risk-based site management is helping to reduce costs by more than 20% while improving the quality and integrity of the trial: it uncovers problems with the data early, identifies potential risks, and confirms that site staff are adhering to trial protocols.

Monitoring plans should be tailored to protect trial participants and the integrity of trial data, with focus on preventing or mitigating possible sources of error in the conduct, collection, and reporting of critical data and processes. Sponsors must also analyze ongoing data to continuously assess and adjust their monitoring strategies and activities.

Requirements for RBQM Success

When choosing a solution that will support evolving requirements, ensure that it offers you the following capabilities:

  • Disparate Data Throughout the Data Lifecycle: The ability to connect to disparate data sources and to integrate, aggregate, and review both sponsor-specific and standard metadata, with remote analysis capability. The solution should be flexible and configurable across data standards and offer cross-functional report generation, providing better visibility into disparate data based on their linkages and dependencies.
  • Risk Mitigation Functionality: The ability to mitigate risk through selective source data validation in a seamless, real-time process, identifying discrepancies such as missing or fraudulent data, duplicate records, data outliers, and inconsistent data, thereby increasing data quality and integrity.
  • Data Standardization: Eliminate inconsistencies and standardize collected data for analysis or submission to regulatory authorities, helping identify missing or inconsistent data, data outliers, and potential protocol deviations that may indicate systemic or significant errors in site-level data collection and reporting.
  • Process Visibility: Centralized, easy-to-navigate monitoring capabilities that offer visibility into issues, their status, and the actions required. Reports in formats such as XLS, PDF, PNG, JPG, and HTML, generated by applying filters such as study or site in the report generation interface.
  • Content Management Platform: A simplified content development solution with well-organized author and reviewer workflows and features such as reusability, resulting in cost-effective, high-quality content development.
  • Innovative Analytics: Easy operability and role-based dashboards with data-driven visualizations, statistical reports, and scenario modeling, including an analytics dashboard with cross-functional drilldowns to reach the root cause of an issue.
  • Compliance: Compliance with global regulatory standards such as 21 CFR Part 11 and the EMA’s computerized system validation (CSV) guidelines.
  • Security: Central, highly secure data storage in the cloud with role-based user access.
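As a rough illustration of the risk-mitigation and standardization checks listed above (missing data, duplicate records, statistical outliers), here is a minimal sketch. The record fields and the MAD-based outlier rule are assumptions made for the example, not part of any specific product.

```python
# Sketch of centralized data review: flag missing values, duplicate records,
# and statistical outliers. Field names ("subject_id", "visit", "systolic_bp")
# and the median-absolute-deviation (MAD) rule are illustrative assumptions.
from statistics import median

def review_records(records, field="systolic_bp", mad_limit=5.0):
    """Return (issue_type, subject_id, visit) tuples for data discrepancies."""
    issues, seen = [], set()
    values = [r[field] for r in records if r.get(field) is not None]
    med = median(values)
    mad = median(abs(v - med) for v in values)  # robust spread estimate
    for r in records:
        key = (r["subject_id"], r["visit"])
        if key in seen:
            issues.append(("duplicate",) + key)
        seen.add(key)
        v = r.get(field)
        if v is None:
            issues.append(("missing",) + key)
        elif mad and abs(v - med) / mad > mad_limit:
            issues.append(("outlier",) + key)
    return issues

records = [
    {"subject_id": "S01", "visit": 1, "systolic_bp": 118},
    {"subject_id": "S02", "visit": 1, "systolic_bp": 121},
    {"subject_id": "S03", "visit": 1, "systolic_bp": None},  # missing value
    {"subject_id": "S04", "visit": 1, "systolic_bp": 119},
    {"subject_id": "S04", "visit": 1, "systolic_bp": 119},   # duplicate record
    {"subject_id": "S05", "visit": 1, "systolic_bp": 310},   # outlier
]
print(review_records(records))
```

A production system would run such checks continuously across all sites and route each finding into the issue-management workflow.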

MaxisIT’s own fit-for-purpose RBQM solution offers all these and much more.

About MaxisIT

At MaxisIT, we clearly understand strategic priorities within clinical R&D, as we implement solutions for improving clinical development portfolios via an integrated, platform-based approach. For over 17 years, MaxisIT’s Clinical Trial Oversight System (CTOS) has been synonymous with timely access to study-specific, standardized, and aggregated operational, trial, and patient data, enabling efficient trial oversight. MaxisIT’s platform is a purpose-built solution that helps the Life Sciences industry by empowering business stakeholders. Our solution optimizes the clinical ecosystem and enables timely decision support, continuous monitoring, regulatory compliance, and measurable gains in operational efficiency.

Have you outgrown your Statistical Computing Environment (SCE)?

By Suvarnala Mathangi | Date: June 30, 2022 | Blog | 0 Comment(s)

From cars to computers, with every product we use, a tipping point comes when it becomes ineffective or obsolete and needs replacement with a shinier and better performing version. Most of us resist change, and would rather plod along, using the old and underperforming version until it breaks down on us and refuses to go any further. In this process, we generally lose out on improved productivity and superior results.

Let’s consider how this scenario translates to modern clinical trials, with specific reference to the Statistical Computing Environment (SCE).

Have you outgrown your SCE?

Technology keeps evolving, with continuous change and improvement occurring at a rapid pace. Such changes track developments in operating processes as well as regulatory requirements. When we refuse to make the switch, we lose out on many of these beneficial changes that could improve efficiency. When an SCE has outlived its usefulness, statisticians, data scientists, and programmers struggle to keep up with expectations. Let’s look at some tell-tale signs that an SCE has outlived its usefulness and must be replaced:

  • You see your team frustrated with the number of manual workarounds and processes needed to derive the results they need.
  • You see the market offering SCEs which are light years ahead of yours, in terms of capabilities.
  • You are not deriving the benefits you really need from your SCE.

What can you expect from an SCE?

Clinical development teams using an SCE should enjoy seamless collaboration, controlled compliance, and powerful computing that empowers the programmers, statisticians, and data scientists on the team to be more productive. Complex data, a myriad of sources and formats, and varying requirements for different stakeholders make this a difficult task. Yet process optimization is essential for teams to realize time and cost savings.

In other words, an effective SCE enables teams to work collaboratively and offers an integrated and accessible environment where the teams have clear visibility into the lifecycle of statistical programming. The environment should offer audit trails, version control, and traceability while also enabling distributed and remote work teams. It also offers powerful computing to create and manage data structures quickly to gain insight into complex tasks.

Ensure that your Statistical Computing Environment (SCE) is a controlled and collaborative workspace where Biostatisticians, Clinical Programmers, and Data Scientists can manage tools, and share best practices, analytics, and validation plans. It needs to be a metadata-driven, scalable computing environment built upon an integrated data repository that supports exploratory analysis and the production of review-ready deliverables for regulatory submission.

From its intuitive user interface to its nimble product architecture, a modern SCE needs to deliver the robust data science and analytics functionality that Life Sciences R&D teams demand, along with these specific functions:

An integrated repository:

The SCE’s metadata-driven environment is an integrated data repository that supports regulatory as well as exploratory analysis without constraints. Access to such an integrated platform offers numerous benefits to biostatisticians, programmers, and other clinical trial team members. They gain version control with a single source of truth for clinical trial data, an integrated computing experience, and a controlled and compliant environment that offers traceability, transparency, and auditability.
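The version-control and audit-trail ideas above can be illustrated with a toy content-addressed repository. This is a conceptual sketch only, assuming invented class and method names; a real repository would add access control, retention rules, and richer metadata.

```python
# Toy sketch of versioned, auditable dataset storage: each save is
# content-addressed, giving a single source of truth with traceability.
import hashlib
import json
from datetime import datetime, timezone

class DatasetRepository:
    def __init__(self):
        self._versions = {}   # (name, version hash) -> content
        self._audit_log = []  # append-only trail of (timestamp, user, name, hash)

    def save(self, name, content, user):
        """Store a dataset version; the hash of its content is its version ID."""
        payload = json.dumps(content, sort_keys=True).encode()
        digest = hashlib.sha256(payload).hexdigest()[:12]
        self._versions[(name, digest)] = content
        self._audit_log.append(
            (datetime.now(timezone.utc).isoformat(), user, name, digest))
        return digest

    def load(self, name, version):
        return self._versions[(name, version)]

    def history(self, name):
        """Audit trail for one dataset: who saved which version, and when."""
        return [entry for entry in self._audit_log if entry[2] == name]

repo = DatasetRepository()
v1 = repo.save("dm", [{"USUBJID": "S01", "AGE": 54}], user="stat_programmer")
v2 = repo.save("dm", [{"USUBJID": "S01", "AGE": 45}], user="stat_programmer")  # correction
assert repo.load("dm", v1) != repo.load("dm", v2)  # both versions remain retrievable
print(len(repo.history("dm")))  # prints 2
```

The design choice worth noting is that nothing is ever overwritten: corrections create new versions, so every number in a submission can be traced back to the exact data it was computed from.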

Supercharged analytics:

The robust integrated platform supports global access and enables collaboration in a secure and compliant environment. Teams can manage sequences of steps, assess progress, and use automated workflows for review and validation. They can use familiar, industry-standard tools such as SAS, R, and Python to efficiently clean, analyze, and present data in more meaningful visualizations. As the complexity of clinical trials grows, data scientists, biostatisticians, and statistical programmers need a variety of tools to support complex analyses. The SCE’s powerful web-based computing needs to support both regulatory and exploratory analyses.

Automated workflows:

Metadata-driven workflows must support collaboration, review, and finalization of data deliverables. By accelerating analysis and productivity, the technology offers time and cost savings. With configurable workflow-driven automation, it increases the speed and accuracy of production and lifecycle management of CDISC-compliant submission data packages, including SDTM and ADaM datasets, as well as the Tables, Listings, and Figures (TLFs), and patient narratives incorporated into the clinical study report (CSR). Robust analytics enable teams to make faster, more informed decisions earlier in the data lifecycle.
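One common validation pattern in such review-and-validation workflows is independent double programming, where a production derivation and a QC derivation of the same variable are compared record by record. The sketch below illustrates the idea with a deliberately simple derivation (an age-group flag); the function names and cutoff are illustrative, not a prescribed standard.

```python
# Sketch of an independent double-programming check: two independently
# written derivations of the same ADaM-style variable are compared.
# The derivation (age group from AGE, cutoff 65) is a simplified stand-in.

def production_agegr1(age):
    """Production programmer's derivation."""
    return "<65" if age < 65 else ">=65"

def qc_agegr1(age):
    """QC programmer's independent derivation of the same rule."""
    if age >= 65:
        return ">=65"
    return "<65"

def compare(subjects):
    """Return subject IDs where the two derivations disagree."""
    return [s["USUBJID"] for s in subjects
            if production_agegr1(s["AGE"]) != qc_agegr1(s["AGE"])]

subjects = [{"USUBJID": "S01", "AGE": 64}, {"USUBJID": "S02", "AGE": 65}]
mismatches = compare(subjects)
print("PASS" if not mismatches else f"FAIL: {mismatches}")  # prints "PASS"
```

In an automated workflow, a non-empty mismatch list would block the deliverable from moving to the "validated" state.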

Other beneficial features:

Complex data, a myriad of sources and formats, and varying stakeholder requirements make deriving insights a difficult task. Environmental controls, ongoing visibility into the process, and automated workflows supporting review and validation are essential for success. An effective SCE enables seamless collaboration and offers these additional benefits:

  • Faster processes, including web-based review and approval of programs and reports, with built-in project management providing structure for all elements of a project.
  • A web-based global workspace with boundaryless access that supports broad 24/7 collaboration and on-demand performance.
  • A global library of objects and snippets that facilitates the management, sharing, and reuse of code, with automated workflows, tasks, and reusable templates.
  • Role-based security, a controlled environment, and traceability between programs, data, and output, with clear documentation and rapid-response capability for regulatory inquiries.
  • Regulatory submission-ready output with built-in CDISC standards and CDISC-compliant data sets.
  • Traceability, transparency, and auditability, with 21 CFR Part 11 and GxP compliance.
  • Powerful computing.
  • Flexibility to import and act on data from a variety of data sources.


With such an SCE, life sciences teams get to improve collaboration at each stage of the clinical data lifecycle including data ingestion, integration, transformation, review, and analysis. As biostatisticians and clinical programmers collaborate with data management and clinical operations, teams can help answer critical safety and efficacy questions about the research product and quickly and accurately create submission-ready data deliverables for the products under investigation. So, would you say you have outgrown your SCE?


The Role of Technology in Deriving Actionable Insights from Clinical Trial Data 

By Suvarnala Mathangi | Date: April 30, 2022 | Blog | 0 Comment(s)

Clinical trials demand time and money from drug sponsors and patience from participants. From start to finish, they average nine years and a cost of $1.3 B. The real challenge is that only about 12% of clinical trials succeed, while the rest leave dashed hopes behind, raising the question: what can make clinical trials succeed? In other words, what makes them fail? How do we derive actionable insights from clinical trials, even as we ensure adherence to FDA guidelines, follow good manufacturing practices, and continually tackle issues with patient recruitment, enrollment, and retention? How do we ensure that, when results are reviewed at the end of each stage of a clinical trial, they justify moving to the next stage? In short, how do we ensure the success of a trial?

Technology is the only recourse we have to advance clinical research while staying compliant with the FDA’s guidance on in-home visits, direct-to-patient trial supply, telehealth, ePRO/eCOA, eConsent, and remote patient monitoring. As more drug sponsors adopt decentralized trials, we need to track all in-clinic and remote data endpoints. Poor study design; inappropriate endpoints or eligibility criteria, or omission of appropriate ones; inappropriate analysis; a sample that is too large or too small; inconsistencies in the protocol; suboptimal site selection; and recruitment issues are all reasons a clinical trial can fail. Most of them lead us to collect ineffective data.

Since our focus is on data, let’s also check whether we are working hard enough to derive reliable insights from the data we collect so assiduously. Do we understand data-quality issues and analyze outliers thoroughly, without cutting corners? Are we killing our projects through a negligent approach to ensuring clean data? These days, huge amounts of data are collected from clinical findings and observations, patient health records, scanned images, wearable devices, and genomic studies. We bring these large volumes of data together to throw light on a patient’s health in myriad ways, enabling a precision-health approach and advancing health care itself. But while quantity is assured, quality may be missing. Data sources should be verified before their data are collected and integrated into the trial data.

Good Data – The Key to Success

Collecting patient medical data presents many challenges. The data may be unstructured or drawn from independent sources that operate in silos. Data often loses its structure during transmission. It may also be deficient from the start because collection methods were ineffective, because patients failed to adhere, or because records were misplaced or lost. Patients may also drop out because they are unwilling to travel or pay for additional tests. All of these issues, along with duplication, overlap, inadequacy, and incompleteness, can adversely affect the data collected and thereby the accuracy of the study and its outcomes.

Gaining Insights from Data

Clinical trials need accurate, consistent, high-quality data with high response and low attrition for reliable results. While bringing cost-effective care to people, quality data could find a cure for orphan diseases, reduce medical errors, measure the effectiveness of a given drug against a specific ailment, or drive the delivery of optimal treatment based on data-driven diagnostics and analytics. The continuing issue is ensuring the quality of data. We all know that data is everywhere and is being collected every step of the way. Gaining deep and valuable insights from it is possible only when high-quality data from disparate sources is combined and allowed to present a broader picture.

Role of Artificial Intelligence

AI works best with bigger data sets, but we need to exercise the required care to ensure that data lakes do not turn into data swamps, offering little or no actionable insights. By applying AI at the point of care, clinical trial teams can work on making the insights more efficient for patients, manufacturers, and physicians. AI could also help resolve the challenges with enrollment and ensure that the participants adhere to the medication protocol.

Collecting high-quality data and deriving meaningful insights from it is the key to success. As we look to the future of clinical trial data processes, we can see that technology will continue to have a significant impact on the data lifecycle and how we manage it to achieve the best outcomes. 


Adopting Transformative Changes in Clinical Development

By Suvarnala Mathangi | Date: March 31, 2022 | Blog | 0 Comment(s)

Efforts to bring new therapies to market have never been more challenging than today, with drug sponsors striving to manage dynamic, transformative changes in drug development. Advances in data mining and analysis are met by a growing number of data sources. At the same time, digitization has enabled the patient-centric design of trial protocols with an increased focus on the patient experience during clinical trials. We can better segment disease/patient populations and collect better quality data, garnering real-world evidence for regulatory approval. Drug sponsors follow master protocols while conducting adaptive trials designed to allow modifications to a trial or alterations to the statistical procedures without affecting a trial’s integrity and validity. Novel approaches to drug development are especially acceptable to both drug sponsors and regulators when seeking suitable therapies to tackle cancer or other rare, life-threatening diseases. 

Such transformative changes call for corresponding changes to current operating models, along with changes to capabilities, infrastructure, and partnerships. They require more investments where the ROI has been declining for years. For lasting change, clinical trial stakeholders could share data standards, clinical data, and access to a limited pool of clinical trial participants. They can also come together to resolve evidence hurdles even as they curate real-world data sets and pilot new approaches to reaching planned endpoints. Some companies are also coming together to share historical trial data to build historical control arms and develop industry-wide data standards and metadata in priority therapy areas.

Transformative approaches to clinical trials seek to speed up the development of effective therapies while maintaining high levels of quality in the research output in various ways: 

Advanced Statistical Techniques

By adopting advanced statistical techniques to analyze the data they collect and curate, clinical trial teams improve efficiency and the probability of success despite the growing number of data sources. However, researchers must work collaboratively with regulators when adopting transformative approaches, to ensure that the geographies they plan to target are sufficiently modernized and receptive to the change.

Adaptive Trial Designs

The FDA has made it possible for drug developers to reach their endpoints while learning, continuously adjusting, and reducing cycle times using adaptive designs. The FDA encourages drug sponsors to adopt innovative clinical trial designs such as platform trials, basket studies, adaptive trials, and pragmatic randomized controlled trials. Basket trials employ a targeted therapy for multiple diseases linked to the same genetic aberration. Umbrella trials test more than one targeted therapy for a specific disease. Platform trials evaluate multiple therapies for a single disease, using a decision algorithm to determine their relative effectiveness. Such trial designs reduce research costs and time by allowing parallel evaluation of multiple treatments within a single clinical trial structure.

Patient Advocates

Such trial designs enable researchers to answer multiple questions efficiently while allowing patients to navigate the complexity of clinical trials more easily. Meanwhile, patient advocacy groups are pitching in to get trials designed around patient-centric endpoints and helping to gain buy-in from important stakeholders such as payers, physicians, and regulators.

RWD/RWE

Patient and disease populations are segmented better as scientific data is combined with real-world data from patient registries, electronic medical records, imaging, and claims data. This approach helps us understand diseases better and explain, influence, or predict their outcomes. The sheer volume of available data may require researchers to narrow their focus and build a robust data ecosystem before they can derive statistically significant insights.

Real-world evidence from external control arms (ECAs) first proved valuable where putting some patients on placebo was neither ethical nor practical. ECAs can reflect the actual standard of care better than any clinical trial, and they reduce development times by informing the decision, in early-phase trials, of whether to proceed to later-stage trials. They take less time to demonstrate the safety and efficacy of a product and save the time and cost of running full clinical trials.

Clinical Trial Simulations

Designing a trial protocol (treatment duration, number of visits, and recruitment targets per site) and simulating its feasibility helps clinical trial teams reach their milestones faster. Using modeling and simulation (or a digital twin) to assess the impact of inclusion and exclusion criteria helps companies save time and achieve trial milestones more efficiently than earlier methods.
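The kind of feasibility simulation described here can be as simple as a Monte Carlo model of weekly enrollment. The sketch below is illustrative only; the per-site rates, the recruitment target, and the Poisson-like approximation are assumptions for the example.

```python
# Monte Carlo sketch of recruitment feasibility: given per-site weekly
# enrollment rates, estimate the average number of weeks to hit a target.
import random

def weeks_to_target(site_rates, target, trials=2000, seed=42):
    """Simulate weekly enrollment; return mean weeks until target is reached."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        enrolled, weeks = 0, 0
        while enrolled < target:
            weeks += 1
            for rate in site_rates:
                # Approximate Poisson(rate) by summing ten Bernoulli draws.
                enrolled += sum(rng.random() < rate / 10 for _ in range(10))
        totals.append(weeks)
    return sum(totals) / len(totals)

# Ten sites averaging ~1.2 subjects/week each, 120-subject target.
print(round(weeks_to_target([1.2] * 10, target=120), 1))
```

Varying the rates or the inclusion criteria that drive them lets a team compare protocol designs before committing to one.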

Artificial Intelligence

These transformative approaches benefit extensively from AI, whether it is used to normalize data from different sources or to curate unstructured data types such as images or clinical notes. Machine learning algorithms can automate much of the analysis, provided you have the right technology capabilities.

If managed right, these approaches can be leveraged to reduce the time needed to move a therapy from application to initial approval. They can help sponsors evaluate multiple therapies in parallel using a master protocol, segment patients better to accelerate recruitment, use adaptive designs to save time, or reduce approval times using RWE while ensuring the safety and efficacy of the therapies. It is high time clinical teams stepped out of their comfort zones to adopt one or more of these transformative approaches and reap the benefits these innovations offer.


Challenges of Incorporating Wearable and At-Home Device Data into Clinical Data Management

By Suvarnala Mathangi | Date: January 31, 2022 | Blog | 0 Comment(s)

Wearables have come a long way from being mere aids for insomniacs and fitness enthusiasts. Today they form part of digital health technologies, maturing into sensor-based devices that gather real-time data for clinical trials. The insights they offer into a patient’s health and fitness stream in around the clock, helping clinical researchers track and measure novel, under-reported endpoints. Regulatory authorities around the world are adding them to their standard frameworks as tools for gathering patient data and integrating it into clinical trials. These are, undoubtedly, exciting times.

Sensor-driven wearables are invaluable tools, set to revolutionize research and data collection in clinical trials. Given their ability to collect data at a highly granular level, wearables play a significant role in trials collecting longitudinal, high-frequency data. Many decentralized and hybrid trials are underway, demonstrating the benefits of wearables in clinical trials. As of this past December, 1,421 clinical studies involved the use of wearables, according to clinicaltrials.gov. With the impetus the COVID-19 pandemic has given to digital transformation, this trend is expected only to accelerate.

As with all good things, benefits are followed by challenges, and this article tries to look at what they are and how they may affect the integration of wearables into clinical trials.

Choice of Device/Technology:

It is not enough to find a wearable that is efficient at collecting the required data. We should also factor in any limitations and preferences of trial participants. Participants need to be comfortable with the device; only their acceptance will drive willing and consistent use for the duration of a study. To ensure adherence, sponsors would do well to gauge participants’ comfort with a particular device before adopting it for a trial. Clinical study teams need to monitor all aspects of a wearable’s use in a trial, from data connectivity, data type, privacy, and security to the effective extraction and processing of data, and be assured of the data’s accuracy and validity before integrating it into their research. This is especially important when the wearable is a consumer-grade device.

Implementation Challenges:

Wearables affect clinical trial data collection in many ways, even as they evolve and improve in quality and efficiency to meet the expectations placed on them. A wearable’s size, battery life, impact on daily activities, regulatory approval status, and ability to capture the trial’s endpoints with the required verification and validation all need careful consideration and evaluation. Sponsors’ choice of device or technology is primarily dictated by data generation and integration capabilities that meet the parameters of the clinical trial. Adding a data source to a clinical trial needs to be done right, making the choice of wearable and its implementation crucially important. Some wearables now enable longitudinal biometric data sets that provide unique insights into the long-term effects of treatment protocols. Many wearables use proprietary algorithms, and accessing the data they collect, clinically validating it, and making use of it are challenges that must be resolved, along with integrating it into the software that manages the trial’s data.

Challenges with Owned Devices:

Trial teams may face challenges when participants find a wearable uncomfortable or difficult to manage. Allowing participants to bring their own devices could increase patient engagement, reduce logistical complexity, and improve adherence. But owned devices may not offer a uniform measure of patient-reported outcomes (PROs) across the trial in a manner compliant with regulatory requirements, which explains sponsors’ reluctance to accept BYOD as a clinical trial strategy. Without software that is agnostic to the wearables’ diverse operating systems, trials using patient-owned devices may require longer timelines.

Ensuring Data Security and Privacy:

Wearables record a great deal of data, which must be processed, integrated into the trial data, and reported. This must comply with data privacy and patient consent requirements for data collection and sharing, as well as the standards laid down by laws such as HIPAA and certifications such as HITRUST. By using licensed data management platforms, companies can strategize to ensure data privacy and content validity even with wearables that are not fully compliant. Working with device makers to customize the collection of clinical trial endpoints that can be validated may offer a way to resolve the challenges of data ownership and sharing, consent, privacy, and security while adhering to region-specific regulatory requirements.

Effectiveness/Limitations of Technology:

Technical considerations are important when including a wearable in a clinical trial, and regulatory considerations are essential. FDA-cleared devices help assure trial sponsors of data privacy and ownership, data accuracy, data security, and adherence to trial protocols. Wearable devices and sensors are improving by leaps and bounds, offering highly useful and insightful metrics, independent connectivity, stronger security, and better battery life. They are progressively evolving into fit-for-purpose devices that align with analytical and clinical validation methodologies. For wearables to be accepted in clinical trials, their monitoring needs to be continuous, their data verifiable, and their clinical insights meaningful.

Ensuring Seamless Data Integration:

Huge amounts of raw data collected from wearables need to be validated, processed, and analyzed without compromising data security. Most data from wearables cannot be accessed directly; it must first be transferred to a mobile phone before being directed to the trial database for interpretation. This flow of data presents significant challenges and can drive up costs if there are no failsafe mechanisms to prevent data loss. It is essential to have a system in place that lets trial sponsors monitor trial progress, the movement of data, and the source documentation of data with audit trails. The system needs to be able to spot errors in real time and initiate corrective action as needed.
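The kind of real-time check described above can be sketched in a few lines. The sensor names, plausibility ranges, and record fields below are illustrative assumptions for the sketch, not any particular platform’s API:

```python
from datetime import datetime, timezone

# Hypothetical plausibility bounds per sensor metric; a real trial would
# derive these from the protocol's validated reference ranges.
PLAUSIBLE_RANGES = {"heart_rate": (30, 220), "spo2": (70, 100)}

def validate_reading(reading, audit_log):
    """Check a wearable reading as it arrives and record the result
    in an append-only audit trail (source documentation)."""
    metric, value = reading["metric"], reading["value"]
    low, high = PLAUSIBLE_RANGES.get(metric, (float("-inf"), float("inf")))
    status = "ok" if low <= value <= high else "flagged"
    audit_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "device_id": reading["device_id"],
        "metric": metric,
        "value": value,
        "status": status,
    })
    return status

audit_log = []
validate_reading({"device_id": "W-001", "metric": "heart_rate", "value": 72}, audit_log)
validate_reading({"device_id": "W-001", "metric": "heart_rate", "value": 400}, audit_log)
flagged = [e for e in audit_log if e["status"] == "flagged"]
print(len(audit_log), len(flagged))  # 2 entries logged, 1 flagged for review
```

Every reading, clean or not, lands in the audit trail; only the flagged ones would trigger a corrective-action workflow.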

Identifying the best fit-for-purpose device or technology:

Choosing the right device to monitor the targeted therapeutic area through timely intervention is also a very important aspect of employing wearables in clinical trials to derive meaningful digital endpoints. Sponsors must keep their clinical endpoints in mind when selecting a device, ensuring that it can measure what they need it to. Many consumer-grade devices are available, but their design does not specifically address a medical problem. Trial sponsors need to ensure that a particular device meets their requirements and employs appropriate terminology. There is also a need for technology that can process the massive volume of 24×7 data collected by sensors, which is very different from the scheduled collection of data by teams at clinical sites.

Integrating data:

Collecting data from high-frequency sensors becomes a way of life once wearables are used in clinical trials. Integrating these data streams with data held in existing clinical systems is a challenge, as it requires a customized solution based on the existing architecture that standardizes the integration of inputs in each case. Determine what data can be accessed and decide on a workflow for its secure transfer. For reliable interpretation, account for device-specific limitations, data missing due to non-adherence, duplicate data, and any inaccuracies. All data submitted to regulators must meet minimum standards of validity, reliability, sensitivity, and robustness.
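The deduplication and gap-flagging steps described above might look like the following minimal sketch; the tuple layout and the one-minute sampling interval are hypothetical choices for illustration:

```python
def prepare_stream(readings, expected_interval_s=60):
    """Deduplicate sensor readings and flag gaps that may indicate
    non-adherence. `readings` is a list of (epoch_seconds, device_id,
    value) tuples; layout is illustrative."""
    # Drop exact duplicates (same device, same timestamp), keeping the first.
    seen, deduped = set(), []
    for ts, dev, val in sorted(readings):
        if (ts, dev) not in seen:
            seen.add((ts, dev))
            deduped.append((ts, dev, val))
    # Flag gaps longer than the expected sampling interval.
    gaps = []
    for (t0, d0, _), (t1, d1, _) in zip(deduped, deduped[1:]):
        if d0 == d1 and t1 - t0 > expected_interval_s:
            gaps.append((d0, t0, t1))
    return deduped, gaps

readings = [(0, "W-001", 70), (60, "W-001", 72), (60, "W-001", 72), (300, "W-001", 75)]
deduped, gaps = prepare_stream(readings)
print(len(deduped), len(gaps))  # 3 unique readings, 1 gap (60 -> 300)
```

A production pipeline would also reconcile device-specific quirks and push quarantined records through a review queue rather than silently dropping them.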

These challenges are more than offset by the benefits wearables offer: they have proven highly effective at collecting clinical data, improving the effectiveness of clinical trials while reducing non-compliance and overall costs. They enable more observational studies and encourage new treatments and protocols, improving patient care with real-time insights. Patients can avoid periodic site visits, while sponsors can redeploy personnel from clinical sites. Wearables indisputably offer innovative new approaches and are set to modernize clinical research.

About MaxisIT

At MaxisIT, we clearly understand strategic priorities within clinical R&D, as we implement solutions for improving Clinical Development Portfolios via an integrated platform-based approach. For over 17 years, MaxisIT’s Clinical Trial Oversight System (CTOS) has been synonymous with timely access to study-specific, standardized and aggregated operational, trial, and patient data, enabling efficient trial oversight. MaxisIT’s platform is a purpose-built solution, which helps the Life Sciences industry by empowering business stakeholders. Our solution optimizes the clinical ecosystem; and enables in-time decision support, continuous monitoring over regulatory compliance, and greater operational efficiency at a measurable rate.

Changing Times require Innovative Approaches

By Suvarnala Mathangi | Date: September 30, 2021 | Blog | 0 Comment(s)

This blog is the second in a 2-part series that discusses the increasing variety of data types and sources. The first installment described the increasing variability of sources, types, and formats. This blog focuses on the impact on the people, processes, and technology involved in clinical trials.

The variety, volume, and velocity of data coming from new sources, in new formats, and in new ways is changing the way the clinical research industry works. It is affecting the people, processes, and technologies that support clinical trials and motivating innovations across research organizations.

Technology

Digitization is a double-edged sword: it helps teams source more data than ever before, while wearable technology creates more of it than ever before. New data can provide unprecedented insights if clinical trial teams use robust technologies. The variety of tech-enabled sources of clinical data includes:

o IoT, wearables, and other patient-generated data (e.g., home BP monitors)
o EHR, physiologic sensors, and apps
o Lab tests and biomarkers
o EDC systems and mobile healthtech
o Real-World Evidence (RWE) outcomes as data and images
o Patient-reported outcomes and patient diaries
o Quality-of-life measures
o Condition-specific or disease-specific registries
o Genomic data

Implementing technologies that quickly and accurately standardize and aggregate clinical trial data is a priority. Emerging technologies such as artificial intelligence (AI) and machine learning (ML) are revolutionizing the way teams approach the data lifecycle. Automation is essential to managing large quantities of variable data while also decreasing cycle times in an effort to shorten the time-to-market for new medicines.

People

Clinical data managers face an uphill task in filling gaps in data and ensuring its cleanliness to meet compliance requirements. When decentralized trials generate enormous amounts of data, the data manager’s task becomes significant as they strive to ensure access to authoritative data. Data managers today need timely access to authoritative, standardized, and aggregated clinical trial operations data as well as patient data, with site-, study-, and portfolio-level views. Today’s data managers are technically savvy and use tools that support their innovative strategies for the collection, processing, and analysis of clinical data. They benefit from tools that flag discrepancies in real time, as soon as they occur. They manage their clinical trials using analytics and reporting for timely insights, and they demand efficient trial oversight to support operational, quality, and compliance efforts.

With access to the right technology, data managers can reduce the time needed to aggregate clinical data across the silos created by disparate sources, minimizing manual reconciliations. In a process powered by artificial intelligence and machine learning, clinical data across an entire portfolio can be aggregated in real time, flagging risks and offering actionable insights. All clinical data can be integrated into a single repository that acts as a single source of truth. Data can be sliced and diced into role-based dashboards, promoting proactive and corrective action and resolution as required.
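As a rough illustration of merging siloed study data and flagging a threshold breach: the system names, record fields, and the query-rate rule below are all invented for the example, not an actual platform interface:

```python
# Hypothetical rows from two silos (an EDC extract and a safety system).
edc = [{"site": "S01", "queries": 4, "subjects": 20},
       {"site": "S02", "queries": 30, "subjects": 25}]
safety = [{"site": "S01", "sae_count": 1}, {"site": "S02", "sae_count": 0}]

def aggregate(edc_rows, safety_rows, query_rate_threshold=1.0):
    """Merge per-site rows from both sources into one view and flag
    sites whose queries-per-subject rate exceeds a configurable threshold."""
    by_site = {r["site"]: dict(r) for r in edc_rows}
    for r in safety_rows:
        by_site.setdefault(r["site"], {}).update(r)
    flags = []
    for site, row in by_site.items():
        if "queries" in row and "subjects" in row:  # guard partial rows
            rate = row["queries"] / row["subjects"]
            row["query_rate"] = rate
            if rate > query_rate_threshold:
                flags.append(site)
    return by_site, flags

merged, flags = aggregate(edc, safety)
print(flags)  # ['S02'] -- 30 queries / 25 subjects = 1.2 > 1.0
```

In a real deployment the threshold would be one of many configurable key risk indicators feeding a role-based dashboard.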

This combination of management skills and capabilities enables Data Managers to adhere to timelines and regulatory requirements. It also minimizes cost overruns in clinical trials by enabling in-time decision-making.

Process

To be reliable and accurate enough to drive decisions, data needs to be captured, controlled, verified, and reviewed appropriately. All these are highly complicated stages in the process of collecting the right data. This complexity is further exacerbated by the different data sources from which the data gets collected and the varying formats in which it gets reported. Technology today promotes interoperability, vetting the data from diverse sources and collating it to derive a cohesive, high-quality set of standardized data for analysis and reporting. The complexity resulting from new data sources and types is creating a greater need for more and deeper collaboration across traditional clinical research teams. This level of collaboration assumes fast access to accurate data. Collaborative processes built on a common, integrated data platform will ensure success.

Clinical research teams around the globe are adopting innovative new processes across their clinical trial portfolios. Modern clinical data platforms help to improve compliance and mitigate risks in clinical trials, using AI-powered analytics to provide actionable insights which accelerate the drug development process and shorten a drug’s time-to-market.

New Trial Data Sources Require New Strategies

By Suvarnala Mathangi | Date: September 30, 2021 | Blog | 0 Comment(s)

This blog is the first in a 2-part series that discusses the increasing variety of data types and sources. The first installment describes the increasing variability of sources, types, and formats. The second installment discusses the impact on the people, processes, and technology involved in clinical trials.

Emerging technologies are enabling clinical research teams to keep pace with the speed at which large volumes of varied data are evaluated in clinical trials, as myriad decentralized sources bring in increasingly complex clinical trial data. The digital transformation accelerated by the COVID-19 pandemic emphasizes the need for effective strategies to access reliable, accurate data gleaned from decentralized trials. The clinical trials industry must leverage technology and implement automated solutions that incorporate artificial intelligence and machine learning to stay on course with trial timelines.

Considerations based on the Type of Data, and its Attributes:
Structured and Unstructured Data
Structured data typically conforms to a pre-defined data model, like a spreadsheet with defined rows and columns. Unstructured data is variable and has no consistent underlying data model; it includes text-based content such as emails, documents, some survey results, presentations, and even social media posts. As more new data sources come directly from patients, clinical research teams will need the capability to manage greater volumes of both unstructured and structured data, as well as the essential metadata that helps them understand and access that data.

Batch, Stream, Micro-batch Processing
Digitization enables new data to be operationalized quickly and prepares it for analysis. Data can be processed as it arrives, in real-time. It can also accumulate before being processed.
Batch processing runs on a set schedule, allowing data to accrue or reach a specific threshold before it is processed.
Stream processing is about processing data as soon as it arrives. This could be almost as soon as (or milliseconds after) it is generated if the aggregation happens in real-time. When we have data that is generated in a continuous stream, stream processing is the best option.
In micro-batch processing, processes are run on accumulations of data – typically a minute’s worth or less. This makes data available in near real-time.
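The micro-batch pattern described above can be sketched as follows; the batch size and the sample readings are arbitrary choices for the example:

```python
import itertools

def micro_batches(stream, batch_size=3):
    """Yield fixed-size micro-batches from a continuous stream, making
    small accumulations of data available in near real-time."""
    it = iter(stream)
    while True:
        batch = list(itertools.islice(it, batch_size))
        if not batch:
            return
        yield batch  # process each small accumulation as soon as it fills

# Seven readings arrive; they are processed in accumulations of three.
readings = [98, 97, 99, 101, 96, 98, 100]
batches = list(micro_batches(readings))
print([len(b) for b in batches])  # [3, 3, 1]
```

Batch processing would use the same shape with a much larger (or time-based) accumulation, while stream processing is the limiting case of a batch size of one.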

Legacy Data
Drug sponsors have a growing need to convert legacy data into CDISC-compliant formats prior to submission; regulators will not accept submissions that do not meet these requirements.

Data Quality & Cleansing
As noted previously, clinical data can come in from disparate sources in the form of a recordset, table, or database and numerous other unstructured formats. It’s important to have a processing and cleansing strategy for each data type to boost its consistency, reliability, and value. These processes vary for each type of data but must produce consistently accurate quality data. By detecting and correcting corrupt, incomplete, irrelevant, and inaccurate records early, clinical trial teams can get quicker access to higher quality data. This driving principle forms an essential and critical foundation for all data management strategies.
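A minimal sketch of such a detect-and-quarantine step, assuming illustrative field names and deliberately simple edit checks (real pipelines apply protocol-specific rules):

```python
def cleanse(records, required=("subject_id", "visit", "value")):
    """Split records into clean vs. quarantined: quarantine anything
    with a missing required field or a non-numeric value."""
    clean, quarantined = [], []
    for rec in records:
        missing = [f for f in required if rec.get(f) in (None, "")]
        bad_value = not isinstance(rec.get("value"), (int, float))
        if missing or bad_value:
            quarantined.append({"record": rec, "missing": missing})
        else:
            clean.append(rec)
    return clean, quarantined

records = [
    {"subject_id": "1001", "visit": "V1", "value": 5.2},
    {"subject_id": "1002", "visit": "", "value": 4.8},     # incomplete
    {"subject_id": "1003", "visit": "V1", "value": "N/A"}, # corrupt
]
clean, quarantined = cleanse(records)
print(len(clean), len(quarantined))  # 1 clean, 2 quarantined
```

Quarantining early, rather than letting bad records flow downstream, is what gives teams the quicker access to higher-quality data described above.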

Results

Gone are the days when clinical research teams used Excel sheets and SAS to collect and integrate clinical data. Clinical researchers are taking full advantage of the data available to them, using sophisticated, tech-enabled tools. As automated processes ingest data from disparate sources as soon as it is generated, research teams get access to large volumes of high-quality data for analysis and insight. This calls for technology that can capture, organize, analyze, and report clinical trial data, using new and improved visualizations to make better sense of the data and enable research teams to derive greater insights than ever before.

How Wide is Your Operational View? And What Exactly is a Single-Source-of-Truth?

By Suvarnala Mathangi | Date: March 31, 2021 | Blog | 0 Comment(s)

This week I listened as a colleague in Clinical Operations spoke with an EDC vendor. The vendor boasted that lab data, which normally arrives separately from the sites’ patient data, could be integrated with the other site data and viewed as a single-source-of-truth for the clinical team. This got me thinking about what constitutes a single-source-of-truth and what’s required to truly provide oversight of on-going clinical trials.

 

Certainly, viewing lab data alongside other patient data for a trial is valuable. But does that alone illuminate all potential areas of interest for the clinical team and all risks for the trial? Perhaps not. What if the Chief Medical Officer wants to compare patients lost to follow-up across the last three clinical trials, not just the current one? How do the rates of site queries compare to trials run with other CROs? Is the vendor-specific view really a single-source-of-truth? Or is it a very nice view into a limited set of data?

 

Take the picture below. Nice picture. Looks like a beach. We can identify the bird. What are we missing?

What if our vendor-centric view of the seashore takes a little broader view? Our insight expands.

Let’s increase our visibility yet again, this time dramatically. It wasn’t our imagination! This is a flock of gulls on a beach. The data confirms it.

What has our simple analogy shown us? It’s not enough to see the site-entered and lab data together. Holistic clinical trial oversight requires the ability to see data from all domains and from other trials in the program. This requires a true single-source-of-truth, the requirements of which are clear and straightforward. Clinical study sponsors need end-to-end, real-time visibility into aggregated data from multiple eClinical systems, across multiple domains and potentially multiple CROs. Management of the data streams should be automated and validated, with intelligent, AI-driven error checks and notifications. Access to the data must be role-based and appropriate for each user community.

Data visibility across diverse domains helps sponsors identify expected and unexpected risks and mitigate them early. Access to an aggregated standardized true single-source-of-truth allows clinical teams to gather insights, monitor performance, assess safety, review compliance, and keep studies on time and within budget.

Our Customers’ Voices Echo MaxisIT’s Innovative Platform and Trusted Partnering Approach

By Suvarnala Mathangi | Date: January 31, 2021 | Blog | 0 Comment(s)

MaxisIT’s Technology Platform helps small to large biopharmaceutical organizations in their need to build a holistic insight mechanism based on authenticated and aggregated data for effective decision making within the planning, conduct, and submission phases of clinical trials. MaxisIT recently spent time speaking with business leaders from some of our client organizations. The exercise was part of our continuing effort to promote dialog, build trusted relationships, and incorporate our customers’ voices into our planning and solution strategies for 2021.

Clinical Trial Oversight over Hybrid or Virtual Trials

By Suvarnala Mathangi | Date: November 30, 2020 | Blog | 0 Comment(s)

Pharma companies face huge challenges with the variety and volume of data they receive across their clinical trial portfolios. Core challenges in clinical trial management range from study design optimization to targeted therapies, benefit-risk assessments, and cost reduction. Then there is the need for increased access to the core primary asset, data, for decision-making, predictive analysis, insight, and strategic planning. Individual applications, structured along departmental and functional silos, produce multiple processes and data formats. The resulting insights are insufficient to help anticipate future business opportunities proactively, and operating reactively with pre-defined visualizations is not enough. The need to digitize these processes was pressing, but change was slow and sporadic.

In the initial days of the pandemic, on-site visits by trial participants stopped, as most sites were busy treating COVID-19 patients and could not spare infrastructure for the conduct of clinical trials. Patients were also unable or unwilling to travel to the sites. This posed major challenges to the continuity of clinical trials, causing many sponsors to pause their studies. It was not possible to enroll new patients, while enrolled participants were unable to fulfill on-site treatment schedules. With the increasing adoption of digitization, many trials moved to virtual or hybrid processes, which enabled participants to continue in a trial remotely. The industry quickly turned to patient-centric technologies like eCOA, ePRO, eConsent, telemedicine, and wearables. Patients are now able to use cloud-based apps and tools on their phones to be assessed by physicians remotely and to continue their participation in clinical trials.

The industry needs to ensure that these app-based digital technologies are effectively integrated into its single-source-of-truth data hub for effective remote patient monitoring and remote trial monitoring, and ultimately to maintain total oversight of virtual trials alongside trials that are partially virtual (in other words, hybrid) and trials that are still traditional. All these hybrid and virtual trial setups need the support of a technology platform, not just to enable remote monitoring but also to keep sponsors and CROs connected and informed virtually. The importance of picking the right digital tools to achieve desired outcomes cannot be stressed enough at this stage.

The data collected from patients, whether reported by them or captured through wearable technology, can be easily integrated by MaxisIT®’s CTOS. As an evangelist in this field, MaxisIT® addresses all these issues and more with a complete end-to-end integrated solution for companies across the Life Sciences and Healthcare enterprises. Our solution addresses two critical challenges faced by the business:

  • Lack of access to integrated data: acquisition, cleansing, virtualization, harmonization, standardization, validation, and management of multiple types of structured and unstructured data and metadata.
  • Unmet need for analytics and reporting: an analytical environment for “power users” to conduct predictive and exploratory analytics for data science purposes.

A resolution to these challenges lies in adopting an integrated clinical data management platform like MaxisIT®’s ONE Integrated Clinical Development Platform for the biopharmaceutical industry, which acts as a centralized command centre for remote clinical data review and end-to-end clinical data management. It integrates and interoperates across all eClinical systems and virtual patient data sources via a secure, cloud-based offering, shifting the focus to data management as opposed to data integration. As the industry’s most comprehensive, integrated, automated, and intelligent data management platform, it supercharges data flow across data quality management, remote data review, and study data oversight, and readily integrates clinical data from virtual and hybrid sources.

The platform integrates data from different applications and systems at different levels to manage virtual and hybrid trials, aggregating all clinical data in real time across the data silos. It offers timely access to authoritative, standardized, integrated data and enables risk mitigation with efficient trial oversight via remote monitoring. It makes data quality management possible with real-time controls and statistical computing, with assured adherence to timelines and compliance requirements. The platform adopts a command-centre approach to truly empower clinical data management and data review teams to maintain the quality and integrity of clinical trial patient data by offering seamless, interoperable, metadata-driven data, providing the following benefits:

  • The system offers scalable, data-agnostic access and control over data sources, easily integrating and unifying across 16 different eClinical systems (e.g. EDCs, LABs, IRTs, ePROs, safety systems, CTMSs, etc.), the top 8 CROs’ systems, and 50-plus wearable APIs.
  • A fail-proof, purpose-built clinical data management platform delivers a true single-source-of-truth clinical data hub across the types, sizes, phases, and modes of clinical trials, spanning the end-to-end clinical data value chain.
  • An AI-enabled, API-driven, modular approach delivers value through automation, interaction, and reasoning for intelligent, actionable, and guided reviews, supporting timely and informed decision-making.
  • Real-time, on-demand, role-based data and dashboard access with intuitive visualizations enables efficient monitoring, reviews, and early detection of issues, with self-service offering huge time and cost savings.
  • Built-in, role-based, collaborative workflows at each functional stage of data management, review, and assessment enhance meaningful focus between teams, with traceability for regulatory compliance.
  • Built for clinical data management by clinical data management professionals, the platform supports meaningful and in-time data review to ensure data quality, patient safety, protocol compliance, and on-time reporting and submission.

MaxisIT®’s Integrated Clinical Development Platform for the biopharmaceutical industry adopts a transformational strategy to detect the leading indicators of change in clinical trial management and empowers key decision-makers to derive maximum value from the data collected across their portfolios. Technologically advanced tools like MaxisIT®’s CTOS have been helping the healthcare industry maintain complete oversight of their hybrid and virtual trials. They gain total control over their clinical trial portfolio data, enabling proactive action, improved clinical study outcomes, and better business results.

About MaxisIT

At MaxisIT, we clearly understand strategic priorities within clinical R&D, drawing on our experience implementing solutions that improve clinical development portfolios via an integrated, platform-based approach. The platform delivers timely access to study-specific as well as standardized and aggregated clinical trial operations and patient data, and allows efficient trial oversight via remote monitoring, statistically assessed controls, data quality management, clinical reviews, and statistical computing. It also provides capabilities for planned-vs-actual trending and optimization, as well as for fraud detection and risk-based monitoring. MaxisIT’s Integrated Technology Platform is a purpose-built solution that helps the pharmaceutical and life sciences industry by empowering business stakeholders with integrated computing and self-service dashboards in a strategically externalized enterprise environment, with a major focus on core clinical operations data and clinical information assets. This allows improved control over the externalized, CRO- and partner-driven clinical ecosystem and enables in-time decision support, continuous monitoring of regulatory compliance, and greater operational efficiency at a measurable rate.
