Overcoming Data Management Complexities in Medical Imaging Clinical Trials

Joe Pierro, M.D. and Brett A. Hoover

To coordinate a successful clinical trial, clinical operations teams need to appropriately capture, manage and transfer data. This is no easy task given the sheer volume of data generated by a single clinical trial, and that volume grows dramatically when imaging is incorporated into the study design.

To complicate the matter further, different types of trials require different data management practices. Some of the more nuanced study designs that mandate a more tailored approach to data management are those that involve medical imaging.

Technology is a must for ensuring clinical trial imaging data quality. A single medical image contains a wealth of metadata, such as the modality used to acquire the exam, the subject imaged, the time of day the image was acquired, and so on. It is therefore imperative that medical imaging trials be designed to accommodate and simplify the data management process in a way that accounts for the unique requirements stemming from a study’s specific therapeutic area, indication, modality (or modalities), and image scoring criteria. Configuring and implementing the right purpose-built imaging management solution maximizes data quality by minimizing human error, data variability, and data management queries.
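
To make this concrete, the sketch below shows how such metadata might be inspected programmatically. It assumes the images are standard DICOM files and uses the pydicom library; the file name is hypothetical.

```python
# Minimal sketch: inspecting the metadata carried by a single image,
# assuming a standard DICOM file and the pydicom library.
import pydicom

ds = pydicom.dcmread("example_scan.dcm")  # hypothetical file path

# Standard DICOM keywords for the kinds of fields mentioned above.
print("Modality:        ", ds.get("Modality", "n/a"))        # e.g., CT, MR, PET
print("Subject ID:      ", ds.get("PatientID", "n/a"))
print("Acquisition date:", ds.get("AcquisitionDate", "n/a"))  # YYYYMMDD
print("Acquisition time:", ds.get("AcquisitionTime", "n/a"))  # HHMMSS
```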

Ensuring Data Management Integrity in Clinical Trial Imaging

The importance of central image review has been documented many times over, and the critical role played by image processing, analysis and management technology cannot be overstated. As study designs and imaging endpoints become more complex, and as the FDA requests that imaging endpoint data be more quantitative, leveraging purpose-built technology tailored to support the unique needs of a study’s endpoints is critical for ensuring data quality and maintaining process and protocol compliance throughout the execution of the study.

A series of steps can be taken to ensure the integrity of your imaging data management system (Figure 1). The data flow starts with the site acquiring the images and transmitting them to an imaging vendor via a secure web portal. In the past, the standard method of image data transfer was physical film or CD sent by courier, but this has evolved tremendously over the past decade.

Figure 1. Ensuring clinical trial imaging data quality through five steps.

Once the imaging vendor obtains the images, the Quality Control (QC) process takes place: the images are inspected by modality-trained technologists to verify image quality, confirm compliance with the image acquisition protocol, and ensure that no protected health information (PHI) is present.
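
Parts of this review, particularly the protocol-compliance and PHI checks, lend themselves to automated pre-screening that supports (but does not replace) the trained technologist. The sketch below assumes DICOM input via pydicom; the expected modality and the list of PHI tags are hypothetical, protocol-specific choices.

```python
# Illustrative pre-check a QC workflow might run before human review.
# The expected modality and PHI keyword list are hypothetical examples.
import pydicom

EXPECTED_MODALITY = "MR"  # assumed protocol-specified modality
PHI_KEYWORDS = ["PatientName", "PatientBirthDate", "PatientAddress"]

def qc_precheck(path: str) -> list[str]:
    """Return a list of issues found in one image file (empty list = passed)."""
    ds = pydicom.dcmread(path)
    issues = []
    if ds.get("Modality", "") != EXPECTED_MODALITY:
        issues.append(f"Unexpected modality: {ds.get('Modality', 'missing')}")
    for keyword in PHI_KEYWORDS:
        if str(ds.get(keyword, "")).strip():
            issues.append(f"Possible PHI present in {keyword}")
    return issues
```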

Once approved by the QC specialist, the images are sent to the queue for image evaluation (reading) by image analysis software, a clinical expert, or both. The reading may be carried out in a so-called “rolling read” fashion (ongoing) or in batches to reduce inter- and intra-reader variability. The image evaluation results are automatically captured by the imaging management solution during the reading session, stored in an electronic database, and then aggregated, organized and provided to the sponsor in a timely fashion.
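
As a rough illustration of the kind of structured record such a solution might capture for each reading session, consider the sketch below; every field name is illustrative rather than any specific vendor’s schema.

```python
# Hypothetical structure for a single captured read result; all fields and
# example values are illustrative, not a real schema or real trial data.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ReadResult:
    subject_id: str           # de-identified trial subject
    timepoint: str            # e.g., "Baseline", "Week 12"
    reader_id: str            # blinded reader identifier
    score: str                # endpoint-specific score or category
    read_mode: str            # "rolling" or "batch"
    read_timestamp: datetime  # when the evaluation was recorded

# Example record as it might be stored, then aggregated for the sponsor.
result = ReadResult("SUBJ-001", "Week 12", "READER-A", "Partial Response",
                    "batch", datetime(2024, 5, 1, 14, 30))
```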

From a data management point of view, it is important to understand both the science and the process design elements of the image processing and evaluation workflow. Not only will there be numerous metadata fields, as described above, but the image evaluation paradigm often comprises two primary readers plus a third reader for ad hoc adjudication. This multi-reader approach to image evaluation adds another layer of imaging workflow and data management complexity.
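
As a simple sketch of that complexity, the snippet below models the decision logic of a hypothetical 2+1 paradigm: two independent primary reads, with the adjudicator engaged only when they disagree. The exact-match rule shown here is illustrative; real adjudication rules are defined in the study’s image interpretation standards.

```python
# Hypothetical decision logic for a 2+1 read design: two primary readers,
# with an ad hoc adjudicator triggered only on disagreement.
from typing import Optional

def needs_adjudication(primary_1: str, primary_2: str) -> bool:
    """Trigger the third reader when the two primary reads disagree."""
    return primary_1 != primary_2

def final_result(primary_1: str, primary_2: str,
                 adjudicator: Optional[str] = None) -> str:
    if not needs_adjudication(primary_1, primary_2):
        return primary_1              # concordant primary reads stand
    if adjudicator is None:
        raise ValueError("Discordant reads: adjudication result required")
    return adjudicator                # adjudicator is decisive

# e.g., final_result("Stable Disease", "Progressive Disease", "Stable Disease")
#       returns "Stable Disease"
```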

It is critical that the imaging vendor’s data manager work with the trial sponsor’s data manager to understand exactly what data points need to be exported at the end of the study, and to ensure this is clearly defined (and mutually agreed to) within the imaging data transfer agreement (DTA) and data load specification.

If properly configured within the imaging management solution, the vast majority of this data management work can be automated to reduce human error, minimize data queries (e.g., on accuracy and completeness), and accelerate interim data analyses and database lock. For example, if a 2+1 read design is being implemented, does the sponsor need only the final set of results (potentially just the adjudicator’s), or the read results from all readers (primary, secondary, and adjudicating)? Defining this up front and capturing it within the study’s image interpretation standards (IIS) and DTA will provide clarity on data transfer expectations and help avoid confusion, delays and unnecessary data queries at study completion.
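
One way to make that agreement concrete is to capture it as a machine-readable export specification that the imaging management solution can enforce. The configuration below is a hypothetical sketch, not a standard DTA or data load specification format.

```python
# Hypothetical export specification: which readers' results to transfer and
# which fields each record must carry. Keys, values, cadence and file format
# are assumptions for illustration only.
EXPORT_SPEC = {
    "read_design": "2+1",
    "readers_to_export": ["primary_1", "primary_2", "adjudicator"],  # or just ["adjudicator"]
    "fields": ["subject_id", "timepoint", "reader_id", "score", "read_timestamp"],
    "transfer_frequency": "monthly",     # assumed cadence, per the DTA
    "format": "SAS transport (XPT)",     # assumed; defined in the data load specification
}
```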

Hence, from a data management view, clinical trials with imaging endpoints add complexity to the overall database structure and imaging management workflow. Careful planning needs to be conducted in conjunction with imaging experts, and it is recommended that the data management teams from all parties be included in the study planning and kick-off meetings.

To quote Stephen Covey: always “begin with the end in mind.” No truer words can be spoken of clinical trial imaging.

This is the second part of a series on medical imaging trial design. Click here to read Part 1, ‘Understanding the Blinded Read,’ and check back soon for the final installment on medical imaging trial documentation.

Joe Pierro, M.D. is the Medical Director of Imaging at ERT and Brett A. Hoover is the Director of Imaging Product Management at ERT.
