To proactively predict cancer outcomes, AI combines multiple data types

According to results from a study led by Brigham and Women's Hospital, artificial intelligence (AI) can be used to predict patient outcomes for 14 different types of cancer. The work sets the stage for larger health care AI studies that combine information from multiple sources. More broadly, the authors emphasize the need to develop computational pathology prognostic models with much larger datasets, followed by downstream clinical trials, to establish their potential.

In a paper titled "Pan-cancer integrative histology-genomic analysis via multimodal deep learning," published in Cancer Cell, Mahmood and his colleagues describe their findings.

Anticipating outcomes for patients with cancer draws on several sources of information, such as patient history, genetics, and disease pathology. Although this information is what ultimately determines outcomes, experts are often confronted with difficulties when trying to process it all with existing technologies to produce effective decisions.

Experts examine many pieces of evidence to better predict how well a patient may do. These early examinations become the basis of decisions about enrolling in a clinical trial or choosing specific treatment regimens. However, this multimodal prediction currently happens at the level of the individual expert, and the team is working to address the problem more efficiently and computationally.

Mahmood and colleagues developed a way to integrate several forms of diagnostic information computationally to provide more accurate outcome predictions. Their approach uses H&E whole-slide images (WSIs) and molecular profile features (mutation status, copy-number variation, and RNA sequencing [RNA-seq] expression) to measure and explain the relative risk of cancer death.
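To make these inputs concrete, the following is a minimal sketch of how a single patient's data might be represented before modeling. It is not the authors' implementation; the feature counts, the use of patch embeddings, and all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Histology: a whole-slide image (WSI) reduced to a "bag" of patch embeddings,
# e.g. vectors produced by a pretrained network over tissue patches.
# 300 patches and 512 dimensions are arbitrary choices for illustration.
num_patches, embed_dim = 300, 512
wsi_patch_embeddings = rng.normal(size=(num_patches, embed_dim))

# Molecular profile: mutation status, copy-number variation, and RNA-seq
# expression concatenated into a single feature vector (sizes are made up).
mutation_status = rng.integers(0, 2, size=50).astype(float)      # 0/1 per gene
copy_number_variation = rng.integers(-2, 3, size=50).astype(float)
rnaseq_expression = rng.normal(size=100)                         # expression values
molecular_profile = np.concatenate(
    [mutation_status, copy_number_variation, rnaseq_expression]
)

print(wsi_patch_embeddings.shape)  # (300, 512)
print(molecular_profile.shape)     # (200,)
```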

The researchers created these models by using The Cancer Genome Atlas, a publicly available resource containing information on many different types of cancer.

They developed a multimodal deep learning-based system capable of learning prognostic information from multiple data sources. By first building separate models for histology and genomic data, they could then fuse the two into a single integrated system that provides key prognostic information. Finally, they evaluated the models by feeding them data sets from 14 cancer types along with patient histology and genomic data. The results showed that the models yielded more accurate patient outcome predictions than models that incorporated only one source of information.
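As a rough illustration of the "separate models, then fuse" idea described above, the sketch below builds a small histology encoder and a genomics encoder and combines their outputs into a single risk score. This is a simplified stand-in, not the system from the paper: the mean pooling, plain concatenation, and all layer sizes are assumptions made for brevity.

```python
import torch
import torch.nn as nn

class HistologyEncoder(nn.Module):
    """Summarizes a bag of WSI patch embeddings into one slide-level vector."""
    def __init__(self, patch_dim=512, hidden_dim=128):
        super().__init__()
        self.project = nn.Sequential(nn.Linear(patch_dim, hidden_dim), nn.ReLU())

    def forward(self, patches):                     # (num_patches, patch_dim)
        return self.project(patches).mean(dim=0)    # simple mean pooling

class GenomicsEncoder(nn.Module):
    """Encodes the concatenated molecular profile vector."""
    def __init__(self, input_dim=200, hidden_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())

    def forward(self, profile):                     # (input_dim,)
        return self.net(profile)

class MultimodalRiskModel(nn.Module):
    """Fuses the two unimodal embeddings and outputs a relative risk score."""
    def __init__(self, hidden_dim=128):
        super().__init__()
        self.histology = HistologyEncoder(hidden_dim=hidden_dim)
        self.genomics = GenomicsEncoder(hidden_dim=hidden_dim)
        self.risk_head = nn.Linear(2 * hidden_dim, 1)

    def forward(self, patches, profile):
        fused = torch.cat([self.histology(patches), self.genomics(profile)])
        return self.risk_head(fused)                # higher value = higher risk

model = MultimodalRiskModel()
risk = model(torch.randn(300, 512), torch.randn(200))
print(risk)  # one scalar risk score per patient
```

Either encoder could also be trained on its own as a unimodal baseline, which is the kind of comparison the study used to show that combining modalities improves outcome prediction.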

"We present a method for interpretable, weakly supervised, multimodal deep learning that integrates WSIs and molecular profile data for cancer prognosis, which we developed and validated on 6,592 WSIs from 5,720 patients with paired molecular profile data across 14 cancer types," they continued.

This research shows that using AI to integrate various types of clinically informed data to predict disease outcomes is feasible. Their weakly supervised, multimodal deep-learning algorithm is able to fuse these heterogeneous data sources to predict outcomes and to discover prognostic features linked to poor and favorable outcomes.
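"Weakly supervised" here typically means the model learns from slide-level outcome labels only, without region-level annotations. A common way to do this, and to surface prognostic regions afterwards, is attention-based pooling over patches; the sketch below shows that general pattern, with all names and sizes assumed rather than taken from the paper.

```python
import torch
import torch.nn as nn

class AttentionPooling(nn.Module):
    """Weakly supervised pooling: trained with only a slide-level outcome label,
    while the learned attention weights show which patches drive the prediction."""
    def __init__(self, patch_dim=512, hidden_dim=128):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(patch_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, patches):                     # (num_patches, patch_dim)
        weights = torch.softmax(self.score(patches), dim=0)    # (num_patches, 1)
        slide_embedding = (weights * patches).sum(dim=0)       # (patch_dim,)
        return slide_embedding, weights

pool = AttentionPooling()
embedding, attention = pool(torch.randn(300, 512))

# Patches with the largest attention weights can be mapped back onto the slide
# to inspect tissue regions the model associates with poor or favorable outcomes.
print(attention.squeeze(1).topk(5).indices)
```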

According to Mahmood, these techniques may enable researchers to discover biomarkers that incorporate multiple clinical characteristics and to better understand what kinds of information they need to identify and treat different types of cancer. The researchers also quantified the importance of each diagnostic modality for individual cancer types and the benefit of integrating multiple modalities.

This marks a notable improvement, and the findings are consistent with previous studies linking particular tumor characteristics to better patient outcomes.

The team has also released a research tool, the pathology-omics research platform for integrative survival estimation (PORPOISE), which presents the models' results for thousands of patients across cancer types.

While the proof-of-concept model demonstrates the promise of AI technology in cancer care, the research is only a first step toward clinically implementing these models. Bringing them into the clinic, according to Mahmood, will require training on even larger data sets and validation on large independent testing cohorts.

Future research will focus on developing more disease-specific prognostic models by curating larger multimodal datasets for individual cancer types, adapting the models to large independent multimodal test cohorts, and using multimodal deep learning to anticipate response and resistance to treatment. As these profiling technologies become more spatially resolved and multimodal, combining them with whole-slide imaging should further strengthen such models.
