CelloType: Novel AI model promises better cell analysis in tissue samples

Children’s Hospital of Philadelphia researchers have developed an advanced artificial intelligence model that can simultaneously identify and classify cells in complex tissue samples, potentially transforming how diseases are studied and diagnosed at the cellular level.

Advancing tissue analysis capabilities

Researchers at Children’s Hospital of Philadelphia (CHOP) have unveiled CelloType, a comprehensive artificial intelligence model that represents a significant advancement in how cellular images from tissue samples are analysed. This novel technology combines cell segmentation (identifying cell boundaries) and classification (determining cell types) into a unified process, offering superior accuracy compared to traditional methods.

The research, published in Nature Methods [1] on 22 November 2024, demonstrates how CelloType leverages transformer-based deep learning technology to improve the precision of cell detection, segmentation and classification in tissue samples.

Enhanced accuracy through integrated approach

Unlike conventional approaches that treat cell segmentation and classification as separate tasks, CelloType adopts a multitask learning strategy that performs both jointly within a single model. This integrated approach allows the model to capture complex relationships between cellular architecture and function.
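
To give a rough sense of what a joint segmentation-and-classification design can look like, the short PyTorch sketch below shows a shared feature extractor feeding both a per-pixel segmentation head and a cell-type classification head, with the two losses optimised together. This is a generic illustration only, not CelloType's actual transformer-based architecture; the layer choices, names and loss weighting are our own assumptions.

import torch
import torch.nn as nn

class MultitaskCellModel(nn.Module):
    """Toy multitask model: one shared encoder feeding a per-pixel
    segmentation head and a cell-type classification head."""
    def __init__(self, in_channels=3, num_cell_types=10):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        # Segmentation head: per-pixel foreground/background logits
        self.seg_head = nn.Conv2d(64, 1, 1)
        # Classification head: pooled features -> cell-type logits
        self.cls_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_cell_types)
        )

    def forward(self, x):
        feats = self.encoder(x)
        return self.seg_head(feats), self.cls_head(feats)

# Joint training step: both losses backpropagate through the shared encoder
model = MultitaskCellModel()
images = torch.randn(4, 3, 128, 128)                      # dummy image batch
seg_target = torch.randint(0, 2, (4, 1, 128, 128)).float()  # dummy masks
cls_target = torch.randint(0, 10, (4,))                   # dummy cell types
seg_logits, cls_logits = model(images)
loss = (nn.functional.binary_cross_entropy_with_logits(seg_logits, seg_target)
        + nn.functional.cross_entropy(cls_logits, cls_target))
loss.backward()

Because the two heads share one encoder, gradients from the classification task can inform the features used for segmentation and vice versa, which is the intuition behind the multitask strategy described above.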

“We are just beginning to unlock the potential of this technology,” said Dr Kai Tan, the study’s lead author and professor in the Department of Pediatrics at CHOP. “This approach could redefine how we understand complex tissues at the cellular level, paving the way for transformative breakthroughs in healthcare.”

Superior performance across multiple platforms

The researchers evaluated CelloType’s performance against existing methods using various types of tissue imaging data, including multiplexed fluorescence and spatial transcriptomic images. The model demonstrated consistently superior results across different imaging platforms and tissue types.

In benchmarking tests, CelloType significantly outperformed current state-of-the-art methods. For cell segmentation, it achieved a mean average precision of 0.56, compared with 0.35 for Cellpose2 and 0.31 for Mesmer.
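
For readers unfamiliar with the metric: a common convention in cell segmentation benchmarks is to match predicted masks to ground-truth masks at an intersection-over-union (IoU) threshold, compute TP / (TP + FP + FN), and average the result over a range of thresholds. The minimal Python sketch below illustrates the single-threshold case; it is a simplified illustration rather than the evaluation code used in the paper, and the function names are hypothetical.

import numpy as np

def mask_iou(pred, gt):
    # pred, gt: boolean arrays of the same shape
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / union if union > 0 else 0.0

def average_precision_at_iou(pred_masks, gt_masks, threshold=0.5):
    # Greedily match each predicted mask to an unused ground-truth mask
    matched, tp = set(), 0
    for pred in pred_masks:
        best_iou, best_j = 0.0, None
        for j, gt in enumerate(gt_masks):
            if j in matched:
                continue
            iou = mask_iou(pred, gt)
            if iou > best_iou:
                best_iou, best_j = iou, j
        if best_j is not None and best_iou >= threshold:
            matched.add(best_j)
            tp += 1
    fp = len(pred_masks) - tp   # unmatched predictions
    fn = len(gt_masks) - tp     # missed ground-truth cells
    denom = tp + fp + fn
    return tp / denom if denom else 0.0

# Toy example: two ground-truth cells, one well-segmented, one poorly segmented
gt = [np.zeros((8, 8), bool), np.zeros((8, 8), bool)]
gt[0][1:4, 1:4] = True
gt[1][5:8, 5:8] = True
pred = [gt[0].copy(), np.zeros((8, 8), bool)]
pred[1][5:7, 5:7] = True   # only partially overlaps the second cell
print(average_precision_at_iou(pred, gt))   # 0.33 at an IoU threshold of 0.5

Under this convention, a mean average precision of 0.56 versus 0.35 means CelloType correctly delineated a substantially larger share of individual cells than the comparison methods.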

Advancing spatial omics research

The authors note in their paper that spatial omics technologies can now profile hundreds to thousands of genes at single-cell resolution, yielding substantially more features than traditional spatial proteomics technologies. “This substantial increase in the feature space, coupled with the distinct spatial distribution patterns of RNA transcripts versus proteins, introduces new computational challenges for segmentation and classification,” they write.

Practical applications and accessibility

One of the key advantages of CelloType is its ability to handle multiscale segmentation and classification, enabling detailed analysis of both cellular and non-cellular elements within tissue samples. This capability is particularly valuable for studying complex diseases such as cancer and chronic kidney disease, where understanding cellular interactions and microenvironments is crucial.

The technology has been made freely available to researchers outside CHOP as open-source software in a public repository [2] for non-commercial use.

The development of CelloType represents a significant step forward in the field of spatial omics, which combines molecular profiling with spatial information to map cellular locations within complex tissues. This advancement could have far-reaching implications for understanding disease progression and developing targeted treatments.

Along these lines, CHOP is currently a collaborator in high-profile projects such as the Human Tumor Atlas Network [3], the Human BioMolecular Atlas Program (HuBMAP) [4], and the BRAIN Initiative [5], which use similar technologies to map the spatial organisation of various healthy and diseased tissues.

References:
1. Pang, M., Roy, T.K., Wu, X., et al. CelloType: a unified model for segmentation and classification of tissue images. Nature Methods. 22 November 2024.
https://doi.org/10.1038/s41592-024-02513-1
2. https://github.com/tanlabcode/CelloType
3. https://humantumoratlas.org/
4. https://commonfund.nih.gov/HuBMAP
5. https://braininitiative.nih.gov/

Dr Kai Tan, professor in the Department of Pediatrics at Children’s Hospital of Philadelphia