Prostate cancer is one of the most common cancers and, for men in the United States, the second leading cause of cancer death.
Some prostate cancers are slow-growing and can simply be monitored over time, whereas others need to be treated right away. To determine how aggressive a cancer is, doctors look for abnormalities in thin slices of biopsied tissue mounted on microscope slides. But this 2D method makes borderline cases hard to diagnose reliably.
Now a team led by the University of Washington has developed a new, non-destructive method that images entire 3D biopsies instead of just a slice. In a proof-of-principle experiment, the researchers imaged 300 3D biopsies taken from 50 patients (six biopsies per patient) and had a computer use the 3D and 2D results to predict the likelihood that a patient had aggressive cancer. The 3D features made it easier for the computer to identify the cases that were more likely to recur within five years.
The team published these results Dec. 1 in Cancer Research.
“We show for the first time that compared to traditional pathology, where a small fraction of each biopsy is examined in 2D on microscope slides, the ability to examine 100% of a biopsy in 3D is more informative and accurate,” said senior author Jonathan Liu, a UW professor of mechanical engineering and of bioengineering. “This is exciting because it is the first of hopefully many clinical studies that will demonstrate the value of non-destructive 3D pathology for clinical decision-making, such as determining which patients require aggressive treatments or which subsets of patients would respond best to certain drugs.”
The researchers used prostate specimens from patients who underwent surgery more than 10 years ago, so the team knew each patient’s outcome and could use that information to train a computer to predict those outcomes. In this study, half of the samples contained a more aggressive cancer.
To create 3D samples, the researchers extracted “biopsy cores” (cylindrical plugs of tissue) from surgically removed prostates and then stained the biopsy cores to mimic the typical staining used in the 2D method. Then the team imaged each entire biopsy core using an open-top light-sheet microscope, which uses a sheet of light to optically “slice” through and image a tissue sample without destroying it.
The 3D images provided more information than a 2D image, specifically details about the complex tree-like structure of the glands throughout the tissue. These additional features increased the likelihood that the computer would correctly predict a cancer’s aggressiveness.
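The intuition behind this result can be illustrated with a toy sketch: when a classifier is given an additional feature that separates the classes more cleanly (as the 3D gland features did here), its predictions improve. The code below is purely illustrative, using synthetic data and a simple nearest-centroid classifier; it is not the team’s actual analysis pipeline, and all feature names and parameters are invented for the example.

```python
# Toy illustration (synthetic data, NOT the paper's pipeline): adding a
# cleaner, more informative feature improves outcome prediction.
import random

random.seed(0)

def make_patient(aggressive):
    # Hypothetical "2D" feature: noisy, weakly separates the two classes.
    f2d = random.gauss(1.0 if aggressive else 0.0, 2.0)
    # Hypothetical "3D" feature (e.g. gland density): cleaner separation.
    f3d = random.gauss(2.0 if aggressive else 0.0, 0.5)
    return (f2d, f3d, aggressive)

data = [make_patient(i % 2 == 0) for i in range(400)]
train, test = data[:200], data[200:]

def centroid_classifier(train, idx):
    """Nearest-centroid classifier over the feature columns in `idx`."""
    pos = [p for p in train if p[2]]
    neg = [p for p in train if not p[2]]
    mean = lambda rows, j: sum(r[j] for r in rows) / len(rows)
    cpos = [mean(pos, j) for j in idx]
    cneg = [mean(neg, j) for j in idx]
    def predict(p):
        # Predict the class whose centroid is closer (squared distance).
        dpos = sum((p[j] - c) ** 2 for j, c in zip(idx, cpos))
        dneg = sum((p[j] - c) ** 2 for j, c in zip(idx, cneg))
        return dpos < dneg
    return predict

def accuracy(predict, test):
    return sum(predict(p) == p[2] for p in test) / len(test)

acc_2d = accuracy(centroid_classifier(train, [0]), test)       # 2D only
acc_both = accuracy(centroid_classifier(train, [0, 1]), test)  # 2D + 3D
print(f"2D only: {acc_2d:.2f}, 2D+3D: {acc_both:.2f}")
```

On this synthetic data the combined 2D+3D classifier is noticeably more accurate than the 2D-only one, mirroring (in a very simplified way) the study’s finding that 3D features improved the computer’s predictions.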
Shown here is a video of a volume rendering of glands in two 3D biopsy samples from prostates (yellow: the outer walls of the gland; red: the fluid-filled space inside the gland; purple: what researchers called the “gland skeleton,” a stick-like model of the fluid-filled spaces inside the glands). The cancer sample (top) shows smaller and more densely packed glands compared to the benign tissue sample (bottom). Credit: Xie et al./Cancer Research
The researchers used new AI methods, including deep-learning image transformation techniques, to help manage and interpret the large datasets this project generated.
“Over the past decade or so, our lab has focused primarily on building optical imaging devices, including microscopes, for various clinical applications. However, we started to encounter the next big challenge toward clinical adoption: how to manage and interpret the massive datasets that we were acquiring from patient specimens,” Liu said. “This paper represents the first study in our lab to develop a novel computational pipeline to analyze our feature-rich datasets. As we continue to refine our imaging technologies and computational analysis methods, and as we perform larger clinical studies, we hope we can help transform the field of pathology to benefit many types of patients.”
The lead author on this paper is , a UW mechanical engineering doctoral student. Other co-authors on this paper are , , and , all UW mechanical engineering doctoral students; , a UW bioengineering doctoral student; , a clinical instructor in the laboratory medicine and pathology department in the UW School of Medicine; Hongyi Huang, UW research staff in mechanical engineering; , a UW doctoral student in the chemistry department; , a research scientist in the laboratory medicine and pathology department in the UW School of Medicine; , a UW assistant teaching professor in the mechanical engineering department; Qinghua Han, a UW undergraduate student studying bioengineering; Jonathan Wright, a professor in the urology department in the UW School of Medicine; and , both professors in the laboratory medicine and pathology department in the UW School of Medicine; , a UW associate professor of chemistry; , a senior scientist at the Allen Institute who completed this research as a UW mechanical engineering postdoctoral researcher; , , and , all at Case Western Reserve University; at Genentech, who completed this research as a doctoral student at Case Western Reserve University; and Sarah Hawley at the Canary Foundation.
This research was funded by the Department of Defense Prostate Cancer Research Program; the National Cancer Institute; the National Heart, Lung and Blood Institute; the National Institute of Biomedical Imaging and Bioengineering; the National Institute of Mental Health; the VA Merit Review Award; the National Science Foundation; the Nancy and Buster Alvord Endowment; and the Prostate Cancer Foundation Young Investigator Award.
Nicholas Reder, Adam Glaser, Lawrence True and Jonathan Liu are co-founders and shareholders of a UW spinout that has licensed the technology used in this paper.
For more information, contact Liu at jonliu@uw.edu.
Grant numbers: W81XWH-18-10358, W81XWH-19-1-0589, W81XWH-15-1-0558, W81XWH-20-1-0851, K99 CA24068, R01CA244170, U24CA199374, R01CA249992, R01CA202752, R01CA208236, R01CA216579, R01CA220581, R01CA257612, U01CA239055, U01CA248226, U54CA254566, R01HL151277, R01EB031002, R43EB028736, R01MH115767, IBX004121A, 1934292 HDR: I-DIRSE-FW, DGE-1762114, DGE-1762114