11:30 a.m. - 12:30 p.m. Location: GR 3.606
Center for Depression Research and Clinical Care
University of Texas Southwestern Medical Center
Bayesian modeling of spatial point patterns and its application to the analysis of tumor pathology images
With the advance of imaging technology, digital pathology imaging of tumor tissue slides is becoming a routine clinical procedure for cancer diagnosis. This process produces massive imaging data that capture histological details in high resolution. Recent developments in deep-learning methods have made it possible to identify and classify individual cells in digital pathology images at large scale. The randomly distributed cells can be regarded as a realization of a marked point process, where each point is defined by its position and cell type. Reliable statistical approaches to modeling such marked spatial point patterns can provide new insight into tumor progression and shed light on the biological mechanisms of cancer. In this talk, I consider the problem of modeling spatial correlations among three commonly observed cell types (i.e., lymphocyte, stromal, and tumor cells) in tumor pathology images. Two novel spatial models of marked point patterns, with interpretable underlying parameters (some of which are clinically meaningful), are proposed in a Bayesian framework. Markov chain Monte Carlo (MCMC) sampling techniques, combined with the double Metropolis-Hastings (DMH) algorithm, are used to sample from the posterior distribution, whose density involves an intractable normalizing constant. A case study is conducted on the pathology images of 188 lung cancer patients from the National Lung Screening Trial. The results show that the spatial correlation between tumor and stromal cells is predictive of patient prognosis. This statistical methodology not only presents a new model for characterizing spatial correlations in a multi-type spatial point pattern but also provides a new perspective for understanding the role of cell-cell interactions in cancer progression.
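The double Metropolis-Hastings idea mentioned in the abstract can be illustrated on a toy problem. The sketch below (my own illustration, not the speaker's model or code) applies DMH to a small Ising model, another example of a likelihood with an intractable normalizing constant: at each step, an auxiliary dataset is drawn at the proposed parameter via a short inner Gibbs run started from the observed data, and the acceptance ratio is formed so that the unknown normalizing constants cancel. The grid size, the Uniform(-1, 1) prior, and all tuning constants are assumptions chosen for the example.

```python
import numpy as np

def neighbor_sum_stat(x):
    # Sufficient statistic S(x): sum of products of adjacent spins,
    # so the unnormalized likelihood is f(x | theta) = exp(theta * S(x)).
    return (x[:, :-1] * x[:, 1:]).sum() + (x[:-1, :] * x[1:, :]).sum()

def gibbs_sweeps(x, theta, n_sweeps, rng):
    # Inner sampler: full Gibbs sweeps over a +/-1 spin grid at parameter theta.
    x = x.copy()
    L = x.shape[0]
    for _ in range(n_sweeps):
        for i in range(L):
            for j in range(L):
                s = 0
                if i > 0:     s += x[i - 1, j]
                if i < L - 1: s += x[i + 1, j]
                if j > 0:     s += x[i, j - 1]
                if j < L - 1: s += x[i, j + 1]
                # Conditional probability that the spin is +1.
                p = 1.0 / (1.0 + np.exp(-2.0 * theta * s))
                x[i, j] = 1 if rng.random() < p else -1
    return x

def dmh(x_obs, n_iter=1500, prop_sd=0.1, inner_sweeps=5, seed=0):
    # Double Metropolis-Hastings with a Uniform(-1, 1) prior on theta and a
    # symmetric random-walk proposal. The auxiliary draw y ~ f(. | theta_prop)
    # (approximated by a short Gibbs run from x_obs) makes the intractable
    # normalizing constants cancel in the acceptance ratio:
    #   log r = (theta_prop - theta) * (S(x_obs) - S(y))
    rng = np.random.default_rng(seed)
    s_obs = neighbor_sum_stat(x_obs)
    theta = 0.0
    samples = np.empty(n_iter)
    for t in range(n_iter):
        theta_prop = theta + prop_sd * rng.normal()
        if abs(theta_prop) < 1.0:  # reject proposals outside the prior support
            y = gibbs_sweeps(x_obs, theta_prop, inner_sweeps, rng)
            log_r = (theta_prop - theta) * (s_obs - neighbor_sum_stat(y))
            if np.log(rng.random()) < log_r:
                theta = theta_prop
        samples[t] = theta
    return samples

# Usage: simulate a 12x12 grid at theta = 0.3, then sample the posterior.
rng = np.random.default_rng(1)
x_obs = gibbs_sweeps(rng.choice([-1, 1], size=(12, 12)), 0.3, 50, rng)
samples = dmh(x_obs, n_iter=1500, seed=2)
```

Starting the inner Gibbs run from the observed data (rather than a cold start) is the design choice that distinguishes DMH from the exact exchange algorithm: it avoids the need for perfect sampling at the cost of a small, usually negligible, approximation.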
Coffee: 11:00 a.m. in Mathematical Sciences Faculty Lounge (FO 2.606)