PhD defence by Selvan Raghavendra

On 16 November, Selvan Raghavendra will defend his PhD thesis.


Extraction of Airways from Volumetric Data


Obtaining reliable segmentations of tree structures like airways, vessels and neurons from medical image data can enable important clinical applications. This thesis is concerned with the development of image segmentation methods aimed primarily at obtaining such tree structures from volumetric data. The main focus is on extraction of airways from 3D computed tomography (CT) data.

Most existing airway segmentation methods in the literature rely on local and sequential decisions. This renders them susceptible to occlusions and noise in the image data, resulting in missing branches. These concerns are addressed through four overarching themes that run through this thesis: the exploratory nature of the methods, the use of relevant global information when making local decisions, the use of domain knowledge in segmentation and estimation, and the incorporation of prediction uncertainty into the decision-making process. Adhering to these objectives, our investigations have resulted in four diverse yet related models that meet these criteria to varying degrees. The proposed tree extraction methods are based on: Multiple Hypothesis Tracking (MHT), Bayesian smoothing, Mean-Field Networks (MFNs) and Graph Neural Networks (GNNs).

Modifications to an existing interactive vessel segmentation method based on MHT are proposed, turning the original MHT method into an automatic method capable of tracking complete trees starting from a single seed point.

The remaining three methods are developed within the framework of probabilistic graphical models. They use a sparse, graph-like representation of volumetric images obtained with a two-step preprocessing procedure: a trained voxel classifier produces airway probability maps, followed by multi-scale blob detection. The Bayesian smoothing method uses linear-Gaussian process and measurement models to output candidate airway branches, with multivariate Gaussian density estimates as its states. False-positive branches are discarded by thresholding a measure derived from the uncertainty estimates of the branches.

As input for the final two methods, the states of these candidate branches are transformed into nodes of a graph, with features based on the corresponding multivariate Gaussian densities. Extraction of trees is posed as a graph refinement task: recovering subgraphs from over-complete graphs such that the subgraphs capture the connectivity between nodes belonging to the underlying airway tree. Two methods, based on MFNs and GNNs, are proposed as solutions to this task. With MFNs, graph refinement is performed as approximate inference using mean-field approximation (MFA); iterations of MFA are unrolled as feed-forward operations of the MFN, and gradient descent is used to learn the MFN parameters. Graph refinement with the GNN model is performed by jointly training a graph encoder-decoder pair in a supervised learning setup. The encoder learns useful edge embeddings, from which the probabilities of edge connections are predicted using a simple decoder.
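To give an intuition for how iterations of mean-field approximation can be unrolled as feed-forward operations, the following is a minimal sketch in plain Python. It is not the thesis implementation: the per-edge evidence (`unary`), the coupling strength (`pairwise`) and the adjacency structure are hypothetical placeholders for the learned quantities described above, and the update rule is the textbook mean-field update for binary variables.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mean_field_refine(unary, pairwise, neighbors, n_iters=10):
    """Unrolled mean-field updates for per-edge 'on' probabilities.

    unary[e]     -- evidence (a logit) that edge e belongs to the tree
    pairwise     -- coupling strength rewarding agreement between
                    edges that share a node (illustrative scalar)
    neighbors[e] -- indices of edges adjacent to edge e
    """
    # Initialise each approximate marginal q(e=1) from the unary evidence.
    q = [sigmoid(u) for u in unary]
    for _ in range(n_iters):
        # One mean-field sweep; in an MFN, each such sweep would be one
        # feed-forward layer whose parameters are learned by gradient descent.
        q = [sigmoid(unary[e] + pairwise * sum(q[j] for j in neighbors[e]))
             for e in range(len(unary))]
    return q

# Toy over-complete graph of three candidate edges in a chain: the middle
# edge has weak evidence and is suppressed after refinement.
probs = mean_field_refine(unary=[2.0, -2.0, 1.5],
                          pairwise=0.5,
                          neighbors=[[1], [0, 2], [1]])
```

In an actual MFN the number of sweeps is fixed in advance, so the whole refinement becomes a differentiable feed-forward computation through which the parameters (here stand-ins `unary` and `pairwise`) can be trained end to end.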

The performance of all methods is evaluated on a subset of CT data from the Danish Lung Cancer Screening Trial, comparing them with manually verified reference segmentations and relevant comparison methods. The proposed methods are shown to detect more branches with fewer false positives than the baseline methods.

Assessment Committee:
Chairman: Associate Professor Aasa Feragen, Department of Computer Science, University of Copenhagen, Denmark.
Senior Lecturer Ben Glocker, Imperial College London, UK.
Associate Professor Raul San Jose Estepar, Brigham and Women's Hospital, USA.

Academic supervisors:
Professor Marleen De Bruijne, Department of Computer Science, University of Copenhagen, Denmark.

Assistant Professor Jens Petersen, Department of Computer Science, University of Copenhagen, Denmark.

For an electronic copy of the thesis, please contact