Deep learning approaches for bone and bone lesion segmentation on 18FDG PET/CT imaging in the context of metastatic breast cancer

Noémie Moreau, PhD student
July 23, 2020

This work was presented orally on Thursday, July 23, 2020, at the 42nd international conference of the IEEE Engineering in Medicine and Biology Society (EMBC2020).

Noémie Moreau1,2, Caroline Rousseau3,4 MD PhD, Constance Fourcade2, Gianmarco Santini2, Ludovic Ferrer3,4 PhD, Marie Lacombe4 MD, Camille Guillerminet4 MD, Mario Campone3,4 MD PhD, Mathilde Colombié4 MD, Mathieu Rubeaux2 PhD and Nicolas Normand1 PhD

1 LS2N, Centrale Nantes, Nantes, France
2 Keosys, Saint Herblain, France
3 University of Nantes, CRCINA, INSERM UMR1232, CNRS-ERL6001, Nantes, France
4 ICO Cancer Center, Nantes-Angers, France

Introduction
In the context of metastatic breast cancer, tumor segmentation can provide information to assess and adapt treatments over time. Recently, automatic segmentation based on deep learning has shown good results for lesion segmentation in solid tumors. However, the heterogeneity of metastatic lesions in location, contrast and shape can be very difficult for a network to learn. Since most deep learning methods are specialized in a single body part, an interesting approach would be to train different networks according to the location of the metastases. In the case of breast cancer, metastatic lesions are mostly located in the bones.
As a first step towards metastatic breast cancer lesion segmentation and characterization, we chose to concentrate our efforts on the detection and segmentation of the prevalent bone lesions.

Methods
We used the nnU-Net framework to segment only the bone lesions in a first experiment, and both the bones and the bone lesions in a second one.
To do so, we trained the network on 24 patients from the EPICURE study with a 3-fold cross-validation. For each patient, PET and CT images were used as a 2-channel input for training. The lesions were segmented by an expert, and the bones were first extracted automatically using a set of traditional morphological and thresholding operations, then manually corrected by 4 non-specialist image processing researchers.
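The abstract does not detail the traditional bone extraction pipeline; a minimal sketch of one plausible variant is given below, assuming the CT volume is in Hounsfield units and using an illustrative density threshold of 150 HU followed by a morphological closing and small-component removal (all parameter values are assumptions, not the study's actual settings):

```python
import numpy as np
from scipy import ndimage


def extract_bone_mask(ct_hu, threshold=150, closing_iters=2, min_size=1000):
    """Rough bone mask from a CT volume in Hounsfield units.

    Thresholds high-density voxels, fills small gaps with a
    morphological closing, and drops small spurious components
    (noise, calcifications) below ``min_size`` voxels.
    """
    mask = ct_hu > threshold                                   # dense (bone-like) voxels
    mask = ndimage.binary_closing(mask, iterations=closing_iters)
    mask = ndimage.binary_fill_holes(mask)                     # close internal cavities
    labels, n = ndimage.label(mask)                            # connected components
    sizes = ndimage.sum(mask, labels, range(1, n + 1))         # component sizes
    keep_labels = np.nonzero(sizes >= min_size)[0] + 1         # labels large enough to keep
    return np.isin(labels, keep_labels)
```

A mask produced this way would still merge high-uptake organs or contrast-enhanced structures with bone in difficult cases, which is why the study's masks were manually corrected afterwards.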
Visual evaluation, detection metrics and segmentation metrics were used to evaluate the bone and lesion segmentations.
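As an illustration of the evaluation metrics, the Dice score and a simple per-lesion detection count can be sketched as below; the overlap criterion used here (any voxel overlap between a predicted component and a ground-truth lesion) is an assumption, as the abstract does not specify the exact detection rule:

```python
import numpy as np
from scipy import ndimage


def dice_score(pred, gt):
    """Dice similarity coefficient between two binary masks."""
    pred, gt = np.asarray(pred, bool), np.asarray(gt, bool)
    inter = np.logical_and(pred, gt).sum()
    denom = pred.sum() + gt.sum()
    return 2.0 * inter / denom if denom else 1.0


def lesion_detection_counts(pred, gt):
    """Per-lesion detection counts: a ground-truth lesion is detected
    if it overlaps the prediction; a predicted component with no
    ground-truth overlap is a false positive."""
    gt_lab, n_gt = ndimage.label(gt)
    pr_lab, n_pr = ndimage.label(pred)
    tp = sum(1 for i in range(1, n_gt + 1) if pred[gt_lab == i].any())
    fp = sum(1 for j in range(1, n_pr + 1) if not gt[pr_lab == j].any())
    return tp, n_gt - tp, fp  # detected, missed, false positives
```

Precision then follows as `tp / (tp + fp)`, which is the detection quantity compared between the two networks in the Results section.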
We also propose an index inspired by the Bone Scan Index for prostate cancer, which computes the percentage of the total skeletal mass taken up by the tumors in order to assess breast cancer metastatic burden in the bones.
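The exact formula of the proposed index is not given in the abstract; under the simplifying assumptions of isotropic voxels and uniform bone density (so that voxel counts are proportional to mass), one plausible sketch is the fraction of segmented skeleton voxels occupied by lesions:

```python
import numpy as np


def pet_bone_index(lesion_mask, bone_mask):
    """Sketch of a PET Bone Index (PBI): percentage of the segmented
    skeleton occupied by bone lesions. Assumes isotropic voxels and
    uniform bone density, so voxel counts stand in for mass."""
    bone_voxels = np.count_nonzero(bone_mask)
    lesion_in_bone = np.count_nonzero(np.logical_and(lesion_mask, bone_mask))
    return 100.0 * lesion_in_bone / bone_voxels if bone_voxels else 0.0
```

Restricting the lesion mask to the bone mask before counting keeps soft-tissue false positives from inflating the index.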

Results
Bone segmentation achieved a mean Dice score of 0.94 ± 0.03. Figure 1 shows a side-by-side comparison with the traditional automatic bone segmentation method for one patient: the traditional approach fails to dissociate active organs from bones.


Figure 1: Visual results for deep learning based bone segmentation.

For bone lesion segmentation, Dice scores of 0.58 and 0.61 were reached by the network segmenting only the bone lesions and the network segmenting the bones together with the lesions, respectively. The second network also achieved better detection precision. This shows that using the bone masks as ground truth during the training phase slightly improves the automatic bone lesion segmentation in terms of Dice score, and even more in terms of precision: the network is constrained to look for lesions in the bones.
Nevertheless, the bone lesion segmentation results remain perfectible. Indeed, the large PET SUV heterogeneity found in these lesions tends to lower the bone lesion Dice score in our experiments, as lesions with low uptake are sometimes missed by the presented U-Net architecture, as shown in the middle row of Figure 2.


Figure 2: Visual results for bone lesion segmentation.

The automatic PET Bone Index (PBI) shows relatively good agreement with the ground truth measurements, except for a few cases. These disagreements are also due to differences in the 18FDG uptake of the lesions.
