3-D Quantification of Filopodia in Motile Cancer Cells

Authors

CASTILLA Carlos, MAŠKA Martin, SOROKIN Dmitry, MEIJERING Erik, ORTIZ-DE-SOLÓRZANO Carlos

Year of publication 2019
Type Article in Periodical
Magazine / Source IEEE Transactions on Medical Imaging
MU Faculty or unit

Faculty of Informatics

Citation
Web https://doi.org/10.1109/TMI.2018.2873842
DOI http://dx.doi.org/10.1109/TMI.2018.2873842
Keywords Filopodium segmentation and tracking; actin cytoskeleton; confocal microscopy; 3-D skeletonization; Chan-Vese model; convolutional neural network; deep learning
Description We present a 3-D bioimage analysis workflow to quantitatively analyze single, actin-stained cells with filopodial protrusions of diverse structural and temporal attributes, such as number, length, thickness, level of branching, and lifetime, in time-lapse confocal microscopy image data. Our workflow makes use of convolutional neural networks, trained on real as well as synthetic image data, to segment the cell volumes with highly heterogeneous fluorescence intensity levels and to detect individual filopodial protrusions, followed by a constrained nearest-neighbor tracking algorithm to obtain valuable information about the spatio-temporal evolution of individual filopodia. We validated the workflow using real and synthetic 3-D time-lapse sequences of lung adenocarcinoma cells of three morphologically distinct filopodial phenotypes and showed that it achieves reliable segmentation and tracking performance, providing a robust, reproducible, and less time-consuming alternative to manual analysis of the 3D+t image data.
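
For illustration, the tracking step described above can be thought of as linking filopodium tip positions between consecutive frames under a distance constraint. The Python sketch below is a simplified assumption: the function name, the greedy closest-pair assignment, and the gating radius max_dist are illustrative choices made for this example and do not reproduce the exact constraints of the published algorithm.

import numpy as np

def constrained_nearest_neighbor_tracking(tips_prev, tips_curr, max_dist=5.0):
    """Link filopodium tip coordinates between two consecutive frames.

    tips_prev, tips_curr: (N, 3) and (M, 3) arrays of 3-D tip positions.
    max_dist: gating radius beyond which no link is allowed (illustrative value).
    Returns a list of (prev_index, curr_index) links; unmatched tips in the
    current frame are treated as newly appearing filopodia.
    """
    links = []
    used_prev, used_curr = set(), set()
    # Pairwise Euclidean distances between all previous and current tips.
    dists = np.linalg.norm(tips_prev[:, None, :] - tips_curr[None, :, :], axis=2)
    # Greedy assignment: consider candidate pairs from closest to farthest.
    for flat_idx in np.argsort(dists, axis=None):
        i, j = np.unravel_index(flat_idx, dists.shape)
        if dists[i, j] > max_dist:
            break  # all remaining pairs violate the distance constraint
        if i in used_prev or j in used_curr:
            continue  # each tip may participate in at most one link
        used_prev.add(i)
        used_curr.add(j)
        links.append((i, j))
    return links

In a typical call, the 3-D tip coordinates would come from the skeletonized segmentations of two consecutive frames; tips in the current frame that remain unmatched would then start new filopodium tracks.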
Related projects:
