
LMCB - MRC Laboratory for Molecular Cell Biology


Democratising deep learning for microscopy with ZeroCostDL4Mic

Deep Learning (DL) methods are increasingly recognised as powerful analysis tools for microscopy and can carry out many tasks such as image segmentation, classification, object detection and denoising. Their potential to outperform conventional image-processing pipelines is now well established. But despite the enthusiasm and innovations fuelled by DL technology, the need for access to powerful, compatible computing resources and the complexity of setting up the necessary computational tools create an accessibility barrier that most biology-focused laboratories find difficult to cross.

In their recent publication in Nature Communications, Lucas von Chamier and colleagues present ZeroCostDL4Mic, a DL deployment platform that considerably simplifies access to and use of DL for microscopy. This is achieved by exploiting computational resources provided by Google Colab: a free, cloud-based service accessible through a web browser. ZeroCostDL4Mic allows researchers with little or no coding expertise to use some of the most powerful DL networks available today, e.g. U-Net and StarDist (segmentation), CARE and Noise2Void (denoising), fnet (artificial labelling), Deep-STORM (super-resolution microscopy), YOLOv2 (object detection), and pix2pix and CycleGAN (image-to-image translation). Importantly, the platform allows users to perform every step of a DL workflow: training models, quality control of the network output with quantitative and image-based assessments, and batch analysis of new data once the performance of the model has been validated.

The researchers demonstrate that their platform helps democratise access to DL for a wider range of researchers in the biomedical community and can serve as a stepping stone for laboratories to establish new capabilities.

Read the story behind ZeroCostDL4Mic, by Romain Laine.

Written by Romain Laine