
Cellfinder

Whole-brain cell detection, registration and analysis.

N.B. If you just want to use the cell detection part of cellfinder, please see the standalone cellfinder-core package, or the cellfinder plugin for napari.


cellfinder is a collection of tools developed by Adam Tyson, Charly Rousseau and Christian Niedworok in the Margrie Lab, generously supported by the Sainsbury Wellcome Centre.

cellfinder is designed for the analysis of whole-brain imaging data such as serial-section imaging and lightsheet imaging in cleared tissue. The aim is to provide a single solution for:

  • Cell detection (initial cell candidate detection and refinement using deep learning) (using cellfinder-core)
  • Atlas registration (using brainreg)
  • Analysis of cell positions in a common space

Install with pip install cellfinder, or use the supplied Docker image.
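
For example, a minimal setup sketch, assuming conda is used to manage environments (the environment name and Python version below are illustrative):

    # illustrative environment name and Python version
    conda create -n cellfinder python=3.9
    conda activate cellfinder
    pip install cellfinder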


Basic usage:

cellfinder -s signal_images -b background_images -o output_dir --metadata metadata
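
Here -s and -b point to the directories of signal and background images, -o is the output directory, and --metadata points to an optional metadata file. A fuller invocation will usually also supply the voxel sizes and data orientation used for atlas registration; the flags and values below are illustrative only, so please check the documentation for the options your data require:

    # illustrative values: 5 micron plane spacing, 2 micron in-plane sampling,
    # data acquired in "asl" orientation
    cellfinder -s signal_images -b background_images -o output_dir \
        --metadata metadata -v 5 2 2 --orientation asl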

Full documentation can be found here. In particular, please see the data requirements.

This software is at a very early stage, and was written with our data in mind. Over time we hope to support other data types/formats. If you have any issues, please get in touch on the forum or by raising an issue.

If you have any other questions, please send an email.


Illustration

Introduction

cellfinder takes a stitched, but otherwise raw whole-brain dataset with at least two channels:

  • Background channel (i.e. autofluorescence)
  • Signal channel, the one with the cells to be detected:

Raw coronal serial two-photon mouse brain image showing labelled cells

Cell candidate detection

Classical image analysis (e.g. filters, thresholding) is used to find cell-like objects (with false positives):

Candidate cells (including many artefacts)

Cell candidate classification

A deep-learning network (ResNet) is used to classify cell candidates as true cells or artefacts:

Classified cell candidates. Yellow: cells, blue: artefacts

Registration and segmentation (brainreg)

Using brainreg, cellfinder aligns a template brain and atlas annotations (e.g. the Allen Reference Atlas, ARA) to the sample, allowing detected cells to be assigned a brain region.

This transformation can be inverted, allowing detected cells to be transformed to a standard anatomical space.

ARA overlaid on sample image

Analysis of cell positions in a common anatomical space

Registration to a template allows for powerful group-level analysis of cellular distributions. (Example to come)

Examples

(more to come)

Tracing of inputs to retrosplenial cortex (RSP)

Input cell somas detected by cellfinder, aligned to the Allen Reference Atlas, and visualised in brainrender along with RSP.


Data courtesy of Sepiedeh Keshavarzi and Chryssanthi Tsitoura. Details here

Visualisation

cellfinder comes with a plugin (brainglobe-napari-io) for napari to view your data.
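
If your napari installation lives in a different environment from cellfinder, the reader plugin can be installed there separately (a minimal sketch; brainglobe-napari-io is the package mentioned above):

    pip install brainglobe-napari-io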

Usage

  • Open napari (however you normally do it, but typically just type napari into your terminal, or click on your desktop icon)

Load cellfinder XML file

  • Load your raw data (drag and drop the data directories into napari, one at a time)
  • Drag and drop your cellfinder XML file (e.g. cell_classification.xml) into napari.

Load cellfinder directory

  • Load your raw data (drag and drop the data directories into napari, one at a time)
  • Drag and drop your cellfinder output directory into napari.

The plugin will then load your detected cells (in yellow) and the rejected cell candidates (in blue). If you carried out registration, then these results will be overlaid (similarly to loading brainreg data, but transformed to the coordinate space of your raw data).

Loading raw data

Loading cellfinder results

Citing cellfinder

If you find cellfinder useful, and use it in your research, please cite the paper outlining the cell detection algorithm:

Tyson, A. L., Rousseau, C. V., Niedworok, C. J., Keshavarzi, S., Tsitoura, C., Cossell, L., Strom, M. and Margrie, T. W. (2021) “A deep learning algorithm for 3D cell detection in whole mouse brain image datasets”, PLOS Computational Biology, 17(5), e1009074. https://doi.org/10.1371/journal.pcbi.1009074

If you use any of the image registration functions in cellfinder, please also cite brainreg.

If you use this, or any other tools in the BrainGlobe suite, please let us know, and we'd be happy to promote your paper/talk etc.


The BrainGlobe project is generously supported by the Sainsbury Wellcome Centre and the Institute of Neuroscience, Technical University of Munich, with funding from Wellcome, the Gatsby Charitable Foundation and the Munich Cluster for Systems Neurology - Synergy.