MINI2P freely-moving calcium imaging and spatial tuning analysis
MINI2P_toolbox includes the code, software, 3D models, protocols, and other resources for building and using MINI2P for freely-moving recordings. MINI2P is an open-source miniature two-photon microscope for fast, high-resolution calcium imaging in freely moving mice, published in Zong et al., "Large-scale two-photon calcium imaging in freely moving mice" (2022). With the materials provided in this toolbox, users can assemble and test a MINI2P system, set up the animal tracking system, process MINI2P imaging data, extract neuronal activity from single cells, and combine the neuronal activity data with the tracking data for user-dependent downstream analysis. The multi-FOV stitching software is also included, as is the code for most of the analyses (grid cells, place cells, etc.) in the paper "Large-scale two-photon calcium imaging in freely moving mice" (2022).
- 3D models (and 2D drawings for custom components) of all components for building a complete MINI2P system.
- Protocols overview, including video-tutorial links and contact information for custom-made products.
a) P1–Shopping and Machining list. This document lists each essential component together with its supplier, product name, model (or item reference), and approximate price in euros. An alternative list is also available for the 'Plug & Play' solution, i.e. a laser with a coupled fiber (item 129); in that version around 30 items are removed because they are no longer needed for the HC-920 fiber assembly. A detailed price list as of 2022 is provided here. 2D drawings and 3D models of most components are available here.
b) P2–System building protocol. This protocol includes all steps needed to assemble a MINI2P system. Each protocol starts with a short list of the main reagents and tools needed, followed by an overview schematic of the module and a table of the main products. The HC-920 assembly and laser-coupling video tutorial can be found at the link.
c) P3–MINI2P 2023 miniscope assembly protocol. This protocol describes how to assemble a MINI2P 2023 microscope. The MINI2P 2023 assembly video tutorial can be found at the link.
d) System operation instructions. This document describes the protocol for installation, laser calibration, and starting imaging.
e) Performance tests & standard testing protocol. This document describes the main steps for miniscope calibration (for either version, 2022 or 2023) and preliminary testing of a MINI2P system.
f) GFB assembly protocol. This document describes the protocol for assembling the GRIN-end fiber bundle (GFB), which replaces the tapered fiber bundle (TFB) as the main fiber for collecting and relaying the emission signal to the detection module. A GFB assembly video is also available.
g) MEMS wires and mirror protocol. This document describes how to assemble the MEMS wires and solder the MEMS flex cable. A MEMS flex-cable soldering video is also available.
h) These and other video tutorials can be found at the link: MINI2P video tutorial package.
- Software and configuration files:
a) One ScanImage Machine Data File (MDF) for the 2000 Hz MEMS-L scanner, a second MDF for the 5600 Hz MEMS-F scanner, and a laser-calibration look-up table (LUT) for the 920 nm AOM-controlled beam, measured after the miniscope objective.
b) An example Suite2P settings file.
c) Two DLC model configuration files. More details in Documents.
d) AnimalTracker.vi: a LabVIEW program for recording animal behavior and synchronizing the tracking-camera recording with the MINI2P imaging. More details in Documents.
e) MINI2P SI device: briefly, this device enables users to measure MINI2P distortion in different planes, perform fast data correction, and register MINI2P system information in the ScanImage software; all of this information is now saved in the TIFF header (a quick way to inspect it is sketched below). A video tutorial can be found at the link. More details in Readme.
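For users who want to verify that this information made it into their recordings, the TIFF tags can be inspected directly in MATLAB. The sketch below is only an illustration: the file name is a placeholder, and depending on the ScanImage version the metadata may sit in the ImageDescription and/or Software tags.

```matlab
% Minimal sketch: inspect the ScanImage/MINI2P information stored in the
% TIFF header of a recording ('example_MINI2P.tif' is a placeholder name).
info = imfinfo('example_MINI2P.tif');
disp(info(1));                      % all TIFF tags of the first frame

% ScanImage typically stores per-frame settings in ImageDescription and, in
% newer versions, the static acquisition/system metadata in the Software tag.
if isfield(info, 'ImageDescription'), disp(info(1).ImageDescription); end
if isfield(info, 'Software'),         disp(info(1).Software);         end
```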
- Analysis pipelines and applications:
a) Pipelines for the spatial tuning analyses included in the paper (grid cells, place cells, etc.).
b) NATEX.mlapp: NAT Explorer, an application to load, process, and preview the neuronal activity data (from the Suite2P output) and the tracking data (from the DLC output). It also combines the neuronal activity data and tracking data into NAT.mat (Neuron Activity aligned with Tracking matrix) and puts all necessary information into ExperimentInformation.mat for user-specific downstream analysis (see the loading sketch after this list). More details in Documents.
c) StitchingChecker.mlapp: an application to stitch multiple FOVs recorded from different positions of the cortex. It can load a wide-field image as a reference for FOV alignment and can also take in a retinotopic mapping result for identifying the different visual cortices. The precise alignment of the FOVs is confirmed by:
- overlapping of the landmarks between FOVs and the wide-field image, or between neighbouring FOVs;
- peak cross-correlation between FOVs and the wide-field image, or between neighbouring FOVs (a sketch of this check follows after this list);
- overlapping of the repeated cells in neighbouring FOVs. We also found that this application can be used to register imaging sessions recorded on multiple days.
More details in Documents
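For orientation, the NATEX output can be pulled into a custom script as follows. This is only a minimal sketch: the file names NAT.mat and ExperimentInformation.mat come from the description above, but the exact field layout is documented under Documents and is not assumed here.

```matlab
% Minimal sketch: load the NATEX output for user-specific downstream analysis.
natData = load('NAT.mat');                    % neuron activity aligned with tracking
expInfo = load('ExperimentInformation.mat');  % recording/session metadata

% List the variables that were stored; see Documents for the exact layout
% of the NAT matrix before building your own analysis on top of it.
disp(fieldnames(natData));
disp(fieldnames(expInfo));
```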
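The cross-correlation check listed above can also be reproduced outside the app, e.g. to sanity-check one FOV against the wide-field reference. The sketch below is an illustration only (placeholder file names, Image Processing Toolbox required); it is not the exact routine used inside StitchingChecker.

```matlab
% Minimal sketch: estimate the offset of one FOV mean image relative to a
% larger reference image by locating the peak of the normalized 2-D
% cross-correlation.
fov = im2double(imread('FOV2_mean.tif'));     % placeholder file names
ref = im2double(imread('widefield.tif'));

c = normxcorr2(fov, ref);                     % Image Processing Toolbox
[peakCorr, idx] = max(c(:));
[yPeak, xPeak]  = ind2sub(size(c), idx);

% Offset of the FOV's top-left corner within the reference image.
xOffset = xPeak - size(fov, 2);
yOffset = yPeak - size(fov, 1);
fprintf('Peak correlation %.2f at offset (x = %d, y = %d)\n', ...
        peakCorr, xOffset, yOffset);
```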
- Documents:
a) Requirements: A list of the non-optical components necessary to build and use a MINI2P system, including licensed software requirements.
b) How-to: more detailed documentation on how some components of the system work.
The applications NATEX, StitchingChecker and DistortionCleaner were written with MATLAB App Designer. To use them, press "Open" in the Home toolstrip of MATLAB, select the application, wait until the App Designer interface opens, and then press "Run". Details on how to use each application are provided under Documents.
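Alternatively, once the repository folder is on the MATLAB path, the apps can usually be started directly from the command window; the path below is a placeholder for wherever the toolbox was cloned.

```matlab
% Minimal sketch: launch the toolbox apps from the MATLAB command window.
addpath(genpath('MINI2P_toolbox'));   % placeholder path to the cloned repository
NATEX                                 % opens NAT Explorer
% StitchingChecker and DistortionCleaner can be started the same way.
```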
This repository was created by Weijian Zong and is maintained by the Moser group at the Kavli Institute for Systems Neuroscience. It has benefited from the input of all authors of the paper Zong et al., "Large-scale two-photon calcium imaging in freely moving mice," Cell (2022). Sections of the analysis code are based on the Behavioural Neurology Toolbox, (c) Vadim Frolov 2018.
MINI2P is a completely open-source project, and we encourage people to use, test, modify, and further develop this toolbox. If you have any questions or suggestions, or find any bugs in the code, please contact us or submit an issue. If you use the code or data, please cite us!