Code and data accompanying the article *Three components of human brain gene expression reflect normative developmental programmes with specific links to neurodevelopmental disorders*.
Python is used for analysis; R is used for plotting.
Processing of the AHBA draws heavily on prior work by Aurina Arnatkevičiūtė and Ross Markello.
- processing.py: These functions make two changes to the standard abagen pipeline:
- Samples are filtered to left-hemisphere, cortical-only samples before all other steps (e.g. probe selection). Applying this filter first was found to increase the generalisability of cortical-only components, relative to including subcortical data when selecting probes. This step requires patching (overwriting) one function within the abagen package.
- Filtering for more stable genes and more consistently sampled brain regions (illustrated in the sketch below).
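A minimal sketch of the stability and region filtering, assuming donor-level matrices from abagen's `get_expression_data` (with `return_donors=True` and `return_counts=True`) and the Desikan-Killiany atlas as a stand-in parcellation. The actual implementation in processing.py, including the patched left-hemisphere/cortical pre-filtering, will differ in detail:

```python
import itertools
import numpy as np
import pandas as pd
from scipy.stats import spearmanr
import abagen

# Stand-in atlas for illustration -- substitute the parcellation used in the paper.
# Note: the left-hemisphere, cortical-only pre-filtering (which patches abagen
# internals) is not shown here.
atlas = abagen.fetch_desikan_killiany()

# Donor-level (region x gene) matrices plus per-region, per-donor sample counts
expression, counts = abagen.get_expression_data(
    atlas['image'], atlas['info'],
    return_donors=True, return_counts=True,
)
donor_expr = list(expression.values()) if isinstance(expression, dict) else list(expression)

# Differential stability: mean pairwise Spearman correlation of each gene's
# regional expression profile across donors (slow but explicit)
genes = donor_expr[0].columns
pairs = list(itertools.combinations(donor_expr, 2))
ds = pd.Series(
    np.nanmean([[spearmanr(a[g], b[g], nan_policy='omit').correlation for g in genes]
                for a, b in pairs], axis=0),
    index=genes,
)

# Keep the 50% most stable genes and regions sampled by 3+ donors
stable_genes = ds[ds >= ds.median()].index
well_sampled = counts.index[(counts > 0).sum(axis=1) >= 3]

# Average across donors and apply both filters
mean_expr = pd.concat(donor_expr).groupby(level=0).mean()
filtered = mean_expr.loc[mean_expr.index.intersection(well_sampled), stable_genes]
```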
For convenience, we provide the AHBA gene-by-region expression matrix filtered to the 50% most stable genes and to regions sampled in 3+ donors:
We also provide the region scores and gene weights of the top three AHBA components from DME:
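These files can be loaded with pandas; the filenames below are placeholders, not the actual names in the repository:

```python
import pandas as pd

# Placeholder filenames -- substitute the files actually provided in the repo
expression = pd.read_csv("ahba_expression_filtered.csv", index_col=0)     # regions x genes
region_scores = pd.read_csv("component_region_scores.csv", index_col=0)   # regions x 3 components
gene_weights = pd.read_csv("component_gene_weights.csv", index_col=0)     # genes x 3 components
```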
Figures and results are presented in Jupyter notebooks that walk through the logic of each analysis. Figure-specific code files are listed below each notebook, with some other common files listed at the end.
- fig1.ipynb: generalisability summary & normative enrichments
- fig2.ipynb: spatial associations to normative imaging
- fig3.ipynb: single-cell & developmental analyses
- fig4.ipynb: disorder associations
- fig_extended.ipynb: extended data & supplementary figures
Common functions: these files include functions used in all of the above notebooks.
- processing.py: functions for processing the AHBA with abagen
- gradientVersion.py: class definition for the PCA/DME object (see the sketch after this list)
- analysis_helpers.py: general helper functions
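As a rough illustration of what the PCA/DME object produces, here is a minimal PCA-only sketch of how region scores and gene weights are obtained from the region-by-gene matrix; gradientVersion.py wraps this (and DME) in a class with more options, so details will differ:

```python
import pandas as pd
from sklearn.decomposition import PCA

def top_components(expression, n_components=3):
    """Return (region scores, gene weights) from PCA of a (region x gene) DataFrame.

    Assumes no missing values; z-scores each gene before decomposition.
    """
    X = (expression - expression.mean()) / expression.std()
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(X)        # regions x components
    weights = pca.components_.T          # genes x components
    cols = [f'C{i + 1}' for i in range(n_components)]
    return (pd.DataFrame(scores, index=expression.index, columns=cols),
            pd.DataFrame(weights, index=expression.columns, columns=cols))

# e.g. region_scores, gene_weights = top_components(filtered_expression)
```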
Docker - to ease installation of package dependencies, do one of the following:
- Pull the Docker image richardajdear/ahba and run these analyses in a container (the image automatically starts a Jupyter Lab instance that can be accessed through a browser or an IDE such as VS Code)
- Build your own Docker image using the Dockerfile, or manually install the dependencies listed in python-reqs.txt and r-reqs.R