- GNPS Analysis Tasks - mzspec:GNPS:TASK-d93bdbb5cdda40e48975e6e18a45c3ce-f.mwang87/data/Yao_Streptomyces/roseosporus/0518_s_BuOH.mzXML:scan:171
- GNPS/MassIVE public datasets - mzspec:MSV000084951:AH22
- MassIVE Proteomics datasets - mzspec:MSV000079514:Adult_CD4Tcells_bRP_Elite_28_f01
- MassIVE Proteomics dataset large - mzspec:MSV000083508:ccms_peak_centroided/pituitary_hypophysis/Trypsin_HCD_QExactiveplus/01697_A01_P018020_S00_N01_R2.mzML:scan:62886
- MetaboLights public datasets - mzspec:MTBLS1124:QC07.mzML
- Thermo - mzspec:MSV000084951:AH22
- Thermo MS3 - mzspec:MSV000084765:Leptocheline_MS3_DDA_IT_5
- Sciex - mzspec:MSV000085042:QC1_pos-QC1
- Bruker - mzspec:MSV000086015:StdMix_02__GA2_01_55623
- Waters - mzspec:MSV000084977:OEPKS7_B_1_neg
- Agilent - mzspec:MSV000084060:KM0001
- Sciex - SWATH - mzspec:MSV000085570:170425_01_Edith_120417_CCF_01
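Each of the identifiers above is a Universal Spectrum Identifier (USI), a colon-delimited string. A minimal sketch of splitting one into its fields (illustrative only; this is not the Dashboard's own resolver code):

```python
def split_usi(usi: str) -> dict:
    """Split a USI into its colon-delimited fields.

    Illustrative sketch -- the GNPS Dashboard has its own resolver logic.
    """
    parts = usi.split(":")
    info = {"prefix": parts[0], "collection": parts[1], "run": parts[2]}
    # Optional trailing fields, e.g. ...:scan:171
    if len(parts) >= 5 and parts[3] == "scan":
        info["scan"] = int(parts[4])
    return info

print(split_usi("mzspec:MSV000084951:AH22"))
```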
Quick analysis of QC data
Here is the USI for a QC run
mzspec:MSV000085852:QC_0
We can simply paste in the m/z values of the QC molecules and pull them all out in one fell swoop:
271.0315;278.1902;279.0909;285.0205;311.0805;314.1381
You can try it out at this URL
Quickly compare multiple files
mzspec:MSV000085852:QC_0 mzspec:MSV000085852:QC_1 mzspec:MSV000085852:QC_2
271.0315;278.1902;279.0909;285.0205;311.0805;314.1381
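The inputs above are plain delimited strings: USIs separated by whitespace and target m/z values separated by semicolons. A quick sketch of parsing them (illustrative, not the Dashboard's internal code):

```python
# Inputs exactly as pasted into the Dashboard
usi_field = "mzspec:MSV000085852:QC_0 mzspec:MSV000085852:QC_1 mzspec:MSV000085852:QC_2"
mz_field = "271.0315;278.1902;279.0909;285.0205;311.0805;314.1381"

usis = usi_field.split()                       # one USI per whitespace-separated token
mzs = [float(x) for x in mz_field.split(";")]  # QC target masses as floats

print(len(usis), "files,", len(mzs), "target masses")
```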
We aim to provide several APIs to get data programmatically.
/mspreview?usi=<usi>
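USIs contain characters (colons, slashes) that must be percent-encoded when placed in a query string. A sketch of constructing the request URL, with a placeholder base URL standing in for your deployment's host:

```python
from urllib.parse import urlencode

BASE = "http://localhost:6548"  # placeholder; substitute your deployment's host
usi = "mzspec:MSV000085852:QC_0"

# urlencode percent-escapes the colons in the USI for us
url = f"{BASE}/mspreview?{urlencode({'usi': usi})}"
print(url)
# Fetching it is then e.g. requests.get(url) -- not executed here.
```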
There are several ways to get the GNPS Dashboard working locally; our preferred and recommended way is Docker/Docker Compose, as it provides a more consistent experience.
The initial steps are identical:
- Fork the GNPS Dashboard repository
- Clone down to your system
- Download Feature Finding tools (Dinosaur and MZmine 2) with get scripts
To get everything up and running, we've created a make target that brings Docker up for you:
make server-compose-interactive
The requirements on your local system are:
- Docker
- Docker Compose
This will bring the server up on http://localhost:6548.
Alternatively, to run without Docker:
- Install Python 3 within conda
- Install all packages from requirements.txt
- Install additional packages via conda
- Start the dashboard locally (defaults to http://localhost:5000)
Example shell:

```shell
# make sure to have Python 3 installed via conda (preferably 3.8)
conda install -c conda-forge datashader
conda install -c conda-forge openjdk
# install requirements
pip install -r requirements.txt
# run or debug the GNPS Dashboard with Python 3 on http://localhost:5000
python ./app.py
# if you run into problems, you may need the following (tested on Windows 10 with WSL2 Ubuntu)
sudo apt-get install libffi-dev
```
Since we utilize a USI to find datasets, there are a limited number of locations we can grab data from. If you want to add a new data source, you'll need to:
- Extend the USI specification to denote what the resource is and how to get data from it
- Update the code in download.py, specifically `_resolve_usi_remotelink`, to implement how to get the remote URL for your new USI
To run our unit tests:

```shell
cd test
pip install pytest
pip install pytest-xdist
pip install pytest-profiling
make all
```
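The suite runs under pytest (with xdist for parallel execution and the profiling plugin). New tests follow the usual pytest conventions; a minimal hypothetical example of the shape such a test takes (the helper and file name are made up for illustration):

```python
# test_usi_example.py -- hypothetical; real tests live in the test/ directory

def canonical_collection(usi: str) -> str:
    """Toy helper under test: pull the collection field out of a USI."""
    return usi.split(":")[1]

def test_canonical_collection():
    # pytest discovers any function named test_* and runs its assertions
    assert canonical_collection("mzspec:MSV000085852:QC_0") == "MSV000085852"
```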
Dash and Plotly documentation
- Components: https://dash.plotly.com/dash-core-components
- Callbacks: https://dash.plotly.com/basic-callbacks
- Plotly express: https://plotly.com/python/plotly-express/
- Plotly: https://plotly.com/python/
One major consideration for production deployments is DNS routing. Take the following steps to have everything route properly:
- Create a DNS entry in your DNS server for the domain you want to use (e.g. dashboard.gnps2.org) and point it to the server you're running this on
- Copy .env_template to .env and update the domain name to the one you want to use
- Run a reverse proxy (https://github.com/mwang87/GNPS_ExtensionsReverseProxy)
- Run in production mode
make server-compose-production
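After copying .env_template to .env, set the domain there. The variable name below is a guess for illustration only; check .env_template for the actual keys it defines:

```
# .env -- variable name is illustrative; use the keys from .env_template
SERVER_DOMAIN=dashboard.gnps2.org
```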