jsherfey/dnsim

Using custom mechanisms in cluster simulations on fresh install

Opened this issue · 1 comment

I found a bug while running the script below (reporting it here since it's directly related to the CLI workflow):

If I clone a fresh dnsim copy to ~/dnsim and run cluster sims using this script via matlab -r dnsim_batch_example, it works. BUT when I add a new mechanism to ~/dnsim/database and try to run a cluster sim using that new mechanism, dnsim can't find it (yes, I have $DNSIM exported correctly in my .bashrc, as in the https://github.com/jsherfey/dnsim README). In ~/batchdirs/whatever/pbsout/job1.out it says:

Found mech file. Using ~/research/modeling/database/iNa.txt
Looking in known mech list for iLeak_TC <the new one>
Error: Failed to find...
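For completeness, the relevant line from my .bashrc (the exact value here is an assumption matching my clone location; the README describes the variable):

```shell
# Assumed .bashrc line: point DNSIM at the cloned toolbox, per the README.
export DNSIM=$HOME/dnsim
```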

In dnsim/matlab/functions/get_mechlist.m, it appears that DBPATH, which defaults to ~/research/modeling/database (a directory that doesn't necessarily exist!), is not getting a BIOSIMROOT to pull from when run on the cluster. DBPATH is the database directory that is ultimately used.

This wouldn't be a problem if I could just set BIOSIMROOT=~/dnsim in my run script or startup.m, but even though BIOSIMROOT is used as a global variable, it apparently can't be set from just anywhere: I tried exactly that and still hit this problem. Strangely, when I run the script over SSH and print BIOSIMROOT in the main session where I'm calling the script, it equals my current directory (unsure if this is always the case), yet in the cluster logs it is empty (BIOSIMROOT = []) in the MATLAB output for the cluster run! During the initial SSH run the script successfully finds all the files, since it's looking in the current working directory and BIOSIMROOT is correctly set to it (in this case, ~/dnsim). Is BIOSIMROOT supposed to work as a bash environment variable? That could get complicated, since the cluster isn't necessarily using the same ~/.bashrc as your account's desktop/VM.
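For reference, this is a sketch of what I tried in my run script / startup.m, assuming BIOSIMROOT is meant to be declared as a MATLAB global (it has no effect on the cluster jobs):

```matlab
% Attempted fix: declare the global and point it at my clone instead of
% the default ~/research/modeling/database. Cluster jobs still see [].
global BIOSIMROOT
BIOSIMROOT = '~/dnsim';
```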

What works:

  1. I think the reason I didn't notice this before is that using the 'spec' struct from an individually loaded model specification file DOES work, e.g. putting load('~/dnsim/database/TC-RE-debug.mat') in the run script.
  2. Going directly into dnsim/matlab/functions/get_mechlist.m and replacing the hardcoded ~/research/modeling/database with my own database path (e.g., ~/dnsim/database), or hardcoding my BIOSIMROOT in there, also works. But this IS hardcoding.
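Workaround 1 as it looks in a run script (a sketch; the runsim arguments are taken from the demo script below):

```matlab
% Workaround: load a pre-built model specification instead of relying on
% get_mechlist.m to resolve mechanism names via DBPATH/BIOSIMROOT.
load('~/dnsim/database/TC-RE-debug.mat');        % provides the 'spec' struct
data = runsim(spec,'timelimits',[0 1000],'dt',.01);  % then simulate as usual
```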

script:

%% Demo: DNSim batch simulations
% Purpose: demonstrate how to run simulation batches varying model parameters
% on local machine or cluster with or without codegen.
% Created by Jason Sherfey (27-Mar-2015)
% -------------------------------------------------------------------------
% model specification
spec=[];
spec.nodes(1).label = 'HH';                       % name of this node population
spec.nodes(1).multiplicity = 1;                   % size of this node population
spec.nodes(1).dynamics = {'V''=current/c'};       % node dynamics for state variables that can be "seen" by other nodes
spec.nodes(1).mechanisms = {'iNa','iK','itonic'}; % note: corresponding mechanism files (e.g., iNa.txt) should exist in matlab path
spec.nodes(1).parameters = {'stim',7,'c',1};      % user-specified parameter values (note: these override default values set in mechanism files)

% simulation controls
tspan=[0 1000];     % [beg end], ms, simulation time limits
SOLVER='euler';     % numerical integration method
dt=.01;             % integration time step [ms]
dsfact=10;          % downsample factor (applied to simulated data)
codepath='~/dnsim'; % path to dnsim toolbox

%% SINGLE SIMULATION
% local simulation with codegen
data = runsim(spec,'timelimits',tspan,'dt',dt,'SOLVER',SOLVER,'dsfact',dsfact,...
  'coder',1,'verbose',0);
plotv(data,spec,'varlabel','V');

%% SIMULATION BATCHES

% output controls
rootdir = pwd;     % where to save outputs
savedata_flag = 0; % 0 or 1, whether to save the simulated data (after downsampling)
saveplot_flag = 1; % 0 or 1, whether to save plots
plotvars_flag = 1; % 0 or 1, whether to plot state variables

% what to vary across batch
scope = 'HH';       % node or connection label whose parameter you want to vary
variable = 'c';     % parameter to vary (e.g., capacitance)
values = '[1 1.5]'; % values for each simulation in batch (varied across batches)

%{
  Prerequisites for submitting jobs to cluster:
  1. add path-to-dnsim/csh to your environment.
     e.g., add to .bashrc: "export PATH=$PATH:$HOME/dnsim/csh"
  2. run matlab on cluster node that recognizes the command 'qsub'
%}

% -------------------------------------------------------------------------
% LOCAL BATCHES (run serial jobs on current machine)
cluster_flag=0; % 0 or 1, whether to submit jobs to a cluster (requires: running on a cluster node that recognizes the command 'qsub')

% without codegen
values = '[1 1.5]';
[~,~,outdir]=simstudy(spec,{scope},{variable},{values},...
  'dt',dt,'SOLVER',SOLVER,'rootdir',rootdir,'timelimits',tspan,'dsfact',dsfact,...
  'savedata_flag',savedata_flag,'saveplot_flag',saveplot_flag,'plotvars_flag',plotvars_flag,'addpath',codepath,...
  'cluster_flag',cluster_flag,'coder',0);

% with codegen
values = '[2 2.5]';
[~,~,outdir]=simstudy(spec,{scope},{variable},{values},...
  'dt',dt,'SOLVER',SOLVER,'rootdir',rootdir,'timelimits',tspan,'dsfact',dsfact,...
  'savedata_flag',savedata_flag,'saveplot_flag',saveplot_flag,'plotvars_flag',plotvars_flag,'addpath',codepath,...
  'cluster_flag',cluster_flag,'coder',1);

% -------------------------------------------------------------------------
% CLUSTER BATCHES (submit to queue for running parallel jobs on cluster nodes)
cluster_flag=1; % 0 or 1, whether to submit jobs to a cluster (requires: running on a cluster node that recognizes the command 'qsub')

% without codegen
values = '[3 3.5]';
[~,~,outdir]=simstudy(spec,{scope},{variable},{values},...
  'dt',dt,'SOLVER',SOLVER,'rootdir',rootdir,'timelimits',tspan,'dsfact',dsfact,...
  'savedata_flag',savedata_flag,'saveplot_flag',saveplot_flag,'plotvars_flag',plotvars_flag,'addpath',codepath,...
  'cluster_flag',cluster_flag,'coder',0);

% with codegen
values = '[4 4.5]';
[~,~,outdir]=simstudy(spec,{scope},{variable},{values},...
  'dt',dt,'SOLVER',SOLVER,'rootdir',rootdir,'timelimits',tspan,'dsfact',dsfact,...
  'savedata_flag',savedata_flag,'saveplot_flag',saveplot_flag,'plotvars_flag',plotvars_flag,'addpath',codepath,...
  'cluster_flag',cluster_flag,'coder',1);

A much simpler statement of this issue: because ~/research/modeling/database is hardcoded and used when running simulations on the cluster, if you have new custom mechanisms and want to use them in a cluster simulation, you currently must add them directly to ~/research/modeling/database for a cluster simulation to find them.
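Until that's fixed, the only non-hardcoding workaround is to mirror custom mechanism files into the hardcoded location. A sketch in shell form (iLeak_TC.txt is the custom mechanism from above):

```shell
# Hardcoded database directory that get_mechlist.m falls back to on the cluster:
DBDIR="$HOME/research/modeling/database"
mkdir -p "$DBDIR"
# Mirror the custom mechanism from the dnsim clone so cluster jobs can find it:
if [ -f "$HOME/dnsim/database/iLeak_TC.txt" ]; then
  cp "$HOME/dnsim/database/iLeak_TC.txt" "$DBDIR/"
fi
```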