ArgusShapes
ArgusShapes is a package used to predict phosphene shape in epiretinal prostheses.
Please cite as:
M. Beyeler, D. Nanduri, J. D. Weiland, A. Rokem, G. M. Boynton, I. Fine (2018). A model of ganglion axon pathways accounts for percepts elicited by retinal implants. bioRxiv 453035, doi:10.1101/453035.
This code is based on pulse2percept, a Python-based simulation framework for bionic vision (Beyeler et al. 2017).
Data is available on the Open Science Framework. You can either download and extract the data yourself or have the scripts under "figures/" do it for you.
Installation
Required packages are listed in requirements.txt.
First make sure you have NumPy and Cython installed:
$ pip install numpy==1.11
$ pip install cython==0.27
Then install all packages listed in requirements.txt:
$ pip install -r requirements.txt
After that, you are ready to install the main package, argus_shapes:
$ pip install -e .
If you want to make sure that everything works as expected, you can run the test suite:
$ pip install pytest
$ py.test argus_shapes
Figures
The code to reproduce figures in the paper can be found in the "figures/" folder:
- fig2-phosphene-shape.ipynb: Phosphene drawings vary across electrodes.
- fig3-shape-descriptors.ipynb: Shape descriptors used to measure phosphene variability.
- fig5-axon-map-orientation.ipynb: Phosphene orientation is aligned with retinal nerve fiber bundles.
- fig6-model-shapes.ipynb: Cross-validated phosphene shape predictions.
- fig6-inset-models.ipynb: Scoreboard and axon map model schematics.
- fig7-model-scatter.ipynb: Cross-validated shape descriptor predictions.
These notebooks assume that the data live in a directory ${DATA_ROOT}/argus_shapes, where DATA_ROOT is an environment variable.
On Unix, make sure to add DATA_ROOT to your ~/.bashrc:
$ echo 'export DATA_ROOT=/home/username/data' >> ~/.bashrc
$ source ~/.bashrc
You can either download and extract the data from OSF yourself, or have the notebooks automatically do it for you. In the above case, the data will end up in "/home/username/data/argus_shapes".
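For reference, the notebooks do roughly the following to locate and, if necessary, download the data. This is a minimal sketch; the keyword argument passed to fetch_data is an assumption, so check its docstring for the exact signature:
>>> import os
>>> import argus_shapes as shapes
>>> # Build the expected data directory from the DATA_ROOT environment variable
>>> data_dir = os.path.join(os.environ['DATA_ROOT'], 'argus_shapes')
>>> # Download and extract the OSF data if it is not already present
>>> # (keyword name 'save_path' is assumed, not verified)
>>> if not os.path.isdir(data_dir):
...     shapes.fetch_data(save_path=data_dir)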
Loading your own data
In order to load your own data, you will need two .csv files:
subjects.csv should have the following columns (an example file is shown below):
- subject_id: subject ID, has to be the same as in drawings.csv (e.g., S1)
- implant_type: currently supported are 'ArgusI' and 'ArgusII'
- implant_x/implant_y: (x,y)-coordinates of the array center in microns, assuming the fovea is at (0, 0)
- implant_rot: array rotation in radians (positive: counter-clockwise rotation)
- loc_od_x/loc_od_y: (x,y)-coordinates of this subject's optic disc center in microns
- xmin/xmax: x-extent (horizontal) of the touch screen in degrees of visual angle (e.g., xmin=-36, xmax=36)
- ymin/ymax: y-extent (vertical) of the touch screen in degrees of visual angle (e.g., ymin=-24, ymax=24)
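For illustration only, a hypothetical subjects.csv for a single subject could look like this (all values are made up):
subject_id,implant_type,implant_x,implant_y,implant_rot,loc_od_x,loc_od_y,xmin,xmax,ymin,ymax
S1,ArgusII,-1300,-500,-0.5,4600,0,-36,36,-24,24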
drawings.csv should have the following columns (an example file is shown below):
- subject_id: subject ID, has to be the same as in subjects.csv (e.g., S1)
- stim_class: currently supported is 'SingleElectrode'
- PTS_ELECTRODE: electrode name
- PTS_FILE: path to image file
- PTS_AMP: applied current amplitude in microamps
- PTS_FREQ: applied pulse frequency in Hz
- date: date of data collection
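Similarly, a hypothetical drawings.csv with a single trial (electrode name, file path, and stimulus values are made up):
subject_id,stim_class,PTS_ELECTRODE,PTS_FILE,PTS_AMP,PTS_FREQ,date
S1,SingleElectrode,A1,S1/A1_trial01.png,200,20,2017-05-01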
Then the data can be loaded as Pandas DataFrames using the following Python recipe:
>>> import argus_shapes as shapes
>>> df_subjects = shapes.load_subjects('subjects.csv')
>>> df_drawings = shapes.load_data('drawings.csv')
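Both loaders return ordinary pandas DataFrames, so the usual pandas tooling applies. Two quick sanity checks, assuming the column names listed above:
>>> # Peek at the first few drawing trials
>>> df_drawings.head()
>>> # Count how many drawings were collected per subject and electrode
>>> df_drawings.groupby(['subject_id', 'PTS_ELECTRODE']).size()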
Submodules
argus_shapes: Main module.
- fetch_data: Download data from the web.
- load_data: Load shape data from a local .csv file.
- load_subjects: Load subject data from a local .csv file.
- extract_best_pickle_files: Return a list of pickle files with the lowest train scores.
models: Code to run various versions of the scoreboard and axon map models.
- ModelA: Scoreboard model with shape descriptor loss.
- ModelB: Scoreboard model with perspective transform and shape descriptor loss.
- ModelC: Axon map model with shape descriptor loss.
- ModelD: Axon map model with perspective transform and shape descriptor loss.
model_selection: Model fitting and cross-validation routines.
- FunctionMinimizer: Perform function minimization.
- GridSearchOptimizer: Perform grid search optimization.
- ParticleSwarmOptimizer: Perform particle swarm optimization.
- crossval_predict: Predict data using k-fold cross-validation.
- crossval_score: Score a model using k-fold cross-validation (see the illustration below).
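The cross-validation reported in the paper is leave-one-electrode-out (see run_crossval.sh under Miscellaneous below); crossval_predict and crossval_score presumably apply the same kind of splitting to the package's model classes. Purely to illustrate the splitting scheme, here is the same idea expressed with scikit-learn's LeaveOneGroupOut on toy data (not the package's API):
>>> import numpy as np
>>> from sklearn.model_selection import LeaveOneGroupOut
>>> # Hypothetical toy data: 6 trials recorded on 3 electrodes
>>> X = np.arange(12).reshape(6, 2)
>>> y = np.array([0., 1., 0., 1., 0., 1.])
>>> electrodes = np.array(['A1', 'A1', 'B2', 'B2', 'C3', 'C3'])
>>> # Each fold holds out all trials from exactly one electrode
>>> for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=electrodes):
...     print('test electrode:', electrodes[test_idx][0])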
imgproc: Various image processing routines.
- get_thresholded_image: Apply a threshold to a grayscale image.
- get_region_props: Calculate region properties of a binary image (area, center of mass, orientation, etc.).
- calc_shape_descriptors: Calculate area, orientation, and elongation of a phosphene.
- center_phosphene: Center a phosphene in an image.
- scale_phosphene: Apply a scaling factor to a phosphene.
- rotate_phosphene: Rotate a phosphene by a certain angle.
- dice_coeff: Calculate the Dice coefficient between two phosphenes (see the sketch below).
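For reference, the Dice coefficient between two binary images A and B is 2|A ∩ B| / (|A| + |B|). A self-contained NumPy sketch of that formula (independent of the package's dice_coeff implementation):
>>> import numpy as np
>>> def dice(img1, img2):
...     """Dice coefficient between two binary images (1 = perfect overlap)."""
...     img1, img2 = img1.astype(bool), img2.astype(bool)
...     denom = img1.sum() + img2.sum()
...     return 2.0 * np.logical_and(img1, img2).sum() / denom if denom else 1.0
>>> print(dice(np.ones((3, 3)), np.eye(3)))
0.5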
utils: Various utility functions.
- ret2dva: Convert retinal to visual field coordinates.
- dva2ret: Convert visual field to retinal coordinates.
- cart2pol: Convert from Cartesian to polar coordinates.
- pol2cart: Convert from polar to Cartesian coordinates (both conversions are sketched below).
- angle_diff: Calculate the signed difference between two angles.
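The coordinate helpers follow standard conventions, e.g., Cartesian-to-polar is theta = arctan2(y, x), rho = hypot(x, y). A self-contained sketch of both conversions (the package's argument and return order may differ):
>>> import numpy as np
>>> def cart2pol(x, y):
...     """Convert Cartesian (x, y) to polar (theta, rho)."""
...     return np.arctan2(y, x), np.hypot(x, y)
>>> def pol2cart(theta, rho):
...     """Convert polar (theta, rho) back to Cartesian (x, y)."""
...     return rho * np.cos(theta), rho * np.sin(theta)
>>> theta, rho = cart2pol(1.0, 1.0)  # theta = pi/4 (45 degrees), rho = sqrt(2)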
viz: Some visualization functions.
- scatter_correlation: Scatter plots some data points and fits a regression curve (illustrated below).
- plot_phosphenes_on_array: Plots mean phosphenes on a schematic of the implant.
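As a rough idea of what scatter_correlation produces, here is a plain matplotlib/SciPy version of "scatter plus fitted regression line" on made-up data (not the package's function):
>>> import numpy as np
>>> import matplotlib.pyplot as plt
>>> from scipy.stats import linregress
>>> rng = np.random.RandomState(0)
>>> x = rng.rand(50)
>>> y = 2 * x + 0.1 * rng.randn(50)  # noisy linear relationship (made up)
>>> fit = linregress(x, y)
>>> plt.scatter(x, y)
>>> plt.plot(x, fit.intercept + fit.slope * x, 'k-')
>>> plt.title('r=%.2f, p=%.2g' % (fit.rvalue, fit.pvalue))
>>> plt.show()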
Miscellaneous
- minimal-example.ipynb: A minimal usage example.
- run_fit.sh: Bash script to fit the models to all subject data.
- run_crossval.sh: Bash script to run leave-one-electrode-out cross-validation.
- crossval_swarm.py: Python script that runs the model fitting / cross-validation.