PRIME-RE/prime-re.github.io

Electrical stimulation and neuroimaging: humans and macaques

cipetkov opened this issue · 1 comment

Resource info table

Name Electrical stimulation and neuroimaging: humans and macaques
Authors F Rocchi*, H Oya*, F Balezeau, AJ Billig, Z Kocsis, RL Jenison, KV Nourski, CK Kovach, M Steinschneider, Y Kikuchi, AE Rhone, BJ Dlouhy, H Kawasaki, R Adolphs, JDW Greenlee, TD Griffiths, MA Howard III, CI Petkov
Description Comparative human and monkey combined electrical stimulation and fMRI
Documentation Common Fronto-temporal Effective Connectivity in Humans and Monkeys: This paper and resource establishes a comparative es-fMRI resource in human neurosurgery patients and monkeys. The work also establishes considerable correspondence between fronto-temporal auditory cognitive systems involving ventro-lateral prefrontal cortex and the hippocampus. The data in monkeys are shared via PRIME-DE and the human data via OpenNeuro. The data are also available in the Open Science Framework. Human datasets also include the electrical tractography data.
Link https://osf.io/arqp8; https://fcon_1000.projects.nitrc.org/indi/indiPRIME.html; https://openneuro.org/
Publication Pre-accepted in Neuron
Communication chris.petkov@ncl.ac.uk or fabien.balezeau@ncl.ac.uk
Restrictions No restrictions
Category Electrical stimulation and neuroimaging

**Please check the applicable boxes**

  • All required ethical approval (human/animal) for the development of this resource was obtained
  • No ethical approval (human/animal) was required for the development of this resource
  • Do NOT include this resource in periodic mailings about new resources on PRIME-RE

Instructions

Please fill out the table above as completely as possible, i.e., replace the complete string > .... < with your information.
Then submit it as a new issue and tag the issue with all applicable (yellow) categorical tags to specify what type of resource you are contributing.
We will use this table and these tags to add your resource to the main list.

Name

Provide a name for your resource. Try to make it a bit descriptive, but as long as it is not incredibly offensive, we will allow it.

Authors

Who wrote the resource or deserves to be credited? This would be a good place to list them.

Description

Tell the community what it is that your resource does. Keep it concise (a few lines).

Documentation

Does your resource include instructions on how to use it, and if so, where?
This can be a hyperlink, or you can simply state that it can be found through the main link (e.g., because it's in a GitHub ReadMe.md).
Jupyter Notebooks or richly annotated scripts would be great, but any documentation is welcome.

Link

Provide a link to the resource. This can be a link to a GitHub repository, website, or shared file.
If you don't already have your resource hosted somewhere and would like to have it hosted on this GitHub,
tell us and provide us with a link to the resource (Dropbox, Google Drive, WeTransfer, etc.).

Language

What language (e.g., Python, shell, MATLAB, etc.) is your resource written in? This will help people look at solutions in languages they are familiar with first.
We will accept anything.

Associated publications (if available)

If your resource is a published method, you can link to the paper here.

Communication

How can a potential user get in contact with the submitter/author of the resource?
Is there a Slack or Mattermost channel? Can issues be opened on a GitHub repo? Is there an email address?
If you want to submit your resource 'as-is' and not offer any means of communication, you can also mention that.

Restrictions

Are there any limitations for how the analysis method can be used (e.g., citation or acknowledgement required, etc)?

Category

What is the most suitable category to list your resource under?

Checkboxes

Please indicate whether ethical approval was acquired for activities related to the development of this resource, or whether this wasn't necessary. Also, indicate if you do not want your resource advertised in the occasional updates sent by the PRIME-RE team to the community.