joshcampbell191/openfpga-cores-inventory

Add a JSON API for developers

goronfreeman opened this issue · 5 comments

Since this website is already the de facto place to view the list of available cores for the Analogue Pocket, why not also add a JSON API for the various application developers in the community?

The Problem

There are currently at least three tools for updating and managing cores on the Pocket:

- update-pocket
- pocket_core_autoupdate_net
- Pocket_Updater

Both update-pocket and pocket_core_autoupdate_net maintain their own internal list of cores, and Pocket_Updater uses the list from pocket_core_autoupdate_net. This means that every time a new core is created, update-pocket, pocket_core_autoupdate_net, and this project all need to be updated separately. This is not only inefficient, it also makes these tools less reliable: the developer may not have seen a newly released core, so the tool will not offer it to the end user.

The Solution

Add a JSON endpoint to the openfpga-cores-inventory website that all developers can reference as the canonical source for the most up-to-date list of cores. The only place a new core would have to be added is this repo. This is also nice for members of the community who don't use an updater tool and prefer to update manually, since this will be the single place anyone goes to find new cores.

The Implementation

This new API would live at /openfpga-cores-inventory/api/v0/repos.json (or somewhere else if we decide on a better location). I'm thinking the list of cores will be maintained in a YAML file that can then be referenced both in the Markdown file already on the site and in the JSON files used to build the full JSON response. That way, only a single file has to be updated to add a core both to the inventory table on the website and to the API.
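As a rough sketch of the idea (the file path and field names below are illustrative placeholders, not a proposed final schema), a Jekyll data file could serve as the single source of truth:

```yaml
# _data/cores.yml — hypothetical source file; field names are placeholders
- username: developer-name
  repository: example-core
  display_name: Example Core
  platform: Example Platform
```

which could generate JSON along these lines:

```json
{
  "data": [
    {
      "username": "developer-name",
      "repository": "example-core",
      "display_name": "Example Core",
      "platform": "Example Platform"
    }
  ]
}
```

Jekyll exposes files under `_data/` to templates, so both the Markdown inventory table and the JSON endpoint could iterate over the same list.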

First Steps

In order for this to be successful, we'll want some input from the developers of the current tools on the structure of the API. Since the API can be versioned (e.g. v0), it can always be changed in the future while maintaining backwards compatibility, but it would be good to get it as right as we can at the start.

I already have a local branch for adding all this, so once the details are worked out, I am more than happy to run with this myself.

Thanks for putting this together. This is a great idea! I'll take a look at your PR shortly and see if there's anything missing.

How about a JSON file in this repo? Client scripts could fetch the latest version directly from GitHub's CDN.

I used to have my Retropatcher scraper script do this.

GitHub provides unique URLs for each version of the file (using a git tag, or just the commit ID), and you could have a field with the current version.
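For example, a client could pin to a specific tag or commit when fetching the file. This is just a sketch: the tag and file path below are hypothetical, not the final location.

```python
# Sketch: build a raw.githubusercontent.com URL pinned to a specific git ref.
# The tag "v0.1.0" and path "api/v0/cores.json" are hypothetical examples.

def raw_github_url(owner: str, repo: str, ref: str, path: str) -> str:
    """Return the raw-content URL for one version of a file in a repo."""
    return f"https://raw.githubusercontent.com/{owner}/{repo}/{ref}/{path}"

# Pinning to a tag keeps a client stable across future schema changes:
url = raw_github_url("joshcampbell191", "openfpga-cores-inventory",
                     "v0.1.0", "api/v0/cores.json")
print(url)
# → https://raw.githubusercontent.com/joshcampbell191/openfpga-cores-inventory/v0.1.0/api/v0/cores.json
```

A client that instead wants the latest version would fetch the default branch's ref and read the version field mentioned above.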

Additionally, you can write a GitHub Action that scrapes the various core repos and updates the JSON file when any changes are seen. Again, that's what I did with Retropatcher: I have a GitHub Action that looks at the repo of each Pocket patch author, and if there are any new patches, it writes a new JS file (it could have been a JSON file) to my repo, which then auto-deploys to my site.
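A scheduled workflow along these lines could drive that (everything here is a hypothetical sketch: the workflow name, the `scripts/update_cores.py` scraper, and the `cores.json` path are all placeholders):

```yaml
# .github/workflows/update-cores.yml — hypothetical sketch
name: Update core inventory
on:
  schedule:
    - cron: "0 * * * *"   # check the core repos hourly
  workflow_dispatch:       # allow manual runs
jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Check core repos for new releases
        run: python scripts/update_cores.py   # hypothetical scraper script
      - name: Commit changes, if any
        run: |
          git config user.name "github-actions"
          git config user.email "github-actions@users.noreply.github.com"
          git add cores.json
          git diff --cached --quiet || (git commit -m "Update cores.json" && git push)
```

The `git diff --cached --quiet` guard means the workflow only commits when the scraper actually changed the file.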

Hey @JonAbrams! We're currently building a JSON file from the list of repositories hosted on our site. We're putting together the final details on the schema we'd like to use so that tools can leverage the API to pull down the cores. We may also look into adding dynamic content to the API, such as including each core's latest version in the response, which we'd use GitHub Actions to accomplish.


Thanks for the idea, @JonAbrams! We ended up implementing a version of this in #16 😀