Deep-MI/LaPy

Automatic release on PyPI

mscheltienne opened this issue · 13 comments

Now that the packaging is taken care of in #16 (and #17 for the typo in the line length), here are the steps to enable automatic releases on PyPI. Luckily, the name seems to be available: https://pypi.org/search/?q=lapy

  1. Create an account on PyPI
  2. Log in and go to Account settings
  3. Scroll down to 'API tokens' and create a new token
  4. As the project has never been uploaded to PyPI, you will not be able to select the scope 'lapy'. You will have to select 'Entire account (all projects)', giving this token the right to upload to all your projects.
  5. Copy the token
  6. On GitHub, on the repository Deep-MI/LaPy, click on the 'Settings' tab
  7. Go to 'Secrets and variables'
  8. Create a new repository secret named PYPI_API_TOKEN and paste the token

All done. The publication workflow is now able to build the package and upload it to PyPI:

- name: Build and publish
  env:
    TWINE_USERNAME: __token__
    TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}
  run: |
    python -m build
    twine upload dist/*

After the first upload, you can go back to the settings on PyPI, create a new token with a scope restricted to lapy, delete the token with the full-account scope, and replace the value of the PYPI_API_TOKEN secret on GitHub for additional safety.


When does this workflow trigger? The triggers are defined at the top:

on:
  workflow_dispatch:
  # release:
  #   types: [published]

workflow_dispatch: is a manual trigger. You can go to the 'Actions' tab of the repository and click on a workflow on the left; if it has this trigger, you will see a 'Run workflow' button on the right side of the screen.

release: 
   types: [published] 

is an automatic trigger on release (at the moment, commented out), i.e. on the repository page, if you click on 'Releases' in the right pane, then on 'Draft a new release', and publish a new release, it will automatically build the package and upload it to PyPI.
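For reference, switching to automatic releases only requires uncommenting those two lines, so the trigger section of the workflow would become:

```yaml
on:
  # manual trigger from the 'Actions' tab
  workflow_dispatch:
  # automatic trigger when a GitHub release is published
  release:
    types: [published]
```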

Note that the release target has to be set to the branch to build and upload (and that branch must have the workflow). This is useful because you usually keep 'maintenance branches' of older releases, which let you cherry-pick bugfixes from your active development branch and cut a new bugfix release with those fixes before releasing the next minor (or major) version of your package, which will carry the same bugfixes plus many more improvements and features.


Final point: if you want to give it a shot with nothing being definitive, you can use Test PyPI instead (https://test.pypi.org/).
Final final point: once the PyPI release is done, we can create the recipe and feedstock for conda-forge, which will automatically release on conda based on the release on PyPI.
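As a sketch of what a dry run against Test PyPI looks like from a local checkout (this assumes a separate account and API token on test.pypi.org, and the `build` and `twine` packages installed):

```shell
# Build the sdist and wheel into dist/
python -m build

# Sanity-check the package metadata before uploading
twine check dist/*

# Upload to Test PyPI instead of the real index
twine upload --repository testpypi dist/*
```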

I hope this was the right amount of detail, as I don't know how familiar you are with distribution channels. Please let me know if it was too much and I should shorten the explanation, or if, on the contrary, you still have questions.

Mathieu

Thanks a lot for your help Mathieu and the detailed instructions! I decided to have the automatic publish on release and it is now working. We still need to update the README, as the install description is outdated.
Now we can set up conda as well.

I also see that it did not get the version right and released it as 0.3.0 instead of the new 0.4.0 that I created.

Great, looks like it's working. Just one bullet point that I forgot: you do need to increment the version in pyproject.toml manually here

version = '0.3.0'

There are tools to automate that part, like versioneer, but I am not so familiar with them.
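For context, with the new packaging setup, the version typically lives in the [project] table of pyproject.toml (PEP 621); a minimal sketch, with all other metadata fields elided:

```toml
[project]
name = "lapy"
version = "0.4.0"
```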

Anyway, the release 0.4.0 on GitHub was released as 0.3.0 on PyPI.
I suggest pushing a commit incrementing the version to 0.4.1, cutting a new release, and then incrementing to the next .dev version: 0.5.0.dev.

The .dev nomenclature for the main branch, or even .dev0 for incremental development releases, is standard; cf. e.g. scikit-learn.
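A quick way to convince yourself of the ordering, assuming the third-party packaging library (which implements PEP 440) is available:

```python
from packaging.version import Version

# PEP 440: a .dev release sorts *before* the final release it leads up to,
# so the main branch at 0.5.0.dev0 is "older" than the released 0.5.0...
assert Version("0.5.0.dev0") < Version("0.5.0")

# ...but newer than the previous bugfix release.
assert Version("0.4.1") < Version("0.5.0.dev0")

print(Version("0.5.0.dev0").is_devrelease)  # True
```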

Exactly, with this workflow the version increment is not automated. I should have mentioned it, sorry. Thus, the workflow for a release is:

  • commit the version increment. If you follow the .dev nomenclature, this commit removes the .dev extension from the version.
  • cut a new release matching that version
  • commit the next version increment, bumping the minor or major version and adding the .dev extension back

Example:

Current main branch is on 0.5.0.dev.

  1. Commit 0.5.0
  2. Release 0.5.0
  3. Commit 0.6.0.dev
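The three steps can be sketched as shell commands (the tag name, commit messages, and the use of the GitHub CLI gh are assumptions; the release can just as well be drafted from the web UI):

```shell
# 1. Drop the .dev suffix in pyproject.toml: version = '0.5.0'
git commit -am "REL: 0.5.0"
git push

# 2. Cut a release matching the version (creates the tag and triggers
#    the publication workflow if the release trigger is enabled)
gh release create v0.5.0 --title "v0.5.0" --generate-notes

# 3. Start the next development cycle: version = '0.6.0.dev'
git commit -am "MAINT: bump version to 0.6.0.dev"
git push
```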

Thanks, all should be fixed now.
Now how do we do this:

we can create the recipe and feedstock for conda-forge which will automatically release on conda based on the release on Pypi.

At first, the food-based terminology is very weird, but you get used to it ;)

To release on the conda-forge channel, you need to create a 'recipe' describing the package. This is what I did in conda-forge/staged-recipes#22124. You can browse through the 'recipe' file, I think it's very straightforward. Please ask any question you might have.

Once the 'recipe' is ready to be merged in the https://github.com/conda-forge/staged-recipes repository, you need to open a PR and let the CI do their magic. Once everything is green, a conda-forge maintainer will merge the 'recipe' in the staged-recipes repository.
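To give an idea of the shape of such a file, here is a heavily trimmed sketch of a meta.yaml for a pure Python package (names, versions, hash, and dependencies are illustrative placeholders, not the actual LaPy recipe):

```yaml
package:
  name: lapy
  version: "0.4.1"

source:
  url: https://pypi.io/packages/source/l/lapy/lapy-0.4.1.tar.gz
  sha256: # sha256 of the sdist published on PyPI

build:
  noarch: python
  number: 0
  script: {{ PYTHON }} -m pip install . -vv

requirements:
  host:
    - python
    - pip
  run:
    - python
    - numpy
```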

Then you wait. Within the next couple of hours, the conda-forge CIs will again do their magic and:

  • Create a feedstock, lapy-feedstock, which is an entire repository generated from your recipe.
  • Create the associated webpage and deploy it.
  • Remove the recipe from the staged-recipes repository.

As an example, here is a feedstock and the webpage.

Everything is automated, and for a pure Python package like this one, it's very low maintenance: a bot (that I will enable on the feedstock) will monitor the PyPI releases and automatically update the feedstock when a new release appears on PyPI.

Also, I did list both you and myself as maintainers of the feedstock. This is only for feedstock maintenance, meaning I will help to keep the conda feedstock (package) in sync with the PyPI package. This is not listing myself as an author or maintainer of LaPy. In practice, we won't have much to do after we enable the bot ;) At most, we need to bump the minimum Python version when a new one is released.

One thing to know is that if you release on conda, all your dependencies MUST be available on conda as well. It's very common that one of your dependencies is not yet released there, and that you take upon yourself the creation of a recipe/feedstock and the associated role of feedstock maintainer (if the authors do not respond and/or are not willing to maintain this distribution channel).

Do you know if all dependencies are available for LaPy?

They are, even scikit-sparse, which means that, contrary to the PyPI release, we can include it by default in the conda-forge release.

It took a long time to get the PR accepted in conda-forge/staged-recipes; I'm guessing they have a shortage of maintainers to review and merge those. But now that it's done, the conda-forge release is available: https://anaconda.org/conda-forge/lapy

And I added the automerge bot, so the conda-forge release will update automatically (a bot will open a PR, build the package, and merge the PR) every time a change in the source (PyPI) is detected, i.e. every time a new release is cut.
The only "maintenance" to be done is to keep the dependencies in sync: if a dependency is added to or removed from LaPy, or if the minimum Python version is updated, the recipe (https://github.com/conda-forge/lapy-feedstock/blob/main/recipe/meta.yaml) must be updated in a PR.
Any update (PR) should be done from a fork of the feedstock; we should never push directly to the feedstock: https://github.com/conda-forge/lapy-feedstock

And finally, you should have received this kind of message from the conda bot:


Hi! This is the friendly automated conda-forge-webservice.

I updated the Github team because of this commit.

You should get push access to this feedstock and CI services.

Your package won't be available for installation locally until it is built
and synced to the anaconda.org CDN (takes 1-2 hours after the build finishes).

Feel free to join the community chat room.

NOTE: Please make sure to not push to the repository directly.
Use branches in your fork for any changes and send a PR.
More details on this are here.


I hope this release on PyPI and conda-forge shows that those distribution channels can be simple to use and mostly automated. Feel free to ask any question you have and to close this issue when you want.

Thanks @mscheltienne for your help! We are currently updating docstrings and will then try to set up automated documentation.

Nice, feel free to ping me on an issue/PR when you are done updating the docstrings to the numpy convention and if you want help with the documentation build. I will gladly provide you with a hands-on doc build and deployment workflow.

Ping :-) We are done with the docstrings for now.