This repository is intended to be a base template, a cookiecutter, for a new Python package project, keeping PEP 518 in mind. Because it’s hosted on GitHub, it already utilizes a few GitHub Actions that enforce repository-side checks for continuous integration and that implement a semantic release setup. And while this package is a starting point for a Python project with good engineering practices, it’s intended to be improved and added to in various ways; see the Wiki for more suggestions.
- Features
  - Typing
  - Quality assurance
  - Unit testing
  - Documentation
  - Versioning and publishing
  - Dependency analysis
  - Security analysis
  - Package or application?
- How to use this repository
  - Updating dependent packages
  - Git hooks
  - Testing
  - Generating documentation
  - Versioning, publishing and changelog
  - Cleaning up
- Frequently asked questions
The badges above give you an idea of what this project template provides. It’s a work in progress, and I try to enable as much engineering goodness as is possible and sensibly bearable using git hooks (see below) and GitHub Actions.
The package requires a minimum of Python 3.9 and supports Python 3.10 as well as Python 3.11. All code requires comprehensive typing. The mypy static type checker is invoked by a git hook and through a GitHub Action to enforce continuous type checks. Make sure to add type hints to your code or to use stub files for types, to ensure that users of your package can import and type-check your code (see also PEP 561).
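To illustrate, here is a minimal sketch of the kind of fully typed code mypy expects; the function and types are illustrative, not part of the template:

```python
from typing import Sequence


def mean(values: Sequence[float]) -> float:
    """Return the arithmetic mean of a non-empty sequence of numbers."""
    if not values:
        raise ValueError("values must not be empty")
    return sum(values) / len(values)
```

Because both the parameters and the return value are annotated, mypy can verify every call site in your package and in packages that import it.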
A number of git hooks are invoked before and after a commit, and before a push. These hooks are all managed by the pre-commit tool and enforce a number of software quality assurance measures (see below).
Comprehensive unit testing is enabled using pytest combined with Hypothesis (to generate test payloads based on strategies), and test code coverage is measured using coverage (see below).
Documentation is important, and Sphinx is set up already to produce standard documentation for the package, assuming that code contains docstrings with reStructuredText (see below).
Automatic package versioning and tagging, publishing to PyPI, and changelog generation are enabled using GitHub Actions. Furthermore, an optional Release Notification Action allows GitHub to push an update notification to a Slack bot of your choice. For setup instructions, please see below.
Dependabot is enabled to scan the dependencies and automatically create pull requests when an updated version is available.
CodeQL is enabled to scan the Python code for security vulnerabilities. You can adjust the GitHub Actions workflow at `.github/workflows/codeql-analysis.yaml` and the configuration file at `.github/codeql/codeql-config.yaml` to add more languages, or to change the default paths, the scan schedule, and the queries.
OSSF Security Scorecards is enabled as a GitHub Actions workflow to give consumers information about the supply-chain security posture of this project, assigning a score of 0–10. We upload the results as a SARIF (Static Analysis Results Interchange Format) artifact after each run, and the results can be found on the Security tab of this GitHub project. We also allow publishing the data at OpenSSF. We use this data to continuously improve the security posture of this project. Note that this configuration supports the `main` (default) branch and requires the repository to be public and not forked.
Additionally, the bandit tool is installed as part of a development environment (i.e. the `[dev]` package extra); however, bandit does not run automatically! Instead, you can invoke it manually:

```shell
bandit --recursive src  # Add '--skip B101' when checking the tests, see Bandit issue #457.
```
A shared package or library is intended to be imported by another package or application; an application is a self-contained, standalone, runnable package. Unfortunately, Python’s packaging ecosystem is mostly focused on packaging shared packages (libraries), and packaging Python applications is not as well-supported (discussion). This template, however, supports both scenarios.
Shared package: this template works out of the box as a shared package. Direct dependencies on other packages are declared in `pyproject.toml` (see the `dependencies` field) and should allow for as wide a version range as possible, to ensure that this package and its dependencies can be installed by and coexist with other packages and applications without version conflicts.
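For example, a `dependencies` entry with deliberately wide version ranges might look like the following sketch; the package names and ranges are illustrative, not the template’s actual dependencies:

```toml
[project]
name = "package"
version = "0.1.0"
dependencies = [
  "requests >=2.28,<3",  # Accept any compatible 2.x release.
  "attrs >=22.1",        # Lower bound only, no known upper-bound conflict.
]
```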
Application: the `__main__.py` file ensures an entry point to run this package as a standalone application using Python’s `-m` command-line option. A wrapper script named `something` is also generated as an entry point into this package by `make setup` or `make upgrade`. In addition to specifying directly dependent packages and their version ranges in `pyproject.toml`, an application should pin its entire environment using the `requirements.txt` file. Use the `make requirements` command to generate that file if you’re building an application.

In the future, the generated `requirements.txt` file, with its integrity hash for every dependent package, will become an important provenance material to provide transparency in the packaging process (see also SBOM + SLSA).
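As a sketch, a minimal `__main__.py` along these lines makes a package runnable with `python -m package`; the function name and message are illustrative assumptions, not the template’s actual content:

```python
"""Entry point for running the package as an application: python -m package."""


def main() -> int:
    """Run the application and return a process exit code."""
    print("Hello from package!")
    return 0


if __name__ == "__main__":
    main()
```

A real `__main__.py` would typically also pass `main()`’s return value to `SystemExit` so the process exit code reflects success or failure.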
If you’d like to contribute to the project template, please open an issue for discussion or submit a pull request.
If you’d like to start your own Python project from scratch, you can either copy the content of this repository into your new project folder or fork this repository. Either way, consider making the following adjustments to your local copy:
- Change the `LICENSE.md` file and the license badge according to your needs, replace the symbolic link `README.md` with an actual README file, likewise replace the symbolic link `SECURITY.md` with a SECURITY file adjusted to your needs (more details here), and lastly replace the symbolic link `CHANGELOG.md` with an actual CHANGELOG file which contains a single line: `<!--next-version-placeholder-->`
- Rename the `src/package/` folder to whatever your own package’s name will be, adjust the GitHub Actions in `.github/workflows/`, and review the `Makefile`, `pyproject.toml`, and `.pre-commit-config.yaml` files as well as the unit tests accordingly. Note: by default all Actions run on three different host types (Linux, macOS, and Windows) whose rates vary widely, so make sure that you disable or budget accordingly if you’re in a private repository!
- Adjust the content of the `pyproject.toml` file according to your needs, and make sure to fill in the project URL, maintainer and author information too. Don’t forget to reset the package’s version number in `src/package/__init__.py`.
- If you import packages that do not provide type hints into your new repository, then `mypy` needs to be configured accordingly: add these packages to the `pyproject.toml` file using the `ignore_missing_imports` option.
- If you’d like to publish your package to PyPI then set the `upload_to_pypi` variable in the `pyproject.toml` file to `true`.
- Adjust the Dependabot settings in `.github/dependabot.yaml` to the target branch that you’d like Dependabot to monitor.
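A mypy override for an untyped dependency, as mentioned in the list above, might look like this in `pyproject.toml`; the module name `untyped_dependency` is a placeholder for whatever package you import:

```toml
[[tool.mypy.overrides]]
module = "untyped_dependency.*"
ignore_missing_imports = true
```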
To develop your new package, first create a virtual environment by either using the Makefile:

```shell
make venv  # Create a new virtual environment in the .venv folder using Python 3.10.
```

or for a specific version of Python:

```shell
PYTHON=python3.9 make venv  # Same virtual environment for a different Python version.
```

or manually:

```shell
python3.11 -m venv .venv  # Or use .venv311 for more than one local virtual environment.
```

When working with this Makefile it is important to always activate the virtual environment, because some of the git hooks (see below) depend on that:

```shell
. .venv/bin/activate  # Where . is a bash shortcut for the source command.
```

Finally, set up the new package with all of its extras and initialize the local git hooks:

```shell
make setup
```
With that in place, you’re ready to build your own package!
It’s likely that during development you’ll add or update dependent packages in the `pyproject.toml` file, which requires an update to the virtual environment:

```shell
make upgrade
```
Using the pre-commit tool and its `.pre-commit-config.yaml` configuration, the following git hooks are active in this repository:

- When committing code, a number of pre-commit hooks ensure that your code is formatted according to PEP 8 using the `black` tool, and they’ll invoke `flake8` (and various plugins), `pylint` and `mypy` to check for lint and correct types. There are more checks, but those are the important ones. You can adjust the settings for these tools in one of the `pyproject.toml`, `pylintrc`, `mypy.ini`, or `.flake8` configuration files.
- The commit message hook enforces conventional commit messages and that, in turn, enables a semantic release of this package on the GitHub side: upon merging changes into the `main` branch, the semantic release Action produces a changelog, computes the next version of this package, and publishes a release, all based on the commit messages.
- Using a pre-push hook, this package is also set up to run `pytest`; in addition, the `coverage` plugin makes sure that all of your package’s code is covered by tests, and Hypothesis is already installed to help with generating test payloads.
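For reference, conventional commit messages follow a `<type>: <summary>` pattern; a few illustrative examples (the summaries are made up):

```text
feat: add support for exporting reports as CSV
fix: handle empty input in the configuration parser
feat!: drop support for Python 3.8
```

Commit types such as `feat` and `fix` typically drive the computed version bump (minor and patch, respectively), and a `!` after the type, or a `BREAKING CHANGE:` footer, marks a breaking change that triggers a major release.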
You can also run these hooks manually, which comes in very handy during daily development tasks. For example, `make quick-check` runs `pylint` and `mypy` only, whereas `make check` runs all installed git hooks over your code.
As mentioned above, this repository is set up to use pytest, either standalone or as a pre-push git hook. Tests are stored in the `tests/` folder, and you can run them manually like so:

```shell
make test
```

which runs all tests in your local Python virtual environment. For more options, see the pytest command-line flags. Also note that pytest includes doctest, which means that module and function docstrings may contain test code that executes as part of the unit tests.
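To sketch what such a doctest looks like, here is a hypothetical function whose docstring examples pytest can run as tests, assuming doctest collection is enabled in the pytest configuration:

```python
def clamp(value: int, low: int, high: int) -> int:
    """Clamp value into the inclusive range [low, high].

    >>> clamp(5, 0, 10)
    5
    >>> clamp(-3, 0, 10)
    0
    >>> clamp(42, 0, 10)
    10
    """
    return max(low, min(value, high))
```

The interactive-session lines in the docstring double as documentation and as executable test cases.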
Test code coverage is already tracked using coverage and the pytest-cov plugin for pytest, and it measures how much code in the `src/package/` folder is covered by tests:

```
Run unit tests...........................................................Passed
- hook id: pytest
- duration: 0.48s

============================= test session starts ==============================
platform darwin -- Python 3.10.2, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /.../python-package-template/.venv/bin/python3.10
cachedir: .pytest_cache
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/.../python-package-template/.hypothesis/examples')
rootdir: /.../python-package-template, configfile: pyproject.toml, testpaths: tests
plugins: hypothesis-6.41.0, cov-3.0.0
collected 1 item

tests/test_something.py::test_something PASSED                           [100%]

---------- coverage: platform darwin, python 3.10.2-final-0 ----------
Name                       Stmts   Miss  Cover   Missing
--------------------------------------------------------
src/package/__init__.py        1      0   100%
src/package/something.py       4      0   100%
--------------------------------------------------------
TOTAL                          5      0   100%

Required test coverage of 100.0% reached. Total coverage: 100.00%

============================== 1 passed in 0.16s ===============================
```
Note that code that’s not covered by tests is listed under the `Missing` column. The net effect of enforcing 100% code coverage is that every new major and minor feature, every code change, and every fix is tested (keeping in mind that code coverage does not correlate with test quality).
Hypothesis is a package that implements property-based testing and provides payload generation for your tests based on strategy descriptions (more). Through its pytest plugin, Hypothesis is ready to be used for this package.
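A minimal sketch of a property-based test with Hypothesis might look like this; the property and strategies are illustrative, not taken from the template’s test suite:

```python
from hypothesis import given
from hypothesis import strategies as st


@given(st.lists(st.integers()))
def test_sorting_is_idempotent(xs: list[int]) -> None:
    """Sorting an already-sorted list must not change it."""
    once = sorted(xs)
    assert sorted(once) == once
```

Instead of hand-picking inputs, Hypothesis generates many lists of integers (including edge cases like the empty list) and reports a minimal counterexample if the property ever fails.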
As mentioned above, all package code should make use of Python docstrings in reStructuredText format. Using these docstrings and the documentation template in the `docs/source/` folder, you can then generate proper documentation in different formats using the Sphinx tool:

```shell
make docs
```

This example generates documentation in HTML, which can then be found here:

```shell
open docs/_build/html/index.html
```
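As a sketch, a reStructuredText-style docstring that Sphinx can render might look like this; the function itself is a hypothetical example:

```python
def divide(numerator: float, denominator: float) -> float:
    """Divide one number by another.

    :param numerator: The number to be divided.
    :param denominator: The number to divide by; must not be zero.
    :returns: The quotient of the two numbers.
    :raises ZeroDivisionError: If the denominator is zero.
    """
    return numerator / denominator
```

The `:param:`, `:returns:`, and `:raises:` fields are picked up by Sphinx’s autodoc machinery and rendered as structured parameter and return-value documentation.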
To enable automation for versioning, package publishing, and changelog generation, it is important to use meaningful conventional commit messages! This package template already has a semantic release GitHub Action enabled which is set up to take care of all three of these aspects, every time changes are merged into the `main` branch.

If you work with protected branches then make sure to add a `RELEASE_TOKEN` secret to your repository, see here for how to do that. For more configuration options, please refer to the `tool.semantic_release` section in the `pyproject.toml` file, and read the semantic release documentation.
You can also install and run the tool manually and locally, for example:

```shell
pip install python-semantic-release
semantic-release changelog
semantic-release version
```

Use the `--verbosity=DEBUG` command-line argument for more details.
If you’d like to receive Slack notifications whenever a new release is published, follow the comments in the Release Notification Action and set up a Slack bot by following the instructions here.
In order to build a distribution of your package locally, instead of publishing it through the GitHub Action, you can simply call:

```shell
make dist
```

This builds a source package and a binary distribution, and stores the files in your local `dist/` folder.
On occasion it’s useful (and perhaps necessary) to clean up stale files, caches that tools like `mypy` leave behind, or even to nuke the complete virtual environment:

- Remove distribution artifacts: `make dist-clean`
- In addition, remove tool caches and documentation: `make clean`
- In addition, remove Python code caches and git hooks: `make nuke-caches`
- In addition, reset everything and restore a clean package to start over fresh: `make nuke`
Please be careful when nuking your environment, and make sure you know what you’re doing.
- Question: Why don’t you use tools like tox or nox to orchestrate testing?
  Answer: We’ve removed `tox` based on a discussion in issue #100 and PR #102. In short: we want to run tests inside the development venv using `pytest`, and run more tests with an extensive test matrix using GitHub Actions.