This is the plugin_template
repository to help plugin writers get started and write their own
plugin for Pulp Project 3.0+.
If you are planning on writing a new Pulp plugin but have no idea what you're doing, you've come to the right place. The purpose of this guide is to walk you through, step by step, the Pulp plugin creation process.
This guide specifically details how you write a new content plugin.
Why would you want to write a plugin?
What exactly is this Pulp thing?
It's recommended that you develop on a system that already has Pulp installed. This allows you to test your plugin at every step.
It's also recommended that you go through the planning guide before starting to develop your plugin.
The first step is to create a template_config.yml
for your new plugin. This file contains
settings used by the ./plugin-template
command when generating new plugins and for future updates.
- Clone this repository:
$ git clone https://github.com/pulp/plugin_template.git
$ cd plugin_template
- Run the provided ./plugin-template --generate-config command:
$ ./plugin-template --generate-config --plugin-app-label <label> PLUGIN_NAME
NOTE: The plugin-app-label should identify the content type which you would like to support, e.g. rubygem or maven. The PLUGIN_NAME is usually pulp_ or pulp- prepended to the --plugin-app-label, e.g. pulp_maven.
The first time this command is run, a new directory named PLUGIN_NAME is created inside
the parent directory of the plugin_template directory. The template_config.yml is written to the
root of this new directory and is filled with default values for the various aspects of the plugin
scaffolding. You can edit these values according to your needs to control subsequent calls to
plugin-template.
The following settings are stored in template_config.yml.
black Boolean, whether to use black to format python source files.
flake8 Boolean, whether to use flake8 to lint python source files.
check_commit_message Include inspection of commit message for a reference to an issue in
pulp.plan.io.
check_gettext Check for problems with gettext such as mixing f-strings with gettext.
check_manifest Runs check-manifest to see if any files were unintentionally left out
of MANIFEST.in. For more info, see https://pypi.org/project/check-manifest/.
check_stray_pulpcore_imports
Check that plugins are importing from pulpcore.plugin and not pulpcore directly.
core_import_allowed A list of string patterns to be allowed to import from pulpcore explicitly.
coverage Include collection of coverage and reporting to coveralls.io
deploy_client_to_pypi Include a Github Actions job that publishes a client library to PyPI.
This job only executes when a tag is associated with the commit being
built. When enabling this job, the user is expected to provide a
secure environment variable called PYPI_API_TOKEN. The variable can
be added in the Github secrets settings page for the repository[0].
This job uses the OpenAPI schema for the plugin to generate a Python
client library using openapi-generator-cli.
deploy_client_to_rubygems
Include a Github Actions job that publishes a client library to RubyGems.org.
This job only executes when a tag is associated with the commit being
built. When enabling this job, the user is expected to provide a
secure environment variable called RUBYGEMS_API_KEY. The variable can
be added in the Github secrets settings page for the repository.
deploy_to_pypi Include a Github Actions job that publishes builds to PyPI
This job only executes when a tag is associated with the commit being
built. When enabling this job, the user is expected to provide a
secure environment variable called PYPI_API_TOKEN. The variable can
be added in the Github secrets settings page for the repository[0].
docker_fixtures In Github Actions, use the pulp-fixtures docker container to serve up
fixtures instead of using fedorapeople.org.
github_org The Github organization to use for the plugin.
issue_tracker Which issue tracker the project will use. Valid values are 'redmine' and
'github'. To switch from Redmine to GitHub use the --migrate-github-issues
option.
latest_release_branch A pointer to the currently latest release branch (this is automatically
updated).
docs_test Include a CI build for testing the 'make html' command for sphinx docs.
parallel_test_workers Run tests in parallel using `pytest-xdist` with N parallel runners. This
setting specifies N; the default is 8.
plugin_app_label Suppose our plugin is named 'pulp_test', then this is 'test'
plugin_default_branch The default branch in your plugin repo, defaults to 'main'.
plugin_name Suppose our plugin is named 'pulp_test', then this is 'pulp_test'
plugins List of dictionaries with `app_label` and `name` as keys. One entry per
plugin resident in this repository.
publish_docs_to_pulpprojectdotorg
Include a job for publishing documentation to
docs.pulpproject.org/<plugin_name>/. This job requires the project-specific
authorized ssh key to be set as a secret named `PULP_DOCS_KEY`.
pulp_settings A dictionary of settings that the plugin tests require to be set.
pulp_settings_<scenario>
A dictionary of settings that the plugin <scenario> tests can set
additionally. `<scenario>` is one of "azure", "s3", "gcp".
pulp_env A dictionary of ENV variables used globally by all runners. The variables
are translated to separate ENV layers in Containerfile configuring the base
Pulp image.
pulp_env_<scenario>
A dictionary of ENV variables that will be translated to separate ENV
layers in Containerfile configuring the base Pulp image. `<scenario>` is one
of "azure", "s3", "gcp".
pydocstyle Boolean, whether to have flake8 use pydocstyle to check for compliance with
Python docstring conventions.
release_user The GitHub user that is associated with the RELEASE_TOKEN secret on GitHub.
The username and token are used to push the Changelog and version bump commits
created by the release workflow. The default is 'pulpbot'.
release_email The email address associated with the release_user.
run_pulpcore_tests_for_plugins
Pulpcore ships some functional tests that make sense for plugins to run.
These are pytest marked with the `from_pulpcore_for_all_plugins`. If true,
the CI will run an additional pytest call running pulpcore tests with that
mark.
noissue_marker A string that is used to mark a commit as not attached to an issue.
stalebot A boolean that indicates whether to use stalebot or not.
stalebot_days_until_stale
The number of days of inactivity before an Issue or Pull Request becomes stale.
stalebot_days_until_close
The number of days of inactivity before an Issue or Pull Request with the stale
label is closed.
supported_release_branches
Specify the release branches that should receive regular CI updates.
sync_ci Enables a nightly workflow to update the CI files.
test_cli Run the pulp-cli tests as part of the CI tests
test_performance Include a nightly job that runs a script to test performance. If using a
list, a separate job will run a specific performance test file for each
entry in the list. Otherwise, all performance tests will be run together.
test_released_plugin_with_next_pulpcore_release
Include a cron job that tests the latest released version of the plugin to
see if it is compatible with pulpcore's main branch. This helps ensure
that pulpcore is following the deprecation policy for the plugin API.
disabled_redis_runners
A list of test runners that should have the Redis service disabled. By
default, all runners execute tests with the Redis service enabled. The list
can be adjusted by specifying the names of runners (e.g., [s3, azure]).
test_azure Include azure job for running tests using [azurite](https://github.com/Azure/Azurite)
to emulate Azure.
test_gcp Include gcp job for running tests using [fake-gcs-server](https://github.com/fsouza/fake-gcs-server)
to emulate GCP.
test_lowerbounds Include lowerbounds job for running tests using lower bounds found in requirements.txt.
test_s3 Include s3 job for running tests using [minio](https://github.com/minio/minio)
to emulate S3.
ci_trigger Value for the `on` clause on workflow/ci.yml (push, pull_request, etc...)
ci_env Environment variables to set for the CI build.
pre_job_template Holds the name and path of a template to be included and run before jobs.
post_job_template Holds the name and path of a template to be included and run after jobs.
lint_requirements Boolean (defaults to True) to enable the upper bound check on requirements.txt.
The next step is to bootstrap the plugin. This will create a functional but useless plugin, with minimal code and tests.
- Run the plugin-template --bootstrap command. This will create a skeleton for your plugin. It will contain a setup.py, the expected plugin layout, and stubs for necessary classes, methods, and tests.
$ ./plugin-template --bootstrap PLUGIN_NAME
In addition to the basic plugin boilerplate, this template also provides a basic set of functional tests using the pulp_smash framework.
In order to use these tests, you will need to address the "FIXME" messages left in places where plugin-writer intervention is required.
At this point, you have a one-off opportunity to use the --all option, which generates everything included in the --bootstrap option, as well as documentation, functional and unit test, and Github Actions configuration file templates that you require to support a plugin.
Note : Regenerating the bootstrap section at a later time will reset all files to their original state, which is almost always not intended.
The next step is to add Github Actions workflows and scripts for continuous integration. These are highly recommended, as they will make continuous verification of your plugin's functionality much easier.
- Run the ./plugin-template --github command to generate the CI config based on the settings in template_config.yml.
$ ./plugin-template --github PLUGIN_NAME
Running the command again will update the plugin with the latest Github Actions CI configuration provided by the plugin-template.
The next step is to add documentation that can be hosted on Read the Docs.
- Run the ./plugin-template --docs command to generate the docs.
$ ./plugin-template --docs PLUGIN_NAME
After bootstrapping, your plugin should be installable and discoverable by Pulp.
- Install your bootstrapped plugin:
pip install -e your_plugin_name
- Start/restart the Pulp Server:
django-admin runserver 24817
- Check that everything worked and you have a remote endpoint:
$ http GET http://localhost:24817/pulp/api/v3/remotes/{{ plugin_app_label }}/{{ plugin_app_label | dash }}/
The plugin specific /pulp/api/v3/content/{{ plugin_app_label | dash }}/ endpoints should now also be available; you can validate this by checking the hosted docs at
http://localhost:24817/pulp/api/v3/docs
Your plugin is discoverable by Pulp because it is [a Django application that subclasses pulpcore.plugin.PulpPluginAppConfig]({{ plugin_name | snake }}/app/__init__.py).
For more information about plugin discoverability, including how it works and plugin entrypoints, see the discoverability documentation.
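For illustration, a minimal app config of this shape might look like the sketch below. The class, module, and label names are hypothetical, and the exact attributes the bootstrap generates can vary between pulpcore versions, so treat this as a sketch rather than the generated file.

from pulpcore.plugin import PulpPluginAppConfig


class PulpExampleAppConfig(PulpPluginAppConfig):
    """AppConfig that makes the plugin discoverable by pulpcore."""

    # Hypothetical values; the bootstrap fills these in from template_config.yml.
    name = "pulp_example.app"
    label = "example"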
First, look at the overview of Pulp Models to understand how Pulp fits these pieces together.
Bootstrapping created various new endpoints (e.g. remote, repository and content). Additional information should be added to these to tell Pulp how to handle your content.
For each of these endpoints, the bootstrap has created a model, a serializer, and a viewset.
The model is how the data is stored in the database.
The serializer converts complex data to easily parsable types (XML, JSON).
The viewset provides the handlers to serve/receive the serialized data.
Always subclass the relevant model, serializer, and viewset from the pulpcore.plugin
namespace. Pulp provides custom behavior for these, and although implementation details
are located in pulpcore.app, plugins should always use pulpcore.plugin instead,
since pulpcore.plugin guarantees plugin API semantic versioning.
Models:
- model(s) for the specific content type(s) used in plugin, should be subclassed from pulpcore.plugin.models.Content model
- model(s) for the plugin specific repository(ies), should be subclassed from pulpcore.plugin.models.Repository model
- model(s) for the plugin specific remote(s), should be subclassed from pulpcore.plugin.models.Remote model
Serializers:
- serializer(s) for plugin specific content type(s), should be subclassed from pulpcore.plugin.serializers.ContentSerializer
- serializer(s) for plugin specific remote(s), should be subclassed from pulpcore.plugin.serializers.RemoteSerializer
- serializer(s) for plugin specific repository(ies), should be subclassed from pulpcore.plugin.serializers.RepositorySerializer
Viewsets:
- viewset(s) for plugin specific content type(s), should be subclassed from pulpcore.plugin.viewsets.ContentViewSet
- viewset(s) for plugin specific repository(ies), should be subclassed from pulpcore.plugin.viewsets.RepositoryViewSet
- viewset(s) for plugin specific remote(s), should be subclassed from pulpcore.plugin.viewsets.RemoteViewSet
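For example, the base classes named in these lists are all imported from the pulpcore.plugin namespace, never from pulpcore.app:

from pulpcore.plugin.models import Content, Remote, Repository
from pulpcore.plugin.serializers import (
    ContentSerializer,
    RemoteSerializer,
    RepositorySerializer,
)
from pulpcore.plugin.viewsets import ContentViewSet, RemoteViewSet, RepositoryViewSet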
Keep namespacing in mind when writing your viewsets.
First model your content type. This file is located at [{{ plugin_name | snake }}/app/models.py]({{ plugin_name | snake }}/app/models.py). Add any fields that correspond to the metadata of your content; these could be the project name, the author name, or any other type of metadata.
The TYPE class attribute is used for filtering purposes.
If a uniqueness constraint is needed, add a Meta class to the model like so:
from django.db import models
from pulpcore.plugin.models import Content

class {{ plugin_app_label | camel }}Content(Content):
    TYPE = '{{ plugin_app_label | dash }}'
    filename = models.TextField(unique=True, db_index=True, blank=False)

    class Meta:
        unique_together = ('filename',)
After adding the model, you can generate the migration with
pulp-manager makemigrations {{ plugin_app_label }}
and then check that all your fields are on the {{ plugin_app_label }} database table.
Next, add a corresponding serializer field in [{{ plugin_name | snake }}/app/serializers.py]({{ plugin_name | snake }}/app/serializers.py). See the DRF documentation on serializer fields to see what's available.
Last, add any additional routes to your [{{ plugin_name | snake }}/app/viewsets.py]({{ plugin_name | snake }}/app/viewsets.py). The content viewset usually doesn't require any additional routes, so you can leave this alone for now.
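As a sketch of what such a serializer might look like, assuming the filename field from the model example above (MyContent and MyContentSerializer are illustrative names, and the base serializer's Meta layout should be checked against the bootstrapped code):

from rest_framework import serializers

from pulpcore.plugin.serializers import ContentSerializer

from .models import MyContent  # hypothetical content model


class MyContentSerializer(ContentSerializer):
    # Expose the model's 'filename' field over the REST API.
    filename = serializers.CharField(help_text="Name of the file")

    class Meta:
        # Assumes the base serializer exposes its fields as a tuple.
        fields = ContentSerializer.Meta.fields + ('filename',)
        model = MyContent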
Remotes provide metadata about how content should be downloaded into Pulp, such as the URL of the remote source, the download policy, and some authentication settings. The base Remote class provided by Pulp Platform provides support for concurrent downloading of remote content.
First model your remote. This file is located at [{{ plugin_name | snake }}/app/models.py]({{ plugin_name | snake }}/app/models.py). Add any fields that correspond to the remote source.
Remember to define the TYPE class attribute, which is used for filtering purposes.
Next, add a corresponding serializer field in [{{ plugin_name | snake }}/app/serializers.py]({{ plugin_name | snake }}/app/serializers.py).
Last, add any additional routes to your [{{ plugin_name | snake }}/app/viewsets.py]({{ plugin_name | snake }}/app/viewsets.py). The remote viewset usually doesn't require any additional routes, so you can leave this alone for now.
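A minimal remote model might look like the following sketch; MyRemote is a hypothetical name, and any fields describing your remote source would be added alongside TYPE:

from pulpcore.plugin.models import Remote


class MyRemote(Remote):
    # Used for filtering, just like the content model's TYPE attribute.
    TYPE = '{{ plugin_app_label | dash }}'

    class Meta:
        # Commonly set in Pulp plugins to avoid reverse-accessor name clashes.
        default_related_name = "%(app_label)s_%(model_name)s"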
A Repository knows the specifics of which Content it supports and defines how to create new RepositoryVersions. It is also responsible for validating that those RepositoryVersions are valid.
First model your repository. This file is located at [{{ plugin_name | snake }}/app/models.py]({{ plugin_name | snake }}/app/models.py). Add any fields as necessary for your specific content type.
Remember to define the TYPE class attribute, which is used for filtering purposes, and CONTENT_TYPES, which
defines which types of content are supported by the Repository. This is a list of classes such as
{{ plugin_app_label | camel }}Content representing the various content types your plugin supports (that you want
this repository type to support, if there is more than one repository type in your plugin).
Also, if you want to validate that the whole collection of content in your RepositoryVersion makes sense
together, you can do so by defining finalize_new_version on your repository model.
Next, add a corresponding serializer field in [{{ plugin_name | snake }}/app/serializers.py]({{ plugin_name | snake }}/app/serializers.py).
Last, add any additional routes to your [{{ plugin_name | snake }}/app/viewsets.py]({{ plugin_name | snake }}/app/viewsets.py). Note that the sync route is predefined for you. This route kicks off a task in [{{ plugin_name | snake }}/app/tasks/synchronizing.py]({{ plugin_name | snake }}/app/tasks/synchronizing.py).
If you have more than one Repository type in your plugin, or you change the name of your existing one, you will also
need to have a RepositoryVersionViewSet defined for it (just a viewset, no other objects needed). This has a field,
parent_viewset, which should be set to the accompanying Repository viewset class defined in your plugin.
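Putting these pieces together, a repository model and its version viewset might look like the sketch below. MyContent, MyRepository, and MyRepositoryViewSet are illustrative names, and the two halves live in models.py and viewsets.py respectively:

# {{ plugin_name | snake }}/app/models.py (sketch)
from pulpcore.plugin.models import Repository


class MyRepository(Repository):
    TYPE = '{{ plugin_app_label | dash }}'
    CONTENT_TYPES = [MyContent]  # the content model(s) this repository may contain

    def finalize_new_version(self, new_version):
        # Optional hook: validate or adjust the complete content set of the
        # new RepositoryVersion before it is saved.
        pass


# {{ plugin_name | snake }}/app/viewsets.py (sketch)
from pulpcore.plugin.viewsets import RepositoryVersionViewSet


class MyRepositoryVersionViewSet(RepositoryVersionViewSet):
    # Points at the accompanying repository viewset so Pulp can build the
    # nested .../repositories/<type>/<uuid>/versions/ routes.
    parent_viewset = MyRepositoryViewSet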
TODO
Tasks such as sync and publish are needed to tell Pulp how to perform certain actions.
TODO
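Until this section is filled in, the sketch below shows the general shape a sync task takes, using the Stages API from pulpcore.plugin.stages. MyContent, MyRemote, MyRepository, and MyFirstStage are hypothetical names, and the details (artifact handling, mirror mode, and so on) should be checked against the pulpcore plugin documentation:

from pulpcore.plugin.stages import DeclarativeContent, DeclarativeVersion, Stage

from .models import MyContent, MyRemote, MyRepository  # hypothetical


def synchronize(remote_pk, repository_pk):
    """Task entry point: create a new RepositoryVersion from the remote source."""
    remote = MyRemote.objects.get(pk=remote_pk)
    repository = MyRepository.objects.get(pk=repository_pk)
    first_stage = MyFirstStage(remote)
    DeclarativeVersion(first_stage, repository).create()


class MyFirstStage(Stage):
    """Fetches and parses remote metadata, then declares content units."""

    def __init__(self, remote):
        super().__init__()
        self.remote = remote

    async def run(self):
        # Parse the remote metadata and emit one DeclarativeContent per unit;
        # later pipeline stages download, save, and associate the units with
        # the new repository version.
        for filename in []:  # placeholder for parsed metadata entries
            unit = MyContent(filename=filename)
            await self.put(DeclarativeContent(content=unit))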
Your bootstrap template comes with a set of prepopulated docs. You can host these on Read the Docs when you are ready.
Pulp also comes with a set of auto-generated API docs. When your plugin is installed, the endpoints in the live API docs will be automatically populated.
When you run the 'make html' command to build the docs, you must have the pulp-api running on
localhost. The 'make html' command first downloads the OpenAPI schema for the plugin and saves it
in docs/_static/api.json. This file then provides the data needed to display the restapi.html
page in the root of the built docs.
The script for generating a CI/CD configuration provided in this repository can be used to change and update said configuration. It should be run with the following command.
$ ./plugin-template --github PLUGIN_NAME
The default behavior enables two build jobs that generate client libraries using the OpenAPI schema. One publishes to PyPI using the secret environment variable called $PYPI_API_TOKEN. The other job publishes the client to rubygems.org and requires the $RUBYGEMS_API_KEY secret to be set. Both environment variables can be set in the Github secrets settings page for the plugin repository. The job that publishes tagged builds to PyPI uses the same configs as the client publishing job.
The before_install.sh, install.sh, before_script.sh, and script.sh scripts can be augmented by plugin
writers by creating specially named scripts in their .github/workflows/scripts/ directory. The
scripts are executed in the following order, with the optional plugin-provided hooks being the pre_* and post_* entries:
- pre_before_install.sh
- before_install.sh
- post_before_install.sh
- install.sh
- pre_before_script.sh
- before_script.sh
- post_before_script.sh
- script.sh
- post_docs_test.sh
- post_script.sh
- Plugin django app is defined using PulpPluginAppConfig as a parent
- Plugin entry point is defined
- Necessary models/serializers/viewsets are defined. At a minimum:
- models for plugin content type, repository, remote
- serializers for plugin content type, repository, remote
- viewset for plugin content type, repository, remote
- Database migrations are generated and committed
- Errors are handled according to Pulp conventions
- Docs for plugin are available (any location and format preferred and provided by plugin writer)
plugin_template uses towncrier to manage a changelog for plugin writers to view. Whenever there is a major change to plugin_template, we recommend generating the changelog and tagging it.
The version is the date in the format YYYY.MM.DD (e.g. "2020.08.11"). If there is more than one release on a day, you can append a number to the end after a hyphen (e.g. "2020.08.11-1").
- First, generate the changelog with your version (towncrier --yes --version 2020.08.11).
- Check in the new changelog, push, and open your PR.
- After the PR is merged, create a tag pointing to the changelog commit (git tag 2020.08.11 9fceb02).
- Push your tag (git push origin 2020.08.11).
- Unless a change is small or doesn't affect plugin writers, create an issue on https://pulp.plan.io/projects/pulp. Add the tag "Plugin Template".
- Add a changelog update.
- Write an excellent Commit Message. Make sure you reference and link to the issue.
- Push your branch to your fork and open a Pull request across forks.