OpenShift WebHook Proxy Service

This is a webhook proxy service for translating the payloads of webhook callbacks from external services into the format expected by OpenShift for triggering a new build/deployment.

At this time the proxy understands webhook callbacks generated by the following services:

  • Travis-CI

Deploying the service

To deploy the service within your OpenShift environment, run the command:

oc create -f https://raw.githubusercontent.com/GrahamDumpleton/openshift3-webhook-proxy/master/openshift.json

You can create this within the same project namespace as the application which the webhook notification will target, or in a separate self-contained project. Using a separate project is better, as the service then cannot interact directly with your other applications in any way. The service only needs to talk to the publicly exposed webhook API endpoint of the OpenShift cluster, so being confined to its own project is fine.
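
If you want to use a separate self-contained project, you could, for example, first create a new project and then create the service within it. The project name notifications used here is just an example, matching the host name shown later:

oc new-project notifications
oc create -f https://raw.githubusercontent.com/GrahamDumpleton/openshift3-webhook-proxy/master/openshift.json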

The service will be exposed via an automatically generated external host name. Both HTTP and HTTPS endpoints will be exposed. An HTTP endpoint is exposed because, if you are using an OpenShift cluster with a self-signed certificate for HTTPS connections, some external webhook sources which perform SSL certificate validation may not work. In those cases you will need to use the HTTP endpoint.

To determine the external host name allocated to the service, you can use the oc describe route command. For example, if the project the service was created in was called notifications, you would see something like:

$ oc describe route/webhook-proxy | grep Host
Requested Host:         webhook-proxy-notifications.a123.apps.example.com

Although the service currently only supports Travis-CI, pull requests to add support for other external services which you may want to proxy into OpenShift to trigger a new build/deployment are most welcome.

Configuring Travis-CI

To use the webhook proxy with Travis-CI you need to perform two steps.

The first is to determine the authorisation secret that Travis-CI will send with any webhook. The generic webhook trigger of the build configuration for your application within OpenShift then needs to be updated with this secret.

The Travis-CI documentation on webhook notifications explains how to calculate the authorisation secret.

Note that your GitHub username, when used in the calculation to generate the authorisation secret, is case sensitive.
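
If you prefer to calculate the secret yourself, the following is a minimal Python sketch. It assumes the scheme described in the Travis-CI documentation, namely the SHA-256 hex digest of the repository slug (your GitHub username and the repository name, for example username/repository) concatenated with your Travis-CI token; verify this against the Travis-CI documentation before relying on it, and replace the placeholder values with your own.

import hashlib

github_username = "your-github-username"   # case sensitive
repository = "your-repository-name"
travis_token = "your-travis-ci-token"      # from your Travis-CI profile page

# SHA-256 hex digest of "<username>/<repository>" followed by the token.
secret = hashlib.sha256(
    (github_username + "/" + repository + travis_token).encode("utf-8")).hexdigest()

print(secret)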

Once you have generated the authorisation secret, run oc edit on the build configuration for your application in OpenShift and update the generic webhook trigger:

  triggers:
  - generic:
      secret: replace-this-with-authorisation-secret
    type: Generic
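
To confirm the change was applied, you can describe the build configuration and check that the generic webhook trigger is listed. Here myapp is a placeholder for the name of your application's build configuration, matching the later examples:

oc describe bc/myapp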

This sets up the OpenShift side. Next, configure Travis-CI through the .travis.yml file which is part of the application repository.

For this you need to add a notifications section containing a webhooks sub-section. Details of setting up this type of notification can be found in the Travis-CI documentation.

The important part of this is what URL to use. Because the webhook needs to be routed via the proxy service, the URL must be that of the proxy. For example:

language: python
python:
  - "2.7"
install:
  - pip install -r requirements.txt
script:
  - python manage.py test
notifications:
  webhooks:
    urls:
      - http://webhook-proxy-notifications.a123.apps.example.com/travis-ci/api.example.com/myproject/myapp
    on_success: always
    on_failure: never
    on_start: never

The format of the URL is:

http://<webhook-proxy-host>/travis-ci/<openshift-api-host>/<project>/<application>

If your OpenShift cluster is using a self-signed SSL certificate, use http for the scheme so that Travis-CI does not fail SSL certificate validation when calling the proxy.
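
In the .travis.yml example above, the path components tell the proxy to target the myapp build configuration in the myproject project of the OpenShift cluster whose API is at api.example.com. For reference, OpenShift 3's generic webhook endpoint, which the proxy passes the translated payload through to, has a form roughly like the following, although the exact path can vary between OpenShift versions:

https://<openshift-api-host>/oapi/v1/namespaces/<project>/buildconfigs/<application>/webhooks/<secret>/generic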

Once set up, commit and push the change to your code repository. When Travis-CI picks up the push and the tests have been run, the webhook should be triggered. This will go to the webhook proxy service, which will translate the webhook payload into the format expected by OpenShift and pass it through to OpenShift.

The result should be that a new build of your application is triggered.

Enabling debug logging

If you are having issues and want to see details of the inbound webhook call and the subsequent webhook call into OpenShift, you can enable debug logging. To do this, run:

oc set env dc/webhook-proxy DEBUG=true

Then look at the logs for the pod running the instance of the service.
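
For example, assuming the deployment configuration is named webhook-proxy as in the command above, you should be able to view the logs of the current deployment with:

oc logs dc/webhook-proxy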