Deploy Kong 1.0 clusters to Heroku Common Runtime and Private Spaces using the Kong buildpack.
Upgrading from an earlier version? See the Upgrade Guide.
This is a community proof-of-concept, MIT licensed, provided "as is", without warranty of any kind.
Kong is an extensible web proxy based on OpenResty, a web app framework built on the embedded Lua language capabilities of the Nginx web server.
With Heroku, Kong may be used for a variety of purposes. A few examples:
- unify access control & observability for a suite of microservices
- enforce request rate & size limits globally, based on the endpoint, or the authenticated consumer
- create a single management point for routing requests based on DNS hostnames, URL paths, and HTTP headers
Visit Kong HQ, the official resource for everything Kong.
Use the deploy button to create a Kong app in your Heroku account:
To make changes to the Kong app's source, clone and connect this repo (or your own fork) to the Heroku app:
git clone https://github.com/heroku/heroku-kong.git
cd heroku-kong
# Use the name of the Heroku app:
heroku git:remote --app $APP_NAME
heroku info
To gain local console access to Kong deployed on Heroku, see ADMIN.
Console access is primarily useful for performing `kong` CLI commands against the deployed app. Most administrative features do not require console access and instead are available through the Kong Admin API.
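For example, a single `kong` CLI command can be run in a one-off dyno without a full console session (a sketch following the same `heroku run` pattern as the migration command later in this README; replace `$APP_NAME` with your app's name):

heroku run "kong version" --app $APP_NAME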
When this app is deployed to Heroku, it automatically provisions a protected, external-facing loopback proxy to Kong's Admin API, secured by the `KONG_HEROKU_ADMIN_KEY` config var. `KONG_HEROKU_ADMIN_KEY` is generated automatically when this app is deployed using the automated app setup.
You can explicitly set a new admin key value:
heroku config:set KONG_HEROKU_ADMIN_KEY=xxxxx
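If you want the new key to be a strong random value, one option is to generate it with `openssl` (an assumption for illustration; any secret generator works):

heroku config:set KONG_HEROKU_ADMIN_KEY=$(openssl rand -hex 16)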
Make HTTPS requests using a tool like `curl` or Paw.cloud:
- the base URL of the app's Kong Admin API is `https://$APP_NAME.herokuapp.com/kong-admin`
- set the current admin key in the `apikey` HTTP header
For example, set the current admin key into a local shell variable:
KONG_HEROKU_ADMIN_KEY=`heroku config:get KONG_HEROKU_ADMIN_KEY`
Now use the following HTTP request style to interact with Kong's Admin API:
Replace the variable `$APP_NAME` with the value for your unique deployment.
curl -H "apikey: $KONG_HEROKU_ADMIN_KEY" https://$APP_NAME.herokuapp.com/kong-admin/status
If you prefer to only use the console-based Admin API, then this externally-facing proxy can be disabled:
curl -H "apikey: $KONG_HEROKU_ADMIN_KEY" https://$APP_NAME.herokuapp.com/kong-admin/services/kong-admin/routes
# Use the returned Route's `id` in place of `$ROUTE_ID`:
curl -H "apikey: $KONG_HEROKU_ADMIN_KEY" -X DELETE https://$APP_NAME.herokuapp.com/kong-admin/routes/$ROUTE_ID
# Now there's no longer admin access!
# Finally, clear out the old admin key value.
heroku config:unset KONG_HEROKU_ADMIN_KEY
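The same teardown can be scripted end to end; this sketch assumes `jq` is installed and that only one Route is attached to the `kong-admin` Service:

KONG_ADMIN=https://$APP_NAME.herokuapp.com/kong-admin
ROUTE_ID=$(curl -s -H "apikey: $KONG_HEROKU_ADMIN_KEY" $KONG_ADMIN/services/kong-admin/routes | jq -r '.data[0].id')
curl -H "apikey: $KONG_HEROKU_ADMIN_KEY" -X DELETE $KONG_ADMIN/routes/$ROUTE_ID
heroku config:unset KONG_HEROKU_ADMIN_KEY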
Kong may be provisioned and configured on Heroku using HashiCorp Terraform and a third-party Kong provider.
See these examples of Using Terraform with Heroku:
- Common Runtime microservices with a unified gateway
- Private Spaces microservices with a unified gateway
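Each linked example is applied with the standard Terraform CLI workflow (shown here only as a sketch; the providers, variables, and resources live in the example repos themselves):

terraform init
terraform plan
terraform apply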
Potentially breaking changes. Please attempt upgrades on a staging system before upgrading production.
Buildpack v6.0.0 supports rapid deployments using a pre-compiled Kong binary. A pre-existing, customized app may require changes to continue functioning if it explicitly uses the `/app/.heroku` directory prefix.
Buildpack v7.0.0-rc* supports Kong 1.0 release candidates. The "rc" releases only support upgrading from Kong 0.14, not earlier versions or other release candidates.
Buildpack v7.0.0 supports Kong 1.0.
First, see Kong's official upgrade path.
Then, take into account these facts about how this Kong on Heroku app works:
- this app automatically runs `kong migrations up` for every deployment
- you may prevent the previous version of Kong from attempting to use the new database schema during the upgrade (this will cause downtime):
  - check the current formation size with `heroku ps`
  - scale the web workers down: `heroku ps:scale web=0`
  - perform the upgrade
  - allow the release process to run
  - finally, restart at the original formation size: `heroku ps:scale web=$PREVIOUS_SIZE`
- once Kong 1.0 is successfully deployed, execute:
heroku run "kong migrations finish --conf /app/config/kong.conf"
Please open an issue if you encounter problems or have feedback about this process.
Kong may be customized through configuration and plugins.
Kong is automatically configured at runtime with a `.profile.d` script, which:
- renders the `config/kong.conf` file based on:
  - the customizable `config/kong.conf.etlua` template
  - the values of config vars, as defined in the buildpack
- exports environment variables (see `.profile.d/kong-env` in a running dyno)

All file-based config may be overridden by setting `KONG_`-prefixed config vars, e.g. `heroku config:set KONG_LOG_LEVEL=debug`
See: Kong 1.0 Configuration Reference
See: buildpack usage
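For example, to confirm that an override was rendered into the runtime config file (using the same `/app/config/kong.conf` path referenced by the upgrade command above):

heroku config:set KONG_LOG_LEVEL=debug
# …after the restart, inspect the rendered file in a one-off dyno:
heroku run "grep log_level /app/config/kong.conf"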
Usage examples and sample plugins are included with this Heroku Kong app.
Demo: API Rate Limiting
Request this Bay Lights API more than five times in a minute, and you'll get HTTP Status 429: API rate limit exceeded, along with `X-Ratelimit-Limit-Minute` & `X-Ratelimit-Remaining-Minute` headers to help API consumers regulate their usage.
Try it in your shell terminal:
curl --head https://kong-proxy-public.herokuapp.com/bay-lights/lights
# HTTP/1.1 200 OK
curl --head https://kong-proxy-public.herokuapp.com/bay-lights/lights
# HTTP/1.1 200 OK
curl --head https://kong-proxy-public.herokuapp.com/bay-lights/lights
# HTTP/1.1 200 OK
curl --head https://kong-proxy-public.herokuapp.com/bay-lights/lights
# HTTP/1.1 200 OK
curl --head https://kong-proxy-public.herokuapp.com/bay-lights/lights
# HTTP/1.1 200 OK
curl --head https://kong-proxy-public.herokuapp.com/bay-lights/lights
# HTTP/1.1 429
Here's the whole configuration for this API rate limiter:
curl http://localhost:8001/services/ -i -X POST \
--data 'name=bay-lights' \
--data 'protocol=https' \
--data 'port=443' \
--data 'host=bay-lights-api-production.herokuapp.com'
# Note the Service ID returned in the previous response; use it in place of `$SERVICE_ID`.
curl http://localhost:8001/plugins/ -i -X POST \
--data 'name=request-size-limiting' \
--data "config.allowed_payload_size=8" \
--data "service.id=$SERVICE_ID"
curl http://localhost:8001/plugins/ -i -X POST \
--data 'name=rate-limiting' \
--data "config.minute=5" \
--data "service.id=$SERVICE_ID"
curl http://localhost:8001/routes/ -i -X POST \
--data 'paths[]=/bay-lights' \
--data "service.id=$SERVICE_ID"
Custom plugins allow you to observe and transform HTTP traffic using lightweight, high-performance Lua code in Nginx request processing contexts. Building on the previous example, let's add a simple plugin to Kong.
hello-world-header will add an HTTP response header X-Hello-World showing the date and a message from an environment variable.
Activate this plugin for the API:
curl http://localhost:8001/plugins/ -i -X POST \
--data 'name=hello-world-header' \
--data "service.id=$SERVICE_ID"
Then, set a message through the Heroku config var:
heroku config:set HELLO_WORLD_MESSAGE='Hello!'
# …the app will restart.
Now, when fetching an API response, notice the X-Hello-World header:
curl --head https://kong-proxy-public.herokuapp.com/bay-lights/lights
# HTTP/1.1 200 OK
# Connection: keep-alive
# Content-Type: application/json;charset=utf-8
# Content-Length: 9204
# X-Ratelimit-Limit-Minute: 5
# X-Ratelimit-Remaining-Minute: 4
# Server: Cowboy
# Date: Mon, 28 Aug 2017 23:14:47 GMT
# Strict-Transport-Security: max-age=31536000
# X-Content-Type-Options: nosniff
# Vary: Accept-Encoding
# Request-Id: 6c815aae-a5e9-496f-b731-dc72bbe2b63e
# Via: kong/0.14.0, 1.1 vegur
# X-Hello-World: Today is 2017-08-28. Hello! <--- The injected header
# X-Kong-Upstream-Latency: 49
# X-Kong-Proxy-Latency: 161
JSON/REST has taken over as the internet API lingua franca, shedding the complexity of XML/SOAP. The National Digital Forecast Database [NDFD] is a legacy XML/SOAP service.
This app includes a sample, custom plugin ndfd-xml-as-json. This plugin exposes a JSON API that returns the maximum temperatures forecast for a location from the NDFD SOAP service. Using the single-resource concept of REST, the many variations of SOAP or other legacy interfaces may be broken out into elegant, individual JSON APIs.
Try it in your shell terminal:
curl https://kong-proxy-public.herokuapp.com/ndfd-max-temps \
--data '{"latitude":37.733795,"longitude":-122.446747}'
# Response contains max temperatures forecast for San Francisco, CA
curl https://kong-proxy-public.herokuapp.com/ndfd-max-temps \
--data '{"latitude":27.964157,"longitude":-82.452606}'
# Response contains max temperatures forecast for Tampa, FL
curl https://kong-proxy-public.herokuapp.com/ndfd-max-temps \
--data '{"latitude":41.696629,"longitude":-71.149994}'
# Response contains max temperatures forecast for Fall River, MA
Much more elegant than the legacy API. Compare with the sample SOAP request body:
curl --data @spec/data/ndfd-request.xml -H 'Content-Type:text/xml' -X POST https://graphical.weather.gov/xml/SOAP_server/ndfdXMLserver.php
# Response contains wrapped XML data. Enjoy decoding that.
This technique may be used to create a suite of cohesive JSON APIs out of various legacy APIs.
Here's the configuration for this API translator:
curl http://localhost:8001/services/ -i -X POST \
--data 'name=ndfd-max-temps' \
--data 'protocol=https' \
--data 'port=443' \
--data 'host=graphical.weather.gov' \
--data 'path=/xml/SOAP_server/ndfdXMLserver.php'
# Note the Service ID returned in the previous response; use it in place of `$SERVICE_ID`.
curl http://localhost:8001/plugins/ -i -X POST \
--data 'name=request-size-limiting' \
--data "config.allowed_payload_size=8" \
--data "service.id=$SERVICE_ID"
curl http://localhost:8001/plugins/ -i -X POST \
--data 'name=rate-limiting' \
--data "config.minute=5" \
--data "service.id=$SERVICE_ID"
curl http://localhost:8001/plugins/ -i -X POST \
--data 'name=ndfd-xml-as-json' \
--data "service.id=$SERVICE_ID"
curl http://localhost:8001/routes/ -i -X POST \
--data 'paths[]=/ndfd-max-temps' \
--data "service.id=$SERVICE_ID"
See the implementation of the custom plugin's Lua source code, unit tests, and integration tests.
- Definitely an OpenResty guide
- An Introduction To OpenResty - Part 1, 2, & 3
- Nginx API for Lua, the `ngx` reference, for use in Kong plugins
  - Nginx variables, accessible through `ngx.var`
- Lua 5.1 (note: Kong is not compatible with the newest Lua version)
- Classic Objects, the basis of Kong's plugins
- Moses, functional programming
- Lubyk, realtime programming (performance- & game-oriented)
- resty-http, Nginx-Lua co-routine based HTTP client
- Serpent, inspect values
- Busted, testing framework
To work with Kong locally on macOS:
If you haven't already, clone and connect your own fork of this repo to the Heroku app:
# Replace the main repo with your own fork:
git clone https://github.com/heroku/heroku-kong.git
cd heroku-kong
# Use the name of the Heroku app:
heroku git:remote --app $APP_NAME
heroku info
- Ensure requirements are met
- Create the Postgres user & databases:
  - `createuser --pwprompt kong` (set the password "kong")
  - `createdb --owner=kong kong_dev`
  - `createdb --owner=kong kong_tests`
- Execute `./bin/setup`

Start Kong locally with `bin/start`:
- logs are in `/usr/local/var/kong/logs/`
- the prefix is `/usr/local/var/kong` for commands like:
  - `kong health -p /usr/local/var/kong`
  - `kong stop -p /usr/local/var/kong`
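Once Kong is running, a quick smoke test against the local Admin API (assuming the dev config keeps Kong's default admin port, 8001):

curl -i http://localhost:8001/status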
Any test-specific Lua rocks should be specified in the `Rockfile_test` file, so that they are not installed when the app is deployed.

Add tests in `spec/`, then run them with:

bin/test