# Micro — Async HTTP microservices

- Easy. Designed for usage with `async` and `await`.
- Fast. Ultra-high performance (even JSON parsing is opt-in).
- Micro. The whole project is ~100 lines of code.
- Agile. Super easy deployment and containerization.
- Simple. Oriented for single-purpose modules (one exported function).
- Explicit. No middleware. Modules declare all dependencies.
- Standard. Just HTTP!
The following example `sleep.js` will wait before responding (without blocking!):

```js
import { send } from 'micro';
import sleep from 'then-sleep';

export default async function (req, res) {
  await sleep(500);
  send(res, 200, 'Ready!');
}
```
To run the microservice on port `3000`, use the `micro` command:

```bash
$ micro -p 3000 sleep.js
```
```
Usage: micro [options] <file>

Options:

  -h, --help      output usage information
  -V, --version   output the version number
  -p, --port      Port to listen on (3000)
  -n, --no-babel  Skip Babel transformation
```
By default, `micro` will transpile the target file and its relative dependencies so that ES6 and `async`/`await` work for you.

For production, we recommend you first transpile and use `--no-babel` to make bootup time much faster. That said, if you don't care about how long it takes to boot, the default flags are perfectly suitable for production.

Read more about Transpilation to understand what transformations are recommended.
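
For example, a production setup could transpile ahead of time and then point `micro` at the output with `--no-babel`. This is just a sketch: it assumes `babel-cli` with the configuration listed under Transpilation below, and the file names are illustrative:

```bash
# Transpile once at build time, then skip Babel at runtime for faster bootup
babel sleep.js --out-file sleep.built.js
micro -p 3000 --no-babel sleep.built.js
```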
### `micro(fn, { onError = null })`

- This function is exposed as the `default` export.
- Use `import micro from 'micro'` or `require('micro')`.
- Returns an `http.Server` that uses the provided `fn` as the request handler.
- The supplied function is run with `await`. It can be `async`!
- The `onError` function is invoked with `req, res, err` if supplied (see Error Handling).
- Example:
```js
import micro from 'micro';
import sleep from 'then-sleep';

const srv = micro(async function (req, res) {
  await sleep(500);
  res.writeHead(200);
  res.end('woot');
});

srv.listen(3000);
```
### `json(req, { limit = '1mb' })`

- Use `import { json } from 'micro'` or `require('micro').json`.
- Buffers and parses the incoming body and returns it.
- Exposes an `async` function that can be run with `await`.
- `limit` is the maximum amount of data that is buffered before parsing. If it's exceeded, an `Error` is thrown with `statusCode` set to `413` (see Error Handling). It can be a `Number` of bytes or a string like `'1mb'` (a sketch with a larger limit follows the example below).
- If JSON parsing fails, an `Error` is thrown with `statusCode` set to `400` (see Error Handling).
- Example:
```js
import { json, send } from 'micro';

export default async function (req, res) {
  const data = await json(req);
  console.log(data.price);
  send(res, 200);
}
```
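
If you need to accept larger payloads, the `limit` option can be raised. Here's a brief sketch; the `'5mb'` value and the response shape are purely illustrative:

```js
import { json, send } from 'micro';

export default async function (req, res) {
  // Buffer up to 5mb before parsing instead of the 1mb default
  const data = await json(req, { limit: '5mb' });
  send(res, 200, { received: Object.keys(data).length });
}
```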
### `send(res, statusCode, data = null)`

- Use `import { send } from 'micro'` or `require('micro').send`.
- `statusCode` is a `Number` with the HTTP status code, and must always be supplied.
- If `data` is supplied and is an `object`, it's automatically serialized as JSON. `Content-Type` and `Content-Length` are automatically set.
- If JSON serialization fails (for example, if a cyclical reference is found), a `400` error is thrown (see Error Handling).
- Example:
```js
import { send } from 'micro';

export default async function (req, res) {
  send(res, 400, { error: 'Please use a valid email' });
}
```
### `sendError(req, res, error)`

- Use `import { sendError } from 'micro'` or `require('micro').sendError`.
- Used as the default handler for `onError`.
- Automatically sets the status code of the response based on `error.statusCode`.
- Sends the `error.message` as the body.
- During development (when `NODE_ENV` is set to `'development'`), stacks are printed out with `console.error` and also sent in responses.
- Usually, you don't need to invoke this method yourself, as you can use the built-in error handling flow with `throw` (see the sketch below).
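
For illustration, here's a minimal sketch of calling `sendError` yourself from a custom `onError` handler; the handler and the logging are placeholders, not part of `micro`:

```js
import micro, { sendError } from 'micro';

const handler = async function (req, res) {
  throw new Error('Something went wrong');
};

const logAndSend = async (req, res, err) => {
  // Do your own logging, then fall back to the default behavior
  console.error(err.stack);
  sendError(req, res, err);
};

micro(handler, { onError: logAndSend }).listen(3000);
```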
## Error handling

Micro allows you to write robust microservices. This is accomplished primarily by bringing sanity back to error handling and avoiding callback soup.

If an error is thrown and not caught by you, the response will automatically be `500`. Important: during development mode (if the env variable `NODE_ENV` is `'development'`), error stacks will be printed with `console.error` and included in the responses.

If the `Error` object that's thrown contains a `statusCode` property, that's used as the HTTP code to be sent. Let's say you want to write a rate-limiting module:

```js
import rateLimit from 'my-rate-limit';

export default async function (req, res) {
  await rateLimit(req);
  // … your code
}
```
If the API endpoint is abused, it can throw an error like so:
```js
if (tooMany) {
  const err = new Error('Rate limit exceeded');
  err.statusCode = 429;
  throw err;
}
```
The nice thing about this model is that the `statusCode` is merely a suggestion. The user can override it:

```js
try {
  await rateLimit(req);
} catch (err) {
  if (429 == err.statusCode) {
    // perhaps send 500 instead?
    send(res, 500);
  }
}
```
If the error is based on another error that Micro caught, like a `JSON.parse` exception, then `originalError` will point to it.

If a generic error is caught, the status will be set to `500`.

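
As a sketch, an endpoint could inspect that underlying error itself, for example when `json` rejects invalid input (the response text here is just an illustration):

```js
import { json, send } from 'micro';

export default async function (req, res) {
  try {
    const data = await json(req);
    send(res, 200, data);
  } catch (err) {
    // err.statusCode is 400 for a parse failure; the underlying
    // JSON.parse error is available as err.originalError
    console.error(err.originalError);
    send(res, err.statusCode || 500, 'Invalid JSON payload');
  }
}
```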
In order to set up your own error handling mechanism, you can pass a custom `onError` function to `micro`:

```js
const myErrorHandler = async (req, res, err) => {
  // your own logging here
  res.writeHead(500);
  res.end('error!');
};

micro(handler, { onError: myErrorHandler });
```
## Testing

Micro makes tests compact and a pleasure to read and write. We recommend ava, a highly parallel micro test framework with built-in support for async tests:

```js
import test from 'ava';
import listen from './listen';
import { send } from 'micro';
import request from 'request-promise';

test('my endpoint', async t => {
  const fn = async function (req, res) {
    send(res, 200, { test: 'woot' });
  };

  const url = await listen(fn);
  // json: true makes request-promise parse the JSON response body
  const body = await request({ uri: url, json: true });

  t.same(body.test, 'woot');
});
```
Look at the `test/_listen` helper for a function that returns a URL with an ephemeral port every time it's called.

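
Such a helper could be as simple as the following sketch (an illustration, not the actual `test/_listen` source; it assumes the handler is wrapped with `micro` and bound to port `0` so the OS assigns an ephemeral port):

```js
import micro from 'micro';

export default function listen (fn) {
  const srv = micro(fn);

  return new Promise(resolve => {
    // Listening on port 0 lets the OS pick a free ephemeral port
    srv.listen(0, () => {
      resolve(`http://localhost:${srv.address().port}`);
    });
  });
}
```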
## Transpilation

The Babel configuration `micro` uses is:

```json
{
  "presets": ["es2015"],
  "plugins": [
    "transform-runtime",
    "syntax-async-functions",
    "transform-async-to-generator"
  ]
}
```
These require the following npm modules (versions might vary):

```json
{
  "babel-plugin-syntax-async-functions": "6.3.13",
  "babel-plugin-transform-async-to-generator": "6.4.6",
  "babel-plugin-transform-runtime": "6.4.3",
  "babel-preset-es2015": "6.3.13"
}
```
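
For example, they could be installed as development dependencies (the exact versions above may have drifted; pin whatever matches your setup):

```bash
npm install --save-dev babel-preset-es2015 babel-plugin-transform-runtime babel-plugin-syntax-async-functions babel-plugin-transform-async-to-generator
```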
## Deployment

You can use the `micro` CLI for `npm start`:

```json
{
  "name": "my-app-subscribe",
  "dependencies": {
    "micro": "x.y.z"
  },
  "scripts": {
    "start": "micro -p 3000 subscribe.js"
  }
}
```
Then your `Dockerfile` can look like this:

```dockerfile
FROM node:argon
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app/
RUN npm install
COPY . /usr/src/app
EXPOSE 3000
CMD [ "npm", "start" ]
```
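
With that in place, building and running the container is the usual Docker workflow (the image name here is just an example):

```bash
docker build -t my-micro-service .
docker run -p 3000:3000 my-micro-service
```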
## Contributing

- Run `gulp help` to see available tasks.
- Before submitting a PR, please run `gulp lint` and `gulp test`.
- We use `standard` + semicolons.
- Please be welcoming.
## Credits

- Thanks to Tom Yandell and Richard Hodgson for donating the `micro` npm name.
- Copyright © 2016 Zeit, Inc. and project authors.
- Licensed under MIT.