vercel/micro

Micro — Asynchronous HTTP microservices

Features

  • Easy: Designed for usage with async and await
  • Fast: Ultra-high performance (even JSON parsing is opt-in)
  • Micro: The whole project is ~260 lines of code
  • Agile: Super easy deployment and containerization
  • Simple: Oriented for single-purpose modules (function)
  • Standard: Just HTTP!
  • Explicit: No middleware - modules declare all dependencies
  • Lightweight: With all dependencies, the package weighs less than a megabyte

Disclaimer: Micro was created for use within containers and is not intended for use in serverless environments. For those using Vercel, this means that there is no need to use Micro in your projects, as the benefits it provides are not applicable to the platform. Utility features provided by Micro, such as json, are readily available in the form of Serverless Function helpers.

Installation

Important: Micro is only meant to be used in production. In development, you should use micro-dev, which provides you with a tool belt specifically tailored for developing microservices.

To prepare your microservice for running in the production environment, first install micro:

npm install --save micro

Usage

Create an index.js file and export a function that accepts the standard http.IncomingMessage and http.ServerResponse objects:

module.exports = (req, res) => {
  res.end('Welcome to Micro');
};

Micro provides useful helpers but also handles return values, so you can write it even shorter!

module.exports = () => 'Welcome to Micro';

Next, ensure that the main property inside package.json points to your microservice (which is inside index.js in this example) and add a start script:

{
  "main": "index.js",
  "scripts": {
    "start": "micro"
  }
}

Once all of that is done, the server can be started like this:

npm start

Then go to this URL: http://localhost:3000 - 🎉

Command line

micro - Asynchronous HTTP microservices

USAGE

$ micro --help
$ micro --version
$ micro [-l listen_uri [-l...]] [entry_point.js]

By default micro will listen on 0.0.0.0:3000 and will look first
for the "main" property in package.json and subsequently for index.js
as the default entry_point.

Specifying a single --listen argument will overwrite the default, not supplement it.

OPTIONS

--help shows this help message

-v, --version displays the current version of micro

-l, --listen listen_uri specify a URI endpoint on which to listen (see below) -
more than one may be specified to listen in multiple places

ENDPOINTS

Listen endpoints (specified by the --listen or -l options above) instruct micro
to listen on one or more interfaces/ports, UNIX domain sockets, or Windows named pipes.

For TCP (traditional host/port) endpoints:

$ micro -l tcp://hostname:1234

For UNIX domain socket endpoints:

$ micro -l unix:/path/to/socket.sock

For Windows named pipe endpoints:

$ micro -l pipe:\\.\pipe\PipeName

async & await

Examples

Micro is built for usage with async/await.

const sleep = require('then-sleep');

module.exports = async (req, res) => {
  await sleep(500);
  return 'Ready!';
};

Port Based on Environment Variable

When you want to set the port using an environment variable you can use:

micro -l tcp://0.0.0.0:$PORT

Optionally you can add a default if it suits your use case:

micro -l tcp://0.0.0.0:${PORT-3000}

${PORT-3000} will allow a fallback to port 3000 when $PORT is not defined.

Note that this shell parameter expansion works in Bash and other POSIX-style shells, but not in Windows cmd.
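The fallback behavior is easy to check directly in a shell (purely illustrative):

```shell
# With PORT unset, ${PORT-3000} falls back to the default
unset PORT
echo "tcp://0.0.0.0:${PORT-3000}"
# -> tcp://0.0.0.0:3000

# With PORT set, its value is used instead
PORT=8080
echo "tcp://0.0.0.0:${PORT-3000}"
# -> tcp://0.0.0.0:8080
```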

Body parsing

Examples

For parsing the incoming request body, we include the async functions buffer, text and json:

const { buffer, text, json } = require('micro');

module.exports = async (req, res) => {
  const buf = await buffer(req);
  console.log(buf);
  // <Buffer 7b 22 70 72 69 63 65 22 3a 20 39 2e 39 39 7d>
  const txt = await text(req);
  console.log(txt);
  // '{ "price": 9.99}'
  const js = await json(req);
  console.log(js.price);
  // 9.99
  return '';
};

API

buffer(req, { limit = '1mb', encoding = 'utf8' })
text(req, { limit = '1mb', encoding = 'utf8' })
json(req, { limit = '1mb', encoding = 'utf8' })
  • Buffers and parses the incoming body and returns it.
  • Exposes an async function that can be run with await.
  • Can be called multiple times, as it caches the raw request body the first time.
  • limit is the maximum amount of data aggregated before parsing; if exceeded, an Error is thrown with statusCode set to 413 (see Error Handling). It can be a Number of bytes or a string like '1mb'.
  • If JSON parsing fails, an Error is thrown with statusCode set to 400 (see Error Handling).

For other types of data, check the examples.
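To make the size-limit behavior described above concrete, here is a minimal, hypothetical sketch of how a body helper could buffer a request and reject oversized payloads with a 413 error. This is not micro's actual implementation; readBody and its limit handling are illustrative assumptions only.

```javascript
// Hypothetical sketch of a body-buffering helper with a size limit.
// NOT micro's source; `readBody` is an illustrative name.
function readBody(req, limit = 1024 * 1024) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    let received = 0;
    req.on('data', (chunk) => {
      received += chunk.length;
      if (received > limit) {
        // Mirror the documented behavior: 413 when the limit is exceeded
        const err = new Error('Body exceeded limit');
        err.statusCode = 413;
        req.destroy();
        reject(err);
        return;
      }
      chunks.push(chunk);
    });
    req.on('end', () => resolve(Buffer.concat(chunks)));
    req.on('error', reject);
  });
}
```

A json-style helper would then simply be `JSON.parse` over the resolved buffer, converting parse failures into 400 errors.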

Sending a different status code

So far we have used return to send data to the client. return 'Hello World' is the equivalent of send(res, 200, 'Hello World').

const { send } = require('micro');

module.exports = async (req, res) => {
  const statusCode = 400;
  const data = { error: 'Custom error message' };

  send(res, statusCode, data);
};
send(res, statusCode, data = null)
  • Use require('micro').send.
  • statusCode is a Number with the HTTP status code, and must always be supplied.
  • If data is supplied, it is sent in the response. Different input types are processed appropriately, and Content-Type and Content-Length are automatically set.
    • Stream: data is piped as an octet-stream. Note: it is your responsibility to handle the error event in this case (usually, simply logging the error and aborting the response is enough).
    • Buffer: data is written as an octet-stream.
    • object: data is serialized as JSON.
    • string: data is written as-is.
  • If JSON serialization fails (for example, if a cyclical reference is found), a 400 error is thrown. See Error Handling.

Programmatic use

You can use Micro programmatically by requiring Micro directly:

const http = require('http');
const sleep = require('then-sleep');
const { serve } = require('micro');

const server = new http.Server(
  serve(async (req, res) => {
    await sleep(500);
    return 'Hello world';
  }),
);

server.listen(3000);
serve(fn)
  • Use require('micro').serve.
  • Returns a function with the (req, res) => void signature that uses the provided function as the request handler.
  • The supplied function is run with await, so it can be async.
sendError(req, res, error)
  • Use require('micro').sendError.
  • Used as the default handler for thrown errors.
  • Automatically sets the status code of the response based on error.statusCode.
  • Sends the error.message as the body.
  • Stacks are printed out with console.error and, during development (when NODE_ENV is set to 'development'), also sent in responses.
  • Usually, you don't need to invoke this method yourself, as you can use the built-in error handling flow with throw.
createError(code, msg, orig)
  • Use require('micro').createError.
  • Creates an error object with a statusCode.
  • Useful for easily throwing errors with HTTP status codes, which are interpreted by the built-in error handling.
  • orig sets error.originalError, which identifies the original error (if any).
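Conceptually, such a helper is only a few lines. The following sketch shows what createError does in spirit; micro's actual source may differ:

```javascript
// Illustrative sketch of a createError-style helper.
function createError(code, message, original) {
  const err = new Error(message);
  err.statusCode = code; // picked up by the built-in error handling
  err.originalError = original; // the underlying error, if any
  return err;
}
```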

Error Handling

Micro allows you to write robust microservices. This is accomplished primarily by bringing sanity back to error handling and avoiding callback soup.

If an error is thrown and not caught by you, the response will automatically be 500. Important: Error stacks will be printed with console.error and, during development mode (if the env variable NODE_ENV is 'development'), they will also be included in the responses.

If the Error object that's thrown contains a statusCode property, that's used as the HTTP code to be sent. Let's say you want to write a rate limiting module:

const rateLimit = require('my-rate-limit');

module.exports = async (req, res) => {
  await rateLimit(req);
  // ... your code
};

If the API endpoint is abused, it can throw an error with createError like so:

if (tooMany) {
  throw createError(429, 'Rate limit exceeded');
}

Alternatively, you can create the Error object yourself:

if (tooMany) {
  const err = new Error('Rate limit exceeded');
  err.statusCode = 429;
  throw err;
}

The nice thing about this model is that the statusCode is merely a suggestion. The user can override it:

try {
  await rateLimit(req);
} catch (err) {
  if (err.statusCode === 429) {
    // perhaps send 500 instead?
    send(res, 500);
  }
}

If the error is based on another error that Micro caught, like a JSON.parse exception, then originalError will point to it. If a generic error is caught, the status will be set to 500.

In order to set up your own error handling mechanism, you can use composition in your handler:

const { send } = require('micro');

const handleErrors = (fn) => async (req, res) => {
  try {
    return await fn(req, res);
  } catch (err) {
    console.log(err.stack);
    send(res, 500, 'My custom error!');
  }
};

module.exports = handleErrors(async (req, res) => {
  throw new Error('What happened here?');
});

Testing

Micro makes tests compact and a pleasure to read and write. We recommend Node TAP or AVA, a highly parallel test framework with built-in support for async tests:

const http = require('http');
const { send, serve } = require('micro');
const test = require('ava');
const listen = require('test-listen');
const fetch = require('node-fetch');

test('my endpoint', async (t) => {
  const service = new http.Server(
    serve(async (req, res) => {
      send(res, 200, {
        test: 'woot',
      });
    }),
  );

  const url = await listen(service);
  const response = await fetch(url);
  const body = await response.json();

  t.deepEqual(body.test, 'woot');
  service.close();
});

Look at test-listen for a function that returns a URL with an ephemeral port every time it's called.
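Conceptually, such a helper just binds to port 0 (letting the OS assign a free port) and reads the port back. Here is a minimal sketch under that assumption; the name listen and its exact behavior here are illustrative, not test-listen's actual code:

```javascript
// Minimal sketch of a test-listen-style helper: bind to an ephemeral
// port (port 0) and resolve the resulting URL.
function listen(server) {
  return new Promise((resolve) => {
    server.listen(0, '127.0.0.1', () => {
      const { port } = server.address();
      resolve(`http://127.0.0.1:${port}`);
    });
  });
}
```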

Contributing

  1. Fork this repository to your own GitHub account and then clone it to your local device
  2. Link the package to the global module directory: npm link
  3. Within the module where you want to test your local development instance of Micro, just link it to the dependencies: npm link micro. Instead of the default one from npm, Node.js will now use your clone of Micro!

You can run the tests using: npm test.

Credits

Thanks to Tom Yandell and Richard Hodgson for donating the name "micro" on npm!

Authors