- **Easy**: Designed for usage with `async` and `await`
- **Fast**: Ultra-high performance (even JSON parsing is opt-in)
- **Micro**: The whole project is ~260 lines of code
- **Agile**: Super easy deployment and containerization
- **Simple**: Oriented for single-purpose modules (function)
- **Standard**: Just HTTP!
- **Explicit**: No middleware - modules declare all dependencies
- **Lightweight**: With all dependencies, the package weighs less than a megabyte
**Disclaimer:** Micro was created for use within containers and is not intended for use in serverless environments. For those using Vercel, this means that there is no requirement to use Micro in your projects, as the benefits it provides are not applicable to the platform. Utility features provided by Micro, such as `json`, are readily available in the form of Serverless Function helpers.
**Important:** Micro is only meant to be used in production. In development, you should use `micro-dev`, which provides you with a tool belt specifically tailored for developing microservices.
To prepare your microservice for running in the production environment, first install `micro`:

```shell
npm install --save micro
```
Create an `index.js` file and export a function that accepts the standard `http.IncomingMessage` and `http.ServerResponse` objects:

```js
module.exports = (req, res) => {
  res.end('Welcome to Micro');
};
```
Micro provides useful helpers but also handles return values, so you can write it even shorter!

```js
module.exports = () => 'Welcome to Micro';
```
Next, ensure that the `main` property inside `package.json` points to your microservice (which is inside `index.js` in this example case) and add a `start` script:

```json
{
  "main": "index.js",
  "scripts": {
    "start": "micro"
  }
}
```
Once all of that is done, the server can be started like this:

```shell
npm start
```

And go to this URL: `http://localhost:3000` - 🎉
```
micro - Asynchronous HTTP microservices

USAGE

    $ micro --help
    $ micro --version
    $ micro [-l listen_uri [-l...]] [entry_point.js]

    By default micro will listen on 0.0.0.0:3000 and will look first
    for the "main" property in package.json and subsequently for index.js
    as the default entry_point.

    Specifying a single --listen argument will overwrite the default, not supplement it.

OPTIONS

    --help              shows this help message

    -v, --version       displays the current version of micro

    -l, --listen listen_uri
                        specify a URI endpoint on which to listen (see below) -
                        more than one may be specified to listen in multiple places

ENDPOINTS

    Listen endpoints (specified by the --listen or -l options above) instruct micro
    to listen on one or more interfaces/ports, UNIX domain sockets, or Windows named pipes.

    For TCP (traditional host/port) endpoints:

        $ micro -l tcp://hostname:1234

    For UNIX domain socket endpoints:

        $ micro -l unix:/path/to/socket.sock

    For Windows named pipe endpoints:

        $ micro -l pipe:\\.\pipe\PipeName
```
Examples
Micro is built for usage with async/await.
```js
const sleep = require('then-sleep');

module.exports = async (req, res) => {
  await sleep(500);
  return 'Ready!';
};
```
When you want to set the port using an environment variable you can use:

```shell
micro -l tcp://0.0.0.0:$PORT
```

Optionally you can add a default if it suits your use case:

```shell
micro -l tcp://0.0.0.0:${PORT-3000}
```

`${PORT-3000}` will allow a fallback to port `3000` when `$PORT` is not defined. Note that this only works in Bash.
For parsing the incoming request body, we include the async functions `buffer`, `text` and `json`:

```js
const { buffer, text, json } = require('micro');

module.exports = async (req, res) => {
  const buf = await buffer(req);
  console.log(buf);
  // <Buffer 7b 22 70 72 69 63 65 22 3a 20 39 2e 39 39 7d>
  const txt = await text(req);
  console.log(txt);
  // '{"price": 9.99}'
  const js = await json(req);
  console.log(js.price);
  // 9.99
  return '';
};
```
- Buffers and parses the incoming body and returns it.
- Exposes an `async` function that can be run with `await`.
- Can be called multiple times, as it caches the raw request body the first time.
- `limit` is the maximum amount of data aggregated before parsing; if it is exceeded, an `Error` is thrown with `statusCode` set to `413` (see Error Handling). It can be a `Number` of bytes or a string like `'1mb'`.
- If JSON parsing fails, an `Error` is thrown with `statusCode` set to `400` (see Error Handling).

For other types of data, check the examples.
So far we have used `return` to send data to the client. `return 'Hello World'` is the equivalent of `send(res, 200, 'Hello World')`.

```js
const { send } = require('micro');

module.exports = async (req, res) => {
  const statusCode = 400;
  const data = { error: 'Custom error message' };

  send(res, statusCode, data);
};
```
- Use `require('micro').send`.
- `statusCode` is a `Number` with the HTTP status code, and must always be supplied.
- If `data` is supplied, it is sent in the response. Different input types are processed appropriately, and `Content-Type` and `Content-Length` are automatically set.
  - `Stream`: `data` is piped as an `octet-stream`. Note: it is *your* responsibility to handle the `error` event in this case (usually, simply logging the error and aborting the response is enough).
  - `Buffer`: `data` is written as an `octet-stream`.
  - `object`: `data` is serialized as JSON.
  - `string`: `data` is written as-is.
- If JSON serialization fails (for example, if a cyclical reference is found), a `400` error is thrown. See Error Handling.
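The type dispatch above can be sketched roughly like this (a simplification for illustration, not micro's source; the real `send` also sets `Content-Type` and `Content-Length` and pipes streams):

```js
// A simplified sketch - not micro's source - of how send() maps data
// types to response bodies, including the 400 on failed serialization.
function serializeBody(data) {
  if (Buffer.isBuffer(data)) {
    return data; // written as an octet-stream
  }
  if (typeof data === 'object' && data !== null) {
    try {
      return JSON.stringify(data); // objects are serialized as JSON
    } catch (err) {
      err.statusCode = 400; // e.g. a cyclical reference was found
      throw err;
    }
  }
  return String(data); // strings are written as-is
}

console.log(serializeBody({ price: 9.99 })); // {"price":9.99}

const cyclic = {};
cyclic.self = cyclic;
try {
  serializeBody(cyclic);
} catch (err) {
  console.log(err.statusCode); // 400
}
```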
You can use Micro programmatically by requiring Micro directly:
```js
const http = require('http');
const sleep = require('then-sleep');
const { serve } = require('micro');

const server = new http.Server(
  serve(async (req, res) => {
    await sleep(500);
    return 'Hello world';
  }),
);

server.listen(3000);
```
- Use `require('micro').serve`.
- Returns a function with the `(req, res) => void` signature that uses the provided function as the request handler.
- The supplied function is run with `await`, so it can be `async`.
- Use `require('micro').sendError`.
- Used as the default handler for thrown errors.
- Automatically sets the status code of the response based on `error.statusCode`.
- Sends the `error.message` as the body.
- Stacks are printed out with `console.error` and, during development (when `NODE_ENV` is set to `'development'`), also sent in responses.
- Usually you don't need to invoke this method yourself, as you can use the built-in error handling flow with `throw`.
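Taken together, those defaults behave roughly like the following sketch (an illustration of the documented behavior, not micro's source):

```js
// A sketch of the documented error-handling defaults: fall back to 500
// when no statusCode is set, and only expose stack traces in development.
function describeError(err, nodeEnv) {
  const statusCode = err.statusCode || 500;
  const body =
    nodeEnv === 'development' && err.stack ? err.stack : err.message;
  return { statusCode, body };
}

const err = new Error('Rate limit exceeded');
err.statusCode = 429;

console.log(describeError(err, 'production'));
// { statusCode: 429, body: 'Rate limit exceeded' }
console.log(describeError(new Error('boom'), 'production').statusCode); // 500
```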
- Use `require('micro').createError`.
- Creates an error object with a `statusCode`.
- Useful for easily throwing errors with HTTP status codes, which are interpreted by the built-in error handling.
- `orig` sets `error.originalError`, which identifies the original error (if any).
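For illustration, the sketch below mimics the documented shape of `createError(code, message, orig)`; in real code, import it with `require('micro').createError`:

```js
// A sketch of the documented shape of createError(code, message, orig);
// use require('micro').createError in real code.
function createError(code, message, original) {
  const err = new Error(message);
  err.statusCode = code;
  err.originalError = original;
  return err;
}

// Wrap a low-level parse failure in an HTTP-aware error:
const parseErr = new SyntaxError('Unexpected token');
const httpErr = createError(400, 'Invalid JSON', parseErr);

console.log(httpErr.statusCode); // 400
console.log(httpErr.originalError === parseErr); // true
```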
Micro allows you to write robust microservices. This is accomplished primarily by bringing sanity back to error handling and avoiding callback soup.
If an error is thrown and not caught by you, the response will automatically be `500`. **Important:** Error stacks will be printed with `console.error` and, during development (when the env variable `NODE_ENV` is `'development'`), they will also be included in the responses.
If the `Error` object that's thrown contains a `statusCode` property, that's used as the HTTP code to be sent. Let's say you want to write a rate limiting module:

```js
const rateLimit = require('my-rate-limit');

module.exports = async (req, res) => {
  await rateLimit(req);
  // ... your code
};
```
If the API endpoint is abused, it can throw an error with `createError` like so:

```js
if (tooMany) {
  throw createError(429, 'Rate limit exceeded');
}
```
Alternatively you can create the `Error` object yourself:

```js
if (tooMany) {
  const err = new Error('Rate limit exceeded');
  err.statusCode = 429;
  throw err;
}
```
The nice thing about this model is that the `statusCode` is merely a suggestion. The user can override it:

```js
try {
  await rateLimit(req);
} catch (err) {
  if (err.statusCode === 429) {
    // perhaps send 500 instead?
    send(res, 500);
  }
}
```
If the error is based on another error that Micro caught, like a `JSON.parse` exception, then `originalError` will point to it. If a generic error is caught, the status will be set to `500`.
In order to set up your own error handling mechanism, you can use composition in your handler:
```js
const { send } = require('micro');

const handleErrors = (fn) => async (req, res) => {
  try {
    return await fn(req, res);
  } catch (err) {
    console.log(err.stack);
    send(res, 500, 'My custom error!');
  }
};

module.exports = handleErrors(async (req, res) => {
  throw new Error('What happened here?');
});
```
Micro makes tests compact and a pleasure to read and write. We recommend Node TAP or AVA, a highly parallel test framework with built-in support for async tests:

```js
const http = require('http');
const { send, serve } = require('micro');
const test = require('ava');
const listen = require('test-listen');
const fetch = require('node-fetch');

test('my endpoint', async (t) => {
  const service = new http.Server(
    serve(async (req, res) => {
      send(res, 200, {
        test: 'woot',
      });
    }),
  );

  const url = await listen(service);
  const response = await fetch(url);
  const body = await response.json();

  t.deepEqual(body.test, 'woot');
  service.close();
});
```
Look at `test-listen` for a function that returns a URL with an ephemeral port every time it's called.
- Fork this repository to your own GitHub account and then clone it to your local device.
- Link the package to the global module directory:

  ```shell
  npm link
  ```

- Within the module you want to test your local development instance of Micro, link it to the dependencies:

  ```shell
  npm link micro
  ```

Instead of the default one from npm, Node will now use your clone of Micro!
You can run the tests using: `npm test`.
Thanks to Tom Yandell and Richard Hodgson for donating the name "micro" on npm!