Things you can achieve include:
- Create a server
- Work with streamed data
- Make API calls
Creating an HTTP server
We create an HTTP server on port 8080 and respond to any request with a ‘Hello there!’:
import {createServer, IncomingMessage, Server, ServerResponse} from 'http';

const server: Server = createServer();
const port = 8080;

server.on('request', (req: IncomingMessage, res: ServerResponse) => {
  res.writeHead(200, {'content-type': 'text/plain'});
  res.end('Hello there!');
});

server.listen(port);
IncomingMessage extends two interfaces, EventEmitter and ReadableStream, which allow us to listen to events and process streams, such as incoming data.
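Because of that, an incoming request can be treated like any other readable stream. A minimal sketch (echoing the request body back to the client is just an illustration, not part of the example above):

import {createServer, IncomingMessage, ServerResponse} from 'http';

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  // EventEmitter part: react to stream events
  req.on('end', () => console.log('request fully received'));
  // ReadableStream part: pipe the incoming data straight back into the response
  req.pipe(res);
});

server.listen(8080);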
Creating an HTTPS server
HTTPS has very much the same API as HTTP. You require/import https instead and specify a private key and certificate. On Unix-based operating systems you can generate cert.pem and key.pem with:
openssl req -x509 -newkey rsa:4096 -keyout key.pem -out cert.pem -nodes -subj "/"
const fs = require('fs');

const server = require('https').createServer({
  key: fs.readFileSync('./key.pem'),
  cert: fs.readFileSync('./cert.pem'),
});

server.on('request', (req, res) => {
  res.writeHead(200, {'content-type': 'text/plain'});
  res.end('Hello there!');
});

server.listen(443);
Handling errors
To catch errors like the following
events.js:292
      throw er; // Unhandled 'error' event
      ^
Error: listen EADDRINUSE: address already in use :::8080
…you do not wrap your code in a try/catch block. Instead, you listen to the server's error event:
import {createServer, IncomingMessage, Server, ServerResponse} from 'http';
import ErrnoException = NodeJS.ErrnoException;

const server: Server = createServer();
const port = 8080;

server.listen(port);

server.on('request', (req: IncomingMessage, res: ServerResponse) => {
  console.log('Incoming request');
});

server.on('error', (err: ErrnoException) => {
  if (err.code === 'EADDRINUSE') {
    console.error(`Error: Address on port ${port} is already in use`);
  } else {
    console.error(err);
  }
});
To listen for errors related to incoming requests and outgoing responses you use:
import {createServer, IncomingMessage, Server, ServerResponse} from 'http';

const server: Server = createServer();
const port = 8080;

server.on('request', (req: IncomingMessage, res: ServerResponse) => {
  req.on('error', (err) => {
    console.log('request error');
    res.statusCode = 500;
    res.write('An error occurred');
    res.end();
  });
  res.on('error', (err) => {
    console.log('response error');
  });
});

server.listen(port);
The err object contains the properties code, message and stack.
Usually you want to handle errors in order to log them, provide a fallback solution and/or display a message to the user.
If an error occurs and no error listener is defined, the error is thrown and can crash your application; errors are not silently ignored just because you did not handle them.
Parsing URL
We want to parse the complete URL of an incoming request into its different parts to check if /test was requested. We do not use url.parse() because it is deprecated; instead we use:
import {createServer, IncomingMessage, Server, ServerResponse} from 'http';
import {URL} from 'url';

const server: Server = createServer();
const port = 8080;

server.on('request', (req: IncomingMessage, res: ServerResponse) => {
  if (req.url) {
    const parsedUrl = new URL(req.url, `http://${req.headers.host}`);
    console.log(parsedUrl);
    if (parsedUrl.pathname === '/test') {
      return res.end('You requested the test page');
    }
  }
  res.end('Bye');
});

server.listen(port);
A request to http://localhost:8080/test results in:
URL {
  href: 'http://localhost:8080/test',
  origin: 'http://localhost:8080',
  protocol: 'http:',
  username: '',
  password: '',
  host: 'localhost:8080',
  hostname: 'localhost',
  port: '8080',
  pathname: '/test',
  search: '',
  searchParams: URLSearchParams {},
  hash: ''
}
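The search and searchParams parts are empty here because the request had no query string. If it did, the values could be read via searchParams; a minimal standalone sketch (the URL and parameter names are made up for illustration):

import {URL} from 'url';

// e.g. the query string of http://localhost:8080/test?page=2&limit=10
const parsedUrl = new URL('http://localhost:8080/test?page=2&limit=10');

console.log(parsedUrl.searchParams.get('page'));    // '2'
console.log(parsedUrl.searchParams.get('limit'));   // '10'
console.log(parsedUrl.searchParams.get('missing')); // null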
Working with routes
const fs = require('fs');

const server = require('http').createServer();
let data = {"some": "data"};

server.on('request', (req, res) => {
  switch (req.url) {
    case '/home':
      res.writeHead(200, {'content-type': 'text/html'});
      res.end(fs.readFileSync('home.html'));
      break;
    case '/api':
      res.writeHead(200, {'content-type': 'application/json'});
      res.end(JSON.stringify(data));
      break;
    default:
      res.writeHead(404);
      res.end();
  }
});

server.listen(8000);
Headers
Reading incoming request headers
Incoming request headers are available in request.headers. For example, run curl -H "Authorization: Bearer 123456" http://localhost:8080/something and you will see that request.headers contains the authorization header.
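A minimal sketch of reading that header inside the request handler (note that Node lowercases incoming header names):

import {createServer, IncomingMessage, ServerResponse} from 'http';

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  // Incoming header names are lowercased by Node
  console.log(req.headers.authorization);  // 'Bearer 123456'
  console.log(req.headers['user-agent']);
  res.end();
});

server.listen(8080);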
With Express you can use req.header('User-Agent').
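For comparison, a minimal Express sketch (this assumes express and @types/express are installed, which is not part of the examples above):

import express from 'express';

const app = express();

app.get('/something', (req, res) => {
  // Express header lookup is case-insensitive
  res.send(req.header('User-Agent') ?? 'unknown');
});

app.listen(8080);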
Setting outgoing response headers
// Option 1
response.writeHead(404, {'X-Powered-By': 'Node'});
response.end();
// Option 2
response.statusCode = 404;
response.setHeader('X-Powered-By', 'Node');
response.end();
Processing incoming data
In this example we create a server that will process any JSON payload and log it to the console.
Option 1: The long way by receiving and concatenating chunks
Generally, the server does not receive the data all at once but as a stream of individual chunks (or buffers) that arrive over time. We collect them in a body Buffer array by listening to the data event on the request object. To know when the last piece of data has been transmitted we listen to the request's end event. There we concat the collected chunks into one buffer and JSON.parse() it to output it via console.
import {createServer, IncomingMessage, Server, ServerResponse} from 'http';

const server: Server = createServer();
const port = 8080;

server.on('request', (req: IncomingMessage, res: ServerResponse) => {
  const body: Buffer[] = [];
  req.on('data', (chunk) => {
    body.push(chunk);
  });
  req.on('end', () => {
    const buffer = Buffer.concat(body);
    const json = JSON.parse(buffer.toString());
    console.log(json);
  });
  res.end('Bye');
});

server.listen(port);
Now, send a request. You can use any tool such as curl, Postman or Insomnia. In this example we create a file send.http and send the request directly from within the JetBrains IDE:
# send.http
POST http://localhost:8080/api/item
Content-Type: application/json

{
  "user": "Mathias",
  "job": "Developer"
}
Option 2: Using the body package
If you do not want to deal with buffers and streams yourself, npm install body @types/body. This gives you four different parsers, depending on what kind of data you expect: text, json, form or any. We only use json in the following example:
import {createServer, IncomingMessage, Server, ServerResponse} from 'http';
// import textBody from "body";
import jsonBody from "body/json.js";
// import formBody from "body/form.js";
// import anyBody from "body/any.js";

const server: Server = createServer();
const port = 8080;

server.on('request', (req: IncomingMessage, res: ServerResponse) => {
  jsonBody(req, res, (err, body) => {
    if (err) {
      console.error(err);
    }
    console.log(body);
  });
  req.on('end', () => {
    console.log('end');
  });
});

server.listen(port);
In my first attempt (before I ended up with the working solution above) I made the mistake of running jsonBody inside the request.on('data') handler instead of the server.on('request') handler, and got this error message from the raw-body package:
request size did not match content length
Sending JSON data
import {createServer, IncomingMessage, Server, ServerResponse} from 'http';

const server: Server = createServer();
const port = 8080;

server.on('request', (req: IncomingMessage, res: ServerResponse) => {
  const data = {name: 'Mathias', job: 'Developer'};
  res.setHeader('Content-Type', 'application/json');
  res.statusCode = 200;
  res.write(JSON.stringify(data));
  res.end();
});

server.listen(port);
Requesting HTTP data
Instead of serving data over HTTP, Node can also be used to request data over HTTP.
const https = require('https');

// req is of type https.ClientRequest
const req = https.request({hostname: 'bothe.at'}, (res) => {
  // res is of type http.IncomingMessage
  console.log(res);
  console.log(res.statusCode);
  console.log(res.headers);
  res.on('data', (data) => {
    console.log(data.toString());
  });
});

req.on('error', (err) => console.log(err));
req.end();

console.log(req.agent); // type of http.Agent
There are also simplified versions, for example http.get('http://...', (res) => {}), in which case you do not need to call req.end() manually. Node uses http.globalAgent to handle sockets for these client HTTP requests.
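A minimal sketch of the shorthand, using the same host as the request example above:

import https from 'https';

// https.get() calls req.end() for you
https.get('https://bothe.at', (res) => {
  console.log(res.statusCode);
  res.on('data', (chunk) => {
    console.log(chunk.toString());
  });
}).on('error', (err) => console.log(err));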
PUT / POST / DELETE
import http from 'http';

const data = JSON.stringify({
  name: 'Mathias',
  job: 'Developer'
});

const options = {
  hostname: 'localhost',
  port: 8080,
  path: '/users',
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Content-Length': data.length
  }
};

const request = http.request(options, response => {
  response.on('data', chunk => {
    console.log(chunk.toString());
  });
});

request.on('error', (err) => {
  console.error(err);
});

// write data to request stream
request.write(data);
request.end();
Authenticated requests
That's as simple as adding an Authorization header, here with a base64-encoded username:password pair for HTTP Basic authentication:
const options = {
  hostname: 'localhost',
  port: 8080,
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Content-Length': data.length,
    'Authorization': 'Basic ' + Buffer.from('myusername' + ':' + 'mypassword').toString('base64')
  }
};
Handling form upload
Handling file uploads without a third-party library is non-trivial and time-consuming. That's why we use Formidable, which is suitable for production environments, fast (it can automatically write 500 MB/s to disk), has a low memory footprint, handles errors gracefully and has very high test coverage.
npm install formidable @types/formidable
We need an HTML form with a file upload button, so we create an index.html:
<form action="/upload" enctype="multipart/form-data" method="post">
  <input type="text" name="description" placeholder="Enter a description">
  <input type="file" name="upload" multiple="multiple">
  <input type="submit" value="Upload">
</form>
There are two ways to handle file uploads in Node.js: via events or callbacks. Here is the callback way:
import {createServer, IncomingMessage, Server, ServerResponse} from 'http';
import * as fs from "fs";
import formidable from "formidable";

const server: Server = createServer();
const port = 8080;

server.on('request', (req: IncomingMessage, res: ServerResponse) => {
  if (req.method === 'POST' && req.url === "/upload") {
    const form = new formidable.IncomingForm();
    form.parse(req, (err, fields, files) => {
      console.log('Fields: ', fields);
      console.log('Files: ', files);
    });
  }
  fs.createReadStream("./index.html").pipe(res);
});

server.listen(port);
Uploading a Test.pdf would output something similar to:
Fields: { description: 'My file' }
Files: {
  upload: File {
    size: 33672,
    path: <temp upload folder handled by your OS where your upload was stored>,
    name: 'Test.pdf',
    type: 'application/pdf',
    hash: null,
    lastModifiedDate: ...,
    ...
  }
}
If we want to specify the upload directory, keep file extensions, allow multiple uploads and so on, we pass options:
const form = new formidable.IncomingForm({
  uploadDir: __dirname,
  keepExtensions: true,
  multiples: true,
  maxFileSize: 5 * 1024 * 1024, // 5 MB
  encoding: "utf-8",
  maxFields: 20
});
If you need more control over the data flow, consider using the event-driven way of processing files:
// Emitted after each incoming chunk of data that has been parsed. Can be used to roll your own progress bar.
form.on('progress', (bytesReceived, bytesExpected) => {});

// Emitted whenever a field / value pair has been received.
form.on('field', (name, value) => {});

// Emitted whenever a new file is detected in the upload stream. Use this event if you want to stream the file
// to somewhere else while buffering the upload on the file system.
form.on('fileBegin', (name, file) => {});

// Emitted whenever a field / file pair has been received. file is an instance of File.
form.on('file', (name, file) => {});

// Emitted when there is an error processing the incoming form. A request that experiences an error is
// automatically paused, you will have to manually call request.resume() if you want the request to continue
// firing 'data' events.
form.on('error', (err) => {});

// Emitted when the request was aborted by the user. Right now this can be due to a 'timeout' or 'close' event
// on the socket. After this event is emitted, an error event will follow. In the future there will be a
// separate 'timeout' event (needs a change in the node core).
form.on('aborted', () => {});

// Emitted when the entire request has been received, and all contained files have finished flushing to disk.
// This is a great place for you to send your response.
form.on('end', () => {});
Axios
Axios is a popular library that works client- and server-side, supports streaming, has a promise-based API and automatically transforms data into JSON.
npm install axios
Here we make a simple GET request:
import axios from "axios";

axios.get('http://google.com').then(res => {
  console.log(res.data);
}).catch(err => {
  console.log(err);
});
Here we pipe the response stream into a file:
import fs from "fs";
import axios from "axios";

const options = {
  method: 'GET',
  url: 'http://google.com',
  responseType: 'stream'
};

axios(options).then(res => {
  res.data.pipe(fs.createWriteStream('google.html'));
}).catch(err => {
  console.log(err);
});
Here we transform data before we POST it:
import fs from "fs";
import axios from "axios";

const options = {
  method: 'POST',
  url: 'http://localhost:8080',
  responseType: 'stream',
  data: {
    userNames: ['testA', 'testB']
  },
  transformRequest: (data, headers) => {
    const newData = 'transform outgoing data';
    return JSON.stringify(newData);
  },
  transformResponse: (data) => {
    const newData = 'transform incoming data';
    return JSON.stringify(newData);
  }
};

axios(options).then(res => {
  res.data.pipe(fs.createWriteStream('google.html'));
}).catch(err => {
  console.log(err);
});
Multiple parallel requests
import axios from "axios";

axios.all([
  axios.get('http://localhost:8080/post/1'),
  axios.get('http://localhost:8080/post/2'),
  axios.get('http://localhost:8080/post/3')
]).then((responseArray) => {
  console.log(responseArray[0].data.description);
  console.log(responseArray[1].data.description);
  console.log(responseArray[2].data.description);
});