Streams in Node.js

A stream is an abstract interface for working with streaming data. A stream can be an abstraction for a source, a destination, or both. Node's core modules such as http and fs use streams internally, and you can use the stream module to create your own stream instances. The stream module can be accessed as

var stream = require('stream');

There are four types of streams in Node.js -

  1. Readable - streams from which data can be read.
  2. Writable - streams to which data can be written.
  3. Duplex - streams that are readable as well as writable.
  4. Transform - duplex streams that can transform data as well.

All streams are Event Emitters. A readable stream can be piped to a writable stream.

Readable Streams

A Readable stream is an abstraction for a source. Some examples of readable stream implementations are file contents (fs read streams), an http request on the server side, and standard input (process.stdin). Readable streams can emit the events 'data', 'end', 'error', 'close' and 'readable'.

The following code creates a readable stream using the fs module's fs.createReadStream(path, options) method. The data is read in chunks, and the stream emits a 'data' event each time a chunk is read. The handler here just logs the length of each chunk; in practice you would do something more useful, such as piping it into a writable stream like an http response or a writable file stream.

var fs = require('fs');
// Create a ReadStream from file abc.txt with encoding utf-8
var myStream = fs.createReadStream('abc.txt', 'utf8');
// Event handlers
myStream.on('data', function(chunk) {
   console.log('Read chunk of length ' + chunk.length);
});
myStream.on('end', function() {
   console.log('Read ended');
});
myStream.on('error', function(err) {
   console.error('Read error: ' + err.message);
});

You can implement your own readable stream using the stream module.

var Readable = require('stream').Readable;
var myReadable = new Readable({
  // options
});

The options that can be specified are highWaterMark (the internal buffer size in bytes), encoding, objectMode and a read() method.

Writable Streams

A Writable stream is an abstraction for a destination. Some writable stream implementations in Node.js are file contents (fs write streams), an http response on the server side, and standard output (process.stdout). Events that can be emitted are 'close', 'finish', 'error' and 'drain'.

The following code creates a writable stream using the fs module's fs.createWriteStream(path, options) method.

var fs = require('fs');
// Create a WriteStream to file pqr.txt; the default encoding is utf-8
var myStream = fs.createWriteStream('pqr.txt');
// Event handlers
myStream.on('finish', function() {
    console.log('Write finished');
});
myStream.on('error', function(err) {
    console.error('Write error: ' + err.message);
});

You can implement your own writable stream using the stream module.

var Writable = require('stream').Writable;
var myWritable = new Writable({
  // options
});


You may pipe a readable stream to a writable stream. For example, the contents of a file (an fs read stream) can be piped to an http response on the server. The following code copies one file to another using streams.

var fs = require("fs");
// Create a readable stream from abc.txt
var myRStream = fs.createReadStream('abc.txt');
// Create a writable stream to pqr.txt
var myWStream = fs.createWriteStream('pqr.txt');
// Pipe the read stream into the write stream
myRStream.pipe(myWStream);
// When the writable stream emits 'finish', the copy is complete
myWStream.on('finish', function() {
   console.log('Copy finished');
});

Pipes can be made using duplex and transform streams as well. Piping operations can be chained together to perform operations such as compressing and decompressing. You should listen for 'error' and 'unpipe' events and handle them properly. There are many third-party packages that provide a streaming API for database operations, web sockets and so on.
