Node.js Streams: Efficient data handling in real-time applications

Streams are collections of data, just like arrays or strings. The difference is that a stream's data might not be available all at once and doesn't have to fit in memory. This makes streams especially useful when working with large amounts of data, or with data that arrives from an external source one chunk at a time. Because they process data in a memory-efficient manner, Node.js streams are an indispensable feature for modern web applications that deal with real-time data processing. They improve application performance and resource management, and they tend to produce a cleaner, more manageable codebase. By harnessing streams, developers can keep their Node.js applications robust, responsive, and scalable, ready for the intensive data-processing tasks common in today's dynamic web services.

Types of Streams:

  1. Readable: These are streams from which data can be read (for example, fs.createReadStream()).
  2. Writable: These are streams to which data can be written (for example, fs.createWriteStream()).
  3. Duplex: Streams that are both Readable and Writable (for example, net.Socket).
  4. Transform: A type of Duplex stream where the output is computed from the input (for example, zlib.createGzip()); a short sketch of such a stream in a pipeline follows this list.
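
A Transform stream is easiest to see inside a pipeline. The following sketch (assuming a local file named source.txt, as in the example later in this section) reads the file, compresses it with the zlib.createGzip() Transform stream, and writes the result to source.txt.gz:

const fs = require('fs');
const zlib = require('zlib');

// Readable -> Transform (gzip) -> Writable
fs.createReadStream('source.txt')
    .pipe(zlib.createGzip())                      // compress each chunk as it passes through
    .pipe(fs.createWriteStream('source.txt.gz'))
    .on('finish', function() {
        console.log('source.txt compressed to source.txt.gz');
    });

Because pipe() returns its destination, the calls can be chained; data flows through the gzip stage chunk by chunk instead of being loaded into memory all at once.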

Each type of stream is an EventEmitter instance and emits several events at different points in time. For example, some of the commonly used events are listed below, followed by a small sketch that shows them in action:

  • data: This event is fired when there is data available to read.
  • end: This event is fired when there is no more data to read.
  • error: This event is fired when there is any error receiving or writing data.
  • finish: This event is fired when all the data has been flushed to the underlying system.
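
As a quick illustration of these events (a minimal sketch; the in-memory array below simply stands in for a real data source), they can be observed on any Readable stream:

const { Readable } = require('stream');

// Build a readable stream from an in-memory iterable (stands in for a real source)
const readable = Readable.from(['one', 'two', 'three']);

readable.on('data', function(chunk) {
    console.log('data:', chunk);                // fired for each available chunk
});
readable.on('end', function() {
    console.log('end: no more data to read');   // fired when the source is exhausted
});
readable.on('error', function(err) {
    console.error('error:', err.message);       // fired if reading fails
});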

Example: Using readable and writable streams

The following example demonstrates basic usage of Readable and Writable streams: it reads data from a file, chunk by chunk, and writes it to another file.

const fs = require('fs');
// Create a readable stream
const readableStream = fs.createReadStream('source.txt');
// Create a writable stream
const writableStream = fs.createWriteStream('destination.txt');
// Listen for 'data' event
readableStream.on('data', function(chunk) {
    // Write data to the destination file
    writableStream.write(chunk);
});
readableStream.on('end', function() {
    console.log('Read Stream finished');
    writableStream.end();  // End the writable stream
});
writableStream.on('finish', function() {
    console.log('Write Stream finished');
});
readableStream.on('error', function(err) {
    console.log('An error occurred:', err.message);
});
writableStream.on('error', function(err) {
    console.log('An error occurred:', err.message);
});

In this script, we use the fs module to create readable and writable streams for files. The readable stream reads data from ‘source.txt’, and as it reads, it emits ‘data’ events. The callback for each ‘data’ event writes the chunk of data to the writable stream, which writes to ‘destination.txt’. We also listen for ‘end’ and ‘finish’ events to know when the streams are done reading and writing, respectively, and ‘error’ events to catch and log any errors.
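
One caveat about the script above: it calls writableStream.write() without checking its return value, so it ignores backpressure (write() returns false when the writable stream's internal buffer is full, signalling that the producer should pause). In most real code, piping the streams together, or using stream.pipeline(), handles backpressure and error propagation automatically. Here is a minimal sketch of the same file copy using stream.pipeline(), with the same file names:

const fs = require('fs');
const { pipeline } = require('stream');

// pipeline() wires the streams together, handles backpressure,
// and forwards errors from any stream to the final callback
pipeline(
    fs.createReadStream('source.txt'),
    fs.createWriteStream('destination.txt'),
    function(err) {
        if (err) {
            console.error('Pipeline failed:', err.message);
        } else {
            console.log('Pipeline succeeded');
        }
    }
);

pipeline() destroys all the streams if any of them errors and reports the failure through the final callback, which avoids the separate 'error' handlers needed in the manual version.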
