Understanding Node.js Streams

What are Streams?

Streams in Node.js help in reading, writing, and processing data efficiently. They are commonly used for handling files, network requests, and real-time data transfer.

What makes streams unique?

Instead of reading a file into memory all at once, as a program traditionally would, streams read data chunk by chunk, processing the content without ever holding all of it in memory.

This makes streams especially powerful when working with large amounts of data. For example, a file can be larger than your available memory, making it impossible to load the whole file into memory before processing it.

Why Streams?

Memory Efficient – Loads data in small chunks instead of all at once.

Faster Processing – Starts processing as soon as data arrives.
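
For example, copying a large file with fs.readFile loads the entire file into memory before anything can be written back out, while a stream copies it chunk by chunk. A rough sketch of the difference (the file names here are just placeholders):

const fs = require('fs');

// Buffered approach: the whole file is held in memory at once
fs.readFile('large-file.txt', (err, data) => {
  if (err) throw err;
  fs.writeFile('copy-buffered.txt', data, (err) => {
    if (err) throw err;
  });
});

// Streaming approach: only one small chunk is in memory at any time
fs.createReadStream('large-file.txt').pipe(fs.createWriteStream('copy-streamed.txt'));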

Types of Streams in Node.js:

1️⃣ Writable – Write data (e.g., fs.createWriteStream()).

2️⃣ Readable – Read data (e.g., fs.createReadStream()).

3️⃣ Duplex – Read & write simultaneously (e.g., net.Socket).

4️⃣ Transform – Modify data while streaming (e.g., compression).


Advantages & Disadvantages of Node.js Streams

✅ Advantages:

  1. Memory Efficient – Streams process data in chunks, reducing memory usage.
  2. Faster Processing – No need to wait for the entire data to load before starting processing.
  3. Better Performance – Ideal for handling large files, network requests, and real-time data.
  4. Piping Support – Easily connect streams using .pipe(), simplifying complex workflows.
  5. Asynchronous & Non-blocking – Keeps applications responsive while handling data efficiently.


❌ Disadvantages:

  1. Complexity – Managing multiple streams and handling errors can be tricky.
  2. Backpressure Issues – If data flows too fast from a readable stream, the writable stream might struggle to keep up (see the pipeline sketch after this list).
  3. Debugging Challenges – Stream-based code can be harder to debug due to its event-driven nature.
  4. Limited Use Cases – Not always the best choice for small data sets where simple file reading/writing is sufficient.
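
Backpressure and error handling are easier to get right with the built-in stream.pipeline helper, which connects the streams, manages backpressure between them, and destroys all of them if any one fails. A minimal sketch (the file names are placeholders):

const fs = require('fs');
const { pipeline } = require('stream');

// pipeline() wires the streams together, handles backpressure,
// and cleans everything up if any stream emits an error.
pipeline(
  fs.createReadStream('big-input.txt'),
  fs.createWriteStream('big-output.txt'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  }
);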

Events:

  • data – Triggered when data is available to read.
  • end – Emitted when the stream has no more data.
  • error – Emitted when an error occurs.
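
The example below creates a readable stream from a file and listens for all three of these events: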



const fs = require('fs');

// Create a readable stream
const readStream = fs.createReadStream('data.txt', 'utf8');

readStream.on('data', (chunk) => {
  console.log('Received chunk:', chunk);
});

readStream.on('error', (err) => {
  console.error('Error:', err);
});

readStream.on('end', () => {
  console.log('Finished reading file.');
});

// Write Stream

const writeStream = fs.createWriteStream('../JS/data.txt');

writeStream.write('Hello javascript');

// end() flushes any remaining data and closes the stream;
// the callback runs once writing has finished.
writeStream.end(() => {
  console.log('Finished writing.');
});

writeStream.on('error', (error) => {
  console.error(error);
});
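
The Types list above mentions Duplex streams such as net.Socket, which can be read from and written to at the same time. A socket needs a network connection, so here is a minimal self-contained sketch of a custom Duplex instead; the class and variable names are only illustrative:

const { Duplex } = require('stream');

// A toy Duplex: data written to it is buffered and can be read back out.
class EchoDuplex extends Duplex {
  constructor(options) {
    super(options);
    this.chunks = [];
  }

  _write(chunk, encoding, callback) {
    this.chunks.push(chunk); // store incoming data
    callback();
  }

  _read(size) {
    // hand stored chunks to the readable side, then signal the end
    if (this.chunks.length > 0) {
      this.push(this.chunks.shift());
    } else {
      this.push(null);
    }
  }
}

const echo = new EchoDuplex();
echo.write('Hello ');
echo.end('Duplex');
echo.on('data', (chunk) => console.log('Read back:', chunk.toString()));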

// Piping: connect a readable stream to a writable stream

const sourceStream = fs.createReadStream('data.txt', 'utf-8');
const destinationStream = fs.createWriteStream('output.txt');

sourceStream.pipe(destinationStream);

// 'finish' fires on the writable stream once all the data has been written
destinationStream.on('finish', () => {
  console.log('Piping completed.');
});
        
// Transform stream

const fs = require('fs');
const zlib = require('zlib');  // Import zlib for compression

// Create a readable stream (input file)
const readableStream = fs.createReadStream('input.txt');

// Create a writable stream (output compressed file)
const writableStream = fs.createWriteStream('output.txt.gz');

// Create a transform stream (compression)
const gzipStream = zlib.createGzip();

// Pipe data: Read -> Compress -> Write
readableStream.pipe(gzipStream).pipe(writableStream);

// 'finish' fires on the writable stream once all compressed data is written
writableStream.on('finish', () => {
  console.log('File compression completed!');
});

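zlib.createGzip() is a built-in Transform stream. You can also write your own: the sketch below (a hypothetical upper-casing transform) modifies each chunk of text as it passes through:

const { Transform } = require('stream');

// A Transform stream that upper-cases every chunk of text passing through it
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // chunk arrives as a Buffer; convert it, transform it, and pass it along
    callback(null, chunk.toString().toUpperCase());
  }
});

// Anything typed into stdin is echoed back in upper case
process.stdin.pipe(upperCase).pipe(process.stdout);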