Streams for Large Data Processing

Streams in Node.js are ideal for processing large amounts of data in chunks, rather than loading it all into memory at once.

// Example: reading a large file line by line with streams
const fs = require('fs');
const readline = require('readline');

// Stream the file instead of reading it into memory in one call
const fileStream = fs.createReadStream('./largeFile.txt');

const rl = readline.createInterface({
    input: fileStream,
    crlfDelay: Infinity // treat \r\n as a single line break
});

// 'line' fires once for each line in the input
rl.on('line', (line) => {
    console.log(`Line from file: ${line}`);
});

// 'close' fires once the whole file has been consumed
rl.on('close', () => {
    console.log('File read complete.');
});

Explanation:

  • fs.createReadStream creates a readable stream, so the file is read in chunks instead of being loaded into memory in one call.
  • readline.createInterface wraps the stream and emits a 'line' event for each line; crlfDelay: Infinity ensures that \r\n sequences count as a single line break.
  • The 'close' event signals that the entire file has been consumed.

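The same chunked model also composes: stream.pipeline connects a source, any transforms, and a destination so that data flows through in chunks with backpressure and error propagation handled for you. Below is a minimal sketch that gzip-compresses a large file; the input and output paths are placeholders, and this is an illustrative pattern rather than part of the example above.

// Sketch: compressing a large file chunk by chunk with stream.pipeline
// (file paths are placeholders)
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');

pipeline(
    fs.createReadStream('./largeFile.txt'),     // source: read in chunks
    zlib.createGzip(),                          // transform: compress each chunk
    fs.createWriteStream('./largeFile.txt.gz'), // destination: write as it arrives
    (err) => {
        if (err) {
            console.error('Pipeline failed:', err);
        } else {
            console.log('Compression complete.');
        }
    }
);

Unlike wiring up 'data' and 'error' listeners by hand, pipeline destroys all of the streams if any of them fails, which is why it is the usual choice for production stream code.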