Node.js streams are a powerful feature for handling data efficiently. When you are dealing with large amounts of data, instead of reading everything into memory before processing, you can use a stream, which reads a chunk of data, processes it, and then moves on to the next chunk.
This way, your program can start processing data as soon as it arrives, rather than waiting for all the data to load, which significantly improves performance when working with big data sets or real-time data sources. There are four types of streams: Readable, Writable, Duplex (both readable and writable), and Transform (a Duplex stream that can modify or transform the data as it is written and read).
Node.js has a built-in module, `stream`, for creating streams.
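To make the Transform type more concrete, here is a minimal sketch built with the `stream` module; the upper-casing behaviour and the use of stdin/stdout are illustrative choices, not requirements:

```
const { Transform } = require('stream');

// A Transform stream receives chunks, may modify them, and pushes the result downstream.
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // chunk is a Buffer by default; convert it, transform it, and pass it along
    callback(null, chunk.toString().toUpperCase());
  }
});

// Pipe stdin through the transform to stdout, e.g. `echo hello | node uppercase.js`
process.stdin.pipe(upperCase).pipe(process.stdout);
```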
Here is a basic example of reading and writing files as streams with the `fs` module:
```
const fs = require('fs');

const readStream = fs.createReadStream('./largeFile.txt');
const writeStream = fs.createWriteStream('./smallerFile.txt');

// Write each chunk to the destination as soon as it is read
readStream.on('data', (chunk) => {
  writeStream.write(chunk);
});
```
In the example above, we use the `fs` module's `createReadStream` method to read a file (`largeFile.txt`) as a stream. We also create a write stream for `smallerFile.txt`. Whenever a chunk of data is read from the read stream (a 'data' event), we write that chunk to the write stream. This copies a large file without ever loading the whole file into memory.
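One caveat: the manual `data`/`write` pattern above does not account for backpressure, i.e. the write stream may accept data more slowly than the read stream produces it. The built-in `pipe()` method manages this for you; a minimal sketch using the same file names:

```
const fs = require('fs');

const readStream = fs.createReadStream('./largeFile.txt');
const writeStream = fs.createWriteStream('./smallerFile.txt');

// pipe() forwards each chunk and pauses the source while the destination's buffer is full
readStream.pipe(writeStream);
```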