Hello folks! Welcome back to a new edition of our tutorial on Node.js. In this section, we will be studying Node.js Streams.
Streams are objects that let you read data from a source or write data to a destination in a continuous manner. In Node.js, there are four types of streams -
- Readable Stream - Stream used for read operations.
- Writable Stream - Stream used for write operations.
- Duplex Stream - Stream used for both read and write operations.
- Transform Stream - A type of duplex stream where the output is computed based on the input (a short sketch follows this list).
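Duplex and Transform streams do not appear in the file-based examples below, so here is a minimal sketch of a Transform stream that upper-cases whatever is written to it. The stream name upperCaseStream and the sample text are just illustrative choices for this sketch, not part of any particular API.

var stream = require('stream');

// Create a Transform stream; the output is computed from the input chunks
var upperCaseStream = new stream.Transform({
   transform: function(chunk, encoding, callback) {
      // Push the transformed data downstream
      this.push(chunk.toString().toUpperCase());
      callback();
   }
});

// Anything written or piped into the stream comes out upper-cased
upperCaseStream.on('data', function(chunk) {
   console.log(chunk.toString());
});

upperCaseStream.write('hello streams');
upperCaseStream.end();

Because a Transform stream is both readable and writable, it can also sit in the middle of a pipe chain, which is exactly how the zlib streams are used later in this tutorial.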
Each of these stream types is an EventEmitter instance and emits several events at different points in time. Some of the commonly used events are -
- data - This event is fired when there is data available for read.
- end - This event is fired when there is no more data to read.
- error - This event is fired when there is any error receiving or writing data.
- finish - This event is fired when all the data has been flushed to the underlying system.
This tutorial offers a basic understanding of the commonly used operations on streams.
READ: Node.js | Buffers
Reading from a Stream
Create a text file, input.txt with the following contents -
WebDesignTutorialz is offering self learning content to teach the world in simple and easy manner!!!!!
Create a js file, main.js using the following code -
var fs = require("fs"); var data = ''; // Create a readable stream var readerStream = fs.createReadStream('input.txt'); // Set the encoding to be utf8. readerStream.setEncoding('UTF8'); // Handle stream events --> data, end, and error readerStream.on('data', function(chunk) { data += chunk; }); readerStream.on('end',function() { console.log(data); }); readerStream.on('error', function(err) { console.log(err.stack); }); console.log("Program Ended");
Now run main.js to see the result -
$ node main.js
Output
Verify the output.
Program Ended
WebDesignTutorialz is offering self learning content to teach the world in simple and easy manner!!!!!
Writing to a Stream
Create a js file, main.js using the following code -
var fs = require("fs"); var data = 'Best Tutorial Platform'; // Create a writable stream var writerStream = fs.createWriteStream('output.txt'); // Write the data to stream with encoding to be utf8 writerStream.write(data,'UTF8'); // Mark the end of file writerStream.end(); // Handle stream events --> finish, and error writerStream.on('finish', function() { console.log("Write completed."); }); writerStream.on('error', function(err) { console.log(err.stack); }); console.log("Program Ended");
Now run main.js to see the result -
$ node main.js
Output
Verify the output.
Program Ended
Write completed.
Now open output.txt created in your current directory; it should have the following -
Best Tutorial Platform
READ: Node.js | Event Loop
Piping the Streams
Piping is a technique where we provide the output of one stream as the input to another stream. It is mainly used to get data from one stream and pass it on to another stream. There is no limit on the number of piping operations. Now we will illustrate a piping example that reads from one file and writes it to another file.
Create a js file, main.js using the following code -
var fs = require("fs"); // Create a readable stream var readerStream = fs.createReadStream('input.txt'); // Create a writable stream var writerStream = fs.createWriteStream('output.txt'); // Pipe the read and write operations // read input.txt and write data to output.txt readerStream.pipe(writerStream); console.log("Program Ended");
Now run main.js to see the result -
$ node main.js
Output
Verify the output.
Program Ended
Open the output.txt created in your current directory; it should have the following -
WebDesignTutorialz is offering self learning content to teach the world in simple and easy manner!!!!!
Chaining the Streams
Chaining is a mechanism to connect the output of one stream to another stream and create a chain of multiple stream operations. It is normally used with piping operations. Now we will use piping and chaining to first compress a file and then decompress the same file.
Create a js file, main.js using the following code -
var fs = require("fs"); var zlib = require('zlib'); // Compress the file input.txt to input.txt.gz fs.createReadStream('input.txt') .pipe(zlib.createGzip()) .pipe(fs.createWriteStream('input.txt.gz')); console.log("File Compressed.");
Now run main.js to see the result -
$ node main.js
Output
Verify the output.
File Compressed.
You will find that input.txt has been compressed and a file input.txt.gz has been created in the current directory. Now let us try to decompress the same file using the following code -
var fs = require("fs"); var zlib = require('zlib'); // Decompress the file input.txt.gz to input.txt fs.createReadStream('input.txt.gz') .pipe(zlib.createGunzip()) .pipe(fs.createWriteStream('input.txt')); console.log("File Decompressed.");
Now run main.js to see the result -
$ node main.js
Output
Verify the output.
File Decompressed.
READ: Node.js | Event Emitter
Alright guys! This is where we are going to be rounding up for this tutorial. In our next tutorial, we will be studying the Node.js File System.
Feel free to ask your questions where necessary and we will attend to them as soon as possible. If this tutorial was helpful to you, you can use the share button to share this tutorial.
Follow us on our various social media platforms to stay updated with our latest tutorials. You can also subscribe to our newsletter in order to get our tutorials delivered directly to your emails.
Thanks for reading and bye for now.