Stream in NodeJS - javascript

I need some help understanding how streams work in Node.js.
Let me explain: I need to write a module that calls a UNIX process (with spawn), and I want to redirect the stdout of this process to a Readable stream.
I want this behavior so I can export the Readable stream and allow another module to read from it.
To do this, I have written a little piece of code:
var spawn = require('child_process').spawn
var Duplex = require('stream').Duplex;
var stream = new Duplex;
var start = function() {
ps = spawn('mycmd', [/*... args ...*/]);
ps.stdout.pipe(stream);
};
exports.stream = stream;
exports.start = start;
But if I use this module, an exception is thrown saying that the stream doesn't implement the _read method.
Can you help me with this problem?
Thanks in advance.
[EDIT] I have tried the solution of creating a Stream object, but it doesn't work. Here is the code:
var spawn = require('child_process').spawn;
var Stream = require('stream');
var ps = null;
var audio = new Stream;
audio.readable = audio.writable = true;
var start = function() {
if(ps == null) {
ps = spawn('mycmd', []);
ps.stdout.pipe(audio);
}
};
var stop = function() {
if(ps) {
ps.kill();
ps = null;
}
};
exports.stream = audio;
exports.start = start;
exports.stop = stop;
But when I try to listen to the stream, I encounter a new error:
_stream_readable.js:583
var written = dest.write(chunk);
^
TypeError: Object #<Stream> has no method 'write'

Most of Node's Stream classes aren't meant to be used directly, but as the base of a custom type:
Note that stream.Duplex is an abstract class designed to be extended with an underlying implementation of the _read(size) and _write(chunk, encoding, callback) methods as you would with a Readable or Writable stream class.
One notable exception is stream.PassThrough, which is a simple echo stream implementation.
var PassThrough = require('stream').PassThrough;
var stream = new PassThrough;
Also note that ps will be a global, making it directly accessible in all other modules.
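Putting this together, the original module could use a PassThrough instead of a Duplex; a minimal sketch (mycmd and its arguments are placeholders from the question, and declaring ps with var avoids the accidental global mentioned above):
var spawn = require('child_process').spawn;
var PassThrough = require('stream').PassThrough;

// PassThrough simply echoes whatever is written into it, so it can be
// exported for reading while the child's stdout is piped into it.
var stream = new PassThrough();

var start = function() {
  var ps = spawn('mycmd', [/* ... args ... */]);
  ps.stdout.pipe(stream);
};

exports.stream = stream;
exports.start = start;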

If you simply want to use a plain Stream, then you should do:
var stream = new Stream;
stream.readable = stream.writable = true;
Duplex is meant for implementers: methods like _read and _write need to be implemented before it can be used.
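To illustrate what implementing those methods looks like, here is a bare-bones sketch of a Duplex subclass that just hands written chunks to its readable side (essentially what PassThrough already does for you):
var Duplex = require('stream').Duplex;
var util = require('util');

function EchoDuplex(options) {
  Duplex.call(this, options);
}
util.inherits(EchoDuplex, Duplex);

// _write receives chunks from the writable side; here we push them
// straight to the readable side.
EchoDuplex.prototype._write = function (chunk, encoding, callback) {
  this.push(chunk);
  callback();
};

// _read is a no-op because data is pushed from _write as it arrives.
EchoDuplex.prototype._read = function (size) {};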
[Update]
OK, you have a data source: the process's stdout. You will need a write function; use this:
stream.write = function(data){this.emit('data', data);};
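With that shim, piping works because pipe only needs a write method on its destination. Note that pipe will also call stream.end() when stdout ends, so the old-style stream needs a matching end shim, for example:
stream.end = function(){this.emit('end');};
A consumer of the module (the path is hypothetical) could then do:
var mod = require('./mymodule');
mod.start();
mod.stream.on('data', function(data) {
  console.log('chunk:', data.toString());
});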

Related

How to execute / access local file from Thunderbird WebExtension?

I'd like to write a Thunderbird add-on that encrypts stuff. For this, I have already extracted all the data from the compose window. Now I have to save it into files and run a local executable for encryption. But I found no way to save files or execute an executable on the local machine. How can I do that?
I found the File and Directory Entries API documentation, but it doesn't seem to work. I always get undefined when trying to get the object with this code:
var filesystem = FileSystemEntry.filesystem;
console.log(filesystem); // --> undefined
At the very least, is there a working add-on that I can examine to find out how this works, and what permissions I might have to request in the manifest.json?
NOTE: Must work cross-platform (Windows and Linux).
The answer is that WebExtensions are currently not able to execute local files. Saving to a local folder on the disk is not possible either.
Instead, you need to add a WebExtension Experiment to your project and use the legacy APIs there. In the experiment you can use the IOUtils and FileUtils modules to reach your goal:
Execute a file:
In your background JS file:
var ret = await browser.experiment.execute("/usr/bin/executable", [ "-v" ]);
In the experiment you can execute like this:
var { ExtensionCommon } = ChromeUtils.import("resource://gre/modules/ExtensionCommon.jsm");
var { FileUtils } = ChromeUtils.import("resource://gre/modules/FileUtils.jsm");
var { XPCOMUtils } = ChromeUtils.import("resource://gre/modules/XPCOMUtils.jsm");
XPCOMUtils.defineLazyGlobalGetters(this, ["IOUtils"]);
async execute(executable, arrParams) {
  var fileExists = await IOUtils.exists(executable);
  if (!fileExists) {
    Services.wm.getMostRecentWindow("mail:3pane")
      .alert("Executable [" + executable + "] not found!");
    return false;
  }
  var progPath = new FileUtils.File(executable);
  let process = Cc["@mozilla.org/process/util;1"].createInstance(Ci.nsIProcess);
  process.init(progPath);
  process.startHidden = false;
  process.noShell = true;
  process.run(true, arrParams, arrParams.length);
  return true;
},
Save an attachment to disk:
In your background JS file you can do it like this:
var f = await messenger.compose.getAttachmentFile(attachment.id);
var blob = await f.arrayBuffer();
var t = await browser.experiment.writeFileBinary(tempFile, blob);
In the experiment you can then write the file like this:
async writeFileBinary(filename, data) {
  // convert the arrayBuffer to a Uint8Array first
  var uint8 = new Uint8Array(data);
  // then write it to disk
  var ret = await IOUtils.write(filename, uint8);
  return ret;
},
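For completeness: the experiment has to be declared in the add-on's manifest.json so that browser.experiment becomes available. Roughly like this, where the schema and implementation file names are assumptions (see the WebExtension Experiments documentation for the exact shape):
"experiment_apis": {
  "experiment": {
    "schema": "experiment/schema.json",
    "parent": {
      "scope": "addon_parent",
      "script": "experiment/implementation.js",
      "paths": [["experiment"]]
    }
  }
}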
IOUtils documentation:
https://searchfox.org/mozilla-central/source/dom/chrome-webidl/IOUtils.webidl
FileUtils documentation:
https://searchfox.org/mozilla-central/source/toolkit/modules/FileUtils.jsm

Fake a Node.js Readable file stream from a JavaScript object

I want to create a "fake file" from a javascript object, as the libarry I am using is expecting a file as input, but I have an object in memory instead.
So it is expecting code something like this
var file = fs.readFileSync('{/path/to/file}');
lib.addDocument(config, file);
I want to create a fake file from an object I have called payload, and send that instead. My closest attempt so far looks like this:
var fake_file = new stream.Readable({ objectMode: true });
fake_file.push(msg.payload);
fake_file.push(null);
lib.addDocument(config, fake_file);
I feel I am close, but I can't quite get it to work. The current error is:
{ Error: Unexpected end of multipart data
var Readable = require('stream').Readable;
var obj = { objectMode: true }; // the object you want to stream
var rStream = new Readable();
rStream._read = function () {}; // noop; data is pushed below
rStream.push(JSON.stringify(obj));
rStream.push(null); // EOF
lib.addDocument(config, rStream);
For older versions below v10 and above v4:
var Readable = require('stream').Readable
var obj = { objectMode: true }; // the object you want to stream
var chars = JSON.stringify(obj).split('');
function read(n) { this.push(chars.shift() || null); } // pushing null when exhausted signals EOF
var rStream = new Readable({read:read});
lib.addDocument(config, rStream)
Your code is missing an Object-to-String conversion. You may use JSON.stringify for that:
lib.addDocument(config, JSON.stringify(msg.payload));
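On Node 10.17 or later, the same thing can be done with Readable.from; a sketch using the question's msg.payload:
const { Readable } = require('stream');

// Wrap the stringified payload in an array so it is emitted as a single
// chunk (a bare string may be iterated character by character).
const fake_file = Readable.from([JSON.stringify(msg.payload)]);
lib.addDocument(config, fake_file);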

Node JS - How to write stream big json data into json array file?

I am having difficulty writing JSON data into a JSON file using the stream module.
I learned about this from several blog tutorials, one of them being this page.
Let's say I am working with big JSON data in a JSON file. I think it is not possible to store all the JSON objects in memory, so I decided to do it using the stream module.
Here is the code I have done:
writeStream.js
var Writable = require('stream').Writable,
util = require('util');
var WriteStream = function() {
Writable.call(this, {
objectMode: true
});
};
util.inherits(WriteStream, Writable);
WriteStream.prototype._write = function(chunk, encoding, callback) {
console.log('write : ' + JSON.stringify(chunk));
callback();
};
module.exports = WriteStream;
readStream.js
var data = require('./test_data.json'),
Readable = require('stream').Readable,
util = require('util');
var ReadStream = function() {
Readable.call(this, {
objectMode: true
});
this.data = data;
this.curIndex = 0;
};
util.inherits(ReadStream, Readable);
ReadStream.prototype._read = function() {
if (this.curIndex === this.data.length) {
return this.push(null);
}
var data = this.data[this.curIndex++];
console.log('read : ' + JSON.stringify(data));
this.push(data);
};
module.exports = ReadStream;
Called with this code:
var ReadStream = require('./readStream.js'),
WriteStream = require('./writeStream.js');
var rs = new ReadStream();
var ws = new WriteStream();
rs.pipe(ws);
Problem: I want to write it into a different file. How is that possible?
Can you please help me?
If you are looking for a solution to just write the data from your ReadStream into a different file, you can try fs.createWriteStream. It will return a writable stream which your ReadStream can be piped into directly.
You will have to make a minor change in your readStream.js. You are currently pushing an object, thus making it an object stream, while a write stream expects either a String or a Buffer unless started in object mode. So you can do one of the following:
Start the write stream in object mode. More info here.
Push a String or Buffer in your read stream, as the writable stream internally calls writable.write, which expects either a String or a Buffer. More info here.
If we follow the second option as an example, then your readStream.js should look like this:
var data = require('./test_data.json'),
Readable = require('stream').Readable,
util = require('util');
var ReadStream = function() {
Readable.call(this, {
objectMode: true
});
this.data = data;
this.curIndex = 0;
};
util.inherits(ReadStream, Readable);
ReadStream.prototype._read = function() {
if (this.curIndex === this.data.length) {
return this.push(null);
}
var data = this.data[this.curIndex++];
console.log('read : ' + JSON.stringify(data));
this.push(JSON.stringify(data));
};
module.exports = ReadStream;
You can call the above using the following code:
var ReadStream = require('./readStream.js');
const fs = require('fs');
var rs = new ReadStream();
const file = fs.createWriteStream('/path/to/output/file');
rs.pipe(file);
This will write the data from test_data.json to the output file.
Also, as good practice and to reliably detect write errors, add a listener for the 'error' event. For the above code, you can add the following:
file.on('error',function(err){
console.log("err:", err);
});
Hope this helps.
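Since the question asks for a JSON array file, note that the code above writes the stringified objects back to back, which is not valid JSON. One way to produce a proper array is to write the brackets and commas yourself; a minimal sketch (the output path is a placeholder):
var ReadStream = require('./readStream.js');
var fs = require('fs');

var rs = new ReadStream();
var file = fs.createWriteStream('/path/to/output/file');
var first = true;

file.write('[');
rs.on('data', function (chunk) {
  // chunk is already a JSON string (see _read above); prepend a comma
  // before every element except the first to keep the array valid
  file.write((first ? '' : ',') + chunk);
  first = false;
});
rs.on('end', function () {
  file.end(']');
});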

Converting a Buffer into a ReadableStream in Node.js

I have a library that takes a ReadableStream as input, but my input is just a base64-format image. I could convert the data I have into a Buffer like so:
var img = new Buffer(img_string, 'base64');
But I have no idea how to convert it to a ReadableStream or convert the Buffer I obtained to a ReadableStream.
Is there a way to do this?
For Node.js 10.17.0 and up:
const { Readable } = require('stream');
const stream = Readable.from(myBuffer);
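A quick usage sketch, piping the wrapped buffer into a file (the destination path is a placeholder):
const fs = require('fs');
const { Readable } = require('stream');

Readable.from(myBuffer).pipe(fs.createWriteStream('/tmp/out.bin'));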
something like this...
import { Readable } from 'stream'
const buffer = Buffer.from(img_string, 'base64') // new Buffer(...) is deprecated
const readable = new Readable()
readable._read = () => {} // _read is required but you can noop it
readable.push(buffer)
readable.push(null)
readable.pipe(consumer) // consume the stream
In the general course, a readable stream's _read function should collect data from the underlying source and push it incrementally, ensuring you don't harvest a huge source into memory before it's needed.
In this case, though, you already have the source in memory, so _read is not required.
Pushing the whole buffer just wraps it in the readable stream API.
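If the source were too large to push in one go, _read could slice it incrementally instead; a sketch of that pattern (the chunk size is an arbitrary choice):
import { Readable } from 'stream'

function chunkedStream(buffer, chunkSize = 16 * 1024) {
  let offset = 0
  const readable = new Readable()
  readable._read = () => {
    if (offset >= buffer.length) {
      readable.push(null) // all data delivered
    } else {
      // push one slice per _read call; Node calls _read again
      // whenever the consumer wants more data
      readable.push(buffer.slice(offset, offset + chunkSize))
      offset += chunkSize
    }
  }
  return readable
}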
Node Stream Buffer is obviously designed for use in testing; the inability to avoid a delay makes it a poor choice for production use.
Gabriel Llamas suggests streamifier in this answer: How to wrap a buffer as a stream2 Readable stream?
You can create a ReadableStream using Node Stream Buffers like so:
// Initialize stream
var myReadableStreamBuffer = new streamBuffers.ReadableStreamBuffer({
frequency: 10, // in milliseconds.
chunkSize: 2048 // in bytes.
});
// With a buffer
myReadableStreamBuffer.put(aBuffer);
// Or with a string
myReadableStreamBuffer.put("A String", "utf8");
The frequency cannot be 0 so this will introduce a certain delay.
You can use the standard Node.js stream API for this: stream.Readable.from.
const { Readable } = require('stream');
const stream = Readable.from(buffer);
Note: Don't convert a buffer to string (buffer.toString()) if the buffer contains binary data. It will lead to corrupted binary files.
You don't need to add a whole npm lib for a single file. I refactored it to TypeScript:
import { Readable, ReadableOptions } from "stream";
export class MultiStream extends Readable {
_object: any;
constructor(object: any, options: ReadableOptions) {
super(object instanceof Buffer || typeof object === "string" ? options : { objectMode: true });
this._object = object;
}
_read = () => {
this.push(this._object);
this._object = null;
};
}
Based on node-streamifier (the best option, as said above).
Here is a simple solution using streamifier module.
const streamifier = require('streamifier');
streamifier.createReadStream(new Buffer ([97, 98, 99])).pipe(process.stdout);
You can use a String, Buffer or Object as its argument.
This is my simple code for this.
import { Readable } from 'stream';

const newStream = new Readable({
  read() {
    this.push(someBuffer);
    this.push(null); // signal EOF, otherwise read() is called again and pushes the buffer repeatedly
  },
});
Try this:
const Duplex = require('stream').Duplex; // core NodeJS API
function bufferToStream(buffer) {
let stream = new Duplex();
stream.push(buffer);
stream.push(null);
return stream;
}
Source:
Brian Mancini -> http://derpturkey.com/buffer-to-stream-in-node/
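Usage, assuming img is the Buffer from the question:
bufferToStream(img).pipe(process.stdout);
This works without implementing _read because all the data is pushed up front and EOF is signalled with push(null), so the stream never needs to ask the missing _read implementation for more.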

How to create streams from string in Node.Js?

I am using a library, ya-csv, that expects either a file or a stream as input, but I have a string.
How do I convert that string into a stream in Node?
As @substack corrected me in #node, the new streams API in Node v0.10 makes this easier:
const Readable = require('stream').Readable;
const s = new Readable();
s._read = () => {}; // redundant? see update below
s.push('your text here');
s.push(null);
… after which you can freely pipe it or otherwise pass it to your intended consumer.
It's not as clean as the resumer one-liner, but it does avoid the extra dependency.
(Update: in v0.10.26 through v9.2.1 so far, a call to push directly from the REPL prompt will crash with a not implemented exception if you didn't set _read. It won't crash inside a function or a script. If inconsistency makes you nervous, include the noop.)
Do not use Jo Liss's resumer answer. It will work in most cases, but in my case it cost me a good 4 or 5 hours of bug hunting. There is no need for third-party modules to do this.
NEW ANSWER:
var Readable = require('stream').Readable
var s = new Readable()
s.push('beep') // the string you want
s.push(null) // indicates end-of-file basically - the end of the stream
This should be a fully compliant Readable stream. See here for more info on how to use streams properly.
OLD ANSWER:
Just use the native PassThrough stream:
var stream = require("stream")
var a = new stream.PassThrough()
a.write("your string")
a.end()
a.pipe(process.stdout) // piping will work as normal
/*stream.on('data', function(x) {
// using the 'data' event works too
console.log('data '+x)
})*/
/*setTimeout(function() {
// you can even pipe after the scheduler has had time to do other things
a.pipe(process.stdout)
},100)*/
a.on('end', function() {
console.log('ended') // the end event will be called properly
})
Note that the 'close' event is not emitted (which is not required by the stream interfaces).
From Node 10.17, stream.Readable has a from method to easily create streams from any iterable (which includes array literals):
const { Readable } = require("stream")
const readable = Readable.from(["input string"])
readable.on("data", (chunk) => {
console.log(chunk) // will be called once with `"input string"`
})
Note that, at least between 10.17 and 12.3, a string is itself an iterable, so Readable.from("input string") will work, but emit one event per character. Readable.from(["input string"]) will emit one event per item in the array (in this case, one item).
Also note that in later nodes (probably 12.3, since the documentation says the function was changed then), it is no longer necessary to wrap the string in an array.
https://nodejs.org/api/stream.html#stream_stream_readable_from_iterable_options
Just create a new instance of the stream module and customize it according to your needs:
var Stream = require('stream');
var stream = new Stream();
stream.pipe = function(dest) {
dest.write('your string');
return dest;
};
stream.pipe(process.stdout); // in this case the terminal, change to ya-csv
or
var Stream = require('stream');
var stream = new Stream();
stream.on('data', function(data) {
process.stdout.write(data); // change process.stdout to ya-csv
});
stream.emit('data', 'this is my string');
Edit: Garth's answer is probably better.
My old answer text is preserved below.
To convert a string to a stream, you can use a paused through stream:
through().pause().queue('your string').end()
Example:
var through = require('through')
// Create a paused stream and buffer some data into it:
var stream = through().pause().queue('your string').end()
// Pass stream around:
callback(null, stream)
// Now that a consumer has attached, remember to resume the stream:
stream.resume()
There's a module for that: https://www.npmjs.com/package/string-to-stream
var str = require('string-to-stream')
str('hi there').pipe(process.stdout) // => 'hi there'
Another solution is passing the read function to the constructor of Readable (cf. the stream Readable options documentation):
var s = new Readable({
  read(size) {
    this.push("your string here");
    this.push(null);
  }
});
You can then use s.pipe, for example.
In CoffeeScript:
class StringStream extends Readable
  constructor: (@str) ->
    super()
  _read: (size) ->
    @push @str
    @push null
Use it:
new StringStream('text here').pipe(stream1).pipe(stream2)
I got tired of having to re-learn this every six months, so I just published an npm module to abstract away the implementation details:
https://www.npmjs.com/package/streamify-string
This is the core of the module:
const Readable = require('stream').Readable;
const util = require('util');
function Streamify(str, options) {
if (! (this instanceof Streamify)) {
return new Streamify(str, options);
}
Readable.call(this, options);
this.str = str;
}
util.inherits(Streamify, Readable);
Streamify.prototype._read = function (size) {
var chunk = this.str.slice(0, size);
if (chunk) {
this.str = this.str.slice(size);
this.push(chunk);
}
else {
this.push(null);
}
};
module.exports = Streamify;
str is the string that must be passed to the constructor upon invocation, and which will be output by the stream as data. options are the typical options that may be passed to a stream, per the documentation.
According to Travis CI, it should be compatible with most versions of node.
Here's a tidy solution in TypeScript:
import { Readable } from 'stream'

class ReadableString extends Readable {
  private sent = false

  constructor(private str: string) {
    super()
  }

  _read() {
    if (!this.sent) {
      this.push(Buffer.from(this.str))
      this.sent = true
    } else {
      this.push(null)
    }
  }
}

const stringStream = new ReadableString('string to be streamed...')
In Node.js, you can create a readable stream in a few ways:
SOLUTION 1
You can do it with the fs module. The function fs.createReadStream() allows you to open a readable stream; all you have to do is pass it the path of the file to start streaming from.
const fs = require('fs');
const readable_stream = fs.createReadStream('file_path');
SOLUTION 2
If you don't want to create a file, you can create an in-memory stream and do something with it (for example, upload it somewhere). You can do this with the stream module. You can import Readable from the stream module and create a readable stream. When creating the object, you can also implement the read() method, which is used to read the data out of the internal buffer. If no data is available to be read, null is returned. The optional size argument specifies a specific number of bytes to read. If the size argument is not specified, all of the data contained in the internal buffer will be returned.
const Readable = require('stream').Readable;
const readable_stream = new Readable({
  read(size) {
    // ...
  }
});
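A filled-in version of that stub, pushing a fixed string once and then signalling end-of-stream, might look like this:
const Readable = require('stream').Readable;

let sent = false;
const readable_stream = new Readable({
  read(size) {
    if (!sent) {
      this.push('in-memory data'); // placeholder payload
      sent = true;
    } else {
      this.push(null); // no more data
    }
  }
});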
SOLUTION 3
When you are fetching something over the network, it can be fetched as a stream (for example, fetching a PDF document from some API):
const axios = require('axios');
const response = await axios({
  method: 'get',
  url: 'pdf_resource_url',
  responseType: 'stream'
});
const readable_stream = response.data;
SOLUTION 4
Third-party packages can support creating streams as a feature. That is the way with the aws-sdk (v2) package that is usually used for uploading files to S3 (createReadStream is synchronous on the request object, so no await is needed):
const file = s3.getObject(params).createReadStream();
JavaScript is duck-typed, so if you just copy a readable stream's API, it'll work just fine. In fact, you can probably skip most of those methods or just leave them as stubs; all you'll need to implement is what the library uses. You can use Node's pre-built EventEmitter class to deal with events, too, so you don't have to implement addListener and such yourself.
Here's how you might implement it in CoffeeScript:
class StringStream extends require('events').EventEmitter
  constructor: (@string) -> super()

  readable: true
  writable: false

  setEncoding: -> throw 'not implemented'
  pause: ->   # nothing to do
  resume: ->  # nothing to do
  destroy: -> # nothing to do
  pipe: -> throw 'not implemented'

  send: ->
    @emit 'data', @string
    @emit 'end'
Then you could use it like so:
stream = new StringStream someString
doSomethingWith stream
stream.send()
