Here is what I want to do:
stream = fs.WriteStream('crap.txt', {flags: 'w'});
// more code
response.on('close', function() {
  // is the below line possible?
  fs.stat(stream.name, function(err, stats) {
    console.log(stats.size);
  });
  stream.end();
});
So can I get the filename from the stream object? An illustrated example would be nice, with a reference to good tutorials/docs (containing examples) on writable streams.
It's fs.createWriteStream and stream.path
You can console.dir(stream) to find all its properties.
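For illustration, here is a minimal sketch; fs.stat is run from the end() callback so that the reported size is final:

var fs = require('fs');

var stream = fs.createWriteStream('crap.txt', {flags: 'w'});
stream.write('some data');

stream.end(function() {
  // stream.path holds whatever path was passed to createWriteStream
  fs.stat(stream.path, function(err, stats) {
    if (err) throw err;
    console.log(stats.size);
  });
});

The streams chapter of the official Node docs (https://nodejs.org/api/stream.html) covers the writable-stream methods and events used here.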
I have a function that expects a write stream to which I am providing the following stream:
const logStream = fs.createWriteStream('./log.txt')
fn(logStream)
fn is provided by a third-party module, so I do not control its implementation. Internally, I know that fn eventually does this:
// super simplified
function fn(logStream) {
// ...
stream.pipe(logStream, { end: true })
// ...
}
My issue is that I know the read stream stream contains ANSI escape codes, which I don't want written to my log.txt. After a quick Google search, I found chalk/strip-ansi-stream, which is a transform stream designed to do just that.
So, being the Node streams newbie that I am, I decided to try to modify my code to this:
const stripAnsiStream = require('strip-ansi-stream')
const logStream = fs.createWriteStream('./log.txt')
fn(stripAnsiStream().pipe(logStream))
... which does not work: my log file still contains the ANSI escape codes. I think this is because instead of creating a chain like
a.pipe(b).pipe(c)
I've actually done
a.pipe(b.pipe(c))
How can I apply this transform stream to my write stream without controlling the beginning of the pipe chain where the read stream is provided?
For the purpose of chaining, stream.pipe() returns the destination stream passed to it: the return value of b.pipe(c) is c.
So when you call fn(b.pipe(c)), you're actually bypassing the transform stream b and handing the write stream c directly to fn.
Case #1: a.pipe(b.pipe(c))
b.pipe(c) // evaluates first and returns c
a.pipe(c) // so this is what actually runs; b is bypassed
Case #2: a.pipe(b).pipe(c)
a.pipe(b) // evaluates first and returns b
b.pipe(c) // then b is chained into c
The transform stream can be piped into the log stream, and then passed into the module separately. You're effectively using case #2, but starting the pipes in reverse order.
const fs = require('fs')
const stripAnsiStream = require('strip-ansi-stream')
const fn = require('my-third-party-module')

const transformStream = stripAnsiStream()
const logStream = fs.createWriteStream('./log.txt')

// pipe the transform into the file first...
transformStream.pipe(logStream)
// ...then hand the head of the chain to the module
fn(transformStream)
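Note that fn still controls when the chain shuts down: since it pipes with { end: true }, the end of its source stream ends transformStream, which in turn ends logStream.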
I have a binary application which generates a continuous stream of JSON objects (not an array of JSON objects). A JSON object can sometimes span multiple lines (still a valid JSON object, just pretty-printed).
I can connect to this stream and read it without problems like:
var child = require('child_process').spawn('binary', ['arg', 'arg']);
child.stdout.on('data', data => {
  console.log(data);
});
Streams deliver arbitrarily-chunked buffers in their data events, so I played with the readline module to parse the buffers into lines, and that works (I'm able to JSON.parse() the line) for JSON objects which don't span multiple lines.
The optimal solution would be to listen for events which return a single JSON object, something like:
child.on('json', object => {
});
I have noticed the objectMode option in the Node streams documentation, however I'm getting a stream in Buffer format, so I believe I'm unable to use it.
I had a look on npm at pixl-json-stream and json-stream, but in my opinion neither fits the purpose. There is clarinet-object-stream, but it would require building the JSON object from the ground up based on the parser events.
I'm not in control of the JSON object stream; most of the time one object is on one line, but 10-20% of the time a JSON object spans multiple lines (\n as EOL) with no separator between objects. Each new object always starts on a new line.
Sample stream:
{ "a": "a", "b":"b" }
{ "a": "x",
"b": "y", "c": "z"
}
{ "a": "a", "b":"b" }
There must be a solution already; I'm just missing something obvious. I would rather find an appropriate module than hack together a stream parser with regexps to handle this scenario.
I'd recommend trying to parse every line:
const readline = require('readline');

const rl = readline.createInterface({
  input: child.stdout
});

var tmp = '';
rl.on('line', function(line) {
  tmp += line;
  try {
    // succeeds once the accumulated text forms a complete JSON document
    var obj = JSON.parse(tmp);
    child.emit('json', obj);
    tmp = '';
  } catch (_) {
    // JSON.parse fails while the JSON is incomplete; keep accumulating
  }
});

child.on('json', function(obj) {
  console.log(obj);
});
As the child is an EventEmitter, one can just call child.emit('json', obj).
Having the same requirement, I was uncomfortable enforcing newlines just to support readline, needed to be able to handle starting the read in the middle of a stream (possibly in the middle of a JSON document), and didn't like constantly parsing and checking for errors (it seemed inefficient).
As such I preferred using the clarinet sax parser, collecting the documents as I went and emitting doc events once whole JSON documents have been parsed.
I just published this class to NPM
https://www.npmjs.com/package/json-doc-stream
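For illustration, here is a minimal standalone sketch of the same idea using plain brace-depth tracking instead of clarinet (this is not the published json-doc-stream implementation, and it only handles streams of top-level {...} documents like the sample above):

const { Transform } = require('stream');

class JsonDocCollector extends Transform {
  constructor() {
    super({ readableObjectMode: true }); // buffers in, parsed objects out
    this.doc = '';          // text of the document being collected
    this.depth = 0;         // brace depth; 0 means we're between documents
    this.inString = false;  // inside a JSON string literal
    this.escaped = false;   // previous char was a backslash
  }

  _transform(chunk, enc, cb) {
    // note: assumes multi-byte UTF-8 chars don't split across chunks
    // (use string_decoder's StringDecoder to handle that case)
    for (const ch of chunk.toString('utf8')) {
      if (this.depth === 0 && ch !== '{') continue; // skip separators between docs
      this.doc += ch;
      if (this.inString) {
        if (this.escaped) this.escaped = false;
        else if (ch === '\\') this.escaped = true;
        else if (ch === '"') this.inString = false;
      } else if (ch === '"') {
        this.inString = true;
      } else if (ch === '{') {
        this.depth++;
      } else if (ch === '}' && --this.depth === 0) {
        try {
          this.push(JSON.parse(this.doc)); // emit one whole document
        } catch (e) {
          return cb(e);
        }
        this.doc = '';
      }
    }
    cb();
  }
}

// usage:
child.stdout.pipe(new JsonDocCollector()).on('data', obj => console.log(obj));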
I need to capture in a custom stream outputs of a spawned child process.
child_process.spawn(command[, args][, options])
For example,
var s = fs.createWriteStream('/tmp/test.txt');
child_process.spawn('ifconfig', [], {stdio: [null, s, null]})
Now how do I read from the /tmp/test.txt in real time?
It looks like child_process.spawn is not using stream.Writable.prototype.write nor stream.Writable.prototype._write for its execution.
For example,
s.write = function() { console.log("this will never get printed"); };
As well as,
s.__proto__._write = function() { console.log("this will never get printed"); };
It looks like it uses file descriptors under-the-hood to write from child_process.spawn to a file.
Doing this does not work:
var s2 = fs.createReadStream('/tmp/test.txt');
s2.on("data", function() { console.log("this will never get printed either"); });
So, how can I get the STDOUT contents of a child process?
What I want to achieve is to stream STDOUT of a child process to a socket. If I provide the socket directly to the child_process.spawn as a stdio parameter it closes the socket when it finishes, but I want to keep it open.
Update:
The solution is to use the default {stdio: ['pipe', 'pipe', 'pipe']} options and listen on the created .stdout of the child process.
var cmd = child_process.spawn('ifconfig');
cmd.stdout.on("data", (data) => { ... });
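Side note on the original goal: to stream stdout to a socket without the socket being closed when the child exits, pipe with { end: false }. A sketch, assuming socket is an already-connected net.Socket:

var cmd = child_process.spawn('ifconfig');

// { end: false } stops pipe() from closing the destination when stdout ends
cmd.stdout.pipe(socket, { end: false });

cmd.on('close', function(code) {
  // the socket is still open here for further writes
});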
Now, to up the ante, a more challenging question:
-- How do you read the STDOUT of the child process and still preserve the colors?
For example, if you send STDOUT to process.stdout like so:
child_process.spawn('ifconfig', [], {stdio: [null, process.stdout, null]});
it will keep the colors and print colored output to the console, because the .isTTY property is set to true on process.stdout.
process.stdout.isTTY // true
Now if you use the default {stdio: ['pipe', 'pipe', 'pipe']}, the data you will read will be stripped of console colors. How do you get the colors?
One way to do that would be to create your own custom stream with fs.createWriteStream, because child_process.spawn requires your streams to have a file descriptor, and then set .isTTY of that stream to true to preserve colors.
And finally you would need to capture the data that child_process.spawn writes to that stream, but since child_process.spawn does not use the stream's .prototype.write nor .prototype._write, you would need to capture its contents in some other hacky way.
That's probably why child_process.spawn requires your stream to have a file descriptor because it bypasses the .prototype.write call and writes directly to the file under-the-hood.
Any ideas how to implement this?
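One partial workaround I know of (it only helps when the child program honors it, e.g. chalk-based CLIs; the CLI name below is hypothetical) is to force the child to emit ANSI codes even though its stdout is a pipe, via the FORCE_COLOR convention or the program's own --color flag:

var child = child_process.spawn('some-chalk-cli', [], {
  // FORCE_COLOR is honored by chalk/supports-color; other programs may need --color
  env: Object.assign({}, process.env, { FORCE_COLOR: '1' })
});

child.stdout.on('data', function(data) {
  process.stdout.write(data); // chunks arrive with ANSI color codes intact
});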
You can do it without using a temporary file:
// don't name this variable `process`; that would shadow the global object
var child = child_process.spawn(command, args, options);
child.stdout.on('data', function (chunk) {
  console.log(chunk.toString()); // chunk is a Buffer
});
Hi, I'm on my phone, but I will try to guide you as best I can; I will clarify when I'm near a computer if needed.
What I think you want is to read the stdout from a spawn and do something with the data?
You can assign the spawn to a variable instead of just running the function, e.g.:
var child = spawn('ifconfig');
Then listen to the output like:
child.stdout.on('data', function(data) {
  console.log(data.toString());
});
You could use that to write the data then to a file or whatever you may want to do with it.
The stdio option requires file descriptors, not stream objects, so one way to do it is to use fs.openSync() to create an output file descriptor and use that.
Taking your first example, but using fs.openSync():
var s = fs.openSync('/tmp/test.txt', 'w');
var p = child_process.spawn('ifconfig', [], {stdio: [process.stdin, s, process.stderr]});
You could also set both stdout and stderr to the same file descriptor (for the same effect as bash's 2>&1).
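For example, reusing the same descriptor for both:

var s = fs.openSync('/tmp/test.txt', 'w');
var p = child_process.spawn('ifconfig', [], {stdio: [process.stdin, s, s]});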
You'll need to close the file when you are done, so:
p.on('close', function(code) {
  fs.closeSync(s);
  // do something useful with the exit code ...
});
Scenario: I have multiple methods doing different tasks, handled by different developers. I am trying to make a generic method call which logs when an error occurs, so I need to log a line number, method name, etc.
I wrote a generic function, as follows:
var fs = require('fs');

function enterLog(sourcefile, methodName, LineNo)
{
  fs.appendFile('errlog.txt', sourcefile + '\t' + methodName + '\t' + LineNo + '\n', function(e) {
    if (e)
      console.log('Error Logger Failed in Appending File! ' + e);
  });
}
So the call to the above method has to pass the source file, method name and line number, which may change at any point during development.
E.g., calling the method with hard-coded values:
enterLog('hardcodedFileName.js', 'TestMethod()', '27');
Question: Is it better to hard-code the values (as in the example above), or is there any way to get the method name & line number in Node.js?
There is a nice module out there which we use in our application's logger; you can even fetch the line number: https://npmjs.org/package/traceback
So you could rewrite it like this:
var traceback = require('traceback');
function enterLog() {
  var tb = traceback()[1]; // 1 because 0 is the enterLog function itself
  fs.appendFile('errlog.txt', tb.file + '\t' + tb.method + '\t' + tb.line + '\n', function(e) {
    if (e) {
      console.log('Error Logger Failed in Appending File! ' + e);
    }
  });
}
and just call:
enterLog();
from wherever you want, and always get the correct results.
Edit: another hint. Not hardcoding your filename is easy to achieve in Node.js without third-party module dependencies:
var path = require('path');
var currentFile = path.basename(__filename); // where __filename always has the absolute path of the current file
I recently updated to a newer version of Node.js (0.10~) from 0.8~, and I've been getting a message when running that says:
util.pump() is deprecated. Use readableStream.pipe() instead.
I've tried to switch my functions to say readableStream.pipe(), but I don't think it's working the same.
So I have three questions:
1. Why is util.pump deprecated?
2. How do I switch to readableStream.pipe()?
3. Or: how do I turn off this warning?
Here is the code where I'm using it (with Mustache):
var stream = mu.compileAndRender(template_file, json_object_from_db);
util.pump(stream, res);
When I replace util.pump with readableStream.pipe, I get this error:
ReferenceError: readableStream is not defined
Can anyone help point me in the right direction?
Okay, so this question had a pretty easy answer after some more experimentation (though the documentation was no help).
Basically, readableStream is just a placeholder you're supposed to substitute with your own stream. So in my case, the answer is:
stream.pipe(res);
You just replace util, basically, with the stream. Easy peezy.
I think the following link will help with your work: https://groups.google.com/forum/#!msg/nodejs/YWQ1sRoXOdI/3vDqoTazbQQJ
var readS = fs.createReadStream("fileA.txt");
var writeS = fs.createWriteStream("fileB.txt");
util.pump(readS, writeS, function(error) {
  // Operation done
});

becomes:
var readS = fs.createReadStream("fileA.txt");
var writeS = fs.createWriteStream("fileB.txt");
readS.pipe(writeS);
readS.on("end", function() {
// Operation done
});
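One caveat with this translation: 'end' on the read stream fires when the source is exhausted, not when the data has been flushed to fileB.txt. If you need the latter, listen for 'finish' on the write stream instead:

readS.pipe(writeS);
writeS.on("finish", function() {
  // all data has been flushed to fileB.txt
});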
You can look at this link: http://nodejs.cn/api/stream.html
The 'pipe' event is emitted whenever the stream.pipe() method is called on a readable stream, adding this writable to its set of destinations.
var assert = require('assert');

var writer = getWritableStreamSomehow();
var reader = getReadableStreamSomehow();
writer.on('pipe', (src) => {
  console.error('something is piping into the writer');
  assert.equal(src, reader);
});
reader.pipe(writer);