Node.js: When do I need to 'require' a module? - javascript

I'm fairly new to Node.js and server-side programming in general, and still a bit confused about basic Node rules. To start an HTTP server, I need to require the http module. The server itself gives me a request and a response object, which (as I understand it, correct me if I'm wrong) are both EventEmitters and stream objects. I can still use methods like req.on() and res.write() without requiring the stream and events modules. However, when I try to use the pipe function, req.pipe(res), an error occurs saying that the pipe function is not defined. I assume this happens because I didn't include the stream module. How come I can use certain stream functions without requiring modules, but not others?

It seems like you're pretty new to JavaScript as well, but I'll do my best to explain this.
require basically imports a module's exports so you can use the functionality it provides. But if you already have an object, then you don't need to import (require) anything, because the object already exists.
For instance, if you want to create a stream, you'll need to require the stream module, since creating a stream is functionality you have to import. However, your function can still take a stream as a parameter and use it, since that object already exists.
const Stream = require('stream');
// need to require 'stream' since we'd like to use it to create an object
const mystream = new Stream();

// no need to require 'stream' here since you're given the object
function doSomethingWithStream(streamObject) {
  streamObject.on('data', () => { /* do something */ });
}
So in your case you don't need to require stream, since you already have the objects. Your res object should even have a pipe method (see the example below), but calling pipe on res won't work, because piping only goes from a Readable stream to a Writable stream, and res is a Writable stream (see the Node stream docs).
const express = require('express');
const app = express();

app.get('/has-pipe', (req, res) => {
  const success = !!res.pipe; // true if res.pipe exists
  req.pipe(res); // Readable stream to Writable stream
  res.send({ success }); // returns true
});

const port = 3000;
app.listen(port, () => console.log(`Example server running on port ${port}`));

Related

Node read line by line, process and store

I have the following code in Node.js which reads from a file, line by line. I want to do stuff to each line and store it in an array. The array would then be used in other functions in the same file. The problem I'm running into is the async nature of reading the stream which results in an empty array. The solutions I've come across all seem to rely on modules.
function processLine(file) {
  const fs = require('fs');
  const readline = require('readline');

  const input = fs.createReadStream(file);
  const rl = readline.createInterface({ input });

  const arr = [];
  rl.on('line', (line) => {
    // do stuff to data and store in array
  });
  // return array;
}
I am aware of being able to store the chunks and operate on the whole file with input.on('end', cb)... However, I feel like this would put too much functionality within the cb. Plus I still can't use its return value since it's async. I guess my question is: is there a way to store data being read and use it within the file?
If you would like to process elements like chunks, take a look at
highWaterMark
https://nodejs.org/api/stream.html#stream_types_of_streams
You will probably be interested in
objectMode
as well.
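For instance, a minimal sketch of a Readable configured with both options (the values here are purely illustrative, not recommendations):

const { Readable } = require('stream');

const source = new Readable({
  objectMode: true,  // chunks are arbitrary JS values instead of Buffers
  highWaterMark: 4,  // buffer at most 4 objects before applying backpressure
  read() {}          // data is pushed in from elsewhere via source.push(obj)
});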
There are also interfaces you can use when working with streams:
Readable
Writable
Duplex
Transform
https://nodejs.org/api/stream.html#stream_transform_transform_chunk_encoding_callback
In a Transform you can use any Promise-based function and simply call the callback to finish processing an element at the right point in time:
_transform = function(data, encoding, callback) {
  this.push(data);
  callback();
};
or
https://nodejs.org/api/stream.html#stream_class_stream_transform
_write(chunk, encoding, callback) {
  // ...
}
However, there is another solution - an rxjs binding for Node streams - which you could use while processing elements.
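To make the Transform approach concrete, here is a minimal runnable sketch; the name upperCaser and the uppercase transformation are only illustrative:

const { Transform } = require('stream');

// illustrative Transform: uppercases each incoming object-mode chunk
const upperCaser = new Transform({
  objectMode: true,
  transform(chunk, encoding, callback) {
    this.push(String(chunk).toUpperCase()); // pass the processed element downstream
    callback(); // signal that this element is fully processed
  }
});

// usage: pipe any object-mode stream through it, e.g.
// someSource.pipe(upperCaser).pipe(someDestination);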

Mocking Redis Constructor with Sinon

I'm trying to figure out a way to mock redis in this module:
const Redis = require('ioredis');

const myFunction = {
  exists(thingToCheck) {
    let redis_client = new Redis(
      6379,
      process.env.REDIS_URL,
      {
        connectTimeout: 75,
        dropBufferSupport: true,
        retryStrategy: functionHere
      });
    redis_client.exists(thingToCheck, function (err, resp) {
      // handlings in here
    });
  }
};
Using this test-code:
const LambdaTester = require('lambda-tester');
const chai = require('chai');
const expect = chai.expect;
const sinon = require('sinon');

const mockRedis = sinon.mock(require('ioredis'));

describe('A Redis Connection error', function() {
  before(() => {
    mockRedis.expects('constructor').returns({
      exists: (sha, callback) => {
        callback('error!', null);
      }
    });
  });

  it('It returns a database error', function() {
    return LambdaTester(lambdaToTest)
      .event(thingToCheck)
      .expectError((err) => {
        expect(err.message).to.equal('Database error');
      });
  });
});
I also tried a few variations, but I'm kind of stuck as I basically need to mock the constructor and I'm not sure Sinon supports this?
mockRedis.expects('exists').returns(
  (thing, callback) => {
    callback('error!', null);
  }
);

sinon.stub(mockRedis, 'constructor').callsFake(() => console.log('test!'));
sinon.stub(mockRedis, 'exists').callsFake(() => console.log('test!'));
Not sure what else to try here. I also tried using rewire as suggested here, but mockRedis.__set__("exists", myMock); never set that private variable.
I want to fake my error paths ultimately.
I would love to hear what others are doing to test redis in node js 😄.
Your problem is not whether Sinon supports this or that, but rather a missing understanding of how "classes" work in ECMAScript, as shown by the attempt at stubbing the constructor property in the test code. That will never work, as that property has nothing to do with how any resulting objects turn out; it is simply a reference to the function that was used to create the object. I have covered a very similar topic on the Sinon tracker that you might be interested in reading to gain some core JS foo :-) Basically, it is not possible to stub a constructor, but you can probably coerce your code to use another constructor function in its place through either DI or link seams.
As a matter of fact, a few answers down in the same thread, you will see me covering an example of how I designed a Redis-using class to be easily testable by supporting dependency injection. You might want to check it out, as it is more or less directly applicable to your example module above.
Another technique, which you have already tried getting to work, is using link seams (with rewire). The Sinon homepage has a nice article on doing this. Both rewire and proxyquire will do the job just fine here: I think you have just complicated the matter a bit by wrapping the require statement with a mock.
Even though I am on the Sinon maintainer team, I never use the mock functionality, so I cannot tell you how to use that, as I think it obscures the testing. To get the basic link seams working using rewire, I would remove all the Sinon code first and get the basic case going (replacing redis with a stubbed module you have created).
Only then, add Sinon code as required.
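To make the DI idea concrete, here is a rough sketch of the question's module rewritten so the constructor can be injected; the makeChecker name and shape are hypothetical, not the linked thread's exact code:

const Redis = require('ioredis');

// hypothetical factory: the Redis constructor is a parameter with a default,
// so production code uses ioredis and tests can inject a fake
function makeChecker(RedisCtor = Redis) {
  return {
    exists(thingToCheck, callback) {
      const redis_client = new RedisCtor(6379, process.env.REDIS_URL);
      redis_client.exists(thingToCheck, callback);
    }
  };
}

// in a test, no stubbing of constructors is needed:
// const checker = makeChecker(function FakeRedis() {
//   return { exists: (key, cb) => cb(new Error('error!')) };
// });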

Do something before consuming stream, using highland.js

I'm trying to code a writable stream which takes a stream of objects and inputs them into a mongodb database. Before consuming the stream of objects, I first need to wait for the db-connection to establish, but I seem to be doing something wrong, because the program never gets to the insertion-part.
// ./mongowriter.js
import _ from 'highland';
import mongodb from 'mongodb';

let mongo = mongodb.MongoClient,
    connectToDb = _.wrapCallback(mongo.connect);

export default url => _.pipeline(s => {
  return connectToDb(url).flatMap(db => {
    console.log('Connection established!');
    return s.flatMap(x => { /* insert x into db */ });
  });
});
....
// Usage in other file
import _ from 'highland';
import mongowriter from './mongowriter.js';

let objStream = _([/* json objects */]);
objStream.pipe(mongowriter(url)); // url = your MongoDB connection string
The program just quits without "Connection established!" ever being written to the console.
What am I missing? Is there some kind of idiom I should be following?
By reading the source and some general experimentation, I figured out how to do a single asynchronous thing and then continue processing the stream. Basically, you use flatMap to replace the event from the asynchronous task with the stream you actually want to process.
Another quirk I didn't expect, and which was throwing me off, was that _.pipeline won't work unless the original stream is fully consumed in the callback. That's why simply putting in a _.map and logging stuff (which was how I tried to debug it) doesn't work. Instead, one needs to make sure there is an each or done at the end. Below is a minimal example:
export default _ => _.pipeline(stream => {
  return _(promiseReturningFunction())
    .tap(_ => process.stdout.write('.'))
    .flatMap(_ => stream)
    .each(_ => process.stdout.write('-'));
});

// Will produce something like the following when called with a non-empty stream.
// Note the lone '.' in the beginning.
// => .-------------------
Basically, a '.' is output when the async function is done, and a '-' for every object of the stream.
Hopefully, this saves someone some time. Took embarrassingly long for me to figure this out. ^^

How can I promisify the MongoDB native Javascript driver using bluebird?

I'd like to use the MongoDB native JS driver with bluebird promises. How can I use Promise.promisifyAll() on this library?
The 2.0 branch documentation contains a better promisification guide: https://github.com/petkaantonov/bluebird/blob/master/API.md#promisification
It actually has a mongodb example, which is much simpler:
var Promise = require("bluebird");
var MongoDB = require("mongodb");
Promise.promisifyAll(MongoDB);
When using Promise.promisifyAll(), it helps to identify a target prototype if your target object must be instantiated. In the case of the MongoDB JS driver, the standard pattern is:
1. Get a Db object, using either a MongoClient static method or the Db constructor.
2. Call Db#collection() to get a Collection object.
So, borrowing from https://stackoverflow.com/a/21733446/741970, you can:
var Promise = require('bluebird');
var mongodb = require('mongodb');
var MongoClient = mongodb.MongoClient;
var Collection = mongodb.Collection;
Promise.promisifyAll(Collection.prototype);
Promise.promisifyAll(MongoClient);
Now you can:
var client = MongoClient.connectAsync('mongodb://localhost:27017/test')
  .then(function(db) {
    return db.collection("myCollection").findOneAsync({ id: 'someId' });
  })
  .then(function(item) {
    // Use `item`
  })
  .catch(function(err) {
    // An error occurred
  });
This gets you pretty far, except it'll also help to make sure the Cursor objects returned by Collection#find() are also promisified. In the MongoDB JS driver, the cursor returned by Collection#find() is not built from a prototype. So, you can wrap the method and promisify the cursor each time. This isn't necessary if you don't use cursors, or don't want to incur the overhead. Here's one approach:
Collection.prototype._find = Collection.prototype.find;
Collection.prototype.find = function() {
  var cursor = this._find.apply(this, arguments);
  cursor.toArrayAsync = Promise.promisify(cursor.toArray, cursor);
  cursor.countAsync = Promise.promisify(cursor.count, cursor);
  return cursor;
};
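With that wrapper in place, a usage sketch might look like the following; this assumes db is a connected Db instance, and the collection name and query are made up:

db.collection('myCollection').find({ id: 'someId' })
  .toArrayAsync()
  .then(function(docs) {
    // docs is the full result array from the promisified cursor
    console.log(docs);
  });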
I know this has been answered several times, but I wanted to add a little more information on this topic. Per Bluebird's own documentation, you should use using for cleaning up connections and preventing memory leaks:
Resource Management in Bluebird
I looked all over the place for how to do this correctly, and information was scarce, so I thought I'd share what I found after much trial and error. The data I used below (restaurants) comes from the MongoDB sample data. You can get that here: MongoDB Import Data
// Using dotenv for environment / connection information
require('dotenv').load();

var Promise = require('bluebird'),
    mongodb = Promise.promisifyAll(require('mongodb')),
    using = Promise.using;

function getConnectionAsync() {
  // process.env.MongoDbUrl is stored in my .env file, loaded by the require above
  return mongodb.MongoClient.connectAsync(process.env.MongoDbUrl)
    // .disposer is what handles cleaning up the connection
    .disposer(function(connection) {
      connection.close();
    });
}

// The two methods below retrieve the same data and output the same data,
// but the difference is that the first one does as much as it can through
// the promisified (Async) methods, while the 2nd one uses the driver's
// plain (non-promisified) methods
// NOTE: using limitAsync seems to go away to never-never land and never come back!

// Everything is done asynchronously here with promises
using(
  getConnectionAsync(),
  function(connection) {
    // Because we used promisifyAll(), most (if not all) of the
    // methods in what was promisified now have an Async sibling:
    // collection : collectionAsync
    // find : findAsync
    // etc.
    return connection.collectionAsync('restaurants')
      .then(function(collection) {
        return collection.findAsync();
      })
      .then(function(data) {
        return data.limit(10).toArrayAsync();
      });
  }
  // Before this ".then" is called, the using statement will call the
  // disposer that was set up in the getConnectionAsync method
).then(
  function(data) {
    console.log("end data", data);
  }
);

// Here, only the connection uses a promisified (Async) method - the rest
// go through the driver's plain API
using(
  getConnectionAsync(),
  function(connection) {
    // Because none of the Async functions are used here, these calls use
    // the driver's plain (non-promisified) methods, unlike the version above
    return connection.collection('restaurants').find().limit(10).toArray();
  }
).then(
  function(data) {
    console.log("end data", data);
  }
);
I hope this helps someone else out who wanted to do things by the Bluebird book.
Version 1.4.9 of mongodb should now be easily promisifiable as such:
Promise.promisifyAll(mongo.Cursor.prototype);
See https://github.com/mongodb/node-mongodb-native/pull/1201 for more details.
We have been using the following driver in production for a while now. It's essentially a promise wrapper over the native Node.js driver, and it also adds some additional helper functions.
poseidon-mongo - https://github.com/playlyfe/poseidon-mongo

How to read an entire text stream in node.js?

In RingoJS there's a function called read which allows you to read an entire stream until the end is reached. This is useful when you're making a command line application. For example you may write a tac program as follows:
#!/usr/bin/env ringo
var string = system.stdin.read(); // read the entire input stream
var lines = string.split("\n"); // split the lines
lines.reverse(); // reverse the lines
var reversed = lines.join("\n"); // join the reversed lines
system.stdout.write(reversed); // write the reversed lines
This allows you to fire up a shell and run the tac command. Then you type in as many lines as you wish to and after you're done you can press Ctrl+D (or Ctrl+Z on Windows) to signal the end of transmission.
I want to do the same thing in node.js but I can't find any function that would do so. I thought of using the readSync function from the fs library to simulate it as follows, but to no avail:
fs.readSync(0, buffer, 0, buffer.length, null);
The file descriptor for stdin (the first argument) is 0. So it should read the data from the keyboard. Instead it gives me the following error:
Error: ESPIPE, invalid seek
at Object.fs.readSync (fs.js:381:19)
at repl:1:4
at REPLServer.self.eval (repl.js:109:21)
at rli.on.self.bufferedCmd (repl.js:258:20)
at REPLServer.self.eval (repl.js:116:5)
at Interface.<anonymous> (repl.js:248:12)
at Interface.EventEmitter.emit (events.js:96:17)
at Interface._onLine (readline.js:200:10)
at Interface._line (readline.js:518:8)
at Interface._ttyWrite (readline.js:736:14)
How would you synchronously collect all the data in an input text stream and return it as a string in node.js? A code example would be very helpful.
As node.js is event- and stream-oriented, there is no API to wait until the end of stdin and buffer the result, but it's easy to do manually:
var content = '';
process.stdin.resume();
process.stdin.on('data', function(buf) { content += buf.toString(); });
process.stdin.on('end', function() {
  // your code here
  console.log(content.split('').reverse().join(''));
});
In most cases it's better not to buffer the data and instead process incoming chunks as they arrive (using a chain of already available stream parsers like xml or zlib, or your own FSM parser).
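For example, a minimal sketch of that chunk-by-chunk style using the built-in zlib module (assuming gzipped input on stdin):

var zlib = require('zlib');

// each chunk is decompressed and written out as soon as it arrives;
// nothing is buffered into one big string
process.stdin
  .pipe(zlib.createGunzip())
  .pipe(process.stdout);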
The key is to use these two Stream events:
Event: 'data'
Event: 'end'
For stream.on('data', ...) you should collect your data into either a Buffer (if it is binary) or a string.
For on('end', ...) you should call a callback with your completed buffer, or, if you can, inline it and return it using a Promises library.
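A minimal sketch of that pattern, collecting data from stdin into Buffers (the final toString() is only appropriate for textual data):

var chunks = [];
process.stdin.on('data', function(chunk) {
  chunks.push(chunk); // collect raw Buffer chunks; don't stringify binary data early
});
process.stdin.on('end', function() {
  var complete = Buffer.concat(chunks); // the completed buffer
  console.log(complete.toString());
});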
Let me illustrate StreetStrider's answer.
Here is how to do it with concat-stream
var concat = require('concat-stream');

yourStream.pipe(concat(function(buf) {
  // buf is a Node Buffer instance which contains the entire data in the stream
  // if your stream sends textual data, use buf.toString() to get the entire stream as a string
  var streamContent = buf.toString();
  doSomething(streamContent);
}));

// error handling is still on the stream
yourStream.on('error', function(err) {
  console.error(err);
});
Please note that process.stdin is a stream.
There is a module for that particular task, called concat-stream.
If you are in async context and have a recent version of Node.js, here is a quick suggestion:
const chunks = []
for await (let chunk of readable) {
  chunks.push(chunk)
}
console.log(Buffer.concat(chunks))
On Windows, I had some problems with the other solutions posted here - the program would run indefinitely when there's no input.
Here is a TypeScript implementation for modern Node.js, using async generators and for await - quite a bit simpler and more robust than the old callback-based APIs - and it worked on Windows:
import process from "process";

/**
 * Read everything from standard input and return a string.
 *
 * (If there is no data available, the Promise is rejected.)
 */
export async function readInput(): Promise<string> {
  const { stdin } = process;
  const chunks: Uint8Array[] = [];

  if (stdin.isTTY) {
    throw new Error("No input available");
  }

  for await (const chunk of stdin) {
    chunks.push(chunk);
  }

  return Buffer.concat(chunks).toString('utf8');
}
Example:
(async () => {
  const input = await readInput();
  console.log(input);
})();
(Consider adding a try/catch if you want to handle the Promise rejection and display a more user-friendly error message when there's no input, as sketched below.)
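For instance, a sketch of the caller with such handling (the error message and exit code are just examples):

(async () => {
  try {
    const input = await readInput();
    console.log(input);
  } catch (err) {
    // reached when stdin is a TTY and nothing was piped in
    console.error("No input provided on stdin.");
    process.exitCode = 1;
  }
})();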
