I have some functions that connect to a database (Redis) and return data. They are asynchronous, promise-based, and use streams internally. I read up on testing and chose to go with tape, sinon, and proxyquire, but if I mock such a function, how would I know that it works?
The following function (listKeys) returns, through a promise, all the keys that exist in the Redis DB once the scan completes.
let methods = {
client: client,
// Cache for listKeys
cacheKeys: [],
// Increment and return through promise all keys
// store to cacheKeys;
listKeys: blob => {
return new Promise((resolve, reject) => {
blob = blob ? blob : '*';
let stream = methods.client.scanStream({
match: blob,
count: 10,
})
stream.on('data', keys => {
for (var i = 0; i < keys.length; i++) {
if (methods.cacheKeys.indexOf(keys[i]) === -1) {
methods.cacheKeys.push(keys[i])
}
}
})
stream.on('end', () => {
resolve(methods.cacheKeys)
})
stream.on('error', reject)
})
}
}
So how do you test a function like that?
I think there are a couple of ways to exercise this function in a test, and they all revolve around providing a test stream for the function to consume.
I like to write the test cases I think are important first, then figure out a way to implement them. To me the most important one is something like
it('should resolve cacheKeys on end')
Then a stream needs to be created to provide to your function
var Stream = require('stream');
var stream = new Stream();
Then scan stream needs to be controlled by your test
You could do this by creating a fake client
client = {
scanStream: (config) => { return stream }
}
Then a test can be configured with your assertion
var testKeys = ['t'];
methods.listKeys().then((cacheKeys) => {
assert.deepEqual(cacheKeys, testKeys);
done()
})
Now that your promise is waiting on your stream with an assertion, send data over the stream and then end it so the promise can resolve:
stream.emit('data', testKeys)
stream.emit('end')
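Putting those pieces together, a complete tape test might look roughly like this (a minimal sketch; './methods' is a placeholder path for the module that exports the methods object above):
const test = require('tape');
const Stream = require('stream');

// './methods' is a placeholder path for the module exporting the methods object
const methods = require('./methods');

test('should resolve cacheKeys on end', t => {
  const stream = new Stream();
  const testKeys = ['t'];

  // Replace the real redis client with a fake whose scanStream returns our stream
  methods.client = {
    scanStream: () => stream
  };

  methods.listKeys().then(cacheKeys => {
    t.deepEqual(cacheKeys, testKeys);
    t.end();
  }).catch(t.end);

  // Drive the stream: the 'data' handler collects keys, 'end' resolves the promise
  stream.emit('data', testKeys);
  stream.emit('end');
});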
A simple way to test whether the keys get saved to cacheKeys properly is to mock the DB stream, send data over it, and check whether it got saved correctly. E.g.:
// Create a mock stream to substitute database
var mockStream = new (require('stream').Readable)();
// Create a mock client.scanStream that returns the mocked stream
var client = {
scanStream: function () {
return mockStream;
}
};
// Assign the mocks to methods
methods.client = client;
// Call listKeys(), so the streams get prepared and the promise awaits resolution
methods.listKeys()
.then(function (r) {
// Setup asserts for correct results here
console.log('Promise resolved with: ', r);
});
// Send test data over the mocked stream (scanStream emits arrays of keys)
mockStream.emit('data', ['hello']);
// End the stream to resolve the promise and execute the asserts
mockStream.emit('end');
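If the test runner is tape, the assertion placeholder above could become a concrete check, for example (assuming the test callback's t is in scope):
methods.listKeys()
  .then(function (r) {
    // t is tape's test object
    t.deepEqual(r, ['hello'], 'cacheKeys should contain the keys sent over the stream');
    t.end();
  });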
Currently I have many concurrent identical calls to my backend, differing only on an ID field:
getData(1).then(...) // Each from a React component in a UI framework, so difficult to aggregate here
getData(2).then(...)
getData(3).then(...)
// creates n HTTP requests... inefficient
function getData(id: number): Promise<Data> {
return backend.getData(id);
}
This is wasteful as I make more calls. I'd like to keep my getData() calls, but then aggregate them into a single getDatas() call to my backend, then return all the results to the callers. I have more control over my backend than the UI framework, so I can easily add a getDatas() call on it. The question is how to "mux" the JS calls into one backend call, then "demux" the result into the callers' promises.
const cache = new Map<number, Promise<Data>>()
let requestedIds = []
let timeout = null;
// creates just 1 http request (per 100ms)... efficient!
function getData(id: number): Promise<Data> {
if (cache.has(id)) {
return cache.get(id);
}
requestedIds.push(id)
if (timeout == null) {
timeout = setTimeout(() => {
backend.getDatas(requestedIds).then((datas: Data[]) => {
// TODO: somehow populate many different promises in cache??? but how?
requestedIds = []
timeout = null
})
}, 100)
}
return ???
}
In Java I would create a Map<int, CompletableFuture> and upon finishing my backend request, I would look up the CompletableFuture and call complete(data) on it. But I think in JS Promises can't be created without an explicit result being passed in.
Can I do this in JS with Promises?
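For reference, the resolve function of a Promise can be captured inside the executor and called later, which is the "deferred" pattern the answers below rely on. A minimal sketch with illustrative names:
// resolveData is an illustrative name for the captured resolver
let resolveData;
const promise = new Promise(resolve => {
  resolveData = resolve;
});

// later, when the batched response arrives:
// resolveData(data);  // completes the promise, like CompletableFuture.complete(data)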
It's a little unclear what your end goal looks like. I imagine you could loop through your calls as needed; perhaps something like:
for (let x in cache) {
if (x == id)
return cache[x];
}
//OR
for (let x = 0; x < ids.length; x++) {
getData(ids[x])
}
Might work. You may be able to add a timing method into the mix if needed.
Not sure what your backend consists of, but I do know GraphQL is a good system for making multiple calls.
It may be ultimately better to handle them all in one request, rather than multiple calls.
The cache can be a regular object mapping ids to promise resolution functions and the promise to which they belong.
// cache maps ids to { resolve, reject, promise, requested }
// resolve and reject belong to the promise, requested is a bool for bookkeeping
const cache = {};
You might need to fire only once, but here I suggest setInterval to regularly check the cache for unresolved requests:
// keep the return value, and stop polling with clearInterval()
// if you really only need one batch, change setInterval to setTimeout
function startGetBatch() {
return setInterval(getBatch, 100);
}
The business logic calls only getData() which just hands out (and caches) promises, like this:
function getData(id) {
if (cache[id]) return cache[id].promise;
cache[id] = {};
const promise = new Promise((resolve, reject) => {
Object.assign(cache[id], { resolve, reject });
});
cache[id].promise = promise;
cache[id].requested = false;
return cache[id].promise;
}
By saving the promise along with the resolver and rejecter, we're also implementing the cache, since the resolved promise will provide the thing it resolved to via its then() method.
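As a small illustration, a promise that has already resolved can be then-ed again at any time and keeps yielding the same value, which is what makes it usable as a cache entry:
const entry = Promise.resolve('cached value');

entry.then(v => console.log(v)); // logs 'cached value'
// much later, a second consumer still gets the same resolved value
entry.then(v => console.log(v)); // logs 'cached value' again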
getBatch() asks the server in a batch for the not-yet-requested getData() ids, and invokes the corresponding resolve/reject functions:
function getBatch() {
// collect any ids that have not been requested yet
const ids = [];
Object.keys(cache).forEach(id => {
if (!cache[id].requested) {
cache[id].requested = true;
ids.push(id);
}
});
return backend.getDatas(ids).then(datas => {
Object.keys(datas).forEach(id => {
cache[id].resolve(datas[id]);
})
}).catch(error => {
ids.forEach(id => {
cache[id].reject(error);
delete cache[id]; // so we can retry
})
})
}
The caller side looks like this:
// start polling
const interval = startGetBatch();
// in the business logic
getData(5).then(result => console.log('the result of 5 is:', result));
getData(6).then(result => console.log('the result of 6 is:', result));
// sometime later...
getData(5).then(result => {
// if the promise for an id has resolved, then-ing it still works, resolving again to the -- now cached -- result
console.log('the result of 5 is:', result)
});
// later, whenever we're done
// (no need for this if you change setInterval to setTimeout)
clearInterval(interval);
I think I've found a solution:
interface PromiseContainer<T> {
resolve: (value: T) => void;
reject: (reason?: any) => void;
}
const requests: Map<number, PromiseContainer<Data>> = new Map();
let timeout: number | null = null;
function getData(id: number) {
const promise = new Promise<Data>((resolve, reject) => requests.set(id, { resolve, reject }))
if (timeout == null) {
timeout = setTimeout(() => {
backend.getDatas([...requests.keys()]).then(datas => {
for (let [id, data] of Object.entries(datas)) {
requests.get(Number(id)).resolve(data)
requests.delete(Number(id))
}
}).catch(e => {
requests.forEach(container => container.reject(e))
requests.clear()
})
timeout = null
}, 100)
}
return promise;
}
The key was figuring out I could extract the (resolve, reject) from a promise, store them, then retrieve and call them later.
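To illustrate the effect, several getData() calls made within the same 100 ms window now share a single backend request (a sketch; as above, backend.getDatas() is assumed to resolve with a map from id to Data):
// Three components ask for data independently...
getData(1).then(data => console.log('component A got', data));
getData(2).then(data => console.log('component B got', data));
getData(3).then(data => console.log('component C got', data));

// ...but after the 100 ms window closes, only one
// backend.getDatas([1, 2, 3]) request is actually sent,
// and each stored resolver completes its caller's promise.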
I use the fetch API to retrieve my data from the backend as a stream.
I decrypt the data chunk by chunk and then concatenate the content back together into the original file.
I have found that the stream seems to provide the data differently each time, making the chunks different. How can I force the stream to deliver the chunks in the original sequence?
fetch(myRequest, myInit).then(response => {
var tmpResult = new Uint8Array();
const reader = response.body.getReader();
return new ReadableStream({
start(controller) {
return pump();
function pump() {
return reader.read().then(({ done, value }) => {
// When no more data needs to be consumed, close the stream
if (value) {
//values here are different in order every time
//making my concatenated values different every time
controller.enqueue(value);
var decrypted = cryptor.decrypt(value);
var arrayResponse = decrypted.toArrayBuffer();
if (arrayResponse) {
tmpResult = arrayBufferConcat(tmpResult, arrayResponse);
}
}
// Enqueue the next data chunk into our target stream
if (done) {
if (counter == length) {
callback(obj);
}
controller.close();
return;
}
return pump();
});
}
}
})
})
The documentation tells us that:
Each chunk is read sequentially and output to the UI, until the stream
has finished being read, at which point we return out of the recursive
function and print the entire stream to another part of the UI.
I made a test program with node, using node-fetch:
import fetch from 'node-fetch';
const testStreamChunkOrder = async () => {
return new Promise(async (resolve) => {
let response = await fetch('https://jsonplaceholder.typicode.com/todos/');
let stream = response.body;
let data = '';
stream.on('readable', () => {
let chunk;
while (null !== (chunk = stream.read())) {
data += chunk;
}
})
stream.on('end', () => {
resolve(JSON.parse(data).splice(0, 5).map((x) => x.title));
})
});
}
(async () => {
let results = await Promise.all(Array.from({ length: 10 }, () => testStreamChunkOrder()))
let joined = results.map((r) => r.join(''));
console.log(`Is every result same: ${joined.every((j) => j.localeCompare(joined[0]) === 0)}`)
})()
This one fetches some random todo-list json and streams it chunk-by-chunk, accumulating the chunks into data. When the stream is done, we parse the full json and take the first 5 elements of the todo-list and keep only the titles, after which we then return the result asynchronously.
This whole process is done 10 times. When we have 10 streamed title-lists, we go through each title-list and join the title names together to form a string. Finally we use .every to see if each of the 10 strings are the same, which means that each json was fetched and streamed in the same order.
So I believe the problem lies somewhere else - the streaming itself is working correctly. While I did use node-fetch instead of the actual Fetch API, I think it is safe to say that the actual Fetch API works as it should.
Also I noticed that you are directly calling response.body.getReader(), but when I looked at the documentation, the body.getReader call is done inside another then statement:
fetch('./tortoise.png')
.then(response => response.body)
.then(body => {
const reader = body.getReader();
This might not matter, but considering everything else in your code, such as the excessive wrapping and returning of functions, I think your problems could go away just by reading a couple of tutorials on streams and cleaning up the code a bit. And if not, you will still be in a better position to figure out if the problem is in one of your many functions you are unwilling to expose. Asynchronous code's behavior is inherently difficult to debug and lacking information around such code makes it even harder.
I'm assuming you're using the cipher/decipher family of methods in node's crypto library. We can simplify this using streams by first piping the ReadableStream into a decipher TransformStream (a stream that is both readable and writable) via ReadableStream#pipe().
const { createDecipheriv } = require('crypto');
const { createWriteStream } = require('fs');
const { pipeline } = require('stream');
// change these to match your encryption scheme and key retrieval
const algo = 'aes-256-cbc';
const key = 'my5up3r53cr3t';
// put your initialization vector you've determined here
// leave null if you are not (or the algo doesn't support) using an iv
const iv = null;
// creates the decipher TransformStream
const decipher = createDecipheriv(algo, key, iv);
// write plaintext file here
const destFile = createWriteStream('/path/to/destination.ext');
fetch(myRequest, myInit)
.then(response => response.body)
.then(body => body.pipe(decipher).pipe(destFile))
.then(stream => stream.on('finish', () => console.log('done writing file')));
You may also pipe this to be read out in a buffer, pipe to the browser, etc, just be sure to match your algorithm, key, and iv wherever you're defining your cipher/decipher functions.
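For example, to collect the decrypted output into a Buffer instead of writing a file, the decipher stream's output can be gathered chunk by chunk (a sketch under the same assumptions about algo, key, iv, myRequest, and myInit as above):
const { createDecipheriv } = require('crypto');

// algo, key, iv, myRequest, myInit as defined above
fetch(myRequest, myInit)
  .then(response => response.body)
  .then(body => new Promise((resolve, reject) => {
    const decipher = createDecipheriv(algo, key, iv);
    const chunks = [];
    body.pipe(decipher)
      .on('data', chunk => chunks.push(chunk))        // gather decrypted chunks in order
      .on('end', () => resolve(Buffer.concat(chunks)))
      .on('error', reject);
  }))
  .then(plaintext => {
    // plaintext is a single Buffer containing the decrypted file
  });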
If we take the pattern in that MDN example seriously, we should use the controller to enqueue the decrypted data (not the still encrypted value), and aggregate the results with the stream returned by the first promise. In other words...
return fetch(myRequest, myInit)
// Retrieve its body as ReadableStream
.then(response => {
const reader = response.body.getReader();
return new ReadableStream({
start(controller) {
return pump();
function pump() {
return reader.read().then(({ done, value }) => {
// When no more data needs to be consumed, close the stream
if (done) {
controller.close();
return;
}
// do the computational work on each chunk here and enqueue
// *the result of that work* on the controller stream...
const decrypted = cryptor.decrypt(value);
controller.enqueue(decrypted);
return pump();
});
}
}
})
})
// Create a new response out of the stream
.then(stream => new Response(stream))
// Read the response body as a Blob
.then(response => response.blob())
// Blob#arrayBuffer() resolves with the underlying bytes
.then(blob => blob.arrayBuffer())
.then(arrayResponse => {
// arrayResponse is the properly sequenced result
// if the caller wants a promise to resolve to this, just return it
return arrayResponse;
// OR... the OP code makes reference to a callback. if that's real,
// call the callback with this result
// callback(arrayResponse);
})
.catch(err => console.error(err));
I'm very new to promises. I need my server to wait for socket connections from an external API and from browser clients. The external API sends a number of objects (4 in this example) to the server, which is received through a promise that then calls the function waiting on it. For each object received by the promise, a browser client can make a connection (another promise) and join the game.
I have a function which should wait for variables to be populated by these two promises. It is successful in waiting for the external API objects, but it never receives the promise to indicate that the correct number of clients have made connections.
I wrapped the socket listening for the external API objects in a promise as it will only be sent once. I also call the function which handles the two promises here as it didn't seem to work anywhere else.
//HANDLER FOR GAME OBJECT SENT FROM MAX API
const maxPromise = new Promise ((resolve) => {
socket.on("dictData", async (data)=>{
try {
let {songName, level, imageArr} = data;
let [imageObj] = imageArr;
gameVars.songName = songName;
gameVars.level = level;
let gameObject = {};
for (let obj in imageArr) {
let objectId = imageArr[obj].name;
gameObject.objectId = objectId;
gameObject.path = imageObj.path;
// gameObject.files = imageObj.imagePath;
gameState.totalServerCount ++;
gameState.serverList.push({gameObject});
}
resolve(gameState.serverList) //resolve the promise with the array of objects
sendData()
}
catch (e) {
console.error(e)
}
});
});
I also wrapped the client req listener in a promise because after countless tries to nest the promise inside, this was the only solution which didn't return the actual socket as the promise, so I feel this is probably the closest solution for me.
This promise should only resolve when there are the same number of client connections as there are server objects received in the first promise. I am testing by simply connecting from 4 open tabs to localhost:3000.
//HANDLER FOR CLIENT REQUEST TO JOIN GAME
const playerPromise = new Promise ((resolve, reject) => {
socket.on('joinGame', async () => {
try {
gameState.totalPlayerCount++;
gameState.playerList.push(socket.id)
switch (true) {
case gameState.totalPlayerCount < gameState.totalServerCount :
console.log("Not enough players", gameState.totalPlayerCount)
break;
case gameState.totalPlayerCount <= gameState.totalServerCount :
console.log("Counts are equal", gameState)
readyPlayers = true;
resolve(gameState.playerList)
break;
case gameState.totalServerCount == 0 :
console.log("Server not ready", gameState)
break;
default :
console.log("Too many players", gameState.totalPlayerCount)
reject("Too many players", gameState.playerList)
}
}
catch(e) {
console.error(e);
}
})
})
The sendData() function logs the 1st and 2nd tests to the console, but never the 3rd.
async function sendData() {
try {
console.log("TEST1")
const dataMax = await maxPromise;
console.log("TEST2", dataMax)
const dataPlay = await playerPromise;
console.log("TEST3", dataPlay)
for (var key in await dataPlay) {
io.to(dataPlay[key]).emit('gameData', dataPlay[key]);
}
}
catch(e) {
console.error(e)
}
};
I've looked at every other similar post on stackoverflow and online but cannot find any solution to this or where I'm going wrong. I have also devised the above solution with minimal knowledge of socket.io and promises, so if there is a better/cleaner way to do the above please let me know.
EDIT:
This is my current solution using only one promise, but now the promise is not being populated at all in the send function:
//HANDLER FOR GAME OBJECT SENT FROM MAX API
const maxPromise = new Promise ((resolve) => {
socket.on("dictData", async (data)=>{
try {
let {songName, level, imageArr} = data;
let [imageObj] = imageArr;
gameVars.songName = songName;
gameVars.level = level;
let gameObject = {};
for (let obj in imageArr) {
let objectId = imageArr[obj].name;
gameObject.objectId = objectId;
gameObject.path = imageObj.path;
gameState.totalServerCount ++;
gameState.serverList.push({gameObject});
}
resolve(gameState.serverList)
}
catch (e) {
console.error(e)
}
});
});
async function sendData(playerData) {
try {
console.log("TEST1")
const dataMax = await maxPromise;
console.log("TEST2")
for (var key in await playerData) {
io.to(playerData[key]).emit('gameData', dataMax);
}
}
catch(e) {
console.error(e)
}
};
sendData() is called in the client socket handler, which just passes the array of connections as playerData. "TEST2" is never logged.
Seeing as the promise maxPromise is global, shouldn't it be able to access its value?
You've probably figured this out by now. It's an interesting problem. I'd like to know how you solved it. As I understand it, you need to wait for data to arrive in order to select players. Seems like a good use of promises. You could use socket.once instead of socket.on for dictData if it's a one-time event.
At the same time, you don't want to block players, yet you still need to wait for enough players to join. Awaiting another promise is again a good gating technique.
If you haven't solved all the issues, I suggest first removing socket.io from the equation while developing the gating logic, as sketched below. You can do this in Node with custom event emitters. I'd simulate player and data events occurring at random times. You can also do this in the browser with custom events or broadcast channels. You'll find this more convenient than manually connecting to a port.
I'd put in a lot of logging with millisecond timestamps to easily understand the sequence of events: when they occur and when they're handled.
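For instance, the gating logic can be developed against a plain EventEmitter, with simulated data and player events fired at random times (a sketch; the event names and gameState fields mirror the ones used above):
const EventEmitter = require('events');

const fakeSocket = new EventEmitter();
const gameState = { totalServerCount: 0, totalPlayerCount: 0, serverList: [], playerList: [] };

let resolvePlayers;
const playerPromise = new Promise(resolve => { resolvePlayers = resolve; });

// check on every event so the order of data vs. player arrival doesn't matter
function checkPlayersReady() {
  if (gameState.totalServerCount > 0 &&
      gameState.totalPlayerCount >= gameState.totalServerCount) {
    resolvePlayers(gameState.playerList);
  }
}

// one-time data event from the simulated external API
const maxPromise = new Promise(resolve => {
  fakeSocket.once('dictData', data => {
    gameState.serverList = data.imageArr;
    gameState.totalServerCount = data.imageArr.length;
    resolve(gameState.serverList);
    checkPlayersReady();
  });
});

// simulated player connections
fakeSocket.on('joinGame', id => {
  gameState.playerList.push(id);
  gameState.totalPlayerCount++;
  checkPlayersReady();
});

async function sendData() {
  console.log(Date.now(), 'waiting for data');
  const servers = await maxPromise;
  console.log(Date.now(), 'got data for', servers.length, 'objects');
  const players = await playerPromise;
  console.log(Date.now(), 'players ready:', players);
}

sendData();

// fire the simulated events at random times
setTimeout(() => fakeSocket.emit('dictData', { imageArr: [{}, {}, {}, {}] }), Math.random() * 1000);
for (let i = 1; i <= 4; i++) {
  setTimeout(() => fakeSocket.emit('joinGame', 'player' + i), Math.random() * 2000);
}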
The situation is simple:
I have a Node.js server (call it API-A) that uses Bluebird as its promise library.
I have a client (browser) that asks for data through API-A, which gets the data from another API (API-B). API-B could be a weather service, for example; API-A then aggregates that data with other data and sends it back to the client.
The catch: API-B needs a token with a TTL of 1800 seconds.
So for each request made by the client, I check whether my token is expired or not.
I have this kind of code:
function getActivities()
{
return this.requestAuthorisation()
.then(()=>{
// some other code that is not interesting
})
}
Everything works fine with the glory of promises.
requestAuthorisation() checks if the token is valid and, if not, makes a request to API-B to refresh the token.
The problem is here:
Between the moment the token expires and the moment a fresh one is obtained, some time passes. If 1000 clients make requests during that window, I will send 1000 token requests to API-B, which is not cool.
How can I avoid that? I'd like to avoid a cron-style refresh; it makes unnecessary calls and the problem stays the same anyway.
I tried to create a sort of global boolean that tracks the refreshing status of the token, but I couldn't find anything like a Promise.waitFor(variable change).
Promise.all cannot be used because the calls come from different event scopes.
Is there a way to queue requests until the token is refreshed?
Please help!
If I understand this right, we need to do two things:
Do not call our refresh-token routine several times while one call is in progress.
Once completed, let all the waiting requests know that the token request has completed so that they can continue their work.
If you combine the observer pattern with a flag for the in-progress state, this can be done like below:
// MyObservable.js:
var util = require('util');
var EventEmitter = require('events').EventEmitter;
let inProgress = false;
function MyObservable() {
EventEmitter.call(this);
}
// This is the function responsible for getting a refresh token
MyObservable.prototype.getToken = function(token) {
console.log('Inside getToken');
if (!inProgress) {
console.log('calling fetchToken');
const resultPromise = this.fetchToken();
inProgress = true;
resultPromise.then((result) => {
console.log('Resolved fetch token');
inProgress = false;
this.emit('done', 'token refreshed');
});
}
}
// This is a mock function to simulate the promise based API.
MyObservable.prototype.fetchToken = function(token) {
console.log('Inside fetchToken');
return new Promise((resolve, reject) => {
setTimeout(() => {
console.log('resolving');
resolve("Completed");
}, 2000);
});
}
util.inherits(MyObservable, EventEmitter);
module.exports = MyObservable;
Now we can implement this and observe for the call to complete
const MyObservable = require('./MyObservable');
const observable = new MyObservable();
const A = () => {
console.log('Inside A');
observable.on('done', (message) => {
console.log('Completed A');
});
observable.getToken('test');
}
for (let i = 0; i < 5; i++) {
setTimeout(A, 1000);
}
If we run this code, you will get output where fetchToken is called only once even though our method A is called 5 times during the same duration.
Hope this helps!
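Another common way to satisfy the same two requirements is to cache the in-flight promise itself and hand it to every caller until it settles. A minimal sketch, assuming a fetchToken() that returns a promise for a fresh token from API-B (the 60-second safety margin is an arbitrary choice):
let tokenPromise = null;

function requestAuthorisation() {
  // reuse the pending (or resolved) refresh while it is still valid
  if (!tokenPromise) {
    // fetchToken() is assumed to return a promise for a fresh token (e.g. a call to API-B)
    tokenPromise = fetchToken()
      .then(token => {
        // forget the cached promise shortly before the 1800 s TTL expires
        setTimeout(() => { tokenPromise = null; }, (1800 - 60) * 1000);
        return token;
      })
      .catch(err => {
        tokenPromise = null; // allow a retry after a failure
        throw err;
      });
  }
  return tokenPromise;
}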
I'm currently teaching myself reactive programming with rxjs, and I've set myself a challenge of creating an observable stream which will always emit the same result to a subscriber no matter what.
I've memoized the creation of an HTTP "GET" stream given a specific URL, and I'm trying to act on that stream every two seconds, with the outcome being that for each tick of the timer, I'll extract a cached/memoized HTTP result from the original stream.
import superagent from 'superagent';
import _ from 'lodash';
// Cached GET function, returning a stream that emits the HTTP response object
var httpget = _.memoize(function(url) {
var req = superagent.get(url);
req = req.end.bind(req);
return Rx.Observable.fromNodeCallback(req)();
});
// Assume this is created externally and I only have access to response$
var response$ = httpget('/ontologies/acl.ttl');
// Every two seconds, emit the memoized HTTP response
Rx.Observable.timer(0, 2000)
.map(() => response$)
.flatMap($ => $)
.subscribe(response => {
console.log('Got the response!');
});
I was sure that I'd have to stick a call to replay() in there somewhere, but no matter what I do, a fresh HTTP call is initiated every two seconds. How can I structure this so that I can construct an observable from a URL and have it always emit the same HTTP result to any subsequent subscribers?
EDIT
I found a way to get the result I want, but I feel like I am missing something, and should be able to refactor this with a much more streamlined approach:
var httpget = _.memoize(function(url) {
var subject = new Rx.ReplaySubject();
try {
superagent.get(url).end((err, res) => {
if(err) {
subject.onError(err);
}
else {
subject.onNext(res);
subject.onCompleted();
}
});
}
catch(e) {
subject.onError(e);
}
return subject.asObservable();
});
Your first code sample is actually closer to the way to do it
var httpget = _.memoize(function(url) {
var req = superagent.get(url);
return Rx.Observable.fromNodeCallback(req.end, req)();
});
However, this isn't working because there appears to be a bug in fromNodeCallback. To work around it until it is fixed, I think you are actually looking for AsyncSubject instead of ReplaySubject. The latter works, but the former is designed for exactly this scenario (and doesn't have the overhead of an array creation plus runtime checks for cache expiration, if that matters to you).
var httpget = _.memoize(function(url) {
var subject = new Rx.AsyncSubject();
var req = superagent.get(url);
Rx.Observable.fromNodeCallback(req.end, req)().subscribe(subject);
return subject.asObservable();
});
Finally, though map appreciates that you are thinking of it, you can simplify your timer code by using the flatMap overload that takes an Observable directly:
Rx.Observable.timer(0, 2000)
.flatMap(response$)
.subscribe(response => {
console.log('Got the response');
});
Unless I am getting your question wrong, Observable.combineLatest does just that for you: it caches the last emitted value of your observable.
This code sends the request once and then gives the same cached response every 200 ms:
import reqPromise from 'request-promise';
import {Observable} from 'rx';
let httpGet_ = (url) =>
Observable
.combineLatest(
Observable.interval(200),
reqPromise(url),
(count, response) => response
);
httpGet_('http://google.com/')
.subscribe(
x => console.log(x),
e => console.error(e)
);
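In newer RxJS versions (an assumption, since the question targets RxJS 4 with its fromNodeCallback API), the same behaviour of replaying one HTTP result to every subscriber is usually written with shareReplay(1):
// assumes RxJS 6+ with pipeable operators
import { from, timer } from 'rxjs';
import { shareReplay, switchMap } from 'rxjs/operators';
import superagent from 'superagent';

// one HTTP request, replayed to every subscriber
const response$ = from(superagent.get('/ontologies/acl.ttl')).pipe(
  shareReplay(1)
);

// every two seconds, re-emit the cached response
timer(0, 2000).pipe(
  switchMap(() => response$)
).subscribe(() => console.log('Got the response!'));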