JS Promise waitFor refresh Token - javascript

The situation is simple:
I have a Node.js server (call it API-A) that uses Bluebird as its Promise library.
A client (browser) requests data through API-A, which fetches it from another API (API-B). API-B could be a weather service, for example; API-A aggregates its data with other data and sends the result back to the client.
Here is the catch: API-B needs a token with a TTL of 1800 seconds.
So for each request made by the client, I check whether my token has expired.
I have this kind of code:
function getActivities() {
  return this.requestAuthorisation()
    .then(() => {
      // some other code that is not interesting
    });
}
Everything works fine thanks to promises.
requestAuthorisation() checks whether the token is still valid, and if not, it makes a request to API-B to refresh it.
The problem is here:
between the moment the token expires and the moment a fresh one is obtained, some time passes. If 1000 clients make requests during that window, I will send 1000 token requests to API-B, which is not cool.
How can I avoid that? I want to avoid a cron-style refresh, since it makes unnecessary calls, and the underlying problem would remain anyway.
I tried to create a sort of global boolean that tracks the refreshing status of the token, but I can't find anything like a Promise.waitFor(variableChange).
Promise.all cannot be used because the requests originate from different event scopes.
Is there a way to queue requests until the token is refreshed?
Please help!

If I understand this right, we need to do two things:
1. Do not call our refreshToken several times while one call is in progress.
2. Once it completes, let all the waiting requests know that the token request has completed, so they can continue their work.
If you combine the observer pattern with a flag for the in-progress state, this can be done as below:
// MyObservable.js:
var util = require('util');
var EventEmitter = require('events').EventEmitter;

let inProgress = false;

function MyObservable() {
  EventEmitter.call(this);
}
util.inherits(MyObservable, EventEmitter);

// This is the function responsible for getting a refresh token
MyObservable.prototype.getToken = function(token) {
  console.log('Inside getToken');
  if (!inProgress) {
    console.log('calling fetchToken');
    const resultPromise = this.fetchToken();
    inProgress = true;
    resultPromise.then((result) => {
      console.log('Resolved fetch token');
      inProgress = false;
      this.emit('done', 'token refreshed');
    });
  }
}

// This is a mock function to simulate the promise-based API.
MyObservable.prototype.fetchToken = function(token) {
  console.log('Inside fetchToken');
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      console.log('resolving');
      resolve("Completed");
    }, 2000);
  });
}

module.exports = MyObservable;
Now we can use this and listen for the call to complete:
const MyObservable = require('./MyObservable');
const observable = new MyObservable();

const A = () => {
  console.log('Inside A');
  observable.on('done', (message) => {
    console.log('Completed A');
  });
  observable.getToken('test');
}

for (let i = 0; i < 5; i++) {
  setTimeout(A, 1000);
}
If we run this code, we get output where fetchToken is called only once, even though our method A is called 5 times within the same window.
Hope this helps!
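A simpler alternative that avoids the event emitter entirely is to cache the in-flight refresh promise itself, so every request arriving during the refresh window awaits the same promise. A minimal sketch, where tokenIsValid(), refreshTokenFromApiB(), and saveToken() are hypothetical helpers you would supply:
let refreshPromise = null;

function requestAuthorisation() {
  if (tokenIsValid()) return Promise.resolve(); // hypothetical validity check
  if (!refreshPromise) {
    // first caller triggers the refresh; later callers reuse the same promise
    refreshPromise = refreshTokenFromApiB()     // hypothetical request to API-B
      .then((token) => saveToken(token))        // hypothetical storage
      .finally(() => {
        refreshPromise = null;                  // allow future refreshes
      });
  }
  return refreshPromise;
}
Every caller simply does requestAuthorisation().then(...), and API-B sees at most one token request per expiry.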

Related

Aggregate multiple calls then separate result with Promise

Currently I have many concurrent identical calls to my backend, differing only in an ID field:
getData(1).then(...) // Each from a React component in a UI framework, so difficult to aggregate here
getData(2).then(...)
getData(3).then(...)

// creates n HTTP requests... inefficient
function getData(id: number): Promise<Data> {
  return backend.getData(id);
}
This is wasteful as I make more calls. I'd like to keep my getData() calls, but aggregate them into a single getDatas() call to my backend, then return the results to the individual callers. I have more control over my backend than over the UI framework, so I can easily add a getDatas() call to it. The question is how to "mux" the JS calls into one backend call, then "demux" the result into the callers' promises.
const cache = new Map<number, Promise<Data>>()
let requestedIds = []
let timeout = null;

// creates just 1 http request (per 100ms)... efficient!
function getData(id: number): Promise<Data> {
  if (cache.has(id)) {
    return cache.get(id);
  }
  requestedIds.push(id)
  if (timeout == null) {
    timeout = setTimeout(() => {
      backend.getDatas(requestedIds).then((datas: Data[]) => {
        // TODO: somehow populate many different promises in cache??? but how?
        requestedIds = []
        timeout = null
      })
    }, 100)
  }
  return ???
}
In Java I would create a Map<Integer, CompletableFuture<Data>> and, upon finishing my backend request, look up the CompletableFuture and call complete(data) on it. But I think in JS a Promise can't be created without an explicit result being passed in.
Can I do this in JS with Promises?
It's a little unclear what your end goal looks like. I imagine you could loop through your calls as needed; perhaps something like:
for (let [key, value] of cache) {
  if (key === id)
    return value;
}
// OR
for (let x = 0; x < ids.length; x++) {
  getData(ids[x])
}
Might work. You may be able to add a timing mechanism into the mix if needed.
Not sure what your backend consists of, but I do know GraphQL is a good system for making multiple calls.
It may ultimately be better to handle them all in one request rather than in multiple calls.
The cache can be a regular object mapping ids to promise resolution functions and the promise to which they belong.
// cache maps ids to { resolve, reject, promise, requested }
// resolve and reject belong to the promise, requested is a bool for bookkeeping
const cache = {};
You might need to fire only once, but here I suggest setInterval to regularly check the cache for unresolved requests:
// keep the return value, and stop polling with clearInterval()
// if you really only need one batch, change setInterval to setTimeout
function startGetBatch() {
  return setInterval(getBatch, 100);
}
The business logic calls only getData() which just hands out (and caches) promises, like this:
function getData(id) {
  if (cache[id]) return cache[id].promise;
  cache[id] = {};
  const promise = new Promise((resolve, reject) => {
    Object.assign(cache[id], { resolve, reject });
  });
  cache[id].promise = promise;
  cache[id].requested = false;
  return cache[id].promise;
}
By saving the promise along with the resolver and rejecter, we're also implementing the cache, since the resolved promise will provide the thing it resolved to via its then() method.
getBatch() asks the server in a batch for the not-yet-requested getData() ids, and invokes the corresponding resolve/reject functions:
function getBatch() {
  // collect the ids that have not been requested yet
  const ids = [];
  Object.keys(cache).forEach(id => {
    if (!cache[id].requested) {
      cache[id].requested = true;
      ids.push(id);
    }
  });
  return backend.getDatas(ids).then(datas => {
    Object.keys(datas).forEach(id => {
      cache[id].resolve(datas[id]);
    })
  }).catch(error => {
    ids.forEach(id => {
      cache[id].reject(error);
      delete cache[id]; // so we can retry
    })
  })
}
The caller side looks like this:
// start polling
const interval = startGetBatch();

// in the business logic
getData(5).then(result => console.log('the result of 5 is:', result));
getData(6).then(result => console.log('the result of 6 is:', result));

// sometime later...
getData(5).then(result => {
  // if the promise for an id has resolved, then-ing it still works,
  // resolving again to the -- now cached -- result
  console.log('the result of 5 is:', result)
});

// later, whenever we're done
// (no need for this if you change setInterval to setTimeout)
clearInterval(interval);
I think I've found a solution:
interface PromiseContainer<T> {
  resolve: (value: T) => void;
  reject: (reason?: any) => void;
}

const requests: Map<number, PromiseContainer<Data>> = new Map();
let timeout: number | null = null;

function getData(id: number) {
  const promise = new Promise<Data>((resolve, reject) => requests.set(id, { resolve, reject }))
  if (timeout == null) {
    timeout = setTimeout(() => {
      backend.getDatas([...requests.keys()]).then(datas => {
        for (let [id, data] of Object.entries(datas)) {
          requests.get(Number(id)).resolve(data)
          requests.delete(Number(id))
        }
      }).catch(e => {
        [...requests.values()].forEach(container => container.reject(e))
        requests.clear() // drop the failed entries so callers can retry
      })
      timeout = null
    }, 100)
  }
  return promise;
}
The key was figuring out I could extract the (resolve, reject) from a promise, store them, then retrieve and call them later.
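As a side note, newer runtimes (ES2024; Node 22+ and recent browsers) expose this exact "extract the resolvers" pattern as Promise.withResolvers(), which would shrink the bookkeeping; a minimal sketch of how the start of getData() could look:
// inside getData(id), replacing the manual executor:
const { promise, resolve, reject } = Promise.withResolvers<Data>();
requests.set(id, { resolve, reject });
// ...schedule the batched fetch as before, then:
return promise;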

How can I insert objects in SQL Server running a function in a loop? ConnectionError: .connect can not be called on a Connection in `Connecting` state

I'm working on a NodeJS project. I decided to change my approach because the old one wasn't working; let me try to explain it.
I need to insert data into a SQL Server DB, so I wrote a function insertOffice(). This function opens a connection using Tedious, then fetches an URL with data from an array data2 to load coords, builds an object from those coords, and inserts that object into the DB. Inserting only one part of my data2 array works; sending only data2[0] adds:
{
  latjson: 1,
  lonjson: 1,
  idoficina: "1",
}
But I want to insert both parts of my array, changing data2[0] to data2[index], to be able to insert the whole array. So I tried creating another function, functionLooper(), that loops insertOffice() to insert the data from my array data2.
I built this little snippet to learn how to loop a function; it prints index, which is the value I use to fetch idoficina.
In my full code, functionLooper() runs insertOffice() twice so it can read the full data2 array. This little snippet works with the same logic, and I built my full code from it:
function insertOffice(index) {
  console.log(index);
}

function functionLooper() {
  for (let i = 0; i < 5; i++) {
    let response = insertOffice(i);
  }
}

functionLooper();
This prints:
0
1
2
3
4
So my code is supposed to send index.
I expect my code to loop insertOffice() and insert my objects, but it doesn't seem to work, as I get this error:
C:\...\node_modules\tedious\lib\connection.js:993
throw new _errors.ConnectionError('`.connect` can not be called on a Connection in `' + this.state.name + '` state.');
^
ConnectionError: `.connect` can not be called on a Connection in `Connecting` state.
This is my code:
var config = {
  ....
};

const data2 = [
  ...
];

var connection = new Connection(config);

function insertOffice(index) {
  console.log(index)
  connection.on("connect", function (err) {
    console.log("Successful connection");
  });
  connection.connect();

  const request = new Request(
    "EXEC SPInsert @Data1, ... ",
    function (err) {
      if (err) {
        console.log("Couldn't insert, " + err);
      } else {
        console.log("Inserted")
      }
    }
  );

  console.log(myObject.Id_Oficina)
  request.addParameter("Data1", TYPES.SmallInt, myObject.Id_Oficina);

  request.on("row", function (columns) {
    columns.forEach(function (column) {
      if (column.value === null) {
        console.log("NULL");
      } else {
        console.log("Product id of inserted item is " + column.value);
      }
    });
  });

  request.on("requestCompleted", function () {
    connection.close();
  });

  connection.execSql(request);
}

function functionLooper() {
  for (let i = 0; i < 2; i++) {
    let response = insertOffice(i);
  }
}

functionLooper();
I do not know if this is the right way to do it (looping the inserting function insertOffice() twice). If you know a better way and could show me an example using code similar to mine, I would really appreciate it.
You're approaching an asynchronous problem as if it's a synchronous one. You're also making your life a bit harder by mixing event-based async tasks with promise-based ones.
For example, connection.connect() is asynchronous (meaning it doesn't finish all its work before the next line of code executes); it is only done when connection emits the connect event. So the processing of your data should not start until that event fires.
The iterations of your loop don't run one at a time but all at once, because fetch() returns a promise (asynchronous) and doesn't complete before the next iteration of the loop starts. In some cases it may even have finished before the database connection is ready, meaning execution has moved on to DB requests before the connection to the database is established.
To keep your code as manageable as possible, you should aim to "promisify" the connection / requests so that you can write an entirely promise-based program, rather than mixing promises and events (which will be pretty tricky to manage, but is possible).
For example:
const connection = new Connection(config);

// turn the connection event into a promise
function connect() {
  return new Promise((resolve, reject) => {
    connection.once('connect', (err) => err ? reject(err) : resolve(connection));
    connection.connect()
  });
}

// insert your data once the connection is ready and then close it when all the work is done
function insertOffices() {
  connect().then((conn) => {
    // connection is ready I can do what I want
    // NB: Make sure you return a promise here otherwise the connection.close() call will fire before it's done
  }).then(() => {
    connection.close();
  });
}
The same approach can be taken to "promisify" the inserts.
// turn a DB request into a promise
function request(conn) {
  return new Promise((resolve, reject) => {
    const request = new Request(...);
    request.once('error', reject);
    request.once('requestCompleted', resolve);
    conn.execSql(request);
  });
}
This can then be combined to perform a loop where it's executed one at a time:
function doInserts() {
  return connect().then((conn) => {
    // create a "chain" of promises that execute one after the other
    let inserts = Promise.resolve();
    for (let i = 0; i < limit; i++) {
      inserts = inserts.then(() => request(conn));
    }
    return inserts;
  }).then(() => connection.close())
}
or in parallel:
function doInserts() {
  return connect().then((conn) => {
    // create an array of promises that all execute independently
    // NB - this probably won't work currently because it would need
    // multiple connections to work (rather than one)
    let inserts = [];
    for (let i = 0; i < limit; i++) {
      inserts.push(request(conn));
    }
    return Promise.all(inserts);
  }).then(() => connection.close())
}
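For completeness, a usage sketch, assuming doInserts() is wired up as above:
// kick off the inserts and handle the overall outcome
doInserts()
  .then(() => console.log('all inserts completed'))
  .catch((err) => console.error('insert batch failed:', err));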
Finally I managed to fix it. I'm sharing my code so that anyone can use it to do multiple inserts. Thanks to Dan Hensby (I didn't do it his way, but used part of what he said) and to RbarryYoung and MichaelSun90, who showed me how. All I did was move my
var connection = new Connection(config);
to run inside my
function insertOffice(index) { ... }
Looking like this:
function insertOffice(index) {
  var connection = new Connection(config);
  ....
}

Using promises and async with socket.io doesn't return expected results

I'm very new to promises. I need my server to wait for socket connections from an external API and from browser clients. The external API sends a number of objects (4 in this example) to the server; they are received inside a promise, which then calls the function that awaits both promises. For each object received, a browser client can make a connection (a second promise) and join the game.
I have a function which should wait for variables to be populated by these two promises. It successfully waits for the external API objects, but it never receives the promise indicating that the correct number of clients have connected.
I wrapped the socket listener for the external API objects in a promise, as the data will only be sent once. I also call the function which handles the two promises here, as it didn't seem to work anywhere else.
//HANDLER FOR GAME OBJECT SENT FROM MAX API
const maxPromise = new Promise((resolve) => {
  socket.on("dictData", async (data) => {
    try {
      let {songName, level, imageArr} = data;
      let [imageObj] = imageArr;
      gameVars.songName = songName;
      gameVars.level = level;
      let gameObject = {};
      for (let obj in imageArr) {
        let objectId = imageArr[obj].name;
        gameObject.objectId = objectId;
        gameObject.path = imageObj.path;
        // gameObject.files = imageObj.imagePath;
        gameState.totalServerCount++;
        gameState.serverList.push({gameObject});
      }
      resolve(gameState.serverList) // resolve the promise with the array of objects
      sendData()
    }
    catch (e) {
      console.error(e)
    }
  });
});
I also wrapped the client request listener in a promise because, after countless attempts at nesting the promise inside, this was the only solution that didn't return the actual socket as the promise, so I feel this is probably the closest solution for me.
This promise should only resolve when there are as many client connections as there are server objects received by the first promise. I am testing by simply connecting to localhost:3000 from 4 open tabs.
//HANDLER FOR CLIENT REQUEST TO JOIN GAME
const playerPromise = new Promise((resolve, reject) => {
  socket.on('joinGame', async () => {
    try {
      gameState.totalPlayerCount++;
      gameState.playerList.push(socket.id)
      switch (true) {
        case gameState.totalPlayerCount < gameState.totalServerCount:
          console.log("Not enough players", gameState.totalPlayerCount)
          break;
        case gameState.totalPlayerCount <= gameState.totalServerCount:
          console.log("Counts are equal", gameState)
          readyPlayers = true;
          resolve(gameState.playerList)
          break;
        case gameState.totalServerCount == 0:
          console.log("Server not ready", gameState)
          break;
        default:
          console.log("Too many players", gameState.totalPlayerCount)
          reject("Too many players", gameState.playerList)
      }
    }
    catch (e) {
      console.error(e);
    }
  })
})
sendData() logs the 1st and 2nd tests to the console, but never the 3rd.
async function sendData() {
  try {
    console.log("TEST1")
    const dataMax = await maxPromise;
    console.log("TEST2", dataMax)
    const dataPlay = await playerPromise;
    console.log("TEST3", dataPlay)
    for (var key in dataPlay) {
      io.to(dataPlay[key]).emit('gameData', dataPlay[key]);
    }
  }
  catch (e) {
    console.error(e)
  }
};
I've looked at every similar post on Stack Overflow and elsewhere online but cannot find a solution, or where I'm going wrong. I devised the above with minimal knowledge of socket.io and promises, so if there is a better/cleaner way to do it, please let me know.
EDIT:
This is my current solution, using only one promise; but now the promise is not being populated at all in the send function:
//HANDLER FOR GAME OBJECT SENT FROM MAX API
const maxPromise = new Promise((resolve) => {
  socket.on("dictData", async (data) => {
    try {
      let {songName, level, imageArr} = data;
      let [imageObj] = imageArr;
      gameVars.songName = songName;
      gameVars.level = level;
      let gameObject = {};
      for (let obj in imageArr) {
        let objectId = imageArr[obj].name;
        gameObject.objectId = objectId;
        gameObject.path = imageObj.path;
        gameState.totalServerCount++;
        gameState.serverList.push({gameObject});
      }
      resolve(gameState.serverList)
    }
    catch (e) {
      console.error(e)
    }
  });
});
async function sendData(playerData) {
  try {
    console.log("TEST1")
    const dataMax = await maxPromise;
    console.log("TEST2")
    for (var key in playerData) {
      io.to(playerData[key]).emit('gameData', dataMax);
    }
  }
  catch (e) {
    console.error(e)
  }
};
sendData() is called in the client socket handler, which just passes the array of connections as playerData. "TEST2" is never logged.
Seeing as maxPromise is global, shouldn't sendData() be able to access its value?
You've probably figured this out by now. It's an interesting problem; I'd like to know how you solved it. As I understand it, you need to wait for the data to arrive in order to select players. That seems like a good use of promises. You could use socket.once instead of socket.on for dictData, since it's a one-time event.
At the same time, you don't want to block players, yet you still need to wait for enough players to join. Awaiting another promise is again a good gating technique.
If you haven't solved all the issues, I suggest first removing socket.io from the equation while developing the gating logic; see the sketch below. You can do this in Node with custom event emitters, simulating player and data events occurring at random times. You can also do it in the browser with custom events or broadcast channels. You'll find this more convenient than manually connecting to a port.
I'd also put in a lot of logging with millisecond timestamps, to easily understand the sequence of events: when they occur and when they're handled.
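A minimal sketch of that simulation idea, with hypothetical event names mirroring yours and made-up counts and timings:
const EventEmitter = require('events');
const bus = new EventEmitter();

// resolves once, when the simulated "dictData" event fires
const dataReady = new Promise((resolve) => bus.once('dictData', resolve));

// resolves once enough simulated players have joined
const TOTAL_PLAYERS = 4; // hypothetical target count
let joined = 0;
const playersReady = new Promise((resolve) => {
  bus.on('joinGame', (id) => {
    joined++;
    console.log(Date.now(), 'player joined:', id, `(${joined}/${TOTAL_PLAYERS})`);
    if (joined === TOTAL_PLAYERS) resolve();
  });
});

// the gate: proceed only when both promises have resolved
Promise.all([dataReady, playersReady]).then(([serverList]) => {
  console.log(Date.now(), 'both resolved, starting game with', serverList);
});

// simulate the events arriving at random times
setTimeout(() => bus.emit('dictData', ['obj1', 'obj2', 'obj3', 'obj4']), Math.random() * 1000);
for (let i = 1; i <= TOTAL_PLAYERS; i++) {
  setTimeout(() => bus.emit('joinGame', i), Math.random() * 2000);
}
Once this gating works in isolation, swapping bus.once/bus.on for socket.once/socket.on should be mechanical.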

How to call API every 2 seconds until desired result or timeout in async function

I need to call an API every 2 seconds to get a status. While the response is running I should keep polling, and I should return the first time the response is either complete or failed, or when 30 seconds have passed and the function times out.
This is what I have now. It works, but I am sure it can be done much more efficiently; I simply can't figure out how at this point:
const getStatus = async (processId) => {
  try {
    const response = await fetch(`example.com/api/getStatus/${processId}`);
    const status = await response.json();
    return status;
  } catch (err) {
    // handle error
  }
}
Inside another async function using getStatus():
const randomFunction = async () => {
  const delay = time => new Promise(resolve => setTimeout(resolve, time));
  let status = null;
  let tries = 0;
  let stop = false;
  while (tries <= 15 && !stop) {
    try {
      status = await getStatus('some-process-id');
      if (status === 'complete') {
        stop = true;
        // do something outside of loop
      }
      if (status === 'failed') {
        stop = true;
        throw Error(status);
      }
      if (tries === 15) {
        stop = true;
        throw Error('Request timed out');
      }
    } catch (err) {
      // handle error
    }
    if (tries < 15) {
      await delay(2000);
    }
    tries++;
  }
}
I would prefer to handle the looping inside getStatus(), and in a more readable form. Is that possible?
EDIT:
I tried a solution that looks better and seems to work as I expect; see it here:
https://gist.github.com/AntonBramsen/6cec0faade032dfa3c175b7d291e07bd
Let me know if any parts of the solution are bad practice.
Your question is about JavaScript. Unfortunately I don't drink coffee, so I can only give you the code in C#; I guess you'll get the gist and can figure out how to translate it into JavaScript (a JavaScript sketch of the same idea follows at the end of this answer).
Let's do this as a generic function: you have a function that is called every so often, you want to stop calling it once its result meets a stop criterion, and you want to be able to cancel once some maximum time has passed.
For this maximum time I use a CancellationToken, which allows you to cancel processing for more reasons than a timeout, for instance because the operator wants to close the program.
TApiResult CallApi<TApiResult>(
    Func<TApiResult> apiCall,
    Func<TApiResult, bool> stopCriterion,
    TimeSpan delayTime,
    CancellationToken cancellationToken)
{
    TApiResult apiResult = apiCall();
    while (!stopCriterion(apiResult))
    {
        cancellationToken.ThrowIfCancellationRequested();
        Task.Delay(delayTime, cancellationToken).Wait();
        apiResult = apiCall();
    }
    return apiResult;
}
apiCall is the API function to call; its return value is a TApiResult. In your case the status is your TApiResult.
stopCriterion is a function that takes the TApiResult as input and returns a boolean that is true when polling must stop. In your case that is when status equals complete or failed.
cancellationToken is the token you get from a CancellationTokenSource. Whenever you want the procedure to stop processing, just tell the CancellationTokenSource, and the function will stop with an OperationCanceledException.
Suppose this is your API:
Status MyApiCall(int x, string y) {...}
Then the usage is:
Timespan maxProcessTime = TimeSpan.FromSeconds(45);
var cancellationTokenSource = new CancellationTokenSource();
// tell the cancellationTokenSource to stop processing afer maxProcessTime:
cancellationTokenSource.CancelAfter(maxProcessTime);
// Start processing
Status resultAfterProcessing = CallApi<Status>(
() => MyApiCall (3, "Hello World!"), // The Api function to call repeatedly
// it returns a Status
(status) => status == complete // stop criterion: becomes true
|| status == failed, // when status complete or failed
cancellationTokenSource.Token); // get a token from the token source
TODO: add try / catch for OperationCanceledException, and decide what should be done if the task is cancelled.
The function will stop as soon as the stopCriterion becomes true, or when the CancellationTokenSource cancels; the latter happens automatically after maxProcessTime. However, you can also stop earlier, for instance because you want to close the program:
cancellationTokenSource.Cancel();
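Since the question is about JavaScript, here is a sketch of the same idea as an async function; pollUntil and its parameter names are made up, and the defaults approximate the 2-second interval and 30-second timeout from the question:
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// generic polling helper: call apiCall every intervalMs until stopCriterion(result)
// is true, or fail once timeoutMs has passed
async function pollUntil(apiCall, stopCriterion, intervalMs = 2000, timeoutMs = 30000) {
  const deadline = Date.now() + timeoutMs;
  let result = await apiCall();
  while (!stopCriterion(result)) {
    if (Date.now() >= deadline) throw new Error('Request timed out');
    await delay(intervalMs);
    result = await apiCall();
  }
  return result;
}

// usage, with getStatus from the question:
// const status = await pollUntil(
//   () => getStatus('some-process-id'),
//   (s) => s === 'complete' || s === 'failed'
// );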

Using rx.js, how do I emit a memoized result from an existing observable sequence on a timer?

I'm currently teaching myself reactive programming with RxJS, and I've set myself a challenge of creating an observable stream which will always emit the same result to any subscriber, no matter what.
I've memoized the creation of an HTTP GET stream for a given URL, and I'm trying to act on that stream every two seconds, the goal being that on each tick of the timer I extract a cached/memoized HTTP result from the original stream.
import superagent from 'superagent';
import _ from 'lodash';

// Cached GET function, returning a stream that emits the HTTP response object
var httpget = _.memoize(function(url) {
  var req = superagent.get(url);
  req = req.end.bind(req);
  return Rx.Observable.fromNodeCallback(req)();
});

// Assume this is created externally and I only have access to response$
var response$ = httpget('/ontologies/acl.ttl');

// Every two seconds, emit the memoized HTTP response
Rx.Observable.timer(0, 2000)
  .map(() => response$)
  .flatMap($ => $)
  .subscribe(response => {
    console.log('Got the response!');
  });
I was sure I'd have to stick a call to replay() in there somewhere, but no matter what I do, a fresh HTTP call is initiated every two seconds. How can I structure this so that I can construct an observable from a URL and have it always emit the same HTTP result to any subsequent subscriber?
EDIT
I found a way to get the result I want, but I feel like I'm missing something and should be able to refactor this into a much more streamlined approach:
var httpget = _.memoize(function(url) {
  var subject = new Rx.ReplaySubject();
  try {
    superagent.get(url).end((err, res) => {
      if (err) {
        subject.onError(err);
      } else {
        subject.onNext(res);
        subject.onCompleted();
      }
    });
  }
  catch (e) {
    subject.onError(e);
  }
  return subject.asObservable();
});
Your first code sample is actually closer to the way to do it:
var httpget = _.memoize(function(url) {
  var req = superagent.get(url);
  return Rx.Observable.fromNodeCallback(req.end, req)();
});
However, this isn't working because there appears to be a bug in fromNodeCallback. As a workaround until this is fixed, I think you are actually looking for AsyncSubject rather than ReplaySubject. The latter works, but the former is designed for exactly this scenario (and doesn't have the overhead of array creation plus runtime checks for cache expiration, if that matters to you).
var httpget = _.memoize(function(url) {
  var subject = new Rx.AsyncSubject();
  var req = superagent.get(url);
  Rx.Observable.fromNodeCallback(req.end, req)().subscribe(subject);
  return subject.asObservable();
});
Finally, though map appreciates that you are thinking of it, you can simplify your timer code by using the flatMap overload that takes an Observable directly:
Rx.Observable.timer(0, 2000)
  .flatMap(response$)
  .subscribe(response => {
    console.log('Got the response');
  });
Unless I'm getting your question wrong, Observable.combineLatest does just that for you: it caches the last emitted value of your observable.
This code sends the request once and then gives the same cached response every 200 ms:
import reqPromise from 'request-promise';
import {Observable} from 'rx';

let httpGet_ = (url) =>
  Observable
    .combineLatest(
      Observable.interval(200),
      reqPromise(url),
      (count, response) => response
    );

httpGet_('http://google.com/')
  .subscribe(
    x => console.log(x),
    e => console.error(e)
  );
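Another option worth noting, assuming RxJS 4 (the rx package) and that fromNodeCallback behaves in your version: shareReplay(1) multicasts the source and replays its last value to any late subscriber, which avoids wiring a subject by hand:
// minimal sketch: a memoized GET stream that replays its single result
var httpget = _.memoize(function(url) {
  var req = superagent.get(url);
  return Rx.Observable.fromNodeCallback(req.end, req)()
    .shareReplay(1); // cache and replay the response for every subscriber
});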
