Why does the nodejs amqplib consume function behave like a closure? - javascript

I use the Node.js amqplib module to connect to RabbitMQ.
I found that the consume callback behaves like a closure, but I couldn't understand why, since I didn't write a closure on purpose.
My code is below. The corr inside returnOK still holds the value from the first call: when I fire this function a second time, corr keeps its first-time value.
I think that is odd. Could someone explain this?
const corr = new Date().getTime();
try {
  const params = JSON.stringify(req.body);
  console.log('corr first =', corr);
  await ch.sendToQueue(q, Buffer.from(params), {
    deliveryMode: true,
    correlationId: corr.toString(),
    replyTo: queue.queue,
  });
  const returnOK = (msg) => {
    if (msg.properties.correlationId === corr.toString()) {
      console.info('******* Proxy send message done *******');
      res.status(HTTPStatus.OK).json('Done');
    }
  };
  await ch.consume(queue.queue, returnOK, { noAck: true });
} catch (error) {
  res.status(HTTPStatus.INTERNAL_SERVER_ERROR).json(error);
}

It appears you're calling ch.consume on every request, in effect creating a new consumer every time. You should only do that once.
What is happening is that the first consumer is picking up the messages.
To fix this, you probably want to move ch.consume outside the request handler.
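A minimal sketch of that fix, assuming the question's surrounding code (ch, q, queue, req, res, HTTPStatus): register one consumer at startup and route each reply back to its request by correlationId. The pending map and handleReply names are hypothetical.

```javascript
// Route replies from the single consumer back to the request that sent
// the message, keyed by correlationId.
const pending = new Map(); // correlationId -> Express `res` object

function handleReply(msg) {
  const corr = msg.properties.correlationId;
  const res = pending.get(corr);
  if (res) {
    pending.delete(corr);
    res.status(200).json('Done'); // HTTPStatus.OK in the question's code
  }
  // replies with an unknown correlationId are simply ignored
}

// At startup, once:
//   await ch.consume(queue.queue, handleReply, { noAck: true });

// Inside the request handler, per request:
//   pending.set(corr.toString(), res);
//   await ch.sendToQueue(q, Buffer.from(params), {
//     deliveryMode: true,
//     correlationId: corr.toString(),
//     replyTo: queue.queue,
//   });
```

With this shape, every request sees only its own reply and no extra consumers pile up.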

Multiple arguments in Gio.Subprocess

I'm developing my first gnome-shell-extension currently. In the extension, I want to execute a simple shell command and use the output afterwards, for which I use Gio.Subprocess like it is used in this wiki: https://wiki.gnome.org/AndyHolmes/Sandbox/SpawningProcesses
Currently, I have an argument like this one with some parameters: "ProgramXYZ -a -bc" which I pass as the argument vector argv as ['ProgramXYZ','-a','-bc']. This case works fine.
So let's say I would like to call two programs and combine the output with your approach, for example: "ProgramXYZ -a -bc && ProgramB". The combined command works in a normal terminal, but I'm not sure how to pass it to Gio.Subprocess. Something like ['ProgramXYZ','-a','-bc','&&','ProgramB'] does not work. Is there a way to achieve that, or do I have to make two separate calls?
Sorry, I haven't managed to finish that page (that's why it's in my sandbox 😉).
Here is our Promise wrapper for running a subprocess:
function execCommand(argv, input = null, cancellable = null) {
  let flags = Gio.SubprocessFlags.STDOUT_PIPE;

  if (input !== null)
    flags |= Gio.SubprocessFlags.STDIN_PIPE;

  let proc = new Gio.Subprocess({
    argv: argv,
    flags: flags
  });
  proc.init(cancellable);

  return new Promise((resolve, reject) => {
    proc.communicate_utf8_async(input, cancellable, (proc, res) => {
      try {
        resolve(proc.communicate_utf8_finish(res)[1]);
      } catch (e) {
        reject(e);
      }
    });
  });
}
Now you have two reasonable choices, since you have a nice wrapper.
I would prefer this option myself, because if I'm launching sequential processes I probably want to know which failed, what the error was and so on. I really wouldn't worry about extra overhead, since the second process only executes if the first succeeds, at which point the first will have been garbage collected.
async function dualCall() {
  try {
    let stdout1 = await execCommand(['ProgramXYZ', '-a', '-bc']);
    let stdout2 = await execCommand(['ProgramB']);
  } catch (e) {
    logError(e);
  }
}
On the other hand, there is nothing preventing you from launching a sub-shell if you really want to do shell stuff. Ultimately you're just offloading the same behaviour to a shell, though:
async function shellCall() {
  try {
    let stdout = await execCommand([
      '/bin/sh',
      '-c',
      'ProgramXYZ -a -bc && ProgramB'
    ]);
  } catch (e) {
    logError(e);
  }
}

Repeating API requests until new data arrives

I'm struggling with this one.
I have an old piece of data. I would like to call the API at an interval of 5 seconds until I get the new equivalent of that data.
I'm trying to do it like this (see the comments inside the code):
const requestTimeout = async (currentData, resolve) => {
  // that's my working call to the API ;)
  const { data } = await requestGet(URL.LOTTERIES_LIST);
  // if old data is equal to new data, let's try again until the backend API finally changes the data
  if (currentData === data.newData) {
    this.setTimeout(requestTimeout(currentData, resolve), 5000);
  } else {
    resolve(data.newData);
  }
};

// this is a function called from another part of the app; I need to return/resolve the new data from here, currentData is just the old piece of data
export function callUntilNewDataArrives(shouldCallAPI, currentData) {
  return new Promise(resolve => {
    if (shouldCallAPI) {
      requestTimeout(currentData, resolve);
    }
  });
}
As stated in the comments, in another part of the app I use the callUntilNewDataArrives function simply like this:
callUntilNewDataArrives(shouldCallAPI, currentData)
.then(res => console.log(res));
This however won't work. Thank you for any help!
I believe the problem is in your object comparison.
The only possible reason the promise is resolved with old data is that
if (currentData === data.newData) returns false every time.
I'd check it in the debugger to be sure.
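For completeness, a hedged sketch of the retry loop itself. Note that this.setTimeout(requestTimeout(currentData, resolve), 5000) in the question invokes requestTimeout immediately and hands its return value to the timer; setTimeout needs a function reference. The version below takes a generic fetchData callback (the question's requestGet(URL.LOTTERIES_LIST) would plug in there) and is only a sketch of the shape, not the asker's exact code:

```javascript
// Poll `fetchData` every `delayMs` until it returns something different
// from `currentData`, then resolve with the new value.
function callUntilNewDataArrives(fetchData, currentData, delayMs = 5000) {
  return new Promise((resolve, reject) => {
    const attempt = async () => {
      try {
        const newData = await fetchData();
        if (newData === currentData) {
          setTimeout(attempt, delayMs); // schedule a retry, don't invoke immediately
        } else {
          resolve(newData);
        }
      } catch (e) {
        reject(e);
      }
    };
    attempt();
  });
}
```

With the question's API this might be called as callUntilNewDataArrives(() => requestGet(URL.LOTTERIES_LIST).then(({ data }) => data.newData), currentData).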

JS Promise waitFor refresh Token

The situation is simple:
I have a Node.js server (call it API-A) that uses Bluebird as its Promise library.
I have a client (browser) that asks for data through API-A, which gets the data from another API (API-B). API-B could be a weather service, for example; API-A then aggregates that data with other data and sends it back to the client.
The catch: API-B needs a token with a TTL of 1800 seconds.
So for each request made by the client, I check whether my token has expired.
I have this kind of code :
function getActivities() {
  return this.requestAuthorisation()
    .then(() => {
      // some other code that is not interesting
    });
}
Everything works fine with the glory of promises.
requestAuthorisation() checks whether the token is valid and, if not, makes a request to API-B to refresh the token.
The problem is here:
between the moment the token expires and the moment a fresh one is obtained, some time passes. If 1000 clients make requests during that window, I will send 1000 token requests to API-B, which is not cool.
How can I avoid that? I'd like to avoid a cron-style solution: it makes unnecessary calls and has the same problem anyway.
I tried to create a sort of global variable (a boolean) that tracks the refreshing status of the token, but I couldn't find any sort of Promise.waitFor that waits for a variable to change.
Promise.all cannot be used because the requests happen in different event scopes.
Is there a way to queue requests until the token is refreshed?
Please help!
If I understand this right, we need to do two things:
Do not call refreshToken several times while one call is in progress.
Once it completes, let all the waiting requests know that the token request has finished so they can continue their work.
If you combine the observer pattern with a flag for the in-progress state, this can be done like below:
// MyObservable.js:
var util = require('util');
var EventEmitter = require('events').EventEmitter;

var inProgress = false;

function MyObservable() {
  EventEmitter.call(this);
}
util.inherits(MyObservable, EventEmitter);

// This is the function responsible for getting a refresh token
MyObservable.prototype.getToken = function(token) {
  console.log('Inside getToken');
  if (!inProgress) {
    console.log('calling fetchToken');
    var resultPromise = this.fetchToken();
    inProgress = true;
    resultPromise.then((result) => {
      console.log('Resolved fetch token');
      inProgress = false;
      this.emit('done', 'token refreshed');
    });
  }
};

// This is a mock function to simulate the promise-based API.
MyObservable.prototype.fetchToken = function(token) {
  console.log('Inside fetchToken');
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      console.log('resolving');
      resolve("Completed");
    }, 2000);
  });
};

module.exports = MyObservable;
Now we can use this and observe for the call to complete:
const MyObservable = require('./MyObservable');
const observable = new MyObservable();

const A = () => {
  console.log('Inside A');
  observable.on('done', (message) => {
    console.log('Completed A');
  });
  observable.getToken('test');
};

for (let i = 0; i < 5; i++) {
  setTimeout(A, 1000);
}
If we run this code, you will see that fetchToken is called only once, even though our method A is called 5 times during the same period.
Hope this helps!
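As a lighter alternative to the EventEmitter approach, the same single-flight behaviour can be sketched by caching the in-flight promise itself, so concurrent callers share one refresh. fetchToken here is a hypothetical stand-in for the real API-B refresh call:

```javascript
// Single-flight token refresh: all concurrent callers await the same
// in-flight promise, so the refresh request happens only once at a time.
let tokenPromise = null;

function requestAuthorisation(fetchToken) {
  if (!tokenPromise) {
    tokenPromise = fetchToken().catch((err) => {
      tokenPromise = null; // don't cache failures; retry on the next call
      throw err;
    });
  }
  return tokenPromise;
}
```

When the TTL expires, resetting tokenPromise to null makes the next caller trigger a fresh refresh.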

Using rx.js, how do I emit a memoized result from an existing observable sequence on a timer?

I'm currently teaching myself reactive programming with rxjs, and I've set myself a challenge of creating an observable stream which will always emit the same result to a subscriber no matter what.
I've memoized the creation of an HTTP "GET" stream given a specific URL, and I'm trying to act on that stream every two seconds, with the outcome being that for each tick of the timer, I'll extract a cached/memoized HTTP result from the original stream.
import superagent from 'superagent';
import _ from 'lodash';

// Cached GET function, returning a stream that emits the HTTP response object
var httpget = _.memoize(function(url) {
  var req = superagent.get(url);
  req = req.end.bind(req);
  return Rx.Observable.fromNodeCallback(req)();
});

// Assume this is created externally and I only have access to response$
var response$ = httpget('/ontologies/acl.ttl');

// Every two seconds, emit the memoized HTTP response
Rx.Observable.timer(0, 2000)
  .map(() => response$)
  .flatMap($ => $)
  .subscribe(response => {
    console.log('Got the response!');
  });
I was sure that I'd have to stick a call to replay() in there somewhere, but no matter what I do, a fresh HTTP call is initiated every two seconds. How can I structure this so that I can construct an observable from a URL and have it always emit the same HTTP result to any subsequent subscribers?
EDIT
I found a way to get the result I want, but I feel like I am missing something, and should be able to refactor this with a much more streamlined approach:
var httpget = _.memoize(function(url) {
  var subject = new Rx.ReplaySubject();

  try {
    superagent.get(url).end((err, res) => {
      if (err) {
        subject.onError(err);
      } else {
        subject.onNext(res);
        subject.onCompleted();
      }
    });
  } catch (e) {
    subject.onError(e);
  }

  return subject.asObservable();
});
Your first code sample is actually closer to the way to do it:
var httpget = _.memoize(function(url) {
  var req = superagent.get(url);
  return Rx.Observable.fromNodeCallback(req.end, req)();
});
However, this isn't working because there appears to be a bug in fromNodeCallback. As a workaround until that is fixed, I think you are actually looking for AsyncSubject instead of ReplaySubject. The latter works, but the former is designed for exactly this scenario (and doesn't have the overhead of an array creation plus runtime checks for cache expiration, if that matters to you).
var httpget = _.memoize(function(url) {
  var subject = new Rx.AsyncSubject();
  var req = superagent.get(url);
  Rx.Observable.fromNodeCallback(req.end, req)().subscribe(subject);
  return subject.asObservable();
});
Finally, though map appreciates that you are thinking of it, you can simplify your timer code by using the flatMap overload that takes an Observable directly:
Rx.Observable.timer(0, 2000)
  .flatMap(response$)
  .subscribe(response => {
    console.log('Got the response');
  });
Unless I am getting your question wrong, Observable.combineLatest does just that for you: it caches the last emitted value of your observable.
This code sends the request once and then gives the same cached response every 200 ms:
import reqPromise from 'request-promise';
import {Observable} from 'rx';

let httpGet_ = (url) =>
  Observable
    .combineLatest(
      Observable.interval(200),
      reqPromise(url),
      (count, response) => response
    );

httpGet_('http://google.com/')
  .subscribe(
    x => console.log(x),
    e => console.error(e)
  );

Call function multiple times in the same moment but execute different calls with delay in nodejs

I need to call a function multiple times from different contexts, but each call must fire no earlier than one second after the previous call started.
I'll give an example:
var i = 0;
while (i < 50) {
  do_something(i);
  i++;
}

function do_something(a) {
  console.log(a);
}
I want it to log:
'1', then after a second '2', then after a second '3', then after a second '4'...
I can't use a simple setInterval or setTimeout because do_something(param) can be called at the same moment from different sources, since I am working with async functions in Node.js.
I want the order of the calls to be kept, but they should fire with a minimum delay of one second between them.
I think I should add these calls to a queue, and then each second a call is dequeued and the function fires, but I really don't know how to do it in Node.js. Thank you in advance.
I had to do something like this:
var tasks = []; // global queue

var processor = setInterval(function() {
  process_task();
}, 1000);

function add_task() {
  tasks.push('my task'); // add a task to the end of the queue
}

function process_task() {
  if (tasks.length === 0) return; // nothing queued this second
  var task_to_use = tasks.shift(); // remove the first task in the queue (tasks[0])
  // do what I need to with 'task_to_use'
}
In this way I can add tasks to the queue from wherever I want (tasks is a variable in the global context) just by calling tasks.push('my task'), and the tasks will be processed one per second, in the order they were put in the queue.
However, it turned out I didn't really need this. I thought I did because I am using Twilio's APIs, and their docs say each phone number can send at most one SMS per second. But their support told me they already queue requests and send one message per second, so issuing more than one request per second is not actually a problem and no SMS send will fail. Hope this helps, bye!
Coming late to the party
I know I am late, but I had this exact same problem with these exact same technologies.
Your post was very helpful, but it lacked good practices and used global variables.
My solution
If you are reading this today, know that after a week of bashing my head I ended up asking a question that led to two different answers, both capable of helping you:
How to delay execution of functions, JavaScript
The queue approach, pioneered by #Arg0n and revamped by me, is the closest one to your example, but with none of its drawbacks:
let asyncFunc = function(url) {
  return new Promise((resolve, reject) => {
    setTimeout(function() {
      resolve({
        url: url,
        data: "banana"
      });
    }, 5000);
  });
};

let delayFactory = function(args) {
  let {
    delayMs
  } = args;

  let queuedCalls = [];
  let executing = false;

  let queueCall = function(url) {
    return new Promise((resolve, reject) => {
      queuedCalls.push({
        url,
        resolve,
        reject
      });

      if (executing === false) {
        executing = true;
        nextCall();
      }
    });
  };

  let execute = function(call) {
    console.log(`sending request ${call.url}`);
    asyncFunc(call.url)
      .then(call.resolve)
      .catch(call.reject);
    setTimeout(nextCall, delayMs);
  };

  let nextCall = function() {
    if (queuedCalls.length > 0)
      execute(queuedCalls.shift());
    else
      executing = false;
  };

  return Object.freeze({
    queueCall
  });
};

let myFactory = delayFactory({
  delayMs: 1000
});

myFactory.queueCall("http://test1")
  .then(console.log)
  .catch(console.log);
myFactory.queueCall("http://test2")
  .then(console.log)
  .catch(console.log);
myFactory.queueCall("http://test3")
  .then(console.log)
  .catch(console.log);
Give it a try and have fun!
