Coalescing (combining) callbacks in Node.js 6

I'm working with an AWS Lambda function that has a sort of 'map/reduce' feel to it. But the 'map' part, the part that makes multiple calls, is async.
Using the Node 6 standard library, is there a dynamic way to get all results returned to a shared spot?
What I've thought about so far:
async/await is a great abstraction, but to my knowledge it isn't in Node 6, just Node 8.
Array.prototype.reduce takes a callback, but the structure of an HTTP request's callback doesn't seem to qualify, though I may certainly be wrong.
I've thought about a really suboptimal solution, where each callback pushes into a shared queue or array, and a loop after all the requests checks the length of the array. I don't like this solution, though.
Could you guys point me in the right direction or show me some code that would do this?

Bluebird is your friend.
To convert callback functions into promises you can use .promisify() or .fromCallback().
To do map/reduce over an array of promises, you can use .map() and .reduce().
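For example (a minimal sketch; getItem is a hypothetical callback-style function standing in for your HTTP call):
const Promise = require('bluebird');

// Hypothetical callback-style "map" step, standing in for an HTTP request.
function getItem(id, callback) {
  setTimeout(() => callback(null, { id, value: id * 2 }), 100);
}

const getItemAsync = Promise.promisify(getItem);

// "map": fire all the calls; "reduce": combine the results in one shared spot.
Promise.map([1, 2, 3], (id) => getItemAsync(id))
  .reduce((total, item) => total + item.value, 0)
  .then((total) => console.log(total)); // 12
This works on Node 6 without async/await, since Bluebird handles the sequencing for you.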


MongoDB inbuilt Promises

How do we know which methods in MongoDB have an inbuilt promise?
E.g. updateOne() and findOne() have inbuilt promises and we can access the response using .then(), but lots of other MongoDB methods lack this feature. How can we be sure which methods don't have an inbuilt promise?
E.g. find() has no inbuilt promise, so we cannot do find().then((response) => {}); this will give an error.
Whereas findOne().then((response) => {}) will work without any issue.
This is inconsistent across the NodeJS MongoDB driver as some methods return more complex objects for manipulating the returned values. For example, .find() returns a FindCursor object which can be used to iterate over results by calling .next() repeatedly.
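For example, a minimal sketch of iterating a FindCursor with .next(), assuming a local MongoDB instance and a hypothetical shop database:
const { MongoClient } = require('mongodb');

async function printAllProducts() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const cursor = client.db('shop').collection('products').find({});
  let doc;
  // .next() resolves to the next document, or null when the cursor is exhausted
  while ((doc = await cursor.next()) !== null) {
    console.log(doc);
  }
  await client.close();
}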
I would recommend referring to the MongoDB documentation for the NodeJS driver (found here, or common usage examples are here) rather frequently. The documentation is reasonably extensive and should help with issues like this.
You can also consider using TypeScript, which I have personally found helpful for cases such as this because you can easily tell what type of object is returned from a function/method call.
There is some inconsistency around the find() method in the MongoDB native driver for Node.js. This is because find() returns a cursor. What we can do here is convert it to an array using the toArray() method.
The best solution here would be to use async/await over promise chaining.
This gives us a cleaner and easier syntax to work with.
E.g.:
Let's say we want to find all the products in the product collection. Below is a function for doing exactly that.
const findAll = async () => {
  // find() returns a cursor; toArray() resolves it to an array of documents
  const products = await db.products.find().toArray();
  console.log(products);
  return products;
};
Upon calling the above function, we get all the products in the product collection. Just by looking at the code, we can see that it provides more readable syntax than chaining promises all over.

Multiple fs.writeFile on Node.js

Straight to the point: I am running an HTTP server in Node.js managing a hotel's check-in/out info, where I write all the JSON data from memory to the same file using fs.writeFile.
The data usually doesn't exceed 145 kB, but since I need to write it every time I get an update from my database, I get data loss/bad JSON format when calls to fs.writeFile happen immediately one after another.
Currently I have solved this problem using fs.writeFileSync, but I would like to hear of a more sophisticated solution rather than the easy/bad one of using the sync function.
Using fs.promises results in the same error, since again I have to make multiple calls to fs.promises.writeFile.
According to Node's documentation, calling fs.writeFile or fs.promises.writeFile multiple times is not safe, and they suggest using a file stream; however, this is not currently an option.
To summarize, I need to wait for fs.writeFile to end normally before attempting any repeated write action, and using the callback is not useful since I don't know a priori when a write action needs to be done.
Thank you very much in advance
I assume you mean you are overwriting or truncating the file while the last write request is still being written. If I were you, I would use the promises API and heed the warning from the documentation:
It is unsafe to use fsPromises.writeFile() multiple times on the same file without waiting for the promise to be settled.
You can await the result in a traditional loop, or very carefully use .then() to "synchronize" your callbacks, but if you're not doing anything else in your event loop except reading from your database and writing to this file, you might as well just use writeFileSync to keep things simple/safe. The asynchronous APIs (callback and Promises) are intended to allow your program to do other things in the meantime; if this is not necessary and the async APIs add troublesome complexity for your code, just use the synchronous APIs. That's true for any node API or library function, not just fs.writeFile.
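For example, a minimal sketch of that .then() approach: keep a single pending promise and chain every new write onto it, so each write starts only after the previous one settles (the file name is a placeholder):
const fs = require('fs').promises;

let lastWrite = Promise.resolve();

function writeToDisk(data) {
  // Chain onto the previous write; each write waits for the last to settle.
  lastWrite = lastWrite
    .then(() => fs.writeFile('hotel-state.json', JSON.stringify(data)))
    .catch((err) => console.error('write failed:', err));
  return lastWrite;
}
Callers can invoke writeToDisk without awaiting it; the chain still serializes the writes.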
There are also libraries that will perform atomic filesystem operations for you and abstract away the implementation details, but I think these are probably overkill for you unless you describe your use case in more detail. For example, why you're dumping a database to disk as JSON as fast/frequently as you can, rather than keeping things in memory or using event-based incremental updates (e.g. a real, local database with atomicity and consistency guarantees).
Thank you for your response!
Since my app is mainly an HTTP server, yes, I do other things besides simple input/output, although not a great number of requests. I will review the promises solution again, but the first time I had no luck.
To explain more, I have:
function updateRoom(data) {
  // ...update things in memory...
  writetoDisk();
}
and the function:
function writetoDisk() {
  fs.writeFile(....)
}
Making writetoDisk an async function and using "await" inside it still does not solve the problem, since updateRoom will call writetoDisk without waiting for it to end.
The ".then" approach cannot be implemented, since my updateRoom is being called constantly and dynamically.
If you happen to know a thing or two about async/await, you are more than welcome to explain a bit more; thanks again nevertheless!

Mongoose Chained Queries

I'm currently reading this excellent MDN tutorial.
There seem to be two main ways to perform queries with Mongoose, one being more 'functional', using a chained notation.
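For illustration, here are minimal sketches of the two styles (the Athlete model is just a stand-in):
// Query-object style: conditions passed in one call.
Athlete.find({ sport: 'Tennis' }, 'name age', (err, athletes) => {
  // athletes contains the matching documents
});

// Chained style: builds up the query step by step;
// nothing is sent to MongoDB until .exec() is called.
Athlete.find()
  .where('sport').equals('Tennis')
  .limit(5)
  .select('name age')
  .exec((err, athletes) => {
    // athletes contains the matching documents
  });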
My question is: does this kind of notation, starting with an empty .find(), first import the whole collection and then apply the successive filters inside the app, or does it compile the query (with the filtering happening on the MongoDB instance) and only perform the callback locally?
The dot-based notation looks nicer and cleaner, but if the filter does not get compiled before the request is made, it seems like a bad idea to import the whole collection into the app for every simple query. Am I right?
If this is the case, are we in one of those situations where we have to favour object composition over function composition (as Eric Elliott said), or is that not the point here?

Google Apps Script with UrlFetchApp.fetchAll() or with async/await for multiple HTTP requests?

I created some projects in Google Apps Script in the past for some automation that also included some HTTP fetches. This worked pretty well with .fetch(), but now we need to fetch multiple URLs.
Since Apps Script now uses the V8 runtime, I considered doing this with promises. I'm also quite new to async/await and promises in general.
So I tried UrlFetchApp.fetch() within async functions, just to find out that there's no difference in execution time.
I read that UrlFetchApp.fetch() will always be synchronous, no matter whether you declare your function as async or not, due to the design of the GAS API. But I can't find detailed info on this.
Is this true?
If yes: Then the only way to fetch multiple urls would be UrlfetchApp.fetchAll(), right?
If no: meaning simple .fetch() would work inside async functions (and could be combined with Promise.all()), then I'd invest further time in this.
So, yes or no would help a lot here!
Currently, UrlFetchApp runs synchronously, and although the promise syntax is supported, it works synchronously too.
Then the only way to fetch multiple urls would be UrlfetchApp.fetchAll(), right?
Yes
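For example, a minimal sketch using fetchAll(); the URLs are placeholders:
function fetchMany() {
  const requests = [
    { url: 'https://example.com/a' },
    { url: 'https://example.com/b', method: 'get' },
  ];
  // One call issues the whole batch and returns an array of HTTPResponse objects
  const responses = UrlFetchApp.fetchAll(requests);
  responses.forEach((res) => Logger.log(res.getResponseCode()));
}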

Function synchronization and observable

I have a TypeScript function which converts a list of elements into a Map.
During this conversion, I need to modify something in the Map before returning it.
To obtain the info, I have to query a server, so I have an http.GET request/subscription to get the value.
I am using the value right after the GET, but the server has not yet answered, so I am returning the Map with the wrong value.
(The answer comes later, but too late.)
Then I use this Map, in which I don't have the correct value.
I need a mechanism to synchronize my function with the result of my GET request, before processing the Map later in my code (after the function returns).
How can I do this? I have been told that an Observable may be the solution, but I don't know how to do it.
I could use some help ;-).
Best regards,
Charles.
How can I do this? I have been told that an Observable may be the solution, but I don't know how to do it.
For async things you need some form of continuation mechanism. The popular choices are:
Callbacks
Supported natively; e.g. setTimeout uses it:
// Some code
setTimeout(() => {
  // Some more code that executes after 1 second
}, 1000);
Promises
Supported natively (now). Some docs on TypeScript promises and how they help: https://basarat.gitbooks.io/typescript/docs/promise.html
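For comparison with the callback example above, a minimal promise-wrapped version of the same delay:
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

delay(1000).then(() => {
  // Some more code that executes after 1 second
});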
Observables
If your framework's http returns observables, you would need to use them.
Summary
You cannot halt execution of the entire program, as JavaScript is single threaded. You need to work with continuations. Check the docs of the library (e.g. Angular / axios) or native API (fetch / XHR) you are using.
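For instance, a minimal sketch assuming your http client's get() returns an RxJS Observable (as with Angular's HttpClient); the URL and field names are placeholders. Instead of returning the Map directly, return an Observable of the finished Map and subscribe where it is needed:
import { map } from 'rxjs/operators';

function buildMap(elements, http) {
  // Return an Observable of the Map rather than the Map itself.
  return http.get('/api/value').pipe(
    map((serverValue) => {
      const result = new Map();
      elements.forEach((el) => result.set(el.id, el));
      result.set('serverValue', serverValue); // apply the fetched value
      return result;
    })
  );
}

// The Map is only used once the GET has answered:
// buildMap(elements, http).subscribe((m) => { /* process m here */ });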
