I've created this object which contains an array, which serves as a work queue.
It kind of works like this:
var work1 = new Work();
var work2 = new Work();
var queue = Workqueue.instance();

queue.add(work1) // Bluebird promise.
    .then(function addWork2() {
        return queue.add(work2);
    })
    .then(function toCommit() {
        return queue.commit();
    })
    .then(function done(results) {
        // obtain results here.
    })
    .catch(function (err) {});
It works in that case, and I can add more than one task before I call the commit.
However, if it's like this:
var work1 = new Work();
var work2 = new Work();
var queue = Workqueue.instance();

queue.add(work1)
    .then(function toCommit1() {
        return queue.commit();
    })
    .then(function done1(result1) {
        // obtain result1 here.
    })
    .catch(function (err) {});

queue.add(work2)
    .then(function toCommit2() {
        return queue.commit();
    })
    .then(function done2(result2) {
        // obtain result2 here.
    })
    .catch(function (err) {});
Something may go wrong, because if the first commit actually executes after the second commit (by which point both works/tasks have already been added), the first commit handler expects a result, but all the results go to the second commit handler.
The task involves a Web SQL database read and may also involve network access, so it's basically a complicated procedure and the problem described above may surface. If only I could have an addWorkAndCommit() implemented which wraps the add and the commit together; but still there is no guarantee, because addWorkAndCommit() cannot be "atomic" in a sense, since it involves asynchronous calls. So even two calls to addWorkAndCommit() may fail. (I don't know how to describe it other than by "atomic"; JavaScript is single-threaded, but this issue still crops up.)
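For illustration, the addWorkAndCommit() I have in mind would be something like this (hypothetical, and still not "atomic", because another add() can be interleaved between the two steps):

function addWorkAndCommit(queue, work) {
    return queue.add(work)
        .then(function () {
            return queue.commit();
        });
}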
What can I do?
The problem is that there is a commit() but no notion of a transaction, so you cannot explicitly have two isolated transactions running in parallel. From my understanding the JavaScript Workqueue is a proxy for a remote queue, and the calls to add() and commit() map directly to some kind of remote procedure calls with a similar interface but no transactions. I also understand that you would not care if the second add() actually happened after the first commit(); you just want to write two simple subsequent addWorkAndCommit() statements without synchronizing the underlying calls in client code.
What you can do is write a wrapper around the local Workqueue (or alter it directly if it is your code), so that each update of the queue creates a new transaction and a commit() always refers to one such transaction. The wrapper then delays new updates until all previous transactions are committed (or rolled back).
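A minimal sketch of such a serializing wrapper, assuming Bluebird (as in the question); the names below (transactionalQueue, addWorkAndCommit, txn) are mine, not part of Workqueue:

var transactionalQueue = (function () {
    var queue = Workqueue.instance();
    var last = Promise.resolve(); // tail of the chain of transactions

    return {
        // `txn` receives the queue, performs its add() calls, and returns a promise
        addWorkAndCommit: function (txn) {
            var result = last
                .catch(function () {}) // a failed earlier transaction must not block this one
                .then(function () {
                    return Promise.resolve(txn(queue));
                })
                .then(function () {
                    return queue.commit(); // only this transaction's work is pending at this point
                });
            last = result;
            return result;
        }
    };
})();

A cleaner way to package the same idea is the disposer pattern below.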
Adopting Benjamin Gruenbaum's recommendation to use a disposer pattern, here is one, written as an adapter method for Workqueue.instance():
Workqueue.transaction = function (work) { // `work` is a function
    var queue = this.instance();
    return Promise.resolve(work(queue)) // `Promise.resolve()` avoids an error if `work()` doesn't return a promise.
        .then(function () {
            return queue.commit();
        });
};
Now you can write:
// if the order matters,
// then add promises sequentially.
Workqueue.transaction(function (queue) {
    var work1 = new Work();
    var work2 = new Work();
    return queue.add(work1)
        .then(function () {
            return queue.add(work2);
        });
});
// if the order doesn't matter,
// add promises in parallel.
Workqueue.transaction(function (queue) {
    var work1 = new Work();
    var work2 = new Work();
    var promise1 = queue.add(work1);
    var promise2 = queue.add(work2);
    return Promise.all([promise1, promise2]); // Promise.all takes an array
});
// you can even pass `queue` around
Workqueue.transaction(function (queue) {
    var work1 = new Work();
    var promise1 = queue.add(work1);
    var promise2 = myCleverObject.doLotsOfAsyncStuff(queue);
    return Promise.all([promise1, promise2]);
});
In practice, an error handler should be included like this - Workqueue.transaction(function() {...}).catch(errorHandler);
Whatever you write, all you need to do is ensure that the callback function returns a promise that aggregates all of the component asynchronous operations (the component promises). When the aggregate promise resolves, the disposer will ensure that the transaction is committed.
As with all disposers, this one doesn't do anything you can't do without it. However, it:
serves as a reminder of what you are doing by providing a named .transaction() method,
enforces the notion of a single transaction by constraining a Workqueue.instance() to one commit.
If for any reason you should ever need to do two or more commits on the same queue (why?), then you can always revert to calling Workqueue.instance() directly.
Related
I have a Service Worker that receives push messages from Firebase FCM. They cause notifications to show or to cancel. The user can have multiple devices (that's what the cancel is for: when the user already acted on notification A I try to dismiss it on all devices).
The problem I have is when one of the user's devices is offline or turned off altogether. Once the device goes online, Firebase delivers all the messages it couldn't deliver before. So, for example, you'd get:
Show notif A with content X
Show notif A with content Y (replaces notif A)
Show notif B with content Z
Cancel notif A
The SW receives these messages in rapid succession. The problem is that cancelling a notification is a lot faster than showing one (~2ms vs 16ms). So the 4th message is handled before the first (or second) message has actually created the notification, with the result that the notification never gets cancelled.
I tried pushing the messages into a queue and handling them one by one. Turns out this can become a bit complicated, because everything in a SW is async and, to make matters worse, the SW can be killed at any time when the browser thinks it has finished its work. I tried to store the queue in a persistent manner, but since LocalStorage is unavailable in a SW, I need to use the async IndexedDB API. That means more async calls that could cause problems (like losing items).
It's also possible that event.waitUntil thinks my worker is done before it's actually done, because I'm not correctly 'passing the torch' from promise to promise...
Here's a (lot of) simplified code of what I tried:
// Use localforage, simplified API for IndexedDB
importScripts("localforage.min.js");

// In memory..
var mQueue = []; // only accessed through get-/setQueue()
var mQueueBusy = false;

// Receive push messages..
self.addEventListener('push', function(event) {
    var data = event.data.json().data;
    event.waitUntil(addToQueue(data));
});

// Add to queue
function addToQueue(data) {
    return new Promise(function(resolve, reject) {
        // Get queue..
        getQueue()
            .then(function(queue) {
                // Push + store..
                queue.push(data);
                setQueue(queue)
                    .then(function(queue) {
                        handleQueue()
                            .then(function() {
                                resolve();
                            });
                    });
            });
    });
}
// Handle queue
function handleQueue(force) {
    return new Promise(function(resolve, reject) {
        // Check if busy
        if (mQueueBusy && !force) {
            resolve();
        } else {
            // Set busy..
            mQueueBusy = true;
            // Get queue..
            getQueue()
                .then(function(queue) {
                    // Check if we're done..
                    if (queue && queue.length <= 0) {
                        resolve();
                    } else {
                        // Shift first item
                        var queuedData = queue.shift();
                        // Store before continuing..
                        setQueue(queue)
                            .then(function(queue) {
                                // Now do work here..
                                doSomething(queuedData)
                                    .then(function() {
                                        // Call handleQueue with 'force=true' to go past (mQueueBusy)
                                        resolve(handleQueue(true));
                                    });
                            });
                    }
                });
        }
    });
}
// Get queue
function getQueue() {
    return new Promise(function(resolve, reject) {
        // Get from memory if it's there..
        if (mQueue && mQueue.length > 0) {
            resolve(mQueue);
        }
        // Read from indexed db..
        else {
            localforage.getItem("queue")
                .then(function(val) {
                    var queue = (val) ? JSON.parse(val) : [];
                    mQueue = queue;
                    resolve(mQueue);
                });
        }
    });
}
// Set queue
function setQueue(queue) {
    return new Promise(function(resolve, reject) {
        // Store queue to memory..
        mQueue = queue;
        // Write to indexed db..
        localforage.setItem("queue", mQueue)
            .then(function() {
                resolve(mQueue);
            });
    });
}
// Do something..
function doSomething(queuedData) {
    return new Promise(function(resolve, reject) {
        // just print something and resolve
        console.log(queuedData);
        resolve();
    });
}
The short version of my question, with my particular use case in mind, is: how do I handle push messages synchronously without having to use more async APIs?
And if I split that into multiple questions:
Am I right to assume I would need to queue those messages?
If so, how would one handle queues in SW?
I can't (completely) rely on global variables because the SW may be killed, and I can't use LocalStorage or similar synchronous APIs, so I need to use yet another async API like IndexedDB to do this. Is this assumption correct?
Is my code above the right approach?
Somewhat related: Since I need to pass the event.waitUntil from promise to promise until the queue is processed, am I right to call resolve(handleQueue()) inside handleQueue() to keep it going? Or should I do return handleQueue()? Or..?
Just to preempt the "why not use collapse_key" question: it's a chat app and every chat room has its own tag. A user can participate in more than 4 chat rooms, and since Firebase limits the number of collapse_keys to 4, I can't use that.
So I'm going to go out on a limb and say that serializing things to IDB could be overkill. As long as you wait until all your pending work is done before you resolve the promise passed to event.waitUntil(), the service worker should be kept alive. (If it takes minutes to finish that work, there's the chance that the service worker would be killed anyway, but for what you describe I'd say the risk of that is low.)
Here's a rough sketch of how I'd structure your code, taking advantage of native async/await support in all browsers that currently support service workers.
(I haven't actually tested any of this, but conceptually I think it's sound.)
// In your service-worker.js:
let isPushMessageHandlerRunning = false; // `let`, not `const`: it's reassigned below
const queue = [];

self.addEventListener('push', event => {
    var data = event.data.json().data;
    event.waitUntil(queueData(data));
});

async function queueData(data) {
    queue.push(data);

    if (!isPushMessageHandlerRunning) {
        await handlePushDataQueue();
    }
}

async function handlePushDataQueue() {
    isPushMessageHandlerRunning = true;

    let data;
    while (data = queue.shift()) {
        // Await on something asynchronous, based on data.
        // e.g. showNotification(), getNotifications() + notification.close(), etc.
        await ...;
    }

    isPushMessageHandlerRunning = false;
}
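For the await inside that loop, here is a hedged sketch of what the per-item work might look like; the payload fields (type, tag, title, body) are hypothetical and need to be adapted to your actual messages:

async function processPushData(data) {
    if (data.type === 'show') {
        // Show (or replace) the notification for this tag.
        await self.registration.showNotification(data.title, {
            tag: data.tag,
            body: data.body,
        });
    } else if (data.type === 'cancel') {
        // Find and close any notification with this tag.
        const notifications = await self.registration.getNotifications({tag: data.tag});
        notifications.forEach(notification => notification.close());
    }
}

With something like that in place, the loop body becomes await processPushData(data);.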
I'm currently working on a function that takes a pretty long time to finish. Since I won't be able to make it finish faster, and I'm going to call it from other scripts, I was wondering if there is a way to use something like a promise in that function.
Basically
function longrunning() {
    var def = new $.Deferred();
    var result = {};
    [DO STUFF THAT TAKES A WHILE]
    def.resolve();
    return def.promise(result);
}
My basic problem is that since none of the stuff that's going on is async, my promise won't be returned until everything is done, so the function that later calls longrunning won't know it's async. But of course, if I return the promise before executing all of the code, it won't resolve at all. I hope you're getting what I'm trying to do. Hope someone has an idea. Thanks in advance and
Greetings Chris
Wrapping the code in a $.Deferred (or native promise) won't help even if you do manage to get the promise back to the calling code before doing the long-running work (for instance, via setTimeout). All it would accomplish is making the main UI thread seize up later, soon after longrunning returned the promise, instead of when calling longrunning itself. So, not useful. :-)
If the function in question doesn't manipulate the DOM, or if the manipulations it does can be separated from the long-running logic, this is a good candidate to be moved to a web worker (specification, MDN), so it doesn't get run on the main UI thread at all, but instead gets run in a parallel worker thread, leaving the UI free to continue to be responsive.
longrunning wouldn't do the actual work, it would just postMessage the worker to ask it to do the work, and then resolve your promise when it gets back a message that the work is done. Something along these lines (this is just a code sketch, not a fully-implemented solution):
var pendingWork = {};
var workId = 0;
var worker = new Worker("path/to/web/worker.js");

worker.addEventListener("message", function(e) {
    // Worker has finished some work, resolve the Deferred
    var d = pendingWork[e.data.id];
    if (!d) {
        console.error("Got result for work ID " + e.data.id + " but no pending entry for it was found.");
    } else {
        if (e.data.success) {
            d.resolve(e.data.result);
        } else {
            d.reject(e.data.error);
        }
        delete pendingWork[e.data.id];
    }
});

function longrunning(info) {
    // Get an identifier for this work
    var id = ++workId;
    var d = pendingWork[id] = $.Deferred();
    worker.postMessage({id: id, info: info});
    return d.promise();
}
(That assumes what the worker sends back is an object with the properties id [the work ID], success [flag], and either result [the result] or error [the error].)
As you can see, longrunning sends the work to the worker and returns a promise for it; when the worker sends the result back, the listener resolves the Deferred.
If the long-running task does need to do DOM manipulation as part of its work, it could post the necessary information back to the main script to have it do those manipulations on its behalf as necessary. The viability of that naturally depends on what the code is doing.
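For completeness, the worker side would be roughly the mirror image: receive a message, do the work, and post back an object in the shape described above. A sketch, where doLongRunningWork stands in for whatever the expensive computation actually is:

// path/to/web/worker.js
self.addEventListener("message", function(e) {
    var id = e.data.id;
    try {
        // Blocking here is fine; we're off the main UI thread.
        var result = doLongRunningWork(e.data.info);
        self.postMessage({id: id, success: true, result: result});
    } catch (err) {
        self.postMessage({id: id, success: false, error: String(err)});
    }
});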
Naturally, you could use native promises rather than jQuery's $.Deferred, if you only have to run on up-to-date browsers (or include a polyfill):
var pendingWork = {};
var workId = 0;
var worker = new Worker("path/to/web/worker.js");

worker.addEventListener("message", function(e) {
    // Worker has finished some work, settle the corresponding promise
    var work = pendingWork[e.data.id];
    if (!work) {
        console.error("Got result for work ID " + e.data.id + " but no pending entry for it was found.");
    } else {
        if (e.data.success) {
            work.resolve(e.data.result);
        } else {
            work.reject(e.data.error);
        }
        delete pendingWork[e.data.id];
    }
});

function longrunning(info) {
    return new Promise(function(resolve, reject) {
        // Get an identifier for this work
        var id = ++workId;
        pendingWork[id] = {resolve: resolve, reject: reject};
        worker.postMessage({id: id, info: info});
    });
}
I want to implement dynamic loading of a static resource in AngularJS using Promises. The problem: I have a couple of components on the page which might (or might not, depending on which are displayed, hence dynamic) need to get a static resource from the server. Once loaded, it can be cached for the whole application lifetime.
I have implemented this mechanism, but I'm new to Angular and Promises, and I want to make sure this is the right solution / approach.
var data = null;
var deferredLoadData = null;

function loadDataPromise() {
    if (deferredLoadData !== null)
        return deferredLoadData.promise;

    deferredLoadData = $q.defer();

    $http.get("data.json").then(function (res) {
        data = res.data;
        return deferredLoadData.resolve();
    }, function (res) {
        return deferredLoadData.reject();
    });

    return deferredLoadData.promise;
}
So only one request is made, and all subsequent calls to loadDataPromise() get back the first promise. It seems to work both for a request that is in progress and for one that already finished some time ago.
But is it a good solution to cache Promises?
Is this the right approach?
Yes. Memoisation of functions that return promises is a common technique to avoid the repeated execution of asynchronous (and usually expensive) tasks. The promise makes the caching easy because one does not need to distinguish between ongoing and finished operations; they're both represented as (the same) promise for the result value.
Is this the right solution?
No. That global data variable and the resolution with undefined is not how promises are intended to work. Instead, fulfill the promise with the result data! It also makes coding a lot easier:
var dataPromise = null;

function getData() {
    if (dataPromise == null)
        dataPromise = $http.get("data.json").then(function (res) {
            return res.data;
        });
    return dataPromise;
}
Then, instead of loadDataPromise().then(function() { /* use global */ data }) it is simply getData().then(function(data) { … }).
To further improve the pattern, you might want to hide dataPromise in a closure scope, and notice that you will need a lookup for different promises when getData takes a parameter (like the url).
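For instance, a sketch of that parameterised variant, with the cache hidden in a closure (this assumes it lives in an Angular service where $http is injected):

var getData = (function () {
    var promises = {}; // url -> cached promise

    return function (url) {
        if (!promises[url]) {
            promises[url] = $http.get(url).then(function (res) {
                return res.data;
            });
        }
        return promises[url];
    };
})();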
For this task I created a service called defer-cache-service which removes all this boilerplate code. It is written in TypeScript, but you can grab the compiled js file. Github source code.
Example:
function loadCached() {
    return deferCacheService.getDeferred('cache.key1', function () {
        return $http.get("data.json");
    });
}
and consume
loadCached().then(function (data) {
    //...
});
One important thing to notice: if, say, two or more parts call the same loadDataPromise at the same time, you must add this check
if (defer && defer.promise.$$state.status === 0) {
    return defer.promise;
}
otherwise you will be making duplicate calls to the backend.
This design pattern will cache whatever is returned the first time it runs, and return the cached value every time it's called again.
const asyncTask = (cache => {
    return function () {
        // when called first time, put the promise in the "cache" variable
        if (!cache) {
            cache = new Promise(function (resolve, reject) {
                setTimeout(() => {
                    resolve('foo');
                }, 2000);
            });
        }
        return cache;
    };
})();
asyncTask().then(console.log);
asyncTask().then(console.log);
Explanation:
Simply wrap your function in another self-invoking function that returns a function (your original async function). The purpose of the wrapper is to provide an enclosing scope for the local variable cache, so that the variable is only accessible within the returned function and holds the exact same value every time asyncTask is called (after the very first time).
I have an array of files that I'd like to attack N at a time, and a function doWork that returns a promise.
var files = [];

var doWork = function (file) {
    return asyncFn(file);
};
I'd like to be able to push onto this queue dynamically.
Edit: I've tried various modules (promise-queue, async-q). They all work in a fashion, but they don't allow using an array as a queue. They have their own internal structure that you need to push onto.
The reason I need to use an array is that I want to be able to push an item onto the queue and check that it's not already on the queue.
Here is how you would do that with Bluebird which you indicated you were using.
var files = ["foo.txt", "bar.txt", "baz.txt"];
var task = Promise.map(files, doWork, {concurrency: 4}); // four at a time
task.then(function(results){
// results contains the results, tasks are executed at most 4 at a time
});
A word of caution: this puts an upper limit on the concurrency of the current invocation only; calling the function multiple times, or from multiple Node processes, will (obviously) execute with larger overall concurrency. However, in the simple case this works.
You could do something like this:
// `enq_head` is the tail of the promise chain; it starts out as an
// already-resolved promise (Q() with the Q library).
var enq_head = Q();

function enq(step) {
    var f = function () {
        var d = Q.defer();
        step(d);
        return d.promise;
    };
    enq_head = enq_head.then(f);
}
where step is a function that fulfills the promise you pass it. But I don't recommend it, because it's just a fancy way of doing what setTimeout does much more efficiently.
If you want to keep track of which files you've scheduled and/or completed, just put them in a done list, or take them out of the todo list you get them from, or stick a bool in an object under the filename, or whatever. It's a separate problem from the scheduling.
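For example, a sketch of that bookkeeping on top of the Promise.map approach from the earlier answer; scheduled and enqueueFiles are names made up here for illustration:

var scheduled = {}; // filename -> true once queued

function enqueueFiles(files) {
    var fresh = files.filter(function (file) {
        if (scheduled[file]) return false; // already queued (or done), skip it
        scheduled[file] = true;
        return true;
    });
    return Promise.map(fresh, doWork, {concurrency: 4});
}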