ES6 Promise automatic post-processing / cloning results - javascript

I have a situation where I am building a data layer based on ES6 JS Promises that fetch data from the network. I am caching all Promises internally by the url.
Everything seems to be working fine except one thing. I want to ensure that the data coming out of the network layer is a copy/clone of the data retrieved from the network and I obviously do not want to do that everywhere in the client code that implements Promise's then handlers.
I would like to set this up so the then handler automatically gets a copy of the cached data.
To add a twist to this, I would like this to be configurable on a url basis inside the data layer so that some Promises do the extra post-processing copy while others return just the raw result.
Can anyone suggest a proper implementation to accomplish this? I should mention that I would like to get a new copy of the original raw result each time a new client asks for it.
The current simplified pseudo-implementation looks like this:
getCachedData(url) {
  if (cache[url]) {
    return cache[url];
  } else {
    var promise = new Promise(function (resolve, reject) {
      var data = ...ajax get...;
      resolve(data);
    });
    cache[url] = promise;
    return promise;
  }
}

getCachedData(url).then(result => {
  // here I want result to be a copy of the data I resolved the original promise with
});

Structure it like this:
function retrieveCopiedData() {
  // getDataFromServer is your original Promise-returning function
  return getDataFromServer().then(function (value) {
    // use a library of your choice for copying the object
    return copy(value);
  });
}
This means that all consumers of retrieveCopiedData will receive the value returned from retrieveCopiedData's then() handler.
retrieveCopiedData().then(function (value) {
// value is the copy returned from retrieveCopiedData's then handler
})
You can add conditional logic to retrieveCopiedData as you see fit.

It seems like you just want to incorporate the cloning process right in your data layer:
getCachedData(url) {
  if (!cache[url]) {
    cache[url] = new Promise(function (resolve, reject) {
      var data = ...ajax get...;
      resolve(data);
    });
  }
  if (requiresPostProcessing(url))
    return cache[url].then(clone);
  else
    return cache[url];
}
Notice that it might be a good idea not to clone the data each time it is retrieved, but to simply freeze the object that your promise is resolved with.
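For instance, a minimal sketch of that freezing variant (fetchData here is a hypothetical stand-in for whatever performs the ajax call above):
getCachedData(url) {
  if (!cache[url]) {
    // fetchData is assumed to return a promise for the raw data
    cache[url] = fetchData(url).then(Object.freeze); // every consumer shares one immutable result
  }
  return cache[url];
}
Keep in mind that Object.freeze is shallow, so nested objects would still be mutable.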

Related

a jQuery promise that always succeeds?

I have this jQuery code fragment:
$.get('/api/' + currentPage).done(function(data) { ... })
.fail(...)
I want to replace $.get('/api/'+currentPage) with a promise that always succeeds and returns a specific value for data. Something like:
let myData = { ... } // value of data I want to pass to the done function
(new AlwaysSucceeds(myData)).done(function(data) { ... })
.fail(...)
I could cobble up a dummy object, or I could extract out the done function but I want to keep changes to the code to a minimum.
Is there a way to do this?
UPDATE: To help clarify what's going on, the code I am working with is (here). Normally this app is served from a nodejs server which implements the /api/... call, but I am converting it to be served from a static page server. I know what is going to be returned from the $.get call. To keep changes to the code clean I simply want to change that line to:
let myData = {...}
// $.get('/api/' + currentPage) -- comment out the $.get call
(SOMETHINGHERE(myData)).done(function(data) {
The SOMETHINGHERE expression needs to implement .done(f) which will call the function f with myData and then return some object which implements .fail(...) which does nothing.
You can just replace $.get(...) with a function that returns a promise that is already resolved with the data you already have. And, the shortest way to get an already resolved jQuery promise, resolved with a particular value, is this:
$.when(myData).done(...)
The more text book way to do it in jQuery is:
$.Deferred().resolve(myData).done(...)
And, if you care to switch your logic to the ES6 standard (instead of the non-standard jQuery promise behaviors), then you could use this:
Promise.resolve(myData).then(...).catch(...)
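Applied to the fragment in the question, the drop-in replacement would look roughly like this (myData stands in for the response you already know the api would return):
let myData = { /* the data /api/currentPage would have returned */ };

// $.get('/api/' + currentPage)   replaced by an already-resolved jQuery promise:
$.when(myData)
  .done(function (data) { /* same handler as before */ })
  .fail(function () { /* never called: the promise is already resolved */ });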
You can achieve this by implementing an AlwaysSucceeds constructor function. Please see the example below.
function AlwaysSucceeds(data) {
  this.data = data;
}

AlwaysSucceeds.prototype.done = function (fn) {
  fn(this.data);
  return this;
};

AlwaysSucceeds.prototype.fail = function (fn) {
  return this;
};

var myData = {
  a: 1
};

(new AlwaysSucceeds(myData)).done(function (data) {
  console.log(data);
}).fail(function (data) {
});
Since jQuery Ajax functions just return $.Deferred objects, you can just substitute an immediately-resolved Deferred:
$.Deferred().resolve(myData).then(...)
In this particular case, if you want to make it easy to switch between synchronous and asynchronous code, and you have access to async/await, you can just use those directly:
try {
  const data = await Promise.resolve($.get('/api/' + currentPage));
  // code in done
} catch (err) {
  // code in fail
}
would become
try {
  const data = myData;
  // code in done
} catch (err) {
  // code in fail (never runs unless other code throws exceptions)
}
It's not clear what you actually want, but be careful using jQuery Deferred with native promises: the deferred has some non-standard methods that native promises don't have.
So to be safe I always assume there is a thenable, something that has a then, with which you can pretty much do whatever you want.
jQuery Deferreds do not behave like native promises either (depending on version):
$.Deferred().reject("hello world")
.then(
undefined
,x=>x
)
.then(
x=>console.log("Never happens",x)
)
Promise.reject("hello world")
.then(
undefined
,x=>x
);
.then(
x=>console.log("Well behaved",x)
);
Promise.resolve().then(x=>{throw "nope"})
.then(undefined,err=>console.warn(err));
$.Deferred().resolve().then(x=>{throw "nope"})//crashes
.then(undefined,err=>err);
So it will be safer to use native promises and polyfill with something that behaves like native ones.
To answer the question about a non-failing promise: if you want to make a request but return a default when it rejects, and keep returning the same result once it has resolved or rejected, you can do:
const get = (p => (url) => {
  p = p ||
    // return a native promise (or a properly polyfilled one)
    Promise.resolve($.get(url))
      .then(
        undefined,
        _ => ({ defaultobject: true })
      );
  return p;
})();
Your get function will return a native promise, so there is no fail, done or other non-standard methods. When combining "promises" from different libraries with native promises, it is best to only use then.
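A usage sketch under that constraint (the body of the old .done handler simply moves into .then):
get('/api/' + currentPage)
  .then(function (data) {
    // code that used to live in .done()
  });
// no .fail()/.catch() needed here: get() resolves with { defaultobject: true } instead of rejecting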

How do I create a class that processes two promises and then returns a promise?

I want to create a class whose duty is to poll data sources, collate information into an array of 'alert' objects, and then deliver a subset of those alerts to any other class that wants them.
Because polling happens asynchronously (I'm requesting data from a web service) then I assume that what I actually need to return is a promise which, when fulfilled, will give the correct subset of Alert objects.
But clearly I don't understand how to do this, because the method that is supposed to return the promise returns something else.
Here's my code so far. As you can see, I'm trying to store the promise in an instance attribute and then retrieve it:
export class AlertCollection {
constructor() {
this.alerts = null;
}
// poll the data sources for alert data; store a promise that resolves
// to an array of alerts
poll() {
this.alerts = this.pollTeapot()
.then( (arr) => {this.pollDeliverance(arr);} );
}
// return a promise that fulfils to an array of the alerts you want
filteredAlerts(filter) {
return this.alerts; // not filtering for now
}
// return a promise that fulfills to the initial array of alerts
pollTeapot() {
let process = (json) => {
json2 = JSON.parse(json);
return json2.map( (a) => new Alert(a) );
};
message = new MessageHandler("teapot", "alerts")
return message.request().then( (json) => {process(json);} );
}
// Modify the alerts based on the response from Deliverance.
// (But for the time being let's not, and say we did.)
pollDeliverance(alerts) {
return alerts;
}
}
message.request() returns a promise from the web service. That works. If I snapshot the process function inside pollTeapot() I get the right data.
But, if I snapshot the return value from filteredAlerts() I don't get that. I don't get null either (which would at least make sense, although it would be wrong). I get something like { _45: 0, _81: 0, _65: null, _54: null }.
Any pointers would be very much appreciated at this point. (This is in React Native, by the way, if that helps.)
I am not sure if I understood your problem fully, but I will try to give you a generic solution for chaining promises one after another.
someAsyncFunction().then(dataFromAsync1 => {
  return anotherAsyncFunction(dataFromAsync1).then(dataFromAsync2 => {
    return doSomethingWithData(dataFromAsync1, dataFromAsync2);
  });
});
This is going to be a hard one to describe - I have a working example but it's convoluted in that I've had to "mock up" all of the async parts, and use function classes rather than the class keyword - but the idea is the same!
There are 2 parts to this answer.
It does not make sense to store alerts as an instance variable. They are asynchronous, and won't exist until after the async calls have completed
You'll need to chain all of your behaviour onto the initial call to poll
In general, you chain promises on to one another like this
functionWhichReturnsPromise()
  .then(functionPointer)
  .then(function (result) {
    // some functionality, which can return anything - including another promise
  });
So your code would end up looking like
var alertCollection = new AlertCollection();
alertCollection.poll().then(function (alerts) {
  // here alerts have been loaded, and deliverance checked also!
});
The code of that class would look along the lines of:
export class AlertCollection {
  constructor() {
  }

  // poll the data sources for alert data; return a promise that resolves
  // to an array of alerts
  poll() {
    return this.pollTeapot()
      .then((alerts) => this.filteredAlerts(alerts))
      .then((alerts) => this.pollDeliverance(alerts));
  }

  // return a promise that fulfils to an array of the alerts you want
  filteredAlerts(alerts) {
    return alerts; // not filtering for now
  }

  // return a promise that fulfills to the initial array of alerts
  pollTeapot() {
    let process = (json) => {
      let json2 = JSON.parse(json);
      return json2.map((a) => new Alert(a));
    };
    let message = new MessageHandler("teapot", "alerts");
    return message.request().then(process);
  }

  // Modify the alerts based on the response from Deliverance.
  // (But for the time being let's not, and say we did.)
  pollDeliverance(alerts) {
    return alerts;
  }
}
A few notes
filteredAlerts can do whatever you like, so long as it returns an array of results
pollDeliverance can also do whatever you like - if it needs to call another async method, remember to return a promise which resolves to an array of alerts - perhaps updated from the result of the async call.
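If pollDeliverance did need to go back to the network, a sketch of that case might look like the following (it assumes MessageHandler can also be pointed at a "deliverance" source and that its response lines up one-to-one with the alerts; both are assumptions, not part of the original code):
pollDeliverance(alerts) {
  let message = new MessageHandler("deliverance", "alerts"); // hypothetical source name
  return message.request().then((json) => {
    let updates = JSON.parse(json);
    // merge each Deliverance record into the matching alert, then pass the array on
    return alerts.map((alert, i) => Object.assign(alert, updates[i]));
  });
}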
I have created a JSFiddle which demonstrates this - using a simple getJSON call to replicate the async nature of some of this. As I mentioned, it is convoluted, but demonstrates the process:
Live example: https://jsfiddle.net/q1r6pmda/1/

Abortable promises and using paged data from an API for an autocomplete text field

Disclaimer: I'm new to ES6 and promises in general, so it's possible my approach is fundamentally wrong.
The Problem
I have an api that returns paged data. That is, it returns a certain number of objects for a collection, and if there's more data it returns a Next property in the response body with the url to get the next set of data. The data will eventually feed the autocomplete on a text input.
To be specific, if my collection endpoint is called /Tickets the response would look like:
{
  Next: "/Tickets?skip=100&take=100",
  Items: [ ... an array of the first 100 items ... ]
}
My current solution to get all the ticket data would be to:
Make a new promise for returning the whole combined set of data
"Loop" the ajax calls by chaining dones until there is no more Next value
Resolve the promise
getTickets(filterValue) {
  let fullSetPromise = new Promise();
  let fullSet = [];

  // This gets called recursively
  let append = function (previousResult) {
    fullSet.concat(previousResult.Items);
    // Loop!
    if (previousResult.Next) {
      $.ajax({
        url: previousResult.Next
      })
      .done(append)
      .catch(reason => fullSetPromise.reject(reason));
    } else {
      fullSetPromise.resolve(fullSet);
    }
  }

  // We set things off by making the request for the first 100
  $.ajax({
    url: `/Tickets?skip=0&take=100&filter=${filterValue}`
  })
  .done(append)
  .catch(reason => fullSetPromise.reject(reason));

  return fullSetPromise;
}
Eventually the promise is used by the frontend for autocomplete on a text input. Ideally I'd like to be able to abort the previous call when new input comes in.
inputChanged(e) {
  this.oldTicketPromise.abort();
  this.oldTicketPromise =
    Api.GetTickets(e.target.value).then(updateListWithResults);
}
I am aware of debouncing. But that just means the problem happens every n seconds instead of on every key press.
I know the jqxhr object has an abort() property on it and I'd like that to be available to the caller somehow. But because there are multiple jqXHR objects used in GetTickets I'm not sure how to handle this.
So my main 2 questions are:
What is the appropriate way to consume paged data from an api while returning a promise.
How can the returned promise be made abortable?
Side question:
I feel like if I don't catch the errors then my "wrapper" promise will swallow any thrown errors. Is that a correct assumption?
Note the javascript code might have errors. It's mostly demonstrative for the logic.
Edit: Solved
I have solved this by combining this answer https://stackoverflow.com/a/30235261/730326 with an array of xhrs as suggested in the comments. I will add a proper answer with code when I find the time.
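In the meantime, here is a rough sketch of that combination: a native promise chained over the paged requests, with every jqXHR collected so a single abort() can cancel whichever page is currently in flight. The abort property tacked onto the returned promise mirrors the usage in inputChanged above; treat it as an illustration of the approach, not the asker's final code.
function getTickets(filterValue) {
  const xhrs = [];     // every jqXHR issued for this logical request
  let aborted = false;

  const fetchPage = (url, items) =>
    new Promise((resolve, reject) => {
      const xhr = $.ajax({ url });
      xhrs.push(xhr);
      xhr.done(result => {
        const all = items.concat(result.Items);
        if (result.Next && !aborted) {
          resolve(fetchPage(result.Next, all)); // keep going until there is no Next
        } else {
          resolve(all);
        }
      }).fail((_, textStatus) => reject(textStatus));
    });

  const promise = fetchPage(`/Tickets?skip=0&take=100&filter=${filterValue}`, []);
  promise.abort = () => {   // cancels the whole paged fetch
    aborted = true;
    xhrs.forEach(xhr => xhr.abort());
  };
  return promise;
}
Aborting rejects the pending promise with "abort", which the caller can simply ignore.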

Should I emulate '.$promise' objects?

I am working on an admin app with AngularJS. The app fetches its data from the server using $resource. I end up with data objects containing a '$promise' property to determine when the data has been fetched.
Everything's fine.
Now, this admin app can also create new objects. Those new objects are managed by the same controllers as the ones that usually come from '$resource'.
So now I have 2 kind of objects:
Objects with $promise property. I should use $promise.then() before manipulating them with full data
Plain objects. They don't have a $promise property, their value is accessible instantly
I would like to reduce code and to have a single use-case, not having to check if object data is resolved, or if it is not a promise.
Is there any problem in building my 'plain' objects by adding a '$promise' property to them which is already resolved to the object itself? That way, I would always use 'myObject.$promise.then()'.
Is there any common pattern to handle this situation? I could not find any 'standard' method to create this kind of objects with Angular.
You could use $q.when if unsure whether the object has a promise or not.
(obj.$promise ? obj.$promise : $q.when(obj)).then(function (result) {
  // handle success case.
  // In case the object does not have $promise, result will be the object itself
});
If the object does not have a promise of its own, $q.when will wrap it in a resolved one.
Wraps an object that might be a value or a (3rd party) then-able promise into a $q promise. This is useful when you are dealing with an object that might or might not be a promise, or if the promise comes from a source that can't be trusted.
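In other words, the whole check can be collapsed into something like:
// $q.when handles both cases: a plain object and a $resource result carrying $promise
$q.when(obj.$promise || obj).then(function (result) {
  // result is the resolved object in both cases
});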
You do not need to always create a promise and attach it to the data that is being transferred; instead you could make your methods return promises, which helps you implement the promise pattern and abstracts the promise logic into your service itself. Example:
function getCurrentUserData(userId) {
  var deferred = $q.defer();
  ...
  // Check in my cache if this is already there, then get it from the cache and resolve it
  deferred.resolve(objectFromCache);
  // my else condition
  // It is not in the cache, so make the call
  $http.get('myurl').then(function (result) {
    // Do validation with the data and if it does not match then reject it
    deferred.reject(reason);
    // Else do something with the data, put it into the cache, mapping logic etc.
    deferred.resolve(dto);
  }).catch(function (error) {
    // do something with the error and then reject
    deferred.reject(reasonDerived);
  });
  return deferred.promise;
}
Here is a simplified and less explicit version (Credit: Benjamin Gruenbaum):
var cached = null;

function getCurrentUserData(userId) {
  return cached = cached || $http.get('myurl').then(function (result) {
    if (isNotValid(result)) return $q.reject(reason); // alternatively `throw`
    return transformToDto(result);
  }, function (response) {
    if (checkforsomethingonstatusandreject)
      return $q.reject('Errored Out');
    // do some actions to notify the error scenarios and return default actions
    return someDefaults;
  });
}
You can of course return $q.reject(derivedReason) here rather than returning the reason, and transform it based on further checks; the idea is to cache the promise rather than the value. This also has the advantage of not making multiple http requests if the method is called again before the first request completes.
Now you could always do:-
getCurrentUserData(userid).then(function(user){
}).catch(function(error){
});
Promises can be chained through as well. So you could also do this:-
return $resource('myresource').$promise.then(function(result){
//Do some mapping and return mapped data
return mappedResult;
});
Promises/A+ libraries should offer a static method:
Promise.resolve(val);
...which generates a pre-resolved promise. You should return one of these if your promise library offers it. I do this frequently to avoid interface duplication.
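In Angular terms, the asker's idea would translate into something like the following sketch (withPromise is a hypothetical helper; it uses $q since the plain objects are created inside the app):
// give a plain object the same shape as a $resource result:
// a $promise that is already resolved with the object itself
function withPromise(obj) {
  obj.$promise = $q.when(obj);
  return obj;
}

// now both kinds of object can be consumed the same way
withPromise(newObject).$promise.then(function (data) { /* ... */ });
resourceObject.$promise.then(function (data) { /* ... */ });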

Caching a promise object in AngularJS service

I want to implement dynamic loading of a static resource in AngularJS using Promises. The problem: I have a couple of components on the page which might (or might not, depending on which are displayed, thus dynamic) need to get a static resource from the server. Once loaded, it can be cached for the whole application life.
I have implemented this mechanism, but I'm new to Angular and Promises, and I want to make sure this is the right solution/approach.
var data = null;
var deferredLoadData = null;

function loadDataPromise() {
  if (deferredLoadData !== null)
    return deferredLoadData.promise;

  deferredLoadData = $q.defer();

  $http.get("data.json").then(function (res) {
    data = res.data;
    return deferredLoadData.resolve();
  }, function (res) {
    return deferredLoadData.reject();
  });

  return deferredLoadData.promise;
}
So, only one request is made, and all subsequent calls to loadDataPromise() get back the first promise. It seems to work both for a request that is still in progress and for one that finished some time ago.
But is it a good solution to cache Promises?
Is this the right approach?
Yes. The use of memoisation on functions that return promises is a common technique to avoid the repeated execution of asynchronous (and usually expensive) tasks. The promise makes the caching easy because one does not need to distinguish between ongoing and finished operations; they're both represented as (the same) promise for the result value.
Is this the right solution?
No. That global data variable and the resolution with undefined is not how promises are intended to work. Instead, fulfill the promise with the result data! It also makes coding a lot easier:
var dataPromise = null;

function getData() {
  if (dataPromise == null)
    dataPromise = $http.get("data.json").then(function (res) {
      return res.data;
    });
  return dataPromise;
}
Then, instead of loadDataPromise().then(function() { /* use global */ data }) it is simply getData().then(function(data) { … }).
To further improve the pattern, you might want to hide dataPromise in a closure scope, and notice that you will need a lookup for different promises when getData takes a parameter (like the url).
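A minimal sketch of that keyed variant (assuming the cache object lives in the service's closure scope):
var promiseCache = {}; // hidden in the service's closure

function getData(url) {
  if (!promiseCache[url]) {
    promiseCache[url] = $http.get(url).then(function (res) {
      return res.data;
    });
  }
  return promiseCache[url];
}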
For this task I created a service called defer-cache-service which removes all this boilerplate code. It is written in TypeScript, but you can grab the compiled js file. Github source code.
Example:
function loadCached() {
  return deferCacheService.getDeferred('cacke.key1', function () {
    return $http.get("data.json");
  });
}
and consume
loadCached().then(function(data) {
//...
});
One important thing to notice: if, say, two or more parts call the same loadDataPromise at the same time, you must add this check
if (defer && defer.promise.$$state.status === 0) {
return defer.promise;
}
otherwise you will be doing duplicate calls to backend.
This design pattern will cache whatever is returned the first time it runs, and return the cached thing every time it's called again.
const asyncTask = (cache => {
  return function () {
    // when called first time, put the promise in the "cache" variable
    if (!cache) {
      cache = new Promise(function (resolve, reject) {
        setTimeout(() => {
          resolve('foo');
        }, 2000);
      });
    }
    return cache;
  };
})();
asyncTask().then(console.log);
asyncTask().then(console.log);
Explanation:
Simply wrap your function with another self-invoking function which returns a function (your original async function). The purpose of the wrapper function is to provide an encapsulating scope for the local variable cache, so that the variable is only accessible within the returned function and holds the exact same value every time asyncTask is called (other than the very first time, when it gets assigned).
