I wish to validate my assumptions:
From what I have read so far, to create an async call I must eventually end up calling a native JavaScript API (like setTimeout(), XMLHttpRequest(), ...) to get out of the engine's main scheduler loop.
Any third-party library wishing to be async will indirectly do the same.
Is this right, or is there a way to create my own async code (for instance by intentionally allowing a blocking feature in my code) without a native JS call?
In JavaScript, there are a few ways to write code that can be run asynchronously. Typically the methods used were:
setTimeout
setInterval
XMLHttpRequest
But in HTML5 (and beyond) there are a lot of other methods that work asynchronously. For example, the new asynchronous file uploading, Web Workers and Promises. (NOTE: not all of these are supported by every browser)
Your question, though, is vague. What are you attempting to do? If it's UI-centric, you may want to look at requestAnimationFrame.
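For UI work, a minimal sketch (not from the question) of the requestAnimationFrame pattern could look like this:
// Do one small piece of UI work per frame instead of blocking the main loop
var progress = 0;
function step() {
    progress += 1;
    // ...update the UI with the current progress here...
    if (progress < 100) {
        window.requestAnimationFrame(step); // schedule the next piece of work on the next frame
    }
}
window.requestAnimationFrame(step);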
If by blocking you mean you want to show a "loading" gif, go do some stuff, and change the gif once the stuff is done, there are a lot of ways to handle that. With the newer browsers you'd use Promises (http://www.html5rocks.com/en/tutorials/es6/promises/) combined with a check for some value in a timeout function:
// The values to check
var isDone = false;
var notStarted = true;
var hasErrors = false;

var asynchCheck = function(resolve, reject) {
    if (notStarted) {
        notStarted = false;
        // do stuff (calls an async function that eventually sets isDone = true,
        // and hasErrors = true if something failed)
    }
    if (isDone === true) {
        if (!hasErrors) resolve("Stuff worked!");
        else reject(Error("It broke"));
    }
    // Not done yet: check again in 50 ms, passing resolve/reject along so they are not lost
    else setTimeout(function() { asynchCheck(resolve, reject); }, 50);
};

// Start your function
var promise = new Promise(asynchCheck);

// Now set the success/fail handlers
promise.then(function(result) {
    console.log(result); // "Stuff worked!"
}, function(err) {
    console.log(err); // Error: "It broke"
});
Related
I need to display the spinner for at least 2 seconds before transitioning, even if the AJAX requests have already completed.
I have the following, but it isn't working:
route: {
    data(transition) {
        this.getDelayedData();
        transition.next();
    }
},
methods: {
    getSliders() {
        this.$http.get('/api/sliders/getsliders')
            .then(sliders => {
                this.sliders = sliders.data
            });
    },
    getPosts() {
        this.$http.get('/api/posts/getposts')
            .then(posts => {
                this.posts = posts.data.data
            });
    },
    getDelayedData() {
        function timer() {
            var dfd = $.Deferred();
            setTimeout(function() {
                console.log("done");
            }, 2000, dfd.resolve);
            return dfd.promise();
        }
        $.when(this.getPosts(), this.getSliders(), timer())
            .done();
    }
}
I tried to implement the code after reading this post, but the $.when function isn't waiting until the setTimeout function has finished executing.
You have already accepted my other answer, but I think the code can be improved significantly, as explained below:
It is generally recommended not to mix jQuery with Vue.js, as both modify the DOM:
Vue.js keeps a very strong hold on the DOM for its model-view bindings, using its Reactivity System.
jQuery may modify the DOM using class or id selectors if you use it for some quick-fix solution somewhere.
You will end up spending a lot of time debugging frontend issues, which takes focus away from the business requirements. Also, Vue.js already requires modern browsers, so technically jQuery is not doing much for browser compatibility, which is its primary goal.
So if you are open to leaving jQuery behind, here is another method:
getDelayedData() {
    // Create a new Promise and resolve after 2 seconds
    var myTimerPromise = new Promise((resolve, reject) => {
        setTimeout(() => {
            // Callback function for the timer, called after the delay.
            // The timer is done now: print a line for debugging and resolve myTimerPromise.
            console.log("2 seconds up, resolving myTimerPromise");
            resolve();
        }, 2000); // This promise will be resolved in 2000 milliseconds
    });

    // Start displaying the ajax spinner before fetching data
    this.displaySpinner = true;

    // Fetch data now, within Promise.all()
    Promise.all([myTimerPromise, this.getPosts(), this.getSliders()]).then(() => {
        // Hide the spinner now. AJAX requests are complete, and at least 2 seconds have passed.
        this.displaySpinner = false;
        console.log("Data fetched for sliders and posts");
    });
}
The above code also requires you to have this.displaySpinner, which activates or deactivates your ajax spinner. Or you may have some other way of doing it.
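For example, a hypothetical sketch of the component's data option, with displaySpinner driving a v-if on the spinner element (names are illustrative, not from the question):
// Template sketch: <img v-if="displaySpinner" src="/img/spinner.gif">
data() {
    return {
        displaySpinner: false, // turned on before fetching, off inside Promise.all().then()
        sliders: [],
        posts: []
    };
},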
And you still need to return this.$http.get(..) from your getPosts and getSliders methods.
Note: I haven't tested the above code; you may need to debug it. It should work, based on my understanding of Promise.all() as given in this reference doc: https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Promise/all
Once it starts working, you can simplify myTimerPromise as follows:
getDelayedData() {
    // Create a new Promise and resolve after 2 seconds
    var myTimerPromise = new Promise((resolve, reject) => {
        setTimeout(resolve, 2000); // "resolve" is already a function, no need for another anonymous function here
    });
    // and do the other stuff here...
}
I would prefer this method, as it avoids jQuery. Then my Vue.js app can be as small as 30 KB gzipped, whereas if I bundle jQuery along, it can go well over 100 KB. It is just a personal choice :-)
I believe the problem is that your methods getSliders() and getPosts() are not returning any Promises, so your deferred timer has no way of knowing what is going on inside these get methods.
Your this.$http.get(..) from Vue-resource returns a promise. You can return it as follows:
getSliders() {
    // Fetch sliders and "return" the promise object
    return this.$http.get('/api/sliders/getsliders')
        .then(sliders => {
            // The data on the Vue instance is already being set here, so it will not wait for the timeout.
            this.sliders = sliders.data
            // Instead you may try to set it on "this.sliders_temp" and copy it to "this.sliders" later
        });
}, // and so on...
That follows the example given here: http://www.erichynds.com/blog/using-deferreds-in-jquery, which is linked in the other stackoverflow answer that you referred to in the question. They return $.get(..) in their functions.
But as indicated in my code sample above, even if you return this.$http.get(...), it will still not work for you, because you are already setting this.sliders inside your success handler for $http. So your Vue template will update the UI immediately. You need to think through a better strategy for your app.
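One possible (untested) sketch of such a strategy, building on the this.sliders_temp idea from the comment above; this.posts_temp is the analogous hypothetical property for posts:
methods: {
    getSliders() {
        // Fetch into a temporary property so the template does not update yet
        return this.$http.get('/api/sliders/getsliders')
            .then(sliders => { this.sliders_temp = sliders.data });
    },
    // getPosts() would do the same with this.posts_temp
    getDelayedData() {
        this.displaySpinner = true;
        var myTimerPromise = new Promise(resolve => setTimeout(resolve, 2000));
        Promise.all([myTimerPromise, this.getPosts(), this.getSliders()]).then(() => {
            // Only now copy the data into the properties the template actually renders
            this.sliders = this.sliders_temp;
            this.posts = this.posts_temp;
            this.displaySpinner = false;
        });
    }
}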
I want to implement dynamic loading of a static resource in AngularJS using Promises. The problem: I have a couple of components on the page which might (or might not, depending on which are displayed, hence dynamic) need to get a static resource from the server. Once loaded, it can be cached for the whole application lifetime.
I have implemented this mechanism, but I'm new to Angular and Promises, and I want to make sure this is the right solution/approach.
var data = null;
var deferredLoadData = null;

function loadDataPromise() {
    if (deferredLoadData !== null)
        return deferredLoadData.promise;

    deferredLoadData = $q.defer();

    $http.get("data.json").then(function (res) {
        data = res.data;
        return deferredLoadData.resolve();
    }, function (res) {
        return deferredLoadData.reject();
    });

    return deferredLoadData.promise;
}
So only one request is made, and all subsequent calls to loadDataPromise() get back the first promise. It seems to work both for a request that is still in progress and for one that finished some time ago.
But is it a good solution to cache Promises?
Is this the right approach?
Yes. The use of memoisation on functions that return promises is a common technique to avoid the repeated execution of asynchronous (and usually expensive) tasks. The promise makes the caching easy because one does not need to distinguish between ongoing and finished operations; both are represented as (the same) promise for the result value.
Is this the right solution?
No. That global data variable and the resolution with undefined are not how promises are intended to work. Instead, fulfill the promise with the result data! It also makes coding a lot easier:
var dataPromise = null;

function getData() {
    if (dataPromise == null)
        dataPromise = $http.get("data.json").then(function (res) {
            return res.data;
        });
    return dataPromise;
}
Then, instead of loadDataPromise().then(function() { /* use global */ data }) it is simply getData().then(function(data) { … }).
To further improve the pattern, you might want to hide dataPromise in a closure scope, and notice that you will need a lookup for different promises when getData takes a parameter (like the url).
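For instance, a hedged sketch of that parameterised variant, with the promise cache hidden in a closure (names are illustrative):
var getData = (function () {
    var promiseCache = {}; // hidden in the closure, keyed by url
    return function (url) {
        if (!promiseCache[url]) {
            promiseCache[url] = $http.get(url).then(function (res) {
                return res.data;
            });
        }
        return promiseCache[url];
    };
})();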
For this task I created a service called defer-cache-service, which removes all this boilerplate code. It is written in TypeScript, but you can grab the compiled JS file. Github source code.
Example:
function loadCached() {
    return deferCacheService.getDeferred('cacke.key1', function () {
        return $http.get("data.json");
    });
}
and consume it:
loadCached().then(function(data) {
//...
});
One important thing to notice: if, say, two or more parts call the same loadDataPromise at the same time, you must add this check
if (defer && defer.promise.$$state.status === 0) {
return defer.promise;
}
otherwise you will be making duplicate calls to the backend.
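For context, a rough sketch of where such a check could sit inside a loadDataPromise-style function; this is untested, $$state is an internal Angular detail, and note that this variant re-fetches once the cached promise has settled:
var defer = null;
function loadDataPromise() {
    // Reuse the promise while a request is still pending ($$state.status === 0)
    if (defer && defer.promise.$$state.status === 0) {
        return defer.promise;
    }
    defer = $q.defer();
    $http.get("data.json").then(function (res) {
        defer.resolve(res.data);
    }, function () {
        defer.reject();
    });
    return defer.promise;
}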
This design pattern will cache whatever is returned the first time it runs, and return the cached value every time it's called again.
const asyncTask = (cache => {
    return function () {
        // when called the first time, put the promise in the "cache" variable
        if (!cache) {
            cache = new Promise(function (resolve, reject) {
                setTimeout(() => {
                    resolve('foo');
                }, 2000);
            });
        }
        return cache;
    };
})();

asyncTask().then(console.log);
asyncTask().then(console.log);
Explanation:
Simply wrap your function with another self-invoking function which returns a function (your original async function). The purpose of the wrapper function is to provide an enclosing scope for the local variable cache, so that the variable is only accessible within the returned function and has the exact same value every time asyncTask is called (other than the very first time).
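The same idea can be pulled out into a small reusable wrapper; this is just a sketch of the pattern, with once as a hypothetical helper name:
// Hypothetical helper: call `fn` only once and cache the promise it returns
function once(fn) {
    var cache = null;
    return function () {
        if (!cache) {
            cache = fn();
        }
        return cache;
    };
}

var cachedTask = once(function () {
    return new Promise(resolve => setTimeout(() => resolve('foo'), 2000));
});

cachedTask().then(console.log); // 'foo' after 2 seconds
cachedTask().then(console.log); // same cached promise, no second timer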
TL;DR: Is there any way to rewrite this callback-based JavaScript code to use promises and generators instead?
Background
I have a Firefox extension written using the Firefox Add-on SDK. As usual for the SDK, the code is split into an add-on script and a content script. The two scripts have different kinds of privileges: add-on scripts can do fancy things such as, for example, calling native code through the js-ctypes interface, while content scripts can interact with web pages. However, add-on scripts and content scripts can only interact with each other through an asynchronous message-passing interface.
I want to be able to call extension code from a user script on an ordinary, unprivileged web page. This can be done using a mechanism called exportFunction that lets one, well, export a function from extension code to user code. So far, so good. However, one can only use exportFunction in a content script, not an add-on script. That would be fine, except that the function I need to export needs to use the aforementioned js-ctypes interface, which can only be done in an add-on script.
(Edit: it turns out to not be the case that you can only use exportFunction in a content script. See the comment below.)
To get around this, I wrote a "wrapper" function in the content script; this wrapper is the function I actually export via exportFunction. I then have the wrapper function call the "real" function, over in the add-on script, by passing a message to the add-on script. Here's what the content script looks like; it's exporting the function lengthInBytes:
// content script
function lengthInBytes(arg, callback) {
  self.port.emit("lengthInBytesCalled", arg);
  self.port.on("lengthInBytesReturned", function(result) {
    callback(result);
  });
}

exportFunction(lengthInBytes, unsafeWindow, {defineAs: "lengthInBytes",
                                             allowCallbacks: true});
And here's the add-on script, where the "real" version of lengthInBytes is defined. The code here listens for the content script to send it a lengthInBytesCalled message, then calls the real version of lengthInBytes, and sends back the result in a lengthInBytesReturned message. (In real life, of course, I probably wouldn't need to use js-ctypes to get the length of a string; this is just a stand-in for some more interesting C library call. Use your imagination. :) )
// add-on script
// Get "chrome privileges" to access the Components object.
var {Cu, Cc, Ci} = require("chrome");
Cu.import("resource://gre/modules/ctypes.jsm");
Cu.import("resource://gre/modules/Services.jsm");
var pageMod = require("sdk/page-mod");
var data = require("sdk/self").data;
pageMod.PageMod({
  include: ["*", "file://*"],
  attachTo: ["existing", "top"],
  contentScriptFile: data.url("content.js"),
  contentScriptWhen: "start", // Attach the content script before any page script loads.
  onAttach: function(worker) {
    worker.port.on("lengthInBytesCalled", function(arg) {
      let result = lengthInBytes(arg);
      worker.port.emit("lengthInBytesReturned", result);
    });
  }
});
function lengthInBytes(str) {
// str is a JS string; convert it to a ctypes string.
let cString = ctypes.char.array()(str);
libc.init();
let length = libc.strlen(cString); // defined elsewhere
libc.shutdown();
// `length` is a ctypes.UInt64; turn it into a JSON-serializable
// string before returning it.
return length.toString();
}
Finally, the user script (which will only work if the extension is installed) looks like this:
// user script, on an ordinary web page
lengthInBytes("hello", function(result) {
console.log("Length in bytes: " + result);
});
What I want to do
Now, the call to lengthInBytes in the user script is an asynchronous call; instead of returning a result, it "returns" its result in its callback argument. But, after seeing this video about using promises and generators to make async code easier to understand, I'm wondering how to rewrite this code in that style.
Specifically, what I want is for lengthInBytes to return a Promise that somehow represents the eventual payload of the lengthInBytesReturned message. Then, in the user script, I'd have a generator that evaluated yield lengthInBytes("hello") to get the result.
But, even after watching the above-linked video and reading about promises and generators, I'm still stumped about how to hook this up. A version of lengthInBytes that returns a Promise would look something like:
function lengthInBytesPromise(arg) {
self.port.emit("lengthInBytesCalled", arg);
return new Promise(
// do something with `lengthInBytesReturned` event??? idk.
);
}
and the user script would involve something like
var result = yield lengthInBytesPromise("hello");
console.log(result);
but that's as much as I've been able to figure out. How would I write this code, and what would the user script that calls it look like? Is what I want to do even possible?
A complete working example of what I have so far is here.
Thanks for your help!
A really elegant solution to this problem is coming in the next next version of JavaScript, ECMAScript 7, in the form of async functions, which are a marriage of Promises and generators that sugars over the warts of both. More on that at the very bottom of this answer.
I'm the author of Regenerator, a transpiler that supports async functions in browsers today, but I realize it might be overkill to suggest you introduce a compilation step into your add-on development process, so I'll focus instead on the questions you're actually asking: how does one design a sensible Promise-returning API, and what is the nicest way to consume such an API?
First of all, here's how I would implement lengthInBytesPromise:
function lengthInBytesPromise(arg) {
  self.port.emit("lengthInBytesCalled", arg);
  return new Promise(function(resolve, reject) {
    self.port.on("lengthInBytesReturned", function(result) {
      resolve(result);
    });
  });
}
The function(resolve, reject) { ... } callback is invoked immediately when the promise is instantiated, and the resolve and reject parameters are callback functions that can be used to provide the eventual value for the promise.
If there was some possibility of failure in this example, you could pass an Error object to the reject callback, but it seems like this operation is infallible, so we can just ignore that case here.
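For completeness, here is what the failure path might look like; the lengthInBytesFailed message is hypothetical and not part of the question's add-on script:
// Hedged sketch: assumes a hypothetical "lengthInBytesFailed" message
// that the add-on script would have to emit on error.
function lengthInBytesPromise(arg) {
  self.port.emit("lengthInBytesCalled", arg);
  return new Promise(function(resolve, reject) {
    self.port.on("lengthInBytesReturned", resolve);
    self.port.on("lengthInBytesFailed", function(message) {
      reject(new Error(message));
    });
  });
}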
So that's how an API creates promises, but how do consumers consume such an API? In your content script, the simplest thing to do is to call lengthInBytesPromise and interact with the resulting Promise directly:
lengthInBytesPromise("hello").then(function(length) {
console.log(result);
});
In this style, you put the code that depends on the result of lengthInBytesPromise in a callback function passed to the .then method of the promise, which may not seem like a huge improvement over callback hell, but at least the indentation is more manageable if you're chaining a longer series of asynchronous operations:
lengthInBytesPromise("hello").then(function(length) {
console.log(result);
return someOtherPromise(length);
}).then(function(resultOfThatOtherPromise) {
return yetAnotherPromise(resultOfThatOtherPromise + 1);
}).then(function(finalResult) {
console.log(finalResult);
});
Generators can help reduce the boilerplate here, but additional runtime support is necessary. Probably the easiest approach is to use Dave Herman's task.js library:
spawn(function*() { // Note the *; this is a generator function!
var length = yield lengthInBytesPromise("hello");
var resultOfThatOtherPromise = yield someOtherPromise(length);
var finalResult = yield yetAnotherPromise(resultOfThatOtherPromise + 1);
console.log(finalResult);
});
This code is a lot shorter and less callback-y, that's for sure. As you can guess, most of the magic has simply been moved into the spawn function, but its implementation is actually pretty straightforward.
The spawn function takes a generator function and invokes it immediately to get a generator object. It then invokes the gen.next() method of the generator object to get the first yielded promise (the result of lengthInBytesPromise("hello")) and waits for that promise to be fulfilled. Next it invokes gen.next(result) with the result, which provides a value for the first yield expression (the one assigned to length) and causes the generator function to run up to the next yield expression (namely yield someOtherPromise(length)), producing the next promise, and so on, until there are no more promises left to await and the generator function finally returns.
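To make that description concrete, here is a stripped-down sketch of such a driver; it is not the actual task.js implementation and omits error handling:
function spawn(generatorFunction) {
  var gen = generatorFunction();   // get the generator object
  function step(value) {
    var next = gen.next(value);    // run the generator up to the next `yield`
    if (next.done) return;         // generator returned: we are finished
    next.value.then(step);         // wait for the yielded promise, then resume with its result
  }
  step(undefined);
}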
To give you a taste of what's coming in ES7, here's how you might use an async function to implement exactly the same thing:
async function process(arg) {
var length = await lengthInBytesPromise(arg);
var resultOfThatOtherPromise = await someOtherPromise(length);
var finalResult = await yetAnotherPromise(resultOfThatOtherPromise + 1);
return finalResult;
}
// An async function always returns a Promise for its own return value.
process(arg).then(function(finalResult) {
console.log(finalResult);
});
All that's really happening here is that the async keyword has replaced the spawn function (and the * generator syntax), and await has replaced yield. It's not a huge leap, but it will be really nice to have this syntax built into the language instead of having to rely on an external library like task.js.
If you're excited about using async functions instead of task.js, then by all means check out Regenerator!
I think the Promise is built by wrapping your original callback inside the resolve/reject function:
function lengthInBytesPromise(arg) {
  self.port.emit("lengthInBytesCalled", arg);
  let returnVal = new Promise(function(resolve, reject) {
    self.port.on("lengthInBytesReturned", function(result) {
      if (result) { // maybe some kind of validity check
        resolve(result);
      } else {
        reject("Something went wrong?");
      }
    });
  });
  return returnVal;
}
Basically, it'd create the Promise and return it immediately, while the inside of the Promise kicks off and then handles the async task. I think at the end of the day someone has to take the callback-style code and wrap it up.
Your user would then do something like
lengthInBytesPromise(arg).then(function(result) {
// do something with the result
});
Is there a way to wait on a promise so that you can get the actual result from it and return that instead of returning the promise itself? I'm thinking of something similar to how the C# await keyword works with Tasks.
Here is an example of why I'd like to have a method like canAccess() that returns true or false instead of a promise so that it can be used in an if statement. The method canAccess() would make an AJAX call using $http or $resource and then somehow wait for the promise to get resolved.
It would look something like this:
$scope.canAccess = function(page) {
    var resource = $resource('/api/access/:page');
    var result = resource.get({page: page});
    // how to await this and not return the promise but the real value
    return result.canAccess;
}
Is there any way to do this?
In general that's a bad idea. Let me tell you why. JavaScript in a browser is basically a single threaded beast. Come to think of it, it's single threaded in Node.js too. So anything you do to not "return" at the point you start waiting for the remote request to succeed or fail will likely involve some sort of looping to delay execution of the code after the request. Something like this:
var semaphore = false;
var superImportantInfo = null;

// Make a remote request.
$http.get('some wonderful URL for a service').then(function (results) {
    superImportantInfo = results;
    semaphore = true;
});

while (!semaphore) {
    // We're just waiting.
}

// Code we're trying to avoid running until we know the results of the URL call.
console.log('The thing I want for lunch is... ' + superImportantInfo);
But if you try that in a browser and the call takes a long time, the browser will think your JavaScript code is stuck in a loop and pop up a message in the user's face giving the user the chance to stop your code. JavaScript therefore structures it like so:
// Make a remote request.
$http.get('some wonderful URL for a service').then(function (results) {
    // Code we're trying to avoid running until we know the results of the URL call.
    console.log('The thing I want for lunch is... ' + results);
});
// Continue on with other code which does not need the super important info or
// simply end our JavaScript altogether. The code inside the callback will be
// executed later.
The idea being that the code in the callback will be triggered by an event whenever the service call returns. Because event driven is how JavaScript likes it. Timers in JavaScript are events, user actions are events, HTTP/HTTPS calls to send and receive data generate events too. And you're expected to structure your code to respond to those events when they come.
Can you not structure your code such that it thinks canAccess is false until such time as the remote service call returns and it maybe finds out that it really is true after all? I do that all the time in AngularJS code where I don't know what the ultimate set of permissions I should show to the user is because I haven't received them yet or I haven't received all of the data to display in the page at first. I have defaults which show until the real data comes back and then the page adjusts to its new form based on the new data. The two way binding of AngularJS makes that really quite easy.
Use a .get() callback function to ensure you get a resolved resource.
Helpful links:
Official docs
How to add call back for $resource methods in AngularJS
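For example, a rough, untested sketch based on the canAccess example from the question (assuming page is in scope):
$scope.canAccess = false; // default until the server answers
var resource = $resource('/api/access/:page');
resource.get({page: page}, function (result) {
    // This callback runs once the resource has been resolved
    $scope.canAccess = result.canAccess;
});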
You can't: there aren't any features in Angular, Q (promises) or JavaScript (at this point in time) that let you do that.
You will when ES7 happens (with await).
You can if you use another framework or a transpiler (as suggested in the linked article: the Traceur transpiler or Spawn).
You can if you roll your own implementation!
My approach was to create a function using the plain old XMLHttpRequest object, as follows:
var globalRequestSync = function (pUrl, pVerbo, pCallBack) {
    var httpRequest = new XMLHttpRequest();
    httpRequest.onreadystatechange = function () {
        if (httpRequest.readyState == 4 && httpRequest.status == 200) {
            pCallBack(httpRequest.responseText);
        }
    };
    // The third argument `false` makes the request synchronous (blocking).
    httpRequest.open(pVerbo, pUrl, false);
    httpRequest.send(null);
};
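A usage sketch (the URL is just an illustration): because the request is synchronous, the callback has already run by the time globalRequestSync returns.
globalRequestSync('/api/access/somePage', 'GET', function (responseText) {
    console.log(responseText); // available immediately after the call returns
});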
I recently had this problem and made a utility called 'syncPromises'. It basically works by passing in what I called an "instruction list", which is an array of functions to be called in order. You call the first .then() to kick things off, then dynamically attach a new .then() to the next item in the instruction list when each response comes back, so you need to keep track of the index.
// instructionList is an array of functions that each return a promise.
function syncPromises(instructionList) {
    var defer = $q.defer();

    function next(i) {
        // Each function in the instructionList needs to return a promise
        instructionList[i]().then(function () {
            i++;
            if (i < instructionList.length) {
                next(i);
            } else {
                // Every instruction has finished, so resolve the outer promise
                defer.resolve();
            }
        });
    }

    next(0);
    return defer.promise;
}
I found this gave us the most flexibility.
You can automatically push operations to build an instruction list, and you're also able to append as many .then() response handlers as you need in the calling function. You can also chain multiple syncPromises calls, which will all happen in order.
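For example, a usage sketch; stepOne and stepTwo are hypothetical promise-returning functions, not part of the original answer:
// Hypothetical promise-returning steps
function stepOne() { return $http.get('/api/first'); }
function stepTwo() { return $http.get('/api/second'); }

syncPromises([stepOne, stepTwo]).then(function () {
    console.log('All instructions finished, in order');
});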