WebGL: Callback hell in the resource loader - javascript

Please help me deal with this garbage I produced:
Program.prototype.init = function () {
    loadText('../res/shaders/blinnPhong-shader.vsh', function (vshErr, vshText) {
        if (vshErr) {
            alert('Fatal error loading vertex shader.');
            console.error(vshErr);
        } else {
            loadText('../res/shaders/blinnPhong-shader.fsh', function (fshErr, fshText) {
                if (fshErr) {
                    alert('Fatal error loading fragment shader.');
                    console.error(fshErr);
                } else {
                    loadJSON('../res/models/dragon.json', function (modelErr, modelObj) {
                        if (modelErr) {
                            alert('Fatal error loading model.');
                            console.error(modelErr);
                        } else {
                            loadImage('../res/textures/susanTexture.png', function (imgErr, img) {
                                if (imgErr) {
                                    alert('Fatal error loading texture.');
                                    console.error(imgErr);
                                } else {
                                    this.run = true;
                                    RunProgram(vshText, fshText, img, modelObj);
                                }
                            });
                        }
                    });
                }
            });
        }
    });
};
My actual goal is to abstract the resource loading process for a WebGL program.
That means in the future there will be arrays of meshes, textures, shaders and I want to be able to connect certain dependencies between resources. For example: I want to create two GameObjects One and Two. One uses shaders and is loaded from a mesh but has no texture, whereas Two uses the same shaders as One but uses its own mesh and also needs a texture. What principles could I use to achieve building these dependencies in JavaScript (with asynchronous loading and so on)?
Edit:
So the following is happening with this code: I kept callbacks for now. However, this method is part of a singleton object. I edited the code because in the last else case I am setting a flag of the program to true. I keep a global reference to the program object in my main. However, due to the callbacks the reference is somehow lost; the global reference keeps its flag at false, so the main loop is never reached. It is clearly a problem with the callbacks, since the flag is set when I call "this.run = true" outside the nested callbacks. Any advice on that?

A note on your edit first: a plain function () {} callback gets its own this, so this.run inside the nested callbacks no longer refers to your program object (use var self = this, or arrow functions, which don't rebind this). Using modern APIs like Promises and fetch, plus sugar like arrow functions, your code can become:
Program.prototype.init = function () {
    return Promise.all([
        fetch('../res/shaders/blinnPhong-shader.vsh').then(r => r.text()),
        fetch('../res/shaders/blinnPhong-shader.fsh').then(r => r.text()),
        fetch('../res/models/dragon.json').then(r => r.json()),
        new Promise(function (resolve, reject) {
            var i = new Image();
            i.onload = () => resolve(i);
            i.onerror = () => reject('Error loading image ' + i.src);
            i.src = '../res/textures/susanTexture.png';
        })
    ]).then(([vshText, fshText, modelObj, img]) => {
        this.run = true; // the arrow inherits this from init
        RunProgram(vshText, fshText, img, modelObj);
    });
};
You could spice things up even further by using related ES2017 features like async functions/await, or go all in on compatibility by forgoing arrow functions and using seamless polyfills for promises and fetch. As a rough sketch (same resource paths and RunProgram signature as above), an async/await version might read:
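Program.prototype.init = async function () {
    // sketch only: errors now surface as rejections of the returned promise
    const [vshText, fshText, modelObj, img] = await Promise.all([
        fetch('../res/shaders/blinnPhong-shader.vsh').then(r => r.text()),
        fetch('../res/shaders/blinnPhong-shader.fsh').then(r => r.text()),
        fetch('../res/models/dragon.json').then(r => r.json()),
        new Promise((resolve, reject) => {
            const i = new Image();
            i.onload = () => resolve(i);
            i.onerror = () => reject(new Error('Error loading image ' + i.src));
            i.src = '../res/textures/susanTexture.png';
        })
    ]);
    this.run = true;
    RunProgram(vshText, fshText, img, modelObj);
};
For some simple request caching, wrap fetch: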
const fetchCache = Object.create(null);
function fetchCached (url) {
    if (!fetchCache[url])
        fetchCache[url] = fetch.apply(null, arguments);
    // clone each time, so every caller can consume the response body independently
    return fetchCache[url].then(r => r.clone());
}
Note that you want your resources to be unique, so the above-mentioned caching still needs another layer of actual GPU resource caching on top of it: you don't want to create multiple shader programs with the same shader code, or array buffers with the same vertex data in them.
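As a hedged sketch of what such a layer could look like (createProgram here is a hypothetical helper that compiles and links the two shader sources):
const programCache = new Map();
function getProgram(gl, vshSource, fshSource) {
    const key = vshSource + '\u0000' + fshSource;
    // compile and link only once per distinct pair of sources
    if (!programCache.has(key))
        programCache.set(key, createProgram(gl, vshSource, fshSource));
    return programCache.get(key);
}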
Your actual core question of how to manage dependencies is a bit too broad / application-specific to be answered here on SO. In regard to managing the async nature of such an environment, I see two options:
Use placeholder resources and seamlessly replace them once the actual resources are loaded
Wait until everything is loaded before you insert the GameObject into the rendering pipeline
Both approaches have their pros and cons, but usually I'd recommend the first option.
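As a minimal sketch of the first option for textures (assuming a WebGL 1 context; a blue 1×1 pixel is drawn until the real image arrives and is then swapped in place):
function createTexture(gl, url) {
    const texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);
    // a single blue pixel acts as the placeholder, usable immediately
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA,
                  gl.UNSIGNED_BYTE, new Uint8Array([0, 0, 255, 255]));
    // parameters that are safe for non-power-of-two images in WebGL 1
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    const img = new Image();
    img.onload = () => {
        // seamlessly replace the placeholder with the loaded image
        gl.bindTexture(gl.TEXTURE_2D, texture);
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);
    };
    img.src = url;
    return texture;
}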

You can use promises for this. With the bluebird module, you can convert loadText into a promise-returning function with Promise.promisifyAll(the module loadText comes from), or, if it is your own module, you can make it return a new Promise(function(resolve, reject){}).
Using promises, you can make an array of all the promises you want to run and pass it to Promise.all([loadText('shader'), loadText("other shader"), ...]).
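For example, assuming loadText follows the usual Node-style (err, result) callback convention, wrapping it by hand could look like this sketch:
function loadTextAsync(url) {
    return new Promise(function (resolve, reject) {
        loadText(url, function (err, text) {
            if (err) reject(err);
            else resolve(text);
        });
    });
}
Promise.all([
    loadTextAsync('shader.vsh'),
    loadTextAsync('shader.fsh')
]).then(function (shaders) {
    var vsh = shaders[0], fsh = shaders[1];
    // compile the shaders here
});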
More information on promises

Related

Passing context implicitly across functions and javascript files in node.js

I have created a web server in node.js using express and passport. It authenticates using an OAuth 2.0 strategy (https://www.npmjs.com/package/passport-canvas). When authenticated, I want to make a call such as:
app.get("/api/courses/:courseId", function (req, res) {
    // pass req.user.accessToken implicitly like
    // through an IIFE
    createExcelToResponseStream(req.params.courseId, res).catch(err => {
        console.log(err);
        res.status(500).send("Oops!");
    });
});
My issue is that I would like all subsequent calls from createExcelToResponseStream to have access to my accessToken. I need to do a ton of API calls later in my business layer. I will call a method that looks like this:
const rq = require("request");
const request = url => {
    return new Promise((resolve, reject) => {
        rq.get(
            url,
            {
                auth: {
                    bearer: CANVASTOKEN // should be req.user.accessToken
                }
            },
            (error, response) => {
                if (error) {
                    return reject(new Error(error)); // reject, a throw here would go uncaught
                }
                resolve(response);
            }
        );
    });
};
If I try to create global access to the access token, I will risk race conditions (I think), i.e. people getting responses in the context of another person's access token.
If I pass the context around as a parameter, I have to refactor a lot of my code base, and a lot of business-layer functions would have to know about something they don't need to know about.
Is there any way in javascript to pass the context across functions, modules and files, through the entire call stack (by scope, apply, bind, this...)? A bit like in a multithreaded environment where you have one user context per thread.
The only thing you could do would be
.bind(req);
But that has to be chained into every inner function call:
somefunc.call(this);
Or you use inline arrow functions only:
(function () {
    const inner = () => alert(this); // the arrow inherits the bound this
    inner();
}).bind("Hi!")();
Alternatively, you could apply all functions onto an object and then create a new instance. Note that the methods must be regular functions rather than arrows, so that this refers to the instance:
var reqAuthFunctions = {
    example: function () { alert(this.accessToken); },
    accessToken: null
};
var instance = Object.assign(Object.create(reqAuthFunctions), { accessToken: 1234 });
instance.example(); // alerts 1234
You could use a Promise to avoid race conditions.
Let's have this module:
// ContextStorage.js
let gotContext;
let failedGettingContext;
const getContext = new Promise((resolve, reject) => {
    gotContext = resolve;
    failedGettingContext = reject;
});
export { getContext, gotContext, failedGettingContext };
And this initialization:
// init.js
import {gotContext} from './ContextStorage';
fetch(context).then(contextIGot => gotContext(contextIGot));
And this thing that needs the context:
// contextNeeded.js
import { getContext } from './ContextStorage';
getContext.then(context => {
    // Do stuff with context
});
This is obviously not very usable code, since it all executes on load, but I hope it gives you a framework of how to think about this issue with portals... I mean Promises...
When you call the imported gotContext, you actually resolve the promise exported as getContext. Hence, no matter the order of operations, you either resolve the promise after the context has been requested, setting the dependent operation into motion, or your singleton already holds a resolved promise and the dependent operation continues right away.
On another note, you could easily fetch the context in the 'body' of the promise in the 'ContextStorage' singleton. However, that's not very modular, now is it. A better approach would be to inject the initializing function into the singleton in order to invert control, though I feel that would obfuscate the code a bit and hinder the purpose of the demonstration.
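As a rough sketch of that inversion of control (the initialize export is hypothetical; otherwise it is the same ContextStorage idea as above):
// ContextStorage.js (alternative)
let gotContext;
let failedGettingContext;
const getContext = new Promise((resolve, reject) => {
    gotContext = resolve;
    failedGettingContext = reject;
});
// the fetching logic is injected, so this module stays free of it
function initialize(fetchContextFn) {
    fetchContextFn().then(gotContext, failedGettingContext);
}
export { getContext, initialize };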

Module returning an asynchronously initialized object

I'm having this "design" problem that's driving me crazy.
My goal is having a decoupled RabbitMQ client. It has to be able to init its connection and "return" a created channel so my publishing module can use it.
The code looks like this (I know it is not the best code, but I expect it serves for this explanation).
var createConnection = (function () {
    var channel;
    var initiated = false;
    var connect = function () {
        // amqp connect
        // error handling
        createChannel();
    };
    var createChannel = function () {
        // amqpConn.createConfirmChannel...
    };
    // pseudo
    return {
        getChannel: function () {
            if (!initiated)
                connect();
            return channel;
        }
    };
})();
module.exports = createConnection;
Now, important things:
1- I know this ain't gonna work and I know why; it's a simplification.
2- I'm aware that I can accomplish my goals by using async or promises.
3- Maybe decoupling a rabbit client makes no sense, but it's for understanding purposes.
That said, my questions:
1- Is there any way I can accomplish this without using other modules?
2- If so, can it be accomplished in a fancy and stylish way?
3- Is there any fancy solution that allows 3rd party code to execute a simple "publish(exchange, channel, msg)" while being sure that the connection has been established?
I feel able to work with JS, but sometimes you just need to do things one way only to know that you can, and this is giving me some headache.
Truly thanks, and I hope the question was understood :)
One way I've found to handle this is to wrap your asynchronous object in an object that is aware of the asynchronous state of your object and presents the same API regardless of whether or not the asynchronous object has finished initializing.
For example, you could wrap your channel object in another object that presents the same methods but internally checks whether the actual channel object is initialized. If it is, use it as normal. If it isn't, wait for it to be initialized and then use it as normal. The user of the wrapper object doesn't need to know whether the channel is actually initialized. The main drawback is that every wrapper method that needs to access the channel must be asynchronous, even if the method it's accessing on the channel is synchronous.
Example:
function initializeChannel() {
    return new Promise((resolve, reject) => {
        // create, initialize, and resolve channel
    });
}
module.exports = { // wrapper
    channelPromise: initializeChannel(),
    foo(a) {
        return this.channelPromise.then((channel) => channel.foo(a));
    }
};
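Usage then looks the same whether or not the channel has finished initializing (the module path here is a hypothetical example):
// publisher.js
const channel = require('./channelWrapper');
channel.foo('some message').then(function (result) {
    console.log('published:', result);
});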

How to render result of an array of asynchronous forEach 'find' functions?

This is my simple task: find images by an id array and render the image values into a template.
router.get('/gallery', function (req, res) {
    var images = [];
    imagesIds.forEach(function (eachImageId) {
        Images.findById(eachImageId).exec(function (findImageErr, foundImage) {
            if (foundImage) {
                images.push(foundImage);
            }
        });
    });
    res.render('gallery', {
        images: images
    });
});
The problem is that the res.render call does not wait for the findById calls to finish, so the images array is always still empty ([]) when the template is rendered.
I tried to use a generator but did not know how to achieve this.
If someone can explain it without a library (like q), that would be better, because I want to understand deeply how generators deal with this problem.
Generators allow you to write synchronous-looking functions, because they can stop their execution and resume it later.
I guess you have already read some articles like this and know how to define generator functions and use them.
Your asynchronous code can be represented as a simple iterator with the magic yield keyword. The generator function will run and stop there until you resume it using the next() method.
function* loadImages(imagesIds) {
    var images = [], image;
    for (const imageId of imagesIds) {
        image = yield loadSingleImage(imageId);
        if (image) // skip ids that could not be found
            images.push(image);
    }
    return images;
}
Because there is a loop, the function will advance through it with each next() call until all imagesIds have been walked. Finally, the return statement executes and you get images.
Now we need to describe the image loading. Our generator function needs to know when the current image has loaded so it can start loading the next one. All modern JavaScript runtimes (node.js and the latest browsers) have native Promise support, so we will define a function that returns a promise which is eventually resolved with the image if it has been found (and with null if not, so missing images are simply skipped):
function loadSingleImage(imageId) {
    return new Promise((resolve) => {
        Images.findById(imageId).exec((findImageErr, foundImage) => {
            // resolve with null instead of rejecting, so a missing
            // image does not abort the whole loading sequence
            resolve(foundImage || null);
        });
    });
}
Well, we have two functions: one that loads a single image and one that puts them together. Now we need some dispatcher to pass control from one function to the other. Since you don't want to use libraries, we have to implement a small helper ourselves.
It is a smaller version of the spawn function; it can be simpler and easier to understand since we don't need to handle errors, but just ignore missing images.
function spawn(generator) {
    function continuer(value) {
        var result = generator.next(value);
        if (!result.done) {
            return Promise.resolve(result.value).then(continuer);
        } else {
            return result.value;
        }
    }
    return continuer();
}
This function recursively advances our generator within the continuer function while result.done is not true. Once it is, the generation has successfully finished and we can return our value.
And finally, putting it all together, you get the following code for the gallery loading.
router.get('/gallery', function (req, res) {
    var imageGenerator = loadImages(imagesIds);
    spawn(imageGenerator).then(function (images) {
        res.render('gallery', {
            images: images
        });
    });
});
Now you have a little bit of pseudo-synchronous code in the loadImages function, and I hope it helps you understand how generators work.
Also note that all images will be loaded sequentially, because we wait for the asynchronous result of each loadSingleImage call and put it in the array before we can go on to the next imageId. This can cause performance issues if you use this approach in production.
Related links:
Mozilla Hacks – ES6 In Depth: Generators
2ality – ES6 generators in depth
Jake Archibald – ES7 async functions
It can be done without a 3rd party library, as you asked, but it would be cumbersome...
Anyway, the bottom line is to do the work inside the callback function "function(findImageErr, foundImage){..}".
1) Without a 3rd party library, you need to render only after all images have been accounted for:
var images = [];
var results = 0;
imagesIds.forEach(function (eachImageId) {
    Images.findById(eachImageId).exec(function (findImageErr, foundImage) {
        results++;
        if (foundImage)
            images.push(foundImage);
        if (results == imagesIds.length)
            res.render('gallery', { images: images });
    });
});
2) I strongly recommend a 3rd party library which would do the same.
I'm currently using async, but I might migrate to promises in the future.
async.map(
    imageIds,
    function (eachImageId, next) {
        Images.findById(eachImageId).exec(function (findImageErr, foundImage) {
            next(null, foundImage);
            // don't report errors to async, because it will abort
        });
    },
    function (err, images) {
        images = _.compact(images); // remove null images, i'm using lodash
        res.render('gallery', { images: images });
    }
);
Edited: following your readability remark, note that if you create a small wrapper function for findById(...).exec(...) that ignores errors and just reports them as null (call it findIgnoreError(imageId, callback)), then you could write:
async.map(
    imageIds,
    findIgnoreError,
    function (err, images) {
        images = _.compact(images); // remove null images, i'm using lodash
        res.render('gallery', { images: images });
    }
);
In other words, it becomes a bit more readable if the reader starts to think in functions... It says: "go over those imageIds in parallel, run findIgnoreError on each imageId", and the final section says what to do with the accumulated results.
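For completeness, a minimal sketch of such a wrapper (findIgnoreError is the hypothetical name suggested above):
function findIgnoreError(imageId, callback) {
    Images.findById(imageId).exec(function (findImageErr, foundImage) {
        // swallow errors and missing images, reporting them as null
        callback(null, foundImage || null);
    });
}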
Instead of querying mongo (or any DB) N times, I would just fire a single query using $in:
Images.find({ _id: { $in: imagesIds } }, function (err, images) {
    if (err) return next(err);
    res.render('gallery', { images: images });
});
This would also reduce the number of I/Os, plus you won't have to write additional code to coordinate res.render.

Angular/Promises - Multiple Controllers waiting for the same $promise? [duplicate]

I want to implement dynamic loading of a static resource in AngularJS using Promises. The problem: I have a couple of components on the page which might (or might not, depending on which are displayed, hence dynamic) need to get a static resource from the server. Once loaded, it can be cached for the whole application lifetime.
I have implemented this mechanism, but I'm new to Angular and Promises, and I want to make sure this is the right solution/approach.
var data = null;
var deferredLoadData = null;
function loadDataPromise() {
    if (deferredLoadData !== null)
        return deferredLoadData.promise;
    deferredLoadData = $q.defer();
    $http.get("data.json").then(function (res) {
        data = res.data;
        return deferredLoadData.resolve();
    }, function (res) {
        return deferredLoadData.reject();
    });
    return deferredLoadData.promise;
}
So, only one request is made, and all subsequent calls to loadDataPromise() get back the first promise. It seems to work both for a request that is in progress and for one that finished some time ago.
But is it a good solution to cache Promises?
Is this the right approach?
Yes. The use of memoisation on functions that return promises is a common technique to avoid the repeated execution of asynchronous (and usually expensive) tasks. The promise makes the caching easy because one does not need to distinguish between ongoing and finished operations; they're both represented as (the same) promise for the result value.
Is this the right solution?
No. That global data variable and the resolution with undefined is not how promises are intended to work. Instead, fulfill the promise with the result data! It also makes coding a lot easier:
var dataPromise = null;
function getData() {
    if (dataPromise == null)
        dataPromise = $http.get("data.json").then(function (res) {
            return res.data;
        });
    return dataPromise;
}
Then, instead of loadDataPromise().then(function() { /* use global */ data }) it is simply getData().then(function(data) { … }).
To further improve the pattern, you might want to hide dataPromise in a closure scope, and notice that you will need a lookup for different promises when getData takes a parameter (like the url).
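A rough sketch of that parameterised variant (my assumption, not from the answer above: per-url memoisation inside an Angular service):
var promiseCache = {};
function getData(url) {
    // one cached promise per distinct url
    if (!promiseCache[url])
        promiseCache[url] = $http.get(url).then(function (res) {
            return res.data;
        });
    return promiseCache[url];
}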
For this task I created a service called defer-cache-service which removes all this boilerplate code. It's written in TypeScript, but you can grab the compiled js file. Github source code.
Example:
function loadCached() {
    return deferCacheService.getDeferred('cache.key1', function () {
        return $http.get("data.json");
    });
}
and consume it:
loadCached().then(function(data) {
//...
});
One important thing to notice: if, say, two or more callers call the same loadDataPromise at the same time, you must add this check
if (defer && defer.promise.$$state.status === 0) {
    return defer.promise;
}
otherwise you will be making duplicate calls to the backend.
This design pattern will cache whatever is returned the first time it runs, and return the cached value every time it's called again.
const asyncTask = (cache => {
    return function () {
        // when called the first time, put the promise in the "cache" variable
        if (!cache) {
            cache = new Promise(function (resolve, reject) {
                setTimeout(() => {
                    resolve('foo');
                }, 2000);
            });
        }
        return cache;
    };
})();
asyncTask().then(console.log);
asyncTask().then(console.log);
Explanation:
Simply wrap your function in another self-invoking function which returns a function (your original async function). The purpose of the wrapper function is to provide an encapsulating scope for the local variable cache, so that the variable is only accessible within the returned function and holds the exact same value every time asyncTask is called (other than the very first time).
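One caveat worth noting (my addition, not part of the pattern above): if the cached promise rejects, it stays cached, so every later call fails too. A small tweak (a sketch) clears the cache on failure so the next call can retry:
const asyncTask = (cache => {
    return function () {
        if (!cache) {
            cache = new Promise(function (resolve, reject) {
                setTimeout(() => resolve('foo'), 2000);
            }).catch(err => {
                cache = null; // forget the failure so a later call retries
                throw err;
            });
        }
        return cache;
    };
})();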
