Passing context implicitly across functions and JavaScript files in Node.js - javascript

I have created a web server in Node.js using express and passport. It authenticates using an OAuth 2.0 strategy (https://www.npmjs.com/package/passport-canvas). When authenticated, I want to make a call such as:
app.get("/api/courses/:courseId", function(req, res) {
  // pass req.user.accessToken implicitly, e.g.
  // through an IIFE
  createExcelToResponseStream(req.params.courseId, res).catch(err => {
    console.log(err);
    res.status(500).send("Oops!");
  });
});
My issue is that I would like, in all subsequent calls from createExcelToResponseStream, to have access to my accessToken. I need to do a ton of API calls later in my business layer. I will call a method that looks like this:
const rq = require("request");

const request = url => {
  return new Promise((resolve, reject) => {
    rq.get(
      url,
      {
        auth: {
          bearer: CANVASTOKEN // should be req.user.accessToken
        }
      },
      (error, response) => {
        if (error) {
          return reject(error); // throwing inside the callback would not reject the promise
        }
        resolve(response);
      }
    );
  });
};
If I try to create global access to the access token, I risk race conditions (I think), i.e. that people get responses in the context of another person's access token.
If I pass the context as a variable, I have to refactor a lot of my code base, and a lot of business layer functions have to know about something they don't need to know about.
Is there any way in JavaScript to pass the context across functions, modules and files, through the entire call stack (by scope, apply, bind, this...)? A bit like what you could do in a multithreaded environment where you have one user context per thread.

The only thing you could do would be
.bind(req);
But that has to be chained into every inner function call:
somefunc.call(this);
Or you use inline arrow functions only:
(function () {
  const inner = () => alert(this);
  inner();
}).bind("Hi!")();
Alternatively, you could apply all functions onto an object, and then create a new instance (note: a regular method is needed here, since an arrow function would not see the instance's this):
var reqAuthFunctions = {
  example() { alert(this.accessToken); },
  accessToken: null
};
var instance = Object.assign(Object.create(reqAuthFunctions), {accessToken: 1234});
instance.example();

You could use a Promise to avoid race conditions.
Let's have this module:
// ContextStorage.js
let gotContext;
let failedGettingContext;
const getContext = new Promise((resolve, reject) => {
  gotContext = resolve;
  failedGettingContext = reject;
});
export {getContext, gotContext, failedGettingContext};
And this initialization:
// init.js
import {gotContext} from './ContextStorage';
fetch(context).then(contextIGot => gotContext(contextIGot));
And this thing that needs the context:
// contextNeeded.js
import {getContext} from './ContextStorage';
getContext.then(context => {
  // Do stuff with context
});
This is obviously not very usable code, since it all executes on load, but I hope it gives you a framework of how to think about this issue with portals... I mean Promises...
When you call the imported 'gotContext', you actually resolve the promise exposed as 'getContext'. Hence, no matter the order of operations, you either resolve the promise after the context has been requested, setting the dependent operation into motion, or your singleton already holds a resolved promise, and the dependent operation continues right away.
On another note, you could easily fetch the context in the 'body' of the promise in the 'ContextStorage' singleton. However, that's not very modular, now is it? A better approach would be to inject the initializing function into the singleton in order to invert control, but I feel that would obfuscate the code a bit and hinder the purpose of the demonstration.
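For what it's worth, a minimal sketch of that inversion-of-control variant might look like this (initContext and the shape of the injected function are assumptions for illustration):
// ContextStorage.js (IoC variant, sketch)
let cached = null;

// The caller injects whatever function actually fetches the context.
export function initContext(fetchContextFn) {
  if (cached === null) {
    cached = fetchContextFn(); // must return a promise for the context
  }
  return cached;
}

export function getContext() {
  if (cached === null) {
    throw new Error("Context not initialized - call initContext first");
  }
  return cached; // still a promise; consumers .then() on it as before
}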

Related

Exporting async dependent, non-async const from JS module

I am building off of a forked React app. There is a dependency module that relies on a couple of static json files to map and export a couple of consts that are consumed in the app's React components. Instead of relying on these static json files, we're using an API as our source. Right now my build process is to fetch the json and transform it to generate our custom json files, which then gets consumed during the build process.
Instead, I want the process to be performed client side.
export const GALAXY = async () => {
  const result = await fetch(JSON, { // JSON here is presumably a URL constant defined elsewhere
    headers: {
      'Content-Type': 'application/json',
      'Accept': 'application/json'
    }
  });
  if (result.ok) {
    console.log('GALAXY LOADED');
    return result.json();
  } else {
    console.log('GALAXY FAILED');
    return undefined;
  }
};
It seems that anything that relies on an async/await function will only return a Promise. From the docs:
Return resolved promise from async function
Async functions always return a promise. If the return value of an async function is not explicitly a promise, it will be implicitly wrapped in a promise.
Most of the examples I've found show some variation of the .then pattern:
const galaxy = Promise.resolve(GALAXY()).then((value) => {
  console.log(value);
});
but if I try
const galaxy = Promise.resolve(GALAXY()).then((value) => {
  return value;
});
console.log(galaxy);
I get Promise{pending}
I'm hoping to avoid rewriting a bunch of non-async code. We've got one TS file that gets imported by a number of React components:
import TokenMints from './token-mints.json';
export const TOKEN_MINTS: Array<{
address: PublicKey;
name: string;
}> = TokenMints.map((mint) => {
return {
address: new PublicKey(mint.address),
name: mint.name,
};
});
So how can I fetch from the JSON API and pass the data to this non-async const so that it can be used in the rest of the app?
If you have something like this:
const example = async () => {
  return true;
};
Then anywhere you call it or use it, the result will be a promise; that's how async works.
example().then(() => {
  // any code here will execute after the promise has resolved
});
// any code here can not directly use the promise data, as it's not resolved yet.
The code inside then and the code after it don't run in parallel in a literal sense: the code after the .then() call continues executing while the promise is pending, and only once the promise finally resolves is the callback inside then executed.
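A tiny illustration of that ordering, reusing the example function from above:
example().then(() => {
  console.log('B: runs later, after the promise resolves');
});
console.log('A: runs first, synchronously');
// Output order: A, then B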
Alternatively you can use await (inside an async function, or at the top level of a module):
const data = await example(); // this forces the surrounding code to wait for the promise to resolve
// code here will not execute until that promise has resolved
You have to use async/await correctly.
Several possibilities to consider:
1. Fetch the JSON using a synchronous XMLHttpRequest operation. Use of this kind of synchronous request on the main thread is, however, frowned upon (note).
2. Convert the JSON to JavaScript source on the server (if required; in many/most cases JSON is valid JavaScript), assign it to a global variable, and include it in the head section of the client page using <script> tags.
3. A variation of (2): convert the JSON to a module file on the server which exports the JSON data converted to JavaScript source, which client code includes as a module and from which it imports the data as a JavaScript object (a sketch follows after these options).
4. If the client code uses the data after waiting for an event such as DOMContentLoaded (then calling a main function rather than writing the app in file-level code), you could promisify the DOMContentLoaded event arrival and pass it through Promise.all along with the galaxy promise before calling main, along the lines of:
Promise.all([
  new Promise(resolve =>
    document.addEventListener("DOMContentLoaded", resolve)
  ),
  galaxy
]).then(data => main(data[1])); // passing the galaxy value
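For option (3), the server-generated module could be as simple as the following (the file name and contents are illustrative assumptions):
// galaxy-data.js - generated on the server from the JSON payload
export default [
  { address: "ExampleBase58Address", name: "Example Mint" }
];
Client code then imports it like any other module and has the data synchronously:
import galaxyData from './galaxy-data.js';
console.log(galaxyData[0].name);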
About async
As already noticed, async functions return a promise for the value syntactically returned in the function body, because async functions return synchronously when called, without waiting for any asynchronous operations performed in the function body to complete. Given that JavaScript is single-threaded, the caller cannot be put on hold until all operations have been carried out; it must return to the event loop so other pieces of JavaScript code can execute.
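Tying this back to the question: since nothing can make the fetched data available synchronously at import time, one pragmatic bridge is to export a promise for the mapped array instead of the array itself, and have consumers await it once. A sketch reusing GALAXY and PublicKey from the question (both import paths are assumptions):
// token-mints.ts (sketch)
import { PublicKey } from '@solana/web3.js'; // or wherever the original file gets PublicKey
import { GALAXY } from './galaxy'; // hypothetical path to the fetcher shown in the question

export const TOKEN_MINTS_PROMISE = GALAXY().then((mints) =>
  (mints || []).map((mint: { address: string; name: string }) => ({
    address: new PublicKey(mint.address),
    name: mint.name,
  }))
);
Components then do const tokenMints = await TOKEN_MINTS_PROMISE; (in async code) instead of importing the array directly.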

node jasmine - how to unit test a function that involves a redis call?

I just started playing around with Jasmine and I'm still struggling with the spyOn/mocking things, e.g., I have a function
module.exports = (() => {
  // ....
  function getUserInfo(id) {
    return new Promise((resolve, reject) => {
      redis.getAsync(id).then(result => {
        resolve(result);
      });
    });
  }
  return { getUserInfo: getUserInfo };
})();
Then I start writing the Jasmine spec
describe('Test user helper', () => {
  let userInfo;
  beforeEach(done => {
    userHelper.getUserInfo('userid123')
      .then(info => {
        userInfo = info;
        done();
      });
  });
  it('return user info if user is found', () => {
    expect(userInfo).toEqual('info of userid 123');
  });
});
It runs well, but my question is how can I mock the redis.getAsync call, so it can become a real isolated unit test?
Thanks.
Good question. You can mock out the redis dependency, but only if you rewrite your code slightly to be more testable.
Here, that means making redis a parameter to the factory that returns the object containing getUserInfo.
Of course, this changes the API: callers now need to call the export to get the object. To fix this, we can create a wrapper module that calls the function with the standard redis object and returns the result. Then we move the actual factory into an inner module, which still allows it to be tested.
Here is what that might look like.
user-helper/factory.js
module.exports = redis => {
  // ....
  function getUserInfo(id) {
    return redis.getAsync(id); // note: simplified, as new Promise was not needed
  }
  return {getUserInfo};
};
user-helper/index.js
// this is the wrapper that preserves the existing API;
// `redis` here is the app's real promisified client, required/created as before
module.exports = require('./factory')(redis);
And now for the test
const userHelperFactory = require('./user-helper/factory');
function createMockRedis() {
  const users = [
    {userId: 'userid123', info: 'info of userid 123'},
    // etc.
  ];
  return {
    getAsync: function (id) {
      // Note: I do not know offhand what redis returns, or if it throws,
      // when there is no matching record - adjust this to match.
      const user = users.find(user => user.userId === id);
      return Promise.resolve(user && user.info);
    }
  };
}
describe('Test user helper', () => {
  const mockRedis = createMockRedis();
  const userHelper = userHelperFactory(mockRedis);
  let userInfo;

  beforeEach(async () => {
    userInfo = await userHelper.getUserInfo('userid123');
  });

  it('must return user info when a matching user exists', () => {
    expect(userInfo).toEqual('info of userid 123');
  });
});
NOTE: As discussed in the comments, this was just my incidental approach to the situation at hand. There are plenty of other setups and conventions you can use, but the primary idea here was based on the existing export of the result of an IIFE, which is a solid pattern, and I leveraged the Node.js /index convention to preserve the existing API. You could also use one file and export via both module.exports = factory(redis) and module.exports.factory = factory (a sketch of that single-file variant follows below), though that would, I believe, be less idiomatic in Node.js. The broader point is that being able to mock for tests, and testability in general, is just about parameterization.
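For illustration, the single-file variant might look like this (the './redis-client' path is hypothetical, standing in for however the app obtains its promisified client):
// user-helper.js - single-file variant (sketch)
const redisClient = require('./redis-client'); // hypothetical: the real promisified client

function factory(redis) {
  function getUserInfo(id) {
    return redis.getAsync(id);
  }
  return {getUserInfo};
}

module.exports = factory(redisClient); // default instance for normal callers
module.exports.factory = factory;      // named export so tests can inject a mock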
Parameterization is wonderfully powerful, and its simplicity is why developers working in functional languages sometimes laugh at OOP programmers, such as yours truly, and our clandestine incantations like "Oh glorious Dependency Injection Container, bequeath unto me an instanceof X" :)
It is not that OOP or DI get it wrong; it is that testability, DI, IoC, etc. are just about parameterization.
Interestingly, if we were loading redis as a module, and if we were using a configurable module loader, such as SystemJS, we could mock it simply by using loader configuration at the test level. Even Webpack lets you do this to some extent, but for Node.js you would need to monkey-patch the require function or create a bunch of fake packages, neither of which is a good option.
To the OP's specific response
Thanks! That's a good idea, but practically it seems quite strange when I have tons of files to test, in which case I would need to create a factory and an index.js for each of them.
To reduce the burden, you would need to restructure your API surface and simply export factories that consuming code must call, rather than the result of applying those factories; but there are tradeoffs, and default instances are helpful to consumers.

Module returning asynchronously initialized object

I'm having this "design" problem that's driving me crazy.
My goal is having a decoupled RabbitMQ client. It has to be able to init its connection and "return" a created channel so my publishing module can use it.
Code looks like this (I know it's not the best code, but I expect it serves for this explanation).
var createConnection = (function() {
  var initiated = false;
  var channel;
  var connect = function() {
    // amqp connect
    // error handling
    initiated = true;
    createChannel();
  };
  var createChannel = function() {
    // amqpConn.createConfirmChannel...
  };
  // pseudo
  return {
    getChannel: function() {
      if (!initiated)
        connect();
      return channel;
    }
  };
})();
module.exports = createConnection;
Now, important things:
1- I know this isn't going to work and I know why; it's a simplification.
2- I'm aware that I can accomplish my goals by using async or promises.
3- Maybe it makes no sense to decouple a RabbitMQ client, but it's for understanding purposes.
That said, my questions:
1- Is there any way I can accomplish this without using other modules?
2- If so, can it be accomplished in a fancy and stylish way?
3- Is there any fancy solution that allows third-party code to execute a simple publish(exchange, channel, msg), being sure that the connection has been established?
I feel able to work with JS, but sometimes you just need to do things one way only to know that you can, and this is giving me some headache.
Truly thanks, and I hope the question was understood :)
One way I've found to handle this is to wrap your asynchronous object in an object that is aware of the asynchronous state of your object and presents the same API regardless of whether or not the asynchronous object has finished initializing.
For example, you could wrap your channel object in another object that presents the same methods but internally checks if the actual channel object is initialized. If it is, use it as normal. If it isn't, wait for it to be initialized and then use it as normal. The user of the wrapper object wouldn't need to know if the channel is actually initialized. The main drawback is that every wrapper method that needs to access the channel must be asynchronous, even if the underlying channel method is synchronous.
Example:
function initializeChannel() {
  return new Promise((resolve, reject) => {
    // create, initialize, and resolve channel
  });
}

module.exports = { // wrapper
  channelPromise: initializeChannel(),
  foo(a) {
    return this.channelPromise.then((channel) => channel.foo(a));
  }
};
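Consuming code can then call the wrapper without caring whether the connection exists yet; for example (the require path and foo are the placeholders from the sketch above):
const channelWrapper = require('./channel-wrapper');

channelWrapper.foo('some message').then(result => {
  // runs only after the channel was initialized and foo completed
  console.log(result);
});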

Angular/Promises - Multiple Controllers waiting for the same $promise? [duplicate]

I want to implement a dynamic loading of a static resource in AngularJS using Promises. The problem: I have couple components on page which might (or not, depends which are displayed, thus dynamic) need to get a static resource from the server. Once loaded, it can be cached for the whole application life.
I have implemented this mechanism, but I'm new to Angular and Promises, and I want to make sure this is the right solution/approach.
var data = null;
var deferredLoadData = null;

function loadDataPromise() {
  if (deferredLoadData !== null)
    return deferredLoadData.promise;
  deferredLoadData = $q.defer();
  $http.get("data.json").then(function (res) {
    data = res.data;
    return deferredLoadData.resolve();
  }, function (res) {
    return deferredLoadData.reject();
  });
  return deferredLoadData.promise;
}
So only one request is made, and all subsequent calls to loadDataPromise() get back the first promise. It seems to work both for a request that is in progress and for one that finished some time ago.
But is it a good solution to cache Promises?
Is this the right approach?
Yes. The use of memoisation on functions that return promises is a common technique to avoid the repeated execution of asynchronous (and usually expensive) tasks. The promise makes the caching easy because one does not need to distinguish between ongoing and finished operations; they're both represented as (the same) promise for the result value.
Is this the right solution?
No. That global data variable and the resolution with undefined are not how promises are intended to work. Instead, fulfill the promise with the result data! It also makes coding a lot easier:
var dataPromise = null;

function getData() {
  if (dataPromise == null)
    dataPromise = $http.get("data.json").then(function (res) {
      return res.data;
    });
  return dataPromise;
}
Then, instead of loadDataPromise().then(function() { /* use global */ data }) it is simply getData().then(function(data) { … }).
To further improve the pattern, you might want to hide dataPromise in a closure scope, and notice that you will need a lookup for different promises when getData takes a parameter (like the url).
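A minimal sketch of that keyed variant, assuming the url is the only parameter:
var dataPromises = {}; // url -> promise, ideally hidden in the service's closure scope

function getData(url) {
  if (!(url in dataPromises)) {
    dataPromises[url] = $http.get(url).then(function (res) {
      return res.data;
    });
  }
  return dataPromises[url];
}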
For this task I created a service called defer-cache-service which removes all this boilerplate code. It's written in TypeScript, but you can grab the compiled js file. Github source code.
Example:
function loadCached() {
  return deferCacheService.getDeferred('cacke.key1', function () {
    return $http.get("data.json");
  });
}
and consume
loadCached().then(function(data) {
//...
});
One important thing to notice: if, let's say, two or more parts call the same loadDataPromise at the same time, you must add this check
if (defer && defer.promise.$$state.status === 0) {
  return defer.promise;
}
otherwise you will be making duplicate calls to the backend.
This design pattern will cache whatever is returned the first time it runs, and return the cached thing every time it's called again.
const asyncTask = (cache => {
  return function () {
    // when called the first time, put the promise in the "cache" variable
    if (!cache) {
      cache = new Promise(function (resolve, reject) {
        setTimeout(() => {
          resolve('foo');
        }, 2000);
      });
    }
    return cache;
  };
})();
asyncTask().then(console.log);
asyncTask().then(console.log);
Explanation:
Simply wrap your function with another self-invoking function which returns a function (your original async function). The purpose of the wrapper function is to provide an encapsulating scope for the local variable cache, so that the variable is only accessible within the returned function and holds the exact same value every time asyncTask is called (other than the very first time).

