I want to get an image from my local hard drive. My function returns undefined.
function getImageFromLocal() {
    var imageurl = 0;
    return new WinJS.Promise(function () {
        picturesLib.getItemAsync("APP Folder X").then(function (appfolder) {
            appfolder.getItemAsync("Subfolder X").then(function (file) {
                file.getItemAsync("Image.jpg").done(function (image) {
                    imageurl = URL.createObjectURL(image);
                });
            });
        });
    });
    return imageurl;
}
When working with asynchronous API calls like StorageFolder.getItemAsync, it doesn't work to assign the ultimate result to a local variable and attempt to return that.
Moreover, you have two return statements in your code. The first returns a promise that merely wraps a chain of promises ending in .done, and that is what's giving you undefined. The second return statement is never reached, and even if it were, it would return the initial value of 0, because the async code won't have executed by that point.
I'll point out further that using new WinJS.Promise isn't at all what you want here, but I won't belabor the details. If you really want the full story on promises and how they work, refer to my free ebook, Programming Windows Store Apps with HTML, CSS, and JavaScript, Second Edition, both in Chapter 3 (basics) and Appendix A (full story).
Back to your code, I'm not entirely sure what you're trying to do. The "Local" in your function name suggests you're trying to retrieve an image from your app data, but the variable named picturesLib suggests you're reading from the Pictures library. Let me address each of those separately.
First, if you're working with local appdata, you can bypass the entire process you've shown here by simply using a URI in the form ms-appdata:///local/folder1/folder2/image.jpg. You can assign such a URI directly to an img.src attribute and it'll just work. This will work synchronously, and also works for ms-appx:/// URIs that refer to in-package resources.
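For instance (a minimal sketch; the element id and folder names are placeholders, not from the question):
// Point an img element straight at local app data; no StorageFile or promise chain needed.
var imgElement = document.getElementById("someImage"); // hypothetical element id
imgElement.src = "ms-appdata:///local/folder1/folder2/image.jpg";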
Second, when working with images that you cannot reference so conveniently with a URI, you need to get the StorageFile and pass that to URL.createObjectURL, the result of which you can assign to an img.src as well, as I think you know. To save some trouble, though, you can use a full relative path with StorageFolder.getFileAsync. This way you avoid having to write a promise chain to navigate a folder structure and reduce all that to a single call. That is, if picturesLib is the StorageFolder from Windows.Storage.KnownFolders.picturesLibrary, then you can just use:
picturesLib.getFileAsync("App folder X\\subfolder x\\image.jpg");
Now for your original function, whose purpose is to return a URI for that image: it must return a promise itself, and the caller must attach a .done to that promise to obtain the result. You do this properly by returning the value you want inside the completed handler for getFileAsync, but be sure to use .then, which returns a promise for that return value (.done would return undefined).
function getPicturesLibUriAsync(relativePath) {
    return picturesLib.getFileAsync(relativePath).then(function (file) {
        return URL.createObjectURL(file);
    });
}
Then you call the method like this, using .done to get the async result:
getPicturesLibUriAsync("folder1\\folder2\\image1.jpg").done(function (uri) {
    someImageElement.src = uri;
});
Again, refer to my book for full details on how promises work, especially using the return value of .then.
Related
I have written a library function pRead(Path), which returns a JavaScript Promise to read a file on the local computer under an Apache server, using Ajax. I won't include the code for this, as it uses standard technology that is well-known to anyone who can give a good answer to this question.
I want to write a second library function, pReadObj(Path), which will return a Promise to read a JSON file and provide its object value to asynchronous code. It should work like this:
pReadObj("test.json").then(then2).catch(pErr);
function then2(obj)
{
    alert(JSON.stringify(obj)); // Shows the JSON obj
} // then2
Here is the code I wrote:
var globalPreviousResolve;

function pReadObj(Path) // Promise to read JSON from file
{
    return new Promise(function (resolve, reject)
    {
        globalPreviousResolve = resolve;
        pRead(Path).then(pReadObj2).catch(pErr);
    });
} // pReadObj

function pReadObj2(JSONStr)
{
    globalPreviousResolve(JSON.parse(JSONStr));
} // pReadObj2

function pTestDB() // Called from button
{
    pReadObj("test.json").then(then2).catch(pErr);
} // pTestDB
This works, but has a problem: using a global variable to hold the resolve callback is not only ugly, but it will clearly malfunction if two calls to pReadObj happen within a short period of time and the disk read takes a longer time than that.
The resolve function needs to be stored inside the Promise in some way, I'm thinking.
There's no need to explicitly create a Promise; just return the one created by .then:
function pReadObj(Path) {
    return pRead(Path).then(JSON.parse);
}
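With that version each call gets its own promise chain, so overlapping reads can no longer clobber a shared resolve. A usage sketch, reusing then2 and pErr from the question ("other.json" is just an illustrative second file):
function pTestDB() // Called from button
{
    pReadObj("test.json").then(then2).catch(pErr);
    pReadObj("other.json").then(then2).catch(pErr); // a second, concurrent read is now safe
} // pTestDB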
I am using the Ionic 3 platform to upload a video file to the Vimeo API. I need to get the binary data for the video file, and I am uploading it using an input type file element.
The code I have written is as follows:
videoUploadBody(videoObj) {
    const r = new FileReader();
    r.onload = function () {
        console.log("Binary data", r.result);
        return r.result;
    };
    r.readAsArrayBuffer(videoObj);
}
This is the function that I need to call, and it should return the video file in binary form. The function from which I am calling the above function is as follows:
uploadVideo(videoFile, createdVideo): Observable<any> {
    const bodyObj = this.compilerProvider.videoUploadBody(videoFile);
    return this.http.patch<Observable<any>>(createdVideo.upload.upload_link, bodyObj, this.uploadReqOpts);
}
Here the bodyObj variable contains undefined, whereas the console.log in the videoUploadBody function gives me the data in binary form.
I think there is some async or promise issue. What do I need to change to get the binary data back in the uploadVideo function?
You're correct that it is a promise issue. In your first function, r.onload fires only after videoUploadBody has already returned, and even then its return value only returns from the nested callback, not from videoUploadBody itself.
I'm not going to write the correct code for you, but what you have to do is wrap the first function's body in a promise that resolves inside the r.onload handler; the second function should then call .then on that promise to operate on the result (hint: you'll need to make another promise there).
MDN has good information about Promise.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise
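If it helps, here is a minimal sketch of just the promise-wrapping step described above (the method name comes from the question; how uploadVideo then waits on this promise, e.g. by chaining .then before calling http.patch, is left to you):
videoUploadBody(videoObj) {
    return new Promise((resolve, reject) => {
        const r = new FileReader();
        // Resolve only once the read has completed; reject on failure.
        r.onload = () => resolve(r.result);
        r.onerror = () => reject(r.error);
        r.readAsArrayBuffer(videoObj);
    });
}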
I have a situation where I am building a data layer based on ES6 JS Promises that fetch data from the network. I am caching all Promises internally by the url.
Everything seems to be working fine except one thing. I want to ensure that the data coming out of the network layer is a copy/clone of the data retrieved from the network and I obviously do not want to do that everywhere in the client code that implements Promise's then handlers.
I would like to set this up so then handler automatically gets a copy of the cached data.
To add a twist to this, I would like this to be configurable on a url basis inside the data layer so that some Promises do the extra post-processing copy while others return just the raw result.
Can anyone suggest a proper implementation to accomplish this? I should mention that I would like to get a new copy of the original raw result each time a new client asks for it.
The current simplified pseudo implementation looks like this:
getCachedData(url) {
    if (cache[url]) {
        return cache[url];
    } else {
        var promise = new Promise(function (resolve, reject) {
            var data = ...ajax get...;
            resolve(data);
        });
        cache[url] = promise;
        return promise;
    }
}

getCachedData(url).then(result => {
    // here I want the result to be a copy of the data I resolved the original promise with
});
Structure it like this:
function retrieveCopiedData() {
    // getDataFromServer returns your original Promise
    return getDataFromServer().then(function (value) {
        // use a library of your choice for copying the object
        return copy(value);
    });
}
This means that all consumers of retrieveCopiedData will receive the value returned from retrieveCopiedData's then() handler.
retrieveCopiedData().then(function (value) {
    // value is the copy returned from retrieveCopiedData's then handler
});
You can add conditional logic to retrieveCopiedData as you see fit.
It seems like you just want to incorporate the cloning process right in your data layer:
getCachedData(url) {
    if (!cache[url]) {
        cache[url] = new Promise(function (resolve, reject) {
            var data = ...ajax get...;
            resolve(data);
        });
    }

    if (requiresPostProcessing(url))
        return cache[url].then(clone);
    else
        return cache[url];
}
Notice that it might be a good idea not to clone the data each time it is retrieved, but to simply freeze the object that your promise is resolved with.
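A minimal sketch of that freezing variant (the fetch/JSON calls here are assumptions standing in for the question's "...ajax get..."; note that Object.freeze is shallow, so nested objects would need a deep-freeze helper):
cache[url] = fetch(url)
    .then(function (response) { return response.json(); })
    .then(function (data) {
        // Freeze once; every consumer then shares the same immutable object.
        return Object.freeze(data);
    });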
I want to implement dynamic loading of a static resource in AngularJS using Promises. The problem: I have a couple of components on the page which might (or might not, depending on which are displayed, hence dynamic) need to get a static resource from the server. Once loaded, it can be cached for the whole application lifetime.
I have implemented this mechanism, but I'm new to Angular and Promises, and I want to make sure this is the right solution / approach.
var data = null;
var deferredLoadData = null;

function loadDataPromise() {
    if (deferredLoadData !== null)
        return deferredLoadData.promise;

    deferredLoadData = $q.defer();

    $http.get("data.json").then(function (res) {
        data = res.data;
        return deferredLoadData.resolve();
    }, function (res) {
        return deferredLoadData.reject();
    });

    return deferredLoadData.promise;
}
So, only one request is made, and all subsequent calls to loadDataPromise() get back the first promise. It seems to work both for a request that is in progress and for one that finished some time ago.
But is it a good solution to cache Promises?
Is this the right approach?
Yes. Memoising functions that return promises is a common technique to avoid the repeated execution of asynchronous (and usually expensive) tasks. The promise makes the caching easy because one does not need to distinguish between ongoing and finished operations; both are represented as (the same) promise for the result value.
Is this the right solution?
No. That global data variable and the resolution with undefined are not how promises are intended to work. Instead, fulfill the promise with the result data! It also makes coding a lot easier:
var dataPromise = null;

function getData() {
    if (dataPromise == null) {
        dataPromise = $http.get("data.json").then(function (res) {
            return res.data;
        });
    }
    return dataPromise;
}
Then, instead of loadDataPromise().then(function() { /* use global */ data }) it is simply getData().then(function(data) { … }).
To further improve the pattern, you might want to hide dataPromise in a closure scope, and notice that you will need a lookup for different promises when getData takes a parameter (like the url).
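A sketch of that refinement, with the promise cache hidden in a closure and keyed by url (the Angular service/factory wrapping is omitted; only the pattern is shown):
var getData = (function () {
    var promises = {}; // url -> promise, private to this closure
    return function (url) {
        if (!promises[url]) {
            promises[url] = $http.get(url).then(function (res) {
                return res.data;
            });
        }
        return promises[url];
    };
})();
Calling getData("data.json").then(function (data) { … }) then works exactly as before, but each distinct url gets its own cached promise.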
For this task I created a service called defer-cache-service which removes all this boilerplate code. It's written in TypeScript, but you can grab the compiled js file. GitHub source code.
Example:
function loadCached() {
    return deferCacheService.getDeferred('cacke.key1', function () {
        return $http.get("data.json");
    });
}
and consume
loadCached().then(function(data) {
//...
});
One important thing to notice is that if, say, two or more parts of your code call the same loadDataPromise at the same time, you must add this check
if (defer && defer.promise.$$state.status === 0) {
    return defer.promise;
}
otherwise you will be making duplicate calls to the backend.
This design pattern will cache whatever is returned the first time it runs, and return the cached value every time it's called again.
const asyncTask = (cache => {
    return function () {
        // when called the first time, put the promise in the "cache" variable
        if (!cache) {
            cache = new Promise(function (resolve, reject) {
                setTimeout(() => {
                    resolve('foo');
                }, 2000);
            });
        }
        return cache;
    };
})();
asyncTask().then(console.log);
asyncTask().then(console.log);
Explanation:
Simply wrap your function with a self-invoking function which returns a function (your original async function). The purpose of the wrapper is to provide an encapsulating scope for the local variable cache, so that the variable is only accessible inside the returned function, and the same cached promise is handed back on every call to asyncTask after the first one.