I have created a Promise in order to get the duration of an audio file once it has finished synthesising.
I believe this solution is really inefficient, since I set a timeout regardless of when the task actually finishes, so I will probably just waste time each time I call the method:
polly.synthesizeSpeech(params, function (err, data) {
    if (err) {
        console.log(err, err.stack);
    } else {
        var uInt8Array = new Uint8Array(data.AudioStream);
        var arrayBuffer = uInt8Array.buffer;
        var blob = new Blob([arrayBuffer]);
        var urlAudioFile = URL.createObjectURL(blob);
        var audio = new Audio(urlAudioFile);
        audio.type = 'audio/wav';
        getAudioFileDurationAsync(audio);
    }
});
function getAudioFileDurationAsync(audio) {
    let promise = new Promise(function (resolve, reject) {
        setTimeout(() => {
            resolve("done!");
        }, 3000);
    });
    promise.then(
        result => {
            console.log(audio.duration);
        },
        error => console.log(error) // doesn't run
    );
}
Obviously, after 3000 ms I get the duration of the file, but I would like to get it as soon as the file has finished synthesising. How could I do it?
From the documentation, it seems to be possible to get the duration:
var audioElement = new Audio('car_horn.wav');
audioElement.addEventListener('loadeddata', () => {
    let duration = audioElement.duration;
    // The duration variable now holds the duration (in seconds) of the audio clip
});
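If you want to keep a promise-based helper like your getAudioFileDurationAsync, here is a minimal sketch of the same idea, wrapping the loadeddata event instead of a fixed timeout (it assumes the audio object you already build in your synthesizeSpeech callback):
// Hypothetical replacement for getAudioFileDurationAsync:
// resolves as soon as the browser knows the duration, rejects on a media error
function getAudioFileDurationAsync(audio) {
    return new Promise(function (resolve, reject) {
        audio.addEventListener('loadeddata', () => resolve(audio.duration));
        audio.addEventListener('error', reject);
    });
}

getAudioFileDurationAsync(audio).then(duration => console.log(duration));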
Hope it helps. Is that working for you?
Basically, you just need to wrap the code you want to be notified about in a Promise. If you have a callback function, just like in your example, all you have to do is resolve from within that callback (waiting for the audio metadata so the duration is actually available):
const audioFileDuration = (params) => new Promise((resolve, reject) => {
    polly.synthesizeSpeech(params, function(err, data) {
        if (err) {
            reject(err);
            return; // don't fall through to the success path
        }
        var uInt8Array = new Uint8Array(data.AudioStream);
        var arrayBuffer = uInt8Array.buffer;
        var blob = new Blob([arrayBuffer]);
        var urlAudioFile = URL.createObjectURL(blob);
        var audio = new Audio(urlAudioFile);
        audio.type = 'audio/wav';
        // The duration isn't known immediately; resolve once the metadata has loaded
        audio.addEventListener('loadedmetadata', () => resolve(audio.duration));
        audio.addEventListener('error', reject);
    });
});
audioFileDuration(params).then(duration => console.log(duration))
setTimeout acts as the maximum duration (a TTL) you want to wait for the function.
You can try resolving the promise from two flows:
when the 3000 ms timeout is reached
as soon as you get the duration of the file after it has finished synthesising
Whichever of these flows completes first will resolve the promise, and your code can proceed without waiting for the second resolve.
Make sure you clear the timeout if the second flow (getting the duration) finishes first.
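A minimal sketch of that two-flow idea, keeping the 3000 ms ceiling as the TTL and resolving with null when it is hit (an assumption; you could also reject):
function getAudioFileDurationAsync(audio, ttl = 3000) {
    return new Promise(function (resolve) {
        // Flow 1: the TTL is reached before the duration is known
        const timer = setTimeout(() => resolve(null), ttl);
        // Flow 2: the duration is available; clear the timeout and resolve with it
        audio.addEventListener('loadedmetadata', () => {
            clearTimeout(timer);
            resolve(audio.duration);
        });
        // Whichever flow runs first wins; a later call to resolve is ignored
    });
}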
Related
I apologise in advance as I am new to programming and I have been stuck at this for quite some time. I have a connect() function which returns a promise (it is also embedded in a class, not shown). I want this function to retry with a delay if the connection is not established (i.e. reject is returned), but I have been unable to do so; I tried using the async js library and the promise-retry library to no avail, as I can't understand the documentation. For clarity, socket.connect emits a 'connect' event if the connection is established.
const net = require('net');
const Modbus = require('jsmodbus');
this.socket = new net.Socket();
this.client = new Modbus.client.TCP(this.socket, this.unitID);
connect() {
    return new Promise((resolve, reject) => {
        this.socket.connect(options);
        this.socket.on('connect', () => {
            logger.info('*****CONNECTION MADE*****');
            // does something once connection made
            resolve();
        });
        this.socket.on('error', (error) => {
            logger.error('failed to connect');
            this.disconnect();
            reject();
        });
    });
}
First, define a utility function for the delay:
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
Then chain a .catch handler to the new Promise:
.catch(() => delay(1000).then(() => this.connect()));
Of course, you should avoid an infinite series of retries. So implement some logic to definitely give up: after a fixed number of attempts, or after a certain time has passed, ...etc.
For instance, give connect a parameter for how many attempts it should allow:
connect(attempts=3) {
Then the catch handler could be:
.catch((err) => {
    if (--attempts <= 0) throw err; // give up
    return delay(1000).then(() => this.connect(attempts));
});
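Assembled from the snippets above, the whole connect (keeping the body of your original promise, with the error passed to reject so it can be rethrown) could look roughly like this:
connect(attempts = 3) {
    return new Promise((resolve, reject) => {
        this.socket.connect(options);
        this.socket.on('connect', () => {
            logger.info('*****CONNECTION MADE*****');
            resolve();
        });
        this.socket.on('error', (error) => {
            logger.error('failed to connect');
            this.disconnect();
            reject(error);
        });
    })
    .catch((err) => {
        if (--attempts <= 0) throw err; // give up
        return delay(1000).then(() => this.connect(attempts));
    });
}
Note that this keeps reusing the same socket, so the 'connect' and 'error' listeners accumulate across retries; the next answer sidesteps that by creating a fresh socket per attempt.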
I'd do it by making a function that does a single connection attempt (basically, renaming your connect to tryConnect or similar, perhaps even as a private method if you're using a new enough version of Node.js), and then having a function that calls it with the retry-and-delay logic, something like this (see comments):
Utility function:
function delay(ms, value) {
    return new Promise(resolve => setTimeout(resolve, ms, value));
}
The new connect:
async connect() {
    for (let attempt = 0; attempt < MAX_ATTEMPTS; ++attempt) {
        if (attempt > 0) {
            // Last attempt failed, wait a moment
            await delay(RETRY_DELAY_IN_MS);
        }
        try {
            await this.tryConnect();
            return; // It worked
        } catch {
            // Ignore the error and retry on the next pass
        }
    }
    // Out of retries
    throw new Error("Couldn't create connection");
}
(If you're using a slightly older Node.js, you may need to add (e) after the catch above. Leaving it off when you don't need it is a relatively new feature.)
Re your current implementation of what I'm calling tryConnect, here are a few notes as comments for how I'd change it:
tryConnect() {
    return new Promise((resolve, reject) => {
        // Keep these local for now
        const socket = new net.Socket();
        const client = new Modbus.client.TCP(socket, this.unitID);
        // Add handlers before calling `connect`
        socket.on('connect', () => {
            logger.info('*****CONNECTION MADE*****');
            // NOW save these to the instance and resolve the promise
            this.socket = socket;
            this.client = client;
            resolve();
        });
        socket.on('error', (error) => {
            logger.error('failed to connect');
            // It's not connected, so no `disconnect` call here
            reject(error);
        });
        socket.connect(options);
    });
}
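For completeness, a small usage sketch; MAX_ATTEMPTS, RETRY_DELAY_IN_MS, and the modbusClient variable name are assumptions you'd define yourself:
const MAX_ATTEMPTS = 3;         // assumed value
const RETRY_DELAY_IN_MS = 1000; // assumed value

// somewhere in the calling code, on an instance of your class:
modbusClient.connect()
    .then(() => {
        // connected, start issuing requests
    })
    .catch(err => logger.error(err.message));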
In the function where connect is called, it can be called recursively, e.g.
const outerFunction = (times = 0) => {
    if (times < 4) {
        socket
            .connect(params)
            .then(() => {
                // do good stuff
            })
            .catch(e => {
                // increment the times so that it won't run forever
                times++;
                setTimeout(() => {
                    // delay for two seconds then try connecting again
                    outerFunction(times);
                }, 2000);
            });
    }
};
This way your connect function is retried, with a spacing of 2 seconds, up to three more times after the initial attempt; I hope this solves your issue.
I am trying to write a series of AJAX requests into a dictionary.
I am attempting to use promises for this; however, either I am writing the promise syntax incorrectly, or the function is actually completing (the for loop is done and the AJAX requests are sent) while the AJAX responses have not yet returned. Either way, this still returns an empty dictionary.
let dict = {};
let activeMachines = ["41", "42", "43"];
let dataPromise = new Promise (function (resolve, reject)
{
    for (let i = 0; i < activeMachines.length; i++)
    {
        let machineID = activeMachines[i]
        let getAPIData = new XMLHttpRequest();
        let url = 'http://127.0.0.1:8000/processes/apidata/' + machineID + '/';
        getAPIData.open('GET', url);
        getAPIData.send();
        getAPIData.onload = function()
        {
            let APIData = JSON.parse(getAPIData.responseText);
            dict['machine_' + machineID] = APIData[0].author_id;
            dict['temp' + machineID] = APIData[0].tempData; //get value
            dict['humid' + machineID] = APIData[0].humidData;
            timeValue = String((APIData[0].dateTime));
            dict['time' + machineID] = new Date(timeValue);
            console.log("done");
        }
    }
    resolve();
});
dataPromise.then(function() { console.log(dict); });
Is there a way to "sense" when all of the XMLHTTPRequests have returned?
@Rafael's answer will work, but it doesn't illuminate much about what's going wrong, since you're trying to grok the concept of Promises and write one yourself.
Fundamentally I think your approach has two missteps: 1. creating a single Promise that handles calls to all of your arbitrary list of "activeMachines", and 2. putting your resolve() call in the wrong place.
Usually a Promise looks like this:
const myPromise = new Promise(function(resolve, reject) {
    doSomeAsyncWork(function(result) {
        // Some kind of async call with a callback function or somesuch...
        resolve(result);
    });
}).then(data => {
    // Do something with the final result
    console.log(data);
});
You can simulate some kind of arbitrary asynchronous work with setTimeout():
const myPromise = new Promise(function(resolve, reject) {
    // Resolve with "Done!" after 5 seconds
    setTimeout(() => {
        resolve("Done!");
    }, 5000);
}).then(data => {
    console.log(data); // "Done!"
});
However your original code puts the resolve() call in a weird place, and doesn't even pass it any data. It looks sorta equivalent to this:
const myPromise = new Promise(function(resolve, reject) {
    // Supposed to resolve with "Done!" after 5 seconds...
    setTimeout(() => {
        // Doing some work here instead of resolving...
    }, 5000);
    resolve();
}).then(data => {
    console.log(data); // This would be "undefined"
});
Where you're doing a console.log("done"); in your original code is actually where you should be doing a resolve(someData);!
You're also trying to do side effect work inside of your Promise's async function stuff, which is really weird and contrary to how a Promise is supposed to work. The promise is supposed to go off and do its async work, and then resolve with the resulting data -- literally with the .then() chain.
Also, instead of doing multiple asynchronous calls inside of your Promise, you should generalize it so it is reusable and encapsulates only a single network request. That way you can fire off multiple asynchronous Promises, wait for them all to resolve, and then do something.
const activeMachines = ["41", "42", "43"];
// Make a reusable function that returns a single Promise
function fetchAPI(num) {
return new Promise(function(resolve, reject) {
const getAPIData = new XMLHttpRequest();
const url = "http://127.0.0.1:8000/processes/apidata/" + num + "/";
getAPIData.open("GET", url);
getAPIData.send();
getAPIData.onload = function() {
const APIData = JSON.parse(getAPIData.responseText);
const resolveData = {};
resolveData["machine_" + num] = APIData[0].author_id;
resolveData["temp" + num] = APIData[0].tempData; //get value
resolveData["humid" + num] = APIData[0].humidData;
timeValue = String(APIData[0].dateTime);
resolveData["time" + num] = new Date(timeValue);
resolve(resolveData);
};
});
}
// Promise.all() will resolve once all Promises in its array have also resolved
Promise.all(
    activeMachines.map(ea => {
        return fetchAPI(ea);
    })
).then(data => {
    // All of your network Promises have completed!
    // The value of "data" here will be an array of all your network results
});
The fetch() API is great and you should learn to use that also -- but only once you understand the theory and practice behind how Promises actually operate. :)
Here's an example of the Fetch API which uses Promises by default:
let m_ids = [1,2,3,4];
let forks = m_ids.map(m => fetch(`http://127.0.0.1:8000/processes/apidata/${m}`));
let joined = Promise.all(forks);
joined
.then(files => console.log('all done', files))
.catch(error => console.error(error));
I hope this helps!
I'm trying to get my head around promises. I think I can see how they work in the sense that you can say do Step 1, Step 2 and then Step 3, for example.
I have created this download function using node-fetch (which uses native Promises).
## FileDownload.js
const fetch = require('node-fetch');
const fs = require('fs');
module.exports = function(url, target) {
    fetch(url)
        .then(function(res) {
            var dest = fs.createWriteStream(target);
            res.body.pipe(dest);
        }).then(function(){
            console.log(`File saved at ${target}`)
        }).catch(function(err){
            console.log(err)
        });
}
So this all executes in order and I can see how that works.
I have another method that then converts a CSV file to JSON (again using a promise)
## CSVToJson.js
const csvjson = require('csvjson');
const fs = require('fs');
const write_file = require('../helpers/WriteToFile');
function csvToJson(csv_file, json_path) {
return new Promise(function(resolve, reject) {
fs.readFile(csv_file, function(err, data){
if (err)
reject(err);
else
var data = data.toString();
var options = {
delimiter : ',',
quote : '"'
};
const json_data = csvjson.toObject(data, options);
write_file(json_path, json_data)
resolve(data);
});
});
}
module.exports = {
csvToJson: csvToJson
}
When I call these functions one after another the second function fails as the first has not completed.
Do I need to wrap these two function calls inside another promise, even though on their own they each have promises implemented?
Please advise if I am totally misunderstanding this.
When I call these functions one after another the second function fails as the first has not completed.
There are two issues with the first:
It doesn't wait for the file to be written; all it does is set up the pipe, without waiting for the process to complete
It doesn't provide any way for the caller to know when the process is complete
To deal with the first issue, you have to wait for the finish event on the destination stream (which pipe returns). To deal with the second, you need to return a promise that won't be fulfilled until that happens. Something along these lines (see ** comments):
module.exports = function(url, target) {
    // ** Return the end of the chain
    return fetch(url)
        .then(function(res) {
            // ** Unfortunately, `pipe` is not Promise-enabled, so we have to resort
            // to creating a promise here
            return new Promise((resolve, reject) => {
                var dest = fs.createWriteStream(target);
                res.body.pipe(dest)
                    .on('finish', () => resolve()) // ** Resolve on success
                    .on('error', reject);          // ** Reject on error
            });
        }).then(result => {
            console.log(`File saved at ${target}`);
            return result;
        });
    // ** Don't `catch` here, let the caller handle it
}
Then you can use then and catch on the result to proceed to the next step:
theFunctionAbove("/some/url", "some-target")
.then(() = {
// It worked, do the next thing
})
.catch(err => {
// It failed
});
(Or async/await.)
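For instance, a sketch of the same call with async/await, inside an async function of your own (downloadAndReport is just a hypothetical name):
async function downloadAndReport() {
    try {
        await theFunctionAbove("/some/url", "some-target");
        // It worked, do the next thing
    } catch (err) {
        // It failed
    }
}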
Side note: I haven't code-reviewed it, but a serious issue in csvToJson jumped out, along with a minor one, and @Bergi has highlighted a second issue:
It's missing { and } around the else logic
The minor issue is that you have var data = data.toString(); but data was a parameter of that function, so the var is misleading (but harmless)
It doesn't properly handle errors in the part of the code in the else part of the readFile callback
We can fix both by doing a resolve in the else and performing the rest of the logic in a then handler:
function csvToJson(csv_file, json_path) {
    return new Promise(function(resolve, reject) {
        fs.readFile(csv_file, function(err, data){
            if (err)
                reject(err);
            else
                resolve(data);
        });
    })
    .then(data => {
        data = data.toString();
        var options = {
            delimiter : ',',
            quote : '"'
        };
        const json_data = csvjson.toObject(data, options);
        write_file(json_path, json_data);
        return data;
    });
}
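With both functions returning promises, calling them "one after another" becomes a chain in which the second step only starts once the first has finished. A sketch, with the require paths, URL, and file names as assumptions:
const file_download = require('./helpers/FileDownload'); // hypothetical paths
const { csvToJson } = require('./helpers/CSVToJson');

file_download('http://example.com/report.csv', './report.csv')
    .then(() => csvToJson('./report.csv', './report.json'))
    .then(() => console.log('CSV converted to JSON'))
    .catch(err => console.log(err));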
I am making a simple HTML5 game.
Object.keys(gameConfig.playerElems).map((e) => {
    let img = gameConfig.playerElems[e];
    let name = e;
    let imgObj;
    imgObj = new Image();
    imgObj.src = img;
    imgObj.onload = () => {
        playerElemsCounter++;
        drawPlayer(imgObj);
    }
});
Is it possible to pause the .map() iteration while imgObj is being loaded?
Is it possible to pause the .map() iteration while imgObj is being loaded?
No. So instead, you use an asynchronous loop. Here's one example, see comments:
// A named IIFE
(function iteration(keys, index) {
    // Get info for this iteration
    let name = keys[index];
    let img = gameConfig.playerElems[name];
    let imgObj = new Image();
    // Set event callbacks BEFORE setting src
    imgObj.onload = () => {
        playerElemsCounter++;
        drawPlayer(imgObj);
        next();
    };
    imgObj.onerror = next;
    // Now set src
    imgObj.src = img;
    // Handles triggering the next iteration on load or error
    function next() {
        ++index;
        if (index < keys.length) {
            iteration(keys, index);
        }
    }
})(Object.keys(gameConfig.playerElems), 0);
But, as Haroldo_OK points out, this will wait for one image to load before requesting the next, which is not only unnecessary, but harmful. Instead, request them all, draw them as you receive them, and then continue. You might do that by giving yourself a loading function returning a promise:
const loadImage = src => new Promise((resolve, reject) => {
    const imgObj = new Image();
    // Set event callbacks BEFORE setting src
    imgObj.onload = () => { resolve(imgObj); };
    imgObj.onerror = reject;
    // Now set src
    imgObj.src = src;
});
Then:
// Load in parallel, draw as we receive them
Promise.all(Object.keys(gameConfig.playerElems).map(
    key => loadImage(gameConfig.playerElems[key])
        .then(drawPlayer)
        .catch(() => drawPlayer(/*...placeholder image URL...*/))
)).then(() => {
    // All done, if you want to do something here
});
// No need for `.catch`, we handled errors inline
If you wanted (for some reason) to hold up loading the next image while waiting for the previous, that loadImage function could be used differently to do so, for instance with the classic promise reduce pattern:
// Sequential (probably not a good idea)
Object.keys(gameConfig.playerElems).reduce(
    (p, key) => p.then(() =>
        loadImage(gameConfig.playerElems[key])
            .then(drawPlayer)
            .catch(() => drawPlayer(/*...placeholder image URL...*/))
    ),
    Promise.resolve()
).then(() => {
    // All done, if you want to do something here
});
// No need for `.catch`, we handled errors inline
...or with ES2017 async/await:
// Sequential (probably not a good idea)
(async function() {
    for (const key of Object.keys(gameConfig.playerElems)) {
        try {
            const imgObj = await loadImage(gameConfig.playerElems[key]);
            playerElemsCounter++;
            drawPlayer(imgObj);
        } catch (err) {
            // use placeholder
            drawPlayer(/*...placeholder image URL...*/);
        }
    }
})().then(() => {
    // All done
});
// No need for `.catch`, we handled errors inline
Side note: There's no point to using map if you're not A) Returning a value from the callback to use to fill the new array map creates, and B) Using the array map returns. When you're not doing that, just use forEach (or a for or for-of loop).
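For instance, the question's loop never uses map's return value, so a forEach version (a sketch reusing the gameConfig, playerElemsCounter and drawPlayer names from the question) expresses the intent more directly:
Object.keys(gameConfig.playerElems).forEach(key => {
    const imgObj = new Image();
    imgObj.onload = () => {
        playerElemsCounter++;
        drawPlayer(imgObj);
    };
    imgObj.src = gameConfig.playerElems[key];
});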
I'm kind of new to the whole promise thing, so I might be doing something really wrong. If anyone can enlighten me, all advice/information is welcome.
So here's the code of what I'm trying to accomplish (simplified and absolutely not optimal, for understanding purposes):
// Get data from webservice
$scope.sendGet(id, option).then(function (response){
    // Fill the model
    $scope.model[option] = response.data;
}).then(function(){
    if(option == $scope.PROFILES){
        var p1 = new Promise((resolve, reject) => {
            $scope.getX1($scope.model[option][0][0].id);
        });
        var p2 = new Promise((resolve, reject) => {
            $scope.getX2($scope.model[option][0][0].id);
        });
        var p3 = new Promise((resolve, reject) => {
            $scope.getX3($scope.model[option][0][0].id);
        });
        var p4 = new Promise((resolve, reject) => {
            $scope.my_data = JSON.parse($scope.model[option][0][0].list);
        });
        // Execute all promises to get the data
        Promise.all([p1,p2,p3,p4]).then(() => {
            debugger;
            // Do some validation and extra formatting on the data we just downloaded
            $scope.update();
        });
    }
}).then(function(){
    // Display the data to the user
    $scope.move(option, 1, $scope.EDITING);
});
The intended behavior here is:
Get data -> With this data, use the id to get data from 4 sources (the 4 promises) -> Once all the data is downloaded, update some references and do some cleaning -> move (which is a method that updates the view and does some other UI-related stuff)
But for some reason, the debugger; and the $scope.update(); never get executed. I tried moving these into the same .then as the $scope.move() function, but then it executes before the data from the Promise.all has been retrieved.
You never resolve promises 1-4, so the "success callback" to Promise.all(...).then never fires. In the callback given to the constructor of each promise, call resolve with the data each promise is getting.
// ...
var p1 = new Promise((resolve, reject) => {
resolve($scope.getX1($scope.model[option][0][0].id));
});
// ...
This is how you "return" data, so to speak, from a Promise. Please see this article for details.
EDIT: if $scope.getX1 itself returns a Promise, you can simply assign it to p1, i.e.:
var p1 = $scope.getX1($scope.model[option][0][0].id);
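Putting that together for all four values, here is a sketch of the relevant branch; Promise.resolve() passes promises through and wraps plain values, so it covers both cases, and returning the Promise.all lets the next .then in your chain wait for it:
if (option == $scope.PROFILES) {
    var id = $scope.model[option][0][0].id;
    var p1 = Promise.resolve($scope.getX1(id));
    var p2 = Promise.resolve($scope.getX2(id));
    var p3 = Promise.resolve($scope.getX3(id));
    var p4 = new Promise((resolve) => {
        $scope.my_data = JSON.parse($scope.model[option][0][0].list);
        resolve($scope.my_data);
    });
    // Returning this promise makes the following .then (the one calling
    // $scope.move) wait until all four have settled
    return Promise.all([p1, p2, p3, p4]).then(() => {
        $scope.update();
    });
}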