I'm attempting to run an automated job with CasperJS which reads in a JSON file and checks whether certain elements declared in each JSON object exist on a given page. I am attempting to write this recursively, but I am running into an issue where the beginning of the asynchronous recursive flow (i.e. my only synchronous function) returns synchronously (as expected) and causes the script to fail.
According to this CasperJS GitHub issue, Promises are not currently supported in PhantomJS. I have also tried the CasperJS .waitFor() method, but this causes an infinite loop if I try to bubble up a return value of true from the final asynchronous method (there is also the issue of waitFor being controlled by a timeout value rather than waiting on a response, but the timeout value can be changed). Here is a slightly simplified code example attempting to use waitFor:
casper.start(urlStart, function() {
    recursiveRun(jsonInput.length);
});

var recursiveRun = function(count) {
    if (count) {
        casper.waitFor(function check() {
            return controller(); // Infinite loop here
        }, function then() {
            jsonInput.shift();
            recursiveRun(count - 1);
        });
    }
};

var controller = function() {
    casper.thenOpen(jsonInput[0].url, function() {
        return renavigation();
    });
};

var renavigation = function() {
    casper.waitForSelector("#nav", function() {
        this.thenOpen(inventoryUrl, function() {
            return inventoryEvaluation();
        });
    });
};

var inventoryEvaluation = function() {
    casper.waitForSelector("#status", function() {
        this.evaluate(function() {
            // Unimportant in-browser function
        });
        return true;
    });
};

casper.run();
Why is this not working? What is the proper way to recursively and asynchronously perform these actions, if at all?
The CasperJS method .eachThen() seems to perform this functionality as intended.
From the example above, the synchronous function which recursively calls itself can be entirely replaced by having CasperJS iterate over the jsonInput variable in the callback of the .start() method.
casper.start(urlStart, function() {
    this.eachThen(jsonInput, function() {
        controller();
        jsonInput.shift();
    });
});
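For what it's worth, .eachThen() also hands the current item to its callback as response.data, so the shift() bookkeeping can likely be dropped. A minimal sketch, assuming controller is adapted to accept the current JSON object as a parameter:

casper.start(urlStart, function() {
    this.eachThen(jsonInput, function(response) {
        // response.data holds the current item of jsonInput
        controller(response.data);
    });
});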
My problem is pretty simple, and I think a lot of programmers must have already faced it.
I am using AngularJS and my JavaScript looks as follows:
controller('myController', function() {
    if (validateForm()) {
        // do after-validation stuff
    }

    validateForm = function() {
        var val = undefined;
        $http.get('api/validate/').
            success(function(data) {
                val = true;
            }).
            error(function(data) {
                val = false;
            });
        while (!val) {
            // loop over till val has value
        }
        return val;
    }
});
I have two questions:
1) When I make the http call it returns a promise that resolves to either the success or the error callback. If that is the case, the while loop in this code should not be infinite, but it is. By debugging I found that the http call is only made once I assign a value to the variable val, i.e. when the loop ends.
Why does the http call, being a deferred call, wait for the while loop to finish?
2) If I remove the while loop, the function returns no matter what the value of val is. How can I make it return only once val is defined?
JavaScript is single threaded, so the http success/error callbacks never get a chance to execute while the while loop is running. This is what causes your infinite loop.
Return the promise and use that in the calling function.
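A minimal sketch of that approach (the after-validation stuff is assumed to live in the caller's then callbacks):

validateForm = function() {
    // return the $http promise instead of busy-waiting on a flag
    return $http.get('api/validate/');
};

validateForm().then(function(response) {
    // do after-validation stuff
}, function(err) {
    // handle validation failure
});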
Well, ajax calls like these are usually not synchronous, so when your while loop is hit your variable val is not yet set. What you should do is call a function in either the success or the error case and put the follow-up logic there. (Alternatively you could try to make a synchronous ajax call, which I am not sure is possible with this method.) Your new code will be something like:
function mycallback(someval) {
    while (!someval) {
        // loop over till val has value
    }
}

controller('myController', function() {
    if (validateForm()) {
        // do after-validation stuff
    }

    validateForm = function() {
        var val = undefined;
        $http.get('api/validate/').
            success(function(data) {
                val = true;
                mycallback(val);
            }).
            error(function(data) {
                val = false;
                mycallback(val);
            });
    }
});
Depending on your needs, you may have to change the flow to fit this sync/async model. I don't recommend making the call synchronous, as it blocks execution while you wait for the ajax response. You should make a practice of using async calls only.
Refactor, and use the .then
angular.module("myModule")
.controller("myController", ["$scope", "$http", function ($scope, $http) {
var controller = this;
angular.extend(controller, {
// ...
submit : submit,
handleSuccess : function () { /* ... success */ },
handleError : function (err) { /* error */ }
});
function submit (form) {
validateForm(form)
.then(controller.handleSuccess, controller.handleError)
.then(controller.cleanUp);
}
function validateForm (form) {
return $http.get("...").then(function (response) { return response.data; });
}
}]);
It should all be readable now.
Additionally, the logic doesn't go inside of success; instead it goes inside subsequent .then calls, so you can keep your submit clean and readable.
But you cannot ever do something like
while (true) {
}
or
for (var i = 0; i < 10000000; i += 1) { }
All you are doing is locking your browser, preventing people from using the page.
You can't animate CSS, you can't read AJAX values, you probably can't even close the tab (depending on the browser).
Everything should either be reactive (event-based) or asynchronous, using callbacks or promises.
Hi, I am using jQuery and Ember to delete certain elements, and I want to use Deferred objects so that the remaining statements execute only after the deletions finish.
Here killMany is a function; once it is called, it executes the statement array.forEach(tryKill).
The array contains about 100 elements, and for each one a callback is invoked that deletes that element from the server.
I want the myFinalblock callback to be called only after the deletion of all elements has completely finished.
Please guide me.
killMany: function(c) {
var t = this
, wait = []
, dfd = new $.Deferred();
function keep(tile) {
tile.setProperties({ isSelected: false, isHidden: false });
}
function destroy(tile) {
    if (t.get('reports')) {
        t.get('reports').removeObject(tile.entity);
    }
    tile.remove();
}
function tryKill(tile) {
tile.set('isHidden', true);
tile.get('entity').kill()
.then(destroy.bind(null, tile),
keep.bind(null, tile));
}
function myFinalblock(){
this.set('selectedTiles', []);
}
this.set('promptDestroyMany', false);
if (c.response) {
var array = this.get('selectedTiles');
array.forEach(tryKill);
myFinalblock();
}
},
You seem to be missing the point of promises a bit. They do not "stop" your code; they allow you to cleanly route your code's asynchronous functionality. You therefore need a way to wait until all of the tryKill calls are done before calling myFinalblock. To do this, first modify your tryKill function to return its promise:
function tryKill(tile) {
tile.set('isHidden', true);
return tile.get('entity')
.kill()
.then(destroy.bind(null, tile),
keep.bind(null, tile));
}
Then you can do this:
var tiles = this.get('selectedTiles');
$.when.apply($, tiles.map(tryKill))
    .then(myFinalblock.bind(this)) // bind so `this.set` works inside myFinalblock
    .done();
On a side note, I suggest looking into a proper promise library and not using jQuery's built-in deferreds, as they are known to have a number of problems.
I'm trying to create a module that will fill in form inputs during functional testing, and I'd like to be able to call it from multiple test suites.
Pseudo code for the helper file (helper.js):
module.exports = {
fillForm: function() {
this.findByCssSelector('#firstname')
.click()
.pressKeys('John')
.end()
},
anotherFunction: function() {
// more code
}
}
In the spec for the functional test, I load that module as helper and I can see it execute. However, it seems I can't use this syntax and guarantee that the chained steps execute in the defined order:
'Test filling form data': function() {
    return this.remote
        .get(require.toUrl(url))
        // should happen first
        .then(helper.fillForm)
        // should happen second
        .then(helper.anotherFunction)
        // only after the above should the click happen
        .findByCssSelector('#submit')
        // click evt should show the #someElement element
        .click()
        .findByCssSelector('#someElement')
        .getComputedStyle('display')
        .then(function (style) {
            // assertions here
        });
}
It seems that the promise chaining allows the click event to happen before the then callbacks have executed. Is this sort of flow possible with Intern?
UPDATE:
For the moment, I'm working around this with this sort of code:
var remote = initTest.call(this, url);
return helpers.fillForm1Data.call(remote)
    .otherChainedMethodsHere()
    .moreChainedMethods()
    .then(function () {
        // assertion code here
    });
where the initTest method does url fetching, window sizing, and clearing of data, and fillForm1Data does as you'd expect. But the syntax is pretty ugly this way.
Your helper is not returning any value, so it is treated as a synchronous callback and the next thing in the chain is executed immediately. You also cannot return this from a promise helper, or it will cause a deadlock because the Command promise will be waiting for itself to resolve (Intern will throw an error if you try to do this). So you need to create a new Command and return that if you want to use the chained Command interface within your helper:
module.exports = {
fillForm: function() {
return new this.constructor(this.session)
.findByCssSelector('#firstname')
.click()
.pressKeys('John');
},
anotherFunction: function() {
// more code
}
};
You can also just return from this.session instead if you don’t care about the convenience of the Command API and can deal with normal promise callback chains:
module.exports = {
fillForm: function() {
var session = this.session;
return session.findByCssSelector('#firstname')
.then(function (element) {
return element.click();
})
.then(function () {
return session.pressKeys('John');
});
},
anotherFunction: function() {
// more code
}
};
I am trying to implement some flow control of async ajax calls. My project basically receives an array of values, then completes an ajax call for each of them, processes the results, and runs a function when complete.
I have a scoping issue: when the flow control function is initiated a second time with a different array of work, its job reference value is overwritten. I have rewritten just the part I am having trouble with, eliminating all the other functions, and posted it on jsfiddle:
http://jsfiddle.net/qanx9/
When the code is initiated, the jobRunner function will console.log the key into globalObj where the work is found. On the third run of the function, job2 is lost and everything now refers to job1. I have tried putting everything inside this function into an anonymous function, but that made no difference.
This is my first attempt at flow control, and my project is actually in Node.js. There are libraries that do this for me, but I am trying to get a good understanding of how this works.
// global used to track async flow control and store completed work
globalObj = {
job1: {
work: [0,1,2,3,4,5,6,7,8,10],
results: []
},
job2: {
work: [11,12,13,14,15,16,17,18,19,20],
results: []
},
async: {
limit: 5,
running: 0
}
};
// flow control
function jobRunner(job) {
console.log(job)
g = globalObj.async;
j = globalObj[job];
while (g.running < g.limit && j.work.length > 0) {
var task = j.work.shift();
fakeAsyncAjax(task,function(result){
j.results.push(result);
g.running--;
if (j.work.length > 0) {
jobRunner(job);
} else if (g.running == 0) {
jobDone(job);
}
})
g.running++;
}
}
function jobDone(job) {
console.log(job+' complete..');
}
// instead of doing real ajax calls, here I've done a basic simulation with a random delay, using setTimeout to make it async.
function fakeAsyncAjax(ref,completeFunc){
setTimeout(function(){
completeFunc(ref);
},Math.floor((Math.random() * 500) + 1))
}
// initiate jobs
jobRunner('job1')
jobRunner('job2')
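For what it's worth, the overwriting described above looks like it comes from g and j being assigned without var inside jobRunner, which makes them implicit globals shared by every invocation (and by the async callbacks that close over them). A minimal sketch of the fix, under that assumption:

// flow control
function jobRunner(job) {
    console.log(job);
    // declare with var so each invocation keeps its own references,
    // instead of rebinding shared implicit globals
    var g = globalObj.async;
    var j = globalObj[job];
    while (g.running < g.limit && j.work.length > 0) {
        var task = j.work.shift();
        fakeAsyncAjax(task, function(result) {
            j.results.push(result); // now always the right job's results
            g.running--;
            if (j.work.length > 0) {
                jobRunner(job);
            } else if (g.running == 0) {
                jobDone(job);
            }
        });
        g.running++;
    }
}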
If I have a function that's passed this function:
function(work) {
work(10);
work(20);
work(30);
}
(There can be any number of work calls with any number in them.)
work performs some asynchronous activity; say, for this example, it is just a timeout. I have full control over what work does on the completion of this operation (and, in fact, over its definition in general).
What's the best way of determining when all the calls to work are done?
My current method increments a counter when work is called and decrements it when it completes, firing the all-work-done event when the counter reaches 0 (this is checked after every decrement). However, I worry that this could be a race condition of some sort. If it is not, do show me why, and that would be a great answer.
There are a ton of ways you can write this program, but your simple technique of using a counter will work just fine.
The important thing to remember (the reason this will work) is that JavaScript executes in a single thread. This is true of all browsers and of Node.js, AFAIK.
Based on the thoughtful comments below, the solution works because the JS event loop will execute the functions in an order like:
function(work)
work(10)
counter++
Start async function
work(20)
counter++
Start async function
work(30)
counter++
Start async function
-- back out to event loop --
Async function completes
counter--
-- back out to event loop --
Async function completes
counter--
-- back out to event loop --
Async function completes
counter--
Counter is 0, so you fire your work done message
-- back out to event loop --
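A minimal sketch of the counter technique this trace describes (setTimeout stands in for the real async operation; the names are illustrative):

var pending = 0;

function work(n) {
    pending++; // synchronous increment, before the async part starts
    setTimeout(function () {
        // ... the async activity for n completes here ...
        pending--;
        if (pending === 0) {
            console.log('all work done');
        }
    }, n);
}

function doWork(work) {
    work(10);
    work(20);
    work(30);
}

doWork(work);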
There's no race condition. There is the added requirement that every request made perform a decrement when it finishes (always! including on http failure, which is easy to forget). But that can be handled in a more encapsulated way by wrapping your calls.
Untested, but this is the gist (I've implemented an object instead of a counter, so theoretically you can extend this to have more granular queries about specific requests):
var ajaxWrapper = (function() {
var id = 0, calls = {};
return {
makeRequest: function() {
$.post.apply($, arguments); // for example
calls[id] = true;
return id++;
},
finishRequest: function(id) {
delete calls[id];
},
isAllDone: function(){
var prop;
for(prop in calls) {
if(calls.hasOwnProperty(prop)) {return false;}
}
return true;
}
};
})();
Usage:
Instead of $.post("url", ... function(){ /*success*/ } ... ); we'll do
var requestId;
requestId = ajaxWrapper.makeRequest("url", ...
function(){ /*success*/ ajaxWrapper.finishRequest(requestId); } ... );
If you wanted to be even more sophisticated you could add the calls to finishRequest yourself inside the wrapper, so usage would be almost entirely transparent:
ajaxWrapper.makeRequest("url", ... function(){ /*success*/ } ... );
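A hedged sketch of that more transparent variant, assuming a jQuery-style (url, data, success) signature and using .always() so the decrement happens even on http failure:

var ajaxWrapper = (function () {
    var id = 0, calls = {};
    return {
        makeRequest: function (url, data, success) {
            var requestId = id++;
            calls[requestId] = true;
            // the equivalent of finishRequest runs internally via .always(),
            // so callers never have to remember the decrement
            $.post(url, data, success).always(function () {
                delete calls[requestId];
            });
            return requestId;
        },
        isAllDone: function () {
            var prop;
            for (prop in calls) {
                if (calls.hasOwnProperty(prop)) { return false; }
            }
            return true;
        }
    };
})();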
I have an after utility function.
var after = function _after(count, f) {
var c = 0, results = [];
return function _callback() {
switch (arguments.length) {
case 0: results.push(null); break;
case 1: results.push(arguments[0]); break;
default: results.push(Array.prototype.slice.call(arguments)); break;
}
if (++c === count) {
f.apply(this, results);
}
};
};
The following code would just work, because JavaScript is single threaded.
function doWork(work) {
work(10);
work(20);
work(30);
}
WorkHandler(doWork);
function WorkHandler(cb) {
var counter = 0,
finish;
cb(function _work(item) {
counter++;
// somethingAsync calls `finish` when it's finished
somethingAsync(item, function _cb() {
finish()
});
});
finish = after(counter, function() {
console.log('work finished');
});
};
I guess I should explain.
We pass the function that does the work to the work handler.
The work handler calls it, passing in work.
The function that does the work calls work multiple times, incrementing the counter.
Since the function that does the work is not asynchronous (very important), we can define the finish function after it has finished.
The asynchronous work that is being done cannot finish (and call the undefined finish function) before the current synchronous block (the execution of the entire work handler) has finished.
This means that after the entire work handler has finished (and the variable finish is set), the asynchronous work jobs will start to end and call finish. Only once all of them have called finish will the callback passed to after fire.
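A minimal standalone illustration of after (the timeouts stand in for real async jobs; results are collected in completion order):

var done = after(3, function (a, b, c) {
    // fires only after `done` has been called three times
    console.log(a, b, c); // logs the values in completion order
});

setTimeout(function () { done('first'); }, 10);
setTimeout(function () { done('second'); }, 50);
setTimeout(function () { done('third'); }, 100);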