A test which works perfectly well locally with Selenium WebDriver is timing out when run remotely on saucelabs.com. The same test works for Chrome (both local and on Sauce).
From the client code's side, the click in the following code is never returning:
var someLink = await driver.findElement(By.className('some-class'));
await someLink.click()
I'm using Jest as the test framework with a 60-second timeout, so on the client end, I get that timeout error after a minute.
When I log into sauce and look at the list of commands it processed I see:
POST elements
With parameters:
{"using":"css selector","value":".some-class"}
And the returned body is:
[{"ELEMENT":"2"}]
So that succeeds and finds the link, but I then never see a click event on that element. Prior click events and navigation commands are successful.
When I watch the video playback of the session, I see it click the given link and the new page load in Firefox, but the spinner (actually a little dot going back and forth) in the top right never stops.
I can't reproduce with Firefox myself, or even through the manual testing on Saucelabs where you can control the browser and VM through the web.
I'm wondering if there's some synchronous code that's running that just isn't resolving. But I can't figure out how to find that out. The developer tools don't appear to have any way to show currently blocking code.
While a page is loading, Selenium waits for document.readyState to become complete.
Sometimes loading a resource gets stuck: the browser is fetching a big file over a poor connection, the resource is unreachable because of a proxy, the service providing it is down, and so on.
I had the same problem with Firefox and solved it using the eager page load strategy.
With this load strategy, Selenium waits for document.readyState to be interactive: some resources might not be loaded yet, but the main elements of the page are, and you can interact with them in the usual way.
DesiredCapabilities caps = DesiredCapabilities.firefox();
caps.setCapability(CapabilityType.PAGE_LOAD_STRATEGY, "eager");
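Since the question itself uses the JavaScript bindings, here is a rough equivalent for selenium-webdriver on Node. A sketch, assuming a client version where Capabilities.set() is available:
const { Builder, Capabilities } = require('selenium-webdriver');

async function buildEagerFirefoxDriver() {
  // Consider navigation finished at document.readyState === 'interactive'
  const caps = Capabilities.firefox();
  caps.set('pageLoadStrategy', 'eager');
  return new Builder().withCapabilities(caps).build();
}
With this in place, click() should resolve even if some trailing resource on the new page never finishes loading.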
Related
I need to make a headless (for a docker container) app that waits for an external signal and then acts on that signal by clicking on several html elements (selectors, buttons, links) and filling in some input fields. All this can be done using jQuery, I know how to do that.
The app needs to keep the page loaded so it can act immediately; reloading the page every time takes too long. The whole action of receiving a signal, filling in a form, and submitting it should complete in under one second.
I made an electron app that does all this but I need to make the app headless so it can be run inside a docker container.
It looks like PhantomJS could do this, but I see two problems:
The PhantomJS script needs to keep the web page loaded, since the page I need to automate is very heavy; it can take more than a minute to load.
The PhantomJS script needs to be able to receive a signal and report back on progress. HTTP or file-based communication is too slow; I'd like to use WebSockets for this.
I hope someone can point me to the right tools for this and/or point me to some examples how to achieve this.
I would like to use JavaScript, but if there is a perfect solution in another modern language, I have no problem using that.
I managed to get it working inside a Docker container using Electron.
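For what it's worth, a minimal sketch of the shape this can take. The URL, port, and selector are placeholders, it assumes the ws package for the WebSocket side, and inside the container Electron still needs a virtual display (it is commonly launched with xvfb-run):
const { app, BrowserWindow } = require('electron');
const WebSocket = require('ws'); // assumed dependency

app.whenReady().then(() => {
  // Load the heavy page once and keep it alive between signals.
  const win = new BrowserWindow({ show: false });
  win.loadURL('https://example.com/heavy-page'); // placeholder URL

  // Accept external signals over a WebSocket and report back on the same socket.
  const wss = new WebSocket.Server({ port: 8081 });
  wss.on('connection', (socket) => {
    socket.on('message', async () => {
      // Drive the already-loaded page; assumes jQuery exists in that page.
      await win.webContents.executeJavaScript("$('#submit-button').click();");
      socket.send('done');
    });
  });
});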
We've just upgraded group policies at work because of a big migration project. Never mind the details... The thing is, some of our users use a Java application which reads the smart card reader. On new machines it doesn't work in IE; it has to run in Firefox. The trouble is that the first time Firefox opens it, it says there's no Java. As soon as you reload the page, it's fine.
As users are users, they hate the thought of having to reload the page, and it's not very elegant either. As the process of upgrading anything in the company is difficult, and I'm only an entry level desktop support guy, it won't get fixed any time soon.
So I was thinking... is there any way to create a shortcut, that would open the page and then reload it once it finishes loading the first time?
It can be a shortcut to a local html file which then redirects it to the final location...
You can use a VBScript:
Set WScriptShell = CreateObject("WScript.Shell")
WScriptShell.Run "http://www.facebook.com/"  ' open the URL in the default browser
WScript.Sleep 2000                           ' wait 2000 ms for the page to load
WScriptShell.SendKeys "{F5}"                 ' send F5 to the active window to reload
This one opens a page in the browser, waits 2000 ms (probably enough for the page to load) and then sends the "F5" key to the currently active window. This may not be a perfect solution, but you can extend it to match your needs.
Have you tried $( document ).ready() and putting the code in that function? It basically waits for the page's DOM to be ready and then executes the code in the function.
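As a hedged sketch of that idea, assuming you can get a script to run in the page at all (e.g. via a userscript), you could reload once and guard against an endless loop with sessionStorage:
$(document).ready(function () {
  // Reload only on the very first load of this browsing session.
  if (!sessionStorage.getItem('reloadedForJava')) {
    sessionStorage.setItem('reloadedForJava', 'yes');
    location.reload();
  }
});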
I have a basic HTML website (with some JavaScript) using a simple anchor tag to download a file like so:
<a id="mybutton" href="...">Mexml Samples 1.0</a>
In order to track the number of downloads, I have an onclick handler that passes an event to Google Analytics like so:
$('#mybutton').click(function(e){ga('send','event','Download','MexmlSample','MexmlSample-1.0');});
This works as expected when downloading the file using Chrome on OS X, and IE on Windows 7. The file downloads and I see the event in my GA account.
When I test it in Safari 8 on Yosemite, the file does download, but GA only rarely sees the event. And of course I get the dreaded Failed to load resource: Frame load interrupted in the Safari error console.
I assume that I get the GA event sometimes because of a race condition between when Safari interrupts the action and when the GA code fires.
So can anything be done to fix this in Safari so that I always get the GA events?
Note that my question probably has the same root cause as this unanswered question: Frame load interrupted when downloading excel files
Update June 6
I am now thoroughly confused. I just noticed that if I open a new browser page to my site (in Safari) and click on the download, then it gets logged by GA. However, subsequent clicks still download the file but don't get logged by GA.
If I close that window, and open a new one, then again the first download gets logged by GA.
In contrast, when using Chrome every download gets logged by GA.
I am now thinking that I may be looking at the wrong problem. The behavior I am seeing is telling me that Safari is maintaining a state in JavaScript that allows the first GA call to go through, but blocks all subsequent calls.
But this is the same code being run by Chrome, so I don't know how to even start debugging the problem.
If you always want to get the GA event then hitCallback is likely the only way to go until whatever is wrong with Safari is fixed. I use a similar pattern to send a GA event from a page in a Rails app that just redirects after a load of database work has run, and there is no noticeable lag from adding the JavaScript redirect into the callback. However, off the top of my head I am not sure how to initiate the download from JavaScript.
ga('send', 'event', 'Download', 'MexmlSample', 'MexmlSample-1.0', {
  hitCallback: function() {
    initiateDownload(); // placeholder; see the note above about starting the download
  }
});
I am not aware of any need to use the setTimeout() pattern in this instance.
The only solution I can think of is waiting until the GA request finishes, and only after that setting location.href to the desired file download link. But this is not great from the user's perspective. (This can be achieved with the hit callback: https://developers.google.com/analytics/devguides/collection/analyticsjs/field-reference#hitCallback.)
Perhaps the HTML5 download attribute on the anchor will solve the problem.
I have no OS X machine with Safari to test on, so these are only my thoughts.
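Putting the two suggestions together, a sketch of what the handler might look like. The guard ensures the download starts exactly once, and the setTimeout fallback (an arbitrary 1 second) only matters if analytics.js is blocked and hitCallback never fires:
$('#mybutton').click(function (e) {
  e.preventDefault();            // stop the immediate navigation
  var href = this.href;
  var fired = false;
  var go = function () {
    if (!fired) { fired = true; location.href = href; }
  };
  ga('send', 'event', 'Download', 'MexmlSample', 'MexmlSample-1.0', {
    hitCallback: go              // start the download once GA confirms the hit
  });
  setTimeout(go, 1000);          // fallback if GA is blocked or slow
});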
We're developing a web application that handles state change via change of the hash of the page (e.g. example.com/#/page1).
Lately, I've been running into an issue with Google Chrome, when the prefetch option is enabled ("Predict network actions to improve page load performance"). Among the different routes, we have #/logout that performs the logout.
In the "normal" state, I'm on the page example.com/#/ (the main page), and as I start typing "l" after that (example.com/#/l), Chrome autocompletes with logout. However, not only it does autocomplete, but it also calls the "haschange" event, so the client is sending a request to log out to the server... Even just by typing a l!
This behaviour is not only unexpected, it's also dangerous. Aside from unchecking "Predict network actions to improve page load performance" in the settings page (which is on by default), is there a way to prevent Chrome from doing this?
EDIT
A small new "discovery": Chrome is actually not firing the "hashchange" event, as a console.log in the event handler is never executed. Chrome has learnt that visiting the #/logout page triggers a request to the server (GET /auth/destroy), and it is firing that request by itself! What can we do to stop this?
Answering my own question. This is not really a solution, but rather a workaround.
According to this documentation, prerendering is disabled in certain situations: with POST requests (not an option in our case) and when the resources are served via HTTPS.
Since we were already going to enable HTTPS in the production environment, we just enabled it in the development one as well and the issue disappeared. However, I still feel like this is more of a workaround than a real solution.
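For reference, the POST-based escape hatch the documentation mentions (not an option in our case, as noted) might look like this; the #logout-link selector and jQuery usage are illustrative:
// Trigger logout with an explicit POST instead of a bare GET route,
// so Chrome's prerenderer will not replay it on its own.
$('#logout-link').on('click', function (e) {
  e.preventDefault();
  $.post('/auth/destroy').done(function () {
    window.location.hash = '#/'; // back to the main route after logout
  });
});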
My eventual goal is to find all resources that the webpage is attempting to load, stop them from loading, and then list them (so that the user can see what the webpage tried to load). Can anyone help me get started with the necessary JavaScript that I'll need to both stop the page from loading and print the resources? I'm just kind of lost on what the first step should be.
Take a look at webRequest for Chrome extensions:
https://developer.chrome.com/extensions/webRequest
I would suggest creating a callback for onBeforeRequest() and then checking what resources are being fetched and loaded.
You can cancel a particular web request from this callback.
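A rough sketch of that shape, assuming a Manifest V2 background script with the webRequest, webRequestBlocking, and <all_urls> permissions (Manifest V3 replaces blocking listeners with declarativeNetRequest):
var blockedUrls = [];

chrome.webRequest.onBeforeRequest.addListener(
  function (details) {
    // Let the top-level document through so the page itself can start.
    if (details.type === 'main_frame') {
      return { cancel: false };
    }
    blockedUrls.push(details.url); // record what the page tried to load
    console.log('Blocked:', details.type, details.url);
    return { cancel: true };       // cancel every subresource request
  },
  { urls: ['<all_urls>'] },
  ['blocking']
);
blockedUrls then holds the list you can show to the user.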