Site monitoring tool to look for javascript errors [closed] - javascript

I am currently working on a site that includes JavaScript code we get from several different sources, all of which needs to run on the site I maintain. Every once in a while some of this code breaks without our knowing until it's too late. Is there a monitoring tool that will crawl our site, look for JavaScript errors, and report them? Or could this be incorporated into a Selenium test somehow?

On the sites I develop, I wrap everything in try ... catch blocks, and if the exceptions I catch cannot be handled, I always generate an AJAX request to a script which emails an error report to the development team with as much information as I can gather.
If the code comes from elsewhere and adding try...catch blocks would be difficult, you can use the window.onerror handler instead:
<script type="text/javascript">
window.onerror = function (message, source, lineno) {
    // Generate an AJAX request to your error-report script here,
    // passing along the error message, source file, and line number.
};
</script>

I know this post is old, but recently we've launched a tool that does this :)
It's called ConsoleWatch - https://www.consolewatch.io/
It lets you scan whole websites for JS errors and also schedule repeating scans with reports, so you might find it handy!

It would also be smart to use a tool that catches any JavaScript errors that happen in production. There are several tools out there, but I recommend RootCause because it allows you to automatically reproduce any user errors.
Disclaimer: I work for RootCause. Our software automatically reproduces JavaScript errors and lets you replay user sessions live in your browser with the click of a button.

Related

Automated JavaScript and HTML testing over multiple pages [closed]

We have a webshop and we sell lots of items.
Our checkout process consists of 4 different pages where the user has to input their address, select a delivery method, and confirm their order, each on a different page/URL. Each of those pages relies on communication with the server and a lot of JavaScript/jQuery.
Some of our users have reported problems at some parts of those pages. We suspect it could be an OS/browser combination that can't handle part of our JavaScript code.
Is there any way to automate testing of a checkout process of 4 different consecutive pages, each requiring user input?
We would like the testing environment to test on different browsers/browser versions.
We also had a customer recently whose antivirus program would change the URLs of our JS source files. Is there any way to capture cases like that by testing?
Sounds like you could use E2E (end-to-end) testing, using Protractor/Selenium.
It's basically about scripting user behaviour, which the browser driver then performs in place of a real user. You describe what should happen on the page, and if any of those conditions is not met, it is included in the post-test report. You can configure it to use whatever browser driver you like.
I suggest you take a look at Selenium. Its main purpose is to automate browser actions.
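As a rough illustration, a checkout flow like this can be scripted with Selenium's Python bindings. This is only a minimal sketch: the shop URL, element IDs, and confirmation text are hypothetical placeholders that would have to match your real pages.

# Minimal sketch of a multi-page checkout test with Selenium (Python).
# All URLs, element IDs, and expected texts below are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox()  # or webdriver.Chrome(), etc., per browser matrix
try:
    # Page 1: address form
    driver.get("https://shop.example.com/checkout/address")
    driver.find_element(By.ID, "street").send_keys("1 Example Road")
    driver.find_element(By.ID, "submit-address").click()

    # Page 2: delivery method
    driver.find_element(By.ID, "delivery-standard").click()
    driver.find_element(By.ID, "submit-delivery").click()

    # Pages 3-4: confirm the order and check it went through
    driver.find_element(By.ID, "confirm-order").click()
    assert "Thank you for your order" in driver.page_source
finally:
    driver.quit()

Running the same script against different browser drivers (Firefox, Chrome, Edge, ...) covers the different browser/browser-version combinations mentioned in the question.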

What is the easiest web scraping tool that handles JavaScript [closed]

I would like to make a web scraping application that is able to log in to a website (I was able to do this with twill in Python), and also to execute JavaScript which triggers access to other pages.
I would definitely prefer to use something in Python, but I am ready to try something new. I have installed mechanize, watir, Hojocki, etc., but I'm not sure if any of this really helps.
I'd recommend PhantomJS.
It's a full WebKit browser, but headless and scriptable.
It's ideal for this sort of thing.
I believe there are a few modules (such as Ghost), but I have used Selenium/WebDriver for things like this. It is ostensibly a testing framework, but it provides a lot of methods to let you interact with the page just as if you had loaded it as a normal user. You also have the benefit of running it so that a browser actually opens and you can watch the code execute (which makes debugging easier), or in a 'headless' mode where the code just executes (there are other sites/SO answers with much better explanations than I can give :) ).
That being said, Ghost looks great as well, so try them both and hopefully one will get you what you need!
Also, see Javascript (and HTML rendering) engine without a GUI for automation? for a similar question that may have some additional answers.
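As a rough sketch of the Selenium approach, here is a minimal login-and-scrape example using the Python bindings; the URL, field names, and link text are hypothetical placeholders.

# Minimal sketch: log in and follow a JavaScript-driven link with Selenium.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait

driver = webdriver.Firefox()
try:
    driver.get("https://example.com/login")
    driver.find_element(By.NAME, "username").send_keys("user")
    driver.find_element(By.NAME, "password").send_keys("secret")
    driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()

    # Follow a link whose navigation is driven by JavaScript
    driver.find_element(By.LINK_TEXT, "Reports").click()
    WebDriverWait(driver, 10).until(lambda d: "Reports" in d.title)
    print(driver.page_source)  # fully rendered HTML, ready for parsing
finally:
    driver.quit()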
I would recommend Octoparse, a free web scraper for Windows.
It's not programmable, but it's very easy to use. There's no Mac version, though.
JavaScript can be handled by Octoparse, by the way.

JavaScript AJAX remote logger [closed]

I am working on a JavaScript application on a platform which has no support for log output, does not allow opening new windows for logger output, and has nothing like Firebug or the Safari debugger on it...
So far I have been using a floating <div> at z-index 2 and logging text inside it, but this is not sufficient. I am looking for a lightweight JavaScript JSONP logger and some PHP or Tomcat server counterpart...
I recently stumbled upon this presentation by N. Zakas and implemented the technique explained there. It is quite simple, but IMHO very effective:
http://www.slideshare.net/nzakas/enterprise-javascript-error-handling-presentation
The idea is to simply issue a call to a server-side component (I used a .NET handler, but it could be a PHP file as well) which takes some params, logs the param values, and returns a 1x1 image stream back. What I like most is that there's no need to involve AJAX calls at all.
The code from the presentation is as follows:
function log(severity, message) {
    var img = new Image();
    img.src = "log.php?sev=" + encodeURIComponent(severity) +
              "&msg=" + encodeURIComponent(message);
}
log(1, "something bad happened");
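The presentation leaves the server side open (log.php above could equally be a .NET handler or a servlet). As a rough illustration of what that endpoint has to do, here is a minimal sketch in Python/Flask; the route, log file name, and log format are placeholders chosen to match the snippet above.

# Minimal sketch of the server-side counterpart in Python/Flask:
# log the query parameters, then return a 1x1 transparent GIF.
import logging
from flask import Flask, Response, request

app = Flask(__name__)
logging.basicConfig(filename="client-errors.log", level=logging.INFO)

# A well-known minimal transparent 1x1 GIF (43 bytes)
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

@app.route("/log.php")  # keep the path the client-side snippet expects
def log_endpoint():
    logging.info("sev=%s msg=%s",
                 request.args.get("sev"), request.args.get("msg"))
    return Response(PIXEL, mimetype="image/gif")

Because the request is a plain image fetch, no AJAX machinery is involved, which matches the constraint in the question.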
Warning: no longer working!
As #JohnSmith commented below, the JSConsole solution suggested here appears to no longer be functional.
An alternative to hosting your own server logging might be JSConsole.com. It's a general purpose remote debugger for JavaScript. Just register a listener, paste the script tag it generates into your page, then fire up an instance on any device. The debugger is bidirectional, so not only does the logging get forwarded to the remote console on JSConsole, you have full access to the JS environment on the remote client.

Greasemonkey-like Firefox plugin for automatic browsing [closed]

Is there a plug-in for Firefox that would run users' JavaScript code, like Greasemonkey, and be able to browse from page to page?
I'd like to write a script to:
Log in to a website.
Follow several links.
Make a GET request to that host periodically with given data and time intervals.
Make a POST request based on the results of the previous in-loop requests.
Now I use Python's mechanize as a browser, so I'm looking for something with similar (maybe not as rich) functionality within Firefox. Do you have experience with that type of thing? What should I check out?
Selenium, which has an interface for recording and running tests inside the browser, and can also export tests in many languages (including Python) for running as a suite in the Selenium RC tool.
Or
Chickenfoot (You'll probably need to use setTimeout for the repeating requests.)
You also have iMacros
The software's description on Mozilla Add-ons says:
"Automate Firefox. Record and replay repetitious work. If you love the Firefox web browser, but are tired of repetitive tasks like visiting the same sites every day, filling out forms, and remembering passwords, then iMacros for Firefox is the solution you've been dreaming of! Whatever you do with Firefox, iMacros can automate it."
I would recommend Selenium RC. It comes as a Java command-line tool and allows you to remote-control Firefox, IE, and Safari. Although it is officially a browser-based web-testing tool, it can be very useful for crawling and scraping AJAX-based web applications, and for all sorts of automated tasks otherwise difficult to accomplish with non-graphical HTTP clients such as Curl, Hpricot, and Mechanize.
Moreover, it's widely used, has an API for most popular programming languages (including Python), and allows you to inject custom JavaScript code into web pages.
PS:
Documentation is here
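As a rough sketch of how the periodic GET/POST loop from the question could look with Selenium's Python bindings (all URLs, selectors, and the injected JavaScript are hypothetical placeholders):

# Minimal sketch: log in once, then poll periodically and POST on a match.
import time
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox()
try:
    # Step 1: log in
    driver.get("https://example.com/login")
    driver.find_element(By.NAME, "user").send_keys("me")
    driver.find_element(By.NAME, "pass").send_keys("secret")
    driver.find_element(By.CSS_SELECTOR, "form").submit()

    # Steps 2-4: periodic GET, then a POST based on the result
    while True:
        driver.get("https://example.com/status?item=42")
        if "available" in driver.page_source:
            # Inject custom JavaScript to fire the POST from inside the page
            driver.execute_script(
                "fetch('/order', {method: 'POST', body: 'item=42'});")
            break
        time.sleep(60)  # repeat once a minute
finally:
    driver.quit()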

Python library for rendering HTML and javascript [closed]

Is there any Python module for rendering an HTML page with JavaScript and getting back a DOM object?
I want to parse a page which generates almost all of its content using JavaScript.
The big complication here is emulating a full browser environment outside of a browser. You can use stand-alone JavaScript interpreters like Rhino and SpiderMonkey to run JavaScript code, but they don't provide a complete browser-like environment to fully render a web page.
If I needed to solve a problem like this, I would first look at how the JavaScript is rendering the page; it's quite possible it's fetching data via AJAX and using that to render the page. I could then use Python libraries like simplejson and httplib2 to fetch that data directly and use it, negating the need to access the DOM object at all. However, that's only one possible situation; I don't know the exact problem you are solving.
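As a rough sketch of that AJAX-first approach (using only the standard library in place of simplejson/httplib2; the endpoint URL and JSON shape are hypothetical):

# Minimal sketch: skip DOM rendering and call the page's AJAX endpoint
# directly. Find the real endpoint in the browser's network inspector.
import json
import urllib.request

with urllib.request.urlopen("https://example.com/api/items?page=1") as resp:
    data = json.loads(resp.read().decode("utf-8"))

for item in data.get("items", []):
    print(item.get("title"))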
Other options include the Selenium one mentioned by Łukasz, some kind of embedded WebKit craziness, some kind of IE Win32 scripting craziness, or, finally, a PyXPCOM-based solution (with added craziness). All of these have the drawback of requiring pretty much a fully running web browser for Python to play with, which might not be an option depending on your environment.
You can probably use python-webkit for it. Requires a running glib and GTK, but that's probably less problematic than wrapping the parts of webkit without glib.
I don't know if it does everything you need, but I guess you should give it a try.
