Having recently read up on Yahoo's web optimisation tips and used YSlow, I've implemented a few of their ideas on one of my sites, http://www.gwynfryncottages.com; you can see the combined file here: http://www.gwynfryncottages.com/js/gw-custom.js.
This technique seems to work perfectly on most occasions, and really does speed up the site, but I notice a significantly higher number of errors where the scripts don't load, or don't load completely, while I'm working on the site. So, three questions:
Is combining scripts this way a good idea at all in terms of reliability?
Is there any way to measure the number of errors, i.e. the number of times the script failed to load?
Is there any way to 'pre-load' the JavaScript, or otherwise ensure that the number of loading errors is reduced?
Of course it's good. You will not only decrease HTTP requests but you will cut down delays in downloading other resources.
Try using minify: http://code.google.com/p/minify/, I've been using it and I've no complaints.
I can assure you that combining files WON'T cause any errors, as a combined script is the same as 10 non-combined scripts: they all load in the same way (in order, left to right, top to bottom). Double-check the way you're combining them.
Script execution stops at serious errors. If you have multiple scripts, the others will still run; if you packed everything into one big file, a lot more code won't get executed. So combining scripts is bad for reliability, but can be good for other purposes (mainly load time).
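A minimal sketch of that trade-off (the function names here are placeholders, not taken from the question's code):
// As two separate script tags, an error in the first file doesn't stop the second:
// <script src="tracking.js"></script>   <-- throws; only this file's execution stops
// <script src="slideshow.js"></script>  <-- still runs
// In one combined file, the same uncaught error halts everything after it:
brokenTrackingCall();   // ReferenceError is thrown here...
initSlideshow();        // ...so this line never runs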
All browsers have some sort of javascript console which will show you the number of errors. Most have some sort of developer tool too (Firebug in Firefox, Dragonfly in Opera etc).
I'm not sure what you mean by preloading. Since a javascript file can affect the rest of the page in various ways, browsers will fully load and execute a script tag before continuing to parse the page (which is why scripts can slow page loading down so much).
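If the aim is simply to stop a script from blocking the parser, one common alternative (a sketch, not something from the question's page) is to inject the script element dynamically, which lets the HTML keep parsing while the file downloads:
var s = document.createElement('script');
s.src = '/js/gw-custom.js';   // the combined file from the question; adjust the path as needed
s.async = true;               // dynamically injected scripts don't block parsing anyway; this just makes it explicit
document.getElementsByTagName('head')[0].appendChild(s);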
I can't see the load function in your code, which is being called on your body tag! I'd steer clear of adding JS to your HTML file; handlers can be attached dynamically, which will probably cause you less hassle along the way as well as being easier to maintain.
I'd say that the main thing to look out for is making sure that you're not trying to call something before it's defined (maybe your separate JS files were included in a different order from how they appear in the single JS file).
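For example, if the pieces end up in the wrong order in the combined file, code can run before its dependency exists (illustrative only, assuming jQuery is one of the combined scripts):
// combined.js -- site code was concatenated *before* jQuery:
$(document).ready(function () {   // ReferenceError: $ is not defined
    // ...
});
// ...jquery.js only appears further down in the same combined file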
Firebug for Firefox is a good development tool, if you haven't found it already. WebKit, Opera and IE also have various dev tools of their own.
Combining JavaScript files is always the best way to go, unless it's not logically sane to do so (downloading jQuery from Google Code instead of hosting it yourself is a good example).
I always combine as many files as I can (JavaScript, CSS, images as CSS sprites, etc.), even in development, and I never experience any problems. It's much faster because of the reduced number of HTTP connections, which should not be underestimated.
Regarding counting the errors, I don't exactly see what you mean. But debugging tools like the one built into Google Chrome, or Firebug for Firefox, are good for debugging your JavaScript code, and they list the errors that occur.
As for preloading: yes, it can be done, though it tends to become messy and illogical. I can't think of any case where going to the trouble of preloading the JavaScript would be a better solution than just making it work right out of the box, with no error checking needed.
About the error you are experiencing, the only one that my Chrome points out is this:
Uncaught ReferenceError: load is not defined
... which seems to be the onload handler "load()" set on line 55 of your HTML document, where the body tag opens.
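A minimal way to clear that particular error (assuming the handler is meant to live in gw-custom.js) is to drop the inline onload attribute and attach the handler from the script instead:
// instead of <body onload="load()">, attach the handler in gw-custom.js:
window.onload = function () {
    // ...do whatever load() was supposed to do
};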
First some backstory:
We have a website that includes a Google Map in the usual way:
<script src="https://maps.googleapis.com/maps/api/js?v=...."></script>
Then there is some of our JavaScript code that initializes the map. Suddenly, yesterday, pages started to load but then froze up entirely. In Chrome this resulted in having to force quit and restart. Firefox was smarter and allowed the user to stop script execution.
Now after some debugging, I found out that a previous developer had included the experimental version of the Google Maps API:
https://maps.googleapis.com/maps/api/js?v=3.exp
So it's likely that something has changed on Google's servers (which is completely understandable). This has uncovered a bug on our end, and both in combination caused the script to hang and freeze up the website and the browser.
Now ok, bug is found and fixed, no big harm done.
And now the actual question:
Is it possible to somehow sandbox these external script references so that they cannot crash my main site? I am loading a decent amount of external JavaScript files (tracking, analytics, maps, social) from their own servers.
However, such code could change at any time and could contain bugs that freeze my site. How can I protect my site? Is there a way to define a maximum allowable execution time?
I'm open to all kinds of suggestions.
It actually doesn't matter where the scripts are coming from - whether an external source or your own server. Either way they are run in the client's browser. And that makes it quite difficult to achieve your desired sandbox behavior.
You can get a sandbox inside your DOM by using iframes with the sandbox attribute. That way the content of the iframe is independent from the DOM of your actual website, and you can include scripts independently as well. This is mainly beneficial for security; I am not sure how it would affect overall stability when one script has a bug like an endless loop, but IMHO it is worth a try.
For further explanation see: https://www.html5rocks.com/en/tutorials/security/sandboxed-iframes/
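A minimal sketch of the idea (the widget page name is just a placeholder): the embedded page gets its own browsing context, and without allow-same-origin its scripts cannot reach the parent DOM.
<iframe src="/widgets/map-widget.html" sandbox="allow-scripts"></iframe>
Note that this mainly isolates the DOM; a runaway loop inside the iframe can still tie up the tab in browsers that run the frame on the same thread as the parent page.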
I'm a beginner web developer. I often use Firebug to debug my JavaScript.
The problem is that some script files in my page's UI have a lot of code, and this causes my web browser to become unresponsive, i.e. I get a dialog saying the script is unresponsive. Basically this happens when I am in Firebug's Script panel.
How can I deal with this?
I tried to find a solution to this problem and found nothing.
As for the answer, I think the best one was posted by @Pablo (I can't mark a comment as the accepted answer, unfortunately), and it is simply to try out the Google Chrome console. None of the problems I mentioned exist there.
Cheers guys!
I have had the same problem debugging some of our older scripts that make extensive use of the eval() function.
This causes many scripts to be displayed within the Script Location Menu. (Each dynamically generated script is represented there.)
A possible solution, given that the problem in my case was caused by the number of files, might be to see if you can bypass it entirely by using fewer source files for the same code. Using a 'built' version of whatever frameworks you use might alleviate the problem (particularly if they are still debuggable in built form).
If that does not work, you might try debugging using Firefox's built-in debugger (available via Ctrl+Shift+S), or switch to another browser to do the debugging, though that is obviously a far less desirable solution.
I'm currently doing some optimization work on a large web project. I'm already doing JavaScript file combining, minification and compression. But I'm confused on one point.
For a number of non-technical reasons, my users are split roughly 50/50 between IE7 and IE8. After doing some research, I'm getting the impression that IE7 loads JavaScript files sequentially and IE8 loads them in parallel. I understand that going forward this will not be an issue with more modern browsers (IE9+, FF, Chrome, etc.).
Is this an accurate statement? If yes, then what is best practice for loading the files?
That statement is correct, but you should remember that even modern browsers will make only a limited number of connections to the same server. So when your page, scripts, CSS and images are all on the same server, the browser may load only 2 or 4 of those at a time. Therefore it may be a good idea to add a subdomain or a different domain for scripts, to trick the browser into loading the scripts alongside the images.
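As a sketch of that sharding idea (static.example.com is just an illustrative host name that would serve the same content):
<script src="http://static.example.com/js/combined.js"></script>
<img src="http://static.example.com/img/sprite.png" alt="">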
An even simpler solution is to merge all scripts into one. You can do this 'on the fly' or cache the result. You can even minify the scripts (which means comments and whitespace are stripped and variable names are shortened). You shouldn't minify the original source files themselves, but you can cache the combined/minified output so it won't need to be regenerated with each request.
If you do this, you reduce traffic, and your browser will only need one request for the file, eliminating the overhead of multiple sequential requests.
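A rough sketch of the combine/minify/cache idea in Node.js; the file names and the UglifyJS dependency are assumptions for illustration, not part of the question:
// hypothetical build step: concatenate, minify once, cache the result to disk
var fs = require('fs');
var UglifyJS = require('uglify-js');   // assumed dependency: npm install uglify-js

var sources = ['jquery.js', 'plugins.js', 'site.js'];   // placeholder file names
var cachePath = 'cache/combined.min.js';

function getCombined() {
    if (fs.existsSync(cachePath)) {
        return fs.readFileSync(cachePath, 'utf8');   // already built: serve the cached file
    }
    var combined = sources.map(function (f) {
        return fs.readFileSync(f, 'utf8');
    }).join('\n;\n');                                // ';' guards against files missing a trailing semicolon
    var minified = UglifyJS.minify(combined).code;   // strips comments/whitespace, shortens variable names
    fs.writeFileSync(cachePath, minified);
    return minified;
}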
See this MSDN blog article which shows some other tricks for script loading.
I am trying to reverse-engineer a website I don't own, figuring out how some dumb "encryption" works, in order to be able to carry out some operations automatically, by taking the functionality outside the browser.
One of the files is of particular interest, let's call it javascript.js. It is linked in the HTML document like this
<script src="/javascript.js" type="text/javascript"></script>
I have
deobfuscated javascript.js
pretty-printed its code
My question now is: considering that I'm using Venkman and Firefox, how do I replace the on-site obfuscated javascript.js with my own pretty-printed version, in order to learn how it works?
Any other tool besides Venkman would do, as long as I can still step through the deobfuscated code.
Additional question (just in case I come across this related situation):
How would I do the same if javascript.js were embedded inline in the HTML, like <script>code</script>?
For those of you wondering about how legal this is, my question is not the first about reverse-engineering on SO: https://stackoverflow.com/questions/tagged/reverse-engineering
Apparently there's no problem with those questions, why should there be one with mine?
My objective is to understand the code AND my question is about the TOOLS, as in "where to point and click" or which tool could help me (if venkman cannot).
You could also always use an intercepting proxy (something like Paros), which will allow you to replace any part of the response in any way you like. So when the browser requests the JS file, you can catch the response in Paros, replace the content with your version, and you're done. I often use Paros for other things where I need that kind of interception or observation point; it's pretty simple and has many possible applications. It's basically just a matter of running it and setting your browser's proxy settings to use a proxy at localhost on the port Paros is listening on. You can then tell Paros to stop and let you edit the request or response just by checking a couple of boxes. Hope that helps.
This is going to be very difficult, if not impossible, to do without using browser debugging / extension features like GreaseMonkey or Chrome's Extension API. The reason being that if you don't get involved in the page load sequence, the obfuscated code will already have been run, setting up JavaScript objects, event handlers, etc., etc. You'd have to ensure that your new script replaced those objects and event handlers, which would be complicated and difficult.
With GreaseMonkey or Chrome Extensions or similar on whatever browser you're using, I'd expect it to be possible to detect the page loading script X and replace it with your local script Y. These things run at that level, they get involved in the process.
But despite your goals being aboveboard, debugging on someone else's site is a bad idea. If you introduce a bug through the deobfuscation process, or in the process of trying to understand the code, well that may at least waste time at the other end. I wouldn't be happy with people trying to do it on a site I was running. (That said, a site should be able to handle anything a client throws at it, because you can't trust anything coming from the client side.)
Instead of debugging on their site, I'd probably do my best to record (via Firebug or Chrome/Safari's Dev Tools, etc.) a sample ajax interaction, and then set up a dummy page on my own local server that would simply echo that interaction, playback style. Then you can experiment to your heart's content without risking throwing weird stuff at the site in question. I'd consider it unethical for me to play around in that way with someone else's site, whether they should be able to handle it or not.
Way 1:
Export the web page that uses the code to your drive (I know for sure Opera, Firefox and Chrome support this - Ctrl+S - make sure to save all content). They download all linked content (CSS, scripts, images) and fix the URLs so the downloaded copies are loaded instead. Then replace the JavaScript file you want to debug and open the downloaded HTML in a browser, say Firefox with Firebug, and start debugging. This should work unless the page is heavily ajaxified.
Way 2:
I've managed to get this working in Google Chrome (v8.0.552.215 - I need to update, BTW) on a page that has no jQuery (for example w3c.org) - try it yourself: just copy-paste it into the address bar and wait for the page to disappear :)
javascript:(eval("var script=document.createElement('script');script.src='http://code.jquery.com/jquery-1.4.4.min.js'; document.getElementsByTagName('head')[0].appendChild(script);window.setTimeout(\"$('body').fadeOut(5000);\", 2000)"));
The script shows up in the scripts section of the console (CTRL+SHIFT+J) and you can set breakpoints. So something like this should work (feel free to modify):
javascript:(eval("for (var allsuspects=document.getElementsByTagName('script'), i=allsuspects.length, oldfile=prompt('Remove script src:'); oldfile && i>=0; i--) if (allsuspects[i] && allsuspects[i].getAttribute('src')!=null && allsuspects[i].getAttribute('src').indexOf(oldfile)!=-1) allsuspects[i].parentNode.removeChild(allsuspects[i]);var script=document.createElement('script');script.src = prompt('Inject script src:');document.getElementsByTagName('head')[0].appendChild(script);"));
The script expanded and explained:
for (var allsuspects=document.getElementsByTagName('script'), i=allsuspects.length, oldfile=prompt('Remove script src:'); oldfile && i>=0; i--)
if (allsuspects[i] && allsuspects[i].getAttribute('src')!=null && allsuspects[i].getAttribute('src').indexOf(oldfile)!=-1)
allsuspects[i].parentNode.removeChild(allsuspects[i]); // remove old script
var script=document.createElement('script'); // inject new script
script.src = prompt('Inject script src:');
document.getElementsByTagName('head')[0].appendChild(script);
The script works only in Chrome (maybe in Safari too?). I've tried Firefox, IE and Opera, but none of them worked. I would guess there might also be an issue if the file is not available online (if you use the 'file://' protocol).
UPDATE: also works in Chrome v8.0.552.224
This is the page I'm working on... http://schnell.dreamhosters.com/folio/earthquake.html
Its purpose is explained in the instructions on the left. I'm finding that after doing enough searches and clicking enough of the links in the list on the right, the page freezes up, the Google Map stops working, and Firebug reports an error in main.js that goes like this...
b is undefined
Line 49
I really don't know why this decided to happen all of a sudden and the error is so cryptic and muddled amongst Google's code that I don't think I'll be able to figure this one out by myself.
Another problem I'm finding is that the page simply refuses to work in IE7 and IE8 (or probably any version of IE, for that matter). I'm also at a loss as to how to solve this because I can't figure out how to use any of IE's debuggers (if they even have one), and seeing as I've already tested this and made it work in two browsers (technically three, since Safari runs on WebKit just like Chrome), I'm struggling to imagine what could be going wrong.
Any help would be greatly appreciated
Moved from comment to answer.
As scunliffe mentioned, you are trying to do a cross-domain AJAX request without using JSONP. Use either $.ajax() with dataType: 'jsonp', or add &callback=? to the end of the URL in the $.getJSON() call.
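A minimal sketch of the second form; the URL and query parameters here are placeholders, not the page's actual request:
// appending callback=? tells jQuery to use JSONP (a <script> tag) instead of XMLHttpRequest
$.getJSON('http://earthquake-feed.example.com/search?region=world&callback=?', function (data) {
    // data is the parsed response; add the markers to the map here
});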
IE8 is quite good when it comes to helping out the developer. From memory, F12 will open up the developer tools, where you can inspect the DOM and CSS and debug scripts.
Your error is cryptic because most javascript comes minified, so variables are all remapped to single letters, etc. See if the script causing the problem has a development (i.e. unminified) version as this will make a lot more sense to step through.
With regard to your specific issue, it sounds like a timing problem. While browsers do a decent job of executing scripts consistently if you follow the standards, they do differ in their timing, i.e. when things execute. That would explain why b is undefined in some cases and not others.