I'm working on a project where I need to show specific content and include a file for a specific set of browsers.
I'm starting off with IE.
Since conditional comments are no longer supported as of IE10, I'm trying to figure out the best practice for including specific content and a specific JavaScript file with some general fixes for things like CSS variables as well.
So far I have something for the "exclusive" JS code, but I only want the file to be included based on which browser is being used.
// Matches legacy Edge ("Edge" token) and IE 11 ("rv:11.0" in the UA string)
if (/Edge/.test(navigator.userAgent) || /rv:11\.0/i.test(navigator.userAgent)) {
  // The closing tag is escaped so it doesn't terminate the surrounding <script> block
  document.write('<script src="./js/app-legacy-browsers.js"><\/script>');
}
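A non-blocking variant of the same check, injecting the script element instead of using document.write (an untested sketch; the file name is just the one from the snippet above):

if (/Edge/.test(navigator.userAgent) || /rv:11\.0/i.test(navigator.userAgent)) {
  // Dynamically inserted scripts don't block the parser
  var legacyScript = document.createElement('script');
  legacyScript.src = './js/app-legacy-browsers.js';
  document.head.appendChild(legacyScript);
}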
Other than that, I need to show the icon of the active browser on the site with some pre-set content.
I'd like to accomplish the same result as if
<!--[if IE]><![endif]--> still existed.
The main reason for not wanting to use the if statement is that I don't want to affect page speed on first load.
Related
I do not know whether there is any way to solve this, but I'm asking because I believe SO is a place of genius people. We all know that if we use the <noscript></noscript> tag within an HTML document, any user viewing the page with JavaScript disabled in their browser will see the noscript message.
But what if JavaScript has been disabled by a proxy? Many Wi-Fi networks require you to manually enter a proxy to use the internet, and many of those proxies have JavaScript disabled on the proxy server. In that case, if anybody visits the same page, it will look as though JavaScript is enabled in the browser but it has been disabled by the proxy.
Is there any way to check whether JavaScript has been disabled by a proxy (if one is being used) and show an alert message in that case? I would also be glad if anybody could explain how to implement it with WordPress and also without WordPress. :)
Thanks.
You can show the message by default and then remove or hide it with JavaScript, e.g.:
<div id="jsalert">JavaScript is disabled in your environment.</div>
<script>
(function() {
var elm = document.getElementById("jsalert");
elm.parentNode.removeChild(elm);
})();
</script>
<!-- Continue your content here -->
If script tags have been stripped by a proxy (which I'm fairly certain is very unusual; at least, I've never seen it), then of course the script won't be there to be run, and the div will show. If the script is present, it will remove the div.
By following the div with the script immediately (which is perfectly fine, no need for "DOM ready" stuff that will just delay things), the odds of the div "flashing" briefly on the page in the common case (where JavaScript is enabled and not stripped out) are very low. Not zero, but low.
If you believe the proxy doesn't strip out script tags but instead just blocks the downloads of JavaScript files (which would be dumb), you can change the above to use a JavaScript file, but beware that by doing that you either hold up the rendering of your page (if you use <script src="...">) or you increase (dramatically) the odds of the div "flashing" on the page briefly (if you load the script asynchronously).
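If you do go the external-file route, a minimal sketch might look like this (the file name jscheck.js is made up for illustration):

<div id="jsalert">JavaScript is disabled in your environment.</div>
<!-- Synchronous on purpose: it keeps the "flash" risk low, at the cost of blocking rendering -->
<script src="jscheck.js"></script>

// jscheck.js: if this file is downloaded and runs, remove the warning
(function() {
  var elm = document.getElementById("jsalert");
  if (elm) {
    elm.parentNode.removeChild(elm);
  }
})();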
This is just a specific use-case for a general practice called "progressive enhancement" (or sometimes "graceful degradation," but most people prefer the first). That's where you ensure that the page is presented correctly and usefully in the case where JavaScript is not available, and then use JavaScript to add behaviors to the page if JavaScript is enabled. In this case, the "useful" thing you're doing is saying that JavaScript isn't running for some reason, so it's a slightly different thing, but it's the same principle.
I have a page with a lot of JavaScript. However, the page, once rendered, remains static; there are no moving things or special effects, etc. It should be possible to render the same HTML without any JavaScript at all, using only plain HTML and CSS. This is exactly what I want: a no-JavaScript version of this particular page. Naturally, I do not expect any dynamic behavior, so I am OK if buttons are dead, for example. I just want them rendered.
Now, I do not want an image. It needs to be HTML with CSS, which may be embedded within the HTML; that is fine too.
How can I do it?
EDIT
I am sorry, but I must not have been clear. My website works with JavaScript and will not work without it. I do not want to check whether it works without JavaScript; I know it will not, and I really do not care about that. That is not what I am asking. I am asking about a specific page, which I want to grab as pure HTML + CSS. The fact that its dynamic nature is lost is of no importance.
EDIT2
There was a suggestion to grab the HTML from the DOM inspector. This is the first thing I did: in Chrome's developer tools I copied the root html element as HTML and saved it to a file. Of course, this does not work, because it still references the CSS files on the web. I guess I should have mentioned that I want it to work from the file system.
Next I tried saving the page as complete, with all its environment, using the Save menu (browser dependent). That saves the page and all the related files forming a closure, which can be opened from the file system. But the HTML has to be manually cleaned of all the JavaScript, which is tedious and error prone.
EDIT3
I seem to keep forgetting things. Images should be preserved, of course.
I have to do a similar task on a semi-regular basis. As yet I haven't found an automated method, but here's my workflow:
Open the page in Google Chrome (I imagine Firefox also has the relevant tools);
"Save Page As" (complete page), rename the html page to something nicer, delete any .js scripts which got downloaded, move everything into a single folder;
On the original page, open the Elements tab (DOM inspector), find and delete any tags which I know cause problems (Facebook "like" buttons, for example) (I also try to delete script tags at this stage because it's easier), and copy as HTML (right-click the <html> tag). Paste this into (replacing) the downloaded HTML file (remember to keep the DOCTYPE, which doesn't get copied);
Search all HTML files for any remaining script sections and delete them (also delete any noscript content), and search for " on" (note the leading space) to remove inline handlers (onload, onclick, etc); a console sketch after this list shows one way to automate the script and handler cleanup;
Search for images (src=, url()), find common patterns in the image filenames and use regular expressions to replace them globally, e.g. replacing /images/ with nothing so that src="/images/myimage.png" becomes src="myimage.png". This needs to be applied to all HTML and CSS files. Also make sure the CSS files have the correct path (href). While doing this I usually replace all href values (links) with #;
Finally open the converted page in a browser (actually I tend to do this early on so that I can see if any change I make causes it to break), use the Console tab to check for 404 errors (images that didn't get downloaded or had a different name) and the Network tab to check if anything is still being loaded from the online version;
For any files which didn't get downloaded I go back to the original page and use the Resources tab to find them and download manually;
(Optional) Cull any content which isn't needed (tracker images/iframes, unused CSS, etc).
It's a big job. I'd love a tool which automated all that, but so far I haven't found one. The pages I download are quite badly made (shops) which have a lot of unusual code, so that's why there are so many steps. You might not need to follow every step.
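For what it's worth, a rough console sketch of the script and handler cleanup described above (just an illustration; run it in the DevTools console on the original page before copying the markup):

document.querySelectorAll('script, noscript').forEach(function (el) {
  // drop every script and noscript element
  el.remove();
});
document.querySelectorAll('*').forEach(function (el) {
  // strip inline handlers such as onload and onclick
  Array.prototype.slice.call(el.attributes).forEach(function (attr) {
    if (/^on/i.test(attr.name)) {
      el.removeAttribute(attr.name);
    }
  });
});
// copy() is a DevTools console helper; this puts the cleaned markup on the clipboard
copy('<!DOCTYPE html>\n' + document.documentElement.outerHTML);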
I have downloaded an aspx webpage and saved it as HTML. I open it in IE and Chrome, it takes time to load, and some parts are missing. All the text is there, but the onmouseover is not working properly and some CSS is not displaying correctly. Was the content not downloaded completely, i.e. is it missing some JavaScript, CSS or something else?
I have done what you describe on many occasions for the purposes of putting together a prototype of new functionality in an existing application.
You will likely need to do a couple of things:
Ensure the paths to your JS and CSS resources are right (removing the unnecessary JS files, if any)
Also, you will likely need to update the paths in your CSS to any image resources on your page (a rough before/after sketch follows below)
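For example, a hypothetical before/after, assuming the saved copy lives next to its downloaded assets (the folder names are made up):

<!-- before: root-relative paths that only resolve on the live server -->
<link rel="stylesheet" href="/assets/site.css">
<script src="/assets/site.js"></script>

<!-- after: relative paths pointing at the locally saved copies -->
<link rel="stylesheet" href="assets/site.css">
<script src="assets/site.js"></script>

The same idea applies inside the CSS itself, e.g. changing background-image: url("/images/banner.png") to url("images/banner.png").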
I am building a website using PHP and JavaScript, and I feel that I have a good grasp of where to include my JavaScript, but a more specific situation has come up that has me confused. I currently have all of my JavaScript in one external file, which is included on every PHP page.
Let's say that I have a paragraph with an id='myParagraph' and I need to highlight this paragraph in red with JavaScript on page load. This paragraph is only on ONE PHP page and my website has about 50 different pages. I immediately assumed that I should throw some code into my one external JavaScript file, something like:
$('#myParagraph').css('color', 'red')
and the paragraph would be highlighted when that page loads.
My question is: is this the best way to do it? To my understanding, every time I load a page it will be searched for an element with the id myParagraph, yet 98% of my pages won't even have that id. Is this wasteful? Should I instead include the following code:
function highlightParagraph()
{
$('#myParagraph').css('color', 'red')
}
in my one JavaScript file and then put some inline JavaScript in the PHP file with the id myParagraph to call the function highlightParagraph() when it's loaded? That way, only the one page with myParagraph will be searched and highlighted.
I feel like option 2 is the best, but I read all the time not to use inline JavaScript.
edit: I realize that for this example you would just use CSS. I'm just using it to get my question across
You should have one "big" JS file with the infrastructure functions, and all the pages should reference it.
Then each page should also reference another JS file containing only the functions related to that page (sketched below the list).
The good things about using external js files are:
The files are cached after the first download => Faster surfing.
Separation of concerns: you keep the presentation tier away from the scripting tier.
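A minimal sketch of that split (the file names here are hypothetical):

<!-- on every page -->
<script src="js/common.js"></script>
<!-- only on the page that actually contains #myParagraph -->
<script src="js/paragraph-page.js"></script>

// js/paragraph-page.js
$(function () {
  $('#myParagraph').css('color', 'red');
});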
Another important note:
The best way to change CSS is with CSS... not JavaScript.
If you are only changing the element's style on DOM ready, just add the CSS rule:
#myParagraph { color: red; }
The problem with inline JavaScript is that you might be starting with a few lines now, but in a few weeks or months it will add up to a lot of inline JavaScript.
That is bad, because inline JavaScript can't be cached by the browser like the JavaScript files you include with <script src="path/to/file.js"></script>.
That's bad because you add a lot of content that will be fetched on every single page view, adding load on your server's bandwidth and slowing page load for the user.
If it's just a few selectors, don't worry; the time wasted on it won't cause any browser to sweat.
Though, if it becomes a lot of code for a different page/module of your site, you might want to split it into a different JavaScript file and include just that file when certain pages are loaded.
That way, the browser will cache that file and save that bandwidth for you and the user.
I wouldn't be too surprised if many people disagree with me (violently, even), but I don't have a problem with putting a script tag with page-specific JavaScript in the header if it will reduce the number of files or the overall complexity of the project. Most of the core things that are used everywhere should of course be separated into another file, but if it is a one-page deal, then I would go for cleanliness.
The same goes for CSS: if it is specific to that page, just put a style tag in the header with the specific changes that differ from the master CSS file. BTW, as everyone is pointing out, this is a case where you want to just use CSS.
I am developing an eshop. On the products page I added some JavaScript-based filtering by category. However, a problem arises if a category has a lot of products.
This link does something similar to what I do:
http://www.snowandrock.com/sunglasses/snowboard/fcp-category/list?resetFilters=true
However, that page is painfully slow and is over 2 MB!
Each of my products needs about half a kilobyte, but the images are the problem.
So I am looking at how to lazy load images.
Since my page has pagination, unlike that site, I think that loading only the images visible on the current page is a solution. The problem, however, is how to do it so that it works for both JavaScript and non-JavaScript users.
The only solution I have thought of is somehow storing the image link on a CSS class of the image for the non-visible products and, if a product is shown after filtering, changing the image src via JavaScript.
Non-JavaScript users don't have this problem, as clicking on a filter would navigate them to another page.
Any other ideas?
Here are four options for you:
Use a background image
Kangkan's background answer has this covered.
If that doesn't work for you, I'm assuming you only need help with the JavaScript-enabled stuff, since you said the non-JavaScript users will see a different page.
Use a plug-in
Paging has been done. You've said in a comment that you're using jQuery. There are lots of jQuery plug-ins for paging. Find one you like, and use it. They will be of varying quality, so you'll want to test them out and review their code, but I'm sure there's a decent-quality one out there.
Server-side Paging
This is where the main page loads either without any products at all, or with only the first page of products. Typically you'd put all of the products into a container, like this:
<ul id='productList'>
</ul>
Then you'd have the usual UI controls for moving amongst the pages of results. You'd have a server-side resource that returned HTML snippets or JSON-formatted data that you could use to populate that list. I'll use HTML for simplicity (although I'd probably use JSON in a production app, as it would tend to be smaller). Each product entry is its own self-contained block:
<li id='product-001'>
<div>This is Product 001</div>
<img src='http://www.gravatar.com/avatar/88ca83ed97a129596d6e8dd86deef994?s=32&d=identicon&r=PG'>
<div>Blurb about Product 001</div>
</li>
...and then the page returns as many of these as you think is appropriate. You request the page using Ajax and update the product list using JavaScript. Since you've said you use jQuery, this can be trivially simple:
$('#productList').load("/path/to/paging/page?start=X&count=Y");
Here's an example prototype (not production code); it fakes the Ajax because JSBin was giving me Ajax issues.
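A rough sketch of wiring paging controls to that load call (the element ids, URL, and page size are assumptions for illustration):

var start = 0, count = 20; // page offset and page size

function loadProducts() {
  // replace the list contents with the requested slice from the server
  $('#productList').load('/path/to/paging/page?start=' + start + '&count=' + count);
}

$('#nextPage').click(function () {
  start += count;
  loadProducts();
});

$('#prevPage').click(function () {
  start = Math.max(0, start - count);
  loadProducts();
});

loadProducts(); // initial page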
One big page download, then client-side JavaScript paging
I'm not sure how you're doing your filtering, but if you have an element that contains the product information, you can store the image URL in a data-xyz attribute on it:
<div id='product-123' data-image='/images/foo.png'>
Then when your code makes that visible, you can easily add an img to it:
var prod, imgsrc, img;
prod = document.getElementById('product-123');
prod.style.display = 'block'; // Or whatever you're doing to show it
imgsrc = prod.getAttribute('data-image');
if (imgsrc) {
img = document.createElement('img');
img.src = imgsrc;
prod.appendChild(img); // You'd probably put this somewhere else, but you get the idea
prod.removeAttribute('data-image');
}
Edit In a comment elsewhere you said you're using jQuery. If so, a translation of the above might look like this:
var prod, imgsrc, img;
prod = $('#product-123');
prod.show();
imgsrc = prod.attr('data-image');
if (imgsrc) {
$("<img/>").attr('src', imgsrc).appendTo(prod); // You'd probably put this somewhere else, but you get the idea
prod.removeAttr('data-image');
}
No need to remove it again when hiding, since the image will already be shown, which is why I remove the attribute once we've used it.
The reason I've used the data- prefix is validation: As of HTML5, you can define your own data-xyz attributes and your pages will still pass validation. In earlier versions of HTML, you were not allowed to define your own attributes (although in practice no major browser cares), and so if you used your own attribute for this, the page wouldn't validate.
References (w3.org):
Embedding custom non-visible data with the data-* attributes
getElementById
createElement
getAttribute
removeAttribute
appendChild
Off-topic, but a lot of this stuff gets a lot easier if you use a JavaScript library like jQuery, Closure, Prototype, YUI, or any of several others to smooth over the rough edges for you. (You've since said you're using jQuery.)
If you simply wish to have the images load slowly while the rest of the page loads first, you can set the images as backgrounds instead of using the <img> tag. With the <img> tag, the image is requested at page-load time, so the page load becomes slow. Background images, however, load after the page is shown to the user, so the user can read the text and see the images appear after some time.
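A minimal sketch of that background-image approach (the class names and paths are made up):

<!-- markup: an empty element instead of an <img> tag -->
<div class="product-thumb product-123"></div>

/* stylesheet: the image arrives via CSS rather than an <img> request */
.product-thumb {
  width: 135px;
  height: 135px;
  background-repeat: no-repeat;
}
.product-123 {
  background-image: url("images/product-123.jpg");
}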
I'm fairly certain it's not possible in plain HTML without some kind of Javascript intervention.
After all, if it were possible to do it without scripting, why would anyone have implemented it in JavaScript in the first place?
My question is: how many visitors do you get these days who don't have JavaScript enabled? I bet it's very few. And in any case, those people are used to sites not being fully functional when they have JavaScript disabled; your site will actually be better than most if the only difference they have to put up with is slower loading speed.
(PS - I presume you're using jQuery's LazyLoad plugin for the JavaScript-enabled people?)
I'd suggest implementing a responsive image approach in order to avoid huge image files on devices which cannot display them properly (or where a human can't tell the difference).
I wrote the following code for my own site, using jQuery:
1. Give every image where you want lazy loading the same class name, say "async"
2. Move the real image location from the 'src' attribute to the 'alt' attribute
3. After the page finishes loading, the script copies all 'alt' values back into 'src'
Look at the example. This is a full working HTML sample:
<html>
<head>
  <script src="http://code.jquery.com/jquery-1.9.1.js"></script>
  <script type="text/javascript">
    $(document).ready(function () {
      // copy the real image URL from the alt attribute into src
      $('img.async').each(function (i, ele) {
        $(ele).attr('src', $(ele).attr('alt'));
      });
    });
  </script>
</head>
<body>
  <!-- note: no src attribute; the script above fills it in from alt -->
  <img class="async" title="Horoscopes" alt="http://virtual-doctor.net/images/horoscopes.jpg" width="135" height="135"/>
</body>
</html>
You can see the speed difference on the live site where I use it:
http://virtual-doctor.net/
Browser-level support
Modern browsers can load images lazily using the loading="lazy" attribute:
<img src="image.png" loading="lazy" alt="…" width="200" height="200">
For more information, visit here.
EDIT: I reread your question and noticed you also want this to work for people with JavaScript disabled! In that case my answer is not acceptable, but I'll leave it for the record.
Here are some JavaScript libraries for image lazy loading.
They help you load the images only when the elements 'would' be in view, by simply changing the image src attribute.
github.com/toddmotto/echo and toddmotto.com/echo-js-simple-javascript-image-lazy-loading : plain JS, IE8+, 2KB, only 5 contributors (github.com/toddmotto/echo/graphs/contributors)
github.com/dinbror/blazy/ and dinbror.dk/blog/blazy : plain JS, IE7+, 1.2KB (minified and gzipped), only ONE contributor (github.com/dinbror/blazy/graphs/contributors)
github.com/tuupola/jquery_lazyload and www.appelsiini.net/projects/lazyload : jQuery dependency, MIT License, tested with Safari 5.1, Safari 6, Chrome 20, Firefox 12 on OSX and Chrome 20, IE 8 and IE 9 on Windows, and Safari 5.1 on iOS 5 on both iPhone and iPad; 18 contributors (https://github.com/tuupola/jquery_lazyload/graphs/contributors)
github.com/luis-almeida/unveil and luis-almeida.github.io/unveil : special: a lightweight version of Lazy Load (github.com/tuupola/jquery_lazyload); IE7+, jQuery dependency, MIT License, 5 contributors (https://github.com/luis-almeida/unveil/graphs/contributors)
github.com/shprink/BttrLazyLoading and bttrlazyloading.julienrenaux.fr : special: dedicated to responsive designs; jQuery dependency, IE9+, MIT License, 3KB (minified and gzipped), 5 contributors (github.com/shprink/BttrLazyLoading/graphs/contributors)
Important: I am still investigating which of these JavaScript libraries is best to use. Do your homework, I'd say: take some time to find the best tool for the job. My requirements are usually: license, dependencies, browser support, device support, weight, community, and history.