Is it sometimes ok NOT to Degrade Gracefully? - javascript

I am in the process of building a video sharing CMS that uses lots of jQuery and ajax for everything from rich UI effects to submitting and retrieving data to and from the database. When JavaScript is disabled, everything falls apart and 90% of the functionality doesn't work.
I am beginning to think it's ok not to degrade gracefully for certain types of sites like this one, which uses a Flash player to stream the main content - the videos. So what would be the point of going to great lengths to enable dual support on everything else if the main content of the site can't be viewed? Even YouTube breaks with JS disabled.
I'm planning to release the CMS under open source license, so the question is:
For mass distribution (and for this type of site) is not degrading gracefully a good idea?

As long as you make it clear to the users that they need JS enabled, it is ok for it to "fall apart" without JS. However, if you give no indication that it shouldn't work without JS, then people will get angry. Most people nowadays expect sites to require JS in some aspect of their functionality.
For something as complex as a CMS with videos, it is the user's fault if they don't enable JS. They shouldn't expect something like that to work without JS, and even if they do, it's probably not worth your time maintaining two versions of your site: JS and non-JS, especially for something that is open source.

Seeing as your application relies on javascript for its entire purpose, it is impossible for you to degrade gracefully. As long as your site clearly tells the user to enable javascript to get all of your awesome functionality, and maybe some links as to how to do so in different browsers, you should be fine. :D

You are essentially choosing an audience. It's not unlike deciding whether to support IE6. It's not right-vs-wrong, it's simply a question of what percentage of your audience you are willing to lose, in exchange for ease of development on your end.
That said, I find progressive enhancement (of which graceful degradation is an outcome) to be an efficient and safer way to develop. Do the HTML first, make it work, then add JS as sugar on top.
It's not likely that one of your users is not running Javascript. What is likely, speaking for my humble self, is that you will have some small JS error which kills everything. (JS tends to just stop on exceptions, you may have noticed.)
It's nice to know that, in the event of such an error, your users can still use the site. That's what graceful degradation is for, in my opinion.

Graceful degradation doesn't mean "everything works fully in every browser", it means "if your browser can't handle something, you see something sensible instead of broken junk".
In your case, simply detecting that the site will not work and displaying a nice error page explaining what is required would be an acceptable form of graceful degradation.

If you're a perfectionist, there's nothing wrong in letting people w/o JS know what's going on, as opposed to just letting the website break. Here's a quick how-to: How to detect if JavaScript is disabled?
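One low-effort sketch of that idea (the redirect target is a made-up URL, not from the original answer): put a meta refresh inside a noscript block in the head, so visitors without JavaScript land on a static page explaining what is required, while everyone else never sees it.

<head>
  <noscript>
    <!-- Only acted on when scripting is off: send the visitor to an explanation page. -->
    <meta http-equiv="refresh" content="0; url=/javascript-required.html">
  </noscript>
</head>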

Related

Website doesn't work with Javascript turned off

So head to www.jabsy.com, with Javascript turned off.
Basically, I use some jQuery UI dialogs, I use Javascript for all the bindings on the page... I pretty much use it for everything. Is that really a bad thing though?
Nothing really works without Javascript. Not even the Google Maps API.
Should I go out of my way to try and make the entire page work without Javascript? Is that even possible with my site? I wouldn't even know where to begin as I use Javascript for everything, so could I get some pointers? How many users actually turn off their Javascript these days?
Would it help to let the user know if they have Javascript turned off, make them turn it on before accessing the site, and provide them with directions on how to do so?
Yes, if your site requires JavaScript you need to let the user know that it is required.
For example:
<noscript>
<div>
You need to have JavaScript enabled to use this site.
</div>
</noscript>
You can provide more description as appropriate. A savvy user that sees this text is going to be able to then go in and turn on JavaScript for your site. A non-technical user might have trouble, but I would think most of them would be running with JavaScript enabled anyway (?).
According to data collected in 2007, about 3% of users in the US have JavaScript off. I'm sure that number is lower today.
It really depends on how critical the sections of your page that require JavaScript are. If there is a form that is mission critical, but controlled completely by JavaScript, you probably want to engineer a way for that form to do the same thing with JS on and off.
However, if you have animated snowflakes on your background (for the love of God, don't really do this), it's not going to negatively affect someone visiting your site with JavaScript off.
Really, it all comes down to how important the information or actions are to your site. Turn off JavaScript and note all the things you can't do that are absolutely vital, then make them work.
Keep in mind there are several audiences that will not render your JavaScript:
Screen readers/accessible browsers
Console-based browsers (Text based browsers)
Search Engines (Google)
Your specific service (location-based messages) will be way too cumbersome to use without JavaScript (and its content is dynamic). Therefore, I see no problem requiring it. You should, however, point out that JavaScript is necessary to use your site (Preferably at the top, in really large letters). You can do that by including the alternative no-JavaScript content in noscript tags, i.e.
<noscript>
<div style="font-size: 200%;">You need JavaScript!</div>
</noscript>
However, most websites are content-based, like a company's homepage, stackoverflow or Wikipedia. These websites should be usable without JavaScript. Nowadays, even smartphones have excellent JavaScript support, but Kindle and regular phones are still too slow for JavaScript.
There is a line of argument that says sites should work without JS. Personally, I think that is tosh, unless you have a clientele for whom this is liable to be an issue. JS is a reasonable thing to expect for many sites.
However, it is polite to let people know that this is a requirement, and inform them rather than just letting it not work. If your site is heavily JS dependent, then you may have made some mistaken design decisions, but it is probably not worth re-working it. If you monitor the number of people who get the "you need js" message, you will identify if it is proving a turn-off. I suspect it will not be an issue.
So build based on what you need, BUT tell people if they need to have things set.
You can use the <noscript><!-- html here if no Javascript --></noscript> tags and place content to be rendered in between if javascript is turned off.
I don't think there are many sites these days that will work without it. It's more or less mandatory.

When is it OK to use Javascript and when not?

Is it good practice not to use much javascript/jquery? Should we avoid it as much as possible (for good accessibility)?
When is it OK to use JavaScript and when is it not in web design and development? In what scenarios and with what conditions?
Update:
I'm asking regarding public websites.
I have to respectfully disagree with the posters that say that you shouldn't use JavaScript, or use it sparingly, or have it degrade gracefully.
The reason is that the vast majority of people nowadays have JavaScript enabled and appreciate the desktop-like experience it can provide from a website. Really, who doesn't have JavaScript enabled? People act as if this is a statistically significant group. It is not.
Not using JavaScript is a little bit like nitpicking about variable sizes (oh, I can use a 16-bit integer here instead of 32-bit to save some memory). Unless you are doing some monster project for hundreds of thousands of people, where the ROI of the time you spend on making your website degrade gracefully is actually positive, you should use JavaScript as freely as you like. The two people that can't access it because they disabled it are paranoid and probably not the kind of people you want as customers anyways.
Just my 2 cents.
+1 to everything Mr. Expert said.
One more thing to add: it is not good for accessibility to have critical functions of your website rely on JavaScript. If JS is disabled in the user's browser, they should still be able to submit all forms, click all buttons, et cetera. Your website must degrade gracefully in the absence of JavaScript.
One note for forms:
Where possible, use the Hijax approach to submitting forms. Make them work using traditional page refreshes, and then use JavaScript to "hijack" the form submission and do it with AJAX instead. If the client has AJAX disabled, the forms will still work fine.
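A rough sketch of that Hijax pattern (the element IDs and the jQuery usage are illustrative assumptions, not part of the original answer): the form posts normally when JavaScript is off, and script takes over when it is available.

// Assumes jQuery is loaded and that #comment-form already posts to the server on its own.
$(function () {
  $('#comment-form').on('submit', function (event) {
    event.preventDefault(); // stop the full page refresh
    $.post($(this).attr('action'), $(this).serialize(), function (html) {
      $('#comments').append(html); // drop the server's response fragment into the page
    });
  });
});

Without JavaScript the browser simply submits the form and the server responds with a full page, so nothing is lost.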
Before I say anything, let me make it clear that all of this is relative - it's all about YOUR TARGET AUDIENCE. The answer can be on opposite ends of the spectrum if the majority of your target audience is disabled people in Africa and my target audience is gamers in South Korea.
First, look at the ratio of JS enabled vs JS disabled in your target audience. For an average website, it is 100:1.
Second, consider bandwidth. jQuery minified and gzipped is 24K. But do all browsers work properly with gzipped contents? Choose the right UI framework or choose whether to use one at all, depending on your target audience's bandwidth. If your target audience is young people with heavy-usage broadband plans, they won't complain if the framework is megabytes in size. But when your website targets remote villages in some country for a relief effort or educational program or something then avoid such frameworks - they can barely get access to the Internet.
Third, for accessibility, two things are important:
Anyone should be able to see/hear/know the contents in your website.
Anyone should be able to perform all important functions in your website.
Once you take care of these using the minimum denominator technologies for YOUR target audience, you can always use javascript to pretty up things and enhance existing basic functions (auto complete, AJAX submit, etc...)
To sum it up, gracefully degrade.
Accessibility aside, I don't agree that we should gracefully degrade in the case where someone has JS disabled!
For desktops with browsers, saying "When people don't have javascript enabled, your website should degrade gracefully" is like saying "Your game engine should gracefully degrade to DirectX 6 because some people use Windows 95." It doesn't make sense anymore. Note the word anymore. It used to make sense when JavaScript was only there on 50% of browsers and it was an emerging technology.
Does anyone have a good reason why my 3D game should be able to degrade gracefully and use DirectX 6? It's moot. What DOES make sense is: my game uses DirectX 11 on Windows 7 but degrades gracefully and uses DX10 on Vista or even DX9 on XP.
Come on, look at some stats. The JS enabled to disabled ratio is something like 100:1.
Again the whole thing changes if 80% of your audience uses some upcoming web browser in a mobile device with shaky JS implementation to see your website.
If majority of your target audience/device has JS enabled, use it well. If they don't have, then don't. You just have to give them what they can use and see.
There will always be a minority, but if there is a pre-requisite to see a website and it is fairly widespread, they should have it installed/enabled or else it's too bad for them. You certainly don't want paranoids in your target audience.
End of the day, only you will have the information that will help you decide how much you should use JavaScript. It is always dictated by your target audience and their devices.
If you are developing a simple website, then you should only use JavaScript to enhance the user experience, and it should degrade gracefully for those that do not have it enabled. If your website is content-centric, then that's how you should treat it: content first, JavaScript-based bells and whistles second. There should not be a single piece of required functionality that does not work without JavaScript enabled.
However, if you are making a web application, then go nuts with it. Web apps are supposed to use JavaScript, so it doesn't make sense to cater to people who have it disabled; if they really want to use your product, they will enable JavaScript (or use a different device). It's not worth all the trouble making it work without JavaScript enabled. That's like arguing that you should not give your video game good graphics because lower-end computers will not be able to run it: the people who really want to play your video game will upgrade their machine.
mhr's answer, "Always, as long as it degrades gracefully", is a good baseline. I would add that reasonable exceptions can be made where Javascript provides application functionality (your site is a "web app" rather than purely informational) that has no server-side equivalent. So for example, "graceful degradation" as a rule should not prevent you from building a web-based drawing tool (which would be, at best, unusably onerous if it degraded gracefully to forms and server-side functionality). It should, however, prevent you from requiring Javascript to access any content which that drawing tool publishes to an audience other than the content creator (because the content creator, self-evidently, has already accepted that Javascript is required for their usage of the site).
With all my respect to Mr J. Nielsen: conservatism in design can be really senseless in terms of evolution and progress. When Flash first appeared (with the first versions of ActionScript, really take-away programming), a lot of noisy, glossy, over-animated interfaces arose, most of them almost impossible to operate in terms of usability. But the fact is that, from all those futuristic experiments, some qualitative improvements have arisen in terms of UI.
With javascript something similar happened: what was merely a widget is becoming more and more popular, even transcending the presentation layer to handle some business logic: RIAs (Rich Internet Applications) are gradually relying on javascript for the user experience.
To use or not to use javascript?
I think every tool oriented to improving usability and interaction is welcome in a brand new world that evolves continuously and is still far from having a defined shape, direction or even a simple plan underneath.
What others are saying here is completely true: it is not worth thinking of the one or two that still have javascript disabled, just as game producers don't care too much if your computer stinks and keep pushing the limits of graphics. Thanks to this, we are no longer playing PACMAN (only if Google wants) and we can enjoy Assassin's Creed.
NASA is a great example of how not to use JavaScript in a public website - they appear to be using JS to serve browser specific style sheets resulting in an unstyled mess with JS disabled.
The primary use of JavaScript is to write functions that are embedded in or included from HTML pages and interact with the Document Object Model (DOM) of the page. Some simple examples of this usage are:
Opening or popping up a new window with programmatic control over the size, position, and attributes of the new window.
Validation of web form input values to make sure that they will be accepted before they are submitted to the server.
Changing images as the mouse cursor moves over them: This effect is often used to draw the user's attention to important links displayed as graphical elements.
Because JavaScript code can run locally in a user's browser (rather than on a remote server) it can respond to user actions quickly, making an application feel more responsive.
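As a tiny illustration of the validation case above (the element IDs are invented for the example; the server must still validate on its own, since this only runs when JS is on):

// Run this after the form exists in the DOM, e.g. at the end of the body.
document.getElementById('signup-form').addEventListener('submit', function (event) {
  var email = document.getElementById('email').value;
  if (email.indexOf('@') === -1) {
    alert('Please enter a valid email address.'); // quick local feedback
    event.preventDefault();                       // keep the bad value from being submitted
  }
});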
On a public website it's ok to use JavaScript as long as the information that your site contains is still available and usability is still good for people without JS.
If you're in a more controlled environment like a company's intranet or something, you can maybe rely a lot more on JavaScript.
Also, if your site is more an application than a document and its functionality is just not possible without JS, you can of course use it.
You can also just accept the fact that some people will have problems using your site but rely heavily on JS anyway. That's your choice.
With javascript we can have a client/server relationship with users, in the sense that we can use their CPU power to build the page and free some resources on our servers. We deliver the code and data and their browsers put them together, and it is as it should be. :)
Well, I don't agree with your point that we shouldn't use JavaScript and jQuery regularly.
Nowadays browsers are compatible with JavaScript, and if you have to develop a website in which validation is required, you should prefer JavaScript (for client-side validation).
Your site should degrade gracefully, if only for one reason: NoScript. That Firefox extension depends on whitelisting to allow sites to run Javascript. That implies that sites that I've never visited before will not be allowed to run Javascript. It's a good way to prevent a lot of phishing attacks and cross site scripting (XSS) attacks.
Sites have to earn my trust first. Yes, Ebay and Amazon may heavily depend on Javascript, but a site like www.buyviagracheap.com may not.
And if, with Javascript disabled, you have nothing to show for, I'll press "back" quickly before you can say "but...", and never come back.
If, however, I like what I see, I may enable Javascript for your site and improve my experience. So there's no need to provide full alternative functionality with Javascript disabled.

Web Development: Do we still need to support non-javascript users?

Background: I'm working on an e-commerce website. It was my original intention to add JavaScript on top of regular html pages, so that users with JS support got the added benefits, but users without it could still use the basic html forms to add things to their cart, to search, etc.
I've run into a few situations though, where there simply isn't a sane way to implement certain functionality in a non JavaScript way.
One example is chained attribute selects on product pages (where the color choices change based on the size chosen, because not all sizes come in every color). Even if I didn't use AJAX, it would still require JavaScript to dynamically change the options.
The only alternatives to JavaScript that I can think of would be:
A. Have an add to cart "wizard" where you have to step through each attribute choice on a separate page (yuck!)
B. Put each size/color variation out as a separate product (and force the customer to wade through the category page to find the color size combo that they want)
...And while both of the above would work regardless of whether the user has JavaScript on or not, they both punish the JavaScript users by restructuring the page and forcing them to use an interface designed for the lowest common denominator.
So the question is, since JavaScript has taken a much larger role in web development than it did a few years back, and the design pattern for an AJAX/JS application/site is so much different now than a 'classic' web design pattern, do we still go out of our way to support non JS users? Or do we say, "To hell with you! Update your browser, turn on JavaScript or go shop elsewhere"?
I'd be interested to see other developer's take on this.
I think it really depends on your target audience. I work for a company that has several types of websites, some are focused on your avg guy or gal who's into gaming. And our stats show us that the vast majority of these people have javascript enabled.
We also have a site that's focused at developers, and many of these developers won't allow javascript to run on a site unless they trust it. I've seen that as many as 20-30% of the browsers on that site don't run javascript.
So it's very subjective.
IMHO, it's very reasonable to use vast amounts of tasteful javascript to enhance an otherwise mundane experience. However, I also think that when possible it should gracefully degrade. This form of degradation isn't too hard to achieve (in most cases) as long as you consider it when you're designing things.
The most important non-javascript user is Google. Do not forget that.
When it comes to things like Ajax, or any javascript for that matter, I think it's best to:
"Plan for Ajax from the start; implement Ajax at the end" - Jeremy Keith
This means intercepting (hijacking) links and forms using (unobtrusive) JavaScript, making your code degrade well for those that don't have javascript enabled. If you want to show a fancy slider, make it a link that tells the server to show the div when you reload the page, and tell your javascript to do something different when the link is clicked.
These ideas are your safest bet for a functional, but fun website.
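A sketch of that link-hijacking idea (assuming jQuery is already on the page; the selector, panel id, and query parameter are placeholders): the link works as a normal page load without script, and JavaScript upgrades it to an in-page toggle.

<!-- Without JS this is an ordinary link; the server sees ?panel=open on the
     next page load and renders the details panel expanded. -->
<a href="?panel=open" id="toggle-panel">Show details</a>

<script>
  $('#toggle-panel').on('click', function (event) {
    event.preventDefault();            // skip the round trip to the server
    $('#details-panel').slideToggle(); // show or hide the panel in place
  });
</script>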
I think that supporting non-Javascript users is absolutely necessary for any site aiming at some kind of "normal" target group (i.e. not gamers or techies).
There is an increasing number of mobile devices accessing, and trying to parse, normal content.
Many corporate networks still block scripting for security reasons - you don't want to lose the occasional employee shopping from work, either.
Javascript tends to screw up any attempt at accessibility. In my mind, creating sites that are as accessible as possible is a service to our fellow human beings.
I'm not saying I'm lily white on this. I hate replicating functionality that I just managed to achieve in my JS framework in static HTML, or making it degrade gracefully. But I think it still really, really is a must, and not merely a question of profitability. This is something worth investing a bit extra, or putting a bit of unpaid work into.
You can exclude users that don't use javascript; that will be some mobile devices, the truly paranoid, Lynx users, and any users not running the version of javascript you write for.
If you are willing to go with that then I would suggest you have a static html page that has some message that javascript is required.
When your javascript has loaded and the DOM tree is ready, you can replace this message with the rest of the webpage, so it is never seen.
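A minimal version of that technique (the element IDs are just examples): ship the notice visible and the application hidden, then swap them once the DOM is ready.

<div id="js-required">This application requires JavaScript.</div>
<div id="app" style="display: none;"><!-- the real interface goes here --></div>

<script>
  document.addEventListener('DOMContentLoaded', function () {
    // JavaScript is clearly running, so hide the notice and reveal the app.
    document.getElementById('js-required').style.display = 'none';
    document.getElementById('app').style.display = '';
  });
</script>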
But, you may want to see how you can get the functionality, even if limited, for non-javascript browsers.
For example, for colors you can use a horizontal dropdown that can work on all but older IE browsers: http://www.alistapart.com/articles/horizdropdowns/
If you want to use javascript to make your life simpler then that may be a poor reason, but, if you are doing a photoshop webapp then you will need javascript.
NOTE I would suggest having it work with and without javascript, as an e-commerce site will want to not exclude any customers, I expect.
Much of it depends on your audience. As you said years ago JavaScript was used in a different way, and a bit of an annoyance even. Now with AJAX and increased functionality it's a must, and most people have it turned on.
You could say that someone with Javascript disabled is used to stuff not working pretty regularly, and that they are a small minority.
However, if you're building a site that will be frequented by older computers or people on limited bandwidth (such as foreign traffic) you might want to consider working around them. Also a site that is heavily visited by mobile browsers might be another one to focus on.
Take a look at your analytics and see what the current usage is, and profile your audience to really find out.
The best answer will depend on a simple comparison: Estimate the extra money you will spend creating the non javascript alternative site. Estimate the money you will make selling products to your non javascript enabled customers. Compare. If you are a huge shop, then getting sales from that last 2% or 10% of users might be worth it. If you are just one guy, maybe you have more profitable ways to spend your time.
I found this interesting and fact driven post, it might help. Why we should support users without Javascript. Following is a summary:
Some people choose to turn JS off.
JS fails occasionally, HTML/CSS does not.
JS is designed to and should be the icing on the cake, not a patch-job for bad HTML/CSS.
In theory, yes; but in practice, no.
In theory it's in the spirit of the web to support hardware and software with a range of capabilities and configurations, scaling site features appropriately.
In practice, even mobile browsers are converging on the sweet spot occupied by the current major desktop browsers. Users on the outside can typically switch to an alternate browser or device in a pinch.
While it seems logical, progressive, simpler, and more efficient to require Javascript for an e-commerce site, you should ask yourself if you are willing to forego x percent of the business that would be generated by non-Javascript users, and weigh that against lower development costs.
The percent of business lost is likely not the percent of non-Javascript users, because a smaller percentage of non-Javascript visitors are likely to purchase goods or services. The percentage of lost business will probably be somewhat less than the percentage of non-Javascript users.
Focusing on advanced features for the large audience is better than spending time finding workarounds for non-javascript users and those who use obsolete platforms.
In my opinion there are 2 things you have to consider when thinking about using JavaScript on a website:
Is Google still able to crawl all the content
If some parts of the website are not usable without JavaScript, then show a very clear message to non-JavaScript users explaining why the site is not working for them
I think we need to support non-javascript users again.
There are users, including me, who don't disable JS out of any kind of paranoia, but because some sites are so JS heavy nowadays that my browser is slowed down to a grinding halt. And this is getting worse recently, so I expect more users will do this eventually. I wanted my performance back, so I installed browser plugins that let me selectively enable only the absolutely necessary scripts on a site. So I can have a dozen tabs open again without performance problems.

Is graceful degradation in the absence of JavaScript still useful?

When even mobile browsers have JavaScript, is it really necessary to consider potential script-free users?
Yes. Your web pages aren't just consumed by people: they're consumed by search engines, and crawlers, and screenscrapers. Most of those automatic tools don't support Javascript, and essentially none are going to generate UI events or look at deeply nested AJAX data. You want to have a simple static HTML fallback, if nothing else then so that your web pages are well indexed by the search engines.
Forget the crazies who disable Javascript; think of the robots!
Yes.
People can (and do) browse with javascript disabled. If your site will work without users having to explicitly enable javascript for you, that makes them happy.
Exactly how relevant depends on your target audience, of course.
I would argue that you shouldn't go significantly out of your way to accommodate for non-JS users for the following reasons:
All Modern Browsers Support JS
This is a snapshot of browser usage today: http://www.w3schools.com/browsers/browsers_stats.asp
Even the oldest common browser, IE6, supports basic JavaScript and AJAX. If you decide not to integrate certain features b/c of a JS dependence, this proves that you are essentially doing it for people who started with JavaScript enabled and explicitly chose to disable it. I think these people should expect some features, and perhaps even entire sites, not to work as a consequence.
Few People Willingly Disable JS
Building on my point above, average web users don't know or don't care that JS can be disabled in browsers. It's largely a tech savvy crowd who knows how to do this (myself included), and as tech savvy users we should know when to turn it back on as well.
Cost of Support
In light of the above, consider that choosing to accommodate users who have primarily willingly disabled JS comes with a very real cost. If you are managing a large project with heavy UI requirements, you can easily burn a lot of developer hours accommodating what is a very small user preference. Check your budget. If it is going to take 2 devs working 40 extra hours each on the project to accomplish this feat, you are easily going to burn a few thousand dollars on what is essentially a non-issue for the vast majority of your users. How about using that time and investment to further buff up your core competency?
Precedence
I may very well be wrong on this, but I think it would be difficult to find major media or social sites that don't rely on JavaScript for some portion of their functionality to work. If major businesses that rely on the operation and accessibility of their site to stay in business aren't doing it, there's a good chance it's because it isn't needed.
CAVEATS:
Know your market. Continue to build XHTML/CSS that is semantic (preferably by using the RDFa W3C recommendation). Still strive to make your sites accessible to the visually impaired. Don't believe everything you read. ;)
DISCLAIMER:
My argument above is largely dependent on how you define "graceful degradation." If you mean all the links still work, that's one thing, but if you mean all the links still work and so does the wombats game, that's another. I'm not trying to argue for making your site so JS dependent that non-JS users can't access any portion of it. I am trying to make an argument for the acceptability of certain features, even some core features, being reliant on JS.
It is relevant and it will be relevant even after 10-20 years, when javascript might be supported everywhere. Making things work without javascript is an important development technique because it forces you to keep things simple and declarative. Ideally javascript should be used only to enhance the experience, but your website shouldn't depend on it.
There is a clear advantage from a maintenance point of view to having most of the code in a declarative format (html+css) and as little as possible in an imperative one (javascript).
My position:
I browse with NoScript, so if I come on your site it will be without benefit of Javascript. I don't expect the full user experience.
What I want, before turning on JS, is to be assured that you're reasonably competent and not malicious, and that I actually want what you're using JS for.
This means that, if you actually want me to use your site, you should allow me to look around, using links. (If I see a site that's totally useless without Javascript, I generally think the designers were incompetent.) You should let me know what sort of functionality I'll get from enabling Javascript, and you should present the site in a legitimate-seeming way.
I don't think that's too much to ask.
Graceful degradation / progressive enhancement / unobtrusive javascript is absolutely relevant!
As with all accessibility issues: just imagine for one second what it's like to be the one on the outside who can't use the page.
Imagine you're travelling around the world, you're in some hotel or internet café with really old computers, old software, old browsers, you want to look up your flight and you realize you can't because of some javascript incompatibility in the old browser you're using.
(Try 'old mobile phone' or 'stuck behind a corporate firewall' for different scenarios.)
Imagine what a world of possibilities opened up to blind people with screen readers and the web, and imagine what it's like to find these possibilities closed again because of javascript.
So much for appealing to your better nature.
You might also want to do it to keep your site accessible to search engines.
Yes, it's relevant. Mobile browsers in use today do not all have Javascript enabled. It's available on new phones, sure. But there are millions and millions of people like me, who have phones running older browsers, and for all of us, a JS-required browsing experience is just plain broken.
I don't even bother visiting sites that didn't have progressive enhancement in mind when they coded. I'm not technically behind the times. My phone is a year old. But I'm not going to re-up my contract and buy a new phone because of a crippled web experience.
It depends on who your target audience is. I have JavaScript turned off by default and turn it on when I know what the site's intent is.
It's generally much faster to browse with Javascript disabled (digg.com is lightning fast without JS), which is why it's popular.
In Opera it's really easy: you simply press F12 and untick the javascript option. I always browse without Flash, Java (not javascript), animated images and sound. I enable Flash on a per-site basis, eg YouTube. Sometimes I turn off JS temporarily if my system is slowing down.
And don't forget about:
Screen readers (I think they mostly have JS disabled)
Text browsers or other very old systems
Ad blockers (if your filename happens to land under their radar)
Any old browser that either doesn't support JS at all or the JS breaks (e.g. IE6 doesn't support some modern JS stuff).
The solution is to use progressive enhancement rather than graceful degradation, i.e. start with the basic HTML and add CSS. Then add Javascript and/or AJAX to parts of the site.
For example, if you had a site like Stack Overflow, voting up an answer could submit a form normally. If JS is enabled, it would do an AJAX request, update the vote count and cancel the form submission, without leaving the page. SO doesn't do this though...
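To make that concrete, a hedged sketch of how such a vote control could be layered (the markup, URL and JSON shape are invented for illustration, not how Stack Overflow actually works): the form submits normally without JS, and jQuery hijacks it when available.

<!-- Plain form: without JS the browser posts it and the server redirects back. -->
<form class="vote-form" method="post" action="/answers/42/upvote">
  <button type="submit">Vote up</button> <span class="count">12</span>
</form>

<script>
  $('.vote-form').on('submit', function (event) {
    event.preventDefault(); // stay on the page when JS is available
    var form = $(this);
    $.post(form.attr('action'), function (data) {
      form.find('.count').text(data.votes); // assumes the server answers with JSON like {"votes": 13}
    });
  });
</script>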
I for one always have NoScript turned on unless I trust the site for a number of reasons including cross-site-scripting, click jacking, and HTML injection. It's not me being paranoid, it's because I know a lot of developers, and know most of them have no idea what web security is, never mind how to avoid vulnerabilities.
So until I trust a site there's no chance I'd let it do anything fancy.
For the unfamiliar, there are some interesting blog entries on the subject:
Protecting Your Cookies: HttpOnly
Cross-Site Request Forgeries and You
Sins of Software Security
Top 25 Most Dangerous Programming Mistakes
I'm going to have to make a case for the other side here. People's reasons for designing sites without javascript are largely idealistic. Given enough time and money the goal is achievable and will certainly open your website to the largest possible number of people. However, in reality doing this will slow your development, increase the number of test cases that you have to deal with, and ultimately affect the quality of your application for those users that do use javascript.
In my opinion it is perfectly reasonable to choose to make your site only compatible with JS-enabled browsers and tell those users who don't have it that they are missing out. This allows you to concentrate on creating rich content that the majority of users will be able to view.
There are of course exceptions to this rule, but if you are looking to create a good website for the majority of users, or have a client who is after a flashy website with limited time or money then taking the decision that it is js enabled browsers only is a reasonable thing to do.
The real question is not whether it is relevant, but whether to use Graceful Degradation, or Progressive Enhancement as your scripting strategy.
I'm actually in an interesting position when it comes to graceful degradation of JS. I'm working on a web application that bots and crawlers have absolutely no business looking into. There's nothing they can glean that should be indexed.
The informational site accompanying the web application, however, should be indexed, and therefore JS degrades gracefully there.
In the web application, if you don't have JavaScript enabled, you're probably not supposed to be there. It's intended to be a rich interactive experience. The web application actually requires JS to be enabled, and for you not to be sitting behind a corporate firewall.
We're not serving anything malicious, its just our intent and purpose for the web application that's different. The goals of our web application and those of our informational site are completely different.
I use JavaScript. I always keep my browser up-to-date. But sometimes, my Internet connection is so bad that scripts just don't load.
There are also cases when:
Some scripts load, but others fail, in which case parts of a website stop functioning
Scripts are loading, but I want to hit "submit" without waiting for that fancy frilly menu
A script malfunctions because it was partially loaded and then cached at that half-stage
I'm in such a hurry that I just decide to use Lynx.
Now, I'm not saying my Internet is bad all the time, or even most of the time, but it does happen. With the Internet expanding rapidly to many rural areas across the world, I'm sure I'm not the only one. So apart from bots as Nelson mentioned above, it's another thing to keep in mind. (Hint: check your demographics).
If you don't want the page to work when Javascript is off, then just have that be the message in the html; if javascript is on, you can use unobtrusive javascript to get rid of that message and make the rest of the application visible.
Depending on what javascript version you write for, you may also need to degrade if the user's browser doesn't have the latest version, so gracefully handling that is important too.
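One hedged way to handle that last point is feature detection rather than version sniffing: test for the capability before relying on it, and fall back to the plain HTML otherwise. A sketch (initRichInterface is a hypothetical function standing in for whatever builds the scripted UI):

// Only wire up the fancy behaviour if the browser supports what it needs.
if (document.querySelectorAll && window.JSON && typeof JSON.parse === 'function') {
  initRichInterface(); // hypothetical: builds the scripted version of the page
} else {
  // Older engines keep the plain HTML version and never see the new code.
}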

Should your website work without JavaScript [duplicate]

This question already has answers here:
Do web sites really need to cater for browsers that don't have Javascript enabled? [closed]
(20 answers)
Closed 9 years ago.
We're developing a web application that is going to be used by external clients on the internet. The browsers we're required to support are IE7+ and FF3+. One of our requirements is that we use AJAX wherever possible. Given this requirement I feel that we shouldn't have to cater for users without javascript enabled, however others in the team disagree.
My question is, if, in this day and age, we should be required to cater for users that don't have javascript enabled?
Coming back more than 10 years later, it's worth noting my first two bullet points have faded to insignificance, and the situation has improved marginally for the third (accessible browsers do better) and fourth (Google runs more js) as well.
There are a lot more users on the public internet who may have trouble with javascript than you might think:
Mobile browsers (smartphones) often have very poor or buggy javascript implementations. These will often show up in statistics on the side of those that do support javascript, even though they in effect don't. This is getting better, but there are still lots of people stuck with old or slow android phones with very old versions of Chrome or bad webkit clones.
Things like NoScript are becoming more popular, so you should at least have a nice initial page for those users.
If your customer is in any way part of the U.S. Government, you are legally required to support screen readers, which typically don't do javascript, or don't do it well.
Search engines will, at best, only run a limited set of your javascript. You want to work well enough without javascript to allow them to still index your site.
Of course, you need to know your audience. You might be doing work for a corporate intranet where you know that everyone has javascript (though even here I'd argue there's a growing trend where these sites are made available to teleworkers with unknown/unrestricted browsers). Or you might be building an app for the blind community where no one has it. In the case of the public internet, you can typically figure about 95% of your users will support it in some fashion (source cited by someone else in one of the links below). That number sounds pretty high, but it can be misleading; turn it around, and if you don't support javascript you're turning away 1 visitor in 20.
See these:
https://stackoverflow.com/questions/121108/how-many-people-disable-javascript
https://stackoverflow.com/questions/822872/do-web-sites-really-need-to-cater-for-browsers-that-dont-have-javascript-enabled
You should weigh the options and ask yourself:
1) what percentage of users will have javascript turned off. (according to this site, only 5% of the world has it turned off or not available.)
2) will those users be willing to turn it on
3) of those that aren't willing to turn it on, or switch to another browser or device that has javascript enabled, is the lost revenue more than the effort to build a separate non-javascript version?
Instinctively, I say most times the answer is no, don't waste the time building two sites.
My question is, if, in this day and age, we should be required to cater for users that don't have javascript enabled?
Yes, definitely, if the AJAX functionality is core to the working of your site. If you don't, you are effectively denying users who don't have Javascript enabled access to your website, and although this is a rather small proportion (<5% I believe), it means that they won't be able to use your site at all, because the core functions are not available to them.
Of course if you're doing more trivial things with AJAX that just enhance the user experience but are not actually central to the core functionality of the site, then this probably isn't necessary.
Depends really.
I personally switch off JavaScript all the time because I don't trust lots of sites.
However, since your users have explicitly asked for your application, you can assume they will trust it and there is no point in doing extra work.
Moreover, if you have that strong an AJAX-affinity requirement, the question seems moot.
This is a bit like beating a dead horse, but I'll have a go at it, sure.
I think there could be two basic approaches to this:
1. Using ajax (and, basically, javascript) to enhance the experience of the users, while making sure that all of the application's features work without javascript.
When I am following this principle, I develop the interface in two phases - first without considering javascript at all (say, using a framework that doesn't know about javascript) and then augment certain workflows by adding ajax-y validation (don't like pure js validation, sorry) and so on.
This means, if the user has javascript disabled, your app shall in no way break or become unusable for him.
2. Using javascript to its fullest, "no javascript - no go" style. If javascript is not available, the user will not be able to use your application at all. It is important to note that, in my opinion, there is no middle ground - if you are trying to be in both worlds at once, you are doing too much extra work. Removing the constraint of supporting no-javascript users obviously adds more opportunities to create a richer user experience. And it makes creating that experience much easier.
I think that depends on the type of web application you are going to build. For example, in an e-commerce application the checkout process should probably work without javascript, because there are some people who deactivate JS for checking out (in our experience). In a web 2.0 application, in my opinion, it isn't necessary to support a non-JS browser experience.
Developing for both also complicates the development process and is more cost intensive. You have to double your web testing efforts (testing with and without JS) and also think differently in the planning phase.
I think it depends on the market segment you're aiming for, if you're going for a tech crowd -such as Stackoverflow.com, or perhaps slashdot- then you're probably fine in expecting users to have JS installed and active.
Other sites, with a medially tech-aware audience, may suffer from users knowing enough about JS-based exploits to have deactivated JS, but with not enough knowledge to enable Scriptblock (or other browser-equivalent).
The non-tech-aware audience is probably in the same position as the tech crowd, since they possibly just don't know how to disable JS - or why they might want to - regardless of the risk.
In short, you should cater to spiders without JavaScript enabled, but only to the degree necessary to index the data that you want to expose to the public. Your browser requirements of IE7+ and FF3+ exclude far more people than the total number of people who disable JavaScript. And of those who do disable it, the vast majority know how to enable it when necessary.
I asked myself the same question the other day and came up with the answer that in order to use my application one must have Javascript enabled. I also checked various Ajax powered sites. Even Stackoverflow.
But considering this fact I also believe that you do need to support prehistoric setups to some degree. The main idea is not to let the application break when users don't have Javascript enabled. The application should still display relevant data; its functionality would just be limited.
To add to some of the old discussion on this page. Google is now searching JavaScript: http://www.i-programmer.info/news/81-web-general/4248-google-now-searches-javascript.html
This is an issue that I was thinking about just a few days ago. Here is some information
In Google Chrome there is no way (menu/option) inside the browser to turn off Javascript.
Many websites including those from leading names like Google, etc., will not work without Javascript.
According to stats over 95% of visitors have Javascript enabled now.
These stats made me think. Do I have to break my back writing a lot of background code and everything for users who have disabled Javascript?
My conclusion was this. Yes, I have to include Javascript support, but not at the cost of sanity. I.e. I can afford to give it a low priority.
So I am going to have support for non-javascript browsing, but I will build most of it after my site is deployed.
