Third-party cookie method becoming obsolete to track users - what is next? [closed] - javascript

Nowadays, the people and companies behind browser development are taking privacy seriously. They implement new security measures or simply change default browser behaviours which have been around for a long time and are now considered harmful to privacy.
One example of this is third-party cookies. While IE requires a P3P policy to be sent when setting a cookie from a third-party domain, other browsers block these cookies by default - or encourage the user to enable such a blocking option.
Also, if we consider extensions that help prevent tracking (AdBlock, Ghostery...), it is getting more and more difficult to track users (whether for legitimate reasons or not).
As a developer, I have found that there are some workarounds, such as ETag-based tracking, although, as you may already know, there are ways to prevent that kind of tracking as well. Local Storage, available in most modern browsers (those that support HTML5 and have JS enabled), is another way to accomplish this.
I would like to ask which method you find better and why. I feel that Local Storage could be the best replacement for third-party cookies, as it stores persistent data (it is not cleared when the browser is closed) and works in the vast majority of browsers - though still a smaller percentage than cookies. LocalStorage with a fallback to cookies seems like the best approach to me, but I would like to hear more opinions.
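For illustration, here is a rough sketch of the LocalStorage-with-cookie-fallback idea (the key name "visitor_id" and the helper getVisitorId are invented for this example, not an existing API):

// Returns a persistent visitor id, preferring localStorage and falling
// back to a plain first-party cookie when localStorage is unavailable.
function getVisitorId() {
  function newId() {
    return 'v-' + Date.now() + '-' + Math.random().toString(36).slice(2);
  }
  try {
    var id = window.localStorage.getItem('visitor_id');
    if (!id) {
      id = newId();
      window.localStorage.setItem('visitor_id', id);
    }
    return id;
  } catch (e) {
    // localStorage blocked or unsupported -- use a cookie instead.
    var match = document.cookie.match(/(?:^|; )visitor_id=([^;]+)/);
    if (match) return match[1];
    var fallback = newId();
    document.cookie = 'visitor_id=' + fallback + '; max-age=31536000; path=/';
    return fallback;
  }
}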

localStorage isn't getting the same heat as cookies simply because it's a "newer" technology. Give it time and I guarantee it will end up being blocked/removed the same way cookies are being blocked/removed.
So far, first-party cookies are relatively safe, though ultimately scripts like GA still make requests to the GA server, and as you said, there are many plugins/extensions/addons that block them.
But IMO the future will be in server-side tracking solutions. For example, when you go to a web page, that's a request to the server, and lots of basic info can already be grabbed from it. The JavaScript library would then send (AJAX) requests to the same server, not the 3rd-party tracking server. All of this data would then be forwarded to the 3rd-party tracking vendor (e.g. GA, Adobe Analytics, etc.) by a server-side script.
Many tracking scripts offer server-side solutions already, but they're little more than an API with (often) vague documentation, since going this route isn't as popular. So I think there will be a lot of development towards handling payloads from the client and making server-side requests more easily, making it almost as easy to implement as the current JS version.
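To make the idea concrete, here is a minimal sketch assuming Node.js with Express; the /collect endpoint name and the request fields are invented for this example. The page sends events to our own domain, and the server forwards them to the vendor - here via Google Analytics' Measurement Protocol - so the browser never talks to a 3rd-party tracking server directly:

const express = require('express');
const https = require('https');
const querystring = require('querystring');

const app = express();
app.use(express.json());

app.post('/collect', (req, res) => {
  // Build a Measurement Protocol hit from what the page sent us plus
  // what the request itself tells us.
  const payload = querystring.stringify({
    v: 1,                               // protocol version
    tid: 'UA-XXXXXX-Y',                 // your GA property id (placeholder)
    cid: req.body.visitorId || 'anon',  // visitor id supplied by the page
    t: 'pageview',
    dp: req.body.page || '/',
    uip: req.ip,                        // pass the real client IP along
    ua: req.get('User-Agent')
  });

  // Forward the hit server-to-server.
  const fwd = https.request({
    hostname: 'www.google-analytics.com',
    path: '/collect',
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' }
  });
  fwd.on('error', () => {});            // tracking must never break the page
  fwd.end(payload);

  res.sendStatus(204);
});

app.listen(3000);

On the page itself, the only request is a first-party one, e.g. fetch('/collect', { method: 'POST', headers: { 'Content-Type': 'application/json' }, body: JSON.stringify({ visitorId: id, page: location.pathname }) }).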
The main sticking point is tying the info to a single visitor. That's the most important part of tracking cookies: a visitor ID that can tie all the activity together. The thing is, the alternatives (using combinations of IP and header info) aren't that far behind the accuracy of cookies once you account for cookies being blocked, so it's not a complete loss to stop relying on cookies in the first place. But I think this will also have the effect of more and more websites enforcing a login before a visitor does anything meaningful on their site. That lets them use your login ID as the visitor ID and would actually stand to increase accuracy.
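Continuing the Express-style sketch above, a crude server-side visitor ID from IP and header info might look like this. It is only an illustration - collisions and NAT make it less precise than a cookie ID - but it needs nothing stored on the client:

const crypto = require('crypto');

// Derive a rough visitor id from request attributes that tend to stay
// stable for a given user during a visit.
function visitorIdFromRequest(req) {
  const raw = [
    req.ip,
    req.get('User-Agent') || '',
    req.get('Accept-Language') || ''
  ].join('|');
  return crypto.createHash('sha256').update(raw).digest('hex').slice(0, 16);
}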
But overall, it's more important to look at the trends in the numbers rather than the actual numbers, and from that point of view it's even less of a big deal. Unfortunately, a lot of people forget or don't understand this point.

Related

Choosing "unavailable" pickup point in online shop [closed]

I just discovered a bug in an online shop. The bug is that it is possible to change the HTML code (thanks to "inspect element") and make a previously unavailable pickup point available. As a consequence, I was able to order some items, pay, and even get a confirmation of my order. My question is: how can the owner prevent something like this?
P.S. During ordering I stayed on a single web page; there was no redirect to another page or refresh of the current one until payment.
P.P.S. I just want to mention that I'm a total newbie in these "magic" things, so perhaps you can recommend books/webpages etc. where I can read more about "server responses".
As you found out, editing the HTML code of a site and/or modifying the data sent to or from your browser is indeed not too difficult. That's part of how a browser is designed and intended to work, so you'll have to deal with this kind of "hacking" on the server side.
Here's a very superficial (and not complete) list of things to keep in mind when setting up your server and backend application:
Every request from outside ("the client") is potentially malicious or tampered with. → Make sure you use server-side validation for "everything". This may refer to:
Input fields (length, value, format, ...)
Data formats (e. g. correct JSON/XML structure)
User authentication and authorization
Your business rules (this is, I think, the decisive one in your example - probably everything else was valid, but the server side did not check the availability of the pickup point you injected; see the sketch after this list)
Thus, never rely on client-side validation (typically JavaScript / TypeScript) alone! You can use it for a better user experience, but the real "hard" validation must take place on the server side.
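As an illustration of that decisive business-rule check, here is a minimal sketch (Node.js with Express assumed; the /order route, field names and PICKUP_POINTS data are invented for this example, not taken from any real shop backend):

const express = require('express');
const app = express();
app.use(express.json());

// The authoritative availability data lives on the server,
// never only in the HTML that the browser renders.
const PICKUP_POINTS = {
  'point-12': { available: true },
  'point-47': { available: false }   // the one "unhidden" via inspect element
};

app.post('/order', (req, res) => {
  const point = PICKUP_POINTS[req.body.pickupPointId];

  // Business-rule validation: reject anything the client claims
  // that the server cannot confirm, no matter what the HTML said.
  if (!point || !point.available) {
    return res.status(422).json({ error: 'Pickup point not available' });
  }

  // ...validate items, prices and stock the same way, then create the order...
  res.status(201).json({ orderId: 'demo-123' });
});

app.listen(3000);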
Depending on the criticality of your site and the sensitivity of the data involved, think about adding more security by using a Web Application Firewall (WAF), rate limiting, log analysis and other techniques to identify and block suspicious traffic.
Keep your server software up to date: the operating system with all its libraries, the application server (like Apache / Nginx / WildFly / ...) and the software your site comprises (like a Spring / PHP / Angular / ... application). There are tools like Dependabot that help you automate this process. Outdated software and libraries may have known bugs an attacker can exploit.
Try to use standard software, frameworks and mechanisms wherever possible. Modern web frameworks like Spring Boot, Laravel, ... are well maintained, and security issues are found and fixed early. They also have validation and fraud-detection mechanisms built in already; you just have to make use of them. On the other hand, if you try to write your own authorization framework (for example), you'll most likely overlook something and leave a security gap.

Does google analytics slow down my website? [closed]

I am at the final stages of my website, and currently I need to find a suitable statistics application/tool.
I have looked into webalizer, but it seems outdated.
Also, I have looked into Google Analytics, but I am afraid that if I implement it, my website will slow down. It is already pretty heavy, with dynamic database content being displayed.
I have read that I can put the GA JS code at the bottom of the page so that the page loads first, but I still don't want a slowdown.
You are all much more experienced in statistics than I am, so I believe you can give me some good advice.
I have my own private server (Linux) and I have root access as well (of course).
Do you think I should have a statistics app on the server, without interfering with my website, or should I go the Google way and use Analytics?
Please give me good application names which you have tested etc...
Thanks
Any additional calls to scripts will slow down your site. However, Google Analytics instructs you to place it in a specific place so that it isn't loaded until the page has loaded. (It used to be before the </body> tag but I believe it's now supposed to be the last <script> in the <head> tag.) Don't worry about it too much; the benefits of analytics will far outweigh the extra call to a remote file.
Focus on other optimizations (database queries, CSS sprites, fewer HTTP requests). Analytics is necessary in today's site market and is indispensable; IMO it is not an option to forgo it.
As far as having your own "statistics app," I assume you're talking about building your own proprietary statistics codebase? I would discourage that, because it takes a lot of time and effort and in the end you will not have the same optimizations that Google has employed an entire project's worth of software engineers to make. Remember that while it's always great to create your own product, you don't have to reinvent the wheel, especially when it comes to things like this that have many sensible drop-in solutions that are widely available for free.
With respect to non-Google analytics solutions, one other of note is Clicky. I'm not as experienced with it as I am with GA, but I've heard many reviews that it is more precise and more informative than GA. However, just as an end-user browsing the web I've noticed a lot of times that its calls to Clicky's website do tend to slow down pages, and noticeably so; I cannot really say that I have seen the same effect with GA.
One last thing I would caution against is this: Do not employ more than one analytics solution unless you are trying to find the best one to suit your needs. It's just overkill to run two remotely-hosted analytics solutions on every single one of your pages, so what I would encourage you to do is try out a few for the first few weeks or so of your site (yes, pages will slow down during this trial phase) and then simply stick with the one that you like best. That will also give you the added benefit of being able to see first-hand what the speed implications are on your unique hosting environment for each script.
Here are some other analytics solutions that you might check out:
Piwik
Webtrends
GoingUp!
Yahoo! Web Analytics
Straight from Google's analytic sign up page (https://www.google.com/analytics/provision/)
"The appearance of your website will never be affected by your use of Google Analytics - we don't place any images or text on your pages. Likewise, the performance of your pages won't be impacted, with the possible exception of the very first page-load after you have added the tracking code. This first pageview calls the JavaScript on Google's servers, which may take slightly longer than a regular page load. Subsequent pageviews will use cached data and will not be affected."
Use the Asynchronous Snippet of Analytics:
http://code.google.com/apis/analytics/docs/tracking/asyncTracking.html
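For reference, the asynchronous snippet looked roughly like this at the time (UA-XXXXX-X is a placeholder for your own property ID). Commands are pushed onto the _gaq queue before ga.js has even loaded, and the script element is injected with async = true, so it never blocks rendering:

<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);
  _gaq.push(['_trackPageview']);

  (function() {
    // Inject ga.js asynchronously so it doesn't block page rendering.
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
             + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>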
People focus too much on total load times when what is important is render times, and in particular progressive rendering. If you use Google Analytics properly, it will load after the page has been shown to the user. So yes, it will add a small overhead to every request, but because the user can already see the page, they probably won't even notice. Just go for it.
Webalizer runs server-side off the Apache logs, doesn't it? That's why it appears outdated: it can't collect as much info as JS can. But it doesn't slow the user down at all. You could run Webalizer and Google Analytics together for a bit and see which serves your needs best.
We decided to work around the possibility of google's servers appearing to slow our site down. Instead of our users downloading the ga.js file from google's servers we store it locally. The only problem with that approach is that our local copy becomes outdated. So we wrote an application that periodically compares our local file to google's and updates our file accordingly.
Andrew
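A sketch of that sync job might look something like this (Node.js assumed; the local path and the daily schedule are illustrative, not how Andrew's actual application works):

const https = require('https');
const fs = require('fs');

const LOCAL_PATH = './public/ga.js';

// Download Google's ga.js and overwrite the local copy only when it changed.
function syncGaJs() {
  https.get('https://ssl.google-analytics.com/ga.js', (res) => {
    let body = '';
    res.on('data', (chunk) => { body += chunk; });
    res.on('end', () => {
      const current = fs.existsSync(LOCAL_PATH)
        ? fs.readFileSync(LOCAL_PATH, 'utf8')
        : '';
      if (body && body !== current) {
        fs.writeFileSync(LOCAL_PATH, body);
      }
    });
  }).on('error', () => { /* keep the old copy if the download fails */ });
}

syncGaJs();                                  // run once at startup...
setInterval(syncGaJs, 24 * 60 * 60 * 1000);  // ...then once a day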
Google Analytics is JavaScript-based and does not run on your server. All processing and storage are done on Google's servers, so it's ideal if you are worried about local resources.

Should your website work without JavaScript [duplicate]

This question already has answers here:
Do web sites really need to cater for browsers that don't have Javascript enabled? [closed]
(20 answers)
We're developing a web application that is going to be used by external clients on the internet. The browsers we're required to support are IE7+ and FF3+. One of our requirements is that we use AJAX wherever possible. Given this requirement I feel that we shouldn't have to cater for users without javascript enabled, however others in the team disagree.
My question is, if, in this day and age, we should be required to cater for users that don't have javascript enabled?
Coming back more than 10 years later, it's worth noting my first two bullet points have faded to insignificance, and the situation has improved marginally for the third (accessible browsers do better) and fourth (Google runs more js) as well.
There are a lot more users on the public internet who may have trouble with javascript than you might think:
Mobile browsers (smartphones) often have very poor or buggy javascript implementations. These will often show up in statistics on the side of those that do support javascript, even though they in effect don't. This is getting better, but there are still lots of people stuck with old or slow android phones with very old versions of Chrome or bad webkit clones.
Things like NoScript are becoming more popular, so you should at least have a nice initial page for those users.
If your customer is in any way part of the U.S. Government, you are legally required to support screen readers, which typically don't do javascript, or don't do it well.
Search engines will, at best, only run a limited set of your javascript. You want to work well enough without javascript to allow them to still index your site.
Of course, you need to know your audience. You might be doing work for a corporate intranet where you know that everyone has javascript (though even here I'd argue there's a growing trend where these sites are made available to teleworkers with unknown/unrestricted browsers). Or you might be building an app for the blind community where no one has it. In the case of the public internet, you can typically figure about 95% of your users will support it in some fashion (source cited by someone else in one of the links below). That number sounds pretty high, but it can be misleading; turn it around, and if you don't support javascript you're turning away 1 visitor in 20.
See these:
https://stackoverflow.com/questions/121108/how-many-people-disable-javascript
https://stackoverflow.com/questions/822872/do-web-sites-really-need-to-cater-for-browsers-that-dont-have-javascript-enabled
You should weigh the options and ask yourself:
1) what percentage of users will have javascript turned off. (according to this site, only 5% of the world has it turned off or not available.)
2) will those users be willing to turn it on
3) of those that aren't willing to turn it on, or switch to another browser or device that has javascript enabled, is the lost revenue more than the effort to build a separate non-javascript version?
Instinctively, I say most times the answer is no, don't waste the time building two sites.
My question is, if, in this day and age, we should be required to cater for users that don't have javascript enabled?
Yes, definitely, if the AJAX functionality is core to the working of your site. If you don't, you are effectively denying users who don't have Javascript enabled access to your website, and although this is a rather small proportion (<5% I believe), it means that they won't be able to use your site at all, because the core functions are not available to them.
Of course if you're doing more trivial things with AJAX that just enhance the user experience but are not actually central to the core functionality of the site, then this probably isn't necessary.
Depends really.
I personally switch off JavaScript all the time because I don't trust lots of sites.
However, since your users have explicitly asked for your application, you can assume they will trust it, and there is no point in doing extra work.
Moreover, if you have such a strong AJAX requirement, the question seems a bit odd.
This is a bit like beating a dead horse, but I'll have a go at it, sure.
I think there could be two basic approaches to this:
1. Using AJAX (and, basically, JavaScript) to enhance the experience of the users, while making sure that all of the application's features work without JavaScript.
When I follow this principle, I develop the interface in two phases - first without considering JavaScript at all (say, using a framework that doesn't know about JavaScript) and then augment certain workflows by adding AJAX-y validation (I don't like pure JS validation, sorry) and so on.
This means that if the user has JavaScript disabled, your app shall in no way break or become unusable for them.
2. Using JavaScript to its fullest, "no JavaScript - no go" style. If JavaScript is not available, the user will not be able to use your application at all. It is important to note that, in my opinion, there is no middle ground: if you try to be in both worlds at once, you are doing too much extra work. Removing the constraint of supporting no-JavaScript users obviously adds more opportunities to create a richer user experience, and it makes creating that experience much easier.
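A minimal sketch of approach 1 (progressive enhancement), assuming a hypothetical /subscribe endpoint and using modern APIs for brevity: the form posts normally when JavaScript is off, and is upgraded to an AJAX submit when it is on.

<form id="subscribe" action="/subscribe" method="post">
  <input type="email" name="email" required>
  <button type="submit">Subscribe</button>
</form>
<script>
  var form = document.getElementById('subscribe');
  if (window.XMLHttpRequest && window.FormData) {   // enhance only when we can
    form.addEventListener('submit', function (e) {
      e.preventDefault();                           // stop the full page reload
      var xhr = new XMLHttpRequest();
      xhr.open('POST', form.action);
      xhr.onload = function () { alert('Subscribed!'); };
      xhr.send(new FormData(form));
    });
  }
</script>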
I think that depends on the type of web application you are going to build. For example, in an e-commerce application the checkout process should probably work without JavaScript, because in our experience some people deactivate JS for checking out. In a web 2.0 application, in my opinion, it isn't necessary to support a non-JS browser experience.
Developing for both also complicates the development process and is more cost-intensive: you double your testing effort (testing with and without JS) and also have to think differently in the planning phase.
I think it depends on the market segment you're aiming for, if you're going for a tech crowd -such as Stackoverflow.com, or perhaps slashdot- then you're probably fine in expecting users to have JS installed and active.
Other sites, with a moderately tech-aware audience, may suffer from users who know enough about JS-based exploits to have deactivated JS, but not enough to use ScriptBlock (or a browser equivalent).
The non-tech-aware audience is probably with the tech crowd, since they likely just don't know how to disable JS - or why they might want to - regardless of the risk.
In short, you should cater to spiders without JavaScript enabled, but only to the degree necessary to index the data that you want to expose to the public. Your browser requirements of IE7+ and FF3+ exclude far more people than the total number of people who disable JavaScript. And of those who do disable it, the vast majority know how to enable it when necessary.
I asked myself the same question the other day and came up with the answer that in order to use my application one must have Javascript enabled. I also checked various Ajax powered sites. Even Stackoverflow.
But considering this fact, I also believe that you do need to support prehistoric setups to some degree. The main idea is not to let the application break when users don't have JavaScript enabled. The application should still display relevant data, but its functionality would be limited.
To add to some of the old discussion on this page. Google is now searching JavaScript: http://www.i-programmer.info/news/81-web-general/4248-google-now-searches-javascript.html
This is an issue that I was thinking about just a few days ago. Here is some information
In Google Chrome there is no way (menu/option) inside the browser to turn off Javascript.
Many websites including those from leading names like Google, etc., will not work without Javascript.
According to stats over 95% of visitors have Javascript enabled now.
These stats made me think. Do I have to break my back writing a lot of background code and everything for users who have disabled Javascript?
My conclusion was this. Yes, I have to include Javascript support, but not at the cost of sanity. I.e. I can afford to give it a low priority.
So I am going to have support for non-javascript browsing, but I will build most of it after my site is deployed.

Javascript Distributed Computing [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Want to improve this question? Update the question so it can be answered with facts and citations by editing this post.
Closed 4 years ago.
Improve this question
Why aren't there any Javascript distributed computing frameworks / projects? The idea seems absolutely awesome to me because:
The Client is the Browser
Iteration can be done with AJAX
Webmasters could help projects by linking the respective Javascript
Millions or even billions of users would help DC projects without even noticing
Please share your views on this subject.
EDIT: Also, what kind of problems do you think would be suitable for JSDC?
GIMPS for instance would be impossible to implement.
I think that Web Workers will soon be used to create distributed computing frameworks; there are some early attempts at this concept. Non-blocking code execution could have been done before using setTimeout, but it made little sense, as browser vendors have only recently focused on optimizing their JS engines. Now we have faster code execution and new features, so running some tasks in the background without the user noticing, as we browse the web, is probably just a matter of months ;)
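A bare-bones sketch of the idea, assuming a hypothetical worker.js work unit (the message shape is invented for this example): the page stays responsive because the computation runs off the main thread.

// main.js - hand a chunk of work to a background thread.
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
  console.log('chunk result:', e.data.result);
  // ...post the result back to the coordinating server here...
};
worker.postMessage({ start: 0, end: 10000000 });

// worker.js - runs off the main thread; no access to the DOM or window.
self.onmessage = function (e) {
  var sum = 0;
  for (var i = e.data.start; i < e.data.end; i++) {
    sum += Math.sqrt(i);          // stand-in for a real distributed work unit
  }
  self.postMessage({ result: sum });
};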
There is something to be said for 'user rights' here. It sounds like you're describing a situation where the webmaster for Foo.com includes the script for, say, Folding@home on their site. As a result, all visitors to Foo.com have some fraction of their CPU "donated" to Folding@home until they navigate away from Foo.com. Without some sort of disclaimer or opt-in, I would consider that a form of malware and avoid visiting any site that did that.
That's not to say you couldn't build a system that asked for confirmation or permission, but there is definite potential for abuse.
I have pondered this myself in the context of item recommendation.
First, there is no problem with speed! JIT compiled javascript can be as fast as unoptimized C, especially for numeric code.
The bigger problem is that running javascript in the background will slow down the browser and therefore users may not like your website because it runs slowly.
There is obviously an issue of security, how can you verify the results?
And privacy, can you ensure sensitive data isn't compromised?
On top of this, it's quite a difficult thing to do. Can the number of visits you receive justify the effort that you'll have to put into it? It would be better if you could run the code transparently on either the server or client-side. Compiling other languages to javascript can help here.
In summary, the reason that it's not widespread is because developers' time is more valuable than server time. The risk of losing user data and the inconvenience to users outweighs the potential gains.
First that comes to my mind is security.
Almost all distributed protocols that I know of use encryption; that's how they mitigate security risks. This subject isn't entirely new, though:
http://www.igvita.com/2009/03/03/collaborative-map-reduce-in-the-browser/
Also Wuala is a distributed system, that is implemented using java applet.
I know of pluraprocessing.com doing a similar thing - not sure if it's exactly JavaScript, but they run Java through the browser and it runs totally in-memory with strict security.
They have a grid of 50,000 computers on which they have successfully run applications, even things like web crawling (80legs).
I think we can verify results for some kinds of problems.
Let's say we have n items and need to sort them. We give the job to worker-1, and worker-1 gives us back the result. We can verify it in O(n) time, while producing the result takes at least O(n log n) comparisons. We should also consider how large n is (a concern about network speed).
Another example: f(x) = 12345, and the function f is given; the task is to find a value of x. We can test the answer by substituting the worker's result for x. I think problems whose results cannot be verified cheaply are difficult to hand out to someone else.
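For the sorting case, the O(n) check described above could look something like this sketch (verifySort is an invented helper): it confirms the claimed result is both ordered and a permutation of the input.

function verifySort(input, claimed) {
  if (input.length !== claimed.length) return false;
  var counts = new Map();
  for (var i = 0; i < input.length; i++) {
    counts.set(input[i], (counts.get(input[i]) || 0) + 1);
  }
  for (var j = 0; j < claimed.length; j++) {
    if (j > 0 && claimed[j] < claimed[j - 1]) return false;  // ordered?
    var c = counts.get(claimed[j]);
    if (!c) return false;                                    // permutation?
    counts.set(claimed[j], c - 1);
  }
  return true;
}

// verifySort([3, 1, 2], [1, 2, 3]) === true
// verifySort([3, 1, 2], [1, 2, 4]) === false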
The whole idea of Javascript Distributed Computing has number of disadvantages:
single point of failure - there is no direct way to communicate between nodes
natural failure of nodes - every node only works for as long as the browser stays open
no guarantee that a message sent will ever be received - a consequence of nodes failing naturally
no guarantee that a message received was ever actually sent - because an attacker can interpose
annoying load on client side
ethical problems
while there is only one (but very tempting) advantage:
easy and free access to millions of nodes - almost every device has a JS-supporting browser nowadays
However, the biggest problem is the correlation between scalability and annoyance. Let's say you offer some attractive web service and run computations on the client side. The more people you use for computing, the more people are annoyed. The more people are annoyed, the fewer people use your service. So you can limit the annoyance (computing), limit scalability, or try something in between.
Consider Google, for example. If Google ran computations on the client side, some people would start to use Bing. How many? That depends on the annoyance level.
The only hope for JavaScript distributed computing may be multimedia services. As long as they already consume lots of CPU, nobody will notice any additional load.
I think the no. 1 problem is JavaScript's inefficiency at computation. It just wouldn't be worth it, because an application in pure C/C++ would be 100 times faster.
I found a question similar to this a while back, so I built a thingy that does this. It uses web workers and fetches scripts dynamically (but no Eval!). Web workers sandbox the scripts so they cannot access the window or the DOM. You can see the code here, and the main website here
The library has a consent popup on first load, so the user knows what's going on in the background.

Why is JavaScript considered bad by some? [closed]

Why is JavaScript allowed to be disabled in the browser? (i.e. Why is it considered bad?)
<body onload="for(i=0; i<1000000; i++){ window.open('samplesite.com?pageid=' + i); }">
Why is javascript allowed to be disabled in the browser? (i.e. Why is it considered bad?)
Because it can be grossly misused (blinking images, anyone?), may slow the browser down and of course there's always the (very justified!) fear of exploited security holes.
First of all, with JavaScript you can trigger actions the user might not want, e.g. changing the size of the window...
On the other hand, think about people who are limited in some way. What if your user is blind and uses a screen reader while your page continuously changes its content somehow? There are many accessibility arguments against JavaScript...
Back in the day, it used to be:
A source of annoying cursor-following animations (I am sure you remember stuff like raining sheep or clocks following your cursor... I want to find the smart*** who thought of that and slap them with a trout)
Considered insecure
Something that served no purpose but to bog down the browser
However, over the years it has become more advanced and applied with more thinking behind it.
Historically, it has been a huge security problem for web-based services. Also, as with any technology that is exploitable and has a low technical barrier to entry, it ends up being the tool of low-brow troublemakers (script kiddies). A quick search for JavaScript or XSS in a security exploit database will show hundreds of pages of vulnerabilities.
JavaScript is often considered dangerous or at least annoying for two reasons:
Websites can suddenly do stuff that you don't want them to do, e.g. open popups
Websites can suddenly keep you from doing stuff that you want to do, e.g. disabling right-clicks
Now, in the vast majority of cases JavaScript is harmless and can really enhance the user experience (Ajax comes to mind). But all it takes is one malicious site that uses JavaScript to do evil (TM) things like Cross-site Scripting. For that reason it is commonly considered best practice to disable JavaScript globally and to allow it for just those sites or domains that you explicitly trust. In this day and age being paranoid on the Internet is actually a good thing.
It's a weakly-typed scripting language. Programmers who usually use "big strong" languages look down upon such nonsense. Shame on you for even considering using it, and my God have mercy on your soul.
It can cause security problems. Especially in old versions of IE (not so much anymore).
Or maybe it has something to do with Stallman's ranting ;-)
The main consideration is security. Drive-by downloads that exploit browser security holes via JavaScript are currently the most common way for malware to spread.
As well as what others have said, it confuses search engines. The more 'dynamic' content you add, the higher the chance it cannot be indexed. In addition, the Internet is used by many as a reference library. Books in a real library do not move things around while you are reading the page. You may think of your site as an "application", but your users may prefer to treat it as a "document".
In short, JavaScript obfuscates information, sometimes to the point of completely denying access (i.e., the JavaScript code is buggy and crashes). A classic example of this was that I was unable to watch the Live8 concert broadcast by AOL a few years back, because the JavaScript code was so poorly written that it didn't actually work on my girlfriend's AOL browser (ironic, I know). I tried to get to the movie URL directly, but the obfuscation was so complex I couldn't find it. It did nothing to endear me to AOL.
BTW, I happen to be one of those people who disable JavaScript by default. If I need it I can enable it for a specific site or page in 2 seconds (really) using the NoScript add-on for Firefox.
Some companies or business units have a policy of not allowing JavaScript to be turned on, because of concerns about the risk of security exploits - and that may be the biggest problem: since it can't be locked down securely, it must be disabled. If you could run JavaScript in a strict mode that doesn't allow AJAX requests, for example, you might find that more people are willing to enable it on computers where security is a concern.
As long as a user can go to a website, and information can be sent transparently over the Internet regarding what a user is doing, then these security concerns will exist.
For example, I could have a Firefox plugin that appears to be useful but could possibly send unwanted info to a website.
Because it shifts load from the server to the client and there is no way to control to what extent.
I work with JavaScript every day and respectfully acknowledge what it has made possible, but sometimes when I browse a very simple page and the interface reacts lightning fast because there is nothing to render but pure, simple HTML, I think that this was the original purpose of the internet. You can, and I am exaggerating only a little, browse such pages on a 600 MHz Pentium with 128 megabytes of RAM without problems. For a JavaScript-heavy, effect-laden "rich" website, on the other hand, you need massive resources on the client side for a halfway smooth experience, and you need to update your equipment almost as often as gamers do.
Also, I generally feel some, not hostility, but slight annoyance towards JavaScript, because it massively increased development costs by adding a host of incompatible target platforms, versions, obscurities and special cases to cater for, as well as a generally bug-prone, hard-to-debug and volatile environment to work in.
That said, I think the industry owes the creators of JQuery, Prototype and the likes big, big thanks, among many others.
JavaScript, as the inventor of JSON called it, is the virtual machine for the world. It's where billions of people are. This great exposure comes with some dangers other languages do not have to face.
Example: write a site that just 'redirects' you to another site where you can sign in. If you are not completely in control of your browser, URL etc., some JavaScript could simply have loaded the page content from another site and be logging your keystrokes. This can be achieved with a few lines of JavaScript. It's not really the fault (if it's a fault at all) of JavaScript, but of all the components involved (the browser, HTML, and this vast space we call the Internet).
Why is javascript allowed to be disabled in the browser? (i.e. Why is it considered bad?)
Because browsers are not perfect! And it gives you a way to keep yourself safe when you need it.
When a security risk is found, they will just post on their home page: "Please disable JavaScript until it's fixed."
Like this (I don't have the official page right now, so I googled one):
http://browsers.about.com/b/2009/07/16/firefox-3-5-users-should-take-action-immediately.htm
However, until a fix is released, I recommend that you either disable JavaScript completely or use another browser.
There are a few rare instances where JavaScript can be dangerous (but so can anything, including the massively ubiquitous Flash). The reason users actually do disable it or use addons like NoScript is largely unjustified paranoia.
In the end, users don't stick with behavior that breaks the websites they want to experience. So, I wouldn't expect JavaScript paranoia to be a long-term issue as only more and more sites depend on it (like this one).
It's similar to the hype we saw around cookies several years ago.
It can crash the browser, or do annoying things to users.
However, nowadays JavaScript has become such an integral part of the internet (Gmail, bill paying on many companies' sites, etc.) that if you did disable it, browsing could arguably be difficult for you unless you set up exceptions.
JavaScript has some very "odd" language features, like the handling of missing semicolons at statement endings by just ignoring the parse error ("semicolon insertion") or the behaviour of the typeof operator (array is an object).
You really need to know the language to know which things you should do and which are bad.
But there are also really good points about the language, like that it fully supports functional programming.
It is only bad if you visit questionable sites. Without JavaScript you wouldn't have apps like Gmail, Yahoo Finance, etc.
Why is JavaScript allowed to be disabled in the browser?
Perhaps because computers are tools that serve humans? Computers speaking to computers via a protocol can mandate specific behaviour. Developers writing software for users have no such luxury.
It would be pointless for browser vendors to mandate that JavaScript "must" be enabled, since there are plenty of people who can't or don't want it. Especially since 90% of the time it's just being used by some spotty hipster to animate a cat picture.
