There are many ES6 features that look great, like the => syntax, the Map object, and much more.
To be honest, I'm kind of tired of checking whether addEventListener is supported because of IE8's attachEvent, and I wouldn't like that kind of pain coming back into my life.
So how would you deal with these new possibilities (or how will you, let's say, in a year or so)? Would you not use them for basic actions, but to add another layer of extra functions? Would you use them only for apps that you know will run in browsers that support them? Would you wait until there is at least 90% support?
I understand these are great features, but for short- to medium-term usage it seems you'd need to double your code, checking for support and falling back.
Any enlightenment on this subject?
EDIT: Please don't mark this as a duplicate. Notice I'm not asking how to check for support; I'm asking whether it is wise to start using these features or better to wait. I'm also asking whether the support check is the best option, not how to do it, and whether there are other ways to proceed while designing your code.
tl;dr: Make use of transpilers and polyfills.
Whether or not you should use new features primarily depends on your target environment and how exactly you are using new features. E.g. if you are targeting only the latest browser version, then you won't have an issue. Have to support IE8? That could be more difficult.
In general though, you should start using new features as soon as possible, and make use of tools that help you with that.
There are two aspects to look at:
New APIs
New syntax constructs
APIs
New APIs can often (but not always) be polyfilled, i.e. you include a library which checks whether certain parts of the API exist (e.g. Map) and provides an alternative implementation if they don't.
These alternative implementations may not be 100% equivalent or as performant as a native implementation, but I'd say they work for 95% of all use cases.
The nice thing about polyfills is that you will automatically be using the native browser implementation if it is available.
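As a rough sketch of that pattern (illustrative only; real polyfill libraries such as es6-shim are far more thorough and spec-compliant), a check-then-fill for Map could look like this:

```javascript
// Sketch of the polyfill pattern: prefer the native API and install
// a fallback only when it is missing (illustrative, not spec-complete).
function createMapFallback() {
  // Naive O(n) fallback backed by two parallel arrays.
  function MapShim() {
    this._keys = [];
    this._values = [];
  }
  MapShim.prototype.set = function (key, value) {
    var i = this._keys.indexOf(key);
    if (i === -1) { this._keys.push(key); this._values.push(value); }
    else { this._values[i] = value; }
    return this;
  };
  MapShim.prototype.get = function (key) {
    var i = this._keys.indexOf(key);
    return i === -1 ? undefined : this._values[i];
  };
  MapShim.prototype.has = function (key) {
    return this._keys.indexOf(key) !== -1;
  };
  return MapShim;
}

// Use the native Map when available, the shim otherwise.
var MapImpl = typeof Map !== 'undefined' ? Map : createMapFallback();
```

Code that always goes through `MapImpl` (or through the patched global) transparently gets the faster native implementation wherever one exists.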
Syntax
Making use of new syntax constructs, such as arrow functions or classes, is a bit more complex (but not much). The biggest issue is that browsers that do not support the syntax cannot even evaluate your code. You can only send code to the browser that it can actually parse.
Fortunately, many of the new syntax elements, such as arrow functions, are really just syntactic sugar for things that are already possible in ES5, so we can convert ES6 code into its ES5 or even ES3 equivalent.
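For example, here is an arrow function next to the ES5 code a transpiler would emit for it (the exact output varies by tool; this is a hand-written equivalent):

```javascript
// ES6: var double = x => x * 2;
// ES5 equivalent a transpiler would emit:
var double = function (x) { return x * 2; };

// Arrow functions also bind `this` lexically; a transpiler rewrites
// that with an alias variable instead:
function Counter() {
  this.count = 0;
  var _this = this; // inserted by the transpiler
  // ES6: this.inc = () => { this.count++; };
  this.inc = function () { _this.count++; };
}
```

The `_this` alias is exactly the "var self = this" trick ES5 developers wrote by hand, which is why this desugaring works all the way down to old browsers.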
Several such tools, called transpilers, have emerged over the last one or two years. Note that the transpiler has to convert your code before it is sent to the browser. This means that instead of simply writing a JS file and including it directly in your page, you need a build step that converts the code first (just like we have in other languages, such as C or Java).
This is different from how we wrote JS a couple of years ago, but having a build step has become increasingly accepted by the JS community. There are also many build tools that try to make this as painless as possible.
One drawback, unlike with polyfills, is that you won't magically start using the native features once they become available. So you could be stuck shipping the transpiled version for a long time, until all your target environments support all the features you need. But that's probably still better than not using the new features at all.
You can use BabelJS or Google Traceur.
You have to include in your build process a step that transforms ES6/ES7 code into JavaScript compatible with today's browsers, like a gulp or grunt task. Babel has a list of supported tools here.
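A minimal gulpfile for such a build step might look like this (a sketch; it assumes the gulp and gulp-babel packages are installed, and the preset name depends on your Babel version):

```javascript
// gulpfile.js - transpile src/ to ES5 before shipping (sketch).
var gulp = require('gulp');
var babel = require('gulp-babel');

gulp.task('scripts', function () {
  return gulp.src('src/**/*.js')
    .pipe(babel({ presets: ['@babel/preset-env'] })) // ES6+ -> ES5
    .pipe(gulp.dest('dist'));
});
```

Your page then includes the files from dist/, never the ES6 sources directly.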
Judging by the ES6 compatibility table found here,
most shims and transpilers only implement fewer than 70% of ES6 features, so why should someone use Babel/Traceur when JavaScript ES6 is now pretty much supported by default in Chrome, Safari and Firefox?
I mean, if I were a developer at, say, Babel, surely it would be my number 1 priority to make sure ES6 and even ES7 features are implemented before the competition.
Or am I missing something here?
Most shims and transpilers only implement fewer than 70% of ES6 features, so why should someone use Babel/Traceur when JavaScript ES6 is now pretty much supported by default in Chrome, Safari and Firefox?
Because some of my applications still need to support down to IE9 (we finally got our client to raise the bar to IE9). And it's not a lack of knowledge on our client's part; it's that the users of that platform actually use this browser.
Especially when your users are companies (or their employees), IE (not even Edge) is often still the standard.
"am I missing something here?" -Yes. You are missing decades of arguments about the relative merits of dynamic and static typing. Get to reading. – Jared Smith
@Daniel, none of your edits changes the point made by Jared.
JS will probably never provide static type checking or compile-time errors, because that would be a substantially different approach to writing and publishing code.
I mean, if I were a developer at, say, Babel, surely it would be my number 1 priority to make sure ES6 and even ES7 features are implemented before the competition.
Yes, I like using ES6 and use a transpiler when I need to, but I still consider ES7 features unstable and a work in progress. If the final implementation differs from the current version, I'd have to revisit and check every project I had used them on. That's too much uncertainty to actually use them in production, so ...
I (as one of Babel's users) actually don't care whether one transpiler or another already supports these latest features, unless a final standard has been published.
Although sometimes I like playing around with these new features and like to take a look at their benefits, possibilities, limits and problems, and how they transpile into current code.
In general, transpilers exist to convert code written for one environment to code written for a different environment. This can be done for converting code between entirely different languages, or merely between different versions of a language.
So early on when ECMAScript 6 was new and the browsers didn't support much or any of it, there were transpilers developed to let you use ECMAScript's new language features in browsers that only supported the "legacy" ECMAScript features.
Even though the newest browsers now support most or all of the new features, little has changed. You'd still use a transpiler to support those same previous implementations that didn't support those features.
At some point, a developer may drop support for certain browsers, which would mean that at that point they could stop transpiling the code. This decision is going to be made differently by different individuals, and so the transpilers will continue to be used for years to come.
I've integrated BabelJS into my workflow. This allows me to use ES6 features. I'm using gulp to convert my ES6 JavaScript to ES5.
I imagine that it would be better, though, to just use my ES6 code directly in newer browsers that support it. Is there a way to check for the availability of ES6 and use a BabelJS converted file only as a fallback?
Of course there is, but it's a lot of hard work. Similar approaches are used to redirect a mobile client to a dedicated URL, but do you really want to map each feature used in your code base and then check each and every one in the client?
Stick with transpiling client code for now. It might be better in the future.
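For completeness, the coarse version of such a check - "can this engine parse ES6 syntax at all?" - can be done by feeding a syntax sample to the Function constructor and then loading one bundle or the other (a sketch; the bundle file names are made up):

```javascript
// Coarse capability check: can this engine parse some ES6 syntax at all?
// (A per-feature check would need one such probe per feature.)
function supportsES6Syntax() {
  try {
    // Probes default parameters, arrows, let, destructuring,
    // and template literals in one go.
    new Function('(a = 0) => { let [b] = [a]; return `${b}`; }');
    return true;
  } catch (e) {
    return false; // old engines throw a SyntaxError while parsing
  }
}

// A loader would then pick which bundle to inject (file names made up):
var bundle = supportsES6Syntax() ? 'app.es6.js' : 'app.es5.js';
```

Note this only tells you the syntax parses, not that every ES6 API behaves correctly, which is exactly why per-feature mapping gets so laborious.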
Is there a version of quickcheck that works for Javascript and that is well maintained? I have found several such as check.js and claire, but none of them seem to support shrinking of failing test cases, which has always struck me as the most useful part of the whole problem.
I'm the creator of jsverify. I'll constantly try to make it better; bug reports and feature requests are welcome.
There is also a list of other JavaScript generative testing libraries in the readme. So far I haven't found any good alternative to jsverify.
I recently released https://github.com/dubzzz/fast-check
I built it in order to address several limitations I encountered in the existing quickcheck implementations in JavaScript.
It comes natively with a shrink feature that can shrink even combinations of arbitraries (the frameworks I tried failed on oneof-like arbitraries).
It also can generate large objects and arrays.
By default it tends to try smaller values first in order to detect trivial edge cases, while covering all the possible inputs in subsequent runs.
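To illustrate what shrinking means in general (a toy sketch, not fast-check's actual algorithm): when a random value fails a property, the runner retries smaller candidates and reports the smallest one that still fails:

```javascript
// Toy shrinker for integers: halve toward zero while the property
// keeps failing, and report the smallest still-failing value found.
function shrinkInt(value, property) {
  var smallest = value;   // `value` is assumed to be a failing input
  var candidate = value;
  while (candidate !== 0) {
    candidate = Math.trunc(candidate / 2);
    if (!property(candidate)) {
      smallest = candidate; // still fails: keep shrinking from here
    } else {
      break;                // passes: stop at a locally minimal failure
    }
  }
  return smallest;
}

// A property that (wrongly) claims every integer is below 100.
var prop = function (n) { return n < 100; };
var minimalFailure = shrinkInt(1000, prop); // 1000 -> 500 -> 250 -> 125
```

Real libraries shrink much more cleverly (per-type strategies, shrinking inside combined arbitraries), but the goal is the same: a small, readable counterexample instead of a huge random one.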
Lots of other features are on-going or already available :)
Cheers,
Nicolas
I wrote quick_check.js, which has a nice library of generators. Shrinking is not there yet, but is planned.
There seems to be a dearth of good quickcheck-like testing tools in JavaScript. However, they tend to be better supported in typed languages, and you can in fact write your tests in one of those languages if you wish.
To avoid dealing with runtime interop, I'd recommend going with a language that compiles to JS and runs on node.js (e.g. PureScript with purescript-quickcheck), or a Java-based language using the Nashorn engine provided in Java 8, for example ScalaCheck. You could even use ghcjs and the original flavor of the quickcheck library!
Reset.css files are used to resolve browser inconsistencies when it comes to styling.
Is there something similar for JavaScript inconsistencies across browsers like a reset.js?
For example, this "reset.js" library would define String.prototype.trim as specified in this question, since (among other things) IE8 does not support it.
I know libraries like jQuery can be used to overcome these inconsistencies but having something like a reset.js could help when using 3rd party JavaScript libraries that do not use jQuery.
Yes, there are polyfills to do exactly that. But there are so many things you'd need to fix that you can't put all fixes in one single script :-)
Have a look at the HTML5 Cross Browser Polyfills list.
If you're specifically interested in EcmaScript compliance, there are ES5 shims to retrofit missing or incorrectly implemented methods like String::trim; yet they can't fix the engine bugs (identifier-keywords, NFEs, …).
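The String::trim case is a good illustration of the shim pattern (a simplified version; es5-shim handles more whitespace code points and edge cases):

```javascript
// The replacement implementation an old engine would get;
// \uFEFF and \xA0 cover whitespace characters some engines missed.
function trimFallback(str) {
  return String(str).replace(/^[\s\uFEFF\xA0]+|[\s\uFEFF\xA0]+$/g, '');
}

// Conditional shim: only patch when the method is missing (e.g. IE8),
// so engines with a native trim keep their faster built-in.
if (!String.prototype.trim) {
  String.prototype.trim = function () { return trimFallback(this); };
}
```

Because the patch is conditional, the same script is safe to ship to every browser.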
I don't know of a single library that is/does exactly what you asked for. As far as I know, quite a lot of people are against 'patching' the built-in JavaScript objects, and some libraries (e.g. ExtJS) that did this in previous versions have changed and now deliver the functionality through custom utility functions.
On the other hand, there are a ton of smaller and larger shims that bring missing functionality to older browsers, especially ones dealing with HTML5 inconsistencies.
I am trying to get the hang of ESx (Harmony?) proxies. I think I know the basics now, but I don't think I'm capable of taking advantage of them.
Has anyone managed to use them for any good? I don't know of any library that has done so.
Proxies are a rather strategic feature that is primarily intended for implementing bindings or advanced library abstractions. Don't worry if you don't see an immediate use case for your own code. In fact, if you did, you should think at least twice before using them -- more often than not they are overkill, and there is a simpler and more efficient way to achieve the same thing.
There are a couple of examples on the original proposal page: http://wiki.ecmascript.org/doku.php?id=harmony:proxies
For more info on proxies, check out this article by Assistant Professor Tom Van Cutsem. Together with Google's Mark Miller, Tom actually played a key role in the proposal of proxies for inclusion in a future ECMAScript standard during his work at the es-lab project.
Further, note that DirectProxies.js has been superseded by the new reflect.js shim.
Finally, check out Sindre Sorhus's negative-array project for an example of simple use case.
UPDATE:
Today, it's almost 5 years since this question was asked. While proxies became part of the ECMAScript 2015 standard (aka ES6), many browsers still haven't implemented them:
If you want to know which browsers support proxies by the time you're reading this answer, take a look at CanIUse.
Things have evolved a bit! Firefox supports proxies natively. Using the implementation in harmony-reflect.js, you can try proxies according to the proposed Direct Proxies specification. This works with current Firefox or the latest Chrome.
Possible use case: you have an object that represents a node in a graph, containing an id, a type, and arbitrary other user-defined properties. The library drawing this node to the screen wants to save screen coordinates and similar data directly to this node as well, which could overwrite existing properties.
You can now hand the drawing library a proxy that catches access to the drawing-specific properties of the node and redirects them to an internal namespace property of the node - for example, drawing - to keep this data separate.
This way, all data belonging to one node is kept in a single place. There is no need to copy and transform it for different libraries, and perhaps later to change the same properties in different places.
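A sketch of that idea using the standardized Proxy API (the drawing namespace and the property names are invented for the example):

```javascript
// A graph node with domain data a drawing library must not clobber.
var node = { id: 1, type: 'task', drawing: {} };

// Hand the library a proxy that redirects every property read and
// write into the node's `drawing` namespace instead.
var drawingView = new Proxy(node, {
  get: function (target, prop) {
    return target.drawing[prop];
  },
  set: function (target, prop, value) {
    target.drawing[prop] = value; // writes land in `drawing`, not on `node`
    return true;
  }
});

drawingView.x = 120; // the library writes freely...
drawingView.id = 99; // ...even to names the node already uses
```

The drawing library keeps its natural property-access style, while the node's own id and type stay untouched.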
Most of the proxy "functionality" can kind of already be implemented in current JavaScript. For example, getters and setters can be written as explicit "setXXX" or "getXXX" methods.
The largest up-front advantages of proxies I can think of are:
Taking existing behavior that currently resides in evil browser objects and allowing it to be implemented in pure JavaScript (great for browser implementors and for people writing shims).
Giving you more freedom to change an implementation without breaking the interface (for example, if a property is removed from an object, you might put a magic getter in its place to avoid breaking old code).
That said, I'm still curious about what other nice things are possible with this new feature :)
First of all, I would like to point out that, according to http://wiki.ecmascript.org/doku.php?id=harmony:direct_proxies, direct proxies are the latest specification of ES proxies and are now part of the draft, which means this is what the ES6 standard will contain once it goes live. The latest Gecko engine already uses direct proxies: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Proxy
I wrote a little code using the new proxies to make the child nodes in the DOM available as properties of their parent node, and to multi-get, multi-set and multi-call them, similarly to query selectors. This cannot be done dynamically without proxies, and using them kept the whole code under one KB.
Generally speaking, proxies make prototyping and meta-programming dynamic and easy.
Fail-safe object access, without eval, for example:
someObj.someMethod.someFunction('param1');
// --> someObj doesn't exist (with fail-safe access, no error is thrown)
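A sketch of how a proxy can provide that kind of fail-safe chaining (the safe wrapper and its unwrapping behavior are invented here for illustration; it needs a real Proxy implementation):

```javascript
// Wrap any value so that arbitrarily deep property access and calls
// never throw; missing paths just keep returning a callable dummy.
function safe(value) {
  var target = function () {}; // callable, so missing methods can be "called"
  return new Proxy(target, {
    get: function (_, prop) {
      if (prop === 'valueOf') return function () { return value; }; // unwrap
      var next = (value != null) ? value[prop] : undefined;
      return safe(next);
    },
    apply: function (_, thisArg, args) {
      return safe(typeof value === 'function' ? value.apply(thisArg, args) : undefined);
    }
  });
}

var someObj; // doesn't exist (undefined)
var result = safe(someObj).someMethod.someFunction('param1'); // no error thrown
```

Calling .valueOf() at the end of the chain unwraps either the real result or undefined if any link in the chain was missing.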
I used ES6 Proxies together with ES6 Promises in a library in order to implement Lazy Loading:
https://github.com/Daniel-Abrecht/Crazy-Loading
ES6 Proxies currently work in Firefox 45, Edge 13 and Chrome 49.