I chose Vue.js for a new project because it seems to run natively in the browser, as opposed to something like React that has to be compiled/transpiled via Node. Is there any reason I couldn't just link to a CDN like this in my production code?
<script src="https://unpkg.com/vue@2.2.1"></script>
A co-worker suggested that this may be for development only, and that unpkg simply transpiles on the fly (which doesn't sound good for performance). But other than that, it seems to work fine. I could also link to a more robust CDN such as this one, but I just want to make sure I'm not violating some best practice by not using a Node build system (e.g. webpack).
Is there any reason I couldn't just link to a CDN like this in my production code?
No, there is no reason not to use a CDN in production. It's even a preferred way of serving content in production, especially for common packages like jQuery, because most people will already have loaded, and therefore cached, that resource.
A co-worker suggested that may be for development only, and that unpkg is simply transpiling on the fly (which doesn't sound good for performance).
This is absolutely not true - that's why it is a CDN! :) It's a matter of choice, but keep in mind that you should usually work with the specific version of the library that you used during development. If you just pull in the latest version of any code, you become vulnerable to every change pushed to that repository, and your clients will start receiving updated code that you haven't tested yet.
Therefore, pin the specific version that you develop with, open a beer and have a good sleep :)
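For example, pinning with unpkg's @-version suffix looks like this:
<!-- pinned version: clients always receive the exact build you tested -->
<script src="https://unpkg.com/vue@2.2.1/dist/vue.min.js"></script>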
Updated (18.10.2022): Global caching no longer works
Actually, this change has been around for a while, but the answer was never updated. The short story is that caching now works per site. The longer version can be found here (thanks to @Baraka's comment).
Either way, using CDN for production deployment is still much preferred!
These maybe help:
<!-- development version -->
<script src="https://unpkg.com/vue"></script>
<!-- production version -->
<script src="https://unpkg.com/vue/dist/vue.min.js"></script>
And it will keep the current version of Vue.js automatically.
Related
import something from './something'
versus
<script src='https://code.jquery.com/jquery-3.4.1.js'></script>
I personally find that npm-installing something and using modules/compiling is a bit of a hassle when I can just use a remote CDN, or a local file, along with a script tag. But I'm wondering: is JavaScript moving towards import statements, or will script tags stay?
Are library providers moving to only use npm, or will their libraries also have a way to just src a single script file? Will they need to support both?
An initial <script> tag is necessary regardless - that is, your
import something from './something'
must be inside a <script type="module"> as well. So, at the very least, <script> tags will never completely go away.
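For example, a minimal sketch (the module path is illustrative):
<!-- import only works inside a script tag marked type="module" -->
<script type="module">
  import something from './something.js';
  console.log(something);
</script>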
The biggest problem with using import is that it requires browsers to understand the syntax. JS modules are not bleeding-edge anymore, but they're not supported everywhere either. See MDN's compatibility tables: not supported on IE at all (of course), and only supported on Chrome 61+ (released late 2017), with other modern browsers adding support around the same time. Once you can count on all of your intended users having browsers that support import, switching to import instead of importing a library via another <script> tag is an option. Otherwise, those with older browsers will not be able to use your site, and it may take many years for those incompatible browsers to die off.
It's ultimately up to you - it's a trade-off between the (very slight) convenience of using import syntax, and the ability of those with older browsers to use your site.
There are other things to consider as well, though. Every import of a library and every <script> tag with a src means one more network request, which means more loading time before the site is fully functional. If you wish to minimize the number of network requests (which can be very important for users on mobile or with bad connections, especially over HTTP/1.1), you'll have to put all the JavaScript into a single file that gets sent to the client. This can be done with module bundlers like Webpack.
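A minimal webpack configuration to do that might look roughly like this (a sketch; the entry and output paths are assumptions):
// webpack.config.js - bundles all imports into a single file
const path = require('path');

module.exports = {
  mode: 'production',       // minifies the output
  entry: './src/index.js',  // the module that imports everything else
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js'   // the one file the page loads via a single <script> tag
  }
};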
(if you wanted to go the extra mile and put everything into a single inline <script>, so that only a single request from the client is necessary to load the whole page and its functionality, that's an option too - this sort of thing is often seen on huge websites)
As long as your script-writing includes a build process (which it probably should, for anything professional and non-trivial - it allows you to write in the latest and greatest version of the language, while still permitting obsolete browsers to understand your code), I think you may as well install libraries locally with NPM and bundle them together. Once you understand the tooling, it doesn't have any downside.
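For example, with jQuery (since it came up above), the local-install workflow is roughly:
// after running: npm install jquery
// the bundler resolves 'jquery' from node_modules and inlines it into the bundle
import $ from 'jquery';

$(function () {
  console.log('Running jQuery ' + $.fn.jquery);
});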
Good morning everyone.
I know you all don't like this question, as it describes bad software design behaviour that can easily break supposed functionality.
But right now, I'm just designing a PoC, and I need it to always run with the latest version of some very common third-party libraries.
I can go to the jQuery GitHub site and, using jsDelivr + SRI Hash, serve my code the very latest version of jQuery (or any other library that ships an includable JS file).
So, this:
https://raw.githubusercontent.com/jquery/jquery-dist/master/dist/jquery.min.js
becomes this:
https://cdn.jsdelivr.net/gh/jquery/jquery-dist/dist/jquery.min.js
and finally:
<script src="https://cdn.jsdelivr.net/gh/jquery/jquery-dist/dist/jquery.min.js" integrity="sha384-tsQFqpEReu7ZLhBV2VZlAu7zcOV+rXbYlF2cqB8txI/8aZajjp4Bqd+V6D5IgvKT" crossorigin="anonymous"></script>
The link above will always provide the very latest jQuery version; no version is specified at any point.
So, as said, this works for any JS library that keeps its common file name in some folder and is versioned in place, rather than in per-version subfolders.
Which is PRECISELY the problem jQuery UI has. And even worse, it doesn't have the proper jquery-ui.min.js in it, which is the one I need.
I have found third-party approximations like this one:
https://github.com/components/jqueryui
But even though it is the latest version, it is an incorrect build of the library: it has one small but important difference in the code (a slash in a RegExp), so it is not a reliable copy.
I've tried the CDNs from Google and Microsoft, plus NPM, Composer, Bower... all of them are versioned links, so you have to specify ".../1.12.1/..." at some point.
Any ideas?
Thanks a lot.
Nice! Found it by myself. After searching like crazy, it seems NPM has a package published by the jQuery people themselves (official, not third-party), the "dist" package, and it has exactly what it needs to have inside:
<script src="https://cdn.jsdelivr.net/npm/jquery-ui-dist/jquery-ui.min.js" integrity="sha384-PtTRqvDhycIBU6x1wwIqnbDo8adeWIWP3AHmnrvccafo35E7oIvW7HPXn2YimvWu" crossorigin="anonymous"></script>
So this is it:
https://www.npmjs.com/package/jquery-ui-dist
Then via jsDelivr:
https://www.jsdelivr.com/package/npm/jquery-ui-dist
https://cdn.jsdelivr.net/npm/jquery-ui-dist/jquery-ui.min.js
Then generate the SRI hash for it and you're done.
Nice!
EDIT:
It happens that the NPM dist package has the very same problem I found before:
(Comparison screenshot: left, the minified version from the jQuery UI CDN site; right, the minified version from the NPM jquery-ui-dist package.)
It seems the minified version on NPM has the back-slash error in the RegExp selector too, while the minified version from the jQuery UI website's CDN doesn't. Looking at the uncompressed version of the code from both sources (the official jQuery CDN and the NPM package), which are exactly the same (binary identical), the back-slash is not there either. So the minified version found in some repositories (NPM included) is incorrect.
I even minified the full version myself, and the RegExp is not modified in any way (no back-slash is ever added). I don't understand why this happens, but the solution is going to be to include the uncompressed version of jQuery UI, which is roughly double the size: from ~200 KB to ~500 KB.
I'm building a website whose target group is very general (ages 13 and up, so hello IE9, hello ancient Android browser), so I need polyfills for some things (viewport units, calc(), etc.). Previously I used Modernizr and some user-agent conditionals to target iOS 6-7 etc., and then loaded the specific polyfills with yepnope.js.
Now that Modernizr 3.0 is out, I noticed that Modernizr.load() is deprecated, and the yepnope.js library is deprecated as well. As they say on their website:
"There are new best practices that you should likely follow instead."
But I can't find any of them. After googling for some time, everyone recommends Modernizr and yepnope, but this issue is so fresh (the deprecation, the new version of Modernizr) that I can't find any newer alternative method.
Would using some module loader (like require.js) do the job? And if so, how?
I maintain the polyfill service at https://cdn.polyfill.io which may meet your needs. We have a library of around 800 polyfills, which are bundled selectively and served only to users that need them. You can run the service yourself or just load the polyfills from our CDN.
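Usage is a single script tag; the service reads the browser's User-Agent and returns only the polyfills it is missing. A sketch (the features parameter is optional, and the value shown is illustrative):
<!-- one request; the service returns only the polyfills this browser needs -->
<script src="https://cdn.polyfill.io/v2/polyfill.min.js?features=default"></script>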
The most comprehensive write-up I've seen on this technique is Philip Walton's Loading Polyfills Only When Needed. It's too long to quote any parts here and should be read in its entirety so I'm not going to copy/paste sections into this answer.
I've spent my morning figuring this out myself. You can use jQuery's getScript method. I just answered a similar question here: https://stackoverflow.com/a/34518146/411436
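A sketch of that approach (the feature test, file path, and startApp entry point are all illustrative):
// load a polyfill only for browsers that lack the feature
if (!window.Promise) {
  $.getScript('/js/promise-polyfill.min.js', function () {
    // the polyfill has loaded and run; safe to start code that relies on it
    startApp(); // hypothetical entry point
  });
}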
From the yepnope repo:
When it comes to loading things conditionally, we suggest that you output a build for each combination of the things you're testing. This might sound like it will generate a lot of files (it might), but computers are pretty good at that. Then you can inline a script into your page that only loads (asynchronously!) a single built script that is tuned to the features of that user. All the performance win of conditional loading, and none of the latency problems of loading 100 things at once.
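A minimal sketch of that inline loader (the feature tests and bundle names are illustrative):
// inlined in the page: pick the one pre-built bundle matching this browser
(function () {
  var modern = 'Promise' in window && 'fetch' in window;
  var script = document.createElement('script');
  script.src = modern ? '/js/bundle.modern.js' : '/js/bundle.legacy.js';
  script.async = true; // load asynchronously, as the yepnope note recommends
  document.head.appendChild(script);
})();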
I am playing around with using LESS along with respond.js to streamline the development of a new site. Both LESS and respond are, quite simply, neat. However, with LESS in IE I have run into many problems.
For starters, in IE8 mode my IE10 reported that it did not understand "map". No problem: I wrote up an Array.prototype.map extension. Then it said that it did not understand isArray, once again in IE8 mode. Prototype extensions to the rescue again. Now it comes back saying something along the lines of SyntaxError: Invalid operand to 'in': Object expected
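(For reference, the map extension was roughly along these lines - a simplified sketch, not the full spec-compliant polyfill:)
if (!Array.prototype.map) {
  Array.prototype.map = function (callback, thisArg) {
    var result = new Array(this.length);
    for (var i = 0; i < this.length; i++) {
      if (i in this) { // skip holes in sparse arrays
        result[i] = callback.call(thisArg, this[i], i, this);
      }
    }
    return result;
  };
}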
I am not in fact aware of what in might be, but in any case I cannot keep adding ad-hoc prototype extensions on the fly in the hope that things will eventually settle down. Either LESS is unusable with IE, or someone here can point me to all the fixes needed to make it work.
Answer for your question:
First of all, client-side LESS compilation is supported only in IE9+.
You could probably fix this using shims and polyfills for ES5, like these.
But please, don't.
What you should probably do (and forget the first part):
However, despite the really good caching mechanisms provided by the LESS compiler (e.g. using localStorage to preserve the generated code), using it in production isn't considered good practice.
It seems much more reasonable to streamline your frontend development workflow using a task runner like GruntJS or Brunch.io, or to install LiveReload. These tools will monitor file changes and generate a new CSS file on every save (and also reload your CSS on the fly).
GruntJS and Bower.io work in the console, but are relatively easy to configure. Basically, you set them up once and forget they've ever existed :)
LiveReload provides you with a GUI and it's incredibly easy to use.
I used GruntJS for frontend development with backend developers working in PHP (CakePHP, Zend, Laravel) and it made our lives much, much easier :)
You can install GruntJS with the watch and LESS plugins and keep it very simple this way. You could even use the LESS Node.js package, installed globally, to do the job.
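A minimal Gruntfile for that setup might look roughly like this (a sketch assuming the grunt-contrib-less and grunt-contrib-watch plugins are installed; the file paths are placeholders):
// Gruntfile.js - recompile LESS on every save
module.exports = function (grunt) {
  grunt.initConfig({
    less: {
      dev: {
        files: { 'css/styles.css': 'less/styles.less' } // destination: source
      }
    },
    watch: {
      styles: {
        files: ['less/**/*.less'], // recompile whenever any .less file changes
        tasks: ['less']
      }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-less');
  grunt.loadNpmTasks('grunt-contrib-watch');
  grunt.registerTask('default', ['less', 'watch']);
};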
I'm trying out Yeoman Server for the first time and see that it offers a native watch tool as a fallback to LiveReload. Here's how the docs explain the fallback:
"[Yeoman Server] automatically fires up the yeoman watch process, so changes to any of the application's files cause the browser to refresh via LiveReload. Should you not have
LiveReload installed locally, a fallback reload process will be used instead."
So far the fallback process is working perfectly, and I like that it doesn't require installing anything in the browser/menu bar.
Has anyone tried both watch tools with Yeoman? How is the workflow different and what additional features do you get if you "upgrade" to LiveReload?
UPDATE: A quick inspection of the API revealed that Yeoman's live reload feature is in fact LiveReload. They're one and the same. The reason it works without the browser extensions is that it uses LiveReload's snipvr snippet instead. It's possible there are some additional features accessible via the LiveReload GUI, perhaps for mobile device testing, but more likely the functionality is identical.
As noted in my update, I checked the Yeoman source and realized that the live reload feature is in fact LiveReload. They're one and the same. It's pretty cool of LR's creator, Andrey Tarantsov, to let his valuable tool be used in a popular, open-source project like this without charging for its use.
The reason Yeoman Watch works without the browser extensions is because it's using LiveReload's snipvr snippet instead.
As a result, the functionality of LiveReload and running 'yeoman watch' is essentially identical. However, I find that there's still benefit to owning LiveReload. My preferred workflow is to combine LiveReload and CodeKit.
During (pre-build) development, I use CodeKit to compile my Sass/Compass files and Jade templates (another fantastic tool, btw) since CodeKit's config options are a little more extensive than LiveReload's. Since CodeKit doesn't work with Firefox (only Chrome and Safari), I run LiveReload concurrently so that I can see changes live in both browsers.
This workflow also has the added benefit of being able to "fork on the fly" by mixing LiveReload's "custom command" feature with CodeKit's "advanced compiler settings" feature.
EDIT:
What I said below isn't exactly correct after all. I did some more testing and found that editing a .scss file would have the changes show up even without editing the HTML file first, so yeah, at this point I haven't got a scooby as to what the difference between LiveReload and the fallback process is.
I say this with the caveat that I don't have LiveReload installed, but from the testing I've done in Yeoman thus far, what I've seen with the "fallback reload process" is that it doesn't reload the page until the HTML file is saved, so saved CSS changes aren't immediately visible until the HTML file receives a Save event from the system.
According to livereload.com, "...when you change a CSS file or an image, the browser is updated instantly without reloading the page", so it appears to be a more robust process.
(Sorry, not a complete answer since I don't have LiveReload available, but this question's been up for a couple of days with no response yet, so I figured any information was better than none.)