I want to ask if it is possible (and generally a good idea) to use npm to handle front-end dependencies (Backbone, jQuery).
I have found that Backbone, jQuery and so on are all available through npm but I would have to set another extraction point (the default is node_modules) or symlink or something else...
Has somebody done this before?
Is it possible?
What do I have to change in package.json?
+1 for using Browserify. We use it here at diy.org and love it. The best introduction and reasoning behind Browserify can be found in the Browserify Handbook. Topics like CommonJS & AMD solutions, build pipelines and testing are covered there.
The main reason Browserify works so well is that it works transparently with npm. As long as a module can be required, it can be Browserified (though not all modules are made to work in the browser).
Basics:
npm install jquery-browserify
main.js
var $ = require('jquery-browserify');
$("img[src$='.png']").hide();
Then run:
browserify main.js > bundle.js
Then include bundle.js in your HTML doc and the code in main.js will execute.
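To address the package.json part of the question: with this approach nothing special is required beyond listing the packages as usual. A minimal sketch (the name and the "*" version ranges are placeholders, not something this answer prescribes):

{
  "name": "my-app",
  "version": "0.0.1",
  "dependencies": {
    "jquery-browserify": "*"
  },
  "devDependencies": {
    "browserify": "*"
  }
}

npm installs both into node_modules, and Browserify resolves the require() call from there when it builds bundle.js.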
Short answer: sort of.
It is largely up to the module author to support this, but it isn't common. Socket.io is an example of such a supporting module, as demonstrated on their landing page. There are other solutions however. These are the two I actually know anything about:
http://ender.no.de/ - Ender JS, self-described NPM analogue for client modules. A bit too involved for my tastes.
https://github.com/substack/node-browserify - Browserify, a utility that will walk your dependencies and allow you to output a single script by emulating the node.js module pattern. You can use a jake|cake|rake|make build script to spit out your application.js, and even automate it if you want to get fancy. I used this briefly, but decided it was a bit clunky, and became annoying to debug. Also, not all dual-environment npm modules like to be run through browserify.
Personally, I am currently opting for using RequireJS ( http://requirejs.org/ ) and manually managing my modules, similar to how Mozilla does with their BrowserQuest sample application ( https://github.com/mozilla/BrowserQuest ). Note that this comes with the challenge of having to potentially shim modules like backbone or underscore which removed support for AMD style module loaders. You can find an example of what is involved in shimming here: http://tbranyen.com/post/amdrequirejs-shim-plugin-for-loading-incompatible-javascript
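For reference, newer RequireJS versions (2.0+) ship a built-in shim config that covers this case, so the separate plugin may not be needed. A minimal sketch, with paths that are assumptions:

require.config({
  paths: {
    jquery: 'vendor/jquery',
    underscore: 'vendor/underscore',
    backbone: 'vendor/backbone'
  },
  shim: {
    underscore: { exports: '_' },
    backbone: {
      deps: ['underscore', 'jquery'],
      exports: 'Backbone'
    }
  }
});

require(['backbone'], function (Backbone) {
  // Backbone is usable here even though it never calls define()
});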
Really it seems like it is going to hurt no matter what, which is why native module support is such a hot topic.
Our team maintains a tool called Lineman for building front-end projects. The tool is node-based, so a project relies on a lot of npm modules that operate server-side to build your assets, but out of the box it expects to find your client-side dependencies copied and committed to vendor/js.
However, a bunch of folks (myself included) have tried integrating with browserify, and we've run into a lot of complexity and problems, ranging from (a) npm modules being maintained by a third party which are either out of date or add unwanted changes, to (b) actual libraries that start failing when loaded traditionally whenever a top-level function named require is even defined, due to AMD/Require.js baggage.
My short-term recommendation is to hold off and stick with good ol' fashioned script concatenation until the dust settles. Until you have problems big enough or complex enough to warrant it, I suspect you'll spend more time debugging and remediating your build than you otherwise would. And I think most of us agree the best use of your time is focusing on your application code, not its build tools.
You might want to take a look at http://jspm.io/ which is a browser package manager. Has nice ES6 support too.
I personally use webmake for my small projects. It is an alternative to browserify in the way it brings npm dependencies into your browser, and it's apparently lighter.
I didn't have the opportunity to compare in details browserify and webmake, but I noticed webmake doesn't work well with modules internally using global variables such as socket.io (which is full of bloat anyway IMO).
I would be cautious about RequireJS, which has been recommended above. Because it is an AMD loader, your browser will load your JS files asynchronously. It will induce more round trips between client and server and may degrade the UX for people browsing over mobile networks or bad WiFi. Moreover, if you manage to keep your JS code simple and small, asynchronous loading is not needed at all!
I have just started my Node.js course. The lectures were recorded at the time of Node.js version 10 (on a Mac); I'm on Windows, and it is now version 16. The lecture does not show this page of the installation screen:
Summary: I do not know if I want native modules, or what they are - but I do not want chocolatey.
I have done my research, yet still I cannot find anything to clear up the following question for me anywhere.
1. My question:
How important are these native modules? Do I need them? Or do you recommend them, and why?
2. Chocolatey:
Out of interest, perhaps you could tell me why the Node.js installer bundles native modules and Chocolatey together?
I have decided I do not want chocolatey (no problem, if I decide to install the 'tools' then I will go onto GitHub and install them manually, as it says in the screenshot.)
The reason I do not want Chocolatey is that, from my research, I do not think I need it, and I have seen that uninstalling Chocolatey can potentially cause one or two problems, so I'll avoid it altogether. But I thought I'd mention that here on the side, because maybe somebody knows a very valid reason why they are bundled together, and it will change my mind.
A big thank you to the Stack Overflow community.
Native modules need to be compiled, most often (but not exclusively) from C/C++ source, in order to function. Some folks avoid them like cancer, as they need to be compiled on installation which can be a deployment risk. Others (like me) embrace native modules because of the performance benefits they can bring.
Note that this is not a concept unique to Javascript or Node.js. Other languages like Ruby and Python also have "modules" (by other names) that involve compiling native code in order to function as well.
As to why Node.js uses Chocolatey to manage its native toolchain, it's because Chocolatey already has packages available for the tools it needs. It doesn't make sense to maintain separate npm packages of these tools, and relying on existing packages removes a lot of the overhead of getting a wide range of tools and utilities installed. In addition, Chocolatey can be installed system-wide or for only a specific application's use. I'm not sure which technique Node.js uses, but since it's asking, I assume it wants to use a system-wide config.
If you don't want to use Chocolatey, you'll have to manage the native toolchain on your own. If you tell it to use Chocolatey, you can manage the toolchain upgrades with the choco upgrade command.
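For example, a quick sketch of that workflow (exact package names depend on what the installer's script actually put on your machine):

choco list --local-only
choco upgrade all -y

The first command shows what Chocolatey currently manages; the second upgrades everything it manages.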
That said, I would consider exploring Chocolatey if I were you. It makes package management so much easier on Windows. It's about as close to a standard as a 3rd party solution can get, in part because it builds off of nuget, and you can technically manage Chocolatey packages with PowerShell without installing Chocolatey (though I don't recommend this, just use Chocolatey).
Somebody raised this question with the Node.js developers in a GitHub issue: https://github.com/nodejs/node/issues/30242.
tl;dr It is not essential for node.js
TL;DR; Is it possible to implement package inheritance in Node? Or is there a recommended tool to help with this?
I am working in an enterprise environment containing 60+ (and growing) web apps. We’ve also modularized components such as header/footer/sign-in/etc. so the web apps don’t need to repeat the code, and they can just pull them in as dependencies. We also have library modules to provide things like common logging, modeling, and error handling. We also have one main library that provides a foundation for maintenance like testing, and linting.
What I am trying to do is get access to the dependencies this main library uses in upper level modules.
lib-a
  |
  -> lib-b
       |
       -> babel, chai, mocha, etc.
I would like to have lib-a “inherit” babel, chai, mocha, etc. from lib-b rather than having to specifically add the dependencies. That way all my libraries, and eventually web apps will have the same version, and I won’t have to repeat the same dependencies in every package.json. Nor will I need to go through the headache of having N number of teams update the 60-100 apps/libs/whatnot, and having to deal with them complaining about maintenance.
I do understand this goes against the core of npm, but at the level we are using it, it's becoming a maintenance headache. Going more DRY would certainly have its benefits at this point.
So to repeat the original question at the top - Is it possible to implement package inheritance in Node? Or are there any recommended tools to help with this?
I have seen the following tools. Has anyone ever used them, or do you have thoughts on them? Are there others?
https://github.com/FormidableLabs/builder
https://github.com/Cosium/dry-dry
It's a bad idea. You should assume that you don't have control over the dependencies. How else would anybody be able to make changes to the dependencies?
Suppose lib-a from your example uses mocha. Since it depends on lib-b, which also depends on mocha, you could decide not to list mocha in lib-a's package.json.
If somebody refactors lib-b to not use mocha anymore, lib-a will break all of a sudden. That's not good.
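To make the failure mode concrete, here is a sketch (the module names follow the example above):

// lib-a/index.js
// lib-a's package.json lists only lib-b, but the code requires mocha directly:
var mocha = require('mocha');
// Today this resolves only because lib-b happens to install mocha as its own
// dependency. The moment lib-b drops mocha, this line throws
// "Error: Cannot find module 'mocha'" even though lib-a itself never changed.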
We work with equally many projects. We use Greenkeeper, RenovateBot, and some tools that apply changes to all our repos at once. In the long run that's probably a better strategy for you than going against Node's dependency model.
I bought an HTML template recently, which contains many plugins placed inside a bower_components directory and a package.js file inside. I wanted to install another package I liked, but decided to use npm for this purpose.
When I typed:
npm install pnotify
the node_modules directory was created and contained about 900 directories with other packages.
What are those? Why did they get installed along with my package? I did some research and it turned out that those were needed, but do I really need to deliver my template in production with hundreds of unnecessary packages?
This is a very good question. There are a few things I want to point out.
The V8 engine, Node Modules (dependencies) and "requiring" them:
Node.js is built on V8 engine, which is written in C++. This means that Node.js' dependencies are fundamentally written in C++.
Now when you require a dependency, you really require code/functions from a C++ program or js library, because that's how new libraries/dependencies are made.
Libraries have so many functions that you will not use
For example, take a look at the express-validator module, which contains so many functions. When you require the module, do you use all the functions it provides? The answer is no. People most often require packages like this just to use one single benefit of it, although all of the functions end up getting downloaded, which takes up unnecessary space.
Think of the node dependencies that are made from other node dependencies as Interpreted Languages
For example, JavaScript engines are written in C/C++, whose compilers are in turn ultimately built on assembly. Think of it like a tree: you create new branches each time for more convenient usage and, most importantly, to save time. It makes things faster. Similarly, when people create new dependencies, they use/require ones that already exist instead of rewriting a whole C++ program or JS script, because that makes everything easier.
The problem arises when requiring other npm packages to create a new one
When the authors of dependencies require other dependencies from here and there just to use a few of their features, they end up pulling them in entirely (which they don't really worry about, since they mostly don't care about size, or they'd rather do this than explicitly write a new dependency or a C++ addon), and this takes extra space. For example, you can see the dependencies that the express-validator module uses by accessing this link.
So, when you have big projects that use lots of dependencies you end up taking so much space for them.
Ways to solve this
Number 1
This requires some expert people on Node.js. To reduce the amount of the downloaded packages, a professional Node.js developer could go to the directories that modules are saved in, open the javascript files, take a look at their source code, and delete the functions that they will not use without changing the structure of the package.
Number 2 (Most likely not worth your time)
You could also create your own personal dependencies, written in C++ or (more preferably) JS, which would take up the least space possible, depending on the programmer, but would cost the most time, since you would be spending it on reducing size instead of doing actual work. (Note: most dependencies are written in JS.)
Number 3 (Common)
Instead of using option number 2, you could use webpack.
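A minimal sketch of such a webpack setup (the entry and output paths are assumptions):

// webpack.config.js
const path = require('path');

module.exports = {
  mode: 'production',            // enables minification and tree shaking
  entry: './src/index.js',       // your application entry point
  output: {
    filename: 'bundle.js',       // the single file you actually ship
    path: path.resolve(__dirname, 'dist')
  }
};

You then ship only dist/bundle.js; the hundreds of directories under node_modules stay on the build machine.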
Conclusion & Note
So, basically, there is no running away from downloading all the node packages, but you could use solution number 1 if you believe you can do it, keeping in mind that it also has the possibility of defeating the whole intent of a dependency (so keep it personal and use it for specific purposes). Or just make use of a tool like webpack.
Also, ask this question to yourself: Do those packages really cause you a problem?
No, there is no point in adding about 900 package dependencies to your project just because you want to use some template. But it is up to you!
The heaviness of a template is not a challenge to the Node.js ecosystem or to its main package system, npm.
It is a fact that the JavaScript community tends to make the smallest possible modules, each responsible for one task, and just one.
That is not a bad thing, I guess. But it can result in a situation where you have a lot of dependencies in your project.
Nowadays hard drive memory is cheap and nobody cares any more about making efficient/small apps.
As always, it's only a matter of choice.
What is the point of delivering hundreds of packages weighing hundreds of MB for a few-kB project?
There isn't one.
If you intend to provide it to other developers, just gitignore (or remove from shared package) node_modules or bower_components directories. Developers simply install dependencies again as required ;)
If it is something as simple as an HTML template or similar, node would most likely be there just to make your life as a developer easier, providing live reload, compiling/transpiling TypeScript/Babel/SCSS/SASS/LESS/Coffee... (the list goes on ;P) etc.
And in that case, the dependencies would most likely be devDependencies only and won't be required at all in the production environment ;)
Also, many packages declare separate production and dev dependencies, so you just need to install the production dependencies:
npm install --only=prod
If your project does need many packages in production, and you really, really want to avoid that stuff, just spend some time and include the CSS/JS files your project needs (this can be a laborious task).
Update
Production vs default install
Most projects have different dev and production dependencies.
Dev dependencies may include things like SASS and TypeScript compilers, uglifiers (minification), live reload, etc.
The production install will not include those, reducing the size of the node_modules directory.
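As a sketch, the split in the relevant part of package.json looks like this (package names and versions are only illustrative):

{
  "dependencies": {
    "jquery": "^3.3.0"
  },
  "devDependencies": {
    "node-sass": "^4.9.0",
    "typescript": "^3.0.0"
  }
}

npm install --only=prod (shown above) then installs only the dependencies block and skips devDependencies entirely.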
No node_modules
In some HTML-template kinds of projects, you may not need node_modules in production at all, so you skip doing an npm install.
No access to node_modules
Or in some cases, when the server that serves the app lives inside node_modules itself, access to it may be blocked (because there is no need to access these files from the frontend).
What are those? Why did they get installed along with my package?
Dependencies exist to facilitate code reuse through modularity.
... do I need to deliver my template in production with hundreds of unnecessary packages?
One shouldn't be so quick to dismiss this modularity. If you inline your requires and eliminate dead code, you'll lose the benefit of maintenance patches for the dependencies automatically being applied to your code. You should see this as a form of compilation, because... well... it is compilation.
Nonetheless, if you're licensed to redistribute all of your dependencies in this compiled form, you'll be happy to learn those optimisations are performed by compilers that compile JavaScript to JavaScript. The Closure Compiler, as the first example I stumbled across, appears to perform advanced compilation, which means you get dead-code removal and function inlining... That seems promising!
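As a rough sketch of trying that route with the npm distribution of the Closure Compiler (file names are placeholders; the flags follow the package's documented options):

npm install --save-dev google-closure-compiler
npx google-closure-compiler --js=bundle.js --js_output_file=bundle.min.js --compilation_level=ADVANCED_OPTIMIZATIONS

Advanced optimizations are exactly what enables the dead-code removal and inlining mentioned above.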
This does, however, have another side effect when you are required to justify the licensing of all npm modules: when you have hundreds of npm modules pulled in as dependencies, that effort becomes a much more cumbersome task.
Very old question, but I happened to come across a very similar situation, just as RA pointed out.
I tried to work with a Node.js framework using VS Code, and the moment I tried to start npm using npm init -y, it generated so many different dependencies. In my case, it was the VS Code extension ESLint that I had added prior to running npm init -y.
Uninstalled ESLint
Restarted VS Code to apply that uninstallation
Removed the previously generated package.json and node_modules folder
Ran npm init -y again
This solved my problem of starting out with so many dependencies.
Given that ES6 modules (JavaScript modules) are available for testing:
https://www.chromestatus.com/feature/5365692190687232
https://medium.com/dev-channel/es6-modules-in-chrome-canary-m60-ba588dfb8ab7
I wonder, how should I minify and prepare the project release file? Earlier, I had bundled all JavaScript files into a single minified file, except in situations where I have to load a JS file dynamically via XHR or the Fetch API.
As I understand it, it's rather impossible to prepare a single minified file with ES6 modules right now, or maybe I'm just misunderstanding something.
So, are there any ways to bundle my ES6 modules into a single file, and how should I prepare a modern JavaScript project in 2017, when JavaScript modules are available?
I wonder, how should I minify and prepare the project release file?
What is the purpose of that step? Okay, minified files take less network traffic and will be downloaded faster, but most npm libraries provide minified dist files already. The main question is about bundling into one big file.
Why does webpack do it? Because, of course, there was no native support for ES modules in browsers. That's why webpack resolves import statements and circular dependencies synchronously*, then wraps them in IIFEs for scoping, and performs Babel transpilation and polyfilling, yes.
But once native support for ES modules lands, that becomes unnecessary. One of the main goals when exposing your web app to production is to minimize the traffic hitting your server by using a CDN. Now you can do that natively, so just import ES modules from a CDN like unpkg and be happy.
*Unless you're using HMR, of course, but that's not appropriate for production mode.
Live examples here: https://jakearchibald.com/2017/es-modules-in-browsers/
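As a tiny sketch of the native approach (file names are arbitrary): a page loads main.js with <script type="module" src="main.js"></script> and the browser resolves the import itself, with no bundler involved.

// greet.js
export function greet(name) {
  return 'Hello, ' + name + '!';
}

// main.js
import { greet } from './greet.js';
console.log(greet('ES modules'));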
This blog explains how you would use the ES6 module syntax and yet still bundle your code into something that the browser will understand.
The blog explains that using SystemJS as an ES6 module polyfill, together with Babel and Gulp, will enable you to write your modules in ES6 yet still be able to use them today.
https://www.barbarianmeetscoding.com/blog/2016/02/21/start-using-es6-es2015-in-your-project-with-babel-and-gulp/
Using this guide will help you write your code in ES6 while still having a normal workflow to building, minifying and bundling your code.
Keep in mind there are a lot of tools out there that will help you achieve this but I've followed this method many times and I can vouch for its validity.
I'm creating a JavaScript library, and I want it to be environment-agnostic (it will not use the DOM, AJAX, or the Node.js API; it will be vanilla JavaScript). So it's supposed to run in any JavaScript environment (browsers, npm, Meteor smart packages, V8 C bindings...).
My current approach is to create a git repo with the library, with everything inside a single global variable, without thinking about patterns like CommonJS or AMD.
Later, I'll create another git repo that uses my library as a git submodule and adds whatever is needed to release it as an npm module. I'm not sure this is a good approach; I haven't found anyone doing it this way.
Pros: the code will be vanilla JavaScript, without awareness of environment patterns. It will not bind itself to CommonJS. It will be repackable (copy and paste or git submodule) into any JavaScript environment. It will be as small as it needs to be to send to browsers.
Cons: I'll have to maintain as many git repos as environments I want to support. At least a second git repo to deliver on npm.
Taking jQuery as an example, it runs in both the browser and Node.js with just one git repo. There is some code that checks for the "exports" variable in order to run on Node.js or any other CommonJS-compatible environment.
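That jQuery-style check boils down to a small wrapper; a minimal sketch (myLib is a placeholder name):

// expose the same factory to CommonJS and to a browser global
(function (root, factory) {
  if (typeof module === 'object' && module.exports) {
    module.exports = factory();   // Node.js / CommonJS
  } else {
    root.myLib = factory();       // plain <script> in a browser
  }
}(typeof self !== 'undefined' ? self : this, function () {
  return {
    hello: function () { return 'hello'; }
  };
}));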
Pros: just one git repo to maintain.
Cons: it will be bound to the CommonJS pattern (to achieve npm compatibility).
My question is: am I following a correct (or acceptable) approach? Or should I follow jQuery's path and try to create a single git repo?
Update 1:
Browserify and other require() libraries are not valid answers. My question is not how to use require() in the browser; it's about the architectural pattern for achieving environment agnosticism.
Update 2:
Creating a browser/Node.js module is not the question; that much is known. The question is: can I make a truly environment-agnostic library? That example is bound to the CommonJS pattern used in Node.js.
If you are looking for design recommendations for your future library work, then in my opinion you can "think future" and just use the usual object-oriented practices well proven in other languages, systems, and libraries.
Mainly concentrate on the UML view of your design.
Forget the "one variable" requirement.
Use features proposed in the planned next version of JavaScript.
http://wiki.ecmascript.org/doku.php?id=strawman:maximally_minimal_classes
http://wiki.ecmascript.org/doku.php?id=harmony:modules_rationale
There is an experimental compiler available that allows you to write ES6-style code even today (see https://www.npmjs.org/package/es6-module-transpiler-rewrite).
Node.js has a --harmony command line switch that allows for the same (see What does `node --harmony` do?)
So in my opinion the correct approach is to follow best practices and "think future".
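As a sketch of what "think future" looks like in code (the names are placeholders), you write plain ES6 modules and classes and let a transpiler target today's environments:

// point.js (ES6 module)
export class Point {
  constructor(x, y) { this.x = x; this.y = y; }
  add(other) { return new Point(this.x + other.x, this.y + other.y); }
}

// app.js
import { Point } from './point.js';
console.log(new Point(1, 2).add(new Point(3, 4)));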
"Use a build tool" is the answer for this question. With a build tool, you can develop with the best code pratices, without accopling your code to some enviroment standard of today (AMD, commonjs...) and still publish your code to these kind of enviroments.
For example, I'm using Grunt.js to run some tasks, like build, lint, test, etc.
It performs tedious operations (minification, compilation...) like Make, Maven, Gulp.js, and various others do.
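A minimal Gruntfile sketch, using grunt-contrib-uglify as the example plugin (the file paths are assumptions):

// Gruntfile.js
module.exports = function (grunt) {
  grunt.initConfig({
    uglify: {
      dist: {
        files: {
          'dist/mylib.min.js': ['src/mylib.js']   // placeholder paths
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('build', ['uglify']);
};

Running "grunt build" then produces dist/mylib.min.js.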
The build task can target standards (like CommonJS) for the compiled code. So the library can be totally environment-agnostic, and the build process handles the environment adaptations.
Note that I'm not talking about compiling to binaries. It's compiling source to another source, like CoffeeScript to JavaScript. In my case, it's compilation of JavaScript without any environment standard to JavaScript with the CommonJS standard (so it can run as a Node.js module).
The final result is that I can compile my project to various standards without messing with my code.
Additionally, with a build phase I can "think future", as xmojmr answered, and use ECMAScript 6 features in my JavaScript code, using Grunt plugins like grunt-es6-transpiler or grunt-traceur to compile the JavaScript code from ES6 to ES5 (so it can run in today's environments).
Regarding modularizing your library (if you need modules), read this question: Relation between CommonJS, AMD and RequireJS?
Take Bootstrap for example: it uses npm to manage project dependencies and uses bower to publish as static content for other web apps.
Take a look at browserify as a reference; it's a little heavy because it provides the capability to bundle dependent npm modules as resources for browsers.