Why are frontend frameworks in NPM? - javascript

When I look at GitHub projects and tutorials, I often see frontend frameworks listed as dependencies in the package.json file. I don't get it. I thought Node was backend? My understanding is that to install frontend frameworks you download them directly from their website or GitHub, or use a CDN, and then link them in your pages. None of that has anything to do with Node.
Even if I did install a framework through npm, wouldn't it just end up in the node_modules folder? There must be a reason for it, as I've seen a lot of projects list frameworks in their package.json. Can anyone explain this to me?

Node.js is not only a "server" in the sense of a programmable web server; it is a JavaScript runtime. You can use it to serve web pages, but you can also use it as a build environment for JavaScript, meaning a program that reads and writes files on your system. If you use a frontend framework like React or Angular, you install the packages just to get their source code, not to actually run that code on the server. You then use a bundler like webpack to turn the code you've written, plus the code from those modules, into one (or several) large chunks of minified frontend code. You can usually find the generated files inside the /dist or /build folder. To get these files to clients, you can use Node.js as a server too, serving the files to browsers. That way, the packages "installed" on your server end up on your client.
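As a minimal sketch of that flow (the file names and the React 18 imports are illustrative assumptions, not something from the question):

// src/index.js -- assumes React was installed with "npm install react react-dom",
// so both packages live in node_modules
import React from 'react';
import { createRoot } from 'react-dom/client';

// A bundler such as webpack resolves these imports from node_modules and
// inlines the framework code into the final bundle written to /dist.
createRoot(document.getElementById('app')).render(
  React.createElement('h1', null, 'Built by Node, served to the browser')
);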

Related

Releasing a JavaScript library for client use

I have a JavaScript library that I wrote for a client. It is written in TypeScript and "compiles" into JavaScript using webpack. I want to give the client access to the distribution files but not the whole source. Ideally they could install from the command line, to make installing updates easy.
The library provides some JavaScript functions. The client would install the script in one location on their server and could then include it in their web surveys as needed.
+project
  +dist
    -main.js
    -vendor.js
    -index.html
  -README.md
  -LICENSE.md
  +src
    -index.js
    -index.html
  ...
My initial thought is to give them access to a private git repository that contains only the distribution files. My project would be one git repository that only I have access to. I would then copy the contents of the dist directory into a release directory, which would be a second git repo I could supply to the client.
I'm not sure this is the best approach.
It was suggested that GitHub releases may be an option - but I don't use GitHub, I use GitLab and would like to continue to do so.
npm also doesn't seem like a good choice. It installs files into the node_modules directory and creates a package.json file. That's going to be confusing to my client and isn't "clean".
It sounds like a second git repository, used as a submodule, could work for you. On your side it would receive the built files; on the client's side they could consume them.
I'd suggest using tags to mark significant versions in the submodule.
By using a separate repository, there is no risk of leaking the original source files.
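A hypothetical sketch of what the publishing step could look like (the ../client-release path and the tag format are assumptions, not from the question):

// release.js -- copies the built files into the separate, client-facing repo
// and tags the result; paths and tag naming are illustrative
const { execSync } = require('child_process');
const fs = require('fs');

const version = require('./package.json').version;
const releaseDir = '../client-release'; // the second git repository

// copy dist/ into the release repo's working tree (fs.cpSync needs Node >= 16.7)
fs.cpSync('dist', releaseDir, { recursive: true });

// commit and tag inside the release repo
execSync('git add -A', { cwd: releaseDir });
execSync(`git commit -m "Release v${version}"`, { cwd: releaseDir });
execSync(`git tag v${version}`, { cwd: releaseDir });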
Alternatively, you could package the files as a zip and upload it somewhere like S3 as part of your CI process, then give the client a script that automatically downloads the distribution files. But this seems more complex than just using a package manager like npm.

Organising a client & server SPA/Node.js application

I am writing an application with a Node.js backend and a single-page web app front-end.
I am keeping the client and server logic in the same project for simplicity and speed of development.
I am considering how best to organise the artifacts.
The Node.js part is straightforward because it doesn't need to go through a battery of pre-processors (transpilation, minification, concatenation, etc.).
The front-end needs to be transformed per the above, and I guess placed in a dist folder.
The current hierarchy of files is like so:
my-app
- src
  - client
  - server
Should I put the dist folder for the client artifacts under src/client?
Has anyone tried this and found problems with this approach?
I am using Heroku (a deployment system that uses git).
Committing the built artifacts for the client feels wrong, but if I want to deploy it by pushing to Heroku I think I need to commit them. Is this correct?
This question, as is, invites opinionated answers, so I'll start by saying this is by no means the only way to go, but in my opinion, it is the easiest to work with and makes the most sense.
The production client code, after pre-processing, should be located in my-app/dist or my-app/dst, which could either mean distribution or destination, depending on how you look at it. Either way, my recommendation is to commit this folder, as it saves you a lot of hassle debugging remotely.
For example, if your code works locally but not remotely, using something like the postinstall hook to generate your dist folder adds yet another suspect to check when trying to determine what the issue is with your program.
Another advantage of committing the dist folder is that it allows you to specify all the packages you use for your build process as devDependencies rather than dependencies. This is a huge plus: it makes deployment a lot faster and reduces memory usage on your Heroku process.
That being said, I still recommend (as you probably already plan to do) using an automated watch task to build your dist folder for ease of development, even if you decide you don't want to use that same build process remotely and opt for committing the dist directory instead. You could expose it as a custom npm command, e.g. npm run build, and have that invoke your gulp task, as in the sketch below.
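A rough sketch of that build/watch split (gulp 4 syntax; the globs and task names are placeholders, not from the original answer):

// gulpfile.js -- hypothetical build and watch tasks; adjust the globs to your layout
const { src, dest, watch, series } = require('gulp');
const uglify = require('gulp-uglify');

function build() {
  return src('src/client/**/*.js')
    .pipe(uglify())      // minify the client sources
    .pipe(dest('dist')); // write the artifacts to dist/
}

exports.build = build; // "npm run build" can then map to "gulp build" in package.json
exports.watch = () => watch('src/client/**/*.js', series(build));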
One last thing. For those of you using templating languages like Pug, Dust, or EJS instead of a framework like React or Angular, I recommend determining whether you can run any of your templates to build static HTML files that will be served in production.
If not, you should at least compile your templates (not to be confused with running them) by following the recommendations provided by your particular templating language. Typically, they'll suggest using their command line utility to generate the compiled templates, so that they don't have to be compiled every time they're invoked in production. This will make your node.js server respond faster to requests at the expense of using more memory to cache the compiled templates.
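With Pug, for example, that caching could look like this (a hedged sketch; the file path and exported helper are assumptions):

// templates.js -- compiles a Pug template once and reuses the resulting
// render function for every request, instead of recompiling each time
const pug = require('pug');

// compile once at startup...
const renderPage = pug.compileFile('src/views/page.pug');

// ...then call the cached function per request
module.exports = (locals) => renderPage(locals);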
If you're planning to go this route, I would edit your Node.js app.js/index.js to serve static files and point the directory to dist/.
Also, you would need to tell Express to forward all non-API requests to the front-end.
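A minimal sketch of that setup, assuming Express (the paths and port are placeholders):

// app.js -- serves the built client from dist/ and forwards non-API
// requests to the single-page app's entry point
const path = require('path');
const express = require('express');

const app = express();

// serve the pre-built client files
app.use(express.static(path.join(__dirname, 'dist')));

// mount API routes before the catch-all so they are not swallowed by it,
// e.g. app.use('/api', apiRouter);

// every other request gets the SPA's index.html
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'dist', 'index.html'));
});

app.listen(process.env.PORT || 3000);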

Can you use an npm package without using Node.js

I found a library on GitHub that I would like to use, but the download instructions only mention npm, and I am not using a Node.js project (just a basic HTML, CSS, and JavaScript front-end with no back-end server). Am I still able to use that library, or is it a lost cause? Is there another way to download it without using npm?
Is there another way to download it without using npm?
If it's on GitHub, then you can check out or fork the repository, as you can with any other git repo.
Am I still able to use that library or is it a lost cause?
Whether or not the library will work without Node will depend on the library.
If it presents itself as a Node module, then you'll probably have to modify it (or find a compatible module loader for browser-side JS).
If it depends on Node.js features (such as the filesystem API), then you'll be out of luck (unless, for example, you polyfill them to work over HTTP).
If you use a build tool such as Browserify or webpack, you can author scripts that require Node modules, and the build tool will generate a script that includes all the necessary dependencies (assuming those dependencies are compatible with client-side JavaScript environments).
Downloading dependencies would still be done via npm, but only for local development. Your server would only need the generated script.
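A minimal sketch of that workflow (lodash stands in for whichever library you found; the file names are placeholders):

// main.js -- an ordinary npm package used in browser code via a bundler
const _ = require('lodash');

document.title = _.startCase('hello from an npm package');

// Build once during development with:
//   npx browserify main.js -o bundle.js
// then ship bundle.js with a plain <script> tag -- no Node server required.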
Alternatively, if the script is on GitHub or any other repo online, you may be able to download it directly. Many modules are published using UMD, which allows you to load the script using various inclusion methods.

Loading modules in qooxdoo desktop (browser environment)

I'm struggling with how to integrate client-side modules (just as an example, Apollo Client) into the qooxdoo-specific generate.py workflow so that they become available in the browser.
According to the installation notes:
To use this client in a web browser or mobile app, you'll need a build system capable of loading NPM packages on the client. Some common choices include Browserify, Webpack, and Meteor 1.3. [...]
Side note: I currently use Babel 6 to recursively transpile all my sources from a separate folder source.es6/ into the "official" source/ folder, which is then watched and processed by generate.py. Is it possible to use this somehow as a solution to my question?
OTOH, I would love to see at least some kind of integration with Webpack, Browserify or SystemJS.
I suggest you do the following. First, create loadable package(s) from the Apollo Client and its dependencies, e.g. using webpack. Then make sure these package(s) are loaded in your web page before you load your qooxdoo app. The Apollo API will then be available to your qooxdoo code.
If you choose to deploy the Apollo packages with <script> tags, you can let generate.py do that by using the add-script config key.
I suggest you place the output of the Webpack run in your qooxdoo project's resource path and add #asset hints for those files in your main qooxdoo class. This will make sure they are copied into the build version of your app, and you can use the relative URI to these files, either in your index.html directly or in the add-script config settings.
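The first step could look roughly like this (a hypothetical webpack configuration; the entry file, output path, and the Apollo global name are assumptions, not qooxdoo or Apollo requirements):

// webpack.config.js -- bundles Apollo Client and its dependencies into a
// single file that can be loaded with a <script> tag
const path = require('path');

module.exports = {
  entry: './apollo-entry.js', // e.g. a file that re-exports the Apollo API you need
  output: {
    // writing into the qooxdoo resource path lets the #asset hints pick it up
    path: path.resolve(__dirname, 'source/resource/myapp'),
    filename: 'apollo-bundle.js',
    library: 'Apollo',        // exposed globally as window.Apollo
    libraryTarget: 'umd',
  },
};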
I don't think your transpiling with Babel 6 will help here. The Apollo code is already consumable, and you wouldn't want to dissect it and make it part of your qooxdoo (ES6) source tree, let alone its dependencies. I would rather treat it as a shrink-wrapped JS library, added as a resource in the way described above.

How do front end devs bundle and minify files?

What's the best practice for minifying and bundling js/css in a pure front end app, and how do the tools work?
I know how this can be done with server-side apps like .NET/Java/LAMP/etc. But what about pure front-end projects, SPA projects, or backendless projects that are built with, say, Ember or Angular these days? Say your entire project consists of HTML/CSS/JS and interfaces with a RESTful service elsewhere.
What kind of process or tool do you use to minify and bundle the resources for that?
I've seen grunt plugins that exist for this, but I find the documentation to be pretty magical and it's still unclear to me how they work.
Specifically, does the tool:
1) Replace src="/js/a.js", src="/js/b.js" with src="/js/bundle-a+b.min.js" in the source HTML files (and likewise with CSS)?
2) Have different modes for dev and release, or is the tool only run when the project is released?
Or are the resource requests entirely managed by a JS tool, so that JS/CSS files have to be requested via a library function? Wouldn't the lag be noticeable in that case?
Thanks.
Through the use of build tools, front-end devs can have JavaScript, CSS, and even image and HTML files automatically minified as they develop. The most common is Grunt, with gulp close behind.
You configure grunt tasks, like grunt-contrib-uglify and grunt-contrib-copy, and put those tasks under a grunt-contrib-watch task. Have the grunt watch task watch the files you modify, and every time a change is detected those .min files are automatically generated.
These build tools have no impact on your application; they run before the files are served. You were correct to assume there was an easy way to do this. I suggest you look at Grunt's getting-started guide, a sample Gruntfile, or a project that uses Grunt - here's mine, and it does minification like you requested. Clone my repo, run sudo npm install, then sudo grunt. I don't have watch set up in my project, but Grunt is very well documented.
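A minimal Gruntfile along the lines described above (a sketch; the file paths are placeholders, not taken from the linked repo):

// Gruntfile.js -- concatenates and minifies two scripts into one bundle,
// regenerating the .min file whenever a source file changes
module.exports = function (grunt) {
  grunt.initConfig({
    uglify: {
      build: {
        // grunt-contrib-uglify concatenates the sources and minifies the result
        files: { 'dist/bundle-a+b.min.js': ['js/a.js', 'js/b.js'] }
      }
    },
    watch: {
      scripts: {
        files: ['js/*.js'],
        tasks: ['uglify'] // re-run minification on every change
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.loadNpmTasks('grunt-contrib-watch');
  grunt.registerTask('default', ['uglify', 'watch']);
};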
