Bundle.js vs bundle.js.gz

I used webpack to generate my bundle.js, but since I was using some third-party libraries the size is about 1 MB. I used the compression plugin and got a bundle.js.gz of about 200 KB. I served that instead and changed the response header to let the browser know it is compressed, and it worked perfectly. I am just worried about side effects I am not seeing at first.
Can anyone tell me what could go wrong?

I don't foresee any problems. Another option, depending on your host, is to configure gzip on your http server. This will gzip on the fly and cache it for future requests (dependent on config).
Here are instructions for configuring Apache & nginx:
https://www.vultr.com/docs/gzip-compression-on-apache-and-nginx
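For reference, this is roughly how such a precompressed bundle is usually produced with compression-webpack-plugin; treat it as a hedged sketch, since the exact plugin version and option values in your build may differ:

// webpack.config.js (sketch; assumes compression-webpack-plugin is installed)
const CompressionPlugin = require('compression-webpack-plugin');

module.exports = {
  entry: './src/index.js',
  output: { filename: 'bundle.js' },
  plugins: [
    new CompressionPlugin({
      test: /\.js$/,       // only compress the JavaScript output
      algorithm: 'gzip',   // emits bundle.js.gz next to bundle.js
      minRatio: 0.8        // skip assets that do not compress well
    })
  ]
};

If you serve the precompressed file yourself, the headers that matter are Content-Encoding: gzip and the correct Content-Type; with nginx, the gzip_static directive (if that module is built in) will serve the .gz file automatically, and Apache's mod_deflate covers the on-the-fly case mentioned above.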

Related

Can I use Node.js with a static website?

I am making a small website for a buddy, but he only has a web hosting plan from Hostinger. For his website I want to use a Node.js package. Is it still possible to use this hosting service if I just point the URL at the index.html file?
Or does the JavaScript stop working if I use Node.js?
tl;dr: Most likely not.
If the plan your friend has does not offer Node.js support, you will not be able to use such a package, but it is hard to say for sure without knowing exactly which plan your friend has.
"Webserver" usually means static content hosting (HTML, CSS, and JS), served to the browser as-is. In that case no actual processing is done on the server, which is what Node.js is for.
You could use other browser npm packages (or isomorphic ones that support both Node.js and browser environments), but in the case you described you won't be able to use Node-specific packages.
According to Hostinger's website, you can only use Node.js if you are on a VPS plan.
EDIT: It is worth mentioning that you could use a Node package if you run it locally to generate or preprocess assets before putting them on the webserver. In that case the package executes on your machine instead of the server, at build time (not run time).
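As a minimal sketch of that build-time idea (the file names here are purely illustrative), a plain Node script can generate the static page locally, and only the generated HTML gets uploaded to the host:

// build.js - run locally with node build.js; upload only the dist/ folder
const fs = require('fs');

// posts.json is a hypothetical local data file
const posts = JSON.parse(fs.readFileSync('posts.json', 'utf8'));

const html = `<!doctype html>
<html>
  <body>
    <ul>${posts.map(p => `<li>${p.title}</li>`).join('')}</ul>
  </body>
</html>`;

fs.mkdirSync('dist', { recursive: true });
fs.writeFileSync('dist/index.html', html);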
It depends on the package and what you want to do with it.
Node.js can be used to run server-side JavaScript. If you need to run server-side JavaScript then you need a hosting service that supports it. A service that supports only static files will not suffice.
The term “a Node.js package” might refer to a package available via NPM. Some such packages require Node.js (in which case see the previous paragraph). Others will run client-side in the browser. Typically, if you are using such a package in client-side code you will use a bundler (such as Webpack or Parcel) to convert your program (which imports said package) for use in browsers.
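As a small, hedged example of that bundling step (the package and paths here are placeholders), a browser-friendly npm package can be imported in client-side code and bundled into a plain static file:

// src/index.js - imports a browser-compatible npm package (lodash-es as a stand-in)
import { debounce } from 'lodash-es';

document.addEventListener('scroll', debounce(() => {
  console.log('scrolled');
}, 250));

Running the bundler (for webpack, something like npx webpack --mode production) then produces a dist/main.js that any static host, Hostinger included, can serve.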
Some websites are generated programmatically at build time, and the resulting static files are uploaded to a static host. Node.js can be used to do that generation; it is, for example, the usual way sites built with Gatsby.js are produced.

Load JS files from CDN using Webpack in Angular 5

I am using Angular 5 + Webpack for one of my projects. Now I want to load all assets, including the JS files (and the lazy-loaded chunk .js files), from a CDN.
For CSS and images I have changed webpack's publicPath option, so I am able to load CSS and images from the CDN, but the problem is with the JS files.
For the JS files I have changed the <base href="{{CDN-PATH-HERE}}">, but that gives me a History replaceState error about origin 'null'.
I also tried renaming the JS files using webpack, but that trick didn't work either.
I just want to know whether I am going in the right direction or whether I should approach this differently.
Thanks.
I imagine the "using CDN when performing tree-shaking" discussion could be a fairly involved one...
This answer assumes that the break happens only when you switch to the CDN, and that your app is served properly otherwise; are you sure your app is actually being served?
Angular uses the History API, and because the origin is different, the browser treats the call as a security issue.
Assuming external, cacheable libraries on a CDN really should be included in your bundle, you can try a couple of the options listed below. I would revisit that assumption first: the point of webpack is to tree-shake and then bundle, so make sure you are convinced of the benefit of re-bundling files that are already cached and bundled on the CDN.
If you are sure you want to take this approach, you can do a couple of things. You can either use HashLocationStrategy on your Angular router:
@NgModule({
  imports: [
    RouterModule.forRoot(routes, { useHash: true })
  ]
})
export class AppModule {}
Or you can fully host your built app, for example with Node's http-server, and play with the base href / base URL in your app's index.html.
Finally, if none of that works for you, you could try using a .htaccess file or doing some manual routing, but I have found those approaches to be a little brittle.
There are a couple of pre-existing questions on SO that handle this issue somewhat, but I thought the CDN wrinkle warranted a fresh response.
Angular 5 : Failed to execute 'replaceState' on 'History': A history state object with URL cannot be created in a document with origin 'null'
ng build failed to execute 'replaceState' on 'History': A history state object with URL cannot be created in a document with origin 'null' and URL
How to perform redirects in NodeJS like a .htaccess file?
Maybe a better answer is to ask this question: if your libraries never change and you ship a version update of your app, why would you force your users to re-download those libraries?

Developing Less CSS locally without a server; "no sheets were loaded"

So I'm trying to extract my own CSS framework from my projects so I can develop it separately.
I have my index.html with less.js, and I included my main .less file, which @imports a dozen other files...
However, in the console I get an error for each of my less files:
Resource interpreted as Stylesheet but transferred with MIME type text/plain: "file:///path/to/project/src/file.less".
And a final line that says Less has finished and no sheets were loaded.
So I understand this to mean they aren't being served correctly. I normally use Node.js/Express, but I don't want to include all of that in my repo just to develop some CSS. How do I get around this?
I thought about using some Node package like serve for development, but I feel like this shouldn't be necessary... unless I'm wrong?
EDIT: here's my repo, https://github.com/kenmorechalfant/framewerk
Don't use file://. You're going to run into a mountain of cross-origin errors. Just use a quick static HTTP server; I use http-server with Node, and most IDEs have one built in. Avoiding a static HTTP server is more trouble than it's worth unless you have a good reason not to use one.
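If you would rather not install anything globally, npx http-server . will do the job. And just to illustrate what "a quick static HTTP server" means here, a minimal hand-rolled version in Node (a sketch, not production code) looks something like this:

// serve.js - tiny static file server so .less files load over http:// instead of file://
const http = require('http');
const fs = require('fs');
const path = require('path');

const types = {
  '.html': 'text/html',
  '.js': 'text/javascript',
  '.css': 'text/css',
  '.less': 'text/css' // close enough for less.js to accept the sheet
};

http.createServer((req, res) => {
  const urlPath = req.url === '/' ? '/index.html' : req.url.split('?')[0];
  const file = path.join(__dirname, urlPath);
  fs.readFile(file, (err, data) => {
    if (err) {
      res.writeHead(404);
      return res.end('Not found');
    }
    res.writeHead(200, { 'Content-Type': types[path.extname(file)] || 'application/octet-stream' });
    res.end(data);
  });
}).listen(8080, () => console.log('Serving on http://localhost:8080'));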

How can I avoid serving too many files on Karma server

I am using Karma as my unit test runner. Generally speaking it is awesome, but I am facing an annoying problem.
The website I am developing is based on a quite large UI library with about 40,000 files. All the resources from the library are loaded on demand, so I have to register them all as 'served' files on the Karma server. As a result I frequently hit the EMFILE error, which means too many files are open at the same time.
Of course I can raise the open-file limit with ulimit -n, but that is not as graceful as I would like, and loading tens of thousands of files is really time-consuming. Furthermore, I will be working on a shared machine where I do not have root access.
I am currently running a separate HTTP server and letting the Karma server proxy all the lib files to it, but that is only a temporary solution.
So I am wondering: is there any way to avoid including all the library files as 'served'?
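For context, here is a sketch of the kind of karma.conf.js setup being described (the paths and proxy port are illustrative, not taken from the question):

// karma.conf.js (sketch)
module.exports = function (config) {
  config.set({
    frameworks: ['jasmine'],
    files: [
      'src/**/*.spec.js',
      // the approach from the question: register the whole library as served-only files
      // (this is what runs into the EMFILE limit)
      { pattern: 'lib/**/*', included: false, served: true, watched: false }
    ],
    // the temporary workaround: drop the pattern above and instead proxy
    // library requests to a separate static server
    proxies: {
      '/lib/': 'http://localhost:9000/lib/'
    }
  });
};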

How to combine JS/CSS for Amazon S3?

On a regular VPS site I would use Minify to compress and combine multiple CSS/JS files so the site only uses 1 or 2 HTTP requests. The site I'm working on now has its CSS/JS files hosted on Amazon S3 and served through Amazon CloudFront. Obviously Minify is a PHP 5 app and not something I can run on S3/CloudFront.
I can compress the scripts easily before uploading but what's the best way to combine scripts on AWS S3 to reduce HTTP requests?
http://code.google.com/p/minify/
Minify combines and minifies JS/CSS files on the fly.
S3 and CloudFront serve static files, so you'll have to combine and minify them yourself before you upload. It's easy enough: concatenate the files and minify with YUI Compressor or Google Closure Compiler (two free, cross-platform command-line minifiers).
It's usually convenient to have a script or build step that does this, something like:
#!/bin/bash
# concatenate the sources and minify the combined stream with YUI Compressor
cat a.js b.js c.js | java -jar yuicompressor-1.4.2.jar --type js -o output.min.js
On Windows, another excellent option is Microsoft's Ajax Minifier.
When CloudFront receives a cold-cache hit, it requests the content from the distribution's configured origin server; in most cases an S3 bucket is configured as the origin. So the easiest way to combine and minify your JS and CSS is to do it as part of your build/deployment process and store the result in S3.
If you really want to minify on the fly, you can configure CloudFront to use a "Custom Origin". With that distribution configuration, cold-cache hits are requested from your server running Minify.
See the CloudFront documentation on creating distributions for details.
If you plan to serve static content from S3/CloudFront, I would recommend compressing your content ahead of time. Personally, I use Juicer. Once you've done that, you can gzip -9 your production files and upload them to S3 with a Content-Encoding: gzip header.
The problem with compressing on the fly is the performance hit your site takes. CloudFront custom-origin support alleviates this a bit, but it is really easy to automate your deployments with a tool like Capistrano that does this work for you. This is the approach I take myself.
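As a hedged sketch of that deployment step in Node (the bucket, key, and file names are placeholders; it assumes the aws-sdk v2 package and credentials are already set up), gzip the built file ahead of time and set Content-Encoding on upload:

// deploy.js - gzip a built asset and upload it to S3 with the right headers
const fs = require('fs');
const zlib = require('zlib');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();
const gzipped = zlib.gzipSync(fs.readFileSync('dist/output.min.js'), { level: 9 });

s3.putObject({
  Bucket: 'my-static-assets',        // placeholder bucket name
  Key: 'js/output.min.js',
  Body: gzipped,
  ContentEncoding: 'gzip',           // browsers decompress transparently
  ContentType: 'application/javascript',
  CacheControl: 'max-age=31536000'   // long cache, since CloudFront sits in front
}, (err) => {
  if (err) throw err;
  console.log('uploaded');
});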
CloudFront now supports gzip compression natively; see the announcement "New – Gzip Compression Support for Amazon CloudFront".
Enabling Gzip Compression
You can enable this feature in a minute! Simply open the CloudFront console, locate your distribution, and set Compress Objects Automatically to Yes in the Behavior options.
