Does React production build automatically compress video and image files? - javascript

I have created a React application which is currently deployed with the production build.
yarn run build
serve -S build
I know it minifies the .js and .scss files, creates a build folder, and then serves it.
The issue is: does it compress the image & video files as well? If not, how can I do that? The project I am working on has a lot of image & video files, which impacts performance heavily and makes the page load very slowly.
Kindly help with this...

React doesn't do that. It is just a framework for development.
It is the web server's responsibility to handle it, i.e. serve in your case.
But even then, it is common practice for web servers not to apply additional compression to assets (fonts, images, videos, audio, etc.), because they are already compressed in the first place.
Take images as an example: common file formats like JPG, PNG, and WEBP are already compressed. Unless you are serving BMP or RAW, which you shouldn't, there is no point in the web server applying any further compression to them.
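If the image files themselves are too heavy, the usual fix is to compress them before (or as part of) the build. As one possible approach, not something Create React App does for you, a small Node script using the imagemin package (older CommonJS versions of the packages assumed) could look like this; the paths, plugins, and quality settings are only placeholders:

// compress-images.js -- a minimal sketch, assuming imagemin, imagemin-mozjpeg
// and imagemin-pngquant are installed; all paths are placeholders.
const imagemin = require('imagemin');
const imageminMozjpeg = require('imagemin-mozjpeg');
const imageminPngquant = require('imagemin-pngquant');

(async () => {
  const files = await imagemin(['src/assets/images/*.{jpg,png}'], {
    destination: 'src/assets/images', // overwrite in place so the build picks them up
    plugins: [
      imageminMozjpeg({ quality: 75 }),
      imageminPngquant({ quality: [0.6, 0.8] }),
    ],
  });
  console.log(`Compressed ${files.length} images`);
})();

For video, similarly re-encode the files ahead of time (e.g. with a tool like FFmpeg) or serve them from a dedicated host/CDN, rather than expecting the build or web server to do it.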

Related

preloading images and fonts conflict with code splitting

I'm using React 17+ with CRACO (to override the Create React App build configuration).
Similar to this post, when I try to preload some images/fonts in my index.html with:
<link rel='preload' as='image' href='assets/images/fooImage.png' crossorigin/>
I've found that preloading does not work: the preload link needs the exact image/font name in the assets directory, but with code splitting, after building, my images and fonts are renamed like this:
As a result, preloading still does not work. Any help would be appreciated.
What is the problem?
The module bundler changes the asset names (appending hash strings to them) in the built version,
so Chrome Lighthouse (or other performance testers) reports that those assets can't be found at the URLs given in the preload links (in index.html or any related file).
So, how do I fix it?
Ejecting, or switching to another module bundler such as plain Webpack and changing its configuration, would certainly solve the issue, but like many projects I can't do that.
The standard solution to the naming problem:
Move all of the project's static assets, such as fonts and images, out of the front-end project (client side) into a public folder on my server and use their URLs in the client-side project.
Now they can easily be preloaded (or used any other way).
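As a rough sketch of that idea (not part of the original answer): once the files live at stable public URLs that the bundler does not rename, the preload link can stay in index.html as before, or be added from JavaScript; the paths below are placeholders:

// Preload assets whose URLs are stable because they are served as-is
// from a public folder, untouched by the bundler's hashing.
const preload = (href, as) => {
  const link = document.createElement('link');
  link.rel = 'preload';
  link.as = as;
  link.href = href;
  if (as === 'font') link.crossOrigin = 'anonymous'; // fonts must be requested with CORS
  document.head.appendChild(link);
};

preload('/assets/images/fooImage.png', 'image');
preload('/assets/fonts/fooFont.woff2', 'font');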

why use module loading in the backend?

I have been trying to find an answer to why webpack cares about module loading on the backend. Is there a reason why this may be needed?
Does JSPM do backend module loading as well?
Assuming your first question is along the lines of "Why pre-bundle JavaScript code for the client?"
There are many reasons for module bundling. A few:
Simple file aggregation: Bundling related code makes many tasks easier and more intuitive. Instead of deploying a large directory tree of files, after bundling those files may be reduced to a single bundle file.
Loading performance: Individually loading dependencies that live in separate files on the client side has historically been very slow. Each file must be parsed and evaluated separately and, depending on the module system used, may incur considerable delay while waiting for dependencies to be discovered and loaded.
Media type abstraction: Bundlers typically provide ways to bundle non-JavaScript content. Including assets like images and stylesheets this way is convenient and encourages explicit, clear dependencies in the parts of your application that use them.
Tree shaking: By analyzing the dependencies among modules and code, it's often possible to include only what the application actually needs and reduce the overall size of your code base. This isn't inherently a characteristic of bundling, but is commonly done because there is some notion of a build step. (A minimal bundler configuration illustrating these points is sketched below.)
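For concreteness, a minimal webpack configuration along these lines might look as follows. This is only a sketch: the entry path, output name, and asset rule are placeholders, and production mode is what turns on minification and tree shaking.

// webpack.config.js -- minimal sketch; paths and names are placeholders
const path = require('path');

module.exports = {
  mode: 'production',       // enables minification and tree shaking
  entry: './src/index.js',  // single entry whose dependency graph is bundled
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js',  // one file to deploy instead of a directory tree
  },
  module: {
    rules: [
      // media type abstraction: JS modules can import images directly (webpack 5)
      { test: /\.(png|jpe?g|gif|svg)$/i, type: 'asset/resource' },
    ],
  },
};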
Regarding your second question:
JSPM does offer this functionality. It can be done on the command line with the jspm bundle command.
The simplest reason is performance. Opening and closing a file is slower than streaming its contents, so the fewer open and close operations there are, the faster the server can send the requested files. By reducing the number of files that make up a JavaScript/web project, the browser finishes fetching them sooner and can start processing them for the end user.
What a good build process can do for your web project goes beyond simply concatenating all your JS files: tools such as JSPM can also bring CSS and HTML files together into one bundle.js file, further improving the end-user experience.

How can I avoid serving too many files on Karma server

I am using Karma as my unit test runner. Generally speaking, it is awesome, but I am facing an annoying problem.
The website I am developing is based on a quite large UI library with about 40,000 files. All the resources from the library are loaded on use, so I have to set them all as 'served' files on the Karma server. As a result I frequently hit the EMFILE error, which means too many files are open at the same time.
Of course I can raise the open-file maximum with ulimit -n, but this is not as graceful as I would like, and loading tens of thousands of files is really time-consuming. Furthermore, I am going to work on a shared machine where I do not have root access.
I am currently using another HTTP server and letting the Karma server proxy all the lib files to it, but that is only a temporary solution.
So I am wondering if there is any way to avoid including all the library files as 'served'?
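For context, the proxy workaround described in the question might look roughly like this in karma.conf.js; the framework, port, and paths are placeholders for whatever the separate static server actually uses:

// karma.conf.js -- sketch of proxying library assets to a separate static server
module.exports = function (config) {
  config.set({
    frameworks: ['jasmine'],
    files: [
      'test/**/*.spec.js',  // only the test files are served by Karma itself
    ],
    proxies: {
      // requests to /lib/ are forwarded to a separately running http-server,
      // so Karma never has to open the ~40,000 library files
      '/lib/': 'http://localhost:9000/lib/',
    },
    browsers: ['Chrome'],
  });
};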

Install files to sdcard on Android/IOS with phonegap

I have some files that I need to copy to a folder once the user has installed my app. I was trying to find a way to transfer the files directly from the application root directory, but had no luck. I have also seen a suggestion about zipping the files, including the zip file in my package, and then extracting it to the folder, but I did not know how to implement either method.
Another alternative is to download and install the files when the app starts, but I didn't want to rely on having to maintain the files on a server, or on the user always having internet access, since my app can be run offline.
How do I do this with a Phonegap/JQM/Javascript app?
Thanks,
Robert
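As a rough sketch of the first approach (copying files packaged with the app out to device storage), the cordova-plugin-file plugin exposes the application directory and external storage paths. The file names and target directory below are assumptions, and externalDataDirectory only exists on Android:

// Sketch: copy a file shipped inside the app package to external (sdcard) storage.
// Requires cordova-plugin-file; file names and paths are placeholders.
document.addEventListener('deviceready', function () {
  var source = cordova.file.applicationDirectory + 'www/assets/data.db';
  var targetDir = cordova.file.externalDataDirectory; // Android-only; null on iOS

  window.resolveLocalFileSystemURL(source, function (fileEntry) {
    window.resolveLocalFileSystemURL(targetDir, function (dirEntry) {
      fileEntry.copyTo(dirEntry, 'data.db',
        function () { console.log('copy finished'); },
        function (err) { console.error('copy failed', err); });
    });
  });
}, false);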

How to combine JS/CSS for Amazon S3?

On a regular VPS site I would use Minify to compress and combine multiple CSS/JS files so the site only uses 1 or 2 HTTP requests. A site I'm working on now has its CSS/JS files hosted on Amazon S3 and served through Amazon CloudFront. Obviously Minify is a PHP5 app and not able to run on S3.
I can compress the scripts easily before uploading, but what's the best way to combine scripts on AWS S3 to reduce HTTP requests?
http://code.google.com/p/minify/
Minify combines and minifies JS/CSS on the fly.
S3 and CloudFront serve static files - you'll have to combine and minify them yourself before you upload. It's easy enough: concatenate the files together and minify them with YUI Compressor or Google Closure Compiler (two free, cross-platform, command-line minifiers).
It's usually convenient to have a script or build step that does this, something like:
#!/bin/bash
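# concatenate the scripts, then minify the combined output with YUI Compressor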
cat a.js b.js c.js | java -jar yuicompressor-1.4.2.jar --type js -o output.min.js
On Windows, another excellent option is Microsoft's Ajax Minifier.
When CloudFront gets a cold-cache hit, it requests the content from the distribution's configured origin server. In most cases an S3 bucket is configured as the origin. So the easiest way to combine and minify your JS and CSS is to do it as part of your build/deployment process and store the result in S3.
If you really want to minify on the fly, you can configure CloudFront to use a "Custom Origin". With that distribution configuration, cold-cache hits would be requested from your own server running Minify.
See the CloudFront documentation on creating distributions for details.
If you plan to serve static content from S3/CloudFront, I would recommend compressing your content ahead of time. Personally, I use Juicer. Once you've done that, you can gzip -9 your production files, and upload them to S3 with a Content-Encoding: gzip header.
The problem with compressing on the fly is the performance hit your site takes. CloudFront custom-origin support alleviates this a bit, but it would be really easy to automate your deployments with a tool like Capistrano that does this work for you. This is the approach I take, myself.
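To illustrate the pre-gzipped upload (a sketch only, not part of the original answer), a short Node script using the AWS SDK can compress a minified bundle and set the Content-Encoding header on the S3 object; the bucket name, key, and file paths are placeholders:

// Sketch: gzip a minified bundle and upload it to S3 with Content-Encoding: gzip.
// Bucket, key, and paths are placeholders; assumes aws-sdk v2 and valid credentials.
const fs = require('fs');
const zlib = require('zlib');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();
const gzipped = zlib.gzipSync(fs.readFileSync('output.min.js'), { level: 9 });

s3.putObject({
  Bucket: 'my-static-assets-bucket',
  Key: 'js/output.min.js',
  Body: gzipped,
  ContentType: 'application/javascript',
  ContentEncoding: 'gzip',          // tells browsers the body is gzip-compressed
  CacheControl: 'max-age=31536000', // long cache; CloudFront serves it from the edge
}, (err) => {
  if (err) throw err;
  console.log('uploaded');
});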
New – Gzip Compression Support for Amazon CloudFront; check here.
Enabling Gzip Compression
You can enable this feature in a minute! Simply open the CloudFront console, locate your distribution, and set Compress Objects Automatically to Yes in the Behavior options.
