I have to deploy a heavily JS-based project to an embedded device. Its disk size is no more than 16 MB. The problem is that the size of my minified JS file, all-classes.js, is about 3 MB. If I compress it using gzip, I get a 560 KB file, which saves about 2.4 MB. Now I want to store all-classes.js as all-classes.js.gz so I can save space, and the browser can decompress it just fine. All I have to do is handle the headers.
Now the question is: how do I include the .gz file so the browser understands and decompresses it? I am aware that a .gz file contains file structure information, while the browser accepts only the raw gzipped data, so I would like to store just the raw gzipped data. It'd be some sort of caching!
What you need to do, when the "all-classes.js" file is requested, is to return the content of "all-classes.js.gz" with the additional "Content-Encoding: gzip" HTTP header.
But this is only possible if the request contained the "Accept-Encoding: gzip" HTTP header in the first place...
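For illustration, here is a minimal sketch of that logic as a tiny Node.js handler (an assumption for the example, since the question doesn't say what serves the files; the file names are taken from the question):

const http = require("http");
const fs = require("fs");

http.createServer((req, res) => {
  if (req.url === "/all-classes.js") {
    // Only send the pre-compressed file if the client advertises gzip support.
    if ((req.headers["accept-encoding"] || "").includes("gzip")) {
      res.writeHead(200, {
        "Content-Type": "application/javascript",
        "Content-Encoding": "gzip",
      });
      fs.createReadStream("all-classes.js.gz").pipe(res);
    } else {
      // Fallback for clients that did not send Accept-Encoding: gzip.
      res.writeHead(200, { "Content-Type": "application/javascript" });
      fs.createReadStream("all-classes.js").pipe(res);
    }
    return;
  }
  res.writeHead(404);
  res.end();
}).listen(8080);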
Background:
I have a small RaspberryPi-like server on Armbian (20.11.6) (precisely, an Odroid XU4).
I use lighttpd to serve pages (including Home Assistant and some statistics and graphs with Chart.js).
(the example file here is Chart.bundle.min.js.gz)
Issue:
There seems to be a growing number of JavaScript files, which have become larger than the HTML and the data itself (some numbers for power/gas consumption etc.). I am used to using mod_compress, mod_deflate etc. on servers (to compress files on the fly), but this would kill the Odroid (or unnecessarily load the CPU and the pitiful SD card used for caching).
Idea:
Now, the idea is just to compress the JavaScript (and other static files, like CSS) and serve them as static gzip files, which any modern browser can/should handle.
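For example, something like this (assuming GNU gzip; -k keeps the original file alongside the compressed one):

gzip -k -9 Chart.bundle.min.js   # produces Chart.bundle.min.js.gz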
Solution 0:
I just compressed the files and hoped that the browser would understand them...
(Clearly the link was placed in the script HTML tag, so if the browser figured out that .gz is gzip... it might have worked.)
It did not ;)
Solution 1:
I enabled mod_compress (as suggested on multiple pages) and tried to serve the static js.gz file.
https://www.drupal.org/project/javascript_aggregator/issues/601540
https://www.cyberciti.biz/tips/lighttpd-mod_compress-gzip-compression-tutorial.html
Without success (the browser takes it as binary gzip, not as application/javascript).
(Some pages suggested enabling mod_deflate, but it does not seem to exist.)
Solution 2:
(mod_compress kept on) I did the above and started fiddling with the Content-Type and Content-Encoding in the HTML (in the script tag). This did not work at all: the Content-Type can be somewhat influenced from HTML, but it seems that the Content-Encoding cannot.
https://www.geeksforgeeks.org/http-headers-content-type/
(I do not install PHP (which could do it), to save memory, SD card lifetime, etc.)
Solution 3:
I added "Content-Encoding" => "gzip" line to the 10-simple-vhost.conf default configuration file in the setenv.add-response-header. This looked as a dirty crazy move, but I wanted to check if the browser accepts my js.gz file... It did not.
And furthermore nothing loaded at all.
Question:
What would be an easy way to do this (without PHP)?
Maybe something like .htaccess in Apache?
EDIT 1:
It seems that nginx can do it out-of-the-box:
Serve static gzip files using node.js
http://nginx.org/en/docs/http/ngx_http_gzip_static_module.html
I am also digging into the headers story in lighttpd:
https://community.splunk.com/t5/Security/How-to-Disable-http-response-gzip-encoding/m-p/64396
EDIT 2:
Yes... after some thinking, I came to the conclusion that this file could be cached for a long time anyway, so maybe I should not care so much :)
Your solution (below) to set the response header is a workable one for your situation.
However, I would recommend using lighttpd mod_deflate with deflate.cache-dir (lighttpd 1.4.56 and later)
When configured properly, lighttpd will serve gzipped Content-Encoding to clients which support the compression, and lighttpd will serve plain content to clients which do not support the compression. lighttpd will compress each file as it is served and will save the compressed file in deflate.cache-dir so that lighttpd does not have to re-compress the file the next time the file is requested. lighttpd will detect if the original file has changed and will re-compress it into the cache the next time the file is requested.
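For reference, a minimal sketch of such a configuration (the directive names follow the lighttpd mod_deflate documentation; the cache path is an assumption and must exist and be writable by lighttpd):

server.modules += ( "mod_deflate" )
deflate.cache-dir = "/var/cache/lighttpd/compress/"
deflate.mimetypes = ( "text/html", "text/css", "application/javascript" )
deflate.allowed-encodings = ( "gzip" )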
It seems that I was writing the question for so long that I was near the solution.
I created a module file, 12-static_gzip.conf, with the following content:
$HTTP["url"] =~ ".gz" {
setenv.add-response-header = (
"Content-Encoding" => "gzip"
)
}
I have not found any similar trick documented for lighttpd, so I applied the kind of solution I would use for Apache. The expected behavior was that it would just add the Content-Encoding response header for the .gz files, without using PHP or any additional modules... and it works!
The mod_compress module, and any other module of this kind, is disabled, and no other changes are needed.
Clearly, HTTP negotiation is more complex than this, so I am not sure whether it will work for all browsers, but it surely works very nicely in Chrome.
I am also planning to create some ESP32 web servers, where storage and memory are even more critical, so I will try to apply a similar solution there.
Nevertheless, the questions still hold...
Is there a better/cleaner solution?
Are there any caveats to be expected? Browser compatibility etc.?
I have simple file upload functionality in place using Knockout, in my Durandal website. I upload the file to the server by converting it to a base64StringArray, then uploading it using an AJAX POST method, i.e.
$.post("localhost/uploadDocument", dataToPost)
I have the following request filtering in place in my application:
<requestLimits maxAllowedContentLength="31457280" />
and
<httpRuntime targetFramework="4.5.2" maxRequestLength="30720" />
So I have about a 30 MB file limit.
The problem I am having is with a specific Microsoft Excel file, which includes some embedded PDF files. This file is 14,887,424 bytes, but when I upload it through my application, Fiddler shows that 49,158,346 bytes were sent, and therefore I receive a 404.13 error, where the request is denied for exceeding the request content length.
Why are so many bytes being sent for this one Excel file with embedded PDF files?
I would compress the string client-side using something like:
http://rosettacode.org/wiki/LZW_compression#JavaScript
and then, on the server side, decompress it and perform whatever validation you might be doing to check file size.
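For illustration, a minimal sketch of the linked LZW approach (the function name and the usage lines are hypothetical; the server would need a matching decompressor):

// Compress a string into an array of integer codes (classic LZW,
// adapted from the Rosetta Code approach linked above).
function lzwCompress(input) {
  const dict = new Map();
  for (let i = 0; i < 256; i++) dict.set(String.fromCharCode(i), i);
  let phrase = "";
  let nextCode = 256;
  const output = [];
  for (const ch of input) {
    const combined = phrase + ch;
    if (dict.has(combined)) {
      phrase = combined;               // keep extending the current phrase
    } else {
      output.push(dict.get(phrase));   // emit the code for the known phrase
      dict.set(combined, nextCode++);  // learn the new phrase
      phrase = ch;
    }
  }
  if (phrase !== "") output.push(dict.get(phrase));
  return output;
}

// Hypothetical usage before the AJAX post:
// const dataToPost = JSON.stringify(lzwCompress(base64String));
// $.post("localhost/uploadDocument", dataToPost);

Base64 text is ASCII-only, which is exactly the case where a simple string-based LZW like this is safe.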
I am trying to gzip some of my larger files and have taken the initial approach described here:
Can someone walk me through serving gzipped files from Cloudfront via S3 origin?
Gzip the file, remove the .gz extension, and then set the correct HTTP headers.
Now this all works, but I am told some browsers won't support the gzipped files, especially on mobile.
So I want to do the approach where my folder contains both i.e.
myfile.js.gz &
myfile.js
And store them in my S3 bucket; CloudFront should, as far as I am aware, select the correct file based on what the browser supports?
But if I have both in the folder, the larger myfile.js gets selected every time.
Does anyone know what I am missing here?
Do you have to use JavaScript necessarily? I'm not sure about the CloudFront part, but for gzipping you can also use a Python Lambda function. With the gzip and zipfile libraries, it could be pretty easy:
import gzip
import io

# zipped is an already-open zipfile.ZipFile; file is a member name inside it
with zipped.open(file, "r") as f_in:
    gzipped_content = gzip.compress(f_in.read())

# destinationbucket is a boto3 S3 Bucket resource; ContentEncoding marks the
# object as gzipped so it is served with a Content-Encoding: gzip header
destinationbucket.upload_fileobj(io.BytesIO(gzipped_content),
                                 final_file_path,
                                 ExtraArgs={"ContentType": "text/plain",
                                            "ContentEncoding": "gzip"})
There's a full tutorial for the lambda function here: https://medium.com/p/f7bccf0099c9
Let's assume I have a file on a CDN (Cloud Files from Rackspace) and a static HTML page with a link to that file. Is there any way I can force this file to download (to prevent it from opening in the browser, for MP3s for example)?
We could make our server read the file and set the corresponding header to:
header("Content-Type: application/force-download")
but we have about 5 million downloads per month so we would rather let the CDN take care of that.
Any ideas?
There's no way to do this in HTML or JavaScript. There is now! (Ish. See @BruceAldrige's answer below.)
The HTTP Content-Disposition header is what tells browsers to download files, and it's sent by the server. You have to configure the CDN to send that header with whichever files you want the browser to download instead of display.
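For example, a response containing a header like this (with a hypothetical filename; the filename parameter is optional) will be downloaded rather than displayed:

Content-Disposition: attachment; filename="song.mp3"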
Unhelpfully, I'm entirely unfamiliar with Rackspace's Cloud Files service, so I don't know if they allow this, nor how to do it. I just found a page from December 2009 that suggests not, though, sadly:
Cloud Files cannot serve a file with the 'Content-Disposition: attachment' HTTP header. Therefore, a download link that would work perfectly in any other service may result in the browser rendering the file directly. This was confirmed by Rackspace engineers. :-(
http://drupal.org/node/656714
I know that you can with Amazon’s CloudFront service, as it’s backed by S3 (see e.g. http://blog.cloudberrylab.com/2009/06/how-to-set-custom-http-headers-for.html)
You can use the download attribute:
<a href="http..." download></a>
https://stackoverflow.com/a/11024735/21460
However, it’s not currently supported by Safari (7) or IE (11).
Yes, you can do this through the Cloud Files API. Using the stream method allows you to stream the contents of files in, setting your own headers etc.
A crazy idea: download via XMLHttpRequest and serve a data: URL with the content type you want? :P
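A rough sketch of that idea (hedged: this uses the modern fetch/FileReader APIs, the URL is hypothetical, and the CDN would need to allow cross-origin requests):

// Fetch the file, convert it to a data: URL, and trigger a download.
fetch("https://cdn.example.com/song.mp3")   // hypothetical CDN URL
  .then((res) => res.blob())
  .then((blob) => {
    const reader = new FileReader();
    reader.onload = () => {
      const a = document.createElement("a");
      a.href = reader.result;    // data: URL carrying the file contents
      a.download = "song.mp3";   // forces download in supporting browsers
      a.click();
    };
    reader.readAsDataURL(blob);
  });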
I have an html file saved in gzip format. The browser displays the html file but without the javascript and CSS. Non-zipped html files in the same directory do display correctly. In addition, I saved the source from the compressed html file and it reopened correctly, with JS and CSS applied.
What is different about displaying the zipped html that would not allow it to pick up the JS and CSS?
The basic problem is that you can't just serve a gzip file where the browser expects CSS. By itself, that does not work any more than if you returned a JPEG or a ham sandwich.
When content gets zipped on the fly, the response is somewhat different: the response says "I am text/css, but I happen to be encoded with gzip for transfer". The browser can figure that one out.
Some web servers like Apache will do that sort of thing for you if you supply gzipped files locally, too. But I imagine your server isn't.
Why does it work for HTML? Hmm, I don't know, maybe your browser actually manages to figure it out in that particular case?
What you ultimately want to do is serve the response with Content-Type: text/css and Content-Encoding: gzip to have it recognized correctly.
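For example, the start of a correct response for a gzipped stylesheet would look something like:

HTTP/1.1 200 OK
Content-Type: text/css
Content-Encoding: gzip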
If you're working on localhost on your own server (like XAMPP), then you need to configure the .htaccess file to send the right headers, saying that the files might be gzipped.
Try adding this to your main .htaccess file:
AddEncoding x-gzip .gz
AddType text/html .gz
and make sure your gzip-compressed files end with the .gz extension.
Also, always run this on a server.. :)