Background:
I have a small Raspberry-Pi-like server running Armbian (20.11.6) (specifically, an Odroid XU4).
I use lighttpd to serve pages (including Home Assistant and some statistics and graphs built with Chart.js).
(the example file here is Chart.bundle.min.js.gz)
Issue:
There seems to be a growing number of JavaScript files, which are becoming larger than the HTML and the data itself (some numbers for power/gas consumption etc.). I am used to using mod_compress, mod_deflate etc. on servers (to compress files on the fly), but this would kill the Odroid (or at least unnecessarily load the CPU and the pitiful SD card used for caching).
Idea:
Now, the idea is to just compress the JavaScript (and other static files, like CSS) and serve them as static gzip files, which any modern browser can/should handle.
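(For reference, the pre-compression itself is a one-liner; the -k flag, which keeps the uncompressed original alongside, is an assumption that you want both files around:)
gzip -k -9 Chart.bundle.min.js   # produces Chart.bundle.min.js.gz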
Solution 0:
I just compressed the files and hoped that the browser would understand them...
(The link was in the script tag, so if the browser figured out that .gz means gzip... it might have worked.)
It did not ;)
Solution 1:
I enabled mod_compress (as suggested on multiple pages) and tried to serve the static js.gz file.
https://www.drupal.org/project/javascript_aggregator/issues/601540
https://www.cyberciti.biz/tips/lighttpd-mod_compress-gzip-compression-tutorial.html
Without success (the browser treats it as a binary gzip file, not as application/javascript).
(Some pages suggested enabling mod_deflate instead, but it does not seem to exist here.)
Solution 2:
(With mod_compress still enabled) I did the above and started fiddling with Content-Type and Content-Encoding in the HTML (in the script tag). This did not work at all: the Content-Type can be influenced somewhat from HTML, but it seems that the Content-Encoding cannot.
https://www.geeksforgeeks.org/http-headers-content-type/
(I do not install PHP (which could do it) to save memory, SD card lifetime, etc.)
Solution 3:
I added a "Content-Encoding" => "gzip" line to setenv.add-response-header in the default 10-simple-vhost.conf configuration file. This looked like a dirty, crazy move, but I wanted to check whether the browser would accept my js.gz file... It did not.
Furthermore, nothing loaded at all.
Question:
What would be an easy way to do this (without PHP)?
Maybe something like .htaccess in Apache?
EDIT 1:
It seems that nginx can do it out of the box:
Serve static gzip files using node.js
http://nginx.org/en/docs/http/ngx_http_gzip_static_module.html
I am also digging into the headers story in lighttpd:
https://community.splunk.com/t5/Security/How-to-Disable-http-response-gzip-encoding/m-p/64396
EDIT 2:
Yes... after some thinking, I realized that this file could be cached for a long time anyway, so maybe I should not care so much :)
Your solution (below) to set the response header is a workable one for your situation.
However, I would recommend using lighttpd's mod_deflate with deflate.cache-dir (lighttpd 1.4.56 and later).
When configured properly, lighttpd will serve gzipped Content-Encoding to clients which support the compression, and plain content to clients which do not. lighttpd will compress each file as it is served and will save the result in deflate.cache-dir, so that it does not have to re-compress the file the next time it is requested. lighttpd will detect if the original file has changed and will re-compress it into the cache the next time the file is requested.
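A minimal sketch of that setup (directive names per the lighttpd 1.4.56+ mod_deflate documentation; the cache path and MIME-type list are assumptions for this use case):
server.modules += ( "mod_deflate" )
# cache compressed output so each file is compressed only once
deflate.cache-dir = "/var/cache/lighttpd/deflate/"   # assumed path; must be writable by lighttpd
deflate.mimetypes = ( "text/html", "text/css", "application/javascript" )
deflate.allowed-encodings = ( "gzip" )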
It seems that I was writing the question for so long that I was nearly at the solution.
I created a module file, 12-static_gzip.conf, with the following content:
$HTTP["url"] =~ ".gz" {
setenv.add-response-header = (
"Content-Encoding" => "gzip"
)
}
I had not found any similar trick documented for lighttpd, so I applied the approach I would use with Apache. The expected behavior was that lighttpd would simply add the Content-Encoding response header for the .gz files, without PHP or any additional modules... and it works!!!
The mod_compress module (or any other of this kind) is disabled, and no other changes are needed.
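One possible refinement (an untested sketch on my side): pin the Content-Type for the double extensions too, so that lighttpd does not fall back to its generic mapping for .gz files:
# assumed refinement of 12-static_gzip.conf, not verified
$HTTP["url"] =~ "\.js\.gz$" {
    mimetype.assign = ( ".js.gz" => "application/javascript" )
}
$HTTP["url"] =~ "\.css\.gz$" {
    mimetype.assign = ( ".css.gz" => "text/css" )
}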
Clearly, HTTP content negotiation is more complex than this, so I am not sure whether it will work for all browsers, but it certainly works very nicely in Chrome.
I am also planning to create some ESP32 web servers, where storage and memory are even more critical, so I will try to apply a similar solution there.
Nevertheless, the questions still hold...
Is there a better/cleaner solution?
Are there caveats to be expected? Browser compatibility etc.?
Related
I am trying to gzip some of my larger files and have taken the initial approach like here:
Can someone walk me through serving gzipped files from Cloudfront via S3 origin?
Gzip the file, remove the .gz extension, and then set the correct HTTP headers.
Now this all works, but I am told some browsers won't support the gzipped files, especially on mobile.
So I want to take the approach where my folder contains both, i.e.
myfile.js.gz &
myfile.js
and store them in my S3 bucket, and CloudFront should, as far as I am aware, select the correct file based on which one the browser supports?
But if I have both in the folder, the larger myfile.js gets selected each time.
Does anyone know what I am missing here?
Do you necessarily have to use JavaScript? I am not sure about the CloudFront part, but for the gzipping you could also use a Python Lambda function. With the gzip and zipfile libraries, it can be pretty easy:
import gzip
import io

# "zipped" is an already-open zipfile.ZipFile, "file" a member name inside it,
# and "destinationbucket" a boto3 S3 Bucket
with zipped.open(file, "r") as f_in:
    gzipped_content = gzip.compress(f_in.read())
    destinationbucket.upload_fileobj(
        io.BytesIO(gzipped_content),
        final_file_path,
        # Content-Encoding is needed so browsers decompress transparently
        ExtraArgs={"ContentType": "text/plain", "ContentEncoding": "gzip"},
    )
There's a full tutorial for the lambda function here: https://medium.com/p/f7bccf0099c9
I had an idea on how to vastly speed up my website and ease the load on my server for cached objects, but I'm wondering if this is an extremely stupid idea before I spend all day thinking about it! Hear me out -
The localStorage object has a minimum of 2.5 MB (in Chrome; more here), which is good enough for most small sites. Which is why I am wondering: for small, static websites, would it be a good idea to basically save the entire site in a huge localStorage entry and load the site from the visitor's own computer each time? To better explain my idea, here's some pseudo-code:
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<!--other head information-->
</head>
<body>
<script>
if (localStorage.getItem('cached_site') !== null) {
    // restore the whole page from the cached copy
    // (no need to remove the body first; assigning innerHTML replaces everything)
    document.documentElement.innerHTML = localStorage.getItem('cached_site');
    //Somehow stop loading the rest of the site (ajax?)
}
else {
    //set localStorage var to the source code of the site.
    //**This will be fleshed out to data/base64 all image resources as well**
    localStorage.setItem('cached_site', document.documentElement.innerHTML);
    //load the site as normal
}
</script>
</body>
</html>
This will save the page's markup in a localStorage entry called cached_site. There are some issues with the logic (like the "stop loading the site" part) but you get the idea.
Is this worth it? Potentially, requests could be completely cut for static content because it's all being loaded from the user's computer. To update the content, another value, perhaps a version number, could be saved and checked in the else block.
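Something like this, for instance, where SITE_VERSION is a hypothetical constant baked into each deploy:
var SITE_VERSION = '2013-04-02'; // hypothetical: bump on every deploy
if (localStorage.getItem('cached_version') !== SITE_VERSION) {
    // cached copy is stale (or absent): refresh it and remember the new version
    localStorage.setItem('cached_site', document.documentElement.innerHTML);
    localStorage.setItem('cached_version', SITE_VERSION);
}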
My webhost limits the number of requests I can get per day, which is why I thought of this. It's not a huge deal, but an interesting project nonetheless.
This idea will have issues with dynamic websites: how will you treat dynamically generated web pages?
Another idea would be to save each page into a static HTML file on the server and then serve that static page, regenerating it when needed.
You should also cache the static parts of your website, i.e. images, scripts, CSS, and store these in the cache of your visitors' browsers.
In Apache, you could use mod_expires, and then set up an .htaccess file like this:
### turn on the Expires engine
ExpiresActive On
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|swf)(\.gz)?$">
ExpiresDefault "access plus 1 month"
FileETag None
</FilesMatch>
This will basically cache all of the static parts of your website in your visitors' browser cache, so they will stop refetching the same files over and over.
You should also try to combine your CSS/JavaScript files, ideally ending up with 1-2 JavaScript files and one CSS file, thus limiting the number of requests per client.
I would also suggest using Yahoo's YSlow and Google's PageSpeed Tools to profile your site. They both contain best practices on how to speed up your page here and here.
Here are some suggestions from Yahoo + Google, at a glance:
Minimize HTTP Requests (use combined files for JS/CSS, CSS sprites, image maps and inline images, where possible)
Add an Expires or a Cache-Control Header (like what has been explained above)
Properly configure ETags, or remove them
Gzip Components (e.g. with mod_deflate in Apache)
Make JavaScript and CSS External
Minify JavaScript and CSS (i.e. remove whitespace and newlines) - in PHP, you can use the minify script to do this
Optimize images
Let's assume I have a file on a CDN (Cloud Files from Rackspace) and a static HTML page with a link to that file. Is there any way I can force this file to be downloaded (to prevent it from opening in the browser, for MP3s for example)?
We could make our server read the file and set the corresponding header to:
header("Content-Type: application/force-download")
but we have about 5 million downloads per month, so we would rather let the CDN take care of that.
Any ideas?
There’s no way to do this in HTML or JavaScript. There is now! (Ish. See @BruceAldrige’s answer below.)
The HTTP Content-Disposition header is what tells browsers to download files, and it is sent by the server. You have to configure the CDN to send that header with whichever files you want the browser to download instead of display.
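For example, a response that triggers the save dialog would include something like this (the filename is hypothetical):
Content-Disposition: attachment; filename="song.mp3"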
Unhelpfully, I’m entirely unfamiliar with Rackspace’s Cloud Files service, so I don’t know whether they allow this, nor how to do it. I just found a page from December 2009 that suggests not, though, sadly:
Cloud Files cannot serve a file with the 'Content-Disposition: attachment' HTTP header. Therefore, a download link that would work perfectly in any other service may result in the browser rendering the file directly. This was confirmed by Rackspace engineers. :-(
http://drupal.org/node/656714
I know that you can with Amazon’s CloudFront service, as it’s backed by S3 (see e.g. http://blog.cloudberrylab.com/2009/06/how-to-set-custom-http-headers-for.html)
You can use the download attribute:
<a href="http..." download></a>
https://stackoverflow.com/a/11024735/21460
However, it’s not currently supported by Safari (7) or IE (11).
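The attribute also accepts an explicit filename, e.g. (the URL here is hypothetical):
<a href="https://cdn.example.com/song.mp3" download="song.mp3">Download the MP3</a>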
Yes, you can do this through the Cloud Files API. Using the stream method allows you to stream the contents of files in, setting your own headers etc.
A crazy idea: download via XMLHttpRequest and serve a data: URL with the content type you want? :P
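A rough sketch of that idea (the same-origin path and MIME types are assumptions, and the whole file gets buffered in memory):
var xhr = new XMLHttpRequest();
xhr.open('GET', '/files/song.mp3', true); // hypothetical same-origin file
xhr.responseType = 'blob';
xhr.onload = function () {
    var reader = new FileReader();
    reader.onloadend = function () {
        // rewrite the declared type so the browser offers a download
        window.location = reader.result.replace(/^data:[^;,]*/, 'data:application/octet-stream');
    };
    reader.readAsDataURL(xhr.response);
};
xhr.send();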
I'd like to make an XML document in JavaScript then have a save dialog appear.
It's OK if they have to click before the save can occur.
It's *not* OK if I *have* to use IE to achieve this (I don't even need to support it at all). However, Windows is a required platform (so Firefox or Chrome are the preferred browsers if I can only do this in one browser).
It's *not* OK if I need a web server. But conversely, I don't want to require the JavaScript to be run from a local file only, i.e. with elevated privileges -- if possible. That is, I'd like it to run locally or on a *static* host. But just locally is OK.
It's OK to have to bend over backwards to do this. The file won't be very big, but internet access might either be there, be spotty or just not be a possibility at all -- see (3).
So far the only ideas I have seen are to save the XML to an iframe and save that document -- but it seems that you can only do this in IE? Also, I could construct a data URI and place that in a link. My fear here is that it will just open the XML file in the window rather than prompt the user to save it.
I know that if I require the JavaScript to be local, I can raise privileges and just directly save the file (or hopefully cause a save dialog box to appear). However, I'd much prefer a solution where I do not require raised privileges (even a Firefox 3.6 only solution).
I apologize if this offends anyone's sensibilities (for example, not supporting every browser). I basically want to write an offline application and Javascript/HTML/CSS seem to be the best candidate considering the complexity of the requirements and the time available. However, I have this single requirement of being able to save data that must be overcome before I can choose this line of development.
How about the Downloadify script?
It is based on Flash and jQuery, and can prompt a dialog box to save the file on your computer.
Downloadify.create('downloadify', {
  filename: function(){
    return document.getElementById('filename').value;
  },
  data: function(){
    return document.getElementById('data').value;
  },
  onComplete: function(){
    alert('Your File Has Been Saved!');
  },
  onCancel: function(){
    alert('You have cancelled the saving of this file.');
  },
  onError: function(){
    alert('You must put something in the File Contents or there will be nothing to save!');
  },
  swf: 'media/downloadify.swf',
  downloadImage: 'images/download.png',
  width: 100,
  height: 30,
  transparent: true,
  append: false
});
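For reference, the snippet assumes the page contains matching elements, roughly like this (ids taken from the config above):
<p id="downloadify"></p>
<input type="text" id="filename" value="file.xml">
<textarea id="data"></textarea>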
Using a base64-encoded data URI, this is possible with only HTML & JS. What you can do is encode the data that you want to save (in your case, a string of XML data) into base64, using a JS library like jquery-base64 by carlo. Then put the encoded string into a link and add the link to the DOM.
Example using the library I mentioned (as well as jquery):
<html>
<head>
<title>Example</title>
</head>
<body>
<script>
//include jquery and jquery-base64 here (or whatever library you want to use)
//$.base64.encode comes from the jquery-base64 plugin
var xml = '<note><body>Hello</body></note>';
document.write('<a href="data:application/octet-stream;base64,' +
    $.base64.encode(xml) + '">click to make save dialog</a>');
</script>
</body>
</html>
...and remember to make the content-type something like application/octet-stream so the browser doesn't try to open it.
Warning: some older IE versions don't support base64, but you said that didn't matter, so this should work fine for you.
Without any more insight into your specific requirements, I would not recommend a pure Javascript/HTML solution. From a user perspective you would probably get the best results writing a native application. However if it will be faster to use Javascript/HTML, I recommend using a local application hosting a lightweight web server to serve up your content. That way you can cleanly handle the file saving server-side while focusing the bulk of your effort on the front-end application.
You can code up a web server in - for example - Python or Ruby using very few lines of code and without 3rd party libraries. For example, see:
Making a simple web server in python
WEBrick - Writing a custom servlet
python-trick-really-little-http-server - This one is really simple, and will easily let you serve up all of your HTML/CSS/JS files:
"""
Serves files out of its current directory.
Doesn't handle POST requests.
"""
import SocketServer
import SimpleHTTPServer
PORT = 8080
def move():
""" sample function to be called via a URL"""
return 'hi'
class CustomHandler(SimpleHTTPServer.SimpleHTTPRequestHandler):
def do_GET(self):
#Sample values in self for URL: http://localhost:8080/jsxmlrpc-0.3/
#self.path '/jsxmlrpc-0.3/'
#self.raw_requestline 'GET /jsxmlrpc-0.3/ HTTP/1.1rn'
#self.client_address ('127.0.0.1', 3727)
if self.path=='/move':
#This URL will trigger our sample function and send what it returns back to the browser
self.send_response(200)
self.send_header('Content-type','text/html')
self.end_headers()
self.wfile.write(move()) #call sample function here
return
else:
#serve files, and directory listings by following self.path from
#current working directory
SimpleHTTPServer.SimpleHTTPRequestHandler.do_GET(self)
httpd = SocketServer.ThreadingTCPServer(('localhost', PORT),CustomHandler)
print "serving at port", PORT
httpd.serve_forever()
Finally, depending on who will be using your application, you also have the option of compiling the Python program into a frozen binary so the end user does not have to have Python installed on their machine.
Javascript is not allowed to write to a local machine. Your question is similar to this one.
I suggest creating a simple desktop app.
Is a localhost PHP server OK? Web pages traditionally can't save to the hard drive because of security concerns. PHP can push files, though it requires a server.
Print-to-PDF plugins are available for all browsers. Install once, print to PDF forever. Then you can use JavaScript or Flash to call a Print function.
Also, if you are developing for an environment where internet access is spotty, consider using VB.NET or some other desktop language.
EDIT:
You can use the browser's Print function.
Are you looking for something like this?
If PHP is OK, it would be much easier.
With IE you could use document.execCommand, but I note that IE is not an option.
Here's something that looks like it might help, although it will not prompt with a Save As dialog: https://developer.mozilla.org/en/Code_snippets/File_I%2F%2FOL
One simple but odd way to do this that doesn't require any Flash is to create an <a/> with a data URI for its href. This even has pretty good cross-browser support, although for IE it must be at least version 8 and the URI must be under 32 KB. It looks like someone else on SO has more to say on the topic.
Why not use a hybrid: Flash on the client plus some solution on the server side? Most people have Flash, so you can default to the client side to conserve resources on the server.
I have an HTML file saved in gzip format. The browser displays the HTML file, but without the JavaScript and CSS. Non-zipped HTML files in the same directory do display correctly. In addition, I saved the source from the compressed HTML file and it reopened correctly, with JS and CSS applied.
What is different about displaying the zipped HTML that would not allow it to pick up the JS and CSS?
The basic problem is you can't just serve a gzip file where the browser expects CSS. By itself, that does not work any more than if you return a JPEG or a ham sandwich.
When content gets zipped on the fly, the response is somewhat different -- the response says "I am text/css, but I happen to be encoded with gzip for transfer". The browser can figure that one out.
Some web servers like Apache will do that sort of thing for you if you supply gzipped files locally, too. But I imagine your server isn't.
Why does it work for HTML? Hmm, I don't know, maybe your browser actually manages to figure it out in that particular case?
What you ultimately want to do is serve the response with Content-Type: text/css and Content-Encoding: gzip to have it recognized correctly.
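In other words, a correctly labelled response for a pre-compressed stylesheet would look roughly like this:
HTTP/1.1 200 OK
Content-Type: text/css
Content-Encoding: gzip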
If you're working on localhost on your own server (like XAMPP), then you need to configure the .htaccess file to send the right headers saying that the files may be gzipped.
Try adding this to your main .htaccess file:
AddEncoding x-gzip .gz
AddType text/html .gz
and make sure your gzip-compressed files end with the .gz extension.
Also, always run this on a server. :)
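For pre-gzipped CSS or JS the same mechanism extends: Apache's mod_mime evaluates each extension of a name like style.css.gz separately, so .css supplies the Content-Type and .gz supplies the encoding (a sketch assuming the default mod_mime type mappings):
AddEncoding x-gzip .gz
AddType text/css .css
AddType application/javascript .js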