Javascript lossless image compression - javascript

I am looking for a way to encode a bmp image as a TIFF or some other standard, compressed-but-lossless format in a browser (JavaScript). I guess if worst comes to worst I can write a library myself (getting the data from a canvas element), but I was hoping either: there's a better way to do it, or someone else has already written something similar (I can't find a library).
I have users wanting to upload ~4 MB (8-bit monochrome) BMP files, which I can losslessly compress to ~700 KB TIFF files with LZW (or, even better, ~300 KB lossless JPEG 2000). I'd like to do this before the files are uploaded, to save transfer costs/time.
Before you ask, I'm not being pedantic in insisting on lossless encoding instead of just using high-bitrate JPEG: these are astronomy photos used for analysis, so they can't tolerate any compression artifacts being introduced.
Thanks,
Jonny

Use PNG. It's lossless and uses zlib (deflate) compression. There are even programs like pngcrush that will optimize the image for size (the only problem is that it takes a while).
Is there any reason you're using JavaScript of all things for this? Wouldn't it be easier in the long run to do it in a Java applet (using a library that interfaces with the java.awt.Image class), or to upload the file to a server that does the image processing and returns the compressed image?
Edit: Please don't use a Java applet; the technology isn't well-supported anymore.

If you are willing to experiment with something new, you could try a lossless compression algorithm I created; a Java applet makes it possible to view the compressed images in a browser. You could use JavaScript to trigger the loading of the applet, or work directly with the Java code (the program is open source). For many images, the compression ratio will be better than lossless JPEG 2000. The address of the website is http://www.researchandtechnology.net/bcif/ .
If instead you want to use some well-known standard, then I'd agree with amphetamachine: using PNG would be the best choice.
So long,
Stefano

Related

Why does .ico ( Base64 ) appear to waste so much space?

Below is the Base64 representation of my logo icon. It is mostly the character A. I made it in GIMP and then converted it to Base64.
Is there something I could have done differently so that I don't waste so much space? I would assume there is some way to encode A over and over again, instead of writing it out explicitly.
I know that Base64 adds 33% of overhead, but that is not my concern.
In GIMP I saved to .ico and then converted to Base64 using an online tool.
url(data:image/vnd.microsoft.icon;base64,AAABAAEAICAAAAEAIACoEAAAFgAAACgAAAAgAAAAQAAAAAEAIAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAADd3d0B3d3dQ93d3dUAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA3d3d1d3d3UPd3d0BAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA3d3dJd3d3dXd3d3/3d3d/wAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAADd3d3/3d3d/93d3dXd3d0lAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAN3d3Vbd3d3/3d3d/93d3f/d3d3/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAN3d3f/d3d3/3d3d/93d3f/d3d1WAAAA
.../snip continues like this
Windows icon files contain raw uncompressed bitmap data, and Base64 is just a way of encoding data with a 33% expansion rate.
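The 33% figure follows from Base64 encoding every 3 input bytes as 4 output characters. A quick sketch, where the 3000-byte buffer of zeros is a made-up stand-in for blank icon rows:

```javascript
// Base64 encodes every 3 input bytes as 4 output characters,
// so the encoded size is ceil(n / 3) * 4 -- about a 33% expansion.
const raw = Buffer.alloc(3000); // 3000 zero bytes, like blank icon rows
const encoded = raw.toString('base64');

console.log(raw.length, '->', encoded.length); // 3000 -> 4000
// The long "AAAA..." runs are just zero bytes; Base64 itself never
// collapses repetition -- that's the compressor's job, not the encoder's.
```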
Depending on what you're wanting to do, there are several solutions:
Use the PNG ICO format: this is a normal multi-image Windows *.ico file, except the bitmap data is stored as PNG instead of a raw bitmap. This is only supported by Windows Vista or later. (PNGs are used for 128x128 and larger icon sizes, but bitmaps are used for all smaller sizes for compatibility and performance reasons.)
Use PNG directly - it doesn't look like you're taking advantage of the multi-image capabilities of the ICO format, is this for a favicon? Note that favicons can be in any bitmap format, not just ICO.
Use your webserver's GZIP compression - assuming you're offering your ICO files over the web then the inefficient storage isn't a concern because most servers, including IIS, come with HTTP Gzip compression support which really shrinks those things down.
Other than that, I/we need more information about what you're wanting to accomplish.
Save it as a 2 color palette GIF file.
Once you know the Base64 value, you could write a loop to emit that many As, I suppose. Depending on the length of the loop code, it may or may not save space.

Render RGBA to PNG in pure JavaScript?

Let's say I have a canvas element, and I need to turn the image on the canvas into a PNG or JPEG. Of course, I can simply use canvas.toDataURL, but the problem is that I need to do this twenty times a second, and canvas.toDataURL is extremely slow - so slow that the capturing process misses frames because the browser is busy converting to PNG.
My idea is to call context.getImageData(...), which evidently is much faster, and send the returned CanvasPixelArray to a Web Worker, which will then process the raw image data into a PNG or JPEG. The problem is that I can't access the native canvas.toDataURL from within the Web Worker, so instead, I'll need to resort to pure JavaScript. I tried searching for libraries intended for Node.js, but those are written in C++. Are there any libraries in pure JavaScript that will render raw image data into a PNG or JPEG?
There have been several JavaScript ports of libpng, including pnglets (very old), data:demo, and PNGlib.
The PNG spec itself is pretty straightforward - the most difficult part of implementing a simple PNG encoder yourself is getting the ZLIB (deflate) stream right, and there are many independent JavaScript implementations of that out there as well.
There's actually a C++ to JavaScript compiler called Emscripten.
Someone made a port of libpng, which you might want to check out.
I was able to write my own PNG encoder, which supports both RGB and palette depending on how many colors there are. It's intended to be run as a Web Worker. I've open-sourced it as usain-png.

Client-side zipping with Flash + JavaScript

I'm looking for a robust way of creating a zip archive on the fly from information on a given page and making this available for download. Client-side zipping is a must since my script runs from a bookmarklet.
My first approach while I was more concerned with writing the rest of the script was just to post the information to a few lines of PHP running on my local server which zipped it and sent it back. This is obviously not suitable for a bookmarklet worth sharing.
I found JSZip earlier today, and I thought that'd be the end of it. This library works great when it works; unfortunately, the archives I'm creating frequently exceed a couple of MBs, and this breaks JSZip. (Note: I've only tested this on Chrome.)
Pure-JS downloads also have the limitation of funky file names due to the data URI, which I intended to solve using JSZip's recommended method: Downloadify, which uses Flash. This made me wonder whether the size limitations on JS zip generation could be (or have been) overcome by a similar interplay of Flash and JS.
I Googled this, but having no experience with ActionScript I couldn't quickly figure out whether what I'm asking is possible. Is it possible to use a Flash object from JS to create relatively large (tens of MBs) zip files on the client side?
Thanks!
First of all some numbers:
Flash promises that uploads will work if the file is smaller than 100 MB (I don't know whether that means base 10 or base 2).
There are two popular libraries in Flash for creating ZIP archives, but read on first.
A ZIP archiver is a program that both compresses and archives data, in exactly that order: it compresses each file separately and then appends it to the archive. This yields a worse compression ratio but allows the archive to be built incrementally, with the benefit that you can even start sending the archive before it is entirely compressed.
An alternative to ZIP is to run a dedicated archiver first and then compress the entire archive at once. This can sometimes achieve several times better compression, but the cost is that you have to process all the data at once.
But the Flash ByteArray.compress() method offers a native implementation of the deflate algorithm, which is mostly the same thing a ZIP archiver would use. So, if you implemented something like tar, you could significantly reduce the size of the files being sent.
But Flash is a single-threaded environment, so you would have to be careful about the size of the data you compress, and will probably have to find the limit empirically. Or just use ZIP - more redundancy, but easier to implement.
I've used this library before: nochump. I didn't have any problems. It is somewhat old, though, and it might make sense to port it to use Alchemy opcodes (which provide fast memory access, significantly reducing the cost of low-level binary arithmetic such as bitwise OR and AND). This library implements the CRC32 algorithm, an essential part of the ZIP format, and it already uses Alchemy - so it should be considerably faster, but you would have to implement the rest on your own.
Yet another option you might consider is Google's NaCl - there you would be able to choose among archiver and compression implementations because it essentially runs native code, so you could even use bz2 and other modern formats. Unfortunately, it works only in Chrome (and users must enable it) or Firefox (with a plugin).

Splitting a file before upload?

On a webpage, is it possible to split large files into chunks before the file is uploaded to the server? For example, split a 10MB file into 1MB chunks, and upload one chunk at a time while showing a progress bar?
It sounds like JavaScript doesn't have any file manipulation abilities, but what about Flash and Java applets?
This would need to work in IE6+, Firefox and Chrome. Update: forgot to mention that (a) we are using Grails and (b) this needs to run over https.
You can try Plupload. It can be configured to check whatever runtime is available on the user's side - Flash, Silverlight, HTML5, Gears, etc. - and use the first one that satisfies the required features. Among other things it supports image resizing (on the user's side, preserving EXIF data!), streamed and multipart upload, and chunking. Files can be chunked on the user's side and sent to a server-side handler chunk by chunk (this requires some additional care on the server), so that, for example, big files can be uploaded to a server whose max file-size limit is set much lower than their size. And more.
Some runtimes support HTTPS, I believe; some need testing. Anyway, the developers there are quite responsive these days, so you might at least try ;)
The only option I know of that would allow this would be a signed Java applet.
Unsigned applets and Flash movies have no filesystem access, so they wouldn't be able to read the file data. Flash is able to upload files, but most of that is handled by the built-in Flash implementation and from what I remember the file contents would never be exposed to your code.
There is no JavaScript solution for that selection of browsers. There is the File API, but whilst it works in newer Firefox and Chrome versions, it's not going to happen in IE (no sign of it in the IE9 betas yet, either).
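For browsers that do have the File API, the chunking arithmetic itself is trivial. The `chunkRanges` function below is an illustrative helper (not from any library); each resulting range would become a `file.slice(start, end)` call sent in its own request while a progress bar advances:

```javascript
// Compute the byte ranges for uploading a file in fixed-size chunks.
// In a File API browser, each range becomes file.slice(start, end),
// and each chunk is sent in its own request.
function chunkRanges(fileSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    ranges.push({ start, end: Math.min(start + chunkSize, fileSize) });
  }
  return ranges;
}

// A 10 MB file in 1 MB chunks -> 10 ranges, the last ending at the file size.
const ranges = chunkRanges(10 * 1024 * 1024, 1024 * 1024);
console.log(ranges.length, ranges[ranges.length - 1]);
```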
In any case, reading the file locally and uploading it via XMLHttpRequest is inefficient because XMLHttpRequest cannot send pure binary, only Unicode text. You can encode the binary as text using Base64 (or, if you are really dedicated, a custom 7-bit encoding of your own), but this will be less efficient than a normal file upload.
You can certainly do uploads with Flash (see SWFUpload et al.), or even Java if you must (Jumploader... I wouldn't bother these days, though, as Flash prevalence is very high and the Java plugin continues to decline). You won't necessarily get the low-level control to split into chunks, but do you really need that? What for?
Another possible approach is to use a standard HTML file upload field, and when submit occurs set an interval call to poll the server with XMLHttpRequest, asking it how far the file upload is coming along. This requires a bit of work on the server end to store the current upload progress in the session or database, so another request can read it. It also means using a form parsing library that gives you progress callback, which most standard language built-in ones like PHP's don't.
Whatever you do, take a ‘progressive enhancement’ approach, allowing browsers with no support to fall back to a plain HTML upload. Browsers do typically have an upload progress bar for HTML file uploads, it just tends to be small and easily missed.
Do you specifically need it to be in X chunks? Or are you trying to solve the problems caused by uploading large files (e.g. you can't restart an upload on the client side, or the server crashes when the entire file is uploaded and held in memory at once)?
Search for streaming upload components. Which one you prefer will depend on the technologies you are working with: JSP, ASP.NET, etc.
http://krystalware.com/Products/SlickUpload/ - this one is a server-side product.
Here are some more pointers to various uploaders http://weblogs.asp.net/jgalloway/archive/2008/01/08/large-file-uploads-in-asp-net.aspx
Some try to manage memory on the server (e.g. so the entire huge file isn't in memory at one time); some try to manage the client-side experience.

Is it possible to optimize/shrink images before uploading?

I am working on a web application that will deal with many image uploads. It's quite likely that the users will be in areas with slow internet connections, and I'm hoping to save them upload time by compressing the images before uploading.
I have seen that Aurigma Image Uploader achieves this using a Java applet or ActiveX, but it's expensive and I'd rather use something open source, or at least a little cheaper. Ideally I'd like to roll my own if it's at all possible.
I'm developing in Ruby on Rails, if that makes any difference.
Thanks!
Edit, just to clarify: I don't mind if the solution uses ActiveX or an applet (although JS is ideal) - I'm just looking for something a little cheaper than Aurigma at this stage of development.
Also, it may not be feasible for users to shrink the images themselves, as in many instances they will be uploading directly from an internet cafe or other public internet spot.
Generally, it isn't possible to write an image compressor in JavaScript. Sorry.
You'll need to use a plugin of some sort, and as you mention, other sites use Java.
It appears to be possible to write a JPEG encoder in ActionScript (i.e. Flash), which will reach a much larger audience than the Java plugin you mention. Here's a link to a blog post about PNG and JPEG encoders in ActionScript.
Here's another blog post with a demo of an inlined JPEG encoder in ActionScript.
Only if you use Flash or Silverlight (the only ways to be cross-platform).
http://www.jeff.wilcox.name/2008/07/fjcore-source/ may be worth a read.
Without using applets or ActiveX (which is Windows-only), you can't execute anything on a client PC.
Probably not, but you can always insist that image uploads over a certain size will not succeed.
Is this an application where you can force them to supply a smaller image? In that case you could grab the size first to verify it fits your standards. This is what Facebook used to do with profile pictures: if it was too big, they wouldn't take it.
