Download multiple images into 1 zip file - javascript

I have a function that currently downloads multiple images and saves them to a user's "download" folder (this only works in Chrome).
I want to take this function to the next step and put these images in a single zip file.
Below is an example of my current code. I want to merge my code with the JSZip API I found online here.
I have already done the bower install for JSZip and included the script in my HTML.
Here is my code, which works perfectly for downloading multiple single images at once:
$scope.downloadPhotos = function() {
    var photoUrls = [];
    for (var x = 0; x < $scope.$parent.photos.length; x++) {
        var p = $scope.$parent.photos[x];
        if (p.isChecked) {
            photoUrls.push($scope.bucketUrl() + p.photoUrl);
        }
    }
    saveImage(photoUrls);
};
/*---- this function saveImage works great (only Chrome) ----*/
function saveImage(urls) {
    var link = document.createElement('a');
    link.setAttribute('download', null);
    link.style.display = 'none';
    document.body.appendChild(link);
    for (var i = 0; i < urls.length; i++) {
        link.setAttribute('href', urls[i]);
        link.click();
    }
    document.body.removeChild(link);
}
And here is the JSZip API code example to create a zip file with content in it:
function create_zip() {
    var zip = new JSZip();
    zip.add("hello1.txt", "Hello First World\n");
    zip.add("hello2.txt", "Hello Second World\n");
    var content = zip.generate();
    location.href = "data:application/zip;base64," + content;
}
Now I'm just wondering how to combine the two to put my images into a zip file.
Thanks for your help!

I put together a fiddle that will let you zip an array of image URLs:
https://jsfiddle.net/jaitsujin/zrdgsjht/
You can manage the zip folder structure by modifying this line:
filename = filename.replace(/[\/\*\|\:\<\>\?\"\\]/gi, '').replace("httpsi.imgur.com","");
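For reference, here is a minimal sketch of the combined approach using the current JSZip 3.x API, where zip.file() and generateAsync() replace the older zip.add() and zip.generate(), plus FileSaver.js for the download; the urls argument and the 'images.zip' name are placeholders:
function downloadAsZip(urls) {
    var zip = new JSZip();
    // Fetch every image as a blob; the images must be same-origin
    // or served with permissive CORS headers.
    var requests = urls.map(function (url, i) {
        return fetch(url).then(function (response) {
            return response.blob();
        }).then(function (blob) {
            // Derive a file name from the URL, falling back to an index.
            var name = url.split('/').pop() || ('image-' + i + '.png');
            zip.file(name, blob);
        });
    });
    // Wait until every image is in the zip, then generate and save it.
    Promise.all(requests).then(function () {
        return zip.generateAsync({ type: 'blob' });
    }).then(function (content) {
        saveAs(content, 'images.zip');
    });
}
In the question's controller, this would take the place of the saveImage(photoUrls) call.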

To download multiple files in zip format we can use JSZip and FileSaver.js. Alternatively, if we are using a Web API with AngularJS, we can create an API method that builds the zip archive on the server, and then use an $http POST or GET call in AngularJS to download it (we still have to use FileSaver to save the file in zip format). For example, the API call in AngularJS:
function downloadFiles(files) {
    return $http.post(baseUrl + 'api/download/files', files, { responseType: 'arraybuffer' });
}
Call the above function and, on response, use the FileSaver.js method saveAs to save the file in zip format. For example:
// files - array of file URLs, e.g. ['http://www.example.com/file1.png', 'http://www.example.com/file2.jpeg', 'http://www.example.com/file3.jpg']
downloadFiles(files).then(function (response) {
    // on success
    var file = new Blob([response.data], { type: 'application/zip' });
    saveAs(file, 'example.zip');
}, function (error) {
    // on error
    // write your code to handle the error
});

You should look at this example. Currently, you just ask the browser to trigger the download of a file. If you want to create a zip file client side, your JS code needs to access the content of the files with ajax calls (you will get CORS issues if they aren't stored on the same server).
Without copy/pasting the whole code, the example:
- triggers ajax calls (with JSZipUtils, but you can just use responseType = "arraybuffer" if you only support recent browsers)
- wraps them into promises (jQuery promises here, but you can use your own)
- adds the result into a zip object
- waits for all promises to complete before triggering a download
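Condensed into code, that flow might look like the sketch below, assuming JSZipUtils and FileSaver.js are loaded; native promises stand in for the jQuery ones, and urls is a placeholder array:
function zipRemoteFiles(urls) {
    var zip = new JSZip();
    // Wrap each JSZipUtils ajax call in a promise.
    var downloads = urls.map(function (url) {
        return new Promise(function (resolve, reject) {
            JSZipUtils.getBinaryContent(url, function (err, data) {
                if (err) { reject(err); return; }
                // Add the raw binary result to the zip object.
                zip.file(url.split('/').pop(), data, { binary: true });
                resolve();
            });
        });
    });
    // Wait for all promises to complete before triggering the download.
    Promise.all(downloads).then(function () {
        return zip.generateAsync({ type: 'blob' });
    }).then(function (blob) {
        saveAs(blob, 'download.zip');
    });
}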

function downloadImageAsZip(imageUrl) {
    var zip = new JSZip();
    var img = new Image();
    img.crossOrigin = 'Anonymous'; // the image server must allow CORS
    img.src = imageUrl;
    img.onload = function () {
        $scope.count2++;
        // Redraw the image on a canvas so its pixels can be read back out.
        var canvas = document.createElement('canvas');
        var ctx = canvas.getContext('2d');
        canvas.height = img.height;
        canvas.width = img.width;
        ctx.drawImage(this, 0, 0);
        var dataURL = canvas.toDataURL('image/png');
        canvas = null;
        // Strip the data-URL prefix so only the base64 payload remains.
        var base64String = dataURL.replace('data:image/png;base64,', '');
        zip.file('ImageName', base64String, { base64: true });
        zip.generateAsync({ type: 'blob' }).then(function (content) {
            saveAs(content, 'ZipFileName.zip');
        });
    };
}


Javascript - Callback

I am new to JavaScript and am working on a task to compress and then upload an already uploaded image.
I am trying to:
Retrieve the uploaded image,
Compress it
Convert it to a base64 URL
Convert it into a blob
And then into a file and upload it.
But this code just doesn't work.
When I step through it using a debugging tool it does its job, but otherwise it doesn't.
I think the rest of the code after the loadImage function call doesn't really execute.
Please help me make sense of it! Thanks!
function loadImage(formObj2, fldid2, file, callback) {
    var oldImage = document.createElement("img");
    var psImageOutput = new Image();
    var reader = new FileReader();
    reader.onload = function(e) {
        /* code to compress image */
        callback(psImageOutput);
    };
    reader.readAsDataURL(file);
}
var inputFile = fileQueue[i].file;
var formObj1 = formObject;
var fldid1 = fldid;
loadImage(formObj1, fldid1, inputFile, function(psImageOutput) {
    var newImageDataSRC = psImageOutput.src;
    /* Manipulate SRC string and create a blob and an image file from it */
    formObj1.append(fldid1, newimgfile);
});
Be careful on the line:
formObj1.append(fldid1, newimgfile);
You append a variable called newimgfile, but in your code this variable is never defined.
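One way to actually produce that variable is sketched below, under the assumption that psImageOutput.src holds the compressed image as a data URL; the 'compressed.png' file name is a placeholder:
loadImage(formObj1, fldid1, inputFile, function (psImageOutput) {
    var dataURL = psImageOutput.src;
    // Decode the base64 payload of the data URL into raw bytes.
    var byteString = atob(dataURL.split(',')[1]);
    var mimeString = dataURL.split(',')[0].split(':')[1].split(';')[0];
    var bytes = new Uint8Array(byteString.length);
    for (var i = 0; i < byteString.length; i++) {
        bytes[i] = byteString.charCodeAt(i);
    }
    // Wrap the bytes in a File so FormData treats it as a real upload.
    var newimgfile = new File([bytes], 'compressed.png', { type: mimeString });
    formObj1.append(fldid1, newimgfile);
});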

Waiting for canvas.toDataURL() to complete before submitting blob to server

I have a project requirement to use a file input to accept a file and display it on a canvas, while at the same time processing it as a blob and sending it to the backend. Depending on the file type, different processes may be used.
If the file is a PDF, it will be processed using pdf.js, and the PDF image will be displayed on the canvas while the file is converted into a blob and sent to the backend for processing. So far this is okay for single-page PDFs only, as I can get the canvas data into a blob and send it directly after toDataURL has finished.
However, for multi-page PDFs I'm having trouble running toDataURL on each page. My idea is: as I iterate over each page using pdf.js's getPage, I display the page on a canvas, get the canvas image as a blob, and push it into an array. Once all pages have been processed and all the blobs are in the array, I send them to the backend for processing.
var __PDF_DOC, __TOTAL_PAGES, __ISMULTIPLEPAGE, __CANVAS = undefined;
var __MULTIPAGEHOLDER = [];
function showPDF(pdf_url) {
    PDFJS.getDocument({url: pdf_url}).then(function (pdf_doc) {
        __PDF_DOC = pdf_doc; // keep a reference for showPage below
        __TOTAL_PAGES = pdf_doc.numPages;
        __ISMULTIPLEPAGE = __TOTAL_PAGES > 1;
        // Show every page
        for (var i = 0; i < __TOTAL_PAGES; i++) {
            $('#parentId').append('<canvas id="canvas-' + (i + 1) + '"></canvas>');
            showPage(i + 1);
        }
    }).catch(function (error) {
        /* Insert any error handling here */
    });
}
function showPage(page_no) {
    __PDF_DOC.getPage(page_no).then(function (page) {
        /* Some setup here */
        var renderContext = {
            canvasContext: /* Some value */,
            viewport: /* Some value */
        };
        // Render the page contents in the canvas
        page.render(renderContext).then(function () {
            __CANVAS = document.getElementById('canvas-' + page_no);
            if (__ISMULTIPLEPAGE) {
                var thisPageURL = __CANVAS.toDataURL();
                var thisPage = dataURLtoBlob(thisPageURL);
                __MULTIPAGEHOLDER.push(thisPage);
                if (__MULTIPAGEHOLDER.length === __TOTAL_PAGES) {
                    // Only submit to the backend once all blobs are ready
                    submitPDFtoBackend(__MULTIPAGEHOLDER);
                }
            } else {
                var pdfData = [];
                if (__CANVAS != null) {
                    var data = __CANVAS.toDataURL();
                    pdfData.push(dataURLtoBlob(data));
                }
                submitPDFtoBackend(pdfData);
            }
        });
    });
}
But the above didn't work as I expected: at times, the submitPDFtoBackend method is still called while toDataURL is running. I want submitPDFtoBackend to be called only after every toDataURL has completed and all the blobs have been pushed to the array.
I've read about wrapping the toDataURL processing in promises, so the submit could be executed over the promises using $.when, but I'm not sure if that is feasible. Something like:
$.when.apply(null, promise).then(function() {
    submitPDFtoBackend(__MULTIPAGEHOLDER);
});
But I'm not sure how to do this.
Thanks
Instead of going to a data URL and then from there to a blob, you can go to a blob directly with canvas.toBlob.
To check that the blob is ready, you can trap it in the onload handler of an img element and then call submitPDFtoBackend within that handler. The snippet below is taken directly from MDN:
var canvas = document.getElementById('canvas');
canvas.toBlob(function(blob) {
    var newImg = document.createElement('img'),
        url = URL.createObjectURL(blob);
    newImg.onload = function() {
        // no longer need to read the blob so it's revoked
        URL.revokeObjectURL(url);
    };
    newImg.src = url;
    document.body.appendChild(newImg);
});
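Combining that with the promise idea from the question, each page's blob can be wrapped in a promise and Promise.all used to gate the submit. A sketch reusing the question's names (__PDF_DOC, __TOTAL_PAGES, submitPDFtoBackend), with the render setup elided as in showPage:
function pageToBlob(page_no) {
    return __PDF_DOC.getPage(page_no).then(function (page) {
        /* build renderContext for this page as in showPage */
        return page.render(renderContext).then(function () {
            var canvas = document.getElementById('canvas-' + page_no);
            // Resolve with the finished blob once toBlob hands it over.
            return new Promise(function (resolve) {
                canvas.toBlob(resolve);
            });
        });
    });
}
var pagePromises = [];
for (var i = 1; i <= __TOTAL_PAGES; i++) {
    pagePromises.push(pageToBlob(i));
}
// Runs only after every page has produced its blob.
Promise.all(pagePromises).then(function (blobs) {
    submitPDFtoBackend(blobs);
});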

Chrome crashes when exporting file via Filesystem API

I'm trying to run an in-browser encryption application which uses jQuery 1.10.2 and CryptoJS 3.2.1.
The problem that I face starts at around 2 MB files. A file can be encrypted just fine, but when a data URI is created for the file it crashes the browser.
I would like a way around this to make it possible to encrypt files up to 50 MB without the browser crashing.
Here is the current snippet responsible for file saving via the FileReader API:
var reader = new FileReader();
if (body.hasClass('encrypt')) {
    // Encrypt the file!
    reader.onload = function(e) {
        // Use the CryptoJS library and the AES cypher to encrypt the
        // contents of the file, held in e.target.result, with the password
        var encrypted = CryptoJS.AES.encrypt(e.target.result, password);
        // The download attribute will cause the contents of the href
        // attribute to be downloaded when clicked. The download attribute
        // also holds the name of the file that is offered for download.
        a.attr('href', 'data:application/octet-stream,' + encrypted);
        a.attr('download', file.name + '.encrypted');
        step(4);
    };
    // This will encode the contents of the file into a data-uri.
    // It will trigger the onload handler above, with the result
    reader.readAsDataURL(file);
}
else {
    // Decrypt it!
    reader.onload = function(e) {
        var decrypted = CryptoJS.AES.decrypt(e.target.result, password)
                                .toString(CryptoJS.enc.Latin1);
        if (!/^data:/.test(decrypted)) {
            alert("Invalid pass phrase or file! Please try again.");
            return false;
        }
        a.attr('href', decrypted);
        a.attr('download', file.name.replace('.encrypted', ''));
        step(4);
    };
    reader.readAsText(file);
}
What can I change in the above code to allow larger files to be encrypted and decrypted?
Live site: droplet.so (currently capped at 1.5 MB, otherwise a browser crash is guaranteed).
Thanks in advance.
With a little research I found out that 1.99 MB is the maximum that can be saved in a data URL in Chrome.
Your problem can be solved by converting the data URL to a blob.
You can find more information here:
Blob from DataURL?
Here is a similar post: Chrome crashes when URI is too long (see the second answer).
EDIT:
Possible solution
function dataURItoBlob(dataURI) {
    var byteString = atob(dataURI.split(',')[1]);
    var mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0];
    var ab = new ArrayBuffer(byteString.length);
    var ia = new Uint8Array(ab);
    for (var i = 0; i < byteString.length; i++) {
        ia[i] = byteString.charCodeAt(i);
    }
    // BlobBuilder is deprecated; the Blob constructor replaces it
    return new Blob([ab], { type: mimeString });
}
function download(dataURI) {
    var blob = dataURItoBlob(dataURI);
    var url = window.URL.createObjectURL(blob);
    window.location.assign(url);
}
And you can use this code by calling download(dataURI).
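Applied to the encryption handler from the question, the oversized data URI can be replaced with an object URL built from a blob. A sketch, where a, file, password and step come from the original code:
reader.onload = function (e) {
    var encrypted = CryptoJS.AES.encrypt(e.target.result, password);
    // Hold the ciphertext in a blob instead of a multi-megabyte data URI.
    var blob = new Blob([encrypted.toString()], { type: 'application/octet-stream' });
    a.attr('href', URL.createObjectURL(blob));
    a.attr('download', file.name + '.encrypted');
    step(4);
};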

How to save an image on the server with JavaScript?

I'm working on a little web app that generates a base64 image. I'm using a blob to put it back into a file (it's a .png, but I haven't renamed it yet). Now I'm trying to save it on my server. Any ideas or different approaches?
This is the script:
var img = document.getElementById("MyPix");
img.onclick = function() {
    var image_data = atob(img.src.split(',')[1]);
    var arraybuffer = new ArrayBuffer(image_data.length);
    var view = new Uint8Array(arraybuffer);
    for (var i = 0; i < image_data.length; i++) {
        view[i] = image_data.charCodeAt(i) & 0xff;
    }
    try {
        var blob = new Blob([arraybuffer], {type: 'application/octet-stream'});
    } catch (e) {
        // Fall back to the deprecated BlobBuilder for older browsers
        var bb = new (window.WebKitBlobBuilder || window.MozBlobBuilder);
        bb.append(arraybuffer);
        var blob = bb.getBlob('application/octet-stream');
    }
    var url = (window.webkitURL || window.URL).createObjectURL(blob);
    valor = (document.getElementById("link").value = url);
    location.href = valor;
};
I'm not very good with JS, so if you want a better idea of the project, visit it by clicking here; it's all JavaScript, so just look at the source code.
You can't save to your server with just client-side JavaScript. Form the data you want to save in JavaScript, then POST it to your server, calling a page you write that can turn POST data into a file on your filesystem; in your case a .php file with code that looks for $_POST data and writes it to a file. Also make sure it's safe, because anyone will be able to post data to that page, not just people using your webpage.
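The client side of that POST could look like the sketch below, reusing the blob built in the click handler above; 'save.php' is a hypothetical endpoint that would read the upload from $_FILES and write it to disk:
// blob is the image blob built in the click handler above
var formData = new FormData();
formData.append('image', blob, 'image.png');
var xhr = new XMLHttpRequest();
xhr.open('POST', 'save.php', true); // hypothetical server-side script
xhr.onload = function () {
    if (xhr.status === 200) {
        console.log('Image saved on the server.');
    }
};
xhr.send(formData);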

How to upload/POST multiple canvas elements

I have to create an image uploader for a future project (no Flash, IE10+, FF7+, etc.) that does image resizing/converting/cropping on the client side and not on the server.
So I made a JavaScript interface where users can 'upload' their files and have them resized/cropped in the browser directly, without ever contacting the server. The performance is OK; not that good, but it works.
The end result is an array of canvas elements. The user can edit/crop the images after they get resized, so I keep them as canvases instead of converting them to JPEG (which would worsen the initial performance).
Now this works fine, but I don't know the best way to actually upload the finished canvas elements to the server (using an ASP.NET 4 generic handler on the server).
I have tried creating a JSON object from all elements containing the data URL of each canvas.
The problem is, when I have 10-40 pictures, the browser starts freezing when creating the data URLs, especially for images that are larger than 2 megabytes.
// images = array of UploadImage
for (var i = 0; i < images.length; i++) {
    // note: the type must be 'image/jpeg'; 'image/jpg' silently falls back to PNG
    var data = document.getElementById('cv_' + i).toDataURL('image/jpeg');
    images[i].data = data.substr(data.indexOf('base64') + 7);
}
Also, converting them to a JSON object (I am using json2.js) usually crashes my browser (FF7).
My object:
var UploadImage = function (pFileName, pName, pDescription) {
    this.FileName = pFileName;
    this.Name = pName;
    this.Description = pDescription;
    this.data = null;
};
The upload routine:
// images = array of UploadImage
for (var i = 0; i < images.length; i++) {
    var data = document.getElementById('cv_' + i).toDataURL('image/jpeg');
    images[i].data = data.substr(data.indexOf('base64') + 7);
}
var xhr, provider;
xhr = jQuery.ajaxSettings.xhr();
if (xhr.upload) {
    xhr.upload.addEventListener('progress', function (e) {
        console.log(Math.round((e.loaded * 100) / e.total) + '% done');
    }, false);
}
provider = function () {
    return xhr;
};
var ddd = JSON.stringify(images); // usually crashes here
$.ajax({
    type: 'POST',
    url: 'upload.ashx',
    xhr: provider,
    dataType: 'json',
    success: function (data) {
        alert('ajax success: data = ' + data);
    },
    error: function () {
        alert('ajax error');
    },
    data: ddd
});
What would be the best way to send the canvas elements to the server?
Should I send them all at once or one by one?
Uploading the files one by one is better: it requires less memory, and as soon as one file is ready it can be uploaded, instead of waiting until all the files have been prepared.
Use FormData to send the files. It allows files to be uploaded in binary format instead of base64-encoded:
var formData = new FormData();
In Firefox, use canvas.mozGetAsFile('image.jpg') instead of canvas.toDataURL(). This avoids the unnecessary conversion from base64 back to binary:
var file = canvas.mozGetAsFile('image.jpg');
formData.append('file', file);
In Chrome, convert the base64 data into a blob (see the dataURItoBlob function in the answer below).
After playing around with a few things, I managed to figure this out myself.
First of all, this will convert a dataURI to a Blob:
// added for quick reference
function dataURItoBlob(dataURI) {
    // convert base64/URL-encoded data component to raw binary data held in a string
    var byteString;
    if (dataURI.split(',')[0].indexOf('base64') >= 0)
        byteString = atob(dataURI.split(',')[1]);
    else
        byteString = unescape(dataURI.split(',')[1]);
    // separate out the mime component
    var mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0];
    // write the bytes of the string to a typed array
    var ia = new Uint8Array(byteString.length);
    for (var i = 0; i < byteString.length; i++) {
        ia[i] = byteString.charCodeAt(i);
    }
    return new Blob([ia], {type: mimeString});
}
(From this question.) Then:
var blob = dataURItoBlob(canvas.toDataURL('image/jpeg'));
formData.append('file', blob);
And then send the formData object. I'm not sure how to do it in jQuery, but with a plain xhr object it goes like this:
var xhr = new XMLHttpRequest();
xhr.open('POST', 'upload.ashx', false); // synchronous here; pass true to send asynchronously
xhr.send(formData);
On the server you can get the files from the Files collection:
context.Request.Files[0].SaveAs(...);
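For completeness, a sketch of the one-file-at-a-time variant using canvas.toBlob, which current browsers support and which skips the base64 round-trip entirely; the field name and handler URL follow the code above:
function uploadCanvas(canvas, index) {
    return new Promise(function (resolve, reject) {
        canvas.toBlob(function (blob) {
            var formData = new FormData();
            formData.append('file', blob, 'image_' + index + '.jpg');
            var xhr = new XMLHttpRequest();
            xhr.open('POST', 'upload.ashx', true);
            xhr.onload = function () { resolve(xhr.response); };
            xhr.onerror = reject;
            xhr.send(formData);
        }, 'image/jpeg');
    });
}
// Upload sequentially so only one canvas is encoded at a time.
var queue = Promise.resolve();
for (var i = 0; i < images.length; i++) {
    (function (index) {
        queue = queue.then(function () {
            return uploadCanvas(document.getElementById('cv_' + index), index);
        });
    })(i);
}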
