How do I transfer a zip archive generated on the server back to the client? I'm using AngularJS and SailsJS. Currently I set the HTTP headers to match the content type, generate the archive using archiver, and pipe the data into the res object before calling res.end().
The file data is successfully placed inside the XHR response, but the file is never downloaded on the client's side, unless I make an API call to zipFiles (see the code below).
How do I fix this?
zipFiles: async function (req, res) {
  var archiver = require('archiver');
  var year = req.allParams().year;
  var quarter = req.allParams().quarter;
  /*
   * FIXME: This is dangerous, the same code is present in api/controllers/sirka/SirkaShStatController.js
   * FIXME: A globally-available file should contain all relevant paths
   */
  var src_path = __some__path__
  var file_name = `download.zip`;
  // Set HTTP headers to match the contents of the response
  res.writeHead(200, {
    'Content-Type': 'application/zip',
    'Content-Disposition': `attachment; filename=${file_name}`,
  });
  var archive = archiver('zip');
  archive.on('error', function (err) {
    throw err;
  });
  // Once the archive has been finished (by archive.finalize()), send the file
  archive.on('finish', function () {
    sails.log.info('Archive finished, sending...');
    res.end();
  });
  // Pipe the archive data into the response object
  archive.pipe(res);
  // Append files found in src_path at the top level of the archive
  archive.directory(src_path, false);
  archive.finalize();
}
After a lot of searching and tinkering I've finally managed to solve the issue. I'll try to explain the different approaches that I took and their results.
1st approach
Generate the ZIP file in memory and transfer the binary data back to the user through the response.
This approach failed (see the original question): since the call to zip the files was made through XHR/AJAX, it was possible to pipe the data into the response, but it couldn't be fetched on the client side.
2nd approach
Create the zip-file on the server, then represent the binary data as a Buffer. With this approach, I could simply return the buffer back to the caller by calling res.ok(data) once the zip-file was fully generated:
var archiver = require('archiver');
var archive = archiver('zip');
var fs = require('fs');
var output = fs.createWriteStream(dst_path);
archive.on('error', function (err) {
  throw err;
});
// Once the archive has been finished (by archive.finalize()), log it
archive.on('finish', function () {
  sails.log.info('Archive finished, sending...');
});
// Once the output stream is closed, read the finished zip back in and return it
output.on('close', function () {
  var data = fs.readFileSync(dst_path);
  console.log(data);
  console.log(Buffer.byteLength(data));
  return res.ok(data);
});
// Pipe the archive data into the output file stream
archive.pipe(output);
// Append files found in src_path at the top level of the archive
archive.directory(src_path, false);
archive.finalize();
Then on the client side I simply receive the data, convert it to a Uint8Array, and wrap it in another array. The wrapping is necessary since the data makes up one whole part of the Blob.
Then once the Blob is generated, I create an ObjectURL for it and attach the link to an invisible a element that is automatically clicked.
var dataBuffer = res["data"];
var binaryData = new Uint8Array(dataBuffer);
var blobParts = [binaryData];
var blob = new Blob(blobParts, {type: 'application/zip'});
var downloadUrl = URL.createObjectURL(blob);
var a = document.createElement('a');
document.body.appendChild(a);
a.style = "display: none";
a.href = downloadUrl;
a.download = `Sygehusstatistik-${year}-K${quarter}.zip`;
a.click()
I've had issues with the generated zip file getting placed into itself recursively; to avoid that, ensure that src_path != dst_path.
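A minimal guard sketch (my own illustration, assuming the src_path and dst_path variables from the snippets above):
var path = require('path');
// Refuse to create the archive inside the directory being archived;
// otherwise archive.directory(src_path) can pick up the half-written zip.
if (path.resolve(dst_path).startsWith(path.resolve(src_path) + path.sep)) {
  throw new Error('dst_path must not be located inside src_path');
}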
Related
More specifically: I have a blob that is initialized and processed in a .js file inside the 'public' (static) folder. Since it is processed on the client side, I want to know whether there is a way to access that blob on the server side without using a POST request. The file in question is processed and stored in a variable in a static file (script.js), and I now need to upload that variable/blob to the database. But from a static file I have no access to the database and cannot export the variable to the server. How do I get access to that variable inside the static file?
What my program does: it records audio through the client's microphone, and that audio file has to be uploaded to the database. I could add a 'Download' feature, let the client download the file, and then have the client send it back in a POST request through an <input> tag. But then the client could upload any audio file into that input tag. This is a web app for live monitoring of students writing an exam (so that they don't cheat), which is why I capture their audio and save it to the DB. Please refer to my folder structure below for more details.
My Folder Structure:
--DBmodels
---AudioModels.js
---ImageModel.js
--node_modules
--public
---script.js (This contains the code for audio processing)
--views
---test.ejs (Main HTML page)
--index.js (server file)
--package.json
One way of doing this is to download the file on the client side and then ask the client to upload it, but that doesn't work for me for the reasons above.
Here is my script.js. Note that I don't have access to variables such as chunks_audio[] on the server:
const chunks_audio = [];
mediaRecorder.onstop = async function (e) {
  console.log("DONE WITH AUDIO");
  const blob = new Blob(chunks_audio, {
    'type': "audio/ogg; codecs=opus"
  });
  const audioURL = window.URL.createObjectURL(blob);
  console.log(audioURL);
  var link = document.getElementById("downloadaudio");
  link.href = audioURL;
  var audioMIMEtypes = ["audio/aac", "audio/mpeg", "audio/ogg", "audio/opus", "audio/wav"];
  const audio = blob;
  const audiodb = new AudioSchema({
    name: "Audio" + Date.now().toString()[5]
  });
  saveAudio(audiodb, audio);
  try {
    const new_audio = await audiodb.save();
    console.log("AUDIO UPLOADED" + new_audio);
  } catch (err) {
    console.log(err);
  }
  function saveAudio(audiodb, audioEncoded) {
    if (audioEncoded == null) return;
    console.log("before parse: " + audioEncoded);
    const audio = JSON.parse(audioEncoded);
    console.log("JSON parse: " + audio);
    if (audio != null && audioMIMEtypes.includes(audio.type)) {
      audiodb.audio = Buffer.from(audio.data, "base64");
      audiodb.audioType = audio.type;
    }
  }
};
// module.exports = chunks_audio; (This doesn't work for obvious reasons)
Here is my server file (index.js). I tried using a POST request where the user posts the audio file after it has been downloaded, but the user could post any other file through the <input> tag, so that doesn't match my requirement:
var audioMIMEtypes = ["audio/aac", "audio/mpeg", "audio/ogg", "audio/opus", "audio/wav"];
app.post('/', async (req, res, next) => {
  const audio = blob; // 'blob' is a variable in script.js, hence not accessible here
  const audiodb = new AudioSchema({
    name: "Audio" + Date.now().toString()[5]
  });
  saveAudio(audiodb, audio);
  try {
    const new_audio = await audiodb.save();
    console.log("AUDIO UPLOADED" + new_audio);
  } catch (err) {
    console.log(err);
  }
  function saveAudio(audiodb, audioEncoded) {
    if (audioEncoded == null) return;
    console.log("before parse: " + audioEncoded);
    const audio = JSON.parse(audioEncoded);
    console.log("JSON parse: " + audio);
    if (audio != null && audioMIMEtypes.includes(audio.type)) {
      audiodb.audio = Buffer.from(audio.data, "base64");
      audiodb.audioType = audio.type;
    }
  }
});
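For what it's worth, a browser variable can never be read by the server directly; the data has to cross the network somehow. A hedged sketch of posting the recorded blob programmatically (so there is no <input> tag the user can swap files into), assuming an Express server with multer; the /upload-audio route name is made up here:
// In public/script.js, inside mediaRecorder.onstop, after `blob` is built:
const form = new FormData();
form.append('audio', blob, 'recording.ogg');
fetch('/upload-audio', { method: 'POST', body: form });

// In index.js, assuming multer is installed:
const multer = require('multer');
const upload = multer(); // no options: the upload is kept in memory as req.file.buffer
app.post('/upload-audio', upload.single('audio'), (req, res) => {
  // req.file.buffer is a Buffer that could be stored via AudioSchema
  res.sendStatus(200);
});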
I am creating a chat app (in React Native), but for now I have made some tests in vanilla JavaScript. The server is a NodeJS server.
It works for sending text messages, but now I have some questions about sending photos, videos, and audio files. I've done a lot of research online on the best method to do this.
I came up with the idea of using the FileReader API to split the file into chunks and send them chunk by chunk via socket.emit().
This is my code so far (simplified):
Please note that I will eventually create a React Native app, but for now (for testing) I've just created an HTML file with an upload form.
// index.html
// the page where my upload form is
var reader = {};
var file = {};
var sliceSize = 1000 * 1024;
var socket = io('http://localhost:8080');

const startUpload = e => {
  e.preventDefault();
  reader = new FileReader();
  file = $('#file')[0].files[0];
  uploadFile(0);
};

$('#start-upload').on('click', startUpload);

const uploadFile = start => {
  var slice = start + sliceSize + 1;
  var blob = file.slice(start, slice);
  reader.onloadend = e => {
    if (slice < file.size) {
      socket.emit('message', JSON.stringify({
        fileName: file.name,
        fileType: file.type,
        fileChunk: e.target.result
      }));
      // continue with the next chunk
      uploadFile(slice);
    } else {
      console.log('Upload completed!');
    }
  };
  reader.readAsDataURL(blob);
};
// app.js
// my NodeJS server-file
var file;
var files = {};

io.on('connection', socket => {
  console.log('User connected!');

  // when a message is received
  socket.on('message', data => {
    file = JSON.parse(data);
    if (!files[file.fileName]) {
      // this is the first chunk received
      // create a new string
      files[file.fileName] = '';
    }
    // append the binary data
    files[file.fileName] = files[file.fileName] + file.fileChunk;
  });

  // on disconnect
  socket.on('disconnect', () => {
    console.log('User disconnected!');
  });
});
I did not include any checks for file type (I'm not at that point yet); I first want to make sure that this is the right approach.
Stuff I need to do:
Send a message (like socket.emit('uploaddone', ...)) from the client to the server to notify the server that the upload is done (and the server can emit the complete file to another user).
My questions are:
Is it okay to send chunks of binary data (base64) over a socket, or would it take up too much bandwidth?
Will I lose some quality (photos/videos/audio files) when splitting them up into chunks?
If there is a better way to do this, please let me know. I'm not asking for working code examples, just some guidance in the good direction.
You can send raw bytes over WebSocket; base64 has a 33% size overhead.
You also won't have to JSON.stringify the whole (and possibly large) body and parse it on the receiving side.
Will I lose some quality
No, the underlying protocol (TCP) delivers data in order and without corruption.
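A rough sketch of what raw-byte transfer could look like (my illustration, not the asker's code; the 'chunk' event name and the chunks map are made up; socket.io transports ArrayBuffer/Buffer payloads natively):
// Client: read a slice as raw bytes and emit the binary payload directly
const slice = file.slice(start, start + sliceSize);
slice.arrayBuffer().then(buf => {
  socket.emit('chunk', { fileName: file.name, fileType: file.type }, buf);
});

// Server: binary payloads arrive as Node Buffers
socket.on('chunk', (meta, buf) => {
  chunks[meta.fileName] = Buffer.concat([chunks[meta.fileName] || Buffer.alloc(0), buf]);
});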
I realize this answer is a couple of months late, but just for future reference you should look into using the acknowledgment option with socket.io here
// with acknowledgement
let message = JSON.stringify({
  fileName: file.name,
  fileType: file.type,
  fileChunk: e.target.result
});
socket.emit("message", message, (ack) => {
  // send next chunk...
});
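For completeness, the matching server side could look like this (a sketch reusing the files map from the question; calling the extra handler argument is what fires the client's callback):
socket.on('message', (data, ack) => {
  const chunk = JSON.parse(data);
  files[chunk.fileName] = (files[chunk.fileName] || '') + chunk.fileChunk;
  ack(); // fires the client's (ack) => {...} callback, which can then send the next chunk
});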
I am a newbie to both JavaScript and IPFS, and I am running an experiment to fetch an image buffer from the IPFS hash "QmdD8FL7N3kFnWDcPSVeD9zcq6zCJSUD9rRSdFp9tyxg1n" using the ipfs-mini node module.
Below is my code
const IPFS = require('ipfs-mini');
const FileReader = require('filereader');
const fs = require('fs');
var multer = require('multer');
const ipfs = initialize();

app.post('/upload', function (req, res) {
  upload(req, res, function (err) {
    console.log(req.file.originalname);
    ipfs.cat('QmdD8FL7N3kFnWDcPSVeD9zcq6zCJSUD9rRSdFp9tyxg1n', function (err, data) {
      if (err) console.log("could not get the image from the ipfs for hash " + ghash);
      else {
        var wrt = data.toString('base64');
        console.log('size ; ' + wrt.length);
        fs.writeFile('tryipfsimage.gif', wrt, (err) => {
          if (err) console.log('can not write file');
          else {
            //console.log(data);
            ipfs.stat('QmdD8FL7N3kFnWDcPSVeD9zcq6zCJSUD9rRSdFp9tyxg1n', (err, data) => {
              // console.log(hexdump(wrt));
            });
            console.log("files written successfully");
          }
        });
      }
    });
  });
});

function initialize() {
  console.log('Initializing the ipfs object');
  return new IPFS({
    host: 'ipfs.infura.io',
    protocol: 'https'
  });
}
I can view the image properly in the browser via "https://ipfs.io/ipfs/QmdD8FL7N3kFnWDcPSVeD9zcq6zCJSUD9rRSdFp9tyxg1n", but if I open the file 'tryipfsimage.gif', into which I dump the buffer returned by the cat API in the program above, the image content seems corrupted. I am not sure what mistake I am making in the code; it would be great if someone could point it out.
From the ipfs docs: https://github.com/ipfs/interface-ipfs-core/blob/master/SPEC/FILES.md#javascript---ipfsfilescatipfspath-callback
file in the callback is actually a Buffer, so by toString('base64')'ing it you are writing actual base64 text into the .gif file; there is no need to do this. You can pass the Buffer directly to the fs.writeFile API with
fs.writeFile('tryipfsimage.gif', file, ...
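Spelled out against the question's code, that would look something like this (a sketch; data is the Buffer handed to the ipfs.cat callback):
ipfs.cat('QmdD8FL7N3kFnWDcPSVeD9zcq6zCJSUD9rRSdFp9tyxg1n', function (err, data) {
  if (err) return console.log('could not get the image from ipfs', err);
  // data is already a Buffer; write it out unmodified
  fs.writeFile('tryipfsimage.gif', data, function (err) {
    if (err) console.log('can not write file');
    else console.log('file written successfully');
  });
});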
For larger files I would recommend using the ipfs catReadableStream, where you can do something more like:
const stream = ipfs.catReadableStream('QmdD8FL7N3kFnWDcPSVeD9zcq6zCJSUD9rRSdFp9tyxg1n');
// don't forget to add error handlers to stream and whatnot
const fileStream = fs.createWriteStream('tryipfsimage.gif');
stream.pipe(fileStream);
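The error handlers the comment alludes to could be as simple as (a sketch):
stream.on('error', err => console.error('reading from ipfs failed:', err));
fileStream.on('error', err => console.error('writing the file failed:', err));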
I'm trying to insert an image into a PDF I'm creating server-side with PDFKit. I'm using cfs:dropbox to store my files. Before, when I was using cfs:filesystem, it was easy to add images to my PDFs because they were right there. Now that they're stored remotely, I'm not sure how to add them, since PDFKit does not support adding images from just a URL. It will, however, accept a buffer. How can I get a buffer from my CollectionFS files?
So far I have something like this:
var portrait = Portraits.findOne('vS2yFy4gxXdjTtz5d');
readStream = portrait.createReadStream('portraits');
I tried getting the buffer two ways so far:
First using dataMan, but the last command never comes back:
var dataMan = new DataMan.ReadStream(readStream, portrait.type());
var buffer = Meteor.wrapAsync(Function.prototype.bind(dataMan.getBuffer, dataMan))();
Second, buffering the stream manually:
var buffer = new Buffer(0);
readStream.on('readable', function() {
buffer = Buffer.concat([buffer, readStream.read()]);
});
readStream.on('end', function() {
console.log(buffer.toString('base64'));
});
That never seems to come back either. I double-checked my doc to make sure the file was there; it has a valid URL, and the image appears when I put the URL in my browser. Am I missing something?
I had to do something similar and since there's no answer to this question, here is how I do it:
// take a cfs file and return a base64 string
var getBase64Data = function (file, callback) {
  // callback has the form function (err, res) {}
  var readStream = file.createReadStream();
  var buffer = [];
  readStream.on('data', function (chunk) {
    buffer.push(chunk);
  });
  readStream.on('error', function (err) {
    callback(err, null);
  });
  readStream.on('end', function () {
    // join all collected chunks into a single Buffer before encoding
    callback(null, Buffer.concat(buffer).toString('base64'));
  });
};
// wrap it to make it sync
var getBase64DataSync = Meteor.wrapAsync(getBase64Data);
// get a cfs file
var file = Files.findOne();
// get the base64 string
var base64str = getBase64DataSync(file);
// get the buffer from the string
var buffer = new Buffer(base64str, 'base64')
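Note: on current Node versions new Buffer(...) is deprecated; Buffer.from(base64str, 'base64') is the drop-in replacement.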
Hope it'll help!
I'm working on an offline application using HTML5 and jQuery for mobile. I want to back up files from local storage using JSZip. Below is a code snippet of what I have done:
if (localStorageKeys.length > 0) {
  for (var i = 0; i < localStorageKeys.length; i++) {
    var key = localStorageKeys[i];
    if (key.search(_instrumentId) != -1) {
      var data = localStorage.getItem(localStorageKeys[i]);
      var zip = new JSZip();
      zip.file(localStorageKeys[i] + ".txt", data);
      var datafile = document.getElementById('backupData');
      datafile.download = "DataFiles.zip";
      datafile.href = window.URL.createObjectURL(zip.generate({ type: "blob" }));
    }
    else {
    }
  }
}
In the code above I'm looping through the localStorage contents and saving each entry in text format. The challenge I'm facing is how to create several text files inside DataFiles.zip; currently I'm only able to create one text file inside the zipped folder. I'm new to JavaScript, so bear with any ambiguity in my question.
Thanks in advance.
Just keep calling zip.file().
Look at the example from their documentation page (comments mine):
var zip = new JSZip();
// Add a text file with the contents "Hello World\n"
zip.file("Hello.txt", "Hello World\n");
// Add another text file with the contents "Goodbye, cruel world\n"
zip.file("Goodbye.txt", "Goodbye, cruel world\n");
// Add a folder named "images"
var img = zip.folder("images");
// Add a file named "smile.gif" to that folder, from some Base64 data
img.file("smile.gif", imgData, {base64: true});
zip.generateAsync({type: "base64"}).then(function (content) {
  location.href = "data:application/zip;base64," + content;
});
The important thing is to understand the code you've written - learn what each line does. If you do this, you'd realize that you just need to call zip.file() again to add another file.
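Applied to the question's loop, that means creating the JSZip instance once before the loop and generating the archive once after it (a sketch based on the question's own code and its older synchronous zip.generate() API):
var zip = new JSZip();
for (var i = 0; i < localStorageKeys.length; i++) {
  var key = localStorageKeys[i];
  if (key.search(_instrumentId) != -1) {
    // one .txt entry per matching localStorage key
    zip.file(key + ".txt", localStorage.getItem(key));
  }
}
var datafile = document.getElementById('backupData');
datafile.download = "DataFiles.zip";
datafile.href = window.URL.createObjectURL(zip.generate({ type: "blob" }));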
Adding to Jonathon Reinhart's answer:
You can also set both the file name and its folder path at the same time.
// create a file and a folder
zip.file("nested/hello.txt", "Hello World\n");
// same as
zip.folder("nested").file("hello.txt", "Hello World\n");
If you receive a list of files (from the UI, an array, or wherever), you can add them all to a folder and then generate a compressed archive. The code is something like this:
function upload(files) {
  var zip = new JSZip();
  let archive = zip.folder("test");
  files.map(function (file) {
    // add each received file to the "test" folder inside the archive
    archive.file(file.name, file.raw, { base64: true });
  }.bind(this));
  return archive.generateAsync({
    type: "blob",
    compression: "DEFLATE",
    compressionOptions: {
      level: 6
    }
  }).then(function (content) {
    // send to server or whatever operation
  });
}
This worked for me with multiple JSON files. Maybe it helps.
In case you want to zip files and need a base64 output, you can use the code below:
import * as JSZip from 'jszip';

var zip = new JSZip();
zip.file("Hello.json", this.fileContent);
zip.generateAsync({ type: "base64" }).then(function (content) {
  const base64Data = content;
  // base64Data now holds the zip archive as a base64 string
});