My Node.js backend consumes a web service that downloads a file from a remote server. The backend has to relay that download to my React web app on the frontend.
The download works when I test the service through Swagger. So my backend returns a string with the raw data, which I re-interpret on the frontend to trigger the download.
My frontend code looks like this:
downloadFile() {
  let path = this.props.result.link;
  let name = this.props.result.fileName;
  downloader.downloadAFile(path, name).then(response => {
    if (response) {
      if (response.success === true) {
        let bytes = this.data64ToArrayBuffer(response.data);
        var file = new File([bytes], this.props.result.fileName + ".zip", { type: "application/zip" });
        fileSaver.saveAs(file);
      } else {
        console.log(response);
        if (response.data == "Fichier introuvable.") { // "File not found."
          window.alert(response.data);
        }
      }
    }
  }).catch(function (error) {
    console.log(error);
  });
}

data64ToArrayBuffer = (data) => {
  var binaryString = data;
  console.log(binaryString);
  var binaryLen = binaryString.length;
  var bytes = new Uint8Array(binaryLen);
  for (var i = 0; i < binaryLen; i++) {
    bytes[i] = binaryString.charCodeAt(i);
  }
  return bytes;
}
For a file named "test.txt", containing the text "test", at the path "/PATH/TO/FOLDER", the API should offer a zip file named "test.txt.zip" for download, containing the whole folder tree /PATH/TO/FOLDER, with the mentioned file in the final folder.
My frontend download is partly working: it produces a zip with the right name, the right size, and the right folder tree inside.
However, the test.txt file in the final folder of the path seems to be corrupted: when I try to open it in 7-Zip, I get the error "CRC failed : test.txt".
I suspect the problem is in how I handle the raw data. I'll take any hints or leads, since I am completely blocked.
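If the backend base64-encodes the raw data (which the name data64ToArrayBuffer suggests, though the question does not confirm it), the string has to be decoded with atob() before extracting char codes; mapping the base64 text directly to bytes produces exactly this kind of CRC corruption. A minimal sketch, under that base64 assumption:

```javascript
// Sketch: decode base64 into raw bytes before handing them to File().
// Assumes response.data is base64 -- if it is already a raw binary string,
// the atob() step must be skipped.
function base64ToUint8Array(data) {
  var binaryString = atob(data); // base64 -> binary string
  var bytes = new Uint8Array(binaryString.length);
  for (var i = 0; i < binaryString.length; i++) {
    bytes[i] = binaryString.charCodeAt(i); // each char code is now a real byte (0-255)
  }
  return bytes;
}
```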
I am creating a chat app (in React Native), but for now I have run some tests in vanilla JavaScript. The server is a Node.js server.
Sending text messages works, but now I have some questions about sending photos, videos, and audio files. I have been doing a lot of research on the best way to do this.
I came up with the idea of using the FileReader API to split the file into chunks and send it chunk by chunk via socket.emit().
This is my code so far (simplified):
Please note that I will eventually build a React Native app, but for now (for testing) I have just created an HTML file with an upload form.
// index.html
// the page where my upload form is
var reader = {};
var file = {};
var sliceSize = 1000 * 1024;
var socket = io('http://localhost:8080');

const startUpload = e => {
  e.preventDefault();
  reader = new FileReader();
  file = $('#file')[0].files[0];
  uploadFile(0);
}

$('#start-upload').on('click', startUpload)

const uploadFile = start => {
  var slice = start + sliceSize + 1;
  var blob = file.slice(start, slice);
  // FileReader has no .on(); assign the onloadend handler instead
  reader.onloadend = e => {
    socket.emit('message', JSON.stringify({
      fileName: file.name,
      fileType: file.type,
      fileChunk: e.target.result
    }));
    if (slice < file.size) {
      uploadFile(slice); // read and send the next chunk
    } else {
      console.log('Upload completed!');
    }
  };
  reader.readAsDataURL(blob);
}
// app.js
// my NodeJS server-file
var file;
var files = {};

io.on('connection', socket => {
  console.log('User connected!');

  // when a message is received
  socket.on('message', data => {
    file = JSON.parse(data);
    if (!files[file.fileName]) {
      // this is the first chunk received
      // create a new string
      files[file.fileName] = '';
    }
    // append the chunk (a base64 data URL string)
    files[file.fileName] = files[file.fileName] + file.fileChunk;
  });

  // on disconnect
  socket.on('disconnect', () => {
    console.log('User disconnected!');
  });
});
I have not included any checks for file type (I'm not at that point yet); I first want to make sure this is the right approach.
Stuff I still need to do:
Send a message (like socket.emit('uploaddone', ...)) from the client to the server to notify the server that the upload is done (so the server can emit the complete file to another user).
My questions are:
Is it okay to send chunks of binary data (base64) over a socket, or would it take up too much bandwidth?
Will I lose quality (for photos/videos/audio files) by splitting them into chunks?
If there is a better way to do this, please let me know. I'm not asking for working code examples, just some guidance in the right direction.
You can send raw bytes over a WebSocket; base64 adds roughly 33% of size overhead.
You also won't have to JSON.stringify the whole (and possibly large) body and parse it on the client side.

Will I lose some quality

No. The underlying protocol (TCP) delivers data in order and without corruption.
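To make that concrete: socket.io can carry Buffer/ArrayBuffer payloads as binary frames, so the server can accumulate Buffers instead of concatenating base64 strings. A sketch of the server-side bookkeeping (the names here are illustrative, not from the question's code):

```javascript
// Sketch: accumulate raw binary chunks as Buffers instead of strings.
// When the client emits an ArrayBuffer, socket.io delivers it to Node as a Buffer.
const files = {};

function appendChunk(fileName, chunk) {
  const buf = Buffer.from(chunk); // normalize ArrayBuffer/Buffer input
  files[fileName] = files[fileName]
    ? Buffer.concat([files[fileName], buf]) // append without base64 inflation
    : buf;
  return files[fileName].length; // total bytes received so far
}
```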
I realize this answer is a couple of months late, but for future reference: you should look into using socket.io's acknowledgement option here.
// with acknowledgement
let message = JSON.stringify({
  fileName: file.name,
  fileType: file.type,
  fileChunk: e.target.result
});
socket.emit("message", message, (ack) => {
  // send next chunk...
});
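On the server side, the acknowledgement callback is the extra argument socket.io passes to the event handler; invoking it is what fires the client's (ack) => ... function. A sketch with the handler pulled out as a plain function (the shape of the ack payload is my assumption):

```javascript
// Sketch: server-side handler with an acknowledgement callback.
// socket.io passes the ack function as the last argument when the
// client supplied a callback to socket.emit().
function onMessage(message, ack) {
  const chunk = JSON.parse(message);
  // ... append chunk.fileChunk to the file being assembled ...
  ack({ received: chunk.fileChunk.length }); // client sends its next chunk on this
}
// wiring (assumed): socket.on('message', onMessage);
```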
I am trying to upload a folder from my local directory to my server on S3 using an API call.
This works fine for small folders (number of files: 100, size: 51 MB); however, for large folders (number of files: 868, size: 180 MB), the tab crashes after a few seconds with the message "Aw, snap! Something went wrong while displaying this webpage.".
Uploading works by first selecting the folder locally, which gives me the paths of all the files in it, and then converting them into a tree structure to send in the body of an API call. The function used is given below:
// files are the paths of the files present in the folder
this.readFolder = function(files) {
  return new Promise(function(resolve, reject) {
    var tree = {};
    var loadCount = 0; // must live outside the loop so it accumulates across files
    for (var i = 0; i < files.length; i++) {
      var file = files[i];
      var reader = new FileReader();
      if (file.type && file.type.indexOf('image') > -1) {
        reader.readAsDataURL(file);
      } else {
        reader.readAsText(file);
      }
      (function(file, reader) {
        reader.onload = function() {
          var path = file.webkitRelativePath.split('/');
          var folder = tree;
          for (var i in path) {
            if (i == path.length - 1) {
              folder[path[i]] = reader.result;
            } else {
              folder[path[i]] = folder[path[i]] || {};
            }
            folder = folder[path[i]];
          }
          loadCount++;
          if (loadCount == files.length) {
            resolve(tree);
          }
        };
      })(file, reader);
    }
  });
};
The tree structure returned by the function is then sent in the body of an API call.

$http.post("/api/upload", {
  data: tree
}).then(function(res) {
  if (res.data.error) {
    toaster.pop('error', '', res.data.reason);
  } else {
    toaster.pop('success', '', 'Uploaded successfully.');
  }
}, function(rej) {
  var error = (rej.data) ? rej.data.msg : "Some error occurred.";
  toaster.pop('error', '', error);
});
After some digging, I found that this might be happening because the request body is converted to a string, and building very large strings can cause memory issues in Chrome. Can somebody tell me exactly why this is failing and how to make it work?
PS: I couldn't find any error logged on the server or in the developer console; also, after the tab crashes, developer tools gets disconnected.
Also, this works in Mozilla Firefox.
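One way to sidestep the giant-string problem (a sketch, not the asker's actual API): upload the folder in smaller batches so that no single request body balloons. The batch size, and the idea of calling readFolder once per batch, are assumptions on my part:

```javascript
// Sketch: split the selected files into fixed-size batches and send each
// batch as its own request, keeping every serialized body small.
function makeBatches(files, batchSize) {
  var batches = [];
  for (var i = 0; i < files.length; i += batchSize) {
    batches.push(files.slice(i, i + batchSize));
  }
  return batches;
}
// usage idea (assumed endpoint and helpers):
//   makeBatches(files, 50).forEach(function(batch) {
//     readFolder(batch).then(function(tree) { $http.post('/api/upload', { data: tree }); });
//   });
```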
I am trying to write a file uploader for the Meteor framework.
The principle is to split the file on the client, from an ArrayBuffer, into small packets of 4096 bytes that are sent to the server through a Meteor method.
The simplified code below is the client-side part that sends a chunk to the server; it is repeated until offset reaches data.byteLength:
// data is an ArrayBuffer
var total = data.byteLength;
var offset = 0;

var upload = function() {
  var length = 4096; // chunk size
  // adjust the last chunk size
  if (offset + length > total) {
    length = total - offset;
  }
  // I am using Uint8Array to create the chunk
  // because it can be passed to the Meteor.method natively
  var chunk = new Uint8Array(data, offset, length);
  if (offset < total) {
    // Send the chunk to the server and tell it what file to append to
    Meteor.call('uploadFileData', fileId, chunk, function (err, length) {
      if (!err) {
        offset += length;
        upload();
      }
    });
  }
};
upload(); // start uploading
The simplified code below is the part on the server that receives the chunk and writes it to the file system :
var fs = Npm.require('fs');
var Future = Npm.require('fibers/future');
Meteor.methods({
uploadFileData: function(fileId, chunk) {
var fut = new Future();
var path = '/uploads/' + fileId;
// I tried that with no success
chunk = String.fromCharCode.apply(null, chunk);
// how to write the chunk that is an Uint8Array to the disk ?
fs.appendFile(path, chunk, 'binary', function (err) {
if (err) {
fut.throw(err);
} else {
fut.return(chunk.length);
}
});
return fut.wait();
}
});
I failed to write a valid file to disk. The file is saved, but I cannot open it: viewed in a text editor, its content is similar to the original file (a jpg, for example) but some characters differ. I think it could be an encoding problem, since the file size is not the same, but I don't know how to fix it...
Saving the file was as easy as creating a new Buffer from the Uint8Array object:

// chunk is the Uint8Array object
fs.appendFile(path, Buffer.from(chunk), function (err) {
  if (err) {
    fut.throw(err);
  } else {
    fut.return(chunk.length);
  }
});
Building on Karl.S's answer, this worked for me, outside of any framework:
fs.appendFileSync(outfile, Buffer.from(arrayBuffer));
Just wanted to add that in newer Meteor versions you can avoid some callback hell with async/await. await will also rethrow, pushing the error up to the client.

Meteor.methods({
  uploadFileData: async function(file_id, chunk) {
    var path = 'somepath/' + file_id; // be careful with this, make sure to sanitize file_id
    // note: the promise-based API lives on fs.promises, and new Buffer() is deprecated
    await fs.promises.appendFile(path, Buffer.from(chunk));
    return chunk.length;
  }
});
I'm trying to open a binary file './test' and send its contents, one byte at a time, to an external device through the UART. The external device echoes back each byte.
The regular file './test' and the buffer 'dta' are both 19860 bytes long, yet the code keeps sending bytes from beyond the end of 'dta' well after 'a' becomes greater than 'dta.length', and I can't figure out why. Any ideas?
var fs = require('fs');
var SerialPort = require("serialport").SerialPort;
var serialPort = new SerialPort("/dev/ttyAMA0", { baudrate: 115200 }, false);

stats = fs.statSync(__dirname + "/test");
dta = new Buffer(stats.size);
dta = fs.readFileSync(__dirname + "/test");
a = 0;

serialPort.open(function (error) {
  if (error) {
    console.log('failed to open: ' + error);
  } else {
    serialPort.write(dta[a]);
  }
});

serialPort.on('data', function(data) {
  a++;
  if (a < dta.length) serialPort.write(dta[a]);
});
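A hedged guess (I have not verified this against the serialport version in use): older node-serialport releases coerce a numeric write() argument through Buffer(n), which allocates an n-byte buffer instead of sending one byte, so write(dta[a]) can transmit far more data than intended. Slicing the Buffer avoids the coercion, and counting echoed bytes guards against 'data' events that batch several bytes:

```javascript
// Sketch: always hand write() a one-byte Buffer, never a bare number.
function nextFrame(dta, a) {
  return dta.slice(a, a + 1); // one-byte Buffer view at offset a
}
// wiring sketch against the question's code (assumed):
//   serialPort.write(nextFrame(dta, a));
//   serialPort.on('data', function (data) {
//     a += data.length; // a 'data' event may carry several echoed bytes
//     if (a < dta.length) serialPort.write(nextFrame(dta, a));
//   });
```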
I am working on an offline application using HTML5 and jQuery for mobile. I want to back up files from local storage using JSZip. Below is a code snippet of what I have done:
if (localStorageKeys.length > 0) {
  for (var i = 0; i < localStorageKeys.length; i++) {
    var key = localStorageKeys[i];
    if (key.search(_instrumentId) != -1) {
      var data = localStorage.getItem(localStorageKeys[i]);
      var zip = new JSZip();
      zip.file(localStorageKeys[i] + ".txt", data);
      var datafile = document.getElementById('backupData');
      datafile.download = "DataFiles.zip";
      datafile.href = window.URL.createObjectURL(zip.generate({ type: "blob" }));
    }
    else {
    }
  }
}
In the code above I am looping through the localStorage content and saving each file in text format. The challenge I am facing is how to create several text files inside DataFiles.zip; currently I am only able to create one text file inside the zipped folder. I am new to JavaScript, so bear with any ambiguity in my question.
Thanks in advance.
Just keep calling zip.file().
Look at the example from their documentation page (comments mine):
var zip = new JSZip();
// Add a text file with the contents "Hello World\n"
zip.file("Hello.txt", "Hello World\n");
// Add another text file with the contents "Goodbye, cruel world\n"
zip.file("Goodbye.txt", "Goodbye, cruel world\n");
// Add a folder named "images"
var img = zip.folder("images");
// Add a file named "smile.gif" to that folder, from some Base64 data
img.file("smile.gif", imgData, {base64: true});
zip.generateAsync({type:"base64"}).then(function (content) {
location.href="data:application/zip;base64," + content;
});
The important thing is to understand the code you've written - learn what each line does. If you do this, you'd realize that you just need to call zip.file() again to add another file.
Adding to @Jonathon Reinhart's answer:
You can also set both the file name and path at the same time.
// create a file and a folder
zip.file("nested/hello.txt", "Hello World\n");
// same as
zip.folder("nested").file("hello.txt", "Hello World\n");
If you receive a list of files (from the UI, an array, or wherever), you can compress them into a folder before generating the archive. The code is something like this:

function upload(files) {
  var zip = new JSZip();
  let archive = zip.folder("test");
  files.map(function(file) {
    archive.file(file.name, file.raw, { base64: true }); // note: archive, not files
  });
  return archive.generateAsync({
    type: "blob",
    compression: "DEFLATE",
    compressionOptions: {
      level: 6
    }
  }).then(function(content) {
    // send to server or whatever operation
  });
}
This worked for me with multiple JSON files. Maybe it helps.
In case you want to zip files and need a base64 output, you can use the code below:

import * as JSZip from 'jszip';

var zip = new JSZip();
zip.file("Hello.json", this.fileContent);
zip.generateAsync({ type: "base64" }).then(function (content) {
  const base64Data = content;
});