I am creating a chat app (in React Native), but for now I have made some tests in vanilla JavaScript. The server is a Node.js server.
Sending text messages works, but now I have some questions about sending photos/videos/audio files. I'm doing a lot of research online on the best method to do this.
I came up with the idea of using the FileReader API to split the file into chunks and send it chunk by chunk via socket.emit().
This is my code so far (simplified):
Please note that I will create a React Native app, but for now (for testing) I've just created an HTML file with an upload form.
// index.html
// the page where my upload form is

var reader = {};
var file = {};
var sliceSize = 1000 * 1024;
var socket = io('http://localhost:8080');

const startUpload = e => {
    e.preventDefault();
    reader = new FileReader();
    file = $('#file')[0].files[0];
    uploadFile(0);
};

$('#start-upload').on('click', startUpload);

const uploadFile = start => {
    var slice = start + sliceSize; // slice() treats the end offset as exclusive
    var blob = file.slice(start, slice);
    reader.onloadend = e => {
        // send this chunk, then continue with the next one (if any)
        socket.emit('message', JSON.stringify({
            fileName: file.name,
            fileType: file.type,
            fileChunk: e.target.result
        }));
        if (slice < file.size) {
            uploadFile(slice); // read and send the next chunk
        } else {
            console.log('Upload completed!');
        }
    };
    reader.readAsDataURL(blob);
};
// app.js
// my NodeJS server-file

var file;
var files = {};

io.on('connection', socket => {
    console.log('User connected!');

    // when a message is received
    socket.on('message', data => {
        file = JSON.parse(data);
        if (!files[file.fileName]) {
            // this is the first chunk received: create a new string
            files[file.fileName] = '';
        }
        // append the base64 chunk data
        files[file.fileName] += file.fileChunk;
    });

    // on disconnect
    socket.on('disconnect', () => {
        console.log('User disconnected!');
    });
});
I did not include any checks for file type (I'm not at that point yet); I first want to make sure that this is the right approach.
Stuff I need to do:
Send a message (like socket.emit('uploaddone', ...)) from the client to the server to notify the server that the upload is done, so that the server can emit the complete file to another user, as sketched below.
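A rough sketch of that notification and relay step, reusing the files object from the server code above (the 'file' relay event name is an assumption):

// Client, after the last chunk has been emitted:
socket.emit('uploaddone', JSON.stringify({ fileName: file.name }));

// Server: relay the assembled file to the other connected user(s)
socket.on('uploaddone', data => {
    const { fileName } = JSON.parse(data);
    socket.broadcast.emit('file', JSON.stringify({
        fileName: fileName,
        fileData: files[fileName] // the concatenated chunks
    }));
});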
My questions are:
Is it okay to send chunks of binary data (base64) over a socket, or would it take up too much bandwidth?
Will I lose some quality (photos/videos/audio files) when splitting them up into chunks?
If there is a better way to do this, please let me know. I'm not asking for working code examples, just some guidance in the good direction.
"Is it okay to send chunks of binary data (base64) over a socket?"
You can send raw bytes over a WebSocket; base64 adds roughly 33% size overhead. You also won't have to JSON.stringify the whole (possibly large) body and parse it on the receiving side.
"Will I lose some quality?"
No. The underlying protocol (TCP) delivers data in order and without corruption.
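A minimal sketch of the raw-bytes approach (assuming socket.io, which transmits ArrayBuffer and Buffer payloads as binary frames natively; the 'chunk' event name and the chunks array are placeholders, not part of the question's code):

// Client: read each slice as raw bytes instead of a base64 data URL
reader.onloadend = e => {
    socket.emit('chunk', {
        fileName: file.name,       // metadata travels as ordinary JSON
        fileType: file.type,
        fileChunk: e.target.result // ArrayBuffer, transmitted as raw binary
    });
};
reader.readAsArrayBuffer(blob);

// Server: fileChunk arrives as a Node Buffer
socket.on('chunk', msg => {
    chunks.push(msg.fileChunk); // collect Buffers and Buffer.concat() them when done
});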
I realize this answer is a couple of months late, but for future reference you should look into using socket.io's acknowledgement option here:
// with acknowledgement
let message = JSON.stringify({
    fileName: file.name,
    fileType: file.type,
    fileChunk: e.target.result
});

socket.emit("message", message, (ack) => {
    // send next chunk...
});
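On the server side, socket.io passes the acknowledgement callback as an extra argument to the event handler; invoking it is what fires the client-side callback above (a sketch, reusing the files object from the question):

socket.on('message', (data, ack) => {
    const chunk = JSON.parse(data);
    files[chunk.fileName] = (files[chunk.fileName] || '') + chunk.fileChunk;
    ack(); // confirms receipt, so the client knows it can send the next chunk
});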
I have a client attempting to send images to a server over BLE.
Client Code
//BoilerPlate to setup connection and whatnot

sendFile.onclick = async () => {
    var fileList = document.getElementById("myFile").files;
    var fileReader = new FileReader();
    if (fileReader && fileList && fileList.length) {
        fileReader.readAsArrayBuffer(fileList[0]);
        fileReader.onload = function () {
            var imageData = fileReader.result;
            // Server doesn't get data if I don't do this chunking
            imageData = imageData.slice(0, 512);
            const base64String = _arrayBufferToBase64(imageData);
            document.getElementById("ItemPreview").src = "data:image/jpeg;base64," + base64String;
            sendCharacteristic.writeValue(imageData);
        };
    }
};
Server Code
MyCharacteristic.prototype.onWriteRequest = function (data, offset, withoutResponse, callback) {
    // It seems this will not print out if the client sends over 512B.
    console.log(this._value);
};
My goal is to send small images (just ~6 kB). These are still small enough that I'd prefer to use BLE over a BT serial connection. Is the only way to make this work to perform some chunking and then stream the chunks over?
Current 'Chunking' Code
const MAX_LENGTH = 512;

for (let i = 0; i < bytes.byteLength; i += MAX_LENGTH) {
    const end = (i + MAX_LENGTH > bytes.byteLength) ? bytes.byteLength : i + MAX_LENGTH;
    const chunk = bytes.slice(i, end);
    sendCharacteristic.writeValue(chunk);
    await sleep(1000);
}
The above code works, but it sleeps between sends. I'd rather not do this, because there's no guarantee a previous packet has finished sending, and I could be sleeping longer than needed.
I'm also perplexed as to how the server code would know that the client has finished sending all bytes so it can assemble them. Is there some kind of pattern for achieving this?
BLE characteristic values can be at most 512 bytes, so yes, the common way to send larger data is to split it into multiple chunks. Use "Write Without Response" for best performance (the MTU minus 3 bytes must be at least as large as your chunk).
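As for pacing and completion: awaiting each write already paces the transfer, and a simple agreed-upon convention such as a length header tells the server when the transfer is complete. A minimal sketch with the Web Bluetooth API (assuming the characteristic supports Write Without Response; the 4-byte length header is just one possible convention, not a BLE feature):

const MAX_LENGTH = 512;

// Tell the server how many bytes to expect (assumed convention)
const header = new DataView(new ArrayBuffer(4));
header.setUint32(0, bytes.byteLength);
await sendCharacteristic.writeValueWithoutResponse(header.buffer);

for (let i = 0; i < bytes.byteLength; i += MAX_LENGTH) {
    const chunk = bytes.slice(i, Math.min(i + MAX_LENGTH, bytes.byteLength));
    // Each await resolves once the local stack has queued the packet,
    // so no fixed sleep() is needed between writes.
    await sendCharacteristic.writeValueWithoutResponse(chunk);
}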
I have an Express server that receives FormData with an attached FLAC audio file. The code works as expected for several files of varying size (10-70 MB), but some of them get stuck in the 'file' event and I cannot figure out why this happens. It is even stranger when a file that previously did not fire the file.on('close', () => {}) event, as can be seen in the documentation for Busboy, suddenly does so, with the file being uploaded successfully.
To me, this seems completely random, as I have tried this with a dozen files of varying size and content type (audio/flac and audio/x-flac), and the results have been inconsistent. Some files will not work at all, even if I attempt to parse them many times over, whereas certain files can be parsed and uploaded, given enough attempts.
Is there some error that I fail to deal with in the 'file' event? I did try listening to the file.on('error', () => {}) event, but there were no errors to be found. Other answers suggest that the file stream must be consumed for the 'close' event to fire, but I think that file.pipe(fs.createWriteStream(fileObject.filePath)); does that, correct?
Let me know if I forgot to include some important information in my question. This has been bothering me for about a week now, so I am happy to provide anything of relevance to help my chances of overcoming this hurdle.
app.post('/upload', (request, response) => {
    response.set('Access-Control-Allow-Origin', '*');

    const bb = busboy({ headers: request.headers });
    const fields = {};
    const fileObject = {};

    bb.on('file', (_name, file, info) => {
        const { filename, mimeType } = info;
        fileObject['mimeType'] = mimeType;
        fileObject['filePath'] = path.join(os.tmpdir(), filename);
        file.pipe(fs.createWriteStream(fileObject.filePath));
        file.on('close', () => {
            console.log('Finished parsing of file');
        });
    });

    bb.on('field', (name, value) => {
        fields[name] = value;
    });

    bb.on('close', () => {
        bucket.upload(
            fileObject.filePath,
            {
                uploadType: 'resumable',
                metadata: {
                    metadata: {
                        contentType: fileObject.mimeType,
                        firebaseStorageDownloadToken: fields.id
                    }
                }
            },
            (error, uploadedFile) => {
                if (error) {
                    console.log(error);
                } else {
                    db.collection('tracks')
                        .doc(fields.id)
                        .set({
                            identifier: fields.id,
                            artist: fields.artist,
                            title: fields.title,
                            imageUrl: fields.imageUrl,
                            fileUrl: `https://firebasestorage.googleapis.com/v0/b/${bucket.name}/o/${uploadedFile.name}?alt=media&token=${fields.id}`
                        });
                    response.send(`File uploaded: ${fields.id}`);
                }
            }
        );
    });

    request.pipe(bb);
});
UPDATE: 1
I decided to measure the number of bytes that were transferred upon each upload with file.on('data', (data) => {}), just to see if the issue was always the same, and it turns out that this too is completely random.
let bytes = 0;

file.on('data', (data) => {
    bytes += data.length;
    console.log(`Loaded ${(bytes / 1000000).toFixed(2)}MB`);
});
First Test Case: Fenomenon - Sleepy Meadows Of Buxton
Source: https://fenomenon.bandcamp.com/track/sleepy-meadows-of-buxton
Size: 30.3MB
Codec: FLAC
MIME: audio/flac
Results from three attempts:
Loaded 18.74MB, then became stuck
Loaded 5.05MB, then became stuck
Loaded 21.23MB, then became stuck
Second Test Case: Almunia - New Moon
Source: https://almunia.bandcamp.com/track/new-moon
Size: 38.7MB
Codec: FLAC
MIME: audio/flac
Results from three attempts:
Loaded 12.78MB, then became stuck
Loaded 38.65MB, was successfully uploaded!
Loaded 38.65MB, was successfully uploaded!
As you can see, the behavior is unpredictable, to say the least. Also, those two successful uploads played back seamlessly from Firebase Storage, so it really worked as intended. What I cannot understand is why it would not always work, or at least most of the time, excluding any network-related failures.
UPDATE: 2
I am hopelessly stuck trying to make sense of the issue, so I have now created a scenario that closely resembles my actual project, and uploaded the code to GitHub. It is pretty minimal, but I did add some additional libraries to make the front-end pleasant to work with.
There is not much to it, other than an Express server for the back-end and a simple Vue application for the front-end. Within the files folder there are two FLAC files; one of them is only 4.42MB, to prove that the code does sometimes work. The other file is much larger, at 38.1MB, to reliably illustrate the problem. Feel free to try any other files.
Note that the front-end must be modified to allow files other than FLAC files. I made the choice to only accept FLAC files, as this is what I am working with in my actual project.
You'll need to write the file out directly when Busboy emits the 'file' event.
There seems to be a race condition if you rely on Busboy's piping that prevents the file load from being completed; if you write it yourself in the 'file' event handler, it works fine.
app.post('/upload', (request, response) => {
    response.set('Access-Control-Allow-Origin', '*');

    const bb = busboy({ headers: request.headers });
    const fileObject = {};
    let bytes = 0;

    bb.on('file', (name, file, info) => {
        const { filename, mimeType } = info;
        fileObject['mimeType'] = mimeType;
        fileObject['filePath'] = path.join(os.tmpdir(), filename);

        const writeStream = fs.createWriteStream(fileObject.filePath);
        file.on('data', (data) => {
            writeStream.write(data);
            console.log(`Received: ${((bytes += data.length) / 1000000).toFixed(2)}MB`);
        });
        file.on('end', () => {
            console.log('closing writeStream');
            writeStream.close();
        });
    });

    bb.on('close', () => {
        console.log(`Actual size is ${(fs.statSync(fileObject.filePath).size / 1000000).toFixed(2)}MB`);
        console.log('This is where the file would be uploaded to some cloud storage server...');
        response.send('File was uploaded');
    });

    bb.on('error', (error) => {
        console.log(error);
    });

    request.pipe(bb);
});
I'm using Electron to develop an app. After some encryption operations are done, I need to show a dialog to the user to save the file. The filename I want to give the file is a random hash, but I have had no success with this either. I'm trying with this code, but the file will not be saved. How can I fix this?
const downloadPath = app.getPath('downloads');

ipcMain.on('encryptFiles', (event, data) => {
    let output = [];
    const password = data.password;

    data.files.forEach((file) => {
        const buffer = fs.readFileSync(file.path);
        const dataURI = dauria.getBase64DataURI(buffer, file.type);
        const encrypted = CryptoJS.AES.encrypt(dataURI, password).toString();
        output.push(encrypted);
    });

    const filename = hash.createHash('md5').toString('hex');
    console.log(filename);

    const response = output.join(' :: ');
    dialog.showSaveDialog({ title: 'Save encrypted file', defaultPath: downloadPath }, () => {
        fs.writeFile(`${filename}.mfs`, response, (err) => console.log(err));
    });
});
The problem you're experiencing results from the asynchronous nature of Electron's UI functions: they do not take callback functions, but return promises instead. Thus, you do not have to pass in a callback function, but rather handle the promise's resolution. Note that this only applies to Electron >= version 6; if you run an older version of Electron, your code would be correct, but then you should really update to a newer version (Electron v6 was released well over a year ago).
Adapting your code like below can be a starting point to solve your problem. However, since you do not state how you generate the hash (where does hash.createHash come from? did you forget to declare/import hash? did you forget to pass any message string? are you using hash as an alias for Node.js' crypto module?), it is (at this time) impossible to debug why you do not get any output from console.log(filename) (I assume you mean this by "the random filename will not be created"). Once you provide more details on this problem, I'd be happy to update this answer accordingly.
As for the default filename: as per the Electron documentation, you can pass a file path into dialog.showSaveDialog() to provide the user with a default filename.
The file extension you're using should also be passed into the save dialog. Passing this extension as a filter will prevent users from selecting any other file type, which is ultimately what you're currently doing by appending it to the filename.
Also, you could utilise CryptoJS for the filename generation: given some arbitrary string, which could really be random bytes, you could do filename = CryptoJS.MD5('some text here') + '.mfs'; However, remember to choose the input string wisely. MD5 has been broken and should thus no longer be used to store secrets; using any known information which is crucial for the encryption of the files you're storing (such as data.password) is inherently insecure. There are some good examples of how to create random strings in JavaScript around the internet, along with this answer here on SO.
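For example, since Electron's main process has full Node.js access, one could skip CryptoJS here entirely and use the built-in crypto module for the random filename (a minimal sketch):

const crypto = require('crypto');

// 16 random bytes yield a 32-character hexadecimal filename
const filename = crypto.randomBytes(16).toString('hex') + '.mfs';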
Taking all these issues into account, one might end up with the following code:
const downloadPath = app.getPath('downloads'),
      path = require('path');

ipcMain.on('encryptFiles', (event, data) => {
    let output = [];
    const password = data.password;

    data.files.forEach((file) => {
        const buffer = fs.readFileSync(file.path);
        const dataURI = dauria.getBase64DataURI(buffer, file.type);
        const encrypted = CryptoJS.AES.encrypt(dataURI, password).toString();
        output.push(encrypted);
    });

    // not working:
    // const filename = hash.createHash('md5').toString('hex') + '.mfs';
    // alternative requiring more research on your end
    const filename = CryptoJS.MD5('replace me with some random bytes') + '.mfs';
    console.log(filename);

    const response = output.join(' :: ');
    dialog.showSaveDialog({
        title: 'Save encrypted file',
        defaultPath: path.format({ dir: downloadPath, base: filename }), // construct a proper path
        filters: [{ name: 'Encrypted File (*.mfs)', extensions: ['mfs'] }] // filter the possible files
    }).then((result) => {
        if (result.canceled) return; // discard the result altogether; user has clicked "cancel"

        var filePath = result.filePath;
        if (!filePath.endsWith('.mfs')) {
            // This is an additional safety check which should not actually trigger.
            // However, generally appending a file extension to a filename is not a
            // good idea, as they would be (possibly) doubled without this check.
            filePath += '.mfs';
        }
        fs.writeFile(filePath, response, (err) => console.log(err));
    }).catch((err) => {
        console.log(err);
    });
});
More specifically, I have a blob that is initialized and processed in a .js file in the 'public' (static) folder. Since it has been processed on the client side, I want to know if there is a way to get access to the blob on the server side without using a POST request. The file in question has been processed and is stored in a variable in a static file (script.js). Now I need to upload that variable/blob to the database, but in a static file I don't have access to the database and can't export the variable to the server. How do I get access to that variable which is within the static file?
What my program does is record audio through the client's microphone, and that audio file has to be uploaded to the database. I could add a 'Download' option for the client, let the client download the file, and then have the client use an <input> tag to send a POST request to the server; but then the client could upload any audio file into that input tag. Basically, this is a web app for live monitoring of students writing an exam, so that they don't cheat I capture their audio and save it to the DB. Please refer to my folder structure for more details and then read the question again.
My Folder Structure:
--DBmodels
---AudioModels.js
---ImageModel.js
--node_modules
--public
---script.js (This contains the code for audio processing)
--views
---test.ejs (Main HTML page)
--index.js (server file)
--package.json
One way of doing this is to download the file on the client side and then ask the client to upload it, but that doesn't work for me for the reasons above.
Here is my script.js; note that I don't have access to variables such as chunks_audio[] on the server:
const chunks_audio = [];

mediaRecorder.onstop = async function (e) {
    console.log("DONE WITH AUDIO");
    const blob = new Blob(chunks_audio, {
        'type': "audio/ogg codecs=opus"
    });
    const audioURL = window.URL.createObjectURL(blob);
    console.log(audioURL);

    var link = document.getElementById("downloadaudio");
    link.href = audioURL;

    var audioMIMEtypes = ["audio/aac", "audio/mpeg", "audio/ogg", "audio/opus", "audio/wav"];
    const audio = blob;
    const audiodb = new AudioSchema({
        name: "Audio" + Date.now().toString()[5]
    });
    saveAudio(audiodb, audio);
    try {
        const new_audio = await audiodb.save();
        console.log("AUDIO UPLOADED" + new_audio);
    } catch (err) {
        console.log(err);
    }

    function saveAudio(audiodb, audioEncoded) {
        if (audioEncoded == null) return;
        console.log("before parse: " + audioEncoded);
        const audio = JSON.parse(audioEncoded);
        console.log("JSON parse: " + audio);
        if (audio != null && audioMIMEtypes.includes(audio.type)) {
            audiodb.audio = new Buffer.from(audio.data, "base64");
            audiodb.audioType = audio.type;
        }
    }
};

// module.exports = chunks_audio; (This doesn't work for obvious reasons)
Here is my server file (index.js). I tried to use a POST request where the user posts the audio file after it gets downloaded, but the user could post any other file in the <input> tag, so that doesn't match my requirement:
var audioMIMEtypes = ["audio/aac", "audio/mpeg", "audio/ogg", "audio/opus", "audio/wav"];

app.post('/', async (req, res, next) => {
    const audio = blob; // 'blob' is a variable in script.js, hence not accessible here
    const audiodb = new AudioSchema({
        name: "Audio" + Date.now().toString()[5]
    });
    saveAudio(audiodb, audio);
    try {
        const new_audio = await audiodb.save();
        console.log("AUDIO UPLOADED" + new_audio);
    } catch (err) {
        console.log(err);
    }

    function saveAudio(audiodb, audioEncoded) {
        if (audioEncoded == null) return;
        console.log("before parse: " + audioEncoded);
        const audio = JSON.parse(audioEncoded);
        console.log("JSON parse: " + audio);
        if (audio != null && audioMIMEtypes.includes(audio.type)) {
            audiodb.audio = new Buffer.from(audio.data, "base64");
            audiodb.audioType = audio.type;
        }
    }
});
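One pattern worth noting (a sketch only, with a hypothetical /upload-audio route and 'audio' field name): although script.js is served statically, it executes in the client's browser, so it can POST the in-memory blob itself with fetch, without any download/<input> round trip, and the server can then enforce the expected MIME type itself:

// In script.js, inside mediaRecorder.onstop, once the blob exists:
const fd = new FormData();
fd.append('audio', blob, 'recording.ogg'); // 'audio' field name is an assumption

fetch('/upload-audio', { method: 'POST', body: fd }) // hypothetical Express route
    .then(res => res.text())
    .then(msg => console.log('Server response:', msg))
    .catch(err => console.log(err));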
I am trying to speed up an upload, so I tried different solutions, on both the back-end and the front-end:
1) I uploaded a tar file (an already-compressed one).
2) I tried chunked upload (sequential): if a response is successful, the next API call is triggered, and on the back-end side the content is appended to the same file.
3) I tried chunked upload in parallel: at a single time I make 50 requests to upload the chunk content (I know a browser only handles 6 requests at a time). On the back-end side we store all the chunk files separately and, after receiving the final request, append all those chunks into a single file.
But what I observed is that there is not much difference between any of these cases.
The following is my service file:
export class largeGeneUpload {
    chromosomeFile: any;
    options: any;
    chunkSize = 1200000;
    activeConnections = 0;
    threadsQuantity = 50;
    totalChunkCount = 0;
    chunksPosition = 0;
    failedChunks = [];

    sendNext() {
        if (this.activeConnections >= this.threadsQuantity) {
            return;
        }
        if (this.chunksPosition === this.totalChunkCount) {
            console.log('all chunks are done');
            return;
        }
        const i = this.chunksPosition;
        const url = 'gene/human';
        const chunkIndex = i;
        const start = chunkIndex * this.chunkSize;
        const end = Math.min(start + this.chunkSize, this.chromosomeFile.size);
        const currentchunkSize = this.chunkSize * i;
        const chunkData = this.chromosomeFile.webkitSlice
            ? this.chromosomeFile.webkitSlice(start, end)
            : this.chromosomeFile.slice(start, end);

        const fd = new FormData();
        const binar = new File([chunkData], this.chromosomeFile.upload.filename);
        console.log(binar);
        fd.append('file', binar);
        fd.append('dzuuid', this.chromosomeFile.upload.uuid);
        fd.append('dzchunkindex', chunkIndex.toString());
        fd.append('dztotalfilesize', this.chromosomeFile.upload.total);
        fd.append('dzchunksize', this.chunkSize.toString());
        fd.append('dztotalchunkcount', this.chromosomeFile.upload.totalChunkCount);
        fd.append('isCancel', 'false');
        fd.append('dzchunkbyteoffset', currentchunkSize.toString());

        this.chunksPosition += 1;
        this.activeConnections += 1;

        this.apiDataService.uploadChunk(url, fd)
            .then(() => {
                this.activeConnections -= 1;
                this.sendNext();
            })
            .catch((error) => {
                this.activeConnections -= 1;
                console.log('error here');
                // chunksQueue.push(chunkId);
            });

        this.sendNext();
    }

    uploadChunk(resrc: string, item) {
        return new Promise((resolve, reject) => {
            this._http.post(this.baseApiUrl + resrc, item, {
                headers: this.headers,
                withCredentials: true
            }).subscribe(r => {
                console.log(r);
                resolve();
            }, err => {
                console.log('err', err);
                reject();
            });
        });
    }
}
But the thing is, if I upload the same file to Google Drive, it does not take much time.
Consider a 700 MB file: uploading it to Google Drive took 3 minutes, but uploading the same 700 MB file with my Angular code to our back-end server took 7 minutes.
How do I improve the performance of the file upload?
Forgive me, it may seem a silly answer, but this depends on your hosting infrastructure.
A lot of variables can cause this, but from your story it has nothing to do with your front-end code. Splitting the upload into chunks is not going to help, because browsers have their own optimized algorithms for uploading files. The most likely culprit is your back-end server or the connection from your client to the server.
You say that Google Drive is fast, but you should also know that Google has a very widespread global infrastructure with top-of-the-line cloud servers. If you are using, for example, a 2-euro-per-month fixed-price hosting provider, you cannot expect the same processing and network power as Google.
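To illustrate the point about chunking: a single multipart request lets the browser stream the file with its own buffering, which is usually at least as fast as hand-rolled chunks. A minimal sketch (the uploadWhole method name is hypothetical; it assumes the same Angular HttpClient service as in the question):

uploadWhole(resrc: string, file: File) {
    const fd = new FormData();
    fd.append('file', file); // the browser streams the body; no manual slicing needed
    return this._http.post(this.baseApiUrl + resrc, fd, {
        headers: this.headers,
        withCredentials: true
    });
}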