How to make upload faster in angular 7? - javascript

I am trying to speed up file upload, and I have tried several approaches on both the back-end and the front-end:
1) I uploaded a tar file (already compressed).
2) I tried chunked upload (sequentially): when a chunk's response comes back as success, the next API call is triggered, and on the back-end the content is appended to the same file.
3) I tried chunked upload in parallel: I fire 50 requests at a time to upload the chunk contents (I know a browser only handles 6 requests at a time). On the back-end we store all the chunk files separately, and after receiving the final request we append all the chunks into a single file.
But I am not seeing much difference in any of these cases.
Here is my service file:
export class largeGeneUpload {
  chromosomeFile: any;
  options: any;
  chunkSize = 1200000;
  activeConnections = 0;
  threadsQuantity = 50;
  totalChunkCount = 0;
  chunksPosition = 0;
  failedChunks = [];

  sendNext() {
    // stop spawning once the pool is full
    if (this.activeConnections >= this.threadsQuantity) {
      return;
    }
    if (this.chunksPosition === this.totalChunkCount) {
      console.log('all chunks are done');
      return;
    }
    const i = this.chunksPosition;
    const url = 'gene/human';
    const chunkIndex = i;
    const start = chunkIndex * this.chunkSize;
    const end = Math.min(start + this.chunkSize, this.chromosomeFile.size);
    // byte offset of this chunk within the whole file
    const currentchunkSize = this.chunkSize * i;
    const chunkData = this.chromosomeFile.webkitSlice
      ? this.chromosomeFile.webkitSlice(start, end)
      : this.chromosomeFile.slice(start, end);
    const fd = new FormData();
    const binar = new File([chunkData], this.chromosomeFile.upload.filename);
    console.log(binar);
    fd.append('file', binar);
    fd.append('dzuuid', this.chromosomeFile.upload.uuid);
    fd.append('dzchunkindex', chunkIndex.toString());
    fd.append('dztotalfilesize', this.chromosomeFile.upload.total);
    fd.append('dzchunksize', this.chunkSize.toString());
    fd.append('dztotalchunkcount', this.chromosomeFile.upload.totalChunkCount);
    fd.append('isCancel', 'false');
    fd.append('dzchunkbyteoffset', currentchunkSize.toString());
    this.chunksPosition += 1;
    this.activeConnections += 1;
    this.apiDataService.uploadChunk(url, fd)
      .then(() => {
        this.activeConnections -= 1;
        this.sendNext();
      })
      .catch((error) => {
        this.activeConnections -= 1;
        console.log('error here');
        // chunksQueue.push(chunkId);
      });
    // keep filling the pool synchronously
    this.sendNext();
  }
}

// in the apiDataService
uploadChunk(resrc: string, item) {
  return new Promise((resolve, reject) => {
    this._http.post(this.baseApiUrl + resrc, item, {
      headers: this.headers,
      withCredentials: true
    }).subscribe(r => {
      console.log(r);
      resolve(r);
    }, err => {
      console.log('err', err);
      reject(err);
    });
  });
}
But here is the thing: if I upload the same file to Google Drive, it does not take nearly as long.
For example, with a 700 MB file, the upload to Google Drive took 3 minutes, but uploading the same 700 MB file with my Angular code to our back-end server took 7 minutes.
How do I improve the performance of the file upload?

Forgive me, this may seem like a silly answer, but it depends on your hosting infrastructure.

A lot of variables can cause this, but from your description it has nothing to do with your front-end code. Splitting the upload into chunks is not going to help, because browsers have their own optimized algorithms for uploading files. The most likely culprit is your back-end server or the connection from your client to the server.
You say that Google Drive is fast, but you should also know that Google has a very widespread global infrastructure with top-of-the-line cloud servers. If you are using, for example, a 2-euro-per-month fixed-place hosting provider, you cannot expect the same processing and network power as Google.
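To verify where the time goes, a rough sketch like the following (my code, not tested against your setup; '/upload' is a placeholder endpoint and http is an injected HttpClient) measures raw throughput for a single large POST using Angular's upload progress events. If the single-request number matches your chunked numbers, the bottleneck really is the network or the server:

import { HttpClient, HttpEventType } from '@angular/common/http';

// logs cumulative MB sent and the effective MB/s of one big POST
function measureUpload(http: HttpClient, file: File) {
  const started = Date.now();
  http.post('/upload', file, { reportProgress: true, observe: 'events' })
    .subscribe(event => {
      if (event.type === HttpEventType.UploadProgress && event.total) {
        const seconds = (Date.now() - started) / 1000;
        console.log(`${(event.loaded / 1e6).toFixed(1)} MB sent, ` +
          `${(event.loaded / 1e6 / seconds).toFixed(2)} MB/s`);
      }
    });
}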

Related

How to write BLE write characteristic over 512B

I have a client attempting to send images to a server over BLE.
Client Code
// Boilerplate to set up the connection and whatnot
sendFile.onclick = async () => {
  var fileList = document.getElementById("myFile").files;
  var fileReader = new FileReader();
  if (fileReader && fileList && fileList.length) {
    fileReader.readAsArrayBuffer(fileList[0]);
    fileReader.onload = function () {
      var imageData = fileReader.result;
      // Server doesn't get data if I don't do this chunking
      imageData = imageData.slice(0, 512);
      const base64String = _arrayBufferToBase64(imageData);
      document.getElementById("ItemPreview").src = "data:image/jpeg;base64," + base64String;
      sendCharacteristic.writeValue(imageData);
    };
  }
};
Server Code
MyCharacteristic.prototype.onWriteRequest = function (data, offset, withoutResponse, callback) {
  // It seems this will not print out if the client sends over 512B.
  console.log(this._value);
};
My goal is to send small images (just ~6kB). These are still small enough that I'd prefer to use BLE over a BT serial connection. Is the only way to do this to perform some chunking and then stream the chunks over?
Current 'Chunking' Code
const MAX_LENGTH = 512;
for (let i = 0; i < bytes.byteLength; i += MAX_LENGTH) {
  const end = (i + MAX_LENGTH > bytes.byteLength) ? bytes.byteLength : i + MAX_LENGTH;
  const chunk = bytes.slice(i, end);
  sendCharacteristic.writeValue(chunk);
  await sleep(1000);
}
The above code works, but it sleeps between sends. I'd rather not do this, because there's no guarantee a previous packet has finished sending, and I could be sleeping longer than needed.
I'm also perplexed about how the server would know that the client has finished sending all the bytes so it can assemble them. Is there some kind of pattern for achieving this?
BLE characteristic values can only be 512 bytes, so yes, the common way to send larger data is to split it into multiple chunks. Use "Write Without Response" for best performance (MTU-3 must be at least as big as your chunk).
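As a rough illustration (my sketch, not the answerer's code): the Web Bluetooth API exposes writeValueWithoutResponse(), and awaiting each write replaces the fixed sleep(). The zero-length final write as an "all done" marker is only a convention here; your server code would have to agree on it.

const MAX_LENGTH = 512;

async function sendImage(bytes) {
  for (let i = 0; i < bytes.byteLength; i += MAX_LENGTH) {
    // slice() clamps the end index, so the last chunk may be shorter
    const chunk = bytes.slice(i, i + MAX_LENGTH);
    // resolves once the chunk is queued, so no sleep() guesswork
    await sendCharacteristic.writeValueWithoutResponse(chunk);
  }
  // zero-length write as the agreed-upon "transfer complete" marker
  await sendCharacteristic.writeValueWithoutResponse(new ArrayBuffer(0));
}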

Why does Busboy yield inconsistent results when parsing FLAC files?

I have an Express server that receives FormData with an attached FLAC audio file. The code works as expected for several files of varying size (10-70MB), but some of them get stuck in the 'file' event, and I cannot figure out why. It is even stranger when a file that previously did not fire the file.on('close', () => {}) event (as described in the Busboy documentation) suddenly does, with the file being successfully uploaded.
To me this seems completely random: I have tried this with a dozen files of varying size and content type (audio/flac & audio/x-flac), and the results have been inconsistent. Some files will not work at all, even if I attempt to parse them many times over, whereas certain files can be parsed and uploaded given enough attempts.
Is there some error I am failing to handle in the 'file' event? I did try listening to the file.on('error', () => {}) event, but there were no errors to be found. Other answers suggest that the file stream must be consumed for the 'close' event to fire, but I would think that file.pipe(fs.createWriteStream(fileObject.filePath)); does that, correct?
Let me know if I forgot to include some important information in my question. This has been bothering me for about a week now, so I am happy to provide anything of relevance to help my chances of overcoming this hurdle.
app.post('/upload', (request, response) => {
  response.set('Access-Control-Allow-Origin', '*');
  const bb = busboy({ headers: request.headers });
  const fields = {};
  const fileObject = {};
  bb.on('file', (_name, file, info) => {
    const { filename, mimeType } = info;
    fileObject['mimeType'] = mimeType;
    fileObject['filePath'] = path.join(os.tmpdir(), filename);
    file.pipe(fs.createWriteStream(fileObject.filePath));
    file.on('close', () => {
      console.log('Finished parsing of file');
    });
  });
  bb.on('field', (name, value) => {
    fields[name] = value;
  });
  bb.on('close', () => {
    bucket.upload(
      fileObject.filePath,
      {
        uploadType: 'resumable',
        metadata: {
          metadata: {
            contentType: fileObject.mimeType,
            firebaseStorageDownloadToken: fields.id
          }
        }
      },
      (error, uploadedFile) => {
        if (error) {
          console.log(error);
        } else {
          db.collection('tracks')
            .doc(fields.id)
            .set({
              identifier: fields.id,
              artist: fields.artist,
              title: fields.title,
              imageUrl: fields.imageUrl,
              fileUrl: `https://firebasestorage.googleapis.com/v0/b/${bucket.name}/o/${uploadedFile.name}?alt=media&token=${fields.id}`
            });
          response.send(`File uploaded: ${fields.id}`);
        }
      }
    );
  });
  request.pipe(bb);
});
UPDATE: 1
I decided to measure the number of bytes transferred on each upload with file.on('data', (data) => {}), just to see if the issue always occurred at the same point, and it turns out that this too is completely random.
let bytes = 0;
file.on('data', (data) => {
  bytes += data.length;
  console.log(`Loaded ${(bytes / 1000000).toFixed(2)}MB`);
});
First Test Case: Fenomenon - Sleepy Meadows Of Buxton
Source: https://fenomenon.bandcamp.com/track/sleepy-meadows-of-buxton
Size: 30.3MB
Codec: FLAC
MIME: audio/flac
Results from three attempts:
Loaded 18.74MB, then became stuck
Loaded 5.05MB, then became stuck
Loaded 21.23MB, then became stuck
Second Test Case: Almunia - New Moon
Source: https://almunia.bandcamp.com/track/new-moon
Size: 38.7MB
Codec: FLAC
MIME: audio/flac
Results from three attempts:
Loaded 12.78MB, then became stuck
Loaded 38.65MB, was successfully uploaded!
Loaded 38.65MB, was successfully uploaded!
As you can see, the behavior is unpredictable, to say the least. Also, those two successful uploads played back seamlessly from Firebase Storage, so it really did work as intended. What I cannot understand is why it does not always work, or at least most of the time, excluding any network-related failures.
UPDATE: 2
I am hopelessly stuck trying to make sense of the issue, so I have now created a scenario that closely resembles my actual project and uploaded the code to GitHub. It is pretty minimal, but I did add some additional libraries to make the front-end pleasant to work with.
There is not much to it, other than an Express server for the back-end and a simple Vue application for the front-end. Within the files folder there are two FLAC files: one of them is only 4.42MB, to prove that the code does sometimes work; the other is much larger, at 38.1MB, to reliably illustrate the problem. Feel free to try any other files.
Note that the front-end must be modified to allow files other than FLAC files. I made the choice to only accept FLAC files, as this is what I am working with in my actual project.
You'll need to write the file directly when Busboy emits the 'file' event.
It seems there is a race condition when you rely on Busboy's piping that prevents the file load from being completed. If you write it in the 'file' event handler, it works fine.
app.post('/upload', (request, response) => {
  response.set('Access-Control-Allow-Origin', '*');
  const bb = busboy({
    headers: request.headers
  });
  const fileObject = {};
  let bytes = 0;
  bb.on('file', (name, file, info) => {
    const { filename, mimeType } = info;
    fileObject['mimeType'] = mimeType;
    fileObject['filePath'] = path.join(os.tmpdir(), filename);
    const saveTo = path.join(os.tmpdir(), filename);
    const writeStream = fs.createWriteStream(saveTo);
    file.on('data', (data) => {
      writeStream.write(data);
      console.log(`Received: ${((bytes += data.length) / 1000000).toFixed(2)}MB`);
    });
    file.on('end', () => {
      console.log('closing writeStream');
      writeStream.close();
    });
  });
  bb.on('close', () => {
    console.log(`Actual size is ${(fs.statSync(fileObject.filePath).size / 1000000).toFixed(2)}MB`);
    console.log('This is where the file would be uploaded to some cloud storage server...');
    response.send('File was uploaded');
  });
  bb.on('error', (error) => {
    console.log(error);
  });
  request.pipe(bb);
});
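A related note (my sketch, building on the answer above, not part of it): if you do keep file.pipe(), you can wait for the write stream's 'finish' event before touching the file on disk, which rules out racing a partially flushed file.

bb.on('file', (_name, file, info) => {
  const filePath = path.join(os.tmpdir(), info.filename);
  const writeStream = fs.createWriteStream(filePath);
  file.pipe(writeStream);
  // 'finish' fires only after all buffered data has been flushed to disk
  writeStream.on('finish', () => {
    console.log(`Fully flushed to ${filePath}`);
    // safe to hand the file to cloud storage from here
  });
});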

429 Too Many Requests - Angular 7 - on multiple file upload

I have this problem when I try to upload more than a few hundred files at the same time.
The API interface accepts only one file, so I have to call the service once per file. Right now I have this:
onFilePaymentSelect(event): void {
  if (event.target.files.length > 0) {
    this.paymentFiles = event.target.files[0];
  }
  let i = 0;
  let save = 0;
  const numFiles = event.target.files.length;
  let procesed = 0;
  if (event.target.files.length > 0) {
    while (event.target.files[i]) {
      // capture per-file values so the async callback below sees the right ones
      const filename = event.target.files[i].name;
      const formData = new FormData();
      formData.append('file', event.target.files[i]);
      this.payrollsService.sendFilesPaymentName(formData).subscribe(
        (response) => {
          let added = null;
          procesed++;
          const message = response.status_message;
          if (message === 'File saved') {
            added = true;
            save++;
          } else {
            added = false;
          }
          this.payList.push({ filename, message, added });
        });
      i++;
    }
  }
}
So I have a while loop sending each file to the API, but I get "429 Too Many Requests" with a high number of files. Is there any way I can improve this?
Working with observables will make that task easier to reason about (rather than using imperative programming).
A browser usually allows you to make 6 requests in parallel and will queue the rest. But we don't want the browser to manage that queue for us (and if we were running in a Node environment, for example, we wouldn't have it at all).
What do we want: to upload a lot of files, queued and uploaded as efficiently as possible by running 5 requests in parallel at all times (so we keep 1 free for other requests in our app).
In order to demo that, let's build some mocks first:
function randomInteger(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

const mockPayrollsService = {
  sendFilesPaymentName: (file: File) => {
    return of(file).pipe(
      // simulate a 500ms to 1.5s network latency from the server
      delay(randomInteger(500, 1500))
    );
  }
};

// array containing 50 files which are mocked
const files: File[] = Array.from({ length: 50 })
  .fill(null)
  .map(() => new File([], ""));
I think the code above is self-explanatory. We are generating mocks so we can see how the core of the code will actually run without needing access to your real application.
Now, the main part:
const NUMBER_OF_PARALLEL_CALLS = 5;

const onFilePaymentSelect = (files: File[]) => {
  const uploadQueue$ = from(files).pipe(
    map(file => mockPayrollsService.sendFilesPaymentName(file)),
    mergeAll(NUMBER_OF_PARALLEL_CALLS)
  );
  uploadQueue$
    .pipe(
      scan(nbUploadedFiles => nbUploadedFiles + 1, 0),
      tap(nbUploadedFiles =>
        console.log(`${nbUploadedFiles}/${files.length} file(s) uploaded`)
      ),
      tap({ complete: () => console.log("All files have been uploaded") })
    )
    .subscribe();
};

onFilePaymentSelect(files);
We use from to send the files one by one into an observable.
Using map, we prepare our request for 1 file (but as we don't subscribe to it and the observable is cold, the request is just prepared, not triggered!).
We now use mergeAll to run a pool of calls. Because mergeAll takes the concurrency as an argument, we can say "please run a maximum of 5 calls at the same time".
We then use scan for display purposes only (to count the number of files that have been uploaded successfully).
Here's a live demo: https://stackblitz.com/edit/rxjs-zuwy33?file=index.ts
Open up the console to see that we're not uploading them all at once.
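If the server still answers 429 under load, the queue can be made resilient per file. Here is a sketch of mine (not part of the original answer) that extends the snippet above; the retry config object requires RxJS 7.4+:

const uploadQueue$ = from(files).pipe(
  map(file =>
    mockPayrollsService.sendFilesPaymentName(file).pipe(
      // wait 1s between attempts, give up after 3 retries (e.g. on a 429)
      retry({ count: 3, delay: 1000 }),
      // swallow this file's error so the rest of the queue keeps going
      catchError(() => of(null))
    )
  ),
  mergeAll(NUMBER_OF_PARALLEL_CALLS)
);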

File uploads through socket.io (JavaScript & FileReader)

I am creating a chat app (in React Native), but for now I have done some tests in vanilla JavaScript. The server is a Node.js server.
It works for sending text messages, but now I have some questions about sending photos/videos/audio files. I've done a lot of research online on the best method to do this.
I came up with the idea of using the FileReader API to split the file into chunks and send it chunk by chunk via the socket.emit() function.
This is my code so far (simplified):
Please note that I will be creating a React Native app, but for now (for testing) I've just created an HTML file with an upload form.
// index.html
// the page where my upload form is
var reader = {};
var file = {};
var sliceSize = 1000 * 1024;
var socket = io('http://localhost:8080');

const startUpload = e => {
  e.preventDefault();
  reader = new FileReader();
  file = $('#file')[0].files[0];
  uploadFile(0);
};
$('#start-upload').on('click', startUpload);

const uploadFile = start => {
  var slice = start + sliceSize;
  var blob = file.slice(start, slice);
  // FileReader has no .on(); assign the onloadend handler instead
  reader.onloadend = e => {
    // emit before checking, so the final (partial) chunk is also sent
    socket.emit('message', JSON.stringify({
      fileName: file.name,
      fileType: file.type,
      fileChunk: e.target.result
    }));
    if (slice < file.size) {
      // continue with the next chunk
      uploadFile(slice);
    } else {
      console.log('Upload completed!');
    }
  };
  reader.readAsDataURL(blob);
};
// app.js
// my NodeJS server file
var file;
var files = {};

io.on('connection', socket => {
  console.log('User connected!');

  // when a message is received
  socket.on('message', data => {
    file = JSON.parse(data);
    if (!files[file.fileName]) {
      // this is the first chunk received: create a new string
      files[file.fileName] = '';
    }
    // append the chunk (a base64 data URL string)
    files[file.fileName] = files[file.fileName] + file.fileChunk;
  });

  // on disconnect
  socket.on('disconnect', () => {
    console.log('User disconnected!');
  });
});
I did not include any checks for file type (I'm not at that point yet); I first want to make sure this is the right approach at all.
Stuff I still need to do:
Send a message (like socket.emit('uploaddone', ...)) from the client to the server to notify the server that the upload is done (so the server can emit the complete file to another user).
My questions are:
Is it okay to send chunks of binary data (base64) over a socket, or would it take up too much bandwidth?
Will I lose some quality (photos/videos/audio files) when splitting them up into chunks?
If there is a better way to do this, please let me know. I'm not asking for working code examples, just some guidance in the right direction.
You can send raw bytes over a WebSocket; base64 has a 33% size overhead.
You also won't have to JSON.stringify the whole (possibly large) body and parse it on the receiving side.
Will I lose some quality
No, the underlying protocol (TCP) delivers data in order and without corruption.
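For illustration (my sketch, not the answerer's code): socket.io can carry ArrayBuffer/Buffer payloads natively, so the chunk itself can travel as raw binary while only the small metadata stays structured.

// client: send the chunk as a raw ArrayBuffer instead of a base64 data URL
const chunkBuffer = await blob.arrayBuffer();
socket.emit('chunk', { fileName: file.name, data: chunkBuffer });

// server: binary payloads arrive as Node Buffers; collect and join them
const chunks = {};
socket.on('chunk', ({ fileName, data }) => {
  (chunks[fileName] = chunks[fileName] || []).push(data);
});
socket.on('uploaddone', fileName => {
  const fileBuffer = Buffer.concat(chunks[fileName]);
  // write to disk, forward to another user, etc.
});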
I realize this answer is a couple of months late, but just for future reference you should look into using the acknowledgment option with socket.io here
// with acknowledgement
let message = JSON.stringify({
  fileName: file.name,
  fileType: file.type,
  fileChunk: e.target.result
});
socket.emit("message", message, (ack) => {
  // send next chunk...
});
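Building on that (a hedged sketch of mine, not the answerer's code): wrapping the acknowledgement callback in a Promise lets async/await drive the loop, so each chunk is sent only after the server confirms the previous one. The server must invoke the callback, e.g. socket.on('message', (msg, ack) => { ...; ack(); }).

// resolves when the server invokes the acknowledgement callback
const emitWithAck = (event, payload) =>
  new Promise(resolve => socket.emit(event, payload, resolve));

async function uploadInChunks(file, sliceSize = 1000 * 1024) {
  for (let start = 0; start < file.size; start += sliceSize) {
    // Blob.arrayBuffer() gives the raw bytes of this slice
    const chunk = await file.slice(start, start + sliceSize).arrayBuffer();
    await emitWithAck('message', { fileName: file.name, fileChunk: chunk });
  }
  await emitWithAck('uploaddone', file.name);
}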

Splitting large file load into chunks, stitching to AudioBuffer?

In my app, I have an hour-long audio file that's entirely sound effects. Unfortunately, I do need them all; they're species-specific sounds, so I can't cut any of them out. They were separate before, but I audiosprite'd them all into one large file.
The exported file is about 20MB compressed, but it's still a large download for users with a slow connection. I need this file to be in an AudioBuffer, since I'm seeking to sections of the audiosprite and using loopStart/loopEnd to loop only that section. I more or less need the whole thing downloaded before playback can start, because the requested species are randomly picked when the app starts: they could be looking for sounds at the start of the file or at the very end.
What I'm wondering is: if I were to split this file into fourths, could I load them in parallel and stitch them into the full AudioBuffer once loading finishes? I'm guessing I'd be merging multiple arrays, but only performing decodeAudioData() once? Requesting ~100 separate files (far too many) was what brought me to audiosprites in the first place, but I'm wondering if there's a way to leverage some amount of async loading to lower the load time. I thought about having four <audio> elements and using createMediaElementSource() to load them, but my understanding is that I can't (?) turn a MediaElementSource into an AudioBuffer.
Consider playing the file immediately in chunks instead of waiting for the entire file to download. You could do this with the Streams API, and either:
Queuing chunks with the Media Source Extensions (MSE) API and switching between buffers.
Playing back decoded PCM audio with the Web Audio API and AudioBuffer.
See examples for low-latency audio playback of file chunks as they are received.
I think in principle you can. Just download each chunk as an ArrayBuffer, concatenate all of the chunks together and send that to decodeAudioData.
But if you're on a slow link, I'm not sure how downloading in parallel will help.
Edit: this code is functional, but on occasion produces really nasty audio glitches, so I don't recommend using it without further testing. I'm leaving it here in case it helps someone else figure out working with Uint8Arrays.
So here's a basic version of it, basically what Raymond described. I haven't tested this with a split version of the large file yet, so I don't know if it improves the load speed at all, but it works. The JS is below, but if you want to test it yourself, here's the pen.
// mp3 link is from: https://codepen.io/SitePoint/pen/JRaLVR
(function () {
  'use strict';

  const context = new AudioContext();
  let bufferList = [];

  // change the urlList for your needs
  const URL = 'https://s3-us-west-2.amazonaws.com/s.cdpn.io/123941/Yodel_Sound_Effect.mp3';
  const urlList = [URL, URL, URL, URL, URL, URL];

  const loadButton = document.querySelector('.loadFile');
  const playButton = document.querySelector('.playFile');
  loadButton.onclick = () => loadAllFiles(urlList, loadProgress);

  function play(audioBuffer) {
    const source = context.createBufferSource();
    source.buffer = audioBuffer;
    source.connect(context.destination);
    source.start();
  }

  // concatenates all the buffers into one collected ArrayBuffer
  function concatBufferList(buflist, len) {
    let tmp = new Uint8Array(len);
    let pos = 0;
    for (let i = 0; i < buflist.length; i++) {
      tmp.set(new Uint8Array(buflist[i]), pos);
      pos += buflist[i].byteLength;
    }
    return tmp.buffer;
  }

  function loadAllFiles(list, onProgress) {
    let fileCount = 0;
    let fileSize = 0;
    for (let i = 0; i < list.length; i++) {
      loadFileXML(list[i], loadProgress, i).then(e => {
        bufferList[i] = e.buf;
        fileSize += e.size;
        fileCount++;
        // compare against list.length, not bufferList.length: assigning an
        // out-of-order index grows bufferList early and can trigger a
        // premature decode of an incomplete buffer
        if (fileCount == list.length) {
          let b = concatBufferList(bufferList, fileSize);
          context.decodeAudioData(b).then(audioBuffer => {
            playButton.disabled = false;
            playButton.onclick = () => play(audioBuffer);
          }).catch(error => console.log(error));
        }
      });
    }
  }

  // adapted from petervdn's audiobuffer-load on npm
  function loadFileXML(url, onProgress, index) {
    return new Promise((resolve, reject) => {
      const request = new XMLHttpRequest();
      request.open('GET', url, true);
      request.responseType = 'arraybuffer';
      if (onProgress) {
        request.onprogress = event => {
          onProgress(event.loaded / event.total);
        };
      }
      request.onload = () => {
        if (request.status === 200) {
          const fileSize = request.response.byteLength;
          resolve({
            buf: request.response,
            size: fileSize
          });
        } else {
          reject(`Error loading '${url}' (${request.status})`);
        }
      };
      request.onerror = error => {
        reject(error);
      };
      request.send();
    });
  }

  function loadProgress(e) {
    console.log("Progress: " + e);
  }
}());
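As for the original "split it in fourths" idea, the file doesn't have to be physically split. A sketch of mine (assuming the server honors Accept-Ranges: bytes and exposes Content-Length): HTTP Range requests can fetch byte ranges of the single file in parallel, and decodeAudioData still runs only once on the stitched result.

async function loadInParallel(url, parts = 4) {
  // find the total size first
  const head = await fetch(url, { method: 'HEAD' });
  const size = Number(head.headers.get('Content-Length'));
  const partSize = Math.ceil(size / parts);

  const buffers = await Promise.all(
    Array.from({ length: parts }, (_, i) => {
      const start = i * partSize;
      const end = Math.min(start + partSize, size) - 1; // Range ends are inclusive
      return fetch(url, { headers: { Range: `bytes=${start}-${end}` } })
        .then(r => r.arrayBuffer());
    })
  );

  // stitch the ranges back together and decode once, as above
  const stitched = new Uint8Array(size);
  let pos = 0;
  for (const b of buffers) {
    stitched.set(new Uint8Array(b), pos);
    pos += b.byteLength;
  }
  return new AudioContext().decodeAudioData(stitched.buffer);
}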
