I need to download a large (>100 MB) data file via XMLHttpRequest. The data comes from a third party and I would like to display the content gradually as it is downloaded.
So I thought the following would work:
var req = new XMLHttpRequest();
req.open( "GET", mirror.url, true );
req.responseType = "arraybuffer";
req.onload = function( oEvent ) {
    console.log( "DONE" );
};

var current_offset = 0;
req.addEventListener( "progress", function( event ) {
    var percentComplete = 0;
    if ( event.lengthComputable ) {
        percentComplete = Math.round( event.loaded * 100 / event.total );
    }
    // Fails here: req.response is null until "load" fires
    var data = req.response;
    var dataView = new DataView( data );
    while ( current_offset < dataView.byteLength ) {
        // do work
        ++current_offset;
    }
    console.log( "OFFSET " + current_offset + " [" + percentComplete + "%]" );
}, false );

try {
    req.send( null );
} catch ( er ) {
    console.log( er );
}
Sadly, according to the spec, .response is not available until the transfer has finished.
Is there any way to access the already-downloaded data without resorting to horrible workarounds like Flash?
EDIT:
Found at least a working non-standard solution for Firefox:
responseType = "moz-chunked-arraybuffer";
See also: WebKit equivalent to Firefox's "moz-chunked-arraybuffer" xhr responseType
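For completeness, a minimal sketch of how this chunked mode is consumed (reusing mirror.url from the snippet above); in this mode xhr.response holds only the bytes received since the previous progress event, so each chunk can be processed as it arrives:

var xhr = new XMLHttpRequest();
xhr.open( "GET", mirror.url, true );
xhr.responseType = "moz-chunked-arraybuffer"; // non-standard, Firefox only
xhr.onprogress = function() {
    var chunk = new DataView( xhr.response ); // just the newly received bytes
    for ( var i = 0; i < chunk.byteLength; ++i ) {
        // do work on chunk.getUint8( i )
    }
};
xhr.onload = function() {
    console.log( "DONE" );
};
xhr.send( null );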
One solution would be to use HTTP range requests to download the file in parts. For more information see the blog post HTTP Status: 206 Partial Content and Range Requests.
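A sketch of that approach, assuming the server honours the Range header and answers with 206 Partial Content (the chunk size is arbitrary):

var CHUNK = 1024 * 1024; // 1 MiB per request, chosen arbitrarily

function fetchChunk(url, offset, onChunk) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, true);
    xhr.responseType = "arraybuffer";
    xhr.setRequestHeader("Range", "bytes=" + offset + "-" + (offset + CHUNK - 1));
    xhr.onload = function() {
        if (xhr.status === 206) {                     // Partial Content
            onChunk(xhr.response);                    // process this slice right away
            fetchChunk(url, offset + CHUNK, onChunk); // then request the next one
        }
        // 200 would mean the server ignored Range; 416 means we ran past the end
    };
    xhr.send(null);
}

fetchChunk(mirror.url, 0, function(buffer) {
    console.log("received " + buffer.byteLength + " bytes");
});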
If you have control over the server you could also split up the file on the server side and download it in chunks. This way you would be able to process the response as the individual chunks are received.
A third approach would be to use WebSockets to download the data.
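A sketch of that variant, assuming a hypothetical endpoint that streams the file as binary messages:

var ws = new WebSocket("wss://example.com/data"); // hypothetical endpoint
ws.binaryType = "arraybuffer";                    // receive chunks as ArrayBuffers
ws.onmessage = function(event) {
    var view = new DataView(event.data);
    // process each chunk as soon as it arrives
};
ws.onclose = function() {
    console.log("transfer finished");
};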
If you have no control over the server you are downloading from and none of the other options work, you will probably need to implement a proxy service that acts as an intermediary and lets you download the file in parts.
Have you looked at using a library for this?
I've never used it to load a 100MB file, but PreloadJS is pretty nice for loading all kinds of assets.
Related
I have a JS library that is responsible for downloading JPEG images for the client. All of this is done asynchronously. In some cases the number of images is really large, around 5,000. In that case the Chrome browser fails the AJAX requests with an "ERR_INSUFFICIENT_RESOURCES" error.
Each request must be made individually; there is no option to pack the images on the server side.
What are my options here? How can I work around this problem? The download works fine in Firefox...
Attached is the code of the actual download:
function loadFileAndDecrypt(fileId, key, type, length, callback, obj) {
  var step = 100 / length;
  eventBus.$emit('updateProgressText', "downloadingFiles");

  var req = new dh.crypto.HttpRequest();
  req.setAesKey(key);

  let dataUrl;
  if (type == "study") {
    dataUrl = "/v1/images/";
  } else {
    dataUrl = "/v1/dicoms/";
  }
  var url = axios.defaults.baseURL + dataUrl + fileId;

  req.open("GET", url, true);
  req.setRequestHeader("Authorization", authHeader().Authorization + "");
  req.setRequestHeader("Accept", "application/octet-stream, application/json, text/plain, */*");
  req.responseType = "arraybuffer";

  req.onload = function() {
    console.log(downloadStep);
    downloadStep += step; // downloadStep and counter are defined outside this function
    eventBus.$emit('updatePb', Math.ceil(downloadStep));
    var data = req.response;
    obj.push(data);
    counter++;
    // last one
    if (counter == length) {
      callback(obj);
    }
  };
  req.send();
}
The error means your code is overloading your memory (most likely, or the quota of pending requests was exhausted). Rather than firing all 5,000 requests at once, have the frontend control the request flow and issue them in smaller batches. Regardless, downloading 5,000 individual images is a bad idea; you should pack them up for downloading. If you merely mean fetching the images for display, then loading them from the frontend through static or dynamic links is much more logical ;)
Create a class:
Which accepts the file-Id (image that needs to be downloaded) as an argument
Which can perform the HTTP API request
Which can store the result of the request
Create an array of objects of this class, one for every file-Id that needs to be downloaded.
Store the array in a RequestManager which can start and manage the downloads (a minimal sketch follows this list):
can batch the downloads, e.g. fire 5 requests from the array and wait for them to finish before starting the next batch
can stop the downloads on repeated failures
can adapt the batch size to the available bandwidth
can pause downloads on auth expiry and resume them on auth refresh
can retry previously failed downloads
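A minimal sketch of the batching part, assuming the /v1/images/ endpoint from the question (auth headers, decryption, retries and bandwidth adaptation are left out):

class ImageRequest {
    constructor(fileId) {
        this.fileId = fileId;   // image that needs to be downloaded
        this.result = null;     // stores the result of the request
    }
    download() { // performs the HTTP API request
        return fetch("/v1/images/" + this.fileId)
            .then(res => res.arrayBuffer())
            .then(buf => { this.result = buf; });
    }
}

class RequestManager {
    constructor(fileIds, batchSize = 5) {
        this.requests = fileIds.map(id => new ImageRequest(id));
        this.batchSize = batchSize;
    }
    async start() { // fires one batch, waits for it, then starts the next
        for (let i = 0; i < this.requests.length; i += this.batchSize) {
            const batch = this.requests.slice(i, i + this.batchSize);
            await Promise.all(batch.map(r => r.download()));
        }
        return this.requests.map(r => r.result);
    }
}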
I am allowing users to upload CSV files through the website. The file is read using the JavaScript File API and then sent to the server to be saved.
, upload: function (prefix, numberType, file, name) {
    this.attributes = { // Set the data to be sent along
      'upload': true,
      'prefix': prefix,
      'file': file,
      'name': name,
      'numberType': numberType
    };
    console.log('upload', this); // This will correctly show in the console
    return this.sync('create', this, { // This is when Chrome crashes
      xhr: function () {
        var xhr = $.ajaxSettings.xhr();
        xhr.upload.onprogress = function (evt) {
          document.querySelector('.uploadProgressBar').style.width = parseInt(evt.loaded / evt.total * 100) + '%';
          document.querySelector('#uploadNow').classList.add('percentageUpload');
          document.querySelector('#uploadNow').innerText = parseInt(evt.loaded / evt.total * 100) + '%';
        };
        return xhr;
      }
    });
  }
Inspecting the network tab, it looks like the request is never sent, so it breaks while the request is being created. It only breaks when the file is around 100 MB; smaller files upload fine. It also works fine in both Safari and Firefox, so this is a Chrome-specific issue. Is it a known problem that Chrome has trouble dealing with large files?
I'm thinking the only way to really get around this is to split the file into chunks and piece it back together on the server. That is certainly possible, but it would be worth finding out whether this is a limitation to note for the future.
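A sketch of that chunked fallback, assuming a hypothetical endpoint that reassembles the pieces by index (the chunk size is arbitrary; error handling is omitted):

function uploadInChunks(file, url) {
    var CHUNK = 5 * 1024 * 1024; // 5 MiB per request, chosen arbitrarily
    var total = Math.ceil(file.size / CHUNK);

    function sendChunk(index) {
        if (index >= total) return;
        // slice() never reads the whole file into memory
        var piece = file.slice(index * CHUNK, (index + 1) * CHUNK);
        var form = new FormData();
        form.append('chunk', piece);
        form.append('index', index);
        form.append('total', total);
        var xhr = new XMLHttpRequest();
        xhr.open('POST', url); // hypothetical endpoint that stitches chunks together
        xhr.onload = function() { sendChunk(index + 1); };
        xhr.send(form);
    }

    sendChunk(0);
}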
The browser crashes because it runs out of memory.
Instead of reading the file into memory yourself, pass the File object to XMLHttpRequest so that Chrome can stream the file contents as the upload body.
Use a FormData object for this:
// your file input
const file = document.getElementById('file').files[0];

// your form
const form = new FormData();
form.append('file', file);

const xhr = $.ajaxSettings.xhr();
xhr.upload.onprogress = function (evt) {
  document.querySelector('.uploadProgressBar').style.width = parseInt(evt.loaded / evt.total * 100) + '%';
  document.querySelector('#uploadNow').classList.add('percentageUpload');
  document.querySelector('#uploadNow').innerText = parseInt(evt.loaded / evt.total * 100) + '%';
};
xhr.open('POST', 'http://example.com/'); // URL where you want to upload
xhr.send(form);
<img src="/a.jpg" onerror="fetch('/a.jpg')
    .then(res => console.log(res.status === 499
        ? 'image will be available in a moment'
        : 'image not found'))">
Is it possible to do this without firing two HTTP requests (one by img.src and one by the fetch function)?
My use case: I want to run a polling loop (already implemented, omitted here for simplicity) that retries loading the image while it is still being prepared on the server (the loop will of course fire more HTTP requests, but that's OK), but if the image actually does not exist, it should just show "image not found".
The server can be implemented, for example, this way:
if an image exists and has a thumbnail ready, return an image response
if an image exists but thumbnail is not ready yet, return specific HTTP code (499)
Compatibility with modern browsers & IE 11 is enough for me.
Finally found the solution myself: load the image using XHR and display it using the Blob API.
This solution provides everything I wanted:
fires only a single HTTP request to get either the image or an error code,
does not need additional user permissions (unlike an implementation with a webRequest hook),
does not pollute the DOM with extra-long base64 URLs,
seems compatible with modern browsers and even IE10.
var myImage = document.querySelector('img');

var myRequest = new XMLHttpRequest();
myRequest.open('GET', 'http://placekitten.com/123/456', true);
myRequest.responseType = 'blob';
myRequest.onreadystatechange = () => {
  if (myRequest.readyState !== 4) {
    return;
  }
  if (myRequest.status === 200) {
    var blob = myRequest.response;
    var objectURL = URL.createObjectURL(blob);
    // this is the trick - generates a URL like
    // blob:http://localhost/adb50c88-9468-40d9-8b0b-1f6ec8bb5a32
    myImage.src = objectURL;
  } else if (myRequest.status === 499) {
    console.log('... waiting for thumbnail');
    retryAfter5s(); // implementation of the retry loop is not important here
  } else {
    console.log('Image not found');
  }
};
myRequest.send();
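The retry helper itself can stay trivial; one possible minimal version (the 5-second delay is arbitrary) simply re-runs the same request, since open() resets a finished XHR while keeping its responseType and handler:

function retryAfter5s() {
    setTimeout(() => {
        myRequest.open('GET', 'http://placekitten.com/123/456', true);
        myRequest.send(); // the onreadystatechange handler above runs again
    }, 5000);
}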
I'm making a JavaScript script that is going to essentially save an old game-development sandbox website before the owners scrap it (and lose all of the games). I've created a script that downloads each game via AJAX, and I would like to upload each one straight away, also using AJAX. How do I upload the downloaded file (stored in responseText, presumably) to a PHP page on another domain (one that has cross-origin headers enabled)?
I assume there must be a way of uploading the data from the first AJAX request without copying the whole responseText over into a second AJAX request (the one used to upload the file)? I've tried transferring the data that way, but as expected it causes huge lag (and can crash the browser), as the files can be quite large.
Is there a way for an AJAX request to upload individual packets as soon as they're received?
Thanks,
Dan.
You could use Firefox's moz-chunked-text and moz-chunked-arraybuffer response types. On the JavaScript side you can do something like this:
function downloadUpload() {
  var downloadUrl = "server.com/largeFile.ext";
  var uploadUrl = "receiver.net/upload.php";
  var dataOffset = 0;

  var xhrDownload = new XMLHttpRequest();
  xhrDownload.open("GET", downloadUrl, true);
  xhrDownload.responseType = "moz-chunked-text"; // <- only works in Firefox
  xhrDownload.onprogress = uploadData;
  xhrDownload.send();

  function uploadData() {
    // with moz-chunked-text, responseText holds only the newly arrived chunk
    var data = {
      file: downloadUrl.substring(downloadUrl.lastIndexOf('/') + 1),
      offset: dataOffset,
      chunk: xhrDownload.responseText
    };
    var xhrUpload = new XMLHttpRequest();
    xhrUpload.open("POST", uploadUrl, true);
    xhrUpload.setRequestHeader('Content-Type', 'application/json; charset=UTF-8');
    xhrUpload.send(JSON.stringify(data));
    dataOffset += xhrDownload.responseText.length;
  }
}
On the PHP side you need something like this:
$in = fopen("php://input", "r");
$postContent = stream_get_contents($in);
fclose($in);
$o = json_decode($postContent);
file_put_contents($o->file . '-' . $o->offset . '.txt', $o->chunk);
These snippets will just give you the basic idea, you'll need to optimize the code yourself.
I'm trying to write code that uploads files using XHR2 and web workers.
I thought I should use web workers so that if a file is big, the web page won't freeze.
This is not working, for two reasons: I have never used web workers before, and I want to POST the file and some variables to the server at the same time, with the same XHR. By variables I mean the name of the file and an int.
Here's what I've got:
Client side
// create worker
var worker = new Worker('fileupload.js');
worker.onmessage = function (e) {
  alert('worker says ' + e.data);
};

// handle worker errors
worker.onerror = werror;
function werror(e) {
  console.log('ERROR: Line ', e.lineno, ' in ', e.filename, ': ', e.message);
}

// send stuff to the worker
worker.postMessage({
  'files': files,        // img or video
  'name': nameofthepic,  // text
  'id': imageinsertid    // number
});
Inside the worker (fileupload.js file)
onmessage = function (e) {
  var name = e.data.name;
  var id = e.data.id;
  var file = e.data.files;

  // create a var to catch the answer of the server
  var datax;

  var xhr = new XMLHttpRequest();
  xhr.onload = function () {
    if (xhr.status == 200) { datax = xhr.response; }
    else { datax = 525; } // actually, whatever, just give a value
  };
  xhr.open('POST', 'upload.php');
  xhr.send(file, name, id);
  // I also tried xhr.send('file=file&name=name&id=id'); and still nothing
  // I also tried just the text/int xhr.send('name=name&id=id'); and still nothing
};
I am confused. I cannot send anything to the server. I get no feedback from the worker. I don't even know whether the data is ever sent to fileupload.js. The server side does not INSERT.
Is it possible to send files and text at the same time? What am I missing?
I need to pass text and an int along with the file, so the server side will not only save the file but also, if the upload succeeds, INSERT the int and the text into the database. This was easy with just FormData and XHR, but with web workers in the middle I can't get it right.
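For what it's worth, a minimal sketch of moving that FormData approach into the worker, assuming the browser exposes FormData inside workers (support there arrived later than on the main thread) and that upload.php reads file, name and id as ordinary POST fields:

// inside fileupload.js
onmessage = function (e) {
  var form = new FormData();
  form.append('file', e.data.files);  // the File object posted from the page
  form.append('name', e.data.name);   // text field
  form.append('id', e.data.id);       // int, sent as a string field

  var xhr = new XMLHttpRequest();
  xhr.onload = function () {
    // report the server's answer back to the page
    postMessage(xhr.status == 200 ? xhr.response : 525);
  };
  xhr.open('POST', 'upload.php');
  xhr.send(form); // multipart body carrying the file and both fields
};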
Also, can I use Transferable Objects to speed things up? Are Transferable Objects supported in all major browsers?
Thanks in advance