google app engine PHP - 500 error on upload - javascript

I have a PHP application running and I'm uploading a PDF for processing. I have not overridden any PHP ini values, so I'm expecting the post to be able to handle 32MB of data and a timeout of 60 seconds.
When I upload a large document (7.7MB in this case), the app fails in 2 very different ways.
1. The back end times out and returns successfully, having apparently been passed duff data.
2. The back end does not time out (returning in under a minute) but has an internal server error.
The timeout seems to manifest as the back-end PHP page getting no data, i.e. whatever is doing the transport times out rather than my page timing out. I can detect this scenario from the duff data passed in and return a useful error message. I can't reproduce it on my local development machine.
The second issue is perplexing, as it happens almost immediately on my dev machine, and also on GAE when the page responds in under a minute. I can upload a 4.1MB document and all is good; the 7.7MB doc causes this every time. The headers look fine and the form data contains everything it needs, although I haven't tried to decode the form data.
Is there another PHP setting causing this? Is there a way to alter it? What could possibly be going on? Could this be a JavaScript issue, since I am using the JavaScript FileReader in an HTML5 drop zone? Here are the relevant portions of the handler code:
function loadSpecdoc(file) {
    var acceptedTypes = {
        'application/pdf' : true
    };
    if (acceptedTypes[file.type] === true) {
        var reader = new FileReader();
        reader.onload = function(event) {
            $.ajax({
                url : "my_handler.php",
                type : 'POST',
                dataType : "json",
                data : {
                    spec_id : $('#list_spec_items_spec_id').val(),
                    file_content : event.target.result
                },
                success : function(response) {
                    // Handle Success
                },
                error : function(XMLHttpRequest, textStatus, exception) {
                    // Handle Failure
                },
                async : true
            });
        };
        reader.readAsDataURL(file);
    } else {
        alert("Unsupported file");
        console.log(file);
    }
}
I have heard of the Blob store, but I'm on a bit of a deadline and can't find any documentation on how to make it work in JavaScript or how to get the file to my PHP application.
Any help much appreciated.
EDIT: I have just checked the Chrome network console, and the "Size/Content" column shows two sizes, 79B and 0B. I'm guessing these are the upload (79B) and the response (0B). If that's the case, this seems to be a JavaScript issue??
EDIT 2: Just found the log for my application and it says there is a post content limit of about 8MB... not quite the 32MB I was expecting:
06:26:03.476 PHP Warning: Unknown: POST Content-Length of 10612153 bytes exceeds the limit of 8388608 bytes in Unknown on line 0
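A plausible explanation for the mismatch between the 7.7MB file and the 10612153-byte POST: readAsDataURL base64-encodes the file, which inflates it by roughly a third before it is form-encoded into the POST body. A quick sanity check (the 7.7MB figure is only approximate):

```javascript
// Base64 encodes every 3 input bytes as 4 output characters (with padding),
// so a data: URL grows the file by roughly one third.
function base64Length(byteLength) {
    return Math.ceil(byteLength / 3) * 4;
}

// A 7.7 MB PDF therefore posts as roughly 10.8 million characters of form
// data, which is why it trips the 8388608-byte limit while a 4.1 MB file fits.
const fileBytes = 7.7 * 1024 * 1024;
console.log(base64Length(fileBytes)); // roughly 10.8 million, before form encoding
```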
EDIT 3: Just found this on limits:
https://cloud.google.com/appengine/docs/php/requests#PHP_Quotas_and_limits
I am expecting 32MB of POST data, assuming I can upload it quickly enough :) How do I fix this?
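If the per-request limit can't be raised, one workaround (a sketch, not GAE-specific advice) is to slice the file client-side with Blob.slice and post each piece separately, keeping every request under the limit; the server side would then have to reassemble the chunks. The endpoint name and the reassembly protocol below are assumptions:

```javascript
// Split a byte count into ranges no larger than maxChunk.
function chunkRanges(totalBytes, maxChunk) {
    const ranges = [];
    for (let start = 0; start < totalBytes; start += maxChunk) {
        ranges.push({ start, end: Math.min(start + maxChunk, totalBytes) });
    }
    return ranges;
}

// Post each slice of the file as its own request. "my_handler.php" and the
// chunk_index/chunk_data field names are placeholders; the receiving script
// must stitch the pieces back together in order.
function uploadInChunks(file, maxChunk) {
    const uploads = chunkRanges(file.size, maxChunk).map(function (range, index) {
        const blob = file.slice(range.start, range.end); // File inherits slice() from Blob
        const form = new FormData();
        form.append('chunk_index', index);
        form.append('chunk_data', blob);
        return $.ajax({ url: 'my_handler.php', type: 'POST', data: form,
                        processData: false, contentType: false });
    });
    return $.when.apply($, uploads); // resolves once every chunk has posted
}
```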

Related

Limit size of AJAX response

I'm building a decentralized application (I don't control the servers, only the client), and want to add some sanity checks and preventative measures to stop bad people from doing malicious things. This involves (among many, many other things), preventing DoS attempts on the client by the use of arbitrary payload data being sent from the servers.
The question is this: How can the client limit the maximum size of data received from a server over JQuery AJAX? If I'm expecting to fetch a few bytes of JSON, and am instead greeted by a 30MB video file when I make the AJAX request, how can I stop the request and throw an error after I've received the first 16 KB?
While I recognize that the nature of my undertaking is unique, any feedback is welcome.
As @Barmar pointed out in the comments, this was a simple case of checking the "onprogress" event of the download and terminating it when it exceeded my desired max size.
Here is the code for any interested parties:
var xhr = $.ajax({
    url: "your-url",
    success: () => {
        // ...
    },
    xhrFields: {
        onprogress: function(progress) {
            if (progress.loaded > config.MAX_HASH_DESCRIPTOR_SIZE) {
                // stop any unreasonably long malicious payload downloads
                xhr.abort()
            }
        }
    }
})
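For the same guard without jQuery, a hedged alternative is fetch with a streamed body, which lets the client cancel mid-download rather than after the fact. This sketch assumes a standard ReadableStream of Uint8Array chunks, which is what fetch response bodies provide:

```javascript
// Read a response body incrementally and abort once it exceeds maxBytes.
async function readWithLimit(stream, maxBytes) {
    const reader = stream.getReader();
    const chunks = [];
    let received = 0;
    for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        received += value.byteLength;
        if (received > maxBytes) {
            await reader.cancel(); // stop the download immediately
            throw new Error('response exceeded ' + maxBytes + ' bytes');
        }
        chunks.push(value);
    }
    return chunks;
}
```

Usage would look like `fetch(url).then(r => readWithLimit(r.body, 16 * 1024))`.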

How can I check the HTML document GET request status?

I am doing "window.open(file_url)" to download a file. If the file exists, the backend returns a Blob, which the browser downloads; if it doesn't, the backend returns a JSON error message with request status 500.
So is there some way to know that "status" for a page?
I know that with AJAX we get the status property, but for normal web pages is there a way to know the status? When the browser requests a page it is an HTTP GET, so it should have a status.
Here is a working example:
$.get(url, function(data, status, xhr) {
    alert(xhr.status);
});
You can check for an error like this (note the handler must be a function, otherwise the alert fires immediately):
var test = window.open(file_url);
test.onerror = function() {
    alert('The specified file was not found. Has it been renamed or removed?');
};
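Since window.open exposes no HTTP status, a more reliable pattern (a sketch; the HEAD probe and the alert text are assumptions, not part of any API here) is to ask the server about the file first and only open the window on success:

```javascript
// Decide from the HTTP status whether the download is worth opening.
function shouldOpen(status) {
    return status >= 200 && status < 300;
}

// Probe the URL with a lightweight HEAD request before opening the window,
// so the file body is not downloaded just to discover a 500.
function openIfAvailable(fileUrl) {
    return $.ajax({ url: fileUrl, type: 'HEAD' })
        .done(function (data, textStatus, xhr) {
            if (shouldOpen(xhr.status)) window.open(fileUrl);
        })
        .fail(function (xhr) {
            alert('Download failed with HTTP status ' + xhr.status);
        });
}
```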

How to request url with authorized token through API

I am trying to develop a web app based on the girder platform (a data management platform). During development I ran into a problem that has confused me for a long time. Let me briefly explain my work; if I have misunderstood anything, please point it out, as I am just getting started.
The thing is that,
On the front end I am using AMI (a JavaScript library) to handle image visualization, and the way it works is to give AMI the URL (served by the girder server) of the image to display as an attachment (e.g. http://girderDomain:port/api/v1/file/imageID/download?contentDisposition=attachment, a girder API endpoint).
When this URL requires no permission, everything works fine. When it needs permission (a token generated when an authorized user logs in), the plain URL does not work, so I tried to use ajax to make a request with the token in a request header, something like below:
$.ajax({
    type: 'GET',
    url: 'http://girderDomain:port/api/v1/file/imageWithPermissionID/download?contentDisposition=attachment',
    crossDomain: true,
    processData: false,
    beforeSend: function(xhr) {
        // xhr.setRequestHeader("Access-Control-Allow-Origin:", "*");
        // xhr.setRequestHeader("girderToken", token);
    },
    success: function(d, s, xhr) {}
});
I still get some errors I haven't solved yet, but ajax is the only way I can think of.
The whole process looks like the pseudocode below:
//***AMI***//
var t2 = ["imageID", ...]; // images that need no permission
files = t2.map(function(v) {
    return 'http://girderDomain:port/api/v1/file/' + v + '/download?contentDisposition=attachment';
});
AMI.display(files);
As you can see, displaying an image that needs no permission is simple and easy, while a permission-protected image fetched through an ajax request would look something like:
//***ajax to download the permission-protected file first***//
$.ajax({
    // url: 'http://girderDomain:port/api/v1/file/imageWithPermissionID/download?contentDisposition=attachment',
    // set requestHeader("girderToken", token);
    // success: function() {
    //     download this permission-protected file to a tempFolder on the requesting domain as A
    // }
});
//***AMI***//
var t2 = ["A"];
files = t2.map(function(v) {
    return 'http://requestDomain/tempFolder/A';
});
AMI.display(files);
That looks clumsy, so I am wondering whether anyone has a better way to request a file that needs permission: maybe save the token in a cookie or session and make the request with it, or any other method or framework.
Since I am just starting with this kind of client-server development, any help is truly appreciated.
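One pattern worth trying (a sketch; "Girder-Token" is the header name Girder's REST API documents, but verify it against your server version) is to fetch the protected file as a Blob with the token attached, then hand AMI a local object URL instead of staging a temp file:

```javascript
// Fetch a permission-protected file with the auth token in a request header,
// returning the bytes as a Blob.
function fetchProtectedFile(fileUrl, token) {
    return fetch(fileUrl, { headers: { 'Girder-Token': token } })
        .then(function (response) {
            if (!response.ok) throw new Error('HTTP ' + response.status);
            return response.blob();
        });
}

// Usage sketch: turn the protected image into a local URL AMI can load
// like any other address; no temp folder needed.
// fetchProtectedFile(base + id + '/download?contentDisposition=attachment', token)
//     .then(function (blob) { AMI.display([URL.createObjectURL(blob)]); });
```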

How to return a result per-image in dropzone.js

I'm using Dropzone to upload a batch of images, but I would like to inform the user of each image's status immediately after it uploads.
e.g. if an image is not qualified, I would like to mark it as such.
Dropzone.js allows you to attach callbacks to various events in the cycle. To provide a status after each file is uploaded to the server, you can register success and error callbacks to provide this functionality.
var myDropzone = new Dropzone("div#myId", { url: "/file/post" });

// attach callback to the `success` event
myDropzone.on("success", function(file, result) {
    // the file parameter is https://developer.mozilla.org/en-US/docs/DOM/File
    // the result parameter is the response from the server
    // [success code here]
});

// attach callback to the `error` event
myDropzone.on("error", function(file, errorMessage, xhr) {
    // if the xhr parameter exists, the error was server-side
    if (xhr) {
        // [error code here]
    }
});
I found the solution here https://github.com/enyo/dropzone/issues/247
enyo commented on Aug 1, 2013 If you have uploadMultiple set to true,
then one AJAX request should contain as many files as parallelUploads.
If you have uploadMultiple set to false then there will be as many
AJAX requests as parallelUploads. Pedro Cunha
pedrocunha commented on Aug 1, 2013 Exactly see this bug could be
considered a feature :) because I could set parallelUploads to 1 and
per AJAX request have only one image at time but several uploads at
same time Pedro Cunha
pedrocunha commented on Aug 1, 2013 My point being: this is very
important because if one image fails to upload for some reason, let's
say a server validation error, you can handle just that image. If you
are handling a collection of stuff it's trickier to show errors. Do
you understand I mean?
It looks like it would be nice to be able to configure the number of images sent in each AJAX request?
Meaning if I set:
{
    parallelUploads: 8,
    uploadMultiple: false
}
Then it will upload 8 files in parallel, each in its own AJAX call, and the server can return a different error code per image.
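That configuration can be wired up as follows (a sketch: the selector, URL, and CSS class names are placeholders, not part of the Dropzone API):

```javascript
// One file per request: each image gets its own HTTP round-trip, so the
// server can accept or reject images individually and the client can mark
// each preview as it completes. The Dropzone constructor is passed in so
// the wiring can be exercised without a browser.
function createPerImageUploader(Dropzone, selector) {
    var dz = new Dropzone(selector, {
        url: "/file/post",
        uploadMultiple: false,  // one file per AJAX request...
        parallelUploads: 8      // ...but up to 8 requests in flight at once
    });
    dz.on("success", function (file, result) {
        file.previewElement.classList.add("upload-accepted"); // mark this image good
    });
    dz.on("error", function (file, message, xhr) {
        file.previewElement.classList.add("upload-rejected"); // server rejected just this one
    });
    return dz;
}
```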

What are the ways to display 'chunked' responses as soon as they come into AngularJS?

Currently I have a problem displaying 'chunks' of responses that I am sending from my Web Service Node.js server (localhost:3000) to a simulated client running on a Node.js server (localhost:3001).
edit * - The current implementation just uses Angular's $http as the transport, without web-sockets.
The logic goes as follows:
1. Create an array of 'Cities' on the client side and POST it (from the AngularJS controller) to the Web Service located at localhost:3000/getMatrix
$http({
    method: 'POST',
    url: 'http://localhost:3000/getMatrix',
    data: cityArray
}).
success(function(data, status, headers, config) {
    // binding of $scope variables
    // calling a local MongoDB to store each data item received
    for (var key in data) {
        $http.post('/saveRoutes', data[key])
            .success(function(data, status) {
                // Data stored
            })
            .error(function(data, status) {
                // error prints in console
            });
    }
}).
error(function(data, status, headers, config) {
    alert("Something went wrong!!");
});
2. The Web Service then runs through its process to make a matrix of 'Cities' (e.g. if it was passed 5 cities, it would return a JSON matrix of 5 by 5, i.e. 25 items). But the catch is that it passes the data back in 'chunks' thanks to Node's response.write(data).
Side note - Node.js automatically sets 'Transfer-Encoding':'chunked' in the header
// *** other code before (routing / variable creation / etc.) ***
res.set({
    'Content-Type': 'application/json; charset=utf-8',
});
res.write("[\n");

// *** other code to process loops and pass arguments ***
// query.findOne to MongoDB; if there are no errors:
res.write(JSON.stringify(docs) + ",\n");

// *** more code running loops to write more chunks ***
// at the end of all loops:
res.end("]");
// The final JSON looks like this:
[
    { *data* : *data* },
    { *data* : *data* },
    ......
    { *data* : *data* }
]
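One caveat in the server sketch above: writing JSON.stringify(docs) + ",\n" for every document leaves a trailing comma before the closing "]", which is invalid JSON. Emitting the separator before every document except the first avoids that. A minimal sketch of the chunk sequence (names assumed):

```javascript
// Yield the pieces of a JSON array one chunk at a time, the way the server's
// res.write() calls would send them. The separator goes *before* each
// document after the first, so the stream never ends with ",\n]".
function* jsonArrayChunks(docs) {
    yield "[\n";
    let first = true;
    for (const doc of docs) {
        if (!first) yield ",\n";
        first = false;
        yield JSON.stringify(doc);
    }
    yield "\n]";
}

// Server-side usage sketch:
// for (const chunk of jsonArrayChunks(rows)) res.write(chunk);
// res.end();
```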
Currently the problem is not that the 'chunked' response is not reaching its destination, but that I do not know of a way to start processing the data as soon as the chunks come in.
This is a problem since I am trying to do a matrix of 250x250 and waiting for the full response overloads Angular's ability to display the results as it tries to do it all at once (thus blowing up the page).
This is also a problem since I am trying to save the response to MongoDB and it can only handle a certain size of data before it is 'too large' for MongoDB to process.
I have tried looking into Angular's $q and the promise/defer API, but I am a bit confused on how to implement it and have not found a way to start processing data chunks as they come in.
This question on SO about dealing with chunks did not seem to help much either.
Any help or tips on trying to display chunked data as it comes back to AngularJS would be greatly appreciated.
If the responses could be informative code snippets demonstrating the technique, I would greatly appreciate it since seeing an example helps me learn more than a 'text' description.
-- Thanks
No example because I am not sure what you are using in terms of transport code/if you have a websocket available:
$http does not invoke any of its callbacks until the request completes with a success code; it listens for onreadystatechange and a 200-like status.
If you want to stream like this, one option is to wrap $http in a transport layer that makes multiple $http calls, each of which completes and returns a success header.
Another option is websockets: instead of calling $http, emit an event on the socket.
Then, to get the chunks back to the client, have the server emit each chunk as a new event on the backend, and have the front end listen for that event and process each chunk as it arrives.
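If websockets are not an option, raw XHR can approximate streaming: onprogress fires as responseText grows, so the client can parse whatever complete lines have accumulated. A sketch assuming the server ends each array element with a newline, as the res.write calls in the question do:

```javascript
// Incrementally extract complete JSON documents from a growing buffer.
// Assumes one document per line with array framing ("[", trailing commas,
// "]"); the tail that is not yet complete is returned for the next chunk.
function drainCompleteLines(buffer, onDoc) {
    const lines = buffer.split("\n");
    const remainder = lines.pop(); // the last piece may still be arriving
    for (const line of lines) {
        const doc = line.trim().replace(/,$/, ""); // drop the trailing separator
        if (doc && doc !== "[" && doc !== "]") onDoc(JSON.parse(doc));
    }
    return remainder;
}

// Wiring it to a raw XHR (sketch; in Angular this would live in a service):
function streamMatrix(url, body, onDoc) {
    const xhr = new XMLHttpRequest();
    let seen = 0, pending = "";
    xhr.onprogress = function () {
        pending += xhr.responseText.slice(seen); // only the newly arrived text
        seen = xhr.responseText.length;
        pending = drainCompleteLines(pending, onDoc);
    };
    xhr.open("POST", url);
    xhr.setRequestHeader("Content-Type", "application/json");
    xhr.send(JSON.stringify(body));
}
```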
