SSE loads lots of data and paralyzes AJAX POST requests - javascript

This is my sse_server.php file:
include_once 'server_files/init2.php'; //this file includes the connection file to the database and some functions
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

$assocArray = array();
$fetch_article = $dbh->prepare("SELECT
        article_author_id,
        article_author_un,
        article_id,
        article_cover,
        article_title,
        article_desc
    FROM articles ORDER BY article_id DESC");
$fetch_article->execute();

while ($fetch = $fetch_article->fetch(PDO::FETCH_ASSOC)) {
    $article_author_id = $fetch['article_author_id'];
    $article_author_u = $fetch['article_author_un'];
    $article_id = $fetch['article_id'];
    $article_cover = $fetch['article_cover'];
    $article_title = $fetch['article_title'];
    $article_desc = $fetch['article_desc'];
    $randomNum = rand(0, 500);
    //Insert the random number along with the article's info | random number as the value and the contents as the key
    $assocArray[
        'randomNum'.'|'.        //0
        $article_author_id.'|'. //1
        $article_author_u.'|'.  //2
        $article_id.'|'.        //3
        $article_cover.'|'.     //4
        $article_title.'|'.     //5
        $article_desc           //6
    ] = $randomNum;
}

//sort the array
arsort($assocArray, 1);
//echo '<pre>';
//print_r($assocArray);

//while(true){
$var = '';
foreach ($assocArray as $key => $value) {
    $var .= $value . ' => ' . $key . '`|||`<br>';
}
echo "retry: 6000\n";
echo "data: {$var}\n\n";
ob_flush();
flush();
//}
And this is how I'm processing the data in my client.php file:
<div id="feeds"></div>
<script>
if (typeof(EventSource) !== "undefined") {
    var eSource = new EventSource("sse_server.php");
    //detect message received
    eSource.addEventListener('message', function(event) {
        var jsV_feeds = event.data;
        var eventList = document.getElementById("feeds");
        var jsV_feedsArray = jsV_feeds.split('`|||`'); //separator
        eventList.innerHTML = jsF_ToFetch(jsV_feedsArray);
    }, false);
}
else {
    document.getElementById("feeds").innerHTML = "Whoops! Your browser doesn't receive server-sent events.";
}

function jsF_ToFetch(jsP_array) {
    var string = ''; //an empty string
    for (var i = 0; i < jsP_array.length - 1; i++) {
        jsV_Feed = jsP_array[i].split('|');
        jsV_randomNum = jsV_Feed[0];
        jsV_article_author_id = jsV_Feed[1];
        jsV_article_author_u = jsV_Feed[2];
        jsV_article_id = jsV_Feed[3];
        jsV_article_cover = jsV_Feed[4];
        jsV_article_title = jsV_Feed[5];
        jsV_article_desc = jsV_Feed[6];
        string += jsV_randomNum + '<li><b>' + jsV_article_author_u + '</b><!--process the rest in a similar way--> </li>';
    } // for loop ENDS here
    return '<ol>' + string + '</ol>';
}
</script>
The problem is that if I use the foreach loop only, the client reconnects every 6 seconds.
And if I wrap the foreach inside a while loop, it keeps the connection alive but continuously keeps sending data. This eventually loads up a lot of data within seconds. It also makes AJAX POST requests, which are executed simultaneously from another page, very slow.
Why is that happening?
How can I keep the connection open, not resend the same data, and not slow down the AJAX POST requests?
PS: I have visited these links -
http://www.html5rocks.com/en/tutorials/eventsource/basics/
PHP Event Source keeps executing
Maybe I didn't understand them well enough. If it could be boiled down to even simpler terms, kindly do so!
Thanks in advance!

You want to be using the while(true){} loop that you've commented out in sse_server.php. Your SSE script should never exit (until the socket is closed, which would happen from client-side, i.e. your JavaScript script closing it, or the browser window being closed).
The reason you have problems when using the while loop is that there is no sleep() or wait action inside the while loop. So you are sending data to the client (the same data over and over again!), at maximum rate.
Conceptually, what I'm guessing you are after is this code:
$lastID = 0;
while (true) {
    $fetch_article = $dbh->prepare("SELECT something FROM somewhere WHERE conditions AND articleID > ?");
    $fetch_article->execute(array($lastID));
    $results = $fetch_article->fetchAll(PDO::FETCH_ASSOC);
    if (count($results) > 0) {
        foreach ($results as $result) {
            echo "data: ..."; // the result formatted for SSE, ending in "\n\n"
            $lastID = $result['articleID'];
        }
        flush();
        ob_flush();
    }
    sleep(5);
}
This is saying it will poll the DB every 5 seconds for new records. If there are no new records it does nothing - just goes back to sleep for another 5 seconds. But if there are new records it pushes them out over SSE to the client.
You can adjust the 5 second sleep to find the balance between CPU usage on the server and latency. Shorter sleep means lower latency (your clients get the new data sooner), but higher CPU on the server.
Aside: The lastID approach above is just some way of detecting what records you have seen, and have not yet seen. It is good if you have a unique ID in your table, which is AUTO_INCREMENT. But, alternatively, if DB records are inserted with a created timestamp, then the query becomes:
$now = 0;
while (true) {
    $stmt = $dbh->prepare("SELECT ... AND created > ?");
    $stmt->execute(array($now));
    $now = time();
    // ... process results ...
    sleep(5);
}
(Slightly safer is to set $now to the maximum created timestamp that was found in results, rather than to time() each time; otherwise it is possible for a record to slip through the cracks and not get sent to clients.)
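For illustration, a minimal sketch of that safer variant against the question's articles table; the created column (a UNIX timestamp) is an assumption, not part of the original schema:

$since = 0;
while (true) {
    $stmt = $dbh->prepare("SELECT article_id, article_title, created FROM articles WHERE created > ?");
    $stmt->execute(array($since));
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        echo "data: " . $row['article_title'] . "\n\n";
        // advance the watermark to the newest row actually seen, so a record
        // inserted between the query and time() cannot slip through the cracks
        $since = max($since, (int)$row['created']);
    }
    flush();
    ob_flush();
    sleep(5);
}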


javascript alternatives to setInterval

I know this may be a duplicate, but I can't find what I need. I'm trying to make a real-time MMORPG (references: Travian, Tribal Wars, Ikariam, etc.) in order to get some experience and still have some fun (my childhood dream).
A user has more than one city and can access each one using a select form.
When the user changes the select form, an ajax call goes to the DB and returns the current city resources (wood, iron, stone) and the current production of each one. All works well: when I change the select form, the ajax call updates the resources bar with a loop over the values class. I update my DB stocks table with the current values plus production once each hour (it would be a waste to do it every 10 seconds). The problem is that I want to run a script every 5-10 seconds which should update the client-side resource stock, something like: document.getElementById('wood').innerHTML = current stocked wood (queried once with ajax) + (wood_production / 3600 (seconds in an hour) * ((current_minutes * 60) + current_seconds)). All works fine, but when I change the city with the select form, the old setInterval keeps running; the script now holds two sets of wood and wood_prod values and toggles between them on each iteration. Every 5 seconds the div representing the wood value shows the last selected city's calculation one time and the current city's calculation the next, so the div content juggles between the two on every interval.
EDIT: The setInterval is holding on to the values it was started with and won't drop them even when they are replaced by others, so it forces the initial values and toggles them with the current ones every 5 seconds.
Here is a part of my code:
$(document).ready(
    function() {
        $("#setCitty").on("change", getstats);
    }
);

function getstats() {
    var city = $('#setCitty').val(); //the select form triggering the ajax
    var identifier = $('#identifier').val(); //don't mind it
    $.ajax({
        type: 'post',
        url: 'handle/php/cityStatsGet.php',
        dataType: "json",
        data: {
            city: city,
            identifier: identifier,
        },
        success: function(response) {
            console.log(response.citystats);
            console.log(response.production); //added console.log here... all seems ok; changing the city changes the content of response.citystats and response.production
            clearInterval(interval);
            var v = 0;
            $(".values").each(function() { //I have a mess here; I will use vanilla JS for performance later
                $(this).html(response.citystats[v]);
                v++;
            });
            incoming();
            setInterval(incoming, 5000);
            function incoming() {
                var d = new Date();
                var m = d.getMinutes();
                var s = d.getSeconds(); //response.citystats[19] is iron stock
                $('#ironInc').html(parseInt(parseInt(response.citystats[19]) + ((parseInt(response.production[2]) / 3600)) * ((+parseInt(m) * 60) + parseInt(s))));
            } //I parseInt everything because JS treats these values as strings
        }
    });
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<span class="values" id="ironInc"></span>
So... my question is: can I somehow restart the execution of setInterval? (It is not restarting when I re-call its parent function.)
EDIT: also, I have in php
$stmt = $conn->prepare("SELECT * FROM city_stats WHERE user = ? AND city = ? ");
$stmt->bind_param('ss', $username, $city);
$stmt->execute();
$result = $stmt->get_result();
$row = $result->fetch_assoc();
//here is some math to get the final values, e.g.: $income = $row['foo'] + $row['bar'];
$data = array();
$data['citystats'] = array(
    $population, $workers, $houses, $happiness, $popularity, $filantropy, $terror, $tax,
    $foodFactor, $innFactor, $religion, $crime, $crowding, $rats, $food, $meat, $fruits,
    $chease, $bread, $kiron, $kstone, $kgold, $kwood
); //23 elements
$data['production'] = array(
    $goldincome, $pwood, $piron, $pstone
);
$stmt->free_result();
$stmt->close();
$conn->close();
echo json_encode($data);
setInterval(..) returns an id for the timer. Call clearInterval(id) when you change cities.
I.e.
var id = setInterval(..);
and when changing city:
clearInterval(id);
This stops the periodic refresh of wood etc. from the previously selected city.
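A minimal sketch of that fix applied to the question's success callback; startTicker is a hypothetical helper name:

var intervalId = null; // a single timer id, shared across city changes

function startTicker(response) {
    if (intervalId !== null) {
        clearInterval(intervalId); // stop the previous city's ticker first
    }
    intervalId = setInterval(function () {
        var d = new Date();
        var elapsedSeconds = d.getMinutes() * 60 + d.getSeconds();
        // citystats[19] is the iron stock; production[2] is the iron production per hour
        $('#ironInc').html(Math.floor(parseInt(response.citystats[19], 10) +
            parseInt(response.production[2], 10) / 3600 * elapsedSeconds));
    }, 5000);
}

// in getstats()'s success callback, replace the incoming()/setInterval lines with:
// startTicker(response);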
You could use async functions and work with promises:
function sleepPromise(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

async function sleep() {
    while (true) {
        // do something
        await sleepPromise(2000); // sleep the desired number of milliseconds
        // break if needed
        console.log('I have awakened.');
    }
}
sleep();
EDIT: Here is an example of a valid PHP file structure returning JSON after doing a MySQLi query:
<?php
$con = mysqli_connect(... //Your connection info

// Your query
$result = mysqli_query($con, "SELECT name, price FROM ...");

// If there are results
if (mysqli_num_rows($result) > 0) {
    $resultArray = array();
    // Go through them row by row
    while ($row = mysqli_fetch_array($result)) {
        // Make the associative array for each row
        $arr = array('name' => $row[0], 'price' => $row[1]);
        // Add the row to a list of rows
        array_push($resultArray, $arr);
    }
    // set headers for JSON and json_encode the result array
    header('Content-type: application/json');
    echo json_encode($resultArray);
}
else echo 'error';
?>
EDIT 2: Here is your javascript code written with promises. See if this works for you:
$(document).ready(function()
{
    $("#setCitty").on("change", getstats);
});

// Returns promise
function goToSleep(milliseconds)
{
    return new Promise(resolve => setTimeout(resolve, milliseconds));
}

// Your tasks
function incoming(response)
{
    var d = new Date();
    var m = d.getMinutes();
    var s = d.getSeconds();
    $('#ironInc').html(parseInt(parseInt(response.citystats[19]) + ((parseInt(response.production[2]) / 3600)) * ((+parseInt(m) * 60) + parseInt(s))));
}

// function to handle events
async function handleEvents(city, response)
{
    // keep ticking only while the originally selected city is still selected
    while (city == $("#setCitty option:selected").val())
    {
        incoming(response); // call function to do stuff
        await goToSleep(1000 * 5); // 5 seconds
    }
}

function getstats()
{
    var city = $("#setCitty option:selected").val(); // current selected item
    var identifier = $('#identifier').val();
    $.ajax(
    {
        type: 'post',
        url: 'handle/php/cityStatsGet.php',
        dataType: "json",
        data: {
            city: city,
            identifier: identifier,
        },
        success: function (response)
        {
            handleEvents(city, response);
        }
    });
}

What is the proper way of using JavaScript, jQuery and AJAX to avoid reaching CPU limits?

I think the script below may be causing high server load, bringing my site down and making the CPU reach its limit. Is there a proper way to arrange the variables and structure of this script?
One more point: I am using setTimeout() here because I really need to get data from a JSON endpoint (CodeIgniter PHP and MySQL) every 2 seconds. Is there another, proper way to set this up? What can I do to minimize my server load and avoid reaching my web host's CPU limit?
<script>
var data1;
var data2;
var id = "1";
var url = '<?php echo base_url();?>index.php/site/get_products/';
products();

function products() {
    $(document).ready(function () {
        $.get(url + id, function (data) {
            var obj = JSON.parse(data);
            data1 = obj.product[0].data1;
            data2 = obj.product[0].data2;
            if (obj.product[0].data2 == "") {
                document.getElementById("datap").innerHTML = "No data found";
            } else {
                document.getElementById("datap").innerHTML = data1 + data2;
            }
        });
    });
    setTimeout(products, 2000);
};
server side: (mycontroller.php)
public function get_product($id){
    $this->db->select('*');
    $this->db->from('product');
    $this->db->where('id', $id);
    $query = $this->db->get();
    if($query->num_rows() > 0){
        $data['product'] = $query->result();
    }
    echo json_encode($data);
}
There is no use of $(document).ready(function () {...}) inside the locate function.
Also, you have to place the setTimeout outside the function locate:
function locate() {
    $.get(get_loc + v_id, function(data) {
        var obj = JSON.parse(data);
        longtitude = obj.vehicle[0].longtitude;
        latitude = obj.vehicle[0].latitude;
        if (obj.vehicle[0].longtitude == "") {
            document.getElementById("coordinates").innerHTML = "No coordinates found in the database";
        } else {
            document.getElementById("coordinates").innerHTML = "Longtitude: " + longtitude + "<br>Latitude: " + latitude + " ";
        }
    });
};
setTimeout(locate, 2000);
After chatting with Cross, I think we've found the problem. He's creating an Android map app that asks for GPS coordinates and sends them to a PHP script on a web server. The PHP script updates the vehicle's record with the latitude and longitude, using the vehicle's id (it's a car tracker) to know which record must be updated.
Then another PHP script reads the database and, using JS, moves a marker on a map.
The problem came from the fact that Cross was using a timer in the Android app to ask for the GPS position every 2 seconds, but he was calling the web server inside a loop without any "timer". So in pseudocode he had something like this:
Every 2 sec, ask for GPS location
Do
    send location to server
Loop
So even though the GPS location was updated only every 2 seconds, he was calling the web server continuously at full speed, which explains the CPU overload.
I suggested something like this to Cross instead:
Get GPS location every 2 sec
Do
    if location has changed
        send to server
    endif
Loop
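In browser JavaScript, the same only-send-on-change idea might look like the sketch below; the update_position.php endpoint and the sendToServer helper are illustrative assumptions, not Cross's actual code:

var lastSent = null; // the last coordinates actually sent

// hypothetical sender: POSTs one coordinate pair to the tracking endpoint
function sendToServer(coords) {
    $.post('update_position.php', { lat: coords.latitude, lng: coords.longitude });
}

setInterval(function () {
    navigator.geolocation.getCurrentPosition(function (pos) {
        var c = pos.coords;
        // only hit the server when the position actually changed
        if (!lastSent || lastSent.latitude !== c.latitude || lastSent.longitude !== c.longitude) {
            sendToServer(c);
            lastSent = { latitude: c.latitude, longitude: c.longitude };
        }
    });
}, 2000); // ask for the GPS location every 2 seconds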

How can I run ajax for an array of objects?

I'm not sure if this is an efficient way to use ajax but I am looping through an array of information using a for loop:
loadProfiles.js
var tempString = "";
var searchPeople = function(sv){
var xhttp = new XMLHttpRequest();
xhttp.onreadystatechange = function(){
if(xhttp.readyState == 4 && xhttp.status == 200){
tempString = xhttp.responseText;
loadPeople(tempString, sv);
}
}
var searchvalue = sv;
searchvalue = searchvalue.join(" ");
xhttp.open("GET", "php/searchProfiles.php?searchvalue=" + searchvalue, true);
xhttp.send();
}
var loadPeople = function(people, sv){
loadedPeople = [];
var normList = people.split(",");
var list = people.toLowerCase().split(",");
list.splice(list.length - 1, 1);
var zsearch = sv;
for(var i = 0; i < list.length; i++){
loadedImageId[i] = list[i].split("_")[1];
if(loadedImageId[i] == 0){
loadedImageId[i] = "images/GrayProfilePic.png";
}
else{///////////////////////////////////This is what I need to fix
var grabPic = new XMLHttpRequest();
grabPic.onreadystatechange = function(){
if(grabPic.readyState == 4 && grabPic.status == 200){
console.log("ready to go");
loadedImageId[i] = grabPic.responseText;
if(loadedImageId[i] == "Error1"){
loadedImageId[i] = "images/GrayProfilePic.png";
}
}
}
grabPic.open("GET", "php/grabProfPics.php?imageid=" + loadedImageId[i], true);
grabPic.send();
}//////////////////////////////////////////////
list[i] = list[i].split("_")[0];
for(var j = 0; j < zsearch.length; j++){
if(list[i].indexOf(zsearch[j]) > -1){
if(loadedPeople.indexOf(list[i]) == -1){
if(loadedPeople.indexOf(normList[i].split("_")[0]) == -1){
loadedPeople.push(normList[i].split("_")[0]);
}
}
}
}
}
console.log(loadedPeople);
console.log(loadedImageId);
}
searchProfiles.php
$query = "SELECT username, imageid FROM `memberHandler`";
$result = mysqli_query($connect, $query) or die("Could not query");
while($row = mysqli_fetch_assoc($result)){
echo $row['username'] . "_" . $row['imageid'] . ",";
}
grabProfPics.php
$query = "SELECT image, mime_type FROM memberProfilePictures WHERE `id`='$imageid'";
$result = mysqli_query($connect, $query);
if(mysqli_num_rows($result) != 0){
$row = mysqli_fetch_assoc($result);
$imagesrc = $row['image'];
$imagesrc = base64_encode($imagesrc);
$imagetype = $row['mime_type'];
echo "data:" . $imagetype . ";base64," . $imagesrc . "";
}
else{
echo "Error1";
}
However, the server takes a moment to send its response, by which time the variable i in the for loop has long since changed. Is there a way to do this efficiently and update the array with new information based on what the current array value is? I hope this question makes sense! Thanks for the help =)
Basically I am trying to loop through the image ids; if an id is not zero (meaning the user has already set a profile image, otherwise the id is 0), it should use ajax to connect to a database of images, grab the image matching that specific ID, return the image source, and update the array. I am sorry I was not more specific before; I just figured I could get away with a more simplified version.
Post Question Update: I wrote this before you pasted all of your code. It still applies, but a few more thoughts:
You seem to be just dumping data into SQL query strings. Little Bobby Tables would be proud, but you should worry about SQL injection.
If you insist on writing the standard new XMLHttpRequest(); code yourself (and not using a library like fetch or jQuery), you should wrap it in a function(url, data, method, successCb, errorCb). Libraries will help.
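For instance, a GET-only sketch of such a wrapper; the name request and its exact signature are illustrative:

// Tiny XHR helper: calls successCb with the response text, errorCb with the HTTP status
function request(url, params, successCb, errorCb) {
    var pairs = [];
    for (var key in params) {
        pairs.push(encodeURIComponent(key) + "=" + encodeURIComponent(params[key]));
    }
    var xhr = new XMLHttpRequest();
    xhr.open("GET", pairs.length ? url + "?" + pairs.join("&") : url, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState !== 4) return;
        if (xhr.status === 200) successCb(xhr.responseText);
        else errorCb(xhr.status);
    };
    xhr.send();
}

// usage, mirroring the question's endpoint:
// request("php/grabProfPics.php", { imageid: id }, onPic, onError);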
In your marked error code, here's the one that really bites you:
The i has long since moved on and doesn't match the index the call was used to make.
loadedImageId[i] = grabPic.responseText;
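A common fix is to capture the loop index per request, for example with an immediately invoked function; a sketch using the question's own variables:

else {
    (function (idx) { // idx is a stable copy of i for this request
        var grabPic = new XMLHttpRequest();
        grabPic.onreadystatechange = function () {
            if (grabPic.readyState == 4 && grabPic.status == 200) {
                loadedImageId[idx] = grabPic.responseText; // idx still matches the request
                if (loadedImageId[idx] == "Error1") {
                    loadedImageId[idx] = "images/GrayProfilePic.png";
                }
            }
        };
        grabPic.open("GET", "php/grabProfPics.php?imageid=" + loadedImageId[idx], true);
        grabPic.send();
    })(i);
}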
Moving on, the original async explanation:
Async Code
You're touching on how to handle general asynchronous tasks, which include ajax calls.
There are a host of ways to handle this problem, notably callbacks and promises.
While you could do this in a synchronous way, for anything other than toys or quick hacks, favoring asynchronous data is best.
Example
First, define our service. In this case, it's not leaving our machine, but the principle would be the same. You send something (profile id) and get something back (profile image url).
// After ~1-2 seconds, answer the callback with the evenness of the input
var isEvenAjax = function(num, cb) {
    setTimeout(function() {
        var isEven = num % 2 === 0;
        cb(num + " is " + (isEven ? "Even" : "Odd"));
    }, (Math.floor(Math.random() * 12) + 3) * 150);
};
You can have different signatures, but this is the crux. You put data into something, wait a while, and get a response.
For example:
isEvenAjax(2,console.log);
isEvenAjax(3,console.log);
isEvenAjax(7,console.log);
Could result in a feedback of:
"7 is Odd"
"2 is Even"
"3 is Odd"
And our test data:
var information = [
10,11,12,
];
Now to send our data to the service and get something back. A simple foreach can handle this (NOTE: this is for simple demo purposes. This could get real messy real fast. Promises are a good way to go).
var getInformationResponses = function(information, cb) {
    var responses = [];
    information.forEach(function(i) {
        isEvenAjax(i, function(response) {
            console.log("Feedback for " + i + " is: " + response);
            responses.push({num: i, response: response});
            if (responses.length >= information.length) {
                cb(responses);
            }
        });
    });
};
Note that the function that wraps all of your asynchronous calls is itself asynchronous (and, under our callback style, it needs a 'done' callback).
Breaking this down:
After declaring a responses array (into which we put all the results), loop through all of the information elements:
var responses = [];
information.forEach(function(i){
For every element, make an async call.
isEvenAjax(i,function(response){
For the callback for every element (as in, when data is returned from the long running service), note with console.log (for demo) and push the results and the original data into the responses array. Maintaining the source data may not matter for all apps, but in some cases (like which profile ids correspond to which profile urls) it will. Recall: async calls will never guarantee order.
console.log("Feedback for " + i + " is: " + response);
responses.push({num:i,response:response});
Now, check if the number of responses match the requests. If not, then not all the results are in and do nothing. If so, then trigger the main callback and send the complete data back to the main caller.
if (responses.length >= information.length){
cb(responses);
}
So an example like:
getInformationResponses(information,console.log);
can return something such as:
"Feedback for 10 is: 10 is Even"
"Feedback for 12 is: 12 is Even"
"Feedback for 11 is: 11 is Odd"
[[object Object] {
num: 10,
response: "10 is Even"
}, [object Object] {
num: 12,
response: "12 is Even"
}, [object Object] {
num: 11,
response: "11 is Odd"
}]
Promises
This exercise is purely intended to explore how asynchronous calls can be handled and wouldn't do well in production. Problems like error handling (ajax calls will fail) aren't addressed here.
As mentioned by CallMeNorm, promises can be great. I don't have time to cover them now.
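For a small taste, the callback-style demo service above can be wrapped in a Promise like this (a sketch; Promise.all resolves with the results in input order):

// Promise wrapper around the callback-style isEvenAjax service
var isEvenPromise = function(num) {
    return new Promise(function(resolve) {
        isEvenAjax(num, resolve);
    });
};

Promise.all(information.map(isEvenPromise))
    .then(function(responses) {
        console.log(responses); // ["10 is Even", "11 is Odd", "12 is Even"]
    });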
As it stands, the example you have is going to throw some errors: information[i] does not have a doAjaxstuff method. However, what I think you're trying to do would be easily done with Promises, which are native in modern browsers; even jQuery 3.0 has a compliant implementation. In that case, you could do something like:
var promises = information.map(function(piece) {
    // doAjaxstuff must return a promise
    return Promise.resolve(doAjaxstuff(piece));
});

// Promise.all preserves the input order, so no index bookkeeping is needed
Promise.all(promises)
    .then(function(inOrderValues) {
        // doYourThing
    });
Ajax calls are asynchronous, which means they will not wait for the for loop to finish. If you want to "pause" the iteration and only resume when the ajax call returns, you have to make it synchronous.
You can do that by adding async: false.
For more information check the jQuery docs.
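Applied to the question's loop, that might look like the sketch below. Note that synchronous XHR blocks the browser UI and is deprecated, so treat this as a quick fix rather than a pattern:

$.ajax({
    url: "php/grabProfPics.php",
    data: { imageid: loadedImageId[i] },
    async: false, // blocks here until the server responds
    success: function (responseText) {
        loadedImageId[i] = responseText; // i is still correct because the loop is paused
    }
});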

mySQL not returning an array into angularJS

I am trying to retrieve the data stored in a MySQL table with a PHP script. I want this data to be returned as an array because I then loop through it in my AngularJS app and apply various transformations. I am getting the data out just fine, but it is returned as just one item in an array, i.e. each row is not returned as a separate item of the array. My code as it stands is:
PHP Get Request
<?php
require 'config.php';
$pdo = Database::connect();
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$sql = 'SELECT * FROM user_details';
$stmt = $pdo->prepare( $sql );
$stmt->execute();
$result = $stmt->fetchAll( PDO::FETCH_ASSOC );
$json = json_encode( $result );
echo $json;
Database::disconnect();
?>
Angular Controller
$scope.userprofiles = [];
$http.get('php/getUserDetails.php')
    .success(function(data) {
        $scope.userprofiles = data;
    });
I also ran some tests to see what the issue is. Specifically, I checked whether the variable is an array with:
$scope.varcheck = $scope.userprofiles.constructor === Array;
This returns true. I then checked the length of the array with:
$scope.numRecords = $scope.userprofiles.length;
This returns 0.
If anyone had any thoughts it would be a great help.
I also have an issue that if a "/" or a "'" is stored in the database, it breaks the GET request. I assume that it is exiting early. If anyone knew about this it would be great too!
Thanks,
Jack
$http methods return a promise, which can't be iterated, so you have to attach the results to the scope variable through the callbacks:
$scope.userprofiles = [];
$http.get('php/getUserDetails.php')
    .then(function(response) {
        $scope.userprofiles = response.data;
    });
Hope it may help you :-)

How to restructure a long running php process to not time out [duplicate]

This question already has answers here:
How to increase the execution timeout in php?
(14 answers)
Closed 8 years ago.
I have a simple javascript function like so:
$(document).ready(function(){
    var request = $.ajax({
        url: "read_images.php",
        type: "GET",
        dataType: "html"
    });

    request.done(function(msg) {
        $("#mybox").html(msg);
        document.getElementById('message').innerHTML = '';
    });

    request.fail(function(jqXHR, textStatus) {
        alert("Request failed: " + textStatus);
    });
});
The php script it is calling loops on the contents of a folder, runs some checks, and returns a response. The script is as follows:
//Get all Images from server, store in variable
$server_images = scandir('../images/original');

//Remove first 3 elements, which are not correct
array_shift($server_images);
array_shift($server_images);
array_shift($server_images);

$j = 0;
for ($i = 0; $i < count($server_images) && $i < 3000; $i++) {
    $server_image = $server_images[$i];
    //Make sure that the server image does not have a php extension
    if (!preg_match('/.php/', $server_image)) {
        //Select the name from the images table where the image name equals the server image name
        $query = "SELECT `name`
                  FROM `images`
                  WHERE `name` = '$server_image'";
        $mro_images = $db->query($query);
        $mro_images_row = $mro_images->fetch();
        $mro_image = $mro_images_row['name'];
        //If no results are found
        if (empty($mro_image)) {
            $images[$j] = $server_image;
            $j++;
        }
    }
}
It works if the loop is restricted to 2000 iterations but if I try to do e.g. 3000 iterations the result is:
HTTP/1.1 500 Internal Server Error 31234ms
I've tried increasing the php execution limit, but this didn't have any effect as, after contacting my host:
Unfortunately in our environment we don't have any way to increase the loadbalancer timeout beyond 30 seconds
Therefore: How can I restructure this code to avoid hitting the execution time limit?
The code below indicates the basic logic to follow. It isn't tested code and should not be taken as a drop-in example.
Use a JavaScript loop
Instead of making a slow process slower, write your JavaScript to ask for smaller chunks of data in a loop.
I.e. the JS could use a while loop:
$(document).ready(function(){
    var done = false,
        offset = 0,
        limit = 20;

    while (!done) {
        var url = "read_images.php?offset=" + offset + "&limit=" + limit;
        $.ajax({
            async: false,
            url: url
        }).done(function(response) {
            if (response.processed !== limit) {
                // asked to process 20, only processed <=19 - there aren't any more
                done = true;
            }
            offset += response.processed;
            $("#mybox").html("Processed total of " + offset + " records");
        }).fail(function(jqXHR, textStatus) {
            $("#mybox").html("Error after processing " + offset + " records. Error: " + textStatus);
            done = true;
        });
    }
});
Note that in the above example the ajax call is forced to be synchronous. Normally you don't want to do this, but in this example it makes the code easier to write, and possibly easier to understand.
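An asynchronous variant would chain the next request from the done callback instead of blocking in a loop. A sketch, assuming the same response shape ({ processed: N }) as above:

function processChunk(offset, limit) {
    $.ajax({ url: "read_images.php?offset=" + offset + "&limit=" + limit })
        .done(function(response) {
            offset += response.processed;
            $("#mybox").html("Processed total of " + offset + " records");
            if (response.processed === limit) {
                processChunk(offset, limit); // there may be more: request the next chunk
            }
        })
        .fail(function(jqXHR, textStatus) {
            $("#mybox").html("Error after processing " + offset + " records: " + textStatus);
        });
}
processChunk(0, 20);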
Do a fixed amount of work per php request
The php code also needs modifying to expect and use the get arguments being passed:
$stuff = scandir('../images/original');
$offset = $_GET['offset'];
$limit = $_GET['limit'];
$server_images = array_slice($stuff, $offset, $limit);
foreach($server_images as $server_image) {
...
}
...
$response = array(
'processed' => count($server_images),
'message' => 'All is right with the world'
);
header('Content-Type: application/json');
echo json_encode($response);
die;
In this way the amount of work a given php request needs to process is fixed, as the overall amount of data to process grows (assuming the number of files in the directory doesn't grow to impractical numbers).
If everything works with 2000 iterations but fails at 3000, try raising the time limit to allow PHP to execute longer. But under normal circumstances this is not a good idea. Make sure you know what you are doing and have a good reason for increasing the execution time.
set_time_limit ( 60 );
http://www.php.net/manual/en/function.set-time-limit.php
Also, this could be due to the script exhausting its memory allowance. Create a file with the phpinfo() function in it and then check the value of memory_limit.
<?php phpinfo(); ?>
Then you can increase the limit with ini_set(), or via your .htaccess file. But again, make sure you want the script to consume more memory. Be careful.
ini_set('memory_limit', '128M'); #change 128 to suit your needs
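If you prefer the .htaccess route instead (this only applies under Apache with mod_php; shown as a sketch):

php_value memory_limit 128M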
Your count($server_images) is probably resulting in an infinite loop.
If count() returns 0, your for loop will never end. So you need to check that first.
//Get all Images from server, store in variable
$server_images = scandir('../images/original');
//Remove first 3 elements, which are not correct
array_shift($server_images);
array_shift($server_images);
array_shift($server_images);
$j = 0;
if (count($server_images) > 0) {
    for ($i = 0; $i < count($server_images) && $i < 3000; $i++) {
        //Do something
    }
}
