file_put_contents access denied inside a loop - javascript

Today, I was updating some legacy code: a small vanilla PHP application that uses Ajax queries to export data from a database.
The application lets you select the type of data you want to extract; when you click the "Export" button, it posts an Ajax query that queries a database and writes the result to a .xlsx file.
The previous developer built a system to display a progress bar while the export is being written, a system I had never seen or thought of before.
He wrote a jQuery function that reads a .txt file containing the number of elements processed so far, for example:
1205/15000
The function:
$.get("./suivis/" + <?php echo $idUser ?> + "_suivis.txt", function(data) {
    if (data === "save") {
        $("#results").html('<p>Creating Excel file, thank you for waiting.</p>');
    }
    var elem = data.split('/');
    if (elem[0] != '' && elem[1] != '' && data != "save") {
        $("#results").html('<progress value="' + elem[0] + '" max="' + elem[1] + '"></progress>');
    }
});
So he uses the content of the .txt file to drive the progress bar. That function is called every 0.5 seconds.
Now, in the script executed by the Ajax query, we query the database, count the result array to get the total number of lines to write, create a counter for the foreach, and while writing the .xlsx file we also do:
$current = ($i-1).'/'.$total; // e.g. "1205/15000"
file_put_contents($file, $current);
That's how we write the processing state in the .txt file.
The error I got happened only when many lines were being written (more than 10K).
At some random point during the process (around the 5K-7K mark), I would get an error from the file_put_contents function: failed to open stream: Permission denied.
I have two guesses about why it's happening.
First: it may be because jQuery is fetching the file's content every 0.5 seconds while PHP opens, writes, and closes it very quickly inside the foreach loop. Maybe the file is "locked" while jQuery accesses its content, and PHP can't open it at the same time. More lines being processed would mean a greater probability of this collision happening.
Second: it's more unlikely, but maybe doing too many file_put_contents calls in a row makes the system struggle, so it tries to open and write while the file is still not closed. But as file_put_contents is a wrapper for open, write, close, I doubt it. Who knows, I'm not an expert.
I eventually "fixed" the problem, at least it's not showing anymore for the volume I treat, by making it write the .txt file every 200 records instead of every record.
What do you think is happening? Is my first guess correct? Do you see a better approach?
Thank you for your insights.

Related

PHP doesn't see a cookie unless I refresh the page again

I know there are other questions like this on here, but I've read through the answers and I'm still not finding a solution to my issue.
All of these scripts are on the same page, search-results.
It shows results in data-tables, and lets the user select records to save to a list, which they can then print. The IDs of the selected records are stored in a records cookie. I can see that this works.
There's a link at the bottom of the page that fires this JS function:
function resultsPage() {
    window.location = "/search-results/?printqueue=true";
}
As you can see, this is the same page, but with a different parameter. This tells the page to display their saved results.
On the initial load with ?printqueue=true, PHP cannot read the cookie, even though the page has refreshed. I have to refresh (or click the link) AGAIN before PHP picks up the cookie. Why?
I'm getting the cookie data like this:
if ($_GET['printqueue'] == 'true' && $_COOKIE['records'] !== 'undefined') {
    $posts_to_show = explode(',', $_COOKIE['records']);
}
If that's just down to the order of operations, that's cool, nothing to do about it then. But is there any clean way to "double refresh" the page without adding an extra parameter and refreshing again when it's present? I'd really hate to do that...
I believe it's because $_COOKIE['records'] is not set yet. PHP populates $_COOKIE from the Cookie header of the incoming request, so a cookie created client-side only becomes visible to PHP on a subsequent request.
Try:
if ($_GET['printqueue'] == 'true' && isset($_COOKIE['records'])) {
    // your code
}
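Putting the two checks together, a minimal sketch of the guarded read might look like this; the empty-array default is an assumption, not part of the original answer:

// Only touch the cookie when it was actually sent with this request.
$posts_to_show = array(); // assumed default when nothing is selected
if (isset($_GET['printqueue'], $_COOKIE['records'])
        && $_GET['printqueue'] == 'true') {
    $posts_to_show = explode(',', $_COOKIE['records']);
}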

Trying to generate a temporary txt file and then delete it

I have a site where I want a user to be able to download some data as a text file, without permanently saving the file on my server. The approach I'm trying is to use JavaScript to send a POST and PHP to generate and save a text file. On success, JavaScript opens that file in a separate window; after a few seconds' delay, it then sends a POST naming the file to delete.
I have most of it working, but for some reason, when I try to delete the file, I keep getting an error: No such file or directory. I don't know why, especially since deleting an already-existing test file in the same directory works fine. Here's what I'm using on the JavaScript side:
//// CREATE FILE
function exportGroup() {
    $.post("../Modules/Export_Mod/export_mod.php",
        {
            submit: 'export',
            groupIndex: groupSelect.value,
            userRole: 'admin',
            serial: <?php echo $serial; ?>
        },
        function(data, status) {
            // open created file in new window
            window.open("../Modules/Export_Mod/" + data);
            removeExport(data);
        });
}

////// REMOVE FILE
function removeExport(filename) {
    // after 1 second, send a post to delete the file
    setTimeout(function() {
        $.post("../Modules/Export_Mod/export_mod.php",
            {
                submit: 'removeExport',
                file: filename
            },
            function(data, status) {
                data;
            });
    }, 1000);
}
and my PHP:
//I'm creating the file successfully with this
...
$filename = $groupName."_group_export.txt";
$content = $header.$dataStr;
$strlength = strlen($content);
$create = fopen($filename, "w");
$write = fwrite($create, $content, $strlength);
$close = fclose($create);
But when I try to delete it a second (or more) later using this:
if (($_POST) && ($_POST['submit'] == 'removeExport')) {
    $file = $_POST['file'];
    unlink($file); // works when using an already-existing file in the same directory: unlink('test.txt');
}
I get the error. The first thing I am wondering is whether I'm approaching this the right way, and if not, whether there is a better way to do it. The second thing I'm wondering is why I'm getting this error and what I need to change to make it work.
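One sketch worth trying while debugging: unlink() resolves relative paths against the script's current working directory, and stray whitespace in the posted value will also make the lookup fail, so anchoring the path to the script's own directory and reducing the posted value to a bare file name rules both out. The suffix check below is an assumption based on the naming scheme above:

// Hypothetical hardened delete: same logic, but with the path pinned down.
if (($_POST) && ($_POST['submit'] == 'removeExport')) {
    // strip any directory part and stray whitespace from the posted name
    $name = basename(trim($_POST['file']));
    $path = __DIR__ . '/' . $name;
    // only delete files matching the expected naming scheme (assumption)
    if (substr($name, -17) === '_group_export.txt' && is_file($path)) {
        unlink($path);
    }
}

Restricting to basename() also stops a posted value like ../../config.php from deleting files outside the export directory.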
I would check the server permissions to see whether the PHP script that is running is able to delete the files it creates. If you are on a Unix-based system, I would run
ls -l /usr/var/
on the directory you are storing the files in, to see what permissions have been assigned to them.
I've done similar sorts of things where a file is created and then deleted some time later, in my case 24 hours. But what I did was set up a cron job to find the files older than a given period and delete them. That way I don't have to depend on the browser to post back a delete request.
Another option is to set up a custom session handler that deletes files associated with the session when the session closes. Though that may leave things lying around if the user doesn't officially log out.
Or you could keep the file data in CLOBs on MySQL and then set up a query that kills them after a period.
Or if you feel like using Cassandra, you can set a TTL on a row and it will magically disappear.
Don't get locked into operating system files. You control what data is provided by your pages, so if you want to send back data and call it a "file", the user will never know it is actually a database entry.

Simple online web game crashes eventually

Background:
I am making a simple game for the web in PHP, JavaScript and HTML. A player controls the movement of a box on the screen and sees others flying around with their boxes.
I have the following files, which I upload to my domain via a hosting company:
index.html: a file with some buttons (e.g. to start the game) and frames (for putting boxes in).
server.php: a PHP script that receives messages from the client, performs reads/writes to a database, and echoes (using echo) boxes from the database back to the client. It does not echo the box of the player the message came from.
database.txt: a JSON text file containing player data and the next free ID number. When empty it looks like this: {"players":[], "id": 1}. players contains objects with values such as ID, position and rotation.
script.js: a JavaScript file with the code that sends/receives messages, displays data from messages, etc. Linked from index.html. Moves your box.
[Screenshot: two players in movement.]
Problem: the game always crashes, sooner or later. This is what happens:
The client receives player data from server.php and everything is fine. This can go on for 10 seconds or up to a few minutes.
The data starts to falter; the message is sometimes null instead of actual data.
The data received is always null. The database file is now {"players":null,"id":5}. (The "id" could be any number; it does not have to be 5.)
[Picture of the data flow, printing players from the database, two players: before this screenshot, lots of rows with valid data, then two null messages, then after a while null forever.]
I am not completely sure where the problem is, but I am guessing it has to do with my read/write in server.php. It feels like a lot of player movement makes the program more likely to crash. How often the program sends data also has an effect.
Code piece 1: this is the code from server.php that writes to the database. I have a sort of semaphore (the flock(...)) to prevent clients from reading/writing at the same time (which causes errors). I have another function, read(), which is very similar to this. Possible problems here:
The semaphore is incorrect.
The mode for fopen() is incorrect. See the PHP docs: the mode w is for writing, and the b flag is because "If you do not specify the 'b' flag when working with binary files, you may experience strange problems with your data ...".
Something weird is happening because I use read() inside my writing function?
Code:
// Write $val to $obj in database JSON
function write($obj, $val) {
    $content = read();
    $json = json_decode($content);
    $json->{$obj} = $val; // e.g. $json->{'id'} = 5;
    $myfile = fopen("database.txt", "wb") or die("Unable to open file!");
    if (flock($myfile, LOCK_EX|LOCK_NB)) {
        fwrite($myfile, json_encode($json));
        flock($myfile, LOCK_UN);
    }
    fclose($myfile);
}
Code piece 2: this is my code to send data. It is called via setInterval(). In script.js:
// Send message to server.php, call callback with answer
function communicate(messageFunc, callback) {
    var message = messageFunc();
    if (window.XMLHttpRequest) {
        var xmlhttp = new XMLHttpRequest();
    }
    xmlhttp.onreadystatechange = function() {
        if (this.readyState == 4 && this.status == 200) {
            callback(this.responseText);
        }
    };
    xmlhttp.open("GET", "server.php?msg=" + message, true);
    xmlhttp.send();
}
This is my code to receive data, in server.php: $receive = $_GET["msg"].
My work on solving it so far
This is what I have done, but nothing has changed:
Added the b mode flag to fopen().
Added flock() to the read/write functions in server.php.
Much reworking of script.js; I would say it now looks and works very clean.
Checked memory_get_peak_usage() and checked with the hosting company for memory limits. Should be no problem at all.
Looked at PHP garbage collection and gc_enable() (I don't know why that would change anything).
Lots of testing, looking at the data flow.
Crying.
Conclusion: is this type of application what PHP is for? What do you think is wrong? If you want more code/info, I will provide it. Thank you very much.
Here is the root of your problem:
$myfile = fopen("database.txt", "wb") or die("Unable to open file!");
Note the behavior of the w open mode (emphasis mine):
Open for writing only; place the file pointer at the beginning of the file and truncate the file to zero length. If the file does not exist, attempt to create it.
This happens before you lock the file. What's happening is that between this fopen() call and the following flock() call, the file's content is zero length, and a reader is coming along during that window and reading the empty file.
Why doesn't this cause an error in PHP when you parse the empty string as JSON? Because json_decode() is defective: it returns null when the input is not valid JSON rather than throwing an exception. Never mind that the string "null" is valid JSON; json_decode() gives you no way to differentiate between valid input representing the null value and invalid input. If json_decode() actually threw an exception or triggered a PHP error (don't ask me why two error-signalling mechanisms are necessary in PHP), you would have a fantastic point from which to start debugging why the file is empty, and you might have solved this problem by now!
... sigh ...
PHP's "design" gives me headaches. But I digress.
To fix this whole problem, change the open mode to "cb" and call ftruncate($myfile, 0) after you successfully acquire the lock.
Note the behavior of the c mode, which actually specifically mentions the approach you are using (emphasis mine):
Open the file for writing only. If the file does not exist, it is created. If it exists, it is neither truncated (as opposed to 'w'), nor the call to this function fails (as is the case with 'x'). The file pointer is positioned on the beginning of the file. This may be useful if it's desired to get an advisory lock (see flock()) before attempting to modify the file, as using 'w' could truncate the file before the lock was obtained (if truncation is desired, ftruncate() can be used after the lock is requested).
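Applied to the write() function from the question, that advice gives roughly the following sketch. It keeps the question's own read() helper; switching from LOCK_EX|LOCK_NB to a blocking LOCK_EX is my assumption, since silently skipping a write when the lock is busy would lose updates:

// Sketch: open with "cb" so nothing is truncated before the lock is held.
function write($obj, $val) {
    $content = read();                 // the question's own read() helper
    $json = json_decode($content);
    $json->{$obj} = $val;
    $myfile = fopen("database.txt", "cb") or die("Unable to open file!");
    if (flock($myfile, LOCK_EX)) {     // blocking lock (assumption, see above)
        ftruncate($myfile, 0);         // safe to empty the file: we hold the lock
        fwrite($myfile, json_encode($json));
        flock($myfile, LOCK_UN);
    }
    fclose($myfile);
}

Note that read() must also take a shared lock (LOCK_SH) for this to be airtight; otherwise a reader can still observe the window between ftruncate() and fwrite().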

Laravel, best method, Ajax progress bar for file downloads

I'm currently building a Laravel 4 website with one main feature: downloading (updating) multiple files.
I'd like to display a progress bar (something as simple as x% = (file number)/(total number of files)) to give users UX feedback.
Something already brainstormed at Displaying progressbar for file upload.
Do you know a way to do this without Flex?
And which would be the best one?
Try passing $_SESSION vars, inspired by session.upload_progress:
http://www.sitepoint.com/tracking-upload-progress-with-php-and-javascript/
For instance, in your PHP foreach:
$_SESSION['percentdownload'] = 95; // or any var, e.g. $runningPercent;
and run a JS loop which periodically refreshes the HTML/CSS progress bar:
var xpercent = '#Session["percentdownload"]'; // pseudo-code: the value must be fetched from the server
$("#myProgressElemId").updateFunction(xpercent); // placeholder for your progress-bar update
But doing a JS loop is quite dirty ...
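A minimal sketch of the server side of that idea, in plain PHP (a Laravel app would go through its Session facade instead; the file name progress.php and the key percentdownload are assumptions):

<?php
// progress.php - hypothetical endpoint polled by the JS loop.
session_start();
header('Content-Type: text/plain');
echo isset($_SESSION['percentdownload']) ? $_SESSION['percentdownload'] : 0;

One caveat with this approach: PHP locks the session file for the duration of a request, so the long-running download loop must call session_write_close() after each update, or this endpoint will block until the loop finishes.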
Actually, you are looking for pushed updates.
Isn't it saner/safer to download the files one by one?
- First, you query (jQuery $.get) the info (number of files, names, sizes, etc.).
- Second, you $.each() over them (jQuery) and request the download of each file... dumb?

How to update/modify webpage content with JavaScript before the page load has completed?

I'm trying to display a progress bar during a mass-mailing process. I use classic ASP and have disabled content compression. I simply update the width of an element that mimics a progress bar, and a text element showing the percent value.
However, during the page load the JavaScript seems to be ignored. I only see the hourglass for a long time, then the progress bar at 100%. If I put alerts between the updates, Chrome and IE9 refresh the modified values as I expect.
Is there any other JavaScript command to replace alert() that would help update the actual values? The alert() command magically lets the browser render the content immediately.
Thanks!
... Loop for ASP mail send code
If percent <> current Then
    current = percent
%>
<script type="text/javascript">
    //alert(<%=percent%>);
    document.getElementById('remain').innerText = '%<%=percent%>';
    document.getElementById('progress').style.width = '<%=percent%>%';
    document.getElementById('success').innerText = '<%=success%>';
</script>
<%
End If
... Loop end
These are the screenshots when I use alert() in the code: as you can see it works, but the user has to click OK many times.
The first step is writing the current progress into a Session variable whenever it changes:
Session("percent") = percent
The second step is building a simple mechanism that outputs that value to the browser when requested:
If Request("getpercent")="1" Then
Response.Clear()
Response.Write(Session("percent"))
Response.End()
End If
Finally, you need to read the percentage with JavaScript using a timer. This is best done with jQuery, as pure JavaScript AJAX is a big headache. After you add a reference to the jQuery library, use code like this:
var timer = window.setTimeout(CheckPercentage, 100);

function CheckPercentage() {
    $.get("?getpercent=1", function(data) {
        timer = window.setTimeout(CheckPercentage, 100);
        var percentage = parseInt(data, 10);
        if (isNaN(percentage)) {
            $("#remain").text("Invalid response: " + data);
        }
        else {
            $("#remain").text(percentage + "%");
            if (percentage >= 100) {
                // done!
                window.clearTimeout(timer);
            }
        }
    });
}
Holding the response until your processing is complete is not a viable option. Just imagine 30 people accessing the same page: you would have 30 persistent connections to the server for a long time, especially with IIS. It might work well in your development environment, but once you move to production and more people start accessing the page, your server might go down.
I wish you would look into the following:
Do the processing in the background on the server and do not hold the response for a long time.
Try writing a Windows service that resides on the server and takes care of your mass mailing.
If you still insist on doing it on the web, try sending one email at a time using Ajax: for every Ajax request, send an email or two (see the sketch after this answer).
And in your example above, without Response.Flush the browser will not get the percentage information either.
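A minimal sketch of that one-email-per-request idea, written in PHP for illustration since most of the server code on this page is PHP (the question itself uses classic ASP; the file name send_batch.php, the session keys, and the mail() arguments are all assumptions):

<?php
// send_batch.php - hypothetical endpoint: each Ajax call sends one email
// and reports "sent/total" so the client can size its progress bar.
session_start();
$recipients = $_SESSION['recipients'];              // queued beforehand (assumption)
$i = isset($_SESSION['sent']) ? $_SESSION['sent'] : 0;

if ($i < count($recipients)) {
    mail($recipients[$i], 'Subject', 'Body');       // send exactly one email
    $_SESSION['sent'] = ++$i;
}

header('Content-Type: text/plain');
echo $i . '/' . count($recipients);

The client-side loop then looks much like the CheckPercentage() poller above, except that each response also triggers the next send.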
Well, you don't.
Except for simple effects like printing dots or a sequence of images, it won't work reliably, and even then buffering could interfere.
My approach would be to have an area that you update with an Ajax request every second, hitting a script that reads a log file, a sent-count file, or a database entry written by the mass-mailing process. The mass-mailing process would be initiated by Ajax as well.
ASP will not write anything to the page until it is fully done processing (unless you flush):
Response.Buffer = True
' write something
Response.Flush
' write something else
' etc.
(see example here: http://www.w3schools.com/asp/met_flush.asp)
A better way to do this is to use ajax.
Example here:
http://jquery-howto.blogspot.com/2009/04/display-loading-gif-image-while-loading.html
I didn't like ajax at first, but I love it now.
