Simple online web game eventually crashes - javascript

Background:
I am making a simple game in PHP, JavaScript, and HTML for the web. A player controls the movements of a box on the screen and sees other players fly around with their boxes.
I have the following files, which I upload to my domain via a hosting company:
index.html: a file with some buttons (eg. to start the game) and frames (for putting boxes in).
server.php: PHP script that receives messages from the client, performs reads/writes to the database, and echoes (using echo) boxes from the database back to the client. It does not echo the box of the player the message came from.
database.txt: a JSON text file containing player data and the next free ID number. When empty it looks like this: {"players":[], "id": 1}. The players array contains objects with values such as ID, position, and rotation.
script.js: JavaScript file with script to send/receive messages, display data from messages etc. Linked to index.html. Moves your box.
[Screenshot: two players in movement.]
Problem: The game always crashes, sooner or later. This is what happens:
The client receives player data from server.php and everything is fine. This can go on for 10 seconds or up to a few minutes.
The data starts to falter; the message is sometimes null instead of actual data.
The data received is always null. The database file is now {"players":null,"id":5}. (The "id" could be any number; it does not have to be 5.)
[Picture of the data flow: players printed from the database, two players. Before this screenshot there were many rows of valid data, then, as seen, two null messages, then after a while null forever.]
I am not completely sure where the problem is, but I am guessing it has to do with my reads/writes in server.php. It feels like a lot of player movement makes the program more likely to crash. How often the program sends data also seems to matter.
Code Piece 1: This is code from server.php that writes to the database. I have some sort of semaphore (the flock( ... )) to prevent clients from reading/writing at the same time (causing errors). I have another function, read, which is very similar to this. Possible problems here:
The semaphore is incorrect.
The mode for fopen() is incorrect. See the PHP docs: the mode w is for writing, and the flag b is because "If you do not specify the 'b' flag when working with binary files, you may experience strange problems with your data ...".
Something weird happening because I use read() in my writing function?
Code:
// Write $val to $obj in database JSON
function write($obj, $val){
    $content = read();
    $json = json_decode($content);
    $json->{$obj} = $val; // eg. $json->{'id'} = 5;
    $myfile = fopen("database.txt", "wb") or die("Unable to open file!");
    if (flock($myfile, LOCK_EX|LOCK_NB)) {
        fwrite($myfile, json_encode($json));
        flock($myfile, LOCK_UN);
    }
    fclose($myfile);
}
Code Piece 2: This is my code to send data. It is called via a setInterval(). In script.js:
// Send message to server.php, call callback with answer
function communicate(messageFunc, callback){
    var message = messageFunc();
    if (window.XMLHttpRequest) {
        var xmlhttp = new XMLHttpRequest();
    }
    xmlhttp.onreadystatechange = function() {
        if (this.readyState == 4 && this.status == 200) {
            callback(this.responseText);
        }
    };
    xmlhttp.open("GET", "server.php?msg=" + message, true);
    xmlhttp.send();
}
This is my code to receive data, in server.php: $receive = $_GET["msg"];
My current work on solving this
This is what I have done so far, but nothing has changed:
Added mode b to fopen().
Added flock() to read/write functions in server.php.
Reworked script.js a lot; I would say it now looks and works very cleanly.
Checked memory_get_peak_usage() and checked with the hosting company about memory limits. That should be no problem at all.
Looked at PHP garbage collection and gc_enable() (I don't know why that would change anything).
Lots of testing, looking at the data flow.
Crying.
Conclusion: Is this type of application what PHP is for? What do you think is wrong? If you want more code/info, I will provide it. Thank you very much.

Here is the root of your problem:
$myfile = fopen("database.txt", "wb") or die("Unable to open file!");
Note the behavior of the w open mode (emphasis mine):
Open for writing only; place the file pointer at the beginning of the file and truncate the file to zero length. If the file does not exist, attempt to create it.
This happens before you lock the file. What's happening is that between this fopen() call and the following flock() call, the file's content is zero length, and a reader comes along during that window and reads the empty file.
Why doesn't this cause an error in PHP when you parse the empty string as JSON? Because json_decode() is defective, and returns null when the input is not valid JSON rather than throwing an exception. Never mind that the string "null" is valid JSON -- json_decode() gives you no way to differentiate between valid input representing the null value and invalid input. If json_decode() actually threw an exception or triggered a PHP error (don't ask me why two error-signalling mechanisms are necessary in PHP), you would have had a fantastic point to start debugging to figure out why the file is empty, and you might have solved this problem by now!
... sigh ...
PHP's "design" gives me headaches. But I digress.
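In the meantime, you can at least detect the bad parse yourself. A minimal sketch (and on PHP 7.3+, json_decode() can be made to throw by passing the JSON_THROW_ON_ERROR flag):

$json = json_decode($content);
if ($json === null && json_last_error() !== JSON_ERROR_NONE) {
    // The input was not valid JSON (e.g. an empty string);
    // json_last_error_msg() tells you why.
    error_log("Corrupt database.txt: " . json_last_error_msg());
}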
To fix this whole problem, change the open mode to "cb" and call ftruncate($myfile, 0) after you successfully acquire the lock.
Note the behavior of the c mode, which actually specifically mentions the approach you are using (emphasis mine):
Open the file for writing only. If the file does not exist, it is created. If it exists, it is neither truncated (as opposed to 'w'), nor the call to this function fails (as is the case with 'x'). The file pointer is positioned on the beginning of the file. This may be useful if it's desired to get an advisory lock (see flock()) before attempting to modify the file, as using 'w' could truncate the file before the lock was obtained (if truncation is desired, ftruncate() can be used after the lock is requested).
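Put together, the write function would look roughly like the sketch below. Note it is a sketch, not a drop-in: I have also dropped the LOCK_NB flag, since a non-blocking lock silently skips the write whenever another request holds the lock, and the read() side (which the question says is similar) should take a shared LOCK_SH lock the same way.

// Write $val to $obj in database JSON
function write($obj, $val) {
    $content = read();
    $json = json_decode($content);
    $json->{$obj} = $val;

    // "c" opens without truncating, so readers never see an empty
    // file before we hold the lock; "b" avoids newline translation.
    $myfile = fopen("database.txt", "cb") or die("Unable to open file!");
    if (flock($myfile, LOCK_EX)) {   // block until the lock is ours
        ftruncate($myfile, 0);       // now truncating is safe
        fwrite($myfile, json_encode($json));
        fflush($myfile);             // flush before releasing the lock
        flock($myfile, LOCK_UN);
    }
    fclose($myfile);
}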

Related

file_put_contents access denied inside a loop

Today, I was updating some legacy code: a small vanilla PHP application with Ajax queries to export data from a database.
The application lets you select the type of data you want to extract; you click the "Export" button, and it posts an Ajax query that queries the database and writes the result to a .xlsx file.
The previous developer made a system to display a progress bar while the export is being written.
A system I have never seen, nor thought of.
He made a jQuery function that reads a .txt file containing the number of elements processed so far, for example:
1205/15000
The function :
$.get("./suivis/" + <?php echo $idUser ?> + "_suivis.txt", function(data) {
    if (data === "save") {
        $("#results").html('<p>Creating Excel file, thank you for waiting.</p>');
    }
    var elem = data.split('/');
    if (elem[0] != '' && elem[1] != '' && data != "save") {
        $("#results").html('<progress value="' + elem[0] + '" max="' + elem[1] + '"></progress>');
    }
});
So he uses the content of the .txt file to manage the progress bar. That function is called every 0.5 seconds.
Now, in the part of the script executed by the Ajax query, we query the database, count the array of results to get the total number of lines to write, create a counter for the foreach loop, and while we write the .xlsx file, we also do:
$current = ($i-1).'/'.$total;
file_put_contents($file, $current);
That's how we write the processing state in the .txt file.
The error I got happened only when many lines were being written (more than 10K).
Once during the process, somewhere around record 5K-7K, I would randomly get an error from the file_put_contents function: failed to open stream: Permission denied.
I have two guesses about why it's happening.
First: it may be because jQuery fetches the file's content every 0.5 seconds while PHP opens, writes, and closes it very fast inside a foreach loop. Maybe the file is "locked" while jQuery accesses its content, so PHP can't open it at the same time. More lines being processed would mean a greater probability of this issue happening.
Second: it's less likely, but maybe doing too many file_put_contents calls in a row could make the system struggle and try to open and write while the file is still not closed. But as file_put_contents is a wrapper for open, write, close, I doubt it; who knows, I'm not an expert.
I eventually "fixed" the problem, or at least it no longer shows up at the volumes I handle, by writing the .txt file every 200 records instead of on every record.
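For reference, the throttled write looks roughly like this (a sketch; $i, $total, and $file are as in the snippets above, and the LOCK_EX flag is an extra precaution against concurrent access, not part of the original fix):

// Only touch the progress file every 200 records (and at the end).
if ($i % 200 === 0 || $i === $total) {
    $current = ($i - 1) . '/' . $total;
    // LOCK_EX takes an advisory lock for the duration of the write.
    file_put_contents($file, $current, LOCK_EX);
}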
What do you think is happening? Is my first guess correct? Do you see a better approach?
Thank you for your insights.

How to submit a form and execute javascript simultaneously

As a follow-up to my last question, I have run into another problem. I am building a Google homepage replica as a project. The aim is to show search results the same as Google and store the search history in a database. To show results, I have used this JavaScript:
const q = document.getElementById('form_search');
const google = 'https://www.google.com/search?q=';
const site = '';

function google_search(event) {
    event.preventDefault();
    const url = google + site + '+' + q.value;
    const win = window.open(url, '_self');
    win.focus();
}

document.getElementById("s-btn").addEventListener("click", google_search);
To create my form, I have used the following HTML code:
<form method="POST" name="form_search" action="form.php">
<input type="text" id="form_search" name="form_search" placeholder="Search Google or type URL">
The terms from the search bar are to be sent to a PHP file with the POST method. I have 2 buttons; let's name them button1 and button2. The JavaScript uses the id of button1, while button2 has no JavaScript and is simply a submit button.
The problem is that when I search using button1, the search results show up but no data is added to my database. When I search using button2, no results show up (obviously, because there is no JS for it) but the search term is added to my database. If I reverse the id in the JavaScript, the outcome is also reversed. I need help making sure that when I search with button1, it shows results and also saves the data to the database. If you need additional code, I will provide it. Please keep your answers limited to JavaScript, PHP, or HTML solutions. I have no experience with Ajax and jQuery. Any help is appreciated.
Tony, since there is limited code available, I will go with what you stated in your question.
It is a design pattern issue, not so much an event issue.
Copy-pasting from Wikipedia: "A software design pattern is a general, reusable solution to a commonly occurring problem within a given context in software design. It is not a finished design that can be transformed directly into source or machine code. Rather, it is a description or template for how to solve a problem that can be used in many different situations. Design patterns are formalized best practices that the programmer can use to solve common problems when designing an application or system."
So here is how things play out at present:
1. The form gets submitted to a specific URL, based on its action attribute.
2. The requested page gets the query string in PHP and lets you play around with it.
3. Either you get results from the database and return a response,
4. or you put the search request into the database and return a success response.
Problem statement
If it's 3, the search request is not added to the database; if it's 4, the results for the search request are not returned.
Solution
You need to combine both 3 and 4 into one processing block that always runs regardless of what the search query is.
So our design pattern could use a MySQL transaction, so that a whole bunch of queries runs as a single operation. Example:
$db->beginTransaction(); // tell MySQL we will run multiple queries as a single operation
$db->query('insert query');
$results = $db->query('search query');
$db->commit(); // reaching this point means all went fine (no errors etc.), so we commit,
               // which makes MySQL record the insert. If there were errors, nothing is recorded.
if ($results) { echo $results; } else { echo 'oops, no result found'; }
A slightly safer version:
try {
    $db->beginTransaction(); // run multiple queries as a single operation
    $db->query('insert query');
    $results = $db->query('search query');
    $db->commit(); // all went fine, so commit; on errors nothing is recorded
    if ($results) { echo $results; } else { echo 'oops, no result found'; }
} catch (\Throwable $e) {
    // An exception has been thrown; we must roll back the transaction
    $db->rollback();
    echo 'oops, the server could not process the request';
}
We have effectively combined the two query operations into one, always recording into the database and always searching it.
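Concretely, with PDO the combined block might look like the sketch below. The table and column names are placeholders, and $q stands for the sanitized search term taken from $_POST (also a hypothetical name):

$db = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass',
              [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);
try {
    $db->beginTransaction();

    // Step 4: record the search term.
    $insert = $db->prepare('INSERT INTO search_history (term) VALUES (?)');
    $insert->execute([$q]);

    // Step 3: fetch the results for the same term.
    $search = $db->prepare('SELECT * FROM pages WHERE title LIKE ?');
    $search->execute(['%' . $q . '%']);
    $rows = $search->fetchAll(PDO::FETCH_ASSOC);

    $db->commit();
    echo json_encode($rows);
} catch (\Throwable $e) {
    $db->rollBack();
    echo 'server could not process request';
}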

Uncaught TypeError: URL.createObjectURL: Argument 1 is not valid for any of the 1-argument overloads

I'm trying to provide administrators with a CSV download on button click. The Ajax call used to do so is triggered by a button in the WordPress admin dashboard. For the Ajax part I mostly copied the code from Jonathan Amend's answer on this question. With a few adjustments and the server side set up, I've been trying to download the CSV file. The download window doesn't pop up, and the console says:
Uncaught TypeError: URL.createObjectURL: Argument 1 is not valid for any of the 1-argument overloads.
By Argument 1 it can only mean the blob that gets sent from the server side. The documentation for URL.createObjectURL says:
object:
A File, Blob, or MediaSource object to create an object URL for.
Putting that together, it can only mean that the blob data received from the server has an incorrect format. When I log typeof blob to the console, it says "string".
This is part of the code used on the server side:
// Reading arguments, sanitize and validate data
// Query database and store in $results (as assoc_array)
$delimiter = ";";
$file = fopen("php://output", "w");
$cols_printed = false;
foreach ($results as $row) {
    if (!$cols_printed) {
        // FPUT 1: Write aliases of query to file once
        fputcsv($file, array_keys($row), $delimiter);
        $cols_printed = true;
    }
    // FPUT 2: Write values to file
    fputcsv($file, array_values($row), $delimiter);
}
// Close file, sending headers
If I comment out FPUT 2, then the CSV can be downloaded (of course with only the column names and no data). The typeof blob changes to "object".
I tried converting the whole blob explicitly before calling URL.createObjectURL in JS, which let me successfully download the file without error, but the actual rows were nowhere to be seen. So what makes the 2nd FPUT different?
EDIT 1: I've done some further testing. If I take FPUT 1 out of the if-statement, it throws the same error. Within the .php script there are no other var_dump, print_r, or echo calls that could produce this output. Even when putting an echo "test"; at the very end of the file, the two newlines still come after it. I could fix this by moving all the code into the main plugin.php file. However, the client still treats the response as a string, despite me changing my SELECT to a single INT column.
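For what it's worth, the client can also ask for a Blob explicitly instead of converting the string response by hand. A minimal sketch, assuming a plain XMLHttpRequest on a WordPress admin page (where the global ajaxurl is defined); the action name and filename are placeholders, and this only sidesteps the "string" type issue -- the stray newlines would still need fixing on the server side:

var xhr = new XMLHttpRequest();
xhr.open("POST", ajaxurl);
xhr.responseType = "blob";   // xhr.response will be a Blob, not a string
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.onload = function () {
    if (xhr.status === 200) {
        var url = URL.createObjectURL(xhr.response); // argument is a real Blob
        var a = document.createElement("a");
        a.href = url;
        a.download = "export.csv";   // placeholder filename
        document.body.appendChild(a);
        a.click();
        a.remove();
        URL.revokeObjectURL(url);
    }
};
xhr.send("action=my_csv_export");    // placeholder admin-ajax action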

Using Javascript to make parallel server requests THREDDS OPeNDAP

For the following THREDDS OPeNDAP server:
http://data.nodc.noaa.gov/thredds/catalog/ghrsst/L2P/MODIS_T/JPL/2015/294/catalog.html
I would like to record four attributes of every file in there. The attributes are:
northernmost latitude, southernmost latitude, easternmost longitude, and westernmost longitude. These can be found under the Global attributes at:
http://data.nodc.noaa.gov/thredds/dodsC/ghrsst/L2P/MODIS_T/JPL/2015/294/20151021-MODIS_T-JPL-L2P-T2015294235500.L2_LAC_GHRSST_N-v01.nc.bz2.html
At first I tried this with MATLAB. The problem is that all the netCDF files on the server are compressed into .bz2 files. This makes a request for the global attributes take around 15 seconds (the server has to extract the file). I would like JavaScript to run these server requests in parallel to save me time. In total I need 90,000 files.
Is there a way to code this using JavaScript?
You can use the THREDDS DAS service.
Change the OPeNDAP link you have above, replacing the .html extension with .das.
This is a small text file with metadata about the file, which can easily be parsed with JavaScript, and it includes a section with the global attributes:
NC_GLOBAL {
. . .
Float32 northernmost_latitude 89.9942;
Float32 southernmost_latitude 66.9853;
Float32 easternmost_longitude -121.445;
Float32 westernmost_longitude 76.7485;
. . .
}
This metadata is cached by THREDDS, and the DAS link above responds instantly.
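For the parallel requests, something along these lines could work. This is a sketch assuming a runtime with fetch (e.g. Node 18+, or a browser subject to CORS) and a caller-supplied list of DAS URLs:

// Fetch several DAS documents in parallel and pull out the four bounds.
const ATTRS = ["northernmost_latitude", "southernmost_latitude",
               "easternmost_longitude", "westernmost_longitude"];

async function fetchBounds(urls) {
    const texts = await Promise.all(urls.map(u => fetch(u).then(r => r.text())));
    return texts.map(text => {
        const bounds = {};
        for (const attr of ATTRS) {
            // Matches lines like: Float32 northernmost_latitude 89.9942;
            const m = text.match(new RegExp(attr + "\\s+(-?[\\d.]+)"));
            bounds[attr] = m ? parseFloat(m[1]) : null;
        }
        return bounds;
    });
}

// Usage: with 90,000 files, feed the URLs in chunks of, say, 50
// per fetchBounds() call rather than all at once, to avoid
// hammering the server.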
Edit:
Re: the correct comments below (the cache exists only after the first request), one alternative might be to use the source data at the NASA JPL OPeNDAP server (Hyrax): http://podaac-opendap.jpl.nasa.gov/opendap/allData/ghrsst/data/L2P/MODIS_T/JPL/
My browser-only tests (i.e. subjective) seem to show that random DAS responses are quicker than 15 seconds.
http://podaac-opendap.jpl.nasa.gov/opendap/allData/ghrsst/data/L2P/MODIS_T/JPL/2015/294/20151021-MODIS_T-JPL-L2P-T2015294084500.L2_LAC_GHRSST_N-v01.nc.bz2.das

How to update/modify webpage content with Javascript before page load completed?

I'm trying to display a progress bar during a mass-mailing process. I use classic ASP and have disabled content compression too. I simply update the size of an element that mimics a progress bar, and a text element with the percent value.
However, during the page load it seems the JavaScript is ignored. I only see the hourglass for a long time, then the progress bar at 100%. If I put alerts between updates, Chrome & IE9 refresh the modified values as I expect.
Is there another JavaScript command to replace alert() that helps update the actual values? The alert() command magically lets the browser render the content immediately.
Thanks!
... Loop for ASP mail send code
If percent <> current Then
    current = percent
%>
<script type="text/javascript">
    //alert(<%=percent%>);
    document.getElementById('remain').innerText = '%<%=percent%>';
    document.getElementById('progress').style.width = '<%=percent%>%';
    document.getElementById('success').innerText = '<%=success%>';
</script>
<%
End If
... Loop end
These are the screenshots when I use alert() in the code: as you can see it works, but the user has to click OK many times.
The first step is writing the current progress into a Session variable whenever it changes:
Session("percent") = percent
The second step is building a simple mechanism that outputs that value to the browser when requested:
If Request("getpercent") = "1" Then
    Response.Clear()
    Response.Write(Session("percent"))
    Response.End()
End If
Finally, you need to read the percentage with JavaScript using a timer. This is best done with jQuery, as pure JavaScript AJAX is a big headache. After you add a reference to the jQuery library, use code like this:
var timer = window.setTimeout(CheckPercentage, 100);

function CheckPercentage() {
    $.get("?getpercent=1", function(data) {
        timer = window.setTimeout(CheckPercentage, 100);
        var percentage = parseInt(data, 10);
        if (isNaN(percentage)) {
            $("#remain").text("Invalid response: " + data);
        } else {
            $("#remain").text(percentage + "%");
            if (percentage >= 100) {
                // done!
                window.clearTimeout(timer);
            }
        }
    });
}
Holding the response until your complete processing is done is not a viable option. Just imagine 30 people accessing the same page: you will have 30 persistent connections to the server for a long time, especially with IIS. I am sure it's not viable; it might work well in your development environment, but when you move to production and more people start accessing the page, your server might go down.
I wish you would look into the following:
Do the processing in the background on the server and do not hold the response for a long time.
Try writing a Windows service that resides on the server and takes care of your mass mailing.
If you still insist on doing it on the web, try sending one email at a time using Ajax: for every Ajax request, send an email or two (see the sketch below).
Also, in your example above, without Response.Flush the browser will not get the % information.
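A sketch of that one-email-per-request idea, assuming a hypothetical sendmail.asp endpoint that sends the next queued email and responds with the number of emails remaining (jQuery, as in the answer above):

function sendNext() {
    $.get("sendmail.asp", function (remaining) {
        var left = parseInt(remaining, 10);
        $("#remain").text(left + " emails left");
        if (left > 0) {
            sendNext(); // request the next email only after this one finishes
        }
    });
}
sendNext();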
Well, you don't.
Except for simple effects like printing dots or a sequence of images, it won't work safely, and even then buffering could interfere.
My approach would be to have an area that you update with an Ajax request every second, calling a script that reads a log file, an emails-sent counter file, or a database entry created by the mass-mailing process. The mass-mailing process would be initiated by Ajax as well.
ASP will not write anything to the page until it's fully done processing (unless you do a flush):
Response.Buffer = True
Response.Write("something")
Response.Flush()
Response.Write("something else")
' etc.
(see example here: http://www.w3schools.com/asp/met_flush.asp)
A better way to do this is to use ajax.
Example here:
http://jquery-howto.blogspot.com/2009/04/display-loading-gif-image-while-loading.html
I didn't like ajax at first, but I love it now.
