I am working on a PHP-based web app (that I didn't build).
I am running this AJAX request:
$.ajax({
type: 'POST',
url: "/potato/ajax.php?module=test_module",
dataType: 'json',
async: true,
data: {
start_ts: that.start_date,
stop_ts: that.end_date,
submitted: true
},
beforeSend: function() {
console.log('Start: ' + new Date().toLocaleString());
// Show Chart Loading
that.qwChart.showLoading({
color: '#00b0f0',
// text: that.returnNumWithPrecent(that.progress)
text: that.qwChartProgress
});
// If data div isn't displayed
if (!that.dataDisplayed) {
// Show divs loading
that.showMainDiv();
} else {
that.$qwTbody.slideUp('fast');
that.$qwTbody.html('');
}
},
complete: function(){},
success: function(result){
console.log('End: ' + new Date().toLocaleString());
// Clear timer
clearInterval(timer);
// Set progressbar to 100%
that.setProgressBarTo100();
// Show Download Button
that.downloadBtn.style.display = 'inline-block';
// Insert Chart Data
that.insertChartData(result);
// Insert Table Data
that.insertTableData(result);
}
});
And for some reason it gets my whole web app stuck until it returns the data. I know that by default AJAX requests have async set to 'true', but I added it anyway just to make sure.
If it is async, it should do the job without getting my web app stuck, am I right? What could the problem be? Is this a server-side problem? How do I debug this situation?
Edit: By saying "stuck" I mean that while I wait for the response after submitting the AJAX call, refreshing the page or opening other pages in parallel (within my web app only) displays a white loading screen. Whenever the AJAX call returns the data, the white page loads into the requested page.
Data is returned from the PHP file:
<?php
require_once("/www/common/api/db.php");
if (!empty($_POST['submitted'])) {
// error_reporting(-1);
// Users Array:
$users = get_qw_data($start_ts_to_date, $stop_ts_to_date);
// Summary Array:
$summary = get_qw_summary($users);
// QW Score Array:
$qws = get_qw_score($users);
// Generate CSV Report files
/* Remove old:*/
if (!is_file_dir_exist($customer))
create_qw_directory($customer);
/* Report #1: */ users_apps_google_macros_ma($users['users'], $customer);
/* Report #2: */ usage_and_qw_summary($summary, $customer);
/* Report #3: */ qw_score($qws, $customer);
/* Zip Files: */ zip_qw_files($customer);
echo json_encode($qws);
}
PHP sessions are a prime candidate for other requests getting “stuck”, because the session file gets write-locked, so as long as one running script instance has the session open, all others have to wait.
The solution is to call session_write_close() as soon as possible.
A little extended explanation:
The default storage mechanism for session data is simply the file system. For every active session, PHP simply puts a file into the configured session directory, and writes the contents of $_SESSION to it, so that it can be read back from there on the next request that needs to access it.
Now if several PHP script instances tried to write changed session data to that file “simultaneously”, that would quite obviously have great conflict/error potential.
Therefore, PHP puts a write lock on the session file as soon as one script instance accesses the session - everybody else (other requests to the same script, or to a different one also using the session) will have to wait until the first script is done with the session and the write lock is released again.
By default, that happens when the script is done running. But if you have longer-running scripts, this can easily lead to such “blocking” effects as you are experiencing here. The solution is to explicitly tell PHP (via session_write_close), “I’m done with the session here, I'm not going to write any new/changed data to it from this point on - so feel free to release the lock, so that the next script can start reading the session data.”
The important thing is that you only do this after your script is done manipulating any session data. You can still read from $_SESSION during the rest of the script - but you cannot write to it any more. (So anything like $_SESSION['foo'] = 'bar'; after you have released the session would simply not be persisted.)
If the only purpose the session serves at this point (in this specific script) is to check user authentication, then you can close the session directly after that. The rest of the script can then run as long as it wants to, without blocking other scripts from accessing the same session any more.
This isn’t limited to AJAX requests - those are just one of the places where you usually notice stuff like this first, because otherwise you usually don’t have that many requests using the session running in “parallel”. But if you were to, e.g., open a long-running script multiple times in several browser tabs, you would notice the same effect there - in the first tab the script will run and do its business, whereas in the following tabs you should notice that those requests are “hanging” as well, as long as the previous script instance holds the write lock on the session.
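For illustration, here is a minimal sketch of how that could look in the ajax.php from the question (the session_start() and auth-check lines are assumptions - the original file may start the session elsewhere, for example inside db.php):
<?php
require_once("/www/common/api/db.php");
session_start(); // assumed: wherever the app actually starts the session

// ... any authentication / $_SESSION checks the app needs would go here ...

// Done writing to the session for this request: release the lock so
// other requests (page loads, other AJAX calls) are no longer blocked.
session_write_close();

if (!empty($_POST['submitted'])) {
    // the long-running report generation from the question continues here
}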
For now, I have this:
<?php
$result = get_metadata('post', 3241, 'progression_aujourdhui', true);
?>
<div class="ligne_barre ligne_barre_aujourdhui">
<div id="progress_bar-aujourdhui" class="progress_bar_salle_presse">
<h2 class="progress-title"><?= wp_get_attachment_image(3278, 'full'); ?></h2>
<div class="blocs-barre-progression">
<div class="skill-item">
<div class="progression">
<div class="progress_bar" data-progress-value="<?= $result; ?>" data-progress-equipe="equipe1">
<div class="progress-value"><?= $result . "%" ?></div>
</div>
</div>
</div>
</div>
</div>
</div>
The code is inserted in a page called "Salle de Presse" using a shortcode.
This page called "Salle de Presse" has a meta key named 'progression_aujourdhui'.
On reloading that "Salle de Presse" page, if the value of the meta key "progression_aujourdhui" has been updated, the "data-progress-value" updates correctly in the div with class "progress_bar".
Now, what I would like is to make the div with class "ligne_barre" reload each time the value of the meta key "progression_aujourdhui" is updated, without having to refresh the whole page myself.
I know that AJAX is needed, but I'm not sure how to use it in WordPress, and furthermore the "detect when a meta value is updated" part has left me with no success in my research on the internet.
This will not be an easy task to accomplish on a WordPress site. There are two general solutions to this problem.
Use "long polling": basically, call your WordPress API from the front page every n seconds and update the data if it has changed (see the endpoint sketch below this list). This may prove costly, as each client will bombard your backend.
Use WebSockets and a subscription method. Usually you will need a custom VPS (server) for this with an nginx proxy, enable the TCP connection, and get a "subscription" whenever the database changes, but the logic of "to whom and where to send this database change info" will still be on your side. WordPress and WebSockets should be enough to get you going.
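As a rough sketch of the polling option (the 'myprogress/v1' namespace and route name are made up; the post ID 3241 and meta key come from the snippet in the question), a lightweight REST endpoint could expose the meta value like this:
<?php
// functions.php or a small plugin - hypothetical read-only endpoint for polling
add_action('rest_api_init', function () {
    register_rest_route('myprogress/v1', '/progression', array(
        'methods'             => 'GET',
        'permission_callback' => '__return_true', // public, read-only
        'callback'            => function () {
            // Return the current meta value so the front end can compare and update the bar
            return array(
                'progression_aujourdhui' => get_post_meta(3241, 'progression_aujourdhui', true),
            );
        },
    ));
});
The front end would then request /wp-json/myprogress/v1/progression every n seconds and only touch the div when the returned value differs from the one currently displayed.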
Good luck
It sounds like you are trying to retrieve data from a database and update the data on the front end without a page reload.
I use Ajax calls quite a lot in WordPress for this and I find them pretty easy to do.
You make an Ajax call from your front-end JavaScript.
The Ajax call triggers a PHP function in your functions.php file. The function sends a response containing the requested data back to the front end.
The front-end JavaScript then processes the response received and updates the page values, etc. without reloading the webpage.
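For example, a minimal functions.php handler for such a call might look like the sketch below (the action name 'get_progression' is made up; wp_send_json() handles the JSON encoding and terminates the request for you):
<?php
// functions.php - hypothetical handler for an AJAX action called "get_progression"
add_action('wp_ajax_get_progression', 'get_progression_handler');        // logged-in users
add_action('wp_ajax_nopriv_get_progression', 'get_progression_handler'); // anonymous visitors

function get_progression_handler() {
    // Look up the requested data and send it back as JSON
    $value = get_post_meta(3241, 'progression_aujourdhui', true);
    wp_send_json(array('progression_aujourdhui' => $value)); // echoes JSON and calls wp_die()
}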
Use Ajax. What you'll want is to use a single Ajax session to get updates, with an infinite timeout. You'll need JavaScript for this (I don't bother with jQuery), and some PHP hooks.
For the JavaScript, you can dynamically generate the URL, for example by using admin_url() to output the admin path, but the normal static path is /wp-admin/admin-ajax.php.
Give your elements an id that's related. For instance, I use a button to fetch data, so I use an onclick trigger to a function that sends the Ajax request.
function myAjaxFunction() {
    var t0 = performance.now();
    var request = document.getElementById('status');
    var table = document.getElementById('contents'); // div that will contain the updated html
    var t1;
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '../wp-admin/admin-ajax.php', true); // ../ forces root url but just / works
    xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    xhr.onload = function() {
        if (xhr.status === 200) {
            t1 = performance.now();
            request.innerHTML = 'Status:Successful Time:' + (t1 - t0) + 'ms';
            table.innerHTML = xhr.responseText;
            //polymorphism here, recall the ajax function
        } else {
            t1 = performance.now();
            request.innerHTML = 'Status:Failed Time:' + (t1 - t0) + 'ms -' + xhr.status;
            //polymorphism here, recall the ajax function
        }
    };
    // send() must be called outside the onload handler, otherwise the request is never made
    xhr.send("action=me_action&mevar1=" + me_value + "&..." + status);
}
On the PHP side you'll need this:
add_action("wp_ajax_me_action", "me_function");
function me_function(){
    $response = 'your response here';
    $mevar = $_REQUEST['mevar1']; // .....
    echo $response;
    wp_die(); // WordPress AJAX handlers should terminate with wp_die()
}
To improve performance, set output_buffering=On or 1 (don't use a set limit, as a smaller output will cause delays) in your php.ini, as large requests can be more efficiently packaged across the network and with compression.
To continuously update or recheck, just use
setTimeout(myAjaxFunction, 0);
but if the server has a timeout for this, then use setInterval(myAjaxFunction, lessThanServerTimeoutInMilliseconds);
Many WordPress setups are already heavy; it takes a lot of server resources to run the PHP, so while a static web page can be delivered in 50ms, your WordPress response will tend to be delivered in 500ms-1s for most installs unless you actually know how to optimise it (a service I do offer, from the ground up, server to WordPress). I did not use jQuery because if you barely need it for a page, avoid loading it to save on resources. The same goes for your Ajax calls: make as few requests as possible and try to get everything in one request. This also applies to other WordPress-related work like the WordPress REST API, as each request adds a significant delay that can stack up from seconds into minutes. A page of 100 listed items fetched with 100 requests can take 50 seconds and a lot of CPU, so do it all in one, or as few requests as possible.
I am creating a turn-based Chinese checkers game. In the body I added an onload function that sends an AJAX request to the server to receive the player number for the connection. But it seems that the response always returns the same number. I tried using $GLOBALS, but couldn't make it work.
How I want it to work: when I open a new window with the game, the connection gets the next number with a simple +1. The JS code is in the home HTML page, using Symfony.
Some snippets of the code:
<body onload="getPlayerNum()">
var playerNumber = 0;
function getPlayerNum(){
$.ajax({
url: "http://localhost:8000/ajaxPlayer",
method: "POST",
data: {"playerNumber": playerNumber},
success: function(data) {
console.log(data);
}
});
}
/**
* @Route("/ajaxPlayer")
*/
public function ajaxPlayer(Request $request){
if ($request->isXmlHttpRequest()){
if (isset($GLOBALS["number"])){
$playerNumber = $GLOBALS["number"] + 1;
$GLOBALS["number"] = $playerNumber;
} else {
$playerNumber = 1;
$GLOBALS["number"] = $playerNumber;
}
return new Response($playerNumber);
}
}
Global variables (like the one you're setting in $GLOBALS["number"]) are only "global" within the PHP script that's executed when your browser requests http://localhost:8000/ajaxPlayer. They aren't automatically remembered by PHP the next time that script is called.
If you use a session variable, e.g. $_SESSION["number"] = $playerNumber;, then it will be present in any script your browser requests during a browsing session (by default, that means until you close your browser).
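In the Symfony controller from the question (keeping the same use statements as in the original snippet), that could look roughly like this - a sketch only, and note that it gives each browser session its own independent counter, which is probably not what a multiplayer game ultimately needs:
/**
 * @Route("/ajaxPlayer")
 */
public function ajaxPlayer(Request $request)
{
    $session = $request->getSession();
    // Read the last number stored for this browser session (0 if not set yet) and increment it
    $playerNumber = $session->get('number', 0) + 1;
    $session->set('number', $playerNumber);
    return new Response((string) $playerNumber);
}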
I'm guessing however that you intend for multiple players to play together from different computers, for which you'll need more than that: you'll need a way of sharing data between different sessions.
The most common way of doing this is to use a database. If you are doing your whole app in Symfony, then you probably want to start by reading the Doctrine documentation.
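A very rough sketch of the database approach, assuming a hypothetical player_counter table with a single last_number column and a recent Doctrine DBAL version (an entity plus the EntityManager would work just as well):
/**
 * @Route("/ajaxPlayer")
 */
public function ajaxPlayer(Request $request, \Doctrine\DBAL\Connection $connection)
{
    // Increment the shared counter, then read it back
    // (wrap this in a transaction or locking read if many players can join at the same instant)
    $connection->executeStatement('UPDATE player_counter SET last_number = last_number + 1');
    $playerNumber = (int) $connection->fetchOne('SELECT last_number FROM player_counter');
    return new Response((string) $playerNumber);
}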
Preface
For this question, I have a MVC partial view. The view has a section which displays a list of documents. Each document has a hyperlink: when clicked, the hyperlink takes the user to a second page view displaying additional information.
The link is inside an unordered list:
<a style="text-decoration:underline;" onclick="sendToDocketSearch('@currentDocument.DktYear','@currentDocument.DktSequence','@currentDocument.DktSubActionID');">@currentDocument.DktYear.ToString().PadLeft(2, '0') - @currentDocument.DktSequence.ToString().PadLeft(5, '0')</a>
When the user clicks the link, it takes them to a sendToDocketSearch javascript function (to prepare to search for the document):
var sendToDocketSearch = function (yearOfDocket, sequenceOfDocket, dktSubActionIDOfDocket) {
jQuery.ajax({
type: "POST",
url: "#Url.Action("DocketSearchOnDemand")",
dataType: "json",
contentType: "application/json; charset=utf-8",
data: JSON.stringify({ docketYear: yearOfDocket,
docketSequence: sequenceOfDocket,
DktSubActionID: dktSubActionIDOfDocket,
userIsAuthorized: '@Model.userIsAuthorized' }),
success: function (data) {
alert(data);
},
failure: function (errMsg) {
alert(errMsg);
}
});
submitForm();
}
Note that the page/view/form is submitted after the following controller method is run:
public ActionResult DocketSearchOnDemand(string docketYear, string docketSequence, decimal DktSubActionID, bool userIsAuthorized, PortalIndexView viewmodel)
{
System.Web.HttpContext.Current.Session.Add("userIsAuthorized", userIsAuthorized);
string docketSearch = docketYear + "-" + docketSequence;
System.Web.HttpContext.Current.Session["DocketSearchOnDemand"] = docketSearch;
if (DktSubActionID > 0)
{
System.Web.HttpContext.Current.Session["DktSubActionID"] = DktSubActionID.ToString();
System.Web.HttpContext.Current.Session["searchingCustomID"] = true;
}
else
{
System.Web.HttpContext.Current.Session["DktSubActionID"] = "1";
System.Web.HttpContext.Current.Session["searchingCustomID"] = false;
}
return View(viewmodel);
}
The above controller method runs; then, because the form is submitted, the HttpPost action for the page takes place. When running it on my local PC, the link is clicked and the next page is loaded without drama.
Problem
The problems start when I upload the code to the dev/test server. I don't know how to use breakpoints while troubleshooting an active website, so I follow along with the browser developer tool to monitor network traffic.
When I click the link while running the website on my local server, the process continues:
the hyperlink takes me to a method where I pass information to be searched
the page/view/form is submitted
the controller redirects where I have to go.
When I click the link on the site and it's on the server, the first click is completely ignored - network traffic shows that it tries to navigate to the controller via the javascript function above, but the failure happens so fast I can't even take a screenshot of it. The page reloads a second time at this point.
When I click on the same link a second time, it works without fail.
I believe the view/javascript/controller code works because it works the second time (and on subsequent attempts). It just flagrantly fails the first time on the server; after that, the user is fine. I'd like to prevent that "first-time" failure, however, and I'm wondering what the problem could be...
Bad timing
I may be passing the information too early (or too late for my website/server to process it properly). The page does it correctly the second time, so maybe I'm just "jumping the gun" by not waiting a little longer for page-loading processes to sort themselves out. (Maybe I can fiddle around with the $(document).ready() javascript portion of the first page to "delay" allowing people to click a link.)
Code error
I'll be glad to admit bad code if I'm genuinely messing something up. Maybe it's my javascript function, or maybe it's the code in my controller; at any rate, something is making the first pass of that function call be rejected. Maybe my code is bad because the problem doesn't happen the second time, and I'm getting a false sense of security (i.e. there are problems with my code that the system is willing to forgive after the page has thoroughly loaded).
Server problem/miscellaneous
I'm wondering if I missed something when I uploaded my latest changes, or if I should have contacted my network team in case there are permissions that need to be activated for the site to work smoothly. I'm already in touch with them regarding something else, so I might take advantage of the opportunity today.
There is an alternative in place that could help me prevent this problem from happening, but I want to find out why the "first-time" failure happens. Other similar actions fail the first time on the site, and I'd like to apply the insights from fixing this issue to them.
Thank you for looking at this issue. Have a great day.
Are you sure you want to call submitForm(); before your jQuery.ajax call has finished? Your ajax call is async, so it will hit submitForm(); before the request has had time to finish. Should submitForm(); be in your success event instead?
I'm trying to display a progress bar during a mass mailing process. I use classic ASP, and I disabled content compression too. I simply update the size of an element that mimics a progress bar, and a text element as the percent value.
However, during the page load it seems the JavaScript is ignored. I only see the hourglass for a long time, then the progress bar at 100%. If I put alerts between the updates, Chrome & IE9 refresh the modified values as I expect.
Is there any other JavaScript command to replace alert() that would help update the actual values? The alert() command magically lets the browser render the content immediately.
Thanks!
... Loop for ASP mail send code
If percent <> current Then
current = percent
%>
<script type="text/javascript">
//alert(<%=percent%>);
document.getElementById('remain').innerText='%<%=percent%>';
document.getElementById('progress').style.width='<%=percent%>%';
document.getElementById('success').innerText='<%=success%>';
</script>
<%
End If
... Loop end
These are the screenshots when I use alert() in the code: as you can see it works, but the user has to click OK many times.
First step is writing the current progress into a Session variable when it changes:
Session("percent") = percent
Second step is building a simple mechanism that will output that value to browser when requested:
If Request("getpercent")="1" Then
Response.Clear()
Response.Write(Session("percent"))
Response.End()
End If
And finally you need to read the percentage with JavaScript using a timer. This is best done with jQuery, as pure JavaScript AJAX is a big headache. After you add a reference to the jQuery library, use code such as this:
var timer = window.setTimeout(CheckPercentage, 100);
function CheckPercentage() {
$.get("?getpercent=1", function(data) {
timer = window.setTimeout(CheckPercentage, 100);
var percentage = parseInt(data, 10);
if (isNaN(percentage)) {
$("#remain").text("Invalid response: " + data);
}
else {
$("#remain").text(percentage + "%");
if (percentage >= 100) {
//done!
window.clearTimeout(timer);
}
}
});
}
Holding the response until your complete processing is done is not a viable option. Just imagine 30 people accessing the same page: you will have 30 persistent connections to the server for a long time, especially with IIS. It might work well in your development environment, but when you move to production and more people start accessing the page, your server might go down.
I suggest you look into the following:
Do the processing in the background on the server and do not hold the response for a long time.
Try writing a Windows service which resides on the server and takes care of your mass mailing.
If you still insist on doing it on the web, try sending one email at a time using ajax - for every ajax request, send an email or two.
And in your example above, without Response.Flush the browser will also not get the % information.
Well, you don't.
Except for simple effects like printing dots or a sequence of images it won't work safely, and even then buffering could interfere.
My approach would be to have an area which you update, via an ajax request every second, from a script that reads a log file, an emails-sent count file, or a database entry created by the mass mailing process. The mass mailing process would be initiated by ajax as well.
ASP will not write anything to the page until it's fully done processing (unless you do a flush)
Response.Buffer = True
Response.Write("something")
Response.Flush
Response.Write("something else")
' etc.
(see example here: http://www.w3schools.com/asp/met_flush.asp)
A better way to do this is to use ajax.
Example here:
http://jquery-howto.blogspot.com/2009/04/display-loading-gif-image-while-loading.html
I didn't like ajax at first, but I love it now.