So I'm trying to make a chat program in JavaScript using AJAX and PHP. I am currently updating the chat like this, and I'm sure it's very hard on my server:
<div id="messages">[no messages]</div>
This is what's in the file called ajax-load-messages.php:
<?php
$sql_posts_result = mysql_query("SELECT Post FROM Posts ORDER BY Date ASC LIMIT 50", $db) or die("Can't load post"."<br/>".mysql_error());
if (!empty($sql_posts_result)) {
    while ($row = mysql_fetch_row($sql_posts_result)) {
        echo '<div class="message-post">'.$row[0].'</div>';
    }
}
?>
and that's called by this JavaScript:
setInterval(function(){
    $('#messages').load('/ajax-load-messages.php');
}, 3000);
So every 3 seconds I load the last 50 messages to the #messages div.
I know there's a way to handle this that isn't even like 10% as resource intensive, but I don't know where to start. How can I handle this better?
Give the table an int autoincrement id. Keep track of the highest id received (in the session maybe), and on next poll only look for ids higher than that (i.e. only records created since last poll).
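A minimal sketch of that approach, assuming a mysqli connection in $db and an auto-increment Id column (the question's code still uses the old mysql_* API, so the names here are illustrative):
<?php
session_start();

// Highest id this client has already received; 0 on the first poll.
$lastId = isset($_SESSION['last_post_id']) ? (int)$_SESSION['last_post_id'] : 0;

// Only fetch rows created since the last poll.
$stmt = $db->prepare("SELECT Id, Post FROM Posts WHERE Id > ? ORDER BY Id ASC LIMIT 50");
$stmt->bind_param('i', $lastId);
$stmt->execute();
$result = $stmt->get_result(); // requires the mysqlnd driver

while ($row = $result->fetch_assoc()) {
    echo '<div class="message-post">' . htmlspecialchars($row['Post']) . '</div>';
    $_SESSION['last_post_id'] = $row['Id'];
}
On the client you would then append the returned fragment instead of replacing the whole #messages div.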
These would be my suggestions to handle your chat system better:
1) I would suggest using a chained setTimeout instead of setInterval
Why? Suppose a load takes longer than 3 seconds. setInterval will fire again regardless, so more than 1 XMLHttpRequest can be in flight at once, putting a strain on the browser.
This is how a chained setTimeout would look in your example:
setTimeout(function loadMessages() {
    $("#messages").load('/ajax-load-messages.php', function onLoadMessagesComplete(responseText, textStatus, xmlHttpRequest) {
        // Schedule the next poll only once this one has completed.
        setTimeout(loadMessages, 3000);
    });
}, 3000);
2) Instead of writing HTML in ajax-load-messages.php, you could respond with a JSON object via json_encode(). If you then keep track of each chat instance's currently displayed messages in an array, you can figure out whether there is a new message or not (developerwjk's answer is a good suggestion). This way, you don't have to reload the DOM every 3 seconds regardless of whether there was a new message. Of course, you would need to stay aware of the memory usage in the browser.
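A sketch of what that JSON endpoint could look like, again assuming a mysqli connection in $db (the ids let the client diff against what it already displays):
<?php
// ajax-load-messages.php, returning JSON instead of ready-made HTML.
$result = $db->query("SELECT Id, Post FROM Posts ORDER BY Date ASC LIMIT 50");

$messages = array();
while ($row = $result->fetch_assoc()) {
    $messages[] = array('id' => (int)$row['Id'], 'text' => $row['Post']);
}

header('Content-Type: application/json');
echo json_encode($messages);
The client keeps the last rendered id and only appends entries whose id is higher, so the DOM is untouched when nothing has changed.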
===
Chat systems (like Facebook's or Google+'s) usually use a pushing system rather than polling: the server pushes to the client when there is a new message. This reduces the number of requests to the server, but it can be more difficult to implement.
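If you want to experiment with pushing from plain PHP, Server-Sent Events are probably the simplest route. A rough sketch (the connection details and column names are placeholders, and note that every open stream ties up a PHP worker):
<?php
// sse-messages.php: the browser opens this once with
// new EventSource('/sse-messages.php') and receives pushed messages.
set_time_limit(0);
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

$db = new mysqli('localhost', 'user', 'pass', 'chat'); // placeholder credentials
$lastId = 0;

while (true) {
    $stmt = $db->prepare("SELECT Id, Post FROM Posts WHERE Id > ? ORDER BY Id ASC");
    $stmt->bind_param('i', $lastId);
    $stmt->execute();
    $result = $stmt->get_result();
    while ($row = $result->fetch_assoc()) {
        $lastId = (int)$row['Id'];
        echo "id: $lastId\n";
        echo 'data: ' . json_encode($row['Post']) . "\n\n";
    }
    @ob_flush();
    flush();
    sleep(2); // check for new rows every 2 seconds
}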
For now, I have this:
<?php
$result = get_metadata('post', 3241, 'progression_aujourdhui', true);
?>
<div class="ligne_barre ligne_barre_aujourdhui">
    <div id="progress_bar-aujourdhui" class="progress_bar_salle_presse">
        <h2 class="progress-title"><?= wp_get_attachment_image(3278, 'full'); ?></h2>
        <div class="blocs-barre-progression">
            <div class="skill-item">
                <div class="progression">
                    <div class="progress_bar" data-progress-value="<?= $result; ?>" data-progress-equipe="equipe1">
                        <div class="progress-value"><?= $result . "%" ?></div>
                    </div>
                </div>
            </div>
        </div>
    </div>
</div>
The code is inserted in a page called "Salle de Presse" using a shortcode.
This page called "Salle de Presse" has a metakey named 'progression_aujourdhui'.
On reloading that "Salle de Presse" page, if the value of the metakey "progression_aujourdhui" has been updated, the "data-progress-value" updates well in the div with class "progress_bar".
Now, what I would like is to make the div with class "ligne_barre" reload each time the value of the meta key "progression_aujourdhui" is updated, without having to refresh the whole page myself.
I know that AJAX is needed, but I'm not sure how to use it in WordPress, and my research on the "detect when a meta value is updated" part has turned up nothing.
This will not be an easy task to establish on a WordPress site. There are 2 general solutions to this problem.
Use "long polling": basically, call your WordPress API from the front page every n seconds and update the data if it changed. This may prove costly, as each client will bombard your backend.
Use WebSockets and a subscription method. Usually you will need a custom VPS (server) for this with an nginx proxy and TCP connections enabled, so that you get a "subscription" whenever the database changes; but the logic of "to whom and where to send this database-change info" will still be on your side. WordPress and WebSockets should be enough to get you going.
Good luck
It sounds like you are trying to retrieve data from a database and update the data on the front end without a page reload.
I use Ajax calls quite a lot in WordPress for this and I find them pretty easy to do.
You make an Ajax call from your front end JavaScript.
The Ajax call triggers a PHP function in your functions.php file. The function sends a response containing the requested data back to the front end.
The front end JavaScript then processes the response received and updates the page values, etc. without reloading the webpage.
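A minimal sketch of that round trip on the PHP side, with a hypothetical action name and the meta key from the question:
<?php
// In functions.php: register the handler for logged-in and anonymous visitors.
add_action('wp_ajax_get_progression', 'get_progression');
add_action('wp_ajax_nopriv_get_progression', 'get_progression');

function get_progression() {
    // Post id 3241 and the meta key come from the question's shortcode.
    $value = get_metadata('post', 3241, 'progression_aujourdhui', true);
    wp_send_json(array('progression' => $value)); // emits JSON and exits
}
The front end JavaScript then polls admin-ajax.php with action=get_progression on a timer and rewrites data-progress-value whenever the returned number differs from the one currently displayed.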
Use Ajax. What you'll want is a single AJAX request at a time to get updates, re-issued indefinitely. You'll need JavaScript for this (I don't bother with jQuery) and some PHP hooks.
In JavaScript you can generate the endpoint URL dynamically, e.g. using admin_url() to output the admin path, but the normal static path is /wp-admin/admin-ajax.php.
Give your elements related ids. For instance, I use a button to fetch data, with an onclick trigger calling a function that sends the AJAX request.
var t0 = performance.now();
var request = document.getElementById('status');
var table = document.getElementById('contents'); // div that will contain the updated html
var t1;
var xhr = new XMLHttpRequest();
xhr.open('POST', '../wp-admin/admin-ajax.php', true); // ../ forces the root url, but just / works
xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
xhr.onload = function() {
    if (xhr.status === 200) {
        t1 = performance.now();
        request.innerHTML = 'Status:Successful Time:' + (t1 - t0) + 'ms';
        table.innerHTML = xhr.responseText;
        // recursion here: re-call the ajax function to poll again
    }
    else {
        t1 = performance.now();
        request.innerHTML = 'Status:Failed Time:' + (t1 - t0) + 'ms -' + xhr.status;
        // recursion here: re-call the ajax function to poll again
    }
}; // close the onload handler before sending
xhr.send("action=me_action&mevar1=" + me_value + "&..." + status); // me_value/status come from your page
On the PHP side you'll need this:
add_action("wp_ajax_me_action", "me_function");
function me_function(){
    $mevar = $_REQUEST['mevar1']; // .....
    $response = 'your response here';
    echo $response;
    wp_die(); // end the AJAX request cleanly (otherwise a trailing "0" is appended)
}
To improve performance, set output_buffering = On (or 1) in your php.ini (don't use a fixed size limit, as a smaller output will cause delays), since large responses can then be packaged more efficiently across the network and compressed.
To continuously update or recheck, just use
setTimeout(myAjaxFunction, 0);
but if the server has a timeout for this, then use setInterval(myAjaxFunction, lessThanServerTimeoutInMilliseconds);
Many WordPress setups are already heavy; it takes a lot of server resources to run the PHP, so while a static web page can be delivered in 50ms, your WordPress response will tend to be delivered in 500ms-1s on most installs unless you actually know how to optimise it (a service I do offer, from the ground up, server to WordPress). I did not use jQuery because, if you barely need it on a page, it's best avoided to save resources. The same goes for your AJAX calls: do them with as few requests as possible and try to get everything in 1 request. This applies to other WordPress-related work like the WordPress REST API, as each request adds a significant delay that can end up stacking from seconds into minutes: a page of 100 listed items fetched with 100 requests can take 50 seconds and a lot of CPU, so do it all in 1 or as few requests as possible.
I am working on a PHP-based web app (that I didn't build).
I am running this ajax request:
$.ajax({
    type: 'POST',
    url: "/potato/ajax.php?module=test_module",
    dataType: 'json',
    async: true,
    data: {
        start_ts: that.start_date,
        stop_ts: that.end_date,
        submitted: true
    },
    beforeSend: function() {
        console.log('Start: ' + new Date().toLocaleString());
        // Show Chart Loading
        that.qwChart.showLoading({
            color: '#00b0f0',
            // text: that.returnNumWithPrecent(that.progress)
            text: that.qwChartProgress
        });
        // If data div isn't displayed
        if (!that.dataDisplayed) {
            // Show divs loading
            that.showMainDiv();
        } else {
            that.$qwTbody.slideUp('fast');
            that.$qwTbody.html('');
        }
    },
    complete: function(){},
    success: function(result){
        console.log('End: ' + new Date().toLocaleString());
        // Clear timer
        clearInterval(timer);
        // Set progressbar to 100%
        that.setProgressBarTo100();
        // Show Download Button
        that.downloadBtn.style.display = 'inline-block';
        // Insert Chart Data
        that.insertChartData(result);
        // Insert Table Data
        that.insertTableData(result);
    }
});
And for some reason it gets my whole web app stuck until it returns the data. I know that AJAX requests are asynchronous by default, but I added async: true anyway just to make sure.
If it is async, it should do the job without getting my web app stuck, am I right? What can be the problem? Is this a server-side problem? How do I debug this situation?
Edit: By saying "stuck" I mean: while I wait for the response after submitting the AJAX call, refreshing the page or opening other pages in parallel (within my web app only) displays a white loading screen. Whenever the AJAX call returns the data, the white page loads into the requested page.
Data is returned from the PHP file:
<?php
require_once("/www/common/api/db.php");

if (!empty($_POST['submitted'])) {
    // error_reporting(-1);
    // Users Array:
    $users = get_qw_data($start_ts_to_date, $stop_ts_to_date);
    // Summary Array:
    $summary = get_qw_summary($users);
    // QW Score Array:
    $qws = get_qw_score($users);
    // Generate CSV Report files
    /* Remove old: */
    if (!is_file_dir_exist($customer))
        create_qw_directory($customer);
    /* Report #1: */ users_apps_google_macros_ma($users['users'], $customer);
    /* Report #2: */ usage_and_qw_summary($summary, $customer);
    /* Report #3: */ qw_score($qws, $customer);
    /* Zip Files: */ zip_qw_files($customer);
    echo json_encode($qws);
}
PHP sessions are a prime candidate for other requests getting “stuck”, because the session file gets write-locked, so as long as one running script instance has the session open, all others have to wait.
Solution to that is to call session_write_close as soon as possible.
A little extended explanation:
The default storage mechanism for session data is simply the file system. For every active session, PHP simply puts a file into the configured session directory, and writes the contents of $_SESSION to it, so that it can be read back from there on the next request that needs to access it.
Now if several PHP script instances tried to write changed session data to that file “simultaneously”, that would quite obviously have great conflict/error potential.
Therefore PHP puts a write lock on the session file as soon as one script instance accesses the session - everybody else, i.e. other requests (to the same script, or a different one also using the session), will have to wait until the first script is done with the session and the write lock gets released again.
By default, that happens when the script is done running. But if you have longer-running scripts, this can easily lead to such "blocking" effects as you are experiencing here. The solution to that is to explicitly tell PHP (via session_write_close), "I'm done with the session here, not gonna write any new/changed data to it from this point on - so feel free to release the lock, so that the next script can start reading the session data."
The important thing is that you only do this after your script is done manipulating any session data. You can still read from $_SESSION during the rest of the script - but you can not write to it any more. (So anything like $_SESSION['foo'] = 'bar'; would have to fail, after you released the session.)
If the only purpose the session serves at this point (in this specific script) is to check user authentication, then you can close the session directly after that. The rest of the script can then run as long as it wants to, without blocking other scripts from accessing the same session any more.
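A minimal sketch of that pattern (the auth check and the long-running report function are placeholders):
<?php
session_start();

// Read whatever the script needs from the session up front.
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit('Not logged in');
}

// Done writing session data: release the write lock so parallel
// requests from the same browser are no longer blocked.
session_write_close();

// The long-running work happens after the lock is released.
// $_SESSION is still readable here, but writes would be lost.
$qws = build_qw_report($_POST); // hypothetical long-running function
echo json_encode($qws);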
This isn't limited to AJAX requests - those are just one of the places where you usually notice stuff like this first, because otherwise you usually don't have that many requests using the session running in "parallel". But if you were to, e.g., open a long-running script multiple times in several browser tabs, you would notice the same effect there - in the first tab the script will run and do its business, whereas in the following tabs you should notice that those requests are "hanging" as well, as long as the previous script instance holds the write lock on the session.
So far I've been doing a synchronous AJAX call to fetch events from the database and then initializing the calendar afterwards using the variable returned from the AJAX call (Solution 1).
I have read that synchronous calls are usually considered bad, though, since they freeze the browser, so I have tried another option where I put the AJAX call directly inside the events array of the FullCalendar initialization (Solution 2). This gives a nicer experience on first load, since the browser isn't locked up thanks to the asynchronous AJAX and you can see the calendar as it builds up.
This has a downside, however: every time you change view, it re-renders the events, giving the user a less smooth experience compared to the first one. Here is the code for the two solutions I have tried so far:
Solution 1:
$(document).ready(function(){
    $.ajax({
        url: 'script.php',
        type: 'POST',
        async: false,
        success: function(response){
            json_events = response;
        }
    });

    $('#calendar').fullCalendar({
        events: JSON.parse(JSON.stringify(json_events)),
    });
});
Solution 2:
$(document).ready(function(){
    $('#calendar').fullCalendar({
        events: {
            url: 'script.php',
            type: 'POST',
            success: function(response){
            }
        }
    });
});
Are there any other solutions to this "problem"? Right now I like Solution 1 more, since you don't have to deal with the re-rendering of events as you use the calendar, but it would be nice to not have the initial freeze upon loading the page.
Edit:
script.php
$events = array();
$query = mysqli_query($link, "SELECT * FROM calendar");
while ($fetch = mysqli_fetch_array($query, MYSQLI_ASSOC)) {
    $e = array();
    $e['id'] = $fetch['id'];
    $e['start'] = $fetch['startdate'];
    $e['end'] = $fetch['enddate'];
    array_push($events, $e);
}
echo json_encode($events);
Would this work? (for aDyson in comments)
events: {
    url: 'script2.php',
    type: 'POST',
    data: {
        calendar_id: calendarId
    },
    success: function(response){
    }
}
script2.php
$calend_id = $_POST['calendar_id'];
$start = $_POST['start'];
$end = $_POST['end'];

$events = array();
$query = mysqli_query($link, "SELECT startdate,enddate,id FROM calendar WHERE calendar_id = '$calend_id' AND startdate >= '$start' AND enddate <= '$end'");
while ($fetch = mysqli_fetch_array($query, MYSQLI_ASSOC)) {
    $e = array();
    $e['id'] = $fetch['id'];
    $e['start'] = $fetch['startdate'];
    $e['end'] = $fetch['enddate'];
    array_push($events, $e);
}
echo json_encode($events);
FullCalendar really expects that you only load the events that are required for the current view. This means you don't have to load all your events upfront. Think about it: once your software has been running for a few years, "all events" could be quite a large amount of data and would slow down the loading of the calendar. You also have to consider how likely it is that a user will suddenly want to view the calendar from 3 years ago. It's probably an edge case, so you don't need those events immediately.
FullCalendar provides a very neat way to do this automagically. See this link https://fullcalendar.io/docs/event_data/events_json_feed/
Basically you simply specify your PHP endpoint as the "events" URL, and as long as you comply with the structure specified in that link it will automatically download the right events when the user changes the date range and/or view being displayed.
So, in your calendar config:
events: "script.php"
As simple as that! FullCalendar will automatically pass two fields - "start" and "end" as the dates to fetch events for. So in your PHP, you'd need something like:
$events = array();
$start = $_GET["start"];
$end = $_GET["end"];
//code to build a query with a WHERE clause specifying the start/end dates
//...
//and finally echo the resulting events
echo json_encode($events);
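Putting that together, a hedged sketch of the whole endpoint, reusing the question's mysqli connection in $link and its column names; a prepared statement keeps the raw start/end strings out of the SQL:
<?php
// script.php: JSON feed for FullCalendar's "events" URL.
$start = $_GET['start'];
$end = $_GET['end'];

$stmt = mysqli_prepare($link,
    "SELECT id, startdate, enddate FROM calendar WHERE startdate >= ? AND enddate <= ?");
mysqli_stmt_bind_param($stmt, 'ss', $start, $end);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt); // requires the mysqlnd driver

$events = array();
while ($fetch = mysqli_fetch_assoc($result)) {
    $events[] = array(
        'id'    => $fetch['id'],
        'start' => $fetch['startdate'],
        'end'   => $fetch['enddate'],
    );
}
echo json_encode($events);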
Unless network latency is a massive issue for you, downloading the events in small batches, only when required, is the preferred way to go. Also, by default the calendar option lazyFetching: true is set, which tries to minimise the number of AJAX calls required - see https://fullcalendar.io/docs/event_data/lazyFetching/ for more details of exactly how it works.
If you're worried about user experience while the events are loading, you can handle the "loading" callback, so you can add something like a "loading" spinner or something else to indicate to the user that they just need to wait a couple of seconds for the events to appear. See https://fullcalendar.io/docs/event_data/loading/ for more details again.
There are many things to optimize here so let's do it step-by-step.
First, in your PHP script you're issuing SELECT * FROM calendar, but you only use three columns to generate the JSON feed, so why not SELECT id, startdate, enddate FROM calendar? This (depending on the calendar schema) will fetch less from the database, and if you've done your indexing right, it gives those indexes a better chance of being used.
Secondly, you can do some kind of pagination; I'm pretty sure you don't want your users to see everything at once. For that, use SQL's LIMIT and OFFSET, or better, a WHERE clause that only fetches events between two dates, or both.
Now we have to differentiate between two cases:
Case 1: we don't mind the user seeing a stale version of the events
So maybe you don't mind your users looking at a deleted event or an updated event, and most importantly you don't mind your users missing new events.
First thing to do is to pass cache: true inside the events: {...} object you're passing to fullCalendar; this activates the browser's cache. You then need to update script.php to check the Last-Modified (and probably ETag) request headers, check what updates your table has had since then, and if there were none (no insert/delete/update) just send a 304 (Not Modified) response. You can use whatever history mechanism you like to know what updates your table had between two dates.
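A rough sketch of the conditional-GET part, assuming a helper that can cheaply tell when the calendar table last changed:
<?php
// Hypothetical helper returning the table's last change as a Unix timestamp.
$lastModified = get_calendar_last_modified($link);

if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $lastModified) {
    http_response_code(304); // nothing changed: empty response, no query run
    exit;
}

header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');
// ...build and echo the JSON feed as usual...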
Second thing is to use some kind of caching mechanism on the server side that caches the database response: you can use Redis (or memcached, or any other store) and build a background job that syncs the cache with the database at whatever interval you choose.
Or you can make a materialized view in the database, have it refresh at some interval, and select from that view instead. Unfortunately MySQL doesn't support them natively, but there is Flexviews; see here: https://mariadb.com/kb/en/mariadb/flexviews/
Third thing to do is to find a way to send only the changes instead of all events. This is going to be hard: basically, you pass events as a function instead of as a JSON feed to fullCalendar, then use whatever history mechanism that supports MySQL to know the updates, send them, and on the client side find a way to interpret those changes and update some in-memory object.
If you're concerned that your JavaScript will consume a lot of memory, or simply that the events won't fit, you can use IndexedDB, or PouchDB, which is simpler and gives you support for old browsers that don't support IndexedDB yet.
Case 2: the user must see the events update in realtime
For MySQL you need to monitor changes to the calendar table in some way (triggers, for example); you then use WebSockets (or any technology built upon it) to push the changes to the client whenever they happen. On the client side, the onmessage callback needs to tell fullCalendar about the updates somehow: one way is to update a global array or a client-side database and pass events as a function; another way is to use fullCalendar's updateEvent(s) and removeEvent(s) (apparently no insertEvent(s) though :3).
If you feel adventurous you can use RethinkDB, a realtime database made specifically for this need, or you can store your data in an event database like this one: https://geteventstore.com/
P.S.: you may have a mix of both cases; for example, you don't care about inserts but you need to be notified about deletes/updates. Then you mix the approaches, I think :).
I am trying to use javascript in my CI view to update (without refresh) a data model every 2 seconds, for my use case where the database contents can be changed by other users.
<script type="text/javascript">
var refreshFunc = setInterval(function() {
    <?php
    $this->load->model('m_cube', '', TRUE);
    $stamp = $this->m_cube->stamp();
    ?>
    var stamp = "<?php echo $stamp; ?>";
    console.log(stamp);
}, 2000);
refreshFunc;
</script>
I am using JS setInterval to create the 2 second loop, and calling the CI model to retrieve data from the Postgresql database. In the simplified code sample, it's just asking the DB for a timestamp. The problem is that the timestamp written to console doesn't update - something is stuck.
2013-10-21 14:35:54.168-04
2013-10-21 14:35:54.168-04
2013-10-21 14:35:54.168-04
...
Same behavior when querying a table of real data - it doesn't return up-to-date values.
Why does the model access a "frozen" version of the DB?
It's not stuck or "frozen"; there's just a bit of confusion about what runs first and what runs after.
I don't see you using AJAX, so by the time your PHP has been processed (i.e., the data fetched from the db and assigned to $stamp), the page - html, css and javascript too - has yet to be generated and served by the server, let alone outputted by the browser.
This means that inside your setInterval you always have the same value, which was generated once, and thus you keep reprinting the same string.
If you want continuous updates, you need to keep requesting the data from the server, and that's where AJAX (Asynchronous JavaScript and XML) comes in handy: it runs as a separate request from the main one, so you can work on two different "levels" and fetch content while the rest of the page remains static (already served and outputted).
If you're using jQuery you can look into $.ajax(), which makes this kind of thing pretty easy.
When this script runs on the server, it fetches the model data and replaces the <?php ?> tags with the results. So when it comes to the client browser, it doesn't contact the server every 2 seconds; it just logs the same stamp value every 2 seconds. If you want it to be updated, you should consider using AJAX.
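On the CodeIgniter side, the AJAX call would hit a controller method that runs the model query fresh on every request; a minimal sketch reusing the question's model (the controller name and route are assumptions):
<?php
// application/controllers/Cube.php (hypothetical name)
class Cube extends CI_Controller {

    public function stamp()
    {
        $this->load->model('m_cube');
        // Queried on every AJAX request, so the client
        // gets a fresh timestamp each time.
        echo $this->m_cube->stamp();
    }
}
The setInterval callback then fetches /index.php/cube/stamp (or your routed URL) and logs the response, instead of echoing a value that was computed once when the page was generated.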
I'm trying to display a progress bar during a mass mailing process. I use classic ASP, and I disabled content compression too. I simply update the width of an element that mimics a progress bar, plus a text element showing the percent value.
However, during the page load it seems the JavaScript is ignored. I only see the hourglass for a long time, then the progress bar at 100%. If I put alerts between the updates, Chrome & IE9 refresh the modified values as I expect.
Is there any other JavaScript command to replace alert() that helps update the actual values? The alert() command magically lets the browser render the content immediately.
Thanks!
... Loop for ASP mail send code
If percent <> current Then
current = percent
%>
<script type="text/javascript">
//alert(<%=percent%>);
document.getElementById('remain').innerText='%<%=percent%>';
document.getElementById('progress').style.width='<%=percent%>%';
document.getElementById('success').innerText='<%=success%>';
</script>
<%
End If
... Loop end
These are the screenshots if I use alert() in the code: as you can see it works, but the user has to click OK many times.
First step is writing the current progress into a Session variable when it changes:
Session("percent") = percent
Second step is building a simple mechanism that will output that value to browser when requested:
If Request("getpercent")="1" Then
    Response.Clear()
    Response.Write(Session("percent"))
    Response.End()
End If
And finally you need to read the percentage with JavaScript using a timer. This is best done with jQuery, as pure JavaScript AJAX is a big headache. After you add a reference to the jQuery library, use code such as:
var timer = window.setTimeout(CheckPercentage, 100);

function CheckPercentage() {
    $.get("?getpercent=1", function(data) {
        timer = window.setTimeout(CheckPercentage, 100);
        var percentage = parseInt(data, 10);
        if (isNaN(percentage)) {
            $("#remain").text("Invalid response: " + data);
        }
        else {
            $("#remain").text(percentage + "%");
            if (percentage >= 100) {
                //done!
                window.clearTimeout(timer);
            }
        }
    });
}
Holding the response until your complete processing is done is not a viable option. Just imagine 30 people accessing the same page: you would have 30 persistent connections to the server for a long time, especially with IIS. I am sure it's not a viable option; it might work well in your development environment, but once you move to production and more people start accessing the page, your server might go down.
I wish you'd look into the following:
Do the processing in the background on the server and do not hold the response for a long time.
Try to write a Windows service which resides on the server and takes care of your mass mailing.
If you still insist on doing it on the web, try sending one email at a time using AJAX: for every AJAX request, send an email or two.
And in your above example, without Response.Flush the browser will not get the % information either.
Well, you don't.
Except for simple effects like printing dots or a sequence of images it won't work safely, and even then buffering could interfere.
My approach would be to have an area which you update using an AJAX request every second, calling a script which reads a log file, a sent-emails count file, or such an entry in the database, created by the mass mailing process. The mass mailing process would be initiated by AJAX as well.
ASP will not write anything to the page until it's fully done processing (unless you do a flush):
Response.Buffer = True
Response.Write "something"
Response.Flush
Response.Write "something else"
' etc.
(see example here: http://www.w3schools.com/asp/met_flush.asp)
A better way to do this is to use ajax.
Example here:
http://jquery-howto.blogspot.com/2009/04/display-loading-gif-image-while-loading.html
I didn't like ajax at first, but I love it now.