Setting the time interval in HTML5 server-sent events - javascript

I want to send regular updates from server to client. For that I used server-sent events. I'm pasting the code below:
Client side
Getting server updates
<script>
if (typeof EventSource !== "undefined") {
    var source = new EventSource("demo_see.php");
    source.onmessage = function(event) {
        document.getElementById("result").innerHTML = event.data + "<br>";
    };
} else {
    document.getElementById("result").innerHTML = "Sorry, your browser does not support server-sent events...";
}
</script>
</body>
</html>
Server side
<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
$x=rand(0,1000);
echo "data:{$x}\n\n";
flush();
?>
The code works fine, but it sends updates every 3 seconds. I want to send updates at millisecond intervals. I tried sleep(1) after flush(), but it only increases the interval by a further second. Does anyone have an idea how I can accomplish this?
Also, can I send images using server-sent events?

As discussed in the comments above, running a PHP script in an infinite loop with a sleep or a usleep is incorrect, for two reasons:
The browser will not see any event data (presumably it waits for the connection to close first) while that script is still running. I recall that early browser implementations of SSE allowed this, but it is no longer the case.
Even if it did work browser-side, you would still be faced with the issue of having a PHP script that runs excessively long (until the php.ini timeout settings kick in). If this happens once or twice it is OK. If there are X thousand browsers simultaneously seeking the same SSE from your server, it will bring down your server.
The right way to do things is to get your PHP script to respond with event-stream data and then gracefully terminate as it normally would. Provide a retry value - in milliseconds - if you want to control when the browser tries again. Here is some sample code:
function yourEventData(&$retry)
{
    //do your own stuff here and return your event data.
    //You might want to set a $retry value (in milliseconds)
    //so the browser knows when to try again (not the default 3000 ms)
}

header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
header('Access-Control-Allow-Origin: *'); //optional

$data = yourEventData($retry);
echo "retry: {$retry}\ndata: {$data}\n\n";
As an answer to the original question this is a bit late but nevertheless in the interests of completeness:
What you get when you poll the server in this way is just data. What you do with it afterwards is entirely up to you. If you want to treat those data as an image and update an image displayed in your web page, you would simply do:
document.getElementById("imageID").src = "data:image/png;base64," + event.data;
So much for the principles. I have on occasion forgotten that retry has to be in milliseconds and ended up returning, for example, retry:5\n\n which, much to my surprise, still worked. However, I would hesitate to use SSE to update a browser-side image at 100 ms intervals. A more typical usage would be along the following lines:
User requests a job on the server. That job either gets queued behind other jobs or is likely to take quite a bit of time to execute (e.g. creating a PDF or an Excel spreadsheet and sending it back).
Instead of making the user wait with no feedback - and risking a timeout - one can fire up an SSE which tells the browser the ETA for the job to finish, and a retry value is set up so the browser knows when to look again for a result.
The ETA is used to provide the user with some feedback.
At the end of the ETA the browser will look again (browsers do this automatically, so you need do nothing).
If for some reason the job is not completed by the server, it should indicate that in the event stream it returns, e.g. data: {"code":-1}\n\n, so browser-side code can deal with the situation gracefully, as sketched below.
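Roughly, such a job-status endpoint could look like this (jobStatus() and jobEta() are placeholder helpers for however you query your job queue):
<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

// jobStatus()/jobEta() are hypothetical helpers for your job queue
$jobId  = intval($_GET['job'] ?? 0);
$status = jobStatus($jobId);   // e.g. "queued", "running", "done", "failed"

if ($status === 'done') {
    echo 'data: {"code":1}' . "\n\n";
} elseif ($status === 'failed') {
    echo 'data: {"code":-1}' . "\n\n"; // browser-side code handles this gracefully
} else {
    $eta = jobEta($jobId);             // seconds until expected completion
    echo "retry: " . ($eta * 1000) . "\n";      // look again after the ETA
    echo 'data: {"code":0,"eta":' . $eta . '}' . "\n\n";
}
flush();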
There are other usage scenarios - updating stock quotes, news headlines, etc. Updating images at 100 ms intervals feels - a purely personal view - like a misuse of the technology.
It is now close to 5 years since I posted this answer and it still gets upvoted quite regularly. For the benefit of anyone still using it as a reference - in many ways SSE is, in my view, a rather outdated technology. With the advent of widespread support for WebSockets, why bother with SSE? Quite apart from anything else, the cost of setting up and tearing down an HTTPS connection from the browser for each browser-side retry is very high. The WSS protocol is far more efficient.
A spot of reading if you want to implement websockets
Client Side
Server side via PHP with Ratchet
With Nginx and NChan
To my mind PHP is not a great language for handling websockets, and Ratchet is far from easy to set up. The Nginx/NChan route is far easier.

The reason for this behavior (a message every 3 seconds) is explained here:
The browser attempts to reconnect to the source roughly 3 seconds after each connection is closed
So one way to get a message every 100 milliseconds is to change the reconnect time (in the PHP):
echo "retry: 100\n\n";
This is not very elegant, though; a better approach would be an endless loop in PHP that sleeps for 100 milliseconds on each iteration. There is a good example here; just change the sleep() to usleep() to support milliseconds:
while (1) {
    $x = rand(0, 1000);
    echo "data: {$x}\n\n";
    @ob_flush(); // flush PHP's own output buffer first, if one is active
    flush();
    usleep(100000); // 100000 microseconds = 0.1 seconds
}

I believe that the accepted answer may be misleading. Although it answers the question correctly (how to set up a 1-second interval), it is not true that an infinite loop is a bad approach in general.
SSE is used to get updates from the server when there actually are updates, as opposed to Ajax polling, which checks for updates at set intervals even when there are none. This can be accomplished with an infinite loop that keeps the server-side script running all the time, constantly checks for updates, and echoes them only if there are changes.
It is not true that:
The browser will not see any event data while that script is still running.
You can run the script on the server and still send updates to the browser without ending the script execution, like this:
while (true) {
    echo "data: test\n\n";
    @ob_flush(); // flush PHP's buffer first...
    flush();     // ...then the web server's
    sleep(1);
}
Doing it by sending a retry parameter without an infinite loop will end the script and then start it again, end it, start it again... This is similar to Ajax polling, checking for updates even if there are none, and this is not how SSE is intended to work. Of course, there are some situations where this approach is appropriate, as listed in the accepted answer (for example, waiting for the server to create a PDF and notifying the client when it's done).
Using the infinite-loop technique will keep the script running on the server the whole time, so you should be careful with a lot of users: you will have a script instance for each of them, and it could lead to server overload. On the other hand, the same issue would arise in a simple scenario where you suddenly get a bunch of users on the website (without SSE), or if you were using WebSockets instead of SSE. Everything has its own limitations.
Another thing to be careful about is what you put in the loop. For example, I wouldn't recommend putting a database query in a loop that runs every second, because then you're also putting the database at risk of overload. I would suggest using some kind of cache (Redis, or even a simple text file) for this case, as sketched below.
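A rough sketch of that cached variant, assuming the update process writes the latest payload to a file (updates.txt here is just an illustrative name) whenever something changes:
<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
session_write_close(); // don't hold the session lock while looping

$cache = 'updates.txt'; // illustrative file written by the update process
$lastChange = 0;

while (true) {
    clearstatcache(true, $cache);
    if (file_exists($cache) && filemtime($cache) > $lastChange) {
        $lastChange = filemtime($cache);
        // json_encode keeps the payload on a single line, as SSE requires
        echo "data: " . json_encode(file_get_contents($cache)) . "\n\n";
        @ob_flush();
        flush();
    }
    sleep(1); // one cheap stat() per second instead of a DB query
}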

SSE is an interesting technology, but one that comes with a choking side effect on implementations using an Apache/PHP backend.
When I first found out about SSE I got so excited that I replaced all my Ajax polling code with an SSE implementation. Only a few minutes after doing this I noticed my CPU usage went up to 99/100, and the fear that my server was soon going to be brought down forced me to revert to the friendly old Ajax polling. I love PHP, and even though I knew SSE would work better on Node.js, I just wasn't ready to go that route yet!
After a period of critical thinking, I came up with an SSE Apache/PHP implementation that could work without literally choking my server to death.
I'm going to share my SSE server-side code with you; hopefully it helps someone overcome the challenges of implementing SSE with PHP.
<?php
/* This script fetches the latest posts in the news feed */
header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");

// prevent direct access
if ( ! defined("ABSPATH") ) die("");

/* push the current user's session data into the global space so
   we can release the session lock */
$GLOBALS["exported_user_id"]  = user_id();
$GLOBALS["exported_user_tid"] = user_tid();

/* now release the session lock, having exported the session data
   into the global space. If we don't do this, no other scripts will
   run, causing the website to lag even when opening in a new tab */
session_commit();

/* how long this connection should be maintained - while we want to
   wait on the server long enough for an update, holding the
   connection forever burns CPU resources. Depending on the server
   resources you have available, you can tweak this higher or lower.
   Typically, the higher it is, the closer your implementation stays
   to true SSE; otherwise it is equivalent to Ajax polling. However,
   a higher time burns more CPU, especially when there are more
   users on your website */
$time_to_stay = strtotime("1 minute 30 seconds");

/* if the data required for this script's operation was not passed
   along, abort the connection; the browser will reconnect
   (typically after 3 seconds) */
if ( ! isset( $_GET["id"] ) ){
    exit;
}

/* if "HTTP_LAST_EVENT_ID" is set, then this is a continuation of a
   temporarily terminated script operation. This is important if
   your SSE is maintaining state; you can use the header to get the
   last event ID sent */
$last_postid = isset( $_SERVER["HTTP_LAST_EVENT_ID"] )
    ? intval( $_SERVER["HTTP_LAST_EVENT_ID"] )
    : intval( $_GET["id"] );

/* keep the connection active until there's data to send to the client */
while (true) {
    /* You can assume this function performs some database
       operations to get the latest posts */
    $data = fetch_newsfeed( $last_postid );

    /* if the data is not empty, there must be some new posts to
       push to the client */
    if ( ! empty( trim( $data ) ) ){
        /* With SSE it is my common practice to JSON-encode all data,
           because I noticed that not doing so sometimes causes SSE to
           lose part of the data packet and deliver only a handful of
           it to the client. This is bad, since we are returning
           structured HTML data, and losing part of it will break our
           HTML page when the data is inserted */
        $data = json_encode(array("result" => $data));
        echo "id: $last_postid \n"; // this is the lastEventID
        echo "data: $data\n\n";     // our data

        /* flush to avoid waiting for the script to terminate - make
           sure the calls are in this order */
        @ob_flush(); flush();
    }

    // how much of our allotted time remains
    $time_left = intval($time_to_stay - time());

    /* if we have stayed beyond the time to stay, abort this
       connection to free up CPU resources */
    if ( $time_left <= 0 ) { exit; }

    /* wait 5 seconds and continue again from the start. We don't
       want to keep pounding our DB in a tight loop, so we sleep a
       few seconds and start from the top */
    sleep(5);
}

SSE on Nginx-driven PHP websites seems to have some finer nuances. Firstly, I had to add this setting in the location section of the Nginx configuration:
fastcgi_buffering off;
Someone recommended that I change fastcgi_read_timeout to a longer period, but it did not really help... or maybe I did not dive deep enough:
fastcgi_read_timeout 600s;
Both of those settings go in the Nginx configuration's location section.
The standard endless loop that many are recommending inside the SSE code tends to hang Nginx (or possibly PHP 7.4 FPM), and that is serious, as it brings down the entire server. Though people have suggested set_time_limit(0) in PHP to change the default timeout (which I believe is 30 seconds), I am not very sure it is a good strategy.
If you remove the endless loop entirely, the SSE system works like polling: the JavaScript EventSource code keeps calling back the SSE PHP module. That makes it a bit simpler than Ajax polling (as we don't have to write any extra JavaScript to do the polling), but it still keeps retrying, and hence is very similar to Ajax polling. And each retry is a complete reload of the PHP SSE code, so it is slower than what I finally did.
This is what worked for me. It is a hybrid solution: there is a loop all right, but not an endless one. Once that loop is finished, the SSE PHP code terminates. That gets registered in the browser as a failure (you can see it in the inspector console), and the browser then calls the SSE code on the server once again. It is like polling, but at longer intervals.
In between one load of the SSE and the next reload, the SSE keeps working in the loop, during which additional data can be pushed to the browser. So you get enough speed, without the headache of the entire server hanging.
<?php
$success = set_time_limit(0);
ini_set('auto_detect_line_endings', 1);
ini_set('max_execution_time', '0');
ob_end_clean();

header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
header('X-Accel-Buffering: no');

//how fast do you want the browser to reload this SSE
//after the while loop ends:
echo "retry: 200\n\n";
//If any dynamic data comes into your application
//in this 'retry' time period, and disappears,
//then SSE will NOT be able to push that data.
//If it is too short, there may be insufficient
//time to finish some work within one pass of the
//SSE while loop below.

//$file_path is assumed to have been set earlier to the file
//your application writes its pending SSE payload to.
$emptyCount = 0;
$execCount = 0;
$countLimit = 60; //Experiment with this; see what works for you
$emptyLimit = 5;
$prev = "";

while ($execCount < $countLimit) {
    $execCount++;
    if (connection_status() != CONNECTION_NORMAL or connection_aborted()) break;
    if (file_exists($file_path)) {
        //The file is to be deleted by the endpoint below so that the
        //same data does not come back again. There may be better
        //methods than the one suggested here, but I'm not getting
        //into that, as this is only about SSE overall.
        $s = file_get_contents("https://.....?f=$file_path");
        if ($s == "") {
            $emptyCount++;
            $prev = "";
        } else {
            if ($s != $prev) {
                $prev = $s;
                echo $s; //This is already formatted as data:...\n\n
                         //as needed by SSE
            }
        }
        //If it is continually empty then break out of the loop. Why hang around?
        if ($emptyCount > $emptyLimit) {
            $emptyCount = 0;
            $prev = "";
            break;
        }
    } else {
        $prev = "";
    }
    @ob_flush();
    flush();
    sleep(1);
}
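For completeness, a rough sketch of the other half: whatever produces new data writes it, pre-formatted, to the file the loop above polls ($file_path and the payload here are illustrative):
<?php
// Illustrative producer side; $file_path must match the SSE script's.
$file_path = "/tmp/sse_payload.txt";
$payload   = json_encode(["message" => "hello"]);

// Write the event pre-formatted as data:...\n\n, since the SSE loop
// echoes the file contents verbatim.
file_put_contents($file_path, "data: {$payload}\n\n", LOCK_EX);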

Related

long poll hanging all other server requests

I have a simple long poll request on my website to check if there are any new notifications available for a user. As far as the request goes, everything seems to work flawlessly; the request is only fulfilled once the database is updated (and a notification has been created for that specific user), and a new request is sent out straight after.
The Problem
What I have noticed is that when the request is waiting for a response from the database (as long polls should), all other requests to the server will also hang with it - whether it be media files, AJAX requests or even new pages loading. This means that all requests to the server will hang until I close my browser and reopen it.
What is even stranger is that if I visit another one of my localhost sites (my long poll is on a MAMP virtual-host site, www.example.com), there is no problem and I can still use them as if nothing has happened - despite the fact that they're technically on the same server.
My Code
This is what I have on my client side (longpoll.js):
window._Notification = {
    listen: function(){
        /* this is not jQuery's Ajax; if you're going to comment any suggestions,
         * please ensure you comment based on regular XMLHttpRequests and avoid
         * any suggestions that use jQuery */
        xhr({
            url: "check_notifs.php",
            dataType: "json",
            success: function(res){
                /* this will log the correct response as soon as the server is
                 * updated */
                console.log(res);
                _Notification.listen();
            }
        });
    },
    init: function(){
        this.listen();
    }
};
/* after page load */
_Notification.init();
And this is what I have on my server side (check_notifs.php):
header("Content-type: application/json;charset=utf-8", false);
if(/* user is logged in */){
$_CHECKED = $user->get("last_checked");
/* update the last time they checked their notifications */
$_TIMESTAMP = time();
$user->update("last_checked", $_TIMESTAMP);
/* give the server a temporary break */
sleep(1);
/* here I am endlessly looping until the conditions are met, sleeping every
* iteration to reduce server stress */
$_PDO = new PDO('...', '...', '...');
while(true){
$query = $_PDO->prepare("SELECT COUNT(*) as total FROM table WHERE timestamp > :unix");
if($query->execute([":unix" => $_CHECKED])){
if($query->rowCount()){
/* check if the database has updated and if it has, break out of
* the while loop */
$total = $query->fetchAll(PDO::FETCH_OBJ)[0]->total;
if($total > 0){
echo json_encode(["total" => $total]);
break;
}
/* if the database hasn't updated, sleep the script for one second,
* then check if it has updated again */
sleep(1);
continue;
}
}
}
}
/* for good measure */
exit;
I have read about NodeJS and various other frameworks that are suggested for long-polling, but unfortunately they're currently out of reach for me and I'm forced to use PHP. I have also had a look around to see if anything in the Apache configuration could solve my problem, but I only came across How do you increase the max number of concurrent connections in Apache?, and what's mentioned doesn't seem like it would be the problem considering I can still use my other localhost website on the same server.
I'm really confused as to how I can solve this issue, so all help is appreciated. Cheers.
What is actually happening is that PHP is waiting for this script to end (the session is locked) before serving the next requests from the same session.
As you can read here:
there is some lock somewhere -- which can happen, for instance, if the two requests come from the same client, and you are using file-based sessions in PHP: while a script is being executed, the session is "locked", which means the server/client will have to wait until the first request is finished (and the file unlocked) to be able to use the file to open the session for the second user.
the requests come from the same client AND the same browser; most browsers will queue the requests in this case, even when there is nothing server-side producing this behaviour.
there are more than MaxClients currently active processes -- see the quote from Apache's manual just before.
There's actually some kind of lock somewhere, and you need to check which lock it is. Maybe $_PDO is holding the lock, and you must close it before the sleep(1) so it stays released until you make the next request.
You can try to raise your MaxClients and/or apply this answer
Perform session_write_close() (or the corresponding function in CakePHP) to close the session at the beginning of the Ajax endpoint, as in the sketch below.
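Roughly, at the top of check_notifs.php (only the session handling is shown; the session key is illustrative):
<?php
header("Content-type: application/json;charset=utf-8", false);

session_start();
$userId = $_SESSION["user_id"] ?? null; // copy what you need first
session_write_close();                  // release the session lock

// ... the long-polling loop can now run without blocking other
// requests from the same client ...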

SSE incredibly slow

I am currently writing the communication framework for a web game. The code is as follows:
test.php:
<!DOCTYPE html>
<html>
<head>
    <title> Test </title>
    <script>
    function init()
    {
        var source = new EventSource("massrelay.php");
        source.onmessage = function(event)
        {
            console.log("massrelay sent: " + event.data);
            var p = document.createElement("p");
            var t = document.createTextNode(event.data);
            p.appendChild(t);
            document.getElementById("rec").appendChild(p);
        };
    }
    function test()
    {
        var xhr = new XMLHttpRequest();
        xhr.onreadystatechange = function ()
        {
            if (xhr.readyState === XMLHttpRequest.DONE && xhr.status === 200)
            {
                console.log("reciver responded: " + xhr.responseText);
            }
        };
        xhr.open("GET", "reciver.php?d=" + document.getElementById("inp").value, true);
        xhr.send();
        console.log("you sent: " + document.getElementById("inp").value);
    }
    </script>
</head>
<body>
    <button onclick="init()">Start Test</button>
    <textarea id="inp"></textarea>
    <button onclick="test()">click me</button>
    <div id="rec"></div>
</body>
</html>
This takes user input (currently a textbox, for testing), sends it to the receiver, and writes the receiver's response to the console; I have never received an error from the receiver. It also adds an event listener for the SSE stream.
reciver.php:
<?php
$data = $_REQUEST["d"];
echo file_put_contents("data.txt", $data) !== false ? $data : "error writing";
?>
This, as you can see, is very simple and only serves to write the data to data.txt before sending back whether the write was successful. data.txt is simply the "tube" the data is passed through to massrelay.php.
massrelay.php:
<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
while (1)
{
    $data = file_get_contents("data.txt");
    if ($data != "NULL")
    {
        echo "data: " . $data . "\n\n";
        flush();
        file_put_contents("data.txt", "NULL");
    }
}
?>
massrelay.php checks if there is any data in data.txt and if so will pass it using SSE to anyone with an event listener for it, once it reads the data it will clear the data file.
The entire thing actually works perfectly, except for the slight issue that it can take anywhere from 30 seconds to 10 minutes for massrelay.php to send the data from the data file. For a web game this is completely unacceptable, as you need real-time action. I was wondering if it is taking so long due to a flaw in my code; if not, I'm thinking hardware (I'm hosting it myself on a 2006 Dell with a Sempron). If anyone sees anything wrong with it, please let me know. Thanks.
Three problems I see with your code:
No sleep
No ob_flush
Sessions
Your while() loop is constantly reading the file system. You need to slow it down. I've put a half-second sleep in the code below; experiment to find the largest value that still gives acceptable latency.
PHP has its own output buffers. You use @ob_flush() to flush them (the @ suppresses errors when no buffer is active) and flush() to flush the Apache buffers. Both are needed, and the order is important, too.
Finally, PHP sessions lock, so if your clients might be sending session cookies, even if your SSE script does not use the session data, you must close the session before entering the infinite loop.
I've added all three of those changes to your code, below.
<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
session_write_close();
while (1)
{
    $data = file_get_contents("data.txt");
    if ($data != "NULL")
    {
        echo "data: " . $data . "\n\n";
        @ob_flush(); flush();
        file_put_contents("data.txt", "NULL");
    }
    usleep(500000); // half a second
}
BTW, the advice in the other answer about using an in-memory database is good, but the file system overhead is in milliseconds, so it won't explain a "30 second to 10 minute" latency.
I don't know that writing to a flat file is the best way to do this. File I/O is going to be your big bottleneck here (reading on top of writing means you'll hit that limit really quickly). But assuming you want to keep on doing it...
Your application could benefit from a PHP session to store some data, so you're not waiting on I/O. This is where intermediate software like Memcached or Redis could also help you. What you would do is store the data from reciver.php in your text file AND write it into a memory cache (or put it into your session, which writes to the memory store). This makes retrieval very quick and reduces file I/O.
I would highly suggest a database for your data, though. MySQL in particular will load commonly accessed data into memory to speed up read operations.
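A rough sketch of the memory-cache idea, assuming the APCu extension is available (Memcached or Redis would follow the same shape; the cache key is illustrative):
<?php
// reciver.php side: store the incoming message in shared memory
apcu_store('latest_message', $_REQUEST['d']);

// massrelay.php side: read and clear it without touching the disk
$data = apcu_fetch('latest_message', $ok);
if ($ok && $data !== '') {
    echo "data: " . $data . "\n\n";
    @ob_flush(); flush();
    apcu_delete('latest_message');
}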
Years ago I experimented with flat files and also with storing data in a DB for communication between multiple concurrent users and a server (this was for a Flash game, but the same principles apply).
Flat files offer the worst performance, as you will eventually run into read/write access issues.
With a DB it will eventually also fall over with too many requests, especially if you are hitting the DB thousands of times a second and there's no load balancing in place.
My answer is not solving your current problem but steering you in a different direction: you really should look at using a socket server. Maybe look into something like: https://github.com/reactphp/socket
One issue you may experience with using a socket server is that shared hosts don't allow you to run shell scripts. My solution was to use my home PC for the socket communication and use my domain as the public entry point for the hosted game. Obviously we don't all have static IPs to point our games to, so I had to use DynDNS, and back then it was free: http://dyn.com (there may be other services that are now free). With a home server you will also need to set up port forwarding on your router to send requests for specific ports to your LAN server. Make sure you are running firewalls on both the router and the server to protect your other potentially exposed ports.
I know this may seem complicated, but trust me, it's the optimal solution. If you need any help, PM me and I can try to guide you through any issues you experience.
EDIT: I deleted this answer as the OP says that my suggested test 1 (see below) works fine, so my theory about output buffering is wrong. But on the other hand, he says the same code with the native functions fread, fwrite, fclose, and flock doesn't work, so if buffering and file I/O are not the solution, I don't know what is. I removed my post because I don't think it's a valid answer. Let me sum this up:
error display is enabled (E_ALL)
flush is working fine
The OP says he used the native file functions (fopen, fread, fwrite, flock) properly and it doesn't help.
If flush is working and the file system is working, I can't do anything but trust that the OP is right and give up.
So right now my job here is done; I can't help if I can't try it myself on the OP's system, configuration, and code.
I undeleted my answer so the OP can have the links to the docs and other people can see my attempt at a solution.
MY OLD POST I DELETED
1. First, test with a trivial massrelay.php:
while (true) {
    echo "test!";
    sleep(1);
}
so you'll be sure that the problem is not file-related.
2. Make sure you have error_reporting and display_errors enabled.
I am guessing you get a response after 30 seconds because the PHP script is terminated when it hits the time limit. If you had errors enabled, you would see an error message informing you of that.
3. Make sure you actually flush your output and it's not buffered.
that it can take anywhere from 30 seconds to 10 minutes
Your being able to see data after 30 seconds makes sense, because 30 seconds is the default max execution time in PHP.
It looks like flush() is not working in your scenario, and you should check the output_buffering setting in your php.ini file.
Please see this: php flush not working
Documentation:
http://php.net/manual/en/function.flush.php
http://php.net/manual/en/function.ob-flush.php
http://php.net/manual/en/book.outcontrol.php
http://php.net/manual/en/outcontrol.configuration.php
In one of several instances of having to debug SSE, I discovered that, ironically, if (ob_get_level() > 0) { ob_end_clean(); } was causing the issue. That is the guard you would normally add to prevent PHP errors when there are no buffering levels. Reverting to a plain ob_end_clean(); solved the problem.
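For reference, the two variants being contrasted:
<?php
// The guarded form, which ironically broke the event stream in that case:
if (ob_get_level() > 0) { ob_end_clean(); }

// The unconditional form, which fixed it (it can raise a notice
// when no output buffer is active):
ob_end_clean();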

long polling - jquery + php

I want to long-poll a script on my server from within a PhoneGap app, to check for things like service messages, offers, etc.
I'm using this technique in the js:
(function poll(){
    $.ajax({
        url: "/php/notify.php",
        success: function(results){
            //do stuff here
        },
        dataType: 'json',
        complete: poll,
        timeout: 30000
    });
})();
which will start a new poll every 5 minutes (I will stop the polling when the app is 'paused' to avoid extra load).
I am not sure how to set up the PHP, though. I can set it up so it doesn't return anything and just loops through the script, but how do I make it return a response as soon as I decide I want to send a message to the app? My PHP code so far is:
<?php
include 'message.php';
$counter = 1;
while($counter > 0){
    //if the data variable exists (from the included file) then send the message back to the app
    if($message != ''){
        // Break out of while loop if we have data
        break;
    }
}
//if we get here we've broken out of the while loop, so we have a message, but make sure
if($message != ''){
    // Send data back
    print(json_encode($message));
}
?>
message.php contains a $message variable (an array), which is normally blank but contains data when I want it to. The problem is, when I update the $message var in message.php, it doesn't send a response back to the app; instead it waits until it has timed out and the poll() function starts again.
So my question is: how do I set up the PHP so that I can update the message on my server and have it sent out instantly to anyone polling?
Long polling is actually very resource-intensive for what it achieves.
The problem you have is that it's constantly opening a new connection for every poll, which in my opinion is highly inefficient. For your situation there are two ways to achieve what you need, the preferred one being web sockets (I'll explain both):
Server Sent Events
To avoid your inefficient Ajax timeout code, you may want to look into Server Sent Events, an HTML5 technology designed to handle "long-polling" for you. Here's how it works:
In JS:
var source = new EventSource("/php/notify.php");
source.onmessage = function(event) {
    document.getElementById("result").innerHTML += event.data + "<br>";
};
In PHP:
You can send notifications & messages using the SSE API interface. I don't have any code at hand, but if you want me to create an example, I'll update this answer with it.
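Something along these lines, roughly (getPendingMessage() is a placeholder for however you fetch the next pending message; it returns a string, or null if there is none):
<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
session_write_close(); // avoid blocking other requests from this client

$message = getPendingMessage(); // placeholder: next message or null
if ($message !== null) {
    echo "data: " . json_encode($message) . "\n\n";
}
echo "retry: 3000\n\n"; // the browser reconnects after 3 seconds
@ob_flush();
flush();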
This will cause JavaScript to re-poll the endpoint (your PHP file) every few seconds (3 seconds by default, unless the server sends a retry value), listening for updates sent by the server. Somewhat inefficient, but it works.
WebSockets
Websockets are another ballgame completely, and are really great.
Long-polling & SSEs work by constantly opening new requests to the server, "listening" for any information that is generated. The problem is that this is very resource-intensive and consequently quite inefficient. The way around this is to open a single sustained connection called a web socket.
StackOverflow, Facebook & all the other "real-time" functionality you enjoy on these services is handled with web sockets, and they work in much the same way as SSEs -- they open a connection in JavaScript & listen for any updates coming from the server.
Although we've never hand-coded any websocket technology, it's highly recommended that you use one of the third-party socket services (for reliability & extensibility). Our favourite is Pusher.

how to update chat window with new messages

setInterval(function(){
    //send ajax request and update chat window
}, 1000);
Is there any better way to update the chat with new messages? Is using setInterval like this the right way?
There are two major options (or rather, the two most popular ways):
Pulling
First is pulling; this is what you are doing. Every x (milli)seconds you check whether the server data has changed.
This is the HTML4 way (excluding Flash etc., so HTML/JS only). For PHP it's not the best way, because a single user makes a lot of connections per minute (in your example code, 60 connections per minute).
It is also recommended to wait for the response before scheduling the next request. If, for example, you request an update every second but the response takes 2 seconds, you are hammering your server. See tymeJV's answer for more info.
Pushing
Next is pushing. This is more the HTML5 way. It is implemented with websockets. What happens is that the client "listens" on a connection, waiting to be updated. When an update arrives, it triggers an event.
This is not great to implement in PHP, because you need a constant connection and your server will be overrun in no time, since PHP can't push connections to the background (like Java can, if I am correct).
I personally made a small chat app and used Pusher. It works perfectly. I only used the free version, so I don't know how expensive it is.
Pretty much, yes, with one minor tweak: rather than encapsulate an AJAX call inside an interval (this could result in a pile-up of unreturned requests if something goes bad on the server), you should put a setTimeout into the AJAX callback to create a recursive call. Consider:
function callAjax() {
    $.ajax(options).done(function() {
        //do your response
        setTimeout(callAjax, 2000);
    });
}
callAjax();

Periodic refresh or polling

I am trying to use periodic refresh (Ajax polling) on my site via XMLHttpRequest (XHR) to check every 10 seconds whether a user has a new message in the database, and if so, inform him/her by dynamically creating a div like this:
function shownotice() {
    var divnotice = document.createElement("div");
    var closelink = document.createElement("a");
    closelink.onclick = this.close;
    closelink.href = "#";
    closelink.className = "close";
    closelink.appendChild(document.createTextNode("close"));
    divnotice.appendChild(closelink);
    divnotice.className = "notifier";
    divnotice.setAttribute("align", "center");
    document.body.appendChild(divnotice);
    divnotice.style.top = document.body.scrollTop + "px";
    divnotice.style.left = document.body.scrollLeft + "px";
    divnotice.style.display = "block";
    request(divnotice);
}
Is this a reliable or stable way to check for messages? When I look in Firebug, a lot of requests are going to my database. Can this method bring my database down from too many requests? Is there another way to do this? When I log in to Facebook and check in Firebug, no requests seem to be happening, but I know they are using periodic refresh too... how do they do that?
You can check for new data every 10 seconds, but instead of checking the db, you need to do a lower impact check.
What I would do is modify the db update process so that when it makes a change to some data, it also updates the timestamp on a file to show that there is a recent change.
If you want better granularity than "something changed somewhere in the db" you can break it down by username (or some other identifier). The file(s) to be updated would then be the username for each user who might be interested in the update.
So, when your script asks the server if there is any information for user X newer than time t, instead of making a DB query, the server-side script can just compare the timestamp of a file with the time parameter and see if there is anything new in the database.
In the process that is updating the DB, add code that (roughly) does:
foreach username interested in this update
{
touch the file \updates\username
}
Then your function to see if there is new data looks something like:
function NewDataForUser (string username, time t)
{
timestamp ts = GetLastUpdateTime("\updates\username");
return (ts > t);
}
Once you find that there is new data, you can then do a full blown DB query and get whatever information you need.
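In PHP that could look roughly like this (paths and names are illustrative):
<?php
// In the process updating the DB (illustrative username):
$username = "alice";
touch("updates/" . $username);

// In the polling endpoint: anything new for this user since time $t?
function newDataForUser(string $username, int $t): bool {
    $file = "updates/" . basename($username); // timestamp marker file
    return file_exists($file) && filemtime($file) > $t;
}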
I left Facebook open with Firebug running and I'm seeing requests about once a minute, which seems like plenty to me.
The other approach, used by Comet, is to make a request and leave it open, with the server dribbling out data to the client without completing the response. This is a hack, and violates every principle of what HTTP is all about :). But it does work.
This is quite unreliable and probably far too taxing on the server in most cases.
Perhaps you should have a look into a push interface: http://en.wikipedia.org/wiki/Push_technology
I've heard Comet is the most scalable solution.
I suspect Facebook uses a Flash movie (they always download one called SoundPlayerHater.swf), which they use to do some comms with their servers. This does not get caught by Firebug (it might be by Fiddler, though).
This is not the best approach, because you end up querying your server every 10 seconds even when there are no real updates.
Instead of this polling approach, you can simulate a server-push (reverse Ajax, or Comet) approach. This will greatly reduce the server workload, as the client is only updated when there is an update on the server side.
As per Wikipedia:
Reverse Ajax refers to an Ajax design pattern that uses long-lived HTTP connections to enable low-latency communication between a web server and a browser. Basically it is a way of sending data from client to server and a mechanism for pushing server data back to the browser.
For more info, check out my other response to a similar question.
