JavaScript EventSource to update import progress in a PHP file

I'm trying to implement a progress bar for importing records into a database.
The import is started with a jQuery $.post(...) to a PHP script on the server.
I tried several approaches:
Start the import and write the progress to a session variable, then poll that variable with a second request via EventSource.
The PHP for the import is something like:
foreach ($importProduct as $ip) {
    $_SESSION['importedProducts'] += 1;
    // ... do the import-stuff
}
Then I fetch the import progress using EventSource:
if (typeof(EventSource) === "undefined") {
    alert("browser doesn't support EventSource");
} else {
    console.log('fetching stream');
    var jsonStream = new EventSource('eventSource.php');
    jsonStream.onmessage = function (e) {
        console.log('stream: ' + e.data);
        $('#eventReturn').html(e.data);
        //var message = JSON.parse(e.data);
        // handle message
    };
}
and in the PHP-script
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (1) {
    echo 'data: {"imported":"'.$_SESSION['importedProducts'].'","total":"'.$_SESSION['totalProductsToImport'].'"}';
    echo "\n\n";
    ob_flush();
    flush();
    sleep(1);
}
This obviously doesn't work, since the session isn't updated between calls.
Writing the progress to a file and reading it back every second seems like a bit of overhead...
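To illustrate what I mean, the file-based variant would look roughly like this (the file name and variable names are just placeholders):
// in the import loop: write the progress to a small JSON file
file_put_contents('import_progress.json', json_encode(array(
    'imported' => $importedSoFar,
    'total'    => $totalProductsToImport,
)));

// in the progress script: read it back on every poll
$progress = json_decode(file_get_contents('import_progress.json'), true);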
Another thing I tried is a JS function that calls itself repeatedly and attempts to get the progress from the same script, but it hangs until the import script has finished:
function uploadProgress() {
    // Fetch the latest data
    $.get('progress.php', function (data) {
        console.log(data);
    });
    setTimeout(uploadProgress, 5000);
}
Any ideas?
Notes: I'm starting the session on every call (session_start). I'm aware that the «while(1)» creates an endless loop... :)

I found a solution.
The trick was to open and close the session in between calls.
So in the import-script:
foreach ($importProduct as $ip) {
    session_start();
    $_SESSION['importedProducts'] += 1;
    session_write_close();
    // ... do the import-stuff
}
and in the progress-script:
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (1) {
    session_start();
    echo 'data: {"imported":"'.$_SESSION['importedProducts'].'","total":"'.$_SESSION['totalProductsToImport'].'"}';
    echo "\n\n";
    ob_flush();
    flush();
    session_write_close();
    sleep(1);
}
Now the session data gets updated continuously and I can use the EventSource approach.
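To avoid the endless loop I mentioned, the progress script can also stop itself once the import has finished or the client has disconnected. A minimal sketch (same session keys as above; the ?? operator needs PHP 7+):
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (true) {
    session_start();
    $imported = $_SESSION['importedProducts'] ?? 0;
    $total    = $_SESSION['totalProductsToImport'] ?? 0;
    session_write_close(); // release the lock so the import script can keep writing

    echo 'data: ' . json_encode(array('imported' => $imported, 'total' => $total)) . "\n\n";
    ob_flush();
    flush();

    if (connection_aborted() || ($total > 0 && $imported >= $total)) {
        break; // stop once the import is done or the browser has closed the stream
    }
    sleep(1);
}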


Cache problem? Server-sent events work on localhost, not in production environment

I want to ask this question with a simple example (written down at the end of the post).
I have read this:
server sent events not updating until script is finished
But I don't know how to solve it.
The solution from its answer (https://stackoverflow.com/a/37690766/8494053) works perfectly, so maybe it is a caching problem on my production server (shared web hosting).
I receive the whole EventStream at the end, all in a row at the same time.
I have already checked all combinations of:
header('Cache-Control: no-cache, no-store, must-revalidate, private, max-age=0');
header('Pragma: no-cache');
header('Expires: 0');
But no luck.
Does anyone know how to solve this without "str_pad($message, 800000)"?
Any clue on what to compare between my localhost server configuration and the shared web host's?
Thanks,
NOTE 1: PHP version 8 in both environments. I have checked that I use Apache in my development environment and CGI/FastCGI on my shared web server. Is that related?
I have found this:
Event Source -> Server returns event stream in bulk rather then returning in chunk
NOTE 2: Output buffering is the same on both servers: output_buffering = 4096.
This is a simple example that doesn't work on my hosting:
test.html
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
</head>
<body>
<br />
<input type="button" onclick="startTask();" value="Start Long Task" />
<input type="button" onclick="stopTask();" value="Stop Task" />
<br />
<br />
<p>Results</p>
<br />
<div id="results" style="border:1px solid #000; padding:10px; width:300px; height:250px; overflow:auto; background:#eee;"></div>
<br />
<progress id='progressor' value="0" max='100' ></progress>
<span id="percentage" style="text-align:right; display:block; margin-top:5px;">0</span>
</body>
</html>
<script>
var es;

function startTask() {
    if (!!window.EventSource) {
        es = new EventSource('long_process.php');

        // a message is received
        es.addEventListener('message', function(e) {
            var result = JSON.parse(e.data);
            addLog(result.message);
            if (e.lastEventId == 'CLOSE') {
                addLog('Received CLOSE, closing');
                es.close();
                var pBar = document.getElementById('progressor');
                pBar.value = pBar.max; // max out the progress bar
            } else {
                var pBar = document.getElementById('progressor');
                pBar.value = result.progress;
                var perc = document.getElementById('percentage');
                perc.innerHTML = result.progress + "%";
                perc.style.width = (Math.floor(pBar.clientWidth * (result.progress / 100)) + 15) + 'px';
            }
        });

        es.addEventListener('error', function(e) {
            addLog('Error occurred');
            es.close();
        });
    }
}

function stopTask() {
    es.close();
    addLog('Interrupted');
}

function addLog(message) {
    var r = document.getElementById('results');
    r.innerHTML += message + '<br>';
    r.scrollTop = r.scrollHeight;
}
</script>
long_process.php
<?php
header('Content-Type: text/event-stream');
// recommended to prevent caching of event data
header('Cache-Control: no-cache');

function send_message($id, $message, $progress) {
    $d = array('message' => $message, 'progress' => $progress);
    echo "id: $id" . PHP_EOL;
    echo "data: " . json_encode($d) . PHP_EOL;
    echo PHP_EOL;
    // push the data out by all force possible
    ob_flush();
    flush();
}

// LONG RUNNING TASK
for ($i = 1; $i <= 10; $i++) {
    send_message($i, 'on iteration ' . $i . ' of 10', $i * 10);
    sleep(1);
}
send_message('CLOSE', 'Process complete', 100);
?>
UPDATE about #Tigger's answer: I have used this code, but no luck. Again I receive everything in a row at the end of the script (10 seconds), not a message every second.
(I have also checked "\n" and PHP_EOL).
function send_message($id, $message, $progress) {
    $d = array('message' => $message, 'progress' => $progress);
    echo "id: $id" . "\n";
    echo "data: " . json_encode($d) . "\n";
    echo "\n";
    // push the data out by all force possible
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    flush();
}
UPDATE about #Tigger's second answer
I have used the MDN sample on GitHub, and no luck. XAMPP works; my production web server ... doesn't.
UPDATE about the hosting provider
As I have not found a solution, I contacted my shared web hosting provider, and here is their answer (translated with Google):
Hello, after analyzing the case, as we have been able to verify, using SSE on a platform like ours, with an nginx proxy in front of Apache, would require certain customizations in the hosting's nginx configuration, which makes it incompatible with the shared hosting service. You need a more customizable service, such as a VPS or a similar virtual private server. Greetings.
As I can't change the nginx configuration, is there any other configuration/command in my PHP files or JavaScript that will help me?
After a lot of messing around I found the following syntax for your long_process.php works best in my environment.
My server is using FreeBSD and my PHP scripts (PHP 8) are also Unix formatted (important for line returns). If you are on a mix of Windows and Linux, your line returns could be part of the issue.
I also found ob_get_level() helped a lot. The connection_aborted() check will close off the script quicker too. This will prevent the script from continuing when the user navigates away, returning resources to the webserver.
My JavaScript structure is a bit different from yours as well, but your issue appears to be on the PHP side, so I have skipped that part.
long_process.php
<?php
// how long between each loop (in seconds)
define('RETRY', 4);

header("Cache-Control: no-cache");
header("Content-Type: text/event-stream");

// skip the first check as the member just started
echo 'retry: ' . (RETRY * 1000) . "\n";
echo 'data: {"share":true,"update":false}';
echo "\n\n";
flush();
sleep(RETRY);

while (1) {
    if (... some conditional check here ...) {
        echo 'data: {"share":true,"update":true}';
    } else {
        echo 'data: {"share":true,"update":false}';
    }
    echo "\n\n";
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    flush();
    if (connection_aborted()) {
        break;
    }
    sleep(RETRY);
}
As per this answer on a similar question, this is an Nginx issue. You can fix it by adding an 'X-Accel-Buffering' header with value 'no' to your response. See this entry in the Nginx documentation for more detail.
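In practice that means one extra header() call at the top of the SSE script, something like:
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
header('X-Accel-Buffering: no'); // ask the Nginx proxy not to buffer this response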

Server Side Events, HTML5, PHP & Javascript... index page not 'refreshing'

I found a really good article with a feature I want to add to a page, but I have been stuck the entire day on one small error. For reference, the tutorial is located here.
Everything is working except one thing: the index.php webpage is not refreshing when changes are made to the hosted PHP array. Could anyone glance at my code and tell me if I have a typo or missed part of the article?
My array file - selectedSystemStateResults.php
<?php
$selectedSystemStateResults = ["cart", "dogsss", "cows", "zebra", "snake"];
My serverside PHP script file - selectedSystemState-script.php
<?php
header("Cache-Control: no-cache");
header("Content-Type: text/event-stream");
// Require the file which contains the $animals array
require_once "selectedSystemStateResults.php";
// Encode the php array in json format to include it in the response
$selectedSystemStateResults = json_encode($selectedSystemStateResults);
echo "data: $selectedSystemStateResults" . "\n\n";
flush();
echo "retry: 1000\n";
echo "event: selectedSystemStateResultsMessage\n";
My Client side web page - index.php
<?php require "selectedSystemStateResults.php"; ?>
<html>
<body>
<ul>
<?php foreach ($selectedSystemStateResults as $selectedSystemStateResult) : ?>
    <li><?php echo $selectedSystemStateResult; ?></li>
<?php endforeach ?>
</ul>
<script src="/selectedSystemState-script.js"></script>
</body>
</html>
My javascript file - selectedSystemState-script.js
let eventSource = new EventSource('selectedSystemState-script.php');

eventSource.addEventListener("selectedSystemStateResultsMessage", function(event) {
    let data = JSON.parse(event.data);
    let listElements = document.getElementsByTagName("li");
    for (let i = 0; i < listElements.length; i++) {
        let selectedSystemStateResults = listElements[i].textContent;
        if (!data.includes(selectedSystemStateResults)) {
            listElements[i].style.color = "red";
        }
    }
});
I have read and re-read this for the past 8 hours and feel really stuck. Does anyone see any glaring PHP or JavaScript typos, or could the tutorial be wrong?
Please pardon the typo I had in the file names on my unedited original post. The directory shows the files all named properly.
Using this tutorial: Using server-sent events
I found out that the script.php file must NOT stop executing!!
(or selectedSystemState-script.php in your case).
So I guess the tutorial you linked is wrong on some point?
Try this:
while (1) {
    // Every second, send a "selectedSystemStateResultsMessage" event.
    echo "event: selectedSystemStateResultsMessage\n";
    require("selectedSystemStateResults.php");
    $selectedSystemStateResults = json_encode($selectedSystemStateResults);
    echo "data: $selectedSystemStateResults" . "\n\n";
    ob_end_flush();
    flush();
    sleep(1);
}
This is new to me, but I noticed a few things:
1- The PHP event script file must have the text/event-stream header.
2- That file must not stop executing!
3- event: is sent before data:.
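A minimal skeleton putting these three points together (illustrative only, not the exact tutorial code):
header('Content-Type: text/event-stream'); // point 1: the event-stream header
header('Cache-Control: no-cache');

while (1) { // point 2: the script keeps running
    echo "event: selectedSystemStateResultsMessage\n"; // point 3: the event name first...
    echo 'data: ' . json_encode(array('example')) . "\n\n"; // ...then the data line
    flush();
    sleep(1);
}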
Hope this helps.
EDIT
After a test of your script, it worked when I changed
<script src="/selectedSystemState-script.js"></script>
to <script src="./selectedSystemState-script.js"></script>
It was calling selectedSystemState-script.js from the root folder and generating a 404 error.
and in selectedSystemState-script.php
<?php
header("Cache-Control: no-cache");
header("Content-Type: text/event-stream");
// Require the file which contains the $animals array
require_once "selectedSystemStateResults.php";
// Encode the php array in json format to include it in the response
$selectedSystemStateResults = json_encode($selectedSystemStateResults);
// data after event
flush();
echo "retry: 1000\n";
echo "event: selectedSystemStateResultsMessage\n";
echo "data: $selectedSystemStateResults" . "\n\n";
?>
and I edited selectedSystemState-script.js a bit :
let eventSource = new EventSource('selectedSystemState-script.php');

eventSource.addEventListener("selectedSystemStateResultsMessage", function(event) {
    let data = JSON.parse(event.data);
    let listElements = document.getElementsByTagName("li");
    for (let i = 0; i < listElements.length; i++) {
        let selectedSystemStateResults = listElements[i].textContent;
        if (!data.includes(selectedSystemStateResults)) {
            listElements[i].style.color = "red";
        } else {
            listElements[i].style.color = "blue";
        }
    }
});
<script src="/selectedSystemState-script.js"></script>
does not match your JavaScript filename selectSystemState-script.js. Check for JavaScript errors next time by opening the developer tools console!
Another error is that you're sending the data before setting the event name. The end of selectedSystemState-script.php should be:
echo "retry: 1000\n";
echo "event: selectedSystemStateResultsMessage\n";
echo "data: $selectedSystemStateResults" . "\n\n";
flush();

How to show each "echo" from PHP one by one, instead of waiting until the whole script is complete

I have a PHP script that takes some time to execute, and it does multiple "echo"s as the progress goes along.
This script connects to FTP, deletes all contents and then uploads new files. All is working fine and I'm getting the result, but only when the script ends.
My output looks like this:
Deleting /var/www/index.php ... ok
Deleting /var/www/tempo.php ... ok
Deleting /var/www/indexOLD.php ... ok
//many lines here ...
//...
//...
Done!
Uploading /var/www/index.php ... ok
Uploading /var/www/tempo.php ... ok
Uploading /var/www/indexOLD.php ... ok
//many lines here ...
//...
//...
Done!
This is my client side code:
function atualizar(id) {
    if (window.XMLHttpRequest) {
        // code for IE7+, Firefox, Chrome, Opera, Safari
        xmlhttp = new XMLHttpRequest();
    } else {
        // code for IE6, IE5
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }
    xmlhttp.onreadystatechange = function() {
        document.getElementById("estado").value = xmlhttp.responseText;
    };
    xmlhttp.open("GET", "atualizarBOX.php?id=" + id, true);
    xmlhttp.send();
}
And here is my server-side PHP piece of code:
function ftp_putAll($conn_id, $src_dir, $dst_dir) {
    $d = dir($src_dir);
    while ($file = $d->read()) { // do this for each file in the directory
        if ($file != "." && $file != "..") { // to prevent an infinite loop
            if (is_dir($src_dir."/".$file)) { // do the following if it is a directory
                if (!@ftp_chdir($conn_id, $dst_dir."/".$file)) {
                    ftp_mkdir($conn_id, $dst_dir."/".$file); // create directories that do not yet exist
                }
                ftp_putAll($conn_id, $src_dir."/".$file, $dst_dir."/".$file); // recursive part
            } else {
                ftp_put($conn_id, $dst_dir."/".$file, $src_dir."/".$file, FTP_BINARY); // put the files
                print "Uploading ".$src_dir."/".$file." ... OK \n";
            }
        }
    }
    $d->close();
}
I already tried to use ob_flush(); flush(); but it isn't working.
I also already changed these values in php.ini:
- output_buffering = Off
- zlib.output_compression = Off
All I want is to get each "echo" one by one as the progress goes along...
Something like this:
Deleting /var/www/index.php ... ok
//now user waits a couple of seconds
Deleting /var/www/index.php ... ok
A quick Google search reveals that a common way of solving this is the 'Comet' programming technique.
Here is a quick PHP example (Source).
Stealing the server-side code from the above link and pasting it here:
<?php
/**
 Ajax streaming without polling
 */
// type octet-stream. make sure apache does not gzip this type, else it would get buffered
header('Content-Type: text/octet-stream');
header('Cache-Control: no-cache'); // recommended to prevent caching of event data.

/**
 Send a partial message
 */
function send_message($id, $message, $progress)
{
    $d = array('message' => $message, 'progress' => $progress);
    echo json_encode($d) . PHP_EOL;
    // PUSH THE data out by all FORCE POSSIBLE
    ob_flush();
    flush();
}

$serverTime = time();

// LONG RUNNING TASK
for ($i = 0; $i < 10; $i++)
{
    // Hard work!!
    sleep(1);
    // send status message
    $p = ($i + 1) * 10; // Progress
    send_message($serverTime, $p . '% complete. server time: ' . date("h:i:s", time()), $p);
}
sleep(1);
send_message($serverTime, 'COMPLETE', 100); // progress argument added so the call matches the function signature
Looks like you just need to add the appropriate headers.
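For example, the top of the streaming script (atualizarBOX.php in the question) could start roughly like this; the headers follow the Comet example above, and the rest is only a sketch:
header('Content-Type: text/octet-stream'); // or text/event-stream if you switch to SSE
header('Cache-Control: no-cache');

// drop any PHP-level output buffers so flush() actually reaches the client
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true);

echo "Deleting /var/www/index.php ... ok\n";
flush(); // each echo is pushed to the browser as soon as it happens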

Re-execute a PHP script in a JavaScript function

I wrote code to re-read the content of a file (on the server) every time the file is modified. The scenario is like this:
- The webpage is loaded
- If the file (on the server) is newer than the starting time of the page (the time when the webpage was started), the content of the file is read
- If the file is modified later, the content must be read again by the PHP script
I tried this using EventSource. Here is the code for the browser:
<html>
<head>
<?php
    $startTime = time();
    $flag = 0;
?>
<script type="text/javascript" language="javascript">
    lastFileTime = <?php echo $startTime; ?>;
    var fileTime;

    if (typeof(EventSource) !== "undefined") {
        var source = new EventSource("getFileTime.php");
        source.onmessage = function(event) {
            fileTime = parseInt(event.data);
            if (fileTime > lastFileTime) {
                readFile();
                lastFileTime = fileTime;
            }
        };
    }
    else {
        alert("Sorry, your browser does not support server-sent events.");
    }

    function readFile() {
        <?php
            $fid = fopen("file.bin", "rb");
            ... // Read the content of the file
            $flag = $flag + 1;
        ?>
        ... // Transfer the content of the file to JavaScript variables
        flag = <?php echo $flag; ?>;
    }
</script>
</head>
<body>
...
</body>
</html>
And here is the server-side code (getFileTime.php):
<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
$filetime = filemtime("file.bin");
echo "data: {$filetime}\n\n";
flush();
?>
When I start the webpage and create file.bin afterwards, readFile() is called for the first time (I checked: the value of flag is 1). But then, if I modify file.bin again, readFile() is obviously not called. I checked the content of the file; it's still from the previous version, and flag is still 1. It seems that a PHP script inside a JavaScript function can only be executed once. How can I re-execute the PHP script in a JavaScript function?
Your PHP script needs to remain active, sending new events to the client when something changes:
<?php
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

$lastFiletime = null;
while (true) {
    clearstatcache(); // without this, filemtime() keeps returning the cached value
    $filetime = filemtime("file.bin");
    if ($filetime != $lastFiletime) {
        echo "data: {$filetime}\n\n";
        $lastFiletime = $filetime;
        flush();
    }
    sleep(1); // avoid a busy loop hammering the filesystem
}

PHP: Running Multiple Scripts at the Same Time for the Same Client

I have one PHP script that can take several minutes to complete. The script downloads a file to the user's PC.
I have another PHP script whose role is to monitor the progress of the main download script. That script is called by the client via AJAX and should return download progress information.
Right now, my tests show that during the execution of the main script (in other words, during the file download), the AJAX monitor script returns no values at all. It starts behaving normally when the main download script finishes.
Is it possible that PHP cannot run two or more scripts simultaneously and only allows scripts to run in sequential order?
I could insert my code, but I don't think it is needed for the purpose of my question. I simply need to know whether two or more PHP scripts may run simultaneously for the same client.
I use:
WAMP
PHP Version 5.4.12
JavaScript without jQuery
Code Used:
As I was asked to show my code, please see the code parts below.
Main PHP(later Download) Script:
<?php
// disable script expiry
set_time_limit(0);

// start session if session is not already started
if (session_status() !== PHP_SESSION_ACTIVE)
{
    session_start();
}

// prepare session variable
$_SESSION['DownloadProgress'] = 0;

for ($count = 0; $count < 60; $count++)
{
    sleep(1);
    echo "Iteration No: " . $count;
    $_SESSION['DownloadProgress']++;
    echo '$_SESSION[\'DownloadProgress\'] = ' . $_SESSION['DownloadProgress'];
    flush();
    ob_flush();
}
?>
Monitoring PHP script:
<?php
session_start(); // the session must be started so $_SESSION is available here
// construct JSON
$array = array("result" => 1, "download_progress" => $_SESSION['DownloadProgress']);
echo json_encode($array);
?>
JavaScript code where I call both PHP scripts:
SearchResults.myDownloadFunction = function()
{
    console.log( "Calling: PHP/fileDownload.php" );
    window.location.href = 'PHP/fileDownload.php?upload_id=1';
    console.log( "Calling: getUploadStatus()" );
    FileResort.SearchResults.getUploadStatus();
    console.log( "Called both functions" );
};
JavaScript AJAX:
// call AJAX function to get upload status from the server
SearchResults.getUploadStatus = function ()
{
    var SearchResultsXMLHttpRequest = FileResort.Utils.createRequest();
    if (SearchResultsXMLHttpRequest == null)
    {
        console.log("unable to create request object.");
    }
    else
    {
        SearchResultsXMLHttpRequest.onreadystatechange = function ()
        {
            console.log("Response Text: " + SearchResultsXMLHttpRequest.responseText);
            console.log("AJAX Call Returned");
            if ((SearchResultsXMLHttpRequest.readyState == 4) && (SearchResultsXMLHttpRequest.status == 200))
            {
                //if (that.responseJSON.result == "true")
                {
                    var responseJSON = eval('(' + SearchResultsXMLHttpRequest.responseText + ')');
                    console.log("Download Progress: " + responseJSON.download_progress);
                }
            }
        }
        var url = "PHP/fileDownloadStatus.php";
        SearchResultsXMLHttpRequest.open("POST", url, true);
        SearchResultsXMLHttpRequest.send();
    }
};
Code Update I:
PHP Script that will later download files:
<?php
// disable script expiry
set_time_limit(0);

for ($count = 0; $count < 60; $count++)
{
    sleep(1);
}
?>
PHP Monitoring script that outputs test values:
<?php
$test_value = 25;
// construct JSON
$array = array("result" => 1, "download_progress" => $test_value);
//session_write_close();
echo json_encode($array);
?>
Both scripts are called as follows:
SearchResults.myDownloadFunction = function()
{
    console.log( "Calling: PHP/fileDownload.php" );
    window.setTimeout(FileResort.SearchResults.fileDownload(), 3000);
    console.log( "Calling: getUploadStatus()" );
    window.setInterval(function(){ FileResort.SearchResults.getDownloadStatus(); }, 1000);
    console.log( "Called both functions" );
};
Without more info there are a few possibilities here, but I suspect the issue is your session. When a script that uses the session starts, PHP locks the session file until session_write_close() is called or the script completes. While the session is locked, any other script that accesses the session is unable to do anything until the first script writes and closes the session file (so the AJAX calls have to wait until the session file is released). Try writing/closing the session as soon as you've done validation etc. in the first script, and subsequent scripts should be able to start.
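In other words, something along these lines in the long-running download script (a simplified sketch, not your actual code):
session_start();
$_SESSION['DownloadProgress'] = 0;
session_write_close(); // lock released here; the monitor script can now read the session

// ... long-running download work that doesn't need to write to $_SESSION ...

// whenever progress needs updating, reopen the session briefly:
session_start();
$_SESSION['DownloadProgress'] = $newValue; // $newValue is whatever progress you computed
session_write_close();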
Here's a quick and dirty approach:
The "Landing" page:
This is the page where the user is going to click the download link.
<html>
<head>
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<script>
$(document).ready(function(e) {
    // Every 500ms check the monitoring script to see what the progress is
    $('#large_file_link').click(function(){
        window.p_progress_checker = setInterval( function(){
            $.get( "monitor.php", function( data ) {
                $( ".download_status" ).html( data + '% complete' );
                // when it's done or aborted we stop the interval
                if (parseInt(data) >= 100 || data == 'ABORTED'){
                    clearInterval(window.p_progress_checker);
                }
                // if it's aborted we display that
                if (data == 'ABORTED'){
                    $( ".download_status" ).html( data );
                    $( ".download_status" ).css('color','red').css('font-weight','bold');
                }
            })
        }, 500);
    });
});
</script>
</head>
<body>
<div class="download_status"><!-- GETS POPULATED BY AJAX CALL --></div>
<!-- the download link itself; the href filename is assumed, point it at the "File Uploader" script below -->
<p><a id="large_file_link" href="file_uploader.php">Start downloading large file</a></p>
</body>
</html>
The "File Uploader"
This is the PHP script that serves the large file... it breaks the file into chunks, and after sending each chunk it closes the session so the session becomes available to other scripts. Also notice that I've added an ignore_user_abort/connection_aborted handler so that it can take a special action should the connection be terminated. This is the section that actually deals with the session_write_close() issue, so focus on this script.
<?php
/* Ignore user abort so we can catch it with connection_aborted */
ignore_user_abort(true);

function send_file_to_user($filename) {
    // Set the appropriate headers:
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($filename));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($filename));

    $chunksize = 10*(1024); // how many bytes per chunk (i.e. 10K per chunk)
    $buffer = '';
    $already_transferred = 0;
    $file_size = filesize($filename);

    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }

    while (!feof($handle)) {
        /* if we're using a session variable to communicate, just open the session
           when sending a chunk and then close the session again so that other
           scripts which have requested the session are able to access it */
        session_start();

        // see if the user has aborted the connection; if so, set the status
        if (connection_aborted()) {
            $_SESSION['file_progress'] = "ABORTED";
            return;
        }

        // otherwise send the next packet...
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();

        // now update the session variable with our progress
        $already_transferred += strlen($buffer);
        $percent_complete = round(($already_transferred / $file_size) * 100);
        $_SESSION['file_progress'] = $percent_complete;

        /* now close the session again so any scripts which need the session
           can use it before the next chunk is sent */
        session_write_close();
    }

    $status = fclose($handle);
    return $status;
}

send_file_to_user('large_example_file.pdf');
?>
The "File Monitor"
This is a script that is called via Ajax and is in charge of reporting progress back to the Landing Page.
<?php
session_start();
echo $_SESSION['file_progress'];
?>
