I used this in my index.php
<?
include('config.php');
if($site->maintenance > 0){
echo "<script>document.location.href='maintenance'</script>";
exit;
}
?>
and in my config.php, after checking the database connection:
$site = mysql_fetch_object(mysql_query("SELECT * FROM problems"));
I made a table in my database called problems, but even when I set the value to 0 it redirects me to maintenance. When I check the variable with var_dump($site) it outputs:
bool(false)
and if I check it like this: var_dump($site->maintenance) it outputs:
NULL
What do I have to do to manage maintenance mode from the database, so that when I want my site to work again I just change the value?
Why are you using JS for this? What if the user has JS turned off? I would use PHP instead.
Create a table for maintenance with a column, say maintenance_status, holding a boolean value: 0 => off, 1 => on. Keep only a single record, created once, and simply update it from then on.
Then create this function:
function check_maintenance($connection) { /* Call this function on every page,
pass your database connection var
as a function parameter */
$query = mysqli_fetch_array(mysqli_query($connection, "SELECT * FROM tbl_maintenance LIMIT 1"));
/* LIMIT 1 is optional if you keep only one row as described above;
if you keep records of previous maintenance windows, you'll
probably have to sort DESC and use LIMIT 1 */
if($query['maintenance_status'] == '1') { // Check the boolean value; if it's on, redirect
header('Location: maintenance.php'); //Redirect the user to maintenance page
exit;
}
}
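A minimal sketch of how it might be called at the top of each page (assuming config.php creates the mysqli connection in $connection and the function above is included there or alongside it):
<?php
include('config.php');
check_maintenance($connection); // redirects to maintenance.php and exits when the flag is 1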
The fact that $site is false could be caused by a problem with the query. Change the mysql code to:
$result = mysql_query("SELECT * FROM problems");
if(!$result) {
die(mysql_error());
}
$site = mysql_fetch_object($result);
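If you switch to mysqli (as the other answer does), the equivalent check might look like this, assuming $con is your mysqli connection:
$result = mysqli_query($con, "SELECT * FROM problems");
if (!$result) {
    die(mysqli_error($con));
}
$site = mysqli_fetch_object($result);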
Further, you should learn how to enable error messages in PHP. I guess there are a bunch of them. They are disabled by default because they could be a security risk on a production system, but while you are developing you MUST enable them. You can enable them in the php.ini of your development system:
php.ini:
...
display_errors=1
...
log_errors=1
...
error_log="/path/to/writable/file"
...
error_reporting=E_ALL
After modifying the php.ini don't forget to restart the web server.
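If you can't edit php.ini (on shared hosting, for example), a rough equivalent for development is to set the same options at the top of your script:
// development only - do not leave this in production code
ini_set('display_errors', '1');
error_reporting(E_ALL);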
I'm trying to GET a large amount of data from the API (over 300k records). It has pagination (25 records per page) and the request limit is 50 requests per 3 minutes. I'm using PHP curl to get the data. The API needs JWT token authorization. I can get a single page and put its records into an array.
...
$response = curl_exec($curl);
curl_close($curl);
$result = json_decode($response, true);
The problem is I need to get all records from all pages and save them into an array or a file. How do I do it? Maybe I should use JS to do it better?
Best regards and thank you.
Ideally use cron and some form of storage, database or a file.
It is important that you ensure a new call to the script doesn't start unless the previous one has finished, otherwise they start stacking up and after a few you will start having server overload, failed scripts and it gets messy.
Store a value to say the script is starting.
Run the CURL request.
Once curl has been returned and data is processed and stored change the value you stored at the beginning to say the script has finished.
Run this script as a cron in the intervals you deem necessary.
A simplified example:
<?php
if ($script_is_busy == 1) exit(); // $script_is_busy stands for whatever persistent flag you use (a file, a DB row, ...)
$script_is_busy = 1; // mark the script as started
// YOUR CURL REQUEST AND PROCESSING HERE
$script_is_busy = 0; // mark the script as finished
?>
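A quick file-based version of the same idea (just a sketch; the lock file path is an assumption):
<?php
$lock = '/tmp/api_fetch.lock';
if (file_exists($lock)) exit(); // previous run still busy
touch($lock);                   // mark as started
// YOUR CURL REQUEST AND PROCESSING HERE
unlink($lock);                  // mark as finished
?>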
I would use a series of requests. A typical request takes at most 2 seconds to fulfill, so 50 requests per 300 seconds does not require parallel requests. Still, you need to measure time and wait if you don't want to be banned for DoS (curl does support parallel requests, as far as I remember, if you ever need them). When you reach the request limit you must use the sleep function to wait until you can send new requests. For PHP the real problem is that this is a long-running job, so you need to change settings or it will time out; see: Best way to manage long-running php script? As for Node.js, I think it is a much better solution for this kind of async task, because the required features come naturally with Node.js without extensions and such things, though I am biased towards it.
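Here is a rough sketch of the rate-limited paging loop; fetchPage() is a hypothetical helper you would write around your existing curl code, and $totalPages is assumed to come from the API's pagination info (the 50-request / 3-minute numbers are from the question):
set_time_limit(0); // long-running job: disable the execution time limit
$records = array();
$requests = 0;
$windowStart = time();
for ($page = 1; $page <= $totalPages; $page++) {
    if ($requests >= 50) { // hit the limit: wait out the rest of the 3-minute window
        $elapsed = time() - $windowStart;
        if ($elapsed < 180) {
            sleep(180 - $elapsed);
        }
        $requests = 0;
        $windowStart = time();
    }
    $records = array_merge($records, fetchPage($page)); // fetchPage() is hypothetical
    $requests++;
}
file_put_contents('records.json', json_encode($records)); // or keep $records in memory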
Okay. I misinterpreted what you needed. I have more questions.
Can you do one request and get your 50 records immediately? That is assuming when you said 50 requests per 3 minutes you meant 50 records.
Why do you think there is this 50/3 limitation?
Can you provide a link to this service?
Is that 50 records per IP address?
Is leasing 5 or 6 IP addresses an option?
Do you pay for each record?
How many records does this service have total?
Do the records have a time limit on their viability?
I am thinking that if you can use 6 IP addresses (or 6 processes) you can run 6 requests simultaneously using stream_socket_client().
stream_socket_client allows you to make simultaneous requests. You then create a loop that monitors each socket for a response.
About 10 years ago I made an app that evaluated web page quality. I ran
W3C Markup Validation
W3C CSS Validation
W3C Mobile OK
WebPageTest
My own performance test.
I put all the URLs in an array like this:
$urls = array();
$path = $url;
$url = urlencode("$url");
$urls[] = array('host' => "jigsaw.w3.org",'path' => "/css-validator/validator?uri=$url&profile=css3&usermedium=all&warning=no&lang=en&output=text");
$urls[] = array('host' => "validator.w3.org",'path' => "/check?uri=$url&charset=%28detect+automatically%29&doctype=Inline&group=0&output=json");
$urls[] = array('host' => "validator.w3.org",'path' => "/check?uri=$url&charset=%28detect+automatically%29&doctype=XHTML+Basic+1.1&group=0&output=json");
Then I'd make the sockets.
foreach($urls as $path){
$host = $path['host'];
$path = $path['path'];
$http = "GET $path HTTP/1.0\r\nHost: $host\r\n\r\n";
$stream = stream_socket_client("$host:80", $errno,$errstr, 120,STREAM_CLIENT_ASYNC_CONNECT|STREAM_CLIENT_CONNECT);
if ($stream) {
$sockets[] = $stream; // supports multiple sockets
$start[] = microtime(true);
fwrite($stream, $http);
}
else {
$err .= "$id Failed<br>\n";
}
}
Then I monitored the sockets and retrieved the response from each socket.
while (count($sockets)) {
$read = $sockets;
stream_select($read, $write = NULL, $except = NULL, $timeout);
if (count($read)) {
foreach ($read as $r) {
$id = array_search($r, $sockets);
$data = fread($r, $buffer_size);
if (strlen($data) == 0) {
// echo "$id Closed: " . date('h:i:s') . "\n\n\n";
$closed[$id] = microtime(true);
fclose($r);
unset($sockets[$id]);
}
else {
$result[$id] .= $data;
}
}
}
else {
// echo 'Timeout: ' . date('h:i:s') . "\n\n\n";
break;
}
}
I used it for years and it never failed.
It would be easy to gather the records and paginate them.
After all sockets are closed you can gather the pages and send them to your user.
Do you think the above is viable?
JS is not better.
Or did you mean 50 records each 3 minutes?
This is how I would do the pagination.
I'd organize the response into pages of 25 records per page.
In the query results while loop I'd do this:
$cnt = 0;
$page = 0;
while(...){
$cnt++;
$response[$page][] = $record;
if($cnt > 24){$page++; $cnt = 0;}
}
header('Content-Type: application/json');
echo json_encode($response);
I am facing a strange issue here.
I am using JavaScript ajax (I used jQuery). Now the scenario is:
One ajax call is invoking a php script which is basically a long running process and it sets some session variables.
Later, at some interval (let's say every 2 seconds), I am running another ajax call to check the session variables, to know when the process (the first php script's execution) is completed.
The first php script is fetching data from the database and writing it into a file. On each fetch I am counting the loop number and storing it into a session variable to keep some kind of tracking record. Like:
$i=0;
$_SESSION['time']=date('m-d-Y H:i:s');
while(...)
{
ini_set('session.use_only_cookies', false);
ini_set('session.use_cookies', false);
ini_set('session.use_trans_sid', false);
ini_set('session.cache_limiter', null);
session_start();
$_SESSION['trackstatus'] = "loop number : ".$i." time is : ".$_SESSION['time'];
session_write_close();
$i++;
......
......
}
Another php script which I am invoking via setInterval ajax is just doing like;
echo $_SESSION['trackstatus'];
The set interval ajax is returning me like;
loop number 1 time is m-d-Y H:i:s
loop number 5 time is m-d-Y H:i:s
loop number 8 time is m-d-Y H:i:s
......
Then after a few more calls, again:
loop number 1 time is m-d-Y H1:i1:s1
.....
Notice the change of H:i:s to H1:i1:s1
So as per my understanding the php script is being invoked twice. And for your information, the same code was working maybe just 12 hours ago. I faced this issue before and somehow solved it (trial and error, so I don't know how, or maybe it fixed itself... ok, actually I have no clue).
Can you please give me an insight into what I am doing wrong?
Please mention if you need more information.
And the funny thing is that it started working as expected just after asking this question, without changing a single line of code. But I want to know the reason.
I think I know the reason: PHP writes session variables to a file, but it does so only at the end of script execution, so you can't see the session changes in another script before the long one has finished.
You can fix it by adding session_write_close(); session_start(); after each change of session data.
session_write_close will write the changes to disk, so another script can read them.
session_start will load the session from disk again, but make sure your other script makes no changes to the session, because those changes will be overwritten by your long script.
And one more thing if you are using separate domains:
Before the actual AJAX call happens your browser sends an OPTIONS request to the same URL to check the CORS headers. So at the start of your script check the HTTP method, and if it is HEAD or OPTIONS, just die();
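A minimal sketch of that check at the top of the long script:
if (in_array($_SERVER['REQUEST_METHOD'], array('OPTIONS', 'HEAD'))) {
    die(); // preflight / header-only request: don't start the long job
}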
Instead of using sessions, try using a temp file to keep count with a dynamic ID
Javascript
var time = Date.now();
$.get('/firstURL?time='+time);
setInterval(function(){
$.get('/secondURL?time='+time, function(response){
console.log(response);
});
}, 1000);
PHP 1st URL
<?php
$id = $_GET['time'];
$count = 0;
while(...) {
// Do your stuff
$count++;
file_put_contents("/tmp/{$id}", $count);
}
?>
PHP 2nd URL
<?php
$id = $_GET['time'];
$count = 0;
if (file_exists("/tmp/{$id}")) { // file_get_contents() only emits a warning (it does not throw), so check first
$count = file_get_contents("/tmp/{$id}");
}
echo $count;
?>
As others have said, PHP does not write the session until execution has finished. You're better off creating a php function that writes a file with the progress; your second ajax call then just reads that file.
function updateCreateProgress($jobStartTime, $progress){
file_put_contents('/tmp/'.$jobStartTime.'.txt', $progress);
}
function completeProgress($jobStartTime){
unlink('/tmp/'.$jobStartTime.'.txt');
}
Now your second script can check for '/tmp/'.$jobStartTime.'.txt': if it's there, read it using file_get_contents; if it's not there, report back that the job has finished.
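A sketch of what that second script might look like, assuming the job start time is passed in as a GET parameter named jobStartTime:
<?php
$job = basename($_GET['jobStartTime']); // basename() keeps the value from escaping /tmp
$file = '/tmp/'.$job.'.txt';
if (file_exists($file)) {
    echo file_get_contents($file); // still running: return the current progress
} else {
    echo 'finished'; // file removed by completeProgress(): the job is done
}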
Try adjusting to something like this:
$i=0;
ini_set('session.use_only_cookies', false);
ini_set('session.use_cookies', false);
ini_set('session.use_trans_sid', false);
ini_set('session.cache_limiter', null);
session_start();
$_SESSION['time']=date('m-d-Y H:i:s');
while(...)
{
$_SESSION['trackstatus'] = "loop number : ".$i." time is : ".$_SESSION['time'];
session_write_close();
session_start();
$i++;
......
......
}
You started talking about $_SESSION before calling session_start();
If you call ajax with the GET method you must set the "cache: false" option.
Yes, you must protect your php script from other requests. With unique key (GET parameter) or session.
PHP locks session data for a single call and releases it only when that call ends. Using session_write_close() while the script is still working is bad practice: maybe you want to save something more into the session after the loop, but before that data is used by other requests.
Flexible and clear solution:
1) script1.php - invoked from ajax to start the long job.
2) script2.php (the long job itself) - run it directly from script1.php in the background without waiting, or add a new cron job (insert a row into a table) and run script2.php from cron (checking for jobs every second or at whatever interval you need).
3) script3.php - check the job status (ajax).
For "communication" between script2.php and script3.php you can use the database or a special file with flock(), clearstatcache() and flush(), for example:
I am making a project which is a website. Basically it will set a reminder and notify the user by email/SMS. I am using PHP and JavaScript. My database stores the list of users in table 1 and a separate table for each user and his tasks (with the times and dates). I want to query the database every minute to check for tasks, even if the user is not logged in (the browser is closed). What do I do to keep that check query running all the time?
I want something that will run in the background all the time, even if the user never opens the browser.
Please help.
The php code to store into a user's table is:
<?php
include("init.php");
session_start();
if(isset($_POST))
{
$date = $_POST["date"];
$event = $_POST["event"];
$time = $_POST["time"];
$daily = $_POST["daily"];
$weekly = $_POST["weekly"];
$monthly = $_POST["monthly"];
$fname = $_SESSION['fname'];
$fname = mysql_real_escape_string($fname);
$sql = "insert into $fname(fname,date,event,time,daily,weekly,monthly) values('$fname','$date','$event','$time','$daily','$weekly','$monthly')";
if(mysqli_multi_query($con,$sql))
echo "<br><h3> row inserted...</h3>done";
else
echo "Error in insertion...".mysqli_error($con);
}
?>
There is no issue with the code.
I just need to know how, and using what, I can query the database all the time on the server side when the user is not on the page.
Can php keep working 24 hours a day even if the browser is closed? Because I know javascript won't.
You need to create an event in MySQL (or whatever database manager you are using). Note that MySQL's event scheduler must be enabled (SET GLOBAL event_scheduler = ON) for events to fire. For example:
CREATE EVENT e_totals
  ON SCHEDULE AT '2006-02-10 23:59:00'
  DO INSERT INTO test.totals VALUES (NOW());
Or a recurrent event:
delimiter |
CREATE EVENT e_daily
ON SCHEDULE
EVERY 1 DAY
COMMENT 'Saves total number of sessions then clears the table each day'
DO
BEGIN
INSERT INTO site_activity.totals (time, total)
SELECT CURRENT_TIMESTAMP, COUNT(*)
FROM site_activity.sessions;
DELETE FROM site_activity.sessions;
END |
delimiter ;
Sagar, what you are looking for is a cron task. I am afraid PHP and JavaScript alone can't trigger it.
Workflow:
Make an API endpoint (a PHP script) containing all the business logic or processing you need to execute.
Register a cron job in cPanel, or with crontab -e on your linux machine (see the example crontab line below).
You can still hit the endpoint directly with AJAX calls, or make a separate endpoint; the cron task will keep working regardless.
Refer to this link in case you want to learn more about cron jobs - http://www.thegeekstuff.com/2009/06/15-practical-crontab-examples
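A sketch of what the crontab entry might look like (the PHP binary and script paths are assumptions; adjust them to your hosting):
* * * * * /usr/bin/php /home/youruser/public_html/check_reminders.php >> /home/youruser/reminders.log 2>&1
This runs the check script once a minute and appends its output to a log file.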
Thanks,
Abhishek Jain
I run a server on shared hosting space (Aruba) with a LAMP configuration, with two separate sets of PHP pages, one for the administrator and one for several clients (imagine a quiz game where the administrator submits questions).
I want to achieve this behaviour:
The administrator presses a button and releases the question.
At any time within ten seconds (or even immediately, but that is not strictly required) all the clients must display, AT THE SAME TIME, the page with the text of the question.
To achieve this, I thought of different solutions:
Web sockets (not feasible, as I cannot install server components on
my web page)
Trigger file generated by the administrator; the clients will periodically (~10 sec) poll (setInterval()) for the presence of this file and, depending on the
creation time of the file (or an equivalent timestamp read from the
file name or file content) the client will start a countdown (setTimeout()) for the
time remaining to when the new page has to be fired, to make sure that all clients eventually trigger at the same time (tenth of second)
Trigger via database (basically same as trigger file, but possibly slower).
I tried the second solution (managing the reading of the trigger file on the client side) both in PHP and in JavaScript, but they both fail when there is more than one client connected:
PHP apparently fails because Apache does not support many simultaneous threads
and somehow gets stuck
JavaScript somehow occasionally fails to recognize the presence
of the file in the local directory (with more than one client
connected, XMLHttpRequest.status incorrectly returns 404 even when the trigger file is there). I even created separate trigger files for the different clients, to make sure there are no concurrency conflicts.
Any hints on why XMLHttpRequest.status occasionally fails, or advice on a better way of achieving this behaviour?
Thank you in advance.
Have you considered long polling? See https://github.com/panique/php-long-polling for an example of how to do this with PHP. This will not scale well because of the number of apache and php processes that would have to stay active, but would be fine for a few clients. If you need it to scale then I would consider switching server technologies to something like hack (like PHP; see http://hacklang.org/) or node which is great at this kind of thing.
EDIT: I didn't understand the question fully in my original answer. Here is my refined answer:
With the current limitations you are under, I see only one way of achieving a simultaneous server response. First, you will need to implement HTML5 SSE (server-sent events). When your server is ready to send a message to the clients, trigger an SSE to each client. This event does not need to carry any data, and the clients do not need to be contacted simultaneously. The event simply tells each client to execute an ajax call to your php ajaxHandler.
During each ajax call from the client, your server will check your database for the value of 'waitingClients' in some table you created. If the value is 0, set the value to 1. If the value is greater than 0, increment the waitingClients value by 1. After each ajax call increments the database value, the individual ajax calls are then suspended in a while loop until 'waitingClients' is equal to the value of 'totalClients'. I recommend that you create some kind of entry in your database that records the number of active clients. This makes your 'totalClients' value more dynamic.
You may run into problems with the ajax calls timing out after 30 seconds. Since you're only returning database values, I doubt that you'll run into this problem unless something happens with a client's connection hanging.
Here is some example code (untested):
HTML
<!DOCTYPE html>
<html>
<head lang="en">
<meta charset="UTF-8">
<title></title>
<script src="jquery-1.9.1.min.js"></script>
<script src="ajaxTest.js"></script>
</head>
<body>
<div id="server_message">Waiting for server response</div>
</body>
</html>
Ajax:
$(function() {
var message = $('#server_message');
$.ajax({
url: 'yourAjaxHandler.php',
type: 'POST',
data: {
getAnswer: true
},
success: function(response) {
console.log(response);
message.text(response);
}
})
});
PHP Ajax handler
<?php
$host = 'db host address';
$dbname = 'your database name';
$username = 'your username';
$password = 'your password';
$conn = new PDO("mysql:host=$host;dbname=$dbname", $username, $password);
// Define expected number of total clients. I would recommend having clients log an entry into the database upon initial login/connection.
// This would make tallying the number of clients more dynamic. Otherwise you will always need 4 clients connected
$totalClients = 4;
if (isset($_REQUEST['getAnswer'])) {
$qry = 'SELECT waitingClients FROM some_table';
$waitingClients = (int) $conn->query($qry)->fetchColumn(); // fetch the value itself, not the PDOStatement
if ($waitingClients === 0) {
// Create waitingClients in database if it doesn't exist. Otherwise, increment value to 1
$qry = "UPDATE some_table set waitingClients = 1";
$conn->exec($qry);
} else {
// Increment waitingClients
$qry = "UPDATE some_table set waitingClients = waitingClients + 1";
$conn->exec($qry);
}
while ($waitingClients < $totalClients) {
// The while loop keeps the ajax call open for all clients:
// keep querying the database until waitingClients matches totalClients
$qry = 'SELECT waitingClients FROM some_table';
$waitingClients = (int) $conn->query($qry)->fetchColumn();
sleep(1); // don't hammer the database on every iteration
}
// Set the value of waitingClients back to 0
$qry = "UPDATE some_table SET waitingClients = 0";
$conn->exec($qry);
// Return your server message to the clients
echo json_encode("Your server message"); // You could also store your server message in the database
}
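The SSE endpoint itself isn't shown above. A minimal sketch of what it might look like (the file name and the release-flag check are assumptions, not part of the original answer):
<?php /* sse.php */
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
set_time_limit(0);
while (true) {
    // stand-in for your real "question released" check (a database column, a trigger file, ...)
    $released = file_exists('/tmp/question_released');
    if ($released) {
        echo "data: go\n\n"; // the payload doesn't matter; it just tells each client to fire its ajax call
        flush(); // you may also need to disable output buffering for this to reach the client immediately
        break;
    }
    sleep(1);
}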
I am not sure if this is the best way to do it, but I have a button that, when pressed, calls an onClick JS function and passes two parameters. I want to save those two parameters in a php session, then load another page and use those values.
So, I know that if I use something like this on PAGE 1:
<?php
session_start();
$message1 = "A message";
$message2 = "Another message";
$_SESSION['routineName'] = $message1;
$_SESSION['dayName'] = $message2;
?>
I can go to PAGE 2, and by using $_SESSION['routineName'] I can use that info.
So, on PAGE 1 I have that code inside the function that is called with my onClick:
function trackIt(routine, dayName)
{
<?php
session_start();
$message1 = "A message";
$message2 = "Another message";
$_SESSION['routineName'] = $message1;
$_SESSION['dayName'] = $message2;
?>
}
I tried things like:
function trackIt(routine, dayName)
{
<?php
session_start();
$_SESSION['routineName'] = ?> routine; <?php
$_SESSION['dayName'] = $message2;
?>
}
and others, but nothing works.
And this is how I am calling the onClick (trackIt) function:
echo('<td colspan="3" style="background-color:#005673; text-align:right; padding: 4px 0px;">
<button class="btnTrack" onClick="trackIt(\'' . $name . '\' , \'' . $nameday1 . '\')" >Track It!</button></td>');
What I want to do is to save both, routine and dayName, into the session.
Is it possible to save JS variables/parameters into PHP Session?
PS: I am using Wordpress.
Thanks!
The PHP code you put in your files is not executed at JavaScript run time; it is executed before the page even gets sent to the client. So you can't set $_SESSION from within your page content at click time; you need to do that from Wordpress's code. Usually this is done via a plugin.
You need to pass your Javascript variables to a server side PHP. As #Grasshopper said, the best (or at least most maintainable way) is through AJAX:
// This is your JAVASCRIPT trackit function
function trackIt(routine, day) {
$.post(
'/wp-setvar.php',
{
routine : routine,
day : day
}, // You can add as many variables as you want (well, within reason)
function success(data) {
// Here we should receive, given the code below, an object
// such that data.result is a string saying "OK".
// Just in case you need to get back something from the server PHP.
// Otherwise just leave this function out.
}
);
};
On the server, you need to create a specific file to accept the incoming variables (it would be best if you did this from a plugin, in order not to add files outside the installation: such practices are frowned upon by security scanners such as WordFence). Here below is a butcher's solution.
<?php /** This is wp-setvar.php */
/** Set up WordPress environment, just in case */
require_once( dirname( __FILE__ ) . '/wp-load.php' );
session_id() || session_start();
nocache_headers();
// DO NOT, FOR ANY REASON, ACCESS DIRECTLY $_SESSION
// ONLY USE A VARIABLE WITHIN $_SESSION (here, "ajjx")
// OTHERWISE THIS MAY ALLOW ANYONE TO TAKE CONTROL OF YOUR INSTALLATION.
$_SESSION['ajjx'] = $_POST;
Header('Content-Type: application/json;charset=utf8');
die(json_encode(array(
'result' => 'OK', // This in case you want to return something to the caller
)));
Now whenever you need the session-saved variable, e.g. "routine", you put
<?php
...
$value = '';
if (array_key_exists('ajjx', $_SESSION)) {
if (array_key_exists('routine', $_SESSION['ajjx'])) {
$value = $_SESSION['ajjx']['routine'];
}
}
Or you can define a function in your plugin,
function ajjx($varname, $default = '') {
if (array_key_exists('ajjx', $_SESSION)) {
if (array_key_exists($varname, $_SESSION['ajjx'])) {
return $_SESSION['ajjx'][$varname];
}
}
return $default;
}
Then you just:
<?php print ajjx('routine', 'none!'); ?><!-- will print routine, or "none!" -->
or
<?php print ajjx('routine'); ?><!-- will print nothing if routine isn't defined -->
An even more butcherful solution is to add the function definition above within wp-config.php itself. Then it will be available everywhere in Wordpress, provided you have access to wp-config.php. Also, back up wp-config first and use a full FTP client to do it; do not use a Wordpress plugin to edit it, since if wp-config crashes, the plugin may crash too... and you'll find yourself in a my-can-opener-is-locked-within-a-can situation.
If you don't feel comfortable with some of the above, it's best if you do nothing. Or practice first on an expendable Wordpress installation that you can reinstall easily.