Enhancing PHP script performance - JavaScript

(The subject of this question might not match the question itself, but I couldn't think of a better one.) I have a webpage where the user provides the email addresses of recipients; there can be 100 or more addresses, delimited by ;, entered in the textarea. Of course I have to send an email to all those addresses. I have two approaches in mind but can't decide which one would give better user experience and performance.
Approach 1: I loop through all those emails in my JS and send an AJAX request to the PHP script for each one. But then there would be 100 requests to the server, and if the user closes the browser in between, not all email addresses will go through.
Approach 2: I send all 100 email addresses in one go to the PHP script and let the PHP script loop through them. I am assuming I would be able to echo some message back to the client after each loop iteration, and even if the client is gone, PHP will at least keep executing until the loop ends.
Can somebody please give me the pros and cons of these two approaches?

Here is an idea on how to implement a queue.
define('MAX_EMAIL_BUFFER_SIZE', 15);

// Query how many emails still need to be sent; you need to store
// this data in MySQL or some other persistent place.
// function getEmails(): array { ... }
$total = count(getEmails());
$pages = ceil($total / MAX_EMAIL_BUFFER_SIZE);

for ($page = 1; $page <= $pages; $page++) {
    $offset = ($page - 1) * MAX_EMAIL_BUFFER_SIZE;
    /* query:
       SELECT *
       FROM table
       ORDER BY name
       LIMIT MAX_EMAIL_BUFFER_SIZE
       OFFSET $offset
    */
    // Run the above query in a function that returns the result set;
    // the rows it returns are the emails to send in this batch.
    foreach ($data as $email) {
        mail(...);
    }
    // Sleep for 10 seconds between batches.
    sleep(10);
}

Related

Get large data from API with pagination

I'm trying to GET a large amount of data from the API (over 300k records). It has pagination (25 records per page) and the request limit is 50 requests per 3 minutes. I'm using PHP curl to get the data. The API needs JWT token authorization. I can get a single page and put its records into an array.
...
$response = curl_exec($curl);
curl_close($curl);
$result = json_decode($response, true);
The problem is I need to get all records from all pages and save them into an array or a file. How do I do it? Maybe I should use JS to do it better?
Best regards and thank you.
Ideally use cron and some form of storage, database or a file.
It is important to ensure that a new call to the script doesn't start unless the previous one has finished; otherwise they start stacking up, and after a few you will have server overload, failed scripts, and a mess.
Store a value to say the script is starting.
Run the CURL request.
Once curl has returned and the data is processed and stored, change the value you stored at the beginning to say the script has finished.
Run this script as a cron job at whatever interval you deem necessary.
A simplified example:
<?php
// Simplified example: the "busy" flag has to live somewhere persistent,
// e.g. a small file (as here) or a column in the database.
$flag = __DIR__ . '/import_running.flag';
if (file_exists($flag)) exit();  // the previous run has not finished yet
touch($flag);                    // mark the script as busy
// YOUR CURL REQUEST AND PROCESSING HERE
unlink($flag);                   // mark the script as finished
?>
I would use a series of requests. A typical request takes at most 2 seconds to fulfill, so 50 requests per 300 seconds does not require parallel requests (though curl does support parallelism, as far as I remember). You still need to measure time and wait if you don't want to be banned for DoS: when you reach the request limit you must use the sleep function to wait until you can send new requests. For PHP the real problem is that this is a long-running job, so you need to change settings, otherwise it will time out. You can do it this way: Best way to manage long-running php script? As for Node.js, I think it is a much better fit for this kind of async task, because the required features come naturally with Node.js without extensions and such, though I am biased towards it.
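For illustration, a minimal sketch of such a series of requests in PHP. The https://api.example.com/records?page=N endpoint, the $jwt variable, the 'data' response key and the "stop when a short page comes back" check are all assumptions; only the 50-requests-per-3-minutes limit and the 25-records-per-page size come from the question.
<?php
// Sketch only: endpoint URL, page parameter and response shape are assumed.
set_time_limit(0);                 // long-running job: disable the execution time limit
$jwt        = 'YOUR_JWT_TOKEN';
$allRecords = [];
$page       = 1;
$sent       = 0;                   // requests sent in the current 3-minute window
$windowEnd  = time() + 180;

do {
    // Respect the rate limit: at most 50 requests per 3 minutes.
    if ($sent >= 50) {
        sleep(max(0, $windowEnd - time()));
        $sent      = 0;
        $windowEnd = time() + 180;
    }

    $curl = curl_init('https://api.example.com/records?page=' . $page);
    curl_setopt_array($curl, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => ['Authorization: Bearer ' . $jwt],
    ]);
    $response = curl_exec($curl);
    curl_close($curl);
    $sent++;

    $result = json_decode($response, true);
    $rows   = $result['data'] ?? [];              // assumed response key
    $allRecords = array_merge($allRecords, $rows);
    $page++;
} while (count($rows) === 25);                    // a short (or empty) page means we reached the end

file_put_contents('records.json', json_encode($allRecords));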
Okay. I misinterpreted what you needed. I have more questions.
Can you do one request and get your 50 records immediately? That is assuming when you said 50 requests per 3 minutes you meant 50 records.
Why do you think there is this 50/3 limitation?
Can you provide a link to this service?
Is that 50 records per IP address?
Is leasing 5 or 6 IP addresses an option?
Do you pay for each record?
How many records does this service have total?
Do the records have a time limit on their viability?
I am thinking if you can use 6 IP addresses (or 6 processes) you can run the 6 requests simultaneously using stream_socket_client().
stream_socket_client allows you to make simultaneous requests. You then create a loop that monitors each socket for a response.
About 10 years ago I made an app that evaluated web page quality. I ran:
W3C Markup Validation
W3C CSS Validation
W3C Mobile OK
WebPageTest
My own performance test.
I put all the URLs in an array like this:
$urls = array();
$path = $url;
$url = urlencode("$url");
$urls[] = array('host' => "jigsaw.w3.org",'path' => "/css-validator/validator?uri=$url&profile=css3&usermedium=all&warning=no&lang=en&output=text");
$urls[] = array('host' => "validator.w3.org",'path' => "/check?uri=$url&charset=%28detect+automatically%29&doctype=Inline&group=0&output=json");
$urls[] = array('host' => "validator.w3.org",'path' => "/check?uri=$url&charset=%28detect+automatically%29&doctype=XHTML+Basic+1.1&group=0&output=json");
Then I'd make the sockets.
foreach ($urls as $path) {
    $host = $path['host'];
    $path = $path['path'];
    $http = "GET $path HTTP/1.0\r\nHost: $host\r\n\r\n";
    $stream = stream_socket_client("$host:80", $errno, $errstr, 120, STREAM_CLIENT_ASYNC_CONNECT | STREAM_CLIENT_CONNECT);
    if ($stream) {
        $sockets[] = $stream; // supports multiple sockets
        $start[] = microtime(true);
        fwrite($stream, $http);
    }
    else {
        $err .= "$host Failed<br>\n";
    }
}
Then I monitored the sockets and retrieved the response from each socket.
$timeout = 120;       // seconds stream_select() waits before giving up
$buffer_size = 8192;  // bytes read per fread() call
while (count($sockets)) {
    $read = $sockets;
    $write = NULL;
    $except = NULL;
    stream_select($read, $write, $except, $timeout);
    if (count($read)) {
        foreach ($read as $r) {
            $id = array_search($r, $sockets);
            $data = fread($r, $buffer_size);
            if (strlen($data) == 0) {
                // echo "$id Closed: " . date('h:i:s') . "\n\n\n";
                $closed[$id] = microtime(true);
                fclose($r);
                unset($sockets[$id]);
            }
            else {
                $result[$id] .= $data;
            }
        }
    }
    else {
        // echo 'Timeout: ' . date('h:i:s') . "\n\n\n";
        break;
    }
}
I used it for years and it never failed.
It would be easy to gather the records and paginate them.
After all sockets are closed you can gather the pages and send them to your user.
Do you think the above is viable?
JS is not better.
Or did you mean 50 records each 3 minutes?
This is how I would do the pagination.
I'd organize the response into pages of 25 records per page.
In the while loop over the query results I'd do this:
$cnt = 0;
$page = 0;
while (...) {
    $cnt++;
    $response[$page][] = $record;
    if ($cnt > 24) { $page++; $cnt = 0; }
}
header('Content-Type: application/json');
echo json_encode($response);

How to keep running a query to check the database every minute in PHP and JavaScript

I am making a project which is a website. Basically it will set a reminder and notify the user by email/SMS. I am using PHP and JavaScript. My database stores the list of users in table 1 and a separate table for each user and his tasks (with the times and dates). I want to check the database every minute for due tasks, even if the user is not logged in (browser is closed). What do I do to keep the check query running all the time?
I want something that will run in the background all the time, even if the user never opens the browser.
Please help.
The PHP code that stores a task in a user's table is:
<?php
include("init.php");
session_start();
if (!empty($_POST)) {
    $date    = $_POST["date"];
    $event   = $_POST["event"];
    $time    = $_POST["time"];
    $daily   = $_POST["daily"];
    $weekly  = $_POST["weekly"];
    $monthly = $_POST["monthly"];
    $fname   = $_SESSION['fname'];
    $fname   = mysqli_real_escape_string($con, $fname);
    // Note: the other POST values should be escaped too, or better, use a prepared statement.
    $sql = "insert into $fname(fname,date,event,time,daily,weekly,monthly) values('$fname','$date','$event','$time','$daily','$weekly','$monthly')";
    if (mysqli_query($con, $sql))
        echo "<br><h3> row inserted...</h3>done";
    else
        echo "Error in insertion..." . mysqli_error($con);
}
?>
There is no issue with the code.
I just need to know how, and using what, I can check the database all the time on the server end when the user is not on the page.
Can PHP work 24 hours even if the browser is closed? Because I know JavaScript won't.
You need to create an event in MySQL (or whatever database manager you are using). Note that the MySQL event scheduler has to be enabled (SET GLOBAL event_scheduler = ON;) for events to fire. For example:
CREATE EVENT e_totals
    ON SCHEDULE AT '2006-02-10 23:59:00'
    DO INSERT INTO test.totals VALUES (NOW());
Or a recurrent event:
delimiter |
CREATE EVENT e_daily
ON SCHEDULE
EVERY 1 DAY
COMMENT 'Saves total number of sessions then clears the table each day'
DO
BEGIN
INSERT INTO site_activity.totals (time, total)
SELECT CURRENT_TIMESTAMP, COUNT(*)
FROM site_activity.sessions;
DELETE FROM site_activity.sessions;
END |
delimiter ;
Sagar, what you are looking for is a cron task. I am afraid that PHP and JavaScript alone can't trigger it.
Workflow:
Make an API endpoint containing all the business logic or processing you need to execute.
Register a cron job in cPanel, or with crontab -e on your Linux machine (an example entry is shown below).
Call the endpoint directly via AJAX, or give the cron task its own separate endpoint so it keeps running on its own.
Refer to this link in case you want to learn more about cron jobs - http://www.thegeekstuff.com/2009/06/15-practical-crontab-examples
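For example, a crontab entry for step 2 above that runs the check every minute could look like this; the PHP binary path, script path, log file and URL are placeholders:
# Option 1: run a PHP script directly every minute (adjust paths to your setup)
* * * * * /usr/bin/php /var/www/project/check_reminders.php >> /var/log/reminders.log 2>&1
# Option 2: hit the API endpoint over HTTP instead
* * * * * curl -s https://example.com/api/check_reminders.php > /dev/null 2>&1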
Thanks,
Abhishek Jain

Allow PHP script with long execution time to send updates back to the browser

I looked over a few of the questions, namely
Show progress for long running PHP script
How do you run a long PHP script and keep sending updates to the browser via HTTP?
and neither one seems to answer my question, part of which seems to be "how do I do this?" and the other half is "Hey, the way I'm doing this right now - is it the best way? Could I code this better?"
I have a simple ajax script that sends some data over to a PHP script:
$.ajax({
    type: 'POST',
    url: 'analysis.php',
    data: { reportID:reportID, type:type, value:value, filter_type:filter_type, filter_value:filter_value, year:year },
    success: function(dataReturn){
        analysis_data = JSON.parse(dataReturn);
        /* do stuff with analysis_data... */
    }
});
This PHP script takes about 3 minutes to run, as it loops through a database and runs some pretty complex queries:
<?php
session_start();
ob_start();
ini_set('max_execution_time', 180);
$breaks = [ 1000, 2000, 4000, 6000, 8000, 10000, 20000, 50000, 99999999 ];
$breaks_length = count($breaks);
$p = 0;
foreach ( $breaks as $b ) {
    $p++;
    $percentage_complete = number_format($p / $breaks_length, 2) . "%";
    $sql = "query that takes about 20 seconds to run each loop of $b....";
    $query = odbc_exec($conn, $sql);
    while (odbc_fetch_row($query)) {
        $count = odbc_result($query, 'count');
    }
    $w[] = $count;
    /* tried this... doesn't work as it screws up the AJAX handler success which expects JSON
    echo $percentage_complete;
    ob_end_flush();
    */
}
echo json_encode($w);
?>
All of this works - but what I'd really like to do is find a way, after each foreach loop, to output $percentage_complete back to the user so they can see it working, instead of just sitting there for 2 minutes with a FontAwesome icon spinning in front of them. I tried using ob_start();, but not only does it not output anything until the page is done running, the echoed value also becomes part of what is sent back to my AJAX success handler, which breaks it. (I need the output in JSON-encoded format as I use it for something else later.)
So far, from the threads I've read, my only thought is to start the $breaks array loop on the previous page: instead of looping 6 times on the same page, loop once, return an answer, then call analysis.php again using the second element of the $breaks array. But I'm not sure this is the best way to go about things.
Also - during the 3 minutes the user is waiting for this script to execute, they cannot do anything else on the page, so they just have to sit and wait. I'm sure there's a way to run this script so it doesn't "lock down" the rest of the server for the user, but everything I've searched for in Google doesn't give me a good answer, as I'm not sure exactly what to search for...
You are encountering what is known as Session Locking. Basically, PHP will not serve another request that calls session_start() for the same session until the first request has finished.
The immediate fix to your issue is to remove session_start(); from line #1 completely, because I can see that you do not need it there.
Now, for your question about showing a percentage on-screen:
analysis.php (modified)
<?php
ob_start();
ini_set('max_execution_time', 180);
$breaks = [ 1000, 2000, 4000, 6000, 8000, 10000, 20000, 50000, 99999999 ];
$breaks_length = count($breaks);
$p = 0;
foreach ( $breaks as $b ) {
    $p++;
    // Re-open the session just long enough to store the progress, then release the lock.
    session_start();
    // Multiply by 100 so the final value is "100%", which is what the JS below checks for.
    $_SESSION['percentage_complete'] = number_format($p / $breaks_length * 100) . "%";
    session_write_close();
    $sql = "query that takes about 20 seconds to run each loop of $b....";
    $query = odbc_exec($conn, $sql);
    while (odbc_fetch_row($query)) {
        $count = odbc_result($query, 'count');
    }
    $w[] = $count;
}
echo json_encode($w);
check_analysis_status.php - get your percentage with this file:
<?php
session_start();
echo (isset($_SESSION['percentage_complete']) ? $_SESSION['percentage_complete'] : '0%');
session_write_close();
Once your AJAX call to analysis.php has been made, just run this piece of JS:
// Every half second, call check_analysis_status.php and get the percentage
var percentage_checker = setInterval(function(){
    $.ajax({
        url: 'check_analysis_status.php',
        success: function(percentage){
            $('#percentage_div').html(percentage);
            // Once we've hit 100% we don't need this any more
            if (percentage === '100%') {
                clearInterval(percentage_checker);
            }
        }
    });
}, 500);
I have done this a couple of different ways, but the pattern I like best is to have three scripts (or one controller that handles all of this): analysis_create.php, analysis.php, and analysis_status.php. The key is to create a DB record that you reference in your status checks (analysis_status.php). analysis_create.php stores all the data from the POST into a DB table that also has a column for percent_complete, and returns an ID/token for the analysis. Once the front end has the ID, it posts to analysis.php and then, after a short delay (250 ms), kills the request, because you don't want to wait for it to finish. analysis.php reads the data out of the DB and starts doing the work; make sure ignore_user_abort is set properly in that script. Once the request to analysis.php has been killed, the front end starts long polling analysis_status.php with that ID. As analysis.php works through the query, it updates the corresponding DB record with the percentage complete, and analysis_status.php simply looks up that record and returns the percentage complete to the front end.
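A minimal sketch of that pattern, assuming PDO, a hypothetical analysis_jobs table with id, payload and percent_complete columns, and placeholder connection credentials:
<?php
// analysis_create.php - store the job, return an ID the front end can poll with
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');   // placeholder credentials
$stmt = $pdo->prepare('INSERT INTO analysis_jobs (payload, percent_complete) VALUES (?, 0)');
$stmt->execute([json_encode($_POST)]);
echo json_encode(['id' => $pdo->lastInsertId()]);

<?php
// analysis.php - do the heavy work; keep running even after the front end aborts the request
ignore_user_abort(true);
set_time_limit(0);
$pdo    = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt   = $pdo->prepare('SELECT payload FROM analysis_jobs WHERE id = ?');
$stmt->execute([(int) $_POST['id']]);
$params = json_decode($stmt->fetchColumn(), true);
$steps  = 9;                                      // e.g. one step per entry in $breaks
for ($p = 1; $p <= $steps; $p++) {
    // ... run the slow query for this step using $params ...
    $pdo->prepare('UPDATE analysis_jobs SET percent_complete = ? WHERE id = ?')
        ->execute([round($p / $steps * 100), (int) $_POST['id']]);
}

<?php
// analysis_status.php - the front end long-polls this with the job ID
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT percent_complete FROM analysis_jobs WHERE id = ?');
$stmt->execute([(int) $_GET['id']]);
echo $stmt->fetchColumn() . '%';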
I ran into the same issue. What caused it was different from what people are suggesting here.
The reason was that gzip was enabled, leading to a type of session locking even without an actual session.
Several ways to disable it for one specific file:
How to disable mod_deflate in apache2?
Put this in httpd.conf
SetEnvIfNoCase Request_URI getMyFile\.php$ no-gzip dont-vary
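If you can't edit httpd.conf, a rough sketch of doing the same thing from the PHP script itself, assuming Apache with mod_php (apache_setenv() only exists under that SAPI):
<?php
// Ask mod_deflate not to compress this response, and turn off PHP's own zlib compression.
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1');
}
ini_set('zlib.output_compression', 'Off');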

Repeat PHP sleep method inside a foreach loop each iteration

I have to send around 1k emails to different customers for different issues. Normally I get an excel file with all the information necessary to send the emails. I have a web form where I insert the ticket number, and that retrieves the information needed to send the mail (which is also provided in the excel file). The problem is that inserting 1k ticket numbers into the form is exhausting and time-consuming work. So I copied the link that is generated to send the emails and created 1k links with the specific variables needed to send 1k different emails. Now all I have to do is write a PHP function to open all the links and the job is done. However, the mail server does not allow more than 20 emails to be sent from the same IP at once; it marks the emails as spam and blocks the IP. I tried a foreach loop with PHP's sleep function inside it, but it is not working: the function sleeps for the given amount of time and then opens all the links at once. I want to state that the function will be run from my laptop and will not be uploaded to any server.
Below is the function I currently have:
$emails = ["http://www.facebook.com","http://www.tuttojuve.com","http://www.google.com"];
// testing with these links instead of the email links
foreach ($emails as $key => $email) {
    $mail = "<script type='text/javascript' language='Javascript'>window.open('".$email."','_blank');</script>";
    sleep(5);
    echo $mail;
}
Any help or hint is appreciated,
Thanks in advance
$emails = ["http://www.facebook.com","http://www.tuttojuve.com","http://www.google.com"];
?>
<script>
var linksToOpen = <?php echo json_encode($emails); ?>;
var currentLink = 0;
var opener = setInterval(function(){
    // stop once every link has been opened
    if (currentLink >= linksToOpen.length) {
        clearInterval(opener);
        return;
    }
    window.open(linksToOpen[currentLink++], '_blank');
    /* the browser popup blocker may block this window, so add the site to the allowed list */
}, 5000); /* every x milliseconds */
</script>
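An alternative sketch that stays entirely in PHP (run from the command line on your laptop): instead of opening browser tabs, request each generated link server-side with curl and pause between batches. The batch size of 20 matches the mail-server limit mentioned in the question; the 60-second pause and the use of curl here are assumptions.
<?php
// Hypothetical list of the generated send-mail links.
$links = ["http://www.facebook.com", "http://www.tuttojuve.com", "http://www.google.com"];

set_time_limit(0);                              // 1k requests with pauses takes a while

foreach (array_chunk($links, 20) as $batch) {   // at most 20 sends per burst
    foreach ($batch as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_exec($ch);                         // triggers the send, discards the response
        curl_close($ch);
    }
    sleep(60);                                  // wait before the next batch to avoid the spam block
}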

How to use typeahead.js with a large database

I have a large database of 10,000 addresses and 5,000 people.
I want to let users search the database for either an address or a user. I'd like to use Twitter's typeahead to suggest results as they enter text.
See the NBA example here: http://twitter.github.io/typeahead.js/examples.
I understand that prefetching 15,000 items would not be optimal from a speed and load standpoint. What would be a better way to try and achieve this?
Since no one has answered, I will go ahead with my suggestion.
I think the best fit for your big database is using remote with typeahead.js. Quick example:
$('#user-search').typeahead({
    name: 'user-search',
    remote: '/search.php?query=%QUERY' // you can change anything but %QUERY
});
What it does: when you type characters into input#user-search, it sends an AJAX request to the page search.php with the content of the input as the query.
On search.php you can catch this query and look it up in your DB:
$query = $_GET['query'] . '%'; // add % for the LIKE query below

// do the query
$stmt = $dbh->prepare('SELECT username FROM users WHERE username LIKE :query');
$stmt->bindParam(':query', $query, PDO::PARAM_STR);
$stmt->execute();

// populate the results
$results = array();
foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $row) {
    $results[] = $row;
}

// and return them to typeahead
echo json_encode($results);
Of course, since your DB is quite big, you should optimize your SQL query so it runs faster, maybe cache the results, etc.
On the typeahead side, to reduce the number of queries hitting the DB, you can specify minLength or limit:
$('#user-search').typeahead({
    name: 'user-search',
    remote: '/search.php?query=%QUERY',
    minLength: 3, // send the AJAX request only after the user has typed at least 3 characters
    limit: 10     // show only 10 results
});
So it doesn't really matter how big your DB is; this approach should work nicely.
This is an example in PHP, but of course it would be the same for whatever backend you have. Hope you get the basic idea.
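For the caching idea mentioned above, a minimal sketch assuming the APCu extension is available (the cache key prefix and the 5-minute TTL are arbitrary choices):
// inside search.php, before hitting the DB
$cacheKey = 'typeahead_' . strtolower($_GET['query']);
$results  = apcu_fetch($cacheKey, $hit);
if (!$hit) {
    // ... run the PDO query shown above and fill $results ...
    apcu_store($cacheKey, $results, 300);   // keep the result for 5 minutes
}
echo json_encode($results);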
