repeat php sleep method inside a foreach loop each iteration - javascript

I have to send around 1k emails to different customers for different issues. Normally I get an Excel file with all the information necessary to send the emails. I have a web form where I insert the ticket number, and that retrieves the information needed to send the mail (which is also provided in the Excel file). The problem is that inserting 1k ticket numbers into the form is exhausting and time-consuming work. So I copied the link that is generated to send the emails and created 1k links with the specific variables needed to send 1k different emails. Now all I have to do is write a PHP function to open all the links and the job is done. However, the mail server does not allow more than 20 emails to be sent from the same IP at once; it marks the emails as spam and blocks the IP. I tried a foreach loop with PHP's sleep function inside it, and it is not working: the function sleeps for the given amount of time and then opens all the links at once. I want to state that the function will be run from my laptop and will not be uploaded to any server whatsoever.
Below is the function I currently have:
$emails = ["http://www.facebook.com","http://www.tuttojuve.com","http://www.google.com"];
// testing with these links instead of the email links
foreach ($emails as $key => $email) {
    $mail = "<script type='text/javascript' language='Javascript'>window.open('" . $email . "','_blank');</script>";
    sleep(5);
    echo $mail;
}
Any help or hint is appreciated,
Thanks in advance

PHP's sleep() runs on the server before any output reaches the browser, so all of the window.open calls arrive in one page and fire together. The delay has to happen client-side, in JavaScript:

<?php
$emails = ["http://www.facebook.com","http://www.tuttojuve.com","http://www.google.com"];
?>
<script>
var linksToOpen = <?php echo json_encode($emails); ?>;
var currentLink = 0;
var timer = setInterval(function () {
    /* the browser's popup blocker may block these windows, so add this page's URL to the allowed list */
    window.open(linksToOpen[currentLink++], '_blank');
    /* stop once every link has been opened */
    if (currentLink >= linksToOpen.length) clearInterval(timer);
}, 5000); /* every 5000 milliseconds */
</script>

Related

Get large data from API with pagination

I'm trying to GET a large amount of data from an API (over 300k records). It is paginated (25 records per page) and the request limit is 50 requests per 3 minutes. I'm using PHP curl to get the data. The API requires JWT token authorization. I can get a single page and put its records into an array.
...
$response = curl_exec($curl);
curl_close($curl);
$result = json_decode($response, true);
The problem is I need to get all records from all pages and save them into an array or a file. How do I do it? Maybe I should use JS to do it better?
Best regards and thank you.
Ideally use cron and some form of storage, database or a file.
It is important to ensure that a new call to the script doesn't start unless the previous one has finished; otherwise calls start stacking up, and after a few you will get server overload and failed scripts, and it gets messy.
Store a value to say the script is starting.
Run the CURL request.
Once curl has been returned and data is processed and stored change the value you stored at the beginning to say the script has finished.
Run this script as a cron in the intervals you deem necessary.
A simplified example, using a lock file so that the "busy" flag actually persists between runs (a plain variable would reset every time the script starts):
<?php
$lock = '/tmp/api_import.lock';
if (file_exists($lock)) exit(); // previous run is still busy
touch($lock); // mark the script as busy

// YOUR CURL REQUEST AND PROCESSING HERE

unlink($lock); // mark the script as finished
?>
I would use a series of requests. A typical request takes at most 2 seconds to fulfill, so 50 requests per 300 seconds does not require parallel requests. Still, you need to measure time and wait if you don't want to be banned for DoS. (Curl does support parallelism, as far as I remember, if you ever need it.) When you reach the request limit you must use the sleep function to wait until you can send new requests.

For PHP the real problem is that this is a long-running job, so you need to change settings, otherwise it will time out. You can do it this way: Best way to manage long-running php script?

As for Node.js, I think it is a much better fit for this kind of async task, because the required features come naturally with Node.js without extensions and such things, though I am biased towards it.
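A sketch of that pacing loop in PHP; the endpoint URL, the page parameter, and the data key in the response are assumptions you would adjust to the real API:

<?php
set_time_limit(0); // long-running job: disable PHP's execution timeout

$token  = 'YOUR_JWT_TOKEN';
$all    = array();
$page   = 1;
$sent   = 0;
$window = microtime(true);

do {
    // Throttle: after 50 requests, wait out the rest of the 3-minute window.
    if ($sent === 50) {
        $elapsed = microtime(true) - $window;
        if ($elapsed < 180) {
            sleep((int) ceil(180 - $elapsed));
        }
        $sent = 0;
        $window = microtime(true);
    }

    $curl = curl_init("https://api.example.com/records?page=" . $page);
    curl_setopt_array($curl, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => array("Authorization: Bearer " . $token),
    ));
    $response = curl_exec($curl);
    curl_close($curl);
    $sent++;

    $result  = json_decode($response, true);
    $records = isset($result['data']) ? $result['data'] : array();
    $all     = array_merge($all, $records);
    $page++;
} while (count($records) === 25); // a full page suggests more pages remain

// Save everything once at the end.
file_put_contents('records.json', json_encode($all));
?>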
Okay. I misinterpreted what you needed. I have more questions.
Can you do one request and get your 50 records immediately? That is assuming when you said 50 requests per 3 minutes you meant 50 records.
Why do you think there is this 50/3 limitation?
Can you provide a link to this service?
Is that 50 records per IP address?
Is leasing 5 or 6 IP addresses an option?
Do you pay for each record?
How many records does this service have total?
Do the records have a time limit on their viability?
I am thinking if you can use 6 IP addresses (or 6 processes) you can run the 6 requests simultaneously using stream_socket_client().
stream_socket_client allows you to make simultaneous requests. You then create a loop that monitors each socket for a response.
About 10 years ago I made an app that evaluated web page quality. I ran
W3C Markup Validation
W3C CSS Validation
W3C Mobile OK
WebPageTest
My own performance test.
I put all the URLs in an array like this:
$urls = array();
$path = $url;
$url = urlencode("$url");
$urls[] = array('host' => "jigsaw.w3.org",'path' => "/css-validator/validator?uri=$url&profile=css3&usermedium=all&warning=no&lang=en&output=text");
$urls[] = array('host' => "validator.w3.org",'path' => "/check?uri=$url&charset=%28detect+automatically%29&doctype=Inline&group=0&output=json");
$urls[] = array('host' => "validator.w3.org",'path' => "/check?uri=$url&charset=%28detect+automatically%29&doctype=XHTML+Basic+1.1&group=0&output=json");
Then I'd make the sockets.
foreach ($urls as $path) {
    $host = $path['host'];
    $path = $path['path'];
    $http = "GET $path HTTP/1.0\r\nHost: $host\r\n\r\n";
    $stream = stream_socket_client("$host:80", $errno, $errstr, 120, STREAM_CLIENT_ASYNC_CONNECT | STREAM_CLIENT_CONNECT);
    if ($stream) {
        $sockets[] = $stream; // supports multiple sockets
        $start[] = microtime(true);
        fwrite($stream, $http);
    } else {
        $err .= "$host Failed<br>\n"; // $id is not assigned yet at this point, so report the host
    }
}
Then I monitored the sockets and retrieved the response from each socket.
$timeout = 120;      // seconds for stream_select()
$buffer_size = 8192; // bytes to read per fread()
while (count($sockets)) {
    $read = $sockets;
    $write = NULL;  // stream_select() takes these by reference,
    $except = NULL; // so they must be real variables
    stream_select($read, $write, $except, $timeout);
    if (count($read)) {
        foreach ($read as $r) {
            $id = array_search($r, $sockets);
            $data = fread($r, $buffer_size);
            if (strlen($data) == 0) {
                // echo "$id Closed: " . date('h:i:s') . "\n\n\n";
                $closed[$id] = microtime(true);
                fclose($r);
                unset($sockets[$id]);
            } else {
                $result[$id] .= $data;
            }
        }
    } else {
        // echo 'Timeout: ' . date('h:i:s') . "\n\n\n";
        break;
    }
}
I used it for years and it never failed.
It would be easy to gather the records and paginate them.
After all sockets are closed you can gather the pages and send them to your user.
Do you think the above is viable?
JS is not better.
Or did you mean 50 records each 3 minutes?
This is how I would do the pagination.
I'd organize the response into pages of 25 records per page.
In the query results while loop I'd do this:
$cnt = 0;
$page = 0;
while (...) {
    $cnt++;
    $response[$page][] = $record;
    if ($cnt > 24) { $page++; $cnt = 0; }
}
header('Content-Type: application/json');
echo json_encode($response);

PHP script is automatically being invoked multiple times

I am facing a strange issue here.
I am using JavaScript AJAX (jQuery). The scenario is:
One AJAX call invokes a PHP script, which is a long-running process that sets some session variables.
Then, at intervals (say every 2 seconds), I run another AJAX call to check the session variables and find out when the first script has finished.
The first PHP script fetches data from the database and writes it into a file. On each fetch I count the loop number and store it in a session variable to keep some kind of tracking record, like:
$i = 0;
$_SESSION['time'] = date('m-d-Y H:i:s');
while (...)
{
    ini_set('session.use_only_cookies', false);
    ini_set('session.use_cookies', false);
    ini_set('session.use_trans_sid', false);
    ini_set('session.cache_limiter', null);
    session_start();
    $_SESSION['trackstatus'] = "loop number : " . $i . " time is : " . $_SESSION['time'];
    session_write_close();
    $i++;
    ......
    ......
}
Another PHP script, which I invoke via the setInterval AJAX call, just does:
session_start();
echo $_SESSION['trackstatus'];
The setInterval AJAX returns me something like:
loop number 1 time is m-d-Y H:i:s
loop number 5 time is m-d-Y H:i:s
loop number 8 time is m-d-Y H:i:s
......
Then after a few calls, again:
loop number 1 time is m-d-Y H1:i1:s1
.....
Notice the change of H:i:s to H1:i1:s1
So as per my understanding the PHP script is being invoked twice. For your information, the same code was working maybe 12 hours earlier. I have faced this issue before and somehow solved it by trial and error, so I don't know how; maybe it fixed itself. Honestly, I have no clue.
Can you please give me some insight into what I am doing wrong?
Please mention if you need more information.
And the funny thing is that it started working as expected just after I asked this question, without my changing a single line of code. But I want to know the reason.
I think I know the reason: PHP writes session variables to a file, but it does so only at the end of script execution, so you can't see the session changes in another script before the long one ends.
You can fix it by adding session_write_close(); session_start(); after each change of the session data.
session_write_close() will write the changes to disk, so another script can read them.
session_start() will reload the session from disk, but make sure your other script makes no changes to the session; those changes would be overwritten by your long script.
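In code, the pattern inside the long-running loop looks like this (a minimal sketch):

$_SESSION['trackstatus'] = "loop number : " . $i;
session_write_close(); // flush the session to disk so the polling script can read it
session_start();       // re-open (and re-lock) the session before the next update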
And one more thing if you are using separate domains:
Before the actual AJAX call happens, your browser sends an OPTIONS request to the same URL to check the CORS headers. So at the start of your script, check the HTTP method, and if it is HEAD or OPTIONS, call die();
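A minimal sketch of that guard, placed at the very top of the long-running script:

// Bail out early on CORS preflight (OPTIONS) and HEAD probes
// so they never start the long-running job a second time.
if (in_array($_SERVER['REQUEST_METHOD'], array('OPTIONS', 'HEAD'))) {
    die();
}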
Instead of using sessions, try using a temp file to keep count with a dynamic ID
Javascript
var time = Date.now();
$.get('/firstURL?time=' + time);
setInterval(function () {
    $.get('/secondURL?time=' + time, function (response) {
        console.log(response);
    });
}, 1000);
PHP 1st URL
<?php
$id = $_GET['time'];
$count = 0;
while (...) {
    // Do your stuff
    $count++;
    file_put_contents("/tmp/{$id}", $count);
}
?>
PHP 2nd URL
<?php
$id = $_GET['time'];
$count = 0;
// file_get_contents() raises a warning rather than throwing an exception,
// so test for the file instead of relying on try/catch.
if (file_exists("/tmp/{$id}")) {
    $count = file_get_contents("/tmp/{$id}");
}
echo $count;
?>
As others have said, PHP does not write the session until execution has finished. You are better off creating a PHP function that writes the progress to a file; your second AJAX call then just reads that file.
function updateCreateProgress($jobStartTime, $progress) {
    file_put_contents('/tmp/' . $jobStartTime . '.txt', $progress);
}

function completeProgress($jobStartTime) {
    unlink('/tmp/' . $jobStartTime . '.txt');
}
Now your second script can check for '/tmp/' . $jobStartTime . '.txt': if it's there, read it using file_get_contents(); if it's not there, report back that the job has finished, as sketched below.
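A minimal sketch of that second script (the jobStartTime request parameter is an assumption):

<?php
// Progress checker: the long job writes its progress to this file
// and completeProgress() deletes it when done.
$file = '/tmp/' . basename($_GET['jobStartTime']) . '.txt';
if (file_exists($file)) {
    echo file_get_contents($file); // still running: report current progress
} else {
    echo 'finished'; // file removed: the job is done
}
?>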
Try adjusting to something like this:
$i = 0;
ini_set('session.use_only_cookies', false);
ini_set('session.use_cookies', false);
ini_set('session.use_trans_sid', false);
ini_set('session.cache_limiter', null);
session_start();
$_SESSION['time'] = date('m-d-Y H:i:s');
while (...)
{
    $_SESSION['trackstatus'] = "loop number : " . $i . " time is : " . $_SESSION['time'];
    session_write_close();
    session_start();
    $i++;
    ......
    ......
}
You started talking about $_SESSION before calling session_start();
If you call AJAX with the GET method, you must set the "cache: false" option.
Yes, you must protect your PHP script from other requests, with a unique key (GET parameter) or the session.
PHP locks session data for a single call and releases it only when that call ends. Using session_write_close() while the script is still working is bad practice: maybe you want to save something more into the session after the loop, but other requests may read and overwrite that data in the meantime.
A flexible and clear solution:
1) script1.php - invoked from AJAX to start the long job.
2) script2.php (the long job) - run directly from script1.php in the background without waiting, or add a new cron job (insert into a table) and run script2.php from cron (checking for jobs every second or at some other interval).
3) script3.php - checks the job status (AJAX).
For "communication" between script2.php and script3.php you can use a database, or a special file with flock(), clearstatcache() and flush(), as sketched below.

How to keep running a query to check database all the time every minute in PHP and JavaScript

I am making a project which is a website. Basically it sets a reminder and notifies the user by email/SMS. I am using PHP and JavaScript. My database stores the list of users in one table and has a separate table for each user's tasks (with times and dates). I want to query the database every minute to check for due tasks, even if the user is not logged in (browser closed). How do I keep that check running all the time?
I want something that runs in the background all the time, even if the user never opens the browser.
Please help.
The PHP code that stores a task into a user's table is:
<?php
include("init.php");
session_start();
if (!empty($_POST)) // isset($_POST) is always true, so check for actual data
{
    $date = $_POST["date"];
    $event = $_POST["event"];
    $time = $_POST["time"];
    $daily = $_POST["daily"];
    $weekly = $_POST["weekly"];
    $monthly = $_POST["monthly"];
    $fname = $_SESSION['fname'];
    $fname = mysqli_real_escape_string($con, $fname); // was mysql_real_escape_string(), which doesn't mix with mysqli
    $sql = "insert into $fname(fname,date,event,time,daily,weekly,monthly) values('$fname','$date','$event','$time','$daily','$weekly','$monthly')";
    if (mysqli_query($con, $sql)) // a single statement doesn't need mysqli_multi_query()
        echo "<br><h3> row inserted...</h3>done";
    else
        echo "Error in insertion..." . mysqli_error($con);
}
?>
There is no issue with the code itself.
I just need to know how, and using what, I can query the database all the time on the server side, when the user is not on the page.
Can PHP work 24 hours a day even when the browser is closed? I know JavaScript won't.
You need to create an event in MySQL (or whichever database manager you are using). For example:
CREATE EVENT e_totals
    ON SCHEDULE AT '2006-02-10 23:59:00'
    DO INSERT INTO test.totals VALUES (NOW());
Or a recurrent event:
delimiter |
CREATE EVENT e_daily
    ON SCHEDULE
        EVERY 1 DAY
    COMMENT 'Saves total number of sessions then clears the table each day'
    DO
        BEGIN
            INSERT INTO site_activity.totals (time, total)
            SELECT CURRENT_TIMESTAMP, COUNT(*)
            FROM site_activity.sessions;
            DELETE FROM site_activity.sessions;
        END |
delimiter ;
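Note that the MySQL event scheduler is disabled by default in many MySQL versions, so these events will not fire until it is switched on:

SET GLOBAL event_scheduler = ON;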
Sagar, what you are looking for is a cron task; I am afraid PHP and JavaScript alone can't do it.
Workflow:
Make an API endpoint containing all the business logic or processing you need to execute.
Register a cron job in cPanel, or with crontab -e on your Linux machine (see the example below).
The cron task keeps working on its own; you can call the same endpoint directly with AJAX, or make a separate endpoint for the AJAX side.
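For example, a crontab entry that runs a PHP reminder check every minute could look like this (the script path and log file are illustrative):

# minute hour day-of-month month day-of-week command
* * * * * /usr/bin/php /var/www/html/check_reminders.php >> /var/log/reminders.log 2>&1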
Refer to this link in case you want to learn more about cron jobs - http://www.thegeekstuff.com/2009/06/15-practical-crontab-examples
Thanks,
Abhishek Jain

Insert into MySQL database when user clicks on a Link

I am creating a website that has users log in and select a pdf document that they want to download. When they open up the document to view and possibly download, I want data to be logged into a database at the same time.
The code that sends the data to the database works (except for: Undefined index: learningMaterial). But when I want the PDF document to open and, at the same time, log the user and other data, all that happens is the document opens up.
Any advice would be appreciated, even for overall better methods of going about what I'm trying to achieve here. Still inexperienced with PHP.
See code below.
HTML
<form name="myform" method='post' action="../includes/writeStats.php">
<input type='hidden' name='learningMaterial' id='learningMaterial' value='learningMaterial'>
<a href='../documents/test.pdf' id='mylink' class='courses' name='Driver Training'> Driver Training </a>
</form>
JS - In header
<script type="text/javascript">
function submitform(){
document.myform.submit(); }
var form = document.getElementById("myform");
document.getElementById("mylink").addEventListener("click", function () {
submitform();
});
</script>
PHP
<?php
$con = mysqli_connect("localhost", "root", "password", "qmptest");
// Check connection
if (mysqli_connect_errno()) {
    echo "Failed to connect to MySQL: " . mysqli_connect_error();
}
// Get latest log nr
$result = mysqli_query($con, "SELECT * FROM logbook ORDER BY log DESC LIMIT 1");
while ($row = mysqli_fetch_array($result)) {
    $log = $row['log'] + 1;
    // If statement to check if log is 0 (first entry) to go here
}
$date = date("Y/m/d");
session_start(); // Start a new session
$person = $_SESSION['currentUser'];
// Not sure if this is correct along with my HTML input
$material = mysqli_real_escape_string($con, $_POST['learningMaterial']);
// Insert into database
$sql = "INSERT INTO logbook (log, date, person, learningMaterial)
        VALUES ('$log', '$date', '$person', '$material')";
if (!mysqli_query($con, $sql)) {
    die('Error: ' . mysqli_error($con));
}
mysqli_close($con);
?>
As written, clicking the link overrides the form submission: the file opens and the form never goes through.
Instead, you could try either opening the file in a new window by adding target="_blank" to the anchor tag, or sending the file's URL through to the PHP script, executing the database code, and then redirecting at the end:
header("Location: http://yourdomain.com/yourfile.pdf");
Your file is just a normal file being returned by your web server:
<a href='../documents/test.pdf' ...
So while you may be able to suggest to users or browsers that they should invoke some code before downloading this file, you can't actually require it. Any user can just request the file directly. And since PDF files don't execute PHP code (thankfully), your server-side PHP code has no way of knowing that the file has been requested.
What you can do is obscure the file itself behind a PHP request. You can create something like a download.php page which accepts the name of a file (test.pdf) and returns that file.
Be very careful when doing this. Don't just allow users to request any file and blindly return whatever they request. A user can request something like "../../../../../../../../../../etc/passwd" and if your code just builds a path and returns the file then you've just given users a sensitive file. It's best practice to keep a finite known list of identified files (perhaps in a database table) and let users request by the identifier rather than by the file path itself. That way the actual path is only ever known server-side in data that you control.
The main point here, however, is that by using such a page you inject some PHP code in between the user and the file. In order to get the file, the user needs to make a request to a PHP page. On that page you can record the act of the user having requested the file. (As well as perform authorization checks to validate that the user is allowed to view the file, etc.)
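A minimal sketch of such a download.php, assuming a documents table that maps IDs to server-side paths (the table, its columns, and the downloads log here are illustrative, not the asker's exact schema):

<?php
// download.php?id=42 — log the request, then stream the file.
session_start();
$con = mysqli_connect("localhost", "root", "password", "qmptest");

// Look files up by identifier, never by a user-supplied path.
$id = (int) $_GET['id'];
$stmt = mysqli_prepare($con, "SELECT path FROM documents WHERE id = ?");
mysqli_stmt_bind_param($stmt, "i", $id);
mysqli_stmt_execute($stmt);
mysqli_stmt_bind_result($stmt, $path);
if (!mysqli_stmt_fetch($stmt)) {
    http_response_code(404);
    exit('Unknown document');
}
mysqli_stmt_close($stmt);

// Record the download before returning the file.
$person = $_SESSION['currentUser'];
$log = mysqli_prepare($con, "INSERT INTO downloads (date, person, document_id) VALUES (NOW(), ?, ?)");
mysqli_stmt_bind_param($log, "si", $person, $id);
mysqli_stmt_execute($log);

// Stream the PDF to the browser.
header('Content-Type: application/pdf');
readfile($path);
?>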
Never assume client-side code is going to do what you expect it to do. If you want to ensure something happens for anything approaching security or auditing purposes, it needs to happen in server-side code.

PHP counter disappears/reappears

I downloaded a script to run a very basic counter on two of my website's pages. Since April 2009 it has run beautifully, but in the last three weeks it would suddenly disappear, then reappear occasionally. This week it's every day. At first only the counter disappeared; now the pages with the counters don't load anything except the banner. The page will load eventually, sometimes taking up to five minutes, but without the counter showing. That comes ages later. Then it can all disappear again!
http://www.thepenvro.com/ is the home page. If you click on "NEWS", then on "Social Events News", that's the other page that has a counter. (We are trying to see who is interested in the reunion info.) The pages are erratic: they are either OK, or they are there but missing the counter in the lower left of each of the two pages, or the pages only show the headers with no page content OR counter. All in no particular order.
I have gone into the server side of my site and reset the scripting (I was told to do that by the Streamline.net tech). It doesn't seem to help, except now and then, and I wonder if that's just coincidence.
It affects another script too. I have a form-to-email script that works great, but when the counter disappears it brings down the form-to-email function on the Contacts page as well. I put a note at the bottom of the form telling visitors to just send an email when they get the error message. The full error message, when you can manage to get SUBMIT to even change screens, is:
FastCGI Error
The FastCGI Handler was unable to process the request.
Error Details:
The FastCGI pool queue is full
Error Number: 4 (0x80070004).
Error Description: The system cannot open the file.
HTTP Error 500 - Server Error.
Internet Information Services (IIS)
Streamline asks me to replicate the error... I can't! I can only give them what I am posting here, plus screenshots. So I don't have a clue whether it's my script or them. The script for the counter is below; it was something I purchased. I first thought maybe IE8 was causing the trouble, but the same problem shows in Firefox.
One last note: it's not the form-to-email that's the problem, as I also run it in one of the site's sub-domains and there is NO trouble there. But I don't have the counter running anywhere on the sub-domain either. The main domain and sub-domain otherwise have all the same features.
Thank you for any help... I am a complete novice, so any solutions will be gratefully received. We are doing the publicity for our reunion in May and I have a big email campaign to get out after Christmas, and I don't want the site all buggered up. If there is an alternative counter, or if the PHP version I have is too old, I am happy to purchase a better one from a reputable source.
<?php
/*******************************************************************************
* Title: PHP hit counter (PHPcount)
* Version: 1.2 # October 26, 2007
* Author: Klemen Stirn
* Website: http://www.phpjunkyard.com
********************************************************************************
* COPYRIGHT NOTICE
* Copyright 2004-2007 Klemen Stirn. All Rights Reserved.
*******************************************************************************/
// SETUP YOUR COUNTER
// Detailed information found in the readme.htm file
// Count UNIQUE visitors ONLY? 1 = YES, 0 = NO
$count_unique = 1;
// Number of hours a visitor is considered as "unique"
$unique_hours = 1;
// Minimum number of digits shown (zero-padding). Set to 0 to disable.
$min_digits = 0;
#############################
# DO NOT EDIT BELOW #
#############################
/* Turn error notices off */
error_reporting(E_ALL ^ E_NOTICE);
/* Get page and log file names */
$page = input($_GET['page']) or die('ERROR: Missing page ID');
$logfile = 'logs/' . $page . '.txt';
/* Does the log exist? */
if (file_exists($logfile)) {
    /* Get current count */
    $count = trim(file_get_contents($logfile)) or $count = 0;
    if ($count_unique == 0 || $_COOKIE['counter_unique'] != $page) {
        /* Increase the count by 1 */
        $count = $count + 1;
        $fp = @fopen($logfile, 'w+') or die('ERROR: Can\'t write to the log file (' . $logfile . '), please make sure this file exists and is CHMOD to 666 (rw-rw-rw-)!');
        flock($fp, LOCK_EX);
        fputs($fp, $count);
        flock($fp, LOCK_UN);
        fclose($fp);
        /* Print the Cookie and P3P compact privacy policy */
        header('P3P: CP="NOI NID"');
        setcookie('counter_unique', $page, time() + 60 * 60 * $unique_hours);
    }
    /* Is zero-padding enabled? */
    if ($min_digits > 0) {
        $count = sprintf('%0' . $min_digits . 's', $count);
    }
    /* Print out Javascript code and exit */
    echo 'document.write(\'' . $count . '\');';
    exit();
} else {
    die('ERROR: Invalid log file!');
}
/* This function handles input parameters, making sure nothing dangerous is passed in */
function input($in) {
    $out = htmlentities(stripslashes($in));
    $out = str_replace(array('/', '\\'), '', $out);
    return $out;
}
?>
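For reference, a counter script like this is meant to be embedded with a script tag pointing at the PHP file, which is why it prints a document.write() call; something like (the filename and page ID are assumptions):

<script type="text/javascript" src="counter.php?page=home"></script>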
This has nothing to do with the PHP code, but with the configuration of the webserver. It probably gets hit too many times per second to be able to process all requests.
Try looking at the following settings from IIS (an example of where they live follows the list):
instanceMaxRequests
maxInstances
queueLength
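On IIS 7 and later, these settings sit on the <application> element of the <fastCgi> section in applicationHost.config. A sketch, with an illustrative php-cgi path and values to tune rather than copy:

<fastCgi>
    <application fullPath="C:\PHP\php-cgi.exe"
                 maxInstances="4"
                 instanceMaxRequests="10000"
                 queueLength="1000" />
</fastCgi>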
If you visit the counter directly you can see this error message:
<h1>FastCGI Error</h1>
The FastCGI Handler was unable to process the request.
<hr>
<p>Error Details:</p>
<ul>
<li>The FastCGI pool queue is full</li>
<li>Error Number: 4 (0x80070004).</li>
<li>Error Description: The system cannot open the file.
</li>
</ul>
<h2>HTTP Error 500 - Server Error.<br>Internet Information Services (IIS)</h2>
I'd say it's either what Tomh says (it gets too many hits, so while one request is reading from the file another tries to open it and fails), or it simply cannot open the file because of a permission problem.
A lot of people have experienced the same problem while using streamline.net, myself included. I currently have a site with them that is down about 50% of the day, every day of the week, with that error.
My recommendation: change to a new provider.
Streamline.net won't do a thing to help you and will merely fob you off with vague/inaccurate answers. I'm just waiting for my next paycheque, then I'm going to buy hosting with someone else.
