<?php
if (isset($_POST["submit"])) {
    $adm = $_POST["admno"];
    $phn = $_POST["phn1"];

    include("model.php");
    $db = new database;
    $r  = $db->register($adm);

    while ($row = mysql_fetch_array($r)) {
        if ($row["phn_no1"] == $phn || $row["phn_no2"] == $phn || $row["phn_no3"] == $phn) {
            // build the password from the last digits of the phone number plus the admission number
            $formatted = substr($phn, 6, 10) . " ";
            $password  = $formatted . $adm;   // string concatenation, not arithmetic
            echo $password;
            $db->setpassword($adm, $password);

            $pre = 'PREFIX';
            $suf = '%20ThankYou';
            $sms = $pre . $password . $suf;

            session_start();

            // send the SMS through the gateway before redirecting
            $ch = curl_init("http://www.perfectbulksms.in/Sendsmsapi.aspx?USERID=ID&PASSWORD=PASS&SENDERID=SID&TO=$phn&MESSAGE=$sms");
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_AUTOREFERER, true);
            $result = curl_exec($ch);
            curl_close($ch);

            header("Location: password.php?msg=new");
            exit;   // stop the script once the redirect has been issued
        } else {
            header("Location: register.php?msg=invalid");
        }
    }
}
?>
This code works perfectly on my localhost, but when I put it on the server it takes a lot of time and the cURL call does not work: it only redirects to the next page. I checked that cURL is enabled. If I use only the SMS API without the cURL call, the SMS is sent immediately, but I want to run both the header redirect and keep my SMS API hidden. Is there any alternative to this?
Check whether a simple wget or curl from the server to the SMS API works:
bash~/$ wget "http://www.perfectbulksms.in/Sendsmsapi.aspx?USERID=ID&PASSWORD=PASS&SENDERID=SID&TO=$phn&MESSAGE=$sms"
bash~/$ curl "http://www.perfectbulksms.in/Sendsmsapi.aspx?USERID=ID&PASSWORD=PASS&SENDERID=SID&TO=$phn&MESSAGE=$sms"
If wget or curl works, then something is wrong with your code.
If wget or curl does not work from the server, then port 80 might be blocked for outgoing traffic by your host or ISP. Check with them about this.
You can also try
telnet www.perfectbulksms.in 80
and see whether it connects or not.
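If wget/curl from the shell works but the PHP page still hangs, it helps to surface the actual cURL error inside PHP instead of redirecting silently. A minimal diagnostic sketch; the USERID/PASSWORD/SENDERID values are the same placeholders as in the question, and $phn/$sms are hypothetical test values:
<?php
// Hypothetical test values - replace with a real recipient and message.
$phn = "TEST_NUMBER";
$sms = "test";

$ch = curl_init("http://www.perfectbulksms.in/Sendsmsapi.aspx?USERID=ID&PASSWORD=PASS&SENDERID=SID&TO=$phn&MESSAGE=$sms");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);   // fail fast if the connection cannot be made
curl_setopt($ch, CURLOPT_TIMEOUT, 10);         // cap the whole request

$result = curl_exec($ch);

if ($result === false) {
    // curl_errno()/curl_error() reveal DNS failures, blocked outgoing ports, timeouts, etc.
    echo "cURL error (" . curl_errno($ch) . "): " . curl_error($ch);
} else {
    echo "Gateway responded: " . $result;
}

curl_close($ch);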
Related
I am trying to scrape information from a couple of sites (mega.nz, openload.co). The content is loaded dynamically, so the code I am actually using doesn't work:
<?php
require 'simple_html_dom.php';

// fetch the page HTML
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://openload.co/f/41I9Ak_QBxw/DPLA.mp4");
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$response = curl_exec($ch);
curl_close($ch);
echo $response;

// parse the response and look for the image element
$html = new simple_html_dom();
$html->load($response);
foreach ($html->find('img[id=imagedisplay]') as $key) {
    echo $key;
}
?>
When I use it on openload (like the example above) it redirects me to "https://oload.download/scraping/", "/scraping" being the folder where my script lives.
Is there any JavaScript/jQuery framework (or PHP library) that I can use to scrape the content on the fly?
It's not suitable for a large amount of scraping, but in the past when I've needed to grab some basic data from a dynamic web page I've found that Selenium works pretty well.
Depending on your stack of choice, I'd recommend looking into headless browsers. This way you can render a page in the background and parse the resulting HTML.
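For PHP specifically, a headless-browser approach could look roughly like the sketch below. It assumes the symfony/panther package plus a local Chrome/ChromeDriver are installed, and reuses the #imagedisplay selector from the question; treat it as a starting point rather than a drop-in solution.
<?php
require __DIR__ . '/vendor/autoload.php';

use Symfony\Component\Panther\Client;

// Start a headless Chrome instance controlled from PHP.
$client = Client::createChromeClient();

// Load the page; the page's JavaScript runs just like in a normal browser.
$client->request('GET', 'https://openload.co/f/41I9Ak_QBxw/DPLA.mp4');

// Wait until the dynamically inserted element exists in the DOM.
$crawler = $client->waitFor('#imagedisplay');

// Now the rendered HTML can be queried like any static document.
echo $crawler->filter('#imagedisplay')->attr('src');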
This question already has answers here:
PHP file_get_contents() returns "failed to open stream: HTTP request failed!" (16 answers)
Closed 4 years ago.
I have created a script for .lk domain search.
This is the code:
<form action="" method="GET">
<input type="text" name="dm" placeholder="tx">
</form>
<?php
if (isset($_GET["dm"])) {
$domain = $_GET["dm"];
$res = file_get_contents("https://www.domains.lk/domainsearch/doDomainSearch?domainname=$domain");
echo $domain;
}
?>
<script type="text/javascript">
var data = '<?php echo $res ?>';
document.write(data);
</script>
var data shows on localhost, but after hosting it on my server the result does not show.
This is the server-hosted file: http://vishmaloke.com/dm/ser.php
SOLUTION #1
There is a PHP setting named allow_url_fopen. It must be enabled in order to fetch content from a remote URL. You can try enabling it via an .htaccess file.
Put the following line in the .htaccess file of the directory where you want the setting enabled:
php_value allow_url_fopen On
Note: this setting applies only to the directory where the .htaccess file is placed. Also, on many hosts allow_url_fopen is a system-level setting that can only be changed in php.ini, in which case the .htaccess approach will not take effect.
SOLUTION #2
Alternatively, you can update php.ini.
PHP.INI UPDATE
Add the following line to php.ini:
allow_url_fopen = On
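If you are not sure whether the change actually took effect on the server, a quick runtime check using PHP's standard ini_get() and error_get_last() can confirm it (the domainname=example query below is only a placeholder):
<?php
// Check whether remote URLs are allowed for fopen/file_get_contents.
if (!ini_get('allow_url_fopen')) {
    die('allow_url_fopen is disabled on this server - use cURL or ask the host to enable it.');
}

$res = @file_get_contents('https://www.domains.lk/domainsearch/doDomainSearch?domainname=example');
if ($res === false) {
    // Show the underlying warning (DNS, SSL, firewall, ...) instead of failing silently.
    $err = error_get_last();
    echo 'Request failed: ' . ($err['message'] ?? 'unknown error');
}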
SOLUTION #3
It is recommended to use cURL instead of file_get_contents.
CURL UPDATE
if (isset($_GET["dm"])) {
    $domain = $_GET["dm"];

    // cURL request to the domain search service (urlencode the user input)
    $curl_handle = curl_init();
    curl_setopt($curl_handle, CURLOPT_URL, "https://www.domains.lk/domainsearch/doDomainSearch?domainname=" . urlencode($domain));
    curl_setopt($curl_handle, CURLOPT_CONNECTTIMEOUT, 2);
    curl_setopt($curl_handle, CURLOPT_RETURNTRANSFER, 1);
    $res = curl_exec($curl_handle);
    curl_close($curl_handle);

    echo $domain;
}
I am using cURL to open a page and want to play a video, using JavaScript, that is shown on the page. I have used the following code:
// PHP: fetch the oEmbed JSON and print the embed HTML
$url  = "https://www.example.com/";
$link = "http://www.example.com/oembed?url=" . $url . "&format=json";

$curl = curl_init($link);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
$return = curl_exec($curl);
curl_close($curl);

$result = json_decode($return, true);
echo '<pre>'; print_r($result);
echo $result['html'];

// JavaScript: try to click the play button of the embedded player
play();
function play() {
    document.getElementById("play-button").click();
}
My cURL is working, but it doesn't play the video. Where am I wrong? Do I have to pass the XPath of the button to play the video?
PHP scripts are executed on the server, while JavaScript is executed in the browser (Node.js is an exception). So your PHP code has already finished executing by the time the JS tries to trigger the click action, and there is no way the PHP code will run in the browser, which is why the cURL call never happens at that point.
What you need to do is call the URL from JavaScript asynchronously. You can use either Ajax or Fetch for this, as in the sketch below.
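For example, the cURL part can live in its own small PHP endpoint that the browser requests asynchronously and that returns only the embed markup; the front end then injects that markup and clicks the play button once the element exists in the DOM. A rough sketch (oembed-proxy.php is a hypothetical filename, and the example.com oEmbed URL is the placeholder from the question, not a real API):
<?php
// oembed-proxy.php - hypothetical endpoint requested by the browser via fetch()/Ajax.
$url  = "https://www.example.com/";
$link = "http://www.example.com/oembed?url=" . urlencode($url) . "&format=json";

$curl = curl_init($link);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
$return = curl_exec($curl);
curl_close($curl);

$result = json_decode($return, true);

// Send back only the embed HTML; the JavaScript on the page inserts it
// into the DOM and then calls play() on the newly created element.
header('Content-Type: text/html; charset=utf-8');
echo $result['html'] ?? '';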
I have used the following code to incrementally stream PHP output to JavaScript via AJAX:
// turn off all output buffering so every echo is sent to the client immediately
ini_set('output_buffering', 'off');
ini_set('zlib.output_compression', false);
while (@ob_end_flush());        // flush and close any buffers that are already open
ini_set('implicit_flush', true);
ob_implicit_flush(true);

header("Content-type: text/plain");
header('Cache-Control: no-cache');

// send some initial padding, since many browsers wait for a minimum
// amount of data before they start rendering the response
for ($i = 0; $i < 1000; $i++) {
    echo ' ';
}
// show output here
This works fine for me on localhost, but I've tried it on three different remote servers and it doesn't work on any of them.
What is the reason for this? I can provide more code if necessary.
How do I get the HTML code of another site when it requires cookies to be enabled?
I just need to parse this page: www.fx-trend.com/pamm/rating/
I'm using JavaScript/jQuery (jQuery Mobile) and sometimes PHP (I prefer to use JS).
Here is a sample with PHP:
<?php
// file_get_html() comes from simple_html_dom.php
$url  = 'url';
$html = file_get_html($url);
//$html = file_get_contents($url);
echo $html;
?>
Here is a sample with JS (see "How to get data with JavaScript from another server?"):
$(this).load(url);
alert($(this)); // returns [object Object]
The server answers:
Cookies must be enabled in your browser! Try to clear all cookies, if
cookies are enabled.
Code samples are welcome.
Try using cURL with cookies enabled. The code sample below is snagged from this page.
<?php
/* STEP 1. create a cookie file */
$ckfile = tempnam("/tmp", "CURLCOOKIE");

/* STEP 2. visit the homepage so the site sets its cookies, saving them to the cookie jar */
$ch = curl_init("url");
curl_setopt($ch, CURLOPT_COOKIEJAR, $ckfile);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.13) Gecko/20080311 Firefox/2.0.0.13');
$output = curl_exec($ch);
curl_close($ch);

/* STEP 3. request the page you actually want, sending the saved cookies back */
$ch = curl_init("url");
curl_setopt($ch, CURLOPT_COOKIEFILE, $ckfile);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$output = curl_exec($ch);
curl_close($ch);

var_dump($output);
Edit: You might have to fake a browser by changing the default user agent header.