Get URL source through JavaScript AJAST - javascript

I've been experimenting with AJAST and it's very useful for getting remote URL sources etc. In the example below it bypasses the same-origin policy and gets "Hello World !", but I cannot recreate this when I change it to google.com.
<html>
<head>
<script type="text/javascript" src="http://ajast.org/ajast/ajast.js"></script>
<script id="TestScript" Language="javascript">
function test()
{
    var xmlhttp = new AJAST.JsHttpRequest();
    xmlhttp.onreadystatechange = function()
    {
        if (xmlhttp.readyState == 4) // 4 = "loaded"
        {
            if (xmlhttp.status == 200)
                document.write(xmlhttp.responseText);
            else
                alert('ERROR: ' + xmlhttp.status + ' -> ' + xmlhttp.statusText);
        }
    }
    xmlhttp.open("GET", 'http://riffelspot.com/ajast/ajast_full.php', false);
    xmlhttp.send();
}
</script>
</head>
<body onload="test();">Please wait...</body>
</html>
My problem occurs when I change the GET URL to google.com. Can anyone help me? I want JavaScript to fetch the source of a page.

Read the documentation.
AJAST can only be used to send a request to a compatible server-side script.
Basically, it's a non-standard form of JSONP.
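For context, here is a minimal sketch of what JSONP-style loading actually requires (the endpoint URL and callback name are hypothetical, not part of AJAST): the remote server must respond with executable JavaScript that invokes your callback, which is why pointing the request at an arbitrary HTML page such as google.com cannot work.
<script type="text/javascript">
// Hypothetical JSONP sketch: the server at example.com must reply with
// something like:  handleData({"message": "Hello World !"});
// A plain HTML page will never do this, so the technique fails for google.com.
function handleData(data)
{
    alert(data.message);
}

var tag = document.createElement('script');
tag.src = 'http://example.com/api?callback=handleData'; // hypothetical JSONP-aware endpoint
document.getElementsByTagName('head')[0].appendChild(tag); // the "script tag hack": script src is not subject to the same-origin policy
</script>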

I thought that dynamically loading the script into the DOM would bypass this security feature, as the quote suggests:
"The main advantage of AJAST is its ability to make requests to foreign hosts (cross domain) which a standard AJAX request cannot do using a technique known as 'the script tag hack'. "
Where would I be able to find documentation? I don't want to use a JSONP proxy; I would like to request the web page without signing.

Related

What is wrong with my JavaScript structure

I am really new to JavaScript. I want to read XML from a URL and parse it into HTML. I have HTML and JavaScript code like this:
<!DOCTYPE html>
<html>
<head>
</head>
<body>
<script>
function loadXMLDoc(filename)
{
    if (window.XMLHttpRequest)
    {
        xhttp = new XMLHttpRequest();
    }
    else // code for IE5 and IE6
    {
        xhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }
    xhttp.open("GET", filename, false);
    xhttp.send();
    return xhttp.responseXML;
}

xmlDoc = loadXMLDoc("http://www.w3schools.com/dom/books.xml");
x = xmlDoc.getElementsByTagName("title")[0];
y = x.childNodes[0];
document.write(y.nodeValue);
</script>
</body>
</html>
What is wrong? Thanks.
This is the standard method to do this task:
Since people keep going down the path of trying to use cross-domain requests for something other than JSONP... IT WILL NOT WORK!
The code below is an example of what will work: your server is allowed to fetch content from other network locations, because it is a more controlled environment. Your browser, on the other hand, can only receive JSONP or plain text from another domain... Most Google results explain this as well.
YOUR ONLY OPTION
is to use a proxy of some form to obtain what you're trying to access, so you really only have three choices here:
Use JSONP or plain text
Use a proxy or some other method that is local to your server/website/script/page
Keep trying to use the examples posted here after being told that cross-domain rules apply
JAVASCRIPT:
function loadXMLDoc(sURL) {
    $.post("myproxy.php", { requrl: sURL }).done(function (data) {
        alert(data);
        console.log(data);
        document.write(data);
    });
}
PHP: myproxy.php
<?php
header('Content-Type: application/xml; charset=utf-8');
$file = file_get_contents($_POST['requrl']);
echo $file;
?>
Please note that if you plan to use this with other types of content, then you will need to change/remove the header line.
YOUR BROWSER ALLOWS YOU TO AJAX XML FROM OTHER WEBSITES?
If this is the case, then you need to replace or update your web browser, because it is not enforcing the same-origin policy.
THE ABOVE SOLUTION IS NOT COMPLEX
This is virtually copy-and-paste code, ready to go. The JS function returns the result/data/content in the three most common ways (alert, console, and document.write).
The PHP script is copy-and-paste as well, so you just need PHP installed.
All you need to do is create a new text file in the same location as your HTML document, name it "myproxy.php", and this example will work.
This is a proper XMLHttpRequest with a callback function to handle your XML:
<!DOCTYPE html>
<html>
<head></head>
<body>
<script type="text/javascript">
function loadXMLDoc(url) {
    var xmlhttp = new XMLHttpRequest();
    xmlhttp.onreadystatechange = function () {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
            callbackFunction(xmlhttp.responseText);
        }
    };
    xmlhttp.open("GET", url, true);
    xmlhttp.send();
}

function callbackFunction(response) {
    if (window.DOMParser) { // Non IE
        var parser = new DOMParser();
        var xml_doc = parser.parseFromString(response, "text/xml");
    } else { // Internet Explorer
        var xml_doc = new ActiveXObject("Microsoft.XMLDOM");
        xml_doc.async = false;
        xml_doc.loadXML(response);
    }
    // Do something with your 'xml_doc' object variable here
    console.log(xml_doc); // Debugging only.. to see the XML in the browser console for your own reference
    var x = xml_doc.getElementsByTagName("title")[0];
    var y = x.childNodes[0];
    document.write(y.nodeValue);
}

// Call the function to begin code execution
loadXMLDoc('http://www.w3schools.com/dom/books.xml');
</script>
</body>
</html>
This is working code so you can just erase what you have and put this directly in place of it. Good luck!
If you're planning on hosting the file on your own server and accessing it via XHR, the code I offered is intended for that. If w3schools.com sent an 'Access-Control-Allow-Origin: *' header with the XML file you are requesting, it would also work. But they don't. So you need to have the XML file in a place where your browser's security will let you access it (the same origin as your web page). Otherwise your browser will continue to block the resource with a 'cross-origin request blocked' error in the console.

if (xmlhttp.readyState==4 && xmlhttp.status==200) in AJAX not executing

I was trying AJAX on my page, but it is not working: if (xmlhttp.readyState==4 && xmlhttp.status==200) is always false. I have alerted the values of xmlhttp.readyState and xmlhttp.status. Their values are always 1 and 0 respectively for the xmlhttp.open event, and 4 and 0 respectively for the xmlhttp.close event.
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<script type="text/javascript">
function captcha_check()
{
var code = document.getElementById("captcha").value;
var url = "http://www.opencaptcha.com/validate.php?img='.$captcha_name.'.jpgx&ans="+code;
var xmlhttp=new XMLHttpRequest();
xmlhttp.onreadystatechange=function() {
alert(xmlhttp.readyState + " " + xmlhttp.status);
if (xmlhttp.readyState==4 && xmlhttp.status==200) {
document.getElementById("captcha_error").innerHTML=xmlhttp.responseText;
return false;
}
}
xmlhttp.open("GET","captcha_check.php?img=abc.jpg&ans="+code,true);
xmlhttp.send();
}
</script>
What might the issue be, and how can I solve it so the AJAX call works? Thanks in advance.
The correct order of calls is:
new XMLHttpRequest
xhr.open()
xhr.onreadystatechange = ...
xhr.send()
In some browsers, calling .open clears any event handlers on it. This allows for clean re-use of the same XHR object, which is supposedly more memory-efficient (but that really doesn't matter if you code properly to let the GC do its job)
So, simply put the .open call before the onreadystatechange assignment and you should be good to go.
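Applied to the function from the question (same IDs and URL as above, just the calls rearranged), a minimal sketch of that reordering could look like this:
function captcha_check()
{
    var code = document.getElementById("captcha").value;
    var xmlhttp = new XMLHttpRequest();
    // open() first, so browsers that reset the object on open() cannot drop the handler
    xmlhttp.open("GET", "captcha_check.php?img=abc.jpg&ans=" + code, true);
    xmlhttp.onreadystatechange = function () {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
            document.getElementById("captcha_error").innerHTML = xmlhttp.responseText;
        }
    };
    xmlhttp.send();
}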
Even though your code is working perfectly, as mentioned in the comments, since you've already included jQuery, try:
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<script type="text/javascript">
function captcha_check() {
var code = document.getElementById("captcha").value;
var url = "http://www.opencaptcha.com/validate.php?img='.$captcha_name.'.jpgx&ans="+code;
jQuery.get("captcha_check.php?img=abc.jpg&ans="+code", function(data) {
alert("Load was performed.");
console.log(data);
});
}
</script>
It almost sounds like you are making an AJAX request from a page loaded in the browser directly from the file system, rather than from a Web Server. Since you are issuing a GET request, browser caching might be an issue as well. Try appending a timestamp to the URL each time so the URL is unique:
xmlhttp.open("GET", "captcha_check.php?img=abc.jpg&ans=" + code
+ "&__cachebuster__=" + new Date().getTime());
Secondly, you need to escape the code variable to make it safe for a query string:
xmlhttp.open("GET", "captcha_check.php?img=abc.jpg&ans=" + escape(code)
+ "&__cachebuster__=" + new Date().getTime());
Lastly, please check for any occurrences of $_POST in your captcha_check.php file, as this would indicate you should be issuing a POST request, not a GET request.
If:
You are loading the page in the browser directly from the file system, AJAX requests will fail
You enter non-query-string-safe characters for the code, you end up with an invalid URL, and the AJAX request will fail
captcha_check.php requires a POST request and you issue a GET request, the AJAX request will fail
xmlhttp.open("GET","captcha_check.php?img=abc.jpg&ans="+code,true);
Please check the file path; maybe it's the wrong path.

Get text from a link in JavaScript

I am trying to get text from a service on the same server as my webserver. The link is something like this:
http://<OwnIPadres>:8080/calc/something?var=that
This is my code:
function httpGet(theUrl)
{
    alert(theUrl);
    var doc = new XMLHttpRequest();
    doc.onreadystatechange = function() {
        if (doc.readyState == XMLHttpRequest.DONE) {
            alert("text: " + doc.responseText);
            document.getElementById('ctm').text = doc.responseText;
        }
    }
    doc.open("get", theUrl);
    doc.setRequestHeader("Content-Encoding", "UTF-8");
    doc.send();
}
The URL that I print in my first alert is correct; if I test it in my browser, it returns an HTML page with a table in it. But the alert of my text is empty. Is it a problem that the text is HTML?
Actually, it's quite OK that your 'text' is 'html'. The problem is that using a different port counts as a cross-origin request. Therefore, your XMLHttpRequest is being stopped by the browser before it actually reaches your page on port 8080.
I'm not sure what else you're doing before and around this code snippet, but you could try an iframe call to your URL to get your data, or you could add an
Access-Control-Allow-Origin: http://<OwnIPadres>:8080/
header to the response (however, that will only get you the most modern browsers).
Finally, you could pull in a JS framework like jQuery, which could help you with pulling in this service data.
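To illustrate the header idea: the question does not say what is running on port 8080, so purely as an illustration, here is a sketch assuming the calc service were a small Node.js server you control. The Access-Control-Allow-Origin header is sent by the service being requested, so that the page on the other port is allowed to read the response.
// Hypothetical sketch only: assumes the service on port 8080 is Node.js.
var http = require('http');

http.createServer(function (req, res) {
    res.setHeader('Access-Control-Allow-Origin', '*'); // or the exact origin of your page instead of '*'
    res.setHeader('Content-Type', 'text/html; charset=utf-8');
    res.end('<table><tr><td>calc result goes here</td></tr></table>'); // placeholder body
}).listen(8080);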

URL validation/connectivity using JavaScript

I want to verify whether an external URL is valid/exists/is responsive using JavaScript. For example, "www.google.com" should return true and "www.google123.com" should return false.
I thought of using AJAX for this purpose by testing if (xmlhttp.readyState == 4 && xmlhttp.status == 200), but it seems that this doesn't work for remote servers (external URLs). As my server uses a proxy, I planned to use a browser-side script so that it automatically uses the user's browser proxy if present.
Please tell me, do I have to use "AJAX Cross Domain"? How can I achieve this, as I simply want to validate a URL?
Is there any way other than using AJAX?
I'm pretty sure this is not possible. Any AJAX that allowed you to call a random page on another domain in the user's context would open up all sorts of security holes.
You will have to use a server-side solution.
The usual way to avoid cross-domain issues is to inject a tag. Tags like image or script can load their content from any domain. You could inject, say, a script tag with type "text/x-unknown" or something, and listen to the tag's load event. When the load event triggers, you can remove the script tag from the page again.
Of course, if the files you are looking for happen to be images, then you could use new Image() instead. That way you don't have to pollute the page by injecting tags, because images load as soon as they are created (the same trick can be used to preload images). Again, just wait for the load event on the image.
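A small sketch of that image variant (the URL here is purely illustrative):
function checkImage(url, callback) {
    var img = new Image();                                // no DOM insertion needed
    img.onload = function () { callback(url, true); };    // resource exists and is a loadable image
    img.onerror = function () { callback(url, false); };  // unreachable, 404, or not an image
    img.src = url;                                        // the request starts as soon as src is set
}

checkImage('http://example.com/logo.png', function (url, ok) { alert(url + ' - ' + ok); });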
UPDATE
Okay, it seems I was jumping to conclusions here. There are some differences between browsers in how this can be supported. The following is a complete example of how to use the script tag for validating URLs in IE9 and recent versions of Firefox, Chrome and Safari.
It does not work in older versions of IE (IE8 at least) because apparently they don't provide load/error events for script tags.
Firefox refuses to load anything if the content type for the script tag is not empty or set to 'text/javascript'. This means it may be somewhat dangerous to use this approach to check for script files. It seems like the script tag is deleted before any code is executed in my tests, but I don't know for sure...
Anyways, here is the code:
<!doctype html>
<html>
<head>
<script>
function checkResource(url, callback) {
    var tag = document.createElement('script');
    tag.src = url;
    //tag.type = 'application/x-unknown';
    tag.async = true;
    tag.onload = function (e) {
        document.getElementsByTagName('head')[0].removeChild(tag);
        callback(url, true);
    }
    tag.onerror = function (e) {
        document.getElementsByTagName('head')[0].removeChild(tag);
        callback(url, false);
    }
    document.getElementsByTagName('head')[0].appendChild(tag);
}
</script>
</head>
<body>
<h1>Testing something</h1>
<p>Here is some text. Something. Something else.</p>
<script>
checkResource("http://google.com", function (url, state) { alert(url + ' - ' + state) });
checkResource("http://www.google.com/this-does-not-exists", function (url, state) { alert(url + ' - ' + state) });
checkResource("www.asdaweltiukljlkjlkjlkjlwew.com/does-not-exists", function (url, state) { alert(url + ' - ' + state) });
</script>
</body>
</html>

XMLHttpRequest returns null on Chrome

I have the following code that works fine in IE:
<HTML>
<BODY>
<script language="JavaScript">
text = "";
req = new XMLHttpRequest();
if (req)
{
    req.onreadystatechange = processStateChange;
    req.open("GET", "http://www.boltbait.com", true);
    req.send();
}

function processStateChange()
{
    // is the data ready for use?
    if (req.readyState == 4) {
        // process my data
        alert(req.status);
        alert(req.responseText);
    }
}
</script>
</BODY>
</HTML>
In IE, the first alert returns 200, the second returns the web page.
However, in Chrome the first alert returns 0 and the second returns the empty string.
My intent is to grab a web page into a string for processing. If I'm not doing this right, how should I be doing this?
Thanks.
In general, due to the same-origin policy (for security reasons), you can't make requests to URLs outside your domain. So, if your domain isn't boltbait.com, you can't make that request. What's strange is that IE doesn't give you an error...
However, in Chrome, an extension can make cross-origin requests (check this).
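As a rough sketch of that extension route (assuming a manifest.json that declares a host permission such as "http://www.boltbait.com/*", which is not shown here), essentially the same XHR code then succeeds when run from the extension's background page:
// Sketch only: requires the host permission above to be granted in manifest.json.
var req = new XMLHttpRequest();
req.onreadystatechange = function () {
    if (req.readyState == 4 && req.status == 200) {
        console.log(req.responseText); // full page source as a string
    }
};
req.open("GET", "http://www.boltbait.com", true);
req.send();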
