What would I need to do to get this example running on my machine?
http://www.w3schools.com/ajax/tryit.asp?filename=tryajax_httprequest_js (page no longer available)
I'm looking to access the XML file hosted on w3schools (and not move it to my machine), but run the HTML and Javascript code on my machine. I tried changing the third to last line from:
<button onclick="loadXMLDoc('note.xml')">Get XML</button>
to:
<button onclick="loadXMLDoc('http://www.w3schools.com/ajax/note.xml')">Get XML</button>
thinking this would make it work, but it didn't seem to help. Any suggestions?
Just put the full URL into your browser's address bar, let the browser fetch it, then copy/paste the content and save it locally. JavaScript won't fetch resources from outside the domain the page is served from (without a fair bit of extra work), due to the Same Origin policy (a security feature).
You can't go cross domain using AJAX. You should move the XML file to the same server that you have the site files stored on and call it that way.
https://developer.mozilla.org/en/Same_origin_policy_for_JavaScript
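Once note.xml sits on the same origin as the page (for example, saved next to the HTML file and served from a local web server), a minimal sketch of what loadXMLDoc could look like is below. The demo element id and the <to> tag are assumptions based on the old w3schools sample, not part of the original question:
<div id="demo"></div>
<script>
function loadXMLDoc(url) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, true); // same-origin request, e.g. "note.xml" next to this page
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            // responseXML is the parsed XML document (assuming the server sends an XML content type)
            var to = xhr.responseXML.getElementsByTagName("to")[0];
            document.getElementById("demo").textContent = to ? to.textContent : "no <to> element found";
        }
    };
    xhr.send();
}
</script>
<button onclick="loadXMLDoc('note.xml')">Get XML</button>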
You need to use the following code in the function that does the AJAX:
try {
    netscape.security.PrivilegeManager.enablePrivilege("UniversalPreferencesRead");
} catch (e) {
    alert("error");
}
This only works for Firefox! There are other options which can be passed to enablePrivilege that may be useful.
Related
I have the following code, which is supposed to be a simple example of using the google api javascript client, and simply displays the long-form URL for a hard-coded shortened URL:
<script>
    function appendResults(text) {
        var results = document.getElementById('results');
        results.appendChild(document.createElement('P'));
        results.appendChild(document.createTextNode(text));
    }

    function makeRequest() {
        console.log('Inside makeRequest');
        var request = gapi.client.urlshortener.url.get({
            'shortUrl': 'http://goo.gl/fbsS'
        });
        request.execute(function(response) {
            appendResults(response.longUrl);
        });
    }

    function load() {
        gapi.client.setApiKey('API_KEY');
        console.log('After attempting to set API key');
        gapi.client.load('urlshortener', 'v1', makeRequest);
        console.log('After attempting to load urlshortener');
    }
</script>
<script src="https://apis.google.com/js/client.js?onload=load"></script>
except with an actual API key instead of the text 'API_KEY'.
The console output is simply:
After attempting to set API key
After attempting to load urlshortener
but I never see 'Inside makeRequest', which is logged at the top of makeRequest, the callback for the call to gapi.client.load. This leads me to believe that the load is not working (or is failing to complete).
Can anyone shed some light on why this might be so and how to fix it?
Thanks in advance.
After spending hours googling the problem, I found out that it was because I was running this file from my local machine and not from a server.
When you run the above code in Chrome you get this error in the developer console: "Unable to post message to file://. Recipient has origin null."
For some reason the JavaScript client only loads when running on an actual server, or on something like XAMPP or WAMP.
If there is any expert who can shed some light on why this happens, I would be really grateful to learn.
Hope this helps the other noobies like me out there :D
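If you just need a quick way to get off file://, a minimal sketch of a local static server using Node's built-in http module is below (the port and the small MIME map are arbitrary choices, and it is meant for local development only):
// serve the current directory on http://localhost:8080
var http = require('http');
var fs = require('fs');
var path = require('path');

http.createServer(function (req, res) {
    var file = path.join(__dirname, req.url === '/' ? 'index.html' : req.url);
    fs.readFile(file, function (err, data) {
        if (err) { res.writeHead(404); res.end('Not found'); return; }
        var types = { '.html': 'text/html', '.js': 'text/javascript', '.json': 'application/json', '.xml': 'text/xml' };
        res.writeHead(200, { 'Content-Type': types[path.extname(file)] || 'text/plain' });
        res.end(data);
    });
}).listen(8080);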
Short answer (http://code.google.com/p/google-api-javascript-client/issues/detail?id=46):
The JS Client does not currently support making requests from a file:// origin.
Long answer (http://en.wikipedia.org/wiki/Same_origin_policy):
The behavior of same-origin checks and related mechanisms is not well-defined in a number of corner cases, such as for protocols that do not have a clearly defined host name or port associated with their URLs (file:, data:, etc.). This historically caused a fair number of security problems, such as the generally undesirable ability of any locally stored HTML file to access all other files on the disk, or communicate with any site on the Internet.
I am using a mobile-network-based internet connection, and the source code is being rewritten when the provider serves the site to the end user.
On localhost my website looks fine, but when I browse the site from the remote server via the mobile network connection, it looks bad.
Checking the source code, I found that a piece of JavaScript is being injected into my pages which disables some of the CSS and makes the site look bad.
I don't want image compression or bandwidth compression at the expense of my well-designed CSS.
How can I prevent or stop the mobile network provider (Vodafone in this case) from injecting their JavaScript into my source code via their proxy?
You can use this on your pages. It still compresses and puts everything inline, but it won't break scripts like jQuery, because it escapes everything based on W3C standards:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
On your server you can set the cache control header
"Cache-Control: no-transform"
This will stop ALL modifications and present your site as it is!
Reference docs here
http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9.5
http://stuartroebuck.blogspot.com/2010/08/official-way-to-bypassing-data.html
Web site exhibits JavaScript error on iPad / iPhone under 3G but not under WiFi
You're certainly not the first. Unfortunately many wireless ISPs have been using this crass and unwelcome approach to compression. It comes from Bytemobile.
What it does is have a proxy recompress all images you fetch smaller by default (making image quality significantly worse). Then it crudely injects a script into your document that adds an option to load the proper image for each recompressed image. Unfortunately, since the script is a horribly written piece of 1990s-style JS, it craps all over your namespace, hijacks your event handlers and stands a high chance of messing up your own scripts.
I don't know of a way to stop the injection itself, short of using HTTPS. But what you could do is detect or sabotage the script. For example, if you add a script near the end of the document (between the 1.2.3.4 script inclusion and the inline script trigger) to neuter the onload hook it uses:
<script type="text/javascript">
bmi_SafeAddOnload= function() {};
</script>
then the script wouldn't run, so your events and DOM would be left alone. On the other hand the initial script would still have littered your namespace with junk, and any markup problems it causes will still be there. Also, the user will be stuck with the recompressed images, unable to get the originals.
You could try just letting the user know:
<script type="text/javascript">
if ('bmi_SafeAddOnload' in window) {
var el= document.createElement('div');
el.style.border= 'dashed red 2px';
el.appendChild(document.createTextNode(
'Warning. Your wireless ISP is using an image recompression system '+
'that will make pictures look worse and which may stop this site '+
'from working. There may be a way for you to disable this feature. '+
'Please see your internet provider account settings, or try '+
'using the HTTPS version of this site.'
));
document.body.insertBefore(el, document.body.firstChild);
}
</script>
I'm surprised no one has put this as an answer yet. The real solution is:
USE HTTPS!
This is the only way to stop ISPs (or anyone else) from inspecting all your traffic, snooping on your visitors, and modifying your website in flight.
With the advent of Let's Encrypt, getting a certificate is now free and easy. There's really no reason not to use HTTPS in this day and age.
You should also use a combination of redirects and HSTS to keep all of your users on HTTPS.
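As an illustration, here is a hedged sketch of the redirect-plus-HSTS idea using Express (the framework choice, the max-age value and the middleware placement are my assumptions, not part of the answer):
var express = require('express');
var app = express();

app.use(function (req, res, next) {
    if (!req.secure) {
        // permanent redirect of plain-HTTP requests to the HTTPS version of the same URL
        return res.redirect(301, 'https://' + req.headers.host + req.originalUrl);
    }
    // HSTS: tell the browser to insist on HTTPS for this host for the next year
    res.setHeader('Strict-Transport-Security', 'max-age=31536000; includeSubDomains');
    next();
});

// ... the HTTPS listener itself (certificates etc.) is set up elsewhere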
Your provider might have enabled a Bytemobile Unison feature called "clientless personalization". Try accessing the fixed URL http://1.2.3.50/ups/ - if it's configured, you will end up on a page which offers to let you disable any features you don't like, including JavaScript injection.
Good luck!
Alex.
If you're writing your own websites, adding a header worked for me:
PHP:
Header("Cache-Control: no-transform");
C#:
Response.Cache.SetNoTransforms();
VB.Net:
Response.Cache.SetNoTransforms()
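Node.js (a sketch assuming a plain http.ServerResponse named res; this variant was not in the original answer):
res.setHeader("Cache-Control", "no-transform");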
Be sure to use it before any data has been sent to the browser.
I found a trick. Just add:
<!--<![-->
After:
<html>
More information (in German):
http://www.programmierer-forum.de/bmi-speedmanager-und-co-deaktivieren-als-webmaster-t292182.htm#3889392
The BMI JS isn't only on Vodafone. Virgin Media UK and T-Mobile UK also give you this extra feature, enabled by default and for free. ;-)
On T-Mobile it's called "Mobile Broadband Accelerator"
You can Visit:
http://accelerator.t-mobile.co.uk
or
http://1.2.3.50/
to configure it.
In case the above doesn't apply to you, or for some reason it's not an option,
you could potentially set up a local proxy (Polipo, with or without Tor).
There is also a Firefox add-on called "blocksite",
or, as a more drastic approach, you could reset TCP connections to 1.2.3.0/24:80 on your firewall.
But unfortunately that wouldn't fix the damage.
Funnily enough, T-Mobile and Virgin Media mobile/broadband support are not aware of this feature! (2011-10-11)
PHP: Header("Cache-Control: no-transform"); Thanks!
I'm glad I found this page.
That injector script was messing up my PHP page's source code, making me think I had made an error in my PHP coding when viewing the page source. Even though the script was blocked with the Firefox NoScript add-on, it was still messing up my code.
Well, after that irritating dilemma, I wanted to get rid of it completely, and not just block it with the Adblock or NoScript Firefox add-ons or only on my own PHP pages.
To stop http://1.2.3.4 completely in Firefox, get the "Modify Headers" add-on.
Go to the Modify Headers add-on options, then to the Header tab.
Select Action: choose ADD.
For Header Name type in: cache-control
For Header Value type in: no-transform
For Comment type in: Block 1.2.3.4
Click Add, then click Start.
The 1.2.3.4 script will not be injected into any more pages! Yeah!
I no longer see 1.2.3.4 being blocked by NoScript, because it's not there any more.
But I will still add PHP: Header("Cache-Control: no-transform"); to my PHP pages.
If you are getting it on a site that you own or are developing, then you can simply override the function by setting it to null. This is what worked for me just fine.
bmi_SafeAddOnload = null;
As for getting it on other sites you visit, you could probably open the devtools console and enter that there to wipe it out if a page is taking a long time to load. I haven't tested that yet, though.
OK, nothing here worked for me, so I replace the image URLs every second, because whenever my DOM updates the problem comes back. Another option is to only use background-image styles included in the pages. Neither solution is clean.
setInterval(function () { imageUpdate(); }, 1000); // re-run every second because the injected URLs come back whenever the DOM updates

function imageUpdate() {
    console.log('######imageUpdate');
    var image = document.querySelectorAll("img");
    for (var num = 0; num < image.length; num++) {
        if (stringBeginWith(image[num].src, "http://1.1.1.1/bmi/***yourfoldershere***")) {
            var str = image[num].src;
            // strip the injected proxy prefix so the original image URL is requested again
            var res = str.replace("http://1.1.1.1/bmi/***yourfoldershere***", "");
            image[num].src = res;
            console.log("replace " + str + " by " + res);
            /*
            Another option is to keep the original src in a data-src attribute and,
            after the DOM has loaded, copy every data-src back into src, e.g.:
            image[num].src = image[num].getAttribute("data-src");
            */
        }
    }
}

function stringEndsWith(string, suffix) {
    return string.indexOf(suffix, string.length - suffix.length) !== -1;
}

function stringBeginWith(string, prefix) {
    return string.indexOf(prefix) === 0; // true only when the string actually starts with the prefix
}
An effective solution that I found was to edit your hosts file (/etc/hosts on Unix/Linux type systems, C:\Windows\System32\drivers\etc\hosts on Windows) to have:
null 1.2.3.4
Which effectively maps all requests to 1.2.3.4 to null. Tested with my Crazy Johns (owned by Vodafone) mobile broadband. If your provider uses a different IP address for the injected script, just change it to that IP.
Header("Cache-Control: no-transform");
Use the above PHP code in each of your PHP files and you will get rid of the 1.2.3.4 code injection.
That's all.
I too was suffering from the same problem; now it is fixed. Give it a try.
I added to /etc/hosts
1.2.3.4 localhost
Seems to have fixed it.
I have a html page on my localhost - get_description.html.
The snippet below is part of the code:
<input type="text" id="url"/>
<button id="get_description_button">Get description</button>
<iframe id="description_container" src="#"/>
When the button is clicked, the src of the iframe is set to the URL entered in the textbox. The pages fetched this way are very big, with lots of linked files. All I am interested in on the page is a block of text contained in a <div id="description"> element.
Is there a way to avoid downloading the resources linked from the page that loads into the iframe?
I don't want to use curl, because the data is only available to logged-in users and the steps needed to get the content with curl are too complicated. The iframe is simple, as I use this on a box which sends the right cookies to identify the request as coming from a logged-in user, but the problem is that it is very wasteful to download nearly 1 MB of data, keep 1 KB of it and throw out the rest.
Edit
If the proposed method only works in Firefox that is fine, so I added the Firefox tag. Also, it is possible that the answer actually lies in the realm of Firefox add-on techniques, so I added that tag as well.
The problem is not that I cannot get at what I'm looking for; rather, the problem is that the easy iframe method is wasteful.
I know that Firefox does allow loading only the text of a page. If you open a page and press Ctrl+U you are taken to the 'view page source' window. There, links behave as normal and are clickable; if you click on a link in source view, the source of the new page is loaded into the view-source window without the linked resources being downloaded, which is exactly what I'm trying to get. But I don't know how to access this behaviour.
Another example is the Adblock add-on. It somehow kills elements before they get loaded. With plain JavaScript this is not possible, because it is only triggered too late to intervene in time.
The Same Origin Policy forbids any web page to access contents of any other web page in a different domain so basically you cannot do that.
However it seems that with some browsers it is allowed to access web pages content if you are trying to access it from a local web page which seems to be your case.
Safari and IE 6/7/8 are browsers that allow a local web page to do so via XMLHttpRequest (source: Google Browser Security Handbook), so you may want to use one of those browsers to do what you need (note that future versions of those browsers may no longer allow this).
Apart from this solution I only see two possibilities:
If the web pages you need to fetch content from are somehow controlled by you, you can create a simpler interface to let other web pages get the content you need (for example by allowing JSONP requests).
If the web pages you need to fetch content from are not controlled by you, the only solution I see is to fetch the content server side, logging in from the server directly (I know that you don't want to do so, but I don't see any other possibility if the previous ones I mentioned are not practicable).
Hope it helps.
Actually I've seen Cross Domain jQuery .load request before, here: http://james.padolsey.com/javascript/cross-domain-requests-with-jquery/
The author claims that codes like these found on that page
$('#container').load('http://google.com'); // SERIOUSLY!
$.ajax({
    url: 'http://news.bbc.co.uk',
    type: 'GET',
    success: function(res) {
        var headline = $(res.responseText).find('a.tsh').text();
        alert(headline);
    }
});
// Works with $.get too!
would work. (The BBC code might not work because of the recent redesign, but you get the idea)
Apparently it is using YQL wrapped into a jQuery plugin to do the trick. Now I cannot say I fully understand what he is doing there but it appears to work, and fits the bill. Once you load the data I suppose it is a simple matter of filtering out the data that you need.
If you prefer something that works at the browser level, may I suggest Mozilla's Jetpack framework for lightweight extensions. I've not yet read the documentations in its entirety but it should contain the APIs needed for this to work.
There are various ways to go about this in AJAX, I'm going to show the jQuery way for brevity as one option, though you could do this in vanilla JavaScript as well.
Instead of an <iframe> you can just use a container, let's say a <div> like this:
<div id="description_container"></div>
Then to load it:
$(function() {
    $("#get_description_button").click(function() {
        $("#description_container").load($("input").val() + " #description");
    });
});
This uses the .load() method, which takes a string in this format: .load("url selector"). It then takes that element from the fetched page and places its content inside the container you're loading into, in this case #description_container.
This is just the jQuery route, mainly to illustrate that yes, you can do what you want, but you don't have to do it exactly like this. The point is that the concept is getting what you want from an AJAX request, rather than via an <iframe>.
Your description sounds like you are fetching pages from the same domain (you said that you need to be logged in and have session credentials), so have you tried using an async request via XMLHttpRequest? It might complain if the HTML on a page is particularly messed up, but you should still be able to get the raw text via .responseText and extract what you need with a regex.
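A minimal sketch of that idea is below, writing into a plain container div as the jQuery answer above also suggests. The regex and element ids are assumptions based on the question; a DOM-based extraction would be more robust than a regex:
// hedged sketch: fetch the page text and pull the #description div out of it
var xhr = new XMLHttpRequest();
xhr.open("GET", document.getElementById("url").value, true); // same-domain URL taken from the textbox
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        // crude extraction; stops at the first </div>, so it assumes no nested divs inside #description
        var match = xhr.responseText.match(/<div id="description">([\s\S]*?)<\/div>/);
        document.getElementById("description_container").innerHTML = match ? match[1] : "not found";
    }
};
xhr.send();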
function publish(text) {
    $('#helpdiv').prepend(text);
}

function get_help(topic) {
    $.get(topic, publish);
}
<p>Hi. click here for more help.</p>
<div id="helpdiv"></div>
I've inherited this chunk of HTML and JavaScript above (snippet). It is/was going to be used as local help. Currently it is online only and it works fine. However, when I copy the files locally, I get "Permission Denied" in Internet Explorer, and Chrome doesn't do anything when I "click here for more help". What it's supposed to do is load the help content from inline-help.html and display it in the helpdiv div. Now here is the kicker: if I take the same files, copy them to inetpub on my PC and load them as http://localhost/hello.html, it functions perfectly.
Presumably this is a security thing where the "local" zone isn't allowing me to load files off of the user's HD? But I'm not really sure what's going on and would like to understand this problem further and potentially come up with a workaround.
Any insight is greatly appreciated.
jQuery's "get" uses XMLHttpRequest, which doesn't work on local files, unfortunately. If you really need to be able to fetch local data (or data from a different domain) asynchronously, you should use dynamic script tags. However, that means the data file has to be reformatted as JSON data.
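A rough sketch of the dynamic-script-tag approach is below, assuming the help content is repackaged as a JS file; the file name, callback name and data shape are made up for illustration:
// inline-help.js would then contain something like: showHelp({ html: "<p>...help text...</p>" });
function showHelp(data) {
    $('#helpdiv').html(data.html);
}

function get_help(topic) {
    var s = document.createElement('script');
    s.src = topic; // e.g. 'inline-help.js' sitting next to this page
    document.getElementsByTagName('head')[0].appendChild(s);
}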
I don't think your browser is allowing you to run javascript locally (using the file:/// access method). But when you load it from http://localhost/ it works fine.
You need to either develop on a website, or use your localhost server.
Hey everyone, I'm working on a widget for Apple's Dashboard and I've run into a problem while trying to get data from my server using jquery's ajax function. Here's my javascript code:
$.getJSON("http://example.com/getData.php?act=data", function(json) {
    $("#devMessage").html(json.message);
    if (json.version != version) {
        $("#latestVersion").css("color", "red");
    }
    $("#latestVersion").html(json.version);
});
And the server responds with this json:
{"message":"Hello World","version":"1.0"}
For some reason though, when I run this the fields on the widget don't change. From debugging, I've learned that the widget doesn't even make the request to the server, so it makes me think that Apple has some kind of external URL block in place. I know this can't be true though, because many widgets phone home to check for updates.
Does anyone have any ideas as to what could be wrong?
EDIT: Also, this code works perfectly fine in Safari.
As requested by Luca, here's the PHP and Javascript code that's running right now:
PHP:
echo $_GET["callback"].'({"message":"Hello World","version":"1.0"});';
Javascript:
function showBack(event)
{
    var front = document.getElementById("front");
    var back = document.getElementById("back");

    if (window.widget) {
        widget.prepareForTransition("ToBack");
    }

    front.style.display = "none";
    back.style.display = "block";
    stopTime();

    if (window.widget) {
        setTimeout('widget.performTransition();', 0);
    }

    $.getJSON('http://nakedsteve.com/data/the-button.php?callback=?', function(json) {
        $("#devMessage").html(json.message);
        if (json.version != version) {
            $("#latestVersion").css("color", "red");
        }
        $("#latestVersion").html(json.version);
    });
}
In Dashcode, click Widget Attributes, then make sure the Allow Network Access option is checked. I've built something that simply refused to work, and this was the solution.
Cross-domain Ajax requests ( Using the XMLHttpRequest / ActiveX object ) are not allowed in the current standard, as per the W3C spec:
This specification does not include the following features which are being considered for a future version of this specification: Cross-site XMLHttpRequest.
However, there is one technique for doing AJAX requests cross-domain: JSONP, which works by including a script tag on the page, plus a little server configuration.
jQuery supports this, but instead of responding on your server with this
{"message":"Hello World","version":"1.0"}
you'll want to respond with this:
myCallback({"message":"Hello World","version":"1.0"});
myCallback must be the value in the "callback" parameter you passed in the $.getJSON() function. So if I was using PHP, this would work:
echo $_GET["callback"].'({"message":"Hello World","version":"1.0"});';
Apple has some kind of external URL block in place.
In your Info.plist you need to have the key AllowNetworkAccess set to true.
<key>AllowNetworkAccess</key>
<true/>
Your code works in Safari because there it is not constrained as it is in the Dashboard, and Safari is not standards-compliant in that it DOES allow cross-site AJAX. Firefox IS standards-compliant in that it DOES NOT allow cross-site AJAX.
If you are creating a Dashboard widget, why don't you use the XMLHttpRequest setup function in the code library of Dashcode? Apple built these in so you don't need to install 3rd-party JS libraries. I'm not sure about JSON support, but perhaps starting here will lead you in a better direction.
So another solution is to create your own server-side web service whose CORS settings you control. The user's web browser can't access another site directly, but if you wrap that other site in your own web service (on the same domain) then it does not cause an issue.
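A rough sketch of such a same-origin relay in Node is below; the endpoint path, port and error handling are my assumptions, not part of the answer:
// /getData on your own domain fetches the remote JSON and relays it,
// so the widget/browser only ever talks to its own origin
var http = require('http');

http.createServer(function (req, res) {
    if (req.url === '/getData') {
        http.get('http://example.com/getData.php?act=data', function (remote) {
            res.writeHead(200, { 'Content-Type': 'application/json' });
            remote.pipe(res); // relay the upstream body unchanged
        }).on('error', function () {
            res.writeHead(502);
            res.end('{"error":"upstream failed"}');
        });
    } else {
        res.writeHead(404);
        res.end();
    }
}).listen(8080);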
Interesting that it works in Safari. As far as I know, to do cross-domain AJAX requests you need to use the jsonp dataType.
http://docs.jquery.com/Ajax/jQuery.getJSON
http://bob.pythonmac.org/archives/2005/12/05/remote-json-jsonp/
Basically you need to add callback=? to your query string and jQuery will automatically replace it with the correct method name, e.g.:
$.getJSON("http://example.com/getData.php?act=data&callback=?",function(){ ... });
EDIT: put the callback=? bit at the end of the query string just to be safe.