I want to use javascript to open one text file in child window, and then read the content to the parent window. How to implement it?
The code like below, if the data.xml is not HTML page, how to get the content to the parent window through javascript?
function op() {
    win = window.open("http://xxx.bb.com/data.xml", "win", "width=200,height=200");
}
You have already tried XmlHttpRequest and fallen foul of the cross-domain restriction. A web page (and code contained in it) can only manipulate data or elements of another if both pages are in the same domain. You will run into the same cross-domain restriction with two windows.
The correct method is to use XmlHttpRequest but ensure that the target of that request is in your domain. That will probably involve creating a proxy script on your server which can serve pages or data from other domains. Your page asks your script for the external data; the script fetches the data and serves it. Because the script is in your domain, the data appears to come from your domain and is not subject to cross-domain restrictions.
Simple PHP proxy script
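Here is a minimal sketch of the client side of that approach, assuming such a proxy is available on your own domain; the /proxy.php path, its url query parameter and the "output" element are hypothetical and should be adjusted to your setup.

// Ask *your* server for the external file; the proxy fetches it and
// relays the body, so the browser only ever sees a same-domain request.
function loadData(externalUrl, callback) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/proxy.php?url=" + encodeURIComponent(externalUrl), true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            callback(xhr.responseText); // the content of data.xml
        }
    };
    xhr.send(null);
}

// Usage: fetch the XML through the proxy instead of opening a child window.
loadData("http://xxx.bb.com/data.xml", function (content) {
    document.getElementById("output").textContent = content; // "output" is a placeholder element
});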
If the data you are attempting to get at is not yours, you should really ask permission to deal with it or republish it. Are web developers allowed to scrape html content? (The accepted answer is not the best answer)
Related
In the context of a user script, for example one executed by Tampermonkey, is it possible to communicate between two pages on different domains that set 'X-Frame-Options' to 'SAMEORIGIN'?
I know about this way of sending messages from one page to another by using iFrames and postMessage, but when working with sites you don't control, like in my case Stack Overflow and Google (I'm working on a bot to automate something for myself), you'll get the SAMEORIGIN error when trying to create the iFrame.
But I thought since I'm able to insert script in both pages, it might be possible to pull off some workaround or alternate solution.
One suggestion, a shared worker, looked promising, but it seems to require the pages to be from the same origin. I also looked at the Broadcast Channel API spec, but it isn't implemented anywhere yet, and it also seems to be bound to the same-origin policy.
Another possibility suggested so far in the comments is to use the GM API, since this is a user script (extended / special JS features). With GM_xmlhttpRequest we can ignore cross-domain restrictions, load google.com and put it in an iframe, but all the relative sources will then point to the site where the iframe is embedded, so searching from the Google page tries to send the search params to the parent site's domain.
GM_xmlhttpRequest({
    method: "GET",
    url: "https://www.google.com",
    headers: {
        "User-Agent": "Mozilla/5.0",
        "Accept": "text/xml"
    },
    onload: function(response) {
        $('html').html('<iframe id="iframe"></iframe>');
        $("#iframe").contents().find('html').html(response.responseText);
    }
});
Maybe I could edit the search requests to point to google.com specifically, rather than letting the search adopt the parent page's domain. And if that fails due to some hangup with the same-origin policy, I could even try to replace Google's xmlhttpRequests with GM_xmlhttpRequests, but I'm not sure that can be done: if you load GM functions, the user script runs in a sandbox and cannot intertwine with the page's scripts, if I understand correctly.
On the other hand, if we can trick the iframe's contents into treating google.com as the domain for requests, then we're in business, but examples don't seem to exist for this kind of thing, so I'm having trouble figuring out how to make it happen.
Yes, it is possible, via two routes:
1. Since it's a user script, you have access to special functions known as GM functions. With GM_xmlhttpRequest we can send a request that ignores the same-origin policy, letting us load the third-party page in an iFrame and communicate between the frames via postMessage (see the sketch below). The good thing about this is that there's no page reloading; the bad thing is that you will have to dynamically modify the frame's native xmlhttpRequest to execute a GM_xmlhttpRequest and specify the full target URL, not just a path such as /example.js, otherwise the domain of the outer window will be used in any requests made by the internal frame.
2. We can use URL queries by opening a tab of the same origin as the page you want to communicate with. Then we can use shared web workers to post messages to any previously opened pages of that domain and pass the data from the URL query to the page you want. The pro is that you don't have to dynamically modify the pages' scripts; the con is that you have to open a new tab for each message between different domains.
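A minimal sketch of route 1, assuming a Tampermonkey userscript with @grant GM_xmlhttpRequest; the target URL and the message payloads are placeholders, and relative URLs inside the fetched markup would still need to be rewritten as described above.

// In the hosting page's userscript: listen for messages coming up from the frame.
window.addEventListener("message", function (event) {
    console.log("frame said:", event.data);
});

GM_xmlhttpRequest({
    method: "GET",
    url: "https://www.example.com/",   // placeholder third-party page
    onload: function (response) {
        var frame = document.createElement("iframe");
        document.body.appendChild(frame);
        // Write the fetched markup into the frame ourselves, since the
        // browser would refuse to frame the page directly (X-Frame-Options).
        frame.contentDocument.open();
        frame.contentDocument.write(response.responseText);
        frame.contentDocument.close();
        // A script running inside the frame can answer the parent with:
        // window.parent.postMessage("hello from the frame", "*");
    }
});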
This is a security question. Say User A is logged into example.com, and he then downloads a PDF hosted by example.com which he views within the browser. Is JavaScript embedded within the PDF able to access example.com's APIs as if it were the logged in User A?
Or stated differently, do API calls originating from JavaScript within a PDF get sent to the server with the session cookie that the browser may be holding for that domain?
There are a few URL-related commands in Acrobat JavaScript. There is also the path information which can be retrieved.
The URL-related commands are (the list may not be complete) getURL(), launchURL(), and submitForm(). They can all essentially send data to any server, as long as that server can do something with the data. It does not matter whether the PDF is viewed in the browser or not. This can indeed be a security issue.
However, for several versions already, at least the Adobe products have had a barrier built in which either completely blocks contacting a server other than the one the document came from, or asks the user for permission first. That means it is quite difficult to do something behind the provider's and the user's backs.
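As a hedged illustration of those commands, a document-level script might look roughly like this; the URLs are placeholders and the exact parameters should be checked against the Acrobat JavaScript reference.

// Fetch/open a URL from the viewer:
this.getURL("http://example.com/collect");

// Open a URL in a new browser window:
app.launchURL("http://example.com/landing-page", true);

// Send the form data to a server (here as HTML form data):
this.submitForm({
    cURL: "http://example.com/receive",
    cSubmitAs: "HTML"
});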
They can interact, but only via the HostContainer object.
This object manages communication between a PDF document and a corresponding host container that the document is contained within, such as an HTML page. The host container for a document is specified by the Doc hostContainer property. The Embedded PDF object provides the corresponding API for the object model of the container application.
The HostContainer is limited to the postMessage method for communications with the hosting page:
Sends a message asynchronously to the message handler for the host container of the PDF document. For this message to be delivered, the host container (for example, an element in an HTML page) must have registered for notification by setting its messageHandler property. The message is passed to the onMessage method of the messageHandler.
So it does not have full access to the DOM like an HTML document does. It is more like a document loaded from another, or a null origin (dynamically created document in an IFrame) which can use the HTML5 postMessage functionality to communicate with its parent frame in a safe manner.
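A rough sketch of both ends of that channel, based on the HostContainer/messageHandler description quoted above; the element id and message contents are placeholders, and the exact object shapes should be verified against the Acrobat JavaScript reference.

// In the PDF (document-level Acrobat JavaScript):
if (this.hostContainer) {
    // Receive messages sent down by the hosting HTML page.
    this.hostContainer.messageHandler = {
        onMessage: function (aMessage) { console.println("from host: " + aMessage[0]); },
        onError: function (error, aMessage) { },
        onDisclose: function () { return true; } // assumption: may be required to accept host messages
    };
    // Send a message up to the hosting page.
    this.hostContainer.postMessage(["hello from the PDF"]);
}

// In the HTML page embedding the PDF (e.g. an object element with id="pdfObject"):
var pdfObject = document.getElementById("pdfObject");
pdfObject.messageHandler = {
    onMessage: function (aMessage) { console.log("from PDF:", aMessage[0]); },
    onError: function (error, aMessage) { }
};
pdfObject.postMessage(["hello from the page"]);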
There is also the submitForm method mentioned in the documentation.
Submits the form to a specified URL. To call this method, you must be running inside a web browser or have the Acrobat Web Capture plug-in installed.
It is not clear whether cookies are included in the request or not, or whether the referer header is set or not. If both are true, then a CSRF attack could be accomplished if the site is using referer checking as the defense mechanism.
I am trying to load a page into an iframe. When that page is loaded, I wish to edit the contents of the "pre" tag inside the loaded document. The loaded doc is from another domain. I am using the following (resultframe is the iframe):
var atag = document.getElementById("resultframe").contentWindow.document.getElementsByTagName('pre');
atag[0].innerHTML = "done";
to access the tag.
Problem: this statement seems to have no effect. I need to know the correct syntax, and also whether I can access elements of a page loaded from a different domain. I got the syntax from the web and also tried some variations of it.
Please suggest.
While JavaScript is limited by cross-domain policies that prevent interaction with another domain, there is one potential workaround as long as you can live with certain limitations.
By using something like PHP and its cURL library, you can grab the contents of a page from just about anywhere (even a secure page, or one that requires a login, as long as you have credentials). You can then parse the page, edit what you need to, and display it within your own site. It's important to realize, though, that this is simply your own local copy of the page. You won't have the luxury of actually changing the contents of the page itself.
Another possibility, which would require access to all domains you wish to edit, would be to employ a web service that would accept edits in the form of a PUT request. You can achieve a lot more with a web service, but it would have to be available on all target domains that you wish to make changes to.
In the near future, XMLHttpRequest Level 2 might become a reality and will bring Cross-Origin Resource Sharing (CORS) with it. CORS will allow web applications on one domain to make cross domain AJAX requests to another domain. The target domain will have a header giving express permission to allow requests from another. Potentially, this could be used to send edits to another site.
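A minimal sketch of what such a cross-origin request would look like; the URL is a placeholder, and the request only succeeds if that server responds with an Access-Control-Allow-Origin header that permits your page's origin.

var xhr = new XMLHttpRequest();
xhr.open("GET", "http://other-domain.example.com/page.html", true);
xhr.onload = function () {
    // Readable only because the server granted permission via CORS.
    console.log(xhr.responseText);
};
xhr.onerror = function () {
    // Without that header the browser blocks access and you land here.
};
xhr.send();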
You can't. Browsers have cross-domain policies, for security reasons.
What if I included the Facebook page in an iframe and could get all your information because you're always logged in to it?
I'm building a Chrome extension that adds a content script to a site (let's call it the host). The content script creates an iframe in the host which points to my domain (cross-domain).
I'm able to send messages from the iframe to the host via parent.postMessage(). However, the 'message' event received does not contain a 'source' property, which blocks me from communicating messages back to the child.
UPDATE
I'm looking for a client side solution or an explanation for this behaviour.
You'll have to do it the difficult way.
When creating the iframe, send a unique install id in the URL,
e.g.
http://www.trackingdomain.tld/trackingscript.php?uid=38736238
Then have your script pull a JSON reply every 1000 ms from your domain with the same uid to get the message that is returned.
You could also use the JSON to send messages.
But this solution would mean you'd be forced to use server-side scripting.
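A rough sketch of the polling side, assuming it runs in the page loaded inside the iframe (which is on your own domain, so the request is same-origin); the messages.php endpoint, the uid value and the JSON shape are all placeholders.

var uid = "38736238"; // the unique install id that was passed in the iframe URL

function poll() {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/messages.php?uid=" + uid, true);
    xhr.onload = function () {
        if (xhr.status === 200 && xhr.responseText) {
            var reply = JSON.parse(xhr.responseText);
            // Hand the relayed message on, e.g. via parent.postMessage().
            console.log("message for this install:", reply);
        }
    };
    xhr.send();
}

// Ask the server for new messages every 1000 ms, as described above.
setInterval(poll, 1000);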
It seems there is an issue with the sandboxed window object available in the extension.
A quick workaround is to inject the JavaScript code directly into the DOM via a script element's src instead of running it from the extension. That way you're dealing with the regular window object.
You can see an example in the answer to this question
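For reference, a rough sketch of that workaround in a content script; injected.js is a placeholder file name and would need to be listed under web_accessible_resources in the manifest.

var script = document.createElement("script");
script.src = chrome.runtime.getURL("injected.js");
script.onload = function () {
    // Once loaded, injected.js runs in the page's own context, so it sees
    // the real window object rather than the extension's sandboxed copy.
    this.remove();
};
(document.head || document.documentElement).appendChild(script);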
There is a 3rd-party web service. One of the public web methods available is a GetDocument() method. This method returns a Document object. The Document object has properties such as File (byte[]), ContentType (string), etc.
My question: can I consume this service using JavaScript (MooTools) + Ajax + JSON, return the Document object, in this case an Excel document, and force the file download?
It is true that typically you cannot initiate a download from JavaScript, but there is a Flash component, Downloadify, that does enable client-side file generation.
So you can serve files for download from HTML/JavaScript.
With that problem solved, you still have the problem of how to get the data that you wish to serve from the source web service.
3rd party implies a cross-site request, which is a no-no using XmlHttpRequest (Ajax).
A possible solution to this problem could be to use a common hidden IFrame technique to get the data.
Simply have an appropriate (hidden?) form that correctly posts to the web service, and point its action to a hidden IFrame element on which you trap the load event and parse the data returned.
But current browsers have different levels of security measures that limit your ability to access IFrames with an external source so you are actually stuck here. Sorry to get your hopes up.
The only practical robust way to accomplish what you would like to do is to have a local server side script that can act as a proxy between your HTML/JavaScript and the external web service.
Using such a proxy, you can simply go back to using Ajax to get your data to serve up with Downloadify.
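A loose sketch of that combination with MooTools; /proxy.ashx is a hypothetical same-domain script that calls the service's GetDocument() and relays the Document object as JSON, and the Downloadify option names are taken from its published examples, so verify them against the version you use.

new Request.JSON({
    url: '/proxy.ashx?method=GetDocument&id=42',   // placeholder proxy endpoint
    onSuccess: function (doc) {
        // doc.File (the byte[] as base64) and doc.ContentType come from the
        // proxied Document object described in the question.
        Downloadify.create('download-button', {    // placeholder container id
            filename: 'report.xls',
            data: function () { return doc.File; },
            dataType: 'base64',                    // assumption: check this option name
            swf: 'media/downloadify.swf',
            downloadImage: 'images/download.png',
            width: 100,
            height: 30,
            transparent: true,
            append: false
        });
    }
}).get();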
But then, since you are using a server script to get the data, why not just serve the data from the script for download?
These are just my observations on the problem domain you present.