I'm trying to handle reauthentication using a different Authorisation website from within a Single Page Application (SPA) on my home website. Both websites are internal to a client site.
I can't use the standard "redirect" method as I'll lose my SPA JavaScript context.
I've investigated and had CORS set up on the Auth website, so it's now returning Access-Control-Allow-Origin: https://www.mywebsite.com. When I try to load the Auth page into a jQuery UI dialog it fails, because the scripts all try to load in the context of the Home website.
i.e.
From my website https://www.mywebsite.com/static/
I'm loading https://www.auth.com/login.html.
When loaded into a jQuery UI dialog it tries to load its scripts as https://www.mywebsite.com/static/scripts/authscript.js
instead of
https://www.auth.com/scripts/authscript.js
I also tried loading the Auth page into an iframe by changing the src attribute, but it just reloaded the whole page.
Is there a way to change the source directory in the context of the CORS web page I'm trying to show?
So I couldn't get the page to load correctly into a standard jQuery modal dialog, and I had no control over the Auth page I was connecting to. To solve this I investigated the problem I was having with the iframe and used an HTML5 feature, sandbox, to fix it.
The Auth page was using a "break out" script to reload the page if it was loaded into an iframe, which obviously broke my SPA context. HTML5 allows you to restrict what the contents of an iframe are allowed to do.
<iframe src="" id="my_auth" sandbox="allow-forms allow-scripts allow-same-origin"></iframe>
allow-forms - allow forms to be submitted
allow-scripts - let it run JavaScript
allow-same-origin - let the content keep its own origin (i.e. the Auth website); this was necessary to enable cookies
Of note is that I am not including allow-top-navigation, which is what allowed the page to reload and my JavaScript context to be lost.
Voilà, auth page displayed inside a jQuery modal dialog (containing the iframe) with no need to pop up another window.
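A minimal sketch of that setup, assuming the sandboxed iframe above is wrapped in a div; the #auth_dialog id and the dialog dimensions are placeholders, not my exact code:
// Open the sandboxed iframe (declared above) inside a jQuery UI modal dialog.
function showAuthDialog() {
    $("#my_auth").attr("src", "https://www.auth.com/login.html");
    $("#auth_dialog").dialog({
        modal: true,
        width: 500,
        height: 600,
        close: function () {
            $("#my_auth").attr("src", ""); // drop the Auth page when the dialog closes
        }
    });
}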
CORS was necessary to allow the redirect to the Auth website (which happens in the browser before JavaScript gets involved). You could also trap the "Access-Control-Allow-Origin" exception instead.
The iframe with the Siteminder login was accessing a page from my site that set a value in localStorage once the user had logged in. The app just polled localStorage in the background to see if the user had logged in successfully.
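The polling loop was along these lines; the localStorage key name and the interval are illustrative rather than the exact values I used:
// Poll localStorage until the post-login page (same origin as the SPA) has set a flag.
var authPoll = setInterval(function () {
    if (window.localStorage.getItem("authComplete") === "true") { // hypothetical key
        clearInterval(authPoll);
        window.localStorage.removeItem("authComplete");
        $("#auth_dialog").dialog("close");
        // ...then re-issue whatever request triggered the reauthentication
    }
}, 1000);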
Hope this is useful to someone.
I'm developing a Flutter app with WebViews using this plugin: flutter_inappwebview.
First, I load a WebView with the login page on my own domain, and later redirect users to a URL on a different domain. On this second URL I inject an iframe menu loaded from the same site where the user logged in before. I need the login cookies accessible in this iframe, but I can't get it working.
The menu is loaded and the cookies exist, but the menu injected with an iframe on the second domain doesn't work.
By the way, I need a way to communicate between the JavaScript loaded in the URL and the Flutter WebView. I tried the services included in the plugin, but they don't work.
Thanks for your help =)
You can use the cookie_jar package available for Flutter to achieve the behaviour you are expecting.
I need to check whether a website loaded in an iframe has loaded properly. On my website, users can POST a custom website, which will be shown to them in an iframe. But some websites are protected from being embedded in an iframe (such as Google or Facebook).
How can I check whether a website can be loaded and used in an iframe?
PS: I haven't shown any code, because I have no code and no idea how to do it. (My website runs on Java, so no Apache or PHP.)
Check the HTTP response header for X-Frame-Options. Facebook sends X-Frame-Options: DENY, which means "The page cannot be displayed in a frame, regardless of the site attempting to do so."
The X-Frame-Options HTTP response header can be used to indicate whether or not a browser should be allowed to render a page in a <frame>, <iframe> or <object>. Sites can use this to avoid clickjacking attacks, by ensuring that their content is not embedded into other sites.
Check this: Accessing the web page's HTTP Headers in JavaScript
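Note that a browser script generally can't read another site's response headers cross-origin, so the check is easiest to do server-side. Since your site runs on Java, the sketch below in Node.js only shows the idea; the same "HEAD request, then inspect the headers" logic ports straight to Java's HttpClient. It also checks CSP frame-ancestors, which newer sites use instead of X-Frame-Options.
// Sketch: ask the target URL for its headers and look for frame-blocking directives.
// Node.js 18+ (global fetch). Some servers reject HEAD; fall back to GET if needed.
async function canBeFramed(url) {
    const res = await fetch(url, { method: "HEAD", redirect: "follow" });
    const xfo = (res.headers.get("x-frame-options") || "").toUpperCase();
    const csp = res.headers.get("content-security-policy") || "";
    if (xfo.includes("DENY") || xfo.includes("SAMEORIGIN")) return false;
    if (/frame-ancestors\s+('none'|'self')/i.test(csp)) return false;
    return true;
}

canBeFramed("https://www.facebook.com").then(ok =>
    console.log(ok ? "embeddable" : "blocked from iframes"));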
I have an html page that is being accessed via a link that places an external page in the url - e.g.
http://www.mydomain.com/mypage?external-page=encodedURL
It is the responsibility of my page to scrape some data from the URL it is handed.
How can I access the passed-in page using javascript/jquery? I need to be able to pull out the content for certain classes and ids.
Is this a violation of the same origin policy? If so, is there some other way to process an external page like this? It seems strange to me that I can hit the web page in a browser or with a terminal command and receive the content, but not in a JS file.
You can use a browser extension to scrape the external page, then send the data to your site, OR display it within the page, so that it can then be accessed by your page's JavaScript via the DOM.
You can use a proxy on your domain which fetches the external page and hands it to your JavaScript, whose origin is on your domain too (a minimal sketch follows after this list).
You can use an API for the external page, if one is accessible.
You can ask for, or change, the code of the external page (if you have access to it) so that it serves pages with Access-Control-Allow-Origin: *.
I think this is all you can do.
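Here is a minimal sketch of the proxy option, using Node.js with Express purely for illustration; the route, port, and error handling are arbitrary, and a real proxy should whitelist which URLs it will fetch:
// Same-origin proxy: the browser asks *your* server for the page,
// and your server fetches the external URL on its behalf.
const express = require("express");
const app = express();

app.get("/proxy", async (req, res) => {
    try {
        const target = req.query.url; // e.g. /proxy?url=https%3A%2F%2Fexample.com
        const upstream = await fetch(target); // Node 18+ global fetch
        res.type("text/html").send(await upstream.text());
    } catch (err) {
        res.status(502).send("Could not fetch " + req.query.url);
    }
});

app.listen(3000);
// Client side: $.get("/proxy", { url: externalUrl }, html => { /* parse with jQuery */ });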
EDIT: The "seems strange" lasts until you realize the intended difference between a user and a process. The user is not thought to be malicious, but a process could be. A process could, for example, grab data from a user's logged-in Gmail session if it had access to the external page, and transmit that data to a server. Since the user at the terminal is probably (but not always!) the one who logged in to that session, the user is not thought to be malicious. But a script whose origin is some website the user navigates to should not be able to act with the same permissions as that user: that script is an agent as well, and can take actions, but it is not created or directed by the user. That's the strongest reason for the isolation of origins and the same origin policy.
Example
Execution Context of Bookmarklets and IFrames
If you are injecting JS into every page via a bookmarklet, then that injected code will behave as if it has the same origin as the rest of the page, or at least the "top frame" of that page. It will execute in the same context as the top frame. If there are nested iframes in the page then you will get an "unsafe attempt to access page x from " error if your bookmarklet tries to inject into them. This is because the bookmarklet has its origin in the top page, and the top page can never access nested iframes on different domains anyway.
So if some part of the site you wish to scrape is in an iframe below the top frame, your bookmarklet will fail to get it.
Transmitting Data using a bookmarklet
If you want to take a URL on one page on your domain, then grab data from that URL on another domain, then display that data back on the same page, you need a way to get the data across. You could use a bookmarklet, but the flow would still involve some "user help". It would go something like this (a rough sketch of the code follows the steps):
Load your domain's page, D. User puts a url into an input box. Clicks submit.
JavaScript on D opens a new tab/window pointing to the user-provided url.
User clicks your scraping bookmarklet on that external page, which collects the desired data, X.
Desired data, X, is sent via Ajax to a "server", S, with session identifier I.
Page D polls the server S until it gets notified that some data with session identifier I has been grabbed, then it fetches that data and displays it on D.
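Sketched in code, steps 3-5 might look like this; the endpoint, selector, session handling, and the assumption that the server sends Access-Control-Allow-Origin for the external page are all illustrative:
// Steps 3-4: the bookmarklet, run on the external page, grabs the data and posts it.
// SESSION_ID would be baked into the bookmarklet when page D generates it.
javascript:(function () {
    var data = document.querySelector(".price").textContent; // hypothetical selector
    fetch("https://my-server.example/scrapes?session=" + SESSION_ID, {
        method: "POST",
        body: data
    });
})();

// Step 5: page D polls the server for that session's data.
var poll = setInterval(function () {
    $.getJSON("https://my-server.example/scrapes", { session: SESSION_ID }, function (result) {
        if (result && result.data) {
            clearInterval(poll);
            $("#results").text(result.data); // hypothetical container on page D
        }
    });
}, 2000);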
There is the need for a server. You can't use local storage to transmit the information, since it is specific to a domain. There is an alternative that does not require a server. It requires making a browser extension.
Transmitting Data using a browser extension
The "background page" of the extension is basically the same as a local server for all the browser tabs; it permits transmitting information across tabs targeted at different domains. The "clients" in this setup are the "content scripts", which are loaded into every page (just like a bookmarklet, except without the requirement for the user to actually click the bookmarklet to load it; it happens automatically). The flow would go like this:
Page D again. User inputs url in input box. Clicks submit -> which triggers some code in the extension.
The extension background page instructs a tab to open and targets it to the url.
A content script loads automatically into that tab, checks with the background what data it should get. It gets that data, and sends it, via a message (a json string) to the background page.
The background page pushes that notification and the data on to the original content script on page D, which displays the information.
Optionally, the background page also transmits the information to your server for saving into that user's datastore.
The language I use for the browser extension "background page" and "content script" is pretty much focussed on Google Chrome. The same concepts are available in Safari, Firefox as well. If you want to support IE you're going to have to work out something else. IE10 does not plan to even support extensions.
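A skeletal version of that message flow for Chrome; the manifest wiring is omitted, and the selectors, message type, and PAGE_D_TAB_ID are placeholders:
// Content script on the external tab: grab the data and hand it to the background page.
chrome.runtime.sendMessage({
    type: "scraped",
    payload: document.querySelector("#target").textContent // hypothetical selector
});

// Background page: relay the data to the tab holding page D.
chrome.runtime.onMessage.addListener(function (msg, sender) {
    if (msg.type === "scraped") {
        chrome.tabs.sendMessage(PAGE_D_TAB_ID, msg); // tab id recorded when D started the flow
    }
});

// Content script on page D: display what came back.
chrome.runtime.onMessage.addListener(function (msg) {
    if (msg.type === "scraped") {
        document.querySelector("#results").textContent = msg.payload; // hypothetical container
    }
});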
If the external page and your page are on the same domain, then you should be able to access that external page using JavaScript. Otherwise, the JavaScript won't be allowed to access the external site; browsers block that under the same origin policy.
I have a requirement where I have to submit same/similar data to 10-15 forms at a time. What I want to do is create a single page where all those forms are loaded, and fill in all known values automatically... The end user simply has to fill in the captchas shown for those 15 forms... Now I want each form's submission response to be loaded into an iframe within the same web page.
After this, I want a simple js to be loaded into each iframe, which reads some data from the parent document, as well as entire content of the response web page, and sends this using XMLHttpRequest to my web application. (The web application will parse through the content of form submission response, and see if the submission is successful or not).
The script that should be loaded into each iframe (within the main window) should read the iframe ID, some divs from the main window, and entire content of that iframe, and send it as a POST request to my web app.
Can such a scenario be implemented using Greasemonkey? Note that when the page with iframes is initially loaded, the iframes are blank; at this stage no data from the iframes should be sent to my web app. Only after the user submits all 10 forms, and the iframes are all loaded with the respective form submission responses, should the js send the data within each iframe to my web app.
One more question: currently I plan to use Google Chrome with the appropriate runtime parameters to disable the same origin policy... But if the above scenario can be implemented with a Greasemonkey script, will I need to disable the Same Origin Policy in Firefox as well? Also, there is a Firefox extension to add a CORS enabler to a web page; can I combine that with the code for the above scenario, so that even if an iframe is on a different domain than the main window, the data of each iframe is still submitted?
1- A Greasemonkey script loads on every page and iframe that matches your site filter.
You can stop it from running in the main window with this check:
if(window == window.top) return;
// else do the rest
2- You can access the parent window and its content with window.parent, and access the iframe from the parent with the .contentWindow property of your iframe (if they have the same domain).
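A minimal sketch of a userscript that reads from the parent window and posts an iframe's response to your web app; the @include pattern, element ids, and endpoint are hypothetical, and the cross-frame reads assume the same origin policy is relaxed as described in the question. GM_xmlhttpRequest is used because it may POST cross-origin to your web app:
// ==UserScript==
// @name     Collect form responses (sketch)
// @include  https://target-site.example/response*
// @grant    GM_xmlhttpRequest
// ==/UserScript==

if (window !== window.top) { // run only inside the response iframes
    var frameId = window.frameElement ? window.frameElement.id : "unknown";
    var note = window.parent.document.getElementById("notes"); // hypothetical div in the main window
    GM_xmlhttpRequest({
        method: "POST",
        url: "https://my-web-app.example/collect",
        headers: { "Content-Type": "application/json" },
        data: JSON.stringify({
            frameId: frameId,
            parentNote: note ? note.textContent : null,
            responseHtml: document.documentElement.outerHTML
        })
    });
}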
I have a modal popup of an external site using jQuery.
The external site is a login page.
After completion of the login, the modal window will redirect to a site.
I want to:
i) identify when the modal has redirected (login complete)
ii) capture the modal URL to acquire parameters from the URL.
How do I do this with jQuery methods and JavaScript?
If it's an external site, you can't, due to the Same Origin Policy. You can't sneak a look into other sites' documents, even if you opened them in a modalDialog or in-page iframe. It would be a security hole if you could steal parameters from the URL.
You can't repurpose another site's login process for your own ends unless that site deliberately exposes an interface to let you do it.
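For illustration only: if the login site's owners did choose to expose such an interface, window.postMessage is the usual mechanism. All origins and parameter names below are hypothetical:
// On the external login site (only its owners can add this), after login completes:
// window.parent.postMessage({ loggedIn: true, redirectUrl: location.href }, "https://www.your-site.example");

// On your page, listen for that message:
window.addEventListener("message", function (event) {
    if (event.origin !== "https://auth.external-site.example") return; // always verify the sender
    if (event.data && event.data.loggedIn) {
        var params = new URL(event.data.redirectUrl).searchParams;
        console.log("login complete, token =", params.get("token")); // hypothetical parameter
    }
});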