Using jQuery/JavaScript with Google Forms

I am trying to use jQuery/JavaScript with an iframe containing a Google form.
Here is the code snippet:
<body>
    <iframe id="myFormFrame" src="https://docs.google.com/forms/d/smfjkafj809890dfafhfdfd/viewform?embedded=true" width="760" height="1750" frameborder="0" marginheight="0" marginwidth="0">Loading...</iframe>
    <button id="mybtn">Click</button>
    <script type="text/javascript">
        $('#mybtn').click(function () {
            var m = document.getElementById("ss-form");
            alert(m.innerHTML);
        });
    </script>
</body>
On click, the button should give the innerHTML of the form (which, when loaded inside the iframe, has an id of "ss-form"). But it's not working. So what could be the reason (a cross-domain iframe?)?
And what is the way to make jQuery work with Google Forms?
If it helps, I am trying to put custom Ajax validation in place (for example, the email field must contain addresses from a specific list obtained from a different database).
EDIT
Well, it looks like a cross-domain issue. I had this doubt in mind. But after looking at this article, it looks like it is possible.

document.getElementById searches the current document, not other documents loaded into frames.
If the framed document were on your own site, then you could first access the frame, then access its document (via contentDocument), and then call getElementById on that.
Since it is cross-origin, that will fail. The reasons for this are obvious if you consider what would happen if Mallory's evil website were to stick your bank's online banking site into a frame while you were logged in.
If you had the cooperation of Google (you don't), then you could pass messages between the frames using postMessage.
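For what it's worth, here is a minimal sketch of what that cooperation would look like if both pages were under your control; the message shape and the parent origin are hypothetical:
// Parent page: once the frame has loaded, ask the framed document for the form's HTML.
var frame = document.getElementById("myFormFrame");
frame.addEventListener("load", function () {
    frame.contentWindow.postMessage({ type: "getFormHtml" }, "https://docs.google.com");
});
// Parent page: listen for the reply, verifying the sender's origin.
window.addEventListener("message", function (event) {
    if (event.origin !== "https://docs.google.com") return; // ignore other senders
    console.log(event.data); // the form's HTML, if the framed page cooperates
});
// Framed page (this part is what would require Google's cooperation):
window.addEventListener("message", function (event) {
    if (event.origin !== "https://www.example.com") return; // hypothetical parent origin
    var form = document.getElementById("ss-form");
    event.source.postMessage(form.innerHTML, event.origin);
});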
Regarding the edit ("after looking at this article, it looks like it is possible"):
They aren't using a form hosted by Google. They've used Google tools to design the form and then copied the HTML to their own site. If Google ever implement CSRF protection, that will break.

Related

Loading a specific div from another website into own website

I've tried using the jQuery load function, but as the external site does not allow CORS requests, my original GET request gets blocked.
<div id="test"></div>
<script>
$(document).ready(function () {
$("#test").load("https://mywebsite.com");
});
</script>
So it seems that my only approach is to use iframes? Is there a way to crawl only a specific div with iframes? I don't want to display the whole website.
EDIT: Since I am using Django, I was able to crawl the website with Python in a view and then push the crawled and cleaned-up code snippet into the HTML template. Nevertheless, to answer my question: there is no correct way of doing it as long as the website you are trying to access is blocking the content.
Work with the owner of the site you want to take content from.
They can set you up with an API. That avoids having to use hacky methods or risking copyright-related legal trouble.

Security concerns with same origin iframes [duplicate]

I am planning to create an open source education web app where people can add and edit the content (a bit like Wikipedia).
However, I wish to add another feature that allows the user to add their own interactive content using JavaScript (similar to how JSFiddle does it).
What are the security concerns in doing this?
Optional question: How can these issues be overcome?
Yes; you could use the HTML5 sandbox to load user scripts only inside an iframe.
You should host user content only on a different domain than your main site. This will prevent any XSS attack if an attacker convinces a user to visit the page directly (outside of the sandbox). For example, if your site is www.example.com, you could use the following code to display the sandboxed IFrame (note .org rather than .com, which is an entirely different domain):
<iframe src="https://www.example.org/show_user_script.aspx?id=123" sandbox="allow-scripts"></iframe>
This will allow scripts, but forms and navigation outside of the IFrame will be prevented. Note that this approach could still risk a user hosting a phishing form to capture credentials. You should make sure that the boundaries between your site and the user content are clear within the user interface. Even though we haven't specified allow-forms, this only prevents a form from being submitted directly, it does not prevent form elements and JavaScript event handlers from sending any data to an external domain.
The HTML5 Security Cheat Sheet guidance on OWASP states this is the purpose of the sandbox:
Use the sandbox attribute of an iframe for untrusted content
You should test whether sandbox is supported first, before rendering the IFrame:
<iframe src="/blank.htm" sandbox="allow-scripts" id="foo"></iframe>
var sandboxSupported = "sandbox" in document.createElement("iframe");
if (sandboxSupported) {
    document.getElementById('foo').setAttribute('src', 'https://www.example.org/show_user_script.aspx?id=123');
} else {
    // Not safe to display the IFrame
}
It is safer to dynamically change the src this way, rather than redirecting away when sandboxSupported is false, because then the iframe cannot accidentally be rendered if the redirect doesn't happen in time.
As a simpler alternative, without the need to check whether the sandbox is supported, you can use the srcdoc IFrame attribute to generate the sandboxed content, making sure that all content is HTML encoded:
e.g.
<html><head></head><body>This could be unsafe</body></html>
would be rendered as
<iframe srcdoc="&lt;html&gt;&lt;head&gt;&lt;/head&gt;&lt;body&gt;This could be unsafe&lt;/body&gt;&lt;/html&gt;" sandbox="allow-scripts"></iframe>
Or you could construct a data blob object, being careful to HTML encode again:
<body data-userdoc="&lt;html&gt;&lt;head&gt;&lt;/head&gt;&lt;body&gt;This could be unsafe&lt;/body&gt;&lt;/html&gt;">
<script>
    // The encoded data attribute decodes to the raw user HTML, which is wrapped in a blob
    var unsafeDoc = new Blob([document.body.dataset.userdoc], {type: 'text/html'});
    var iframe = document.createElement('iframe');
    iframe.src = window.URL.createObjectURL(unsafeDoc);
    iframe.sandbox = 'allow-scripts';
    document.body.appendChild(iframe);
</script>
Of course you could also set the unsafeDoc variable from a JSON data source. It is not recommended to load an HTML file, as that has the same problem of having to be hosted on an external domain: the attacker could just entice the user to load the file directly.
Also, please don't be tempted to write user content into a script block directly. As shown above, data attributes are the safe way to do this, as long as correct HTML encoding is carried out on the user data as it is output server-side (see the sketch below).
In these cases you can leave src as blank.htm, as older browsers that do not support srcdoc will simply load that URL.
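As a minimal illustration of the kind of HTML encoding meant above (a hypothetical helper; a vetted server-side encoder is preferable in practice):
// Hypothetical HTML-encoding helper, shown only to illustrate what
// "correct HTML encoding" means here; prefer your framework's built-in encoder.
function escapeHtml(s) {
    return s.replace(/&/g, '&amp;')
            .replace(/</g, '&lt;')
            .replace(/>/g, '&gt;')
            .replace(/"/g, '&quot;')
            .replace(/'/g, '&#39;');
}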
As #Snowburnt touches upon, there is nothing stopping a user script from redirecting a user to a site where a drive-by download occurs. But assuming users are up to date on patches and there are no zero-day vulnerabilities, this is a safe approach, because it protects your end users and their data on your site via the same-origin policy.
One big issue is cross-site scripting where users add code that tells the browser to open and run code from other sites. Say they add something that creates an iFrame or a hidden iFrame pointing to a site and starts downloading malicious code.
There's no simple way (thanks to Bergi in the comments) to make sure no elements are created and no Ajax calls are made.
I've been a member of sites that provided this functionality, but on those sites I paid for my own space, so any vulnerabilities I added inconvenienced my own clients. In that case it's a little more okay to let that slip by, since it's not a security leak for everyone.
One way around this is to create customizable controls for the users to use to add interactivity. The plus is that you control the JavaScript being added; the minus is that your user base will have to request controls and then wait for you to create them.

Determine if object/embed is forbidden?

You might know HN, but maybe you also dislike clicking around in many tabs and going back and forward. I thought about making a page that shows both the webpage and the links from HN. So I made this: http://goodfrontpage.com/direct
But there are two problems:
First: how can I determine if a page's HTTP headers disallow opening it in something like this:
<iframe class="webpage" src="{{post.url}}" ></iframe>
or this:
<object class="webpage" data="http://asfasfasfa.com">
    <embed class="webpage" src="http://asfasffvasf.com"></embed>
    Error: Webpage not accessible!
</object>
This is true for pages like github.com, eff.org, or youtube.com.
Second: is there any way to fetch the sites differently that would allow me to display all pages?
If you want to embed a webpage within another one then you should use the iframe element:
<iframe src="http://asfasffvasf.com"></iframe>
You can style this like any other block element, and set an explicit width and height.
Some pages ask browsers not to include them within iframes (using the X-Frame-Options header). I don't think there's an easy way to solve this on the client side, but you could create a simple backend or proxy to request the page you want and return the content. This gets round the iframe restriction because you're now including content from your own domain.
This does have a couple of security issues to be aware of:
You've now made a backend which can be used to download any page on the Internet. There's a denial of service vulnerability if someone makes lots of requests to download huge pages.
The pages you're including will no longer be restricted by the same origin policy. Scripts on those pages will be able to interact with everything on the parent page. This may be a problem if you plan on creating login functionality in the future.
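As for detecting up front whether a page forbids framing (the question's first problem), a rough server-side sketch follows, assuming Node 18+ with its global fetch; the logic is simplified and the test URL is just an example:
// Sketch: fetch only the headers and see whether framing is forbidden.
// Simplified: some servers reject HEAD requests, and a Content-Security-Policy
// frame-ancestors directive can also forbid framing.
async function allowsFraming(url) {
    const res = await fetch(url, { method: 'HEAD' });
    const xfo = res.headers.get('x-frame-options') || '';
    const csp = res.headers.get('content-security-policy') || '';
    return xfo === '' && !csp.includes('frame-ancestors');
}

allowsFraming('https://github.com').then(function (ok) {
    console.log(ok ? 'embeddable' : 'refuses framing');
});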
It seems like you are looking for the iframe element. It specifies an inline frame.
An inline frame is used to embed another document within the current HTML document.
<iframe src="http://asfasfasfa.com"></iframe>

Is it possible to use jQuery to grab the HTML of another web page into a div?

I am trying to integrate with the FireShot API so that, given a URL, I can grab the HTML of another web page into a div and then take a screenshot of it.
Some things I will need to do after getting the HTML:
grab <link> & <script> from <head>
grab <body> into <div>
But first, it seems that when I try to do a
$.get("http://google.com", function(data) { ... });
I get a 200 in Firebug colored red. I think it has to do with sites not allowing you to grab their page with JS? Then is opening a window the best I can do? But how might I control the other page with jQuery or call fsapi on that page?
UPDATE
I tried something like the code below to act when the new window is ready, but Firebug says "Permission denied to access property 'document'".
var w = window.open($url.val());
setTimeout(function () { // if I don't do this, I always get about:blank; is there a better way around this?
    $(w.document).ready(function () {
        console.log(w.document.body);
    });
}, 1000);
I believe the cross-site security setup within JavaScript is basically blocking this. You'd likely have to proxy the content through your own domain.
There are a couple of other options, I think, for breaking the cross-site security constraints, but I'm not sure I'd promote them.
If the "another page" locates within the same domain of your hosting page, yes, you can. Please refer to jQuery's $().load() API.
Otherwise, you're disallowed to do so by the browser's Cross-Site Security Policy. At this moment, you can choose to use iFrame instead of DIV.
Some jQuery plugins, e.g. Thickbox, provide the ability to load pages into an appropriate container automatically.
Unless I am mistaken, I do not believe you can AJAX a page cross-domain (e.g. from domain1.com to domain2.com). To get around this, you can have a PHP "proxy" script that does the "getting" of the page and then passes it to the JS.
For example, in JS you would get() http://mydomain.com/get/?domain=http://google.com and then do what you need to do!
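The answer suggests PHP, but the same proxy idea sketched in Node/Express (the route, names, and port are placeholders) may make the flow concrete. Note that this is effectively an open proxy, so real code should restrict which domains it will fetch:
// Sketch of the proxy idea in Node/Express rather than PHP.
// GET /get/?domain=http://google.com fetches the page server-side and
// returns its HTML, sidestepping the browser's same-origin policy.
const express = require('express');
const app = express();

app.get('/get/', async (req, res) => {
    try {
        const upstream = await fetch(req.query.domain); // Node 18+ global fetch
        res.type('html').send(await upstream.text());
    } catch (e) {
        res.status(502).send('Could not fetch ' + req.query.domain);
    }
});

app.listen(3000);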

Is there a way to mitigate downloading of resources (images/css and js files) with Javascript?

I have an HTML page on my localhost: get_description.html.
The snippet below is part of the code:
<input type="text" id="url"/>
<button id="get_description_button">Get description</button>
<iframe id="description_container" src="#"></iframe>
When the button is clicked, the src of the iframe is set to the URL entered in the textbox. The pages fetched this way are very big, with lots of linked files. What I am interested in on the page is a block of text contained in a <div id="description"> element.
Is there a way to mitigate downloading of resources linked in the page that loads into the iframe?
I don't want to use curl because the data is only available to logged-in users and the steps needed with curl to get the content are too complicated. The iframe is simple, as I use this on a box which sends the right cookies to identify the request as coming from a logged-in user, but the problem is that it is very wasteful to download nearly 1 MB of data to keep 1 KB of it and throw out the rest.
Edit
If the proposed method works only in Firefox, that is fine, so I added the Firefox tag. Also, it is possible that the answer actually lies in the realm of Firefox add-on techniques, so I added that tag as well.
The problem is not that I cannot get at what I'm looking for, rather, the problem is the easy iframe method is wasteful.
I know that Firefox allows loading only the text of a page. If you open a page and press Ctrl+U, you are taken to the 'view page source' window. There, links behave as normal and are clickable; if you click on a link in source view, the source of the new page is loaded into the view-source window without the linked resources being downloaded, which is exactly what I'm trying to get. But I don't know how to access this behaviour.
Another example is the Adblock add-on. It somehow kills elements before they get loaded. With plain JavaScript this is not possible, because it is triggered too late to intervene in time.
The same-origin policy forbids any web page from accessing the contents of any other web page in a different domain, so basically you cannot do that.
However, it seems that some browsers allow access to web page content when the request comes from a local web page, which seems to be your case.
Safari and IE 6/7/8 are browsers that allow a local web page to do so via XMLHttpRequest (source: Google Browser Security Handbook), so you may want to use one of those browsers to do what you need (note that future versions of those browsers may not allow this anymore).
Apart from this solution, I only see two possibilities:
If the web pages you need to fetch content from are somehow controlled by you, you can create a simpler interface to let other web pages get the content you need (for example by allowing JSONP requests; see the sketch after this list).
If the web pages you need to fetch content from are not controlled by you, the only solution I see is to fetch the content server-side, logging in from the server directly (I know that you don't want to do so, but I don't see any other possibility if the previous ones are not practicable).
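For illustration, a minimal JSONP consumer might look like this; the endpoint and callback name are hypothetical:
// JSONP sketch: the cooperating server wraps its JSON response in the
// callback named in the query string, e.g. handleContent({...});
function handleContent(data) {
    console.log(data.description);
}

var script = document.createElement('script');
script.src = 'https://example.com/content?callback=handleContent'; // hypothetical endpoint
document.body.appendChild(script);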
Hope it helps.
Actually I've seen Cross Domain jQuery .load request before, here: http://james.padolsey.com/javascript/cross-domain-requests-with-jquery/
The author claims that code like this, found on that page,
$('#container').load('http://google.com'); // SERIOUSLY!

$.ajax({
    url: 'http://news.bbc.co.uk',
    type: 'GET',
    success: function (res) {
        var headline = $(res.responseText).find('a.tsh').text();
        alert(headline);
    }
});
// Works with $.get too!
would work. (The BBC code might not work because of the recent redesign, but you get the idea)
Apparently it is using YQL wrapped into a jQuery plugin to do the trick. Now I cannot say I fully understand what he is doing there but it appears to work, and fits the bill. Once you load the data I suppose it is a simple matter of filtering out the data that you need.
If you prefer something that works at the browser level, may I suggest Mozilla's Jetpack framework for lightweight extensions. I've not yet read the documentation in its entirety, but it should contain the APIs needed for this to work.
There are various ways to go about this in AJAX, I'm going to show the jQuery way for brevity as one option, though you could do this in vanilla JavaScript as well.
Instead of an <iframe> you can just use a container, let's say a <div> like this:
<div id="description_container"></div>
Then to load it:
$(function() {
    $("#get_description_button").click(function() {
        // Load only the #description element from the fetched page
        $("#description_container").load($("#url").val() + " #description");
    });
});
This uses the .load() method, which takes a string in the format .load("url selector"), then takes that element from the fetched page and places its content inside the container you're loading into, in this case #description_container.
This is just the jQuery route, mainly to illustrate that yes, you can do what you want; you don't have to do it exactly like this. The concept is getting what you want from an AJAX request rather than from an <iframe>.
Your description sounds like you are fetching pages from the same domain (you said that you need to be logged in and have session credentials), so have you tried using an async request via XMLHttpRequest? It might complain if the HTML on a page is particularly messed up, but you should still be able to get the raw text via .responseText and extract what you need with a regex.
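As a rough sketch of that suggestion (the URL and regex here are illustrative only):
// Fetch the raw HTML as text; linked images/CSS/JS are never requested,
// because the markup is never rendered into the page.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/page-with-description.html', true); // illustrative same-origin URL
xhr.onload = function () {
    var match = xhr.responseText.match(/<div id="description">([\s\S]*?)<\/div>/);
    if (match) {
        console.log(match[1]); // the contents of <div id="description">
    }
};
xhr.send();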
