URL tracking functionality - javascript

I want my webpage to have two parts. The top part has a textbox. When the user types a URL into the textbox, the bottom part browses to the content of that URL. When the user clicks a link within the bottom part, the bottom part navigates to the new URL, and the textbox in the top part changes to the new URL. How can I do it?
NOTE: This behavior is the same as in Google Translate (e.g. here), but without any translation.

first problem..
Same origin issue
The only way to achieve what you are asking is exactly the way Google Translate does it - which is to use a server-side script as a proxy for the request:
http://translate.google.com/translate_un?depth=1&hl=en&ie=UTF8&prev=_t&rurl=translate.google.com&sl=auto&tl=en&twu=1&u=http://de.wikipedia.org/wiki/USA&lang=de&usg=ALkJrhgoLkbUGvOPUCHoNZIkVcMQpXhxZg
The above is the URL taken from the iframe that Google Translate uses to display the translated page. The main thing to note is that the domain part of the URL is the same as the parent page's domain, http://translate.google.com -- if your frame and your parent window do not share the same domain, your parent window's JavaScript won't be able to access anything within the iframe. It will be blocked by your browser's built-in security.
Obviously the above won't be a problem if in your project you are only ever going to be navigating your own pages (on the same domain), but considering you are proffering Google Translate as an example I'm assuming not.
What would Google do?
What the above URL does is ask the server side to fetch the Wikipedia page and return it, so that the iframe can display it - but to the iframe this page appears to be hosted on translate.google.com rather than wikipedia.org. This means the iframe stays within the same origin as the parent window, which means JavaScript can be used to edit or modify the page within the iframe.
next problem....
Rewrite the proxied content
Basically what I'm saying is that this can't be achieved with just HTML and client-side JavaScript - you need something to help from the server side, i.e. PHP, Python, Ruby, Lisp, Node.. and so on. This script will be responsible for making sure the proxied page appears/renders correctly, e.g. you will have to make sure relative links to content/images/css on the original server are not broken (you can use the base tag or physically rewrite relative links). There are also many sites whose terms of use would treat this as illegitimate use of their site, so they should be blacklisted from your service.
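For illustration, here is a very rough sketch of such a proxy endpoint in Node (the /proxy?u= route, the port and every name in it are assumptions; it only handles plain-HTTP targets and does none of the link rewriting, blacklisting or hardening a real service would need):
// minimal proxy sketch - fetch the requested page server-side and serve it from your own domain
var http = require('http');

http.createServer(function (req, res) {
  var u = new URL(req.url, 'http://localhost:8080');
  var target = u.searchParams.get('u');              // the page the iframe should display
  if (!target) { res.writeHead(400); return res.end('missing u parameter'); }

  http.get(target, function (upstream) {             // https targets would need the https module
    var body = '';
    upstream.on('data', function (chunk) { body += chunk; });
    upstream.on('end', function () {
      // inject a <base> tag so relative links/images/css still resolve against the original host
      body = body.replace(/<head([^>]*)>/i, '<head$1><base href="' + target + '">');
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end(body);
    });
  }).on('error', function () { res.writeHead(502); res.end('could not fetch ' + target); });
}).listen(8080);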
final problem..?
Prevent the user from breaking away from your proxy
Once you have your proxy script, you can then use an iframe (please avoid using old framesets), plus a bit of JavaScript magic that, onload or ondomready of the iframe, rewrites all of the links, forms and buttons in the page. This is so that when clicked or submitted, they post to your proxy script rather than the original destination. This rewrite code also has to send the original destination to your proxy script somehow - like the u parameter in the Google Translate URL. Once you've sorted this, your iframe will reload with the new destination content, but - all importantly - it will stay on the same domain.
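A minimal sketch of that rewrite step, assuming the iframe content is served through your own proxy (so it is same-origin) and using made-up element ids ('viewer', 'urlbox') and the /proxy?u= parameter style from above:
var frame = document.getElementById('viewer');
frame.onload = function () {
  var doc = frame.contentDocument;                   // accessible because the proxy keeps us same-origin
  // send every link back through the proxy, carrying the original destination in "u"
  Array.prototype.forEach.call(doc.querySelectorAll('a[href]'), function (a) {
    var original = a.href;                           // already absolute thanks to the injected <base>
    a.href = '/proxy?u=' + encodeURIComponent(original);
  });
  // keep the textbox in the top part in sync with what the frame is actually showing
  var params = new URLSearchParams(frame.contentWindow.location.search);
  document.getElementById('urlbox').value = params.get('u') || frame.src;
};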
too many problems!
If it were me, personally, I'd rethink your strategy
Overall this is not a simple task, and it isn't 100% foolproof either, because there are many things that will cause problems:
Certain sites are designed to break out of frames.
There are ways a user can navigate from a page that can not be easily rewritten i.e. any navigation powered by JavaScript.
Certain pages are designed to break when served up from the wrong host.
Sites that do this kind of 'proxying' of other websites can get into hot water with regards to copyright and usage.
The reason why Google can do it is because they have a lot of time, money and resources... oh and a great deal of what Google translate does is actually handled on the server-side - not in JavaScript.
suggestions
If you are looking for tracking users navigating through your own site:
Use Google Analytics.
Or implement a simple server-side tracking system using cookies.
If you are looking to track users coming to your site and then travelling on to the rest of the world wide web:
Give up, web technologies are designed to prevent things like this.
Or join an online marketing company, they do their best to get around the prevention of things like this.

add a javascript function to your second frame -
<frame id="dataframe" src="frame_a.htm" onload="load()">
let the text box have an id - say "test"
function load()
{
  // .src only reflects the attribute you set, not where the user has since navigated,
  // so read the frame's current location instead (this only works while the frame stays same-origin)
  var frame = document.getElementById('dataframe');
  document.getElementById('test').value = frame.contentWindow.location.href;
}

Related

Iframes security tips and scripts

I am trying to launch a website for myself which people might be using in the future. Currently I am allowing users to post iframes for YouTube, Google Maps, etc.: they copy the entire 'iframe' from Google Maps or YouTube and paste it into the post box, just to keep it simple.
I then store it in a MySQL database and display the post on some page. I am a little worried because, although I have asked users to paste only YouTube or Maps iframes, a devious mind might put the src of malicious code in there.
What are all the possible ways to prevent this?
I think there are multiple risks, some that come to mind are:
Cross-site scripting. There are too many ways to achieve this if you allow the full <iframe> tag to be displayed as entered. This is probably the main risk, and the showstopper. It would be really hard to prevent XSS if you just write the full iframe tag (as entered by an attacker) into subsequent pages. If you really want to do this, you should look into HTML sanitization like Google Caja or HTMLPurifier or similar, but it is a can of worms that you better avoid if possible.
Information leak to malicious website. This very much depends on the browser (and the exact version of such browser), but some information (like for example the window size, etc.) does leak to the website in an iframe, even if it's from a different origin.
Information / control leak from malicious website. Even worse than the previous, the embedded website would have some control over the window, for example it can redirect it (again, I think it depends on the browser though, I'm not quite sure), or can change the url hash fragment. Also if postMessage is used, the iframe can send messages to your application, which can be exploited if your application is not properly secured (not necessarily right now, but at any time in the future, like 5 years from now, after much development).
Arbitrary text injection, possibly leading to social engineering. Say an adversary includes a frame that says something like "You are the winner of this month's super-prize! Call 1-800-ATTACKER to provide your details and get your reward!"... You get the idea. The message would look like a legitimate one from your website, when it's not.
So you'd better not allow people to enter full tags as copied from Google Maps or anywhere else. There appears to be a finite set of things you want to allow (YouTube videos and Google Maps links, for example, are only two), for which you should have customized controls. The user would only enter the video id/slug (the part after ?v=...), or would paste the full link, from which you would take the id, and you would build the actual tag for your page on the server side. The same goes for Google Maps: if the user navigates wherever he wants in a Maps window and pastes the url, you can build your own iframe I think, because everything is in the url in Google Maps.
So in short, you should not allow people entering tags. XSS can be mitigated by sanitizers, but other risks listed above cannot.
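For the YouTube case, a small sketch of that server-side idea (the function names and the fixed embed template are mine, not a required API; only the extracted id ever makes it into the stored markup):
function extractYouTubeId(input) {
  // accept either a bare 11-character id or a full watch / youtu.be URL
  var match = /(?:v=|youtu\.be\/)([A-Za-z0-9_-]{11})/.exec(input);
  if (match) return match[1];
  return /^[A-Za-z0-9_-]{11}$/.test(input) ? input : null;
}

function buildYouTubeEmbed(input) {
  var id = extractYouTubeId(input);
  if (!id) return '';                                // reject anything that is not a video id
  // only your own fixed template is ever stored/served, never user-supplied markup
  return '<iframe width="560" height="315" src="https://www.youtube.com/embed/' + id +
         '" frameborder="0" allowfullscreen></iframe>';
}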

Deep linking javascript powered websites

I have a website which has two versions, an all singing all dancing javascript powered application which is served when you request the root url
/
As you navigate around the lovely website the content updates, as does the url, thanks to html5 push state or good old correctly formatted #! urls. However if you don't have javascript enabled you can still use all the functionality of the site, as each piece of content also exists under its own url. This is great for 3 reasons:
non javascript users can still use the site
SEO - web crawlers can index the site easily
everything is shareable on social networks
The third reason is very important to me as every piece of content must be individually shareable on the site. And because each piece of content has its own url it is easy to deep link to that url, and each piece of content can have its own specific open graph data.
However the issue I hit is the following. You are a normal person with javascript enabled and you are browsing an image gallery on the site and decide to share the picture of a lovely cat you have found. Using javascript the url has been updated to
/gallery/lovely-cat
You share this url and your friend clicks on it. When they click on the link the server sends them the non-javascript / web crawler version of the site, and the experience is nowhere near as nice as the javascript version they would have been served if they had gone directly to the root of the site and navigated from there.
Does anyone have a nice solution / alternative setup to solve this problem? I have several hacks which work, however I am not that happy with them. They include:
javascript redirect to the root of the site on every page and store a cookie / add a #! to the url so on page render the javascript router will show the correct content. ( does google punish automatic javascript redirects? )
render the no javascript page, and add some javascript which redirects the user to the root, similar to above, whenever the user clicks on a link
I don't particularly like either of these solutions, but can't think of a better solution. Rendering the entire javascript app for each page doesn't appear to be a solution to me, as you would end up with bad looking urls such as /gallery/lovely-cat/gallery/another-lovely-cat as you start navigating through the site.
My solution must support old browsers which do not implement push state
Make the "non javascript / web crawler version of the site" the same as the JavaScript version. Just build HTML on the server instead of DOM on the client.
Rendering the entire javascript app for each page doesn't appear to be a solution to me,
That is the robust approach
as you would end up with bad looking urls such as /gallery/lovely-cat/gallery/another-lovely-cat
Only if you linked (and pushState'd) to gallery/another-lovely-cat instead of /gallery/another-lovely-cat. (Note the / at the front.)
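To illustrate the point about the leading slash, a tiny sketch (navigateTo and loadContent are illustrative names, and this assumes a pushState-capable browser; older ones would fall back to the #! scheme mentioned in the question):
function navigateTo(path) {              // e.g. navigateTo('/gallery/another-lovely-cat')
  history.pushState({}, '', path);       // root-relative path, so URLs never nest
  loadContent(path);                     // hypothetical function that fetches and renders the content
}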
Try out this plugin; it might solve your 3rd reason, along with the other two.
http://www.asual.com/jquery/address/

How do you keep content from your previous web page after clicking a link?

I'm sorry if this is a newbie question but I don't really know what to search for either. How do you keep content from a previous page when navigating through a web site? For example, the right side Activity/Chat bar on facebook. It doesn't appear to refresh when going to different profiles; it's not an iframe and doesn't appear to be ajax (I could be wrong).
Thanks,
I believe what you're seeing in Facebook is not actual "page loads", but clever use of AJAX or AHAH.
So ... imagine you've got a web page. It contains links. Each of those links has a "hook" -- a chunk of JavaScript that gets executed when the link gets clicked.
If your browser doesn't support JavaScript, the link works as it normally would on an old-fashioned page, and loads another page.
But if JavaScript is turned on, then instead of navigating to an HREF, the code run by the hook causes a request to be placed to a different URL that spits out just the HTML that should be used to replace a DIV that's already showing somewhere on the page.
There's still a real link in the HTML just in case JS doesn't work, so the HTML you're seeing looks as it should. Try disabling JavaScript in your browser and see how Facebook works.
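A modern-browser sketch of such a hook (the data-ajax attribute, the #content id and the idea that the server returns just an HTML fragment are all assumptions):
document.addEventListener('click', function (e) {
  var link = e.target.closest('a[data-ajax]');
  if (!link) return;                               // plain links keep working without JS
  e.preventDefault();
  fetch(link.href, { headers: { 'X-Requested-With': 'XMLHttpRequest' } })
    .then(function (res) { return res.text(); })
    .then(function (html) {
      document.getElementById('content').innerHTML = html;   // replace only the main DIV
    });
});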
Live updates like this are all over the place in Web 2.0 applications, from Facebook to Google Docs to Workflowy to Basecamp, etc. The "better" tools provide the underlying HTML links where possible so that users without JavaScript can still get full use of the applications. (This is called Progressive Enhancement or Graceful degradation, depending on your perspective.) Of course, nobody would expect Google Docs to work without JavaScript.
In the case of a chat like Facebook, you must save the entire conversation on the server side (for example in a database). Then, when the user changes the page, you can restore the state of the conversation on the server side (with PHP) or by querying your server like you do for the chat (Javascript + AJAX).
This isn't done in Javascript. It needs to be done using your back-end scripting language.
In PHP, for example, you use Sessions. The variables set by server-side scripts can be maintained on the server and tied together (between multiple requests/hits) using a cookie.
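The example above is PHP, but the same idea in Node/Express might look roughly like this (express and express-session assumed to be installed; values purely illustrative):
var express = require('express');
var session = require('express-session');

var app = express();
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));

app.get('/page-one', function (req, res) {
  req.session.sidebarState = 'whatever you need to keep';   // stored server-side, tied to a cookie
  res.send('stored');
});

app.get('/page-two', function (req, res) {
  res.send('still here: ' + req.session.sidebarState);      // survives the move to another page
});

app.listen(3000);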
One really helpful trick is to run HTTPFox in Firefox so you can actually monitor what's happening as you browse from one page to the next. You can check out the POST/Cookies/Response tabs and watch for which web methods are being called by the AJAX-like behaviors on the page. In doing this you can generally deduce how data is flowing to and from the pages, even though you don't have access to the server side code per se.
As for the answer to your specific question, there are too many approaches to list (cookies, server side persistence such as session or database writes, a simple form POST, VIEWSTATE in .net, etc..)
You can open your last closed web-page by pressing Ctrl+Shift+T, and then you can save content as you like. Example: if I closed a web-page about document sharing and I am now on a travel web page, then I press Ctrl+Shift+T and my last web-page opens again automatically. This works in Mozilla, Internet Explorer, Opera and more. Hope this answer is helpful to you.

How to offer a webapp to other sites. (div with javascript, iframe or..?)

I am quite new to web application development and I need to know how would I make other sites use it.
My webapp basically gets a username and returns some data from my DB. This should be visible from other websites.
My options are:
iframe. The websites owners embed an iframe and they pass the userid in the querystring. I render a webpage with the data and is shown inside the iframe.
pros: easy to do, working already.
cons: the websites won't know the data returned, and they may like to know it.
javascript & div. They paste a div and some javascript code in their websites and the div content is updated with the data retrieved by the small javascript.
pros: the website would be able to get the data.
cons: I could mess up their website, and I don't know how I would run the javascript code apart from it being triggered by a document ready; also, I wouldn't like to add jQuery libraries to their sites.
There must be better ways to integrate web applications than what I'm thinking. Could someone give me some advice?
Thanks
Iframes cannot communicate with pages that are on a different domain. If you want to inject content into someone else's page and still be able to interact with that page you need to include (or append) a JavaScript tag (that points to your code) to the hosting page, then use JavaScript to write your content into the hosting page.
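A sketch of that script-tag approach (widget.js, the data-user attribute, the API path and the element id are all illustrative, and your API would need to allow cross-origin requests, or use JSONP as discussed below):
// The site owner pastes something like:
//   <div id="mywidget-root"></div>
//   <script src="https://yourapp.example.com/widget.js" data-user="alice"></script>
// widget.js then runs inside the host page, so it can both render and expose the data:
(function () {
  var script = document.currentScript;
  var user = script.getAttribute('data-user');
  fetch('https://yourapp.example.com/api/data?user=' + encodeURIComponent(user))
    .then(function (res) { return res.json(); })
    .then(function (data) {
      document.getElementById('mywidget-root').textContent = data.message;
      window.MyWidgetData = data;                  // the hosting page can read the returned data too
    });
})();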
Context Framework contains embedded mode support, where page components can be injected into other pages via Javascript. It does depend on jQuery but it can always be used in noConflict mode. In the current release the embedded pages must be on the same domain so that the same-origin policy is not violated.
In the next release, embedded mode can be extended to use JSONP which enables embedding pages everywhere.
If what you really want is to expose the data, but not the visual content, then I'd consider exposing your data via JSONP. There are caveats to this approach, but it could work for you. There was an answer here a couple of days ago about using a Web Service, but this won't work directly from the client because of the browser's Same Origin policy. It's a shame that the poster of that answer deleted it rather than leave it here as he inadvertently highlighted some of the misconceptions about how browsers access remote content.
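A bare-bones JSONP sketch (the endpoint path and callback parameter name are assumptions; the server has to wrap its JSON in the named callback):
function handleUserData(data) {          // executed when the JSONP response arrives
  document.getElementById('mywidget-root').textContent = data.message;
}

var s = document.createElement('script');
s.src = 'https://yourapp.example.com/api/user?name=alice&callback=handleUserData';
document.head.appendChild(s);

// the server responds with something like:  handleUserData({"message": "..."})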

How does Google Analytics In-Page Analytics work?

Google Analytics features an 'In-Page Analytics' view to show click-through rates and other information directly on your own website. I'm looking to build something similar that logs all clicks.
The problem is I'm not really sure how Google implement their In-Page Analytics views - they seem to use an iframe, or two, and have injected their own HTML and JavaScript onto other pages.
How would one go about doing such a thing - are iframes the best way to go? How would you avoid the same-origin security policies of Javascript if domainX is trying to manipulate the rendering of domainY?
This is a very interesting question. You're right, the same origin policy forbids injecting JS. But Google Analytics has an advantage: it already is in your site (the tracker code).
So here is how it works (as far as I can see):
When you open in-page analytics, you are first taken to https://www.google.com/analytics/reporting/iyp_launch
This page redirects to your site and adds a Google session to the url (like http://example.com/#gaso=THESESSION).
The tracker now checks if the referrer is iyp_launch and gaso is set. If yes, it does not only load the tracker, but also injects the JS needed for requesting further data and rendering the overlays. This way, the JS is executed inside the frame (or window) and bypasses the same-origin policy.
Since Google Anaytics already tracks your visit (i.e. identifies you as the same user that viewed the previous page), it can from then on inject the additional JS along with the tracker until your visit is over (i.e. you close the page). This way, the overlays can be rendered again after you click a link.
So I guess the bottom line is this: Things like in-page analytics can be done if the site's owner has already trusted you by adding a script you control to his website (this is a good example why one should be very careful before doing such a thing). If you don't have that kind of access to the site, it might be impossible to bypass the same-origin policy - at least, I can't think of another way to do it (except maybe proxying all the requests through your server, but that leads to other major problems).
I think this can be done quite simply. You inject javascript that makes sure that whenever a user clicks a link it requests/posts to a special page in the iframe. Something like this (jquery):
$(document).on('click', 'a', function() {
    // load the tracking page in the iframe, passing the clicked link's id and the current page URL
    $('#' + iframeid).attr('src', 'http://somedomain.com/my_magic_page.php?linkClicked=' +
        encodeURIComponent($(this).attr('id')) + '&page=' + encodeURIComponent(window.location.toString()));
});
Don't know if it'll work though.
