One of the earliest hacks of a major social website (exploiting an old browser) went like this:
Log in to the site, so you have a session cookie.
Create a <script> tag whose src references the "get friends list" URL.
Override the Array constructor via JavaScript.
Since the <script> tag does send the cookie, the request is authenticated, and the response is pure JSON (code that does nothing on its own when run); but because the Array constructor is overridden, we can capture the data (a rough sketch follows).
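To make those steps concrete, here is a rough sketch of how the attacker's page could have looked. The endpoint URL is made up, and this only worked in old engines (early Firefox, for instance) where evaluating a bare JSON array literal invoked the page's overridden Array constructor; modern browsers no longer do this.

// rough sketch of the old exploit; it does NOT work in modern browsers
var stolen = [];
// override the Array constructor before the JSON response arrives
Array = function(){
    for (var i = 0; i < arguments.length; i++) stolen.push(arguments[i]);
};
// a <script> tag pointing at the "get friends list" URL; the victim's
// session cookie for the social site is sent with this request
var s = document.createElement("script");
s.src = "https://social.example/friends.json";   // hypothetical endpoint returning a bare JSON array
document.body.appendChild(s);
// in vulnerable engines, evaluating that bare array called our Array
// constructor, so the private data ended up in `stolen`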
That's fine (it was just a preamble to my question).
Question
What is the complete list of elements that also send cookies with a cross-domain request?
Wouldn't it be more accurate to say that any resource requested by the browser from a given domain will have that domain's cookies sent with the request? So really, any element that loads a resource from a server will have the cookies sent. So I'd say images, JSON files, HTML/PHP files, external CSS files and probably web fonts all send cookies. This is one of the reasons why you might want to host your static resources (scripts, CSS files, images) on a separate, cookie-less domain as an optimisation.
This JSFiddle is mostly a proof that CSS files can "remember".
HTML
<link href="remember.php?.css" rel="stylesheet"/>
<a href="#" id="red">Remember Red</a>
Javascript
red.onclick=function(e){
    // request remember.php as an "image" so the session cookie rides along
    var img=new Image();
    img.src="remember.php?col=red";
    return false;
};
remember.php
<?php
session_start();
if(isset($_GET["col"])){
    $_SESSION["fav_color"]=$_GET["col"];
}
// @ suppresses the notice when the session key is not set yet
echo "body {
    color:".htmlentities(@$_SESSION["fav_color"] ?: "blue")."
}";
So what should happen is that when we load an image with the URI remember.php?col=red, the server remembers that color value in the session, so the stylesheet comes back red even after a refresh. The same principle applies to images and, I would assume, web fonts.
Another example is images, which should send cookies when loaded. Note that, for example, stackoverflow.com hosts its images on another domain (the layout sprites are at cdn.sstatic.net/stackoverflow/img/sprites.png). And even if cookies were sent, we wouldn't normally know unless the cookie affects the image somehow. But if we check with the developer tools, we can see that cookies do get sent. For example:
An image hosted on php.net
Same image on a different domain
As you can see, the cookies do get sent, even cross-domain. As further proof, here is the remember.php demo again, but with images.
Demo
HTML
<img src="http://mfirdaus.net/random/so/remember_image.php"/>
<a href="#" id="toggle">Toggle Image</a>
Javascript
toggle.onclick=function(){
    var img=new Image();
    // the ?toggle request returns no image data, so onerror fires once the
    // server has flipped the session flag, and we reload to see the new image
    img.src="http://mfirdaus.net/random/so/remember_image.php?toggle";
    img.onerror=function(){
        window.location=window.location;
    };
    return false;
};
remember_image.php
<?php
session_start();
if(isset($_GET["toggle"])){
    $_SESSION["like_cats"]=!@$_SESSION["like_cats"];
    die();
}
header("Content-Type: image/jpeg");
echo file_get_contents(@$_SESSION["like_cats"] ? "cat.jpeg" : "duck.jpeg");
In this demo the cookie does affect the image, so it's easier to tell that cookies are sent with image requests.
Now, whether the resource contains privileged data (such as the JSON that contains the friend list), and whether the page requesting that resource has any way to use that privileged data (in this case, by doing JavaScript tricks to exploit the JSON), is another matter. Browsers should be safe enough that most of the obvious vectors are secured. We can't even read another domain's images through a canvas, because drawing them taints the canvas. But of course there will always be those pesky bugs and exploits for browser vendors to deal with.
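As a small illustration of that canvas restriction (the image URL is hypothetical):

var img = new Image();
img.src = "https://other-domain.example/photo.png";   // cross-origin, no CORS headers
img.onload = function(){
    var canvas = document.createElement("canvas");
    canvas.width = img.width;
    canvas.height = img.height;
    var ctx = canvas.getContext("2d");
    ctx.drawImage(img, 0, 0);          // drawing is allowed, but it "taints" the canvas
    try {
        ctx.getImageData(0, 0, 1, 1);  // reading pixels back from a tainted canvas throws
    } catch(e) {
        console.log("canvas is tainted:", e.name);   // SecurityError
    }
};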
I used to use this fact in a Firefox extension that scraped authenticated pages of a website to show a sidebar with the parsed data: Ajax in Firefox extensions doesn't have the same-origin restrictions that normal pages have, and I didn't have to do anything special to authenticate because the requests send the cookies as one would expect.
Related
How can I cache static JavaScript files on www.foo.ru from www.abc.ru?
I try to load them via a script tag (setting the src attribute), but when I go to www.abc.ru the requests are sent again and the cache is ignored. Does the browser separate the cache by origin, or is something else going on?
As Terry said in a comment, you can't do that directly. It used to be possible, but it was an information leak (http://malicious-site-example.com could see that it gets a really fast response to http://example.com/some-asset and use that information [probably in combination with other similar heuristics] to infer that you've been on http://example.com lately). So now, the cached response is only used for the origin that originally requested it — that is, effectively different origins have different caches.
Presumably you only want to do this on a pair of sites where you control both of them. In that case, you might use an iframe on foo.ru's page that directly loads a page from abc.ru that does the load. Then it's abc.ru that's doing the request, and it's cached such that it's connected to abc.ru, not foo.ru. You can hide the iframe by making it zero-height or off the page or similar.
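A rough sketch of that idea, assuming you control both sites (the URLs and the name of the prefetch page are made up):

// on a page served from www.foo.ru: add a hidden iframe pointing at a tiny
// "warm the cache" page hosted on www.abc.ru
var frame = document.createElement("iframe");
frame.src = "https://www.abc.ru/prefetch.html";   // this page just loads the shared script
frame.style.cssText = "width:0;height:0;border:0;position:absolute";
document.body.appendChild(frame);

// prefetch.html on www.abc.ru would contain something like:
//   <script src="https://www.abc.ru/static/shared.js"></script>
// so the request (and its cache entry) belongs to abc.ru's origin, not foo.ru's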
I'm a desktop developer who is trying to learn some web basics on the side. I've previously put together an ASP.NET MVC website that worked more or less okay, and I'm currently working on a simpler, HTML/CSS/JS-only website.
A number of the pages on the website will contain images with several pieces of data accompanying them, so I thought I'd put together a JSON file with all of the data, including the links to the images, and generate the image list on page load. The problem I ran into is a JavaScript cross-origin error when trying to fetch the JSON file.
I've looked around at solutions, and most of them recommend spinning up a server (either ASP.NET or Node.js) to fetch the JSON from. A couple of questions:
If I can write HTML that references image files, why can I not fetch a JSON file from JavaScript? Is there a fundamental piece of understanding that I'm missing here?
Is there any other way of using a JSON file without spinning up a web server? Should I try embedding it in the HTML? Is that a bad idea?
Any other pointers/links to resources with relevant info :)
// My JavaScript:
<script>
$(document).ready(function(){
buildGallery('test.json', '#gallery');
});
// Builds a collection of thumbnails from the json specified inside of specified div
function buildGallery(jsonUrl, galleryDiv){
$.getJSON(jsonUrl, function(data){
// Ensure the data is in correct format
if (typeof(data) !== 'object'){
return;
}
// Build the gallery
$.each(data['images'], function(key, image){
var thumbnail = '<img src="' + image['url'] + '"/>'
$(galleryDiv).append(thumbnail);
});
});
}
</script>
This is based on: https://api.jquery.com/jQuery.getJSON/
Thanks heaps!
There are a couple of issues with what you are trying to accomplish with the provided code.
First, you are trying to make an Ajax request to a resource that is not hosted on an HTTP server. Ajax is a wrapper around XMLHttpRequest, which was designed for fetching resources over the HTTP protocol, although it can support other protocols such as file: and ftp:.
Second, CORS is not something you can switch off in the browser; it is controlled by the HTTP server. Cross-origin requests can work, but only if the resource you are requesting responds with an HTTP header that allows your origin to access it. Since the resource you are requesting is not being served over HTTP at all, the request will most likely throw an error.
So why do images work using the file:// scheme? The <img/> tag can load resources using any scheme your browser cares to support, and it turns out most browsers do support file:// for this.
So I can't get JSON into my app without an HTTP server?! Yes and no. No, because you usually cannot use XMLHttpRequest to request a resource that is not served through an HTTP server. However, you can still get at the data through other means.
I recommend using the File API for reading files from the user's filesystem.
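For example, a minimal sketch of that File API approach; the file input element and the way the parsed data gets reused are assumptions on my part:

// assumes an <input type="file" id="jsonPicker" accept=".json"> in the page
var picker = document.getElementById("jsonPicker");
picker.addEventListener("change", function(){
    var reader = new FileReader();
    reader.onload = function(){
        var data = JSON.parse(reader.result);   // same shape as test.json above
        // ...build the gallery from `data` just as the $.getJSON callback did...
    };
    reader.readAsText(picker.files[0]);         // read the user-selected JSON file
});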
I would like to display an existing sub-site in an iframe.
The twist is that I would like the content to be served via a custom WebSocket HTTP proxy. The server side of the WebSocket connection would handle retrieving the original sub-site content over HTTP from the origin server.
I assume that all of the iframe's resource-loading calls (and Ajax calls) would need to be intercepted and satisfied by some JavaScript code, which would fetch the needed resources over a WebSocket connection.
Is this plain impossible?
If I understand your problem correctly, you are trying to fetch a web document and remove all of the <iframe> tags.
You can do this by reading the page with file_get_contents() and stripping the <iframe> tags by pattern with preg_replace():
<?php
$content = file_get_contents('http://www.w3schools.com/html/html_iframe.asp');
// strip opening, self-closing and closing <iframe> tags
echo preg_replace('/<\/?iframe[^>]*>/i', '', $content);
?>
Note: since requests for resources given without an absolute URL (for example <img src="...) will look for those resources on your server, the site will not render correctly.
How can I secure the src path of an image so that a user who clicks Inspect Element cannot see the actual src path? Please help me with a solution; it should be done with JavaScript only, and no other tags should be used.
You can convert the image into a base64 data URI and embed it directly.
Use: http://websemantics.co.uk/online_tools/image_to_data_uri_convertor/
Code sample:
.sprite {
background-image:url(data:image/png;base64,iVBORw0KGgoAAAA... etc );
}
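If you would rather generate the data URI yourself instead of using the online tool, here is a small browser-side sketch; the image path and the .sprite selector are just placeholders:

// fetch the image, then turn its bytes into a data: URI with FileReader
fetch("img/sprite.png")
    .then(function(res){ return res.blob(); })
    .then(function(blob){
        var reader = new FileReader();
        reader.onload = function(){
            // reader.result looks like "data:image/png;base64,iVBORw0..."
            document.querySelector(".sprite").style.backgroundImage =
                "url(" + reader.result + ")";
        };
        reader.readAsDataURL(blob);
    });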
This is commonly done server-side, where you have an endpoint that serves the image file to you as bytes...
You can store the images in a private location on the server that IIS/<your favourite web server> does not serve directly, so that only a web app running on it with the required privileges can read them.
Alternatively, people also store the images in the database itself and load them directly from there.
In either case, the response that is sent back has to be a stream of bytes with the correct MIME type.
Edit:
Here are a couple of links to get you started if you are into ASP.NET:
http://www.codeproject.com/Articles/34084/Generic-Image-Handler-Using-IHttpHandler
http://aspalliance.com/1322_Displaying_Images_in_ASPNET_Using_HttpHandlers.5 <- this sample actually does it from a database.
Don't let the choice of front-end framework (asp.net, php, django, etc) hinder you. Search for similar techniques in your framework of choice.
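For instance, here is a rough Node.js sketch of the same idea; the private directory, the route and the missing authorization check are all assumptions, not something taken from the linked articles:

// serve images stored outside the web root as raw bytes with the right MIME type
const http = require("http");
const fs = require("fs");
const path = require("path");

const PRIVATE_DIR = "/var/app-private/images";   // not exposed by the web server itself

http.createServer(function(req, res){
    // e.g. GET /image?name=photo1.jpg  -- add a real authorization check here
    const name = new URL(req.url, "http://localhost").searchParams.get("name") || "";
    const file = path.join(PRIVATE_DIR, path.basename(name)); // basename blocks ../ tricks
    fs.readFile(file, function(err, bytes){
        if (err) { res.writeHead(404); return res.end(); }
        res.writeHead(200, { "Content-Type": "image/jpeg" });
        res.end(bytes);                          // the browser just sees image bytes
    });
}).listen(8080);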
Edit:
Another way, if you are thinking of HTML5 canvas, is shown here: http://www.html5canvastutorials.com/tutorials/html5-canvas-images/
However, you run into the same problem: anyone who can see the page source can still see the image URL. You'll have to fall back to the approach above eventually.
This is what I want to do:
I want to send an HTTP request to a server that potentially returns a PDF file. But the server may also just return an error code (PDF file unavailable, PDF file invalid, PDF system down, etc.). When I get the PDF, I would like to open it and refresh the page that loaded it, because the PDF is then marked as "read". When I get an error code (or a timeout), I would like to redirect the page to an error screen. Downloading Google Chrome works in a similar manner:
http://www.google.com/chrome/eula.html?hl=en&platform=win
This is what I don't want do:
For performance reasons, I don't want to issue two requests as suggested in this question here:
Download and open pdf file using Ajax
Two requests can mean:
Make a request for the PDF and return a code to indicate whether the PDF is available or not. If unavailable, immediately display an error page
If it is available, open a window and request the PDF again in that window, and display it.
That's expensive because the PDFs have to be fetched from remote systems, and I don't want to access the PDF resource twice. Another solution involving two requests:
Make a request for the PDF and retrieve an error code or a temporary URL where the PDF is cached. On error, immediately display an error page
If the PDF is available, open a window in which the cached PDF is displayed.
This would require quite a large cache for the PDFs.
This might be an interesting lead:
I found this question here, which gives me some information about how I could download binary data and make it available in JavaScript:
Is there a way to read binary data in JavaScript?
Maybe that's a nice lead, but of course it won't solve my problem yet, as I want the browser's default PDF viewer to open the file, just as if I had requested it from a normal URL.
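For what it's worth, here is a hedged sketch of where that lead can go in browsers that support Blob responses and object URLs; the endpoint, the error convention and the refresh behaviour are my assumptions, not part of the original setup:

var xhr = new XMLHttpRequest();
xhr.open("GET", "/getPdf?id=123");               // hypothetical endpoint
xhr.responseType = "blob";                       // binary response, still a single request
xhr.onload = function(){
    if (xhr.status === 200 && xhr.response.type === "application/pdf") {
        var url = URL.createObjectURL(xhr.response);
        window.open(url);                        // the browser's default PDF viewer takes over
        window.location.reload();                // refresh so the PDF is shown as "read"
    } else {
        window.location = "/pdf-error.html";     // error code or wrong content type
    }
};
xhr.send();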
So the question is:
Can I download binary data and open it like a regular document from JavaScript? If not, I'll cache the document in some managed memory container in WebLogic and just hope that this won't kill our system. Please only respond:
If you know for sure it cannot be done (some links explaining why would be nice)
If you know how to do it
If you have a different solution doing roughly what I want to do (not issuing two requests)
The implemented "old-school" solution works like this:
The JavaScript client sends an AJAX request to the server to "prepare" a PDF document
The server responds with any of these three messages:
a) Document available at URL http://www.example.com/doc.pdf
b) Document unavailable
c) Document being "prepared" (i.e. client has to wait)
The JavaScript client then reacts as follows (a sketch of this polling client appears after the list):
a) Open the returned URL in a new window, refresh the current window after 5 seconds
b) The current window is redirected to an error screen
c) The current window stays unchanged and AJAX polling is implemented to repeat step 2
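A sketch of that client-side flow; the endpoint name, the JSON shape and the polling interval are assumptions:

function preparePdf(){
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/preparePdf?id=123");           // hypothetical "prepare" endpoint
    xhr.onload = function(){
        var result = JSON.parse(xhr.responseText);   // e.g. {status:"available", url:"..."}
        if (result.status === "available") {         // case a)
            window.open(result.url);                 // browser opens the returned URL
            setTimeout(function(){ window.location.reload(); }, 5000);
        } else if (result.status === "unavailable") {// case b)
            window.location = "/pdf-error.html";
        } else {                                     // case c) still being prepared
            setTimeout(preparePdf, 2000);            // poll again
        }
    };
    xhr.send();
}
preparePdf();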