I need to embed a WebView in my application, which has to pull some data via AJAX from multiple remote servers. Unfortunately, due to the AJAX sandbox (same-origin policy), connections to foreign servers are blocked. How can I disable this, given that the JS code I'm running is trusted?
There's a simple workaround to allow connections to a single server: use loadDataWithBaseURL and pass the top-level URL as the first parameter. But what should I do when the JS needs to be able to access multiple different domains?
Thanks
Are the pages loaded into the WebView local? I.e. are they loaded from the local file system (like file://yourpage.html), or are they remote pages?
Pages loaded locally are not affected by the cross-domain AJAX restrictions, so you can load whatever you like.
If they're remote pages, then I'm not sure how you're going to get around it. Perhaps set up your own web service on the same domain the pages are served from, which simply fetches the data from the remote services and spits it back. A minimal client-side sketch of that idea is below (the /proxy endpoint and the remote URL are made up for illustration):
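    // The page only ever talks to its own origin; the hypothetical /proxy
    // endpoint fetches the remote URL server-side and returns the body.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/proxy?url=' + encodeURIComponent('http://remote.example.com/data.json'), true);
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        var data = JSON.parse(xhr.responseText);
        console.log(data); // use the remote data in the page
      }
    };
    xhr.send();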
I'm working on an SDK type thing for submitting data (including file uploads) to a service that I run.
I'm doing research, trying to figure out the best way to submit data to and get a response from an external server (my server) without being blocked by the browser's cross-origin restrictions.
The current setup is as so:
The customer hosts a server, and uses my server side library.
They generate a client page that loads the required JS from my server.
The client page requests data from my server (if it was not passed from the SDK on page load), and displays the information to the user.
The user then triggers an event, which submits data (potentially including file uploads) to my server (not the local server with the SDK library).
My server responds success or fail and the client JS handles it appropriately.
Some notes:
My server is a private PHP server that I have complete control over.
Although I could route all data through the customer's server (as they are using my library), that is not ideal: it requires more setup for the customer, is slower for the end user, and handling file uploads is problematic because I want those files on my server, not theirs.
I thought perhaps the file upload inputs could be in an iframe. Will this allow uploads directly to my server?
Since the customer is using my library with an API key, I can authenticate the client's requests by passing an authentication token to the front end on page load that then gets passed to my server with whatever communication method ends up working.
I am open to changes in the architecture, but this is the ideal set up for me. I am just not sure what frontend methods are best for implementing this.
JSONP would work if you only need to make GET requests, but it sounds like you need to do POSTs as well since you mention file uploads.
For your case, Cross-Origin Resource Sharing (CORS) might work. The short explanation is that a browser will send an extra header named Origin if you make a request with XMLHttpRequest to another domain. Your server needs to respond with an additional header named Access-Control-Allow-Origin with a value of * or the value the browser sent in the Origin header. There are some nuances and gotchas when using CORS, so I recommend reading the link above for a thorough explanation.
With CORS set up, you should be able to use XMLHttpRequest to upload files.
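As a rough sketch only (the endpoint URL, field names, and token variable below are placeholders, not part of any real API), a cross-origin upload with XMLHttpRequest and FormData could look like this:

    // Assumes the server at https://api.example.com responds with
    // Access-Control-Allow-Origin: * (or the page's origin). A plain POST of
    // FormData is a "simple" CORS request; adding custom headers would also
    // require the server to answer a preflight OPTIONS request.
    var authToken = 'TOKEN_FROM_PAGE_LOAD';       // placeholder for the token mentioned above
    var input = document.querySelector('input[type="file"]');
    var form = new FormData();
    form.append('token', authToken);
    form.append('upload', input.files[0]);

    var xhr = new XMLHttpRequest();
    xhr.open('POST', 'https://api.example.com/upload', true);
    xhr.onload = function () {
      var result = JSON.parse(xhr.responseText);
      console.log(result.success ? 'upload ok' : 'upload failed');
    };
    xhr.send(form);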
I am building a simple web page that will run on a computer connected to a large TV, displaying some relevant information for whoever passes by.
The page will (somehow) fetch some text files located on an SVN server and then render them into HTML.
So I have two choices for how to do this:
Set up a cron job that periodically checks the SVN server for any changes, and if so updates the files from SVN and (somehow) updates the page. This has the problem of violating the Access-Control-Allow-Origin policy, since the files now exist locally, and what is a good way to refresh a page that runs in full-screen mode?
Make the JavaScript do the whole job: set it up to periodically request the files via AJAX directly from the SVN server, check for differences, and then render the page. This somehow does not seem as elegant.
Update
The Access-Control-Allow-Origin policy turned out not to be a problem when the page runs on a web server, since the content is then served from the same domain.
What I did in the end was a split between the two:
A cron job updates the files from SVN.
The JavaScript periodically requests the files using window.setInterval, turning on the ifModified flag on the AJAX request so the HTML is only updated if a change has occurred.
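A minimal sketch of that polling loop, assuming jQuery (ifModified is a $.ajax option); the file path, interval, and renderPage function are placeholders:

    window.setInterval(function () {
      $.ajax({
        url: '/data/news.txt',        // one of the files the cron job refreshes
        ifModified: true,             // sends If-Modified-Since on repeat requests
        success: function (text, status) {
          if (status !== 'notmodified') {
            renderPage(text);         // hypothetical function that rebuilds the HTML
          }
        }
      });
    }, 60 * 1000);                    // poll once a minute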
I have a webpage with a master script that connects, via AJAX, to a remote server and downloads untrusted JS scripts (let's call them slave scripts), to be executed later on the client. I would like to limit the Internet access the slave scripts have; e.g. so they can communicate only with the remote server.
Do you have any idea of how can I achieve this?
Thanks,
Laurențiu Dascălu
You can't.
JavaScript AJAX calls will have access to whatever the browser has access to.
Your best bet would be to attempt to create a third JavaScript component to proxy the slave script calls through. That component would be responsible for ensuring that the slave scripts aren't calling any URLs that they shouldn't be.
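A rough sketch of such a proxy component (the allowed prefix is a placeholder; the master script would hand this object to the slave scripts instead of letting them use XMLHttpRequest directly):

    var RemoteProxy = {
      allowedPrefix: 'https://remote-server.example/',
      get: function (url, callback) {
        // Refuse anything outside the whitelisted prefix.
        if (url.indexOf(this.allowedPrefix) !== 0) {
          throw new Error('Blocked URL: ' + url);
        }
        var xhr = new XMLHttpRequest();
        xhr.open('GET', url, true);
        xhr.onload = function () { callback(xhr.responseText); };
        xhr.send();
      }
    };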
The downside, of course, is that anybody can download and modify all of your scripts anyway...which means that any proxy would be easy to overcome.
Use Caja. It can convert untrusted Javascript into safe Javascript which can only access specific resources as defined by you.
Run the scripts in an iframe hosted on a different domain, and the browser's same-origin security policy should make it more secure.
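A sketch of that approach using the standard window.postMessage API (the domain names and page are placeholders):

    // Host the slave scripts on a separate domain; the same-origin policy
    // keeps them out of your page, and you talk to them with postMessage.
    var frame = document.createElement('iframe');
    frame.src = 'https://sandbox.example.com/slave-runner.html';
    document.body.appendChild(frame);

    // Ask the sandboxed frame to run a task once it has loaded.
    frame.onload = function () {
      frame.contentWindow.postMessage({ task: 'fetch-data' }, 'https://sandbox.example.com');
    };

    // Receive results, checking the sender's origin.
    window.addEventListener('message', function (event) {
      if (event.origin !== 'https://sandbox.example.com') return;
      console.log('Result from sandbox:', event.data);
    }, false);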
There is this 3rd-party web service. One of the public web methods available is a GetDocument() method. This method returns a Document object. The Document object has properties for File (byte[]), ContentType (string), etc.
My question: can I call this service using JavaScript (MooTools) + AJAX + JSON, get back the Document object (in this case an Excel document), and force the file download?
It is true that typically you cannot initiate a download from JavaScript, but there is a Flash component, Downloadify, that does enable client-side file generation.
So you can serve files for download from HTML/JavaScript.
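As a rough usage sketch based on the Downloadify examples (the option names and asset paths come from its sample code and may need adjusting for your copy of the library; the data source is hypothetical):

    Downloadify.create('download-button', {        // id of a placeholder element on the page
      filename: 'report.xls',
      data: function () {
        return excelBytesAsString;                 // hypothetical: document data obtained elsewhere
      },
      onComplete: function () { alert('File saved.'); },
      onError: function () { alert('Nothing to save.'); },
      swf: 'media/downloadify.swf',
      downloadImage: 'images/download.png',
      width: 100,
      height: 30,
      transparent: true,
      append: false
    });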
With that problem solved, you still have the problem of how to get the data that you wish to serve from the source web service.
3rd party implies cross-site requests, which are a no-no for XmlHttpRequest (Ajax) under the same-origin policy.
A possible solution to this problem could be to use a common hidden IFrame technique to get the data.
Simply have an appropriate (hidden?) form whose action correctly posts to the web service and whose target points to a hidden IFrame element, on which you trap the Load event and parse the data returned.
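A sketch of that technique, built via the DOM for brevity (the service URL is a placeholder):

    var iframe = document.createElement('iframe');
    iframe.name = 'hidden-target';
    iframe.style.display = 'none';
    iframe.onload = function () {
      // Only readable if the response ends up same-origin; for a true
      // 3rd-party service the browser blocks this access (see below).
      var doc = iframe.contentDocument || iframe.contentWindow.document;
      console.log('Service responded with:', doc.body.innerHTML);
    };
    document.body.appendChild(iframe);

    var form = document.createElement('form');
    form.method = 'POST';
    form.action = 'http://thirdparty.example.com/GetDocument';
    form.target = 'hidden-target';
    document.body.appendChild(form);
    form.submit();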
But current browsers have security measures that limit your ability to access IFrames loaded from an external source, so you are actually stuck here. Sorry to get your hopes up.
The only practical, robust way to accomplish what you would like is to have a local server-side script that acts as a proxy between your HTML/JavaScript and the external web service.
Using such a proxy, you can simply go back to using Ajax to get your data to serve up with Downloadify.
But then, since you are using a server script to get the data, why not just serve the data from the script for download?
These are just my observations on the problem domain you present.
In a system that I'm building I want to serve
Static files (static HTML pages and a lot of images), and
Dynamic XML generated by my servlet.
The dynamic XML is generated from my database (through Hibernate) and I use Restlets to serve it in response to API calls. I want to create a static file server (e.g. Apache) so that this does not interfere with the dynamic server traffic. Currently both servers need to run on the same machine.
I've never done something like this before and this is where I'm stuck:
The static HTML pages contain JavaScript that makes API calls to the dynamic server. However, since the two servers operate on different ports, I get stuck with the same origin problem. How can this be solved?
As a bonus, if you can point me to any resources that explain how to create such a static/dynamic content serving system, I'll be happy.
Thanks!
You should set up mod_proxy in Apache to forward dynamic requests to whatever backend server you are using. Your existing setup (i.e. two separate ports) is perfect; you just need to tell Apache to 'proxy dynamic requests to my backend server without letting the browser know'.
This page should get you started - http://httpd.apache.org/docs/1.3/mod/mod_proxy.html
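As a hedged sketch of the idea (assuming mod_proxy and mod_proxy_http are loaded; the /api path and the port 8182 backend are placeholders for your Restlet server):

    # In httpd.conf or the relevant VirtualHost.
    ProxyPass        /api http://localhost:8182/api
    ProxyPassReverse /api http://localhost:8182/api
    # Everything else (the static HTML and images) is still served by Apache
    # from DocumentRoot, so the browser only ever sees a single origin.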
You need to load a script tag from the Restlet server... have a look at JSONP and this SO post
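A minimal JSONP sketch (the URL, port, and callback parameter name are placeholders; the Restlet resource would have to support wrapping its JSON in the named callback):

    // Global callback that the returned script will invoke.
    function handleApiResponse(data) {
      console.log('Got data from the dynamic server:', data);
    }

    // Script tags are not subject to the same-origin restriction.
    var script = document.createElement('script');
    script.src = 'http://myhost:8182/api/items?callback=handleApiResponse';
    document.body.appendChild(script);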