Making JSON/JSONP XHR requests on the file: protocol - JavaScript

I'm writing a JavaScript app that will be hosted on the file: protocol (i.e., the application is just a folder of HTML, CSS, and JavaScript sitting somewhere on my hard drive). When I try normal XHR requests they fail, as far as I can tell because of the same-origin policy.
So my question is this: what's the best way to request JSON/JSONP files from an app as described above?
Note: So far all of my JSONP files use hard-coded callback functions, but I'd like to be able to use dynamic callback functions for these requests. Is there a way to do this?

This is kind of a hatchet job, but it will get you your dynamic callbacks. Basically it relies on the fact that file: transfers will be fast. It sets up a queue of requests and sends them out one at a time; that was the only way I could figure out to guarantee that each response is linked to the correct callback (they complete in a guaranteed order). Hopefully someone can come up with a better way, but without being able to dynamically generate the responses, this is the best I can do.
var JSONP = {
    queue: [],
    load: function(file, callback, scope) {
        var head = document.getElementsByTagName('head')[0];
        var script = document.createElement('script');
        script.type = "text/javascript";
        script.src = file;
        head.appendChild(script);
    },
    request: function(file, callback, scope) {
        this.queue.push(arguments);
        if (this.queue.length == 1) {
            this.next();
        }
    },
    response: function(json) {
        var requestArgs = this.queue.shift();
        var file = requestArgs[0];
        var callback = requestArgs[1];
        var scope = requestArgs[2] || this;
        callback.call(scope, json, file);
        this.next();
    },
    next: function() {
        if (this.queue.length) {
            var nextArgs = this.queue[0];
            this.load.apply(this, nextArgs);
        }
    }
};
This is what I did to test:
window.onload = function() {
    JSONP.request('data.js', function(json, file) { alert("1 " + json.message); });
    JSONP.request('data.js', function(json, file) { alert("2 " + json.message); });
};
data.js:
JSONP.response({
    message: 'hello'
});

Chrome has very tight restrictions on making Ajax calls from a file:// URL, for security reasons. The Chrome team knows this breaks apps that run locally, and there has been a lot of debate about alternatives, but that's how it stands today.
Ajax works fine from file: URLs in Firefox; just be aware that the return code is not an HTTP status code, i.e., 0 indicates success rather than 200-299 or 304.
IE handles these security concerns differently from both Chrome and Firefox, and I'd expect other browsers each to have their own approach. The border between web and desktop apps is very problematic territory.
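Given the Firefox behavior above, a success check on file: URLs has to accept status 0 as well as 200. A minimal sketch, with a hypothetical helper name of my own:

```javascript
// On http(s) a successful load reports 200; on file:// Firefox reports 0.
// `isXhrSuccess` is a hypothetical helper name, not a standard API.
function isXhrSuccess(xhr) {
  return xhr.status === 200 || xhr.status === 0;
}

// Browser-only usage (guarded so the sketch also runs outside a browser):
if (typeof XMLHttpRequest !== "undefined") {
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function() {
    if (xhr.readyState === 4 && isXhrSuccess(xhr)) {
      console.log(xhr.responseText);
    }
  };
  xhr.open("GET", "data.json", true);
  xhr.send();
}
```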

Related

Serving a different JS file from a server depending on the number of times a URL is visited by a client?

I have a web app that injects a server-hosted myjavascriptfile.js file from my server, using a jQuery AJAX GET request. Currently, this GET request is made every time the client visits https://www.google.co.uk.
However, I'd like to send a mysecondjavascriptfile.js file to the client instead, if the client has visited https://www.google.co.uk more than 10 times.
Is there any way I can do this?
The first thing to do is to persist the hits the client makes to the site. sessionStorage can help here:
sessionStorage.counter = (Number(sessionStorage.counter) || 0) + 1;
var sources = {
    lessThanTen : 'http://yourscript.com/lessthan10hits.js',
    moreThanTen : 'http://yourscript.com/morethan10hits.js'
};
var script = document.createElement('script');
if (Number(sessionStorage.counter) >= 10) {
    script.src = sources.moreThanTen;
} else {
    script.src = sources.lessThanTen;
}
document.getElementsByTagName('head')[0].appendChild(script);
This is of course a client-side count of the hits. You could implement server-side verification through AJAX, or just serve slightly different HTML markup after 10 requests. You'll need to use sessions (or just plain cookies) to persist the count on the server side.
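One caveat about the snippet above: sessionStorage is scoped to a single tab session, so the count resets when the tab closes. localStorage persists across visits, which matches "number of times a URL is visited" more closely. A sketch with the increment pulled into a helper (the helper name is mine):

```javascript
// Takes the previously stored value (a string, or null/undefined on the
// first visit) and returns the new hit count as a number.
function nextHitCount(stored) {
  return (parseInt(stored, 10) || 0) + 1;
}

// Browser-only usage (guarded so the sketch also runs outside a browser):
if (typeof localStorage !== "undefined") {
  // Unlike sessionStorage, localStorage survives tab and browser restarts.
  localStorage.counter = nextHitCount(localStorage.counter);
}
```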
AJAX verification:
var xhr = new XMLHttpRequest();
xhr.addEventListener('load', function() {
    var script = document.createElement('script');
    script.src = xhr.response;
    document.getElementsByTagName('head')[0].appendChild(script);
});
xhr.open('POST', 'http://www.urltocheckhits.com/hits');
xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
xhr.send('url=' + encodeURIComponent(window.location.hostname));
And then from Node.js (with body-parser and express-session; the urlEncoded middleware comes from body-parser):
var bodyParser = require('body-parser');
var urlEncoded = bodyParser.urlencoded({ extended: false });
var sources = {
    lessThanTen : 'http://yourscript.com/lessthan10hits.js',
    moreThanTen : 'http://yourscript.com/morethan10hits.js'
};
app.post('/hits', urlEncoded, function(req, res) {
    if (req.body) {
        var url = req.body.url;
        if (!req.session.views) {
            req.session.views = { };
        }
        if (req.session.views[url]) {
            req.session.views[url]++;
        } else {
            req.session.views[url] = 1;
        }
        if (req.session.views[url] > 10) {
            res.send(sources.moreThanTen);
        } else {
            res.send(sources.lessThanTen);
        }
    }
});
I suggest you check the documentation of express-session and body-parser.
Note that you'll need to add CORS Headers for this (you could just as easily do it with JSONP too instead of using XHR).
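As a sketch of those CORS headers, an Express middleware might look like this; the allowed origin is an assumption (it should be the site embedding the script), and in practice the cors npm package covers more edge cases:

```javascript
// Hypothetical middleware for the /hits route above. The origin is an
// assumption; Allow-Credentials is needed because express-session uses
// a cookie, and credentialed requests cannot use a wildcard origin.
function allowCors(req, res, next) {
  res.setHeader("Access-Control-Allow-Origin", "https://www.google.co.uk");
  res.setHeader("Access-Control-Allow-Credentials", "true");
  next();
}

// Registration (assumes an existing Express `app`):
// app.use(allowCors);
```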
It might be easier to just serve the JS file directly, instead of doing the AJAX call and then injecting the returned script URL. Then you could simply write:
<script src="http://onesingleurl.com/hits"></script>
Caching will behave strangely with this approach, though, which is why I favor the other one.

Using Blob WebWorker to send Synchronous XMLHttpRequest

First off, I am very new to web services, web workers, and XMLHttpRequests, so please bear with me. Also, there are a lot of stipulations in my project, so "just do it this way" solutions may not be viable.
I have a web service set up to receive calls from an XMLHttpRequest in JavaScript, and it does this synchronously. This works fine, but it ties up the UI thread, and I would like to show a loading spinner while making requests to the server. Due to another constraint, the program can't access external scripts on the web, so I am using a Blob to work around the "file://" prefix.
I am also using an inline web worker to accomplish this. Now I'm getting to my actual issue: spawning the web worker is fine, and I can create and send the XMLHttpRequest, but as soon as I call "send", everything exits. No line of code after it is run.
Here's some code:
Called from JS:
var blob = new Blob([document.querySelector('#getWorker').textContent]);
var blobUrl = window.URL.createObjectURL(blob);
var worker = new Worker(blobUrl);
worker.onmessage = function (e) {
    alert(e.data);
};
worker.postMessage('start'); // postMessage requires a message argument
The worker:
var bigString = "";
var invocation = new XMLHttpRequest();
var url = 'http://<ipAddress>/<serviceName>/Service.asmx/<method>';
if (invocation) {
    invocation.open('GET', url, false);
    invocation.send(); //*****EXITS AFTER THIS LINE*****//
    if (invocation.status == 200) {
        var responseText = invocation.responseText.replace(/(\r\n|\n|\r)/gm, "");
        responseText = responseText.replace('<?xml version="1.0" encoding="utf-8"?>', '');
        responseText = responseText.replace('<string xmlns="http://tempuri.org/">', '');
        responseText = responseText.replace('</string>', '');
        postMessage("Success");
        //updateTable(responseText);
    } else {
        postMessage("Fail");
        //alert("Could not connect to database. Check your internet connection.");
    }
    var c = 0;
    var b = 1;
}
var q = 1;
The debugger will just end after the "invocation.send()" line. No error, no status, no nothing. And that's where I'm lost.
Any insight would be greatly appreciated. Also, this exact code works when it is not in a WebWorker, so there's likely something about them that I do not understand.
Thanks in advance!
This is a Chrome problem: Chrome fails silently if a cross-origin request is blocked. The Firefox console shows more information about the error.
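One way to make the failure visible is to let the request be asynchronous inside the worker (the worker thread is off the UI anyway) and attach an onerror handler, which fires when a blocked cross-origin request would otherwise fail silently. A sketch under those assumptions, with the URL placeholder copied from the question:

```javascript
// Maps an HTTP status to the question's Success/Fail messages.
function classifyXhrResult(status) {
  return status === 200 ? "Success" : "Fail";
}

// Worker-only usage (guarded so the sketch also runs outside a browser):
if (typeof XMLHttpRequest !== "undefined") {
  var url = 'http://<ipAddress>/<serviceName>/Service.asmx/<method>';
  var invocation = new XMLHttpRequest();
  invocation.open('GET', url, true); // async; the spinner keeps running
  invocation.onload = function() {
    postMessage(classifyXhrResult(invocation.status));
  };
  invocation.onerror = function() {
    // Fires on network/CORS failures that Chrome otherwise swallows.
    postMessage("Fail: network or cross-origin error");
  };
  invocation.send();
}
```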

Reading a server file with JavaScript

I have an HTML page using JavaScript that gives the user the option to read and use his own text files from his PC. But I also want an example file on the server that the user can open via a click on a button.
I have no idea what the best way to open a server file is. I googled a bit (I'm new to HTML and JavaScript, so my understanding of the following may be incorrect!) and found that JavaScript is client-based and it is not very straightforward to open a server file. It looks like it is easiest to use an iframe(?).
So I'm trying the following (the first test is simply to open it on load of the webpage), with kgr.bss in the same directory on the server as my HTML page:
<IFRAME SRC="kgr.bss" ID="myframe" onLoad="readFile();"> </IFRAME>
and (with file_inhoud and lines defined elsewhere):
function readFile() {
    func = "readFile=";
    debug2("0");
    var x = document.getElementById("myframe");
    debug2("1");
    var doc = x.contentDocument ? x.contentDocument : (x.contentWindow.document || x.document);
    debug2("1a" + doc);
    var file_inhoud = doc.document.body;
    debug2("2:");
    lines = file_inhoud.split("\n");
    debug2("3");
    fileloaded();
    debug2("4");
}
The debug function shows:
readFile=0//readFile=1//readFile=1a[object HTMLDocument]//
So the statement that stops the program is:
var file_inhoud = doc.document.body;
What is wrong? What is the correct (or best) way to read this file?
Note: I see that the file is read and displayed in the frame.
Thanks!
Your best bet, since the file is on your server, is to retrieve it via "ajax". This stands for Asynchronous JavaScript And XML, but the XML part is completely optional; it can be used with all sorts of content types, including plain text. (For that matter, the asynchronous part is optional as well, but it's best to stick with it.)
Here's a basic example of requesting text file data using ajax:
function getFileFromServer(url, doneCallback) {
    var xhr;

    xhr = new XMLHttpRequest();
    xhr.onreadystatechange = handleStateChange;
    xhr.open("GET", url, true);
    xhr.send();

    function handleStateChange() {
        if (xhr.readyState === 4) {
            doneCallback(xhr.status == 200 ? xhr.responseText : null);
        }
    }
}
You'd call that like this:
getFileFromServer("path/to/file", function(text) {
    if (text === null) {
        // An error occurred
    }
    else {
        // `text` is the file text
    }
});
However, the above is somewhat simplified. It would work with modern browsers, but not some older ones, where you have to work around some issues.
Update: You said in a comment below that you're using jQuery. If so, you can use its ajax function and get the benefit of jQuery's workarounds for some browser inconsistencies:
$.ajax({
    type: "GET",
    url: "path/to/file",
    success: function(text) {
        // `text` is the file text
    },
    error: function() {
        // An error occurred
    }
});
Side note:
I found that javascript is client based...
No. This is a myth. JavaScript is just a programming language; it can be used in browsers, on servers, on your workstation, and so on. In fact, JavaScript was running on servers (Netscape's LiveWire) almost from the very beginning.
These days, the most common use (and your use case) is indeed in web browsers, client-side, but JavaScript is not limited to the client in the general case, and in fact it's having a major resurgence on the server and elsewhere.
The usual way to retrieve a text file (or any other server-side resource) is to use AJAX. Here is an example of how you could alert the contents of a text file:
var xhr;
if (window.XMLHttpRequest) {
    xhr = new XMLHttpRequest();
} else if (window.ActiveXObject) {
    xhr = new ActiveXObject("Microsoft.XMLHTTP");
}
xhr.onreadystatechange = function() {
    if (xhr.readyState === 4) { // only alert once the response is complete
        alert(xhr.responseText);
    }
};
xhr.open("GET", "kgr.bss"); // assuming kgr.bss is plain text
xhr.send();
The problem with your ultimate goal, however, is that it has traditionally not been possible to use JavaScript to access the client's file system. The new HTML5 File API is changing this; you can read up on it here.
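Since that File API was mentioned, here is a sketch of reading a user-selected text file client-side; the input element's id and the helper name are my own:

```javascript
// Splits file text into lines, as the question's readFile() tried to do.
function splitIntoLines(text) {
  return text.split("\n");
}

// Browser-only usage (guarded so the sketch also runs outside a browser).
// Assumes an <input type="file" id="filePicker"> element on the page.
if (typeof document !== "undefined") {
  document.getElementById("filePicker").addEventListener("change", function(e) {
    var reader = new FileReader();
    reader.onload = function() {
      var lines = splitIntoLines(reader.result);
      console.log(lines.length + " lines read");
    };
    reader.readAsText(e.target.files[0]);
  });
}
```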

Using JavaScript to perform a GET request without AJAX

Out of curiosity, I'm wondering about the best (easiest, fastest, shortest, etc.; take your pick) way to perform a GET request in JavaScript without using AJAX or any external libraries.
It must work cross-browser, and it's not allowed to distort the hosting web page visually or affect its functionality in any way.
I don't care about the headers in the request, just the URL part. I also don't care about the result of the request; I just want the server to do something as a side effect when it receives the request, so firing it is all that matters. If your solution requires the server to return something in particular, that's OK as well.
I'll post my own suggestion as a possible answer, but I'd love it if someone could find a better way!
Have you tried using an Image object? Something like:
var req = new Image();
req.onload = function() {
    // Probably not required if you're only interested in
    // making the request and don't need a callback function
};
req.src = 'http://example.com/foo/bar';
function GET(url) {
    var head = document.getElementsByTagName('head')[0];
    var n = document.createElement('script');
    n.src = url;
    n.type = 'text/javascript';
    n.onload = function() { // not strictly mandatory, but removes the tag when finished
        head.removeChild(n);
    };
    head.appendChild(n);
}
I would go with Pekka's idea and use a hidden iframe; the advantage is that no further parsing will be done: for an image, the browser will try to parse the result as an image, and for a dynamically created script tag, the browser will try to parse the result as JavaScript code. An iframe is "hit and run": the browser doesn't care what's in there.
Changing your own solution a bit:
function GET(url) {
    var oFrame = document.getElementById("MyAjaxFrame");
    if (!oFrame) {
        oFrame = document.createElement("iframe");
        oFrame.style.display = "none";
        oFrame.id = "MyAjaxFrame";
        document.body.appendChild(oFrame);
    }
    oFrame.src = url;
}

Dynamically Preloading/Displaying Webcam Snapshots on a Web Page Using AJAX

I have an IP camera that streams live video to a web site of mine. Problem is, it is powered by an ActiveX control. Even worse, this control is unsigned. To provide a more secure alternative for the people who are using browsers other than IE, or who are (rightfully) unwilling to change their security settings, I am tapping into the camera's built-in snapshot script, which serves up a 640x480 live JPEG image. The plan was to update the image on screen every ~500ms using JavaScript, without having to reload the entire page.
I tried using the Image() object to pre-load the image and update the SRC attribute of the image element when onload fired:
function updateCam() {
    var url = "../snapshot.cgi?t=" + new Date().getTime();
    img = new Image();
    img.onload = function() {
        $("#livePhoto").attr("src", url);
        camTimer = setTimeout(updateCam, 500);
    };
    img.src = url;
}
This worked decently, but it was difficult to determine when the camera had been disabled, which I needed to detect in order to degrade gracefully. The internal snapshot script is set up to return an HTTP status code of 204 (No Content) in this circumstance, and of course there is no way to detect that using the Image object. Additionally, the onload event was not 100% reliable.
Therefore, I am using the jQuery (version 1.2.6) ajax function to do a GET request on the URL, and in the complete callback I evaluate the status code and set the URL accordingly:
function updateCam() {
    var url = "../snapshot.cgi?t=" + new Date().getTime();
    $.ajax({
        type: "GET",
        url: url,
        timeout: 2000,
        complete: function(xhr) {
            try {
                var src = (xhr.status == 200) ? url : '../i/cam-oos.jpg';
                $("#livePhoto").attr("src", src);
            }
            catch(e) {
                JoshError.log(e);
            }
            camTimer = setTimeout(updateCam, 500);
        }
    });
}
And this works beautifully, but only in IE! This is the question I would like to have answered: why doesn't this work in Firefox or Chrome? The complete callback does not even fire in Firefox. It does fire in Chrome, but only very rarely does setting the src actually load the image that was requested (usually it displays nothing).
Posting a second answer, because the first was just incorrect. I can't test this solution (because I don't have access to your webcam script), but I would suggest trying to sanitize the response from the camera: since you obviously can't handle the raw image data, try adding the dataFilter setting like so:
function updateCam() {
    var url = "../snapshot.cgi?t=" + new Date().getTime();
    $.ajax({
        type: "GET",
        url: url,
        timeout: 2000,
        dataFilter: function(data, type) {
            return '<div></div>'; // so it returns something...
        },
        complete: function(xhr) {
            try {
                var src = (xhr.status == 200) ? url : '../i/cam-oos.jpg';
                $("#livePhoto").attr("src", src);
            }
            catch(e) {
                JoshError.log(e);
            }
            camTimer = setTimeout(updateCam, 500);
        }
    });
}
Like I said, I haven't been able to test this, but it might allow jQuery to use the status codes without breaking like crazy.
img.onerror = function() {
    alert('offline');
};
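Combining onload with that onerror handler gives a polling loop that degrades to a static image without any ajax round trip; note it still cannot distinguish the camera's 204 response from other failures. A sketch, with a helper name of my own:

```javascript
// Appends a cache-busting timestamp, as the question's code does.
function nextSnapshotUrl(base) {
  return base + "?t=" + new Date().getTime();
}

// Browser-only usage (guarded so the sketch also runs outside a browser):
if (typeof Image !== "undefined") {
  var updateCam = function() {
    var url = nextSnapshotUrl("../snapshot.cgi");
    var img = new Image();
    img.onload = function() {
      document.getElementById("livePhoto").src = url;
      setTimeout(updateCam, 500);
    };
    img.onerror = function() {
      // Network-level failure: show the "camera offline" image.
      document.getElementById("livePhoto").src = "../i/cam-oos.jpg";
      setTimeout(updateCam, 500);
    };
    img.src = url;
  };
  updateCam();
}
```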
Well, I ended up using the data URI scheme (hat tip to Eric Pascarello) for non-IE browsers. I wrote an HTTP handler (in VB.NET) to proxy the IP camera and base64-encode the image:
Imports Common
Imports System.IO
Imports System.Net
Imports System.Text

Public Class LiveCam
    Implements IHttpHandler

    Private ReadOnly URL As String = "http://12.34.56.78/snapshot.cgi"
    Private ReadOnly FAIL As String = Common.MapPath("~/i/cam-oos.jpg")

    Public Sub ProcessRequest(ByVal context As System.Web.HttpContext) Implements System.Web.IHttpHandler.ProcessRequest
        Dim Data As Byte()
        With context.Response
            .ContentEncoding = Encoding.UTF8
            .ContentType = "text/plain"
            .Write("data:image/jpeg;base64,")
            Try
                Using Client As New WebClient()
                    Data = Client.DownloadData(URL)
                End Using
            Catch ex As WebException
                Data = File.ReadAllBytes(FAIL)
            End Try
            .Write(Convert.ToBase64String(Data))
        End With
    End Sub

    Public ReadOnly Property IsReusable As Boolean Implements IHttpHandler.IsReusable
        Get
            Return True
        End Get
    End Property
End Class
Then I just added a little non-IE detection (using the classic document.all check) to call the correct URL and set the correct src:
function updateCam() {
    var url = (document.all) ? "../snapshot.cgi?t=" : "../cam.axd?t=";
    url += new Date().getTime();
    $.ajax({
        type: "GET",
        url: url,
        timeout: 2000,
        complete: function(xhr) {
            try {
                var src;
                if (document.all)
                    src = (xhr.status == 200) ? url : '../i/cam-oos.jpg';
                else
                    src = xhr.responseText;
                $("#livePhoto").attr("src", src);
            }
            catch(e) {
                JoshError.log(e);
            }
            camTimer = setTimeout(updateCam, 500);
        }
    });
}
It's very unfortunate that I had to resort to this workaround. I hate browser-detection code, and I hate the additional load it puts on my server. The proxy not only forces me to waste more bandwidth, it also operates less efficiently because of the inherent drawbacks of proxying and the time required to base64-encode the image. Additionally, it is not set up to degrade as gracefully as the IE path. Although I could rewrite the proxy to use HttpWebRequest and return proper status codes, etc., I just wanted the easiest way out possible, because I am sick of dealing with this!
Thanks to all!
I believe jQuery will try to interpret the response from the server. Some browsers are more tolerant of this interpretation than others, so the more restrictive browsers will fail because an image cannot be parsed as HTML!
The solution would be to use a HEAD request type instead of a GET (type: "HEAD"). That way you will still get status responses back, without grabbing the image itself, so the response will be empty (and shouldn't mess you up). It should also be much faster.
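That HEAD-based check might look like the following, using the same jQuery-1.2-era callbacks as the question's code; the extracted helper is my own:

```javascript
// Chooses the image source from the HEAD request's status code.
function srcForStatus(status, url) {
  return status === 200 ? url : '../i/cam-oos.jpg';
}

// Browser-only usage (guarded; assumes jQuery is loaded as $):
if (typeof $ !== "undefined") {
  var camTimer;
  var updateCam = function() {
    var url = "../snapshot.cgi?t=" + new Date().getTime();
    $.ajax({
      type: "HEAD", // status only; the image body is never transferred
      url: url,
      timeout: 2000,
      complete: function(xhr) {
        $("#livePhoto").attr("src", srcForStatus(xhr.status, url));
        camTimer = setTimeout(updateCam, 500);
      }
    });
  };
  updateCam();
}
```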
