Synchronous cross-domain JavaScript call, waiting for response - is it possible?

Disclaimer
Firstly, a disclaimer: I am working within specific boundaries, so whilst it may seem I'm going about something the long way round, I am limited as to what I can do. I know I should be doing this entirely differently, but I cannot. If it's not possible to do what I'm trying to do here, then that's fine, I just need to know.
Background
Basically, this boils down to a cross-domain JavaScript call. However, I need to wait for the response before the method returns.
Say I have a page - example1.com/host.html. This contains a JavaScript method 'ProvideValue()' which returns an int. Edit: This method must be executed where it is found, since it may need to access other resources within that domain, and access global variables set for the current session.
https://example1.com/host.html
function ProvideValue(){
    return 8; // In reality, this will be a process that returns a value
}
This host.html page contains an iframe pointing to example2.com/content.html (note the different domain). This content.html page contains a method that needs to display the value from host.html in an alert.
https://example2.com/content.html
function DisplayValue(){
    var hostValue = // [get value from ProvideValue() in host.html]
    alert(hostValue);
}
That's it.
Limitations
I can run any javascript I like on the host.html, but nothing server-side. On content.html I can run javascript and anything server-side. I have no control over the example1.com domain, but full control over example2.com.
Question
How can I retrieve the value from ProvideValue() on example1.com/host.html within the DisplayValue() method on example2.com/content.html?
Previous Attempts
Now, I've tried many of the cross-domain techniques, but all of them (that I've found) use an asynchronous callback. That won't work in this case, because I need to make the request to the host.html, and receive the value back, all within the scope of a single method on the content.html.
The only solution I got working relied on asynchronous cross-domain scripting (using easyXDM) and a server-side list of requests/responses on example2.com. The DisplayValue() method made the request to host.html, then immediately made a synchronous post to the server. The server would then wait until it was notified of the response from the cross-domain callback. Whilst waiting, the callback would make another call to the server to store the response. It worked fine in Firefox and IE, but Chrome wouldn't execute the callback until DisplayValue() completed. If there is no way to address my initial question, and this option has promise, then I will pose it as a new question, but I don't want to clutter this question with multiple topics.

Use XMLHttpRequest with CORS to make synchronous cross-domain requests.
If the server doesn't support CORS, use a proxy which adds the appropriate CORS headers, e.g. https://cors-anywhere.herokuapp.com/ (source code at https://github.com/Rob--W/cors-anywhere).
Example 1: Using synchronous XHR with CORS
function getProvidedValue() {
    var url = 'http://example.com/';
    var xhr = new XMLHttpRequest();
    // third param = false = synchronous request
    xhr.open('GET', 'https://cors-anywhere.herokuapp.com/' + url, false);
    xhr.send();
    var result = xhr.responseText;
    // do something with the response (text manipulation, whatever)
    return result;
}
Example 2: Use postMessage
If it's important to calculate the values on the fly with session data, use postMessage to continuously update the state:
Top-level document (host.html):
<script src="host.js"></script>
<iframe name="content" src="https://other.example.com/content.html"></iframe>
host.js
(function() {
    var cache = {
        providedValue: null,
        otherValue: ''
    };
    function sendUpdate() {
        if (frames.content) { // "content" is the name of the iframe
            frames.content.postMessage(cache, 'https://other.example.com');
        }
    }
    function recalc() {
        // Update values
        cache.providedValue = provideValue();
        cache.otherValue = getOtherValue();
        // Send (updated) values to the frame
        sendUpdate();
    }
    // Listen for changes using events, pollers, WHATEVER
    yourAPI.on('change', recalc);

    window.addEventListener('message', function(event) {
        if (event.origin !== 'https://other.example.com') return;
        if (event.data === 'requestUpdate') sendUpdate();
    });
})();
A script in content.html: content.js
var data = {}; // Global
var parentOrigin = 'https://host.example.com';
window.addEventListener('message', function(event) {
    if (event.origin !== parentOrigin) return;
    data = event.data;
});
parent.postMessage('requestUpdate', parentOrigin);

// To get the value:
function displayValue() {
    var hostValue = data.providedValue;
    alert(hostValue); // display it, as in the question's DisplayValue()
}
This snippet is merely a demonstration of the concept. If you want to apply the method, you probably want to split the logic in the recalc function, so that a value is only recalculated when that particular value changes (instead of recalculating everything on every update). A rough sketch of that split follows.
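For instance, a hedged sketch of what that split could look like, assuming it lives inside the same IIFE as host.js above (so cache, sendUpdate, and the placeholder provideValue(), getOtherValue(), and yourAPI names are in scope); the per-value event names are purely illustrative:

// Hypothetical split of recalc() into per-value updaters, so each update
// only recomputes and re-sends the value that actually changed.
function updateProvidedValue() {
    cache.providedValue = provideValue();
    sendUpdate();
}
function updateOtherValue() {
    cache.otherValue = getOtherValue();
    sendUpdate();
}
// Subscribe each updater to the event that affects it (event names are illustrative).
yourAPI.on('providedValueChange', updateProvidedValue);
yourAPI.on('otherValueChange', updateOtherValue);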

Related

How to call a fetch request and wait for its answer inside an onBeforeRequest listener in a web extension

I'm trying to write a web extension that intercepts requests to URLs on a locally provided list, fetches the URL's response, analyzes it in a certain way and, based on the analysis results, blocks or doesn't block the request.
Is that even possible?
The browser doesn't matter.
If it's possible, could you provide some examples?
I tried doing it with Chrome extensions, but it seems like it's not possible.
I heard it's possible on Mozilla (Firefox), though.
I think that this is only possible using the old webRequestBlocking API, which Chrome is removing as part of Manifest v3. Fortunately, Firefox is planning to continue supporting blocking web requests even as they transition to Manifest v3 (read more here).
In terms of implementation, I would highly recommend referring to the MDN documentation for webRequest, in particular their section on modifying responses and their documentation for the filterResponseData method.
Mozilla have also provided a great example project that demonstrates how to achieve something very close to what I think you want to do.
Below I've modified their background.js code slightly so it is a little closer to what you want to do:
function listener(details) {
    if (mySpecialUrls.indexOf(details.url) === -1) {
        // Ignore this url, it's not on our list.
        return {};
    }
    let filter = browser.webRequest.filterResponseData(details.requestId);
    let decoder = new TextDecoder("utf-8");
    let encoder = new TextEncoder();
    filter.ondata = event => {
        let str = decoder.decode(event.data, {stream: true});
        // Just change any instance of Example in the HTTP response
        // to WebExtension Example.
        str = str.replace(/Example/g, 'WebExtension Example');
        filter.write(encoder.encode(str));
        filter.disconnect();
    };
    // This is a BlockingResponse object; you can set parameters here to e.g. cancel the request if you want to.
    // See: https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/API/webRequest/BlockingResponse#type
    return {};
}

browser.webRequest.onBeforeRequest.addListener(
    listener,
    // 'main_frame' means this will only affect requests for the main frame of the browser
    // (e.g. the HTML for a page rather than the images, CSS, etc. that are loaded afterwards).
    // You might want to look into whether you want to expand this.
    {urls: ["*://*/*"], types: ["main_frame"]},
    ["blocking"]
);
Correction:
The above example only works properly if the response data fits in one chunk. If it is larger (and you still want to inspect the entirety of the response data), you would need to put all of the data into a buffer, and then work on it once all data has been received. See the document here for more information: https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/API/webRequest/StreamFilter/ondata#webextension_examples (the code section titled "This example combines all buffers into a single buffer" would be of most interest to you I think).
In terms of using this API to block responses, data is only returned from this URL if you call filter.write(), so if you don't like the response, you can simply not call it (just call filter.close()) and an empty response will be returned. You can also only return part of the full response body by filter.write()ing only the bits that you want to return.
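As a rough sketch of that buffering-then-deciding approach (not the exact MDN code), assuming a text response such as HTML; shouldBlock() is a hypothetical analysis function you would write yourself, everything else uses the same StreamFilter API as above:

function bufferingListener(details) {
    let filter = browser.webRequest.filterResponseData(details.requestId);
    let decoder = new TextDecoder("utf-8");
    let encoder = new TextEncoder();
    let fullBody = "";

    filter.ondata = event => {
        // Accumulate every chunk; nothing is forwarded to the page yet.
        fullBody += decoder.decode(event.data, {stream: true});
    };

    filter.onstop = () => {
        fullBody += decoder.decode(); // flush any remaining bytes
        if (shouldBlock(fullBody)) {  // hypothetical analysis of the full response
            filter.close();           // write nothing: the page gets an empty body
        } else {
            filter.write(encoder.encode(fullBody));
            filter.disconnect();
        }
    };

    return {};
}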

Cross-domain XMLHttpRequest to verify website existence?

I'm trying to write a JavaScript function that gets a foreign url, and attempts to verify its existence within 'tmOut' msecs. If verified within this timeframe, it should call a 'callback' function with this url as an argument.
Here is the function:
function chkUrl(url, tmOut, callback) {
    var abortChk = false;
    var abortTmr = setTimeout(function() { abortChk = true; }, tmOut);
    var x = new XMLHttpRequest();
    x.onreadystatechange = function() {
        if (x.readyState == 4) {
            if (x.status < 400 && !abortChk) {
                clearTimeout(abortTmr);
                callback(url);
            }
        }
    };
    x.open('GET', url, true);
    x.send(null);
}
The problem is that (probably because of the cross-domain calls) I get x.status = 0 regardless of whether the url exists.
Is there a way to overcome/workaround the problem (without the users having to modify any default browser settings)? Alternatively, is there a way to achieve the same functionality otherwise?
Is this function "reentrant"? (can I call it safely several times for different urls at once?)
Is there a way to overcome/workaround the problem
Client side? Only if the sites you are making the request to use CORS to grant you permission (which seems unlikely given the context).
Perform your test from your server instead of directly from the browser.
Is this function "reentrant"? (can I call it safely several times for different urls at once?)
Yes. You aren't creating any globals.
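For the first point (testing from your server instead of the browser), a minimal sketch of the client side; the same-origin /check-url endpoint is hypothetical and would have to be implemented on your server, returning e.g. "ok" when it could reach the URL with a status below 400:

function chkUrl(url, tmOut, callback) {
    var x = new XMLHttpRequest();
    x.timeout = tmOut; // abort automatically after tmOut ms
    x.onload = function() {
        // The hypothetical same-origin endpoint does the actual cross-domain check server-side.
        if (x.responseText === 'ok') callback(url);
    };
    x.open('GET', '/check-url?url=' + encodeURIComponent(url), true);
    x.send(null);
}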

simple javascript/ajax call - does not return status

The following code produces nothing on the html page, it seems to break down on 'status':
var get_json_file = new XMLHttpRequest();
get_json_file.open("GET", "/Users/files/Documents/time.json", true);
document.write(get_json_file.status);
Keep in mind that I am on a Mac, so there is no C: drive. However, this line of code does work fine:
document.write(get_json_file.readyState);
I just want to know that I was able to successfully find my json file. Perhaps I should ask: what should I be looking for to achieve what I want?
Another basic question about AJAX. I suggest you read the MDN article about using XMLHttpRequest. You can't access the 'status' property until the response is ready, and you haven't even called the 'send()' method, which performs the actual request. You can't have a status without making an HTTP request first. Learn how AJAX works before trying to use it; explaining it all would be too long, and this is not the place.
You can only get the status once the ajax request has finished, that is, when the page was loaded or a 404 was returned.
Because you're trying to read status straight after the request was sent (or not sent, see the P.S.), you're getting nothing.
You need to make an async call, to check that status only when the request finishes:
get_json_file.onreadystatechange = function() {
    if (get_json_file.readyState == 4 && get_json_file.status == 200) {
        alert('success');
    }
};
read more at http://www.w3schools.com/ajax/ajax_xmlhttprequest_onreadystatechange.asp
P.S. As noted by #Oscar, you're missing the send().
If you want to try a synchronous approach, which would stop the code from running until a response is returned, you can try:
var get_json_file = new XMLHttpRequest();
get_json_file.open("GET", "/Users/files/Documents/time.json", false);
// notice we set async to false (developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest)
get_json_file.send(); // will wait for a response
document.write(get_json_file.status);
// (Credit: original asker)
Because of security restrictions you are not allowed to send requests to files on the local system, but what you could do is look into the FileReader API (more info here); a small sketch follows the sidenote below.
<sidenote>
The reason that readyState works and status does not is that by default readyState has a value of 0, while status has no value, so it would be undefined.
</sidenote>
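As a rough illustration of the FileReader suggestion, assuming the user picks the file via an <input type="file"> element with id "file-picker" (that element and id are hypothetical):

// Read a user-selected local file without any HTTP request.
document.getElementById('file-picker').addEventListener('change', function () {
    var file = this.files[0];
    if (!file) return;
    var reader = new FileReader();
    reader.onload = function () {
        var json = JSON.parse(reader.result); // reader.result holds the file's text
        console.log(json);
    };
    reader.readAsText(file);
});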
On my Ubuntu machine, the path is prefixed with file:///.
I think your json file path should be file:///Users/files/Documents/time.json, because Mac and Ubuntu are both Unix-based.
Then you can check the ajax status using #TastySpaceApple's answer.
If you are using Google Chrome, don't forget to launch it with the --allow-file-access-from-files flag, because Google Chrome does not load local files by default, for security reasons.

Chrome extension regarding injected script + localstorage

I am puzzling my way through my first 'putting it all together' Chrome extension. I'll describe what I am trying to do and then how I have been going about it, with some script excerpts:
I have an options.html page and an options.js script that lets the user set a url in a textfield -- this gets stored using localStorage.
function load_options() {
    var repl_adurl = localStorage["repl_adurl"];
    default_img.src = repl_adurl;
    tf_default_ad.value = repl_adurl;
}

function save_options() {
    var tf_ad = document.getElementById("tf_default_ad");
    localStorage["repl_adurl"] = tf_ad.value;
}

document.addEventListener('DOMContentLoaded', function () {
    document.querySelector('button').addEventListener('click', save_options);
});
document.addEventListener('DOMContentLoaded', load_options);
My contentscript injects a script 'myscript' into the page (so it can have access to the img elements from the page's html):
var s = document.createElement('script');
s.src = chrome.extension.getURL("myscript.js");
console.log( s.src );
(document.head||document.documentElement).appendChild(s);
s.parentNode.removeChild(s);
myscript.js is supposed to somehow grab the local storage data and that determines how the image elements are manipulated.
I don't have any trouble grabbing the images from the html source, but I cannot seem to access the localStorage data. I realize it must have to do with the two scripts having different environments but I am unsure of how to overcome this issue -- as far as I know I need to have myscript.js injected from contentscript.js because contentscript.js doesn't have access to the html source.
Hopefully somebody here can suggest something I am missing.
Thank you, I appreciate any help you can offer!
-Andy
First of all: You do not need an injected script to access the page's DOM (<img> elements). The DOM is already available to the content script.
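In other words, something like this already works from the content script itself (a trivial sketch):

// contentscript.js - the page's <img> elements are directly visible here.
var imgs = document.getElementsByTagName('img');
for (var i = 0; i < imgs.length; i++) {
    console.log(imgs[i].src); // or manipulate the element as needed
}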
Content scripts cannot directly access the localStorage of the extension's process, you need to implement a communication channel between the background page and the content script in order to achieve this. Fortunately, Chrome offers a simple message passing API for this purpose.
I suggest to use the chrome.storage API instead of localStorage. The advantage of chrome.storage is that it's available to content scripts, which allows you to read/set values without a background page. Currently, your code looks quite manageable, so switching from the synchronous localStorage to the asynchronous chrome.storage API is doable.
Regardless of your choice, the content script's code has to read/write the preferences asynchronously:
// Example of preference name, used in the following two content script examples
var key = 'adurl';

// Example using message passing:
chrome.extension.sendMessage({type: 'getPref', key: key}, function(result) {
    // Do something with result
});

// Example using chrome.storage:
chrome.storage.local.get(key, function(items) {
    var result = items[key];
    // Do something with result
});
As you can see, there's hardly any difference between the two. However, to get the first to work, you also have to add more logic to the background page:
// Background page
chrome.extension.onMessage.addListener(function(message, sender, sendResponse) {
    if (message.type === 'getPref') {
        var result = localStorage.getItem(message.key);
        sendResponse(result);
    }
});
On the other hand, if you want to switch to chrome.storage, the logic in your options page has to be slightly rewritten, because the current code (using localStorage) is synchronous, while chrome.storage is asynchronous:
// Options page
function load_options() {
    chrome.storage.local.get('repl_adurl', function(items) {
        var repl_adurl = items.repl_adurl;
        default_img.src = repl_adurl;
        tf_default_ad.value = repl_adurl;
    });
}

function save_options() {
    var tf_ad = document.getElementById('tf_default_ad');
    chrome.storage.local.set({
        repl_adurl: tf_ad.value
    });
}
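Note that chrome.storage requires the "storage" permission in the extension's manifest. A minimal excerpt (the name and version fields here are placeholders; your other manifest entries are omitted):

{
  "name": "My extension",
  "version": "1.0",
  "manifest_version": 2,
  "permissions": ["storage"]
}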
Documentation
chrome.storage (method get, method set)
Message passing (note: this page uses chrome.runtime instead of chrome.extension. For backwards compatibility with Chrome 25 and earlier, use chrome.extension (example using both))
A simple and practical explanation of synchronous vs asynchronous ft. Chrome extensions

XMLHttpRequest onunload?

In my web app, I need to send the latest data the user has changed before they leave the page.
I call up a function like this when the page unloads:
window.onbeforeunload=sendData;
That's what's inside the function called
function sendData(){
    var xhr = new XMLHttpRequest();
    var storage = container;
    xhr.open("POST", "save.php", false);
    xhr.send("information=" + container);
}
My questions:
What is more correct: using an async or a sync request to send the data before the user closes the page?
Is it possible to make the requests smaller? I only send variables containing up to two characters and the whole request takes 171 bytes!
It's necessary to use a synchronous request; otherwise the data is not transmitted in IE10 and IE11 (see "Unload event in IE10, no form data").
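Along those lines, a slightly fleshed-out sketch of the asker's sendData() (still synchronous; save.php and the container variable come from the question, and the added Content-Type header is what a typical form-encoded PHP endpoint would expect):

function sendData() {
    var xhr = new XMLHttpRequest();
    // false = synchronous, so the request completes before the page unloads
    xhr.open("POST", "save.php", false);
    // Without this header, a PHP backend may not parse the POST fields into $_POST
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.send("information=" + encodeURIComponent(container));
}
window.onbeforeunload = sendData;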