Break connection to site from the JS console - javascript

In Chrome, I have populated an online mapping tool (Kumu) with a JSON file from the JS console with:
Workflows.setCurrentMapSource("MY_JSON_LINK");
where MY_JSON_LINK was:
https://XXXXXX/json?key=MTE3.DI4LYA.ZrzRFJ5o7Q5m3nLe6d6JGFISdKI
But the link is no longer active, so when I go to the Kumu page I get the error:
Unable to open map
Is there a way to break the connection from the JS console? I have searched but have not found anything that works.
Thanks

I'm on my phone so I can't give you the full code, but what you can do is override the XMLHttpRequest methods; then you can manipulate any request made on the page.
But this must of course be done BEFORE the requests are made, so you'll probably need a Tampermonkey userscript (see the sketch at the end of this answer). Example:
const originalOpen = XMLHttpRequest.prototype.open;
XMLHttpRequest.prototype.open = function () {
    // do what you need
    originalOpen.apply(this, arguments);
};
So, for example, if you want to prevent a particular link from being accessed, you can do this:
const originalOpen = XMLHttpRequest.prototype.open;
const REGEX_TEST_URL = /https?:\/\/XXXXXX\/json\?key=/;
XMLHttpRequest.prototype.open = function (method, url) {
    console.log("Open: ", url);
    // If you want to kill access to that URL...
    if (REGEX_TEST_URL.test(url))
        throw new Error("Blocked loading of URL " + url);
    // ...otherwise allow normal operation to proceed
    originalOpen.apply(this, arguments);
};
You can test this right here on Stack Overflow.
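For completeness, a minimal sketch of that Tampermonkey wrapper, assuming the map lives on kumu.io (adjust @match to the actual page): @run-at document-start installs the override before the page issues any requests, and @grant none keeps the script in the page context so it patches the page's own XMLHttpRequest.

// ==UserScript==
// @name         Block stale Kumu JSON source
// @match        https://*.kumu.io/*
// @run-at       document-start
// @grant        none
// ==/UserScript==
(function () {
    const REGEX_TEST_URL = /https?:\/\/XXXXXX\/json\?key=/;
    const originalOpen = XMLHttpRequest.prototype.open;
    XMLHttpRequest.prototype.open = function (method, url) {
        if (REGEX_TEST_URL.test(url))
            throw new Error("Blocked loading of URL " + url);
        originalOpen.apply(this, arguments);
    };
})();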

Related

check if webpage is public or private using a js request

Using vanilla JS, I need to know whether someone is using my Chrome extension on a private webpage or a public webpage.
Example of a public webpage:
https://www.facebook.com/home
Example of a private webpage:
https://www.facebook.com/account/settings
Is it possible to figure out whether a webpage is accessible by everyone or requires login permissions?
What I have:
let xhr = new XMLHttpRequest();
xhr.open('GET', 'https://www.facebook.com/account/settings');
xhr.responseType = 'json';
xhr.onload = function () {
    let res = xhr.status;
    // res == 503?
};
xhr.send();
However, I think that since my app runs in their browser, their session will be sent along and it will return a false positive.
There is no standard way of checking that for "normal" websites.
Some might in fact return a proper status code, but others (like Facebook) won't and will instead render the same 200 (OK) status page for every URL and handle the login/redirects internally via JavaScript. (This is oversimplified for the sake of this example)
You will have to write separate detection algorithms for every page you want to check.
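That said, for sites that do return meaningful status codes, a rough sketch might look like this. fetch is used instead of XMLHttpRequest because credentials: 'omit' keeps the user's cookies out of the request, which addresses the false-positive concern in the question; a redirect (typically to a login page) is treated as private. This assumes the extension has host permission for the URL:

// Hypothetical helper: resolves to true if the URL answers 2xx without a session.
// Only useful for sites that really return 401/403 or a login redirect for
// private pages -- as noted above, sites like Facebook do not.
function isPublic(url) {
    return fetch(url, { credentials: 'omit', redirect: 'manual' })
        .then(function (res) {
            // 'opaqueredirect' means we were bounced, likely to a login page
            return res.type !== 'opaqueredirect' && res.status >= 200 && res.status < 300;
        })
        .catch(function () { return false; });
}

isPublic('https://www.facebook.com/home').then(console.log);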

Google analytics intercept all requests

I would like to get a callback each time Google Analytics sends data to the server. I would like also to send the same data to my server. Is it possible and if so, how?
https://jsfiddle.net/bk1j8u7o/2/
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-143361924-1"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'UA-143361924-1');
</script>
Google is actually using a GIF request to sync the data to its server, so intercepting the XHR requests won't work.
In analytics.js there is an official way to do this, via Tasks. Here is a small, untested example:
ga(function (tracker) {
    var originalSendHitTask = tracker.get('sendHitTask');
    tracker.set('sendHitTask', function (model) {
        var payLoad = model.get('hitPayload');
        originalSendHitTask(model);
        var gifRequest = new XMLHttpRequest();
        var gifPath = "http://localhost/collect";
        gifRequest.open('get', gifPath + '?' + payLoad, true);
        gifRequest.send();
    });
});
Make sure that the pageview is sent after this code has executed.
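With the standard analytics.js command queue, that ordering might look like this (the tracker ID is a placeholder); the override is queued before the pageview, so the pageview hit is mirrored too:

ga('create', 'UA-XXXXXXX-1', 'auto');
ga(function (tracker) {
    // install the sendHitTask override from the snippet above
});
ga('send', 'pageview'); // queued after the override, so this hit is duplicated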
I'll demonstrate how you can intercept any AJAX call. Starting from this generic solution, you can filter out the GA requests and take whatever actions you want.
I modified this answer.
The idea behind this solution is to modify the open and send prototype methods of the XMLHttpRequest object and do the interception there. The IIFE takes the XMLHttpRequest object, saves the original prototype methods, installs new methods, and calls the original methods from within the new ones. And, of course, it does whatever you want with the data in the meantime.
(function (XHR) {
    // Save the original methods
    var open = XHR.prototype.open;
    var send = XHR.prototype.send;
    // Hook the open method in order to capture the url
    XHR.prototype.open = function (method, url, async, user, pass) {
        this._url = url;
        // Call the original
        open.call(this, method, url, async, user, pass);
    };
    // Hook here too. This runs just before the data is sent
    XHR.prototype.send = function (data) {
        if (this._url === GA_URL_CONST) // Symbolic const
            SendDataToMyServer(data);   // Symbolic Fn
        // Call the original
        send.call(this, data);
    };
})(XMLHttpRequest);
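The symbolic names are left for you to fill in; assuming classic analytics.js, which sends hits to the Measurement Protocol /collect endpoint, they might look roughly like this. Bear in mind that, as the answer above notes, Google may transport hits as an image (GIF) request rather than XHR, in which case this hook never fires:

// Classic analytics.js endpoint; an exact match is fragile, so a prefix
// check may serve better than === in the snippet above
var GA_URL_CONST = 'https://www.google-analytics.com/collect';

// Hypothetical forwarder: mirror the intercepted payload to your own server
function SendDataToMyServer(data) {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', 'https://example.com/collect', true); // placeholder URL
    xhr.send(data);
}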
Possible? Yes; practical? No. Take a look at what the BigQuery schema for GA looks like and you'll get a sense of the complexity that goes on behind the scenes.
That said, I think what you COULD do is:
1. Use GTM to implement GA.
2. Set up a custom tag template that refers to your own server, which will collect the information, passing just the data that you need instead of everything GA collects (see the sketch after this list).
3. Trigger your new custom tags wherever you're triggering your GA tags.
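A sketch of such a custom tag template, written in GTM's sandboxed JavaScript (the field names on data are hypothetical; gtmOnSuccess and gtmOnFailure are the callbacks GTM hands every template):

// Sandboxed JavaScript for a GTM custom tag template -- not regular page JS
const sendPixel = require('sendPixel');
const encodeUriComponent = require('encodeUriComponent');

// 'data' carries the fields defined in the template UI
const url = 'https://example.com/collect' +            // placeholder endpoint
    '?page=' + encodeUriComponent(data.pagePath) +
    '&event=' + encodeUriComponent(data.eventName);

sendPixel(url, data.gtmOnSuccess, data.gtmOnFailure);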

In a node.js API how do I perform tasks that must be synchronous without freezing the browser?

To start off, this is what I am trying to accomplish:
I am trying to do file copies to an array of servers. There are several steps that must be completed in a specific order before and after these copies (for example, stopping IIS, backing up and clearing folders, running a bat file, etc) so they are not single operations.
To make this super easy, I wrote an API in node.js that does simple tasks like copying files and folders, deleting folders, etc. I then wrote a frontend in node.js, using an express generator and Pug, that uses JavaScript XMLHttpRequests to send commands to the API depending on what I need to do. I have the API written and running, as well as the frontend. Now on to the problems:
If I have my XMLHttpRequest run in synchronous mode (example: xhttp.open("POST", url, false);) and the command sent to the API is a folder copy that takes several minutes, the browser freezes. Chrome displays a "Page Frozen" error. However, the job gets done correctly.
If I have my XMLHttpRequest run in asynchronous mode (example: xhttp.open("POST", url, true);) then every command gets sent to the API at once, the fastest operation completes first, and the commands run out of order. The copy will fail.
I've tried searching for a way to make each operation sent from the frontend JavaScript wait for a SUCCESS (or 200) response from the API before moving on to the next command, but so far all I've seen is "just use synchronous". That's what I'm doing right now, and it works, but it doesn't seem like the best solution. Is there a better way to do this, one that won't freeze the browser?
I figured this out by writing a function to handle the requests, setting a counter (for the steps of the process), and putting a switch statement on that counter inside the result handler. It wasn't exactly what I needed, but the basics of my solution are in the answers to this question: How can I call ajax synchronously without my web page freezing
Here's what I did in case it helps anyone else who finds this question:
function myFunction(step, params, url) {
    var xhttp = new XMLHttpRequest();
    xhttp.onreadystatechange = function () {
        if (this.readyState == 4 && this.status == 200) {
            switch (step) {
                case 2:
                    // url and params are set here, and step 2 is done here
                    myFunction(step, params, url);
                    break;
                case 3:
                    // and so on and so forth
            }
        }
    };
    xhttp.open("POST", url, true);
    xhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    xhttp.send(params);
    step++;
}
//kick off the function
var step = 1;
var url = "my URL to the API with call";
var parameters = "my parameters";
myFunction(step, parameters, url);
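For what it's worth, the same sequencing is expressed more naturally these days with fetch and async/await: each step completes before the next request is sent, and nothing blocks the browser. A rough equivalent of the flow above, with placeholder URLs and parameters:

async function runSteps() {
    const steps = [
        { url: '/api/stop-iis', params: 'server=web01' },    // placeholders
        { url: '/api/copy-folder', params: 'src=a&dest=b' },
        { url: '/api/start-iis', params: 'server=web01' }
    ];
    for (const step of steps) {
        // await resolves (or throws) before the next command is sent
        const res = await fetch(step.url, {
            method: 'POST',
            headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
            body: step.params
        });
        if (!res.ok) throw new Error('Step failed: ' + step.url);
    }
}

runSteps().catch(console.error);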

Trying to build query string and scrape Google results

I'm trying to build a Google query string, make a request to that page, scrape the HTML, and parse it in a Chrome extension, which is JavaScript. So I have the following code:
var url = "https://www.google.com/search?#q=" + artist + "+" + title;
searchGoogleSampleInformation(url);
function searchGoogleSampleInformation(url)
{
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, false);
    xhr.onreadystatechange = function ()
    {
        if (xhr.readyState == 4)
        {
            return parseGoogleInformation(xhr.responseText, url);
        }
    };
    xhr.send();
}
function parseGoogleInformation(search_results, url)
{
    var link = $(".srg li.g:eq(0) .r a", search_results).attr('href');
}
The parse method just grabs the URL of the first search result (which is not what I'll end up doing, but it's enough to test that the HTTP request is working). But link is undefined after that line. I used alert(url) and verified that my query string was being built correctly; I copied it from the alert window, pasted it into another tab, and it pulled up the results as expected. Then I opened a new window with search_results, and it appeared to be Google's regular homepage with no search at all. I thought the problem might be the asynchrony of the xhr.open call, but flipping that didn't help either. Am I missing something obvious?
It's because "https://www.google.com/search?#q=" + artist + "+" + title initially has no search results in its content. Google renders the page with no results at first and then loads them dynamically via JavaScript. Since you are just fetching the HTML of the page and processing it, the JavaScript in that HTML never gets executed.
You are making a cross-domain Ajax call, which is not allowed by default. You cannot make a cross-domain call unless the server supports it and you pass the appropriate headers.
However, as you mentioned you are building a Chrome extension, it is possible by adding a few fields to the manifest file: https://developer.chrome.com/extensions/xhr#requesting-permission
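Per that documentation, a minimal sketch of the relevant manifest fields (manifest v2, which the linked page covers) might be:

{
    "name": "Sample search scraper",
    "version": "1.0",
    "manifest_version": 2,
    "permissions": [
        "https://www.google.com/*"
    ]
}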

Using JavaScript to perform a GET request without AJAX

Out of curiosity, I'm wondering about the best (easiest, fastest, shortest, etc; make your pick) way to perform a GET request in JavaScript without using AJAX or any external libraries.
It must work cross-browser, and it's not allowed to distort the hosting web page visually or affect its functionality in any way.
I don't care about headers in the request, just the url-part. I also don't care about the result of the request. I just want the server to do something as a side effect when it receives this request, so firing it is all that matters. If your solution requires the servers to return something in particular, that's ok as well.
I'll post my own suggestion as a possible answer, but I would love it if someone could find a better way!
Have you tried using an Image object? Something like:
var req = new Image();
req.onload = function () {
    // Probably not required if you're only interested in
    // making the request and don't need a callback function
};
req.src = 'http://example.com/foo/bar';
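One caveat worth noting: the browser may satisfy a repeated src from its cache, so the request never reaches the server. A common workaround is a throwaway cache-busting parameter:

// Append a unique query parameter so the request can't be served from cache;
// the server can simply ignore it
req.src = 'http://example.com/foo/bar?_=' + Date.now();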
function GET(url) {
    var head = document.getElementsByTagName('head')[0];
    var n = document.createElement('script');
    n.src = url;
    n.type = 'text/javascript';
    n.onload = function () { // not strictly necessary, but removes the tag when finished
        head.removeChild(n);
    };
    head.appendChild(n);
}
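Keep in mind that whatever the server returns is executed as JavaScript, so this variant is only safe when the endpoint returns an empty body or valid JS. Usage is simply:

GET('http://example.com/ping'); // response body, if any, must be valid JS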
I would go with Pekka's idea and use a hidden iframe. The advantage is that no further parsing will be done: for an image, the browser will try to parse the result as an image; for a dynamically created script tag, the browser will try to parse the result as JavaScript code. An iframe is "hit and run": the browser doesn't care what's in there.
Changing your own solution a bit:
function GET(url) {
    var oFrame = document.getElementById("MyAjaxFrame");
    if (!oFrame) {
        oFrame = document.createElement("iframe");
        oFrame.style.display = "none";
        oFrame.id = "MyAjaxFrame";
        document.body.appendChild(oFrame);
    }
    oFrame.src = url;
}
