How to track/analyze GeoLocation requests in modern browsers? - javascript

Modern browsers such as Firefox and Google Chrome offer a JavaScript API that, with the user's permission, can detect and share the user's location coordinates with the web server. To do this, the browser has to make an additional HTTP (or other network) request to a web service. Is there an easy way to inspect this request for troubleshooting purposes, i.e. to see what is being sent, to where, and how long it is taking?
It does not show up in Firebug or the Web Inspector.

Each browser engine implements geolocation using its own internal logic.
The standardized functionality, common to all of them, ships with a number of features (success and error callbacks, watchPosition, etc.) that can help you troubleshoot it properly.
See the Mozilla documentation below if you want more information.
https://developer.mozilla.org/en/docs/Using_geolocation
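For example, here is a minimal sketch using those callbacks to time a lookup. Note that the timing only brackets the whole lookup; it cannot reveal the underlying network request itself:

// Minimal sketch: time a geolocation lookup and inspect the outcome.
var start = Date.now();
navigator.geolocation.getCurrentPosition(
  function (position) { // success callback
    console.log('Lookup took ' + (Date.now() - start) + ' ms');
    console.log('lat/lon:', position.coords.latitude, position.coords.longitude);
  },
  function (error) {    // error callback: PERMISSION_DENIED, POSITION_UNAVAILABLE or TIMEOUT
    console.error('Geolocation failed:', error.code, error.message);
  },
  { enableHighAccuracy: false, timeout: 10000, maximumAge: 0 }
);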

Use a packet sniffer utility to analyze the raw network requests.

Related

Delphi TWebBrowser JavaScript Errors and Cookies

I have been stuck on this for a couple of weeks now, and this is a follow-on from the SO question Delphi REST Debugger Returns Error 429 Too Many Requests but Browser Returns JSON as Expected.
I wanted to get the content of a URL response using the TNetHTTPRequest and TNetHTTPClient components. I was continually getting 429 errors (“too many requests”). When using Firefox's Inspect Element to look at network and storage, I discovered that I needed to receive cookies and then send those cookies with my request. Unfortunately, one of the cookies essential to the website content seems to be dependent (I think) on the execution of JavaScript. I went back to first principles and dropped a TWebBrowser on a form (VCL), and sure enough the browser shows a JavaScript error, “Expected Identifier”.
When I use TWebBrowser in FMX, it does not throw an error; it just does not return the website contents at all and remains blank. I need FMX as I will be in a cross-platform mobile environment.
The URL is https://shop.coles.com.au/a/national/home
I use Delphi Community Edition 10.3.3 Rio.
The URL returns perfectly in commercial browsers: Firefox, Safari, Chrome and even CEF4Delphi. Unfortunately, I can't use CEF as I need cross-platform support.
I would like to know how to get the website content returned to the browser (or, even better, to the NetHTTPClient) without script errors, and how to access the browser's current cookies.
Any help will be most appreciated.
Thanks,
John.
URL returns perfectly in commercial browsers ... without script errors and how to access the browser's current cookies
If you inspect the network traffic (F12 > Network, then requesting your URL) or use uMatrix (to block, by default, everything that doesn't belong to the domain), you'll see the JS makes at least one XHR to amazonaws.com. Your HTTP transfer alone (as done by TNetHTTP*) works fine, and you get the same resource that every internet browser gets.
However, you don't operate on what you got (in contrast to the internet browser, which also automatically parses the HTML, sees the JS resources, and executes them). TWebBrowser does not do what you take for granted, most likely due to security settings (try to get an error console in there, preferably via F12 again). You need to do the same: parse the HTML resource for JS URIs, request those, and execute what you get, while still providing the same cookie environment.
For executing JS you could use Chakra or mORMot or BESEN. It's challenging at first, but the more you understand about HTTP (including cookies) and a JS engine, the more you'll see why "things work" in one situation and not in another. There's a reason an internet browser is a very complex piece of software and not just a downloader.
As per this, forcing IE11 Quirks mode might already cure your problem when using TWebBrowser:
TBrowserEmulationAdjuster.SetBrowserEmulationDWORD(TBrowserEmulationAdjuster.IE11_Quirks);

Chrome extension private API?

Hello, I would like to use the networkingPrivate API, which is one of Chrome's private APIs. I have added a permission for networkingPrivate, but it seems that is not enough. I also found this:
These are only usable by extensions bundled with Chrome, they are not publicly accessible.
here: https://developer.chrome.com/extensions/private_apis, but I'm not sure what exactly that means, or how to deal with that.
My original intention is to get info about the Wi-Fi connection (SSID, ...).
Thanks
It would be a security risk to allow user extensions to access this information directly from the OS. I think the private API is for Chrome OS, so that you can access and modify OS features directly from the browser, since Chrome OS is basically just a browser.
You could try looking at Native Messaging, which I believe will allow you to connect your extension to a native application installed on the host machine. The application, written by you, would be linked to the extension, and then you can pass messages between the two. For example, your extension could request network information, and the native app would fetch it and send it back to the extension.
I haven't looked into this in depth, so I may be wrong, but it is worth investigating. I'd be very interested to see the result.
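A minimal sketch of the extension side of that idea, assuming a hypothetical native messaging host registered as com.example.wifi_info and the "nativeMessaging" permission in the manifest:

// Hypothetical host name; it must match a native messaging host
// manifest installed on the user's machine.
chrome.runtime.sendNativeMessage(
  'com.example.wifi_info',
  { request: 'ssid' },   // a payload your native app understands
  function (response) {
    if (chrome.runtime.lastError) {
      console.error(chrome.runtime.lastError.message);
      return;
    }
    console.log('Native app replied:', response);
  }
);

The native application itself reads length-prefixed JSON messages on stdin and writes replies to stdout; the actual work of querying the OS for the SSID happens there.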
Private APIs are made for internal use.
For example, the DeveloperPrivate API is used on the chrome://extensions page.
They are more powerful than other APIs, but they would also raise security issues if they were open. If you really need to use them, just ask for a new public API to be created.

Developer Tools: Follow network requests across popups

We are trying to figure out how something works on the web (for web scraping/automation), and one of the web pages we are working on opens a popup to do some of the work. One of our most commonly used debugging tools is the Network tab in Chrome's Developer Tools: hit "record", do some work, then examine what was done and replicate it "offline".
However, the Developer Tools (in Chrome, Safari and Firefox; all behave the same) do not follow requests across a popup, even if you hit "record".
Is there some configuration value I'm missing, or some way to record all network events? We can't use tcpdump/wireshark for this because it's all done over SSL. One option we've considered is a man-in-the-middle HTTPS proxy, but I can't find anything pre-written, so we'd have to create one ourselves.
I don't know of any way to follow the requests across popups, as each window has its own Web Inspector; however, you can use Fiddler to inspect HTTPS requests. It will MITM the connection, and subsequently throw a certificate error, which should allow you to inspect all requests in the order that they happened.
You can use the Charles Web Debugging Proxy, an app that lets you see all the traffic and even replace some responses with your own. Of course this may break HTTPS, so you have to accept the certificate errors, but that's usually a minor problem. It works on Windows, Mac and even Linux.
The inspector cannot inspect what isn't in the current page. Therefore, you will need to open the inspector inside the popup's URL, with the same parameters, in order to see what it does.
As a tool, you can use a web sniffer to see exactly which URLs were called during the process.

Replace remote JavaScript file with a local debugging copy using Greasemonkey or userscript

While debugging a client app that uses a Google backend, I have added some debugging versions of the functions and inserted them using the Chrome Developer Tools script editor.
However, there are a number of limitations with this approach: the first is that the editor doesn't always seem to work with de-minified files, and when the JS file is 35K lines long, this is a problem.
Another issue is that all the initialization done at load time uses the original "unpatched" functions, so this is not ideal.
I would like to replace the remote javascript.js file with my own local copy, presumably using some regex on the file name or whatever strategy is suitable. I am happy to use either Firefox or Chrome, if one is easier than the other.
So basically, as @BrockAdams identified, there are a couple of solutions to this type of problem, depending on the requirements, and they follow one of two methods:
1. The browser API switcharoo.
2. The proxy-based interception befiddlement.
The browser API switcharoo
Both Firefox and Chrome support browser extensions that can take advantage of platform-specific APIs to register event handlers for "onbeforeload" or "onBeforeRequest", in the case of Firefox and Chrome respectively. The Chrome APIs are currently experimental, hence these tools are likely to be better developed under Firefox.
Two tools that definitely do something like what is required are Adblock Plus and Jsdeminifier, both of which have their source code available.
The key point for these two Firefox add-ons is that they intercept the web request before the browser gets its hands on it, and they operate on the other side of the HTTP/HTTPS encryption stage, hence they can see the decrypted response. However, as identified in the other post, they don't do the whole thing. Although Jsdeminifier was very useful, I didn't find a Firefox plugin that did exactly what I wanted, but I can see from those plugins that it is possible with both Firefox and Chrome, even though they don't actually do the trick as required.
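For reference, a minimal background-page sketch of the onBeforeRequest approach under Chrome (Manifest V2), assuming the "webRequest", "webRequestBlocking" and matching host permissions; the script URLs are placeholders for your own:

// Redirect a remote script to a local debugging copy before the
// browser fetches it.
chrome.webRequest.onBeforeRequest.addListener(
  function (details) {
    return { redirectUrl: 'http://localhost:8080/javascript.debug.js' };
  },
  { urls: ['*://example.com/javascript.js'], types: ['script'] },
  ['blocking']
);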
The proxy-based interception befiddlement
This is definitely the better option in a plain HTTP environment. There are a whole bunch of proxies, such as Privoxy, Fiddler2 and the Charles Web HTTP proxy, and presumably some that I didn't look at specifically, such as Snort, that support filtering of some sort.
The simplest solution for me was FoxyProxy and Privoxy on Firefox: configure a user.action and a user.filter to detect the URL of the page, and then apply a filter which swapped out the original src attribute for my own.
The HTTPS case: proxy vs. plugin
When the request is HTTPS, the proxy can't see the request URL or the response body, so it can't do the cool swapping stuff. However, there is one option available for those who like to mess with their browser, and that is the man-in-the-middle SSL proxy. The Charles Web HTTP proxy appears to be the main solution to this problem. Basically, the way it works is that when your browser makes a request to the remote HTTPS server, the SSL proxy intercepts the request and, for the requested host, generates a server certificate on the fly, which it signs with its own root CA and sends back to the browser. The browser obviously complains about the self-signed certificate, but here you can choose to install the SSL proxy's root CA certificate into the browser, satisfying the browser and allowing the SSL proxy to act as a man in the middle and apply replacements and filters to the raw response body.
Alternative: roll your own Chrome extension
I decided to go with rolling my own Chrome extension, which I am planning to make available. Currently it's very much hardcoded to my own requirements, but it works pretty well, even for HTTPS requests. Another benefit is that a browser-plugin solution can be more tightly integrated with the browser's developer tools.

Cross-domain SSL handshake failure in Firefox using xhr, client-certificate

The setup is as follows:
Firefox (both 3.x and 4b) with properly set up and working certificates, including a client certificate.
Web page with an XMLHttpRequest() type of AJAX call to a different subdomain.
Custom web server in said subdomain accepting requests, responding with a permissive Access-Control-Allow-Origin header and requiring client verification.
The problem is that Firefox aborts the request abruptly (well, that's what it says in Firebug, anyway). Running the setup with openssl s_server instead hints that Firefox actually doesn't even send the client certificate:
140727260153512:error:140890C7:SSL routines:SSL3_GET_CLIENT_CERTIFICATE:peer
did not return a certificate:s3_srvr.c:2965:ACCEPT
The exact same setup works perfectly with Chrome, suggesting perhaps a bug in Firefox. However, performing the AJAX call with a <script> element injected into the DOM seems to work as intended...
So, has anyone else run into this? Is it a bug? Any workarounds? Is there something obvious missing?
Chiming in 5 years later probably isn't much help to the OP, but in case someone else has this issue in the future...
Firefox appears not to send the client certificate with a cross-origin XHR request by default. Setting withCredentials = true on the XHR instance resolved the issue for me. Note that I also did not see this problem with Chrome, only Firefox.
For more info, see this Mozilla Developer Network blog post; in particular, the following statement:
By default, in cross-site XMLHttpRequest invocations, browsers will not send credentials. A specific flag has to be set on the XMLHttpRequest object when it is invoked.
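A minimal sketch of the fix, with a placeholder URL standing in for the cross-origin endpoint:

// Ask the browser to include credentials (cookies, client certificates)
// in the cross-origin request.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://api.example.com/data', true);
xhr.withCredentials = true;
xhr.onload = function () { console.log(xhr.status, xhr.responseText); };
xhr.onerror = function () { console.error('Request failed'); };
xhr.send();

Note that the server must then also answer with Access-Control-Allow-Credentials: true, and in that case Access-Control-Allow-Origin cannot be the wildcard *.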
The reason injecting the script works, as opposed to a plain XHR request, is the Same-Origin Policy. This would probably explain why Chrome allows the XHR but FF does not: Chrome considers the subdomain part of the same origin, but FF does not.
Injecting scripts from other domains (which is what Google Analytics does) is allowed, and it is one of the accepted practices for handling this situation.
The way my team handles it is by making the request through a server-side proxy.
I would recommend using a server-side proxy if you can, but the script-injection method works fine as long as the code is coming from a trusted source.
I also found this article, which describes your situation.
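For completeness, here is a minimal sketch of the script-injection pattern, with a placeholder URL standing in for the trusted endpoint:

// Load and execute a script from another domain; <script> elements
// are exempt from the Same-Origin Policy.
var script = document.createElement('script');
script.src = 'https://api.example.com/endpoint.js';
script.onload = function () {
  // The fetched script has run; use whatever globals it defined here.
};
document.head.appendChild(script);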
