Is there any way to generate random network errors for testing JavaScript code? I am trying to implement error handling for file uploads to a PHP server, but "unfortunately" my internet connection at home is quite stable, and I'm also testing over a LAN. I tried toggling a VPN, but it switches almost instantly, without any visible network disruption (or at least it seems that way), and even if it worked, I wouldn't want to rely on such a frustrating routine...
Thanks!
The Chrome inspector's Network tab has options for, among other things,
throttling your connection
blocking requests
and simulating offline mode for your tab (which might do the trick if you hit it during a request).
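Complementary to those DevTools options, you can also fake failures in code during testing, for example by monkey-patching fetch so that a fraction of requests reject the way a real network error would. A minimal sketch, assuming your upload code goes through fetch; FAILURE_RATE is an arbitrary test knob, not anything standard:

    // Test-only shim: make a fraction of fetch() calls fail with a network-style error.
    const FAILURE_RATE = 0.3;
    const realFetch = window.fetch.bind(window);

    window.fetch = function (input, init) {
      if (Math.random() < FAILURE_RATE) {
        // A real network failure makes fetch reject with a TypeError,
        // so we simulate exactly that.
        return Promise.reject(new TypeError('Simulated network error'));
      }
      return realFetch(input, init);
    };

If your upload uses XMLHttpRequest instead of fetch, the same idea applies by wrapping its send() method and dispatching an error event.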
I've been building a site that's meant to work in all browsers and connects to a Node.js server through a proxy (that is, the client reaches the server via an explicit proxy) using WebSockets.
The site works fine in all browsers, but I've noticed some strange behavior in IE. The more the client browsed through the site, and as a result opened more WebSockets, the slower the tab became. Eventually, the tab would stop responding entirely and had to be closed.
I tried to whittle the problem down as much as I could, and eventually noticed that when your browser uses an explicit proxy and has WebSockets open, after a certain number of connections (around 25) the tab you're using will stop being able to connect to the server, and may stop working altogether. It's easy to reproduce using the following steps:
Take the example page from here and create an HTML page
Download Fiddler and use it as your proxy
Browse to the example page you created and keep refreshing the tab. You should notice increasing slowness, until the tab eventually stops responding
It's worth noting that without a proxy, the tab will not end up dying like this.
Has anyone else encountered this issue? If so, is there any fix to this, aside from changing the architecture?
Thanks a lot
I did a lot of searching and found some topics like Detect that the Internet connection is offline?, but they don't solve my problem.
When I took a look at Google Docs (maybe Gmail uses the same approach), its Internet connectivity checker looked great: you get notified almost in real time once you turn off your WiFi.
If you open Chrome's network monitor, you can see HTTP requests to https://0.docs.google.com/document/d/1muWJEAPZU2meqlBg4_69osnscaM1GpxkOK--H7r-f44/bind?id=... that restart roughly every minute (around 55 seconds).
When you turn off the WiFi, the long-polling request gets interrupted with ERR_NETWORK_CHANGED.
I tried to implement something like this on my own machine, but my long-polling request never got interrupted or raised an exception (I use the Tornado web framework and Google Chrome).
Can anyone show me the magic behind how Google Docs checks its connectivity?
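I can't speak for Google's actual implementation, but the behaviour described above (a pending request that errors out the moment the network changes) can be approximated with a heartbeat loop in the browser. A rough sketch; the /ping endpoint and the timings are made-up placeholders, not Google's values:

    // Hypothetical heartbeat: poll an endpoint and report connectivity changes.
    let online = true;

    function setStatus(nowOnline) {
      if (nowOnline !== online) {
        online = nowOnline;
        console.log(online ? 'Back online' : 'Connection lost');
      }
    }

    async function heartbeat() {
      const controller = new AbortController();
      const timer = setTimeout(() => controller.abort(), 5000); // give up after 5s
      try {
        await fetch('/ping', { cache: 'no-store', signal: controller.signal });
        setStatus(true);
      } catch (e) {
        // Any rejection (network change, timeout, offline) counts as disconnected.
        setStatus(false);
      } finally {
        clearTimeout(timer);
      }
      setTimeout(heartbeat, 10000); // check again in 10 seconds
    }

    heartbeat();

Note that window also fires online/offline events, but those only reflect the local network interface; an end-to-end request like Google's long poll tells you whether the server is actually reachable.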
I'm trying to troubleshoot a problem on a client's machine for our website. We're using an AJAX call to fetch information on a page in order to select additional parameters. Our callback function has a block of code that checks whether the AJAX response came back successfully or as an error. On every other computer we've tested, the AJAX comes back fine. However, for this particular client, the call comes back with the error message, meaning the response never arrived successfully, or arrived corrupted or broken.
Does anyone know how this could happen? The client is using IE 8; I've tested IE 8, IE 9, IE 10, and Chrome, and all of them work on my machines.
EDIT: As of now, we don't have access to the system and network that are causing the error. They are checking whether they can whitelist everything from our domain to see if that fixes it, but right now I can't put Fiddler on their computer.
I've seen any amount of random behaviour caused by virus scanners and so-called network security products. Try adding an exception for your site to any security software your client is running.
The other thing to do is to use Wireshark, Fiddler, etc. to see what's actually happening at the network level.
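One more thing that can help from a distance is making the error callback log everything it can see, so the client can send you the output. A sketch assuming a jQuery-style call (the URL and handlers are illustrative placeholders, not your actual code):

    // Illustrative diagnostic logging for a failing AJAX call.
    $.ajax({
      url: '/get-parameters',   // placeholder endpoint
      dataType: 'json',
      timeout: 15000,
      success: function (data) {
        // ... normal handling ...
      },
      error: function (jqXHR, textStatus, errorThrown) {
        // status 0 usually means the request was blocked or never left the
        // machine, which points at proxies, firewalls, or security software.
        console.log('status:', jqXHR.status,
                    'textStatus:', textStatus,
                    'errorThrown:', errorThrown,
                    'responseText:', jqXHR.responseText);
      }
    });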
We are trying to figure out how something works on the web (for web scraping/automation), and one of the pages we are working with opens a popup to do some of the work. One of our most commonly used debugging tools is the Network tab in Chrome's Developer Tools: hit "record", do some work, then examine what was done so we can replicate it "offline".
However the Developer Tools (in Chrome, Safari and Firefox - all work the same) do not follow requests across a popup, even if you hit "record".
Is there some configuration value I'm missing, or some way to record all network events? We can't use tcpdump/wireshark for this because it's all done over SSL. One option we've considered is a man-in-the-middle https proxy, but I can't find anything pre-written so we'd have to create one ourselves.
I don't know of any way to follow the requests across pop-ups, as each window has its own Web Inspector, however you can use Fiddler to inspect HTTPS requests. It will MITM, and subsequently throw a certificate error, which should allow you to inspect all requests in the order that they happened.
You can use Charles Web Debugging Proxy, which is an app that lets you see all the traffic and even replace some responses with your own. Of course that may break HTTPS so you have to accept the certificate errors, but that's usually a minor problem. It works on Win, Mac and even Linux.
The inspector cannot inspect what isn't in the current page. Therefore, you will need to open the inspector inside the popup (same URL, same parameters) in order to see what it does.
As a tool, you can use a web sniffer to see exactly which URLs were called during the process.
While debugging a client app that uses a Google backend, I have added some debugging versions of the functions and inserted them using the Chrome Developer Tools script editor.
However, there are a number of limitations with this approach. The first is that the editor doesn't always seem to work with de-minified files, and when the JS file is 35K lines long, that's a problem.
Another issue is that all the initialization done at load time uses the original "unpatched" functions, which is not ideal.
I would like to replace the remote javascript.js file with my own local copy, presumably using some regex on the file name, or whatever strategy is suitable. I am happy to use either Firefox or Chrome, whichever is easier.
So basically, as @BrockAdams identified, there are a couple of solutions to this type of problem, depending on the requirements, and they follow one of two methods:
1. The browser API switcharoo.
2. The proxy-based interception befiddlement.
The browser API switcharoo
Both Firefox and Chrome support browser extensions that can take advantage of platform-specific APIs to register event handlers, "onbeforeload" and "onBeforeRequest" in the case of Firefox and Chrome respectively. The Chrome APIs are currently experimental, hence these tools are likely to be better developed under Firefox.
Two tools that definitely do something like what is required are AdBlock Plus and JSDeminifier, both of which have their source code available.
The key point for these two Firefox add-ons is that they intercept the web request before the browser gets its hands on it and operate on the other side of the HTTP/HTTPS encryption stage, hence they can see the decrypted response. However, as identified in the other post, they don't do the whole thing. JSDeminifier was very useful, but I didn't find a Firefox plugin that did exactly what I wanted; still, those plugins show that it is possible with both Firefox and Chrome, even though they don't quite do the trick as required.
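To illustrate the Chrome side of the "switcharoo": a minimal sketch of a background script using the webRequest API (experimental at the time of writing). The URLs here are placeholders, not what either of those add-ons actually does:

    // background.js (sketch) - redirect a remote script to a local copy.
    // Needs "webRequest", "webRequestBlocking" and matching host permissions
    // declared in manifest.json.
    chrome.webRequest.onBeforeRequest.addListener(
      function (details) {
        // Serve the de-minified/patched file from a local dev server instead.
        return { redirectUrl: 'http://localhost:8000/javascript.js' };
      },
      { urls: ['*://example.com/javascript.js'] }, // placeholder pattern
      ['blocking']
    );

Because the redirect happens inside the browser, it works regardless of whether the original request was HTTP or HTTPS.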
The proxy-based interception befiddlement
This is definitely the better option in a plain HTTP environment. There are a whole bunch of proxies, such as Privoxy, Fiddler2, and the Charles Web Debugging Proxy, and presumably some I didn't look at specifically, such as Snort, that support filtering of some sort.
The simplest solution for me was FoxyProxy and Privoxy on Firefox: configure a user.action and a user.filter to detect the URL of the page, then apply a filter that swaps out the original src tag for my own, as sketched below.
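As a hedged illustration of that Privoxy setup (user.action and user.filter are Privoxy's real config files, but the filter name and URLs here are made up):

    # user.filter -- define a filter that swaps the script src (placeholder URLs)
    FILTER: swap-js Replace the remote script with a local copy
    s|src="https://example.com/javascript.js"|src="http://localhost:8000/javascript.js"|g

    # user.action -- apply the filter only on the page in question
    {+filter{swap-js}}
    example.com/page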
The HTTPS case: proxy vs. plugin
When the request is HTTPS, the proxy can't see the request URL or the response body, so it can't do the cool swapping stuff. However, there is one option available for those who like to mess with their browser: the man-in-the-middle SSL proxy. The Charles Web Debugging Proxy appears to be the main solution to this problem. Basically, when your browser makes a request to the remote HTTPS server, the SSL proxy intercepts the request and generates a server certificate for that host on the fly, signs it with its own root CA, and sends it back to the browser. The browser obviously complains about the self-signed cert, but here you can choose to install the SSL proxy's root CA cert into the browser, befuddling the browser and allowing the SSL proxy to man-in-the-middle and apply replacements and filters on the raw response body.
Alternative: roll your own Chrome extension
I decided to go with rolling my own Chrome extension, which I am planning to make available. Currently it's very much hardcoded to my own requirements, but it works pretty well, even for HTTPS requests. Another benefit is that a browser-extension solution can be more tightly integrated with the browser developer tools.