Initializing page loading browser indicator/animated icon with JavaScript

Is there an easy way to start and stop the browser throbber (page loading indicator) without changing the page you are on? Preferably with no external libraries or AJAX calls.

No. You could try to force it to spin by performing ajax calls or whatever, but don't. That part of the browser isn't for you, it's for the browser!
This is kinda like asking if you can change the system clock so that your game that includes a time machine is more realistic.

You could call a simple server-side script, let's call it 'sleep.cgi', that halts execution (sleeps) and thereby delays the delivery of a simple self-reloading page in a hidden iframe. That should make pretty much any major browser display what you call a 'browser throbber' (page loading indicator). Once you no longer need the throbber to display, you simply reset that iframe's location to an empty string. This is how I'd do it:
1) Write a simple server-side script that takes the request and halts execution for up to 30 seconds, so as not to hit the browser's page-load timeout. Once the sleep period is over, the script responds with something along the lines of:
<html><body onload="location.reload();"></body></html>
forcing it to loop until you want the browser to display the 'page loaded' state.
2) Write supporting JavaScript functions that start and stop the 'throbber' on request, and add the HTML elements we'll be using:
<script>
function startThrobber(){
  // load the slow-responding script in the hidden iframe; the random query
  // string prevents the browser from serving a cached response
  document.getElementById('throbberFrame').src = 'sleep.cgi?' + Math.random();
}
function endThrobber(){
  // blank the iframe to cancel the pending request and stop the loading indicator
  document.getElementById('throbberFrame').src = '';
}
</script>
<div id="throbberWrapper" style="display:none;visibility:hidden;">
  <iframe id="throbberFrame" style="width:1px;height:1px;"></iframe>
</div>
I've done a quick test in Chrome/IE/Opera/Firefox and it seems to work OK. I've never had any need to do anything like it, though.
EDIT: If your web server supports PHP, here's the sleep function reference: http://php.net/manual/en/function.sleep.php. I don't write PHP, but I believe your PHP document should look something like this:
<html><body onload="location.reload();">
<?php
// sleep for 29 seconds
sleep(29);
?>
</body></html>
Cheers!

The UI freezes if something very CPU-intensive is loaded, so you never end up seeing the animated GIF or progress bar.
You have an XY problem. You can prevent the browser from freezing by using setTimeout to divide CPU-intensive operations into smaller chunks. The browser throbber is a UI element that signals something is being loaded from the network; it would be extremely confusing to use it for any other reason.
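As a rough sketch of that chunking approach (the array, handler function, and chunk size below are placeholders, not anything from the question):
function processInChunks(items, handleItem, chunkSize) {
  // Process a few items, then yield back to the browser so it can repaint
  // and keep the page responsive.
  var index = 0;
  function doChunk() {
    var end = Math.min(index + chunkSize, items.length);
    for (; index < end; index++) {
      handleItem(items[index]); // the CPU-intensive work goes here
    }
    if (index < items.length) {
      setTimeout(doChunk, 0); // schedule the next chunk after the browser gets a turn
    }
  }
  doChunk();
}
// usage (hypothetical): processInChunks(bigArray, expensiveOperation, 100);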

Related

How to make webpage not responding for testing

I want to put a webpage into a 'not responding' state manually. The purpose is to embed the webpage inside a webview component in an Electron application; so that the renderer process can know the embedded process is not responding, I wish to use the unresponsive event of the webContents object. Can anyone help?
The Electron API Demos application has a demo section called Handling Window Crashes and Hangs that makes use of the Electron methods process.crash() and process.hang(), which have been designed specifically for this kind of testing.
You may try calling process.hang() somewhere in the relevant renderer process code to simulate an unresponsive webpage...
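For instance, a minimal hedged sketch of triggering the hang from inside the guest page (this assumes nodeIntegration is enabled for the webview so that process is available; the element id is made up for illustration):
// Inside the page loaded by the <webview> (guest page):
document.getElementById('make-unresponsive').addEventListener('click', function () {
  // Electron's documented test helper: blocks the renderer's main thread indefinitely
  process.hang();
});
The host side can then watch for the unresponsive event on the corresponding webContents object, as described in the question.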
To make a webpage unresponsive you need an infinite rendering process. For example, by endlessly appending images to the page:
while (true) {
  // keep appending a (broken) image; the loop never yields, so the tab's
  // main thread stays busy and the page stops responding
  var elem = document.getElementById("test");
  var img = '<img src="test">';
  var data = elem.innerHTML;
  elem.innerHTML = data + img;
}
<div id="test"></div>
Please note that technically 'Not responding' is not a state or an event of the webpage; it is a message from the browser. Infinite rendering loops normally end in a 'webpage unresponsive' message. A non-rendering script such as while(true){} also ends in an error, though perhaps with a different message from the browser.
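For comparison, a minimal non-rendering busy loop looks like this (nothing here is specific to any particular page):
// Pure CPU loop: never touches the DOM and never yields to the event loop,
// so the tab's main thread stays blocked until the browser offers to kill the page.
while (true) {
  // intentionally empty
}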

HtmlUnit Load Facebook Photos

So, I have a project where I need to get the photos from a profile.
I am able to navigate to the photos page of a profile, but I believe the JavaScript is not loading.
I am currently using HtmlUnit but if you know of another Java API that would help I'm all ears.
Basically, when I view Facebook in a normal browser, it will load all of the pages and I can inspect the elements.
When inspecting, there is a div called fbStarGrid and a few other modifiers. This div contains all the images for a user's profile.
When I use HTMLUnit, I cannot find the div. I had it print the full page XML to a file, and I found that the div is commented out. I believe this means the Javascript never ran to load the content.
After browsing a lot of javascript help on SO, I have found a few things that help with debugging but can't seem to fix the problem.
The first thing I did was create an instance of a JavaScriptJobManager. I used it to see how much JavaScript had not completed. After waiting for a while (10+ seconds) it says there are still 3 JS jobs incomplete. After a very long time (about 60 seconds), it says there are 2 JS jobs incomplete.
I do not know what is hanging with those JS jobs.
I get a warning upon page load about application/ld+json not running but I do not believe that part of the website is related to the photos.
Is there something I can do to force the JS to run? Is there a job it's stuck on and won't proceed to the next job?
I've also wondered if it's an issue with the page not re-syncing.
I've tried two solutions related to this:
Setting the AjaxController to NicelyResynchronizingAjaxController()
webClient.setAjaxController(new NicelyResynchronizingAjaxController());
And someone suggested creating a custom controller that forces syncing.
webClient.setAjaxController(new AjaxController() {
    @Override
    public boolean processSynchron(HtmlPage page, WebRequest request, boolean async) {
        // force every AJAX call to be processed synchronously so HtmlUnit waits for it
        return true;
    }
});
Neither of these seemed to affect the page.
If HTMLUnit is not the right library for the job, any other ideas? I need this to be headless/guiless to run on a linux server. Java is preferred, but I can switch languages if necessary.

Manipulate a website which has been called from another website with Greasemonkey [duplicate]

This question already has answers here: Fire Greasemonkey script on AJAX request (2 answers). Closed 7 years ago.
I have the following situation:
A website gets data from another HTML file ("news.html"), which gets fetched every 10 s or so
I want to manipulate the data from news.html
I thought I could set up a Greasemonkey script which manipulates news.html and thus also the main website.
However, this assumption was wrong: when I open news.html in my browser, the news is manipulated (in terms of data, just to clarify), but when I visit the main website the news doesn't get manipulated.
I think that Greasemonkey does not run when the page is not opened "directly" in the browser but is instead fetched with Ajax/jQuery/...
Is there any known workaround for this?
Thanks in advance!
You can't change files that are on a server with Greasemonkey, unless some API for some reason leaves that exposed. Whatever you are changing is just local to you.
For simple pages, it's safest to wrap your Greasemonkey scripts in at least a load handler. According to the "Authoring" page at http://greasemonkey.mozdev.org/authoring.html:
User scripts are executed after the DOM is fully loaded, but before onload occurs. This means that your scripts can begin immediately and don't need to wait for onload. However, replacing large parts of the DOM (e.g. using innerHTML or outerHTML) at this early stage of rendering is known to cause Firefox some trouble. In this case, you'll have more success if you call your code in response to the load event instead:
window.addEventListener("load", function(e) {
  document.body.innerHTML = "Hello, world!"; // replace the body only after the page has loaded
}, false);
However, if the "main site" is constructing itself via a secondary ajax call to news.html, that won't be enough, because the data you want to manipulate won't be in the DOM yet when your script runs on the main site. You'll need to delay your script's execution until after the main site has finished doing its thing, so that when you try to do your thing there'll be the thing there for you to do your thing to. So to speak.
Have your script observe the DOM and wait to run until after news.html has been injected into the main site, or be lazy and start it after a sufficiently-long setTimeout.
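A rough sketch of the "observe the DOM" option, assuming the injected news content ends up in an element with id "news" (the selector and the rewriteNews helper are made-up placeholders to adjust to the real page):
var observer = new MutationObserver(function () {
  var news = document.getElementById('news');
  if (news && news.children.length > 0) {
    observer.disconnect();   // content has arrived; stop watching
    rewriteNews(news);       // do the actual manipulation
  }
});
observer.observe(document.body, { childList: true, subtree: true });

function rewriteNews(container) {
  // placeholder manipulation: adjust each injected news item
  var items = container.querySelectorAll('.news-item');
  for (var i = 0; i < items.length; i++) {
    items[i].textContent = items[i].textContent.toUpperCase();
  }
}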
(A clarification based on discussion in the comments: Greasemonkey will only act on the site that was actually loaded in the browser; it will not act directly on every XHR request that site makes, even if that URL was @include'd in the script. So if site "foo.com" ajax-injects content from "bar.com/news.html", and the browser loaded "foo.com", Greasemonkey will not directly modify the "bar.com/news.html" request foo.com made; it can only work with the DOM that foo.com constructs based on what it got from news.html.)

Chrome just doesn't finish loading JS files

I am writing an app that includes about 12 short JS files in the <head> section (I know, I should merge them and move them just before the end of <body>; I'll get to that in production).
The trouble is that when I try to load the app in Chrome, some files load immediately while some never finish loading at all! Chrome keeps trying to load the 12 JS files and never renders the page until I hit "Stop".
When I hit Stop, the HTML is rendered and the JS files fail (screenshot omitted).
Note that different JS files fail on each attempt! It's not the same file that gets stuck every time!
Inspecting the headers of the failed files shows "Caution: request is not finished yet". The files are stuck in "Receiving" sometimes for many minutes!
Now here's the fun part: after hitting Stop, if I focus on the omnibar and press Enter, all the JS files load instantly and the application works fine!
On the server side, I am using Apache, PHP and MySQL. Have I misconfigured something in Apache?
STATUS after 2 gruelling days: zilch, nothing, nada, this is driving me nuts. I have tried running the code from different machines, have tried changing Apache cache settings and changed myriad things in JavaScript, but nothing has worked. The worst thing is that no one can pinpoint where the problem is!
One possibility is that you have Keep-Alive enabled and it is not working properly. I've seen this before. The browser establishes its maximum number of connections to your server (typically 6) to download the first few files (CSS, JS, etc.), then those connections are not released until they time out. My symptoms were not quite the same as yours - the timeout was 20 seconds and everything would load in batches of 6 after that - but this could still be the cause.
In your Apache configuration (httpd.conf on most systems), look for the KeepAlive line (or add it if it's missing) and set it to Off.
More than an answer, here's how I would troubleshoot the problem.
One of the things I'd try is commenting out the script tags one at a time and reloading, to see where the threshold is. Also, because this is probably a server configuration problem, I'd restart the server after each try to have a clean slate, so to speak, i.e., no state preserved between tries.
I'd try to get more hints by making the requests for the various JavaScript files from a script in your favourite language. Ideally, I'd try GETting the scripts one by one (say with curl), waiting a few milliseconds between requests. I imagine I'd hit a threshold here as well. Like, getting one script per second works, but when requests get too close, the server gets stuck.
If still no clue, I'd use tcpdump to watch the traffic between the browser and the server. Okay, this may be a little too low level!
But perhaps I'd use netstat to see how many connections the browser opens in parallel to the server to fetch the resources, and see if we hit a concurrency limit.
I'm sorry this isn't a solution, but I hope you get some ideas, and I'd be very curious to know what your problem is, in the end!
We got exactly the same message "Caution: request is not finished yet" after a request in the browser.
The request itself was on the order of 15 MB of JSON, which was fed into an Angular 1 application. This request took about 2 seconds. Right after the request finished, the Angular digest cycle blocked the browser for more than 12 seconds (this is visible by profiling during the request). During that time Chrome showed the "Caution: request is not finished yet" message for the request while it actually was already finished.
Check the Content-Length header. The server may send an incorrect value, greater than the actual content length.
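One rough way to check that from the browser console (the script URL is a placeholder; note the header may be absent entirely when the response uses chunked transfer encoding):
// Compare the declared Content-Length with the number of bytes actually received.
fetch('/scripts/example.js').then(async function (response) {
  var declared = response.headers.get('content-length');
  var body = await response.arrayBuffer();
  console.log('Content-Length header:', declared, '| bytes received:', body.byteLength);
});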
Try to reproduce the problem with https://www.stevesouders.com/cuzillion/; it should tell you which side the problem is on: Chrome or the server.
There are several good answers, but if you want a foolproof method of doing this, you can use PHP to send all the scripts at once. If the problem is truly too many connections to the server, you could add some HTML like this:
<script type="text/javascript" src="scripts/scripts.php"></script>
And then in scripts.php you use the include() function to include all of the JavaScript files like so:
<?php
header('Content-Type: application/javascript'); // serve the combined output as JavaScript
include('jquery-1.9.1.min.js');
// rest of scripts
?>
If nothing else works and the problem is excessive connections, this should work.

How can I simulate a non-responding server?

I have a web service which customers use by inserting an external JavaScript file (hosted on my servers). Recently, due to a server outage, the external JavaScript became unavailable and my customers' websites slowed to a crawl, as the browser didn't load the rest of the page until it had loaded the JS (it goes into the header of the websites).
I am trying to work out methods so that customers' websites don't slow down even if my server goes down, and for that I wanted to simulate a condition where my server isn't responding. Note that if I specify a wrong URL, the browser won't load the JS, but if the URL is right and the server isn't responding, the browser will stall loading the rest of the page. I want to simulate the latter case. Any ideas how I can go about it?
PS: On server side, I am using the LAMP stack.
Create a script that sleeps for a configurable amount of time.
Something like:
<?php
// sleep for the requested number of seconds before returning any JavaScript
$how_long = (int) $_GET['seconds'];
sleep($how_long);
echo "alert('Finished sleeping!');";
?>
Then you just access this script instead, for example by putting this in your HTML code:
<script src="http://example.com/hang_for.php?seconds=3600"></script>. That would sleep for an hour. There are other timeouts configured in php.ini that will trigger first, but that's exactly what you want to test, no?
If the "P" in your LAMP is PHP, you could use the sleep function (documented at http://php.net/manual/en/function.sleep.php). Then, have your test page load your PHP script as the source of your JavaScript to see what happens.
Did you try looping back the server into itself (or any other HTTP server w/o the webservice on)?
Unplugging is pretty drastic, the off-button should do.
Unplug the server. Having no power makes a server unresponsive...
