Simulate loading on localhost - javascript

I have a site which uses AJAX and preloaders. Now I would like to see the impact of these preloaders before deploying the site online.
The "problem" is that localhost has no loading time and responses are immediate, so I can't see my preloaders.
How can I simulate loading or limited bandwidth (with Firefox, Rails or whatever else)?

If you're on Windows, download Fiddler and set it to act like you are on a modem:
Tools-->Performance-->Simulate Modem Speeds
[edit]
Since you said you are now on a Mac, you can use Charles, which has built-in throttling.
[/edit]

I don't have a Rails app in front of me right now, but why not just add a delay to the appropriate controller?
e.g.
def index
  # ...
  sleep 2 # sleeps for 2 seconds
  # ...
end
Alternatively, use a debugger and place a breakpoint in the controller code. This should mean that your preloader will show until execution is continued.
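If you would rather not touch the server at all, you can also fake the delay on the client by wrapping the AJAX success handler; a minimal sketch in plain JavaScript, where the 2-second delay and the renderResults handler are placeholders:
// wrap the real success handler so the preloader stays visible a bit longer
function withFakeDelay(handler, delayMs) {
  return function () {
    var args = arguments, self = this;
    setTimeout(function () { handler.apply(self, args); }, delayMs);
  };
}
// usage (renderResults is a hypothetical handler): xhr.onload = withFakeDelay(renderResults, 2000);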

One option would be to deploy the site briefly to the host you will be using for production under an alternate URL for performance testing.
However, the way it performs for you won't necessarily be the same for everyone else in other locations.
If you provide some more detail on what these "preloaders" are, how they work and what you mean by "see the impact", we might be able to give better answers. Do you mean you want to eyeball the AJAX spinner GIFs and get a feel for how it will look to the end user as the loading takes place? Or do you mean you want to do some kind of formal benchmarking on them?

You can use the Firebug plugin for Firefox to inspect the network behavior of your page. This works fine for localhost; you should see all images being retrieved simultaneously when the preload code runs.

You could configure your router so that it forwards requests on a certain port to the computer you're running the website on. Then, when you open your.ip.add.ress:the_port in your browser, the bottleneck will be your upload speed, which is generally quite low.
But that's just how I would do it ;)

Related

CORS header not set - can I request an image url and then serve it back to myself?

I am attempting to populate a WebGL Earth with meshes that are compiled from images. These images are cross-domain, and hosted on a server where setting the appropriate headers isn't an option. Can I XMLHttpRequest the image urls, and then serve them back to myself via PHP to bypass CORS errors?
Or, more specifically, can I use my own webserver as a proxy to serve img urls back to myself (to get around CORS) in a WebGL context?
EDIT: The real question here is if I can use my own webserver as a proxy to pass the urls, or if I'll have to actually download each image to the server to then use it.
I had a similar issue once when using an API. First I tried to do everything in JS, probably getting the same error message as you do.
My solution was to switch to PHP and do it server-side, since modern browsers block what you want to do.
So yes, it is possible.
Get the pictures on the backend and then provide them to the frontend.
Simply retrieve the pictures first and then send them as output to the browser. You can do that synchronously by doing something like:
$ch = curl_init($imageUrl); // $imageUrl is the remote picture's address
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$pic = curl_exec($ch); // get the picture
curl_close($ch);
// and then echo it with the right content type
header('Content-Type: image/jpeg');
echo $pic;
I have done this once but don't remember it exactly. Or you can do it asynchronously, which is what usually happens with img tags. I'm not sure how it works with WebGL, but it should be similar:
Download the pic to your filesystem
then provide the URL to the browser.
Whether you want to go in this direction then depends on how big the images are, how long you need them, and the API.
Answer to first comment:
Tricky. I don't have experience with WebGL Earth, so I don't know whether you can load data into it asynchronously via Ajax (look here) or with AngularJS (look here). You would need to try that out. I'd especially look into the loading times.
Suppose there is an API call like http://example.com/api/get_image/65446 which downloads the picture, resizes it and then sends it to the browser.
What you would do in this case is:
Send the 'normal' page to the user
Then look there for the events on which you want to show pictures
When the event happens, use the API call I just mentioned and add the picture to your page in the success handler (see the sketch below). Again, how that works with WebGL Earth is another question I can't answer.
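A minimal sketch of that success-handler step, assuming the hypothetical /api/get_image/<id> proxy endpoint from above returns the raw image bytes (how you then feed the image into WebGL Earth depends on its API):
function loadProxiedImage(id, onReady) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', 'http://example.com/api/get_image/' + id, true);
  xhr.responseType = 'blob'; // the proxy streams the picture back to us
  xhr.onload = function () {
    if (xhr.status === 200) {
      // hand back an object URL that an <img> or a texture loader can use
      onReady(URL.createObjectURL(xhr.response));
    }
  };
  xhr.send();
}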
And if you want to use this on mobile devices you need to think about the picture size. Since the screens are relatively small, you should shrink the pictures first. But then, how long does it take to get a picture? I guess this is the biggest challenge: somebody scrolling the globe wants to see the pictures immediately, not after 5 seconds (by which time they have probably scrolled further).
Think about whether you can do the download and resize ahead of time. If you only want to show a fixed set of pictures, say 10,000 in total, I would do that; then you don't need to worry as much about loading times or about when to delete which pictures. You should open another question for that topic, and first try out whether Ajax is possible.

How to make javascript code stop running offline?

I'm making an online model viewer with JavaScript. For security reasons, I don't want users to be able to run it anywhere outside my site.
I have already obfuscated the code and locked it to my domain, but if a user downloads the whole website they can still run it offline.
So, how can I detect that the user is running it offline and stop the JavaScript from working?
Thanks.
var checkAfter = 15; // 15 seconds
setTimeout(CheckNavigatorState, checkAfter * 1000);

function CheckNavigatorState() {
  if (navigator.onLine) {
    // --- add javascript src OR call js functions here
  } else {
    // --- remove javascript source OR stop functions
    setTimeout(CheckNavigatorState, checkAfter * 1000);
  }
}
I would like to mention that it is not easy to tell whether the browser is offline or not.
Some browser vendors say the browser is offline when the computer loses its connection to the network, which is not really accurate: you can have access to the LAN but not to the internet. So to do what you want, you may need to ping a real remote server, such as Google.
EDIT: To ping with JavaScript, check this fiddle (edited)
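The fiddle isn't reproduced here, but a minimal sketch of the idea is to load a tiny resource from a remote host and treat a failure or timeout as "offline" (the Google favicon and the 5-second timeout are just placeholders):
function pingRemote(onResult) {
  var done = false;
  var img = new Image();
  var timer = setTimeout(function () { // assume offline if nothing happens within 5 s
    if (!done) { done = true; onResult(false); }
  }, 5000);
  img.onload = function () { if (!done) { done = true; clearTimeout(timer); onResult(true); } };
  img.onerror = function () { if (!done) { done = true; clearTimeout(timer); onResult(false); } };
  img.src = 'https://www.google.com/favicon.ico?' + Date.now(); // cache-buster forces a real request
}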
You could make the JavaScript code check the domain it is running on using the window.location property, but that will only keep people who don't know how to modify JavaScript code from running it locally. The better solution would be to make your code dependent on something on the server, so that whenever it tries to do something important it makes an Ajax call to the server, which can then check the referrer and return an error if the code isn't running from the server.
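A minimal sketch of the window.location check, with example.com standing in for your real domain:
// bail out unless the page is served from the expected host
if (window.location.hostname !== 'example.com') {
  throw new Error('This viewer must be loaded from example.com');
}
Remember that this only deters casual copying; anyone can edit the check out of the downloaded code, which is why the server-side check described above is the stronger option.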

Javascript timer puzzle

This is a weird scenario I just experienced and I am not sure how to phrase the question.
It may be best to describe my application and what it does 1st.
I have an IP camera connected to my router.
I use a C# VLC wrapper to get 10 frames a second using a RTSP protocol.
I then upload these separate JPEGs to my web server using a [web method].
Then, in the browser, a JavaScript timer set to 100 ms renders the image into an HTML image control by calling an .ashx page repeatedly.
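For reference, the polling pattern described above is roughly the following, where the element id and the .ashx URL are placeholders:
// refresh the camera <img> every 100 ms by re-requesting the handler page
setInterval(function () {
  // cache-buster so the browser fetches a fresh frame instead of reusing the cached one
  document.getElementById('camera').src = '/CurrentFrame.ashx?t=' + Date.now();
}, 100);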
Now this has worked for a few days OK.
Now this is what I have experienced in the last 48hrs.
The images coming from the IP camera were jumpy. That is to say, sometimes the images flow in a timely order and sometimes the stream slows down, stops and speeds up again to 'catch up'.
I noticed when viewing via a web browser client on another PC on my network that the javascript timer calls were slow and sometimes stopped for periods of time. I used Google Chrome to view how often the ashx url was being called.
I closed down my own applications. Rebooted all my PCs and started VLC application without using the wrapper. Again, the flow was 'jumpy'. So the conclusion there was that it was not my application.
For some reason I decided to log into my router (192.168.0.1).
Page was not found.
In fact I had to do a complete restart of my router to be able to access my router 'page'.
As soon as I did this everything worked OK again.
So, the two questions I have are: (1) why could I not access my router through that IP address, and (2) why was my JavaScript timer grinding to a standstill?
Like I said this is a weird scenario and I would not blame anyone for wanting to close or vote down this question.
But on the off-chance this is a known thing I would like to be educated.
Thanks

how to handle javascript loading error in web application which uses facebook javascript sdk

I am integrating a web application with facebook by following this tutorial
It normally works, but when AVG Do Not Track is active the browser can't load the Facebook JavaScript SDK, so in that case I want to tell the user that he needs to disable AVG Do Not Track for the current website.
Is there a way to handle the loading error in JavaScript? We have try/catch in Java; is there something similar in JavaScript so that I can handle the loading error?
Sorry if this is a simple question ... I am a noob when it comes to JavaScript :(
I attached a little function to give feedback at the end of the code supplied by the tutorial.
http://jsfiddle.net/PRvJs/1/
The problem is that because your script is running in a web app and the blocking is happening at the browser level, you can only ever infer that things have gone awry, and even then you have to make assumptions – scripts have load and error events, but most Facebook-blocking scripts circumvent that whole thing and just kill the HTTP request before it even goes out… so you can never really know!
As a fallback, I made the subjective decision to wait 30 seconds and, if there is still no load or error resolution for the injected script, make the arbitrary [1] decision that something is fundamentally screwed somewhere between your code and Facebook. As far as I know, this is the most you can determine and the only way to determine it…
[1] Most connections close if nothing has happened in 30 seconds.
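A minimal sketch of that fallback, assuming the standard asynchronous SDK snippet from the tutorial; the 30-second window, app id and warning helper are placeholders:
var sdkLoaded = false;

window.fbAsyncInit = function () {
  sdkLoaded = true; // the SDK calls this once it has loaded
  FB.init({ appId: 'YOUR_APP_ID', xfbml: true }); // placeholder init options
};

// if the SDK still hasn't announced itself after 30 s, assume something is blocking it
setTimeout(function () {
  if (!sdkLoaded) {
    showBlockerWarning(); // hypothetical helper that tells the user to disable the blocker
  }
}, 30000);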

Application Running With SSL not loading all scripts/CSS on first visit

Hopefully this isn't a tricky one. I've got a web app that doesn't load all JavaScript/CSS/images on the first visit. The second visit is fine.
After approximately 2 minutes of inactivity the problem reoccurs.
These problems only started occurring after the customer requested SSL be applied to the application.
Ajax requests stop working after 2 minutes of inactivity, despite a successful page load of all JavaScript elements.
Application timeout is 30 minutes - like I said, everything was fine before SSL was applied.
All JavaScript and CSS files use absolute URLs, e.g. https://blablabla
There appears to be no pattern as to why certain files aren't loaded. The Firebug Net output shows the status for the failed elements as 'Aborted'. For example, site.css and nav.css are in the same folder and are declared one after the other in the head tag, yet one is loaded and the other is not. Both will load fine after refreshing the page (unless roughly two minutes have passed).
An Ajax request also shows as aborted after two minutes. However, if i do the request again the Ajax request will succeed. Almost as if the first request woke something up.
None of these problems occur in Chrome
Any ideas? :)
FYI this is a .NET 4 C# MVC app running under IIS7, but I'm not sure it's relevant since it works in Chrome. Everything worked fine before SSL was applied.
Removed SSL across the board and secured action methods with [RequireHttps]. Then changed the scripts and CSS in the master files to point to absolute HTTP URLs. The JavaScript then worked, fixing the Ajax.
If anybody has any idea why the CSS/JavaScript broke over SSL, that would be cool. I'm guessing it's perhaps the workload? Since it worked the second time, I'm guessing half the CSS and scripts were cached, making for less of a workload over SSL?
Anyway, working now!
