Using IE11, the time taken to load the JavaScript and PNG files is very high. After disabling add-ons it was faster, but still not as fast as expected. The JavaScript is loaded at the beginning of the page, and this is delaying page loading. Does this have anything to do with the network?
I am having trouble loading images in Google Chrome.
I am building a website, which I have built in HTML.
When I run the HTML file directly in the browser, it loads all files right away.
After splitting the file into PHP files (to include the header and footer and to load content on click) and moving to XAMPP for testing, I realized that images are not loading in Google Chrome every time.
I have also tested the plain HTML files via XAMPP, and I get the same result: images are not loaded in Chrome every time.
If I keep refreshing the page, images load once; the next time, some are loaded and some are not.
It works fine in Microsoft Edge and Mozilla Firefox.
Any solutions for this?
I've got a page where I'm showing a video of around 7 MB. I implemented a canplaythrough callback on the video. This seemed to work fine until I checked it out on someone's slow internet connection. What I'm actually doing is loading one big video and skipping through it to show different small parts of it.
But sometimes it gets stuck halfway and starts partially loading that part again. How can I make sure the video is entirely preloaded and that the browser won't reload any parts?
You can't. How the video is buffered is not defined in the specifications and it's entirely up to the browser vendors how they implement the buffering mechanism.
The buffering takes into account things such as preload time and network speed, but it also tries to avoid downloading huge files that would take up unnecessary space on the user's disk.
That being said, some browsers currently have issues with their buffering mechanism, such as Chrome, which doesn't always work as expected. If you're using Chrome, try another browser to check how it behaves with your code and scenario.
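While you can't force full preloading, you can at least observe how much the browser has actually buffered through the video element's `buffered` property, which exposes a `TimeRanges` object. A minimal sketch (the helper name and tolerance are illustrative, not from the question's code):

```javascript
// Returns true if a single buffered range covers [0, duration] within a
// small tolerance. `buffered` is a TimeRanges-like object: it exposes a
// `length` plus start(i)/end(i) accessors for each buffered range.
function isFullyBuffered(buffered, duration, tolerance) {
  tolerance = tolerance || 0.5; // seconds of slack at each end (assumption)
  for (var i = 0; i < buffered.length; i++) {
    if (buffered.start(i) <= tolerance &&
        buffered.end(i) >= duration - tolerance) {
      return true;
    }
  }
  return false;
}

// In the browser you would poll this on the video's "progress" event:
// video.addEventListener('progress', function () {
//   if (isFullyBuffered(video.buffered, video.duration)) { /* safe to seek */ }
// });
```

Even when this reports fully buffered, the browser is still free to evict data later, so treat it as a hint rather than a guarantee.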
Is there a way to make a video download in the background (possibly on a different thread?) while I fetch my images and make web requests?
Situation:
I show a video, video plays fine.
But once I start fetching two images at the same time from that server, the page won't respond and waits for the video to finish loading; then it loads the images within a few seconds (about 100 kB per image).
If I try to open any other page served by the stalled page's server in the same browser, it won't load until the stalled page is done loading; however, any other site (for example, Google) loads just fine.
So is there a way to make the browser not download the full video, or at least give priority to the images?
This sounds like the server throttling your requests, as everything apart from scripts loads asynchronously in a browser.
It might be that you are only allowed so much bandwidth per second from the server, or so many connections, and that could be why the server won't respond until your first request has finished.
Check your server configuration and experiment with it to rule out this possibility.
Maybe you can try a Web Worker; it provides a way to execute background scripts (you will have to split your video and images into separate files). As you can see in the Use Cases, it lists "Prefetching and/or caching data for later use" as a possible scenario. Hope it helps.
More info here:
http://www.html5rocks.com/en/tutorials/workers/basics/#toc-examples
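As a rough illustration of that "prefetching data for later use" scenario, the worker below fetches a list of URLs so they land in the HTTP cache while the main page stays responsive. The file name and message format are assumptions, not part of the question:

```javascript
// prefetch-worker.js (hypothetical file name)
// Fetch each URL so the browser's HTTP cache is warmed for later use.
// fetchFn is injected so the logic can also be exercised outside a worker.
function prefetchAll(urls, fetchFn) {
  return Promise.all(urls.map(function (url) {
    return fetchFn(url).then(
      function (res) { return { url: url, ok: res.ok }; },
      function () { return { url: url, ok: false }; } // network error
    );
  }));
}

// Only wire up messaging when actually running inside a Web Worker.
if (typeof importScripts === 'function') {
  self.onmessage = function (e) {
    prefetchAll(e.data, fetch).then(function (results) {
      self.postMessage(results);
    });
  };
}
```

On the main page you would start it with `new Worker('prefetch-worker.js')` and `worker.postMessage(['video.mp4', 'img1.png'])`. Note that prefetching only helps if the server sends cache-friendly headers for those files.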
I am creating a complete AJAX application where there is one base page, and any pages the user navigates to within the application are loaded via AJAX into a content div on the page. On the base page I include the various scripts that are needed for every page within the application (jQuery, jQuery UI, other custom JavaScript files). Then on the various pages within the application I include a script or two per page that contains the logic needed for just that page. Each of those script files has something that executes on the page-ready event. The problem is that every time the user navigates to page1, the page1.js file is loaded. So if they visit that page 10 times, that script is loaded ten times into their browser. Looking at the Chrome script developer tools after navigating around the site, I see tons of duplicated scripts.
I read somewhere about checking whether the script has already been loaded using a boolean value, or by storing the loaded scripts in an array. But the problem with that is that if I see the script is already loaded and don't load it, the page-ready function doesn't get fired for the page's JavaScript file and everything fails.
Is there an issue with having the JavaScript file loaded over and over when the user visits the same page multiple times?
I did notice, looking at the network traffic, that every time we visit the page the script is requested with a random number parameter (/Scripts/Page1.js?_=298384892398), which forces a request for the script file every time. I set cache: true in the jQuery ajaxSetup method, and that removed the parameter from the request, so the cached version of the JavaScript file was loaded instead of a separate HTTP request being made for it. But the problem is that I don't want all AJAX requests to be cached, as content changes all the time. Is there a way to force just JavaScript files to be cached but allow all other AJAX requests not to be cached?
Even when I forced caching on all requests, the JavaScript file still showed up multiple times in the developer tools. Maybe that isn't a big deal, but it doesn't seem quite right.
Any advice on how to handle this situation?
About your first question:
Every time you load a JavaScript file, the entire content gets evaluated by the browser. Whether you can load and execute it multiple times in a row depends solely on its content. I'd not consider it a best practice to do so. ;)
Still, I'd recommend that you find a way to check whether it was already loaded and fire the "page loaded" event manually from the already-present code.
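A minimal sketch of that idea, assuming each page script registers its init function instead of running it directly in a ready handler (all names here are illustrative):

```javascript
// Registry mapping a script URL to the "page ready" init it registered.
var pageScripts = {};

// Each page-specific script calls this once when it is first evaluated,
// instead of running its page-ready logic immediately.
function registerPageScript(url, init) {
  pageScripts[url] = init;
}

// Called whenever a page is pulled in via AJAX. loadFn is injected for
// testability; in the browser it would append a <script> tag and resolve
// once the script's load event fires.
function runPageScript(url, loadFn) {
  if (pageScripts[url]) {
    pageScripts[url](); // already evaluated once: just re-run its init
    return Promise.resolve('cached');
  }
  return loadFn(url).then(function () {
    if (pageScripts[url]) pageScripts[url]();
    return 'loaded';
  });
}
```

This way the script file is fetched and evaluated once, but its page-ready logic still runs on every visit to the page.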
For the second question: I'd assume the script is intended to show up multiple times when you include it multiple times. To give advice on how to avoid caching the loaded JS, I'd need to know how you load the code, how you do AJAX, and the general jQuery setup.
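Since the question mentions jQuery's ajaxSetup, one possible setup is to disable caching globally but re-enable it just for script requests via `$.ajaxPrefilter`, so only script requests lose the `_=<timestamp>` cache-buster. A sketch, assuming jQuery handles all the AJAX calls:

```javascript
// Keep dynamic content requests uncached...
$.ajaxSetup({ cache: false });

// ...but let script requests use the browser cache, dropping the
// "_=<timestamp>" parameter that forces a fresh download each time.
$.ajaxPrefilter('script', function (options) {
  options.cache = true;
});
```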
After doing some more research, it looks like this is actually just a Chrome issue. When you load a script via AJAX, you can include the following in your code to get it to show up in the Chrome developer tools:
//# sourceURL=some-script-name
The problem is that when you navigate away from the page, the developer tools keep the script around, even though it is actually no longer referenced by the page.
I am a beginner web developer and here is my problem:
In short:
I keep getting a similar message in Firebug for all the JavaScript files I include in the page:
GET http://localhost.:33085/Scripts/jquery.form.js?_=1284615828481 200 OK 1.01s
In details:
I am loading a webpage using AJAX. This page contains references to some JavaScript files. It also contains some embedded JavaScript code. Firefox keeps reloading the referenced JavaScript files each time I navigate to these pages, which seems to take time. My questions are:
These scripts are already referenced in the parent page into which I load this page using AJAX. If I remove the references from the AJAX-loaded page, I start getting '$ is not defined'. Is there a way to avoid that error other than referencing these scripts in the AJAX-loaded page?
How can I stop Firefox from reloading those scripts and start using the cached version?
Why is it so slow in Firefox? I don't see such performance issues in IE or Chrome.
Thanks
The best approach is to ensure that the initial page you first access loads the required scripts, and that subsequent AJAX requests load only the content that you need (i.e. the references to the scripts are not in the HTML returned by the AJAX request). There are server-side frameworks to help you achieve this, but without knowing your server technology I can't recommend a specific solution.
Firefox may also be slow because of Firebug; with full debugging enabled, Firebug can slow your web pages down.