Closed. This question is opinion-based. It is not currently accepting answers.
Closed 5 years ago.
I am working on a site that calls various pages and forms through AJAX. To save page-loading time I'm trying to load only the .js files that each page or form needs, but during development this causes several issues and errors, such as events or elements having to be referenced through $(document). jQuery also now throws a deprecation warning for loading inline JS through AJAX.
I know I can call external scripts through jQuery's .getScript() function and resolve all the errors that way, but I'm wondering whether it wouldn't be a whole lot easier to include all the required script files in the main header (or footer).
Which approach is more efficient in terms of workflow vs. user experience: loading all the site's JS initially, or loading scripts dynamically as needed? (In this case, the total size of the extra JS files is approx. 50 KB.)
I recommend you load scripts dynamically when you need them, keeping each JS file with the page that uses it. Forget load(), which is deprecated; use the $.ajax() syntax instead.
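A minimal sketch of that dynamic approach, with a cache so each file is fetched and executed only once across AJAX page swaps. The page names and script paths below are made up for illustration.

```javascript
// Map each AJAX-loaded page or form to the scripts it needs
// (names and paths are hypothetical).
const pageScripts = {
  'contact-form': ['/js/validate.js'],
  'gallery': ['/js/lightbox.js', '/js/lazyload.js']
};

function scriptsFor(page) {
  return pageScripts[page] || [];
}

// Load a script once and reuse the same promise afterwards, so
// repeated AJAX navigations never re-fetch or re-execute the file.
const scriptCache = new Map();

function loadScript(src) {
  if (scriptCache.has(src)) return scriptCache.get(src);
  const promise = new Promise((resolve, reject) => {
    const el = document.createElement('script');
    el.src = src;
    el.onload = () => resolve(src);
    el.onerror = () => reject(new Error('Failed to load ' + src));
    document.head.appendChild(el);
  });
  scriptCache.set(src, promise);
  return promise;
}
```

A page swap would then run `Promise.all(scriptsFor('gallery').map(loadScript))` before wiring up its event handlers.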
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 1 year ago.
Lighthouse suggests that I remove unused JavaScript from a third-party library, say index.js. It is acceptable for this particular library to load ten seconds or so after the page has loaded, so I used setTimeout to delay it, but I don't think that is the right approach.
Once I do that, Lighthouse stops complaining about unused JavaScript from this library and the score improves, but I am worried that in production Google measures the overall performance of the page from the moment it loads to the moment the user leaves. In that sense the unused JavaScript is never removed, only delayed. People have also suggested that lazy-loading JS on user events would help, but in our case the JS needs to load automatically.
I am basically looking for suggestions on:
How to handle JS libraries that I can't get rid of and that contain a lot of unused code?
Is setTimeout a good solution for this case?
Is my understanding of how Google calculates performance in production correct, even though Lighthouse no longer reports the unused JS after the 10-second delay?
I would be more than happy to get the answers.
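For reference, one common variation on the setTimeout idea is to anchor the delay to the window load event rather than to an arbitrary point in the page lifecycle, which keeps the bundle out of the initially measured critical path. This is only a sketch, and the bundle path is hypothetical.

```javascript
// Pure helper, split out so the decision is testable: inject
// right away (after the extra delay) only once the document has
// fully loaded.
function pageAlreadyLoaded(readyState) {
  return readyState === 'complete';
}

// Inject a third-party bundle after the load event plus an extra
// delay, instead of a bare setTimeout that may fire before the page
// has even finished loading.
function injectDeferred(src, delayMs) {
  const inject = () => {
    const el = document.createElement('script');
    el.src = src;
    el.async = true;
    document.head.appendChild(el);
  };
  if (pageAlreadyLoaded(document.readyState)) {
    setTimeout(inject, delayMs);
  } else {
    window.addEventListener('load', () => setTimeout(inject, delayMs));
  }
}

// In the browser: injectDeferred('/vendor/index.js', 10000);
```

Note this only changes when the unused code executes, not how much of it is unused, which matches the concern about real-user measurement above.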
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 7 years ago.
I'm new to the ASP.NET MVC world and I want to know which is the best way to load JavaScript into a project:
using a script tag directly in the page.
using the BundleCollection class to add scripts.
using the jQuery.getScript method to load scripts dynamically.
I also want to know the difference between them.
It all depends upon the usage.
If you want to avoid waiting for all the JavaScript to load before your page fires the ready event, you can use jQuery.getScript.
Using the script tag directly will ensure that the script is available and executed before the browser parser proceeds to the next line.
Lastly, in ASP.NET, the BundleCollection can help you optimise your JS: it lets you bundle, minify and compress the content before serving it, which can result in quicker load times if you have many JS files on your page.
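As a rough sketch of the trade-off: a plain script tag and a bundle both load up front (the bundle collapsing many requests into one), while getScript fetches on demand. The file names below are hypothetical, and the getScript wrapper assumes jQuery is already on the page.

```javascript
// Bundling collapses N script requests into a single one; individual
// script tags cost one request per file.
function requestCount(files, bundled) {
  return bundled ? Math.min(files.length, 1) : files.length;
}

// On-demand loading with jQuery (assumes jQuery is present): the file
// is fetched, executed, and a jqXHR promise is returned.
function loadOnDemand(path) {
  return $.getScript(path);
}
```

In an MVC view, the bundle would typically be emitted with `@Scripts.Render("~/bundles/site")`, whereas `loadOnDemand` would run from client code after the page is interactive.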
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
I have a PHP script (example.php) that uses multiple cURL handles to load 20 pages at the same time, for example google.com and ebay.com. It takes about 5 seconds to load example.php, which is quite a lot. I also have a simple HTML file (index.html) with a short load time.

What I want is a script included in index.html that gets elements by id from the pages loaded in example.php. Why? I want a page with a fast load time (index.html) that can still get elements from sites like google.com, ebay.com and facebook.com, which are actually loaded by example.php in the background. example.php and index.html are on the same domain, so there should be no problem with that.
Accessing content from external websites can't be done easily with JavaScript because of the Same-Origin Policy. You can, however, display an entire page in one go using an iframe.
You can circumvent this with a variety of methods that use your own server as a proxy.
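Since example.php and index.html share an origin, one way to sketch this is for index.html to fetch example.php's output and pull the wanted element out of the response. `extractById` below is a deliberately simplified pure function (a real page should parse the HTML with `DOMParser` rather than a regex), and the element ids are hypothetical.

```javascript
// Simplified extractor: return the text between an element's opening
// tag (matched by id) and the next tag. Prefer DOMParser in real code.
function extractById(html, id) {
  const re = new RegExp('<[^>]*id=["\']' + id + '["\'][^>]*>([\\s\\S]*?)<', 'i');
  const match = re.exec(html);
  return match ? match[1].trim() : null;
}

// Browser-side: ask the same-origin example.php for its rendered HTML,
// then extract one element from the response.
function fetchRemoteElement(id) {
  return fetch('/example.php')
    .then((res) => res.text())
    .then((html) => extractById(html, id));
}
```

index.html stays fast because it only makes this request after its own load, while example.php does the slow multi-cURL work in the background.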
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
I have a set of users with different permissions. Depending on what permissions they have, they should only have access to certain JavaScript files. In terms of speed, is it better if, on every visit, I check that user's permissions, create one JavaScript file that contains ALL the JavaScript commands accessible to that user, and load that file into the view?
Or is it better to have multiple JavaScript files, named page#_permission# (for instance page1_permission10.js), and just load the corresponding files every time the page loads?
Thanks
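For concreteness, the second option's naming scheme can be sketched as a tiny helper the view would use to emit one script tag per page/permission pair (the /js/ prefix is a made-up convention extrapolated from the example name above).

```javascript
// Build the per-page, per-permission script path following the
// question's naming scheme, e.g. page 1 + permission 10.
function permissionScript(page, permission) {
  return '/js/page' + page + '_permission' + permission + '.js';
}

// The view would then render something like:
//   <script src="/js/page1_permission10.js"></script>
```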
It is probably faster to load only the JavaScript that is needed, BUT...
It probably will not be significant enough to warrant the effort. Furthermore, you may find yourself in debugging hell just to save a few ms.
Firefox and many other browsers have built-in tools that show how much time it takes to load a page. You can run a similar analysis on your site and locate the bottlenecks.
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 9 years ago.
Can the JavaScript on my page read the same page it is loaded in? Other parts of the page are dynamically loaded by another provider. I have tried many things, and Googled as well, but now I doubt that it is possible. Or is it?
Thank you!
If the page has loaded and the JavaScript you are running is client-side (which it should be), you should be able to access everything on the page via the document object. I would advise reading about the DOM to familiarise yourself with this.
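A minimal sketch of that, reading one element's text through the document object. The function takes the document as a parameter to keep the lookup logic plain, and the #widget id is hypothetical.

```javascript
// Read one element's text from a document-like object. In the browser
// you would call readWidgetText(document) after DOMContentLoaded, so
// that dynamically inserted parts of the page are present.
function readWidgetText(doc) {
  const widget = doc.querySelector('#widget');
  return widget ? widget.textContent.trim() : null;
}
```

If the other provider inserts its content after your script runs, you may need to wait (or use a MutationObserver) before the element appears in the DOM.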
Server side code (whether written in JavaScript or otherwise) is not capable of determining the final rendering of the page in the user's browser.
You could build the entire page yourself (and you could use a headless browser, like PhantomJS, to do it), but that could give different results from a visitor's, since you would have a different set of cookies, a different source IP address, and so on.