I have a page with sections that make API calls, and I don't want the page to wait for those calls before it loads. I would like the page to load first and then have the other sections appear as they finish.
Right now this is what I have, but it's not getting the results I want. Even with this, the page still waits for a lot of processing instead of loading first, even though it seems like it should.
<script>
$(window).load(function(){
  $(".my-div").append('<%= my_function %>');
});
</script>
Assets
Don't use ERB in your asset pipeline.
Including that code in your layout or a view is fine, but if you include it in app/assets/javascripts/any_file.js, it won't work.
ERB can only be processed by JS in the views folders (mainly because of the asset pipeline's precompile process).
Appending
As pointed out in the comments, you'd be better off waiting until the document has loaded, like this:
$(document).ready(function() {
//your code
});
However, if you're using Turbolinks, you'd be better off using something like this:
var load = function() {
//append code here
};
$(document).ready(load);
$(document).on('page:load', load);
Functionality
Curiously, you've omitted one of the most important aspects of your code -- how you're retrieving the data. If you can reply with your asynchronous function, it will be a great help!
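In the meantime, if the data comes from a server endpoint, the usual pattern is to let the page render first and then fetch each slow section over AJAX. Here is a minimal sketch with the fetch function injected so the wiring is visible; `loadSection` and the `/sections/stats` URL are illustrative names, not anything from your app (in the view you would pass `$.get` and a jQuery append):

```javascript
// loadSection defers a section: the page renders first, then the section
// is appended whenever the request finishes.
function loadSection(fetchHtml, url, append) {
  fetchHtml(url, function (html) {
    append(html); // runs only after the response arrives
  });
}

// In the view, after DOM ready, something like:
//   loadSection($.get, '/sections/stats', function (h) { $('.my-div').append(h); });
```

The page never blocks on the request; each section simply pops in when its response comes back.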
Related
I'm using Laravel to implement templates based on a main page. Different pages have different JS scripts, so I created a template to import JS scripts:
<!-- jQuery 2.1.3 -->
<script src="{{ URL::asset('plugins/jQuery/jQuery-2.1.4.min.js') }}"></script>
<!-- Bootstrap 3.3.2 JS -->
<script src="{{ URL::asset('js/bootstrap.min.js') }}" type="text/javascript"></script>
<!-- Ajax Page Loading -->
<script>
function ajax(url) {
  $('.main-content').fadeOut(100); // hide the page
  $('.spinner').show();            // show a spinner
  $.ajax(url, {
    async: true,
    success: function(data) {
      $('#header').html(data[0]);                     // append received header to header
      $('#content').hide().html(data[1]).fadeIn(500); // show the page again
      $('body').append(data[2]);                      // append scripts to body
      $('.spinner').hide();
    }
  });
}
</script>
@yield('extra-scripts') <--- /* HERE is where the scripts will be */
I'm also using AJAX to load only the content without refreshing the page.
The ajax function will be used to load any URL into the div "content". However, I also need to load the scripts so the page works properly.
Data is an array with three fields:
0 is Header html
1 is Content html
2 are the dynamically added scripts
The problem is whenever I'm loading the page, I get this error:
Synchronous XMLHttpRequest on the main thread is deprecated because of
its detrimental effects to the end user's experience. For more help,
check https://xhr.spec.whatwg.org/.
I don't want script loading to affect the user experience, so it has to be async.
What I already tried:
jQuery.getScript
This solution requires me to move all the scripts to a separate JS file. This would probably solve it, but I would rather keep them all in the respective page.
AjaxPreFilter
$.ajaxPrefilter with options.async = true makes the scripts load after the page, which leaves some properties undefined and not working.
This warning will continue to happen as long as you inject the script into the body of the document.
I'd recommend using $.getScript, as it loads the script correctly. I don't really understand why you'd want all the JavaScript in the same page in the first place.
You probably want it in a separate file anyway, for easier maintenance down the road and separation of concerns.
You can also use vanilla JavaScript for that:
var head = document.getElementsByTagName('head')[0];
var script = document.createElement('script');
script.setAttribute('src', 'your_script.js');
head.appendChild(script); // the browser fetches and executes it asynchronously
If you insist on doing it this way, try injecting the script directly into the head of the document rather than the body and see if that helps.
On a more general note, it seems like you're building a single-page application (SPA). Have you looked into JS frameworks like Angular, Backbone, etc.? They will handle all the heavy lifting for you and help you scale your application better. This smells a little like reinventing the wheel, which can be a great educational exercise but might not be such a good idea in the long run.
Hope this helps.
What you are doing right now is not best practice. If you want to load pages via Ajax and dynamically pull in the JS files, I would recommend using pjax.
Take a look at here: https://github.com/defunkt/jquery-pjax
Since you are using Laravel, you can easily implement this with pjax.
Here is a tutorial: https://laracasts.com/lessons/faster-page-loads-with-pjax
In an ASP.NET MasterPage I am using YepNope to unconditionally and asynchronously load jQuery (from the Google CDN, with a local fallback) along with some scripts used on every page of the site. In the MasterPage, below the YepNope script that loads those site-wide files, I have created a ContentPlaceHolder before the closing body tag, which holds the scripts used on individual pages. Since jQuery should be available on every page of the site, it should not be loaded again on individual pages whose scripts use it.
The problem is that I can't use the callback or complete functions of the YepNope script that loads jQuery, because that script lives in the MasterPage while these are individual page scripts used only on one page. Yet I need to delay the execution of the individual page scripts until YepNope (which appears above them) has finished loading the dependencies (such as jQuery) they use.
I can think of two options-
1- Make the script used on the page an external file and load that using the syntax -
yepnope('/url/to/your/script.js');
or
yepnope({ load: '/url/to/your/script.js' });
I'm not sure I like this idea, as it introduces an extra HTTP request for a few lines of JavaScript that won't be used on any other page.
2- Load jQuery again in another yepnope test object block, with the complete function wrapping the page scripts (calling complete without a test seems to execute the function immediately, before the previous scripts have loaded), relying on the following-
I am requesting a file twice and it's only loading once? By popular demand, in yepnope 1.5+ we added the feature that scripts that have already been requested not be re-executed when they are requested a second time. This can be helpful when you are dealing with less complex serverside templating system and all you really care about is that all of your dependencies are available.
In the page I could presumably load the same version of jQuery from the Google CDN, which based on the above would not actually be loaded twice, and then load the page scripts in an anonymous function called from the complete function of the yepnope test object.
On the plus side this would mean that the page is no longer dependent on jQuery being loaded from the MasterPage. A negative would be that (even assuming YepNope does not load the script twice) we could end up loading multiple versions of jQuery should the version in the MasterPage be changed without the same change being made on the page. From a maintenance point of view I don't feel this is a good idea, especially on the assumption (which I feel you should always make) that another developer would be the one making the changes.
It also does not seem especially elegant.
On balance I will almost certainly use the first option but I would like to know if there is a way to delay or defer scripts on a page until asynchronous loading is completed, and this cannot be done as part of the YepNope test object loading the resources.
How do other developers approach this problem?
I have come up with this as a solution I rather like.
In the MasterPage YepNope test object add the code-
complete: function() {
  // the typeof check alone covers both null and undefined
  if (typeof window.pageFunctions === "function") {
    window.pageFunctions();
  }
}
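For context, here is a sketch of the full YepNope block that complete callback lives in. The URLs are placeholders, not the actual paths from the MasterPage, and the configuration object is built first so the callback can be exercised on its own:

```javascript
// complete fires once every script listed in load has finished,
// so pageFunctions only runs when jQuery is already available.
var masterPageScripts = {
  load: [
    '//ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js', // placeholder version
    '/Scripts/site.js'                                             // placeholder path
  ],
  complete: function () {
    // the typeof check covers both null and undefined
    if (typeof window.pageFunctions === "function") {
      window.pageFunctions();
    }
  }
};
// In the MasterPage: yepnope(masterPageScripts);
```

Pages that define no pageFunctions are unaffected, since the callback is a no-op when the function is absent.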
If I want to add any JavaScript code or functions that rely on the dependencies loaded in the MasterPage, I just wrap them in a function named "pageFunctions", like so-
<script type="text/javascript">
function pageFunctions() {
$(document).ready(function () {
...
});
}
</script>
I'm still interested in other (possibly better) solutions so I'm going to leave the question open for a couple of days.
I'd also appreciate comments on this as a solution.
I have included all the needed JS files (my scripts, libraries such as Twitter Bootstrap, etc.) in my app.
The problem is that when a request is made via AJAX, the JS files included in my app are not available in the loaded page, so I have to include them in the called page separately.
Example: my_scripts.js contains lots of JS functions.
link to page called through AJAX
<a href="/articles/create_new" data-remote="true">Create New Article</a>
/views/articles/_create_new.html.haml
...some content of this file... #the functions from "my_scripts.js" don't work here
But when I put this line into /views/articles/_create_new.html.haml:
= javascript_include_tag "my_scripts"
...some content of this file...
then the JS functions in /views/articles/_create_new.html.haml work.
I would like to ask if there is any way to automatically make all my JS files available to every AJAX-loaded page, because always including the JS files in each AJAX page is not a good way...
Thanks
Use a script loader like RequireJS or $script.
Have your pages reply with 2 things: the content and the scripts to load. This is best done using JSON, like:
{
"content" : "content here",
"scripts" : ["an","array","of","script","urls"]
}
Then, when the data is returned, parse and paint the content, and after that use the script loader to load the scripts. Actually, you can make your own script loader. It's just a matter of dynamically creating a <script> tag, putting it in the <head>, and giving it an src.
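A hand-rolled loader is only a few lines. Here is a sketch with the document passed in as a parameter so it is easy to exercise outside a browser; `loadScript` is an illustrative name, and in a page you would simply call it with the real document:

```javascript
// Dynamically create a <script> tag, set its src, and append it to <head>;
// the browser then fetches and executes the file asynchronously.
function loadScript(doc, url, onLoad) {
  var script = doc.createElement('script');
  script.src = url;
  if (onLoad) script.onload = onLoad; // fires after the script has executed
  doc.getElementsByTagName('head')[0].appendChild(script);
  return script;
}

// In a page: loadScript(document, 'js/one.js', function () { /* ready */ });
```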
I would achieve this in one of three ways:
jQuery
From http://api.jquery.com/load/:
Script Execution: When calling .load() using a URL without a suffixed selector expression, the content is passed to .html() prior to scripts being removed. This executes the script blocks before they are discarded. If .load() is called with a selector expression appended to the URL, however, the scripts are stripped out prior to the DOM being updated, and thus are not executed. An example of both cases can be seen below.
Here, any JavaScript loaded into #a as a part of the document will successfully execute:
$('#a').load('article.html');
However, in the following case, script blocks in the document being loaded into #b are stripped out and not executed:
$('#b').load('article.html #target');
Basically, you can add the JS references to the HTML returned by the Ajax request and jQuery will execute them.
RequireJS or simular
Rather than return straight HTML, return the HTML as part of a JSON bundle that also contains an array of script references:
{
html: '<p>stuff</p>',
scriptRefs: [ 'js/one.js', 'js/two.js' ]
}
I would then iterate through the scriptRefs array with something like RequireJS.
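The handling code for such a bundle is small. Here is a sketch with the loader injected, so it could be RequireJS's require, $.getScript, or a hand-rolled loader; `applyBundle` and its parameter names are illustrative, not an existing API:

```javascript
// Paint the HTML first, then load each referenced script in order.
function applyBundle(bundle, setHtml, loadOne) {
  setHtml(bundle.html);                       // e.g. function (h) { $('#content').html(h); }
  (bundle.scriptRefs || []).forEach(loadOne); // e.g. $.getScript or require
}
```

Guarding with `|| []` lets responses that carry no scripts omit the scriptRefs field entirely.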
Just add the code to base page
In all honesty, I'm more likely to just do this.
As is evident, when the document is loaded in the client browser,
$(function(){
  //some code here
});
takes over.
Say I have two JavaScript files, main.js and style.js.
main.js is for the functionality and style.js is for some hypothetical styling when the page loads. I want both files, so I include them in my index.html, first style.js and then main.js. Both of them start with:
$(function(){
some code here
});
My question is: what is the order of execution of document.ready? Do main.js and style.js start doing things in parallel, or is it sequential, with main.js taking over once style.js has finished?
It is sequential. There is no parallel processing in JavaScript. They will be called in the order you included your scripts on the page.
This is a good answer too: Can you have multiple $(document).ready(function(){ ... }); sections?
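jQuery keeps registered ready handlers in a single FIFO queue and runs them one after another once the DOM is ready. A tiny model of that behaviour, where `onReady` and `fireReady` are stand-ins for $(document).ready and the ready event (not jQuery's actual internals):

```javascript
// A FIFO queue of handlers, run in registration order.
var readyQueue = [];
function onReady(fn) { readyQueue.push(fn); }       // stands in for $(document).ready(fn)
function fireReady() { readyQueue.forEach(function (fn) { fn(); }); }

var order = [];
onReady(function () { order.push('style.js'); });   // included first
onReady(function () { order.push('main.js'); });    // included second
fireReady();
// order is now ['style.js', 'main.js'], matching the include order
```

So style.js's handler runs to completion before main.js's handler starts.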
Well, you can have multiple document.ready handlers, but that affects the readability of the code. More has been explained here.
JavaScript won't execute code in parallel by default; to run code in the background you need to create Web Workers. Currently your code works on a first come, first served basis.
I have read several articles saying it is not recommended to put JavaScript code within the page.
I would like to hear from you: how do you handle code that is specific to one page?
Sample
In my Project.cshtml I have:
<script type="text/javascript">
$(document).ready(function () {
$('.project-status').parent('a').click(function (e) {
e.preventDefault();
DoSomething();
});
});
</script>
Have in my myfunctions.js file:
function DoSomething() {
  alert('test');
}
On every page of my project, this situation repeats itself.
Question
Do you make a single .js file and put all the JavaScript for all pages there?
Or make one file per page, and reference that file from the page?
Or put the JS code that is specific to the page in the page itself?
The problem we want to solve
I have this question because I am facing the following problem:
In my application I have:
Project.cshtml
When you click a link, the page ProjectPhotos.cshtml is loaded via AJAX into a div#photos.
The problem is that my page ProjectPhotos.cshtml has this script:
<script type="text/javascript">
$(document).ready(function () {
$('.project-status').parent('a').click(function (e) {
e.preventDefault();
DoSomething();
});
});
</script>
As this page is loaded via AJAX, this script will not be in the HTML markup.
If this script were in a separate JS file, I could load that file dynamically when the link is clicked.
Thank you all for your help!
It depends. If the scripts are fairly small, concatenating the files into one is better because you reduce the number of connections (the browser will usually use just a few simultaneous connections).
But if the scripts are big and not needed on all pages, it's probably better to split them up. Even then, preferably only one file per page.
Try both options, disable/empty the cache in your browser, and test...
Probably the best way is to put everything in one file, for two reasons. First, maintenance is easier: there is only one script to deal with instead of searching through seven. Second, once the script is loaded it is cached, meaning it is ready for all your other pages as well. Assuming you don't have huge scripts for each individual page, the overhead is not really that much.