Here's the scenario. I am making a $.getScript() call to load a script from my JavaScript file. The script I'm downloading with $.getScript() in turn downloads other scripts that it depends on. In my script I'm using done() to check whether the script loaded completely, and if it did, I try calling a function that is defined not in the script I just loaded with $.getScript(), but in the script that it loads.
It's getting confusing, so let me demonstrate with some code:
//my script.js
$.getScript("http://myexternaljs.com/first.js").done(function(){
    doSecond(); //<- this resides in second.js, which is loaded by first.js below
});

//first.js
(function(){
    $.getScript("http://firstexternal.com/second.js");
}());

//second.js
function doSecond(){
    return console.log("hello world");
}
The problem is that second.js takes a little time to download, so I keep getting a "doSecond is not defined" error when I call doSecond() inside done().
I could use a timeout and check whether second.js has loaded, but is there a better way to do this?
I'm open to any AMD loaders or Promises answers as well.
You can also use the global ajaxComplete (or ajaxSuccess) handler:
$(document).ajaxComplete(function(ev, jqXhr, options) {
// You could be as specific as you need to here.
if ( options.url.match('second.js') ) {
doSecond();
}
});
Alternatively, you could do this inside ajaxComplete:
$(document).ajaxComplete(function(ev, jqXhr, options) {
    // You could simplify to
    //     window.doSecond && window.doSecond()
    // if you can trust that it will always be a function.
    // Checking via window avoids a ReferenceError when it isn't defined yet.
    if ( window.doSecond && $.isFunction(window.doSecond) ) {
        doSecond();
    }
});
The facts:
You have first.js and within this script is an include for second.js
You need to make a call to doSecond() which is defined in second.js
You need to ensure doSecond() is available before you call it
You can't directly change first.js or second.js, but you can have someone else change them
Possible solutions, ordered from best to worst
1) Request that second.js be removed from first.js. Call them separately so that you can nest them:
$.getScript("first.js").done(function(){
$.getScript("second.js").done(function(){
doSecond();
});
});
This is the best solution. There are alternatives that do basically the same thing in principle (e.g. other people's answers here). If first.js were including second.js synchronously, or otherwise forcing it to load before continuing (e.g. option #3 below), you wouldn't be running up against this problem to begin with. Therefore first.js must already be structured to deal with second.js being loaded *a*synchronously, so there shouldn't be an issue with them removing it from the file and you loading it yourself.
But you mentioned that the location of second.js is defined in first.js, so this somehow isn't feasible for you (why not? could they put the path/to/script in a variable for you to access?)
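If they could expose that location, even just as a variable (the name secondUrl below is hypothetical), the nesting still works:

// hypothetical: first.js sets this instead of loading second.js itself
// var secondUrl = "http://firstexternal.com/second.js";
$.getScript("http://myexternaljs.com/first.js").done(function(){
    $.getScript(secondUrl).done(function(){
        doSecond();
    });
});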
2) Request that second.js be wrapped in a .done or equivalent loaded callback that fires a callback function you can define.
// inside first.js
$.getScript("second.js").done(function(){
    if (typeof secondLoaded == 'function')
        secondLoaded();
});
// on-page or elsewhere, you define the callback
function secondLoaded() {
doSecond();
}
This is just a generic and easy "callback" example. There are a million ways to implement this principle, depending on what all is actually in these scripts and how much effort people are willing to make to restructure things.
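For instance, a rough sketch of one such variation, assuming the page keeps a (made-up) secondReadyCallbacks array that first.js flushes once second.js is in:

// on the page: queue work to run once second.js is available
window.secondReadyCallbacks = window.secondReadyCallbacks || [];
window.secondReadyCallbacks.push(function(){
    doSecond();
});

// inside first.js
$.getScript("second.js").done(function(){
    $.each(window.secondReadyCallbacks || [], function(i, fn){ fn(); });
    window.secondReadyCallbacks = [];
});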
3) Request that the second.js include be changed to use document.write
document.write(unescape("%3Cscript src='second.js' type='text/javascript'%3E%3C/script%3E"));
This forces the browser to resolve document.write before it can move on, so second.js should be loaded by the time you want to use doSecond(). But this is considered bad practice, because until document.write is resolved nothing else can happen. So if second.js takes forever to load, or eventually times out, that makes for bad UX. Avoid this option unless you have no other choice for "red tape" reasons.
4) Use setTimeout to try to wait for it to load.
function secondLoaded() {
    if (!secondLoaded.attempts) secondLoaded.attempts = 0;
    if (secondLoaded.attempts < 5) {
        if (typeof doSecond == 'function') {
            doSecond();
        } else {
            secondLoaded.attempts++;
            window.setTimeout(secondLoaded, 100);
        }
    }
}
secondLoaded();
I list this as worse than #3, but really it's kind of a tossup. In this situation you basically have to pick between a cutoff after which you just stop trying to execute doSecond() (in this example, I try 5 times at 100ms intervals), or code it to keep checking forever (remove the .attempts logic, or swap it for setInterval and clearInterval logic, as sketched below).
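A minimal sketch of that setInterval variant (it polls until doSecond shows up, with no cutoff):

var poll = window.setInterval(function(){
    if (typeof doSecond == 'function') {
        window.clearInterval(poll);
        doSecond();
    }
}, 100);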
You could modify how $.getScript works.
$.getScript = (function() {
    var originalGetScript = $.getScript;
    return function() {
        // trigger the event once the script has actually loaded
        return originalGetScript.apply($, arguments).done(function() {
            $(document).trigger('load_script');
        });
    };
})();
This will fire an event every time a script is loaded.
So you can wait for these events to fire and check if your method exists.
$(document).one('second_loaded', function() {
    doSecond();
}).on('load_script', function() {
    window.doSecond && $(document).trigger('second_loaded');
});
Note the use of one rather than on; it makes that handler fire only once.
Have you considered using jQuery.when:
$.when($.getScript("http://myexternaljs.com/first.js"), $.getScript("http://firstexternal.com/second.js"))
    .done(function(){
        doSecond();
    });
If I were you, I'd facade $.getScript and perform some combination of the above tricks. After reading through the comments it seems that there is a possibility of loading the same script twice.
If you use requirejs this problem is solved for you because it only loads each script once. The answer here is to hang on to the requests made.
Loader:
var requiredScripts = {};
function importScript(url) {
if (!requiredScripts[url]) {
requiredScripts[url] = $.getScript(url);
}
return requiredScripts[url];
}
Usage:
// should cause 2 requests
$.when(
importScript('first.js'),
importScript('second.js')
).done(function() {
// should cause no requests
$.when(importScript('first.js')).done(function() {});
$.when(importScript('second.js')).done(function() {});
});
Real world example here using MomentJS and UnderscoreJS: http://jsfiddle.net/n3Mt5/
Of course requirejs would handle this for you and with better syntax.
Requirejs
define(function(require) {
var first = require('first'),
second = require('second');
});
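If first.js and second.js aren't AMD modules themselves, you'd typically map them with require.config first; a sketch, where the paths and the shim exports values are assumptions rather than anything taken from the question:

require.config({
    paths: {
        // RequireJS appends .js to these
        first:  "http://myexternaljs.com/first",
        second: "http://firstexternal.com/second"
    },
    shim: {
        // plain (non-AMD) scripts need shims; the exports values are guesses
        first:  { exports: "first" },
        second: { exports: "doSecond" }
    }
});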
$(window).one("second", function (e, t) {
    if (typeof second === "function") {
        second();
        console.log(e.type, e.timeStamp - t);
    }
});

$.getScript("first.js")
    .done(function (data, textStatus, jqxhr) {
        if (textStatus === "success") {
            first();
            $.getScript("second.js")
                .done(function (script, textStatus, jqxhr) {
                    var callbacks = $.Callbacks("once");
                    // queue the "second" event so it fires exactly once
                    callbacks.add(function () {
                        $(window).trigger("second", [ $.now() ]);
                    });
                    if (textStatus === "success" && typeof second === "function") {
                        callbacks.fire();
                    } else {
                        // give the script a moment to settle, then fire anyway
                        $(":root").animate({ top: "0" }, 1000, function () {
                            callbacks.fire();
                        });
                    }
                });
        }
    });
// `first.js` : `function first() { console.log("first complete") }`
// `second.js` : `function second() { console.log("second complete") }`
Related
My company allows us to write code in a javascript editor online. Other libraries are preloaded, so the code we write has access to these libraries.
Specifically, we can use Underscore.js and jQuery.js functions in our code. We can also use our very own library Graphie.js.
In an effort to save myself time, I have slowly built up my own personal set of functions which I copy and paste into every code I write. That set of functions is now so long that I want to fetch it externally (in order to save space, etc).
$.getScript( 'url/to/myfunctions.js' )
I tried the above code, but it was too good to be true. This jQuery function getScript seems to run myfunctions as their own independent unit. This fails because myfunctions use our Graphie.js functions within them.
$.get( 'url/to/myfunctions', eval )
This above code fetches and successfully evals my code (I configured my server to allow it). Also too good to be true: any jQuery and Underscore functions in my code work, but any Graphie functions cause an error.
Instead of
$.get( 'url/to/myfunctions', eval );
try
$.get( 'url/to/myfunctions', function(code) { eval(code); } );
This way the eval function is going to be executed within the same scope as the rest of your code, rather than within the scope of jQuery. After the code has been fetched and executed, you can continue with the execution of the rest of your code:
$.get( 'url/to/myfunctions', function(code) {
eval(code);
callback();
});
function callback() {
// Your code goes here
}
Explanation
For the purpose of the explanation, let's use this simplified model of the environment, in which your code is being executed:
// JQuery is defined in the global scope
var $ = {
get: function( url, fn ) {
var responses = {
"url/to/myfunctions": "try {\
if(graphie) log('Graphie is visible.');\
} catch (e) {\
log('Graphie is not visible. (' + e + ')');\
}"
}; fn( responses[url] );
}
};
(function() {
// Graphie is defined in a local scope
var graphie = {};
(function() {
// Your code goes here
$.get( "url/to/myfunctions", eval );
$.get( "url/to/myfunctions", function(code) { eval (code); } );
})();
})();
The markup and log helper used by this model:
<ol id="output"></ol>
<script>
function log(msg) {
    var el = document.createElement("li");
    el.appendChild(document.createTextNode(msg));
    output.appendChild(el);
}
</script>
The output: the plain eval version logs "Graphie is not visible." while the wrapped version logs "Graphie is visible."
As you can see, the function passed to $.get gets executed inside its body. If you only pass eval to $.get, then you don't capture the local variable graphie, which is then invisible to the evaluated code. By wrapping eval inside an anonymous function, you capture the reference to the local variable graphie, which is then visible to the evaluated code.
I'd advise against the use of eval. However, you can follow this model.
First in your myFunctions.js, wrap all your code into a single function.
(function(_, $, graphie) {
// declare all your functions here, which make use of the parameters
}) // we will be calling this anonymous function later with parameters
Then after getting the script you could do
$.get( 'url/to/myfunctions', function(fn){
var el = document.createElement('script');
el.type = 'text/javascript';
el.text = fn + '(_, jQuery, Graphie);';
document.head.appendChild(el);
});
Note that I've put Graphie in as the parameter, but I'm not sure that's the right name, so substitute your actual graphie variable there.
Assuming that you have ajax access to this script (since that is what $.get is doing in your sample code shown), you could attempt to use jQuery's .html() to place the script which should execute it with the page's variable environment.
$.ajax({
url: 'url/to/myfunctions.js',
type: 'GET',
success: function (result) {
var script = '<scr'+'ipt>'+result+'</scr'+'ipt>';
var div = $("<div>");
$("body").append(div);
div.html(script);
}
});
Internally, this script will end up being executed by jQuery's globalEval function. https://github.com/jquery/jquery/blob/1.9.1/src/core.js#L577
// Evaluates a script in a global context
// Workarounds based on findings by Jim Driscoll
// http://weblogs.java.net/blog/driscoll/archive/2009/09/08/eval-javascript-global-context
globalEval: function( data ) {
if ( data && jQuery.trim( data ) ) {
// We use execScript on Internet Explorer
// We use an anonymous function so that context is window
// rather than jQuery in Firefox
( window.execScript || function( data ) {
window[ "eval" ].call( window, data );
} )( data );
}
}
I also asked a question related to this here: Why is it that script will run from using jquery's html but not from using innerHTML?
Thanks to everyone's help, here is the solution that worked...
The myfunctions.js file has to be wrapped in a function:
function everything(_,$,Graphie){
// every one of myfunctions now must be attached to the Graphie object like this:
Graphie.oneOfMyFunctions = function(input1,input2,etc){
// content of oneOfMyFunctions
}
// the rest of myfunctions, etc.
}
Then in my code I can retrieve it with:
$.get( '//path/to/myfunctions', eval )
everything(_,jQuery,mygraphievar);
Somehow, the code being evaled didn't have access to the global variable mygraphievar, which is why it had to be passed in and NOT part of the evaled code (here Amit made a small error).
Also, the everything function is executed OUTSIDE of the $.get() so that the changes to mygraphievar are made before any other code below gets executed.
One should notice that $.get() is actually asynchronous and will not call eval until after other code has executed. This causes the code to fail the very first time I run it, but after the first time the functions are saved in memory and everything works correctly. The proper solution would be to put ALL of the code I want to execute in the callback of $.get(), but I was lazy.
One should also know that a slightly simpler solution is possible with $.getScript(), but I don't have time to verify it.
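For the record, the untested $.getScript() variant would presumably look like this (it assumes everything() ends up global, which it should since getScript evaluates the file in global scope, and that mygraphievar is visible where the callback is written):

$.getScript( '//path/to/myfunctions' ).done(function() {
    everything( _, jQuery, mygraphievar );
});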
Source: http://blog.tomasjansson.com/creating-custom-unobtrusive-file-extension-validation-in-asp-net-mvc-3-and-jquery
$(function () {
jQuery.validator.unobtrusive.adapters.add('fileextensions', ['fileextensions'], function (options) {
var params = {
fileextensions: options.params.fileextensions.split(',')
};
options.rules['fileextensions'] = params;
if (options.message) {
options.messages['fileextensions'] = options.message;
}
});
jQuery.validator.addMethod("fileextensions", function (value, element, param) {
var extension = getFileExtension(value);
var validExtension = $.inArray(extension, param.fileextensions) !== -1;
return validExtension;
});
function getFileExtension(fileName) {
var extension = (/[.]/.exec(fileName)) ? /[^.]+$/.exec(fileName) : undefined;
if (extension != undefined) {
return extension[0];
}
return extension;
};
} (jQuery));
Wouldn't jQuery already be available inside this function? Why would it be passed in at the end there? I don't get this; I've seen it a few times before but never had to use it, so I was curious what's going on here.
Passing it in isn't doing anything. That syntax isn't right, since the function, as it's used there, is a callback and not an IIFE.
The only reason I could think of to do that would be if noConflict mode were used. Even then, the syntax is still not correct.
Read More: jQuery.noConflict
We pass jQuery (or another jQuery control variable: $, jQuery, jq, jQ, jQ1101) to the module or plugin because the page can have multiple versions of jQuery loaded, or other libraries that use $ as their control variable, such as PrototypeJS or Zepto.
By passing the jQuery control variable we ensure that the module has the right one, and internally we can just use $ as the jQuery variable.
Please see this example.
<html>
<head>
<title>StackOverflow 19257741</title>
<script type="text/javascript" src="http://zeptojs.com/zepto.min.js"></script>
<script type="text/javascript" src="http://code.jquery.com/jquery-1.10.1.min.js"></script>
</head>
<body>
<div id="content">
<!-- Other HTML Tags -->
</div>
</body>
<script type="text/javascript">
//Change the jQuery control variable because you have another library loaded, or multiple copies of jQuery loaded.
var jQ1101 = jQuery.noConflict(true);
//Now you cannot access jQuery via $ or jQuery
//This module has to have jQuery to do DOM manipulation
var module = (function ($, zepto) {
var i = 0, //private ivar for module use only.
_init = function () {
var me = this;
//TODO: Module can init or do something here...
return me;
},
_fill = function (selector) {
//We can use $ here as jQuery Control
$(selector).css({ "backgroundColor": "#000000", "width": "100%", height: "100%" });
//Wait for 2 seconds
window.setTimeout(function() {
//Now select the DOM with Zepto
zepto(selector).css({ "backgroundColor": "#777777", "width": "100%", height: "100%" });
}, 2000);
};
return {
init: _init,
fill: _fill
};
})(jQ1101, $); //We have to pass the current control variable for jQuery so the module can use the library internally.
//Call module then call fill method by passing variable
module.init().fill("#content");
//Two different Library
console.log(jQ1101.fn.jquery); //jQuery 1.10.1
console.log($); //Zepto
</script>
</html>
The code in that blog post is valid JavaScript syntax and will execute, but it doesn't do what the author probably expected. The $(function...) call looks like an attempt to run that function on DOM ready in the usual manner, but that's not what happens.
Let's deconstruct the code. First, strip out all the code in the function and add some logging:
console.log( 'before' );
$( function () {
console.log( 'DOM ready' );
} (jQuery) );
console.log( 'after' );
This is (perhaps surprisingly) valid JavaScript and will log:
before
DOM ready
after
But here's the problem: If you put that script in a web page, you really want it to log:
before
after
DOM ready
After all, you're running the script before the DOM is ready, expecting the function to be run later, after the DOM becomes ready. That's the whole point of using the $() call.
What went wrong? To make it more clear, let's break out the inner function as a separate named function:
console.log( 'before' );
function ready() {
console.log( 'DOM ready' );
}
$( ready(jQuery) );
console.log( 'after' );
And one more change to make it completely step-by-step:
console.log( 'before' );
function ready() {
console.log( 'DOM ready' );
}
var result = ready( jQuery );
$( result );
console.log( 'after' );
Each of these versions has exactly the same semantics and runs in the same order.
Now it should be clear what happened. We're calling the ready function immediately and passing its return value into the $() call. The ready function doesn't return any value, so that value is undefined.
The last line, then, is the equivalent of:
$( undefined );
And that call simply returns an empty jQuery object ([]).
The culprit, of course, is that (jQuery) at the end. Adding that is what caused the function to be called immediately, instead of passing a reference to the function into the $() call. And the presence of jQuery inside the parentheses is meaningless: this ready function doesn't expect any arguments, so that is ignored. It would be the same thing if () appeared there.
It's very much like a common error you see with setTimeout() and similar calls:
// Oops - calls doStuff immediately, not after one second
setTimeout( doStuff(), 1000 );
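The fix for that is to pass the function reference itself:

// Runs doStuff after one second
setTimeout( doStuff, 1000 );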
This raises the question: if the code doesn't work as expected, why didn't it cause a problem? Well, the function does get called either way; the only difference is when it runs.
So there are two possible reasons why it didn't cause a problem:
This block of code may have been placed at the end of the <body> as is popular practice these days. In that case the DOM would probably be ready-enough when the code is run. DOM elements are created in order as the <body> is loaded, and if you put a <script> tag after a particular DOM element, that DOM element will indeed be available when that script is run.
The code may not require the DOM to be ready. Looking at the validator code in the blog post, it does look like setup code that doesn't do anything with the DOM when it's first run. If so, then as long as the jQuery.validator plugin is already loaded, then it wouldn't make any difference if this code runs immediately or later when the full DOM is ready.
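So the code happens to work, but what the blog author presumably meant was one of the usual forms, e.g.:

// run on DOM ready; jQuery passes itself in, so $ is safe inside
jQuery(function ($) {
    // validator setup goes here
});

// or, if immediate execution was actually intended, a true IIFE:
(function ($) {
    // validator setup goes here
}(jQuery));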
I'm trying to load a script so I can use its functions in the page spawned by my bookmarklet (a view-source page: XHR, followed by beautify.js, followed by prettify.js).
I know what I am basically supposed to do (like this), but I can't find a good way to detect when the functions I need are actually loaded.
var doWhenLoaded = function (name) {
    if (typeof(eval(name)) === 'function') {
        eval(name+'()');
    } else {
        setTimeout(
            function () {
                console.log("from timeout: "+new Date().getTime());
                doWhenLoaded(name);
            }, 50
        );
    }
}
I tried that but eval(name+'()'); throws an error.
I can't answer your question, but to test if a function is available use:
var doWhenLoaded = function (name) {
if (typeof window[name] == 'function') {
window[name]();
} else {
// set the timeout. Should have a limit, else it wil go on forever.
}
...
};
Edit
Updated to use window[name], though it really should use a reference to the global object. I guess it's ok to use window for a browser-specific script.
The code above should not throw any errors. Since name is in the formal parameters, it's essentially a declared local variable. If name is undefined, then typeof name will return the string "undefined", which fails the test so name() is not evaluated.
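As for the limit mentioned in the comment, here is a sketch of one way to add it (the numbers are arbitrary):

var doWhenLoaded = function (name, attempts) {
    attempts = attempts || 0;
    if (typeof window[name] == 'function') {
        window[name]();
    } else if (attempts < 20) {          // ~1 second at 50ms per try
        setTimeout(function () {
            doWhenLoaded(name, attempts + 1);
        }, 50);
    } else {
        console.log(name + ' never became available');
    }
};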
I think I can force the scripts to load synchronously, before I end up calling them, by simply document.write-ing the script tags rather than inserting them into the DOM.
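Roughly like this, with placeholder URLs; document.write only behaves this way while the document is still being parsed, and it blocks everything until each script has executed:

document.write('<script src="https://example.com/beautify.js"><\/script>');
document.write('<script src="https://example.com/prettify.js"><\/script>');
// any <script> written after these will only run once both have loaded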
So let's say we load a bunch of scripts via $.getScript:
$.getScript( 'js/script1.js' );
$.getScript( 'js/script2.js' );
$.getScript( 'js/script3.js' );
Now, I'd like to invoke a handler when all those scripts finished loading. I tried binding a handler for the global ajaxStop event. According to the docs, the ajaxStop global event is triggered if there are no more Ajax requests being processed.
$( document ).ajaxStop( handler );
but it doesn't work (the handler is not invoked).
Live demo: http://jsfiddle.net/etGPc/2/
How can I achieve this?
I dug a little more into this issue, and it's indeed the cross-domain script request that's the caveat. As I posted in the comments, that scenario has been implemented such that it sets the global option to false, which stops jQuery from firing global ajax events. (No idea why it was implemented that way, though.)
This can be confirmed with this fiddle (pass means ajaxStop is fired):
cross-domain, no script: pass
cross-domain, script: fail
no cross-domain, no script: pass
no cross-domain, script: pass
The most straightforward thing to do is simply to add another prefilter which forces the global option to true:
jQuery.ajaxPrefilter( "script", function( options ) {
    options.global = true;
});
This also makes this failing scenario pass in the fiddle.
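Put together with the code from the question, that looks roughly like this:

// force global ajax events even for cross-domain script requests
jQuery.ajaxPrefilter( "script", function( options ) {
    options.global = true;
});

$( document ).ajaxStop(function() {
    // all three scripts have finished loading at this point
    handler();
});

$.getScript( 'js/script1.js' );
$.getScript( 'js/script2.js' );
$.getScript( 'js/script3.js' );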
you're doing it wrong anyways :)
here is one of the things that can happen:
imagine the scripts are cached, then they might be loaded in no time.
so, straight after the first call $.getScript( 'js/script1.js' ); the script may already be available and ajaxStop might(!) fire; in the worst case that would happen three times.
to answer your question indirectly, I would propose a different solution which avoids this race condition altogether.
you can try it here: http://jsfiddle.net/etGPc/8/
var urls, log, loaded;
// urls to load
urls = [
'https://raw.github.com/h5bp/html5-boilerplate/master/js/script.js',
'https://raw.github.com/h5bp/html5-boilerplate/master/js/plugins.js',
'https://raw.github.com/h5bp/html5-boilerplate/master/js/libs/modernizr-2.0.6.min.js'
];
// urls loaded
loaded = [];
log = $( '#log' );
$.map( urls, function( url ){
$.getScript( url, function(){
// append to loaded urls
loaded.push( url );
log.append( "loaded " + url + "<br>" );
// all loaded now?
if( loaded.length == urls.length ){
log.append( "<b>all done!</b>" );
}
} );
} );
if you haven't seen jQuery.map before: it's not really different from a for-loop :)
another advantage here is that this method doesn't get confused if you have other ajax requests going on at the same time.
p.s. to avoid naming-clashes you can wrap the entire thing in a self-executing function, i.e.
(function(){
    var urls, log, loaded;
    ... all code here ...
})();
Update: Refactored the code a bit...
var urls, loadedUrls, log;
urls = [
'https://raw.github.com/h5bp/html5-boilerplate/master/js/script.js',
'https://raw.github.com/h5bp/html5-boilerplate/master/js/plugins.js',
'https://raw.github.com/h5bp/html5-boilerplate/master/js/libs/modernizr-2.0.6.min.js'
];
loadedUrls = [];
log = $( '#log' )[0];
urls.forEach(function ( url ) {
$.getScript( url, function () {
loadedUrls.push( url );
$( log ).append( 'loaded ' + url + '<br>' );
if( loadedUrls.length === urls.length ){
$( log ).append( '<b>all done!</b>' );
}
});
});
Live demo: http://jsfiddle.net/etGPc/10/
While dealing with my own problem I found a much better and more elegant solution, using the jQuery Deferred object. You may already know about it, maybe for another use case, but it works great for loading files and firing functions when everything is done. The code is as follows:
function getLatestNews() {
return $.get('/echo/js/?delay=2&js=', function(data) {
console.log('news data received');
$('.news').css({'color':'blue'});
});
}
function getLatestReactions() {
return $.get('/echo/js/?delay=5&js=', function(data) {
console.log('reactions data received');
$('.reactions').css({'color':'green'});
});
}
function prepareInterface() {
return $.Deferred(function(dfd) {
var latest = $('.news, .reactions');
latest.slideDown(500, dfd.resolve);
latest.addClass('active');
}).promise();
}
$.when(
getLatestNews(),
getLatestReactions(),
prepareInterface()
).then(function() {
console.log('fire after requests succeed');
$('.finished').html('I am done!');
}).fail(function() {
console.log('something went wrong!');
});
I made a small fiddle where you can check out the code.
http://jsfiddle.net/saifbechan/BKTwT/
You can check out a running copy there. I took the snippet from this tutorial; it's worth reading the whole thing, lots of good information on asynchronous ajax.
http://msdn.microsoft.com/en-us/scriptjunkie/gg723713
I was implementing an on-demand script controller based on jQuery's getScript; it looks like this:
var controller = function(){
    var script = function(){
        var scripts = {};
        return {
            load: function(jsurl){
                $.getScript(jsurl, null);
            },
            run: function(js){
                window[js].apply(this, null);
            }
        };
    };
    return {
        script: script()
    };
};
var ctlr = controller();
then here is a remote script with a function to be loaded - remote.js
function remotefunc(){
alert( 'remotefunc invoked' );
}
and here is how the whole thing supposed to work, in the main script:
ctlr.script.load( 'remote.js' ); // remote script successfully loaded
ctlr.script.run( 'remotefunc' ); // got an error, window['remotefunc'] undefined
but as you can see, 'remotefunc' is defined in the global 'window' scope, so the window object is supposed to be able to 'see' it.
I thought the problem was probably the closure stuff in the 'controller' definition, so I did a direct $.getScript without using the 'controller':
$.getScript( 'http://path/to/remote.js', function(){
window['remotefunc'].apply( this, null ); // this worked
} );
Strange. So it is about the 'controller' implementation (and I kind of need it)! Can anybody help me out with this? How do I fix the 'controller' implementation so that
window[js].apply(this,null);
can actually work?
Thanx.
The reason it's telling you window['remotefunc'] is undefined is because you are not giving it time to actually download and execute the remote script before attempting to call a function defined in it.
The remote script is loaded asynchronously, which means the script execution isn't paused while waiting for a response.
You will need to either re-implement the getScript method to be synchronous or somehow work your class around the fact that the function will not be available in any determinate amount of time.
EDIT: Just found another possible solution, try calling this before your request
$.ajaxSetup({async: false});
This will make the getScript method synchronous
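Note that $.ajaxSetup({async: false}) flips every later request to synchronous. If that's too blunt, a narrower sketch of the same idea, scoped to a single request (and only workable for same-origin URLs, since jQuery won't do synchronous cross-domain script requests):

$.ajax({
    url: 'remote.js',          // assumed same-origin path
    dataType: 'script',
    async: false               // blocks just this one request
});
window['remotefunc'].apply( this, null );   // defined by the time we get here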
When using something like getSript, it's important to remember that it is fetching asynchronously. Meaning, the browser fires off the request and while that's happening, code after that line executes without pause.
jQuery provides a callback function parameter to get script that allows you to do something after the asynchronous fetch is finished.
Try this:
var script = function(){
var scripts = {};
return {
load: function(jsurl, callback){
$.getScript(jsurl, callback);
},
run: function(js){
window[js].apply(this,null);
}
}
};
Then, when using it:
ctlr.script.load( 'remote.js', function(){
    // remote script successfully loaded
    ctlr.script.run( 'remotefunc' );
});
Could this be a timing issue?
In your working example you call the function in a callback which jQuery will not invoke until the script is loaded. In your non-working example, you call the function immediately after getScript which is asynchronously loading the script.