I'm trying to programmatically load a local JavaScript file (the Papa Parse library) and then use one of its functions:
$.getScript("./Content/Scripts/papaparse.js", function () {
    console.log("Papaparse loaded successfully");
    Papa.parse(file, { skipEmptyLines: true, header: false, complete: completeCallback });
});
The script loaded successfully, but calling the parse method throws an error:
ReferenceError: Papa is not defined
Within the papaparse library, Papa is defined as follows:
(function (global) {
    "use strict";
    var Papa = {};
    Papa.parse = someFunction;
    // ...
    global.Papa = Papa;
})(window);
In case it helps, all of this code is called from a TypeScript file.
What am I doing wrong?
As Castro pointed out in his answer, according to the official documentation of jQuery's getScript:
The callback of getScript method is fired once the script has been loaded but not necessarily executed.
That means that when getScript's callback is called, the target script has only been loaded into the current page context, not necessarily fully executed, so you need to give the JavaScript engine some time to execute it.
How can you give it that time? One option is setTimeout/setInterval, used right inside getScript's callback.
A modified version of your code would look like this:
$.getScript("./Content/Scripts/papaparse.js", function () {
    console.log("Papaparse loaded successfully");

    function dealWithPapa() {
        Papa.parse(file, { skipEmptyLines: true, header: false, complete: completeCallback });
    }

    // regularly check every 100ms whether Papa is defined yet
    var interval = setInterval(function () {
        if (typeof Papa !== "undefined") {
            // once we have a reference to Papa, clear this interval
            clearInterval(interval);
            dealWithPapa();
        }
    }, 100);
});
I hope that clears your doubt.
According to https://api.jquery.com/jquery.getscript/:
The callback is fired once the script has been loaded but not necessarily executed.
You might need to use a setTimeout(function(){...}, 300) to wait for it to execute.
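For example, applied to the snippet from the question (a minimal sketch; the 300ms delay is an arbitrary guess, so the polling approach in the answer above is generally more robust):

$.getScript("./Content/Scripts/papaparse.js", function () {
    console.log("Papaparse loaded successfully");
    // Give the just-loaded script a moment to finish executing before using Papa.
    setTimeout(function () {
        Papa.parse(file, { skipEmptyLines: true, header: false, complete: completeCallback });
    }, 300);
});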
Here's the scenario. I'm making a $.getScript() call to fetch a script from my JavaScript file. The script I download with $.getScript() in turn downloads other scripts that it depends on. In my script I use done() to check whether the script has finished loading, and if it has, I call a function that isn't defined in the script I loaded with $.getScript(), but in the script that it loaded.
It's getting confusing, so let me demonstrate with some code:
//my script.js
$.getScript("http://myexternaljs.com/first.js").done(function(){
    doSecond(); //<- this resides in second.js which is being called in first.js below
});

//first.js
(function(){
    $.getScript("http://firstexternal.com/second.js");
}());

//second.js
function doSecond(){
    return console.log("hello world");
}
The problem is that second.js takes a little time to download, so I keep getting a "doSecond is not defined" error on my call to doSecond() in done().
I could use a timeout and check whether second.js has loaded, but is there a better way to do this?
I'm open to any AMD loaders or Promises answers as well.
You can also use jQuery's global ajaxComplete handler:
$(document).ajaxComplete(function(ev, jqXhr, options) {
    // You could be as specific as you need to here.
    if ( options.url.match('second.js') ) {
        doSecond();
    }
});
Alternatively, you could do this inside ajaxComplete:
$(document).ajaxComplete(function(ev, jqXhr, options) {
    // You could simplify to
    //   window.doSecond && window.doSecond()
    // if you can trust that it will always be a function
    if ( window.doSecond && $.isFunction(window.doSecond) ) {
        doSecond();
    }
});
The facts:
You have first.js and within this script is an include for second.js
You need to make a call to doSecond() which is defined in second.js
You need to ensure doSecond() is available before you call it
You can't directly change first.js or second.js, but you can have someone else change them
Possible solutions, ordered by best to worst
1) Request that second.js be removed from first.js. Call them separately so that you can nest them:
$.getScript("first.js").done(function(){
$.getScript("second.js").done(function(){
doSecond();
});
});
This is the best solution. There are alternatives that do basically the same thing in principle (e.g. other people's answers here). If first.js were including second.js synchronously, or otherwise forcing it to load before continuing (e.g. option #3 below), you wouldn't be running into this problem to begin with. Therefore first.js must already be structured to deal with second.js being loaded asynchronously, so there shouldn't be an issue with them removing it from the file and you calling it yourself.
But you mentioned that the location of second.js is defined in first.js so this somehow isn't feasible to you (why not? can they put the path/to/script in a variable for you to access?)
2) Request that second.js be wrapped in a .done (or equivalent loaded) handler that calls a callback function you can define.
// inside first.js
$.getScript("second.js").done(function(){
    if (typeof secondLoaded == 'function')
        secondLoaded();
});

// on-page or elsewhere, you define the callback
function secondLoaded() {
    doSecond();
}
This is just a generic and easy "callback" example. There are a million ways to implement this principle, depending on what all is actually in these scripts and how much effort people are willing to make to restructure things.
3) Request that the second.js script include be changed to load via document.write:
document.write(unescape("%3Cscript src='second.js' type='text/javascript'%3E%3C/script%3E"));
This forces the browser to resolve the document.write before it can move on, so second.js should be loaded by the time you want to use doSecond(). But this is considered bad practice, because nothing else can happen until the document.write is resolved; if second.js takes forever to load or eventually times out, that makes for bad UX. Avoid this option unless "red tape" leaves you no other choice.
4) use setTimeout to try and wait for it to load.
function secondLoaded() {
    if (!secondLoaded.attempts) secondLoaded.attempts = 0;
    if (secondLoaded.attempts < 5) {
        if (typeof doSecond == 'function') {
            doSecond();
        } else {
            secondLoaded.attempts++;
            window.setTimeout(secondLoaded, 100);
        }
    }
}
secondLoaded();
I list this below #3, but really it's kind of a toss-up. In this situation you basically have to either pick a cutoff after which you give up on executing doSecond() (in this example, I try 5 times at 100ms intervals), or code it to keep checking forever (remove the .attempts logic, or swap it for setInterval/clearInterval logic).
You could modify how $.getScript works.
$.getScript = (function() {
    var originalGetScript = $.getScript;
    return function() {
        return originalGetScript.apply($, arguments).done(function() {
            $(document).trigger('load_script');
        });
    };
})();
This will fire an event every time a script is loaded.
So you can wait for these events to fire and check if your method exists.
$(document).one('second_loaded', function() {
    doSecond();
}).on('load_script', function() {
    window.doSecond && $(document).trigger('second_loaded');
});
Note the use of one rather than on for 'second_loaded': it makes the handler run only once.
Have you considered using jQuery.when:
$.when($.getScript("http://myexternaljs.com/first.js"), $.getScript("http://firstexternal.com/second.js"))
    .done(function(){
        doSecond();
    });
If I were you, I'd facade $.getScript and perform some combination of the above tricks. After reading through the comments it seems that there is a possibility of loading the same script twice.
If you use requirejs this problem is solved for you because it only loads each script once. The answer here is to hang on to the requests made.
Loader:
var requiredScripts = {};

function importScript(url) {
    if (!requiredScripts[url]) {
        requiredScripts[url] = $.getScript(url);
    }
    return requiredScripts[url];
}
Usage:
// should cause 2 requests
$.when(
    importScript('first.js'),
    importScript('second.js')
).done(function() {
    // should cause no requests
    $.when(importScript('first.js')).done(function() {});
    $.when(importScript('second.js')).done(function() {});
});
Real world example here using MomentJS and UnderscoreJS: http://jsfiddle.net/n3Mt5/
Of course requirejs would handle this for you and with better syntax.
Requirejs
define(function(require) {
    var first = require('first'),
        second = require('second');
});
$(window).one("second", function(e, t) {
if ( $(this).get(0).hasOwnProperty(e.type) && (typeof second === "function") ) {
second(); console.log(e.type, e.timeStamp - t);
$(this).off("second")
};
return !second
});
$.getScript("first.js")
.done(function( data, textStatus, jqxhr ) {
if ( textStatus === "success" ) {
first();
$.getScript("second.js")
.done(function( script, textStatus, jqxhr, callbacks ) {
var callbacks = $.Callbacks("once");
callbacks.add($(window).trigger("second", [ $.now() ]));
return ( textStatus === "success" && !!second
? callbacks.fire()
: $(":root").animate({top:"0"}, 1000, function() { callbacks.fire() })
)
});
};
})
// `first.js` : `function first() { console.log("first complete") }`
// `second.js` : `function second() { console.log("second complete") }`
I have a function which calls out for JSON and then in the success function makes some changes to the DOM. I'm trying to use the mock-ajax library in my Jasmine tests to avoid having to expose the various private functions for mocking.
Even though, when stepping through the test, request.response is set, the onSuccess method is never called.
My tests:
describe('the table loader', function () {
    var request;

    beforeEach(function() {
        //html with a loader div, an output div and a transit and dwell template
        setFixtures('<div class="loader"></div><div class="row"><div id="output" class="col-xs-12"></div></div><script id="thingTemplate" type="text/x-handlebars">{{snip}}</script>');

        expect($('.loader')).toBeVisible();

        //start ajax call
        window.dashboards.thing.display({
            loaderId: '.loader',
            templateId: '#thingTemplate',
            templateOutletId: '#output',
            dataUrl: '/my/fake/url'
        });

        //catch the ajax request
        request = mostRecentAjaxRequest();
    });

    describe('on success', function () {
        beforeEach(function() {
            //populate the response
            request.response({
                status: 200,
                responseText: "{rowItem: [{},{},{}]}"
            });
        });

        it('should hide the loader', function () {
            //thing should now receive that JSON and act accordingly
            expect($('.loader')).not.toBeVisible();
        });
    });
});
and my code:
(function (dashboards, $) {
    dashboards.thing = dashboards.thing || {};

    var compileTable = function(templateId, jsonContext) {
        var source = $(templateId).html();
        var template = Handlebars.compile(source);
        var context = jsonContext;
        return template(context);
    };

    var getDashboardData = function(options) {
        $.getJSON(
            options.dataUrl,
            function (data) {
                processDashboardData(options, data);
            }
        ).fail(function (jqxhr, textStatus, error) {
            console.log('error downloading dashboard data');
            console.log(textStatus + ': ' + error);
        }).always(function() {
            console.log('complete');
        });
    };

    var processDashboardData = function (options, data) {
        $(options.loaderId).hide();
        $(options.templateOutletId).html(compileTable(options.templateId, data));
    };

    dashboards.thing.display = function (options) {
        getDashboardData(options);
    };
}(
    window.dashboards = window.dashboards || {},
    jQuery
));
None of the deferred functions (success, error, and always) are being called.
Edit
Based on #gregg's answer below (and he's right, I didn't include the useMock call in the example code), this feels like a version issue, as even with that call included it still isn't working for me.
I've added a runnable example on GitHub.
You need to make sure to install the ajax mock with jasmine.Ajax.useMock() before you actually make the call or jasmine-ajax won't have taken over the XHR object and you'll be making real requests. Once I did that against your sample code, it looks like the responseText you're sending isn't JSON parsable and jQuery blows up. But I was still seeing the 'complete' message logged in my console.
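For illustration, a sketch of the adjusted test (assuming the 1.3.x mock-ajax API the question already uses; the quoted keys make the responseText valid JSON, and the fixture/display setup from the question is elided):

describe('the table loader', function () {
    var request;

    beforeEach(function () {
        // Install the XHR mock before the code under test issues any request.
        jasmine.Ajax.useMock();

        // ...set the fixture and call window.dashboards.thing.display(...) as in the question...

        request = mostRecentAjaxRequest();
    });

    it('should hide the loader', function () {
        request.response({
            status: 200,
            responseText: '{"rowItem": [{},{},{}]}' // valid JSON this time
        });
        expect($('.loader')).not.toBeVisible();
    });
});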
So, as of 29th Jan 2014, the download link in the Readme for mock-ajax on GitHub in the 1.3.1 tag doesn't point to the correct version of the file.
Manually downloading the mock-ajax file from the lib folder of the tag does work.
In other words, for the 1.3.1 tagged release, don't download from the Readme link; download directly from the tag's lib folder.
It's possible that the format of the data returned by your ajax call and jasmine's is different -- I believe that would throw an exception that wouldn't be recoverable, hence the reason none of your callback functions are running.
Newer versions of jasmine-ajax use request.respondWith, and it's also of note (although I think jQuery handles this) that Jasmine doesn't have XMLHttpRequest.Done defined, so if you use vanilla JS you have to handle that case yourself. Lastly, I use jasmine.Ajax.install() and not jasmine.Ajax.useMock().
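For reference, a rough sketch of that newer API (assuming jasmine-ajax 2.x or later; the response body is illustrative):

beforeEach(function () {
    jasmine.Ajax.install();
});

afterEach(function () {
    jasmine.Ajax.uninstall();
});

it('hides the loader on success', function () {
    // ...trigger the code under test here...
    jasmine.Ajax.requests.mostRecent().respondWith({
        status: 200,
        contentType: 'application/json',
        responseText: '{"rowItem": [{},{},{}]}'
    });
    expect($('.loader')).not.toBeVisible();
});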
Whew -- it's hard to know what exactly to do because Jasmine has such poor documentation for older versions and there are so many inconsistencies across them.
For my backend I want to automatically load JavaScript files when certain elements are detected. Here is an example:
if ($('.wysiwyg').length > 0) {
    include('javascript/ckeditor/ckeditor.js');
    $(".wysiwyg").ckeditor();
}
But when I execute the code I get "$(".wysiwyg").ckeditor is not a function", because it seems the browser is still loading or parsing the JavaScript file that was included on the previous line. If I put an alert popup right before the call it does work, I guess because the alert "pauses" the script and gives it time to load the file.
Is there a way I can know when the file is actually loaded, so that the following code can be executed?
EDIT:
It seems I asked this question a bit too soon. I found the script element's onload property, which takes a callback and solves this problem. This is my function now, in case others stumble upon the same problem:
function include(script, callback) {
    var e = document.createElement('script');
    e.onload = callback;
    e.src = script;
    e.type = "text/javascript";
    document.getElementsByTagName("head")[0].appendChild(e);
}

if ($('.wysiwyg').length > 0) {
    include('javascript/ckeditor/ckeditor.js', function() {
        $(".wysiwyg").ckeditor();
    });
}
Why not use the built-in, Ajax-based getScript?
It also has a callback mechanism that allows you to execute some code only after the required script has been successfully loaded:
function include(script, callback){
    $.getScript(script, function() {
        if (typeof callback == 'function')
            callback.apply({}, arguments);
    });
}
and then you can use it in such a manner:
if ($('.wysiwyg').length > 0) {
    include('javascript/ckeditor/ckeditor.js', function(){
        $(".wysiwyg").ckeditor();
    });
}
When you're using jQuery with promises you can use a modified version of the above code like so:
function include(srcURL) {
    var deferred = new $.Deferred();
    var e = document.createElement('script');
    e.onload = function () { deferred.resolve(); };
    e.src = srcURL;
    document.getElementsByTagName("head")[0].appendChild(e);
    return deferred.promise();
}
Then you can use the above code with a '$.when(include('someurl.js'))' call.
This will let you have:
The global window context (which you need for CKEditor)
The ability to defer executing other code until the when resolves
A script that doesn't require a callback or a context to be passed, because jQuery handles that with the promises functionality it includes
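For example, a sketch of that usage with the CKEditor case from the question:

$.when(include('javascript/ckeditor/ckeditor.js')).done(function () {
    // The script element's onload has fired, so the editor is available.
    $('.wysiwyg').ckeditor();
});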
I hope this helps someone else who is looking for more than a callback, and multiple scripts to be loaded with jQuery's promises/deferred functionality.
You can also try YepNope - a conditional JavaScript loader:
yepnope is an asynchronous conditional resource loader that's
super-fast, and allows you to load only the scripts that your users
need.
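For instance, a sketch of the CKEditor check from the question done with yepnope (assuming yepnope.js is already on the page):

yepnope({
    test: $('.wysiwyg').length > 0,
    yep: 'javascript/ckeditor/ckeditor.js',
    complete: function () {
        // complete fires whether or not the script was needed, so guard the call.
        if ($('.wysiwyg').length > 0) {
            $('.wysiwyg').ckeditor();
        }
    }
});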
You can do it this way
$(document).ready(function() {
    if ($('.wysiwyg').length > 0) {
        $('head').append('<script language="javascript" src="javascript/ckeditor/ckeditor.js"></script>');
        $(".wysiwyg").ckeditor();
    }
});
Modernizr can do this for you. See this MetaFlood article: Use jQuery and Modernizr to load javascript conditionally, based on existence of DOM element.
Not sure if I am being totally wrong here but I want to do something like this:
Have an external js page (on an external server)
Include the page - OK that is easy etc...
Have a jQuery function on the external page - well, actually many functions
Call those functions directly from the page.
All a bit like this:
External js page:
$(document).ready(function() {
    function testit() {
        $('#test').load('page.php');
    }
    function testit_1() {
        $('#test_1').load('page_1.php');
    }
    function testit_2() {
        $('#test_2').load('page_2.php');
    }
});
Then on the actual page just call:
<script type="script/javascript">
testit();
</script>
<div id="test"></div>
Am I wrong or should that not work?
You don't need to define the functions within the ready function, but you do have to call them within the ready function.
$(document).ready(function() {
    testit();
});

function testit() {
    $('#test').load('page.php');
}
function testit_1() {
    $('#test_1').load('page_1.php');
}
function testit_2() {
    $('#test_2').load('page_2.php');
}
Otherwise testit() will be called before the document is loaded, and at that moment the function doesn't even exist yet in your example.
Your functions are local to the scope of the anonymous function passed as the argument to $(document).ready(). Here's a simple example showing the behaviour you're seeing:
(function() {
    function foo() {
        alert("It shouldn't alert this...");
    }
})();
foo();
To fix it, simply move your function declarations outside of the ready function:
function testit() {
    $('#test').load('page.php');
}
function testit_1() {
    $('#test_1').load('page_1.php');
}
function testit_2() {
    $('#test_2').load('page_2.php');
}
And use the ready function (shorthand $(function() { ... })) in your main js file:
$(function() {
    testit_1();
});
I'm not sure if I'm misunderstanding you, but are you loading an external page from another server? This is not possible under normal browser security settings. You cannot perform a successful XMLHttpRequest for a document that resides on a different server; nearly all browsers will block this and leave you with nothing. You would have to write a server-side proxy that fetches the document and serves it back to the client.
That should work fine. Just be sure to include the external JS file in your page and execute testit() inside another ready() call:
<script type="script/javascript" src="http://someurl.com/external.js"></script>
<script type="script/javascript">
$.ready( function() {
testit();
} );
</script>
<div id="test"></div>
The location of a JS file is irrelevant. Once it is loaded into your page, it is executed in the context of that page.
I was implementing an on-demand script controller based on jQuery's getScript; it looks like this:
var controller = function(){
    var script = function(){
        var scripts = {};
        return {
            load: function(jsurl){
                $.getScript(jsurl, null);
            },
            run: function(js){
                window[js].apply(this, null);
            }
        };
    };
    return {
        script: script()
    };
};
var ctlr = controller();
then here is a remote script with a function to be loaded - remote.js
function remotefunc(){
    alert( 'remotefunc invoked' );
}
And here is how the whole thing is supposed to work, in the main script:
ctlr.script.load( 'remote.js' ); // remote script successfully loaded
ctlr.script.run( 'remotefunc' ); // got an error, window['remotefunc'] undefined
But as you can see, 'remotefunc' is defined in the global 'window' scope, so the window object should be able to 'see' it.
I thought the problem was probably the closure stuff in the 'controller' definition, so I did a direct $.getScript without using the 'controller':
$.getScript( 'http://path/to/remote.js', function(){
    window['remotefunc'].apply( this, null ); // this worked
} );
Strange. So it is about the 'controller' implementation (I kind of need it)! Can anybody help me out with this? How do I fix the 'controller' implementation so that
window[js].apply(this,null);
can actually work?
Thanx.
The reason it's telling you window['remotefunc'] is undefined is because you are not giving it time to actually download and execute the remote script before attempting to call a function defined in it.
The remote script is loaded asynchronously, which means the script execution isn't paused while waiting for a response.
You will need to either re-implement the getScript method to be synchronous or somehow work your class around the fact that the function will not be available in any determinate amount of time.
EDIT: I just found another possible solution; try calling this before your request:
$.ajaxSetup({async: false});
This will make the getScript method synchronous
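If you'd rather not flip the global setting, a narrower sketch is to make only this one request synchronous; loadScriptSync here is a hypothetical stand-in for the controller's load method, and $.getScript is just shorthand for $.ajax with dataType "script":

function loadScriptSync(jsurl) {
    // Block until the remote script has been fetched and executed,
    // without affecting other jQuery requests on the page.
    $.ajax({
        url: jsurl,
        dataType: 'script',
        async: false,
        cache: true
    });
}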
When using something like getScript, it's important to remember that it fetches asynchronously. Meaning, the browser fires off the request and, while that's happening, code after that line executes without pausing.
jQuery provides a callback parameter to getScript that allows you to do something after the asynchronous fetch has finished.
Try this:
var script = function(){
    var scripts = {};
    return {
        load: function(jsurl, callback){
            $.getScript(jsurl, callback);
        },
        run: function(js){
            window[js].apply(this, null);
        }
    };
};
Then, when using it:
ctlr.script.load( 'remote.js', function(){
    // remote script successfully loaded
    ctlr.script.run( 'remotefunc' );
});
Could this be a timing issue?
In your working example you call the function in a callback which jQuery will not invoke until the script is loaded. In your non-working example, you call the function immediately after getScript which is asynchronously loading the script.
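To make the timing concrete, here is the failing sequence from the question, annotated:

ctlr.script.load( 'remote.js' );   // returns immediately; remote.js is still downloading
ctlr.script.run( 'remotefunc' );   // runs before remote.js has executed, so window['remotefunc'] is still undefined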