I've created a Dojo module which depends on dojox/data/JsonRestStore like this:
define("my/MyRestStore",
["dojo/_base/declare", "dojox/data/JsonRestStore"],
function(declare, JsonRestStore) {
var x = new JsonRestStore({
target: '/items',
identifier: 'id'
});
...
which is fine. But now I want to have the uncompressed version of the JsonRestStore code loaded so that I can debug it. I can't find any documentation on how to do this, but since there is a file called 'JsonRestStore.js.uncompressed.js', I changed my code to:
define("my/MyRestStore",
["dojo/_base/declare", "dojox/data/JsonRestStore.js.uncompressed"],
function(declare, JsonRestStore) {
...
thinking that might work.
I can see the JsonRestStore.js.uncompressed.js file being loaded in Firebug, but I get an error when trying to do new JsonRestStore:
JsonRestStore is not a constructor
Should this work?
Is there a way of configuring Dojo to use uncompressed versions of all modules? That's what I really want, but I will settle for doing it on a per-dependency basis if that's the only way.
Update
I've found a way to achieve what I want to do: rename the JsonRestStore.js.uncompressed.js file to JsonRestStore.js.
However, this seems like a bit of a hacky workaround, so I'd still be keen to know if there is a better way (e.g. via configuration).
You have two options:
1) Create a custom build. The custom build will output a single uncompressed file that you can use for debugging. Think of dojo.js.uncompressed.js, but including all the extra modules that you use.
OR
2) For a development environment, use the Dojo source code. This means downloading the Dojo Toolkit SDK and referencing dojo.js from the SDK in your development environment.
For the projects I work on, I do both. I set up the Dojo configuration so that it is dynamic, and I can change which configuration I want using a query-string parameter.
When I am debugging a problem, I will use the first option just to let me step through code and see what is going on. I use the second option when I am writing some significant JS and don't want the overhead of the custom build just to see my changes.
I describe this a bit more at
http://swingingcode.blogspot.com/2012/03/dojo-configurations.html
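For illustration, a minimal sketch of that query-string switch (the paths and the debug parameter name here are assumptions, not from the post):

// Pick the Dojo build to load based on a ?debug=true query-string parameter.
var useSource = /[?&]debug=true/.test(window.location.search);

var dojoConfig = {
    async: true,
    isDebug: useSource
};

// Write the matching loader <script> tag before any module code runs.
document.write('<script src="' +
    (useSource
        ? 'js/dojo-sdk/dojo/dojo.js'        // uncompressed SDK source for debugging
        : 'js/dojo-built/dojo/dojo.js') +   // built/compressed layer for normal use
    '"><\/script>');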
I think the reason for this is that the loader resolves its module loads by the file-naming conventions used. The 1.7 loader is not too robust just yet; I've had similar problems until I realized how the '.' and '/' characters are separated.
It's only a qualified guess, but I believe it has to do with the interpretation of the '.' character in the module name, which is taken to signify a sub-namespace rather than being part of the module name.
A 'define(/* BLANK */ [ /* DEPENDENCIES */ ], ...)' call, where no first string parameter is given, gets its module id from the filename (basename). The returned declare also has a say, though. So, for your example with JsonRestStore, it is split/parsed as such:
toplevel = dojox
mid = data
modulename = JsonRestStore.js.uncompressed
(Fail: the module resolves as dojox.data.JsonRestStore.js.uncompressed, not dojox.data.JsonRestStore as it should.)
So, three options:
1) Load the uncompressed classes through <script src="{{dataUrl}}/dojox/data/JsonRestStore.js.uncompressed.js"></script> tags and work with them in dojo.ready.
2) Modifying the define([], function(){}) in the uncompressed file to define("JsonRestStore", [], function() {}) might do the trick (unconfirmed).
3) Use the dojo/text loader, see below.
define("my/MyRestStore",
["dojo/_base/declare", "dojo/text!dojox/data/JsonRestStore.js.uncompressed.js"],
function(declare, JsonRestStore) {
...
JsonRestStore = eval(JsonRestStore);
// not 100% sure 'define' returns reference to actual class,
// if above renders invalid, try access through global reference, such as
// dojox.dat...
I'm not sure I'm even asking the right question here, sorry, but I think the two general ones are:
In what way do you need to modify a Node.js package that uses require etc. so it can be used as a plain embedded script/library in HTML?
How do you call a class constructor (?) in JS as a function to validate a form field?
I'm trying to use this small JS library NoSwearingPlease (which is an npm package) in an environment with no node or build system – so I'm just trying to call it like you would jQuery or something with a script & src in the HTML, and then utilise it with a small inline script.
I can see a couple of things are required to get this working:
the JSON file needs to be called in a different way (not using require etc)
the checker variable needs to be rewritten, again without require
I attempted using jQuery's getJSON, but I just don't understand the class and scope bits of the library well enough to use it, I think:
var noswearlist = $.getJSON( "./noswearing-swears.json" )
.done(function() {
    console.log( "got swear list from inline script" );
})
.fail(function() {
    console.log( "failed to get swear list" );
});
noswearlist.done(function() {
    console.log( "done callback as child of noswearlist variable" );
    var checker = new NoSwearing(noswearlist);
    console.log(checker);
});
Please halp. Thanks!
No need to modify anything; when outside of Node, the class is just appended to window (the global object):
fetch("https://cdn.jsdelivr.net/gh/ThreeLetters/NoSwearingPlease#master/swears.json").then(response => {
return response.json();
}).then(data => {
var noSwearing = new NoSwearing(data);
console.log(noSwearing.check("squarehead"));
});
<script src="https://cdn.jsdelivr.net/gh/ThreeLetters/NoSwearingPlease#master/index.js"></script>
In the future, you can answer this type of question on your own by looking through the source code and looking up things you don't understand. That being said, here's what I was able to gather doing that myself.
For your first question, if you have no build tools you can't use require; you have to hope your NPM package supports adding the class to the window or has a UMD export (which in this case, it does). If so, you can download the source code or use a CDN like JSDelivr and add a <script> tag to link it.
<script src="https://cdn.jsdelivr.net/gh/ThreeLetters/NoSwearingPlease#master/index.js"></script>
I'm having a hard time deciphering your script (it has a few syntax errors as far as I can tell), so here's what you do if you have a variable ns containing the JSON and the string str that you need to check:
var checker = new NoSwearing(ns);
checker.check(str);
As an aside, you should really use build tools to optimize your bundle size and make using packages a lot easier. And consider dropping jQuery for document.querySelector, fetch/XMLHttpRequest, and other modern JavaScript APIs.
I have a SPA (in Aurelia / TypeScript but that should not matter) which uses SystemJS. Let's say it runs at http://spa:5000/app.
It sometimes loads JavaScript modules like waterservice/external.js on demand from an external URL like http://otherhost:5002/fetchmodule?moduleId=waterservice.external.js. I use SystemJS.import(url) to do this and it works fine.
But when this external module wants to import another module with a simple import { OtherClass } from './other-class'; this (understandably) does not work. When loaded by the SPA, it looks at http://spa:5000/app/other-class.js. In this case I have to intercept the path/location to redirect it to http://otherhost:5002/fetchmodule?moduleId=other-class.js.
Note: The TypeScript compilation for waterservice/external.ts works fine because the TypeScript compiler can find ./other-class.ts easily. Obviously I cannot use an absolute URL for the import.
How can I intercept the module loading inside a module I am importing with SystemJS?
One approach I already tested is to add a mapping in the SystemJS configuration. If I import it like import { OtherClass } from 'other-class'; and add a mapping like "other-class": "http://otherhost:5002/fetchmodule?moduleId=other-class" it works. But if this approach is good, how can I add mapping dynamically at runtime?
Other approaches like a generic load url interception are welcome too.
Update
I tried to intercept SystemJS as suggested by artem, like this:
var systemLoader = SystemJS;
var defaultNormalize = systemLoader.normalize;
systemLoader.normalize = function(name, parentName) {
console.error("Intercepting", name, parentName);
return defaultNormalize(name, parentName);
}
This would normally not change anything except producing some console output to see what is going on. Unfortunately it does seem to change something, as I get an error Uncaught (in promise) TypeError: this.has is not a function inside system.js.
Then I tried to add mappings with SystemJS.config({map: ...});. Surprisingly, this function works incrementally, so when I call it, it does not lose the already provided mappings. So I can do:
System.config({map: {
"other-class": `http://otherhost:5002/fetchModule?moduleId=other-class.js`
}});
This does not work with relative paths (those which start with . or ..), but if I put the shared ones in the root, it works out.
I would still prefer to intercept the loading to be able to handle more scenarios, but at the moment I have no idea which has function is missing in the approach above.
how can I add mapping dynamically at runtime?
AFAIK SystemJS can be configured at any time just by calling
SystemJS.config({ map: { additional-mappings-here ... }});
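For example, with the mapping from the question (the URL is the asker's hypothetical endpoint), you can add it after the application has started and then import through the loader as usual:

SystemJS.config({
    map: {
        'other-class': 'http://otherhost:5002/fetchmodule?moduleId=other-class.js'
    }
});

SystemJS.import('other-class').then(function (m) {
    console.log('loaded', m);
});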
If it does not work for you, you can override loader.normalize and add your own mapping from module ids to URLs there. Something along these lines:
// assuming you have one global SystemJS instance
var loader = SystemJS;
var defaultNormalize = loader.normalize;
loader.normalize = function(name, parentName) {
if (parentName == 'your-external-module' && name == 'your-external-submodule') {
return Promise.resolve('your-submodule-url');
} else {
return defaultNormalize.call(loader, name, parentName);
}
}
I have no idea if this will work with typescript or not. Also, you will have to figure out what names exactly are passed to loader.normalize in your case.
Also, if you use systemjs builder to bundle your code, you will need to add that override to the loader used by the builder (and that's a whole other story).
changing exports.X in a function seems to not work...
I want to be able to load settings from a file and access them in Node.js. I have this working currently; however, the clients connecting to my Node application can edit what's in the settings file. Unfortunately, as it stands, the Node application has to be restarted for the changes to take effect. Is there a way I can reload the module.exports on the fly?
EDIT:
Settings file is literally a JSON string.
My settings module is 'required' in almost every single file, and there are a lot of files... So reloading it on a per-file basis is out of the question. I do, however, know precisely when someone makes a change to the settings.
If you are using require to load the settings and only referencing the settings from one module, then doing something along the lines of:
delete require.cache[require.resolve(filename)];
will work for you.
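For example, a small reload helper along those lines (the settings path is illustrative):

// Drop the cached copy and re-require the file so the next caller
// sees the current contents on disk.
function reloadSettings() {
    delete require.cache[require.resolve('./settings.json')];
    return require('./settings.json');
}

// Usage: call this whenever you know the file has changed.
var settings = reloadSettings();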
If, on the other hand, multiple modules will be referencing these settings, that approach can become a bit unwieldy and open you up to unforeseen bugs. For example, if any of the modules are holding on to a reference to the required settings file, they would each need to somehow learn that the settings had changed and update their references.
To alleviate (though not completely solve) the caching issue, you can build your settings interface so that users of it must access the settings object via a function, and/or require that properties are accessed via functions. Even with this model, someone may still decide to cache a setting, causing an obscure failure later down the road.
Using the simplest approach of a single getter for the settings object would look something like this:
var settings = require('./settings.json');
// ... watch for changes and reload by invalidating node's cache
module.exports = function() { return settings; }
Usage:
var settings = require('./path/to/settings');
settings().foo;
There are several libraries that do settings. Depending on your needs, I'm partial to nconf.
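Basic nconf usage looks roughly like this (a sketch; the file path and key are made up, and see nconf's docs for how you'd handle reloading):

var nconf = require('nconf');

// Read settings from command-line arguments, then environment variables,
// then a JSON file, in that order of precedence.
nconf.argv().env().file({ file: './settings.json' });

console.log(nconf.get('database:host'));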
I'd set up a file watcher here that checks for changes of a JSON file dynamically. It is not recommended practice to change a JS script once the app is running.
Something like:
var _ = require("lodash");
var fs = require("fs");

var result = {};

fs.watch('my-settings.json', function (event, filename) {
    // Re-read the watched file itself; the filename argument can be null on some platforms.
    fs.readFile('my-settings.json', function (err, data) {
        if (err) {
            // your error catching
            return;
        }
        _.extend(result, JSON.parse(data));
    });
});

module.exports = result;
Now, this comes with lots of caveats, first that fs.watch is not always supported by all platforms.
http://nodejs.org/api/fs.html#fs_fs_watch_filename_options_listener
Second, it's really awkward to change a property like this. The expectation is generally that the exports of a module do not mutate. I'd instead recommend exposing a method whose result can change based on the state of the file, i.e. a getter for the resulting data.
Third, a file watcher can be expensive, memory-wise.
This is better code, IMHO:
var fs = require("fs");

var filename = 'my-settings.json';
var lastModified;
var mySetting;

module.exports = {
    getSettingAsync: function (callback) {
        fs.stat(filename, function (err, stat) {
            if (err) {
                // your error catching
                return;
            }
            if (stat.mtime.getTime() === lastModified) {
                // File has not changed since the last read; reuse the cached value.
                callback(mySetting);
            } else {
                fs.readFile(filename, function (err, data) {
                    if (err) {
                        // your error catching
                        return;
                    }
                    // This assumes that your data is always valid JSON.
                    mySetting = JSON.parse(data).mySetting;
                    lastModified = stat.mtime.getTime();
                    callback(mySetting);
                });
            }
        });
    }
};
In this case, we both check the file's modification time and expose the value through an async method. You could just as easily change the code to use the sync versions if need be and return the value instead of invoking a callback. This version checks when the file was last changed, which is cheaper than reading the whole file every time; it re-reads the file only if it is newer, and it saves you the need for a potentially buggy file watcher.
By the way, I've not tested this code and it may contain errors as is, but the concept is sound.
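For completeness, a sketch of the synchronous variant mentioned above (equally untested, same caveats):

var fs = require("fs");

var filename = 'my-settings.json';
var lastModified;
var mySetting;

module.exports = {
    getSettingSync: function () {
        var stat = fs.statSync(filename);
        if (stat.mtime.getTime() !== lastModified) {
            // File changed (or first call): read and cache the value.
            mySetting = JSON.parse(fs.readFileSync(filename, 'utf8')).mySetting;
            lastModified = stat.mtime.getTime();
        }
        return mySetting;
    }
};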
But perhaps the more salient question is: why not just store that value in a database?
I have an interesting concept I was working on while looking over various Stack questions on auto-loading JavaScript. I didn't want to use a third-party tool, aside from jQuery, so I thought I would roll my own. The concept I have is:
var scripts = {
'name' : 'path/to/script_dir/' // Load all scripts in this directory.
}
requireScripts(scripts); // Requires all scripts
// Call your classes, methods, objects and so on ....
The requireScripts() function would work something like:
function requireScripts(hash){
    $.each(hash, function(key, value){
        $.ajax({
            url: value,
            dataType: "script",
            async: false,
            error: function () {
                throw new Error("Could not load script " + value);
            }
        });
    });
}
Note: The above is just a concept, I don't think it will work.
The above would let you load specific scripts, so in essence your hash key/value would be 'name' : 'path/to/specific/script'. The issue this poses is that your hash would get rather large ...
The other issue I ran into is: what if I simplified this to the "PHP PEAR naming standard"? So, as the trend seems to be, we would create a class, and it would be named after its location:
var some_folder_name_class = function(){}
Would be translated by the autoloader as: some/folder/name/class.js and then loaded that way.
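A rough sketch of that translation, just to illustrate the idea (the class name is the example above, and the loaded script is assumed to define a global with the same name):

// Turn a PEAR-style class name into a path and load the script synchronously.
function loadClass(className) {
    var path = className.split('_').join('/') + '.js'; // some_folder_name_class -> some/folder/name/class.js
    $.ajax({
        url: path,
        dataType: 'script',
        async: false,
        error: function () {
            throw new Error('Could not load script ' + path);
        }
    });
    return window[className];
}

// Usage:
var SomeClass = loadClass('some_folder_name_class');
var instance = new SomeClass();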
To wrap up and get to my point: there are two ways of loading JavaScript files I am looking at, via rolling my own "require" method. One is loading a directory of JavaScript files via the hash idea implemented above (the provided code sample of how this hash would be walked through would have to be changed and fixed... I don't think it works to even load a single file).
OR
to have you just do:
new some_class_name(), and have a global function listen for the new keyword, find the file you're trying to call based on the name of the class, and load it. This way you never have to worry: as long as you follow PEAR naming standards in both class and folder structure, your JS file will be loaded.
Can either approach be done, or am I dreaming too big?
I see a lot of frameworks do a bunch of require('/path/to/script'), and if I could roll my own autoloader to just let me either load a directory of JS files or listen for new before a class instantiation, then I could make my life SO MUCH easier.
Have you considered using RequireJS and possibly lazy loading?
http://www.joezimjs.com/javascript/lazy-loading-javascript-with-requirejs/
Here is a sample version:
1. You can download here.
2. The sample is based on this folder structure:
public
    index.html
    scripts
        app.js
        lib
            jquery-1.10.2.js
            require.js
3. From the code:
The HTML (index.html):
<!DOCTYPE html>
<html>
<head>
    <title>Sample Test</title>
    <script src="scripts/lib/require.js"></script> <!-- downloaded from link provided above -->
    <script src="scripts/app.js"></script>
</head>
<body>
    <h1>My Sample Project</h1>
    <div id="someDiv"></div>
</body>
</html>
The application configuration (app.js):
requirejs.config({
baseUrl: 'scripts',
paths: {
app: 'app',
jquery: 'lib/jquery-1.10.2' //your libraries/modules definitions
}
});
// Start the main app logic. loading jquery module
require(['jquery'], function ($) {
$(document).on('ready',function(){
$('#someDiv').html('Hello World');
});
});
jQuery-only option
If you are looking for a jQuery-only solution, have a look at jQuery.getScript(). It would be a great candidate for handling the script loading portion of your problem. You could then write a very small wrapper around it to load all the scripts—something like you wrote above:
var loadScripts = function(scripts) {
    $.each(scripts, function(name, path) {
        jQuery.getScript("/root/path/" + path + ".js");
    });
};
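If you need to know when all of the scripts have finished loading, jQuery.getScript() returns a promise, so the wrapper can collect and combine them (a sketch, assuming the same scripts object shape as above):

var loadScripts = function(scripts) {
    // Kick off one request per entry and collect the returned promises.
    var requests = $.map(scripts, function(path, name) {
        return jQuery.getScript("/root/path/" + path + ".js");
    });
    // Resolves once every script has loaded.
    return $.when.apply($, requests);
};

// Usage:
loadScripts(scripts).done(function() {
    // All scripts are available here; safe to use their classes now.
});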
If you are interested in more information on this approach, read this article by David Walsh.
Other great libraries
I strongly recommend taking a look at the current batch of script-loading libraries. I think that you will be pleasantly surprised by what is out there. Plus, they come with the benefit of great community support and documentation. RequireJS seems to be the front runner, but David Walsh has great articles on curl.js and LABjs.
I've noticed that RequireJS creates script tags in the <head> as it loads modules.
Is there any way to configure RequireJS to "tag" those elements with a class or an attribute of some kind that I could later target with jQuery?
e.g.:
var $requireJsScripts = $('script.require-script');
--UPDATE--
Ok.. I think I can get by on this little workaround for now. Thanks to this answer for the breadcrumb on require.s.contexts._.defined. I'd still like to hear if anyone knows of a way to configure RequireJS to do something similar to what was laid out in the original question...
var loadedRjsModules = Object.keys(require.s.contexts._.defined);
var $scripts = $('script');

$scripts.each(function () {
    // $.inArray returns an index (-1 when missing), so compare explicitly.
    if ($(this).data('requiremodule') && $.inArray($(this).data('requiremodule'), loadedRjsModules) !== -1) {
        console.log(this);
    }
});
Looking at the source code, I don't see how RequireJS would allow adding anything custom to the script nodes at creation. The routine that creates them has no provision for it. The code that fleshes them out upon creation does not support it either.
There's an onResourceLoad hook considered part of the internal API. It could be used with the code you've put in your question instead of relying on require.s.contexts._.defined, which as far as I know is fully private and subject to change without notice.
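A sketch of what that could look like (using the internal onResourceLoad hook, so it may change between versions; the class name is the one from the question):

requirejs.onResourceLoad = function (context, map) {
    // RequireJS stamps each script node it creates with a data-requiremodule attribute.
    var node = document.querySelector('script[data-requiremodule="' + map.name + '"]');
    if (node) {
        node.className += ' require-script';
    }
};

// Later, the selector from the question works:
var $requireJsScripts = $('script.require-script');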