I'm wondering if there's another way of using templates in knockout.js without having to use require.js to load them dynamically.
It adds around 20KB (after minification) to the site, and it seems like quite a big library to load for something that probably doesn't need that much code behind it.
This is what I'm doing now:
ko.components.register('menu', {
    viewModel: { instance: mm.viewModel },
    template: { require: 'text!views/menu.html' }
});
To do so I had to include require.js in my project and require the text plugin:
<script type="text/javascript">
    requirejs.config({
        paths: {
            text: 'bower_components/text/text'
        },
        urlArgs: "v=" + new Date().valueOf()
    });
</script>
I ended up getting the file from the server side with my own call.
In node (but this can be done in PHP or any other language as well), I added a route to retrieve the requested file:
router.get('/loadFile/', function(req, res, next){
    var params = req.query;
    // Note: params.filename should be validated/sanitized here to prevent path traversal
    fs.readFile(__dirname + '/../public/elements/' + params.filename, 'utf-8', function read(err, data) {
        if (err) {
            // pass the error to Express instead of throwing inside the async callback
            return next(err);
        }
        return res.send(data);
    });
});
Then I created my own custom component loader on the JavaScript side, as detailed in the docs.
var templateFromUrlLoader = {
    loadTemplate: function(name, templateConfig, callback) {
        if (templateConfig.filename) {
            var newUrl = url + 'others/loadFile/'; // url is the app's base URL
            var params = { 'filename': templateConfig.filename };
            // Uses jQuery's ajax facility to load the markup from a file
            $.get(newUrl, params, function(markupString) {
                // We need an array of DOM nodes, not a string.
                // We can use the default loader to convert to the
                // required format.
                ko.components.defaultLoader.loadTemplate(name, markupString, callback);
            });
        } else {
            // Unrecognized config format. Let another loader handle it.
            callback(null);
        }
    }
};
// Registering it
ko.components.loaders.unshift(templateFromUrlLoader);
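With the loader registered, a component can then be registered using the filename-based config the loader understands (a sketch; the template name is illustrative):

// The custom loader picks this up via templateConfig.filename
ko.components.register('menu', {
    viewModel: { instance: mm.viewModel },
    template: { filename: 'menu.html' }
});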
This way I saved myself from having to load 84KB of require.js for this simple task.
Plus, this way I'm not tied to require.js, and I can serve a single combined and minified file in production environments.
Also, I'm in total control over the caching of the returned templates, which used to cause me problems with require.js.
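Since the loader owns the fetch, caching becomes simple too; for instance, the converted nodes can be memoized so each template file is requested only once. A minimal sketch building on the loader above (the cache itself is my own addition):

var templateCache = {}; // component name -> array of DOM nodes

var cachingTemplateFromUrlLoader = {
    loadTemplate: function(name, templateConfig, callback) {
        if (!templateConfig.filename) {
            callback(null); // unrecognized config, let another loader handle it
            return;
        }
        if (templateCache[name]) {
            callback(templateCache[name]); // serve from cache, no request made
            return;
        }
        $.get(url + 'others/loadFile/', { filename: templateConfig.filename }, function(markupString) {
            ko.components.defaultLoader.loadTemplate(name, markupString, function(domNodes) {
                templateCache[name] = domNodes;
                callback(domNodes);
            });
        });
    }
};

ko.components.loaders.unshift(cachingTemplateFromUrlLoader);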
We used to use require.js with knockout, but we have switched to browserify. Since then the code base is much nicer, and we build the whole project into one file, except for the basic libraries we use (e.g. knockout.js), which we load separately from a CDN; that makes the app much, much faster in production.
Here is a component library that we are developing:
https://github.com/EDMdesigner/knobjs
We use gulp to build the project. Check the build:dev task in the gulpfile. Basically, the templates will be included in the built js file.
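To illustrate the idea (a generic sketch, not knobjs's actual build): with browserify plus a transform like brfs, a fs.readFileSync call is replaced at build time with the file's literal contents, so the template markup ends up as a string inside the bundle:

// component.js -- template inlined at build time by the brfs transform
// (file and component names are illustrative)
var fs = require('fs');
var ko = require('knockout');

// brfs replaces this call with the literal contents of menu.html
var template = fs.readFileSync(__dirname + '/menu.html', 'utf8');

ko.components.register('menu', {
    viewModel: function (params) { /* ... */ },
    template: template
});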
Related
I'm using the requirejs-babel plugin which requires prepending 'es6!' to all module ids that need babel transpilation.
define(['es6!some-es6-module'], function(module) {
// ...
});
Is there an API in RequireJS that would allow me to inspect a module id and prepend the plugin id as-needed? For example, if I wanted to apply 'es6!' to all module ids in a specific directory?
Ultimately I need to be able to write defines like this define(['some-es6-module'], ...) and automatically add the es6! prefix depending on what the module id is.
Not looking for information on SystemJS or gulp tasks that do the transpilation ahead of time, etc.
The exact module ids are not known at configuration time- I just know in certain locations/directories, modules will need es6!.
Needs to work in the browser, at runtime
I am not 100% sure of your overall objective (do you want the es6 addition to the module ID saved permanently, or always auto-added?), but you may be able to use RequireJS mapping to substitute module IDs for defined modules. For example:
requirejs.config({
    map: {
        // '*' - for all modules that require these, do this
        '*': {
            'some-es6-module': 'es6!some-es6-module'
        }
    }
});
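If the set of modules that need transpiling is known up front, the map block could also be generated instead of hand-written; a hypothetical sketch (module names invented for illustration):

var es6Modules = ['some-es6-module', 'another-es6-module'];
var map = { '*': {} };
es6Modules.forEach(function (id) {
    map['*'][id] = 'es6!' + id; // route each listed module through the plugin
});
requirejs.config({ map: map });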
However, considering your use-case you may need something more complicated than this, as mapping assumes you have actual different versions of files and is generally used for this purpose.
A more complicated solution I assume you are looking to avoid could be to dynamically loop your files before optimising them in r.js and loading/editing them via Node. It would get a little messy!
var config = requirejs.s.contexts._.config;
var needBabel = ['some-es6-module', 'another-module-name', 'another'];
for (var property in config.paths) {
    if (config.paths.hasOwnProperty(property) && needBabel.indexOf(property) > -1) {
        // load the module in node
        // fs.readFileSync(__dirname + config.paths[property] + '.js');
        // dynamically modify this file with text replacement
        // save this file via Node again
    }
}
// run Require JS optimiser
// undo everything you've just done when optimisation is complete
I ended up overriding the load method. The override uses the standard load for modules with mapped paths, otherwise it uses the es6 (requirejs-babel) plugin to load the module.
require.standardLoad = require.load;
require.load = function(context, moduleName, url) {
    var config = requirejs.s.contexts._.config;
    if (moduleName in config.paths) {
        return require.standardLoad(context, moduleName, url);
    }
    require(['es6'], function(es6) {
        es6.load(
            moduleName,
            require,
            {
                fromText: function(text) {
                    require.exec(text);
                    context.completeLoad(moduleName);
                }
            },
            {});
    });
};
Here it is in action: https://gist.run/?id=7542e061bc940cde506b
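For clarity, the override keys off the paths config: anything listed there loads through the standard loader, and everything else goes through the plugin. An illustrative config (the paths are assumptions, not taken from the gist):

requirejs.config({
    paths: {
        // the requirejs-babel plugin and the transpiler it depends on
        es6: 'lib/requirejs-babel/es6',
        babel: 'lib/requirejs-babel/babel',
        // third-party libraries that should not be transpiled
        jquery: 'lib/jquery/jquery'
    }
});

// 'jquery' has a paths entry, so it uses the standard load;
// 'app/some-es6-module' does not, so it is transpiled at runtime:
require(['app/some-es6-module'], function (mod) { /* ... */ });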
I managed to accomplish my task using a gulp plugin called gulp-insert like this:
gulp.task('compile-js', function () {
    // Minify and bundle client scripts.
    var scripts = gulp.src([
            srcDir + '/routes/**/*.js',
            srcDir + '/shared/js/**/*.js'
        ])
        // Sort angular files so the module definition appears
        // first in the bundle.
        .pipe(gulpAngularFilesort())
        // Add angular dependency injection annotations before
        // minifying the bundle.
        .pipe(gulpNgAnnotate())
        // Begin building source maps for easy debugging of the
        // bundled code.
        .pipe(gulpSourcemaps.init())
        .pipe(gulpConcat('bundle.js'))
        // Buffer the bundle.js file and replace the appConfig
        // placeholder string with a stringified config object.
        .pipe(gulpInsert.transform(function (contents) {
            return contents.replace("'{{{appConfigObj}}}'", JSON.stringify(config));
        }))
        .pipe(gulpUglify())
        // Finish off sourcemap tracking and write the map to the
        // bottom of the bundle file.
        .pipe(gulpSourcemaps.write())
        .pipe(gulp.dest(buildDir + '/shared/js'));
    return scripts.pipe(gulpLivereload());
});
What I'm doing is reading our app's configuration file which is managed by the config module on npm. Getting our config file from server-side code is a snap using var config = require('config');, but we're a single-page app and frequently need access to the configuration settings on the client-side. To do that I stuff the config object into an Angular service.
Here's the Angular service before gulp build.
angular.module('app')
    .factory('appConfig', function () {
        return '{{{appConfigObj}}}';
    });
The placeholder is in a string so that it's valid JavaScript for some of the other gulp plugins that process the file first. The gulpInsert utility lets me insert the config like this.
.pipe(gulpInsert.transform(function (contents) {
return contents.replace("'{{{appConfigObj}}}'", JSON.stringify(config));
}))
This works but feels a little hacky. Not to mention that it has to buffer the whole bundled file just so I can perform the operation. Is there a more elegant way to accomplish the same thing? Preferably one that allows the stream to keep flowing smoothly without buffering the whole bundle at the end? Thanks!
Have you checked gulp-replace-task?
Something like
[...]
.pipe(gulpSourcemaps.init())
.pipe(replace({
    patterns: [{
        match: '{{{appConfigObj}}}',
        replacement: config
    }],
    usePrefix: false
}))
.pipe(gulpUglify())
[...]
Admittedly, this feels a bit hacky, too, but maybe slightly better...

I'm using envify and gulp-env in a React project. You could do something like this.
gulpfile.js:
var config = require('config');
var envify = require('envify');
var env = require('gulp-env'); // gulp-env provides the env() helper used below

gulp.task('env', function () {
    env({
        vars: {
            APP_CONFIG: JSON.stringify(config)
        }
    });
});

gulp.task('compile-js', ['env'], function () {
    // ... replace `gulp-insert` with `envify`
});
factory:
angular.module('app')
    .factory('appConfig', function () {
        // envify inlines APP_CONFIG as a JSON string at build time,
        // so parse it back into an object
        return JSON.parse(process.env.APP_CONFIG);
    });
While working on a Web app that uses Webpack to manage JavaScript dependencies, I stumbled upon the problem I'm going to describe.
Loading dependencies passing strings to require() works beautifully:
// main.js
var jQuery = require('jquery');
Here, jquery is installed with Bower, and Webpack is correctly configured to automatically resolve Bower modules.
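For context, "correctly configured" here means something along these lines (a webpack 1.x-era sketch matching the Bower setup described; the exact config is an assumption):

// webpack.config.js -- a sketch of Bower resolution for webpack 1.x
var webpack = require('webpack');

module.exports = {
    entry: './main.js',
    output: { filename: 'bundle.js' },
    resolve: {
        // look in bower_components as well as node_modules
        modulesDirectories: ['bower_components', 'node_modules']
    },
    plugins: [
        // read the "main" field of each package's bower.json
        new webpack.ResolverPlugin(
            new webpack.ResolverPlugin.DirectoryDescriptionFilePlugin('bower.json', ['main'])
        )
    ]
};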
Now, I'm working on the problem of conditionally loading modules, with particular regard to the situation where modules have to be downloaded from a CDN, or from the local server if the CDN fails. I use scriptjs to asynchronously load from the CDN, by the way. The code I'm writing is something like this:
var jQuery = undefined;
try {
    jQuery = require('jquery-cdn');
} catch (e) {
    console.log('Unable to load jQuery from CDN. Loading local version...');
    require('script!jquery');
    jQuery = window.jQuery;
}
// jQuery available here
and this code works beautifully as well.
Now, since I obviously have a lot of dependencies (Handlebars, Ember, etc.) that I want to try to load from a CDN first, this code starts to get a little redundant, so the most logical thing I try to do is to refactor it out into a function:
function loadModule(module, object) {
    var lib = undefined;
    try {
        lib = require(module + '-cdn');
    } catch (e) {
        console.log('Cannot load ' + object + ' from CDN. Loading local version...');
        require('script!' + module);
        lib = window[object];
    }
    return lib;
}
var jQuery = loadModule('jquery', 'jQuery');
var Handlebars = loadModule('handlebars', 'Handlebars');
// etc...
The problem is that Webpack behaves in a particular way when dealing with expressions inside require statements, which hinders my attempts to load modules as described above. In particular, when using an expression inside require, it "tries to include all files that are possible with your expression".
The net effect is a huge pile of error messages when I try to run Webpack with the above code.
Though the linked resources suggest explicitly declaring the paths of the JavaScript files to include, what I fail to get is how to do the same thing when I cannot, or don't want to, pass a precise path to require, but rather want to use the automatically resolved modules, as shown.
Thanks all
EDIT:
I still don't know how to use expressions to load those scripts; however, I designed a workaround. Basically, the idea is to explicitly write each require('script') call inside a callback function, and then dynamically call that function when it's time. More precisely, I prepared a configuration file like this:
// config.js
'use strict';

module.exports = {
    'lib': {
        'jquery': {
            'object': 'jQuery',
            'dev': function() { require('script!jquery'); },
            'dist': function() { return require('jquery-cdn'); },
            'cdn': '//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js'
        },
        'handlebars': {
            // ...
        }
    }
};
Inside my main code, I then define an array of resources to load, like:
var config = require('./config.js');
var resources = [ config.lib.jquery, config.lib.handlebars, ... ];
And then, when I have to load the development version or the distribution version, I dynamically call:
// Inside some kind of cycle
// resource = resources[index]
try {
    window[resource.object] = resource.dist();
} catch (e) {
    console.log('Cannot load ' + resource.object + ' from CDN. Loading local version...');
    resource.dev();
}
Here's a more complete example of this in action.
Using RequireJS I'm building an app which make extensive use of widgets. For each widget I have at least 3 separate files:
request.js containing code for setting up request/response handlers to request a widget in another part of my application
controller.js containing handling between model and view
view.js containing handling between user and controller
Module definition in request.js:
define(['common/view/widget/entity/term/list/table/controller'],
function(WidgetController) { ... });
Module definition in controller.js:
define(['common/view/widget/entity/term/list/table/view'],
function(WidgetView) { ... });
Module definition of view.js is:
define(['module','require'],function(module,require) {
'use strict';
var WidgetView = <constructor definition>;
return WidgetView;
});
I have lots of little situations like the above in the widgets I have developed. What I dislike is having to use the full path every time a module requires another module located in the same folder. I'd like to simply specify the following (assuming we have a RequireJS plugin which solves this for us):
define(['currentfolder!controller'],
function(WidgetController) { ... });
For this, I have written a small plugin, as I couldn't find it on the web:
define({
    load: function (name, parentRequire, onload, config) {
        var path = parentRequire.toUrl('.').substring(config.baseUrl.length) + '/' + name;
        parentRequire([path], function (value) {
            onload(value);
        });
    }
});
As you might notice, in its basic form it looks like the example of the RequireJS plugins documentation.
Now in some cases, the above works fine (e.g. from the request.js to the controller.js), but in other cases a load timeout occurs (from controller.js to view.js). When I look at the paths which are generated, all are proper RequireJS paths. Looking at the load timeouts, the following is logged:
Timestamp: 13-09-13 17:27:10
Error: Error: Load timeout for modules: currentfolder!view_unnormalized2,currentfolder!view
http://requirejs.org/docs/errors.html#timeout
Source File: http://localhost/app/vendor/requirejs/require.js?msv15z
Line: 159
The above log was from a test in which I loaded only view.js from controller.js, using currentfolder!view in the define statement's list of modules. Since I only requested currentfolder!view once, I'm confused as to why the message shows both currentfolder!view_unnormalized2 and currentfolder!view.
Any idea as to why this might be happening?
My answer may not answer your primary questions, but it will help you achieve what you're trying to do with your plugin.
In fact, Require.js supports relative paths for requiring modules when using the CommonJS style. Like so:
define(function( require, exports, module ) {
    var relativeModule = require("./subfolder/module");

    module.exports = function() {
        console.log( relativeModule );
    };
});
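Applied to the widget layout from the question, controller.js could then require its sibling view without the full path; a sketch:

// controller.js -- sibling module resolved relative to this file
define(function(require) {
    'use strict';
    // './view' resolves against this module's own folder, so the long
    // 'common/view/widget/entity/term/list/table/' prefix is not needed
    var WidgetView = require('./view');

    var WidgetController = function() { /* ... */ };
    return WidgetController;
});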
Is there a way in a Node.js Jake build to wait until a certain file has been copied, and to advance to some operation only after the destination file can be found? I think this question pretty much comes down to "is there a way to copy files synchronously in Node.js/Jake?" (Perhaps something other than writing it from scratch using a combination of fs.readSync and fs.writeSync.)
Background:
I'm developing a web app that is run on Node.js (with Express) during development, but will be deployed on a Java server in production. (We use Jade and Stylus in the client and Express enables us to run the app without generating all the HTML files etc. and deploying it after every change.)
I use Jake for making the build, i.e. generating HTML files from Jade files and CSS from Stylus files etc. Now I'm also trying to concatenate all of the app's JavaScript files into one minimized file and change all the HTML files to use that instead of all the separate JS files that are used in "raw" form during development.
However, I now have a problem with that last step. My idea was to copy all of my Jade files into a temporary directory for the deployment build and replace the reference (in a Jade file used as a header on all HTML pages) to a list of all separate JS files to the one that has just been generated by concatenating and minimizing the whole bunch. But as I first copy all of the Jade files to another location (which happens asynchronously) and try to edit one of the files, opening the file always fails since the copy operation hasn't really finished yet.
This is what I have now (in a simplified form) in my jakefile:
var fs = require('fs');
var fse = require('fs-extra');
var path = require('path');
var glob = require('glob');
var Snockets = require('snockets');
var snockets = new Snockets();
// generating the minimized JS file
snockets.getConcatenation(baseDir + '/scripts/all.js', { minify: true }, function(err, allJs) {
    if (err) {
        throw err;
    }
    fs.writeFileSync(generatedJsFileName, allJs);
});

// copying all the Jade files to a temp dir
glob.sync('**/*.*', {
    cwd: srcDir
}).forEach(function(file) {
    var loadPath = srcDir + '/' + file;
    var savePath = targetDir + '/' + file;
    fse.mkdirsSync(path.dirname(savePath));
    fse.copy(loadPath, savePath); // asynchronous -- this is where the timing problem starts
});

// trying to read one of the copied files (which fails, since the file cannot be found yet)
fs.readFile(targetDir + '/views/includes/head.jade', 'utf8', function(err, data) {
    ...
});
This might be a stupid question, and a stupid way to try to solve the problem in the first place. So, also suggestions for a better approach are very welcome.
Update:
I also tried using Parseq, putting each operation (creating the JS file, copying the Jade files, reading one file) in its own function, but even that gives me the same error. If I run the script several times without deleting the target directory of the copy operation in between, the file can be found. So e.g. the path is correct and the problem really seems to be about timing.
I didn't really find an answer to the main question so I don't know if this helps anyone else facing the same problem. But I did find a way to get around the problem.
I ended up using the same original Jade files for the two different conversions, but in the second conversion I use a custom JS function to change the script tag reference to point to the minified file.
I.e.
var data = jade.compile(str, { filename: file, pretty: true })({
    css: function(path) {
        return '<link rel="stylesheet" href="/styles/' + path + '.css" />';
    },
    js: function(path) {
        var name = '<script src="/scripts/';
        if (path == 'all') {
            name += generatedJsFileName;
        }
        else {
            name += path + '.js';
        }
        name += '"></script>';
        return name;
    }
});
It might not be the prettiest workaround but it works.
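For what it's worth, fs-extra also ships a synchronous copy (in versions that provide copySync), which would address the original timing problem directly; a sketch of the copy loop from the question rewritten with it:

// copying the Jade files synchronously, so they exist before the next step runs
glob.sync('**/*.*', { cwd: srcDir }).forEach(function(file) {
    var loadPath = srcDir + '/' + file;
    var savePath = targetDir + '/' + file;
    fse.mkdirsSync(path.dirname(savePath));
    fse.copySync(loadPath, savePath); // blocks until the file is fully written
});

// targetDir is now fully populated, so a synchronous read right after is safe
var data = fs.readFileSync(targetDir + '/views/includes/head.jade', 'utf8');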