When Parcel compiles my code, it builds a JS file that starts like this:
"use strict";
// modules are defined as an array
// [ module function, map of requires ]
//
// map of requires is short require name -> numeric require
//
// anything defined in a previous bundle is accessed via the
// orig method which is the require for previous bundles
require = function (_require) {
  function require(_x, _x2, _x3) {
    return _require.apply(this, arguments);
  }
  require.toString = function () {
    return _require.toString();
  };
  return require;
}(function (modules, cache, entry) {
...
That code can't run, because it assigns to the variable require without declaring it with var/let/const, which is not allowed in "use strict" mode.
So my question is: why does it generate broken code like that?
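For context, the failure can be reproduced in isolation; this is a minimal sketch, not Parcel's actual output:
'use strict';
// In a plain browser script no `require` binding exists, so this assigns to an
// undeclared identifier, which strict mode turns into a runtime error:
require = function () {}; // ReferenceError: require is not defined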
So, in this thread, they say that upgrading Node to 8.0.0 or above solves the problem.
In a funny way, BTW... it magically removes the "use strict" from the generated file.
I'm configuring Grunt with grunt-contrib-concat to concatenate about 20 JavaScript files. They have to be in a specific order, and I'm wondering if there is a neat way to do this without cluttering my Gruntfile.js.
What I did, and what worked well, was declaring a variable called 'libraries' that holds an array with all the files in the right order.
var libraries = new (function () {
  return [
    '/javascript/libs/jquery.min.js',
    '/javascript/libs/jquery.address.js',
    '/javascript/libs/jquery.console.js'
  ];
});
And then concat (simplified, just an example):
concat: {
  libs: {
    files: {
      'libs.js': [libraries],
    },
  },
  main: {
    files: {
      'main.js': [main]
    }
  }
},
So when I call 'libraries' in my task configuration everything works fine, but I would like to declare this list in a separate file.
Unfortunately I couldn't find anything, nor do I know if this is even possible. Hope that someone could help me out! Thanks in advance :-)
I found a solution! Since Grunt is built on Node.js, it's possible to use module.exports. What I did was set up an external file called libraries.js, which lives in my Grunt directory.
var exports = module.exports = {};

exports.customLibrary = function () {
  return [
    // Path to a library
    // Path to another library
    // and so on...
  ];
};

exports.mainScripts = function () {
  return [
    // Path to a library
    // Path to another library
    // and so on...
  ];
};
Then I import this module by declaring a variable in Gruntfile.js
var libraries = require('../javascript/libraries.js');
To use the methods declared in libraries.js, I set two more variables which return arrays with all the necessary files in the desired order:
var customLibrary = libraries.customLibrary();
var mainScripts = libraries.mainScripts();
I use these variables to define the source in the concat task. Hope this is helpful!
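For reference, the full wiring could look roughly like this (a sketch only; paths, task names, and destinations are illustrative, not my exact Gruntfile):
// Gruntfile.js (sketch)
module.exports = function (grunt) {
  var libraries = require('../javascript/libraries.js');
  var customLibrary = libraries.customLibrary();
  var mainScripts = libraries.mainScripts();

  grunt.initConfig({
    concat: {
      libs: {
        files: { 'libs.js': customLibrary }
      },
      main: {
        files: { 'main.js': mainScripts }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.registerTask('default', ['concat']);
};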
While working on a web app that uses Webpack to manage JavaScript dependencies, I stumbled upon the problem I'm going to describe.
Loading dependencies passing strings to require() works beautifully:
// main.js
var jQuery = require('jquery');
Here, jquery is installed with Bower, and Webpack is correctly configured to automatically resolve Bower modules.
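For reference, that resolution is typically set up roughly like this (a sketch of a webpack 1.x config, which matches the script! loader syntax used below; not the actual config of this project):
// webpack.config.js (sketch)
var path = require('path');
var webpack = require('webpack');

module.exports = {
  entry: './main.js',
  output: {
    path: path.join(__dirname, 'build'),
    filename: 'bundle.js'
  },
  resolve: {
    // Look for required modules in bower_components as well as node_modules.
    modulesDirectories: ['bower_components', 'node_modules']
  },
  plugins: [
    // Use the "main" field of each package's bower.json to resolve its entry file.
    new webpack.ResolverPlugin(
      new webpack.ResolverPlugin.DirectoryDescriptionFilePlugin('bower.json', ['main'])
    )
  ]
};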
Now, I'm working on the problem of conditionally loading modules, with particular regard to the situation where modules have to be downloaded from a CDN, or from the local server if the CDN fails. I use scriptjs to asynchronously load from the CDN, by the way. The code I'm writing is something like this:
var jQuery = undefined;
try {
  jQuery = require('jquery-cdn');
} catch (e) {
  console.log('Unable to load jQuery from CDN. Loading local version...');
  require('script!jquery');
  jQuery = window.jQuery;
}
// jQuery available here
and this code works beautifully as well.
Now, since I obviously have a lot of dependencies (Handlebars, Ember, etc.) that I want to try to load from a CDN first, this code starts to get a little redundant, so the most logical thing I tried was to refactor it into a function:
function loadModule(module, object) {
  var lib = undefined;
  try {
    lib = require(module + '-cdn');
  } catch (e) {
    console.log('Cannot load ' + object + ' from CDN. Loading local version...');
    require('script!' + module);
    lib = window[object];
  }
  return lib;
}
var jQuery = loadModule('jquery', 'jQuery');
var Handlebars = loadModule('handlebars', 'Handlebars');
// etc...
The problem is that Webpack behaves in a particular way when dealing with expressions inside require statements, which hinders my attempt to load modules as described above. In particular, when using an expression inside require, it
tries to include all files that are possible with your expression
The net effect is a huge pile of error messages when I try to run Webpack with the above code.
Though the linked resources suggest explicitly declaring the path of the JavaScript files to include, what I fail to get is how to do the same when I cannot, or don't want to, pass a precise path to require, and instead want to use the automatically resolved modules, as shown.
Thanks all
EDIT:
I still don't know how to use expressions to load those scripts; however, I designed a workaround. Basically, the idea is to explicitly write the require('script') call inside a callback function, and then dynamically call that function when it's time. More precisely, I prepared a configuration file like this:
// config.js
'use strict';
module.exports = {
  'lib': {
    'jquery': {
      'object': 'jQuery',
      'dev': function() { require('script!jquery'); },
      'dist': function() { return require('jquery-cdn'); },
      'cdn': '//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js'
    },
    'handlebars': {
      // ...
    }
  }
};
Inside my main code I then define an array of resources to load, like:
var config = require('./config.js');
var resources = [ config.lib.jquery, config.lib.handlebars, ... ];
And then, when I have to load the development version or the distribution version, I dynamically call:
// Inside some kind of cycle
// resource = resources[index]
try {
  window[resource.object] = resource.dist();
} catch (e) {
  console.log('Cannot load ' + resource.object + ' from CDN. Loading local version...');
  resource.dev();
}
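Put together, the cycle would look roughly like this (the forEach wrapper is just a sketch; the try/catch body is the same as above):
resources.forEach(function (resource) {
  try {
    // Prefer the CDN build...
    window[resource.object] = resource.dist();
  } catch (e) {
    console.log('Cannot load ' + resource.object + ' from CDN. Loading local version...');
    // ...and fall back to the locally bundled copy.
    resource.dev();
  }
});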
Here is a more complete example of this in action.
I've installed node-qunit (stable) from npm, but can't seem to get any tests working. My source files don't seem to be included in scope.
./source/myscript.js:
var myObj = {
  a: true
}
./test/tests.js:
test("that a is true", function () {
  ok(myObj.a);
});
./test/runner.js:
var runner = require('qunit');
runner.run({
  code: './source/myscript.js',
  tests: './test/tests.js'
});
./Makefile:
test:
<tab>node ./test/runner.js
.PHONY: install test
If I run make test, I get a 'ReferenceError: myObj is not defined' error. The source file does run, because it can throw errors. It just doesn't seem to be included in the global scope as it should. It doesn't work if I do it from the command line, as per the instructions in the node-qunit readme. Anyone have any idea how to get this working?
You're not exporting anything. Behind the scenes, node-qunit uses require to load the specified modules. To expose variables when a module is required, you have to add them to the exports object (or assign your own object to module.exports).
(There's also a syntax error: a stray ; in the object literal.)
This works for me:
./source/myscript.js:
exports.myObj = {
  a: true
}
./test/tests.js:
QUnit.module('tests')
test("that a is true", function () {
  ok(myObj.a)
})
./test/runner.js:
var runner = require('qunit')
runner.run({
  code: './source/myscript.js',
  tests: './test/tests.js'
})
I am currently using requirejs to manage module js/css dependencies.
I'd like to discover the possibilities of having node do this via a centralized config file.
So instead of manually doing something like
define([
  'jquery',
  'lib/somelib',
  'views/someview'
], function ($, somelib, someview) { /* ... */ });
within each module.
I'd have node inject the dependencies, i.e.
require('moduleA').setDeps('jquery','lib/somelib','views/someview')
Anyway, I'm interested in any projects looking at dependency injection for node.
thanks
I've come up with a solution for dependency injection. It's called injectr, and it uses node's vm library and replaces the default functionality of require when including a file.
So in your tests, instead of require('libToTest'), use injectr('libToTest', { 'libToMock': myMock });. I wanted to make the interface as straightforward as possible, with no need to alter the code being tested. I think it works quite well.
It's just worth noting that injectr paths are relative to the working directory, unlike require, which is relative to the current file; but that shouldn't matter, because it's only used in tests.
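A small usage sketch, reusing the names from the example above (the stub's shape is made up for illustration):
// test sketch
var injectr = require('injectr');

// A hand-written stub that stands in for libToMock while libToTest loads.
var myMock = {
  doWork: function () { return 'stubbed'; }
};

// The path is resolved relative to the working directory, as noted above.
var libToTest = injectr('libToTest', { 'libToMock': myMock });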
I've previously toyed with the idea of providing an alternate require to make a form of dependency injection available in Node.js.
Module code
For example, suppose you have following statements in code.js:
var fs = require('fs');
console.log(fs.readFileSync('text.txt', 'utf-8'));
If you run this code with node code.js, then it will print out the contents of text.txt.
Injector code
However, suppose you have a test module that wants to abstract away the file system.
Your test file test.js could then look like this:
var origRequire = global.require;
global.require = dependencyLookup;
require('./code.js');
function dependencyLookup (file) {
  switch (file) {
    case 'fs': return { readFileSync: function () { return "test contents"; } };
    default: return origRequire(file);
  }
}
If you now run node test.js, it will print out "test contents", even though it includes code.js.
I've also written a module to accomplish this; it's called rewire. Just run npm install rewire and then:
var rewire = require("rewire"),
myModule = rewire("./path/to/myModule.js"); // exactly like require()
// Your module will now export a special setter and getter for private variables.
myModule.__set__("myPrivateVar", 123);
myModule.__get__("myPrivateVar"); // = 123
// This allows you to mock almost everything within the module e.g. the fs-module.
// Just pass the variable name as first parameter and your mock as second.
myModule.__set__("fs", {
readFile: function (path, encoding, cb) {
cb(null, "Success!");
}
});
myModule.readSomethingFromFileSystem(function (err, data) {
console.log(data); // = Success!
});
I was inspired by Nathan MacInnes's injectr but took a different approach. I don't use vm to eval the test module; in fact, I use Node's own require. This way your module behaves exactly as if it were loaded with require() (apart from your modifications). Debugging is also fully supported.
I'm using modulr for using commonjs modules in the browser.
The goal is to be able to reuse some of those modules also in a server environment.
These "shared" modules need to do something like this:
var _ = _ || require("underscore");
meaning:
if _ exists as a global var (browser environment), use it
else load the "underscore" module (server), and use it instead
Now, since modulr does static analysis of all the code, looking for require calls in order to generate the final JS file, it will fail the build.
Is there a way to work around this problem?
(For example, if modulr supported something like --ignore=<module_list> parameter, everything would run fine.)
Apparently there's no way to fix this in modulr, so I had to create a workaround module named Env which looks like this:
// Env.js
var my = {
  modules: undefined,
  require: require
};

exports.override = function(modules) {
  my.modules = modules;
};

exports.require = function(path) {
  if (my.modules && my.modules[path]) {
    return my.modules[path];
  } else {
    // my.require(...) is needed instead of simply require(...)
    // because simply require(...) will cause a modulr parsing failure
    return my.require(path);
  }
};
And at the client side, have a specific initializer that does:
// ClientInitializer.js
var Env = require('shared/Env');
Env.override({ underscore: _ });
So, the "shared" modules can do:
// SharedModule.js
var _ = require('shared/Env').require('underscore');
If the "shared" module is running in the server, the normal require function is called.
If it is running in the browser, the Env module will answer with the global _ variable.
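For completeness, on the server no initializer is needed at all; since override() is never called, this is roughly what happens:
// On the server (no ClientInitializer equivalent is loaded)
var _ = require('shared/Env').require('underscore');
// my.modules is undefined, so Env falls through to my.require, i.e. Node's
// own require, and the real "underscore" package is loaded.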