Wrapping require - javascript

Given three Node.js projects: Main, Framework and Repositories.
Main has the two other projects connected via npm link.
In a test I wrapped require in a method, and I've got some problems resolving linked projects (details below).
Simplified code looks like this:
module.exports.resolve = function (file) {
    // [...] some more logic to handle relative paths
    return require(file);
};
This works fine in most scenarios. I also worked out how to handle relative paths (looking up the caller and resolving paths based on the caller's location).
Now, this code lives in Project Framework, which is linked (npm link) into Project Main. Project Main also has Project Repositories linked.
In Project Main I have:
require('ProjectRepositories/foo') // Works as expected
myRequire.resolve('ProjectRepositories/foo') // Returns MODULE_NOT_FOUND "Cannot find module 'ProjectRepositories/foo'"
I assume the problem is that Project Repositories is not linked into Project Framework. But is there another way than linking them?
I'd prefer to have fewer dependencies. Any hints on that?

You are absolutely correct: the reason the Project Framework resolve does not work is that the require used from within that project only knows about the modules visible to that project. When you require a JavaScript file, Node evaluates the script within the context of that module, not the context of the current project (this is also how require calls inside your dependencies' modules behave when they are reached from your top-level script).
What you can do, however, is provide a way for the framework resolver to use a user-specified require function to do its work, once it has transformed the paths.
module.exports.resolve = function (file, resolver) {
    // Some more logic to handle relative paths
    resolver = typeof resolver === 'function' ? resolver : require;
    return resolver(file);
};
Now in your code, you could do
myRequire.resolve('ProjectRepositories/foo', require);
So now your Project Main require will be used to resolve the file.
You can also take this a step further if you want and have the module be stateful and remember the resolver it's supposed to use.
var _requireFn = require;

module.exports = {
    resolve: resolve,
    setRequireFn: setRequireFn
};

function resolve(path) {
    return _requireFn(path);
}

function setRequireFn(requireFn) {
    _requireFn = requireFn;
}
On another note, I would be careful about using the term resolve, because in Node that is semantically used for looking up the correct file path to be required, a la require.resolve.
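For reference, require.resolve only looks up and returns the resolved file path (or throws MODULE_NOT_FOUND), whereas require actually loads the module; using the name from your question:

require.resolve('ProjectRepositories/foo'); // absolute path of the file that would be loaded
require('ProjectRepositories/foo');         // the module's exports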
Finally, in terms of minimizing dependencies, I'd recommend installing your subprojects with npm directly from their GitHub repos. This has worked pretty well for me in the past, unless your two subrepos are in a constant state of flux. See the npm install docs for more info.
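For example, package.json can point a dependency straight at a GitHub repo (a sketch; the org/repo names here are placeholders):

"dependencies": {
    "framework": "github:your-org/framework",
    "repositories": "github:your-org/repositories"
}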

Related

error "window" is not available during server side rendering [duplicate]

I have been trying to build my gatsby (react) site recently using an external package.
The link to this package is "https://github.com/transitive-bullshit/react-particle-animation".
As I only have the option to change the props from the component's details, I cannot read/write the package file where it all comes together in the end, as it is not included in the public folder produced by 'gatsby build'.
What I have tried:
Editing the package file locally, which worked only on my machine. But when I push to Netlify, which only receives the public folder and the corresponding package.json files and not the node_modules folder, I cannot make Netlify use the file I changed, because it installs the package directly from the GitHub page.
As a solution, found in a comment on this question, we can use patch-package to save our fixes to the node module and then use them wherever we want.
This actually worked for me!
To explain how I fixed it (most of it is already written in the patch-package docs), the main points:
I first made changes to the local package files that were giving the error (for me they were in my node_modules folder).
Then I followed the patch-package documentation to guide myself through the rest.
It worked after I pushed my changes to GitHub, so that now patch-package always gives me my edited version of the node module. A rough sketch of the flow follows.
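Roughly (a sketch; the package name is just the one from my case): edit the files under node_modules, then record the diff:

npx patch-package react-particle-animation   # writes a patch file into a patches/ folder

Then commit the patches/ folder and have the patch re-applied on every install by adding a postinstall script to package.json:

"scripts": {
    "postinstall": "patch-package"
}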
When dealing with third-party modules that use window in Gatsby, you need to add a null loader to the webpack configuration (in gatsby-node.js) so that the offending module is replaced with an empty one during SSR (Server-Side Rendering). This is because gatsby develop runs in the browser (where there is a window), while gatsby build runs in the Node server, where there is obviously no window or other browser globals (they simply aren't defined there).
exports.onCreateWebpackConfig = ({ stage, loaders, actions }) => {
    if (stage === "build-html") {
        actions.setWebpackConfig({
            module: {
                rules: [
                    {
                        test: /react-particle-animation/,
                        use: loaders.null(),
                    },
                ],
            },
        })
    }
}
Keep in mind that the test value is a regular expression that is matched against paths under node_modules, so make sure /react-particle-animation/ is the right folder name.
Using patch-package may work, but keep in mind that you are adding an extra package, another bundled file that could potentially affect the performance of the site. The proposed snippet is a built-in solution that runs when you build your application.

Is there a way to cause the JS engine to load a .js file without explicitly importing something from it?

Maybe I'm trying to do something silly, but I've got a web application (Angular2+), and I'm trying to build it in an extensible/modular way. In particular, I've got various, well, modules for lack of a better term, that I'd like to be able to include or not, depending on what kind of deployment is desired. These modules include various functionality that is implemented via extending base classes.
To simplify things, imagine there is a GenericModuleDefinition class, and there are two modules - ModuleOne.js and ModuleTwo.js. The first defines a ModuleOneDefinitionClass, instantiates an exported instance ModuleOneDefinition, and then registers it with the ModuleRegistry. The second module does an analogous thing.
(To be clear - it registers the ModuleXXXDefinition object with the ModuleRegistry when the ModuleXXX.js file is run (e.g. because some other .js file imports one of its exports). If it is not run, then clearly nothing gets registered - and this is the problem I'm having, as I describe below.)
The ModuleRegistry has some methods that will iterate over all the Modules and call their individual methods. In this example, there might be a method called ModuleRegistry.initAllModules(), which then calls the initModule() method on each of the registered Modules.
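Concretely, the pattern looks roughly like this (a simplified sketch; the real classes do more):

// ModuleRegistry.js
const registered = [];
export const ModuleRegistry = {
    register(moduleDefinition) { registered.push(moduleDefinition); },
    initAllModules() { registered.forEach(m => m.initModule()); }
};

// ModuleOne.js
import { GenericModuleDefinition } from './GenericModuleDefinition';
import { ModuleRegistry } from './ModuleRegistry';

class ModuleOneDefinitionClass extends GenericModuleDefinition {
    initModule() { /* module-one specific setup */ }
}
export const ModuleOneDefinition = new ModuleOneDefinitionClass();
ModuleRegistry.register(ModuleOneDefinition); // only runs if this file is ever imported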
At startup, my application (say, in index.js) calls ModuleRegistry.initAllModules(). Obviously, because index.js imports the exported ModuleRegistry symbol, this will cause the ModuleRegistry.js code to get pulled in, but since none of the exports from either of the two Module .js files is explicitly referenced, these files will not have been pulled in, and so the ModuleOneDefinition and ModuleTwoDefinition objects will not have been instantiated and registered with the ModuleRegistry - so the call to initAllModules() will be for naught.
Obviously, I could just put meaningless references to each of these ModuleDefinition objects in my index.js, which would force them to be pulled in, so that they were registered by the time I call initAllModules(). But this requires changes to the index.js file depending on whether I want to deploy it with ModuleTwo or without. I was hoping to have the mere existence of the ModuleTwo.js be enough to cause the file to get pulled in and the resulting ModuleTwoDefinition to get registered with the ModuleRegistry.
Is there a standard way to handle this kind of situation? Am I stuck having to edit some global file (either index.js or some other file it references) so that it has information about all the included Modules so that it can then go and load them? Or is there a clever way to cause JavaScript to execute all the .js files in a directory, so that merely copying the files in would be enough to get them to load at startup?
A clever way to cause Node.js (rather than JavaScript in general) to execute all the .js files in a directory:
var fs = require('fs')     // node filesystem
var path = require('path') // node path

function hasJsExtension(item) {
    return item != 'index.js' && path.extname(item) === '.js'
}

function pathHere(item) {
    // absolute path, so require() does not mistake it for a package name
    return path.join(__dirname, item)
}

fs.readdir(__dirname, function (err, list) {
    if (err) throw err
    list.filter(hasJsExtension).map(pathHere).forEach(require) // require them all
})
Angular is pretty different, all the more so because it is ng serve that decides whether your app needs a module and, if so, serves the corresponding js file whenever it is needed, not at first load time.
In fact your situation reminds me of C++, with header files for declarations and cpp files for implementations; maybe you just need a defineAllModules step before initAllModules, along these lines:
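(A rough sketch; the file names are hypothetical. The point is to collect the "pull this module in" imports in one place, separate from the init call.)

// defineAllModules.js - importing each module file runs its self-registration
import './ModuleOne';
import './ModuleTwo';

// index.js
import './defineAllModules';
import { ModuleRegistry } from './ModuleRegistry';

ModuleRegistry.initAllModules();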
Another way could be to find out how to exclude those modules from ng serve and include them as script tags in your HTML before the others. They would then be defined (if present, and therefore served) and called by Angular if necessary. The only caveat is an error in the console if one of the script tags is not fetched, but your app will work anyway, if it is supposed to.
But either way, it means declaring/defining those modules somewhere, in ng serve and also in the HTML.
In your particular case, and without wanting to under-value ng serve: is the total js for your app really too heavy to be served at once (minified and all)? The good-to-go solution may be one of the many tools that build and rebuild your production all.js from your dev js folder at will, or, like you said, on a drag & drop into your folder.
Such a tool is, again, server-side, but even if you can only push/FTP your JavaScript, you could use it in your preferred dev environment and just push the new version. To see a list of such tools, google 'YourDevEnvironment bundle javascript'.
To do more than ng serve allows and append static js files under specific conditions, you have to work with webpack directly, so the first option I see here is to eject your webpack configuration; after that you can specify what Angular should load or not.
With that said, I will give an example:
With the Angular CLI and ng serve, any external JavaScript files you want to include have to be put inside the scripts array in the angular-cli.json file. However, you cannot control which files are included and which are not.
By using your own webpack configuration you can control all of this, passing a flag from your terminal to the webpack config file and doing the whole process right there.
Example:
// Helper (introduced here for clarity): picks the ScriptsWebpackPlugin setup
// based on the flag passed in from the command line.
function selectPlugins(env) {
    if (env.commandLineParamater == 'production') {
        return [
            new ScriptsWebpackPlugin({
                "name": "scripts",
                "sourceMap": true,
                "filename": "scripts.bundle.js",
                "scripts": [
                    "D:\\Tutorial\\Angular\\demo-project\\node_modules\\bootstrap\\dist\\bootstrap.min.js",
                    "D:\\Tutorial\\Angular\\demo-project\\node_modules\\jquery\\dist\\jquery.min.js"
                ],
                "basePath": "D:\\Tutorial\\Angular\\demo-project"
            })
        ];
    } else {
        return [
            new ScriptsWebpackPlugin({
                "name": "scripts",
                "sourceMap": true,
                "filename": "scripts.bundle.js",
                "scripts": [
                    "D:\\Tutorial\\Angular\\demo-project\\node_modules\\bootstrap\\dist\\bootstrap.min.js"
                ],
                "basePath": "D:\\Tutorial\\Angular\\demo-project"
            })
        ];
    }
}
then:
// webpack.config.js exports a function so that it receives `env` from the CLI:
module.exports = (env) => {
    return {
        plugins: selectPlugins(env),
        // other webpack configuration
    };
};
The scripts.bundle.js will be loaded before your main app bundle, and so you can control what you load when you run npm run start instead of ng serve.
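For reference, the env object is supplied by webpack's CLI --env flag. The npm scripts might look roughly like this (a sketch; it assumes the webpack 2-4 style --env.key=value syntax and simply reuses the flag name from the snippet above):

"scripts": {
    "build": "webpack --env.commandLineParamater=production",
    "build:dev": "webpack --env.commandLineParamater=development"
}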
To eject your webpack configuration, use ng eject.
Generally speaking, when you need to control part of how ng serve works, you should extract your own webpack config and customize it as you want.

How to use ember.js without module support

The guides for ember.js assume one has full ES6 support, e.g. http://guides.emberjs.com/v2.2.0/routing/specifying-a-routes-model/ shows the export default construct and doesn't specify any alternative way to achieve the same goals. However, the module feature is not implemented in all browsers that Ember supports.
How can I use these features with a browser that doesn't support modules? How would the code in these examples translate to ES5?
The documentation assumes you are using a transpiling tool, because the recommended tool, ember-cli, does. Unless you have good reasons not to use it, you should definitely look into it.
It is, however, perfectly fine to work without it. For instance, without a module system, Ember will map controller:posts.index to App.PostsIndexController. So this should work for the example you linked:
App.Router.map(function() {
    this.route('favorite-posts');
});

App.FavoritePostsRoute = Ember.Route.extend({
    model() {
        return this.store.query('post', { favorite: true });
    }
});
You may also use Ember with your own module support. I have an Ember project successfully based on rollup. It does require a bit more work though, to have the resolver find your classes (the resolver documentation also describes how Ember does the name mapping). Nothing hard, but you must build a short script to generate registrations.
Edit for blessenm: Ember with rollup
Unfortunately I cannot share this code, but it works like this:
A script scans the project directory and compiles templates by invoking ember-template-compiler.js on every .hbs file it encounters.
A script (the same one, actually) scans the project directory and generates the main entry point. It's pretty simple: if it sees, say, gallery/models/collection.js and gallery/routes/picture/index.js, it will generate a main file that looks like this:
import r1 from 'gallery/models/collection.js';
import r2 from 'gallery/routes/picture/index.js';
// ...

Ember.Application.initializer({
    name: 'registrations',
    initialize: function (app) {
        app.register("model:collection", r1);
        app.register("route:picture.index", r2);
        // ...
    }
});
It should just map your filenames to resolver names. As a bonus, you get to control how your directories are organized.
Invoke rollup on the generated file. It will pull everything together. I use the IIFE output format, skipping all the run-time resolution mess. I suggest you set up rollup to work with babel so you can use ES6 syntax.
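A rollup configuration along those lines might look roughly like this (a sketch only; the file names and the babel plugin wiring are assumptions, not what the project above actually uses):

// rollup.config.js
import babel from 'rollup-plugin-babel';

export default {
    input: 'tmp/main-generated.js',   // the generated registration file
    output: {
        file: 'dist/app.js',
        format: 'iife',               // no run-time module loader needed
        name: 'App'
    },
    plugins: [babel()]
};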
I don't use any ember-specific module, but it should not be too hard to add. My guess is it's mostly a matter of setting up rollup import resolution properly. For all I know, it may work out of the box.
You should look into using Ember CLI http://ember-cli.com/
You write your code in ES6 and it transpiles down to ES5.

How do I package a node module with optional submodules?

I'm writing a javascript library that contains a core module and several
optional submodules which extend the core module. My target is the browser
environment (using Browserify), where I expect a user of my module will only
want to use some of my optional submodules and not have to download the rest to
the client--much like custom builds work in lodash.
The way I imagine this working:
// Require the core library
var Tasks = require('mymodule');
// We need yaks
require('mymodule/yaks');
// We need razors
require('mymodule/razors');
var tasks = new Tasks(); // Core mymodule functionality
var yak = tasks.find_yak(); // Provided by mymodule/yaks
tasks.shave(yak); // Provided by mymodule/razors
Now, imagine that the mymodule/* namespace has tens of these submodules. The
user of the mymodule library only needs to incur the bandwidth cost of the
submodules that she uses, but there's no need for an offline build process like
lodash uses: a tool like Browserify solves the dependency graph for us and
only includes the required code.
Is it possible to package something this way using Node/npm? Am I delusional?
Update: An answer over here seems to suggest that this is possible, but I can't figure out from the npm documentation how to actually structure the files and package.json.
Say that I have these files:
./lib/mymodule.js
./lib/yaks.js
./lib/razors.js
./lib/sharks.js
./lib/jets.js
In my package.json, I'll have:
"main": "./lib/mymodule.js"
But how will node know about the other files under ./lib/?
It's simpler than it seems -- when you require a package by its name, you get its "main" file. So require('mymodule') loads "./lib/mymodule.js" (per your package.json "main" prop). To require optional submodules directly, simply require them via their file path.
So to get the yaks submodule: require('mymodule/lib/yaks'). If you want require('mymodule/yaks') to work, you would need to either change your file structure to match (move yaks.js to the root folder) or add a yaks.js at the root that just does something like module.exports = require('./lib/yaks');.
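A minimal sketch of that root-shim layout (file names taken from the question):

// ./yaks.js  (at the package root, next to package.json)
module.exports = require('./lib/yaks');

// consumer code:
var Tasks = require('mymodule');   // -> ./lib/mymodule.js via "main"
require('mymodule/yaks');          // -> ./yaks.js -> ./lib/yaks.js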
Good luck with this yak lib. Sounds hairy :)

How does requiring modules manually differ from calling them dynamically with browserify?

Sorry for the awkward post title but it's a very weird situation to be in. In my project I have a folder structure like so:
/filters
    index.js
    [...]
/controllers
    index.js
    [...]
app.js
app.js is basically the entry point of my application and I thought it would be cool if I could automatically load the contents of those directories by require()ing the directory and having index.js in each of those directories take care of loading whatever it needs to load.
But I'm running into a problem I don't understand. Because I'm being intentionally obtuse (this is a learning/experimentation exercise), I decided to try and keep it as DRY as humanly possible, so I tried this bit of code to handle the module loading:
'use strict';

var customModules = [
    'controllers',
    'filters'
];

//require('./controllers');
//require('./filters');

for (var i in customModules) {
    if (customModules.hasOwnProperty(i)) {
        require('./' + customModules[i]);
    }
}

var nativeModules = [
    'ngSanitize'
];

angular.module('app', customModules.concat(nativeModules));
I'm planning on extracting that to its own module, but it'll do for demonstration. For some reason this code throws this exception:
Uncaught Error: Cannot find module './controllers'
But when I load it using a static string
require('./controllers');
No problems, everything works and my sample application behaves as expected.
Check out the Browserify Handbook on how browserify works (emphasis mine):
Browserify starts at the entry point files that you give it and searches for any require() calls it finds using static analysis of the source code's abstract syntax tree.
For every require() call with a string in it, browserify resolves those module strings to file paths and then searches those file paths for require() calls recursively until the entire dependency graph is visited.
Bottom line: the dependencies must be statically declared; your code built around dynamically constructed require() calls cannot work (although it is a good idea in principle).
You could achieve a similar result using server-side or build-time techniques.
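The straightforward static version of the snippet above (a sketch, reusing the question's module names) keeps the string literals visible to browserify while staying reasonably DRY:

'use strict';

// Static require calls browserify can see; the object's keys double as the
// list of Angular module names.
var customModules = {
    controllers: require('./controllers'),
    filters: require('./filters')
};

var nativeModules = [
    'ngSanitize'
];

angular.module('app', Object.keys(customModules).concat(nativeModules));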
