I am just getting used to CoffeeScript and I have gotten stuck with classes.
I want to have my files structured like in Node, so that I can require a JavaScript file containing a class like this:
Test = require "test.js"
Test.start()
where start is a method of the Test class.
Is this possible?
Is this possible?
Not exactly like in Node. There is no synchronous require in browser environments. However, you could try one of the many asynchronous module loaders to do that; have a look at AMD. The most famous implementation is require.js.
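For illustration, here is a minimal AMD-style sketch of that approach (module and file names are purely illustrative):
// test.js - an AMD module that returns the class as its value
define([], function () {
  function Test() {}
  Test.start = function () { console.log('start!'); };
  return Test;
});
// main.js - RequireJS loads the dependency asynchronously
require(['test'], function (Test) {
  Test.start();
});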
I've found that the simplest way to use CommonJS modules (the ones that Node.js uses) in browser environments is to use Browserify. I personally also prefer the CommonJS module definitions to the AMD ones, but that's just personal taste.
Also, take into account that in order to export your classes so that require 'test' will give you the class constructor directly, you would have to assign your class to module.exports:
# In test.coffee
module.exports = class Test
  @start: -> console.log 'start!'
Then compile that file to test.js and you're ready to use it:
Test = require './test'
Test.start()
In Node.js this will Just Work. In browsers, you will need to process the files first with Browserify (or some other tool) to get it working (it will create the proper require function as well as some exports and module.exports variables for CommonJS modules to work correctly).
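For reference, the CoffeeScript compiler turns the test.coffee above into roughly the following test.js (outer safety wrapper omitted), which is why requiring it hands you the constructor with start attached:
// test.js (approximate compiler output)
var Test;
module.exports = Test = (function() {
  function Test() {}
  Test.start = function() {
    return console.log('start!');
  };
  return Test;
})();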
Take a look at stitch, hem (which is inspired by stitch, but has more neat features) and browserify.
Personally, I prefer hem. You can do something like this with it:
# app/lib/model.coffee
module.exports = class Model
...
# app/lib/util.coffee
helper1 = -> ...
helper2 = -> ...
module.exports = {helper1, helper2}
# app/index.coffee
Model = require 'lib/model'
{helper1} = require 'lib/util'
# do whatever you want with required stuff
...
Hem takes care of compiling CoffeeScript on the fly and bundling all the needed code together (it also supports npm modules and arbitrary js libs as external dependencies for your code, see the docs for more details).
Related
I don't like the whole export/require stuff in Node; it takes too long. Let's say I have a file server.js and I want to use functions in whatever.js. In HTML I just add this to the header:
<script src='whatever.js'></script>
and then I can just use all the functions of whatever.js in my body's script.
But in node, in the server.js file I'd do:
var myobject = require('./whatever.js');
but then I need to assign it to myobject, and furthermore I need to go to whatever.js and manually decide what functions I want to export. Not to mention that typing myobject.someFunction() is a lot longer to write than someFunction(), and I need to remember what I did and didn't expose.
I wanted something where I could just go:
require('./whatever.js');
and it puts it ALL in the global scope, no bs, like in good old HTML/JavaScript. Is there a way to do this in Node?
This will do the trick:
var fs = require('fs');
eval(fs.readFileSync('whatever.js')+'');
// here call functions from whatever.js file
(I realize this is an old thread but wanted to leave a note here for posterity.)
Here in 2022 there are several approaches for executing code from different files with Node.js:
ESM: Use standard ECMAScript modules
At the time of this writing, much of the node ecosystem (i.e. packages on npm) is in the process of transitioning to this paradigm, and there are some associated growing pains (e.g. things like __dirname are only available in CJS not ESM, though the workaround is easy).
For most developers, it would be advisable to become comfortable with this standard as it transcends node.js (i.e. is implemented in other runtimes like Deno and web browsers) and has been years in the making.
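For comparison, a minimal ESM sketch (the file names here are just illustrative):
// math.mjs
export function add(a, b) {
  return a + b;
}
// main.mjs
import { add } from './math.mjs';
console.log(add(2, 3)); // 5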
CJS: Use the original "CommonJS" module mechanism, e.g. require('./some-script.js')
It should be noted, particularly for the OP, that even though the "intended" way to use CJS modules is to export functions, constants, etc. and import them explicitly, it is possible to define everything in global scope using globalThis, though I would not recommend this.
// my-script.js
require('./foo.js');
require('./bar.js');
foo(); // This is foo from <...>foo.js
console.log(`bar = ${bar} (in ${__filename})`); // bar = 123 (in <...>my-script.js)
// foo.js
globalThis.foo = function() {
  console.log(`This is foo from ${__filename}`);
}
// bar.js
globalThis.bar = 123;
If you try omitting globalThis. you'll find that foo and bar are no longer defined in the main script because require "wraps them" in "module scope."
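That "module scope" comes from the wrapper that Node puts around every CJS file before running it, roughly:
// Approximately what Node does with each CommonJS module:
(function (exports, require, module, __filename, __dirname) {
  // the file's contents run here, so top-level var/function declarations stay local
});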
Use eval
In my experience, there are very few legitimate use cases for eval (see Never use eval()!). Nevertheless, the functionality requested in this question is precisely what eval provides: "run some code as if it were written right here," and you can feed it from a file, as explained above by Mehul Prajapati.
// include.js
// Defines a global function that works like C's "#include" preprocessor directive
const { readFileSync } = require('fs');
globalThis.include = function(scriptFile) {
  console.warn('!!! EXTREMELY INSECURE !!!');
  eval(readFileSync(scriptFile, 'utf-8'));
};
// main.js
require('./include.js'); // loads global include
// (I sure hope you completely trust these sources)
include('./foo.js');
include('./bar.js');
Note: Something that has contributed to much of my confusion in the past is that there are competing standards/conventions/APIs that reuse some of the same identifiers, notably require, which require.js and other bundlers that support AMD (Asynchronous Module Definition) use with different semantics. So for someone building a web application (using AMD for modules in web browsers) with Node.js tooling (using CJS for modules locally), it can be frustrating to keep the functions straight, especially if it's an Electron application, which can expose Node.js APIs to scripts running in the renderer (browser). If you find yourself confused about why a module is "not found" in a situation like that, check the stack trace to see which require is being called (and you may have to wrap/rename them on globalThis or something to avoid collisions, as sketched below).
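For example, one common workaround is a small rename shim in the renderer script (a sketch only; nodeRequire is just a conventional name, and this assumes the renderer actually has Node integration enabled):
// renderer-preamble.js (hypothetical)
window.nodeRequire = window.require; // keep Electron's CJS require reachable under another name
delete window.require;               // so an AMD loader can define its own require/define without a clash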
Further reading:
JavaScript Modules: A Brief History [2019]
How the module system, CommonJS & require works [updated 2022]
What is AMD, CommonJS, and UMD? [2014]
I'm a beginner at using js modules.
I'm working on a fairly simple web application. It uses typescript and angular 2, which heavily relies on modules.
Most of my app ts files 'import' one or more js modules (usually mostly Angular 2 modules).
As I understand, because my app ts files have a top level 'import', they are automatically considered a js module by typescript.
However, I want any of my app ts files to be accessible by any other of my app ts files, without having to 'import' each other. But because they are now modules themselves, ts requires me to do that...
Is it possible?
It seems crazy to me that for each of my app ts files, I should have to declare every other app ts file that is used in there (I like to have tiny files with a single class/interface). In addition, this relies on relative paths, which break as soon as I restructure my folder structure.
Am I thinking about this the wrong way?
You must have a js file which is the entry point to your application, right? So in that file, just import all the modules which you want to access without importing, and attach them to the window object. Since the window object is available globally, you can access your module from anywhere without importing the corresponding module. For example,
Consider this scenario:
You have a module in a file called module1.ts
The entry point of your application is a file called index.ts
And you have a module2 where you require something from module1
// module1.ts
function add(first: number, second: number): number {
  return first + second
}
export {add}
in your index.ts
// index.ts
import {add} from '<path to module1>/module1';
window.add = add
Now in your module2
// module2.ts
window.add(1, 2)
Since the window object is available globally you can attach as many properties to it as you like.
As far as type resolution is concerned, you can augment the global Window interface with the add function you require in a .d.ts file, as follows:
interface Window {
  add: (first: number, second: number) => number
}
Declaring dependencies (e.g. modules) for each file is a double-edged sword.
The advantage is that there is no 'magic' - you know exactly where each function, variable, class etc. is coming from. This makes it much easier to know what libraries / frameworks are being used and where to look to troubleshoot issues. Compare it to the opposite approach that Ruby on Rails uses with Ruby Gems, where nothing is declared and everything is auto-loaded. From personal experience I know it becomes an absolute pain to try to work out where some_random_method is coming from and also what methods / classes I have access to.
You're right that the disadvantage is that it can become quite verbose with multiple imports and moving relative files. Modern editors and IDEs like WebStorm and Visual Studio Code have tools to automatically update the relative paths when you move a file and also automatically add the imports when you reference code in another module.
One practical solution for multiple imports is to make your own 'group' import file. Say you have a whole bunch of utility functions that you use in all your files - you can import them all into a single file and then just reference that file everywhere else:
//File: helpers/string-helpers.ts
import {toUppercase} from "./uppercase-helper";
import {truncate} from "./truncate-helper";
export {toUppercase, truncate};
Then in any other file:
import * as StringHelpers from "../path-to/helpers/string-helpers";
...
let shoutingMessage = StringHelpers.toUppercase(message);
The disadvantage of this is that it may break tree shaking, where tools such as webpack remove unused code.
Is it possible?
Not in any easy way. The ts file is a module and uses e.g. module.exports (if CommonJS), which would need to be shimmed out. And that is just the runtime story. The TypeScript story is harder still; one way would be to write a .d.ts file for the module declaring its contents as global.
Like I said. Not worth doing. Modules are the way forward instead of making something hacky.
It's not crazy at all. You are definitely thinking in the wrong way.
Actually, what you don't like is a common feature of all modern programming languages, and it makes the code and structure of the app a lot clearer and simpler to understand.
Going without imports, the old-school way, looks very crazy to me :)
You can only have chaos with so many global variables.
I'm writing a javascript library that contains a core module and several
optional submodules which extend the core module. My target is the browser
environment (using Browserify), where I expect a user of my module will only
want to use some of my optional submodules and not have to download the rest to
the client--much like custom builds work in lodash.
The way I imagine this working:
// Require the core library
var Tasks = require('mymodule');
// We need yaks
require('mymodule/yaks');
// We need razors
require('mymodule/razors');
var tasks = new Tasks(); // Core mymodule functionality
var yak = tasks.find_yak(); // Provided by mymodule/yaks
tasks.shave(yak); // Provided by mymodule/razors
Now, imagine that the mymodule/* namespace has tens of these submodules. The
user of the mymodule library only needs to incur the bandwidth cost of the
submodules that she uses, but there's no need for an offline build process like
lodash uses: a tool like Browserify solves the dependency graph for us and
only includes the required code.
Is it possible to package something this way using Node/npm? Am I delusional?
Update: An answer over here seems to suggest that this is possible, but I can't figure out from the npm documentation how to actually structure the files and package.json.
Say that I have these files:
./lib/mymodule.js
./lib/yaks.js
./lib/razors.js
./lib/sharks.js
./lib/jets.js
In my package.json, I'll have:
"main": "./lib/mymodule.js"
But how will node know about the other files under ./lib/?
It's simpler than it seems -- when you require a package by its name, it gets the "main" file. So require('mymodule') returns "./lib/mymodule.js" (per your package.json "main" prop). To require optional submodules directly, simply require them via their file path.
So to get the yaks submodule: require('mymodule/lib/yaks'). If you wanted to do require('mymodule/yaks') you would need to either change your file structure to match that (move yaks.js to the root folder) or do something tricky where there's a yaks.js at the root and it just does something like: module.exports = require('./lib/yaks');.
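To make that concrete, here is a sketch of that layout (the paths follow the example above; nothing about it is special to npm):
// Layout:
//   package.json      -> { "name": "mymodule", "main": "./lib/mymodule.js" }
//   lib/mymodule.js   -> core module
//   lib/yaks.js       -> optional submodule
//   yaks.js           -> root-level shim so require('mymodule/yaks') resolves:
// yaks.js
module.exports = require('./lib/yaks');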
Good luck with this yak lib. Sounds hairy :)
So let's say I have some small bit of library code that I develop and test in isolation. I use RequireJS during development and have a root-level file that depends on 1 other file. So its define looks something like...
// lib/main.js
define(['lib/dep1'], function(dep1) {
...
})
I run r.js on the code which results in dist/myLibrary.js, which looks something like this:
define('lib/dep1',[], function(){...})
define('lib/main',["lib/dep1"], function(dep1){...})
If I pull myLibrary.js straight into another project it won't work. Nothing is defining itself as the module for that file. But if I append an actual module definition, it works.
define('lib/dep1',[], function(){...})
define('lib/main',["lib/dep1"], function(dep1){...})
define(['lib/main'], function(lib) {
return lib;
})
And the ['lib/main'] seems to be scoped to the module, because if I have an actual lib/main in my app, it doesn't get used.
Questions:
Regarding the scoping, is that the normal behavior? The fact that lib/main is recognized as a module id from the same file rather than going to look for it someplace else. If I import 10 such libraries that all have a lib/main, they won't collide?
Is there a better way? I am at least initially unconcerned about supporting the non-AMD use case as this is all internal lib development and we all use RequireJS. So within a fully AMDed environment, is there another, better way to do this? Assuming there's no pitfall to this approach, it seems fairly simple and boilerplate to support.
Hello, with RequireJS I can set a base path like this: base : './app/', so when I am in ./app/foo/bar/, for example, and I have a script where I use require('foo'), RequireJS would then search for ./app/foo.js and not in the node_modules folder or in ./app/foo/bar/foo.js. This comes in handy when you have a kind of structure where it would be much cleaner for you as a developer to see the dependencies instead of having ../../foo.js. I could have ./app/foo.js and ./app/foo/foo.js and ./app/foo/bar/foo.js; it would be much cleaner to have:
require('foo');
require('foo/foo');
require('foo/bar/foo');
rather than:
require('../../foo');
require('../foo');
require('./foo');
Now you could ask why not change the names and not have foo everywhere; let's say that we can't, for whatever reason…
Another feature I find lacking in Node's require method compared to RequireJS is the ability to set path mappings: if I have a directory named ./app/super-sized-directory-name/, in RequireJS I could simply map 'big-dir' : 'super-sized-directory-name' and then use require('./app/big-dir/foo'). With Node.js's require method this is not possible, as far as I know…
--alias, -a Register an alias with a colon separator: "to:from"
Example: --alias 'jquery:jquery-browserify'
You can register aliases with browserify, so that covers your renaming.
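With that alias registered, the bundled code can keep using the original name (a sketch):
// After bundling with --alias 'jquery:jquery-browserify':
var $ = require('jquery'); // resolves to jquery-browserify inside the bundle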
As for your rooted absolute paths, that can't really be done. As mentioned, modul8 has a namespacing mechanism to solve this.
I would recommend you ping SubStack in #stackvm on freenode and ask him directly.
It may or may not help you, but I believe the Dojo Framework's AMD loader is API-compatible with RequireJS and, provided you are using the new microkernel, does not pollute the global namespace.
I believe it only has require() and define() in the global namespace now.
Anyway their method of dealing with this is to do something like:
require(["dojo/node!util"], function(util){
// Module available as util
});
The documentation is at http://dojotoolkit.org/reference-guide/1.8/dojo/node.html
Use uRequire, which provides a 'bridge' between nodejs require and AMD define modules, without reinventing the wheel (it is built on top of the two standards). It basically converts modules from AMD or CommonJS format to the other format, or to UMD that runs smoothly on both nodejs & the browser.
It also translates dependency paths with flexible path conventions, so you can have either '../../foo' or 'bar/foo', depending on which makes more sense at the point you are at.
Your AMD or UMD modules are loaded asynchronously in the browser (using AMD/RequireJS or another AMD loader), and on node the asynchronous require(['dep1', 'dep2'], function(dep1, dep2){...}) is also simulated.