I'm looking for a simple solution for server-side includes of regular HTML and CSS (for repeated components like headers, footers, navigation, etc.) that doesn't require extensive frameworks.
You're looking for the require() function. Check out the Node.js documentation for this and the other built-in modules.
If you want to use the newer import statement, you may do so; it's not yet fully implemented in Node, but you can use it by giving the file you want to import the .mjs extension and then running Node with the following command:
node --experimental-modules someFile.mjs
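For example, a minimal sketch of how that looks (the file names header.mjs and someFile.mjs are just placeholders, not from the original question):
header.mjs
// exports a reusable HTML fragment as a string
export const header = '<header><nav>...</nav></header>';
someFile.mjs
// imports the fragment from the sibling file
import { header } from './header.mjs';
console.log(header);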
In Node.js, to include repetitive files like headers, footers, etc. you will need a templating language such as EJS.
Using EJS, here is a sample snippet with the include tag:
<%- include('./path/to/your/html/file') %>
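To wire that up, a minimal Express + EJS setup might look like the sketch below (the view name and partial path are assumptions, not part of the original answer):
app.js
// minimal Express + EJS setup; templates live in ./views by default
const express = require('express');
const app = express();

app.set('view engine', 'ejs');

app.get('/', (req, res) => {
  // views/index.ejs can pull in shared markup with
  // <%- include('partials/header') %>
  res.render('index');
});

app.listen(3000);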
You can use require to require multiple files. However, since Node caches the files, you will need to delete them from the cache if you want to use an un-cached version of a file.
index.js
app.get('/path', (req, res) => {
  clear()
  let headers1 = require('./headers/a.js')
  let headers2 = require('./headers/b.js')
  res.set(headers1)
  res.set(headers2)
})

// Remove the header modules from the require cache
function clear() {
  delete require.cache[require.resolve('./headers/a.js')]
  delete require.cache[require.resolve('./headers/b.js')]
}
headers/a.js
module.exports = {
  'Content-Type': 'application/json'
}
headers/b.js
module.exports = {
  'Custom-Header': 'Brass Monkey'
}
I think you'd like to require "view components". Multiple view engines exist for Node.js, for example pug or ejs. In that case, you use include.
I have a Node.js server for a web chat application, but I have a problem. In one file, I want to get a function from another file with the require method and module.exports. In my first file (server.js, which is in the root path) the require method works, but in the /js folder (which is not in the root) it does not. I installed npm and all packages globally.
All my files:
Code in chat.js:
const {verifUserConnected, getUserInfo} = require('express');
console.log(verifUserConnected)
Code in connect.js:
function verifUserConnected(){
  return isConnected;
}

function getUserInfo(){
  return null;
}

module.exports = {
  verifUserConnected,
  getUserInfo
};
In "Server.js" The require method works
You've put connect.js underneath a folder named "public", which implies you are serving it to the browser and trying to run it client-side.
Browsers do not have native support for CommonJS modules (i.e. module.exports and require).
Your starting options are:
Rewrite the client-side code to not use modules
Rewrite the client-side code to use JavaScript modules (i.e. using import, export and <script type="module">); a sketch follows below.
Transpile the modules for use in the browser (e.g. using a tool like Webpack or Parcel.js).
However … chat.js attempts to require('express'). Express will not run in the browser and doesn't export anything named verifUserConnected either. You'll need to address that too.
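As a rough sketch of the second option, assuming both files end up served to the browser and adjusting the relative path to wherever connect.js actually lives, connect.js can export its functions as an ES module and chat.js can import them, provided chat.js is loaded with <script type="module">:
connect.js
// export the functions instead of using module.exports
// (isConnected comes from wherever the original code defined it)
export function verifUserConnected() {
  return isConnected;
}

export function getUserInfo() {
  return null;
}
chat.js
// import from the sibling file rather than from 'express'
import { verifUserConnected, getUserInfo } from './connect.js';
console.log(verifUserConnected);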
In CommonJS (Node.js works with CommonJS by default):
const startServer = () => {
  // Code
};

module.exports = { startServer }
// Or
exports.startServer = startServer;
To import:
const { startServer } = require("./path");
If you have any questions, ask me.
So let's say I have some code in JS:
const myApiKey = 'id_0001'
But instead of hardcoding it, I want to put it in some bash script with other env vars, read from it, and then replace it in the JS.
So let's say for prod I would read from prod-env.sh, and for dev from dev-env.sh, and then gulp or some other tool does the magic and replaces MY_API_KEY based on whatever is set inside prod-env.sh or dev-env.sh.
const myApiKey = MY_API_KEY
Update: I want to add that I only care about Unix, not Windows. In Go there is a way to read env vars, for example envVars.get('MY_API_KEY'); I'm looking for something similar, but for JS on the client side.
If you're using gulp, it sounds like you could use any gulp string replacer, like gulp-replace.
As for writing the gulp task(s): if you are willing to import the environment into your shell first, before running node, you can access it via process.env:
const gulp = require('gulp');
const replace = require('gulp-replace');

gulp.task('build', function () {
  return gulp.src(['example.js'])
    .pipe(replace('MY_API_KEY', process.env.MY_API_KEY))
    .pipe(gulp.dest('build/'));
});
If you don't want to import the environment files before running node, you can use a library like env2 to read shell environment files.
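If you'd rather not add a dependency at all, a hand-rolled loader for a simple shell-style env file is only a few lines. A rough sketch, assuming the file contains plain VAR=value lines (optionally prefixed with export) and no quoting; the file name is a placeholder:
load-env.js
// naive parser for simple KEY=value env files
const fs = require('fs');

function loadEnv(path) {
  const lines = fs.readFileSync(path, 'utf-8').split('\n');
  for (const line of lines) {
    const trimmed = line.trim().replace(/^export\s+/, '');
    if (!trimmed || trimmed.startsWith('#')) continue; // skip blanks and comments
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue;
    process.env[trimmed.slice(0, eq)] = trimmed.slice(eq + 1);
  }
}

loadEnv('./prod-env.sh');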
Another option would be to use js/json to define those environment files, and load them with require.
prod-env.json
{
  "MY_API_KEY": "api_key"
}
gulpfile.js
const gulp = require('gulp');
const replace = require('gulp-replace');
const myEnv = require('./prod-env');

gulp.task('build', function () {
  return gulp.src(['example.js'])
    .pipe(replace('MY_API_KEY', myEnv.MY_API_KEY))
    .pipe(gulp.dest('build/'));
});
Also, for a more generic, loopy version of the replace you can do:
gulp.task('build', function () {
  let stream = gulp.src(['example.js']);
  for (const key in process.env) {
    // replace ${KEY} tokens with the matching environment value
    stream = stream.pipe(replace('${' + key + '}', process.env[key]));
  }
  return stream.pipe(gulp.dest('build/'));
});
In that last example I added ${} around the environment variable name to make it less prone to accidents. So the source file becomes:
const myApiKey = ${MY_API_KEY}
This answer is an easy way to do this for someone who doesn't want to touch the code they are managing, for example if you are on the ops team but not the dev team and need to do what you are describing.
The environment variable NODE_OPTIONS can control many things about the Node.js runtime; see https://nodejs.org/api/cli.html#cli_node_options_options
One such option we can set is --require, which allows us to run code before anything else is even loaded.
So using this, you can create an overwrite.js file to perform this replacement on any non-node_modules script files:
const fs = require('fs');
const original = fs.readFileSync;

// set some custom env variables
// API_KEY_ENV_VAR - the value to set
// API_KEY_TEMPLATE_TOKEN - the token to replace with the value
if (!process.env.API_KEY_TEMPLATE_TOKEN) {
  console.error('Please set API_KEY_TEMPLATE_TOKEN');
  process.exit(1);
}
if (!process.env.API_KEY_ENV_VAR) {
  console.error('Please set API_KEY_ENV_VAR');
  process.exit(1);
}

fs.readFileSync = (file, ...args) => {
  if (file.includes('node_modules')) {
    return original(file, ...args);
  }
  const fileContents = original(file, ...args).toString(
    /* set encoding here, or let it default to utf-8 */
  );
  return fileContents
    .split(process.env.API_KEY_TEMPLATE_TOKEN)
    .join(process.env.API_KEY_ENV_VAR);
};
Then use it with a command like this:
export API_KEY_ENV_VAR=123;
export API_KEY_TEMPLATE_TOKEN=TOKEN;
NODE_OPTIONS="--require ./overwrite.js" node target.js
Supposing you had a script target.js
console.log('TOKEN');
It would log 123. You can use this pretty much universally with node, so it should work fine with gulp, grunt, or any others.
I have an application using a node.js backend and a require.js/backbone frontend.
My backend has a config/settings system, which depending on the environment (dev, production, beta) can do different things. I would like to propagate some of the variables to the client as well, and have them affect some template rendering (e.g. change the title or the URL of the pages).
What is the best way to achieve that?
I came up with a way to do it, and it seems to be working, but I don't think it's the smartest thing to do and I can't figure out how to make it work with the requirejs optimizer anyway.
What I do is on the backend I expose an /api/config method (through GET) and on the client
I have the following module config.js:
// This module loads an environment config
// from the server through an API
define(function(require) {
  var cfg = require('text!/api/config');
  return $.parseJSON(cfg);
});
any page/module that needs config will just do:
var cfg = require('config');
As I said, I am having a problem with this approach: I can't compile/optimize my client code with the requirejs optimizer, since the /api/config file doesn't exist offline during optimization. And I am sure there are many other reasons my approach is a bad idea.
If you use a module bundler such as webpack to bundle JavaScript files for use in a browser, you can reuse your Node.js modules in the client running in the browser. In other words, put your settings or configuration in Node.js modules, and share them between the backend and the client.
For example, you have the following settings in config.js:
Normal Node.js module: config.js
const MY_THIRD_PARTY_URL = 'https://a.third.party.url'
module.exports = { MY_THIRD_PARTY_URL }
Use the module in Node.js backend
const config = require('path-to-config.js')
console.log('My third party URL: ', config.MY_THIRD_PARTY_URL)
Share it in the client
import config from 'path-to-config.js'
console.log('My third party URL: ', config.MY_THIRD_PARTY_URL)
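For completeness, the bundling step itself might look like the minimal webpack sketch below (the entry and output paths are assumptions, not part of the original answer):
webpack.config.js
// bundles the client entry point, which can import path-to-config.js
const path = require('path');

module.exports = {
  entry: './client/index.js',
  output: {
    filename: 'bundle.js',
    path: path.resolve(__dirname, 'dist')
  }
};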
I do the following (note that this is Jade; I have never used require.js or backbone, however as long as you can pass variables from Express into your templating language, you should be able to place JSON in data-* attributes on any element you want).
// app.js
app.get('/', function(req, res){
  var bar = {
    a: "b",
    c: Math.floor(Math.random()*5)
  };
  res.locals.foo = JSON.stringify(bar);
  res.render('some-jade-template');
});
// some-jade-template.jade
!!!
html
  head
    script(type="text/javascript"
      , src="http://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js")
    script(type="text/javascript")
      $(document).ready(init);
      function init(){
        var json = $('body').attr('data-stackoverflowquestion');
        var obj = JSON.parse(json);
        console.log(obj);
      };
  body(data-stackoverflowquestion=locals.foo)
    h4 Passing data with data-* attributes example
I'm using requireJS to load scripts. It has this detail in the docs:
The path that is used for a module name should not include the .js
extension, since the path mapping could be for a directory.
In my app, I map all of my script files in a config path, because they're dynamically generated at runtime (my scripts start life as things like order.js but become things like order.min.b25a571965d02d9c54871b7636ca1c5e.js, which is a hash of the file contents, for cache-busting purposes).
In some cases, require will add a second .js extension to the end of these paths. Although I generate the dynamic paths on the server side and then populate the config path, I have to then write some extra javascript code to remove the .js extension from the problematic files.
Reading the requireJS docs, I really don't understand why you'd ever want the path mapping to be used for a directory. Does this mean it's possible to somehow load an entire directory's worth of files in one call? I don't get it.
Does anybody know if it's possible to just force require to stop adding .js to file paths so I don't have to hack around it?
thanks.
UPDATE: added some code samples as requested.
This is inside my HTML file (it's a Scala project so we can't write these variables directly into a .js file):
foo.js.modules = {
  order : '#Static("javascripts/order.min.js")',
  reqwest : 'http://5.foo.appspot.com/js/libs/reqwest',
  bean : 'http://4.foo.appspot.com/js/libs/bean.min',
  detect : 'order!http://4.foo.appspot.com/js/detect/detect.js',
  images : 'order!http://4.foo.appspot.com/js/detect/images.js',
  basicTemplate : '#Static("javascripts/libs/basicTemplate.min.js")',
  trailExpander : '#Static("javascripts/libs/trailExpander.min.js")',
  fetchDiscussion : '#Static("javascripts/libs/fetchDiscussion.min.js")',
  mostPopular : '#Static("javascripts/libs/mostPopular.min.js")'
};
Then inside my main.js:
requirejs.config({
  paths: foo.js.modules
});

require([foo.js.modules.detect, foo.js.modules.images, "bean"],
  function(detect, images, bean) {
    // do stuff
  });
In the example above, I have to use the string "bean" (which refers to the require path) rather than my direct object (like the others use foo.js.modules.bar) otherwise I get the extra .js appended.
Hope this makes sense.
If you don't feel like adding a dependency on noext, you can also just append a dummy query string to the path to prevent the .js extension from being appended, as in:
require.config({
  paths: {
    'signalr-hubs': '/signalr/hubs?noext'
  }
});
This is what the noext plugin does.
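For reference, a rough sketch of consuming that aliased path afterwards (the module name matches the alias above; the empty callback is just a placeholder):
// the ?noext query string keeps RequireJS from appending .js to the path
require(['signalr-hubs'], function () {
  // the hubs script has been loaded at this point
});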
requirejs' noext plugin:
Load scripts without appending ".js" extension, useful for dynamic scripts...
Documentation
Check the examples folder. All the info you probably need will be inside comments or in the example code itself.
Basic usage
Put the plugins inside the baseUrl folder (usually same folder as the main.js file) or create an alias to the plugin location:
require.config({
  paths : {
    // create alias to plugins (not needed if plugins are on the baseUrl)
    async: 'lib/require/async',
    font: 'lib/require/font',
    goog: 'lib/require/goog',
    image: 'lib/require/image',
    json: 'lib/require/json',
    noext: 'lib/require/noext',
    mdown: 'lib/require/mdown',
    propertyParser : 'lib/require/propertyParser',
    markdownConverter : 'lib/Markdown.Converter'
  }
});

// use plugins as if they were at baseUrl
define([
  'image!awsum.jpg',
  'json!data/foo.json',
  'noext!js/bar.php',
  'mdown!data/lorem_ipsum.md',
  'async!http://maps.google.com/maps/api/js?sensor=false',
  'goog!visualization,1,packages:[corechart,geochart]',
  'goog!search,1',
  'font!google,families:[Tangerine,Cantarell]'
], function(awsum, foo, bar, loremIpsum){
  // all dependencies are loaded (including gmaps and other google apis)
});
I am using requirejs server-side with node.js. The noext plugin does not work for me. I suspect this is because it tries to add ?noext to a URL, and we have filenames instead of URLs server-side.
I need to name my files .njs or .model to separate them from static .js files. Hopefully the author will update requirejs to not force automatic .js file extension conventions on the users.
Meanwhile here is a quick patch to disable this behavior.
To apply this patch (against version 2.1.15 of node_modules/requirejs/bin/r.js):
Save in a file called disableAutoExt.diff or whatever and open a terminal
cd path/to/node_modules/
patch -p1 < path/to/disableAutoExt.diff
add disableAutoExt: true, to your requirejs.config: requirejs.config({disableAutoExt: true,});
Now we can do require(["test/index.njs", ...] ... and get back to work.
Save this patch in disableAutoExt.diff :
--- mod/node_modules/requirejs/bin/r.js 2014-09-07 20:54:07.000000000 -0400
+++ node_modules/requirejs/bin/r.js 2014-12-11 09:33:21.000000000 -0500
@@ -1884,6 +1884,10 @@
//Delegates to req.load. Broken out as a separate function to
//allow overriding in the optimizer.
load: function (id, url) {
+ if (config.disableAutoExt && url.match(/\..*\.js$/)) {
+ url = url.replace(/\.js$/, '');
+ }
+
req.load(context, id, url);
},
The patch simply adds the following around line 1887 to node_modules/requirejs/bin/r.js:
if (config.disableAutoExt && url.match(/\..*\.js$/)) {
url = url.replace(/\.js$/, '');
}
UPDATE: Improved patch by moving url change deeper in the code so it no longer causes a hang after calling undef on a module. Needed undef because:
To disable caching of modules when developing with node.js add this to your main app file:
requirejs.onResourceLoad = function(context, map)
{
requirejs.undef(map.name);
};
I am currently using requirejs to manage module js/css dependencies.
I'd like to discover the possibilities of having node do this via a centralized config file.
So instead of manually doing something like
define([
  'jquery',
  'lib/somelib',
  'views/someview']
within each module.
I'd have node inject the dependencies, i.e.
require('moduleA').setDeps('jquery','lib/somelib','views/someview')
Anyway, I'm interested in any projects looking at dependency injection for node.
thanks
I've come up with a solution for dependency injection. It's called injectr, and it uses node's vm library and replaces the default functionality of require when including a file.
So in your tests, instead of require('libToTest'), use injectr('libToTest', { 'libToMock': myMock });. I wanted to make the interface as straightforward as possible, with no need to alter the code being tested. I think it works quite well.
It's just worth noting that injectr files are relative to the working directory, unlike require which is relative to the current file, but that shouldn't matter because it's only used in tests.
I've previously toyed with the idea of providing an alternate require to make a form of dependency injection available in Node.js.
Module code
For example, suppose you have following statements in code.js:
fs = require('fs');
console.log(fs.readFileSync('text.txt', 'utf-8'));
If you run this code with node code.js, then it will print out the contents of text.txt.
Injector code
However, suppose you have a test module that wants to abstract away the file system.
Your test file test.js could then look like this:
var origRequire = global.require;
global.require = dependencyLookup;
require('./code.js');
function dependencyLookup (file) {
  switch (file) {
    case 'fs': return { readFileSync: function () { return "test contents"; } };
    default: return origRequire(file);
  }
}
If you now run node test.js, it will print out "test contents", even though it includes code.js.
I've also written a module to accomplish this, it's called rewire. Just use npm install rewire and then:
var rewire = require("rewire"),
    myModule = rewire("./path/to/myModule.js"); // exactly like require()

// Your module will now export a special setter and getter for private variables.
myModule.__set__("myPrivateVar", 123);
myModule.__get__("myPrivateVar"); // = 123

// This allows you to mock almost everything within the module e.g. the fs-module.
// Just pass the variable name as first parameter and your mock as second.
myModule.__set__("fs", {
  readFile: function (path, encoding, cb) {
    cb(null, "Success!");
  }
});

myModule.readSomethingFromFileSystem(function (err, data) {
  console.log(data); // = Success!
});
I've been inspired by Nathan MacInnes's injectr but used a different approach. I don't use vm to eval the test module; in fact, I use node's own require. This way your module behaves exactly like it does with require() (except for your modifications). Also, debugging is fully supported.