First question, so bear with me if it is not very clear, but I'll try my best.
I am currently running through a YouTube video to test my contract with hardhat, ethers, and waffle (https://www.youtube.com/watch?v=oTpmNEYV8iQ&list=PLw-9a9yL-pt3sEhicr6gmuOQdcmWXhCx4&index=6).
Here is the contract:
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.9;
import "#openzeppelin/contracts/token/ERC721/ERC721.sol";
contract MyContract is ERC721 {
constructor(string memory name, string memory symbol)
ERC721(name, symbol) {
}
}
And here is test.js:
const { expect } = require('chai');
describe("MyContract", function() {
it("should return correct name", async function() {
const MyContract = hre.ethers.getContractFactory("MyContract");
const myContractDeployed = await MyContract.deploy("MyContractName", "MCN");
await myContractDeployed.deployed();
expect(await myContractDeployed.name()).to.equal("MyContractName");
});
});
When I run "npx hardhat test" in the terminal, it returns:
MyContract
1) should return correct name
0 passing (7ms)
1 failing
1) MyContract
should return correct name:
TypeError: Cannot read properties of undefined (reading 'getContractFactory')
at Context.<anonymous> (test\test.js:7:35)
at processImmediate (node:internal/timers:464:21)
My code matches the one from the video, and I am having a tough time understanding why I am getting a TypeError here. Any guidance is much appreciated!
EDIT:
I somehow fixed it. I don't understand exactly how this fixed it, but it did. Instead of just installing
npm install @nomiclabs/hardhat-waffle ethereum-waffle chai @nomiclabs/hardhat-ethers ethers
I installed
npm install --save-dev @nomiclabs/hardhat-waffle ethereum-waffle chai @nomiclabs/hardhat-ethers ethers
Then the terminal printed
npm WARN idealTree Removing dependencies.@nomiclabs/hardhat-waffle in favor of devDependencies.@nomiclabs/hardhat-waffle
npm WARN idealTree Removing dependencies.ethereum-waffle in favor of devDependencies.ethereum-waffle
npm WARN idealTree Removing dependencies.@nomiclabs/hardhat-ethers in favor of devDependencies.@nomiclabs/hardhat-ethers
npm WARN idealTree Removing dependencies.ethers in favor of devDependencies.ethers
Then I removed the hre in front of ethers.getContractFactory("MyContract") and it worked! If anyone would like to explain why this might have fixed it, I'd be happy to read it; otherwise I am moving on.
Add the following code snippet at the top of your hardhat.config.js file
require("#nomiclabs/hardhat-waffle");
Sometimes this happens because one of the dependencies below is missing, especially if you are using a dotenv file and forget to import it. So, put these import statements in your hardhat.config or truffle.config file:
require("#nomicfoundation/hardhat-toolbox");
require("#nomiclabs/hardhat-ethers");
require("dotenv").config();
You needed to import hre in the test code.
const hre = require("hardhat");
Is there a way to get the version set in package.json in a nodejs app? I would want something like this
var port = process.env.PORT || 3000
app.listen port
console.log "Express server listening on port %d in %s mode %s", app.address().port, app.settings.env, app.VERSION
I found that the following code fragment worked best for me. Since it uses require to load the package.json, it works regardless of the current working directory.
var pjson = require('./package.json');
console.log(pjson.version);
A warning, courtesy of @Pathogen:
Doing this with Browserify has security implications.
Be careful not to expose your package.json to the client, as it means that all your dependency version numbers, build and test commands and more are sent to the client.
If you're building server and client in the same project, you expose your server-side version numbers too.
Such specific data can be used by an attacker to better fit the attack on your server.
If your application is launched with npm start, you can simply use:
process.env.npm_package_version
See package.json vars for more details.
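Applied to the question's snippet, a minimal sketch (this only works when the app is launched via npm, e.g. npm start):
var express = require('express');
var app = express();
var port = process.env.PORT || 3000;

app.listen(port, function () {
  // npm_package_version is only defined when the process was started by npm
  console.log('Express server listening on port %d, version %s', port, process.env.npm_package_version);
});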
Using ES6 modules you can do the following:
import {version} from './package.json';
Or in plain old shell:
$ node -e "console.log(require('./package.json').version);"
This can be shortened to
$ node -p "require('./package.json').version"
There are two ways of retrieving the version:
Requiring package.json and getting the version:
const { version } = require('./package.json');
Using the environment variables:
const version = process.env.npm_package_version;
Please don't use JSON.parse, fs.readFile, or fs.readFileSync, and don't use other npm modules; they are not necessary for this question.
For those who look for a safe client-side solution that also works on server-side, there is genversion. It is a command-line tool that reads the version from the nearest package.json and generates an importable CommonJS module file that exports the version. Disclaimer: I'm a maintainer.
$ genversion lib/version.js
I acknowledge the client-side safety was not OP's primary intention, but as discussed in answers by Mark Wallace and aug, it is highly relevant and also the reason I found this Q&A.
Here is how to read the version out of package.json:
var fs = require('fs');
var json = JSON.parse(fs.readFileSync('package.json', 'utf8'));
var version = json.version;
EDIT: Wow, this answer was originally from 2012! There are several better answers now. Probably the cleanest is:
const { version } = require('./package.json');
There is another way of fetching certain information from your package.json file, namely using the pkginfo module.
Usage of this module is very simple. You can get all package variables using:
require('pkginfo')(module);
Or only certain details (version in this case)
require('pkginfo')(module, 'version');
And your package variables will be set to module.exports (so version number will be accessible via module.exports.version).
You could use the following code snippet:
require('pkginfo')(module, 'version');
console.log "Express server listening on port %d in %s mode %s", app.address().port, app.settings.env, module.exports.version
This module has a very nice feature: it can be used in any file in your project (e.g. in subfolders), and it will automatically fetch the information from your package.json. So you do not have to worry about where your package.json is.
I hope that will help.
NPM one liner:
From npm v7.20.0:
npm pkg get version
Prior to npm v7.20.0:
npm -s run env echo '$npm_package_version'
Note the output is slightly different between these two methods: the former outputs the version number surrounded by quotes (i.e. "1.0.0"), the latter without (i.e. 1.0.0).
One solution to remove the quotes in Unix is using xargs
npm pkg get version | xargs echo
Option 1
Best practice is to read the version from package.json using npm environment variables.
process.env.npm_package_version
more information on: https://docs.npmjs.com/using-npm/config.html
This will only work when you start your service with an npm command.
Quick info: you can read any value in package.json using process.env.npm_package_[keyname]
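For example, assuming a hypothetical package.json with "name": "my-app" and "version": "1.0.0", and the app started via npm:
console.log(process.env.npm_package_name);    // 'my-app'
console.log(process.env.npm_package_version); // '1.0.0'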
Option 2
Set the version in an environment variable using https://www.npmjs.com/package/dotenv with a .env file, and read it as process.env.version.
Just adding an answer because I came to this question to see the best way to include the version from package.json in my web application.
I know this question is targeted at Node.js; however, if you are using Webpack to bundle your app, just a reminder that the recommended way is to use the DefinePlugin to declare a global version in the config and reference that. So you could do the following in your webpack.config.js:
const webpack = require('webpack');
const pkg = require('../../package.json');
...
plugins: [
  new webpack.DefinePlugin({
    AppVersion: JSON.stringify(pkg.version),
  }),
  ...
],
AppVersion is then a global that is available for you to use. Also make sure you declare it in your .eslintrc via the globals prop.
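For example, a minimal .eslintrc fragment (assuming the global name AppVersion from the config above):
{
  "globals": {
    "AppVersion": "readonly"
  }
}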
If you are looking for module (package.json: "type": "module") (ES6 import) support, e.g. coming from refactoring commonJS, you should (at the time of writing) do either:
import { readFile } from 'fs/promises';
const pkg = JSON.parse(await readFile(new URL('./package.json', import.meta.url)));
console.log(pkg.version)
or, run the node process with node --experimental-json-modules index.js to do:
import pkg from './package.json'
console.log(pkg.version)
You will, however, get a warning until JSON modules become generally available.
If you get syntax or (top-level) async errors, you are likely on an older Node version. Update to at least node@14.
You can use ES6 to import package.json to retrieve version number and output the version on console.
import {name as app_name, version as app_version} from './path/to/package.json';
console.log(`App ---- ${app_name}\nVersion ---- ${app_version}`);
A safe option is to add an npm script that generates a separate version file:
"scripts": {
"build": "yarn version:output && blitz build",
"version:output": "echo 'export const Version = { version: \"'$npm_package_version.$(date +%s)'\" }' > version.js"
}
This outputs version.js with the contents:
export const Version = { version: "1.0.1.1622225484" }
To determine the package version in node code, you can use the following:
const version = require('./package.json').version; for < ES6 versions
import {version} from './package.json'; for ES6 version
const version = process.env.npm_package_version;
if application has been started using npm start, all npm_* environment variables become available.
You can use following npm packages as well - root-require, pkginfo, project-version.
We can read the version or other keys from package.json in two ways:
1 - using require and picking out the key required, e.g.:
const { version } = require('./package.json')
2 - using package vars as mentioned in the docs:
process.env.npm_package_version
You can use the project-version package.
$ npm install --save project-version
Then
const version = require('project-version');
console.log(version);
//=> '1.0.0'
It uses process.env.npm_package_version but falls back on the version written in package.json in case the env var is missing for some reason.
Why not use require.resolve...
const path = require('path');
const packageJsonPath = path.dirname(require.resolve('package-name')) + '/package.json';
const { version } = require(packageJsonPath);
console.log('version', version);
This approach works for all subpaths :)
In case you want to get the version of a target package:
import { version } from 'TARGET_PACKAGE/package.json';
Example:
import { version } from 'react/package.json';
I know this isn't the intent of the OP, but I just had to do this, so hope it helps the next person.
If you're using docker-compose for your CI/CD process, you can get it this way!
version:
image: node:7-alpine
volumes:
- .:/usr/src/service/
working_dir: /usr/src/service/
command: ash -c "node -p \"require('./package.json').version.replace('\n', '')\""
For the image, you can use any node image. I use Alpine because it is the smallest.
The leanest way I found:
const { version } = JSON.parse(fs.readFileSync('./package.json'))
I've actually been through most of the solutions here and they either did not work on both Windows and Linux/OSX, or didn't work at all, or relied on Unix shell tools like grep/awk/sed.
The accepted answer works technically, but it sucks your whole package.json into your build and that's a Bad Thing that only the desperate should use temporarily to get unblocked, and in general should be avoided, at least for production code. The alternative is to use that method only to write the version to a single constant that can be used instead of the whole file.
So for anyone else looking for a cross-platform solution (not reliant on Unix shell commands) and local (without external dependencies):
Since it can be assumed that Node.js is installed, and it's already cross-platform for this, I just created a make_version.js file with:
const PACKAGE_VERSION = require("./package.json").version;
console.log(`export const PACKAGE_VERSION = "${PACKAGE_VERSION}";`);
console.error("package.json version:", PACKAGE_VERSION);
and added a version command to package.json:
"scripts": {
  "version": "node make_version.js > src/version.js",
and then added:
"prebuild": "npm run version",
"prestart": "npm run version",
and it creates a new src/version.js on every start or build. Of course this can be easily tuned in the version script to be a different location, or in the make_version.js file to output different syntax and constant name, etc.
I do this with findup-sync:
var findup = require('findup-sync');
var packagejson = require(findup('package.json'));
console.log(packagejson.version); // => '0.0.1'
I am using GitLab CI and want to automatically use the different versions to tag my Docker images and push them. Their default Docker image does not include node, so my version to do this in shell only is this:
scripts/getCurrentVersion.sh
BASEDIR=$(dirname $0)
cat $BASEDIR/../package.json | grep '"version"' | head -n 1 | awk '{print $2}' | sed 's/"//g; s/,//g'
Now what this does is
Print your package json
Search for the lines with "version"
Take only the first result
Replace " and ,
Please note that I have my scripts in a subfolder with the corresponding name in my repository. So if you don't, change $BASEDIR/../package.json to $BASEDIR/package.json.
Or, if you want to be able to get the major, minor and patch versions, I use this:
scripts/getCurrentVersion.sh
VERSION_TYPE=$1
BASEDIR=$(dirname $0)
VERSION=$(cat $BASEDIR/../package.json | grep '"version"' | head -n 1 | awk '{print $2}' | sed 's/"//g; s/,//g')
if [ $VERSION_TYPE = "major" ]; then
echo $(echo $VERSION | awk -F "." '{print $1}' )
elif [ $VERSION_TYPE = "minor" ]; then
echo $(echo $VERSION | awk -F "." '{print $1"."$2}' )
else
echo $VERSION
fi
This way, if your version was 1.2.3, your output would look like this:
$ > sh ./getCurrentVersion.sh major
1
$> sh ./getCurrentVersion.sh minor
1.2
$> sh ./getCurrentVersion.sh
1.2.3
Now the only thing you will have to make sure of is that the version is the first place in package.json where that key is used; otherwise you'll end up with the wrong version.
I'm using create-react-app and I don't have process.env.npm_package_version available when executing my React-app.
I did not want to reference package.json in my client code (because of exposing dangerous info to the client, like package versions), nor did I want to install another dependency (genversion).
I found out that I can reference version within package.json, by using $npm_package_version in my package.json:
"scripts": {
"my_build_script": "REACT_APP_VERSION=$npm_package_version react-scripts start"
}
Now the version is always following the one in package.json.
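The value is inlined at build time, so it can then be read anywhere in the client code:
// somewhere in a React component or module
console.log('App version:', process.env.REACT_APP_VERSION);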
I made a useful function to get the parent module's package.json:
const path = require('path')
const { existsSync } = require('fs')

function loadParentPackageJson() {
if (!module.parent || !module.parent.filename) return null
let dir = path.dirname(module.parent.filename)
let maxDepth = 5
let packageJson = null
while (maxDepth > 0) {
const packageJsonPath = `${dir}/package.json`
const exists = existsSync(packageJsonPath)
if (exists) {
packageJson = require(packageJsonPath)
break
}
dir = path.resolve(dir, '../')
maxDepth--
}
return packageJson
}
If using rollup, the rollup-plugin-replace plugin can be used to add the version without exposing package.json to the client.
// rollup.config.js
import pkg from './package.json';
import { terser } from "rollup-plugin-terser";
import resolve from 'rollup-plugin-node-resolve';
import commonJS from 'rollup-plugin-commonjs'
import replace from 'rollup-plugin-replace';
export default {
plugins: [
replace({
exclude: 'node_modules/**',
'MY_PACKAGE_JSON_VERSION': pkg.version, // will replace 'MY_PACKAGE_JSON_VERSION' with package.json version throughout source code
}),
]
};
Then, in the source code, anywhere where you want to have the package.json version, you would use the string 'MY_PACKAGE_JSON_VERSION'.
// src/index.js
export const packageVersion = 'MY_PACKAGE_JSON_VERSION' // replaced with actual version number in rollup.config.js
const { version } = require("./package.json");
console.log(version);
const v = require("./package.json").version;
console.log(v);
Require your package.json file in your server.js or app.js, and then access the package.json properties in the server file.
var package = require('./package.json');
The package variable contains all the data from package.json.
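A usage sketch (note that package is a reserved word in strict mode, so a different variable name is safer):
const pkg = require('./package.json');
console.log(pkg.name, pkg.version); // any field from package.json is available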
Used to version web-components like this:
const { version } = require('../package.json')
class Widget extends HTMLElement {
constructor() {
super()
this.attachShadow({ mode: 'open' })
}
public connectedCallback(): void {
this.renderWidget()
}
public renderWidget = (): void => {
this.shadowRoot?.appendChild(this.setPageTemplate())
this.setAttribute('version', version)
}
}
I have a Kotlin JVM project that runs some JavaScript in a native runtime. Currently, the different language sources are defined in separate repositories and the JS file is webpacked and packaged as a JAR to be specified as a dependency of the JVM project. This works fine, but I want to merge the two repositories as they are inherently coupled. Rather than maintain an abundance of different build tooling, I thought it would be a good opportunity to learn and use a polyglot build system, like Bazel.
The current structure:
Essentially, there are two main packages I'm trying to build. The web package builds correctly and I can view the webpacked output via command line. Including the web BUILD file for full picture:
load("#npm_bazel_typescript//:index.bzl", "ts_library")
ts_library(
name = "compileCore",
srcs = ["index.ts"],
tsconfig = "tsconfig.json",
)
filegroup(
name = "internalCore",
srcs = ["compileCore"],
output_group = "es5_sources",
)
load("#npm//webpack-cli:index.bzl", webpack = "webpack_cli")
webpack(
name = "bundle",
outs = ["bundle.prod.js"],
args = [
"--mode production",
"$(execpath internalCore)",
"--config",
"$(execpath webpack.config.js)",
"-o",
"$#",
],
data = [
"internalCore",
"webpack.config.js",
"#npm//:node_modules",
],
visibility = ["//visibility:public"],
)
The other important package is the nested //jvm/src/main/java/com/example/bazel/plugin package. This is essentially the final deliverable, which should be a JAR with the output of the web package included as resources.
load("#io_bazel_rules_kotlin//kotlin:kotlin.bzl", "kt_jvm_library")
kt_jvm_library(
name = "plugin",
srcs = glob(["*.kt"]),
deps = [
# ... some deps
],
resources = ["//web:bundle"],
visibility = ["//visibility:public"],
)
This is seemingly straightforward, but errors during the build with:
❯ bazel build //jvm/src/...
INFO: Analyzed target //jvm/src/main/java/com/example/bazel/plugin:plugin (0 packages loaded, 0 targets configured).
INFO: Found 1 target...
ERROR: /Users/jzucker/dev/GitHub/plugin-example-bazel/jvm/src/main/java/com/example/bazel/plugin/BUILD:12:15: error executing shell command: '/bin/bash -c external/bazel_tools/tools/zip/zipper/zipper c bazel-out/darwin-fastbuild/bin/jvm/src/main/java/com/example/bazel/plugin/plugin-resources.jar @bazel-out/darwin-fastbuild/bin/jvm/src/ma...' failed (Exit 255) bash failed: error executing command /bin/bash -c ... (remaining 1 argument(s) skipped)
Use --sandbox_debug to see verbose messages from the sandbox
File web/bundle.prod.js does not seem to exist. Target //jvm/src/main/java/com/example/bazel/plugin:plugin failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 0.402s, Critical Path: 0.05s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
After some experimentation, this seems to be an issue of trying to bundle generated outputs as resources for a kt_jvm_library. If the resources reference a tangible source file from another package, then it works just fine. The main question here is whether this is the right pattern for Bazel or if I'm trying to abuse this technology. This seems like a relatively simple use case, but there is a line in the docs that concerns me the most:
An invariant of all rules is that the files generated by a rule always belong to the same package as the rule itself; it is not possible to generate files into another package. It is not uncommon for a rule's inputs to come from another package, though.
From https://docs.bazel.build/versions/master/build-ref.html
Any insight would be greatly appreciated.
This is actually a bug in the Bazel Kotlin ruleset:
github.com/bazelbuild/rules_kotlin/issues/281
Until that is fixed, you can package the resources in a java_library and include that as resource_jars.
java_library(
name = "resources",
resources = ["//web:bundle"],
resource_strip_prefix = "web",
)
load("#io_bazel_rules_kotlin//kotlin:kotlin.bzl", "kt_jvm_library")
kt_jvm_library(
name = "plugin",
srcs = glob(["*.kt"]),
deps = [
# ... some deps
],
resource_jars = ["resources"],
visibility = ["//visibility:public"],
)
TravisCI builds are passing for my open-source project, and I'm now trying to integrate gulp-coveralls. On Coveralls.io, no builds can be found for my repository, even though Travis builds have run successfully since I added my repo to Coveralls.
'There have been no builds for this repo.'
When I try to run my gulp-coveralls gulp task, I get this error:
'Repo token could not be determined. Continuing without it.'
Error in plugin 'gulp-coveralls'
Bad response:422 {"message":"Couldn't find a repository matching this job.","error":true}
at handleError (/Users/sarah.green/angular-embedly/node_modules/gulp-coveralls/index.js:11:30)
at sendToCoverallsCallback (/Users/sarah.green/angular-embedly/node_modules/gulp-coveralls/index.js:19:9)
at /Users/sarah.green/angular-embedly/node_modules/gulp-coveralls/index.js:31:13
at Request._callback (/Users/sarah.green/angular-embedly/node_modules/gulp-coveralls/node_modules/coveralls/lib/sendToCoveralls.js:7:5)
at Request.self.callback (/Users/sarah.green/angular-embedly/node_modules/gulp-coveralls/node_modules/coveralls/node_modules/request/index.js:142:22)
at Request.EventEmitter.emit (events.js:98:17)
at Request.<anonymous> (/Users/sarah.green/angular-embedly/node_modules/gulp-coveralls/node_modules/coveralls/node_modules/request/index.js:856:14)
at Request.EventEmitter.emit (events.js:117:20)
at IncomingMessage.<anonymous> (/Users/sarah.green/angular-embedly/node_modules/gulp-coveralls/node_modules/coveralls/node_modules/request/index.js:808:12)
at IncomingMessage.EventEmitter.emit (events.js:117:20)
at _stream_readable.js:919:16
at process._tickCallback (node.js:419:13)
Here's what I've got so far:
gulp-coveralls in my dev dependencies in package.json
gulpfile.js:
var coveralls = require('gulp-coveralls');
...
gulp.task('coveralls', function () {
  return gulp.src('coverage/**/lcov.info') // returning the stream lets gulp know when the task finishes
    .pipe(coveralls());
});
karma.conf.js:
coverageReporter: {
type : 'lcov',
dir : 'coverage/'
}
Github: https://github.com/lithiumtech/angular-embedly
I use Karma and PhantomJS to run my tests. The file coverage/lcov.info is definitely being generated. Any idea what could be going on?
Sarah,
What you are missing is a coveralls repository token. You must go to coveralls.io and create a login using your GitHub account. This will then pull all your repos into coveralls. Then for the repo that you want to use coveralls with, you turn coveralls on by clicking the "Off" switch.
Now click the "view on coveralls" button and it will show you your repo key. You can then set it up by creating a .coveralls.yml file and copying your keys into that file. This should solve your problem.
Maybe you have a mistake in your .coveralls.yml file. If you are using Travis CI, try this:
service_name: travis-ci
repo_token: token_given
and if you're using Travis Pro:
service_name: travis-pro
repo_token: token_given
I hope this will be useful.
For GitHub builds running in travis-ci.org you don't need a service name in your .coveralls.yml file, just the token. You don't need a passing build in Travis either, just successfully generated LCOV data and a plugin that sends it.
I had one issue with gulp-coveralls where the LCOV data was not sent properly to coveralls when read from file using gulp.src. The only way it would finally work was if the LCOV data was sent to the plugin directly, rather than being stored in an intermediary file first.
To have both LCOV data piped to gulp-coveralls and a JSON/HTML report, I finally resorted to lazypipe to create reusable steps.
The complete project can be found at GitHub's angular-logger
// .coveralls.yml
repo_token: the_token
var jasmine = require('gulp-jasmine');
var cover = require('gulp-coverage');
var coveralls = require('gulp-coveralls');
var lazypipe = require('lazypipe');
(..)
// gulpfile.js
var testAndGather = lazypipe()
.pipe(cover.instrument, {
pattern: ['src/**/*.js'],
debugDirectory: 'debug'
})
.pipe(jasmine, {includeStackTrace: true})
.pipe(cover.gather);
gulp.task('test', ['build'], function () {
return gulp.src('spec/**/*spec.js')
.pipe(testAndGather())
.pipe(cover.format(['html']))
.pipe(gulp.dest('reports'));
});
gulp.task('travis', ['build'], function () {
return gulp.src('spec/**/*spec.js')
.pipe(testAndGather())
.pipe(cover.format(['lcov']))
.pipe(coveralls()); // directly pipe into coveralls
});
Using:
"gulp-jasmine": "~2.0.1",
"gulp-coverage": "~0.3.35",
"gulp-coveralls": "~0.1.4",
"lazypipe": "~0.2.3"
Any ideas on how I could implement an auto-reload of files in Node.js? I'm tired of restarting the server every time I change a file.
Apparently Node.js' require() function does not reload files if they already have been required, so I need to do something like this:
var sys = require('sys'),
http = require('http'),
posix = require('posix'),
json = require('./json');
var script_name = '/some/path/to/app.js';
this.app = require('./app').app;
process.watchFile(script_name, function(curr, prev){
posix.cat(script_name).addCallback(function(content){
process.compile( content, script_name );
});
});
http.createServer(this.app).listen( 8080 );
And in the app.js file I have:
var file = require('./file');
this.app = function(req, res) {
file.serveFile( req, res, 'file.js');
}
But this also isn't working - I get an error in the process.compile() statement saying that 'require' is not defined. process.compile is evaling the app.js, but has no clue about the node.js globals.
A good, up to date alternative to supervisor is nodemon:
Monitor for any changes in your node.js application and automatically restart the server - perfect for development
To use nodemon with versions of Node without npx (v8.1 and below, not advised):
$ npm install nodemon -g
$ nodemon app.js
Or to use nodemon with versions of Node with npx bundled in (v8.2+):
$ npm install nodemon
$ npx nodemon app.js
Or as a devDependency with an npm script in package.json:
"scripts": {
"start": "nodemon app.js"
},
"devDependencies": {
"nodemon": "..."
}
node-supervisor is awesome
usage to restart on save for old Node versions (not advised):
npm install supervisor -g
supervisor app.js
usage to restart on save for Node versions that come with npx:
npm install supervisor
npx supervisor app.js
or directly call supervisor in an npm script:
"scripts": {
"start": "supervisor app.js"
}
I found a simple way:
delete require.cache['/home/shimin/test2.js']
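Since the cache is keyed by the resolved absolute path, a slightly more robust sketch of the same idea uses require.resolve:
// remove a module from the require cache so the next require() re-reads it from disk
function unrequire(modulePath) {
  delete require.cache[require.resolve(modulePath)];
}

unrequire('./test2.js');
var fresh = require('./test2.js'); // freshly loaded copy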
If somebody still comes to this question and wants to solve it using only the standard modules, I made a simple example:
var process = require('process');
var cp = require('child_process');
var fs = require('fs');
var server = cp.fork('server.js');
console.log('Server started');
fs.watchFile('server.js', function (event, filename) {
server.kill();
console.log('Server stopped');
server = cp.fork('server.js');
console.log('Server started');
});
process.on('SIGINT', function () {
server.kill();
fs.unwatchFile('server.js');
process.exit();
});
This example is only for one file (server.js), but can be adapted to multiple files using an array of files, a for loop to get all file names, or by watching a directory:
fs.watch('./', function (event, filename) { // sub directory changes are not seen
console.log(`restart server`);
server.kill();
server = cp.fork('server.js');
})
This code was made for the Node.js 0.8 API; it is not adapted for some specific needs but will work in some simple apps.
UPDATE:
This functionality is implemented in my module simpleR; see the GitHub repo.
nodemon came up first in a Google search, and it seems to do the trick:
npm install nodemon -g
cd whatever_dir_holds_my_app
nodemon app.js
nodemon is a great one. I just add more parameters for debugging and watching options.
package.json
"scripts": {
"dev": "cross-env NODE_ENV=development nodemon --watch server --inspect ./server/server.js"
}
The command: nodemon --watch server --inspect ./server/server.js
Whereas:
--watch server Restart the app when changing .js, .mjs, .coffee, .litcoffee, and .json files in the server folder (including subfolders).
--inspect Enable remote debug.
./server/server.js The entry point.
Then add the following config to launch.json (VS Code) and start debugging anytime.
{
"type": "node",
"request": "attach",
"name": "Attach",
"protocol": "inspector",
"port": 9229
}
Note that it's better to install nodemon as a dev dependency of the project, so your team members don't need to install it or remember the command arguments; they just npm run dev and start hacking.
See more on nodemon docs: https://github.com/remy/nodemon#monitoring-multiple-directories
Nodemon has been the go-to for restarting a server on file changes for a long time. Now with Node.js 19 a --watch flag has been introduced, which does the same (experimental). Docs:
node --watch index.js
node-dev works great. npm install node-dev
It even gives a desktop notification when the server is reloaded, showing success or error messages.
Start your app on the command line with:
node-dev app.js
There is node-supervisor, which you can install with
npm install supervisor
see http://github.com/isaacs/node-supervisor
You can use nodemon from NPM.
And if you are using Express generator then you can use this command inside your project folder:
nodemon npm start
or using Debug mode
DEBUG=yourapp:* nodemon npm start
You can also run it directly:
nodemon your-app-file.js
Hope this helps.
There was a recent (2009) thread about this subject on the node.js mailing list. The short answer is no, it's currently not possible to auto-reload required files, but several people have developed patches that add this feature.
With Node.js 19 you can monitor file changes with the --watch option. After a file is changed, the process is restarted automatically, reflecting new changes.
node --watch server.js
Yet another solution for this problem is using forever:
Another useful capability of Forever is that it can optionally restart
your application when any source files have changed. This frees you
from having to manually restart each time you add a feature or fix a
bug. To start Forever in this mode, use the -w flag:
forever -w start server.js
Here is a blog post about Hot Reloading for Node. It provides a GitHub Node branch that you can use to replace your installation of Node to enable Hot Reloading.
From the blog:
var http = require('http');
var requestHandler = require('./myRequestHandler');
process.watchFile('./myRequestHandler', function () {
  module.unCacheModule('./myRequestHandler');
  requestHandler = require('./myRequestHandler');
});
var reqHandlerClosure = function (req, res) {
requestHandler.handle(req, res);
}
http.createServer(reqHandlerClosure).listen(8000);
Now, any time you modify myRequestHandler.js, the above code will notice and replace the local requestHandler with the new code. Any existing requests will continue to use the old code, while any new incoming requests will use the new code. All without shutting down the server, bouncing any requests, prematurely killing any requests, or even relying on an intelligent load balancer.
I am working on making a rather tiny node "thing" that is able to load/unload modules at will (so, i.e., you could restart part of your application without bringing the whole app down).
I am incorporating a (very stupid) dependency management, so that if you want to stop a module, all the modules that depend on it will be stopped too.
So far so good, but then I stumbled into the issue of how to reload a module. Apparently, one could just remove the module from the "require" cache and have the job done. Since I'm not keen to change the node source code directly, I came up with a very hacky hack: search the stack trace for the last call to the "require" function, grab a reference to its "cache" field and, well, delete the reference to the module:
var args = arguments
while(!args['1'] || !args['1'].cache) {
args = args.callee.caller.arguments
}
var cache = args['1'].cache
util.log('remove cache ' + moduleFullpathAndExt)
delete( cache[ moduleFullpathAndExt ] )
Even easier, actually:
var deleteCache = function(moduleFullpathAndExt) {
delete( require.cache[ moduleFullpathAndExt ] )
}
Apparently, this works just fine. I have absolutely no idea what that arguments["1"] means, but it's doing its job. I believe the node guys will implement a reload facility someday, so I guess that for now this solution is acceptable too.
(btw. my "thing" will be here: https://github.com/cheng81/wirez , go there in a couple of weeks and you should see what I'm talking about)
Solution at:
http://github.com/shimondoodkin/node-hot-reload
Notice that you have to take care of the references used yourself. That means if you did var x = require('foo'); y = x; z = x.bar; and hot reloaded it, you would have to replace the references stored in x, y and z in the hot reload callback function.
Some people confuse hot reload with auto restart; my nodejs-autorestart module also has upstart integration to enable auto start on boot. If you have a small app, auto restart is fine, but when you have a large app hot reload is more suitable, simply because hot reload is faster.
Also I like my node-inflow module.
Here's a low tech method for use in Windows. Put this in a batch file called serve.bat:
@echo off
:serve
start /wait node.exe %*
goto :serve
Now instead of running node app.js from your cmd shell, run serve app.js.
This will open a new shell window running the server. The batch file will block (because of the /wait) until you close the shell window, at which point the original cmd shell will ask "Terminate batch job (Y/N)?" If you answer "N" then the server will be relaunched.
Each time you want to restart the server, close the server window and answer "N" in the cmd shell.
My app structure:
NodeAPP (folder)
|-- app (folder)
|-- all other file is here
|-- node_modules (folder)
|-- package.json
|-- server.js (my server file)
First install reload with this command:
npm install [-g] [--save-dev] reload
Then change package.json:
"scripts": {
"start": "nodemon -e css,ejs,js,json --watch app"
}
Now you must use reload in your server file:
var express = require('express');
var reload = require('reload');
var app = express();
app.set('port', process.env.PORT || 3000);
var server = app.listen(app.get('port'), function() {
console.log( 'server is running on port ' + app.get('port'));
});
reload(server, app);
And for the last change, at the end of your response send this script:
<script src="/reload/reload.js"></script>
Now start your app with this command:
npm start
You can do it with browser-refresh. Your node app restarts automatically, and your result page in the browser also refreshes automatically. The downside is that you have to put a js snippet on the generated page. Here's the repo for the working example.
const http = require('http');
const hostname = 'localhost';
const port = 3000;
const server = http.createServer((req, res) => {
res.statusCode = 200;
res.setHeader('Content-Type', 'text/html; charset=UTF-8');
res.write('Simple refresh!');
res.write(`<script src=${process.env.BROWSER_REFRESH_URL}></script>`);
res.end();
})
server.listen(port, hostname, () => {
console.log(`Server running at http://${hostname}:${port}/`);
if (process.send) {
process.send({ event: 'online', url: `http://${hostname}:${port}/` })
}
});
It's not necessary to use nodemon or other tools like that. Just use the capabilities of your IDE.
Probably the best one is IntelliJ WebStorm with its hot reload feature (automatic server and browser reload) for node.js.
I have tried pm2: installation is easy and it is easy to use too; the result is satisfying. However, we have to take care of which edition of pm2 we want. pm2 runtime is the free edition, whereas pm2 plus and pm2 enterprise are not free.
As for Strongloop, my installation failed or was not complete, so I couldn't use it.
If you're talking about server-side NodeJS hot-reloading, let's say you wish to have a JavaScript file on the server which has an express route described, and you want this JavaScript file to hot reload rather than the server restarting on file change, then razzle can do that.
An example of this is basic-server
https://github.com/jaredpalmer/razzle/tree/master/examples/basic-server
The file https://github.com/jaredpalmer/razzle/blob/master/examples/basic-server/src/server.js will hot-reload if it is changed and saved, the server does not re-start.
This means you can program a REST server which can hot-reload using razzle.
It's quite simple to just do this yourself without any dependency... the built-in file watcher has matured enough that it doesn't suck as much as before.
You don't need any complicated child process to spawn/kill and pipe std in/out... you just need a simple worker, that's all! A Web Worker is also what I would have used in browsers... so stick to web techniques! The worker will also log to the console.
import { watch } from 'node:fs/promises'
import { Worker } from 'node:worker_threads'
let worker = new Worker('./app.js')
async function reloadOnChange (dir) {
const watcher = watch(dir, { recursive: true })
for await (const change of watcher) {
if (change.filename.endsWith('.js')) {
worker.terminate()
worker = new Worker('./app.js')
}
}
}
// All the folders to watch
['./src', './lib', './test'].map(reloadOnChange)
This might not be the best solution if you use anything other than JavaScript or depend on some build process.
Use this:
var fs = require('fs');
var path = require('path');
var _ = require('underscore'); // or lodash; provides _.extend

function reload_config(file) {
if (!(this instanceof reload_config))
return new reload_config(file);
var self = this;
self.path = path.resolve(file);
fs.watchFile(file, function(curr, prev) {
delete require.cache[self.path];
_.extend(self, require(file));
});
_.extend(self, require(file));
}
All you have to do now is:
var config = reload_config("./config");
And config will automatically get reloaded :)
loaddir is my solution for quick loading of a directory, recursively.
It can return
{ 'path/to/file': 'fileContents...' }
or
{ path: { to: { file: 'fileContents'} } }
It has a callback which will be called when a file is changed.
It handles situations where files are large enough that watch gets called before they're done writing.
I've been using it in projects for a year or so, and just recently added promises to it.
Help me battle test it!
https://github.com/danschumann/loaddir
You can use auto-reload to reload a module without shutting down the server.
install
npm install auto-reload
example
data.json
{ "name" : "Alan" }
test.js
var fs = require('fs');
var reload = require('auto-reload');
var data = reload('./data', 3000); // reload every 3 secs
// print data every sec
setInterval(function() {
console.log(data);
}, 1000);
// update data.json every 3 secs
setInterval(function() {
var data = '{ "name":"' + Math.random() + '" }';
fs.writeFile('./data.json', data, function (err) { if (err) throw err; });
}, 3000);
Result:
{ name: 'Alan' }
{ name: 'Alan' }
{ name: 'Alan' }
{ name: 'Alan' }
{ name: 'Alan' }
{ name: '0.8272748321760446' }
{ name: '0.8272748321760446' }
{ name: '0.8272748321760446' }
{ name: '0.07935990858823061' }
{ name: '0.07935990858823061' }
{ name: '0.07935990858823061' }
{ name: '0.20851597073487937' }
{ name: '0.20851597073487937' }
{ name: '0.20851597073487937' }
Another simple solution is to use fs.readFile instead of require.
You can save a text file containing a JSON object, and create an interval on the server to reload this object (see the sketch after the list below).
pros:
no need to use external libs
relevant for production (reloading config file on change)
easy to implement
cons:
you can't reload a module - just a json containing key-value data
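A minimal sketch of that approach (the file name and interval are arbitrary):
var fs = require('fs');

var config = {};

function loadConfig() {
  fs.readFile('./config.json', 'utf8', function (err, data) {
    if (err) return console.error('config reload failed:', err);
    config = JSON.parse(data); // unlike require, fs.readFile is never cached
  });
}

loadConfig();
setInterval(loadConfig, 3000); // re-read the file every 3 seconds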
For people using Vagrant and PhpStorm, a file watcher is a faster approach.
Disable immediate sync of the files so you run the command only on save, then create a scope for the *.js files and working directories, and add this command:
vagrant ssh -c "/var/www/gadelkareem.com/forever.sh restart"
where forever.sh is like
#!/bin/bash
cd /var/www/gadelkareem.com/ && forever $1 -l /var/www/gadelkareem.com/.tmp/log/forever.log -a app.js
I recently came to this question because the usual suspects were not working with linked packages. If you're like me and are taking advantage of npm link during development to effectively work on a project that is made up of many packages, it's important that changes that occur in dependencies trigger a reload as well.
After having tried nodemon and pm2, even following their instructions for additionally watching the node_modules folder, they still did not pick up changes. Although there are some custom solutions in the answers here, for something like this, a separate package is cleaner. I came across node-dev today and it works perfectly without any options or configuration.
From the Readme:
In contrast to tools like supervisor or nodemon it doesn't scan the filesystem for files to be watched. Instead it hooks into Node's require() function to watch only the files that have been actually required.
const cleanCache = (moduleId) => {
const module = require.cache[moduleId];
if (!module) {
return;
}
// 1. clean parent
if (module.parent) {
module.parent.children.splice(module.parent.children.indexOf(module), 1);
}
// 2. clean self
require.cache[moduleId] = null;
};
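Hypothetical usage, resolving the module id first:
cleanCache(require.resolve('./config.js'));
const fresh = require('./config.js'); // re-evaluated on the next require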