Nodemon - specifying extension watch list using config files - javascript

Is there a way to specify watch list using config files instead of command line?
Command line method in nodemon's doc:
I attempted to use a nodemon.json config file with the following:
{
"ext": ["js", "json", "hbs", "html"]
}
This returned an 'extension.match' error.
Then I tried to add the config to package.json with the below:
{...
"nodemonConfig": {
"ext": ["js", "json", "hbs", "html"]
}
...}
Also same error.
I have a feeling both approaches are on the right track, but I'm missing something.

You can use a nodemon.json file in the root of your application. You were almost there, but the syntax is slightly different from what you had: ext takes a comma-separated string, not an array. The correct syntax looks like this:
{
"ext": "js,json,hbs,html"
}
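For reference, other nodemon options can live in the same file. A sketch (the watch and ignore values here are made-up examples, not from the question):

```json
{
  "ext": "js,json,hbs,html",
  "watch": ["src"],
  "ignore": ["src/**/*.test.js"]
}
```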

Nodemon also supports configuration via package.json, under the nodemonConfig key.
The config has the same format as in the nodemon.json config file:
package.json
{
...
"nodemonConfig": {
"ext": "js,json,hbs,html",
...
}
}

Step 1: Add a nodemon.json file in the root of your project.
(For example, if you have a Weather-App project, add nodemon.json in its root.)
Weather-App/
|_ nodemon.json
Step 2: Add this code to the newly created nodemon.json file:
{
"ext": "js,json,hbs,html,css"
}

Related

Configure custom "root" directory with "babel-plugin-module-resolver"?

Here is my project structure:
cloudRun
distApp // TRANSPILED APP FILES FROM ./src
distService // TRANSPILED BACKEND FILES FROM ./cloudRun/src
src // SOURCE FILES FOR THE BACKEND CODE
index.js // INDEX.JS FOR THE BACKEND CODE
babel.config.js // CONFIG FOR THE BABEL TRANSPILE SCRIPT
src // SOURCE FILES FOR THE APP
index.js // INDEX.JS FOR THE APP CODE
package.json // THIS IS THE MAIN PROJECT package.json
I'll try to be very succinct and clear.
In both of the index.js (app and backend code) I use path aliases in the source code.
For example:
./src/some-folder/some-file.js
import xxx from "#src/hooks/someHoot";
// IN THE TRANSPILED VERSION #src MUST BE CONVERTED TO ./cloudRun/distApp
And also, for example:
./cloudRun/src/some-folder/some-file.js
import xxx from "#src/hooks/someHoot";
// IN THE TRANSPILED VERSION #src MUST BE CONVERTED TO ./cloudRun/distApp
But somehow I'm having trouble configuring module-resolver in babel.config.js. Either I get it to work correctly with the path aliases present in ./src (and the path aliases in ./cloudRun/src are all wrong by one level), or vice-versa.
For example:
./cloudRun/babel.config.js
plugins = [
["module-resolver", {
"alias": {
"#src" : "./distApp",
"#hooks" : "./distApp/hooks",
}
}]
];
This works for the ./src files. But files from ./cloudRun/src are all wrong by 1 level up.
And if I change to this:
./cloudRun/babel.config.js
plugins = [
["module-resolver", {
"alias": {
"#src" : "./cloudRun/distApp",
"#hooks" : "./cloudRun/distApp/hooks",
}
}]
];
Then it works fine for the ./cloudRun/src files. But all files from ./src will be wrong by 1 level down.
I was thinking that I might fix this with the "root" option in the module-resolver config. But I couldn't make it work yet.
Maybe something like this:
./cloudRun/babel.config.js
plugins = [
["module-resolver", {
"root": ["./cloudRun"], // SET A NEW ROOT HERE
"alias": {
"#src" : "./distApp",
"#hooks" : "./distApp/hooks",
}
}]
];
I've tried many things inside the "root" config. But so far it doesn't seem to make any difference.
Here is how I run babel:
// SCRIPTS FROM ./package.json
babel src --out-dir cloudRun/distApp --config-file ./cloudRun/babel.config.js
babel cloudRun/src --out-dir cloudRun/distService --config-file ./cloudRun/babel.config.js

How to create multiple pages with different languages from one template?

I want to generate multiple pages which will have content on different languages from one common template. How can I do it with webpack?
I tried different webpack plugins like webpack-static-i18n-html and i18n-webpack-plugin, but nothing worked for me. The best thing I found is webpack-static-i18n-html, but it is poorly supported and can't watch changes in the JSON files with the translated text. Below is what I have for now.
This is my code from webpack.common.js.
const Path = require('path');
const HtmlWebpackPlugin = require('html-webpack-plugin');
const StaticI18nHtmlPlugin = require("webpack-static-i18n-html");
//...
module.exports = {
//...
plugins: [
//...
new StaticI18nHtmlPlugin({
locale: 'en',
locales: ['en', 'ua', 'ru'],
baseDir: Path.posix.join(__dirname, ".."),
outputDir: 'src/localized-pages',
outputDefault: '__lng__/__file__',
localesPath: 'src/locales',
files: 'src/templates/index.html'
}),
new HtmlWebpackPlugin({
filename: 'index.html',
template: Path.resolve(__dirname, '../src/templates/index.html')
}),
new HtmlWebpackPlugin({
filename: 'ua/index.html',
template: Path.resolve(__dirname, '../src/localized-pages/ua/src/templates/index.html')
}),
new HtmlWebpackPlugin({
filename: 'ru/index.html',
template: Path.resolve(__dirname, '../src/localized-pages/ru/src/templates/index.html')
}),
//...
],
//...
};
I also have webpack.dev.js and webpack.prod.js, which I merge with webpack.common.js via the webpack-merge plugin. As you can see, after generating the pages, I have to use HtmlWebpackPlugin to serve them. It's awkward to use.
locales folder:
locales
|-en.json
|-ua.json
|-ru.json
en.json file:
{
"key": {
"innerKey" : "value"
}
}
Then the plugin transforms this:
<p data-t>key.innerKey</p>
into this:
<p>value</p>
But as I said, if I change en.json nothing regenerates, so I won't use this approach to generate multiple pages for different languages.
So, I would like to generate several pages from one template. Is there any way to do this with webpack?
I was working on a multi-language admin dashboard with Webpack and was wondering how to tackle this problem, until I found a way to make everything automatic with a multi-language web template.
First of all, webpack-static-i18n-html isn't a good solution because most of its packages are deprecated. However, it is based on a good npm package called node-static-i18n. So the first thing you need to do is install that package with this command:
npm install -g static-i18n
Next, you need to create your translation files as *.json files in JSON format and put them in a folder, which I named "locales" and placed in the "src" folder of my project. I need two languages for my website: English and Farsi (Persian). Therefore I made two files, namely fa.json and en.json. So I have a folder and file structure like the picture below:
(Screenshot: my file and folder structure in my Webpack project)
This is part of my en.json file as an example:
{
"menu": {
"items": {
"dashboard": "Dashboard",
"posts": "Posts",
"media": "Media"
},
"sub": {
"items": {
"all-posts": "All Posts",
"add-new-post": "Add New",
"categories": "Categories"
}
}
}
}
This is part of my fa.json file as an example:
{
"menu": {
"items": {
"dashboard": "پیشخوان",
"posts": "نوشته ها",
"media": "رسانه"
},
"sub": {
"items": {
"all-posts": "نوشته ها",
"add-new-post": "افزودن نوشته",
"categories": "دسته ها"
}
}
}
}
and you can use them in your html tags like this:
<span class="className" data-t>menu.items.dashboard</span>
Please note that to use translation you should add the data-t attribute to your tags (such as span); the keys between your tags are then replaced by the values saved in the related JSON file. For more information about data-t and its usage, see the plugin's GitHub page mentioned earlier.
Next, add the needed command to the scripts section of your package.json to run node-static-i18n, which translates your HTML template files and saves them in an i18n folder in the root of your project, as below:
"scripts": {
"i18n": "static-i18n -l en -i fa -i en src --localesPath src/locales/",
}
In the above command:
-l: the default locale.
-i: the list of locales to be generated.
--localesPath: the directory of the translations, where each file should be named LOCALE_NAME.json.
Now if you run npm run i18n, this command should create a folder in the root of your project called i18n, containing HTML files in (in this case) two languages. It will look like the picture below:
(Screenshot: the i18n folder and the translated HTML files in it)
Next, configure HtmlWebpackPlugin in your Webpack config file to serve these pages in your browser, like this:
plugins: [
.
.
.
new HtmlWebpackPlugin({
//inject: false,
chunks: ['main'],
template: 'i18n/index.html',
filename: 'index.html'
}),
new HtmlWebpackPlugin({
//inject: false,
chunks: ['main-rtl'],
template: 'i18n/fa/index.html',
filename: 'fa/index.html'
})
]
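If the list of locales grows, the repeated HtmlWebpackPlugin entries above can be generated from an array. A sketch under the same assumptions as the config above (i18n/ layout from static-i18n, chunk names from the example):

```javascript
// Sketch: derive the per-locale HtmlWebpackPlugin options from a locale list
// instead of writing each instance by hand. Paths assume the i18n/ layout
// produced by static-i18n above; chunk names follow the example config.
const locales = ['fa']; // non-default locales; English is served from i18n/index.html
const htmlPluginOptions = [
  { chunks: ['main'], template: 'i18n/index.html', filename: 'index.html' },
  ...locales.map((l) => ({
    chunks: ['main-rtl'],
    template: `i18n/${l}/index.html`,
    filename: `${l}/index.html`,
  })),
];
// In the webpack config: plugins: htmlPluginOptions.map((o) => new HtmlWebpackPlugin(o))
console.log(htmlPluginOptions.map((o) => o.filename)); // -> [ 'index.html', 'fa/index.html' ]
```

Adding a locale then only means adding one string to the array.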
Because you need to see changes in your browser automatically, you need another package called npm-watch, installed with this command:
npm install -D npm-watch
Then, you should change the scripts section of your package.json like this:
"scripts": {
"i18n-watch": "watch 'npm run i18n' src",
"i18n": "static-i18n -l en -i fa -i en src --localesPath src/locales/",
}
By using the command npm run i18n-watch, whenever you make changes to your locale files or your original HTML template in the src folder, it re-translates your HTML files based on the new information; if your webpack dev server is running, you can see the result right after you save.
After that, to run the i18n-watch command and your Webpack dev server at the same time, it is worth installing another npm package called npm-run-all, using the command below:
npm i -D npm-run-all
Finally, change the scripts section of your package.json like this to run i18n-watch and your Webpack dev server at the same time; after that, any change you make shows up in the browser right after you save.
"scripts": {
"i18n-watch": "watch 'npm run i18n' src",
"i18n": "static-i18n -l en -i fa -i en src --localesPath src/locales/",
"webpack-dev": "webpack-dev-server --open --config=config/webpack.dev.js",
"start": "npm-run-all --parallel webpack-dev i18n-watch"
}
Now, if you use npm start in your terminal you will see your Webpack dev server and i18n-watch are running at the same time watching for any changes.
Hopefully this makes sense.

How to add a folder as entry in npm package?

I am trying to publish an npm module which has the following folder structure.
In my package.json I have "main": "./dist/". I understand this resolves to index.js. But in the dist folder I have individual files named string.js, class.js, dom.js, which I am planning to import as
import { isValidZipCode } from '@scope/utils/string'; but right now I have to import them as import { isValidZipCode } from '@scope/utils/dist/string';
Is there a way I can resolve a folder when I import a module from node_modules?
EDIT: The main idea is to import the files as import { isValidZipCode } from '@scope/utils/string' while keeping individual files for individual exports.
The other answers are correct for the most part, but I think there's one thing missing (either from your original post or from their answers):
1. Your folder structure is definitely not standard, which likely led to your current problems as well as unhelpful results when you searched Google for an answer.
2. You didn't show your package.json or webpack.config.js contents, which are the key to answering your question even with such an unusual file structure.
Some suggestions:
Change your folder structure to be something along the lines of
/
|--src
|--utils
|--string.js
|--[... other js files]
|--index.js
|--dist (will be generated automatically)
|--[config files, like package.json, webpack.config.js, etc]
Make your webpack.config.js have something along the lines of:
output: {
path: path.resolve(__dirname, 'dist'),
//...
}
plugins: [
new CopyWebpackPlugin({
patterns: [
'ReadMe.md', // optional
'package.json',
'LICENSE.md' // optional
]
})
],
In order to fix/normalize the output (e.g. output would be /dist/utils/[string.js, ...], /dist/package.json).
Then, make your package.json main something like
"main": "utils/string.js"
After doing that, your output should look something like
/
|--src
|--utils
|--string.js
|--[... other js files]
|--index.js
|--dist
|--utils
|--string.js
|--[... other js files]
|--index.js // optional: only if you want to support stuff like
// `import { isValidZip } from '#scope/utils';`
|--package.json
|--[config files, like package.json, webpack.config.js, etc]
Finally, you need to cd dist and run npm publish from inside there. (That's why you need the package.json inside that directory.)
I can't really go into details about the @scope portion since I haven't done that myself, but I did the above for one of my own projects and it worked as expected.
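As a side note: on newer Node/npm versions, the exports field in package.json can map subpaths directly, which avoids the per-file import problem without restructuring. A sketch (package and file names assumed from the question):

```json
{
  "name": "@scope/utils",
  "main": "./dist/index.js",
  "exports": {
    ".": "./dist/index.js",
    "./string": "./dist/string.js",
    "./class": "./dist/class.js",
    "./dom": "./dist/dom.js"
  }
}
```

With this, import { isValidZipCode } from '@scope/utils/string' resolves to dist/string.js for consumers on a Node version that understands exports.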
All you need to do is make an index file in the root folder and export all files from it.
In your dist/string, export each method/function; then, in the index, do as follows:
export * from "./dist";
This helps maintain the code and looks cleaner to the eye.
Regards :)
Create an index file in the root folder, then export all files like this:
export { default as Str } from "./dist/string";
export { default as Cls } from "./dist/class";
export { default as Dom } from "./dist/dom";
and also update package.json: change main from ./dist/ to ./
Hope this will help you. Happy coding.

How can I concatenate the eslint airbnb style guide into one file to copy into package.json?

If you look at these directories and quoted file contents you can see the structure of the style guide .js files and how they all load into eslint:
https://github.com/airbnb/javascript/tree/master/packages/eslint-config-airbnb-base
https://github.com/airbnb/javascript/tree/master/packages/eslint-config-airbnb-base/rules
//index.js
module.exports = {
extends: [
'./rules/best-practices',
'./rules/errors',
'./rules/node',
'./rules/style',
'./rules/variables',
'./rules/es6',
'./rules/imports', // (my note) not needed as uses extra plugin
].map(require.resolve),
parserOptions: {
ecmaVersion: 2018,
sourceType: 'module',
},
rules: {
strict: 'error',
},
};
// .eslintrc
{
"extends": "./index.js",
"rules": {
// disable requiring trailing commas because it might be nice to revert to
// being JSON at some point, and I don't want to make big changes now.
"comma-dangle": 0,
// we support node 4
"prefer-destructuring": 0,
},
}
I would like to concatenate all the files together so I can paste the result into my package.json. How can I do this? I don't know Node, and I don't need all the other stuff in the npm download; I would just like a permanent copy of the current style guide in one file in one place. Cheers!
If you follow the instructions on installing eslint-config-airbnb-base: https://github.com/airbnb/javascript/tree/master/packages/eslint-config-airbnb-base#eslint-config-airbnb-base-1, step 2 asks you to create a file .eslintrc in your project's root directory.
// .eslintrc
{
"extends": "airbnb-base"
}
You don't need to concatenate any files to use eslint-config-airbnb-base. Install it and create .eslintrc and you should be good to go.
I know this is old and I have not tried this, but it should work.
Just copy the rule files from the Airbnb repo into your repo, then create a copy of their index.js called something else, like my-eslint-rules.js, in your repo. The last step is to have an .eslintrc that refers to this new file with extends.
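If you really do want a single merged rules object, the shape of the merge is simple. A sketch with inline stand-ins (the two rule objects here are hypothetical; in practice you would require() each file from the Airbnb rules/ directory instead):

```javascript
// Each Airbnb rules file exports an object with a `rules` key;
// merging them into one object is just an Object.assign over those keys.
const ruleFiles = [
  { rules: { 'no-eval': 'error' } },                           // stand-in for rules/best-practices
  { rules: { 'comma-dangle': ['error', 'always-multiline'] } } // stand-in for rules/style
];
const merged = Object.assign({}, ...ruleFiles.map((f) => f.rules));
console.log(JSON.stringify({ rules: merged }, null, 2));
```

The printed object is what you would paste under "eslintConfig" in package.json. Note the merge is last-wins, matching how ESLint's extends chain resolves conflicting rules.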

How to automatically zip files with Node.js and npm

Is there a way to automatically zip certain files at the build time with Node.js and npm?
For example, I have a project, that file structure looks like this:
Project/
--lib/
--node_modules/
--test/
--index.js
--package.json
I want to be able to zip the lib folder, certain modules from node_modules, and index.js into a zip archive, to upload to AWS Lambda, for example. I do not need the test folder or the test-only Node.js modules (mocha and chai) to be zipped. I have even created a bash script for generating the zip file, but is there a way to automatically execute this script when 'npm install' is called?
This should be a standard problem with a standard solution, but I was unable to discover one.
UPDATE
Thanks to michael, I decided to use gulp. This is my script, in case someone else needs it for AWS Lambda:
var gulp = require('gulp');
var clean = require('gulp-clean');
var zip = require('gulp-zip');
var merge = require('merge-stream');
gulp.task('clean', function () {
var build = gulp.src('build', {read: false})
.pipe(clean());
var dist = gulp.src('dist', {read: false})
.pipe(clean());
return merge(build, dist);
});
gulp.task('build', function() {
var index = gulp.src('index.js')
.pipe(gulp.dest('build'));
var lib = gulp.src('lib/**')
.pipe(gulp.dest('build/lib'));
var async = gulp.src('node_modules/async/**')
.pipe(gulp.dest('build/node_modules/async'));
var collections = gulp.src('node_modules/collections/**')
.pipe(gulp.dest('build/node_modules/collections'));
var underscore = gulp.src('node_modules/underscore/**')
.pipe(gulp.dest('build/node_modules/underscore'));
var util = gulp.src('node_modules/util/**')
.pipe(gulp.dest('build/node_modules/util'));
var xml2js = gulp.src('node_modules/xml2js/**')
.pipe(gulp.dest('build/node_modules/xml2js'));
return merge(index, lib, async, collections, underscore, util, xml2js);
});
gulp.task('zip', ['build'], function() {
return gulp.src('build/*')
.pipe(zip('archive.zip'))
.pipe(gulp.dest('dist'));
});
gulp.task('default', ['zip']);
I realize this answer comes years too late for the original poster. But I had virtually the same question about packaging up a Lambda function, so for posterity, here's a solution that doesn't require any additional devDependencies (like gulp or grunt) and just uses npm pack along with the following package.json (but does assume you have sed and zip available to you):
{
"name": "my-lambda",
"version": "1.0.0",
"scripts": {
"postpack": "tarball=$(npm list --depth 0 | sed 's/#/-/g; s/ .*/.tgz/g; 1q;'); tar -tf $tarball | sed 's/^package\\///' | zip -#r package; rm $tarball"
},
"files": [
"/index.js",
"/lib"
],
"dependencies": {
"async": "*",
"collections": "*",
"underscore": "*",
"util": "*",
"xml2js": "*"
},
"bundledDependencies": [
"async",
"collections",
"underscore",
"util",
"xml2js"
],
"devDependencies": {
"chai": "*",
"mocha": "*"
}
}
Given the above package.json, calling npm pack will produce a package.zip file that contains:
index.js
lib/
node_modules/
├── async/
├── collections/
├── underscore/
├── util/
└── xml2js/
The files array is a whitelist of what to include. Here, it's just index.js and the lib directory.
However, npm will also automatically include package.json, README (and variants like README.md), CHANGELOG (and its variants), and LICENSE (and the alternative spelling LICENCE) unless you explicitly exclude them (e.g. with .npmignore).
The bundledDependencies array specifies what packages to bundle. In this case, it's all the dependencies but none of the devDependencies.
Finally, the postpack script is run after npm pack because npm pack generates a tarball, but we need to generate a zip for AWS Lambda.
A more detailed explanation of what the postpack script is doing is available at https://hackernoon.com/package-lambda-functions-the-easy-way-with-npm-e38fc14613ba (and is also the source of the general approach).
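Broken down, the postpack one-liner does roughly the following (a sketch; assumes sed and zip are available, and the filename is an example):

```shell
# 1. npm pack has just produced a tarball named e.g. my-lambda-1.0.0.tgz;
#    the sed over `npm list` only reconstructs that name from "my-lambda@1.0.0".
# 2. `tar -tf` lists the tarball contents; every entry is prefixed with
#    "package/", which must be stripped so the zip has index.js at its root:
echo "package/index.js" | sed 's/^package\///'
# 3. The stripped list is piped to `zip -@r package`, which reads file names
#    from stdin (-@) and writes package.zip; finally the tarball is removed.
```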
If you're on a UNIX-based system, you could also just use the zip command in one of your scripts:
"scripts": {
"zip": "zip -r build.zip build/"
"build": "build",
"build-n-zip": "build && zip
}
The above creates a build.zip at the root, which is a zipped up version of the /build folder.
If you wanted to zip multiple folders/files, just add them to the end:
"scripts": {
"zip": "zip -r build.zip build/ some-file.js some-other-folder/"
}
Note
If a build.zip already exists in the folder, the default behaviour is for zip to add files to that existing archive. So many people who are continuously building will probably want to delete the build.zip first:
"scripts": {
"zip": "rm -f build.zip && zip -r build.zip build",
"build": "build",
"build-n-zip": "yarn build && yarn zip"
}
I would go with gulp, using gulp-sftp, gulp-tar and gulp-gzip, plus an alias as a command. Create a file called .bash_aliases in your user's home folder containing:
alias installAndUpload='npm install && gulp runUploader'
After a reboot you can call both actions at once with this alias.
A gulp file could look something like this
var gulp = require('gulp');
var watch = require('gulp-watch');
var sftp = require('gulp-sftp');
var gzip = require('gulp-gzip');
var tar = require('gulp-tar'); // needed for the tar() call below
gulp.task('runUploader', function () {
return gulp.src('./path/to/folder/to/compress/**')
.pipe(tar('archive.tar'))
.pipe(gzip())
.pipe(gulp.dest('path/to/folder/to/store')) // if you want a local copy
.pipe(sftp({
host: 'website.com',
user: 'johndoe',
pass: '1234'
}))
});
Of course, you can also add gulp-watch to automatically create the tar/zip and upload it whenever there is a change in the directory.
You should take a look at npm scripts.
You'll still need a bash script in your repository, but it will be triggered automatically by certain npm tasks when they are executed.
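For example, an npm lifecycle hook can call the existing script; postinstall runs automatically after npm install (zip-lambda.sh is a hypothetical name for the bash script you already have):

```json
{
  "scripts": {
    "postinstall": "./zip-lambda.sh"
  }
}
```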
npm-pack-zip worked for me.
npm install --save-dev npm-pack-zip
To publish the whole lambda to AWS I used this npm script in package.json (note the Windows-style %npm_package_name% variables):
"publish": "npm-pack-zip && aws lambda update-function-code --function-name %npm_package_name% --zip-file fileb://%npm_package_name%.zip && rm %npm_package_name%.zip"
You can use zip-build. This little package will use the data in your package.json file and create a compressed file named project-name_version.zip.
Disclaimer: I am a developer of this library.
How to use zip-build
Just install in your project as dev dependency with:
$ npm install --save-dev zip-build
Then modify the build script in your package.json, adding && zip-build at the end, like this:
"scripts": {
"build": your-build-script && zip-build
}
If your build directory is named something other than build, and your desired directory for compressed files something other than dist, you can provide the directory names as arguments to zip-build:
"scripts": {
"build": your-build-script && zip-build build-dirname zip-dirname
}
If you need to automate tasks, take a look at Grunt or Gulp.
In the case of Grunt, the needed plugins are:
https://www.npmjs.com/package/grunt-zip
https://www.npmjs.com/package/grunt-aws-lambda
Check out my gist at https://gist.github.com/ctulek/6f16352ebdfc166ce905
It uses gulp for all the tasks you mentioned except creating the lambda function initially (it only updates the code).
It assumes every lambda function is implemented in its own folder, and you need to define your AWS credentials profile.
