eslint on dynamically changing file list - javascript

I'd like to run eslint on modified files only. I've created a new run target in package.json to run eslint from the command line (git diff --name-only --relative | grep -E '.*\\.(vue|js)$' | xargs eslint --ext .js,.vue). In theory this should work fine, but there is a small transformation step (a string replacement) that happens when the files are bundled with webpack, and it throws eslint off (some non-standard markup is expanded to JS).
What are my options, and how would I go about implementing them? For instance, could I execute a particular webpack rule/loader and pipe the result to eslint? Another option I see is to include eslint in the webpack rule/loader process (instead of executing it from the command line), but how would I then filter on the files that are currently modified (could this be handled by a temporary file that contains the git diff ... result?)
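For reference, the run target looks roughly like this in package.json (the script name lint:changed is just an illustrative choice; the command is the one quoted above):
{
  "scripts": {
    "lint:changed": "git diff --name-only --relative | grep -E '.*\\.(vue|js)$' | xargs eslint --ext .js,.vue"
  }
}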

I've got a somewhat working approach. I chose to modify webpack.base.conf.js instead of going for the command-line solution so that I could make use of the already existing string-replacement loader.
The files are collected in the WebpackBeforeBuildPlugin callback, and instead of a regex-based test value, a function is used that checks each filename against the previously collected files.
const exec = require('child_process').exec;
const path = require('path');
const WebpackBeforeBuildPlugin = require('before-build-webpack');

// Set of files reported as modified by git, relative to the project root.
var modFilesList = new Set([]);
const srcPath = resolve('.'); // resolve() is the path helper already present in webpack.base.conf.js
// ...
rules: [{
  // Lint a file only if it is in the list of modified files.
  test: function (filename) {
    let relFilename = path.relative(srcPath, filename);
    let lint = modFilesList.has(relFilename);
    return lint;
  },
  loader: 'eslint-loader',
  include: resolve('src'),
  exclude: /node_modules/,
  options: {
    formatter: require('eslint-friendly-formatter'),
    cache: false
  }
}, {
  // ... other string replacement loader ...
}],
plugins: [
  // ...
  new WebpackBeforeBuildPlugin(function (stats, callback) {
    // Collect changed files before building.
    let gitCmd = 'git diff --name-only --relative | grep -E ".*\\.(vue|js)$"';
    exec(gitCmd, (error, stdout, stderr) => {
      if (stdout) {
        let files = stdout.split('\n');
        modFilesList = new Set(files);
      }
      if (error !== null) {
        console.log(`exec error: ${error}`);
      }
      // Signal webpack only after the file list has been refreshed;
      // exec() is asynchronous, so calling callback() outside this
      // handler would let the build start with a stale list.
      callback();
    });
  })
]
The only problem at the moment is that git-level file changes don't trigger re-linting (e.g. a new file is modified, or changes to a file that was already modified before starting webpack-dev-server are discarded). I checked everything I could: the change is registered and stored in modFilesList, and the test function is executed and returns true (for a new change in a previously unchanged file) or false (in case the change was discarded). I also played with the cache option to no avail. It seems that at initial load, eslint-loader caches the files it will lint in the future (I don't know whether that's a result of using a test function instead of a regex, or whether it also happens with a regex). Has anyone seen this before (eslint-loader not updating its file list) or have an idea what's going on?
Update
This seems to be a problem with webpack (or one of the other loaders), as eslint-loader isn't even executed when the file changes. The test function, however, is executed, which is a bit odd. I don't fully understand how loaders work or how they interact, so there might be some other loader causing this...

Related

Using RequireJS with node to optimize creating single output file does not include all the required files

I use FayeJS, and the latest version has been modified to use RequireJS, so there is no longer a single file to link into the browser. Instead, the structure is as follows:
/adapters
/engines
/mixins
/protocol
/transport
/util
faye_browser.js
I am using the following Node.js build script to try to end up with all of the above minified into a single file:
var fs = require('fs-extra'),
    requirejs = require('requirejs');

var config = {
  baseUrl: 'htdocs/js/dev/faye/',
  name: 'faye_browser',
  out: 'htdocs/js/dev/faye/dist/faye.min.js',
  paths: {
    dist: "empty:"
  },
  findNestedDependencies: true
};

requirejs.optimize(config, function (buildResponse) {
  // buildResponse is just a text output of the modules included.
  // Load the built file for the contents.
  // Use config.out to get the optimized file contents.
  var contents = fs.readFileSync(config.out, 'utf8');
}, function (err) {
  // optimization error callback
  console.log(err);
});
The content of faye_browser.js is:
'use strict';

var constants = require('./util/constants'),
    Logging = require('./mixins/logging');

var Faye = {
  VERSION: constants.VERSION,
  Client: require('./protocol/client'),
  Scheduler: require('./protocol/scheduler')
};

Logging.wrapper = Faye;
module.exports = Faye;
As I understand it, the optimizer should pull in the required files, and if those files require further files it should pull those in too, etc., and output a single minified faye.min.js that contains the whole lot, refactored so that no additional server-side calls are necessary.
What happens is that faye.min.js gets created, but it only contains the content of faye_browser.js; none of the other required files are included.
I have searched all over the web and looked at a heap of different examples, and none of them work for me.
What am I doing wrong here?
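One possible explanation, offered here as an assumption rather than a verified fix: faye_browser.js uses CommonJS-style require() calls with no define() wrapper, and the r.js optimizer only traces dependencies through such modules if it is told to translate CommonJS, e.g. via the cjsTranslate build option:
var config = {
  baseUrl: 'htdocs/js/dev/faye/',
  name: 'faye_browser',
  out: 'htdocs/js/dev/faye/dist/faye.min.js',
  paths: { dist: 'empty:' },
  findNestedDependencies: true,
  // Wrap CommonJS-style modules (plain require/module.exports) in define()
  // so their dependencies can be traced and bundled.
  cjsTranslate: true
};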
For anyone else trying to do this, I missed that on the download page it says:
The Node.js version is available through npm. This package contains a
copy of the browser client, which is served up by the Faye server when
running.
So to get it you have to pull down the code via npm, then go into the npm install directory, and it is in the "client" dir...
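For example, a minimal sketch of grabbing the prebuilt client after npm install faye (the exact file name under the client directory is an assumption here; check node_modules/faye/client/ for your version):
var fs = require('fs-extra');

// Copy Faye's prebuilt browser client out of the npm package into the
// location the build previously targeted.
fs.copySync(
  'node_modules/faye/client/faye-browser-min.js', // assumed file name
  'htdocs/js/dev/faye/dist/faye.min.js'
);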

How to mutate node-config in memory

I'm trying to mutate the value of my config in memory for testing. I've tried adding process.env.ALLOW_CONFIG_MUTATIONS=true in several spots in the application, as well as through the command line and my .env file.
The config.util.getEnv('ALLOW_CONFIG_MUTATION') method always returns undefined.
I've also tried using importFresh and MockRequest as per examples I've seen online, neither of which allows me to mutate the config in memory and then reset the value later.
Does anyone have any idea about this?
Update: here's an example of what I'm trying to accomplish:
let config = require('config');
const app = new App(config);

it(`does a thing with base config`, () => { /* ... */ });

it('does a thing with modified config', () => {
  // Here I would need to modify my config value and have it change the
  // original config that's currently in application memory.
  config = newConfig;
  expect(config.get('newValues')).to.equal(true);
});
Thanks!
If it is the same config module that I have used (I think it is), then add a custom-environment-variables.js or a test.js with your test config.
test.js needs NODE_ENV=test to be picked up, and the custom-environment-variables approach needs something like (for Macs and npm) npm run funcTest -> yarn serverRunning && NODE_ENV=test wdio wdio.conf.js.
The JSON will look something like:
{
  "test": "Value"
}
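Regarding the in-memory mutation the question describes: node-config makes the configuration immutable on first use unless ALLOW_CONFIG_MUTATIONS is set before the config module is loaded. A minimal sketch (assuming Mocha/Chai-style tests and that someFlag already exists in your config files, since config.get() throws for undefined keys):
// The flag must be set before the very first require('config') in the
// process; once the object has been made immutable it stays that way.
process.env.ALLOW_CONFIG_MUTATIONS = 'true';
const config = require('config');

it('does a thing with modified config', () => {
  config.someFlag = true; // mutate in memory for this test only
  expect(config.get('someFlag')).to.equal(true);
});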

Proper way of using grunt-bump and grunt-ng-constant together

While seemingly the tasks execute in the proper order (bump first, and then ngconstant creating a config file based on package.json's version property), I think they actually execute in parallel, and ngconstant reads package.json before bump has written it.
Running "bump" task
md
>> Version bumped to 2.0.6 (in package.json)
Running "ngconstant:production" (ngconstant) task
Creating module config at app/scripts/config.js...OK
The resulting package.json has 2.0.6 as its version, while config.js has 2.0.5.
My ngconstant config simply uses
grunt.file.readJSON('package.json')
to read the JSON.
So, basically, the question is: how can I make sure that bump's write is finished before ngconstant reads the JSON, and what actually causes the above?
EDIT: the original Gruntfile: https://github.com/dekztah/sc2/blob/18acaff22ab027000026311ac8215a51846786b8/Gruntfile.js
EDIT: the updated Gruntfile that solves the problem: https://github.com/dekztah/sc2/blob/e7985db6b95846c025ba0b615bf239c4f9c11e8f/Gruntfile.js
Probably your package.json file is read into memory and is not updated before the next task runs.
A workaround would be to create a script in your package.json such as:
"scripts": {
  "bump-and-ngconstant": "grunt bump && grunt build"
}
As per the grunt-ng-constant documentation:
Or if you want to calculate the constants value at runtime you can create a lazy evaluated method which should be used if you generate your json file during the build process.
grunt.initConfig({
  ngconstant: {
    options: {
      dest: 'dist/module.js',
      name: 'someModule'
    },
    dist: {
      constants: function () {
        return {
          lazyConfig: grunt.file.readJSON('build/lazy-config.json')
        };
      }
    }
  }
});
This forces the JSON to be read when the task actually runs, instead of when grunt initializes the ngconstant task configuration.
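For completeness, a sketch of how the two tasks might be chained in the Gruntfile (the alias task name 'release' is illustrative; it is the lazy constants function above that actually defers the package.json read until after bump has written it):
// Run bump first, then ngconstant, as one composite task.
grunt.registerTask('release', ['bump', 'ngconstant:production']);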

Webpack Hashing *after* uglification

We're using the style-loader in webpack which, when compiled, seems to place information about the current directory in the source code that injects <style> tags when modules are loaded/unloaded. It looks roughly like this:
if(false) {
// When the styles change, update the <style> tags
if(!content.locals) {
module.hot.accept("!!./../../../../../node_modules/css-loader/index.js!./../../../../../node_modules/sass-loader/index.js?includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/node-bourbon/node_modules/bourbon/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/bourbon-neat/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/infl-fonts!./campaigns.scss", function() {
var newContent = require("!!./../../../../../node_modules/css-loader/index.js!./../../../../../node_modules/sass-loader/index.js?includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/node-bourbon/node_modules/bourbon/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/bourbon-neat/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/infl-fonts!./campaigns.scss");
if(typeof newContent === 'string') newContent = [[module.id, newContent, '']];
update(newContent);
});
}
// When the module is disposed, remove the <style> tags
module.hot.dispose(function() { update(); });
}
The thing to note is the directory listed in the accept string.
Now, this code ultimately gets removed once uglify is run (because of the if (false)), which is great. Where my problem lies, however, is that this compilation happens on two machines, and the chunk hashing appears to happen before uglification, because the hash generated on two different machines (or even on the same machine but in a different folder) is different. This obviously won't work if I'm deploying to, say, a production machine and need both of these machines to serve up an asset with the same digest.
Just to clarify: when not minified, the code is different and thus generates a different hash, but minification does in fact make the files identical; the chunk hashing, however, appears to have happened before minification.
Does anyone know how I can get the chunkhash to be generated after the uglify plugin is run? I'm using a config like so:
...
output: {
filename: '[name]-[chunkhash].js'
...
with the command:
webpack -p
edit
So after looking this over, I'm seeing now that this has to do with us adding includePaths to our style loader; it looks like this:
var includePaths = require('node-neat').includePaths;

module: {
  loaders: [
    { test: /\.scss$/, loader: "style!css!sass?includePaths[]=" + includePaths }
  ]
}
So I think we know why we're getting these absolute URLs, but I think the original question still stands: IMO webpack should be hashing chunks after minification, not before.
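Since the absolute paths come from includePaths, one possible workaround (a sketch, not a fix for the hash-before-uglify ordering itself) is to make those paths relative to the project root before handing them to sass-loader, so the generated loader request strings no longer embed machine-specific directories. This assumes sass-loader/node-sass resolve relative include paths against the process working directory:
var path = require('path');

// Make node-neat's absolute include paths relative to the project root so
// the require strings baked into the source (and therefore the chunk hash)
// are identical across machines and deploy directories.
var includePaths = require('node-neat').includePaths.map(function (p) {
  return path.relative(process.cwd(), p);
});

module: {
  loaders: [
    {
      test: /\.scss$/,
      loader: "style!css!sass?includePaths[]=" + includePaths.join('&includePaths[]=')
    }
  ]
}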

RequireJS optimizer prepend

I am looking for a way to prepend some information to the minimized file.
I've found the option here, but it isn't useful for me because the uglifier runs after the wrapper code has been added.
You could assign a function to the out key to post-process the file. By setting this option, the result will not automatically be written to a file, so you have to do that yourself. For example:
({
  // Let's optimize mainApp.js
  name: "mainApp",
  optimize: "uglify",
  out: function (text) {
    // Transform the compiled result.
    text = '// Stuff to prepend \n' + text;
    var filename = 'outputfile.js';
    // By default, the name is resolved to the current working directory.
    // Let's resolve it to the directory that contains this .build.js:
    filename = path.resolve(this.buildFile, '..', filename);
    // Finally, write the transformed result to the file.
    file.saveUtf8File(filename, text);
  }
})
Note: in the previous snippet, file.saveUtf8File is an internal RequireJS API, and path is the path module from the Node.js standard library (available only if you run r.js with Node.js, not e.g. with Rhino or in the browser).
If you save the previous snippet as test.build.js, create an empty file called mainApp.js and run r.js -o test.build.js, then a file called outputfile.js will be created with the following content:
// Stuff to prepend
define("mainApp",function(){});
