This is a fairly common scenario: a project requires two flavors of third-party JavaScript dependencies. For development the non-minified versions of the JavaScript files are used, while the deploy scenario typically only includes the minified versions (*.min.js).
Let's assume both (minified and non-minified) versions of all the dependencies are in the 'repo' folder. Further, there are two versions of the 'main' file: 'main.min.js' uses the minified deps, while 'main.js' uses the non-minified ones. Let's assume both 'main' files can be generated by some means from 'deps.json', where all the dependencies are declared. The file structure is as follows:
public/
    lib/
repo/
    angular/
        ...
    angular-resource/
        ...
    angular-route/
        ...
build.gradle
deps.json
main.js
main.min.js
The public folder is where all the output files should appear, so I wrote the corresponding build.gradle file:
task createMain {
    inputs.file 'deps.json'
    // TODO: read deps.json and create main.min.js and main.js
    outputs.file 'main.min.js'
    outputs.file 'main.js'
}
task copyMain(type: Copy, dependsOn: createMain) {
    from('.') {
        include 'main.js'
    }
    into('public')
}
task copyMainForDeploy(type: Copy, dependsOn: createMain) {
    from('.') {
        include 'main.min.js'
    }
    rename('main.min.js', 'main.js')
    into('public')
}
task installJSDeps(type: Copy, dependsOn: copyMain) {
    from('repo')
    into('public/lib')
    outputs.dir 'public/lib'
    inputs.file 'deps.json'
}
task installJSDepsForDeploy(type: Copy, dependsOn: copyMainForDeploy) {
    from('repo') {
        include '**/*.min.js'
    }
    into('public/lib')
    outputs.dir 'public/lib'
    inputs.file 'deps.json'
    doFirst {
        // clean up any existing files before copying new ones
        FileTree tree = fileTree(dir: "public/lib")
        delete(tree)
    }
}
What I was hoping to achieve is: if I call installJSDepsForDeploy, only the minified files appear in public/lib; if I call installJSDeps, all the files appear in public/lib (in addition to the main file being copied/renamed).
What happens is the following:
$ gradle installJSDepsForDeploy
:createMain UP-TO-DATE
:copyMainForDeploy
:installJSDepsForDeploy
BUILD SUCCESSFUL
Total time: 3.698 secs
$ gradle installJSDeps
:createMain UP-TO-DATE
:copyMain
:installJSDeps
BUILD SUCCESSFUL
Total time: 2.484 secs
$ gradle installJSDepsForDeploy
:createMain UP-TO-DATE
:copyMainForDeploy
:installJSDepsForDeploy UP-TO-DATE
BUILD SUCCESSFUL
Total time: 2.41 secs
The second time, :installJSDepsForDeploy is reported UP-TO-DATE, which is not desired and, as far as I understand, incorrect.
Am I missing something?
Thanks in advance,
Sash
EDIT:
Taking into account the comments below, a more correct build.gradle file would be:
task createMain {
    inputs.file 'deps.json'
    // ASSUME: read deps.json and create main.min.js and main.js
    outputs.file 'main.min.js'
    outputs.file 'main.js'
}
task copyMain(type: Copy, dependsOn: createMain) {
    from('.') {
        include 'main.js'
    }
    into('public')
}
task copyMainForDeploy(type: Copy, dependsOn: createMain) {
    from('.') {
        include 'main.min.js'
    }
    rename('main.min.js', 'main.js')
    into('public')
}
task fetchJSDeps {
    // ASSUME: reads `deps.json` and fetches
    // all the deps into `repo` folder
    outputs.dir 'repo'
    inputs.file 'deps.json'
}
task installJSDeps(type: Copy, dependsOn: ['copyMain', 'fetchJSDeps']) {
    from('repo')
    into('public/lib')
    outputs.dir 'public/lib'
    inputs.file 'deps.json'
}
task installJSDepsForDeploy(type: Copy, dependsOn: ['copyMainForDeploy', 'fetchJSDeps']) {
    from('repo') {
        include '**/*.min.js'
    }
    into('public/lib')
    outputs.dir 'public/lib'
    inputs.file 'deps.json'
    doFirst {
        // clean up any existing files before copying new ones
        FileTree tree = fileTree(dir: "public/lib")
        delete(tree)
    }
}
The corresponding output is still:
$ gradle installJSDepsForDeploy
:createMain UP-TO-DATE
:copyMainForDeploy
:fetchJSDeps UP-TO-DATE
:installJSDepsForDeploy
BUILD SUCCESSFUL
Total time: 2.769 secs
$ gradle installJSDeps
:createMain UP-TO-DATE
:copyMain
:fetchJSDeps UP-TO-DATE
:installJSDeps
BUILD SUCCESSFUL
Total time: 2.519 secs
$ gradle installJSDepsForDeploy
:createMain UP-TO-DATE
:copyMainForDeploy
:fetchJSDeps UP-TO-DATE
:installJSDepsForDeploy UP-TO-DATE
BUILD SUCCESSFUL
Total time: 2.376 secs
UPDATE:
I've tried a lot of different set-ups, and it seems to me that Gradle does not support this kind of task. Essentially, I want taskA to copy from:
/repo/**/*.js
/repo/**/*min.js
/repo/**/*min.js.map
/repo/**/*.css
/repo/**/*.md
to:
/public/lib/**/*min.js
/public/lib/**/*min.js.map
And I want taskB to copy from:
/repo/**/*.js
/repo/**/*min.js
/repo/**/*min.js.map
/repo/**/*.css
/repo/**/*.md
to:
/public/lib/**/*.js
/public/lib/**/*.css
/public/lib/**/*.md
The tasks DO NOT depend on each other; that is in fact the main point here. Further, I want taskA NOT to copy the files that taskB is copying. Each task needs to make sure the files that the other task copies over ARE NOT present after it completes.
Basically, after running gradle taskA the public/lib should include ONLY:
/public/lib/**/*min.js
/public/lib/**/*min.js.map
and NOT:
/public/lib/**/*.js
/public/lib/**/*.css
/public/lib/**/*.md
After running gradle taskB the result should be the opposite.
Is it possible to do that in Gradle?
Thanks in advance,
Sash
Looking at the code, it's not clear to me what you are trying to do here. A minimal example would help.
Some general hints (I couldn't make this a comment because it's too long):
createMain does nothing, which means that the copyMain tasks won't have anything to copy (unless you manually put main.js and main.min.js files in place for now).
copyMain and copyMainForDeploy create the same file in the same place, which means that they'll keep overwriting each other's outputs, making the other task out-of-date.
The task dependencies don't make sense because (say) installJSDeps doesn't consume anything that copyMain produces (i.e. there is no semantic dependency between the two tasks).
I don't understand why deps.json would be an input to the Copy tasks. Also, task types such as Copy automatically declare their inputs and outputs based on how they are configured (e.g. from and into), which means that they don't have to be declared in the build script.
Instead of from('.') { include 'foo' }, from 'foo' should be used.
Instead of deleting the target dir's contents before copying, a Sync task should be used (which is configured in exactly the same way as Copy); see the sketch below.
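For example, the two install tasks could look roughly like this (an untested sketch reusing the task names above, with the task dependencies left out per the earlier point; the include/exclude patterns follow the lists in the UPDATE):

task installJSDepsForDeploy(type: Sync) {
    from('repo') {
        // deploy flavor: only the minified files and their source maps
        include '**/*.min.js'
        include '**/*.min.js.map'
    }
    into 'public/lib'
}

task installJSDeps(type: Sync) {
    from('repo') {
        // development flavor: everything except the minified variants
        exclude '**/*.min.js'
        exclude '**/*.min.js.map'
    }
    into 'public/lib'
}

Because Sync makes the destination an exact copy of what it copies, each task removes whatever the other one left behind in public/lib, so the manual doFirst cleanup and the explicit inputs/outputs declarations become unnecessary.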
Related
While the tasks seemingly execute in the proper order (bump first and then ngconstant creating a config file based on package.json's version property), I think they actually execute in parallel, and ngconstant reads package.json before bump has written it.
Running "bump" task
>> Version bumped to 2.0.6 (in package.json)
Running "ngconstant:production" (ngconstant) task
Creating module config at app/scripts/config.js...OK
The resulting package.json has 2.0.6 as version while config.js has 2.0.5.
My ngconstant config simply uses
grunt.file.readJSON('package.json')
to read the JSON.
So basically the question is: how can I make sure that bump's write is finished before ngconstant reads the JSON, and what actually causes the above?
EDIT: the original Gruntfile: https://github.com/dekztah/sc2/blob/18acaff22ab027000026311ac8215a51846786b8/Gruntfile.js
EDIT: the updated Gruntfile that solves the problem: https://github.com/dekztah/sc2/blob/e7985db6b95846c025ba0b615bf239c4f9c11e8f/Gruntfile.js
Probably your package.json file is stored in memory and is not updated before you run the next task.
A workaround would be to create a script in your package.json file like:
"scripts": {
"bumb-and-ngconstant": "grunt:bump && grunt:build"
}
As per grunt-ng-constant documentation:
Or if you want to calculate the constants value at runtime you can create a lazy evaluated method which should be used if you generate your json file during the build process.
grunt.initConfig({
    ngconstant: {
        options: {
            dest: 'dist/module.js',
            name: 'someModule'
        },
        dist: {
            constants: function () {
                return {
                    lazyConfig: grunt.file.readJSON('build/lazy-config.json')
                };
            }
        }
    },
})
This forces the json to be read while the task runs, instead of when grunt inits the ngconstant task.
I am trying to write a Gradle task which will minify all my project's JavaScript files. I am using a Gradle plugin: com.eriwen.gradle.js. This plugin contains a task called minifyJs where we define the source file we want to minify and the destination of the minified file:
minifyJs {
    source = file(sourcePathString)
    dest = file(targetPathString)
}
What I want to do is execute this task for EVERY JavaScript file in my project and produce a minified version of it in a new path for EACH file. This would require me to run the minifyJs task multiple times, each time with different source and dest values, but I can't seem to find a solution on how to do this. One person suggested using a loop to create a new task of type minifyJs for each JavaScript file, but this takes a huge amount of time and creates 250+ tasks, i.e. it is not effective at all.
Since calling a task inside another task doesn't work (and using task.execute() is bad practice), I'm essentially looking for a workaround that lets me achieve this:
task customMinify {
    def jsFileTree = fileTree('my/javascript/files')
    jsFileTree.forEach {
        def jsFile = it
        minifyJs {
            source = file(jsFile.getPath())
            dest = file('new/path/to/file.js')
        }
    }
}
which obviously doesn't work since we can't call minifyJs inside another task.
I'm really sorry that this gap has continued to exist in the gradle-js-plugin.
Since generating tasks won't do, I suggest that you write a custom task under buildSrc combining my JsMinifier and the MinifyJsTask.
If you're willing to wait 8 hours or so, I can write an implementation of this later if you like.
EDIT: Here's a gist for a ClosureMinifyTask you can throw in buildSrc/src/main/groovy/com/eriwen/gradle/js/tasks and it'll minify each file individually and produce individual source map files etc.
buildSrc/build.gradle:
repositories {
mavenCentral()
}
dependencies {
compile localGroovy()
compile gradleApi()
compile ('com.google.javascript:closure-compiler:v20151015') {
exclude module: 'junit'
}
}
Sample Usage:
task mini(type: com.foo.bar.ClosureMinifyTask) {
    source = "src/js"
    dest = "${buildDir}/js/minified"
}
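For orientation, a stripped-down task along those lines might look roughly like the following (a sketch only, not the contents of the gist: it skips the source-map output, hard-codes SIMPLE_OPTIMIZATIONS, and borrows the illustrative com.foo.bar package from the sample usage above):

// buildSrc/src/main/groovy/com/foo/bar/ClosureMinifyTask.groovy
package com.foo.bar

import com.google.javascript.jscomp.CompilationLevel
import com.google.javascript.jscomp.Compiler
import com.google.javascript.jscomp.CompilerOptions
import com.google.javascript.jscomp.SourceFile
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

class ClosureMinifyTask extends DefaultTask {
    def source   // directory containing the .js sources
    def dest     // directory the .min.js files are written to

    ClosureMinifyTask() {
        // lazily register inputs/outputs so the task participates in up-to-date checks
        inputs.dir { source }
        outputs.dir { dest }
    }

    @TaskAction
    void minify() {
        def destDir = project.file(dest)
        project.fileTree(dir: source, includes: ['**/*.js']).visit { details ->
            if (details.directory) return
            def options = new CompilerOptions()
            CompilationLevel.SIMPLE_OPTIMIZATIONS.setOptionsForCompilationLevel(options)
            def compiler = new Compiler()
            // compile each file on its own, with no externs
            compiler.compile([] as List<SourceFile>,
                    [SourceFile.fromFile(details.file.absolutePath)], options)
            def outFile = new File(destDir, details.relativePath.pathString.replaceAll(/\.js$/, '.min.js'))
            outFile.parentFile.mkdirs()
            outFile.text = compiler.toSource()
        }
    }
}

The inputs.dir/outputs.dir wiring in the constructor is what lets Gradle skip the task when neither the sources nor the output directory have changed.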
The story:
We have a team of testers working on automating end-to-end tests using protractor for our internal AngularJS application. Here is the task they usually run for "local" testing:
grunt.registerTask('e2e:local', [
    'build:prod',
    'connect:test',
    'protractor:local'
]);
It runs the "build" task, starts a webserver and runs the e2e tests against the local build.
The build:prod task itself is defined as:
grunt.registerTask(
    'build:prod', [
        'clean',
        'copy:all',
        'copy:assets',
        'wiredep',
        'ngtemplates',
        'useminPrepare',
        'concat',
        'ngAnnotate',
        'autoprefixer',
        'uglify',
        'cssmin',
        'copy:cssfix',
        'usemin',
        'copy:html',
        'bowercopy',
        'template:setProdVersion'
    ]
);
Here we have a lot of subtasks (it definitely could be improved, but this is how it looks now).
The problem:
Currently, it takes about 25 seconds for the build to complete. And, every time a person is running end-to-end tests, the build task is executed.
The question:
How can I run the build:prod task only if there are changes in src directory?
Note that the requirement here is to make it transparent for the testers who run the tests. I don't want them to remember when they need to perform a build and when not.
In other words, the process should be automated. The goal is to automatically detect if build is needed or not.
Note that ideally I would like to leave the build task as is, so that if it is invoked directly via grunt build:prod it would rebuild regardless of the datestamp of the previous build.
Thoughts and tries:
there is the closely related grunt-newer package, but since we have a rather complicated build with a clean task at the beginning, I'm not sure how to apply it in my case
what I was also thinking about is, inside the e2e:local task, manually checking the timestamps of the files inside dist and src and, based on that, deciding if build:prod needs to be invoked. I think this is what grunt-newer does internally
we started to use jit-grunt that helped to improve the performance
Here's an idea if you use git:
How about using something like grunt-gitinfo and using the last commit in HEAD as a base?
The idea is:
You create a new grunt task that checks for latest commit hash
You'd save this commit hash in a file that's added to .gitignore (and is NOT in the cleaned folder; it can typically be in the root of the repo)
Before saving to the file, it'd check the value already in it (the standard Node fs module can do the read/write easily)
If the hash doesn't match, run the build:prod task, then save the new commit hash
The testers' build would depend on your new task instead of build:prod directly
Another option (still using git):
You can use something like grunt-githooks and create a git hook that runs after pull and calls grunt build:prod; then you can remove it from the dependencies of the grunt task that testers run.
You might need additional code to check for the githook and install it if required, though; that can be a one-time extra step for testers, or maybe baked into the grunt task they call.
I'm surprised no one has mentioned grunt-contrib-watch yet (it's in the gruntjs.com example file and I thought it was pretty commonly known!). From GitHub: "Run predefined tasks whenever watched file patterns are added, changed or deleted." Here's a sample Gruntfile that would run your tasks any time any .js files are modified in src/ or in test/, or if the Gruntfile itself is modified.
var filesToWatch = ['Gruntfile.js', 'src/**/*.js', 'test/**/*.js'];
grunt.initConfig({
    watch: {
        files: filesToWatch,
        tasks: ['build:prod',
                'connect:test',
                'protractor:local']
    }
});
grunt.loadNpmTasks('grunt-contrib-watch');
You have your developers open a terminal and run grunt watch before they start modifying files, and every time those files are modified the tasks will automatically be run (no more going back to the terminal to run grunt build:prod every time).
It's an excellent package and I suggest you check it out. -- github -- npmjs.org
npm install grunt-contrib-watch --save-dev
Not the answer you are looking for with Grunt, but this will be easy with gulp.
var fs = require('fs');
var gulpif = require('gulp-if');

// rebuild only if the source dir was modified more recently than the build dir
var sourceChanged = fs.statSync('source/directory').mtime > fs.statSync('build/directory').mtime;

gulp.task('build:prod', function() {
    if (!sourceChanged) {
        return false;
    }
    return gulp.src('./src/*.js')
        .pipe(.... build ....)
        .pipe(gulp.dest('./dist/'));
});
Here's how we've done some Git HEAD sha work for our build. We use it to determine which version is currently deployed to our production environment - but I'm quite certain you could rework it to return a boolean and trigger the build if truthy.
Gruntfile.js
function getHeadSha() {
    var curr, match, next = 'HEAD';
    var repoDir = process.env.GIT_REPO_DIR || path.join(__dirname, '..');
    try {
        do {
            curr = grunt.file.read(path.join(repoDir, '.git', next)).trim();
            match = curr.match(/^ref: (.+)$/);
            next = match && match[1];
        } while (next);
    } catch(ex) {
        curr = 'not-found';
    }
    return curr;
}

grunt.initConfig({
    replace: {
        applicationVersion: {
            src: '<%= config.dist %>/index.html',
            overwrite: true,
            replacements: [{
                from: '{{APPLICATION_VERSION}}',
                to: getHeadSha
            }]
        }
    }
});
grunt.registerTask('build', [
    'replace:applicationVersion',
    /** other tasks **/
]);

grunt.registerTask('e2e:local', [
    'check_if_we_should_build',
    /** other tasks **/
]);
index.html
<html data-version="{{APPLICATION_VERSION}}">
<!-- -->
</html>
There's also the git-info package which would simplify this whole process, we're looking at switching over to that ourselves.
Edit: I just noticed #meligy already pointed you in the direction of git-info. Credit where credit is due.
I am not sure if it's helpful or not, but we have done the same thing in our project using the Gulp framework. We have written a watcher in Gulp that continuously checks for source changes and runs a quick function to build the project. It's a Protractor test case.
gulp.task('dome', function () {
    gulp.src(["maintest.js"])
        .pipe(notify("Change Found , Executing Scripts."))
        .pipe(protractor({
            configFile: "conf.js",
            args: ['--baseUrl', 'http://127.0.0.1:8000']
        })).on('error', function (e) {
            throw e
        });
})

gulp.task('default', function () {
    gulp.watch('./webpages/*.js', ['dome']);
    gulp.watch('maintest.js', ['dome']);
    gulp.watch('conf.js', ['dome']);
});
Link to repo.
I don't have experience in protractor, but conceptually I think this could work.
What I could suggest is to set an alias in your ~/.cshrc to run the build commands only if a diff command returns true.
# ~/.cshrc
alias build_on_diff 'diff -r branch_dir_1 branch_dir_2\
    if ( $status == 1 ) then\
        build:prod\
    endif'
Just replace the diff command with whatever git uses, and it should work provided it returns a 1 status for differences detected. We apply a similar method at my workplace to avoid rebuilding files that haven't changed.
Context
I have a few grunt tasks that I've already written, and I'd like to use them with a new project I'm writing in Sails.js.
With Sails.js, you can add additional grunt tasks by adding a JS file to the /tasks/register folder. Before we get to the file I've added, let's talk about the problem.
The Problem
Sails won't lift. Debugger shows:
debug: --------------------------------------------------------
error: ** Grunt :: An error occurred. **
error:
------------------------------------------------------------------------
ERROR
>> Unable to process task.
Warning: Required config property "clean.dev" missing.
The issue in question is obviously with grunt, so then I try grunt build (which automatically runs with sails lift):
Running "clean:dev" (clean) task
Verifying property clean.dev exists in config...ERROR
>> Unable to process task.
Warning: Required config property "clean.dev" missing. Use --force to continue.
From this, I've garnered that this is a path issue. Let's take a look at the file I've added.
/tasks/register/customTask.js
The task here loads load-grunt-config, which is the source of my problems:
module.exports = function(grunt) {
    // measures the time each task takes
    require('time-grunt')(grunt);

    // This require statement below causes my issue
    require('load-grunt-config')(grunt, {
        config: '../../package.json',
        scope: 'devDependencies',
        overridePath: require('path').join(process.cwd(), '/asset-library/grunt')
    });

    grunt.registerTask('customTask', [
        'newer:jshint',
        'newer:qunit',
        'newer:concat',
        'newer:cssmin',
        'newer:uglify'
    ]);
};
I had assumed that using overridePath instead of configPath would solve my issue, but alas, it's not quite that simple. Is there some way to make it so that I can use my own custom tasks folder with load-grunt-config like I've done in other projects, or is there some magic conditional I can wrap the require statement around?
I only need it to run with grunt customTask, and not run with grunt * (anything else).
Okay, this was actually pretty easy. All I had to do was change the grunt.registerTask call in my customTask.js file from this:
grunt.registerTask('customTask', [
    'newer:jshint',
    'newer:qunit',
    'newer:concat',
    'newer:cssmin',
    'newer:uglify'
]);
to this:
grunt.registerTask('customTask', 'My custom tasks', function() {
    // The require statement is only run with "grunt customTask" now!
    require('load-grunt-config')(grunt, {
        config: '../../package.json',
        scope: 'devDependencies',
        overridePath: require('path').join(process.cwd(), '/asset-library/grunt')
    });

    grunt.task.run([
        'newer:jshint',
        'newer:qunit',
        'newer:concat',
        'newer:cssmin',
        'newer:uglify'
    ]);
});
In case it's not clear, I did have to move the require('load-grunt-config') call, so if you're copy + pasting, make sure to remove the require statement that's outside the grunt.registerTask call.
You can find more information about custom Grunt tasks here.
Background: I have a multi-project Gradle build, and I've defined a Gradle task which runs JavaScript unit tests in an Exec task. The inputs to this task are the JavaScript files in the project, so it's only re-run if one of the source files is modified. The task is added to all JavaScript projects from a master project.
Question: I want to extend this so that the tests are re-run if JavaScript files in the project, or in any of its project dependencies, are changed. How is this best done?
The code below works if placed in each subproject build file (after the dependency declaration), but we have 20+ JavaScript subprojects and I'd like to stay DRY.
project.ext.jsSourceFiles = fileTree("src/").include("**/*.js*")

task testJavaScript(type: Exec, dependsOn: configurations.js) {
    inputs.files resolveJavascriptDependenciesFor(project)
    outputs.file "report.xml"
    // Run tests in JSTestDriver using command line call...
}

def resolveJavascriptDependenciesFor(project) {
    def files = project.jsSourceFiles
    project.configurations.js.allDependencies.each {
        files = files + resolveJavascriptDependenciesFor(it.dependencyProject)
    }
    return files
}
Is there a better solution? Maybe where I don't have to resolve all file dependencies myself?
As written in the answer before, adding the jsTest task within a subprojects closure would make it very easy to add JS testing support for every subproject. I think you can ease your inputs setup by declaring source files as dependencies:
dependencies {
    js fileTree("src/main").include("**/*.js")
}
and
subprojects { subproj ->
    task testJavaScript(type: Exec, dependsOn: configurations.js) {
        inputs.files subproj.configurations.js
        outputs.file "report.xml"
        commandLine ...
    }
}
Would it be possible to do something like this?
allprojects { project ->
    task testJavaScript(type: Exec, dependsOn: configurations.js) {
        inputs.files resolveJavascriptDependenciesFor(project)
        // Run tests in JSTestDriver using command line call...
    }
}

def resolveJavascriptDependenciesFor(project) {
    def files = project.jsSourceFiles
    project.configurations.js.allDependencies.each {
        files = files + resolveJavascriptDependenciesFor(it.dependencyProject)
    }
    return files
}
That way the task is on all projects, and will be called recursively.
Not completely sure this works, but I think it's the way to go.
I've found a solution that works but isn't great, using the same dependency specification as in the question. It's to load the gradle files in a slightly different order in the master project.
Master build.gradle:
subprojects {
    configurations {
        js
    }
    // Apply the subproject's build.gradle, but with another name (that isn't automatically loaded)
    if (project.file('depends.gradle').exists()) {
        apply from: project.file('depends.gradle')
    }
    apply from: project.parent.file('javaScriptProjectTasks.gradle')
}
Compare to the previous, non-working master build.gradle:
subprojects {
    configurations {
        js
    }
    apply from: project.parent.file('javaScriptProjectTasks.gradle')
    // The subproject's build.gradle is automatically loaded
}