I need to use the Google Closure compiler.jar to minify a huge project I am working on. I have multiple js files that I want to compile into a single game.min.js file. I know I can use the following:
java -jar compiler.jar --js file1.js --js file2.js --js etc, etc --js_output_file game.min.js
...but I have a LOT of files, and as I understand it Closure doesn't support adding a directory and finding all the *.js files residing under it. My fumbling Google searches aren't turning up any tools I can use for the job (or at least none that work).
Has anyone out there found / used / written a script that loops through a directory and spits out all the .js files into a single minified file? I am hopeless with php, python, etc, so any help greatly appreciated.
You can use Ant to automate the use of the Closure Compiler.
I do it in two separate steps, concatenation then compilation:
<concat destfile="src/somepath/app.concat.js">
<filelist dir="src/somepath">
<file name="a.js" />
<file name="b.js" />
<file name="c.js" />
<file name="d.js" />
</filelist>
</concat>
<jscomp compilationLevel="simple" warning="quiet" debug="false" output="src/app.min.js">
<sources dir="src/somepath">
<file name="app.concat.js" />
</sources>
</jscomp>
Be careful: the order of the files is important. That's why you can't simply pass a fileset to the jscomp task (filesets don't guarantee ordering).
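For reference, a minimal build.xml wiring the two steps together might look like the sketch below. The taskdef classname is the Ant task bundled inside compiler.jar; the classpath and source paths are assumptions you'd adjust for your project:

```xml
<project name="minify" default="minify">
  <!-- jscomp is the Ant task shipped inside the Closure Compiler jar -->
  <taskdef name="jscomp"
           classname="com.google.javascript.jscomp.ant.CompileTask"
           classpath="lib/compiler.jar"/>

  <target name="minify">
    <concat destfile="src/somepath/app.concat.js">
      <filelist dir="src/somepath">
        <file name="a.js"/>
        <file name="b.js"/>
      </filelist>
    </concat>
    <jscomp compilationLevel="simple" warning="quiet"
            debug="false" output="src/app.min.js">
      <sources dir="src/somepath">
        <file name="app.concat.js"/>
      </sources>
    </jscomp>
  </target>
</project>
```

The filelist (rather than a fileset) is what preserves the load order.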
You can also use wildcards when specifying files. You could change your example to:
java -jar compiler.jar --js *.js --js_output_file game.min.js
This should combine all of the .js files in your current working directory into the output file you've specified.
You should concatenate all your source files before you apply Google Closure compiler.
For all the related tasks you could use the Ant build tool. There is also the great Grunt.js project, which is more convenient for JS. The grunt-contrib-concat and grunt-shell npm modules for Grunt.js cover this: the first handles concatenation, the other runs console commands.
Your Gruntfile.js could look like this:
module.exports = function(grunt) {
  // Project configuration.
  grunt.initConfig({
    concat: {
      js: {
        src: ['src/js/*.js'],
        dest: 'dist/pre-build.js'
      }
    },
    shell: {
      optimize: {
        command: 'google closure compiler command here',
        stdout: true
      }
    }
  });

  grunt.loadNpmTasks('grunt-shell');
  grunt.loadNpmTasks('grunt-contrib-concat');

  // Default task.
  grunt.registerTask('default', ['concat', 'shell:optimize']);
};
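The 'google closure compiler command here' placeholder would be the same invocation you'd run by hand. A plausible filled-in version of the shell task (the dist/ paths mirror the concat config above; the compiler.jar location is an assumption):

```js
shell: {
  optimize: {
    command: 'java -jar compiler.jar' +
             ' --js dist/pre-build.js' +       // output of the concat task
             ' --js_output_file dist/build.min.js',
    stdout: true
  }
}
```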
I have the following setup in PhpStorm, and I would like to combine my JavaScript files located in the /js/app and /js/lib folders.
Right now it only minifies a single file; I would like to combine them.
One simple way to do it:
Create a closure.command file (or whatever you want to name it) in the same directory as your js files. Put Closure command line options in it like so:
--js file1.js
--js file2.js
--js_output_file all.min.js
Edit the Closure file watcher arguments to include --flagfile closure.command. You will probably want to uncheck the "Create output file from stdout" option.
Note: since you want it to operate on .js files from different directories, you may have to adjust the file paths and the "Working Directory" option for it to work properly.
I have a directory of scripts:
/scripts/module-foo.js
/scripts/module-bar.js
/scripts/site.js
/scripts/some_other_non_module_script.js
the two module scripts export modules:
goog.module('foo');
exports = 'foo';
And the site.js script includes them:
var foo = goog.module.get('foo');
I can get this to work fine if I manually specify each source file in the compiler command:
java -jar compiler.jar ... --js module-foo.js --js module-bar.js --js site.js
but I'm trying to avoid that. If I specify
--js ./**
It works, but the output file then contains the source from some_other_non_module_script.js as well as site.js. I only want site.js.
site.js will need to have a goog.provide statement itself - something like goog.provide('mysite'). Then you can use the --only_closure_dependencies and --closure_entry_point flags.
java -jar compiler.jar ... --js scripts/**.js --only_closure_dependencies
--closure_entry_point mysite
See the documentation on the Manage Closure Dependencies flags.
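Putting that together, site.js might look like this sketch. The mysite namespace is just an example name, and I'm assuming module-bar.js provides 'bar'; the goog.require calls are what let the compiler trace the dependency graph from the entry point:

```js
// site.js — declared as the entry point, so the compiler can prune
// everything that isn't transitively required from here.
goog.provide('mysite');

goog.require('foo');  // pulls in module-foo.js
goog.require('bar');  // pulls in module-bar.js

var foo = goog.module.get('foo');
```

With this in place, some_other_non_module_script.js is dropped because nothing requires it.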
I have a lot of .ts files in my project. WebStorm builds each .ts file into a separate .js file, but I don't want that.
I have an app.ts file, and I want all the other .ts files built into it. How can I do that in WebStorm 7?
There is a solution in CLI mode, but how can I implement it in WebStorm?
tsc --out app.js main.ts app.ts a.ts b.ts
Or is there a better way to do this?
ANSWER
I just added this line in the Arguments section of Edit Watcher:
--sourcemap $FileName$ --out your-main.js
You can specify the --out option in the TypeScript file watcher arguments. If the "Track only root files" option is on, all .ts files will be merged into the main .js file (the one that imports them all, directly or via a chain of references) whenever any of them is modified.
You could use grunt-ts which can maintain a reference.ts file for you, and point the webstorm file watcher to run your grunt task https://github.com/basarat/grunt-ts#javascript-generation-and-ordering
Disclaimer: I am one of the authors of grunt-ts.
For those who don't have a single file that links to all the others, and who don't want to manually maintain a references.ts file or use basarat's grunt-ts, here's my setup:
The basic idea is to list all your *.ts files in a text file and then use that file as a parameter for the tsc compiler. So I created my own file watcher and disabled the default one for TypeScript files. My file watcher's program is a .bat file with the following content:
dir /s /b /o:gn scripts\*.ts > ts_sources.txt
tsc %*
If you're on Mac or Linux you can easily transform this into a bash script.
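A rough bash equivalent of the batch file above might look like the following sketch. The scripts path and the guard around tsc are assumptions (the guard just keeps the script from failing where tsc isn't installed); the demo setup lines stand in for your real source tree:

```shell
#!/bin/sh
# compile.sh — list every .ts file under scripts/ into a response file,
# then hand the file watcher's arguments on to tsc.

mkdir -p scripts                       # demo setup so this runs standalone;
touch scripts/a.ts scripts/b.ts        # in a real project the files already exist

# Equivalent of `dir /s /b /o:gn scripts\*.ts > ts_sources.txt`
find scripts -name '*.ts' | sort > ts_sources.txt

if command -v tsc >/dev/null 2>&1; then
  tsc "$@" || true   # don't abort the demo if tsc has nothing to compile
fi
```

You would point the watcher at this script instead of the .bat file.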
On the watcher's setup screen you point to your batch file (bash script):
$ProjectFileDir$\compile.bat
and as the arguments, use the following:
@$ProjectFileDir$\ts_sources.txt --out $ProjectFileDir$\app_all.js --sourcemap
I guess there are many ways to do it...
I have some projects that use RequireJS to load individual JavaScript modules in the browser, but I haven't optimized them yet. In both development and production, the app makes a separate request for each JavaScript file, and now I would like to fix that using Grunt.
I have tried to put together a simple project structure to no avail, so I'm wondering if someone can provide a working example for me. My goals are the following:
In development mode, everything works in the browser by issuing a separate request for each required module. No grunt tasks or concatenation are required in development mode.
When I'm ready, I can run a grunt task to optimize (combine) all of the JavaScript files using r.js and test that out locally. Once I'm convinced the optimized application runs correctly, I can deploy it.
Here's a sample structure for the sake of this conversation:
grunt-requirejs-example/
grunt.js
main.js (application entry point)
index.html (references main.js)
lib/ (stuff that main.js depends on)
a.js
b.js
requirejs/
require.js
text.js
build/ (optimized app goes here)
node_modules/ (necessary grunt tasks live here)
Specifically, I'm looking for a working project structure that I can start from. My main questions are:
If this project structure is flawed, what do you recommend?
What exactly needs to be in my grunt.js file, especially to get the r.js optimizer working?
If all of this isn't worth the work and there's a way to use the grunt watch task to automatically build everything in development mode every time I save a file, then I'm all ears. I want to avoid anything that slows down the loop from making a change to seeing it in the browser.
I use the grunt-contrib-requirejs task to build project based on require.js. Install it inside your project directory with:
npm install grunt-contrib-requirejs --save-dev
BTW: --save-dev will add the package to your development dependencies in your package.json. If you're not using a package.json in your project, ignore it.
Load the task in your grunt file with:
grunt.loadNpmTasks('grunt-contrib-requirejs');
And add the configuration to your grunt.initConfig
requirejs: {
  production: {
    options: {
      baseUrl: "path/to/base",
      mainConfigFile: "path/to/config.js",
      out: "path/to/optimized.js"
    }
  }
}
Now you're able to build your require.js stuff into a single file (minified with UglifyJS) by running grunt requirejs.
You can bundle a set of different tasks into some sort of main task, by adding this to your grunt file
grunt.registerTask('default', ['lint', 'requirejs']);
With this, you can simply type grunt and grunt will automatically run the default task with the two 'subtasks': lint and requirejs.
If you need a special production task: define it like the above
grunt.registerTask('production', ['lint', 'requirejs', 'less', 'copy']);
and run it with
grunt production
If you need different behaviors for 'production' and 'development' inside, e.g., the requirejs task, you can use so-called targets. In the configuration example above, one is already defined as production. You can add another target if you need it. (By the way, you can define a global config for all targets by adding an options object at the same level.)
requirejs: {
  // global config
  options: {
    baseUrl: "path/to/base",
    mainConfigFile: "path/to/config.js"
  },
  production: {
    // overwrites the default config above
    options: {
      out: "path/to/production.js"
    }
  },
  development: {
    // overwrites the default config above
    options: {
      out: "path/to/development.js",
      optimize: "none" // no minification
    }
  }
}
Now you can run both at the same time with grunt requirejs, individually with grunt requirejs:production, or reference them in different tasks with:
grunt.registerTask('production', ['lint', 'requirejs:production']);
grunt.registerTask('development', ['lint', 'requirejs:development']);
Now to answer your questions:
I would definitely use a subfolder in your project. In my case I use a 'src' folder for development that is built into a 'htdocs' folder for production. The project layout I prefer is:
project/
  src/
    js/
      libs/
        jquery.js
        ...
      appname/
        a.js
        b.js
        ...
      main.js // require.js starter
    index.html
    ...
  build/
    ... // some tmp folder for the build process
  htdocs/
    ... // production build
  node_modules/
    ...
  .gitignore
  grunt.js
  package.json
see above
You can do so, but I wouldn't recommend adding requirejs to the watch task; it's a resource-hungry task and it will slow down your machine noticeably.
Last but not least: be very cautious when playing around with r.js, especially when you optimize the whole project by adding a modules directive to your config. r.js deletes the output directory without asking; if it is accidentally configured to be your system root, r.js will erase your HDD. Be warned: I permanently erased my whole htdocs folder some time ago while setting up my grunt task. Always add keepBuildDir: true to your options when experimenting with the r.js config.
I would like to compress all the .js files in a directory into a single file with the Google Closure Compiler from the command line.
For one file it's:
java -jar compiler.jar --js test.js --js_output_file final.js
But I couldn't find in the docs how to add my other files to final.js without overwriting the previously compressed output.
I would like something like this:
java -jar compiler.jar --js --option *.js --js_output_file final.js
Is this possible, or must I write a program that concatenates all the files into one file and then compresses that?
Thank you if you can help me!
java -jar path/to/closure-compiler/build/compiler.jar \
--js input_one.js \
--js input_two.js \
... \
--js_output_file compiled_output.js
I tried Orbits' answer, but it didn't really work; perhaps the version I use is newer. The command I use is:
java -jar compiler.jar --js file1.js file2.js file3.js --js_output_file compiled_output.js
Assuming that you’ve installed the Closure Compiler with Homebrew on a Mac, you can also concatenate all the files and pipe them to Closure (this should also work the java -jar way, but I haven’t verified it):
cat scripts/*.js | closure-compiler --js_output_file main.js
Here are five globbing techniques for including multiple input files, with documentation extracted from the CommandLineRunner class:
(1) This is a variation of muka's technique, removing the --js flag, which is not needed:
java -jar compiler.jar \
--js_output_file build/out.js `find ./src/*.js`
From the docs:
The --js flag name is optional, because args are interpreted as files by default.
This will include all .js files in /src/, but won't include any files in subdirectories of /src/.
(2) Similar to 1, but will include all .js files in /src/ and all its subdirectories:
java -jar compiler.jar \
--js_output_file build/out.js `find ./src/ -name '*.js'`
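The difference between these two find invocations is easy to verify on a throwaway tree (the directory and file names below are made up for the demo):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/src/sub"
touch "$tmp/src/a.js" "$tmp/src/sub/b.js"

# Variant 1: the shell expands $tmp/src/*.js first, so find only
# receives the top-level matches.
top_level=$(find "$tmp/src"/*.js | wc -l | tr -d ' ')

# Variant 2: -name '*.js' lets find itself recurse into subdirectories.
recursive=$(find "$tmp/src" -name '*.js' | wc -l | tr -d ' ')

echo "top-level: $top_level, recursive: $recursive"
rm -rf "$tmp"
```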
(3) Similar to 2, but uses xargs:
find ./src/ -name '*.js' \
| xargs java -jar compiler.jar \
--js_output_file build/out.js \
--manage_closure_dependencies
From the docs:
It is convenient to leverage the additional arguments feature when using the
Closure Compiler in combination with find and xargs:
find MY_JS_SRC_DIR -name '*.js' \
| xargs java -jar compiler.jar --manage_closure_dependencies
The find command will produce a list of '*.js' source files in
the MY_JS_SRC_DIR directory while xargs will convert them
to a single, space-delimited set of arguments that are appended to the
java command to run the Compiler.
Note that it is important to use the
--manage_closure_dependencies option in this case because the
order produced by find is unlikely to be sorted correctly with
respect to goog.provide() and goog.requires().
(4) The v20140625 release added support for the ** (globstar) wildcard, which recursively matches all subdirectories.
For example, this will include all .js files in /src/ and all its subdirectories:
java -jar compiler.jar \
--js_output_file build/out.js './src/**.js'
More info here. From the docs:
You may also use minimatch-style glob patterns. For example, use:
--js='**.js' --js='!**_test.js'
to recursively include all js files that do not end in _test.js
From the Java docs:
The following rules are used to interpret glob patterns:
The * character matches zero or more characters of a name component without crossing directory boundaries.
The ** characters matches zero or more characters crossing directory boundaries.
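The * vs ** distinction can be seen with bash's globstar option (bash 4+ assumed; note bash writes the recursive pattern as src/**/*.js, slightly different from the compiler's minimatch-style **.js). Throwaway tree again:

```shell
#!/bin/bash
shopt -s globstar              # enable ** in pathname expansion
tmp=$(mktemp -d)
mkdir -p "$tmp/src/sub"
touch "$tmp/src/a.js" "$tmp/src/sub/b.js"

cd "$tmp"
single=(src/*.js)     # * stops at directory boundaries: only src/a.js
double=(src/**/*.js)  # ** crosses them: src/a.js and src/sub/b.js
echo "single: ${#single[@]}, double: ${#double[@]}"
cd - >/dev/null
rm -rf "$tmp"
```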
(5) The v20140625 release also added a new feature: if the input path is a directory, then all .js files in that directory and all its subdirectories will be included.
For example, this will include all .js files in /src/ and all its subdirectories:
java -jar compiler.jar \
--js_output_file build/out.js './src/'
More info here.
You can use KjsCompiler: https://github.com/knyga/kjscompiler. It compiles multiple JavaScript files with the Google Closure Compiler in the right order.
How to solve your problem:
1. Add annotations to your js files, like this:
/**
 * @depends {lib/somefile.js}
 **/
If you do not care about the compilation order, you can skip this step.
2. java -jar kjscompile.jar
You can look for example here: https://github.com/knyga/kjscompiler/tree/master/examples
<exec executable="java">
<arg line="-jar ../lib/compiler.jar --compilation_level SIMPLE_OPTIMIZATIONS --language_in ECMASCRIPT5 --js_output_file=${compressdir}/js/controllers/jscontrollersfiles.js ${webappdir}/js/controllers/*.js" />
</exec>
The Google Closure Compiler README on GitHub gives the commands below.
If you have multiple scripts, you should compile them all together with one compile command.
java -jar compiler.jar --js_output_file=out.js in1.js in2.js in3.js ...
You can also use minimatch-style globs.
# Recursively include all js files in subdirs
java -jar compiler.jar --js_output_file=out.js 'src/**.js'
# Recursively include all js files in subdirs, excluding test files.
# Use single-quotes, so that bash doesn't try to expand the '!'
java -jar compiler.jar --js_output_file=out.js 'src/**.js' '!**_test.js'
Source : https://github.com/google/closure-compiler#compiling-multiple-scripts