I'm trying to learn Make and am building a Makefile into my app to help with building and minifying my .js files for use by a combo loader server application later on.
What I'm trying to accomplish is that when I run make, it copies to the build directory only the .js files that have changed since the last run, then minifies each of those files to generate a -min.js copy. Finally, I need to always generate a new meta.js file.
I've pasted what I have working below. The trouble is that it isn't picking up only the changed .js files, but copying every file on every run. I'm missing something about how to get Make to pick only the changed files here.
BOOKIE_JS = bookie/static/js/bookie
JS_BUILD_PATH = bookie/static/js/build
JS_META_SCRIPT = scripts/js/generate_meta.py
jsbuild: $(JS_BUILD_PATH)/bookie/meta.js

clean_js:
	rm -rf $(JS_BUILD_PATH)/*

$(JS_BUILD_PATH)/bookie/meta.js: $(BOOKIE_JS)/y*-min.js
	$(JS_META_SCRIPT) -n YUI_MODULES -s $(BOOKIE_JS)/y* -o $(JS_BUILD_PATH)/bookie/meta.js

$(BOOKIE_JS)/y*-min.js: $(BOOKIE_JS)/y*.js
	scripts/js/jsmin_all.py $(JS_BUILD_PATH)/bookie

# this is the part that runs for each .js file and I'd like it to only run for the *modified* files
$(BOOKIE_JS)/y*.js: $(JS_BUILD_PATH)/bookie
	cp $# $(JS_BUILD_PATH)/bookie/

$(JS_BUILD_PATH)/bookie:
	mkdir $(JS_BUILD_PATH)/bookie

clean: clean_js

.PHONE: clean clean_js
Current output:
cp bookie/static/js/bookie/yapi.js bookie/static/js/build/bookie/
cp bookie/static/js/bookie/ymodel.js bookie/static/js/build/bookie/
cp bookie/static/js/bookie/ytagcontrol.js bookie/static/js/build/bookie/
cp bookie/static/js/bookie/yview.js bookie/static/js/build/bookie/
scripts/js/jsmin_all.py bookie/static/js/build/bookie
scripts/js/generate_meta.py -n YUI_MODULES -s bookie/static/js/bookie/y* -o bookie/static/js/build/bookie/meta.js
I'd like to see only the cp of the changed files.
I think you intended to make a pattern rule but used the wrong syntax. For example, this:
$(BOOKIE_JS)/y*-min.js: $(BOOKIE_JS)/y*.js
scripts/js/jsmin_all.py $(JS_BUILD_PATH)/bookie
means each of the $(BOOKIE_JS)/y*-min.js files depends on the $(BOOKIE_JS)/y*.js files -- all of them, not just the one with a similar name. If you do this:
$(BOOKIE_JS)/y%-min.js: $(BOOKIE_JS)/y%.js
scripts/js/jsmin_all.py $(JS_BUILD_PATH)/bookie
then the % must be replaced with the same string on each side, so for example $(BOOKIE_JS)/yapi-min.js depends only on $(BOOKIE_JS)/yapi.js.
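Applied to the copy step from the question, a rough sketch with a pattern rule might look like this (GNU Make assumed; SRCS and COPIES are illustrative variable names, recipe lines need a leading tab, and the question's existing mkdir rule for $(JS_BUILD_PATH)/bookie is reused as an order-only prerequisite):

SRCS   := $(filter-out %-min.js,$(wildcard $(BOOKIE_JS)/y*.js))
COPIES := $(patsubst $(BOOKIE_JS)/%.js,$(JS_BUILD_PATH)/bookie/%.js,$(SRCS))

jsbuild: $(COPIES)

# copy a single source file only when it is newer than its build copy
$(JS_BUILD_PATH)/bookie/%.js: $(BOOKIE_JS)/%.js | $(JS_BUILD_PATH)/bookie
	cp $< $@

With something like that in place, make should only re-run cp for sources that are newer than their copies in the build directory.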
I'm in the process of building an npm package which will be installed globally. Is it possible to have non-code files installed alongside code files that can be referenced from code files?
For example, if my package includes someTextFile.txt and a module.js file (and my package.json includes "bin": {"someCommand":"./module.js"}) can I read the contents of someTextFile.txt into memory in module.js? How would I do that?
The following is an example of a module that loads the contents of a file (string) into the global scope.
core.js : the main module file (entry point of package.json)
//:Understanding: module.exports
module.exports = {
  reload: (cb) => { console.log("[>] Magick reloading to memory"); ReadSpellBook(cb) }
}
//:Understanding: global object
// the following function is only accessible by the magick module
const ReadSpellBook = (cb) => {
  require('fs').readFile(__dirname + "/spellBook.txt", "utf8", (e, theSpells) => {
    if (e) { console.log("[!] The Spell Book is MISSING!\n"); cb(e) }
    else {
      console.log("[*] Reading Spell Book")
      // since we want to make the contents of .txt accessible:
      global.SpellBook = theSpells // global.SpellBook is now shared across all the code (global scope)
      cb() // callback
    }
  })
}
//·: Initialize :.
console.log("[+] Time for some Magick!")
ReadSpellBook((e)=>e?console.log(e):console.log(SpellBook))
spellBook.txt
ᚠ ᚡ ᚢ ᚣ ᚤ ᚥ ᚦ ᚧ ᚨ ᚩ ᚪ ᚫ ᚬ ᚭ ᚮ ᚯ
ᚰ ᚱ ᚲ ᚳ ᚴ ᚵ ᚶ ᚷ ᚸ ᚹ ᚺ ᚻ ᚼ ᚽ ᚾ ᚿ
ᛀ ᛁ ᛂ ᛃ ᛄ ᛅ ᛆ ᛇ ᛈ ᛉ ᛊ ᛋ ᛌ ᛍ ᛎ ᛏ
ᛐ ᛑ ᛒ ᛓ ᛔ ᛕ ᛖ ᛗ ᛘ ᛙ ᛚ ᛛ ᛜ ᛝ ᛞ ᛟ
ᛠ ᛡ ᛢ ᛣ ᛤ ᛥ ᛦ ᛧ ᛨ ᛩ ᛪ ᛫ ᛬ ᛭ ᛮ ᛯ
If you require it from another piece of code, you will see how it prints to the console and initializes by itself.
If you want to achieve a manual initialization, simply remove the last 3 lines (·: Initialize :.) and use reload():
const magick = require("./core.js")
magick.reload((error) => {
  if (error) { throw error }
  else {
    // now you know the SpellBook is loaded
    console.log(SpellBook.length)
  }
})
I have built some CLIs which were distributed privately, so I believe I can illuminate a bit here.
Let's say your global modules are installed in a directory we'll call $PATH. When your package is installed on any machine, it is essentially extracted into that directory.
When you fire up someCommand from any terminal, the module.js kept at $PATH is invoked. If you kept the template file in the same directory as your package, it will be present at that location, local to module.js.
Assuming you edit the template as a string and then want to write it out to wherever the user wants (e.g. the pwd), you just have to use process.cwd() to get the path to that directory. This totally depends on how you code it out.
In case you want to explicitly include only certain files in the npm package, use the files attribute of package.json.
To directly answer "how can my code file in the npm package locate the path to the globally installed npm folder in which it is located in a way that is guaranteed to work across OSes and is future proof?": that is quite different from the template thing you were trying to achieve. What you're really asking for is the global path of npm modules. As a fail-safe option, use the path returned by require.main.filename within your code and keep that as a reference.
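As a rough illustration of those two ideas (a sketch using the file name from the question, not a definitive recipe):

var path = require('path');

// directory of the entry script that npm wired up for "someCommand"
var entryDir = path.dirname(require.main.filename);

// a file shipped alongside it inside the package
var templatePath = path.join(entryDir, 'someTextFile.txt');

// the directory the user ran the command from, useful for writing output
var outputPath = path.join(process.cwd(), 'output.txt');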
When you npm publish, it packages everything in the folder, excluding things noted in .npmignore. (If you don't have an .npmignore file, it'll dig into .gitignore. See https://docs.npmjs.com/misc/developers#keeping-files-out-of-your-package) So in short, yes, you can package the text file into your module. Installing the module (locally or globally) will get the text file into place in a way you expect.
How do you find the text file once it's installed? __dirname gives you the path of the current file ... if you ask early enough. See https://nodejs.org/docs/latest/api/globals.html#globals_dirname (If you use __dirname inside a closure, it may be the path of the enclosing function.) For the near-term of "future", this doesn't look like it'll change, and will work as expected in all conditions -- whether the module is installed locally or globally, and whether others depend on the module or it's a direct install.
So let's assume the text file is in the same directory as the currently running script:
var fs = require('fs');
var path = require('path');
var dir = __dirname;

function runIt(cb) {
  var fullPath = path.join(__dirname, 'myfile.txt');
  fs.readFile(fullPath, 'utf8', function (e, content) {
    if (e) {
      return cb(e);
    }
    // content now has the contents of the file
    cb(content);
  });
}
module.exports = runIt;
Sweet!
I would like to copy a list of folders to a destination with gulp.
So far I've come up with a working solution, but it's far from performant.
The structure of my directory is like this:
App
src
web
some files...
and I would like to copy it to
build
src
web
the files
The code I am using to accomplish this is:
var paths = [path.app + '/src/', path.app + '/app/'].concat(path.assets);

paths.forEach(function (value, index) {
  // value.replace(path.app, path.build);
  gulp.src(value + '/**/*')
    .pipe(gulp.dest(value.replace(path.app, path.build)));
});
Where the assets are my files (or other directories). However, there is a loop and no clear return value. I am wondering if there is a more performant way of doing this.
I'm not sure I understand what you're trying to do here (where is your gulp task definition for example?), but it seems like you just want to copy everything below App to the build folder while preserving directory structure.
If that's the case, you don't have to loop over the files and replace folder names yourself. Gulp does it for you:
gulp.task('default', function () {
return gulp.src('App/**')
.pipe( gulp.dest('build') );
});
Everything before the ** is automatically stripped from the path of files written to build, so you end up with build/src, build/web, etc ...
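If you really do want to limit the copy to specific folders (as in the original loop), you can pass an array of globs and set the base option so the App prefix is still stripped. A sketch, assuming the folders from the question:

var gulp = require('gulp');

gulp.task('copy', function () {
  // only src and web are copied; their paths relative to App are preserved under build
  return gulp.src(['App/src/**', 'App/web/**'], { base: 'App' })
    .pipe(gulp.dest('build'));
});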
Hey guys!
I need help with the commander node.js library. I need to create a CLI which accepts 3 flags, --input, --output and --pattern, like:
commander
.version('3.0.0')
.usage('[options] <file ...>')
.option('-i, --input', 'Array of files to be extracted')
.option('-o, --output', 'Output file name')
.option('-p, --pattern', 'Pattern name to be used in the extraction')
.parse(process.argv);
My problem is with the input flag. I need to send several files, so I need an array data type.
The problem is: I just can't figure out how to make this:
node ./bin/extract -i ../files/*.PDF
become an array with all the files that are inside my files directory. I've already tried running every sample in the documentation and didn't find a solution to my problem. I also searched the issues and didn't find anything either, which is strange; maybe I'm doing something wrong and you guys could help?
Thanks!
You can use Coercion to achieve it:
var program = require('commander');
var fs = require('fs');

function scanDir(val) {
  var files = fs.readdirSync(val);
  return files;
}

program
  .version('0.0.1')
  .option('-s, --scan [value]', '', scanDir)
  .parse(process.argv);
console.log(' scan: %j', program.scan);
And call it like:
node app.js -s /foo
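If the goal is to pass several explicit file names rather than scanning a directory, a repeatable option with a collect-style coercion is another sketch (the collect helper and option name here are only illustrative):

var program = require('commander');

// accumulate each repeated -i value into an array
function collect(val, memo) {
  memo.push(val);
  return memo;
}

program
  .version('3.0.0')
  .option('-i, --input <file>', 'File to be extracted (repeatable)', collect, [])
  .parse(process.argv);

// node ./bin/extract -i a.pdf -i b.pdf  ->  program.input === ['a.pdf', 'b.pdf']
console.log('input: %j', program.input);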
I'm using skeleton #2, HTML5BP + Grunt. The first time I run docpad run, the following happens:
info: LiveReload listening to new socket on channel /docpad-livereload
Performing writeFiles (postparing) at 0/1 0% [...] Running "min:js" (min) task
File "../out/scripts/all.min.js" created.
Uncompressed size: 298495 bytes.
Compressed size: 38257 bytes gzipped (106756 bytes minified).
Which is as it's supposed to be. However, using the livereload plugin, if I change a template or document file, I get:
--Running "min:js" (min) task
File "../out/scripts/all.min.js" created.
Uncompressed size: 0 bytes.
Editing my script.js throws it into the mix, but none of my vendor js files are rendered with it, which is just as useless. grunt-cssmin renders all scss/css files in grunt-config.json regardless, which works fine. Moving my js from /files/vendor to /documents/scripts didn't change this behavior.
I've done a little poking around, but I'm new to grunt and nothing jumped out at me.
It'd be nice if I could either:
a) have all JS files in grunt-config.json minified and gzipped each time
b) not have grunt minify JS files in the development environment
As it is, if I want to make any changes to something regarding JavaScript, I need to Ctrl-C docpad and run it again, which is meh.
Not ideal, but effective enough:
events:
  # Write After
  # Used to minify our assets with grunt
  writeAfter: (opts,next) ->
    # Prepare
    docpad = @docpad
    rootPath = docpad.config.rootPath
    balUtil = require 'bal-util'
    _ = require 'underscore'

    # Make sure to register a grunt `default` task
    command = ["#{rootPath}/node_modules/.bin/grunt", 'default']

    # Execute
    balUtil.spawn command, {cwd:rootPath,output:true}, ->
      src = []
      gruntConfig = require './grunt-config.json'
      _.each gruntConfig, (value, key) ->
        src = src.concat _.flatten _.pluck value, 'src'
      #_.each src, (value) ->
      #  balUtil.spawn ['rm', value], {cwd:rootPath, output:false}, ->
      #balUtil.spawn ['find', '.', '-type', 'd', '-empty', '-exec', 'rmdir', '{}', '\;'], {cwd:rootPath+'/out', output:false}, ->
      next()

    # Chain
    @
The three lines around "balUtil" which perform find/rm commands were commented out.
Not ideal since the "uncompressed" files are left around -- but that's not really the end of the world. Live-reloading to empty pages was a tad more frustrating, ultimately.
There could be a way to further enhance this to detect a live reload (development) vs generating a build for production, but I haven't grokked that yet.
I would like be able to run a single command in my project folder to concatenate and compress all of my javascript files (perhaps with YUI Compressor) into a single output file.
If possible I would like to partially specify the order in which they are concatenated together but not have to keep track of every single javascript file. Perhaps a config file could be built which looks like this:
application.js
excanvas.js
json2.js
jquery*.js
flot/*
backbone*.js
app/screen-*.js
app/main.js
app/crud-*.js
app/*
*
Does anyone know of an existing tool that does something like this, could whip together a bash/ruby/node/perl script, or suggest a better methodology? I'm building a Single Page App with heavy JS usage (~40 files) to be consumed by people with low bandwidth.
I would need the solution to be executable on my OS X development machine.
find . -iname "*.js" -exec cat "{}" \; > singlefile.js
[JS compressor] singlefile.js
First concatenate the files, then compress them.
If you really care, though, you may want a real JS optimizer like the RequireJS optimizer.
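For example, that two-step approach with YUI Compressor might look roughly like this (the jar path and the file order are assumptions, not a tested build script):

# concatenate in a chosen order, then compress the result
cat application.js excanvas.js json2.js jquery*.js flot/*.js app/*.js > combined.js
java -jar yuicompressor.jar --type js -o combined.min.js combined.js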
given a folder of javascript files:
geee: ~/src/bash/js-files
$ find .
.
./application.js
./jquery-ui.js
./all-scripts.js
./cp.js
./excanvas.js
./backbone-worldwide.js
./jquery-plugin.js
./.found
./app
./app/crud-sel.js
./app/screen-detach.js
./app/aligator.js
./app/crud-in.js
./app/giraffe.js
./app/screen-attach.js
./app/main.js
./app/crud-del.js
./app/mouse.js
./app/monkey.js
./app/screen-shot.js
./backbone-national.js
./backbone23.js
./ummap.js
./CONFIG
./backbone-ibm.js
./ieee754.js
./flot
./flot/cow.js
./flot/moo.js
./flot/cat.js
./flot/bull.js
./flot/dog.js
./flot/sheep.js
./lines
./droiddraw-r1b21
./droiddraw-r1b21/._readme.txt
./droiddraw-r1b21/readme.js
./droiddraw-r1b21/LICENSE.js
./jquery-1.7.js
./ole.js
./touch
./json2.js
./xls2txt.js
./DO.sh
./backbone-isp.js
with a slightly modified configuration file:
geee: ~/src/bash/js-files
$ cat CONFIG
application.js
excanvas.js
json2.js
jquery*.js
flot/*
backbone*.js
app/screen-*.js
app/main.js
app/crud-*.js
app/*js
*js
and this bash script:
$ cat DO.sh
PROJECT=/home/jaroslav/src/bash/js-files   # top folder of the web-app
SUPERJS=${PROJECT}/all-scripts.js
CONFIG=${PROJECT}/CONFIG                   # the priority file (notice *js)
FOUND=${PROJECT}/.found                    # where to save results
JSMIN=$HOME/bin/jsmin                      # change to /usr/local/bin/jsmin or some other tool

echo > $FOUND                              # remove results from previous run

if [ ! -x $JSMIN ]
then
    TMPJSMIN=/tmp/jsmin.c
    wget -q https://github.com/douglascrockford/JSMin/raw/master/jsmin.c -O $TMPJSMIN & FOR=$!
    echo "fetching jsmin (by Douglas Crockford) from github"
    wait $FOR
    gcc -o $JSMIN $TMPJSMIN
fi

cat $CONFIG | \
while read priority
do
    eval "find $priority|sort -n" | \
    while read amatch
    do
        grep -q $amatch $FOUND || echo $amatch >> $FOUND
    done
done

echo minifying:
cat $FOUND
cat `cat $FOUND` | $JSMIN > $SUPERJS
You will find the "merged" script in all-scripts.js after running the script:
geee: ~/src/bash/js-files
$ . DO.sh
fetching jsmin (by Douglas Crockford) from github
[1]+ Done wget -q https://github.com/douglascrockford/JSMin/raw/master/jsmin.c -O $TMPJSMIN
minifying:
application.js
excanvas.js
json2.js
jquery-1.7.js
jquery-plugin.js
jquery-ui.js
flot/bull.js
flot/cat.js
flot/cow.js
flot/dog.js
flot/moo.js
flot/sheep.js
backbone23.js
backbone-ibm.js
backbone-isp.js
backbone-national.js
backbone-worldwide.js
app/screen-attach.js
app/screen-detach.js
app/screen-shot.js
app/main.js
app/crud-del.js
app/crud-in.js
app/crud-sel.js
app/aligator.js
app/giraffe.js
app/monkey.js
app/mouse.js
all-scripts.js
cp.js
ieee754.js
ole.js
ummap.js
xls2txt.js
Let me know if you need me to explain the script or if it fails on OS X.
The following script will follow the order of your config file and use the patterns given:
#!/bin/bash
shopt -s nullglob
while read config; do
    cat $config >> out.js
done < /path/to/config/file
I ended up building a solution which uses a json file to list all of the files required by the app. In the dev environment the files are individually loaded by the browser. On the production server, the big compiled file is loaded. On my dev machine, I manually run a command that iterates over each file, appending it to a big JS file and running YUI Compressor.
It's a little hacky, but here it is:
https://github.com/renownedmedia/js-compressor
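For reference, the general idea can be sketched in a few lines of Node (a hypothetical outline, not the code from that repository; files.json and the output names are made up):

// build.js - concatenate the files listed in a JSON manifest into one file.
// files.json is assumed to look like: ["application.js", "app/main.js", ...]
var fs = require('fs');

var files = JSON.parse(fs.readFileSync('files.json', 'utf8'));
var combined = files.map(function (f) {
  return fs.readFileSync(f, 'utf8');
}).join(';\n');

fs.writeFileSync('all.js', combined);
// then compress the result, e.g.: java -jar yuicompressor.jar --type js -o all.min.js all.js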