I need to write a function for gulp.
The function should:
1. Open a file (style.min.css)
2. Find all lines containing the given text (img/sprite.png)
3. Replace the found text with (../img/sprite.png)
To simplify the task, I included the replace-in-file plugin.
I am not good at Node.js, but I tried...
Please help me get it right.
If you want, you can show how to do this without the replace-in-file plugin.
But you can also use the plugin...
const build = 'build/'
const replace = require('replace-in-file')

const config = {
  build: {
    style: build + 'css'
  }
}

function fixSprites() {
  const results = replace.sync({
    files: config.build.style + '/style.min.css',
    from: 'img/sprite.png',
    to: '../img/sprite.png',
    countMatches: true,
  })
  return src(config.build.style + '/style.min.css')
    .pipe(replace(results))
}

exports.fixSprites = fixSprites()
I used the 'replace-in-file' plugin!
This is the out-of-the-box gulp solution:
replace all lines of a similar type with others.
Example: replace img/sprite.png with ../img/sprite.png
const { series } = require('gulp')
const replace = require('replace-in-file')

function updateRedirects(done) {
  replace({
    files: 'build/css/style.min.css',
    from: /img\/sprite\.png/g,  // img/sprite.png – Replace this.
    to: '../img/sprite.png',    // ../img/sprite.png – Replace with this.
    countMatches: true,
  }, done)
}

exports.build = series(updateRedirects) // Start
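If you would rather stay inside a normal gulp stream instead of calling replace-in-file directly, a minimal sketch with gulp-replace (a separate plugin, assuming it is installed; the paths mirror the config above) could look like this:

const { src, dest } = require('gulp')
const replace = require('gulp-replace')

// Rewrite every occurrence of the sprite path inside the built stylesheet
// and write the file back to the same folder.
function fixSprites() {
  return src('build/css/style.min.css')
    .pipe(replace(/img\/sprite\.png/g, '../img/sprite.png'))
    .pipe(dest('build/css'))
}

exports.fixSprites = fixSprites

Running gulp fixSprites would then overwrite build/css/style.min.css in place with the adjusted sprite URLs.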
When building our production app in Gatsby, I see something like this:
window.___chunkMapping = {
  "app": [],
  "component---src-templates-page-tsx": [],
  "component---src-templates-pages-newsletter-tsx": []
}
Is it possible to hash these paths instead of printing them out? We don't want to expose too much of what is happening in the back end.
I tried setting these configs in webpack:
output: {
  filename: `[chunkhash:2][contenthash:5].js`,
  chunkFilename: `[chunkhash:2][contenthash:5].js`,
},
And it successfully hashes .js files but not the template paths.
I upvoted this question when I first saw it; I think this definitely should be done in production builds.
Unfortunately, componentChunkName (the template path in question) is generated by Gatsby in createPage & not handled by webpack.
The code that generates componentChunkName is over here: github
I tried to modify the code as follows:
  const { kebabCase } = require(`lodash`)
  const path = require(`path`)
+ const uuidv5 = require(`uuid/v5`)
  const { store } = require(`../redux`)

  const generateComponentChunkName = componentPath => {
    const program = store.getState().program
    let directory = `/`
    if (program && program.directory) {
      directory = program.directory
    }
    const name = path.relative(directory, componentPath)
-   return `component---${kebabCase(name)}`
+   return process.env.NODE_ENV === `production`
+     ? `component---${uuidv5(name, uuidv5.URL)}`
+     : `component---${kebabCase(name)}`
  }

  exports.generateComponentChunkName = generateComponentChunkName
This successfully hides all the component names in production build:
app: Array [ "/app-e593b3d93932ed3a0363.js" ]
"component---11d478fe-6a55-579c-becf-625ab1e57cf4": Array [ "/component---11d478fe-6a55-579c-becf-625ab1e57cf4-76c90ae50035c52657a0.js" ]
"component---15c76861-b723-5e0a-823c-b6832aeeb0a0": Array [ "/component---15c76861-b723-5e0a-823c-b6832aeeb0a0-18eb457ba6c147e1b31b.js" ]
...
None of the local unit tests failed, and my clicking-around-until-something-breaks test also hasn't yielded any errors. I might submit a PR later today to see if the maintainers have some insights on why this is not a good idea.
Edit: I opened an issue instead: github, you can subscribe to the issue to see how it resolves.
I am working on a WordPress plugin and have all the files in my working directory and run gulp in that project folder. Now, I'd like to have a watch task that copies all the changes to my local WP installation for testing.
Therefore I am looking for a way to sync (only in one direction) the project folder with the plugin folder of WP.
I managed to get it to work with gulp-directory-sync
...
var dirSync = require("gulp-directory-sync");
var localDir = "../newDir/";
var buildDir = "./buildDir/";
...
function copy_to_local_folder() {
  return pipeline(
    gulp.src(buildDir + '**/*'),
    dirSync(buildDir, localDir, { printSummary: true })
  );
}

function watch_local() {
  gulp.watch(buildDir + '**/*', copy_to_local_folder);
}

exports.default = watch_local;
However, the plugin hasn't been updated in 4 years and according to this answer, it is not doing it the proper "gulp way" (e.g. not using gulp-src) and this task should be possible with other basic gulp functions.
Copying changed files is pretty easy, but keeping track of deleted files is more complicated. I would also prefer to only update changed/deleted/new files and not clear the folder every time before copying all the files.
Starting with the updated code in the aforementioned answer, I tried to implement it and made changes to make it work.
...
var newer = require("gulp-newer");
var pipeline = require("readable-stream").pipeline;
var del = require("del");
var localDir = "../newDir/";
var buildDir = "./buildDir/";
function copy_to_local_folder() {
return pipeline(
gulp.src([buildDir+'**/*']),
newer(localDir),
gulp.dest(localDir),
);
}
function watch_local() {
  var watcher = gulp.watch(buildDir + '**/*', copy_to_local_folder);
  watcher.on('unlink', function(path) {
    console.log(path);
    var newPath = './' + path;
    newPath = newPath.replace(buildDir, localDir);
    console.log(newPath);
    (async () => {
      const deletedPaths = await del(newPath, { dryRun: true, force: true });
      console.log('Deleted files and directories:\n', deletedPaths.join('\n'));
    })();
  });
}

exports.default = watch_local;
With this code, the folder gets updated when I change or delete files, but it does not trigger when I delete an entire folder, which is probably because I use unlink and not unlinkDir. But even if I use the version of the handler below, it doesn't get triggered by deleting a folder (with files in it).
watcher.on('unlinkDir', function(path) {
  console.log('folder deleted');
  console.log(path);
  var newPath = './' + path;
  newPath = newPath.replace(buildDir, localDir);
  console.log(newPath);
});
What am I doing wrong?
Or is there in general a better way to achieve this?
PS: I'm using
node v11.15.0
gulp v4.0.2
on Linux
deleting files and folders in VS Code
Update:
When I run it with:
watcher.on('unlink', ... and delete a file:
  it works: I get the console.log output, the (async () => ... block runs,
  and Starting and Finished are printed for copy_to_local_folder.
watcher.on('unlinkDir', ... and delete a folder:
  it does not work: nothing happens in the console output (not even Starting).
watcher.on('unlinkDir', ... and delete a file:
  Starting and Finished are printed for copy_to_local_folder,
  but not the console.log or the (async () => ... block.
watcher.on('add', ... and watcher.on('addDir', ...:
  both work.
It seems to me that watcher.on('unlinkDir', ... never gets triggered ... is unlinkDir not supported by gulp-watch?
I have several typescript files, some of them export a const named APIS.
I'm trying to access those exports (I want to concatenate all of them into a single file), but it doesn't seem to work. I'm obviously doing something wrong, but I'm not sure what.
For example, I have a folder named services, with 2 files: service1.ts, service2.ts.
service1.ts:
...
export const APIS = [ { "field1" : "blabla" } ];
service2.ts: does not contain the APIS var.
This is my gulpfile.js:
var gulp = require('gulp');
var concat = require('gulp-concat');
var map = require('gulp-map');

gulp.task('default', function() {
  return gulp.src('.../services/*.ts')
    .pipe(map(function(file) {
      return file.APIS;
    }))
    .pipe(concat('all.js'))
    .pipe(gulp.dest('./test/'));
});
When I run this task, I get nothing. When I added console.log(file.APIS); to the map function, I get undefined for all the values (although it is defined in service1.ts!).
This is a follow-up to: Extracting typescript exports to json file using gulp
EDIT: OK, so I tried saving the exports in a .js file instead of a .ts file, and now I can access those vars using require:
gulp.task('default', function() {
  return gulp.src('./**/*.service.export.js')
    .pipe(map(function(file) {
      var fileObj = require(file.path);
      ...
    }))
Now if I try console.log(fileObj.APIS); I get the correct values. What I'm still confused about is how I can pass these value on, and create a single file out of all these vars. Is it possible to push them into an array?
This will not work the way you think it would. Gulp itself knows nothing about TypeScript files; the file you get is a vinyl file and has no knowledge of the TypeScript code within its contents.
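To make that concrete, here is a small debug task (the task name is made up, reusing the same gulp-map plugin) that prints what the vinyl file actually carries:

var gulp = require('gulp');
var map = require('gulp-map');

// Hypothetical debug task: a vinyl file only exposes metadata and raw contents,
// not the evaluated exports of the module.
gulp.task('inspect', function () {
  return gulp.src('./services/*.ts')
    .pipe(map(function (file) {
      console.log(file.path);                             // absolute path on disk
      console.log(file.contents.toString().slice(0, 80)); // raw source text
      console.log(file.APIS);                             // undefined: exports are never evaluated here
      return file;
    }));
});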
Edit
Based on your example, you can do something like this:
var gulp = require('gulp');
var concat = require('gulp-concat');
var map = require('gulp-map');
var fs = require('fs');

gulp.task('test', function () {
  var allConstants = [];

  var stream = gulp.src('./**/*.service.export.js')
    .pipe(map(function (file) {
      // require() evaluates the module, so its exports become available
      var obj = require(file.path);
      if (obj.APIS != null)
        allConstants = allConstants.concat(obj.APIS);
      return file;
    }));

  stream.on("end", function () {
    // Do your own formatting here
    var content = allConstants.map(function (constants) {
      return Object.keys(constants).reduce(function (aggregatedString, key) {
        return aggregatedString + key + " : " + constants[key];
      }, "");
    }).join(", ");

    fs.writeFile('filename.txt', content, function (err) {
      if (err) throw err;
    });
  });

  return stream;
});
Suggestion
If you want to collect multiple variables into a single file, i.e. a common variables file, I suggest gulp-replace.
Steps
Create a file, require it and use tags within that file to place your variables.
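A rough sketch of that idea (the template file name and the @CONSTANTS tag are made up for illustration):

var gulp = require('gulp');
var replace = require('gulp-replace');

// constants.template.js (hypothetical) contains a line such as:
//   var constants = /* @CONSTANTS */;
// The task below swaps the tag for the collected values.
gulp.task('constants', function () {
  var constants = { const_1: 0, const_2: 1, const_3: 2 };
  return gulp.src('./constants.template.js')
    .pipe(replace('/* @CONSTANTS */', JSON.stringify(constants, null, 2)))
    .pipe(gulp.dest('./build/'));
});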
Advice
If you are already using services, don't create an array. Instead, create an object (JSON) where every property is a constant, i.e.:
var constants = {
  const_1: 0,
  const_2: 1,
  const_3: 2,
}
My goal is to fake out getting some requirejs code working via babel. I've found that if I add the following to the top of every file while running in Node.js, things seem to work out: if (typeof define !== "function") { var define = require("amdefine")(module); }
Here is some code I wrote, which I thought would work or nearly work:
function injectDefine(babel) {
  var header = 'if (typeof define !== "function") { var define = require("amdefine")(module); }';
  return new babel.Plugin('amdefine', {
    visitor: {
      Program: {
        enter: function(path, file) {
          path.unshiftContainer(
            'body',
            babel.types.expressionStatement(
              babel.types.stringLiteral(header)
            )
          );
        },
      },
    },
  });
}

require('babel-core/register')({
  stage: 0,
  plugins: [{transformer: injectDefine}],
});

require('../components/button');
The components/button file is just me trying to test that some file can load.
Other notes: I'm using babel 5, and I can't upgrade right now. I also can't use a .babelrc very easily right now.
Tip 1: the environment variable BABEL_DISABLE_CACHE=1 is needed if you are doing heavy testing of plugins. If you have a script that you run like npm run unit, you may instead want to run BABEL_DISABLE_CACHE=1 npm run unit while testing your plugin.
Tip 2: babel.parse will give you a full program out of some source. The easiest thing you could do is babel.parse(header).program.body[0].
The following ended up working:
function injectDefine(babel) {
  var header = 'if (typeof define !== "function") { var define = require("amdefine")(module); }';
  return new babel.Plugin('amdefine', {
    visitor: {
      Program: {
        enter: function(node, parent) {
          node.body.unshift(
            babel.parse(header).program.body[0]
          );
        },
      },
    },
  });
}

require('babel-core/register')({
  cache: false,
  stage: 0,
  plugins: [injectDefine],
});
At this stage, a cleaner solution can be to use @babel/traverse and @babel/types.
Let's suppose you want to append a comment to the top of every file, you could use some code like the following:
// Import the required modules
import * as t from "@babel/types";
import traverse from "@babel/traverse";

// Get your ast (for this, you can use @babel/parser)

// Traverse your ast
traverse(ast, {
  // When the current node is the Program node (so the main node)
  Program(path) {
    // Insert at the beginning a string "Hello World" --> not valid JS code
    path.unshiftContainer('body', t.stringLiteral("Hello World"));
  }
});
I have several typescript files, some of them export a certain var - named APIS - which is an array of objects.
I want to extract the values of all of these exports, and pipe them to a json file using gulp.
For example, I have a folder named services, with 3 files: service1.ts, service2.ts, service3.ts.
service1.ts:
...
export const APIS = [ { "field1" : "blabla" } ];
service2.ts:
...
export const APIS = [ { "field2" : "yadayada" }, { "field3" : "yadabla" } ];
service3.ts: does not export the APIS var.
I want to use gulp in order to create a JSON file that looks something like this:
[ { "field1" : "blabla" }, { "field2" : "yadayada" }, { "field3" : "yadabla" } ]
gulpfile.js - the ??? is a placeholder for the missing code.
gulp.task('default', function () {
  return gulp.src('.../services/*.ts')
    .pipe(???)
    .pipe(concat('export.json'))
    .pipe(gulp.dest('./test'));
});
I'm new to both typescript & gulp, so I'm not sure how to achieve this... any ideas? :)
EDIT: So, I understand that there's no OOTB solution for this, and I need to write my own task / plugin. I'm not really sure how to achieve that, though.
Ideally, what I want is to find a gulp plugin (or a combination of plugins) that can handle ts / js files as objects with properties. Then I can extract the var I need from the file.
I couldn't really find anything like that, only string manipulation plugins. Treating my ts file as a string and searching it with a regex seems overly complicated to me. Is there something I'm missing here? Is there a more straightforward way to do this?
The TypeScript compiler API is relevant here, as it is what you need to parse and understand the ts code properly. Unfortunately, I don't think there is a gulp plugin that implements this API.
I think your best bet is to change strategy completely here and solve your problem in another way, or to use regex to try to extract the constants that you want, unless you want to write your own gulp plugin using the compiler API.
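For what it's worth, a rough sketch of reading such an export with the compiler API could look like the following (the file path is an assumption, and it only handles the simple export const APIS = [...] shape from the question):

var fs = require('fs');
var ts = require('typescript');

// Parse a .ts file and return the raw source text of `export const APIS = ...`.
function extractApis(filePath) {
  var source = ts.createSourceFile(
    filePath,
    fs.readFileSync(filePath, 'utf8'),
    ts.ScriptTarget.Latest,
    /* setParentNodes */ true
  );
  var result = null;
  ts.forEachChild(source, function (node) {
    var isExported = ts.isVariableStatement(node) &&
      node.modifiers &&
      node.modifiers.some(function (m) { return m.kind === ts.SyntaxKind.ExportKeyword; });
    if (isExported) {
      node.declarationList.declarations.forEach(function (decl) {
        if (decl.name.getText(source) === 'APIS' && decl.initializer) {
          result = decl.initializer.getText(source); // e.g. '[ { "field1" : "blabla" } ]'
        }
      });
    }
  });
  return result;
}

console.log(extractApis('./services/service1.ts'));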
This is what I ended up doing, and it worked for me. I'm posting it here in case anyone else finds it useful. :)
Instead of .ts, I saved the exports in .js files, i.e.:
service2.export.js:
exports.APIS = [ { "field2" : "yadayada" }, { "field3" : "yadabla" } ];
Based on the answer given here: https://stackoverflow.com/a/36869651/3007732 I created a gulp task as follows:
var gulp = require('gulp');
var concat = require('gulp-concat');
var map = require('gulp-map');
var fs = require('fs');

var allServices;

gulp.task('default', function() {
  var allServices = [];

  var stream = gulp.src('./**/*.export.js')
    .pipe(map(function(file) {
      var obj = require(file.path);
      if (obj.APIS != null) {
        allServices.push.apply(allServices, obj.APIS);
      }
      return file;
    }));

  stream.on("end", function (cb) {
    fs.writeFile('./export.json', JSON.stringify(allServices), cb);
  });

  return stream;
});
and now I get the following output in export.json:
[ { "field1" : "blabla" }, { "field2" : "yadayada" }, { "field3" : "yadabla" } ]
which is exactly what I wanted.