RegExp not matching expected pattern in Node.js - javascript

I am writing a script that will take in all my Lambda functions and create a Node server for local testing. I'm trying to strip out all of the dbconfig objects from each file. I use https://regexr.com/ to test my patterns, and I have tried a number of variations that all work there, but they will not work in my script. I am at a loss as to why. The objects all look like this:
const dbconfig = {
  server: process.env.SERVER,
  userName: process.env.USER_NAME,
  password: process.env.PASSWORD,
  options: {
    database: process.env.DATABASE,
    table: process.env.TABLE,
    encrypt: true,
    requestTimeout: 300000,
    rowCollectionOnRequestCompletion: true,
  }
}
I have tried (amongst others):
/(.+[\n\r]).+process.env(.+[\n\r])+/g
/const dbconfig(.+[\s\n\r])+/g
/(.+\s).+process.env(.+\s)+/g
Each one of these matches the whole object declaration as expected, but in Node they will replace:
- nothing (leaves the file as is)
- the first line only (const dbconfig = {)
- all of the lines that contain process.env, but leave the rest
I have no idea why I get these different results, or why it fails. Any suggestions welcome!
Edit:
Sorry, not enough detail included. By replace I mean I am doing a substitution for an empty string ''.
I am reading the files into an array inside a loop over directory names like this:
files.push(fs.readFileSync(`../lambda/${folder}/index.js`, {encoding: 'utf8'}));
I am pulling out the required libraries from each of these like this:
let imports = new Set();
let arr;
files.forEach((file, idx) => {
  while ((arr = replaceOptions.from[0].exec(file)) !== null) {
    imports.add(arr[0]);
  }
});
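As an aside on this loop (and a common reason a pattern behaves differently in an online tester than in Node): a RegExp created with the g flag is stateful, so reusing the same object across multiple files carries its lastIndex over between exec() calls. A minimal sketch with made-up input strings:

```javascript
// A global RegExp is stateful: exec() advances lastIndex, and that offset
// carries over when the same object is reused on the next string.
const re = /process\.env/g;

const fileA = "a: process.env.A";
const fileB = "b: process.env.B";

const first = re.exec(fileA); // finds "process.env"; lastIndex is now 14
const stale = re.exec(fileB); // null: the search started at offset 14
re.lastIndex = 0;             // reset before scanning a new string
const fresh = re.exec(fileB); // finds "process.env" again

console.log(first[0], stale, fresh[0]); // "process.env" null "process.env"
```

If the same global regex object is shared across files, resetting lastIndex (or creating a fresh RegExp per file) avoids silently missed matches.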
I then join the files into a single master file, strip out the requires and these config objects, and will append a single copy of each at the top:
fs.writeFileSync('joined.js', files.join('\n'));
try {
  const results = replace.sync(replaceOptions);
  console.log('Replacement results:', results);
}
catch (error) {
  console.error('Error occurred:', error);
}
fs.writeFileSync('server.js', Array.from(imports.values()).join('\n'));
fs.appendFileSync('server.js', fs.readFileSync('joined.js'));
I was formerly doing the string replace myself, but ended up using the npm package replace-in-file just for ease / in case I was stuffing up something there.
And yes, I do realise this code is currently a bit messy and inefficient; it's just the result of iterating towards a baseline version of the desired outcome.

I still don't know why I would get the difference between an online regex builder/tester and Node, but there you go. I do now have a working implementation:
/(.*\s*).+process\.env(.*\s*)*?\}\s*\}/gm
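As a sanity check, here is that pattern applied to a trimmed-down sample (the sample text is illustrative, not taken from the real Lambda files); note that the nested (.*\s*)*? group can backtrack heavily on large inputs:

```javascript
// Verify that the working pattern strips a dbconfig-style block.
const re = /(.*\s*).+process\.env(.*\s*)*?\}\s*\}/gm;

const sample = [
  "before",
  "const dbconfig = {",
  "  server: process.env.SERVER,",
  "  options: {",
  "    encrypt: true,",
  "  }",
  "}",
  "after",
].join("\n");

const stripped = sample.replace(re, "");
console.log(stripped); // the dbconfig block is removed, "before"/"after" remain
```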


cy.url not returning a string as expected

Prior to switching to a hash router, I had been using the cy.url command frequently to ensure that links were navigating to the right URL addresses throughout the application. Now that we are using hash routing, cy.url no longer yields a string; instead it is yielding a function. Any ideas how to work around this, or reasons why this is happening?
I was getting errors throughout the Cypress test runner like:
AssertionError: object tested must be an array, an object, or a string, but undefined given
so I logged the type with console.log(typeof(cy.url)) and got function printed to the console.
cy.get(dataCyButtonAttribute)
  .should('be.visible')
  .click()
console.log(typeof(cy.url))
cy.url().then(url => {
  const categoryId = url.split(`${linkType}/`)[1]
  const category = url.split('admin/')[1]
  expect(category).to.contain(linkType)
  expect(categoryId).to.equal('new')
})
}
This should yield a string:
let returnedUrl = null;
cy.url().then(url => {
  returnedUrl = url;
});
Cypress commands are asynchronous and must be followed by .then() in order to yield useful return values.
You can refer to this Github issue for more info:
https://github.com/cypress-io/cypress/issues/2150
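The same behavior can be reproduced without Cypress using a plain Promise: the outer variable is still unset at the moment the synchronous code reads it, because the callback only runs on a later tick.

```javascript
// Plain-Promise analogue of the Cypress pitfall: the .then() callback runs
// asynchronously, so a synchronous read of the outer variable sees null.
let returnedUrl = null;

Promise.resolve("https://example.com/#/admin/category/new").then(url => {
  returnedUrl = url; // runs on a later microtask tick
});

const readImmediately = returnedUrl; // still null at this point
console.log(readImmediately);
```

This is why the value has to be consumed inside .then() (or by chaining further commands off it) rather than copied out for later synchronous use.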
I have encountered the same issue; my solution is below.
cy.url().then(($base_url) => {
  let id = $base_url.substr($base_url.lastIndexOf('/'), $base_url.length)
  cy.log("The id is " + id)
})
It works for me.
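One small detail worth knowing about this approach: substr(lastIndexOf('/'), length) keeps the leading slash in the extracted id. A quick illustration with a made-up URL:

```javascript
// The lastIndexOf-based slice from the answer keeps the leading "/";
// adding 1 (or using substring) yields just the final path segment.
const base_url = "https://example.com/admin/category/42"; // sample URL shape

const withSlash = base_url.substr(base_url.lastIndexOf("/"), base_url.length);
const idOnly = base_url.substring(base_url.lastIndexOf("/") + 1);

console.log(withSlash); // "/42"
console.log(idOnly);    // "42"
```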

How to skip or ignore programmatically a suite in CodeceptJS

As the test suite grows, I need to be able to run something in BeforeSuite() which will connect to an external server and skip the suite if that external resource is unavailable.
Feature('External Server');
BeforeSuite((I) => {
  // Check if server is available and skip all scenarios if it is not
});
Scenario('Login to the server', (I) => {
  // Do not run this if the server is not available
})
I understand I could probably set a variable, but I think it would be nice if there was a way to tell the runner that a suite has been skipped.
The goal is to have a suite marked as skipped in the output eg:
Registration --
✓ Registration - pre-checks in 4479ms
✓ Registration - email validation in 15070ms
✓ Registration - password validation in 8194ms
External Server -- [SKIPPED]
- Login to the server [SKIPPED]
Maybe prepend x to every scenario in your feature, e.g. xScenario? I don't think CodeceptJS supports something like only for features; as far as I know it currently works with scenarios only.
You can use Scenario.skip in your step definition to skip a scenario.
Note: if any steps were executed before skipping, they will still show in the report.
https://codecept.io/basics/#todo-test
My answer is compiled from a number of comments on the CodeceptJS GitHub and Stack Overflow. However, I can't recall the exact links or comments which helped me derive this solution; it's been at least a year, maybe two, since I started, and I have slowly modified it since.
Edit: Found the github thread - https://github.com/codeceptjs/CodeceptJS/issues/661
Edit2: I wrote a post about "selective execution" (which avoids tagging unwanted tests with skip status) https://github.com/codeceptjs/CodeceptJS/issues/3544
I'll add a snippet at the bottom.
I'm on CodeceptJS 3.3.6
Define a hook file (eg: skip.js) and link it in your codecept.conf.js file.
exports.config = {
  ...
  plugins: {
    skipHook: {
      require: "../path/to/skip.js",
      enabled: true,
    }
  }
  ...
}
The basic code in skip.js is:
// skip.js
const { event, output } = require("codeceptjs");

module.exports = function () {
  event.dispatcher.on(event.test.before, function (test) {
    const skipThisTest = decideSkip(test.tags);
    if (skipThisTest) {
      test.run = function skip() {
        this.skip();
      };
      return;
    }
  });
};
I've defined a decision function decideSkip like so:
function decideSkip(testTags) {
  if (!Array.isArray(testTags)) {
    output.log(`Tags not an array, don't skip.`);
    return false;
  }
  if (testTags.includes("#skip")) {
    output.log(`Tags contain [#skip], should skip.`);
    return true;
  }
  if (
    process.env.APP_ENVIRONMENT !== "development" &&
    testTags.includes("#stageSkip")
  ) {
    output.log(`Tags contain [#stageSkip], should skip on staging.`);
    return true;
  }
  return false; // default: run the test
}
(Mine is a bit more complicated, including evaluating whether a series of test case ids are in a provided list but this shows the essence. Obviously feel free to tweak as desired, the point is a boolean value returned to the defined event listener for event.test.before.)
Then, using BDD:
#skip #otherTags
Scenario: Some scenario
  Given I do something first
  When I do another thing
  Then I see a result
Or standard test code:
const { I } = inject();

Feature("Some Feature");

Scenario("A scenario description #tag1 #skip #tag2", async () => {
  console.log("Execute some code here");
});
I don't know if that will necessarily give you the exact terminal output you want External Server -- [SKIPPED]; however, it does notate the test as skipped in the terminal and report it accordingly in allure or xunit results; at least insofar as CodeceptJS skip() reports it.
For "selective execution" (which is related to, but not the same as, "skip"), I've implemented a custom mocha.grep() call in my bootstrap. A key snippet follows; add it to a bootstrap function in codecept.conf.js or a similar location.
const selective = ["tag1", "tag2"];
const re = selective.join("|");
const regex = new RegExp(re);
mocha.grep(regex);
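The joined pattern is a plain alternation, so mocha.grep() keeps any test whose title matches at least one entry (the tag names here are examples, not from the original config):

```javascript
// The alternation built from the tag list matches any title containing
// at least one of the selected tags.
const selective = ["tag1", "tag2"];
const regex = new RegExp(selective.join("|"));

const matched = regex.test("Login scenario @tag1");      // selected
const notSelected = regex.test("Cleanup scenario @tag3"); // filtered out

console.log(matched, notSelected); // true false
// Note: the entries are neither escaped nor anchored, so "tag1" would also
// match a title containing "tag10"; word boundaries may be worth adding.
```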

Moving folders using gulp

I have a project with 3 folders (I'm using gulp) which I don't need to compile. So, I need a task which takes the 3 folders "src/fonts", "src/libs" and "src/docs" as gulp.src() and just moves them into the dest/ folder. I don't need to do anything with them, just move them after building.
My current attempt:
gulp.task('others', function () {
  return gulp.src(['src/libs/**'], ['src/fonts/**'], ['src/docs/**'])
    .pipe(gulp.dest('dist/'))
});
Using this code, the task moves only the inner files and folders (they need to stay wrapped in their parent folders), and it takes only "src/libs" as gulp.src().
Problem:
You're using gulp.src() wrong.
Explanation:
In the current state of your code, the following is happening:
['src/libs/**'] gets passed to globs property
['src/fonts/**'] gets passed to options property
['src/docs/**'] gets passed to NOTHING
The above explains why you're only seeing the files of src/libs being selected.
Thus:
gulp.src(['src/libs/**'], ['src/fonts/**'], ['src/docs/**'])
should be
gulp.src(['src/libs/**','src/fonts/**','src/docs/**'])
or (with a file wildcard *.*)
// something like this (it's dirty as I'm writing this off the cuff)
// but you get the idea
var sources = [
  'src/libs/**',
  'src/libs/**/*.*',
  ...
];
gulp.src(sources)
Some More Information:
The docs specify the usage as such:
gulp.src(globs[, options])
where globs can only be a:
string => 'im/a/single/directory/or/file/s'
(a single) array object => ['dir1','dir2','dir3','etc']
and options, ... well, an options object (with defined values that you can supply)
options => { buffer: true, read: false, ... }
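To see where each argument of the original call lands, here is a toy function mirroring the gulp.src(globs[, options]) signature (the function is a stand-in for illustration, not gulp itself):

```javascript
// Toy function mirroring the gulp.src(globs[, options]) signature, to show
// where each argument of the original three-array call actually lands.
function src(globs, options) {
  return { globs, options };
}

const call = src(['src/libs/**'], ['src/fonts/**'], ['src/docs/**']);

console.log(call.globs);   // only the first array is treated as globs
console.log(call.options); // the second array lands in the options slot
// the third array is silently discarded
```

Separately, if the folders themselves need to stay wrapped in the output, gulp.src accepts a base option (e.g. gulp.src(sources, { base: 'src' })), which preserves each file's path relative to that base under the destination.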

What about this combination of gulp-concat and lazypipe is causing an error using gulp 4?

I'm upgrading from Gulp 3 to 4, and I'm running into an error:
The following tasks did not complete: build
Did you forget to signal async completion?
I understand what it's saying, but can't understand why this code is triggering it.
Error or not, the task completes (the files are concatenated and written to dest). Executing the same code without lazypipe results in no error, and removing the concatenation within lazypipe also fixes the error.
Wrapping the whole thing in something that creates a stream (like merge-stream) fixes the issue. I guess something about the interaction between gulp-concat and lazypipe is preventing a stream from being correctly returned.
Here's the (simplified) task:
gulp.task('build', function() {
  var dest = 'build';

  var buildFiles = lazypipe()
    .pipe(plugins.concat, 'cat.js') // Task will complete if I remove this
    .pipe(gulp.dest, dest);

  // This works
  // return gulp.src(src('js/**/*.js'))
  //   .pipe(plugins.concat('cat.js'))
  //   .pipe(gulp.dest(dest));

  // This doesn't (unless you wrap it in a stream-making function)
  return gulp.src(src('js/**/*.js'))
    .pipe(buildFiles());
});
Any advice appreciated!
This is a known issue when using lazypipe with gulp 4 and it's not going to be fixed in the near future. Quote from that issue:
OverZealous commented on 20 Dec 2015
As of now, I have no intention of making lazypipe work on Gulp 4.
As far as I can tell this issue is caused by the fact that gulp 4 uses async-done which has this to say about its stream support:
Note: Only actual streams are supported, not faux-streams; Therefore, modules like event-stream are not supported.
When you use lazypipe() as the last pipe, what you get is a stream that doesn't have a lot of the properties that you usually have when working with streams in gulp. You can see this for yourself by logging the streams:
// console output shows lots of properties
console.log(gulp.src(src('js/**/*.js'))
  .pipe(plugins.concat('cat.js'))
  .pipe(gulp.dest(dest)));

// console output shows far fewer properties
console.log(gulp.src(src('js/**/*.js'))
  .pipe(buildFiles()));
This is probably the reason why gulp considers the second stream to be a "faux-stream" and doesn't properly detect when the stream has finished.
Your only option at this point is some kind of workaround. The easiest workaround (which doesn't require any additional packages) is to just add a callback function cb to your task and listen for the 'end' event:
gulp.task('build', function(cb) {
  var dest = 'build';

  var buildFiles = lazypipe()
    .pipe(plugins.concat, 'cat.js')
    .pipe(gulp.dest, dest);

  gulp.src(src('js/**/*.js'))
    .pipe(buildFiles())
    .on('end', cb);
});
Alternatively, adding any .pipe() after buildFiles() should fix this, even one that doesn't actually do anything like gutil.noop():
var gutil = require('gulp-util');

gulp.task('build', function() {
  var dest = 'build';

  var buildFiles = lazypipe()
    .pipe(plugins.concat, 'cat.js')
    .pipe(gulp.dest, dest);

  return gulp.src(src('js/**/*.js'))
    .pipe(buildFiles())
    .pipe(gutil.noop());
});
So the error is clear. I had to do some refactoring to make things work again for gulp 4. I ended up making some extra methods that take a source and destination and perform the tasks previously done by my lazypipe implementation.
I have to say I don't miss lazypipe now. It's just a different approach. I did end up with some extra tasks but they use a standard method like in the example below:
// previously a lazypipe, now just a method to return from a gulp4 task
// previously a lazypipe, now just a method to return from a gulp4 task
const _processJS = (sources, destination) => {
  return src(sources)
    .pipe(minify(...))
    .pipe(uglify(...))
    .pipe(obfuscate(...))
    .pipe(whatever())
    .pipe(dest(destination));
};

const jsTaskXStep1 = () => {
  return src(...).pipe(...).pipe(...).pipe(dest(...));
};
const jsTaskXStep2 = () => {
  return _processJS(['./src/js/x/**/*.js'], './dist/js');
};
const jsTaskYStep1 = () => {
  return src(...).pipe(...).pipe(...).pipe(dest(...));
};
const jsTaskYStep2 = () => {
  return _processJS(['./src/js/y/**/*.js'], './dist/js');
};

const jsTaskX = series(jsTaskXStep1, jsTaskXStep2);
const jsTaskY = series(jsTaskYStep1, jsTaskYStep2);

module.exports = {
  js: parallel(jsTaskX, jsTaskY),
  css: ...,
  widgets: ...,
  ...
  default: parallel(js, css, widgets, series(...), ...)
};
So basically you can put your lazypipe stuff in methods like _processJS in this example, then create tasks that use it and combine everything with gulp series and parallel. Hope this helps out some of you who are struggling with this.

glob in Node.js and return only the match (no leading path)

I need to glob ./../path/to/files/**/*.txt but instead of receiving matches like this:
./../path/to/files/subdir/file.txt
I need the root stripped off:
subdir/file.txt
Presently, I have:
oldwd = process.cwd()
process.chdir(__dirname + "/../path/to/files")
glob.glob("**/*.txt", function (err, matches) {
  process.chdir(oldwd)
});
But it's a little ugly and also seems to have a race condition: sometimes the glob occurs on oldwd. So that has to go.
I am considering simply mapping over matches and stripping the leading path with string operations. Since glob returns matches with the dotdirs resolved, I would have to do the same to my search-and-replace string, I suppose. That's getting just messy enough that I wonder if there is a better (built-in or library?) way to handle this.
So, what is a nice, neat and correct way to glob in Node.js and just get the "matched" portion? JavaScript and CoffeeScript are both ok with me
Pass the directory in via the options and let glob take care of all that mess.
glob.glob("**/*.txt", { cwd: '../../wherever/' }, function (err, matches) {
  ...
});
Try this:
var path = require('path');
var root = '/path/to/files';
glob.glob("**/*.txt", function (err, matches) {
  if (err) throw err;
  matches = matches.map(function (match) {
    return path.relative(root, match);
  });
  // use matches
});
