I currently have a findFile function I use within a forEach loop to iterate through an array of filenames and build a new array of the file-system paths for each filename in the original array:
const fs = require('fs');
const path = require('path');

var dir = 'Some File System Path';

const findFile = function (dir, pattern) {
  var results = [];
  fs.readdirSync(dir).forEach(function (dirInner) {
    dirInner = path.resolve(dir, dirInner);
    var stat = fs.statSync(dirInner);
    if (stat.isDirectory()) {
      results = results.concat(findFile(dirInner, pattern));
    }
    if (stat.isFile() && dirInner.endsWith(pattern)) {
      results.push(dirInner);
    }
  });
  return results;
};

var newFileArr = ['1.txt', 'file2.txt', 'file3.txt', 'file4.txt'];

function fpArr(newFileArr) {
  const fpArray = [];
  newFileArr.forEach((file) => {
    fpArray.push(findFile(dir, file).toString());
    //console.log(fpArray)
  });
  return fpArray;
}
Output:
[
  '/some/file/path/file1.txt',
  '/some/file/path/1.txt',
  '/some/file/path/file2.txt',
  '/some/file/path/file3.txt',
  '/some/file/path/file4.txt'
]
The issue I am facing is that my findFile function looks for a pattern match, and it picks up the path for "1.txt", presumably while it is searching for "file1.txt". I know this is an edge case, but it is an important one, because other filenames may end with the same letters or numbers, and I do not want those picked up in the file-path array output. I have tried several ideas, such as .match() and .startsWith(), but as far as I can tell those only work with a string literal or a regex, which causes the search itself to fail. Is there a way to get the findFile function to do an exact match against a variable holding some string value? Any help getting me pointed in the right direction is appreciated.
[
'K_01.00.0000.iar',
'HELLOWORLDKLA_01.00.0000.iar',
'HELLO_KLA_01.00.0000.iar',
'KLA_01.22.7895.iar',
'KLA_HELLO_WORLD_2_01.00.0000.iar',
'KLA_02_WORLD_01.00.0000.iar'
]
[]
Above are the actual two arrays I am working with. The first array is just a simple array of filenames. The bottom array has been run through both the sync and your async/await solutions. For some reason, even with the RegExp added, it still picks up two files which are not listed in the filename array. I added the file paths to show that none of the files are in the root directory, and I spread the files into subdirectories to ensure the recursive search is working. I will keep messing with it to see if I can figure out why the RegExp solution is bringing these files into the array when they shouldn't be.
I'm afraid I don't have enough reputation to comment, so am posting an answer!
Your code seems to work. If I understand rightly, you want to find all files from a list in a directory and its subdirectories, giving you the paths to those files. Is that correct?
When I ran your code, it does indeed pick up the four files in newFileArr and excludes 1.txt; I've created a codesandbox demo to show you, with dummy files.
I've also taken the liberty to create a codesandbox that demonstrates how to do this in a single function using RegExp and map to create dynamic regex expressions if that is useful.
Related
I'm trying to get all files from a directory, filter them, and display the result as an array.
I've tried a bunch of stuff, but none of it seems to work as I need it to, so I'll ask here; below is my code.
const fs = require('fs');
let files = fs.readdirSync('./commands/fonts');
files.forEach(el => {
  el.includes('.json')
    ? files[files.indexOf(el)].replace('.json', '')
    : files.splice(files[files.indexOf(el)], 1);
});
console.log(files);
I tried promises but I am inexperienced with them so I'm asking here. If you could help it would be great!
Use the filter, map, and slice functions:
const files = ["fileone.json", "filetwo.json", "filethree", "filefour.md"];
const filteredFiles = files.filter(fileName => fileName.endsWith(".json"))
  .map(fileName => fileName.slice(0, -5));
console.log(filteredFiles); // [ 'fileone', 'filetwo' ]
The filter() method creates a new array with all elements that pass the test implemented by the provided function.
The map() method creates a new array populated with the results of calling a provided function on every element in the calling array.
The slice() method returns a section of an array or string as a new object (here we've selected the characters from position 0 up to, but not including, the final 5, dropping the ".json" suffix).
Edit: answer now using endsWith instead of includes, as suggested in comments
Use the JavaScript filter function:
var myfiles = ["file1.json", "file2.txt", "file3.jpeg"];
myfiles = myfiles.filter(o => o.indexOf(".json") !== -1);
console.log(myfiles);
I may or may not get in 2 differently formatted bits of data.
They both need to be stripped of characters in different ways. Please excuse the variable names, I will make them better once I have this working.
const cut = flatten.map(obj => {
return obj.file.replace("0:/", "");
});
const removeDots = flatten.map(obj => {
return obj.file.replace("../../uploads/", "");
})
I then need to push the arrays into my mongo database.
let data;
for (const loop of cut) {
data = { name: loop };
product.images.push(data);
}
let moreData;
for (const looptwo of removeDots) {
moreData = {name: looptwo};
product.images.push(moreData);
}
I wanted to know if there is a way to either join them or do an if/else, because the result of this is that if I have 2 records, it ends up duplicating them and I get 4 records instead of 2. Also, 2 of the records are incorrectly formatted, i.e. the "0:/" is still present instead of being stripped away.
Ideally I would like a check that if "0:/" is present, remove it; if "../../uploads/" is present, or if both are present, remove both. And then create an array from that to push.
You can do both replace calls in the same map:
const processed = flatten.map(obj => {
return obj.file.replace("0:/", "").replace("../../uploads/", "");
});
Since you know the possible patterns, you can create a regex and use it to replace any occurrences.
const regex = /(0:\/|(\.\.\/)+uploads\/)/g
const processed = flatten.map(obj => obj.file.replace(regex, ''));
Note: regex is a pattern-based approach, so it has pros and cons.
Pros:
You can have any number of nested folders; using the literal string ../../uploads/ restricts you to a two-folder structure only.
You can achieve the transformation in one operation, and the code looks clean.
Cons:
Regex can be hard to understand and can reduce the readability of the code a bit. (Opinionated)
If you have a path like .../../uploads/bla, it will be parsed to .bla.
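If that last con matters, anchoring the regex with ^ is one mitigation (a sketch, assuming the prefixes only ever occur at the very start of the path):

```javascript
// With ^, the prefix is stripped only at the start of the string,
// so '.../../uploads/bla' is left untouched instead of becoming '.bla'.
const regex = /^(0:\/|(\.\.\/)+uploads\/)/;
const files = ['0:/a.png', '../../uploads/b.png', '.../../uploads/bla'];
const processed = files.map((f) => f.replace(regex, ''));
console.log(processed); // [ 'a.png', 'b.png', '.../../uploads/bla' ]
```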
Since you also asked about a possible way of joining the two arrays, I'll give you a couple of solutions (with and without joining).
You can either chain .replace on the elements of the array, or you can concat the two arrays from your solution. So, either:
const filtered = flatten.map(obj => {
return obj.file.replace('0:/', '').replace('../../uploads/', '');
});
Or (joining the arrays):
// your two .map calls go here
const joinedArray = cut.concat(removeDots);
I have a gulp.src on a task but I want to match all the files that contain a certain word as a prefix e.g.:
gulp.src('./mydir/project.build-blessed1.css')
But there can be N files with different numbers: blessed1, blessed2, blessed3.
How can I make it match any file that starts with that prefix and has the .css extension, on the same line, without using a function?
If you're not using an old implementation of gulp, you can use '*' as a wildcard, even in the middle of a path, so something like:
gulp.src('./mydir/project.build-blessed*.css')
should work
You can see the docs here: https://github.com/gulpjs/gulp/blob/master/docs/API.md
As Splatten suggested, you could use the string.lastIndexOf method. In your case:
var string = 'i want to do something!'
if (string.lastIndexOf('something') !== -1) {
//keyword was found, meaning it's in the string
} else {
//keyword was not found
}
lastIndexOf returns the index at which it found the keyword (or -1 if it was not found).
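As an aside (not part of the original suggestion), on newer runtimes String.prototype.includes expresses the same check without the !== -1 comparison:

```javascript
const string = 'i want to do something!';
// includes() returns a boolean directly.
if (string.includes('something')) {
  console.log('keyword was found');
} else {
  console.log('keyword was not found');
}
```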
I'm using Node.js to read and parse a file of pairs encoding numbers. I have a file like this:
1561 0506
1204 900
6060 44
And I want to read it as an array, like this:
[[1561,0506],[1204,900],[6060,44]]
For that, I am using a readStream, reading the file as chunks and using native string functions to do the parsing:
var file = "";
var edges = [];

fileStream.on("data", function (chunk) {
  var newLineIndex;
  file = file + chunk;
  while ((newLineIndex = file.indexOf("\n")) !== -1) {
    var spaceIndex = file.indexOf(" ");
    edges.push([
      Number(file.slice(0, spaceIndex)),
      Number(file.slice(spaceIndex + 1, newLineIndex))
    ]);
    file = file.slice(newLineIndex + 1);
  }
});
That took way too much time, though (4 s for the file I need, on my machine). I see some possible reasons:
use of strings;
use of Number;
the dynamic array of arrays.
I've rewritten the algorithm without the built-in string functions, using loops instead, and, to my surprise, it became much slower! Is there any way to make it faster?
Caveat: I have not tested the performance of this solution, but it's complete, so it should be easy to try.
How about using this liner implementation, based on the notes in this question?
Using the liner:
var fs = require('fs');
var liner = require('./liner');

var source = fs.createReadStream('mypathhere');
source.pipe(liner);
liner.on('readable', function () {
  var line;
  while ((line = liner.read())) {
    var parts = line.split(" ");
    edges.push([Number(parts[0]), Number(parts[1])]);
  }
});
As you can see, I also moved the edges array to be an inline constant-sized array separate from the split parts, which I'm guessing would speed up allocation. You could even try swapping split(" ") for indexOf(" ").
Beyond this you could instrument the code to identify any further bottlenecks.
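The indexOf swap hinted at above might look like this for a single line (a sketch; parseLine is a made-up helper and I haven't benchmarked it):

```javascript
// Parse one "a b" line into [a, b] with indexOf/slice instead of split,
// avoiding the intermediate array that split(" ") allocates.
function parseLine(line) {
  const spaceIndex = line.indexOf(' ');
  return [
    Number(line.slice(0, spaceIndex)),
    Number(line.slice(spaceIndex + 1))
  ];
}

console.log(parseLine('1204 900')); // [ 1204, 900 ]
```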
I have a directed graph whose paths are stored in a JSON array, in the form of source and destination pairs.
var pathDirection = [
  { "Source": 2, "Destination": 3 },
  { "Source": 3, "Destination": 4 },
  { "Source": 5, "Destination": 4 },
  { "Source": 2, "Destination": 5 },
  { "Source": 4, "Destination": 6 }
];
The directions above form a directed graph. My problem is that I don't know the starting point, and I have to find all possible paths to reach 6 from any node. For the above graph, the different paths to reach 6 are:
Output:
[4 ->6]
[3->4 ->6]
[5->4 ->6]
[2->5->4 ->6]
[2->3->4 ->6]
I have tried to write the algorithm below using backtracking, which is working fine, but I am looking for a better algorithm. Please suggest any other possible way to do the same, and how I can optimize the program below.
// Backtrack from the end node, Destination 6
var getAllSource = function (destId) {
  var sourceForsameDist = [];
  pathDirection.forEach(function (eachDirection) {
    if (eachDirection.Destination == destId) {
      sourceForsameDist.push(eachDirection.Source);
    }
  });
  return sourceForsameDist;
};

var diffPath = [];
var init = function (destination) {
  var sourceId = getAllSource(destination[destination.length - 1]);
  if (sourceId.length === 0) {
    diffPath.push(destination);
  }
  for (var i = 0; i < sourceId.length; i++) {
    var copy = destination.slice(0);
    copy.push(sourceId[i]);
    init(copy);
  }
};

init([6]);
console.log(diffPath); // [[6,4,3,2],[6,4,5,2]]
I have tried to do this using backtracking, which is working fine, but I am looking for a better algorithm.
I'd call it depth-first search rather than backtracking, but yes, the algorithm is fine.
However, I'd have some suggestions on the implementation:
make diffPath a local variable and return it from the init function
if you omit the if (sourceId.length === 0) condition, you will get the expected output, not only the paths starting from source nodes
instead of looping through the whole pathDirection array in your getAllSource function, use a lookup table filled before starting the traversal
rename init to something more meaningful
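Put together, those suggestions might look roughly like this (a sketch; buildReverseAdjacency and findPathsTo are made-up names):

```javascript
var pathDirection = [
  { Source: 2, Destination: 3 },
  { Source: 3, Destination: 4 },
  { Source: 5, Destination: 4 },
  { Source: 2, Destination: 5 },
  { Source: 4, Destination: 6 }
];

// Build a Destination -> [Sources] lookup table once, so each step of
// the traversal is a map lookup instead of a scan of the whole array.
function buildReverseAdjacency(edges) {
  const incoming = new Map();
  for (const { Source, Destination } of edges) {
    if (!incoming.has(Destination)) incoming.set(Destination, []);
    incoming.get(Destination).push(Source);
  }
  return incoming;
}

// DFS backwards from `target`, recording every path of length > 1
// (per the suggestion to drop the sources-only condition), and
// returning the paths instead of mutating a global.
function findPathsTo(edges, target) {
  const incoming = buildReverseAdjacency(edges);
  const paths = [];
  (function walk(path) {
    if (path.length > 1) paths.push(path.slice().reverse());
    for (const src of incoming.get(path[path.length - 1]) || []) {
      walk(path.concat(src));
    }
  })([target]);
  return paths;
}

console.log(findPathsTo(pathDirection, 6));
```

This yields the same five paths as the expected output above, just discovered in DFS order.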