Bash Loop on JSON Array - javascript

Context:
My goal is to pull the latest tag from a series of git repos and merge it into an existing JSON file (the satis.json for a private Composer repository).
Example Data
{
  "name": "Example",
  "homepage": "http://satis.example.ca",
  "repositories": [
    { "type": "vcs", "url": "git#repo.example.com:reponame1.git" },
    { "type": "vcs", "url": "git#repo.example.com:reponame2.git" }
  ]
}
These individual commands work:
Retrieve Repo Path from URL in JSON using jq
cat satis.json | jq '.repositories[] | .url'
Retrieve Repo Path from URL in JSON using underscore_cli
underscore -i satis.json select '.repositories .url'
Example return
["git#repo.example.com:reponame1.git","git#repo.example.com:reponame2.git"]
Get Latest Tag From Individual Repo Path
git ls-remote --tags git#repo.example.com:reponame2.git | head -1 | awk '{split($0,array,"tags/")} END{print array[2]}'
Example return: 2.0.7
I'm struggling to combine them into a loop that runs once per repo and returns its tag. The while-loop examples I've found don't handle the JSON array format, which I haven't yet figured out how to convert back into something bash can iterate over.
Alternate / Related questions:
I couldn't find any way to run a git command to pull the version from JS; otherwise I would have done everything in JavaScript. Got any specific examples that don't involve adding a full JS implementation of git to the project?
If the loop could write the version back into the original data, that would be most excellent. The hope is to schedule this script so that it pulls the latest satis file, checks all the repos for their latest versions, and writes the output to a new path.
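A minimal sketch of the combined loop, assuming jq is installed. The tag lookup is stubbed out here so the sketch runs without network access; swap the stub body for the ls-remote pipeline from the question:

```shell
#!/usr/bin/env bash
# Recreate the example data from the question.
cat > satis.json <<'EOF'
{
  "name": "Example",
  "homepage": "http://satis.example.ca",
  "repositories": [
    { "type": "vcs", "url": "git#repo.example.com:reponame1.git" },
    { "type": "vcs", "url": "git#repo.example.com:reponame2.git" }
  ]
}
EOF

get_latest_tag() {
  # Real version (needs network access), per the question:
  #   git ls-remote --tags "$1" | head -1 | awk '{split($0,a,"tags/")} END{print a[2]}'
  echo "2.0.7"   # stub for demonstration
}

# jq -r emits raw strings, one URL per line, which a while-read loop consumes.
jq -r '.repositories[].url' satis.json | while read -r url; do
  tag=$(get_latest_tag "$url")
  # Merge the version back into the matching repository entry.
  jq --arg url "$url" --arg tag "$tag" \
     '(.repositories[] | select(.url == $url)) += {"version": $tag}' \
     satis.json > satis.json.tmp && mv satis.json.tmp satis.json
done

jq . satis.json
```

Writing to a temp file and moving it over the original after each iteration keeps satis.json valid at every step, which matters if the script is scheduled and might be interrupted.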

Related

NPM Scripts: Pass one script command json output to another script command as argument value

I am trying to inject a JSON as an argument value within an npm script (context: https://medium.com/typeforms-engineering-blog/codeceptjs-puppeteer-setting-up-multiple-configurations-32d95e65adf2), like this:
"scripts": {
  "example": "bash -c 'cat ./sample.json | tr -d ' \t\n\r''",
  "test:override": "codeceptjs run --steps --config ./codecept.conf.js --override <expecting_output_json_here_from_example_script>"
}
The first script, "example", executes well and displays the output JSON correctly in the console (same as the input):
{
  "helpers": {
    "Puppeteer": {
      "url": "http://yahoo.com",
      "show": false,
      "windowSize": "1920x1080"
    }
  }
}
However, I couldn't find a way to pass it as the argument value for --override in the "test:override" script.
I have tried various Stack Overflow questions/answers, but couldn't find a way to achieve it.
I am using Git Bash in VS Code in Win10.
Please suggest a solution/alternate approach or point me to the right document/SO question.
To the best of my knowledge and efforts, I believe that it is not a duplicate question.
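One common approach is shell command substitution: run the first script with npm run --silent so that only its JSON (not npm's banner lines) reaches stdout, then splice that into the flag. A sketch of the idea, demonstrated outside npm with the same tr pipeline (the codeceptjs line is an untested assumption):

```shell
# Recreate the sample config (names taken from the question).
cat > sample.json <<'EOF'
{
  "helpers": {
    "Puppeteer": {
      "url": "http://yahoo.com",
      "show": false,
      "windowSize": "1920x1080"
    }
  }
}
EOF

# The same compaction the "example" script performs:
override=$(tr -d ' \t\n\r' < sample.json)
echo "$override"

# In package.json the idea would look something like (hypothetical,
# not verified against codeceptjs here):
#   "test:override": "codeceptjs run --steps --config ./codecept.conf.js --override \"$(npm run --silent example)\""
```

Since you're on Git Bash in Win10, note that npm may run scripts with cmd.exe by default, where $() is not understood; pointing npm's script-shell config at bash should make the substitution work.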

How to concatenate two strings and search for them within my React JSX in Next.js? And How to make file (MDX JS) content searchable?

I have the following function where I am trying to search the 'title' and 'summary' for whatever is entered in the search bar. However, this throws an error: 'const' is not allowed here.
import { frontMatter as blogs1 } from './blog/*.mdx';

const filteredBlogPosts = blogs1
  .sort(
    (a, b) =>
      Number(new Date(b.publishedAt)) - Number(new Date(a.publishedAt))
  )
  .filter((frontMatter) =>
    const concat = frontMatter.summary + frontMatter.title, // Error. How to search both title and summary for the entered search value?
    concat.toLowerCase().includes(searchValue.toLowerCase()),
  );
The imported blogs1 has content like the following. How do I make the content part of the MDX file searchable as well? With title and summary I at least have key-value pairs, but how do I make the text in the body of the MDX file (everything after 'Table of Contents' in the example below) searchable too?
---
title: 'abc def ghi'
publishedAt: '2020-09-06'
summary: "xyz mnk."
image: '/static/images/chapter18/1.png'
---
## Table of Contents
1. [Introduction](#introduction)
2. [Alphabet and xyz](#comparing-alphabet-and-xyz)
3. [ABC](#abc)
# Introduction
This is an attempt at something.<br/>
![Test](/static/images/chapter18/2.png)<br/>
It's hard to answer your question exactly, because you're at a point where you need to step back and consider the whole "architecture" side.
For search, I'd recommend using fuse.js; see this tutorial.
I have a search feature on my website, and it searches the MDX sources: articles and npm package readmes.
Conceptually, you write a script which fs.readFileSync-reads all the MDX, extracts the JSON-like "list" (in fuse.js terminology) and writes it to a file (I make it a .ts file). The React app then imports it and runs fuse client-side.
In the npm scripts in package.json, we end up with two steps: instead of "build": "next build", you prepend the search-index generation script, something like "build": "node './ops/generate.mjs' && next build".
I even published my "list" fuse string generation function as an npm package, see extract-search-index. It takes a string, the article's source, which you get via fs.readFileSync and then slice from the second instance of ---, just under the front matter.
I estimate the total "weight" of the search feature ends up around 200KB, but it's loaded asynchronously and then cached; Remix manages it well.

Managing multi sites in cypress

Here are the three country-based sites I have:
Site 1 - https://example.com/uk
Site 2 - https://example.com/fr
Site 3 - https://example.com/ie
All three sites share the same code base; based on the country segment (uk | fr | ie), my code passes some default configuration to the inner pages, such as country-specific text and feature enable/disable switches.
In Cypress, I have created fixtures like this:
/fixtures
  /uk
    uk-config.json
  /fr
    fr-config.json
  /ie
    ie-config.json
I am stuck on the folder structure for the integration folder and don't know the recommended way of doing this. Please help me with this.
Option 1:
/integration
  /uk
    homepage.spec.js
    plp.spec.js
    pdp.spec.js
    cart.spec.js
  /fr
    homepage.spec.js
    plp.spec.js
    pdp.spec.js
    cart.spec.js
  /ie
    homepage.spec.js
    plp.spec.js
    pdp.spec.js
    cart.spec.js
Problem with this approach: though the code is well segregated by country, a lot of it is duplicated, and the duplication grows every time we launch another country store.
Option 2:
/integration
  homepage.spec.js
  plp.spec.js
  pdp.spec.js
  cart.spec.js
And in this option, pass the country-specific configuration in from fixtures. TBH, I don't know how to manage this; if someone thinks this is the better way, some pointers toward it would be really helpful.
Problem: If I understood you correctly, you want to run the same set of tests for different countries, and the issue is that the suite keeps growing as more countries are added just to run the same tests. Right?
Solution:
You can pass a COUNTRY variable as a node environment variable on the command line, assign it as a Cypress env variable, and read it in your tests. In package.json:
"test": "COUNTRY=$COUNTRY ./node_modules/.bin/cypress open --env COUNTRY=$COUNTRY"
Your run commands then look like this:
COUNTRY=fr npm run test
COUNTRY=in npm run test
COUNTRY=uk npm run test
COUNTRY=whatever npm run test
Then, in a helper (for example in your Cypress support code; the exact file is up to you), read the URL for the current country from a config file:
let json = require('config.json');
export const getCountryUrl = () => {
  return json[Cypress.env().COUNTRY]['url'];
}
where config.json looks like:
{
  "uk": {
    "url": "https://uk-website"
  },
  "fr": {
    "url": "https://fr-website"
  }
}

Remove unused keys from JSON file

Hello, I'm looking for the best way to automate a fix for my problem:
I am working on a web application in which I use JSON translation files that have this form:
{ "unique_key" : "value"}
I have several files, one for each supported language, which all have the same number of items.
Ex :
i18n_en.json
{ "greeting" : "Hello"}
i18n_fr.json
{ "greeting" : "Bonjour"}
I have managed the evolution of these files badly, and I've ended up with keys that are no longer used (easily 30% of the ~500 keys, I think); the problem is that I don't know which ones. Finding out would mean manually searching the entire architecture of my application for the keys that are still used and rebuilding a clean file.
My idea to automate this process:
1. Open one of the JSON files (no matter which one, they all have the same keys).
2. Loop over each key.
3. For each key, search the entire project tree (looking only in *.html and *.js files).
4. If an occurrence of the key is found, create an entry in a new, clean JSON file with that key and its value.
I don't really know which language would be best suited to this kind of task; thank you for guiding me!
Maybe something like i18next-scanner will be useful. You are not the first with such a problem.
I ended up creating my own shell script :
path_to_project=path/to/my/project
RED='\033[0;31m'
GREEN='\033[0;32m'
NC='\033[0m' # reset colour
rm -f temp.json final.json
touch temp.json final.json
echo "{" >> temp.json
while IFS=, read -r key value
do
  if grep -r -q --include=\*.{js,html} --exclude-dir={node_modules,bower_components} "$key" "$path_to_project"; then
    # key is still used: write it into the new json
    echo "\"$key\":\"$value\"," >> temp.json
    echo -e "${GREEN}$key was found!${NC}"
  else
    echo -e "${RED}$key not found${NC}"
  fi
done < data.csv
echo "}" >> temp.json
# remove carriage returns
tr -d '\r' < temp.json >> final.json
For this to work, I had to convert my json file to csv (data.csv).
The final JSON file needs a little manual rework after the script (the trailing comma before the closing brace, for instance), but really nothing overwhelming.
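For what it's worth, both the CSV conversion and the trailing-comma cleanup can be avoided by driving the loop with jq directly. A sketch, assuming jq is available; the demo fixtures are hypothetical stand-ins for the real project and translation file:

```shell
#!/usr/bin/env bash
# Hypothetical demo fixtures standing in for the real project and file.
mkdir -p demo_project
echo 'i18n.t("greeting")' > demo_project/app.js
cat > i18n_en.json <<'EOF'
{ "greeting": "Hello", "obsolete_key": "unused" }
EOF

src=i18n_en.json
project=demo_project
out='{}'
# Iterate the keys straight out of the JSON; no CSV step needed.
while read -r key; do
  if grep -rq --include='*.js' --include='*.html' \
       --exclude-dir={node_modules,bower_components} "$key" "$project"; then
    value=$(jq -r --arg k "$key" '.[$k]' "$src")
    # jq builds the output object, so the result is always valid JSON.
    out=$(jq --arg k "$key" --arg v "$value" '. + {($k): $v}' <<< "$out")
  fi
done < <(jq -r 'keys[]' "$src")
# (bash-specific process substitution keeps $out in the current shell)
printf '%s\n' "$out" > final.json
cat final.json
```

Because jq assembles the output object itself, final.json is valid as written, with no stray commas or carriage returns to clean up afterwards.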

How do I delete all files that match the basename in an array of globs?

I have an existing build task in my gulpfile, that transforms a collection of "source" files into "dist" files. I have an array of well-known filenames that tell my build task which files to operate on:
sourceFiles: [
  "./js/source/hoobly.js",
  "./js/source/hoo.js"
]
My build task produces the following files from this:
./js/dist/hoobly.js
./js/dist/hoobly.min.js
./js/dist/hoobly.js.map
./js/dist/hoobly.min.js.map
./js/dist/hoo.js
./js/dist/hoo.min.js
./js/dist/hoo.js.map
./js/dist/hoo.min.js.map
I now want to write a corresponding clean task that removes the files generated by my build task. Unfortunately I cannot just delete everything in the ./js/dist/ directory, as it contains other files that are not generated by the build task in question, so I need to ensure that the files I delete match the "basename" of the original sourceFiles.
My question is: how do I go about using the sourceFiles array and "munging"* it so that I can end up calling something like:
gulp.src(sourceFiles)
.pipe(munge()) // I don't know what to do here!!
.pipe(del()); // does a `del ./js/dist/hoobly.*`
// and a `del ./js/dist/hoo.*`
(*Technical term, "munge"...)
Can you point me in the right direction? I've looked at various NPM packages (vinyl-paths, gulp-map-files, gulp-glob, ...) and I'm just getting more and more lost.
I'd change globs before passing them to gulp.src:
var sourceFiles = [
  "./js/source/hoobly.js",
  "./js/source/hoo.js"
];
// Anchor the extension with a regex so a ".js" elsewhere in a path is untouched.
var filesToDelete = sourceFiles.map(f => f.replace("/source/", "/dist/").replace(/\.js$/, ".*"));
console.log(filesToDelete);
and omit that munge step:
gulp.src(filesToDelete).pipe(del())
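If you want to sanity-check what those patterns expand to before wiring them into gulp, the same mapping is easy to mimic in the shell (paths taken from the question):

```shell
# Mirror of the JS map step: /source/ -> /dist/, extension -> wildcard.
for f in ./js/source/hoobly.js ./js/source/hoo.js; do
  echo "$f" | sed -e 's|/source/|/dist/|' -e 's|\.js$|.*|'
done
# prints ./js/dist/hoobly.* and ./js/dist/hoo.*
```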
