Not able to write split text to a JSON file in Cypress - JavaScript

I am testing an application where the alert text says something like
Customer added successfully with customer id :13
Now I want to grab 13 and save it in a JSON file.
Somehow it's not getting written to the JSON file.
Please help.
Below is my code:
export function stripcustomerid() {
  cy.on('window:alert', (txt) => {
    cy.readFile('cypress/fixtures/account_details.json').then((data) => {
      var customerno = txt.split(':')
      const cno = customerno[1]
      data.customerid = cno
      cy.writeFile(
        'cypress/fixtures/account_details.json',
        JSON.stringify(data)
      )
    })
  })
}

Assuming that the txt variable contains the string Customer added successfully with customer id :13, and that your account_details.json file already has an existing customer id entry such as customerid: 55 (since you are changing this value), then your code looks correct.
You can replace these three lines:
var customerno = txt.split(':')
const cno = customerno[1]
data.customerid = cno
with
data.customerid = txt.split(':')[1]
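For reference, a minimal sketch of the whole handler with that simplification applied, assuming the customer id always follows the last colon in the alert text. Note that cy.writeFile also accepts a plain object and serializes it to JSON itself, so the explicit JSON.stringify is optional:
export function stripcustomerid() {
  cy.on('window:alert', (txt) => {
    cy.readFile('cypress/fixtures/account_details.json').then((data) => {
      // grab everything after the ':' and drop surrounding whitespace -> "13"
      data.customerid = txt.split(':')[1].trim()
      cy.writeFile('cypress/fixtures/account_details.json', data)
    })
  })
}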

Related

How to get specific key and value from a long json object while iterating it in node.js

I am trying to parse a CSV file in Node.js. I am able to parse the CSV file and can print the content, which comes back in the form of a JSON object. Now my target is to iterate over the JSON object, take specific keys and values out of each block, and use them in a query that will do some DB operations. But the problem is that while iterating the JSON, only the first key and value of the first block are printed. Let me post the code of what I have done:
fs.createReadStream(path)
  .pipe(csv.parse({ headers: true, ignoreEmpty: true }))
  .on("error", (error) => {
    throw error.message;
  })
  .on("data", function (data) {
    if (data && data !== {}) {
      Object.keys(data).forEach(function (k) {
        if (k === 'name' || k === 'Office') {
          let selectQury = `select name,Office from myTable where name = ${data['name']} and Office = ${data[Office]}`;
          db.query(selectQury, (err, res) => {
            if (err) {
              console.log('error', null);
            }
          });
        }
      });
    }
  });
This is what the JSON I parse from the CSV looks like:
{
  id: 1,
  name: "AS",
  Office: "NJ"
  ........
  ACTIVE: 1
},
{
  id: 2,
  name: "AKJS",
  Office: "NK"
  ........
  ACTIVE: 2
}
so now what I want is for the parameters in the select query to be passed like
let selectQury = `select name,Office from myTable where name = "AS" and Office = "NJ"`;
in the first iteration,
let selectQury = `select name,Office from myTable where name = "AKJS" and Office = "NK"`;
in the second iteration, and so on as the CSV grows.
I am not able to do it, please help. Thanks in advance. I am new to Node.js and tricky JavaScript operations.
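A rough sketch (not from the original post) of one way to issue one query per row is below. It assumes the fast-csv parser from the snippet above and a mysql-style db.query(sql, values, callback) API with ? placeholders; adjust to your actual driver:
fs.createReadStream(path)
  .pipe(csv.parse({ headers: true, ignoreEmpty: true }))
  .on("error", (error) => { throw error; })
  .on("data", (data) => {
    // run one query per parsed row, using placeholders instead of string interpolation
    if (data.name && data.Office) {
      const selectQury = "select name, Office from myTable where name = ? and Office = ?";
      db.query(selectQury, [data.name, data.Office], (err, res) => {
        if (err) {
          console.log("error", err);
          return;
        }
        console.log(res);
      });
    }
  });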

How to properly write the values of an array into a JSON file using writeFile in Cypress?

I wanted to write the values of an array into a JSON file using writeFile.
console.log(category);
cy.writeFile('cypress/fixtures/actionsName.json', category);
I've logged the values of the category array; see the output below.
My expected contents of the file would be something like this:
However, upon writing the contents to the JSON file using writeFile, it only prints the top-level array, not including its contents.
I'm not sure what I'm missing; I would really appreciate it if someone could take a look. Thanks in advance!
UPDATE:
As per request, below is the code used to populate the category array. Let me know if anything needs to be updated/optimized.
getActionName(){
  let category = new Array();
  let actions = new Array();
  //Get all action names and store to json
  cy.get('.linkbox').each(($category) => {
    //Get category name
    const text = $category.find('.heading').text();
    cy.log("Category: " + text);
    category.push(text);
    //Get each action name and push it to the related category
    cy.wrap($category).find('ul li h2').each(($action) => {
      const text = $action.text();
      cy.log("Action: " + text);
      actions.push(text);
    }).then(() => {
      category[text] = actions;
      actions = [];
    })
  }).then(() => {
    console.log(category);
    //!Only writes the top level array
    cy.writeFile('cypress/fixtures/actionsName.json', category);
  })
}
It's a bit hard to tell for sure from the screenshot, but I think the array has non-standard properties attached (an Array is an object).
Try converting,
const keys = ['Alert', 'Dynamic Elements', 'Frames and Windows', etc]
const output = keys.reduce((acc, item) => {
  acc[item] = category[item]
  return acc
}, {})
cy.writeFile('cypress/fixtures/actionsName.json', output)
Just looked at your posted code - the problem is as expected, you're attaching the lists to the array as properties. The above should work, or you can clean up the way you gather the lists.
getActionName(){
  //let category = new Array(); // change this to an object
  let category = {};
  let actions = new Array();
  //Get all action names and store to json
  cy.get('.linkbox').each(($category) => {
    //Get category name
    const text = $category.find('.heading').text();
    cy.log("Category: " + text);
    //category.push(text); // don't need this
    //Get each action name and push it to the related category
    cy.wrap($category).find('ul li h2').each(($action) => {
      const text = $action.text();
      cy.log("Action: " + text);
      actions.push(text);
    }).then(() => {
      category[text] = actions;
      actions = [];
    })
  }).then(() => {
    console.log(category);
    cy.writeFile('cypress/fixtures/actionsName.json', category);
  })
}
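With category built as a plain object, the fixture file is ordinary JSON and can be read back directly. A small sketch (the category and action names in the comment are hypothetical):
cy.readFile('cypress/fixtures/actionsName.json').then((actions) => {
  // e.g. { "Alert": ["Action 1", "Action 2"], "Frames and Windows": ["Action 3"] }
  expect(actions).to.be.an('object')
  expect(Object.keys(actions).length).to.be.greaterThan(0)
})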

Error: Data between close double quote (") and field separator

I'm trying to use Google Apps Script to take a CSV from Google Drive and put it into Big Query. When I upload, I get this error:
"Error while reading data, error message: Error detected while parsing row starting at position: 560550. Error: Data between close double quote (") and field separator."
I've tried looking at that byte position of the file and it's way outside the bounds of the CSV (it only goes to ~501500 bytes).
Here's a link to the CSV that I'm using which is a scrape of a website: https://drive.google.com/file/d/1k3cGlTSA_zPQCtUkt20vn6XKiLPJ7mFB/view?usp=sharing
Here's my relevant code:
function csvToBigQuery(exportFolder, csvName, bqDatasetId){
  try{
    //get most recent export from Screaming Frog
    var mostRecentFolder = [];
    while(exportFolder.hasNext()){
      var folder = exportFolder.next();
      var lastUpdated = folder.getLastUpdated();
      if(mostRecentFolder.length == 0)
        mostRecentFolder = [folder.getLastUpdated(), folder.getId()];
      else if(lastUpdated > mostRecentFolder[0])
        mostRecentFolder = [lastUpdated, folder.getId()];
    }
    var folderId = mostRecentFolder[1];
    var file = DriveApp.getFolderById(folderId).getFilesByName(csvName + '.csv').next();
    if(!file)
      throw "File doesn't exist";
    //get csv and add date column.
    //getBlob().getDataAsString().replace(/(["'])(?:(?=(\\?))\2[\s\S])*?\1/g, function(e){return e.replace(/\r?\n|\r/g, ' ')})
    var rows = Utilities.parseCsv(file.getBlob().getDataAsString());
    Logger.log(rows);
    var numColumns = rows[0].length;
    rows.forEach(function(row){
      row[numColumns] = date;
    });
    rows[0][numColumns] = 'Date';
    let csvRows = rows.map(values => values.map(value => JSON.stringify(value).replace(/\\"/g, '""')));
    let csvData = csvRows.map(values => values.join(',')).join('\n');
    //log(csvData)
    var blob = Utilities.newBlob(csvData, 'application/octet-stream');
    //create job for inserting to BQ.
    var loadJob = {
      configuration: {
        load: {
          destinationTable: {
            projectId: bqProjectId,
            datasetId: bqDatasetId,
            tableId: csvName
          },
          autodetect: true, // Infer schema from contents.
          writeDisposition: 'WRITE_APPEND',
        }
      }
    };
    //append to table in BQ.
    BigQuery.Jobs.insert(loadJob, bqProjectId, blob);
  }catch(e){
    Logger.log(e);
  }
}
Modification points:
From your error message, I thought that there might be parts that are not enclosed in double quotes. So when I checked your CSV data by replacing \"(|.+?)\" with an empty string using the following script, I found that row 711 still had a leftover value.
function sample() {
  var id = "###"; // File ID of your CSV file.
  // This is your script.
  var file = DriveApp.getFileById(id);
  var rows = Utilities.parseCsv(file.getBlob().getDataAsString());
  var numColumns = rows[0].length;
  var date = "sample";
  rows.forEach(function(row){
    row[numColumns] = date;
  });
  rows[0][numColumns] = 'Date';
  let csvRows = rows.map(values => values.map(value => JSON.stringify(value).replace(/\\"/g, '""')));
  let csvData = csvRows.map(values => values.join(',')).join('\n');
  // I added the script below for checking your CSV data.
  var res = csvData.replace(/\"(|.+?)\"/g, "");
  DriveApp.createFile("sample.txt", res);
}
Row 711 is as follows:
"https://supergoop.com/products/lip-shield-trio/?utm_source=Gorgias&utm_medium=CustomerCare&utm_campaign=crosssellhello\","text/html; charset=utf-8","200","OK","Non-Indexable","Canonicalised","Lip Shield Trio - Restores, Protects + Water-resistant – Supergoop!","67","595","Moisturizing lip protection made from antioxidant-rich coconut, avocado, and grape seed oil.","92","576","","0","Lip Shield Trio","15","Lip Shield Trio","15","Why We Love It","14","Ingredients","11","","","","https://supergoop.com/products/lip-shield-trio","","","","","451488","754","1.686","5","","12","4","0.590","205","80","8","5","","","","","f6d1476960d22b1c5964581e161bdd49","0.064","","","","","HTTP/1.1","https://supergoop.com/products/lip-shield-trio/?utm_source=Gorgias&utm_medium=CustomerCare&utm_campaign=crosssellhello%5C"
From this value, I found that \" is used at the end of "https://supergoop.com/products/lip-shield-trio/?utm_source=Gorgias&utm_medium=CustomerCare&utm_campaign=crosssellhello\". I think this is the reason for your issue.
So in order to avoid this issue, how about the following modification?
Modified script:
From:
let csvRows = rows.map(values =>values.map(value => JSON.stringify(value).replace(/\\"/g, '""')));
To:
let csvRows = rows.map(values =>values.map(value => JSON.stringify(value).replace(/\\"/g, '""').replace(/\\"/g, '')));
or
From:
var rows = Utilities.parseCsv(file.getBlob().getDataAsString());
To:
var rows = Utilities.parseCsv(file.getBlob().getDataAsString().replace(/\\/g, ''));
With this modification, I could confirm that the file produced by the modified script is 2 bytes smaller than the one produced by your script. Also, when the check script above is run against the CSV data from the modified script, all rows come back with no leftover values.
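To see the effect in isolation, here is a small sketch (plain JavaScript / Apps Script; the URL is hypothetical) of what the two suggested changes do to a cell whose text ends with a backslash. The second suggestion actually strips backslashes from the whole CSV string before parseCsv, but the per-cell result is the same:
var value = 'https://example.com/page\\';                                   // hypothetical cell text ending with a single backslash
var original = JSON.stringify(value).replace(/\\"/g, '""');
Logger.log(original);  // "https://example.com/page\""  <- a backslash is left next to the quotes
var fix1 = JSON.stringify(value).replace(/\\"/g, '""').replace(/\\"/g, ''); // first suggested change
Logger.log(fix1);      // "https://example.com/page"
var fix2 = JSON.stringify(value.replace(/\\/g, '')).replace(/\\"/g, '""');  // second suggested change, applied per cell
Logger.log(fix2);      // "https://example.com/page"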

How to read a JSON file and change the value of a particular key in the same JSON file using JavaScript?

I have a JSON file named data.json:
{
  "Name": "Smith",
  "Id": "123"
}
Now I need to change the Id to 321 using JavaScript. How can I do it?
I have tried some code, but in vain:
let rawdata_one = fs.readFileSync('data.json');
let onedata = JSON.parse(rawdata_one);
onedata.for(
  function(obj){
    let ob = onedata.findIndex( o => o.Id);
    if(ob == 'Id'){
      onedata[ob].Id = '321';
    }
  }
)
let data = JSON.stringify(onedata);
fs.writeFileSync('data.json', data);
But I am unable to make the changes in the JSON file. Can anyone help in this part?
try
const osarr = onedata['Type']['section']['os'];
osarr.forEach((i) => {
  i['Id'] = 321;
})
console.log(onedata)
For this simple JSON, I did
const osarr = onedata
osarr.Id = 321
console.log(onedata)
and it worked.
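For this flat data.json, a minimal Node.js sketch of the full read-modify-write cycle would look like this:
const fs = require('fs');
const onedata = JSON.parse(fs.readFileSync('data.json', 'utf8'));
onedata.Id = '321'; // kept as a string to match the original file
fs.writeFileSync('data.json', JSON.stringify(onedata, null, 2));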

Get JSON data from ajax call to PHP page through POST

I want to get some data from a form by making an AJAX call. I am getting the data as a string in my PHP page. The string looks like
'fname':'abc','lname':'xyz','email':'','pass':'','phone':'','gender':'','dob':''
Now I want to convert this entire string into an array which would look like
["fname"] => "abc",
["lname"] => "xyz"... and so on
The ajax call is like below:
fname = $("#form-fname").val();
lname = $("#form-lname").val();
email = $("#form-username").val();
pwd = $("#form-password").val();
phone = $("#form-phone").val();
gender = $("#form-gender").val();
dob = $("#form-dob").val();
var user = {"fname":fname,"lname":lname,"email":email,"pass":pwd,"phone":phone,"gender":gender,"dob":dob};
$.ajax({
  type: "POST",
  url: "doRegistration.php",
  data: user
})
.done(function( msg ) {
  window.location.href = "../profile.php";
})
.fail(function(msg) {
  sessionStorage.setItem("success","0");
  window.location.reload();
});
And here is my PHP code:
$content = file_get_contents("php://input");
file_put_contents("log1.txt",$content); //This produces the string I get above
Now I try to convert the string into an array as shown above:
$dec = explode(",", $content);
$dec1 = array();
for($i = 0; $i < count($dec); $i++)
{
    $dec1[i] = substr($dec[i], strpos($dec[i], ":"));
}
//After this $dec1 is empty and count($dec1) gives 0.
But this does not give me the required array. I have checked several answers here, but they do not solve my issue. I tried to google but did not find any resolution. Is there something wrong in the code? Kindly help. Thanks in advance.
Change the quotes and add braces. Then you can decode the resulting JSON:
$string = "'fname':'abc','lname':'xyz','email':'','pass':'','phone':'','gender':'','dob':''";
print_r(json_decode('{' . str_replace("'", '"', $string) . '}', true));
result
Array
(
    [fname] => abc
    [lname] => xyz
    [email] =>
    [pass] =>
    [phone] =>
    [gender] =>
    [dob] =>
)
In your code above, you are using the index of the for loop as the key of the array. If what you are trying to achieve is a key/value map (I believe it is called a dictionary in Java), you can try the following code:
<?php
$string = "'fname':'abc','lname':'xyz','email':'','pass':'','phone':'','gender':'','dob':''";
$array = array();
$pieces = explode(',', $string);
foreach($pieces as $array_format){
    list($key, $value) = explode(':', $array_format);
    // strip the surrounding single quotes from both key and value
    $array[trim($key, "'")] = trim($value, "'");
}
print_r($array);
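As a side note (a sketch, not part of the answers above): if the client sends the payload as real JSON, the PHP side can decode php://input directly with json_decode instead of splitting strings:
var user = { fname: fname, lname: lname, email: email, pass: pwd, phone: phone, gender: gender, dob: dob };
$.ajax({
  type: "POST",
  url: "doRegistration.php",
  contentType: "application/json", // tell PHP the body is JSON
  data: JSON.stringify(user)       // send a real JSON string
});
// in PHP: $data = json_decode(file_get_contents('php://input'), true);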
