How to print a JSON string in human-readable format in JS

I have a Firebase Realtime Database that reads sensor data (updated every 0.3 s) and displays it on my webpage. After doing some research I found out about 'pretty-printing', but that is not the format I am after. My data is currently displayed like this: {"Moisture":619}.
What I am looking for is: Moisture: 619. Right now this code also writes a new {"Moisture":619} every time the value in the database is updated. Ideally, when a new value arrives, only the number after Moisture would change instead of the whole object being printed again.
My code:
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta http-equiv="X-UA-Compatible" content="IE=edge">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <link rel="stylesheet" href="/styles.css">
  <script src="https://www.gstatic.com/firebasejs/4.11.0/firebase-app.js"></script>
  <script src="https://www.gstatic.com/firebasejs/4.11.0/firebase-auth.js"></script>
  <script src="https://www.gstatic.com/firebasejs/4.11.0/firebase-database.js"></script>
  <script src="https://www.gstatic.com/firebasejs/4.11.0/firebase-firestore.js"></script>
  <script src="https://www.gstatic.com/firebasejs/4.11.0/firebase-storage.js"></script>
  <script src="https://www.gstatic.com/firebasejs/4.11.0/firebase-messaging.js"></script>
  <script>
    // Initialize Firebase
    var config = {
      apiKey: "xx",
      authDomain: "xx",
      databaseURL: "xx",
      projectId: "xx",
      storageBucket: "xx",
      messagingSenderId: "xx",
      appId: "xx"
    };
    firebase.initializeApp(config);
  </script>
  <script>
    var database = firebase.database();
    var ref = firebase.database().ref("plant-patrol/Moisture");
    ref.once("value")
      .then(function(snapshot) {
        var key = snapshot.key;     // "Moisture"
        var value = snapshot.val(); // the current reading, e.g. 619
      });
  </script>
  <script>
    var ref = firebase.database().ref();
    ref.on("value", function(snapshot) {
      console.log(snapshot.val());
      var snapshotJSON = JSON.stringify(snapshot.val());
      var moisture = snapshotJSON;
      document.write(moisture);
    }, function (error) {
      console.log("Error: " + error.code);
    });
  </script>
  <script src="/script.js" defer></script>
</head>
</html>

You can use JSON.stringify and replace:
const json = {
  id: "1",
  employee_name: "Tiger Nixon",
  employee_salary: "320800",
  employee_age: "61",
  profile_image: ""
};

document.getElementById("app").innerHTML = JSON.stringify(json, (key, value) => (value || ''), 4).replace(/"([^"]+)":/g, '$1:');
<div id="app"></div>

You can use a pre tag to display the formatted JSON:
const json = {
  id: "1",
  employee_name: "Tiger Nixon",
  employee_salary: "320800",
  employee_age: "61",
  profile_image: ""
};

document.getElementById("app").innerHTML = JSON.stringify(json, (key, value) => (value || ''), 4).replace(/"([^"]+)":/g, '$1:');
<div><pre id="app"></pre></div>

You can use a regex to remove the []{}"" characters:
snapshotJSON.replace(/[\[\]\{\}\"]+/g, '')
But you already have the plain value as snapshot.val(), so why not use that directly? JSON.stringify() converts a JavaScript object to a JSON-formatted string, which is normally used for machine-to-machine communication. The opposite is JSON.parse(), which converts JSON text back into a JavaScript object.
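For the original Firebase listener, a minimal sketch of that idea could look like this. It listens at the plant-patrol node from the question, and it assumes the page has a <pre id="moisture"> element, which the original markup does not include, so treat the element id as a placeholder:
<pre id="moisture"></pre>
<script>
  var ref = firebase.database().ref("plant-patrol");
  ref.on("value", function (snapshot) {
    var data = snapshot.val() || {};                 // e.g. { Moisture: 619 }
    var lines = Object.keys(data).map(function (key) {
      return key + ": " + data[key];                 // "Moisture: 619"
    });
    // Overwriting textContent replaces the previous reading,
    // so the value updates in place instead of being appended again.
    document.getElementById("moisture").textContent = lines.join("\n");
  }, function (error) {
    console.log("Error: " + error.code);
  });
</script>
Writing into a fixed element instead of calling document.write is what keeps a single line on the page that simply changes its number on every update.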

You can use Prettier to format your whole codebase.
Source: https://prettier.io
npm package: https://www.npmjs.com/package/prettier

Related

Parse non JSON to JSON

I have a file with data that I need to parse and store in a DB. Below is an example of two entries in the file. I'm not quite sure what the structure is (although it looks to be ndJSON). I am trying to parse the data into a JSON object in order to store it in a DB, but cannot seem to figure it out. Here is what I have so far:
var ndjson = {
"sequence-num": "0123456789",
"version": "N1.4",
"record-type": "R",
"session-id": "197-30760303",
"date": "2021-07-23 15:00:53",
"passport-header": { "alg": "ES256", "ppt": "test", "typ": "passport", "x5u": "https://cr.com" },
"passport-payload": { "attest": "A", "dest": { "tn": ["0123456789"] }, "iat": 0123456789, "orig": { "tn": "0123456789" }, "origid": "c699f78a-ebc6-11eb-bfd8-bec0bbc98888" },
"identity-header": "eyJhbGciOiJFUzI1NiIsInBwdCI6InNoYWtlbiIsInR5cCI6InBhc3Nwb3J0IiwieDV1IjoiaHR0cHM6Ly9jci5zYW5zYXkuY29tL1RvdWNodG9uZV82ODNBIn0.eyJhdHRlc3QiOiJCIiwiZGVzdCI6eyJ0biI6WyIxMjUeyJhdHRlc3QiOiJCIiwiZGVzdCI6eyJ0biI6WyIxMj;info=<https://google.com/>;alg=ES256;ppt=\"test\""
}
{
"sequence-num": "0123456788",
"version": "N1.4",
"record-type": "R",
"session-id": "214-30760304",
"date": "2021-07-23 15:00:53",
"passport-header": { "alg": "ES256", "ppt": "test", "typ": "passport", "x5u": "https://cr.com" },
"passport-payload": { "attest": "B", "dest": { "tn": ["0123456788"] }, "iat": 0123456788, "orig": { "tn": "0123456788" }, "origid": "c69d0588-ebc6-11eb-bfd8-bec0bbc98888" },
"identity-header": "eyJhbGciOiJFUzI1NiIsInBwdCI6InNoYWtlbiIsInR5cCI6InBhc3Nwb3J0IiwieDV1IjoiaHR0cHM6Ly9jci5zYW5zYXkuY29tL1RvdWNodG9uZV82ODNBIn0.eyJhdHRlc3QiOiJCIiwiZGVzdCI6eyJ0biI6WyIxMjUeyJhdHRlc3QiOiJCIiwiZGVzdCI6eyJ0biI6WyIxMj;info=<https://google.com/>;alg=ES256;ppt=\"test\""
};
let result = ndjson.split(',').map(s => JSON.parse(s));
console.log('The resulting array of items:');
console.log(result);
console.log('Each item at a time:');
for (o of result) {
  console.log("item:", o);
}
When I run this, I get an Uncaught SyntaxError: Unexpected token ':' error on line 12, at the second "sequence-num": "0123456788", entry.
Any help is appreciated, thank you!
If you actually have ndJSON (newline-delimited JSON), then each line in the file is valid JSON, delimited by newlines. A simple file would look like this:
{"key1": "Value 1","key2": "Value 2","key3": "Value 3","key4": "Value 4"}
{"key1": "Value 5","key2": "Value 6","key3": "Value 7","key4": "Value 8"}
This differs from the formatted data you've posted here, and the difference is important, because once the data has been pretty-printed the individual JSON objects can no longer be distinguished simply by the presence of newlines.
So, on the assumption that you do have valid ndJSON in its original form, you can extract it by using split() on newlines and JSON.parse() on each resulting array element.
This snippet adds a little file handling to allow a file to be uploaded, but thereafter it uses split() and JSON.parse() to extract the data:
"use strict";
document.getElementsByTagName('form')[0].addEventListener('submit',function(e){
e.preventDefault();
const selectedFile = document.getElementById('inputFile').files[0];
let fr = new FileReader();
fr.onload = function(e){
let ndJSON = e.target.result; // ndJSON extracted here
let ndJSONLines = ndJSON.split('\n');
// Process JSON objects here
ndJSONLines.forEach(function(el){
let obj = JSON.parse(el);
Object.keys(obj).forEach(key=>{
console.log(`Key: ${key}, Value: ${obj[key]}`);
});
});
}
fr.readAsText(selectedFile)
});
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Parsing ndJSON</title>
</head>
<body>
  <form method="post" enctype="multipart/form-data">
    <input type="file" name="inputFile" id="inputFile">
    <input type="submit">
  </form>
</body>
</html>
Output, based on the sample file above, logs each key/value pair to the console (e.g. Key: key1, Value: Value 1).
Here is what I do:
const end_point_url = 'https://ipfs.io/ipfs/bafkqap33ejwgs3tfgerduitomrvhg33oebtg64tnmf2c4lroej6qu6zcnruw4zjsei5ce5dinfzsa2ltebqsa43fmnxw4zbanruw4zjcpufa';
let json = await fetch(end_point_url)
  .then(resp => resp.text())
  .then(buf => { // NDJSON format ...
    return buf.slice(0, -1).split('\n').map(JSON.parse);
  })
  .catch(console.error);

Parsing JavaScript MySQL response

I am reading MySQL via JavaScript and am successfully getting a response back with data. But my question is: how can I parse out only
whats your first school
from this output?
{
  "Success": true,
  "Result": [
    {
      "question": "whats your first school"
    }
  ]
}
<html>
<head>
  <title>MySqlJs test</title>
  <script src="http://mysqljs.com/mysql.js"></script>
</head>
<body>
  <pre id="output"></pre>
  <script>
    MySql.Execute(
      "host",
      "serv",
      "pwd",
      "db56",
      "select quest from datab",
      function (data) {
        document.getElementById("output").innerHTML = JSON.stringify(data, null, 2);
      });
  </script>
</body>
</html>
Assuming data is a JavaScript object (if it's a string, use JSON.parse() first), you can access it like this:
const data = {
  "Success": true,
  "Result": [{
    "question": "whats your first school"
  }]
};

console.log(data.Result[0].question);
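Applied to the callback from the question, a minimal sketch (assuming the response always has the Success/Result shape shown above) might be:
MySql.Execute(
  "host",
  "serv",
  "pwd",
  "db56",
  "select quest from datab",
  function (data) {
    // Guard against an empty result set before indexing into Result
    var question = (data.Success && data.Result.length > 0)
      ? data.Result[0].question
      : "no result";
    document.getElementById("output").textContent = question;
  });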

Javascript / Json / Firestore code to firestore bulk upload

We are looking for ways to bulk upload data into Google Firestore and found an awesome script here on Stack Overflow to do so, sort of. The problem is that when the data is uploaded, the collection is fine and the document is fine, but the nested data can only be imported as a "map" type, when we need both "array" and "map". Since we are basically newbies, trial and error has not been enough. We would appreciate it if you could take a look at the code and help us with that.
So far, with the following code:
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <script src="https://www.gstatic.com/firebasejs/7.5.0/firebase-app.js"></script>
  <script src="https://www.gstatic.com/firebasejs/7.5.0/firebase-firestore.js"></script>
</head>
<body>
  <script>
    var config = {
      apiKey: 'MY API KEY',
      authDomain: 'MY AUTHDOMAIN',
      projectId: 'MY PROJECT ID'
    };
    firebase.initializeApp(config);
    var firestore = firebase.firestore();

    // Paste from the CSV to JSON converter
    const data = [
      {
        "collection": "sub_cats",
        "uid": "Av9vJ0EoFfxhAR2",
        "class": "especialidades_medicas",
        "id": 0,
        "name": "Cardiologo",
        "ind": "11",
        "district": ""
      },
      {
        "collection": "sub_cats",
        "uid": "Av9vJ0EoFfxhAR2",
        "class": "especialidades_medicas",
        "id": 1,
        "name": "Urologo",
        "ind": "12",
        "district": ""
      }
    ];

    var promises = [];
    data.forEach(function (arrayItem) {
      var docRef = firestore.collection(arrayItem.collection).doc(arrayItem.uid);
      var objClass = {};
      var objId = {};
      objId[arrayItem.id] = { name: arrayItem.name, ind: arrayItem.ind };
      objClass[arrayItem.class] = objId;
      promises.push(docRef.set(objClass, { merge: true }));
    });
  </script>
</body>
</html>
we get this:
[screenshot: data structure, current vs. needed]
How would you modify this code to get the array / map structure?
Thanks a lot!!!
You will need to use arrayUnion() from FieldValue, which can be used with set() or update(). If you use it with update(), it will "merge" the new value with the existing array values in your document. Inside it you can have your map or any data type you want.
An example looks like this (note that FieldValue lives on the firestore namespace, not on the database instance):
db.collection("users").doc(props.uid).update({
  points: firebase.firestore.FieldValue.arrayUnion({ value: pointObj.value, reason: pointObj.reason })
});
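Applied to the bulk-upload loop from the question, a rough sketch could look like the following. It assumes each class field should end up as an array of { name, ind } maps (my reading of your screenshot) and uses the v7 compat API already loaded by the question's script tags:
var promises = [];
data.forEach(function (arrayItem) {
  var docRef = firestore.collection(arrayItem.collection).doc(arrayItem.uid);
  var objClass = {};
  // arrayUnion appends the map to the array stored under the class field,
  // creating that array on the document if it does not exist yet.
  objClass[arrayItem.class] = firebase.firestore.FieldValue.arrayUnion({
    name: arrayItem.name,
    ind: arrayItem.ind
  });
  promises.push(docRef.set(objClass, { merge: true }));
});

Promise.all(promises).then(function () {
  console.log("Bulk upload finished");
});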

How to export or convert JSON to Excel in AngularJS?

I'm extracting an array of 4 objects, each containing an array, from my Kendo charts datasource in my Angular project.
The data inside each sub-object varies in size, but it always includes a timestamp and 1-5 value fields.
I need to export this array to an Excel file (.xls or .xlsx, NOT CSV).
So far I have managed to download the JSON as a file on its own (both .json and unformatted .xls).
I'd like each object to become its own worksheet, formatted with the timestamp in the first column, value 1 in the next, and so on. The column headers should be timestamp, the value1 name, etc. (I'm translating these in the UI according to user preferences).
How can I build this type of formatted .xls file using Angular? I don't know of a particularly good library for this that is clear about how to use it in Angular.
Following Nathan Beck's link suggestion, I used AlaSQL. I'm getting correctly formatted columns; I just needed to adapt my array to have multiple worksheets.
The way we integrate AlaSQL into our Angular project is by including alasql.min.js and xlsx.core.min.js.
Then we call the alasql method in our function:
$scope.export = function() {
  var arrayToExport = [{id:1, name:"gas"},...];
  alasql('SELECT * INTO XLSX("your_filename.xlsx",{headers:true}) FROM ?', arrayToExport);
};
EDIT: Solved the multiple-worksheet issue as well. Keep in mind that when using the multiple-worksheet method, you have to remove the asterisk and replace the headers: true object in the query with a question mark, passing the options in a separate array. So:
var arrayToExport1 = [{id:1, name:"gas"},...];
var arrayToExport2 = [{id:1, name:"solid"},...];
var arrayToExport3 = [{id:1, name:"liquid"},...];
var finalArray = arrayToExport1.concat(arrayToExport2, arrayToExport3);
var opts = [{sheetid: "gas", headers: true},{sheetid: "solid", headers: true},{sheetid: "liquid", headers: true}];
alasql('SELECT INTO XLSX("your_filename.xlsx",?) FROM ?', [opts, finalArray]);
You can use the XLSX library to convert JSON into an XLS file and download it. Just create a service for your AngularJS application, then call it as a service method containing the code below.
I found this tutorial with JS and jQuery code, but the same code can be reused in AngularJS.
Method
Include library
<script type="text/javascript" src="//unpkg.com/xlsx/dist/xlsx.full.min.js"></script>
JavaScript Code
var createXLSLFormatObj = [];
/* XLS Head Columns */
var xlsHeader = ["EmployeeID", "Full Name"];
/* XLS Rows Data */
var xlsRows = [
  { "EmployeeID": "EMP001", "FullName": "Jolly" },
  { "EmployeeID": "EMP002", "FullName": "Macias" },
  { "EmployeeID": "EMP003", "FullName": "Lucian" },
  { "EmployeeID": "EMP004", "FullName": "Blaze" },
  { "EmployeeID": "EMP005", "FullName": "Blossom" },
  { "EmployeeID": "EMP006", "FullName": "Kerry" },
  { "EmployeeID": "EMP007", "FullName": "Adele" },
  { "EmployeeID": "EMP008", "FullName": "Freaky" },
  { "EmployeeID": "EMP009", "FullName": "Brooke" },
  { "EmployeeID": "EMP010", "FullName": "FreakyJolly.Com" }
];
createXLSLFormatObj.push(xlsHeader);
$.each(xlsRows, function(index, value) {
  var innerRowData = [];
  $("tbody").append('<tr><td>' + value.EmployeeID + '</td><td>' + value.FullName + '</td></tr>');
  $.each(value, function(ind, val) {
    innerRowData.push(val);
  });
  createXLSLFormatObj.push(innerRowData);
});

/* File Name */
var filename = "FreakyJSON_To_XLS.xlsx";

/* Sheet Name */
var ws_name = "FreakySheet";

if (typeof console !== 'undefined') console.log(new Date());

var wb = XLSX.utils.book_new(),
    ws = XLSX.utils.aoa_to_sheet(createXLSLFormatObj);

/* Add worksheet to workbook */
XLSX.utils.book_append_sheet(wb, ws, ws_name);

/* Write workbook and Download */
if (typeof console !== 'undefined') console.log(new Date());
XLSX.writeFile(wb, filename);
if (typeof console !== 'undefined') console.log(new Date());
An Angular directive for exporting and downloading JSON as a CSV. Run bower install ng-csv-download. Run it in Plunker:
var app = angular.module('testApp', ['tld.csvDownload']);

app.controller('Ctrl1', function($scope, $rootScope) {
  $scope.data = {};
  $scope.data.exportFilename = 'example.csv';
  $scope.data.displayLabel = 'Download Example CSV';
  $scope.data.myHeaderData = {
    id: 'User ID',
    name: 'User Name (Last, First)',
    alt: 'Nickname'
  };
  $scope.data.myInputArray = [
    { id: '0001', name: 'Jetson, George' },
    { id: '0002', name: 'Jetson, Jane', alt: 'Jane, his wife.' },
    { id: '0003', name: 'Jetson, Judith', alt: 'Daughter Judy' },
    { id: '0004', name: 'Jetson, Elroy', alt: 'Boy Elroy' },
    { id: 'THX1138', name: 'Rosie The Maid', alt: 'Rosie' }
  ];
});
<!DOCTYPE html>
<html ng-app="testApp">
<head>
  <meta charset="utf-8" />
  <title>Exporting JSON as a CSV</title>
  <script>document.write('<base href="' + document.location + '" />');</script>
  <link rel="stylesheet" href="style.css" />
  <script src="https://cdnjs.cloudflare.com/ajax/libs/angular.js/1.3.15/angular.min.js"></script>
  <script src="csv-download.min.js"></script>
  <script src="app.js"></script>
</head>
<body>
  <div>Using an Angular directive for exporting JSON data as a CSV download.</div>
  <div ng-controller="Ctrl1">
    <h2>All Attributes Set</h2>
    <csv-download
      filename="{{data.exportFilename}}"
      label="{{data.displayLabel}}"
      column-header="data.myHeaderData"
      input-array="data.myInputArray">
    </csv-download>
    <hr />
    <h2>Only Required Attribute Set</h2>
    <h3>Optional Attributes Default</h3>
    <csv-download
      input-array="data.myInputArray">
    </csv-download>
  </div>
</body>
</html>

Algolia instantsearch disjunctiveFacetsRefinements not working with facetsExcludes

var getallskills = document.getElementById('OpenJobs').value;
var preselectedCategories = getallskills.split(",");
var getallskills2 = document.getElementById('HiddenField1').value;
var preselectedCategories2 = getallskills2.split(",");
var geter = "Filled";
var pregeter = geter.split(",");

var search = instantsearch({
  appId: 'XXXXXXXXX',
  apiKey: 'XXXXXXXXXXXX',
  indexName: 'Jobs',
  searchParameters: {
    "disjunctiveFacetsRefinements": {
      "JobTitle": preselectedCategories
    },
    "facetsExcludes": {
      "Emails.User": preselectedCategories2,
      "Status": pregeter
    }
  }
});
I am trying to get disjunctiveFacetsRefinements working with facetsExcludes and it doesn't seem to work. Is there something I am doing wrong? Perhaps a bug?
I figured this out... I think the problem is the disjunctiveFacetsRefinements. If you look at my code:
"disjunctiveFacetsRefinements": {
//Facet Following Users
"JobTitle": preselectedCategories//, "ToUser.EmailAddress": emailaddress
},
So what it does is look only at the "JobTitle" facet, which limits my results. Is there a way I can change the searchParameters to do a LIKE-type query across searchable fields?
