Logging a JSON object in firebase console - javascript

Does anyone know how to log JSON data in a readable manner to the Firebase logs? When using:
console.log(req.body)
or
console.log(`${req.body.event}: ${JSON.stringify(req.body, null, 2)}`);
it prints it across multiple lines, as seen in the picture below.
My Firebase Cloud Functions run on Node 10.

You are supposed to use functions.logger.log() with Node version 10 and above. See: https://firebase.google.com/docs/functions/writing-and-viewing-logs
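A minimal sketch of that approach (the function and field names here are placeholders; functions.logger comes from the firebase-functions SDK described in the linked docs):

const functions = require('firebase-functions');

exports.webhook = functions.https.onRequest((req, res) => {
  // Structured logging: the object is attached to the log entry as a payload
  // instead of being stringified across multiple plain-text lines.
  functions.logger.log('incoming event', { event: req.body.event, body: req.body });
  res.sendStatus(200);
});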

I can confirm I have the same problem; since the update to Node 10, JSON objects are printed across multiple lines.
For now the solution is to import this dependency:
require('firebase-functions/lib/logger/compat');
Just put this at the top of your functions code.
You can check the official issue for updates: https://github.com/firebase/firebase-functions/issues/612

This multi-line behavior is likely due to the fact that you explicitly tell JSON.stringify() to add line breaks and indentation. The last parameter (2) tells the JSON formatter that you want "structured" (aka "pretty") output with 2 spaces per indentation level. If you drop the last argument (you may safely drop "null" as well), then you should see the good old single long string :)
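In other words, keeping the original example but without the pretty-printing arguments:

console.log(`${req.body.event}: ${JSON.stringify(req.body)}`);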

FWIW, I don't see this logging behavior when using NodeJS 8 on my Firebase Functions. So something changed with Node 10 on the Firebase side...
Also check this GitHub issue: https://github.com/firebase/firebase-functions/issues/612

Related

Placing Array of objects into mssql db from nodejs

I'm attempting to insert a decently sized array of objects into a DB. I've got the code written that will do it, and several records ARE actually written to the DB; however, something about the formatting is causing the function to throw errors.
One of those errors that I've seen on multiple attempts to fix this issue is:
"Incorrect syntax near the keyword 'with'. If this statement is a common table expression, an xmlnamespaces clause or a change tracking context clause, the previous statement must be terminated with a semicolon."
Another is:
"Incorrect syntax near 's'."
When I initially open the array I'm attempting to place into the DB, I get an alert from VSCode about symbols that need to be removed, but I'm not able to get the error verbatim at the moment and don't know how to reproduce the message. Regardless, it has no effect on whether it will add all entries or not.
My question is: what about my objects could cause this kind of weird issue, and what kinds of things could I fix, keeping in mind that some of the records are actually being added? Thank you!
Also, what is the best/easiest way to bulk-add items to a DB programmatically? At the moment I'm just doing an INSERT INTO query.
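Not an authoritative answer, but here is a rough sketch of the bulk route, assuming the node mssql package and made-up table/column names. Passing values as parameters via request.bulk() also sidesteps the quoting problems (e.g. "Incorrect syntax near 's'") that come from concatenating values containing apostrophes into an INSERT string.

const sql = require('mssql'); // npm install mssql

async function bulkInsert(items) {
  const pool = await sql.connect({ server: 'localhost', database: 'MyDb', user: 'user', password: 'pass' });

  // Describe the existing target table and its columns (hypothetical names).
  const table = new sql.Table('MyTable');
  table.create = false;
  table.columns.add('name', sql.NVarChar(100), { nullable: false });
  table.columns.add('amount', sql.Int, { nullable: true });

  // Values are passed as data, never spliced into SQL text.
  for (const item of items) {
    table.rows.add(item.name, item.amount);
  }

  const result = await pool.request().bulk(table);
  console.log(`rows affected: ${result.rowsAffected}`);
  await pool.close();
}

bulkInsert([{ name: "O'Brien", amount: 3 }, { name: 'Smith', amount: 5 }]).catch(console.error);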

Run a function in javascript when an array is being "written" or "read"

I have an array (actually the database data from alasql on GitHub), and I want to run a function when it is "written" or "read" in a script.
"Written" stands for push, pull, and assignment operators like =,
while "read" stands for direct access to the array value, like array (the whole array), array[0] (an element of the array), etc., and reading properties like array.length.
Actually I use those functions for accessing localStorage.
I have already read this stackoverflow question but I still don't have my own solution, since I have a few restrictions here.
I have to monitor not just push() but also =, so "Override the push method" does not work for me.
That array is actually created by the JavaScript library alasql and I am not going to change the library code since it could be a big, big task, so "Create a custom observable array" is not possible.
The library accesses the array directly for its SQL statements, as the array stores the data of the SQL table, so it is impossible to use a Proxy (a minimal sketch of that approach is included after this question for reference).
My script is going to run in Firefox on a slow computer running XP, bought at least 10 years ago, with hardware like < 1 GB RAM and a single-core CPU, and the array (which is actually the SQL table) will have something like 10,000 entries, so the array is performance-critical. If I use a library like underscore-observe on GitHub, which relies on scanning every 250 ms, that dumb computer will simply die.
Use smart-collection on GitHub: well, I actually don't know what that answer and the README on GitHub are talking about, but I don't think it will work, since I would have to create another array for it (?)
Use the localStorage engine of alasql (on the GitHub README): fine, this could be the best one initially and I tried it, but I decided to leave it when I found a bug in it which blocked my work.
I would be glad if someone could give me a solution for how to do the trick, since it is a major issue in my project so far.
P.S.: My project is to develop a single HTML page that runs a POS-like system with sell and borrow functionality backed by SQL, and it is going to run on a dumb old computer without an internet connection.
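For reference only, here is a minimal sketch of the Proxy approach mentioned in the restrictions above. As noted there, it only helps if every consumer (including alasql) goes through the proxy rather than through its own reference to the underlying array, which is exactly the limitation described.

// Wrap an array so reads and writes (including arr[0] = ..., push, and length) are intercepted.
function observe(arr, onRead, onWrite) {
  return new Proxy(arr, {
    get(target, prop, receiver) {
      onRead(prop);
      return Reflect.get(target, prop, receiver);
    },
    set(target, prop, value, receiver) {
      onWrite(prop, value);
      return Reflect.set(target, prop, value, receiver);
    },
  });
}

const data = observe([], p => console.log('read', p), (p, v) => console.log('write', p, v));
data.push(42);        // triggers reads (push, length) and writes (index 0, length)
console.log(data[0]); // triggers a read of '0'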

Postgres JSON function passed string instead of object

So I've got a Postgres function in place designed to merge two JSONB objects recursively, and it works just fine on the production server but fails on my local Postgres installation. The function itself is written in plv8 (basically a V8 JavaScript engine) and expects two arguments in JSONB format to merge; the problem is that the JSON is passed in as a string and not as an object, which essentially breaks the entire function.
This only happens on my local computer though, a fresh Postgres 9.4.5 installation. The production server is running 9.4.4, which shouldn't cause such a major change across minor versions... Ideas on where to look to see what's broken here?
EDIT: I can now confirm that reverting to 9.4.4 doesn't make this behave any differently locally.
Hard to say. Possibilities:
Different casts - you can define custom casts (CREATE CAST statement) - try to check the result of the psql command \dC *json*
A new bug introduced in 9.4.5
FWIW, upgrading to 9.5 seems to solve instances of this issue for me.
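One defensive workaround (an assumption on my part, not a confirmed fix; left/right are placeholder argument names) is to normalize the arguments at the top of the plv8 function body, so the merge works whether plv8 hands over objects or strings:

// At the top of the plv8 function body:
if (typeof left === 'string') {
  left = JSON.parse(left);   // some installations pass jsonb arguments as strings
}
if (typeof right === 'string') {
  right = JSON.parse(right);
}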

JSON diff of large JSON data, finding some JSON as a subset of another JSON

I have a problem I'd like to solve, so that I don't have to spend a lot of manual work analyzing it as the alternative.
I have 2 JSON objects (returned from different web service API or HTTP responses). There is intersecting data between the 2 JSON objects, and they share a similar, but not identical, JSON structure. One JSON (the smaller one) is like a subset of the bigger JSON object.
I want to find all the intersecting data between the two objects. Actually, I'm more interested in the shared parameters/properties within the objects, not really the actual values of the parameters/properties of each object, because I want to eventually use data from one JSON output to construct the other JSON as input to an API call. Unfortunately, I don't have the documentation that defines the JSON for each API. :(
What makes this tougher is that the JSON objects are huge. One spans a page if you print it out via Windows Notepad; the other spans 37 pages. The APIs return the JSON output compressed as a single line. A normal text compare doesn't do much; I'd have to reformat manually or with a script to break up the objects with newlines, etc., for a text compare to work well. Tried with the Beyond Compare tool.
I could do a manual search/grep, but that's a pain to cycle through all the parameters inside the smaller JSON. I could write code to do it, but I'd also have to spend time on that, and test whether the code works. Or maybe there's some ready-made code already for that...
Or can look for JSON diff type tools. Searched for some. Came across these:
https://github.com/samsonjs/json-diff or https://tlrobinson.net/projects/javascript-fun/jsondiff
https://github.com/andreyvit/json-diff
Both failed to do what I wanted; presumably the JSON is either too complex or too large for them to process.
Any thoughts on best solution? Or might the best solution for now be manual analysis w/ grep for each parameter/property?
In terms of a code solution, any language will do. I just need a parser or diff tool that will do what I want.
Sorry, can't share the JSON data structure with you either, it may be considered confidential.
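If the "write code to do it" route turns out to be acceptable, here is a minimal Node sketch (file names are placeholders) that recursively collects the property paths present in both objects, ignoring the values:

const fs = require('fs');

// Return the property paths that exist in both a and b (values are ignored).
function sharedKeys(a, b, prefix = '') {
  if (a === null || b === null || typeof a !== 'object' || typeof b !== 'object') {
    return [];
  }
  const paths = [];
  for (const key of Object.keys(a)) {
    if (Object.prototype.hasOwnProperty.call(b, key)) {
      const path = prefix ? `${prefix}.${key}` : key;
      paths.push(path, ...sharedKeys(a[key], b[key], path));
    }
  }
  return paths;
}

const small = JSON.parse(fs.readFileSync('small.json', 'utf8'));
const big = JSON.parse(fs.readFileSync('big.json', 'utf8'));
console.log(sharedKeys(small, big).join('\n'));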
Beyond Compare works well, if you set up a JSON file format in it to use Python to pretty-print the JSON. Sample setup for Windows:
Install Python 2.7.
In Beyond Compare, go under Tools, under File Formats.
Click New. Choose Text Format. Enter "JSON" as a name.
Under the General tab:
Mask: *.json
Under the Conversion tab:
Conversion: External program (Unicode filenames)
Loading: c:\Python27\python.exe -m json.tool %s %t
Note that the second parameter in the command line must be %t; if you enter two %s's you will suffer data loss.
Click Save.
Jeremy Simmons has created a better file format package for Beyond Compare, posted on the forum as "JsonFileFormat.bcpkg", which does not require Python or anything else to be installed.
Just download the file and open it with BC and you are good to go. So it's much simpler.
JSON File Format
I needed a file format for JSON files.
I wanted to pretty-print & sort my JSON to make comparison easy.
I have attached my bcpackage with my completed JSON File Format.
The formatting is done via jq - http://stedolan.github.io/jq/
Props to Stephen Dolan for the utility: https://github.com/stedolan.
I have sent a message to the folks at Scooter Software asking them to include it in the page with additional formats. If you're interested in seeing it on there, I'm sure a quick reply to the thread with an up-vote would help them see the value in posting it.
I have a small GPL project that would do the trick for simple JSON. I have not added support for nested entities, as it is more of a simple ObjectDB solution and not actually JSON (despite the fact that it was clearly inspired by it).
Long and short, the API is pretty simple: make a new group, populate it, and then pull a subset via whatever logical parameters you need.
https://github.com/danielbchapman/groups
The API is used basically like ->
SubGroup items = group
.notEqual("field", "value")
.lessThan("field2", 50); //...etc...
There's actually support for basic unions and joins, which would do pretty much what you want.
Long and short, you probably want a Set as your data type. Considering your comparisons are probably complex, you need a more complex set of methods.
My only caution is that it is GPL. If your data is confidential, odds are you may not be interested in that license.

What's the most intelligent way to parse a command's stdout into a usable javascript object

I'm writing a Node.js application where I need to do a lot of parsing of stdout for various commands into JavaScript objects that I can send to the browser over a WebSocket connection.
Let's use ping as an example. I want to send back the stdout line
64 bytes from ip.isp.com (123.123.123.123): icmp_seq=2 ttl=53 time=7.92 ms
as an object like
{
'icmp_seq': 2,
'ttl': 53,
'time': '7.92 ms'
}
There are a number of different commands I want to do this to, including nmap, so I want to make sure I'm doing it as efficiently and intelligently as possible. My plan right now is to just do splits and regex matches, but I want to make sure I'm not missing something.
Try this regular expression:
^(?<Size>\d+) bytes from (?<DestinationHost>[^\s]+) \((?<DestinationIP>.+?)\): icmp_seq=(?<ICMPSequence>\d+) ttl=(?<TTL>\d+) time=(?<Time>.+)$
Run this with the multi-line option. You will have to tweak it to work with all output you may receive but for the line you posted it will work.
Once you have the result you can pull each match out into its own variable or into a JSON object.
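For example, here is a minimal Node sketch using that pattern with JavaScript named capture groups (ES2018+, so Node 10 is fine); the group names come from the regex above:

const line = '64 bytes from ip.isp.com (123.123.123.123): icmp_seq=2 ttl=53 time=7.92 ms';

const pingLine = /^(?<Size>\d+) bytes from (?<DestinationHost>[^\s]+) \((?<DestinationIP>.+?)\): icmp_seq=(?<ICMPSequence>\d+) ttl=(?<TTL>\d+) time=(?<Time>.+)$/m;

const match = line.match(pingLine);
if (match) {
  // Named capture groups land on match.groups
  const result = {
    icmp_seq: Number(match.groups.ICMPSequence),
    ttl: Number(match.groups.TTL),
    time: match.groups.Time,
  };
  console.log(result); // { icmp_seq: 2, ttl: 53, time: '7.92 ms' }
}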
Splits and regex matches are probably what I'd do, at least for fairly simple commands like ping. For anything significantly more complex, you may have to create a rudimentary (or even non-rudimentary) parser.
Back a million years ago, I developed some monitoring software (Tivoli) that used CLI commands to collect system info. What I did was make heavy use of "awk" at the end of a pipe from the command output. The line-oriented nature of native CLI tools like that is (sometimes) an easy way of chopping up CLI output for "screen scraping" purposes.
Many 'beefy' command line utilities will have special output modes which are intended to be easily parsed. For instance, I know nmap has -oX for XML output and it will hopefully be straightforward to convert that to JSON.
Here is more info on nmap:
http://nmap.org/book/man-output.html
So I highly recommend researching each command line utility and looking for "parse intended" output options.
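As a rough sketch of that route (assuming the xml2js npm package and scanme.nmap.org as a placeholder target; the XML-to-JSON step could be done with any XML parser):

const { execFile } = require('child_process');
const { parseStringPromise } = require('xml2js'); // npm install xml2js

// -oX - writes nmap's XML report to stdout instead of a file.
execFile('nmap', ['-oX', '-', 'scanme.nmap.org'], { maxBuffer: 10 * 1024 * 1024 }, (err, stdout) => {
  if (err) return console.error(err);
  parseStringPromise(stdout)
    .then(report => console.log(JSON.stringify(report, null, 2)))
    .catch(console.error);
});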
If you do go the regex route, be sure to be as forgiving as possible. Rather than capture the entire stdout buffer and attempt to parse the entire thing as one global match, I would attempt to grab specific sub-patterns for the info you're looking for, but YMMV.
But long story short, don't reinvent the wheel here. I expect that for every different util there is at least one page/thread online where someone has done this already.
