I would like to parse the following string:
{
date: [ 'Thu, 28 Apr 2016 10:56:13 +0200' ],
subject: [ 'Subject' ],
from: [ 'Blob <blob#test.com>' ],
to: [ '<blab#test.com>' ]
}
in order to access the variables date, subject, etc.
I am not sure how to do it, because:

It is not valid JSON.
It is not a structure I recognize.
I don't want to reinvent the wheel if a solution already exists that I am not (yet) aware of.

Any ideas?
EDIT
The data is obtained using the node-imap module (only the relevant part is shown):
f.on('message', function(msg, seqno) {
  console.log('Message #%d', seqno);
  var prefix = '(#' + seqno + ') ';
  msg.on('body', function(stream, info) {
    var buffer = '';
    stream.on('data', function(chunk) {
      buffer += chunk.toString('utf8');
    });
    stream.once('end', function() {
      // inspect comes from require('util').inspect
      var parsedHeader = inspect(Imap.parseHeader(buffer));
      console.log('Author: ' + parsedHeader);
    });
  });
});
SOLVED
See the comment by @stdob--: Imap.parseHeader() returns an object.
It looks like Imap.parseHeader already returns an object with keys.
Try console.log(Object.keys(parsedHeader)) to see all the keys.
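For example, given the header dump from the question, the fields can be read straight off that object. A sketch, assuming the field names shown in the dump above:

```javascript
// Sketch of the shape Imap.parseHeader produces: a plain object whose
// values are arrays of strings (field names taken from the dump above).
var parsedHeader = {
  date: ['Thu, 28 Apr 2016 10:56:13 +0200'],
  subject: ['Subject'],
  from: ['Blob <blob#test.com>'],
  to: ['<blab#test.com>']
};

// Each field is an array, so take the first element:
var date = parsedHeader.date[0];
var subject = parsedHeader.subject[0];

console.log(Object.keys(parsedHeader)); // [ 'date', 'subject', 'from', 'to' ]
```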
Related
I am trying to use a ParameterizedQuery with SQL Names inside its text parameter.
I know the docs state that this parameter must be a string or a QueryFile.
Basically, what I'd like to do is something like:
import pgPromise from 'pg-promise';
const pgp = pgPromise();
const pq = new pgp.ParameterizedQuery({
text: `
SELECT $1:name from my_table
where $2:name = $3;
`,
rowMode: 'array'
});
const params = {user_col: 'user', id_col: 'id', id_value: 'XXX'};
try {
return await this.db.any(pq, Object.values(params));
} catch (e) {
console.error(e);
return e;
}
What I get is an error like:
QUERY: {
[start:run] text: '\n' +
[start:run] ' SELECT $1:name from my_table\n' +
[start:run] ' where $2:name = $3;\n' +
[start:run] ' ',
[start:run] values: [ 'user', 'id', 'XXX' ],
[start:run] rowMode: 'array'
[start:run] }
[start:run] error: syntax error at or near ":"
Is it possible to use :name (or ~) inside a ParameterizedQuery? The thing is, I really want my query result to be an array of rows rather than an array of row objects, and setting ParameterizedQuery's rowMode parameter to 'array' seems to be the only way I can do this.
From the Formatting Filters docs:
Note that formatting filters work only for normal queries, and are not available within PreparedStatement or
ParameterizedQuery, because those are, by definition, formatted on the server side.
That note in the official documentation was added as a result of this question, because the issue had come up a few times before. Hopefully, it will be clearer from now on.
I would like to store accounts in a JSON file.
Something like: accounts{[user: user1, email: email1], [user: user2, email: email2]}
JavaScript file
Accounts = {
Nickname: form.getElementsByTagName('input')[0].value,
Email: form.getElementsByTagName('input')[3].value
};
var json = JSON.stringify(Accounts);
fs.appendFile('Accounts.json', json, function(err){});
When I add a second user, this code appends a new object, so the file looks like this:
JSON file
{"NickName": "user1", "Email": "Email1"}{"NickName": "user2", "Email": "Email2"}
Then, when I try to read the file and parse it, I receive an Unexpected token { error.
I think the JSON file should look like:
{
{"Name":"value"},
{"Name2":"value2"}
}
I fixed the issue.
Here is the solution, based in part on Michael Brune's answer.
// Read Accounts.json
const Accounts = fs.readFileSync('Accounts.json', 'utf-8');

// Check if the JSON file is empty
if (Accounts.length !== 0) {
  var ParsedAccounts = JSON.parse(Accounts);
} else {
  ParsedAccounts = [];
}

// Account is the new account object built from the form
ParsedAccounts.push(Account);

const NewData = JSON.stringify(ParsedAccounts, null, 4);

// Write the new data to Accounts.json
fs.writeFileSync('Accounts.json', NewData);
Basically, I push the new data in ParsedAccounts then I write it in the json file.
Maybe there is another way, but if your file is pretty small, you can try it like this:
Accounts.json:
{
"other": "other data"
}
index.js
const fileData = fs.readFileSync('Accounts.json');
let parsedFileData;

try {
  parsedFileData = JSON.parse(fileData);
} catch (error) {
  if (error instanceof SyntaxError) {
    // Create an empty object to append to, since parsing did not succeed
    parsedFileData = {};
    // Create a backup of the old, possibly corrupt file
    fs.writeFileSync('Accounts.backup.json', fileData);
  }
}

// newDataToAppend is the object holding the new account data
const newFileData = JSON.stringify({
  ...parsedFileData,
  ...newDataToAppend,
}, null, 4);

fs.writeFileSync('Accounts.json', newFileData);
First, parse the file.
Assign the new and old data to one object.
Then convert that to a string with JSON.stringify.
The null and 4 arguments are there to write a nicely formatted file.
Then write it back to the file directly.
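As a quick illustration of those (value, replacer, space) arguments to JSON.stringify, with a made-up object:

```javascript
// The third argument (4) indents each nesting level by 4 spaces;
// the second (null) means no replacer function is used.
var pretty = JSON.stringify({ a: 1, b: [2, 3] }, null, 4);
console.log(pretty);
// {
//     "a": 1,
//     "b": [
//         2,
//         3
//     ]
// }
```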
Using Node.js I generate the array below (2) by parsing multiple JSON files for a particular value.
My JSON files contain a list of IDs with their status: isAvailable or undefined.
In my code I parse all my JSON files, but look only at the first ID, and get its availability status as shown in the picture below. When it's true, it means the ID is available. As you can see from the file names, each one is the date and hour the JSON was produced.
What I want to achieve is a function, or anything simple, that goes through the array you can see in the picture.
Example:
We can see that the status is available for the first file, so I want to recover the first file name with status available:
{ fileName: '2017-03-17T11:39:36+01:00', Status: Available }
When the status stops being available, which in this example happens at { fileName: '2017-04-06T11:19:17+02:00', contents: undefined }, get:
{ fileName: '2017-04-06T11:19:17+02:00', Status: unavailable }
(2)
So here is part of my code where I generate this array :
Promise.mapSeries(filenames, function(fileName) {
  var contents = fs
    .readFileAsync("./availibility/" + fileName, "utf8")
    .catch(function ignore() {});
  return Promise.join(contents, function(contents) {
    return {
      fileName,
      contents
    };
  });
}).each(function(eachfile) {
  if (eachfile.contents) {
    var jsonobject = JSON.parse(eachfile.contents);
    if (jsonobject && jsonobject.hasOwnProperty('messages'))
      eachfile.contents = jsonobject.messages[0].message.results[1].isAvailable;
  }
  eachfile.fileName = eachfile.fileName.substring('revision_'.length, eachfile.fileName.length - 5);
  console.log(eachfile);
});
Can someone help me, please?
Thank you.
Suppose you have an array:
[
{
filename : "2017-03-23 00:00:00",
contents : true
},
...
{
filename : "2017-03-23 00:00:00",
contents : undefined
},
{
filename : "2017-03-23 00:00:00",
contents : undefined
},
{
filename : "2017-03-23 00:00:00",
contents : true
}
]
where the ... represents a long stream of objects where the contents value is true.
You want to end up with a list of objects without consecutive objects having the same contents value, meaning the result would look like:
[
{
filename : "2017-03-23 00:00:00",
contents : true
},
{
filename : "2017-03-23 00:00:00",
contents : undefined
},
{
filename : "2017-03-23 00:00:00",
contents : true
}
]
I'm going to use jQuery because it's a JavaScript library I am familiar with, but you should be able to translate it easily to whatever framework you're using.
function doit(dataArray) {
  var resultList = [];
  var currentContents = "";
  $.each(dataArray, function(index, value) {
    // The objects use a `contents` property, so compare against that.
    if (currentContents !== value.contents) {
      resultList.push(value);
      currentContents = value.contents;
    }
  });
  console.log(resultList);
}
Note that you need an array with data like the data in your picture; currently, however, you print every row. You may need to push those rows into a new array first, and then pass that array to this function.
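The same idea works in plain JavaScript with no jQuery dependency. A sketch with hypothetical sample data (the function and file names are mine, not from the question):

```javascript
// Keep an entry only when its `contents` value differs from the
// previous entry's, dropping consecutive duplicates.
function dropConsecutiveDuplicates(dataArray) {
  var resultList = [];
  var currentContents;
  var seenAny = false;
  dataArray.forEach(function (value) {
    if (!seenAny || value.contents !== currentContents) {
      resultList.push(value);
      currentContents = value.contents;
      seenAny = true;
    }
  });
  return resultList;
}

var input = [
  { fileName: 'a', contents: true },
  { fileName: 'b', contents: true },
  { fileName: 'c', contents: undefined },
  { fileName: 'd', contents: true }
];

console.log(dropConsecutiveDuplicates(input)); // keeps 'a', 'c', 'd'
```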
I have a file named "file.csv" with the data below:
ID Full name
1 Steve
2 John
3 nam
4 Hạnh
5 Thủy
I use the code segment below to parse this file to JSON, but my results are not UTF-8.
Code:
var fastCsv = require("fast-csv");
var fs = require("fs");
var iconv = require('iconv-lite');

var fileStream = fs.createReadStream("file.csv");

fastCsv
  .fromStream(fileStream, { headers: ["id", "full_name"] })
  .on("data", function(data) {
    console.log("------------------------");
    console.log("data: ", data);
  })
  .on("end", function() {
    console.log("done");
  });
Results:
data: { id: '��I\u0000D\u0000', full_name: '\u0000F\u0000u\u0000l\u0000l\u0000 \u0000n\u0000a\u0000m\u0000e\u0000' }
data: { id: '\u00001\u0000',full_name: '\u0000S\u0000t\u0000e\u0000v\u0000e\u0000' }
data: { id: '\u00002\u0000',full_name: '\u0000J\u0000o\u0000h\u0000n\u0000' }
data: { id: '\u00003\u0000',full_name: '\u0000n\u0000a\u0000m\u0000' }
data: { id: '\u00004\u0000', full_name: '\u0000H\u0000�\u001en\u0000h\u0000' }
data: { id: '\u00005\u0000',full_name: '\u0000T\u0000h\u0000�\u001ey\u0000' }
data: { id: '\u0000', full_name: '' }
How do I convert my results to UTF-8?
Your input file is encoded in UTF-16LE, but it has been read as if it were UTF-8.
Try opening the file with fs.createReadStream('file.csv', { encoding: 'utf16le' }).
Take a look at Javascript Has a Unicode Problem
In your case you need to decode the escaped Unicode characters. Node's built-in punycode module can handle this.
Import punycode via:
var punycode = require("punycode");
Change:
console.log("firstName: ", data);
To:
console.log("firstName: ", punycode.ucs2.decode(data));
You might have to break down the data object further to decode its properties, but I can't tell from your question what their structure is.
I'm new to Node.js. I have a JSON object which looks like the following:
var results = [
{ key: 'Name 1', value: '1' },
{ key: 'Name 2', value: '25%' },
{ key: 'Name 3', value: 'some string' },
...
];
The above object may or may not have different values. Still, I need to get them into a format that looks exactly like the following:
{"Name 1":"1","Name 2":"25%","Name 3":"some string"}
In other words, I'm looping through each key/value pair in results and adding it to a single line. From my understanding this single line approach (with double quotes) is called "JSON Event" syntax. Regardless, I have to print my JSON object out in that way into a text file. If the text file exists, I need to append to it.
I do not know how to do this. How do I append to a text file in Node.js?
Thank you!
You can use JSON.stringify to convert a JavaScript object to JSON and fs.appendFile to append the JSON string to a file.
// write all the data to the file
var fs = require('fs');
var str = JSON.stringify(results);
fs.appendFile('file.json', str, function(err) {
if(err) {
console.log('there was an error: ', err);
return;
}
console.log('data was appended to file');
});
If you want to add just one item at a time, just do
// Just pick the first element
var fs = require('fs');
var str = JSON.stringify(results[0]);