I just started trying JavaScript and I'm struggling. My end result is supposed to be a chronological timeline of activities (Calls, Meetings, Tasks).
I'm receiving a file from an application with different types of records. It contains Calls, Meetings, and Tasks. When I receive the file, they are in no particular order and each record type has different fields. I need to get them into the same table but sorted by date.
Here is a sample file with a Call and a Task but it might have 10 or more records of differing type.
[
{
"Owner": {
"name": "Raymond Carlson",
},
"Check_In_State": null,
"Activity_Type": "Calls",
"Call_Start_Time": "2022-10-23T20:00:00-05:00",
"$editable": true,
"Call_Agenda": "Need to call and discuss some upcoming events",
"Subject": "Call scheduled with Florence"
},
{
"Owner": {
"name": "Raymond Carlson",
},
"Check_In_State": null,
"Activity_Type": "Tasks",
"Due_Date": "2022-10-24",
"$editable": true,
"Description": "-Review Action Items from Last week”,
"Subject": "Complete Onboarding"
}
]
This is what I'm doing now and I know it's not the best way to go about it.
for (var i = 0; i < obj.length; i++) {
    var activityType = obj[i].Activity_Type;
    var activityOwner = obj[i].Owner.name;
    var activityDate, activityTitle, activityDesc;
    if (activityType == "Events") {
        var start = obj[i].Start_DateTime;
        var startDate = new Date(start);
        activityDate = startDate.toLocaleString();
        activityTitle = obj[i].Subject;
        activityDesc = obj[i].Description;
    }
    else if (activityType == "Tasks") {
        var dueDate = obj[i].Due_Date;
        activityDate = dueDate;
        activityTitle = obj[i].Subject;
        activityDesc = obj[i].Description;
    }
    else if (activityType == "Calls") {
        var callStart = obj[i].Call_Start_Time;
        var callStartTime = new Date(callStart);
        activityDate = callStartTime.toLocaleString();
        activityTitle = obj[i].Subject;
        activityDesc = obj[i].Call_Agenda;
    }
}
So regardless of the type of record, I have an
activityOwner,
activityDate,
activityTitle,
activityDesc,
and that's what I need.
Aside from the code above needing work, my question now is: what do I need to do with these values for each record to put them in order by activityDate? Do I need to put them back into an array and then sort, and if so, what's the best approach?
Thank you much!
Right now I'm not really sure what your end goal is: is it sorting by activity type or by activity date?
If it's the latter, you can try referring to this answer; if it's sorting by activity type, you could sort by the ASCII value of each type's starting letter (e.g. "C" in "Call", "T" in "Tasks", etc.).
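If chronological order is the goal, one approach is to normalize every record into the same shape first and then sort the resulting array by date. Here is a minimal sketch (building on the loop above; it assumes the Due_Date and timestamp fields all parse cleanly with new Date()):
// Collect one normalized object per record, then sort chronologically.
var activities = [];
for (var i = 0; i < obj.length; i++) {
    var record = obj[i];
    var activity = {
        activityOwner: record.Owner.name,
        activityTitle: record.Subject
    };
    if (record.Activity_Type === "Tasks") {
        activity.activityDate = new Date(record.Due_Date);
        activity.activityDesc = record.Description;
    } else if (record.Activity_Type === "Calls") {
        activity.activityDate = new Date(record.Call_Start_Time);
        activity.activityDesc = record.Call_Agenda;
    } else if (record.Activity_Type === "Events") {
        activity.activityDate = new Date(record.Start_DateTime);
        activity.activityDesc = record.Description;
    }
    activities.push(activity);
}
// Sort by the Date objects (subtraction compares their millisecond values);
// keep them as Dates until display time, then format with toLocaleString().
activities.sort(function (a, b) {
    return a.activityDate - b.activityDate;
});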
Related
I need some help with a small thing I am struggling with. I have to create a general search input that searches through a JSON of music numbers. The user has to be able to type an album/track or artist in the search bar and then they'll get the result, like any other search bar does. Only this one searches based on keypresses instead of a submit button.
The part where I'm stuck is that I've received a large JSON file with more than 5000 entries, and my search bar has to be able to identify entries based on partially typed "keywords". So for instance if I want to search for madonna and I type in "mado" I should already get some madonna results (of course it's possible to get other entries that have "mado" in their title or something!).
Sorry for my lack of good grammar, but I'll try my best to explain the situation as well as possible!
Now for the question! The thing I'm struggling with is how to loop through a JSON file to search for these keywords. This is a small portion of the JSON I receive:
{
"1": {
"track": "Dani California",
"artist": "Red Hot Chili Peppers",
"album": "Stadium Arcadium"
},
"2": {
"track": "Tell me baby",
"artist": "Red Hot Chili Peppers",
"album": "Stadium Arcadium"
},
"3": {
"track": "Snow (Hey Oh)",
"artist": "Red Hot Chili Peppers",
"album": "Stadium Arcadium"
}}
Normally I would create a function that is something like this:
for(var i = 0; i < data.length; i++){
if(data[i].album == 'red hot'){
console.log(data[i].album)
}}
But in this case I want to loop through the JSON, looking for entries that contain the search value, and save them to an object for later usage.
Is it possible to do this all at once? That is, to check the artist/title/album at once, or would it be better to create a small filter?
If something is not clear about my explanation, please let me know; I tried my best to be as clear as I could be!
I don't think searching 5000 entries should cause performance issues.
Check out this code, which should return the desired entries when you call search('text'):
var data = JSON.parse('JSON DATA HERE') // dataset
var search_fields = ['track','artist','album'] //key fields to search for in dataset
function search(keyword){
    if(keyword.length<1) // skip if input is empty
        return
    var results = []
    for(var i in data){ // iterate through the dataset
        var best = 0
        for(var u=0;u<search_fields.length;u++){ // check every searchable field of this entry
            var rel = getRelevance(data[i][search_fields[u]],keyword)
            if(rel>best) // keep the highest relevance found across the fields
                best = rel
        }
        if(best==0) // no field matched...
            continue // ...skip this entry
        results.push({relevance:best,entry:data[i]}) // match found, store the entry once with its best relevance
    }
    results.sort(compareRelevance) // sort by relevance, highest first
    for(var r=0;r<results.length;r++){
        results[r] = results[r].entry // strip relevance since it is no longer needed
    }
    return results
}
function getRelevance(value,keyword){
value = value.toLowerCase() // lowercase to make search not case sensitive
keyword = keyword.toLowerCase()
var index = value.indexOf(keyword) // index of the keyword
var word_index = value.indexOf(' '+keyword) // index of the keyword if it is not on the first index, but a word
if(index==0) // value starts with keyword (eg. for 'Dani California' -> searched 'Dan')
return 3 // highest relevance
else if(word_index!=-1) // value doesn't start with keyword, but has the same word somewhere else (eg. 'Dani California' -> searched 'Cali')
return 2 // medium relevance
else if(index!=-1) // value contains keyword somewhere (eg. 'Dani California' -> searched 'forn')
return 1 // low relevance
else
return 0 // no matches, no relevance
}
function compareRelevance(a, b) {
return b.relevance - a.relevance
}
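For example (assuming the sample dataset above has been parsed into data), a call like this returns the matching entries, highest relevance first:
// Example usage: partial, case-insensitive match against track, artist and album.
var hits = search('sta') // 'Stadium Arcadium' starts with 'Sta', so every sample entry matches via its album
console.log(hits)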
Since it's not an array you can't use Array.prototype.filter() unless you turn your object into an array. You could do this every time you get a new JSON; there's no need to do it with every search.
var myArray = [];
for(var elementName in data) //We iterate over the Object to get the names of the nested objects
myArray.push(data[elementName]); //We get the objects of the json and push them inside our array.
Then you can use .filter to filter your data. I recommend using a regex:
var userQuery = 'Mado' //user input
var myRegex = new RegExp(userQuery, 'i'); //Case-insensitive regex that tests whether the query appears anywhere in a string. (Don't use the 'g' flag with .test(): a global regex keeps lastIndex state between calls and can skip matches.)
var filteredArray = myArray.filter(function(item){
    //We test each property of the item to see if at least one string matches the regex.
    return (myRegex.test(item.track) || myRegex.test(item.artist) || myRegex.test(item.album))
});
filteredArray should be the elements of the json you need.
Array.prototype.filter MDN
RegExp MDN
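One caveat worth adding: if the user's input can contain regex metacharacters (., *, (, and so on), it is safer to escape it before building the RegExp. A small helper along these lines (the function name is just illustrative) is a common pattern:
// Escape regex metacharacters in user input before building a RegExp from it.
function escapeRegExp(text) {
    return text.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}
var safeRegex = new RegExp(escapeRegExp(userQuery), 'i');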
Here's a pattern I often use for filter functionalities. Some key points are:
Always build an index property that contains the appended string of the filterable values. For example, if the values of 'track', 'artist' and 'album' can be filtered on, then join their values into a single string and add that string as an extra property on the original object.
This lets the filter do a quick indexOf on one string, rather than iterating over each property of each object when filtering. It significantly improves performance because those extra iterations and the (number of properties × number of objects) comparisons are no longer required. In your case, that's roughly 10k comparisons and 15k iterations saved on every filter operation.
If the filter operation is case-insensitive, use toLowerCase while appending the values to build the indexes. That also saves you from performing all those toLowerCase operations every time filter is called.
Always create an array of objects rather than an object with object properties. I don't have specific stats on whether this improves performance, but it gives you array methods such as array.filter or array.sort that you could use to improve the UX. I haven't done that in the snippet below, but you can do it quite easily while preparing the data (see the sketch after the snippet).
var data = {
"1": {
"track": "Dani California",
"artist": "Red Hot Chili Peppers",
"album": "Stadium Arcadium"
},
"2": {
"track": "Tell me baby",
"artist": "Red Hot Chili Peppers",
"album": "Stadium Arcadium"
},
"3": {
"track": "Snow (Hey Oh)",
"artist": "Red Hot Chili Peppers",
"album": "Stadium Arcadium"
}};
// One time activity!
// Build search indexes, for every object.
for(var prop in data) {
if(data.hasOwnProperty(prop)) {
var index = "";
var item = data[prop];
// Iterate over each object and build the index by appending the values of each property.
for(var attr in item) {
if(item.hasOwnProperty(attr)) {
// Note: different values are separated by a hash (#), since a hash is unlikely to appear in a search query.
index = index + item[attr] + "#";
}
}
// Insert the index property into the object.
// Also notice the toLowerCase that allows for case-insensitive searches later on.
item.index = index.toLowerCase();
}
}
console.log("Prepared data:" ,data);
// Filter process.
var key = "Sn";
var keyLowerCase = key.toLowerCase();
// Iterate over the objects and check whether the index property contains the search string.
var filteredData = [];
for(var prop in data) {
if(data.hasOwnProperty(prop)) {
var item = data[prop];
if(item.index.indexOf(keyLowerCase) >= 0 ){
filteredData.push(item);
}
}
}
console.log("Filtered data:", filteredData);
{
"purchaseorder": {
"Meta": [{
"InterchangeControlNumber": ["000004010"],
"GroupControlNumber": ["833"],
"DocumentControlNumber": ["5952"],
"InterchangeSenderID": ["009483355"],
"InterchangeReceiverID": ["076627454"],
"GroupSenderID": ["009483355"],
"GroupReceiverID": ["076627454"]
}],
"header": [{
"address": [{
"home": [{
"hno": ["134/123"],
"contact": ["898944343"],
"pin": ["272272"],
"district": ["telangana"]
}],
"office": [{
"hno": ["456/789"],
"contact": ["440838272"],
"pin": ["55833"],
"district": ["delhi"]
}]
}]
}]
}
}
I have JSON data like the above.
I want to store it in a database with meta, home and office as table names, with the respective keys as columns and the values as cell data.
How do I extract the data from the JSON in Node.js or JavaScript?
The JSON data can be anything. I can pass the name of the required table to be created; it should be automated.
I'm afraid I don't know too much about SQL Server, but hopefully this should get you going along the right lines on the JavaScript side of things.
Convert JSON to Javascript Object
You can turn the JSON data into a key-value accessible object, and then access meta, home and office, as follows:
var data = JSON.parse(jsonData);
var meta = data.purchaseorder.Meta[0];
var home = data.purchaseorder.header[0].address[0].home[0];
var office = data.purchaseorder.header[0].address[0].office[0];
Then to pull out specific values you can do, for example:
var homeContact = home.contact[0]; // Will return "898944343" (each value in this sample JSON is a single-element array).
Retrieve Column Names
If, as you say, the data can change, one way that you could derive column names is this:
var metaColumns = Object.keys(meta);
var homeColumns = Object.keys(home);
var officeColumns = Object.keys(office);
Note: JavaScript objects do not necessarily preserve property ordering, so these columns may not be in the same order as they appear in the data.
Create Table and Insert Data
You can put together corresponding arrays of column names and values for this purpose:
var allColumns = metaColumns.concat(homeColumns, officeColumns); // concat, not +, so we get one combined array rather than a string
var allValues = [];
for (let col of metaColumns) {
allValues.push(meta[col]);
}
for (let col of homeColumns) {
allValues.push(home[col]);
}
for (let col of officeColumns) {
allValues.push(office[col]);
}
From here you can build your CREATE TABLE and INSERT SQL statements, with the column names and values as parameters, using a suitable node.js library for SQL Server, such as this.
Be aware that if this data comes from an untrusted source you should make sure to sanitise it properly to prevent the possibility of SQL injection.
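To give a feel for that last step, here is a rough sketch of building a CREATE TABLE statement from one of the derived column arrays. The buildCreateTable helper and the identifier whitelist are illustrative assumptions rather than part of any particular SQL Server library, and the actual INSERT values should still go through the library's parameter binding:
// Illustrative only: build a CREATE TABLE statement from derived column names.
// Identifiers cannot be bound as parameters, so validate them strictly instead.
function buildCreateTable(tableName, columns) {
    var safeName = /^[A-Za-z_][A-Za-z0-9_]*$/;
    var allSafe = columns.every(function (c) { return safeName.test(c); });
    if (!safeName.test(tableName) || !allSafe) {
        throw new Error('Unsafe table or column name');
    }
    var columnDefs = columns.map(function (c) {
        return '[' + c + '] NVARCHAR(255)'; // assume text columns; adjust types as needed
    });
    return 'CREATE TABLE [' + tableName + '] (' + columnDefs.join(', ') + ')';
}
// e.g. buildCreateTable('home', homeColumns) returns:
// CREATE TABLE [home] ([hno] NVARCHAR(255), [contact] NVARCHAR(255), [pin] NVARCHAR(255), [district] NVARCHAR(255))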
I have some JSON data that I'm trying to extract.
I'm trying to extract and reformat some of the data to an array. I'm looping through the data, but am having problems extracting the data within a nested submeaning object.
JSON data:
var data = [
{
"meaning": "a procedure intended to establish the quality, performance, or reliability of something, especially before it is taken into widespread use.",
"examples": [
"no sparking was visible during the tests"
],
"submeanings": [
{
"meaning": "a short written or spoken examination of a person's proficiency or knowledge.",
"examples": [
"a spelling test"
]
},
{
"meaning": "an event or situation that reveals the strength or quality of someone or something by putting them under strain.",
"examples": [
"this is the first serious test of the peace agreement"
]
},
{
"meaning": "an examination of part of the body or a body fluid for medical purposes, especially by means of a chemical or mechanical procedure rather than simple inspection.",
"examples": [
"a test for HIV",
"eye tests"
]
},
{
"meaning": "a procedure employed to identify a substance or to reveal the presence or absence of a constituent within a substance."
}
]
},
{
"meaning": "a movable hearth in a reverberating furnace, used for separating gold or silver from lead."
}
]
Algorithm:
// array to hold definitions
var definitions = [];
for (var i = 0; i < data.length; i++) {
// push first
definitions.push(data[i]['meaning']);
// push second, if submeaning data exists
if (data[i]['submeanings'].length >= 1) {
definitions.push(data[i]['submeanings'][i]['meaning']);
}
}
When I run this code, I'm getting the following error:
Uncaught TypeError: Cannot read property 'length' of undefined(…)
Any help is appreciated.
Check if submeanings exists before you ask for its length.
// push second, if submeaning data exists
if (data[i] && data[i]['submeanings'] && data[i]['submeanings'].length >= 1) {
definitions.push(data[i]['submeanings'][i]['meaning']);
}
You have to check if the objects have submeanings before checking their length. Change
if (data[i]['submeanings'].length >= 1)
to
if (data[i]['submeanings'] && data[i]['submeanings'].length >= 1)
Furthermore, if there are multiple submeanings, you'll need a separate loop to extract the submeanings, like so:
if (data[i]['submeanings'] && data[i]['submeanings'].length >= 1) {
for(var j = 0; j < data[i]['submeanings'].length; j++) {
definitions.push(data[i]['submeanings'][j]['meaning']);
}
}
Keeping track of multiple indices is difficult though, so I'd suggest looking into using forEach instead: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/forEach
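For reference, a minimal sketch of the forEach version (building on the same data and definitions array; only a suggestion, not the only way) could look like this:
// Collect the top-level meaning plus any submeanings, without manual index bookkeeping.
var definitions = [];
data.forEach(function (entry) {
    definitions.push(entry.meaning);
    (entry.submeanings || []).forEach(function (sub) {
        definitions.push(sub.meaning);
    });
});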
That's because you don't have the attribute submeanings in the second object of your JSON data, so you should check whether the object has the property using hasOwnProperty() and then get it:
if (data[i].hasOwnProperty('submeanings')) {
definitions.push(data[i]['submeanings'][i]['meaning']);
}
Hope this helps.
Any help would be greatly appreciated!
Here's what I've done so far.
I have an index.html page and a notice-details.html page. On the index.html page, I have a data grid and I'm displaying 'title' from the JSON file. I hyperlinked the title to details.html and also added the 'number' so each title href is unique. The data template for the linked title in the data grid looks like this:
{{title}}
On the notice-details.html page I'm trying to capture the query string parameter and display the associated key/value pairs that match the 'number' in the JSON array. So if I land on notice-details.html?id=2012-001 I want to display on that page the title, award claim due date, award claim forms, and date posted associated with 2012-001 in the JSON.
I'm stuck on how to match the number and querying only the matched content.
JSON:
{
"notices": [
{
"number": "2012-001",
"link": "google.com",
"title": "sample title",
"awardClaimDueDate": "",
"awardClaimForms": "",
"datePosted": "1/31/2012"
},
{
"number": "2012-001",
"link": "google.com",
"title": "sample title",
"awardClaimDueDate": "",
"awardClaimForms": "",
"datePosted": "1/31/2012"
}
]
}
JS:
function jsonParser(json){
$('#load').fadeOut();
$.getJSON('notices.json',function(data){
// Parse ID param from url
var noticeParamID = getParameterByName('id');
$.each(data.notices, function(k,v){
var noticeNumber = v.number,
noticeTitle = v.title,
claimDueDate = v.awardClaimDueDate,
claimForms = v.awardClaimForms,
date = v.datePosted;
if(noticeParamID == noticeNumber){
// how can I display content that matches the url param value (noticeURLNumber)?
}
});
});
}
// get URL parameter by name
function getParameterByName(name) {
name = name.replace(/[\[]/, "\\[").replace(/[\]]/, "\\]");
var regex = new RegExp("[\\?&]" + name + "=([^&#]*)"),
results = regex.exec(location.search);
return results === null ? "" : decodeURIComponent(results[1].replace(/\+/g, " "));
}
I would personally make available a 2nd JSON data structure that allows you to make a lookup based on the notice number rather than have to iterate over an array of notices as you would have to do with your current data structure. For example, if you had JSON structured like this:
{
"2012-001": {
"number": "2012-001",
"link": "google.com",
"title": "sample title",
"awardClaimDueDate": "",
"awardClaimForms": "",
"datePosted": "1/31/2012"
},
"2012-002": {
"number": "2012-002",
"link": "yahoo.com",
"title": "another title",
"awardClaimDueDate": "",
"awardClaimForms": "",
"datePosted": "3/31/2012"
},
...
}
Then you could easily lookup your data from the object created from the JSON like this (assuming this new JSON file is called noticesByID.json):
var noticeParamID = getParameterByName('id');
// get notice based on this id
$.getJSON('noticesByID.json',function(data){
var notice = data[noticeParamID];
// do something with the notice
// let's say we are updating DOM element with id's the same as notice keys
for (var propKey in notice) {
    $('#'+propKey).html(notice[propKey]);
}
});
You should really strongly consider this approach, as you have already noted you have 1K+ notices that, without a proper data structure to support lookup by notice number, you would have to iterate through to find the notice you were interested in. Your performance would continue to get worse the more notices you have with the iteration approach (it has O(n) complexity where n is number of notices). With the approach I have presented here, you would always have an O(1) operation.
I would also begin to think about whether you really, truly need to make this data available via static JSON files. This requires you to download the entire file just so the user can get to a single record. That is a lot of wasted bandwidth and potential response lag in the UI. If you are going for a fully static site, perhaps you have to live with this, or think about making a bunch of individual JSON files (e.g. 2012-001.json) to minimize the download. Alternately, if you already have a dynamic site or are not opposed to one, then you could look at databases and other things that could simplify your data lookup problems.
Per comments, here is suggestion on converting existing JSON to new JSON format needed for this approach:
var sourceJSON = '{"notices":...}'; // your source JSON
var sourceObj = JSON.parse(sourceJSON);
var notices = sourceObj.notices;
var targetObj = {}; // the target object you will load notice data into
// fill targetObj with data structure you want
for(var i = 0; i < notices.length; i++) {
// we use the `number` property of the notice as key
// and write the whole notice object to targetObj at that key
targetObj[notices[i].number] = notices[i];
}
// create new JSON from targetObj
newJSON = JSON.stringify(targetObj);
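If this conversion runs as a one-time Node.js script (an assumption on my part; the snippet above works just as well in the browser), you can write the result straight to the new file:
// One-time conversion step (Node.js): write the re-keyed JSON to disk.
var fs = require('fs');
fs.writeFileSync('noticesByID.json', JSON.stringify(targetObj, null, 2));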
I have a field in my documents, that is named after its timestamp, like so:
{
_id: ObjectId("53f2b954b55e91756c81d3a5"),
domain: "example.com",
"2014-08-07 01:25:08": {
A: [
"123.123.123.123"
],
NS: [
"ns1.example.com.",
"ns2.example.com."
]
}
}
This is very impractical for queries, since every document has a different timestamp.
Therefore, I want to rename this field, for all documents, to a fixed name.
However, I need to be able to match the field names using regex, because they are all different.
I tried doing this, but this is an illegal query.
db['my_collection'].update({}, {$rename:{ /2014.*/ :"201408"}}, false, true);
Does someone have a solution for this problem?
SOLUTION BASED ON NEIL LUNN'S ANSWER:
conn = new Mongo();
db = conn.getDB("my_db");
var bulk = db['my_coll'].initializeOrderedBulkOp();
var counter = 0;
db['my_coll'].find().forEach(function(doc) {
for (var k in doc) {
if (k.match(/^2014.*/) ) {
print("replacing " + k)
var unset = {};
unset[k] = 1;
bulk.find({ "_id": doc._id }).updateOne({ "$unset": unset, "$set": { WK1: doc[k]} });
counter++;
}
}
if ( counter % 1000 == 0 ) {
bulk.execute();
bulk = db['my_coll'].initializeOrderedBulkOp();
}
});
if ( counter % 1000 != 0 )
bulk.execute();
This is not a mapReduce operation, not unless you want a new collection that consists only of the _id and value fields that are produced from mapReduce output, much like:
"_id": ObjectId("53f2b954b55e91756c81d3a5"),
"value": {
"domain": "example.com",
...
}
}
Which at best is a kind of "server side" reworking of your collection, but of course not in the structure you want.
While there are ways to execute all of the code in the server, please don't try to do so unless you are really in a spot. These ways generally don't play well with sharding anyway, which is usually where people "really are in a spot" for the sheer size of records.
When you want to change things and do it in bulk, you generally have to "loop" the collection results and process the updates while having access to the current document information. That is, in the case where your "update" is "based on" information already contained in fields or structure of the document.
There is therefore no "regex replace" operation available, and there certainly is not one for renaming a field. So let's loop with bulk operations for the "safest" form of doing this without running the code all on the server.
var bulk = db.collection.initializeOrderedBulkOp();
var counter = 0;
db.collection.find().forEach(function(doc) {
for ( var k in doc ) {
if ( k.match(/^2014.*/) ) { // match against the field name, not its value
    var update = { "$unset": {}, "$set": {} }; // initialize both operators before assigning keys
update["$unset"][k] = 1;
update["$set"][ k.replace(/(\d+)-(\d+)-(\d+).+/,"$1$2$3") ] = doc[k];
bulk.find({ "_id": doc._id }).updateOne(update);
counter++;
}
}
if ( counter % 1000 == 0 ) {
bulk.execute();
bulk = db.collection.initializeOrderedBulkOp();
}
});
if ( counter % 1000 != 0 )
bulk.execute();
So the main things there are the $unset operator to remove the existing field and the $set operator to create the new field in the document. You need the document content to examine and use both the "field name" and "value", so hence the looping as there is no other way.
If you don't have MongoDB 2.6 or greater on the server then the looping concept still remains without the immediate performance benefit. You can look into things like .eval() in order to process on the server, but as the documentation suggests, it really is not recommended. Use with caution if you must.
As you already recognized, value-keys are indeed very bad for the MongoDB query language. So bad that what you want to do doesn't work.
But you could do it with a MapReduce. The map and reduce functions wouldn't do anything, but the finalize function would do the conversion in JavaScript.
Or you could write a little program in a programming language of your choice which reads all documents from the collection, makes the change, and writes them back using collection.save.
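As a rough sketch of that last suggestion, the same read, modify, write-back loop can also be run directly in the mongo shell, without the bulk API (simpler, but one round trip per document; the WK1 target field name follows the asker's solution above):
// Simple (non-bulk) version: move each timestamp-named field under a fixed name and save.
db.my_coll.find().forEach(function (doc) {
    var changed = false;
    for (var k in doc) {
        if (k.match(/^2014.*/)) {
            doc.WK1 = doc[k];   // copy the value under the fixed field name
            delete doc[k];      // drop the timestamp-named field
            changed = true;
        }
    }
    if (changed) {
        db.my_coll.save(doc);   // overwrite the stored document with the reshaped one
    }
});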