Get count of unique values of properties from JSON API response - javascript

I have a JSON API served by a Ruby on Rails backend. One of the endpoints returns an array of objects structured like this:
{
  "title_slug": "16-gaijin-games-bittrip-beat-linux-tar-gz",
  "platform": "Linux",
  "format": ".tar.gz",
  "title": "BIT.TRIP BEAT",
  "bundle": "Humble Bundle for Android 3",
  "unique_games": 9
},
{
  "title_slug": "17-gaijin-games-bittrip-beat-linux-deb",
  "platform": "Linux",
  "format": ".deb",
  "title": "BIT.TRIP BEAT",
  "bundle": "Humble Bundle for Android 3",
  "unique_games": 9
},
Because there are different types of downloads for a single title, the "title" is not unique across several objects. I would like a count of unique titles only.
I was thinking of computing it in the Rails model and sending it in the JSON response, but that does not work because counting requires the whole array, obviously. I am using Angular on the front end, so I am thinking it needs to be done in the controller. I also filter the response in a table and want the count of unique titles to update as the displayed rows change.
Here's a screenshot of the page this is going on to get better perspective. http://i.imgur.com/Iu1Xajf.png
Thank you very much,
Thomas Le
BTW, this is a site I am developing that is not going to be a public website. It is a database site that holds all the data on the bundles I have bought from IndieGala and HumbleBundle. I am not going to make these links available to the public. I am making it more functional than the bare minimum because it is an open source project that I have on GitHub that people can use themselves locally.
Just in case people were wondering why I have Humble Bundle stuff listed on the image.

http://jsfiddle.net/hy7rasp4/
Aggregate your data in an object indexed by the unique key; then you get access to both the duplicates and the count.
var i,
    title,
    uniqueResults = {};

for (i in results) {
  title = results[i].title;
  if (!uniqueResults[title]) {
    uniqueResults[title] = [];
  }
  uniqueResults[title].push(results[i]);
}

Maybe it would be better to restructure your data at the same time, so you can easily retrieve those items later as well as quickly look up the number of titles, e.g. in JavaScript:
// assuming arrayOfObjects
var objectOfTitles = {},
    i;

for (i = 0; i < arrayOfObjects.length; ++i) {
  if (!objectOfTitles.hasOwnProperty(arrayOfObjects[i].title)) {
    objectOfTitles[arrayOfObjects[i].title] = [];
  }
  objectOfTitles[arrayOfObjects[i].title].push(arrayOfObjects[i]);
}

var numberOfTitles = Object.keys(objectOfTitles).length;
// then say you choose a title you want, and you can do
// objectOfTitles[chosenTitle] to get entries with just that title
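On modern engines the unique count can also be taken directly with a Set, which makes it cheap to re-run against whatever rows the table filter currently shows; a minimal sketch (the sample array below is illustrative, shaped like the API response):

```javascript
// Illustrative data shaped like the API response.
var results = [
  { title: "BIT.TRIP BEAT", platform: "Linux", format: ".tar.gz" },
  { title: "BIT.TRIP BEAT", platform: "Linux", format: ".deb" },
  { title: "Another Game", platform: "Windows", format: ".exe" }
];

// A Set keeps each title once, so its size is the unique count.
var uniqueTitles = new Set(results.map(function (r) { return r.title; }));
console.log(uniqueTitles.size); // 2
```

Because it is a one-liner, it can simply be re-evaluated whenever the filter changes the visible rows.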

Related

Pushing and pulling JSON data with a filter from API

I am working on a React project, attempting to create a small todo-list app.
I have my data in a JSON file, currently hosted on jsonbin.io, in a format that looks like this...
{
  "moduleAccess": {
    "tasks": [
      {
        "email": "campbell@yahoo.com",
        "id": 0,
        "task_name": "Call mom",
        "due_date": 44875,
        "completed": true
      },
      {
        "email": "palsner593@gmail.com",
        "id": 1,
        "task_name": "Buy eggs",
        "due_date": 44880,
        "completed": false
      },
      {
        "email": "rob@gmail.com",
        "id": 2,
        "task_name": "Go to dog park",
        "due_date": 44879,
        "completed": false
      }
    ]
  }
}
Currently, I fetch the data using jsonbin.io's API. The data is brought into a variable called Tasks. If a user updates, deletes, or creates a to-do item, all those changes are put back into the Tasks variable, and I can then push those tasks to the server.
What I explained above works fine; the caveat, however, is that I would like to allow multiple users to log in and then pull only the todo items that pertain to their respective email.
Say campbell@yahoo.com is logged in to my app. In this case, in my fetch request, I can specify that I would only like records with campbell@yahoo.com:
async function loadData() {
  const email = 'campbell@yahoo.com'; // the logged-in user
  const newPath = `$..tasks[?(@.email=='${email}')]`;
  console.log(newPath);
  const url = 'https://api.jsonbin.io/v3/b/*binid*?meta=false';
  const response = await fetch(url, {
    method: "GET",
    headers: {
      "X-Master-Key": key,
      "X-JSON-Path": newPath
    }
  });
  const data = await response.json();
  setTasks([...data]); // or whatever
  console.log(tasks);
}
This concept works as well. However, I encounter an issue when pushing my task data back to the server after a user has made changes. The API I am using does not seem to allow specifying a JSON path on writes; X-JSON-Path is only honoured on reads. So when I write data back to the server, all the JSON data is overwritten, regardless of the user.
Does anybody have an alternative way to push/pull user-specific data? I am sorry if the detail I have provided is unnecessary. Not sure what the easiest way to approach this problem is for a react app.
Any help is appreciated. Thanks!
I did a little research into the jsonbin.io API and came up with a solution that might work.
I'm not completely sure it will, but here it is.
When creating a new bin, you can add it to a collection using X-Collection-Id, so you might be able to use the following flow:
When a user registers, create a separate bin for that user's tasks.
Add the user, along with their bin id, to a users collection where you keep all your users.
When the user authenticates, get their bin id using the filters you already use in your code, and store it somewhere in your app for future use.
After this you will be able to fetch a user's tasks by that bin id and modify them, because each user now has a separate bin and you can simply overwrite all of its content.
Hope this works.
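Assuming the jsonbin.io v3 bin-creation endpoint and its X-Master-Key / X-Collection-Id / X-Bin-Name headers (check the docs before relying on them; the key and collection id below are placeholders), the per-user bin step might be sketched as:

```javascript
// Build the request that would create a per-user task bin.
// Everything here is a placeholder except the header names,
// which follow jsonbin.io's v3 documentation.
function buildCreateBinRequest(userEmail, masterKey, collectionId) {
  return {
    url: 'https://api.jsonbin.io/v3/b',
    options: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'X-Master-Key': masterKey,
        'X-Collection-Id': collectionId,
        'X-Bin-Name': 'tasks-' + userEmail
      },
      // Each new user starts with an empty task list.
      body: JSON.stringify({ tasks: [] })
    }
  };
}

var req = buildCreateBinRequest('campbell@yahoo.com', 'my-key', 'my-collection');
// fetch(req.url, req.options) would then create the bin.
```

Updating that user's tasks later is then a plain overwrite of their own bin, which is exactly the operation the API supports.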

Good data structure for storing collection of items for most efficient query DynamoDB

I recently ran into a problem where I had to store a collection of data inside an attribute. There is a restaurantDB table. The table has an attribute (not required) named "groups" that manages the groups created by the restaurant. These groups have several permissions associated with them. A natural way of thinking is that the groups attribute can be of type list, storing a map (key-value pairs) of data like group_name, group_id, and permission (which is itself a list of all permissions):
table: {
  // other fields
  id:
  groups: L: M: { group_name: S, group_id: S, permission: L: S }
}
e.g.
"groups": [
  {
    "name": "OWNER",
    "permission": ["READ", "WRITE"]
  },
  {
    "name": "MANAGER",
    "permission": ["READ"]
  }
]
This works fine for creation and for appending an arbitrary number of groups using the DynamoDB aws-sdk update with UpdateExpression: 'SET #groups = list_append(#groups, :newgroup)'.
However, if I have to do a PATCH request to modify a permission for a group, say MANAGER, how can I retrieve the map object inside the list with key group_name: "MANAGER" without fetching the whole array of groups?
I don't want to patch it by fetching the whole list, because I would first have to query to get hold of the groups attribute, then iterate through the whole array to find group_name: "MANAGER", modify the array once I find it, and finally put the whole list back with an update.
The group names are guaranteed to be unique (however, the client also wants a unique group_id), so I thought of a data structure something like
table: {
  // other fields
  id:
  groups: M: { S: L }
}
e.g.
"groups": {
  "OWNER": ["READ", "WRITE"],
  "MANAGER": ["READ"]
}
Though now I cannot create an arbitrary number of groups in one request (as in a POST), I can add them one at a time. PATCH requests now work fine as:
UpdateExpression: "set #groups.#groupName = :newPermission",
ExpressionAttributeNames: {
  "#groups": "groups",
  "#groupName": groupName, // groupName taken from the request body
},
ExpressionAttributeValues: {
  ":newPermission": permission, // permission taken from the request body
},
I wanted to know if there was a better way to retrieve list types. The documentation says we can retrieve list items by index, but I don't know the index in advance.
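Pieced together, the map-based PATCH goes out as a single update call with no prior read; a sketch assuming the AWS SDK v2 DocumentClient, with a hypothetical table name and key:

```javascript
// Build the UpdateItem parameters for changing one group's
// permissions without reading the groups map first.
// "Restaurants" and the id key are hypothetical names.
function buildPermissionUpdate(restaurantId, groupName, permission) {
  return {
    TableName: 'Restaurants',
    Key: { id: restaurantId },
    UpdateExpression: 'SET #groups.#groupName = :newPermission',
    ExpressionAttributeNames: {
      '#groups': 'groups',       // "groups" needs aliasing; it is safer with reserved words
      '#groupName': groupName    // the map key identifies the group directly
    },
    ExpressionAttributeValues: {
      ':newPermission': permission
    }
  };
}

var params = buildPermissionUpdate('r-42', 'MANAGER', ['READ', 'WRITE']);
// new AWS.DynamoDB.DocumentClient().update(params, callback);
```

Because the group name is the map key, no index or preliminary query is needed, which is the main advantage of the second structure.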

How to navigate nested objects of unknown depth?

I'm making a notetaking app and I've decided to store all the notes and their structure in a JSON file. In JavaScript, I get the JSON with AJAX, parse it, and output it on the website.
My note structure is an array of objects that can be nested, like this (if it is a note, it has a "content" attribute; if it is a folder, it has an array of child objects, which can be empty if the folder is empty):
data = {
  entries: [
    {
      name: "Some note",
      content: "This is a test note"
    },
    {
      name: "folder",
      children: [
        {
          name: "Bread recipe",
          content: "Mix flour with water..."
        },
        {
          name: "Soups",
          children: [
            {
              name: "Pork soup",
              content: "Add meat, onion..."
            },
            {
              name: "Chicken soup",
              content: "....."
            }
          ]
        }
      ]
    }
  ]
}
Listing the root directory is simple; I just loop through the array and output the top-level records:
for (const entry of data.entries) {
  const li = document.createElement("li");
  li.textContent = entry.name;
  if (entry.children) {
    li.className = "folder";
  } else {
    li.className = "file";
  }
  loop.appendChild(li);
}
But what about the folders? How should I proceed in listing the folders if the depth of nesting is unknown? And how do I target a specific folder? Should I add unique IDs to every object so I can filter the array with them? Or should I keep some kind of depth information in a variable all the time?
You're making this more difficult for yourself by saving data to a JSON file. That is not a good approach. What you need to do is design a database schema appropriate for your data and create an API that outputs a predictable pattern of data that your client can work with.
I would suggest having a Folder resource and a Note resource linked through a one-to-many relationship. Each Folder resource can have many associated Note entries, but each Note has only one Folder that it is linked to. I suggest using an ORM, because most make it easy to eager load related data. For instance, if you choose Laravel you can use Eloquent, and then getting all notes for a folder is as easy as:
$folderWithNotes = Folder::with('notes')->where('name', 'school-notes')->get();
Knowing PHP is beside the point. You should still be able to see the logic of that.
If you create a database and build a server-side API to handle your data, you will end up with JSON on your client side that has a predictable format and is easy to work with.
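That said, if the nested JSON stays for now, listing folders of unknown depth is the textbook case for recursion; a minimal sketch using the entries/children shape from the question:

```javascript
// Walk the tree, calling visit(entry, depth) for every
// note and folder regardless of nesting depth.
function walk(entries, visit, depth = 0) {
  for (const entry of entries) {
    visit(entry, depth);
    if (entry.children) {
      walk(entry.children, visit, depth + 1); // recurse into folders
    }
  }
}

const data = {
  entries: [
    { name: "Some note", content: "This is a test note" },
    {
      name: "folder",
      children: [{ name: "Bread recipe", content: "Mix flour with water..." }]
    }
  ]
};

// Collect an indented listing; in the real app you would build <li> nodes instead.
const names = [];
walk(data.entries, (entry, depth) => names.push("  ".repeat(depth) + entry.name));
console.log(names.join("\n"));
```

The same callback can attach unique IDs or build nested DOM lists; the depth argument replaces any manually tracked depth variable.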

Node.js async request inside an Array.forEach not completing before writing a JSON file

I'm making a web scraper Node.js app that harvests job description text from various URLs. I currently have an array of job objects named jobObj; the code cycles through each url, requests the HTML, loads it with the cheerio module, then makes a new object with jobName and jobDesc keys and pushes it onto a new array of objects that is finally written out as a JSON file.
All of this currently works, but the completeness of the written JSON file is very random, and it usually contains only one complete job. I thought this might be because the forEach loop completes much more quickly than the asynchronous request function, so fs.writeFile executes before the request callbacks have finished. I've added a counter to monitor what stage the requests are at and only write the JSON file once counter === jobObj.length, but the JSON file is still incomplete.
I'm new to Node.js; if someone could point out my error it would be greatly appreciated!
var jobObj = [
  {
    id: 1,
    url: "https://www.indeed.co.uk/cmp/Daffodil-IT/jobs/Lead-Junior-Website-Developer-59ea7d446bdf1253?q=Junior+Web+Developer&vjs=3"
  },
  {
    id: 2,
    url: "https://www.indeed.co.uk/cmp/Crush-Design/jobs/Middleweight-Web-Developer-541331b7885c03cf?q=Web+Developer&vjs=3"
  },
  {
    id: 3,
    url: "https://www.indeed.co.uk/cmp/Monigold-Solutions/jobs/Graduate-Web-Software-Engineer-a5787dc322c0ca36?q=Web+Developer&vjs=3"
  },
  {
    id: 4,
    url: "https://www.indeed.co.uk/cmp/ZOO-DIGITAL-GROUP-PLC/jobs/Web-Developer-5cdde1c3b0b7b8d0?q=Web+Developer&vjs=3"
  },
  {
    id: 5,
    url: "https://www.indeed.co.uk/viewjob?jk=9cc3d8c637c41067&q=Web+Developer&l=Sheffield&tk=1cf5di52e9u0ocam&from=web&vjs=3"
  }
];
app.get('/myform', function (req, res) {
  res.send("<h1>" + `scanning ${jobObj.length} urls for job description text` + "</h1>");
  // assign input form data to the node "url" variable
  // components for a request counter
  var jobs = new Array();
  function scrapeFinished() { console.log("all websites scraped!"); }
  var itemsProcessed = 0;
  jobObj.forEach(function (item) {
    request(item.url, function (err, res, html) {
      if (!err) {
        var $ = cheerio.load(html);
        var newJob = new Object();
        $('#job_summary').each(function () {
          var data = $(this);
          var textout = data.text();
          newJob.jobDesc = textout;
        });
        $('.jobtitle').each(function () {
          var data = $(this);
          var jobtitle = data.text();
          newJob.jobName = jobtitle;
        });
        jobs.push(newJob);
        itemsProcessed++;
        console.log(item.url + " scraped");
        if (itemsProcessed === jobObj.length) {
          scrapeFinished();
          fs.writeFile('output.json', JSON.stringify(jobs, null, "\t"), function (err) {
            if (!err) { console.log("output.json file written"); }
          });
        }
      }
    });
  });
});
And finally, this is what I get in output.json:
[
{},
{
"jobDesc": "We are a successful design and digital agency that works with some great clients on a wide range of digital projects.We simply need more developers to join our great team to deliver even more great work.The projects we work on are all php based, typically built using WordPress, Laravel or flat html.We are seen as a premium agency because of the quality and complexity of the work we do.That means you will have to do more that just manipulate a theme - you will have to code. But you will be given the space, time and support to do so.We want you to be proud of the work you do, because the reputation of the agency need you to be.Key skills we will want you to bringCSS (CSS3) & HTMLAt least some knowledge of MySQL and JavaScript.At least some knowledge PHP (seniors will be tested)PhotoshopWhat you will want that we can giveA good place to work with a friendly teamA chance to develop your coding craftA decent range of projects to challenge yourselfA senior developer on hand to coach and adviseA successful company with an optimistic outlook, growth plans and a secure futureExactly how much experience you have can vary, but you must have some. And the more experience you have, the more we will pay you.We are based in offices we own in the centre of Chesterfield will two staff that do the short commute from Sheffield.If you think this job sounds interesting, we would love to hear from you, please apply!(though not agencies please)Job Type: Full-timeSalary: £22,000.00 to £30,000.00 /yearExperience:development: 2 years",
"jobName": "Middleweight web developer"
},
{},
{},
{}
]
forEach itself runs synchronously: it does not wait for a request to finish before moving on to the next array item. What can cause trouble here is code that pushes an object into the array before the page has actually been scraped, which is why you are getting empty objects. What you should do is push the object only after the request for that URL has completed.
Do something like this
jobObj.forEach(function (item) {
  request(item.url, function (err, res, html) {
    var newJob = {};
    // scrape the page and save values in newJob
    jobs.push(newJob); // push only once the response has arrived
  });
});
If your code still pushes an empty object before the request completes, then consider using the async module.
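A pattern that removes the manual counter entirely is to wrap each request in a Promise and write the file only once Promise.all resolves; a sketch where scrapeOne is a hypothetical stand-in for the request + cheerio work:

```javascript
// Each URL becomes a Promise that resolves with the scraped job;
// Promise.all waits for every one before the file is written.
function scrapeAll(jobObj, scrapeOne) {
  return Promise.all(jobObj.map(function (item) {
    return scrapeOne(item.url);
  }));
}

// Usage with a fake scraper standing in for request + cheerio:
var fakeScrape = function (url) {
  return Promise.resolve({ jobName: "Job at " + url, jobDesc: "..." });
};

scrapeAll([{ url: "a" }, { url: "b" }], fakeScrape).then(function (jobs) {
  // Every job is guaranteed to be present here, so writing is safe:
  // fs.writeFile('output.json', JSON.stringify(jobs, null, "\t"), ...)
  console.log(jobs.length);
});
```

With the real scraper, scrapeOne would wrap request in `new Promise(...)` and resolve with the newJob object from inside the callback, so the write can never race ahead of the requests.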

AngularJS, save data in a json and retrieve it allowing angular to rebuild a dynamic table

Good morning Everyone:
My Angular app is a restaurant manager. Somewhere in the backend it shows a restaurant delivery manager where delivery costs can be managed. In the backend UI, I am giving users (restaurant admins) the option to add delivery methods and locations, so that each combination has a price; it is saved to the JSON and displayed in a table below the inputs using ng-repeat.
My json is something like this:
{
  "deliveryLocation": [
    {
      "id": 1,
      "location": "downtown"
    }
  ],
  "deliveryMethods": [
    {
      "id": 1,
      "name": "Delivery Boy"
    },
    {
      "id": 2,
      "name": "Cab"
    }
  ],
  "combinations": []
}
So once the inputs are populated, you will have at least one combination, for example "Downtown - Cab". That combination will be stored in "combinations" (see the JSON) in the following format: "deliveryLocation": "", "deliveryMethods": "", "price": 0.
Price starts as an empty input in the combinations table; the user indicates how much that delivery will cost, and that is saved in the combinations array (for example: "deliveryLocation": "downtown", "deliveryMethods": "cab", "price": 50).
I must send the data to the backend when the user saves the table. When the user opens the delivery manager again, the app receives the same JSON back from the backend. Recovering deliveryLocations and deliveryMethods is quite straightforward, but how should I indicate to AngularJS that everything found in the combinations field is a combination of references (a deliveryLocation and a deliveryMethods entry)?
Thanks so much,
Guillermo
First off, take a look at this good example: Fiddle
This example demonstrates how to convert a JSON object to a table with AngularJS.
So here is what you need to do:
1) With Angular, send your table object to your (say, PHP) server side and on to the DB using a factory:
module.factory('ajax_post', ['$http', function (_http) {
  var path = 'src/php/data.ajax.php'; // example
  return {
    init: function (jsonData) {
      var _promise = _http.post(path, jsonData, {
        headers: {
          'SOAPActions': 'http://schemas.microsoft.com/sharepoint/soap/UpdateListItems'
        }
      });
      return _promise;
    }
  };
}]);
Load the table from the DB the same way.
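Back on the original question of references inside combinations: one option is to store only ids in each combination and resolve them against the two lists when the JSON comes back; a plain-JavaScript sketch (the locationId/methodId field names are hypothetical, the data shape mirrors the question's JSON):

```javascript
// Resolve a combination's id references back to the full
// location and method objects for display in the table.
function resolveCombination(combo, data) {
  return {
    location: data.deliveryLocation.find(function (l) {
      return l.id === combo.locationId;
    }),
    method: data.deliveryMethods.find(function (m) {
      return m.id === combo.methodId;
    }),
    price: combo.price
  };
}

var data = {
  deliveryLocation: [{ id: 1, location: "downtown" }],
  deliveryMethods: [{ id: 1, name: "Delivery Boy" }, { id: 2, name: "Cab" }],
  combinations: [{ locationId: 1, methodId: 2, price: 50 }]
};

var row = resolveCombination(data.combinations[0], data);
console.log(row.location.location + " - " + row.method.name); // "downtown - Cab"
```

Resolving once on load (e.g. mapping combinations through this function in the controller) gives ng-repeat plain objects to render, while the backend keeps only compact id references.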
