Traverse Nested JSON Objects - YUI AutoComplete

Say I have two different, unrelated JSON objects returned from separate AJAX requests:
{
    content: [{
        userId: "22",
        name: "Kevin Johnson",
        Manager: {
            managerId: "123",
            name: "Ryan Burke" // will be set as "searchValue"
        }
    }]
}
{
    content: [{
        companyId: "345",
        companyName: "Trucks-R-Us", // will be set as "searchValue"
        Building: {
            buildingId: "5",
            section: "North-West"
        }
    }]
}
The attributes I've marked will be stored in a variable called searchValue (note they are not on the same level)
Can I use YUI's AutoComplete plugin, via the combination of resultListLocator and resultTextLocator, to locate the attributes I've marked, regardless of what the attribute's key is named or how deeply it is nested?
var autoComplete = new Y.AutoComplete({
    inputNode: '#search-string',
    resultListLocator: 'content',
    resultTextLocator: function (result) {
        // Find searchValue within the nested object
    },
    resultHighlighter: 'phraseMatch',
    maxResults: 10
});
Short version: Can my objects be traversed until a match to searchValue is found?
Can elaborate if this isn't enough detail
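Yes - resultTextLocator can be a function that receives each raw result object, so you can walk it recursively. A minimal sketch, assuming searchValue already holds the string you are looking for (findDeep is an illustrative helper, not part of the YUI API):

function findDeep(obj, predicate) {
    // Depth-first search: return the first nested value matching the predicate.
    for (var key in obj) {
        if (!obj.hasOwnProperty(key)) { continue; }
        var value = obj[key];
        if (predicate(value)) { return value; }
        if (value && typeof value === 'object') {
            var found = findDeep(value, predicate);
            if (found !== undefined) { return found; }
        }
    }
}

var autoComplete = new Y.AutoComplete({
    inputNode: '#search-string',
    resultListLocator: 'content',
    resultTextLocator: function (result) {
        // Traverse the result until a match to searchValue is found.
        return findDeep(result, function (value) {
            return value === searchValue;
        }) || '';
    },
    resultHighlighter: 'phraseMatch',
    maxResults: 10
});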

Related

Why doesn't JavaScript's array sort compare all elements with each other?

I'm trying to understand why JavaScript's array sort doesn't work with the following logic. I have no problem writing my own algorithm to sort this array, but I'm trying to do it with the built-in sort method to understand it better.
In this code, I want to push entities that belong to another entity to the bottom, so that entities that "have" other entities appear at the top. But apparently the sort method doesn't compare all elements with each other, so the logic doesn't work properly.
Am I doing something wrong, or is this the correct behavior of the JavaScript sort method?
The code I'm trying to execute:
let entities = [
    { name: 'Permission2', belongsTo: ['Role'] },
    { name: 'Another', belongsTo: ['User'] },
    { name: 'User', belongsTo: ['Role', 'Permission2'] },
    { name: 'Teste', belongsTo: ['User'] },
    { name: 'Role', belongsTo: ['Other'] },
    { name: 'Other', belongsTo: [] },
    { name: 'Permission', belongsTo: ['Role'] },
    { name: 'Test', belongsTo: [] },
]
// Order needs to be Permission,
let sorted = entities.sort((first, second) => {
    let firstBelongsToSecond = first.belongsTo.includes(second.name),
        secondBelongsToFirst = second.belongsTo.includes(first.name)
    if (firstBelongsToSecond) return 1
    if (secondBelongsToFirst) return -1
    return 0
})
console.log(sorted.map(item => item.name))
As you can see, "Role" needs to appear before "User", "Other" before "Role", etc, but it doesn't work.
Thanks for your help! Cheers
You're running into exactly how sorting is supposed to work: sort compares two elements at a time, so let's take some (virtual) pen and paper and write out what your code actually does.
If we use the simplest array with just User and Role, things work fine, so let's reduce your entities to a three-element array that doesn't do what you thought it was supposed to do:
let entities = [
    { name: 'User', belongsTo: ['Role', 'Permission2'] },
    { name: 'Test', belongsTo: [] },
    { name: 'Role', belongsTo: ['Other'] }
]
This will yield {User, Test, Role} when sorted, because it should... so let's see why it should:
pick elements [0] and [1] from [user, test, role] for comparison
compare(user, test)
user does not belong to test
test does not belong to user
per your code: return 0, i.e. don't change the ordering
we slide the compare window over to [1] and [2]
compare(test, role)
test does not belong to role
role does not belong to test
per your code: return 0, i.e. don't change the ordering
we slide the compare window over to [2] and [3]
there is no [3], we're done
The sorted result is {user, test, role}, because nothing got reordered
So the "bug" is thinking that sort compares everything-to-everything: since User and Role are not adjacent, they never get handed to your comparator together, and the dependency between them stays invisible. More fundamentally, a comparator must define a consistent total order (in particular it must be transitive), and returning 0 for every "unrelated" pair breaks that, so no comparison sort can produce the ordering you want. Dependency ordering like this is really a topological-sort problem; see the sketch below.
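For completeness, here is a sketch of one way to get the ordering (not part of the original question; it assumes belongsTo chains contain no cycles): compute a numeric dependency depth per entity first, then sort by that single number, which is a consistent total order.

// Rank each entity by how deep its belongsTo chain goes.
const byName = {}
entities.forEach(e => { byName[e.name] = e })

function depth(name) {
    const entity = byName[name]
    if (!entity || entity.belongsTo.length === 0) return 0
    return 1 + Math.max(...entity.belongsTo.map(depth))
}

// Entities that "have" others (shallow depth) end up on top.
const sorted = [...entities].sort((a, b) => depth(a.name) - depth(b.name))
console.log(sorted.map(e => e.name))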

Apollo Client replaces an array of objects with the same id and different values with an array of copies of the first object

Our GraphQL server responds to a query with data that includes an array of objects, each of which shares the same id but has a different value for another key. For instance, we might have an array that looks like:
[
    { id: 123, name: 'foo', type: 'bar', cost: 5 },
    { id: 123, name: 'foo', type: 'bar', cost: 6 },
    { id: 123, name: 'foo', type: 'bar', cost: 7 },
    { id: 123, name: 'foo', type: 'bar', cost: 8 }
]
We can see in the Network tab that the response from the server has the correct data in it. However, by the time it goes through processing by the Apollo Client module the array has been transformed into something that might look like this:
[
    { id: 123, name: 'foo', type: 'bar', cost: 5 },
    { id: 123, name: 'foo', type: 'bar', cost: 5 },
    { id: 123, name: 'foo', type: 'bar', cost: 5 },
    { id: 123, name: 'foo', type: 'bar', cost: 5 }
]
Essentially what we're seeing is that if all of the objects in an array share the same value for id then all objects in the array become copies of the first object in the array.
Is this the intended behavior of Apollo Client? We thought maybe it had something to do with incorrect caching, but we were also wondering if maybe Apollo Client assumed that subsequent array members with the same id were the same object.
It looks like this is intended behavior: Apollo Client normalizes objects by their id.
As the other answer suggests, this happens because Apollo normalises by ID. There's a very extensive article on the official blog that explains the rationale, along with the underlying mechanisms.
In short, as seen by Apollo's cache, your array of objects contains 4 instances of the same Object (id 123). Same ID, same object.
This is a fair assumption on Apollo's side, but not so much in your case.
You have to explicitly tell Apollo that these are indeed 4 different items that should be treated differently.
In the past we used dataIdFromObject, and you can see an example here.
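For reference, a sketch of that legacy approach (imports shown for Apollo Client 3; the type name YourItem is a placeholder):

import { InMemoryCache, defaultDataIdFromObject } from '@apollo/client';

const cache = new InMemoryCache({
    dataIdFromObject(object) {
        switch (object.__typename) {
            case 'YourItem':
                // Combine id and cost so each row gets its own cache entry.
                return `YourItem:${object.id}:${object.cost}`;
            default:
                return defaultDataIdFromObject(object);
        }
    },
});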
Today, you would use typePolicies and keyFields:
const cache = new InMemoryCache({
    typePolicies: {
        YourItem: {
            // Combine the fields that make your item unique
            keyFields: ['id', 'cost'],
        }
    },
});
Docs
It works for me:
const cache: InMemoryCache = new InMemoryCache({ dataIdFromObject: o => false }); // a falsy id disables ID-based normalization
The previous answer solves this problem too!
Also, you can change the key name (for example, id => itemId) on the back-end side and there won't be any issue!
I have the same issue. My solution is to set fetchPolicy: "no-cache" just for this single query, so you don't have to change the InMemoryCache.
Note that setting fetchPolicy to network-only is insufficient, because the result is still written to (and normalized by) the cache.
fetchPolicy documentation
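A sketch of that approach with useQuery (GET_ITEMS is a placeholder for your query):

import { useQuery } from '@apollo/client';

const { data } = useQuery(GET_ITEMS, {
    // Results bypass the normalized cache entirely,
    // so duplicate ids are left untouched.
    fetchPolicy: 'no-cache',
});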

Using ng-options with JSON data

I'm still trying to find my way with AngularJS. I have JavaScript code that uses a URL to return JSON data as an array. I need help populating a select with the same data using ng-options.
(data to populate in the select)
This isn't how you ask for help, but never mind. Given a JSON array like this:
var JsonArray = [{
    id: 1,
    name: 'Jane',
    address: 'Jane\'s house'
}, {
    id: 2,
    name: 'Jill',
    address: 'Jill\'s house'
}, {
    id: 3,
    name: 'John',
    address: 'John\'s house'
}, {
    id: 4,
    name: 'Jack',
    address: 'Jack\'s house'
}];
When you want to use a select with ng-options, you need to specify three things:
your array
the name that each object will take (like a for ... each loop)
the property you want displayed in your options
You can also specify a track by expression to give each option a given value.
<select ng-model="mySelect" ng-options="object.name for object in JsonArray track by object.id"></select>
Now use that last piece of code and inspect it in a browser. You will understand everything.
For reference:
<select ng-model="mySelect" ng-options="[object].[chosenProperty] for [object] in [yourArray] track by [object].[property]"></select>
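Since the question mentions fetching the JSON from a URL, a minimal controller wiring might look like this (the module, controller, and endpoint names are assumptions):

angular.module('app', []).controller('MainCtrl', function ($scope, $http) {
    $scope.JsonArray = [];
    $http.get('/api/people').then(function (response) {
        $scope.JsonArray = response.data; // the array bound by ng-options above
    });
});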

AngularJS: is $$hashKey a reliable key

Question
I'm interested in the properties of $$hashKey on Angular arrays/objects.
1. Would each generated $$hashKey be the same each time you reload a page? A quick test tells me yes, but I had somewhat assumed it wouldn't be.
2. If you updated or added to the existing array, would the old $$hashKeys stay consistent?
3. If the above is true, is there a way to fetch an item from an array using its $$hashKey? Of course I could roll my own, but before I reinvent the wheel I thought I'd ask.
Example:
Views would include:
form data (example has 1 form)
element data (example has 2 elements)
element options data (example has 2 options per element)
Fetch method:
angular.get($$hashKey);
You would then pass the $$hashKey of the element, and it would return a reference to that element inside the full array.
Lastly the data would be:
{
    form_id: 1,
    form_desc: 'xxx',
    form_name: 'name 1',
    Elements: [
        {
            element_id: 1,
            element_name: 'element1',
            default_value: null,
            disabled: "0",
            element_type: "image",
            ElementOptions: [
                { show: false, sort_order: 0, value: "ar" },
                { show: true, sort_order: 1, value: "rw" }
            ]
        },
        {
            element_id: 2,
            element_name: 'element2',
            default_value: null,
            disabled: "0",
            element_type: "image",
            ElementOptions: [
                { show: false, sort_order: 0, value: "ar" },
                { show: true, sort_order: 1, value: "rw" }
            ]
        }
    ]
}
$$hashKey values are only computed for functions and objects, so if you wish to track anything that isn't one of those types, you have that limitation.
A $$hashKey looks like:
(function|object): N
where N is simply an incremental counter, bumped by one for each $$hashKey computed.
So in many cases this could be the same value across page loads. But loading data asynchronously will cause differences when multiple data sources are queried as part of page initialization, because their order of return cannot be guaranteed. In cases like that you would have to marshal all your async data and then assign it to your scope in a specific order to ensure consistent $$hashKeys, as in the sketch below.
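For example (a sketch; fetchForms and fetchElements are placeholder promise-returning functions):

$q.all([fetchForms(), fetchElements()]).then(function (results) {
    // Assigning in a fixed order keeps $$hashKey generation deterministic.
    $scope.forms = results[0];
    $scope.elements = results[1];
});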
Moving items around in an array that is linked to the DOM (via ng-repeat) will not change an item's $$hashKey. Deleting it and re-adding it will.
I would not use $$hashKey for my own housekeeping, as it is intended to be internal to AngularJS.
I've used this internal private property when I had no other identifiers.
I think it's pretty usable, but not recommended.
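If you do roll your own lookup, it's a simple scan (a sketch; findByHashKey is not an Angular API):

function findByHashKey(arr, hashKey) {
    // Linear search for the item whose $$hashKey matches.
    for (var i = 0; i < arr.length; i++) {
        if (arr[i].$$hashKey === hashKey) { return arr[i]; }
    }
    return null;
}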

Prevent Javascript function running out of memory because too many objects

I'm building a web scraper in Node.js that uses request and cheerio to parse the DOM. While I am using Node, I believe this is more of a general JavaScript question.
tl;dr - creating ~60,000-100,000 objects uses up all my computer's RAM, and I get an out-of-memory error in Node.
Here's how the scraper works. It's loops within loops; I've never designed anything this complex before, so there might be far better ways to do this.
Loop 1: Creates 10 objects in an array called 'sitesArr'. Each object represents one website to scrape.
var sitesArr = [
    {
        name: 'store name',
        baseURL: 'www.basedomain.com',
        categoryFunct: '(function(){ // do stuff })();',
        gender: 'mens',
        currency: 'USD',
        title_selector: 'h1',
        description_selector: 'p.description'
    },
    // ... x10
]
Loop 2: Loops through 'sitesArr'. For each site it goes to the homepage via 'request' and gets a list of category links, usually 30-70 URLs. It appends these URLs to the 'sitesArr' object they belong to, in an array property named 'categories'.
var sitesArr = [
    {
        name: 'store name',
        baseURL: 'www.basedomain.com',
        categoryFunct: '(function(){ // do stuff })();',
        gender: 'mens',
        currency: 'USD',
        title_selector: 'h1',
        description_selector: 'p.description',
        categories: [
            {
                name: 'shoes',
                url: 'www.basedomain.com/shoes'
            }, {
                name: 'socks',
                url: 'www.basedomain.com/socks'
            } // x 50
        ]
    },
    // ... x10
]
Loop 3: Loops through each 'category'. For each URL it gets a list of product links and puts them in an array. Usually ~300-1000 products per category.
var sitesArr = [
    {
        name: 'store name',
        baseURL: 'www.basedomain.com',
        categoryFunct: '(function(){ // do stuff })();',
        gender: 'mens',
        currency: 'USD',
        title_selector: 'h1',
        description_selector: 'p.description',
        categories: [
            {
                name: 'shoes',
                url: 'www.basedomain.com/shoes',
                products: [
                    'www.basedomain.com/shoes/product1.html',
                    'www.basedomain.com/shoes/product2.html',
                    'www.basedomain.com/shoes/product3.html',
                    // x 300
                ]
            }, // x 50
        ]
    },
    // ... x10
]
Loop 4: Loops through each of the 'products' array, goes to each URL and creates an object for each.
var product = {
    infoLink: "www.basedomain.com/shoes/product1.html",
    description: "This is a description for the object",
    title: "Product 1",
    Category: "Shoes",
    imgs: ['http://foo.com/img.jpg', 'http://foo.com/img2.jpg', 'http://foo.com/img3.jpg'],
    price: 60,
    currency: 'USD'
}
Then, for each product object, I'm shipping it off to a MongoDB function which does an upsert into my database, something like the sketch below.
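For illustration, an upsert of that shape might look like this (a sketch using the MongoDB Node.js driver; the db wiring, the 'products' collection name, and the choice of infoLink as the match key are assumptions):

function saveProduct(db, product, callback) {
    db.collection('products').updateOne(
        { infoLink: product.infoLink }, // match an existing product by its URL
        { $set: product },
        { upsert: true },
        callback
    );
}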
THE ISSUE
This all worked just fine, until the process got large. I'm creating about 60,000 product objects every time this script runs, and after a little while all of my computer's RAM is being used up. What's more, after getting about halfway through my process I get the following error in Node:
FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory
I'm very much of the mind that this is a code design issue. Should I be "deleting" the objects once I'm done with them? What's the best way to tackle this?
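One common fix is to stop accumulating everything in sitesArr and stream instead: scrape one product, upsert it, drop the reference, and bound how many requests are in flight. A sketch using the async package (scrapeProduct and saveProduct are placeholders for your existing request/cheerio and MongoDB code):

var async = require('async');

function processProducts(db, productUrls, done) {
    // At most 5 product pages in flight; each product object becomes
    // garbage-collectable as soon as it has been saved.
    async.eachLimit(productUrls, 5, function (url, next) {
        scrapeProduct(url, function (err, product) {
            if (err) { return next(err); }
            saveProduct(db, product, next);
        });
    }, done);
}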
