Fast Unique Team Selection from a group - javascript

I have a situation in which I have a list of 150 players (names, ids (uuid), team names, etc.).
The user can make a team of 10 players, and they can create as many teams as they like.
I need to make sure that the user does not save the same team twice, i.e. the same set of players in a different order.
I want to achieve this in the most efficient way possible. I was thinking a hashing function could solve my problem. I haven't figured it out yet, but I just wanted to know what you think. I have to implement this in Node.js, if that helps.

When the user makes a team, sort the list of players and join it into a single string with join().
Use the resulting string as a key to detect duplicate teams, like this:
var check = {};

function submit(team /* array of player ids */) {
    var key = team.slice().sort().join(); // copy before sorting so the caller's array is untouched
    if (check[key]) {
        // ... error, duplicate team ...
        return;
    }
    check[key] = true;
    // ... continue with submission ...
}
This is probably the fastest way to do it in JavaScript: the duplicate check itself is a single object-key lookup.
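A quick usage sketch (the player ids are made up; keying on the uuids rather than names keeps the check stable if a player is renamed):

submit(["uuid-3", "uuid-1", "uuid-2"]); // accepted, key becomes "uuid-1,uuid-2,uuid-3"
submit(["uuid-1", "uuid-2", "uuid-3"]); // same key, flagged as a duplicate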

Related

accessing and removing objects by ID

I have certain requirements and want to do the following as quickly as possible.
I have thousands of objects like the ones below:
{id:1, value:"value1"} ... {id:1000, value:"value1000"}
I want to access the objects above by id.
I want to clean out objects below a certain id every few minutes (my high-frequency algorithm generates thousands of objects every second).
I can clean easily by using this:
myArray = myArray.filter(function (obj) {
    return obj.id > cleanSize;
});
I can find the object by id using
myArray.find(x => x.id === '45');
The problem is that find() feels a little slow on larger data sets, so I created an object of objects like below:
const id = 22;
myArray["x" + id] = { id: id, value: "test" };
Now I can access an item by id easily with myArray["x22"], but the problem is that I can't find a way to remove older items by id.
Can someone suggest a better way to achieve the three points above using arrays or objects?
The trouble with your question is, you're asking for a way to finish an algorithm that is supposed to solve a problem of yours, but I think there's something fundamentally wrong with the problem to begin with :)
If you store a sizeable amount of data records, each associated with an ID, and allow your code to access them freely, then you cannot have another part of your code dump some of them to the bin out of the blue (say, from within some timer callback) just because they are becoming "too old". You must be sure nobody is still working on them (and will ever need to) before deleting any of them.
If you don't explicitly synchronize the creation and deletion of your records, you might end up with a code that happens to work (because your objects happen to be processed quickly enough never to be deleted too early), but will be likely to break anytime (if your processing time increases and your data becomes "too old" before being fully processed).
This is especially true in the context of a browser. Your code is supposed to run on any computer connected to the Internet, which could have dozens of reasons to be running 10 or 100 times slower than the machine you test your code on. So making assumptions about the processing time of thousands of records is asking for serious trouble.
Without further specification, it seems to me answering your question would be like helping you finish a gun that would only allow you to shoot yourself in the foot :)
All this being said, any JavaScript object inherently does exactly what you ask for, provided you're okay with using strings for IDs, since an object property name can also be used as an index in an associative array.
var associative_array = {}
var bob = { id:1456, name:"Bob" }
var ted = { id:2375, name:"Ted" }
// store some data with arbitrary ids
associative_array[bob.id] = bob
associative_array[ted.id] = ted
console.log(JSON.stringify(associative_array)) // Bob and Ted
// access data by id
var some_guy = associative_array[2375] // index will be converted to string anyway
console.log(JSON.stringify(some_guy)) // Ted
var some_other_guy = associative_array["1456"]
console.log(JSON.stringify(some_other_guy)) // Bob
var some_AWOL_guy = associative_array[9999]
console.log(JSON.stringify(some_AWOL_guy)) // undefined
// delete data by id
delete associative_array[bob.id] // so long, Bob
console.log(JSON.stringify(associative_array)) // only Ted left
Though I doubt speed will really be an issue, this mechanism is about as fast as you will ever get JavaScript to run, since the underlying data structure is a hash table, theoretically O(1).
Anything involving array methods like find() or filter() will run in at least O(n).
Besides, each invocation of filter() would waste memory and CPU recreating the array to no avail.
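If you do go this way despite the caveats above, a minimal sketch covering all three requirements (the threshold and interval values below are made up):

var records = {};

function addRecord(record) {
    records[record.id] = record; // O(1) insert, keyed by id
}

function getRecord(id) {
    return records[id]; // O(1) lookup, ids are coerced to strings
}

function cleanBelow(minId) {
    // Linear in the number of stored records, but it only runs every few minutes.
    for (var id in records) {
        if (Number(id) < minId) {
            delete records[id];
        }
    }
}

// Illustrative numbers: every two minutes, drop everything below id 100000.
setInterval(function () { cleanBelow(100000); }, 2 * 60 * 1000);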

Dynamic-multiple inputs with the same name

For a dynamic system consisting of a "bag" with some "parts" in it, I am trying to make an edit page. It should be able to load the bag and its parts. Getting the data from the backend is no problem, but saving it is.
I tried giving each "part" input field names like name="parts[][partName]", name="parts[][internalNumber]" and so on, but the $_POST array then has a strange structure of many separate arrays:
parts [
    [partName => "Test1"],
    [internalNumber => "111"],
    [partName => "Test2"],
    [internalNumber => "222"],
    ...
]
Relying on counting and dividing the entries doesn't seem like a reliable method, especially because most of the inputs can be empty.
Working with fixed names like "parts[0][...]", "parts[1][...]" would be tricky because the groups are added, removed, and edited dynamically.
What would be the best way to get the data reliably?
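One common way to make this reliable (a sketch of my own, not from the original question) is to write explicit indices into the field names from JavaScript whenever a part row is added or removed, so the backend receives parts[0][partName], parts[0][internalNumber], parts[1][partName], and so on. The form id "bag-form" and the wrapper class "part-row" are assumed markup:

// Re-number every part group before submitting, so each group gets one index.
function reindexParts(form) {
    var rows = form.querySelectorAll(".part-row");
    rows.forEach(function (row, i) {
        row.querySelectorAll("input, select, textarea").forEach(function (field) {
            // Turn "parts[][partName]" (or "parts[3][partName]") into "parts[<i>][partName]".
            field.name = field.name.replace(/^parts\[\d*\]/, "parts[" + i + "]");
        });
    });
}

document.querySelector("#bag-form").addEventListener("submit", function () {
    reindexParts(this);
});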

Parse.com relations count

I want to query objects from the Parse DB through JavaScript that have only one of a specific relation object. How can this criterion be achieved?
I tried something like the following, but equalTo() acts as a "contains" and that's not what I'm looking for. My code so far, which doesn't work:
var query = new Parse.Query("Item");
query.equalTo("relatedItems", someItem);
query.lessThan("relatedItems", 2);
It seems Parse does not provide an easy way to do this.
Without any other fields, if you know all the items, you could do the following:
var innerQuery = new Parse.Query('Item');
innerQuery.containedIn('relatedItems', [all items except someItem]);
var query = new Parse.Query('Item');
query.equalTo('relatedItems', someItem);
query.doesNotMatchKeyInQuery('objectId', 'objectId', innerQuery);
...
Otherwise, you might need to fetch all records and filter them yourself.
Update
Because the field is a Relation, there is no way to include the relation's content in the query results; you need a separate query to fetch it.
A workaround is to add an itemCount column, keep it updated whenever the relation is modified, and then query:
query.equalTo('relatedItems', someItem);
query.equalTo('itemCount', 1);
There are a couple of ways you could do this.
I'm working on a project now where I have cells composed of users.
I currently have an afterSave trigger that does this:
const count = await cell.relation("members").query().count();
cell.set("memberCount", count);
This works pretty well.
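Expanded into a complete trigger, that might look roughly like the sketch below; the class name "Cell", the field names, and the changed-value guard (to stop the save from re-firing the trigger) are my own assumptions:

Parse.Cloud.afterSave("Cell", async (request) => {
    const cell = request.object;
    const count = await cell.relation("members").query().count({ useMasterKey: true });
    // Only write back when the count actually changed, otherwise this save
    // would fire the same afterSave again in a loop.
    if (cell.get("memberCount") !== count) {
        cell.set("memberCount", count);
        await cell.save(null, { useMasterKey: true });
    }
});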
There are other ways that I've considered in theory, but I've not used them yet.
The right way would be to hack the ability to use select with dot notation to grab a virtual field called relatedItems.length in the query, but that would probably only work for me because I use Postgres. Mongo seems to be extremely limited in its ability to do this sort of thing, which is why I would never make a database out of blobs of JSON in the first place.
You could do a similar thing with an afterFind trigger; I'm experimenting with that now. I'm not sure whether it will confuse Parse to get back an attribute that does not exist in its schema, but I'll find out by the end of today. I have found that if I jam an artificial attribute into the objects in the trigger, they are returned along with the other data. What I'm not sure about is whether Parse will decide that the object is dirty, or, worse, decide that I'm creating a new attribute and store it to the database, which could be filtered out with a beforeSave trigger, but not until after the data had all been sent to the cloud.
There is also a place where I had to do several queries on several tables and would have ended up with a lot of redundant data, so I wrote a cloud function that did the queries and then returned a couple of lists of objects plus a few lists of objectId strings that served as indexes. This worked pretty well for me. And tracking the last load time and sending it back when I needed to update my data allowed me to limit myself to objects that had changed since my last query.
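As a rough illustration of that last approach (the function name, class names, and parameters here are invented), such a cloud function might look like:

Parse.Cloud.define("fetchChangedData", async (request) => {
    const since = request.params.lastLoadTime
        ? new Date(request.params.lastLoadTime)
        : new Date(0);
    // Run the related queries in parallel and only pull rows changed since the last load.
    const itemQuery = new Parse.Query("Item").greaterThan("updatedAt", since);
    const partQuery = new Parse.Query("Part").greaterThan("updatedAt", since);
    const [items, parts] = await Promise.all([itemQuery.find(), partQuery.find()]);
    return {
        items: items,
        parts: parts,
        itemIds: items.map((o) => o.id), // objectId lists that serve as client-side indexes
        loadedAt: new Date().toISOString()
    };
});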

Lucene-like searching through JSON objects in JavaScript

I have a pretty big array of JSON objects (it's a music library with properties like artist, album, etc., feeding a jqGrid with loadonce=true) and I want to implement Lucene-like (Google-like) querying over the whole set, but locally, i.e. in the browser, without communicating with the web server. Are there any JavaScript frameworks that will help me?
1. Go through your records once to build an index, by combining all searchable fields into a single string field called index.
2. Store these indexed records in an Array.
3. Partition the Array on the index, e.g. all records starting with "a" in one array, and so on.
4. Use the JavaScript indexOf() function against the index field to match the query entered by the user and find records in the partitioned Array.
That was the easy part, but it will support all simple queries very efficiently, because the index does not have to be re-created for every query and the indexOf() operation is very cheap. I have used this to search up to 2,000 records, with a pre-sorted Array. That is essentially how Gmail and Yahoo Mail work: they store your contacts in the browser in a pre-sorted array with an index, which lets contact names appear as you type. A minimal sketch of the approach follows this answer.
This also gives you a base to build on. Now you can write advanced query-parsing logic on top of it. For example, supporting a few simple conditional keywords like AND, OR, and NOT takes about 20-30 lines of custom JavaScript code. Or you can find a JS library that will do the parsing for you the way Lucene does.
For a reference implementation of above logic, take a look at how ZmContactList.js sorts and searches the contacts for autocomplete.
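A minimal sketch of steps 1 and 4, assuming library is the array of track objects from the question and that each track has artist, album, and title fields:

// One-time index: a single lowercase search string per record.
var indexed = library.map(function (track) {
    return {
        record: track,
        index: [track.artist, track.album, track.title].join(" ").toLowerCase()
    };
});

// Simple query: keep every record whose index contains the typed text.
function search(query) {
    var q = query.toLowerCase();
    return indexed
        .filter(function (entry) { return entry.index.indexOf(q) !== -1; })
        .map(function (entry) { return entry.record; });
}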
You might want to check FullProof, it does exactly that:
https://github.com/reyesr/fullproof
Have you tried CouchDB?
Edit:
How about something along these lines (also see http://jsfiddle.net/7tV3A/1/):
var filtered_collection = [];
var query = 'foo';
$.each(collection, function (i, e) {
    $.each(e, function (ii, el) {
        if (el == query) {
            filtered_collection.push(e);
        }
    });
});
The (el == query) part of course could/should be modified to allow more flexible search patterns than exact match.
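For example, the exact-match test inside the inner loop could be swapped for a case-insensitive substring check (a variation of my own):

// Replaces `if (el == query)` inside the inner $.each loop.
if (typeof el === "string" && el.toLowerCase().indexOf(query.toLowerCase()) !== -1) {
    filtered_collection.push(e);
}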

Query using multiple conditions

I recently discovered (sadly) that WebSQL is no longer being supported for HTML5 and that IndexedDB will be replacing it instead.
I'm wondering if there is any way to query or search through the entries of an IndexedDB in a similar way to how I can use SQL to search for an entry satisfying multiple conditions.
I've seen that I can search through IndexedDB using one condition with the KeyRange. However, I can't seem to find any way to search two or more columns of data without grabbing all the data from the database and doing it with for loops.
I know this is a new feature that's barely implemented in the browsers, but I have a project that I'm starting and I'm researching the different ways I could do it.
Thank you!
Check out this answer to the same question; it is more detailed than the one I give here. The keyPath parameter to store.createIndex and the IDBKeyRange methods can take an array. So, a crude example:
// In onupgradeneeded
var store = db.createObjectStore('mystore');
store.createIndex('myindex', ['prop1','prop2'], {unique:false});
// In your query section
var transaction = db.transaction('mystore','readonly');
var store = transaction.objectStore('mystore');
var index = store.index('myindex');
// Select only those records where prop1=value1 and prop2=value2
var request = index.openCursor(IDBKeyRange.only([value1, value2]));
// Select the first matching record
var request = index.get(IDBKeyRange.only([value1, value2]));
Let's say your SQL Query is something like:
SELECT * FROM TableName WHERE Column1 = 'value1' AND Column2 = 'value2'
Equivalent Query in JsStore library:
var Connection = new JsStore.Instance("YourDbName");
Connection.select({
    From: "YourTableName",
    Where: {
        Column1: 'value1',
        Column2: 'value2'
    },
    OnSuccess: function (results) {
        console.log(results);
    },
    OnError: function (error) {
        console.log(error);
    }
});
Now, if you are wondering what JsStore is, it is a library that lets you query IndexedDB in a simplified, SQL-like manner.
I mention some suggestions for querying relationships in my answer to this question, which may be of interest:
Conceptual problems with IndexedDB (relationships etc.)
As to querying multiple fields at once, it doesn't look like there's a native way to do that in IndexedDB (I could be wrong; I'm still new to it), but you could certainly create a helper function that used a separate cursor for each field, and iterated over them to see which records met all the criteria.
Yes, opening a continuous key range on an index is pretty much all there is in IndexedDB. Testing multiple conditions is not possible in IndexedDB itself; it must be done in a cursor loop.
If you find a solution, please let me know.
BTW, I think looping over a cursor can be very fast and require less memory than is possible with SQLite.
I'm a couple of years late, but I'd just like to point out that Josh's answer works on the presumption that all of the "columns" in the condition are part of the index's keyPath.
If any of those "columns" exist outside the index's keyPath, you will have to test the conditions involving them on each entry that the cursor in the example iterates over. So if you're dealing with such queries, or your index isn't unique, be prepared to write some iteration code!
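For instance, continuing Josh's example above (index, value1, and value2 come from that snippet; the extra prop3 condition and its value are made up), the iteration might look like this:

var results = [];
var cursorRequest = index.openCursor(IDBKeyRange.only([value1, value2]));
cursorRequest.onsuccess = function (event) {
    var cursor = event.target.result;
    if (cursor) {
        // Test the condition the compound index cannot cover.
        if (cursor.value.prop3 === 'value3') {
            results.push(cursor.value);
        }
        cursor.continue();
    } else {
        // Done: `results` now holds every record matching all three conditions.
    }
};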
In any case, I suggest you check out BakedGoods if you can represent your query as a boolean expression.
For these types of operations it will always open a cursor on the focal objectStore, unless you're performing a strict equality query (x === y, given x is an objectStore or index key), but it will save you the trouble of writing your own cursor iteration code:
bakedGoods.getAll({
    filter: "keyObj > 5 && valueObj.someProperty !== 'someValue'",
    storageTypes: ["indexedDB"],
    complete: function (byStorageTypeResultDataObj, byStorageTypeErrorObj) {}
});
Just for the sake of complete transparency, BakedGoods is maintained by moi.
