I have some code to store queued Ajax requests in localStorage, and it works well with one key per request. I'm also storing other persistent data in the same place; I've separated this out with a unique key prefix so that it doesn't interfere with my Ajax queue.
I now need to store yet more data, and I think it's getting messy.
Is there some way of having multiple localstorage instances?
You can't have multiple localStorage instances; it isn't something you can instantiate. But you don't have to separate every single bit of data with a unique key prefix if the data can be logically grouped together. As long as the data can be serialized, you can do something like this:
var requests = {reqId1 : reqObj1, reqId2 : reqObj2};
localStorage.setItem("ajax_requests", JSON.stringify(requests));
// then when you want to add additional requests into localStorage
var requests = JSON.parse(localStorage.getItem("ajax_requests"));
requests[reqId] = reqObj;
localStorage.setItem("ajax_requests", JSON.stringify(requests)); // write the group back
Just remember to group together everything that logically belongs together; it should help you with the organization.
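If you want something that feels like separate storage instances, a thin namespaced wrapper over this grouping approach works well. Here's a minimal sketch (createStore, get, set and remove are illustrative names, not a real API):
// Each namespace is one localStorage key holding a JSON-serialized group.
function createStore(namespace) {
  function readAll() {
    return JSON.parse(localStorage.getItem(namespace) || "{}");
  }
  return {
    get: function (key) {
      return readAll()[key];
    },
    set: function (key, value) {
      var all = readAll();
      all[key] = value;
      localStorage.setItem(namespace, JSON.stringify(all));
    },
    remove: function (key) {
      var all = readAll();
      delete all[key];
      localStorage.setItem(namespace, JSON.stringify(all));
    }
  };
}
// Usage: one "instance" per concern, no key-prefix bookkeeping.
var ajaxQueue = createStore("ajax_requests");
ajaxQueue.set("req1", { url: "/api/save", method: "POST" });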
Related
Dataloader is able to batch and cache requests, but it can only be used by either calling load(key) or loadMany(keys).
The problem I am having is that sometimes I do not know the keys of the items I want to load in advance.
I am using an SQL database, and this works fine when the current object has a foreign key from a belongsTo relation with another model.
For example a user that belongs to a group and so has a groupId. To resolve the group you would just call groupLoader.load(groupId).
On the other hand, if I wanted to resolve the users within a group, of which there could be many, I would want a query such as
SELECT * from users where user.groupId = theParticularGroupId
but a query such as this doesn't use the keys of the users, so I am not sure how to make use of dataloader.
I could do another request to get the keys like
SELECT id from users where user.groupId = theParticularGroupId
and then call loadMany with those keys... But I could have just requested the data directly instead.
I noticed that dataloader has a prime(key, value) function which can be used to prime the cache, but that can only be done once the data has already been fetched. At that point many queries would already have been sent, and duplicate data could have been fetched.
Another example would be the following query
query {
  groups(limit: 10) {
    id
    ...
    users {
      id
      name
      ...
    }
  }
}
I cannot know the keys if I am searching for, say, the first or last 10 groups. And once I have these 10 groups, I cannot know the keys of their users, so if each resolver resolves its users with a query such as
SELECT * from users where user.groupId = theParticularGroupId
that query will be executed 10 times. Once the data is loaded I could now prime the cache, but the 10 requests have already been made.
Is there any way around this issue? Perhaps a different pattern or database structure or maybe dataloader isn't even the right solution.
You'll want a DataLoader instance for the lookup you can actually do: in this case you have a group ID and you want the users:
import DataLoader from 'dataloader';
const userIdsForGroupLoader = new DataLoader(groupIds => batchGetUsersIdsForGroups(groupIds));
Now your batchGetUsersIdsForGroups function essentially has to convert an array of group IDs into an array of arrays of user IDs (one array of user IDs for each group).
You'd start off with an IN query:
SELECT id, groupId from users where user.groupId in (...groupIds)
This will give you a single result set of users, which you'll have to manipulate by grouping the rows by their groupId. The outer array you return must be ordered according to the original array of groupIds, and make sure you return an empty array for any groupId that has no users.
Note that here we're only returning the user IDs, but once you have them you can batch-fetch the users themselves in one go. You could also tweak the loader slightly to return the users directly; you'll have to decide for yourself whether that's the right approach.
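Here's a rough sketch of what that batch function could look like. The db.query helper and the exact SQL are assumptions for illustration; only the ordering and empty-array requirements come from DataLoader's contract:
// Batch function: groupIds -> array of arrays of user IDs, same order.
async function batchGetUsersIdsForGroups(groupIds) {
  // One IN query covers every group requested in this tick.
  const rows = await db.query(
    'SELECT id, groupId FROM users WHERE groupId IN (?)',
    [groupIds]
  );

  // Bucket the user IDs by their groupId.
  const byGroup = {};
  for (const row of rows) {
    (byGroup[row.groupId] = byGroup[row.groupId] || []).push(row.id);
  }

  // DataLoader requires one result per key, in the same order as the keys,
  // so fall back to an empty array for groups with no users.
  return groupIds.map(groupId => byGroup[groupId] || []);
}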
Everything I mention here can be achieved with clever use of DataLoader. The key takeaway is that the values you pass to the load/loadMany functions don't have to correspond to the IDs of the objects you're trying to return.
I want to query objects from the Parse DB through JavaScript that have exactly one of a specific relation object. How can this criterion be expressed?
I tried something like this, but equalTo() acts as a "contains" and that's not what I'm looking for. My code so far, which doesn't work:
var query = new Parse.Query("Item");
query.equalTo("relatedItems", someItem);
query.lessThan("relatedItems", 2);
It seems Parse does not provide an easy way to do this.
Without any other fields, if you know all the items then you could do the following:
var innerQuery = new Parse.Query('Item');
innerQuery.containedIn('relatedItems', [all items except someItem]);
var query = new Parse.Query('Item');
query.equalTo('relatedItems', someItem);
query.doesNotMatchKeyInQuery('objectId', 'objectId', innerQuery);
...
Otherwise, you might need to get all records and do filtering.
Update
Because of the relation data type, there is no way to include the relation's content in the results; you need to do another query to get the relation content.
A workaround might be to add an itemCount column, keep it updated whenever the item relation is modified, and then do:
query.equalTo('relatedItems', someItem);
query.equalTo('itemCount', 1);
There are a couple of ways you could do this.
I'm working on a project now where I have cells composed of users.
I currently have an afterSave trigger that does this:
const count = await cell.relation("members").query().count();
cell.set("memberCount", count); // set(), not put(), on Parse JS objects
This works pretty well.
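For context, a fleshed-out version of that trigger might look like the sketch below. The class name "Cell" is assumed, and the changed-value check is my addition so the save doesn't re-trigger afterSave in a loop:
// afterSave trigger that caches the relation count on the object itself.
Parse.Cloud.afterSave("Cell", async (request) => {
  const cell = request.object;
  const count = await cell.relation("members").query().count({ useMasterKey: true });

  // Only save when the count actually changed, so the save inside
  // afterSave doesn't trigger this function again indefinitely.
  if (cell.get("memberCount") !== count) {
    cell.set("memberCount", count);
    await cell.save(null, { useMasterKey: true });
  }
});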
There are other ways that I've considered in theory, but I've not used them yet.
The right way would be to hack the ability to use select with dot notation to grab a virtual field called relatedItems.length in the query, but that would probably only work for me because I use Postgres ... mongo seems to be extremely limited in its ability to do this sort of thing, which is why I would never make a database out of blobs of JSON in the first place.
You could do a similar thing with an afterFind trigger. I'm experimenting with that now. I'm not sure if it will confuse Parse to get back an attribute which does not exist in its schema, but I'll find out by the end of today. I have found that if I jam an artificial attribute into the objects in the trigger, they are returned along with the other data. What I'm not sure about is whether Parse will decide that the object is dirty, or, worse, decide that I'm creating a new attribute and store it to the database ... which could be filtered out with a beforeSave trigger, but not until after the data had all been sent to the cloud.
There is also a place where I had to do several queries from several tables and would have ended up with a lot of redundant data, so I wrote a cloud function which did the queries and then returned a couple of lists of objects, plus a few lists of objectId strings which served as indexes. This worked pretty well for me. And tracking the last load time and sending it back when I needed to update my data allowed me to limit myself to objects which had changed since my last query.
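As a rough illustration of that last approach (every class, field, and parameter name here is hypothetical; only Parse.Cloud.define and Parse.Query are real Parse APIs):
// Cloud function that bundles several queries into one round trip and
// only returns objects changed since the caller's last load.
Parse.Cloud.define("loadChangedData", async (request) => {
  const since = request.params.lastLoad ? new Date(request.params.lastLoad) : new Date(0);

  const cells = await new Parse.Query("Cell")
    .greaterThan("updatedAt", since)
    .find({ useMasterKey: true });
  const items = await new Parse.Query("Item")
    .greaterThan("updatedAt", since)
    .find({ useMasterKey: true });

  return {
    cells: cells,
    items: items,
    // objectId lists doubling as lightweight indexes on the client
    cellIds: cells.map(c => c.id),
    itemIds: items.map(i => i.id)
  };
});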
I have the following hierarchy on Firebase (some data is hidden for confidentiality):
I'm trying to get a list of video IDs (underlined in red).
All I can do right now is fetch all the nodes, then read off their names and store them in an array!
But this performs badly, because the dataSnapshot from Firebase is very big in my case. I want to avoid retrieving all the nodes' content and then looping over it just to get the IDs; I need to retrieve the IDs only, i.e. without their nested elements.
Here's my code:
new Firebase("https://PRIVATE_NAME.firebaseio.com/videos/").once(
  'value',
  function (dataSnapshot) {
    // dataSnapshot now contains all the video ids, lines & links
    // this causes many performance issues
    // Then I need to loop over all elements to extract the ids!
    var videoIdIndex = 0;
    var videoIds = new Array();
    dataSnapshot.forEach(function (childSnapshot) {
      videoIds[videoIdIndex++] = childSnapshot.name();
    });
  }
);
How can I retrieve only the IDs, avoiding both the large data transfer and the loop over the retrieved data? Is there a way to retrieve these IDs directly?
UPDATE: There is now a shallow command in the REST API that will fetch just the keys for a path. This has not been added to the SDKs yet.
In Firebase, you can't obtain a list of node names without retrieving the data underneath. Not yet, anyway. The performance problems can be addressed with denormalization.
Essentially, your goal is to split the data into consumable chunks. Store your list of video keys, possibly with a couple of meta fields like title, under one path, and store the bulk content somewhere else. For example:
/video_meta/id/link, title, ...
/video_lines/id/...
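A sketch of what writing that split structure could look like, using the same legacy Firebase SDK as the question (the paths come from the example above; saveVideo is an illustrative helper name):
// Write the lightweight metadata and the heavy content to separate paths,
// so listing videos only ever downloads the small /video_meta nodes.
function saveVideo(videoId, meta, lines) {
  var ref = new Firebase("https://PRIVATE_NAME.firebaseio.com/");
  ref.child("video_meta/" + videoId).set(meta); // e.g. { title: ..., link: ... }
  ref.child("video_lines/" + videoId).set(lines);
}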
To learn more about denormalizing, check out this article: https://www.firebase.com/blog/2013-04-12-denormalizing-is-normal.html
This is a bit old and you probably already know, but in case someone else comes along: you can do this with a REST API call; you only need to set the parameter shallow=true.
Here is the documentation.
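For example, a shallow fetch of just the keys under /videos might look like this (the database URL is the one from the question; with shallow=true the REST API returns each non-primitive child as true instead of its contents):
// Fetch only the keys under /videos, without the nested data.
fetch("https://PRIVATE_NAME.firebaseio.com/videos.json?shallow=true")
  .then(function (res) { return res.json(); })
  .then(function (obj) {
    var videoIds = Object.keys(obj || {});
    console.log(videoIds); // ["videoId1", "videoId2", ...]
  });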
Take for example a case where I have thousands of students.
So I'd have an array of objects.
students = [
  { "name": "mickey", "id": "1" },
  { "name": "donald", "id": "2" },
  { "name": "goofy", "id": "3" },
  ...
];
The way I currently save this into my localStorage is:
localStorage.setItem('students', JSON.stringify(students));
And the way I retrieve this from the localstorage is:
var data = localStorage.getItem('students');
students = JSON.parse(data);
Now, whenever I make a change to a single student, I must save ALL the students to localStorage.
students[0].name = "newname";
localStorage.setItem('students', JSON.stringify(students));
I was wondering if it'd be better, instead of keeping one array, to have thousands of keys:
localStorage.setItem('student1', JSON.stringify(students[0]));
localStorage.setItem('student2', JSON.stringify(students[1]));
localStorage.setItem('student3', JSON.stringify(students[2]));
...
That way a student can be saved individually, without saving all the rest?
I'll potentially have many "students", possibly thousands. So which is better: one array, or many separate keys in localStorage?
Note: I know I should probably be using IndexedDB, but I need to use LocalStorage for now. Thanks
For your particular case it would probably be easier to store the students under one localStorage key: use JSON.parse to reconstruct the array, modify it, then stringify it again. That is simpler than splitting each student into its own key.
If you don't have so many layers of data that you really need a local database like IndexedDB, a single key with a JSON string value is probably fine for your case.
Keep in mind that localStorage has a size limit, and older browsers don't support it.
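A minimal sketch of that read-modify-write pattern (updateStudent is an illustrative helper name, not an API):
// Load the whole array, change one entry, and write the whole array back.
function updateStudent(index, changes) {
  var students = JSON.parse(localStorage.getItem("students")) || [];
  Object.assign(students[index], changes);
  localStorage.setItem("students", JSON.stringify(students));
}

updateStudent(0, { name: "newname" });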
It is better to store them in one array, for a couple of reasons:
You can process them with loops
Only one JSON stringify/parse is needed, instead of one per student
The array can always grow
So I am using jQuery and have set up the jquery cookie plugin.
I have 4 drop down lists on my page, and I want to save the user's selections in a cookie, so when they come back to the page I automatically pre-select their previous selections.
I added a class, "ddl-cookie", to all my drop-downs, and I was thinking I could somehow loop through all the drop-down lists using that class to save the selections, and loop again to set them when the user returns to the page.
$(".ddl-cookie").each(function() {
});
It seems that given a cookie name, I can save a single key/value in the cookie.
So I'm guessing the only way for me to do this would be to have a comma-separated list of drop-down list names and values (the selected values)?
You are correct. Cookies are intended to store a single piece of data, so the most common way to handle this is to serialize your data into an easy-to-retrieve format. That format is up to you, but you might use something like:
field_1=value1&field_2=value2&...
You might also want to encode this data; remember that cookies are transferred as part of the request header. The pseudocode would go something like this:
// Store the data, using your own defined methods
data = serialize_data(data);
data = encode_data(data);
cookie = data;
// Retrieve the data using your own defined methods
data = cookie;
data = unencode_data(data)
data = deserialize_data(data)
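A concrete version of that pseudocode, using jQuery and the jquery-cookie plugin from the question (the cookie name "ddl-selections" and the assumption that each drop-down has an id are mine):
// Save the current selection of every .ddl-cookie drop-down into one cookie.
function saveSelections() {
  var data = {};
  $(".ddl-cookie").each(function () {
    data[this.id] = $(this).val(); // serialize: one entry per drop-down
  });
  $.cookie("ddl-selections", encodeURIComponent(JSON.stringify(data)), { expires: 30 });
}

// Restore the saved selections when the user returns to the page.
function restoreSelections() {
  var raw = $.cookie("ddl-selections");
  if (!raw) return;
  var data = JSON.parse(decodeURIComponent(raw));
  $(".ddl-cookie").each(function () {
    if (data[this.id] !== undefined) $(this).val(data[this.id]);
  });
}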