ExtJS 3.3 Store: save event receiving incomplete data

We're still using ExtJS 3.3 in this project because it's an older project and (unfortunately) we don't have time to upgrade at the moment.
I have an editor grid panel with a restful JSON store. The JSON store has autoSave set to false, because I need to perform a potentially long-running task on the server when something has changed. So I decided to create a "Save" button, so the store is only saved once the user has finished making modifications. As the store sends multiple requests to the server, I'm listening to the store's save event to fire another request that starts the long-running operation after all data has been saved.
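For reference, the setup looks roughly like this (simplified; the URL, fields and writer config are placeholders, not my actual code):

var store = new Ext.data.JsonStore({
    restful: true,
    autoSave: false, // batch changes until the Save button is clicked
    url: '/items', // placeholder endpoint
    idProperty: 'id',
    fields: ['id', 'name'],
    writer: new Ext.data.JsonWriter({ encode: false })
});

// toolbar "Save" button that persists all pending changes at once
var saveButton = new Ext.Button({
    text: 'Save',
    handler: function () {
        store.save(); // the save event fires once the requests have completed
    }
});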
And here comes the problem: According to the documentation the third parameter of the event handler is the data that has been saved, grouped by the store operation (create, update or destroy), like this:
{
    create: [ /* created records here */ ],
    update: [ /* updated records here */ ],
    destroy: [ /* deleted records here */ ]
}
My problem is that I'm only receiving the "created" records. For the other operations the array is always empty, even when there should be records in it. If there were no "update" or "destroy" operations at all, the data object doesn't contain those keys (which is correct).
Let's say there was one updated record, two created records and no deleted records in the last save operation; the data I receive then looks like this:
{
    create: [
        { /* record 1 data here */ },
        { /* record 2 data here */ }
    ],
    update: [] // <-- should not be empty!
}
I don't know why update and destroy are always empty. Can anyone help me?
Or maybe you have another suggestion for how I can solve the original problem: performing an "after-save" task using the IDs of all created/updated/deleted records.

I managed to work around this issue.
Apparently the beforesave event receives the correct parameters. I'm now storing the dirty records in a temporary property on the store before saving them, and reading that property afterwards in the save event.
listeners: {
    beforesave: function (store, data) {
        // in my concrete case it's okay to join all records
        var all = (data.create || []).concat(data.update || []).concat(data.destroy || []);
        store.tempData = all;
    },
    save: function (store) {
        // do something with store.tempData
    }
}
As the records in tempData are references, it doesn't matter that they may still be phantom records (without a server-generated ID) when beforesave is called. By the time save is called, these records have already been replaced with "real" records.
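Concretely, the save handler above can then kick off the after-save task like this (a sketch; the endpoint and the Ext.Ajax call are illustrative placeholders):

save: function (store) {
    // collect the (now server-generated) ids of everything that was just persisted
    var ids = [];
    Ext.each(store.tempData, function (record) {
        ids.push(record.id);
    });
    // kick off the long-running server task (hypothetical endpoint)
    Ext.Ajax.request({
        url: '/items/post-process',
        jsonData: { ids: ids }
    });
}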

Related

Determining what has been added/deleted/changed in a Firestore list

I'm using Firestore as the backend. I've managed to get by so far simply using basic CRUD methods. However, I want to find out how to determine the changes to a list of items that are returned after the initial subscription.
What I'm ultimately looking to do is:
- minimise the number of documents that are read each time
- animate a list of items (entry animation, exit animation, change animation)
In the following example I have the basic CRUD methods along with the initial subscription:
posts: post[] = [];

constructor(private db: AngularFirestore) {}

ngOnInit() {
    // the initial subscription to the posts
    this.db.collection("Posts").valueChanges().subscribe(_posts => {
        this.posts = _posts;
    });
}

async addItem(_post: post) {
    _post.id = this.db.createId();
    await this.db.collection("Posts").doc(_post.id).set(_post);
}

async update(_post: post) {
    await this.db.collection("Posts").doc(_post.id).update(_post);
}

async delete(_post: post) {
    await this.db.collection("Posts").doc(_post.id).delete();
}
With the above methods, I'm subscribing to the documents in the Posts collection. Initially I receive an array of type post, and whenever another item is added, updated or removed, I receive an updated array of type post.
How do I differentiate what has happened to an item so I can animate the changes (i.e. animate the entry of the item, etc.)?
It would really help me out if you could show some sample code.
Thanks
The valueChanges observable only exposes the actual data in the document. It has no other metadata about the document, nor the kind of change.
If you need more information, listen for documentChanges instead. That exposes a stream of DocumentChangeAction objects, which, among other things, contain a type property that is the DocumentChangeType.
See https://github.com/angular/angularfire2/blob/master/docs/firestore/documents.md#the-documentchangeaction-type
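As an illustration, a minimal sketch using AngularFire's stateChanges() stream, which emits only the DocumentChangeActions since the last emission (method and property names per the linked documentation; the animation calls are placeholders):

this.db.collection("Posts").stateChanges().subscribe(actions => {
    actions.forEach(action => {
        const post = action.payload.doc.data();
        switch (action.type) {
            case 'added': // run the entry animation for post
                break;
            case 'modified': // run the change animation
                break;
            case 'removed': // run the exit animation
                break;
        }
    });
});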

How to prevent 'value' event on the client that issued set?

A Firebase client calling set() will cause value to trigger on all connected clients, including the original client that issued the set().
In my case (and I think in most cases), there is no reason for the client that issued the set() to respond to the value event produced by its own call. Obviously its model is correct, and there's no need to change it (which may be an expensive operation).
Is there any way for the client to not receive, prevent, or ignore the value event triggered by its own set() call? I considered using off()/on() around the set(), but that could make the client miss value events that arrived at the same time without being triggered by it.
Am I missing something obvious?
Most applications treat the Firebase data itself as their model. So when there's an update, they call ref.set() (or another mutator function) and then the update flows back into their app through an on() event. React/Flux aficionados know this as a unidirectional data-flow, other might know it as Command Query Responsibility Segregation.
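In the classic Firebase JavaScript API that flow looks like this (a minimal sketch; render() is a placeholder for whatever updates your UI):

// render exclusively from the value event, never directly after a write
ref.on('value', function(snapshot) {
    render(snapshot.val()); // the single place where the UI is updated
});

// a write simply flows back in through the handler above
ref.set({ title: 'hello' });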
But there are indeed cases where the model has already been updated, and thus you want to ignore the event from Firebase if you're the one who triggered it.
There is no API for not receiving these self-triggered events. Instead you'll have to "remember" the data that you sent to Firebase and filter it out in your on() handler.
The Android drawing sample from Firebase keeps a list of segments that it sends to Firebase and then ignores those segments in its onChildAdded handler. It uses push ids to identify the line segments, and since those are generated client-side, it can use them to identify its own segments.
A JavaScript sample of this:
var pendingChildIds = []; // push ids of nodes we've sent to the server, but haven't received in `on()` yet

// this code is in your UI event handler, or whatever triggers the need to update your Firebase data
var newChild = ref.push();
pendingChildIds.push(newChild.key());
newChild.set(
    { property1: 'value1', property2: 3.14 },
    function(error) {
        // the write operation has completed, remove the child id from the list of pending writes
        pendingChildIds.splice(pendingChildIds.indexOf(newChild.key()), 1);
    }
);

// this is the event handler, using child_added in this case
ref.on('child_added', function(snapshot) {
    if (pendingChildIds.indexOf(snapshot.key()) === -1) {
        // this is a child that we DIDN'T generate
    }
});
I ended up adding a client ID to the model, something like:
var clientId = (Math.random() * 10000000000000000).toFixed(0);

function set(data) {
    ref.set(JSON.stringify({ clientId: clientId, data: data }));
}

ref.on('value', function(snapshot) {
    var json = JSON.parse(snapshot.val());
    if (!json || json.clientId === clientId) return;
    var data = json.data;
    // update model with data
});

ExtJS: How to update filtered store data?

Just doing these steps:
1. I have a grid filled with data from a store.
2. I filter it (for example: show only records with status "late").
3. The data in the store is updated.
4. I still see the old filtered data (the old records with status "late").
5. I remove the filter; all the new data appears, together with the old records that were not visible while the filter was active.
Maybe someone knows why this happens and how to fix it?
FIXED
This code in the store did the trick:
listeners: {
    beforeload: function() {
        this.data.clear();
        if (this.data._source)
            this.data._source.clear();
    }
},
You can bind yourStore.reload() to your update event.
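For example (a sketch; the Ajax call and the grid variable stand in for whatever performs your update):

Ext.Ajax.request({
    url: '/update-status', // hypothetical call that changes the data
    success: function () {
        grid.getStore().reload(); // re-fetch with the last load options so the grid shows current data
    }
});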

backbone.js cache collections and refresh

I have a collection that can potentially contain thousands of models. I have a view that displays a table with 50 rows for each page.
Now I want to be able to cache my data so that when a user loads page 1 of the table and then clicks page 2, the data for page 1 (rows #1-50) is cached, and when the user clicks page 1 again, Backbone won't have to fetch it again.
Also, I want my collection to be able to refresh with updated data from the server without performing a RESET, since RESET deletes all the models in the collection, including references to existing models that may be in use elsewhere in my app. Is it possible to fetch data from the server and only update or add models where necessary, by comparing the existing data with the newly arriving data?
In my app, I addressed the reset question by adding a new method called fetchNew:
app.Collection = Backbone.Collection.extend({
    // fetch list without overwriting existing objects (copied from fetch())
    fetchNew: function(options) {
        options = options || {};
        var collection = this,
            success = options.success;
        options.success = function(resp, status, xhr) {
            _(collection.parse(resp, xhr)).each(function(item) {
                // added this conditional block
                if (!collection.get(item.id)) {
                    collection.add(item, {silent: true});
                }
            });
            if (!options.silent) {
                collection.trigger('reset', collection, options);
            }
            if (success) success(collection, resp);
        };
        return (this.sync || Backbone.sync).call(this, 'read', this, options);
    }
});
This is pretty much identical to the standard fetch() method, except for the conditional statement checking for item existence, and using add() by default, rather than reset. Unlike simply passing {add: true} in the options argument, it allows you to retrieve sets of models that may overlap with what you already have loaded - using {add: true} will throw an error if you try to add the same model twice.
This should solve your caching problem, assuming your collection is set up so that you can pass some kind of page parameter in options to tell the server which page of results to send back. You'll probably want to add some sort of data structure within your collection to track which pages you've loaded, to avoid making unnecessary requests, e.g.:
app.BigCollection = app.Collection.extend({
    initialize: function() {
        this.loadedPages = {};
    },
    loadPage: function(pageNumber) {
        if (!this.loadedPages[pageNumber]) {
            this.fetchNew({
                page: pageNumber,
                success: function(collection) {
                    collection.loadedPages[pageNumber] = true;
                }
            });
        }
    }
});
Backbone.Collection.fetch has an option {add:true} which will add models into a collection instead of replacing the contents.
myCollection.fetch({add:true})
So, in your first scenario, the items from page2 will get added to the collection.
As far as your 2nd scenario goes, there's currently no built-in way to do that.
According to Jeremy that's something you're supposed to do in your App, and not part of Backbone.
Backbone seems to have a number of issues when used for collaborative apps where another user might be updating models that you have client-side. I get the feeling that Jeremy focuses on single-user applications, and the above ticket discussion exemplifies that.
In your case, the simplest way to handle your second scenario is to iterate over your collection and call fetch() on each model. But, that's not very good for performance.
For a better way to do it, I think you're going to have to override collection._add and go down the route dalyons did on this pull request.
I managed to get update in Backbone 0.9.9 core. Check it out as it's exactly what you need http://backbonejs.org/#Collection-update.
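A minimal usage sketch, per the linked documentation (Backbone 0.9.9; in Backbone 1.0 this method was renamed to set):

// merge server state into the collection instead of resetting it:
// new models are added, existing ones are merged by id,
// and models missing from the data are removed
myCollection.fetch({ update: true });

// or apply an array of attributes directly
myCollection.update([
    { id: 1, name: 'changed' },
    { id: 7, name: 'brand new' }
]);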

Backbone.js appending more models to collection

I have a following code that fetches data from the server and add to the collection.
// common function for adding more repos to the collection
var repos_fetch = function() {
    repos.fetch({
        add: true,
        data: {limit: curr_limit, offset: (curr_offset * curr_limit)},
        success: function() {
            curr_offset++;
            console.log(repos.length);
        }
    });
};
Each time I call the function "repos_fetch", the data is retrieved from the server and added to the collection "repos".
My problem is that I want to APPEND to the collection instead of REPLACING it, which is why I put the option "add: true" there.
But the function above looks like it keeps replacing the data in the collection.
What's even stranger is that if I remove the line "curr_offset++;", the data gets appended! "curr_offset" just increments, so I get different sets of data.
What's going on here?
