Backbone event aggregator - notify when all event handlers are done processing - JavaScript

In our project we have an event aggregator used to save the data. We have many views that listen to the save and do their work; it's all independent.
Now what we want to do is have a mechanism to know that all views are done with the save.
Basically, that means finding out whether all event subscribers are done with their work.
Do you know of a pattern we can use, or do we need to redesign the save infrastructure?

Seems not dissimilar from what jQuery Deferreds allow you to do:
http://api.jquery.com/jQuery.when/
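One way to apply that here is to have every subscriber hand a Deferred back to the publisher, and then wait on all of them with $.when(). A rough sketch, where the vent aggregator, the save:data event name, and doAsyncSave() are all invented for illustration:

var vent = _.extend({}, Backbone.Events);

// Subscriber: do the work, and hand a promise back via the event payload.
vent.on('save:data', function (payload) {
  var work = $.Deferred();
  doAsyncSave(function () {       // hypothetical async work in this view
    work.resolve();
  });
  payload.promises.push(work.promise());
});

// Publisher: trigger the event, then wait for every collected promise.
var payload = { promises: [] };
vent.trigger('save:data', payload);
$.when.apply($, payload.promises).done(function () {
  console.log('all subscribers finished saving');
});

Each view stays independent; the only change is that subscribers push a promise into the payload so the publisher knows when everyone is done.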

Related

Postpone Synchronize Backbone.js Collection with Server

I have a Backbone Collection that users are performing CRUD-type activities on. I want to postpone any changes from being propagated back to the server — the Collection.sync() should not happen until the user initiates it (like POSTING a form).
As it stands, I have been able to implement on-the-fly updates with no issue (by calling things like Model.destroy() on models when they are deleted, or Collection.add() to add new models to the collection). As I understand it, I could pass the {silent: true} option to my models, preventing .sync() from being called during .add()/.destroy(), but from what I can tell, that could lead to some headaches later.
I have considered overriding Backbone.sync, but I am not sure if that is the best route — I feel like there is some way to hook into some events, but I am not sure. Of course I have read through the Backbone docs, annotated source, and relevant SO questions before posting this, but I have hit a wall trying to extrapolate for this particular situation.
Eventually I will need to implement this in many places in my application, which is why I am concerned about best-practices at this stage. I am looking for some guidance/suggestions/thoughts on how to proceed with preventing the default behavior of immediately syncing changes with the remote server. Any help is appreciated — thank you for your time!
EDIT:
I went with Alex P's suggestion of refactoring: in my collection I set up some attributes to track the models that have been edited, added, or deleted. Then, when the user triggers the save action, I iterate through the lists and do the appropriate actions.
The first step is to ensure that your collection is being synchronised when you suspect it is. Collection.add() shouldn't trigger a Collection.sync() by default (it's not mentioned in the method documentation or the list of events, and I couldn't see a trigger in the annotated source).
Model.destroy() does trigger a sync(), but that shouldn't be a surprise - it's explicitly defined as "destroying a model on the server", and that sync() is performed on the model, not the collection. Your destroyed models will be removed from any collections that contain them, but I wouldn't expect those collections to sync() unless explicitly asked.
If your collections really are sync()ing when you're not expecting them to, then the most likely culprit is an event listener somewhere. Have you added any event listeners that call sync() for you when they see add or remove events? If your collection should sync() only on user interaction, can you remove those event listeners?
If not, then passing {silent: true} into your methods might be a viable approach. But remember that this is just stopping events from being emitted - it's not stopping that code from running. If something other than an event listener is triggering your sync()s, then preventing those events from being emitted won't stop them.
It would also be worth considering a wider refactor of your app. Right now you modify the collection and models immediately, and try to delay all sync()s until after the user clicks a button. What if you cached a list of all models to destroy & items to add, and only performed the actions when the button is clicked? Storing the model IDs would be sufficient to destroy them, and storing the collection ID and model ID would let you add items. It also means you don't have to fetch() the collection again if the user decides not to save their changes after all.
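A minimal sketch along the lines of the refactor the asker eventually went with (tracking pending changes on the collection and only talking to the server on save); the names pendingAdds, pendingDestroys, queueAdd, queueDestroy and commit are invented:

var Items = Backbone.Collection.extend({
  url: '/items',                  // hypothetical endpoint

  initialize: function () {
    this.pendingAdds = [];        // new models to POST on save
    this.pendingDestroys = [];    // ids of existing models to DELETE on save
  },

  queueAdd: function (attrs) {
    var model = new this.model(attrs);
    this.add(model);              // updates the UI only, no server call
    this.pendingAdds.push(model);
  },

  queueDestroy: function (model) {
    if (!model.isNew()) this.pendingDestroys.push(model.id);
    this.remove(model);           // removes locally only
  },

  // Called when the user clicks "Save"
  commit: function () {
    var self = this;
    _.each(this.pendingDestroys, function (id) {
      var doomed = new self.model({ id: id });
      doomed.urlRoot = _.result(self, 'url');
      doomed.destroy();           // issues the DELETE now
    });
    _.each(this.pendingAdds, function (model) {
      model.save();               // issues the POST now
    });
    this.pendingAdds = [];
    this.pendingDestroys = [];
  }
});

Here the local collection is updated immediately so the UI stays in sync, but nothing touches the server until commit() runs.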

Deferred objects VS backbone events for View updates in Marionette

I am learning Marionette / Backbone with the excellent book "Backbone.Marionette.js: A Gentle Introduction".
In the chapter 'Handling Data Latency', the author (David Sulc) uses jQuery Deferred objects / promises to pass data from the fetch() action (called in entities/contact.js) to the controllers (show_controller.js and list_controller.js in the example).
Code is available here : https://github.com/davidsulc/marionette-gentle-introduction/commit/de25497a1fa028f6a4471b0003645307a6e426f8
I understand how it works, but in my previous reading on Backbone, this kind of thing was handled by Backbone events: we listen to a 'change' event on the model and call render() when it is triggered. So I have a few questions about that:
1/ Are deferred objects better than backbone events for updating the view when we get a response from the server?
2/ What are the pros and the cons of the 2 solutions?
3/ Is it wrong / bad design to use backbone events in this situation?
Thanks for your help!
A specific use-case would be where you want to load/instantiate a view only when a model/collection has been fetched from the server. In that case you can use the Deferred object because that'll tell you exactly when the data has been loaded.
Of course you can listen to a change event on the model/collection, but you'd need to have a view instantiated already or add an extra listener to your model/collection to create the view.
So, to answer your questions:
1. It really depends: if you don't mind updating your view, then events are fine. If you want to instantiate a view later on, deferreds are a good fit.
2. See the introduction above.
3. Not really. It also depends on the scale of your application. As S_I_R mentions as well, deferreds will probably result in slightly cleaner code.
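A minimal sketch of that first case, where ContactsCollection, ContactsListView and the #content element are hypothetical:

var contacts = new ContactsCollection();
var fetching = contacts.fetch();   // fetch() returns a jqXHR, which is a promise

fetching.done(function () {
  // Only now do we know the data is there, so it's safe to build the view.
  var view = new ContactsListView({ collection: contacts });
  $('#content').html(view.render().el);
});

fetching.fail(function () {
  // Handle the error: show a message, offer a retry, etc.
});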
OK, let me try explaining:
1. Deferred objects have promises, so the callbacks are built right in. In a data-driven web application it's better to use them, so that you avoid callback hell with asynchronous functions.
2. If you need to respond to a user-control event, you need trigger or the events from Backbone.Events, and in that case you have to propagate your event through the controllers manually. A deferred object, on the other hand, reports when it has finished, so at that point you can populate your view or do other work.
3. The deferred mechanism was introduced to ease situations where manual callbacks become overwhelming. It is much easier to use promises instead of callbacks, which makes the code cleaner.
And since JavaScript is single-threaded, promises are a convenient way to manage asynchronous operations.
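To illustrate the callback-hell point, compare the two styles. fetchUser, fetchOrders and renderOrders are hypothetical; in the promise version each of them returns a jQuery promise:

// Nested callbacks grow sideways...
fetchUser(function (user) {
  fetchOrders(user, function (orders) {
    renderOrders(orders, function () {
      console.log('done');
    });
  });
});

// ...whereas deferreds/promises chain flatly (jQuery 1.8+ .then()):
fetchUser()
  .then(fetchOrders)
  .then(renderOrders)
  .done(function () {
    console.log('done');
  });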
1) Are deferred objects better than backbone events for updating the view when we get a response from the server?
Deferred objects are better.
I'll explain why. Relying on the view to re-render after the change or sync event on its model or collection can lead to visual anomalies. Let's take an example where you fetch a collection from a server and display its models as a list. When the collection has zero models you want to display 'No models found'. In this case, before the collection is fetched and a 'change' event is fired, the user will see 'No models found' on the screen. As soon as the collection is fetched, the view will re-render itself and the user will then see the models. This can lead to confusion. Although this is just one of many use cases, I am sure you get my point.
Once your object has been fetched from the server you can then bind the view's render() method to the object's 'change' or 'sync' event for any subsequent changes that may happen to the object.
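A quick sketch of that flow, with ItemsCollection, ItemsListView and #content standing in for your own names:

var list = new ItemsCollection();
var view = new ItemsListView({ collection: list });

// Render only once the initial data is in, avoiding the premature
// 'No models found' flash described above.
list.fetch().done(function () {
  $('#content').html(view.render().el);
  // From here on, re-render on any later change to the data.
  view.listenTo(list, 'sync change', view.render);
});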
2) What are the pros and the cons of the 2 solutions?
I think the answer to question 1 answers this question as well.
3) Is it wrong / bad design to use backbone events in this situation?
Probably. I would go with S_I_R's answer on this one.

Why use a callback function over triggering an event in jQuery?

I've seen the callback() concept used a lot in jQuery plugins and I'm starting to think triggering custom events could be a better alternative.
jQuery has a built-in mechanism to trigger('A_CUSTOM_EVENT'), so why don't plugin authors simply trigger a 'COMPLETE_EVENT' rather than insist we pass in a callback function to handle this 'complete' phase?
Nick
This depends on what you are trying to achieve - it's an architectural choice.
Basically the event paradigm is open, non-private and persistent. You have a public event that everybody can register for, and their handlers are called every time the event fires until they unregister from it. This makes sense for recurring events.
Example: Registering to a hover event.
The callback paradigm is isolated, private and disposable. Someone calls your code and hands over a private callback, which is disposed of once it has executed. In most cases its usability is limited (to a single point in time) and/or it should not necessarily be public.
Example: Handling an ajax response.
Both of these paradigms have advantages and drawbacks. Using one or the other is up to you and how you want your application to be used.
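As a sketch of how the two look side by side in a plugin (the loadStuff plugin name, the loadstuff:complete event and the /fragment URL are all invented):

$.fn.loadStuff = function (options) {
  var $el = this;
  $.get(options.url).done(function (data) {
    $el.html(data);

    // Callback style: private and disposable, one consumer.
    if (typeof options.complete === 'function') {
      options.complete(data);
    }

    // Event style: open and persistent, any number of listeners.
    $el.trigger('loadstuff:complete', [data]);
  });
  return this;
};

// Consumers can pick either style:
$('#panel').loadStuff({
  url: '/fragment',
  complete: function (data) { console.log('callback ran'); }
});
$('#panel').on('loadstuff:complete', function (e, data) {
  console.log('event fired');
});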

Delete models and collections

Using Backbone.js:
When a user logs out is it normal to delete a bunch of models and collections?
That's what I plan to do in my application to prevent zombie data/bindings, but I don't know if it's the best way to handle things.
If this is good practice, should I just call delete this on cleanup?
The zombies you need to worry about come from binding to events. I wrote a post about this from the perspective of views: http://lostechies.com/derickbailey/2011/09/15/zombies-run-managing-page-transitions-in-backbone-apps/
In your case with models, you should do the unbinding first and then delete the models and collections you don't need. Calling delete whatever is the best way to ensure that things are truly gone. Be sure to unbind from your model and collection events first, or you'll end up with bindings that point to undefined and cause exceptions.
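A rough cleanup sketch along those lines, assuming a Backbone version with listenTo()/stopListening(); app.mainView, app.currentUser and app.todos are hypothetical references held by your application:

function cleanupOnLogout(app) {
  // Views: stopListening() drops everything registered with listenTo(),
  // and remove() also unbinds the DOM events and detaches the element.
  app.mainView.stopListening();
  app.mainView.remove();

  // Models/collections: off() clears handlers bound directly on them.
  app.currentUser.off();
  app.todos.off();

  // Now it's safe to drop the references themselves.
  delete app.currentUser;
  delete app.todos;
  delete app.mainView;
}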

Unsubscribing from a published event with JavaScript

I have a website that users can dynamically add widgets to. These widgets use the Peter Higgins pub/sub plug-in to $.(subscribe) to an event that I $.(publish) from another 'core' module.
I have widgets in their own namespace like this:
km.widget.name1,
km.widget.name2,
etc.
So the handles created by $.subscribe() aren't global.
I do not know how to unsubscribe these widgets when the user decides to remove the widget from their custom page.
Also, how would I know which widget to unsubscribe from?
This doesn't solve your problem directly, but it may very well help you out. It comes from a recent blog post by Sam Clearman, who describes a way to handle publish/subscribe events without using that plugin:
jQuery custom events provide a built-in means to use the publish/subscribe pattern in a way that is functionally equivalent, and syntactically very similar, to Higgins' pub/sub plugin: just bind observers to document.
Doing it this way, you may be able to solve your current issues.
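For example, using namespaced handlers so each widget can be unbound individually (the km:dataUpdated event name and the .widgetName1 namespace are invented):

// Subscribe inside a widget, using a jQuery event namespace per widget:
$(document).on('km:dataUpdated.widgetName1', function (e, data) {
  console.log('widget 1 reacting to', data);
});

// Publish from the core module:
$(document).trigger('km:dataUpdated', [{ saved: true }]);

// Unsubscribe just that widget when it is removed from the page:
$(document).off('.widgetName1');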
I haven't used the pubsub plugin before, but I just glanced at the source code, and it looks like you can unsubscribe in exactly the same way as you subscribe, just using $.unsubscribe(...) rather than $.subscribe(...).
Is this something you already know, and the problem is caused by your widget namespacing? I'm not really sure of what you mean by namespacing, anyway, since JavaScript doesn't support true namespaces (just objects - which I'm guessing is what you're using).
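A sketch of how each widget could keep its own handle, assuming the Higgins plugin where $.subscribe() returns a handle that $.unsubscribe() accepts back (the core/dataSaved topic and the init/destroy methods are invented):

var km = { widget: {} };   // stand-in for the existing namespace object

km.widget.name1 = (function () {
  var handle;

  function init() {
    handle = $.subscribe('core/dataSaved', function (data) {
      console.log('widget name1 reacting to', data);
    });
  }

  function destroy() {
    // Call this when the user removes the widget from their page.
    $.unsubscribe(handle);
  }

  return { init: init, destroy: destroy };
})();

Because each widget module holds its own handle, knowing which widget to unsubscribe is just a matter of calling that widget's destroy().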
