Don't render backbone marionette collection view on change? - javascript

I'm working on an app that has real-time messages with socket.io, and I also need to save the messages. I chose to use MongoDB, and on the client side I am using Backbone.
My problem is that when I emit a new message, all the sockets/users get the message in real time, but the socket/user that sent the message gets it twice: once from the socket and once from the collection re-rendering.
Here is what my collection view looks like.
module.exports = CollectionView = Marionette.CollectionView.extend({
    className: 'collection',
    initialize: function() {
        //this.listenTo(this.collection, 'change', this.render);
    },
    itemView: MessageView
});
I commented out this.listenTo(this.collection, 'change', this.render);. So I am asking this question because maybe Marionette renders the CollectionView by default? Can someone explain how I can prevent the message from being appended twice for the socket/user that sends it?
Edit: Just got an idea: instead of appending the message, I could just fetch the collection when a new message is made. I tried it and it works; maybe there is a better way? I'm still thinking, and I'm open to new ideas!
Edit: This worked pretty well for me. Instead of appending the HTML, I just make a GET request for all the connected sockets, so they get the fresh data.
createMessage: function(data) {
    window.App.data.messages.fetch({
        success: function() {
            console.log('success');
        }
    });
    //this.$el.find('.message-content').append('<div class="message"><b>'+data.username+':</b>'+data.message+'</div>');
    window.App.core.vent.trigger('app:log', 'Chat View: Received a new message!');
}
The only issue with the above is that the sequence is POST, then GET, within milliseconds of each other, so the GET request may return data before the POST fully finishes.
So I am trying to figure out how I can set a callback so that the GET is made only once the POST has successfully added to the collection. I'll share the solution if I come to it.
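The shape I'm after is something like this: run the GET only inside the POST's success callback, so the two can't race (plain sketch, the function names are mine):

```javascript
// Sketch: sequence the POST and the GET with callbacks so the GET
// can never fire before the POST has finished on the server.
function saveThenFetch(save, fetch, done) {
  save(function (err) {          // POST the new message first
    if (err) { return done(err); }
    fetch(done);                 // GET the fresh collection only afterwards
  });
}
```

With Backbone this should map onto something like messages.create(data, { wait: true, success: function() { messages.fetch(); } }), since create forwards the success option to the underlying save.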

You're not showing your socket.io client code, but I think your second approach is better. Socket should only send/receive data. I assume you have at least 1 event in your socket for "receiveMessage" and 1 emit for "sendMessage".
I think you could, in your client-side socket:
When receiving a message (or I'm notified that I have a new message), add the message to your collection using Collection.add(message). Marionette will render that message for you.
When sending a message, just add it to your collection, or wait for a callback from the server (see the docs) to be sure it was received correctly before adding to the collection.
Never, never append something to the view's HTML with jQuery if you're using a Marionette view! ;)
For your initial message load, use Collection.fetch() the first time (for example, as soon as your socket is connected) to get all the messages that were already on the server. From that point on, add individual messages as they come in instead of fetching them all (you'll save bandwidth).
What I do in a similar application is send a "Hello" message from the socket upon first connection that includes the data I need. Then I just do Collection.reset(data) with what was sent from the socket, and you're good to go. Your socket will initialize your collection and then update it one message at a time.
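A minimal sketch of that client-side wiring (the event names 'receiveMessage' and 'hello' are assumptions, and `collection` stands in for your Backbone collection):

```javascript
// Sketch: socket only moves data; the collection drives rendering.
function wireSocket(socket, collection) {
  socket.on('receiveMessage', function (message) {
    collection.add(message);     // Marionette renders the new itemView
  });
  socket.on('hello', function (messages) {
    collection.reset(messages);  // initial payload on first connection
  });
}
```

With a real Backbone collection, add ignores models whose id is already present, which is exactly what keeps the sender from seeing its own message twice.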
Hope it helps!

Related

Get an http call from express to angular when event happens

I'm building an Angular app where about a thousand people will connect simultaneously to book a ticket. I want only "XYZ" of them to access the registration Angular component at the same time. The others will see a "waiting room" component until it's their turn.
I set up the whole thing like this:
User enters the page.
I make an http call to expressjs server
The server checks whether the "connections" collection contains fewer than XYZ docs
If true, it unlocks the user registration component and, with an HTTP POST request, creates a new doc in the DB. If false, it leaves it hidden and shows the waiting-room component
When user leaves the page, his doc in "connections" collection gets destroyed with an http delete call.
Fully working.
The problem is that I now want to create a kind of "priority" system, because as it stands, if you just refresh you may get lucky and gain access even if you just arrived, while someone else has been waiting far longer. So I introduced a "priority" system: when the user makes the first HTTP call, if the user is not allowed, the server creates a timestamp and pushes it into an array.
const timestamps = [];

// ...

// this below is in the http get req handler
Connessione.countDocuments({}, (err, count) => {
    if (count <= nmax) {
        console.log("Ok");
        res.status(200).json({allowed: true});
    } else {
        const timestamp = req.params.timestamp;
        timestamps.push(timestamp);
        console.log("Semo troppi"); // "There are too many of us"
        res.status(401).json({allowed: false});
    }
});
The idea is to listen for DB changes, and when there are only XYZ-1 docs in the DB, make a call to the frontend of the user with the earliest timestamp to tell them: "Hey there, we're done. You can go", and unlock the registration component for them.
The problem is that I can't keep making HTTP requests from Angular every second until there's a free place...
Is there any method to send a request to the server and, when the server says OK, have it call Angular and say "Hey dude. You can go!"?
Hope you understood my question. If not, ask me in the comments.
Thanks in advance
Even I had trouble with sockets in the beginning, so I'll try to explain the concept in a simple way. Whenever you write an API or endpoint, you have a one-way connection: you send a request to the server and it returns some response, as shown below.
Event 1:
(Client) -> Request -> (Server)
Event 2:
(Client) <- Response <- (Server)
For APIs, without a request you cannot get a response.
To overcome this issue, as of now I can think of two possible ways.
Using sockets. With sockets you can create a two-way connection, something like this:
(Server) <-> data <-> (Client)
It means you can pass data both ways, client to server and server to client. So whenever an event occurs (some data is added or updated in the database), you can emit or broadcast it to the client, and the client can listen on the socket and receive it.
In your case, as it's a two-way connection, you can emit the data from Angular and listen for the server's events as well.
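For the waiting-room case specifically, the server-side logic can be sketched in a few lines (the socket wiring is left out; `nmax` and the count come from the Express code in the question, the rest of the names are mine):

```javascript
// Sketch: grant access while there's room, otherwise queue in arrival order.
var waiters = [];   // socket ids of users in the waiting room, oldest first

function handleRequest(count, nmax, socketId) {
  if (count < nmax) { return 'granted'; }
  waiters.push(socketId);          // arrival order doubles as priority
  return 'queued';
}

// Call this when a connection doc is deleted and a slot frees up.
function grantNext() {
  return waiters.shift() || null;  // oldest waiter gets the free slot
}
```

In the delete handler you'd then do something like `var id = grantNext(); if (id) io.to(id).emit('accessGranted');`, since each socket.io socket joins a room named after its own id, and the Angular client just listens for 'accessGranted'.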
I've attached few links at the bottom. please have a look.
Using XML/AJAX requests. This is not a preferable method: using setInterval you can call the server every 5 seconds or so and do the operation needed.
setInterval(ajaxCall, 5000); //5000 MS == 5 seconds
function ajaxCall() {
//do your AJAX stuff here
}
Links:
https://socket.io/docs/
https://alligator.io/angular/socket-io/

Need guidance on efficiently fetching from Django server with Backbone.js

I have an app built on Backbone.js and Django that is just something I'm building for practice.
It's a webchat app that lets users post a message and then saves it to a database, all while constantly fetching new data into a collection I have, so it "updates live" with the server.
On a basic level, the front-end application works with a message model to model a user's message, a messages collection to store all of the posts, and a chat view to render the collection in the chat window.
The code I use to constantly fetch new data from the server is:
// This code is inside the `chat` view's `initialize` function,
// so `that` is set to `this` (the `chat` view) higher up in the code.
window.setInterval(function() {
    theColl.fetch({
        success: function() {
            that.render();
        }
    });
}, 2000);
I don't have reset: true because I thought it would be more efficient to just append new messages to the collection as they were retrieved from the server, instead of reloading every model already in it.
Is this an efficient way to run a simple webchat? As the application has to fetch more models, I've noticed that it gets extremely sluggish, and adds a delay to the user input. Here is a photo to show what I think is one problem (p.s. ignore the stupid messages I wrote):
As the server sends back new models and the collection refreshes, I think that my app is initializing old chat models again, because at this point in the application the collection itself only had 25 models inside of it.
Is there something I can do to stop the reinitialization of every model in the database? As you can see, even though there are only 25 unique models in the chat, and in the database, after letting the app run for 2 minutes or so I get models with CIDs up to 460.
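What I'm imagining is tracking which messages are already on screen and appending only the new ones, instead of re-rendering (and re-initializing views for) everything on each fetch. A rough sketch of the idea (names are mine):

```javascript
// Sketch: render only models that haven't been rendered yet.
function renderNew(renderedIds, fetched, renderOne) {
  fetched.forEach(function (m) {
    if (renderedIds.indexOf(m.id) === -1) {
      renderOne(m);           // append just this message to the chat window
      renderedIds.push(m.id);
    }
  });
  return renderedIds;
}
```

In Backbone terms I believe this is roughly this.listenTo(theColl, 'add', ...), since fetch without reset merges the response and fires 'add' only for genuinely new models.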
Regardless of that, I've noticed that if I flood the server with new messages things go awry, like new messages appear out of place, and the server gets backed up. The latter may be because I'm using the cheapest option on Digital Ocean to host the app, but I'm not sure what the first problem comes from. It may be from my Django app's views.py or the rest of my Backbone code, I'm not really sure how to debug it.

Auto detect if DB table changes

I have a small application where a users can drag and drop a task in an HTML table.
When user drops the task, I call a javascript function called update_task:
function update_task(user_id, task_id, status_id, text, uiDraggable, el) {
    $.get('task_update.php?user_id='+user_id+'&task_id='+task_id+'&status_id='+status_id, function(data) {
        try {
            jsonResult = JSON.parse(data);
        } catch (e) {
            alert(data);
            return;
        }
        // ... (continues below)
    });
}
In task_update.php I GET my values; user_id, task_id & status_id and execute a PDO UPDATE query, to update my DB. If the query executes correctly, I
echo json_encode(array(
    'success' => true
));
And then I append the task to the correct table cell
if (typeof jsonResult.success != 'undefined') {
    $(uiDraggable).detach().css({top: 0, left: 0}).appendTo(el);
}
This has all worked fine. But I'm starting to realize that it's a problem when 2 or more people are making changes at the same time. If I'm testing with 2 browsers and have the site open in both, for example: if I move a task in browser 1, I have to manually refresh the page in browser 2 to see the changes.
So my question is: how can I make my application auto-detect when a change to the DB table has been made? And how can I update the HTML table without refreshing the page?
I have looked at timed intervals for updating pages, but that wouldn't work for me, since I really don't want to force the browser to refresh. A user can, for example, also create a new task in a lightbox iframe, so it would be incredibly annoying if their browser refreshed while they were trying to create a new task.
So yeah, what would be the best practice for me to use?
Use Redis and its publish/subscribe feature to implement a message bus between your PHP app and a lightweight websocket server (Node.js is a good choice for this).
When your PHP modifies the data, it also emits an event in Redis that some data has changed.
When a websocket client connects to the Node.js server, it tells the server what data it would like to monitor, then, as soon as a Redis event was received and the event's data matches the client's monitored data, notify the client over the websocket, which then would refresh the page.
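The matching step on the Node.js side can be sketched like this (the names and data shapes are mine; the Redis subscription and the actual websocket send are left out):

```javascript
// Sketch: given each websocket client's monitored keys, pick which clients
// to notify when a Redis event reports that `changedKey` was modified.
function clientsToNotify(subscriptions, changedKey) {
  // subscriptions: { clientId: ['key1', 'key2', ...] }
  return Object.keys(subscriptions).filter(function (id) {
    return subscriptions[id].indexOf(changedKey) !== -1;
  });
}
```

The surrounding wiring would subscribe to a Redis channel (with node_redis that's along the lines of `sub.subscribe('changes')` plus a `'message'` handler) and push a notification over each matching client's websocket.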
Take a look at this question with answers explaining all of this in detail, includes sample code that you can reuse.
I would use AJAX to check the server at a reasonable interval. What's reasonable depends on your project - it should be often enough that changes on one end don't mess up what another user is doing.
If you're worried about this being resource-intensive, you could use APC to save last-modified times for everything that's active - that way you don't have to hit the database when you're just checking whether anything has changed.
When things have changed, use AJAX for that as well, and apply the changes directly in the page with JavaScript/jQuery.
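The polling check itself stays tiny if the endpoint only returns the last-modified timestamp (sketch; the endpoint and state handling are assumptions):

```javascript
// Sketch: only fetch the full task list when the server-side
// last-modified timestamp has actually moved.
function checkAndUpdate(state, serverTs, fetchFull) {
  if (serverTs > state.lastSeen) {
    state.lastSeen = serverTs;
    fetchFull();               // pull and apply the changes with jQuery here
    return true;
  }
  return false;
}
```

You'd drive this from setInterval with a cheap `$.get` to the timestamp endpoint, so the database is only touched when something really changed.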
If you really need to detect DB changes - write database triggers.
But if nobody except your code changes the data, you can implement some observation in your code:
Implement the Observer (EventListener) pattern, or use an existing implementation.
Trigger events when anything meaningful happens.
Subscribe to these events.
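A bare-bones version of that pattern in JavaScript, since the front end here is jQuery-based (sketch; the event name is just an example):

```javascript
// Minimal Observer/EventListener implementation: trigger events whenever
// your code changes the data, subscribe wherever you need to react.
function EventBus() {
  this.listeners = {};
}
EventBus.prototype.on = function (event, fn) {
  (this.listeners[event] = this.listeners[event] || []).push(fn);
};
EventBus.prototype.trigger = function (event, payload) {
  (this.listeners[event] || []).forEach(function (fn) { fn(payload); });
};
```

So the drop handler would do `bus.trigger('task:moved', {taskId: ...})`, and anything else on the page that cares subscribes with `bus.on('task:moved', ...)`.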

sails.js / socket.io sending destroyed, but not created

I'm using v0.10.
Simple blueprint request for a messaging app (my model is named message)
var socket = io.connect('http://localhost:1337');

// initiate the request
socket.request('/message', {}, function(users) {});

socket.on('message', function(m) {
    console.log(m);
});
Using Postman to delete a message sends the delete to the client; however, create does not send anything. Thank you.
UPDATE:
I created this repo to reproduce the issue: https://github.com/jamescharlesworth/testProject
Take a look at http://beta.sailsjs.org/#/documentation/reference/Upgrading - search that page for "socket" and you'll find the differences in Sails 0.10.
The most important one is that the first parameter of the "on" is now the model name instead of the type of message. Since "message" was previously used as one of the message types, perhaps there is a remaining bug or a filter that swallows notifications for your "message" model on create.
Have you tried naming your model differently? Just to validate that the issue is with your model's name.
Additionally: if you want some transparent binding of models into an angular app, you can do it seamlessly with angular-sails-bind:
https://github.com/diegopamio/angular-sails-bind
I made it for my own project and then decided to publish it as a separate library so everybody could benefit, and I could have my first experience developing a Bower package.
I hope it could help you.
In your example you are using autosubscribe: ['destroy', 'create', 'update'],
while in Sails 0.10 the event names have an "ed" suffix:
The events that were formerly create, update, and destroy are now created, updated, and destroyed.
That may be your issue.
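Besides fixing the autosubscribe list to the past-tense names, the client handler has to match them too. A sketch of what that could look like (the message shape with `verb`, `data`, and `id` fields is my assumption about the 0.10 blueprint messages):

```javascript
// Sketch: react to the past-tense verbs sails 0.10 sends for a model.
function handleModelEvent(msg, items) {
  if (msg.verb === 'created') {
    return items.concat([msg.data]);
  }
  if (msg.verb === 'destroyed') {
    return items.filter(function (it) { return it.id !== msg.id; });
  }
  if (msg.verb === 'updated') {
    return items.map(function (it) { return it.id === msg.id ? msg.data : it; });
  }
  return items;   // unknown verb: ignore
}
```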

multiple destroy()s locking up backend in backbone.js

In the example Todos app for backbone.js, this takes place:
clearCompleted: function() {
    _.each(Todos.done(), function(todo) { todo.clear(); });
    return false;
},
This deletes multiple models by sending out multiple HTTP DELETE requests to whatever service is backing the app. In the example's case that is no problem because they are using a local-storage solution.
But when I try a similar process with a database on the backend (SQLite/DataMapper/Sinatra), the fact that it sends off multiple DELETE requests simultaneously causes the DB to lock and send back an error.
Is this something any of you have run into?
I can think of two ways around it:
Have a destroyBatch() that sends an array of id's into a DELETE call, and have sinatra sniff out the multiple ids and handle the deletes all at once server-side.
Have a destroyAsync() on the client-side that pushes the ids into a queue and calls destroy() on the models one-by-one in an async chain reaction until they are all gone ( but you would see them being deleted one by one on the screen with a pause in between each).
Do either of those solutions seem reasonable, or am I a frail goose flapping wildly?
-j
Option 2 is not a viable one. Your user can click back or close the window and the deletion will not complete. So out with this one.
This leaves us to:
Fix your initial problem of locks in the DB :D
Send all ids to be deleted at once.
I would try to solve the initial problem first. What is causing the locks? I am pretty sure that in development mode Sinatra processes a single request at a time, so sending a bunch of deletes will actually be serialized on the backend. That is another question altogether, linked to the SQLite error returned.
As for sending the deletions in batches: it is a good idea, but it deviates from the standard RESTful controller, so you will have to handle that yourself, as Backbone does not provide a way to do this. You can add a deleteAll method on the collection and handle the sync from there (do not forget to send events if you are relying on them).
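A sketch of what that deleteAll could build - one request carrying every id instead of N DELETEs (the URL and payload shape are mine, not a Backbone convention):

```javascript
// Sketch: collect every model id and describe a single batch request.
function buildBatchDeleteRequest(models, baseUrl) {
  var ids = models.map(function (m) { return m.id; });
  return {
    url: baseUrl + '/batch_destroy',
    type: 'POST',                 // some stacks mishandle bodies on DELETE
    contentType: 'application/json',
    data: JSON.stringify({ ids: ids })
  };
}
```

You'd hand the result to $.ajax and, on success, remove the models from the collection yourself and trigger whatever events your views rely on.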
