multiple destroy()s locking up backend in backbone.js - javascript

In the example Todos app for backbone.js, this takes place:
clearCompleted: function() {
  _.each(Todos.done(), function(todo){ todo.clear(); });
  return false;
},
This deletes multiple models by sending out multiple HTTP DELETE requests to whatever service is backing the app. In the example's case that is no problem, because it uses a local storage solution.
But when I try a similar process with a database on the backend (SQLite/DataMapper/Sinatra), the fact that it sends off multiple DELETE requests simultaneously causes the DB to lock and send back an error.
Is this something any of you have run into?
I can think of two ways around it:
Have a destroyBatch() that sends an array of ids in a single DELETE call, and have Sinatra sniff out the multiple ids and handle the deletes all at once server-side.
Have a destroyAsync() on the client side that pushes the ids into a queue and calls destroy() on the models one by one in an async chain reaction until they are all gone (but you would see them being deleted one by one on screen, with a pause between each).
Do either of those solutions seem reasonable, or am I a frail goose flapping wildly?
-j

Option 2 is not a viable one. Your user can click back or close the window and the deletion will not succeed completely. So out with this one.
This leaves us to:
Fix your initial problem of locks in the DB :D
Send all ids to be deleted at once.
I would try to solve the initial problem first. What is causing them to lock up? I am pretty sure that in development mode Sinatra processes a single request at a time, so sending a bunch of deletes will actually be serialized on the backend... That is another question altogether, tied to the SQLite error returned.
As for sending the deletions in batches: it is a good idea, but it deviates from the standard RESTful controller, so you will have to handle it yourself, as Backbone does not provide a way to do this. You can add a deleteAll method on the collection and handle the sync from there (do not forget to trigger events if you are relying on them); a rough sketch follows.
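A minimal sketch of such a deleteAll, assuming jQuery is available and a hypothetical batch endpoint (DELETE /todos/batch) that accepts a JSON array of ids and removes them in one server-side transaction:

var TodoList = Backbone.Collection.extend({
    model: Todo,
    url: '/todos',

    // Batch-delete: one request instead of one DELETE per model.
    deleteAll: function(models) {
        var self = this;
        var ids = _.map(models, function(m) { return m.id; });
        return $.ajax({
            url: this.url + '/batch',          // hypothetical batch endpoint
            type: 'DELETE',
            contentType: 'application/json',
            data: JSON.stringify({ ids: ids })
        }).done(function() {
            // remove locally so "remove" events fire and views update
            self.remove(models);
        });
    }
});

clearCompleted would then become a single call: Todos.deleteAll(Todos.done()).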

Related

Better alternative to pinging the database over and over?

I want to create a dashboard that automatically updates when new data is posted.
My first thought was to just make a javascript function and put a fetch statement in it and then loop the function every second or every couple of seconds...
Obviously, this is not a great solution. But I don't know what the better way is...
Some notes:
-PHP Server-Side Language
-Ran on Localhost so traffic is not going over the internet
Can anyone advise what I should be doing or if this is an acceptable approach?
Thanks in advance!
Server Side:
You can look for any onUpdate events if your database supports any such events
Or else just run a query at a timed interval to fetch new updates from the database (the connection to the database is made just once and all subsequent requests go through the same connection, so this isn't a bad approach).
But when it comes to client side and receiving those updates, you can make it efficient in either of the two ways:
[Simple] Use Socket.IO: push an event with your new data and listen for it on the client side. (This way the socket connection is made just once and all subsequent updates are received over the same connection; a minimal sketch follows after these options.)
Docs: https://socket.io/docs/v4/index.html
[Complex] Use HTTP stream
Example: https://gist.github.com/igrigorik/5736866
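As a rough illustration of the Socket.IO option (a sketch, not a drop-in solution): a small Node side-service polls the database on an interval and pushes any new rows to every connected dashboard. The query in fetchNewRows() and the event name are assumptions; the PHP app itself stays unchanged.

const http = require('http');
const { Server } = require('socket.io');

const server = http.createServer();
const io = new Server(server, { cors: { origin: '*' } });

let lastSeenId = 0;

// Stand-in for your real query, e.g. "SELECT * FROM readings WHERE id > ? ORDER BY id"
async function fetchNewRows(sinceId) {
    return []; // replace with a call to your database driver
}

setInterval(async () => {
    const rows = await fetchNewRows(lastSeenId);
    if (rows.length) {
        lastSeenId = rows[rows.length - 1].id;
        io.emit('dashboard:update', rows); // push to every connected client
    }
}, 2000);

server.listen(3001);

// Browser side:
//   const socket = io('http://localhost:3001');
//   socket.on('dashboard:update', rows => renderRows(rows));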

Making really simple app including front & backend skills (js, node.js, psql, react...)

I'm trying to make a simple todo app in order to understand how the frontend and backend are connected. I read some websites showing tutorials for using and connecting a REST API, an Express server, and a database, but I still was not able to get the fake data from a database. Anyway, I wanted to check whether my understanding of how they are connected and talk to each other is correct. So could you give me some advice, please?
First of all, I'm planning to use either JavaScript & HTML or React for the frontend, Express for the server, and Postgres for the database. My plan is that a user can add & delete his or her tasks. I have already created a server in my index.js file and created a database using the psql command. Now if I type "" it takes me to the page saying "Hello" (I made this endpoint), but I'm failing to seed my data into the database. Here are my questions↓
After I was able to seed my fake data into the database, how should I get the data from the database and send it to the frontend? I think I should create a new endpoint in my index.js file, something like "app.get("/api/todo", (req, res) => ..." and inside the callback function write something like "select * from [table name]". Also, from the front end, I should probably access certain endpoints using fetch. Is this correct?
Also, how can I store data which is sent from the frontend? For example, if I type my new todo into the <input> field and click the add <button>, what does the sequence of events look like? Adding an event listener to the button and connecting to the server, then creating a POST method in the server and inserting the data, kind of (?) <= sorry, this part is super unclear to me.
Displaying tasks on the frontend is also unclear to me. If I use an object like {task: clean up my room, finished: false (or 0?)} on the front end, it makes sense, but when I start using the database, I'm confused about how to display items that are not completed yet. In order to display each task, I won't use the GET method to get the data from the database, right?
Also, do I need to use knex to solve this type of problem? (or is it better to have knex, and why?)
I think my problem is that I kind of know what the frontend, server, and database are for, but it's not clear to me how they are connected to each other...
I also drew some diagrams as well, so I hope it helps you to understand my vague questions...
how should I get the data from the database and send it to the frontend? I think I should create a new endpoint in my index.js file, something like "app.get("/api/todo", (req, res) => ..." and inside the callback function write something like "select * from [table name]".
Typically you use a controller -> service -> repository pattern (a minimal sketch follows after the three layers described below):
The controller is a thin layer, it's basically the callback method you refer to. It just takes parameters from the request, and forwards the request to the service in the form of a method call (i.e. expose some methods on the service and call those methods). It takes the response from the service layer and returns it to the client. If the service layer throws custom exceptions, you also handle them here, and send an appropriate response to the client (error status code, custom message).
The service takes the request and forwards it to the repository. In this layer, you can perform any custom business logic (by delegating to other isolated services). Also, this layer takes care of throwing custom exceptions, e.g. when an item was not found in the database (throw new NotFoundException).
The repository layer connects to the database. This is where you put the custom db logic (queries like you mention), eg when using a library like https://node-postgres.com/. You don't put any other logic here, the repo is just a connector to the db.
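A minimal Express sketch of that layering, assuming node-postgres and a todos table; the names here (todoRepository, todoService, NotFoundException) are illustrative, not a prescribed API:

const express = require('express');
const { Pool } = require('pg');

const pool = new Pool(); // connection settings come from PG* environment variables

class NotFoundException extends Error {}

// Repository: only talks to the database.
const todoRepository = {
    findAll: () => pool.query('SELECT * FROM todos').then(r => r.rows),
    findById: (id) => pool.query('SELECT * FROM todos WHERE id = $1', [id]).then(r => r.rows[0])
};

// Service: business logic and custom exceptions.
const todoService = {
    listTodos: () => todoRepository.findAll(),
    getTodo: async (id) => {
        const todo = await todoRepository.findById(id);
        if (!todo) throw new NotFoundException('todo ' + id + ' not found');
        return todo;
    }
};

// Controller: thin layer translating HTTP into service calls and errors into status codes.
const app = express();

// Small helper so async controller errors become proper HTTP responses.
const handle = (fn) => async (req, res) => {
    try {
        await fn(req, res);
    } catch (err) {
        if (err instanceof NotFoundException) return res.status(404).json({ error: err.message });
        res.status(500).json({ error: 'internal error' });
    }
};

app.get('/api/todo', handle(async (req, res) => res.json(await todoService.listTodos())));
app.get('/api/todo/:id', handle(async (req, res) => res.json(await todoService.getTodo(req.params.id))));

app.listen(3000);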
Also, from the front end, I should probably access certain endpoints using fetch. Is this correct?
Yes.
Also, how can I store data which is sent from the frontend? For example, if I type my new todo into the <input> field and click the add <button>, what does the sequence of events look like? Adding an event listener to the button and connecting to the server, then creating a POST method in the server and inserting the data, kind of (?) <= sorry, this part is super unclear to me.
You have a few options:
Form submit
Ajax request: serialize the data in the form manually and send a POST request through ajax. Since you're considering a client library like React, I suggest using this approach (see the sketch below).
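For example (a sketch; the endpoint and element ids are assumptions), the button's click handler can serialize the input and POST it with fetch:

async function addTodo(task) {
    const response = await fetch('/api/todo', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ task: task, finished: false })
    });
    if (!response.ok) throw new Error('failed to save todo');
    return response.json(); // e.g. the saved row, including its generated id
}

// Wiring it to the form (element ids are hypothetical):
document.querySelector('#add-button').addEventListener('click', async () => {
    const input = document.querySelector('#todo-input');
    const saved = await addTodo(input.value);
    console.log('saved', saved); // or append it to the rendered list
    input.value = '';
});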
Displaying tasks on the frontend is also unclear to me. If I use an object like {task: clean up my room, finished: false (or 0?)} on the front end, it makes sense, but when I start using the database, I'm confused about how to display items that are not completed yet. In order to display each task, I won't use the GET method to get the data from the database, right?
If you want to use REST, it typically implies that you're not using backend MVC / server rendering. As you mentioned React, you're opting for keeping client state and syncing with the server over REST.
What it means is that you keep all state in the frontend (in memory / localStorage) and just sync with the server. What is typically applied is optimistic rendering: you manage state in the frontend as if the server didn't exist, and when a server call fails (you see this in the ajax response), you show an error in the UI and roll back the state.
Alternatively you can use spinners that wait until the server sync is complete. It makes for less snappy perceived performance, but is just as valid technically.
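A rough sketch of the optimistic approach, assuming a PATCH /api/todo/:id endpoint and placeholder render()/showError() helpers:

let todos = []; // client-side state, previously fetched from the server

async function completeTodo(id) {
    const previous = todos;

    // optimistic: update local state and re-render immediately
    todos = todos.map(t => (t.id === id ? { ...t, finished: true } : t));
    render(todos);

    try {
        const res = await fetch('/api/todo/' + id, {
            method: 'PATCH',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ finished: true })
        });
        if (!res.ok) throw new Error('sync failed');
    } catch (err) {
        // server sync failed: roll back and tell the user
        todos = previous;
        render(todos);
        showError('Could not save your change');
    }
}

function render(list) { /* placeholder: update the DOM / React state */ }
function showError(msg) { /* placeholder: show a toast or inline error */ }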
Also, do I need to use knex to solve this type of problem? (or is it better to have knex, and why?) I think my problem is I kind of know what the frontend, server, and database are for, but it's not clear how they are connected to each other...
Doesn't really matter what you use. Personally I would go with the stack:
Node Express (REST), but could be Koa, Restify...
React / Redux client side
For the backend repo layer you can use Knex if you want to; I have used node-postgres, which worked well for me.
Additional info:
I would encourage you to take a look at the following, if you're doubtful how to write the REST endpoints: https://www.youtube.com/watch?v=PgrP6r-cFUQ
After I was able to seed my fake data into the database, how should I get the data from the database and send it to the frontend? I think I should create a new endpoint in my index.js file, something like "app.get("/api/todo", (req, res) => ..." and inside the callback function write something like "select * from [table name]". Also, from the front end, I should probably access certain endpoints using fetch. Is this correct?
You are right here: you need to create an endpoint in your server which is responsible for getting data from the database. This same endpoint is then consumed by your frontend application, for example if you are planning to use ReactJS. As soon as your app loads, you get the current user ID and make a fetch call to the endpoint created above to fetch the list of todos (or any other data pertaining to that user).
Also, how can I store data which is sent from the frontend? For example, if I type my new todo into the <input> field and click the add <button>, what does the sequence of events look like? Adding an event listener to the button and connecting to the server, then creating a POST method in the server and inserting the data, kind of (?) <= sorry, this part is super unclear to me.
Okay, so far you have connected your frontend to your backend, started the application, the user is present, and you have fetched the list of todos, if any are available for that particular user.
Now, coming to adding a new todo, the most minimal flow would look something like this (a sketch of the corresponding endpoints follows the list):
User types the data in a form and submits the form
There is a form submit handler which will take the form data
Check for validation for the form data
Call the POST endpoint with payload as the form data
This Post endpoint will be responsible for saving the form data to DB
If an existing todo is being modified, then this should be handled using a PATCH request (Updating the state, if task is completed or not)
The next and possibly the last thing would be to delete a task; you can have a DELETE endpoint to remove the todo item from the list of todos.
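A sketch of those write endpoints, assuming Express with express.json() and node-postgres, and a todos(id, task, finished) table (all names are assumptions; error handling omitted for brevity):

const express = require('express');
const { Pool } = require('pg');

const app = express();
const pool = new Pool();
app.use(express.json()); // parse JSON request bodies

// Create a todo
app.post('/api/todo', async (req, res) => {
    const { rows } = await pool.query(
        'INSERT INTO todos (task, finished) VALUES ($1, false) RETURNING *',
        [req.body.task]
    );
    res.status(201).json(rows[0]);
});

// Update a todo (e.g. mark it finished)
app.patch('/api/todo/:id', async (req, res) => {
    const { rows } = await pool.query(
        'UPDATE todos SET finished = $1 WHERE id = $2 RETURNING *',
        [req.body.finished, req.params.id]
    );
    res.json(rows[0]);
});

// Delete a todo
app.delete('/api/todo/:id', async (req, res) => {
    await pool.query('DELETE FROM todos WHERE id = $1', [req.params.id]);
    res.status(204).end();
});

app.listen(3000);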
Displaying tasks on the frontend is also unclear to me. If I use an object like {task: clean up my room, finished: false (or 0?)} on the front end, it makes sense, but when I start using the database, I'm confused about how to display items that are not completed yet. In order to display each task, I won't use the GET method to get the data from the database, right?
Okay, so as soon as you load the frontend for the first time, you will make a GET call to the server and fetch the list of TODOS. Store this somewhere in the application, probably redux store or just the application local state.
Going by what you have suggested already,
{task: 'some task name', finished: false, id: '123'}
Now anytime there has to be any kind of interaction with any of the TODO item, either PATCH or DELETE, you would use the id for each TODO and call the respective endpoint.
Also, do I need to use knex to solve this type of problem? (or is it better to have knex, and why?) I think my problem is that I kind of know what the frontend, server, and database are for, but it's not clear to me how they are connected to each other...
In a nutshell, or in the most minimal sense, think of the frontend as the presentation layer and the backend and DB as the application layer.
The overall game is sending requests and receiving responses to them. The frontend is what enables an end user to create these requests; the backend (server & database) is where the requests are processed and a response is sent back to the presentation layer so the end user can be notified.
These explanations are intentionally minimal so you get the gist of it, since this question touches almost the entire scope of web development. I would suggest you read a few articles about both of these layers and how they connect to each other.
You should also spend some time understanding what a RESTful API is. That should be a great help.

Auto detect if DB table changes

I have a small application where a users can drag and drop a task in an HTML table.
When user drops the task, I call a javascript function called update_task:
function update_task(user_id, task_id, status_id, text, uiDraggable, el) {
    $.get('task_update.php?user_id=' + user_id + '&task_id=' + task_id + '&status_id=' + status_id, function(data) {
        var jsonResult;
        try {
            jsonResult = JSON.parse(data);
        } catch (e) {
            alert(data);
            return;
        }
        // ... success handling shown below
    });
}
In task_update.php I GET my values; user_id, task_id & status_id and execute a PDO UPDATE query, to update my DB. If the query executes correctly, I
echo json_encode(array(
    'success' => true
));
And then I append the task to the correct table cell
if (typeof jsonResult.success != 'undefined') {
    $(uiDraggable).detach().css({top: 0, left: 0}).appendTo(el);
}
This has all worked fine. But I'm starting to realize that it's a problem when 2 or more people are making changes at the same time. If I'm testing with 2 browsers and have the site open in both, for example: if I move a task in browser 1, I have to manually refresh the page in browser 2 to see the change.
So my question is: how can I make my application auto-detect when a change to the DB table has been made? And how can I update the HTML table without refreshing the page?
I have looked at updating the page on a timed interval, but that wouldn't work for me, since I really don't want to force the browser to refresh. A user can, for example, also create a new task in a lightbox iframe, so it would be incredibly annoying for them if their browser refreshed while they were trying to create a new task.
So yeah, what would be the best practice for me to use?
Use Redis and its publish/subscribe feature to implement a message bus between your PHP app and a lightweight websocket server (Node.js is a good choice for this).
When your PHP modifies the data, it also emits an event in Redis that some data has changed.
When a websocket client connects to the Node.js server, it tells the server what data it would like to monitor; then, as soon as a Redis event is received and the event's data matches the client's monitored data, the server notifies the client over the websocket, and the client can then refresh its view.
Take a look at this question, whose answers explain all of this in detail and include sample code that you can reuse.
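For illustration, a minimal sketch of the Node relay (socket.io plus the node-redis v4 client); the channel and event names are assumptions, and the PHP side just publishes to Redis after its UPDATE succeeds:

const http = require('http');
const { Server } = require('socket.io');
const { createClient } = require('redis');

async function main() {
    const server = http.createServer();
    const io = new Server(server);

    const subscriber = createClient();
    await subscriber.connect();

    // Relay every "task changed" event from Redis to all connected browsers.
    await subscriber.subscribe('tasks:changed', (message) => {
        io.emit('tasks:changed', JSON.parse(message));
    });

    server.listen(3001);
}

main();

// Browser side:
//   const socket = io('http://localhost:3001');
//   socket.on('tasks:changed', change => moveTaskInTable(change));
//
// PHP side, after the PDO UPDATE succeeds (phpredis):
//   $redis->publish('tasks:changed', json_encode(['task_id' => $task_id, 'status_id' => $status_id]));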
I would use ajax to check the server at a reasonable interval. What's reasonable depends on your project - it should be often enough that changes on one end don't mess up what another user is doing.
If you're worried about this being resource intensive you could use APC to save last modified times for everything that's active - that way you don't have to hit the database when you're just checking if anything has changed.
When things have changed then you should use ajax for that as well, and add the changes directly in the page with javascript/jquery.
If you really need to detect DB changes made by others, write database triggers.
But if nothing except your code changes the data, you can implement the observation in your code instead:
Implement the Observer (EventListener) pattern, or use an existing implementation.
Trigger events whenever anything meaningful happens.
Subscribe to those events (a bare-bones sketch follows).
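A bare-bones Observer sketch (event and payload names are illustrative):

function EventBus() {
    this.listeners = {};
}
EventBus.prototype.on = function(event, handler) {
    (this.listeners[event] = this.listeners[event] || []).push(handler);
};
EventBus.prototype.emit = function(event, payload) {
    (this.listeners[event] || []).forEach(function(handler) { handler(payload); });
};

var bus = new EventBus();

// Subscribers react whenever the data changes...
bus.on('task:updated', function(task) {
    console.log('task changed', task); // e.g. push the change out to other clients
});

// ...and the code that performs the UPDATE triggers the event.
bus.emit('task:updated', { task_id: 42, status_id: 2 });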

How would you create an auto-updating newsfeed without a reload?

How would I go about creating an auto-updating newsfeed? I was going to use NodeJS, but someone told me that wouldn't work when I got into the thousands of users. Right now, I have it so that you can post text to the newsfeed and it is saved into a MySQL database. Then, whenever you load the page, it displays all the posts from that database. The problem with this is that you have to reload the page every time there is an update. I was going to use this to tell the Node.js server someone posted an update...
index.html
function sendPost(name, cont) {
    socket.emit("newPost", name, cont);
}
app.js
socket.on("newPost", function (name,cont) {
/* Adding the post to a database
* Then calling an event to say a new post was created
* and emit a new signal with the new data */
});
But that won't work for a ton of people. Does anyone have any suggestions for where I should start, the api's and/or programs I would need to use?
You're on the right track. Build a route on your Node webserver that will cause it to fetch a newspost and broadcast to all connected clients. Then, just fire the request to Node.
On the Node-to-client front, you'll need to learn how to do long polling. It's rather easy - you let a client connect and do not end the response until a message goes through to it. You handle this through event handlers (Postal.JS is worth picking up for this).
The AJAX part is straightforward. $.get("your/node/url").then(function(d) { }); works out of the box. When it comes back (either success or failure), relaunch it. Set its timeout to 60 seconds or so, and end the response on the Node front the moment an event targets it.
This is how most sites do it. The problem with websockets is that, right now, they're a bit of a black sheep due to old IE versions not supporting them. Consider long polling instead if you can afford it.
(Psst. Whoever told you that Node wouldn't work in the thousands of users is talking through their asses. If anything, Node is better suited to large concurrency than PHP, because a connection in Node costs almost nothing to keep alive thanks to its event-driven nature. Don't listen to naysayers.)
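A rough long-polling sketch along those lines (Express on the Node side, jQuery on the client; route names and timeouts are assumptions, and Postal.JS is left out for brevity):

const express = require('express');
const app = express();

let waiting = []; // responses parked until the next post arrives

// Long-poll endpoint: hold the response open instead of answering right away.
app.get('/poll', (req, res) => {
    waiting.push(res);
    // give up after ~55s so the client re-polls before proxies cut the connection
    setTimeout(() => {
        const i = waiting.indexOf(res);
        if (i !== -1) { waiting.splice(i, 1); res.status(204).end(); }
    }, 55000);
});

// When a new post comes in, flush it to every parked client.
app.post('/newpost', express.json(), (req, res) => {
    waiting.forEach(parked => parked.json(req.body));
    waiting = [];
    res.status(201).end();
});

app.listen(3000);

// Client side (jQuery), relaunched whenever the previous request finishes:
//   function poll() {
//       $.ajax({ url: '/poll', timeout: 60000 })
//           .done(post => post && renderPost(post))
//           .always(poll);
//   }
//   poll();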

How to avoid too many ajax calls and cache json data on the client side

I have a calendar application and it loads all of its event data using ajax and JSON results. The issue is that I have different views, and right now I have to re-call the server when I change views.
Is there any recommendation for ways I can cache this data on the client side and check whether I have already loaded these events before firing off more ajax calls?
What is the best practice for this ?
Like hvgotcodes said, an MVC framework would help; try backbone.js (http://documentcloud.github.com/backbone/), for instance.
Alternatively, you might want to consider using jStorage (http://www.jstorage.info/). Every time you need to make an AJAX call, check first if it's in your storage object, then run the AJAX call if it isn't. On the other end, whenever you finish an AJAX call, store the results in the storage object. Make sure you have some kind of index (a CalendarEvent id) to reference when looking it up in the data store. Might want to add some kind of "expire time" to the data in your storage, too ... a timestamp after the AJAX call, and re-request up front if it's out of date.
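The same idea sketched with plain localStorage instead of jStorage (key names, TTL, and endpoint are assumptions): check the cache and its timestamp first, and only fire the AJAX call on a miss or when the entry is stale.

var EVENT_TTL_MS = 5 * 60 * 1000; // treat cached events as stale after 5 minutes

function getEvents(viewKey, cb) {
    var cached = JSON.parse(localStorage.getItem('events:' + viewKey) || 'null');
    if (cached && Date.now() - cached.storedAt < EVENT_TTL_MS) {
        return cb(cached.data); // cache hit: no request at all
    }
    $.getJSON('/api/events', { view: viewKey }, function(data) {
        localStorage.setItem('events:' + viewKey,
            JSON.stringify({ storedAt: Date.now(), data: data }));
        cb(data);
    });
}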
It's called MVC.
You need to construct a data model for your application and write some sort of Record objects; then you can determine their status. So your application would have some sort of CalendarEvent model, and when you load data from the server, you would instantiate instances.
So when changing views, you would first check to see if you had the model object for that view, and if you did, you wouldn't need to load it from the server (unless you want to check for changes).
Your scheme doesn't need to be that complicated. If you load events by Id, you can do something like
window.App = {};
window.App.Models = {};
when you load a record you could put
window.App.Models[id] = InstanceOfYourRecord
and that way its pretty fast to look for records. Or just use a framework (like Sproutcore) that has a robust data layer.
I had similar issues on a recent project.
Conceptually, I have the "real" data model (DM) kept on the server, persisted to a database.
To make life sane, the client keeps its own local data model. Outside of the client DM, all the client code thinks it's pulling results locally.
When reading data (GET) from the client DM it:
checks the cache for existing results
invokes appropriate AJAX queries when cached data is not available, then caches the results.
When changing data (POST) via the client DM it:
invalidates the cache as appropriate
invokes appropriate AJAX queries
emits custom jQuery event indicating client DM changed
Note that this client DM also:
centralizes AJAX error handling
tracks AJAX calls still in-flight. (Lets us warn users when leaving pages with unsaved changes).
allows a drop-in, dummy replacement for unit testing, where all the calls hit local data and are completely synchronous.
Implementation notes:
I coded this as a JavaScript class called DataModel. As the design becomes more complex, it makes sense to further break down the responsibilities into separate objects.
jQuery's custom events let you easily implement the observer pattern. Client components update themselves from the client DM whenever it indicates data has changed.
JSON in your remote API helps simplify the code. My client DM stores the JSON results directly in its cache.
The client DM function arguments include callbacks so everything can naturally be passed along via AJAX when needed: function listAll( contactId, cb ) { ... }
My project only allowed single user logins. If outside parties can change the server datamodel, some sort of has-data-changed probe should be fired regularly to ensure the client cache is still valid.
For my app, multiple client components would request the same data when receiving a client DM changed event. This resulted in multiple AJAX calls with the same info. I fixed this problem with a getJsonOnce() helper, which manages a queue of client component call-backs awaiting the same result.
Example function in my implementation:
listAll: function( contactId, cb ) {
    // pull from cache
    if ( contactId in this.notesCache ) {
        cb( this.notesCache[contactId] );
        return;
    }
    // init queue if needed
    this.listAllQueue[contactId] = this.listAllQueue[contactId] || [];
    // pull from server
    var self = this;
    dataModelHelpers.getJsonOnce(
        '/teafile/api/notes.php',
        {'req': 'listAll', 'contact': contactId},
        function(resp) { self.notesCache[contactId] = resp; },
        this.listAllQueue[contactId],
        cb
    );
}
The getJsonOnce() helper makes sure that if multiple client components request the exact same (uncached) data, we only send out a single AJAX request and inform everyone once it comes in.
The notesCache is just a simple javascript object:
this.notesCache = {};
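One possible implementation of the getJsonOnce() helper described above (a sketch consistent with the call site, not the author's exact code): the first caller fires the AJAX request, later callers just join the queue, and everyone is called back once the response arrives.

var dataModelHelpers = {
    getJsonOnce: function(url, params, processResp, queue, cb) {
        queue.push(cb);
        if (queue.length > 1) {
            return; // a request for this data is already in flight
        }
        $.getJSON(url, params, function(resp) {
            processResp(resp); // e.g. store the result in the cache
            queue.splice(0, queue.length).forEach(function(waitingCb) {
                waitingCb(resp);
            });
        });
    }
};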
