Knockout.js performance - how many observables?

I am new to Knockout.js and already like it very much.
Say I'm implementing a web blog and want to add/edit/delete blog post comments using Knockout.js. For this purpose I define a Comment viewmodel with subject, text and tags (in my real application I need many more fields, 10 to 20).
After the message has been edited by the user and posted to the server, I want to refresh it on the screen with the latest values (including those that came from the server - say, a timestamp). It appears that I need observable (not just plain) properties for every listed field, otherwise the values will not be refreshed on the user's screen after the postback.
Now, if I have 20 observables per comment and there are 50? 100? comments on the screen, will it slow the browser down much? What about mobile devices? If so, is there another way to achieve what I want?
The other possible option is to use viewmodels only for the comment being edited. In that case I somehow need to "unbind" the other viewmodels from their HTML elements - e.g., delete them and render them again. But I can't see a nice solution here.

Interesting question.
The short, simple answer is no.
Browser performance is not really an issue unless you are specifically developing an application that is known or expected to be a performance hit.
A browser is well designed to handle very large amounts of data, be it downloading new data from a server or rendering DOM elements. I would say a browser could handle over 1000 comments (an educated guess).
Take a look at a Google application (such as calendar) - they tend to process huge amounts of data.

This use case scenario sounds like a perfect match for the mapping plugin:
// Every time data is received from the server:
ko.mapping.fromJS(data, viewModel);
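For the comment scenario in the question, a minimal sketch might look like this (the save URL, the initialCommentData variable, and the use of jQuery are assumptions for illustration, not part of the plugin):
// build the viewmodel once from the initial JSON; ko.mapping creates
// an observable for every property automatically
var commentViewModel = ko.mapping.fromJS(initialCommentData);
ko.applyBindings(commentViewModel);

// after saving, push whatever the server returned (including
// server-generated fields such as the timestamp) into the same observables
$.post("/comments/save", ko.mapping.toJS(commentViewModel), function (updatedData) {
    ko.mapping.fromJS(updatedData, commentViewModel);
}, "json");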
And if you ever get into performance issues, the Viewmodel plugin claims to be several times faster specifically for the task of updating your viewmodel from an updated model.
ko.viewmodel.updateFromModel(viewmodel, updatedModel);


Which is better, searching in JavaScript or in the database?

I have a grid (an employee grid) which has, say, 1000-2000 rows.
I display the employee name and department in the grid.
When I get data for the grid, I get other details for the employee too (date of birth, location, role, etc.).
So the user has the option to edit the employee details. When he clicks edit, I need to display the other employee details in a pop-up. Since I have stored all the data in JavaScript, I search for the particular id and display all the details, so the code will be like:
function getUserDetails(employeeId) {
    // I store all the employee details in a variable employeeInformation
    // while getting the data for the grid.
    for (var i = 0; i < employeeInformation.length; i++) {
        if (employeeInformation[i].employeeID == employeeId) {
            // display employee details.
        }
    }
}
The second solution would be to pass the employeeId to the database and get all the information for the employee. The code will be like:
function getUserDetails(employeeId) {
    // make an ajax call to the controller, which will call a procedure
    // in the database to get the employee details,
    // then display the employee details
}
So, which solution do you think will be optimal when I am handling 1000-2000 records?
I don't want to make the JavaScript heavy by storing a lot of data in the page.
UPDATED:
So one of my friends came up with a simple solution.
I am storing 4 columns for 500 rows (on average), so I don't think there should be any noticeable slowness in the webpage.
While loading the rows into the grid, I put a data-rowId attribute on the edit link so that it will be easy to retrieve the data.
Say I store all the employee information in a variable called employeeInfo.
When someone clicks the edit link, $(this).attr('data-rowId') will give the rowId, and employeeInfo[$(this).attr('data-rowId')] should give all the information about the employee.
Instead of storing the employeeId and looping over the employee table to find the matching employeeId, the rowId does the trick. This is very simple, but it did not strike me.
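A rough sketch of that idea (the selector and markup here are just assumptions, not taken from the actual grid code):
// the grid renders each edit link with its row index, e.g.
// <a href="#" class="edit-link" data-rowId="42">Edit</a>
var employeeInfo = [];   // filled once, while loading the grid data

$(document).on('click', '.edit-link', function () {
    var rowId = $(this).attr('data-rowId');
    var employee = employeeInfo[rowId];   // direct index lookup, no loop
    // display the employee details in the pop-up
});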
I would suggest you make an AJAX call to the controller, for two main reasons:
It is not advisable to handle database activity in JavaScript, due to security issues.
JavaScript runs on the client's machine, so it should carry the least load and computation.
JavaScript should be as light as possible, so I suggest you do it in the database itself.
Don't count on JavaScript performance, because it heavily depends on the computer it is running on. I suggest you store and search on the server side rather than loading a heavy payload of data into the browser, which is limited by the end user's resources.
Running long loops in JavaScript can lead to an unresponsive and irritating UI. As a good practice, use Ajax calls to get the data you need.
Are you using HTML5? Will your users typically have relatively fast multicore computers? If so, a web-worker (http://www.w3schools.com/html/html5_webworkers.asp) might be a way to offload the search to the client while maintaining UI responsiveness.
Note, I've never used a Worker, so this advice may be way off base, but they certainly look interesting for something like this.
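A rough, untested sketch of the idea (the file name and message shape are made up):
// search-worker.js - runs off the UI thread
self.onmessage = function (e) {
    var employees = e.data.employees;
    for (var i = 0; i < employees.length; i++) {
        if (employees[i].employeeID == e.data.employeeId) {
            self.postMessage(employees[i]);
            return;
        }
    }
    self.postMessage(null);
};

// main page: hand the search off to the worker and react to the result
var searchWorker = new Worker('search-worker.js');
searchWorker.onmessage = function (e) {
    if (e.data) {
        // display the employee details without having blocked the UI
    }
};
searchWorker.postMessage({ employees: employeeInformation, employeeId: someEmployeeId });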
In terms of separation of concerns, and recommended best approach, you should be handling that domain-level data retrieval on your server, and relying on the client-side for processing and displaying only the records with which it is concerned.
By populating your client with several thousand records for it to then parse, sort, search, etc., you not only take a huge performance hit and diminish user experience, but you also create many potential security risks. Obviously this also depends on the nature of the data in the application, but for something such as employee records, you probably don't want to be storing that on the client-side. Anyone using the application will then have access to all of that.
The more pragmatic approach to this problem is to have your controller populate the client with only the specific data which pertains to it, eliminating the need for searching through many records. You can also retrieve a single object by making an ajax query to your server. This has the dual benefit of guaranteeing that you're displaying the current state of the DB, as well as being far more optimized than anything you could ever hope to write in JS.
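As a rough sketch of that approach (the endpoint and the response shape are assumptions):
function getUserDetails(employeeId) {
    // ask the server for just this one employee's record
    $.ajax({
        url: '/Employee/Details',        // hypothetical controller action
        data: { id: employeeId },
        dataType: 'json',
        success: function (employee) {
            // display the employee details in the pop-up
        },
        error: function () {
            // handle the failure (show a message, retry, ...)
        }
    });
}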

Webpage showing data - Data updated meanwhile

I have an ASP.NET MVC 4 website with a regular page showing some data. I want to add a feature similar to what SO has, which notifies the user that another user has made some changes (a database record update) since the page was opened.
What's the best practice to achieve this? I don't know if the most common approach is to use timers, or if there is another option, like listeners.
Thanks
Not too familiar with ASP.NET MVC 4, so this answer is from a design PoV.
Basically there are 2 solutions:
Have the server inform (all) clients that the data has changed (reverse AJAX, Listeners/Observers)
Have the client regularly ask the server if the data has changed (Timer)
In 99.9% of the cases you want to pick option 2. Keeping track of all clients and managing that list creates more problems than it's worth, while receiving an additional request every x seconds hardly ever kills you (as long as x is proportional to your server and customer base).
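A minimal polling sketch on the client (using jQuery for brevity; the URL, the interval, and the response shape are all made up for illustration):
// remember when the page was loaded, then ask the server every
// 30 seconds whether the record has been modified since then
var loadedAt = new Date().toISOString();

setInterval(function () {
    $.getJSON('/Posts/LastModified/123', function (result) {
        // assumes the server returns a UTC ISO timestamp, so a
        // string comparison is good enough for this sketch
        if (result.lastModified > loadedAt) {
            // show a "this page has been updated elsewhere" notice
        }
    });
}, 30000);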

CouchDB: How to change a view function via JavaScript?

I am playing around with CouchDB to test if it is "possible" [1] to store scientific data (simulated and experimental raw data + metadata). A big pro is the schema-less approach of CouchDB: we have to be very flexible with the metadata, as the set of parameters changes very often.
Up to now I have some code to feed raw data, plots (both as attachments), and hierarchical metadata (as JSON) into CouchDB documents, and have written some prototype Javascript for filtering and showing. But the filtering is done on the client side (a.k.a. browser): The map function simply returns everything.
How could I change the map function (or push a second one) of a specific _design document with simple browser-side JS?
I do not think that a temporary view would yield any performance gain...
Thanks for your time and answers.
[1]: of course it is possible, but is it also useful? feasible? reasonable?
[added]
Ah, the jquery.couch.js (version 0.9.0) provides a saveDoc() function, which could update the _design document with the new map function.
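Roughly something like this, I suppose (the database, design document and view names are just placeholders):
// rough sketch with jquery.couch.js: load the design document,
// add or replace a map function, and save it back (needs admin rights)
var db = $.couch.db("simulations");                  // placeholder DB name
db.openDoc("_design/experiments", {
    success: function (ddoc) {
        ddoc.views = ddoc.views || {};
        ddoc.views.by_temperature = {
            map: "function (doc) { if (doc.parameters) { emit(doc.parameters.temperature, null); } }"
        };
        db.saveDoc(ddoc, {
            success: function () { /* the new view can now be queried */ }
        });
    }
});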
But I also tried out the query function, which uses a temporary view. Okay, "do not use this in the real product, only during development"... But scientific research is steady development, right?
Temporary views get cached, as I noticed, and it works well for ~1000 documents per DB. A second plus: all users (think 1 to 3, so big user management would be quite an overkill) can work with their own temporary view.
Never ever use temporary views. They are really only there for dev and debugging purposes. For more information, see http://wiki.apache.org/couchdb/Introduction_to_CouchDB_views (specifically the bold "NOTE").
And yes, because design documents are really just documents with special powers, you can run your GET/POST/PUT/DELETE methods on them. However, you will usually need admin privileges to do this. So, if you are allowing a client-side piece of software to do that, you are making your entire database public for read/write access - this may be fine for your application, but it is important to remember.
For example, if you restrict access to your database but put the username and password in client-side JavaScript, then anyone can see that username and password.
Cheers.
I've written some helper functions for jquery.couch and design docs; take a look at:
https://github.com/grischaandreew/jquery.couch.js

Efficient way to display lots of information in JavaScript

I am continually pulling in a list of over 200 entries from a remote server and showing them in a table (many pieces of data per row). The data changes and fluctuates over time. What's the most efficient way in JS to render this to the screen? As the data changes, do I simply wipe my DOM and re-render with the new data, or do I keep a list of ids and simply update the correct tr DOM entries?
It's already sorted and filtered to be the correct information (server side), so it's just displaying it to the user that I am concerned with.
Any suggestions would be helpful. I am using jQuery for my requests but that can be changed. I need to know the quickest (latency wise) way of pulling this large dataset and displaying it in a tabular format.
Edit: Pagination is not possible in this scenario.
Edit2: The backend is Python (WSGI) running in a load balancer and is capable of flushing out 600 reqs / second. The data is only available as JSON and the backend is more of an API.
Are you using HTML tables? Some time ago I came across a Microsoft blog post from the IE team stating that rendering tables is slow. Very slow. Re-rendering the whole table every time will probably be slower than updating its values, but that is just my guess.
In fact, my opinion here is that updating values will be faster than re-rendering elements in the page. To optimize your latency, you could keep a hash of objects mapped by id, so you don't have to look up the DOM every time you update the values.
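A rough sketch of that idea (the table markup, class names and record shape are assumptions, not from the question):
// build the id -> <tr> lookup once, after the table is first rendered
var rowsById = {};
$('#data-table tbody tr').each(function () {
    rowsById[$(this).data('id')] = $(this);
});

// on every refresh, only touch the cells of rows that actually exist
function applyUpdates(records) {
    $.each(records, function (_, record) {
        var $row = rowsById[record.id];
        if ($row) {
            $row.find('.price').text(record.price);
            $row.find('.status').text(record.status);
        }
    });
}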
I quite like the 'Live Image Search' which provides more data as you scroll down.
If your data list is getting really large, consider not displaying all of that data to the user at one time (if that is possible in your situation). Not only will the user get lost looking at that much data, but the browser will have to render the whole list and slow you down.
You don't mention what server side technology you are using. If you are in .Net, there are a couple ASP.Net controls that use a data pager (like a grid view). There is also a PagedDataSource object you might look into using (which can be used with any ASP.Net control that has a DataSource property). Both will break up your data into pages, and only the viewed page will be rendered at one time. This decreases latency dramatically.

Client side performance factors...?

I am working on an application in which I need to maintain the last accessed UI state of the page (like column filters, sorting, selected records, resized widgets, columns, etc.) throughout the app session. It is only terminated at session end (and will be posted to the DB at session end, either by manual logout or browser close). I have a few things to ask.
I have a parent page that never refreshes (this is a framework page, and I don't have control over this one). My page is in a frame, and this frame will be loaded with different pages at runtime, so I decided to create a list object in my parent page at runtime like this:
parent.parent.parent.eval("var sam;")
and fill my state objects (in JSON form) during the whole session.
Is it recommended to use this kind of approach? Is it OK to maintain this amount of data on the client side? I am not much into JavaScript object deallocation capabilities - will it hurt the system in any way?
And I am maintaining a JSON data array in my page which holds some data; it is kept only for the page lifetime and is discarded on page unload, and it can grow based on the data in the page.
I just started worrying: can the browser carry this burden? Can someone suggest the best JavaScript approaches for performance?
I understand what you're doing, but you don't really give us enough information to give a good answer. You ask:
"Is it OK to maintain this amount of data on the client side?"
But you don't really tell us how much data you're working with.
Personally, I think you'll be fine. Storing moderate-to-large amounts of data will impact RAM which isn't a huge concern these days - the real performance issues with JavaScript come when you start doing really CPU-intensive stuff, like a lot of DOM interaction, large iterations, and recursion.
Oh, and I would probably encourage you to create variables in the frame's scope like this
top.varname = null;
as opposed to
parent.parent.parent.eval( 'var varname;' );
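For example, a rough sketch of keeping the state on the top window and posting it at session end (the property name and the endpoint are made up here):
// create a single state object on the top window once,
// then read and write it from any page loaded into the frame
if (!top.appUiState) {
    top.appUiState = {};
}

// e.g. remember a grid's filter and sort settings for this page
top.appUiState.employeeGrid = {
    filters: { department: 'R&D' },
    sortColumn: 'name',
    sortOrder: 'asc'
};

// at logout / session end, send the accumulated state to the server
// ('/Session/SaveUiState' is a hypothetical endpoint)
$.post('/Session/SaveUiState', { state: JSON.stringify(top.appUiState) });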
