I have a table which updates every few seconds using an Ajax call. The table was becoming pretty unresponsive when there were a few thousand items in it. The main bottleneck is Tablesorter.js updating the table (reapplying filters, sorting and pagination), which is understandable as it has a lot of data to process.
I have improved performance somewhat by caching the last array of objects returned, comparing it to the newly received one with Underscore, and only rebuilding the table if something has changed, like so:
var cachedJobData;
var Items;

$.ajax({
    type: 'GET',
    cache: false,
    url: '/jobs',
    success: function (data) {
        console.log(GetNow() + ' got data');
        Items = data;
        if (_.isEqual(Items, cachedJobData)) {
            console.log("DATA THE SAME");
        } else {
            console.log("DATA DIFFERENT!");
            rebuildTable();
        }
        cachedJobData = data;
    },
    error: function (xhr, status, error) {
        toastr.error('Error getting job list - ' + error);
    }
}).always(function (data) {
    // note: 'always' is not a valid $.ajax option, so it is chained
    // as a promise method instead; on success its first argument is the data
    if (data.loginResult === 'false') {
        onLoggedOut();
    }
});
function rebuildTable() {
    var tbl_body = "";
    $.each(Items, function (index, item) {
        tbl_body += makeRow(item);
    });
    $("#dbTable tbody").html(tbl_body);
    $("table").trigger("update");
}
...this works a lot better than before, but I think it could be better still.
When Tablesorter has to update a table of roughly 5000 rows (each with 12 cells) via $("table").trigger("update"), it tends to take a good few seconds, and the page is unresponsive during that time.
My question is: how can I improve on this? One idea that crossed my mind was to compare each object in the current and cached arrays individually, push any differences into an 'update' array, and then apply that array to the table by looking up each row via an ID field I store and telling Tablesorter to update only those rows. That seems cumbersome, though.
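For what it's worth, a minimal sketch of that diff-and-patch idea might look like the following. It assumes each item carries a unique id property, each <tr> carries a matching data-id attribute, and makeRow() is the same row builder used above:

// Build a lookup of the cached items by id, then collect only the
// items that are new or have changed since the last poll.
function diffJobs(newItems, oldItems) {
    var oldById = {};
    _.each(oldItems || [], function (item) {
        oldById[item.id] = item;
    });
    return _.filter(newItems, function (item) {
        return !_.isEqual(item, oldById[item.id]);
    });
}

// Replace (or append) just the changed rows, then trigger Tablesorter's
// update once, instead of rebuilding the whole tbody first.
function patchTable(changedItems) {
    _.each(changedItems, function (item) {
        var $row = $('#dbTable tbody tr[data-id="' + item.id + '"]');
        if ($row.length) {
            $row.replaceWith(makeRow(item));
        } else {
            $('#dbTable tbody').append(makeRow(item));
        }
    });
    if (changedItems.length) {
        $("table").trigger("update");
    }
}

Deleted rows would still need separate handling, and trigger("update") still reprocesses the whole table, so the main win here is avoiding the full tbody rebuild.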
I am looking to use Angular eventually, but it's a steep learning curve to achieve the same functionality that Tablesorter gives me in terms of looks, filtering, sorting and pagination.
At the risk of this being labeled as a non-constructive question - any advice?
I am going to do live data streaming on an ag-grid data table, so I used deltaRowDataMode in gridOptions and added a getRowNodeId method that returns the unique 'id' value.
With that in place I do get live updates on my grid at the interval I set, but some rows are duplicated, so I can see the total count increase a little each time updated data loads. The question title is the warning message from the browser console; I get a bunch of these with different id numbers. According to the docs below, this is not supposed to happen: delta row mode should detect duplicates and only add new rows that don't already exist. Of course there are several ways to refresh the data live, but I chose this one because it is supposed to preserve grid state such as selected rows and the current scroll position. I am using vanilla JS, with no frameworks.
How do I make the data update periodically without disturbing the current grid state? There is no error in the code itself, so please don't assume a bug. Maybe my implementation is wrong; either way, I would like to hear ideas or implementation experience with this.
let gridOptions = {
    ....
    deltaRowDataMode: true,
    getRowNodeId: (data) => {   // note: ':' not '=' inside an object literal
        return data.id;         // return the property you want used as the id
    }
}
fetch(loadUrl).then((res) => {
    return res.json();
}).then((data) => {
    gridOptions.api.setRowData(data);
});
...
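For completeness, the "periodically" part (not shown above) is presumably just the same fetch on a timer; a minimal sketch, with the interval length as an arbitrary assumption:

// Re-fetch on a fixed interval; with deltaRowDataMode + getRowNodeId,
// setRowData should patch changed rows rather than redraw the whole grid.
setInterval(() => {
    fetch(loadUrl)
        .then((res) => res.json())
        .then((data) => gridOptions.api.setRowData(data));
}, 5000);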
If you get a duplicated node warning, it means your getRowNodeId() is returning the same value for two different rows.
Here is the relevant part of the ag-grid source:
if (this.allNodesMap[node.id]) {
console.warn("ag-grid: duplicate node id '" + node.id + "' detected from getRowNodeId callback, this could cause issues in your grid.");
}
So check your data again first.
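A quick, framework-free way to verify whether the incoming payload itself contains duplicate ids before handing it to the grid (a diagnostic sketch, not part of the ag-grid API):

// Log any id that appears more than once in the fetched rows.
function findDuplicateIds(rows) {
    const seen = new Set();
    const dupes = new Set();
    for (const row of rows) {
        if (seen.has(row.id)) {
            dupes.add(row.id);
        }
        seen.add(row.id);
    }
    return [...dupes];
}

fetch(loadUrl)
    .then((res) => res.json())
    .then((data) => {
        console.log('duplicate ids in payload:', findDuplicateIds(data));
    });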
If you are 100% sure the error is not related to your data, cut off the private parts and create a Plunker/StackBlitz example that reproduces the issue; that will make it much simpler to check and help you.
If I have 1000+ rows with 10 to 30 columns it works fine,
but if I have 10 rows with 200+ columns, rendering takes 2 to 3 minutes.
After some time, the data stops being visible while scrolling horizontally.
Eventually, the page becomes unresponsive.
The code is here:
$http.post($scope.url + "/getPagingRecordImportedTable", {
    dbname: $stateParams.dbname,
    tableName: tablename,
    pageNumber: 1
}).success(function (response) {
    $scope.names = response.records;
    $scope.mdata = response.metadata;
    $scope.gridOptions.data = response.records;
    var columnsize = 0;
    for (var obj in $scope.mdata) {
        if ($scope.mdata[obj]['columnsize'] > 20) {
            columnsize = 20;
        } else {
            columnsize = $scope.mdata[obj]['columnsize'];
        }
        $scope.columns.push({
            field: $scope.mdata[obj]['columnname'],
            width: columnsize
        });
    }
    console.log("-----------Data received");
});
After firing the HTTP request, the "-----------Data received" log prints immediately, but rendering the data takes far too long.
How can I improve performance? Or is there another API better suited to huge row counts and 200+ columns?
I had to deal with a similar thing last year and summarised my thoughts in an article Rendering large data with AngularJS 1.4 (ft. Polymer 0.5).
Basically, I recommend using <iron-list> from the Polymer components, although it requires some refactoring (depending on your use case).
If you can't or don't want to use <iron-list>, try to use one-time bindings as much as possible.
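For reference, a one-time binding in AngularJS (1.3+) uses the :: prefix, which deregisters the watcher once the expression settles; a sketch with made-up field names:

<!-- each ::-expression is watched only until it first resolves,
     instead of contributing a permanent watcher per cell -->
<tr ng-repeat="row in ::rows">
    <td>{{::row.name}}</td>
    <td>{{::row.status}}</td>
</tr>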
I was able to fix the issue by letting the watchers fire separately for rows and columns.
I was previously updating gridOptions.colDefs and gridOptions.data together. I changed it to:
gridOptions.colDefs = newColDefs;   // new column definitions first...
$timeout(function () {
    gridOptions.data = newData;     // ...then the data, one digest (plus a delay) later
}, 500, true);
And it works like a charm. Obviously, the timeout value can be tuned by trial and error against your data and column counts.
I have JSON data with news items like this:
{
"news": [
{"title": "some title #1","text": "text","date": "27.12.15 23:45"},
{"title": "some title #2","text": "text","date": "26.12.15 22:35"},
...
]
}
I need to get a certain number of items from this list, depending on an argument passed to a function. As I understand it, this is called pagination.
I can take the ajax response and slice it immediately, so that every time the function is called it makes a fresh ajax request, like this:
function showNews(page) {
    var newsPerPage = 5,
        firstArticle = newsPerPage * (page - 1),
        xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState == 4) {
            var newsArr = JSON.parse(xhr.responseText);
            newsArr.news = newsArr.news.slice(firstArticle, newsPerPage * page);
            addNews(newsArr);
        }
    };
    xhr.open("GET", url, true);
    xhr.send();
}
Or I can store the whole result in newsArr and slice it into pages in the additional addNews function:
function addNews(newsArr, newsPerPage) {
    var pages = Math.ceil(newsArr.news.length / newsPerPage), // number of pages
        pagesData = {};
    for (var i = 0; i < pages; i++) {
        var min = i * newsPerPage,       // min index of current page in loop
            max = (i + 1) * newsPerPage; // max index of current page in loop
        newsArr.news.forEach(createPageData);
    }
    function createPageData(item, j) {
        if (j + 1 <= max && j >= min) {
            if (!pagesData["page" + (i + 1)]) {
                pagesData["page" + (i + 1)] = { news: [] };
            }
            pagesData["page" + (i + 1)].news.push(item);
        }
    }
}
So, the simple question is: which variant is more efficient? The first one loads the server, the second loads the user's memory. What would you choose in my situation? :)
Thanks for the answers. I understood what I wanted, but there are so many good answers that I can't choose the best one.
It is actually a primarily opinion-based question.
For me, the pagination approach looks better because it will not produce a "lag" before displaying the news. From the user's point of view, the page will load faster.
As for me, I would do pagination plus a preload of the next page, i.e. always keep the contents of the next page stored so you can show it without delay. When the user moves to the last loaded page, fetch another one.
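A rough sketch of that preload idea, assuming hypothetical fetchPage(page) and renderNews(items) helpers, where fetchPage returns a promise of one page of news:

var preloaded = {}; // page number -> news items

function showPage(page) {
    if (preloaded[page]) {
        renderNews(preloaded[page]); // instant: already in memory
    } else {
        fetchPage(page).then(renderNews);
    }
    // warm the cache for the next page in the background
    if (!preloaded[page + 1]) {
        fetchPage(page + 1).then(function (items) {
            preloaded[page + 1] = items;
        });
    }
}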
Loading all the news is definitely a bad idea. If you have 1000 news records, every user will have to load all of them, even if they aren't going to read a single one.
In my opinion, the "less requests == better" rule doesn't apply here. It is not guaranteed that a user will read all the news. If StackOverflow loaded every question it has each time you opened the main page, both StackOverflow and its users would have huge problems.
If the maximum number of records your service returns is around 1000, then I don't think it is going to create a huge payload or memory issue (judging by the nature of your data), so I think option 2 is better because:
the number of service calls will be lower;
since the user will not see any lag while paginating, the experience of using the site will be better.
As a rule of thumb:
less requests == better
but that's not always possible. You may run out of memory/network if the data you store is huge, i.e. you may need pagination on the server side. In fact, server-side pagination should be the default approach, with improvements (e.g. local caching) considered only if you really need them.
So what you should do is try all scenarios and see how well they behave in your concrete situation.
I prefer to fetch all the data up front but show it conditionally: when the user clicks the next button, the data is already there, so you just hide and show the relevant rows with jQuery.
Calling ajax on every page turn is a bad idea,
but you do still need an ajax call for fresh data if the data changes after some period of time.
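A minimal sketch of that hide/show approach (the #grid and #next selectors and the page size are assumptions):

var pageSize = 40,
    currentPage = 0;

// Tag every row with its page number once, right after the full fetch.
$('#grid tbody tr').each(function (i) {
    $(this).attr('data-page', Math.floor(i / pageSize));
});

function showPage(page) {
    $('#grid tbody tr').hide()
        .filter('[data-page="' + page + '"]').show();
    currentPage = page;
}

$('#next').on('click', function () {
    showPage(currentPage + 1);
});

showPage(0); // start on the first page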
Assume I have about 500 rows of data shown in a grid at 40 rows per page.
What I have been wondering is how the grid should use those 500 rows to display the next page with the corresponding data.
I have 2 options:
1. Make a single query to the DB, keep the 500 rows in memory, and slice out the rows for each page from there.
2. Run the query against the database, cut it down to just the rows needed, and display them. Every time the user changes page, make a new query, bring back the 500 again, and cut out and show only what is needed.
So the question is: what is the ideal way for a grid to handle its data for the best performance?
If you look at the more general case, where there are N rows in a dataset and you need to show M per page, it becomes clear that with N big enough it usually does not make sense to load all the data into memory (though you might still do that to warm up a cache, etc.).
So usually you load one page at a time from the database, passing pageNumber and pageSize parameters to the query.
You can also use client-side caching and save the old page when navigating to a new one. Having localStorage, WebSQL, etc. in the browser makes this really easy:
function getNewPage(pageNumber, pageSize) {
    var cacheKey = pageNumber + '_' + pageSize;
    if (myCache.contain(cacheKey)) {
        // hand back an already-resolved promise for the cached page
        return $.Deferred().resolve(myCache.get(cacheKey)).promise();
    }
    return $.ajax({
        url: url,
        data: { pageSize: pageSize, pageNumber: pageNumber }
    }).done(function (data) {
        myCache.set(cacheKey, data);
    });
}

var myCache = {
    contain: function (key) {
        return !!sessionStorage.getItem(key);
    },
    get: function (key) {
        return JSON.parse(sessionStorage.getItem(key));
    },
    set: function (key, o) {
        sessionStorage.setItem(key, JSON.stringify(o));
    }
};
and somewhere in a button click handler:
getNewPage(0, 10).done(function (data) {
    // underscore's default template syntax is <%- ... %> (escaped), not {field1}
    var template = _.template('<tr><td><%- field1 %></td><td><%- field2 %></td></tr>');
    $myTableTbody.html(data.map(template).join(''));
});
This question has already been answered multiple times, but here I want to point out: if the code provided in the documentation can't work without additional code, why was it given there in the first place? It is simply misleading. The code given in the documentation achieves paging, but on sorting, the grid data simply disappears.
Correct me if I am wrong.
jQuery("#gridid").jqGrid({
...
datatype: 'json', // can be xml
loadComplete : function () {
jQuery("#gridid").jqGrid('setGridParam',{datatype:'local'});
},
onPaging : function(which_button) {
jQuery("#gridid").jqGrid('setGridParam',{datatype:'json'});
},
...
});
You didn't post the exact reference to the documentation where you got the code; I found it here.
jqGrid is an open source product which you get for free. It's practical, but you should understand that, as a consequence, the product and its documentation may not be perfect. The piece of code you referenced probably worked in some very old version of jqGrid, but it is wrong code for the current version. The whole idea of implementing "client-side sorting, but server-side paging" is questionable in itself. You can find my old answer about the subject here. I would rewrite parts of that answer now, but in general, code tested against old versions may not be fully compatible with the new version of jqGrid.
I can say there is no place where intentional misleading has happened. They give the plugin away for free, even though it is a very large plugin, and the work done by people like Oleg keeps making it better. As for your question, here is code for "client-side sorting and server-side paging" that can solve your problems; it was adapted from an old answer given by Oleg.
This is my version of the code:
loadComplete: function (data) {
    var $this = $(this);
    if ($this.jqGrid('getGridParam', 'datatype') === 'json') {
        // because one uses the repeatitems: false option and no
        // jsonmap in the colModel, setting the data parameter
        // is very easy. We can set the data parameter to data.rows:
        $this.jqGrid('setGridParam', {
            datatype: 'local',
            data: data.userdata,
            pageServer: data.page,
            recordsServer: data.records,
            lastpageServer: data.total
        });
        // because we changed the value of the data parameter
        // we need to update the internal _index parameter:
        this.refreshIndex();
        if ($this.jqGrid('getGridParam', 'sortname') !== '') {
            // we need to reload the grid only if we use the sortname
            // parameter, but the server returns unsorted data
            $this.triggerHandler('reloadGrid');
        }
    } else {
        $this.jqGrid('setGridParam', {
            page: $this.jqGrid('getGridParam', 'pageServer'),
            records: $this.jqGrid('getGridParam', 'recordsServer'),
            lastpage: $this.jqGrid('getGridParam', 'lastpageServer')
        });
        this.updatepager(false, true);
    }
},
onPaging: function () {
    /* this code fixes the issue when we click on the next page with some
     * data in the filter toolbar; along with this, in the grid definition
     * (in filterToolbar) we have to return true in the "beforeClear" event
     */
    var data = $(this).jqGrid("getGridParam", "postData");
    data._search = false;
    data.filters = null;
    data.page = $(this).jqGrid("getGridParam", "page");
    /* comment out this line if you disable the filter toolbar */
    $(this)[0].clearToolbar();
    // making _search alone false will not solve the problem;
    // we have to make search false as well, like below:
    $(this).jqGrid('setGridParam', { search: false, postData: data });
    data = $(this).jqGrid("getGridParam", "postData");
    /* this fixes the issue when we go to the last page (say there are only
     * 3 records and the page size is 5) and click on sorting: the grid
     * fetches previously loaded data (maybe from its buffer) and displays
     * 5 records, where I expect only 3 records to be sorted. Along with
     * this, a modification is needed in the source code.
     */
    $(this).jqGrid("clearGridData");
    /* this makes the grid fetch data from the server on a page click */
    $(this).setGridParam({ datatype: 'json' }).triggerHandler("reloadGrid");
}
For the modification that you have to do in the source code, see this answer.