How should a grid fetch data when moving to the next page? - javascript

Assume I have about 500 rows of data shown in a grid that displays 40 rows per page.
I've been wondering how the grid should get hold of those 500 rows in order to display the next page of data.
I see two options:
1. Make a single query to the database, keep all 500 rows in memory, and slice out the rows for the current page each time the user changes page.
2. Run the query against the database, cut the result down to just the rows needed, and display them; then, every time the user changes page, run the query again, bring back the 500 rows, and again slice out only what is needed.
So the question is: how should a grid ideally handle its data to get the best performance?

If you look at the more general case - there are N rows in a dataset and you need to show M per page - it becomes clear that, with N big enough, in most cases it does not make sense to load all the data into memory (although you might do this to warm up a cache, etc.).
So usually you load one page at a time from the database, passing pageNumber and pageSize parameters to the query.
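For illustration, here is a minimal sketch of such a server-side query, assuming a Node/Express app and a SQL database; the route, table name and db client are hypothetical:
// GET /items?pageNumber=2&pageSize=40 -> one page of rows, never all of them
app.get('/items', function (req, res) {
    var pageSize = parseInt(req.query.pageSize, 10) || 40
    var pageNumber = parseInt(req.query.pageNumber, 10) || 0
    // LIMIT/OFFSET makes the database do the slicing instead of app memory
    db.query('SELECT * FROM items ORDER BY id LIMIT ? OFFSET ?',
        [pageSize, pageNumber * pageSize],
        function (err, rows) {
            if (err) return res.status(500).end()
            res.json(rows)
        })
})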
You might also use client-side caching and save old pages when navigating to a new one. Having localStorage, WebSQL, etc. in the browser makes it really easy:
function getNewPage(pageNumber, pageSize) {
    var cacheKey = pageNumber + '_' + pageSize
    if (myCache.contain(cacheKey)) {
        // serve a cached page immediately via an already-resolved promise
        return $.Deferred().resolve(myCache.get(cacheKey)).promise()
    }
    return $.ajax(url, { data: { pageSize: pageSize, pageNumber: pageNumber } })
        .then(function (data) {
            myCache.set(cacheKey, data)
            return data
        })
}
var myCache = {
    contain: function (key) {
        return !!sessionStorage.getItem(key)
    },
    get: function (key) {
        return JSON.parse(sessionStorage.getItem(key))
    },
    set: function (key, o) {
        sessionStorage.setItem(key, JSON.stringify(o))
    }
}
and somewhere in some button click handler:
getNewPage(0, 10).done(function (data) {
    var template = _.template('<tr><td><%- field1 %></td><td><%- field2 %></td></tr>')
    $myTableTbody.html(data.map(template).join(''))
})

Related

search-ui: Load 120 records but then revert back to 20 per page

I'm creating infinite scrolling for my Elasticsearch app, which uses the search-ui package.
The component is written in Vue.js.
I have an IntersectionObserver; once its target comes into view I increase the page number, and rather than replacing my entire results array I append to it, which in turn creates the infinite scrolling.
current is the current page.
lastRequestId is the ID of the last request.
requestId is the ID of the current request.
The code below is in a different component, so I send the page-increase event back to the root component.
new IntersectionObserver(() => {
    if (this.lastRequestId !== this.requestId) {
        window.EventBus.$emit('search-set-page', this.current + 1)
        this.lastRequestId = this.requestId
    }
}, {
    rootMargin: '0px',
    threshold: 1.0
}).observe(this.$refs.bottomOfResults);
This is the root component; I have pulled out just the key code that does the logic.
window.EventBus.$on('search-set-page', page => {
    if (page < 1 || page > this.state.totalPages) {
        console.error('You tried to change to a page that doesn\'t exist...');
        return;
    }
    this.setCurrentPage(page)
});

this.driver = new SearchDriver({});

// This is where the results-appending logic happens.
this.driver.subscribeToStateChanges(state => {
    this.state = state
    this.current = state.current
    if (!this.results[state.requestId] && state.results.length) {
        this.$set(this.results, state.requestId, state.results)
    }
    this.requestId = state.requestId
});

setCurrentPage(page) {
    this.current = page;
    this.driver.setCurrent(page);
},
My problem?
Let's say I have scrolled down far enough to load 7 pages' worth of documents. If I refresh, I now need to load those again and send the visitor back to where they were. I can set the results per page to current * resultsPerPage, which works fine.
But due to API delays and other timing issues I'm struggling to set this back once the results have returned. I tried adding a setTimeout and then changing resultsPerPage AND current, but it changed resultsPerPage first, which just results in duplicate records being returned, and current never changed.
Does anyone have any experience with this?
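One hedged idea, as a sketch only: key the reset off the first state change instead of a timer. This assumes the driver also exposes a setResultsPerPage action alongside setCurrent, and that the last page number was persisted, e.g. in sessionStorage, before the refresh:
var PAGE_SIZE = 20 // assumed normal page size
var savedPage = Number(sessionStorage.getItem('lastPage')) || 1
var restored = false

var driver = new SearchDriver({})
driver.setResultsPerPage(savedPage * PAGE_SIZE) // one oversized catch-up request
driver.subscribeToStateChanges(state => {
    if (!restored && state.results.length) {
        restored = true // run exactly once, when the catch-up response arrives
        driver.setResultsPerPage(PAGE_SIZE) // back to the normal page size...
        driver.setCurrent(savedPage)        // ...and to the page the visitor was on
    }
})
Waiting for the state change avoids racing the API the way a setTimeout does.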

Rendering performance is very poor in ui-grid with 100+ columns and only 10 rows

If I have 1000+ rows with 10 to 30 columns it works fine,
but if I have 10 rows with 200+ columns, rendering takes 2 to 3 minutes.
After a while, the data is no longer visible while scrolling horizontally.
Eventually, the page becomes unresponsive.
The code is here:
$http.post($scope.url + "/getPagingRecordImportedTable", {
    dbname : $stateParams.dbname,
    tableName : tablename,
    pageNumber : 1
}).success(function (response) {
    $scope.names = response.records;
    $scope.mdata = response.metadata;
    $scope.gridOptions.data = response.records;
    var columnsize = 0;
    for (var obj in $scope.mdata) {
        if ($scope.mdata[obj]['columnsize'] > 20) {
            columnsize = 20;
        } else {
            columnsize = $scope.mdata[obj]['columnsize'];
        }
        $scope.columns.push({
            field : $scope.mdata[obj]['columnname'],
            width : columnsize
        });
    }
    console.log("-----------Data received");
})
After the HTTP request fires, the "-----------Data received" log is printed immediately, but rendering the data takes far too long.
How can I improve performance? Or is there another grid API that copes with huge row counts and 200+ columns?
I had to deal with something similar last year and summarised my thoughts in the article Rendering large data with AngularJS 1.4 (ft. Polymer 0.5).
Basically, I recommend using <iron-list> from the Polymer components, although it requires some refactoring (depending on your use case).
If you can't or don't want to use <iron-list>, try to use one-time bindings as much as possible.
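For context, a one-time binding in AngularJS 1.3+ is just the :: prefix in an expression; its watcher is dropped once the value stabilises, which cuts the per-cell digest cost (a generic illustration, not the poster's actual template):
<!-- each {{::...}} binding unregisters its watcher after the first stable value -->
<div ng-repeat="row in ::rows">
    <span>{{::row.name}}</span>
</div>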
I was able to fix the issue by letting the watchers fire separately for rows and columns.
I was previously updating gridOptions.columnDefs and gridOptions.data together. Then I changed it to:
gridOptions.columnDefs = newColumnDefs; // the new column settings
$timeout(function () { gridOptions.data = newData; }, 500, true); // the new data, half a second later
And it works like a charm. Obviously, one can tune the timeout value by trial and error against the data and column counts.

What is more effective: making an AJAX request every time, or slicing a stored result in a function?

I have JSON news data like this:
{
    "news": [
        {"title": "some title #1", "text": "text", "date": "27.12.15 23:45"},
        {"title": "some title #2", "text": "text", "date": "26.12.15 22:35"},
        ...
    ]
}
I need to get a certain number of items from this list, depending on an argument passed to a function. As I understand it, this is called pagination.
I can get the AJAX response and slice it immediately, so that every call to the function makes a new AJAX request, like this:
function showNews(page) {
    var newsPerPage = 5,
        firstArticle = newsPerPage * (page - 1);
    xhr.onreadystatechange = function () {
        if (xhr.readyState == 4) {
            var newsArr = JSON.parse(xhr.responseText);
            newsArr.news = newsArr.news.slice(firstArticle, newsPerPage * page);
            addNews(newsArr);
        }
    };
    xhr.open("GET", url, true);
    xhr.send();
}
Or I can store the whole result in newsArr and slice it inside the additional function addNews, sorted by pages:
function addNews(newsArr, newsPerPage) {
    var amount = newsArr.news.length,
        pages = Math.ceil(amount / newsPerPage), // number of pages
        pagesData = {};
    for (var i = 0; i < pages; i++) {
        var min = i * newsPerPage,       // min index of the current page in the loop
            max = (i + 1) * newsPerPage; // max index (exclusive) of the current page
        newsArr.news.forEach(createPageData);
    }
    function createPageData(item, j) {
        if (j >= min && j < max) {
            if (!pagesData["page" + (i + 1)]) {
                pagesData["page" + (i + 1)] = { news: [] };
            }
            pagesData["page" + (i + 1)].news.push(item);
        }
    }
}
So, the simple question is: which variant is more efficient? The first one loads the server, the second loads the user's memory. Which would you choose in my situation? :)
Thanks for the answers; I got what I needed, but there are so many good answers that I can't pick the best one.
This is actually a primarily opinion-based question.
For me, the pagination approach looks better because it will not produce a "lag" before displaying the news; from the user's point of view the page will load faster.
As for me, I would do pagination plus preloading of the next page, i.e. always store the contents of the next page so you can show it without a delay. When the user moves to the last loaded page, load another one.
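A minimal sketch of that preload idea, reusing the cache-aware getNewPage helper sketched in the answer to the first question above (renderNews is a hypothetical rendering function):
function showPage(pageNumber, pageSize) {
    getNewPage(pageNumber, pageSize).done(function (data) {
        renderNews(data);                     // show the requested page
        getNewPage(pageNumber + 1, pageSize); // silently warm the cache for the next page
    });
}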
Loading all the news is definitely a bad idea. If you have 1000 news records, then every user will have to load all of them - even if they aren't going to read a single one.
In my opinion, the "fewer requests == better" rule doesn't apply here. It is not guaranteed that a user will read all the news. If Stack Overflow loaded every question it has each time you opened the main page, both Stack Overflow and its users would have huge problems.
If the maximum number of records your service returns is around 1000, I don't think it is going to create a huge payload or memory issue (judging by the nature of your data), so I think option 2 is better because:
- the number of service calls will be lower;
- since the user will not see any lag while paginating, their experience of the site will be better.
As a rule of thumb:
fewer requests == better
but that's not always possible. You may run out of memory or network capacity if the data you store is huge, i.e. you may need pagination on the server side. In fact, server-side pagination should be the default approach; you then think about improvements (e.g. local caching) only if you really need them.
So what you should do is try all the scenarios and see how well they behave in your concrete situation.
I prefer to fetch all the data but show it conditionally: when the user clicks the next button, the data is already there, so just hide and show it with jQuery, as sketched below.
Calling AJAX every time is a bad idea,
but you do also need an AJAX call for fresh data if the data changes over time.
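A rough sketch of that hide-and-show approach, assuming every article has been rendered up front with a hypothetical data-page attribute:
// Render everything once, then just toggle visibility per page with jQuery.
var currentPage = 1;
function showOnlyPage(page) {
    $('#news .article').hide();
    $('#news .article[data-page="' + page + '"]').show();
}
$('#nextBtn').on('click', function () {
    showOnlyPage(++currentPage);
});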

jQuery & Tablesorter - improving performance - update only changed data

I have a table which updates every few seconds via an AJAX call. The table was becoming pretty unresponsive once there were a few thousand items in it. The main bottleneck is Tablesorter.js updating the table (reapplying filters, sorting and pagination), which is understandable as it has a lot of data to process.
I have improved performance somewhat by caching the last array of objects returned, using underscore to compare it to the newly received one, and only updating the table if there has been a change, in this fashion:
var cachedJobData;
$.ajax({
    type: 'GET',
    cache: false,
    url: '/jobs',
    success: function (data) {
        console.log(GetNow() + ' got data');
        Items = data;
        if (_.isEqual(Items, cachedJobData)) {
            console.log("DATA THE SAME");
        } else {
            console.log("DATA DIFFERENT!!!!!!!!!!!!!!!!!!");
            rebuildTable();
        }
        cachedJobData = data;
    },
    error: function (xhr, status, error) {
        toastr.error('Error getting job list - ' + error);
    },
    // $.ajax has no "always" setting; "complete" runs after success or error
    complete: function (xhr) {
        var data = xhr.responseJSON;
        if (data && data.loginResult === 'false') {
            onLoggedOut();
        }
    }
});
function rebuildTable() {
    var tbl_body = "";
    $.each(Items, function (index, item) {
        tbl_body += makeRow(item);
    });
    $("#dbTable tbody").html(tbl_body);
    $("table").trigger("update");
}
...this works a lot better than before and has greatly improved performance, but I think it could be better still.
When Tablesorter has to update a table of around 5000 rows (each with 12 cells) via $("table").trigger("update"), it tends to take a good few seconds, and the page is unresponsive for that time.
My question is: how can I improve on this? One thing that crossed my mind was to look up each object individually in the current and cached arrays, push any differences into an 'update' array, and then apply that to the table by looking up each row via an ID field that I store, telling Tablesorter to update only those rows. That seems cumbersome, though.
I am looking to use Angular eventually, but it's a steep learning curve to reproduce the functionality Tablesorter gives me in terms of looks, filtering, sorting and pagination.
At the risk of this being labelled a non-constructive question - any advice?
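For what it's worth, the diff idea from the question could be sketched roughly like this, assuming each item has a unique id field and each rendered row carries a matching data-id attribute (makeRow is the question's own row builder, and _.indexBy is underscore's):
// Collect only the items that changed since the last poll...
function diffItems(oldItems, newItems) {
    var oldById = _.indexBy(oldItems || [], 'id');
    return newItems.filter(function (item) {
        return !_.isEqual(item, oldById[item.id]);
    });
}
// ...then patch just those rows and let Tablesorter re-index once.
function patchRows(changed) {
    changed.forEach(function (item) {
        $('#dbTable tbody tr[data-id="' + item.id + '"]').replaceWith(makeRow(item));
    });
    $("table").trigger("update");
}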

How to reset pagination in a jQuery DataTable

I have a jQuery DataTable that loads data via an AJAX request. Next to the table there is a form that lets the user filter the results on the server. When the user changes the filter form, I call fnReloadAjax() with the new filter data, and the table redraws with the new, filtered results.
The problem I have is that the pagination sticks, even if the table no longer has enough items to reach the current page. So if the table originally had 20 items, the user was on page 2 (10 items per page), and they change the filter so that only 5 items are returned, the table displays no items and shows this at the bottom:
Showing 11 to 5 of 5 entries
It's still on page 2 even though there is only enough data for one page.
I have found numerous posts about preserving the current page/row, but none showing a simple way to reset the pagination to the first page. What is the simplest way to do this?
Here's a simplified version of my code for clarity:
$("#mytable").dataTable({
bStateSave: false,
fnServerData: function (sSource, aoData, fnCallback) {
return $.ajax({
type: "POST",
url: url,
data: {filter: filterValue}
});
}
});
$("#myForm").submit(function(e) {
table.fnReloadAjax();
return false;
});
You could explicitly jump to the first page after reloading; see http://datatables.net/api#fnPageChange
$("#myForm").submit(function(e) {
table.fnPageChange(0);
table.fnReloadAjax();
return false;
});
Building on the solution given by @Gigo:
$("#myForm").submit(function (e) {
    table.fnPageChange(0);
    table.fnReloadAjax();
    return false;
});
This has a problem: it sends two requests to the server.
I have found that fnPageChange alone already triggers the reload the first time:
$("#myForm").submit(function (e) {
    table.fnPageChange(0);
    return false;
});
This can also be solved by implementing the functions that save and load the DataTable's state and resetting the start point - example below.
"fnStateSave": function (oSettings, oData) {
localStorage.setItem( 'MyDataTable', JSON.stringify(oData) );
},
"fnStateLoad": function (oSettings) {
var settings = JSON.parse( localStorage.getItem('MyDataTable') );
settings.iStart = 0; // resets to first page of results
return settings
},
Since fnStateLoad is called whenever the table is reloaded - e.g. when a new filter is applied - the paging is reset to the start.
fnStateSave is called each time you retrieve the next page of results.
This approach avoids the overhead of an additional request to the server.
