The following code adds approximately 2,000 rows per second to tabledata:
$.each(data.msg, function(count, item) {
    tabledata.push({
        "#": "#",
        "Timestamp": item['Timestamp'],
        "ID": item['ID'],
        "ID2": item['ID2'],
        "DLC": item['DLC'],
        "Offset": item['Offset'],
        "Detect": item['Detect']
    });
});
//append table row
table.rows.add(tabledata).draw();
However, as time goes on, rendering takes longer and longer and the table stops being anything like real-time. So we are trying to reset the table, or delete old rows, at a fixed interval:
if (tableCount % 10 == 0) {
    console.log("remove");
    table.rows($('#packetList tr')).remove().draw(false);
    tableCount = 0;
}
However, the problem now is that only the current page of the paginated Bootstrap table is deleted. I have tried to find out how to delete the data on every page of the pagination, but I cannot find a clear example. Please let me know if you know how!
P.S. I would also appreciate any advice on the best way to avoid rendering delays when displaying 2,000 or more data points per second on a web page.
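For reference, this is roughly the direction I have been experimenting with: buffer the incoming messages and redraw once per interval, resetting the whole table with clear(), which as far as I understand removes the rows on every page rather than just the ones currently in the DOM. This is only a sketch (socket.on stands in for however the 2,000 messages per second actually arrive), and I am not sure it is the right approach:
var buffer = [];

socket.on('message', function (data) {    // placeholder for the real data source
    $.each(data.msg, function (count, item) {
        buffer.push({
            "#": "#",
            "Timestamp": item['Timestamp'],
            "ID": item['ID'],
            "ID2": item['ID2'],
            "DLC": item['DLC'],
            "Offset": item['Offset'],
            "Detect": item['Detect']
        });
    });
});

setInterval(function () {
    if (tableCount % 10 === 0) {
        table.clear();                    // removes every row, across all pages
        tableCount = 0;
    }
    table.rows.add(buffer).draw(false);   // one draw per interval instead of per message batch
    buffer.length = 0;
    tableCount++;
}, 1000);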
I have a tablesorter table with two filter columns. The first filter works as a dropdown and has no issues right now. The second filter is intended to be a full table search-and-filter mechanism.
That is to say, even though it is associated with the Computer column, it should return results for child rows as well.
The Computer filter should respond to all child rows. For instance, if I searched for z840, only computers with model z840 should appear.
However, I also have a custom secondary filter mechanism (added by request): the gauge at the top works as a filter for workgroup.
However, if I have filtered to a workgroup and then use the Computer filter, it ignores the custom-hidden rows and searches against every row in the table. (Child row searching works fine.)
My question is: is there a way to override the filter's behavior so that it ignores any rows that already satisfy some condition, e.g. $(row).hasClass('hide')?
I have tried using filter_functions, but every result ends up searching on computer name only.
I am using Jinja templating, so it was a little hard to get a fiddle up and running, but here is a sample:
http://jsfiddle.net/brianz820/856bzzeL/813/
Sort by wg02 (at the top, no results), then use the Computer filter to search for, say, 3.3. No results show up, but once you delete the search, the original workgroup filter is removed.
On my production copy, even typing 3.3 would have returned results for any workgroup, ignoring the filter.
There may be lots of extraneous code in the fiddle; I just wanted to get a working version up.
Thanks for reading. The goal is to keep free-form child-row searching and filtering on filter selection, while still respecting the externally hidden rows.
If any more information is required, please let me know.
I'm not exactly sure if this is what you meant, but the hideRows function can be simplified by using the filter widget (demo):
function hideRows() {
    // 'selected' is the workgroup value chosen elsewhere (e.g. from the gauge)
    var $table = $('.tablesorter'),
        filters = $.tablesorter.getFilters( $table );
    filters[2] = selected === 'All' ? '' : selected;
    $.tablesorter.setFilters( $table, filters );
}
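If the gauge isn't already wired up this way, its handler only needs to update selected and call hideRows() again. A rough sketch (the '#workgroup-gauge' selector is made up; use whatever element actually drives the workgroup choice):
// Hypothetical wiring for the gauge control.
$('#workgroup-gauge').on('change', function () {
    selected = $(this).val();
    hideRows();
});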
I'm trying to build a web service where multiple users can be logged in at the same time. On the Node.js server there is an unsorted array of all the users, and a database with all users.
Every user can always see every online user in an HTML Bootstrap table, with columns for username, ID, online since, and so on. There are also lists for groups that include both online and offline users. The important part is that the table should be updated roughly every 3-5 seconds.
I want users to be able to sort the online-users table by clicking on a column header. What is the best practice for doing that?
I can currently only think of two solutions, and neither seems ideal to me.
1. Use Bootstrap sorting
I save which way the user wanted the list sorted.
Then I receive the unsorted data and fill the table with it, after which I trigger a click on the header to sort the table the same way again.
But done this way every 3-5 seconds, I think the user will always notice that the table is refilled and then re-sorted.
2. Keep sorted lists on the Server
Keep all the different sorted lists of users on my server at all times and have the server re-sort them every 3-5 seconds.
Then, when a client requests it, send the sorted list it currently wants and fill the table HTML.
But I think this would use quite a lot of server resources, because the server would also have to sort mixed online/offline users for the groups, which means many different lists I would have to constantly store and reorder.
Are there better ways to provide many sortable user lists to the clients?
The important thing about the UI is to reduce flicker and any kind of lag. First, try sorting on the client before the data is displayed in the table. You don't want to trigger the click events, because that can cause a flicker effect where the data comes in, is displayed, sorted, and then displayed again. If for some reason the sorting takes too long, it could cause lag or choppiness in the UI, so test it out and see how it feels. I would only look to the server side if the client side isn't performing well. Check your CPU and RAM to see how best to handle that: sorting on the fly might be doable with your setup, or keeping the sorted lists in RAM may be an option if you have some to spare.
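As a rough sketch of that client-side option (assuming jQuery in the browser; the '#online-users' id and column names are placeholders, and keySort is the helper defined below):
// Sort the freshly received data in memory, then rebuild the table body in one go,
// so the user never sees an unsorted intermediate state.
function renderUsers(users, key, descending) {
    users.keySort(key, descending);
    var rows = users.map(function (u) {
        return '<tr><td>' + u.Username + '</td><td>' + u.Id + '</td><td>' + u.OnlineSince + '</td></tr>';
    });
    $('#online-users tbody').html(rows.join(''));
}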
Server-side, the sorted lists would be stored in a site-wide or per-thread variable in RAM. If you can get away with it, the per-thread variable will be the fastest option, but the memory cost would be SORTEDDATA_BYTES * WEB_THREADS.
// Sorts the array in place by the given key and returns it.
Array.prototype.keySort = function(k, descending) {
    var z = descending === true ? -1 : 1;
    this.sort(function(a, b) {
        return a[k] > b[k] ? 1 * z : a[k] < b[k] ? -1 * z : 0;
    });
    return this;
};

// Pre-serialize each ordering once so requests can be answered without re-sorting.
var sortedJSON = {
    UsernameAsc: JSON.stringify(data.keySort("Username")),
    UsernameDesc: JSON.stringify(data.keySort("Username", true)),
    IdAsc: JSON.stringify(data.keySort("Id")),
    IdDesc: JSON.stringify(data.keySort("Id", true))
};
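Each client request can then be answered straight from RAM. A rough sketch, assuming Express (swap in whatever framework and routes you actually use):
var express = require('express');
var app = express();

// Serve the pre-sorted JSON straight from memory, e.g. GET /users?sort=IdDesc
app.get('/users', function (req, res) {
    var key = req.query.sort || 'UsernameAsc';
    res.type('json').send(sortedJSON[key] || '[]');
});

// Re-sort every few seconds so clients always get a reasonably fresh list.
setInterval(function () {
    sortedJSON.UsernameAsc  = JSON.stringify(data.keySort("Username"));
    sortedJSON.UsernameDesc = JSON.stringify(data.keySort("Username", true));
    sortedJSON.IdAsc        = JSON.stringify(data.keySort("Id"));
    sortedJSON.IdDesc       = JSON.stringify(data.keySort("Id", true));
}, 4000);

app.listen(3000);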
Scenario
I have a web interface (in a large web application) that allows a user to make connections between two very large lists.
List A - 40,000+ items
List B - 1,000+ items
List C - the items in List B that are connected to the selected item in List A
The Code
Here is a rough jsfiddle of the current behavior minus the ajax update of the database.
Here is the primary functionality (included only because Stack Overflow requires a code snippet alongside jsFiddle links).
$('.name-listb input').add('.name-listc input').click(function (e) {
    var lista_id = $('.name-lista input:checked').val();
    var listb_id = $(this).val();
    var operation = $(this).prop('checked') ? 'create' : 'delete';
    var $listb = $('.name-listb .checkbox-list');
    var $listc = $('.name-listc .checkbox-list');
    if (operation == 'create') {
        $listb.find('input[value=' + listb_id + ']').prop('checked', true);
        // Ajax request to add checked item.
        var $new_item = $listb.find('input[value=' + listb_id + ']').parents('.option-group').clone();
        $listc.append($new_item);
    } else if (operation == 'delete') {
        console.log('hello list delete');
        $listb.find('input[value=' + listb_id + ']').prop('checked', false);
        // Ajax request to remove checked item.
        $listc.find('input[value=' + listb_id + ']').parents('.option-group').remove();
    }
});
The Problem
The requirements do not allow me to use an autocomplete field or a pager, but the current page takes far too long to load (between 1 and 5 seconds depending on caching). The JS behaviors are also attached to all 40k+ items, which causes problems on lower-performance computers (tested on a newish $200 consumer special, and the JS crippled it). There is also a filter (not in the jsFiddle, but in the final product) that narrows the list down based on text input.
The Question
What is a good strategy for handling this scenario?
My Idea
My first thought was to create a sort of windowed document view: a JavaScript list that adds items to the top and bottom as the user scrolls and drops items off the other end when the list reaches a certain size. To filter, I would dump the whole list and fetch a new list of filtered items, like an autocomplete, but one that can keep scrolling and adding items via Ajax. But this is quite complicated, so I was hoping someone might have a better idea, or know of a jQuery plugin that already takes this approach.
Update
List A is 70k, fixed.
List B is user generated and will span between 1k and 70k items.
That said, just optimizing the JS with the excellent feedback about using delegates (which will make life 10x more awesome; see the sketch below) won't be enough. I still need to limit the visible list.
Your Ideas?
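For reference, the delegation change based on that feedback looks roughly like this (same selectors as the snippet above, only the binding style changes), though it doesn't solve the visible-list problem on its own:
// One delegated handler on the list containers instead of one handler per input.
$('.name-listb, .name-listc').on('click', '.checkbox-list input', function (e) {
    var listb_id = $(this).val();
    var operation = $(this).prop('checked') ? 'create' : 'delete';
    // ... same create/delete logic as before ...
});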
I've encountered this issue on numerous projects, and one solution that is both easy to implement and performs well is to use something like Infinity.js.
To summarize briefly: Infinity, like many other "infinite scroll" libraries, renders only the small part of the list that is visible (or will soon be visible), which reduces the strain on the browser tremendously. You can see a simple live demo over here; check the first link for the API reference.
This is not impossible to do; the trick is NOT to load all of that into the DOM, because it will wreck any JavaScript engine.
The answer is to use the d3.js base library, which is the king when it comes to sorting and filtering extremely large data sets on the client side, whether tabular or graphical. When I first saw it, it had four examples; now there are pages and pages.
One of the first examples provided with d3 is crossfilter.
The dataset is 5.3 megabytes! It filters the data in milliseconds, and it promises to sort millions of rows without a performance loss.
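To give a flavour of it, a minimal crossfilter sketch (the record fields and the query variable are made up, not taken from the demo):
// Index the big list once, then filter and slice it entirely in memory.
var cf = crossfilter(records);                            // records = array of plain objects
var byName = cf.dimension(function (d) { return d.name; });

var query = 'abc';                                        // whatever the text input currently holds
byName.filterFunction(function (name) {
    return name.indexOf(query) !== -1;                    // substring match on that dimension
});
var visibleRows = byName.top(100);                        // only hand a small slice to the DOM

byName.filterAll();                                       // clear the filter again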
I don't think this is strictly infinite scrolling, but it was the closest comparison I could think of to what I am seeing.
Anyway, we are using ng-grid to show data in a table. We have roughly 170 items (rows) to display. When we use ng-grid it creates a repeater. When I inspect this repeater from the browser, it is limited to 35 items, and as you scroll down the list you start to lose the top rows from the DOM while new rows are added at the bottom (hence why I don't think it's strictly infinite scrolling, as that usually just adds more rows).
Just so I'm clear: there are always 35 'ng-repeat=row in renderedRows' elements in the DOM, no matter how far you have scrolled down.
This is great until it comes to testing. I need to get the text for every item in the list, but using element.all(by.binding('item.name')), by.repeater or by.css doesn't help, as there are only ever 35 items present on the page.
Now to my question: how can I grab all 170 items as an object that I can then iterate through, getting the text of each row and storing it in an array?
On other pages where we have fewer than 35 items, I've just used the binding to create an object and then used async.js to go over each row and get its text (see below for an example; it is a modified extract, so it probably wouldn't work as-is, it's just for reference).
// columnData contains only 35 rows; I need all 170.
var columnData = element.all(by.binding('row.entity.name')),
    colDataTextArr = [],
    // prevOrderArray gets created elsewhere
    prevOrderArray = ['item1', 'item2'... 'item 169', 'item 170'];

function checkColumnOrder(columnData, colDataTextArr, prevOrderArray) {
    columnData.then(function (colData) {
        // Use async to go over each row in series.
        async.eachSeries(colData, function (colDataRow, nRow) {
            // Get the text for the current row.
            colDataRow.getText().then(function (colDataText) {
                // Add each row's text to the array.
                colDataTextArr.push(colDataText);
                nRow();
            });
        }, function (err) {
            if (err) {
                console.log('Failed to process a row');
            } else {
                // Perform the expectation.
                return expect(colDataTextArr).toEqual(prevOrderArray);
            }
        });
    });
}
As an aside, I am aware that iterating through 170 rows and storing the text in an array isn't very efficient, so if there is a better way of doing that as well, I'm open to suggestions.
I am fairly new to JavaScript and web testing, so if I'm not making sense because I'm using the wrong terminology (or for any other reason), let me know and I'll try to explain more clearly.
I think it is overkill to test all the rows in the grid. It would probably be sufficient to test that you get values for the first few rows and then, if you absolutely need to test all the elements, use evaluate().
http://angular.github.io/protractor/#/api?view=ElementFinder.prototype.evaluate
Unfortunately there is no code snippet in the api page, but it would look something like this:
// Grab an element where you want to evaluate an angular expression
element(by.css('grid-selector')).evaluate('rows').then(function(rows){
// This will give you the value of the rows scope variable bound to the element.
expect(rows[169].name).toEqual('some name');
});
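If you do want to assert on the whole order at once, the same evaluate() result can be mapped to names and compared against your existing array (assuming each row object really exposes a name property):
// Compare the full order in one expectation.
element(by.css('grid-selector')).evaluate('rows').then(function (rows) {
    var names = rows.map(function (row) { return row.name; });
    expect(names).toEqual(prevOrderArray);
});
As an aside, for the rows that are actually rendered, element.all(by.binding('row.entity.name')).getText() resolves to an array of strings, so the manual async loop isn't strictly necessary there.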
Let me know if it works.
I have a long table of schedule data, in columns, that I'm loading via a form with jQuery load(). I have access to the HTML pages containing the table data and can add classes, data attributes, etc.
My form has select fields for hours and minutes (defaulting to the current time) and I'm trying to get the next closest time plus the four after that.
The time data in my tables are all formatted as <td>H:MM</td>.
Ideally with jQuery, how can I strip the table data down to just those times? Alternatively, since I can reformat the data, would formatting it a certain way make my life easier?
Things I've tried (I am admittedly a novice at JS, so these may seem silly):
1. Reading the first character of each cell and comparing it to the selected hour. This is obviously a problem with 10, 11 and 12, and is really intensive (this is a mobile site).
2. Using a single time select field, then creating an array of each column to compare with the selected time. I couldn't get this working, and it also creates the issue of having to use a single select for every time.
Basically I'm looking for a little guidance on how to get this working, short of (or maybe including) copying all the HTML tables into JSON format...
May as well post http://jsbin.com/ozebos/16/edit, though I was beaten to it :)
The basic mode of operation is similar to @nrabinowitz's answer:
On load, parse the time strings in some way and store the result with .data() on each row.
On filter (i.e. when the user manipulates the form), the chosen time is parsed in the same way, and the rows are filtered on row.data('time') >= chosen_time.
The resulting set of elements is limited to 5 (the closest time plus the four after it, as the OP requested) using .slice(0, 5).
All rows are hidden, then these rows are displayed.
Some assumptions have been made, so this code serves only as a pointer to a solution.
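For reference, that flow comes out to roughly the following sketch (it assumes one <td>H:MM</td> time cell per row and a made-up #schedule table id):
// Convert "H:MM" into minutes so times compare numerically.
function toMinutes(str) {
    var parts = str.split(':');
    return parseInt(parts[0], 10) * 60 + parseInt(parts[1], 10);
}

// On load: parse each row's time once and stash it with .data().
$('#schedule tbody tr').each(function () {
    $(this).data('time', toMinutes($(this).find('td').first().text()));
});

// On filter: hide everything, then show the closest time and the four after it.
function showUpcoming(selectedTime) {
    var chosen = toMinutes(selectedTime);
    var $rows = $('#schedule tbody tr').hide();
    $rows.filter(function () {
        return $(this).data('time') >= chosen;
    }).slice(0, 5).show();
}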
I thought this was an interesting question, so I put together a jsFiddle here: http://jsfiddle.net/nrabinowitz/T4ng8/
The basic steps here are:
Parse the time data ahead of time and store using .data(). To facilitate comparison, I'm suggesting storing the time data as a float, using parseFloat(hh + '.' + mm).
In your change handler, use a loop to go through the cells in sequence, stopping when you find the index of the cell with a time value higher than your selected time. Decrement the index, since you've gone one step too far.
Use .toggle(i >= index && i < index+4) in an .each() loop to hide and show the appropriate rows.
Here's how to do it on the client side. This is just an outline, but it should give you an idea.
// Create a sorted array of times from the table.
var times = [];
$('#mytable td').each(function (i, cell) {
    times.push($(cell).text());
});
times.sort();
// Those times are strings, and they can be compared, e.g. '16:30' > '12:30' returns true.
// (Note: this relies on zero-padded hours such as '09:30'; with single-digit hours like
// '9:30', plain string comparison will misorder the times.)
var currentTime = '12:30'; // you probably need to read this from your select
// Find the first time that is not earlier than the current time.
var i = 0;
while (i < times.length && times[i] < currentTime) {
    i++;
}
var closestTime = times[i];
var closestTimes = times.slice(i, i + 5); // the closest time plus the four after it
If you want to access not the times, but actually the cells containing the times, you can find them like this:
$('#mytable td').each(function () {
    if (closestTimes.indexOf($(this).text()) !== -1) {
        // do something to that cell
    }
});