Effect on page operation if page is too big in HTML - javascript

In one of my requirements, I get a log file (which can be around 10 MB) from my web server and display its contents as log entries in a table (using JavaScript). The problem I face is that this creates too many table rows, which makes the page less responsive.
I want to know: do too many elements in the HTML make a page slow?
Also, what would be a good workaround for this?
Thanks in advance.

What does your javascript look like? Depending on how you're building the DOM, it could be the bottleneck. For example:
var html = [];
for(var i=0;i<rows.length;i++) {
html.push('<tr><td>' + rows[i] + '</td></tr>');
}
$('#tbody').append(html.join(''));
is going to be a lot faster than
for(var i=0;i<rows.length;i++) {
$('#tbody').append('<tr><td>' + rows[i] + '</td></tr>');
}
This is assuming you're using jQuery. Also, as others have mentioned, use pagination to limit the number of elements on the screen.
If your HTML is complex, you may want to look into a template engine, and get some help choosing the one that is right for you. I use dust because LinkedIn did all the hard evaluation work.
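To illustrate, here is a minimal sketch of the dust approach for the log-table case above; the template name 'logRows' and the shape of the data are assumptions for illustration, not anything from the question:
// Compile and register a tiny dust template once (assumed name: 'logRows'),
// then render all rows in one pass and append the result in a single DOM write.
var source = '{#rows}<tr><td>{.}</td></tr>{/rows}';
dust.loadSource(dust.compile(source, 'logRows'));

dust.render('logRows', { rows: rows }, function (err, out) {
  if (err) { console.error(err); return; }
  $('#tbody').append(out);
});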

So, given this is an AJAX call, I see a few options (in order of best performance):
Add paging to the service call and allow the AJAX query to only pull down specific batches of entries. Then implement next/previous page buttons that make successive calls to retrieve more/less information. E.g. /ajax/log would be your initial call; clicking the next button would call /ajax/log?page=2, and the previous button would re-call /ajax/log?page=1.
Pull all the data down in a single call, but only selectively output it to the DOM. KnockoutJS makes outputting information in a paging format very simple (and there are examples here on Stack Overflow).
Dump it as-is and add a script like jQuery DataTables. This will hide rows and implement paging based on what's been pushed into the table itself. It requires very little additional programming other than loading the table via AJAX and then calling $('#myTable').DataTable().
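For that last option, a minimal sketch, assuming the /ajax/log endpoint from option 1 returns JSON in the { "data": [...] } shape DataTables expects and that there is a single log-entry column (both assumptions, not part of the question):
// Let DataTables fetch the log via AJAX and handle paging itself.
$(document).ready(function () {
  $('#myTable').DataTable({
    ajax: '/ajax/log',            // assumed to return { "data": [ ["entry"], ... ] }
    columns: [{ title: 'Log entry' }],
    pageLength: 100,
    deferRender: true             // only create DOM nodes for rows actually displayed
  });
});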

Related

Splitting a VERY long table into multiple pages html/php

Here is the problem: I have no right to create a database, and I receive a CSV that contains a MASSIVE amount of data each day (more than 200,000 rows).
This is data that I must make accessible to everybody on the intranet, so I created a simple HTML/PHP page that extracts all the rows with a simple fgetcsv and displays that information in a table with a filter on every column.
The problem is that the web browser is not suited to displaying that much information at once, so it crashes or freezes for a while and you can't do anything.
I wanted to know if anyone knew a way to tell the page "load only the first 100 rows, for example, then automatically create a next page that will load and display the next 100 rows", and so on.
I managed to DISPLAY only the first x rows and then expand the table with the next x rows when you click a button, but they are still all loaded at once. The remaining rows are just hidden, so the browser still dies or freezes.
Any idea?
Thanks
It's a generic pagination question, really. It doesn't matter whether your data is stored in a database or in a CSV file.
Just pass some offset argument to your PHP script via the query string or URL rewriting and use it to select only part of your CSV list.
Like this: /big-table.php?page=3.
// Getting the passed argument (defaulting to the first page).
$pageNumber = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
// Items per page default.
$itemsPerPage = 100;
// Calculating the offset.
$offset = ($pageNumber - 1) * $itemsPerPage;
Then use $offset and $itemsPerPage to retrieve only part of your CSV file, by limiting the scope of your CSV parsing loop.
You can also pass the items-per-page value as an argument to your script in order to control it from your web interface, for example if you want to create a dropdown menu with the ability to select 10, 50, 100 items per page, etc.
And if you want, you could always use AJAX to fetch more items dynamically; it doesn't really affect your server-side pagination implementation, only the output format (JSON instead of HTML).
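As a rough illustration of that AJAX variant, here is a client-side sketch; the format=json parameter, the table id and the row structure are assumptions about how big-table.php might be extended, not existing behaviour:
// Fetch one page of CSV rows as JSON and rebuild the table body with it.
function loadPage(pageNumber) {
  $.getJSON('big-table.php', { page: pageNumber, format: 'json' }, function (rows) {
    var html = [];
    $.each(rows, function (i, row) {
      // each row is assumed to be an array of cell values from one CSV line
      html.push('<tr><td>' + row.join('</td><td>') + '</td></tr>');
    });
    $('#csvTable tbody').html(html.join(''));
  });
}

loadPage(1); // next/previous buttons would call loadPage(n + 1) / loadPage(n - 1)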
Of course, a database implementation will work faster, and I would recommend opting for it instead if possible. And/or you can use some caching layer to speed things up.
You could use the jQuery DataTables plugin: http://datatables.net/
It's quite simple to do what you want using it.
Refer to either this example: http://datatables.net/examples/data_sources/ajax.html or this one: http://datatables.net/examples/data_sources/server_side.html
The most sensible thing is to demand database access. Right now, you're being told to build a car, but without using wheels or an engine.
In the meantime, you could use PHP to split the big CSV into multiple smaller files of n rows each. You only have to do this once, or once a day/hour if the big CSV is updated. Then you load each of these files only when you need it, either by navigating to another page or dynamically using JavaScript.
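For the "dynamically using JavaScript" part, a minimal sketch, assuming the pre-split files are written out as chunks/chunk-1.html, chunks/chunk-2.html, ... and each contains ready-made <tr> rows (the file names and table id are assumptions):
// Replace the table body with the requested chunk of the pre-split CSV.
function showChunk(n) {
  $('#csvTable tbody').load('chunks/chunk-' + n + '.html');
}

showChunk(1); // next/previous buttons would call showChunk(n + 1) / showChunk(n - 1)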

jQuery Show/Hide Multiple Comment Sections on One Page

Presently, I am using a script to show/hide comments on a number of WordPress-based sites. The script works as expected on essentially every site. However, I've run into an issue with a site using the Hero theme. On the index pages, the theme pulls the content from X number of posts. Unfortunately, this has the effect of calling the script multiple times and resulting in the first post getting X number of show/hide buttons and the other posts being unaffected.
I'm uncertain as to how to modify the script in order to target each of the comment sections individually. Naturally, the task would be easier if the theme author had not assigned all comment sections on index pages the same id (#commentBox). I've looked at limiting the scope of the script (i.e., to have each instance of the script affect only the post content within which it's contained) and changing the actual functionality of the script to account for the multiple comment sections. Unfortunately, I've not yet been able to get it working as intended.
Here is a link to an index page displaying the issue: http://www.sitestyling.ca/abbyphotography/blog/. The first link (i.e., the one to the script) leads to a page which has the script implemented and functioning as expected.
Any help or suggestions would be greatly appreciated.
So, to state the obvious here: having multiple elements with the same ID is very bad mojo. To whatever extent you can control how these WP sites are loaded, I would encourage you to fix the repeated injection of your script into the page, since even after making a change your script would still run repeatedly, because you have multiple blocks defining document.ready functions. You could ameliorate that with a global flag indicating whether your script has already fired, but that's getting a bit hackish...
What I would suggest in principle is that you add logic to your script to do the following:
Find the offending duplicate ID items and rename them to something unique
Add a class that you can reference rather than an ID
Refactor your code to loop through the collection of items returned from the class selector
If really, really necessary, you could also keep track of your renamed IDs, retrieve them individually later, and reset them to the offending original duplicate ID name, thus incriminating yourself to future users of the page
The first couple of points here could be done pretty simply with something like this:
var uniqueAppend = 1;
var tempName = 'commentBox';
// The ID selector only ever matches the first remaining duplicate, so each pass
// renames one #commentBox; the loop ends when none are left.
while (jQuery("#commentBox").length > 0) {
  jQuery("#commentBox").attr('id', tempName + uniqueAppend++).addClass('commentContainer');
}
Now you can find all your comment DIVs at once with jQuery(".commentContainer") and then iterate through that collection to take whatever actions you need.
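From there, a rough sketch of the per-section wiring; the injected toggle link and the idea of hiding each section initially are assumptions made for illustration, not taken from the theme or your existing script:
// Give every comment section its own toggle, scoped to that section only.
jQuery('.commentContainer').each(function () {
  var $section = jQuery(this);
  // Hypothetical toggle link; in the real theme this would be whatever
  // element your existing script binds its click handler to.
  var $toggle = jQuery('<a href="#" class="show-hide-comments">Show/hide comments</a>');
  $section.before($toggle).hide();
  $toggle.on('click', function (e) {
    e.preventDefault();
    $section.toggle();
  });
});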

Oracle APEX: Call stored procedure from javascript

I need some help with Oracle APEX. What I want to do is the following:
I have a table with some data about people, so each row describes exactly one person. I want to show some more information about a certain person, for example the list of shops he or she has visited. Such data is provided by other tables.
I see it this way: right-click on a certain row of the people table -> select an option (what kind of info to show) -> execute a stored procedure and show a new page with a data table (e.g. the list of shops). But how can I implement this?
I've already found this plugin. Now I can execute some JavaScript function after a right-click. But how can I execute a stored procedure and show a new page?
I'm new to APEX; any help would be appreciated.
You're trying to reinvent the wheel. You're new to APEX. Have you taken a good look at the documentation?
Start out where everyone else has to start: at the beginning. Report + form. Column links.
There is ample help available to someone new to APEX:
- The 2-day developer guide, which runs you through some of the basics of APEX and is a good familiarization.
- Get a workspace at apex.oracle.com. Each workspace starts with the sample database application, based on products and customers. You can view and edit this application, and thus you can glean plenty of information from it.
- Furthermore, there are the packaged applications, many of which offer good basic solutions to common situations. Again, you can glean a lot of information from them, and they are even editable after you unlock them.
After you are familiar with the basics, you can look further ahead. What you are asking is simply too much for someone new to the matter: you want to implement a jQuery plugin straight away, and you're talking AJAX. It's great if you know those subjects, and they'll be of plenty of value to you, but it seems you don't even know how to present and fetch your data yet.
A good start would be to make a report and a form. On the form page you can then add some classic (or interactive) reports to present the associated data.
It's possible. First of all, assume that each row contains a unique identifier, ID. You have to add a hidden item to the page that shows the additional info about a certain row; let's name it P2_ID. Then add the following JavaScript code to the page that contains the initial data (in the example from the question, the page with the table of information about people):
function TestFunction(action, el, pos) {
  // Read the value of the ID column from the clicked row.
  var id = $(el).children('td[headers="ID"]').text();
  // APEX URL syntax: f?p=App:Page:Session:Request:Debug:ClearCache:ItemNames:ItemValues
  var href = 'f?p={APPLICATION_ID}:{PAGE_NUMBER}:&SESSION.::::P2_ID:' + id;
  window.location = href;
}
The function name should match the name in the plugin settings.
Replace {APPLICATION_ID} and {PAGE_NUMBER} with the actual values for your application; PAGE_NUMBER is the number of the page that contains the additional info about the row.
Then you can add some reports to that page and use the P2_ID item to select information about the chosen row.
The only problem is that the plugin mentioned in the question stops working after the table is refreshed. For example, if we filter the data in the table, no menu is shown on right-click. I don't know how to fix that for now. Any ideas?

Fast managing of tables with ~10k rows in Javascript [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I've been working with tables and a huge amount of data. There are tables on my website with 10 thousand rows. These tables have dynamic search, filters, etc. I've been using pure JavaScript for performance reasons, but it gets laggy with this number of rows.
Do you guys know any alternative to pure JavaScript with better performance?
EDIT> I REALLY need to load the 10 thousand rows at once. I can load them all in the browser in 5 seconds. The main problems are the filters and search...
EDIT2> The search is dynamic: I can search by name and filter by first character.
I've been working on it for months...
SEARCH:
search when the field has more than 3 characters, and only when its length or characters have changed (change events on the input may fire multiple times when a character changes, so I make sure the search only triggers once, using some verifications)
each row that matches the searched string is copied to another table. The original table is hidden and the new one is displayed.
when the user changes the search field or cancels the dynamic search, the new table is erased.
Conclusion: it's faster to create a new table with the desired elements than to hide the undesired ones.
FILTER:
The rows are actually in 25 tables (A to Z + non-alphabetical characters)
When you select a character, only that table is shown
Conclusion: it's faster to hide a whole table than to hide the undesired rows
Thanks for the replies. I've edited with some extra info so we can narrow down the possible solutions...
I'm assuming you get the contents from a database and load them with something like PHP (I'm going to assume PHP for now).
You could have the JavaScript make an AJAX call to a PHP file which does the filtering (actually you should make the database do it, that's a lot faster!) and place the resulting table back on screen.
A faster method, combined with the above, might be this: get all the ids of the initial rows in an array and save those (keeping them in a session might work pleasantly).
When you have to filter, don't make PHP fetch the whole table; just apply the filter to only the stored ids and send JavaScript the matching rows.
Then make JavaScript do something like this (see the sketch below):
- set all rows visible
- set the resulting ids to hidden (hidden in favor of removing, because I think a user might perform multiple filter actions?)
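A minimal sketch of that client-side step, assuming each <tr> carries its database id in a data-id attribute and idsToHide is the array the PHP filter sent back (the table id and both names are assumptions):
// Show every row, then hide the ones the server-side filter returned.
function applyFilter(idsToHide) {
  var hidden = {};
  $.each(idsToHide, function (i, id) { hidden[id] = true; });

  $('#bigTable tbody tr').each(function () {
    var rowId = $(this).data('id');
    $(this).toggle(!hidden[rowId]);   // hide if its id came back from the filter
  });
}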
Another idea that just popped into my head: if you don't need to display anything on load, you can start with all tables hidden and a message "please search to display".
A common technique to handle this case is to load the data (or a subset of it) in memory and recycle your table rows, such that you aren't actually ever creating thousands upon thousands of rows. You can get creative with this and create a web interface that seemingly scrolls endlessly, but in reality you are just reusing DOM elements and shuffling them around.
Most well-built data grid widgets, whether they are on the web, on mobile or even in a desktop interface, employ this technique to handle your particular problem.
In most cases a user will never actually benefit from seeing tens of thousands of rows of data at once anyway.
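A bare-bones sketch of that row-recycling idea in plain JavaScript; the element ids, the fixed row height and the in-memory data array are all assumptions made for illustration:
// Markup assumed:
//   <div id="viewport" style="height:400px; overflow-y:auto">
//     <div id="spacer"><table><tbody id="body"></tbody></table></div>
//   </div>
var data = [];                                      // the ~10k rows, already in memory
for (var i = 0; i < 10000; i++) data.push('row #' + i);

var ROW_HEIGHT = 20;                                // px, must match the CSS row height
var viewport = document.getElementById('viewport');
var spacer = document.getElementById('spacer');
var body = document.getElementById('body');

// The spacer gives the scrollbar the height of the full data set.
spacer.style.height = (data.length * ROW_HEIGHT) + 'px';

// Create only enough <tr> elements to fill the viewport, plus one.
var visibleCount = Math.ceil(viewport.clientHeight / ROW_HEIGHT) + 1;
var rows = [];
for (var r = 0; r < visibleCount; r++) {
  var tr = document.createElement('tr');
  tr.appendChild(document.createElement('td'));
  body.appendChild(tr);
  rows.push(tr);
}

function render() {
  var first = Math.floor(viewport.scrollTop / ROW_HEIGHT);
  // Slide the small table down to where the visible slice belongs.
  body.parentNode.style.transform = 'translateY(' + (first * ROW_HEIGHT) + 'px)';
  for (var r = 0; r < rows.length; r++) {
    var idx = first + r;
    rows[r].firstChild.textContent = idx < data.length ? data[idx] : '';
  }
}

viewport.addEventListener('scroll', render);
render();
The same handful of rows get rewritten on every scroll, so the DOM never holds more than a screenful of elements no matter how large the data set is.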
Fetch from the server only the things to be seen by the user. Like everyone has pointed out, 10,000 rows needn't be on that page.
You can use the concept of pagination: for every page, a few rows are fetched and shown. jQuery's AJAX is capable of calling a server-side function to fetch rows and add them to your page.
I don't know any backend details here, but in the Struts framework there is the display:table tag, and I believe the .NET framework has GridView; both handle pagination on the client side and are worth looking into.
I'm just a student, newly joined to the community. Take what I say with a grain of salt.
I'm not sure why everyone is so much as blinking at the ten thousand rows business when we're measuring modern personal computers' memory in gigabytes.
Alright. I'm going to assume that what you're doing needs to be done in the browser, and so you can't switch to doing native code. In that case, looking for an alternative to Javascript won't get you much of anywhere. In the context of a browser, you're looking at an interpreted language. In terms of number of instructions the program ultimately has to run, the difference between one language or another is negligible in the face of how long it takes to be interpreted. Besides, Javascript has gotten nicely polished over the years.
So never mind that. There's a much more important thing to consider here, and it applies no matter what you're programming in or on: The cache(s). Igor Ostrovsky explains it beautifully; read it until you grok it.
So I'm guessing you have objects that would stringify to something like "obj1 = {field-1:'a', field-2:'b', ..., field-n:'n'}", and you can select a field-i to sort by. The trouble with this is that when you sort by field-i, you're loading the entirety of obj1 into the cache, even though you don't need it. What you really want to do is load the field-i's for obj1, obj2, obj3, ..., objm all at once. So you look at an array, stringifying to something like: "field1 = [refToRow1, field1inRow1, refToRow2, field1inRow2, ..., refToRowM, field1inRowM]".
You might not be able to fit all M rows in the cache, after all M==10000! But you can group them together into chunks that you could reasonably expect a cache to manage. Anyone got a good number for this? Say, 64kB? So you figure for each i in M you've got a reference and a field that's probably just a reference to a short string (it'd be better if you could have the string itself right there, but I don't think Javascript works that way). So that's 16B per row, and 16B*10000 = 160kB? Hell, if that's right, you could fit it all into the cache in about three 64kB chunks, which means you'd want to split it into a handful of them.
So now you've got a collection of smaller arrays, and you want to sort them. This is a classic application for B-trees. And while having a separate B-tree for each and every column in the table may seem like a lot, it's not.
Okay, so that handles sorting. You tell it to sort by a column, and the truth is it's already sorted! You're just repopulating the visible table using a different b-tree. You still need to handle filtering, but that's fine. You do some cache juggling as you find something to display and follow the reference to get the other fields, but I'd still expect this to go fast since you're skipping over so many rows.
Normally, I would say if you want to speed things up, look into multiprocessing. But I think browsers are still working to make that a thing with their Javascript implementations. Plus, while it would be well-suited for sorting, it would be a lot of effort to make it useful for the filtering part, and I expect you can do fine without.
I hope this isn't too scatter-brained, and that it gives you some ideas. Good luck!

living web page elements

I have a simple Rails app with a model Task, which has 10 rows. It does not matter what's inside this table. On the index page I can see all 10 elements, and I need to arrange them in the proper sequence; once I've done this, I should see the message "Done".
If I understand correctly, this should be implemented in JavaScript, because the page should not be reloaded, right?
I want to be able to rearrange the elements via drag and drop.
How can I realize this function?
I would start here -> http://jqueryui.com/demos/sortable/
You can sort, and it's quite simple, as there are great examples on the site showing how to do this and how to hook into events.
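As a rough sketch of how that could look, assuming the tasks are rendered as a <ul id="tasks"> whose <li> ids encode the correct order (the ids, list markup and #message element are all assumptions, not part of the question):
// Let the user drag the tasks around; show "Done" once they are in the right order.
$(function () {
  var expected = ['task_1', 'task_2', 'task_3', 'task_4', 'task_5',
                  'task_6', 'task_7', 'task_8', 'task_9', 'task_10'];

  $('#tasks').sortable({
    update: function () {
      // toArray() returns the li ids in their current on-screen order.
      var current = $('#tasks').sortable('toArray');
      if (current.join() === expected.join()) {
        $('#message').text('Done');       // hypothetical message element
      }
    }
  });
});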
