I am building a page that, depending on a date range, displays between zero and a couple hundred rows. When the user enters the page it loads and displays all rows; the user can then filter the data to their needs. This seems reasonably fast in Chrome, but IE8 becomes quite slow at some point. (Unfortunately, IE8 is the browser that counts.)
Say I need the entire data set at page load but only want to display a subset. What's the best way to do that?
1.) Build a DOM String and add only the needed rows to the "real" DOM.
2.) Save the data in localStorage.
3.) Take the needed data from the Server produced JSON Object.
4.) ???
Or is it always better to hit the server with a specified query and return only the needed data?
On page load, render all the rows in the DOM and keep the necessary fields of the data in a JSON array.
When the filter criteria change, filter the data in the JSON array, then use the unique identifiers from the JSON to hide the non-matching rows in the table (only hide, don't remove). This way you won't have to re-render existing rows.
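The hide-by-identifier idea can be sketched roughly as below. This is a minimal sketch, assuming each `<tr>` was rendered with a `data-id` attribute matching the JSON row's `id`, and that the field names (`id`, `status`) stand in for your real ones:

```javascript
// Split row ids into those that pass the filter and those that don't,
// so the DOM rows can be toggled instead of rebuilt.
function applyFilter(rows, predicate) {
  var show = [], hide = [];
  rows.forEach(function (row) {
    (predicate(row) ? show : hide).push(row.id);
  });
  return { show: show, hide: hide };
}

// Usage with jQuery: toggle visibility, never re-render.
// var result = applyFilter(data, function (r) { return r.status === 'open'; });
// result.hide.forEach(function (id) { $('tr[data-id="' + id + '"]').hide(); });
// result.show.forEach(function (id) { $('tr[data-id="' + id + '"]').show(); });
```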
If you choose the Ajax way, though, the fastest approach is to render the HTML on the server side and simply replace the content of the table with it. The browser then renders the representation from the given string, and you don't have to iterate through a JSON array and render it row by row. Its drawbacks may be network latency and bandwidth.
Hope this helps you decide.
I have a fairly simple HTML page that relies on jQuery to process a JSON object after a form has been submitted and then organize the data in a large table (upwards of 1,000 rows depending on the criteria selected in the form, and 50 columns).
The request goes to a PHP page, which does some fairly heavy computation to build the JSON array and sends it back to be processed by jQuery. The JS organizes the data into an HTML table.
The number of objects returned in the JSON array varies depending on the settings chosen in the form, and the total load time of the HTML page seems to increase exponentially with it.
I know that each JSON object is fairly large, but I need all of the information that is returned, so it's not really an option to pare it down any further.
My problem is that I'm unable to figure out where in my code the slowdown is occurring.
Is it the size of the JSON array?
What I've tried:
- Speeding up the PHP/MySQL selections (helped minimally).
- Timing the PHP script by writing to a *.txt file when the script begins and ends. Even in the worst-case scenario (most options selected on the HTML form), the total time never exceeds 4-5 seconds to process json_encode.
- Using ob_start("ob_gzhandler"); prior to json_encode (didn't seem to make any difference).
- Taking out the JS plugins that sort the table columns (originally used sorttable.js and changed to stupidtable to speed things up, but that didn't seem to help).
- Using Chrome's heap snapshot to identify any memory leaks and to find the total size of the JSON array. Honestly, I'm not really sure what I'm looking at.
I appreciate any help you can give me. I've spent the last day searching through Google and StackOverflow, but I'm currently at a loss.
Thank you
Use the dev tools (F12) to see how long the request to the server takes to come back. Add some logging in your JS between the different operations after it returns, to see where the bottleneck might be. Trying to display 1,000+ table rows in the DOM all at once is going to hurt performance. I would consider a JS table API such as DataTables, which only puts so many table rows in the DOM at once; you can feed the whole JSON object into it and let it handle the paging.
https://datatables.net/
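A minimal setup along those lines might look like the following. This is a configuration sketch, assuming jQuery and the DataTables script are already loaded on the page; the `#results` selector and the column field names are placeholders for your real markup and data:

```javascript
// jsonArray is the array you already get back from the PHP page.
$('#results').DataTable({
  data: jsonArray,
  deferRender: true,   // only build <tr> nodes for rows actually shown
  pageLength: 50,      // 50 of the 1,000+ rows in the DOM at a time
  columns: [           // placeholder field names -- one entry per column
    { data: 'name' },
    { data: 'value' }
  ]
});
```

The `deferRender` option is the key part for this problem: row nodes are only created when their page is displayed, so the initial render cost stops scaling with the full result size.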
I have a CSV file of about 10k rows x 25 columns. The CSV contains information about bus routes, stops, etc. I have a select box with all the routes, and the user will be able to pick a single route to show on the map; they will then be able to click on individual stops (another select box) to get a closer look on the map. I want to know the best way to parse and structure this information for storage and fast queries (a database?), and how I should store the result of the query (array, JSON object, dictionary, data table?). I won't need all the columns every time, so I will pick the useful columns to make the query a little faster.
Each time a user selects a different route, I will make a query to get all the stops and other relevant information, and loop through the data to display it on the map (maybe store the results of the last 5 queries?). What will be the best way to store this result? Showing the specific stop information won't be too big of a deal, since it will be a smaller subset of the already-queried results.
Let me know if you need any additional information that will assist with the answers.
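The "store the last 5 queries" idea can be sketched as a small fixed-size cache. This is only a sketch under the assumption that some `fetchRoute(routeId, cb)` function does the real query; the names are hypothetical:

```javascript
// A tiny "last N results" cache: keeps at most `limit` route lookups,
// evicting the oldest when a new one comes in.
function makeRouteCache(limit) {
  var order = [];   // routeIds, oldest first
  var store = {};   // routeId -> result rows
  return {
    get: function (routeId) { return store[routeId]; },
    put: function (routeId, rows) {
      if (!(routeId in store)) {
        order.push(routeId);
        if (order.length > limit) {
          delete store[order.shift()];  // evict the oldest entry
        }
      }
      store[routeId] = rows;
    }
  };
}

// Usage: check the cache before hitting the server.
// var cache = makeRouteCache(5);
// var cached = cache.get(routeId);
// if (cached) { drawOnMap(cached); }
// else { fetchRoute(routeId, function (rows) { cache.put(routeId, rows); drawOnMap(rows); }); }
```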
Google released a public schema called GTFS, which is a transit data specification. You would ideally use a graph data structure; I have found Neo4j a good option.
See this GitHub project for an example of how to use a graph database for this purpose:
https://github.com/tguless/neo4j-gtfs
So, I am working on a project where I have to create a list from a JSON response. Each item in the list shares a common template; only the data that differs between items is populated from the JSON response.
About the JSON: it is an Ajax request response in which I pass start:0 and rows:9999 as parameters, so as to get all the data.
The point of all the above is that with this approach, dynamically creating the list from the JSON response creates thousands of nodes, which affects the initial page load time. In general, it takes around 10 seconds to populate around 6k nodes.
The only solution that seems doable to me is to break the result into pieces and show a maximum of, say, 10 results per page. It boils down to using pagination, so on every page request I make an Ajax request for the next 10 values and populate them.
But the problem lies in filtering the data when the list is not full (start:0, rows:20), as opposed to earlier, when we had all the nodes/data (start:0, rows:9999).
This affects the filtered results. Filtering is done by hiding the nodes that don't satisfy the filter criteria, so if not all the nodes are in the list, the filter will not show the same results as before.
I am confused about how to proceed and still achieve filtering. How can the list be populated across different pages from a single Ajax request, with only one page shown at a time? Any suggestions are appreciated.
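One way to reconcile pagination with correct filtering, sketched under the assumption that the full result (rows:9999) fits comfortably in a plain JS array: fetch once, keep the data in memory, and derive each visible page from the *filtered array* rather than hiding DOM nodes. Only the current page's 10 nodes ever exist in the DOM:

```javascript
// Given the full dataset, a filter predicate, and a page index,
// return just the rows for that page plus the filtered total
// (the total is needed to draw the correct number of page links).
function pageOf(allRows, predicate, pageIndex, pageSize) {
  var filtered = allRows.filter(predicate);  // filter the data, not the DOM
  var start = pageIndex * pageSize;
  return {
    rows: filtered.slice(start, start + pageSize),
    total: filtered.length
  };
}

// Usage: re-run on every filter change or page click, then hand
// result.rows to whatever builds the 10 list items.
// var result = pageOf(data, myFilter, currentPage, 10);
```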
If you want to display the data in the form of a tree, you can use the jsTree framework, which is based on jQuery.
I am continually pulling a list of over 200 entries from a remote server and showing them in a table (many pieces of data per row). The data changes and fluctuates over time. What's the most efficient way in JS to render this to the screen? As the data changes, do I simply wipe my DOM and re-render with the new data, or do I keep a list of IDs and update the correct tr DOM entries?
It's already sorted and filtered to be the correct information (server side), so it's just displaying it to the user that I am concerned with.
Any suggestions would be helpful. I am using jQuery for my requests, but that can be changed. I need to know the quickest (latency-wise) way of pulling this large dataset and displaying it in a tabular format.
Edit: Pagination is not possible in this scenario.
Edit 2: The backend is Python (WSGI) running behind a load balancer and is capable of serving 600 reqs/second. The data is only available as JSON, and the backend is more of an API.
Are you using HTML tables? Some time ago I came across a Microsoft blog post from the IE team stating that rendering tables is slow. Very slow. Re-rendering the whole table every time will probably be slower than updating its values, but that is just my guess.
In fact, my opinion here is that updating values will be faster than re-rendering elements in the page. To optimize your latency, you could keep a hash of objects mapped by id, so you don't have to look up the DOM every time you update the values.
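A sketch of that id-to-element map, assuming each `<tr>` carries a `data-id` attribute and that the second cell holds the value to refresh (both are placeholders for your real markup and row data):

```javascript
// Build the map once, right after the first full render; afterwards
// every update is a plain object lookup, not a DOM query.
var rowById = {};

function indexRows(tableEl) {
  var trs = tableEl.getElementsByTagName('tr');
  for (var i = 0; i < trs.length; i++) {
    var id = trs[i].getAttribute('data-id');
    if (id) rowById[id] = trs[i];
  }
}

function updateRow(rowData) {
  // Update the cell value in place -- no re-render, no selector lookup.
  var tr = rowById[rowData.id];
  if (tr) tr.cells[1].innerHTML = rowData.price;
}
```

On each poll you would loop over the fresh JSON and call `updateRow` per entry, touching only the cells whose values actually live in that row.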
I quite like the "Live Image Search" approach, which loads more data as you scroll down.
If your data list is getting really large, consider not displaying all of that data to the user at one time (if that is possible in your situation). Not only will the user get lost looking at that much data, but the browser will have to render the whole list and slow you down.
You don't mention what server-side technology you are using. If you are in .NET, there are a couple of ASP.NET controls that use a data pager (like the GridView). There is also a PagedDataSource object you might look into (which can be used with any ASP.NET control that has a DataSource property). Both will break your data into pages, and only the viewed page will be rendered at a time. This decreases latency dramatically.
This is an Ajax question. I have a table that shows users' information at certain times, depending on what settings the user sets.
Now, in some cases a user will see the information right away, and in some cases they won't; it all depends on when they want to see the information.
Now what should I do?
Should I do a POST to submit their data and then do an Ajax GET afterwards to fetch the table and render it?
I probably could do it all in the POST, but unless some huge performance gain is to be had I'd rather not; otherwise I have to mix success/fail messages and the table to be rendered all in the same response.
So each option seems to have pluses and minuses.
Ajax way
- Don't have to worry about having a JavaScript solution that queries the database to figure out what their timezone is and then determines whether the row should be added or not, plus any other headaches that come with JavaScript dates.
- Each row could potentially have a different style too. This would mean I would have to do a query to the database to figure it out, or keep a hidden field in the page for easy access. With the Ajax way I would not have to worry about it.
- Don't have to worry about building a row manually in JavaScript/jQuery syntax, which can be a pain to do if you have many columns.
JavaScript way
- Less of a performance hit, since I only have to potentially make one new row, or do nothing. Otherwise I have to generate a new table regardless, and if it has lots of rows in it, that could be slow.
- Have to rebind all the jQuery plugins that are on the table, or use jquery.live for everything else.
So I am not sure; to me it seems like a hard choice.
Unless I misunderstand what you want to do, why not do both in one solution?
Return a JSON response: when a user logs in, you post the information using an Ajax call and just return the data from the database. I tend to return either data or an error message, but you could have two objects in your JSON string: one for a possible error message and the other for the data being returned.
The JavaScript can then process the data as needed.
So, you do both, it isn't an either/or decision.
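A sketch of that envelope idea, assuming the server returns something like `{ "error": ..., "data": [...] }`; the URL, field names, and the `showMessage`/`renderTable` helpers are all hypothetical:

```javascript
// Normalize the single JSON envelope into an error (or null)
// plus the rows to render, so the caller handles one shape.
function splitResponse(json) {
  if (json.error) {
    return { error: json.error, rows: [] };
  }
  return { error: null, rows: json.data || [] };
}

// Usage with jQuery: one POST handles both the message and the table.
// $.post('/rows', formData, function (json) {
//   var r = splitResponse(json);
//   if (r.error) { showMessage(r.error); return; }
//   renderTable(r.rows);
// }, 'json');
```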