I have a web page that contains a table in which every cell's content can be edited. The typical size of this table is about 12 columns and 40 rows. This page is typically used by teachers to encode grades here in our area. But our connection is sometimes poor, and what happens is that teachers sometimes lose everything they encoded (they typically only save after encoding everything).
So right now I am thinking of implementing an autosave feature. The solution I have in mind is straightforward: every time a cell gets edited and then loses focus, make an AJAX request to save the data. At least that's what I'm thinking.
My concern is that since a lot of people use this at the same time, wouldn't it be too heavy to hit the database with an insert or update on every edit? Also, when I tried it, the UI became a bit choppy when moving from one cell to another. My main question is: is there a pattern or set of guidelines I can use for implementing this?
I'd need more details on your API design to come up with the most optimal solution, but here are some high-level approaches you can consider:
On each cell update, send only the diff (the exited cell) and not the entire dataset, to keep the API call as lightweight as possible.
Look into using IndexedDB or localStorage to save a copy of the data in the browser as users update cells. In case of a lost connection you can then restore the user's session.
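A minimal sketch of that per-cell approach, assuming editable <td> elements carrying data-row/data-col attributes; the /api/grades/cell endpoint is made up:

```js
const SAVE_URL = '/api/grades/cell';  // hypothetical endpoint

function onCellBlur(cell) {
  const change = {
    row: cell.dataset.row,       // assumes data-row / data-col on each cell
    col: cell.dataset.col,
    value: cell.textContent
  };

  // Mirror the edit locally first, so a dropped connection never loses work.
  localStorage.setItem(`grade:${change.row}:${change.col}`, change.value);

  // Then push only this one cell to the server.
  fetch(SAVE_URL, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(change)
  }).catch(() => {
    // Request failed; the localStorage copy remains for a later retry.
  });
}

document.querySelectorAll('td[contenteditable]').forEach(td => {
  td.addEventListener('blur', () => onCellBlur(td));
});
```

Since fetch is asynchronous, the request shouldn't block focus moving to the next cell, which may also help with the choppiness you noticed.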
There are a few points to consider for your problem, each with a possible solution:
First, you have to think about concurrency. If users keep the table open for a long time, its contents may be changed by other users in the meantime, and the content on screen will become stale. So you need to keep your table content continuously up to date. One solution is to use RowVersion in .NET for this purpose (an optimistic approach to concurrency). Another solution is to use SignalR to keep the tables updated all the time (I mean in real time).
In addition, handling an update or edit per user request is no serious problem in regular systems, but in an enterprise solution you can scale your infrastructure up or out.
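For illustration, here is roughly what the optimistic check looks like from the browser side; the endpoint, the rowVersion field and the 409 convention are all assumptions, and the server would compare the submitted version against the stored one:

```js
async function saveCell(change, rowVersion) {
  const res = await fetch('/api/grades/cell', {   // hypothetical endpoint
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ ...change, rowVersion })
  });
  if (res.status === 409) {
    // The server saw a newer version: someone else changed this row.
    // Reload it and let the user re-apply the edit instead of overwriting.
    alert('This row was changed by another user; reloading it.');
  }
}
```

SignalR would instead push the change to the other open sessions, at the cost of keeping a live connection per user.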
I looked at "Generating HTML Page on the fly" on this website, but most of it was over my head.
I have a 2 part question that I would like assistance with please.
I want to fill a narrow vertical container, <div id="counter">, with the numbers 1 .. <xx>.
<xx> is determined by the record count of a database, filtered "on-the-fly" by the user choosing a category (no problem there – I have an SQL background)
Eg. Category1: 1 .. 200
Category2: 1 .. 6
These numbers could change over time, as I want to allow users to add content to the database (vetted of course).
I have viewed a number of website source code pages (of similar ideas eg. Surgicalexam.com), but they have all been hard-coded and are distinct pages per category.
I have created a small website of a similar nature to that, hard-coding all the images and links, but I am looking at 3000+ images (as a starting point here), and they differ per page.
I have created this scenario many times in stand-alone apps, and from past experience I thought perhaps I could create a JavaScript routine which would use a loop to:
• print the numbers to the <div> using getElementById().
• fill an array with the record number, a title and an image link.
Question 1: Is this possible or am I beating a "dead horse"?
If it is possible, any suggestions would be gratefully accepted.
Part 2:
My current idea is that, as the user hovers the mouse over any number, a mouseover() event will occur that reads the appropriate array record and displays the title as tooltip text.
If the user clicks the number, a function (I have yet to write) will read the appropriate array record and attach the image link to an <a> tag, and subsequently display the appropriate image to the screen.
Question 2: repeat of question 1.
Regarding "I have viewed a number of website source code pages, but they have all been hard-coded": why are you so sure about that? You can't see the PHP code, because it is executed on the server. There is no way to tell whether a page was hard-coded or generated by PHP.
Answer:
It is possible.
If I understand this correctly, you want to read some data from a database and if the user clicks / hovers something, you want to load more data?
You have to split this into two things:
Load data with PHP from the db (Server side)
If you want a live, visual feedback you need JavaScript (and/or CSS3) to do changes. (Client side)
One possible solution is to create an API with PHP (maybe REST-like) and then call that API with JavaScript.
You could also do everything with PHP, but that would require a reload of the page on every click; PHP cannot make changes on the fly.
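A rough sketch of that split, assuming a hypothetical items.php endpoint that returns a JSON array of {title, image} records and an <img id="viewer"> element on the page; it also covers the tooltip and click behaviour asked about in Part 2:

```js
async function loadCategory(category) {
  const records = await fetch('items.php?category=' + encodeURIComponent(category))
    .then(r => r.json());

  const counter = document.getElementById('counter');
  counter.innerHTML = '';

  records.forEach((rec, i) => {
    const num = document.createElement('a');
    num.textContent = (i + 1) + ' ';
    num.href = rec.image;
    num.title = rec.title;      // the browser shows this as a tooltip on hover
    num.addEventListener('click', e => {
      e.preventDefault();
      document.getElementById('viewer').src = rec.image;  // assumed <img> element
    });
    counter.appendChild(num);
  });
}
```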
First of all, you should learn the basics of web development.
And most importantly: if you decide to learn web programming, learn about security too, for example Cross-Site Scripting and SQL Injection. Never trust data coming from a client (e.g. JavaScript)!
I've been working with tables and a huge amount of data. There is a table on my website with 10 thousand rows. This table has dynamic search, filters, etc. I've been using pure JavaScript for performance reasons, but it gets laggy with this number of rows.
Do you know of any alternative to pure JavaScript with better performance?
EDIT> I REALLY need to load the 10 thousand rows at once. I can load them all in the browser in 5 seconds. The main problems are the filters and search...
EDIT2> The search is dynamic. I can search by name and filter it by first character.
I've been working on it for months...
SEARCH:
search only when the field has more than 3 characters, and only when its length or characters have changed (the onchange event on the input may fire multiple times when a character changes, so I use some checks to make sure the search only runs once)
each row that matches the searched string is copied to another table. The original table is hidden and the new one is displayed.
when the user changes the search field or cancels the dynamic search, the new table is erased.
Conclusion: it's faster to create a new table with the desired elements than to hide the undesired ones.
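For readers wondering what that looks like, a minimal sketch; the two table ids are hypothetical and matching is done against the whole row text:

```js
function runSearch(query) {
  const source = document.getElementById('original-table');
  const results = document.getElementById('search-table');
  const out = results.tBodies[0];
  out.innerHTML = '';

  for (const row of source.tBodies[0].rows) {
    if (row.textContent.toLowerCase().includes(query.toLowerCase())) {
      out.appendChild(row.cloneNode(true));   // copy, don't move
    }
  }
  source.hidden = true;     // hide the original table...
  results.hidden = false;   // ...and show the one holding only the matches
}
```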
FILTER
The rows are actually in 25 tables (A to Z + non-alphabetical characters)
When you select a character, only that table is shown
Conclusion: it's faster to hide a whole table than to hide the undesired rows
Thanks for the replies. I've edited with some extra info so we can narrow down the possible solutions...
I'm assuming you get the contents from a database and load them with something like PHP (I'm going to assume PHP for now).
You could have the JavaScript make an AJAX call to a PHP file which does the filtering (actually, you should make the database do it; that's a lot faster!) and place the resulting table back on screen.
A faster method, combined with the above, might be this: get the ids of the initial rows in an array and save those (a session might work well for this).
When you have to filter, don't make PHP fetch the whole table; apply the filter to only the stored ids and send JavaScript the matching rows.
Then make JavaScript do something like this (see the sketch below):
- set all rows visible
- set the resulting ids to hidden (hidden rather than removed, because a user might perform multiple filter actions?)
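A sketch of those two steps; filter.php is a hypothetical endpoint returning the ids of rows the filter excludes, e.g. [3, 17, 42]:

```js
async function applyFilter(filter) {
  const excluded = await fetch('filter.php?q=' + encodeURIComponent(filter))
    .then(r => r.json());

  const rows = document.querySelectorAll('#data-table tbody tr');
  rows.forEach(row => { row.hidden = false; });        // set all visible

  const hide = new Set(excluded.map(String));
  rows.forEach(row => {                                // hide the excluded ids
    if (hide.has(row.dataset.id)) row.hidden = true;   // assumes data-id per row
  });
}
```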
Another idea that just popped into my head: if you don't need to display the data on load, you can start with all tables hidden and a message saying "please search to display".
A common technique to handle this case is to load the data in memory or a subset of the data, and recycle your table rows such that you aren't actually ever creating thousands upon thousands of rows. You can get creative with this and create a web interface that seemingly scrolls endlessly but in reality you are just reusing dom elements and shuffling them around.
Most well-built data grid widgets whether they are on the web, mobile or even a desktop interface will employ this technique to handle your particular problem.
In most cases a user will never actually find themselves benefiting from seeing 10's of thousands of rows of data at once anyway.
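A bare-bones sketch of that recycling idea, with all ids, heights and record shapes assumed: only VISIBLE row elements ever exist, and they are refilled as the user scrolls, while a tall spacer div gives the scrollbar its full range:

```js
const ROW_HEIGHT = 24;   // fixed row height in px (assumed)
const VISIBLE = 40;      // number of reusable <tr> elements in the DOM

const viewport = document.getElementById('viewport'); // scrollable container
const spacer = document.getElementById('spacer');     // sized to the full dataset
const tbody = document.getElementById('grid-body');   // holds exactly VISIBLE rows

function attach(data) {
  spacer.style.height = (data.length * ROW_HEIGHT) + 'px';
  viewport.addEventListener('scroll', () => {
    const first = Math.floor(viewport.scrollTop / ROW_HEIGHT);
    // Shift the rendered block to where the scrollbar says it should be.
    tbody.parentElement.style.transform =
      'translateY(' + (first * ROW_HEIGHT) + 'px)';
    for (let i = 0; i < VISIBLE; i++) {
      const rec = data[first + i];
      const row = tbody.rows[i];
      row.hidden = !rec;
      if (rec) row.cells[0].textContent = rec.name;   // assumes {name: ...} records
    }
  });
}
```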
Fetch from the server only the things to be seen by the user. As everyone has pointed out, 10,000 rows needn't be on that page.
You can use the concept of pagination, where only a few rows are fetched and shown for each page. jQuery's Ajax is capable of calling a server-side function to fetch rows and add them to your page; a sketch follows below.
I don't know any backend details here, but in the Struts framework there is the display:table tag, and I believe the .NET framework has GridView for client-side pagination that you can look into.
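A paging sketch with jQuery; rows.php and its page/size parameters are made up, and it is assumed to return JSON like [{"name": ..., "value": ...}]:

```js
function loadPage(page, size) {
  $.getJSON('rows.php', { page: page, size: size || 50 }, function (rows) {
    var tbody = $('#data-table tbody').empty();
    $.each(rows, function (_, r) {
      tbody.append(
        $('<tr>').append($('<td>').text(r.name))
                 .append($('<td>').text(r.value))
      );
    });
  });
}
loadPage(1);   // only the first page of rows is fetched up front
```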
I'm just a student, newly joined to the community. Take what I say with a grain of salt.
I'm not sure why everyone is so much as blinking at the ten thousand rows business when we're measuring modern personal computers' memory in gigabytes.
Alright. I'm going to assume that what you're doing needs to be done in the browser, and so you can't switch to doing native code. In that case, looking for an alternative to Javascript won't get you much of anywhere. In the context of a browser, you're looking at an interpreted language. In terms of number of instructions the program ultimately has to run, the difference between one language or another is negligible in the face of how long it takes to be interpreted. Besides, Javascript has gotten nicely polished over the years.
So never mind that. There's a much more important thing to consider here, and it applies no matter what you're programming in or on: The cache(s). Igor Ostrovsky explains it beautifully; read it until you grok it.
So I'm guessing you have objects that would stringify to something like "obj1 = {field-1:'a', field-2:'b', ..., field-n:'n'}", and you can select a field-i to sort by. The trouble with this is that when you sort by field-i, you're loading the entirety of obj1 into the cache, even though you don't need it. What you really want to do is load the field-i's for obj1, obj2, obj3, ..., objm all at once. So you look at an array, stringifying to something like: "field1 = [refToRow1, field1inRow1, refToRow2, field1inRow2, ..., refToRowM, field1inRowM]".
You might not be able to fit all M rows in the cache; after all, M == 10000! But you can group them together into chunks that you could reasonably expect a cache to manage. Anyone got a good number for this? Say, 64kB? So you figure for each i in M you've got a reference, and a field that's probably just a reference to a short string (it'd be better if you could have the string itself right there, but I don't think JavaScript works that way). So that's 8B per row? 8B * 8192 = 64kB, so a chunk holds roughly 8,000 rows. Hell, if that's right, you could fit it all into the cache in two chunks, which means you'd want to do it in 4.
So now you've got a collection of smaller arrays, and you want to sort them. This is a classic application for B-trees. And while having a separate B-tree for each and every column in the table may seem like a lot, it's not.
Okay, so that handles sorting. You tell it to sort by a column, and the truth is it's already sorted! You're just repopulating the visible table using a different b-tree. You still need to handle filtering, but that's fine. You do some cache juggling as you find something to display and follow the reference to get the other fields, but I'd still expect this to go fast since you're skipping over so many rows.
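A full B-tree is probably overkill in JavaScript; a cheaper way to get the same "it's already sorted" effect is to precompute one sorted index array per column. All names below are illustrative:

```js
// Sample data standing in for the real 10,000 rows.
const rows = [
  { name: 'Charlie', date: '2014-02-01' },
  { name: 'Alice',   date: '2014-03-15' },
  { name: 'Bob',     date: '2014-01-20' }
];

// Build, once per column, an array of row positions in sorted order.
function buildSortIndex(rows, field) {
  return rows.map((_, i) => i).sort((a, b) => {
    if (rows[a][field] < rows[b][field]) return -1;
    if (rows[a][field] > rows[b][field]) return 1;
    return 0;
  });
}

const byName = buildSortIndex(rows, 'name');
const byDate = buildSortIndex(rows, 'date');

// A sort click now just walks the right index and repopulates the
// visible window; no comparison work happens at click time.
function view(index, start, count) {
  return index.slice(start, start + count).map(i => rows[i]);
}
```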
Normally, I would say if you want to speed things up, look into multiprocessing. But I think browsers are still working to make that a thing with their Javascript implementations. Plus, while it would be well-suited for sorting, it would be a lot of effort to make it useful for the filtering part, and I expect you can do fine without.
I hope this isn't too scatter-brained, and that it gives you some ideas. Good luck!
I have a database of no more than, say, 100k which I'd like to use as reference documentation for my software. It's really just a simple table: around 5 columns by a couple of hundred rows. I am looking for a decent JavaScript database library, one which would feature:
Sorting by column
A tiny size. Has to be small as it is sent to the user (since I don't want anything server-side). Say no more than 50-100k.
"Update-as-you-type" functionality, and by that I mean, you can type in a filter box, and the rows filter out instantly on the HTML page, or near instantly as you're typing (client-side processing only). If no input in the filter is given, all the results would display on a single large HTML page.
Searches that allow for partial matches of any cell in the table, and preferably allow NOT, OR and obviously AND.
Furthermore, it should be free/cheap, easy to use and install, perhaps working on a CSV data file for its data.
Is there anything out there that fits the bill?
Check jqGrid or DataTables; they have most of what you are looking for.
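For example, a minimal DataTables setup (it's a jQuery plugin, https://datatables.net) gives you column sorting and a filter-as-you-type search box out of the box; #docs-table stands for your existing HTML table:

```js
$(document).ready(function () {
  $('#docs-table').DataTable({
    paging: false   // a couple of hundred rows can stay on one page
  });
});
```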
I'm building a budget webapp, mostly for my personal needs and for the sake of self training. I may release it later at some point.
The interface will feature a table of operations (credit / debit). I was planning to use Ajax to make the table "editable" by clicking in a cell (Excel-like).
I therefore need to:
display operations
add new ones
modify existing ones
I fail to see how to make "modify" degrade nicely: if you remove JS, this becomes a plain old table with no way to modify an existing entry.
Turning the table into a giant form would be ugly; adding links to edit each operation and then hiding them with JS seems fairly complex...
One possible solution would be to add links that go to separate forms where the modify operations take place. This is much less fluid than in-place editing, but it remains accessible without being burdensome (like a huge page of form inputs would be).
You can then override the links with JavaScript to provide the Ajax-style web-app functionality you are looking for, as in the sketch below.
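A sketch of that progressive enhancement; makeCellEditable is a hypothetical in-place editor you would still have to write:

```js
// The markup stays a plain link that works without JavaScript:
//   <td>42.50 <a class="edit" href="edit.php?id=42">edit</a></td>
document.querySelectorAll('a.edit').forEach(function (link) {
  link.addEventListener('click', function (e) {
    e.preventDefault();   // skip the full-page edit form
    var id = new URL(link.href, location.href).searchParams.get('id');
    makeCellEditable(link.closest('td'), id);   // hypothetical helper
  });
});
```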
I have a webpage which contains a table for displaying a large amount of data (on average from 2,000 to 10,000 rows).
This page takes a long time to load/render. Which is understandable.
The problem is that while the page is loading, the PC's memory usage skyrockets (iexplore.exe uses 500 MB on my test system) and the whole PC grinds to a halt until it has finished, which can take a minute or two. IE hangs until it's complete, and switching to another running program is just as slow.
I need to fix this, and ideally I want to accomplish 2 things:
1) Load individual parts of the page separately, so the page can render initially without the large data table. A loading div will be placed there until it is ready.
2) Don't use up so much memory or local resources while rendering, so users can at least use a different tab/application at the same time.
How would I go about doing both or either of these?
I'm an applications programmer by trade, so I am still a little fuzzy on what I can do in a web environment.
Cheers all.
Regarding the first part, it's called Ajax: display the page without the table, or with an empty table, and then use AJAX requests to fetch the data (as HTML or in any data format) and display it.
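A sketch of that: the page ships with just a "loading" div, and the rows arrive afterwards; table.php is a hypothetical endpoint returning the rendered rows as an HTML fragment:

```js
window.addEventListener('load', function () {
  fetch('table.php')
    .then(function (r) { return r.text(); })
    .then(function (html) {
      document.querySelector('#data-table tbody').innerHTML = html;
      document.getElementById('loading').hidden = true;  // drop the placeholder
    });
});
```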
Regarding the second part, you want something called lazy loading: loading data only when the user needs it, i.e. when it's in the visible part of the document. You can look at this question for a DataGrid library capable of handling millions of rows.
Two basic options:
Pagination
Lazy loading (load as the user scrolls down); see this jQuery plugin, and the sketch below.
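A load-as-you-scroll sketch; rows.php and the element ids are made up, and the endpoint is assumed to return ready-made row markup:

```js
var nextPage = 1;
var loading = false;

document.getElementById('viewport').addEventListener('scroll', function () {
  var el = this;
  // Within 200px of the bottom: fetch the next page of rows.
  if (!loading && el.scrollTop + el.clientHeight > el.scrollHeight - 200) {
    loading = true;
    fetch('rows.php?page=' + nextPage++)
      .then(function (r) { return r.text(); })
      .then(function (html) {
        document.querySelector('#data-table tbody')
                .insertAdjacentHTML('beforeend', html);
        loading = false;
      });
  }
});
```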
You could try a couple of things:
Loading data asynchronously
Paging