I need to know exactly how to use pagination.
I have read that OFFSET isn't the only way, and that there are many ways to paginate through results.
I would like to paginate without reloading the page. Is that possible?
I got this example
SELECT * FROM Table LIMIT 100, 5000
I expect to put it in a function, or in an <a>, or in a button plus an event listener...
I don't understand how to display the next set of results, and so on.
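To make it concrete, here is roughly what I'm imagining, as a minimal sketch: a button with an event listener that fetches the next page without reloading. The items.php endpoint, the pageSize value and the row.name column are all placeholders; the idea is that the server runs something like SELECT * FROM Table LIMIT offset, pageSize for each request.

const pageSize = 20;                       // placeholder page size
let currentPage = 0;

document.getElementById('nextBtn').addEventListener('click', function () {
  currentPage++;
  loadPage(currentPage);
});

function loadPage(page) {
  // items.php is a hypothetical endpoint that runs the LIMIT query server-side
  fetch('items.php?page=' + page + '&pageSize=' + pageSize)
    .then(function (response) { return response.json(); })
    .then(function (rows) {
      const tbody = document.querySelector('#results tbody');
      tbody.innerHTML = '';                // replace the previous page of rows
      rows.forEach(function (row) {
        const tr = document.createElement('tr');
        const td = document.createElement('td');
        td.textContent = row.name;         // row.name is a placeholder column
        tr.appendChild(td);
        tbody.appendChild(tr);
      });
    });
}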
Hope what I'm trying to explain is clear.
Thanks for the help.
If you are indeed using classic ASP, then you can make use of the Recordset pagination features.
You might want to have a look at this tutorial: ADO Recordset paging in ASP
I had no success researching this problem, so apologies for my difficulty in concisely describing it.
Basically, I'm building a website where users can submit "recipes" via an HTML form handled by PHP. I then wanted a way for users to dynamically add or remove steps or ingredients (I've seen this on other sites, but I couldn't find a simple pure JS solution).
Anyway, I built something akin to what I wanted (example here, JavaScript here), but I'm not convinced it's such a great solution. To retrieve the form info, the PHP code basically loops through the materials' and steps' ids until it has reached the last one.
Although this solution works, I've run into two more problems:
The first is that the text for each input is saved in a JS array each time the user types, so new inputs can be added or removed without losing the text. However, when a new input element is created, the previous text is cut off if it was too long.
The second is that for the ingredients section, I'd like to have a <select> element where users can only choose standardized measurements (i.e. mg, g, kg...) and then another input for a numeric quantity. I've tried Bootstrap's input-group classes, but the spacing turns out very odd and doesn't work at all on mobile. Is there a better way to accomplish this? This also thoroughly complicates my original solution, since there will now be 3x as many inputs.
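For illustration, here is a stripped-down sketch of the kind of thing I'm after for the ingredients section; the element ids, the unit list and the field names are just placeholders. Using array-style names like ingredient[] would also let the PHP side read everything in one loop instead of walking ids.

// assumes a container like <div id="ingredients"></div> and an "add" button with id="addIngredient"
const units = ['mg', 'g', 'kg', 'ml', 'l'];        // placeholder list of measurements

document.getElementById('addIngredient').addEventListener('click', function () {
  const row = document.createElement('div');

  const qty = document.createElement('input');     // numeric quantity
  qty.type = 'number';
  qty.name = 'quantity[]';

  const unit = document.createElement('select');   // standardized measurement
  unit.name = 'unit[]';
  units.forEach(function (u) {
    unit.appendChild(new Option(u, u));
  });

  const name = document.createElement('input');    // ingredient name
  name.type = 'text';
  name.name = 'ingredient[]';

  const remove = document.createElement('button'); // lets the user subtract this row again
  remove.type = 'button';
  remove.textContent = 'remove';
  remove.addEventListener('click', function () { row.remove(); });

  row.append(qty, unit, name, remove);
  document.getElementById('ingredients').appendChild(row);
});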
I've been working with tables and a huge amount of data. There are tables on my website with 10 thousand rows. These tables have dynamic search, filters, etc. I've been using pure JavaScript with performance in mind, but it gets laggy with this amount of rows.
Do you guys know any alternative to pure JavaScript with better performance?
EDIT> I REALLY need to load the 10 thousand rows at once. I can load them all in the browser in 5 seconds. The main problems are the filters and search...
EDIT2> The search is dynamic. I can search by name and filter it by first character.
I've been working on it for months...
SEARCH:
search when the field has more than 3 characters and only when its length or characters have changed (the onchange event on the input may trigger multiple times when a character changes, so I make sure it only triggers once using some checks)
each row that matches the searched string is copied to another table (sketched below). The original table is hidden and the new one is displayed.
when the user changes the search field or cancels the dynamic search, the new table is erased.
Conclusion: it's faster to create a new table with desired elements than hide the undesired ones.
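A simplified sketch of that copy step, with made-up element ids:

// copy matching rows into a results table, then swap which table is visible
function runSearch(query) {
  var source = document.getElementById('fullTable');     // the original table
  var results = document.getElementById('searchTable');  // the temporary results table
  var body = results.querySelector('tbody');
  body.innerHTML = '';                                    // erase previous results

  var rows = source.querySelectorAll('tbody tr');
  for (var i = 0; i < rows.length; i++) {
    var name = rows[i].cells[0].textContent.toLowerCase();
    if (name.indexOf(query.toLowerCase()) !== -1) {
      body.appendChild(rows[i].cloneNode(true));          // copy, don't move
    }
  }

  source.style.display = 'none';
  results.style.display = '';
}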
FILTER:
The rows are actually in 25 tables (A to Z + non-alphabetical characters)
When you select a character, only that table is shown
Conclusion: it's faster to hide a whole table than to hide the undesired rows
Thanks for the replies. I've edited with some extra info so we can narrow down the possible solutions...
I'm assuming you get the contents from a database and load them with something like PHP (I'm going to assume PHP for now).
You could make the JavaScript do an Ajax call to a PHP file which does the filtering (actually you should make the database do it, which is a lot faster!) and place the resulting table back on screen.
A faster method combined with the above might be this: get the ids of all the initial rows into an array and save those (a session might work nicely).
When you have to filter, don't make PHP fetch the whole table; just apply the filter to only the stored ids and send JavaScript the matching rows.
Then make JavaScript do something like this (a rough sketch follows below):
- set all visible
- set resulting id's to hidden (hidden in favor of remove, because I think a user might perform multiple filter actions?)
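A rough sketch of that, assuming the rows are rendered as <tr id="row-123"> and filter.php is a made-up endpoint that returns a JSON array of the row ids that should end up hidden:

function applyFilter(filterValue) {
  $.getJSON('filter.php', { filter: filterValue }, function (idsToHide) {
    $('#bigTable tbody tr').show();                 // set all visible
    $.each(idsToHide, function (i, id) {
      $('#row-' + id).hide();                       // set resulting ids to hidden
    });
  });
}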
Another idea just popped into my head: if you don't need to display anything on load, you can start the initial load with all tables hidden and a message "please search to display".
A common technique to handle this case is to load the data in memory or a subset of the data, and recycle your table rows such that you aren't actually ever creating thousands upon thousands of rows. You can get creative with this and create a web interface that seemingly scrolls endlessly but in reality you are just reusing dom elements and shuffling them around.
Most well-built data grid widgets whether they are on the web, mobile or even a desktop interface will employ this technique to handle your particular problem.
In most cases a user will never actually benefit from seeing tens of thousands of rows of data at once anyway.
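A bare-bones sketch of the recycling idea: keep all the records in a plain array, but only ever create enough <tr> elements to fill the viewport and reuse them as the user scrolls. All the ids, sizes and the record.name field are invented for the example.

var ROW_HEIGHT = 24;                                  // px per row, arbitrary
var VISIBLE_ROWS = 30;                                // enough rows to cover the viewport
var data = window.allRecords || [];                   // the 10,000 records, already in memory

var viewport = document.getElementById('viewport');   // fixed-height div with overflow: auto
var spacer = document.getElementById('spacer');       // empty div that gives the scrollbar its full height
var grid = document.getElementById('grid');           // <table> holding only VISIBLE_ROWS rows

spacer.style.height = (data.length * ROW_HEIGHT) + 'px';
viewport.addEventListener('scroll', render);
render();

function render() {
  var first = Math.floor(viewport.scrollTop / ROW_HEIGHT);
  grid.style.transform = 'translateY(' + (first * ROW_HEIGHT) + 'px)';

  for (var i = 0; i < VISIBLE_ROWS; i++) {
    var record = data[first + i];
    var tr = grid.rows[i] || grid.insertRow();        // reuse an existing row when possible
    if (!tr.cells.length) tr.insertCell();
    tr.cells[0].textContent = record ? record.name : '';
  }
}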
Fetch from the server only the things to be seen by the user. As everyone has pointed out, 10,000 rows needn't be on that page at once.
You can use the concept of pagination: for every page only a few rows are fetched and shown. jQuery's Ajax is capable of calling a server-side function to fetch rows and add them to your page.
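For example (getRows.php, the parameters and the name column are all placeholders):

// fetch one page of rows from the server and show only those
function showPage(page, rowsPerPage) {
  $.getJSON('getRows.php', { page: page, perPage: rowsPerPage }, function (rows) {
    var html = $.map(rows, function (r) {
      return '<tr><td>' + r.name + '</td></tr>';
    }).join('');
    $('#dataTable tbody').html(html);
  });
}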
I don't know any backend details here, but the Struts framework has the display:table tag, and I believe the .NET framework has GridView for client-side pagination; you could look into those.
I'm just a student, newly joined to the community. Take what I say with a grain of salt.
I'm not sure why everyone is so much as blinking at the ten thousand rows business when we're measuring modern personal computers' memory in gigabytes.
Alright. I'm going to assume that what you're doing needs to be done in the browser, and so you can't switch to doing native code. In that case, looking for an alternative to Javascript won't get you much of anywhere. In the context of a browser, you're looking at an interpreted language. In terms of number of instructions the program ultimately has to run, the difference between one language or another is negligible in the face of how long it takes to be interpreted. Besides, Javascript has gotten nicely polished over the years.
So never mind that. There's a much more important thing to consider here, and it applies no matter what you're programming in or on: The cache(s). Igor Ostrovsky explains it beautifully; read it until you grok it.
So I'm guessing you have objects that would stringify to something like "obj1 = {field-1:'a', field-2:'b', ..., field-n:'n'}". And you can select a field-i to sort by. The trouble with this is that when you sort by field-i, you're loading the entirety of obj1 into the cache, even though you don't need it. What you really want to do is load the field-i's for obj1, obj2, obj3, ..., objm all at once. So you look at an array, stringifying to something like: "field1 = [refToRow1, field1inRow1, refToRow2, field1inRow2, ..., refToRowM, field1inRowM]".
You might not be able to fit all M rows in the cache; after all, M == 10000! But you can group them together into chunks that you could reasonably expect a cache to manage. Anyone got a good number for this? Say, 64kB? So you figure for each i in M you've got a reference, and a field that's probably just a reference to a short string (it'd be better if you could have the string itself right there, but I don't think Javascript works that way). So that's maybe 8B each? 8B * 10000 = 80kB; if that's right, you could fit it all into the cache in two chunks, which means you'd want to do it in 4.
So now you've got a collection of smaller arrays, and you want to sort them. This is a classic application for B-trees. And while having a separate B-tree for each and every column in the table may seem like a lot, it's not.
Okay, so that handles sorting. You tell it to sort by a column, and the truth is it's already sorted! You're just repopulating the visible table using a different b-tree. You still need to handle filtering, but that's fine. You do some cache juggling as you find something to display and follow the reference to get the other fields, but I'd still expect this to go fast since you're skipping over so many rows.
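To make the column-wise idea concrete, here's a tiny sketch. It uses a pre-sorted array of row indices per column instead of an actual B-tree, but the effect described above is the same: the ordering is computed once, and "sorting" later is just repopulating the visible table in that precomputed order. All the names are invented.

// column-major layout: one flat array per field, built once from the row objects
var rows = window.allRows || [];                       // e.g. [{ name: 'x', qty: 3 }, ...]
var columns = { name: [], qty: [] };
rows.forEach(function (r) {
  columns.name.push(r.name);
  columns.qty.push(r.qty);
});

// precompute a sorted order of row indices for each column (stand-in for a per-column B-tree)
var sortedIndex = {};
Object.keys(columns).forEach(function (field) {
  sortedIndex[field] = rows.map(function (_, i) { return i; })
    .sort(function (a, b) {
      return columns[field][a] < columns[field][b] ? -1 : 1;
    });
});

// "sorting" by a column is now just walking its precomputed order
function repopulate(field, renderRow) {
  sortedIndex[field].forEach(function (i) {
    renderRow(rows[i]);                                // follow the reference to get the other fields
  });
}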
Normally, I would say if you want to speed things up, look into multiprocessing. But I think browsers are still working to make that a thing with their Javascript implementations. Plus, while it would be well-suited for sorting, it would be a lot of effort to make it useful for the filtering part, and I expect you can do fine without.
I hope this isn't too scatter-brained, and that it gives you some ideas. Good luck!
I have been looking for information about this but I haven't found a solution.
I have some text fields on a page which get updated on blur, making use of jQuery events.
There's no form to submit; the update takes place in the background using jQuery $.post.
If a user updates those inputs, then clicks on a link to go to another section of the page and then comes back using the browser's back button, those inputs won't show the last values set by the user, but the previous ones if there were any, or empty text inputs if there weren't.
Users are reporting this as a bug (even though it's more of a browser behavior), and I wonder if there is any solution for this.
I have been taking a look at things like this or this, but they don't solve the problem I have. I have no forms, I have no submit, and I don't want to reset the form.
I've noticed this problem doesn't happen in IE 9 or in Firefox, but it does in Chrome.
The user can navigate to different pages (more than 30) with more than 12 text fields in each, and therefore I have discarded the idea of storing them in sessions.
Is there any way to solve it?
Thanks.
I do not know if my solution can solve your problem, but:
what if you bind data to your input? E.g. you can dynamically set data on the input
$('input').each(function () { $(this).data('val', $(this).val()); });
and then re-bind it through
$('input').each(function () { $(this).val($(this).data('val')); });
Do not follow these lines of code exactly; I'm just trying to give you an idea of what I mean. I hope it can help you.
http://api.jquery.com/data/ doc for .data()
Hi, I know it's not a good idea, but due to one use case I am populating a combo box with more than 10,000 items. It behaves very strangely in IE7; in all other browsers it works fine. In IE7 it takes too much time to download the page, and sometimes IE7 also hangs.
Is there any known bug in IE7 related to this issue?
Thanks,
Amit
Not sure whether anything can be done to speed this up. One thing to look into would be loading the options dynamically through Ajax, and adding them as DOM nodes to the existing select element. That would at least allow the whole page to load before the rest of the data is fetched.
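Roughly like this; options.php is a made-up endpoint returning a JSON array of { text, value } objects:

// let the page finish loading first, then fetch the options and append them to the existing <select>
$(function () {
  $.getJSON('options.php', function (options) {
    var select = document.getElementById('bigCombo');     // the existing combo box
    for (var i = 0; i < options.length; i++) {
      // classic cross-browser way to append an <option>, which also works in old IE
      select.options[select.options.length] = new Option(options[i].text, options[i].value);
    }
  });
});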
There are ready-made JS/jQuery-based Ajax combo boxes as well. One with a good loading strategy might yield better results.
I have no experience with them so I can't tell which one is suitable for you, but these seem worth checking out:
DHTMLXCombo (not free)
More in this question
I would suggest abandoning any attempt at having 10k entries in a single select box -- as others have said, it's a user interface nightmare, even if you can solve the problem with it killing the browser (which I don't think you can).
What to do instead?
Break the selection into categories. Then have one <select> box for the category, and have a second <select> get populated according to the category that is picked. This second <select> could be populated via Ajax or a page reload; both techniques are common. Given the quantity of options you want to provide, you may even want to break it down into category and sub-category.
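For instance, with the Ajax approach (items.php and the id/name fields are placeholders):

// when the category changes, fetch only that category's items and fill the second <select>
$('#category').change(function () {
  $.getJSON('items.php', { category: $(this).val() }, function (items) {
    var $item = $('#item').empty();
    $.each(items, function (i, it) {
      $item.append($('<option>').val(it.id).text(it.name));
    });
  });
});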
The other alternative (which may be better, given the number of options you're providing) is to implement a Google-style auto-complete. There are a number of easy-to-use Javascript and JQuery scripts out there which allow you to implement this sort of thing without having to write it from scratch - it's almost as easy as writing the select box.
Here's one for you to try: http://docs.jquery.com/Plugins/autocomplete (but there are plenty of others if you google)
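As an illustration of how little code that tends to be, here is the wiring with jQuery UI's autocomplete widget (a different script than the one linked above; suggest.php is a placeholder endpoint that receives a term parameter and returns JSON suggestions):

// turn a plain text input into an autocomplete backed by the server
$('#product').autocomplete({
  source: 'suggest.php',     // jQuery UI sends ?term=<typed text> and expects JSON back
  minLength: 2               // don't query until the user has typed a couple of characters
});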
Hope that helps.