I am building a simple website for a project. One of my pages does a MySQL database query and outputs all the contents to an HTML table; I then use a filter to let users show/hide certain rows.
This strategy works well when dealing with hundreds of rows, but with thousands of rows there is a significant delay when people click the link. I thought maybe I could add some Ajax and display a 'loading/querying' message while the query hasn't finished.
A sample webpage: http://epigenome.wustl.edu/TE_Methylation/browse.php
When people go to that link, I want to display a 'loading' message until the database query has finished, and also, when people choose to hide several rows, display a 'loading' message until the JavaScript has finished hiding the corresponding rows.
(The example page above doesn't have this issue because there are only 900 rows, while I am working on a dataset with 10,000+ rows.)
Does anyone have suggestions on how to achieve this? Many thanks :)
When working with a large amount of data, you want to filter as much of that data as possible before you send it to the client. Imagine that your viewer is on a mobile device or a limited connection: the data will take even longer to transfer.
What I would do in your case is set up an Ajax service that responds with a JSON list of results for a given filter/criteria. I would then limit the response to a page or range, as was stated in the comments.
So, filter your results in your MySQL query using parameters provided via an Ajax request.
For example (a 'load more' example):
var dataService = {
    Loaded: 0,
    Limit: 20,
    LoadMore: function () {
        var self = this;
        $.getJSON("data_service.php", { FilterH1ES: true, Blah: true, Start: self.Loaded, Limit: self.Limit }, function (results) {
            // Append the returned rows to the table
            // ...
            self.Loaded += results.length; // advance the paging window
        });
    }
};
// On load, call dataService.LoadMore();
Then, in PHP, you would build the MySQL query filters based on the parameters:
<?php
$filters['H1ES'] = isset($_GET['FilterH1ES']) && $_GET['FilterH1ES'] === 'true';
$start = isset($_GET['Start']) ? (int) $_GET['Start'] : 0;
$limit = isset($_GET['Limit']) ? (int) $_GET['Limit'] : 20;
$sql = "SELECT * FROM MyTable WHERE /* Default Filter Here */";
if ($filters['H1ES']) $sql .= " AND /* H1ES Filter Here */";
$sql .= " LIMIT $start, $limit";
// ... run the query and echo json_encode($rows);
Code is not tested and my JavaScript is rusty.
Related
I have around 10,000+ records on my website. I applied pagination to list them. But besides the listing there is a map view to pin the listings, and the user can search on the map. For that I want the full data on the client side as a JSON array. I did it like this:
function doInBackground() {
    $.get('car/get-map-data',
        {
            'params': '$params',
            'page': page
        },
        function (data) {
            if (data) {
                console.log(data);
            }
        });
}
This is my API call, and my controller action is:
public function actionGetMapData($params) {
    $searchModel = new CarSearch();
    $dataProvider = $searchModel->search($params);
    $models = $dataProvider->getModels();
    $mapData = array();
    foreach ($models as $key => $model) {
        array_push($mapData, $model->title);
    }
    return json_encode($mapData);
}
I have a page size of 10 in the search
$query = Car::find();
$dataProvider = new ActiveDataProvider([
    'query' => $query,
    'pagination' => [
        'pageSize' => 10,
    ],
]);
I am calling the above JavaScript function repeatedly to get all the data, page by page, on the client side. But sometimes it gets very slow. How can I overcome this? I want to add the data to my map view, so the map view is also getting slower. How can I load 10,000+ listings to the client quickly?
As I understand it, you only retrieve 10 items with each API call. If you have 10,000 items, it means that you are making 1,000 API calls to retrieve all of them. This will definitely not perform well. You could try to load all 10,000 items in one go and then only do pagination on the client side.
That said, it may not be "fast" to retrieve 10,000+ items from your API. It will depend on how fast your API can retrieve those items and what each item looks like. If they are simple objects, this could work, but if they are objects with many properties, the size of the response will likely be a bottleneck.
In that case, you will need to perform the search on the API in order for it to be as fast as possible.
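As an illustration only (not your existing code), here is a minimal sketch assuming Yii2's ActiveDataProvider and hypothetical lat/lng columns (adjust the field names to your schema): a map-data action that disables pagination and returns only the small fields the map needs.
public function actionGetMapData($params)
{
    // Hypothetical variant of the action above: one request, no paging,
    // returning only the small fields the map needs (lat/lng names are assumptions).
    \Yii::$app->response->format = \yii\web\Response::FORMAT_JSON;

    $searchModel  = new CarSearch();
    $dataProvider = $searchModel->search($params);
    $dataProvider->pagination = false;          // everything in one response

    $dataProvider->query
        ->select(['id', 'title', 'lat', 'lng']) // keep each item small
        ->asArray();                            // plain arrays, less overhead

    return $dataProvider->getModels();
}
On the client, a single $.get call without the page loop then receives the whole array and can hand it straight to the map.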
I have a table user where id is the primary key and lastchange is an int holding the last-change time (in seconds). I have a frontend page with a JS script. All users are placed in divs, like a table, split into pages. Each item has the user id and the last-changed time value in its attributes.
I want to request all changed user ids in one request (1 HTTP and 1 SQL). Users can be changed by other users on the site.
How can I do this? I don't want to check every user on the page with a timer; that is too many requests.
In my mind it looks like:
JS does a GET request with the list of users on the page in JSON format: [{"id":1, "lastchange":123123},{"id":2, "lastchange":123123}, ...
PHP runs a MySQL query like SELECT * FROM `users` WHERE `id` IN (1, 2, 3) AND `lastchange` NOT IN (123459, 123456, 123459); (this does not work correctly: the lastchange values are not paired with their ids, each value is checked against the whole list, and the results are wrong).
PHP returns only the ids of the changed rows, e.g. [1, 15, 22], to JS.
JS then checks every returned id separately in another request, getting the full info about each user by id.
I can check every user in PHP separately, but I want to know how to do it with one SQL request.
Sorry for my bad English.
I think you might want to implement the solution differently, as per the first comment... but if you want to keep your current model, the SQL you need would look something like:
SELECT * FROM users WHERE
(id = 1 AND lastchange <> 123123) OR
(id = 2 AND lastchange <> 123123) OR
...
This keeps each id compared against its own lastchange value. It is pretty ugly SQL, but not hard to generate in a loop, as in the sketch below.
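For example, a minimal PHP sketch of generating that query with a prepared statement (a sketch only, assuming a PDO connection in $pdo and that the page posts its id/lastchange pairs as JSON in a users field; the names are illustrative):
<?php
// Hypothetical endpoint: build the (id, lastchange) pair conditions in a loop.
// Assumes a PDO connection in $pdo and at least one posted pair.
$pairs = json_decode($_POST['users'], true); // [{"id":1,"lastchange":123123}, ...]

$conditions = array();
$values = array();
foreach ($pairs as $p) {
    $conditions[] = "(`id` = ? AND `lastchange` <> ?)";
    $values[] = (int) $p['id'];
    $values[] = (int) $p['lastchange'];
}

$sql = "SELECT `id` FROM `users` WHERE " . implode(" OR ", $conditions);
$stmt = $pdo->prepare($sql);
$stmt->execute($values);

// Return only the ids of the rows that changed, e.g. [1, 15, 22]
echo json_encode(array_map('intval', $stmt->fetchAll(PDO::FETCH_COLUMN)));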
Sorry for my stupidity. I think I solved it:
JS requests the API with only the user ids on the page.
PHP responds with JSON like {id, lastchange} for each user.
JS compares each returned lastchange with the value stored on the page for that id.
JS requests the full data for each changed user and replaces the old data on the page.
Thanks all for helping.
I'm using AngularFire + Firebase and have data in the Firebase database.
I'm trying to paginate the data with Smart Table.
My problem is that I don't know how to do a range query without specifying any child, i.e. fetch records #25 to #35.
The query below gives me the first 5 records:
var queryFIrst = visitRef.startAt().limitToFirst(5);
$scope.Visits = $firebaseArray(queryFIrst);
Now I'm trying to get the next 5 records, from 6 to 10, and I tried the following:
var queryFIrst = visitRef.startAt().limitToFirst(5).endAt().limitToFirst(5);
$scope.Visits = $firebaseArray(queryFIrst);
but it gives an error that startAt and endAt can't be used like this with a limit.
In general pagination is not a good fit for Firebase's realtime data model/API. You're trying to model a SQL SKIP operator, which won't work with the Firebase Database.
But if you want to model pagination in Firebase, you should think of having an "anchor point".
When you've loaded the first page, the last item on that page becomes the anchor point. When you then want to load the next page, you create a query that starts at the anchor point and loads n+1 items.
In pseudo-code (it's real JavaScript, I just didn't run it):
var page1 = visitRef.orderByKey().limitToFirst(5);
var anchorKey;
page1.on('child_added', function(snapshot) {
anchorKey = snapshot.key; // this will always be the last child_added we received
});
Now when you want to load the next page of items, you create a new query that starts at the anchor key:
var page2 = visitRef.orderByKey().startAt(anchorKey).limitToFirst(6);
A few things to note here:
You seem to be using an approach from the Firebase 1.x SDK, such as an empty startAt(). While that code may still work, my snippets use the syntax/idiom for the 3.x SDK.
For the second page you'll need to load one extra item, since the anchor item is loaded for both pages.
If you want to be able to paginate back, you'll also need the anchor key at the start of the page.
Is this what you needed?
visitRef.orderByKey().startAt("25").endAt("35")
I asked a similar question: Get specific range of Firebase Database children.
I have a grid with data in a LightSwitch application. Every column of the grid can be filtered, thanks to lsEnhancedTable.
Right now I am sending an Ajax request to the Web API controller with the list of ids of the Customers that I want to export. It works, but with a lot of data it is very slow, because I have to turn off paging to get all visible customer ids so that I can iterate over the VisualCollection.
To optimize this I would have to turn paging back on (50 records), so that the initial load is fast, and move the loading of the data to a save/export-to-Excel button.
Possible solutions:
Load all data on save button click. To do this I have to somehow load all items before I can iterate over the collection.
The code below locks the UI thread since loadMore is async. How can I load all data synchronously? Ideally I would like to have some kind of progress view using msls.showProgress.
while (true) {
    if (screen.tblCustomers.canLoadMore) {
        screen.tblCustomers.loadMore();
    } else {
        break;
    }
}

var visibleItemsIds = msls.iterate(screen.tblCustomers.data)
    .where(function (c) {
        return c;
    });
A second approach would be to turn paging on and pass just the filters applied by the users to the Web API controller, so that I can query the database and return only the filtered records. But I don't know how to do that.
The third approach is the one I am using right now: turn off paging, iterate over the visual collection, get the customer ids, pass them to the controller, and return a filtered Excel file. This doesn't work well when there are a lot of records.
Iterate over the filtered collection on the server side? I don't know if there is a way to do this in LightSwitch.
Here's an option for client-side JavaScript.
// First build the OData filter string.
var filter = "(FieldName eq " + msls._toODataString("value", ":String") + ")";
// Then query the database.
myapp.activeDataWorkspace.ApplicationData.[TableName].filter(filter).execute().then(function (result) { ... });
I want to paginate the result of my Ajax success. In my Ajax success handler I append all results into one table (this table is empty by default, only a header).
I used this tutorial, but I can't figure out how to make it work when the table is empty by default. I was able to make it run when there are values by default.
When the table is populated after the Ajax success, nothing happens to the data; all rows are displayed. Is there a more applicable sample or tutorial for this kind of scenario, or what needs to be done with the current tutorial to make it work?
Any suggestion is appreciated.
You shouldn't use pagination this way; I suggest you make your function return only the records needed for the selected page.
It's useless to make your Ajax call return the whole set (e.g. 300 rows) if you are only going to show a subset (e.g. 30 rows on page 1). So you should add a few more parameters to the function which returns the records from the DB (let's call it getRecords):
page: the current/selected page of the paginator
records: how many records you want to show on each page
Combining these two you can limit your SQL accordingly, for instance (you can prepare the limit and the offset before the call in your PHP code):
SELECT blablabla FROM blablable WHERE blablablu
LIMIT records OFFSET (page * records)
Note: the first page here is zero. So for the first page, with records = 30, the offset is 0 * 30 = 0 and rows 1 through 30 are shown.
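For example, a minimal PHP sketch of such a getRecords endpoint (a sketch only, assuming a PDO connection in $pdo; the table and column names are placeholders):
<?php
// Hypothetical getRecords endpoint: return only the rows for one page, as JSON.
// Assumes a PDO connection in $pdo; table and column names are placeholders.
$page    = isset($_GET['page'])    ? max(0, (int) $_GET['page'])    : 0;
$records = isset($_GET['records']) ? max(1, (int) $_GET['records']) : 30;
$offset  = $page * $records;

// Safe to interpolate: both values were cast to int above.
$rows = $pdo->query("SELECT col1, col2 FROM my_table LIMIT $records OFFSET $offset")
            ->fetchAll(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode($rows);
The Ajax call then sends page and records, and the success handler rebuilds the table body with just that page.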
Here you have a good tutorial.