For my site, I have a list of cities in the U.S. (about 30,000) and I'm using that to autocomplete a freeform field, so users can start typing their city, and choose it from the list.
My question is, what is the best way to put the data into the autocomplete list?
1. Read all of the data from the DB via PHP and put it into a large JavaScript array.
2. Make a call to the DB every time the user updates the city field, and search for cities that match what they're typing.
I ask this because I'd like to know which is best for performance. It seems like #2 would have a lot of DB calls, and overhead if 100+ people are hitting the site at a time.
#1 may not be great because 30,000 elements in an array seems like a lot.
I'm currently using method #1 and it seems OK, but it does look sloppy when you view the source and see 30,000 lines for the array.
Any suggestions/input would be appreciated.
Both solutions seem expensive in terms of performance. The first may cause problems on slow computers, while the second is expensive in server resources.
I would suggest using a full-text search engine like Sphinx, because it supports complex search queries and is much faster than querying the DB directly thanks to its caching.
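For example, Sphinx's searchd can expose a SphinxQL (MySQL-protocol) listener, so the PHP side stays tiny. This is only a rough sketch: the index name, the port, and the assumption that the city name is stored as a string attribute are all illustrative, not anything your setup is guaranteed to have.

$sphinx = new PDO('mysql:host=127.0.0.1;port=9306');

// term typed by the user
$term = trim(strip_tags($_GET['term']));

// MATCH() runs the full-text query; the trailing * asks for a prefix match
// (this assumes prefix indexing is enabled in the Sphinx config)
$stmt = $sphinx->prepare("SELECT city FROM city_index WHERE MATCH(?) LIMIT 20");
$stmt->execute([$term . '*']);

echo json_encode($stmt->fetchAll(PDO::FETCH_COLUMN));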
I always use #2.
This is what my PHP can look like:
//retrieve the search term that autocomplete sends
$term = trim(strip_tags($_GET['term']));

//prepare the query; binding the term avoids SQL injection, and the trailing %
//matches every value that starts with what the user typed ($pdo is a PDO connection)
$stmt = $pdo->prepare("SELECT DISTINCT `column` AS value FROM `table` WHERE `column` LIKE ? LIMIT 20");
$stmt->execute([$term . '%']);

//return the matches as JSON for the autocomplete widget
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
Related
I'm currently trying to modify the selection order of some records using a javascript drag&drop mechanism.
This is the idea:
Once I've ordered the elements by drag&drop, I retrieve the IDs of each element (in the right order) and send them to PHP via an Ajax call.
I store the array of IDs somewhere (still to be worked out).
Then, I run a query like this:
$sql = "SELECT * FROM items ORDER BY field(id, ".$order.");";
(where $order is the imploded array of IDs)
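With PDO, the same query can also be built with bound placeholders instead of concatenating $order, which keeps it safe even though the IDs come from the client. A sketch, assuming an existing PDO connection:

// one ? placeholder per ID, in the order received from the drag&drop
$placeholders = implode(',', array_fill(0, count($ids), '?'));

$stmt = $pdo->prepare("SELECT * FROM items ORDER BY FIELD(id, $placeholders)");
$stmt->execute($ids);
$items = $stmt->fetchAll(PDO::FETCH_ASSOC);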
It works quite well but, since I've never used this feature before, my doubt is:
since my IDs are strings of 16 characters, and supposing I have 200 records to order...
...should I expect some trouble in terms of performance?
Do you see any better solution?
Thanks.
The comments up there made me think and I realized that this approach has a big issue.
Even if I only send the $order array at the end of the drag&drop process - I mean, push a button (unlock d&d), reorder, confirm (send & lock) - it would still be necessary to perform that custom select on every single JS action that refreshes the elements (view, create, rename, ...). And that's pretty dumb.
So I guess the best approach is the one suggested by Kiko, maybe with a lock system as described above, to avoid an Ajax call and a consequent reindexing of the order field at every single move.
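A minimal sketch of that order-field idea, assuming an integer sort_order column on items and a single Ajax call when the user confirms the new order (column and parameter names are illustrative):

// IDs in their new order, posted once when the user confirms
$ids = $_POST['ids'];

$pdo->beginTransaction();
$stmt = $pdo->prepare("UPDATE items SET sort_order = :pos WHERE id = :id");
foreach ($ids as $pos => $id) {
    $stmt->execute([':pos' => $pos, ':id' => $id]);
}
$pdo->commit();

// every later read is then just:
// SELECT * FROM items ORDER BY sort_order;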
I have a webapp with an input field which should be filled in with the user's region according to ISO3166-2. To improve the user experience I want to make this field an autocomplete - the user starts typing and gets suggestions. There's also a country field which, if filled, would limit the regions suggested to those from the chosen country.
The regions list is a pretty large amount of data - it includes region names and codes for all officially recognized countries in the world, and its footprint is about 250 kB.
Those of you who have had similar experience: which of the following ways to implement it would you recommend to achieve the best performance, and why?
1) don't send the whole regions list to the client, and instead make a request to the server every time the user types in the region field (debounced, of course), so we look for suggestions server-side but have additional round trips;
2) use a web worker to find suggestions;
3) something else?
You can have an associative array with keys as the region name (or something similar), and have the array populated from the backend according to the logic.
Binary search the array using the typed keyword
If data exists then populate the autocomplete or else populate the array from backend data.
You have to write code to maintain the array yourself.
Or you can save yourself time by using something like Twitter's typeahead.js with Bloodhound, which gives you prefetching, intelligent caching, fast lookups, and backfilling with remote data.
Here's the link: https://github.com/twitter/typeahead.js
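If you go the remote/backfill route, the server side can stay small. A hedged sketch of the endpoint the widget could hit, assuming a regions table with code, name, and country_code columns and an existing PDO connection (all names are illustrative):

$term    = trim($_GET['term'] ?? '');
$country = trim($_GET['country'] ?? '');

$sql    = "SELECT code, name FROM regions WHERE name LIKE :term";
$params = [':term' => $term . '%'];

// narrow the suggestions when a country has been chosen
if ($country !== '') {
    $sql .= " AND country_code = :country";
    $params[':country'] = $country;
}

$stmt = $pdo->prepare($sql . " ORDER BY name LIMIT 20");
$stmt->execute($params);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));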
I'm looking to add a feature to my application. Currently, you enter your first name, last name, and a few other details.
What I want to add is the ability to start typing the first name in the form and instantly see previous entries with a similar name; if the name is there, you can use those details rather than entering the name again.
Once you've selected the name of a previous entry you can carry on adding the data to the form for the fields not populated.
I'm currently working with Laravel; any insight into tackling this one would be greatly appreciated.
Thanks
I have done this before, but not in Laravel. The concepts should be mostly the same. The issue you will run into is how you want to qualify "similar". If similar means you can use a "like" query against database values that start with what has been typed, it's super easy. The simple way is to perform a query like this:
$users = DB::table('users')
    ->where('name', 'like', 'rob%')
    ->get();
Because the wildcard is at the end, you can still use a standard database index to prevent your lookups from killing your database.
If you want something more advanced, you'll have to figure out indexing schemes and possibly use a full-text index server like Lucene. We ended up with the latter.
In either case, you will need an API endpoint that works with a front-end widget. I believe we used the jQuery plugin Select2; it has an example of making it work with a remote data set via Ajax.
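For reference, the endpoint itself can be a small Laravel route that returns JSON for the widget's remote source. This is only a sketch - the route, the q parameter, and the selected columns are assumptions, not anything Select2 requires:

use Illuminate\Http\Request;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Route;

Route::get('/users/search', function (Request $request) {
    $term = $request->query('q', '');

    // prefix match keeps the lookup index-friendly, as noted above
    $users = DB::table('users')
        ->where('name', 'like', $term . '%')
        ->limit(10)
        ->get(['id', 'name', 'email']);

    return response()->json($users);
});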
I am working with a database that was handed down to me. It has approximately 25 tables, and a very buggy query system that hasn't worked correctly for a while. I figured, instead of trying to bug test the existing code, I'd just start over from scratch. I want to say before I get into it, "I'm not asking anyone to build the code for me". I'm not that lazy, all I want to know is, what would be the best way to lay out the code? The existing query uses "JOIN" to combine the results of all the tables in one variable, and spits it into the query. I have been told in other questions displaying this code, that it's just too much, and far too many bugs to try to single out what is causing the break.
What would be the most efficient way to query these tables that reference each other?
Example: a person chooses car year, make, and model. PHP then gathers that information and queries the SQL database to find which parts have a matching year, matching vehicle IDs, and are compatible. It then uses those results to pull parts that have matching car model IDs OR vehicle IDs (because the database was built very sloppily), and compares all the different tables to produce: parts, descriptions, prices, part numbers, SKU numbers, any retailer notes, wheelbase, drive-train compatibility, etc.
I've been working on this for two weeks, and I'm approaching my deadline with little to no progress. I'm about to scrap their database, and just do data entry for a week, and rebuild their mess if it would be easier, but if I can use the existing pile of crap they've given me, and save some time, I would prefer it.
Would it be easier to do a couple queries and compare the results, then use those results to query for more results, and do it step by step like that, or is one huge query comparing everything at once more efficient?
Should I use JOIN and pull all the tables at once and compare, or pass the input into individual variables and push the work into JavaScript on the client side to save server load? Would it be simpler to break the code up so I can identify the breaking points, or would using one long query decrease query time and server load? This is a very complex question, but I just want to make sure there aren't too many responses asking for clarification on trivial areas. I'm mainly seeking the best advice possible on how to handle this complicated situation.
Rebuild the database, then write a PHP import to bring over the data.
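Once the schema is rebuilt, the year/make/model lookup can usually collapse into a single indexed JOIN instead of a chain of queries. A hedged sketch - every table and column name here is a hypothetical placeholder:

$stmt = $pdo->prepare("
    SELECT p.part_number, p.sku, p.description, p.price, p.retailer_notes
    FROM parts p
    JOIN part_vehicle pv ON pv.part_id = p.id
    JOIN vehicles v      ON v.id = pv.vehicle_id
    WHERE v.year = :year AND v.make = :make AND v.model = :model
");
$stmt->execute([
    ':year'  => $_POST['year'],
    ':make'  => $_POST['make'],
    ':model' => $_POST['model'],
]);
$parts = $stmt->fetchAll(PDO::FETCH_ASSOC);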
I have a grid (an employee grid) which has, say, 1000-2000 rows.
I display the employee name and department in the grid.
When I get data for the grid, I also get other details for the employee (date of birth, location, role, etc.).
The user has the option to edit the employee details. When he clicks edit, I need to display the other employee details in a pop-up. Since I have stored all the data in JavaScript, I search for the particular ID and display all the details, so the code will be like:
function getUserDetails(employeeId) {
    // employeeInformation holds all the employee details; it was populated
    // while getting the data for the grid
    for (var i = 0; i < employeeInformation.length; i++) {
        if (employeeInformation[i].employeeID == employeeId) {
            // display employee details
        }
    }
}
The second solution would be to pass the employee ID to the database and get all the information for that employee. The code will be like:
function getUserDetails(employeeId) {
    // make an ajax call to the controller, which will call a procedure
    // in the database to get the employee details,
    // then display the employee details
}
So, which solution do you think will be optimal when I am handling 1000-2000 records?
I don't want to make the JavaScript heavy by storing a lot of data in the page.
UPDATED:
So one of my friends came up with a simple solution.
I am storing 4 columns for 500 rows (on average), so I don't think there should be any noticeable slowness in the webpage.
While loading the rows into the grid, I give the edit link a data-rowId attribute so that it will be easy to retrieve the data.
Say I store all the employee information in a variable called employeeInfo.
When someone clicks the edit link, $(this).attr('data-rowId') gives the rowId, and employeeInfo[$(this).attr('data-rowId')] should give all the information about the employee.
Instead of storing the employee ID and looping over the employee table to find the matching employee ID, the rowId should do the trick. It's very simple, but it did not strike me.
I would suggest you make an AJAX call to the controller, for two main reasons:
1. It is not advisable to handle database activity in JavaScript, due to security issues.
2. JavaScript runs on the client's machine, so it should carry the least load and computation.
JavaScript should be as light as possible, so I suggest you do it in the database itself.
Don't count on JavaScript performance, because it depends heavily on the computer it is running on. I suggest you store and search on the server side rather than loading a heavy payload of data into the browser, which is limited to the end user's resources.
Running long loops in JavaScript can lead to an unresponsive, irritating UI. As good practice, use Ajax calls to get the data you need.
Are you using HTML5? Will your users typically have relatively fast multicore computers? If so, a web-worker (http://www.w3schools.com/html/html5_webworkers.asp) might be a way to offload the search to the client while maintaining UI responsiveness.
Note, I've never used a Worker, so this advice may be way off base, but they certainly look interesting for something like this.
In terms of separation of concerns, and recommended best approach, you should be handling that domain-level data retrieval on your server, and relying on the client-side for processing and displaying only the records with which it is concerned.
By populating your client with several thousand records for it to then parse, sort, search, etc., you not only take a huge performance hit and diminish user experience, but you also create many potential security risks. Obviously this also depends on the nature of the data in the application, but for something such as employee records, you probably don't want to be storing that on the client-side. Anyone using the application will then have access to all of that.
The more pragmatic approach to this problem is to have your controller populate the client with only the specific data which pertains to it, eliminating the need for searching through many records. You can also retrieve a single object by making an ajax query to your server to retrieve the data. This has the dual benefit of guaranteeing that you're displaying the current state of the DB, as well as being far more optimized than anything you could ever hope to write in JS.
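To make that concrete, the server side of such an Ajax call can be very small. A minimal sketch, assuming an employees table and an existing PDO connection (the table and column names are illustrative):

// endpoint the edit pop-up could call with ?employeeId=...
$id = (int) ($_GET['employeeId'] ?? 0);

$stmt = $pdo->prepare(
    "SELECT employeeID, name, department, date_of_birth, location, role
     FROM employees
     WHERE employeeID = :id"
);
$stmt->execute([':id' => $id]);

header('Content-Type: application/json');
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC) ?: null);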