mysql order by field usage/limit/performance - javascript

I'm currently trying to modify the selection order of some records using a javascript drag&drop mechanism.
This is the idea:
Once I've ordered the elements by d&d I retrieve the IDs of each element (in the right order) and I send them to php via ajax call.
I store the array of IDs somewhere (to develop)
Then, I run a query like this:
$sql = "SELECT * FROM items ORDER BY field(id, ".$order.");";
(where $order is the imploded array of IDs)
It works quite well but, since I've never used this feature before, my doubt is:
since my IDs are strings of 16 characters, and supposing I have 200 records to order...
...should I expect trouble in terms of performance?
Do you see any better solution?
Thanks.

The comments up there made me think and I realized that this approach has a big issue.
Even if I only send the $order array at the end of the drag&drop process - I mean, push a button (unlock d&d), reorder, confirm (send&lock) - I would still need to perform a custom select on every single JS action that refreshes the elements (view, create, rename, ...). And that's pretty dumb.
So I guess that the best approach is the one suggested by Kiko, maybe with a lock system as described above to avoid an ajax call and consequent reindexing of the order field at every single move.
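To make the lock/confirm idea concrete, here is a minimal client-side sketch: validate and serialize the ordered ID list once, on confirm, instead of firing an AJAX call on every move. The helper name, the alphanumeric-ID assumption, and the endpoint in the comment are all illustrative, not part of the original code.

```javascript
// Validate and serialize the ID list before it is sent to PHP.
// IDs here are assumed to be 16-character alphanumeric strings,
// as described in the question.
function buildOrderParam(ids) {
  var valid = ids.every(function (id) {
    return typeof id === 'string' && /^[A-Za-z0-9]{16}$/.test(id);
  });
  if (!valid) {
    throw new Error('unexpected ID format');
  }
  // Quote each ID so the imploded list is safe inside FIELD(id, ...)
  return ids.map(function (id) { return "'" + id + "'"; }).join(',');
}

// On "confirm", read the current DOM order and POST it once, e.g.:
// $.post('save_order.php', { order: buildOrderParam(idsFromDom) });
```

Validating the IDs client-side is not a substitute for escaping on the PHP side, but it keeps obviously malformed input out of the ORDER BY FIELD clause.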

Related

Best practice for using PHP DB to populate JQuery autocomplete

For my site, I have a list of cities in the U.S. (about 30,000) and I'm using that to autocomplete a freeform field, so users can start typing their city, and choose it from the list.
My question is, what is the best way to put the data into the autocomplete list?
Read all of the data from the DB via PHP, and put it into a large Javascript array.
Make a call to the DB every time the user updates the city field, and search for cities that match what they're typing.
I ask this because I'd like to know which is best for performance. It seems like #2 would have a lot of DB calls, and overhead if 100+ people are hitting the site at a time.
#1 may not be great because 30,000 elements in an array seems like a lot.
I'm currently using method #1 and it seems OK, but it does look kind of sloppy when you view the source code and it shows 30,000 lines for the array.
Any suggestions/input is appreciated.
Both solutions seem expensive performance-wise. The first may cause problems on slow computers, while the second is expensive in server resources.
I would suggest using a full-text search engine like Sphinx, because it supports complex search queries and is much faster than the DB thanks to caching.
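For comparison, if the list does stay client-side as in method #1, the filtering itself is cheap as long as you cap how many matches you render. A minimal sketch (the city names and the cap are illustrative):

```javascript
// Case-insensitive prefix filter over an in-memory city list,
// capped so the dropdown never renders thousands of entries.
function filterCities(cities, term, limit) {
  var t = term.toLowerCase();
  var out = [];
  for (var i = 0; i < cities.length && out.length < limit; i++) {
    if (cities[i].toLowerCase().indexOf(t) === 0) {
      out.push(cities[i]);
    }
  }
  return out;
}
```

With 30,000 entries this loop is still fast in any modern browser; the real cost of #1 is the initial page weight, not the per-keystroke filtering.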
I always use #2.
This is roughly what my PHP can look like (using a prepared statement, so the search term can't inject SQL):
PHP
// Retrieve the search term that autocomplete sends
$term = trim(strip_tags($_GET['term']));
// Prepare the query; the wildcard stays outside the bound value
$stmt = $mysqli->prepare("SELECT DISTINCT column AS value FROM table WHERE column LIKE CONCAT(?, '%')");
$stmt->bind_param('s', $term);
$stmt->execute();
$result = $stmt->get_result();
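To keep #2 from turning every keystroke into a database query, it's worth debouncing the lookup on the client. A generic sketch (the wait time, the lookup function, and the wiring in the comments are assumptions):

```javascript
// Fire the lookup only after the user has stopped typing for `wait` ms,
// so each keystroke does not become a separate database query.
function debounce(fn, wait) {
  var timer = null;
  return function () {
    var args = arguments;
    clearTimeout(timer);
    timer = setTimeout(function () { fn.apply(null, args); }, wait);
  };
}

// Usage sketch: lookupCities would do the AJAX call to the PHP above.
// var search = debounce(lookupCities, 250);
// input.addEventListener('input', function (e) { search(e.target.value); });
```

With 100+ concurrent users, a 250 ms debounce plus the indexed prefix query above usually keeps #2 well within reach of a small server.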

Populate form with previous data

I'm looking to add a feature to my application. Currently, you enter your first name, last name and a few other details.
What I want to add is the ability to start typing the first name in the form and instantly see previous entries with a similar name; if the name is there, use those details rather than entering them again.
Once you've selected the name of a previous entry you can carry on adding the data to the form for the fields not populated.
I'm currently working with Laravel; any insight into tackling this would be greatly appreciated.
Thanks
I have done this before but not in Laravel. The concepts should be mostly the same. The issue you will run into is how you want to qualify "similar". If similar means you can use a "like" query against database values that start with what has been typed, it's super easy. The simple way is to perform a query like this:
$users = DB::table('users')
->where('name', 'like', 'rob%')
->get();
Because the wildcard is at the end, you can still use a standard database index to prevent your lookups from killing your database.
If you want something more advanced, you'll have to figure out indexing schemes and possibly use a full-text index server like Lucene. We ended up with the latter.
In either case, you will need an API endpoint that works with a front-end widget. I believe we used the JQuery plugin Select2. It has an example to make it work for a remote data set with Ajax.
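On the client side, once the user picks a previous entry from the widget, you only want to fill in the fields they haven't touched yet. A minimal sketch of that merge step (the form is modeled here as a plain object of field name to current value, and the field names are illustrative):

```javascript
// When a previous entry is picked, copy its values into the form,
// but leave any field the user has already filled in alone.
function fillFromRecord(form, record) {
  Object.keys(record).forEach(function (key) {
    if (!form[key]) {
      form[key] = record[key];
    }
  });
  return form;
}
```

In a real page you would read and write the actual input elements instead of a plain object, but the "only fill empty fields" rule is the part that matters.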

Where to render Ajax search results in an object oriented approach using Coldfusion?

I'm updating a Coldfusion8/MySql site with a fairly complex search from "spaghetti to object" (= separate view, controller and process - no framework, everything handled by Jquery Mobile).
I need to run the search query through Jquery-AJAX and am now posting the search form to my searchProcess.cfc, which does the database queries.
Question:
I'm not sure where to render the results?
The results will be fairly complex (database with a few million records, rows of 40 fields) and should end up in a single result layout or a multiple result layout file.
I was thinking of constructing the files inside the cfc and handing them back via cfsavecontent, but I'm reading everywhere this is a no-no...
What are the alternatives then?
I could set up a template_single.cfm and template_multi.cfm, pass back pure search results as AJAX response and then fire another AJAX call from the success handler to call the template and then render the output of this 2nd call. This seems awfully complicated, plus I don't see where I can fit my pagination in there without passing around large datasets.
So I'm looking for some advice on how to handle search-results in an object-oriented-way?
Thanks for input!
EDIT:
After a few more hours of googling, I'm currently looking at the following option:
1.) run a single database query to return paginated results - as per here
2.) send data with 0-25 records back to AJAX in JSON
3.) trying to use a template cf/js in a loop (length 1 or length 25) - as per here
This would mean transferring only 1-25 raw records as JSON. If I render in the success handler, I don't have to make an HTTP request for another template.
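A minimal sketch of step 3, rendering the JSON rows client-side in the success handler and switching between the single and multiple layouts (the markup and field names are illustrative, not the actual templates):

```javascript
// Build the result markup from the 0-25 rows returned by the cfc.
// A single row uses the single-result layout; anything else gets a list.
function renderResults(rows) {
  if (rows.length === 1) {
    // single-result layout
    return '<div class="single">' + rows[0].title + '</div>';
  }
  // multiple-result layout
  return '<ul>' + rows.map(function (r) {
    return '<li>' + r.title + '</li>';
  }).join('') + '</ul>';
}
```

The success handler would then just drop this string into the results container, and pagination only ever moves 25 records at a time.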
Does this approach make sense?
First off, I see nothing wrong with putting display logic in a .cfc that is specifically for display logic. I know it's not strict MVC, but depending on what you're doing, it can work just fine. What I don't like is .cfc's that do any sort of output within the function. I always hand back any data from the function.
/end opinion
For this particular problem, I second the EDIT idea of setting up the view to be almost all HTML/jQuery with AJAX calls for paginated recordsets. As far as single vs. multiple results, I'd go with separate jQuery functions depending on which one you need. The nice thing about this is that the multiple-recordset display could call the single-record display to view one record (while still retaining the full recordset in the DOM).
In both cases, I highly recommend getting a Javascript Dump function (helps so much in visualizing the DOM data.)
PS. If anybody finds any newer/better JS Dump functions that work like cfdump, please, please, please let me know!
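In the spirit of that request, here is a minimal cfdump-style helper; it is a bare-bones sketch, not a full replacement (no cycle detection, no HTML table output like cfdump produces):

```javascript
// Recursively dump a value to an indented string so DOM/JSON data
// can be inspected quickly, loosely in the spirit of cfdump.
function jsDump(value, indent) {
  indent = indent || '';
  if (value === null || typeof value !== 'object') {
    return indent + String(value);
  }
  return Object.keys(value).map(function (key) {
    var v = value[key];
    if (v !== null && typeof v === 'object') {
      return indent + key + ':\n' + jsDump(v, indent + '  ');
    }
    return indent + key + ': ' + String(v);
  }).join('\n');
}
```

For quick debugging, `console.log(jsDump(data))` is usually enough; modern browser consoles also render objects interactively on their own.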

should I ALSO keep my data in a Javascript data structure? or just in the DOM?

I am very new to Javascript and Jquery, so my apologies for this beginner's question.
In a simple ajax web app, I am creating an HTML page which is mainly a big table. Each row in this table describes an event (party, show, etc). My page not only displays this information but is meant to let the user do a bunch of things with it, in particular to dynamically search and filter the table according to a variety of criteria.
First an abstract beginner's question: in this broad kind of situation (by which I mean that you want your javascript code to run a bunch of operations on the information you retrieve from the webserver) would you use the DOM as a data structure? The ease with which one can search and manipulate it (using Jquery) makes that a possibility. (E.g., "find me table rows describing an event with date column = 2010-01-01 and event type column = 'private party'.) Or would you keep the same information in a traditional Javascript data structure, search/filter/operate on that using plain javascript code and then update the DOM accordingly to display the results to the user?
(As a newbie, I imagine the first, DOM-only approach to be slower, while the latter would take up a good deal of memory. Right? Wrong?)
Assuming the second strategy is reasonable (is it?), then a practical question: can I simply store in my Javascript objects a pointer to the corresponding Jquery object? Eg, can I do
var events = new Array();
// ....
var event3094 = new Event('party','2010-01-01' /*, ... */);
event3094.domElement = $("#correctIdOfTheEventRowInMyTable");
events.push(event3094);
Does this store just a reference (pointer?) to the Jquery object in each Event object or is it creating a new copy of the Jquery object?
I am just wondering "how the pros" do it. : )
Thank you for any advice and insight.
cheers
lara
There are so many ways to do this, but DOM manipulation will almost always be slower than JS manipulation.
To answer your question: any time you use $(selector), a new jQuery object is created and the document is searched for matching elements.
I would recommend two approaches:
FIRST OPTION
Load data in a normal HTML table
Read through the rows, and store just the data (each cell's contents) in an array similar to your code example.
Store a reference to the tr in that object.
Then you can process filter, etc, and only apply changes and searches to the DOM as needed.
SECOND OPTION
Load the page without the table
Load the data as JSON from the server, and generate a table from the data
Store reference to the tr element
Basically, you don't want to perform $(selector) a thousand times. The concept is something like this:
var $rows = $("table tr");
// ...
event.domElement = $rows[0]; // Stores reference to existing DOM node, not a new jQuery object.
Then when you need to use jQuery methods on the object, you could use $(yourEvent.domElement) to wrap it in a jQuery wrapper.
Depending on the number of rows you might expect to be shown for most of your users (let's assume it's no more than a few hundred), I myself would probably aim to just keep everything in the DOM table that you're already building. (If you are expecting to be dealing with thousands of rows on one page, you might want to explore a different solution rather than sending it all to the browser.)
There are a few things that you did not mention in your original post. First, how are you creating this table? I imagine using a server-side solution. How easy is that to modify? How much extra work would it be to go through and generate all of your data a second time in a different format, as XML or JSON? Does this add a bunch of complexity on the server-side, only so that you can add more complexity client-side to match? Certain platforms may make this trivial, but is something to consider.
Now, in regards to your alternatives to the DOM:
I agreed and mentioned in a comment above that I don't think JSON would be very optimal "out of the box" for what you want to do. A javascript array is no better. XML is nice in that you can use jquery to easily traverse/filter, but then you still have to deal with your DOM. Sure, you can store references to your DOM elements, but that just seems like a bunch of work up front and then some more work later when matching them up. And without necessarily guaranteeing any major performance boost.
So, to answer your question directly as it is phrased, should you ALSO keep your data in a JavaScript data structure, or just in the DOM: You did mention this was a "simple" ajax web app. My recommendation is to try and keep this simple, then! Your example of how you can so easily use jquery to find rows and cells based on search criteria should be enough to convince you to give this a try!
Best of luck!-Mike
I think you'll find that the DOM method is close to the same speed, if I follow your logic right. The second method requires you to manipulate the data and then apply the changes to the DOM, whereas the first method allows both operations at the same time.
If you've got a lot of data, I would forgo making objects and just supply it as XML. That way you get most of the same features as operating on the HTML DOM, but you don't have a crazy markup structure like a table to navigate through.
I think it's largely a personal preference, but I like to store the objects in JavaScript and then render them to HTML as needed. From my understanding
event3094.domElement = $("#correctIdOfTheEventRowInMyTable");
will store a reference to the jQuery wrapper of your row element (not a copy).
One benefit of storing in JS is that if you have lots of objects (e.g. 10,000) and only render a small fraction of them, the browser will perform a lot better because you're not creating 10,000 * (number of DOM elements per object) elements.
Edit: Oh, and if you can, you might want to send the data from the server to the client as JSON. It's very compact (compared to XML) and in the new browsers it can be parsed quickly and safely using JSON.parse(). For older browsers, just use this library http://www.json.org/js.html . The library will only create a global JSON namespace if the browser doesn't supply it.
One thing to consider is how you need to access your data. If all the data for an element's event is contained in the element (an event operating solely on a table cell for example), then storing in the DOM makes sense.
However, if the calculation for one element depends on data from other elements (a summation of all table cells in a particular column, for example), you may find it difficult to gather up all the data if it's scattered about in DOM elements, compared to a single data structure.
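Both access patterns are trivial against a plain array of row objects. A sketch using the question's own example criteria (field names and sample data are illustrative):

```javascript
// Sample rows, shaped like the table described in the question.
var events = [
  { date: '2010-01-01', type: 'private party', guests: 40 },
  { date: '2010-01-01', type: 'show',          guests: 200 },
  { date: '2010-02-14', type: 'private party', guests: 25 }
];

// Element-local lookup: rows matching a date and an event type.
function filterEvents(rows, date, type) {
  return rows.filter(function (r) {
    return r.date === date && r.type === type;
  });
}

// Cross-element calculation: summing one "column" over all rows.
function totalGuests(rows) {
  return rows.reduce(function (sum, r) { return sum + r.guests; }, 0);
}
```

Doing the same column sum by walking table cells with jQuery works too, but you end up parsing text out of the DOM instead of adding numbers you already have.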

Load up entire table again or add row manually?

This is an Ajax question. I have a table that shows user information at certain times, depending on what settings the user sets.
In some cases a user will see the information right away, and in some cases they won't; it all depends on when they want to see the information.
Now what should I do?
Should I do a POST of their data and then an AJAX GET afterwards to fetch the table and render it?
*I could probably do it all in the POST, but unless there's some huge performance gain I'd rather not; otherwise I have to mix "success/fail" messages and the rendered table in the same response.
So each one seems to have pluses and minuses.
Ajax way
- I don't have to worry about a JavaScript solution that queries the database to figure out the user's timezone and then determines whether the row should be added, or any of the other headaches that come with JavaScript dates.
- Each row could potentially have a different style, too. That would mean I'd have to query the database to figure it out, or keep a hidden field in the page for easy access. With the Ajax way I don't have to worry about it.
- I don't have to build a row manually in javascript/jquery syntax, which can be a pain if you have many columns.
Javascript way
- Less of a performance hit, since I only potentially have to make one new row, or do nothing. Otherwise I have to regenerate the whole table regardless, and if it has lots of rows that could be slow.
- Have to rebind all the jQuery plugins on the table, or use jquery.live for everything else.
So I am not sure to me it seems like a hard choice.
Unless I misunderstand what you want to do, why not do both in one solution?
Return a JSON response, so, when a user logs in, you post the information using an ajax call, and just return the data from the database. I tend to return either data or an error message, but you could have two objects in your json string, one for a possible error message and the other being the data that is being returned.
The javascript then can process the data as is needed.
So, you do both, it isn't an either/or decision.
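One way to sketch the two-object response shape described above: the server always returns something like { error: ..., data: ... }, and the client branches on it in one place. The shape and field names are assumptions, not a fixed API:

```javascript
// Parse the server's JSON response and branch on the error field,
// so success and failure come back through one endpoint.
function handleResponse(json) {
  var res = JSON.parse(json);
  if (res.error) {
    return { ok: false, message: res.error };
  }
  return { ok: true, rows: res.data };
}
```

The AJAX success handler would then either show the message or render the rows, without needing separate endpoints for the table and the status.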
