Efficiently loading/caching non-small array from remote source? - javascript

I'm inexperienced at JavaScript, and don't know how to optimize things for my situation.
I've written an autocomplete function using the jQuery UI Autocomplete plugin. The source for the completion is a JSON array, holding a few thousand items, that I load from the same server. This autocomplete will be attached to a search box that will be on every page of my site, so it'll get populated a lot; I don't want to request the same array every time anyone hits any page. The completion depends on database values, so I can't just put the array in static form in the code. However, it doesn't have to be perfectly synced; caching it for some amount of time would be fine.
Right now, I'm loading the array with $.getJSON. It seems that using an actual remote source is meant to be an AJAX thing where the server does the actual search itself as you type; I think this is probably overkill given that there are only a few thousand items, rather than millions. I don't want to fire a zillion requests every time someone types into the search box.
What is the correct way of handling this? I'm totally unfamiliar with how caching would work in JS, or if there's some built-in way to accomplish a similar thing.
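For concreteness, this is the kind of caching I have in mind, sketched with localStorage and a timestamp; the URL, cache key, expiry and '#search' selector are just placeholders:

// Sketch: cache the completion array in localStorage and only re-fetch
// it from the server once it is older than maxAge.
var CACHE_KEY = 'autocompleteSource';   // placeholder key
var maxAge = 60 * 60 * 1000;            // one hour, as an example

function loadSource(callback) {
    var cached = JSON.parse(localStorage.getItem(CACHE_KEY) || 'null');
    if (cached && Date.now() - cached.fetchedAt < maxAge) {
        callback(cached.items);         // still fresh: use the cached array
        return;
    }
    $.getJSON('/autocomplete-source.json', function (items) {
        localStorage.setItem(CACHE_KEY, JSON.stringify({ fetchedAt: Date.now(), items: items }));
        callback(items);
    });
}

loadSource(function (items) {
    $('#search').autocomplete({ source: items });
});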

Related

Django: fastest way to update the data that was once sent from template to view

I am working on a data visualisation app that will allow the user to filter the data he sees by various criteria.
I want to keep as much logic as possible on the Python/Django side, like this:
1. Data is passed from the Django view to the template.
2. On the frontend, the user filters the data through various controls: dropdowns, sliders etc.
3. The control inputs are sent back to the Django view (via an AJAX POST request?), which returns the filtered data and sends it back to the template.
4. The template - the visualization - is updated with the filtered data.
Is this a good approach? My concern is that a lot of data will be flying around and the app might be unresponsive.
Another, possibly faster, idea is to filter the data on the client side in JavaScript - but I would really like to leverage the great Python data munging libraries instead.
If you want to use the DRF API, then go with it. A lot of websites have filtering features. I'd suggest you take a look at the django_filter package; it's possible to integrate it with DRF.
The worst thing about filtering data on the client side is that you can't use pagination. Imagine you have 500+ objects to filter; a JavaScript filtering function is what will really make your app slow.
At the same time, if you have 20-30 objects to filter and this number won't grow, then you can go with JS only and a single endpoint: getAll().
A common approach is to set up a JavaScript on_change handler and construct GET requests like this (example from a real project):
https://yourbackend.com/api/product/?status=not_published,published,inactive&search=132&moderation_status=declined,on_moderation,not_ready&ordering=desc&price_max=1000&page=1
DRF + django_filters will work just fine for that, with a minimum of your code involved.
A well-known pitfall on the JS side is making requests without a timeout, e.g. the user types text and a request is sent on every keyup() event, or they move a slider and a lot of requests are made. You'll need to make the request only when the user stops, e.g. 300ms after they've chosen a value. See this question for reference.
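For illustration, a minimal sketch of such a debounced handler against the endpoint above; the selector, parameters and renderResults() callback are just placeholders:

// Wait 300ms after the last keystroke before hitting the API,
// so typing doesn't fire a request on every keyup.
var debounceTimer = null;

$('#search').on('keyup', function () {
    var query = $(this).val();
    clearTimeout(debounceTimer);
    debounceTimer = setTimeout(function () {
        $.getJSON('/api/product/', { search: query, ordering: 'desc', page: 1 }, function (data) {
            renderResults(data);   // placeholder: redraw the filtered view
        });
    }, 300);
});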
Sure, there's one more point. Your database has to be normalised and have proper indexes, but you only need to look at this side if you end up with really slow SQL queries.
Summing up: I'd choose a thin JS layer and do most of the work on the backend.

Dynamically / iteratively build up page content

I am planning to create a website which will let you iteratively construct an SQL query.
The idea is the following:
while (user wants more where clauses)
{
    show selection (html select) for table columns
    let user choose one column
    upon selection, show distinct values of that column
    let user choose one/multiple value(s)
}
I know how to handle the SQL part, but I am not sure how to tackle the iterative building of the page.
So my questions are:
What is the best method to build the page iteratively with the idea sketched above?
What do I do, if the user changes one of the previous selections?
The website will be built with Perl, and I am thinking of utilizing Ajax for the dynamic part.
Any help is much appreciated.
If I were to do this, I'd use SQL::Abstract.
UPDATE:
If you don't want to redraw the whole page, you're going to be using AJAX. So find yourself a JavaScript library that you feel comfortable with that includes AJAX calls. jQuery has this; a bunch of others do too. People have differing opinions on the various libraries.
Anyway, your workflow looks like this:
1. user submits form
2. javascript performs client-side validation
3. javascript submits AJAX-style to the server
4. server performs server-side validation, data manipulation, etc.
5. server responds with data payload
6. client updates the screen based on the contents of the payload
So let's concentrate on 5 and 6.
Data Payload: The X in AJAX means XML, but many apps send back JSON, or HTML.
Update the Screen:
You can apply HTML directly to the existing page by setting a tag's innerHTML or outerHTML attribute, but that doesn't give you references to the resulting DOM nodes. If you don't dig around the DOM in your client code, this can suffice. If you do dig around, then you need to build nodes and add them to your page's DOM, so you might want to consider sending back JSON or XML.
So let's say that you have a div with id='generatedSQL'. When your AJAX call returns, it will fire a callback method (let's call it updateSQL()), and inside the callback you'll have code that looks approximately like:
$('#generatedSQL').html(theVariableHoldingTheHtml);
Your other option is to parse the JSON/XML/etc. and, using createElement() etc., build new DOM bits and plug them into your page. I don't have the jQuery chops to write this for you; I look it up every time I need to do something like it.
If the query text is only ever display-only, and you never try to dig around in it on the client side, just use the innerHTML method, whether you pass HTML or pass JSON and generate HTML from it. If the query text is important to the inner workings of the client, then you'll need to write a bunch of DOM manipulation code.
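As a rough sketch of that second option (building DOM nodes from a JSON payload instead of assigning raw HTML), assuming the server replies with something like {"sql": "SELECT ..."}; the URL, parameters and field name are placeholders:

// Callback for the AJAX request: build a node from the JSON payload
// and attach it, rather than assigning an HTML string.
function updateSQL(payload) {
    var pre = $('<pre/>').text(payload.sql);   // .text() escapes the content safely
    $('#generatedSQL').empty().append(pre);
}

$.getJSON('/build-query', { column: 'status', value: 'open' }, updateSQL);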
Not sure what you would be using, but if the user clicks on something, use the click event of the element to make an Ajax call to a script that brings back the data/content, and on readyState just update the innerHTML of the container div.
Hope the following link will help you understand the concept; it will give you some code to start with as well.
http://www.w3schools.com/php/php_ajax_database.asp

create html elements on the serverside VS get data as JSON and create tags with javascript

I want to create an AJAX search to find and list topics in a forum (just the topic link and subject).
The question is: which one of these methods is better and faster?
1. GET the thread list as a JSON string and convert it to an object, then loop over the items, create an <li/> or <tr>, write the data (link, subject) and append it to the thread list (jQuery powered).
2. GET the thread list already wrapped in HTML tags and print it (or use innerHTML and $(e).html()).
Thanks...
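For reference, a minimal sketch of the first method, assuming the server returns a JSON array like [{"url": "...", "subject": "..."}]; the endpoint, selectors and field names are placeholders:

// Method 1: fetch JSON and build <li> elements on the client.
var query = $('#searchBox').val();   // placeholder search input
$.getJSON('/forum/search', { q: query }, function (threads) {
    var $list = $('#threadList').empty();
    $.each(threads, function (i, t) {
        $('<li/>')
            .append($('<a/>').attr('href', t.url).text(t.subject))
            .appendTo($list);
    });
});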
I prefer the second method.
I figure that server-side you have to convert your data to either JSON or HTML format, so why not go directly to the one the browser understands and avoid having to reprocess it client-side. Also, you can easily adapt the second method to degrade gracefully for users who have disabled JavaScript (so that they still see the results via standard non-JS links).
I'm not sure which way is better (I assume the second method, as it would seem to touch the data less), but a definitive way to find out is to try both ways and measure which one does better.
'Faster' is probably the second method.
'Better' is probably subjective.
For example, I've been in situations (as a front-end dev) where I couldn't alter the HTML the server was returning, and I wished they had just delivered a JSON object so I could design the page how I wanted.
Also (perhaps not specific to your use case), serving up all the HTML on initial page load could increase the page size and load time.
Server-generated HTML is certainly faster if the JavaScript takes a long time to process the JSON and populate the HTML.
However, for maintainability, JS is better. You can change the HTML generation just by changing the JS, without having to update server-side code, make a delta release, etc.
Best is to measure how slow it really is. Sometimes we think something is slow, but then you try it out in the real world and don't see a big difference. You might find that the major delay is in transmitting the JSON object. That delay will still be there, and in fact increase, if you send an HTML representation from the server.
So, if your bottleneck really is parsing the JSON and generating HTML, not the transmission from the server, then sending HTML from the server makes sense.
However, you can do a lot of optimization in producing the HTML and parsing the JSON. There are many tricks to make that faster. Best if you show me the code; I can help you make a fast JS-based implementation or tell you to do it on the server.
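One crude way to measure that parse-and-render cost, assuming the rendering lives in a function such as renderThreads() (a placeholder name):

// Time the client-side render step with the console timer API.
console.time('render');
renderThreads(jsonPayload);   // placeholder: whatever builds the list from JSON
console.timeEnd('render');    // logs something like "render: 12ms"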

Where to render Ajax search results in an object oriented approach using Coldfusion?

I'm updating a ColdFusion 8/MySQL site with a fairly complex search from "spaghetti to object" (= separate view, controller and process - no framework, everything handled by jQuery Mobile).
I need to run the search query through jQuery AJAX and am now posting the search form to my searchProcess.cfc, which does the database queries.
Question:
I'm not sure where to render the results?
The results will be fairly complex (database with a few million records, rows of 40 fields) and should end up in a single result layout or a multiple result layout file.
I was thinking of constructing the files inside the cfc and handing them back via cfsavecontent, but I'm reading everywhere this is a no-no...
What are the alternatives then?
I could set up a template_single.cfm and a template_multi.cfm, pass back pure search results as the AJAX response, and then fire another AJAX call from the success handler to call the template and render the output of this second call. This seems awfully complicated, plus I don't see where I can fit my pagination in there without passing around large datasets.
So I'm looking for some advice on how to handle search results in an object-oriented way.
Thanks for input!
EDIT:
After a few more hours of googling, I'm currently looking at the following option:
1.) Run a single database query to return paginated results - as per here
2.) Send data with 0-25 records back to AJAX in JSON
3.) Try to use a cf/js template in a loop (length 1 or length 25) - as per here
This would mean data transfer for only 1-25 raw records in JSON. If I render in the success handler, I don't have to make an HTTP request for another template.
Does this approach make sense?
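Roughly, a minimal sketch of what I mean by steps 2-3, assuming searchProcess.cfc returns a JSON array of records; renderSingle() and renderRow() are placeholder client-side template functions:

// Success handler: choose the single or multiple layout based on the result count.
var term = $('#searchTerm').val();   // placeholder search field
$.getJSON('searchProcess.cfc?method=search&returnformat=json', { term: term, page: 1 }, function (records) {
    var html = (records.length === 1)
        ? renderSingle(records[0])               // placeholder single-result template
        : $.map(records, renderRow).join('');    // placeholder multi-result row template
    $('#results').html(html);
});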
First off, I see nothing wrong with putting display logic in a .cfc that is specifically for display logic. I know it's not strict MVC, but depending on what you're doing, it can work just fine. What I don't like is .cfc's that do any sort of output within the function. I always hand back any data from the function.
/end opinion
For this particular problem, I second the EDIT idea of setting up the view to be almost all HTML/jQuery with AJAX calls for paginated recordsets. As far as the single/multiple results go, I'd use separate jQuery functions depending on which one you need. The nice thing about this is that you could have the multiple-recordset display call the single-record display to view a single record (while still retaining the full recordset in the DOM).
In both cases, I highly recommend getting a JavaScript dump function (it helps so much in visualizing the DOM data).
PS. If anybody finds any newer/better JS Dump functions that work like cfdump, please, please, please let me know!
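In the meantime, a quick stand-in for simple cases (a console helper, not a full cfdump-style HTML table):

// Pretty-print any plain JS object/array to the console for inspection.
function dump(obj, label) {
    console.log((label || 'dump') + ':\n' + JSON.stringify(obj, null, 2));
}

dump(records, 'search results');   // placeholder variable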

Load up entire table again or add row manually?

This is an Ajax question. I have a table that shows user information at certain times, depending on what settings the user sets.
Now in some cases a user will see the information right away, and in some cases they won't; it all depends on when they want to see the information.
Now what should I do?
Should I do a POST of their data and then do an AJAX GET afterwards to fetch the table and render it?
*I could probably do it all in the POST, but unless there's some huge performance gain I'd rather not; otherwise I have to mix "success/fail" messages and the table to be rendered all in the same response.
So each one seems to have pluses and minuses.
Ajax way
- Don't have to worry about having a JavaScript solution that queries the database to figure out what their timezone is and then determines whether the row should be added or not, plus any other headaches that come with JavaScript dates.
- Each row could potentially have a different style too. This would mean I would possibly have to do a query to the database to figure it out, or have a hidden field in the page for easy access. With the Ajax way I would not have to worry about it.
- Don't have to worry about making a manual row in JavaScript/jQuery syntax, which can be a pain to do if you have many columns.
Javascript way
- Probably less of a performance hit, since I only have to potentially make one new row or do nothing, whereas otherwise I have to generate a new table regardless, and if it has lots of rows in it that could be slow.
- Have to rebind all the jQuery plugins that would be on the table, or use jquery.live for everything else.
So I am not sure; to me it seems like a hard choice.
Unless I misunderstand what you want to do, why not do both in one solution?
Return a JSON response: when a user logs in, you post the information using an Ajax call and just return the data from the database. I tend to return either data or an error message, but you could have two objects in your JSON string, one for a possible error message and the other for the data being returned.
The JavaScript can then process the data as needed.
So you do both; it isn't an either/or decision.
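A small sketch of handling that kind of response, assuming the server replies with something like {"error": null, "rows": [...]}; the URL, field names and columns are placeholders:

// Post the user's settings; show the error if there is one, otherwise append the new rows.
var settings = { timezone: 'UTC' };   // placeholder for whatever the user configured
$.post('/user/table', settings, function (resp) {
    if (resp.error) {
        $('#message').text(resp.error);
        return;
    }
    $.each(resp.rows, function (i, row) {
        $('<tr/>')
            .append($('<td/>').text(row.name))   // placeholder columns
            .append($('<td/>').text(row.time))
            .appendTo('#userTable tbody');
    });
}, 'json');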
