I have a search service with Apache Solr at the backend; the client makes an AJAX call to the web service, which fetches data through Solr and returns the result. I am looking for ways to make this faster than it is right now. The auto-suggest offers categories as suggestions, of which there are 120,000 in total (text values, i.e. names).
Efforts
I made a txt file (on the server) and, upon search, it loads all the categories into a JavaScript variable, in which I use .indexOf() to find the position of the substring the user entered, sort by those indexes, and provide the auto-suggestions.
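Roughly, that lookup is a linear scan along these lines (a sketch; the variable names are just illustrative, with `categories` being the array loaded from the txt file):

    function suggest(term) {
        var needle = term.toLowerCase();
        var matches = [];
        for (var i = 0; i < categories.length; i++) {
            var pos = categories[i].toLowerCase().indexOf(needle);
            if (pos !== -1) {
                matches.push({ name: categories[i], pos: pos });
            }
        }
        // Rank names that start with the term ahead of mid-string hits.
        matches.sort(function (a, b) { return a.pos - b.pos; });
        return matches.slice(0, 10).map(function (m) { return m.name; });
    }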
I have thought about making a JSON file and implementing something similar to what I have done above.
I also made a drop-down that auto-filters its content as the user types, but I have read a lot that this approach is not really recommended.
The txt file is about 2MB, and I also have to take care of 2G and other low-bandwidth users. My main aim is to avoid hitting the DB again and again on every key press.
To make the response fast, I suggest caching the indexes in memory; 120,000 categories is not that much to index. You can also try the Redis autocomplete pattern. It is easy to implement and has a very fast response time.
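A minimal sketch of the Redis autocomplete idea, assuming a Node backend with the ioredis client (the key name and chunk size are illustrative): load every name into a sorted set with score 0, then answer each keystroke with a lexicographic range query (ZRANGEBYLEX).

    const Redis = require("ioredis");   // assumed client library
    const redis = new Redis();

    // One-time load: with every score 0, ZRANGEBYLEX orders members lexically.
    async function loadCategories(categories) {
        for (let i = 0; i < categories.length; i += 1000) {   // chunked so one ZADD isn't huge
            const args = [];
            categories.slice(i, i + 1000).forEach(name => args.push(0, name.toLowerCase()));
            await redis.zadd("autocomplete:categories", ...args);
        }
    }

    // Per keystroke: up to 10 names starting with the typed prefix.
    function suggest(prefix) {
        const p = prefix.toLowerCase();
        return redis.zrangebylex("autocomplete:categories", "[" + p, "[" + p + "\xff", "LIMIT", 0, 10);
    }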
Related
I'm inexperienced with JavaScript and don't know how to optimize things for my situation.
I've written an autocomplete function using the jQuery UI Autocomplete plugin. The source for the completion is a JSON array, holding a few thousand items, that I load from the same server. This autocomplete will be attached to a search box that will be on every page of my site, so it'll get populated a lot; I don't want to request the same array every time anyone hits any page. The completion depends on database values, so I can't just put the array in static form in the code. However, it doesn't have to be perfectly synced; caching it for some amount of time would be fine.
Right now, I'm loading the array with $.getJSON. It seems that using an actual remote source is meant to be an AJAX thing where the server does the actual search itself as you type; I think this is probably overkill given that there are only a few thousand, rather than millions, of items, and I don't want to fire a zillion requests every time someone types into the search box.
What is the correct way of handling this? I'm totally unfamiliar with how caching would work in JS, or if there's some built-in way to accomplish a similar thing.
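For concreteness, the naive shape I've been picturing is a localStorage cache with a TTL, roughly like this (the endpoint and key names are made up):

    var CACHE_KEY = "autocompleteTerms";
    var ONE_HOUR = 60 * 60 * 1000;

    function loadTerms(callback) {
        var cached = localStorage.getItem(CACHE_KEY);
        if (cached) {
            var entry = JSON.parse(cached);
            if (Date.now() - entry.savedAt < ONE_HOUR) {
                return callback(entry.terms);   // fresh enough, skip the request
            }
        }
        $.getJSON("/autocomplete-terms.json", function (terms) {
            localStorage.setItem(CACHE_KEY, JSON.stringify({ savedAt: Date.now(), terms: terms }));
            callback(terms);
        });
    }

    loadTerms(function (terms) {
        $("#search").autocomplete({ source: terms });   // jQuery UI filters the array locally
    });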
I am working on a data visualisation app that will allow the user to filter the data they see by various criteria.
I want to keep as much logic as possible on the Python/Django side, like this:
1. Data is passed from the Django view to the template.
2. On the frontend, the user filters the data through various controls: dropdowns, sliders etc.
3. The control inputs are sent back to the Django view (via an AJAX POST request?), which returns the filtered data and sends it back to the template.
4. The template - the visualization - is updated with the filtered data.
Is this a good approach? My concern is that a lot of data will be flying around and the app might be unresponsive.
Another, possibly faster idea is to filter the data on the client side in JavaScript, but I would really like to leverage the great Python data-munging libraries instead.
If you want to use a DRF API, then go with it. A lot of websites have filtering features. I'd suggest taking a look at the django_filter package; it's possible to integrate it with DRF.
The worst thing about filtering data on the client side is that you can't use pagination. Imagine that you have 500+ objects to filter; a JavaScript filtering function is what will really make your app slow.
At the same time, if you have 20-30 objects to filter and this number won't grow, then you can go with JS only and a single getAll() endpoint.
The common approach is to set up a JavaScript on_change handler and construct GET requests like this (example from a real project):
https://yourbackend.com/api/product/?status=not_published,published,inactive&search=132&moderation_status=declined,on_moderation,not_ready&ordering=desc&price_max=1000&page=1
DRF + django_filters will work just fine for that, with a minimum of your own code involved.
A well-known pitfall on the JS side is making requests without a timeout, e.g. the user types text and a request is sent on every keyup() event, or they move a slider and a flood of requests goes out. You need to send the request only once the user has stopped, e.g. 300ms after they settle on a value (debouncing). See this question for reference.
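That debounce looks roughly like this with plain jQuery (the endpoint, field IDs and renderResults() are illustrative):

    var debounceTimer;

    function scheduleFilterRequest() {
        clearTimeout(debounceTimer);                  // reset the timer on every change
        debounceTimer = setTimeout(sendFilterRequest, 300);
    }

    function sendFilterRequest() {
        $.getJSON("/api/product/", {
            search: $("#search").val(),
            status: $("#status").val(),
            ordering: "desc",
            page: 1
        }, function (data) {
            renderResults(data);                      // whatever redraws your visualisation
        });
    }

    $("#search").on("keyup", scheduleFilterRequest);
    $("#status").on("change", scheduleFilterRequest);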
Sure, there's one more point: your database has to be normalised and have proper indexes. But you only need to look at this side of things if you end up with really slow SQL queries.
Summing up: I'd choose a thin JS layer and do most of the work on the backend.
I guess I have a noob question here.
I have a huge JSON file (30MB) that I would like to parse with a jQuery web app.
Now, ideally I would load it into local storage, regex out what I want, and show the results.
I would like it to start showing results as soon as I type (Google style), but with every attempt I've made the app just hangs.
If I reduce the JSON file to 1MB, then it works.
Does anybody know how to do that? Maybe with an example I can look at?
Thanks a lot!
I would recommend not using this method for search, because even if you manage to make the search itself quicker, it will still take a very long time to download a 30MB file.
What you could do instead is move your data from JSON into a SQL database and search it by sending AJAX calls to the server from your JavaScript code.
As a further optimization, you can send the AJAX call only after the search term exceeds a certain number of characters, probably 3.
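For example (the endpoint and showSuggestions() are illustrative):

    $("#search").on("keyup", function () {
        var term = $(this).val().trim();
        if (term.length < 3) {
            return;                                   // too short, don't hit the server yet
        }
        $.getJSON("/search", { q: term }, function (results) {
            showSuggestions(results);                 // your existing rendering code
        });
    });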
I have a website that contains graphs which display employee activity records. There are tiers of data (i.e. region -> state -> office -> manager -> employee -> activity record) and each time you click the graph it drills down a level to display more specific information. The highest level (region) requires me to load ~1,000 objects into an array and the lowest level is ~500,000 objects. I am populating the graphs from a JSON-formatted text file using:
    var largeArray = [];
    $.ajax({url: 'data/jsondata.txt', dataType: 'json',
        success: function (data) {
            // cache every employee record client-side in one go
            largeArray = data.employeeRecords;
        }
    });
Is there an alternative method I could use without hindering response time/performance? I am caught up in the thought that I must pre-load all of the data client side, otherwise there will be lag time if I need to fetch it on a user click. If anyone can point me to best practices, and maybe even explain what is considered "TOO MUCH" client-side data, I'd appreciate it.
FYI, I'm restricted to using an old web server, and if I want to do anything server side I'd be limited to classic ASP; otherwise it has to be client side. Thank you!
If your server responds quickly
In this case, you can probably simply load data on demand when a user clicks. The server is quick, so why bother trying to be smarter for no gain.
If the server is quick, but not quick enough, then you might be able to preload the next level while drawing the first. E.g. if you have just rendered the graph at the "office" level, silently preload the "manager" level data while the user is still reacting to the screen update, as in the sketch below.
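Something along these lines could work; the endpoint URL and the nextLevel() helper are illustrative:

    var preloaded = {};   // levels fetched ahead of the user's click

    function loadLevel(level, parentId, done) {
        var key = level + ":" + parentId;
        if (preloaded[key]) {
            return done(preloaded[key]);      // already fetched in the background
        }
        $.getJSON("data/level.asp", { level: level, parent: parentId }, function (data) {
            preloaded[key] = data;
            done(data);
        });
    }

    // After drawing the "office" level, silently warm the cache for "manager".
    function onGraphDrawn(level, visibleIds) {
        $.each(visibleIds, function (i, id) {
            loadLevel(nextLevel(level), id, function () { /* cached for the next click */ });
        });
    }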
If the server is too slow for data on demand
In this case you probably need to work out exactly where it is slow and address that. There are several things in play here, and your question doesn't really say which.
1. Is the server slow to query the database? If yes, fix it; there is little you can do client side to solve this.
2. Is the server slow to package the data for transmission? Harder to fix; is the server big enough?
3. Is network transmission slow? Hmmm, then you need to send less data or get users onto faster bandwidth.
4. Is browser unpack time slow (i.e. the delay decoding the data before your script can chart it)? Change how you package the data, or send less data, e.g. in chunks.
5. Can browsers handle 500,000 objects? You should be able to just monitor memory usage in the browser you are using, and there are opinions both ways on this. It will really depend on your target users' browsers/hardware.
You might like to look at this question, What is the most efficient way of sending data for a very large playlist over http?, as it shows an alternative way of sending and handling data which I've found to be much quicker for step 4 above. Of course, at 500k objects you will no longer be able to use localStorage, but I've been experimenting with downloading millions of array elements and it works OK (still WIP). I don't use jQuery, so I'm not sure how usable this is for you either.
Best practice? Sorry, I cannot help with that part of the question.
I would like to keep the contents of large UI lists cached on the client, updated according to some criteria or at regular intervals. Client-side code can then fill the dropdowns locally, avoiding long page download times.
These lists can be close to 4k items, and dynamically filtering them without caching would result in several rather large round trips.
How can I go about this? I mean, what patterns and strategies would be suitable for this?
Aggressive caching of JSON would work for this: you just hash the JS file and append the hash to its URL, so the URL updates whenever the file changes. One revision might look like this:
/media/js/ac.js?1234ABCD
And when the file changes, the hash changes.
/media/js/ac.js?4321DCBA
This way, when a client loads the page, your server-side code links to the hashed URL, and the client will get a 304 Not Modified response on their next page load (assuming you have this enabled on your server). If you use this method you should set the files to never expire, as the "expiring" part is handled by the hash: when the JS file does change, the hash changes too, the client no longer gets a 304, but rather a 200.
ac.js might contain a list or other iterable that your autocomplete code can use as its completion pool, and you'd access it just like any other JS variable.
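For example, ac.js could be nothing more than a generated array, and the page consumes it directly (the names here are illustrative):

    // ac.js -- regenerated server-side whenever the underlying data changes
    var completionPool = ["accounting", "engineering", "human resources" /* ...and so on */];

    // Any page that includes ac.js can then use the pool like any other variable:
    $("#search").autocomplete({ source: completionPool });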
Practically speaking, though, this shouldn't be necessary for most projects. Using something like memcached server-side plus gzip compression will make the file both small and amazingly fast to load. Only if the list is HUGE (say, thousands upon thousands of items) might you want to consider this.
Combres is a good solution for this: it tracks changes and has the browser cache the JS forever, until a change is made, in which case it changes the URL of the item.
http://combres.codeplex.com/
Rather than storing the data locally, you might consider using jQuery and AJAX to dynamically update the dropdown lists. Calls can be made whenever needed, and the downloads would be pretty quick.
Just a thought.
This might be helpful:
http://think2loud.com/using-jquery-and-xml-to-populate-a-drop-down-box/
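In its simplest form that's just something like this (the URL and element IDs are illustrative):

    $.getJSON("/lists/offices.json", function (items) {
        var $dropdown = $("#office").empty();
        $.each(items, function (i, item) {
            $dropdown.append($("<option>").val(item.id).text(item.name));
        });
    });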
If it's just textual data, you have compression enabled on the web server, and there are fewer than 100 items, then there may be no need to maintain lists in the client script at all.
It's usually best to keep all your data (list items are data) in one place so you don't have to worry about synchronization.