How to performance tune AJAX heavy application - javascript

I'm developing a kind of image/profile search application that is based almost exclusively on AJAX. The main page displays profile images and lets the user filter/search and paginate through them.
Pagination happens as the user scrolls, so the interface has to be very fast. Only 6 (maybe 9, but definitely not more) images are displayed on the main page, so users will scroll a lot. I'm currently using a very simple JS cache to store the results of all requests in case the user decides to go back; in that case, I simply pull everything out of the cache instead of querying the server.
Client cache
One option I thought of is to pre-load, say, 10 pages ahead and store them in the cache.
But my biggest issue is filtering/searching, since that completely changes the type of query that goes to the server. My filters aren't very complex, only around 6-7 string/number/enum attributes.
Now if I wanted to do all the filtering in the cache, I would have to duplicate all the search logic and fetch all the data from the server (not just the data I'm displaying) so I could filter the results on the client side.
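Sketched out, the client-side filtering I have in mind looks something like this (the attribute names `country` and `age` are just examples, not my real filters):

```javascript
// Keep only the profiles that match every active filter attribute.
function filterProfiles(profiles, filters) {
  return profiles.filter(function (profile) {
    return Object.keys(filters).every(function (key) {
      return profile[key] === filters[key];
    });
  });
}

// Example with illustrative data:
var profiles = [
  { name: 'A', country: 'US', age: 25 },
  { name: 'B', country: 'DE', age: 25 },
  { name: 'C', country: 'US', age: 30 }
];
var result = filterProfiles(profiles, { country: 'US', age: 25 });
// result contains only profile 'A'
```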
This raises a question: should I make the cache somehow persistent? Store it in a cookie, maybe?
Server cache?
One suggestion might be to use memcached on the server and just store everything there. I'm definitely going to cache all the results I can, but that doesn't save the server from handling loads and loads of AJAX requests.
I'm developing this application on Rails 3, and even though I love it, I wouldn't say it's the fastest thing in the world. Another option this gives me is to create a separate Rack/Sinatra application to handle only the AJAX requests. By this I mean requests for the main query, not all AJAX requests.
S3 for images?
A big part of this application is images, even though they're mostly small thumbnails (unless the user wants to view one at full size).
At the moment, I don't have problems with bandwidth. My VPS host provides me with 200 GB, which should be more than enough (I hope). The problem is loading speed. Would it help if I uploaded all the images to S3 and loaded them from there, or is this worth doing only for larger files? I'm going to load a lot of 100x150 px images, which are generally under 50 kB.

Have you looked at SlickGrid? It has an interesting approach: it builds the list only as users scroll down, then removes rows again as they scroll out of range.
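The windowing idea behind that approach can be sketched like this (a fixed row height is assumed, and the numbers are illustrative): given the scroll offset, compute which rows should exist in the DOM, and remove everything outside that range.

```javascript
// Compute the range of rows that should be rendered for the current
// scroll position; rows outside [first, last] can be removed.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows) {
  var first = Math.floor(scrollTop / rowHeight);
  var last = Math.min(totalRows - 1,
                      Math.ceil((scrollTop + viewportHeight) / rowHeight));
  return { first: first, last: last };
}

// Example: a 600px viewport over 100px rows, scrolled 250px down.
var range = visibleRange(250, 600, 100, 1000);
// rows 2 through 9 should be in the DOM
```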


Using API more efficiently

I am trying to load my playlist from Spotify onto my website using their API, requesting the data with AJAX, but that data gets reloaded every single time I refresh the page, which hurts both speed and UX. Is there any solution to this problem?
P.S. A more complex problem: I was going to request the data periodically and, if the playlist changed (for example, I added or removed songs via Spotify), have a server-side Python application I wrote modify the HTML files stored on the server. The purpose is not only to free the browser from requesting the data, but also to get the chance to sort it by publishing date, rating, or artist for a better client-side layout. So my questions are:
Is it safe to let a server-side application modify the HTML files automatically (that is, really save the changes to the server-side HTML files)? It is a very long process, and I don't want to either leave that annoying work to the client side, which bothers the users, or update my website manually, which bothers me.
Are those things even possible?
Try using HTML5 localStorage to store your playlist in the client's browser along with a timestamp. Once the timestamp is a certain age, any loaded page can reload your playlist if necessary.
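A minimal sketch of that idea, assuming any Storage-like object with `getItem`/`setItem` (`window.localStorage` in the browser; keeping it injectable also makes the logic easy to test):

```javascript
// Save data together with the time it was cached.
function cacheSet(storage, key, data) {
  storage.setItem(key, JSON.stringify({ savedAt: Date.now(), data: data }));
}

// Return the cached data, or null when missing or older than maxAgeMs.
function cacheGet(storage, key, maxAgeMs) {
  var raw = storage.getItem(key);
  if (!raw) return null;
  var entry = JSON.parse(raw);
  if (Date.now() - entry.savedAt > maxAgeMs) return null; // stale: refetch
  return entry.data;
}

// Usage sketch in the browser: serve the playlist from cache for up to
// an hour, and only hit the Spotify API on a miss.
// var playlist = cacheGet(localStorage, 'playlist', 3600 * 1000);
// if (!playlist) { /* fetch via the API, then cacheSet(localStorage, 'playlist', data) */ }
```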

Is it possible to generate a text or xml file from local storage data and save it in a folder?

I'm working on a website that is going to be offline. All the HTML files will be in a folder stored on the hard disk. I've managed to do 90% of the work, but I have no idea how to do the last part. Here it is:
I have stored a list of products in localStorage as various strings under keys: when the user buys items, they go to the cart, and the cart objects live in localStorage. I created a page that shows a list of all the products in localStorage; it can be emptied if the user clears it. Now I need to create a page where all the objects that were selected before show up as a list, regardless of localStorage being cleared. You can think of it as a page that lists products ordered in the past: even after the cart is cleared, the products still show on the past-orders page.
I do not know any server-side code; I did everything using JavaScript, as it was supposed to be a simple project, but I'm stuck at this part. So I cannot use PHP or anything else to generate files, or a database to store stuff.
Here's what I thought of; I don't think it works, but I wanted to confirm whether it does:
Generate an XML file or a .txt file, store it on the drive, and then clear localStorage. But I don't think that is possible. If it's possible using just JavaScript, please point me in the right direction and I'll research and come up with something.
P.S. The website will be entirely offline; what I mean is that the users will never connect to the internet for this to work. There also won't be a server or localhost.
Thank you!
The site is completely offline, but the functionality is similar to an eCommerce site. You click a button, some content from the site is stored in localStorage, and I have to use it on multiple pages; when the user clicks another button, localStorage is cleared, but whatever was selected before must still be available without localStorage. Think of a quiz site where you answer everything: when you take a new quiz, the old scores are stored somewhere else and don't change when you take a new test.
Is it possible to attain this functionality without a server-side script? It seems the final target users won't know how to install any software that can provide a localhost or a server or anything like that.
On the client side, browsers' JavaScript runtimes don't have local file system access, except for a very limited form when uploading files to a remote server, and even then browsers won't give you direct access to the file.
You'll need some kind of server-side logic behind the scenes if you want full I/O in the local file system.
Perhaps you can take a look at NodeJS, a server-side JavaScript runtime that can work as a lightweight web server for both small and large projects, and that has built-in server-side JavaScript I/O functions.
Otherwise, you're absolutely stuck in the client-side Web browser's sandbox limitations.
You can refer to the documentation for knockoutjs and NodeJS; that would probably help. As far as my knowledge goes, NodeJS does contain a way to handle your problem.

jQuery post big text data transfer (eventual load)

The problem:
I have a jQuery AJAX (post) based website, where the page doesn't refresh every time the user navigates. This means I have to pull data with AJAX and present it to the user. For pulling small amounts of text data, this system works great. However, once the text data is huge (say, over 200,000 words), the load time is quite high (especially for mobile users). What I mean is, AJAX loads the full text and displays it only after it has finished loading all of it, so the user has to wait quite a bit to get the information.
Look at a different scenario, say Wikipedia. There are big pages on Wikipedia, yet a user doesn't feel they have to wait long, because the page loads step by step (top to bottom). So even if the page is large, the user is already kept busy with some information, and while they process it, the rest of the page keeps loading.
Question:
So is it possible to display, via ajax, information on real time load? Meaning keep showing whatever is loaded and not wait for the full document to be loaded?
Ajax (XMLHttpRequest) is a really great feature; for non-persistent connections, Ajax works well, but as soon as the connection needs to be persistent (impossible with XMLHttpRequest), a socket is fastest.
The simplest way to use web sockets is socket.io, but you need a JavaScript server to use this library; Heroku is one host where you can get one for free, with admin tools.
You can use a PHP server with the socketme library if you don't want a JavaScript server, but it is a bit more complex.
Also, you can think about it differently: you are trying to transfer a lot of data.
200,000 words is something like 70 kB (I tried a lorem ipsum); the transfer time depends on the data size and the connection speed/ping. You can compress the data before sending it and decompress it on the receiving side. There are probably a thousand ways to do this, but I think the simplest is to find a JavaScript library to compress/decompress data and simply use your jQuery AJAX call to transfer the compressed payload.
EDIT — 21/03/14:
I misunderstood the question; you want to display the current loading progress?
Yes, it is possible using the onprogress event; in jQuery you can follow this simple example: jQuery ajax progress
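The onprogress approach can be sketched like this, using jQuery's `xhr` option to attach the handler to the underlying XMLHttpRequest (the URL and the `#progress` selector are illustrative):

```javascript
// Percentage helper, kept separate so it is easy to test.
function progressPercent(loaded, total) {
  return Math.round((loaded / total) * 100);
}

// Wiring sketch for jQuery's $.ajax; updates a progress indicator as
// the response body streams in.
function loadWithProgress(url) {
  return $.ajax({
    url: url,
    xhr: function () {
      var xhr = new window.XMLHttpRequest();
      xhr.addEventListener('progress', function (evt) {
        if (evt.lengthComputable) {
          $('#progress').text(progressPercent(evt.loaded, evt.total) + '%');
        }
      });
      return xhr;
    }
  });
}
```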

patterns for building Web/Mobile apps that process a lot of data on the client side

I'm trying to build a single-page web app using Backbone. The app looks and behaves like a mobile app running on a tablet.
The web app is built to help event organizers manage their lists of people attending their events, and this includes the ability to search and filter those lists of attendees.
I load the whole attendee list when the user opens the attendees screen, and whenever the user starts to search or filter the attendees, the operation happens on the client side.
This approach works perfectly when the event has about ~400 attendees or fewer, but when the number of attendees gets bigger than that (~1000), the initial download takes longer (which makes sense); after all the data is loaded, though, searching and filtering are still relatively fast.
I originally decided to fully load all the data each time the app is loaded, to do all search operations on the client side, save my servers the headache, and make search results show up faster for the user.
I don't know if this is the best way to build a web/mobile app that processes a lot of data.
I wish there were a known pattern for dealing with these kinds of apps.
In my opinion your approach to process the data on the client side makes sense.
But what do you mean with "fully loading all the data each time the app is loaded"?
You could load the data only once at the beginning and then work with this data throughout the app lifecycle without reloading this data every time.
What you could also do is store the data you initially fetched in HTML5 localStorage. Then you only have to refetch the data from the server if something changed. This should reduce your startup time.
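A sketch of that refetch-only-if-changed idea, assuming the server can cheaply report a version stamp for the attendee list (the `'attendees'` key and the injected `fetchAll` function are illustrative; storage is injected so the logic works with `window.localStorage` or any stand-in):

```javascript
// Return the attendee list from cache when the stored version matches
// the server's current version; otherwise refetch and re-cache.
function loadAttendees(storage, serverVersion, fetchAll) {
  var raw = storage.getItem('attendees');
  if (raw) {
    var entry = JSON.parse(raw);
    if (entry.version === serverVersion) return entry.data; // cache hit
  }
  var data = fetchAll(); // only hit the server when stale or empty
  storage.setItem('attendees',
                  JSON.stringify({ version: serverVersion, data: data }));
  return data;
}
```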

How to handle large data sets for server-side simulation --> client browser

Sorry for the somewhat confusing title; I'm not sure how else to title this. My situation is this: I have an academic simulation tool that I am in the process of developing a web front-end for. While the C++-based simulator is computationally quite efficient (several hundredths to a tenth of a second of runtime) for small systems, it can generate a significant (in web app terms) amount of data (~4-6 MB).
Currently the setup is as follows:
User accesses index.html file. This page on the left side has an interactive form where the user can input simulation parameters. On the right side is a representation of the system they are creating, along with some greyed out tabs for various plots of the simulation data.
User clicks "Run simulation." This submits the requested sim parameters to a runSimulation.php file via an AJAX call. runSimulation.php creates an input file based on the submitted data, then runs the simulator using this input file. The simulator spits out 4-6mb of data in various output files.
Once the simulation is done running, the response to the browser is another javascript function which calls a file returnData.php. This php script packages the data in the output files as JSON data, returns the JSON data to the browser, then deletes the data files.
This response data is then fed to a few plotting objects in the browser's javascript, and the plot tabs become active. The user can then open and interact with the plotted data.
This setup is working OK, however I am running into two issues:
The return data is slow: 4-6 MB of data coming back can take a while to load. (That data is being gzipped, which reduces its size considerably, but it can still take 20+ seconds on a slower connection.)
The next goal is to allow the user to plot multiple simulation runs so that they can compare the results.
My thought is that I might want to keep the data files on the server while the user's session is active. This would make it possible to load only the data for the plot the user wants to view (and perhaps load other data in the background as they view the results of the current plot). For multiple runs, I can have multiple data sets sitting on the server, ready for the user to download if/when they are needed.
However, I have a big issue with this line of thinking: how do I recognize (in PHP) that the user has left the site, so I can delete the data? I don't want users to eat up the drive space on the machine. Any thoughts on best practices for this kind of web app?
For problem #1, you don't really have many options. You are already gzipping the data and using JSON, which is a relatively lightweight format. 4-6 MB of data is indeed a lot. By the way, if you think PHP is taking too long to generate the data, you can have your C++ program generate it and serve it with PHP; you can use exec() for that.
However, I am not sure how your simulations work, but JavaScript is a Turing-complete language, so you could possibly generate some/most/all of this data on the client side (whatever makes more sense). In that case, you would save lots of bandwidth and decrease loading times significantly, but mind that JS can be really slow.
For problem #2, if you leave data on the server you'll need to keep track of active sessions (i.e., when the user last interacted with the server) and set a timeout that makes sense for your application. After the timeout, you can delete the data.
To keep track of interaction, you can use JS to check if a user is active (by sending heartbeats or something like that).
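A sketch of that heartbeat idea: the client pings the server regularly, and a server-side cleanup job deletes data for sessions whose last heartbeat is too old. The `/heartbeat.php` URL is illustrative, and jQuery is assumed, as elsewhere in the question; the staleness check is shown in JS, but the same logic applies in a PHP cron job.

```javascript
// Client side: periodically send a fire-and-forget activity signal.
function startHeartbeat(url, intervalMs) {
  return setInterval(function () {
    $.post(url, { ping: 1 });
  }, intervalMs);
}

// Server-side decision: a session is stale once its last heartbeat is
// older than the timeout, and its data files can then be deleted.
function isStale(lastHeartbeatMs, nowMs, timeoutMs) {
  return nowMs - lastHeartbeatMs > timeoutMs;
}

// Usage sketch in the browser:
// startHeartbeat('/heartbeat.php', 30 * 1000); // ping every 30 seconds
```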
