I don't want to turn off caching for the whole site, but I do want to avoid caching in some areas, and I'm wondering about the best way to do that.
First, I pull data from an API to check the "online" status of an advisor.
Next, I store that (and any other data that may change) in a CPT.
At the same time, I store a random string of characters (that I can sort on later, giving the appearance of random order).
I pull the data from the API on every page load, because I need real-time data. This makes me cringe, but I don't know any other way. This part isn't cached.
However, when I display the list of "advisors", I sort them by online status, then by the random string. This is meant to be fair about who appears above the fold and near the beginning of the results.
Well, that is all generated with PHP, so the resulting HTML is cached.
I read a bit about the WP REST API, and perhaps that will help with the speed of the query, but that won't help with the cached HTML, right?
So, regardless of how I query the data (REST API, WP_Query), am I to assume that I must fetch and render the data with JavaScript to avoid it being cached by the server's full-page cache solution?
If I still use WP_Query and display the results with PHP, can I just call the PHP function from JavaScript?
Every page of the site will display some or all of the advisors (e.g. 8 advisors on the homepage, all of them on the "advisor" page, the "advisor" category pages, and 4 advisors in the footer of every other page), so it doesn't make sense to turn off caching.
Any direction would be greatly appreciated! Thanks in advance.
Are you not better off doing it with AJAX?
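If it helps, here's a rough sketch of what the AJAX route could look like in WordPress. The endpoint name, meta key, and template part are placeholders, and the online-status half of the sort is omitted for brevity (it would need a meta_query with named clauses):

```php
<?php
// A rough sketch of an uncached AJAX route in WordPress. Endpoint name,
// meta key, and template part are all hypothetical.

add_action('wp_ajax_get_advisors', 'render_advisors');        // logged-in users
add_action('wp_ajax_nopriv_get_advisors', 'render_advisors'); // visitors

function render_advisors() {
    $query = new WP_Query([
        'post_type'      => 'advisor',            // the CPT from the question
        'posts_per_page' => 8,
        'meta_key'       => 'random_sort_string', // hypothetical meta key
        'orderby'        => 'meta_value',
        'order'          => 'ASC',
    ]);

    ob_start();
    while ($query->have_posts()) {
        $query->the_post();
        get_template_part('parts/advisor-card');  // hypothetical template part
    }
    wp_reset_postdata();

    // Send the rendered HTML back as JSON and exit.
    wp_send_json_success(['html' => ob_get_clean()]);
}
```

The page's JavaScript then requests /wp-admin/admin-ajax.php?action=get_advisors and injects the returned HTML, so the page shell stays cacheable while the advisor list stays live. That's also the answer to "can I call the PHP function from JavaScript?": not directly, but you can expose it as an endpoint like this.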
Or, I'm pretty sure there's a line of PHP that you can add to pages so they won't get cached; WooCommerce sometimes needs this, for example. I guess it depends on which caching plugin you're using. Which one is it?
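If it's the line I'm thinking of, it's the DONOTCACHEPAGE constant, which WooCommerce and most major caching plugins (WP Super Cache, W3 Total Cache, WP Rocket) respect. A minimal sketch, with a placeholder condition for the pages you want to keep dynamic:

```php
<?php
// Define DONOTCACHEPAGE early for pages that must stay dynamic.
// The is_post_type_archive('advisor') condition is just an example.
add_action('template_redirect', function () {
    if (is_post_type_archive('advisor') && !defined('DONOTCACHEPAGE')) {
        define('DONOTCACHEPAGE', true); // tells the caching plugin to skip this page
    }
});
```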
Or, for example, WP Super Cache has an area where you can exclude certain pages from caching, under Settings -> WP Super Cache -> Advanced -> Accepted Filenames & Rejected URIs.
I'm inexperienced at JavaScript and don't know how to optimize things for my situation.
I've written an autocomplete function using the jQuery UI Autocomplete plugin. The source for the completion is a JSON array, holding a few thousand items, that I load from my own server. This autocomplete will be attached to a search box that will be on every page of my site, so it'll get populated a lot; I don't want to request the same array every time anyone hits any page. The completion depends on database values, so I can't just put the array in static form in the code. However, it doesn't have to be perfectly synced; caching it for some amount of time would be fine.
Right now, I'm loading the array with $.getJSON. It seems that using an actual remote source is meant for AJAX setups where the server does the search itself as you type; I think that's probably overkill given that there are only a few thousand items rather than millions, and I don't want to fire a zillion requests every time someone types into the search box.
What is the correct way of handling this? I'm totally unfamiliar with how caching would work in JS, or if there's some built-in way to accomplish a similar thing.
This is more of an architectural question. An external platform has product and price information for, let's say, books. There is an API available to get this information.
From what I've read, it should be possible to create a JavaScript function and attach it to the page on my own website where I want to show the data. This would mean that an API call is made for every page request. Since the requested information changes at most once a day, this doesn't sound like the most efficient solution.
Can someone advise a better solution? Something in the direction of a PHP or JavaScript function that makes the request in the background, schedules an update, and imports the data into MySQL? If so, which language would be most common?
I need the solution to work in a Joomla/PHP/MySQL environment.
Here's a simple idea: fetch and store results from the API (the ones you think aren't going to change within a day), either on disk or in the database, and later use those stored results to serve what you would otherwise have fetched from the API.
Since storing anything in frontend JS across page reloads isn't easy, you need to make use of PHP for that. Based on what's given, you seem to have two ways of calling the API:
via the frontend JS (no-go)
via your PHP backend (good-to-go)
Now, you need to make sure your results are synced every (say) 24 hours.
Add a snippet to your PHP code that keeps a $lastUpdated value (or something similar) recording when the results were last refreshed; it has to be a persisted value, not one recomputed with time() on every request. Then add a couple of statements that refresh the stored results whenever the current time is at least 24 hours past $lastUpdated, followed by updating $lastUpdated to the current time (see the sketch below).
This should give you what you need with one API call per day.
PS: I'm not an expert in PHP, but you can surely figure out the datetime stuff.
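To illustrate, a rough sketch of the idea that uses the cache file's own modification time as $lastUpdated, so nothing extra has to be stored. The API URL and cache path are placeholders:

```php
<?php
// Minimal file-based cache sketch. books_cache.json stores the API
// response; its modification time serves as $lastUpdated. Assumes
// allow_url_fopen is enabled (cURL works just as well).

function get_books() {
    $cacheFile = __DIR__ . '/cache/books_cache.json'; // hypothetical path
    $maxAge    = 24 * 60 * 60; // refresh at most once per day

    // Serve the cached copy if it exists and is fresh enough.
    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
        return json_decode(file_get_contents($cacheFile), true);
    }

    // Otherwise fetch from the API and refresh the cache.
    $json = file_get_contents('https://api.example.com/books'); // hypothetical URL
    if ($json !== false) {
        file_put_contents($cacheFile, $json);
        return json_decode($json, true);
    }

    // On failure, fall back to a stale cache if one exists.
    return file_exists($cacheFile)
        ? json_decode(file_get_contents($cacheFile), true)
        : null;
}
```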
It sounds like you need a cache, and you're not the first person to run into that problem - so you probably don't need to reinvent the wheel and build your own.
Look into something like Redis. There's an article on it available here as well: https://www.compose.com/articles/api-caching-with-redis-and-nodejs/
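The linked article uses Node.js, but the same pattern in PHP with the phpredis extension looks roughly like this (the cache key and API URL are placeholders):

```php
<?php
// A sketch of read-through API caching with Redis via the phpredis
// extension. Key name and API URL are hypothetical.

function get_books_cached(Redis $redis) {
    $key = 'books:api-response'; // hypothetical cache key

    $cached = $redis->get($key);
    if ($cached !== false) {
        return json_decode($cached, true); // cache hit
    }

    // Cache miss: call the API and store the result with a 24h TTL.
    $json = file_get_contents('https://api.example.com/books'); // placeholder
    $redis->setEx($key, 86400, $json);

    return json_decode($json, true);
}

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$books = get_books_cached($redis);
```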
I have a search service with Apache Solr at the backend; the client makes an AJAX call to the web service, which fetches data through Solr and returns the result. I am looking for ways to make this faster than it is right now. The autosuggest offers categories as suggestions, of which there are 120,000 in total (text values, i.e. names).
Efforts
I made a .txt file on the server; upon search, the page fetches the entire category list into a JavaScript variable, in which I use .indexOf() to find the position of the substring the user entered, sort the matches by index, and provide the autosuggest.
I have thought about making a JSON file and implementing something similar to the above.
I also made a drop-down that auto-filters its contents as the user types, but I have read in several places that this approach is not recommended.
The .txt file is 2 MB in size, and I also have to take care of users on 2G and other low-bandwidth connections. My main aim is to avoid hitting the DB again and again on every keypress.
To make the response fast, I suggest caching the index in memory; 120,000 categories is not that big. You can also try Redis for autocomplete. This would be easy to implement and would have a very fast response time.
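A minimal sketch of Redis-backed prefix autocomplete in PHP (phpredis): every member of a sorted set gets score 0, so ZRANGEBYLEX returns matches in lexicographic order. The key name and sample data are placeholders:

```php
<?php
// Prefix autocomplete with a Redis sorted set (phpredis extension).
// Key name and $categories are placeholders for the real data source.

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// One-time (or daily) index build from your category names.
$categories = ['science fiction', 'science history', 'self help']; // placeholder data
foreach ($categories as $name) {
    $redis->zAdd('categories:index', 0, strtolower($name));
}

// Per-keypress lookup: everything that starts with the typed prefix.
function suggest(Redis $redis, $prefix, $limit = 10) {
    $prefix  = strtolower($prefix);
    $matches = $redis->zRangeByLex(
        'categories:index',
        '[' . $prefix,          // inclusive lower bound
        '[' . $prefix . "\xff"  // upper bound: prefix plus the highest byte
    );
    return array_slice($matches, 0, $limit);
}

print_r(suggest($redis, 'sci')); // ["science fiction", "science history"]
```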
I've created a subscription-based system that deals with a large data-set. In its first iteration, it had semi-complicated joins that would execute, based on user-set filters, on every 'data view' page. Each query would fetch anywhere from a few kilobytes to several megabytes depending on the filter range. I decided this was unacceptable and so learned about APC (I had heard about its data-store features).
I moved all of the strings out of the queries into an APC preload routine that fires upon first login. In the same routine, I am running the "full set" join query to get all of the possible IDs for the data set into a $_SESSION variable. The entire set is anywhere from 100-800Kb, depending on what data the customer is subscribed to.
I convert this set into a JSON array and shuffle the data around dynamically when the user changes the filters. In creating the system I wanted it to seem as if the user was moving around lots of data very quickly, with minimal page loading (AJAX + APC when string representations are needed), as they played with the filters.
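Roughly, the preload routine looks like this (simplified; the schema, cache key, and TTL here are placeholders for what the real system uses):

```php
<?php
// Simplified version of the preload routine described above, using
// APC's data store. Schema, cache key, and TTL are placeholders.
session_start();

function preload_dataset(PDO $db, $userId) {
    $cacheKey = "dataset:$userId"; // hypothetical per-user cache key

    $ids = apc_fetch($cacheKey);
    if ($ids === false) {
        // The "full set" join: every ID this customer is subscribed to.
        $stmt = $db->prepare(
            'SELECT d.id
               FROM data d
               JOIN subscriptions s ON s.data_id = d.id
              WHERE s.user_id = ?'
        );
        $stmt->execute([$userId]);
        $ids = $stmt->fetchAll(PDO::FETCH_COLUMN);
        apc_store($cacheKey, $ids, 3600); // keep for an hour
    }

    $_SESSION['dataset_ids'] = $ids; // later JSON-encoded for the client
}
```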
My multipart question is, is it possible for the user to effectively "cancel" the initial cache/query routine by surfing to another page after the first login? If so, can I move this process to an AJAX page for preloading, or does this carry the same problem? Or, am I just going about all of this in the wrong way? I came up with the idea on my own and I'm worried that I've created an unusable monster.
Also, I've been warned that my questions suck and I'm in danger of being banned. Every question I've asked has come from a position of intelligent wonder, written as well as I knew how at the time, and so it's really aggravating when an outsider votes me down without intelligent criticism. Just tell me what I did wrong and I will quickly fix the problem. Bichis.
I would like to keep the contents of large UI lists cached on the client, updated according to some criteria or on a regular schedule. Client-side code can then just fill the dropdowns locally, avoiding long page download times.
These lists can be close to 4k items, and dynamically filtering them without caching would result in several rather large round trips.
How can I go about this? I mean, what patterns and strategies would be suitable for this?
Aggressive caching of JSON would work for this: you just hash the JS file and append the hash to its URL, so the URL updates whenever the file changes. One revision might look like this:
/media/js/ac.js?1234ABCD
And when the file changes, the hash changes.
/media/js/ac.js?4321DCBA
This way, when a client loads the page, your server-side code links to the hashed URL, and the client will get a 304 Not Modified response on their next page load (assuming you have this enabled on your server). If you use this method, you should set the files to never expire, since expiry is handled by the hash: when the JS file does change, the hash changes, and instead of a 304 the client gets a 200 with the new file.
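A sketch of the server-side half, assuming PHP (the question doesn't say what the backend is); md5_file() changes whenever the file's contents change, so the URL changes with it:

```php
<?php
// Generate a content-hashed URL for a static file. The paths are
// hypothetical; adapt them to your layout.

function versioned_url($webPath, $diskPath) {
    $hash = substr(md5_file($diskPath), 0, 8); // changes with the file's contents
    return $webPath . '?' . $hash;
}

// In your page template:
$src = versioned_url('/media/js/ac.js', __DIR__ . '/media/js/ac.js');
echo '<script src="' . htmlspecialchars($src) . '"></script>';
```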
ac.js might contain a list or other iterable that your autocomplete code can parse as its completion pool, and you'd access it just like any other JS variable.
Practically speaking, though, this shouldn't be necessary for most projects. Using something like memcached server-side plus gzip compression will make the file both small and amazingly fast to load. If the list is huge (say, hundreds of thousands of items), you might want to consider this.
Combres is a good solution for this: it will track changes and have the browser cache the JS forever until a change is made, at which point it changes the URL of the item.
http://combres.codeplex.com/
Rather than storing the data locally, you might consider using jQuery and AJAX to dynamically update the dropdown lists. Calls can be made whenever they're needed, and the downloads would be pretty quick.
Just a thought.
This might be helpful:
http://think2loud.com/using-jquery-and-xml-to-populate-a-drop-down-box/
If it's just textual data, you have compression enabled on the web server, and there are fewer than 100 items, then there may be no need to maintain lists in the client script.
It's usually best to put all your data (list items are data) in one place so you don't have to worry about synchronization.