Is it possible (and supported cross-browser) to embed an image into the XML of an AJAX response, and then load that image using JavaScript?
I have a system that does some calculations based on the number of sessions running through it. The results are then graphed, and returned in two parts:
1) XML containing information about the graph, totals, and Image map data allowing the user to click on relevant areas.
2) The graph image.
As the data can change between the two requests (and could be expensive to calculate), I'd prefer to do it in a single request (return the image with the XML). The current implementation caches the stats for a small period so that the results of multiple requests will still match. As the amount of data that needs to be cached is going to be increasing (from ~2.5K to ~1.2MB), I would like to try an alternative method.
NOTE: I do not want to use inline base64 PNG images, as they are not supported in IE.
Can you not store the image on the server and send the URL to the client?
As this seems like more work than it's worth, I've decided that a simpler solution would be:
1) Send XML data to the client with the details of what is to be graphed.
2) Client sends a request for the image, including the data to graph (similar to the Google Chart API).
This decouples the chart rendering from the data, and it can then be used in the future to generate generic charts for other data sets. The other benefit is that it doesn't require any server-side caching, since the expensive data is only calculated once and the image request carries everything needed to draw the chart.
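A minimal sketch of the second request: the client encodes the graph data into the image URL, similar to the Google Chart API. The /chart.png endpoint and parameter names here are assumptions, not part of the original setup:

```javascript
// Build an image URL that carries the data to graph in its query string.
// The server-side chart endpoint ('/chart.png') and the parameter names
// (w, h, data) are hypothetical and would need to match your server.
function buildChartUrl(values, width, height) {
  var params = [
    'w=' + width,
    'h=' + height,
    'data=' + encodeURIComponent(values.join(','))
  ];
  return '/chart.png?' + params.join('&');
}

// Usage: assign the result to an <img> element's src attribute.
var url = buildChartUrl([10, 20, 15], 400, 300);
```

Since the URL contains the full data set, the image request is self-contained and no server-side state needs to be cached between the two requests.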
I think trying to combine both sets of data in the XML would be interesting.
Have you considered using Google's Chart API?
Well, you actually can, but it's probably not going to be worth it. A vector-based method such as canvas (with the VML alternative for IE) would be a better way to render graphs; then you only have to pass the graph data to the browser.
If storing the image is not an option, create it in memory and flush the binary image data to the response from a script, like a PHP file. Just use the correct header:
Content-type: image/png
The image gets regenerated every time.
To be absolutely sure the image does not get cached, append some random parameter to the querystring.
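A minimal client-side sketch of the cache-busting idea, appending a random query parameter so the browser and any proxies treat every request as unique:

```javascript
// Append a random 'nocache' parameter so the image URL is never the same
// twice and cached copies are bypassed. The parameter name is arbitrary.
function bustCache(url) {
  var sep = url.indexOf('?') === -1 ? '?' : '&';
  return url + sep + 'nocache=' + Math.random().toString(36).slice(2);
}

// e.g. img.src = bustCache('/graph.png');
```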
You could return the image to the AJAX client, and then include the XML data in a custom X- HTTP header on the image response.
You would need to confirm that the AJAX client can actually read that header, though (XMLHttpRequest's getResponseHeader should work for custom headers on same-origin requests).
Related
I do not want to turn off caching for the whole site, but I do want to avoid caching in some areas, and I'm wondering about the best way to do that.
First, I pull data from an API to check the "online" status of an advisor.
Next, I store that (and any other data that may change) in a CPT.
At the same time, I store a random string of characters (that I can sort on later giving the appearance of random order).
I pull the data from the API on every page load, because I need real-time data. This makes me cringe, but I don't know any other way. This part isn't cached.
However, when I display the list of "advisors", I sort them by online status, then the random string. This is meant to give fairness as to who is above the fold, and near the beginning of the results.
Well, that is all generated with PHP, so the resultant HTML is cached.
I read a bit about the WP REST API, and perhaps that will help with the speed of the query, but that won't help with the cached HTML, right?
So, regardless of how I query the data (REST API, WP_Query), am I to assume that I must iterate through the data with JavaScript to avoid it being cached by the Full Page Cache solution of the server?
If I use WP_Query still, and I use PHP to display the results, can I just call the PHP function from JavaScript?
Every page of the site will display some or all of the advisors (ex: homepage 8 advisors, the "advisor" page shows all, the "advisor" category pages, and 4 advisors in the footer of every other page), so it doesn't make sense to turn off caching.
Any direction would be greatly appreciated! Thanks in advance.
Are you not better off doing it with AJAX?
Or, I'm pretty sure there's like a line of PHP that you can add to pages so they won't get cached. Woocommerce sometimes needs this for example. I guess it depends on which caching plugin you are using. Which one are you using?
Or, for example, WP Super Cache has an area where you can exclude certain pages from caching, under Settings -> WP Super Cache -> Advanced -> Accepted Filenames & Rejected URIs.
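For the AJAX route, a hedged sketch: the cached page loads normally, then JavaScript requests the live advisor status from WordPress's admin-ajax.php endpoint, which full-page caches don't cache. The action name here is an assumption and must match a handler registered server-side with the wp_ajax_ hooks:

```javascript
// Build the admin-ajax.php request URL for live advisor data.
// 'advisor_status' is a hypothetical action name -- it must correspond
// to a wp_ajax_advisor_status / wp_ajax_nopriv_advisor_status handler
// registered in the theme or plugin.
function advisorStatusUrl(ajaxUrl, advisorIds) {
  return ajaxUrl + '?action=advisor_status&ids=' + advisorIds.join(',');
}

// Usage in the cached page (the AJAX response itself is not page-cached):
// fetch(advisorStatusUrl('/wp-admin/admin-ajax.php', [3, 7, 9]))
//   .then(function (r) { return r.json(); })
//   .then(renderAdvisors);
```

This way the HTML stays fully cached, and only the small, frequently changing status payload is fetched on every page load.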
I'm working on a database of math problems right now - all the problems are formatted in LaTeX, but there are Asymptote images as well.
It doesn't look like MathJax supports Asymptote - what would be the best way to get the images for each math problem? How does Art of Problem Solving's TeXeR handle Asymptote?
Could we send the code and render the image client-side, or should we extract the asy code for each problem, get an image, save these images to the database, and link each image to its respective problem?
I'm very late here, but I'll suggest something anyway.
What I would do is simply render all the images serverside. You can have Asymptote rerender images if there are any changes to the original file.
That way you save the client a lot of bandwidth and processing time. You may have more data to store, but you also get a bit more control over the pics.
Asymptote lets you define the output filename, so your server can automatically create the images based entirely on the original problem and name them accordingly. That way they are already 'linked'.
This is probably the best answer I can give without code.
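To make the extraction and naming idea concrete, here is an illustrative sketch (not from the original answer): pull each Asymptote block out of a problem's source, render it server-side, and derive the image filename from the problem id. The [asy]...[/asy] delimiters and the naming scheme are assumptions:

```javascript
// Extract every [asy]...[/asy] block from a problem's source text so each
// one can be rendered by Asymptote server-side. The delimiters are an
// assumption (AoPS-style tags); plain .tex sources may use an asy environment.
function extractAsy(source) {
  var blocks = [];
  var re = /\[asy\]([\s\S]*?)\[\/asy\]/g;
  var m;
  while ((m = re.exec(source)) !== null) {
    blocks.push(m[1].trim());
  }
  return blocks;
}

// Hypothetical naming scheme: derive the rendered image's path from the
// problem id and block index, so no separate link table is needed.
function asyImageName(problemId, index) {
  return 'asy/' + problemId + '-' + index + '.png';
}
```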
I have some data that I want to display on a web page. There's quite a lot of data so I really need to figure out the most optimized way of loading and parsing it. In CSV format, the file size is 244K, and in JSON it's 819K. As I see it, I have three different options:
Load the web page and fetch the data in CSV format as an Ajax request. Then transform the data into a JS object in the browser (I'm using a built-in method of the D3.js library to accomplish this).
Load the web page and fetch the data in JSON format as an Ajax request. Data is ready to go as is.
Hard code the data in the main JS file as a JS object. No need for any async requests.
Method number one has the advantage of reduced file size, but the disadvantage of having to loop through all (2700) rows of data in the browser. Method number two gives us the data in the end-format so there's no need for heavy client-side operations. However, the size of the JSON file is huge. Method number three has the advantage of skipping additional requests to the server, with the disadvantage of a longer initial page load time.
What method is the best one in terms of optimization?
In my experience, data processing times in JavaScript are usually dwarfed by transfer times and the time it takes to render the display. Based on this, I would recommend going with option 1.
However, what's best in your particular case really does depend on your particular case -- you'll have to try. It sounds like you have all the code/data you need to do that anyway, so why not run a simple experiment to see which one works best for you.
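For reference, the client-side transform in option 1 (CSV text to an array of objects) is roughly what D3's built-in CSV parser does for you. A hand-rolled sketch, assuming no quoted or escaped fields:

```javascript
// Parse CSV text into an array of objects keyed by the header row.
// Simplified: assumes no quoted fields or embedded commas/newlines --
// use d3.csv / d3.csvParse in practice.
function parseCsv(text) {
  var lines = text.trim().split('\n');
  var header = lines[0].split(',');
  return lines.slice(1).map(function (line) {
    var cells = line.split(',');
    var row = {};
    header.forEach(function (key, i) { row[key] = cells[i]; });
    return row;
  });
}
```

Looping over 2,700 rows like this is trivial for a browser; the transfer-size saving of CSV (244K vs 819K) is likely to dominate.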
Suppose I display a few dozens thumbnails in a few web pages (10 thumbnails per page). I would like to load them as quickly as possible.
Does it make sense to get a few thumbnails with one HTTP request (one thumbnail is ~10K) ? How would you suggest do it with JavaScript?
You can, but you need to jump through a few hoops:
1) Base-64 encode the images on the server as a single file.
2) Send them to the client as a single request blob, via AJAX.
3) Decode the images back into pieces.
4) Use Data-URIs to insert them into the DOM.
...not really worth it.
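Steps 3 and 4 on the client side might look like the following sketch; the '|' delimiter is an assumption about how the server joins the base64-encoded thumbnails:

```javascript
// Split a single AJAX response containing several base64-encoded images
// (joined with '|' on the server -- an assumed format) and turn each
// piece into a data URI ready for an <img> src.
function splitToDataUris(blob, mime) {
  return blob.split('|').map(function (b64) {
    return 'data:' + mime + ';base64,' + b64;
  });
}

// Note the earlier caveat: older IE versions do not support data URIs.
```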
In terms of network performance, it really does make sense.
You could, for example, combine a predefined number of thumbnails into a single image.
On the client side you can then display that image using the "CSS sprite" technique
(http://www.w3schools.com/css/css_image_sprites.asp)
If it's important for you to send the images as fast as possible I would consider sending them as a sprite. Unfortunately this may be somewhat difficult on the back end if the provided images may vary. If they are static and the same for every user it is way easier as you can manually prepare the images and the front end code to display the correct image parts.
In combination with the sprite approach it would also be useful to enable progressive/interlaced loading in order to deliver visible results as fast as possible.
I have a website that contains graphs which display employee activity records. There are tiers of data (ie: region -> state -> office -> manager -> employee -> activity record) and each time you click the graph it drills down a level to get to display more specific information. The highest level (region) requires me to load ~1000 objects into an array and the lowest level is ~500,000 objects. I am populating the graphs via a JSON formatted text file using:
$.ajax({
    url: 'data/jsondata.txt',
    dataType: 'json',
    success: function (data) {
        largeArray = data.employeeRecords;
    }
});
Is there an alternative method I could use without hindering response time/performance? I am caught up in the thought that I must pre-load all of the data client-side, otherwise there will be lag time if I need to fetch it on a user click. If anyone can point me to best practices, and maybe even explain what is considered "TOO MUCH" client-side data, I'd appreciate it.
FYI, I'm restricted to using an old web server, and if I want to do anything server-side I'd be limited to classic ASP; otherwise it has to be client-side. Thank you!
If your server responds quickly
In this case, you can probably simply load data on demand when a user clicks. The server is quick, so why bother trying to be smarter for no gain.
If the server is quick, but not quick enough, then you might be able to preload the next level while drawing the first. Eg if you have just rendered the graph at the "office" level, then silently preload the "manager" next level down data while the user is still reacting to the screen update.
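The on-demand-plus-preload idea can be sketched as follows; fetchLevel stands in for the real AJAX call, and results are memoized so a preload and a later click never hit the server twice for the same node:

```javascript
// Memoizing loader: the first request for a level key calls fetchLevel
// (a placeholder for the real AJAX call); repeat requests -- including a
// click after a silent preload -- are served from the cache.
function makeLoader(fetchLevel) {
  var cache = {};
  return function load(levelKey) {
    if (!(levelKey in cache)) {
      cache[levelKey] = fetchLevel(levelKey);
    }
    return cache[levelKey];
  };
}

// After rendering the "office" level, call load('manager:' + id) for the
// visible nodes so the next level's data is ready before the user clicks.
```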
If the server is too slow for data on demand
In this case you probably need to model exactly where it is slow and address that. There are several things in play here, and your question doesn't say exactly which.
Is the server slow to query the database? If so, fix that; there is little you can do client-side to solve it.
Is the server slow to package the data for transmission? That is harder to fix; is the server big enough?
Is network transmission slow? Hmm, then you need to send less data or get users onto faster bandwidth.
Is browser unpack time slow (i.e. the delay decoding the data before your script can chart it)? Change how you package the data, or send less of it, for example in chunks.
Can browsers handle 500,000 objects? You should be able to just monitor the memory use of the browser you are testing with; there are opinions both ways on this. It will really depend on the target users' browsers and hardware.
You might like to look at the question "What is the most efficient way of sending data for a very large playlist over http?", as it shows an alternative way of sending and handling data which I've found to be much quicker for point 4 above. Of course, at 500k objects you will no longer be able to use localStorage, but I've been experimenting with downloading millions of array elements and it works OK (still a work in progress). I don't use jQuery, so I'm not sure how usable this is for you either.
Best practice? Sorry cannot help with that part of the question.