google finance zoom in zoom out graph logic - javascript

I'm looking for logic behind zoom-able graph like google finance. I know there
are off the shelf components that just do that, but I am looking for a basic
example that explains the logic.

Whoever writes things like that basically has two choices:

1. Load a lot of data, and show only a little bit. When the user changes the zoom, use the data we weren't showing before. Basically, we load all of the data at page-load time, so the Javascript can use it later. This is easier to write, but slow; sometimes, you have to load tons of data to do it.
2. Load only the data you need. When the user interacts with the page, make AJAX requests back to the server, to load in the new data that you need.
   2a. When you load new data, store everything you've loaded so far, so that you don't need to make more AJAX requests if the user returns to an older zoom setting.
   1 + 2. Load only the data you need, then show the page. Then immediately load everything else, but don't show it until/unless they change the zoom settings.

Of these, 2 and 2a are likely the best choices, while #1 is the "get it done quicker" approach.

Google Chrome (and browsers based on chromium) have developer tools with a network feature that lets you see what happens.
When you load a quote and then change the zoom, you will see a new data request. For example:
https://www.google.com/finance/getprices?q=AA&x=NYSE&i=1800&p=30d&f=d,c,v,o,h,l&df=cpct&auto=1&ts=1382233772497
It makes a new request for each "zoom level", which is necessary because the larger time windows (1 yr, 5 yr) show data at coarser granularity (1 day and 1 week, respectively).
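Option 2a above can be sketched with a small cache keyed by symbol and zoom level, so returning to an earlier zoom never repeats a request. The `fetchSeries` callback stands in for the real AJAX call; its parameters and response shape are assumptions for illustration, not Google's actual API:

```javascript
// Option 2a: cache each zoom level's data so returning to an earlier
// zoom setting never repeats the AJAX request. fetchSeries is whatever
// actually hits your server (URL and response shape are assumptions).
function makeSeriesLoader(fetchSeries) {
  const cache = new Map();
  return async function load(symbol, interval, period) {
    const key = symbol + ":" + interval + ":" + period;
    if (!cache.has(key)) {
      // Cache miss: this zoom level has never been requested before.
      cache.set(key, await fetchSeries(symbol, interval, period));
    }
    return cache.get(key);
  };
}
```

For instance, `makeSeriesLoader((q, i, p) => fetch("/prices?q=" + q + "&i=" + i + "&p=" + p).then(r => r.json()))` would layer 2a's caching on top of option 2's on-demand loading.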

Related

What is the right way to call dynamic content (currently using ajax) inside a cached page?

We have a news website where we cache a complete article page.
There are 4 areas that need to continue to be dynamic on that page:
View Counter: We add +1 to view_counts of that article when page loads.
Header: In the header of the website we check whether session->id exists; if it does, we display "Welcome [Name]" with My Profile / Logout, and if not we show Register / Login.
Comments: We display the comments made for that article.
Track User Behavior: We track every single action made by users on the site
Now the only way we could think of doing this is through AJAX calls:
$('#usercheck').load(<?php echo "'" . base_url() . "ajax/check_header'"; ?>);
And so on.
This is creating a massive load on CPU, but what would be the right/alternative way of approaching this?
Please see attached:
First of all, you do not have to use AJAX for every possible dynamic content, especially in the case of comments, you may as well load them via an iframe.
That way, you are not relying on Javascript to make the request.
It may even work for the counter.
However, your problem is not Javascript, nor the database server, based on what I can see from your graph. It seems to me you have some heavy PHP controllers; maybe you are loading a heavy framework just to have $session->id checked.
Further, what do you mean by "we track every single action"? How do you track them? Are you sending an AJAX request for every little thing, or are you debouncing them with JS and only sending one batch every 30 seconds or so?
My advice is that you consider the size of the PHP code you are calling, and slim it down as much as you can, even to zero if that seems feasible (by leveraging localStorage to keep track of your user session after the first login), and maybe load the counter and the comments in alternative ways.
For example, I infer you are only checking the counter once per page load, ignoring subsequent loads by other users while the current user is reading the article, so your counter may happen to be out of date once in a while, depending on your traffic.
I'm going to explain it better: your page has n views, so when I load it, you request n and then display n+1 to me. While I'm reading, the same page gets requested and viewed x times by other users. Your counter on the server has surely been updated to n+x, but the counter on my page still says "n views".
So, what's the point in being picky and showing n+1 to me and then not updating it, thus being off by x?
So, first of all, the counter controller should be as slim as possible; and what if you loaded it within an iframe, auto-updating without AJAX?
How to refresh an iframe not using javascript?
That would keep the counter up-to-date; you may render it with PHP just once per page view, and then statically serve the resulting HTML file.
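On the tracking point, a minimal sketch of the batching idea (queue events client-side and ship them at most once per window) might look like this; the event shape and any `/track` endpoint are assumptions for illustration:

```javascript
// Queue user actions and flush them in one request per time window,
// instead of one AJAX call per action. `send` is whatever ships the
// batch to the server (endpoint and payload shape are assumptions).
function makeTracker(send, flushMs) {
  const queue = [];
  let timer = null;
  function flush() {
    clearTimeout(timer);
    timer = null;
    if (queue.length) send(queue.splice(0)); // ship everything queued so far
  }
  return {
    track(event) {
      queue.push(event);
      if (!timer) timer = setTimeout(flush, flushMs); // at most one pending send
    },
    flush,
  };
}
```

For example, `makeTracker(events => navigator.sendBeacon("/track", JSON.stringify(events)), 30000)` would send at most one tracking request every 30 seconds, however many actions the user takes.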

Screen-scrape paginated data

I'm trying to grab a list of all of the available stores returned from the search at this website.
https://www.metropcs.com/find-store.html.html
The issue is that it returns only 4 or 5 results at a time, and does not have a 'See All' option. I attempted to use Postman in Chrome and AutoPager in Firefox to see if I could somehow see all of the data in the background, but I wasn't able to. I also researched JSON interception tools, as I believe the site is using JSON in the return set, but I wasn't able to find any of the actual data that I needed.
In the past I was able to hit 'print preview' and grab the list that way (then I just copy-pasted to Excel and ran some custom macros to strip the data I need) but the printer-friendly version is gone now as well.
Any ideas on tools that would allow me to export all of the stores found, especially for larger return sets?
You want to manipulate this request:
https://www.metropcs.com/apps/mpcs/servlet/genericservlet
You'll notice the page sends this (among other things) as the request to that URL:
inputReqParam=
{
  "serviceProviderName": "Hbase",
  "expectedParams": { "Corporate Stores": ...truncated for clarity... },
  "requestParams": {
    "do": "json",
    "minLatitude": "39.89234063913044",
    "minLongitude": "-74.85258152641507",
    "maxLongitude": "-74.96578907358492",
    "maxLatitude": "39.979297160869564"
  },
  "serviceName": "metroPCSStoreLocator"
}
You'll need to manipulate the lat and long bounding box to encompass the area you want. (The entire US is something like [-124.848974, 24.396308] to [-66.885444, 49.384358] )
In your favorite browser it should be easy enough to tweak the request to get a JSON response with what you require.
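As a sketch, building that payload with a wider box might look like the following. The fields are copied from the capture above, but the truncated `expectedParams` section, the exact headers, and whether the body is form-encoded would all still need to be copied from a real browser request:

```javascript
// Build the store-locator payload with a custom bounding box.
// Field names come from the captured request; everything that was
// truncated in the capture is omitted here and must be filled in
// from a real request before this will actually work.
function buildPayload(minLat, minLng, maxLat, maxLng) {
  return {
    serviceProviderName: "Hbase",
    requestParams: {
      do: "json",
      minLatitude: String(minLat),
      minLongitude: String(minLng),
      maxLatitude: String(maxLat),
      maxLongitude: String(maxLng),
    },
    serviceName: "metroPCSStoreLocator",
  };
}

// Roughly the whole continental US, per the coordinates above:
const payload = buildPayload(24.396308, -124.848974, 49.384358, -66.885444);
// Assuming the server accepts a form-encoded inputReqParam:
// fetch("https://www.metropcs.com/apps/mpcs/servlet/genericservlet", {
//   method: "POST",
//   body: "inputReqParam=" + encodeURIComponent(JSON.stringify(payload)),
// }).then(r => r.json()).then(console.log);
```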

Google Chrome Lags on Large Form Data Submissions

Summary:
I am attempting to pass base64-encoded image data via a form field input. My code works fine on all browsers I've tested it on, but there is a severe amount of CPU lag, post-submit, on Google Chrome - the length of which is proportional to the length of data submitted.
Details:
What I'm Doing:
I have an SVG editor on my site in which users may create images to be saved to their profile. Once the user finishes their work, they click 'save' - which kicks off some javascript to convert the SVG into an encoded data string via canvas.toDataURL(), store it in a hidden input field, submit the form, and return the user to an overview of their designs.
What's the problem?
The code, itself, seems to be functioning without an issue across both Firefox and Google Chrome. Firefox page loads take 1-2 seconds, regardless of the data_string size. However, on Google Chrome, the time it takes to load the 'overview' page is proportional to the size of the data string submitted in the hidden field.
For example, if I truncate the data string at various lengths, I receive different page load times:
Test Image 1:
5000 chars - 1.78 sec
50000 chars - 8.24 sec
73198 chars - 11.67 sec (Not truncated)
Test Image 2:
5000 chars - 1.92 sec
50000 chars - 8.79 sec
307466 chars - 42.24 sec (Not truncated)
My Question:
The delay is unacceptable (as most images will be at least 100k in size); does anyone know what's going on with Google Chrome?
I would like to reiterate that the server responds with the same speed, regardless of browser; it is definitely a client-side, browser specific issue with Google Chrome.
I would also appreciate alternative suggestions. I've spent some time attempting to fool the browser into thinking the data was a file upload (by changing the text input to a file input field and then manually trying to form the data and submit it via javascript), but I can't seem to get Django to recognize the falsified file (so it errors out, believing that no file was uploaded).
Summary
Google Chrome seems to have a problem handling large amounts of data when said data is placed into an actual input field. I suspect it's an issue with Chrome attempting to clean up the memory used to display the data.
Details
I was able to achieve a workaround by doing away with the client-side form, entirely, and submitting the data via a javascript XMLHttpRequest (as I had touched on at the end of my question), then redirecting the user to the next page in the AJAX callback.
I could never get Django to recognize a manually formed FileField object (as multipart/form-data), but I was able to get it to accept a manually formed CharField string (which was my base64 encoded canvas data).
Because the data is never placed into an input field, Google Chrome responds without delay.
I hope that helps anyone who may run across a similar issue.
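A rough sketch of that workaround (XMLHttpRequest instead of a hidden input, with the base64 string as an ordinary POST parameter) could look like this; the `/save-design` idea and the `image_data` field name are made up for illustration:

```javascript
// Build the POST body for the base64 canvas data. Kept as a separate
// function so the encoding is easy to verify; "image_data" is a
// hypothetical field name your Django view would read as a string.
function encodeDesignBody(dataString) {
  return "image_data=" + encodeURIComponent(dataString);
}

// Send the data with XHR instead of a hidden input, then redirect in
// the callback, so Chrome never has to manage a huge form field.
function saveDesign(canvas, url, onDone) {
  const xhr = new XMLHttpRequest();
  xhr.open("POST", url);
  xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
  xhr.onload = function () {
    if (xhr.status === 200) onDone(); // e.g. location.href = "/designs";
  };
  xhr.send(encodeDesignBody(canvas.toDataURL("image/png")));
}
```

On the server side this arrives as a plain POST parameter, matching the CharField approach described above rather than a multipart file upload.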
I was having the exact same problem and was searching for a solution.
In my case, there was no such problem for the first few runs of the page.
Then it suddenly started to lag, eating up a large amount of memory, which in turn made my whole system run very slowly.
I tried on another PC; as expected, there was no problem submitting the big SVG data for the first few runs, but later it showed the same lagging problem.
After reading your post, I am planning to use jQuery's AJAX for posting the data. I hope this will solve the issue.

Alternatives to using meta-refresh for updating a page

In the past, when I've covered events, I've used a meta-refresh with a 5 minute timer to refresh the page so people have the latest updates.
Realizing that this may not be the perfect way to do it (doesn't always work in IE, interrupts a person's flow, restarts things for people with screen readers, etc.), I'm wondering if there's any other way to handle this situation.
Is it possible to have something like ajax check every few minutes if the html file on the server is newer and have it print a message saying "Update info available, click here to refresh"?
If that's crazy, how about a javascript that just counts down from 5 minutes and just suggests a refresh.
If anyone could point me to tutorials or code snippets I'd appreciate. I just play a programmer on TV. :-)
Actually, your thought on a timed Ajax test is an excellent idea. I'm not sure that is exactly what StackOverflow uses, but it checks periodically to see if other answers have been posted and shows the user, on an interval, if there are updates.
I think this is ideal for these reasons:
It's unobtrusive - the reader can easily ignore the update if they don't care
It won't waste bandwidth - no reloading unless the user chooses to
It's informative - the user knows there's an update and can choose to act on it.
My take on how: have the ajax script send off the latest post id to a server-side script that checks for new posts. That script can query your database to see whether there are any new posts and how many there are, and return that number. If there are new posts, show some (non-modal) message including the number of updates, and let the user decide what to do about it.
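A minimal sketch of that poll; the `checkForUpdates` callback stands in for the AJAX call (an `/updates?after=<id>` endpoint returning a count is an assumption about your backend), and `notify` is whatever renders the non-modal message:

```javascript
// Every intervalMs, ask the server how many posts are newer than the
// last one we rendered, and notify (not reload) if there are any.
// checkForUpdates and notify are supplied by the caller.
function startUpdatePoll(latestPostId, checkForUpdates, notify, intervalMs) {
  return setInterval(async () => {
    const count = await checkForUpdates(latestPostId);
    if (count > 0) notify(count); // e.g. "3 new updates - click to refresh"
  }, intervalMs);
}

// startUpdatePoll(42,
//   (id) => fetch("/updates?after=" + id).then(r => r.json()).then(d => d.count),
//   (n) => showBanner(n + " new updates available"),
//   300000); // every 5 minutes
```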
setInterval(function () {
  if (confirm("It's been 5 minutes. Would you like to refresh?")) {
    window.location.reload(true);
    // Or, instead of refreshing the page, you could make an AJAX call and
    // determine whether a newer page exists; if one does, then reload.
  }
}, 300000);
You can use the setInterval function in javascript.
here's a sample
setInterval(refreshFunction, milliseconds);
You will use it passing a reference to a function that actually refreshes the page for the first param and the number of milliseconds between refreshes for the second param (300000 for 5 minutes). (Passing the function name as a string also works but is evaluated like eval, and the optional third lang parameter is a non-standard IE extension; both are best avoided.)
If the user would be interacting with the scores and clicking on things it would be a little rude to just refresh the page on them. I think doing something like a notification that the page has been updated would be ideal.
I would use jQuery and do an ajax call to the file on the server or something that will return the updated data. If it's newer, then throw up a Growl message.
Gritter - jQuery Growl System
Demo of what a Growl is using Gritter
A Growl message would come up possibly with whatever was changed, new scores and then an option within that message to refresh and view the new results.
jQuery Ajax information

Open Flash Chart 2 IE doesn't load new data for chart

I have an OFC2 chart in my page, and when the user selects a choice, it makes another request to the server with parameters, and OFC2 should load new data from the server. But IE won't load the new data; it shows the old data instead. It works well in other browsers, but not in IE. What could be the problem here, and how can I solve it?
I have the same issue: I developed a dashboard which refreshes and gets data every 5 minutes, but IE always takes the old data. I even tried clearing the cache, with no luck.
Finally, once I changed the delay to 20 minutes, the data refreshed without any problem.
So please try extending your refresh timing; I hope it works.
look at this entry:
why-would-my-updated-flash-chart-not-be-refreshing-in-internet-exp
It worked for me (just add a different random extra parameter every time you refresh your page)
This is possibly because of an IE caching issue. I believe IE is caching the requests that the OFC code uses. To work around this, add a random parameter to the request each time. For example if you are currently requesting http://myserver.com/script.php?userid=1 then request http://myserver.com/script.php?userid=1&randomparam=<some_random_number_here>
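A tiny helper for that workaround might look like this (the example URL is the one from the answer above):

```javascript
// Append a throwaway query parameter so IE treats every data request
// as a brand-new URL and cannot serve it from cache.
function bustCache(url) {
  const sep = url.indexOf("?") !== -1 ? "&" : "?";
  return url + sep + "randomparam=" + Date.now() + Math.random().toString(36).slice(2);
}

// Each call yields a URL IE has never seen before, e.g.:
// loadChartData(bustCache("http://myserver.com/script.php?userid=1"));
```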
