Since June 22, 2016, the Google Maps JavaScript API requires a key (again). This means they're tracking your usage. When you use the Google Places API in an autocomplete textbox, a request is made for every character you type, meaning that if you search in such a textbox for a city whose name contains 10 characters, you consume 10 requests for that day.
I want to lower the number of requests by setting a timeout before updating the autocomplete popup (like: "wait 750 ms before triggering the request"), but I can't find a way to do it.
Is there a way to lower the number of requests used by the Google Places JavaScript API Autocomplete component?
Edit: I am aware of this article, but that covers the Android API; I'm using the JavaScript API.
You can build your own widget using the AutocompleteService class and set your own thresholds.
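For illustration, here is a minimal sketch of that approach, debouncing keystrokes so a single request fires only 750 ms after the user stops typing. The element ID and the delay are placeholders, not from any official sample:

// Debounce calls to AutocompleteService.getPlacePredictions so a request
// only fires 750 ms after the last keystroke.
var service = new google.maps.places.AutocompleteService();
var input = document.getElementById('city-input'); // hypothetical input element
var timer = null;

input.addEventListener('input', function () {
  clearTimeout(timer);
  timer = setTimeout(function () {
    var text = input.value;
    if (!text) return;
    service.getPlacePredictions({ input: text }, function (predictions, status) {
      if (status !== google.maps.places.PlacesServiceStatus.OK) return;
      // Render the predictions into your own dropdown here.
      console.log(predictions.map(function (p) { return p.description; }));
    });
  }, 750);
});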
But honestly, I would hate an app that takes such a long time to react to my keystrokes. I would initially blame the phone for not reacting to my touches, but likely realize, eventually, that it is just this one app that is slow to react to my keystrokes, and find a replacement app. Consider that.
In production environments, JavaScript-based analytics scripts (Google Analytics, Facebook Pixel, etc.) are injected into most web applications, along with the unique ID/pixel ID, in plain JavaScript.
For example, Airbnb uses Google Analytics. I can open up my dev console and run
setInterval(function() {ga('send', 'pageview');}, 1000);
which will cause the analytics pixel to be requested every 1 second, forever. That is 3600 requests an hour from my machine alone.
Now, this can easily be done in a distributed fashion, causing millions of requests per second and completely skewing the Google Analytics data for the pageview event. I understand that the huge amount of data collected would correct this skewing to a certain extent, but that can easily be compensated for by hiking up the number of requests.
My question is this: are there any safeguards to prevent competitors or malicious individuals from destroying the data integrity of applications in this manner? Does GA or Facebook provide such options?
Yes, but the unsafe part doesn't come from the JavaScript. For example, you can use the Measurement Protocol to flood data into an account. Here you can see a lot of people in this community having trouble with this (and it's fairly simple to solve):
https://stackoverflow.com/search?q=spam+google+analytics
All these measurement systems use HTTP calls to fill the data in your "database". If you are able to build the correct call, you can spam anyone, anywhere (but don't do it; don't be evil).
https://developers.google.com/analytics/devguides/collection/protocol/v1/?hl=es-419
This Google Analytics page explains the Measurement Protocol; JavaScript only works as a framework to build and send the hit.
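For illustration, this is roughly what a raw Measurement Protocol v1 hit looks like when built by hand; the tracking ID and field values below are placeholders:

// A raw Measurement Protocol v1 hit, built by hand (no analytics.js involved).
// 'UA-XXXXXX-Y' is a placeholder tracking ID.
fetch('https://www.google-analytics.com/collect', {
  method: 'POST',
  body: new URLSearchParams({
    v: '1',             // protocol version
    tid: 'UA-XXXXXX-Y', // tracking ID of the target property
    cid: '555',         // an arbitrary client ID
    t: 'pageview',      // hit type
    dp: '/home'         // document path
  })
});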
But not everything is lost.
For example, if you try to do that in your browser with that code, the Google Analytics framework limits you to 1 call per second and 150 per session (or cookie value). It's not complicated to jump that barrier, but after that, other barriers will come.
So if you use the JavaScript framework, you are relatively safe. Now imagine you do the same with Python, sending HTTP hits directly to the Google Analytics servers. It's possible, but there are 2 important things to note:
Google Analytics has a proactive "firewall" to detect spammers and ban them (how and when they do this is not public), but in my case I see far fewer spammers than a few years ago.
There are also a couple of good practices to avoid this. For example, allow only whitelisted domains by creating a filter that permits only traffic from your own domain:
https://support.google.com/analytics/answer/1033162?hl=en
It's also a very good practice to protect your ecommerce data by using a filter that includes only data from a certain store or with a certain parameter, for example brand == my brand or CustomDimension == true, and to exclude transactions with products over $1,000 (check your limits and apply proactive filters). All these barriers make the setup harder to break.
If you do this, you will protect your domain a lot (because it's much more complicated to know a valid combination of UA ID + domain when you create a robot), but, as you know, any system can be broken. In my experience I have only seen 2 or 3 cases of damage coming from spammers or people who wanted to do harm, and all of those cases could have been prevented if I had created a proactive filter. Usually spammers only spam ads into your account and almost never want to hurt you. With Facebook, Piwik, and other tools it's more or less the same.
Currently I am using Google Maps for both the autocomplete and the geocoding functions.
However, I realized that Google Maps geocoding is rather inaccurate in most cases, and I also chanced upon a few links suggesting the use of the Google Places API.
The Google Maps method I have implemented is based on the JavaScript approach and does not require an API key, so it does not have many restrictions, since the usage limit is applied per client.
However, Google Places seems to require an API key and has a different set of usage limits.
Before I convert to and explore the Google Places API:
May I know whether there is any better free alternative geocoding solution that is good/accurate?
Is it possible to configure Google Places geocoding in a way similar to Google Maps, so that the usage limitation is held at the client level?
Or is there a strategy I can try/consider? For example, create a few Google Places accounts/API keys and develop some logic (e.g. if apikey1 exceeds its limit, switch to apikey2, etc.)? And before that, is it possible to track or detect the current usage against the limit, either in the portal or in code?
May I know whether there is any better free alternative geocoding solution that is good/accurate?
I think Google Maps offers the most accurate among the free geocoding services. Reading from this SO thread, if you're really concerned about precision, then consider using paid services.
Is it possible to configure Google Places geocoding in a way similar to Google Maps, so that the usage limitation is held at the client level?
Here's a statement from Google about Usage Limits
The Google Places API Web Service enforces a default limit of 1 000 requests per 24 hour period, which you can increase free of charge. If your app exceeds the limit, the app will start failing. Verify your identity to get up to 150 000 requests per 24 hour period, by enabling billing on the Google API Console. A credit card is required for verification. We ask for your credit card purely to validate your identity. Your card will not be charged for use of the Google Places API Web Service.

The limit for free usage is 150 000 requests per 24 hour period. If your app exceeds the limit, the app will start failing again. Purchase a Google Maps APIs Premium Plan license to get more than 150 000 requests per 24 hour period.

Please take steps to increase your limit early if you expect to exceed the default number of requests allowed.
Is there a strategy I can try/consider? For example, create a few Google Places accounts/API keys and develop some logic (e.g. if apikey1 exceeds its limit, switch to apikey2, etc.)?
It seems there is no such feature. If you want to exceed the free quota, consider paying.
You're right, there are many restrictions on the Google APIs. In fact, the Google Maps API terms of use require that you display geocode results together with a map; you can't just print the numbers.
And yes, the Google Maps API guesses an approximate location based on the address input. For instance, if you give it a complete address that is not a real place, it will try to give you somewhere in between the real places that would probably be next to it. This is one of the reasons you will often get inaccurate geocode information from them. Overall, the API is great for what it is designed to do.
As someone who works in this industry, I'm not actually aware of any completely free geocoding and autocompleting service. Most products have a free tier though (up to so many uses per week or per month, etc).
(Full disclosure: I'm a developer at SmartyStreets where I work on the US Autocomplete API as well as the US Street Address API, which provides geocoding.)
This may be very obvious to others, but I am struggling with how to achieve this and can't seem to find it in the docs or via Google; this may be down to a badly worded query.
What I am trying to do is create a route on a map and track my progress along it as I navigate, and, like the navigation function in Google Maps, send a notification of the turn when within some x number of meters of it. This is similar to the voice guidance in Google Maps, except I just want the text.
Is this possible with the JavaScript API v3 out of the box? If so, can someone point me to the relevant documentation or tutorials?
If not out of the box, can someone suggest a design pattern or some pseudocode to do this?
What I am trying to do at the moment is:
Get my route in steps (GPS & text)
Get the next step (GPS & text)
When the current location is within 10 m, notify
When the current location is past the step, update to the next step & repeat
This feels like I am oversimplifying it, and I am also struggling with how to correctly get the distance to the next step. I know I can get the distance between 2 coordinates, but is there a more accurate way to do so, to avoid getting an "as the crow flies" distance?
I am trying to do this with the Google Maps API v3 in an Ionic app using the Cordova Geolocation plugin.
Again, apologies if this is obvious to anyone else, but I am struggling to find any relevant examples. If for some reason this is not easily done with Google Maps, I am open to other open source or free frameworks that I can access via JavaScript.
There is no out-of-the-box solution. You would have to use both the Maps API and the Directions API. Directions returns routes as legs made up of steps, so you can use those to determine each step, and you can get the user's location via Geolocation.
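A rough sketch of that idea, assuming the Maps JS API is loaded with the geometry library; the origin/destination strings and the 10 m threshold are placeholders:

// Request a route, then watch the device position and log the step's
// instruction text when within ~10 m of the start of the next step.
var directionsService = new google.maps.DirectionsService();

directionsService.route({
  origin: 'Start address',      // placeholder
  destination: 'End address',   // placeholder
  travelMode: google.maps.TravelMode.DRIVING
}, function (result, status) {
  if (status !== 'OK') return;
  var steps = result.routes[0].legs[0].steps;
  var i = 0;

  navigator.geolocation.watchPosition(function (pos) {
    if (i >= steps.length) return;
    var here = new google.maps.LatLng(pos.coords.latitude, pos.coords.longitude);
    // Straight-line ("as the crow flies") distance; for the along-route
    // distance you would have to interpolate along the step's path.
    var d = google.maps.geometry.spherical.computeDistanceBetween(
      here, steps[i].start_location);
    if (d < 10) {
      console.log('Upcoming: ' + steps[i].instructions); // HTML instruction text
      i++;
    }
  });
});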
However, this might be against the Terms of Service:
No Navigation, Autonomous Vehicle Control, or Enterprise Applications. You must not use the Service or Content with any products, systems, or applications for or in connection with any of the following:

(i) real-time navigation or route guidance, including but not limited to turn-by-turn route guidance that is synchronized to the position of a user's sensor-enabled device.
I have an HTML (phonegap) application that uses Google Maps API to display a map with markers. I want this app to be used offline. I know that Google Maps tiles can't be used offline (because of its license). However, what I want to do is use the map interface without the tiles.
When online -> tiles and markers displayed.
When offline -> only markers displayed.
However, the JS loading of Google Maps is complex, and I haven't managed to cache it.
Thanks.
Caching the Google Maps JavaScript is not allowed, because their payment system is based on how many times their JavaScript API is loaded by users. One page refresh equals one Google Maps API load, and depending on what kind of contract you have, that's one request deducted from the total amount you have purchased.
How is Google Maps API for Business usage tracked and reported?
A single load of the Google Maps JavaScript API into a page. The JavaScript API is reloaded every time a page that uses the API is reloaded. User interactions with the map (e.g. panning, zooming, changing map types) do not generate more page views. Note however that a page view is generated if the API is loaded into the page even if the API is not then used to display a map.
See https://developers.google.com/maps/documentation/business/faq#pageview
You could, of course, contact Google and ask about a tailored business solution for your needs; I am not 100% sure what kind of things they offer if you contact them directly. Even so, your request is somewhat impossible to fulfill, since the UI generation code also resides inside the very Google Maps API JavaScript that you would need to cache.
So I would suggest that if you only need the Google Maps interface when offline, take a moment and implement something similar with HTML/CSS (and some JS): you could draw markers on a canvas, or use normal imgs and absolute positioning. If you need to implement dragging and zooming, it would be a little more difficult, but not impossible with canvas or other techniques. That being said, it would be easier to just keep the app online; we all have internet :). Making a one-to-one dummy offline UI that matches Google Maps would be a really painful process, considering that the Google Maps UI also changes over time with their upcoming versions.
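A minimal sketch of that fallback idea: absolutely positioned marker images over a plain container, using a naive equirectangular lat/lng-to-pixel projection. The container ID, bounds, and icon are all made up for illustration:

// Project lat/lng into pixel positions inside a fixed-bounds container.
// The container div is assumed to have position: relative.
var container = document.getElementById('offline-map'); // hypothetical div
var bounds = { north: 51.6, south: 51.3, west: -0.5, east: 0.3 }; // example area

function project(lat, lng) {
  return {
    x: (lng - bounds.west) / (bounds.east - bounds.west) * container.clientWidth,
    y: (bounds.north - lat) / (bounds.north - bounds.south) * container.clientHeight
  };
}

function addMarker(lat, lng) {
  var p = project(lat, lng);
  var img = document.createElement('img');
  img.src = 'marker.png';             // local asset, available offline
  img.style.position = 'absolute';
  img.style.left = (p.x - 8) + 'px';  // center a 16px-wide icon horizontally
  img.style.top = (p.y - 16) + 'px';  // anchor a 16px-tall icon at its tip
  container.appendChild(img);
}

addMarker(51.5074, -0.1278); // e.g. central London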
You could use OpenStreetMap instead; it can be used offline: http://wiki.openstreetmap.org/wiki/Offline_Openstreetmap
I came across a site that does something very similar to Google Suggest. When you type in 2 characters in the search box (e.g. "ca" if you are searching for "canon" products), it makes 4 Ajax requests. Each request seems to get done in less than 125ms. I've casually observed Google Suggest taking 500ms or longer.
In either case, both sites are fast. What are the general concepts/strategies that should be followed in order to get super-fast requests/responses? Thanks.
EDIT 1: by the way, I plan to implement an autocomplete feature for an e-commerce site search where it 1.) provides search suggestion based on what is being typed and 2.) a list of potential products matches based on what has been typed so far. I'm trying for something similar to SLI Systems search (see http://www.bedbathstore.com/ for example).
This is a bit of a "how long is a piece of string" question and so I'm making this a community wiki answer — everyone feel free to jump in on it.
I'd say it's a matter of ensuring that:
The server / server farm / cloud you're querying is sized correctly according to the load you're throwing at it and/or can resize itself according to that load
The server / server farm / cloud is attached to a good, quick network backbone
The data structures you're querying server-side (database tables or what-have-you) are tuned to respond to those precise requests as quickly as possible
You're not making unnecessary requests (HTTP requests can be expensive to set up; you want to avoid firing off four of them when one will do); you probably also want to throw in a bit of hysteresis management (delaying the request while people are typing, only sending it a couple of seconds after they stop, and resetting that timeout if they start again)
You're sending as little information across the wire as can reasonably be used to do the job
Your servers are configured to re-use connections (HTTP 1.1) rather than re-establishing them (this will be the default in most cases)
You're using the right kind of server; if a server has a large number of keep-alive requests, it needs to be designed to handle that gracefully (NodeJS is designed for this, as an example; Apache isn't, particularly, although it is of course an extremely capable server)
You can cache results for common queries so as to avoid going to the underlying data store unnecessarily (see the sketch after this list)
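For the caching point, a minimal in-memory sketch; queryDatabase and the TTL are hypothetical stand-ins for your real data layer:

// Serve repeated suggestion queries from memory instead of hitting the
// data store every time. queryDatabase() is a hypothetical async lookup.
var cache = new Map();
var TTL_MS = 60 * 1000; // illustrative time-to-live

function suggestions(prefix) {
  var hit = cache.get(prefix);
  if (hit && Date.now() - hit.at < TTL_MS) {
    return Promise.resolve(hit.data); // cache hit: no data-store round trip
  }
  return queryDatabase(prefix).then(function (data) {
    cache.set(prefix, { at: Date.now(), data: data });
    return data;
  });
}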
You will need a web server that is able to respond quickly, but that is usually not the problem. You will also need a database server that is fast and can very quickly query which popular search results start with 'ca'. Google doesn't use a conventional database for this at all; they use large clusters of servers with a Cassandra-like database, and most of that data is kept in memory as well for quicker access.
I'm not sure if you will need this, because you can probably get pretty good results using only a single server running PHP and MySQL, but you'll have to make some good choices about the way you store and retrieve the information. You won't get these fast results if you run a query like this:
select
    q.search
from
    previousqueries q
where
    q.search LIKE 'ca%'
group by
    q.search
order by
    count(*) DESC
limit 1
This will probably work as long as fewer than 20 people have used your search, but it will likely fail on you before you reach 100,000.
This link explains how they made instant previews fast. The whole site highscalability.com is very informative.
Furthermore, you should keep everything in memory and avoid retrieving data from disk (slow!). Redis, for example, is lightning fast!
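For example, a short Node.js sketch with the node-redis client (v4 API assumed), keeping suggestion popularity in a Redis sorted set; the key names are made up:

// Keep per-prefix suggestion popularity in a Redis sorted set (node-redis v4).
const { createClient } = require('redis');

async function demo() {
  const client = createClient();
  await client.connect();

  // Bump the popularity of "canon" for the prefix "ca"...
  await client.zIncrBy('sugg:ca', 1, 'canon');
  // ...and fetch the top 5 suggestions for "ca", most popular first.
  const top = await client.zRange('sugg:ca', 0, 4, { REV: true });
  console.log(top);

  await client.quit();
}

demo();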
You could start by building a fast search engine for your products. Check out Lucene for full-text searching. It is available for PHP, Java, and .NET, among others.