Bing Image Search API - javascript

Does anyone know if Bing limits the number of requests an application can have for the Image Search API? I looked through the terms and couldn't find anything but the wording that they 'reserve' the right to do so. My application would pull several images for each user - so there could potentially be a lot of requests. Any feedback?
Zach

It's not clear whether this is actually enforced, but the guidelines say:
"[You must] Restrict your usage to
less than 7 queries per second (QPS)
per IP address. You may be permitted
to exceed this limit under some
conditions, but this must be approved
through discussion with
api_tou#microsoft.com."
http://msdn.microsoft.com/en-us/library/dd440746.aspx
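If that 7 QPS guideline still applies, the simplest defence on the application side is to queue your own requests and space them out. Below is a minimal, hypothetical sketch of that idea; bingImageSearch() is a placeholder for whatever function actually performs the Bing Image Search request.

const MAX_QPS = 7;
const GAP_MS = Math.ceil(1000 / MAX_QPS); // roughly 143 ms between calls

const pending = [];
let timer = null;

// Hypothetical placeholder: replace with the real Bing Image Search request.
async function bingImageSearch(query) {
  console.log('searching images for:', query);
}

// Queue a search; queued jobs are dispatched at most ~7 times per second.
function throttledSearch(query) {
  return new Promise((resolve, reject) => {
    pending.push(() => bingImageSearch(query).then(resolve, reject));
    if (!timer) {
      timer = setInterval(() => {
        const job = pending.shift();
        if (!job) { clearInterval(timer); timer = null; return; }
        job();
      }, GAP_MS);
    }
  });
}

// Usage: queue as many searches as needed; they are spaced out automatically.
['cats', 'dogs', 'sunsets'].forEach((q) => throttledSearch(q));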

I think it is possible that this guide contains outdated information about the restrictions. There is no information about a query limit in the current version of the terms (March 2011). At the same time, there is a clause restricting advertising on pages where the Bing image or video results are displayed, and that part is essential.

Actually, the free Bing Search API is limited to 5,000 transactions per month, and the source type can be:
Web
Images
News
Videos
Related Search
Spelling Suggestions
More info: https://datamarket.azure.com/dataset/5BA839F1-12CE-4CCE-BF57-A49D98D29A44
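For reference, here is a minimal Node sketch of calling that DataMarket dataset. The exact endpoint path and the Basic-auth scheme (the account key used as the password) are assumptions based on the dataset page above, so verify them against the documentation before relying on this.

const https = require('https');

const ACCOUNT_KEY = 'YOUR-ACCOUNT-KEY';        // placeholder: your DataMarket account key
const query = encodeURIComponent("'sunset'");  // OData string literals are quoted

https.get({
  hostname: 'api.datamarket.azure.com',
  path: '/Bing/Search/v1/Image?Query=' + query + '&$format=json&$top=10',
  auth: ':' + ACCOUNT_KEY,                     // account key as the Basic-auth password
}, function (res) {
  let body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () {
    // DataMarket wraps results in an OData envelope: { d: { results: [...] } }
    const results = JSON.parse(body).d.results;
    results.forEach(function (img) { console.log(img.MediaUrl); });
  });
});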

Related

Aren't Javascript analytics scripts susceptible to easy data hacks?

In production environments, JavaScript-based analytics scripts (Google Analytics, Facebook Pixel, etc.) are injected into most web applications in plain JavaScript, along with the unique ID/pixel ID.
For example, airbnb uses Google Analytics. I can open up my dev console and run
setInterval(function() {ga('send', 'pageview');}, 1000);
which will cause the analytics pixel to be requested every 1 second, forever. That is 3600 requests an hour from my machine alone.
Now, this can easily be done in a distributed fashion, causing millions of requests per second and completely skewing the Google Analytics data for the pageview event. I understand that the huge amount of data collected would correct this skewing to a certain extent, but that can easily be compensated for by increasing the number of requests.
My question is this: are there any safeguards to prevent competitors or malicious individuals from destroying the data integrity of applications in this manner? Does GA or Facebook provide such options?
Yes, but the unsafe part doesn't come from the JavaScript. For example, you can use the Measurement Protocol to flood data into an account. Here you can see a lot of people in the same community having trouble with this (and it's quite simple to solve):
https://stackoverflow.com/search?q=spam+google+analytics
All of these measurement systems use HTTP calls to fill the data in your "database". If you are able to build the correct call, you can spam anyone, anywhere (but don't do it; don't be evil).
https://developers.google.com/analytics/devguides/collection/protocol/v1/?hl=es-419
This Google Analytics page explains what the Measurement Protocol is; the JavaScript only works as a framework to build and send the hit.
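To make that concrete, here is a minimal sketch of what such a raw Measurement Protocol v1 hit looks like when built by hand instead of by analytics.js. The property ID and client ID are placeholders, and it is shown only to illustrate why the protocol, not the JavaScript, is the exposed surface; do not send data to properties you don't own.

const params = new URLSearchParams({
  v: '1',               // protocol version
  tid: 'UA-XXXXX-Y',    // tracking/property ID (placeholder)
  cid: '555',           // anonymous client ID
  t: 'pageview',        // hit type
  dp: '/example-page',  // document path
});

fetch('https://www.google-analytics.com/collect', {
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: params.toString(),
});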
But not everything is lost.
For example, if you try to do that in your browser with that code, the Google Analytics framework limits you to 1 call per second and 150 per session (or cookie value). Yes, it's not complicated to get past that barrier, but after that other barriers will come.
So if you use the JavaScript framework, you are relatively safe. Now imagine you do the same with Python, sending HTTP requests directly to the Google Analytics server. It's possible, but there are 2 important things to say:
Google Analytics has a proactive "firewall" to detect spammers and ban them (how and when they do this is not public), but in my case I see far fewer spammers than a few years ago.
There are also a couple of good practices to avoid this. For example, only keep data from whitelisted domains by creating a filter that allows only traffic from your own domain:
https://support.google.com/analytics/answer/1033162?hl=en
It's also a very good practice to protect your ecommerce data by using a filter to include only data from a certain store or with a certain parameter, for example "brand == my brand" or "CustomDimension == true", and to exclude transactions with products over $1,000 (check your limits and apply proactive filters). All these barriers make it complex to break.
If you do this, you will protect your domain a lot (because it's much more complicated to know the valid UA + domain combination when building a bot), but, as you know, any system can be broken. In my experience I have only seen 2 or 3 cases of damage coming from spammers or people who wanted to do harm, and in all of those cases it could have been prevented if I had created a proactive filter. Usually spammers only spam ads into your account and almost never want to hurt you. With Facebook, Piwik and other tools it's more or less the same.

Can I Use the Facebook Events API for a *Very* High-Traffic Website?

This is a sort of general question for which I couldn't find a solid answer in the FB developer docs, but here it is (in 3 parts):
1. If I am working on a site that could get (very roughly) 300k-500k+ uniques with 500k to 1M+ page views per day, would it still be somehow possible to use the Facebook REST API to pull in Event data from a Facebook page without hitting the API rate limit?
2. If it is technically possible, do you know what would be the best practice with regards to rate limiting? (in a little more detail than what they have in the docs) :)
3. If it is NOT possible with FB's out-of-the-box capabilities, do you know of a solution to make this work? (maybe creating a service or sort of job that copies the event data every 15 mins to my own database, so that I take the hit instead of FB; see the sketch below)
Thanks!
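As a rough sketch of the caching idea in the third part (not an official recommendation): a single server-side job refreshes a local copy of the event data on a schedule, and site visitors read only the cache, so Facebook sees one caller regardless of your traffic. The Graph API version, page ID and access token below are placeholders.

const PAGE_ID = 'your-page-id';               // placeholder
const ACCESS_TOKEN = 'your-access-token';     // placeholder
const GRAPH_VERSION = 'v2.12';                // placeholder version

let cachedEvents = [];                        // what your own pages actually read

async function refreshEvents() {
  const url = 'https://graph.facebook.com/' + GRAPH_VERSION + '/' + PAGE_ID +
              '/events?access_token=' + ACCESS_TOKEN;
  const res = await fetch(url);               // Node 18+ has a global fetch
  if (res.ok) {
    cachedEvents = (await res.json()).data || [];
  }
}

refreshEvents();                              // warm the cache at startup
setInterval(refreshEvents, 15 * 60 * 1000);   // refresh every 15 minutes

Serving from the cache does mean a staleness window of up to 15 minutes, which is usually acceptable for event listings.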

Google Maps/Place Geocoding

Currently I am using Google Maps for both the Autocomplete and the Geocoding functions.
However, I realized that Google Maps geocoding is rather inaccurate in most cases, and I also chanced upon a few links suggesting the use of the Google Places API instead.
The Google Maps method I have implemented is based on the JavaScript approach and does not require any API key, so it has few restrictions since the limit is applied on the client side.
Google Places, however, seems to require an API key and has a different set of usage limits.
Before I convert to and explore the Google Places API:
Is there any better free alternative geocoding solution that is good/accurate?
Is it possible to configure Google Places geocoding in a way similar to Google Maps, so that the usage limit is held at the client level?
Or is there a strategy I can try/consider? For example, create a few Google Places accounts/API keys and develop some logic (e.g. if apikey1 exceeds its limit, switch to apikey2, etc.). Before that, is it possible to track or detect the current limit, either in the portal or at the code level?
Is there any better free alternative geocoding solution that is good/accurate?
I think Google Maps offers the most accurate of the free geocoding services. Judging from this SO thread, if you're really concerned about precision, then consider using paid services.
Is it possible to configure Google Places geocoding in a way similar to Google Maps, so that the usage limit is held at the client level?
Here's a statement from Google about usage limits:
The Google Places API Web Service enforces a default limit of 1,000 requests per 24 hour period, which you can increase free of charge. If your app exceeds the limit, the app will start failing. Verify your identity to get up to 150,000 requests per 24 hour period, by enabling billing on the Google API Console. A credit card is required for verification. We ask for your credit card purely to validate your identity. Your card will not be charged for use of the Google Places API Web Service.
The limit for free usage is 150,000 requests per 24 hour period. If your app exceeds the limit, the app will start failing again. Purchase a Google Maps APIs Premium Plan license to get more than 150,000 requests per 24 hour period.
Please take steps to increase your limit early if you expect to exceed the default number of requests allowed.
Is there a strategy I can try/consider? For example, create a few Google Places accounts/API keys and develop some logic (e.g. if apikey1 exceeds its limit, switch to apikey2, etc.)?
It seems there is no such feature. If you want to exceed the free quota, consider payment.
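On detecting the limit at the code level: with the client-side geocoder in the Maps JavaScript API, the only signal you get is the status passed to the callback, which becomes OVER_QUERY_LIMIT once the quota is exhausted. A small sketch, assuming the Maps JavaScript API is already loaded on the page:

const geocoder = new google.maps.Geocoder();

geocoder.geocode({ address: '1600 Amphitheatre Parkway, Mountain View, CA' },
  function (results, status) {
    if (status === google.maps.GeocoderStatus.OK) {
      console.log(results[0].geometry.location.toString());
    } else if (status === google.maps.GeocoderStatus.OVER_QUERY_LIMIT) {
      // Quota exhausted: back off and retry later, or fall back to another provider.
      console.warn('Geocoding quota exceeded; retry after a delay.');
    } else {
      console.warn('Geocode failed: ' + status);
    }
  });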
You're right, there are many restrictions on the Google APIs. In fact, in the terms of use, the Google Maps API requires that you use the geocode information with a map presentation—you can't just print the numbers.
And yes, the Google Maps API guesses an approximate location based on the address input. For instance, if you give it a complete address that is not a real place, it will try to give you somewhere in between the real places that would probably be next to it. This is one of the reasons you will often get inaccurate geocode information from them. Overall, the API is great for what it is designed to do.
As someone who works in this industry, I'm not actually aware of any completely free geocoding and autocompleting service. Most products have a free tier though (up to so many uses per week or per month, etc).
(Full disclosure: I'm a developer at SmartyStreets where I work on the US Autocomplete API as well as the US Street Address API, which provides geocoding.)

Example of using the google analytics API to track subdomains

I'm pretty much a noob/novice when it comes to Google Analytics. I have asked this question in some form before, but I wanted to see if anyone could provide an example to help, as the documentation for the API, while pretty detailed, doesn't really answer my question.
Scenario: we have about 85 subdomains, all of which they want tracked separately. Due to the view limit of 50, they want to do it by reading a 'session' ID in PHP (which would equal the subdomain if possible) and then somehow having that data either come into the suite (not sure how that's possible) or into a spreadsheet. They want it all kept separate, though, as in foo.bar.net and test.bar.net each collecting their own data (page views, events, etc.). So, really doing the same thing that views in the GA suite/dashboard do, but just from the other end. Is there a way to do this in a settings file that talks to the API to read the different subdomain traffic?
Thanks, I hope the question is somewhat clear
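One approach that fits what you describe, sketched roughly below: keep a single property/view and split the report by the ga:hostname dimension via the Core Reporting API (v3), so each subdomain comes back as its own row without needing 85 views. The view ID and OAuth token are placeholders; in practice the token would come from a service-account OAuth2 flow.

const VIEW_ID = 'ga:12345678';                       // placeholder view (profile) ID
const ACCESS_TOKEN = 'ya29.placeholder-oauth-token'; // placeholder OAuth2 token

const params = new URLSearchParams({
  ids: VIEW_ID,
  'start-date': '30daysAgo',
  'end-date': 'today',
  metrics: 'ga:pageviews,ga:sessions',
  dimensions: 'ga:hostname',   // one row per subdomain, e.g. foo.bar.net
});

fetch('https://www.googleapis.com/analytics/v3/data/ga?' + params.toString(), {
  headers: { Authorization: 'Bearer ' + ACCESS_TOKEN },
})
  .then(function (res) { return res.json(); })
  .then(function (report) {
    // Each row is [hostname, pageviews, sessions]; write these out to a
    // spreadsheet or your own database, one line per subdomain.
    (report.rows || []).forEach(function (row) { console.log(row.join('\t')); });
  });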

Google maps - map from certain date and time

Please, before you vote this down, consider the question, as I have not been able to conceptualize a better way or place to ask it:
I have experimented enough with Google Maps to understand the overall structure: making requests, creating custom flags, etc. It is all quite easy and very similar to the jCharts library.
Now, Google obviously has something that it does not make available: a map from a certain date in the past. I do not need a full day-by-day iteration, but even every 6 months or so would be huge.
Is this possible? Has anyone else experimented with this?
Is the only option to save results locally and reinvent the google maps wheel?
Thank you very much
Google Earth has this functionality: http://www.google.com/earth/explore/showcase/historical.html
Travel back in time with Historical Imagery in Google Earth. View your neighborhood, home town, and other familiar places to see how they have changed over time.
As for Google maps:
A discussion suggesting the use of older URLs to obtain the old satellite images.
This example supposedly pulls older images if they're available. Doesn't work that well for me.
This search on the Google groups might help but I see numerous posts about it not being officially available.
There is no official service. These posts hint at ways to go back a short while, under some circumstances.
http://groups.google.com/group/google-maps-api/search?group=google-maps-api&q=old+satellite
Note the comments about seeing if it is within the terms - probably not - and the risk of getting (temporarily) blocked.
