I am working on a search engine that needs access to results from Google. Here are my options:
Using the custom search API
Using a proxy to make my server send searches and return the data
I am not sure about some things though:
Is the Custom Search API limited? I may need a really large number of queries, so if usage is limited it will be a problem.
Is it "authorized" to use a proxy in node that would send search queries to google and intercept the result to show to my users? If I do so, wouldn't I run to some limitations?
The inspiration here is Gizoogle, which managed to plug into Google's API (they have the same results as Google) while still not using Custom Search (Custom Search displays ads, and there aren't any on that website). So I assume they have some sort of proxy, but how come Google lets them run those queries?
Edit: It turns out that the Custom Search API is also limited. So, how did Gizoogle do it?
Ok here is how I solved this problem:
It turns out that Google has an old, mostly forgotten API (probably deprecated, so be aware of this) for client-side ajax search. It looks like this:
http://ajax.googleapis.com/ajax/services/search/web?v=1.0&q=test&rsz=large
Just go to that url to see what results it gives.
So basically here is the process:
The user types a search
It is sent to your server in ajax
The server might modify the search depending on your application (filtering forbidden words or whatever)
Your server polls the ajax web service from Google - don't forget to add the GET parameter userip, which is needed to avoid limitations (Google limits incoming queries from each user, so your server has to tell Google that it is making the request on behalf of this user's IP)
You send back the results to the client, and then use javascript to display them
The only drawback is that the search must be made in ajax, meaning that the page is empty at load and filled in later. You could, however, use GET parameters in the URL to preload the search and fill the page before sending it to the client.
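To make those steps concrete, here is a minimal Node/Express sketch of that flow. The endpoint and the userip parameter are taken from the description above; the route name, the Express setup and the filtering step are illustrative assumptions (and remember this API is deprecated, so it may stop working at any time).

// Minimal sketch of the flow above; route name and filtering are illustrative.
const express = require('express');
const http = require('http');

const app = express();

app.get('/search', (req, res) => {
  // The user's search arrives via ajax; optionally modify/filter it here.
  const query = (req.query.q || '').trim();

  // Poll Google's (deprecated) ajax web search service, passing the client's
  // IP so Google applies its limits per end user, not per server.
  const url = 'http://ajax.googleapis.com/ajax/services/search/web'
    + '?v=1.0&rsz=large'
    + '&q=' + encodeURIComponent(query)
    + '&userip=' + encodeURIComponent(req.ip);

  http.get(url, (googleRes) => {
    let body = '';
    googleRes.on('data', (chunk) => { body += chunk; });
    // Send the results back to the client, which renders them with JavaScript.
    googleRes.on('end', () => res.type('application/json').send(body));
  }).on('error', () => res.status(502).json({ error: 'search unavailable' }));
});

app.listen(3000);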
Google Custom Search (GCS) has a free mode and a paid ("enterprise") mode.
Both modes are governed by terms of service (the Custom Search Terms of Service) - make sure you read them carefully.
From what I understand, you can use the free mode and search as much as you'd like. Because Google is returning the results, they also return ads, so they get paid that way.
The paid mode gives you access to the API and lets you turn off the ads and do other things. But it comes at a cost.
I've been combing through the documentation and terms and the like -- it's really not Google's best effort. But if you are using it exactly as they describe, it's pretty standard, really.
Depending on your project size and the funds available, you could get a GSA (Google Search Appliance): http://www.google.com/enterprise/search/products/gsa.html
The Dr. Oz website uses this to index and pull in results from partnered sites, and you would have the ability to include Google results as well. It's highly customizable, with everything from source weight ranking and filtering options to custom output.
I'm in the process of building a website and want to include a component that lists the three next upcoming calendar events from a public Google Calendar. I'm building the site using GatsbyJS, and so will be using Node.js to process the data and display it correctly on the page.
What I'm not quite sure how to do, however, is interface with the Google Calendar API in the first place; I'm building a static site with no backend, so will need to interact with the API clientside instead.
I've done a bit of reading and found the documentation for Google's Node.js client. It lists three different ways of interacting with the API, but I'm not sure which is most appropriate for my situation. I think I've ruled out OAuth2, since it sounds like individual users would need to sign in with their Google accounts in order for the calendar event pulling to work correctly (which is hardly a good solution). Which of the other two options do you think I should be using?
Also, are there any security concerns I should be taking into account, given it's likely the website might be storing API keys and/or client IDs in a publicly accessible way?
The API key method is no different from including the key in plain text in the query string when loading a script, e.g.
<script src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY&callback=initMap" async defer></script>
When you create the key, you can then restrict its usage to certain HTTP referrers, etc.
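For a public calendar, a key-only request is enough. Here is a rough client-side sketch of what that could look like; CALENDAR_ID and YOUR_API_KEY are placeholders, and you should double-check the exact Calendar API v3 parameters against Google's documentation:

// Fetch the three next upcoming events of a *public* Google Calendar.
// CALENDAR_ID and YOUR_API_KEY are placeholders - restrict the key to your
// site's HTTP referrers in the Google console, as mentioned above.
const CALENDAR_ID = 'your_calendar_id@group.calendar.google.com';
const API_KEY = 'YOUR_API_KEY';

function fetchUpcomingEvents() {
  const url = 'https://www.googleapis.com/calendar/v3/calendars/'
    + encodeURIComponent(CALENDAR_ID) + '/events'
    + '?key=' + API_KEY
    + '&singleEvents=true&orderBy=startTime&maxResults=3'
    + '&timeMin=' + encodeURIComponent(new Date().toISOString());

  return fetch(url)
    .then((res) => res.json())
    .then((data) => data.items || []);
}

// Usage: log the next three events (render them on the page in practice).
fetchUpcomingEvents().then((events) => {
  events.forEach((event) => console.log(event.summary, event.start));
});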
As the title says, I'd like to gather geolocation information about users from my existing Google Analytics snippet; otherwise I'll have to use some extra JS library that deals with it.
You have to use an extra library.
Google resolves user location on the server, so you cannot get it from the client-side tracking code. And retrieving it via the API fails for a number of reasons, starting with the fact that it would only work after the first hit, and that it would be more work than using a specialized service.
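If you do go the extra-library route, the usual pattern is to call a third-party geo-IP service from the client and use its response in your own code. This is only a sketch; the endpoint below is hypothetical and the response fields depend entirely on the service you pick.

// Hypothetical geo-IP lookup; replace the URL and field names with those of
// whatever geolocation service you actually choose.
fetch('https://geoip.example.com/json')
  .then((res) => res.json())
  .then((location) => {
    // e.g. { country: 'IT', region: 'Lazio', city: 'Rome' }
    console.log('Visitor appears to be in', location.city, location.country);
  })
  .catch(() => {
    console.log('Geolocation service unavailable');
  });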
I am making a small payment system; basically it's just a point system where you pay, say, 1 USD and get 100 points, which are used later in a game project to get bonuses. It's a script for game servers, something like a user panel.
Now the script system is ready, but I'm afraid to give it away, since then someone will share it and it will spread all over the gaming scene. What would be a solution to keep it working only if I give someone permission?
I thought about remaking the whole code and making it work on my website, but I don't think people will want to put their SQL data on a website that is not located on their own host. Please help me out, at least with some clues. Maybe it's possible to make some widgets? Or maybe some licensing system?
I'm really lost.
You should implement the logic on the server side as a REST API call, and include in the script only an ajax call to that API. You can limit use of the API through an API key that you provide only to qualified sites.
You'd need to implement some sort of server-side authentication/API so that only verified users can use the script, much like how software checks a licence.
On script load, your JavaScript could make an ajax call to a server, passing through the user's IP, auth key, username, etc.
This can then be verified on the server, perhaps returning a dynamically generated URL pointing to a JavaScript file which contains your business logic
(so that URLs are dynamically generated for that user's session only).
That way people can't hotlink the script, and the script you give out is solely the ajax call
(with the business logic script injected on auth).
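As a rough illustration of both suggestions above, the script you hand out could be nothing more than a small loader, with the actual point-system logic served by your server only after the key checks out. The endpoint name, the key store and the fields sent are all illustrative assumptions, not a finished design.

// --- client snippet given to licensed customers (the only code they see) ---
(function () {
  var s = document.createElement('script');
  s.src = 'https://your-licensing-server.example/api/script'
        + '?key=CUSTOMER_API_KEY&site=' + encodeURIComponent(location.hostname);
  document.head.appendChild(s);
})();

// --- server side (Node/Express), serving the real logic only to valid keys ---
const express = require('express');
const app = express();

// In practice this would live in a database, one key per licensed customer.
const licensedSites = { CUSTOMER_API_KEY: 'customersite.com' };

app.get('/api/script', (req, res) => {
  const { key, site } = req.query;
  if (licensedSites[key] !== site) {
    return res.status(403).type('text/javascript').send('// unauthorized');
  }
  // Verified customers receive the actual business logic (or a one-time URL).
  res.type('text/javascript').send('/* point-system logic goes here */');
});

app.listen(3000);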
I'm trying to create a webpage that can incorporate LinkedIn info (profiles, people, companies, etc.).
The things that it can/would do are the following:
When the user enters a name that is registered in LinkedIn, he gets the following
* Name, Company, Email
* List of LinkedIn messages that are waiting for a reply
The same process goes on every time the user adds a profile. I'm planning to use the Profile API of LinkedIn to get the Name, Company and Email, but I can't find a working example to use as a basis.
As for the second one, I still don't know how to get the LinkedIn messages.
Here's my Layout and expected result.
How can I achieve this? Opinions and suggestions are highly appreciated, thanks.
This is far too broad a question for me to invest the necessary time in figuring out the (multiple) answers for you, but let me give you some hints. First of all, from my experience with the LinkedIn API, not all the data you wish to access is available (do double-check this, though; I used the API quite a while back and things might have changed in the meantime).

Since that data is not available through the API, the only alternative would be to somehow bypass the cross-domain policy, which would require the user to install a Chrome extension/Firefox plugin that functions as a proxy for your application, or even 'better', making your entire application a browser-plugin-based web app. Not that I am a fan of those whatsoever, but if your application is meant in any way as a (dedicated) LinkedIn plugin (probably as part of a greater service you're developing), then that might make the most sense.
The whole system you are describing is very long-winded and requires a large amount of development time. A lot of the data is not accessible, directly or indirectly, either. You cannot get email addresses out of the API, as a security feature (bots could otherwise just harvest emails for marketing campaigns).
First of all, you will need to make an application that allows for OAuth2 connections with the LinkedIn API service. People will log onto your website, click to connect their LinkedIn account with your website, and your website will receive back an access token with which to make the calls.
You will then need to build the queries which will access the data you require. The LinkedIn API documentation (http://developer.linkedin.com/) isn't greatly in-depth, but it gives you a good understanding and points you where you need to go. There are also a couple of pre-built PHP API wrappers around, such as https://code.google.com/p/simple-linkedinphp/.
I have worked with many APIs, from Twitter's and Facebook's to LinkedIn's, and they all require a lot of back-end work to make sure that they are secure and get the correct data.
It would take me hours to go through exactly how to do it; it has taken me many hours to get a solid implementation in place and working with all the different calls available.
If you have minimal coding knowledge, it would be best to go to an external company with a large amount of resources and knowledge in the field who can do it for you. Otherwise it may take many months to get a working prototype.
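To give you a head start on the OAuth2 part, here is a generic authorization-code sketch in Node/Express. The endpoint URLs and scope below are assumptions on my part; always take the current ones from the LinkedIn developer documentation rather than copying these.

const express = require('express');
const app = express();

const CLIENT_ID = 'YOUR_CLIENT_ID';
const CLIENT_SECRET = 'YOUR_CLIENT_SECRET';
const REDIRECT_URI = 'https://yoursite.example/linkedin/callback';

// Step 1: send the user to LinkedIn to approve access to their account.
app.get('/linkedin/login', (req, res) => {
  const authUrl = 'https://www.linkedin.com/oauth/v2/authorization'
    + '?response_type=code'
    + '&client_id=' + CLIENT_ID
    + '&redirect_uri=' + encodeURIComponent(REDIRECT_URI)
    + '&scope=' + encodeURIComponent('r_liteprofile');
  res.redirect(authUrl);
});

// Step 2: LinkedIn redirects back with a code, which you exchange for an
// access token; that token is what you use for the profile API calls.
app.get('/linkedin/callback', async (req, res) => {
  const body = new URLSearchParams({
    grant_type: 'authorization_code',
    code: req.query.code,
    redirect_uri: REDIRECT_URI,
    client_id: CLIENT_ID,
    client_secret: CLIENT_SECRET,
  });
  const tokenRes = await fetch('https://www.linkedin.com/oauth/v2/accessToken', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body,
  });
  const token = await tokenRes.json();
  // Store token.access_token server-side and use it for subsequent API calls.
  res.send('Connected to LinkedIn.');
});

app.listen(3000);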
In my site I have:
...
<script type="text/javascript" src="https://www.google.com/jsapi"></script>
...
The script above is the Google loader script used to load other resources dynamically
(e.g. the Google Charts API).
This works 99.99% of the time.
However, I just got a client whose company, for some reason, restricts access to google.com.
As a consequence of this my website simply threw a JavaScript error.
Now, I know how to handle that, and I can check if window.google exists,
but my question is
"what's the standard way to deal with this? "
In other words, if you embed third-party JavaScript, how do you best deal with their JS not being available?
NOTE: VERY IMPORTANT
You can not host the chart code locally or on an intranet.
SEE FAQ from Google: https://developers.google.com/chart/interactive/faq#localdownload
Can I download and host the chart code locally, or on an intranet?
Sorry; our terms of service do not allow you to download and save or
host the google.load or google.visualization code.
There is no real alternative. Due to Google's terms of service you cannot use the Google API without access to google.com.
Check the connection to Google and inform the user that the function is not available.
Develop your own, or use a non-Google API. You can still use Google when it is available.
The solution is for your client's company to review their content-filtering policies. Google is quite clear in the FAQ answer quoted above concerning offline access:
…your computer must have live access to http://www.google.com/jsapi in order to use charts.
You are using a third-party solution according to their terms and conditions, which naturally imposes limits on how that solution may be used by your clients. You need to stand firm or find a more liberally-licensed solution. (At any rate, you are more likely to succeed at convincing your client's IT department than petitioning Google to change their TOS.)
For the more general case of third party JS APIs that may not load but for which you are allowed to keep a local copy on your server, see this question.
You can try it like this:
Instead of using the direct link to the Google libraries you want to use, use a link which points to your server:
<script type="text/javascript" src="https://www.myserver.com/jsapi"></script>
When your server gets an incoming request to this URL, your server now makes a request to Google to get the API and sends the response to the client.
That means you do not install the API anywhere, locally or on a server, and you always get the most recent version directly from Google. Users also do not need access to Google (as with the company you mentioned) and can therefore use your service.
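A minimal sketch of that pass-through in Node/Express could look like the following; the route name, port and error handling are my own assumptions, and the terms-of-service caveat quoted earlier in this thread still applies to how the loaded script is used.

// Pass-through for https://www.google.com/jsapi so the browser only ever
// talks to your own domain. Route name and port are illustrative.
const express = require('express');
const https = require('https');

const app = express();

app.get('/jsapi', (req, res) => {
  https.get('https://www.google.com/jsapi', (googleRes) => {
    res.set('Content-Type', 'text/javascript');
    googleRes.pipe(res); // stream Google's loader straight to the client
  }).on('error', () => {
    res.status(502).send('// Google jsapi is currently unreachable');
  });
});

app.listen(3000);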
Use Firebug or the Chrome Dev Tools to inspect your HTML source once the charts scripts are loaded. Access the scripts in your browser and save them locally, then serve them from your own server. This isn't recommended, of course, but if you don't have any other choice...
For example, checking the code of one of the pages I use it on, the core script for the Google Charts library is located at:
https://www.google.com/uds/api/visualization/1.0/3d781368978b51b3ca00a01566dccf40/format+en,default,corechart.I.js
Use the JavaScript window.onload event to check whether the API has loaded or not; if not, then load it from your server.
You already know how to check whether or not your library has been loaded (checking the object). If it fails, here is what you can do within the given constraints:
Keep checking the object with a timer while trying to download the library, displaying a message for the user
If that fails, you have two options again:
Stop your application and display an error: "Application error... try later"
Or download a different library as a fallback
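A small sketch of that check-and-fallback idea; the element id, retry count and fallback message are illustrative, not part of the answer above.

// Check for the Google loader after page load; retry a few times with a
// timer, then fall back to a user-visible message instead of a raw JS error.
window.onload = function () {
  var attempts = 0;

  function checkGoogleApi() {
    if (window.google && window.google.load) {
      drawCharts(); // your existing chart-drawing code
      return;
    }
    attempts += 1;
    if (attempts < 5) {
      setTimeout(checkGoogleApi, 1000);
    } else {
      document.getElementById('chart').textContent =
        'Charts are temporarily unavailable. Please try again later.';
    }
  }

  checkGoogleApi();
};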
Are you progressively enhancing or gracefully degrading the page? If so, what do you display to users without JavaScript for this chart? A table? A list? This is what you should leave in the page and only start changing once Google's JS is available. Either that, or find an alternative library like raphaeljs that lets you keep all your code within your project.
IF (BIG IF) you are not worried about the interactivity of the Google Charts and just want to display them to the user - maybe adding your own JavaScript on top, but not depending on the Google JavaScript at all - you can turn the Google charts into an image that you display to the user.
This also requires being able to install a command-line tool on the server.
http://code.google.com/p/wkhtmltopdf/ is a command-line tool that will generate an image from an HTML page. If you build a simple page that only shows the chart you want and point the wkhtmltoimage tool at the local HTML file, it will load the Google Charts JavaScript, generate the chart, and then produce an image of the result.
Yes, I understand this is very kludgy and adds a big tool for a small problem, but given the browser restriction and the Google Terms of Service, it will solve most of the problem.
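If you want to drive that from your existing Node backend, a hedged sketch could be as simple as the following, assuming wkhtmltoimage is installed on the server and chart-page.html is your chart-only page (both names are illustrative):

// Render the chart-only HTML page to a static PNG with wkhtmltoimage.
const { execFile } = require('child_process');

execFile('wkhtmltoimage', ['chart-page.html', 'chart.png'], (err) => {
  if (err) {
    console.error('wkhtmltoimage failed:', err);
    return;
  }
  console.log('Static chart image written to chart.png');
});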
You can try going straight to Google, and if it fails (if Google is restricted) you can bounce the request off your server, which forwards the request to Google using cURL. If that doesn't work, then Google is most likely down. This should cover the issue you described in your question, but there isn't really a fix for Google itself actually going down. It should, however, give your application a way around domain restrictions, because the request is routed through your server rather than straight to Google.

I use this architecture for all requests so that I don't have ajax requests routed to random servers. It allows me to control what interacts with my front end using my back end. There are other benefits to this, especially if you are using something like AngularJS with NodeJS, because you can decouple a lot of your third-party libraries. That, however, is beyond the scope of your question!
Basically, it works like this (pseudo code):
If(!Browser->Google->Browser){
return Browser->MyServer->Google->MyServer->Browser;
}
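In actual JavaScript, that pseudocode could be approximated like this; the /proxy/jsapi fallback endpoint is assumed to exist on your own backend (along the lines of the pass-through answer above), and the function name is illustrative.

// Try Google directly; if the script fails to load (blocked or down),
// retry through your own server, which forwards the request to Google.
function loadGoogleJsApi(onReady) {
  var direct = document.createElement('script');
  direct.src = 'https://www.google.com/jsapi';
  direct.onload = onReady;
  direct.onerror = function () {
    var bounced = document.createElement('script');
    bounced.src = '/proxy/jsapi'; // your server fetches it from Google
    bounced.onload = onReady;
    document.head.appendChild(bounced);
  };
  document.head.appendChild(direct);
}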
An answer has been accepted already, but I would still like to add an additional aspect, elaborating on the comment I made above.
It has been established that the Google server is the only place from which the API can be loaded. We don't know whether the client's IT manager will rethink their content policy; they might have good reasons for it.
Given the less-than-100% availability of all the components along the path between a user's browser and the Google API, sooner or later a user will end up in an error situation; statistically this is unavoidable.
What is not acceptable (and is avoidable) is for a user to receive an unspecific JS error, making him/her believe there's a bug on the page. So my solution would be to trap the failure to load the Google API and display a message like "Third-party components temporarily unavailable - please try again later".
This will demonstrate to the user that
we know what's going on
there's nothing we can do about it now
but it's not totally unexpected and still somehow under control