I have tried many things, but I'm not able to find a proper API that will return the +1 count from Google Plus.
I have already tried the approaches from:
Getting Counts for Twitter links, Facebook likes and Google +1 with Jquery and AJAX
Getting Google+ subscription count with jQuery
How do I get the counter of a google plus +1 button?
But none of them gives me the answer.
Any thoughts?
Thanks :)
I just found a very useful site that solves this problem. :) Thanks to its author!
http://99webtools.com/script-to-get-shared-count.php
You could write your own function using the link you and jgillich mentioned. This would be slightly simplified with jQuery.
Here's a jsfiddle I made as an example. You'll probably have to use something like PHP to fetch from the site if you want to circumvent inter-domain issues. It could look something like this though (ignoring domains):
$('#myInput').keyup(function () {
    var url = 'https://plusone.google.com/_/+1/fastbutton?url=' + encodeURIComponent($(this).val());
    $.get(url, function (data) {
        // Read the rounded count shown on the button, and try to pull the
        // exact count out of the inline script in the response.
        var aggregate = $('#aggregateCount', data).html(),
            exactMatch = $('script', data).html().match('\\s*c\\s*:\\s*(\\d+)');
        $('div').html(exactMatch ? exactMatch[1] + ' (' + aggregate + ')' : aggregate);
    });
});
Currently, the API does not offer any method to retrieve the +1 count. A workaround is to fetch it directly from the +1 button, as described here (you already linked to it, but I don't think there is another way).
This works for me:
var _url = 'http://mylink.com/';

// anyorigin.com acts as a JSONP proxy to get around the same-origin policy;
// we then read the aggregate count out of the returned +1 button markup.
$.getJSON('http://anyorigin.com/get?callback=?&url=' + encodeURIComponent('https://plusone.google.com/_/+1/fastbutton?url=' + _url)).done(function (data, status, xhr) {
    console.log($(data.contents).find('#aggregateCount').html());
});
Check out the following for another way to get the Google+ count.
http://share.yandex.ru/gpp.xml?url=http://google.com
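If you go that route, a minimal jQuery sketch might look like the one below. Note that the response format of the Yandex endpoint is an assumption here (the code just logs the raw body), and the request is subject to the same cross-domain restrictions discussed above, so you may need a server-side proxy.
// Hypothetical sketch: query the Yandex share-count endpoint for a page URL.
var pageUrl = 'http://google.com';
$.get('http://share.yandex.ru/gpp.xml?url=' + encodeURIComponent(pageUrl), function (response) {
    console.log(response); // inspect this to see how the count is reported
});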
At the moment I have the top of the code like this:
$.getJSON(' https://api.roleplay.co.uk/v1/player/' + "END-LINK", function(data)
What I want is for the "END-LINK" bit to be the end of my URL. For example, if my URL is www.link.com/player.html/76561198062083666, I want it to append those numbers to the jQuery request so it calls the API "https://api.roleplay.co.uk/v1/player/76561198062083666".
I think what you want is to retrieve the ID from the link and send it to the API.
// Take everything after the last '/' in the current URL and append it to the API call.
var endLink = window.location.href.substr(window.location.href.lastIndexOf('/') + 1);
$.getJSON('https://api.roleplay.co.uk/v1/player/' + endLink, function (data) {...})
In order to query your player ID, you would assign it to a local variable, and then pass that local variable to your .getJSON() function just as you have, except without the quotes:
var end_link = 76561198062083666;
$.getJSON('https://api.roleplay.co.uk/v1/player/' + end_link, function(data) {
console.log(data);
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
Note that I have replaced END-LINK with end_link, as you cannot have hyphens in variable names (the parser will treat it as subtraction).
Also note that you will be unable to retrieve your information through .getJSON(), as RolePlay.co.uk does not send an 'Access-Control-Allow-Origin' header, meaning they disallow other sites from querying it under Cross-Origin Resource Sharing (CORS).
This is typically handled on the server, but since you don't have access to it, you can bypass it for local testing by launching your browser with the --disable-web-security flag.
Hope this helps! :)
Obsidian Age has already explained very well what you can do to solve your problem.
If I'm not mistaken, you want to take the value from the link and send it to the API, which you can do this way:
var endLink = window.location.href.substr(window.location.href.lastIndexOf('/') + 1);
$.getJSON('https://api.roleplay.co.uk/v1/player/' + endLink, function (data) {
    console.log(data);
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
So I have a stock website and I want to obtain JSON information from either Google's or Yahoo's API. I am currently just testing, which is why I use a replacement function for my console log to print it into a text box for now. I cannot seem to get this to work correctly; I have looked at other pure-JS scripts and read past posts, but they have very similar answers, and I have tried implementing them.
It works with a full JSON string, with just {}, where I can access the inner elements. However, even when accessing it in what I believe is the correct way, it does not seem to work, and I tried another API with a different method and it worked fine. Can anyone explain? I also tried using $.getJSON.
$.get("http://d.yimg.com/aq/autoc?query=y®ion=US&lang=en-US", function(data) {
var dropDownHTML;
var stock = data.ResultSet.Result;
for (var i = 0, len = stock.length ;i<len;i++){
dropDownHTML += '<option value="' + stock[i].symbol + '">' + stock[i].name + '</option>';
}
document.getElementById("options").innerHTML = dropDownHTML;
});
The problem is simply that the website you are scraping from has specifically blocked HTTP requests. You'll need to connect with HTTPS instead:
https://mysafeinfo.com/api/data?list=englishmonarchs&format=json
Also, you're returning a large array of objects from your scrape -- you'd need to loop through, and log the contents of each object individually:
for (var i = 0; i < data.length; i++) {
    console.log(data[i]); // log each object individually, not the whole array
}
I've created a Fiddle showing a working scrape here.
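Putting those two pieces together, a minimal sketch of the whole request (using the HTTPS endpoint above) could look like this:
// Fetch the list over HTTPS and log each object in the returned array.
$.getJSON('https://mysafeinfo.com/api/data?list=englishmonarchs&format=json', function (data) {
    for (var i = 0; i < data.length; i++) {
        console.log(data[i]);
    }
});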
Hope this helps! :)
I'm not sure if the way to do this is to check Google Analytics cookies or to otherwise track where a user came to my site from. Basically, I have a form with a hidden field code="XY1". Now I need to insert a different preset code for people who came from, say, Facebook, so the script would have to check where the visitor came from and then assign code XF1 to anyone from FB, code XT1 to anyone from Twitter, etc.
Would something like this PHP work for the capture?:
$referringPage = parse_url( $_SERVER['HTTP_REFERER'] );
if ( stristr( $referringPage['host'], 'facebook.com' ) )
Or this JS
var ref = document.referrer;
if (ref.indexOf("facebook.com") != -1) {
    document.write(...)
}
I'm not sure what is the best way to do it and what kind of methods can reliably check the source of a visitor, so any help would be greatly appreciated.
You can use $_SERVER['HTTP_REFERER'], but it's not guaranteed to be accurate, or even present. Not all browsers will necessarily set it, and some allow you to set it yourself. Google cookies won't contain any site history, and you can't examine the browser history, so there's no guaranteed way to do what you're asking.
You can try this option, which uses a regular expression's test() method:
$(function () {
    var referer = document.referrer,                            // option 1
        // referer = "<?php echo $_SERVER['HTTP_REFERER']; ?>", // option 2
        XFB = /facebook\.com/g,
        XFT = /twitter\.com/g,
        checkF1 = XFB.test(referer),
        checkF2 = XFT.test(referer);

    if (checkF1) {
        // Visitor arrived from Facebook
        $('#hiddenInput').val('XF1');
    } else if (checkF2) {
        // Visitor arrived from Twitter
        $('#hiddenInput').val('XT1');
    }
});
I wrote a small JavaScript a couple of years ago that grabbed a user's (my own) most recent tweet and then parsed it for display, including links, date, etc.
It used this JSON call to retrieve the tweets, but it no longer works:
http://twitter.com/statuses/user_timeline/radfan.json
It now returns the error:
{"errors":[{"message":"Sorry, that page does not exist","code":34}]}
I have looked at using the API version (code below), but it requires authentication, which I would rather avoid since this is just to display my latest tweet on my website, and it is public anyway on my profile page:
http://api.twitter.com/1/statuses/radfan.json
I haven't kept up with Twitter's API changes as I no longer really work with it. Is there a way around this problem, or is it no longer possible?
Previously the Search API was the only Twitter API that didn't require some form of OAuth. Now it does require auth.
Twitter's Search API came from a third-party acquisition; they rarely support it and seem unenthusiastic that it even exists. On top of that, there are many limitations to the payload, including but not limited to a severely reduced set of key:value pairs in the JSON or XML file you get back.
When I heard this, I was shocked. I spent a LONG time figuring out how to use the least amount of code to do a simple GET request (like displaying a timeline).
I decided to go the OAuth route to be able to ensure a relevant payload. You need a server-side language to do this. JavaScript is visible to end users, and thus it's a bad idea to include the necessary keys and secrets in a .js file.
I didn't want to use a big library, so the answer for me was PHP and help from @Rivers' answer here. The answer below it by @lackovic10 describes how to include queries in your authentication.
I hope this helps others save time thinking about how to go about using Twitter's API with the new OAuth requirement.
You can access and scrape Twitter via advanced search without being logged in:
https://twitter.com/search-advanced
GET request
When performing a basic search request you get:
https://twitter.com/search?q=Babylon%205&src=typd
q (our query encoded)
src (assumed to be the source of the query, i.e. typed)
By default, Twitter returns the top 25 results, but if you click on All you can get the realtime tweets:
https://twitter.com/search?f=realtime&q=Babylon%205&src=typd
JSON contents
More Tweets are loaded on the page via AJAX:
https://twitter.com/i/search/timeline?f=realtime&q=Babylon%205&src=typd&include_available_features=1&include_entities=1&last_note_ts=85&max_position=TWEET-553069642609344512-553159310448918528-BD1UO2FFu9QAAAAAAAAETAAAAAcAAAASAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
Use max_position to request the next tweets
The following JSON response returns all you need to scrape the contents (see the sketch after this list):
https://twitter.com/i/search/timeline?f=realtime&q=Babylon%205&src=typd
has_more_items (bool)
items_html (html)
max_position (key)
refresh_cursor (key)
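A minimal jQuery sketch of that request might look like the following. The URL parameters come straight from the examples above, but the handling of the response fields is an assumption based on the list, and the request is subject to cross-origin restrictions (you would need a proxy such as the anyorigin approach shown later in this thread, or to run it from a same-origin context).
// Hypothetical sketch: fetch one page of realtime search results from the
// timeline endpoint described above and page through with max_position.
var searchUrl = 'https://twitter.com/i/search/timeline?f=realtime&q=' +
                encodeURIComponent('Babylon 5') + '&src=typd';

$.getJSON(searchUrl, function (data) {
    $('#results').append(data.items_html);   // rendered tweets as HTML
    if (data.has_more_items) {
        // Pass this back as &max_position=... to request the next page.
        console.log('next cursor:', data.max_position);
    }
});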
DOM elements
Here is a list of DOM elements you can use to extract the data:
The author's Twitter handle
div.original-tweet[data-tweet-id]
The name of the author
div.original-tweet[data-name]
The user ID of the author
div.original-tweet[data-user-id]
Timestamp of the post
span._timestamp[data-time]
Timestamp of the post in ms
span._timestamp[data-time-ms]
Text of Tweet
p.tweet-text
Number of Retweets
span.ProfileTweet-action--retweet > span.ProfileTweet-actionCount[data-tweet-stat-count]
Number of Favorites
span.ProfileTweet-action--favorite > span.ProfileTweet-actionCount[data-tweet-stat-count]
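As a rough illustration, the items_html returned by the JSON endpoint above could be parsed with those selectors roughly as follows (assuming data is the response from the earlier sketch; everything else here is illustrative):
// Hypothetical sketch: pull fields out of items_html using the selectors above.
var container = $('<div></div>').html(data.items_html);

container.find('p.tweet-text').each(function () {
    console.log('text:', $(this).text());
});

container.find('span._timestamp').each(function () {
    console.log('posted at (unix):', $(this).attr('data-time'));
});

container.find('span.ProfileTweet-actionCount[data-tweet-stat-count]').each(function () {
    console.log('count:', $(this).attr('data-tweet-stat-count'));
});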
Resources
https://code.recuweb.com/2015/scraping-tweets-directly-from-twitter-without-authentication/
If you're still looking for unauthenticated tweets in JSON, this should work:
https://github.com/cosmocatalano/tweet-2-json
As you can see in the documentation, with the REST API you'll need OAuth tokens in order to do this. Luckily, we can use the Search API (which doesn't use OAuth) together with the from:[USERNAME] operator.
Example:
http://search.twitter.com/search.json?q=from:marcofolio
Will give you a JSON object with tweets from that user, where
object.results[0]
will give you the last tweet.
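For example, a minimal jQuery sketch against that (unauthenticated, now-retired) Search endpoint might look like this; the response shape follows the description above:
// Hypothetical sketch: fetch the latest tweet from a user via the old Search API.
$.getJSON('http://search.twitter.com/search.json?q=from:marcofolio&callback=?', function (object) {
    if (object.results && object.results.length) {
        console.log(object.results[0].text); // the most recent tweet
    }
});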
Here is a quick hack (really a hack, to be used with caution as it's not future-proof) which uses http://anyorigin.com to scrape the Twitter site for the latest tweets.
http://codepen.io/JonOlick/pen/XJaXBd
It works by using anyorigin (you have to pay to use it) to grab the HTML. It then parses the HTML using jquery to extract out the relevant tweets.
Tweets on the mobile site use a div with the class .tweet-text, so this is pretty painless.
The relevant code looks like this:
$.getJSON('http://anyorigin.com/get?url=mobile.twitter.com/JonOlick&callback=?', function (data) {
    // Remap the … utf8 character to ascii.
    var bar = data.contents;
    bar = bar.replace(/…/g, '...');

    var el = $('<div></div>');
    el.html(bar);

    // Change all links to point back at twitter
    $('.twitter-atreply', el).each(function (i) {
        $(this).attr('href', "https://twitter.com" + $(this).attr('href'));
    });

    // For all tweets
    $('.tweet-text', el).each(function (i) {
        // We only care about the first 4 tweets
        if (i < 4) {
            var foo = $(this).html();
            $('#test').html($('#test').html() + "<div class=ProfileTweet><div class=ProfileTweet-contents>" + foo + "</div></div><br>");
        }
    });
});
You can use a Twitter API wrapper such as TweetJS.com, which offers a limited set of the Twitter API's functionality but does not require authentication. It's called like this:
TweetJs.ListTweetsOnUserTimeline("PetrucciMusic",
    function (data) {
        console.log(data);
    });
You can use the Twitter API v1 to get the tweets without using OAuth. For example: this link returns @jack's last 100 tweets.
The timeline documentation is here.
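For reference, a minimal jQuery sketch of such a call might look like the following; the endpoint and parameters are based on the old v1 user_timeline API as it worked at the time (it has since been retired), so treat this as illustrative:
// Hypothetical sketch: fetch a user's recent tweets from the old v1 API via JSONP.
$.getJSON('https://api.twitter.com/1/statuses/user_timeline.json?screen_name=jack&count=100&callback=?', function (tweets) {
    for (var i = 0; i < tweets.length; i++) {
        console.log(tweets[i].text);
    }
});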
The method "GET statuses/user_timeline" need a user Authentification like you can see on the official documentation :
You can use the search method "GET search" wich not require authentification.
You have a code for starting here : http://jsfiddle.net/73L4c/6/
function searchTwitter(query) {
    $.ajax({
        url: 'http://search.twitter.com/search.json?' + jQuery.param(query),
        dataType: 'jsonp',
        success: function (data) {
            var tweets = $('#tweets');
            tweets.html('');
            for (var res in data['results']) {
                tweets.append('<div>' + data['results'][res]['from_user'] + ' wrote: <p>' + data['results'][res]['text'] + '</p></div><br />');
            }
        }
    });
}

$(document).ready(function () {
    $('#submit').click(function () {
        var params = {
            q: $('#query').val(),
            rpp: 5
        };
        // alert(jQuery.param(params));
        searchTwitter(params);
    });
});
The snippet of JavaScript that I am having trouble with looks like this:
var content_value = encodeURI(document.getElementById("chattext").value)
downloadUrl("/getchats", "POST", "content=" + content_value, onChatsReturned);
This code works, but it only posts the content. How would I have to change this in order to post another item, such as a description? I have all the other code ready and working, I just don't know how the parameters work for downloadUrl.
It's quite hard to say anything for certain since I can't know what the downloadUrl() function actually does, but here is a solution that might solve your problem (if it just passes the 3rd argument on to the backend).
downloadUrl("/getchats", "POST", "content=" + content_value + "&description=" + desciption_value, onChatsReturned);
As the comments mention, more information on the downloadUrl function is needed to say for sure.
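If downloadUrl simply forwards that third argument as the request body, it is safer to encode each value with encodeURIComponent (rather than encodeURI) so characters like & or = inside a value don't break the parameter string. A small sketch, where description_value and the "description" field are hypothetical:
// Hypothetical sketch: build a form-encoded body with several parameters.
var content_value = encodeURIComponent(document.getElementById("chattext").value);
var description_value = encodeURIComponent(document.getElementById("description").value); // hypothetical field

var body = "content=" + content_value + "&description=" + description_value;
downloadUrl("/getchats", "POST", body, onChatsReturned);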