I am working on a WordPress plugin that polls all the major social networking sites and returns the social counts (followers) for a specific user.
This can be quite slow and intensive on the server, so I have built the plugin using WordPress transient caching to store the details returned from the social networking sites, and I am using jQuery AJAX (JSON) to display the data.
These are the main functions:
Retrieve the Facebook Count
/**
 * Fetch the Facebook like count for a page or profile.
 *
 * @param string $facebook_id The Facebook ID to look up.
 * @return int|false The like count, or false on failure.
 */
function ass_get_fb_likes($facebook_id) {
    try {
        $json = wp_remote_get("http://graph.facebook.com/" . $facebook_id);
        if (is_wp_error($json))
            return false;
        $fbData = json_decode(wp_remote_retrieve_body($json), true);
        // Return the raw integer; formatting (1k, 2m, ...) happens client-side.
        return intval($fbData['likes']);
    } catch (Exception $e) {
        return false;
    }
}
The function above is wrapped by another function that handles the transient caching. That aspect works great.
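The caching wrapper itself is not shown in the question; a minimal sketch of what such a wrapper typically looks like with the Transients API (the function name, transient key, and expiry below are assumptions, not the plugin's actual code):
// Hypothetical transient-backed wrapper around ass_get_fb_likes().
function ass_get_fb_likes_cached($facebook_id) {
    $key   = 'ass_fb_likes_' . md5($facebook_id);
    $count = get_transient($key);
    if (false === $count) {
        $count = ass_get_fb_likes($facebook_id);
        if (false !== $count) {
            set_transient($key, $count, HOUR_IN_SECONDS); // cache for one hour
        }
    }
    return $count;
}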
Handles the initial display of the Social Network data
jQuery(function($) {
$('#fblikes').advanceddashboardwidget({
'action':'get_facebook_likes',
'service':'facebook',
'countof':'likes',
'callback':'formatCount'
});
});
Helper function to format the display
function formatCount(element,count){
var display_count='';
count=parseInt(count,10);
if(count>1000000)
{
count=count/1000000;
count=count.toFixed(0);
display_count=count+'m';
}
else if(count>1000)
{
count=count/1000;
count=count.toFixed(0);
display_count=count+'k';
}
else
{
display_count=count;
}
element.html(display_count);
}
The function below is the one giving me issues. It is used to communicate with WordPress to call the PHP functions and retrieve the data.
(function($) {
$(document).ready( function() {
var AdvancedDashboardWidget = function(element, options)
{
var ele = $(element);
var settings = $.extend({
action: '',
service: '',
countof: '',
query: '',
callback:''
}, options || {});
this.count=0;
var url='';
switch(settings.service)
{
case 'facebook':
if(settings.countof=='likes' || settings.countof=='talks')
{
ajaxCall(settings.action, ele, settings);
}
break;
}
};
var ajaxCall = function(action,ele,settings){
var opts = {
url: ajaxurl, // ajaxurl is defined by WordPress and points to /wp-admin/admin-ajax.php
type: 'POST',
async: true,
cache: false,
dataType: 'json',
data:{
action: settings.action // Tell WordPress how to handle this ajax request
},
success:function(response) {
//alert(response);
ele.html(response);
return;
},
error: function(xhr,textStatus,e) { // This can be expanded to provide more information
alert(e);
//alert('There was an error deleting the cache');
return;
}
};
$.ajax(opts);
};
$.fn.advanceddashboardwidget = function(options)
{
return this.each(function()
{
var element = $(this);
// Return early if this element already has a plugin instance
if (element.data('advanceddashboardwidget')) return;
// pass options to plugin constructor
var advanceddashboardwidget = new AdvancedDashboardWidget(this, options);
// Store plugin object in this element's data
element.data('advanceddashboardwidget', advanceddashboardwidget);
});
};
});
})(jQuery);
The Issues
The issue is that when the data is returned from the transient functions there is always an extra 0 ( zero ) appended to the number. From what I have been reading this could be because I am using "json" instead of "jsonp".
When I change it to "jsonp" I get an error "Error: jQuery172011280598581866697_1353705456268 was not called". I guess this has to do with the callback function.
So far this is the fastest way I have found to display this information on a site. If the data exists in the transient cache the pages load quickly, but if not it can take a few seconds, and that is where I want jQuery to come in and perhaps display a loading graphic until the data is retrieved.
Any help would be greatly appreciated.
Before you return your AJAX data back to the browser, you need to call die() (or exit), otherwise execution falls through to WordPress's own admin-ajax.php handler, which ends with die('0'); that is the extra zero you are seeing.
Edit:
WordPress now has (since 3.5.0) functions available for this:
wp_send_json_success( $data ): http://codex.wordpress.org/Function_Reference/wp_send_json_success
and wp_send_json_error( $data ): http://codex.wordpress.org/Function_Reference/wp_send_json_error.
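For example, the server-side handler for the get_facebook_likes action used above might look roughly like this (the handler name and the cached wrapper are assumptions, not code from the question):
// Hypothetical handler for the 'get_facebook_likes' AJAX action.
add_action('wp_ajax_get_facebook_likes', 'ass_ajax_get_facebook_likes');
add_action('wp_ajax_nopriv_get_facebook_likes', 'ass_ajax_get_facebook_likes'); // front-end visitors

function ass_ajax_get_facebook_likes() {
    $count = ass_get_fb_likes_cached('your_facebook_id'); // assumed transient-backed wrapper
    if (false === $count) {
        wp_send_json_error('Could not retrieve the count'); // outputs JSON and dies
    }
    wp_send_json_success($count); // outputs {"success":true,"data":...} and dies
    // Pre-3.5.0 equivalent: echo json_encode($count); die();
}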
Related
I have a website where users can get inbox messages and notifications while they are on the site (like on Facebook, where you see "(1)" at the beginning of the title when you have a notification).
Currently I have an AJAX request which grabs the data the title has to show. It works like a charm, but the issue is that this file is called every 10 seconds. If a user has 10 tabs open, the file is called 10x10=100 times; if my site has a thousand users, you understand how much load that would generate.
I thought of running the JavaScript on the active tab only, but then how can I update the title of all the open tabs of my website? Any other suggestions?
Here is my code
var oldtitle=$(document).attr("title");
var checker=function(){
$.ajax({
url : 'live_title.php',
type : 'POST',
success : function(data) {
... code ....
... code ....
... code ....
if (sum>0) {
$(document).attr("title", "("+sum+") "+oldtitle);
}
}
});
}
setInterval(checker,20000);
checker();
A cache mechanism seems the right way to go.
First idea: use HTTP caching
Be sure to add a parameter to the query string with the current timestamp rounded down to the nearest 10 seconds.
Be sure your web server sends the correct header for the HTTP cache to work. It's best with a GET request.
Example:
$.ajax({
url : 'live_title.php',
type : 'GET',
success : function(data) {
// code
},
data: {t: Math.floor((+new Date())/10000)}
});
// we send a request similar to live_title.php?t=142608488
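For the browser cache to actually be used, live_title.php must also send an appropriate caching header; a minimal sketch in PHP (the 10-second max-age mirrors the rounded timestamp above, and $sum stands in for the count computed by the script):
<?php
// live_title.php: let the browser reuse the same response for 10 seconds.
header('Cache-Control: public, max-age=10');
header('Content-Type: application/json');
echo json_encode(array('sum' => $sum)); // $sum: the notification count, computed elsewhere (assumed)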
Second idea: use window.localStorage as a secondary local cache.
Additionnaly to the first idea:
var getCache = function(t) {
    if (window.localStorage) {
        // localStorage only stores strings, so the cache object is JSON-encoded
        var liveTitle = JSON.parse(localStorage.getItem('liveTitle') || '{}');
        return liveTitle[t] || null;
    }
};
var setCache = function(t, data) {
    if (window.localStorage) {
        var liveTitle = {};
        liveTitle[t] = data; // use the computed key, not the literal "t"
        window.localStorage.setItem('liveTitle', JSON.stringify(liveTitle));
    }
};
var run = function() {
var t = Math.floor((+new Date())/10000);
var cache = getCache(t);
var success = function(data) {
/*code*/
};
if (cache) {
success(cache);
}
else {
$.ajax({
url : 'live_title.php',
type : 'GET',
success : function(data) {
setCache(t, data);
success(data);
},
data: {t: t}
});
}
}
I don't think you can do what you want easily.
Moreover, to optimize that, I would recommend using a cache:
The first time a tab calls the method that counts the messages, run the query and cache the result in a simple file or in memory
during the next 5 minutes, each time a tab calls the method, use the cache and do not query the database
when the 5 minutes have passed, run the query again, cache it, and so on.
This way, out of 100 calls you have only 1 big request; the others are as cheap as requesting a JS or image file. A sketch of this idea follows.
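A minimal sketch of that server-side cache in PHP, assuming a helper count_messages() that runs the real query (in practice you would also key the cache per user):
<?php
// Hypothetical 5-minute file cache in front of the expensive message-count query.
function get_message_count_cached() {
    $cache_file = sys_get_temp_dir() . '/message_count.cache';
    $ttl        = 300; // 5 minutes

    if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $ttl) {
        return (int) file_get_contents($cache_file); // cache still fresh: no database query
    }

    $count = count_messages(); // assumed: the real database query
    file_put_contents($cache_file, $count);
    return $count;
}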
I have an autocomplete feature in my application which makes an ajax request to server.
However, once I get data from server, I want to use the look up feature instead of using the service url(to minimize calls to server).
Here is what my js looks like
$('#country').autocomplete({
serviceUrl : './countryCache?',
paramName : 'countryName',
transformResult : function(response) {
return {
// must convert json to javascript object before process
suggestions : $.map($.parseJSON(response), function(item) {
return {
data : item.name
};
})
};
},
showNoSuggestionNotice:true,
onSelect: function (value, data) {
$('#countryId').val(value.data);
}
});
Here is a sample from my ajax call to countryCache - "India, Iceland, Indonesia".
If the user has typed I, the server returns back the result as above.
Now when the user types n after I, I don't want to make a call to the server again.
Can someone help me achieve this?
There is a simple solution for this in the jQuery UI Autocomplete documentation. There you'll find a section titled Remote with caching that shows how to implement what you are looking for.
I adapted the code from that site to this question, and added some comments for clarification:
var cache = {};
$( "#country" ).autocomplete({
source: function( request, response ) {
// If the term is in the cache, use the already existing values (no server call)
var term = request.term;
if ( term in cache ) {
response( cache[ term ] );
return;
}
// Add countryName with the same value as the term (particular to this question)
// If the service reads the parameter "term" instead, this line can be deleted.
request.countryName = request.term;
// Call the server only if the value was not in the cache
$.getJSON( "./countryCache", request, function( data, status, xhr ) {
cache[ term ] = data;
response( data );
});
},
select: function (event, data) {
$('#countryId').val(data.item.value);
}
});
As I didn't know exactly the format of the JSON, I just used a basic one that for the text "In" returned: ["India","Indonesia","Spain"] (without ids, just a plain array).
If what you are using is the Ajax AutoComplete plugin for jQuery (the code above looks like it, although the question was tagged with jquery-ui-autocomplete), then you don't have to worry about caching, because the plugin does it automatically for you.
From the plugin's documentation:
noCache: Boolean value indicating whether to cache suggestion results. Default false.
As you didn't specify any value for noCache, it will take the default value of false and will cache results automatically.
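For completeness, here is how the option would be set explicitly if you ever want to change it (shown with its default value purely for illustration):
// Devbridge Ajax AutoComplete: results are cached unless noCache is set to true.
$('#country').autocomplete({
    serviceUrl: './countryCache',
    paramName: 'countryName',
    noCache: false // the default; set to true to disable the built-in caching
});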
I ended up not using this method at all and going with fast, quick searches with a limit of 100. But since I asked, here is how I sent requests using only the first character:
// global variables: models [], names [], genders {}, result {}
lookup: function(query, done) {
var mdl = $("#autocomplete").val();
if (mdl.length == 0) {
names = [];
result.suggestions = models;
done(result);
return;
} else if (mdl.length != 1) {
result.suggestions = names;
console.log(result);
done(result);
return;
}
var jqXHR = $.ajax({url: "search.php",
data: {"q": mdl},
dataType: "json",
method: "GET" }
)
.done(function(data, status, jqXHR){
models = [];
$.each( data, function( key, val) {
names.push({ value: val.u, data: { category: genders[val.g] } });
});
result.suggestions = names;
done(result);
})
.fail(function (data, status, errorThrown) {
console.log("failed: "+status+"| error: "+errorThrown);
console.log(data);
});
},
A colleague of mine used Devbridge, and my research seems to confirm that the devbridgeAutocomplete object accepts minChars and lookupLimit options. Maybe there are different versions of devbridgeAutocomplete, but I thought it was worth posting in case they're similar, though I assume you have already seen them :).
Here's the code:
var a = $('#<%= txtFindName.ClientID %>').devbridgeAutocomplete({
minChars: 3,
lookupLimit: 20,
serviceUrl: 'AutoComplete/ADUsers.aspx',
onSelect: function (suggestion) {
$('#<%= txtTo.ClientID %>').val( $('#<%= txtTo.ClientID %>').val() + ',' + suggestion.data);
$('#<%= txtFindName.ClientID %>').val('');
}
});
I can't see what the problem with this is.
I'm trying to fetch data from a different server; the URL within the collection is correct but returns a 404 error. When I try to fetch the data, the error function is triggered and no data is returned. The PHP script that returns the data works and gives the output as expected. Can anyone see what's wrong with my code?
Thanks in advance :)
// function within view to fetch data
fetchData: function()
{
console.log('fetchData')
// Assign scope.
var $this = this;
// Set the colletion.
this.collection = new BookmarkCollection();
console.log(this.collection)
// Call server to get data.
this.collection.fetch(
{
cache: false,
success: function(collection, response)
{
console.log(collection)
// If there are no errors.
if (!collection.errors)
{
// Set JSON of collection to global variable.
app.userBookmarks = collection.toJSON();
// $this.loaded=true;
// Call function to render view.
$this.render();
}
// END if.
},
error: function(collection, response)
{
console.log('fetchData error')
console.log(collection)
console.log(response)
}
});
},
// end of function
Model and collection:
BookmarkModel = Backbone.Model.extend(
{
idAttribute: 'lineNavRef'
});
BookmarkCollection = Backbone.Collection.extend(
{
model: BookmarkModel,
//urlRoot: 'data/getBookmarks.php',
urlRoot: 'http://' + app.Domain + ':' + app.serverPort + '/data/getBookmarks.php?fromCrm=true',
url: function()
{
console.log(this.urlRoot)
return this.urlRoot;
},
parse: function (data, xhr)
{
console.log(data)
// Default error status.
this.errors = false;
if (data.responseCode < 1 || data.errorCode < 1)
{
this.errors = true;
}
return data;
}
});
You can make the requests using JSONP (read about it here: http://en.wikipedia.org/wiki/JSONP).
To achieve it using Backbone, simply do this:
var collection = new MyCollection();
collection.fetch({ dataType: 'jsonp' });
Your backend must be ready to do this. The server will receive a callback name generated by jQuery, passed on the query string, so the server must respond with:
name_of_callback_function_generated({ YOUR DATA HERE });
Hope I've helped.
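For example, a PHP backend could wrap its JSON in the callback name jQuery appends to the query string; a minimal sketch (the variable $bookmarks is a stand-in for the real data):
<?php
// Minimal JSONP sketch: jQuery passes the generated callback name as ?callback=...
$callback = isset($_GET['callback']) ? preg_replace('/[^\w.]/', '', $_GET['callback']) : '';
$json     = json_encode($bookmarks); // $bookmarks: the real data, assumed to exist

header('Content-Type: application/javascript');
echo $callback ? "{$callback}({$json});" : $json;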
This is a cross-domain request, so the browser will block it. You will need a local proxy script that uses cURL to access the endpoint on the other domain; see the sketch below.
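A minimal proxy sketch in PHP (the file name proxy.php and the remote URL are placeholders, not taken from the question):
<?php
// proxy.php: hypothetical same-origin proxy in front of the remote bookmarks endpoint.
$remote = 'http://other-domain.example/data/getBookmarks.php?fromCrm=true'; // placeholder URL

$ch = curl_init($remote);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);
curl_close($ch);

header('Content-Type: application/json');
echo $body; // the Backbone collection can now point its url at proxy.php (same origin)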
I'm trying to understand how a portion of backbone.js works. I have to fetch a collection of models once the app begins. I need to wait until fetch is complete to render each view.
I'm not 100% sure of the best approach to take in this instance.
var AppRouter = Backbone.Router.extend({
routes: {
"": "home",
"customer/:id": "customer"
},
home: function () {
console.log("Home");
},
customer: function (id) {
if (this.customers == null)
this.init();
var customer = this.customers.at(2); //This is undefined until fetch is complete. Log always says undefined.
console.log(customer);
},
init: function () {
console.log("init");
this.customers = new CustomerCollection();
this.customers.fetch({
success: function () {
console.log("success");
// I need to be able to render view on success.
}
});
console.log(this.customers);
}
});
The method I use is the jQuery Deferred done callback, like this:
var self = this;
this.model.fetch().done(function(){
self.render();
});
This was recommended in a Backbone bug report. Although the bug report recommends using complete, that callback method has since been deprecated in favor of done.
You can also do this with jQuery 1.5+:
$.when(something1.fetch(), something2.fetch() /* ...all your fetches */).then(function() {
    // initialize your views here
});
You can pass your own options.success to the collection's fetch method, which runs only when the fetch is complete.
EDIT (super late!)
From the backbone source (starting line 624 in 0.9.1)
fetch: function(options) {
options = options ? _.clone(options) : {};
if (options.parse === undefined) options.parse = true;
var collection = this;
var success = options.success;
options.success = function(resp, status, xhr) {
collection[options.add ? 'add' : 'reset'](collection.parse(resp, xhr), options);
if (success) success(collection, resp);
};
Note the second to last line. If you've passed in a function in the options object as the success key it will call it after the collection has been parsed into models and added to the collection.
So, if you do:
this.collection.fetch({success: this.do_something});
(assuming the initialize method is binding this.do_something to this...), it will call that method AFTER the whole shebang, allowing you to trigger actions immediately following fetch/parse/attach.
Another useful approach might be to bootstrap the data directly on page load. This is from the Backbone FAQ:
Loading Bootstrapped Models
When your app first loads, it's common to have a set of initial models that you know you're going to need, in order to render the page. Instead of firing an extra AJAX request to fetch them, a nicer pattern is to have their data already bootstrapped into the page. You can then use reset to populate your collections with the initial data. At DocumentCloud, in the ERB template for the workspace, we do something along these lines:
<script>
var Accounts = new Backbone.Collection;
Accounts.reset(<%= @accounts.to_json %>);
var Projects = new Backbone.Collection;
Projects.reset(<%= @projects.to_json(:collaborators => true) %>);
</script>
Another option is to add the following inside your view's initialize method:
this.listenTo(this.collection, 'change add remove update', this.render);
This will fire off the render method whenever the fetch is complete and/or the collection is updated programmatically.
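In context, that might look like the sketch below (the view and collection names are placeholders, not taken from the question):
// Hypothetical view that re-renders once the collection has been fetched or later updated.
var CustomerListView = Backbone.View.extend({
    initialize: function() {
        this.collection = new CustomerCollection();
        // Fires render after fetch completes and on any later programmatic change.
        this.listenTo(this.collection, 'change add remove update', this.render);
        this.collection.fetch();
    },
    render: function() {
        // Render this.collection.toJSON() into the view's element here.
        return this;
    }
});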
You can use the on and off methods.
If you want to trigger a method on success, for example calling render, follow the example below.
_this.companyList.on("reset", _this.render, _this);
_this.companyList.fetchCompanyList({firstIndex: 1, maxResult: 10}, _this.options);
In the collection, use something like:
fetchCompanyList: function(data, options) {
UIUtils.showWait();
var collection = this;
var condition = "firstIndex=" + data.firstIndex + "&maxResult=" + data.maxResult;
if (notBlank(options)) {
if (notBlank(options.status)) {
condition += "&status=" + options.status;
}
}
$.ajax({
url: "webservices/company/list?" + condition,
type: 'GET',
dataType: 'json',
success: function(objModel, response) {
UIUtils.hideWait();
collection.reset(objModel);
if (notBlank(options) && notBlank(options.triggerEvent)) {
collection.trigger(options.triggerEvent, collection);
}
}
});
}
I'm working with some government data published via Socrata's SODA api.
This API provides a way to retrieve rows via a REST call. The API allows limited parameterization of the query - basically you can do a full text search, and nothing else. I cannot find a way to shape the data returned - for example only return certain columns of the data.
As a result, basically I can only get all rows and all columns of each data view. This is ok, I guess, but I'd like to cache it - memoize it to use the underscore term.
Is there a pattern for memoization of ajax calls with jQuery?
EDIT: To give you an idea of what I'm talking about, here's what I'm doing currently.
function onclick(event) {
var $t = $(event.currentTarget);
var itemId = $t.attr('data-itemid');
var url = getRestUrl(itemId);
if (typeof datacache[itemId] === "undefined") {
$.ajax({
url : url,
cache : true,
type : "GET",
dataType : "json",
error : function(xhr,status,error) {
raiseError(error);
},
success : function(response, arg2, xhr) {
datacache[itemId] = response;
doSomethingWithTheData(url, itemId);
}});
}
else {
doSomethingWithTheData(url, itemId);
}
}
// then, doSomethingWithTheData() simply references datacache[itemId]
This seems faster, though I haven't measured it. What I really want to know is: is there a common pattern for this that I can employ, so that everyone who reads the code will immediately see what I'm doing?
You might be able to do something like what is done with autocomplete lookups (this is very much from memory, but you'll get the idea):
var searchCache = {}, searchXhr = null;
function Search(term) {
if (term in searchCache) {
return doSomethingWithTheData(searchCache[term]);
}
if (searchXhr != null) {
searchXhr.abort();
}
searchXhr = $.ajax({
url : url,
cache : true,
type : "GET",
dataType : "json",
error : function(xhr, status, error) {
raiseError(error);
},
success : function(response, arg2, xhr) {
searchCache[term] = response;
if (xhr == searchXhr) {
doSomethingWithTheData(response);
searchXhr = null;
}
}
});
}
I'm not necessarily the best expert for JavaScript questions, but I might be able to help you with your use of SODA.
If you're looking for more flexibility in your queries, and you can do an HTTP POST, you could look at using our query syntax to do a more targeted query: http://dev.socrata.com/querying-datasets. Our query syntax is fairly complex, but I can help you figure out how to structure your query if you hit any snags.
Unfortunately, since that'd require a POST, you'll need to break out of the XHR cross-domain lockbox by going through a proxy or something similar.
Also, FYI, we're working on a whole new syntax that'll allow you to specify queries as URL parameters, so you'll be able to perform simple requests such as /resources/agencies?acronym=CIA or /resources/agencies?$where='budget > 10000000'. It should be pretty awesome.
You would only cache AJAX requests that you know won't change, like the Facebook SDK for example. It seems like in your example you're requesting something UI-related that might be inappropriate to cache? Otherwise, you might try something like this:
var store = {};
/**
 * Memoized $.getScript
 *
 * Cache one script response per url.
 * Reference: http://msdn.microsoft.com/en-us/magazine/gg723713.aspx
 *
 * @example $.memoizedGetScript( url ).then( successCallback, errorCallback );
 * @param {String} url
 * @param {Function} callback (optional)
 * @returns {Promise}
 */
$.memoizedGetScript = function(url, callback) {
    callback = callback || $.noop;
    // Initialize the cache once; re-creating it on every call would defeat the memoization.
    store.cachedScripts = store.cachedScripts || {};
    if (!store.cachedScripts[url]) {
        store.cachedScripts[url] = $.Deferred(function(d) {
            // Pass the resolve/reject functions themselves rather than invoking them immediately.
            $.getScript(url).then(d.resolve, d.reject);
        }).promise();
    }
    return store.cachedScripts[url].done(callback);
};