Batching / combining requests in AngularJS - javascript

I have an AngularJS application that has a lot of directives which are populated from nested nodes in AJAX responses. For example, I might get a response like:
{
name: "Blog post title",
comment_ids: [1, 2, 3]
}
Currently in my controller, I make a request to load data for each of the child comment nodes and populate them onto the scope. Something like:
_.each(comment_ids, function (id) {
  scope.comment_data[id] = $http.get(id); // pseudocode for the $http GET
});
I then use the data on the page like:
<comment data="comment_data.1"></comment>
This works great but I'm thinking about converting my directives to take an id instead of an object and to handle loading the data themselves. The problem I face is that I might get multiple requests to the same endpoint if the same directive is present on the page multiple times, e.g.
<comment id="1"></comment>
<comment id="1"></comment>
<comment id="1"></comment>
This is going to result in three calls to the comments endpoint. Given this:
Will the browser batch multiple calls to the same HTTP endpoint into one request if they occur within a small space of time, or should I write an interceptor to handle this myself?

I would move the $http calls to their own service. The benefit is that you can use it from any service or controller, as many times as necessary; this is especially useful if you use any kind of JS-controlled caching.
I'm developing an AngularJS oData service for SharePoint 2010/2013 REST calls that uses localStorage to cache results and only fetches updates from the server. This works especially well when I'm requesting the same data over and over: it is just fetched from localStorage without ever making the HTTP call.
// Pseudo code to put in a service
var persistentCache = function (query) {
  // Serve from localStorage when possible; otherwise fetch, cache, and hand back the response
  return localStorage[query] ? {
    'then': function (callback) {
      callback(JSON.parse(localStorage[query]));
    }
  } : $http(query).then(function (resp) {
    localStorage[query] = JSON.stringify(resp);
    return resp;
  });
};
Alternatively, you could use a module design pattern to cache your calls just for the life of the app and only fetch if it is not in the app cache.
// Pseudo code to put in a service
var appOnlyCache = (function () {
  var _cache = {};
  return function (query) {
    // Serve from the in-memory cache when possible; otherwise fetch, cache, and hand back the response
    return _cache[query] ? {
      'then': function (callback) {
        callback(JSON.parse(_cache[query]));
      }
    } : $http(query).then(function (resp) {
      _cache[query] = JSON.stringify(resp);
      return resp;
    });
  };
}());
More robust implementations of both examples, with compression and error handling, can be found in the AngularSharepoint service I've been working on for an app right now: AngularSharepoint

This problem is now solved quite elegantly with: https://github.com/facebook/dataloader
DataLoader is a generic utility to be used as part of your
application's data fetching layer to provide a consistent API over
various backends and reduce requests to those backends via batching
and caching.
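For illustration, a minimal sketch of what this could look like for the comment case above. The fetchComments helper and the shape of the batched endpoint are assumptions, not part of the original question:
var DataLoader = require('dataloader');

// Collects every load() call made in the same tick and issues one batched fetch.
var commentLoader = new DataLoader(function (ids) {
  // e.g. one GET /comments?ids=1,2,3 -- the endpoint shape is an assumption
  return fetchComments(ids).then(function (comments) {
    // DataLoader expects results in the same order as the requested ids
    var byId = {};
    comments.forEach(function (c) { byId[c.id] = c; });
    return ids.map(function (id) { return byId[id]; });
  });
});

// Three directives asking for the same comment result in a single request,
// and the result is cached for subsequent loads.
commentLoader.load(1);
commentLoader.load(1);
commentLoader.load(2);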

Related

Making RESTful API call from React.js

I am doing a POC for an isomorphic JavaScript application to render HTML from the server side. The POC is working with simple HTML, but I want to make an API call, get the JSON response, and send it to the render function. I tried various ways but it is not working.
What am I missing? I am very new to React.js.
loadCategoriesFromServer: function () {
  var self = this;
  // fetch the categories JSON from the API
  var http = require("http");
  var url = "api url here";
  var request = http.get(url, function (response) {
    // data is streamed in chunks from the server,
    // so we have to handle the "data" event
    var buffer = "",
        data;
    response.on("data", function (chunk) {
      buffer += chunk;
    });
    response.on("end", function (err) {
      data = JSON.parse(buffer);
      //console.log(data.d);
      //console.log(data.d.Items);
      self.setState({
        categories: data.d.Items
      });
    });
  });
}, // load from server end
getInitialState: function () {
  return { categories: [] };
},
componentWillMount: function () {
  console.log("calling load categories");
  this.loadCategoriesFromServer();
},
render: function () {
  //console.log(this.state.categories);
  var postNodes = this.state.categories.map(function (cat) {
    console.log(cat);
  });
  return (
    <div id="table-area">
      {/* I want to paint the data here... */}
    </div>
  );
}
});
Fetching inside the component in componentWillMount is not the right place when you need to render server side. You need to move it out of the component and pass the actual data in as props after it has been fetched, for example as @JakeSendar suggested in his answer.
I have some experience building isomorphic apps with React, and the main problem I faced is how to wait until all the data is loaded before the first render.
As @FakeRainBrigand already mentioned in the comments, there is more than one way to do this, and it depends on your requirements.
There are a few ways to build an isomorphic app; the most interesting ones from my perspective are: https://github.com/webpack/react-starter and http://fluxible.io/
But the most elegant way to do this, as I figured out for myself, is to organise asynchronous rendering for React components, in particular using RxJS.
In general my application is structured as following:
views - React components without any logic (just a view)
models - Observables with current state (initial data is loaded using superagent, then combined with other models and/or actions results).
In a simple case it is something like:
Rx.Observable.defer(fetchData).concat(updatesSubject).shareReplay()
actions (or intents) - Observers used to collect user input, do something, and dispatch the action results to subscribing models and/or other actions. In a simple case, something like:
updatesSubject = new Rx.Subject();
action = new Rx.Subject();
action.switchMap(asyncRequest).subscribe(updatesSubject)
components - Observables (streams of virtual DOM elements) combined from models, other components and actions (I have a note about this, explaining how and why to create Observable React elements with RxJS); I am also planning to add partial components (a tuple of: React component, observables, observers, and properties, partially filled in using DI).
router - a component responsible for handling location changes; in general, its main feature is to map location changes to a stream of virtual DOM elements and meta information. In detail it is a bit more complicated in my case (URL generation, active URL highlighting, handling scrolls when navigating; it also supports nested routes and multiple views).
All of this is assembled together using a DI container, in my case similar to the Angular 2 DI container, but greatly simplified for my specific needs.
Components, models and actions are created using DI.
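A minimal sketch of the model and component pieces described above, not the author's code: RxJS 4 style, where fetchInitialData (returning a promise) and the rendered element shape are illustrative assumptions:
var updatesSubject = new Rx.Subject();

// Model: initial load, followed by pushed updates; late subscribers get the latest value.
var model = Rx.Observable
  .defer(function () { return Rx.Observable.fromPromise(fetchInitialData()); })
  .concat(updatesSubject)
  .shareReplay(1);

// Component: just a stream of virtual DOM elements derived from the model.
var component = model.map(function (items) {
  return React.createElement('ul', null, items.map(function (item) {
    return React.createElement('li', { key: item.id }, item.name);
  }));
});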
On the server side the application looks like this:
var rootInjector = new Injector();
// setup server specific providers
rootInjector.provide(..., ...);
app.get('/*', function (req, res) {
  var injector = rootInjector.createChild();
  // setup request specific providers
  injector.provide(..., ...);
  injector.get(Router)
    .first()
    .subscribe(function (routingResult) {
      res.render('app', {
        title: routingResult.title,
        content: React.renderToString(routingResult.content)
      });
    });
});
and similar on client side:
var rootInjector = new Injector();
// setup client specific providers
// (actually this is omitted in my case because the default providers are client side)
rootInjector.provide(..., ...);
var contentElement = document.getElementById('content');
rootInjector.get(Router)
  .subscribe(function (routingResult) {
    document.title = routingResult.title;
    React.render(routingResult.content, contentElement);
  });
In comparison to Flux, it is a more declarative and more powerful way to organise an app. And in the case of an isomorphic app it looks, to me, much better than the various hacks with Flux. But of course there are drawbacks... it is more complicated.
Later I will likely open-source all of this, but for now it is not quite ready to be published.
UPD1:
The original answer is a bit outdated (I plan to update it later), and I have made some progress in this area.
Links to the code mentioned above, already open-sourced:
DI container: di1
Container for React components (connecting views to observables and observers): rx-react-container
Starter template for implementing isomorphic widgets using RxJS and React, plus the libraries above: Reactive Widgets
About the complete application (work is still in progress, and the documentation is not great, but in general it should be clear):
Router built especially for isomorphic reactive applications: router1, with React components to use it: router1-react
Application template with router and all libraries mentioned above: router1-app-template
React's renderToString method (for rendering components on the server) is synchronous. Therefore, any sort of async task, such as your api request, will still be pending by the time the component has rendered.
There are a couple of ways you can go about fixing this, depending on whether or not you want to fetch your data on the server or client.
If you choose to fetch the data on the server, first move your api-request logic outside of your component. Then render your component in the callback, passing the fetched data as a prop. It would look something like this:
response.on("end", function (err) {
  var data = JSON.parse(buffer);
  var markup = React.renderToString(Component({ categories: data }));
});
Inside your component, you'd be able to access the data via this.props.categories.
The other option is to handle the api request on the client. You would make an AJAX request in componentDidMount, and set the component's state from the fetched data. It would look very similar to what you have now, the key difference being that your request logic would live in componentDidMount (async, called on the client) rather than componentWillMount (not async, called on the server).
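A minimal sketch of that client-side variant, reusing the placeholder endpoint from the question (the response shape is assumed to match); it would replace the loadCategoriesFromServer/componentWillMount pair in the component above:
componentDidMount: function () {
  var self = this;
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "api url here"); // same placeholder endpoint as above
  xhr.onload = function () {
    var data = JSON.parse(xhr.responseText);
    self.setState({ categories: data.d.Items });
  };
  xhr.send();
},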
You should use superagent; it works really well for me. Also, you are missing the most important part: you should use Flux to fetch data from the server. Flux is the approach that Facebook strongly recommends, and the Flux architecture is pretty easy to use.

AngularJS : Filtered data not updating when underlying data changes

I am pretty new to AngularJS and have been working a lot with KnockoutJS, so bear with me a little as I still haven't quite got my head around when Angular can and cannot track changes.
I am building an app that will have an array of underlying data which I will initially poll and update, but later improve by pushing from the server. All data in the app will then just be transforms or filters based on this data. So I have a service to fetch the data and to also fetch the commonly filtered versions of the data, like so:
.factory('Scores', function ($resource, Utils, $q, $interval, $filter) {
  var scoresResource = $resource('http://localhost:8000/scores'),
      scoresData = [];
  $interval(function () {
    scoresResource.query(function (newScores) {
      scoresData.length = 0;
      angular.forEach(newScores, function (dataEntry) {
        scoresData.push(dataEntry);
      });
    });
  }, Utils.TIMES.SHORT);
  return {
    getAll: function () {
      var deferred = $q.defer();
      if (scoresData.length > 0) {
        deferred.resolve(scoresData);
      } else {
        scoresResource.query(function (allScores) {
          scoresData.length = 0;
          angular.forEach(allScores, function (dataEntry) {
            scoresData.push(dataEntry);
          });
          deferred.resolve(scoresData);
        });
      }
      return deferred.promise;
    },
    getByLeagueName: function (leagueName) {
      return this.getAll().then(function (allScores) {
        return $filter('filter')(allScores, function (score) {
          return score.League === leagueName;
        });
      });
    }
  };
});
And my controller simply fetches the filtered data and adds it to the scope.
.controller('LivescoresCtrl', function ($scope, $stateParams, Leagues, Scores, $interval, Utils) {
Scores.getByLeagueName($stateParams.leagueName).then(function (scores) {
$scope.scores = scores;
});
})
But it seems that the filtered data is not automatically updating when the underlying data updates. I would like to avoid using filters in the view, as at times I need to combine data in ways that I cannot easily achieve there.
So I guess my question is: why does this not update when the main data updates, and is this a valid approach in the Angular world? I could hit the backend for all variations of the data, but as this is a mobile app and all the data is needed in the app at all times, I don't really want to make extra requests just to combine or filter the data.
Thanks,
You are only setting the data on the controller once, using the promise returned by "getByLeagueName". A promise only ever resolves once!
I'm not sure what ko.computed really does, but by using the filter inside of the service you create a new array inside getByLeagueName that is in no way linked to the original array. The new array doesn't "know where it came from"!
It seems you're trying to implement a polling update, which you should do inside the controller, not the service. Think of services as the containers for backend logic, without access to the scope - that's what controllers are for. And as long as you're not working with the scope directly, Angular won't ever update the visible data, because that is the only place that an Angular app gets the displayed data from.
The typical Angular way to use filtered data would be: Make the original data, after fetching it, available on the controller. Then use a filter inside of your view (HTML), i.e.: <div ng-repeat="entry in data | filter: someFilter">. See ng-filter. This way Angular knows when the original data changes, runs the filter on it again, and the UI will update effortlessly.
If you really need to use that filtered data in places other than the view - and make sure you do - then there are a few approaches to that. One is to use the service to notify the controller of data changes: listen for an event inside the controller, and emit that event in the service via $rootScope.$broadcast.
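A rough sketch of that notification approach, reusing names from the question; the event name is an assumption, and $rootScope/$filter would need to be injected into the service and controller respectively:
// In the Scores service, after refreshing scoresData inside the $interval callback:
$rootScope.$broadcast('scores:updated', scoresData);

// In LivescoresCtrl (child scopes receive events broadcast from $rootScope):
$scope.$on('scores:updated', function (event, allScores) {
  $scope.scores = $filter('filter')(allScores, function (score) {
    return score.League === $stateParams.leagueName;
  });
});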
You can also have a look at this repository, which takes a promise-based approach to polling data, maybe it works well for your task.

What is the best practice for using $resource and sharing the results across multiple controllers?

So, I've looked all over the place and can't find the answer I'm after.
I've recently learned a few new things and switched from having my services get and store shared data like this:
angular.module('myModule')
.service('stuffService', function($http){
var thisService = {
stuff: {},
getSomething: function(){
var onSuccess = function(response){
thisService.stuff = response.data;
};
var onError = function(response){
alert('Doh!');
};
var promise = $http.get(<url>).then(onSuccess, onError);
return promise;
}
};
return thisService;
});
angular.module('myModule')
.controller('myController',function($scope, stuffService){
$scope.stuff = stuffService.stuff;
});
to this:
angular.module('myModule')
.factory('stuffService', function($resource){
var stuffResource = $resource(<stuff_url>, <params>, <etc>);
return {
stuff: stuffResource
}
});
angular.module('myModule')
.controller('myController', function($scope, stuffService){
$scope.stuff = stuffService.stuff.get(...);
});
This has been a really beneficial change in terms of most of the operation of my application; however, I'm running into a pickle over how to share the data that the stuffResource returns.
Previously, I could just always reference stuffService.stuff from any controller that had stuffService injected into it. But now with $resource, I'm not sure how I would store this data. Using $scope.stuff or even $scope.stuff.get() would always run a query, and not act like an actual collection of "stuff".
At first, I tried to have a parent controller that did $scope.stuff = myService.stuff.get(...) and that actually worked really well... until I needed that same stuff outside of the parent controller. Now I need to figure out a way to store that data so that I'm not making the same API call over and over consecutively as the app loads up.
I've thought about turning on caching, which seems to be a pretty intuitive (duh!) approach, but I'm not sure if that's the best course of action. Some data needs to be refreshed more often than other data, so it would have to be caching with cache limits of varying size/length, and I'd be worried that some components would end up with different/newer data than components that loaded earlier. Maybe it feels like too simple a solution, not sure. It would be nice if this were the answer, and I could just use stuffService.stuff.get() as if it were a stored object from the olden days.
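For reference, the kind of caching I have in mind would look roughly like this (a sketch only, using the same placeholders as above, with $cacheFactory supplying a shared cache per action):
// Sketch only: per-action caching via $cacheFactory; placeholders as above
angular.module('myModule')
  .factory('stuffService', function ($resource, $cacheFactory) {
    var stuffCache = $cacheFactory('stuff');
    var stuffResource = $resource(<stuff_url>, <params>, {
      get:   { method: 'GET', cache: stuffCache },
      query: { method: 'GET', isArray: true, cache: stuffCache }
    });
    return {
      stuff: stuffResource,
      // for data that needs refreshing more often
      clearCache: function () { stuffCache.removeAll(); }
    };
  });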
I've also thought about using an interceptor to store this data similar to what my onSuccess method was doing previously. This seems ok too, but feels a little dirty/hacky. I would just need a little push to get this one working if I was on to a good idea.
My latest thought was to create a sort of "data service" that has the other services (like stuffService) injected into it. This "dataService" would have an object that it loads up with data from all these other services and that can then be shared as dataService.data.stuff. This is nice in that my data has a single source shared throughout the app, but it's also yet another service that needs to be injected, because I'll probably still need other methods from stuffService. This could work better if I had one stuffService with non-CRUD methods for stuff, and another crudStuffService which was basically a wrapper for $resource. I'd really prefer to have anything related to stuff under the same service. When I was doing that before, it was really helpful to have things abstracted out like stuffService.getStuff() to make HTTP requests or stuffService.findStuffWithSpecialConditions() to do loops and conditions, etc. on stored/shared data.
So, anyone got any ideas on how to get and store data using $resource within the same service? Is this how it should be done? Am I missing something?
Thanks!
What you are looking for is the flyweight pattern.
The most important part of its implementation would be:
.factory('stuffService', function ($resource) {
  var cache = {};
  var stuffResource = function (<stuff_url>, <params>, <etc>) {
    // Reuse the same $resource object for identical arguments
    if (!cache[<stuff_url> + <params> + <etc>]) {
      cache[<stuff_url> + <params> + <etc>] = $resource(<stuff_url>, <params>, <etc>);
    }
    return cache[<stuff_url> + <params> + <etc>];
  };
  return {
    stuff: stuffResource
  };
});

Angular $resource called after api is initialized, resulting in empty return values

I am trying to build a basic RPG-like web game. I have a Java server that provides a REST API to a pure HTML5 application.
This application has a service that returns quests by category. It also allows the user to view details about a quest. The quests are served through a REST API, which I am reading using the $resource dependency.
The problem is, I have a service that is defined like this:
(function( ng, app ) {
"use strict";
// I provide a repository for the quests.
app.service(
"questService",
function( $q, $resource, $http, _, categoryService ) {
var QuestbookAPI = $resource( 'http://174.126.249.6\\:8080/worldcraft/quests' , {}, { query: {method: 'GET', isArray: true} });
var quests = [];
QuestbookAPI.query(function(newQuests){
console.log("I ran !!!!! WOOTZORS . I am the questbook api query.");
quests = newQuests;
_.each(quests, function(quest){
quest.id = quest.questID.id;
quest.categoryID = quest.questCategory.id;
});
});
// ***** general questbook api
function getQuestByID( id ){}
function getQuestsByCategory( categoryId ){}
....
// ***** end general questbook api
// I get the quest with the given ID.
// Return the public API.
return({
getQuestByID: getQuestByID,
getQuestsByCategoryID: getQuestsByCategoryID,
getRandomQuestExcluding: getRandomQuestExcluding,
});
}
);
})( angular, Worldcraft );
For some reason, when the controller using this service calls getQuestsByCategoryID, the resource query does not run.
If I leave the page and revisit it, the query runs and the quests array is populated as I expected.
My question is, why isn't my query running before anything else? I feel like I am missing a very fundamental concept.
The git repository for the project is on GitHub at
https://github.com/ClinkWorks/Worldcraft-UI
The running project is at
http://www.clinkworks.com/worldcraft-ui
If you click quests, and then combat, go back a level, and hit combat again you can see what I mean.
For some reason the getQuestsByCategoryID function is running way before QuestbookAPI.query(), even though the query is run right when the service is declared... I am pretty confused.
I know it's something to do with promises... or the $q object, but I'm not quite sure how.
$resource.query calls are filled asynchronously. There's no guarantee the call will have completed before the rest of your code runs.
Your caller should be the one providing the callback to query and assigning the result to a $scope variable. You can then use a $watch to check for the assignment and do something after the data has arrived.
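A minimal sketch of what that could look like in the question's app. The controller name is an assumption, and the service would need to accept a callback (or return a promise) rather than querying eagerly at declaration time:
// Hypothetical controller: supply the callback yourself and assign the result
// to the scope only once the async query has resolved.
app.controller('QuestListCtrl', function ($scope, questService) {
  questService.getQuestsByCategoryID($scope.categoryId, function (quests) {
    $scope.quests = quests;
  });

  // React once the data has actually arrived.
  $scope.$watch('quests', function (newQuests) {
    if (newQuests) {
      // do something with the loaded quests
    }
  });
});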

Can AngularJS auto-update a view if a persistent model (server database) is changed by an external app?

I'm just starting to familiarize myself with AngularJS, but I would like to build a web app that has a view that gets auto-updated in real time (no refresh) for the user when something changes in the server-side database.
Can AngularJS handle this (mostly) automatically for me? And if so, what is the basic mechanism at work?
For example, do you somehow set up AngularJS to poll the DB regularly for "model" changes? Or use some sort of Comet-like mechanism to notify AngularJS client-side code that the model has changed?
In my application, the challenge is that other (non-web) server-side software will be updating the database at times. But this question applies equally to pure web-apps where you might have multiple clients changing the database through AngularJS web clients, and they each need to be updated when one of them makes a change to the DB (model).
You have a few choices...
You could do polling every X milliseconds using $timeout and $http, or if the data you're using is hooked up to a REST service, you could use $resource instead of $http.
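A minimal sketch of that polling option; the interval, endpoint, and factory name are assumptions:
myApp.factory('PolledData', function ($http, $timeout) {
  var data = { items: [] };
  (function poll() {
    $http.get('/api/items').then(function (response) {
      data.items = response.data; // mutate the shared object so bound views update
      $timeout(poll, 5000);       // schedule the next poll
    });
  }());
  return data;
});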
You could create a service that uses some Websocket implementation and uses scope.$apply to handle changes that are pushed by the socket.
Here's an example using socket.io, a node.js websocket library:
myApp.factory('Socket', function ($rootScope) {
  var socket = io.connect('http://localhost:3000');
  // Wrap socket.on so changes pushed by the socket are $apply'd to Angular
  return {
    on: function (eventName, fn) {
      socket.on(eventName, function (data) {
        $rootScope.$apply(function () {
          fn(data);
        });
      });
    },
    // Keep socket as the `this` context when emitting
    emit: function () {
      socket.emit.apply(socket, arguments);
    }
  };
});

function MyCtrl($scope, Socket) {
  Socket.on('content:changed', function (data) {
    $scope.data = data;
  });
  $scope.submitContent = function () {
    // Use the injected Socket service (the raw socket isn't in scope here)
    Socket.emit('content:changed', $scope.data);
  };
}
You could get really high tech and create a websocket implementation which syncs an Angular model with the server. When the client changes something, that change gets automatically sent to the server. Or if the server changes, it gets sent to the client.
Here's an example of that in an old version of Angular, again using socket.io: https://github.com/mhevery/angular-node-socketio
EDIT: For #3, I've been using Firebase to do this.
Here's an implementation that uses Jetty instead of Node. The AngularJS part is based on the angular-seed app. I'm not sure if the Angular code is idiomatic... but I've tested that this works. HTH -Todd.
TimerWebSocketServlet: see https://gist.github.com/3047812
controllers.js
// -------------------------------------------------------------
// TimerCtrl
// -------------------------------------------------------------
function TimerCtrl($scope, CurrentTime) {
$scope.CurrentTime = CurrentTime;
$scope.CurrentTime.setOnMessageCB(
function (m) {
console.log("message invoked in CurrentTimeCB: " + m);
console.log(m);
$scope.$apply(function(){
$scope.currentTime = m.data;
})
});
}
TimerCtrl.$inject = ['$scope', 'CurrentTime'];
services.js
angular.module('TimerService', [], function ($provide) {
$provide.factory('CurrentTime', function () {
var onOpenCB, onCloseCB, onMessageCB;
var location = "ws://localhost:8888/api/timer"
var ws = new WebSocket(location);
ws.onopen = function () {
if(onOpenCB !== undefined)
{
onOpenCB();
}
};
ws.onclose = function () {
if(onCloseCB !== undefined)
{
onCloseCB();
}
};
ws.onmessage = function (m) {
console.log(m);
onMessageCB(m);
};
return{
setOnOpenCB: function(cb){
onOpenCB = cb;
},
setOnCloseCB: function(cb){
onCloseCB = cb;
},
setOnMessageCB: function(cb){
onMessageCB = cb;
}
};
})});
web.xml
<servlet>
<servlet-name>TimerServlet</servlet-name>
<servlet-class>TimerWebSocketServlet</servlet-class>
<load-on-startup>0</load-on-startup>
</servlet>
<servlet-mapping>
<servlet-name>TimerServlet</servlet-name>
<url-pattern>/api/timer/*</url-pattern>
</servlet-mapping>
What you are looking for is Firebase and Deployd.
Firebase comes with an adapter too that makes using it a breeze: http://angularfire.com/
According to the "Discover Meteor" book, Angular watches/scopes are similar to Meteor's computations regarding reactivity... but Angular is client-only and gives less-granular control than Meteor.
My impression is that using Angular might be a better fit for adding reactivity to an existing app, whereas Meteor soars when you use it for the whole thing. But I have no real experience with Angular yet (though I have built some small Meteor apps).
So, Andy Joslin has mentioned what is in my opinion the best solution in his answer: the third option, which is to maintain state bidirectionally via websockets or whatever other async library you're dealing with (for instance, the Chrome message API for Chrome Extensions and Apps), and toddg has given an example of how that would be achieved. However, in his example he is implementing an AngularJS anti-pattern: the service is calling the controller. Instead, the model should be placed inside the service and then referenced from the controller.
The service socket callbacks will modify the service model, and because it is referenced from the controller, the view will update. Be careful if you're dealing with primitive data types or variables that can be reassigned, though; those will need a watch in the controller to make this work.
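A minimal sketch of that pattern, adapting the socket example above (the service and event names are illustrative):
myApp.factory('ContentModel', function ($rootScope) {
  var model = { data: null };
  var socket = io.connect('http://localhost:3000');
  socket.on('content:changed', function (data) {
    $rootScope.$apply(function () {
      model.data = data; // mutate the shared object rather than reassigning `model`
    });
  });
  return model;
});

function MyCtrl($scope, ContentModel) {
  // The view binds to content.data and updates whenever the service changes it.
  $scope.content = ContentModel;
}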
