Meteor Sub/Pub with Simulation where pub is slow to update - javascript

I have a pub that wraps an external API. The client subs the external API pub. There is an 'ACTIVATE' button the user can push to activate a billing method. The button calls an update method that updates the collection, and the pub updates the external API. The simulation runs and updates the client collection, so the button changes to 'DEACTIVATE' as expected. This is where the issue comes in. The external API takes some time to return the updated doc. Within 100-200ms of the button turning to 'DEACTIVATE' it flips back to 'ACTIVATE', and then 500ms later back to 'DEACTIVATE', where it should stay, assuming there were no issues with the external API.
I'm sure I could come up with some hacky solution to deal with this on the client, but I'm wondering if there is a way to tell the simulation/client collection that the pub is slow and not to update quite as often, thus giving the pub/external API more time to complete its updates.

This turned out to be really simple.
Client-side simulation alone is not enough. The trick is to do server-side simulation as well. To accomplish this, first set up a hook that captures the `this` object of Meteor.publish, something like this:
_initServer() {
  if (Meteor.isServer) {
    console.log(`Server initializing external collection "${this.name}"`)
    let self = this
    Meteor.publish(this.name, function (selector, options) {
      check(selector, Match.Optional(Match.OneOf(undefined, null, Object)))
      check(options, Match.Optional(Match.OneOf(undefined, null, Object)))
      // keep a handle on the publication so the server can push changes to it later
      self.publication = this
      self._externalApi.fetchAll()
        .then((docs) => docs.forEach((doc) => this.added(self.name, doc._id, doc)))
        .then(() => this.ready())
        // todo: more robust error handling
        .catch((error) => console.error(`${self.name}._initServer: self._externalApi.fetchAll`, error))
    })
  }
}
Then in your update function you can simulate on both the client and server like so:
this.update = new ValidatedMethod({
  name: `${self.name}.update`,
  validate: (validators && validators.update) ? validators.update : self.updateSchema.validator({clean: true}),
  run(doc) {
    console.log(`${self.name}.update `, doc)
    if (Meteor.isServer && self._externalApi.update) {
      // server-side simulation: push the optimistic change to subscribers now,
      // then push again when the external API returns the updated doc
      self.changed(doc)
      self._externalApi.update(doc._id, doc)
        .then(self.changed)
        .catch((error) => handleError(`${self.name}.update`, 'externalApi.update', error))
    } else {
      // client-side simulation
      self.collection.update(doc._id, {$set: doc})
    }
  },
})
Apologies if this is oversimplified; these examples are from a large library we use for external APIs.
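For completeness, here is a minimal sketch of what the `changed` helper used above might look like. This is an assumption based on the publication handle captured in `_initServer`, not the actual library code:
changed(doc) {
  // Hypothetical sketch -- the real library's `changed` is not shown in this post.
  // It pushes a doc through the saved publication handle, so subscribed clients
  // see the server-side simulation (and later the real API result) immediately.
  if (Meteor.isServer && this.publication) {
    this.publication.changed(this.name, doc._id, doc)
  }
}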

Related

Firebase Cloud Function event - sometimes not getting data on update event

I have written a Firebase Cloud Function that triggers on an updated record. Sometimes I don't get the same record that is being updated. I am adding my code below; please check the attached image as well.
exports.onNotificationUpdate = functions.database.ref('/Notification/{userId}/{notificationId}/userResponse').onUpdate(event => {
    return admin.database().ref(`/Notification/${event.params.userId}/${event.params.notificationId}`).once('value').then(function (snapshot) {
        var notification = snapshot.val();
        if (!notification) {
            console.error("Notification not found on notification update");
            return;
        }
        // ...
    });
});
I can also get the Notification object from the parent, but I want to know the best approach and the problem with this code.
(The error log and database structure were attached as images.)
This is my first post here; please let me know if you need more information.
Thanks.
You don't have to call once() within the function, since the event already contains the data at the location you are listening to; just listen to the parent node instead.
So you should do something like:
exports.onNotificationUpdate = functions.database.ref('/Notification/{userId}/{notificationId}').onUpdate(event => {
    const notification = event.data.val();
    if (notification === null) {
        console.error("Notification not found on notification update");
        // actually this would only be reached in case of deletion of the Notification
        return null;
    } else {
        // do something with the notification data: send an Android notification, send
        // mail, write to another node of the database, etc.
        // BUT return a Promise
        // the notification const declared above is a JavaScript object containing what
        // is under this node (i.e. a structure similar to your database structure as
        // shown in the image within your post)
    }
});
I would suggest that you have a look at these three videos from the Firebase team:
https://www.youtube.com/watch?v=7IkUgCLr5oA&t=517s
https://www.youtube.com/watch?v=652XeeKNHSk&t=27s
https://www.youtube.com/watch?v=d9GrysWH1Lc
Also, note that Cloud Functions has been updated, and the first line of your code must be written differently if you are using a firebase-functions version at or above 1.0.0. See https://firebase.google.com/docs/functions/beta-v1-diff
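For reference, a minimal sketch of the same trigger in the post-1.0.0 style, where the handler receives a (change, context) pair instead of a single event object (based on the migration guide linked above; the handler body is illustrative):
exports.onNotificationUpdate = functions.database
    .ref('/Notification/{userId}/{notificationId}')
    .onUpdate((change, context) => {
        const notification = change.after.val(); // data after the update
        const { userId, notificationId } = context.params;
        if (notification === null) {
            console.error("Notification not found on notification update");
            return null;
        }
        // do something with the notification data, and return a Promise
        return Promise.resolve();
    });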

Node.js Socket IO: How to continuously save socket data to MongoDB

I'm building a 3D game in the browser using THREE.js. Lots of fun, but I came across the following situation:
An object in my 3D scene is continuously moving around, driven by user input. I need to save the object's position to my database in real-time.
Let's start at the front-end. Angular.js is watching my object's position using its built-in $watch functionality. The object's position can change multiple times per second.
On each change, I emit an event to the backend Node.js server using Socket IO, like so:
socket.emit('update', {
    id: id,
    position: position
});
On the back-end, the event is caught and immediately emitted to the other members in the same Socket.IO room. This way, everyone in the room gets the most real-time update possible.
Now, because the event can happen multiple times per second, I don't want to update my MongoDB collection on each change, since this would cause a lot of overhead. Instead, I'm looking for a way of periodically saving data to the database.
I've come up with a solution using Node.js's setInterval function, which saves data every 1000ms. For each distinct id (which is unique per object) received on the back-end, a new key is created on a JavaScript object, thus keeping track of changes on a per-object basis.
The (simplified) code on the backend:
let update_queue = {};

// ...

// Update Event
socket.on('update', (msg) => {
    // Remember the latest position and flag changes for this object
    if (!update_queue[msg.id]) update_queue[msg.id] = {};
    update_queue[msg.id].changes = true;
    update_queue[msg.id].position = msg.position;

    // Set Interval Timer
    if (!update_queue[msg.id].timer) {
        update_queue[msg.id].timer = setInterval(() => {
            if (!update_queue[msg.id].changes) {
                // Nothing changed since the last save; stop the timer so the
                // next incoming update can restart it
                clearInterval(update_queue[msg.id].timer);
                update_queue[msg.id].timer = null;
                return;
            }
            // This saves the latest position to MongoDB
            Object3DCollection.update(msg.id, update_queue[msg.id].position)
                .then((res) => {
                    console.log('saved');
                });
            // Unflag Changes
            update_queue[msg.id].changes = false;
        }, 1000);
    }
    // Immediate Broadcast to Socket Room
    socket.broadcast.to('some_room').emit('object_updated', msg);
});
The Question
Is this a proper way of handling very frequent socket data while still saving it to a database? Or are there any other suggestions/solutions that are more robust or work better?
Note
I do not want to wait for my object to be saved to the database and then emit the saved data to the rest of the socket room. The delay of database write operations is not suitable for the real-time game situation I'm dealing with.
Thanks in advance! All suggestions/solutions are appreciated and will be considered.
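One common variant of the pattern described above is a single flush loop that batches every dirty object into one bulkWrite per second, instead of keeping one timer per object. A minimal sketch, assuming the native MongoDB Node.js driver; the db handle and collection name are illustrative, not from the post:
const dirty = new Map(); // id -> latest position

socket.on('update', (msg) => {
    dirty.set(msg.id, msg.position); // keep only the latest position per object
    socket.broadcast.to('some_room').emit('object_updated', msg); // real-time relay, no db wait
});

// Flush all pending changes in a single batched write every second
setInterval(() => {
    if (dirty.size === 0) return;
    const ops = [...dirty.entries()].map(([id, position]) => ({
        updateOne: { filter: { _id: id }, update: { $set: { position: position } } }
    }));
    dirty.clear();
    db.collection('objects').bulkWrite(ops)
        .catch((err) => console.error('flush failed', err));
}, 1000);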

Firebase array item is removed and immediately auto-added back (with AngularFire)

I am trying to remove an item from $firebaseArray (boxes).
The remove function:
function remove(boxJson) {
    return boxes.$remove(boxJson);
}
It works; however, the item is immediately added back.
This is the method that fetches the array:
function getBoxes(screenIndex) {
    var boxesRef = screens
        .child("s-" + screenIndex)
        .child("boxes");
    return $firebaseArray(boxesRef);
}
I thought perhaps I was holding multiple references to the $firebaseArray, so that when one deletes the item, the other adds it back; but then I thought Firebase should handle that, no?
Anyway, I'm lost on this. Any ideas?
UPDATE
When I hack around it and delete twice (with a timeout), it seems to work:
function removeForce(screenIndex, boxId) {
    setTimeout(function () {
        API.removeBox(screenIndex, boxId);
    }, 1000);
    return API.removeBox(screenIndex, boxId);
}
and the API.removeBox:
function removeBox(screenIndex, boxId) {
    var boxRef = screens
        .child("s-" + screenIndex)
        .child("boxes")
        .child(boxId);
    return boxRef.remove();
}
When you remove something from Firebase, the operation is asynchronous. Per the docs, the proper way to remove an item from Firebase using AngularFire is:
var obj = $firebaseObject(ref);
obj.$remove().then(function(ref) {
    // data has been deleted locally and in the database
}, function(error) {
    console.log("Error:", error);
});
$remove() ... Removes the entire object locally and from the database. This method returns a promise that will be fulfilled when the data has been removed from the server. The promise will be resolved with a Firebase reference for the exterminated record.
Link to docs: https://www.firebase.com/docs/web/libraries/angular/api.html#angularfire-firebaseobject-remove
The most likely cause is that you have a security rule that disallows the deletion.
When you call boxes.$remove, Firebase immediately fires the child_removed event locally, to ensure the UI is updated quickly. It then sends the command to the Firebase servers to check it and update the database.
On the server there is a security rule that disallows this deletion, so the servers send an "it failed" response back to the client, which then raises a child_added event to fix the UI.
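For illustration, a hypothetical rule of that kind: newData is null on a deletion, so a .write rule like the following accepts updates but rejects removals (the path names are made up for this example):
{
  "rules": {
    "screens": {
      "$screenId": {
        "boxes": {
          "$boxId": {
            ".write": "newData.exists()"
          }
        }
      }
    }
  }
}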
Apparently I was saving the items again after deleting them. Clearly my mistake:
function removeSelected(boxes) {
    var selectedBoxes = Selector.getSelectedBoxes(boxes);
    angular.forEach(selectedBoxes, function (box) {
        BoxManager.remove(box);
    });
    Selector.clearSelection(boxes, true);
}
In the clearSelection method I was updating a field on the boxes and saving them again.
Besides the obvious mistake, this was a lesson for me on how to work with Firebase: if some part of the system keeps a copy of your deleted item, saving that copy won't produce an error, but it will revive the deleted item.
For those who have a similar issue but haven't solved it yet:
There are two methods for listening to events: .on() and .once(). In my case, that was the cause of the problem.
I was working on a migration procedure that should run once:
writeRef
    .orderByChild('text_hash')
    .equalTo(addItem.text_hash)
    .on('value', val => { // <--
        if (!val.exists()) {
            writeRef.push(addItem)
        }
    });
So the problem was exactly because of the .on() method: it fires again every time the data is manipulated, for example from the Firebase console.
Changing it to .once() solved that.
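For completeness, the same query with .once(), which in the Firebase JS SDK returns a promise, so the check runs a single time:
writeRef
    .orderByChild('text_hash')
    .equalTo(addItem.text_hash)
    .once('value')
    .then(val => {
        if (!val.exists()) {
            writeRef.push(addItem);
        }
    });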

Publishing structured, reactive data to the client (outside of database collections)

I am looking for the most performant solution for sending structured data to the client in the Meteor framework on a request.
THE ISSUE:
Sometimes, before you send data from the database to the client, you want to add some server-side generated information to what is being sent (e.g. security credentials for many objects). This data can be time-critical (e.g. due to an expiration timestamp) and therefore should not be stored in the db. Also, this data sometimes cannot be computed on the client side (e.g. for security reasons). In many cases, this data will be structurally related to actual database data, but also very much tied to a single request, since you might want it discarded and re-generated on a new request.
YOU CAN (at least by design..):
create a second collection, store and publish your request-related data there, accepting the write overhead, and then, e.g. in Meteor.myTemplate.destroyed=function(){...}, remove the data again, accepting another write overhead.
store each entry in a session variable, but then you also have to take care of deleting it later on (Meteor.myTemplate.destroyed=function(){...}); this is my favourite right now, but I am running into problems with storing large objects there.
store this data in the DOM (e.g. in the attributes or data fields of hidden or visible elements)
generate this data from the DOM with Meteor.call('method',arguments,callback(){...}), by storing the appropriate arguments in the DOM and injecting them back with e.g. jQuery in the callback(){...}.
YOU CAN'T: (by design!!)
use transformations within Meteor.publish("name",function(){...}) on the server
use a Meteor.call() within a transformation on a Template.variable=function(){return collection.find(...)} (not even if you have a corresponding Meteor.method() on the client for guessing the result!).
Again, what I am looking for is the best performing solution for this.
To address the transform issue (which I care about because I think in terms of smart models doing things, rather than a bunch of anonymous functions), here is an example of a server transform reaching the client. It is not reactive as-is, since it goes through a call rather than a publish, but it illustrates the point under discussion of server transforms.
You will get this:
each image local
data: LOLCATZ RULZ
transform:
data: LOLCATZ RULZ
transform:
each image server transformed
data: LOLCATZ RULZ
transform: XYZ
data: LOLCATZ RULZ
transform: XYZ
from:
<template name='moritz'>
    <h3>each image local</h3>
    <dl>
        {{#each images}}
            <dt>data: {{caption}}</dt>
            <dd>transform: {{secretPassword}}</dd>
        {{/each}}
    </dl>
    <h3>each image server transformed</h3>
    <dl>
        {{#each transformed}}
            <dt>data: {{caption}}</dt>
            <dd>transform: {{secretPassword}}</dd>
        {{/each}}
    </dl>
</template>
if (Meteor.isServer) {
    Images = new Meteor.Collection('images', {
        transform: function (doc) {
            doc.secretPassword = 'XYZ'
            return doc
        }
    });
    Images.allow({
        insert: function (userid, doc) {
            return true;
        }
    });
    if (Images.find().count() < 1) {
        Images.insert({ caption: 'LOLCATZ RULZ'});
    }
    Meteor.publish('images', function () {
        return Images.find();
    })
    Meteor.methods({
        'transformed': function() {
            return Images.find().fetch();
        }
    })
}
else {
    Images = new Meteor.Collection('images');
    imageSub = Meteor.subscribe('images');
    Template.moritz.helpers({
        'images': function () {
            console.log(Images.find().count() + ' images')
            return Images.find();
        },
        'transformed': function () {
            // Should be separated; the call should be in a route, for example
            Meteor.call('transformed', function(err, data) {
                Session.set('transformed', data);
            });
            return Session.get('transformed');
        }
    });
}
Have a look at Meteor Streams; you can send something directly to the client from the server without having to use collections on the client.
You could do something with the message you get (I'm using an example from the meteor-streams site):
Client
chatStream.on('message', function(message) {
    if (message.expiry > new Date()) {
        // Do something with the message (not being read from a collection)
    }
});
Even though you do this with the intent of not having it stored, be wary that simple tools (e.g. the Chrome inspector) can peek into the Network/WebSocket tab (even if it's encrypted via SSL) and see the raw data being passed through.
While I'm not sure of your intentions, if this is for security, then in any scenario never trust data you get from the client.
Your first line would seem ideally answered by 'Mongo collections', but then I read a few conflicting ideas and conclusions I'm not sure I agree with. (For example, why can't you do those two things? Because it must happen on the server?) The most confusing statement for me is:
This data can be time-critical (i.e. due to an expiration timestamp) and therefore should not be stored in the db.
I don't understand what assumptions go into that conclusion, but if you consider storing data in the DOM to be plausible (per points 2 and 3 above), you would seem to be open to systems much less performant than a Mongo db.
You know you can, from the server, publish a second collection of the additional server-generated calculations, which you generate on the fly from the data, and put it together with the data once on the client. It is kind of like a parent-child relationship: as you display the client document, you pull the additional data from the dynamic collection and work it into your templates.
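A minimal sketch of that suggestion, using Meteor's low-level publish API; Items stands in for the real data collection, and computeExtras is a hypothetical server-side function producing the request-specific values:
Meteor.publish('itemExtras', function () {
    var self = this;
    var handle = Items.find().observeChanges({
        added: function (id, fields) {
            // server-generated, request-specific data; never written to the db
            self.added('itemExtras', id, { extra: computeExtras(id, fields) });
        },
        changed: function (id, fields) {
            self.changed('itemExtras', id, { extra: computeExtras(id, fields) });
        },
        removed: function (id) {
            self.removed('itemExtras', id);
        }
    });
    self.ready();
    self.onStop(function () { handle.stop(); });
});
On the client, subscribing to 'itemExtras' fills a client-side collection of the same name, which can then be joined with the real documents in a template helper.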
Jim Mack created a nice example here that shows how well it works to store db data, as well as additional "transform" properties, in a Session variable.
Unfortunately, that example lacks reactivity and does not perform the desired "transformations" after Meteor's magic re-render. So I grabbed his cool code and added back the reactivity. It's slim code that works very well, but it will be outperformed by Jim Mack's example in terms of efficiency.
lolz.html
<head>
    <title>lolz</title>
</head>
<body>
    {{>myItems}}
</body>
<template name="myItems">
    <h3>Reactive Item List with additional properties</h3>
    <button id="add">add</button>
    <button id="remove">remove</button>
    <dl>
        {{#each items}}
            <dt>data: {{caption}}</dt>
            <dd>added property: {{anotherProp _id}}</dd>
        {{/each}}
    </dl>
</template>
lolz.js
items = new Meteor.Collection('Items');

if (Meteor.isServer) {
    items.allow({
        insert: function (userid, doc) {
            return true;
        },
        remove: function (userid, doc) {
            return true;
        }
    });
    // reset the collection on startup
    while (items.find().count() > 0) {
        items.remove(items.findOne()._id);
    }
    while (items.find().count() < 3) {
        items.insert({caption: 'LOLCATZ RULZ'});
    }
    Meteor.publish('Items', function () {
        return items.find();
    });
    Meteor.methods({
        'getAdditionalProps': function () {
            var additionalProps = {};
            items.find().forEach(function (doc) {
                additionalProps[doc._id] = reverse(doc.caption);
            });
            return additionalProps;
        }
    });
    function reverse(s) { // server-side operation, e.g. for security reasons
        return s.split("").reverse().join("");
    }
}

if (Meteor.isClient) {
    Meteor.subscribe('Items');
    Meteor.startup(function () {
        getAdditionalProps();
        // re-fetch the server-generated props whenever the collection changes
        itemsHandle = items.find().observe({
            added: function (doc) {
                getAdditionalProps();
            },
            removed: function (doc) {
                getAdditionalProps();
            },
            changed: function (docA, docB) {
                getAdditionalProps();
            }
        });
    });
    Template.myItems.rendered = function () {
        console.log(new Date().getTime());
    };
    Template.myItems.items = function () {
        return items.find();
    };
    Template.myItems.anotherProp = function (id) {
        // guard against the first render, before the method call has returned
        var props = Session.get('additionalProps');
        return props && props[id];
    };
    Template.myItems.events({
        'click #add': function (e, t) {
            items.insert({caption: 'LOLCATZ REACTZ'});
        },
        'click #remove': function (e, t) {
            items.remove(items.findOne()._id);
        }
    });
}

function getAdditionalProps() {
    setTimeout(function () {
        Meteor.call('getAdditionalProps', function (error, props) {
            Session.set('additionalProps', props);
        });
    }, 0);
}

How can I populate sigma.js graph asynchronously with google feed API?

I want to represent my WordPress and GitHub activity as a graph network using the JavaScript library sigma.js. I am using the Google Feed API to get RSS feeds of all activity and translating them into nodes and edges on the graph.
But the Feed API returns RSS results asynchronously. As far as I know, sigma.js does not natively support this, so I'm getting undefined references. At this point it's only around 20-30 nodes. Some possible solutions are:
Force the Google Feed API to return results synchronously.
(Not sure how to do this, but I'm assuming it has something to do with appropriate closures?)
Create a sigma instance for every feed result and push all graph objects into a single instance.
(Not sure it's possible, and the library is not well documented enough to try.)
Fire an event each time a result is returned, to ensure sigma only processes one at a time.
(Again, not sure how to go about doing this.)
Any guidance is very much appreciated. Thanks.
Here is my work so far http://fraseraddison.com
More examples and source at http://sigmajs.org/
The solution I went with was firing a custom event. It seems to work because JavaScript's event dispatch is synchronous: dispatchEvent invokes the listeners before returning, so each feed is processed one at a time.
function getFeed()
{
    return function callback(result)
    {
        if (!result.error)
        {
            console.log("Feed retrieved.");
            fireFeed(result.feed);
        }
        else
            console.log("Feed retrieval failed!");
    };
}

function fireFeed(feed)
{
    //console.log(feed);
    var event = new CustomEvent(
        "newFeed",
        {
            detail: {
                message: feed
            },
            bubbles: true,
            cancelable: true
        }
    );
    document.dispatchEvent(event);
}

document.addEventListener('newFeed', function(e)
{
    var feed = e.detail.message;
    console.log('feed triggered');
    //console.log(feed);
    buildFeed(feed);
}, true);
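For illustration, a hypothetical buildFeed that adds each feed's entries as nodes on one shared sigma instance (sigma.js v1.x-style API; the container id and node layout are made up):
// Hypothetical sketch -- buildFeed is not shown in the original post.
var s = new sigma({ container: 'graph' });

function buildFeed(feed)
{
    feed.entries.forEach(function (entry, i)
    {
        s.graph.addNode({
            id: feed.feedUrl + '#' + i, // unique per feed entry
            label: entry.title,
            x: Math.random(),
            y: Math.random(),
            size: 1
        });
    });
    s.refresh(); // redraw after each batch of nodes
}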
