Firebase recently introduced Cloud Functions.
In my case this feature is very useful for counting elements in my database.
Firebase posted sample code to count elements, but I have some questions about how it behaves with big data.
In our example, consider that we need to count the likes for a post.
In the sample code, on each new like, the function counts all likes for the current post and updates the count.
Do you think this is a good solution for big data? (For example, if we have 1M likes.)
Thank you in advance!
Agreed that the code in the functions sample is not ideal for large data sets.
For a long time I've used a two-step approach in my counters:
1. when a child is added/removed, increase/decrease the counter
2. when the counter gets deleted, recount all the children (as the sample does now)
So case #2 is memory-bound, just like the current code. But case #1 triggers on child writes, so it is far less memory-hungry.
The code:
// Keeps track of the length of the 'likes' child list in a separate property.
exports.countlikechange = functions.database.ref("/posts/{postid}/likes/{likeid}").onWrite((event) => {
  var collectionRef = event.data.ref.parent;
  var countRef = collectionRef.parent.child('likes_count');
  return countRef.transaction(function(current) {
    if (event.data.exists() && !event.data.previous.exists()) {
      return (current || 0) + 1;
    } else if (!event.data.exists() && event.data.previous.exists()) {
      return (current || 0) - 1;
    }
  });
});
// If the number of likes gets deleted, recount the number of likes.
exports.recountlikes = functions.database.ref("/posts/{postid}/likes_count").onWrite((event) => {
  if (!event.data.exists()) {
    var counterRef = event.data.ref;
    var collectionRef = counterRef.parent.child('likes');
    return collectionRef.once('value', function(messagesData) {
      return counterRef.set(messagesData.numChildren());
    });
  }
});
I also submitted this in a PR for the repo; see the sample in functions-samples.
Given a data structure similar to this:
/functions-project-12345
  /posts
    /key-123456
      likes_count: 32
      /likes
        user123456: true
        user456789: true
        user786245: true
        ...
This function would do the trick:
'use strict';

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);

// Keeps track of the length of the 'likes' child list in a separate attribute.
exports.countlikes = functions.database.ref('/posts/{postid}/likes').onWrite(event => {
  return event.data.ref.parent.child('likes_count').set(event.data.numChildren());
});
Note that this code is copyright Google and Apache-licensed. See the code for more details.
I am not very efficient with my code, which may be the reason why this keeps failing. I am trying to remove and assign roles for "verified" users. The basic gist of the code is to loop through all "verified" users and assign them the appropriate roles according to the data received from the API.
const fetch = require("node-fetch");

var i = 0;

function mainLoop(guild, redisClient, users, main_list, Pilot, Astronaut, Cadet, main_guild, cadet_guild, guest) {
  setTimeout(function () {
    redisClient.GET(users[i], async function (err, reply) {
      if (reply != null) {
        var json = await JSON.parse(reply);
        var uuid = Object.keys(json).shift();
        if (Object.keys(main_list).includes(uuid)) {
          var tag = users.shift();
          var rank = main_list[uuid];
          console.log(`${tag}: ${rank}`);
          var role = guild.roles.cache.find(
            (role) => role.name === `| ✧ | ${rank} | ✧ |`
          );
          await guild.members.cache.get(tag).roles.remove(guest);
          await guild.members.cache.get(tag).roles.remove(Astronaut);
          await guild.members.cache.get(tag).roles.remove(Cadet);
          await guild.members.cache.get(tag).roles.remove(Pilot);
          await guild.members.cache.get(tag).roles.remove(cadet_guild);
          await guild.members.cache.get(tag).roles.add(main_guild);
          await guild.members.cache.get(tag).roles.add(role);
        } else {
          var tag = users.shift();
          console.log(`${tag}: Guest`);
          await guild.members.cache.get(tag).roles.remove(Astronaut);
          await guild.members.cache.get(tag).roles.remove(Cadet);
          await guild.members.cache.get(tag).roles.remove(Pilot);
          await guild.members.cache.get(tag).roles.remove(main_guild);
          await guild.members.cache.get(tag).roles.remove(cadet_guild);
          await guild.members.cache.get(tag).roles.add(guest);
        }
      }
      i++;
      if (i < users.length) {
        mainLoop(guild, redisClient, users, main_list, Pilot, Astronaut, Cadet, main_guild, cadet_guild, guest);
      }
    });
  }, 5000);
}
The code fetches API data and maps the "verified" users and API data into an array. Then, when it starts looping through the users array, it only logs 3 times and does not assign any roles. Any help would be appreciated.
I can provide extra explanation/code if needed.
One possible issue I see here is that you are both incrementing the index i and calling .shift() on the users array. This may be the cause of the problem you are experiencing, as this will entirely skip some of the users in the array. Array.shift() doesn't just return the first element of the array; it removes it from the array.
Consider, for example, that your users array looks like this:
var users = ["Ted", "Chris", "Ava", "Madison", "Jay"];
And your index starts at 0 like so:
var i = 0;
This is what is happening in your code:
Assign roles for users[i]; the index is currently 0, so get users[0] (Ted).
Get Ted's tag via users.shift(). users is now: ["Chris", "Ava", "Madison", "Jay"]
Increment the index with i++. i is now: 1.
Assign roles for users[i]; the index is currently 1, so get users[1] (now Ava, skips Chris entirely).
Get Ava's tag via users.shift() (actually gets Chris' tag). users is now: ["Ava", "Madison", "Jay"]
Increment the index with i++. i is now: 2.
Assign roles for users[i]; the index is currently 2, so get users[2] (now Jay, skips Madison entirely).
And so on, for the rest of the array; about half of the users in the users array will be skipped.
I don't know how many users are supposed to be in your users array, but this could be the reason why so few logs are occurring. Note, however, that this is just one cause of the problem you are experiencing; it is possible that there are more reasons why you are having that issue, such as rate limits.
My recommendation on how to fix this is to not use users.shift() to get the user's tag. Simply use users[i], which will return the proper tag value without messing with the length of the array. Another way to fix this would be to remove the index incrementation, and always use 0 as your index. Use one or the other, but not both.
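Here is a minimal, SDK-free sketch of the difference (the Discord and Redis calls are omitted; `buggy` mirrors only the index-plus-`shift()` traversal from the question, and `fixed` the recommended index-only one):

```javascript
// Buggy traversal: looks up users[i] but also shift()s the array, so the
// index and the array contents drift apart and entries get skipped.
function buggy(input) {
  const users = [...input];
  const seen = [];
  let i = 0;
  do {
    seen.push(users[i]); // the user actually processed this round
    users.shift();       // removes element 0 as well...
    i++;                 // ...while the index advances too
  } while (i < users.length);
  return seen;
}

// Fixed traversal: read by index only and leave the array intact.
function fixed(users) {
  const seen = [];
  for (let i = 0; i < users.length; i++) {
    seen.push(users[i]);
  }
  return seen;
}

console.log(buggy(["Ted", "Chris", "Ava", "Madison", "Jay"])); // [ 'Ted', 'Ava', 'Jay' ]
console.log(fixed(["Ted", "Chris", "Ava", "Madison", "Jay"])); // all five, in order
```

Note that the buggy version processes exactly 3 of the 5 users, which lines up with the "only logs 3 times" symptom in the question.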
My Firebase Realtime Database begins with a small list of unique elements in arbitrary order. Users can fetch an element from this list, atomically popping it from the database so that no other user can possess the same element. Users can also return their popped element to the list. In this way, the elements currently held by users and those left in the database are conserved. The list is small enough (max 27 elements) that I can efficiently load the entire database contents into memory if needed.
I am struggling to express this behaviour in my web (pure JavaScript) Firebase application. I have seen Firebase transactions, but I'm not sure how to use them such that the popped child is selected pseudo-randomly.
Here is an inadequate attempt which violates atomicity (users may end up popping/getting the same element):
function popRandElem() {
  // fetch all elements currently in the db list
  db.ref('list').get().then((snap) => {
    // choose a random element
    var elems = snap.val();
    var keys = Object.keys(elems);
    var choice = keys[keys.length * Math.random() << 0];
    // remove the chosen element from the db
    db.ref('list').child(choice).remove();
    return elems[choice];
  });
}

myElem = popRandElem();

function restoreElem() {
  db.ref('list').push(myElem);
  myElem = null;
}
How can I adapt this example such that popRandElem atomically pops from the database?
This turned out to be straightforward with transactions, using the optional second callback to obtain the successfully popped element.
function popRandElemAsynch() {
  var choice = null;
  db.ref('list').transaction(
    // repeats with the updated list until it runs without collision
    function (list) {
      // discard the previous repetition's choice
      choice = null;
      // edge case: list emptied during transaction repeats
      if (!list)
        return list;
      // choose and remember a random element
      var keys = Object.keys(list);
      choice = keys[keys.length * Math.random() << 0];
      // remove the element
      delete list[choice];
      return list;
    },
    // runs once after the above has run for the final time
    function () {
      // choice is the final uniquely popped element;
      // if it is null, the list was emptied before a collision-free pop
      someFunc(choice);
    },
    // applyLocally = false: don't trigger premature events from transaction retries
    false
  );
}
TL;DR
I'm working on a chat-list feature much like those in the big social networks, and I'm having issues with React Native state management because of a well-known limitation of Firestore's onSnapshot "in" conditions.
As a workaround, I'm querying in batches generated from a state array. onSnapshot makes changes to the state array based on those batches; however, I'm having trouble refreshing the batches after each change.
Full Description
One of its complexities is that I must condition the realtime updates from Firestore in a way that is not yet supported by Firebase:
const watchedGroups = db.collection('group').where('__name__', 'in', groupArray?.map(({ id }) => id));
unsubscribeListener = watchedGroups.onSnapshot((querySnapshot) => {
  querySnapshot.forEach((doc) => {
    // ...
(Please note that group = chat.)
The problem with this approach is that Firestore does not support an "in" condition (groupArray) with more than 10 elements, and this code block will crash if that case materializes.
To work around that, I process groupArray in batches that respect the constraint:
const [recentChats, setRecentChats] = useState([]);
// ...
useFocusEffect(useCallback(() => {
  const grupos = [...recentChats];
  if (grupos && grupos.length > 0) {
    handleRefreshSuscriptions();
    const collectionPath = db.collection('group');
    while (grupos.length) {
      const batch = grupos.splice(0, 10);
      console.log(">> QUERYING", batch?.length, batch.map(({ lastMsgForMe }) => lastMsgForMe));
      const unsuscribe = collectionPath.where(
        '__name__',
        'in',
        [...batch].map(({ id }) => id)
      ).onSnapshot((querySnapshot) => {
        if (querySnapshot !== null) {
          querySnapshot.forEach((doc) => {
            const validGroup = batch.find(grupo => doc.id == grupo.id);
            if (validGroup) {
              lastMsg(doc.id).then((lastM) => {
                console.log(batch.map(({ lastMsgForMe }) => lastMsgForMe));
                if (validGroup.lastMsgForMe !== doc.data().recentMessage.messageText) {
                  mergeChat({
                    ...validGroup,
                    messageText: doc.data().recentMessage.messageText,
                    lastMsgForMe: lastM.messageText,
                    dateMessageText: lastM.sentAt,
                    viewed: lastM.viewed
                  });
                }
              }).catch(error => console.log(error));
            }
          });
        }
      });
      setRefreshSuscription(prevState => [...prevState].concat(unsuscribe));
    }
  }
  return () => {
    handleRefreshSuscriptions();
  };
}, [recentChats.length]));
It works (almost) perfectly: every change reaches the view successfully. However, there is an issue. Here are the logs from when I receive the first update:
// Initialization (12 groups shown, 2 batches)
>> QUERYING 10 ["B", "Dddffg", "Dfff", ".", null, "Hvjuvkbn", "Sdsdx", "Vuvifdfhñ", "Ibbijn", "asdasdasd"]
>> QUERYING 2 ["Veremoss", "Hjjj"]
// Reception of a message "C" that updates the last message shown ("B") of the first group in the list.
["B", "Dddffg", "Dfff", ".", null, "Hvjuvkbn", "Sdsdx", "Vuvifdfhñ", "Ibbijn", "asdasdasd"] // several repetitions of this log, erased for simplicity
update idx 0 - B -> C
At this point there isn't any noticeable issue. However, if I keep interacting with other groups and then pay attention to the logs when I receive a message for the group shown above, I see this:
["B", "Dddffg", "Dfff", ".", null, "Hvjuvkbn", "Sdsdx", "Vuvifdfhñ", "Ibbijn", "asdasdasd"]
update idx 1 - Bbnnm -> Bbnnm // unexpected
update idx 0 - 12 -> 12 // unexpected
update idx 2 - C -> D // expected
Notice how the batch still shows "B" even though I've already received the "C" and "D" messages for that group. The problem repeats for two other groups, so now I get one real change plus another two false positives.
The cause is that, because of how the batches are generated, the batch content inside onSnapshot is always the same. This results in as many false "updates" as groups have been updated since batch generation, per received message.
How can I keep the batch up to date inside onSnapshot?
One possible solution I came up with is updating the batches on the go, switching from find to findIndex and applying the updates inside the batch:
querySnapshot.forEach((doc) => {
  const validGroupIdx = batch.findIndex(grupo => doc.id == grupo.id);
  if (validGroupIdx !== -1) {
    lastMsg(doc.id).then((lastM) => {
      console.log(batch.map(({ lastMsgForMe }) => lastMsgForMe));
      if (batch[validGroupIdx].lastMsgForMe !== doc.data().recentMessage.messageText) {
        batch[validGroupIdx] = {
          ...batch[validGroupIdx],
          messageText: doc.data().recentMessage.messageText,
          lastMsgForMe: lastM.messageText,
          dateMessageText: lastM.sentAt,
          viewed: lastM.viewed
        };
        mergeChat(batch[validGroupIdx]);
      }
    }).catch(error => console.log(error));
  }
});
However, to my understanding this is still suboptimal, because when I navigate to other components I will get the old batch rather than the updated one, provoking the false positive at least once.
I'm wondering if I could directly handle the state in batches instead of generating batches from it.
However, as far as I know, sorting and merging it would be a pain.
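For reference, the batching itself boils down to splitting an id list into chunks of at most 10, which could be factored into a pure helper so Firestore's 10-element "in" limit is handled in one place (a sketch; `chunkForInQuery` is a made-up name, and unlike `splice` it leaves the input array intact):

```javascript
// Split an array into consecutive chunks of at most `size` elements,
// without mutating the input (unlike splice-based batching).
function chunkForInQuery(arr, size = 10) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) {
    out.push(arr.slice(i, i + size));
  }
  return out;
}

// 12 groups -> 2 batches, matching the ">> QUERYING 10 / >> QUERYING 2" logs.
const ids = Array.from({ length: 12 }, (_, i) => `group-${i}`);
console.log(chunkForInQuery(ids).map(batch => batch.length)); // [ 10, 2 ]
```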
Let's say I have these two fields in my Firebase database:
user: {
  postCount: 2,
  posts: [
    { title: "hello", content: "world" },
    { title: "hello again", content: "world" }
  ]
}
I want the user to have permission to update his posts, but I don't want him to be able to update his post count. I want postCount to always represent the number of posts, and to prevent the user from cheating it.
How can I do this in Firebase? Is it possible with front-end JavaScript only? If not, what would be the option that requires the least server-side code?
This is the code I'm using, but it doesn't prevent users from cheating by simply calling the increment function themselves as many times as they like:
const push = (objectToInsert, firebasePath) => {
  const key = firebase.database().ref().child(firebasePath).push().key
  let updates = {}
  updates[firebasePath + key] = objectToInsert
  firebase.database().ref().update(updates)
}

const increment = (firebasePath) => {
  const ref = firebase.database().ref(firebasePath)
  ref.transaction((value) => {
    // value is null on the first run; treat it as 0
    return (value || 0) + 1
  })
}

push(post, `/${user}/posts/`)
increment(`/${user}/postCount`)
I'll refer you to the Firebase security rules documentation:
https://firebase.google.com/docs/database/security/#section-authorization
Either set rules on each of the user properties based on your security setup and keep the structure as you have it, or move the counts to another node and set rules there (e.g. user_post_counts).
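As a rough sketch of the first option (the `/users/$uid` layout here is an assumption; adapt the paths to your actual structure), the rules could let the owner write `posts` while making `postCount` writable by no client at all, leaving it to be maintained by trusted server-side code:

```json
{
  "rules": {
    "users": {
      "$uid": {
        "posts": {
          ".write": "auth != null && auth.uid === $uid"
        },
        "postCount": {
          ".write": "false"
        }
      }
    }
  }
}
```

With `.write: false` on postCount, clients can no longer call `increment` themselves; the count would instead be bumped by code using the Admin SDK (for example a Cloud Function), which bypasses security rules.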
I am currently making an app using Firebase.
It is one of those bulletin boards that can be seen anywhere on the web, but I ran into one problem: date sorting.
I want to see the most recent posts first, but I always see only the data I created first.
postRef.orderByChild('createData').startAt(reverseDate).limitToFirst(1).on('child_added', (data) => {
  console.log(data.val().name + data.val().createData);
});
result -> hello1496941142093

My Firebase tree: (screenshot omitted)

My code is the same as above. How can I see my recent posts first?
How do I order a Firebase database in reverse?
The Firebase Database will always return results in ascending order. There is no way to reverse them.
There are two common workarounds for this:
1. Let the database do the filtering, but reverse the results client-side.
2. Add an inverted value to the database, and use that for querying.
These options have been covered quite a few times before, so instead of repeating them I'll give a list of previous answers:
Display posts in descending posted order
Sort firebase data in descending order using negative timestamp
firebase sort reverse order
Is it possible to reverse a Firebase list?
many more from this list: https://www.google.com/search?q=site:stackoverflow.com+firebase+reverse%20sort%20javascript
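For illustration, here is a plain-data sketch of the second workaround (no SDK involved; `createDataNeg` is a field name invented for this example): store the negated timestamp at write time, and an ascending query or sort on it yields newest-first.

```javascript
// Two posts in original write order; hello2 is the more recent one.
const posts = [
  { name: 'hello1', createData: 1496941142093 },
  { name: 'hello2', createData: 1496941150000 },
];

// On write, also store a negated copy of the timestamp.
const withNeg = posts.map(p => ({ ...p, createDataNeg: -p.createData }));

// orderByChild('createDataNeg') would return ascending order on the negated
// value, i.e. newest first; a plain sort simulates that here.
const newestFirst = [...withNeg].sort((a, b) => a.createDataNeg - b.createDataNeg);
console.log(newestFirst.map(p => p.name)); // [ 'hello2', 'hello1' ]
```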
You can simply write a function to reverse the object and then traverse it:
function reverseObject(object) {
  var newObject = {};
  var keys = Object.keys(object);
  for (var i = keys.length - 1; i >= 0; i--) {
    newObject[keys[i]] = object[keys[i]];
  }
  return newObject;
}
This is how I solved it.
First I made a query in my service that filters by date in milliseconds:
getImages(): Observable<Image[]> {
  this.imageCollection = this.asf.collection<Image>('/images', ref =>
    ref.orderBy('time').startAt(1528445969388).endAt(9999999999999));
  this.images = this.imageCollection.snapshotChanges().pipe(
    map(actions => actions.map(a => {
      const data = a.payload.doc.data() as Image;
      const id = a.payload.doc.id;
      return { id, ...data };
    }))
  );
  return this.images;
}
Then, to get the newest date first, I added this to the component where I call the method from my service:

let date = new Date();
let time = 9999999999999 - date.getTime();
console.log(time);

I pass `time` as the date. Since a newer date produces a bigger number to subtract from 9999999999999, the newest date turns up first in the query inside my service.
Hope this solves it for you.
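The arithmetic behind this can be checked in isolation (plain JavaScript, no AngularFire; the sample timestamps are arbitrary): subtracting each timestamp from 9999999999999 reverses the ordering, so ascending order on the inverted value is newest-first on the original.

```javascript
const MAX = 9999999999999;
const times = [1528445969388, 1528446000000, 1528400000000]; // arbitrary sample ms timestamps

// Inverted values: newer timestamp -> smaller inverted value.
const inverted = times.map(t => MAX - t);

// An ascending sort on the inverted values corresponds to descending original time.
const order = [...inverted].sort((a, b) => a - b).map(v => MAX - v);
console.log(order); // [ 1528446000000, 1528445969388, 1528400000000 ]
```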
If you want to display it in the front end, I suggest that after you retrieve the data you use JavaScript's reverse() function. Note that on('child_added', ...) returns the callback rather than an array, so collect the children first and then reverse:

let posts = [];
postRef
  .orderByChild("createData")
  .once("value", snapshot => {
    snapshot.forEach(child => {
      posts.push(child.val());
    });
    posts.reverse(); // newest first
    posts.forEach(post => console.log(post.name + post.createData));
  });
I ended up changing how I create my list on the frontend. It was:

posts.add(post);

I changed it to:

posts.insert(0, post);
You could save the same (or an alternate) child with a negated value and then query on it. For example (assuming a hypothetical createDataNeg child written as -createData):

postRef.orderByChild('createDataNeg').on('child_added', (data) => {
  console.log(data.val().name + data.val().createData);
});
Far easier is to just use Swift's reversed():
https://developer.apple.com/documentation/swift/array/1690025-reversed
https://developer.apple.com/documentation/swift/reversedcollection
let decodedIds = try DTDecoder().decode([String].self, from: value)
// we reverse it, because we want most recent orders at the top
let reversedDecodedIds = decodedIds.reversed().map {$0}
orderBy("timestamp", "desc")

I think you can give "desc" as the second argument (this is the Cloud Firestore query API). It worked for me.