I am looking to get the key from a Firebase push when using Cloud Functions.
const params = {
  date: Date.now(),
  movie,
  movieId: movie.id,
  userId: user.uid,
  group: 'home',
  type: 'watched'
};
const pastRef = event.data.adminRef.root.child(`pastActivity/${user.uid}`);
var newPostRef = pastRef.push().set(params);
var postId = newPostRef.key;
console.log(postId); //undefined
The postId, however, comes back as undefined. I have tried a few other suggested methods without any results.
Reference.set() returns a void promise that resolves when the write operation completes; you can't get the key from it. Instead, split the push() and set(...) into separate statements, so that you can capture the reference:
var newPostRef = pastRef.push();
newPostRef.set(params);
var postId = newPostRef.key;
console.log(postId); // now logs the generated key
A shorter version can be used if you don't need the newPostRef variable:
//var newPostRef = pastRef.push().set(params);
//var postId = newPostRef.key;
const newPostRefKey = pastRef.push(params).key
//this pushes the data and returns the key
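If you also need the Cloud Function to wait for the write to finish, you can keep both the key and the promise returned by set(). A minimal sketch, assuming the same pastRef and params as above:
const newPostRef = pastRef.push(); // the key is generated locally, no round trip needed
const postId = newPostRef.key;     // available immediately
console.log(postId);
// Return the write promise so the function waits for the write to complete.
return newPostRef.set(params);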
Could someone tell me how to push elements into an array in localStorage?
My code:
(localStorage.getItem('projects') === null) ? localStorage.setItem('projects', ['proj1', 'proj2', 'proj3']) : '';
var ItemGet = localStorage.getItem('projects');
function CreateObject() {
  console.log(ItemGet);
  var Serializable = JSON.parse(ItemGet);
  Serializable.push('proj4');
  console.log(ItemGet);
}
<button onclick="CreateObject()">Add Object</button>
General approach:
let old_data = JSON.parse(localStorage.getItem('projects'));
old_data.push(some_new_data); // note: push returns the new length, not the array
localStorage.setItem('projects', JSON.stringify(old_data));
I would do the following, assuming that your data is not a multidimensional array.
(localStorage.getItem('projects') === null) ? localStorage.setItem('projects',
  JSON.stringify(['proj1', 'proj2', 'proj3'])) : '';

var ItemGet = localStorage.getItem('projects');

function CreateObject() {
  var Serializable = JSON.parse(ItemGet);
  Serializable.push('proj4');
  localStorage.setItem('projects', JSON.stringify(Serializable));
}
The problem you are hitting is that data stored in localStorage has to be a string, so you'll have to stringify/parse it when setting/getting anything from local storage. If you don't want to work with strings, you may find something like the IndexedDB API useful.
const stuff = [ 1, 2, 3 ];
// Stringify it before setting it
localStorage.setItem('stuff', JSON.stringify(stuff));
// Parse it after getting it
JSON.parse(localStorage.getItem('stuff'));
Here is an example of using the IndexedDB API, taken from the docs:
const dbName = "the_name";
var request = indexedDB.open(dbName, 2);
request.onerror = function(event) {
// Handle errors.
};
request.onupgradeneeded = function(event) {
var db = event.target.result;
// Create an objectStore to hold information about our customers. We're
// going to use "ssn" as our key path because it's guaranteed to be
// unique - or at least that's what I was told during the kickoff meeting.
var objectStore = db.createObjectStore("customers", { keyPath: "ssn" });
// Create an index to search customers by name. We may have duplicates
// so we can't use a unique index.
objectStore.createIndex("name", "name", { unique: false });
// Create an index to search customers by email. We want to ensure that
// no two customers have the same email, so use a unique index.
objectStore.createIndex("email", "email", { unique: true });
// Use transaction oncomplete to make sure the objectStore creation is
// finished before adding data into it.
objectStore.transaction.oncomplete = function(event) {
// Store values in the newly created objectStore.
var customerObjectStore = db.transaction("customers", "readwrite").objectStore("customers");
customerData.forEach(function(customer) {
customerObjectStore.add(customer);
});
};
};
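Reading a record back out of that store would look roughly like this (a sketch, assuming a customer with the given ssn was added; the ssn value here is just a placeholder):
request.onsuccess = function(event) {
  var db = event.target.result;
  var objectStore = db.transaction("customers", "readonly").objectStore("customers");

  // Look up a single customer by its key path ("ssn").
  var getRequest = objectStore.get("444-44-4444");
  getRequest.onsuccess = function() {
    console.log("Customer:", getRequest.result); // undefined if no such key exists
  };
};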
There are also other solutions out there, like PouchDB, depending on your needs.
Say, for example, you have an array. This is how you can store it in local storage:
let my_array = [1, 2, 3, 4];
localStorage.setItem('local_val', JSON.stringify(my_array))
Now, to push any data into the array in local storage, you have to overwrite the stored value with the new data, like below:
let oldArray = JSON.parse(localStorage.getItem('local_val'))
oldArray.push(1000)
localStorage.setItem('local_val', JSON.stringify(oldArray))
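If you do this in more than one place, a small helper keeps the parse/stringify in one spot (pushToLocalStorageArray is just a hypothetical name):
function pushToLocalStorageArray(key, value) {
  // Fall back to an empty array if nothing is stored yet.
  const current = JSON.parse(localStorage.getItem(key)) || [];
  current.push(value);
  localStorage.setItem(key, JSON.stringify(current));
  return current;
}

pushToLocalStorageArray('local_val', 1000);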
I'm working with MongoDB Stitch/Realm and I'm trying to modify objects inside an array with a forEach, while also pushing IDs into a new array.
For each object I'm modifying I first run a query; once the document is found I modify the object and push the ID into another array, so I can use both arrays later.
The code is something like this:
exports = function(orgLoc_id, data) {
  var HttpStatus = require('http-status-codes');

  // Access DB
  const db_name = context.values.get("database").name;
  const db = context.services.get("mongodb-atlas").db(db_name);
  const orgLocPickupPointCollection = db.collection("organizations.pickup_points");
  const orgLocStreamsCollection = db.collection("organizations.streams");
  const streamsCollection = db.collection("streams");

  let stream_ids = [];

  data.forEach(function(stream) {
    return streamsCollection.findOne({_id: stream.stream_id}, {type: 1, sizes: 1}).then(res => { // if I comment this query it will push without any problem
      if (res) {
        let newId = new BSON.ObjectId();

        stream._id = newId;
        stream.location_id = orgLoc_id;
        stream.stream_type = res.type;
        stream.unit_price = res.sizes[0].unit_price_dropoff;
        stream._created = new Date();
        stream._modified = new Date();
        stream._active = true;

        stream_ids.push(newId);
      }
    })
  })

  console.log('stream ids: ' + stream_ids);
  //TODO
};
But when I try to log stream_ids it's empty and nothing is shown, and the stream_type and unit_price properties are not assigned.
I've tried promises, but I haven't had any success.
It's an asynchronous issue. You're populating the array inside a callback, but because of the nature of the event loop, none of those callbacks can have run by the time the console.log is executed.
You mentioned a solution involving promises, and that's probably the right tack. For example something like the following:
exports = function(orgLoc_id, data) {
  // ...
  let stream_ids = [];

  const promises = data.map(function(stream) {
    return streamsCollection.findOne({ _id: stream.stream_id }, { type: 1, sizes: 1 })
      .then(res => { // if I comment this query it will push without any problem
        if (res) {
          let newId = new BSON.ObjectId();
          // ...
          stream_ids.push(newId);
        }
      });
  });

  Promise.all(promises).then(function() {
    console.log('stream ids: ' + stream_ids);
    //TODO
    // any code that needs access to stream_ids should be in here...
  });
};
Note the change from forEach to map... that way you get an array of all the Promises (I'm assuming your findOne returns a promise, because of the .then).
Then you use a Promise.all to wait for all the promises to resolve, and then you should have your array.
Side note: A more elegant solution would involve returning newId inside your .then. In that case Promise.all will actually resolve with an array of the results of all the promises, which would be the values of newId.
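A sketch of that more elegant variant, using the same streamsCollection query as above (filtering out missed lookups is an addition here):
const promises = data.map(function(stream) {
  return streamsCollection.findOne({ _id: stream.stream_id }, { type: 1, sizes: 1 })
    .then(res => {
      if (!res) return null;
      const newId = new BSON.ObjectId();
      // ...assign the other stream properties here...
      return newId; // each promise now resolves with its id
    });
});

Promise.all(promises).then(function(ids) {
  const stream_ids = ids.filter(id => id !== null); // drop documents that weren't found
  console.log('stream ids: ' + stream_ids);
});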
I need to create a new array by iterating over a MongoDB result. This is my code:
const result = await this.collection.find({
  referenceIds: {
    $in: [referenceId]
  }
});

var profiles = [];
result.forEach(row => {
  var profile = new HorseProfileModel(row);
  profiles.push(profile);
  console.log(profiles); // 1st log
});

console.log(profiles); // 2nd log
I can see the profiles array being updated in the 1st log, but the 2nd log prints only an empty array. Why can't I push items to the array?
Update
I don't think this is related to promises. The HorseProfileModel class simply formats the data:
const uuid = require("uuid");
class HorseProfileModel {
constructor(json, referenceId) {
this.id = json.id || uuid.v4();
this.referenceIds = json.referenceIds || [referenceId];
this.name = json.name;
this.nickName = json.nickName;
this.gender = json.gender;
this.yearOfBirth = json.yearOfBirth;
this.relations = json.relations;
this.location = json.location;
this.profilePicture = json.profilePicture;
this.horseCategory = json.horseCategory;
this.followers = json.followers || [];
}
}
module.exports = HorseProfileModel;
await this.collection.find(...)
That returns an array of the found data, right? Nope, that would be too easy. find immediately returns a Cursor. Calling forEach on that does not call the synchronous Array.forEach but rather Cursor.forEach, which is async, so we've got a race problem. The solution is to promisify the cursor to its result:
const result = await this.collection.find(...).toArray();
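Alternatively, if you'd rather not load everything into memory with toArray(), the driver's cursors can also be iterated asynchronously. A sketch using for await...of, inside the same async function as your original code:
const cursor = this.collection.find({
  referenceIds: { $in: [referenceId] }
});

const profiles = [];
for await (const row of cursor) {
  profiles.push(new HorseProfileModel(row));
}
console.log(profiles); // populated, because the iteration has finished by now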
I'm able to query my users array with an e-mail address and return the user's account info:
users.orderByChild('email').equalTo(authData.user.email).once('value').then(function(snapshot) {
  console.log(snapshot.val());
  console.log(snapshot.key); // 'users'
  console.log(snapshot.child('email').key); // 'email'
  ...
How do I get the key (-KiBBDaj4fBDRmSS3j0r)? snapshot.key returns users. snapshot.child('email').key returns email. The key doesn't appear to be a child; it appears to sit between users and email.
You could do something like this:
var key = Object.keys(snapshot.val())[0];
Ref: https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Object/keys
The Object.keys() method returns an array of a given object's own
enumerable properties, in the same order as that provided by a
for...in loop (the difference being that a for-in loop enumerates
properties in the prototype chain as well).
Realtime database:
For this you can simply use: snapshot.key
snapshot = firebase.database.DataSnapshot
this.app.database()
  .ref('/data/')
  .on('value', function(snapshot) {
    const id = snapshot.key;
    //----------OR----------//
    const data = snapshot.val() || null;
    if (data) {
      const id = Object.keys(data)[0];
    }
  });
Firestore:
snapshot.id
snapshot = firebase.firestore.DocumentSnapshot
this.app.firestore()
  .collection('collection')
  .doc('document')
  .onSnapshot(function(snapshot) {
    const id = snapshot.id;
    //----------OR----------//
    const data = snapshot.data() || null;
    if (data) {
      const id = Object.keys(data)[0];
    }
  });
users.orderByChild('email').equalTo(authData.user.email) is a Query (doc) that you have built by "chaining together one or more of the filter methods". What is a bit specific about your query is that it returns a dataSnapshot with only one child, since you query with equalTo(authData.user.email).
As explained here, in this exact case, you should loop over the returned dataSnapshot with forEach():
Attaching a value observer to a list of data will return the entire list of data as a single snapshot which you can then loop over to access individual children.
Even when there is only a single match for the query, the snapshot is
still a list; it just contains a single item. To access the item,
you need to loop over the result, as follows:
ref.once('value', function(snapshot) {
  snapshot.forEach(function(childSnapshot) {
    var childKey = childSnapshot.key;
    var childData = childSnapshot.val();
    // ...
  });
});
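Applied to the query from your question, that would look something like this (a sketch based on your code):
users.orderByChild('email').equalTo(authData.user.email).once('value').then(function(snapshot) {
  snapshot.forEach(function(childSnapshot) {
    console.log(childSnapshot.key);   // e.g. -KiBBDaj4fBDRmSS3j0r
    console.log(childSnapshot.val()); // the user's account info
  });
});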
Similar to camden_kid, I used Object.keys(arr), but in three lines:
var arr = snapshot.val();
var arr2 = Object.keys(arr);
var key = arr2[0];
console.log(key) // -KiBBDaj4fBDRmSS3j0r
I found a new way to get the data based on the snapshot key:
firebase.database().ref('events').once('value', (data) => {
  //console.log(data.toJSON());
  data.forEach(function(snapshot) {
    var newPost = snapshot.val();
    console.log("description: " + newPost.description);
    console.log("interest: " + newPost.interest);
    console.log("players: " + newPost.players);
    console.log("uid: " + newPost.uid);
    console.log("when: " + newPost.when);
    console.log("where: " + newPost.where);
  });
});
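To actually read the key of each child in that loop, snapshot.key is available alongside snapshot.val(); for example:
data.forEach(function(snapshot) {
  console.log("key: " + snapshot.key); // the push id of this event
  var newPost = snapshot.val();
  // ...log the fields as above...
});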
I am facing a problem cloning a Mongoose query object. JavaScript copies one object into another by reference, but in my project there is a scenario where I need to copy one object into another by value.
var query = domain.User.find({
  deleted: false,
  role: role
})

var query1 = query;
In my scenario, a change to the query object must not be reflected in query1. I googled and tried many ways to clone the object, but it doesn't work. The query object is used in another function for pagination, and the query1 object is used for a count query.
1. I tried Object.clone(query1), which gives the error Object.clone is not a function.
2. I tried Object.assign(query1), but it doesn't work either.
3. I tried many other ways. Can anybody help me sort out this problem?
Alternative solution using merge method:
const query = domain.User.find({
  deleted: false,
  role: role
}).skip(10).limit(10)
const countQuery = query.model.find().merge(query).skip(0).limit(0)
const [users, count] = await Promise.all([query, countQuery.count()])
You are trying to clone a cursor, but that is not the right approach; you probably just need to create another one, like this:
var buildQuery = function() {
  return domain.User.find({
    deleted: false,
    role: role
  });
};
var query = buildQuery();
var query1 = buildQuery();
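You can then run one query for the page of results and the other for the count, e.g. (a sketch; the skip/limit values are placeholders):
var query = buildQuery().skip(10).limit(10);
var query1 = buildQuery();

Promise.all([query.exec(), query1.countDocuments().exec()])
  .then(function([users, total]) {
    console.log(total, users.length);
  });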
This works for me:
const qc = sourceQuery.toConstructor();
const clonedQuery = new qc();
This code works in a pagination function where sourceQuery is passed as a parameter and I don't know which models are used. It also works with aggregations and complex queries.
public async paging(
  query: mongoose.DocumentQuery<mongoose.Document[], mongoose.Document>,
  params,
  transformer: any = null
) {
  let page = Number(params.page);
  if (!page) page = 1;
  let page_size = Number(params.count);
  if (!page_size) page_size = 100;

  const qc = query.toConstructor();
  const cq = new qc();

  return cq.countDocuments().exec()
    .then(async (total) => {
      const s = params.sort;
      if (s) {
        query.sort(s);
      }
      query.limit(page_size);
      query.skip(page_size * (page - 1));

      let results = await query.exec();
      if (transformer) {
        results = await Promise.all(results.map((i) => transformer(i)));
      }

      const r = new DtoCollection();
      r.pages = Math.ceil(total / page_size);
      r.total = total;
      (r.results as any) = results;
      return r;
    });
}
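Calling it might look like this, inside an async function (a sketch; repository, UserModel and the param values are hypothetical):
const page = await repository.paging(
  UserModel.find({ deleted: false, role: role }),
  { page: 1, count: 20, sort: '-_created' },
  (doc) => doc.toObject() // optional transformer
);
console.log(page.total, page.pages, page.results.length);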
Sergii Stotskyi's answer works just fine and is very elegant, except that count is deprecated.
countDocuments or estimatedDocumentCount should be used instead.
However, this causes the error "the limit must be positive". We can work around this by setting the limit to a large integer.
const query = domain.User.find({
  deleted: false,
  role: role
}).skip(10).limit(10)
const countQuery = query.model.find().merge(query).skip(0).limit(Number.MAX_SAFE_INTEGER)
const [users, count] = await Promise.all([query, countQuery.countDocuments()])
Since mongoose v6 you can use Query.prototype.clone
E.g. for your code snippet:
const query = domain.User.find({
  deleted: false,
  role: role
})
const query1 = query.clone();
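A typical use for the clone is the pagination/count pattern from the answers above (a sketch, inside an async function):
const query = domain.User.find({ deleted: false, role: role });
const countQuery = query.clone();

const [users, total] = await Promise.all([
  query.skip(10).limit(10).exec(),
  countQuery.countDocuments().exec()
]);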