Creating objects in model.js, Node.js - javascript

In my model.js (using Mongoose), I initially create 40 objects which are to be used throughout the entire program. No other function in any file creates more objects; they only update the existing ones.
My model.js
var TicketSchema = mongoose.model('Tickets', TicketSchema);
for (let i = 1; i <= 40; i++) {
    var new_ticket = new TicketSchema({ ticket_number: i });
    new_ticket.save(function(err, ticket) {
    });
}
The problem is that I noticed there were many more objects than 40 after some time. I wanted to know whether model.js runs more than once during execution, or whether this is just due to repeatedly calling npm run start and then closing the server.
Also, is there a better way of creating objects initially which are to be used for the entire program?

It will create 40 new documents every time you start the server. You can use the function below to avoid creating them when records already exist, by checking the count first.
const TicketModel = mongoose.model('Tickets', TicketSchema);

const insertTicketNumber = async () => {
    try {
        const count = await TicketModel.countDocuments({});
        if (count) return;
        await TicketModel.create(
            [...Array(40).keys()]
                .map(i => i + 1)
                .map(number => ({ ticket_number: number }))
        );
    } catch (error) {
        console.log(error.message);
    }
};
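For seed data that the whole program relies on, the usual pattern is to run such a seeder exactly once at startup, after the database connection is established, rather than at module load time. A minimal sketch of the wiring, assuming an Express app and a MONGO_URI connection string (both hypothetical here):

// Hypothetical startup wiring; `app` and MONGO_URI are assumptions.
mongoose.connect(process.env.MONGO_URI)
    .then(() => insertTicketNumber())  // seeds only when the collection is empty
    .then(() => app.listen(3000, () => console.log('server started')))
    .catch(err => console.error(err));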

Related

Using Transactions and batched writes within a Cloud Function

Transactions and batched writes can be used to write multiple documents by means of an atomic operation.
The documentation says that "Using the Cloud Firestore client libraries, you can group multiple operations into a single transaction."
I cannot understand what "client libraries" means here, and whether it is correct to use transactions and batched writes within a Cloud Function.
Example: suppose the database contains 3 documents (with doc IDs A, B, C). Now I need to insert 3 more (with doc IDs C, D, E). The Cloud Function should add just the new ones and send a push notification to the user telling him that 2 new documents are available.
A doc ID could already exist, but since I need to calculate how many documents are new (the ones that will be inserted), I need a way to read each doc ID first and check for its existence. Hence, I'm wondering whether transactions fit Cloud Functions or not.
Also, each transaction or batch of writes can write to a maximum of 500 documents. Is there any other way to overcome this limit within a Cloud Function?
Firestore transaction behaviour differs between the client SDKs (JS SDK, iOS SDK, Android SDK, ...) and the Admin SDK (a set of server libraries), which is the SDK used in a Cloud Function. More explanations on the differences are here in the documentation.
Because of the type of data contention used in the Admin SDK, you can, with the getAll() method, retrieve multiple documents from Firestore and hold a pessimistic lock on all returned documents.
So this is exactly the method you need to call in your transaction: you use getAll() to fetch documents C, D, and E, and you detect that only C exists, so you know that you only need to add D and E.
Concretely, it could be something along the following lines:
const db = admin.firestore();

exports.lorenzoFunction = functions
    .region('europe-west1')
    .firestore
    .document('tempo/{docId}') // Just a way to trigger the test Cloud Function!!
    .onCreate(async (snap, context) => {
        const c = db.doc('coltest/C');
        const d = db.doc('coltest/D');
        const e = db.doc('coltest/E');
        const docRefsArray = [c, d, e];

        return db.runTransaction(transaction => {
            return transaction.getAll(...docRefsArray).then(snapsArray => {
                let counter = 0;
                snapsArray.forEach(snap => {
                    if (!snap.exists) {
                        counter++;
                        transaction.set(snap.ref, { foo: "bar" });
                    } else {
                        console.log(snap.id + " exists");
                    }
                });
                console.log(counter);
                return;
            });
        });
    });
To test it: create one of the C, D, or E docs in the coltest collection, then create a doc in the tempo collection (just a simple way to trigger this test Cloud Function): the CF is triggered. Then look at the coltest collection: the two missing docs were created; and look at the CF log: counter = 2.
Also, each transaction or batch of writes can write to a maximum of 500 documents. Is there any other way to overcome this limit within a Cloud Function?
AFAIK the answer is no.
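That said, if atomicity across the whole set is not required, the usual workaround is to chunk the writes into several sequential batches of at most 500 writes each. A minimal sketch with the Admin SDK; the docs array and the 'target' collection name are placeholders, not part of the original question:

// Sketch: sequential batches of <= 500 writes each (not atomic as a whole).
async function writeInChunks(db, docs) {
    const BATCH_LIMIT = 500;
    for (let start = 0; start < docs.length; start += BATCH_LIMIT) {
        const batch = db.batch();
        for (const data of docs.slice(start, start + BATCH_LIMIT)) {
            batch.set(db.collection('target').doc(), data); // auto-generated doc IDs
        }
        await batch.commit(); // each commit stays under the 500-write cap
    }
}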
There also used to be a required one-second delay between 500-record chunks. I wrote this a couple of years ago. The script below reads the CSV file line by line, creating and filling a batch object for each line. A counter starts a new batch per 500 objects, and async/await is used to rate-limit the writes to one batch per second. Last, we notify the user of the write progress with console logging. I published an article on this here >> https://hightekk.com/articles/firebase-admin-sdk-bulk-import
NOTE: In my case I am reading a huge flat text file (a manufacturer's part number catalog) for import. You can use this as a working template, though, and modify it to suit your data source. Also, you may need to increase the memory allocated to Node for it to run:
node --max_old_space_size=8000 app.js
The script looks like:
var admin = require("firebase-admin");
var serviceAccount = require("./your-firebase-project-service-account-key.json");
var fs = require('fs');
var csvFile = "./my-huge-file.csv"
var parse = require('csv-parse');
require('should');
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: "https://your-project.firebaseio.com"
});
var firestore = admin.firestore();
var thisRef;
var obj = {};
var counter = 0;
var commitCounter = 0;
var batches = [];
batches[commitCounter] = firestore.batch();
fs.createReadStream(csvFile).pipe(
parse({delimiter: '|',relax_column_count:true,quote: ''})
).on('data', function(csvrow) {
if(counter <= 498){
if(csvrow[1]){
obj.family = csvrow[1];
}
if(csvrow[2]){
obj.series = csvrow[2];
}
if(csvrow[3]){
obj.sku = csvrow[3];
}
if(csvrow[4]){
obj.description = csvrow[4];
}
if(csvrow[6]){
obj.price = csvrow[6];
}
thisRef = firestore.collection("your-collection-name").doc();
batches[commitCounter].set(thisRef, obj);
counter = counter + 1;
} else {
counter = 0;
commitCounter = commitCounter + 1;
batches[commitCounter] = firestore.batch();
}
}).on('end',function() {
writeToDb(batches);
});
function oneSecond() {
    return new Promise(resolve => {
        setTimeout(() => {
            resolve('resolved');
        }, 1010);
    });
}

async function writeToDb(arr) {
    console.log("beginning write");
    for (let i = 0; i < arr.length; i++) {
        await oneSecond();      // rate-limit to roughly one commit per second
        await arr[i].commit();  // wait for the batch to finish before logging
        console.log("wrote batch " + i);
    }
    console.log("done.");
}

IndexedDB update/put not returning updated object in the expected state

I have an IndexedDB database that I've created with two object stores: 'games' and 'plays' (in reference to football). I am letting IDB create the keys for each store via 'autoincrement'. The 'games' store can have multiple games and, likewise, there will be multiple plays for each game. Later, I export these stores via JSON to PHP and attempt to correlate the plays that took place in game 1 (for example) to that game, and so on. I am using a 'foreign key'-like value (a gameID attribute) in the plays store to indicate that a play goes with a certain game. However, upon JSON export of the two stores, I found that the 'games' store does not have its key value exported, and therefore I cannot reliably connect a play (which has a reference to 'gameID') to a particular game (which does not contain the reference within its structure).
So, I thought the answer would be simple: create a value called 'gameID' within the 'games' store, and once I have that id, update the record in the store with the gameID value.
The problem is that I've written IDB 'update' or 'put' code which seems to be 'successful', yet when I fetch the game in question later, the value is not correct. I'm finding that my updates are not updating the data structures as I would expect to see them in Chrome Developer Tools. Below is an example of what I am talking about:
[Screenshot: the object in question in Chrome Developer Tools]
Above you can see the issue graphically, and I'm not sure what is happening. You'll see that in the areas marked "A" and "C", the updated values are listed (I do a similar update later to mark a game 'complete' at the end of a game). However, the actual data structure in IDB (indicated with "B") shows the old values that I thought I'd updated successfully. So, I'm not at all sure how to read this structure in Chrome Developer Tools, which seems to report the updates separately from the object itself.
I've tried doing this update both by passing the gameID in question and via a cursor.
function putGameID(conn, thisGameID) {
    return new Promise((resolve, reject) => {
        const tx = conn.transaction(['gamesList'], 'readwrite');
        const gameStore = tx.objectStore('gamesList');
        const gameRequest = gameStore.get(thisGameID);
        gameRequest.onsuccess = () => {
            const game = gameRequest.result;
            game.gameID = thisGameID;
            console.log({game});
            const updateGameRequest = gameStore.put(game);
            updateGameRequest.onsuccess = () => {
                console.log("Successfully updated this game ID.");
                resolve(updateGameRequest.onsuccess);
            }
            // etc....
It appears the record was updated, just not in the manner I would expect.
I've also attempted this using a cursor update, to similar effect:
function putGameID(conn, thisGameID) {
    return new Promise((resolve, reject) => {
        const tx = conn.transaction(['gamesList'], 'readwrite');
        const gameStore = tx.objectStore('gamesList');
        gameStore.openCursor().onsuccess = function(event) {
            const cursor = event.target.result;
            if (cursor) {
                if (!cursor.value.gameID) {
                    const updatedGame = cursor.value;
                    updatedGame.gameID = thisGameID;
                    const request = cursor.update(updatedGame);
                    cursor.continue(); // continue() must be called as a method to advance the cursor
                }
            }
        }
        // etc.....
Can someone help me to understand:
(1) how to read the structure in Chrome Developer Tools? Why are the updated values not part of the object's structure?
and ...
(2) how can I modify my approach to get the results that I wish to achieve?
As requested, this is the code that originally creates the two object stores; it is called upon entry into the form:
async function idbConnect(name, version) {
    return new Promise((resolve, reject) => {
        const request = indexedDB.open(DBName, DBVersion);
        request.onupgradeneeded = function(event) {
            //if (!request.objectStoreNames.contains('gamesList')) {
            console.log('Doing indexeddb upgrade');
            db = request.result;
            /* Create the two stores - plays and games. */
            playObjectStore = db.createObjectStore('playsList', {keyPath: "id", autoIncrement: true});
            gameObjectStore = db.createObjectStore('gamesList', {keyPath: "gameID", autoIncrement: true});
            /* Create indexes */
            playObjectStore.createIndex("playIDIdx", "id", {unique: false});
            playObjectStore.createIndex("gamePlayIDIdx", "gameID", {unique: false});
            playObjectStore.createIndex("playCreatedDateIdx", "createdDate", {unique: false});
            gameObjectStore.createIndex("gameIDIdx", "gameID", {unique: true});
            gameObjectStore.createIndex("gameCreatedDateIdx", "createdDate", {unique: false});
            //return db;
            //}
        }
        request.onsuccess = () => resolve(request.result);
        request.onerror = () => reject(request.error);
        request.onblocked = () => { console.log('blocked'); };
    });
}
This code makes the call to add the game:
try {
    conn = await idbConnect(DBName, DBVersion);
    game = await addGameIDB(conn);
    // Understand what is going on in the line below.
    // Saving the game ID to populate.
    globalGameID = game.gameID;
    // Here is where I'm attempting to update the gameID....
    await putGameID(conn, globalGameID);
    console.log({globalGameID});
Once the stores are created, the following code adds a game:
function addGameIDB(conn) {
    return new Promise((resolve, reject) => {
        // some irrelevant stuff to format dates, etc....
        let newGame = [
            {
                gameID: null, // What I'd like to populate....
                gameDate: thisGameDate,
                gameTime: thisGameTime,
                team1Name: thisTeamOne,
                team2Name: thisTeamTwo,
                gameCompleted: false,
                createdDate: d
            }
        ];
        db = conn.transaction('gamesList', 'readwrite');
        let gameStore = db.objectStore('gamesList');
        let gameRequest = gameStore.add(newGame);
        gameRequest.onsuccess = (ev) => {
            console.log('Successfully inserted a game object');
            const newGameRequest = gameStore.get(gameRequest.result);
            newGameRequest.onsuccess = () => {
                resolve(newGameRequest.result);
            }
        };
        gameRequest.onerror = (err) => {
            console.log('error attempting to insert game object' + err);
            reject(gameRequest.error);
        }
    });
}
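A side note on the update pattern itself: an IndexedDB request's onsuccess fires before the transaction actually commits, so a put can look successful while the data is not yet durable (and the DevTools IndexedDB viewer typically needs a manual refresh before it shows committed values). A minimal sketch of an update that resolves only on transaction completion, reusing the 'gamesList' store from the question; the helper name is hypothetical:

// Hypothetical helper: resolve only once the readwrite transaction has committed.
function updateGameID(conn, key, gameID) {
    return new Promise((resolve, reject) => {
        const tx = conn.transaction('gamesList', 'readwrite');
        const store = tx.objectStore('gamesList');
        const getReq = store.get(key);
        getReq.onsuccess = () => {
            const game = getReq.result;
            game.gameID = gameID;
            store.put(game); // queued; durable only when the transaction commits
        };
        tx.oncomplete = () => resolve();  // the write is committed at this point
        tx.onerror = () => reject(tx.error);
    });
}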

Firebase Firestore - Async/Await Not Waiting To Get Data Before Moving On?

I'm new to the "async/await" aspect of JS and I'm trying to learn how it works.
The error I'm getting is Line 10 of the following code. I have created a firestore database and am trying to listen for and get a certain document from the Collection 'rooms'. I am trying to get the data from the doc 'joiner' and use that data to update the innerHTML of other elements.
// References and Variables
const db = firebase.firestore();
const roomRef = await db.collection('rooms');
const remoteNameDOM = document.getElementById('remoteName');
const chatNameDOM = document.getElementById('title');
let remoteUser;
// Snapshot Listener
roomRef.onSnapshot(snapshot => {
    snapshot.docChanges().forEach(async change => {
        if (roomId != null) {
            if (role == "creator") {
                const usersInfo = await roomRef.doc(roomId).collection('userInfo');
                usersInfo.doc('joiner').get().then(async (doc) => {
                    remoteUser = await doc.data().joinerName;
                    remoteNameDOM.innerHTML = `${remoteUser} (Other)`;
                    chatNameDOM.innerHTML = `Chatting with ${remoteUser}`;
                })
            }
        }
    })
})
However, I am getting the error:
Uncaught (in promise) TypeError: Cannot read property 'joinerName' of undefined
Similarly, if I change lines 10-12 to:
remoteUser = await doc.data();
remoteNameDOM.innerHTML = `${remoteUser.joinerName} (Other)`;
chatNameDOM.innerHTML = `Chatting with ${remoteUser.joinerName}`;
I get the same error.
My current understanding is that await will wait for the line/function to finish before moving forward, so remoteUser shouldn't be undefined before I try to use it. I will mention that sometimes the code works fine: the DOM elements are updated and there are no console errors.
My questions: am I thinking about async/await calls incorrectly? Is this not how I should be getting documents from Firestore? And most importantly, why does it seem to work only sometimes?
Edit: Here are screenshots of the Firestore database, as requested by @Dharmaraj. I appreciate the advice.
You are mixing the use of async/await and then(), which is not recommended. I propose below a solution based on Promise.all(), which helps in understanding the different arrays involved in the code. You can adapt it with async/await and a for-of loop, as @Dharmaraj proposed.
roomRef.onSnapshot((snapshot) => {
    // snapshot.docChanges() returns an array of the document changes since the last snapshot.
    // You may check the type of each change; I guess you maybe don't want to treat deletions.
    const promises = [];
    snapshot.docChanges().forEach(docChange => {
        // No need to use a roomId; you get the doc via docChange.doc, and its
        // DocumentReference via docChange.doc.ref
        // see https://firebase.google.com/docs/reference/js/firebase.firestore.DocumentChange
        if (role == "creator") { // It is not clear from where you get the value of role...
            const joinerRef = docChange.doc.ref.collection('userInfo').doc('joiner');
            promises.push(joinerRef.get());
        }
    });

    Promise.all(promises)
        .then(docSnapshotArray => {
            // docSnapshotArray is an array of all the docSnapshots
            // corresponding to all the joiner docs corresponding to all
            // the rooms that changed when the listener was triggered
            docSnapshotArray.forEach(docSnapshot => {
                remoteUser = docSnapshot.data().joinerName;
                remoteNameDOM.innerHTML = `${remoteUser} (Other)`;
                chatNameDOM.innerHTML = `Chatting with ${remoteUser}`;
            });
        });
});
However, what is not clear to me is how you differentiate the different elements of the "first" snapshot (i.e. roomRef.onSnapshot((snapshot) => {...})). If several rooms change, the snapshot.docChanges() array will contain several changes and, at the end, you will overwrite the remoteNameDOM and chatNameDOM elements in the last loop.
Alternatively, if you know upfront that this "first" snapshot will ALWAYS contain a single doc (because of the architecture of your app), then you can simplify the code by just treating the first and unique element, as follows:
roomRef.onSnapshot((snapshot) => {
    const roomDoc = snapshot.docChanges()[0];
    // ...
});
There are a few mistakes in this:
db.collection() does not return a promise, so await is not necessary there.
forEach ignores promises, so you can't actually use await inside of forEach; for-of is preferred in that case.
Please try the following code:
const db = firebase.firestore();
const roomRef = db.collection('rooms');
const remoteNameDOM = document.getElementById('remoteName');
const chatNameDOM = document.getElementById('title');
let remoteUser;

// Snapshot Listener
roomRef.onSnapshot(async (snapshot) => {
    for (const change of snapshot.docChanges()) {
        if (roomId != null) {
            if (role == "creator") {
                const usersInfo = roomRef.doc(roomId).collection('userInfo');
                const doc = await usersInfo.doc('joiner').get();
                remoteUser = doc.data().joinerName;
                remoteNameDOM.innerHTML = `${remoteUser} (Other)`;
                chatNameDOM.innerHTML = `Chatting with ${remoteUser}`;
            }
        }
    }
})

MongoDB values with undefined script

I'm trying to run a query across different databases, which I cannot merge into one. The following query probably fails due to an async call which I cannot manage to solve.
var someImageIds = ["111111111111111111"]

use "databaseA"
var transactions = db.transactions.find({"data.transactionImaginaryId": {$in: someImageIds}}) // uses index

use "databaseB"
transactions.forEach(transaction => {
    var report = db.reports.find({"metadata.companyId": parseInt(transaction.data.companyId), "metadata.originReportId": transaction.data.reportId}).project({}) // uses index
    var expenses = db.expenses.find({"metadata.reportId": report._id}) // uses index
    var assets = db.assets.find({"_id": report.assets[0].imaginaryId}) // uses index
    print(`report with status: ${"report.reportFlow.value"}, ${expenses.count()} expenses, ${assets.count()} assets for ${transaction.data.matchType} transaction _id: ${transaction._id.valueOf()}`)
})
The problem is that the line
var report = db.reports.find({"metadata.companyId": parseInt(transaction.data.companyId), "metadata.originReportId": transaction.data.reportId}).project({})
returns undefined, and I cannot continue with the query, since the next line uses this line's data.
Any ideas on how to solve this?
I'm using NoSQLBooster v6.2.8 and MongoDB 4, and the query is written in the NoSQLBooster console.
Thanks!
Thanks to @Jeremy Thille, I managed to write the following working code:
var someImageIds = ["111111111111111111"]

use "databaseA"
var transactions = db.transactions.find({"data.transactionImaginaryId": {$in: someImageIds}}) // uses index

use "databaseB"
transactions.forEach((transaction) => {
    const report = await(db.reports.find({ "metadata.companyId": parseInt(transaction.data.companyId), "metadata.originReportId": transaction.data.reportId }).toArray()) // uses index
    const expenses = await(db.expenses.find({ "metadata.reportId": report[0]._id }).toArray()) // uses index
    const assets = await(db.assets.find({ "_id": report[0].assets[0].imaginaryId }).toArray()) // uses index
    print(`report with status: ${report[0].reportFlow.value}, ${expenses.length} expenses, ${assets.length} assets for ${transaction.data.matchType} transaction _id: ${transaction._id.valueOf()}`)
});
Unfortunately, databases (and HTTP requests, and many other things) are not instantaneous. They need some time to perform an operation, so you need to await them. That can't be done in a .forEach() loop, but it can be in a for loop:
const someFunctionName = async () => { // needs async
    for (let transaction of transactions) {
        const report = await db.reports.find({ "metadata.companyId": parseInt(transaction.data.companyId), "metadata.originReportId": transaction.data.reportId }).project({}) // uses index
        const expenses = await db.expenses.find({ "metadata.reportId": report._id }) // uses index
        const assets = await db.assets.find({ "_id": report.assets[0].imaginaryId }) // uses index
        print(`report with status: ${report.reportFlow.value}, ${expenses.count()} expenses, ${assets.count()} assets for ${transaction.data.matchType} transaction _id: ${transaction._id.valueOf()}`)
    }
}
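Note that the wrapper above is only defined; it still has to be invoked once so the loop actually runs. A hypothetical invocation, assuming transactions has already been fetched as shown earlier:

// Hypothetical invocation of the async wrapper defined above.
someFunctionName().then(() => print('done'));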

Cloud Functions: How to copy Firestore Collection to a new document?

I'd like to make a copy of a collection in Firestore upon an event, using Cloud Functions.
I already have this code that iterates over the collection and copies each document:
const firestore = admin.firestore()
firestore.collection("products").get().then(query => {
    query.forEach(function(doc) {
        var promise = firestore.collection(uid).doc(doc.data().barcode).set(doc.data());
    });
});
Is there a shorter version, to just copy the whole collection at once?
I wrote a small Node.js snippet for this.
const firebaseAdmin = require('firebase-admin');
const serviceAccount = '../../firebase-service-account-key.json';
const firebaseUrl = 'https://my-app.firebaseio.com';

firebaseAdmin.initializeApp({
    credential: firebaseAdmin.credential.cert(require(serviceAccount)),
    databaseURL: firebaseUrl
});

const firestore = firebaseAdmin.firestore();

async function copyCollection(srcCollectionName, destCollectionName) {
    const documents = await firestore.collection(srcCollectionName).get();
    let writeBatch = firebaseAdmin.firestore().batch();
    const destCollection = firestore.collection(destCollectionName);
    let i = 0;
    for (const doc of documents.docs) {
        writeBatch.set(destCollection.doc(doc.id), doc.data());
        i++;
        if (i > 400) { // a write batch only allows a maximum of 500 writes per batch
            i = 0;
            console.log('Intermediate committing of batch operation');
            await writeBatch.commit();
            writeBatch = firebaseAdmin.firestore().batch();
        }
    }
    if (i > 0) {
        console.log('Firebase batch operation completed. Doing final committing of batch operation.');
        await writeBatch.commit();
    } else {
        console.log('Firebase batch operation completed.');
    }
}

copyCollection('customers', 'customers_backup')
    .then(() => console.log('copy complete'))
    .catch(error => console.log('copy failed. ' + error));
Currently, no. Looping through each document using Cloud Functions and then setting a new document to a different collection with the specified data is the only way to do this. Perhaps this would make a good feature request.
How many documents are we talking about? For something like 10,000 it should only take a few minutes, tops.
This is the method I use to copy data to another collection; I used it to shift data (like sells) from an active collection to a 'sells feed' or 'sells history' collection.
At the top I reference the documents; at the bottom is the quite compact code.
You can simply add a for loop on top for more than one operation.
Hope it helps somebody :)
DocumentReference copyFrom = FirebaseFirestore.instance.collection('curSells').doc('0001');
DocumentReference copyTo = FirebaseFirestore.instance.collection('sellFeed').doc('0001');

copyFrom.get().then((value) {
    copyTo.set(value.data());
});
There is no fast way at the moment, but I recommend you rewrite your code like this:
import { firestore } from "firebase-admin";

async function copyCollection() {
    const products = await firestore().collection("products").get();
    for (const doc of products.docs) {
        // for-of rather than forEach, so each write is actually awaited
        await firestore().collection(uid).doc(doc.get('barcode')).set(doc.data());
    }
}
