I am trying to get a large dataset from the Firebase Realtime Database into Algolia for search indexing, and every time I get a timeout error. What does that mean and how can I solve it?
I have created a Node.js application to copy the data over, shown below. I can copy small amounts of data, say 10 contacts, just fine; the problem is 3,000 contacts.
const algoliasearch = require('algoliasearch');
const dotenv = require('dotenv');
const firebase = require('firebase');

// load values from the .env file in this directory into process.env
dotenv.load();

// configure firebase
firebase.initializeApp({
  databaseURL: process.env.FIREBASE_DATABASE_URL,
});
const database = firebase.database();

// configure algolia
const algolia = algoliasearch(
  process.env.ALGOLIA_APP_ID,
  process.env.ALGOLIA_API_KEY
);
const index = algolia.initIndex(process.env.ALGOLIA_INDEX_NAME);

// Get all contacts from Firebase
database.ref('/contactDetail/brazil').once('value', contacts => {
  // Build an array of all records to push to Algolia
  const records = [];
  contacts.forEach(contact => {
    // get the key and data from the snapshot
    const childKey = contact.key;
    const childData = contact.val();
    // We set the Algolia objectID as the Firebase .key
    childData.objectID = childKey;
    // Add object for indexing
    records.push(childData);
  });

  // Add or update new objects
  index
    .saveObjects(records)
    .then(() => {
      console.log('Contacts imported into Algolia');
    })
    .catch(error => {
      console.error('Error when importing contact into Algolia', error);
      process.exit(1);
    });
});
I get a snippet of the contacts JSON and the following error output:
contentLength: 854068,
method: 'POST',
timeouts: [Object],
url: '/1/indexes/app_NAME/batch',
startTime: 2018-01-17T20:35:27.239Z,
endTime: 2018-01-17T20:35:39.242Z,
duration: 12003 } ] }
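The duration field shows the request ran for about 12 seconds before the client gave up, so the single ~854 KB batch request appears to be hitting the client-side timeout. A minimal sketch of one workaround, splitting the records array from the snippet above into smaller batches and saving them sequentially (the 500-record chunk size is an assumption to tune):

// Split the records into smaller batches so no single request
// runs long enough to hit the timeout
const chunkSize = 500;
const chunks = [];
for (let i = 0; i < records.length; i += chunkSize) {
  chunks.push(records.slice(i, i + chunkSize));
}

// Save the chunks one after another
chunks
  .reduce(
    (promise, chunk) => promise.then(() => index.saveObjects(chunk)),
    Promise.resolve()
  )
  .then(() => console.log('Contacts imported into Algolia'))
  .catch(error => {
    console.error('Error when importing contacts into Algolia', error);
    process.exit(1);
  });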
ERROR TypeError: (0, _database.where) is not a function. (In '(0, _database.where)('email', '==', email)', '(0, _database.where)' is undefined)
Here I am generating a unique parent in the database using a uid and updating some values in it. I am trying to do some filtering based on the email to know if the user exists: if so, update the values; if not, generate a new user.
import { ref, get, set, query, where } from 'firebase/database';

useEffect(() => {
  const writeToDatabase = () => {
    if (location && location.coords && UserDataFromGoogleAuth) {
      const usersRef = ref(database, 'users');
      const email = UserDataFromGoogleAuth.email;
      if (email) {
        const query = query(usersRef, where('email', '==', email));
        get(query).then((snapshot) => {
          const uuid = snapshot.exists() ? Object.keys(snapshot.val())[0] : uid();
          const userRef = ref(database, `/users/${uuid}`);
          const userData = {
            id: uuid,
            name: UserDataFromGoogleAuth.displayName,
            email: email,
            includedKids: 0,
            isSubscribed: false,
            long: location.coords.longitude,
            lat: location.coords.latitude,
            online: props.online,
            profilePicture: UserDataFromGoogleAuth.photoURL,
          };
          set(userRef, userData);
        }).catch(error => {
          console.log(error);
        });
      }
    }
  };
  writeToDatabase();
}, [UserDataFromGoogleAuth, location, props.online]);
database structure:
Database > users > {uid for each user} > {email}
The where method is part of the Cloud Firestore API (firebase/firestore).
There is no direct equivalent in the Realtime Database API (firebase/database) that allows using a similar shorthand.
Instead, you invoke one of the many QueryConstraint-returning methods:
endAt(), endBefore(), startAt(), startAfter(), limitToFirst(), limitToLast(), orderByChild(), orderByKey(), orderByPriority(), orderByValue() or equalTo(). Take a look at QueryConstraint for links to the API reference for these methods and read over the documentation for Realtime Database: Sorting and filtering data.
The equivalent of
// firestore
const q = query(usersColRef, where('email', '==', email));
is
// database
const q = query(usersRef, orderByChild('email'), equalTo(email));
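Applied to the code in the question, a minimal sketch (note the result variable must not be named query, or it shadows the imported query() function and throws before initialization):

import { ref, get, query, orderByChild, equalTo } from 'firebase/database';

const usersRef = ref(database, 'users');
// Named emailQuery so it doesn't shadow the imported query() function
const emailQuery = query(usersRef, orderByChild('email'), equalTo(email));
get(emailQuery).then((snapshot) => {
  // snapshot.val() is keyed by uid, matching the database structure above;
  // uid() is the question's own helper for generating a new key
  const uuid = snapshot.exists() ? Object.keys(snapshot.val())[0] : uid();
  // ... build userData and call set(ref(database, `/users/${uuid}`), userData) as before
});

For the query to be indexed on the server, the Realtime Database rules should also declare ".indexOn": "email" under /users.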
I am trying so hard to upload one image from Cloud Functions.
I am sending an image from the web to the cloud function using onRequest. I am sending a base64 string and the fileName. I was following different tutorials on the internet and couldn't seem to solve my problem.
Here is my code. I think I am doing something wrong with the service account JSON. Although I generated the JSON file and used it, it still didn't work.
When I don't use the service account JSON, I get the error The caller does not have permission at Gaxios._request.
And when I do use serviceAccount.json, I get this error: The "path" argument must be of type string. Received an instance of Object, which comes from file.createWriteStream(), I think.
Anyway, here is the code. Can anyone please help me with this?
The projectId that I am using is shown in the picture below
const functions = require("firebase-functions");
const admin = require("firebase-admin");
const projectId = functions.config().apikeys.projectid; // In the picture below
const stream = require("stream");
const cors = require("cors")({ origin: true });
const { Storage } = require("@google-cloud/storage");

// Enable Storage
const storage = new Storage({
  projectId: projectId, // I did use serviceAccount json here but that wasn't working
});

// With serviceAccount.json code
// const storage = new Storage({
//   projectId: projectId,
//   keyFilename: serviceAccount,
// });
// This is giving the error of: The "path" argument must be of type string. Received an instance of Object

exports.storeUserProfileImage = functions.https.onRequest((req, res) => {
  cors(req, res, async () => {
    try {
      const bucket = storage.bucket(`gs://${projectId}.appspot.com`);
      let pictureURL;
      const image = req.body.image;
      const userId = req.body.userId;
      const fileName = req.body.fileName;
      const mimeType = image.match(
        /data:([a-zA-Z0-9]+\/[a-zA-Z0-9-.+]+).*,.*/
      )[1];
      // trim off the part of the payload that is not part of the base64 string
      const base64EncodedImageString = image.replace(
        /^data:image\/\w+;base64,/,
        ""
      );
      const imageBuffer = Buffer.from(base64EncodedImageString, "base64");
      const bufferStream = new stream.PassThrough();
      bufferStream.end(imageBuffer);

      // Define file and fileName
      const file = bucket.file("images/" + fileName);
      bufferStream
        .pipe(
          file.createWriteStream({
            metadata: {
              contentType: mimeType,
            },
            public: true,
            validation: "md5",
          })
        )
        .on("error", function (err) {
          console.log("error from image upload", err.message);
        })
        .on("finish", function () {
          // The file upload is complete.
          console.log("Image uploaded");
          file
            .getSignedUrl({
              action: "read",
              expires: "03-09-2491",
            })
            .then((signedUrls) => {
              // signedUrls[0] contains the file's public URL
              console.log("Signed urls", signedUrls[0]);
              pictureURL = signedUrls[0];
            });
        });
      console.log("image url", pictureURL);
      res.status(200).send(pictureURL);
    } catch (e) {
      console.log(e);
      return { success: false, error: e };
    }
  });
});
const storage = new Storage({
  projectId: projectId,
  keyFilename: "" // <-- Path to a .json, .pem, or .p12 key file
});
keyFilename accepts a path to the file where your service account credentials are stored.
folder
|- index.js
|- credentials
   |- serviceAccountKey.json
If your directory structure looks like the above, then the path should be like this:
const storage = new Storage({
  projectId: projectId,
  keyFilename: "./credentials/serviceAccountKey.json"
});
Do note that if you are using Cloud Functions, the SDK will use Application Default Credentials, so you don't have to pass those params. Simply initialize as shown below:
const storage = new Storage()
So first of all, I didn't pass any service account because I am using Firebase Cloud Functions, as @Dharmaraj said in his answer.
Secondly, this was a permission problem in the Google Cloud Platform, which can be solved by going through the following steps:
Go to your project's Cloud Console (https://console.cloud.google.com/) > IAM & admin > IAM and find the App Engine default service account. Click the pencil at the far left, then click Add role. In the filter field, enter Service Account Token Creator, click on it, save, and you are good to go.
I found this solution here:
https://github.com/firebase/functions-samples/issues/782
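Putting both answers together, a minimal sketch of the initialization inside a Cloud Function, assuming the project's default Firebase storage bucket:

const admin = require("firebase-admin");

// On Cloud Functions, initializeApp() with no arguments picks up
// Application Default Credentials, so no key file is needed
admin.initializeApp();

// The Admin SDK returns the default bucket (<project-id>.appspot.com)
// without hard-coding its name
const bucket = admin.storage().bucket();

With the Service Account Token Creator role granted as described above, file.getSignedUrl() can then sign URLs without a key file.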
I am trying to deploy my Firebase Cloud Functions. When I run firebase deploy --only functions after adding my functions to the index.js file, it gives me the error mentioned above.
Here is my firebase.json file:
"functions": {
"predeploy": [
"npm --prefix \"$RESOURCE_DIR\" run lint"
],
"source": "functions"
},
"database": {
"rules": "database.rules.json"
},
"firestore": {
"rules": "firestore.rules",
"indexes": "firestore.indexes.json"
}
}
Here is my index.js file:
const functions = require('firebase-functions');
const algoliasearch = require('algoliasearch');

// get api keys for the algolia from the env variable of cloud functions
const APP_ID = functions.config().algolia.app;
const ADMIN_KEY = functions.config().algolia.key;

// algolia client
const client = algoliasearch(APP_ID, ADMIN_KEY);
const index = client.initIndex('pals');

// ADD algolia index from firestore database when a document is created in firestore:
exports.addToIndex = functions.firestore.document('users/{userId}').onCreate(snapshot => {
  const data = snapshot.data();
  const objectID = snapshot.id;
  // add objectID to algolia index
  return index.addObject({ ...data, objectID });
});

// UPDATE algolia index from firestore database when a document is updated in firestore
exports.updateIndex = functions.firestore.document().onUpdate(change => {
  // change.after gives the document after the change
  const newData = change.after.data();
  const objectID = change.id;
  // update objectID to algolia index
  return index.saveObject({ ...newData, objectID });
});

// DELETE algolia index from firestore database when a document is deleted in firestore
exports.updateIndex = functions.firestore.document().onDelete(snapshot => {
  // delete objectID to algolia index
  return index.deleteObject(snapshot.id);
});
I tried running firebase deploy --only functions with the hello world snippet they provide. It worked fine with that.
As the error message says, you always need to pass a path to the functions.firestore.document(...) function to determine on which document paths the function triggers.
You do this correctly here:
exports.addToIndex = functions.firestore.document('users/{userId}').onCreate(snapshot=>{
...
But you are not passing a path in these two cases:
exports.updateIndex = functions.firestore.document().onUpdate(change =>{
...
exports.updateIndex = functions.firestore.document().onDelete(snapshot =>{
...
If you also want them to trigger on user documents, just like with onCreate, they'd be:
exports.updateIndex = functions.firestore.document('users/{userId}').onUpdate(change =>{
...
exports.updateIndex = functions.firestore.document('users/{userId}').onDelete(snapshot =>{
...
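Two further fixes are worth noting (assumptions beyond the answer's scope): a second exports.updateIndex silently overwrites the first, so the onDelete handler needs its own export name, and the document ID in an onUpdate handler comes from change.after.id, since the Change object itself has no id property. A sketch of the corrected update and delete handlers:

exports.updateIndex = functions.firestore
  .document('users/{userId}')
  .onUpdate(change => {
    const newData = change.after.data();
    const objectID = change.after.id; // change.id is undefined on a Change object
    return index.saveObject({ ...newData, objectID });
  });

// A distinct export name, so it does not overwrite updateIndex above
exports.deleteFromIndex = functions.firestore
  .document('users/{userId}')
  .onDelete(snapshot => index.deleteObject(snapshot.id));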
I would like to retrieve the subcollections by making my request with geofirestore, like so:
The ID of each PRODUCTS document corresponds to that of the user who created new products (for the moment there is only one).
Here is my code right now:
import firestore from '@react-native-firebase/firestore';
import * as geofirestore from 'geofirestore';

const firestoreApp = firestore();
const GeoFirestore = geofirestore.initializeApp(firestoreApp);
const geocollection = GeoFirestore.collection('PRODUCTS');
const query = geocollection.limit(30).near({
  center: new firestore.GeoPoint(coords.latitude, coords.longitude),
  radius: 1000,
});

try {
  query.onSnapshot((querySnapshot) => {
    const productsQueried = querySnapshot.docs.reduce(
      (result, documentSnapshot) => {
        console.log(documentSnapshot);
        if (documentSnapshot.id !== user.uid) {
          result.push(documentSnapshot.data());
        }
        return result;
      },
      []
    );
    setListOfProducts(productsQueried);
    console.log(productsQueried);
    setLoading(false);
  });
} catch (error) {
  console.log(error);
}
Of course, I can only find the geocollection, but cannot access the 'USER_PRODUCTS' collection inside.
{exists: true, id: "OUJ6r3aF9nVfgtfkQRES7kpYCko1", distance: 0, data: ƒ}
The final goal is to retrieve the list of products of each nearby customer, and then filter out those of the current user.
Do I necessarily have to make a second request (or can I do it in one?), or do I have to change the way I save the product lists of different users in Firestore?
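For what it's worth, Firestore reads are shallow: a document snapshot never includes its subcollections, so each nested collection needs its own read. A minimal sketch of a second request per matching document (assuming the nested collection is named USER_PRODUCTS, as in the question):

query.onSnapshot(async (querySnapshot) => {
  const products = [];
  for (const documentSnapshot of querySnapshot.docs) {
    if (documentSnapshot.id === user.uid) continue; // skip the current user
    // Subcollections are not part of the parent snapshot;
    // each one requires its own read
    const subSnapshot = await firestoreApp
      .collection('PRODUCTS')
      .doc(documentSnapshot.id)
      .collection('USER_PRODUCTS')
      .get();
    subSnapshot.forEach((productDoc) => products.push(productDoc.data()));
  }
  setListOfProducts(products);
  setLoading(false);
});

Alternatively, a collection group query over USER_PRODUCTS would fetch all product documents in one request, but it bypasses the geoquery, so the distance filtering would have to be redone by hand.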
My Firestore subcollection names are of the format 'subcollection_name_yyyymmdd'. Whenever new documents are added, they are identified through the 'yyyymmdd' part of the subcollection name. I need to take Firestore exports for these subcollections incrementally, based on the 'yyyymmdd' values. Below is my cloud function, which takes a full Firestore export at the moment. Is there a way I can parameterize collectionIds to take the subcollection names by passing the yyyymmdd part as a variable/parameter?
e.g. something like collectionIds: ['subcollection_name_{$date}']?
const firestore = require('@google-cloud/firestore');
const client = new firestore.v1.FirestoreAdminClient();
const bucket = 'gs://BUCKET_NAME';

exports.scheduledFirestoreBackup = (event, context) => {
  const databaseName = client.databasePath(
    // process.env.GCLOUD_PROJECT,
    "PROJECT_ID",
    '(default)'
  );
  return client
    .exportDocuments({
      name: databaseName,
      outputUriPrefix: bucket,
      collectionIds: ['subcollection_name'],
    })
    .then(responses => {
      const response = responses[0];
      console.log(`Operation Name: ${response['name']}`);
      return response;
    })
    .catch(err => {
      console.error(err);
    });
};
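Since collectionIds is a plain array of strings, it can be built at run time with a template literal. A minimal sketch, assuming the yyyymmdd suffix should be the date the function runs:

exports.scheduledFirestoreBackup = (event, context) => {
  // Build the yyyymmdd suffix at run time; assumed here to be the current
  // date, but it could equally be read from the triggering event payload
  const now = new Date();
  const yyyymmdd =
    now.getFullYear().toString() +
    String(now.getMonth() + 1).padStart(2, '0') +
    String(now.getDate()).padStart(2, '0');

  const databaseName = client.databasePath('PROJECT_ID', '(default)');
  return client.exportDocuments({
    name: databaseName,
    outputUriPrefix: bucket,
    // The date is interpolated into the subcollection name
    collectionIds: [`subcollection_name_${yyyymmdd}`],
  });
};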