I have a local function that is called like so:
exports.testFunction = functions.pubsub
  .schedule(schedule)
  .onRun(() => test.scraper('123'));
However, test.scraper() is not an HTTPS function; it's just a regular function.
How can I test it with Firebase without having to wrap it in a callable HTTPS function? Do I have to use the functions shell?
Thanks.
As Doug said, you can test this separately if that's what you need. However, according to the official documentation, you can also use offline unit tests if that suits your workflow. You still have to wrap some functions, but it works well for testing.
Example function from the documentation (the offline test then exercises it with stubbed data and Sinon):
// Listens for new messages added to /messages/:pushId/original and creates an
// uppercase version of the message at /messages/:pushId/uppercase
exports.makeUppercase = functions.database.ref('/messages/{pushId}/original')
  .onCreate((snapshot, context) => {
    // Grab the current value of what was written to the Realtime Database.
    const original = snapshot.val();
    console.log('Uppercasing', context.params.pushId, original);
    const uppercase = original.toUpperCase();
    // You must return a Promise when performing asynchronous tasks inside a function,
    // such as writing to the Firebase Realtime Database.
    // Setting an "uppercase" sibling in the Realtime Database returns a Promise.
    return snapshot.ref.parent.child('uppercase').set(uppercase);
  });
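For reference, here is roughly what an offline test for that function could look like with firebase-functions-test and Sinon. This is only a sketch: the ../index path and the hand-rolled fake snapshot are my assumptions, not part of the documentation example.
const functionsTest = require('firebase-functions-test')(); // offline mode, no project config
const sinon = require('sinon');
const myFunctions = require('../index'); // hypothetical module exporting makeUppercase

describe('makeUppercase', () => {
  after(() => functionsTest.cleanup());

  it('writes the uppercased message next to the original', async () => {
    const setStub = sinon.stub().resolves();
    // Fake snapshot: only the pieces makeUppercase actually touches are stubbed.
    const fakeSnapshot = {
      val: () => 'hello',
      ref: { parent: { child: sinon.stub().returns({ set: setStub }) } },
    };
    const wrapped = functionsTest.wrap(myFunctions.makeUppercase);
    await wrapped(fakeSnapshot, { params: { pushId: 'abc123' } });
    sinon.assert.calledWith(setStub, 'HELLO');
  });
});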
I am working on a piece of my app where I need to make a call to a Firebase Function, which parses through Firestore data to return a dictionary that I will use to populate UI in Swift.
My function is declared as so:
exports.getUIBetData = functions.https.onRequest( (request, response) => {
This function takes in a userID as a body parameter. Then I need to hit Firestore to get the data of a specific document tied to this userId and perform some actions on it. I believe I am running into issues with the async nature of getting data from a document, as I keep getting errors or unresolved promises. Here is my query:
const body = request.body;
const userId = body.data.userId;
const bettorIdDoc = admin.firestore()
  .collection("user_dim").doc(userId).get().data();
I can confirm that "user_dim" is a valid collection, and the userId is a key to a document within it. However, I can't access the fields tied to this doc.
I was originally trying with just .data(), and realized from the official documentation that you need to do .get().data(); however, get() is async. How do I handle the async nature when I am attempting to do this within my main function (exports.getUIBetData = functions.https.onRequest( (request, response) => {)?
Error:
TypeError: admin.firestore(...).collection(...).doc(...).get(...).data is not a function
Loading data from Firestore (and pretty much any cloud API) is an asynchronous operation. You can see this by checking the return type of get(), which is Promise<DocumentSnapshot> and not just DocumentSnapshot.
This means you'll have to use then or await (if you're in an async context) to be able to call data():
const bettorIdRef = admin.firestore()
  .collection("user_dim").doc(userId);

bettorIdRef.get().then((snapshot) => console.log(snapshot.data()));
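If the handler itself is declared async, the same lookup with await could look roughly like this (a sketch reusing the names from the question):
exports.getUIBetData = functions.https.onRequest(async (request, response) => {
  const userId = request.body.data.userId;
  const snapshot = await admin.firestore()
    .collection("user_dim").doc(userId).get();
  // data() returns undefined if the document does not exist
  response.json(snapshot.data());
});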
I'm working on an existing Node.js web service that uses Hapi.js, with Lab and Sinon for testing. The service connects to a Postgres DB using MassiveJS. There's a method implemented by someone else that doesn't have unit tests. Now I'm reusing this method and I want to implement some unit tests for it.
The method runs a MassiveJS transaction internally, persisting to several tables.
async createSomething(payload) {
  const { code, member } = payload;
  const foundCompany = await this.rawDb.ethnics.tenants.findOne({ code });
  if (foundCompany && foundCompany.companyId) {
    const { companyId } = foundCompany;
    const { foreignId } = member;
    return await this.rawDb.withTransaction(async (tx) => {
      const foundMember = await tx.ethnics.members.findOne({ foreign_id: foreignId, company_id: companyId });
      if (!foundMember) {
        // some business logic...
        const newMember = await tx.ethnics.members.insert(member);
        // more business logic persisting to other tables...
        return newMember;
      }
    });
  }
}
The problem is, I don't know how to stub things only inside the arrow function without stubbing the entire arrow function. I just want to stub the calls on tx. I also don't want to use a database, but rather stub the rawDb property. Is that doable from a unit-testing perspective?
Yes, it is doable. There are two alternatives:
Stub MassiveJS methods directly.
Example stubbing the massive method findOne:
const massive = require('massive');
const sinon = require('sinon');

// Stub the massive Readable class method findOne.
// You need to find where the real findOne method lives and stub it there.
const stub = sinon.stub(massive.Readable, 'findOne');

// You can make it resolve:
stub.resolves();

// Or make it throw:
stub.throws(new Error('xxx'));
Use an in-memory pg for tests.
Just for testing purposes, you can use a module like test-pg-pool or pg-mem: start the test pg before the tests run and destroy it after they finish.
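For the specific ask in the question - stubbing only the calls made on tx without touching a real database - another option is to replace rawDb on the object under test and have withTransaction simply invoke its callback with a hand-rolled fake tx. A sketch (the fake shapes, the values, and the service variable are assumptions):
const sinon = require('sinon');

// Fake tx exposing only what createSomething touches.
const fakeTx = {
  ethnics: {
    members: {
      findOne: sinon.stub().resolves(null), // pretend no member exists yet
      insert: sinon.stub().resolvesArg(0),  // echo the inserted member back
    },
  },
};

const fakeRawDb = {
  ethnics: { tenants: { findOne: sinon.stub().resolves({ companyId: 42 }) } },
  // withTransaction just runs the callback with the fake tx - no real DB involved.
  withTransaction: sinon.stub().callsFake((cb) => cb(fakeTx)),
};

// Inside an async test: `service` is whatever object exposes createSomething.
service.rawDb = fakeRawDb;
const result = await service.createSomething({ code: 'ACME', member: { foreignId: 'f-1' } });
sinon.assert.calledWith(fakeTx.ethnics.members.insert, { foreignId: 'f-1' });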
I am wondering if it is possible to test variables for specific values within a module call. For example:
module.exports = async (request, context, postgres) => {
  let userId = request.user.id;
  let balanceQuery = 'SELECT balance FROM account WHERE user_id = $1';
  let userBalanceStr = await postgres.query(balanceQuery, [userId]).then(result => result.rows[0].balance);
  let totalCount = BigFunction(userBalanceStr);
  ......
}
How would I go about testing/accessing the value of balanceQuery or userBalanceStr within a Jest test?
I'm only able to access the userId portion of the query, and I am somewhat confused about how to access other variables within the module.
You can achieve that by mocking postgres.query and BigFunction and then asserting the arguments passed in each call. To handle the async code, write the test itself as an async test.
API reference: Mock Functions
When mocking postgres.query, you can have it return a fake value; that value is what gets assigned to userBalanceStr, so it can then be asserted through the arguments the mocked BigFunction receives.
To mock BigFunction, you can either mock its import in the module under test, or change the exported function so BigFunction is passed in as a dependency, like postgres, and pass the mock directly as an argument; a sketch of that second option follows.
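A rough Jest sketch of that second option (the handler path, the fake balance, and the extra BigFunction argument are assumptions):
const handler = require('./handler'); // hypothetical path to the module above

test('queries the balance and feeds it to BigFunction', async () => {
  const postgres = {
    query: jest.fn().mockResolvedValue({ rows: [{ balance: '250.00' }] }),
  };
  const BigFunction = jest.fn();
  const request = { user: { id: 'user-1' } };

  await handler(request, {}, postgres, BigFunction);

  // balanceQuery and userId are observable through the arguments to postgres.query...
  expect(postgres.query).toHaveBeenCalledWith(
    'SELECT balance FROM account WHERE user_id = $1',
    ['user-1']
  );
  // ...and userBalanceStr through what BigFunction received.
  expect(BigFunction).toHaveBeenCalledWith('250.00');
});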
I am using web3 version 1.0.0-beta.27, where all accesses to the blockchain are asynchronous; clearly this opens up the possibility of race conditions, i.e.:
var Web3 = require("web3");

// connect to the Ethereum blockchain
var ether_port = 'http://localhost:8545'
var web3 = new Web3(new Web3.providers.HttpProvider(ether_port));

// this is how we set the value; note there is the possibility of race conditions here
var accounts = []
web3.eth.getAccounts().then(function(accts){
  console.log("printing account: ", accts)
  accounts = accts
})

// observe race condition
console.log("assert race condition: ", accounts[0])
The last line above is contrived; it is there to demonstrate that I would like to use accounts after it has been populated. Eventually I would like to modify/read the blockchain from a front-end Express.js web app or even a mobile app, so in the interest of being rigorous: what are the common tools in Node.js to ensure race conditions never occur? Do these tools exist? If not, what are some common practices? I am new to Node.js as well.
One idea is to not attempt to directly store the data, because code trying to access the data has no idea when it's valid due to the uncertain timing of asynchronous results. So, instead, you store the promise, and any code that wants access to the data just uses .then()/.catch() on that promise. This will always work, regardless of the async timing. If the data is already there, the .then() handler will be called quickly. If the data is not yet there, then the caller will be in line to be notified when the data arrives.
let accountDataPromise = web3.eth.getAccounts().then(function(accts){
  console.log("printing account: ", accts)
  return accts;
});

// then, elsewhere in the code
accountDataPromise.then(accts => {
  // use accts here
}).catch(err => {
  // error getting accts data
});
FYI, assigning data from a .then() handler to a higher scoped variable that you want to generally use in other code outside the promise chain is nearly always a sign of troublesome code - don't do it. This is because other code outside the promise chain has no idea when that data will or will not be valid.
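Since the question mentions an Express.js front end, here is a sketch of how a route could consume that stored promise (the route path and port are made up):
const express = require('express');
const app = express();

app.get('/accounts', async (req, res) => {
  try {
    // Always go through the promise, never a shared variable.
    const accts = await accountDataPromise;
    res.json(accts);
  } catch (err) {
    res.status(500).json({ error: 'could not load accounts' });
  }
});

app.listen(3000);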
I'd like to use the MongoDB native JS driver with bluebird promises. How can I use Promise.promisifyAll() on this library?
The 2.0 branch documentation contains a better promisification guide: https://github.com/petkaantonov/bluebird/blob/master/API.md#promisification
It actually has a MongoDB example, which is much simpler:
var Promise = require("bluebird");
var MongoDB = require("mongodb");
Promise.promisifyAll(MongoDB);
When using Promise.promisifyAll(), it helps to identify a target prototype if your target object must be instantiated. In case of the MongoDB JS driver, the standard pattern is:
Get a Db object, using either the MongoClient static method or the Db constructor.
Call Db#collection() to get a Collection object.
So, borrowing from https://stackoverflow.com/a/21733446/741970, you can:
var Promise = require('bluebird');
var mongodb = require('mongodb');
var MongoClient = mongodb.MongoClient;
var Collection = mongodb.Collection;
Promise.promisifyAll(Collection.prototype);
Promise.promisifyAll(MongoClient);
Now you can:
var client = MongoClient.connectAsync('mongodb://localhost:27017/test')
  .then(function(db) {
    return db.collection("myCollection").findOneAsync({ id: 'someId' })
  })
  .then(function(item) {
    // Use `item`
  })
  .catch(function(err) {
    // An error occurred
  });
This gets you pretty far, except it'll also help to make sure the Cursor objects returned by Collection#find() are also promisified. In the MongoDB JS driver, the cursor returned by Collection#find() is not built from a prototype. So, you can wrap the method and promisify the cursor each time. This isn't necessary if you don't use cursors, or don't want to incur the overhead. Here's one approach:
Collection.prototype._find = Collection.prototype.find;

Collection.prototype.find = function() {
  var cursor = this._find.apply(this, arguments);
  cursor.toArrayAsync = Promise.promisify(cursor.toArray, cursor);
  cursor.countAsync = Promise.promisify(cursor.count, cursor);
  return cursor;
};
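With that wrapper in place, a cursor-based query could be used roughly like this (a sketch; the collection name and query are made up):
db.collection('myCollection').find({ active: true })
  .toArrayAsync()
  .then(function (docs) {
    // docs is a plain array of matching documents
  });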
I know this has been answered several times, but I wanted to add a little more information on this topic. Per Bluebird's own documentation, you should use 'using' to clean up connections and prevent memory leaks.
Resource Management in Bluebird
I looked all over the place for how to do this correctly and information was scarce so I thought I'd share what I found after much trial and error. The data I used below (restaurants) came from the MongoDB sample data. You can get that here: MongoDB Import Data
// Using dotenv for environment / connection information
require('dotenv').load();

var Promise = require('bluebird'),
  mongodb = Promise.promisifyAll(require('mongodb')),
  using = Promise.using;

function getConnectionAsync(){
  // process.env.MongoDbUrl stored in my .env file using the require above
  return mongodb.MongoClient.connectAsync(process.env.MongoDbUrl)
    // .disposer is what handles cleaning up the connection
    .disposer(function(connection){
      connection.close();
    });
}

// The two methods below retrieve the same data and output the same data,
// but the first one uses the promisified Async methods wherever possible
// while the 2nd one uses the driver's regular (non-promisified) methods.
// NOTE: using limitAsync seems to go away to never-never land and never come back!

// Everything is done asynchronously here with promises
using(
  getConnectionAsync(),
  function(connection) {
    // Because we used promisifyAll(), most (if not all) of the
    // methods in what was promisified now have an Async sibling
    // collection : collectionAsync
    // find : findAsync
    // etc.
    return connection.collectionAsync('restaurants')
      .then(function(collection){
        return collection.findAsync()
      })
      .then(function(data){
        return data.limit(10).toArrayAsync();
      });
  }
  // Before this ".then" is called, the using statement will call the
  // .disposer() that was set up in the getConnectionAsync method
).then(
  function(data){
    console.log("end data", data);
  }
);

// Here, only the connection uses a promisified Async method - the rest
// use the driver's regular methods
using(
  getConnectionAsync(),
  function(connection) {
    // Because none of the Async functions are used here, these are the
    // plain driver calls rather than the promisified versions above
    return connection.collection('restaurants').find().limit(10).toArray();
  }
).then(
  function(data){
    console.log("end data", data);
  }
);
I hope this helps someone else out who wanted to do things by the Bluebird book.
Version 1.4.9 of mongodb should now be easily promisifiable as such:
Promise.promisifyAll(mongo.Cursor.prototype);
See https://github.com/mongodb/node-mongodb-native/pull/1201 for more details.
We have been using the following driver in production for a while now. It's essentially a promise wrapper over the native Node.js driver. It also adds some additional helper functions.
poseidon-mongo - https://github.com/playlyfe/poseidon-mongo