I would like to create a firebase cloud function that updates the current second on the real time database, every second.
As far as I understand I am not able to use setInterval because the function will stop after x seconds anyway.
Does anyone have an idea how I can achieve that using cloud functions?
There is no built-in interval based trigger in Cloud Functions for Firebase.
You can build cron-style triggering using other Google services, such as App Engine Cron (as shown in this blog post) or an external web service (as shown in this video).
But neither of those seems particularly well suited for triggering every second. For that kind of constant load, I'd consider spinning up a Node.js process that simply uses setTimeout() or setInterval().
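A minimal sketch of such a standalone process (the write function is injected so the sketch runs without Firebase credentials; in production it might wrap `firebase-admin`, e.g. `(sec) => admin.database().ref('current/second').set(sec)`, where `current/second` is a hypothetical path):

```javascript
// Minimal sketch of a standalone ticker process. The write function is
// injected: in production it would wrap firebase-admin, e.g.
//   (sec) => admin.database().ref('current/second').set(sec)
// ('current/second' is a hypothetical database path for illustration).
function startSecondTicker(write, intervalMs = 1000) {
  const timer = setInterval(() => {
    write(new Date().getSeconds());
  }, intervalMs);
  return () => clearInterval(timer); // returns a stop function
}
```

Run it on any always-on host (a small VM, App Engine flexible, a container); the returned function stops the ticker cleanly on shutdown.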
I know that cloud functions charge money based on execution time and network costs.
I am writing a cloud function that calls 2 external APIs to authenticate a user. Both external APIs just return a single boolean and therefore don't send much info.
Which technique is better overall?
Make both external API calls using fetch() at the same time knowing that 1 may be useless half the time.
Make the external API call one after the other, and not make the second one if not needed.
I'm thinking #1 will shorten the "wall time" of my cloud function and therefore may be cheaper, but it makes more external calls, which may be expensive.
Doing #2 keeps the cloud function in memory for longer, and therefore accrues more "wall time".
Method 1 results in more complex code, as you will need to wait for both fetches to complete and handle their failures, unless you enjoy debugging container crashes.
Method 2 is simpler to implement and debug but could result in longer wall clock time.
You will need to decide which method is better when you factor in development time, debugging time, time spent analyzing errors in log files, and execution time.
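As a sketch of the two methods (the `checkApiA`/`checkApiB` names are stand-ins for the real external calls):

```javascript
// Method 1: fire both checks at once; wall time is roughly the slower
// of the two, but the second call is made even when the first fails.
async function authParallel(checkApiA, checkApiB) {
  const [a, b] = await Promise.all([checkApiA(), checkApiB()]);
  return a && b;
}

// Method 2: only pay for the second call when the first one passes;
// worst-case wall time is the sum of both calls.
async function authSequential(checkApiA, checkApiB) {
  if (!(await checkApiA())) return false;
  return checkApiB();
}
```

With calls of similar latency, method 1's wall time is roughly max(A, B) while method 2's worst case is A + B; the actual cost trade-off depends on how often the first check fails and on what the second API charges per call.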
I am using the firebase Stripe API, and my app doesn't have a lot of traffic yet, nor will it for a little while. Firebase decided that after 2-3 minutes without invocations, the function goes into cold-start mode. This is unfortunate because it means the wait time from when a new user hits register to when the checkout page loads is about 8 seconds. How horrendous is that!
Anyways, does anyone know a way around this, maybe setting a script to run in the background at all times, or something I can do from inside firebase?
One way to help is to add a "warm-up" command to the Cloud Function (i.e. a "no-op" invocation/call), and invoke it when your user starts the checkout process (before collecting any information). If the user doesn't complete checkout, no harm, no foul; if they do, the cloud function has already been started.
Update 2020-01-01:
Firebase now allows you to designate, for each function, a minimum (and/or maximum) number of instances - i.e. keeping functions in memory. A single always-on instance costs about $0.33 to $0.50 per month - a fairly low (but not zero) cost for keeping cold starts down...
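With the Firebase SDK for Cloud Functions (v1 API), the minimum-instances setting can also be declared in code rather than the console; a deployment sketch, where `createCheckoutSession` is a hypothetical function name:

```javascript
const functions = require('firebase-functions');

// Keep at least one instance warm to avoid cold starts on checkout.
// 'createCheckoutSession' is a hypothetical function name for illustration.
exports.createCheckoutSession = functions
  .runWith({ minInstances: 1 })
  .https.onRequest((req, res) => {
    res.send('ok');
  });
```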
I am trying to create an app that simulates opening a tab at a bar. I am running into one issue that I can't seem to figure out - it goes as follows:
When someone opens a bar tab, dynamically create a scheduled task that executes code to close the tab after 24 hours.
If the tab gets closed before the 24 hours, cancel the scheduled task.
If the tab doesn't get closed after 24 hours, execute the code described in step 1 to initiate a payment on the card used to open the tab.
I was initially looking into Firebase Functions, and was thinking about using a setTimeout() inside a callable function, but after doing some research I found that a Firebase Function cannot run for longer than 9 minutes.
NOTE: I would like this to be dynamic. Meaning, having it account for a variable amount of users. There could be 100 or 1000 users on the platform, each of them needs the ability to have a unique scheduled task for them (sometimes multiple per user).
There are multiple approaches to circumvent the 9-minute limit (which is common in serverless environments), but here's something that can help you. I suggest separating the task into three:
A cloud function that closes the tab when called.
A schedule function that calls it (https://firebase.google.com/docs/functions/schedule-functions)
A way to start and stop the schedule function.
I am not sure how Firebase functions work, but I have worked with Azure Functions before, and those can be controlled from the command line (CLI) or with an SDK for your language of choice. To cancel using the command line, try something like this:
firebase functions:delete scheduledFunction
from How to cancel a scheduled firebase function?.
Now what's left is to figure out how to start the function, and whether it's possible to pass in a parameter when scheduling it.
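If dynamically creating and deleting one scheduled function per tab proves awkward, another option (my suggestion, not from the original answer) is a single periodic scheduled function that scans all open tabs; the selection logic could be as small as this, assuming a hypothetical tab shape of `{ id, openedAt, closed }` with `openedAt` in milliseconds:

```javascript
// Sketch: given open tabs with an openedAt timestamp (ms since epoch),
// pick the ones older than 24 hours so one scheduled function can close
// them all. The tab shape is an assumption for illustration.
const DAY_MS = 24 * 60 * 60 * 1000;

function tabsToClose(tabs, now = Date.now()) {
  return tabs.filter(t => !t.closed && now - t.openedAt >= DAY_MS);
}
```

A scheduled function running every few minutes would call `tabsToClose` on the open tabs and charge each result, which scales to any number of users without per-user scheduling.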
Good luck!
Our API needs to send data to Zapier if some specific data was modified in our DB.
For example, we have a company table and if the name or the address field was modified, we trigger the Zapier hook.
Sometimes our API receives multiple change requests within a few minutes, but we don't want to trigger the Zapier hook multiple times (since it is quite expensive), so on each modify request we call setTimeout() (overwriting the existing timeout) with a 5000 ms delay.
It works fine, and there are no duplicate Zapier hook calls even if we get a lot of modify requests from clients within this 5000 ms period.
Now - since our traffic is growing - we'd like to set up multiple node.js instances behind some load balancer.
But in this case the different Node.js instances cannot share - and overwrite - the same setTimeout instance, which would cause a lot of useless Zapier calls.
Could you guys help us, how to solve this problem - while remaining scalable?
If you want to keep a state between separate instances you should consider, from an infrastructure point of view, some locking mechanism such as Redis.
Whenever you want to run the Zapier call, if no lock is active, you set one on Redis; all other calls won't trigger while it is locked, and when the setTimeout callback runs, you release the lock.
Beware that Redis might become a single point of failure (SPOF). I don't know where you are hosting your services, but that might be an important point to consider.
Edit:
The lock on Redis might hold a reference to the last piece of info you want to send. On the first request you store the data on Redis, wait 5 seconds, and then send it. If any modifications are made in that time frame, they are stored on Redis, so you'll only update at 5-second intervals. You'll need to add some extra logic here, though. Example:
// NOTE: the Redis-backed helpers (isLocked, lock, updateLockData,
// getLockData) are stubbed with an in-memory variable here so the logic
// is self-contained; in production they would read/write Redis so that
// all Node.js instances share the same state.
var lockData = null;

function isLocked() { return lockData !== null; }
function lock(data) { lockData = data; }
function updateLockData(data) { lockData = data; }
function getLockData() {
  var data = lockData;
  lockData = null; // release the lock
  return data;
}

function zapierUpdate(data) {
  if (isLocked()) {
    // Locked! Just remember the latest data; it will be sent by the
    // pending setTimeout callback.
    updateLockData(data);
  } else {
    // First call in this window: take the lock and store the data...
    lock(data);
    // ...and update in 5 seconds.
    setTimeout(function () {
      // getLockData fetches the data (on Redis) and releases the lock.
      var newData = getLockData();
      // Send the latest data, including any updates made meanwhile.
      callZapierNow(newData);
    }, 5000);
  }
}
My app's framework is built around syncing Backbone models: sending their data via websockets and updating the corresponding models on other clients. My question is how I should batch these updates for times when an action triggers 5 changes in a row.
The syncing method is set up to update on any change but if I set 5 items at the same time I don't want it to fire 5 times in a row.
I was thinking I could do a setTimeout on any sync that gets cleared if something else tries to sync within a second of it. Does this seem like the best route or is there a better way to do this?
Thanks!
I haven't done this with Backbone specifically, but I've done this kind of batching of commands in other distributed (client/server) apps in the past.
The gist of it is that you should start with a timeout, and add a batch size for further optimization if you see the need.
Say you have a batch size of 10. What happens when you get 9 items stuffed into the batch and then the user just sits there and doesn't do anything else? The server would never get notified of the things the user wanted to do.
A timeout generally works well for getting small batches across. But if you have an action that generates a large number of related commands, you may want to batch all of them and send them across as soon as they are ready instead of waiting for a timer. The timer may fire in the middle of creating the commands and split things apart in a manner that causes problems, etc.
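The hybrid described above (a timeout plus a batch-size cap) can be sketched as:

```javascript
// Sketch of the hybrid: flush the batch when it reaches maxSize OR when
// maxWaitMs elapses, whichever comes first.
function createBatcher(flush, maxSize, maxWaitMs) {
  let items = [];
  let timer = null;

  function doFlush() {
    clearTimeout(timer);
    timer = null;
    if (items.length > 0) {
      flush(items);
      items = [];
    }
  }

  return function add(item) {
    items.push(item);
    if (items.length >= maxSize) {
      doFlush(); // batch is full: send immediately
    } else if (timer === null) {
      timer = setTimeout(doFlush, maxWaitMs); // otherwise wait for the timer
    }
  };
}
```

Here `flush` would be whatever sends the batch over the websocket; the timer guarantees a partially filled batch still goes out.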
Hope that helps.
Underscore.js, the utility library that Backbone.js uses, has several functions for throttling callbacks:
throttle makes a version of a function that will execute at most once every X milliseconds.
debounce makes a version of a function that will only execute if X milliseconds have elapsed since the last time it was called.
after makes a version of a function that will execute only after it has been called X times.
So if you know there are 5 items that will be changed, you could register a callback like this:
// only call callback after 5 change events
collection.on("change", _.after(5, callback));
But more likely you don't, and you'll want to go with a timeout approach:
// only call callback 30 milliseconds after the last change event
collection.on("change", _.debounce(callback, 30));
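Note the argument order: Underscore takes the function first, then the wait. For reference, a debounce is only a few lines of plain JavaScript (a sketch of what `_.debounce` does under the hood, minus its extra options):

```javascript
// Minimal debounce sketch: the wrapped function only runs after `wait`
// milliseconds have passed with no further calls.
function debounce(fn, wait) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), wait);
  };
}
```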