I'm currently building a Node application that uses Stripe for payments. I have webhooks set up and working: I want to create subscribers in my application, and I rely on the data received from the Stripe webhook events to store documents in different collections in my MongoDB.
The issue I'm having is that Stripe does not always send events in the same order, and to create relationships between documents/collections I need the event handlers to be triggered in the following order:
customer.created
customer.card.created (relates to customer)
invoice.created (relates to customer)
As it stands, the handler for event 2 can be executed before 1, and 3 before 2, etc.
What would be the best way to ensure my handlers are executed in the correct order every time? I'm thinking promises of some sort. If that's the case, what's a good promise module for Node?
You can reject the webhook if a prerequisite event is missing; Stripe will retry it later. If you want it handled more quickly, you can make a small loop that waits for at most 10 seconds and checks every second whether the missing event has arrived.
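A minimal sketch of the reject-and-retry path, assuming an Express route with JSON body parsing; findCustomerByStripeId and saveInvoice are hypothetical helpers standing in for your own Mongo reads/writes:

// Hypothetical Express webhook endpoint. Returning a non-2xx status makes
// Stripe retry the event later, so out-of-order events eventually succeed.
app.post('/stripe/webhooks', function (req, res) {
  var event = req.body;

  if (event.type === 'invoice.created') {
    findCustomerByStripeId(event.data.object.customer, function (err, customer) {
      if (err || !customer) {
        // customer.created has not been processed yet: reject so Stripe retries.
        return res.sendStatus(500);
      }
      saveInvoice(customer, event.data.object, function (err) {
        return res.sendStatus(err ? 500 : 200);
      });
    });
  } else {
    res.sendStatus(200);
  }
});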
I have the same problem when updating a billing plan: the event invoice.updated arrives before invoice.created ...
Why don't you just use the id of the Stripe object as the foreign key on your document? Don't store the actual JSON object, just the Stripe id. When you need to query for the documents, build your result set using the Stripe id as the primary key.
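For example, a sketch with Mongoose (schema and field names are made up), where only the Stripe ids are stored and relationships are resolved by querying on them:

var mongoose = require('mongoose');

// Store only the Stripe ids, not the full event payloads.
var CustomerSchema = new mongoose.Schema({
  stripeCustomerId: { type: String, index: true },
  email: String
});

var InvoiceSchema = new mongoose.Schema({
  stripeInvoiceId: { type: String, index: true },
  stripeCustomerId: String  // acts as the foreign key back to the customer
});

// Resolving the relationship later:
// Customer.findOne({ stripeCustomerId: invoice.stripeCustomerId }, callback)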
I have a chat app using Firebase as a realtime database and React Native. I'm trying to figure out the most efficient way to set up the listener for chat messages from Firebase, in terms of minimizing read operations and data transfer. Here is my data structure:
- messages
  - chatId
    - messageId
      - sentBy
      - timestamp
      - text
As I see it I have two options: either ref.on("child_added") or ref.on("value").
If I use ref.on("child_added"), the advantage is that when a new message is sent then only the newest message is retrieved. The problem though is that when the conversation is loaded the read operation is called for each message in the chat. If a conversation is hundreds of messages long, then that's hundreds of read operations.
The other option is to use ref.on("value"). The problem here is that on every new message added, the entire conversation is resent instead of just the most recent message. The advantage is that when the conversation is loaded, only one read operation is called to transfer the entire conversation to the screen.
I want some combination of the two: when the conversation is loaded, one read operation brings the entire contents of the conversation, AND when a new child node (a new message) is added, only that message is transmitted to the listener. How can I achieve this?
firebaser here
There is no difference in wire traffic between a value listener and child_* listeners on the same location/query. If you check the Network tab of your browser, you can see exactly what is retrieved, and you'll see that it's exactly the same for both listener types.
The difference between value and child_* events is purely made client-side to make it easier for you to update the UI. In fact, even when you attach both value and child_* listeners to the same query/location, Firebase will retrieve the data only once.
The common way to do what you want is to attach both child_* and value listeners to the query/location. Since the value listener is guaranteed to be fired last, you can use that fact to detect when the initial load is done.
Something like:
var chatRef = firebase.database().ref("messages/chatId");
var initialLoadDone = false;

// Fires once for every existing message and again for each message added later.
chatRef.on("child_added", (snapshot) => {
  if (initialLoadDone) {
    // handle a single new message
    ...
  }
});

// The value event fires after all initial child_added events, so it marks
// the end of the initial load.
chatRef.once("value", (snapshot) => {
  snapshot.forEach((messageSnapshot) => {
    // handle the initial conversation in one pass
    ...
  });
  initialLoadDone = true;
});
Suggestion: Use Firestore. It maintains a cache of your data and efficiently handles such scenarios.
You can use ref.once('value') to get the current nodes just once, and then ref.on('child_added') for subsequent additions. More performance notes.
Edit: I believe the Firebase Database handles this efficiently with just ref.on('value'). Checking the network tab after adding a new node to my database, I noticed the amount of data transferred was very low. This might mean that Firebase caches your previous data by default. I'd recommend you look at your own network tab and decide accordingly, or wait for someone from their team to give directions.
I am implementing the Stripe API in my Node application. The problem I am having is that if I click the submit button very quickly multiple times, the API is called multiple times and the Stripe customer suddenly has multiple subscriptions and charges.
The scenario is that a user has a Stripe customer account without a subscription, since they previously unsubscribed. Now they would like to resubscribe with a new plan, since the old one was deleted.
The logic of my code is as follows:
1. Submit form
2. I retrieve the user from my mongo db
3. I retrieve the customer from stripe using a stored customer id (api call)
4. I create a customer subscription (api call)
5. I update the stripe customer object with their credit card (api call)
6. Respond back to user
All of the above is done using async.waterfall; each subsequent asynchronous call is in a separate function.
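Roughly, the waterfall looks something like this (simplified; the model, field and plan names are illustrative):

var async = require('async');
var stripe = require('stripe')('sk_test_...');

// Inside the Express route handler for the form submission.
async.waterfall([
  function getUser(cb) {
    User.findById(req.user._id, cb);                            // 2. user from Mongo
  },
  function getCustomer(user, cb) {
    stripe.customers.retrieve(user.stripeCustomerId, function (err, customer) {
      cb(err, customer);                                        // 3. Stripe customer
    });
  },
  // the check described below would go here, between steps 3 and 4
  function createSubscription(customer, cb) {
    stripe.subscriptions.create({ customer: customer.id, plan: req.body.plan },
      function (err, sub) {
        cb(err, customer, sub);                                 // 4. create subscription
      });
  },
  function updateCard(customer, sub, cb) {
    stripe.customers.update(customer.id, { source: req.body.token }, function (err) {
      cb(err, sub);                                             // 5. attach the card
    });
  }
], function (err, sub) {
  if (err) return res.status(500).send(err.message);
  res.json(sub);                                                // 6. respond to the user
});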
What I want:
Between steps 3 and 4 I want to retrieve the customer, check whether he or she is already subscribed, and prevent steps 4 and 5 from occurring if the user already has an active subscription.
The Issue:
I am testing the scenario where the user clicks the submit button multiple times and ends up being charged for several subscriptions. I believe it happens because Node handles the near-simultaneous requests concurrently: the API call I make to check whether a subscription already exists takes too long, so the other requests proceed without waiting for it.
How do I go about solving this?
Note: of course I have the front end preventing the user from submitting the form multiple times, but that is not ideal and I don't want it to be my only line of defense.
Thanks
use a powerful rate limiter library
https://github.com/jhurliman/node-rate-limiter
var RateLimiter = require('limiter').RateLimiter;

// Allow 1 request per minute
var limiterPayments = new RateLimiter(1, 'minute');

exports.payWithStripe = function (req, res) {
  limiterPayments.removeTokens(1, function (err, remainingRequests) {
    if (err) {
      return res.sendStatus(429); // Too Many Requests
    } else {
      // continue with stripe payment
    }
  });
};
This will throttle per session (per user browser)
One way is to keep a cache (in Redis or similar) of recent transactions, indexed by a hash derived from key transaction values (e.g. customer id, date, time, amount).
Before submitting the subscription request to stripe, compare the current transaction to the cache. If you get a hit, you can either show an error, automatically ignore the transaction, or let the user choose whether to proceed.
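A rough sketch of that idea, assuming a callback-style Redis client and Node's built-in crypto module (the key prefix and 60-second TTL are arbitrary choices):

var crypto = require('crypto');
var redis = require('redis').createClient();

// Fingerprint the transaction from its key values.
function txFingerprint(customerId, plan, amount) {
  return crypto.createHash('sha256')
    .update([customerId, plan, amount].join('|'))
    .digest('hex');
}

// SET ... NX only succeeds if the key does not exist yet, so an identical
// transaction seen again within the TTL is flagged as a duplicate.
function checkDuplicate(customerId, plan, amount, cb) {
  var key = 'tx:' + txFingerprint(customerId, plan, amount);
  redis.set(key, '1', 'NX', 'EX', 60, function (err, reply) {
    if (err) return cb(err);
    cb(null, reply === null); // null reply means the key already existed
  });
}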
Disable the button in your frontend using javascript as soon as it's clicked the first time.
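For example, in plain browser JavaScript (element ids are made up):

var form = document.getElementById('payment-form');
form.addEventListener('submit', function () {
  // Prevent double submits by disabling the button after the first click.
  document.getElementById('submit-btn').disabled = true;
});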
Stripe has built-in functionality to help you prevent this: https://stripe.com/docs/api/idempotent_requests
You need to generate a unique string and send it along with the stripe request.
Putting it in a hidden form field in the HTML, much like a CSRF token, then reading it out and including it in the Stripe call will prevent double clicks / retries from creating duplicates.
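A sketch of that approach, assuming the key is generated when the form is rendered and posted back as req.body.idempotencyKey; the subscription parameters are placeholders:

var crypto = require('crypto');

// When rendering the form, generate a one-off key and embed it in a hidden input.
var idempotencyKey = crypto.randomBytes(16).toString('hex');

// When handling the submission, pass the same key as a request option; Stripe
// will not create a second subscription for repeated calls with that key.
stripe.subscriptions.create(
  { customer: customerId, plan: planId },            // placeholders from your app
  { idempotencyKey: req.body.idempotencyKey }
).then(function (subscription) {
  // respond to the user
});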
I have a custom Firebase auth process (Auth0 => Firebase), so I run all my login logic using TypeScript with Svelte. Once I have a successful JWT token, either from sessionStorage or from a fresh login, I boot up my Elm app, sending it the JWT and profile info via a flag. It's a SPA with routing and pages, all componentized and working fine.
My only real problem now is Firebase itself. Sure, ElmFire exists, but how do I just give it an active auth token, etc.? And loading both the Firebase JS SDK and ElmFire for Elm just seems like way too many KBs.
Is there a nice and efficient way to let Elm port out a "hey, listen to this ref", and get back a "hey Elm, I have some new data for you for this ref"? Ports to tell JS to listen, and subscriptions to tell Elm about new data, without having a port for every listener and a subscription for every data callback.
Ideally, I'd like my update to send off a Cmd that carries a callback Msg and a ref. That way I can store them in a List of some sort, and when I get a new data payload from JS I can loop through the List to find the item that matches the ref and dispatch its Msg with the string value, so decoding happens in the page's update.
Someone feel free to abstract this question into something more general; I feel it's something more people might have.
The problem is that port/sub must be typed in Elm.
So, a single port / sub with one argument could listen to multiple refs as long as they all have the same type.
If not, you could define the port / sub with multiple arguments, one for each type of data you want to exchange, and pass any particular data in the proper argument. The other arguments would be null.
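On the JavaScript side, that single typed port/sub pair could look something like this (a sketch with Elm 0.19-style init; the port names listenToRef and refData are made up, and the payload is passed as a JSON string so one subscription type covers every ref):

var app = Elm.Main.init({ node: document.getElementById('app'), flags: flags });

// Elm asks JS (via a port) to start listening to a ref.
app.ports.listenToRef.subscribe(function (refPath) {
  firebase.database().ref(refPath).on('value', function (snapshot) {
    // Every update goes back through one subscription, tagged with the ref,
    // so Elm's update can route it to the right page and decode it there.
    app.ports.refData.send({ ref: refPath, payload: JSON.stringify(snapshot.val()) });
  });
});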
If I have a subscription inside a Tracker.autorun() and the publish takes a variable selector, meaning the returned documents may vary on every rerun, does minimongo cache all the docs returned from all the publications, or does it clear its documents each time and keep only the docs returned by the latest run of the publication?
Meteor is clever enough to keep track of the current document set that each client has for each publisher. When the publisher reruns, it knows to only send the difference between the sets. Let's use the following sequence as an example:
subscribe for posts: a,b,c
rerun the subscription for posts b,c,d
server sends a removed message for a and an added message for d.
Note this will not happen if you stopped the subscription prior to rerunning it.
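For example (publication, collection and session variable names are illustrative):

// Client: the subscription reruns whenever the reactive selector changes.
Tracker.autorun(function () {
  Meteor.subscribe('posts', Session.get('postFilter'));
});

// Server: each rerun returns a different cursor; Meteor diffs the old and new
// document sets and only sends the added/removed messages to the client.
Meteor.publish('posts', function (filter) {
  return Posts.find(filter || {});
});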
I am trying to implement a stripe checkout process in one of my express.js routes. To do this, I have:
Official Node.js Stripe module
Official client-side Stripe module
A json logger I use to log things like javascript errors, incoming requests and responses from external services like stripe, mongodb, etc…
An Order model defined using mongoose - a MongoDB ODM
My steps are as follows:
Client:
Submit order details which include a stripe payment token
Server:
Create an unpaid order and save to database (order.status is created)
Use stripe client to charge user's credit/debit card
Update order and save to database (order.status is accepted or failed depending on response from Stripe)
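In code, the server side looks roughly like this (simplified; the Order model and field names are illustrative):

// 1. Create an unpaid order.
var order = new Order({ status: 'created', amount: amount, userId: user._id });

order.save()
  .then(function (saved) {
    // 2. Charge the card with the token sent by the client.
    return stripe.charges.create({
      amount: saved.amount,
      currency: 'usd',
      source: req.body.stripeToken
    }).then(
      function (charge) {
        // 3. Mark the order as paid.
        saved.status = 'accepted';
        saved.stripeChargeId = charge.id;
        return saved.save();
      },
      function () {
        saved.status = 'failed';
        return saved.save();
      }
    );
  });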
Question: If payment is successful after step 2 but an error occurs updating the order in step 3 (due to database server error, outage or similar), what are some appropriate ways to handle this failure scenario and potentially recover from it?
With payment systems, you always need a consolidation process (hourly, daily, monthly), based on sane accounting principles, that checks that every money flow is matched.
In your case, I suggest that every external async call logs the parameters sent and the response received. If you do not have a response within a certain time, you know that something has gone wrong on the external system (Stripe, in your case) or on the way back from it (you mention a database failure on your side).
Basically, for each async "transaction" that you spawn, you know when you start it and you have to decide on a reasonable amount of time for it to complete. Thus you have an expected_end_ts in the database.
If you have not received an answer by expected_end_ts, you know that something is wrong. Then you can ask Stripe (or whichever PSP you use) for the status. Hopefully the API will give you a sane answer as to whether the payment went through or not.
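A rough sketch of that reconciliation check (expected_end_ts comes from the description above; everything else, including the assumption that the order id was stored in the charge metadata, is illustrative):

// Periodic job: find orders that never reached a final status before their
// deadline, then ask Stripe what actually happened to the payment.
function reconcilePendingOrders() {
  return Order.find({ status: 'created', expected_end_ts: { $lt: new Date() } })
    .then(function (staleOrders) {
      return Promise.all(staleOrders.map(function (order) {
        return stripe.charges.list({ customer: order.stripeCustomerId, limit: 100 })
          .then(function (charges) {
            var match = charges.data.find(function (c) {
              return c.metadata.orderId === String(order._id); // assumes metadata was set
            });
            order.status = match && match.paid ? 'accepted' : 'failed';
            return order.save();
          });
      }));
    });
}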
Also note that you should add a step between steps 1 and 2: re-read the database. You want to make sure that every payment request you send is really in the database, stored exactly as you are going to send it.