Subscribing sequentially to a variable number of Observables in Angular 2+ - javascript

Sorry if this was answered elsewhere, I tried to search but I'm not even sure what I'm looking for.
Let's say I have these objects to work with:
userRequest: {
  id: number,
  subject: string,
  ...
  orderIds: number[]
  ...
}
order: {
  id: number,
  ...
  clientId: number,
  productIds: number[]
}
client: {
  id: number,
  name: string,
  ...
}
product: {
  id: number,
  name: string,
  price: number
}
Now, at some point the user will fill a form using that composite object and send it for analysis. But before it is sent, it first has to be validated. I cannot validate in the form itself, because the user is simply entering data received on paper. If the data is "invalid", a request for more information will be sent.
So, I need to validate the request, but also the order, the products and the client. I am requested to show a "Validating Request" screen and after each element was checked, a "Valid" or "Invalid" screen. Simple enough.
But now, I'm sending http requests and get Observables to deal with. I'm trying to learn more about them and all the available operators and how to mix them, but at the moment, I'm completely lost.
So, I first get an Observable<userRequest> from the server. Then, once I have the userRequest, I need to get all the orders from their IDs, and when I get an order, I have to get its client & products.
All this is done asynchronously, but I cannot get the client or the products until I receive the order, and I need the userRequest to provide the orders. In addition, when I get an order, I need to get both the client AND the products at the "same time", since they both depend on the same order. For the grand finale, for every element I get (request, order, client, product) I need to validate it and wait for every element to be able to say "the request is valid" or not.
So, to summarize:
1. I need to get an Observable<userRequest> and validate it
2. Then I have to get an Observable<order[]> and validate each order
3. For each order, I have to 3.1) get an Observable<Client> and validate it, PLUS 3.2) get an Observable<Product[]> and validate each product
4. Wait for every observable to complete and check whether the whole thing is valid or not
Steps 1 and 2 need to be executed sequentially, but when step 2 completes, I need to execute steps 3.1 and 3.2 for each of its results. And wait.
I'm sure it's far from clear, I just hope it's clear enough so you guys get what I want to achieve. If you have any hints for me, please do share!!! ; )
Edit
I do know, somehow, what needs to be done. But where I lose my cool is when I need to chain the Observables sequentially (as each one depends on the one before): at various points I need to call a validation method, and when it comes to the Client and the Products, both need the Order for its Id. I did try many, many ways, but I just don't grasp the concept completely.
bygrace - No, I don't want the validation to block. It should validate everything, as the result will be a request for all the missing or invalid parts, and it should be shown at the end. That's why I need a way to know when everything is done, so I can check if errors were found.
The request, orders, client and products each come from their respective services. Each service makes an http request and returns an Observable. So I need to chain the calls (and when it comes to the Order, I need to get TWO Observables for the same Order Id).
QuietOran - Here's something I tried. It's horrible I know, but I'm so lost right now...
onValidateRequest(requestId: number) {
  this.requestService.getUserRequest$(this.requestId)
    .do(request => {
      this.validateRequest(request);
    })
    .concatMap(request => this.orderService.getOrdersForRequest$(request.id))
    .do(orders => {
      this.validateOrders(orders);
    })
    .concatMap(orders => {
      // Now, this is where I'm completely lost.
      // I manage to get the request and the orders, but in this block I need
      // to get the client AND the products, and validate each one as I receive it.
      // Then return something.
    })
    .do(() => {
      // When I validate an element, if there's an error, I simply add it to an array.
      // So when ALL the Observables above have completed, this function simply
      // checks if there's something in it.
      this.checkForErrors();
    })
    .subscribe();
}

I'm going to give you something rough that you can refine with feedback, because I'm not clear on the final shape of the data you want back. Hopefully this points you in the right direction.
Basically, if you want the data from one observable to feed another, you can use switchMap or one of its cousins. If you need the value fed in as well as the result, then just lump them together with combineLatest or something similar.
console.clear();

function getUserRequest(requestId) {
  return Rx.Observable.of({ id: 1, subject: 'a', orderIds: [10, 20] })
    .delay(500).take(1);
}

function getOrdersForRequest(requestId) {
  return Rx.Observable.of([
    { id: 10, clientId: 100, productIds: [1000] },
    { id: 20, clientId: 200, productIds: [1001, 1002] }
  ]).delay(200).take(1);
}

function getClientForOrder(orderId) {
  let client;
  switch (orderId) {
    case 10:
      client = { id: 100, name: 'Bob' };
      break;
    case 20:
      client = { id: 200, name: 'Alice' };
      break;
  }
  return Rx.Observable.of(client).delay(200).take(1);
}

function getProductsForOrder(orderId) {
  let products;
  switch (orderId) {
    case 10:
      products = [{ id: 1000, name: 'p1', price: 1 }];
      break;
    case 20:
      products = [
        { id: 1001, name: 'p1', price: 2 },
        { id: 1002, name: 'p1', price: 3 }
      ];
      break;
  }
  return Rx.Observable.of(products).delay(200).take(1);
}

Rx.Observable.of(1)
  .switchMap(id => Rx.Observable.combineLatest(
    getUserRequest(id),
    getOrdersForRequest(id)
      .switchMap(orders => Rx.Observable.combineLatest(
        Rx.Observable.of(orders),
        Rx.Observable.combineLatest(...orders.map(o => getClientForOrder(o.id))),
        Rx.Observable.combineLatest(...orders.map(o => getProductsForOrder(o.id)))
      )),
    (userRequest, [orders, clients, products]) =>
      ({ userRequest, orders, clients, products })
  ))
  .subscribe(x => { console.dir(x); });
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/5.5.6/Rx.min.js"></script>
Right now I flattened the results by category. You may want them nested or something like that. This is just a rough pass so provide feedback as needed.
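If you do want them nested, here's a rough sketch in the same RxJS 5 style, reusing the stub services above — a sketch, not a definitive implementation — that attaches each order's client and products directly to the order object:

// Hedged sketch: nest client/products inside each order instead of
// flattening them into separate arrays. Note that combineLatest over an
// empty orders array would never emit, so real code should guard that case.
Rx.Observable.of(1)
  .switchMap(id => getUserRequest(id))
  .switchMap(
    userRequest => getOrdersForRequest(userRequest.id)
      .switchMap(orders => Rx.Observable.combineLatest(
        ...orders.map(order => Rx.Observable.combineLatest(
          getClientForOrder(order.id),
          getProductsForOrder(order.id),
          (client, products) => Object.assign({}, order, { client, products })
        ))
      )),
    (userRequest, orders) => ({ userRequest, orders })
  )
  .subscribe(x => { console.dir(x); });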

Related

Is there a more accurate way to use <MockedProvider /> for testing apollo requests

I've got my <MockedProvider /> set up, passing in mocks={mocks}. Everything is working, all good.
The issue is I have a form where, whenever any part of it is edited, a mutation is made, which returns a response and updates the total. Say, for example, quantity is changed: the mutation increases quantity from 1 to 2, and the total price should double.
The problem is that with unit tests and MockedProvider you only test the functionality with props and a hardcoded response, so it's not a proper test. Perhaps it's more of an e2e/integration test, but I was wondering if there's anything you can do with MockedProvider that allows for better testing in this situation?
Instead of using the normal static result property of the objects in the mocks array, you can set a newData function that will run dynamically and use whatever is returned as the result value. For example:
let totalNoteCount = 0;

const mocks = [{
  request: {
    query: CREATE_NOTE,
    variables: {
      title: 'Aloha!',
      content: 'This is a note ...',
    },
  },
  newData: () => {
    // do something dynamic before returning your data ...
    totalNoteCount += 1;
    return {
      data: {
        createNote: {
          id: 1,
          totalNoteCount,
        },
      },
    };
  },
}];
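For context, a minimal sketch of how such a mock might be wired up in a test — the import paths assume Apollo Client 3 (older setups use @apollo/react-testing), and NoteForm is a hypothetical component that fires the CREATE_NOTE mutation:

import { render } from '@testing-library/react';
import { MockedProvider } from '@apollo/client/testing';

// NoteForm is hypothetical; any component that triggers CREATE_NOTE works.
render(
  <MockedProvider mocks={mocks} addTypename={false}>
    <NoteForm />
  </MockedProvider>
);
// Each submission re-runs newData, so the mocked response reflects the
// updated totalNoteCount rather than a static value.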

How to update firestore collection based on other docs?

I am building an order form that limits how many items you can order based on the stock of the item. I have a menu collection which has items
// menu
{ id: "lasagna", name: "Lasagna", price: 10, stock: 15 }
{ id: "carrot-soup", name: "Carrot Soup", price: 10, stock: 15 }
{ id: "chicken-pot-pie", name: "Chicken Pot Pie", price: 10, stock: 15 }
And an orders collection
// orders
{ id: <auto>, name: "Sarah", cart: {lasagna: 1, carrot-soup: 3}, ... }
{ id: <auto>, name: "Wendy", cart: {chicken-pot-pie: 2, carrot-soup: 1}, ... }
{ id: <auto>, name: "Linda", cart: {lasagna: 3}, ... }
4 carrot-soups have been ordered, so the stock should be updated:
// updated stock
{ id: "carrot-soup", name: "Carrot Soup", stock: 11 }
Orders are inserted from my Form component
function Form(props) {
  // ...
  // send order to firestore
  const onSubmit = async _event => {
    try {
      const order = { cart, name, email, phone, sms }
      dispatch({ action: "order-add" })
      const id = await addDocument(store, "orders", order)
      dispatch({ action: "order-add-success", payload: { ...order, id } })
    }
    catch (err) {
      dispatch({ action: "order-add-error", payload: err })
    }
  }
  return <form>...</form>
}
This is my database addDocument function
import { addDoc, collection, serverTimestamp } from "firebase/firestore"

async function addDocument(store, coll, data) {
  const docRef = await addDoc(collection(store, coll), { ...data, timestamp: serverTimestamp() })
  return docRef.id
}
How should I decrement the stock field in my menu collection?
Ideally the client should have only read access to menu but to update the stock the client would need write access.
Another possibility is to have the client query the orders, sum the items, and subtract them from the read-only menu. But giving the client read access to other people's orders seems wrong too.
I am new to firestore and don't see a good way to design this.
You should definitely use a cloud function to update the stock. Create onCreate and onDelete function triggers; if users can change data, you will also need an onWrite trigger.
Depending on the amount of data you have, you may need to create a custom queue system to update the stock. Believe me! It took me almost 2 years to figure out how to solve this. I have even spoken with the Firebase engineers at the last Firebase Summit in Madrid.
Usually you would use a transaction to update the stock. I would recommend you do so if you don't have too much data to store.
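For illustration, a minimal sketch of that transaction approach inside a Cloud Function, assuming the cart shape { itemId: quantity } from the question (decrementStockInTransaction is a made-up helper name):

// Hypothetical helper sketching the transaction approach: read every menu
// doc in the cart first, then write the decremented stock atomically.
async function decrementStockInTransaction(cart) {
  const db = admin.firestore()
  await db.runTransaction(async tx => {
    const entries = Object.entries(cart)
    const refs = entries.map(([id]) => db.collection("menu").doc(id))
    // All reads must happen before any writes inside a transaction.
    const snaps = await Promise.all(refs.map(ref => tx.get(ref)))
    snaps.forEach((snap, i) => {
      const [, qty] = entries[i]
      tx.update(refs[i], { stock: (snap.get("stock") ?? 0) - qty })
    })
  })
}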
In my case the amount of data was so large that those transactions would randomly fail, so the stock wasn't correct at all. You can see my StackOverflow answer here. The first time, I thought I had an answer. It took me years to solve this; I even asked the same question at a Firebase Summit in Amsterdam, of one of the engineers who worked on the Realtime Database before they went to Google.
There is a solution that stores the stock in chunks, but even that caused random errors with our data. Each time we improved our solution the random errors were reduced, but they still remained.
The solution we are still using is a custom queue that works through each change one by one. The downside is that it takes some time to process a large batch of changes, but it is 100% accurate.
Just in case, we still have a "recalculator" that recalculates everything once a day and checks that it all worked as it should.
Sorry for the long answer. It looks to me like you are building a system similar to ours. If you plan to create a warehouse management system like we did, I would rather point you in the right direction.
In the end it depends on the amount of data you have and how often or fast you change it.
Here is a solution based on Tarik Huber's advice.
First I include functions and admin
const functions = require("firebase-functions")
const admin = require("firebase-admin")
admin.initializeApp()
Then I create increment and decrement helpers
const menuRef = admin.firestore().collection("menu")

const increment = ([id, n]) =>
  menuRef.doc(id).update({
    stock: admin.firestore.FieldValue.increment(n)
  })

const decrement = ([id, n]) =>
  increment([id, n * -1])
Here are the onCreate and onDelete hooks:
exports.updateStockOnCreate = functions
  .firestore
  .document("orders/{orderid}")
  .onCreate(snap => Promise.all(Object.entries(snap.get("cart") ?? {}).map(decrement)))

exports.updateStockOnDelete = functions
  .firestore
  .document("orders/{orderid}")
  .onDelete(snap => Promise.all(Object.entries(snap.get("cart") ?? {}).map(increment)))
To handle onUpdate I compare the cart before and after using a diff helper
exports.updateStockOnUpdate = functions
  .firestore
  .document("orders/{orderid}")
  .onUpdate(change => Promise.all(diff(change.before.get("cart"), change.after.get("cart")).map(increment)))
Here is the diff helper
function diff(before = {}, after = {}) {
  const changes = []
  const keys = new Set(Object.keys(before).concat(Object.keys(after)))
  for (const k of keys) {
    const delta = (before[k] ?? 0) - (after[k] ?? 0)
    if (delta !== 0)
      changes.push([k, delta])
  }
  return changes
}
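To make the sign convention concrete, here's what diff returns for a hypothetical edit where a cart goes from one lasagna and three carrot-soups to two lasagnas and two carrot-soups:

diff({ lasagna: 1, "carrot-soup": 3 }, { lasagna: 2, "carrot-soup": 2 })
// => [["lasagna", -1], ["carrot-soup", 1]]
// Each pair feeds increment(), so lasagna stock drops by one and
// carrot-soup stock goes back up by one.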

Javascript, Redux thunk, synchronous / nested promises

I have a direct messaging application. All the data is stored in Firebase. Each chat contains an array of user IDs.
I use the following function to get all chats from componentDidMount():
return dispatch => new Promise(resolve => FirebaseRef.child('chats')
  .on('value', snapshot => resolve(dispatch({
    type: 'CHATS_REPLACE',
    data: snapshot.val() || [],
  })))).catch(e => console.log(e));
Which goes through:
chatReducer(state = initialState, action) {
  switch (action.type) {
    case 'CHATS_REPLACE': {
      let chats = [];
      if (action.data && typeof action.data === 'object') {
        chats = Object.values(action.data).map(item => ({
          id: item.id,
          title: item.title,
          authorizedUsers: Object.values(item.authorizedUsers).map(user => ({
            id: user.id,
            // Somedata: fetchUserData(user.id)
            // -> pretty sure it can't be done here <-
          })),
        }));
      }
      return {
        ...state,
        error: null,
        loading: false,
        chats,
      };
    }
  }
}
How would I go about fetching more data of every user inside each chat from Firebase at users/:uid?
I don't know what the use case for this is; it would be great if you could share, e.g., how much information about each user you want to use. If it's small data, why not add it in the same API only? You can pass the users' data in the same object with user ids as keys, and use those keys inside your nested data, like this (only if the user data is small, or you know the API data is always limited, e.g. because of pagination or page size):
{
  posts: [
    {
      title: 'abc',
      authorizedUsers: ['1a', '2b', '3c']
    }, ....
  ],
  users: {
    '1a': {
      name: 'john doe',
      profileImage: 'https://some.sample.link',
    },
    '2b': {
      name: 'bob marshal',
      profileImage: 'https://some.sample.link2',
    }
  }
}
If the data is huge or cannot be added to the API (because the API is owned by a 3rd party), then the only place you can put your code is the service itself: instead of dispatching the action as soon as the response is received, loop over the response in your service, make async calls to fetch all the unique users only, append that data to the data received from the previous API call, and then dispatch the action with the complete data to the store.
It might not be the best way, as everything will have to stall, i.e. even the data received in the 1st API call won't be updated on screen until all the user data is fetched. But the best solution can only be chosen once we know more details about the use case. Maybe lazy-fetch the user data as the end user scrolls and a particular post comes into view. Or fetch the user details once you start rendering the data from the 1st API call, e.g. with a component that shows the user associated with a post: pass the userIds as props from the top "article/post/blog" component and fetch the data in its componentDidMount, at the moment it actually renders that "article/blog/post".
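A rough sketch of that "fetch the unique users, then dispatch once" approach, mirroring the action-creator shape from the question (fetchUserData is a made-up helper, and once is used instead of on for simplicity):

// Hypothetical helper: read a single user's data once from users/:uid.
const fetchUserData = uid =>
  FirebaseRef.child(`users/${uid}`).once('value').then(s => s.val());

return dispatch => FirebaseRef.child('chats').once('value')
  .then(snapshot => {
    const data = snapshot.val() || {};
    // Collect the unique user ids across all chats.
    const uids = [...new Set(
      Object.values(data).flatMap(chat =>
        Object.values(chat.authorizedUsers || {}).map(user => user.id))
    )];
    return Promise.all(uids.map(fetchUserData)).then(users => dispatch({
      type: 'CHATS_REPLACE',
      data,
      users: Object.fromEntries(uids.map((uid, i) => [uid, users[i]])),
    }));
  })
  .catch(e => console.log(e));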
Hope this helps.

Can't get state to update array inside of an object correctly in REACT/REDUX

I am going to break this down step by step, so hopefully people can understand what I want to happen.
Using React/Redux, Lodash
I have many posts that are sent from a back-end API as an array. Each post has an _id. When I call the action getAllPost() it gives me back that array with all the posts. This is working just fine.
I then dispatch type GET_ALL_POSTS and it triggers the reducer reducer_posts to change/update the state.
reducer:
export default function(state = {}, action) {
  switch (action.type) {
    case GET_ALL_POSTS:
      const postsState = _.mapKeys(action.payload.data, '_id');
      //const newPostsState = _.map(postsState, post => {
      //  const newComments = _.mapKeys(post.comments, '_id');
      //});
      return postsState;
    default:
      return state;
  }
}
As you can see, I change the array into one giant object that contains many posts as objects, with keys equal to their '_id'. This works just fine, and returning this part of the state also works fine.
As I mentioned, each of these posts has a comments value that is an array. I would like to change the comments array into one large object that holds each comment as an object with a key equal to its '_id', just like I did for the posts.
Now I need to do this all at once and return the newly created state: one large object that contains all the posts as objects, and on each of those posts there should be a comments object that contains all the comments as objects. I will try to write some example code to show what I am trying to do.
Example:
BigPostsObject {
  1: SinglePostObject{},
  2: SinglePostObject{},
  3: SinglePostObject {
    _id: '3',
    author: 'Mike',
    comments: BigCommentObject{ 1: SingleCommentObject{}, 2: SingleCommentObject{} }
  }
}
I hope that the example clears up what I am trying to do. If it is still confusing, please ask, and please do not say things like "use an array instead". I know I can use an array, but that is not helpful here, as others may want to do it this way.
Write a function that processes all the comments from the comments array for each post you have in the posts array:
function processComment(post) {
  post.bigCommentsObject = _.mapKeys(post.comments, '_id');
  // now the comments array is no longer needed
  return _.omit(post, ['comments']);
}
Now use that function to turn each comments array into a big object with all the comments WHILE it is still in the posts array. Afterwards, turn the array itself into a big object:
const commentsProcessed = _.map(action.payload.data, processComment);
const postsState = _.mapKeys(commentsProcessed, '_id');
I believe that nowadays JS builtin functions can do this without requiring external libraries, and that should be the way to go; I would really encourage you to get back to JS builtin functions.
var data = [
  {
    _id: '3',
    title: 'Going on vacation',
    comments: [
      { _id: 1, comment: 'hello' },
      { _id: 2, comment: 'world' }
    ]
  },
  {
    _id: '2',
    title: 'Going to dinner',
    comments: [
      { _id: 1, comment: 'hello' },
      { _id: 2, comment: 'world' }
    ]
  }
]

// you can use the JS builtin reduce for this
var transformedPost = _.reduce(data, function(posts, post) {
  var newPost = Object.assign({}, post)
  newPost._id = post._id
  // you can use the js builtin map for this
  newPost.comments = _.mapKeys(post.comments, '_id')
  // if you are using es6, replace the last three lines with this:
  // return Object.assign({}, posts, { [newPost._id]: newPost })
  var item = {}
  item[newPost._id] = newPost
  return Object.assign({}, posts, item)
}, {});

console.log(transformedPost)
https://jsbin.com/suzifudiya/edit?js,console
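For comparison, here is a hedged sketch of the same transform using only built-ins (Object.fromEntries and spread, ES2019+), no lodash, under the same data shape as above:

const transformedPostVanilla = Object.fromEntries(
  data.map(post => [
    post._id,
    { ...post, comments: Object.fromEntries(post.comments.map(c => [c._id, c])) },
  ])
);
console.log(transformedPostVanilla);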

How to access aggregated collection data in meteor client?

I aggregated some data and published it, but I'm not sure how/where to access the subscribed data. Would I be able to access the WeeklyOrders client collection (which is defined as a client-only collection, i.e. WeeklyOrders = new Mongo.Collection(null);)?
Also, I see "self = this;" being used in several examples online, and I just used it here, but I'm not sure why. I'd appreciate anyone explaining that as well.
Here is publish method:
Meteor.publish('customerOrdersByWeek', function(customerId) {
  check(customerId, String);
  var self = this;
  var pipeline = [
    { $match: { customer_id: customerId } },
    { $group: {
        _id: { week: { $week: "$_created_at" }, year: { $year: "$_created_at" } },
        weekly_order_value: { $sum: "$order_value" }
      }
    },
    { $project: { week: "$_id.week", year: "$_id.year" } },
    { $limit: 2 }
  ];
  var result = Orders.aggregate(pipeline);
  result.forEach(function(wo) {
    self.added('WeeklyOrders', objectToHash(wo._id), { year: wo.year, week: wo.week, order_value: wo.weekly_order_value });
  });
  self.ready();
});
Here is the route:
Router.route('/customers/:_id', {
  name: 'customerOrdersByWeek',
  waitOn: function() {
    return [
      Meteor.subscribe('customerOrdersByWeek', this.params._id)
    ];
  },
  data: function() { return Customers.findOne(this.params._id); }
});
Here is my template helper:
Template.customerOrdersByWeek.helpers({
  ordersByWeek: function() {
    return WeeklyOrders.find({});
  }
});
You want var self = this (note the var!) so that the call to self.added works. See this question for more details. Alternatively, you can use the new ES6 arrow functions (again, see the linked question).
There may be more than one issue here, but in your call to added you are giving a random id. This presents two problems:
1. If you subscribe N times, you will get N copies of the same document sent to the client (each with a different id). See this question for more details.
2. You can't match the document by id on the client.
On the client, you are doing Customers.findOne(this.params._id), where this.params._id is, I assume, a customer id... but your WeeklyOrders have random ids. Give this a try:
self.added('WeeklyOrders', customerId, {...});
updated answer
You'll need to add a client-only collection as a sort-of mailbox for your publisher to send WeeklyOrders to:
client/collections/weekly-orders.js
WeeklyOrders = new Meteor.Collection('WeeklyOrders');
Also, because you could have multiple docs for the same user, you'll probably need to either:
1. Forget what I said earlier and just use a random id, but never subscribe more than once. This is an easy solution but somewhat brittle.
2. Use a compound index (combine the customer id + week, or whatever is necessary to make the ids unique).
Using (2), and adding a customerId field so you can find the docs on the client, results in something like this:
result.forEach(function (wo) {
  var id = customerId + wo.year + wo.week;
  self.added('WeeklyOrders', id, {
    customerId: customerId,
    year: wo.year,
    week: wo.week,
    order_value: wo.weekly_order_value,
  });
});
Now on the client you can find all of the WeeklyOrders by customerId via WeeklyOrders.find({customerId: someCustomerId}).
Also note that instead of using pub/sub you could do all of this in a method call. Both are non-reactive. The pub/sub approach gets you collection semantics (the ability to call find etc.), but it adds the complexity of having to deal with ids.
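A minimal sketch of that method-call alternative, reusing the aggregation pipeline from the publisher above (the method name weeklyOrders is made up here):

// Server: a hypothetical method returning the aggregation result directly.
Meteor.methods({
  weeklyOrders: function (customerId) {
    check(customerId, String);
    var pipeline = [ /* same pipeline as in the publisher above */ ];
    return Orders.aggregate(pipeline);
  }
});

// Client: no subscription and no document ids to manage.
Meteor.call('weeklyOrders', customerId, function (err, orders) {
  if (!err) Session.set('weeklyOrders', orders);
});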
