How to avoid race conditions when fetching data with Redux? - javascript

We have an action that fetches an object asynchronously, let's call it getPostDetails, which takes the id of the post to fetch. The user is presented with a list of posts and can click on one to get some details.
If a user clicks on "Post #1", we dispatch a GET_POST action which might look something like this.
const getPostDetails = (id) => ({
  type: c.GET_POST_DETAILS,
  promise: (http) => http.get(`http://example.com/posts/#${id}`),
  returnKey: 'facebookData'
})
This is picked up by a middleware, which adds a success handler to the promise, which will call an action like
GET_POST__OK with the deserialized JSON object. The reducer sees this object and applies it to a store. A typical
__OK reducer looks like this.
[c.GET_ALL__OK]: (state, response) => assign(state, {
  currentPost: response.postDetails
})
Later down the line we have a component that looks at currentPost and displays the details for the current post.
However, we have a race condition. If a user submits two GET_POST_DETAILS actions one right after the other, there is no guarantee what order we receive the __OK actions in; if the second HTTP request finishes before the first, the state will become incorrect.
Action => Result
---------------------------------------------------------------------------------
|T| User Clicks Post #1 => GET_POST for #1 dispatched => Http Request #1 pending
|i| User Clicks Post #2 => GET_POST for #2 dispatched => Http Request #2 pending
|m| Http Request #2 Resolves => Results for #2 added to state
|e| Http Request #1 Resolves => Results for #1 added to state
V
How can we make sure the last item the user clicked always will take priority?

The problem is due to suboptimal state organization.
In a Redux app, state keys like currentPost are usually an anti-pattern. If you have to “reset” the state every time you navigate to another page, you'll lose one of the main benefits of Redux (or Flux): caching. For example, you can no longer navigate back instantly if any navigation resets the state and refetches the data.
A better way to store this information would be to separate postsById and currentPostId:
{
  currentPostId: 1,
  postsById: {
    1: { ... },
    2: { ... },
    3: { ... }
  }
}
Now you can fetch as many posts at the same time as you like, and independently merge them into the postsById cache without worrying whether the fetched post is the current one.
Inside your component, you would always read state.postsById[state.currentPostId], or better, export a getCurrentPost(state) selector from the reducer file so that the component doesn't depend on the specific state shape.
Now there are no race conditions and you have a cache of posts so you don’t need to refetch when you go back. Later if you want the current post to be controlled from the URL bar, you can remove currentPostId from Redux state completely, and instead read it from your router—the rest of the logic would stay the same.
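A minimal sketch of what the __OK handler and selector could look like under this shape (the action and field names follow the question and are illustrative; a spread is used for the non-mutating merge):

const handlers = {
  [c.GET_POST_DETAILS__OK]: (state, response) => ({
    ...state,
    postsById: {
      ...state.postsById,
      // merge the fetched post into the cache, keyed by id,
      // instead of overwriting a single currentPost
      [response.postDetails.id]: response.postDetails
    }
  })
}

// selector exported from the reducer file, so components don't depend on the shape
export const getCurrentPost = (state) => state.postsById[state.currentPostId]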
While this isn’t strictly the same, I happen to have another example with a similar problem. Check out the code before and the code after. It’s not quite the same as your question, but hopefully it shows how state organization can help avoid race conditions and inconsistent props.
I also recorded a free video series that explains these topics so you might want to check it out.

Dan's solution is probably a better one, but an alternative solution is to abort the first request when the second one begins.
You can do this by turning your action creator into an async one which can read from the store and dispatch other actions, which redux-thunk allows you to do.
The first thing your async action creator should do is check the store for an existing promise and abort it if there is one. It can then make the new request and dispatch a 'request begins' action, which contains the promise object, which is stored for next time.
That way, only the most recently created promise will resolve. When one does, you can dispatch a success action with the received data. You can also dispatch an error action if the promise is rejected for some reason.
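A plain promise can't actually be aborted, so here is a sketch of the same idea using redux-thunk with an AbortController kept outside the store instead of the promise itself (the action types and URL are illustrative):

// Sketch only: keep a handle to the in-flight request so a newer click can cancel it.
// Storing non-serializable values in Redux state is discouraged, so the controller
// lives in module scope here.
let currentController = null

export const getPostDetails = (id) => async (dispatch) => {
  if (currentController) {
    currentController.abort() // cancel the previous request, if any
  }
  currentController = new AbortController()

  dispatch({ type: 'GET_POST_DETAILS__BEGIN', id })
  try {
    const response = await fetch(`http://example.com/posts/${id}`, {
      signal: currentController.signal
    })
    const postDetails = await response.json()
    dispatch({ type: 'GET_POST_DETAILS__OK', postDetails })
  } catch (err) {
    // an aborted request rejects with an AbortError; ignore it so stale data never lands
    if (err.name !== 'AbortError') {
      dispatch({ type: 'GET_POST_DETAILS__FAIL', error: err })
    }
  }
}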

Related

How to prevent API calls on a minute timer loop when in the process of logging out of a web app?

In my Ionic/Angular app, I have a 60 second timer observable which just emits the current time synced with server time. Each minute I fetch permissions, settings, etc. I pass a token with each request. On logout I revoke the token. Here's a sample of what my logic looks like.
Side note: There's also a feature where a user can "change login type" where they can "become" an administrator, for example, and this process may also trigger a similar circumstance.
this.clientTimeSub = this.timeService.clientTime
  .pipe(takeUntil(this.logoutService.isLoggingOut$))
  .subscribe(async (latestClientTime) => {
    this.clientTime = { ...latestClientTime };

    // if client time just rolled over to a new minute, update settings
    if (
      this.clientTime?.time?.length === 7 &&
      this.clientTime?.time?.slice(-1) === '0'
    ) {
      await updateSettings();
      await updatePermissions();
      // etc
      // These functions will:
      // (1) make an api call (using the login token!)
      // (2) update app state
      // (3) save to app storage
    }
  });
When I am logging out of the app, there's a small time window where I could be in the middle of sending multiple api requests and the token is no longer valid, due to the timer rolling to a new minute just as I was logging out, or close to it. I am then presented with a 401: Unauthorized in the middle of logging out.
My naive solution was to tell this observable to stop propagation when a Subject or BehaviorSubject fires a value saying that we are logging out; you can see this here: .pipe(takeUntil(this.logoutService.isLoggingOut$)).
Then, in any of my logout methods, I would use:
logout() {
  this.isLoggingOut.next(true);
  ...
  // Logout logic here, token becomes invalidated somewhere here
  // then token is deleted from state, etc, navigate back to login...
  ...
  this.isLoggingOut.next(false);
}
In that small time window of logging out, the client timer should stop firing and checking if it's rolled to a new minute, preventing any further api calls that may be unauthenticated.
Is there a way I can easily prevent this issue from happening or is there a flaw in my logic that may be causing this issue?
I appreciate any help, thank you!
First of all, mixing async/await with RxJS is not the best approach. RxJS, being a reactive, functional style of programming, has its own "pipeable" operators, so you can chain everything.
So instead of putting the time-calculation logic in your subscribe callback, you should use, for example, the filter() RxJS operator, and instead of async/await you can use the switchMap operator and, inside it, forkJoin or concat.
import { forkJoin, from } from 'rxjs';
import { filter, switchMap, takeUntil } from 'rxjs/operators';

this.timeService.clientTime
  .pipe(
    // Filter the stream (according to your calculation)
    filter((time) => {
      // here is your logic to calculate if time has passed or whatever else you are doing
      // const isValid = ...
      return isValid;
    }),
    // Switch to another stream so you can make the API calls.
    // "from" converts the promises to observables so we can use the magic of RxJS.
    switchMap(() => forkJoin([from(updateSettings()), from(updatePermissions())])),
    // Take until your logout
    takeUntil(this.logoutService.isLoggingOut$)
  )
  .subscribe(([settingsResult, permissionsResult]) => {
    // Basically your promises should just call the API services; the rest of the logic goes here:
    // (2) update app state
    // (3) save to app storage
  });
If you split the work like in my example, your promises just make the API calls; then, when they are done, you update the app state, save to app storage, etc. in the subscribe callback. So you can have two scenarios here:
The API calls from the promises are still in progress. If you trigger logout in the meantime, takeUntil will do its job and you will not update app state, etc.
If both API calls are done, you are already in the subscribe callback, and if it is just synchronous code (hopefully) it will run to completion. Only then can other async code execute (your timer can now emit its next value; it's all about the event loop in JavaScript).

Is there a well-established way to update local state immediately without waiting for an API response in React/Redux?

TL;DR: Is there some well-known solution out there using React/Redux for being able to offer a snappy and immediately responsive UI, while keeping an API/database up to date with changes that can gracefully handle failed API requests?
I'm looking to implement an application with a "card view" using https://github.com/atlassian/react-beautiful-dnd where a user can drag and drop cards to create groups. As a user creates, modifies, or breaks up groups, I'd like to make sure the API is kept up to date with the user's actions.
HOWEVER, I don't want to have to wait for an API response to set the state before updating the UI.
I've searched far and wide, but keep coming upon things such as https://redux.js.org/tutorials/fundamentals/part-6-async-logic which suggests that the response from the API should update the state.
For example:
export default function todosReducer(state = initialState, action) {
  switch (action.type) {
    case 'todos/todoAdded': {
      // Return a new todos state array with the new todo item at the end
      return [...state, action.payload]
    }
    // omit other cases
    default:
      return state
  }
}
As a general concept, this has always seemed odd to me, since it's the local application telling the API what needs to change; we obviously already have the data before the server even responds. This may not always be the case, such as creating a new object and wanting the server to dictate a new "unique id" of some sort, but it seems like there might be a way to just "fill in the blanks" once the server does respond with any missing data. In the case of an UPDATE vs CREATE, there's nothing the server is telling us that we don't already know.
This may work fine for a small and lightweight application, but if I'm looking at API responses in the range of 500-750ms on average, the user experience is going to just be absolute garbage.
It's simple enough to create two actions, one that will handle updating the state and another to trigger the API call, but what happens if the API returns an error or a network request fails and we need to revert?
I tested how Trello implements this sort of thing by cutting my network connection and creating a new card. It eagerly creates the card immediately upon submission, and then removes the card once it realizes that it cannot update the server. This is the sort of behavior I'm looking for.
I looked into https://redux.js.org/recipes/implementing-undo-history, which offers a way to "rewind" state, but being able to implement this for my purposes would need to assume that subsequent API calls all resolve in the same order that they were called - which obviously may not be the case.
As of now, I'm resigning myself to the fact that I may need to just follow the established limited pattern, and lock the UI until the API request completes, but would love a better option if it exists within the world of React/Redux.
The approach you're talking about is called "optimistic" network handling -- assuming that the server will receive and accept what the client is doing. This works in cases where you don't need server-side validation to determine if you can, say, create or update an object. It's also equally easy to implement using React and Redux.
Normally, with React and Redux, the update flow is as follows:
The component dispatches an async action creator
The async action creator runs its side-effect (calling the server), and waits for the response.
The async action creator, with the result of the side-effect, dispatches an action to call the reducer
The reducer updates the state, and the component is re-rendered.
Some example code to illustrate (I'm pretending we're using redux-thunk here):
// ... in my-component.js:
export default () => {
  const dispatch = useDispatch();
  useEffect(() => {
    dispatch(MyActions.UpdateData(someDataFromSomewhere));
  }, [dispatch]);
  return (<div />);
};

// ... in actions.js
export const UpdateData = (data) => async (dispatch, getStore) => {
  const results = await myApi.postData(data);
  dispatch(UpdateMyStore(results));
};
However, you can easily flip the order your asynchronous code runs in by simply not waiting for your asynchronous side effect to resolve. In practical terms, this means you don't wait for your API response. For example:
// ... in my-component.js:
export default () => {
  const dispatch = useDispatch();
  useEffect(() => {
    dispatch(MyActions.UpdateData(someDataFromSomewhere));
  }, [dispatch]);
  return (<div />);
};

// ... in actions.js
export const UpdateData = (data) => async (dispatch, getStore) => {
  // we're not waiting for the api response anymore,
  // we just dispatch whatever data we want to our reducer
  dispatch(UpdateMyStore(data));
  myApi.postData(data);
};
One last thing though -- doing things this way, you will want to put some reconciliation mechanic in place, to make sure the client does know if the server calls fail, and that it retries or notifies the user, etc.
The key phrase here is "optimistic updates", which is a general pattern for updating the "local" state on the client immediately with a given change under the assumption that any API request will succeed. This pattern can be implemented regardless of what actual tool you're using to manage state on the client side.
It's up to you to define and implement what appropriate changes would be if the network request fails.
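As a rough sketch of that reconciliation step, an optimistic thunk can apply the change immediately and dispatch a compensating action if the request fails (the action types and the myApi helper below are illustrative, not an established API):

export const addTodo = (todo) => async (dispatch) => {
  // give the item a temporary client-side id so we can find it again later
  const tempId = `temp-${Date.now()}`

  // 1. update local state immediately (the optimistic part)
  dispatch({ type: 'todos/todoAdded', payload: { ...todo, id: tempId } })

  try {
    // 2. tell the server; swap in the real record when it responds
    const saved = await myApi.createTodo(todo)
    dispatch({ type: 'todos/todoConfirmed', payload: { tempId, todo: saved } })
  } catch (err) {
    // 3. the request failed: roll the optimistic change back and notify the user
    dispatch({ type: 'todos/todoAddFailed', payload: { tempId, error: err.message } })
  }
}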

Unsubscribing from an observable when another observable is unsubscribed

Suppose I have a list of items that is queried from an API depending on a parameter that can be changed in the UI. When changing the value of this parameter, I dispatch an action:
this.store$.dispatch(new ChangeParameterAction(newParameterValue));
Now, on the receiving end, I want to trigger a new API call on every parameter change. I do this by subscribing to the store and then switching to the API observable. Then, I dispatch the result back into the store:
/** Statement 1 **/
this.store$.select(selectParameter).pipe(
  switchMap(parameter => this.doApiCall$(parameter))
).subscribe(apiResult => {
  this.store$.dispatch(new ResultGotAction(apiResult))
});
My UI is receiving the items by subscribing to
/** Statement 2 **/
this.store$.select(selectResults);
Now my question is: How can I join these two statements together so that we only have the subscription for Statement 1 for as long as the UI showing the results is active (and not destroyed)? I will always subscribe to the result of Statement 2, so Statement 1 will never be unsubscribed.
I've tried merging both observables and ignoring the elements from Statement 1, then subscribing to the merged observable. But this looks like a very unreadable way of doing such a basic task. I think there must be a better way, but I can't find one. Hope you can help!
I can't tell exactly if this would run correctly, but I would go with moving the dispatch of ResultGotAction to a tap operator and then switching to this.store$.select(selectResults).
For example:
this.store$.select(selectParameter).pipe(
  switchMap(parameter => this.doApiCall$(parameter)),
  tap(apiResult => this.store$.dispatch(new ResultGotAction(apiResult))),
  switchMapTo(this.store$.select(selectResults))
);
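With the API call folded into the same stream, the component can subscribe to this single observable and tear everything down when it is destroyed. A sketch, assuming the component keeps a destroy$ subject (the names here are illustrative; Subject and the operators come from 'rxjs' / 'rxjs/operators'):

ngOnInit() {
  this.destroy$ = new Subject();
  this.results$ = this.store$.select(selectParameter).pipe(
    switchMap(parameter => this.doApiCall$(parameter)),
    tap(apiResult => this.store$.dispatch(new ResultGotAction(apiResult))),
    switchMapTo(this.store$.select(selectResults)),
    // unsubscribing from results$ (or destroying the component) now also
    // unsubscribes from the parameter/API chain above
    takeUntil(this.destroy$)
  );
}

ngOnDestroy() {
  this.destroy$.next();
  this.destroy$.complete();
}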

React-Redux: Where do I put a side effect that uses part of a reducer?

I know not to put side effects in reducers, and I know there are lots of great explanations about how to handle async actions. I have read them. I have a specific question I'm stumped on. Thanks!
I have state.largeObject which is an object with many entries. I have a reducer that does some complex logic and merges the result into state.largeObject like so:
export const myReducer = (state, { input }) => {
  const largeObject = doSomethingComplex(input)
  // other logic that uses largeObject
  return {
    ...state,
    largeObject: {
      ...state.largeObject,
      ...largeObject
    }
  }
}
I want to save the result of doSomethingComplex(input) to the server. Where do I put the side effect? EDIT: without duplicating doSomethingComplex which is still needed for other logic in the reducer.
I can't put it in myReducer or doSomethingComplex since they are pure.
I can't put it in Redux middleware, as that only has access to the action and the state as a whole. I don't want to call doSomethingComplex both in the reducer and in the middleware.
I don't want to move it into an action-creator, since that would force a lot of pure functionality into a thunk, making it harder to compose.
What am I missing? Thanks!
Middleware, thunks, and store subscription callbacks are all valid places to do that:
It's fine to have some saving logic as part of a thunk. Thunks are a type of middleware, and that's where side effects are supposed to live in general.
You could have a custom middleware that looks for the specific action type that results in this change, or checks to see if this chunk of state has changed, and makes a call to the server after the state has been updated
You could have a store subscribe callback that checks to see if this chunk of state has changed and makes a call to the server
Based on the way you phrased things, I'm not sure if you're looking to send only the result of that call to the server, or the result of doing {...state.largeObject, ...largeObject}.
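Either way, a minimal sketch of the third option, a store subscription that watches state.largeObject and saves it when it changes, might look like this (api.saveLargeObject is an illustrative name, not an existing helper):

let previousLargeObject = store.getState().largeObject

store.subscribe(() => {
  const nextLargeObject = store.getState().largeObject
  if (nextLargeObject !== previousLargeObject) {
    previousLargeObject = nextLargeObject
    // the reducer already produced the merged result, so we can send it as-is
    // without calling doSomethingComplex a second time
    api.saveLargeObject(nextLargeObject)
  }
})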
The most suitable approach in my opinion would be moving the whole reducer logic (your doSomethingComplex function) to the server side.
So all you'd have to do is dispatch the action and send the required arguments to the API. On a success response, you'd dispatch the success action, call myReducer, and save the result in the store.
However, if you really want to keep this logic on the front-end side, you'd have to use a middleware - thunks or sagas (I prefer sagas).
// some pseudo code //
dispatch the action
- inside the middleware -
  call doSomethingComplex()  // called once and stored in some variable
  dispatch action that will call the reducer and store the result
  call API with the result
But still I'd recommend the first solution, since the second approach will work but may break the proper data flow.
Edit: some final thoughts - if you really want to keep those calculations on the front-end side, consider the following strategy:
dispatch the action
inside middleware (thunks or sagas) call doSomethingComplex function
call the API with the stuff that doSomethingComplex returned
return the stuff - that doSomethingComplex returned - from the API as the success response
call success action with the stuff and invoke the reducer that will save it in the store
This is how the proper data-flow can be kept alive.
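A sketch of that strategy with redux-thunk (the action types and the api.saveLargeObject helper are illustrative):

export const updateLargeObject = (input) => async (dispatch) => {
  // doSomethingComplex runs exactly once, inside the thunk rather than the reducer
  const result = doSomethingComplex(input)
  try {
    // the server echoes the saved result back in its success response
    const saved = await api.saveLargeObject(result)
    // the reducer now only merges the precomputed result into state.largeObject
    dispatch({ type: 'LARGE_OBJECT_UPDATE_OK', payload: saved })
  } catch (err) {
    dispatch({ type: 'LARGE_OBJECT_UPDATE_FAILED', error: err.message })
  }
}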

Controlling Async Batch request inside redux action

I have a Redux action for search in the application. When the user starts a search, it batches the queries, sending 30 queries per request, and queues the first 10 requests. Whenever any one of the requests succeeds, it adds the next request to the request queue. All of this happens in a Redux action, and whenever a request succeeds it dispatches an action to append the result to the store. I would like input on how to handle this when the user clicks "cancel search" and enters a new search term. How can I cancel the existing requests and Redux actions so the previous search's requests will not succeed and get added to the results in the store?
Example code below:
function search(queries) {
  // split the queries into chunks of size 30
  const batches = _.chunk(queries, 30);
  let count = 1;

  // get the first batch manually
  getBatch(batches[0]);

  function getBatch(batch) {
    axios.post('url', batch.join(',')).then((response) => {
      // recursively call getBatch to get the rest of the data
      if (count < batches.length) { getBatch(batches[count++]); }
      // dispatch action to append the result into the store
      dispatch({ type: 'APPEND_RESULT', payload: response });
    });
  }
}
This is minimal code for sharing the concept.
I have read about cancellable promises; axios supports them. But I am not sure how to control this recursive call on a second execution of the same function.
E.g. the user input will be { ids: [1, 2, 3, .., 1000] }, and I am trying to create batches and send parallel requests: { ids: [1, 2, .., 29, 30] }, { ids: [31, 32, .., 59, 60] }, etc.
This looks like a candidate for redux-observable. You can create an observable and then add a debounce (as mentioned in a comment above), and you can easily cancel or switchMap to the new request so that any remaining items in the previous queue won't be considered.
This answer is entirely IMHO.
In my opinion, the whole approach to the problem is broken here. You'll bring down your server by constantly asking it on every input change. Just imagine how much traffic your website will consume on mobile!
The best workaround here is to:
Add _.debounce to your input with a small waiting time (100ms is good)
Have only one query in memory. On each new event from user input cancel the request and override the current one.
Dispatch an action with data on each server response
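A sketch of those three points combined, assuming redux-thunk and a recent axios version (0.22+) that accepts an AbortController signal; the endpoint, action types, and the sequential batch loop are simplifications for illustration:

let currentController = null

const runSearch = (queries) => async (dispatch) => {
  // only one query lives in memory: cancel whatever search is still in flight
  if (currentController) currentController.abort()
  currentController = new AbortController()

  dispatch({ type: 'SEARCH_STARTED', queries })
  try {
    const batches = _.chunk(queries, 30)
    for (const batch of batches) {
      const response = await axios.post('url', batch.join(','), {
        signal: currentController.signal
      })
      // dispatch an action with data on each server response
      dispatch({ type: 'APPEND_RESULT', payload: response })
    }
  } catch (err) {
    // aborted requests reject, so stale results never reach the store
    if (!axios.isCancel(err)) {
      dispatch({ type: 'SEARCH_FAILED', error: err.message })
    }
  }
}

// debounce the input handler with a small waiting time
export const onSearchInput = _.debounce(
  (queries) => store.dispatch(runSearch(queries)),
  100
)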
