I have a custom useMutation hook:
const {
  status: updateScheduleStatus,
  reset: updateScheduleReset,
  mutateAsync: updateSchedule,
} = useUpdateSchedule(queryClient, jobId as string);
I understand this sets up the mutation, but how would I use it if I wanted to run multiple mutations in parallel?
I have tried to implement the following, but the mutations execute before reaching the Promise.all(mutations) line.
let mutations: Array<any> = [];
schedulesForDeletion.forEach(async (schedule) => {
  const resp = await monitoringService?.getSchedule(
    schedule.schedule_id,
  );
  mutations.push(
    updateSchedule({
      monitoringService: monitoringService as MonitoringServiceClient,
      schedule,
      etag: resp?.type === "data" ? resp.headers.etag : "",
    }),
  );
});
console.dir(mutations);
await Promise.all(mutations);
I would have thought that, since mutateAsync returns a Promise, they would not fire in sequence, but it seems that they do.
Is there a way to handle this in react-query, or am I better off just performing this with axios? It would be useful to do it in react-query, as I need to invalidate some queries when the mutations are successful.
Running multiple mutations in parallel does work with mutateAsync:
const { mutateAsync } = useMutation(num => Promise.resolve(num + 1))
const promise1 = mutateAsync(1)
const promise2 = mutateAsync(2)
await Promise.all([promise1, promise2])
I'm guessing that in your example you push a Promise to the array, then continue your loop and await monitoringService?.getSchedule; only after that returns do you fire off the next mutation.
So in that sense, it seems that this await is what's "blocking" your execution. If you instead push the original Promise coming from getSchedule, it should work:
schedulesForDeletion.forEach((schedule) => {
  mutations.push(
    monitoringService?.getSchedule(
      schedule.schedule_id,
    ).then(resp => updateSchedule({...}))
  );
});
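Since the question mentions invalidating queries once the mutations succeed, here is a minimal follow-up sketch. The ["schedules", jobId] query key is an assumption, not from the original code, and the exact invalidateQueries signature varies between react-query versions:

// Wait for every chained promise, then invalidate the affected queries.
// ["schedules", jobId] is a placeholder key; use whatever key your schedule queries use.
await Promise.all(mutations);
await queryClient.invalidateQueries(["schedules", jobId]);

Alternatively, because mutateAsync still triggers the mutation's own onSuccess callback, the invalidation could also live inside useUpdateSchedule itself.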
There is a requirement to cancel request calls when navigating away from the page, or when the same API call is made multiple times (keeping only the last one active).
This is how the API is extracted out (just at a high level):
AJAX.ts
export async function customAjax(url, obj) {
  let options = {};
  options.headers = { ...options.headers, ...obj.headers };
  const response = await fetch(url, options);
  return response.json();
}
GET and POST calls are being extracted as
API.ts
const get = (url, extra = {}) => request({ url, type: "GET", ...extra });
const post = (url, payload, extra = {}) =>
  request({ url, data: payload, type: "POST", ...extra });
In the React component I call these utilities as follows:
function MyComponent() {
  useEffect(() => {
    makeCall();
  }, []);

  async function makeCall() {
    const { response, error } = await API.post(URL, payload);
    // Handling code is not added here
    // GET calls are made in a similar fashion
  }
}
I have come across AbortController for cancelling requests, where we could call the abort method when the component unmounts.
Is there a way to do this at a utility level, maybe inside customAjax, so that I could avoid writing AbortController code everywhere?
From my understanding, what you describe is no different from a memory-leak issue, and the current method for avoiding memory leaks is the AbortController.
As far as handling this at the "utility level", I don't think this is feasible, and indeed it would go against the preferred notion of an API being unaware of what's going on at the React component level, i.e. separation of concerns.
So, in order to accomplish your requirement, you'll need to use AbortController, or a custom implementation using a boolean flag that reflects whether the component is mounted, on a per-component basis.
Using the boolean flag, you may be able to accept an argument in your API and pass the flag as a parameter; but again, I think this would be considered an anti-pattern.
I understand you're looking for a minimal implementation; but standard practice is fairly minimal:
useEffect(() => {
  let abortController = new AbortController();
  // Async code
  return () => { abortController.abort(); };
}, []);
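That said, if you do want the utility to participate, the component can still own the controller and simply pass its signal through. A minimal sketch (the signal parameter and the handleResponse handler are assumptions, not part of the original customAjax):

// Sketch: the component owns the AbortController; the utility only forwards the signal to fetch.
export async function customAjax(url, obj = {}, signal) {
  const options = { headers: { ...obj.headers }, signal };
  const response = await fetch(url, options);
  return response.json();
}

// In the component:
useEffect(() => {
  const controller = new AbortController();
  customAjax(URL, { headers: {} }, controller.signal)
    .then(handleResponse) // handleResponse is a hypothetical success handler
    .catch((err) => {
      if (err.name !== "AbortError") throw err; // aborts are expected on unmount
    });
  return () => controller.abort();
}, []);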
Using a boolean flag would be more verbose, and would entail something like this in your case:
useEffect(() => {
  let isMounted = true;
  customAjax(isMounted);
  return () => {
    isMounted = false;
  };
}, []);
To handle out-of-order ajax responses, you can use a local variable inside the effect. For example,
useEffect(() => {
  let ignore = false;
  async function fetchProduct() {
    const response = await fetch('http://myapi/product/' + productId);
    const json = await response.json();
    if (!ignore) setProduct(json);
  }
  fetchProduct();
  return () => { ignore = true; };
}, [productId]);
The ignore variable will ensure that only the latest request's response is updated to state. Reference - https://reactjs.org/docs/hooks-faq.html#performance-optimizations
Regarding memory leak concerns, please see this discussion - https://github.com/reactwg/react-18/discussions/82
I have been studying promises, await, and async functions. While I was still at the stage of learning promises, I realized the following: when I sent out two requests, there was no guarantee that they would complete in the order they are written in the code, which of course makes sense given network routing and packet timing. When I ran the code below, the requests resolved in no particular order.
const getCountry = async country => {
  await fetch(`https://restcountries.com/v2/name/${country}`)
    .then(res => res.json())
    .then(data => {
      console.log(data[0]);
    })
    .catch(err => err.message);
};
getCountry('portugal');
getCountry('ecuador');
At that point I hadn't learned about async and await yet. The following code works exactly the way I want it to: each request waits until the previous one is done.
Is this the simplest way to do it? Are there any redundancies I could remove? I don't need a ton of alternative examples unless I am doing something wrong.
const getCountry = async country => {
  await fetch(`https://restcountries.com/v2/name/${country}`)
    .then(res => res.json())
    .then(data => {
      console.log(data[0]);
    })
    .catch(err => err.message);
};

const getCountryData = async function () {
  await getCountry('portugal');
  await getCountry('ecuador');
};

getCountryData();
Thanks in advance,
Yes, that's the correct way to do so. Do realize, though, that you're blocking on each request so they run one at a time, which is inefficient. As I mentioned, the beauty of JavaScript is its asynchronism, so take advantage of it. You can run all the requests almost concurrently, which speeds things up drastically. Take this example:
// get results...
const getCountry = async country => {
  const res = await fetch(`https://restcountries.com/v2/name/${country}`);
  const json = await res.json();
  return json;
};

const getCountryData = async countries => {
  const proms = countries.map(getCountry); // create an array of promises
  const res = await Promise.all(proms); // wait for all promises to complete
  // get the first value from each returned array
  return res.map(r => r[0]);
};

// demo:
getCountryData(['portugal', 'ecuador']).then(console.log);
// results come back in the order you requested the countries
getCountryData(['ecuador', 'portugal']).then(console.log);
// get lots of countries quickly
getCountryData(['mexico', 'china', 'france', 'germany', 'ecuador']).then(console.log);
Edit: I just realized that Promise.all preserves the order of the input array for you, so there's no need to add an extra sort function. Here's the sort fn anyway, for reference, if you take a different approach:
myArr.sort((a, b) =>
  (countries.indexOf(a.name.toLowerCase()) > countries.indexOf(b.name.toLowerCase())) ? 1 :
  (countries.indexOf(a.name.toLowerCase()) < countries.indexOf(b.name.toLowerCase())) ? -1 :
  0
);
I tried it the way #deceze recommended and it works fine: I removed all of the .then and replaced them with await. A lot cleaner this way. Now I can use normal try and catch blocks.
// GET COUNTRIES IN ORDER
const getCountry = async country => {
  try {
    const status = await fetch(`https://restcountries.com/v2/name/${country}`);
    const data = await status.json();
    renderCountry(data[0]); // Data is here. Now render HTML
  } catch (err) {
    console.log(err.name, err.message);
  }
};

const getCountryData = async function () {
  await getCountry('portugal');
  await getCountry('Ecuador');
};

btn.addEventListener('click', function () {
  getCountryData();
});
Thank you all.
When I try to process data coming from an API and then use it to render, I always run into a problem with async, because the processing function doesn't wait for my fetching functions.
const [fetchData1, setData1] = useState([]);
const [fetchData2, setData2] = useState([]);
const [processedData, setProcessedData] = useState([]);

useEffect(() => {
  const getData1 = async () => {
    // get data1 using axios
    // setData1(response)
  };
  const getData2 = async () => {
    // get data2 using axios
    // setData2(response)
  };
  getData1();
  getData2();
  setProcessedData(processData(fetchData1, fetchData2));
}, []);

const processData = (data1, data2) => {
  // process the two data sets
  // return data;
};
Even when I wrap the two fetching functions and the processing function in an async function, the problem remains the same.
(async () => {
  await getData1();
  await getData2();
  setProcessedData(processData(fetchData1, fetchData2));
})();
Reading your question, as far as I can tell you don't need fetchData1 and fetchData2, you just want the processedData. The problem with your current code is that it's using the default values of fetchData1 and fetchData2 when calling setProcessedData; it's not using the results from axios.
Wait for both promises to settle and use their results. See comments:
const [processedData, setProcessedData] = useState([]);

useEffect(() => {
  const getData1 = async () => {
    // get data1 using axios and return it
    // return response;
  };
  const getData2 = async () => {
    // get data2 using axios and return it
    // return response;
  };
  // *** Wait for both promises to be fulfilled
  Promise.all([
    getData1(),
    getData2()
  ]).then(([data1, data2]) => { // Get those results into parameters
    // *** Use the parameter values
    setProcessedData(processData(data1, data2));
  }).catch(error => {
    // handle/report error
  });
}, []);
// *** render using the current values in `processedData`
Note that since you're only doing this when the component is first created, you don't need to worry about cancelling it, etc., when other state in the component changes (if it has other state). If the calls depended on other state listed in the dependency array, you might need to disregard earlier results if that other data changed during the calls to axios. But again, not with what you're doing here.
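For completeness, here is a sketch of what guarding against stale results could look like if the effect did depend on something. The someId dependency is hypothetical, and getData1/getData2 are assumed to accept it:

useEffect(() => {
  let stale = false; // flipped when someId changes or the component unmounts
  Promise.all([getData1(someId), getData2(someId)])
    .then(([data1, data2]) => {
      if (!stale) setProcessedData(processData(data1, data2));
    })
    .catch(error => {
      if (!stale) {
        // handle/report error
      }
    });
  return () => { stale = true; };
}, [someId]);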
Promise.all is for handling multiple async operations:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/all
Here are more examples:
https://www.taniarascia.com/promise-all-with-async-await/
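A minimal sketch of that pattern applied to this question (the axios URLs are placeholders, not from the original post):

import axios from "axios";

// Fetch both data sets in parallel, then process them together.
async function loadAndProcess() {
  const [res1, res2] = await Promise.all([
    axios.get("/api/data1"), // placeholder URL
    axios.get("/api/data2"), // placeholder URL
  ]);
  return processData(res1.data, res2.data);
}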
I have an array of items, and for each element I want to make an HTTP call, wait for it to finish, and then make the next one: only one call at a time.
I tried:
index(item) {
  return this.service.index(item).pipe(
    map(response => {
      // handle success case
    }),
    catchError(error => {
      // handle error case
    })
  );
}

async processArray(array) {
  const promises = array.map(item => this.index(item));
  await Promise.all(promises);
}
processArray(array);
Also with NGRX Effects:
@Effect()
effect$ = this.actions$.pipe(
  ofType<action>(actionTypes.action),
  mergeMapTo(this.store.select(getMyArray)),
  flatMap((request: any[]) => {
    return zip(...request.map(item => {
      return this.service.index(item).pipe(
        map(response => {
          // handle success case
        }),
        catchError(error => {
          // handle error case
        })
      );
    }));
  }),
);
I also tried doing it with for and forEach loops, but they fire all the requests at once. How could I achieve this?
If you are using promises and want to wait for each promise to resolve before the next call is made, then (1) you should not use Promise.all, since it only waits for promises that are already running in parallel, and (2) you need a plain old for loop, which lets you await async operations inside the loop.
async processArray(array) {
  for (let i = 0; i < array.length; i++) {
    await yourServiceCall();
  }
}
As a sidenote: Since you are using async-await, don't forget to convert your observables to promises.
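For example, a sketch of that conversion (firstValueFrom requires RxJS 7+; on older versions .toPromise() plays the same role):

import { firstValueFrom } from 'rxjs';

async processArray(array) {
  for (const item of array) {
    // wait for each observable-backed call to finish before starting the next
    await firstValueFrom(this.index(item));
  }
}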
If you want to move away from promises (and async-await) and rely on pure RxJS instead, have a look at concatMap:
Projects each source value to an Observable which is merged in the output Observable, in a serialized fashion waiting for each one to complete before merging the next.
For example:
import { from } from 'rxjs/observable/from';
import { concatMap } from 'rxjs/operators';

ngOnInit() {
  from(myArray)
    .pipe(concatMap(el => yourServiceCall(el)))
    .subscribe(/* your logic */);
}
I am watching videos to learn MongoDB Express.js VueJS Node.js (MEVN) stack.
And I want to create a seed directory and also use promise functions
// const delay = require('delay')
const Promise = require('bluebird')
const songs = require('./songs.json')
const users = require('./users.json')
const bookmarks = require('./bookmarks.json')
const historys = require('./history.json')
sequelize.sync({ force: true })
  .then(async function () {
    await Promise.all(
      users.map(user => {
        User.create(user)
      })
    )
    await Promise.all(
      songs.map(song => {
        Song.create(song)
      })
    )
    // I have to add this line
    // ---> await delay(1000)
    await Promise.all(
      bookmarks.map(bookmark => {
        Bookmark.create(bookmark)
      })
    )
    await Promise.all(
      historys.map(history => {
        History.create(history)
      })
    )
  })
I have four tables to seed, and the data for the last two tables must be created after the data for the first two (the last two reference them via foreign keys).
But every time I run this file, the data for the last two tables gets created first.
The only way I can prevent this is to add delay(1000) between them.
I am wondering if there is a more reliable way to solve this issue.
Thank you.
Race conditions like this one are always caused by promises that weren't properly chained.
A promise should be returned from the map callback:
await Promise.all(
  users.map(user => User.create(user))
);
etc.
Not returning a value from map is virtually always a mistake. It can be prevented with the array-callback-return ESLint rule.
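For reference, enabling that rule is a one-liner in a classic ESLint config (a sketch, assuming an .eslintrc.js setup):

// .eslintrc.js
module.exports = {
  rules: {
    // report map/filter/etc. callbacks that don't return a value
    'array-callback-return': 'error',
  },
};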
If User.create(user), etc. were Bluebird promises with the default configuration, not chaining them would also produce Bluebird's "a promise was created in a handler but was not returned from it" warning.
My assumption as to why your code might fail:
You're not returning the Promises that I guess /(User|Song|Bookmark|History).create/g return to the Promise.all() call, since your map callback isn't returning anything.
If you're using arrow functions with curly brackets, then you need to explicitly specify the return value (using the familiar return keyword).
Otherwise you can just omit the curly brackets.
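In other words (a small illustration, not from the original answer):

// With curly brackets, the value must be returned explicitly:
users.map(user => { return User.create(user); });

// Without curly brackets, the expression is returned implicitly:
users.map(user => User.create(user));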
My suggestion is that you refactor your code by using Promise .then() chaining.
For your example, I would suggest something like this:
const Promise = require('bluebird')
const songs = require('./songs.json')
const users = require('./users.json')
const bookmarks = require('./bookmarks.json')
const histories = require('./history.json')

sequelize.sync({ force: true })
  .then(() =>
    Promise.all(users.map(user => User.create(user)))
      .then(() => Promise.all(songs.map(song => Song.create(song))))
      .then(() => Promise.all(bookmarks.map(bookmark => Bookmark.create(bookmark))))
      .then(() => Promise.all(histories.map(history => History.create(history))))
  );
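If you prefer the async/await style from your original file, the same sequencing can be written as follows (a sketch; sequelize and the models come from your existing setup):

sequelize.sync({ force: true }).then(async () => {
  // Each group is awaited before the next one starts,
  // so the foreign-key targets exist before the rows that reference them.
  await Promise.all(users.map(user => User.create(user)));
  await Promise.all(songs.map(song => Song.create(song)));
  await Promise.all(bookmarks.map(bookmark => Bookmark.create(bookmark)));
  await Promise.all(histories.map(history => History.create(history)));
});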