I'm working with an API that has very strict rate limits, and I need to send a number of requests to the same endpoint, one per name in an array. I set up a simple demo project and tried this (and many variants of it):
const axios = require('axios');
const { concat, from } = require('rxjs');
const { delay, map } = require('rxjs/operators');
const pokemon = ['ditto', 'bulbasaur', 'charizard', 'pikachu'];
const obs = pokemon.map((pk, index) => {
return from(axios.get(`https://pokeapi.co/api/v2/pokemon/${pk}`)).pipe(delay(1000),map(res => {
return {id: res.data.id, name: res.data.name, height: res.data.height};
}));
});
concat(obs).subscribe(data => {
console.log(data);
});
but the axios.get() calls all fire off as soon as they are created, and the concat().subscribe() just logs the 4 observables. If I subscribe to each from().pipe() instead, then after a second all four results log at once, but then I'm subscribing inside a subscribe, which is poor.
The solution I settled on feels so cumbersome I have to believe there is a better way:
const axios = require('axios');
const { forkJoin, from } = require('rxjs');
const { map } = require('rxjs/operators');
const pokemon = ['ditto', 'bulbasaur', 'charizard', 'pikachu'];
const obs = pokemon.map((pk, index) => {
return from(new Promise(resolve => setTimeout(async () => {
const prom = await axios.get(`https://pokeapi.co/api/v2/pokemon/${pk}`);
resolve(prom);
}, index*1000))).pipe(map(res => {
console.log('fetched: ', pk);
return {id: res.data.id, name: res.data.name, height: res.data.height};
}))
})
forkJoin(obs).subscribe(data => {
console.log(data);
});
This delays the creation of the axios.get() calls. If I run with node --require debugging-aid/network rxjs_axios_delay.js I can see the delayed network requests and the real API I am hitting is happy, but this feels complicated and not very "RXy".
Anyone got anything better?
but the axios.get() calls all fire off as soon as they are created
This highlights a very interesting trait of Promises: they are eager. I think the defer operator can come in handy here:
const axios = require('axios');
const { concat, defer } = require('rxjs');
const { delay, map } = require('rxjs/operators');
const pokemon = ['ditto', 'bulbasaur', 'charizard', 'pikachu'];
const obs = pokemon.map((pk, index) => {
return defer(() => axios.get(`https://pokeapi.co/api/v2/pokemon/${pk}`)).pipe(delay(1000),map(res => {
return {id: res.data.id, name: res.data.name, height: res.data.height};
}));
});
concat(...obs).subscribe(data => {
console.log(data);
});
StackBlitz demo.
The cool thing about defer is that it evaluates the given expression (i.e. invokes the callback function) only when it is subscribed to.
This means you could also do things like this:
let dep$ = of('A');
const src$ = defer(() => dep$);
if (someCondition) {
dep$ = of('B')
}
// if `someCondition` is true, `dep$` will be `of('B')`
src$.pipe(...).subscribe()
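For illustration, here is a small contrast sketch (mine, not part of the original answer, assuming the same axios/rxjs setup as above) showing when each request actually fires:

const axios = require('axios');
const { from, defer } = require('rxjs');

// from(promise) is eager: the HTTP request starts right here, at creation time,
// because calling axios.get() already kicks off the call.
const eager$ = from(axios.get('https://pokeapi.co/api/v2/pokemon/ditto'));

// defer(factory) is lazy: nothing happens yet; the factory (and the HTTP call)
// runs only when someone subscribes, once per subscriber.
const lazy$ = defer(() => axios.get('https://pokeapi.co/api/v2/pokemon/ditto'));

// Only now does the lazy$ request go out.
lazy$.subscribe(res => console.log(res.data.name));

This laziness is exactly what lets concat space the requests out: each deferred request is only created when concat subscribes to it.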
I've been dealing with this problem for a while and still can't tackle it.
I'm using React Query as a server-state management library, and I'm trying to keep my UI state synchronized with my server state when a mutation occurs. Since I can use the mutation response to avoid a new API call, I'm using the setQueryData feature that React Query gives us.
The problem is that the old data is being correctly modified when a mutation succeeds (I can see it in the React Query devtools), but the component using it isn't being re-rendered, so my UI state isn't synchronized with my server state (well, at least the user can't see the update).
Let me show some code and hope someone can give me some insights.
Component using the query:
const Detail = ({ orderId }) => {
const { workGroups } = useWorkGroups();
const navigate = useNavigate();
const queryClient = useQueryClient();
const orderQueries = queryClient.getQueryData(["orders"]);
const queryOrder = orderQueries?.find((ord) => ord.id === orderId);
// more code
Component mutating the query:
const Deliver = ({
setIsModalOpened,
artisan,
index,
queryQuantity,
queryOrder,
}) => {
const [quantity, setQuantity] = useState(() => queryQuantity);
const { mutate: confirmOrderDelivered } = useMutateOrderDeliveredByArtisan(
queryOrder.id
);
const onSubmit = () => {
confirmOrderDelivered(
{
id: queryOrder.artisan_production_orders[index].id,
artisan: artisan.user,
items: [
{
quantity_delivered: quantity,
},
],
},
{
onSuccess: () => setIsModalOpened(false),
}
);
};
// more code
Now the mutation function. I know it's a lot of logic, but I don't want to refetch the data with invalidateQueries, since we're dealing with users with a really bad internet connection. You don't need to understand each step of the function; what it basically does is update the old queried data. In the beginning I thought it was a mutation reference problem, since React uses strict comparison under the hood, but I checked that and it doesn't look like it's the problem:
{
onSuccess: (data) => {
queryClient.setQueryData(["orders"], (oldQueryData) => {
let oldQueryDataCopy = [...oldQueryData];
const index = oldQueryDataCopy.findIndex(
(oldData) => oldData.id === orderId
);
let artisanProdOrders =
oldQueryDataCopy[index].artisan_production_orders;
let artisanProductionOrderIdx = artisanProdOrders.findIndex(
(artProdOrd) => artProdOrd.id === data.id
);
artisanProdOrders[artisanProductionOrderIdx] = {
...artisanProdOrders[artisanProductionOrderIdx],
items: data.items,
};
const totalDelivered = artisanProdOrders.reduce((acc, el) => {
const delivered = el.items[0].quantity_delivered;
return acc + delivered;
}, 0);
oldQueryDataCopy[index] = {
...oldQueryDataCopy[index],
artisan_production_orders: artisanProdOrders,
items: [
{
...oldQueryDataCopy[index].items[0],
quantity_delivered: totalDelivered,
},
],
};
return oldQueryDataCopy;
});
},
onError: (err) => {
throw new Error(err);
},
}
And last but not least: I already checked that the oldQueryData is being correctly modified (by console logging in the onSuccess function of the mutation response) and, as I said before, the data is correctly modified in the React Query devtools.
I know this is a lot of code and the problem seems complex, but I really believe it might be a really easy thing that I'm just not seeing because of how tired I already am.
Thanks!
Well, I fixed it in the worst possible way imho, so I will answer this question, but I would really like to read your thoughts.
It looks like the new query data set on the expected query only re-renders the component if the mutation function is located in the component that we actually want to re-render.
With that in mind, what I did was simply colocate my mutation function in the parent component and pass it down through the child components.
Something like this:
const Detail = ({ orderId }) => {
  const { workGroups } = useWorkGroups();
  const navigate = useNavigate();
  const queryClient = useQueryClient();
  const orderQueries = queryClient.getQueryData(["orders"]);
  const queryOrder = orderQueries?.find((ord) => ord.id === orderId);
  // ==============> THE MUTATION FN IS NOW IN THE PARENT COMPONENT
  const { mutate: confirmOrderDelivered } = useMutateOrderDeliveredByArtisan(
    queryOrder.id
  );
  // more code
First child:
const Assigned = ({ artisan, queryOrder, index, confirmOrderDelivered }) => {
// THE IMPORTANT PART HERE IS THE PROP BEING PASSED DOWN.
<Modal
isOpen={isModalOpened}
toggleModal={setIsModalOpened}
// className="w312"
width="80%"
height="fit-content"
justifyCont="unset"
alignItems="unset"
>
<Deliver
setIsModalOpened={setIsModalOpened}
artisan={artisan}
queryQuantity={quantity}
queryOrder={queryOrder}
index={index}
confirmOrderDelivered={confirmOrderDelivered} => HERE
/>
</Modal>
Component that actually needs the mutation fn:
const Deliver = ({
setIsModalOpened,
artisan,
index,
queryQuantity,
queryOrder,
confirmOrderDelivered,
}) => {
const [quantity, setQuantity] = useState(() => queryQuantity);
const onSubmit = () => {
confirmOrderDelivered( => HERE.
{
id: queryOrder.artisan_production_orders[index].id,
artisan: artisan.user,
items: [
{
quantity_delivered: quantity,
},
],
}
);
};
You can't mutate the cached data or any prop.
You always need to create new versions of the objects and arrays, for example with the spread operator.
Example:
queryClient.setQueryData([QUERY_KEYS.MYKEY], (old) => {
  const myArray = [...old.myArray];

  // WRONG: mutating the existing item in place
  // myArray[0].name = 'text1';

  // CORRECT: replace the item with a new object
  myArray[0] = {
    ...myArray[0],
    name: 'text1',
  };

  return {
    ...old,
    myArray,
  };
});
I need a filter by time, both "old to new" and "new to old".
Here is my code:
const [paginationUsers, setPaginationUsers] = useState([]);

const timeNewToOld = () => {
  const newToOld = users.sort((a, b) => {
    return b.Time.localeCompare(a.Time);
  });
  setPaginationUsers(newToOld);
};

const timeOldToNew = () => {
  const oldToNew = users.sort((a, b) => {
    return a.Time.localeCompare(b.Time);
  });
  setPaginationUsers(oldToNew);
};
These functions work, but the browser doesn't update instantly.
I hope I can explain with these images:
I click the "new to old" function and nothing changes.
I move to the next page and come back to the first page, and everything is fine. Only the first time I click the function does it not update instantly; when I change the page, the order returns to normal.
paginationUsers is created here:
useEffect(() => {
const getAllData = async () => {
onSnapshot(_dbRef, (snapshot) => {
const data = snapshot.docs.map((doc) => {
return {
id: doc.id,
...doc.data(),
}
})
setUsers(data)
setUserPageCount(Math.ceil(data.length / 20))
})
}
getAllData()
}, [])
useEffect(() => {
displayUsers(users, setPaginationUsers, userCurrentPage)
}, [users, setPaginationUsers, userCurrentPage])
I hope I could explain it clearly.
Happy coding.
Array.prototype.sort sorts the array in place and returns the same array reference, so React can't tell that anything changed. Creating a new array first should help.
const timeOldToNew = () => {
const oldToNew = [...users].sort((a, b) => {
return a.Time.localeCompare(b.Time)
})
setPaginationUsers(oldToNew)
}
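For completeness, a sketch of the same fix applied to the other direction, reusing the function name from the question:

const timeNewToOld = () => {
  // Copy first so the sorted result is a new array reference;
  // the new reference is what lets React detect the state change.
  const newToOld = [...users].sort((a, b) => b.Time.localeCompare(a.Time));
  setPaginationUsers(newToOld);
};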
My Node.js application has to make some API requests, so I'm mocking their return values, as my tests only cover my application's business logic. However, there are two things I don't quite understand.
I'm using Jest's mockImplementation method to change the return value of my service, but I can't make it work without calling jest.mock on the service beforehand.
Also, if I try to set automock: true in my jest.config.js, it returns an error:
TypeError: Cannot set property 'gracefulify' of undefined
Here's my test.js code, where I'm testing a function that calls automation.js, which holds my application logic and makes the calls to my services:
const automation = require('../automations/fake.automation');
// MOCKS
const mockedBlingProduct = require('../mocks/bling-product.mocks.json');
const mockedShopifyCreatedProduct = require('../mocks/shopify-created-product.mocks.json');
// SERVICES
const BlingProductService = require('../services/bling-product.service');
const ShopifyProductService = require('../services/shopify-product.service');
jest.mock('../services/bling-product.service');
jest.mock('../services/shopify-product.service');
describe('Automation test', () => {
beforeEach(() => {
const blingMockedReturn = jest.fn(() => {
return mockedBlingProduct;
});
const shopifyMockedReturn = jest.fn(() => {
return mockedShopifyCreatedProduct;
});
BlingProductService.mockImplementation(() => {
return {
list: blingMockedReturn
};
});
ShopifyProductService.mockImplementation(() => {
return {
create: shopifyMockedReturn
};
});
});
it('should return status SUCCESS', async () => {
const result = await automation.run();
expect(result).toEqual({ status: 'SUCCESS' });
});
});
And here's the code for one of my services. Keep in mind that the logic behind the API calls is abstracted away from the service; with mockImplementation I'm trying to overwrite the list and create functions inside these services:
class BlingPriceService {
async list(query = {}) {
const httpMethod = 'GET';
const resource = 'produtos/page={pageNumber}/json';
const options = {
queryString: query,
urlParams: {
pageNumber: 1,
}
};
return blingComponent.request(httpMethod, resource, options);
}
}
module.exports = BlingPriceService;
const automation = require('../automations/fake.automation');
// MOCKS
const mockedBlingProduct = require('../mocks/bling-product.mocks.json');
const mockedShopifyCreatedProduct = require('../mocks/shopify-created-product.mocks.json');
// SERVICES
const BlingProductService = require('../services/bling-product.service');
const ShopifyProductService = require('../services/shopify-product.service');
describe('Automation test', () => {
beforeAll(() => {
jest.spyOn(BlingProductService.prototype, 'list').mockImplementation(() => Promise.resolve(mockedBlingProduct));
jest.spyOn(ShopifyProductService.prototype, 'create').mockImplementation(() => Promise.resolve(mockedShopifyCreatedProduct));
});
afterAll(() => {
jest.restoreAllMocks();
});
it('should return status SUCCESS', async () => {
const result = await automation.run();
expect(result).toEqual({ status: 'SUCCESS' });
});
});
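If it helps, the spies returned by jest.spyOn can also be kept in variables so the test can assert that the services were actually called. This is a sketch based on the answer above; whether each service is hit is an assumption about what fake.automation does:

describe('Automation test', () => {
  let listSpy;
  let createSpy;

  beforeAll(() => {
    // mockResolvedValue is shorthand for mockImplementation(() => Promise.resolve(...))
    listSpy = jest.spyOn(BlingProductService.prototype, 'list')
      .mockResolvedValue(mockedBlingProduct);
    createSpy = jest.spyOn(ShopifyProductService.prototype, 'create')
      .mockResolvedValue(mockedShopifyCreatedProduct);
  });

  afterAll(() => {
    jest.restoreAllMocks();
  });

  it('should return status SUCCESS and call both services', async () => {
    const result = await automation.run();
    expect(result).toEqual({ status: 'SUCCESS' });
    // The spies record their calls, so the test can also verify the services were used.
    expect(listSpy).toHaveBeenCalled();
    expect(createSpy).toHaveBeenCalled();
  });
});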
I would like to change the following code because it hits Firebase's 10-item limit for 'in' array queries. I want to loop through all teamIds and make a separate Firebase query for each individual teamId. The issue is that I'm not sure how to do this in a way that waits until all promises are complete before continuing.
This is the current code:
const unsubscribe = Firebase.firestore().collection('invites').where('teamId', 'in', teamIds).onSnapshot(snapshot => {
const invites = [];
snapshot.docs.forEach(doc => {
const data = doc.data();
if (!invites[data.teamId]) {
invites[data.teamId] = [];
}
invites[data.teamId].push(Object.assign({}, { id: doc.id }, doc.data()));
});
setTeamInvites(invites);
setLoading(false);
setError(false);
});
I'd like to change it to something like this:
teamIds.forEach(teamId => {
Firebase.firestore().collection('invites').where('teamId', '==', teamId).onSnapshot(snapshot => {
// Map the results to an array that will be stored in the pageState when all promises are complete
});
});
How can I do this?
I figured out I can do this with Promise.all; this is what I ended up with.
const promises = [];
teamIds.forEach(teamId => {
  promises.push(new Promise((resolve, reject) => {
    Firebase.firestore().collection('invites').where('teamId', '==', teamId).onSnapshot(snapshot => {
      const invites = [];
      invites[teamId] = [];
      snapshot.docs.forEach(doc => {
        const data = doc.data();
        invites[data.teamId].push(Object.assign({}, { id: doc.id }, data));
      });
      // Resolve once the whole snapshot has been mapped, not once per document
      resolve(invites);
    });
  }));
});
Promise.all(promises).then(allInvites => {
// Do what I needed to do with all of the invites here
});
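If live listeners per team aren't actually needed, the same idea works with one-shot reads. Here is a sketch assuming the namespaced (v8-style) Firebase SDK used above, where query.get() returns a promise of the snapshot:

const fetchInvitesForTeam = (teamId) =>
  Firebase.firestore()
    .collection('invites')
    .where('teamId', '==', teamId)
    .get() // one-time read instead of an onSnapshot listener
    .then(snapshot => ({
      teamId,
      invites: snapshot.docs.map(doc => ({ id: doc.id, ...doc.data() })),
    }));

Promise.all(teamIds.map(fetchInvitesForTeam)).then(results => {
  // results is an array of { teamId, invites } objects, one per team
});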
I have a WebSocket endpoint I am subscribing to. I want to get that data and then operate on it.
CODE:
// Simple HTTP POST Request. Works Perfectly. I am Logged In to the API
const authenticationRequest = () => axios.post(authenticationUrl, {
user: username, password
})
.then((response) => response.data)
.catch((error) => console.error('Error Response', error));
// WS Request. I need to wait for this to return my Data and then operate on them
const wsRequest = async () => {
// Getting the Auth Token. Working Perfectly.
const reqToken = await authenticationRequest();
// Hitting the ws Endplint. Working Perfectly.
const webSocketRequest = new WebSocket(topicDataUrl);
// Setting the Data for the First Message. Works Perfectly.
const firstMessage = {
token: reqToken,
stats: 2,
sql: "SELECT * FROM cc_payments LIMIT 100",
live: false
};
// Initialising an Empty Array. Works.
let websocketData = [];
// Opening the Endpoint
webSocketRequest.onopen = () => {
// Sending the first Message
webSocketRequest.send(JSON.stringify(firstMessage));
// On Each Message
webSocketRequest.onmessage = (streamEvent) => {
of(streamEvent).pipe(
map(event => JSON.parse(event.data)), // Parse the Data
filter(message => message.type === 'RECORD') // Filter the Data
).subscribe(
message => websocketData.push(message.data.value)// Adding each Value from each message to the Array.
);
};
};
console.log(JSON.stringify(websocketData), 'Websocket DATA'); // Empty Array
return websocketData;
};
Here I am calling it a few lines further down, but still with no results; I get an empty array.
(async function () {
const data = await wsRequest();
console.log(JSON.stringify(data), 'Data'); // Still Empty
}());
So, what am I doing wrong? Can someone explain the problem to me? I mean, I get the asynchronicity of things, but I am awaiting. I even tried setting a timeout, but that didn't work.
Is my stream correct? Maybe there is a problem there?
So, the RxJS operations are asynchronous, which means I need two things:
- Close the stream when the operation is complete (I tried takeUntil and takeWhile, but I was obviously doing something wrong).
- Wait in order to return the actual data (websocketData).
UPDATE:
async function authenticationRequest() {
const AuthenticateWith = await axios.post(authenticationUrl, {
user: username,
password
})
.then(response => response.data)
.catch((error) => console.error('Error:', error));
return AuthenticateWith;
}
const webSocketRequest = new WebSocket(topicDataUrl);
const websocketData = new Array;
const subject = new Subject();
async function requestToWSEndpoint() {
const reqToken = await authenticationRequest();
const firstMessage = {
token: reqToken,
stats: 2,
sql: "SELECT * FROM cc_payments LIMIT 100",
live: false
};
webSocketRequest.onopen = () => {
webSocketRequest.send(JSON.stringify(firstMessage));
webSocketRequest.onmessage = (streamEvent) => {
JSON.parse(streamEvent.data).type === 'RECORD' && websocketData.push(JSON.parse(streamEvent.data).data.value);
subject.next(websocketData);
JSON.parse(streamEvent.data).type === 'END' && subject.complete();
};
};
};
(async function () {
requestToWSEndpoint();
const chartData = subject.subscribe((event) => console.log(event, 'Event')); // Event Adds to the Array and at the End I have all my Items(Filtered). It printed 100 Times.
console.log('ARRAY', chartData); // This returns [Subscriber {closed: false, _parentOrParents: null, _subscriptions: Array(1), syncErrorValue: null, syncErrorThrown: false, …}]. This is what I want. The Array.
}());
My suggestion as outlined in my comment:
const subject = new Subject();
...
}
(async function () {
wsRequest();
subject.pipe(finalize(()=> {console.log('ARRAY', websocketData);})).subscribe();
}());
Actually you don't even need a function for what you do in wsRequest, except for const reqToken = await authenticationRequest();. The rest can easily be globally scoped.
You should take a look at the documentation for rxjs operators.
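Building on that suggestion, here is a minimal sketch (my own, assuming RxJS 7's firstValueFrom and the question's message shape of { type, data: { value } }) that pushes every WebSocket message through a Subject and resolves with all RECORD values once the END message arrives:

const { Subject, firstValueFrom } = require('rxjs');
const { map, filter, takeWhile, toArray } = require('rxjs/operators');

function collectRecords(webSocketRequest, firstMessage) {
  const messages$ = new Subject();

  webSocketRequest.onopen = () => webSocketRequest.send(JSON.stringify(firstMessage));
  webSocketRequest.onmessage = (streamEvent) => messages$.next(JSON.parse(streamEvent.data));
  webSocketRequest.onerror = (err) => messages$.error(err);

  // Complete the stream on the END message, keep only RECORD payloads,
  // collect them into one array, and expose the result as a promise.
  return firstValueFrom(
    messages$.pipe(
      takeWhile((message) => message.type !== 'END'),
      filter((message) => message.type === 'RECORD'),
      map((message) => message.data.value),
      toArray()
    )
  );
}

// Usage: const websocketData = await collectRecords(webSocketRequest, firstMessage);

Because the returned value is a promise, the calling code can simply await it instead of trying to read websocketData before the messages have arrived.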