I want to implement observable of array that auto-refresh itself. I do it like this:
const list$ = Observable.create(function (observer) {
  getList(list => observer.next(list));
});
const liveList$ = Observable.interval(2000).switchMapTo(list$);
When I subscribe to the liveList$ stream, I get values only after n ms, as expected.
The question is: how can I get the first getList result immediately, and then each subsequent call on the interval?
P.S. I've tried list$.switchMapTo(liveList$).subscribe(console.log) but nothing changed in the behaviour of the chain.
Use the timer operator instead. It can be passed an initial delay, as well as a period:
Observable.timer(0, 2000).switchMapTo(list$);
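For clarity, here is a plain-JavaScript sketch of what timer(0, 2000) gives you over interval(2000): one emission immediately, then one per period. The helper name startPolling and its callback shape are illustrative, not part of RxJS.

```javascript
// Emit once right away, then on every tick. Returns an "unsubscribe"
// function that stops further emissions, roughly like an RxJS subscription.
function startPolling(getList, periodMs, onNext) {
  let cancelled = false;
  const tick = () => getList(list => { if (!cancelled) onNext(list); });
  tick();                                   // immediate first emission
  const id = setInterval(tick, periodMs);   // periodic emissions afterwards
  return () => { cancelled = true; clearInterval(id); };
}
```

With interval(2000) the first tick only happens after the initial 2000ms wait; timer(0, 2000) (or the sketch above) front-loads the first call.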
I want a variable states$: an observable stream of objects, each object containing a member nextState$ of type observable. This property nextState$ emits a single item corresponding to the next state, and so on...
example:
const states$ = of({ nextState$: createObservableWithNextState$() }).pipe(
switchMap(state => state.nextState$),
switchMap(state => state.nextState$),
switchMap(state => state.nextState$),
switchMap(state => state.nextState$),
...
)
of course it doesn't work, for at least two reasons:
I don't want the ... infinite repetition of switchMap in my code
I want to subscribe to states$ and receive each object (including the initial one in the of())
Of course I could create my own observable from scratch, but first I'd like to know whether it's possible with existing RxJS operators. Any idea?
RxJS#expand
Expand should do what you're after pretty simply.
I assume at some point you'll reach a state without a nextState$, but you can change that condition easily.
const states$ = of({
nextState$: createObservableWithNextState$()
}).pipe(
expand(state => state.nextState$ != null ? state.nextState$ : EMPTY)
);
Expand is closer to mergeMap than to switchMap. You can set its concurrent argument to 1 to make it behave like concatMap. If you're really after switchMap-like behaviour, this gets a bit more complicated.
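To see what expand with concurrent = 1 is doing, here is a plain-Promise analogue (a sketch, not RxJS): follow each state's nextState until a state has no nextState, collecting every state along the way, including the initial one. The names unfoldStates and nextState are illustrative.

```javascript
// Recursively unfold a chain of states, one at a time, in order.
// Each state's nextState is assumed to be a promise of the next state.
async function unfoldStates(initial) {
  const states = [];
  let current = initial;
  while (current != null) {
    states.push(current);                                 // emit this state
    current = current.nextState ? await current.nextState : null; // recurse
  }
  return states;
}
```

This mirrors the expand pipeline above: the initial object is emitted first, then each projected inner result is itself projected again until the EMPTY-equivalent (no nextState) ends the recursion.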
I am trying to build switches that create an array and have this so far:
const [playgroundFilters, setPlaygroundFilters] = useState([initialF]);
const updateItem = (whichvalue, newvalue) => {
let g = playgroundFilters[0];
g[whichvalue] = newvalue;
setPlaygroundFilters(g, ...playgroundFilters.slice());
console.log(playgroundFilters);
};
When I call updateItem onPress, it works once, and every subsequent time I get the error "undefined is not an object evaluating g".
Is there an easy way to fix this?
setPlaygroundFilters expects an array, so you would need to call it like this:
setPlaygroundFilters([g, ...playgroundFilters.slice()]);
instead of
setPlaygroundFilters(g, ...playgroundFilters.slice());
I'm not sure you actually want to use .slice() like that here, since it just returns a (shallow) copy of the same array.
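There is a second issue worth noting: `g[whichvalue] = newvalue` mutates the object currently held in state, which React state updates should avoid. A minimal sketch of an immutable version of the update (plain JS, no React; the function name is mine):

```javascript
// Return a new array whose first element is a fresh object with one key
// changed; neither the original array nor the original object is mutated,
// so React sees new references and re-renders reliably.
function updateFirstFilter(filters, whichvalue, newvalue) {
  const [first, ...rest] = filters;
  return [{ ...first, [whichvalue]: newvalue }, ...rest];
}
```

In the component this would be used as `setPlaygroundFilters(updateFirstFilter(playgroundFilters, whichvalue, newvalue))`.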
I'm trying to make some checks before saving an array of objects (objects[]) to the DB (MongoDB using mongoose):
Those objects are already sorted by date, so objects[0].date is lower than objects[1].date.
Each object should check that the last related saved object has a different value (to avoid saving the same info twice). This means I have to query the DB before each save to make that check, AND each of these objects MUST be stored in order to be able to make the check against the right object. If the objects are not stored in order, the last related saved object might not be the correct one.
In-depth explanation:
An HTTP request is sent to an API. It returns an array of objects (sorted by date) that I want to process and save in my MongoDB (using mongoose). I have to iterate through all these objects and, for each:
Look for the previous related object stored on DB (which COULD BE one of that array).
Check some values between the 'already stored' and the object to save to evaluate if new object must be saved or could be discarded.
Save it or discard it, and then jump to next iteration.
It's important to wait for each iteration to finish because:
Items in the array MUST be stored in the DB in order: first those with the lowest date, because each could be modified by some object stored later with a higher date.
If the next iteration starts before the previous one has finished, the query that searches for the previous object might not find it if it hasn't been stored yet.
Already tried:
Using promises or async/await in forEach/for loops only makes each iteration async, but still launches all iterations at once.
I've tried using async/await functions inside forEach/for loops, even creating my own asyncForEach function as shown below, but none of this has worked:
Array.prototype.asyncForEach = function(fn) {
return this.reduce(
(promise, n) => promise.then(() => fn(n)),
Promise.resolve()
);
};
Test function:
let testArray = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
testArray.asyncForEach(function(element) {
setTimeout(() => {
console.log(element);
}, Math.random() * 500);
});
The example above should print the numbers in order every time. It's not a problem if the inner function (setTimeout in the example) has to return a promise.
What I think I need is a loop that waits some function/promise between iterations, and only starts the next iteration when the first is already finished.
How could I do that? Thank you in advance!
const myArray = ['a','b','c','d'];
async function wait(ms) { // comment 3
return new Promise(resolve => setTimeout(resolve, ms));
}
async function doSomething() {
await myArray.reduce(async (promise, item) => {
await promise; // comment 2
await wait(1000);
// here we could await something else that is async like DB call
document.getElementById('results').append(`${item} `);
}, Promise.resolve()); // comment 1
}
setTimeout(() => doSomething(), 1000);
<div id="results">Starting in 1 second <br/></div>
You can also use reduce with async/await, which you already said you've tried.
Basically, if you read how reduce works, you can see that it accepts two parameters: first the callback to execute on each step, and second an optional initial value.
In the callback, the first argument is the accumulator, which receives whatever the previous step returned, or the optional initial value on the first step.
1) You pass Promise.resolve() as the initial value so that the first step can start.
2) Because of this await promise, you will never move to the next step until the previous one has finished, since the accumulator is the promise returned by the previous (async) callback. We are not explicitly resolving a promise here; as soon as the previous step finishes, its promise resolves implicitly and the next step begins.
3) You can, for example, add await wait(30) to throttle Ajax requests and avoid sending too many requests to third-party APIs; then there is no way you will send more than 1000/30 requests per second, even if your code executes really fast on your machine.
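The three points above generalize into a small reusable helper. A sketch (the name forEachSeries is mine, not from the answer), which also collects each step's result in order:

```javascript
// Run fn over items strictly one at a time. Each step awaits the
// accumulator promise before starting, exactly like the reduce snippet
// above, and appends its result so the caller gets everything in order.
function forEachSeries(items, fn) {
  return items.reduce(
    (promise, item) =>
      promise.then(results => fn(item).then(result => [...results, result])),
    Promise.resolve([])
  );
}
```

Usage: `forEachSeries(objects, obj => checkAndSave(obj))` would query, check, and save one object at a time, in array order, which is exactly the guarantee the question asks for (checkAndSave here stands in for your own DB logic).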
Hmm, OK, I'm not 100% sure I understand your question correctly. But if you want to perform an async array operation that awaits your logic for each item, you can do it as follows:
async function loadAllUsers() {
  const test = [1, 2, 3, 4];
  const users = [];
  for (const item of test) {
    // make some magic or transform data here; await any async work per item
    users.push(item);
  }
  return users;
}
Then you can simply invoke this function with "await". I hope that helps you.
In your asyncForEach function you are chaining promises, but setTimeout doesn't return a promise. If you wrap your setTimeout in a promise, it will work as expected.
Here is the modified code:
testArray.asyncForEach(function(element) {
return new Promise((resolve, reject) => {
setTimeout(() => {
console.log(element);
return resolve(element)
}, Math.random() * 500);
})
});
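Putting the two pieces together as a self-contained sketch (the reduce-based asyncForEach from the question as a standalone function, plus a promise-returning callback):

```javascript
// Chain one promise per item so each callback only starts after the
// previous one's promise has resolved.
function asyncForEach(items, fn) {
  return items.reduce((promise, n) => promise.then(() => fn(n)), Promise.resolve());
}

const order = [];
asyncForEach([1, 2, 3], element =>
  new Promise(resolve =>
    setTimeout(() => {
      order.push(element); // elements are appended in input order
      resolve();
    }, Math.random() * 50)
  )
).then(() => console.log(order));
```

Even though each timeout has a random delay, the chaining guarantees the pushes happen in array order.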
I have a situation where I need to output a certain number of elements on a user's action (let's say a click), and every element has to appear 500ms after the previous one.
The array of elements is stored in state, and new elements are added inside a setInterval running inside a useEffect hook.
Here is the problem: inside the hook and the interval I don't have access to the length of the array, and it can't be passed as a dependency since that would result in an infinite loop. But I have to stop the interval after a certain number of elements have been displayed.
I found a solution by keeping the setInterval iteration count inside the hook. I wonder if that's correct or if there is a better/more idiomatic approach.
I created a simple example of the problem on codesandbox: https://codesandbox.io/s/xo879wn08z
You can clear the interval in the state updater callback, where you will have access to the numbers array:
setNumbers(numbers => {
if (numbers.length === 9) {
clearInterval(interval);
}
return [...numbers, Math.random()]
});
Working demo
How about using setTimeout instead of setInterval? You'll create a new timeout each time, but only while the length is less than 10.
useEffect(() => {
let id;
if (isCounting && numbers.length < 10) {
id = setTimeout(() => setNumbers([...numbers, Math.random()]), 500);
}
return () => clearTimeout(id);
}, [isCounting, numbers]);
I have a situation where the value that I want to add depends on a future value i.e.
let metadata = {videoId: 123, likes: 400};
let addSubtitles = R.assoc('subtitles', R.__, R.__);
Here addSubtitles is a partially applied function. However, the second argument of addSubtitles is supposed to be the subtitles, but the subtitles themselves depend on the third argument i.e. metadata.videoId. To be called as follows:
addSubtitles(metadata); //but with another argument perhaps?
const getSubtitles = async (videoId) => await apiCall(videoId);
Any way to solve this in a functional manner? It seems possible if I were to bind the context of the future third argument, but I'm unsure how to go about doing this.
Please let me know if there is any extra information needed to answer.
This is a working solution, though I was looking to use Ramda to make this work:
const addSubs = async function(el) {
  const subtitles = await writeCaptions(el.videoId);
  return R.assoc('subtitles', subtitles, el);
};
First let me back up a moment to note that you don't need to use R.__ for later values in the signature. Almost every function in the Ramda library is auto-curried, so you can call it with all arguments or just some subset to preload data. R.__ is only used for holding a space for future calls.
So what you want to do with currying is just keep adding parameters one at a time until you get a complete function ready to call. When the order is wrong, you can use R.flip or R.__ (depending on context) to reach the values you are ready to fill, and come back to the values you don't know yet.
So from your description, it sounds like your concern is that getSubtitles(metadata) is a network call that returns a promise. This example will proceed under that assumption. Here I think R.flip will be more expressive for you than R.__.
const metadata = { videoId: 123, likes: 400 }
const addSubtitles = R.flip(R.assoc('subtitles'))
// Now you have a curried function that takes 2 arguments, first the metadata, and then the subtitles.
const withSubtitlesPromise = getSubtitles(metadata)
// Here we drill down a little further by adding the metadata, and then
// pass it as the callback to .then, which will pass in the final param
// once it resolves
.then(addSubtitles(metadata))
withSubtitlesPromise.then(console.log)
You could definitely capture all of this logic in a single function that takes in metadata and returns a promise for subtitled data. For good measure, we'll also pass in getSubtitles as a dependency, for easier testing and weaker coupling. Now it's trivial to swap in another function for retrieving subtitle data. In this case, the use of R.__ makes the code a little cleaner, so we'll switch it up.
// Define all the logic in a single easy function
const makeAddSubtitles = getSubtitles => metadata =>
  getSubtitles(metadata).then(R.assoc('subtitles', R.__, metadata))
// Then push in the dependency
const addSubtitles = makeAddSubtitles(getSubtitles)
// Use it
const vid123 = { videoId: 123, likes: 400 }
const vid123WithSubsPromise = addSubtitles(vid123)
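For comparison, the same data flow works without Ramda. A sketch, assuming (as above) that getSubtitles takes the metadata and returns a promise of the subtitles:

```javascript
// Plain-JS equivalent of makeAddSubtitles: fetch the subtitles, then merge
// them into a new object so the original metadata is left untouched.
const makeAddSubtitlesPlain = getSubtitles => metadata =>
  getSubtitles(metadata).then(subtitles => ({ ...metadata, subtitles }));
```

The Ramda version buys you point-free composition with the rest of a Ramda pipeline; the spread version may be easier to read if the codebase isn't otherwise using Ramda.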