I am trying to run multiple countdowns from an array, e.g. [11, 12, 13, 14, 15, 16]. What I would like to achieve is: the first time, the countdown is set to 11; when it reaches 0, the timer is set to 12 and counts down to 0 again. After that it resets to 13 and counts down, then to 14, and so on.
However, my code only counts down from 11 to 0 and then stops. The for loop never seems to put the second item, 12, into the timer. I later found out it is because the return inside the for loop breaks out of it. I wonder if there is a smart way to avoid return in a for loop? Or, how do I use return in a for loop without breaking out of the loop?
(I have another counter there called totalTime, which is intended to accumulate the total time it takes, i.e. 11 + 12 + 13 + 14, etc.)
The Timer and Display Screen:
import React, {useState, useEffect} from 'react';
import {View, Text, StyleSheet, FlatList} from 'react-native';
import PlaySounds from './PlaySounds';

const CustomTimer = ({time, length}) => {
  const [seconds, setSeconds] = useState(time);
  const [totalTime, setTotalTime] = useState(0);

  useEffect(() => {
    if (seconds > 0) {
      const interval = setInterval(() => {
        setSeconds(seconds => seconds - 1);
        setTotalTime(total => total + 1);
      }, 1000);
      return () => clearInterval(interval);
    }
  }, [seconds]);

  return (
    <View>
      <Text style={{fontSize: 20}}> Count Down: {seconds} sec</Text>
      <Text style={{fontSize: 20}}> Total Time: {totalTime} sec</Text>
    </View>
  );
};

export default CustomTimer;
=====================
import React, {useEffect, useState} from 'react';
import {SafeAreaView, View, Button, ScrollView, Text, StyleSheet} from 'react-native';
import CustomTimer from '../component/CustomTimer';

const BrewScreen = () => {
  const timeArray = [11, 12, 13, 14, 15, 16];
  const length = timeArray.length;

  const countDownArray = () => {
    for (let i = 0; i < length; i++) {
      return (<CustomTimer time={timeArray[i]} length={length} />);
    }
  };

  return (
    <>
      <ScrollView>
        {countDownArray()}
      </ScrollView>
    </>
  );
};
The issue seems to be that you return inside the for loop.
This means that during the first iteration of the for loop in the countDownArray function you return the Timer element. When you do this, the function will exit and so the loop will not continue.
To achieve your desired behaviour you instead need a different way of creating the Timer elements. This will likely require a callback prop on the Timer element, which can be used to update the state of the BrewScreen and advance to the next Timer.
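A rough sketch of that approach might look like the following. The onFinished prop and the key usage are my own assumptions, not part of the original code; CustomTimer would need to call onFinished when its countdown reaches 0.

// Hypothetical sketch: BrewScreen keeps the current index in state and
// advances it when the active CustomTimer reports that it has finished.
const BrewScreen = () => {
  const timeArray = [11, 12, 13, 14, 15, 16];
  const [index, setIndex] = useState(0);

  return (
    <ScrollView>
      {index < timeArray.length && (
        <CustomTimer
          key={index} // remount so useState(time) picks up the new time
          time={timeArray[index]}
          onFinished={() => setIndex(i => i + 1)} // hypothetical callback prop
        />
      )}
    </ScrollView>
  );
};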
I finally solved it, not with loops but with the almighty useEffect:
import React, {useState, useEffect} from 'react';
import {View, Text} from 'react-native';
import PlaySounds from './PlaySounds';

const CustomTimer = ({time}) => {
  const [seconds, setSeconds] = useState(time[0]);
  const [count, setCount] = useState(1);
  const [totalTime, setTotalTime] = useState(0);

  useEffect(() => {
    if (seconds >= 0) {
      const interval = setInterval(() => {
        setSeconds(seconds => seconds - 1);
        setTotalTime(total => total + 1);
        console.log('seconds ', seconds, ' totalTime ', totalTime);
      }, 1000);
      return () => clearInterval(interval);
    } else if (count < time.length) {
      setCount(count => count + 1);
      setSeconds(time[count]);
    } else {
      return;
    }
  });

  return (
    <View>
      <Text style={{fontSize: 20}}> Count Down: {seconds} sec</Text>
      <Text style={{fontSize: 20}}> Total Time: {totalTime} sec</Text>
      {seconds === 3 ? <PlaySounds /> : null}
    </View>
  );
};
Having a return inside a for loop will break out of the loop.
JS will execute all the iterations in one go, so you would need a Promise with async/await to space them out.
return won't break out of a forEach loop.
However, async/await does not work as expected inside forEach.
useEffect effectively runs the loop for you, re-running on each state change, with proper support for the interval timing.
In my case, where I want to run a count-down clock, setInterval is much preferred over setTimeout, since setInterval fires again after every interval you specify.
Remember to clean up your interval with clearInterval.
There is one thing I don't understand: the initial value of count has to be 1 instead of 0. It seems like count is forced back to its initial value the first time it is read; if I initialize it as 0, the first number in the array gets executed twice.
Also, the final number displayed in Count Down is -1. I know that is how the useEffect is stopped, but I wonder how to avoid showing a negative number?
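One small adjustment that avoids rendering the negative value (my own suggestion, not from the original post) is to clamp the displayed number while leaving the internal counter untouched:

// seconds still reaches -1 internally (that's what stops the effect),
// but the UI never shows a number below 0.
<Text style={{fontSize: 20}}> Count Down: {Math.max(seconds, 0)} sec</Text>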
I'm learning React Native and I'm building a small app that records the time slept for each day.
I'm using useEffect() to trigger some updates to the values shown on screen; one of those is the average time, average(), which I update inside that useEffect:
const [updateAction, setUpdateAction] = useState(false);

useEffect(() => {
  console.log("lenght:" + registSleep.length);
  var registed = false;
  if (!isNaN(enteredHours)) {
    for (var i = 0; i < registSleep.length; i++) {
      if (
        registSleep[i].day === selectedDay &&
        registSleep[i].month === selectedMonth &&
        registSleep[i].year === selectedYear
      ) {
        registed = true;
        registSleep[i].hours = enteredHours;
      }
    }
    if (!registed) {
      var newReg = {
        day: selectedDay,
        month: selectedMonth,
        year: selectedYear,
        hours: enteredHours,
      };
      setNewRegist((prevReg) => [
        ...prevReg,
        newReg,
      ]);
    }
    if (registSleep.length != 0) {
      average();
    }
  }
  console.log("2. lenght:" + registSleep.length);
  setviewInfoAction(!viewInfoAction);
}, [updateAction]);
To debug, as you can see, I print the length to the console before I add a new value to the array of regists with setNewRegist(...), and again at the end of the effect. As far as I know it should print lenght: 0 and then 2. lenght: 1, but instead it prints lenght: 0 and then 2. lenght: 0, and only on the next trigger lenght: 1 and then 2. lenght: 1.
Why is the array not updating on the addition?
I'm assuming setNewRegist is a useState hook setter, whose value is registSleep:
const [registSleep, setNewRegist] = useState([ ... ])
There are two reasons why it's not working. The useState hook updates asynchronously; the surrounding logic will not stop and wait for the state update inside setState.
setNewRegist( ... update registSleep)
console.log(registSleep) // will run before setState finishes
However, even if it did finish in time, registSleep was already captured at a fixed value for this render, so it will not change unless the component re-renders, which is exactly what setState does: it triggers the component to re-render.
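A stripped-down illustration of that behaviour (hypothetical component state, not taken from the question):

const [items, setItems] = useState([]);

const addItem = (item) => {
  setItems(prev => [...prev, item]); // schedules the update and a re-render
  console.log(items.length);         // still logs the old length: this closure
                                     // captured `items` before the update applied
};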
// I am considering this
const [registSleep, setNewRegist] = useState([])
setNewRegist is an async function, so the next statement, in your case the console.log, will execute first, and it won't have the updated registSleep yet.
So how do you check it correctly?
Tada!!! You can check each update of registSleep via useEffect, like this:
useEffect(() => {
// Here you can get updated length
// as soon as registSleep updates
console.log(registSleep.length);
},[registSleep]) // <-- watch for any update on `registSleep`
I have a bunch of events to send up to a service. But the requests are rate limited and each request has a count limit:
1 request per second: bufferTime(1000)
100 event items per request: bufferCount(100)
The problem is, I am not sure how to combine them in a way that makes sense.
Allowing pass-through
Complicating this further, I need to make sure that events go through instantaneously if we don't hit either limit.
For example, I don't want it to actually wait for 100 event items before letting it go through if it's only one single event during a non-busy time.
Legacy API
I also found that there was a bufferWithTimeOrCount that existed in RxJS v4, although I am not sure how I'd use that even if I had it.
Test playground
Here is a JSBin I made for you to test your solution:
http://jsbin.com/fozexehiba/1/edit?js,console,output
Any help would be greatly appreciated.
The bufferTime() operator takes three parameters, which combine the functionality of bufferTime and bufferCount. See http://reactivex.io/rxjs/class/es6/Observable.js~Observable.html#instance-method-bufferTime.
With .bufferTime(1000, null, 3) you can emit a buffer every 1000ms, or as soon as it reaches 3 items. However, this means it doesn't guarantee a 1000ms delay between buffers.
So you could use something like this, which is pretty easy to use (buffers at most 3 items for at most 1000ms):
click$
.scan((a, b) => a + 1, 0)
.bufferTime(1000, null, 3)
.filter(buffer => buffer.length > 0)
.concatMap(buffer => Rx.Observable.of(buffer).delay(1000))
.timestamp()
.subscribe(console.log);
See live demo: http://jsbin.com/libazer/7/edit?js,console,output
The only difference from what you probably wanted is that the first emission might be delayed by more than 1000ms. This is because both the bufferTime() and delay(1000) operators introduce a delay, to ensure that there's always at least a 1000ms gap.
I hope this works for you.
Operator
events$
.windowCount(10)
.mergeMap(m => m.bufferTime(100))
.concatMap(val => Rx.Observable.of(val).delay(100))
.filter(f => f.length > 0)
Doc
.windowCount(number) : [ Rx Doc ]
.bufferTime(number) : [ Rx Doc ]
Demo
// test case
const mock = [8, 0, 2, 3, 30, 5, 6, 2, 2, 0, 0, 0, 1]
const tInterval = 100
const tCount = 10

Rx.Observable.interval(tInterval)
  .take(mock.length)
  .mergeMap(mm => Rx.Observable.range(0, mock[mm]))
  // start
  .windowCount(tCount)
  .mergeMap(m => m.bufferTime(tInterval))
  .concatMap(val => Rx.Observable.of(val).delay(tInterval))
  .filter(f => f.length > 0)
  // end
  .subscribe({
    next: (n) => console.log('Next: ', n),
    error: (e) => console.log('Error: ', e),
    complete: (c) => console.log('Completed'),
  })
<script src="https://unpkg.com/rxjs/bundles/Rx.min.js"></script>
Updated
After more testing, I found that the answer above has some problems in extreme conditions. I think they are caused by .window() and .concat(), and then I found this warning in the docs for concatMap:
Warning: if source values arrive endlessly and faster than their corresponding inner Observables can complete, it will result in memory issues as inner Observables amass in an unbounded buffer waiting for their turn to be subscribed to.
However, I think the right way to limit the request rate is probably to limit the cycle time of the requests instead. In your case, just ensure there is at most 1 request per 10 milliseconds. It is simpler, and may be a more efficient way to control the requests.
Operator
const tInterval = 100
const tCount = 10
const tCircle = tInterval / tCount
const rxTimer = Rx.Observable.timer(tCircle).ignoreElements()
events$
.concatMap(m => Rx.Observable.of(m).merge(rxTimer)) // more accurate than `.delay()`
// .concatMap(m => Rx.Observable.of(m).delay(tCircle))
or
events$
.zip(Rx.Observable.interval(tCircle), (x,y) => x)
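For what it's worth, a rough pipeable-syntax equivalent of that zip approach for RxJS 6+ might look like this (my own sketch, assuming the same events$ source and tCircle spacing as above):

import { zip, interval } from 'rxjs';
import { map } from 'rxjs/operators';

// Pair each event with a timer tick, then keep only the event,
// so at most one event is emitted per tCircle milliseconds.
const limited$ = zip(events$, interval(tCircle)).pipe(
  map(([event]) => event)
);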
I've modified the answer I gave to this question to support your use case of adding a limited number of values (i.e. events) to pending requests.
The comments within should explain how it works.
Because you need to keep a record of the requests that have been made within the rate limit period, I don't believe that it's possible to use the bufferTime and bufferCount operators to do what you want - a scan is required so that you can maintain that state within the observable.
function rateLimit(source, period, valuesPerRequest, requestsPerPeriod = 1) {
  return source
    .scan((requests, value) => {
      const now = Date.now();
      const since = now - period;
      // Keep a record of all requests made within the last period. If the
      // number of requests made is below the limit, the value can be
      // included in an immediate request. Otherwise, it will need to be
      // included in a delayed request.
      requests = requests.filter((request) => request.until > since);
      if (requests.length >= requestsPerPeriod) {
        const leastRecentRequest = requests[0];
        const mostRecentRequest = requests[requests.length - 1];
        // If there is a request that has not yet been made, append the
        // value to that request if the number of values in that request
        // is below the limit. Otherwise, another delayed request will be
        // required.
        if (
          (mostRecentRequest.until > now) &&
          (mostRecentRequest.values.length < valuesPerRequest)
        ) {
          mostRecentRequest.values.push(value);
        } else {
          // until is the time until which the value should be delayed.
          const until = leastRecentRequest.until + (
            period * Math.floor(requests.length / requestsPerPeriod)
          );
          // concatMap is used below to guarantee the values are emitted
          // in the same order in which they are received, so the delays
          // are cumulative. That means the actual delay is the difference
          // between the until times.
          requests.push({
            delay: (mostRecentRequest.until < now) ?
              (until - now) :
              (until - mostRecentRequest.until),
            until,
            values: [value]
          });
        }
      } else {
        requests.push({
          delay: 0,
          until: now,
          values: [value]
        });
      }
      return requests;
    }, [])
    // Emit only the most recent request.
    .map((requests) => requests[requests.length - 1])
    // If multiple values are added to the request, it will be emitted
    // multiple times. Use distinctUntilChanged so that concatMap receives
    // the request only once.
    .distinctUntilChanged()
    .concatMap((request) => {
      const observable = Rx.Observable.of(request.values);
      return request.delay ? observable.delay(request.delay) : observable;
    });
}
const start = Date.now();
rateLimit(
Rx.Observable.range(1, 250),
1000,
100,
1
).subscribe((values) => console.log(
`Request with ${values.length} value(s) at T+${Date.now() - start}`
));
<script src="https://unpkg.com/rxjs#5/bundles/Rx.min.js"></script>
I would like to make a series of requests to a server, but the server has a hard rate limit of 10 requests per second. If I try to make the requests in a loop, they will hit the rate limit since they all fire at the same time.
for (let i = 0; i < 20; i++) {
  sendRequest();
}
ReactiveX has lots of tools for modifying observable streams, but I can't seem to find the tools to implement rate limiting. I tried adding a standard delay, but the requests still fire at the same time, just 100ms later than they did previously.
const queueRequest$ = new Rx.Subject<number>();

queueRequest$
  .delay(100)
  .subscribe(queueData => {
    console.log(queueData);
  });

const queueRequest = (id) => queueRequest$.next(id);

function fire20Requests() {
  for (let i = 0; i < 20; i++) {
    queueRequest(i);
  }
}

fire20Requests();
setTimeout(fire20Requests, 1000);
setTimeout(fire20Requests, 5000);
The debounceTime and throttleTime operators are similar to what I'm looking for as well, but those are lossy rather than lossless. I want to preserve every request that I make, instead of discarding the earlier ones.
...
queueRequest$
.debounceTime(100)
.subscribe(queueData => {
sendRequest();
});
...
How do I make these requests to the server without exceeding the rate limit using ReactiveX and Observables?
The implementation in the OP's self answer (and in the linked blog) always imposes a delay which is less than ideal.
If the rate-limited service allows for 10 requests per second, it should be possible to make 10 requests in, say, 10 milliseconds, as long as the next request is not made for another 990 milliseconds.
The implementation below applies a variable delay to ensure the limit is enforced and the delay is only applied to requests that would see the limit exceeded.
function rateLimit(source, count, period) {
  return source
    .scan((records, value) => {
      const now = Date.now();
      const since = now - period;
      // Keep a record of all values received within the last period.
      records = records.filter((record) => record.until > since);
      if (records.length >= count) {
        // until is the time until which the value should be delayed.
        const firstRecord = records[0];
        const lastRecord = records[records.length - 1];
        const until = firstRecord.until + (period * Math.floor(records.length / count));
        // concatMap is used below to guarantee the values are emitted
        // in the same order in which they are received, so the delays
        // are cumulative. That means the actual delay is the difference
        // between the until times.
        records.push({
          delay: (lastRecord.until < now) ?
            (until - now) :
            (until - lastRecord.until),
          until,
          value
        });
      } else {
        records.push({
          delay: 0,
          until: now,
          value
        });
      }
      return records;
    }, [])
    .concatMap((records) => {
      const lastRecord = records[records.length - 1];
      const observable = Rx.Observable.of(lastRecord.value);
      return lastRecord.delay ? observable.delay(lastRecord.delay) : observable;
    });
}
const start = Date.now();
rateLimit(
Rx.Observable.range(1, 30),
10,
1000
).subscribe((value) => console.log(`${value} at T+${Date.now() - start}`));
<script src="https://unpkg.com/rxjs#5/bundles/Rx.min.js"></script>
This blog post does a great job of explaining that RxJS excels at discarding events, and how they arrived at the answer, but ultimately the code you're looking for is:
queueRequest$
.concatMap(queueData => Rx.Observable.of(queueData).delay(100))
.subscribe(() => {
sendRequest();
});
concatMap concatenates the newly created observable onto the back of the observable stream. Additionally, using delay pushes back each event by 100ms, allowing 10 requests to happen per second. You can view the full JSBin here, which logs to the console instead of firing requests.
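For newer RxJS versions (6+), the same idea in pipeable-operator syntax would look roughly like this (a sketch under that assumption, reusing the sendRequest from the question):

import { Subject, of } from 'rxjs';
import { concatMap, delay } from 'rxjs/operators';

const queueRequest$ = new Subject();

queueRequest$.pipe(
  // Each value becomes a tiny observable that completes after 100ms;
  // concatMap subscribes to them one at a time, spacing out the requests.
  concatMap(queueData => of(queueData).pipe(delay(100)))
).subscribe(() => {
  sendRequest();
});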
Actually, there's an easier way to do this with the bufferTime() operator and its three arguments:
bufferTime(bufferTimeSpan, bufferCreationInterval, maxBufferSize)
We can therefore use bufferTime(1000, null, 10), which emits a buffer of at most 10 items, or after at most 1s. The null means we want to open a new buffer immediately after the current buffer is emitted.
function mockRequest(val) {
  return Observable
    .of(val)
    .delay(100)
    .map(val => 'R' + val);
}

Observable
  .range(0, 55)
  .concatMap(val => Observable.of(val)
    .delay(25) // async source of values
    // .delay(175)
  )
  .bufferTime(1000, null, 10) // collect all items for 1s
  .concatMap(buffer => Observable
    .from(buffer) // make requests
    .delay(1000) // delay this batch by 1s (rate-limit)
    .mergeMap(value => mockRequest(value)) // collect results regardless of their initial order
    .toArray()
  )
  // .timestamp()
  .subscribe(val => console.log(val));
See live demo: https://jsbin.com/mijepam/19/edit?js,console
You can experiment with a different initial delay. With only 25ms, the requests will be sent in batches of 10:
[ 'R0', 'R1', 'R2', 'R3', 'R4', 'R5', 'R6', 'R7', 'R8', 'R9' ]
[ 'R10', 'R11', 'R12', 'R13', 'R14', 'R15', 'R16', 'R17', 'R18', 'R19' ]
[ 'R20', 'R21', 'R22', 'R23', 'R24', 'R25', 'R26', 'R27', 'R28', 'R29' ]
[ 'R30', 'R31', 'R32', 'R33', 'R34', 'R35', 'R36', 'R37', 'R38', 'R39' ]
[ 'R40', 'R41', 'R42', 'R43', 'R44', 'R45', 'R46', 'R47', 'R48', 'R49' ]
[ 'R50', 'R51', 'R52', 'R53', 'R54' ]
But with .delay(175) we'll emit batches of less than 10 items because we're limited by the 1s delay.
[ 'R0', 'R1', 'R2', 'R3', 'R4' ]
[ 'R5', 'R6', 'R7', 'R8', 'R9', 'R10' ]
[ 'R11', 'R12', 'R13', 'R14', 'R15' ]
[ 'R16', 'R17', 'R18', 'R19', 'R20', 'R21' ]
[ 'R22', 'R23', 'R24', 'R25', 'R26', 'R27' ]
[ 'R28', 'R29', 'R30', 'R31', 'R32' ]
[ 'R33', 'R34', 'R35', 'R36', 'R37', 'R38' ]
[ 'R39', 'R40', 'R41', 'R42', 'R43' ]
[ 'R44', 'R45', 'R46', 'R47', 'R48', 'R49' ]
[ 'R50', 'R51', 'R52', 'R53', 'R54' ]
There's however one difference from what you might need. This solution initially starts emitting values after a 2s delay, because of the .bufferTime(1000, ...) and delay(1000). All subsequent emissions happen 1s apart.
You could eventually use:
.bufferTime(1000, null, 10)
.mergeAll()
.bufferCount(10)
This will always collect 10 items and only then perform the request. This would probably be more efficient.
I wrote a library to do this: you set the maximum number of requests per interval and it rate-limits observables by delaying subscriptions. It's tested and comes with examples: https://github.com/ohjames/rxjs-ratelimiter
Go with Adam’s answer. However, bear in mind the traditional of().delay() will actually add a delay before every element. In particular, this will delay the first element of your observable, as well as any element that wasn’t actually rate limited.
Solution
You can work around this by having your concatMap return a stream of observables that immediately emit a value, but only complete after a given delay:
new Observable(sub => {
  sub.next(v);
  setTimeout(() => sub.complete(), delay);
})
This is kind of a mouthful, so I’d create a function for it. That said, since there’s no use for this outside of actual rate limiting, you’d probably be better served just writing a rateLimit operator:
import { Observable, asyncScheduler, SchedulerLike, MonoTypeOperatorFunction } from 'rxjs';
import { concatMap } from 'rxjs/operators';

function rateLimit<T>(
  delay: number,
  scheduler: SchedulerLike = asyncScheduler): MonoTypeOperatorFunction<T> {
  return concatMap(v => new Observable<T>(sub => {
    sub.next(v);
    scheduler.schedule(() => sub.complete(), delay);
  }));
}
Then:
queueRequest$.pipe(
rateLimit(100),
).subscribe(...);
Limitation
This will now create a delay after every element. This means that if your source observable emits its last value and then completes, the resulting rate-limited observable will have a little delay between its last value and its completion.
Updated cartant's answer as a pipeable operator for newer RxJS versions:
import { Observable, of } from 'rxjs';
import { scan, concatMap, delay } from 'rxjs/operators';

function rateLimit(count: number, period: number) {
  return <ValueType>(source: Observable<ValueType>) => {
    return source.pipe(
      scan((records, value) => {
        let now = Date.now();
        let since = now - period;
        // Keep a record of all values received within the last period.
        records = records.filter((record) => record.until > since);
        if (records.length >= count) {
          // until is the time until which the value should be delayed.
          let firstRecord = records[0];
          let lastRecord = records[records.length - 1];
          let until = firstRecord.until + (period * Math.floor(records.length / count));
          // concatMap is used below to guarantee the values are emitted
          // in the same order in which they are received, so the delays
          // are cumulative. That means the actual delay is the difference
          // between the until times.
          records.push({
            delay: (lastRecord.until < now) ?
              (until - now) :
              (until - lastRecord.until),
            until,
            value
          });
        } else {
          records.push({
            delay: 0,
            until: now,
            value
          });
        }
        return records;
      }, [] as RateLimitRecord<ValueType>[]),
      concatMap((records) => {
        let lastRecord = records[records.length - 1];
        let observable = of(lastRecord.value);
        return lastRecord.delay ? observable.pipe(delay(lastRecord.delay)) : observable;
      })
    );
  };
}
interface RateLimitRecord<ValueType> {
  delay: number;
  until: number;
  value: ValueType;
}
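Usage would then be along these lines (a quick sketch mirroring the earlier demo, i.e. at most 10 values per 1000ms):

import { range } from 'rxjs';

const start = Date.now();

// Values beyond the 10-per-second budget are delayed rather than dropped.
range(1, 30).pipe(
  rateLimit(10, 1000)
).subscribe((value) => console.log(`${value} at T+${Date.now() - start}`));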