I'm fairly new to RxJS. I'm calling the function below; the complete stream is read and the per-line console statements are printed, but I never see "Subscribe done." and I don't know why. What will it take to get this stream to finish? Is something obviously wrong?
const readline$ = RxNode.fromReadLineStream(rl)
  .filter((element, index, observable) => {
    if (index >= range.start && index < range.stop) {
      console.log(`kept line is ${JSON.stringify(element)}`);
      return true;
    } else {
      console.log(`not keeping line ${JSON.stringify(element)}`);
      return false;
    }
  })
  .concatMap(line => Rx.Observable.fromPromise(myFunction(line)))
  .do(response => console.log(JSON.stringify(response)));

readline$.subscribe(
  i => { console.log(`Subscribe object: ${util.inspect(i)}`); },
  err => { console.error(`Subscribe error: ${util.inspect(err)}`); },
  () => {
    console.log("Subscribe done."); // NEVER CALLED
    anotherFunc();                  // NEVER CALLED
  }
);
You can see from the source code that it sends the complete notification only when the source stream emits the close event: https://github.com/Reactive-Extensions/rx-node/blob/master/index.js#L100-L102
So if you need the complete handler to be called, you'll need to close the stream yourself; see How to close a readable stream (before end)?.
In other words, the Observable doesn't complete automatically after reading the entire file.
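For example, here's a minimal sketch, assuming rl is a readline.Interface and the file always has at least range.stop lines: close the interface once the filter has moved past the end of the range, which makes the underlying stream emit close and the Observable complete.

const readline$ = RxNode.fromReadLineStream(rl)
  .filter((element, index) => {
    if (index >= range.stop) {
      rl.close(); // emits 'close', which completes the Observable
      return false;
    }
    return index >= range.start;
  })
  .concatMap(line => Rx.Observable.fromPromise(myFunction(line)))
  .do(response => console.log(JSON.stringify(response)));

If the file can be shorter than range.stop, you'd still need to close rl on the stream's end event.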
I am trying to write code in Angular 11 for a scenario like this:
I have a list of files, and for every file I hit an API (say api1). I take a fileId from the response and pass it to another API (say api2). I want to keep hitting api2 every 3 seconds until I get status="available" in the response. Once I get the available status, I no longer need to hit api2 for that fileId, and we can start processing the next file in the loop.
This whole process runs for every file that I have.
I understand we can achieve this using RxJS operators like mergeMap or switchMap (the sequence does not matter to me right now), but I am very new to RxJS and not sure how to put it together.
This is what I am doing right now:
this.filesToUpload.forEach((fileItem) => {
  if (!fileItem.uploaded) {
    if (fileItem.file.size < this.maxSize) {
      self.fileService.translateFile(fileItem.file).then( // hit api1
        (response) => {
          if (response && get(response, 'status') == 'processing') {
            // do some processing here
            this.getDocumentStatus(response.fileId);
          }
        },
        (error) => {
          // show error
        }
      );
    }
  }
});
getDocumentStatus(fileId: string) {
  this.docStatusSubscription = interval(3000) // hit api2 every 3 seconds
    .pipe(takeWhile(() => !this.statusProcessing))
    .subscribe(() => {
      this.statusProcessing = false;
      this.fileService.getDocumentStatus(fileId).then((response) => {
        if (response.results.status == "available") {
          this.statusProcessing = true;
          // action complete for this fileId
        }
      }, (error) => {
      });
    });
}
Here's how I might do this, given the description of what you're after:
Create a list of observables of all the calls you want to make.
Concatenate the list together.
Subscribe.
The thing that makes this work is that we only subscribe once (not once per file), and we let the operators handle subscribing and unsubscribing for everything else. Nothing happens until we subscribe, which lets concat do the heavy lifting for us. There's no need to track anything ourselves with variables like this.statusProcessing; that's all handled for us, and it's less error-prone that way.
// Create callList. This is an array of observables that each hit the APIs and
// only complete when status == "available".
const callList = this.filesToUpload
  .filter(fileItem => !fileItem.uploaded && fileItem.file.size < this.maxSize)
  .map(fileItem => this.createCall(fileItem));

// Concatenate the array of observables by running each one after the previous
// one completes.
concat(...callList).subscribe({
  complete: () => console.log("All files have completed"),
  error: err => console.log("Aborted call list due to error,", err)
});
createCall(fileItem: FileItemType): Observable<never> {
  // Use defer to turn a promise into an observable
  return defer(() => this.fileService.translateFile(fileItem.file)).pipe(
    // If processing, then wait until available; otherwise just complete
    switchMap(translateFileResponse => {
      if (translateFileResponse && get(translateFileResponse, 'status') == 'processing') {
        // do some processing here
        return this.delayByDocumentStatus(translateFileResponse.fileId);
      } else {
        return EMPTY;
      }
    }),
    // Catch and then rethrow the error. Right now this doesn't do anything,
    // but if you handle the error here, you won't abort the entire call list
    // below on an error. Depends on the behaviour you're after.
    catchError(error => {
      // show error
      return throwError(() => error);
    })
  );
}
delayByDocumentStatus(fileId: string): Observable<never> {
  // Hit getDocumentStatus every 3 seconds. If the API takes more than
  // 3 seconds to return a response, wait 6 or 9 (etc.) seconds instead.
  return interval(3000).pipe(
    exhaustMap(_ => this.fileService.getDocumentStatus(fileId)),
    takeWhile(res => res.results.status != "available"),
    ignoreElements(),
    tap({
      complete: () => console.log("action complete for this fileId: ", fileId)
    })
  );
}
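One note on the design: because ignoreElements() discards every value that takeWhile lets through, each call in the list emits nothing and simply completes once its file is available. That completion is the only signal concat needs to move on to the next file, which is why both helpers are typed Observable<never>.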
I have a stream that is composed from a chain of pipes.
I am using the event-stream package to create the building blocks of the pipes.
The code gets a file from S3, unzips it, parses it, and sends the data to some async function.
I am trying to get the promise to resolve when it has finished handling that file.
How can I be sure that the whole chain has finished draining?
My current solution looks like this. It looks bad, and I still think there is a possibility that resolve() will be called while there are still data chunks in the gzReader, for example.
Thanks.
const inputStream = this.s3client.getObject(params).createReadStream()

inputStream.on('end', () => {
  console.log("Finished handling file " + fileKey)
  let stopInterval = setInterval(() => {
    if (counter == 0) {
      resolve(this.eventsSent)
      clearInterval(stopInterval)
    }
  }, 300)
})

const gzReader = zlib.createGunzip();

inputStream
  .pipe(gzReader)
  .pipe(es.split())
  .pipe(es.parse())
  .pipe(es.mapSync(data => {
    counter++
    this.eventsSent.add(data.data)
    asyncFunc(this.destinationStream, data.data)
      .then(() => {
        counter--
      })
      .catch((e) => {
        counter--
        console.error('Failed sending event ' + data.data + e)
      })
  }))
Because you never initialize counter, it is zero, and after the first 300ms your function resolves (which can happen before your pipes have started working and increased the counter).
So don't use setInterval ;) You don't need it.
Also, there is no need to use mapSync if you are already calling an async function in it. Just use map and pass the data and a callback (https://github.com/dominictarr/event-stream#map-asyncfunction). Don't forget to call the callback in your async function!
Add a last step to your pipe: wait(callback) (https://github.com/dominictarr/event-stream#wait-callback).
There you can resolve.
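A minimal sketch of that restructuring, assuming asyncFunc returns a promise as in the question:

const gzReader = zlib.createGunzip();

inputStream
  .pipe(gzReader)
  .pipe(es.split())
  .pipe(es.parse())
  .pipe(es.map((data, callback) => {
    this.eventsSent.add(data.data)
    asyncFunc(this.destinationStream, data.data)
      .then(() => callback(null, data)) // signal this chunk is fully handled
      .catch((e) => {
        console.error('Failed sending event ' + data.data + e)
        callback() // drop the failed chunk but keep the stream flowing
      })
  }))
  .pipe(es.wait((err, body) => resolve(this.eventsSent))) // runs only after everything upstream has drained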
Hi, I have the following code and I would like to know how to prevent the main (upstream) Observable from being torn down when an error is thrown.
How can I change the following code so that all numbers except '4' get displayed?
I am looking for a general pattern that would work in other cases with different operators. This is the simplest case I could come up with.
const Rx = require('rxjs/Rx');

function checkValue(n) {
  if (n === 4) {
    throw new Error("Bad value");
  }
  return true;
}

const source = Rx.Observable.interval(100).take(10);

source.filter(x => checkValue(x))
  .catch(err => Rx.Observable.empty())
  .subscribe(v => console.log(v));
You will want to keep the source observable running, but if you let the error happen on the main event stream it will collapse the entire observable and you will no longer receive items.
The solution involves creating a separate stream where you can filter and catch without letting the upstream pipe collapse.
const Rx = require('rxjs/Rx');

function checkValue(n) {
  if (n === 4) {
    throw new Error("Bad value");
  }
  return true;
}

const source = Rx.Observable.interval(100).take(10);

source
  // pass the item into the projection function of the switchMap operator
  .switchMap(x => {
    // we create a new stream of just one item;
    // this stream is created for every item emitted by the source observable
    return Rx.Observable.of(x)
      // now we run the filter
      .filter(checkValue)
      // we catch the error here, within the projection function;
      // on error this inner pipe will collapse, but that is OK because it
      // starts within this function and will not affect the source;
      // the downstream operators never see the error, so they are not
      // affected either
      .catch(err => Rx.Observable.empty());
  })
  .subscribe(v => console.log(v));
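For reference, the same pattern written with pipeable operators (RxJS 6+) might look like this; a sketch reusing checkValue from above:

const { interval, of, EMPTY } = require('rxjs');
const { take, switchMap, filter, catchError } = require('rxjs/operators');

interval(100).pipe(
  take(10),
  // isolate each item in its own inner stream so an error only
  // collapses that inner stream, never the source
  switchMap(x => of(x).pipe(
    filter(checkValue),
    catchError(() => EMPTY)
  ))
).subscribe(v => console.log(v));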
You could also use the second argument passed into the catch selector to restart the source observable, but this will restart it as though it had never run before.
const Rx = require('rxjs/Rx');

function checkValue(n) {
  if (n === 4) {
    throw new Error("Bad value");
  }
  return true;
}

const source = Rx.Observable.interval(100).take(10);

source.filter(x => checkValue(x))
  .catch((err, source) => source)
  .subscribe(v => console.log(v));
But this does not achieve the desired effect. You will see a stream that emits 0..3 repeatedly until the end of time, or until you shut down the script, whichever comes first. (This is essentially what .retry() does.)
You need to use a flatMap operator and do the filtering inside it. In the flatMap in this example I'm using Observable.if() to do the filtering, as it guarantees that I'm returning observables all the time. I'm sure you can do it other ways, but this is a clean implementation for me.
const source = Rx.Observable.interval(100).take(10).flatMap((x) =>
  Rx.Observable.if(() => x !== 4,
    Rx.Observable.of(x),
    Rx.Observable.throw("Bad value"))
    .catch((err) => {
      return Rx.Observable.empty()
    })
);

source.subscribe(v => console.log(v));
I have a component that fetches content from a service to process it. The thing is, I can have multiple calls to this function, which results in duplicates in my array. I tried the following workaround:
getOptions(): Observable<PickQuality[]> {
  console.log("MY LENGTH: ", this.options.length) // <=== Always returns 0 because the callback hasn't run yet
  if (this.options.length == 0) {
    this.championService.getChampions()
      .subscribe(champions => {
        champions.forEach(champion => this.options.push(new PickQuality(champion, 0)));
        this.reevaluate();
        this.optionsSubject.next(this.options);
      });
    return this.optionsSubject.asObservable();
  }
  else
    return Observable.of(this.options);
}
It didn't work, so I then tried the following trick inside the callback (where this.options.length is correctly recognized):
if (this.options.length != 0) return; // <=== Looks bad!
This actually worked but seemed extremely inefficient to me, since the call to my service is still executed. How can I fix this?
I'd recommend restructuring your code a little:
if (this.options.length == 0) {
  let source = this.championService.getChampions()
    .share();
  source.subscribe(champions => {
    // ... whatever
    this.options = whateverResult;
  });
  return source;
} else {
  return Observable.of(this.options);
}
Now you can avoid using Subjects and return the source Observable, which represents the HTTP request and is shared via the share() operator. This means there's only one HTTP request, and its result is sent to this internal subscribe() call as well as to any subscriber outside this method.
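One caveat: share() only multicasts while the request is in flight; a subscriber that arrives after it has completed triggers a fresh request. If that matters in your case, publishReplay(1).refCount() (shareReplay in later RxJS versions) caches the last value instead. A minimal sketch of that variant:

let source = this.championService.getChampions()
  .publishReplay(1) // cache the latest emission for late subscribers
  .refCount();      // stay connected while at least one subscriber exists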
Check for duplicates before pushing them.
this.championService.getChampions()
  .subscribe(champions => {
    champions.forEach(champion => {
      // check this.options (not the incoming champions array) for an
      // existing entry; assumes PickQuality keeps the champion in a
      // `champion` field
      if (this.options.findIndex(o => o.champion === champion) == -1)
        this.options.push(new PickQuality(champion, 0));
    });
    this.reevaluate();
    this.optionsSubject.next(this.options);
  });
I am trying to populate an array in my component called processes, which is an array of process objects. Each process also has a list of tasks.
So currently, I am working with two API calls:
/processes
and
/process/{processId}/tasks
I use /processes to get all the processes and initially populate the processes array. Then I use the process id of each process to call the second API to get the tasks of that process.
Currently, my code looks something like this:
this.processes.forEach((process, index) => {
  myService.getTasks().subscribe((tasks) => {
    process.tasks = tasks;
  })
})
I understand that I can create an array of observables and use Observable.forkJoin() to wait for all these async calls to finish, but I want to be able to define the subscribe callback for each of the calls, since I need a reference to the process. Any ideas on how I can approach this issue?
Using a for loop to make multiple HTTP requests and then subscribing to all of them separately should be avoided, so as not to leave many Observable connections open.
As @Juan Mendes mentioned, Observable.forkJoin will return an array of task lists whose indices match those of the processes in your processes array. You can also assign tasks to each process as they arrive, as follows:
getTasksForEachProcess(): Observable<any> {
  let tasksObservables = this.processes.map(process, processIdx) => {
    return myService.getTasks(process)
      .map(tasks => {
        this.processes[processIdx].tasks = tasks; // assign tasks to each process as they arrive
        return tasks;
      })
      .catch((error: any) => {
        console.error('Error loading tasks for process: ' + process, 'Error: ', error);
        return Observable.of(null); // In case an error occurs, we need to return an Observable so the stream can continue
      });
  });
  return Observable.forkJoin(tasksObservables);
};

this.getTasksForEachProcess().subscribe(
  tasksArray => {
    console.log(tasksArray); // [[Task], [Task], [Task]];
    // In case an error occurred, e.g. for the process at position 1,
    // the output will be: [[Task], null, [Task]];
    // If you want to assign tasks to each process after all calls are finished:
    tasksArray.forEach((tasks, i) => this.processes[i].tasks = tasksArray[i]);
  }
);
this.getTasksForEachProcess().subscribe(
tasksArray => {
console.log(tasksArray); // [[Task], [Task], [Task]];
// In case error occurred e.g. for the process at position 1,
// Output will be: [[Task], null, [Task]];
// If you want to assign tasks to each process after all calls are finished:
tasksArray.forEach((tasks, i) => this.processes[i].tasks = tasksArray[i]);
}
);
Please also take a look at this post: Send multiple asynchronous HTTP GET requests
Thanks to Seid Mehmedovic for the great explanation, but it looks like the code is missing one round bracket near map. For me it worked as follows:
getTasksForEachProcess(): Observable<any> {
  let tasksObservables = this.processes.map((process, processIdx) => {
    return myService.getTasks(process)
      .map(tasks => {
        this.processes[processIdx].tasks = tasks; // assign tasks to each process as they arrive
        return tasks;
      })
      .catch((error: any) => {
        console.error('Error loading tasks for process: ' + process, 'Error: ', error);
        return Observable.of(null); // In case an error occurs, we need to return an Observable so the stream can continue
      });
  });
  return Observable.forkJoin(tasksObservables);
};

this.getTasksForEachProcess().subscribe(
  tasksArray => {
    console.log(tasksArray); // [[Task], [Task], [Task]];
    // In case an error occurred, e.g. for the process at position 1,
    // the output will be: [[Task], null, [Task]];
    // If you want to assign tasks to each process after all calls are finished:
    tasksArray.forEach((tasks, i) => this.processes[i].tasks = tasksArray[i]);
  }
);