I am struggling to convert Node streams to RxJS Observables.
The streaming by itself works great when I try one URL, but when I map the same function over an array of URLs, I get errors.
I am using RxNode to convert the stream into an Observable.
This is what I'm currently trying:
// data_array is an array of 10 urls that I'm scraping data from.
let parentStream = Rx.Observable.from(data_array);
parentStream.map(createStream).subscribe(x => console.log(x), (e)=> console.log('Error', e), console.log('Complete'));
function createStream(url){
  return RxNode.fromStream(x(url, '#centercol ul li', [{name: 'a', link: 'a#href'}]).write().pipe(JSONStream.parse('*')));
}
But this is the output, × 10 (once for each URL in data_array):
RefCountObservable {
source:
ConnectableObservable {
source: AnonymousObservable { source: undefined, __subscribe: [Function] },
_connection: null,
_source: AnonymousObservable { source: [Object], __subscribe: [Function: subscribe] },
_subject:
Subject {
isDisposed: false,
isStopped: false,
observers: [],
hasError: false } },
_count: 0,
_connectableSubscription: null }
I first thought flatMap would work because it flattens an observable of observables... but when I try flatMap, I get this:
Complete
Error TypeError: unknown type returned
However, if I do this:
let stream = RxNode.fromStream(x(url, '#centercol ul li', [{name: 'a', link: 'a#href'}]).write().pipe(JSONStream.parse('*')));
stream.subscribe(x => console.log(x), (e) => console.log('Error', e), console.log('Complete'));
it works for one URL, but I can't capture all of the URLs in data_array in one stream.
I feel like I'm misunderstanding something: not only does it clearly not work for multiple URLs, but even when the second example does work, I get 'Complete' before any of the data comes in.
Clearly, I'm missing something. Any help would be wonderful. Thanks.
*UPDATE*
I tried a different path, which works, but does not use Node streams. Node streams would be ideal, so I would still like to make the above example work.
The approach I used next was to wrap a promise around my web-scraping function, scrape below. This works, but the result is ten huge arrays, each holding all the data from one URL. What I really want is a stream of objects that I can run through a series of transformations as the data objects pass through.
Here is the different, but working, approach:
let parentStream = Rx.Observable.from(data_array);

parentStream.map(url => {
  return Rx.Observable.defer(() => {
    return scrape(url, '#centercol ul li', [{name: 'a', link: 'a#href'}]);
  });
})
.concatAll()
.subscribe(x => console.log(x), (e) => console.log('Error', e), () => console.log('Complete'));
function scrape(url, selector, scope) {
  return new Promise((resolve, reject) =>
    x(url, selector, scope)((error, result) =>
      error != null ? reject(error) : resolve(result)
    )
  );
}
*Solution*
I figured it out. I have attached the solution below.
Instead of using RxNode, I opted to use Rx.Observable.fromEvent().
Node streams emit events, whether for new data, an error, or completion.
So the fromEvent static operator listens for each stream's 'data' events and turns them into an Observable.
I then mergeAll those and subscribe. Here's the code:
let parentStream = Rx.Observable.from(data_array);

parentStream
  .map(url => createEventStream(url))
  .mergeAll()
  .subscribe(x => console.log(x), (e) => console.log('Error', e), () => console.log('Complete'));

function createEventStream(url){
  return Rx.Observable.fromEvent(x(url, '#centercol ul li', [{name: 'a', link: 'a#href'}]).write().pipe(JSONStream.parse('*')), 'data');
}
Related
I am a react-native developer and new to Firebase. I am performing a Firebase Realtime Database operation; have a look at the code below:
firebase.database().ref('events/wedding/items').push(object).then((data) => {
  // success callback
  dispatch(addPendingInvoice({ ...invoice, id: data.key }))
  Alert.alert('Successfully added to Invoices', 'Please go to invoice section to clear first and continue.', [{ text: 'Ok' }])
}).catch((error) => {
  // error callback
  Alert.alert("Can't book package.", 'Please check your internet connection!', [{ text: 'OK', style: 'destructive' }])
})
Now, I wish to push another object to another node, events/wedding/packages, right after the Firebase call above. I could nest another call inside the then callback above, but that does not feel like a clean way to do it.
Is there any way to do this?
You can use the update() method to "simultaneously write to specific children of a node without overwriting other child nodes". Note that "simultaneous updates made this way are atomic: either all updates succeed or all updates fail"; see the doc.
So in your case you would do something along the following lines:
var newNodeKey = firebase.database().ref().child('events/wedding/items').push().key;
var updates = {};
updates['events/wedding/items/' + newNodeKey] = { foo: "bar" };
updates['events/wedding/packages/' + newNodeKey] = { bar: "foo" };
firebase.database().ref().update(updates)
  .then(() => {
    // The two writes are completed, do whatever you need
    // e.g. dispatch(...);
  });
All Firebase operations return a promise, so you can use Promise.all() to run them all concurrently.
Promise.all([
firebase.database().ref(reference).set({}),
firebase.database().ref(reference2).set({})
]).then(() => {
console.log("Operations Successful")
}).catch((e) => console.log(e))
You can also push all your operations into an array and then pass that array to Promise.all().
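Nothing Firebase-specific is needed for that pattern; it is plain promise composition. A sketch with hypothetical stand-ins for the ref(...).set(...) calls:

```javascript
// Hypothetical stand-ins for Firebase write calls; each returns a
// promise, just like the real SDK methods do.
const writeItem = value => Promise.resolve({ path: 'events/wedding/items', value });
const writePackage = value => Promise.resolve({ path: 'events/wedding/packages', value });

// Accumulate the operations in an array...
const operations = [];
operations.push(writeItem({ foo: 'bar' }));
operations.push(writePackage({ bar: 'foo' }));

// ...then wait for all of them at once.
Promise.all(operations).then(results => {
  console.log(results.length); // 2 (both writes settled)
});
```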
I have this method (Angular 9, so TypeScript) which is used to retrieve a brand-new JSON Web Token to authenticate the current user:
getNewAccessToken(){
  return this.httpClient.post<Token>(`${this.baseService.baseUrl}auth-token-refresh/`, { refresh: this.getRefreshToken() }, this.baseService.httpOptions).pipe(
    tap((response: Token) => {
      this.cookieService.set(environment.tokenAccessName, response.access, null, '/', null, null, 'Strict');
      this.isLoggedIn.next(true);
    })
  );
}
When I subscribe to this method, I check for errors like so:
this.authService.getNewAccessToken().subscribe(
  res => { /* do something with res... */ },
  error => { throw error; } // catch error
);
Could I move the error detection directly inside my observable code, using pipe and catchError? The code would then become:
getNewAccessToken(){
  return this.httpClient.post<Token>(`${this.baseService.baseUrl}auth-token-refresh/`, { refresh: this.getRefreshToken() }, this.baseService.httpOptions).pipe(
    tap((response: Token) => {
      this.cookieService.set(environment.tokenAccessName, response.access, null, '/', null, null, 'Strict');
      this.isLoggedIn.next(true);
    }),
    catchError(error => {
      throw error;
    })
  );
}
I think this is a sort of centralized way of managing errors in an observable.
Generally, is error handling better on observables or on their observers?
What are the pros and cons of these two approaches? Is there any difference in terms of performance?
I think the same question can be raised for promises.
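For promises the same trade-off looks like this: handle errors inside the function with .catch (centralized, callers see a fallback value), or rethrow and let each caller deal with it. A minimal sketch, where getToken is a hypothetical stand-in for the HTTP call:

```javascript
// Hypothetical stand-in for the HTTP request, which may fail.
const getToken = shouldFail =>
  shouldFail ? Promise.reject(new Error('401')) : Promise.resolve({ access: 'abc' });

// Centralized handling: the function maps errors to a fallback,
// so every caller sees either a token object or null.
function getNewAccessToken(shouldFail) {
  return getToken(shouldFail).catch(err => {
    console.error('token refresh failed:', err.message);
    return null;
  });
}

getNewAccessToken(true).then(token => console.log(token));          // null
getNewAccessToken(false).then(token => console.log(token.access));  // 'abc'
```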
Yes, it is good practice to move error handling into the pipe, as it is a separation of concerns: it separates retrieving the data from presenting the data.
An example from the Angular documentation:
return this.http.get<Hero[]>(this.heroesUrl)
.pipe(
catchError(this.handleError('getHeroes', []))
);
I have a scenario where I have one parent machine and several child machines that can be spawned from the parent machine.
The current setup looks like this:
const parentMachine = Machine({
  context: {
    children: [] // can contain any number of child services
  },
  ...
  on: {
    ADD_CHILD: {
      actions: assign({
        children: (ctx, e) => {
          return [
            ...ctx.children,
            { ref: spawn(childMachine) },
          ];
        },
      }),
    },
    UPDATE_CHILDREN: {
      actions: ??? // need to somehow loop through children and send the UPDATE event to each service
    }
  }
});
When the parent machine receives the "UPDATE_CHILDREN" event, I want to update each of the child services. I know you can send batch events by passing an array to send, but I want each event to also be sent to a different service. I've only seen examples where they are sent to a single service at a time. I've tried several things, including the following:
UPDATE_CHILDREN: {
  actions: ctx => ctx.children.forEach(c => send("UPDATE", { to: () => c.ref })) // doesn't send
}
Am I missing something obvious? Is this possible?
Ah, I bumped into exactly the same issue as you!
It turns out that if you give actions a function, it assumes the function to be the actual action, not a function that returns actions.
If you want to generate your actions based on context, you need to use a pure action:
import { actions } from 'xstate';
const { pure } = actions;
...
actions: pure((context, event) =>
context.myActors.map((myActor) =>
send('SOME_EVENT', { to: myActor })
)
),
This is a tricky mistake to fall into, as you get no feedback that you're doing something wrong.
I had a realization about how this is supposed to work in XState.
The references to the children are already being stored, so we can send events to them directly, without using the "to" property:
actions: ctx => ctx.children.forEach(c => c.ref.send("UPDATE"))
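Stripped of XState specifics, the difference between the two answers is that pure() returns a list of action descriptions for the interpreter to execute, while the forEach version performs the sends itself on each stored ref. The direct version can be sketched with plain objects, assuming (as an illustration, not XState's actual internals) that each child ref exposes a send() method:

```javascript
// Minimal stand-ins for spawned child refs: each records the events
// it receives via a send() method, like an interpreted service would.
function makeChild() {
  const received = [];
  return { received, send: event => received.push(event) };
}

const ctx = { children: [{ ref: makeChild() }, { ref: makeChild() }] };

// The action above: forward one event directly to every stored ref.
ctx.children.forEach(c => c.ref.send('UPDATE'));

console.log(ctx.children.every(c => c.ref.received.includes('UPDATE'))); // true
```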
There are numerous examples out there for initializing a service worker with a single cache similar to the following:
let cacheName = 'myCacheName-v1';
let urlsToCache = ['url1', 'url2', 'url3'];
self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open(cacheName).then(function (cache) {
      return cache.addAll(urlsToCache);
    }).then(function () {
      return self.skipWaiting();
    })
  );
});
I wish to initialize multiple caches in my service worker. The motivation is to group assets by their tendency to change (e.g., static app data vs. CSS, JavaScript, etc.). With multiple caches, I can update individual caches (via versioned cache names) as the files within that cache change. Ideally, I wish to set up a structure similar to the following:
let appCaches = [{
name: 'core-00001',
urls: [
'./',
'./index.html', etc...
]
},
{
name: 'data-00001',
urls: [
'./data1.json',
'./data2.json', etc...
]
},
etc...
];
My best attempt so far is something similar to:
self.addEventListener('install', function (event) {
  appCaches.forEach(function (appCache) {
    event.waitUntil(
      caches.open(appCache.name).then(function (cache) {
        return cache.addAll(appCache.urls);
      }));
  });
  self.skipWaiting();
});
This approach seems to work. However, I'm still a newbie to service workers and promises. Something tells me that this approach has a pitfall that I'm too inexperienced to recognize. Is there a better way to implement this?
It's better to call event.waitUntil only once in a handler, and the good thing is that it's relatively easy to build a single Promise to wait for.
Something like this should work:
event.waitUntil(Promise.all(
  myCaches.map(function (myCache) {
    return caches.open(myCache.name).then(function (cache) {
      return cache.addAll(myCache.urls);
    });
  })
));
Promise.all takes an Array of Promises and resolves only after all the promises in the Array resolve, which means the install handler will wait until all the caches are initialized.
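The same shape can be exercised outside a service worker, with Maps standing in for the Cache API, to confirm that the combined promise settles only after every cache is populated (openCache and fakeCaches are illustrative stand-ins, not real APIs):

```javascript
// Illustrative stand-in for the Cache API: caches.open(name)
// becomes a promise resolving to a Map keyed by URL.
const fakeCaches = new Map();
const openCache = name => {
  if (!fakeCaches.has(name)) fakeCaches.set(name, new Map());
  return Promise.resolve(fakeCaches.get(name));
};

const appCaches = [
  { name: 'core-00001', urls: ['./index.html'] },
  { name: 'data-00001', urls: ['./data1.json', './data2.json'] },
];

// One promise per cache; Promise.all settles only after every
// cache has stored all of its URLs.
Promise.all(
  appCaches.map(appCache =>
    openCache(appCache.name).then(cache => {
      appCache.urls.forEach(url => cache.set(url, 'cached'));
    })
  )
).then(() => console.log(fakeCaches.size)); // 2
```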
Thanks pirxpilot! Your answer, combined with JavaScript Promises: an Introduction and a lot of trial and error figuring out how to chain all these Promises together, resulted in a nice little implementation.
I added a requirement to cache only what has changed. Per best practices, the old cache is deleted during the 'activate' event. The end product is:
let cacheNames = appCaches.map((cache) => cache.name);
self.addEventListener('install', function (event) {
event.waitUntil(caches.keys().then(function (keys) {
return Promise.all(appCaches.map(function (appCache) {
if (keys.indexOf(appCache.name) === -1) {
return caches.open(appCache.name).then(function (cache) {
console.log(`caching ${appCache.name}`);
return cache.addAll(appCache.urls);
})
} else {
console.log(`found ${appCache.name}`);
return Promise.resolve(true);
}
})).then(function () {
return self.skipWaiting();
});
}));
});
self.addEventListener('activate', function (event) {
event.waitUntil(
caches.keys().then(function (keys) {
return Promise.all(keys.map(function (key) {
if (cacheNames.indexOf(key) === -1) {
console.log(`deleting ${key}`);
return caches.delete(key);
}
}));
})
);
});
Using Bookshelf.js, I'm trying to get a set of objects and then update them all.
function markPostsAsSent(postsIds) {
return new Promise((resolve, reject) => {
// Get posts
Search.Post
.where({ id: postsIds })
.fetchAll()
.then(posts => {
// For each post, update sent value
Promise.map(posts.toJSON(), post => post.save({ is_sent: true }))
.then(() => {
resolve(true);
}, err => {
reject(Dependencies.getException(exceptionName, 500, 'POST_UPDATE_ERROR', new Error(), err));
});
}, err => {
reject(Dependencies.getException(exceptionName, 500, 'POST_FETCH_ERROR', new Error(), err));
});
});
}
What I'm trying to achieve in Bookshelf.js is equivalent to this in SQL :
UPDATE searches_posts
SET is_sent = true
WHERE id IN (1, 2, 3, 4);
(1, 2, 3, 4) are obviously the values from the postsIds parameter.
The SQL seems so much simpler than the Bookshelf.js method.
Is there a better way to update all these rows at the same time, instead of looping with the .save() method?
Your solution seems solid, but of course, there is a better way.
E.g., you can specify explicitly that you want to perform an update (http://bookshelfjs.org/#Model-instance-save), and you can tap into the Knex query builder to limit which entries you want to update (http://knexjs.org/#Builder-whereIn).
So, putting everything together, try something like this (a demo example I ran to test the solution):
new Person().query(function(qb) {
qb.whereIn('id', [2, 4]);
}).save({
number_of_children: 5
}, {
method: 'update'
}).then(function() {
//Entries with ids 2 and 4 have been updated!
});
Hope this helps!
What about update/insert? What is a more eloquent way to do this?
yield Promise.map data, (oneEvent) ->
  eventModel = new Event oneEvent
  eventModel.fetch().then (fromDB) ->
    return do eventModel.save if not fromDB?
    fromDB.save oneEvent