Passing values into the `next` function in Redux Saga - javascript

I have a saga:
export function* loadData() {
  const requestURL = 'http://www.web.address/thing';
  try {
    const data = yield call(getRequest, requestURL);
    yield put(dataLoaded(data));
  } catch (err) {
    yield put(dataError(err));
  }
}
I'm trying to test it like this:
describe('loadData saga', () => {
  describe('200 response', () => {
    const getReq = jest.fn();
    const requestURL = 'mock.url.com';
    const generator = cloneableGenerator(loadData)();

    it('sends the request to the correct url', () => {
      expect(generator.next(getReq, requestURL).value).toEqual(call(getReq, requestURL));
    });
  });
});
This test fails because the generator doesn't seem to heed what I pass into the next function, and I don't know why. That is to say, the test receives http://www.web.address/thing as the URL and the getRequest function, instead of what I've tried to pass in via .next().
This also did not work:
generator.next(getReq(requestURL))
What am I misunderstanding?

You shouldn't actually be passing any args into next() here: your saga has no previous yields and doesn't pass any dynamic data from a previous yield to the call. It has everything it needs already in scope.
So in your test you would check that next().value is as expected, using the non-mocked values:
import {getRequest} from 'same/place/as/saga/loads/getRequest';
const requestURL = 'http://www.web.address/thing';
expect(generator.next().value).toEqual(call(getRequest, requestURL));
The problem is that you are attempting to mock and pass in stubs that the saga has no way of injecting in this case.
Remember that a yield stops execution of the generator until the next iteration, so you are not actually running a request here; you are only pausing and receiving back an object of instructions describing how to call getRequest and what args to pass.
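For instance, you can construct the same effect outside of any saga and log it; it's just data describing the call (the exact object shape varies across redux-saga versions, and getRequest here is a stand-in for the saga's request helper):
import { call } from 'redux-saga/effects';

const getRequest = (url) => fetch(url); // stand-in for the saga's request helper

// A plain object describing the call (fn + args), not the result of
// performing it. The middleware is what actually runs it.
const effect = call(getRequest, 'http://www.web.address/thing');
console.log(effect);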
Using .next(someData) after your above test will, however, let you pass in mock data to simulate the response of your request:
const mockData = {};
expect(generator.next(mockData).value).toEqual(put(dataLoaded(mockData)))
Think of the argument to next() as mocking the return value of the previous yield.
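Putting the pieces together, a minimal sketch of the full test might look like this (the import paths are assumptions; cloneableGenerator lives in redux-saga/utils in 0.x, or @redux-saga/testing-utils in v1+):
import { call, put } from 'redux-saga/effects';
import { cloneableGenerator } from 'redux-saga/utils';
import { loadData } from './saga'; // hypothetical path
import { dataLoaded, dataError } from './actions'; // hypothetical path
import { getRequest } from './request'; // hypothetical path

describe('loadData saga', () => {
  const generator = cloneableGenerator(loadData)();

  it('yields a call to the real URL with the real getRequest', () => {
    // No argument to next(): the first yield needs nothing from a previous yield.
    expect(generator.next().value).toEqual(call(getRequest, 'http://www.web.address/thing'));
  });

  it('dispatches dataLoaded on success', () => {
    const success = generator.clone();
    const mockData = { some: 'payload' };
    // The argument to next() stands in for the result of the previous yield.
    expect(success.next(mockData).value).toEqual(put(dataLoaded(mockData)));
  });

  it('dispatches dataError on failure', () => {
    const failure = generator.clone();
    const err = new Error('request failed');
    // throw() simulates the call effect rejecting, driving the catch branch.
    expect(failure.throw(err).value).toEqual(put(dataError(err)));
  });
});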

Related

How to consume a promise and use the result later on with my code?

I'm new to async operations and js. Here is my question.
I have a Person class. I want to init Person instance using data that I get from an API call.
class Person {
  constructor(data) {
    this.data = data;
  }
}
I am making an API call using Axios. I get a response and want to use it in my class.
const res = axios.get('https://findperson.com/api/david');
const david = new Person(res);
I understand that res is a promise at this stage, and that I need to consume it.
How should I do it?
How can I take the response and use it properly?
axios.get() returns a promise of an object that contains the returned data, status, headers, etc.:
async function getPerson() {
  try {
    const res = await axios.get('https://findperson.com/api/david');
    const david = new Person(res.data);
    // do something with david
  } catch (error) {
    console.log(error);
  }
}
or
function getPerson() {
  axios
    .get('https://findperson.com/api/david')
    .then(res => {
      const david = new Person(res.data);
      // do something with david
    })
    .catch(console.log);
}
Inside another async function, or at the top level of a module or at the REPL (in node 16.6+ or some earlier versions with the --experimental-repl-await feature enabled), you can just use await.
const res = await axios.get('https://findperson.com/api/david');
That will wait for the promise to be resolved and unwrap it to store the contained value in res.
If you want to get the value out of the world of async and into synchro-land, you have to do something with it via a callback function:
axios.get('https://findperson.com/api/david').then(res => {
  // do stuff with res here
});
... but don't be fooled; without an await, any code that comes after that axios.get call will run immediately without waiting for the callback. So you can't do something like copy res to a global var in the callback and then expect it to be set in subsequent code; it has to be callbacks all the way down.
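To make that concrete, here's a hypothetical sketch of the trap:
let res; // attempt to smuggle the value out of the callback

axios.get('https://findperson.com/api/david').then(r => {
  res = r; // runs later, once the response arrives
});

console.log(res); // undefined: this line runs before the callback fires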
You can do this:
axios.get('https://findperson.com/api/david').then(res => {
  const david = new Person(res.data); // use res.data, not the whole response
});
Or, in an async function (see async/await for JavaScript):
const res = await axios.get('https://findperson.com/api/david');
const david = new Person(res.data);

Better way to deal with Promise flow when functions need to access outer scope variable

I'm creating a Node.js module to interact with my API, and I use the superagent module to do the requests. How it works:
module.exports = data => {
  return getUploadUrl()
    .then(uploadFiles)
    .then(initializeSwam);

  function getUploadUrl() {
    const request = superagent.get(....);
    return request;
  }

  function uploadFiles(responseFromGetUploadUrl) {
    const request = superagent.post(responseFromGetUploadUrl.body.url);
    // attach files that are in data.files
    return request;
  }

  function initializeSwam(responseFromUploadFilesRequest) {
    // Same thing here. I need access to data and responseFromUploadFilesRequest.body
  }
};
I feel like I'm doing something wrong here, but I can't think of a better way to achieve the same result.
Two simple ways:
write your functions to take all the parameters they need
const doStuff = data =>
  getUploadUrl()
    .then(uploadFiles)
    .then(initializeSwam)
might become
const doStuff = data =>
  getUploadUrl()
    .then(parseResponseUrl) // (response) => response.body.url
    .then(url => uploadFiles(data, url))
    .then(parseResponseUrl) // same function as above
    .then(url => initializeSwam(data, url))
That should work just fine (or fine-ish, depending on what hand-waving you're doing in those functions).
partially apply your functions
const uploadFiles = (data) => (url) => {
  return doOtherStuff(url, data);
};
// same deal with the other
const doStuff = data =>
  getUploadUrl()
    .then(parseResponseUrl)
    .then(uploadFiles(data)) // returns (url) => { ... }
    .then(parseResponseUrl)
    .then(initializeSwam(data));
A mix of all of these techniques (when and where sensible) should be more than sufficient to solve a lot of your needs.
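For reference, a minimal sketch of the parseResponseUrl helper assumed in the snippets above, matching the response shape from the question:
// Lifts the next URL out of a superagent response; assumes the API
// returns it at body.url, as in the original question.
const parseResponseUrl = (response) => response.body.url;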
In your snippet, the getUploadUrl(), uploadFiles(), and initializeSwam() functions are declared after the return statement. Function declarations are hoisted, so the chain still works, but those declarations only register the functions in the enclosing scope; a declaration doesn't fire off a function.
I believe what you want is something like:
// Notice the flow control for Promises:
async function getUploadUrl() {
  const request = await superagent.get(....);
  return request;
}

async function uploadFiles(responseFromGetUploadUrl) {
  const request = await superagent.post(responseFromGetUploadUrl.body.url);
  // attach files that are in data.files
  return request;
}

async function initializeSwam(responseFromUploadFilesRequest) {
  // Same thing here. I need access to data and responseFromUploadFilesRequest.body
  const request = await ...;
}
module.exports = data => {
  return getUploadUrl(data) // I'm guessing you wanted to pass this here
    .then(uploadFiles)
    .then(initializeSwam);
};
This approach uses the async/await feature (standardized in ES2017); you can alternatively achieve the same flow control using the bluebird Promise library's coroutines paired with generator functions.
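For comparison, a rough sketch of one of those functions as a bluebird coroutine (assuming bluebird is installed; the endpoint argument is hypothetical):
const Promise = require('bluebird');
const superagent = require('superagent');

// Yielded promises are awaited, much like `await` in an async function.
const getUploadUrl = Promise.coroutine(function* (uploadUrlEndpoint) {
  const response = yield superagent.get(uploadUrlEndpoint);
  return response.body.url;
});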

Can you call a saga and continue execution only when a different saga is finished?

Our app uses an ATTEMPT - SUCCESS - FAILURE approach to handling our responses from the server.
I have a generator function that needs to behave like this:
function* getSingleSectorAttempt(action) {
  const sectors = yield select(getSectors);
  if (!sectors) {
    // If there are no sectors, I need to call the GET_SECTORS_ATTEMPT action
    // and only restart/continue this saga when the GET_SECTORS_SUCCESS action is fired
  }
  const id = sectors[action.name].id;
  try {
    const response = yield call(api.getSector, id);
    //...
  } catch (err) {
    //...
  }
}
From what I've read of the Redux Saga documentation, this does not seem immediately possible. However, I would like to see if I'm missing something. I've already tried this:
yield fork(takeLatest, Type.GET_SECTORS_SUCCESS, getSingleSectorAttempt);
yield put(Actions.getSectorsAttempt());
in the if(!sectors) conditional block, but while this works it does not persist the initial GET_SINGLE_SECTOR_ATTEMPT action parameters, and I am not sure how to get it to do so without getting into callback and argument spaghetti.
The effect allowing you to wait for an action to be dispatched is take. In your case:
function* getSingleSectorAttempt(action) {
  let sectors = yield select(getSectors);
  if (!sectors) {
    yield put(getSectorsAttempt());
    yield take(GET_SECTORS_SUCCESS);
    sectors = yield select(getSectors);
  }
  // resume here as normal
}
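Since your app also dispatches FAILURE actions, a hedged variation using the race effect keeps the saga from waiting forever when the fetch fails (GET_SECTORS_FAILURE is a hypothetical action type here):
import { put, race, select, take } from 'redux-saga/effects';

function* getSingleSectorAttempt(action) {
  let sectors = yield select(getSectors);
  if (!sectors) {
    yield put(getSectorsAttempt());
    // Resume on success, bail out on failure.
    const { failure } = yield race({
      success: take(GET_SECTORS_SUCCESS),
      failure: take(GET_SECTORS_FAILURE), // hypothetical
    });
    if (failure) return;
    sectors = yield select(getSectors);
  }
  // resume here as normal
}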
Your own answer could have unexpected side-effects. For example, if getSectors can return a falsy value several times in the lifetime of the app, you would have several forked processes waiting for GET_SECTORS_SUCCESS to be dispatched, and each executing your side-effect, each keeping a reference to the action that triggered it.
Oops, figured it out:
function* getSingleSectorAttempt(action) {
  const sectors = yield select(getSectors);
  if (!sectors) {
    // Pass initial action in a callback function like so:
    yield fork(takeLatest, Type.GET_SECTORS_SUCCESS, () => getSingleSectorAttempt(action));
    yield put(Actions.getSectorsAttempt());
  } else {
    const id = sectors[action.name].id;
    try {
      const response = yield call(api.getSector, id);
      //...
    } catch (err) {
      //..
    }
  }
}

How to tie emitted events into redux-saga?

I'm trying to use redux-saga to connect events from PouchDB to my React.js application, but I'm struggling to figure out how to connect events emitted from PouchDB to my saga. Since the event uses a callback function (and I can't pass it a generator), I can't use yield put() inside the callback; it gives weird errors after ES2015 compilation (using Webpack).
So here's what I'm trying to accomplish; the part that doesn't work is inside replication.on('change', (info) => {}).
function * startReplication (wrapper) {
  while (yield take(DATABASE_SET_CONFIGURATION)) {
    yield call(wrapper.connect.bind(wrapper))
    // Returns a promise, or false.
    let replication = wrapper.replicate()
    if (replication) {
      replication.on('change', (info) => {
        yield put(replicationChange(info))
      })
    }
  }
}

export default [ startReplication ]
As Nirrek explained, when you need to connect to a push data source, you'll have to build an event iterator for that source.
I'd like to add that the above mechanism could be made reusable, so we don't have to recreate an event iterator for each different source.
The solution is to create a generic channel with put and take methods. You can call the take method from inside the generator and connect the put method to the listener interface of your data source.
Here is a possible implementation. Note that the channel buffers messages if no one is waiting for them (e.g. when the generator is busy doing some remote call):
function createChannel () {
  const messageQueue = []
  const resolveQueue = []

  function put (msg) {
    // anyone waiting for a message?
    if (resolveQueue.length) {
      // deliver the message to the oldest one waiting (First In, First Out)
      const nextResolve = resolveQueue.shift()
      nextResolve(msg)
    } else {
      // no one is waiting? queue the event
      messageQueue.push(msg)
    }
  }

  // returns a Promise resolved with the next message
  function take () {
    // do we have queued messages?
    if (messageQueue.length) {
      // deliver the oldest queued message
      return Promise.resolve(messageQueue.shift())
    } else {
      // no queued messages? queue the taker until a message arrives
      return new Promise((resolve) => resolveQueue.push(resolve))
    }
  }

  return {
    take,
    put
  }
}
Then the above channel can be used anytime you want to listen to an external push data source. For your example
function createChangeChannel (replication) {
  const channel = createChannel()
  // every change event will call put on the channel
  replication.on('change', channel.put)
  return channel
}

function * startReplication (getState) {
  // Wait for the configuration to be set. This can happen multiple
  // times during the life cycle, for example when the user wants to
  // switch database/workspace.
  while (yield take(DATABASE_SET_CONFIGURATION)) {
    let state = getState()
    let wrapper = state.database.wrapper

    // Wait for a connection to work.
    yield apply(wrapper, wrapper.connect)

    // Trigger replication, and keep the promise.
    let replication = wrapper.replicate()
    if (replication) {
      yield call(monitorChangeEvents, createChangeChannel(replication))
    }
  }
}

function * monitorChangeEvents (channel) {
  while (true) {
    const info = yield call(channel.take) // Blocks until the promise resolves
    yield put(databaseActions.replicationChange(info))
  }
}
We can use redux-saga's eventChannel. Here is my example:
// fetch history messages
function* watchMessageEventChannel(client) {
  const chan = eventChannel(emitter => {
    client.on('message', (message) => emitter(message));
    return () => {
      client.close().then(() => console.log('logout'));
    };
  });
  while (true) {
    const message = yield take(chan);
    yield put(receiveMessage(message));
  }
}

function* fetchMessageHistory(action) {
  const client = yield realtime.createIMClient('demo_uuid');
  // listen for message events
  yield fork(watchMessageEventChannel, client);
}
Please note: messages on an eventChannel are not buffered by default. If you want to process message events one by one, you cannot use a blocking call after const message = yield take(chan); otherwise you will miss messages emitted while the saga is blocked.
Alternatively, you can provide a buffer to the eventChannel factory to specify a buffering strategy for the channel (e.g. eventChannel(subscriber, buffer)). See the redux-saga API docs for more info.
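For illustration, a sketch of the buffered variant, reusing the hypothetical client from the example above (buffers is exported by redux-saga alongside eventChannel):
import { eventChannel, buffers } from 'redux-saga';

// Same channel as before, but it now holds up to 10 pending messages while
// the consuming saga is blocked; the oldest are dropped once the buffer fills.
const chan = eventChannel(emitter => {
  client.on('message', (message) => emitter(message));
  return () => {
    client.close().then(() => console.log('logout'));
  };
}, buffers.sliding(10));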
The fundamental problem we have to solve is that event emitters are 'push-based', whereas sagas are 'pull-based'.
If you subscribe to an event like so: replication.on('change', (info) => {}), then the callback is executed whenever the replication event emitter decides to push a new value.
With sagas, we need to flip the control around. It is the saga that must be in control of when it decides to respond to new change info being available. Put another way, a saga needs to pull the new info.
Below is an example of one way to achieve this:
function* startReplication(wrapper) {
  while (yield take(DATABASE_SET_CONFIGURATION)) {
    yield apply(wrapper, wrapper.connect);
    let replication = wrapper.replicate();
    if (replication)
      yield call(monitorChangeEvents, replication);
  }
}
function* monitorChangeEvents(replication) {
  const stream = createReadableStreamOfChanges(replication);
  while (true) {
    const info = yield stream.read(); // Blocks until the promise resolves
    yield put(replicationChange(info));
  }
}
// Returns a stream object that has a read() method we can use to read new info.
// The read() method returns a Promise that will be resolved when info from a
// change event becomes available. This is what allows us to shift from working
// with a 'push-based' model to a 'pull-based' model.
function createReadableStreamOfChanges(replication) {
  let deferred;

  replication.on('change', info => {
    if (!deferred) return;
    deferred.resolve(info);
    deferred = null;
  });

  return {
    read() {
      if (deferred) return deferred.promise;
      deferred = {};
      deferred.promise = new Promise(resolve => deferred.resolve = resolve);
      return deferred.promise;
    }
  };
}
There is a JSbin of the above example here: http://jsbin.com/cujudes/edit?js,console
You should also take a look at Yassine Elouafi's answer to a similar question:
Can I use redux-saga's es6 generators as onmessage listener for websockets or eventsource?
Thanks to @Yassine Elouafi.
I created a short, MIT-licensed, general channels implementation as a redux-saga extension for TypeScript, based on the solution by @Yassine Elouafi.
// redux-saga/channels.ts
import { Saga } from 'redux-saga';
import { call, fork } from 'redux-saga/effects';

export interface IChannel<TMessage> {
  take(): Promise<TMessage>;
  put(message: TMessage): void;
}

export function* takeEvery<TMessage>(channel: IChannel<TMessage>, saga: Saga) {
  while (true) {
    const message: TMessage = yield call(channel.take);
    yield fork(saga, message);
  }
}

export function createChannel<TMessage>(): IChannel<TMessage> {
  const messageQueue: TMessage[] = [];
  const resolveQueue: ((message: TMessage) => void)[] = [];

  function put(message: TMessage): void {
    if (resolveQueue.length) {
      const nextResolve = resolveQueue.shift();
      nextResolve(message);
    } else {
      messageQueue.push(message);
    }
  }

  function take(): Promise<TMessage> {
    if (messageQueue.length) {
      return Promise.resolve(messageQueue.shift());
    } else {
      return new Promise((resolve: (message: TMessage) => void) => resolveQueue.push(resolve));
    }
  }

  return {
    take,
    put
  };
}
And example usage, similar to redux-saga's takeEvery construction:
// example-socket-action-binding.ts
import { put } from 'redux-saga/effects';
import {
  createChannel,
  takeEvery as takeEveryChannelMessage
} from './redux-saga/channels';

export function* socketBindActions(
  socket: SocketIOClient.Socket
) {
  const socketChannel = createSocketChannel(socket);
  yield* takeEveryChannelMessage(socketChannel, function* (action: IAction) {
    yield put(action);
  });
}

function createSocketChannel(socket: SocketIOClient.Socket) {
  const socketChannel = createChannel<IAction>();
  socket.on('action', (action: IAction) => socketChannel.put(action));
  return socketChannel;
}
I had the same problem, also using PouchDB, and found the answers provided extremely useful and interesting. However, there are many ways to do the same thing in PouchDB; I dug around a little and found a different approach which may be easier to reason about.
If you don't attach listeners to the db.changes request, it returns any change data directly to the caller, and adding continuous: true to the options will cause it to issue a longpoll that doesn't return until some change has happened. So the same result can be achieved with the following:
export function* monitorDbChanges() {
  var info = yield call([db, db.info]); // get reference to last change
  let lastSeq = info.update_seq;

  while (true) {
    try {
      var changes = yield call([db, db.changes], { since: lastSeq, continuous: true, include_docs: true, heartbeat: 20000 });
      if (changes) {
        for (let i = 0; i < changes.results.length; i++) {
          yield put({ type: 'CHANGED_DOC', doc: changes.results[i].doc });
        }
        lastSeq = changes.last_seq;
      }
    } catch (error) {
      yield put({ type: 'monitor-changes-error', err: error });
    }
  }
}
There is one thing that I hadn't got to the bottom of at first: if I replace the for loop with changes.results.forEach((change) => {...}), I get an invalid syntax error on the yield. The reason is that yield is only valid directly inside a generator function body; the forEach callback is an ordinary function, so a yield inside it is a syntax error.
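To illustrate the difference (put is redux-saga's effect creator, as above):
import { put } from 'redux-saga/effects';

function* emitChanges(changes) {
  // Invalid: the forEach callback is an ordinary function, not a generator,
  // so a yield inside it is a SyntaxError:
  // changes.results.forEach((change) => {
  //   yield put({ type: 'CHANGED_DOC', doc: change.doc });
  // });

  // Valid: for...of keeps the yield directly inside the generator body.
  for (const change of changes.results) {
    yield put({ type: 'CHANGED_DOC', doc: change.doc });
  }
}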

Using rx.js, how do I emit a memoized result from an existing observable sequence on a timer?

I'm currently teaching myself reactive programming with rxjs, and I've set myself a challenge of creating an observable stream which will always emit the same result to a subscriber no matter what.
I've memoized the creation of an HTTP "GET" stream given a specific URL, and I'm trying to act on that stream every two seconds, with the outcome being that for each tick of the timer, I'll extract a cached/memoized HTTP result from the original stream.
import superagent from 'superagent';
import _ from 'lodash';

// Cached GET function, returning a stream that emits the HTTP response object
var httpget = _.memoize(function(url) {
  var req = superagent.get(url);
  req = req.end.bind(req);
  return Rx.Observable.fromNodeCallback(req)();
});

// Assume this is created externally and I only have access to response$
var response$ = httpget('/ontologies/acl.ttl');

// Every two seconds, emit the memoized HTTP response
Rx.Observable.timer(0, 2000)
  .map(() => response$)
  .flatMap($ => $)
  .subscribe(response => {
    console.log('Got the response!');
  });
I was sure that I'd have to stick a call to replay() in there somewhere, but no matter what I do, a fresh HTTP call is initiated every two seconds. How can I structure this so that I can construct an observable from a URL and have it always emit the same HTTP result to any subsequent subscribers?
EDIT
I found a way to get the result I want, but I feel like I am missing something, and should be able to refactor this with a much more streamlined approach:
var httpget = _.memoize(function(url) {
  var subject = new Rx.ReplaySubject();
  try {
    superagent.get(url).end((err, res) => {
      if (err) {
        subject.onError(err);
      } else {
        subject.onNext(res);
        subject.onCompleted();
      }
    });
  } catch (e) {
    subject.onError(e);
  }
  return subject.asObservable();
});
Your first code sample is actually closer to the way to do it:
var httpget = _.memoize(function(url) {
  var req = superagent.get(url);
  return Rx.Observable.fromNodeCallback(req.end, req)();
});
However, this isn't working because there appears to be a bug in fromNodeCallback. As a workaround until this is fixed, I think you are actually looking for AsyncSubject instead of ReplaySubject. The latter works, but the former is designed for exactly this scenario (and doesn't have the overhead of an array creation plus runtime checks for cache expiration, if that matters to you).
var httpget = _.memoize(function(url) {
  var subject = new Rx.AsyncSubject();
  var req = superagent.get(url);
  Rx.Observable.fromNodeCallback(req.end, req)().subscribe(subject);
  return subject.asObservable();
});
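A quick usage sketch, assuming the memoized httpget above: the AsyncSubject replays its final value, so both subscribers see the same cached response and only one request goes out.
var response$ = httpget('/ontologies/acl.ttl');

response$.subscribe(res => console.log('first subscriber:', res.status));
response$.subscribe(res => console.log('second subscriber:', res.status));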
Finally, though map appreciates that you are thinking of it, you can simplify your timer code by using the flatMap overload that takes an Observable directly:
Rx.Observable.timer(0, 2000)
  .flatMap(response$)
  .subscribe(response => {
    console.log('Got the response');
  });
Unless I am getting your question wrong, Observable.combineLatest does just that for you: it caches the last emitted value of your observable.
This code sends the request once and then gives the same cached response every 200 ms:
import reqPromise from 'request-promise';
import { Observable } from 'rx';

let httpGet_ = (url) =>
  Observable
    .combineLatest(
      Observable.interval(200),
      reqPromise(url),
      (count, response) => response
    );

httpGet_('http://google.com/')
  .subscribe(
    x => console.log(x),
    e => console.error(e)
  );
