What does the following syntax mean? I don’t understand the use of a * after a yield. I’m new to generators, redux, and sagas, so I would appreciate some help understanding what the * syntax, takeEvery(), and the return function *(action) { line do:
var MIDDLEWARES = []

function builder() {
    const LOAD_DATA = "POI_LOADER/LOAD_POIS"
    MIDDLEWARES.push(function *sagaFunction() {
        yield *takeEvery(LOAD_DATA, loadData(statusField))
    })
}

const loadData = (statusField) => {
    return function *(action) {
        console.log("action.venueId = " + action.venueId)
    }
}
There are several questions here:
What does the * in yield *takeEvery() mean? Seems to be answered by Delegated yield (yield star, yield *) in generator functions.
Why doesn't the builder() function need a * to make it a generator given that it contains a yield * statement? Is that because the yield *takeEvery() is wrapped in the generator function sagaFunction()?
What does the takeEvery() function do, especially given that it has a * in front of it? Based on its documentation, I think it applies loadData() to everything in LOAD_DATA. But if LOAD_DATA isn't an array, is takeEvery() needed in this code?
How come the declaration return function *(action) seems to have no name for the function? Is it declaring a generator with input parameter action and assigning that generator to the const called loadData?
Does the Saga library call next() on the generators created in this code?
So generators allow you to return a value from a generator function by using yield, and then resume execution from that point on the next call.
yield* indicates that the value being returned comes from another generator - the generator function that calls yield* is delegating the actual value creation to another generator function in this case.
See MDN on this for more info.
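To make the delegation concrete, here is a minimal standalone sketch (plain generators, nothing redux-saga specific):

function *inner() {
    yield 1
    yield 2
}

function *outer() {
    yield 0
    // delegate to inner(): outer yields 1, then 2, as if they were its own values
    yield *inner()
    yield 3
}

for (const v of outer()) {
    console.log(v) // 0, 1, 2, 3
}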
Related
So, I am entirely new to Redux-Saga and I have been testing it for a few days.
I have a working understanding of generators, actions, redux stores, sagas and others, and some good experience with JS overall.
I have a scenario where I have a function as follows:
project().itemClicked(val => /* do something with */)
Now, I am trying to put this code inside a generator function, in my saga. I basically need to dispatch an action, but for that I need to use the yield keyword. I need to put the val returned by the callback using yield put.
I have to make a few other yield call(function..) calls after executing the above requirement. I tried wrapping my function inside a Promise, but the problem with that is that the promise only resolves once itemClicked is fired, and so the rest of my yield calls are being blocked.
Is there a way I can yield inside my anonymous function?
Pardon my wording, because I am still learning/understanding Redux-Saga.
The solution depends on whether the callback can be called multiple times or only once.
If it is going to be called only once, your first approach - converting it to a promise - was the right one; you just need to use fork to make the wait non-blocking.
import { put, fork, call } from 'redux-saga/effects'

function * itemClickedSaga() {
    const val = yield new Promise(resolve => {
        project().itemClicked(resolve)
    })
    yield put(someActionCreator(val))
}

function * someSaga() {
    // waits for the promise in a different saga,
    // using fork so the call below is not blocked
    yield fork(itemClickedSaga)
    yield call(somethingSomething)
}
In case the callback can be called multiple times, you might want to use an eventChannel:
import { eventChannel } from 'redux-saga'
import { put, call, takeEvery } from 'redux-saga/effects'

const createItemClickedChannel = () => {
    return eventChannel(emitter => {
        project().itemClicked(emitter)
        // eventChannel expects the subscriber to return an unsubscribe function
        return () => {}
    })
}

function * itemClickedSaga(val) {
    yield put(someActionCreator(val))
}

function * someSaga() {
    const itemClickedChannel = yield call(createItemClickedChannel)
    yield takeEvery(itemClickedChannel, itemClickedSaga)
    yield call(somethingSomething)
}
I am trying to synchronously invoke a regular callback-style function in koa using generators. The following approach works:
var res = yield function (cb) {
    myDaoObject.load(function (err, res) {
        cb(err, res);
    })
};
So I want to replace it with the proper library use, which should be equivalent:
var ld = thunkify(myDaoObject.load);
var res = yield ld();
And that doesn't work. Aren't these supposed to be the same thing?
Actually you hardly need to use thunkify here, as your function doesn't take an argument. You can (and should) however simplify it to
yield function(cb) { myDaoObject.load(cb); }
and possibly even further to just
yield myDaoObject.load;
which would work if load were not a method that uses this. You will have to bind it to the object you want it to be called on:
yield myDaoObject.load.bind(myDaoObject);
The same problem applied to your thunkify call, which was otherwise fine (albeit unnecessary).
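So, for completeness, a sketch of the thunkify version with the binding applied (same myDaoObject as in the question, inside the koa generator):

var thunkify = require('thunkify');

var ld = thunkify(myDaoObject.load.bind(myDaoObject));
var res = yield ld();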
I am trying to figure out how to arrange a pair of routines to control writing to a stream using the generator/iterator functions in ES2015. It's a simple logging system to use in Node.js.
What I am trying to achieve is a function that external processes can call to write to a log. I am hoping that the new generator/iterator functions mean that if it needs to suspend inside this routine, that is transparent.
stream.write should normally return immediately, but it can return false to say that the stream is full. In that case it needs to wait for stream.on('drain', cb) to fire before returning.
I am thinking that the actual software that writes to the stream is a generator function which yields when it is ready to accept another request, and that the function I provide to allow external callers to write to the stream is an iterator, but I might have this the wrong way round.
So, something like this
var stopLogger = false;
var it = writer();

function writeLog(line) {
    it.next(line);
}

function *writer() {
    while (!stopLogger) {
        var line = yield;
        if (!stream.write(line)) {
            yield *waitDrain(); // can't continue until we get drain
        }
    }
}

function *waitDrain() {
    // Not sure what to do here to avoid waiting
    stream.on('drain', () => { /* do I yield here or something? */ });
}
I found the answer here https://davidwalsh.name/async-generators
I have it backwards.
The code above should be
var stopLogger = false;
function *writeLog(line) {
    yield writer(line);
}
var it = writeLog();

function writer(line) {
    if (stopLogger) {
        setTimeout(() => { it.next(); }, 1); // needed so we can get to the yield
    } else {
        if (stream.write(line)) {
            setTimeout(() => { it.next(); }, 1); // needed so we can get to the yield
        }
    }
}

stream.on('drain', () => {
    it.next();
});
I haven't quite tried this, just translated it from the above article, and there is some complication around errors etc., which the article suggests can be solved by enhancing the iterator to return a promise which gets resolved in a "runGenerator" function. But it solved my main issue, which was about how the pattern should work.
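For what it's worth, here is a minimal sketch (my own, not from the article) of wrapping the drain wait in a promise, assuming a standard Node.js writable stream; a driver along the lines of the article's runGenerator would then resume the generator when the promise resolves:

function waitDrain(stream) {
    // resolves once the stream has drained
    return new Promise(resolve => stream.once('drain', resolve));
}

function *writer(stream, line) {
    if (!stream.write(line)) {
        // suspend here; the driver resumes the generator after 'drain'
        yield waitDrain(stream);
    }
}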
I'm attempting to understand javascript generators in node.js v8, without using any third party libraries.
I want to try having a generator that invokes an asynchronous callback (something like a web request) and returns the value that is called back.
My code (using Node v0.11.12) looks like:
var callbackWrapper = function () {
    return function (callback) {
        callback(null, 5);
    };
};

var result = null;

var generator = (function *() {
    result = yield callbackWrapper();
})();

var nextIteration = generator.next();
At the end of this code, nextIteration.value is a function, and result is still null.
It was my understanding that if yield is given a method with a callback, it invokes that callback.
My goal is to somehow get the value, 5, that is called back.
What is the way to do this? Do I have to pass some function to nextIteration.value and invoke it?
Let's clarify some things.
It was my understanding that if yield is given a method with a callback, it invokes that callback.
yield does not invoke anything. The purpose of yield is, well, to yield... When you define a generator you define an entity which yields values. It has nothing to do with async code. We can, however, leverage an important property of generators to handle async code.
Generators, by their definition, yield values. Here's the important part - between one yield to another the generator is suspended. In other words, it waits until it is asked to yield the next value. The internal state of the generator is kept in memory (on the stack) until the generator is exhausted (no more values to yield).
Another important property of generators is that we can send values back, thus changing their internal state.
So if we can suspend generators (make them wait) and we can send values back; we can essentially make them wait for some async operation to complete and then send the result back.
What is the way to do this? Do I have to pass some function to nextIteration.value and invoke it?
Basically, you need to wrap your code with a generator. Every time you start an async operation, you make the generator wait by using yield. When the async operation completes, you resume your generator by sending the result back with next(result).
I wrote an extensive post to explain this issue, which I'm sure will help you understand. Take a look: http://eyalarubas.com/javascript-generators-and-callbacks.html
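As a minimal illustration of that wait/resume cycle (my own sketch, using setTimeout to stand in for the async operation):

function *worker() {
    // the generator suspends here until someone calls next(value)
    var value = yield;
    console.log("resumed with: " + value);
}

var gen = worker();
gen.next();                      // run up to the yield and suspend
setTimeout(function () {
    gen.next(42);                // the async operation "completes"; send the result back
}, 100);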
Generators don't handle node-style callbacks on their own. Instead, your generator simply yields back the function you wrapped inside the callbackWrapper thunk, because yield only hands back a value and then pauses execution at that point in time.
Generators weren't really designed for control flow, but you can build on top of them to create control flow libraries like co, suspend, etc.
Basically what these libraries do (I'm oversimplifying here) is take your generator function and repeatedly call it until it tells them that it has finished.
Each of these libraries handles the internal yields in different ways: for example, co turns everything it can handle into thunks internally, while suspend uses node-style callbacks for everything internally.
At each yield they check what was yielded to them - a thunk, a promise, a generator, or whatever constructs that library handles - and hand control back once those are completed.
You can build a structure around generators to handle asynchronous thunked functions, but it's not for the faint of heart. Basically it would go something like this (note: don't use this for anything other than playing around, as it's missing all the normal checks, error handling, etc.):
function controlFlow(genFunc){
    // check to make sure we have a generator function, otherwise explode
    var generator; // reference for an initialized generator

    // return a function so that execution time can be postponed
    return function(){
        runGen(genFunc());
    }

    function runGen(generator){
        var ret = generator.next();

        // here the generator is already finished, so we'll return the final value
        if(ret.done){
            return ret.value
        }

        // here we'll handle the yielded value no matter what type it is
        // I'm being naive here for example's sake - don't do this
        if(typeof ret.value === 'function'){
            // oh look, we have a regular function (very faulty logic)
            // we wouldn't do this either, but yeah
            ret.value(function(err, result){
                console.log(result);
            });
        }

        // or we got another item like an array or object, which means parallelism or
        // serialization depending on what we got back
        // turn the array, object, or whatever into callback mechanisms and then track their completion
        // we're just going to fake it here and just handle the next call
        runGen(generator);
    }
}
function thunked(callback){
    return function(){
        callback(null, 5);
    };
}

function regular(callback){
    console.log('regular function call');
    callback(null, 'value');
}

controlFlow(function *(){
    yield thunked(function(err, result){
        console.log(err);
        console.log(result);
    });

    yield regular;

    yield thunked(function(err, result){
        console.log('Another Thunked');
    });

    yield regular(function(err, result){
        console.log(err);
        console.log(result);
    });
})();
result won’t get assigned until you send a value back to the generator by calling next again with the value you want to assign to result. In your example that would look like this:
nextIteration.value(function (error, value) {
    generator.next(value);
});
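Putting it together with the question's code (just to confirm the flow):

var nextIteration = generator.next();       // pauses at the yield; .value is the thunk
nextIteration.value(function (error, value) {
    generator.next(value);                  // resumes the generator, so result is assigned
    console.log(result);                    // 5
});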
I've been very excited about Node.js for a while. I finally decided to knuckle down and write a test project to learn about generators in the latest Harmony build of Node.
Here is my very simple test project:
https://github.com/kirkouimet/project-node
To run my test project, you can easily pull the files from Github and then run it with:
node --harmony App.js
Here's my problem - I can't seem to get Node's asynchronous fs.readdir method to run inline with generators. Other projects out there, such as Galaxy and suspend seem to be able to do it.
Here is the block of code I need to fix. I want to be able to instantiate an object of type FileSystem and call the .list() method on it:
https://github.com/kirkouimet/project-node/blob/4c77294f42da9e078775bb84c763d4c60f21e1cc/FileSystem.js#L7-L11
FileSystem = Class.extend({
    construct: function() {
        this.currentDirectory = null;
    },
    list: function*(path) {
        var list = yield NodeFileSystem.readdir(path);
        return list;
    }
});
Do I need to do something ahead of time to convert Node's fs.readdir into a generator?
One important note, I am parsing all class functions as they are created. This lets me handle generator functions differently than normal functions:
https://github.com/kirkouimet/project-node/blob/4c77294f42da9e078775bb84c763d4c60f21e1cc/Class.js#L31-L51
I've been really stumped with this project. Would love any assistance!
Here is what I am trying to accomplish:
Heavy use of classes with a modified version of John Resig's JavaScript Class support with inheritance
Using generators to get inline support for Node's stock async calls
Edit
I've tried to implement your example function and I am running into some trouble.
list: function*(path) {
    var list = null;
    var whatDoesCoReturn = co(function*() {
        list = yield readdir(path);
        console.log(list); // This shows an array of files (good!)
        return list; // Just my guess that co should get this back; it doesn't
    })();
    console.log(whatDoesCoReturn); // This returns undefined (sad times)
    // I need to use `list` right here
    return list; // This returns as null
}
First and foremost, it is important to have a good model in your head of exactly what a generator is. A generator function is a function that returns a generator object, and that generator object will step through yield statements within the generator function as you call .next() on it.
Given that description, you should notice that asynchronous behavior is not mentioned. Any action on a generator on its own is synchronous. You can run to the first yield immediately and then do a setTimeout and then call .next() to go to the next yield, but it is the setTimeout that causes asynchronous behavior, not the generator itself.
So let's cast this in the light of fs.readdir. fs.readdir is an async function, and using it in a generator on its own will have no effect. Let's look at your example:
function * read(path){
    return yield fs.readdir(path);
}

var gen = read(path);
// gen is now a generator object.

var first = gen.next();
// This effectively runs fs.readdir(path) and yields its return value,
// which means first.value === undefined, since fs.readdir returns nothing.

var final = gen.next();
// final.value is undefined as well, because you are returning the result of 'yield',
// and that is the value passed into .next(), and you are not passing anything to it.
Hopefully this makes it clearer that you are still calling readdir synchronously, and you are not passing any callback, so it will probably throw an error or something.
So how do you get nice behavior from generators?
Generally this is accomplished by having the generator yield a special object that represents the result of readdir before the value has actually been calculated.
For (unrealistic) example, yielding a function is a simple way to yield something that represents the value.
function * read(path){
    return yield function(callback){
        fs.readdir(path, callback);
    };
}

var gen = read(path);
// gen is now a generator object.

var first = gen.next();
// first.value is now the function(callback){ ... } that was yielded.

// Trigger the callback to calculate the value here.
first.value(function(err, dir){
    var dirData = gen.next(dir);
    // dirData.value will just be 'dir', since we are directly returning the yielded value.
    // Do whatever.
});
Really, you would want this type of logic to continue calling the generator until all of the yield calls are done, rather than hard-coding each call. The main thing to notice, though, is that now the generator itself looks synchronous, and everything outside the read function is super generic.
You need some kind of generator wrapper function that handles this yield-value process, and your example of suspend does exactly this. Another example is co.
The standard approach to "return something representing the value" is to yield a promise or a thunk, since returning a bare function like I did is kind of ugly.
With the thunkify and co libraries, you would do the above without the example function:
var thunkify = require('thunkify');
var co = require('co');
var fs = require('fs');

var readdir = thunkify(fs.readdir);

co(function * (){
    // `readdir` will call the node function, and return a thunk representing the
    // directory, which is then `yield`ed to `co`, which will wait for the data
    // to be ready, and then it will start the generator again, passing the value
    // as the result of the `yield`.
    var dirData = yield readdir(path);

    // Do whatever.
})(function(err, result){
    // This callback is called once the synchronous-looking generator has returned
    // or thrown an exception.
});
Update
Your update still has some confusion. If you want your list function to be a generator, then you will need to use co outside of list wherever you are calling it. Everything inside of co should be generator-based and everything outside co should be callback-based. co does not make list automatically asynchronous. co is used to translate a generator-based async flow control into callback-based flow control.
e.g.
list: function(path, callback){
    co(function * (){
        var list = yield readdir(path);
        // Use `list` right here.
        return list;
    })(function(err, result){
        // err here would be set if your 'readdir' call had an error.
        // result is the return value from 'co', so it would be 'list'.
        callback(err, result);
    })
}
#loganfsmyth already provides a great answer to your question. The goal of my answer is to help you understand how JavaScript generators actually work, as this is a very important step to using them correctly.
Generators implement a state machine, a concept which is nothing new by itself. What's new is that generators allow you to use familiar JavaScript language constructs (e.g., for, if, try/catch) to implement a state machine without giving up the linear code flow.
The original goal for generators is to generate a sequence of data, which has nothing to do with asynchrony. Example:
// with generator
function* sequence()
{
    var i = 0;
    while (i < 10)
        yield ++i * 2;
}

for (var j of sequence())
    console.log(j);

// without generator
function bulkySequence()
{
    var i = 0;
    var nextStep = function() {
        if ( i >= 10 )
            return { value: undefined, done: true };
        return { value: ++i * 2, done: false };
    }
    // make the object iterable as well, so for..of accepts it
    return { next: nextStep, [Symbol.iterator]: function() { return this; } };
}

for (var j of bulkySequence())
    console.log(j);
The second part (bulkySequence) shows how to implement the same state machine in the traditional way, without generators. In this case, we are no longer able to use a while loop to generate values, and the continuation happens via the nextStep callback. This code is bulky and unreadable.
Let's introduce asynchrony. In this case, the continuation to the next step of the state machine will be driven not by a for...of loop, but by some external event. I'll use a timer interval as the source of the event, but it might just as well be a Node.js operation completion callback, or a promise resolution callback.
The idea is to show how it works without using any external libraries (like Q, Bluebird, Co etc.). Nothing stops the generator from driving itself to the next step, and that's what the following code does. Once all steps of the asynchronous logic have completed (the 10 timer ticks), doneCallback will be invoked. Note that I don't return any meaningful data with yield here; I merely use it to suspend and resume the execution:
function workAsync(doneCallback)
{
    var worker = (function* () {
        // the timer callback drives us to the next step
        var interval = setInterval(function() {
            worker.next(); }, 500);
        try {
            var tick = 0;
            while (tick < 10) {
                // resume upon next tick
                yield null;
                console.log("tick: " + tick++);
            }
            doneCallback(null, null);
        }
        catch (ex) {
            doneCallback(ex, null);
        }
        finally {
            clearInterval(interval);
        }
    })();

    // initial step
    worker.next();
}

workAsync(function(err, result) {
    console.log("Done, any error: " + err); });
Finally, let's create a sequence of events:
function workAsync(doneCallback)
{
    var worker = (function* () {
        // each timer callback drives us to the next step
        setTimeout(function() {
            worker.next(); }, 1000);
        yield null;
        console.log("timer1 fired.");

        setTimeout(function() {
            worker.next(); }, 2000);
        yield null;
        console.log("timer2 fired.");

        setTimeout(function() {
            worker.next(); }, 3000);
        yield null;
        console.log("timer3 fired.");

        doneCallback(null, null);
    })();

    // initial step
    worker.next();
}

workAsync(function(err, result) {
    console.log("Done, any error: " + err); });
Once you understand this concept, you can move on to using promises as wrappers for generators, which takes this pattern to the next level.
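As a rough sketch of that next step (my own example, in the spirit of co rather than any particular library's API): the generator yields promises, and a small runner resumes it with each resolved value:

function run(genFunc) {
    var gen = genFunc();
    function step(value) {
        var ret = gen.next(value);
        if (ret.done)
            return Promise.resolve(ret.value);
        // assume every yielded value is a promise; a real runner
        // would also check for thunks, handle errors, etc.
        return Promise.resolve(ret.value).then(step);
    }
    return step();
}

run(function* () {
    var a = yield Promise.resolve(1);
    var b = yield Promise.resolve(2);
    console.log(a + b); // 3
});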