Is there a JavaScript universal chaining method?

WARNING: Axios is broken if you use this, but fetch is not.
In short, I'm asking whether there is a standard way to chain functions independently of type, with an implementation similar to the following, which doesn't pollute native prototypes:
Object.prototype.pipe = function (func) {
  return func(this)
}
Edit: Use the following if you want to avoid the enumerability bug where the method would appear when enumerating the keys of any object; properties added with Object.defineProperty are non-enumerable by default.
Object.defineProperty(Object.prototype, 'pipe', {
  value(func) {
    return func(this)
  }
})
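To illustrate the enumerability difference, here is a minimal sketch of how the two definitions behave under a for...in loop:
// Plain assignment creates an enumerable property on Object.prototype,
// so it leaks into every for...in loop:
Object.prototype.pipe = function (func) { return func(this) }
for (const key in { a: 1 }) console.log(key) // "a", "pipe"
delete Object.prototype.pipe

// Object.defineProperty defaults to enumerable: false,
// so the method no longer shows up during enumeration:
Object.defineProperty(Object.prototype, 'pipe', {
  value(func) { return func(this) }
})
for (const key in { a: 1 }) console.log(key) // "a"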
This would permit doing things like:
const preloadContext = require.context('#/', true, /\.preload\.vue$/)
const preloadComponents = preloadContext
  .keys()
  .map(fileName => {
    const name = fileName
      .split('/')
      .pop()
      .replace(/\.\w+.\w+$/, '')
      .pipe(camelCase)
      .pipe(upperFirst)
    const component = fileName
      .pipe(preloadContext)
      .pipe(config => config.default || config)
    Vue.component(name, component)
    return [name, component]
  })
  .pipe(Object.fromEntries)
instead of
const preloadContext = require.context('#/', true, /\.preload\.vue$/)
const preloadComponents = Object.fromEntries(preloadContext
  .keys()
  .map(fileName => {
    const name = upperFirst(camelCase(fileName
      .split('/')
      .pop()
      .replace(/\.\w+.\w+$/, '')
    ))
    const config = preloadContext(fileName)
    const component = config.default || config
    Vue.component(name, component)
    return [name, component]
  })
)

No, there’s no real standard way. There’s a stage 1 proposal for a pipeline operator (|>) that would do it, though:
const name = fileName
  .split('/')
  .pop()
  .replace(/\.\w+.\w+$/, '')
  |> camelCase
  |> upperFirst
which you can use today with rewriting tools like Babel.
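If you go the Babel route, the plugin involved is @babel/plugin-proposal-pipeline-operator. A minimal babel.config.js sketch follows; the "proposal" option and its accepted values are an assumption here and vary between plugin versions, so check the plugin's documentation:
// babel.config.js (sketch; the "proposal" option is assumed and may differ by plugin version)
module.exports = {
  plugins: [
    ['@babel/plugin-proposal-pipeline-operator', { proposal: 'minimal' }]
  ]
}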
If you don’t want to use a rewriting tool, though, I definitely wouldn’t create Object.prototype.pipe. There are still cleaner, if non-ideal, options:
const pipe = (value, fns) =>
  fns.reduce((acc, fn) => fn(acc), value);
const foo = pipe('Mzg0MA==', [
  atob,
  Number,
  x => x.toString(16),
]);
console.log(foo);
(Lots of helpful function collections and individual packages implement this.)
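Lodash's flow, for example, composes functions left to right, so the camelCase/upperFirst step from earlier becomes a reusable function. A minimal sketch (the file name here is made up):
import flow from 'lodash/flow'
import camelCase from 'lodash/camelCase'
import upperFirst from 'lodash/upperFirst'

// flow(f, g) returns a function equivalent to x => g(f(x))
const toComponentName = flow(camelCase, upperFirst)

console.log(toComponentName('my-widget.preload')) // "MyWidgetPreload"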

Related

How to pass variable from each promise to a Promise.allSettled?

In my project (Vue + Vuex) I need to make some API requests simultaneously, according to some contents, and then process the results.
The getters.api_props(key) function will return the method ('post', 'patch', 'delete') or false if there is no need for a request. It will also return the url and the object that is needed for the request.
The api method returns the request as a Promise using axios.
Here is my code so far:
var contents = {person: {...}, info: {...}}
var promiseArray = [];
for (var key in contents) {
  let [method, url, hash] = getters.api_props(key);
  if (method) { promiseArray.push(api[method](url, hash)) }
}
await Promise.allSettled(promiseArray).then((results) => {
  results.map(r => {
    // THE RESULTS WILL BE PROCESSED HERE like:
    // commit("save", [key, r])
    console.info(r)
  })
}).catch(e => console.log('ERROR:::', e)).finally(commit("backup"))
The problem is that the results do not include the 'key', so the save method that is called cannot know where to save each result.
Can you propose a fix or a better solution?
I would recommend writing
const contents = {person: {...}, info: {...}}
const promiseArray = [];
for (const key in contents) {
  const [method, url, hash] = getters.api_props(key);
  if (method) {
    promiseArray.push(api[method](url, hash).then(value => ({
      key,
      status: 'fulfilled',
      value
    }), reason => ({
      key,
      status: 'rejected',
      reason
    })))
  }
}
// Promise.all is fine here: each promise converts its own rejection into a
// fulfilled {key, status, reason} object, so nothing in the array ever rejects.
const results = await Promise.all(promiseArray);
for (const r of results) {
  if (r.status == 'fulfilled') {
    console.info(r.key, ':', r.value.data)
    commit("save", [r.key, r.value]);
  } else if (r.status == 'rejected') {
    console.warn(r.key, ':', r.reason)
  }
}
commit("backup");
So, to answer my own question, after Bergi's comments I filled promiseArray with
api[method](url, hash).then((r) => [key, r]).catch((e) => {throw [key, e.response]})
and then found the key that I needed:
await Promise.allSettled(promiseArray).then((results) => {
  results.map((r) => {
    if (r.status == 'fulfilled') {
      console.info(r.value[0], ':', r.value[1].data)
    }
    if (r.status == 'rejected') {
      console.warn(r.reason[0], ':', r.reason[1])
    }
  })
})
You obviously don't need to take this, but I fiddled with it for a while and this is what I liked best:
import forEach from 'lodash/forEach'
import mapValues from 'lodash/mapValues'
import { api, getters } from 'somewhere'
var contents = {person: {...}, info: {...}}

const promiseContents = mapValues(contents, (value, key) => {
  let [method, url, hash] = getters.api_props(key);
  if (!method) { return }
  return api[method](url, hash)
})

await Promise.allSettled(Object.values(promiseContents))
forEach(promiseContents, (promise, key) => {
  if (!promise) return // keys that didn't need a request map to undefined
  promise
    .then(response => console.info(key, ':', response.data))
    .catch(error => console.warn(key, ':', error))
})
The big requirement is that you include lodash in the project, but that is not an unusual ask in JavaScript projects.
mapValues lets you keep the structure of your contents object while replacing the values with promises. I just await Promise.allSettled to tell the rest of the code when it can proceed, and ignore its results.
Finally, using lodash's forEach I interpret the results. The advantage here is that every promise is handled in a callback alongside the key from your original contents object.
I like doing it this way because it doesn't require you to create a [key, result] array. That said, either way works fine.

JS spread operator workflow on React

React suggests not to mutate state. I have an array of objects which I am manipulating based on some events. My question is, is it okay to write it like this:
const makeCopy = (arr) => arr.map((item) => ({ ...item }));

function SomeComponent() {
  const [filters, setFilters] = useState(aemFilterData);
  const handleFilterClick = (filter, c) => {
    let copiedFilters = makeCopy(filters);
    /**
     * Apply toggle on the parent as well
     */
    if (!("parentId" in filter)) {
      copiedFilters[filter.id].open = !copiedFilters[filter.id].open;
    }
    setFilters(copiedFilters);
  }
}
Am I mutating the original objects by doing it like the above? Or does it make a difference if it is written like this:
const makeCopy = (arr) => arr.map((item) => ({ ...item }));

function SomeComponent() {
  const [filters, setFilters] = useState(aemFilterData);
  const handleFilterClick = (filter, c) => {
    let copiedFilters = makeCopy(filters);
    /**
     * Apply toggle on the parent as well
     */
    if (!("parentId" in filter)) {
      copiedFilters = copiedFilters.map((f) => {
        if (filter.id === f.id) {
          return {
            ...f,
            open: !f.open,
          };
        } else {
          return { ...f };
        }
      });
    }
    setFilters(copiedFilters);
  }
}
What's the preferred way to do this? Spread operators are getting quite verbose and I don't like that, but I'll go with it if that's how it needs to be done here. immutable.js and immer are not an option right now.
const makeCopy = (arr) => arr.map((item) => item );
With the above code, you would be mutating the original objects, because map without spreading each item only creates a new outer array; it does not clone the inner objects.
copiedFilters[filter.id].open = !copiedFilters[filter.id].open;
Here the reference of copiedFilters[filter.id] and filters[filter.id] is the same.
With the spread operator:
const makeCopy = (arr) => arr.map((item) => ({ ...item }));
Here we create a new copy of the inner objects too, so copiedFilters[filter.id] and filters[filter.id] will have different references.
This is the same as your second approach.
So either use the spread operator while making the copy, or skip making the copy in the second approach and map directly over filters, since you're already using the spread operator there. The latter looks better because there's no reason to run the loop twice: once to create the copy and again to update open.
// let copiedFilters = makeCopy(filters); // not needed in the second approach
const copiedFilters = filters.map((f) => {
  if (filter.id === f.id) {
    return {
      ...f,
      open: !f.open,
    };
  } else {
    return { ...f };
  }
});
You can create a deep clone when you copy, but that would be a waste of computation and memory; I don't think it's needed here.
A deep clone is helpful when you have further nesting in the object.
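For example, here is a minimal sketch (with a hypothetical nested filter shape) of why a shallow spread copy is no longer enough once there is nesting:
const original = { id: 1, open: false, meta: { clicks: 0 } }
const copy = { ...original }      // shallow copy: copy.meta still points at the same object
copy.open = true                  // fine: top-level fields were copied
copy.meta.clicks = 5              // mutates original.meta as well

console.log(original.open)        // false
console.log(original.meta.clicks) // 5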

Best way to call a function before every function in an object?

I'm creating an interface and I need to run a check function before most methods; repeating the code over and over doesn't seem great. Is there a way I can run a function before a function?
Example of my current code
const uploadObject = async (object) => {
  if (!ipfs) noProvider()
  const buffer = objectToIpfsBuffer(object)
  return ipfs.add(buffer)
}

const uploadString = async (string) => {
  if (!ipfs) noProvider()
  const buffer = stringToIpfsBuffer(string)
  return ipfs.add(buffer)
}

const uploadBuffer = async (buffer) => {
  if (!ipfs) noProvider()
  return ipfs.add(buffer)
}

...

module.exports = {
  uploadObject,
  uploadString,
  uploadBuffer,
  ...
}
The function I wish to run before is if (!ipfs) noProvider()
I see no issue with handling this the way you are; however, another approach to "hook" a property accessor is to use a JavaScript Proxy.
The Proxy object is used to define custom behavior for fundamental operations (e.g. property lookup, assignment, enumeration, function invocation, etc).
When initializing a Proxy, you'll need to provide two inputs:
target: A target object (can be any sort of object, including a native array, a function or even another proxy) to wrap with Proxy.
handler: An object which is a placeholder object which contains traps for Proxy. All traps are optional. If a trap has not been defined, the default behavior is to forward the operation to the target.
Here's an example:
const handler = {
  get: function(target, prop, receiver) {
    console.log('A value has been accessed');
    return Reflect.get(...arguments);
  }
}

const state = {
  id: 1,
  name: 'Foo Bar'
}

const proxiedState = new Proxy(state, handler);

console.log(proxiedState.name);
I would probably just do it inline like you are, but to add another tool to your toolbelt: you could create a higher-order function which takes in a function and produces a new function that will do the check, and then do the work.
const checkIpfs = fxn => {
  return (...args) => {
    if (!ipfs) noProvider();
    return fxn(...args);
  }
}

const uploadObject = checkIpfs(async (object) => {
  const buffer = objectToIpfsBuffer(object)
  return ipfs.add(buffer);
});

const uploadString = checkIpfs(async (string) => {
  const buffer = stringToIpfsBuffer(string)
  return ipfs.add(buffer)
})

const uploadBuffer = checkIpfs(async (buffer) => {
  return ipfs.add(buffer)
})
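The wrapped functions are drop-in replacements, so (assuming the same module layout as in the question) the existing export can stay as it is:
module.exports = {
  uploadObject,
  uploadString,
  uploadBuffer,
  ...
}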
You could use a Proxy object, which intercepts internal operations of another object.
var newObject = new Proxy(yourObject, {
  get(target, prop, receiver) {
    if (['uploadObject', 'uploadString', 'uploadBuffer'].includes(prop) && typeof target[prop] === 'function') {
      if (!ipfs) noProvider()
    }
    return Reflect.get(target, prop, receiver);
  },
});

newObject.uploadObject();

Equivalent of BlueBird Promise.props for ES6 Promises?

I would like to wait for a map of word to Promise to finish. Bluebird has Promise.props, which accomplishes this, but is there a clean way to do this in regular JavaScript? I think I can make a new object which houses both the word and the Promise, get an array of Promises of those objects, and then call Promise.all and put them in the map, but it seems like overkill.
An implementation of Bluebird's Promise.props that works on plain objects:
/**
 * This function maps `{a: somePromise}` to a promise that
 * resolves with `{a: resolvedValue}`.
 * @param {object} obj
 * @returns {Promise<object>}
 */
function makePromiseFromObject(obj) {
  const keys = Object.keys(obj);
  const values = Object.values(obj);
  return Promise.all(values)
    .then(resolved => {
      const res = {};
      for (let i = 0; i < keys.length; i += 1) {
        res[keys[i]] = resolved[i];
      }
      return res;
    })
}
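A quick usage sketch (the endpoints are made up for illustration):
makePromiseFromObject({
  user: fetch('/api/user').then(r => r.json()),   // hypothetical endpoint
  posts: fetch('/api/posts').then(r => r.json()), // hypothetical endpoint
}).then(result => {
  // result is a plain object: { user: ..., posts: ... }
  console.log(result.user, result.posts)
})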
If you are dealing with a Map whose values are promises (or a mix of promises and non-promises) and you want the final resolved value to be a Map with all values resolved:
const mapPromise = map =>
  Promise.all(Array.from(map.entries()).map(([key, value]) => Promise.resolve(value).then(value => ({key, value}))))
    .then(results => {
      const ret = new Map();
      results.forEach(({key, value}) => ret.set(key, value));
      return ret;
    });
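A quick usage sketch; non-promise values work too because of the Promise.resolve wrapping:
const m = new Map([
  ['a', Promise.resolve(1)],
  ['b', 2], // plain value, handled by Promise.resolve
])

mapPromise(m).then(resolved => {
  console.log(resolved.get('a'), resolved.get('b')) // 1 2
})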
Although, I bet someone has a slicker way to do this, some of the new ES2015+ stuff is still new to me :p
The venerable async.js library has a promisified counterpart: async-q
The promisified async-q library supports all the functions in the async library, specifically async.parallel(). At first glance async.parallel() looks just like Promise.all(), accepting an array of functions (note one difference: an array of functions, not promises) and running them in parallel. What makes async.parallel special is that it also accepts an object:
const asyncq = require('async-q');

async function foo () {
  const results = await asyncq.parallel({
    something: asyncFunction,
    somethingElse: () => anotherAsyncFunction('some argument')
  });
  console.log(results.something);
  console.log(results.somethingElse);
}
Alternative implementation combining ES6+ Object.entries() and Object.fromEntries():
async function pprops(input) {
  return Object.fromEntries(
    await Promise.all(
      Object.entries(input)
        .map(
          ([k, p]) => p.then(v => [k, v])
        )
    )
  );
}
I have two different implementations using ES6 async functions:
async function PromiseAllProps(object) {
  const values = await Promise.all(Object.values(object));
  // note: this writes the resolved values back onto the input object and returns it
  Object.keys(object).forEach((key, i) => object[key] = values[i]);
  return object;
}
One line shorter, but less optimized:
async function PromiseAllProps(object) {
  const values = await Promise.all(Object.values(object));
  return Object.fromEntries(Object.keys(object).map((prop, i) => ([prop, values[i]])));
}
Example
const info = await PromiseAllProps({
  name: fetch('/name'),
  phone: fetch('/phone'),
  text: fetch('/foo'),
});

console.log(info);
// {
//   name: "John Appleseed",
//   phone: "5551234",
//   text: "Hello World"
// }
It would be advisable to use a library like bluebird for this. If you really want to do this yourself, the main idea is to:
Resolve each of the map values and connect the promised value back with the corresponding key
Pass those promises to Promise.all
Convert the final promised array back to a Map
I would make use of the second argument of Array.from, and the fact that an array of key/value pairs can be passed to the Map constructor:
Promise.allMap = function(map) {
  return Promise.all(Array.from(map,
    ([key, promise]) => Promise.resolve(promise).then(value => [key, value])
  )).then(results => new Map(results));
}

// Example data
const map = new Map([
  ["Planet", Promise.resolve("Earth")],
  ["Star", Promise.resolve("Sun")],
  ["Galaxy", Promise.resolve("Milky Way")],
  ["Galaxy Group", Promise.resolve("Local Group")]
]);

// Resolve map values
Promise.allMap(map).then(result => console.log([...result]));
You can simply write it using Promise.all + reduce
const promiseProps = (props) => Promise.all(Object.values(props)).then(
  (values) => Object.keys(props).reduce((acc, prop, index) => {
    acc[prop] = values[index];
    return acc;
  }, {})
);
And a nice lodash variant, for the sake of completeness, for plain JS objects:
import _ from 'lodash';

async function makePromiseFromObject(obj: {[key: string]: Promise<any>}) {
  return _.zipObject(Object.keys(obj), await Promise.all(Object.values(obj)))
}
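For reference, _.zipObject simply pairs keys with values positionally:
_.zipObject(['a', 'b'], [1, 2]) // => { a: 1, b: 2 }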

Advanced example of RxJS5 combination with delayed observable

Hi, I faced a problem with RxJS combination operators...
Here is an example object:
const userData = {
  dbKeyPath: 'www.example.com/getDbKey',
  users: [
    {name: 'name1'},
    {name: 'name2'},
    {name: 'name3'}
  ]
}
Make observables from it:
const userDataStream = Rx.Observable.of(userData)
const dbKeyStream = userDataStream.mergeMap(_userData => getDbKey(_userData.dbKeyPath))
const userStream = userDataStream.pluck('users').mergeMap(_users => Rx.Observable.from(_users))
My expected result is a stream of combined values:
[user[0],dbKey],[user[1],dbKey],[user[2],dbKey]...
It works pretty well with the withLatestFrom operator:
const result = userStream.withLatestFrom(dbKeyStream) // [user, dbkey]
But how can I achieve the same result when I apply the .delay() operator to dbKeyStream?
I would suggest using the mergeMap overload with a result selector function:
// Runnable against RxJS 5 (global Rx, e.g. https://cdnjs.cloudflare.com/ajax/libs/rxjs/5.0.3/Rx.js)
const userData = {
  dbKeyPath: 'www.example.com/getDbKey',
  users: [
    {name: 'name1'},
    {name: 'name2'},
    {name: 'name3'}
  ]
};

function getDbKey(path) {
  return Rx.Observable.of('the-db-key:' + path)
    .do(() => console.log('fetching db key for path: ' + path))
    .delay(1000);
}

const userDataStream = Rx.Observable.of(userData)
  .mergeMap(
    _userData => getDbKey(_userData.dbKeyPath),
    (_userData, dbKey) => _userData.users.map(_usr => ({ user: _usr, dbKey }))
  )
  .subscribe(console.log);
This gives you the input object and each output value to combine together as you require.
