I recently got into functional programming bit by bit, and nowadays I try to write every function I code in point-free style.
I was practicing with Ramda while coding a binary search algorithm.
const R = require("ramda");
const getMidIndex = R.compose(
  R.call(R.invoker(1, "floor"), R.__, Math),
  R.divide(R.__, 2),
  R.apply(R.add),
  R.props(["upper", "lower"])
);
const getMidElement = R.converge(R.nth, [getMidIndex, R.prop("list")]);
const getSearchValue = R.prop("searchValue");
var binarySearch = R.compose(
  R.prop("ans"),
  R.until(
    R.either(
      R.compose(R.not, R.compose(R.isNil, R.prop("ans"))),
      R.compose(R.apply(R.gt), R.props(["lower", "upper"]))
    ),
    R.cond([
      [
        R.converge(R.equals, [getSearchValue, getMidElement]),
        R.converge(R.assoc("ans"), [getMidIndex, R.identity]),
      ],
      [
        R.converge(R.lt, [getSearchValue, getMidElement]),
        R.converge(R.assoc("upper"), [
          R.compose(R.dec, getMidIndex),
          R.identity,
        ]),
      ],
      [
        R.converge(R.gt, [getSearchValue, getMidElement]),
        R.converge(R.assoc("lower"), [
          R.compose(R.inc, getMidIndex),
          R.identity,
        ]),
      ],
    ])
  ),
  R.converge(R.assoc("upper"), [
    R.compose(R.dec, R.length, R.prop("list")),
    R.assoc("lower", 0),
  ]),
  R.assoc("ans", null),
  R.converge(R.mergeLeft, [
    R.compose(R.objOf("list"), R.nthArg(0)),
    R.compose(R.objOf("searchValue"), R.nthArg(1)),
  ])
);
binarySearch = R.curryN(2, binarySearch);
module.exports = binarySearch;
The function takes two inputs, a list and a value (v); the first three composed functions then collect the arguments into an object like this:
{
  searchValue: 7,
  list: [ 2, 3, 7, 12 ],
  ans: null,
  lower: 0,
  upper: 3
}
After finishing, I started to wonder, from a performance point of view: is this worse than just keeping one local midElement, for example, instead of recalculating it each time (a non-point-free version for comparison is sketched below)?
Am I on the right track or just wasting time? Is this even readable?
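For comparison, here is a rough non-point-free version of the same algorithm, just a sketch of mine without Ramda, which computes midIndex and midElement once per iteration:
// Hypothetical imperative rewrite for comparison; returns the index of
// searchValue, or null when it is not found (like `ans` above).
const binarySearchPlain = (list, searchValue) => {
  let lower = 0;
  let upper = list.length - 1;
  while (lower <= upper) {
    const midIndex = Math.floor((lower + upper) / 2);
    const midElement = list[midIndex];
    if (midElement === searchValue) return midIndex;
    if (searchValue < midElement) upper = midIndex - 1;
    else lower = midIndex + 1;
  }
  return null;
};
// e.g. binarySearchPlain([2, 3, 7, 12], 7) -> 2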
I'm trying to migrate some Python code to JavaScript and I'm facing some issues with my calculations.
I also have no documentation to go on except this code:
var commonDenominator = ((myClassMap map (sub) -> sub reduce ((item, denominator = 0) -> item * perClassMultiplier[$$])) reduce ($$+$)) as Number
From what I understand, it's reading myClassMap and trying to find the common denominator of everything.
I'm trying to convert it, but my code computes a greatest common denominator and my calculations are off, by a factor of around 100 I believe. Maybe I'm wrong...
Entry:
myClassMap -> [ [ 1, 150 ], [ 1, 0 ], [ 1.5, 8 ], [ 1.5, 0 ] ]
perClassMultiplier -> [ 1, 1, 1.5, 1.5 ]
Code
myClassMap.map(entry => {
  const commonDenominator = greaterCommonDenominator(entry[0], entry[1]);
  // `data` is assumed to be an object defined elsewhere
  if (data.commonDenominator == null || commonDenominator > data.commonDenominator) {
    data.commonDenominator = commonDenominator;
  }
});
const greaterCommonDenominator = function(a, b) {
  if (!b) {
    return a;
  }
  return greaterCommonDenominator(b, a % b);
}
Any help? I want to make sure the code does the same calculation.
I am trying to calculate and create an array with average prices for different stocks.
For every stock, I have the data in this format:
{
  prices: [
    [ 1634304009628, 0.7118076876774715 ],
    [ 1634307586874, 0.7063647246346818 ],
    [ 1634311139365, 0.7049706990925925 ],
    [ 1634313858611, 0.7085543691926037 ],
    [ 1634318343009, 0.7057442983161784 ]
  ]
}
For every stock the API call returns data like I posted above: each entry has two values, the timestamp and the price. Now let's say I want the average trend for 5 stocks; I will get the data in 5 different arrays, and I just want to somehow average those 5 arrays into one to find the trend.
For the final result, I want the array to be in the same format, just with the calculated average of all of them (the goal is to identify the trend direction).
What would be the best way to do that? I am using React.
First create an array of the prices only, and then using reduce you can just do this:
let myObj = {
  prices: [
    [ 1634304009628, 0.7118076876774715 ],
    [ 1634307586874, 0.7063647246346818 ],
    [ 1634311139365, 0.7049706990925925 ],
    [ 1634313858611, 0.7085543691926037 ],
    [ 1634318343009, 0.7057442983161784 ]
  ]
};
const average = arr => arr.reduce((p, c) => p + c, 0) / arr.length;
const pricesArr = myObj.prices.map(value => value[1]);
const result = average(pricesArr);
console.log(result);
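If you want the element-wise average across several stocks (keeping the [timestamp, price] shape), a minimal sketch could look like this, assuming every stock's prices array has the same length and aligned timestamps:
// Hypothetical helper: average the price at each position across all stocks,
// keeping the timestamp from the first stock's array.
const averagePrices = stocks =>
  stocks[0].prices.map(([timestamp], i) => [
    timestamp,
    stocks.reduce((sum, stock) => sum + stock.prices[i][1], 0) / stocks.length
  ]);
// Usage: averagePrices([stockA, stockB, stockC]) -> [[timestamp, avgPrice], ...]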
With my current project, I am dealing with large streams of numerical data and transformations that have to take place on them in a data-flow-programmable fashion.
I stumbled upon the idea of transducers, which promise to solve the difficulty of handling multiple transformations on large arrays. But it seems that transducers don't suit exactly what I'm trying to solve here.
I am looking for a pattern / concept for transducers which only collect the needed amount of look-back and then process out a result. Similar to the browser version of TensorFlow, Reaktor, or Max/MSP (inputs/outputs, flow graphs, node-based, visual programming).
Most of these modules should be connected to a source, but should also be able to act as a source themselves, so they can be chained to other modules:
source ( a stream ) =[new-value]|=> module1 => module2 => ...
|=> module3 => module4 // branch off here to a new chain
From my understanding, transducers as explained in most blogs take the whole array and feed each individual value through the chosen transformers.
Yet my modules/transformers don't require that much data to work; take the example of a simple moving average with a look-back of 4 steps.
I imagine that module collecting enough data before it starts producing output.
I also don't need to hold the whole array in memory; I should only deal with the exact amounts needed. Results/outputs would optionally be stored in a database.
stream =[sends-1-value]=> module[collects-values-until-processing-starts] =[sends-one-value]=>...
It should also be possible to connect multiple sources to one module (which transducers didn't seem to provide).
Would the transducer pattern still apply here, or is there something else out there?
To be honest, every programmer would have some idea of how to make this work, yet I am asking for an established way of doing it, just like transducers came to be.
The transducer pattern certainly applies here. You can create a floating point processor with transducers paired with the right data structure. I'll give you a baseline example, with one assumption:
the stream you are working with implements Symbol.asyncIterator
Consider a simple queue
function SimpleQueue({ size }) {
  this.size = size
  this.buffer = []
}

SimpleQueue.prototype.push = function(item) {
  this.buffer.push(item)
  if (this.buffer.length > this.size) {
    this.buffer.shift()
  }
  return this
}

SimpleQueue.prototype[Symbol.iterator] = function*() {
  for (const item of this.buffer) {
    yield item
  }
}
Our simple queue has one method push that pushes an item into its internal buffer (an array). The simple queue is also iterable, so you could do for (const x of simpleQueue) {/* stuff */}
We'll now use our SimpleQueue in our floating point processor.
const average = iterable => {
  let sum = 0, count = 0
  for (const item of iterable) {
    sum += item
    count += 1
  }
  return sum / count
}

const floatingPointAverage = ({ historySize }) => {
  const queue = new SimpleQueue({ size: historySize })
  return item => {
    queue.push(item)
    const avg = average(queue)
    console.log(queue, avg) // this shows the average as the process runs
    return avg
  }
}
floatingPointAverage takes an item, pushes it into our SimpleQueue, and returns the current average of items in the queue.
Finally, we can implement and consume our transducer
const { pipe, map, transform } = require('rubico')

const numbersStream = {
  [Symbol.asyncIterator]: async function*() {
    for (let i = 0; i < 1000; i++) yield i
  },
}

transform(
  pipe([
    map(floatingPointAverage({ historySize: 4 })),
    /* transducers that do stuff with floating point average here */
  ]),
  null,
)(numbersStream)
The transducer in this case is map(floatingPointAverage({ historySize: 4 })). This transducer is courtesy of rubico, a library I wrote to solve my own async problems. I write about transducers in the context of rubico here
Your output should look like this
SimpleQueue { size: 4, buffer: [ 0 ] } 0
SimpleQueue { size: 4, buffer: [ 0, 1 ] } 0.5
SimpleQueue { size: 4, buffer: [ 0, 1, 2 ] } 1
SimpleQueue { size: 4, buffer: [ 0, 1, 2, 3 ] } 1.5
SimpleQueue { size: 4, buffer: [ 1, 2, 3, 4 ] } 2.5
SimpleQueue { size: 4, buffer: [ 2, 3, 4, 5 ] } 3.5
SimpleQueue { size: 4, buffer: [ 3, 4, 5, 6 ] } 4.5
SimpleQueue { size: 4, buffer: [ 4, 5, 6, 7 ] } 5.5
SimpleQueue { size: 4, buffer: [ 5, 6, 7, 8 ] } 6.5
SimpleQueue { size: 4, buffer: [ 6, 7, 8, 9 ] } 7.5
When Promise.all completes it returns an array of arrays that contain data. In my case the arrays are just numbers:
[
  [ 1, 4, 9, 9 ],
  [ 4, 4, 9, 1 ],
  [ 6, 6, 9, 1 ]
]
The array can be any size.
Currently I'm doing this:
let nums = []
data.map(function(_nums) {
  _nums.map(function(num) {
    nums.push(num)
  })
})
Is there an alternative way of doing this? Does lodash have any functions that are able to do this?
ES2019 introduced Array.prototype.flat which significantly simplifies this to:
const nums = data.flat();
const data = [
  [ 1, 4, 9, 9 ],
  [ 4, 4, 9, 1 ],
  [ 6, 6, 9, 1 ]
];

const nums = data.flat();
console.log(nums);
Original Answer
Use reduce and concat:
data.reduce(function (arr, row) {
return arr.concat(row);
}, []);
Or alternatively, concat and apply:
Array.prototype.concat.apply([], data);
I would do as follows:
var a = [
      [ 1, 4, 9, 9 ],
      [ 4, 4, 9, 1 ],
      [ 6, 6, 9, 1 ]
    ],
    b = [].concat(...a)
console.log(b)
You actually don't need any sort of library to do it; you can use concat with apply:
Promise.all(arrayOfPromises).then((arrayOfArrays) => {
  return [].concat.apply([], arrayOfArrays);
});
If you are using lodash, though, you can use _.flatten(arrayOfArrays) for the same effect.
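For example, a quick sketch with lodash (assuming it's already a dependency):
const _ = require('lodash');
// _.flatten removes exactly one level of nesting
const nums = _.flatten([[1, 4, 9, 9], [4, 4, 9, 1], [6, 6, 9, 1]]);
console.log(nums); // [1, 4, 9, 9, 4, 4, 9, 1, 6, 6, 9, 1]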
If using async/await, to expand on #Retsam's answer, you can do it like so
const mergedArray = []
  .concat
  .apply([], await Promise.all([promise1, promise2, promiseN]));
A real-world example I did using the AWS SDK, getting a list of usernames from multiple IAM user groups:
const users = await getActiveUsersByGroup(['group1', 'group2'])
async function getActiveUsersByGroup(groups = []) {
  const getUsersByGroupPromises = groups.map(group => getUsersByGroup(group));
  const users = []
    .concat
    .apply([], await Promise.all(getUsersByGroupPromises)) // Merge (concat) arrays
    .map(users => users.UserName); // Construct new array with just the usernames
  return users;
}

async function getUsersByGroup(group) {
  const params = {
    GroupName: group,
    MaxItems: 100 // Default
  };
  // `iam` is assumed to be an IAM client instance created elsewhere
  const { Users: users } = await iam.getGroup(params).promise();
  return users;
}
Is my understanding right that arrays inside of arrays are built like this?
a = new Array[
b = new Array[
c = new Array[
something,
something.more,
something.more.to,
something.more.to.learn
]]];
or does it need to be
a = new Array[];
a.b = new Array[];
a.b.c = new Array[
a.b.c.something,
a.b.c.something.more,
a.b.c.something.more.to,
a.b.c.something.more.to.learn
];
Or am I as lost as I think I am?
You need to differentiate between arrays and objects.
You may use arrays like so:
var nested = new Array(
  new Array( '1.1', '1.2' ),
  new Array( '2.1', '2.2' ),
);
Or use the short syntax:
var nested = [ [ '1.1', '1.2' ], [ '2.1', '2.2' ] ];
But you can't access array items by a textual index like in PHP or the like; you need to refer to the numeric index to access the different elements.
// Alert the first value of the second array in nested.
// Note that array indices start at zero.
alert( nested[ 1 ][ 0 ] ); // -> 2.1
If you need to reference the different elements by a textual key, you have to use objects:
var nested = {
  first: {
    a: '1.1',
    b: '1.2'
  },
  second: {
    a: '2.1',
    b: '2.2'
  }
}
alert( nested.second.a ); // -> 2.1
You can define multi-dimensional arrays in a couple of different ways, but if you want to do it inline, this is a very simple way. (This is just a random structure for example purposes.)
var arr =
[
  ["a"],
  [
    ["b"],
    [],
    [
      ["c"]
    ]
  ],
  [
    [],
    []
  ]
];
Things get really hard to follow, but just treat each level of indentation as a sub-dimension of the parent array.
To access any element, you use indexes:
arr[0][0]; // "a"
arr[1][0][0]; // "b"
arr[1][2][0][0]; // "c"
This really helps when you have intricate structures where declaring new Array(...) would get very chained and complicated.
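For comparison, here is roughly the same structure spelled out with the Array constructor, just to show how noisy it gets:
// Same structure as `arr` above, built with the constructor instead of literals.
// Note the classic gotcha: new Array(5) creates an empty array of length 5, not [5].
var arr2 = new Array(
  new Array("a"),
  new Array(
    new Array("b"),
    new Array(),
    new Array(new Array("c"))
  ),
  new Array(new Array(), new Array())
);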