JavaScript: Convert numbers to different formats or string alternatives

UPDATED:
Using JavaScript or jQuery, how can I convert a number into its different variations:
eg:
1000000
to...
1,000,000 or 1000K
OR
1000
to...
1,000 or 1K
OR
1934 and 1234
to...
1,934 or ~2K (under 2000 but over 1500)
or
1,234 or 1K+ (over 1000 but under 1500)
Can this be done in a function?
Hope this makes sense.
C

You can add methods to Number.prototype, so for example:
Number.prototype.addCommas = function () {
    var intPart = Math.floor(this).toString();
    var decimalPart = (this - Math.floor(this)).toString();
    // Keep the fractional digits, replacing the leading "0." with "."
    if (decimalPart.length > 2) {
        decimalPart = '.' + decimalPart.substring(2);
    } else {
        // Otherwise remove it altogether
        decimalPart = '';
    }
    // Work through the digits three at a time
    var i = intPart.length - 3;
    while (i > 0) {
        intPart = intPart.substring(0, i) + ',' + intPart.substring(i);
        i = i - 3;
    }
    return intPart + decimalPart;
};
Now you can call this as var num = 1000; num.addCommas() and it will return "1,000". That's just an example, but you'll find that all the functions you create will involve converting the numbers to strings early on, then processing and returning the strings. (Separating the integer and decimal parts will probably be particularly useful, so you might want to refactor that out into its own method.) Hopefully this is enough to get you started.
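For comparison, a common compact alternative does the comma insertion with a regular expression instead of a loop. This is a sketch as a plain function rather than a prototype method; the name addCommas is reused here only for illustration:

```javascript
// Insert a comma before every group of three digits in the integer part.
// \B avoids a leading comma; the lookahead matches positions followed by
// a multiple of three digits with no further digit after them.
function addCommas(n) {
    var parts = n.toString().split('.');
    parts[0] = parts[0].replace(/\B(?=(\d{3})+(?!\d))/g, ',');
    return parts.join('.');
}

console.log(addCommas(1000000)); // "1,000,000"
console.log(addCommas(1234.56)); // "1,234.56"
```

The regex operates only on the integer part, so decimals pass through untouched.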
Edit: Here's how to do the K thing... this one's a bit simpler:
Number.prototype.k = function () {
    // We don't want any thousands processing if the number is less than 1000.
    if (this < 1000) {
        // edit 2 May 2013: make sure it's a string for consistency
        return this.toString();
    }
    // Round to 100s first so that we can get the decimal point in there,
    // then divide by 10 for thousands
    var thousands = Math.round(this / 100) / 10;
    // Now convert it to a string and add the K
    return thousands.toString() + 'K';
};
Call this in the same way: var num = 2000; num.k() returns "2K".
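Worth noting that modern engines ship built-ins covering both formats, so the hand-rolled methods above are no longer the only option. A quick sketch, assuming an environment with Intl support:

```javascript
// Built-in number formatting: grouping commas and compact "K"/"M" notation
var n = 1234567;
console.log(n.toLocaleString('en-US')); // "1,234,567"
console.log(new Intl.NumberFormat('en', { notation: 'compact' }).format(1000)); // "1K"
```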

Theoretically, yes.
As TimWolla points out, it requires a lot of logic.
Ruby on Rails has a helper for presenting time in words. Have a look at the documentation. The implementation of that code is found on GitHub and could give you some hints as to how to go about implementing this.
I agree with the comment to reduce the complexity by choosing one format.
Hope you find some help in my answer.

Related

Convert dollars to cents in JavaScript

I am trying to convert $3.77 (USD) to cents as follows:
const number = "3.77"
const separe = number.replace(".", "")
Is that the correct way to convert USD to cents? I have seen answers with different code and I'm not sure which one is correct. Any ideas?
Be careful when doing arithmetic calculations in JavaScript, especially if you are dealing with floating point.
Consider this case:
+19.99 * 100 // 1998.9999999999998
19.99 * 100 // 1998.9999999999998
parseFloat(19.99) * 100 // 1998.9999999999998
I know, right?! So I made this simple helper to help you out:
const toCent = amount => {
    const str = amount.toString()
    const [int] = str.split('.')
    return Number(amount.toFixed(2).replace('.', '').padEnd(int.length === 1 ? 3 : 4, '0'))
}
Result:
toCent(19.99) // 1999
toCent(19.9) // 1990
toCent(1.9) // 190
toCent(1999.99) // 199999
I haven't tested it thoroughly yet. Please don't just copy-paste my solution; at least do your own testing first.
The way you are doing it now is perfectly valid. I have benchmarked your answer vs the answers from the comments and here are the results.
Winner:
const result = (parseFloat(number) * 100).toString();
Second Place (~99% of Winner):
const result = (Number(number) * 100).toString();
Third Place (~96% of Winner):
const result = (+number * 100).toString();
Fourth Place (~80% of Winner):
const result = number.replace(".", "");
As noted, your method will not work when the string does not match /\d\.\d\d/; however, the other methods will.
Tested on Safari 14
This will do it with any value.
export function toCents(aValue) {
    return Math.round((Math.abs(aValue) / 100) * 10000);
}
Math.round(x * 100) seems pretty safe because it will round to the nearest integer. The better options are:
Always represent your amounts in cents (i.e. whole numbers) and divide into dollars when needed
Use a library (such as decimal.js or Big.js)
The implementations in the above two libraries are super complex (here, here).
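A minimal sketch of the rounding approach mentioned above, scaling up first and then rounding to absorb the floating-point error (the name dollarsToCents is mine, not from any answer here):

```javascript
// Convert a dollar amount (string or number) to integer cents.
// Math.round absorbs float artifacts like 19.99 * 100 === 1998.9999999999998
const dollarsToCents = dollars => Math.round(parseFloat(dollars) * 100);

console.log(dollarsToCents("3.77")); // 377
console.log(dollarsToCents(19.99));  // 1999
```

This avoids the string-surgery of the replace/padEnd approaches, though for serious money handling the advice above stands: keep amounts in integer cents throughout, or use a decimal library.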

How to display second zero when only single digit is returned after decimal, in JavaScript?

I have two functions here...
function getCostOne() {
    var cost = 1.5;
    return 1 * cost.toFixed(2);
}
and...
function getCostTwo() {
    var cost = 1.5;
    return 1 + cost.toFixed(2);
}
What is the difference between multiplying cost.toFixed(2) and adding cost.toFixed(2)?
Why does multiplying it return .5 and adding return .50?
Those functions return 1.5 and "11.50" respectively. Working JSBin Demo...
console.log(1 * '1.50');
console.log(1 + '1.50');
It looks like the string is cast to a number in the first case (as though you had called parseFloat('1.50')) and concatenated in the second. However, these are only the results in my own browser. Take a look at the official MDN Web Docs...
console.log('foo' * 2);
// expected output: NaN
So, Chrome is probably handling it well, but I wouldn't expect that kind of behavior across all browsers!
If you want them to both definitely return the right numbers, do all the mathematical logic first, and then format it with toFixed(). That code would look like...
function getCostTwo() {
    var cost = 1.5;
    cost += 1; // do the math logic FIRST!
    return cost.toFixed(2); // once the number is just right, we format it!
}
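A short sketch making the coercions visible: toFixed always returns a string, and it is the operator that decides what happens next.

```javascript
var cost = 1.5;
var fixed = cost.toFixed(2); // always a string: "1.50"

console.log(typeof fixed); // "string"
console.log(1 * fixed);    // 1.5   ("*" coerces the string to a number)
console.log(1 + fixed);    // "11.50" ("+" coerces the number to a string)
```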

Javascript Help - selfDividingNumbers Algorithm producing all 0's

Greetings Stack Overflow!
First off, this is my first question!
I am trying to solve the selfDividingNumbers algorithm and I ran into this interesting problem. This function is supposed to take a range of numbers to check if they are self dividing.
Self Dividing example:
128 is a self-dividing number because
128 % 1 == 0, 128 % 2 == 0, and 128 % 8 == 0.
My attempt with Javascript.
/*
selfDividingNumbers( 1, 22 );
*/
var selfDividingNumbers = function(left, right) {
    var output = [];
    while (left <= right) {
        // convert number into an array of strings, size 1
        var leftString = left.toString().split();
        // initialize digit iterator
        var currentDigit = leftString[0];
        for (var i = 0; i < leftString.length; i++) {
            currentDigit = parseInt(leftString[i]);
            console.log(left % currentDigit);
        }
        // increment lower bound
        left++;
    }
    return output;
};
When comparing the current lower bound to the current digit of the lower bound, left % currentDigit always produces zero! I figure this is probably a type error, but I am unsure why and would love for someone to point out the reason!
Would also like to see any other ideas to avoid this problem!
I figured this was a good chance to get a better handle on Javascript considering I am clueless as to why my program is producing this output. Any help would be appreciated! :)
Thanks Stack Overflow!
Calling split() isn't buying you anything. Remove it and you'll get the results you expect. You still have to write the code to populate output though.
The answer by #Joseph may fix your current code, but I think there is a potentially easier way to go about doing this. Consider the following script:
var start = 128;
var num = start;
var sd = true;
while (num > 0) {
    var last = num % 10;
    if (start % last != 0) {
        sd = false;
        break;
    }
    num = Math.floor(num / 10);
}
if (sd) {
    console.log("Is self dividing");
} else {
    console.log("Is NOT self dividing");
}
Demo
To test each digit in the number for its ability to cleanly divide the original number, you can simply use a loop. In each iteration, check num % 10 to get the current digit, and then divide the number by ten. If we ever see a digit which does not divide the number evenly, then the number is not self dividing; otherwise it is.
So the string split method takes the string and returns an array of string parts. The method expects a parameter, however, the dividing element. If no dividing element is provided, the method will return only one part, the string itself. In your case, what you probably intended was to split the string into individual characters, which would mean the divider would be the empty string:
var leftString = left.toString().split('');
Since you are already familiar with console.log, note that you could also use it to debug your program. If you are confused about the output of left % currentDigit, one thing you could try is logging the variables just before the call,
console.log(typeof left, left, typeof currentDigit, currentDigit)
which might give you ideas about where to look next.
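Putting the pieces together, here is a sketch of the complete function using the modulo/divide-by-ten approach from the second answer. Note the extra guard for zero digits (n % 0 is NaN, so numbers like 10 or 20 must be disqualified explicitly):

```javascript
// Return all self-dividing numbers in the inclusive range [left, right]
var selfDividingNumbers = function (left, right) {
    var output = [];
    for (var n = left; n <= right; n++) {
        var num = n;
        var isSelfDividing = true;
        while (num > 0) {
            var digit = num % 10;
            // A zero digit, or any digit that doesn't divide n evenly, disqualifies n
            if (digit === 0 || n % digit !== 0) {
                isSelfDividing = false;
                break;
            }
            num = Math.floor(num / 10);
        }
        if (isSelfDividing) output.push(n);
    }
    return output;
};

console.log(selfDividingNumbers(1, 22)); // [1, 2, 3, 4, 5, 6, 7, 8, 9, 11, 12, 15, 22]
```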

Golomb sequence

I'm trying to solve a small programming challenge: calculating the nth number of Golomb's sequence (see this for more help). I've written a simple solution, but it must have a problem somewhere, because the number at position 2500000 is 10813 but my program gives me 10814.
var golomb = (function () {
    var cache = [null, 1];
    const o = 0.5 * (1 + Math.sqrt(5)); // Golden ratio
    return function (n) {
        return cache[n] || (function () {
            return Math.round(Math.pow(o, 2 - o) * Math.pow(n, o - 1));
        })();
    };
})();
var num = golomb(process.argv[2]);
console.log(num);
Maybe the golden ratio needs more precision than JavaScript gives. Can someone help? Thanks.
For what it's worth, here is a function based on the recurrence relation, with a cache, that gives the correct answer pretty quickly
var golomb = (function () {
    var cache = [null, 1];
    return function (n) {
        var i;
        for (i = cache.length; i < n; i++) cache[i] = golomb(i);
        return cache[n] || (cache[n] = 1 + golomb(n - golomb(golomb(n - 1))));
    };
})();
check it up on jsFiddle
Sgmonda, the formula you got from Wolfram Alpha isn't an exact solution. I've actually complained to them, since I like the Golomb sequence. The recurrence relation is exact but slow, even if you cache it. Heh, that's why the programming challenge is a challenge.
From the wikipedia article:
Colin Mallows has given an explicit recurrence relation:
a(1) = 1;
a(n + 1) = 1 + a(n + 1 − a(a(n)))
You need to implement your solution with this recurrence, which uses only integers.
A quick attempt at trying to implement that gives:
function golomb(n) {
    if (n == 1) return 1;
    else return 1 + golomb(n - golomb(golomb(n - 1)));
}
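To avoid deep recursion on large n (such as the 2500000 in the question), the same recurrence can be run bottom-up with a plain array. A sketch; the name golombUpTo is mine:

```javascript
// Build Golomb's sequence iteratively: a(1) = 1, a(n) = 1 + a(n - a(a(n - 1)))
function golombUpTo(n) {
    var a = [null, 1]; // 1-indexed, a[1] = 1
    for (var i = 2; i <= n; i++) {
        a[i] = 1 + a[i - a[a[i - 1]]];
    }
    return a[n];
}

console.log(golombUpTo(5)); // 3  (the sequence starts 1, 2, 2, 3, 3, 4, ...)
```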

Math.random() returns value greater than one?

While playing around with random numbers in JavaScript I discovered a surprising bug, presumably in the V8 JavaScript engine in Google Chrome. Consider:
// Generate a random number [1,5].
var rand5 = function () {
    return parseInt(Math.random() * 5) + 1;
};
// Return a sample distribution over MAX times.
var testRand5 = function (dist, max) {
    if (!dist) { dist = {}; }
    if (!max) { max = 5000000; }
    for (var i = 0; i < max; i++) {
        var r = rand5();
        dist[r] = (dist[r] || 0) + 1;
    }
    return dist;
};
Now when I run testRand5() I get the following results (of course, differing slightly with each run, you might need to set "max" to a higher value to reveal the bug):
var d = testRand5();
d = {
    1: 1002797,
    2: 998803,
    3: 999541,
    4: 1000851,
    5: 998007,
    10: 1 // XXX: Math.random() returned 4.5?!
}
Interestingly, I see comparable results in node.js, leading me to believe it's not specific to Chrome. Sometimes there are different or multiple mystery values (7, 9, etc).
Can anyone explain why I might be getting the results I see? I'm guessing it has something to do with using parseInt (instead of Math.floor()) but I'm still not sure why it could happen.
The edge case occurs when you happen to generate a very small number expressed with an exponent, for example 9.546056389808655e-8.
Combined with parseInt, which interprets its argument as a string, all hell breaks loose. And as suggested before me, it can be solved using Math.floor.
Try it yourself with this piece of code:
var test = 9.546056389808655e-8;
console.log(test); // prints 9.546056389808655e-8
console.log(parseInt(test)); // prints 9 - oh noes!
console.log(Math.floor(test)) // prints 0 - this is better
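The switch to exponential notation only happens for magnitudes below 1e-6, which is why the stray values show up so rarely in the distribution. A quick check:

```javascript
// toString switches to exponential notation below 1e-6,
// which is exactly when parseInt starts misreading the value
console.log((0.000001).toString());  // "0.000001"
console.log((0.0000001).toString()); // "1e-7"
console.log(parseInt(0.0000001));    // 1
console.log(Math.floor(0.0000001));  // 0
```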
Of course, it's a parseInt() gotcha. It converts its argument to a string first, and that can force scientific notation, which will cause parseInt to do something like this:
var x = 0.000000004;
(x).toString(); // => "4e-9"
parseInt(x); // => 4
Silly me...
I would suggest changing your random number function to this:
var rand5 = function () {
    return Math.floor(Math.random() * 5) + 1;
};
This will reliably generate an integer value between 1 and 5 inclusive.
You can see your test function in action here: http://jsfiddle.net/jfriend00/FCzjF/.
In this case, parseInt isn't the best choice because it's going to convert your float to a string which can be a number of different formats (including scientific notation) and then try to parse an integer out of it. Much better to just operate on the float directly with Math.floor().
