Mimic this Erlang code behaviour in Javascript

I'm trying to obtain in JavaScript the same value returned by the following generate_hash Erlang function:
-define(b2l(V), binary_to_list(V)).
-define(l2b(V), list_to_binary(V)).

generate_hash(User, Secret, TimeStamp) ->
    SessionData = User ++ ":" ++ erlang:integer_to_list(TimeStamp, 16),
    Hash = crypto:sha_mac(Secret, SessionData),
    base64:encode(SessionData ++ ":" ++ ?b2l(Hash)).

make_time() ->
    {NowMS, NowS, _} = erlang:now(),
    NowMS * 1000000 + NowS.
This function is being called in Erlang in this way:
Username = "username",
Secret = ?l2b("secret"),
UserSalt = "usersalt",
CurrentTime = make_time(),
Hash = generate_hash(?b2l(Username), <<Secret/binary, UserSalt/binary>>, CurrentTime).
I managed to use the Google CryptoJS library to calculate the hash, but the Base64 value returned does not match the one returned by Erlang.
<script src="http://crypto-js.googlecode.com/svn/tags/3.1.2/build/rollups/hmac-sha1.js"></script>
function generate_hash(User, Secret, TimeStamp) {
    var SessionData = User + ":" + parseInt(TimeStamp, 16);
    var hash = CryptoJS.HmacSHA1(Secret, SessionData);
    return atob(SessionData + ":" + hash.toString());
}

var Hash = generate_hash("username", "secret" + "usersalt", new Date().getTime());
alert(Hash);

There are three problems in your code.
Firstly: CryptoJS.HmacSHA1(Secret,SessionData); has its arguments reversed. It should be CryptoJS.HmacSHA1(SessionData, Secret);.
You can check it in the JS console:
var hash = CryptoJS.HmacSHA1("b", "a");
0: 1717011798
1: -2038285946
2: -931908057
3: 823367506
4: 21804555
Now, go to the Erlang console and type this:
crypto:sha_mac(<<"a">>, <<"b">>).
<<102,87,133,86,134,130,57,134,200,116,54,39,49,19,151,82,1,76,182,11>>
binary:encode_unsigned(1717011798).
<<102,87,133,86>>
binary:encode_unsigned(21804555).
<<1,76,182,11>>
I don't know the equivalent method for signed integers, but this shows that changing the order of arguments gives the same binary value.
The second problem is with hash.toString(), which, following my example, gives something like:
hash = CryptoJS.HmacSHA1("b", "a");
hash.toString();
"6657855686823986c874362731139752014cb60b"
while Erlang's binary_to_list will result in:
Str = binary_to_list(Hash).
[102,87,133,86,134,130,57,134,200,116,54,39,49,19,151,82,1,76,182,11]
io:format("~s", [Str]).
fWV9Èt6'1^SR^AL¶^K
By default, toString on a CryptoJS WordArray produces a hex string rather than the raw digest bytes, and this messes up the final result.
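If you want the same raw-byte string that Erlang's binary_to_list(Hash) produces, one option is the Latin1 encoder bundled with crypto-js (a sketch, assuming the 3.1.2 rollup from the question):

var hash = CryptoJS.HmacSHA1("b", "a");
// Hex is the toString default; Latin1 yields one character per digest byte
var raw = CryptoJS.enc.Latin1.stringify(hash);
console.log(raw.split('').map(function (c) { return c.charCodeAt(0); }));
// [102, 87, 133, 86, 134, 130, 57, 134, ...]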
The third problem is that new Date().getTime() returns time in milliseconds, while the Erlang make_time above returns seconds. This shouldn't matter when you test it with a static integer, though.

Two things:
The make_time function in the Erlang code above returns the number of seconds since the Unix epoch, while the Javascript method getTime returns the number of milliseconds.
Besides that, since you're probably not running the two functions in the same second, you'll get different timestamps and therefore different hashes anyway.
The Javascript function parseInt parses a string and returns an integer, while the Erlang function integer_to_list takes an integer and converts it to a string (in Erlang, strings are represented as lists, thus the name). You probably want to use toString(16) instead.
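Putting the two answers together, a minimal corrected sketch might look like the following (assuming the same crypto-js rollup, a timestamp in seconds, and btoa for the Base64 step; note that Erlang's integer_to_list/2 emits uppercase hex digits, so the JavaScript side is uppercased to match):

function generate_hash(User, Secret, TimeStamp) {
    // Erlang: integer_to_list(TimeStamp, 16) formats the integer as uppercase hex
    var SessionData = User + ":" + TimeStamp.toString(16).toUpperCase();
    // Message first, key second
    var hash = CryptoJS.HmacSHA1(SessionData, Secret);
    // Latin1 gives the raw digest bytes, mirroring binary_to_list(Hash);
    // btoa then Base64-encodes (the atob in the question decodes instead)
    return btoa(SessionData + ":" + CryptoJS.enc.Latin1.stringify(hash));
}

// Seconds since the Unix epoch, mirroring make_time/0
var CurrentTime = Math.floor(Date.now() / 1000);
var Hash = generate_hash("username", "secret" + "usersalt", CurrentTime);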

This algorithm can generate the same sequence of bytes as the Erlang counterpart:
var ret = [];
var hash = CryptoJS.HmacSHA1("b", "a").words;
angular.forEach(hash, function (v) {
    // Treat negative words as unsigned, then peel bytes off least-significant first,
    // splicing them in at the start of this word's slice to keep big-endian order
    var pos = v >= 0, last = ret.length;
    for (v = pos ? v : v >>> 0; v > 0; v = Math.floor(v / 256)) {
        ret.splice(last, 0, v % 256);
    }
});
console.info(ret);
console.info(String.fromCharCode.apply(String,ret));
The above outputs:
[102, 87, 133, 86, 134, 130, 57, 134, 200, 116, 54, 39, 49, 19, 151, 82, 1, 76, 182, 11]
fWV9Èt6'1RL¶
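A plain-JavaScript variant of the same idea, without the Angular dependency, is sketched below; using bit shifts also preserves any leading zero bytes that the division loop above would drop (this assumes each CryptoJS word carries four big-endian bytes, which holds for a 20-byte SHA-1 digest):

var words = CryptoJS.HmacSHA1("b", "a").words;
var bytes = [];
words.forEach(function (w) {
    // Each 32-bit word holds four big-endian bytes; mask each one out
    bytes.push((w >>> 24) & 0xff, (w >>> 16) & 0xff, (w >>> 8) & 0xff, w & 0xff);
});
console.info(bytes);
console.info(String.fromCharCode.apply(String, bytes));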

Related

Keep zero character when convert from DEC to HEX in Node JS

I am currently reading the ID number of an energy meter with Node.js and the serialport library. The power meter ID has the format xx xx xx xx xx xx. When I send the command and receive the data, I get the following DEC numbers: 0 0 24 1 104 115. Following the manufacturer's instructions, I have to convert this sequence to HEX. I have put it in an array and printed it to the console as follows:
console.log(
    (arrID[0]).toString(16) +
    (arrID[1]).toString(16) +
    (arrID[2]).toString(16) +
    (arrID[3]).toString(16) +
    (arrID[4]).toString(16) +
    (arrID[5]).toString(16)
);
and it returned 001816873. This is the wrong ID; the correct ID should be 000018016873. I know the reason is that toString(16) drops the leading zero when a byte converts to a single hex digit. I look forward to your advice.
I used plain JS; hope this helps you.
var arrID = [0, 0, 24, 1, 104, 115];
var arrID2 = ['', '', '', '', '', ''];
for (var i = 0; i < 6; i++) {
    arrID2[i] = (arrID[i]).toString(16);
    // Pad single hex digits with a leading zero
    if (arrID2[i].length == 1) arrID2[i] = '0' + arrID2[i];
}
console.log(
    arrID2[0] +
    arrID2[1] +
    arrID2[2] +
    arrID2[3] +
    arrID2[4] +
    arrID2[5]
);
the output is
000018016873
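A more compact variant of the same padding idea (a sketch, assuming a runtime with String.prototype.padStart, i.e. Node 8+ or a modern browser):

var arrID = [0, 0, 24, 1, 104, 115];
var id = arrID.map(function (b) {
    // toString(16) drops leading zeros, so pad every byte to two hex digits
    return b.toString(16).padStart(2, '0');
}).join('');
console.log(id); // 000018016873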

JavaScript convert Array of 4 bytes into a float value from modbusTCP read

I'm trying to convert an array of 4 bytes to a float value. Here is the thing:
I get an answer from my request via ModbusTCP that looks something like this:
{ "data": [ 16610, 40202 ], "buffer": { "type": "Buffer", "data": [ 64, 226, 157, 10 ] } }
This string is converted into a JSON object, parsed, and accessed with:
var ModbusArray = JSON.parse(msg.payload);
var dataArray = ModbusArray.buffer.data;
(the msg.payload comes from node red)
Up to here it works fine. The array represents a floating-point value; in this case it should be around 7.0.
So, here is my question: how can I get a float from this dataArray?
You could adapt the excellent answer of T.J. Crowder and use DataView#setUint8 for the given bytes.
var data = [64, 226, 157, 10];
// Create a buffer
var buf = new ArrayBuffer(4);
// Create a data view of it
var view = new DataView(buf);
// set bytes
data.forEach(function (b, i) {
    view.setUint8(i, b);
});
// Read the bits as a float; note that by doing this, we're implicitly
// converting it from a 32-bit float into JavaScript's native 64-bit double
var num = view.getFloat32(0);
// Done
console.log(num);
For decoding a float coded in Big Endian (ABCD) with Node.js:
Buffer.from([ 64, 226, 157, 10 ]).readFloatBE(0)
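For what it's worth, a quick usage sketch (Buffer also provides readFloatLE in case a device delivers the bytes in the opposite order; for the bytes above the result is roughly 7.08):

const buf = Buffer.from([64, 226, 157, 10]);
console.log(buf.readFloatBE(0));   // ~7.08 for these big-endian bytes
// console.log(buf.readFloatLE(0)); // use this instead for little-endian byte order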

Get unique int from MongoDB ObjectId in JavaScript(NodeJS)?

I'm involved in a project where we use MongoDB as the primary database system and JavaScript/NodeJS for the server side. We need to build an integration with an external partner.
Our partner's API requires an operation number, which must be a unique integer value. We decided to use a combination of the 4-byte timestamp and the 3-byte counter value from the ObjectId, but the resulting numbers are too large and we lose precision.
Here is the procedure:
var getUniqueIntFromObjectId = function (object_id) {
    var res = null;
    object_id = object_id.toString();
    res = parseInt(object_id.substring(0, 8), 16).toString() + parseInt(object_id.substring(18, 24), 16).toString();
    return parseInt(res, 10);
};
How can we improve this procedure, or change it to achieve the goal?
You can get a valid integer by picking an initial ObjectId (the oldest one in your table) and using it to offset the timestamp in the first 4 bytes. This gives you a smaller number that is still unique and fits in a valid integer. Alternatively, you could set your initial ObjectId to 500000000000000000000000, which corresponds to 2012-07-13T11:01:20.000Z.
var getUniqueIntFromObjectId = function (object_id) {
    var res = null;
    object_id = object_id.toString();
    var firstObjectId = '5661728913124370191fa3f8';
    // Offset the 4-byte timestamp by the reference ObjectId's timestamp to shrink the result
    var delta = parseInt(object_id.substring(0, 8), 16) - parseInt(firstObjectId.substring(0, 8), 16);
    res = delta.toString() + parseInt(object_id.substring(18, 24), 16).toString();
    return parseInt(res, 10);
};
document.write("Unique integer: <br>" + getUniqueIntFromObjectId("56618f7d0000000000000000"))
document.write('<br/>')
document.write("Max safe integer: <br>"+Number.MAX_SAFE_INTEGER)

Javascript ascii string to hex byte array

I am trying to convert an ASCII string into a byte array.
The problem is my code converts the ASCII into an array of strings, not a byte array:
var tx = '[86400:?]';
var hex = [];
for (var a = 0; a < tx.length; a = a + 1) {
    hex.push('0x' + tx.charCodeAt(a).toString(16));
}
This results in:
[ '0x5b', '0x38', '0x36', '0x34', '0x30', '0x30', '0x3a', '0x3f', '0x5d' ]
But what I am looking for is:
[0x5b, 0x38, 0x36, 0x34, 0x30, 0x30, 0x3a, 0x3f, 0x5d]
How can I convert to bytes rather than byte strings?
This array is being streamed to a USB device:
device.write([0x5b, 0x38, 0x36, 0x34, 0x30, 0x30, 0x3a, 0x3f, 0x5d])
And it has to be sent as one array, not by looping and calling device.write() for each value.
A one-liner:
'[86400:?]'.split ('').map (function (c) { return c.charCodeAt (0); })
returns
[91, 56, 54, 52, 48, 48, 58, 63, 93]
This is, of course, an array of numbers, not strictly a "byte array". Did you really mean a "byte array"?
Split the string into individual characters then map each character to its numeric code.
Per your added information about device.write, I found this:
Writing to a device
Writing to a device is performed using the write call in a device
handle. All writing is synchronous.
device.write([0x00, 0x01, 0x01, 0x05, 0xff, 0xff]);
on https://npmjs.org/package/node-hid
Assuming this is what you are using, then the array above would work perfectly well:
device.write('[86400:?]'.split ('').map (function (c) { return c.charCodeAt (0); }));
As has been noted, the 0x notation is just that: a notation. Whether you specify 0x0a, 10, or 012 (octal), the value is the same.
function getBytes(str) {
    let intArray = str.split('').map(function (c) { return c.charCodeAt(0); });
    let byteArray = new Uint8Array(intArray.length);
    for (let i = 0; i < intArray.length; i++)
        byteArray[i] = intArray[i];
    return byteArray;
}

device.write(getBytes('[86400:?]'));
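In Node specifically, Buffer can do the string-to-bytes step in one call; a sketch (whether device.write accepts a Buffer directly depends on the node-hid version, so converting to a plain array is the safer option):

const bytes = Array.from(Buffer.from('[86400:?]', 'ascii'));
console.log(bytes); // [91, 56, 54, 52, 48, 48, 58, 63, 93]
device.write(bytes);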

Dealing with uint8_t in javascript

G'day peoples,
I'm using MavLink to obtain GPS information. One of the message types is GPS_STATUS which is described by MavLink using a series of uint8_t[20].
If I run the following code:
console.log(' sat prn: ' + message.satellite_prn);
console.log(' sat prn: ' + JSON.stringify(message.satellite_prn));
console.log(' sat prn: ' + JSON.stringify(new Uint8Array(message.satellite_prn)));
I get the following output:
sat prn: <weird characters...>
sat prn: "\u0002\u0005\n\f\u000f\u0012\u0015\u0019\u001a\u001b\u001d\u001e\u001f\u0000\u0000\u0000\u0000"
sat prn: {"BYTES_PER_ELEMENT":1,"buffer":{"byteLength":0},"length":0,"byteOffset":0,"byteLength":0}
So obviously it's not working. I need a means to get the int value of each element.
I found this https://developer.mozilla.org/en-US/docs/JavaScript/Typed_arrays?redirectlocale=en-US&redirectslug=JavaScript_typed_arrays
Which made me think I would be able to do the following:
satellite_prn = Array.apply([], new Uint8Array(message.satellite_prn));
satellite_prn.length === 20;
satellite_prn.constructor === Array;
But when I stringify it via JSON it reports [], which I presume is an empty array.
Anyone know how I can do this? I know that the data is an array of 20 unsigned 8 bit integers. I just need to know how to access or parse them.
Note: I'm using node.js, but that shouldn't affect what I'm doing. This is why I'm using console.log, as it's available in node.js.
Two issues with your code:
message.satellite_prn is a string, not an array
Uint8Array needs to be loaded with .set
To get an array of numbers from message.satellite_prn, do this:
var array = message.satellite_prn.split('').map(function (c) { return c.charCodeAt(0); })
To load an ArrayBuffer, do this:
var buffer = new ArrayBuffer(array.length);
var uint8View = new Uint8Array(buffer);
uint8View.set(array);
Ideally you wouldn't need to go through the string. If you are obtaining the data from an up-to-date implementation of XMLHttpRequest, such as xhr2, you can set:
responseType = "arraybuffer"
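A rough sketch of what that would look like (the /gps_status URL is made up for illustration; the point is that an ArrayBuffer response can be wrapped in a Uint8Array directly, with no string round-trip):

var xhr = new XMLHttpRequest();
xhr.open('GET', '/gps_status', true); // hypothetical endpoint
xhr.responseType = 'arraybuffer';
xhr.onload = function () {
    var prn = new Uint8Array(xhr.response); // raw bytes, no string decoding involved
    console.log(Array.prototype.slice.call(prn));
};
xhr.send();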
Looks like the problem was my understanding of how to deal with binary data in JavaScript. I found that I can just do the following to get the data in base 10 via charCodeAt().
The following is the code that allowed me to iterate through the 20 unsigned 8 bit integers and get each value as a decimal:
for (var i = 0, l = message.satellite_prn.length; i < l; i++) {
    console.log(' Satellite ' + i + ', id: ' +
        parseInt(message.satellite_prn.charCodeAt(i), 10)
    );
}
I suspect there may be a function that converts the binary data into an array directly, but for an unknown reason Uint8Array didn't appear to work for me. However, the above does.
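For what it's worth, newer runtimes can do the whole conversion in one step (a sketch, assuming ES2015's Uint8Array.from is available and satellite_prn is the binary string shown above):

var prnBytes = Uint8Array.from(message.satellite_prn, function (c) {
    // Map each character of the binary string to its 0-255 char code
    return c.charCodeAt(0);
});
console.log(Array.prototype.slice.call(prnBytes)); // [2, 5, 10, 12, 15, ...]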
