node encode and decode utf-16 buffer - javascript

I'm working on a JavaScript/Node.js application that needs to talk to a C++ TCP/UDP socket. It seems the old C++ clients send me a UTF-16 buffer. I haven't found a way yet to convert it to a readable string, and the other direction seems to be the same problem.
Is there an easy way to handle both directions?
Best regards

If you have a UTF-16-encoded buffer, you can convert it to a JavaScript string like this:
let string = buffer.toString('utf16le');
To read these from a stream, it's easiest to convert to a string at the very end:
let chunks = [];
stream.on('data', chunk => chunks.push(chunk))
  .on('end', () => {
    let buffer = Buffer.concat(chunks);
    let string = buffer.toString('utf16le');
    // ...
  });
To convert a JS string to UTF-16:
let buffer = Buffer.from(string, 'utf16le')
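Assuming the C++ side sends little-endian UTF-16 (the usual case for Windows `wchar_t` strings), a minimal round-trip sketch looks like this:

```javascript
// Round-trip a JS string through a UTF-16LE buffer.
const original = 'Hello wörld';
const buffer = Buffer.from(original, 'utf16le'); // 2 bytes per UTF-16 code unit
const decoded = buffer.toString('utf16le');
```

If the C++ side turns out to send big-endian UTF-16 instead, call `buffer.swap16()` before decoding.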

Related

Javascript Webworker how to put json information into array buffer

I have thousands of small strings that I have to pass from a webworker back to the main page, each one is something like this:
"this string needs to be sent"
How would I be able to include it into an array buffer in order to increase the transfer speed? I understand how to use numbers with array buffers, but how do you use strings? I am looking for something like this:
var strings = ["str1","str2","str3",...]
for (var i = 0; i < strings.length; i++) {
  arraybuffer[i] = // Whatever operation works to add strings[i]
}
It's worth measuring and comparing the performance of the various techniques. The worker could use a SharedArrayBuffer if your target browsers support it (not shown below); otherwise, Transferable objects can be used with postMessage(). TextEncoder encodes strings into Uint8Arrays backed by ArrayBuffers.
Individual strings can be transferred as they are encoded:
const encoder = new TextEncoder()
strings.forEach(s => {
  const encoded = encoder.encode(s)
  postMessage(encoded, [encoded.buffer])
})
An array of strings could be transferred in batches:
const encoded = strings.map(s => encoder.encode(s))
postMessage(encoded, encoded.map(bytes => bytes.buffer))
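On the receiving side, the transferred Uint8Arrays can be turned back into strings with TextDecoder. A minimal sketch of that decode step (`decodeBatch` is a hypothetical helper name, not part of the answer above):

```javascript
// Encode strings (worker side), then decode them back (main-thread side).
const encoder = new TextEncoder();
const decoder = new TextDecoder();

function decodeBatch(encodedChunks) {
  // Each chunk is a Uint8Array produced by TextEncoder.encode()
  return encodedChunks.map(bytes => decoder.decode(bytes));
}

const strings = ['str1', 'str2', 'str3'];
const encoded = strings.map(s => encoder.encode(s));
const decoded = decodeBatch(encoded);
```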

How do I reverse a buffer.toString() on a hex buffer?

const uuidc = '9acf0decef304b229ea1560d4b3bf7d0';
const packed = Buffer.from(uuidc, 'hex');
const packedAndStringified = 'm:' + packed;
I have some keys stored in a Redis database that were stored like above. The problem is that once a string is appended to packed, the hex buffer is (I'm guessing) effectively converted into a binary string.
The stringified output looks something like: K;��V��
Is there any way for me to get packedAndStringified back to packed, and ultimately get the uuidc pulled back out?
https://nodejs.org/api/buffer.html#buffer_buf_tostring_encoding_start_end
It should be const packedAndStringified = 'm:' + packed.toString('hex'); instead. Concatenating a Buffer with a string implicitly calls toString() with the default 'utf8' encoding, which mangles arbitrary bytes.
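A sketch of the full round trip, showing that an explicit 'hex' encoding survives the concatenation and can be reversed by stripping the prefix:

```javascript
const uuidc = '9acf0decef304b229ea1560d4b3bf7d0';
const packed = Buffer.from(uuidc, 'hex'); // 16 raw bytes

// Stringify explicitly as hex so no bytes are mangled.
const packedAndStringified = 'm:' + packed.toString('hex');

// Reverse: strip the prefix to get the original hex uuid back.
const recovered = packedAndStringified.slice('m:'.length);
```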

How can I efficiently write numeric data to a file?

Say I have an array containing a million random numbers:
[ 0.17309080497872764, 0.7861753816498267, ...]
I need to save them to disk, to be read back later. I could store them in a text format like JSON or csv, but that will waste space. I'd prefer a binary format where each number takes up only 8 bytes on disk.
How can I do this using node?
UPDATE
I did not find an answer to this specific question, with a full example, in the supposedly duplicate question. I was able to solve it myself, but in a verbose way that could surely be improved:
// const a = map(Math.random, Array(10));
const fs = require('fs');
const a = [
  0.9651891365487693,
  0.7385397746441058,
  0.5330173086062189,
  0.08100066198727673,
  0.11758119861500771,
  0.26647845473863674,
  0.0637438360410223,
  0.7070151519015955,
  0.8671093412761386,
  0.20282735866103718
];
// write the array to file as raw bytes (80 B total)
const wstream = fs.createWriteStream('test.txt');
a.forEach(num => {
  const b = Buffer.alloc(8); // new Buffer(8) is deprecated
  b.writeDoubleLE(num);
  wstream.write(b);
});
wstream.end(() => {
  // read it back
  const buff = fs.readFileSync('test.txt');
  const aa = a.map((_, i) => buff.readDoubleLE(8 * i));
  console.log(aa);
});
I think this was answered in Read/Write bytes of float in JS
The ArrayBuffer solution is probably what you are looking for.

Base64 encode a javascript object

I have large JavaScript objects which I would like to encode to base-64 for AWS Kinesis.
It turns out that:
let objStr = new Buffer(JSON.stringify(obj), 'ascii');
new Buffer(objStr, 'base64').toString('ascii') !== objStr
I'm trying to keep this as simple as possible.
How can I base-64 encode JSON and safely decode it back to its original value?
From String to Base-64
var obj = {a: 'a', b: 'b'};
var encoded = btoa(JSON.stringify(obj))
To decode back to the actual object:
var actual = JSON.parse(atob(encoded))
For reference look here.
https://developer.mozilla.org/en/docs/Web/API/WindowBase64/Base64_encoding_and_decoding
You misunderstood the Buffer(str, [encoding]) constructor, the encoding tells the constructor what encoding was used to create str, or what encoding the constructor should use to decode str into a byte array.
Basically the Buffer class represents byte streams, it's only when you convert it from/to strings that encoding comes into context.
You should instead use buffer.toString("base64") to get the base-64 encoding of the buffer content.
let objJsonStr = JSON.stringify(obj);
let objJsonB64 = Buffer.from(objJsonStr).toString("base64");
When converting an object to base64 I was getting out-of-Latin-1-range issues and an invalid-character error.
I made it work in my project with the line below.
Include the base64 and utf8 node packages and access them like this:
var bytes = base64.encode(utf8.encode(JSON.stringify(getOverviewComments())));
You can easily encode and decode from and to JSON/Base64 using a Buffer:
JSON to Base64:
function jsonToBase64(jsonObj) {
  const jsonString = JSON.stringify(jsonObj)
  return Buffer.from(jsonString).toString('base64')
}
Base64 to JSON:
function base64ToJson(base64String) {
  const jsonString = Buffer.from(base64String, 'base64').toString()
  return JSON.parse(jsonString)
}
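As a self-contained sketch of the full round trip using the Buffer approach (helper names are illustrative), including a non-ASCII character, which plain btoa() would reject:

```javascript
function jsonToBase64(jsonObj) {
  return Buffer.from(JSON.stringify(jsonObj)).toString('base64');
}

function base64ToJson(base64String) {
  return JSON.parse(Buffer.from(base64String, 'base64').toString('utf8'));
}

// Non-ASCII characters survive the round trip, unlike with plain btoa().
const obj = { a: 'a', b: 'ü' };
const roundTripped = base64ToJson(jsonToBase64(obj));
```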
In Node.js, atob() and btoa() are marked as legacy, so Buffer is the preferred approach there.

Converting byte[] to ArrayBuffer in Nashorn

How do I convert an array of bytes into an ArrayBuffer in Nashorn? I am trying to insert binary data into a pure JavaScript environment (i.e., it doesn't have access to Java.from or Java.to), and so would like to create an instance out of an array of bytes.
Looks like I was going about this the wrong way. It made more sense to convert it into Uint8Array since what I'm sending in is an array of bytes.
I created the following function:
function byteToUint8Array(byteArray) {
  var uint8Array = new Uint8Array(byteArray.length);
  for (var i = 0; i < uint8Array.length; i++) {
    uint8Array[i] = byteArray[i];
  }
  return uint8Array;
}
This will convert an array of bytes (so byteArray is actually of type byte[]) into a Uint8Array.
I think you're right about using a Uint8Array, but this code might be preferable:
function byteToUint8Array(byteArray) {
  var uint8Array = new Uint8Array(byteArray.length);
  uint8Array.set(Java.from(byteArray));
  return uint8Array;
}
Also, if you really need an ArrayBuffer you can use uint8Array.buffer.