Ruby Array pack and unpack functionality in JavaScript?

What are the JavaScript functions or libraries that are equivalent to Ruby's pack and unpack functions for the Array class? I'm particularly interested in converting hex strings to strings.
irb(main):022:0> ["446f67"].pack("H*")
=> "Dog"
I'm not a JavaScript programmer and would rather not roll my own converter if libraries are available.

I don't think JavaScript has a function that does quite the same thing; pack seems to be Ruby-specific. If you're using pack to turn an object into a string that can be sent over a network, you could use JSON instead. The Prototype library provides methods for turning objects into JSON-encoded strings. There are also Ruby libraries for working with JSON (encoding and decoding), such as:
http://flori.github.com/json/

answered here: pack / unpack functions for node.js
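For the hex example in the question specifically, no library is strictly needed; here is a minimal sketch in plain JavaScript (the helper name hexToString is just illustrative), with the Node.js Buffer one-liner as an alternative:

// Minimal sketch: decode a hex string such as "446f67" into "Dog".
// The helper name hexToString is illustrative, not from any library.
function hexToString(hex) {
  var out = '';
  for (var i = 0; i < hex.length; i += 2) {
    out += String.fromCharCode(parseInt(hex.slice(i, i + 2), 16));
  }
  return out;
}

hexToString('446f67'); // => "Dog"

// In Node.js, Buffer can do the same conversion:
// Buffer.from('446f67', 'hex').toString(); // => "Dog"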

Related

Fastest way to serialize and compress a hashmap (object) in nodejs?

What is the fastest way to serialize and compress a JavaScript hashmap (object) in Node.js 12+? I'm trying to find the fastest combination of serialization and compression methods to transform a JavaScript object into binary data.
The number of possible combinations is 100+; my goal is a pre-study to choose a few of the best options for a final benchmark.
Input: an object with arbitrary keys, so some really fast serialization methods like AVSC cannot be used. Assume the object has 30 key-value pairs, for example:
{
"key-one": 2,
"key:two-random-text": "English short text, not binary data.",
... 28 more keys
}
No need to support serialization of Date, Regex etc.
Serialization: only schemaless serialization formats can be considered, like JSON or BSON. v8.serialize is an interesting option; it's probably fast because it's native. The compress-brotli package added support for it for some reason, but did not provide a benchmark or highlight it as a recommended option.
Compression: probably only the fastest methods should be considered. I'm not sure Brotli is a perfect choice because, according to Wikipedia, it is strong at compressing JS, CSS, and HTML since it expects such "keywords" in the input. Native Node.js support is preferred.
I've found helpful research on a similar use case (I'm planning to compress via a Lambda and store in S3), but their data originates as JSON, unlike in my case.
I'd recommend lz4 for fast compression.
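As a concrete starting point, here is a minimal sketch of one native-only combination mentioned in the question (v8.serialize plus Node's built-in zlib); lz4 itself would need a third-party package, so it is not shown:

// Minimal sketch: serialize with v8 (schemaless, native) and compress with zlib (native).
const v8 = require('v8');
const zlib = require('zlib');

const obj = {
  'key-one': 2,
  'key:two-random-text': 'English short text, not binary data.'
  // ... 28 more keys in the real benchmark
};

const serialized = v8.serialize(obj);                 // Buffer
const gzipped = zlib.gzipSync(serialized);            // fast, widely available
const brotlied = zlib.brotliCompressSync(serialized); // usually smaller, often slower

console.log(serialized.length, gzipped.length, brotlied.length);

// Round trip:
const restored = v8.deserialize(zlib.gunzipSync(gzipped));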

Is there a performance hit when using JavaScript objects passed to Rust?

When using Rust in a browser, I can get JavaScript objects and use them inside Rust (using, for instance, the js! macro from the stdweb library).
Do I get a performance hit when using these objects? Should I always copy them to Rust structures?
In Wasm, accessing Rust struct fields is much faster than accessing the fields of a JS object.
There is a cost to converting a JS object into a Rust struct, so if you only need to access one or two fields of a JS object once, it might be more efficient to do just that rather than convert the entire object to a Rust struct first.

Can Kotlin or Swift parse JSON just like JavaScript?

If I'm using JavaScript (or TypeScript), I can do the following (just the idea):
object = JSON.parse(jsonString)
And I can just use it like this,
alert(object.property);
Super Simple.
If I'm using Java, I need to create classes and parse the JSON into them before I can use it. I understand that.
What about Kotlin and Swift? They have optional types, so why doesn't single-line, JavaScript-like parsing exist for them, or does it? (Without even a data class or walking through the JSON's properties.)
If you look up what JSON stands for, it's no wonder JavaScript has "native support" for it: JavaScript Object Notation.
In Kotlin you'll need to use a library for parsing JSON; I'd recommend Jackson, a library already widely used with Java.

JavaScript operator overloading on lists

JavaScript does not support operator overloading, so matrix libraries in JavaScript cannot simplify their notation. I would like to fake operator overloading with a simple trick, using syntax like z = x++y, which is not a valid statement in JavaScript.
That is why I would like to create an include method that parses existing JavaScript files and replaces those statements with actual JavaScript code. This is somewhat related to CoffeeScript, where the compiler itself runs in JavaScript. What would be the best way to approach that?
I have a string-manipulation solution:
"z=x++y;".replace(/(.*)=(.*)\+\+(.*)/i,"for(var _i_=0;_i_<$1.length;_i_++){ $1[i] = $2[i]+$3[i]}")
Example run:
for(var _i_=0;_i_<c.length;_i_++){ c[_i_] = data[0][_i_]+data[1][_i_]; }
Obtaining a MATLAB or numpy-like environment in JavaScript would be very convenient for deploying scientific models as web applications and avoiding the computational burden on the server side. Also, parallelization would be as easy as opening another browser tab.
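To make the idea concrete, here is a minimal sketch of what such an include/transform step could look like, under the assumption that it only has to handle the single z = x++y pattern above (the function name transformSource is invented):

// Minimal sketch: rewrite every "a=b++c;" statement into an element-wise loop.
// A real version would need a proper parser; transformSource is an invented name.
function transformSource(src) {
  return src.replace(
    /(\w+)\s*=\s*(\w+)\s*\+\+\s*(\w+)\s*;/g,
    'for(var _i_=0;_i_<$1.length;_i_++){ $1[_i_] = $2[_i_]+$3[_i_]; }'
  );
}

var rewritten = transformSource('z=x++y;');
// rewritten === 'for(var _i_=0;_i_<z.length;_i_++){ z[_i_] = x[_i_]+y[_i_]; }'

var x = [1, 2, 3], y = [10, 20, 30];
var z = new Array(x.length); // pre-allocated, since the loop bound is z.length
eval(rewritten);             // eval only to demonstrate the "rewrite at include time" idea
// z is now [11, 22, 33]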

Parse JSON into an object model, or use plain JSON?

I'm doing research on the subject of JSON deserialization into an object model; what do you think about it?
Would you prefer to use the JSON from the server as-is, or convert it to an object model (concrete JS objects)?
What are the benefits of mapping it to objects rather than using the raw JSON, and what are the drawbacks?
What are the performance implications?
Our legacy developer wrote an SDK/DAL with a mapper function that traverses the JSON and creates concrete objects;
you can see the implementation here : https://gist.github.com/send2moran/211a2eb19c4a7bf494e8
Would you work with the parsed JSON or prefer to use a mapping function?
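For concreteness, here is a minimal sketch of the kind of mapping being compared (the Customer class and its fields are invented, not taken from the linked gist):

// Minimal sketch: map raw parsed JSON into a concrete object.
// The Customer class and its fields are invented for illustration.
function Customer(data) {
  this.id = data.id;
  this.name = data.name;
  this.createdAt = new Date(data.createdAt); // mapping lets you coerce types up front
}
Customer.prototype.isNew = function () {
  return Date.now() - this.createdAt.getTime() < 24 * 60 * 60 * 1000;
};

var raw = JSON.parse('{"id":1,"name":"Ada","createdAt":"2015-01-01T00:00:00Z"}');

// Option 1: work with the parsed JSON directly.
console.log(raw.name);

// Option 2: map it into a concrete object and gain methods and type coercion,
// at the cost of an extra traversal and more code to maintain.
var customer = new Customer(raw);
console.log(customer.isNew());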
