Issue in decoding in Node.js - javascript

I have a requirement where I need to encode data in "iso-8859-1" and then convert it back to a readable string in Node.js.
In .NET:
string encodedData = "VABpAG0AZQAgAHMAZQByAGUAaQBzAA==";
Encoding encoding = Encoding.GetEncoding("iso-8859-1"); // encoding in "iso-8859-1"
byte[] decodedbuff = Convert.FromBase64String(encodedData); // getting buffer
string result = encoding.GetString(decodedbuff); // decoding
// result = "Time sereis"
In a similar way, I need to encode and decode in Node.js.
In Node.js (using iconv-lite):
const iconvlite = require('iconv-lite');
const data = "VABpAG0AZQAgAHMAZQByAGUAaQBzAA==";
const buffer = iconvlite.encode(data, 'iso-8859-1');
const result = buffer.toString('utf8');
Here in result I am getting "VABpAG0AZQAgAHMAZQByAGUAaQBzAA==" instead of the decoded string.

By using the following code you get your desired result:
let buffer = Buffer.from(data, 'base64'); // Buffer.from replaces the deprecated new Buffer(...)
let result = buffer.toString('utf-8');
console.log("result: " + result);

Related

I want to decompress a GZIP string with JavaScript

I have this GZIPed string: H4sIAAAAAAAA//NIzcnJVyguSUzOzi9LLUrLyS/XUSjJSMzLLlZIyy9SSMwpT6wsVshIzSnIzEtXBACs78K6LwAAAA==
I created that with this website: http://www.txtwizard.net/compression
I have tried using pako to ungzip it.
import { ungzip } from 'pako';
const textEncoder = new TextEncoder();
const gzipedData = textEncoder.encode("H4sIAAAAAAAA//NIzcnJVyguSUzOzi9LLUrLyS/XUSjJSMzLLlZIyy9SSMwpT6wsVshIzSnIzEtXBACs78K6LwAAAA==");
console.log('gzipeddata', gzipedData);
const ungzipedData = ungzip(gzipedData);
console.log('ungziped data', ungzipedData);
The issue is that Pako throws the error: incorrect header check
What am I missing here?
The "H4sIAAAAAAAA//NIzcnJVyguSUzOzi9LLUrLyS/XUSjJSMzLLlZIyy9SSMwpT6wsVshIzSnIzEtXBACs78K6LwAAAA==" is a base64 encoded string, you first need to decode that into a buffer.
textEncoder.encode just encodes that base64 encoded string into a byte stream.
How to do that depends on whether you are in a browser or on Node.js.
node.js version
To convert the unzipped data back to a string you further have to use new TextDecoder().decode().
For Node you will use Buffer.from(string, 'base64') to decode the base64 encoded string:
import { ungzip } from 'pako';
// decode the base64 encoded data
const gzipedData = Buffer.from("H4sIAAAAAAAA//NIzcnJVyguSUzOzi9LLUrLyS/XUSjJSMzLLlZIyy9SSMwpT6wsVshIzSnIzEtXBACs78K6LwAAAA==", "base64");
console.log('gzipeddata', gzipedData);
const ungzipedData = ungzip(gzipedData);
console.log('ungziped data', new TextDecoder().decode(ungzipedData));
browser version
In the browser, you have to use atob, and you need to convert the decoded data to a Uint8Array using e.g. Uint8Array.from.
The conversion I used was taken from Convert base64 string to ArrayBuffer, you might need to verify if that really works in all cases.
// decode the base64 encoded data
const gzipedData = atob("H4sIAAAAAAAA//NIzcnJVyguSUzOzi9LLUrLyS/XUSjJSMzLLlZIyy9SSMwpT6wsVshIzSnIzEtXBACs78K6LwAAAA==")
const gzipedDataArray = Uint8Array.from(gzipedData, c => c.charCodeAt(0))
console.log('gzipeddata', gzipedDataArray);
const ungzipedData = pako.ungzip(gzipedDataArray);
console.log('ungziped data', new TextDecoder().decode(ungzipedData));
<script src="https://cdnjs.cloudflare.com/ajax/libs/pako/2.0.4/pako.min.js"></script>
This string is base64-encoded.
You first need to decode it to a buffer:
const gzippedString = 'H4sIAAAAAAAA//NIzcnJVyguSUzOzi9LLUrLyS/XUSjJSMzLLlZIyy9SSMwpT6wsVshIzSnIzEtXBACs78K6LwAAAA==';
const gzippedBuffer = Buffer.from(gzippedString, 'base64');
Then you can un(g)zip it:
const unzippedBuffer = ungzip(gzippedBuffer);
The result of ungzip is a Uint8Array. If you want to convert it back to a string you'll need to decode it again:
const unzippedString = new TextDecoder('utf8').decode(unzippedBuffer);

Client side decompression back to string from C# compression of string

I have some large data sets which I would like to compress before I send to my client. The compression works.
Utilizing this bit of code which turns my data into a nice, small base64String:
Example: string mytest = "This is some test text.";
public static string Compress(string text)
{
byte[] buffer = System.Text.Encoding.UTF8.GetBytes(text);
MemoryStream ms = new MemoryStream();
using (GZipStream zip = new GZipStream(ms, CompressionMode.Compress, true))
{
zip.Write(buffer, 0, buffer.Length);
}
ms.Position = 0;
MemoryStream outStream = new MemoryStream();
byte[] compressed = new byte[ms.Length];
ms.Read(compressed, 0, compressed.Length);
byte[] gzBuffer = new byte[compressed.Length + 4];
System.Buffer.BlockCopy(compressed, 0, gzBuffer, 4, compressed.Length);
System.Buffer.BlockCopy(BitConverter.GetBytes(buffer.Length), 0, gzBuffer, 0, 4);
return Convert.ToBase64String(gzBuffer);
}
On the client side, I need to walk this whole thing backwards.
I can convert the base64 string back to a byte array using the base64-binary.js library:
var byteArray = Base64Binary.decodeArrayBuffer(source);
Then using pako.js I can deflate the gzip compressed content:
var deflate = new pako.Deflate({ level: 1 });
deflate.push(uintArray, true);
if (deflate.err) { throw new Error(deflate.err); }
Finally, I should be able to convert this back to my text:
var encodedString = String.fromCharCode.apply(null, deflate.result)
var decodedString = decodeURIComponent(encodedString);
Problem is that while I get no errors, I don't get the expected result, which should be the original string - "This is some test text."
Output is like this (can't paste it all):
xg``ïæ
Any thought on what am I missing?
You need to use pako.inflate in your frontend.
Additionally, you need to strip the 4 size bytes that the C# code prepended to gzBuffer before inflating.
Something like this should work:
// "cookies rule the world" compressed with your c# code
let sample = "FgAAAB+LCAAAAAAABABLzs/PzkwtVigqzUlVKMlIVSjPL8pJAQBkkN7rFgAAAA==";
// decode base64 & convert to Uint8 Array
let binary = atob(sample);
let bytes = Uint8Array.from(binary, c => c.charCodeAt(0));
// You appended the length at the start of gzBuffer, so you need to remove those bytes
bytes = bytes.slice(4);
// inflate the message & convert it to a string
let inflated = pako.inflate(bytes);
let message = String.fromCharCode.apply(null, inflated);
console.log(message);
<script src="https://raw.githubusercontent.com/danguer/blog-examples/master/js/base64-binary.js"></script>
<script src="https://unpkg.com/pako#1.0.10/dist/pako.min.js"></script>

Similar Encrypt code in javascript as in C#

I use some remote api, they use such C# code:
SHA256Managed sha256Managed = new SHA256Managed();
byte[] passwordSaltBytes = Encoding.Unicode.GetBytes("zda");
byte[] hash = sha256Managed.ComputeHash(passwordSaltBytes);
string result = Convert.ToBase64String(hash);
Console.WriteLine("result = " + result); // result = NUbWRkT8QfzmDt/2kWaikNOZUXIDt7KKRghv0rTGIp4=
I need to get the same result in my JavaScript frontend code. Can somebody help with this problem?
Not quite obvious, but Unicode in C# means the UTF-16LE encoding.
So you can use CryptoJS to achieve the same result:
var utf16 = CryptoJS.enc.Utf16LE.parse("zda");
var hash = CryptoJS.SHA256(utf16);
var base64 = CryptoJS.enc.Base64.stringify(hash);
console.log(base64);
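In Node.js you can get the same result without CryptoJS; a minimal sketch using the built-in crypto module (Node's 'utf16le' buffer encoding corresponds to C#'s Encoding.Unicode):
const crypto = require('crypto');

const hash = crypto.createHash('sha256')
  .update(Buffer.from('zda', 'utf16le')) // Encoding.Unicode is UTF-16LE
  .digest('base64');
console.log(hash); // NUbWRkT8QfzmDt/2kWaikNOZUXIDt7KKRghv0rTGIp4=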

How to compute an SHA256 hash and Base64 String encoding in JavaScript/Node

I am trying to recreate the following C# code in JavaScript.
SHA256 myHash = new SHA256Managed();
Byte[] inputBytes = Encoding.ASCII.GetBytes("test");
myHash.ComputeHash(inputBytes);
return Convert.ToBase64String(myHash.Hash);
This code returns "n4bQgYhMfWWaL+qgxVrQFaO/TxsrC4Is0V1sFbDwCgg=".
This is what I have so far for my JavaScript code
var sha256 = require('js-sha256').sha256;
var Base64 = require('js-base64').Base64;
var sha256sig = sha256("test");
return Base64.encode(sha256sig);
The JS code returns "OWY4NmQwODE4ODRjN2Q2NTlhMmZlYWEwYzU1YWQwMTVhM2JmNGYxYjJiMGI4MjJjZDE1ZDZjMTViMGYwMGEwOA==".
These are the 2 JS libraries that I have used
js-sha256
js-base64
Does anybody know how to make it work? Am I using the wrong libs?
You don't need any libraries to use cryptographic functions in Node.js. (Your snippet returns a longer string because js-sha256 produces a hex string, which you then base64-encode, instead of base64-encoding the raw hash bytes.)
const crypto = require('crypto');
const hash = crypto.createHash('sha256')
.update('test')
.digest('base64');
console.log(hash); // n4bQgYhMfWWaL+qgxVrQFaO/TxsrC4Is0V1sFbDwCgg=
If your target users are on a modern browser such as Chrome or Edge, just use the browser Crypto API:
const text = 'hello';
async function digestMessage(message) {
const msgUint8 = new TextEncoder().encode(message); // encode as (utf-8) Uint8Array
const hashBuffer = await crypto.subtle.digest('SHA-256', msgUint8); // hash the message
const hashArray = Array.from(new Uint8Array(hashBuffer)); // convert buffer to byte array
const hashHex = hashArray.map((b) => b.toString(16).padStart(2, '0')).join(''); // convert bytes to hex string
return hashHex;
}
const result = await digestMessage(text);
console.log(result)
Then you can verify the result via an online SHA-256 tool.
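The snippet above produces a hex string; to match the Base64 output of the C# code, a small variation (a sketch using the same Web Crypto call plus btoa):
async function digestMessageBase64(message) {
  const msgUint8 = new TextEncoder().encode(message);
  const hashBuffer = await crypto.subtle.digest('SHA-256', msgUint8);
  // btoa expects a binary string, so turn the hash bytes into one first
  return btoa(String.fromCharCode(...new Uint8Array(hashBuffer)));
}

digestMessageBase64('test').then(console.log); // n4bQgYhMfWWaL+qgxVrQFaO/TxsrC4Is0V1sFbDwCgg=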

Decompress gzip and zlib string in javascript

I want to decompress layer data from a TMX file. Which libraries can decompress gzip and zlib strings in JavaScript? I tried zlib.js but it doesn't work for me. For example, the layer data in the TMX file is:
<data encoding="base64" compression="zlib">
eJztwTEBAAAAwqD1T20JT6AAAHgaCWAAAQ==
</data>
My JavaScript code is:
var base64Data = "eJztwTEBAAAAwqD1T20JT6AAAHgaCWAAAQ==";
var compressData = atob(base64Data);
var inflate = new Zlib.Inflate(compressData);
var output = inflate.decompress();
It fails with the error "unsupported compression method". But when I try to decompress with an online tool such as http://i-tools.org/gzip, it returns the correct string.
Pako is a full and modern Zlib port.
Here is a very simple example and you can work from there.
Get pako.js and you can decompress byteArray like so:
<html>
<head>
<title>Gunzipping binary gzipped string</title>
<script type="text/javascript" src="pako.js"></script>
<script type="text/javascript">
// Get datastream as Array, for example:
var charData = [31,139,8,0,0,0,0,0,0,3,5,193,219,13,0,16,16,4,192,86,214,151,102,52,33,110,35,66,108,226,60,218,55,147,164,238,24,173,19,143,241,18,85,27,58,203,57,46,29,25,198,34,163,193,247,106,179,134,15,50,167,173,148,48,0,0,0];
// Turn number array into byte-array
var binData = new Uint8Array(charData);
// Pako magic
var data = pako.inflate(binData);
// Convert gunzipped byteArray back to ascii string:
var strData = String.fromCharCode.apply(null, new Uint16Array(data));
// Output to console
console.log(strData);
</script>
</head>
<body>
Open up the developer console.
</body>
</html>
Running example: http://jsfiddle.net/9yH7M/
Alternatively, you can base64-encode the data before you send it over, since a raw number array adds a lot of overhead in JSON or XML. Decode likewise:
// Get some base64 encoded binary data from the server. Imagine we got this:
var b64Data = 'H4sIAAAAAAAAAwXB2w0AEBAEwFbWl2Y0IW4jQmziPNo3k6TuGK0Tj/ESVRs6yzkuHRnGIqPB92qzhg8yp62UMAAAAA==';
// Decode base64 (convert ascii to binary)
var strData = atob(b64Data);
// Convert binary string to character-number array
var charData = strData.split('').map(function(x){return x.charCodeAt(0);});
// Turn number array into byte-array
var binData = new Uint8Array(charData);
// Pako magic
var data = pako.inflate(binData);
// Convert gunzipped byteArray back to ascii string:
var strData = String.fromCharCode.apply(null, new Uint16Array(data));
// Output to console
console.log(strData);
Running example: http://jsfiddle.net/9yH7M/1/
To go more advanced, here is the pako API documentation.
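Worth noting for the TMX case specifically: pako's inflate auto-detects the zlib/gzip wrapper by default (per its documentation), so one call should cover both compression="zlib" and compression="gzip" layers. A sketch with the base64 payload from the question:
var base64Data = "eJztwTEBAAAAwqD1T20JT6AAAHgaCWAAAQ==";
// decode base64 to bytes; pako.inflate detects the zlib or gzip header itself
var tmxBytes = Uint8Array.from(atob(base64Data), function (c) { return c.charCodeAt(0); });
var layerData = pako.inflate(tmxBytes);
console.log(layerData.length); // number of decompressed tile-data bytes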
I solved my problem with zlib.js. I fixed my code as below; the issue was that atob returns a binary string, while Zlib.Inflate expects an array of byte values:
var base64Data = "eJztwTEBAAAAwqD1T20JT6AAAHgaCWAAAQ==";
var compressData = atob(base64Data);
var compressData = compressData.split('').map(function(e) {
return e.charCodeAt(0);
});
var inflate = new Zlib.Inflate(compressData);
var output = inflate.decompress();
For anyone using Ruby on Rails who wants to send compressed, encoded data to the browser and then uncompress it via JavaScript there, I've combined both excellent answers above into the following solution. Here's the Rails server code in my application controller, which compresses and encodes a string before sending it to the browser via an @variable to a .html.erb file:
require 'zlib'
require 'base64'
def compressor (some_string)
Base64.encode64(Zlib::Deflate.deflate(some_string))
end
Here's the Javascript function, which uses pako.min.js:
function uncompress(input_field){
base64data = document.getElementById(input_field).innerText;
compressData = atob(base64data);
compressData = compressData.split('').map(function(e) {
return e.charCodeAt(0);
});
binData = new Uint8Array(compressData);
data = pako.inflate(binData);
return String.fromCharCode.apply(null, new Uint16Array(data));
}
Here's a javascript call to that uncompress function, which wants to unencode and uncompress data stored inside a hidden HTML field:
my_answer = uncompress('my_hidden_field');
Here's the entry in the Rails application.js file to call pako.min.js, which is in the /vendor/assets/javascripts directory:
//= require pako.min
And I got the pako.min.js file from here:
https://github.com/nodeca/pako/tree/master/dist
All works at my end, anyway! :-)
I was sending data from a Python script and trying to decode it in JS. Here's what I had to do:
Python
import base64
import json
import urllib.parse
import zlib
...
data_object = {
'_id': '_id',
...
}
compressed_details = base64.b64encode(zlib.compress(bytes(json.dumps(data_object), 'utf-8'))).decode("ascii")
urlsafe_object = urllib.parse.quote(str(compressed_details))#.replace('%', '\%') # you likely don't need this last part
final_URL = f'https://my.domain.com?data_object={urlsafe_object}'
...
JS
// npm install this
import pako from 'pako';
...
const urlParams = new URLSearchParams(window.location.search);
const data_object = urlParams.get('data_object');
if (data_object) {
const compressedData = Uint8Array.from(window.atob(data_object), (c) => c.charCodeAt(0));
const originalObject = JSON.parse(pako.inflate(compressedData, { to: 'string' }));
}
...