I have a requirement where I need to encode data in "iso-8859-1" and then convert it back to a readable string in Node.js.
In the .NET environment:
string encodedData = "VABpAG0AZQAgAHMAZQByAGUAaQBzAA==";
Encoding encoding = Encoding.GetEncoding("iso-8859-1"); // encoding in "iso-8859-1"
byte[] decodedbuff = Convert.FromBase64String(encodedData); // getting buffer
string result = encoding.GetString(decodedbuff); // decoding
Here result is "timesereis".
I need to encode and decode in a similar way in Node.js.
In Node.js (using iconv-lite):
const data = "VABpAG0AZQAgAHMAZQByAGUAaQBzAA=="
const buffer = iconvlite.encode(data,'iso-8859-1');
const result = buffer.toString('utf8');
Here, in result, I am getting "VABpAG0AZQAgAHMAZQByAGUAaQBzAA==" back instead of the decoded string.
By using the following code you get your desired result:
let buffer = Buffer.from(data, 'base64');
let result = buffer.toString('latin1'); // 'latin1' is Node's name for ISO-8859-1
console.log("result: " + result);
Google Cloud Storage gives MD5 hashes of objects encoded in base64. An example would be H0m5T/tigkNJLqL6+z9A7Q==. I've tried to convert it using btoa(), but that results in I9O{bCI."z{?#m instead of the expected b1f4f9a523e36fd969f4573e25af4540.
I'm getting the string with File.metadata.md5Hash described here
Is there any way to convert this base64 -> H0m5T/tigkNJLqL6+z9A7Q== to this string -> b1f4f9a523e36fd969f4573e25af4540 in node.js?
Code for reference:
async function getAllmd5() {
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  var bucket = storage.bucket('example');
  var [files] = await bucket.getFiles();
  for (var i = 0; i < files.length; i++) {
    console.log(Buffer.from(files[i].metadata.md5Hash, 'base64').toString("ascii"));
  }
}
It seems I had the wrong thought process.
I was thinking: get data -> get hash -> encode to base64
The problem was that that isn't how it works.
What you need to do instead is: get data -> get hash AS base64. Usually a program will output the hash as hex, which is fine for most cases, but Google's is a base64-encoded MD5, which looks different from a hex one.
An example of how to get the md5 hash as base64:
// The modules we need:
const md5 = require('crypto-md5');
const fs = require('fs');
const path = require('path');

var filename = "example.txt";

// Read the file content
var filecontent = fs.readFileSync(path.join(__dirname, filename));
var hash = md5(filecontent, 'base64'); // Generate the md5 AS base64
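If you'd rather not add a dependency, the same base64 digest can be produced with Node's built-in crypto module. A minimal sketch (the file name is just an example):
// Compute the MD5 digest of a file and output it as base64, the same format
// Google Cloud Storage reports in metadata.md5Hash.
const crypto = require('crypto');
const fs = require('fs');
const path = require('path');
const filecontent = fs.readFileSync(path.join(__dirname, 'example.txt'));
const hash = crypto.createHash('md5').update(filecontent).digest('base64');
console.log(hash);
And if you need the comparison the other way around, Buffer.from(md5Hash, 'base64').toString('hex') turns Google's base64 value into the usual hex representation.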
I'm using a web service to convert a PDF to a bitmap to print on an Arduino mini thermal printer.
The Arduino BMP function requires the bitmap to be an array of type uint8_t. I can bring the BMP down in base64 encoding, so my question is: how do I convert a base64 string to an array of equivalent type uint8_t?
let buffer = new Buffer(body, 'base64').toString('hex');
let arr = [...buffer];
arr = arr.map(e => { return `0x${e.charCodeAt(0).toString(16)}`; });
I want to offload as much as possible to the server so that the Arduino doesn't have to handle this, so I return this JSON in the response:
let obj = {
  width: img.width,   // from cloudinary response (pdf to bmp)
  height: img.height, // from cloudinary response (pdf to bmp)
  data: arr
};
But the whole response is invalid. I'm not quite sure what I'm doing wrong here, but I assume it has something to do with my conversion to base64, then to hex, and then converting the range of characters to hex.
Update
I believe I'm getting closer:
let buffer = Buffer.from(body, 'base64');
let arrBuffer = [...buffer];
let imgArray = new Uint8Array(arrBuffer);
let hexArray = [];
for (const data of imgArray.values()) { hexArray.push(data.toString(16)); }
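For what it's worth, a shorter path (a sketch, assuming body holds the base64-encoded BMP as in the snippets above) is to let the Buffer do the work and return plain byte values, which map directly onto uint8_t on the Arduino side:
// Decode the base64 body straight into bytes (numbers 0-255)
const bytes = Array.from(Buffer.from(body, 'base64'));
const obj = {
  width: img.width,
  height: img.height,
  data: bytes // e.g. [66, 77, ...] for a BMP, since the file starts with "BM"
};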
The question says it all: I'm receiving a base64-encoded ZIP file from the server, which I first want to decode to a ZIP file in memory and then read the ZIP file's content, which is a JSON file.
I tried to use JSZip but I'm totally lost in this case ... the base64 string is received with JavaScript via a promise.
So my question in short is: how can I convert a base64-encoded ZIP file to a ZIP file in memory to get its contents?
BASE64 -> ZIPFILE -> CONTENT
I use this complicated process to save space in my database. And I don't want to handle this process on the server side, but on the client side with JS.
Thanks in advance!
If anyone is interested in my solution to this problem, read my answer right here:
I received the data in base64-string format, then converted the string to a blob. Then I used the blob handle to load the zip file with the JSZip library. After that I could just grab the contents of the zip file. Code is below:
function base64ToBlob(base64) {
  let binaryString = window.atob(base64);
  let binaryLen = binaryString.length;
  let ab = new ArrayBuffer(binaryLen);
  let ia = new Uint8Array(ab);
  for (let i = 0; i < binaryLen; i++) {
    ia[i] = binaryString.charCodeAt(i);
  }
  // Set the MIME type in the constructor; Blob.type is read-only afterwards
  let bb = new Blob([ab], { type: "application/zip" });
  bb.lastModifiedDate = new Date();
  bb.name = "archive.zip";
  return bb;
}
To get the contents of the zipfile:
let blob = base64ToBlob(resolved);
let zip = new JSZip();
zip.loadAsync(blob).then(function(zip) {
  zip.file("archived.json").async("string").then(function (content) {
    console.log(content);
    // content is the file as a string
  });
}).catch((e) => {
  console.error(e);
});
As you can see, first the blob is created from the base64 string. Then the handle is passed to JSZip's loadAsync method. After that you specify the name of the file you want to retrieve from the zip file, in this case the file called "archived.json". Because of the async("string") call, the file contents are returned as a string. To further use the extracted string, just work with the content variable.
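As a side note, per JSZip's documented options, loadAsync can also consume the base64 string directly, which would let you skip the blob step entirely (untested sketch using the same resolved string as above):
// JSZip decodes base64 input itself when told to via { base64: true }
let zip = new JSZip();
zip.loadAsync(resolved, { base64: true })
  .then((z) => z.file("archived.json").async("string"))
  .then((content) => console.log(content))
  .catch((e) => console.error(e));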
I want to get compressed layer data from a TMX file. Which libraries can decompress gzip and zlib strings in JavaScript? I tried zlib but it doesn't work for me. For example, the layer data in the TMX file is:
<data encoding="base64" compression="zlib">
eJztwTEBAAAAwqD1T20JT6AAAHgaCWAAAQ==
</data>
My JavaScript code is:
var base64Data = "eJztwTEBAAAAwqD1T20JT6AAAHgaCWAAAQ==";
var compressData = atob(base64Data);
var inflate = new Zlib.Inflate(compressData);
var output = inflate.decompress();
It fails with the error message "unsupported compression method". But when I try decompressing with an online tool such as http://i-tools.org/gzip, it returns the correct string.
Pako is a full and modern Zlib port.
Here is a very simple example and you can work from there.
Get pako.js and you can decompress byteArray like so:
<html>
<head>
  <title>Gunzipping binary gzipped string</title>
  <script type="text/javascript" src="pako.js"></script>
  <script type="text/javascript">
    // Get datastream as Array, for example:
    var charData = [31,139,8,0,0,0,0,0,0,3,5,193,219,13,0,16,16,4,192,86,214,151,102,52,33,110,35,66,108,226,60,218,55,147,164,238,24,173,19,143,241,18,85,27,58,203,57,46,29,25,198,34,163,193,247,106,179,134,15,50,167,173,148,48,0,0,0];
    // Turn number array into byte-array
    var binData = new Uint8Array(charData);
    // Pako magic
    var data = pako.inflate(binData);
    // Convert gunzipped byteArray back to ascii string:
    var strData = String.fromCharCode.apply(null, new Uint16Array(data));
    // Output to console
    console.log(strData);
  </script>
</head>
<body>
  Open up the developer console.
</body>
</html>
Running example: http://jsfiddle.net/9yH7M/
Alternatively you can base64 encode the array before you send it over, as the Array takes up a lot of overhead when sending as JSON or XML. Decode likewise:
// Get some base64 encoded binary data from the server. Imagine we got this:
var b64Data = 'H4sIAAAAAAAAAwXB2w0AEBAEwFbWl2Y0IW4jQmziPNo3k6TuGK0Tj/ESVRs6yzkuHRnGIqPB92qzhg8yp62UMAAAAA==';
// Decode base64 (convert ascii to binary)
var strData = atob(b64Data);
// Convert binary string to character-number array
var charData = strData.split('').map(function(x){return x.charCodeAt(0);});
// Turn number array into byte-array
var binData = new Uint8Array(charData);
// Pako magic
var data = pako.inflate(binData);
// Convert gunzipped byteArray back to ascii string:
var strData = String.fromCharCode.apply(null, new Uint16Array(data));
// Output to console
console.log(strData);
Running example: http://jsfiddle.net/9yH7M/1/
To go more advanced, here is the pako API documentation.
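Recent pako versions can also do the string conversion for you (the same option is used in a later answer below), which avoids the String.fromCharCode.apply trick and its call-stack limits on large outputs:
// pako can return a decoded string directly instead of a Uint8Array
var strData = pako.inflate(binData, { to: 'string' });
console.log(strData);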
I solved my problem with zlib. I fixed my code as below:
var base64Data = "eJztwTEBAAAAwqD1T20JT6AAAHgaCWAAAQ==";
var compressData = atob(base64Data);
compressData = compressData.split('').map(function(e) {
  return e.charCodeAt(0);
});
var inflate = new Zlib.Inflate(compressData);
var output = inflate.decompress();
For anyone using Ruby on Rails who wants to send compressed, encoded data to the browser and then uncompress it via JavaScript in the browser, I've combined both excellent answers above into the following solution. Here's the Rails server code in my application controller, which compresses and encodes a string before sending it to the browser via a @variable to a .html.erb file:
require 'zlib'
require 'base64'
def compressor(some_string)
  Base64.encode64(Zlib::Deflate.deflate(some_string))
end
Here's the JavaScript function, which uses pako.min.js:
function uncompress(input_field) {
  let base64data = document.getElementById(input_field).innerText;
  let compressData = atob(base64data);
  compressData = compressData.split('').map(function(e) {
    return e.charCodeAt(0);
  });
  let binData = new Uint8Array(compressData);
  let data = pako.inflate(binData);
  return String.fromCharCode.apply(null, new Uint16Array(data));
}
Here's a JavaScript call to that uncompress function, which decodes and uncompresses data stored inside a hidden HTML field:
my_answer = uncompress('my_hidden_field');
Here's the entry in the Rails application.js file to call pako.min.js, which is in the /vendor/assets/javascripts directory:
//= require pako.min
And I got the pako.min.js file from here:
https://github.com/nodeca/pako/tree/master/dist
All works at my end, anyway! :-)
I was sending data from a Python script and trying to decode it in JS. Here's what I had to do:
Python
import base64
import json
import urllib.parse
import zlib
...
data_object = {
    '_id': '_id',
    ...
}
compressed_details = base64.b64encode(zlib.compress(bytes(json.dumps(data_object), 'utf-8'))).decode("ascii")
urlsafe_object = urllib.parse.quote(str(compressed_details))#.replace('%', '\%') # you likely don't need this last part
final_URL = f'https://my.domain.com?data_object={urlsafe_object}'
...
JS
// npm install this
import pako from 'pako';
...
const urlParams = new URLSearchParams(window.location.search);
const data_object = urlParams.get('data_object');
if (data_object) {
  const compressedData = Uint8Array.from(window.atob(data_object), (c) => c.charCodeAt(0));
  const originalObject = JSON.parse(pako.inflate(compressedData, { to: 'string' }));
}
...