What to use instead of Buffer(bitmap) - javascript

I'm using the following method to convert files into base64 encoding and it's been working fine for a long time, but I see now that Buffer is deprecated.
// function to encode file data to base64 encoded string
function base64_encode(file) {
  var bitmap = fs.readFileSync(file);
  // convert binary data to base64 encoded string
  return new Buffer(bitmap).toString("base64");
}
let base64String = base64_encode("Document.png");
Can someone please help me modify this to work with the new suggested method as I'm not sure how to modify it myself?
Thank you so much in advance.

It is not Buffer that is deprecated but its constructor, so instead of new Buffer() you use e.g. Buffer.from.
However fs.readFileSync already returns a Buffer if no encoding is specified, so there is not really a need to pass that to another buffer. Instead you can do return fs.readFileSync(file).toString("base64")
Using the Sync part of the API is, most of the time, something you would like to avoid; if possible, switch over to the promise-based API.
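For illustration, a minimal sketch combining both suggestions, using Node's promise-based fs API (assuming a modern Node version where fs/promises is available):

const { readFile } = require("fs/promises");

// read the file and encode it as base64; readFile returns a Buffer
// when no encoding is given, so toString("base64") is all that's needed
async function base64_encode(file) {
  const bitmap = await readFile(file);
  return bitmap.toString("base64");
}

base64_encode("Document.png").then(base64String => console.log(base64String));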

Related

How to decode a base64 encoded string returned from an api correctly

Having a base64 encoded JSON-like object coming from a Python API, what is the correct way to decode and parse the encoded JSON in JavaScript?
Python makes it a byte string literal by adding a b prefix and additional apostrophes.
I have written the following function using the Buffer.from and buf.toString methods, which is working fine, but the problem is that the string I receive from the API has the format b'<encoded-string>', with the initial b and the apostrophes (').
const decodedObject: {
  foo?: string;
} = JSON.parse(
  Buffer.from(encodedString.replace("b'", '').replace("'", ''), 'base64').toString(),
);
atob and btoa seem to be deprecated, as I understand from the warning in my IDE. That's why I used the Buffer methods.
So, my question is: removing those apostrophes and the b manually didn't feel right. Is there something I might be missing?
For example, creating the encoded base64 in Python:
>>> import base64
>>> "https://example.com?encoded={}".format(base64.b64encode(str({"foo":"bar"}).encode('utf-8')))
"https://example.com?encoded=b'eydmb28nOiAnYmFyJ30='"
In order to decode that base64 in JavaScript, I first need to remove that prefix and those apostrophes.
> console.log(atob('eydmb28nOiAnYmFyJ30='))
{'foo': 'bar'}
The problem here is that your Python code is pretending strings are JSON, instead of actually using the json library to generate proper JSON.
So, let's fix that:
import base64
import json

data = {"foo": "bar"}
# b64encode takes and returns bytes, so encode the JSON string first
# and decode the result back to a plain str before putting it in the URL
encoded_value = base64.b64encode(json.dumps(data).encode("utf-8")).decode("ascii")
url = f"https://example.com?encoded={encoded_value}"
And done, the encoded URL query argument is now a normal, trivially decoded value on the JS side, because we're guaranteed that it's proper JSON, just base64 encoded.
Which means that in order to unpack that on the JS side, we just run through the obvious steps:
get the query value,
decode it,
parse it as JSON
So:
// I have no idea if you're running Node, a browser, Deno, etc.
// so substitute window.location or your framework's path object.
const url = new URL(request.path)
// get the encoded value
const query = new URLSearchParams(url.search)
const encoded = query.get(`encoded`)
// turn it back into a JSON string and parse that
const decoded = base64Decode(encoded); // again, no idea where you're running this.
try {
  const unpacked = JSON.parse(decoded);
  console.log(unpacked); // { foo: "bar" }
} catch (e) {
  // ALWAYS run JSON.parse in a try/catch because it can, and will, throw.
}
Note that base64Decode(encoded) is a stand-in for whatever base64 library you're using.
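As a hedged sketch only, base64Decode could be as simple as one of the following, depending on the runtime (the TextDecoder step is the UTF-8 handling that atob alone doesn't give you):

// Node: Buffer handles both the base64 and the UTF-8 decoding
function base64Decode(encoded) {
  return Buffer.from(encoded, "base64").toString("utf-8");
}

// Browser: atob yields a binary string; map it to bytes, then decode as UTF-8
function base64Decode(encoded) {
  const bytes = Uint8Array.from(atob(encoded), c => c.charCodeAt(0));
  return new TextDecoder().decode(bytes);
}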

How to convert Javascript Object to ArrayBuffer?

I retrieve an encoded string from the server using AJAX (it was encoded with TextEncoder into UTF-8 and stringified before being sent to the server). I parse it upon retrieval and get an Object. I need to convert this Object to a decoded string. TextDecoder seems to have a decode method, but it expects an ArrayBuffer or ArrayBufferView, not an Object. That method throws a TypeError if I use my Object as-is:
var myStr = "This is a string, possibly with utf-8 or utf-16 chars.";
console.log("Original: " + myStr);
var encoded = new TextEncoder("UTF-16").encode(myStr);
console.log("Encoded: " + encoded);
var encStr = JSON.stringify(encoded);
console.log("Stringfied: " + encStr);
//---------- Send it to the server; store in db; retrieve it later ---------
var parsedObj = JSON.parse(encStr); // Returns an "Object"
console.log("Parsed: " + parsedObj);
// The following decode method expects ArrayBuffer or ArrayBufferView only
var decStr = new TextDecoder("UTF-16").decode(parsedObj); // TypeError
// Do something with the decoded string
This SO question (6965107) has extensive discussion on converting strings/ArrayBuffers, but none of those answers work for my situation. I also came across this article, which does not work if I have an Object.
Some posts suggest using "responseType: arraybuffer", which results in an ArrayBuffer response from the server, but I cannot use it when retrieving this encoded string because there are many other items in the same result data which need a different content-type.
I am kind of stuck and unable to find a solution after searching for a day on google and SO. I am open to any solution that lets me save "strings containing international characters" to the server and "retrieve them exactly as they were", except changing the content-type because these strings are bundled within JSON objects that carry audio, video, and files. Any help or suggestions are highly appreciated.
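For reference, a minimal sketch of what the JSON round-trip does to the typed array and one way to rebuild it, assuming the data was produced as in the snippet above (default UTF-8 is used here, since TextEncoder always encodes UTF-8 regardless of the constructor argument):

// JSON.stringify turns a Uint8Array into a plain object: {"0":104,"1":195,...}
var encStr = JSON.stringify(new TextEncoder().encode("héllo"));
var parsedObj = JSON.parse(encStr); // a plain Object, not a typed array

// rebuild a Uint8Array from the numeric values, then decode it
var bytes = Uint8Array.from(Object.values(parsedObj));
var decStr = new TextDecoder().decode(bytes); // "héllo"
console.log(decStr);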

Why is readAsBinaryString() deprecated

Why is readAsBinaryString() deprecated? From W3C
The use of readAsArrayBuffer() is preferred over readAsBinaryString(), which is provided for backwards compatibility.
readAsBinaryString returns a completely different thing than the other method, so how could one be the replacement for the other?
In my specific case I have a Blob that I need to convert to base64. There are many ways, but most of them are not memory efficient. In my tests, calling window.btoa() on readAsBinaryString's result works best. If I cannot (or for now let's say "should not") use this anymore, then I must convert the array to a string using iteration and string concatenation, which is not memory efficient at all!
So after researching for days I haven't really found an alternative to readAsBinaryString; that's why the question. Or do you see an alternative that also works with 100 MB blobs?
The history is that readAsBinaryString was present in an early specification of the FileReader API, before the ArrayBuffer interface existed.
When the ArrayBuffer interface appeared, readAsBinaryString was deprecated because all its use cases could be handled in a better way using that new interface.
Indeed, readAsBinaryString only converts the binary data into a DOMString (UTF-16). There is not much you can do with it afterward. Also, storing it as a UTF-16 string means it takes far more space in memory than the original data. Add to this that strings are immutable, and I guess you can see how inefficient it is to work from this.
And finally, if you really need to have this string, you can actually produce the same thing from an ArrayBuffer: you just need to call String.fromCharCode over a Uint8Array view of this ArrayBuffer.
// generate some binary data
document.createElement('canvas').toBlob(blob => {
  const bin_reader = new FileReader();
  const arr_reader = new FileReader();
  let done = 0;
  bin_reader.onload = arr_reader.onload = e => {
    if (++done === 2) {
      const arr_as_bin = [...new Uint8Array(arr_reader.result)]
        .map(v => String.fromCharCode(v)).join('');
      console.log('same results: ', arr_as_bin === bin_reader.result);
      console.log(arr_as_bin);
    }
  };
  bin_reader.readAsBinaryString(blob);
  arr_reader.readAsArrayBuffer(blob);
});
Now, this method, while still of little use, has been re-added to the specs, because some websites had started using it.
And to help the OP a bit more: since what they were actually trying to do was get a base64 version of their Blob, don't even use readAsArrayBuffer(); readAsDataURL() is what you need:
const blob = new Blob(['hello']);
const reader = new FileReader();
reader.onload = e => {
  const dataURL = reader.result;
  const base64 = dataURL.slice(dataURL.indexOf(',') + 1);
  console.log(base64);
};
reader.readAsDataURL(blob);

Blob's DataUri vs Base64 string DataUri

As you know, and as stated by the W3C, it is possible to create a URL for a Blob object in JavaScript by using URL.createObjectURL. On the other hand, if we have data as a Base64 encoded string, we can present it as a URL in the format "data:[MIMEType];base64,[data]".
Let's suppose that I have a base64 encoded string that was generated from an image that is very popular these days :) the "red dot" image on Wikipedia.
var reddotB64 = "iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg";
I'm 100% sure that if I create a URL conforming to the Data URI scheme as stated above, then I'll be able to add a link element and download it from the browser; please see the code example below:
var reddotB64 = "iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg";
var reddotLink = document.createElement("a");
reddotLink.target = "_blank";
reddotLink.href = "data:image/png;base64," + reddotB64;
document.body.appendChild(reddotLink);
reddotLink.click();
document.body.removeChild(reddotLink);
This works pretty well and displays the image in a new tab. On the other hand, I'll try to create the link by using a Blob, as follows:
var reddotB64 = "iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg";
var reddotBlob = new Blob([atob(reddotB64)], { type: 'image/png' });
var reddotLink = document.createElement("a");
reddotLink.target = "_blank";
reddotLink.href = URL.createObjectURL(reddotBlob);
document.body.appendChild(reddotLink);
reddotLink.click();
document.body.removeChild(reddotLink);
This code decodes the base64 encoded string variable reddotB64 via the atob function, then creates a Blob object and continues with the URL.createObjectURL function. Since I've decoded reddotB64 from base64 to binary, created a Blob of type image/png, and then created an object URL from that, I expect it to work, but it's not working.
Do you have a clue why it's not working? Am I missing anything in the standards, or doing something wrong in JavaScript?
Here is the answer. It looks like it is an encoding issue. To decode a Base64 string to binary (a Uint8Array of bytes), atob alone is not enough: atob only gives you a binary string, and passing a string to the Blob constructor re-encodes it as UTF-8 text, corrupting any byte above 0x7F. After using atob, you need to take the character code of every character in the decoded string (with charCodeAt) and write those values into a Uint8Array. Create the Blob from that typed array and then call URL.createObjectURL, which definitely works.
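For illustration, a minimal sketch of that conversion (the helper name base64ToBlob is just an example, not from the original post):

// decode a base64 string into a Blob without the Blob constructor
// re-encoding the data as UTF-8 text
function base64ToBlob(b64, type) {
  var binary = atob(b64);                    // binary string, one char per byte
  var bytes = new Uint8Array(binary.length);
  for (var i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);         // copy each byte's character code
  }
  return new Blob([bytes], { type: type });
}

var reddotBlob = base64ToBlob(reddotB64, "image/png");
reddotLink.href = URL.createObjectURL(reddotBlob);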

Returning a byte string to ExternalInterface.call throws an error

I am working on my open source project Downloadify, and up until now it simply handles returning Strings in response to ExternalInterface.call commands.
I am trying to put together a test case using JSZip and Downloadify together, the end result being that a Zip file is created dynamically in the browser, then saved to the disk using FileReference.save. However, this is my problem:
The JSZip library can return either a base64 encoded string of the Zip, or the raw byte string. The problem is, if I return that byte string in response to the ExternalInterface.call command, I get this error:
Error #1085: The element type "string" must be terminated by the matching end-tag "</string>"
ActionScript 3:
var theData:* = ExternalInterface.call('Downloadify.getTextForSave',queue_name);
Where queue_name is just a string used to identify the correct instance in JS.
JavaScript:
var zip = new JSZip();
zip.add("test.txt", "Hello world!\n");
var content = zip.generate(true);
return content;
If I instead return a normal string instead of the byte string, the call works correctly. I would like to avoid using base64, as I would have to include a base64 decoder in my swf, which would increase its size.
Finally: I am not looking for an AS3 Zip generator. It is imperative to my project that this part runs in JavaScript.
I am admittedly not an AS3 programmer by trade, so if you need any more detail please let me know.
When data is returned from JavaScript calls, it is serialized into an XML string. So if the "raw string" returned by JSZip includes characters which make the XML invalid, which is what I think is happening here, you'll get errors like that.
What you get as a return is actually:
<string>[your JSZip generated string]</string>
Imagine your return string includes a "<" char: this would make the XML invalid, and it's hard to tell what character codes a raw byte stream will translate to.
You can read more about the external API's XML format on LiveDocs
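As a hedged workaround sketch on the JavaScript side (not part of the original answer): return only ASCII-safe data, e.g. the byte string re-encoded as base64 with btoa, so the XML serialization stays valid; the trade-off is the base64 decoder needed in the SWF, as discussed below.

var zip = new JSZip();
zip.add("test.txt", "Hello world!\n");
var content = zip.generate(true);  // raw byte string, as in the question
// btoa takes a binary string (char codes 0-255) and yields plain ASCII,
// which survives ExternalInterface's XML serialization intact
return btoa(content);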
I think the problem is caused by the fact that Flash expects a UTF-8 string and you throw some binary stuff at it. For example, 0x00FF will not turn out to be valid UTF-8...
You can try fiddling around with flash.system::System.setCodePage, but I wouldn't be too optimistic...
I guess a base64 decoder is probably really the easiest. I'd rather worry about speed than about file size, though; this rudimentary decoder method uses less than half a K:
public function decodeBase64(source:String):ByteArray {
  var ret:ByteArray = new ByteArray();
  var map:Object = new Object();
  var i:int = 0;
  for each (var char:String in "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/".split("")) map[char] = i++;
  map["="] = 0;
  source = source.split("\n").join("").split("\r").join(""); // remove linebreaks
  for (i = 0; i < source.length / 4; i++) {
    var buf:int = 0;
    for each (char in source.substr(i * 4, 4).split("")) buf = (buf << 6) + map[char];
    ret.writeByte(buf >>> 16);
    ret.writeShort(buf);
  }
  return ret;
}
You could simply shorten function names and use a smaller image ... or use ColorTransform or ConvolutionFilter on one image instead of four ... or compile the image into the SWF for a smaller overall size ...
so unless you're planning on working with MBs of data, this is the way to go ...
