I have a Uint8Array in MongoDB that represents a PNG image. I pull the array out and convert it to base64 using JavaScript (in Meteor, client side) so that I can display it in the browser. The code is as follows:
var blob = new Uint8Array(Tester.findOne().image);
var base64String = btoa(String.fromCharCode.apply(null, blob));
var src = 'data:image/png;base64,' + base64String + '';
console.log(src);
Where Tester.findOne().image is the field in my MongoDB collection that holds the Uint8Array.
I take the src once it is logged and paste it into the browser. The generic broken-image icon appears (see below). However, when I click the broken image in the browser and choose to save it to the desktop, the file downloads, and when I open it the PNG image appears as expected.
After I download the image:
When I directly try to assign it to <img src='data'>:
Adding the data as the img source, or using document.getElementById("img") and then changing img.src:
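For reference, a minimal sketch of building the data URL byte by byte (String.fromCharCode.apply can hit call-stack limits on large arrays, which may or may not apply here) and assigning it straight to an img element; the "preview" element id is just a placeholder:
var bytes = new Uint8Array(Tester.findOne().image);
var binary = '';
for (var i = 0; i < bytes.length; i++) {
    // build the binary string one byte at a time to avoid apply() argument limits
    binary += String.fromCharCode(bytes[i]);
}
var dataUrl = 'data:image/png;base64,' + btoa(binary);
// assign the data URL directly instead of copying it out of the console
document.getElementById('preview').src = dataUrl;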
I'm trying to download an image and write it to disk in JavaScript. The image is delivered base64-encoded by the server. I then try to decode the image, create a blob from that data, and create a new object URL. The download itself works, but the output file is corrupted and unusable. My code looks as follows:
jsonObject = JSON.parse(requestObject.getReturnData());
decoded = atob(jsonObject['DownloadFile']);
url = window.URL.createObjectURL(new Blob([decoded], { type: "image/png" }));
aElement = document.createElement('a');
aElement.style.display = 'none';
aElement.href = url;
aElement.download = 'download.' + jsonObject['DownloadType'];
document.body.appendChild(aElement);
aElement.click();
window.URL.revokeObjectURL(url);
I verified that jsonObject['DownloadFile'] contains the correct base64 representation of the image. However, it seems that there's an error when creating the blob, since its size is too big when I inspect it in the debug console.
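The size blow-up is consistent with the Blob constructor UTF-8-encoding the binary string returned by atob. A minimal sketch of the usual fix is to copy the decoded string into a Uint8Array so the Blob receives raw bytes (reusing jsonObject from above):
const byteString = atob(jsonObject['DownloadFile']);
const bytes = new Uint8Array(byteString.length);
for (let i = 0; i < byteString.length; i++) {
    // each char code is one raw byte of the original PNG
    bytes[i] = byteString.charCodeAt(i);
}
const url = window.URL.createObjectURL(new Blob([bytes], { type: 'image/png' }));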
I'm trying to download an image using node.js and puppeteer but I'm running into some issues. I'm using a webscraper to gather the links of the images from the site and then using the https/http package to download the image.
This works for the images using http and https sources but some images have links that look like this (the whole link is very long so I cut the rest):
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAw8AAADGCAYAAACU07w3AAAZuUlEQVR4Ae3df4yU930n8Pcslu1I1PU17okdO1cLrTD+g8rNcvRyti6247K5NG5S5HOl5hA2uZ7du6RJEGYPTFy1Nv4RUJy0cWVkeQ9ErqqriHNrR8niZuVIbntBS886rBZWCGHVsNEFRQ5BloPCzGn2B+yzZMLyaP........
I'm not sure how to handle these links or how to download the image. Any help would be appreciated.
You first need to decode the base64 payload of the data URL using the Node.js Buffer class.
// the content type prefix ("data:image/png;base64,") has to be removed first
const data = 'iVBORw0KGgoAAAANSUhEUgAAAw8AAADGCAYAAACU07w3AAAZuUlEQVR4Ae3df4yU930n8Pcslu1I1PU17okdO1cLrTD+g8rNcvRyti6247K5NG5S5HOl5hA2uZ7du6RJEGYPTFy1Nv4RUJy0cWVkeQ9ErqqriHNrR8niZuVIbntBS886rBZWCGHVsNEFRQ5BloPCzGn2B+yzZMLyaP';
// decode the base64 payload into the raw PNG bytes
const buffer = Buffer.from(data, 'base64');
// buffer now holds the image bytes and can be written straight to disk, e.g. fs.writeFileSync('image.png', buffer)
These are base64-encoded images (mostly used for icons and small images).
You can detect and ignore them:
if (url.startsWith('data:')) {
    // base64 image
} else {
    // an image URL
}
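To make that skeleton concrete, here is a sketch of one way to fill in both branches; saveImage, fileName, fs, and https are assumptions rather than part of the original scraper:
const fs = require('fs');
const https = require('https');

function saveImage(url, fileName) {
    if (url.startsWith('data:')) {
        // base64 image: decode everything after the comma and write it directly
        fs.writeFileSync(fileName, Buffer.from(url.split(',')[1], 'base64'));
    } else {
        // an image URL: stream the HTTP response body to disk
        https.get(url, function (res) {
            res.pipe(fs.createWriteStream(fileName));
        });
    }
}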
If you really want to mess with the base64 images, I can give you a workaround:
import { parseDataURI } from 'dauria';
import mimeTypes from 'mime-types';
import fs from 'fs';

const fileContent = parseDataURI(file);
// you probably need an extension for that image.
let ext = mimeTypes.extension(fileContent.MIME) || 'bin';
fs.writeFile("a random file" + "." + ext, fileContent.buffer, function (err) {
    console.log(err); // writes out file without error, but it's not a valid image
});
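Here file is assumed to be the full data: URI string taken from the scraped link; as the snippet shows, parseDataURI splits it into the MIME type (fileContent.MIME) and a Buffer with the decoded bytes (fileContent.buffer).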
I'm working on a web application that involves loading images into a canvas object, then manipulating those images beyond recognition. I need to hide the original source image file (a jpeg) so that the user on the client side should not be able to use dev tools to see the original image.
I have tried encoding the images as base64 and loading them via a JSON data file, but even with this method the inspector tool still shows the original image file (when it is set as the src of my JavaScript Image object). Is there some way that I can encrypt and decrypt the image files, so that the user has no way of seeing the original image (or so they see some garbled image, for example)? Preferably I'd like to do this on the client side, as all my code is client side at the moment. Thanks in advance!
Here is my code for loading the base64 encoded image data via a JSON file:
//LOAD JSON INSTEAD?
$.getJSON("media/masks.json", function (data) {
    console.log("media/masks.json LOADED");
    // loop through data
    var cnt = 0;
    for (var key in data) {
        if (data.hasOwnProperty(key)) {
            // here you have access to
            //var id = key;
            var imgData = data[key];
            // create image object from data
            var image = new Image();
            image.src = imgData;
            console.log('img src: ' + imgData);
            var elementId = $scope.masks[cnt].id;
            // copy the images to canvases
            imagecanvas = document.createElement('CANVAS');
            imagecanvas.width = image.width;
            imagecanvas.height = image.height;
            imagecanvas.getContext('2d').drawImage(image, 0, 0);
            imageCanvases[elementId] = imagecanvas;
        }
        cnt++;
    }
});
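One caveat with the snippet above, unrelated to the hiding problem: Image decoding is asynchronous even for data URIs, so the drawImage call can run before the pixels are available. A small sketch of the canvas copy wrapped in onload, in a helper so each call keeps its own imgData and elementId:
function copyToCanvas(imgData, elementId) {
    var image = new Image();
    image.onload = function () {
        // the image is fully decoded here, so width, height and pixels are valid
        var imagecanvas = document.createElement('canvas');
        imagecanvas.width = image.width;
        imagecanvas.height = image.height;
        imagecanvas.getContext('2d').drawImage(image, 0, 0);
        imageCanvases[elementId] = imagecanvas;
    };
    image.src = imgData;
}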
This is what I see in the Chrome dev tools Network inspector (exactly what I'm trying to avoid):
I need to hide the original source image file (a jpeg) so that the user on the client side should not be able to use dev tools to see the original image.
That's not possible. There is always a way to get at the image using developer tools. Even if there wasn't, a simple screen capture would defeat whatever measures you put in place.
I have a GeoTIFF file that is converted to a base64 string when it is selected. The encoded file is then uploaded to an ASP.NET web service, where it is decoded and saved with a .tiff extension. The problem is that the metadata within the saved file is significantly altered from the original file.
JavaScript
var fr = new FileReader();
fr.onloadend = function () {
    var base64string = fr.result;
    var imgStr = base64string.split("base64,")[1];
    App.instance.client.area.uploadMap(imgStr);
};
fr.readAsDataURL(value.rawFile);
C#/ASP.NET Web API:
byte[] imageBytes = Convert.FromBase64String(mapImage);
MemoryStream ms = new MemoryStream(imageBytes);
Image img = Image.FromStream(ms);
I'm then extracting the metadata from the uploaded image with the GDAL library. The image looks fine, but the metadata contained within the file is completely different. The corner coordinates are no longer accurate, and there is color table information in the new file that wasn't present in the original.
Is there any way to handle this conversion so that the bytes aren't altered?
Why does the data captured from the webcam get modified when I put it in an image element?
a = canvas.toDataURL("image/jpeg")
document.getElementById("avatar").src = a
The src will be corrupted and will show only a blank image, but when I access canvas.toDataURL("image/jpeg") in the console, I get a healthy image.
using:
document.getElementById("avatar").src
"data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDAAMCAgICAgMCAgIDAwMD…gAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooA//Z"
But:
a = canvas.toDataURL("image/jpeg")
"data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDAAMCAgICAgMCAgIDAwMD…vtP7ypIi8kA5zUDTCJSDyw7kUrOjzF0dmAOADSTFpJTJM49TxyaG3e0np6joNx0lHTp1+fY//Z"
The code is here.