IPFS: base64-encoded image not showing as image - javascript

I have a simple function that tries to base64-encode an image and upload it to IPFS:
async function toIPFS() {
  const node = await IPFS.create()
  const data = fs.readFileSync('./src/assets/logo.png', 'base64').toString('base64')
  const results = await node.add(data)
  console.log(results.cid.string)
}
However, when I actually check the hash, the content displays as a long string:
iVBORw0KGgoAAAANSUhEUgAAAMgAAADICAYAAACtWK6eAAAAGXRFWHRTb2Z0d2FyZQBBZG9iZSBJbWFnZVJlYWR5ccllPAAAAyNpVFh0WE1MOmNvbS5hZG9iZS54bXAAAAAAADw/eHB...etc
How do I upload an image such that it actually displays as an image? What am I missing?
I've never worked with images, so pardon me if this is a noob question :)

What you're seeing returned is the file encoded as base64. If you want to store the image itself for later retrieval, this is how you'd do it:
async function toIPFS() {
  const node = await IPFS.create()
  // Read the file as a raw Buffer instead of a base64 string
  const data = fs.readFileSync('./src/assets/logo.png')
  const results = await node.add(data)
  console.log(results.cid.string)
}
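If you want to sanity-check the result, you can read the content back out of the node and write it to disk. This is a minimal sketch assuming the js-ipfs cat API; logo-roundtrip.png is just a placeholder filename:
async function fromIPFS(node, cid) {
  const chunks = []
  // cat() yields the stored bytes chunk by chunk
  for await (const chunk of node.cat(cid)) {
    chunks.push(chunk)
  }
  // If the upload stored raw bytes, this file opens as a normal PNG
  fs.writeFileSync('./logo-roundtrip.png', Buffer.concat(chunks))
}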

Related

How can I display an Image without downloading it from a url in NodeJS?

In NodeJS I used a package named node-fetch, also for getting a JSON response, but what about an image response? How can I do that? My current code only saves the image; it doesn't display it like PIL does in Python.
import fetch from 'node-fetch';
import fs from 'fs';

var tr = "https://i.picsum.photos/id/866/200/300.jpg?hmac=rcadCENKh4rD6MAp6V_ma-AyWv641M4iiOpe1RyFHeI"

export async function get_image() {
  const get_url = await fetch(tr)
  const image = get_url.body.pipe(fs.createWriteStream('./image.png'))
}
await get_image();
You can get an image in Base64 format and save it in a variable using the axios module like so:
const axios = require('axios')
const imageURL = 'https://i.picsum.photos/id/866/200/300.jpg?hmac=rcadCENKh4rD6MAp6V_ma-AyWv641M4iiOpe1RyFHeI';

(async () => {
  // Get image Buffer from url and convert to Base64
  const image = await axios.get(imageURL, { responseType: 'arraybuffer' });
  const base64Image = Buffer.from(image.data).toString('base64');
  // Do stuff with result...
  console.log(base64Image);
})();

// Or if you prefer, a one liner
(async () => {
  const base64Image = Buffer.from((await axios.get(imageURL, { responseType: 'arraybuffer' })).data).toString('base64');
})();
You can check if it worked by using a website to decode the base64 string into an image.
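You can also verify it locally instead: decode the string back into bytes, write it to a file, and open it with any image viewer. A small sketch; check.jpg is just a placeholder name:
const fs = require('fs');
// Decode the Base64 string back into raw bytes and write them to disk
fs.writeFileSync('./check.jpg', Buffer.from(base64Image, 'base64'));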

My example.downloadURL gives back an undefined

I am simply trying to test uploading images, and I want to display the image that is uploaded.
My code looks like this:
function uploadFile(files) {
  const storageRef = firebase.storage().ref(); // this references the firebase storage
  const horseRef = storageRef.child("horse.jpg");
  const file = files.item(0); // files is a list, so take the first item
  const task = horseRef.put(file); // to upload the file we call put()
  console.log(task);
  task.then(snapshot => { // returns a bunch of data, including a snapshot url
    const url = snapshot.downloadURL
    document.getElementById("upload").setAttribute("src", url)
  });
}
But snapshot.downloadURL is giving back undefined. Can you help me?
There isn't any downloadURL property on the snapshot, which is an UploadTaskSnapshot. You need to use the getDownloadURL() method on the storage reference to get the URL. Try refactoring the code as shown below:
task.then(async (snapshot) => {
  const url = await snapshot.ref.getDownloadURL()
  document.getElementById("upload").setAttribute("src", url)
});

How to add user image details like date or time to file name while uploading photos to firebase storage?

I am facing a problem related to uploading images to Firebase Storage. I am able to upload user images to Firebase Storage, but it only shows the newest image; it does not show previous images. I think it is because of the path that I have created and the timestamp in this line of code:
const path = 'photos/${Date.now()}.jpg';
I am using expo and React. Here is the code for uploading images:
uploadPhotoAsync = async uri => {
  const path = 'photos/${Date.now()}.jpg';
  return new Promise(async (res, rej) => {
    const response = await fetch(uri)
    const file = await response.blob()
    let upload = firebase.storage().ref(path).put(file)
    upload.on("state_changed", snapshot => {
    }, err => {
      rej(err)
    }, async () => {
      const url = await upload.snapshot.ref.getDownloadURL()
      res(url)
    })
  })
}
Thank you.
There are a couple of errors in your code:
Template literals must be delimited with backticks (`), while you delimit yours with standard single quotes ('): const path = 'photos/${Date.now()}.jpg';.
Result: Date.now() is never evaluated, so the value of path is always the same literal string, and every new upload overwrites the previous file.
Also make sure that after the $ you use a curly bracket, not a parenthesis.
Doing as follows should do the trick:
const path = `photos/${Date.now()}.jpg`
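To illustrate the difference (the values shown are just examples):
// Single quotes: the placeholder is not evaluated, so the path never changes
'photos/${Date.now()}.jpg'   // -> "photos/${Date.now()}.jpg" on every call
// Backticks: the expression is evaluated, so each upload gets a unique name
`photos/${Date.now()}.jpg`   // -> e.g. "photos/1650000000000.jpg"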

Cannot get AEC Model Data in Autodesk Forge

I am trying to activate the Revit Levels and 2D Minimap extension in the Autodesk Forge viewer, but I cannot get the AEC Model Data and I get a warning.
I tried to get the AEC data with this code:
const url = window.location.search;
console.log(url);
const svf_path = `${url.replace("?", "/storage/").replace(/%20/g, " ")}`;
Autodesk.Viewing.endpoint.getItemApi = (endpoint, derivativeUrn, api) => {
  return svf_path;
};
Autodesk.Viewing.Initializer(options, async () => {
  const paths = svf_path.split("/");
  const [dest, svf_dir] = [paths[2], paths[3]];
  const url = `/api/viewer/dest/${dest}/svf/${svf_dir}/manifest`;
  const response = await fetch(url);
  const manifest = await response.json();
  const init_div = document.getElementById("init_div");
  viewer = new Autodesk.Viewing.GuiViewer3D(init_div, config3d);
  const viewerDocument = new Autodesk.Viewing.Document(manifest);
  const viewable = viewerDocument.getRoot().getDefaultGeometry();
  viewer.start();
  await viewerDocument.downloadAecModelData();
  viewer.loadDocumentNode(viewerDocument, viewable)
    .then(function (result) {
      Autodesk.Viewing.Document.getAecModelData(viewable);
    })
});
What's wrong with my code?
The warning comes from the BubbleNode.prototype.getAecModelData method. You are not calling it in your code, but it's possible that it's being called by the LevelsExtension itself. Try configuring the extension so that it doesn't detect the AEC data automatically by passing in { autoDetectAecModelData: false } as the extension options.
By the way, to debug the issue on your side, you can also try getting the non-minified version of viewer3D.js, put a breakpoint where the warning is logged, and inspect the call stack when the breakpoint is hit.
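For example, a sketch assuming the levels extension is registered under the usual Autodesk.AEC.LevelsExtension ID; adjust it to however you currently load the extension (e.g. via the viewer config):
// Load the levels extension without its automatic AEC-data detection
viewer.loadExtension('Autodesk.AEC.LevelsExtension', {
  autoDetectAecModelData: false,
});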

Getting all images from a webpage and saving them to disk programmatically (NodeJS & Javascript)

I need to get a lot of images from a few websites and download them to my disk so that I can use them (will upload them to a blob (azure) and then save the link to my DB).
GETTING THE IMAGES
I know how to get the images from the HTML with JS; for example, for one of the sites I would make a for-loop and do:
document.getElementsByClassName('person')[i].querySelector('div').querySelector('img').getAttribute('src')
And there I would have the links to all the images.
SAVING THE IMAGES
I also saw that I can save the files to disk using node and the fs module, by doing:
function saveImageToDisk(url, localPath) {
  var fullUrl = url;
  var file = fs.createWriteStream(localPath);
  var request = https.get(url, function(response) {
    response.pipe(file);
  });
}
HOW TO PUT IT ALL TOGETHER
This is where I am stuck. I don't know exactly how to connect the two parts (the script and the Node.js code). I want to get the image and also the image name (the alt tag in this case), and then use them in Node to upload the image to a blob and put the name and the image blob URL in my DB.
I thought I could download the HTML page and then put the JS script at the bottom of the body, but then I don't know how to pass the URLs to the Node.js code.
How can I do this?
I am not very used to using scripts; I have mostly used Node without them, and I get a bit confused by their interactions and how to connect JS scripts to my code.
Also, is this the best way to go about this, or is there a simpler/better way I am not seeing?
This feels like you should use a crawler. The following code should work (using the npm module crawler):
const Crawler = require("crawler")

const c = new Crawler({
  callback: function(error, res, done) {
    if (error) {
      console.log({ error })
    } else {
      const images = res.$('.person div img')
      images.each(index => {
        // here you can save the file or collect the URLs in an array to download them later
        console.log({
          src: images[index].attribs.src,
          alt: images[index].attribs.alt,
        })
      })
    }
    done()
  }
})

c.queue('https://www.yoursite.com')
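If you want to reuse the saveImageToDisk helper from the question inside that callback, it could look roughly like this (assuming the src attributes are absolute URLs; ./images is a placeholder folder):
images.each(index => {
  const src = images[index].attribs.src
  const alt = images[index].attribs.alt
  // Download each image, naming the file after its alt text (or its index)
  saveImageToDisk(src, `./images/${alt || index}.jpg`)
})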
You need a bridge between the Web API (for DOM parsing etc.) and the Node.js API, for example a tool that manages a headless browser from Node.js. Say, you can use puppeteer with this script:
'use strict';

const puppeteer = require('puppeteer');
const https = require('https');
const fs = require('fs');

(async function main() {
  try {
    const browser = await puppeteer.launch();
    const [page] = await browser.pages();

    await page.goto('https://en.wikipedia.org/wiki/Image');
    const imgURLs = await page.evaluate(() =>
      Array.from(
        document.querySelectorAll('#mw-content-text img.thumbimage'),
        ({ src }) => src,
      )
    );
    console.log(imgURLs);

    await browser.close();

    imgURLs.forEach((imgURL, i) => {
      https.get(imgURL, (response) => {
        response.pipe(fs.createWriteStream(`${i}.${imgURL.slice(-3)}`));
      });
    });
  } catch (err) {
    console.error(err);
  }
})();
You can even avoid downloading the images twice by reusing the pictures already downloaded by the browser. This script saves the same images, but with a single round of requests and without using the Node.js https module (which saves time, network traffic and server workload):
'use strict';

const puppeteer = require('puppeteer');
const fs = require('fs');

(async function main() {
  try {
    const browser = await puppeteer.launch();
    const [page] = await browser.pages();

    const allImgResponses = {};
    page.on('response', (response) => {
      if (response.request().resourceType() === 'image') {
        allImgResponses[response.url()] = response;
      }
    });

    await page.goto('https://en.wikipedia.org/wiki/Image');

    const selectedImgURLs = await page.evaluate(() =>
      Array.from(
        document.querySelectorAll('#mw-content-text img.thumbimage'),
        ({ src }) => src,
      )
    );
    console.log(selectedImgURLs);

    let i = 0;
    for (const imgURL of selectedImgURLs) {
      fs.writeFileSync(
        `${i++}.${imgURL.slice(-3)}`,
        await allImgResponses[imgURL].buffer(),
      );
    }

    await browser.close();
  } catch (err) {
    console.error(err);
  }
})();
I recommend using the dom-parser module. See here: https://www.npmjs.com/package/dom-parser
By doing so, you can download the whole HTML file with http.get() and parse it using dom-parser. Then extract all the information you need from the HTML. For each image URL, use your saveImageToDisk() function.
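A rough sketch of that approach (assuming dom-parser's parseFromString/getElementsByTagName/getAttribute API as described in its README; the URL and the ./images path are placeholders):
const DomParser = require('dom-parser');
const https = require('https');

https.get('https://www.yoursite.com', (res) => {
  let html = '';
  res.on('data', (chunk) => (html += chunk));
  res.on('end', () => {
    const dom = new DomParser().parseFromString(html);
    // Pull the src and alt of every image out of the parsed document
    for (const img of dom.getElementsByTagName('img')) {
      const src = img.getAttribute('src');
      const alt = img.getAttribute('alt');
      console.log({ src, alt });
      // e.g. saveImageToDisk(src, `./images/${alt}.jpg`);
    }
  });
});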
Following your original idea, you would have to add the JS script to the HTML file as you mentioned, but in addition you would have to use Ajax (XMLHttpRequest) to post the URLs to a Node.js server.
You can use a Promise and, inside it, do the job of getting all the images and putting the image URLs in an array. Then, inside the then method, you can either iterate the array and call saveImageToDisk for each URL, or send the whole array to the middle layer with a slight modification. The second option is better since it makes only one network call.
function getImages() {
  return new Promise((resolve, reject) => {
    // Array.from will create an array
    // map will return a new array with all the image urls
    let k = Array.from(document.getElementsByClassName('person')[0].querySelector('div')
      .querySelectorAll('img'))
      .map((item) => {
        return item.getAttribute('src')
      })
    resolve(k)
  })
}

getImages().then((d) => {
  // it will work only after the promise is resolved
  console.log('****', d);
  d.forEach((item) => {
    // call the saveImageToDisk function for each url here
  })
})
function saveImageToDisk(url, localPath) {
  var fullUrl = url;
  var file = fs.createWriteStream(localPath);
  var request = https.get(url, function(response) {
    response.pipe(file);
  });
}
<div class='person'>
  <div>
    <img src='https://www.fast-growing-trees.com/images/P/Leyland-Cypress-450-MAIN.jpg'>
    <img src='http://cdn.shopify.com/s/files/1/2473/3486/products/Cypress_Leyland_2_Horticopia_d1b5b63a-8bf7-4897-96fb-05320bf3d81b_grande.jpg?v=1532991076'>
    <img src='https://www.fast-growing-trees.com/images/P/Live-Oak-Tree-450w.jpg'>
    <img src='https://www.greatgardenplants.com/images/uploads/452_1262_popup.jpg'>
    <img src='https://shop.arborday.org/data/default/images/catalog/600/Turnkey/1/Leyland-Cypress_3-828.jpg'>
    <img src='https://images-na.ssl-images-amazon.com/images/I/51RZkKnrlSL._SX425_.jpg'>
    <img src='https://thumbs-prod.si-cdn.com/Z3JYiuJ96ReLq04NCT1B94sTd4E=/800x600/filters:no_upscale()/https://public-media.si-cdn.com/filer/06/9c/069cfb16-c46c-4742-85f0-3c7e45fa139d/mar2018_a05_talkingtrees.jpg'>
  </div>
</div>
