Upload File to Firebase Admin SDK (GCS) with https stream - javascript

When trying to upload a stream into a Google Cloud Storage bucket, I am getting Error: Not Found when using the get method, and Error: socket hang up after a few seconds' delay when using the request method.
Everything with Firebase seems to be initialized fine, and when I log the stream I see the data coming through, but what would be the best way to write a file to GCS from a remote URL?
const storage = firebase.storage()
const bucket = storage.bucket("bucket/path")
const file = bucket.file("filename.pdf")
const url = "https://url/to/file/filename.pdf"

https.get(url, async (res) => {
  console.log(res)
  res.pipe(file.createWriteStream())
})

The cause of the issue was passing the folder path as part of the bucket name instead of the file name.
The bucket name is available in the Storage console; do not include a folder path in it.
Bucket name example:
gs://bucket.appspot.com
(remove the gs:// prefix when passing it as a value)
const bucket = storage.bucket("bucketname")
const file = bucket.file("bucket/path/filename.pdf")
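To illustrate the distinction, here is a small hypothetical helper (not part of the original answer; the name splitGcsPath is my own) that splits a full gs:// path into the bucket name and the object name:

```javascript
// Hypothetical helper: separate the bucket name from the object path,
// mirroring the fix above -- the bucket gets only its name, and any
// folder structure belongs in the file (object) name.
function splitGcsPath(fullPath) {
  const withoutScheme = fullPath.replace(/^gs:\/\//, '');
  const [bucketName, ...rest] = withoutScheme.split('/');
  return { bucketName, objectName: rest.join('/') };
}

// Example: splitGcsPath('gs://bucket.appspot.com/bucket/path/filename.pdf')
// yields bucketName 'bucket.appspot.com' and
// objectName 'bucket/path/filename.pdf'.
```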

Related

How to fetch a JSON file from IPFS and display it on a webpage (using 'ipfs-core')

import * as IPFS from 'ipfs-core'

var ipfs = await IPFS.create({ repo: 'ok' + Math.random() })
const metadataMap = new Map()

// content.metadataCid is the IPFS CID of the metadata.json file stored on IPFS.
var res = await ipfs.cat(content.metadataCid)
// var data = await res.Json()
console.log("** Metadata from Cid **")
console.log(res)

// This just maps the content (content CID) to its metadata,
// setting the metadata for each Content.
metadataMap.set(theCid, res)
if (metadataMap) {
  console.log('**metadata map**')
  console.log(metadataMap)
}
The console output is shown in the attached image.
I am hosting those metadata files on Pinata as well as on my desktop IPFS node, and they can be accessed using the IPFS CLI or gateways.
eg_link: ipfs://bafybeibro7fxpk7sk2nfvslumxraol437ug35qz4xx2p7ygjctunb2wi3i/
The link opens in the browser through an IPFS gateway just fine, but when I use ipfs.cat(), the console just shows cat {suspended} (as shown in the image attached).
I can access images stored on IPFS using an "img" tag without any problem; in the image above, the images are from IPFS.
I also want to show the title and description, which are stored in that JSON file on IPFS.
The same issue occurs with ipfs.get()!
How can I access that metadata.json file and parse it? Am I missing a step here?
thanks 🤞
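For what it's worth, in ipfs-core, ipfs.cat() returns an async iterable of Uint8Array chunks rather than the file contents themselves, which is why logging its result shows a suspended generator. A sketch of collecting the chunks and parsing them as JSON (the catJson helper name is my own):

```javascript
// Collect the async-iterable chunks that ipfs.cat() yields, then decode
// and parse them as JSON. `ipfs` is assumed to be an ipfs-core instance
// and `cid` the CID of a JSON file.
async function catJson(ipfs, cid) {
  const chunks = [];
  for await (const chunk of ipfs.cat(cid)) {
    chunks.push(chunk);
  }
  return JSON.parse(Buffer.concat(chunks).toString('utf8'));
}
```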

Google Cloud Storage image URL works fine, but when used in the src of an img tag it falls back to the alternative text and does not show the image

I have a PERN JS app, and on the front end (in React) a form has an input of type file that sends the file (an image) to the server side. I then upload the image to a bucket in Google Cloud Storage with public permission for allUsers. It uploads fine: if I go to the public URL provided by Google Storage, I can see the image, even in an incognito window. The problem is when I send the path to a React component to display the image. The img tag falls back to its alternative text property, and the component never displays the image I want. The file uploads with a strange size of 20 B in the Google Cloud Storage bucket.
My code:
const router = require('express').Router();
const { Storage } = require('@google-cloud/storage');

const gc = new Storage({
  projectId: GOOGLE_CLOUD_PROJECT_ID,
  keyFilename: GOOGLE_CLOUD_KEYFILE,
})

const getPublicUrl = (bucketName, fileName) => `https://storage.googleapis.com/best-buds/${bucketName}/${fileName}`

router.post('/', async (req, res, next) => {
  try {
    const file = req.files && req.files.image
    const bucket = gc.bucket('best-buds');
    const file_bucket = file && bucket.file(file.name)
    const stream = file_bucket.createWriteStream({
      resumable: false,
      gzip: true
    })
    stream.on('finish', () => {
      file.cloudStorageObject = file.name;
      return file_bucket.makePublic()
        .then(() => {
          file.gcsUrl = getPublicUrl('best-buds', file.name);
          next()
        })
    })
    stream.end(file.buffer);
    ...
    res.sendStatus(200);
  } catch (error) {
    next(error);
    res.sendStatus(500);
  }
});

module.exports = router;
I used the devtools extension and checked that my React component receives the image prop with the correct URL. If I go to that URL in the browser, I can see the image, but it still does not display in my component.
Could anyone help me?
The problem here is the upload. When uploading files to Google Cloud Storage (whether Cloud Storage or Firebase Storage), you need to pass the file as a blob and assign the proper extension along with the proper metadata. Every file type has its own content type: for example, a PDF document is application/pdf, and an image is image/png or image/jpeg. If the upload produces a file of only a few bytes, the upload failed, and that is usually due to the metadata.
var metadata = {
  contentType: 'image/jpeg',
};
Firebase's documentation explains how this works here.

Download file to Firebase Storage via Firebase Functions

I have a program that allows users to upload video files to Firebase Storage. If the file is not an mp4, I send it to a third-party video converting site to convert it to mp4. They then hit a webhook (a Firebase Function) with the URL and other information about the converted file.
Right now I'm trying to download the file to the tmp dir in Firebase Functions and then send it to Firebase Storage. I have the following questions.
Can I bypass downloading the file to the function's tmp dir and save it directly to storage? If so, how?
I'm currently having trouble downloading to the function's tmp dir; my code is below. The function is returning Function execution took 6726 ms, finished with status: 'crash'
export async function donwloadExternalFile(url: string, fileName: string) {
  const axios = await import('axios')
  const fs = await import('fs')
  const { join } = await import('path')
  const { tmpdir } = await import('os')

  const workingDir = join(tmpdir(), 'downloadedFiles')
  // Make sure the working directory exists before opening the write stream;
  // createWriteStream fails with ENOENT otherwise.
  await fs.promises.mkdir(workingDir, { recursive: true })

  const tmpFilePath = join(workingDir, fileName)
  const writer = fs.createWriteStream(tmpFilePath)
  const response = await axios.default.get(url, { responseType: 'stream' })
  response.data.pipe(writer)

  await new Promise<void>((resolve, reject) => {
    writer.on('error', err => {
      writer.close()
      reject(err)
    })
    writer.on('close', () => {
      resolve()
    })
  })
}
As mentioned above in the comments section, you can use the Cloud Storage Node.js SDK to upload your file to Cloud Storage.
Please take a look at the SDK client reference documentation, where you can find numerous samples and more information about this Cloud Storage client library.
Also, I'd like to remind you that you can bypass writing to /tmp by using a pipeline. According to the documentation for Cloud Functions, "you can process a file on Cloud Storage by creating a read stream, passing it through a stream-based process, and writing the output stream directly to Cloud Storage."
Last but not least, always delete temporary files from the Cloud Function's local system. Failing to do so can eventually result in out-of-memory issues and subsequent cold starts.

Upload image to cloudinary without generating local file

I am using jdenticon to generate user avatars on signup in a node/express app.
Running locally, I can do this by:
Generate identicon using jdenticon
Save file locally
Upload local file to cloudinary
Here's how I do this
const cloudinary = require("cloudinary");

cloudinary.config({
  cloud_name: 'my-account-name',
  api_key: process.env.CLOUDINARY_API,
  api_secret: process.env.CLOUDINARY_SECRET
});

// 1. Generate identicon
let jdenticon = require("jdenticon"),
  fs = require("fs"),
  size = 600,
  value = String(newUser.username),
  svg = jdenticon.toPng(value, size);

let file = "uploads/" + value + ".png";

// 2. Save file locally
fs.writeFileSync(file, svg);

// 3. Upload local file to cloudinary
let avatar = await cloudinary.v2.uploader.upload(file);
// Do stuff with avatar object
This works great for running my app locally. However, as I understand it, I can't store images on Heroku (if this is not the case then that would be great to know, and would simplify things massively), so I will need to save the generated identicon directly to cloudinary.
How can I upload the generated image (svg = jdenticon.toPng(value, size);) directly to cloudinary, without first saving?
Any help would be appreciated!
jdenticon.toPng returns a buffer, I believe, and Cloudinary's upload_stream method accepts a buffer, so you should be able to just do:
const data = jdenticon.toPng(value, size);
const options = {}; // optional

cloudinary.v2.uploader.upload_stream(options, (error, result) => {
  if (error) {
    throw error;
  }
  console.log('saved .....');
  console.log(result);
}).end(data);
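Since upload_stream is callback-based, it can help to wrap it in a Promise so the result can be awaited like the upload() call in the question. A sketch, where uploader is assumed to be any function with Cloudinary's (options, callback) shape that returns a writable stream:

```javascript
// Wrap a callback-style stream uploader (such as
// cloudinary.v2.uploader.upload_stream) so a buffer upload can be awaited.
function uploadBuffer(uploader, buffer, options = {}) {
  return new Promise((resolve, reject) => {
    const stream = uploader(options, (error, result) => {
      if (error) reject(error);
      else resolve(result);
    });
    stream.end(buffer);
  });
}

// Usage (sketch): const avatar = await uploadBuffer(
//   (o, cb) => cloudinary.v2.uploader.upload_stream(o, cb),
//   jdenticon.toPng(value, size));
```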

How to change file's metadata in Google Cloud Storage with Node.js

My image (which is hosted in Google Cloud Storage) has a metadata property named downloaded; once the image has been downloaded, the value of the downloaded key should change from 0 to 1.
The code at https://cloud.google.com/storage/docs/viewing-editing-metadata#storage-view-object-metadata-nodejs shows how to view the metadata but doesn't really cover how to change it.
Is it possible to do so?
Yes, it is possible.
The way to do it is by using the File.setMetadata() method.
For example, to add metadata to an object in GCS:
const file = storage
  .bucket(bucketName)
  .file(filename)

const metadata = {
  metadata: {
    example: 'test'
  }
}

await file.setMetadata(metadata)

// Get the updated metadata
const [updated] = await file.getMetadata()

// Will print `File: test`
console.log(`File: ${updated.metadata.example}`)
To update it, you can retrieve the current metadata with the getMetadata() method, modify it, and write it back with the setMetadata() method.
For example:
const storage = new Storage();
const file = storage
  .bucket(bucketName)
  .file(filename)

// Get the file's metadata
const [metadata] = await file.getMetadata()
console.log(`File: ${metadata.name}`)

// Update the custom metadata
await file.setMetadata({ metadata: { example: 'updated' } })

// Get the updated metadata
const [updatedMetadata] = await file.getMetadata()
console.log(`File: ${updatedMetadata.metadata.example}`)
