Upload image to cloudinary without generating local file - javascript

I am using jdenticon to generate user avatars on signup in a node/express app.
Running locally, I can do this by:
1. Generate identicon using jdenticon
2. Save file locally
3. Upload local file to cloudinary
Here's how I do this:
const cloudinary = require("cloudinary");

cloudinary.config({
    cloud_name: 'my-account-name',
    api_key: process.env.CLOUDINARY_API,
    api_secret: process.env.CLOUDINARY_SECRET
});

// 1. Generate identicon
let jdenticon = require("jdenticon"),
    fs = require("fs"),
    size = 600,
    value = String(newUser.username),
    svg = jdenticon.toPng(value, size);

let file = "uploads/" + value + ".png";

// 2. Save file locally
fs.writeFileSync(file, svg);

// 3. Upload local file to cloudinary
let avatar = await cloudinary.v2.uploader.upload(file);
// Do stuff with avatar object
This works great when running my app locally. However, as I understand it, Heroku's filesystem is ephemeral and I can't persist uploaded images there (if this is not the case then that would be great to know, and would simplify things massively), so I will need to save the generated identicon directly to Cloudinary.
How can I upload the generated image (svg = jdenticon.toPng(value, size);) directly to Cloudinary, without first saving it to disk?
Any help would be appreciated!

jdenticon.toPng returns a Buffer, I believe, and Cloudinary's upload_stream method accepts a Buffer, so you should be able to just do:
const data = jdenticon.toPng(value, size);
const options = {}; // optional

cloudinary.v2.uploader.upload_stream(options, (error, result) => {
    if (error) {
        throw error;
    }
    console.log('saved .....');
    console.log(result);
}).end(data);
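If you want to keep the await style from the question, the callback can be wrapped in a Promise. A minimal sketch, assuming the same cloudinary and jdenticon setup as above (uploadBuffer is just an illustrative helper name):
// Wrap upload_stream in a Promise so it can be awaited like
// the original cloudinary.v2.uploader.upload call.
const uploadBuffer = (buffer, options = {}) =>
    new Promise((resolve, reject) => {
        cloudinary.v2.uploader.upload_stream(options, (error, result) => {
            if (error) {
                return reject(error);
            }
            resolve(result);
        }).end(buffer);
    });

// Usage inside the signup handler:
const avatar = await uploadBuffer(jdenticon.toPng(value, size));
// Do stuff with avatar object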

Related

Uploading a file from front-end to firebase storage via cloud functions

I am trying to achieve the following:
1. User selects a file on the website
2. User calls a Firebase Cloud Function and passes the file into the function
3. The Cloud Function uploads the file to Storage.
So far I am able to do all of the above. However, when I try to access the file in Storage, a file with no extension is downloaded. The original file was a PDF, but I am still unable to open it with PDF viewers. It appears I am storing something in Storage, although I am not exactly sure what.
Here is an example of how my front-end code works:
const getBase64 = file => new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.readAsDataURL(file);
    reader.onload = () => resolve(reader.result);
    reader.onerror = error => reject(error);
});

var document_send = document.getElementById('myFile')
var send_button = document.getElementById('send_button')

send_button.addEventListener('click', async () => {
    var sendDocument = firebase.functions().httpsCallable('sendDocument')
    try {
        await sendDocument({
            docu: await getBase64(document_send.files[0])
        })
    } catch (error) {
        console.log(error.message);
    }
})
Here is an example of how my cloud function works:
const functions = require("firebase-functions");
const admin = require("firebase-admin");

exports.sendDocument = functions.https
    .onCall((data, context) => {
        return admin.storage().bucket()
            .file("randomLocationName")
            //.file("randomLocationName" + ".pdf") - tried this also
            .save(data.docu)
            .catch((error) => {
                console.log(error.message);
                return error;
            });
    });
I do not receive an error message as the function runs without error.
The save() function seems to take either a string or a Buffer as its first parameter.
> save(data: string | Buffer, options?: SaveOptions)
The issue arises when you pass the base64 string directly instead of a Buffer. Try refactoring the code as shown below:
return admin.storage().bucket()
    .file("randomLocationName" + ".pdf") // <-- file extension required
    .save(Buffer.from(data.docu, "base64"));
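One more detail worth checking, since the front-end above uses readAsDataURL: the resulting string carries a data:<mime>;base64, prefix that should be stripped before decoding, and passing a content type helps viewers treat the stored object as a PDF. A minimal sketch (the prefix handling and the contentType option are additions, not part of the original answer):
// Strip the "data:application/pdf;base64," prefix produced by readAsDataURL,
// then decode and save with an explicit content type.
const base64Data = data.docu.replace(/^data:.*;base64,/, "");

return admin.storage().bucket()
    .file("randomLocationName" + ".pdf")
    .save(Buffer.from(base64Data, "base64"), {
        contentType: "application/pdf" // forwarded to the underlying write stream
    });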
Cloud Functions also has a 10 MB max request size, so you won't be able to upload large files that way. You can use the Firebase client SDKs to upload files directly, restrict access using security rules, and use Cloud Storage triggers for Cloud Functions if you want to process the file and update the database. Alternatively, use signed URLs for uploads if you are using GCS without Firebase.
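For the signed-URL route mentioned above, a minimal sketch using the @google-cloud/storage Node client (the bucket and object names are placeholders):
const { Storage } = require("@google-cloud/storage");
const storage = new Storage();

// Generate a short-lived URL the browser can PUT the file to directly,
// bypassing the Cloud Functions request-size limit.
async function getUploadUrl() {
    const [url] = await storage
        .bucket("my-bucket")            // placeholder bucket name
        .file("uploads/document.pdf")   // placeholder object name
        .getSignedUrl({
            version: "v4",
            action: "write",
            expires: Date.now() + 15 * 60 * 1000, // 15 minutes
            contentType: "application/pdf",
        });
    return url;
}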

Upload File to Firebase Admin SDK (GCS) with https stream

When trying to upload a stream into a Google bucket I am getting Error: Not Found when using the get method, and Error: socket hang up after a delay of a few seconds when using the request method.
Everything with firebase seems to be initialized fine, and when I log the stream I see the data coming through, but what would be the best way to write a file to GCS using a remote URL?
const storage = firebase.storage()
const bucket = storage.bucket("bucket/path")
const file = bucket.file("filename.pdf")
const url = "https://url/to/file/filename.pdf"

https.get(url, async (res) => {
    console.log(res)
    res.pipe(file.createWriteStream())
})
The cause of the issue was passing the folder path into the bucket name instead of the file name.
The bucket name is available in the Storage console; do not include a folder path in it.
Bucket name example:
gs://bucket.appspot.com
(remove the gs:// when passing it as a value)
const bucket = storage.bucket("bucketname")
const file = bucket.file("bucket/path/filename.pdf")
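Putting the fix together with the original snippet, a minimal sketch (the require and the finish/error handlers are additions for illustration; the bucket name and paths are placeholders):
const https = require("https")

const bucket = storage.bucket("bucketname")              // bucket name only, no folder path
const file = bucket.file("bucket/path/filename.pdf")     // folder path belongs to the file name
const url = "https://url/to/file/filename.pdf"

https.get(url, (res) => {
    res.pipe(file.createWriteStream())
        .on("finish", () => console.log("upload complete"))
        .on("error", (err) => console.error("upload failed", err))
})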

google cloud storage image url works fine but when used in the src of an img html tag it goes to the alternative and does not show the image

I have a PERN app, and on the front end (in React) a form has an input of type file that sends the file (an image) to the server side. I then upload the image to a bucket in Google Cloud Storage with public permission for allUsers. It uploads fine; if I go to the public URL provided by Google Storage I can see the image, even in an incognito window. The problem is when I send the path to a React component to display the image: the img tag falls back to its alternative text and the component never displays the image I want. The file is uploaded with a strange size of 20 B in the Google Cloud Storage bucket.
My code:
const router = require('express').Router();
const {Storage} = require('@google-cloud/storage');

const gc = new Storage({
    projectId: GOOGLE_CLOUD_PROJECT_ID,
    keyFilename: GOOGLE_CLOUD_KEYFILE,
})

getPublicUrl = (bucketName, fileName) => `https://storage.googleapis.com/best-buds/${bucketName}/${fileName}`

router.post('/', async (req, res, next) => {
    try {
        const file = req.files && req.files.image
        const bucket = gc.bucket('best-buds');
        const file_bucket = file && bucket.file(file.name)
        const stream = file_bucket.createWriteStream({
            resumable: false,
            gzip: true
        })
        stream.on('finish', () => {
            file.cloudStorageObject = file.name;
            return file_bucket.makePublic()
                .then(() => {
                    file.gcsUrl = getPublicUrl('best-buds', file.name);
                    next()
                })
        })
        stream.end(file.buffer);
        ...
        res.sendStatus(200);
    } catch (error) {
        next(error);
        res.sendStatus(500);
    }
});

module.exports = router;
I used the devtools extension and checked that my React component receives the image prop with the correct URL. If I go to that URL in the browser I can see the image, but it still does not display in my component.
Could anyone help me?
The problem here is the upload. When uploading files to Google Cloud Storage, whether through Cloud Storage directly or Firebase Storage, you need to pass the file as a blob/buffer and assign it the proper extension and metadata. Every file type has its own content type: for example, a PDF document will have application/pdf, and an image will have image/png or image/jpeg. If the upload produces a file with a suspiciously low byte size, that means the upload failed, and it is usually down to the metadata.
var metadata = {
    contentType: 'image/jpeg',
};
Firebase's documentation explains how this works.
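Applied to the Express route from the question, a minimal sketch (assuming the upload middleware exposes the MIME type as file.mimetype, which is an assumption; adjust to whatever your middleware actually provides):
// Pass the content type to createWriteStream so the stored object
// gets correct metadata instead of being treated as an unknown blob.
const stream = file_bucket.createWriteStream({
    resumable: false,
    gzip: true,
    metadata: {
        contentType: file.mimetype // assumed field name from the upload middleware
    }
})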

How to change file's metadata in Google Cloud Storage with Node.js

My image (hosted in Google Cloud Storage) has a metadata property named downloaded; once the image has been downloaded, the value of the downloaded key should be changed from 0 to 1.
The code in https://cloud.google.com/storage/docs/viewing-editing-metadata#storage-view-object-metadata-nodejs shows how to view the metadata but doesn't really cover how to change it.
Is it possible to do so?
Yes, it is possible.
The way to do it is by using the File.setMetadata() method.
For example, to add metadata to an object in GCS:
const file = storage
    .bucket(bucketName)
    .file(filename)

const metadata = {
    metadata: {
        example: 'test'
    }
}

await file.setMetadata(metadata)

// Get the updated metadata
const [get_metadata] = await file.getMetadata()

// Will print `File: test`
console.log(`File: ${get_metadata.metadata.example}`)
To update existing metadata, you can retrieve the current values with the getMetadata() method, then write the changed fields back with the setMetadata() method.
For example:
const storage = new Storage();

const file = storage
    .bucket(bucketName)
    .file(filename)

// Get the file's current metadata
const [metadata] = await file.getMetadata()
console.log(`File: ${metadata.name}`)

// Update the custom metadata key and write the change back
await file.setMetadata({
    metadata: {
        example: 'updated'
    }
})

// Get the updated metadata
const [get_metadata] = await file.getMetadata()
console.log(`File: ${get_metadata.metadata.example}`)
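Applied to the downloaded flag from the question, a minimal sketch (assuming the flag lives under the object's custom metadata, as in the examples above):
// Flip the custom `downloaded` key from "0" to "1".
// Custom metadata values are stored as strings.
await storage
    .bucket(bucketName)
    .file(filename)
    .setMetadata({
        metadata: {
            downloaded: '1'
        }
    });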

FeathersJS Server-side Upload

Does anyone have any examples of how I can handle the files that are sent to FeathersJS?
I have the following client-side code, completely separate from FeathersJS, but I am having trouble actually accessing the files in my service.
var req = request.post('http://uploadhost/upload').set('Authorization', 'Bearer ' + this.props.authtoken);

this.state.files.forEach(file => {
    req.attach(file.name, file);
});

req.end(this.callback);
FeathersJS just extends Express. You need to add a multipart parser, like multer, if you are decoding form data (which it looks like you are).
const multer = require('multer');
const multipartMiddleware = multer();

// Upload Service with multipart support
app.use('/uploads',
    // multer parses the file named 'uri'.
    // Without extra params the data is
    // temporarily kept in memory
    multipartMiddleware.single('uri'),

    // another middleware, this time to
    // transfer the received file to feathers
    function (req, res, next) {
        req.feathers.file = req.file;
        next();
    },

    blobService({Model: blobStorage})
);
Ultimately, Feathers uses its blob service to create files.
const blobService = require('feathers-blob');
const fs = require('fs-blob-store'); // assumed here: the feathers-blob examples use fs-blob-store as the storage Model
const blobStorage = fs(__dirname + '/uploads');
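To hand the parsed file from the multer middleware to the blob service, one common pattern is a before hook that converts the in-memory buffer into a data URI, which feathers-blob accepts on the uri field. A minimal sketch (the hook body is an illustration, not part of the original answer; file.buffer and file.mimetype are what multer's memory storage provides):
app.service('/uploads').hooks({
    before: {
        create: [
            context => {
                const file = context.params.file; // set by the middleware via req.feathers.file
                if (file && !context.data.uri) {
                    // Build a data URI from the multer buffer so feathers-blob can store it
                    context.data.uri = `data:${file.mimetype};base64,${file.buffer.toString('base64')}`;
                }
                return context;
            }
        ]
    }
});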
I hope this helps clarify my comment.
