Sending and displaying images from Adonis V4.1 - javascript

How can I send an image from a React app to Adonis, "save" it in the database, and fetch it later to use in the front-end?
Right now, I have only been successful in processing an image via Postman. My code looks like this:
const image = request.file('photo', {
  types: ['image'],
  size: '2mb',
});

await image.move(Helpers.tmpPath('uploads'), {
  name: `${Date.now()}-${image.clientName}`,
});

if (image.status !== 'moved') {
  return image.error;
}

/////

const data = {
  username,
  email,
  role,
  photo: image.fileName,
  password,
  access: 1,
};
const user = await User.create(data);
In the first part, I process the image and move it to tmp inside the backend; in the next part, I use image.fileName to create a User.
And when I need to fetch my user list, I do it like this:
const colaboradoresList = await Database.raw(
  'select * from colaboradores where access = 1'
);
const userList = colaboradoresList[0];
userList.map((i) => (i.url = Helpers.tmpPath(`uploads/${i.photo}`)));
But as you can tell, Helpers.tmpPath(`uploads/${i.photo}`) returns the local path to the image, and I cannot display that in React, since I would need to use the public folder or download and import the file.
Is there a way to do this locally, or is the only way to set up something like AWS S3 and use Drive.getUrl() to create a URL to send back to my front end?

Yes, the UI cannot display local files; you need to expose a public URL for the image. You have two options, depending on the scope of this application.
If you plan to run this as a professional app, I would strongly suggest using something like AWS S3 to store images.
Otherwise, you can probably get away with setting up a route for the React UI to query. Something like /api/image/:id could return the binary or base64-encoded data of the image, which React could then display.

Instead of

await image.move(Helpers.tmpPath('uploads'), {
  name: `${Date.now()}-${image.clientName}`,
});

I use:

await image.move(Helpers.publicPath('uploads'), {
  name: `${Date.now()}-${image.clientName}`,
});
For that, you will need to adjust the folder structure so files are stored correctly.
Then send the front-end url = `/uploads/${i.photo}`, where i.photo is the file name, so it can be concatenated in React as apiBase + url.
The result is your apiUrl plus the file path, which should point to the public folder.
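The concatenation described here can be sketched as a small helper (the apiBase argument and /uploads prefix follow this answer's setup; encodeURIComponent is an addition to keep file names with special characters safe):

```javascript
// Build the public URL for a stored photo from the API base and file name.
function photoUrl(apiBase, fileName) {
  return `${apiBase}/uploads/${encodeURIComponent(fileName)}`;
}

// In React, the result can be used directly, e.g.:
// <img src={photoUrl(apiBase, user.photo)} alt={user.username} />
```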

Related

Retrieve and send a PostgreSQL bytea image

I have an app which uses AWS Lambda functions to store images in an AWS PostgreSQL RDS as the bytea type.
The app is written in JavaScript and allows users to upload an image (typically small).
<input
  className={style.buttonInputImage}
  id="logo-file-upload"
  type="file"
  name="myLogo"
  accept="image/*"
  onChange={onLogoChange}
/>
The image is handled with the following function:
function onLogoChange(event) {
  if (event.target.files && event.target.files[0]) {
    let img = event.target.files[0];
    setFormData({
      name: "logo",
      value: URL.createObjectURL(img),
    });
  }
}
Currently I am not concerned about what format the images are in, although if it makes storage and retrieval easier I could add restrictions.
I am using Python to query my database and to post and retrieve these files.
"INSERT INTO images (logo, background_image, uuid) VALUES ('{0}','{1}','{2}') ON CONFLICT (uuid) DO UPDATE SET logo='{0}', background_image='{1}';".format(data['logo'], data['background_image'], data['id'])
and when I want to retrieve the images:
"SELECT logo, background_image FROM clients AS c JOIN images AS i ON c.id = i.uuid WHERE c.id = '{0}';".format(id);
I try to return this data to the frontend:
return {
    'statusCode': 200,
    'body': json.dumps(response_list),
    'headers': {
        "Access-Control-Allow-Origin": "*"
    },
}
I get the following error: Object of type memoryview is not JSON serializable.
So I have a two-part question. First, the images are files being uploaded by a customer (typically logos or background images). Does it make sense to store these in my database as bytea? Or is there a better way to store image uploads?
Second, how do I go about retrieving these files and converting them into a format usable by my front end?
I am still having issues with this. I added a print statement to try to see what exactly the images look like.
Running:
records = cursor.fetchall()
for item in records:
    print(item)
I can see the image data looks like <memory at 0x7f762b8f7dc0>
Here is the full backend function:
cursor = connection.cursor()
print(event['pathParameters'].get('id'))
id = event['pathParameters'].get('id')
postgres_insert_query = "SELECT name, phone, contact, line1, city, state, zip, monday_start, monday_end, tuesday_start, tuesday_end, wednesday_start, wednesday_end, thursday_start, thursday_end, friday_start, friday_end, saturday_start, saturday_end, sunday_start, sunday_end, logo, background_image FROM clients AS c JOIN address AS a ON c.id = a.uuid JOIN hours AS h ON c.id = h.uuid JOIN images AS i ON c.id = i.uuid WHERE c.id = '{0}';".format(id)
query = postgres_insert_query
cursor.execute(query)
records = cursor.fetchall()
response_list = []
for item in records:
    item_dict = {
        'name': item[0], 'phone': item[1], 'contact': item[2],
        'address': {'line1': item[3], 'city': item[4], 'state': item[5], 'zip': item[6]},
        'hours': {'monday_start': item[7], 'monday_end': item[8], 'tuesday_start': item[9], 'tuesday_end': item[10], 'wednesday_start': item[11], 'wednesday_end': item[12], 'thursday_start': item[13], 'thursday_end': item[14], 'friday_start': item[15], 'friday_end': item[16], 'saturday_start': item[17], 'saturday_end': item[18], 'sunday_start': item[19], 'sunday_end': item[20]},
        'image': {'background_image': item[21], 'logo': item[22]}
    }
    response_list.append(item_dict)
# print(response_list)
# connection.commit()
return {
    'statusCode': 200,
    'body': response_list,
    'headers': {
        "Access-Control-Allow-Origin": "*"
    },
}
A byte format is not always castable to JSON; it likely contains characters that are not allowed in JSON. Return a different data type to your frontend.
For example, if you look at the Quill rich text editor, you'll see that you can send a base64 image in an .html file from the backend to the frontend.
I would also suggest using SQLAlchemy (https://www.sqlalchemy.org/); it helps protect your application against SQL injection and also offers support for special datatypes.
Workflow
Load the image and encode with base64
Source: https://stackoverflow.com/a/3715530/9611924
import base64

with open("yourfile.ext", "rb") as image_file:
    # decode() turns the bytes into a str, so it is JSON serializable
    encoded_string = base64.b64encode(image_file.read()).decode("utf-8")
Send in your API request
return {
    'statusCode': 200,
    'body': {"image": encoded_string},
    'headers': {
        "Access-Control-Allow-Origin": "*"
    },
}
Frontend
Decode the image (from base64).
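On the frontend, no explicit decode step is actually needed to display the image: the base64 string from the response body above can be embedded in a data URL (a sketch; the default MIME type here is an assumption):

```javascript
// Turn the base64 payload returned by the Lambda into an <img>-ready source.
function imageSrc(base64String, mimeType = "image/png") {
  return `data:${mimeType};base64,${base64String}`;
}

// <img src={imageSrc(response.image)} alt="logo" />
```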
I know this is not the initial question.
But have you considered storing images in a dedicated S3 bucket instead?
That would be cleaner and not complicated at all to implement, IMHO.
You would store the actual image file in an S3 bucket and store its path in your DB.
Your database would be lighter, and the front end would load images based on the returned path.
I know it could sound like a lot of changes, but the AWS SDK is very well done, and it does not take that long to do.
This is what I personally use for my project and it works like a charm.
I tried https://www.backblaze.com/b2/cloud-storage.html.
Follow the doc; it's not that hard to upload a file. I mainly used the command line, but the doc also offers other options.
After you upload, you can get all of the uploaded file's metadata.
So overall, you can upload the file to Backblaze (or other cloud storage) and insert all the metadata into the database.
Then, when you retrieve the images, you retrieve them through the download URL.

How to upload Bulk amount of json data with images to the firebase realtime and storage respectively

I have a bulk amount of data in CSV format. I am able to upload that data with Python by converting it to a dictionary (dict) in a loop; the whole dataset gets uploaded.
But now I want to upload the bulk data to Firebase and the images to Storage, and I want to link each document to its image, because I am working on an e-commerce React app, so that I can retrieve documents along with images.
Which is a good way to do this? Should I do it with JavaScript or Python?
I uploaded the data manually to Firebase by importing it there, but again I am unable to upload the bulk images to Storage, and also unable to create references between them. Please point me to a source where I can find a solution.
This is tough, because it's hard to fully understand exactly how your images and CSVs are linked. Generally, if you need to link something to items stored in Firebase, you can get a link either manually (go into Storage, click an item, and the 'Name' field on the right-hand side is a link), or you can get it when you upload it.
For example, I have my images stored in Firebase, and a Postgres database with a table storing the locations. In my API (Express), when I post the image to blob storage, I create the URL of the item and post that as an entry in my table, as well as setting it to be the blob's name. I'll put the code here, but obviously it's a completely different architecture to your problem, so I'll try to highlight the important bits (it's also JS, not Python, sorry!):
const uploadFile = async () => {
  var filename = "" + v4.v4() + ".png"; // uses the uuid library to create a unique value
  const options = {
    destination: filename,
    resumable: true,
    validation: "crc32c",
    metadata: {
      metadata: {
        firebaseStorageDownloadTokens: v4.v4(),
      },
    },
  };
  storage
    .bucket(bucketName)
    .upload(localFilename, options, function (err, file) {});
  pg.connect(connectionString, function (err, client, done) {
    client.query(
      `INSERT INTO table (image_location) VALUES ('${filename}')`, // inserts the filename we set earlier into the postgres table
      function (err, result) {
        done();
        if (err) return console.error(err);
        console.log(result.rows.length);
      }
    );
  });
  console.log(`${filename} uploaded to ${bucketName}.`);
};
Once you have a reference between the two like this, you can just get the one in the table first, then use that to pull in the other one using the location you have stored.
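The retrieval side can be sketched under the same assumptions (a pg client, the same table/image_location names, and a @google-cloud/storage bucket object; getSignedUrl returns a time-limited read URL):

```javascript
// Hypothetical retrieval: look up the stored filename in Postgres, then ask
// Cloud Storage for a signed, time-limited read URL for that object.
async function getImageUrl(client, storage, bucketName, rowId) {
  const { rows } = await client.query(
    "SELECT image_location FROM table WHERE id = $1",
    [rowId]
  );
  if (rows.length === 0) return undefined;
  const [url] = await storage
    .bucket(bucketName)
    .file(rows[0].image_location)
    .getSignedUrl({ action: "read", expires: Date.now() + 60 * 60 * 1000 });
  return url;
}
```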

File retrieval from AWS S3 to Node.js server, and then to React client

I have a need to retrieve individual files from Node.js with Express.js. For that, I have installed aws-sdk, as well as @aws-sdk/client-s3. I am able to successfully fetch the file with this simple endpoint:
const app = express(),
  { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3'),
  s3 = new S3Client({ region: process.env.AWS_REGION });

app.get('/file/:filePath', async (req, res) => {
  const path_to_file = req.params.filePath;
  try {
    const data = await s3.send(new GetObjectCommand({ Bucket: process.env.AWS_BUCKET, Key: path_to_file }));
    console.log("Success", data);
  } catch (err) {
    console.log("Error", err);
  }
});
...but I have no idea how to return the data correctly to the React.js frontend so that the file can then be downloaded. I tried to look through the documentation, but it looks too messy to me, and I can't even tell what the function returns. The .toString() method didn't help; it simply returns "[object Object]" and nothing else.
On React, I am using a library file-saver, which works with blobs and provides them for download using a filename defined by user.
Node v15.8.0, React v16.4.0, #aws-sdk/client-s3 v3.9.0, file-saver v2.0.5
Thanks for your tips and advice! Any help is highly appreciated.
Generally, data from S3 is returned as a buffer. The file contents are part of the Body param in the response. You might be calling toString on the root object.
You need to use .toString() on the Body param to make it a string.
Here is some sample code that might work for your use case:
// Note: this uses a different import style (aws-sdk v2) from the question
const AWS = require("aws-sdk")
const s3 = new AWS.S3()
const Bucket = "my-bucket"

async function getObject(key) {
  const data = await s3.getObject({ Bucket, Key: key }).promise()
  if (data.Body) {
    return data.Body.toString("utf-8")
  } else {
    return undefined
  }
}
To return this from Express, you can add this to your route and pass the final data back once you have it:
res.end(data);
Consuming it in React should be the same as taking values from any other REST API.
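On the React side with file-saver, a minimal sketch could look like this (the /file/... path matches the question's Express route; the download handler is a hypothetical example):

```javascript
// Build the endpoint path for the question's /file/:filePath route.
function fileEndpoint(filePath) {
  return `/file/${encodeURIComponent(filePath)}`;
}

// Hypothetical download handler using fetch and file-saver's saveAs:
// import { saveAs } from "file-saver";
// async function downloadFile(filePath, fileName) {
//   const response = await fetch(fileEndpoint(filePath));
//   const blob = await response.blob(); // works when the server streams the body
//   saveAs(blob, fileName);
// }
```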
I used Ravi's answer, but it caused a problem when displaying the image object in the frontend.
This worked fine:
const data = await s3.send(new GetObjectCommand({ Bucket: process.env.AWS_BUCKET, Key: path_to_file }));
data.Body.pipe(res);

Get image url Firebase storage (admin)

I have to upload an image to Firebase Storage. I'm not using the web version of Storage (I shouldn't use it); I am using the Firebase Admin SDK.
No problem: I upload the file without difficulty and get the result in the variable file.
And if I access the Firebase Storage console, the image is there. All right.
return admin.storage().bucket().upload(filePath, {
  destination: 'demo/images/restaurantCover.jpg',
  metadata: { contentType: 'image/jpeg' },
  public: true
}).then(file => {
  console.log(`file --> ${JSON.stringify(file, null, 2)}`);
  let url = file["0"].metadata.mediaLink; // image url
  return resolve(res.status(200).send({ data: file })); // huge data
});
Now, I have some questions.
Why so much information and so many objects as a response to the upload() method? Reviewing the immense object, I found a property called mediaLink inside metadata, and it is the download URL of the image. But...
Why is the URL different from the one shown by Firebase? Why can I not find the downloadURL property?
How can I get the Firebase-style URL?
firebase: https://firebasestorage.googleapis.com/v0/b/myfirebaseapp.appspot.com/o/demo%2Fimages%2Fthumb_restaurant.jpg?alt=media&token=bee96b71-2094-4492-96aa-87469363dd2e
mediaLink: https://www.googleapis.com/download/storage/v1/b/myfirebaseapp.appspot.com/o/demo%2Fimages%2Frestaurant.jpg?generation=1530193601730593&alt=media
If I use the mediaLink URL, is there any problem with the different URLs? (read, update from iOS and web clients)
Looking at the Google Cloud Storage: Node.js Client documentation, they have a link to sample code which shows exactly how to do this. Also, see the File class documentation example (below):
// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const filename = 'File to access, e.g. file.txt';

// Gets the metadata for the file
storage
  .bucket(bucketName)
  .file(filename)
  .getMetadata()
  .then(results => {
    const metadata = results[0];
    console.log(`File: ${metadata.name}`);
    console.log(`Bucket: ${metadata.bucket}`);
    console.log(`Storage class: ${metadata.storageClass}`);
    console.log(`Self link: ${metadata.selfLink}`);
    console.log(`ID: ${metadata.id}`);
    console.log(`Size: ${metadata.size}`);
    console.log(`Updated: ${metadata.updated}`);
    console.log(`Generation: ${metadata.generation}`);
    console.log(`Metageneration: ${metadata.metageneration}`);
    console.log(`Etag: ${metadata.etag}`);
    console.log(`Owner: ${metadata.owner}`);
    console.log(`Component count: ${metadata.component_count}`);
    console.log(`Crc32c: ${metadata.crc32c}`);
    console.log(`md5Hash: ${metadata.md5Hash}`);
    console.log(`Cache-control: ${metadata.cacheControl}`);
    console.log(`Content-type: ${metadata.contentType}`);
    console.log(`Content-disposition: ${metadata.contentDisposition}`);
    console.log(`Content-encoding: ${metadata.contentEncoding}`);
    console.log(`Content-language: ${metadata.contentLanguage}`);
    console.log(`Metadata: ${metadata.metadata}`);
    console.log(`Media link: ${metadata.mediaLink}`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
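To reproduce the Firebase-style URL from the question, one known pattern (an assumption here, not shown in the official sample) is to set a firebaseStorageDownloadTokens token in the object's metadata and assemble the firebasestorage.googleapis.com URL yourself:

```javascript
// Build the Firebase-console-style download URL from the bucket name,
// the object path, and the download token stored in its metadata.
function firebaseDownloadUrl(bucket, objectPath, token) {
  return (
    `https://firebasestorage.googleapis.com/v0/b/${bucket}` +
    `/o/${encodeURIComponent(objectPath)}?alt=media&token=${token}`
  );
}
```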

AngularFire / Firestore - Return collections and documents as a service

I have several pages that reference the same node in firestore, each pulling different segments from the firestore node. For example, a summary page might pull through album title, date, genre and image, whilst another page might pull through just the title, artist and record label. A couple of questions:
Is it possible to turn one of the firestore queries into a service?
If so, does that mean the data is only read once whilst navigating across different pages (angular components) that use the same service?
Will the query only run again when data is modified in firestore through the observable? ("return Observable.create(observer => {" )
I have tried a service with the code below. However, the issue observed is that on page refresh, the data isn't present. It is, however, present whilst navigating through the site. I believe this is because my page runs before the observable is returned. Is there a way to wrap up the query as an observable?
Any assistance would be greatly appreciated.
getAlbumData() {
  this.albumDoc = this.afs.doc(`albums/${this.albumId}`);
  this.album = this.albumDoc.snapshotChanges();
  this.album.subscribe((value) => {
    // The returned data
    const data = value.payload.data();
    // Firebase reference
    var storage = firebase.storage();
    // If album cover exists
    if (data.project_signature != undefined) {
      // Get the image URL
      var image = data.album_cover_image;
      // Create an image reference to the storage location
      var imagePathReference = storage.ref().child(image);
      // Get the download URL and set the local variable to the result (url)
      imagePathReference.getDownloadURL().then((url) => {
        this.album_cover = url;
      });
    }
  });
}
When I build my observables, I try to use operators as much as I can until I get the data I want to display in my UI.
You don't want to implement too much code in the subscribe method, because doing so breaks the reactive paradigm.
Instead, extract your data in the observable and display it in your template.
Don't forget to use the async pipe in your template to display the data when it gets fetched by your application.
I would do something like this:
// In AlbumService
getAlbumCover(albumId: string) {
const albumDoc = this.afs.doc(`albums/${albumId}`);
const album_cover$ = this.albumDoc.snapshotChanges().pipe(
// get the album data from firestore request
map(data => {
return {
id: data.payload.id,
...data.payload.data()
};
}),
// emits data only if it contains a defined project_signature
filter(album => album.project_signature),
// prepare the imagePath and get the album cover from the promise
mergeMap(album => {
const storage = firebase.storage();
const image = album.album_cover_image;
const imagePathReference = storage.ref().child(image);
return imagePathReference.getDownloadURL();
})
);
return album_cover$;
}
By doing so, when your data is updated in Firestore, it will be fetched automatically by your application, since you use an observable.
In your component, in the ngOnInit() method, after getting your album id from the URL:
this.album_cover$ = this.albumService.getAlbumCover(albumId);
Finally, in my template, I would do :
<div>{{album_cover$ | async}}</div>
