Retrieve and send a PostgreSQL bytea image - javascript

I have an app that uses AWS Lambda functions to store images in an AWS RDS PostgreSQL database as bytea columns.
The app is written in JavaScript and allows users to upload an image (typically small).
<input
  className={style.buttonInputImage}
  id="logo-file-upload"
  type="file"
  name="myLogo"
  accept="image/*"
  onChange={onLogoChange}
/>
The image is handled with the following function:
function onLogoChange(event) {
  if (event.target.files && event.target.files[0]) {
    let img = event.target.files[0];
    setFormData({
      name: "logo",
      value: URL.createObjectURL(img),
    });
  }
}
Currently I am not concerned about what format the images are in, although if it makes storage and retrieval easier I could add restrictions.
I am using Python to query my database and to post and retrieve these files.
"INSERT INTO images (logo, background_image, uuid) VALUES ('{0}','{1}','{2}') ON CONFLICT (uuid) DO UPDATE SET logo='{0}', background_image='{1}';".format(data['logo'], data['background_image'], data['id'])
and when I want to retrieve the images:
"SELECT logo, background_image FROM clients AS c JOIN images AS i ON c.id = i.uuid WHERE c.id = '{0}';".format(id);
I try to return this data to the frontend:
return {
    'statusCode': 200,
    'body': json.dumps(response_list),
    'headers': {
        "Access-Control-Allow-Origin": "*"
    },
}
I get the following error: Object of type memoryview is not JSON serializable.
So I have a two-part question. First, the images are files uploaded by a customer (typically logos or background images). Does it make sense to store these in my database as bytea columns, or is there a better way to store image uploads?
Second, how do I go about retrieving these files and converting them into a format usable by my frontend?
I am still having issues with this. I added a print statement to try and see what exactly the images look like.
Running:
records = cursor.fetchall()
for item in records:
    print(item)
I can see the image data looks like <memory at 0x7f762b8f7dc0>
Here is the full backend function:
cursor = connection.cursor()
print(event['pathParameters'].get('id'))
id = event['pathParameters'].get('id')
postgres_select_query = "SELECT name, phone, contact, line1, city, state, zip, monday_start, monday_end, tuesday_start, tuesday_end, wednesday_start, wednesday_end, thursday_start, thursday_end, friday_start, friday_end, saturday_start, saturday_end, sunday_start, sunday_end, logo, background_image FROM clients AS c JOIN address AS a ON c.id = a.uuid JOIN hours AS h ON c.id = h.uuid JOIN images AS i ON c.id = i.uuid WHERE c.id = '{0}';".format(id)
cursor.execute(postgres_select_query)
records = cursor.fetchall()
response_list = []
for item in records:
    item_dict = {
        'name': item[0],
        'phone': item[1],
        'contact': item[2],
        'address': {'line1': item[3], 'city': item[4], 'state': item[5], 'zip': item[6]},
        'hours': {
            'monday_start': item[7], 'monday_end': item[8], 'tuesday_start': item[9], 'tuesday_end': item[10],
            'wednesday_start': item[11], 'wednesday_end': item[12], 'thursday_start': item[13], 'thursday_end': item[14],
            'friday_start': item[15], 'friday_end': item[16], 'saturday_start': item[17], 'saturday_end': item[18],
            'sunday_start': item[19], 'sunday_end': item[20],
        },
        # in the SELECT above, logo is column 21 and background_image is column 22
        'image': {'logo': item[21], 'background_image': item[22]},
    }
    response_list.append(item_dict)
# print(response_list)
# connection.commit()
return {
    'statusCode': 200,
    'body': response_list,
    'headers': {
        "Access-Control-Allow-Origin": "*"
    },
}

A byte format is not always castable to JSON; it likely contains characters that are not allowed in JSON. Return a different data format to your frontend.
For example, if you look at the Quill rich-text editor, you'll see that you can send a base64 image in an .html file from backend to frontend.
I would also suggest that you use SQLAlchemy (https://www.sqlalchemy.org/); this helps protect your application against SQL injection and also offers support for special datatypes.
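Even before switching to SQLAlchemy, the queries in the question can be made injection-safe by letting the driver fill placeholders instead of str.format(). A minimal sketch, assuming a psycopg2-style DB-API driver; the helper names are illustrative, not part of any library:

```python
# Placeholders (%s) are filled by the driver, which escapes values safely;
# never interpolate user input into SQL with str.format().
UPSERT_IMAGES = (
    "INSERT INTO images (logo, background_image, uuid) "
    "VALUES (%s, %s, %s) "
    "ON CONFLICT (uuid) DO UPDATE "
    "SET logo = EXCLUDED.logo, background_image = EXCLUDED.background_image"
)

SELECT_IMAGES = (
    "SELECT logo, background_image FROM clients AS c "
    "JOIN images AS i ON c.id = i.uuid WHERE c.id = %s"
)

def save_images(cursor, logo_bytes, bg_bytes, client_id):
    # values travel as a separate tuple, never spliced into the SQL text
    cursor.execute(UPSERT_IMAGES, (logo_bytes, bg_bytes, client_id))

def load_images(cursor, client_id):
    cursor.execute(SELECT_IMAGES, (client_id,))
    return cursor.fetchall()
```

With psycopg2 specifically, raw bytes can also be wrapped in psycopg2.Binary() before being passed as a parameter for a bytea column.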
Workflow
Load the image and encode with base64
Source: https://stackoverflow.com/a/3715530/9611924
import base64
with open("yourfile.ext", "rb") as image_file:
    # decode() turns the base64 bytes into a str, so it is JSON serializable
    encoded_string = base64.b64encode(image_file.read()).decode("utf-8")
Send in your API request
return {
    'statusCode': 200,
    'body': {"image": encoded_string},
    'headers': {
        "Access-Control-Allow-Origin": "*"
    },
}
Frontend
Decode the image from base64.
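In this question the bytes come from a bytea column rather than a file on disk, and psycopg2 hands those back as memoryview objects, which is exactly the object named in the error. A minimal sketch of the same base64 idea adapted to that case (the to_jsonable helper is illustrative, not a library function):

```python
import base64
import json

def to_jsonable(value):
    """Base64-encode bytes-like DB values; pass everything else through."""
    if isinstance(value, memoryview):
        value = value.tobytes()
    if isinstance(value, (bytes, bytearray)):
        return base64.b64encode(bytes(value)).decode("utf-8")
    return value

# what a fetched row might look like (fake image bytes for illustration)
row = {"name": "Acme", "logo": memoryview(b"\x89PNG fake bytes")}
body = json.dumps({k: to_jsonable(v) for k, v in row.items()})
```

The frontend can then render the string directly, e.g. as a data:image/png;base64,... source on an img tag.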

I know this is not the initial question, but have you considered storing the images in a dedicated S3 bucket instead?
That would be cleaner and not complicated at all to implement, IMHO.
You would store the actual image file in an S3 bucket and store its path in your DB.
Your database would be lighter, and the frontend would load each image from the returned path.
I know it may sound like a lot of changes, but the AWS SDK is very well done and it does not take that long to do.
This is what I personally use for my project, and it works like a charm.
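To make that concrete, a rough sketch of the S3 workflow with boto3; the bucket name, key layout, and helper names are all assumptions for illustration, not the answerer's actual code:

```python
def store_logo(s3, bucket, client_id, image_bytes):
    """Upload the image to S3 and return the key to store in the DB."""
    key = f"logos/{client_id}.png"  # hypothetical key layout
    s3.put_object(Bucket=bucket, Key=key, Body=image_bytes)
    return key

def logo_url(s3, bucket, key, expires=3600):
    """Presigned URL the frontend can drop straight into an <img> tag."""
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires,
    )

# Typical wiring (requires boto3 and configured AWS credentials):
#   import boto3
#   s3 = boto3.client("s3")
#   key = store_logo(s3, "my-app-uploads", client_id, image_bytes)
```

The database row then holds only the key, and the Lambda returns a URL instead of image bytes, sidestepping the JSON serialization problem entirely.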

I tried https://www.backblaze.com/b2/cloud-storage.html.
Following the docs, it's not that hard to upload a file. I mainly used the command line, but the docs also offer other options.
After you upload, you can get all the uploaded file's metadata.
So overall, you can upload the file to Backblaze (or another cloud storage provider) and insert all the metadata into your database.
Then when you retrieve the images, you retrieve them through the download URL.

Related

Sending user uploaded images to backend using JavaScript

I am trying to send images to my backend to be stored in a database in a byte format. The images are uploaded by a user, and I am not sure what format they will be in.
What is the best way to convert the image to a format that can be sent via network request to a Python server and stored in a database?
Currently I just have this function:
function onLogoChange(event) {
  if (event.target.files && event.target.files[0]) {
    let img = event.target.files[0];
    setFormData({
      name: "logo",
      value: URL.createObjectURL(img),
    });
  }
}
This seems to send something. However, I haven't been able to retrieve these images, and I think the reason is that they are not formatted correctly going into the DB.
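One likely culprit: URL.createObjectURL(img) produces a local blob: URL string, not the file's bytes, so that string is what ends up in the database. A common alternative is to send the file base64-encoded in the request body. A minimal sketch of the Python side under that assumption (the field name and helper are illustrative):

```python
import base64
import json

def parse_upload(event_body):
    """Turn a JSON payload like {"logo": "<base64>"} into raw bytes."""
    data = json.loads(event_body)
    return base64.b64decode(data["logo"])

# Round trip: what the frontend would send, and what the server stores
payload = json.dumps({"logo": base64.b64encode(b"\x89PNG fake").decode("utf-8")})
logo_bytes = parse_upload(payload)  # raw bytes, ready for a bytea column
```

On the JavaScript side this pairs with reading the File object via FileReader rather than createObjectURL.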

How to upload Bulk amount of json data with images to the firebase realtime and storage respectively

I have a bulk amount of data in CSV format. I am able to upload that data with Python by converting it to a dictionary (dict) in a loop; the whole dataset gets uploaded.
But now I want to upload the bulk data to Firebase and the images to Storage, and I want to link each document with its image, because I am working on an e-commerce React app, so that I can retrieve documents along with images.
What is a good way to do this? Should I do this with JavaScript or Python?
I uploaded the data manually to Firebase by importing it there, but I am unable to upload bulk images to Storage and also unable to create references between them. Please point me to a source where I can find this solution.
This is tough, because it's hard to fully understand how exactly your images and CSVs are linked. Generally, if you need to link something to items stored in Firebase, you can get a link either manually (go into Storage, click an item, and the 'Name' field on the right-hand side is a link), or you can get it when you upload it. For example, I have my images stored in Firebase, and a Postgres database with a table storing the locations. In my API (Express), when I post the image to blob storage, I create the URL of the item and post that as an entry in my table, as well as setting it to be the blob's name. I'll put the code here, but obviously it's a completely different architecture to your problem, so I'll try to highlight the important bits (it's also JS, not Python, sorry!):
const uploadFile = async () => {
  var filename = "" + v4.v4() + ".png"; // uses the uuid library to create a unique value
  const options = {
    destination: filename,
    resumable: true,
    validation: "crc32c",
    metadata: {
      metadata: {
        firebaseStorageDownloadTokens: v4.v4(),
      },
    },
  };
  storage
    .bucket(bucketName)
    .upload(localFilename, options, function (err, file) {});
  pg.connect(connectionString, function (err, client, done) {
    client.query(
      `INSERT INTO table (image_location) VALUES ('${filename}')`, // inserts the filename we set earlier into the postgres table
      function (err, result) {
        done();
        if (err) return console.error(err);
        console.log(result.rows.length);
      }
    );
  });
  console.log(`${localFilename} uploaded to ${bucketName}.`);
};
Once you have a reference between the two like this, you can just get the one in the table first, then use that to pull in the other one using the location you have stored.

Sending and displaying images from Adonis V4.1

How can I send an image from a React App to Adonis, “save” it on the database, and when needed to fetch it to use in the front-end?
Right now, I have only been successful in processing an image via Postman; my code is like this:
const image = request.file('photo', {
  types: ['image'],
  size: '2mb',
});
await image.move(Helpers.tmpPath('uploads'), {
  name: `${Date.now()}-${image.clientName}`,
});
if (image.status !== 'moved') {
  return image.error;
}
/////
const data = {
  username,
  email,
  role,
  photo: image.fileName,
  password,
  access: 1,
};
const user = await User.create(data);
In the first part, I process the image and move it to tmp inside the backend; in the next part, I use image.fileName and create a User.
And when I need to fetch my user list, I do it like this:
const colaboradoresList = await Database.raw(
  'select * from colaboradores where access = 1'
);
const userList = colaboradoresList[0];
userList.map((i) => (i.url = Helpers.tmpPath(`uploads/${i.photo}`)));
But as you can tell, Helpers.tmpPath(`uploads/${i.photo}`) returns the local path to the image, and I cannot display it in React, since I would need to use the public folder or download and import it.
Is there a way to do it locally, or would the only way be to create an AWS account and use Drive.getUrl() to create a URL and send it back to my frontend?
Yes, the UI cannot display local files; you need to expose a public URL for the image. You have two options, depending on the scope of this application.
If you plan to run this as a professional app, I would highly suggest using something like AWS S3 to store images.
Otherwise, you can probably get away with setting up a route for the React UI to query. Something like /api/image/:id could return the binary or base64-encoded data of the image, which React could then display.
Instead of
await image.move(Helpers.tmpPath('uploads'), {
  name: `${Date.now()}-${image.clientName}`,
});
I use:
await image.move(Helpers.publicPath('uploads'), {
  name: `${Date.now()}-${image.clientName}`,
});
For that, you will need to change the folders so it stores the files correctly.
Then send the frontend url = `/uploads/${i.photo}`, where i.photo is the file name, so it can be concatenated in React as apiBase + url.
The result is your apiUrl plus your file path, which should be in the public folder.

Streaming binary saved image from mongo to the response object

I have a schema that takes two fields - name and file.
file is a Binary type and name is a String.
I use this schema to save images as documents in my DB.
While the image is saved without issues, I am not sure how to read it later. The main purpose is to be able to search for a file by its name using findOne, and then to stream it to the response using pipe or any other solution that might work - to show the image to the client inside an <img/> tag. I tried to convert the file to base64 and then send something like res.send(`data:image/png;base64,${file.file}`), without much luck.
* Note that because I have two fields inside the document (name and file), I need to stream only the file, for obvious reasons.
This is my GET request to fetch the file:
router.get('/:url', (req, res) => {
  const url = req.params.url;
  console.log(url);
  console.log('streaming file');
  File.
    findOne({ name: url }, 'file').
    cursor().
    on('data', (doc) => {
      console.log('doc', doc.file);
      res.send(doc.file);
    }).
    on('end', () => { console.log('Done!'); });
})
This doesn't help, because it's for streaming a file from a path, which I don't have; my file is stored in the DB.
Image of the saved file inside the db:

How to load a file into a byte array in JavaScript

I'm developing a mobile application with Cordova. The application needs at some point to work offline, which means it has to keep data in an internal database.
The data consists of a bunch of fields, some of which are links to files. For example:
var restaurant = {
  name: 'Restaurant X',
  address: 'So far',
  logo: 'http://www.restaurantX.com/logo.png',
  image: 'pics/restaurant-x.jpg'
};
When the user decides to go offline, we should extract all the relevant information from the database (this part is solved) and download all the related images (this is where I am stuck). At this point I thought of two options: the first is to download all the files to the device, either as files or as data in the database; however, the project chief said it was mandatory that images be saved as blobs in the database.
I've been looking for a function to convert the content of a file into a byte array for saving into the database, but I need a synchronous function, since I have to wait for the pictures before saving the row in the database.
function(tx, row){
  var query = "INSERT INTO Restaurants (name, address, logo, image) values (?,?,?,?)";
  var logo = null; // I need to fill it with the logo
  var picture = null; // I need to fill it with the picture
  tx.executeSql(query, [row["nombre"], row["address"], logo, picture], onSuccess, onError);
});
I'm sorry if the question is too basic; I'm practically new to JavaScript.
You should definitely take a look at promises if your image serialization routine is async and you want to store the result on serialize success. You can find more detailed info about promises here.
