How to upload a file to a directory in a MinIO bucket - JavaScript

Hello everyone, I have a bucket on a MinIO server named 'geoxing', and the bucket has a directory img/site. I want to upload a picture into the site directory using Node.js. Below is my code, and I am getting the error Invalid bucket name: geoxing/img/site. How can I solve this error? Thanks.
savefile() {
  const filePath = 'D://repositories//uploads//geoxing//site//b57e46b4bcf879839b7074782sitePic.jpg';
  const bucketname = 'geoxing/img/site';
  var metaData = {
    'Content-Type': 'image/jpg',
    'Content-Language': 123,
    'X-Amz-Meta-Testing': 1234,
    example: 5678,
  };
  this.minioClient.fPutObject(
    bucketname,
    'b57e46b4bcf879839b7074782sitePic.jpg',
    filePath,
    metaData,
    function (err, objInfo) {
      if (err) {
        return console.log(err);
      }
      return console.log('Success', objInfo.etag);
    },
  );
}

In Amazon S3 and MinIO:
Bucket should be just the name of the bucket (e.g. geoxing)
Key should include the full path as well as the filename (e.g. img/site/b57e46b4bcf879839b7074782sitePic.jpg)
Amazon S3 and MinIO do not have 'folders' or 'directories', but they emulate directories by including the path in the Key. Folders do not need to be created before uploading to them -- they just magically appear when files are stored under that 'path'. See the corrected call below.
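A minimal sketch of the corrected call, assuming the standard minio JavaScript client (the endpoint and credentials here are placeholders; in recent client versions fPutObject returns a promise instead of taking a callback):

const Minio = require('minio');

// Placeholder connection details -- substitute your own.
const minioClient = new Minio.Client({
  endPoint: 'localhost',
  port: 9000,
  useSSL: false,
  accessKey: 'ACCESS_KEY',
  secretKey: 'SECRET_KEY',
});

const filePath = 'D://repositories//uploads//geoxing//site//b57e46b4bcf879839b7074782sitePic.jpg';

minioClient.fPutObject(
  'geoxing',                                       // bucket name only
  'img/site/b57e46b4bcf879839b7074782sitePic.jpg', // the 'directory' goes in the object key
  filePath,
  { 'Content-Type': 'image/jpg' },
  function (err, objInfo) {
    if (err) return console.log(err);
    console.log('Success', objInfo.etag);
  },
);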

Related

Linode Storage With NodeJs

I am new to Linode. I see Linode provides cloud storage just like AWS S3, and I want to use it with my Node.js app. I cannot find any SDK to do it like the S3 one. Any solution? Please help me.
Can anybody tell me how we can upload a file from Node.js to Linode storage in JavaScript?
New to Linode too. I got my free $100 two-month trial and figured I'd try the bucket feature.
I used AWS S3 in the past, and this is pretty much identical as far as the SDK goes. The only hurdle here was configuring the endpoint: with AWS S3 you specify the region, with Linode you specify the endpoint instead. The list of endpoints is here:
https://www.linode.com/docs/products/storage/object-storage/guides/urls/#cluster-url-s3-endpoint
As you didn't mention whether you wanted an example for the server (Node.js) or the browser, I'll go with the one I've got. It's for Node.js (server side).
Steps
I used node stable (currently 18.7). I set up package.json to start the index.js script (e.g. "scripts": {"start": "node index.js"}).
Install aws-sdk
npm i aws-sdk
Code for index.js
const S3 = require('aws-sdk/clients/s3')
const fs = require('fs')

const config = {
  endpoint: 'https://us-southeast-1.linodeobjects.com/',
  accessKeyId: 'BLEEPBLEEPBLEEP',
  secretAccessKey: 'BLOOPBLOOPBLOOP',
}

var s3 = new S3(config)

function listObjects() {
  console.debug("List objects")
  const bucketParams = {
    Bucket: 'vol1'
  }
  s3.listObjects(bucketParams, (err, data) => {
    if (err) {
      console.error("Error ", err)
    } else {
      console.info("Objects vol1 ", data)
    }
  })
}

function uploadFile() {
  const fileStream = fs.createReadStream('./testfile.txt')
  var params = { Bucket: 'vol1', Key: 'testfile', Body: fileStream }
  s3.upload(params, function (err, data) {
    if (err) {
      console.error("Error uploading test file", err)
    } else {
      console.info("Test file uploaded ", data)
      listObjects()
    }
  })
}

// Start
uploadFile()
Run "npm start".
Output I get:
Test file uploaded {
  ETag: '"0ea76c859582d95d2c2c0caf28e6d747"',
  Location: 'https://vol1.us-southeast-1.linodeobjects.com/testfile',
  key: 'testfile',
  Key: 'testfile',
  Bucket: 'vol1'
}
List objects
Objects vol1 {
  IsTruncated: false,
  Marker: '',
  Contents: [
    {
      Key: 'Inflation isnt transitory.mp4',
      LastModified: 2023-01-10T15:38:42.045Z,
      ETag: '"4a77d408defc08c15fe42ad4e63fefbd"',
      ChecksumAlgorithm: [],
      Size: 58355708,
      StorageClass: 'STANDARD',
      Owner: [Object]
    },
    {
      Key: 'testfile',
      LastModified: 2023-02-13T20:28:01.178Z,
      ETag: '"0ea76c859582d95d2c2c0caf28e6d747"',
      ChecksumAlgorithm: [],
      Size: 18,
      StorageClass: 'STANDARD',
      Owner: [Object]
    }
  ],
  Name: 'vol1',
  Prefix: '',
  MaxKeys: 1000,
  CommonPrefixes: []
}
Adjust the config with your own credentials/data center. Hope this helps.
Note: if you want to upload files larger than 1 GB, you'll want to use the multipart upload feature. It's a bit more complex, but this should get you started; any AWS S3 code example should do, there are plenty out there. A sketch follows below.
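For what it's worth, in the v2 aws-sdk the same s3.upload call manages multipart uploads itself once you hand it a stream; a sketch under that assumption (the file name, part size, and concurrency values are illustrative, not required):

const bigFileStream = fs.createReadStream('./bigfile.bin') // hypothetical large file

s3.upload(
  { Bucket: 'vol1', Key: 'bigfile.bin', Body: bigFileStream },
  { partSize: 10 * 1024 * 1024, queueSize: 4 }, // 10 MB parts, 4 parts uploaded in parallel
  function (err, data) {
    if (err) {
      console.error("Error uploading big file", err)
    } else {
      console.info("Big file uploaded ", data.Location)
    }
  }
)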

Uploading an MP3 to S3 from the Buffer

I'm attempting to render an MP3 file on the front end and send it to AWS S3. I can upload a given file to AWS S3 easily enough, but not from the buffer, which is where the file is generated.
The textToSpeech call in the code sample below comes from the IBM Watson Text-to-Speech API.
Here is the example code; the app is a Next.js app that calls S3 via an API route:
module.exports = requireAuth(async (req, res) => {
  textToSpeech
    .synthesize(synthesizeParams)
    .then(buffer => {
      const s3Params = {
        Bucket: 'waveforms/audioform',
        Key: 'CONTENT.mp3',
        Body: buffer,
        ContentType: 'audio/mpeg',
        ACL: 'public/read'
      }
      s3.upload(s3Params, function (s3Err, data) {
        if (s3Err) throw s3Err
        console.log(`File uploaded successfully at ${data.Location}`)
      })
    })
    .catch(err => {
      console.log('error:', err)
    })
})
The error is Error: Unsupported body payload object, so it appears that the buffer is not being accessed correctly, but I'm not sure how to extract the relevant data.
How can I pass a file from the Node buffer to S3?
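No answer was captured for this question, but a plausible fix, assuming the ibm-watson Node SDK (where synthesize resolves with a response object whose result is a readable stream rather than a Buffer), is to collect the stream into a Buffer first. Note also that the Bucket in the question contains a path (the audioform 'folder' belongs in the Key, as discussed above) and that the ACL should be spelled 'public-read':

module.exports = requireAuth(async (req, res) => {
  try {
    // In the ibm-watson SDK, synthesize resolves with { result: ReadableStream }
    const response = await textToSpeech.synthesize(synthesizeParams)

    // Collect the stream into a single Buffer that S3 accepts as Body
    const chunks = []
    for await (const chunk of response.result) {
      chunks.push(chunk)
    }
    const buffer = Buffer.concat(chunks)

    const s3Params = {
      Bucket: 'waveforms',          // bucket name only
      Key: 'audioform/CONTENT.mp3', // the 'folder' goes in the key
      Body: buffer,
      ContentType: 'audio/mpeg',
      ACL: 'public-read',           // note the hyphen
    }
    const data = await s3.upload(s3Params).promise()
    console.log(`File uploaded successfully at ${data.Location}`)
    res.status(200).json({ location: data.Location })
  } catch (err) {
    console.log('error:', err)
    res.status(500).json({ error: err.message })
  }
})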

Reduced file size after uploading to S3 bucket from Node.js

I am trying to upload files from a particular folder location to a sample S3 bucket. I am using the standard Node.js aws-sdk for this. The files are Deep Zoom image (.dzi) files.
Files are getting uploaded to my S3 bucket, but their contents are not uploaded properly. For example, I upload an image of 800 B, but after uploading the object is only 7 B. I tried downloading it to inspect its content, but the file contains only the file name, not the image. This is the code I am running for uploading files:
function read(file, numFiles) {
  fs.readFile(file, function (err, data) {
    if (err) console.log(err);
    const fileContent = Buffer.from(file, "binary");
    s3.putObject(
      {
        Bucket: "sample-bucket",
        Key: file,
        Body: fileContent,
      },
      function (resp) {
        console.log(arguments);
        console.log("Successfully uploaded, ", file);
        uploadCount++;
        console.log("uploadcount is:", uploadCount);
        if (uploadCount == numFiles) {
          res.send("All files uploaded");
        }
      }
    ).on("httpUploadProgress", (evt) => {
      console.log(`Uploaded ${evt.loaded} out of ${evt.total}`);
    });
  });
}
I am passing files to this read function from another function. I am not sure why this is happening. Any help would be appreciated.
(Screenshots comparing the image's properties before uploading and after uploading to the S3 bucket were attached here.)
Buffer.from(file, "binary") does not read the contents of the file; it creates a buffer from its argument, which in this case is the file name string held in file. So the object uploaded to S3 has the file name as its contents.
Try changing this line
const fileContent = Buffer.from(file, "binary");
to this:
const fileContent = fs.readFileSync(file);
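Alternatively, since the code already reads the file asynchronously, the data argument that the fs.readFile callback receives is the file's contents and can be used directly; a sketch keeping the question's structure (and using the standard (err, data) callback signature for putObject):

function read(file, numFiles) {
  fs.readFile(file, function (err, data) {
    if (err) return console.log(err);
    s3.putObject(
      {
        Bucket: "sample-bucket",
        Key: file,
        Body: data, // the actual file contents from fs.readFile
      },
      function (err, resp) {
        if (err) return console.log(err);
        console.log("Successfully uploaded, ", file);
      }
    );
  });
}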

Upload image to AWS S3 bucket using the express-fileupload library

Can anyone please help me? Using the express-fileupload library, I want to upload a file or image to an AWS S3 bucket through a RESTful API.
Hope you found the answer.
In case you haven't found the answer, here is what I just found out:
const uploadSingleImage = async (file, s3, fileName) => {
  const bucketName = process.env.S3_BUCKET_NAME;
  if (!file.mimetype.startsWith('image')) {
    return { status: false, message: 'File uploaded is not an image' };
  }
  const params = {
    Bucket: bucketName,
    Key: fileName,
    Body: file.data,
    ACL: 'public-read',
    ContentType: file.mimetype,
  };
  return s3.upload(params).promise();
};
s3.upload(params).promise() will return an object that contains the Location of the file you uploaded. You could generate that URL yourself, but that would not cover the case where an error occurs, so I think what I posted here is the better solution.
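A usage sketch, assuming the express-fileupload middleware (which exposes uploaded files on req.files, each with data, name, and mimetype fields); the route and the 'image' field name are placeholders:

const express = require('express');
const fileUpload = require('express-fileupload');
const S3 = require('aws-sdk/clients/s3');

const app = express();
app.use(fileUpload()); // parses multipart bodies onto req.files

const s3 = new S3(); // credentials and region come from the environment

app.post('/upload', async (req, res) => {
  if (!req.files || !req.files.image) {
    return res.status(400).json({ message: 'No image uploaded' });
  }
  const file = req.files.image; // field name 'image' is an assumption
  const result = await uploadSingleImage(file, s3, file.name);
  if (result.status === false) {
    return res.status(400).json(result); // the not-an-image case from above
  }
  return res.json({ location: result.Location });
});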

S3 file upload from a file object using Node.js

I'm using Sails.js 0.12.1 and Node.js 4.2.6.
I want to upload a file from the front end (Angular.js) through an API, and from the back end I want to upload the file to the AWS S3 bucket.
The front end sends the file to the API. In the back end I receive the file with its name, but while uploading the file to S3 I get the error
Cannot determine length of [object Object]
I googled the error and found many links, but no luck.
Back-end
uploadPersonAvtar: function (req, res) {
  var zlib = require('zlib');
  var file = req.file('image');
  var mime = require('mime');
  data = {
    Bucket: 'bucket',
    Key: 'my_key',
    Key: file.name,
    Body: file,
    ContentType: mime.lookup(file.name)
  };
  // Upload the stream
  var s3obj = new AWS.S3(S3options);
  s3obj.upload(data, function (err, data) {
    if (err) console.log("An error occurred", err);
    console.log("Uploaded the file at", data);
  })
}
Is my approach correct?
If yes, what am I doing wrong?
I want to know how to use the file object to upload the file.
I can create a read stream, but I don't have the file path; when I create the file object I get the error: path must be a string.
You could look at the following example about streaming data: Amazon S3: Uploading an arbitrarily sized stream (upload)
var fs = require('fs');
var zlib = require('zlib');

var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
var s3obj = new AWS.S3({ params: { Bucket: 'myBucket', Key: 'myKey' } });
s3obj.upload({ Body: body })
  .on('httpUploadProgress', function (evt) { console.log(evt); })
  .send(function (err, data) { console.log(err, data); });
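As for the missing file path in Sails specifically, a sketch under the assumption that the default Skipper body parser is in use: req.file('image').upload(...) writes the upload to disk, and each uploaded file exposes its temporary path as fd, which can feed fs.createReadStream:

uploadPersonAvtar: function (req, res) {
  var fs = require('fs');
  var mime = require('mime');

  req.file('image').upload(function (err, uploadedFiles) {
    if (err) return res.serverError(err);
    if (uploadedFiles.length === 0) return res.badRequest('No file uploaded');

    var uploaded = uploadedFiles[0]; // Skipper stores the file; .fd is its on-disk path
    var s3obj = new AWS.S3(S3options);
    s3obj.upload({
      Bucket: 'bucket',
      Key: uploaded.filename,
      Body: fs.createReadStream(uploaded.fd),
      ContentType: mime.lookup(uploaded.filename)
    }, function (uploadErr, data) {
      if (uploadErr) return res.serverError(uploadErr);
      return res.ok({ location: data.Location });
    });
  });
}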
