Uploading an MP3 to S3 from the Buffer - javascript

I'm attempting to render an MP3 file on the front-end and send it to AWS S3. I can upload a given file to AWS S3 easily enough, but not from the buffer, which is where the file is generated.
The textToSpeech call in the code sample below is coming from the IBM Watson Text-to-Speech API.
Here is the example code; the app is a Next.js app that calls S3 via an API:
module.exports = requireAuth(async (req, res) => {
  textToSpeech
    .synthesize(synthesizeParams)
    .then(buffer => {
      const s3Params = {
        Bucket: 'waveforms/audioform',
        Key: 'CONTENT.mp3',
        Body: buffer,
        ContentType: 'audio/mpeg',
        ACL: 'public/read'
      }
      s3.upload(s3Params, function (s3Err, data) {
        if (s3Err) throw s3Err
        console.log(`File uploaded successfully at ${data.Location}`)
      })
    })
    .catch(err => {
      console.log('error:', err)
    })
})
The error is Error: Unsupported body payload object, so it appears that the buffer is not being accessed correctly, but I'm not sure how to extract the relevant data.
How can I pass a file from the Node buffer to S3?
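One way this is commonly resolved (a sketch, assuming the ibm-watson Node SDK v5+, where synthesize() resolves to a response wrapper whose result property is a readable stream rather than a Buffer; passing that wrapper as Body is what triggers "Unsupported body payload object") is to collect the stream into a Buffer before handing it to S3. Note also that the canned ACL is spelled public-read, and that the path portion of 'waveforms/audioform' belongs in the Key, not the Bucket:
const streamToBuffer = stream =>
  new Promise((resolve, reject) => {
    const chunks = []
    stream.on('data', chunk => chunks.push(chunk))
    stream.on('error', reject)
    stream.on('end', () => resolve(Buffer.concat(chunks)))
  })

module.exports = requireAuth(async (req, res) => {
  try {
    const response = await textToSpeech.synthesize(synthesizeParams)
    const buffer = await streamToBuffer(response.result) // result is the audio stream
    const s3Params = {
      Bucket: 'waveforms',            // bucket name only
      Key: 'audioform/CONTENT.mp3',   // the 'directory' lives in the Key
      Body: buffer,
      ContentType: 'audio/mpeg',
      ACL: 'public-read'
    }
    const data = await s3.upload(s3Params).promise()
    console.log(`File uploaded successfully at ${data.Location}`)
    res.status(200).json({ url: data.Location })
  } catch (err) {
    console.log('error:', err)
    res.status(500).json({ error: err.message })
  }
})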

Related

How to Upload file in a directory to minIO bucket

Hello everyone. I have a bucket named 'geoxing' on a MinIO server, and the bucket has a directory img/site. I want to upload a picture into the site directory using Node.js. Below is my code, and I am getting the error Invalid bucket name: geoxing/img/site. How can I solve this error? Thanks!
savefile() {
  const filePath = 'D://repositories//uploads//geoxing//site//b57e46b4bcf879839b7074782sitePic.jpg';
  const bucketname = 'geoxing/img/site';
  var metaData = {
    'Content-Type': 'image/jpg',
    'Content-Language': 123,
    'X-Amz-Meta-Testing': 1234,
    example: 5678,
  };
  this.minioClient.fPutObject(
    bucketname,
    'b57e46b4bcf879839b7074782sitePic.jpg',
    filePath,
    metaData,
    function (err, objInfo) {
      if (err) {
        return console.log(err);
      }
      return console.log('Success', objInfo.etag);
    },
  );
}
In Amazon S3 and MinIO:
Bucket should be just the name of the bucket (e.g. geoxing)
Key should include the full path as well as the filename (e.g. img/site/b57e46b4bcf879839b7074782sitePic.jpg)
Amazon S3 and MinIO do not have 'folders' or 'directories', but they emulate directories by including the path in the Key. Folders do not need to be created before uploading -- they just magically appear when objects are stored under that 'path'.
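Applied to the question's snippet, that looks like this (a sketch keeping the original file names):
savefile() {
  const filePath = 'D://repositories//uploads//geoxing//site//b57e46b4bcf879839b7074782sitePic.jpg';
  const bucketname = 'geoxing'; // bucket name only, no path
  var metaData = {
    'Content-Type': 'image/jpg',
  };
  this.minioClient.fPutObject(
    bucketname,
    'img/site/b57e46b4bcf879839b7074782sitePic.jpg', // the 'directory' goes in the object name
    filePath,
    metaData,
    function (err, objInfo) {
      if (err) {
        return console.log(err);
      }
      return console.log('Success', objInfo.etag);
    },
  );
}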

AWS Image upload data to S3 from form is corrupt

I am using multer in a Lambda function to upload an image through an API POST request I am building; this is part of a form on my website.
This is the console log from CloudWatch:
{
  fieldname: 'logo_image',
  originalname: '8824.png',
  encoding: '7bit',
  mimetype: 'image/png',
  destination: '/tmp/upload',
  filename: 'f7f44f5c39304937d10e90ceb7e9ddbb',
  path: '/tmp/upload/f7f44f5c39304937d10e90ceb7e9ddbb',
  size: 1376654
}
This is my js express function that accepts the image using multer:
routes.post('/', upload.single('logo_image'), async (req, res) => {
  const file = req.file
  // here I am just going into another function to check the form data is valid
  await checkTicketId(req, res)
});
I then upload the data to my S3 bucket:
const result = await uploadFile(req.file)

function uploadFile(file, policy_id) {
  const fileStream = fs.createReadStream(file.path)
  const uploadParams = {
    Bucket: bucketName,
    Body: fileStream,
    Key: file.filename,
  }
  return s3.putObject(uploadParams).promise()
}
The data is uploaded fine, however it's just some binary nonsense. And if I return the object I get data like this:
IHDR � �2�p IDATx���i�$I�%��Ǣj�WVVf]���9zw�X�h"\D � ��7�# -�h���S�]gfU��n�����EDE��##"�ȡݠ���ws3UQ���
[... remainder of the raw binary dump omitted ...]
Ideally I want it stored as .webp. I have also tried using upload instead of putObject and changing the file extension, and there doesn't seem to be any buffer data I can use on the req.file property either.
Any insight into this would be helpful; I've been stuck on it for a while now, and I've already enabled binary data through the API on AWS. Thanks!
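One frequent culprit in this Lambda setup is that API Gateway decodes the multipart body as UTF-8 text unless it is passed through as base64 end to end, so the bytes are corrupted before multer ever parses them. If the Express app is wrapped with serverless-http, a hedged sketch is to declare the upload content types as binary (the binary option is part of serverless-http; the exact type list here is an assumption about this particular API):
const serverless = require('serverless-http')
const express = require('express')

const app = express()
// ... mount the multer upload route from above on app

// Treat these content types as base64/binary rather than decoding
// them as text; otherwise the multipart body reaches multer mangled.
module.exports.handler = serverless(app, {
  binary: ['multipart/form-data', 'image/*']
})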

How to upload URL-based mp3 file from the client to the node server

I'm trying to upload an mp3 file to a node server using fetch.
Is the code below the correct way to do this?
var song;
toDataUrl('http://s5.qhres.com/static/465f1f953f1e6ff2.mp3', function(myBase64) {
  // console.log(myBase64); // myBase64 is the base64 string
  song = myBase64;
});
fetch("/ffmpegserver/upload", {
  method: 'PUT',
  headers: { 'Accept': 'audio/mpeg', 'Content-Type': 'audio/mpeg' },
  body: song
})
  .then(response => {
    console.log("Got response after uploading song:", response);
  })
  .catch(error => {
    console.log("Error in Firebase AC upload song: ", error);
  });
If so, how can I receive it and write it out as an mp3 file on the node server?
app.put("/ffmpegserver/upload", (req, res) => {
  var mp3SongName = "output/test.mp3";
  var mp3_file = fs.createWriteStream(mp3SongName);
  // how to write the mp3 file?
});
This code would need something on the client that streams the audio into a buffer and then sends that buffer to the node server.
Instead of that, I think the best option is to send the URL of the mp3 file to the node backend and then use a library like axios to download the mp3 file and persist it, along the lines of the sketch below.
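A minimal sketch of that approach (it assumes the client PUTs a JSON body like { url: '...' } and that express.json() middleware is mounted):
const axios = require('axios');
const fs = require('fs');

app.put('/ffmpegserver/upload', async (req, res) => {
  try {
    // download the mp3 as a stream rather than buffering it all in memory
    const response = await axios.get(req.body.url, { responseType: 'stream' });
    const mp3File = fs.createWriteStream('output/test.mp3');
    response.data.pipe(mp3File);
    mp3File.on('finish', () => res.sendStatus(200));
    mp3File.on('error', err => res.status(500).send(err.message));
  } catch (err) {
    res.status(500).send(err.message);
  }
});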

Upload HTML file to AWS S3 and then serving it instead of downloading

I am downloading a web page and writing it to a file named thisArticle.html, using the code below.
var file = fs.createWriteStream("thisArticle.html");
var request = http.get(req.body.url, response => response.pipe(file) );
After that, I am trying to read the file and upload it to S3; here is the code that I wrote:
fs.readFile('thisArticle.html', 'utf8', function(err, html) {
  if (err) {
    console.log(err + "");
    throw err;
  }
  var pathToSave = 'articles/ ' + req.body.title + '.html';
  var s3bucket = new AWS.S3({ params: { Bucket: 'all-articles' } });
  s3bucket.createBucket(function () {
    var params = {
      Key: pathToSave,
      Body: html,
      ACL: 'public-read'
    };
    s3bucket.upload(params, function (err, data) {
      fs.unlink("thisArticle.html", function (err) {
        console.error(err);
      });
      if (err) {
        console.log('ERROR MSG: ', err);
        res.status(500).send(err);
      } else {
        console.log(data.Location);
      }
      // ..., more code below
    });
  });
});
Now, I am facing two issues:
The file is uploaded, but with 0 bytes (empty).
When I upload the file manually via the S3 dashboard it succeeds, but when I load its URL in the browser it downloads the HTML file instead of serving it.
Any guidance on what I am missing?
Set the ContentType to "text/html".
s3 = boto3.client("s3")
s3.put_object(
    Bucket=s3_bucket,
    Key=s3_key,
    Body=html_string,
    CacheControl="max-age=0,no-cache,no-store,must-revalidate",
    ContentType="text/html",
    ACL="public-read"
)
It looks like your upload function is deleting the file with fs.unlink before it gets uploaded. That's why it's going up as 0 bytes.
Also, to make the bucket serve the HTML, you need to turn on static website hosting as described in the AWS S3 docs: http://docs.aws.amazon.com/AmazonS3/latest/UG/ConfiguringBucketWebsite.html
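Putting both answers into the question's own Node.js code, a sketch (same variables as above) would be:
var params = {
  Key: pathToSave,
  Body: html,
  ACL: 'public-read',
  ContentType: 'text/html' // serve inline instead of triggering a download
};
s3bucket.upload(params, function (err, data) {
  if (err) {
    console.log('ERROR MSG: ', err);
    return res.status(500).send(err);
  }
  // remove the temp file only after the upload has succeeded
  fs.unlink('thisArticle.html', function (unlinkErr) {
    if (unlinkErr) console.error(unlinkErr);
  });
  console.log(data.Location);
});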

S3 file upload file object using node js

I'm using Sails.js 0.12.1 and Node.js 4.2.6.
I want to upload a file from the front-end (Angular.js) through an API, and from the backend upload that file to an AWS S3 bucket.
From the front-end I'm sending the file to the API. On the backend I'm receiving the file with its name, but while uploading it to S3 I get the error
Cannot determine length of [object Object]
I googled the error and found many links, but no luck.
Back-end
uploadPersonAvtar: function(req, res) {
  var zlib = require('zlib');
  var file = req.file('image');
  var mime = require('mime');
  data = {
    Bucket: 'bucket',
    Key: 'my_key',
    Key: file.name,
    Body: file,
    ContentType: mime.lookup(file.name)
  };
  // Upload the stream
  var s3obj = new AWS.S3(S3options);
  s3obj.upload(data, function(err, data) {
    if (err) console.log("An error occurred", err);
    console.log("Uploaded the file at", data);
  })
}
Is my approach correct?
If yes, what am I doing wrong?
I want to know how to use the file object to upload the file.
I can create a read stream, but I don't have the file path; when I create the file object I get the error: path must be a string.
You could look at the following example about streaming data: Amazon S3: Uploading an arbitrarily sized stream (upload).
var fs = require('fs');
var zlib = require('zlib');

var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});
s3obj.upload({Body: body})
  .on('httpUploadProgress', function(evt) { console.log(evt); })
  .send(function(err, data) { console.log(err, data); });
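In Sails 0.12, req.file('image') is a skipper Upstream, not a file on disk, which is why there is no path. The usual pattern is to let skipper write the file first and then stream it to S3. A sketch, assuming skipper's default local adapter (the fd field it reports is the saved file's path):
uploadPersonAvtar: function (req, res) {
  var fs = require('fs');
  var mime = require('mime');

  req.file('image').upload(function (err, uploadedFiles) {
    if (err) return res.serverError(err);
    if (!uploadedFiles.length) return res.badRequest('No file was uploaded');

    var uploaded = uploadedFiles[0];
    // uploaded.fd is the path where skipper stored the file
    var body = fs.createReadStream(uploaded.fd);

    var s3obj = new AWS.S3(S3options);
    s3obj.upload({
      Bucket: 'bucket',
      Key: uploaded.filename,
      Body: body,
      ContentType: mime.lookup(uploaded.filename)
    }, function (err, data) {
      if (err) return res.serverError(err);
      return res.ok(data.Location);
    });
  });
}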
