Angular - how to get S3 bucket contents as Observable - javascript

How do I structure a service method in Angular4 to make an s3.listObjects call to return the contents of an S3 bucket as an Observable?
Here's what I'm trying at present, failing miserably:
public loadFilesFromS3(): Observable<any[]> {
  const s3 = new AWS.S3();
  const params = {
    Bucket: 'bucket-name',
    Prefix: 'prefix-name'
  };
  return (
    s3.listObjects(params, (err, data) => {
      if (err) {
        throw err;
      } else {
        return(data);
      }
    })
  )
}
Just completely stuck on this for the moment! :-|

Well, after some homework here's what I came up with. Seems to work well:
private tracksListSubject = new BehaviorSubject([]);
public tracksList$: Observable<Track[]> = this.tracksListSubject.asObservable();

public loadTracksFromS3() {
  console.log('loading tracks from S3...');
  this.authenticate();
  const s3 = new AWS.S3();
  const params = {
    Bucket: config.AWS_BUCKET_NAME
  };
  s3.listObjects(params, (err, data) => {
    if (err) {
      console.log('error!', err);
      return; // bail out so we don't try to read data.Contents on failure
    }
    const raw = data.Contents;
    const tracks: Track[] = [];
    raw.forEach((item) => {
      tracks.push({
        id: item.ETag,
        title: item.Key,
        url: config.AWS_BASE_URL + item.Key,
        size: Math.round(item.Size / 1024 / 1024 * 10) / 10 // size in MB, one decimal
      });
    });
    console.log(tracks.length, 'tracks loaded');
    this.tracksListSubject.next(tracks);
  });
}
Then in other components I can just inject this service and subscribe to the tracksList$ property on it. Every time I modify the list of tracks inside the service, I issue this.tracksListSubject.next(newTracksList).
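For completeness, here's a minimal sketch of what a consuming component might look like (the component, selector, and import paths here are illustrative, not part of the original service):
import { Component, OnInit } from '@angular/core';
import { Observable } from 'rxjs/Observable'; // RxJS 5 style import, matching Angular 4
import { TrackService, Track } from './track.service'; // hypothetical path to the service above

@Component({
  selector: 'app-track-list',
  template: `<div *ngFor="let track of tracks$ | async">{{ track.title }}</div>`
})
export class TrackListComponent implements OnInit {
  public tracks$: Observable<Track[]>;

  constructor(private trackService: TrackService) {
    // The service's BehaviorSubject-backed stream; the async pipe in the
    // template handles subscribing and unsubscribing.
    this.tracks$ = this.trackService.tracksList$;
  }

  ngOnInit() {
    // Kick off the initial S3 load; new track lists arrive via tracks$.
    this.trackService.loadTracksFromS3();
  }
}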

You should be able to use the Observable.bindNodeCallback function to convert the s3.listObjects function from a node style callback to a function that returns an Observable.
const listObjectsAsObservable = Observable.bindNodeCallback(s3.listObjects.bind(s3));
Note that, as shown above, you need to bind the s3 object to the function so that s3 is used as this inside listObjects; otherwise you will get an error.
Then you can use it as follows:
const params = {
  Bucket: config.AWS_BUCKET_NAME
};

listObjectsAsObservable(params)
  .subscribe({
    next: (response) => console.log(response),
    error: (err) => console.log(err)
  });

Related

nextJS: async getInitialProps() with AWS S3?

I'm trying to get an s3.getObject() running inside an async getInitialProps() function in a nextJS project, but I can't for the love of it figure out how to get the results prepped so they can be returned as an object (which is needed for getInitialProps() and nextJS' SSR to work properly).
Here is the code:
static async getInitialProps({ query }) {
  const AWS = require('aws-sdk');
  const s3 = new AWS.S3({
    credentials: {
      accessKeyId: KEY,
      secretAccessKey: KEY
    }
  });
  // The id from the route (e.g. /img/abc123987)
  let filename = query.id;
  const params = {
    Bucket: BUCKETNAME,
    Key: KEYDEFAULTS + '/' + filename
  };
  const res = await s3.getObject(params, (err, data) => {
    if (err) throw err;
    let imgData = 'data:image/jpeg;base64,' + data.Body.toString('base64');
    return imgData;
  });
  return ...
}
The idea is to fetch an image from S3 and return it as base64 code (just to clear things up).
In your code, s3.getObject works with a callback, so you need to wait for the callback to be called.
You can achieve this by converting the callback into a promise.
static async getInitialProps({ query }) {
  const AWS = require('aws-sdk');
  const s3 = new AWS.S3({
    credentials: {
      accessKeyId: KEY,
      secretAccessKey: KEY
    }
  });
  // The id from the route (e.g. /img/abc123987)
  let filename = query.id;
  const params = {
    Bucket: BUCKETNAME,
    Key: KEYDEFAULTS + '/' + filename
  };
  const res = await new Promise((resolve, reject) => {
    s3.getObject(params, (err, data) => {
      if (err) return reject(err);
      let imgData = 'data:image/jpeg;base64,' + data.Body.toString('base64');
      resolve(imgData);
    });
  });
  return ...
}
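As a further side note, the aws-sdk v2 client can also hand you a promise directly via .promise(), which removes the manual wrapper entirely. A sketch under that assumption (keeping the same KEY/BUCKETNAME/KEYDEFAULTS placeholders):
static async getInitialProps({ query }) {
  const AWS = require('aws-sdk');
  const s3 = new AWS.S3({
    credentials: { accessKeyId: KEY, secretAccessKey: KEY }
  });
  const params = {
    Bucket: BUCKETNAME,
    Key: KEYDEFAULTS + '/' + query.id
  };
  // .promise() resolves with the same data object the callback would receive
  const data = await s3.getObject(params).promise();
  const imgData = 'data:image/jpeg;base64,' + data.Body.toString('base64');
  // getInitialProps must return a plain object
  return { imgData };
}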

Upload Image from form-data to S3 using a Lambda

So I am writing a Lambda that will take in some form data via a straight POST through API Gateway (testing using Postman for now) and then send that image to S3 for storage. Every time I run it, the image uploaded to S3 is corrupted and won't open properly. I have seen people having to decode/encode the incoming data, but I feel like I have tried everything using Buffer.from. I am only looking to store either .png or .jpg. The code below does not reflect my attempts at Base64 encoding/decoding, since they all failed. Here is what I have so far -
Sample Request in postman
{
  image: (uploaded .jpg/.png),
  metadata: { tag: 'iPhone' }
}
Lambda
const AWS = require('aws-sdk')
const multipart = require('aws-lambda-multipart-parser')
const s3 = new AWS.S3();

exports.handler = async (event) => {
  const form = multipart.parse(event, false)
  const s3_response = await upload_s3(form)
  return {
    statusCode: '200',
    body: JSON.stringify({ data: s3_response })
  }
};

const upload_s3 = async (form) => {
  const uniqueId = Math.random().toString(36).substr(2, 9);
  const key = `${uniqueId}_${form.image.filename}`
  const request = {
    Bucket: 'bucket-name',
    Key: key,
    Body: form.image.content,
    ContentType: form.image.contentType,
  }
  try {
    const data = await s3.putObject(request).promise()
    return data
  } catch (e) {
    console.log('Error uploading to S3: ', e)
    return e
  }
}
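As an aside, the decode step the question alludes to usually looks something like the sketch below when using a Lambda proxy integration; whether it's needed at all depends on the API Gateway binary settings mentioned in the resolution further down.
// Sketch: normalize the raw request body before multipart parsing.
// API Gateway sets isBase64Encoded when it treated the payload as binary.
const rawBody = event.isBase64Encoded
  ? Buffer.from(event.body, 'base64')
  : Buffer.from(event.body || '', 'binary');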
EDIT:
I am now attempting to save the image into the /tmp directory and then use a read stream to upload it to S3. Here is some code for that:
s3 upload function
const AWS = require('aws-sdk')
const fs = require('fs')
const s3 = new AWS.S3()

module.exports = {
  upload: (file) => {
    return new Promise((resolve, reject) => {
      const key = `${Date.now()}.${file.extension}`
      const bodyStream = fs.createReadStream(file.path)
      const params = {
        Bucket: process.env.S3_BucketName,
        Key: key,
        Body: bodyStream,
        ContentType: file.type
      }
      s3.upload(params, (err, data) => {
        if (err) {
          return reject(err)
        }
        return resolve(data)
      })
    })
  }
}
form parser function
const busboy = require('busboy')

module.exports = {
  parse: (req, temp) => {
    const ctype = req.headers['Content-Type'] || req.headers['content-type']
    let parsed_file = {}
    return new Promise((resolve) => {
      try {
        const bb = new busboy({
          headers: { 'content-type': ctype },
          limits: {
            fileSize: 31457280,
            files: 1,
          }
        })
        bb.on('file', function (fieldname, file, filename, encoding, mimetype) {
          const stream = temp.createWriteStream()
          const ext = filename.split('.')[1]
          console.log('parser -- ext ', ext)
          parsed_file = { name: filename, path: stream.path, f: file, type: mimetype, extension: ext }
          file.pipe(stream)
        }).on('finish', () => {
          resolve(parsed_file)
        }).on('error', err => {
          console.error(err)
          resolve({ err: 'Form data is invalid: parsing error' })
        })
        if (req.end) {
          req.pipe(bb)
        } else {
          bb.write(req.body, req.isBase64Encoded ? 'base64' : 'binary')
        }
        return bb.end()
      } catch (e) {
        console.error(e)
        return resolve({ err: 'Form data is invalid: parsing error' })
      }
    })
  }
}
handler
const form_parser = require('./form-parser').parse
const s3_upload = require('./s3-upload').upload
const temp = require('temp')

exports.handler = async (event, context) => {
  temp.track()
  const parsed_file = await form_parser(event, temp)
  console.log('index -- parsed form', parsed_file)
  const result = await s3_upload(parsed_file)
  console.log('index -- s3 result', result)
  temp.cleanup()
  return {
    statusCode: '200',
    body: JSON.stringify(result)
  }
}
The above edited code is a combination of other code and a GitHub repo I found that is trying to achieve the same results. Even with this solution the file is still corrupted.
Figured out this issue. The code works perfectly fine; it was an issue with API Gateway. You need to go into the API Gateway settings, set the Binary Media Type to multipart/form-data, and then re-deploy the API. Hope this helps someone else who is banging their head against the wall trying to send images via form data to a Lambda.
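If you'd rather script that change than click through the console, something along these lines should work (a sketch using the aws-sdk v2 APIGateway client; the REST API id and stage name are placeholders):
const AWS = require('aws-sdk');
const apigateway = new AWS.APIGateway();

const restApiId = 'your-rest-api-id'; // placeholder

apigateway.updateRestApi({
  restApiId,
  patchOperations: [
    // '~1' is the JSON-Patch escape for '/' in 'multipart/form-data'
    { op: 'add', path: '/binaryMediaTypes/multipart~1form-data' }
  ]
}).promise()
  // The change only takes effect after the stage is redeployed.
  .then(() => apigateway.createDeployment({ restApiId, stageName: 'prod' }).promise())
  .then(() => console.log('binary media type added and API redeployed'))
  .catch(console.error);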

Node.js process exiting in the middle, with no error (using streams)

I'm writing a Lambda function which is given a list of text files on S3, and concatenates them together, and then zips that resulting file. For some reason, the function is bombing out in the middle of the process, with no errors.
The payload sent to the Lambda func looks like this:
{
  "sourceFiles": [
    "s3://bucket/largefile1.txt",
    "s3://bucket/largefile2.txt"
  ],
  "destinationFile": "s3://bucket/concat.zip",
  "compress": true,
  "omitHeader": false,
  "preserveSourceFiles": true
}
The scenarios in which this function works totally fine:
The two files are small, and compress === false
The two files are small, and compress === true
The two files are large, and compress === false
If I try to have it compress two large files, it quits in the middle. The concatenation process itself works fine, but when it tries to use zip-stream to add the stream to an archive, it fails.
The two large files together are 483,833 bytes. When the Lambda function fails, it reads either 290,229 or 306,589 bytes (it's random) then quits.
This is the main entry point of the function:
const packer = require('zip-stream');
const S3 = require('aws-sdk/clients/s3');
const s3 = new S3({ apiVersion: '2006-03-01' });
const { concatCsvFiles } = require('./csv');
const { s3UrlToParts } = require('./utils');

function addToZip(archive, stream, options) {
  return new Promise((resolve, reject) => {
    archive.entry(stream, options, (err, entry) => {
      console.log('entry done', entry);
      if (err) return reject(err);
      resolve(entry);
    });
  });
}

export const handler = async event => {
  /**
   * concatCsvFiles returns a readable stream to pass to either the archiver or
   * s3.upload.
   */
  let bytesRead = 0;
  try {
    const stream = await concatCsvFiles(event.sourceFiles, {
      omitHeader: event.omitHeader,
    });
    stream.on('data', chunk => {
      bytesRead += chunk.length;
      console.log('read', bytesRead, 'bytes so far');
    });
    stream.on('end', () => {
      console.log('this is never called :(');
    });
    const dest = s3UrlToParts(event.destinationFile);
    let archive;
    if (event.compress) {
      archive = new packer();
      await addToZip(archive, stream, { name: 'concat.csv' });
      archive.finalize();
    }
    console.log('uploading');
    await s3
      .upload({
        Body: event.compress ? archive : stream,
        Bucket: dest.bucket,
        Key: dest.key,
      })
      .promise();
    console.log('done uploading');
    if (!event.preserveSourceFiles) {
      const s3Objects = event.sourceFiles.map(s3Url => {
        const { bucket, key } = s3UrlToParts(s3Url);
        return {
          bucket,
          key,
        };
      });
      await s3
        .deleteObjects({
          Bucket: s3Objects[0].bucket,
          Delete: {
            Objects: s3Objects.map(s3Obj => ({ Key: s3Obj.key })),
          },
        })
        .promise();
    }
    console.log('## Never gets here');
    // return {
    //   newFile: event.destinationFile,
    // };
  } catch (e) {
    if (e.code) {
      throw new Error(e.code);
    }
    throw e;
  }
};
And this is the concatenation code:
import MultiStream from 'multistream';
import { Readable } from 'stream';
import S3 from 'aws-sdk/clients/s3';
import { s3UrlToParts } from './utils';

const s3 = new S3({ apiVersion: '2006-03-01' });

/**
 * Takes an array of S3 URLs and returns a readable stream of the concatenated results
 * @param {string[]} s3Urls Array of S3 URLs
 * @param {object} options Options
 * @param {boolean} options.omitHeader Omit the header from the final output
 */
export async function concatCsvFiles(s3Urls, options = {}) {
  // Get the header so we can use the length to set an offset in grabbing files
  const firstFile = s3Urls[0];
  const file = s3UrlToParts(firstFile);
  const data = await s3
    .getObject({
      Bucket: file.bucket,
      Key: file.key,
      Range: 'bytes=0-512', // first 512 bytes is pretty safe for header size
    })
    .promise();
  const streams = [];
  const [header] = data.Body.toString().split('\n');
  for (const s3Url of s3Urls) {
    const { bucket, key } = s3UrlToParts(s3Url);
    const stream = s3
      .getObject({
        Bucket: bucket,
        Key: key,
        Range: `bytes=${header.length + 1}-`, // +1 for newline char
      })
      .createReadStream();
    streams.push(stream);
  }
  if (!options.omitHeader) {
    const headerStream = new Readable();
    headerStream.push(header + '\n');
    headerStream.push(null);
    streams.unshift(headerStream);
  }
  const combinedStream = new MultiStream(streams);
  return combinedStream;
}
Got it. The problem was actually with the zip-stream library. Apparently it doesn't work well with S3 + streaming. I tried yazl and it works perfectly.
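For reference, a rough sketch of what the yazl version of the zip step might look like (yazl exposes an outputStream you can hand straight to s3.upload; this is illustrative, not the poster's exact code):
const yazl = require('yazl');

// Inside the handler, replace the zip-stream packer with yazl:
const zipfile = new yazl.ZipFile();
zipfile.addReadStream(stream, 'concat.csv'); // stream from concatCsvFiles
zipfile.end();

await s3
  .upload({
    Body: zipfile.outputStream,
    Bucket: dest.bucket,
    Key: dest.key,
  })
  .promise();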

How to upload an image to Firebase Storage from Ionic?

I need help sending an image to Firebase Storage from Ionic. So, I take a picture with this function:
async takePicture() {
  const options: CameraOptions = {
    quality: 100,
    destinationType: this.camera.DestinationType.FILE_URI,
    encodingType: this.camera.EncodingType.JPEG,
    mediaType: this.camera.MediaType.PICTURE
  }
  try {
    let imageURI = await this.camera.getPicture(options);
    let newImageURI = await this.cropImage(imageURI);
    let imageSanitized = await this.encodeFile(newImageURI);
    this.imgSrc = imageSanitized;
  } catch (e) {
    console.error(JSON.stringify(e));
  }
}
And I crop with this function:
cropImage(imgURI): Promise<string> {
  return new Promise((resolve, reject) => {
    this.cropService.crop(imgURI, { quality: 100 }).then((newImageURI: string) => {
      resolve(newImageURI);
    }, (err) => {
      reject(err);
    })
  })
}
Finally, I encode it with this function:
encodeFile(ImageURI: string): Promise<any> {
  return new Promise((resolve, reject) => {
    this.base64.encodeFile(ImageURI).then((base64File: string) => {
      this.imgUri = base64File;
      resolve(this.domSanitizer.bypassSecurityTrustResourceUrl(base64File));
    }, (err) => {
      reject(err);
    })
  })
}
this.imgSrc is my sanitized image, and it displays fine in my HTML file. However, I need to send this image to Firebase Storage. For that, I created this function:
uploadToStorage(imgString) {
  console.log(imgString);
  this.storage.child('exemplo.JPEG').putString(imgString, 'data_url').then((res) => {
    console.log(res);
  }, (error) => {
    console.error(error);
  });
}
imgString receives either the value of this.domSanitizer.bypassSecurityTrustResourceUrl(base64File) or the raw base64File from encodeFile.
I don't get an error in my upload function, but I don't get success either; nothing shows up.
How can I correctly send the image to the server?
I think you can capture the image as base64 by using:
destinationType: this.camera.DestinationType.DATA_URL
instead of FILE_URI.
Anyway, to record an image I used the following code with AngularFire2, and it works. this.data.image is where I saved the base64 of the image. It may get tricky if your encoder adds "data:image/jpeg;base64," to the beginning of your base64 string, so you may need to try with or without that prefix if this code doesn't work as expected.
import { AngularFireStorage, AngularFireStorageReference } from 'angularfire2/storage/public_api';
// ...
const storageRef: AngularFireStorageReference = this.afStorage.ref(`images/${this.userId}/profilePic/`);
storageRef.putString(this.data.image, 'data_url', {
  contentType: 'image/jpeg'
}).then(() => {
  storageRef.getDownloadURL().subscribe(url => {
    console.log('download url: ' + url);
    // function to save download URL of saved image in database
  });
});
I did this:
encodeFile(ImageURI: string): Promise<any> {
  return new Promise((resolve, reject) => {
    this.base64.encodeFile(ImageURI).then((base64File: string) => {
      this.imgUri = base64File;
      ...
    }, (err) => {
      ...
    })
  })
}
this.imgUri stores my base64-encoded image, but it's necessary to modify the beginning of it. So split off the prefix:
let messageSplit = this.imgUri.split('data:image/*;charset=utf-8;base64,')[1];
and put this in its place:
let message64 = 'data:image/jpg;base64,' + messageSplit;
Then upload to Firebase Storage:
const filePath = `my-pet-crocodile_${ new Date().getTime() }.jpg`;
let task = this.afStorage.ref(filePath).putString(message64, 'data_url');

how to pipe an archive (zip) to an S3 bucket

I'm a bit confused about how to proceed. I am using Archiver (a Node.js module) as a means to write data to a zip file. Currently, I have my code working when I write to a file (local storage).
var fs = require('fs');
var archiver = require('archiver');

var output = fs.createWriteStream(__dirname + '/example.zip');
var archive = archiver('zip', {
  zlib: { level: 9 }
});

archive.pipe(output);
archive.append(mybuffer, { name: 'msg001.txt' });
I’d like to modify the code so that the archive target file is an AWS S3 bucket. Looking at the code examples, I can specify the bucket name and key (and body) when I create the bucket object as in:
var s3 = new AWS.S3();
var params = { Bucket: 'myBucket', Key: 'myMsgArchive.zip', Body: myStream };
s3.upload(params, function(err, data) {
  …
});
Or
s3 = new AWS.S3({ params: { Bucket: 'myBucket', Key: 'myMsgArchive.zip' } });
s3.upload({ Body: myStream })
  .send(function(err, data) {
    …
  });
With regards to my S3 example(s), myStream appears to be a readable stream and I am confused as how to make this work as archive.pipe requires a writeable stream. Is this something where we need to use a pass-through stream? I’ve found an example where someone created a pass-through stream but the example is too terse to gain proper understanding. The specific example I am referring to is:
Pipe a stream to s3.upload()
Any help someone can give me would greatly be appreciated. Thanks.
This could be useful for anyone else wondering how to use pipe.
Since you correctly referenced the example using the pass-through stream, here's my working code:
1 - The routine itself, zipping files with node-archiver
exports.downloadFromS3AndZipToS3 = () => {
  // These are my input files I'm willing to read from S3 to ZIP them
  const files = [
    `${s3Folder}/myFile.pdf`,
    `${s3Folder}/anotherFile.xml`
  ]

  // Just in case you like to rename them as they have a different name in the final ZIP
  const fileNames = [
    'finalPDFName.pdf',
    'finalXMLName.xml'
  ]

  // Use promises to get them all
  const promises = []
  files.map((file) => {
    promises.push(s3client.getObject({
      Bucket: yourBucket,
      Key: file
    }).promise())
  })

  // Define the ZIP target archive
  let archive = archiver('zip', {
    zlib: { level: 9 } // Sets the compression level.
  })

  // Pipe!
  archive.pipe(uploadFromStream(s3client, 'someDestinationFolderPathOnS3', 'zipFileName.zip'))

  archive.on('warning', function(err) {
    if (err.code === 'ENOENT') {
      // log warning
    } else {
      // throw error
      throw err;
    }
  })

  // Good practice to catch this error explicitly
  archive.on('error', function(err) {
    throw err;
  })

  // The actual archive is populated here
  return Promise
    .all(promises)
    .then((data) => {
      data.map((thisFile, index) => {
        archive.append(thisFile.Body, { name: fileNames[index] })
      })
      archive.finalize()
    })
}
2 - The helper method
const uploadFromStream = (s3client, someFolder, aFilename) => {
  // The pass-through stream is both the upload Body and the pipe target,
  // so archiver can write into it while s3.upload reads from it.
  const pass = new stream.PassThrough()

  const s3params = {
    Bucket: yourBucket,
    Key: `${someFolder}/${aFilename}`,
    Body: pass,
    ContentType: 'application/zip'
  }

  s3client.upload(s3params, (err, data) => {
    if (err)
      console.log(err)
    if (data)
      console.log('Success')
  })

  return pass
}
The following example takes the accepted answer and makes it work with local files as requested.
const archiver = require("archiver")
const fs = require("fs")
const AWS = require("aws-sdk")
const s3 = new AWS.S3()
const stream = require("stream")

const zipAndUpload = async () => {
  const files = [`test1.txt`, `test2.txt`]
  const fileNames = [`test1target.txt`, `test2target.txt`]
  const archive = archiver("zip", {
    zlib: { level: 9 } // Sets the compression level.
  })
  files.map((thisFile, index) => {
    archive.append(fs.createReadStream(thisFile), { name: fileNames[index] })
  })

  const uploadStream = new stream.PassThrough()
  archive.pipe(uploadStream)
  archive.finalize()

  archive.on("warning", function (err) {
    if (err.code === "ENOENT") {
      console.log(err)
    } else {
      throw err
    }
  })
  archive.on("error", function (err) {
    throw err
  })
  archive.on("end", function () {
    console.log("archive end")
  })

  await uploadFromStream(uploadStream)
  console.log("all done")
}

const uploadFromStream = async pass => {
  const s3params = {
    Bucket: "bucket-name",
    Key: `streamtest.zip`,
    Body: pass,
    ContentType: "application/zip"
  }
  return s3.upload(s3params).promise()
}

zipAndUpload()
