NestJS Multer Amazon S3 issues uploading multiple files - javascript

I'm using NestJS, Node, and Express for my backend and Angular for my frontend. I have a stepper where the user steps through and enters information about themselves, as well as a profile photo and any photos of their art that they want to post (it's a rough draft). I'm sending the files to the backend with this code:
<h2>Upload Some Photos</h2>
<label for="singleFile">Upload file</label>
<input id="singleFile" type="file" [fileUploadInputFor]="fileUploadQueue"/>
<br>
<mat-file-upload-queue #fileUploadQueue
  [fileAlias]="'file'"
  [httpUrl]="'http://localhost:3000/profile/artPhotos'">
  <mat-file-upload [file]="file" [id]="i" *ngFor="let file of fileUploadQueue.files; let i = index"></mat-file-upload>
</mat-file-upload-queue>
The front end sends the photos as an array of files. I tried to change it so that it just sent a single file but could not get that working; I'm less focused on that because the user may need to upload multiple files anyway, so I want to figure out the multi-file case regardless. On the backend I'm using multer, multer-s3, and aws-sdk to upload the files, but it isn't working. Here is the controller code:
@Post('/artPhotos')
@UseInterceptors(FilesInterceptor('file'))
async uploadArtPhotos(@Req() req, @Res() res): Promise<void> {
  req.file = req.files[0];
  delete req.files;
  // tslint:disable-next-line:no-console
  console.log(req);
  await this._profileService.fileupload(req, res);
}
Here is ProfileService:
import { Profile } from './profile.entity';
import { InjectRepository } from '@nestjs/typeorm';
import { Repository } from 'typeorm';
import { ProfileDto } from './dto/profile.dto';
import { Req, Res, Injectable, UploadedFile } from '@nestjs/common';
import * as multer from 'multer';
import * as AWS from 'aws-sdk';
import * as multerS3 from 'multer-s3';

const AWS_S3_BUCKET_NAME = 'blah';
const s3 = new AWS.S3();
AWS.config.update({
  accessKeyId: 'blah',
  secretAccessKey: 'blah',
});

@Injectable()
export class ProfileService {
  constructor(@InjectRepository(Profile)
    private readonly profileRepository: Repository<Profile>) {}

  async createProfile(profileDto: ProfileDto): Promise<void> {
    await this.profileRepository.save(profileDto);
  }

  async fileupload(@Req() req, @Res() res): Promise<void> {
    try {
      this.upload(req, res, error => {
        if (error) {
          // tslint:disable-next-line:no-console
          console.log(error);
          return res.status(404).json(`Failed to upload image file: ${error}`);
        }
        // tslint:disable-next-line:no-console
        console.log('error');
        return res.status(201).json(req.file);
      });
    } catch (error) {
      // tslint:disable-next-line:no-console
      console.log(error);
      return res.status(500).json(`Failed to upload image file: ${error}`);
    }
  }

  upload = multer({
    storage: multerS3({
      // tslint:disable-next-line:object-literal-shorthand
      s3: s3,
      bucket: AWS_S3_BUCKET_NAME,
      acl: 'public-read',
      key: (req, file, cb) => {
        cb(null, `${Date.now().toString()} - ${file.originalname}`);
      },
    }),
  }).array('upload', 1);
}
I haven't implemented any middleware extending multer, but I don't think I need to. You can see in the controller that I delete the files property on req and replace it with a file property whose value is just the first member of the files array; that was only to see whether it would work if I sent it something it was expecting, but it did not. Does anyone have any ideas on how I can fix this, or can anyone at least point me in the right direction with a link to a relevant tutorial or something?

My first guess would be that it's the combination of the FileInterceptor and your own multer setup. I assume the FileInterceptor already runs multer in the controller, which is what makes the file available to the @UploadedFile decorator, and that could conflict with your later use of multer. Try removing the interceptor and see if that fixes the issue.
I'm also attaching how I do file uploads. I only upload single images and I use the AWS SDK directly, so I don't have to work with multer, but it might be helpful.
In the controller:
@Post(':id/uploadImage')
@UseInterceptors(FileInterceptor('file'))
public uploadImage(@Param() params: any, @UploadedFile() file: any): Promise<Property> {
  return this.propertyService.addImage(params.id, file);
}
Then my service
/**
 * Returns a promise with the URL string.
 *
 * @param file
 */
public uploadImage(file: any, urlKey: string): Promise<string> {
  const params = {
    Body: file.buffer,
    Bucket: this.AWS_S3_BUCKET_NAME,
    Key: urlKey,
  };
  return this.s3
    .putObject(params)
    .promise()
    .then(
      data => {
        return urlKey;
      },
      err => {
        return err;
      }
    );
}

Thanks Jedediah, I like how simple your code is. I copied it, but it still wasn't working. It turns out you have to instantiate the s3 object after you update the config with your access key and secret access key.
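For anyone else hitting this, a minimal sketch of the ordering that matters here (the credential values are placeholders, exactly as in the original snippet):

import * as AWS from 'aws-sdk';

// Update the global config first...
AWS.config.update({
  accessKeyId: 'blah',
  secretAccessKey: 'blah',
});

// ...and only then create the client, so it picks up the credentials.
const s3 = new AWS.S3();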

Related

Deleting Wasabi CDN Bucket 204 Error Not Deleting

So I'm trying to programmatically delete Wasabi CDN objects from one of my buckets. My request is sending back 204 and showing success but nothing is being moved/deleted. I'm using node/javascript to do this.
Here is my function that is supposed to delete the bucket.
import expressAsyncHandler from 'express-async-handler'
import User from '../../models/User.js'
import axios from 'axios'
import aws4 from 'aws4'

/**
 * @desc: THIS is going to be a testing function that will be added into admin delete user and all related docs.
 * @route: DELETE
 * @access: Private - Admin Route for when deleting user will delete the CDN username bucket aswell
 * @goalHere: The goal of this function is to delete the user bucket from CDN. So that if we d
 * @comment: This is a testing function that will be added into deleteVideo.js. Unless we just await this function in deleteVideo.js.
 */
export const deleteUserBucket = expressAsyncHandler(async (req, res, next) => {
  try {
    const user = await User.findOne({ username: req.user.username })
    const username = user.username
    let request = {
      host: process.env.CDN_HOST,
      method: 'DELETE',
      url: `https://s3.wasabisys.com/truthcasting/${username}?force_delete=true`,
      path: `/truthcasting/${username}?force_delete=true`,
      headers: {
        'Content-Type': 'application/json',
      },
      service: 's3',
      region: 'us-east-1',
      maxContentLength: Infinity,
      maxBodyLength: Infinity,
    }
    let signedRequest = aws4.sign(request, {
      accessKeyId: process.env.CDN_KEY,
      secretAccessKey: process.env.CDN_SECRET,
    })
    // delete the Host and Content-Length headers
    delete signedRequest.headers.Host
    delete signedRequest.headers['Content-Length']
    const response = await axios(signedRequest)
    console.log(response.data)
    console.log('successfully deleted user bucket', response.status)
    return res.status(200).json({
      message: `Successfully deleted user bucket`,
    })
  } catch (error) {
    console.log(error)
    return res.status(500).json({
      message: `Problem with deleting user bucket`,
    })
  }
})

export default deleteUserBucket
When I send the http DELETE request in POSTMAN to {{dev}}api/admin/deleteuserbucket it then gives me a response of 204 ok and this is the response.
{
  "message": "Successfully deleted user bucket"
}
I then go to my Wasabi CDN buckets to check whether it is deleted (in this case it's goodstock), and it's still there. I feel like I'm missing something dumb here.
In S3, the delete bucket API call returns 204 No Content and an empty response body on a successful delete.
With that URL, you are making a delete request on an object and not the bucket:
URL: `https://s3.wasabisys.com/truthcasting/${username}?force_delete=true`
The username passed in this URL will be interpreted as a key and S3 will look for an object in the root of the bucket.
Also, why not use the AWS SDK to delete the bucket instead of reimplementing all of this? Check the AWS docs for it.
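A minimal sketch of what that could look like with the v2 SDK (the endpoint is Wasabi's S3-compatible API as used elsewhere in this thread; the bucket name and function name are placeholders of mine, and S3 only deletes a bucket once it is empty):

import AWS from 'aws-sdk'

const s3 = new AWS.S3({
  endpoint: new AWS.Endpoint('s3.wasabisys.com'), // Wasabi's S3-compatible endpoint
  accessKeyId: process.env.CDN_KEY,
  secretAccessKey: process.env.CDN_SECRET,
  region: 'us-east-1',
})

// Hypothetical helper: deletes the bucket itself, which must already be empty.
async function removeBucket(bucketName) {
  await s3.deleteBucket({ Bucket: bucketName }).promise()
}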
So, for deleting contents that are inside your root bucket, you need to point the request at the complete object key. That means the way I had it set up in the original post was returning 204 ("which is expected from the Wasabi API") and not deleting anything, because I wasn't pointing it at the complete object path. I've also found that if you want to do a batch delete instead of deleting files one by one, you can use the aws-sdk node package to list the objects under a prefix and then loop through the response and remove what you need. Here is an example; hopefully it helps someone in the near future.
import expressAsyncHandler from 'express-async-handler'
import User from '../../models/User.js'
import axios from 'axios'
import aws4 from 'aws4'
import errorHandler from '../../middleware/error.js'
import AWS from 'aws-sdk'

/**
 * @desc: THIS is going to be a testing function that will be added into admin delete user and all related docs.
 * @route: DELETE
 * @access: Private - Admin Route for when deleting user will delete the CDN username bucket aswell
 * @goalHere: The goal of this function is to delete the user bucket from CDN. So that if we d
 * @comment: This is a testing function that will be added into deleteVideo.js. Unless we just await this function in deleteVideo.js.
 */
export const deleteUserBucket = expressAsyncHandler(async (req, res, next) => {
  const username = req.body.username
  try {
    // Connection
    // This is how you can use the .aws credentials file to fetch the credentials
    const credentials = new AWS.SharedIniFileCredentials({
      profile: 'wasabi',
    })
    AWS.config.credentials = credentials
    // This is a configuration to directly use a profile from the aws credentials file.
    AWS.config.credentials.accessKeyId = process.env.CDN_KEY
    AWS.config.credentials.secretAccessKey = process.env.CDN_SECRET
    // Set the AWS region. us-east-1 is the default for IAM calls.
    AWS.config.region = 'us-east-1'
    // Set an endpoint.
    const ep = new AWS.Endpoint('s3.wasabisys.com')
    // Create an S3 client
    const s3 = new AWS.S3({ endpoint: ep })
    // Set the details for the bucket and the user's prefix.
    const object_get_params = {
      Bucket: 'truthcasting',
      Prefix: `${username}/`,
      // Key: `cnfishead/videos/4:45:14-PM-5-6-2022-VIDDYOZE-Logo-Drop.mp4`,
      // Key: `cnfishead/images/headshot.04f99695-photo.jpg`,
    }
    // Kept as a reference: retrieving a single uploaded object.
    // s3.getObject(object_get_params, function (err, data) {
    //   if (err) console.log(err, err.stack) // an error occurred
    //   else console.log(data) // successful response
    // })
    // List the objects under the prefix, then delete each one.
    await s3.listObjectsV2(object_get_params, (err, data) => {
      if (err) {
        console.log(err)
        return res.status(500).json({
          message: 'Error getting object',
          error: err,
        })
      } else {
        console.log(data)
        // TODO: Change this for loop to an async for loop, like: for await (const file of data.Contents) { }
        for (let i = 0; i < data.Contents.length; i++) {
          const object_delete_params = {
            Bucket: 'truthcasting',
            Key: data.Contents[i].Key,
          }
          s3.deleteObject(object_delete_params, (err, data) => {
            if (err) {
              console.log(err)
              return res.status(500).json({
                message: 'Error deleting object',
                error: err,
              })
            } else {
              console.log(data)
            }
          })
        }
        if (data.IsTruncated) {
          console.log('Truncated')
          getObjectFromBucket(req, res, next)
        }
        // console.log('Not Truncated')
        res.status(200).json({
          message: `Successfully retrieved + deleted ${data.Contents.length} objects`,
          data: data,
        })
      }
    })
  } catch (error) {
    console.log(error)
    errorHandler(error, req, res)
  }
})

export default deleteUserBucket
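Side note (not from the original answer): the v2 SDK also has a deleteObjects call that removes up to 1,000 keys in a single request, so the per-file deleteObject loop isn't strictly necessary. A rough sketch, assuming the same s3 client, bucket, and username prefix as above and that it runs inside the same async handler:

const listed = await s3
  .listObjectsV2({ Bucket: 'truthcasting', Prefix: `${username}/` })
  .promise()

if (listed.Contents.length > 0) {
  await s3
    .deleteObjects({
      Bucket: 'truthcasting',
      Delete: { Objects: listed.Contents.map(obj => ({ Key: obj.Key })) },
    })
    .promise()
}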

How to send piped response in Next.js API using wkhtmltoimage?

I'm new to Next.js, and I'm trying to use wkhtmltoimage but I can't seem to send the generated image stream as a response in my Next.js API.
const fs = require('fs')
const wkhtmltoimage = require('wkhtmltoimage').setCommand(__dirname + '/bin/wkhtmltoimage');

export default async function handler(req, res) {
  try {
    await wkhtmltoimage.generate('<h1>Hello world</h1>').pipe(res);
    res.status(200).send(res)
  } catch (err) {
    res.status(500).send({ error: 'failed to fetch data' })
  }
}
I know I'm doing plenty of stuff wrong here, can anyone point me to the right direction?
Since you're concatenating __dirname and /bin/wkhtmltoimage together, that would mean you've installed the wkhtmltoimage executable to ./pages/api/bin which is probably not a good idea since the pages directory is special to Next.js.
We'll assume you've installed the executable in a different location on your filesystem/server instead (e.g., your home directory). It looks like the pipe function already sends the response, so the res.status(200).send(res) line will cause problems and can be removed. So the following should work:
// ./pages/api/hello.js
const homedir = require("os").homedir();

// Assumes the following installation path:
// - *nix: $HOME/bin/wkhtmltoimage
// - Windows: $env:USERPROFILE\bin\wkhtmltoimage.exe
const wkhtmltoimage = require("wkhtmltoimage").setCommand(
  homedir + "/bin/wkhtmltoimage"
);

export default async function handler(req, res) {
  try {
    res.status(200);
    await wkhtmltoimage.generate("<h1>Hello world</h1>").pipe(res);
  } catch (err) {
    res.status(500).send({ error: "failed to fetch data" });
  }
}

How to store files in production in Next.js?

I have a file exchange Next.js app that I would like to deploy. In development, whenever a file is dropped, the app stores it in the root public folder, and when the file is downloaded the app takes it from there as well, using an <a> tag with an href of uploads/{filename}. This all works well in development, but not in production.
I know that whenever npm run build is run, Next.js takes the files from the public folder, and files added there at runtime will not be served.
The question is are there any ways of persistent file storage in Next.js apart from third party services like AWS S3?
Next.js does allow file storage at build time, but not at runtime, so it will not be able to fulfill your file upload requirement on its own. AWS S3 is the best option here.
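To make that concrete, here is a rough sketch (not a definitive implementation) of an API route that streams an uploaded file straight to S3 with the v2 SDK. The bucket name, region, key naming scheme, and the assumption that the client sends the raw file bytes in the request body are all mine, not the answer's:

// pages/api/upload.js (sketch)
import AWS from 'aws-sdk';

// Let the raw request stream through instead of having Next.js parse the body.
export const config = { api: { bodyParser: false } };

// Credentials are assumed to come from the environment (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY).
const s3 = new AWS.S3({ region: 'us-east-1' });

export default async function handler(req, res) {
  // Hypothetical naming scheme: timestamp plus a filename passed as a query parameter.
  const key = `${Date.now()}-${req.query.filename || 'upload'}`;
  // s3.upload accepts a readable stream as Body, so the incoming request can be piped straight through.
  await s3.upload({ Bucket: process.env.S3_BUCKET, Key: key, Body: req }).promise();
  res.status(200).json({ key });
}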
Next.js in the Node runtime is just Node.js, so if your cloud provider lets you create a persistent disk and mount it into your project, you can do it:
e.g. pages/api/saveFile.ts:
import { writeFileSync } from 'fs';
import { NextApiRequest, NextApiResponse } from 'next';

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const { path = null } = req.query;
  if (!path) {
    res.status(400).json({ name: 'no path provided' })
  } else {
    // your file content here
    const content = Date.now().toString();
    writeFileSync(`/tmp/${path}.txt`, content);
    res.json({
      path,
      content
    })
  }
}
/tmp works in almost every cloud provider (including Vercel itself), but those files will be lost on the next deployment; instead you should use your mounted disk path.
pages/api/readFile.ts
import { readFileSync } from 'fs';
import { NextApiRequest, NextApiResponse } from 'next';

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const { path = '' } = req.query;
  if (!path) {
    res.status(400).json({ name: 'wrong' })
  } else {
    res.send(readFileSync(`/tmp/${path}`));
  }
}
Live running example:
fetch('https://eaglesdoc.vercel.app/api/writefile?path=test')
  .then(res => res.text())
  .then(res => console.log(res))
  .then(() => fetch('https://eaglesdoc.vercel.app/api/readfile?path=test'))
  .then(res => res.text())
  .then(res => console.log(res))

How to retrieve image from nodejs backend to react

I have uploaded an image using the multer library in Express, to the path Backend/Uploads/, and stored the image path in MongoDB.
My project structure looks like this:
DirectoryName
  Backend
    Uploads
  Frontend
I can get the image path in the frontend component, but how do I get the actual images from the backend folder?
Can I move the files into the frontend's public folder, or should I retrieve a stream from the server?
Will moving files from backend to frontend actually work in deployment?
I think you have done most of the work. Just set the image source or uri to the path and it will serve the image.
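For the image path to actually resolve in the browser, the Express backend also has to expose the Uploads folder. A minimal sketch, assuming the Backend/Uploads layout from the question and a hypothetical port:

// Backend/server.js (sketch)
const express = require('express');
const path = require('path');

const app = express();

// Serve everything in Backend/Uploads under the /uploads URL prefix,
// so a file saved as Uploads/photo.jpg is reachable at /uploads/photo.jpg.
app.use('/uploads', express.static(path.join(__dirname, 'Uploads')));

app.listen(5000);

The path stored in MongoDB can then be used directly as the img src, e.g. http://localhost:5000/uploads/photo.jpg.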
A simple example of an implementation from one of my projects.
Maybe not the best, but it works and, most importantly, you can get the idea.
In this case I keep the files on the server.
Created a route in my API:
router.get('/download/:photo_id', async (req, res) => {
  try {
    const photo = await Photo.findOne({
      photoID: req.params.photo_id,
    });
    if (!photo) {
      return res.status(404).json({ msg: 'Photo not found' });
    }
    const filename = photo.photoFileName;
    const downloadPath = path.join(__dirname, '../../../imgs/', `${filename}`);
    res.download(downloadPath);
  } catch (err) {
    console.error(err.message);
    if (err.kind === 'ObjectId') {
      return res.status(404).json({ msg: 'Photo not found' });
    }
    res.status(500).send('Server error');
  }
});
And this route is called from the frontend something like this:
const handleDownload = async () => {
  const res = await fetch(`/api/photo/download/${photoID}`);
  const blob = await res.blob();
  // `download` here is assumed to be a client-side save helper (e.g. the downloadjs package)
  download(
    blob,
    `${photoID}-${title}-${contributorName}.jpg`
  );
};

nestjs multer get list of new file names after multer storage

I have this middleware that I use to upload files.
@Injectable()
export class FilesMiddleware implements NestMiddleware {
  private storage = multer.diskStorage({
    destination: (req, file, cb) => {
      cb(null, path.join(__dirname, '../../uploads/'));
    },
    filename: (req, file, cb) => {
      let extArray = file.mimetype.split("/");
      let extension = extArray[extArray.length - 1];
      cb(null, file.fieldname + '-' + Date.now() + '.' + extension)
    }
  });

  resolve(...args: any[]): MiddlewareFunction {
    return (req, res, next) => {
      console.log(req.files);
      const upload = multer({ storage: this.storage });
      upload.any();
      return next();
    }
  }
}
The problem is that in my request, when I use req.files, it gives me the original file names instead of the new file names (with the date etc., as set in the multer storage options).
Is there a way I can get the new file names multer just generated in the middleware?
@Post('upload')
@UseInterceptors(FilesInterceptor('files[]', 20, {}))
public async onUpload(@Request() req, @Response() res, @UploadedFiles() files) {
  const mediaResponse = await this.media.saveMedias(0, files);
  res.json({ status: true });
}
First of all: it does not make sense to use multer both via the built-in FilesInterceptor and via a custom FilesMiddleware. Choose one of the following two options:
A) Use the built-in FilesInterceptor (recommended)
You can provide your storage configuration directly for each FilesInterceptor:
const storage = {...};

@Controller()
export class AppController {
  @Post('upload')
  @UseInterceptors(FilesInterceptor('files', 20, { storage }))
  public async onUpload(@UploadedFiles() files) {
    return files.map(file => file.filename);
  }
}
Or provide the default multer storage configuration by importing the MulterModule:
imports: [
  MulterModule.register({
    storage: {...},
  }),
]
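For reference, the storage value in either place can be the same multer.diskStorage configuration from the question; a sketch along those lines (the destination path and naming scheme are simply the question's example):

import * as multer from 'multer';
import * as path from 'path';

const storage = multer.diskStorage({
  destination: (req, file, cb) => {
    cb(null, path.join(__dirname, '../../uploads/'));
  },
  filename: (req, file, cb) => {
    // Derive the extension from the mimetype, as in the question
    const extension = file.mimetype.split('/').pop();
    cb(null, `${file.fieldname}-${Date.now()}.${extension}`);
  },
});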
B) Using your own middleware (without FilesInterceptor)
Only use this if you need more flexibility than the FilesInterceptor provides. You can use a Promise to wait for the upload to finish. After the upload, you can access the new file names via req.files.
export class FilesMiddleware implements NestMiddleware {
  private storage = {...};

  async use(req, res, next) {
    const upload = multer({ storage: this.storage });
    // wait until upload has finished
    await new Promise((resolve, reject) => {
      upload.array('files')(req, res, err => err ? reject(err) : resolve());
    });
    // Then you can access the new file names
    console.log(req.files.map(file => file.filename));
    return next();
  }
}
Access the uploaded files in your controller via the request object:
@Post('upload')
public async onUpload(@Request() req) {
  return req.files.map(file => file.filename);
}
How to access the new file names?
You'll find the uploaded files in req.files (middleware) or @UploadedFiles() files (interceptor) as an array with the following structure:
[ { fieldname: 'files',
    originalname: 'originalname.json',
    encoding: '7bit',
    mimetype: 'application/json',
    destination: 'D:/myproject/src/uploads',
    // This is what you are looking for
    filename: 'files-1558459911159.json',
    path: 'D:/myproject/src/uploads/files-1558459911159.json',
    size: 2735 } ]
