The goal is to receive an archive from the client, add a file, and upload it to Cloud Storage without creating a temporary file. The client and the server both use the archiver library. The problem with the code below is that file2.txt does not get added to the archive.
Client:
import archiver from "archiver";
const archive = archiver("zip", {
zlib: { level: 9 },
});
archive.append("string cheese!", { name: "file1.txt" });
await fetch(`/archive`, {
method: "POST",
body: archive,
});
Server:
import archiver from "archiver";
import { Storage } from "@google-cloud/storage";
import { Router } from "express";
const router = Router();
const storage = new Storage();
router.post("/", (req, res) => {
const archive = archiver("zip", {
zlib: { level: 9 },
});
const cloudFile = storage.bucket("archives").file("archive.zip");
req.pipe(archive, {
end: false,
});
archive.pipe(cloudFile.createWriteStream());
req.on("end", async () => {
archive.append("string cheese!", { name: "file2.txt" });
archive.finalize();
archive.end();
});
});
As I can see from the documentation, the examples use fs.createReadStream(file1) to read from the file system. However, you can achieve this without touching the disk if you build a Buffer from the received data:
const multer = require('multer')
const upload = multer() // memory storage by default, so uploaded files arrive as Buffers
router.post("/", upload.any(), function (req, res, next) {
// append a file from the received buffer
const buffer = req.files[0].buffer
archive.append(buffer, { name: 'file3.txt' });
})
Also, a tip if you are working in a serverless environment: most likely you won't have access to the file system to read, write, and clean up. In some cases, though, you do have access to a /tmp directory, so it's worth a quick search to check whether that applies to you.
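Putting that together, here is a minimal sketch of the server route. Note the assumptions: the client sends its files as multipart FormData instead of streaming a zip as the raw body (which sidesteps the problem of appending to an already-finalized archive), and the bucket, object, and entry names mirror the question's code.

```javascript
const express = require("express");
const multer = require("multer");
const archiver = require("archiver");
const { Storage } = require("@google-cloud/storage");

const router = express.Router();
const storage = new Storage();
const upload = multer(); // memory storage: uploaded files arrive as Buffers on req.files

router.post("/", upload.any(), (req, res) => {
  const archive = archiver("zip", { zlib: { level: 9 } });
  const cloudFile = storage.bucket("archives").file("archive.zip");

  // respond only once the upload stream has fully flushed
  archive.pipe(cloudFile.createWriteStream())
    .on("finish", () => res.sendStatus(200))
    .on("error", () => res.sendStatus(500));

  // append each uploaded file from its in-memory buffer, plus the extra entry
  for (const file of req.files) {
    archive.append(file.buffer, { name: file.originalname });
  }
  archive.append("string cheese!", { name: "file2.txt" });
  archive.finalize();
});
```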
I'm trying to upload an image (jpg/jpeg/png) from the browser to NodeJS. I have read through several tutorials and many posts on forums but very few seem to have this specific issue.
I've made sure to match the name provided to multer (upload.single('upload')) with the formData key (formData.append('upload', selectedFile, selectedFile.name))
I tried using headers originally, but later read that I should exclude them.
I tried submitting through a <form action="/upload" method="post" enctype="multipart/form-data"> but still got the same error.
I have found this similar question with only one answer, which isn't clear: Multer gives unexpected end of form error, and this question, Unexpected end of form at Multipart._final, which has no answers.
Every other question seems to be about an 'Unexpected field' or 'Unexpected end of multipart data' error which - judging from the solutions - is irrelevant here.
Below is my code...
Browser:
<body>
<input type="file" id="file_uploader" name="upload" />
<button onclick="uploadImage()" class="btn-default">SUBMIT</button>
<!-- OTHER STUFF -->
</body>
<script>
let selectedFile;
let uploadData = new FormData();
const fileInput = document.getElementById('file_uploader');
fileInput.onchange = () => {
selectedFile = fileInput.files[0];
uploadData.append('upload', selectedFile, selectedFile.name);
}
function uploadImage(){
fetch('/upload', {
method: 'POST',
body: uploadData
})
.then((response) => {
console.log(response);
})
.catch((error) => {
console.error('Error: ', error);
});
}
</script>
NodeJS:
let express = require('express');
const multer = require('multer');
//multer options
const upload = multer({
dest: './upload/',
limits: {
fileSize: 1000000,
}
})
const app = express();
app.post('/upload', upload.single('upload'), (req, res) => {
res.send();
}, (error, req, res, next) => {
console.log(error.message);
})
exports.app = functions.https.onRequest(app);
...And here is the error log, if it helps:
Error: Unexpected end of form
at Multipart._final (C:\Users\p\Downloads\MyInvestmentHub\functions\node_modules\busboy\lib\types\multipart.js:588:17)
at callFinal (node:internal/streams/writable:694:27)
at prefinish (node:internal/streams/writable:723:7)
at finishMaybe (node:internal/streams/writable:733:5)
at Multipart.Writable.end (node:internal/streams/writable:631:5)
at onend (node:internal/streams/readable:693:10)
at processTicksAndRejections (node:internal/process/task_queues:78:11)
I haven't posted many questions as of yet, so I apologise if I'm missing something or the format is off. Let me know and I will make appropriate edits.
Thanks.
I also got the exact same error. Before using multer I had installed express-fileupload. When I uninstalled it with npm uninstall express-fileupload, the error went away.
If it's the same case for you, don't forget to also delete the code you added for the express-fileupload module (like requiring it).
Hi there. I ran into the same issue; for me it was the lack of a bodyParser middleware, which allows the files in our requests to be parsed into Buffers.
I was able to resolve the problem like so in Express:
var bodyParser = require('body-parser')
bodyParser.json([options])
I had this problem using multer with a Next.js API route. What worked for me was exporting a config that sets bodyParser to false, like so:
export const config = {
api: {
bodyParser: false
}
}
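For completeness, here is a sketch of how an Express-style middleware like multer can then be run inside a Next.js API route once bodyParser is off. The runMiddleware adapter is a commonly used pattern; the multer field name "upload" is an assumption.

```javascript
// In the actual Next.js route file you would also export:
//   export const config = { api: { bodyParser: false } };

// Adapts an Express-style (req, res, next) middleware to a Promise.
function runMiddleware(req, res, fn) {
  return new Promise((resolve, reject) => {
    fn(req, res, (result) =>
      result instanceof Error ? reject(result) : resolve(result)
    );
  });
}

// Usage in the route handler (multer lines are illustrative):
//   const upload = multer({ storage: multer.memoryStorage() });
//   export default async function handler(req, res) {
//     await runMiddleware(req, res, upload.single("upload"));
//     res.status(200).json({ name: req.file.originalname });
//   }
```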
In my case, the cause was other middleware. Check for other middleware running before multer. For me, the issue was express-openapi-validator middleware. Once I removed that middleware, it worked as expected.
Using body-parser package worked for me:
const bodyParser = require('body-parser')
// ...
app.use(bodyParser()) // support encoded bodies
My upload single file route:
const multer = require('multer')
const express = require('express')
const router = express()
const path = require('path') // node built-in path package
// needs "app.use(bodyParser())" middleware to work
const storage = multer.diskStorage({
destination: function (req, file, cb) {
cb(null, process.cwd() + '/public/')
},
filename: function (req, file, cb) {
// generate the public name, removing problematic characters
const originalName = encodeURIComponent(path.parse(file.originalname).name).replace(/[^a-zA-Z0-9]/g, '')
const timestamp = Date.now()
const extension = path.extname(file.originalname).toLowerCase()
cb(null, originalName + '_' + timestamp + extension)
}
})
const upload = multer({
storage: storage,
limits: { fileSize: 1 * 1024 * 1024 }, // 1 Mb
fileFilter: (req, file, callback) => {
const acceptableExtensions = ['png', 'jpg', 'jpeg', 'jpg']
if (!(acceptableExtensions.some(extension =>
path.extname(file.originalname).toLowerCase() === `.${extension}`)
)) {
return callback(new Error(`Extension not allowed, accepted extensions are ${acceptableExtensions.join(',')}`))
}
callback(null, true)
}
})
router.post('/api/upload/single', upload.single('file'), (req, res) => {
res.status(200).json(req.file)
})
module.exports = {
uploadRouter: router
}
I think this may be caused by the response ending too early; in your chain of middleware, you can do the file upload last.
Doing this resolved the problem for me.
const upload = multer({
dest: "./uploads",
});
app.use(upload.any());
app.post(
"/upload",
(req, res, next) => {
res.end("File uploaded successfully");
},
upload.single("fileKey")
);
Try using this; it worked for me:
const express = require('express')
const app = express()
const path = require('path')
const multer = require('multer')
var filestorageEngine = multer.diskStorage({
destination: (req, file, cb) => {
cb(null,'./uploads')
},
filename:(req,file, cb) => {
cb(null,"[maues]-" + file.originalname)
}
})
var upload = multer({
storage:filestorageEngine
})
app.post('/file', upload.array('file', 3), (req, res) => {
console.log(req.files) // upload.array populates req.files (an array), not req.file
res.send("file uploaded successfully")
})
app.listen(5000, ()=> {
console.log("server running")
})
In my case, the fix was removing the headers from my request on the frontend/client side, and making sure the inputs are sent as FormData.
For example:
let formData = new FormData();
formData.append("fileName", file);
const res = await fetch("/api/create-card", {
method: "POST",
body: formData,
})
This worked for me.
I think the problem is in the express and body-parser modules; I just eliminated this line:
app.use(bodyParser.text({ type: '*/*' }));
and it works!
Try downgrading Multer to 1.4.3. It worked for me.
See https://github.com/expressjs/multer/issues/1144
I'm at a loss here. My .env file is in my root directory, as instructed. I've tried this by way of both require('dotenv').config() and import dotenv from 'dotenv', followed by dotenv.config(). I've tried passing config an absolute path, as you will see. Trying to console log the environment variables always returns undefined. I tried checking for dotenv.error, as you will also see, and the condition doesn't trigger.
All I see is undefined. It's as if my .env file doesn't even exist.
Here is my code in its current state. Any help would be appreciated.
import express from "express";
import cors from "cors";
import multer from "multer";
import AWS, { PutObjectCommandOutput, S3, S3ClientConfig } from "@aws-sdk/client-s3";
import { createReadStream } from "fs";
import path from 'path';
const dotenvAbsolutePath = path.join(__dirname, '.env')
const app = express();
const port = 5000;
// require('dotenv').config();
const dotenv = require('dotenv').config({
path: dotenvAbsolutePath,
});
if (dotenv.error) {
console.log(`DOTENV ERROR: ${dotenv.error.message}`);
throw dotenv.error;
}
const keys = {
key: process.env.AWS_ACCESS_KEY_ID,
secret: process.env.AWS_SECRET_KEY,
region: process.env.AWS_REGION
};
console.log(dotenv);
const upload = multer({
storage: multer.diskStorage({ destination: "./tmp" }),
});
const s3 = new S3({
credentials: { accessKeyId: keys.key!, secretAccessKey: keys.secret! },
region: keys.region!
});
app.use(cors());
app.post("/", upload.single("pdf_upload"), async (req, res) => {
let fileName: string | undefined;
// let fileBuffer: Buffer | undefined;
let uploadResponse: PutObjectCommandOutput | undefined;
if (req.file) {
fileName = req.file.originalname;
// fileBuffer = req.file.buffer
console.log("HTTP Request Received");
const fileReadStream = createReadStream(req.file!.path).on(
"ready",
async () => {
try {
console.log("Stream is Readable");
console.log(fileReadStream.bytesRead);
uploadResponse = await s3.putObject({
Bucket: "fiberpunch-test",
Key: fileName,
Body: fileReadStream,
});
res.header("Access-Control-Allow-Origin", "*");
res
.send({ file: { ETag: uploadResponse.ETag, Key: fileName } })
.status(200);
} catch (err: any) {
console.log("Upload Error: ", err.message);
res.sendStatus(500);
console.log(fileReadStream.bytesRead);
}
}
);
}
});
try {
app.listen(port);
console.log(`listening at port ${port}`);
} catch (err) {
console.log("ERROR SETTING UP SERVER");
}
I think you need to use process.env to access your env variables. Try logging process.env.AWS_ACCESS_KEY_ID
Okay, I figured it out. First of all, I was working in TypeScript and forgot to compile. Second, the absolute path to the .env file was incorrect. After fixing both, the environment variables pulled just fine.
I'm running into a new error, though:
Upload Error: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.
I'll post another question if I can't figure this one out. Thanks, everyone.
If you use AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in your .env file, you don't have to pass them again to create an S3 object. The AWS SDK automatically recognizes these keys and uses them when creating the S3 client.
You only need the region to create an S3 object:
const s3 = new S3({
region: 'your region'
});
I have been trying to upload files (mostly images) to Firebase Storage through a Firebase Cloud Function (onRequest method). I had to upload the files from their base64 form. With the code below, I was able to achieve it, yet the file seems to be broken after upload.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const bucket = admin.storage().bucket();
const database = admin.database();
const express = require('express');
const cors = require('cors');
const safetyLogsAPI = express();
safetyLogsAPI.use(cors({ origin: true }));
safetyLogsAPI.post('/', async (req, res) => {
try {
const {
attachments,
projectId
} = req.body;
const safetyKey = database.ref('safetyLogs')
.child(projectId)
.push().key;
if(attachments && Array.isArray(attachments)) {
if (attachments.length > 0) {
for (let index = 0; index < attachments.length; index++) {
const base64Obj = attachments[index];
const { content, fileName, contentType } = base64Obj;
const stream = require('stream');
const bufferStream = new stream.PassThrough();
bufferStream.end(Buffer.from(content, 'base64'));
const fullPath = `SafetyIncidentLog/Images/${projectId}/${safetyKey}/${fileName}`;
const file = bucket.file(fullPath);
const metadata = {
projectId,
safetyLogId: safetyKey,
createdTimestamp: Date.now(),
systemGenerated: 'false',
fileType: 'Image',
fileName,
path: fullPath
};
bufferStream.pipe(file.createWriteStream({
metadata: {
contentType,
metadata
},
public: true,
validation: "md5"
}))
.on('error', (err) => {
console.log('Error Occured');
console.log(err);
})
.on('finish', () => {
console.log('File Upload Successfull');
});
}
}
}
return res.status(200).send({
code: 200,
message: 'Success'
});
} catch (error) {
console.log(error);
return res.status(500).send({
code:500,
message: 'Internal Server Error',
error: error.message
});
}
});
module.exports = functions.https.onRequest(safetyLogsAPI);
I have tried this approach with the prefix part data:image/png;base64 both present and removed. Both ways, I see a broken image. So where have I gone wrong? Is there a better way to do it?
Thanks in advance.
Also, is the approach I'm trying a recommended one? For use cases like profile picture uploads and conversation image attachments, is this way recommended, or is a direct upload from the client preferred?
With Cloud Functions HTTP triggers, the request is terminated and the function is shut down as soon as you send a response to the client. Any asynchronous work that isn't finished might never finish.
What I'm seeing in your code is that you send a response before the upload is complete: your call to res.status(200).send() happens immediately after you start the upload. Instead, your code should wait to send the response until the upload completes, perhaps using on('finish') and on('error').
I am trying to fetch a zip file uploaded to aws s3. After that file is fetched, I have to extract it and display the names of files inside the folder. How can I achieve this? I am new to file streaming and this is what I have done till now.
import * as aws from "aws-sdk";
import express from "express";
import fs from "fs";
import request from "request";
import * as unzipper from "unzipper";
const config = {
// credentials
};
const s3Client = new aws.S3(config);
const app = express();
app.use(express.json({
limit: "1mb"
}));
app.use(express.urlencoded({
extended: true
}));
app.post("/seturl", async(req, res) => {
try {
const url = req.body.url;
request(url).pipe(fs.createWriteStream('ez.zip'));
console.log("here");
const zip = fs.createReadStream('ez.zip').pipe(unzipper.Parse({
forceStream: true
}));
for await (const entry of zip) {
const fileName = entry.path;
console.log("///////////", fileName);
const type = entry.type; // 'Directory' or 'File'
const size = entry.vars.uncompressedSize; // There is also compressedSize;
if (fileName === "this IS the file I'm looking for") {
entry.pipe(fs.createWriteStream('output/path'));
} else {
entry.autodrain();
}
}
} catch (error) {
return Promise.reject(`Error in reading ${error}`);
}
});
app.listen(5600, (err) => {
if (err) {
console.error(err);
} else {
console.log("running");
}
});
I am using the unzipper library here; if there is something better, I am open to using it. As of now, I am getting a FILE ENDED error.
GOAL: Allow the user to download a PDF
Background: The below code generates a car.pdf file and stores it into the main project's directory when localhost:3000/ is loaded. This is great because I want to find a Car by id in the database, generate a handlebars template, pass the data from Car into it, and generate a PDF from the compiled HTML
Issue: Instead of saving the PDF to the main project's directory, I want it to download to the user's computer.
How can I do this?
Here is my code. I am using the NPM package: html-pdf
helpers/export-helper.js
const fs = require('fs');
const pdf = require('html-pdf');
const Handlebars = require('handlebars');
const { Car } = require('./../models/car');
var writePDF = (res) => {
Car.findById({_id: '58857e256b639400110897af'})
.then((car) => {
var source = fs.readFileSync(__dirname + '/templates/car.handlebars', 'utf8');
var template = Handlebars.compile(source);
var file = template(car);
pdf.create(file, { format: 'Letter' })
.toFile('./car.pdf', (err, res) => {
if (err) return console.log(err);
console.log(res); // { filename: '/app/businesscard.pdf' }
});
})
.catch((errors) => {
console.log(errors);
});
};
module.exports = { writePDF };
routes/home.js
const express = require('express');
const router = express.Router();
const { writePDF } = require('./../helpers/export-helpers');
router.get('/', (req, res) => {
writePDF();
});
module.exports = router;
You should use res.download for this, like so:
router.get('/', (req, res) => {
res.download('car.pdf');
});
https://expressjs.com/en/api.html#res.download
You have to pipe the created PDF into the response to send it to the client.
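Expanding on that, html-pdf's toStream gives a readable stream that can be piped straight into the response, so nothing is written to the project directory. This is a sketch based on the question's own code; the headers and the hard-coded id are illustrative.

```javascript
const fs = require('fs');
const pdf = require('html-pdf');
const Handlebars = require('handlebars');
const { Car } = require('./../models/car');

router.get('/', (req, res) => {
  Car.findById({ _id: '58857e256b639400110897af' })
    .then((car) => {
      const source = fs.readFileSync(__dirname + '/templates/car.handlebars', 'utf8');
      const html = Handlebars.compile(source)(car);
      pdf.create(html, { format: 'Letter' }).toStream((err, stream) => {
        if (err) return res.sendStatus(500);
        res.setHeader('Content-Type', 'application/pdf');
        res.setHeader('Content-Disposition', 'attachment; filename="car.pdf"');
        stream.pipe(res); // the browser receives the PDF as a download
      });
    })
    .catch(() => res.sendStatus(500));
});
```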