Is it possible to use res.download() after writing a file to the filesystem?
router.get('/exportjson', (req, res, next) => {
  let json = `{"#dope":[{"set":"","val":"200"}],"comment":"comment","folderType":"window"}`
  const file = `${__dirname}/upload-folder/export.JSON`;
  fs.writeFile('file', json, 'application/json', function(){
    res.download(file);
  })
})
I'm not sure I fully understand your question, but I'm assuming you want to be able to save that json data to the path /upload-folder/export.json and then allow the browser to download the file using res.download() at the path GET /exportjson.
You've got a couple of issues. First, fs.writeFile takes a file path as the first argument, and you are just passing the string file. With your code, the data would be written to the current directory as file. You probably want to use the path module and create a path to the file you want to write, like so:
const path = require('path');
const jsonFilePath = path.join(__dirname, '../upload-folder/export.json');
Assuming the code is at routes/index.js, this path resolves from the routes directory up to the project root and then to the file upload-folder/export.json.
The data you want to write is in your variable json, but you have it stored as a string. I would actually leave it as an object:
let json = {
  "#dope": [
    {
      "set": "",
      "val": "200"
    }
  ],
  "comment": "comment",
  "folderType": "window"
};
And then call JSON.stringify on it when you pass it to fs.writeFile as the second argument. You will also need to pass the 'utf-8' encoding as the third argument, not 'application/json' (that is a MIME type, not an encoding):
fs.writeFile(jsonFilePath, JSON.stringify(json), 'utf-8', function(err) {
In the callback to fs.writeFile, you want to call res.download and pass it the path to the file that you just wrote to the filesystem, which is stored in jsonFilePath (you had this part right, I just changed the variable name):
res.download(jsonFilePath);
Here is the relevant portion of the router file that has code to get everything working correctly:
const fs = require('fs');
const path = require('path');

const jsonFilePath = path.join(__dirname, '../upload-folder/export.json');

router.get('/exportjson', (req, res, next) => {
  let json = {
    "#dope": [
      {
        "set": "",
        "val": "200"
      }
    ],
    "comment": "comment",
    "folderType": "window"
  };

  fs.writeFile(jsonFilePath, JSON.stringify(json), 'utf-8', function(err) {
    // Pass write errors to Express instead of leaving the request hanging.
    if (err) return next(err);
    res.download(jsonFilePath);
  });
});
Assuming this file lives in /routes/index.js, the file would be saved at /upload-folder/export.json.
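If you also want to control the filename the browser saves, or handle transfer errors, res.download accepts an optional filename and callback. A minimal sketch (the download name export.json here is just an example):

res.download(jsonFilePath, 'export.json', function(err) {
  // The callback fires after the transfer finishes or fails.
  if (err) console.error('Download failed:', err);
});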
I got this third-party lib which generates a screenshot.
I want to save this on my server. I'm using Axios. It's probably something with blobs, arraybuffers, etc.?
How do I send it?
Axios.post('/api/saveimage', { ??? })
Using Node.js Express on the backend. How do I save this to a physical image file?
Well, on the frontend you need to send it like this:
let formData = new FormData();
formData.append("image", file);
axios.post("/api/saveimage", formData);
First you create a FormData object, then you append the file; in this case I named the field image. Now let's go to the next step: you will need multer on your Node.js side.
npm i multer
The first thing you need to do is create a middleware:
const multer = require("multer");

const whitelist = ["image/png", "image/jpeg", "image/jpg", "image/webp"];

const storeImages = multer.diskStorage({
  destination: async function (req, file, cb) {
    // Reject anything that is not an allowed image MIME type.
    if (!whitelist.some((type) => type === file.mimetype)) {
      return cb(new Error("File is not allowed"), "/");
    }
    cb(null, "your/destination/path");
  },
  filename(req, file, cb) {
    // Keep the original base name and normalize the extension.
    let [name] = file.originalname.split(".");
    cb(null, name + ".png");
  },
});

exports.uploadImageStorage = multer({
  storage: storeImages,
});
Watch out here: your destination path must already exist (multer will not create it for you). Also, don't forget an extension for your file, in this case .png.
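If you want to be sure the folder exists before multer writes to it, you could create it up front. A small sketch, assuming your/destination/path is the folder used above:

const fs = require("fs");

// recursive: true creates intermediate folders and is a no-op if they already exist.
fs.mkdirSync("your/destination/path", { recursive: true });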
Now you create your route:
const { uploadImageStorage } = require("../yourstorage");

app.post("/api/saveimage", uploadImageStorage.single("image"), (req, res) => {
  let file = req.file;
  let path = file.path;
  // Respond so the client request does not hang.
  res.json({ path });
});
Here you need to know that in uploadImageStorage.single("image") I used image exactly as I used it in formData.append("image", file); the two names need to be the same.
Now you can save the path of your file into a database. You can also transform your image with sharp if you want, as sketched below.
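For example, a minimal sharp sketch inside the route above (the resize dimensions and output path are just placeholders):

const sharp = require("sharp");

// Resize the uploaded image and write the result as a separate file.
sharp(file.path)
  .resize(300, 300)
  .toFile("your/destination/path/thumbnail.png")
  .then(() => console.log("thumbnail written"))
  .catch((err) => console.error(err));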
From my experience, if you have a folder called static with an image inside it like photo.png, you usually get the photo at localhost:3000/photo.png, not localhost:3000/static/photo.png.
You will need to remove static from your path if you have this setup; otherwise, if you try to display the image on the frontend, you won't see it.
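That behavior comes from how the static middleware is usually mounted. A short sketch, assuming an Express setup like this:

const express = require("express");
const app = express();

// Files in ./static are served from the URL root, so static/photo.png
// is reachable at /photo.png.
app.use(express.static("static"));

// Mounting it on a prefix keeps /static in the URL instead:
// app.use("/static", express.static("static"));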
I'm trying to download a file (without saving it) and convert it, but readFileSync throws an error that the file can't be found: "Error: ENOENT: no such file or directory, open 'https://s3.amazonaws.com/appforest_uf/f1631452514756x615162562554826200/testdoc.txt'"
Code:
var path = require('path');
const fs = require('fs');
const https = require('https');
const file_x = 'https://s3.amazonaws.com/appforest_uf/f1631452514756x615162562554826200/testdoc.txt';
var filename = path.basename(file_x);
const FileBuffer = fs.readFileSync(file_x);
const fileParam = {
  value: FileBuffer,
  options: {
    filename: filename,
    contentType: 'application/octet-stream'
  }
};
console.log(fileParam);
And I get the error the file can't be found... The URL is working OK, do I need to do something else to download from URL ?
Did you try looking for similar questions? For example, readFileSync not reading URL as file. There it is clearly stated that the fs API only deals with local files.
You will need to use the https module to stream the file.
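A minimal sketch with the built-in https module, collecting the response chunks into a Buffer (file_x is the URL from the question):

const https = require('https');

https.get(file_x, (res) => {
  const chunks = [];
  res.on('data', (chunk) => chunks.push(chunk));
  res.on('end', () => {
    // Same shape as the FileBuffer the question builds with readFileSync.
    const FileBuffer = Buffer.concat(chunks);
    console.log(FileBuffer.length);
  });
}).on('error', (err) => console.error(err));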
The fs module only reads local files, not URLs. Alternatively, you can use a package like axios. Axios allows you to make HTTP requests within Node.js.
Example usage:
const axios = require('axios');

axios.get('https://s3.amazonaws.com/appforest_uf/f1631452514756x615162562554826200/testdoc.txt').then(response => {
  console.log(response);
  ...
});
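If you need the file contents as a Buffer (like the FileBuffer the question builds), you could ask axios for binary data instead. A small sketch, where file_x is the URL from the question:

axios.get(file_x, { responseType: 'arraybuffer' }).then(response => {
  // In Node, response.data is binary here; wrap it to be safe.
  const FileBuffer = Buffer.from(response.data);
  console.log(FileBuffer.length);
});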
Can someone tell me why the function below properly downloads my file from the server when I work locally (on localhost), but returns a 500 Internal Server Error when I deploy my app to a remote server?
async downloadFile(fileId: number): Promise<Buffer> {
  const fileName = await this.getFileName(fileId);
  const fileBuffer = await new Promise<Buffer>((resolve, reject) => {
    fs.readFile(process.cwd() + '/files/' + fileName + '.xlsx', {}, (err, data) => {
      if (err) reject(err);
      else resolve(data);
    });
  });
  return fileBuffer;
}
Thanks for any help.
EDIT, ERROR FROM LOG:
ENOENT: no such file or directory
If you want to access your file relative to your script's directory, you should use __dirname.
Using the path module to build your file location in a platform-agnostic way is also good practice.
const path = require('path')
const filePath = path.join(__dirname, 'files', `${fileName}.xlsx`)
process.cwd() refers to your Node process's working directory. Using it in your context ties your code to how the entry point was called, which is bad: code should not have to be aware of its execution context to work, whenever that is possible.
An even better way would be to make your file location configurable (using an environment variable or a config file) and pass your download folder value to your code this way.
see https://12factor.net/config
Example:
const baseDir = process.env.FILES_PATH || '/some/default/location';
const filePath = path.join(baseDir, 'files', `${fileName}.xlsx`);
then run your program with
FILES_PATH=/your/directory node your_script.js
I have an express server.
server.js
const express = require('express');
const app = express();
var json = require("./sample.js");

app.use("/", (req, res) => {
  console.log("----------->", JSON.stringify(json));
  res.status(200).send(JSON.stringify(json));
});

app.listen(2222, () => {
  console.log(`Listening on port localhost:2222/ !`);
});
sample.js
var offer = {
  "sample": "Text",
  "ting": "Toing"
};

module.exports = offer;
Once I execute the server.js file, it fetches the JSON data from sample.js. If I update the data in sample.js while server.js is still running, I don't get the updated data. Is there any way to do this without stopping the execution of server.js?
Yes, there is a way: you have to read the file every time a request occurs (or cache it for a while, for a bit better performance).
The reason why require does not work is that Node.js automatically caches modules for you. So even if you required it inside the request handler (in the use call), it won't work.
Because you cannot use require, it won't be convenient (or performant) to use a module. So your file should be in JSON format instead:
{
  "sample": "Text",
  "ting": "Toing"
}
To read it, you have to use the fs (file system) module. This allows you to read the file from disk every time:
const fs = require('fs');

app.get("/", (req, res) => {
  // To read as a text file, you have to specify the correct encoding.
  fs.readFile('./sample.json', 'utf8', (err, data) => {
    // You should always specify the content type header,
    // when you don't use 'res.json' for sending JSON.
    res.set('Content-Type', 'application/json');
    res.send(data);
  });
});
It is important to know that data is now a string, not an object; you would need JSON.parse() to get an object.
Also, use is not recommended in this case: it is for middleware. Consider using get (as in my example), or all if you want to handle any verb.
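If you want to send a parsed object instead, here is a small variant of the handler above (just a sketch of the same idea):

app.get("/", (req, res) => {
  fs.readFile('./sample.json', 'utf8', (err, data) => {
    if (err) return res.status(500).send('Could not read sample.json');
    // res.json sets the Content-Type header and serializes for you.
    res.json(JSON.parse(data));
  });
});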
You need to read the file at runtime:
const fs = require('fs');

// Read the file on every call so edits on disk are picked up.
// Note: you cannot return the data from inside the readFile callback,
// so the result is handed to a callback instead.
function getJson(callback) {
  fs.readFile('sample.json', 'utf8', (err, jsonData) => {
    if (err) {
      console.log('error reading sample.json', err);
      return callback(err);
    }
    callback(null, JSON.parse(jsonData));
  });
}
Make sure your sample.js is instead just a JSON file (sample.json) containing the object.
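A sketch of how a route could use it (hypothetical wiring, matching the callback signature above):

app.get("/", (req, res) => {
  getJson((err, json) => {
    if (err) return res.status(500).send('Could not read sample.json');
    res.json(json);
  });
});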
In the following path:
app/test/vehicle/
there are several .js files. Is there any way in JavaScript to get the number of .js files contained in the above-mentioned path?
I am using the WebStorm environment.
Yes, using Node:
const fs = require('fs');
const dir = './directory';

fs.readdir(dir, (err, files) => {
  // Note: this counts every entry in the directory, not just .js files.
  console.log(files.length);
});
https://stackoverflow.com/a/43747896/3650835
Since JavaScript doesn't have local filesystem access when run in the browser, there is no way to access local files from JavaScript. If you are asking about accessing all files hosted on a CDN or something similar in that directory, there is still no way to tell which requests will return a 20x.
If you are using Node.js, use the fs.readdir function:
const fs = require('fs');

fs.readdir('app/test/vehicle/', (err, files) => {
  files.forEach(file => { console.log(file); });
});
There is no way to access the file system from a web page.
If you are using Node.js, then you can use fs.readdir or fs.readdirSync:
const fs = require('fs');
const dir = './directory';

fs.readdir(dir, (err, files) => {
  console.log(files.filter(file => file.endsWith(".js")).length);
});
Where are you trying to get the list of files? On the client side it is impossible: JavaScript does not have access to the client's file system. If you are trying to do it on the server side, using for example Node.js, then you can do something like:
const filesys = require('fs');
const files = filesys.readdirSync('app/test/vehicle/');
const jsCount = files.filter(file => file.endsWith('.js')).length;