Express.js Response Sent Callback - javascript

I have the following code in Node/Express that sends a file as a response then deletes the file using a timeout.
res.sendFile(req.params.id, { root: process.env.UPLOADPATH });
setTimeout(function () {
    if (fs.existsSync(process.env.UPLOADPATH + req.params.id)) { // check to ensure file still exists on file system
        fs.unlink(process.env.UPLOADPATH + req.params.id); // delete file from server file system after 60 seconds
    }
}, 60000);
If I didn't use the setTimeout it failed with an error. I'm assuming Express sends the file asynchronously, so it was deleting the file before it had actually been sent.
Is there a better way to do this, though? Is there a way to check for when the file has been sent so I can safely delete it? Maybe something like a sendFile callback?

Is there a better way to do this, though? Is there a way to check for when the file has been sent so I can safely delete it? Maybe something like a sendFile callback?
Yes, you should just remove the file when res.sendFile() is actually done, and you can use the completion callback on res.sendFile() to know when that is.
Also, it is an anti-pattern to use if (fs.existsSync(...)) and then delete the file, because that is subject to race conditions. If you want the file deleted, just delete it and handle any errors you might get:
// assumes: const path = require('path'); const fs = require('fs');
let filename = path.join(process.env.UPLOADPATH, req.params.id);
res.sendFile(filename, function (err) {
    if (err) {
        next(err);
    } else {
        fs.unlink(filename, function (err) {
            if (err) console.log("error removing ", filename);
        });
    }
});
I'm assuming Express sends the file asynchronously, so it was deleting the file before it had actually been sent.
Yes, that is true.
You could also use the res.on('finish', ...) event to know when the sending of the response is done.
let filename = path.join(process.env.UPLOADPATH, req.params.id);
res.sendFile(filename);
res.on('finish', function () {
    fs.unlink(filename, function (err) {
        if (err) console.log("error removing ", filename);
    });
});

The method invokes the callback function fn(err) when the transfer is complete or when an error occurs. If the callback function is specified and an error occurs, the callback function must explicitly handle the response process, either by ending the request-response cycle or by passing control to the next route.
res.sendFile(fileName, { root: process.env.UPLOADPATH }, function (err) {
    if (err) {
        next(err);
    } else {
        // file has been sent
        console.log('Sent:', fileName);
        if (fs.existsSync(process.env.UPLOADPATH + req.params.id)) { // check the file still exists on the file system
            fs.unlink(process.env.UPLOADPATH + req.params.id, function (err) {
                // delete the file from the server file system now that it has been sent
                if (err) console.log('error removing', fileName);
            });
        }
    }
});

The main drawback to res.on('finish', ...) is that it isn't called if the response is closed prematurely or ends in an error. Using the on-finished package runs the callback on close, finish, or error. This is especially helpful when deleting a file, where you want the file removed even if the response errors out or the client disconnects.
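For example, a minimal sketch using on-finished (the '/download/:id' route is hypothetical; it assumes the same UPLOADPATH setup as the snippets above):
const onFinished = require('on-finished');
const path = require('path');
const fs = require('fs');

app.get('/download/:id', function (req, res) {
    const filename = path.join(process.env.UPLOADPATH, req.params.id);
    // runs on finish, on close (client disconnect), and on error
    onFinished(res, function () {
        fs.unlink(filename, function (err) {
            if (err) console.log("error removing ", filename);
        });
    });
    res.sendFile(filename);
});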

Related

How to wait for the file to get created before sending the response back to the client using res.sendFile()

I have a file creator block which writes the file to disk and then sends the file to the client.
But the response is sent before the file has been written.
carbone.render("./template.odt", data, options, (err, ress) => {
    if (err) {
        return console.log(err);
    }
    var paths = `./static/reports/report-${timestamp}.pdf`;
    fs.writeFileSync(paths, ress);
    process.exit();
});
res.sendFile(
    path.join(__dirname, "..", "static", "reports", `report-${timestamp}.pdf`)
);
What is the solution so that sendFile waits for the file to be written and then fetches it?
This is what the callback function is for. carbone.render renders a template and, when it's done, returns the result to the callback function. JS will not wait for carbone.render to complete before moving on to res.sendFile. You can move the sending part inside the callback function.
In other words, you need to move the res.sendFile call into the callback function of carbone.render.
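A rough sketch of that change, keeping the asker's paths and dropping the process.exit() call (which would kill the server); it assumes this runs inside an Express route handler where res and next are available:
carbone.render("./template.odt", data, options, (err, result) => {
    if (err) {
        return next(err);
    }
    const filePath = path.join(__dirname, "..", "static", "reports", `report-${timestamp}.pdf`);
    // write the rendered report first, then send it only once the write has finished
    fs.writeFile(filePath, result, (writeErr) => {
        if (writeErr) {
            return next(writeErr);
        }
        res.sendFile(filePath);
    });
});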

Node.js Can't set headers after they are sent Error after introducing res.render()

I have this snippet of code:
app.post('/pst', function(req, res) {
    var data = req.body.convo;
    res.render('waiting.ejs'); //ADDED THIS
    myFunc(data).then(result => {
        res.render('success.ejs'); //THEN THIS
        //---------------------------------
        //clever way to send text file to client from the memory of the server
        var fileContents = Buffer.from(result, 'ascii');
        var readStream = new stream.PassThrough();
        readStream.end(fileContents);
        res.set('Content-disposition', 'attachment; filename=' + fileName);
        res.set('Content-Type', 'text/plain');
        readStream.pipe(res);
        //--------------------------------------
    }).catch( .....
The code I commented as 'clever way to send file from memory of the server' comes from this post:
Node Express.js - Download file from memory - 'filename must be a string'
What this does is take a string from memory and serve it to the client as a .txt file.
This code used to work.
Then I decided to add the res.render('waiting.ejs'); line and I got this error:
Error: Can't set headers after they are sent.
I then experimented with adding another res.render() [in this case res.render('success.ejs');] before and after the code that sends the .txt file to the client.
The error remained. Also, there is no redirect to success.ejs; in other words, the res.render('success.ejs'); never worked, regardless of whether it was placed before or after that piece of code.
app.post('/pst', function(req, res) {
    var data = req.body.convo;
    myFunc(data).then(result => {
        //---------------------------------
        //clever way to send text file to client from the memory of the server
        var fileContents = Buffer.from(result, 'ascii');
        var readStream = new stream.PassThrough();
        readStream.end(fileContents);
        res.set('Content-disposition', 'attachment; filename=' + fileName);
        res.set('Content-Type', 'text/plain');
        readStream.pipe(res);
        res.redirect(`/success`); //THEN THIS
        //--------------------------------------
    }).catch( .....
When you add middleware to Express (which is built on Connect) using the app.use method, you're appending items to Server.prototype.stack in Connect.
When the server gets a request, it iterates over the stack, calling each (request, response, next) method.
The problem is, if one of the middleware items writes to the response body or headers (it looks like it's either/or for some reason) but doesn't call response.end(), and you call next(), then as the core Server.prototype.handle method completes, it's going to notice that:
there are no more items in the stack, and/or
response.headerSent is true.
So it throws an error. But the error it throws is just this basic response (from the Connect http.js source code):
res.statusCode = 404;
res.setHeader('Content-Type', 'text/plain');
res.end('Cannot ' + req.method + ' ' + req.url);
The problematic middleware sets the response header without calling response.end() and then calls next(), which confuses Express's server.
So you set the header through res.render(). Now, if you try to render again, it will throw an error.
app.get('/success', (req, res) => {
    res.render("container/index", { waiting: "waiting", ...... });
    // handle your task, then on the client side (index.ejs) use an appropriate
    // setTimeout(() => {}, 2000) for the waiting div, showing it for 2 seconds
});
// then your actual success page gets rendered
You would have to check the express.js source code for res.render():
res.render = function render(view, options, callback) {
  var app = this.req.app;
  var done = callback;
  var opts = options || {};
  var req = this.req;
  var self = this;

  // support callback function as second arg
  if (typeof options === 'function') {
    done = options;
    opts = {};
  }

  // merge res.locals
  opts._locals = self.locals;

  // default callback to respond
  done = done || function (err, str) {
    if (err) return req.next(err);
    self.send(str);
  };

  // render
  app.render(view, opts, done);
};
You can see that when you use the res.render() method, it passes the done callback to app.render(...), which then passes done on to tryInitView etc.
At the end, it invokes the done callback with str in case of success or err in case of failure. The done callback then calls res.send(), which is what prevents you from setting headers after that.
The res.render() function compiles your template, inserts your locals, and creates HTML output out of those two things, then sends that output as the response. That's why the error comes up.
Don't use it twice, because each call sends a response.
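Applied to the asker's handler, a sketch that sends exactly one response per request, keeping the asker's myFunc, fileName, and stream variables (any "waiting" UI would have to live on the client side instead):
app.post('/pst', function (req, res, next) {
    var data = req.body.convo;
    myFunc(data).then(result => {
        // this attachment is the one and only response for this request
        var fileContents = Buffer.from(result, 'ascii');
        var readStream = new stream.PassThrough();
        readStream.end(fileContents);
        res.set('Content-disposition', 'attachment; filename=' + fileName);
        res.set('Content-Type', 'text/plain');
        readStream.pipe(res);
        // no res.render() or res.redirect() here - the response is already being sent
    }).catch(next);
});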

Node: Wait for python script to run

I have the following code, where I upload the file first and then read the file and log the output with console.log(obj). But the response comes first and the Python script runs behind the scenes. How can I make the code wait for the Python script to run before proceeding?
router.post(`${basePath}/file`, (req, res) => {
    //Upload file first
    PythonShell.run('calculations.py', { scriptPath: '/Path/to/python/script' }, function (err) {
        console.log(err);
        let obj = fs.readFileSync('Path/to/file', 'utf8');
        console.log(obj);
    });
    return res.status(200).send({
        message: 'Success',
    });
});
I cannot get the console.log(obj); output because it runs after the response. How can I make it wait for the Python script to run and get the console.log(obj) output on the console?
To return the result after some async operation, you should call res.send inside the done-callback.
router.post(`${basePath}/file`, (req, res) => {
    //Upload file first
    PythonShell.run('calculations.py', { scriptPath: '/Path/to/python/script' }, function (err) {
        console.log('The script work has been finished.'); // (*)
        if (err) {
            res.status(500).send({
                error: err,
            });
            console.log(err);
            return;
        }
        let obj = fs.readFileSync('Path/to/file', 'utf8');
        console.log(obj); // (**)
        res.status(200).send({
            message: 'Success',
        });
    });
});
If you don't see the log (*) in the console, it means that the script does not work or works improperly and the callback is never called. First of all, you need to be sure that the script (PythonShell.run) works and that the callback is being invoked. The POST handler will wait until you call res.send (no matter how long that takes), so that callback is the main point.
Also, readFileSync could fail. In case of a readFileSync failure you should see an exception; if it's ok, you'll see the next log (**) and the response will be sent.
I see PythonShell in your code. I have no experience with it, but after some reading I think the problem could be in how you are using it. It seems to be the python-shell npm package, so following its documentation you may try to instantiate a Python shell for your script and then use listeners:
let pyshell = new PythonShell('calculations.py');

router.post(`${basePath}/file`, (req, res) => {
    pyshell.send(settings); // path, args etc
    pyshell.end(function (err) {
        console.log('The script work has been finished.');
        if (err) { res.status(500).send({ error: err }); }
        else { res.status(200).send({ message: 'Success' }); }
    });
});
This approach could be more appropriate because the Python shell is kept open between different POST requests; whether that suits you depends on your needs. But I guess it does not solve the problem of running the script. If you are sure the script itself is fine, then you just need to run it properly in the Node environment. There are a few points to check:
path to script
arguments
other settings
Try removing all arguments (create a new test script), clean up the settings object (keep only the path) and execute it from Node, handling its result in Node; a minimal sketch of this first step follows. You should be able to run the simplest script given the correct path! Research how to set up the correct scriptPath. Then add an argument to your script, run it with that argument and handle the result again. There are not many options, but each of them could be the cause of an improper call.
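As a starting point, here is a minimal sketch of that first step (test.py and the scriptPath are placeholders; the callback form shown matches the older python-shell API used in the snippets above, while newer versions also offer a promise-based API):
const { PythonShell } = require('python-shell');

// simplest possible call: just a script name and its directory, no arguments
PythonShell.run('test.py', { scriptPath: '/Path/to/python/script' }, function (err, output) {
    if (err) {
        return console.log('script failed:', err);
    }
    // output is an array of lines the script printed to stdout
    console.log('script finished:', output);
});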

writestream and express for json object?

I might be out of my depth, but I really need something to work. I think a write/read stream will solve both my issues, but I don't quite understand the syntax or what's required for it to work.
I read the stream handbook and thought I understood some of the basics, but when I try to apply it to my situation it seems to break down.
Currently I have this as the crux of my information.
function readDataTop(x) {
    console.log("Read " + x[6] + " and Sent Cached Top Half");
    jf.readFile("loadedreports/top" + x[6], 'utf8', function (err, data) {
        resT = data;
    });
}
I'm using the jsonfile plugin for Node, which basically shortens fs.writeFile/fs.readFile and makes them easier to use instead of constantly writing try and catch blocks.
Anyway, I want to implement a stream here, but I am unsure what would happen on my Express end and how the object will be received.
I assume that since it's a stream, Express won't do anything with the object until it receives it? Or would I have to write a callback to make sure that when my function is called, the stream is complete before Express sends the object off to fulfill the AJAX request?
app.get('/:report/top', function (req, res) {
    readDataTop(global[req.params.report]);
    res.header("Content-Type", "application/json; charset=utf-8");
    res.header("Cache-Control", "max-age=3600");
    res.json(resT);
    resT = 0;
});
I am hoping that if I change the read part to a stream it will alleviate two problems. The first is the issue of sometimes receiving partial JSON files when the browser makes the AJAX call, due to the read speed of larger JSON objects. (This might be the callback issue I need to solve, but a stream should make it more consistent.)
Secondly, when I load this Node app, it needs to run 30+ file writes while it gets the data from my DB. The goal was to disconnect the browser from the DB side so Node acts as the DB by reading and writing, because an old SQL server is already being bombarded by a lot of requests (stale data isn't an issue).
Any help on the syntax here?
Is there a tutorial I can see in code of someone piping a response into a write stream? (The mssql module I use puts the SQL response into an object and I need it in JSON format.)
function getDataTop(x) {
    var connection = new sql.Connection(config, function (err) {
        var request = new sql.Request(connection);
        request.query(x[0], function (err, topres) {
            jf.writeFile("loadedreports/top" + x[6], topres, function (err) {
                if (err) {
                    console.log(err);
                } else {
                    console.log(x[6] + " top half was saved!");
                }
            });
        });
    });
}
Your problem is that you're not waiting for the file to load before sending the response. Use a callback:
function readDataTop(x, cb) {
    console.log('Read ' + x[6] + ' and Sent Cached Top Half');
    jf.readFile('loadedreports/top' + x[6], 'utf8', cb);
}

// ...

app.get('/:report/top', function (req, res) {
    // you should really avoid using globals like this ...
    readDataTop(global[req.params.report], function (err, obj) {
        // setting the content-type is automatically done by `res.json()`
        // cache the data here in-memory if you need to and check for its existence
        // before `readDataTop`
        res.header('Cache-Control', 'max-age=3600');
        res.json(obj);
    });
});
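Building on the comment about caching in-memory, here is a small sketch of what that could look like (the reportCache object is a hypothetical addition, not part of the original answer):
var reportCache = {};

app.get('/:report/top', function (req, res, next) {
    var key = req.params.report;
    if (reportCache[key]) {
        // serve the cached copy without touching the filesystem
        res.header('Cache-Control', 'max-age=3600');
        return res.json(reportCache[key]);
    }
    readDataTop(global[key], function (err, obj) {
        if (err) return next(err);
        reportCache[key] = obj; // remember it for subsequent requests
        res.header('Cache-Control', 'max-age=3600');
        res.json(obj);
    });
});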

Detecting successful read stream open

I'm implementing a cache for static serving middleware for Express.js, which works as follows: when a request comes in, the middleware first tries to serve the file from the filesystem, and if there is none, the file is fetched from upstream and stored in the filesystem.
The problem is I don't know how to properly detect the “cache hit” event.
staticMiddleware = function (req, res, next) {
    // try to read file from fs
    filename = urlToFilename(req.url);
    stream = fs.createReadStream(filename);

    // cache miss - file not found
    stream.on('error', function () {
        console.log('miss ' + req.url);
        // get file from upstream, store it into fs and serve as response
        stream = fetchFromUpstream(url);
        stream.pipe(fs.createWriteStream(filename));
        stream.pipe(res);
    });

    // cache hit - file is being read
    I_DONT_KNOW_WHAT_TO_PUT_HERE(function () {
        console.log('hit ' + req.url);
        stream.pipe(res);
    });
}
So, basically, how can I detect a successful file read? If I listen to the 'data' event, I guess I miss the first chunk of data. If I just pipe() it to the response, the response stream gets closed on error and I can't serve it with the fetched data, and this approach really lacks flexibility. I wonder if there is a way to listen for an event like fdcreated or opened or similar, or a way to push back the data I've got in the 'data' event so it will be resent in the next 'data' event.
The createReadStream method returns a ReadableStream, which also emits an open event. You can add an event handler for the open event so you will know when the resource is valid before piping:
stream.on('open', function () {
    console.log('hit ' + req.url);
    stream.pipe(res);
});
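Putting that together with the asker's middleware, here is a sketch of the whole hit/miss flow (urlToFilename and fetchFromUpstream are the asker's own hypothetical helpers):
var fs = require('fs');

var staticMiddleware = function (req, res, next) {
    var filename = urlToFilename(req.url);
    var stream = fs.createReadStream(filename);

    // cache miss - the file could not be opened
    stream.on('error', function () {
        console.log('miss ' + req.url);
        var upstream = fetchFromUpstream(req.url);
        upstream.pipe(fs.createWriteStream(filename)); // store it in the fs cache
        upstream.pipe(res);                            // and serve it at the same time
    });

    // cache hit - the file descriptor was opened successfully
    stream.on('open', function () {
        console.log('hit ' + req.url);
        stream.pipe(res);
    });
};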
