I'm testing my API endpoints with supertest, and it works great, but I can't figure out how to test whether a file download is successful.
In my routes file I have defined the endpoint as:
app.get('/api/attachment/:id/file', attachment.getFile);
and the function getFile() looks something like this:
exports.getFile = function(req, res, next) {
  Attachment.getById(req.params.id, function(err, att) {
    [...]
    if (att) {
      console.log('File found!');
      return res.download(att.getPath(), att.name);
    }
  });
};
Then, in my test file, I try the following:
describe('when trying to download file', function() {
  it('should respond with "200 OK"', function(done) {
    request(url)
      .get('/api/attachment/' + attachment._id + '/file')
      .expect(200)
      .end(function(err, res) {
        if (err) {
          return done(err);
        }
        return done();
      });
  });
});
I know for sure that the file is found, because it logs File found!. It also works fine if I try it manually, but for some reason mocha returns Error: expected 200 "OK", got 404 "Not Found".
I've experimented with different MIME types and with supertest's .set("Accept-Encoding", "*"), but nothing works.
Anyone know how to do this?
Either the problem has been fixed in the libraries, or there is a bug in some other part of your code. Your example runs fine, and gives:
when trying to download file
File found!
✓ should respond with "200 OK"
When testing a file download, it is not enough to validate the response status from the server; it is even better if you can somehow validate the response data as well.
For a download, the contents of the file are usually passed in the HTTP response body as text, with the file type in Content-Type and the attachment disposition and filename in Content-Disposition.
Depending on how detailed you would like to go, you can try the following:
// Inside an async test (e.g. with Jest):
const response = await request(url)
  .get('/api/attachment/' + attachment._id + '/file');

expect(response.headers["content-type"]).toEqual("image/png");
expect(response.text).toMatchSnapshot(); // Use only if the file is deterministic.
Using jest or any other snapshot framework, you can achieve a more reliable test.
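If you would rather stay within supertest's own assertion style, the download headers can also be checked directly; a minimal sketch along the lines of the original test (attachment and url come from the question):
request(url)
  .get('/api/attachment/' + attachment._id + '/file')
  .expect(200)
  // res.download() sets this header, so it is a good signal that the file
  // itself was sent rather than an error page.
  .expect('Content-Disposition', /attachment/)
  .end(done);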
I want to upload the contents of an Excel file to the server in order to get its data and do some stuff...
I came up with the following code; however, it seems like it is not working properly, as the following error displays in the console: Error: Can't set headers after they are sent.
The file is getting uploaded into the folder and the JSON message is being displayed... However, I do not know if I am going to face any issues in the future...
Actually, I just need the Excel data; there is no need for the Excel file itself to be stored... Maybe you could give me a workaround, guys...
// (requires assumed for this snippet: express, multer, path and the xlsx package)
const express = require('express');
const multer = require('multer');
const path = require('path');
const XLSX = require('xlsx');

const router = express.Router();
const storage = multer.diskStorage({
destination(req, file, cb) {
cb(null, 'uploads/');
},
filename(req, file, cb) {
cb(
null,
`${file.fieldname}-${Date.now()}${path
.extname(file.originalname)
.toLowerCase()}`
);
},
});
const excelFilter = (req, file, cb) => {
if (
file.mimetype.includes('excel') ||
file.mimetype.includes('spreadsheetml')
) {
cb(null, true);
} else {
cb('Please upload only excel file.', false);
}
};
const upload = multer({
storage,
fileFilter: excelFilter,
});
router.post('/', upload.single('file'), (req, res) => {
var workbook = XLSX.readFile(req.file.path);
var sheet_name_list = workbook.SheetNames;
var xlData = XLSX.utils.sheet_to_json(workbook.Sheets[sheet_name_list[0]]);
res.json(xlData).sendFile(`/${req.file.path}`, { root: path.resolve() });
});
May I have res.json() and res.sendFile() together in the same API endpoint in Express?
No, you cannot. Each of those methods sends a complete HTTP response (including calling res.end(), which terminates the HTTP request), and you can only send one HTTP response to each incoming request. The particular error you're getting comes from res.sendFile() trying to configure the response it is getting ready to send and finding that the HTTP response object has already been used to send a response and can't be used again.
Ordinarily, if you wanted to send two different pieces of data, you would just combine them into a single JavaScript object and call res.json() on the object that contains both pieces of data.
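For instance, a minimal sketch built on the route from the question (the file and rows property names are just illustrative):
router.post('/', upload.single('file'), (req, res) => {
  const workbook = XLSX.readFile(req.file.path);
  const sheetName = workbook.SheetNames[0];
  const rows = XLSX.utils.sheet_to_json(workbook.Sheets[sheetName]);

  // One response only: everything the client needs goes into a single JSON body.
  res.json({
    file: req.file.filename, // where multer stored the upload
    rows: rows,              // the parsed spreadsheet data
  });
});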
But sending a binary file is not something you can easily put in a JSON package. You could construct a multipart response where one part is the JSON and one part is the file. You could JSON-encode the binary data (though that's inefficient). I presume there are probably some modules that would help you do that, but for most clients, that isn't what they are really expecting or equipped to handle.
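For completeness, a rough sketch of that (inefficient) JSON-encoding option, using base64 so the raw bytes can travel inside the JSON body (property names are illustrative):
const fs = require('fs');

// Inside the route handler, after parsing the sheet:
const fileBase64 = fs.readFileSync(req.file.path).toString('base64');
res.json({
  rows: xlData,                     // parsed sheet data from the question
  filename: req.file.originalname,  // original name of the uploaded file
  fileBase64: fileBase64,           // roughly 33% larger than the raw bytes
});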
The only way to get to a proper solution is for us to understand what client/server workflow you're trying to implement here and why you're trying to send back the same file that was just uploaded. There would normally not be a reason to do that, since the client already has that data (they just uploaded it).
I have written a simple function to handle upload of files in my sails.js app.
let upload = file.upload((err, uploadedFiles) => {
if (err) {
return res.serverError(err);
} else {
return res.send({ data: uploadedFiles });
}
});
When the upload is complete I am redirected to a page displaying raw JSON, which contains the uploaded file information (including the path).
[screenshot: raw JSON response]
What I am expecting when I console.log(upload) is the same information; however, I am getting the write stream instead.
[screenshot: console.log output]
This is a problem for me because I would like to be able to extract the file name from the object and use it in another part of my program, but I can't do this because all I am able to access is the write stream.
I have tried using async/await and callbacks and can't seem to fix my issue.
Hopefully someone can help me!
Thanks
A helpful person on the sails Gitter suggested that I use this package, which supports async/await: https://www.npmjs.com/package/sails-hook-uploads
I tested it out with the following code and it works:
const util = require('util'); // used by util.inspect below

let upload = await sails
  .uploadOne(file, {
    maxBytes: 3000000,
  })
  .intercept('E_EXCEEDS_UPLOAD_LIMIT', 'tooBig')
  .intercept(
    (err) => new Error('The photo upload failed: ' + util.inspect(err))
  );
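With this, upload is a plain object describing the stored file rather than a write stream, so the file information can be read off it directly; a small sketch (property names taken from the sails-hook-uploads docs, so double-check them against your version):
// upload.fd should be the path the file was saved under, and upload.type its MIME type
console.log(upload.fd);
console.log(upload.type);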
Thanks for looking into my question.
I have a node/express server, configured with a server.js file, which calls urls.js, which in turn calls controllers for handling HTTP requests. All of them are configured identically and all work fine, except for one.
It's my OrderController. OrderController.js never gets called, and thus never returns any HTTP responses, even though I'm as sure as I can be that it should be getting called.
I've tried adding logging statements all over OrderController.js, but they never fire. Nothing in my OrderController.js is running at all.
I've inspected my urls.js file carefully but can find no clue why OrderController would fail to load, while my other 8 controllers all work fine.
I've added logging statements to my server.js file to ensure that my server.js file is indeed handling all http requests and passing them correctly to my urls.js file.
I've added logging statements to my urls.js file to confirm that it is being executed.
I've logged the controllers, both the require of OrderController.js and my other controllers, and inspected the dumped JSON objects, but couldn't determine anything different or mistaken from the output I got.
The repository is: https://github.com/allenchan3/foodproject/tree/40a43231ae49989a35fb0c004a897bbee69fe669
In theory, the problem should be somewhere in urls.js:
const path = require("path");
const staticFilesError = require("./errors").errorHTML;
module.exports = function(app, staticDir) {
const loginRegController = require(path.join("..","controllers","LoginRegController"));
const controllers = {
'categories': require(path.join("..","controllers","CategoryController")),
'diningrooms':require(path.join("..","controllers","DiningRoomController")),
'ingredients':require(path.join("..","controllers","IngredientController")),
'items': require(path.join("..","controllers","ItemController")),
'options': require(path.join("..","controllers","OptionController")),
'orders': require(path.join("..","controllers","OrderController")),
'tables': require(path.join("..","controllers","TableController")),
'users': require(path.join("..","controllers","UserController")),
};
for (const key in controllers) {
app.use("/api/"+key, controllers[key]);
}
app.use(loginRegController);
app.get(/^\/api\//, (req, res) => res.status(404).json({"error":req.url+" not found"}));
app.get("*", (_,res) => res.sendFile(staticDir+"/index.html"));
}
I am just trying to make HTTP GET requests to /api/orders/ and get something back from the order controller (currently it should return a list of all orders from my database). I also made a handler for /hello, so /api/orders/hello should at least be returning something. What's actually happening is that no HTTP response ever comes back, not even a 404, and I don't have any idea why.
I cloned your repo and was able to reproduce the bug.
What seems to be happening is that in authentication.js, the handleOrders() function only handles a user being a manager, bartender, server or cashier. If req.session.userType is not set, or it is equal to cook or customer, it is not handled by the if statement and the request will hang indefinitely.
Can you attempt to call the API as a user with a different type and confirm this?
You may need to add an else as below to throw a 403 if the user is the wrong type.
function handleOrders(req, res, next) {
console.log('handling')
if (req.session.userType == "manager") {
console.log('i am manager')
next();
} else if (["bartender","server","cashier"].indexOf(req.session.userType) > -1) {
console.log('am i a cashier or server or bartender')
if (req.url == "/api/order" && req.method == "GET") {
console.log('in this if')
next();
}
else {
//not a get
Order.findById(req.params.id,(err,data)=>{
if (err) {
console.log(err);
res.status(500).json(err);
} else {
console.log(data);
next();
}
});
}
} else res.status(403).send(errors.forbidden);
}
I've got a simple node.js + Restify backend with standard CORS settings and this endpoint:
var file = '1,5,8,11,12,13,176,567,9483';
server.get('/download', function(req, res, next) {
res.set({"Content-Disposition": "attachment; filename='numbers.csv'"});
res.setHeader("Content-type", "text/csv");
res.send(file);
return next();
}, function(err) {
res.send(err);
});
What it's supposed to do is create a CSV file and return it.
It works great when I simply type the endpoint address into the web browser and hit enter. The file gets downloaded properly.
But when I try to do the same thing using Restangular instead of the browser's address bar, like this:
Restangular.one('download').get().then(function (res) {
console.log(res);
});
it just writes the response to the console, but no file is being downloaded.
Is there a way to do this using Restangular? Or maybe I need to use something else for this?
I am not sure if Restangular can do that, but I use the FileSaver.js script for stuff like that. Add FileSaver to your HTML head and then:
Restangular.one('download').get().then(function (res) {
var file = new Blob([res], { type: 'text/csv' });
saveAs(file, 'something.csv');
});
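If the response comes back already parsed or wrapped rather than as raw CSV text, it may help to ask for the raw data explicitly; a rough sketch, assuming your Restangular version supports withHttpConfig and hands the Blob through unchanged:
Restangular.one('download')
  .withHttpConfig({ responseType: 'blob' })
  .get()
  .then(function (res) {
    // res should already be a Blob here, so it can be handed straight to FileSaver
    saveAs(res, 'numbers.csv');
  });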
I might be out of my depth, but I really need something to work. I think a write/read stream will solve both my issues, but I don't quite understand the syntax or what's required for it to work.
I read the stream handbook and thought I understood some of the basics, but when I try to apply it to my situation, it seems to break down.
Currently I have this as the crux of my information:
function readDataTop (x) {
console.log("Read "+x[6]+" and Sent Cached Top Half");
jf.readFile( "loadedreports/top"+x[6], 'utf8', function (err, data) {
resT = data
});
};
I'm using the jsonfile plugin for node, which basically shortens the fs.write and read calls and makes them easier to use, instead of constantly writing try and catch blocks around fs.write and read.
Anyway, I want to implement a stream here, but I am unsure of what would happen on my Express end and how the object will be received.
I assume that, since it's a stream, Express won't do anything with the object until it receives it? Or would I have to write a callback to make sure that, when my function is called, the stream is complete before Express sends the object off to fulfill the ajax request?
app.get('/:report/top', function(req, res) {
readDataTop(global[req.params.report]);
res.header("Content-Type", "application/json; charset=utf-8");
res.header("Cache-Control", "max-age=3600");
res.json(resT);
resT = 0;
});
I am hoping that if I change the read part to a stream it will alleviate two problems. The first is sometimes receiving partial JSON files when the browser makes the ajax call, due to the read speed of larger JSON objects. (This might be the callback issue I need to solve, but a stream should make it more consistent.)
Secondly, when I load this node app, it needs to run 30+ file writes while it gets the data from my DB. The goal was to disconnect the browser from the DB side so that node acts as the DB by reading and writing. This is because an old SQL server is already being bombarded by a lot of requests (stale data isn't an issue).
Any help on the syntax here?
Is there a tutorial I can see in code of someone piping a response into a write stream? (The mssql node module I use puts the SQL response into an object, and I need it in JSON format.)
function getDataTop (x) {
var connection = new sql.Connection(config, function(err) {
var request = new sql.Request(connection);
request.query(x[0], function(err, topres) {
jf.writeFile( "loadedreports/top"+x[6], topres, function(err) {
if(err) {
console.log(err);
} else {
console.log(x[6]+" top half was saved!");
}
});
});
});
};
Your problem is that you're not waiting for the file to load before sending the response. Use a callback:
function readDataTop(x, cb) {
console.log('Read ' + x[6] + ' and Sent Cached Top Half');
jf.readFile('loadedreports/top' + x[6], 'utf8', cb);
};
// ...
app.get('/:report/top', function(req, res) {
// you should really avoid using globals like this ...
readDataTop(global[req.params.report], function(err, obj) {
// setting the content-type is automatically done by `res.json()`
// cache the data here in-memory if you need to and check for its existence
// before `readDataTop`
res.header('Cache-Control', 'max-age=3600');
res.json(obj);
});
});
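If you do want to stream the cached file straight to the client instead of buffering it first, a minimal sketch using Node's built-in fs streams (the file path is assumed to follow the 'loadedreports/top...' naming from the question, keyed by the ':report' route parameter):
const fs = require('fs');

app.get('/:report/top', function(req, res) {
  res.header('Content-Type', 'application/json; charset=utf-8');
  res.header('Cache-Control', 'max-age=3600');

  // Pipe the cached JSON file directly into the response; Express ends the
  // response when the read stream finishes, so nothing is buffered in memory.
  const stream = fs.createReadStream('loadedreports/top' + req.params.report);
  stream.on('error', function(err) {
    // e.g. the cache file does not exist yet (fires before any data is sent)
    res.status(500).json({ error: err.message });
  });
  stream.pipe(res);
});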