how to stream from nodejs server to client - javascript

I am currently trying to send a very long csv file that will be processed in the browser.
I would like to stream it to the client as it would exceed the string size limit and would also take up too much memory in the server.
I have tried
app.get('/test', (req, res) => {
  let csvStream = byline(fs.createReadStream('./resources/onescsv.csv'));
  csvStream.on('data', (line) => {
    csvStream.pipe(res);
  });
  csvStream.on('end', () => {
    res.render('./test/test', {
      css: ['test/test.css'],
      js: ['test/test.js']
    });
  });
});
When I do the above, it sends the read stream to the client, but it gets rendered onto the page, which is not what I want. I would like to receive the stream buffer by buffer in the client-side JavaScript so I can process the chunks as they come in, e.g. put them into a table. How can I do this?

Well firstly, you don't want to be calling render in the same request you're looking to pipe data into; you'd want to split these out:
Render the page
Start the stream request
To render the page, just have your default route send down the page HTML
app.get('/', (req, res) => {
  res.render('./test/test', {
    css: ['test/test.css'],
    js: ['test/test.js']
  });
});
Then, to stream, tweak your server-side code like this:
const fs = require('fs');
const byline = require('byline');

app.get('/api/csv', (req, res) => {
  let stream = fs.createReadStream('./resources/onescsv.csv');
  stream = byline.createStream(stream);
  // pipe ends the response automatically when the file stream finishes
  stream.pipe(res);
});
Then on your client, in your default HTML page, either on load (or hooked up to a button press), fire an AJAX request to pull down the CSV data, e.g. using jQuery:
$.get('/api/csv', data => {
  // do something with CSV data
});
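Note that $.get buffers the whole response before the callback fires. If you really want to process the CSV chunk by chunk as it arrives, a rough sketch using the Fetch API's streaming reader (assuming a reasonably modern browser) could look like this:

fetch('/api/csv')
  .then(res => {
    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    let leftover = '';

    // read() resolves with { done, value } for each chunk of the response body
    return reader.read().then(function process({ done, value }) {
      if (done) {
        if (leftover) console.log(leftover); // flush any trailing partial line
        return;
      }
      const text = leftover + decoder.decode(value, { stream: true });
      const lines = text.split('\n');
      leftover = lines.pop(); // keep any partial last line for the next chunk
      lines.forEach(line => {
        // e.g. split on commas and append a row to the table here
        console.log(line);
      });
      return reader.read().then(process);
    });
  })
  .catch(err => console.error(err));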

Related

How to send real time response from server to client using Node.js express

On a POST request the server generates data after a few seconds; there are roughly 1,000 to 10,000 entries. Currently I'm saving the data into a CSV file and it's working fine. How can I pass the data to the client as a JSON array?
The Name and Age variables receive new data every few seconds.
const { createWriteStream } = require('fs')

app.post('/', (req, res) => {
  // Currently writing to results.csv; it works fine, but I want to send this
  // real-time (Name and Age) data to the client instead.
  const stream = createWriteStream('./results.csv', { flags: 'a', encoding: 'utf8' })
  // Append evaluation from response to file
  stream.write(`${Name}, ${Age}\n`)
  // example data: Patrick, 32
  // End stream to avoid accumulation
  stream.end()
})
res.send() only sends the first row, but the Name and Age variables keep updating every 10 seconds.
app.listen(port, () => {
  console.log("App is running on port " + port);
});
To parse the data into a JSON array, you would have to read the stream fully before sending the data. Example:
const { parse } = require('fast-csv') // see note below

let rows = []
fs.createReadStream('./results.csv')
  .pipe(parse())
  .on("error", (error) => console.error(error))
  .on("data", (row) => rows.push(row))
  .on("end", () => res.send(rows))
But if you want to send the csv file data, do something like this:
// `options` should include a root when using a relative path, e.g. { root: __dirname }
return res.sendFile("./results.csv", options, (err) => {
  if (err) return console.error(err)
  console.log("File has been sent")
})
NOTE: The parse() method used in the first example comes from the fast-csv package.
Also, using a package like fast-csv makes it easy to write data to the CSV file with headers: true. When reading the data back, the parse method has to be told that headers were written so it can use those values as the JSON keys.
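For illustration, a small sketch of that round trip with fast-csv (file name and data are just placeholders):

const fs = require('fs')
const { format, parse } = require('fast-csv')

// Write rows with a header line
const csvStream = format({ headers: true })
csvStream.pipe(fs.createWriteStream('./results.csv'))
csvStream.write({ Name: 'Patrick', Age: 32 })
csvStream.end()

// Read them back; headers: true turns each row into { Name: ..., Age: ... }
// (in a real script, wait for the write stream's 'finish' event before reading)
const rows = []
fs.createReadStream('./results.csv')
  .pipe(parse({ headers: true }))
  .on('error', (error) => console.error(error))
  .on('data', (row) => rows.push(row))
  .on('end', () => console.log(rows))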

May I have a res.json and res.sendFile together in the same api endpoint in express?

I want to upload the content of an Excel file to the server in order to get its data and do some stuff...
I came up with the following code; however, it does not seem to work properly, as the following error is displayed in the console: Error: Can't set headers after they are sent.
The file does get uploaded into the folder and the JSON message is displayed... However, I do not know if I am going to face any issues in the future...
Actually I just need the Excel data; there is no need for the Excel file to be uploaded... Maybe you could give me a workaround, guys...
const router = express.Router();

const storage = multer.diskStorage({
  destination(req, file, cb) {
    cb(null, 'uploads/');
  },
  filename(req, file, cb) {
    cb(
      null,
      `${file.fieldname}-${Date.now()}${path
        .extname(file.originalname)
        .toLowerCase()}`
    );
  },
});

const excelFilter = (req, file, cb) => {
  if (
    file.mimetype.includes('excel') ||
    file.mimetype.includes('spreadsheetml')
  ) {
    cb(null, true);
  } else {
    cb('Please upload only excel file.', false);
  }
};

const upload = multer({
  storage,
  fileFilter: excelFilter,
});

router.post('/', upload.single('file'), (req, res) => {
  var workbook = XLSX.readFile(req.file.path);
  var sheet_name_list = workbook.SheetNames;
  var xlData = XLSX.utils.sheet_to_json(workbook.Sheets[sheet_name_list[0]]);
  res.json(xlData).sendFile(`/${req.file.path}`, { root: path.resolve() });
});
May I have a res.json and res.sendFile together in the same api endpoint in express?
No, you cannot. Each of those methods sends a complete HTTP response (including calling res.end(), which terminates the HTTP request), and you can only send one HTTP response for each incoming request. The particular error you're getting occurs because res.sendFile() is trying to configure the response it's about to send and finds that the HTTP response object has already been used to send a response and can't be used again.
Ordinarily, if you wanted to send two different pieces of data, you would combine them into a single JavaScript object and call res.json() on the object that contains both.
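For example, a minimal sketch for the route above, assuming the client only needs the parsed rows plus a little metadata about the uploaded file:

router.post('/', upload.single('file'), (req, res) => {
  const workbook = XLSX.readFile(req.file.path);
  const firstSheet = workbook.SheetNames[0];
  const xlData = XLSX.utils.sheet_to_json(workbook.Sheets[firstSheet]);

  // One response object carrying both pieces of data
  res.json({
    file: req.file.filename, // metadata about the stored upload
    rows: xlData             // the parsed spreadsheet contents
  });
});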
But, sending a binary file is not something you can easily put in a JSON package. You could construct a multipart response where one part was the JSON and one part was the file. You could JSON encode binary data (though that's inefficient). I presume there are probably some modules that would help you do that, but for most clients, that isn't what they are really expecting or equipped to handle.
The only way to get to a proper solution is for us to understand what client/server workflow you're trying to implement here and why you're trying to send back the same file that was just uploaded. There would normally not be a reason to do that, since the client already has that data (they just uploaded it).

Not allowed to load local resource when sending file from node js server to client [duplicate]

I Googled this but couldn't find an answer, yet it must be a common problem. This is the same question as Node request (read image stream - pipe back to response), which is unanswered.
How do I send an image file as an Express .send() response? I need to map RESTful urls to images - but how do I send the binary file with the right headers? E.g.,
<img src='/report/378334e22/e33423222' />
Calls...
app.get('/report/:chart_id/:user_id', function (req, res) {
  // authenticate user_id, get chart_id obfuscated url
  // send image binary with correct headers
});
There is an API for this in Express:
res.sendFile
app.get('/report/:chart_id/:user_id', function (req, res) {
  // res.sendFile(filepath);
});
http://expressjs.com/en/api.html#res.sendFile
A proper solution with streams and error handling is below:
const fs = require('fs')
const stream = require('stream')

app.get('/report/:chart_id/:user_id', (req, res) => {
  const r = fs.createReadStream('path to file') // or any other way to get a readable stream
  const ps = new stream.PassThrough() // <---- the PassThrough is the trick for stream error handling
  stream.pipeline(
    r,
    ps,
    (err) => {
      if (err) {
        console.log(err) // No such file or any other kind of error
        return res.sendStatus(400)
      }
    }
  )
  ps.pipe(res) // <---- pipe the PassThrough (not the file stream) into the response
})
With Node older than 10, you will need to use pump instead of pipeline.
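The question also asks about sending the right headers. One way to handle that (assuming the charts are PNG files; adjust to your format) is to set the Content-Type on the response before piping the stream into it, reusing the route above:

const fs = require('fs')
const stream = require('stream')

app.get('/report/:chart_id/:user_id', (req, res) => {
  res.type('png') // shorthand for res.set('Content-Type', 'image/png')
  const r = fs.createReadStream('path to file')
  const ps = new stream.PassThrough()
  stream.pipeline(r, ps, (err) => {
    if (err) return res.sendStatus(400)
  })
  ps.pipe(res)
})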

download data from link javascript

I have an input type='text' where a user will provide a hyperlink to a Google Docs spreadsheet to be downloaded, like this:
"https://docs.google.com/spreadsheets/d/1w_qYEgD5w-xIrnMpRMmeUycZBJiAfo4zxlVTkXb8LU4/export?gid=1865948320&format=csv"
When I pass this string as the href of a link with the download attribute, it prompts the "Save as" window and saves the file in CSV format. If you follow the link above, it will trigger a file download. My goal is to get the data on the client side without the "Save as" window and work with the CSV data. Any suggestions on how to implement this? I appreciate any help.
In fact, on the client side I have an input where the user provides a docs.google.com link to a specific document in a predefined CSV format, and I send this link to my local Express server. On the server I make an axios GET request; the function looks like this:
// Request from the client where I send the url link
app.post("/upload", (req, res) => {
  // Handle the GET request with the link provided from the client/front end
  axios(req.body.payload, { method: "GET" })
    .then((response) => {
      res.status(200);
      // Data represents the string of characters that I send back to the client
      // and, in my case, populate a table with
      res.json({ file: response.data });
    })
    .catch((error) => {
      res.status(404);
      res.send(error);
    });
});
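On the client side, a rough sketch of how the link could be posted to that /upload route and the returned CSV text turned into rows (the #docLink input id and renderTable helper are just placeholders):

const link = document.querySelector('#docLink').value // placeholder input id

fetch('/upload', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  // matches req.body.payload on the server (the server needs express.json() middleware)
  body: JSON.stringify({ payload: link })
})
  .then((res) => res.json())
  .then(({ file }) => {
    // file is the raw CSV text returned by the Express route above
    const rows = file.trim().split('\n').map((line) => line.split(','))
    renderTable(rows) // placeholder: populate the table with the parsed rows
  })
  .catch((err) => console.error(err))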

express node server return a value to async client call

Sorry, I tend to be a bad writer when I have not fully woken up; let me revise.
I am using expressjs with passportjs (local strategy) to manage my server and using connect-busboy to manage file uploading. I do not think passport will play a role in this.
Here is the server code for managing file uploads:
app.post('/upload', isLoggedIn, (req, res) => {
  if (req.busboy) {
    req.pipe(req.busboy);
    req.busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
      if (mimetype.match(/^image\//)) {
        var root = path.join(__dirname, "../public/images/");
        if (fs.existsSync(path.join(root, filename))) {
          var name = getUnique(path.join(root, filename));
        } else {
          var name = filename;
        }
        var ws = fs.createWriteStream(path.join(root, name), { flags: "a" });
        file.pipe(ws);
      }
    });
  }
});
As for my client page, it is used as a configuration tool to change a JSON object which will get re-uploaded to the server. When I upload a new image asynchronously, I need to get the filename back so I can update this JSON object while working on it. For uploading from the client's end I am using dropzonejs, which did not require any configuration on my part to work.
So, in summary I upload a number of images via dropzone asynchronously, busboy and fs on my server save the file, and I would like to get the filename returned to my javascript to modify the existing JSON object.
Edit solution:
Thanks to Elliot Blackburn for pointing me in the right direction.
The fix was to add the following after file.pipe(ws); to send the response back to the client:
ws.on('close', () => {
  res.send({ filename: name });
});
On the client side, modify Dropzone to handle the response like so:
dropzone.on('success', (file, res) => {
  console.log(res);
});
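If the goal is to fold the returned name back into the JSON configuration object mentioned above, a small sketch (config here is just a stand-in for that object) could be:

dropzone.on('success', (file, res) => {
  // res is the JSON body sent by the server: { filename: name }
  config.images = config.images || []  // config is a placeholder for the object being edited
  config.images.push(res.filename)     // record where the server stored the upload
})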
Just send it in the normal HTTP response. It'll depend on what library you're using, but most will let you handle a normal Express (req, res, next) call. From that you can access the file object and return anything you want.
Something like:
res.send({ filename: name }); // name is the filename var set earlier in the code
Once you've finished editing the file and such, you can put the name into that returned object, and your client will receive it as the response, which you can act upon.
