Download data from a link in JavaScript

I have an input type='text' where a user provides a hyperlink to a Google Docs spreadsheet to be downloaded, like this:
"https://docs.google.com/spreadsheets/d/1w_qYEgD5w-xIrnMpRMmeUycZBJiAfo4zxlVTkXb8LU4/export?gid=1865948320&format=csv"
When I pass this string as the href of a link with the download attribute, it prompts a "Save as" window and saves the file in CSV format. If you follow the link above it will trigger a file download. My goal is to get the data on the client side without the "Save as" window and work with the CSV data. Any suggestions on how to implement this? Appreciate any help.

In fact, on the client side I have an input where the user provides a docs.google.com link to a specific document in a predefined CSV format, and I send this link to my local Express server. On the server I make an axios GET request; the function looks like this:
// Request from the client, where I send the URL link
app.post("/upload", (req, res) => {
  // Handle a GET request with the link provided from the client/front end
  axios(req.body.payload, { method: "GET" })
    .then((response) => {
      res.status(200);
      // response.data is the string of characters that I send back to the
      // client and, in my case, use to populate a table
      res.json({ file: response.data });
    })
    .catch((error) => {
      res.status(404);
      res.send(error);
    });
});
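On the client, a minimal sketch of working with the proxied CSV without any "Save as" dialog might look like the following (an assumption based on the /upload proxy above; csvUrl is the value from the text input and renderTable is a hypothetical helper that builds the HTML table):
// Send the Google Docs link to the /upload proxy above, then split the
// returned CSV text into rows and cells on the client.
fetch("/upload", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ payload: csvUrl }),
})
  .then((response) => response.json())
  .then(({ file }) => {
    const rows = file.trim().split("\n").map((line) => line.split(","));
    renderTable(rows); // hypothetical helper: populate a table from the rows
  })
  .catch((error) => console.error(error));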

Related

Download File using express and fetch doesn't work

I'm trying to download a file using Node.js and JavaScript.
When I call the URL in the browser, the file gets downloaded.
When I call this endpoint from my JavaScript file using fetch, the download doesn't work.
Node.js Endpoint
app.get("/download", function (req, res, next) {
res.download(
filepath
);
});
JavaScript Call
const downloadFile = async (path) => {
  await fetch("http://localhost:8080/download", {
    method: "GET",
  })
    .then((response) => {
      console.log(response);
    })
    .catch((error) => {
      console.log(error);
    });
};
Do you have any suggestions?
Thank you very much!
When you make a request using Ajax, the response is passed back to the JavaScript code for handling.
If you want to do something with the file the server has sent you, then you need to write JavaScript to do something with it.
Your JavaScript logs the response object then stops.
The browser will only automatically render it in the viewport / save it to downloads if you type the URL into the address bar / click a link / etc. Doing Ajax explicitly avoids that automatic handling.
So the solution here is: Don't use Ajax. Use a link, or assign a value to location, etc.
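For example, a minimal sketch of both approaches, assuming the /download endpoint from the question:
// Option 1: navigate to the endpoint and let the browser handle the download.
window.location = "http://localhost:8080/download";

// Option 2: create a temporary link and click it programmatically.
const a = document.createElement("a");
a.href = "http://localhost:8080/download";
a.download = ""; // filename hint; only honoured for same-origin URLs
document.body.appendChild(a);
a.click();
a.remove();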

May I have a res.json and res.sendFile together in the same api endpoint in express?

I want to upload the content of an Excel file to the server in order to get its data and do some stuff with it...
I came up with the following code, however it seems it is not working properly, as the following error displays in the console: Error: Can't set headers after they are sent.
The file is getting uploaded into the folder and the JSON message is being displayed... However, I do not know if I am going to face any issue in the future...
Actually I just need the Excel data, there is no need for the Excel file itself to be uploaded... Maybe you could give me a workaround, guys...
const router = express.Router();

const storage = multer.diskStorage({
  destination(req, file, cb) {
    cb(null, 'uploads/');
  },
  filename(req, file, cb) {
    cb(
      null,
      `${file.fieldname}-${Date.now()}${path
        .extname(file.originalname)
        .toLowerCase()}`
    );
  },
});

const excelFilter = (req, file, cb) => {
  if (
    file.mimetype.includes('excel') ||
    file.mimetype.includes('spreadsheetml')
  ) {
    cb(null, true);
  } else {
    cb('Please upload only excel file.', false);
  }
};

const upload = multer({
  storage,
  fileFilter: excelFilter,
});

router.post('/', upload.single('file'), (req, res) => {
  var workbook = XLSX.readFile(req.file.path);
  var sheet_name_list = workbook.SheetNames;
  var xlData = XLSX.utils.sheet_to_json(workbook.Sheets[sheet_name_list[0]]);
  res.json(xlData).sendFile(`/${req.file.path}`, { root: path.resolve() });
});
May I have a res.json and res.sendFile together in the same api endpoint in express?
No, you cannot. Each of those methods sends a complete http response (including calling res.end(), which terminates the http request) and you can only send one http response to each incoming request. The particular error you're getting has to do with res.sendFile() trying to configure the response it's getting ready to send and finding that the http response object has already been used for sending a response and can't be used again.
Ordinarily, if you wanted to send two different pieces of data, you would just combine them into a single JavaScript object and call res.json() on the object that contains both pieces of data.
But, sending a binary file is not something you can easily put in a JSON package. You could construct a multipart response where one part was the JSON and one part was the file. You could JSON encode binary data (though that's inefficient). I presume there are probably some modules that would help you do that, but for most clients, that isn't what they are really expecting or equipped to handle.
The only way to a proper solution is for us to understand what client/server workflow you're trying to implement here and why you're trying to send back the same file that was just uploaded. There would normally not be a reason to do that since the client already has that data (they just uploaded it).
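Assuming the client only needs the parsed sheet data, a minimal sketch built on the question's multer/XLSX setup would send a single JSON response (and could bundle any extra fields into the same object):
router.post('/', upload.single('file'), (req, res) => {
  const workbook = XLSX.readFile(req.file.path);
  const sheetName = workbook.SheetNames[0];
  const xlData = XLSX.utils.sheet_to_json(workbook.Sheets[sheetName]);

  // One response per request: put everything the client needs in one object.
  res.json({
    filename: req.file.filename,
    rows: xlData,
  });
});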

How to pass uploaded files (images, PDF) and JSON data to the backend at the same time using fetch API?

I'm using React JS as the frontend framework. I have a form with different inputs. Some of those are text inputs and the others are file uploads. Here's the post request.
const formData = new FormData();
const imgUpload = document.querySelector('#imgUpload');
const mouUpload = document.querySelector('#mouUpload');

// -------- setting the form data --------
formData.append('oName', formValues.oName);
formData.append('oWeb', formValues.oWeb);
formData.append('oVC', formValues.oVC);
formData.append('oVP', formValues.oVP);
formData.append('oImg', imgUpload.files[0]);
formData.append('oMou', mouUpload.files[0]);

// -------- fetch post request --------
fetch('https://httpbin.org/post', {
  method: 'POST',
  body: formData,
})
  .then(response => response.json())
  .then(result => {
    console.log('Success:', result);
  })
  .catch(error => {
    console.error('Error:', error);
  });
This is a successful post request and I can see this as the result.
But I want to be able to see the text input field data under data and the file uploads under files in this success result. I know that I'm passing those values to formData; I did that because I wanted the text input data to be shown in the result somehow. How can I implement this? Please guide me if I'm doing something wrong here. Appreciate your help.
An HTTP request body has to be encoded somehow. Then the server-side code has to decode it.
From your comments we can infer that if you encode the body as JSON then it will decode it and put it in data. If you encode it as multipart, then it will decode it and put the files in files and the rest of the data in form.
If you want it to put the rest of the data in data then you'll need to change the server-side code.
I'd just read it from form myself.
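For illustration, if the backend were your own Express server rather than httpbin (an assumption), multer would put the text fields in req.body and the uploads in req.files; a minimal sketch:
// Hypothetical Express backend for the multipart request above.
const express = require('express');
const multer = require('multer');

const app = express();
const upload = multer({ dest: 'uploads/' });

// multer parses the two file fields into req.files and the text fields into req.body.
app.post('/post', upload.fields([{ name: 'oImg' }, { name: 'oMou' }]), (req, res) => {
  console.log(req.body);  // { oName, oWeb, oVC, oVP }
  console.log(req.files); // { oImg: [ ... ], oMou: [ ... ] }
  res.json({ data: req.body, files: Object.keys(req.files) });
});

app.listen(3000);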

Node.js Pipe a PDF API Response

So my scenario is: a user clicks a button on a web app, which triggers a server-side POST request to an internal (i.e. non-public) API sitting on another server in the same network; this should return a PDF to my server, which will proxy (pipe) it back to the user.
I want to just proxy the PDF body content directly to the client without creating a tmp file.
I have this code which works using the npm request module but it does not feel right:
var pdfRequest = request(requestOptions);

pdfRequest.on('error', function (err) {
  utils.sendErrorResponse(500, 'PROBLEM PIPING PDF DOWNLOAD: ' + err, res);
});

pdfRequest.on('response', function (resp) {
  if (resp.statusCode === 200) {
    pdfRequest.pipe(res);
  } else {
    utils.sendErrorResponse(500, 'PROBLEM PIPING PDF DOWNLOAD: RAW RESP: ' + JSON.stringify(resp), res);
  }
});
Is this the correct way to pipe the PDF response?
Notes:
- I need to check the status code to conditionally handle errors; the payload for the POST is contained in requestOptions (I know this part is all correct).
- I would like to keep using the request module.
- I definitely do not want to be creating any temp files.
- If possible I would also like to modify the Content-Disposition header to set a custom filename; I know how to do this without using pipes.
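A minimal sketch of setting that header before piping, assuming the same requestOptions and utils as above ("report.pdf" is a placeholder filename):
pdfRequest.on('response', function (resp) {
  if (resp.statusCode === 200) {
    // Suggest a filename before piping the body through to the client.
    res.setHeader('Content-Type', 'application/pdf');
    res.setHeader('Content-Disposition', 'attachment; filename="report.pdf"');
    pdfRequest.pipe(res);
  } else {
    utils.sendErrorResponse(500, 'PROBLEM PIPING PDF DOWNLOAD: RAW RESP: ' + JSON.stringify(resp), res);
  }
});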

how to stream from nodejs server to client

I am currently trying to send a very long CSV file that will be processed in the browser.
I would like to stream it to the client as it would exceed the string size limit and would also take up too much memory on the server.
I have tried
app.get('/test', (req, res) => {
  let csvStream = byline(fs.createReadStream('./resources/onescsv.csv'));
  csvStream.on('data', (line) => {
    csvStream.pipe(res);
  });
  csvStream.on('end', () => {
    res.render('./test/test', {
      css: ['test/test.css'],
      js: ['test/test.js'],
    });
  });
});
When I do the above, it sends the read stream to the client but renders it to the page, which is not what I want. I would like to receive the stream buffer by buffer in the client-side JavaScript, to process it as it comes in, e.g. put it into a table. How can I do this?
Well firstly, you don't want to be calling render in the same request you're looking to pipe data into the response. You'd want to split these out:
1. Render the page
2. Start the stream request
To render the page, just have your default route send down the page HTML:
app.get('/', (req, res) => {
  res.render('./test/test', {
    css: ['test/test.css'],
    js: ['test/test.js'],
  });
});
Then to stream, tweak your server-side code like this:
app.get('/api/csv', (req, res) => {
  let stream = fs.createReadStream('./resources/onescsv.csv');
  stream = byline.createStream(stream);
  stream.pipe(res);
  stream.on('end', () => res.end());
});
Then on your client, in your default HTML page, either on load (or hooked up to a button press), fire an AJAX request to pull down the CSV data, e.g. using jQuery:
$.get('/api/csv', data => {
  // do something with CSV data
});
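If you want to process the CSV chunk by chunk on the client rather than waiting for the whole response, a fetch-based reader is one option; a minimal sketch, where processChunk is a hypothetical handler that, for example, parses complete lines and appends table rows:
async function streamCsv() {
  const response = await fetch('/api/csv');
  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // processChunk is a hypothetical handler for each decoded chunk of CSV text.
    processChunk(decoder.decode(value, { stream: true }));
  }
}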
