res.sendFile in Node Express with passing data along - javascript

Is there any way to redirect to an HTML file from a Node.js application with something like res.sendFile of Express, and pass JSON data along to the HTML file?

I know this is late but I wanted to offer a solution which no one else has provided. This solution allows a file to be streamed to the response while still allowing you to modify the contents without needing a templating engine or buffering the entire file into memory.
Skip to the bottom if you don't care about "why"
Let me first describe why res.sendFile is so desirable for those who don't know. Since Node is single threaded, it works by performing lots and lots of very small tasks in succession - this includes reading from the file system and replying to an http request. At no point in time does Node just stop what it's doing and read an entire file from the file system. It will read a little, do something else, read a little more, do something else. The same goes for replying to an http request and most other operations in Node (unless you explicitly use the sync version of an operation - such as readFileSync - don't do that if you can help it, seriously, don't - it's selfish).
Consider a scenario where 10 users make a request for the same file. The inefficient thing to do would be to load the entire file into memory and then send the file using res.send(). Even though it's the same file, the file would be loaded into memory 10 separate times before being sent to the browser. The garbage collector would then need to clean up this mess after each request. The code would be innocently written like this:
const fs = require('fs'); // needed for the examples below

app.use('/index.html', (req, res) => {
  fs.readFile('../public/index.html', (err, data) => {
    res.send(data.toString());
  });
});
That seems right, and it works, but it's terribly inefficient. Since we know that Node does things in small chunks, the best thing to do would be to send the small chunks of data to the browser as they are being read from the file system. The chunks are never stored in memory and your server can now handle orders of magnitude more traffic. This concept is called streaming, and it's what res.sendFile does - it streams the file directly to the user from the file system and keeps the memory free for more important things. Here's how it looks if you were to do it manually:
app.use('/index.html', (req, res) => {
  fs.createReadStream('../public/index.html')
    .pipe(res);
});
Solution
If you would like to continue streaming a file to the user while making slight modifications to it, then this solution is for you. Please note, this is not a replacement for a templating engine but should rather be used to make small changes to a file as it is being streamed. The code below will append a small script tag with data to the body of an HTML page. It also shows how to prepend or append content to an http response stream:
NOTE: as mentioned in the comments, the original solution had an edge case where this would fail. To fix this, I have added the new-line package to ensure data chunks are emitted at new lines.
const Transform = require('stream').Transform;
const newLineStream = require('new-line');

// Create a fresh Transform per request - a single stream instance
// must not be shared across concurrent responses.
function createParser() {
  const parser = new Transform();
  parser._transform = function(data, encoding, done) {
    let str = data.toString();
    str = str.replace('<html>', '<!-- Begin stream -->\n<html>');
    str = str.replace('</body>', '<script>var data = {"foo": "bar"};</script>\n</body>\n<!-- End stream -->');
    this.push(str);
    done();
  };
  return parser;
}
// app creation code removed for brevity
app.use('/index.html', (req, res) => {
  fs.createReadStream('../public/index.html')
    .pipe(newLineStream())
    .pipe(createParser())
    .pipe(res);
});

You get one response from a given request. You can either combine multiple things into one response or require the client to make separate requests to get separate things.
If what you're trying to do is take an HTML file and modify it by inserting some JSON into it, then you can't use res.sendFile() alone, because it just reads a file from disk or cache and streams it directly as the response, offering no opportunity to modify it.
The more common way of doing this is to use a template system that lets you insert things into an HTML file (usually by replacing special tags with your own data). There are literally hundreds of template systems, and many support node.js. Common choices for node.js are Jade (Pug), Handlebars, Ember, Dust, EJS and Mustache.
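For instance, a minimal sketch with EJS (assuming a views/index.ejs template exists; the data shape here is purely illustrative):
const express = require('express');
const app = express();

app.set('view engine', 'ejs'); // npm install ejs

app.get('/', (req, res) => {
  // Locals passed here are available in the template, e.g.
  // <script>var user = <%- JSON.stringify(user) %>;</script>
  res.render('index', { user: { name: 'Ada' } });
});

app.listen(3000);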
Or, if you really wanted to do so, you could read the HTML file into memory, use some sort of .replace() operation on it to insert your own data and then res.send() the resulting changed file.

Well, it's kinda old, but I didn't see any sufficient answer, except for "why not". You DO have a way to pass parameters IN a static file. And it's quite easy. Consider the following code on your origin (using Express):
let data = fs.readFileSync('yourPage.html', 'utf8');
if (data)
  res.send(data.replace('param1Place', 'uniqueData'));
// else - 404
Now, for example, you can set a cookie in yourPage.html with something like:
<script>
  var date = new Date();
  date.setTime(date.getTime() + 3600 * 1000); // expire in one hour
  document.cookie = "yourCookieName=param1Place;expires=" +
    date.toUTCString() + ";path=/";
</script>
You can then plainly pull the content of uniqueData out of yourCookieName wherever you want in your JS.
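For example, a small sketch for reading it back on the client (the helper name is illustrative):
function getCookie(name) {
  var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return match ? decodeURIComponent(match[1]) : null;
}
var uniqueData = getCookie('yourCookieName'); // whatever replaced param1Place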

I think the answer posted by Ryan Wheale is the best solution if you actually want to modify something within an HTML file. You could also use cheerio for working with complex logic.
But in regards to this particular question where we just want to pass some data to the client from the server, there's actually no need to read index.html into memory at all.
You can simply add the following script tag somewhere at the top of your HTML file:
<script src="/data.js"></script>
And then let Express serve that file with whatever data needed:
app.get("/data.js", function (req, res) {
res.send('window.SERVER_DATA={"some":"thing"}');
});
This data can then easily be referenced anywhere in your client application via the window object, as window.SERVER_DATA.some.
Additional context for a React frontend:
This approach is especially useful during development if your client and server are running on different ports, such as in the case of create-react-app, because the proxied server can always respond to the request for data.js. When you insert something into index.html using Express, on the other hand, you always need to have your production build of index.html ready before inserting any data into it.

Why not just read the file, apply transformations, and then set up the route in the callback?
// appPath and data are assumed to be defined elsewhere
fs.readFile(appPath, (err, html) => {
  let htmlPlusData = html.toString().replace("DATA", JSON.stringify(data));
  app.get('/', (req, res) => {
    res.send(htmlPlusData);
  });
});
Note that you can't dynamically change data; you'd have to restart the Node instance.

You only have one response you can return from the server. The most common thing to do would be to template your file on the server with Nunjucks or Jade. Another choice is to render the file on the client and then use JavaScript to make an ajax call to the server to get additional data. I suppose you could also set some data in a cookie and then read that on the client side via JavaScript as well.

Unless you want to template the HTML file to insert the JSON data into a script tag, you'll need to expose an API endpoint in Express to send the data along to the page, and have a function on the page to access it. For example:
const path = require('path');

// send the html (res.sendFile requires an absolute path)
app.get('/', (req, res) => res.sendFile(path.join(__dirname, 'index.html')));
// send json data
app.get('/data', (req, res) => res.json(data));
Now on the client side you can create a request to access this endpoint:
function get() {
  return new Promise((resolve, reject) => {
    var req = new XMLHttpRequest();
    req.open('GET', '/data');
    req.onload = () => resolve(req.response);
    req.onerror = () => reject(req.statusText);
    req.send(); // the original snippet never actually sent the request
  });
}
// then to get the data, call the function
get().then((data) => {
  var parsed = JSON.parse(data);
  // do something with the data
});
EDIT: Arrow functions are not supported in older browsers. Make sure to replace them with function () {} in your real code if you need to support those.
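In modern browsers, you could instead use fetch for the same call (same /data endpoint as above), which avoids the XMLHttpRequest boilerplate:
fetch('/data')
  .then(function (res) { return res.json(); })
  .then(function (data) {
    // do something with the data
  });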

This is pretty easy to do using cookies. Simply do this:
On the server side -
response.append('Set-Cookie', 'LandingPage=' + landingPageCode);
response.sendFile(__dirname + '/mobileapps.html');
On client side -
<!DOCTYPE html>
<html>
<body onload="showDeferredLandingPageCode()">
<h2>Universal Link Mobile Apps Page</h2>
<p>This html page is used to demonstrate deferred deep linking with iOS</p>
<script>
  function showDeferredLandingPageCode() {
    alert(document.cookie);
  }
</script>
</body>
</html>

Related

Is there a way to Post an array to web api or mvc controller and get a file back to download as a result?

I use an HTML table whose content can be changed with mouse drag and drop. Technically, you can move the data from any table cell to another. The table size is 50 rows * 10 columns, with each cell given a unique identifier. I want to export it to .xlsx format with the C# EPPlus library, and give the exported file back to the client.
So I need to pass the whole table's data upon a button press and post it to either a Web API or an MVC controller, create an Excel file (matching the original HTML table data), and send it back as a browser download.
So the idea is to create an array which contains each table cell's value (of course there can be empty cells in that array), and post that array to the controller.
The problem with that approach lies in the download: if I call the API or MVC controller with a regular jQuery ajax post, it does not recognize the response as a file.
C# code after ajax post:
[HttpPost]
public IHttpActionResult PostSavedReportExcel([FromBody]List<SavedReports> savedReports, [FromUri] string dateid)
{
    //some excel creation code
    HttpResponseMessage response = new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StreamContent(new MemoryStream(package.GetAsByteArray()))
    };
    response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
    response.Content.Headers.ContentDisposition = new System.Net.Http.Headers.ContentDispositionHeaderValue("attachment")
    {
        FileName = dateid + "_report.xlsx"
    };
    ResponseMessageResult responseMessageResult = ResponseMessage(response);
    return responseMessageResult;
}
Usually, for this kind of result, I could use window.location = myurltocontroller to download properly, but that only works for GET requests; POSTing anything that way is not possible.
I found some answers which could help me in this topic:
JavaScript post request like a form submit
This points out that I should go with creating a form which passes the values, but I do not know how to do so in the case of arrays (the table consists of 50 * 10 = 500 values which I would have to pass in the form).
I tried some frontend-only solutions to the HTML-to-Excel export problem, which of course do not require building files on the API side, but the free jQuery add-ins are deprecated, not customizable, handle only the .xls format, etc.
I found the EPPlus NuGet package to be a highly customizable tool, which is why I wanted to try it in the first place.
So the question is: how can I post an array of 500 elements such that the controller will recognize it, generate the file, and make it automatically download in the browser?
If you can provide some code that would be fantastic, but giving me the right direction is also helpful.
Thank you.
You can use fetch() (docs) to send the request from the JS frontend. When the browser (JS) has received the response, it can then offer its binary content as a download. Something like this:
fetch("http://your-api/convert-to-excel", // Send the POST request to the Backend
{
method:"POST",
body: JSON.stringify(
[[1,2],[3,4]] // Here you can put your matrix
)
})
.then(response => response.blob())
.then(blob => {
// Put the response BLOB into a virtual download from JS
if (navigator.appVersion.toString().indexOf('.NET') > 0) {
window.navigator.msSaveBlob(blob, "my-excel-export.xlsx");
} else {
var a = window.document.createElement('a');
a.href = URL.createObjectURL(blob);
a.download = "my-excel-export.xlsx";
a.click();
}});
So the JS part of the browser actually downloads the file behind the scenes first, and only when it's done does it trigger the "download" from the browser's memory into a file on disk.
This is a quite common scenario with REST APIs that require bearer token authentication.
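If your API does require a bearer token, a sketch of the only change needed in the fetch call above (accessToken is an illustrative variable, assumed to be obtained elsewhere):
fetch("http://your-api/convert-to-excel", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Bearer " + accessToken // hypothetical token variable
  },
  body: JSON.stringify([[1, 2], [3, 4]])
})
  .then(response => response.blob())
  .then(blob => { /* same download handling as above */ });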

Resolve a 301 redirect and store the url for future use javascript

So I have a script that organises an un-formatted csv file and presents an output.
One of the pieces of data we receive, and must return, is a link to an image stored on Google Drive. The problem with this is that Google Drive doesn't like to present you with a direct link to a file.
You can get the ID of a file (e.g. abc123DEFz) and view it online at https://drive.google.com/open?id=abc123DEFz. We need a direct link for another service to be able to process the file, not a redirect or some fancy website.
After poking around I discovered that https://drive.google.com/uc?export=view&id=abc123DEFz would redirect you directly to the file, and was what I somehow had to obtain inside the script.
The url it gave me, though, didn't really seem to have any relation to the ID, so I couldn't just swap the ID in; for each file I would have to resolve this uc?export link into the link that sends me directly to the file. (Where the redirect sent me: http://doc-0c-2s-docs.googleusercontent.com/docs/securesc/32-char-long-alphanumeric-thing/another-32-char-long-alphanumeric-thing/1234567891234/12345678901234567890/12345678901234567890/abc123DEFz?e=view&authuser=0&nonce=abcdefgh12345&user=12345678901234567890&hash=32-char-long-alphanumeric-hash)
No authentication is required to access the file, it is public.
My script works like this:
const csv = require('csv-parser'),
      fs = require('fs'),
      request = require('request');

let final = [],
    spuSet = [];

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (row) => {
    // >> data processing stuff, very boring so you don't care
    console.log(`
I'm now going to save this information and tell you about the row I'm processing
so you can see why something went wrong`);
    final.push(`[{"yes":"there is something here"},{"anditinvolves":${thatDataIJustGot}]`);
    spuSet.push(`[{"morethings":123}]`);
  })
  .on('end', () => {
    console.log('CSV file successfully processed');
    console.log(`
COMPLETED! Check the output below and verify:
[${String(final).replace(/\r?\n|\r/g, " ")}]
COMPLETED! Check the output below and verify:
[${String(spuSet).replace(/\r?\n|\r/g, " ")}]`);
    // >> some more boring stuff where I upload the data somewhere and create a file containing said data
  });
I tried using request, but it's a function with a callback, so using the data outside of the function would be difficult, and wrapping everything inside the function would remove my ability to push to the array.
The url I get from the redirect would be included in the data I am pushing to the array for me to use later on.
I'm pretty bad at explaining crap, if you have any questions please ask.
Thanks in advance for any help you can give.
Try using the webContentLink parameter of the Get API call:
// (inside an async function - files.get returns a promise
// in the googleapis Node client)
const res = await drive.files.get({
  fileId: 'fileid',
  fields: 'webContentLink'
});
This resolves with a response whose data contains the object:
{
  "webContentLink": "https://drive.google.com/a/google.com/uc?id=fileId&export=download"
}
Then you can use split() to remove &export=download from the link, since we don't want to download the file.
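For example (a one-liner sketch; the res.data path assumes the googleapis Node client as above):
const directLink = res.data.webContentLink.split('&export=download')[0];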
For the fileId, you can get the IDs of your files using the List API call, and then loop through the returned list calling files.get from the first step.
My apologies if I misunderstood your issue.
In case you need help with the authentication to the Google Services, you can take a look at the Quickstart

Get value from txt file only when it is updated in Javascript [duplicate]

I have the following use case:
A creates a chat and invites B and C. On the server, A creates a file. A, B and C write messages into this file. A, B and C read this file.
I want A to create a file on the server and observe this file; if anybody else writes something into it, the new content should be sent back with websockets.
So, any change of this file should be observed by my node.js application.
How can I observe file changes? Is this possible with node.js without locking the files?
If it is not possible with files, would it be possible with a database object (NoSQL)?
The good news is that you can observe file changes with Node's API.
This however doesn't give you access to the contents that have been written into the file.
You can maybe use the fs.appendFile() function so that when something is written into the file, you emit an event to something else that "logs" the new data being written.
fs.watch(): Directly pasted from the docs
fs.watch('somedir', function (event, filename) {
  console.log('event is: ' + event);
  if (filename) {
    console.log('filename provided: ' + filename);
  } else {
    console.log('filename not provided');
  }
});
Read here about the fs.watch(); function
EDIT: You can use the function
fs.watchFile();
Read here about the fs.watchFile(); function
This will allow you to watch a file for changes, i.e. whenever it is accessed by some other process of any kind.
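For the chat use case above, a minimal sketch (the file name is illustrative) that reads only the newly appended bytes whenever the file grows:
const fs = require('fs');

let lastSize = 0;
fs.watchFile('chat.txt', { interval: 500 }, (curr, prev) => {
  if (curr.size > lastSize) {
    // Read just the bytes appended since the last check
    fs.createReadStream('chat.txt', { start: lastSize, end: curr.size - 1 })
      .on('data', (chunk) => {
        console.log('new content:', chunk.toString());
        // e.g. broadcast the chunk over your websocket here
      });
    lastSize = curr.size;
  }
});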
Also you could use node-watch. Here's an easy example:
const watch = require('node-watch');

watch('README.md', function (event, filename) {
  console.log(filename, 'changed.');
});
I do not think you need to observe file changes or use a NoSQL database for this (if you do not want to). My advice would be to look at events (the observer pattern). There are more than enough tutorials on this topic available online, for example Felix's article about Using EventEmitters.
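A minimal sketch of that observer idea with Node's built-in EventEmitter (names are illustrative):
const EventEmitter = require('events');

const chat = new EventEmitter();

// Each connected client (e.g. a websocket) subscribes once...
chat.on('message', (msg) => {
  // socket.send(JSON.stringify(msg)); // forward to the browser
  console.log('broadcast:', msg);
});

// ...and any participant posting a message notifies all subscribers instantly.
chat.emit('message', { from: 'A', text: 'hello' });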
This publish/subscribe semantic can also be achieved with NoSQL. In Redis, for example, I think you should have a look at pub/sub.
In MongoDB, I think tailable cursors are what you are looking for. On their blog they have a post explaining pub/sub.

node.js does not recognise the url in the unfluff module

Any help will be appreciated.
I need to extract data from websites and found that node-unfluff does the job (see https://github.com/ageitgey/node-unfluff). There are two ways to call this module.
First, from the command line, which works!
Second, from Node.js, which doesn't work.
extractor = require('unfluff');
data = extractor('test.html');
console.log(data);
Output : {"title":"","lang":null,"tags":[],"image":null,"videos":[],"text":""}
The call returns an empty JSON object. It appears it cannot read test.html.
It seems like it doesn't recognise test.html. The example says "my html data"; is there a way to get the html data? Thanks.
From the docs of unfluff:
extractor(html, language)
html: The html you want to parse
language (optional): The document's two-letter language code. This
will be auto-detected as best as possible, but there might be cases
where you want to override it.
You are passing a filename, and it expects the actual HTML of the file to be passed in.
If you are doing this in a scripting context, I'd recommend doing
data = extractor(fs.readFileSync('test.html', 'utf8'));
however, if you are doing this in the context of a server, or any time when blocking will be an issue, you should do:
fs.readFile('test.html', 'utf8', function (err, html) {
  var data = extractor(html);
  console.log(data);
});

Piping data from a file to a rendered page in Sails.js

My application needs to read in a large dataset and pipe it to the client to manipulate with D3.js. The problem is, on large datasets, the reading/loading of the file contents could take a while. I want to solve this using streams. However, I'm unsure of how to do so in the context of the Sails framework.
What I want to do is read the contents of the file and pipe it to a rendered page. However, I can't figure out how to pipe it through if I use something like res.view('somePage', { data: thePipedData });.
I currently have something like this:
var datastream = fs.createReadStream(
  path.resolve(DATASET_EXTRACT_PATH, datatype, dataset, dataset + '.csv')
);
datastream.pipe(res);
...
return res.view('analytics', { title: 'Analytics', data: ??? });
What's the best way to approach this?
Based on your example it seems like the best course of action would be to set up a separate endpoint to serve just the data, and include it on the client via a regular <script> tag.
MyDataController.js
getData: function(req, res) {
  /* Some code here to determine datatype and dataset based on params */

  // Wrap the data in a Javascript string
  res.write("var theData = '");

  // Open a read stream for the file
  var datastream = fs.createReadStream(
    path.resolve(DATASET_EXTRACT_PATH, datatype, dataset, dataset + '.csv')
  );

  // Pipe the file to the response. Set {end: false} so that res isn't closed
  // when the file stream ends, allowing us to continue writing to it.
  datastream.pipe(res, {end: false});

  // When the file is done streaming, finish the Javascript string
  datastream.on('end', function() {
    res.end("';");
  });
}
MyView.ejs
<script src="/mydata/getdata?datatype=<%=datatype%>&etc.."></script>
MyViewController.js
res.view('analytics', {datatype: 'someDataType', etc...});
A slight variation on this strategy would be to use a JSONP-style approach; rather than wrapping the data in a variable in the data controller action, you would wrap it in a function. You could then call the endpoint via AJAX to get the data. Either way you'd have the benefit of a quick page load since the large data set is loaded separately, but with the JSONP variation you'd also be able to easily show a loading indicator while waiting for the data.
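A rough sketch of that variation (endpoint and callback names are illustrative): in the data action, wrap the stream in a function call instead of a variable assignment.
// In getData, instead of res.write("var theData = '"):
res.write("handleData('");
datastream.pipe(res, { end: false });
datastream.on('end', function () {
  res.end("');");
});
On the client, define handleData(csv) before requesting the endpoint, and it will be invoked with the CSV text once the stream finishes.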
