Evaluate JavaScript for Fetch request

I am using node.js on Windows, with the Express module, to generate an HTML page in which I would like to display data from a server-side function getfiledata() (i.e. I do not want to expose my js or txt file publicly).
I have been trying to use fetch() to return the value from getfiledata().
PROBLEM: I have not been able to get the data from getfiledata() returned to fetch().
HTML
<!DOCTYPE html>
<html>
<body>
<script type="text/javascript">
  function fetcher() {
    fetch('/compute', { method: "POST" })
      .then(function (response) {
        return response.json();
      })
      .then(function (myJson) {
        console.log(JSON.stringify(myJson));
      });
  }
</script>
<input type="button" value="Submit" onClick="fetcher()">
</body>
</html>
^^ contains my fetch() function
server
var express = require("express");
var app = express();
var compute = require("./compute")
app.post('/compute',compute.getfiledata);
compute.js
var fs = require('fs');

module.exports = {
  getfiledata: function () {
    console.log("we got this far");
    fs.readFile("mytextfile.txt", function (err, data) {
      if (err) throw err;
      console.log("data: " + data);
      return data;
    });
  }
};
^^ contains my server side function
Note: from compute.js the console successfully logs:
we got this far
data: this is the data in the text file
but nothing is logged from console.log(JSON.stringify(myJson)) in the HTML.
I suspect this is because I have not set up a "promise", but I am not sure, and would appreciate some guidance on what the next steps would be.

I think you're on the way. I'd suggest making a few little changes, and you're all the way there.
I'd suggest using fs.readFileSync, since this is a really small file (I presume!), so there's no major performance hit. We could use fs.readFile; however, we'd need to plug in a callback, and in this case I think doing it all synchronously is fine.
To summarize the changes:
We need to call getfiledata() on compute, since it's a function.
We'll use readFileSync to read your text file (since it's quick).
We'll call res.json to encode the response as json.
We'll use the express static middleware to serve index.html.
To test this out, make sure all files are in the same directory.
Run the command below to start the server:
node server.js
And then go to http://localhost/ to see the web page.
server.js
var express = require("express");
var compute = require("./compute");
var app = express();

app.use(express.static("./"));

app.post('/compute', (req, res, next) => {
  var result = compute.getfiledata();
  res.status(200).json({ textData: result });
});

app.listen(80);
compute.js
var fs = require('fs');

module.exports = {
  getfiledata: function () {
    console.log("we got this far");
    return fs.readFileSync("mytextfile.txt", "utf8");
  }
};
index.html
<!DOCTYPE html>
<html>
<body>
<script type="text/javascript">
  function fetcher() {
    fetch('/compute', { method: "POST" })
      .then(function (response) {
        return response.json();
      })
      .then(function (myJson) {
        console.log(JSON.stringify(myJson));
        document.getElementById("output").innerHTML = "<b>Result: </b>" + myJson.textData;
      });
  }
</script>
<input type="button" value="Submit" onClick="fetcher()">
<br><br>
<div id="output"></div>
</body>
</html>
mytextfile.txt
Why, then, ’tis none to you, for there is nothing either good or bad, but thinking makes it so.

You have a couple problems here. First, you can't directly return asynchronous data from getfiledata(). See How do I return the response from an asynchronous call? for a full description of that issue. You have to either use a callback to communicate back the asynchronous results (just like fs.readFile() does) or you can return a promise that fulfills to the asynchronous value.
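For illustration only, a callback-based version of your two files might look something like this (a sketch; the promise version further down is what I'd actually recommend):
// compute.js - hypothetical callback variant, shown only for comparison
var fs = require('fs');

module.exports = {
  getfiledata: function (callback) {
    // hand the error and the file contents back to the caller via the callback
    fs.readFile("mytextfile.txt", "utf8", callback);
  }
};

// server.js - the route then passes a callback instead of expecting a return value
app.post('/compute', (req, res) => {
  compute.getfiledata(function (err, textData) {
    if (err) return res.sendStatus(500);
    res.json({ textData: textData });
  });
});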
Second, a route handler in Express such as app.post() has to use res.send() or res.write() or something like that to send the response back to the caller. Just returning a value from the route handler doesn't do anything.
Third, you've seen other suggestions to use synchronous I/O. Please, please don't do that. You should NEVER use synchronous I/O in any server anywhere except in startup code, because it absolutely ruins your server's ability to handle multiple requests at once. Synchronous I/O forces requests to be processed serially (with the whole server process waiting and doing nothing while the OS fetches things from the disk) rather than letting the server use the available CPU cycles to process other requests while waiting for disk I/O.
Keeping these in mind, your solution is pretty simple. I suggest using promises, since that's the future of JavaScript and Node.js.
Your compute.js:
const util = require('util');
const readFile = util.promisify(require('fs').readFile);

module.exports = {
  getfiledata: function () {
    // returns a promise that resolves with the file contents as a string
    return readFile("mytextfile.txt", "utf8");
  }
};
And, your server.js:
const express = require("express");
const compute = require("./compute");
const app = express();

app.use(express.static("./"));

app.post('/compute', (req, res) => {
  compute.getfiledata().then(textData => {
    res.json({ textData });
  }).catch(err => {
    console.log(err);
    res.sendStatus(500);
  });
});

app.listen(80);
In Node.js version 10, there is an experimental promise-based API built into the fs module, so you don't even have to promisify it manually like I did above. Or you can use any one of several third-party libraries that promisify the whole fs module at once, giving you promisified versions of all of its functions.
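For example (a sketch assuming Node 10+, where fs.promises was available but still experimental), compute.js could be written as:
// compute.js - same idea using the built-in fs.promises API (Node 10+)
const fs = require('fs').promises;

module.exports = {
  getfiledata: function () {
    // fs.promises.readFile already returns a promise, so there's nothing to promisify
    return fs.readFile("mytextfile.txt", "utf8");
  }
};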

Related

What is an event-driven, non-blocking IO model in Node.js?

I don't understand the real difference between these two snippets:
const fs = require('fs');
fs.readFile('/file.md', (err, data) => {
  if (err) throw err;
});
and:
const fs = require('fs');
const data = fs.readFileSync('/file.md');
Could somebody tell me, in a simplified way, what is going on here?
In a few words: the difference is that the first snippet is asynchronous.
The real difference is when the code after each snippet gets executed.
So if you try to execute:
const fs = require('fs');

console.log('preparing...');

fs.readFile('/file.md', (err, data) => {
  if (err) throw err;
  console.log('I am sure that the data is read!');
  console.log(data);
});

console.log('Not sure if the data is here... ');
you'll see (if the file is big enough):
preparing...
Not sure if the data is here...
I am sure that the data is read!
$data
In the other case (readFileSync), the data will already be there by the time the next line runs (unless an error occurred).
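For contrast, here is the synchronous version of the same experiment; it prints in source order because readFileSync blocks until the file has been read:
const fs = require('fs');

console.log('preparing...');

// blocks here until the whole file has been read into memory
const data = fs.readFileSync('/file.md');

console.log('The data is definitely here:');
console.log(data);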
Take a look at this example:
var express = require('express');
var fs = require('fs');
var app = express(); // express() replaces the old express.createServer(express.logger()) API

app.get('/readFile', function (request, response) {
  fs.readFile('data.txt', function (err, data) {
    response.send(data);
  });
});

app.get('/readFileSync', function (request, response) {
  let data = fs.readFileSync('data.txt');
  response.send(data);
});
fs.readFile takes a callback which calls response.send. If you simply replace that with fs.readFileSync, you need to be aware that it does not take a callback, so your callback which calls response.send will never get called, and therefore the response will never end and it will time out.
You need to show your readFileSync code if you're not simply replacing readFile with readFileSync.
Also, just so you're aware, you should never call readFileSync in a Node express/web server, since it will tie up the single-threaded event loop while the I/O is performed. You want the event loop to process other requests until the I/O completes and your callback-handling code can run. You can also use a promise to handle this, as sketched below.
And as of Node v10.0.0, the callback parameter is no longer optional for fs.readFile; not passing it will throw a TypeError at runtime.
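A minimal sketch of that promise-based approach (assuming Node 10+ and fs.promises; the route and file name just mirror the example above):
const express = require('express');
const fs = require('fs').promises;
const app = express();

app.get('/readFile', async function (request, response) {
  try {
    // non-blocking: the event loop stays free to serve other requests while the file is read
    const data = await fs.readFile('data.txt');
    response.send(data);
  } catch (err) {
    response.sendStatus(500);
  }
});

app.listen(3000);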

Like the readFileSync function in Node.js, I need http.getSync - a wrapper for http.get() that makes it synchronous

I want to implement this https.getSync wrapper method so that it calls the API synchronously, just like the readFileSync method we use for reading files synchronously.
const https = require('https');
How should I implement this method?
https.getSync = (url) => {
  let data = '';
  https.get(url, resp => {
    resp.on('data', (chunk) => {
      data += chunk;
    });
    resp.on('end', () => {
      console.log(JSON.parse(data));
    });
  }).on("error", (err) => {
    console.log("Error: " + err.message);
  });
  return data;
};
I want the two calls below to be made synchronously, without changing the calling code below. I don't want to use promises or callbacks here.
let api1 = https.getSync('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY');
let api2 = https.getSync('https://api.nasa.gov/planetary/apod?api_key=NNKOjkoul8n1CH18TWA9gwngW1s1SmjESPjNoUFo');
You can use npm package sync-request.
It's quite simple.
var request = require('sync-request');
var res = request('GET', 'http://example.com');
console.log(res.getBody());
Here is the link: sync-request
Also read this announcement from the package itself, in case readers think using it is a good idea: "You should not be using this in a production application. In a node.js application you will find that you are completely unable to scale your server. In a client application you will find that sync-request causes the app to hang/freeze. Synchronous web requests are the number one cause of browser crashes."
In my opinion, you should also avoid making synchronous HTTP requests. Instead, get comfortable with callbacks, promises, and async/await; an async/await version is sketched below.
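For example, a small promise wrapper plus async/await keeps the calling code nearly as flat as the synchronous version you wanted (getJSON is a hypothetical helper name, and the error handling is minimal):
const https = require('https');

// hypothetical helper: resolves with the parsed JSON body of a GET request
function getJSON(url) {
  return new Promise((resolve, reject) => {
    https.get(url, resp => {
      let data = '';
      resp.on('data', chunk => { data += chunk; });
      resp.on('end', () => resolve(JSON.parse(data)));
    }).on('error', reject);
  });
}

(async () => {
  let api1 = await getJSON('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY');
  console.log(api1);
})();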

How do I make a MongoDB query throw an error if there is no database connection? [duplicate]

I'm new to Mongo. I needed a database for a simple project and ended up following a tutorial using Mongo with Monk but I have problems understanding how to handle errors.
Background: I have a registration form on the client side. When the user clicks a button, the data is sent via AJAX to the controller that (upon validation, but this is not relevant now) inserts such data into the database and sends back either success or error. When the db is up all seems to work fine.
The problem: If I don't start the db and try to send the request anyway, no error is returned. Simply nothing happens. After some time on the console I get: POST /members/addmember - - ms - -.
I think some error should be returned to the user in this case, so how could I do this?
The post request is below (pretty much as from the tutorial):
// app.js
var db = monk('localhost:27017/dbname');
[...]
// I realize it might be not optimal here
app.use(function (req, res, next) {
  req.db = db;
  next();
});

// members.js
router.post('/addmember', function (req, res) {
  var db = req.db;
  var collection = db.get('memberstest');
  collection.insert(req.body, function (err, result) {
    res.json(
      (err === null) ? { msg: 'success' } : { msg: err }
    );
  });
});
If the db is down, I guess the problem actually occurs even earlier than the insert, that is, in that db.get(). So how can I check whether that get can actually be done? I suppose that, given the asynchronous nature of Node, something like a try/catch would be pointless here. Correct?
EDIT: After Neil's answer and a bit of trying, I put together the following, which seems to do the job. However, given my scarce confidence in this, I'd appreciate a comment on whether the code below works because it makes sense or just by chance. I added the bufferMaxEntries: 0 option and modified the controller as follows. In the ajax callback I simply have an alert for now that shows the error message thrown (if any).
router.post('/addmember', async (req, res) => {
  try {
    let db = req.db;
    let collection = db.get('memberstest');
    collection.insert(req.body, function (err, result) {
      res.json(
        (err === null) ? { msg: 'success' } : { msg: err }
      );
    });
    await db.then(() => 1);
  } catch (e) {
    res.json({ msg: e.message });
  }
});
Well, you can actually set the bufferMaxEntries option on the connection (documented under Db but deprecated for that object usage; use it at the "top level as demonstrated instead"), which essentially stops "queuing" requests on the driver when no connection is actually present.
As a minimal example:
index.js
const express = require('express'),
      morgan = require('morgan'),
      db = require('monk')('localhost/test', { bufferMaxEntries: 0 }),
      app = express();

const routes = require('./routes');

app.use(morgan('combined'));

app.use((req, res, next) => {
  req.db = db;
  next();
});

app.use('/', routes);

(async function() {
  try {
    await db.then(() => 1);
    let collection = db.get('test');
    await collection.remove({});
    await collection.insert(Array(5).fill(1).map((e, i) => ({ a: i + 1 })));
    console.log('inserted test data');
    await app.listen(3000, '0.0.0.0');
    console.log('App waiting');
  } catch (e) {
    console.error(e);
  }
})();
routes.js
var router = require('express').Router();

router.get('/', async (req, res) => {
  try {
    let db = req.db,
        collection = db.get('test');
    let response = await collection.find();
    res.json(response);
  } catch (e) {
    res.status(500).json(e);
  }
});

module.exports = router;
So I am actually awaiting the database connection to at least be present on "start up" here, but really only for example since I want to insert some data to actually retrieve. It's not required, but the basic concept is to wait for the Promise to resolve:
await db.then(() => 1);
Kind of trivial, and not really required for your actual code. But I still think it's good practice.
The real test is done by stopping mongod or otherwise making the server unreachable and then issuing a request.
Since we set the connection options to { bufferMaxEntries: 0 }, the moment you attempt to issue a command to the database, a failure will be returned if there is no actual connection present.
Of course, when the database becomes available again, you won't get the error and the instructions will run normally.
Without the option, the default is to queue the operations until a connection is resolved, and then the "buffer" is essentially replayed.
You can simulate this (as I did) by stopping the mongod daemon and issuing requests, then starting the daemon and issuing requests again. While the daemon is stopped, requests should simply return the caught error response.
NOTE: Not required, but the whole point of async/await syntax is to make things like try..catch useful again, since you can scope errors to blocks rather than using Promise.catch() or err callback arguments to trap them. The same principles apply when either of those structures is actually in use, though.
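For comparison, the same routes.js handler written without async/await would trap the error with .catch() instead (a sketch of the equivalent structure):
// routes.js - equivalent handler using plain promise chaining instead of async/await
router.get('/', (req, res) => {
  let db = req.db,
      collection = db.get('test');
  collection.find()
    .then(response => res.json(response))
    .catch(e => res.status(500).json(e));
});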

ExpressJS apparent race condition between Promise and EventEmitter

I have a NodeJS/Express web application that allows the user to upload a file, which I then parse using connect-busboy and save to my database using Sequelize. Once that's done, I want to redirect the user to a given page. But Express is returning a status of 404 before my Promise resolves, even though I'm never calling next(), which I thought was mandatory in order to call the next handler in the middleware chain and thus result in a 404.
This is my code so far:
function uploadFormFile(req, res, next) {
  var documentInstanceID = req.params.documentInstanceID;

  // set up an object to hold my data
  var data = {
    file: null,
    documentDate: null,
    mimeType: null
  };

  // call the busboy middleware explicitly
  // EDIT: this turned out to be the problem... of course this calls next()
  // removing this line and moving it to an app.use() made everything work as expected
  busboy(req, res, next);

  req.pipe(req.busboy);

  req.busboy.on('file', function (fieldName, file, fileName, encoding, mimeType) {
    var fileData = [];
    data.mimeType = mimeType;

    file.on('data', function (chunk) {
      fileData.push(chunk);
    });

    file.on('end', function () {
      data.file = Buffer.concat(fileData);
    });
  });

  req.busboy.on('finish', function () {
    // api methods return promises from Sequelize
    api.querySingle('DocumentInstance', ['Definition'], null, { DocumentInstanceID: documentInstanceID })
      .then(function (documentInstance) {
        documentInstance.RawFileData = data.file;
        documentInstance.FileMimeType = data.mimeType;
        // chaining promise
        return api.save(documentInstance);
      }).then(function () {
        res.redirect('/app/page');
      });
  });
}
I can confirm that my data is being persisted correctly. But due to the race condition, the web page says 'can't POST' due to the 404 status being returned by Express, and the res.redirect is failing with an error setting the headers because it's trying to redirect after the 404 has been sent.
Can anyone help me figure out why Express is returning the 404?
The problem is coming from your internal call to busboy inside your handler. Rather than executing and simply returning control to your handler, it calls the next function that is passed to it before it returns control. So your code after the busboy call does execute, but the request has already advanced past that point.
In cases in which you want some middleware to only be executed for certain requests, you can chain middleware into those requests, such as:
router.post('/upload', busboy, uploadFormFile);
You can also separate them with .use() such as:
router.use('/upload', busboy);
router.post('/upload', uploadFormFile);
Either of the above will chain the middleware in the way you intended. In the case of .use() the middleware would also be applied to any applicable .METHOD() as Express refers to it in their documentation.
Also, note that you can pass in an arbitrary number of middleware this way, either as separate parameters or as arrays of middleware functions, such as:
router.post('/example', preflightCheck, logSomeStuff, theMainHandler);
// or
router.post('/example', [preflightCheck, logSomeStuff], theMainHandler);
The execution behavior of either of the above examples will be equivalent. Speaking only for myself and not suggesting it is a best practice, I normally only use the array-based addition of middleware if I am building the middleware list at runtime.
Good luck with it. I hope you enjoy using Express as much as I have.

How to know when node.js express server is up and ready to use

I have an application where I want to start a Node Express server and then automatically start a browser on the same machine as soon as the server is up. How can I query to see if the server is up and ready to go? I really wanted there to be some sort of callback on the .listen call, but there doesn't seem to be one. I could just wait longer than I expect to be necessary, but this is going on equipment that will be in the field, so I either have to wait a ridiculous amount of time to make sure the server is up before kicking off the browser, or rely on the user being smart enough to hit refresh if the page doesn't load right. Neither of those is a good option for me...
I read the API online but don't see anything like this. Surely there's a trick I don't know that can accomplish this.
If the node HTTP api (which has a callback and tells me about the listening event) is the base for the express object, maybe there is a callback option for the express call listen that isn't documented. Or perhaps I'm supposed to just know that it's there.
Any help would be greatly appreciated.
The Express app.listen function does support a callback. It maps the arguments that you pass in to the http.listen call.
app.listen = function () {
  var server = http.createServer(this);
  return server.listen.apply(server, arguments);
};
So you can just call: app.listen(port, callback);
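For example (the port and log message are just illustrative):
const app = require('express')();

app.listen(3000, function () {
  // runs once the underlying http server is actually listening
  console.log('Express is listening on port 3000');
});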
Or you could use http.listen directly.
var app = require('express')(),
    server = require('http').createServer(app);

server.listen(80, function () {
  console.log('ready to go!');
});
You can fire a custom event after the server is started:
// server.js
const express = require('express');
const app = express();

module.exports = app;

app.listen(3000, () => {
  app.emit('listened', null);
});
In a separate module, you can then listen for your custom event:
// custom.js
const server = require('./server.js');

server.on('listened', function () {
  console.log('The server is running!');
});
You can use the http.listen method which has a callback function that triggers once the server is ready:
http.createServer(app).listen(app.get('port'), function () {
  console.log('Printed when ready!!!');
});
See the official reference at Node.js:
http://nodejs.org/api/all.html#all_server_listen_port_hostname_backlog_callback
http://nodejs.org/api/all.html#all_server_listen_path_callback_1
http://nodejs.org/api/all.html#all_server_listen_handle_callback_1
As many have mentioned, the listen function (the Express app and an http server both support it) takes a callback, and that will let your Node process know when it is listening.
So if you plan to launch the browser from within your Express app, do it there and you are good. However, if you are launching the Express app from an external script and then want that external script to open the browser, the Node callback doesn't really buy you anything.
Waiting for some magic string on stdout isn't really an improvement on just waiting for a good HTTP response. You may as well just use a try/backoff/timeout loop with curl until you get a successful response (the same idea is sketched in Node below).
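Here is a rough sketch of that retry loop as an external Node script (the URL, retry count, and delay are arbitrary):
// wait-for-server.js - hypothetical external script that polls until the server answers
const http = require('http');

function waitForServer(url, retries, delayMs, done) {
  http.get(url, res => {
    res.resume();                        // drain the response body
    done(null);                          // any HTTP response means the server is up
  }).on('error', err => {
    if (retries <= 0) return done(err);  // give up after the last attempt
    setTimeout(() => waitForServer(url, retries - 1, delayMs, done), delayMs);
  });
}

waitForServer('http://localhost:3000/', 20, 500, err => {
  if (err) throw err;
  console.log('Server is up, launching the browser...');
});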
server.on('listening', function () {
  resolve(); // I added my own promise to help me await
});
The 'listening' event worked for me. Note that I added my own Promise; I imagine you could obtain similar results without a promise by putting your startup logic inside this listener.
Note: I tried the more intuitive server.on('listen') and it didn't work. I'm running Node 6.9.1.
With async/await syntax, this can be done by wrapping the server startup in a promise, so you can wait for it to be started before running anything else:
import express from 'express';
import http from 'http';

const app = express();
let server: http.Server;

const startServer = async (): Promise<void> => {
  return new Promise((resolve, _reject) => {
    server = app.listen(3000, () => {
      console.log('Express server started');
      resolve();
    });
  });
};

await startServer();
// here the server is started and ready to accept requests
