What is an event-driven, non-blocking I/O model in Node.js? - javascript

I don't understand what the real difference is between these two pieces of code:
const fs = require('fs');
fs.readFile('/file.md', (err, data) => {
  if (err) throw err;
});

const fs = require('fs');
const data = fs.readFileSync('/file.md');
Could somebody tell me what is going on here in a simplified way?

In a few words, the difference is that the first snippet is asynchronous.
The practical difference is when the code after each snippet gets executed.
So if you try to execute:
const fs = require('fs');

console.log('preparing...');

fs.readFile('/file.md', (err, data) => {
  if (err) throw err;
  console.log('I am sure that the data is read!');
  console.log(data);
});

console.log('Not sure if the data is here... ');
you'll see (if the file is big enough):
preparing...
Not sure if the data is here...
I am sure that the data is read!
$data
In the other case (readFileSync), the data will already be there by the time the next line runs (unless an error is thrown).
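For comparison, here is a minimal sketch of the synchronous version (assuming the same /file.md path); because readFileSync blocks until the whole file is read, the messages print strictly in source order:

const fs = require('fs');

console.log('preparing...');

// Blocks here until the whole file has been read into memory.
const data = fs.readFileSync('/file.md');

console.log('I am sure that the data is read!');
console.log(data);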

Take a look at this example:
var express = require('express');
var fs = require('fs');
var app = express();

app.get('/readFile', function(request, response) {
  fs.readFile('data.txt', function(err, data) {
    response.send(data);
  });
});

app.get('/readFileSync', function(request, response) {
  var data = fs.readFileSync('data.txt');
  response.send(data);
});
fs.readFile takes a callback which calls response.send. If you simply replace it with fs.readFileSync without adjusting the surrounding code, be aware that it does not take a callback, so the callback that calls response.send will never be called, the response will never end, and the request will time out.
You need to show your readFileSync code if you're not simply replacing readFile with readFileSync.
Also, just so you're aware, you should never call readFileSync in a Node/Express web server, since it ties up the single-threaded event loop while the I/O is performed. You want the event loop to keep processing other requests until the I/O completes and your callback-handling code can run. Promises (see the sketch below) are a convenient way to write that.
And as of Node.js v10.0.0, the callback parameter is no longer optional for fs.readFile; not passing it throws a TypeError at runtime.
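Here is a minimal sketch of a non-blocking promise-based route (assumptions: Express 4+, Node.js 10+ with fs.promises, and a data.txt file next to the server):

const express = require('express');
const fs = require('fs').promises;
const app = express();

app.get('/readFile', async (request, response) => {
  try {
    // The event loop stays free to serve other requests while the file is read.
    const data = await fs.readFile('data.txt', 'utf8');
    response.send(data);
  } catch (err) {
    response.status(500).send('Could not read file');
  }
});

app.listen(3000);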

Related

Like the readFileSync function in Node.js, I need http.getSync - a wrapper for http.get() to make it synchronous

I want to implement this https.getSync wrapper method so that it calls the API synchronously, just like the readFileSync method we use for reading a file synchronously.
const https = require('https');
How should I implement this method?
https.getSync = (url) => {
  let data = '';
  https.get(url, resp => {
    resp.on('data', (chunk) => {
      data += chunk;
    });
    resp.on('end', () => {
      console.log(JSON.parse(data));
    });
  }).on("error", (err) => {
    console.log("Error: " + err.message);
  });
  return data;
}
I want the two calls below to be made synchronously, without changing the calling code. For these calls I don't want to use promises or callbacks.
let api1 = https.getSync('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY');
let api2 = https.getSync('https://api.nasa.gov/planetary/apod?api_key=NNKOjkoul8n1CH18TWA9gwngW1s1SmjESPjNoUFo');
You can use the npm package sync-request.
It's quite simple:
var request = require('sync-request');
var res = request('GET', 'http://example.com');
console.log(res.getBody());
Here is the link: sync-request
Read this too, an announcement from the package itself, in case readers think using it is a good idea: "You should not be using this in a production application. In a node.js application you will find that you are completely unable to scale your server. In a client application you will find that sync-request causes the app to hang/freeze. Synchronous web requests are the number one cause of browser crashes."
In my view you should also avoid making synchronous HTTP requests. Instead, get comfortable with callbacks, promises, and async/await, as sketched below.
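As a sketch of the non-blocking approach (the helper name getAsync and the async IIFE are just for illustration), you can wrap https.get in a Promise and await it; the calling code then reads top to bottom even though nothing blocks:

const https = require('https');

// Hypothetical helper: resolves with the full response body as a string.
const getAsync = (url) => new Promise((resolve, reject) => {
  https.get(url, resp => {
    let data = '';
    resp.on('data', chunk => { data += chunk; });
    resp.on('end', () => resolve(data));
  }).on('error', reject);
});

(async () => {
  const api1 = await getAsync('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY');
  console.log(JSON.parse(api1));
})();

Note this still does not make the calls synchronous; it only makes them read sequentially, which is usually what is actually needed.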

Evaluate javascript for Fetch request

I am using Node.js on Windows with the Express module to generate an HTML page, where I would like to return data from a server-side function getfiledata() (i.e. I do not want to expose my js or txt file publicly).
I have been trying to use fetch() to return the value from getfiledata().
PROBLEM: I have not been able to get the data from getfiledata() returned to fetch().
HTML
<!DOCTYPE html>
<html>
<body>
<script type="text/javascript">
  function fetcher() {
    fetch('/compute', {method: "POST"})
      .then(function(response) {
        return response.json();
      })
      .then(function(myJson) {
        console.log(JSON.stringify(myJson));
      });
  }
</script>
<input type="button" value="Submit" onClick="fetcher()">
</body>
</html>
^^ contains my fetch() function
server
var express = require("express");
var app = express();
var compute = require("./compute")
app.post('/compute',compute.getfiledata);
compute.js
var fs = require('fs');

module.exports = {
  getfiledata: function() {
    console.log("we got this far");
    fs.readFile("mytextfile.txt", function (err, data) {
      if (err) throw err;
      console.log("data: " + data);
      return data;
    })
  }
}
^^ contains my server side function
Note: from compute.js the console successfully logs:
we got this far
data: this is the data in the text file
but doesn't log from:
console.log(JSON.stringify(myJson)) in the HTML
I suspect this is due to the fact I have not set up a "promise", but am not sure, and would appreciate some guidance on what the next steps would be.
I think you're on the way. I'd suggest making a few little changes, and you're all the way there.
I'd suggest using fs.readFileSync, since this is a really small file (I presume!!), so there's no major performance hit. We could use fs.readFile, however we'd need to plug in a callback and in this case I think doing all this synchronously is fine.
To summarize the changes:
We need to call getfiledata() on compute, since it's a function.
We'll use readFileSync to read your text file (since it's quick).
We'll call res.json to encode the response as JSON.
We'll use the express.static middleware to serve index.html.
To test this out, make sure all files are in the same directory.
Hit the command below to serve:
node server.js
And then go to http://localhost/ to see the web page.
server.js
var express = require("express");
var compute = require("./compute");
var app = express();
app.use(express.static("./"));
app.post('/compute', (req, res, next) => {
var result = compute.getfiledata();
res.status(200).json({ textData: result } );
});
app.listen(80);
compute.js
var fs = require('fs');

module.exports = {
  getfiledata: function() {
    console.log("we got this far");
    return fs.readFileSync("mytextfile.txt", "utf8");
  }
}
index.html
<!DOCTYPE html>
<html>
<body>
<script type="text/javascript">
  function fetcher() {
    fetch('/compute', {method: "POST"})
      .then(function(response) {
        return response.json();
      })
      .then(function(myJson) {
        console.log(JSON.stringify(myJson));
        document.getElementById("output").innerHTML = "<b>Result: </b>" + myJson.textData;
      });
  }
</script>
<input type="button" value="Submit" onClick="fetcher()">
<br><br>
<div id="output">
</div>
</body>
</html>
mytextfile.txt
Why, then, ’tis none to you, for there is nothing either good or bad, but thinking makes it so.
You have a couple problems here. First, you can't directly return asynchronous data from getfiledata(). See How do I return the response from an asynchronous call? for a full description of that issue. You have to either use a callback to communicate back the asynchronous results (just like fs.readFile() does) or you can return a promise that fulfills to the asynchronous value.
Second, a route handler in Express such as app.post() has to use res.send() or res.write() or something like that to send the response back to the caller. Just returning a value from the route handler doesn't do anything.
Third, you've seen other suggestions to use synchronous I/O. Please, please don't do that. You should NEVER use synchronous I/O in any server anywhere except in startup code because it absolutely ruins your ability for your server to handle multiple requests at once. Instead, synchronous I/O forces requests to be processed serially (with the whole server process waiting and doing nothing while the OS is fetching things from the disk) rather than letting the server use all available CPU cycles to process other requests while waiting for disk I/O.
Keeping these in mind, your solution is pretty simple. I suggest using promises since that's the future of Javascript and node.js.
Your compute.js:
const util = require('util');
const readFile = util.promisify(require('fs').readFile);

module.exports = {
  getfiledata: function() {
    return readFile("mytextfile.txt", "utf8");
  }
}
And, your server.js:
const express = require("express");
const compute = require("./compute");
const app = express();
app.use(express.static("./"));
app.post('/compute', (req, res) => {
compute.getfiledata().then(textData => {
res.json({textData});
}).catch(err => {
console.log(err);
res.sendStatus(500);
});
});
app.listen(80);
In Node.js version 10, there is an experimental promise-based API built into the fs module, so you don't even have to manually promisify it like I did above. Or you can use one of several third-party libraries that promisify the whole fs module at once, giving you promisified versions of all its functions.
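As a sketch, the same compute.js written against that built-in promise API (fs.promises, experimental in Node.js 10) would be:

// compute.js, assuming Node.js 10+ where fs.promises is available
const fsPromises = require('fs').promises;

module.exports = {
  getfiledata: function() {
    // Returns a promise that fulfills with the file contents as a string.
    return fsPromises.readFile("mytextfile.txt", "utf8");
  }
};

The server.js above works unchanged, since it only relies on getfiledata() returning a promise.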

How do I make a MongoDB query throw an error if there is no database connection? [duplicate]

I'm new to Mongo. I needed a database for a simple project and ended up following a tutorial using Mongo with Monk but I have problems understanding how to handle errors.
Background: I have a registration form on the client side. When the user clicks a button, the data is sent via AJAX to the controller that (upon validation, but this is not relevant now) inserts such data into the database and sends back either success or error. When the db is up all seems to work fine.
The problem: If I don't start the db and try to send the request anyway, no error is returned. Simply nothing happens. After some time on the console I get: POST /members/addmember - - ms - -.
I think some error should be returned to the user in this case, so how could I do this?
The post request is below (pretty much as from the tutorial):
// app.js
var db = monk('localhost:27017/dbname')
[...]
// I realize it might be not optimal here
app.use(function(req, res, next){
  req.db = db;
  next();
});

// members.js
router.post('/addmember', function(req, res) {
  var db = req.db;
  var collection = db.get('memberstest');
  collection.insert(req.body, function(err, result){
    res.json(
      (err === null) ? { msg: 'success' } : { msg: err }
    );
  });
});
If the db is down, I guess the problem actually starts even earlier than the insert, i.e. in that db.get(). So how can I check whether that get can actually be done? I suppose that, given the asynchronous nature of Node, something like a try/catch would be pointless here. Correct?
EDIT: After Neil's answer and a bit of trying, I put together the following, which seems to do the job. However, given my scarce confidence on this, I'd appreciate a comment on whether the code below works because it makes sense or just by chance. I added the bufferMaxEntries: 0 option and modified the controller as follows. In the AJAX callback I simply have an alert for now that shows the error message thrown (if any).
router.post('/addmember', async (req, res) => {
  try {
    let db = req.db;
    let collection = db.get('memberstest');
    collection.insert(req.body, function(err, result){
      res.json(
        (err === null) ? { msg: 'success' } : { msg: err }
      );
    });
    await db.then(() => 1);
  } catch(e) {
    res.json({msg: e.message})
  }
});
Well, you can actually set the bufferMaxEntries option (documented under Db but deprecated for that object usage; use it at the top level as demonstrated instead) on the connection, which essentially stops "queuing" requests on the driver when no connection is actually present.
As a minimal example:
index.js
const express = require('express'),
      morgan = require('morgan'),
      db = require('monk')('localhost/test', { bufferMaxEntries: 0 }),
      app = express();

const routes = require('./routes');

app.use(morgan('combined'));

app.use((req, res, next) => {
  req.db = db;
  next();
});

app.use('/', routes);

(async function() {
  try {
    await db.then(() => 1);
    let collection = db.get('test');
    await collection.remove({});
    await collection.insert(Array(5).fill(1).map((e, i) => ({ a: i + 1 })));
    console.log('inserted test data');

    await app.listen(3000, '0.0.0.0');
    console.log('App waiting');
  } catch(e) {
    console.error(e);
  }
})();
routes.js
var router = require('express').Router();

router.get('/', async (req, res) => {
  try {
    let db = req.db,
        collection = db.get('test');

    let response = await collection.find();
    res.json(response);
  } catch(e) {
    res.status(500).json(e);
  }
});

module.exports = router;
So I am actually awaiting the database connection to at least be present on "start up" here, but really only for example since I want to insert some data to actually retrieve. It's not required, but the basic concept is to wait for the Promise to resolve:
await db.then(() => 1);
Kind of trivial, and not really required for your actual code. But I still think it's good practice.
The real test is done by stopping mongod or otherwise making the server unreachable and then issuing a request.
Since we set the connection options to { bufferMaxEntries: 0 } this means that immediately as you attempt to issue a command to the database, the failure will be returned if there is no actual connection present.
Of course when the database becomes available again, you won't get the error and the instructions will happen normally.
Without the option, the default is to enqueue the operations until a connection is resolved, and then the "buffer" is essentially replayed.
You can simulate this (as I did) by stopping the mongod daemon and issuing requests: the route should simply return the caught error response. Then start the daemon again and issue requests, and they complete normally.
NOTE: Not required, but in fact the whole purpose of async/await syntax is to make things like try..catch valid again, since you can actually scope errors as blocks rather than using Promise.catch() or err callback arguments to trap them. The same principles apply when either of those structures is actually in use, though.
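Applied to the original /addmember route, a sketch of that pattern (relying on monk's insert returning a promise when no callback is passed) looks like this:

router.post('/addmember', async (req, res) => {
  try {
    let collection = req.db.get('memberstest');
    // With bufferMaxEntries: 0, this rejects immediately when no connection is present.
    await collection.insert(req.body);
    res.json({ msg: 'success' });
  } catch (e) {
    res.json({ msg: e.message });
  }
});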

Passing function on express js Route not working

I'm really new to Node and Express. I'm trying to pass a function instead of text on my route, but it doesn't seem to work. I looked at the documentation, and it only mentions text with the send() method. I'm trying to pass functions here, but it's not working. Also, alert() doesn't work: with something like req.send(alert('Hello world')) it says alert isn't defined, or something similar.
**Update:** I'm trying to use this library with Express and Node: https://github.com/przemyslawpluta/node-youtube-dl
I'm trying to pass functions like this:
function blaBla() {
  var youtubedl = require('youtube-dl');
  var url = 'http://www.youtube.com/watch?v=WKsjaOqDXgg';
  // Optional arguments passed to youtube-dl.
  var options = ['--username=user', '--password=hunter2'];
  youtubedl.getInfo(url, options, function(err, info) {
    if (err) throw err;
    console.log('id:', info.id);
    console.log('title:', info.title);
    console.log('url:', info.url);
    console.log('thumbnail:', info.thumbnail);
    console.log('description:', info.description);
    console.log('filename:', info._filename);
    console.log('format id:', info.format_id);
  });
}

app.get('/', (req, res) => {
  res.send(blaBla());
})
**Instead of**
app.get('/', function (req, res) {
res.send('Hello World!')
})
I hope you guys understood my question.
res.send() expects a string argument. So, you have to pass a string.
If you want the browser to execute some Javascript, then what you send depends upon what kind of request is coming in from the browser.
If it's a browser page load request, then the browser expects an HTML response and you need to send an HTML page string back. If you want to execute Javascript as part of that HTML page, then you can embed a <script> tag inside the page and then include Javascript text inside that <script> tag and the browser will execute that Javascript when the page is parsed and scripts are run.
If the route is in response to a script tag request, then you can return Javascript text as a string and you need to make sure the MIME type appropriately indicates that it is a script.
If the route is in response to an Ajax call, then it all depends upon what the caller of the Ajax call expects. If they expect a script and are going to execute the text as Javascript, then you can also just send Javascript text as a string. If they expect HTML and are going to process it as HTML, then you probably need to embed the <script> tag inside that HTML in order to get the Javascript executed.
In your example of:
response.send(blaBla());
That will work just fine if blaBla() synchronously returns a string that is formatted properly per the above comments about what the caller is expecting. If you want further help with that, then you need to show or describe for us how the request is initiated in the browser and show us the code for the blaBla() function because the issue is probably in the blaBla() function.
There are lots of issues with things you have in your question:
You show req.send(alert('Hello world')) in the text of your question. The .send() method belongs to the res object, not the req object (the second argument, not the first). So, that would be res.send(), not req.send().
In that same piece of code, there is no alert() function in node.js, but you are trying to execute it immediately and send the result with .send(). That won't work for a bunch of reasons.
Your first code block using blaBla() will work just fine as long as blaBla() returns a string of the right format that matches what the caller expects. If that doesn't work, then there's a problem with what blaBla() is doing so we need to see that code.
Your second code block works because you are sending a string, which is something the caller is equipped to handle.
Update now that you've shown the code for blaBla().
Your code for blaBla() does not return anything and it's asynchronous so it can't return the result. Thus, you cannot use the structure response.send(blaBla());. There is no way to make that work.
Instead, you will need to do something different like:
blaBla(response);
And, then modify blaBla() to call response.send(someTextValue) when the response string is known.
function blaBla(res) {
  var youtubedl = require('youtube-dl');
  var url = 'http://www.youtube.com/watch?v=WKsjaOqDXgg';
  // Optional arguments passed to youtube-dl.
  var options = ['--username=user', '--password=hunter2'];
  youtubedl.getInfo(url, options, function(err, info) {
    if (err) {
      res.status(500).send("Internal Error");
    } else {
      console.log('id:', info.id);
      console.log('title:', info.title);
      console.log('url:', info.url);
      console.log('thumbnail:', info.thumbnail);
      console.log('description:', info.description);
      console.log('filename:', info._filename);
      console.log('format id:', info.format_id);
      // construct your response here as a string
      res.json(info);
    }
  });
}
Note also that the error handling does not use throw because that is really not useful inside an async callback.
No one could really help me with this, and after digging around on my own I figured out how to do it. In Express there is something called middleware, and that is what we have to use to get this kind of thing done. Those who are really expert or have working experience with Express know this.
To use functions with Express you need to use middleware, as I'm showing below:
const express = require('express')
const youtubedl = require('youtube-dl');
const url = 'https://www.youtube.com/watch?v=quQQDGvEP10';
const app = express()
const port = 3000

function blaBla(req, res, next) {
  youtubedl.getInfo(url, function(err, info) {
    console.log('id:', info.id);
    console.log('title:', info.title);
    console.log('url:', info.url);
    // console.log('thumbnail:', info.thumbnail);
    // console.log('description:', info.description);
    console.log('filename:', info._filename);
    console.log('format id:', info.format_id);
  });
  next();
}

app.use(blaBla);

app.get('/', (request, response) => {
  response.send('Hey Bebs, what is going on here?');
})

app.listen(port, (err) => {
  if (err) {
    return console.log('something bad happened', err)
  }
  console.log(`server is listening on ${port}`)
})
And remember that you must call app.use(blaBla); above your route definition, otherwise this might not work.

node.js - first request returns blank, second request returns data

I'm setting up a node.js api using express, body-parser and mysql.
Whenever I make a get request to a route for the first time, I get a blank return. If I fire it a second time, I get the desired return.
Here is the basic setup...
server.js
var express = require('express');
var app = express();
var bodyParser = require('body-parser');

// include the model
var Tasks = require('./models/tasks.js');

// configure api to use bodyParser()
// this will let us get the data from a POST
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());

var port = process.env.PORT || 8080;
var router = express.Router();

// middleware to use for all requests
router.use(function(req, res, next) {
  // do logging
  //console.log(req)
  next(); // make sure we go to the next routes and don't stop here
});

router.route('/tasks')

// on routes that end in /tasks/:task_id
// ----------------------------------------------------
router.route('/tasks/:task_id')
  // get the task by id (accessed as GET)
  .get(function(req, res) {
    Tasks.setProjectId(req.params.task_id);
    Tasks.setResult(req.params.task_id);
    var tasks = Tasks.getTasksByProjectId();
    res.json(tasks);
  });

app.use('/v1', router);

// START THE SERVER
app.listen(port);
console.log('Magic happens on port ' + port);
and here is tasks.js
// include the db
require('../db.js');

var project_id, result;

module.exports.setProjectId = function(pid) {
  project_id = pid;
}

module.exports.setResult = function(pid) {
  //connection.connect();
  connection.query('SELECT * FROM tasks WHERE project_id = ' + pid, function(err, rows, fields) {
    if (err) throw err;
    result = rows;
  });
  //connection.end();
}

module.exports.getTasksByProjectId = function(){
  return result;
}
any ideas why the blank return on the first request?
Thanks!
Async, async, async.
You're trying to set a module global based on an asynchronous operation which completes some indeterminate time in the future.
But, before that async operation has completed, you're trying to read the result. So, the module global is still empty.
By the time you make the second request, the first one has actually completed, so you get to see that first result.
This whole model you're using simply will not work. You can't make asynchronous requests and then try to use their results synchronously. It won't work that way.
The results of all async operations can ONLY be used reliably in a callback that is called when the async operation completes. No stuffing of results into globals because you have ZERO idea of when the operation is done and it is safe to read the global.
.setResult() needs to take a callback function that you call when the result is ready, passing the result to the callback. That's where you get the result; you don't stuff it into a module global. A sketch of that shape follows.
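A minimal sketch of that shape, keeping the original module layout (note the parameterized query with ? is also safer than string concatenation):

// models/tasks.js
require('../db.js'); // assumed to set up the global `connection` as before

module.exports.getTasksByProjectId = function(projectId, callback) {
  connection.query('SELECT * FROM tasks WHERE project_id = ?', [projectId],
    function(err, rows) {
      callback(err, rows);
    });
};

// in the route
router.route('/tasks/:task_id')
  .get(function(req, res) {
    Tasks.getTasksByProjectId(req.params.task_id, function(err, tasks) {
      if (err) return res.status(500).json({ error: err.message });
      res.json(tasks);
    });
  });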
Your global variable result is trying to hold the value from the MySQL statement that gets executed. Yet you have to remember that this code in Node is asynchronous. As written, you have only done code separation / refactoring and nothing more; it does not make the query synchronous. Instead, make your setResult function wait for the asynchronous result by giving it a callback parameter. Happy coding!
