How to get json-server, when used as a module, to delay responses?

json-server allows one to configure responses to be delayed via the command line:
json-server --port 4000 --delay 1000 db.json
How does one try to do this when using json-server as a module? The following doesn't work:
const jsonServer = require('json-server');
const server = jsonServer.create();
server.use(jsonServer.defaults());
server.use(jsonServer.router("db.json"));
server.use(function (req, res, next) {
    setTimeout(next, 1000);
});
server.listen(4000);
The setTimeout middleware is simply ignored and never executed.

The order matters: middleware must be registered before the router. If you move your timeout middleware above server.use(jsonServer.router("db.json")), it should work.
Here is my working example:
const jsonServer = require('json-server');
const path = require('path');

const app = jsonServer.create();
const router = jsonServer.router(path.join(__dirname, '../../test/api/dev.json'));
const middlewares = jsonServer.defaults();

app.use(function (req, res, next) {
    setTimeout(next, 10000);
});
app.use(middlewares);
app.use(router);

const server = app.listen(3000, function () {
    console.log('JSON Server is running on localhost:3000');
    done(); // `done` comes from the enclosing test callback
});
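The delay middleware can also be factored out on its own; a minimal sketch (the delay helper is my own name, not part of json-server's API) showing that it simply defers next(), which is why it only has an effect when registered before the router:

```javascript
// A standalone delay middleware factory (illustrative; not part of
// json-server). It defers next() by the given number of milliseconds.
function delay(ms) {
  return function (req, res, next) {
    setTimeout(next, ms);
  };
}

// Quick check outside Express: next() only fires after the delay.
delay(50)({}, {}, () => {
  console.log('next() ran after the delay');
});
```

You would then register it with app.use(delay(1000)) before app.use(router).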


How to make Socket.IO serveClient option work?

I'd like to use the client distributed with the socket.io package in my client-side scripts.
This is what I did:
const IOserver = io.listen(server, { serveClient: true, path: "/socket.io.client.js" });
But when I try to access the socket.io client at that path (http://localhost:1337/socket.io.client.js) I get a 404 error.
How do I properly set up socket.io to serve the client-side JavaScript file?
I think you are confusing what the path property does. The path is the endpoint the client should use to connect to your websocket server.
You need to manage the installation of the frontend js client manually. It does not get fed in from your server.
I just ran into the same problem.
If you want the serveClient option to work, you need to do the following:
HTTP version:
// app.js file
const fs = require('fs');
const server = require('http').createServer((req, res) => {
    fs.readFile(__dirname + '/index.html', (err, data) => {
        if (err) {
            res.writeHead(500);
            return res.end('Error loading index.html');
        }
        res.writeHead(200);
        res.end(data);
    });
});
const io = require('socket.io')(server, { serveClient: true });
server.listen(1234, () => {
    console.log('Server listening at http://localhost:1234.');
});
Or the Express version:
const express = require('express');
const app = express();
const server = require('http').Server(app);
const io = require('socket.io')(server, { serveClient: true });
server.listen(1234, () => {
    console.log('Server listening at http://localhost:1234.');
});
You can then obtain the socket.io.js file at http://localhost:1234/socket.io/socket.io.js, and the socket.io.js.map file at http://localhost:1234/socket.io/socket.io.js.map.

Temporary 'loading' route in Express?

I have a web app that takes a moment to load, as it needs to connect to a database and do some other things that take time.
What's the best way to have a temporary loading route in Express?
I'd like to do something like the following:
const express = require('express');
const app = express();

// Temporary / route for if someone hits the server before it's finished starting
app.get('/', (req, res) => res.send(`Loading....`));

// In my non-demo app, there's a router here that takes a moment to load, instead of a timeout.
setTimeout(function () {
    app.get('/', (req, res) => res.send(`Ready!`));
}, 3 * 1000);

app.listen(3000, () => console.log('Example app listening on port 3000!'));
Routes can't be deleted at runtime, but you can add a middleware that checks whether everything is initialized; if it isn't, end the request with res.send('Loading...'), otherwise call next() to fall through to the real route.
let initialized = false;

app.get('/', (req, res, next) => {
    if (!initialized)
        return res.send('Loading...');
    next();
});

app.get('/', (req, res, next) => {
    res.send(`Ready!`);
});

setTimeout(() => initialized = true, 3000);
If your app needs some time to load properly, the best option is to NOT start listening until it is ready.
This works very well with e.g. load balancers and multiple containers, as they wait for the /health check to pass before putting the container behind the load balancer, which is something you want for modern services.
For example:
import { app } from './app';
import { config } from './config';
import { logger } from './components/ourLog';
import { initPromise } from './components/ourMongo';

const port = config.server.port;

async function startServer() {
    await initPromise;
    app.listen(port, () => {
        logger.info(
            {
                port,
                params: config.params,
                processEnv: process.env,
            },
            'App has started'
        );
    });
}

startServer().catch(err => {
    logger.error({ err }, 'Critical error, cannot start server');
    process.exit(1);
});
We have a component that connects to Mongo and exposes initPromise, a promise that resolves once the connection has been successfully established and you can start using the db.
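The shape of such a component can be sketched like this (a simulation: a timer stands in for the real database connect call, and all names are illustrative, not the code from the answer above):

```javascript
// Sketch of a component exposing an initPromise; a timer simulates the
// real connection step (e.g. a MongoDB client's connect()).
function makeDbComponent(connect) {
  // initPromise resolves once the connection has been established.
  const initPromise = connect();
  return { initPromise };
}

const component = makeDbComponent(
  () => new Promise(resolve => setTimeout(() => resolve('db-handle'), 10))
);

// The server only starts listening after initPromise resolves.
async function startServer() {
  const db = await component.initPromise;
  console.log('connected, safe to listen now');
  return db;
}

startServer();
```

The key point is that app.listen is only ever called inside the await-ed branch, so no request can arrive before the database is ready.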
You could ping the server every x seconds to test whether it is ready.
Server
We create a ready variable, initialized to false; once your database, API and other setup tasks are done, set it to true.
We also create a route such as /ping that responds with the value of ready.
const express = require('express');
const app = express();

let ready = false;
// Do database stuff, API stuff, etc.
// Set ready to true when ready.

// Temporary / route for if someone hits the server before it's finished starting
app.get('/', (req, res) => res.send(`Loading....`));
app.get('/ping', (req, res) => res.json({ ready }));

app.listen(3000, () => console.log('Example app listening on port 3000!'));
Client
Here we ping the server every x seconds (I set it to 0.5 s); once /ping returns true, we cancel the interval and run the initialization code that builds the page.
const interval = setInterval(async () => {
    const response = await fetch('/ping');
    const ready = await response.json();
    if (ready.ready) {
        clearInterval(interval);
        loadpage();
    }
}, 500);

Is this the correct way to share an instance of a Redis client in Express?

Because I have the application routes defined in separate scripts, I need a way to share the same Redis client across all of them. Is this the correct way of doing it?
app.js
const express = require('express');
const app = express();

const redis = require('redis');
const client = redis.createClient();

app.use((req, res, next) => {
    req.redis = client;
    next();
});

// load routes

app.listen(3000, () => console.log('Example app listening on port 3000!'));
...and in another js file:
app.get('/xyz', (req, res) => {
    const redis = req.redis;
    // use redis
});
You're unnecessarily attaching your client instance to every request, regardless of whether it's needed.
I use a fairly standard controller module which you can then require() as needed; example below.
var redis = require('redis')
    .createClient(...);

redis.on('connect', function () {
    console.log('Redis server online.');
});

module.exports = redis;
Then, whenever you want to access it:
var redis = require('./app/controllers/redis');
// redis => Redis client instance

Node.js Express response is ended before Promise resolve

I'm executing the following code on Node.js with Express, but it does not return anything. It looks like res.send does not work from within promise.then(), as if the handler had already returned to the caller before the promise resolved. What am I doing wrong? According to the examples I've seen, it should work. Thanks.
const express = require('express');
const app = express();

app.get('/test', (req, res, next) => {
    res.send("Good"); // works just fine, but I don't need to respond here
    getMessage().then(function (data) {
        console.log("I'm here"); // this message is logged
        res.send("Good"); // does not work
    }).catch(function (error) {
        res.send("Error");
    });
});

function getMessage() {
    return new Promise(function (resolve, reject) {
        setTimeout(function () {
            resolve();
        }, 3000);
    });
}

app.listen(PORT, () => {
    console.log("run");
});
Please add the following code to your app. This starts a server and listens on port 8080 for connections:
app.listen(8080, function () {
    console.log('Example app listening on port 8080!');
});
You need to listen on a port for the express server to run.
const express = require('express');
const app = express();
const port = process.env.PORT || 3000;

app.get('/test', (req, res, next) => {
    getMessage().then(function (data) {
        res.send("Good");
    }).catch(function (error) {
        res.send("Error");
    });
});

function getMessage() {
    return new Promise((resolve, reject) => {
        setTimeout(function () {
            resolve();
        }, 3000);
    });
}

app.listen(port, () => console.log(`App listening at http://localhost:${port}`));
The problem turned out to be the request timeout setting in Postman, which I was using for testing.
Remove the first
res.send("Good");
Alternatively, you could call res.write() first:
res.write("Good");
The written chunks are then concatenated when res.send() is eventually called. res.send() closes the connection between the server and the client, so res.write() buffers the message, and when res.send() is called all the written messages are sent back to the client.
In short:
res.write() can happen several times per request;
res.send() happens only once per request.

Create a request waiting-list on NodeJs

I'd like to build a Node.js server that responds to requests one at a time.
Basically: when a client does fetch('<domain>/request/<id>'), I want the other requests to be queued until that client has received its data. Is this possible?
An npm module like express-queue could work.
var express = require('express');
var queue = require('express-queue');
var app = express();

app.use(queue({ activeLimit: 1 }));

app.use("*", function (req, res, next) {
    setTimeout(function () {
        res.send("OK");
    }, 2000);
});

app.listen(3000);
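If you'd rather avoid the extra dependency, the same idea can be hand-rolled with a promise chain that lets only one task run at a time (a sketch; the names are illustrative and this is not express-queue's API):

```javascript
// A dependency-free serializer: tasks run strictly one after another,
// each starting only once the previous one has settled.
function makeQueue() {
  let tail = Promise.resolve();
  return function enqueue(task) {
    const result = tail.then(() => task());
    tail = result.catch(() => {}); // keep the chain alive after errors
    return result;
  };
}

// In an Express handler you would wrap the handler body in enqueue(...).
const enqueue = makeQueue();
const order = [];
enqueue(async () => { order.push(1); });
enqueue(async () => { order.push(2); });
enqueue(async () => { order.push(3); }).then(() => console.log(order)); // [ 1, 2, 3 ]
```

Each request's work is appended to the tail of the chain, so responses go out in arrival order even when the tasks themselves are asynchronous.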
