Node deploy failing because container did not respond to warmup request - javascript

I'm deploying a Node app to Azure, and the deploy is failing because the container is not responding to the warmup request. The app starts locally and listens on the same port the warmup request targets. Here are the deploy logs:
2022-10-29T06:01:45.287Z INFO - Initiating warmup request to container
customcalligraphy_0_9c31f200 for site customcalligraphy
2022-10-29T06:03:45 No new trace in the past 1 min(s).
2022-10-29T06:04:45 No new trace in the past 2 min(s).
2022-10-29T06:05:45 No new trace in the past 3 min(s).
2022-10-29T06:01:45.035657849Z _____
2022-10-29T06:01:45.035686150Z / _ \ __________ _________ ____
2022-10-29T06:01:45.035691350Z / /_\ \\___ / | \_ __ \_/ __ \
2022-10-29T06:01:45.035695650Z / | \/ /| | /| | \/\ ___/
2022-10-29T06:01:45.035699650Z \____|__ /_____ \____/ |__| \___ >
2022-10-29T06:01:45.035703750Z \/ \/ \/
2022-10-29T06:01:45.035707650Z A P P S E R V I C E O N L I N U X
2022-10-29T06:01:45.035711351Z
2022-10-29T06:01:45.035714951Z Documentation: http://aka.ms/webapp-linux
2022-10-29T06:01:45.035718651Z NodeJS quickstart: https://aka.ms/node-qs
2022-10-29T06:01:45.035722451Z NodeJS Version : v18.2.0
2022-10-29T06:01:45.035726151Z Note: Any data outside '/home' is not persisted
2022-10-29T06:01:45.035729951Z
2022-10-29T06:01:45.152855802Z Starting OpenBSD Secure Shell server: sshd.
2022-10-29T06:01:45.215494733Z Starting periodic command scheduler: cron.
2022-10-29T06:01:45.235215447Z Cound not find build manifest file at '/home/site/wwwroot/oryx-manifest.toml'
2022-10-29T06:01:45.242910847Z Could not find operation ID in manifest. Generating an operation id...
2022-10-29T06:01:45.242928347Z Build Operation ID: feba110f-d139-44e2-b66c-0e10cff5c53d
2022-10-29T06:01:45.476505931Z Environment Variables for Application Insight's IPA Codeless Configuration exists..
2022-10-29T06:01:45.522707434Z Writing output script to '/opt/startup/startup.sh'
2022-10-29T06:01:45.595152421Z Running #!/bin/sh
2022-10-29T06:01:45.595192122Z
2022-10-29T06:01:45.595198422Z # Enter the source directory to make sure the script runs where the user expects
2022-10-29T06:01:45.595204922Z cd "/home/site/wwwroot"
2022-10-29T06:01:45.595210723Z
2022-10-29T06:01:45.595215423Z export NODE_PATH=/usr/local/lib/node_modules:$NODE_PATH
2022-10-29T06:01:45.595220223Z if [ -z "$PORT" ]; then
2022-10-29T06:01:45.595224923Z export PORT=8080
2022-10-29T06:01:45.595229723Z fi
2022-10-29T06:01:45.595234423Z
2022-10-29T06:01:45.595239023Z npm start
2022-10-29T06:01:47.149225225Z npm info it worked if it ends with ok
2022-10-29T06:01:47.149401829Z npm info using npm@6.14.15
2022-10-29T06:01:47.150028044Z npm info using node@v18.2.0
2022-10-29T06:01:47.346984068Z npm info lifecycle custom-calligraphy-ecommerce@1.0.0~prestart: custom-calligraphy-ecommerce@1.0.0
2022-10-29T06:01:47.348375300Z npm info lifecycle custom-calligraphy-ecommerce@1.0.0~start: custom-calligraphy-ecommerce@1.0.0
2022-10-29T06:01:47.352639898Z
2022-10-29T06:01:47.352662299Z > custom-calligraphy-ecommerce@1.0.0 start /home/site/wwwroot
2022-10-29T06:01:47.352681799Z > node index.js
2022-10-29T06:01:47.352686199Z
2022-10-29T06:01:48.466127815Z STARTING CUSTOM CALLIGRAPHY SERVER
2022-10-29T06:01:49.692607132Z CCvbeta1.1 Listening on port 8080!
2022-10-29T06:01:50.344268374Z serving app
2022-10-29T06:02:21.332686762Z serving app
2022-10-29T06:02:52.341610061Z serving app
2022-10-29T06:03:23.343535160Z serving app
2022-10-29T06:03:54.350896814Z serving app
2022-10-29T06:04:25.361345562Z serving app
2022-10-29T06:04:56.364076228Z serving app
2022-10-29T06:05:27.370947666Z serving app
2022-10-29T06:05:57.383Z ERROR - Container customcalligraphy_0_9c31f200 for site customcalligraphy did not start within expected time limit. Elapsed time = 252.0962469 sec
2022-10-29T06:05:57.401Z ERROR - Container customcalligraphy_0_9c31f200 didn't respond to HTTP pings on port: 8080, failing site start. See container logs for debugging.
2022-10-29T06:05:57.408Z INFO - Stopping site customcalligraphy because it failed during startup.
My app is actually running, but the deploy is failing. Here is my server-side code:
console.log("STARTING CUSTOM CALLIGRAPHY SERVER");
const express = require("express");
const path = require("path");
const app = express();
app.use(express.static(path.join(__dirname, "frontend", "build")), () =>
console.log("serving app")
);
app.use(express.json({ limit: "5gb" }));
const port = process.env.port || 8080;
const version = "beta1.1";
app.listen(port, () => console.log(`CCv${version} Listening on port ${port}!`));
How can I disable the warmup request or respond to the pings? Any help would be appreciated.

I would suggest using express-generator to create the app. Just run
npx express-generator myapp
Then you can add your code to the app.js file.
My app file looks like this:
console.log("STARTING CUSTOM CALLIGRAPHY SERVER");
const express = require("express");
const path = require("path");
const app = express();
app.use(express.static(path.join(__dirname, "frontend", "build")), () =>
console.log("serving app")
);
app.use(express.json({ limit: "5gb" }));
const port = process.env.port || 8080;
module.exports = app;
Here, instead of listening on the port in the app.js file, we export the app, and all the port listening is done in the bin/www file.
That www file requires app.js and runs the app.
Now you can just deploy the app by your preferred method. Here I used VS Code.
This way you will not get the "could not find build manifest file" errors.
Logs after deployment of the app:
[screenshot of the post-deployment logs]
Also use port 80, as it is already exposed in App Service.

Related

Docker compose, JS: Issue while connecting to doc. container PHP Apache with socket.io from localhost

Windows 10, Docker Desktop + VS.
docker-compose.yml has two images:
node.js with socket.io
php:7.4-apache with socket.io
I need to connect the socket.io client in php-apache (the website) to the socket.io server in node.js (which handles data from php-apache), all in one docker-compose.yml, so it can run on a remote VM (hah, sounds like it should be possible for an ordinary mortal).
First, I tried connecting two containers within docker-compose (node.js to node.js) to be sure the Docker ports + socket.io ports were set correctly (successfully).
(Server side: socket.io listens on port 90, Docker 90:90, service name in docker-compose is node_server || Client side: io(`http://node_server:90`), Docker 85:85.)
Having confirmed the two containers were linked and the ports set correctly, I started on the "php-apache - node.js" socket.io docker-compose.yml.
My php-apache container, with the socket.io client inside, is linked as "http://node_client:80".
When I tried opening localhost:80 (the PHP container), everything loaded fine, but an error occurred: net::ERR_NAME_NOT_RESOLVED (polling-xhr.js:206).
[screenshot: the error shown while connecting to localhost:80 (the docker php-apache container)]
The docker php-apache container is mapped 80:80 in docker-compose.yml.
[screenshot: request URL]
I can connect to Apache (it opens the window and all the HTML written in index.php), but I get err_name_not_resolved, as if the socket.io client on the php-apache side can't make its request.
Checked ping: both containers can reach each other (pinged node_server:90 / node_client:80 from each terminal).
Checked from the php-apache container: curl "http://node_server:90/socket.io/?EIO=4&transport=polling" also returned data.
I do understand some trouble has to occur (because I connect from localhost to the Docker PHP container, and the socket.io client there gets confused about which IP to use: 172.0.x, 192.168.x, etc.), but I have no idea how to solve it. I need to somehow reach index.php on Apache together with socket.io.
I need to use php-apache and connect it to node.js. I confirmed socket.io works node.js -> node.js, but with php-apache -> node.js something breaks when connecting via localhost.
docker-compose.yml:
version: "3"
services:
  node_client:
    container_name: client
    image: img_client
    ports:
      - "80:80"
    networks:
      - test
  node_server:
    container_name: server
    image: img_server
    ports:
      - "90:90"
    networks:
      - test
networks:
  test:
    external: true
Docker.client:
FROM php:7.4-apache
COPY ./client /var/www/html
EXPOSE 80
#(in ./client -> index.php)
./client/index.php:
<script src="https://cdn.socket.io/3.1.3/socket.io.min.js" integrity="sha..." crossorigin="anonymous"></script>
<script>
  const socket = io(`http://node_server:90`, {
    //secure: true,
    //transport: ['websocket'],
    //upgrade: false,
    //rejectUnauthorized: false,
  });
  socket.on('connect', () => { console.log(socket.id) });
</script>
Docker.server:
FROM node:alpine
COPY ./server /app
WORKDIR /app
COPY package*.json ./
COPY . .
CMD [ "node", "./server.mjs" ]
EXPOSE 90
//(in ./server -> server.mjs + node_modules)
./server/server.mjs:
import { createRequire } from 'module';
const require = createRequire(import.meta.url);
const express = require('express');
const app = express();
const cors = require("cors");
//app.use(cors({}))
const server = require('http').createServer(app);
const { Server } = require("socket.io");
const io = new Server(server, {
  //rejectUnauthorized: false,
  cors: {
    origin: '*',
    //methods: ["GET", "POST"],
    //credentials: true,
  }
});
server.listen(90, () => { console.log("Server is Ready!"); });
//Also tried (90, '0.0.0.0', () => ...), but there would be CORS trouble, I believe; anyway I can't even pin down the problem, and socket.io in "node.js + node.js" worked
//server.listen(PORT_SERVER, '0.0.0.0', () => { console.log("Server is Ready!"); });
io.on('connection', (socket) => { console.log(`Connected!`) });
Heh.. I tried to understand what a proxy is and whether nginx could somehow be used to connect the first Docker container, php-apache with the CDN module (client), to the second Docker container, node.js (server), all within docker-compose.yml, but gave up.
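One detail worth spelling out about the setup above (this is a hypothesis about this configuration, not a confirmed diagnosis): the `<script>` served by php-apache runs in the visitor's browser on the host machine, not inside the container, so the Docker service name node_server is only resolvable from other containers on the "test" network, which would explain the ERR_NAME_NOT_RESOLVED. A sketch of the distinction, assuming the page is opened from the host where port 90 is published:

```yaml
# Reachable from *other containers* on the "test" network:
#   http://node_server:90   (Docker's embedded DNS resolves service names)
# Reachable from the *host* (and therefore from a browser on the host):
#   http://localhost:90     (because of the "90:90" port mapping below)
node_server:
  container_name: server
  image: img_server
  ports:
    - "90:90"   # host:container - this is what a browser on the host can reach
  networks:
    - test
```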

Heroku) Ways to use 2 ports on FREE dyno type (Only provide one port for an App.)? [duplicate]

Here is the deployment challenge I'm currently facing. I've googled quite extensively and tried a few approaches, with no luck.
I have app A that uses Express as the server, serving on port 5000. Then I have app B that uses Next as the server, serving on port 3000. I wrote some code to integrate the two apps into one, and I'm trying to deploy it to Heroku. I keep getting deployment failures. Here is my package.json:
"scripts": {
  "start": "concurrently \"npm run dev\" \"next\"",
  ....
Alternatively, you can use Next.js's Custom Server functionality. Simply define your API routes first and then place expressApp.all('*', nextHandler) at the end, so that Next catches everything not handled by your API endpoints.
const express = require('express');
const next = require('next');
const APIRouter = require('./routes');

const dev = process.env.NODE_ENV !== 'production';
const hostname = 'localhost';
const port = 3000;

const nextApp = next({ dev, hostname, port });
const nextHandler = nextApp.getRequestHandler();
const expressApp = express();

nextApp.prepare().then(() => {
  expressApp.use(APIRouter());
  expressApp.all('*', nextHandler);
  expressApp.listen(port, () => {
    console.log(`App listening on port ${port}`);
  });
});
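The reason the ordering matters is that Express dispatches to routes in registration order and stops at the first match. A toy sketch of first-match dispatch (plain JavaScript with hypothetical names, not Express internals) shows why the catch-all must be registered last:

```javascript
// Toy first-match router: handlers are tried in registration order,
// which is why a '*' catch-all must be registered after the API routes.
const routes = [];

function use(pattern, handler) {
  routes.push({ pattern, handler });
}

function dispatch(url) {
  for (const { pattern, handler } of routes) {
    if (pattern === '*' || url.startsWith(pattern)) return handler(url);
  }
  return '404';
}

use('/api', () => 'handled by the API router'); // API routes first...
use('*', () => 'handled by nextHandler');       // ...catch-all last

// dispatch('/api/items') hits the API route; dispatch('/about') falls
// through to the catch-all, just like expressApp.all('*', nextHandler).
```

If the '*' entry were registered first, every request, including API calls, would be swallowed by Next's handler.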

React-router URL's, Routing node/express & file structure

File structure
.root
|-client
| -build
| -node_modules
| -public
| -src
| -components
| -resources
| -data
| -images
| -templates
|-server
| -models
| -public
| -routes
| -server.js
server.js
//Required NPM packages
const express = require('express'),
      app = express(),
      path = require('path'), // required for path.join below
      cors = require('cors'),
      bodyParser = require('body-parser'),
      mongoose = require('mongoose'),
      methodOverride = require('method-override');
//MongoDB models.
const Product = require('./models/Product');
//Routes.
const indexRoute = require('./routes/index');
const demoRoute = require('./routes/demo');
const bootstrapTemplateRoute = require('./routes/bootstrapTemplate');
//Port.
const _PORT = process.env.PORT || 5000;
//Connect to mongoDB.
mongoose.connect({pathToMongoDB}, {useNewUrlParser: true, useUnifiedTopology: true});
//Setup body-parser.
app.use(bodyParser.urlencoded({extended: true}));
//Allow express/node to accept cross-origin resource sharing.
app.use(cors());
//Set view engine to EJS.
app.set('view engine', 'ejs');
//Change views to the specified directory.
app.set('views', path.join(__dirname, '..', 'client', 'src', 'Resources', 'templates'));
app.use(express.static(__dirname + "/public"));
app.use(express.static("client/build"));
app.use(express.static("client/Resources/templates"));
//Setup method-override.
app.use(methodOverride("_method"));
//Register routes with express.
app.use('/', indexRoute);
app.use('/demo', demoRoute);
app.use('/bootstrapTemplate', bootstrapTemplateRoute);
//Listen on the established port.
app.listen(_PORT, () => {
  console.log(`The server has started on port ${_PORT}!`);
});
module.exports = app;
Question
When I click the back button or load a page from the browser's address bar, Node states that it cannot GET the page. I recognise that the front-end and back-end requests are different, as React Router handles routing via its own JavaScript, allowing zero page reloads, but how do you actually solve this problem in Node/Express?
Also, when I go to localhost:3000/demo it returns the data from my MongoDB and renders it as JSON rather than loading the correct page.
Currently working on a MERN Stack with the below Nginx basic routing config.
http {
  server {
    listen 3000;
    root pathToRoot/client/build;
    location / {
      proxy_pass http://localhost:5000/;
    }
  }
}
events {
}
I believe the problem is in my routing via Express/Node. I can't create Express-specific routes and render the correct page, because React is rendering it for me. I looked at the following question: React-router urls don't work when refreshing or writing manually. Should I just do a catch-all and re-route back to the main index page? That seems like it would render bookmarks unusable.
Edit
Here are the 2 node routes.
index.js
const express = require('express'),
      router = express.Router();
router.get("/", (req, res) => {
  //Render index page.
  res.render("index");
});
demo.js
const express = require('express'),
      router = express.Router();
const Product = require('../models/Product');
router.get('/:searchInput', (req, res) => {
  Product.find({ $text: { $search: req.params.searchInput } }, (err, foundItem) => {
    if (err) {
      console.log(err);
    } else {
      res.send(foundItem);
    }
  });
});
router.get('/', (req, res) => {
  Product.find({}, (err, foundItem) => {
    res.send(foundItem);
  });
});
module.exports = router;
I would try to separate your development folders from your build folders, as it can get a bit messy once you start running react build. A structure I use is:
api build frontend run_build.sh
The api folder contains my development for the Express server, frontend contains my development for React, and build is created by the run_build.sh script, which looks something like this:
#!/bin/bash
rm -rf build/
# Build the front end
cd frontend
npm run build
# Copy the API files
cd ..
rsync -av --progress api/ build/ --exclude node_modules
# Copy the front end build code
cp -a frontend/build/. build/client/
# Install dependencies
cd build
npm install
# Start the server
npm start
Now in your build directory you should have a subfolder client, which contains the built version of your React code without any clutter. To tell Express to use certain routes for React, add the following in the Express server.js file.
NOTE: add your Express API routes first, before adding the React routes, or it will not work.
// API Routing files - Located in the routes directory //
var indexRouter = require('./routes/index')
var usersRouter = require('./routes/users');
var oAuthRouter = require('./routes/oauth');
var loginVerification = require('./routes/login_verification')
// API Routes //
app.use('/',indexRouter);
app.use('/users', usersRouter);
app.use('/oauth',oAuthRouter);
app.use('/login_verification',loginVerification);
// React routes - Located in the client directory //
app.use(express.static(path.join(__dirname, 'client'))); // Serve react files
app.use('/login',express.static(path.join(__dirname, 'client')))
app.use('/welcome',express.static(path.join(__dirname, 'client')))
The App function in the App.js component file will then look like the following, defining the routes you just told Express to use for React.
function App() {
  return (
    <Router>
      <div className="App">
        <Switch>
          <Route exact path="/login" render={(props) => <Login/>}/>
          <Route exact path="/welcome" render={(props) => <Welcome/>}/>
        </Switch>
      </div>
    </Router>
  );
}
Under the components folder there are components for Login and Welcome. Now navigating to http://MyWebsite/login will prompt Express to use the React routes, while, for example, navigating to http://MyWebsite/login_verification will use the Express API routes.

Postgresql on OpenShift 2, using node.js

I have an app that uses Node.js and PostgreSQL on OpenShift. I can connect to the database and make queries locally, but I can't get it to work on the OpenShift server. When I push to the server, I get this error:
Waiting for application port (8080) become available ...
Application 'myapp' failed to start (port 8080 not available)
But I'm using port 8080...
My openshift ports are:
Service --- Local --------------- OpenShift
node ------ 127.0.0.1:8080 => 127.8.120.129:8080
postgresql 127.0.0.1:5432 => 127.8.120.130:5432
And here are the important lines of code.
First, server.js:
...
var db = require('./postgresql/database.js');
db.sync();
...
var server_port = process.env.OPENSHIFT_NODEJS_PORT || 8080
var server_ip_address = process.env.OPENSHIFT_NODEJS_IP || '127.0.0.1'
server.listen(server_port, server_ip_address, function () {});
...
And database.js:
var Sequelize = require('sequelize');
var bd_url = process.env.OPENSHIFT_POSTGRESQL_DB_URL || 'postgres://user:pass@127.0.0.1:5432/sw';
var sequelize = new Sequelize(bd_url, {
  dialect: 'postgres',
  dialectOptions: {}
});
module.exports = sequelize;
Does anyone know what could be failing?
Thanks!
OpenShift provides a default web server (written in Ruby) on almost every container/cartridge you create.
Every service is started using the "start" action hook, located at:
$OPENSHIFT_REPO_DIR/.openshift/action_hooks/start
You may find a line like this one:
[]\> nohup $OPENSHIFT_REPO_DIR/diy/testrubyserver.rb $OPENSHIFT_DIY_IP $OPENSHIFT_REPO_DIR/diy |& /usr/bin/logshifter -tag diy &
To verify which application is using port 8080, you can execute the "oo-lists-ports" command.
This command is just an alias for the "lsof" command.
Execute it without any arguments and you'll see the application that is locking your port 8080 (in my case):
[]\> oo-lists-ports
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
node 88451 1027 10u IPv4 392176 0t0 TCP 127.2.1.129:8080 (LISTEN)
[]\>
With the above information (the PID), you just need to kill the related process (in my case):
[]\> ps -ef |grep 88451
1027 62829 61960 0 08:33 pts/0 00:00:00 grep 88451
1027 88451 1 0 Jun21 ? 00:00:16 node faceBot.js
[]\> kill -9 88451
After killing the process that was locking port 8080, you will be able to run your Node.js stack on that port.
Regards

What is the difference between running React server-side rendering locally and on a real Node.js server (IBM Bluemix)?

I created a simple React app with server-side rendering, using this workshop git repo as a base with my minor changes.
When I run NODE_ENV=server node server.js locally, it works fine. But my attempts to deploy this app on a Bluemix trial failed on the Node.js server. Here's a log:
[screenshot of the deployment log]
Here is my server.js code:
require('babel-register')
const express = require('express')
const React = require('react')
const ReactDOMServer = require('react-dom/server')
const ReactRouter = require('react-router')
const StaticRouter = ReactRouter.StaticRouter
const _ = require('lodash')
const fs = require('fs')
const PORT = 5050
const baseTemplate = fs.readFileSync('./index.html')
const template = _.template(baseTemplate)
const App = require('./js/App').default
const server = express()
server.use('/_public', express.static('./_public'))
server.use((req, res) => {
  const context = {}
  const body = ReactDOMServer.renderToString(
    React.createElement(StaticRouter, { location: req.url, context: context },
      React.createElement(App))
  )
  res.write(template({ body: body }))
  res.end()
})
console.log('listening on port', PORT)
server.listen(PORT)
P.S. It's obvious that it doesn't understand the ES6 syntax in js/App.js, but on my local server it works.
By default NODE_ENV=production, but following the Bluemix docs I created a file in the .profile.d directory.
node_env.sh code:
export NODE_ENV=server;
But I'm not sure whether this file actually changes NODE_ENV.
I'm hoping someone more knowledgeable than me can offer a better solution, but here is what I did to make your app work. There is probably a better answer.
Assuming that you do NOT want to run in production mode...
1) server.js: Listen to the port as set in the PORT env var.
server.listen(process.env.PORT || PORT)
2) package.json: Add start command in scripts
"start": "babel-node server.js --presets es2015,stage-2"
3) Get babel-cli
npm install --save-dev babel-cli
npm install --save-dev babel-preset-es2015 babel-preset-stage-2
4) Create a manifest.yml to set CF properties
applications:
- name: rvennam-node-react
  memory: 1G
  disk_quota: 2G
  env:
    NPM_CONFIG_PRODUCTION: false
    NODE_ENV: dev
5) remove eslint dependencies from devDependencies in package.json (there was a mismatch)
Again, this is assuming you want to run on Bluemix under dev mode. If you wanted production on Bluemix, I would think you would want to use webpack to build locally, and then push and serve your dist directory.
