I'd like a way to start a harp.js server and run a browser-sync process at the same time. It works like a charm on Linux. This is the content of my package.json:
{
"scripts": {
"dev": "bash ./serve.sh"
}
}
And this is serve.sh:
#!/bin/bash
harp server &
browser-sync start --proxy 'localhost:9000' --files '*.jade, *.less'
When doing this on Windows, I presume I'm supposed to write a .bat file, but isn't there a way to translate harp server & browser-sync etc. into a corresponding command on Windows?
For a package.json file, we would do something like this instead:
{
"scripts": {
"dev": "harp server & browser-sync start --proxy 'localhost:9000' --files '*.jade, *.less'"
}
}
give that a spin.
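One caveat: on Windows, cmd runs `a & b` sequentially, so if `harp server` never exits the second command may never start. A cross-platform sketch using the `concurrently` package (an extra dev dependency, not part of the original setup) could look like this; note the quoting of the `--files` pattern may need adjusting for cmd:

```json
{
  "scripts": {
    "dev": "concurrently \"harp server\" \"browser-sync start --proxy localhost:9000 --files '*.jade, *.less'\""
  }
}
```

Install it with `npm install --save-dev concurrently`, and `npm run dev` then behaves the same on Linux and Windows.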
I know this topic has been created before, but nothing I tried could fix it. The problem is precisely the following. I have a react script on an AWS ec2 server, that I want to execute automatically, whenever the instance is starting. For this purpose, the following script is executed at the start of the AWS server:
#!/usr/bin/python3
import time
import shlex, subprocess
args = shlex.split('sudo su ubuntu -c "/usr/bin/npm start --prefix /home/ubuntu/my-app > /home/ubuntu/output.txt 2>&1"')
subprocess.Popen(args)
When I run the script manually, everything works just fine. But whenever it is run during the server start, I get the following log:
> my-app#0.1.0 start /home/ubuntu/my-app
> react-scripts start
^[[34mℹ^[[39m ^[[90m「wds」^[[39m: Project is running at http://172.31.14.57/
^[[34mℹ^[[39m ^[[90m「wds」^[[39m: webpack output is served from
^[[34mℹ^[[39m ^[[90m「wds」^[[39m: Content not from webpack is served from /home/ubuntu/my-app/public
^[[34mℹ^[[39m ^[[90m「wds」^[[39m: 404s will fallback to /
Starting the development server...
That's all - nothing happens. Does anybody have an idea how to fix this? I thought it had something to do with the fact that it is started as root, so I tried to fix that by using sudo su ubuntu -c, but it doesn't help either.
My guess is that it's the same problem as this issue. When you call npm start, by default it calls the start script in package.json, which points to:
"start": "react-scripts start",
There was a change in react-scripts that checks for a non-interactive shell when the CI variable is not set (see here).
But when you start it using sudo su ubuntu -c, it runs in a non-interactive shell.
What could work is setting the CI variable to true, like this:
export CI=true
sudo su ubuntu -c "/usr/bin/npm start ....."
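Whether the variable actually reaches the inner process can be sanity-checked without AWS. Below, plain `sh -c` is a hypothetical stand-in for the `sudo su ubuntu -c` invocation (both start a fresh shell for the command string); assigning CI inline inside that string guarantees the inner process sees it:

```shell
# 'sh -c' stands in for 'sudo su ubuntu -c': both spawn a fresh shell.
# The inline CI=true assignment is exported only to the inner command.
sh -c 'CI=true sh -c "echo CI=\$CI"'
# prints: CI=true
```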
You can also create a new script inside package.json:
"scripts": {
"start": "react-scripts start",
"ec2-dev": "CI=true;export CI; react-scripts start",
.....
}
and run:
/usr/bin/npm run ec2-dev
instead of npm start
Starting a development server is only useful if you mount the src folder to a local directory via NFS or another file-sharing mechanism, so you can use the live-reload capability of react-scripts to instantly pick up changes during development.
If it's for other purposes, you need to build the app using:
npm run build
Then provision the build directory artifacts and serve them using a web server.
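As a sketch of that flow, the `serve` package used below is just one illustrative static file host (nginx or an S3 bucket behind a CDN work equally well), not something the original setup prescribes:

```shell
# Build the optimized production bundle into ./build
npm run build
# Serve the build directory statically; 'serve' is one option among many
npx serve -s build
```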
Following this tutorial to get an api and client on google cloud platform:
https://www.freecodecamp.org/news/create-a-react-frontend-a-node-express-backend-and-connect-them-together-c5798926047c/
I have a root dir with /api and /client inside it.
The /api package.json has the following script:
"scripts": {
"start": "node ./bin/www"
},
The /client package.json has the following script:
"scripts": {
"client-install": "npm install --prefix client",
"start": "node server.js",
"server": "nodemon server.js",
"client": "npm start --prefix client",
"dev": "concurrently \"npm run server\" \"npm run client\""
},
When I try to deploy it says:
Step #0: Application detection failed: Error: node.js checker: Neither
"start" in the "scripts" section of "package.json" nor the "server.js"
file were found. Finished Step #0
I'm thinking it can't find the scripts? What is the best approach to start my api and client at the same time when I deploy?
My supposition is that the reported problem is caused by the fact that there is no package.json file in the root directory. You could try to add one there to handle things in the api and client directories, but...
... but I took a look at the tutorial you used and found some things I don't like, so I want to give you some suggestions. I recently developed a simple tool to visualize charts about covid-19 diffusion in Italy; it uses the same structure as your app with different approaches. The main difference is that I deploy it on a VPS and use an external tool to launch it, so my package.json file doesn't contain the script command to launch it in prod. That command is something like:
cd client && npm install && npm run build && cd .. && npm install && node server.js
You can take a look at the GitHub repos to get ideas, but I'm going to explain the main differences.
The server stuff (with its package.json) is in the root directory (this alone could solve your problem).
As per the other answers, you need to add two lines to your express app to serve the built client as static files.
I suggest using the proxy feature rather than the cors package.
In your express app:
// At the beginning, with the other middlewares
app.use(express.static(path.join(__dirname, "client", "build")));
// Last, to serve your client home page for any unmatched route instead of a 404
app.get("*", (req, res) => res.sendFile(path.join(__dirname, "client", "build", "index.html")));
I don't know what kind of data you are going to handle, but CORS is a protection you should never disable (via the cors package) if you can avoid it. Moreover, once you are able to solve the reported problem, I'm afraid the following part from the tutorial you used will not work, as it will try to fetch the API from your users' own localhost:9000.
callAPI() {
fetch("http://localhost:9000/testAPI")
.then(res => res.text())
.then(res => this.setState({ apiResponse: res }));
}
To both keep CORS enabled and solve the localhost fetching problem, you should use the proxy feature. In your client package.json file just add:
"proxy": "http://localhost:9000/",
This makes your webpack dev server proxy any calls it can't handle over to your API server. Then, after changing your client fetching function to:
callAPI() {
fetch("/testAPI")
.then(res => res.text())
.then(res => this.setState({ apiResponse: res }));
}
it will automagically work in both dev and prod environments and lets you re-enable CORS.
Hope this helps.
Well, in that tutorial the author creates the React app using create-react-app, so I don't know why the 'start' command in your second package.json is 'node server.js'. You should have "start": "react-scripts start". That won't fix your problem, but I'm not sure whether server.js is another server or something else.
Anyway, I'll try to help you. First, create-react-app creates an app that internally uses webpack-dev-server, which is great for development, but in order to deploy your app you need to run this command:
npm run build
After that you will have a folder called 'build' containing your React app. Copy that folder into your server project. Once you have done that, add the following code to tell your server that you want to serve static files:
app.use(express.static(path.join(__dirname, 'build')));
And you also need to add the route
app.get('/', function (req, res) {
res.sendFile(path.join(__dirname, 'build', 'index.html'));
});
Now you can test it locally by running only your server; the '/' path should take you to the React app. Once you know that works, you can deploy ONLY that server, with the build folder and a single package.json, to Google App Engine.
I hope this helps!
I have a full-stack app that runs great on localhost, but once it gets deployed to the web (either Heroku or Netlify) it stops working properly. My question is: what changes do I need to make so the backend works properly and keeps serving the API that updates the frontend, etc.? I've already tried changing the port on Express:
const PORT = process.env.PORT || 5000;
app.listen(PORT, "0.0.0.0", err => {
if (err) {
console.error(err)
} else {
console.log(`Running on port ${PORT}`)
}
})
Do I need to add my own .env file for the port, or does heroku do it automatically? Thanks in advance!
If you are going to use Netlify, note that it can only host static files, which means you might need to separate your backend from your frontend code-wise:
host the backend on Heroku
and your frontend on Netlify.
You also need a Procfile to tell Heroku what to do with your app if you are going to build React together with your backend on the same server. Hope this helps people who are wondering about Netlify/Heroku deployment.
If you are going to use Heroku, and assuming you have already set up your Heroku account and have the Heroku CLI installed and connected (if you are having trouble there, let me know), then you need the following:
Procfile - this file tells Heroku what script to use to run your server. Make sure you name it Procfile, with no extension.
It can include something like the following:
web: yarn heroku-start - Note that here I am using yarn as my package manager; you can easily replace it with npm if that's what you are using. I am also calling heroku-start, which is a script defined in my package.json.
Here is a sample package.json (I only included the important lines):
{
...
"scripts": {
"dev": "nodemon -w src --exec \"babel-node src --presets env,stage-0\"",
"build": "babel src -s -D -d dist --presets env,stage-0",
"start": "pm2 start dist",
"prestart": "yarn -s build",
"heroku-prestart": "yarn global add pm2 && pm2 install pm2-logrotate",
"heroku-start": "node dist",
"heroku": "yarn -s build && git add . && git commit -m \"Rebuilt dist folder\" && git push heroku mj/bug/memory-leak:master",
"lint": "eslint src",
"heroku-postbuild": "echo Skip build on Heroku"
},
"devDependencies": {
"babel-cli": "^6.26.0",
"babel-core": "^6.26.0",
"babel-eslint": "^8.0.3",
"babel-preset-env": "^1.6.1",
"babel-preset-stage-0": "^6.24.1",
"eslint": "^4.13.0"
},
"eslintConfig": {
"parserOptions": {
"ecmaVersion": 9,
"sourceType": "module"
},
"env": {
"node": true
},
"rules": {
"no-console": 0,
"no-unused-vars": [
"warn",
{
"argsIgnorePattern": "^_"
}
]
}
},
"heroku-run-build-script": true
}
I am using Babel to build my project. Please don't get overwhelmed by the number of scripts I have; some are not relevant here. You should pay attention, however, to the line "heroku-start": "node dist" under scripts - this is my script to run the application on Heroku. Yours could say something like node index.js. I am using dist because Babel builds my application to make it compatible with older ECMAScript versions while allowing me to write in a newer one; the build script is what generates my dist folder.
I have also included my devDependencies just in case you are interested.
You also need app.json - this file basically describes your application for Heroku.
Here is a sample
{
"name":"myApp",
"description":"my cool app",
"repository":"https://github.com/mycoolapp/",
"logo":"http://node-js-sample.herokuapp.com/node.svg",
"keywords":["node","express","anothertag"],
"image":"heroku/nodejs"
}
After this is done you can upload your project to Heroku and it should run fine. You can set up a hook between Heroku and your master branch on GitHub so you get automatic deployment when you push to master or merge into it.
NEXT:
I have noticed something off in your code: I wouldn't recommend using 0.0.0.0 on Heroku. Here is some explanation as to why: https://help.heroku.com/P1AVPANS/why-is-my-node-js-app-crashing-with-an-r10-error
Here is your new code:
const PORT = process.env.PORT || 5000;
app.listen(PORT, function(err) {
if (err) {
console.error(err)
} else {
console.log(`Running on port ${PORT}`)
}
})
Also, don't use arrow functions, as some browsers and Heroku builds may not handle them correctly (that's why I use Babel).
Finally, here is a good tutorial on creating Node.js apps on Heroku:
https://appdividend.com/2018/04/14/how-to-deploy-nodejs-app-to-heroku/
Good luck.
I have a few Node.js servers; they are small, and each one is stored in a separate folder, with all the folders in one root folder. Every time I want to run the servers I have to go into each one of them and type:
nodemon *name*.
This is becoming tiresome, especially as the number of servers grows. Is there any tool or script I could use to run all the servers with one command?
Basically, how can I run all the servers with one command or script?
With npm. Write this in package.json:
{
"name": "project-name",
"version": "1.0.0",
"scripts": {
"start": "nodemon server1.js | nodemon server2.js | nodemon server3.js"
}
}
Then you only need to execute npm start.
Also see this post
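Piping with `|` does start the processes in parallel, but it also chains their stdio streams together. A sketch using the `concurrently` package (not part of the original answer; the server paths are placeholders for your actual folder layout) keeps each process independent and labels its output:

```json
{
  "scripts": {
    "start": "concurrently \"nodemon server1/server.js\" \"nodemon server2/server.js\" \"nodemon server3/server.js\""
  }
}
```

With `concurrently` installed as a dev dependency, `npm start` launches all three and kills the rest if one is stopped when you pass `--kill-others`.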
PM2 is a great answer for this.
pm2 start app.js -i 4 (or -i max to use all available cores)
You also get great benefits such as automatic restarts, log aggregation and load balancing.
Use pm2.
If you use Linux, a start script can look like this:
#!/bin/bash
pm2 start << Path to User Server>>
pm2 start << Path to User Server>>
pm2 logs
You can then save the process list and manage it:
pm2 save
pm2 list
pm2 stop all
For the past two weeks, in my spare time, I have been trying to install Puppeteer on AWS Lambda, without success.
I have tried:
https://github.com/sambaiz/puppeteer-lambda-starter-kit
and
https://github.com/deathemperor/puppeteer-lambda-starter-kit
My final code is:
https://github.com/sambaiz/puppeteer-lambda-starter-kit
Replace index.js:
https://github.com/sambaiz/puppeteer-lambda-starter-kit/blob/master/src/index.js
with:
https://github.com/deathemperor/puppeteer-lambda-starter-kit/blob/master/src/index.js
Also, I'm on Windows 7, so to build the package I removed/changed a lot of things in the scripts section of package.json.
I have created the package with and without Babel and lint. I have also tried different versions of Puppeteer and Chromium.
Someone suggested pinning the Puppeteer version to 1.1.1, without success.
See (TheCat and cirdes): https://github.com/GoogleChrome/puppeteer/issues/323
I always get this error on AWS:
{
"errorMessage": "Failed to launch chrome! spawn /tmp/headless_shell ENOENT\n\n\nTROUBLESHOOTING: [...]",
"errorType": "Error",
"stackTrace": [
"",
"",
"TROUBLESHOOTING:[..]",
"",
"onClose (/var/task/node_modules/puppeteer/lib/Launcher.js:299:14)",
"ChildProcess.helper.addEventListener.error (/var/task/node_modules/puppeteer/lib/Launcher.js:290:64)",
"emitOne (events.js:116:13)",
"ChildProcess.emit (events.js:211:7)",
"Process.ChildProcess._handle.onexit (internal/child_process.js:196:12)",
"onErrorNT (internal/child_process.js:372:16)",
"_combinedTickCallback (internal/process/next_tick.js:138:11)",
"process._tickDomainCallback (internal/process/next_tick.js:218:9)"
]
}
AWS config:
I use the "Upload a file from Amazon S3" option because uploading through the UI always ends in a timeout, and the same thing happens with the CLI command.
Runtime: Node.js 8.10
Handler: index.handler
Execution role: lambda_basic_execution. I have also tried a custom role with full access to Lambda and S3, just in case.
Timeout: 30 sec
Memory: 3008 MB
I would appreciate it if someone could guide me a little.
I finally managed to deploy the sambaiz package. I also updated Chromium to the latest stable version (HeadlessChrome/68.0.3440.106) and Puppeteer to the latest version (1.7.0).
https://www.dropbox.com/s/p4t7zod2nf97cwn/sambaiz-puppeteer.zip?dl=0
If you want to build your own package and you are on Windows, you can:
Download: https://github.com/sambaiz/puppeteer-lambda-starter-kit
Replace package.json with mine:
{
"name": "puppeteer-lambda-starter-kit",
"version": "1.1.2",
"description": "Starter Kit for running Headless-Chrome by Puppeteer on AWS Lambda",
"scripts": {
"package": "npm run package-prepare",
"package-prepare": "npm run babel && copy package.json dist && cd dist && npm config set puppeteer_skip_chromium_download true -g && npm install --production",
"babel": "mkdir dist && \"./node_modules/.bin/babel\" src --out-dir dist",
"local": "npm run babel && copy node_modules dist && node dist/starter-kit/local.js",
"package-nochrome": "npm run package-prepare && cd dist && zip -rq ../package.zip ."
},
"dependencies": {
"babel": "^6.23.0",
"puppeteer": "^1.1.1",
"tar": "^4.0.1"
},
"devDependencies": {
"aws-sdk": "^2.111.0",
"babel-cli": "^6.26.0",
"babel-preset-env": "^1.6.0"
}
}
Change the node version in .babelrc to 8.10
npm install babel (if it's not already installed)
npm run package
Copy chrome/headless_shell-67.0.3361.0.tar.gz to dist
Rename dist/headless_shell-67.0.3361.0.tar.gz to headless_shell.tar.gz
Zip the content of dist and you have your package ready to deploy
I have been down this painful road too and would suggest looking at Google Cloud Functions, because it installs the npm packages listed in package.json for you, rather than requiring you to install them locally and upload the node_modules directory (which is what blows past AWS's 50 MB limit).
You can do something like:
gcloud functions deploy screenshot --runtime nodejs8 --trigger-http --memory=2048MB --timeout=60 --project=xyz --region europe-west1