How to invoke an npm script from a parent folder? - javascript

I created an Express-based backend (in folder A) and a related React-based front-end project (in folder B). I've now put B inside A for the following benefits:
I don't need to copy files from the front-end build to the server project anymore, because A/server.js can serve files from A/B/build directly.
No need to worry about cross-origin request errors.
They look like one project and are easier to manage on GitHub.
But can I run npm run buildjs from folder A so that it actually runs npm run build in folder B? I guess it has much to do with npm run-script usage.

This can be done using --prefix <path>. From folder A:
npm run --prefix ./B build
You could add the following to A/package.json:
{
  ...
  "scripts": {
    "buildjs": "npm run --prefix ./B build"
  },
  ...
}
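As an aside (not part of the original answer): on npm 7 and later, workspaces cover this exact parent/child layout, so A/package.json could instead declare B as a workspace:

```json
{
  "workspaces": ["B"],
  "scripts": {
    "buildjs": "npm run build --workspace=B"
  }
}
```

With that in place, npm install from A also installs B's dependencies, and npm run buildjs (or npm run build --workspace=B directly) runs B's build script.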

Copying isn't really a problem if you make it automatic; introducing a deployment step is not a bad thing.
There won't be any cross-origin problem either way, because you are serving the front-end as static files.
People first: whether or not to split the projects depends on the developers.

Related

NPM post-install, CA certificates, and Windows

TL;DR
Is it possible, in a Windows environment, to set NODE_EXTRA_CA_CERTS in a way that works with NPM packages' post-install scripts, without making system changes, configuration file changes, or changes that require admin-level permissions?
Details
This has been driving me nuts.
I added image optimization to our Webpack build process via imagemin, imagemin-webpack, and the various imagemin format-specific plugins.
All of the imagemin plugins have one thing in common -- during post-install, they:
a. Attempt to download a pre-built EXE.
b. If (a) fails, they attempt to build the EXE from source.
I.T. snoops on our traffic, so (a) fails due to the "self-signed certificate in chain" error when attempting to fetch the remote EXE. (b) fails because our studio is Windows-based, and we don't have all the various build tools installed to make that happen. It's also not reasonable to have them installed on every machine where npm install might be run.
I did some digging (thanks S.O.), found our company's CA certificate, added it to the repo, and was able to get (a) working with the following commands:
> SET NODE_EXTRA_CA_CERTS=%cd%\ca.cert
> npm install
I thought I was home free at that point -- all I'd have to do is add this npm script to our package.json:
{
  "preinstall": "SET NODE_EXTRA_CA_CERTS=%cd%\\ca.cert"
}
But that doesn't work. I'm guessing it's because there's a separate process involved, and the environment variable doesn't carry over to the other process.
Note that this does work, but is absolutely awful:
{
  "preinstall": "SET NODE_EXTRA_CA_CERTS=%cd%\\ca.cert&& npm install imagemin-gifsicle imagemin-mozjpeg imagemin-optipng imagemin-svgo"
}
Is there a way to set this environment variable automatically in a way that works with NPM packages' post-install scripts?
I'd like this to be transparent to other team members so that they can just continue to npm install without any additional steps, system changes, or configuration file changes if at all possible. Some team members are not developers, so while they're used to running npm install, I don't want to introduce any additional complications. I super appreciate any help in advance!
Have you tried
npm config set cafile "<path to your certificate file>"
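If the cafile route works for your proxy setup, it can also be committed per-project, so teammates keep running a plain npm install: npm reads a .npmrc file at the repo root (a sketch; assumes the certificate is committed as ca.cert next to package.json):

```ini
cafile=./ca.cert
```

Note that this covers npm's own requests; a post-install script that downloads with its own HTTP client may still need NODE_EXTRA_CA_CERTS.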

JS serving and REST API from machine with changing IP address

I have a React app that talks to a REST server via axios. The REST server (Flask) is on the same machine that serves the build of the JS project. The project being served is the output of npm run build; the serve package is then used to deploy this build. The web interface is then viewed from a remote machine with a different IP.
The issue I have encountered is that the IP of the machine serving the site and the REST API may change. How do I go about changing the IP that axios calls dynamically? At the moment I have a script that searches for the IP string in the JS build and replaces it with the machine's current IP.
Utilities like ip have only been returning localhost. I guess I need to find a way to get the IP of whoever is serving the script?
You can make use of environment variables to solve this. One popular way is to use cross-env. Your package.json file will have a build command in its scripts section; modify it to pass the required configs as environment variables, and use them in the code where required.
Example:
{
  "scripts": {
    "build:prod": "cross-env API_URL=http://myserverip.com NODE_ENV=production webpack --config build/webpack.config.js",
    "build:dev": "cross-env API_URL=http://localhost:8000 NODE_ENV=development webpack --config build/webpack.config.js"
  }
}
Then you can use different commands to build them. To build prod run npm run build:prod. To build dev run npm run build:dev.
In your code, you can use process.env.API_URL (in place of the hardcoded IP address/hostname), which will have a different value in each build. To let webpack replace these env variables, use the webpack DefinePlugin:
new webpack.DefinePlugin({
  'process.env': {
    NODE_ENV: JSON.stringify(process.env.NODE_ENV),
    API_URL: JSON.stringify(process.env.API_URL),
  }
})
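For instance (a sketch; buildUserUrl is a made-up helper, not part of axios), the request URL can be assembled from the injected value:

```javascript
// Build the request URL from the env-injected base URL.
// DefinePlugin textually substitutes process.env.API_URL at bundle time,
// so each build bakes in its own value.
function buildUserUrl(baseUrl, id) {
  return `${baseUrl}/user?ID=${encodeURIComponent(id)}`;
}

// In app code: axios.get(buildUserUrl(process.env.API_URL, 12345))
console.log(buildUserUrl('http://localhost:8000', 12345));
// → http://localhost:8000/user?ID=12345
```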
If the site and the REST API are served from the same server, you could just omit the IP address entirely, so instead of
axios.get('http://123.4.567.89/user?ID=12345')
you can just
axios.get('/user?ID=12345')
Other than that, using DNS is usually a better way:
axios.get('http://my.domain/user?ID=12345')
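The relative-path approach works because the browser resolves the path against whatever origin served the page, so a changing IP needs no rebuild. Node's WHATWG URL class shows the same resolution (the IPs here are arbitrary examples):

```javascript
// A relative path picks up whichever host served the page.
const path = '/user?ID=12345';
console.log(new URL(path, 'http://192.168.1.10').href); // http://192.168.1.10/user?ID=12345
console.log(new URL(path, 'http://192.168.1.99').href); // http://192.168.1.99/user?ID=12345
```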

create-react-app loading slowly on my production server

I tried searching for this question on Google, but nothing helped.
I have a create-react-app demo.
Running it on localhost (npm start) works well even with the browser cache disabled. For example:
localhost
I then ran npm run build locally, copied the output with scp -r build server:/home/deploy/app/, and served it with nginx.
Opening it in the browser, the initial load and refreshes are slow, also with the cache disabled. For example:
server
You can see that loading a 500 KB JS file takes 15 seconds on the server.
I guess this is related to bandwidth; my server's bandwidth is 1 Mb/s, but I'm not sure.
P.S. I'm sorry I forgot to declare the specific environment, but I did do these steps.
If you're running this in production, I wouldn't suggest running the web application with npm start on the server.
A much better solution is to run npm run-script build, which gives output like this:
Creating an optimized production build...
Compiled successfully.
File sizes after gzip:
48.12 KB build/static/js/main.9fdf0e48.js
288 B build/static/css/main.cacbacc7.css
The project was built assuming it is hosted at the server root.
To override this, specify the homepage in your package.json.
For example, add this to build it for GitHub Pages:
"homepage" : "http://myname.github.io/myapp",
The build folder is ready to be deployed.
You may serve it with a static server:
sudo npm install -g serve
serve -s build
You could either do serve -s build or set up nginx or Apache to serve the files (it's just HTML, CSS, and JS). You could also use GitHub Pages to host it.
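If nginx is serving the build, as in the question, it's also worth confirming compression is enabled: the sizes above are reported "after gzip" (48 KB), so a 500 KB transfer suggests gzip may be off. A minimal sketch for the nginx http or server block:

```nginx
# compress text assets on the fly; html is compressed by default when gzip is on
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
```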
Are you missing the build step?
If yes, try npm run build or yarn build. It will output an optimized version of your app to ./build directory. You can then host it on your server with nginx or other server setup.
More info here: official docs
When you run npm start, npm starts the development version of your app.
It includes debug code, error checking, and live refresh.

Sane approach to build and deploy typescript/node.js application

I'm working on a node.js app written in TypeScript, which means it needs to be compiled to JS before running. Coming from a Java/JVM background, where you ship a prebuilt package to the server and it gets run there, I'm a bit wary of a deployment flow where you push code to git and it's built/compiled on the server first and then run.
I don't like it for two main reasons:
dev dependencies need to be installed on server
deployment depends on external resources availability (npm etc).
I found NAR https://github.com/h2non/nar which is more or less what I wanted, but it has some drawbacks (it doesn't work with some deps that have native extensions).
My question is: is there any other "sane" way of doing node.js deployment than this risky combination of npm install and tsc on the server? Or should I let that sink in and do it that way?
To be honest, I don't believe there's a fundamentally more sane or reliable option than that.
What you can do (though other approaches are perfectly valid) is build your project locally (or on a CI service), and only deploy this built version once you consider it valid (tests, etc.).
This way, if something bad happens, like npm failing or a compilation error, you don't deploy anything, and you have time to resolve the situation.
For example, I used to have a gulp task (but it can be anything else: Grunt, a simple npm script...) that cloned a production repository and built the project into that directory.
That way, I could check that the build was valid. If it was, I made a new commit and pushed it to the production repo, which is served however you need (on a Heroku instance, for example).
Pros
Clear separation of dev and non-dev dependencies
Deployment only when you know that the build is valid
No built files on source control on the development repository
No "live" dependency on external tasks like npm install or tsc build
Cons
You have two separated git repositories (one with the source code, one with the built version of your project)
Production process is a little bit heavier than simply committing to master
(From comment) Doesn't properly handle the case of npm package that relies on native extensions that have to be (re)built
is there any other "sane" way of doing deployment node.js deployment than this risky combination of npm install and tsc on server
package.json + npm install + tsc is the way to do it. Nothing risky about it.
More
Just use an npm script : https://github.com/TypeStrong/ntypescript#npm-scripts
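For instance (a sketch; file and path names are assumptions), package.json can chain the compile into the usual lifecycle:

```json
{
  "scripts": {
    "build": "tsc -p tsconfig.json",
    "prestart": "npm run build",
    "start": "node dist/index.js"
  }
}
```

npm start then compiles first via the prestart hook, so npm install followed by npm start on the server is the whole deployment.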

Run npm build for play framework within sbt

I'm not very familiar with sbt/Play configuration. I'm using Play 2.3.8 to serve my JavaScript application. The project contains:
.enablePlugins(SbtWeb)
.enablePlugins(play.PlayScala)
.settings(
  ...
  libraryDependencies ++= WebDependancies :+ evobufAkka,
  pipelineStages in Assets := Seq(closure, digest),
  ...
  // Some closure compiler settings
)
The project uses the Closure Compiler to minify the code, etc., but I would like to change that: stop using the Closure Compiler and instead just use plain npm packages. I'm aware that sbt can run shell tasks. The reason for all this is to separate the server from the frontend, so that all frontend-related tasks (less, uglify, fingerprinting, etc.) are done by a JavaScript tool like node. I've read about sbt-web, but I'd like to avoid it if possible. What I have in mind is:
1. start sbt, open my project
2. run compile:
- sbt would run my npm tasks, which produce a build.js file that can then be served by the Play framework from the /public directory or wherever.
3. I'd like to have the option of a separate process for unit tests, if possible.
In terms of npm setup, I was thinking about putting package.json in my project/public folder, unless it's better to put it in project/app/assets.
Is that all possible?
Update 8/8/2015
I did some research and found out about external processes. Based on an example, I created:
lazy val npmBuildTask = taskKey[Unit]("Execute the npm build command to build the ui")

npmBuildTask := {
  // Run npm inside public/; "cd public/ && npm install" as a single Process
  // string doesn't work, because Process doesn't invoke a shell.
  Process("npm install", file("public")).!
}

but I'm not sure how to add this task to the compile process.
You could make the compile task depend on your npmBuildTask task:
compile <<= (compile in Compile) dependsOn npmBuildTask
