This is probably a very green question, but I've been designing a React app for a while using webpack and installing various packages with npm install. Every package is for some front-end widget, such as tabs, or D3, etc. My question is: does this mean I have to make my server a Node server if and when I go to production? Could it be a Flask server, or some other type, and still use these Node packages? I know that seems like a stupid question because I'm using Node and they're called node modules, but they're all for the front end and not the back end, so I don't know whether they require a Node back end or not.
My question is: does this mean I have to make my server a Node server if and when I go to production?
Nope. You can use whatever web server you like. Webpack is going to bundle everything up as static resources, which are deployed to your server in the normal way.
In fact, you probably shouldn't be using Node.js for normal static HTTP file serving. You would have a more performant site by using something like Nginx.
npm was poorly named; the "Node" in the name made more sense at the time it was created. Web developers can use it as a package manager for front-end code as well.
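For illustration, a minimal webpack config along these lines (the entry and output paths are assumptions) emits nothing but static files:

// webpack.config.js - a minimal sketch; entry and output paths are assumptions
const path = require('path');

module.exports = {
  mode: 'production',
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js'
  }
};

Everything that lands in dist/ is plain static content, so Nginx, Apache, Flask, or even an S3 bucket can serve it; no Node process is needed at run time.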
I am making a small website for a buddy, but he only has a web server plan from Hostinger. For his website I want to use a Node.js package. Is it still possible to use this hosting service if I just point the URL to the index.html file?
Or does the JavaScript not work anymore if I use Node.js?
tl;dr: Most likely not.
If your friend's plan does not support Node.js, you will not be able to use such a package, though it is hard to say for sure without knowing exactly which plan he has.
"Webserver" here usually means static content hosting: HTML, CSS, and JS are served to the browser as-is. In that case no actual processing is done on the server, which is what Node.js is for.
You could use other browser npm packages (or isomorphic ones, which support both Node.js and browser environments), but in the case you described you won't be able to use Node-specific packages.
According to Hostinger's website, you can only use Node.js if you are on a VPS plan.
EDIT: It is worth mentioning that you could use a Node package if you run it locally to generate or preprocess the assets before putting them on a web server. But in that case the package executes on your machine rather than on the server, at build time (not run time).
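As a minimal sketch of that idea (the file names and data are hypothetical), a script like this runs on your own machine with node build.js, and only its static output ever reaches the web server:

// build.js - build-time generation; run locally, upload dist/ afterwards
const fs = require('fs');

const products = [
  { name: 'Widget', price: 9.99 },
  { name: 'Gadget', price: 19.99 }
];

// Render the page once, at build time, on the developer's machine
fs.mkdirSync('dist', { recursive: true });
const rows = products.map(p => '<li>' + p.name + ' - $' + p.price + '</li>').join('\n');
fs.writeFileSync('dist/index.html', '<ul>\n' + rows + '\n</ul>\n');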
It depends on the package and what you want to do with it.
Node.js can be used to run server-side JavaScript. If you need to run server-side JavaScript then you need a hosting service that supports it. A service that supports only static files will not suffice.
The term “a Node.js package” might refer to a package available via NPM. Some such packages require Node.js (in which case see the previous paragraph). Others will run client-side in the browser. Typically, if you are using such a package in client-side code you will use a bundler (such as Webpack or Parcel) to convert your program (which imports said package) for use in browsers.
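For example (the package choice here is only an illustration), client-side code might import an npm package like this:

// src/app.js - browser code; lodash-es is just an example dependency
import debounce from 'lodash-es/debounce';

// The bundler resolves this import; the browser never sees node_modules
const onResize = debounce(() => {
  console.log('window resized to', window.innerWidth);
}, 250);

window.addEventListener('resize', onResize);

The bundler rewrites that import into a single browser-ready file, so Node.js is needed on the developer's machine for the build but not on the host that serves the result.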
Some websites are generated programmatically at build time, and the resulting static files are uploaded to a static hosting site. Node.js can be used to do that generation. It is, for example, the usual means by which sites using Gatsby.js are built.
After searching quite a lot without finding anything about it, I'm here to ask you if there is a way to implement Git in a pure JavaScript web application.
I already know about Git.js, but it implements just some basic things, and I also wanted to build my own library to learn about Git in more depth.
What I'm not looking for is an API or a lib that could help me.
What I'm looking for is something like:
var command = 'git commit -m "Hello world"'; // also a pure git implementation
gitExecute(command);
I'm still a junior developer and maybe this could be impossible... thanks for the reply :)
What you are asking for may be difficult to do in a browser, because you need access to the file system to run git commands. What you may need to do is create a Node.js server which exposes REST endpoints that the browser code providing the GUI can call. The Node.js server can run git commands as needed and respond to the REST HTTP requests, and your browser code can then use the responses to show/update the GUI.
The disadvantage of this method is that you will need to run the Node.js server on the computer that has the repository; it will not work if the repo is not local to the server.
Another alternative is to use the REST APIs exposed by popular Git providers like GitHub.
EDIT:
Come to think of it, your use case may be a good fit for an Electron app. That would let you build a desktop app (with access to the filesystem and privileges to execute commands) using JavaScript.
For this purpose, Node.js is mandatory.
You could install git on your server machine, and then execute your commands through Node.js child processes (docs), as sketched below.
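A minimal sketch of that approach, using Node's built-in child_process module (the repository path and commit message are assumptions):

// server-side only: git runs on the machine that has the repository
const { execFile } = require('child_process');

function gitExecute(args, repoPath, callback) {
  // execFile avoids shell interpolation, which matters if the
  // commit message ever comes from user input
  execFile('git', args, { cwd: repoPath }, callback);
}

gitExecute(['commit', '-m', 'Hello world'], '/path/to/repo', (err, stdout, stderr) => {
  if (err) return console.error(stderr);
  console.log(stdout);
});

A browser front end would then call this through the kind of REST endpoint described in the answer above.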
I am writing an application with a Node.js backend and a single-page web app front-end.
I am keeping the client and server logic in the same project for simplicity and speed of development.
I am considering how best to organise the artifacts.
The Node.js part is straightforward because it doesn't need to go through a battery of pre-processors (transpilation, minification, concatenation etc).
The front-end needs to be transformed per the above, and I guess placed in a dist folder.
The current hierarchy of files is like so:
my-app
- src
  - client
  - server
Should I put the dist folder for the client artifacts under src/client?
Has anyone tried this and found problems with this approach?
I am using Heroku (a deployment system that uses git).
Committing the built artifacts for the client feels wrong, but if I want to deploy it by pushing to Heroku I think I need to commit them. Is this correct?
This question, as is, invites opinionated answers, so I'll start by saying this is by no means the only way to go, but in my opinion, it is the easiest to work with and makes the most sense.
The production client code, after pre-processing, should be located in my-app/dist or my-app/dst, which could either mean distribution or destination, depending on how you look at it. Either way, my recommendation is to commit this folder, as it saves you a lot of hassle debugging remotely.
For example, if your code works locally but not remotely, using something like the postinstall hook to generate your dist folder adds yet another suspect to check when trying to determine what the issue is with your program.
Another advantage of committing the dist folder is that it allows you to specify all the packages you use for your build process as devDependencies rather than dependencies. This is a huge plus: it makes deployment a lot faster and means less memory usage on your Heroku process.
That being said, I still recommend (as you probably already plan to do) using an automated watch task to build your dist folder for ease of development, even if you decide you don't want to use that same build process remotely and opt for committing the dist directory instead. You could add that as a custom npm command, e.g. npm run build, and have it invoke your gulp task.
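As a sketch, the relevant part of package.json might look like this (the script names and the gulp task are assumptions; JSON does not allow comments, so the explanation stays up here):

{
  "scripts": {
    "build": "gulp build",
    "watch": "gulp watch"
  },
  "dependencies": {
    "express": "^4.16.0"
  },
  "devDependencies": {
    "gulp": "^4.0.0",
    "webpack": "^4.44.0"
  }
}

Because the build tooling sits in devDependencies, a production install (npm install --production) skips it entirely, which is where the faster deploys and lower memory usage come from.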
One last thing. For those of you using templating languages like Pug or Dust or EJS instead of a framework like React or Angular, I recommend determining whether you can run any of your templates at build time to produce static HTML files that will be served in production.
If not, you should at least compile your templates (not to be confused with running them), following the recommendations provided by your particular templating language. Typically they'll suggest using their command-line utility to generate the compiled templates, so that they don't have to be compiled every time they're invoked in production. This will make your Node.js server respond to requests faster, at the expense of using more memory to cache the compiled templates.
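For instance, with Pug (compileFile is Pug's actual API, but the template path and locals here are hypothetical), you compile once at startup and cache the resulting function:

// views.js - compile templates once at startup, run them per request
const pug = require('pug');

// Compilation (parsing the .pug source into a function) happens once here...
const renderProfile = pug.compileFile('views/profile.pug');

// ...while rendering runs on every request, using the cached function
const html = renderProfile({ username: 'alice' });
console.log(html);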
If you're planning to go this route, I would edit the Node.js entry point (app.js/index.js) to serve static files, pointing the directory at dist/.
Also, you would need to tell Express to forward all non-API requests to the front end.
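A sketch of that wiring in Express (the dist/ path and the /api prefix are assumptions):

// app.js - serve the built client, fall back to index.html for the SPA
const path = require('path');
const express = require('express');
const app = express();

// API routes come first so the fallback doesn't swallow them
app.get('/api/health', (req, res) => res.json({ ok: true }));

// Static assets from the build output
app.use(express.static(path.join(__dirname, 'dist')));

// Every other request gets the single-page app's entry point
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'dist', 'index.html'));
});

app.listen(3000);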
I currently have ReactJS + NodeJS/ExpressJS + Webpack on EC2, Amazon Web Services (AWS), under one project and would like to get it deployed together at once, as one project.
What are some suggestions on how to go about doing so? I've done the research, and I've only seen tutorials on deploying one piece in particular, whether it be just ReactJS or just NodeJS. Any insights or leads would be greatly appreciated.
Will accept/upvote answer. Thank you in advance
You don't "deploy" ReactJS; it's just a static file (or files) like any other JS library in your application. You also don't deploy Webpack; Webpack should run on a developer machine (or in a CI/CD stack or build system).
As for the NodeJS part just use Elastic Beanstalk.
I do not commit builds to source control. I see that a lot, and it can make things easier, but you can also forget to rebuild (since you have to do it manually), and it adds a lot of bloat to your repo.
I believe builds should run as part of the deployment process. Assuming you are using git, you can add a hooks/post-receive script in a remote repo. When you push to that remote, the script runs; this is where I do my webpack build.
You may want to look into https://github.com/git-deploy/git-deploy for context, but I do this manually.
In my projects, on the deployment machine I run git init --bare /var/git/myproject.git, then add the script at /var/git/myproject.git/hooks/post-receive. The hook checks the code out into /var/www/myproject and runs the build, which fills in /var/www/myproject/build. Then it removes the old /var/www/myproject/public and renames build to public. And done.
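For context, a hook along these lines would do that checkout/build/swap. It's written in Node here just to stay in one language (a shell script is more common), and the npm build command is an assumption:

#!/usr/bin/env node
// /var/git/myproject.git/hooks/post-receive - must be executable (chmod +x)
const { execSync } = require('child_process');
const run = (cmd) => execSync(cmd, { stdio: 'inherit' });

// Check the pushed code out into the working directory
run('git --work-tree=/var/www/myproject --git-dir=/var/git/myproject.git checkout -f');

// Install dependencies and run the webpack build (fills /var/www/myproject/build)
run('cd /var/www/myproject && npm install && npm run build');

// Swap the fresh build into place
run('rm -rf /var/www/myproject/public && mv /var/www/myproject/build /var/www/myproject/public');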
I'm coming from more of an operations background, and I would say that if your goal includes keeping the site up as much as possible, use Packer to generate AMIs and CloudFormation to build an Application Load Balancer (the newer, cheaper sibling of the ELB) in front of an Auto Scaling group that keeps the EC2 instances up and running.
I'm currently working on a large scale project doing exactly what you describe. First off, there are so many different ways to do this, so what you really need is some general guidelines to get started, then we can dig a little deeper into details when some initial decisions are made, if you'd like. If you've already got the app deploying and running in two separate steps, but are just looking to combine those, I can definitely help. I'd just need to know how you're currently building/deploying. If you're just getting started on building your pipeline and need to set up the process from scratch, then read on:
First off, you'll want to set up some kind of build server that will install your npm dependencies and run your webpack build. Most likely you'll want a separate webpack config just for your build server; this gives you a build optimized for production or QA/staging environments. This config should split out vendor files that you won't update all the time, pull out separate CSS files with the extract-text plugin, and uglify the files (a sketch of such a config follows below). If you have an isomorphic React app, or are using ES6 features not supported in your version of Node, you'll need a webpack build for your server code as well. This is really different from the hot-reloading build you'll want on your local machine while you're actually coding the app. I'll be happy to show examples of our webpack config files for both local development and our CI build if you'd like. You may also need a build.sh or makefile to do something with the compiled .js files that your webpack build creates, but that will depend on your deployment, which I'll cover later.

You can run your production build locally while you're getting the config just right, and fire up the app from those files to test that it's all working. Additionally, since you'll likely want to automate all of this, you probably want to run your tests and linting right before you build; we run ESLint and Mocha/jsdom to run our Enzyme/expect specs as part of our build.

Once that's all working nicely, you'll most likely want a build server that can run your builds automatically. My team uses Jenkins for this, which is a little more work to set up but is free (aside from the EC2 box we run it on). There are also a ton of subscription-based build/continuous-integration servers, such as Travis and CodeShip. There are plenty of articles on the pros and cons of these products and how to set them up. The bottom line is that you'll want a build server that can pull down your code from source control, install npm deps, lint, test, and build your app. If anything fails, it should fail your build; if the build succeeds, you'll have some sort of archive that you'll later deploy to an EC2 instance.

In our shop we use a build.sh file to tarball up our build archive (basically a folder with our Node server files as well as our minified client files, CSS files, and any fonts or images needed to run the app) and upload it to an S3 bucket that we deploy from. We like this fairly old-school method because the tarball never changes, so we get ultra-reliable rollbacks.
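Here is a rough sketch of that production-style config. Plugin names reflect current webpack (the extract-text plugin mentioned above has since been succeeded by mini-css-extract-plugin), and the entry/output paths are assumptions:

// webpack.prod.js - vendor splitting, CSS extraction, minification
const path = require('path');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');

module.exports = {
  mode: 'production', // enables minification ("uglify") out of the box
  entry: './src/client/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[name].[contenthash].js'
  },
  optimization: {
    splitChunks: {
      cacheGroups: {
        // Pull rarely-changing vendor code into its own long-cached bundle
        vendor: {
          test: /[\\/]node_modules[\\/]/,
          name: 'vendors',
          chunks: 'all'
        }
      }
    }
  },
  module: {
    rules: [
      // Write CSS to separate files instead of inlining it in the JS bundle
      { test: /\.css$/, use: [MiniCssExtractPlugin.loader, 'css-loader'] }
    ]
  },
  plugins: [new MiniCssExtractPlugin()]
};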
What you do with your build archive will depend on how you want to do deployments. We have a custom deployment system using Puppet, but there are plenty of products that do this, such as Elastic Beanstalk, that would be much easier to set up. You'll want some kind of process supervisor to actually run your Node app, so unless you have a dev-ops team that wants to build custom pipelines, using AWS's built-in features will probably be the easiest way to get started.

As usual, there are many ways to do this, but the basic principle is that you need something to download your build archive and run/supervise your Node process. You may also want to be able to create and configure EC2 boxes on the fly (Puppet, Chef, etc.), or even use containers (Docker), which let you move complete stacks around as single units. Using automation to create and configure servers is crucial if you need to scale your app, but it is complicated and may not be necessary for smaller projects. This is definitely an area where you can start simply and add complexity later on, as long as you have good long-term goals and take the necessary steps to prepare for future complexity.
All of this can get you pretty far into the weeds, so it's best to find the simplest thing that will serve your needs as you get started, and then add complexity as real-life situations demand. I'll be happy to elaborate on any of these details if you provide a little more context about how big and how well-funded a project you're working on. If it's a little side project to learn the tech, I'd have very different advice than if you're trying to build an app that'll have a lot of traffic and/or complex features.
This could get 100 different answers, and they could all end up being good ideas. First, you mention React + Node.js; keep in mind that these solve different tasks. React is your front end and is served out as static files. Node.js is focused on the server side and would be the code that serves data. They can easily work together; you might use Express (on Node.js) as the web server that serves the HTML/React pages.
Unfortunately, you mentioned webpack, so you are going to have to "build" your application with something: webpack, gulp, grunt, etc. This is where source control and build servers are great, but if you're new to it, it might be more complex than you need.
If you just have basic EC2 images as web servers, and only one or two of them, then the biggest hurdle is simply pushing up your code. Something like https://deploybot.com/ could work, as it can push your git repo down to multiple hosts via FTP, etc. If you want to get a bit fancier, you could look at something like Jenkins or one of the other options.
Docker is a great choice, and if you are going to be dealing with multiple developers, server environments, and deployments, it's worth the time. Otherwise, keep it simple and just get your code onto the EC2 instance ;).
I am very new to Node.js (which I'm assuming this is; I'm so new that I'm not really understanding what's going on here). I'm working with a client library for a system called RJMetrics. I'm basically tying their API in with the Volusion API in order to import data from the Volusion site into their system. The code for that all makes sense, but I'm not understanding how to install and use it.
Their documentation is here:
And I need to use the JavaScript library because I'm working with Volusion, which is on a Windows server, and there is no ASP/C# option here. It says "The RJMetrics Javascript client library is available via npm:" followed by terminal code. After some research, it appeared that this uses Node.js, so I installed that on my computer and ran npm install rjmetrics in the Terminal, which succeeded. I was assuming, though, that I must have to log into their server and run the code in order to get it to work.
Does this require me to SSH into the server? Am I way off base, and is there a way I can just include some JS files in my page? I looked at their GitHub too, and all of the main files use the require() function, which I gather is a Node.js function?
Apologies if I'm way off, I'm into this up to my neck and just trying to sort it all out now.
This part of the documentation (to which you referred) is just plain old JavaScript, though npm is the Node package manager. So if you want, it looks like you can just run this .js script in a web browser like any other.
var rjmetrics = require("rjmetrics");
var client = rjmetrics.Client(api_key, client_id);
// do stuff with client
If you wanted to do it in Node, you would create a .js file on your machine with their API code inside it, doing whatever you want. Then in the terminal you run the script with node myfile.js. A local web server setup is all you need to create and test this.