I have developed a Node.js application for a client, but obviously asking the client to install Node.js and npm packages on their system does not look good.
npm packages do not work if we simply copy and paste them; we have to run npm install <package name> for each one, which is also an issue.
Assuming that the client does not have internet access on their system,
I want a single .exe or some other file which, on click, can work without any need to install any npm packages or Node.js.
Is there something which can make this happen? I am sure there must be something I am missing. I will be grateful if someone can help me with this.
Thanks in advance
UPDATE
I am not getting the option to answer my own question, so I am updating the question itself.
How to install NodeJS project locally without internet connection?
this was really helpful; it shows how I can zip my modules and install them on some other machine, and
https://github.com/jxcore/jxcore
this can be useful for running the project as a single file.
Kinda hacky, but you could run an Electron (atom-shell) instance without showing the browser window and use the main process as a Node.js playground.
For references on building and deploying Electron apps, see the Electron documentation.
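A minimal sketch of that idea (the file name and what the script does are purely illustrative): the main process never opens a BrowserWindow, so nothing is shown on screen, and tools such as electron-packager or electron-builder can then wrap it into a self-contained executable that bundles the Node.js runtime.

// main.js - hypothetical entry point, referenced by the "main" field in package.json
const { app } = require('electron');
const fs = require('fs');

app.whenReady().then(() => {
  // No BrowserWindow is created, so the app runs headless.
  // Put the actual Node.js application logic here.
  fs.writeFileSync('output.txt', 'ran without a visible window\n');
  app.quit(); // exit once the work is done
});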
The professional way to do this is to build a locally-installable package for your client's appropriate package management system.
All major Unix-derived operating systems (and even Windows, sort of) ship with a robust package management system which was designed with this sort of scenario in mind.
Fedora / Redhat / CentOS - Building an RPM package
Ubuntu / Debian / Windows (Windows Subsystem for Linux) - Building a DPKG
macOS - pkgbuild
FreeBSD / OpenBSD - This is a bit trickier to do for a package like Node.js for a number of reasons, but there are plenty of resources online.
If you or your clients can't provide the Windows Subsystem for Linux, there are perhaps dozens of projects, from nexe to the Closure Compiler, for building a Node project as a single executable for a diverse set of platforms.
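As a rough sketch of the nexe route (the exact command-line flags vary between nexe versions, so treat the ones below as assumptions and check the project's README):

npm install -g nexe
nexe server.js --output myapp    # bundles server.js together with a Node runtime into a single executable

The Closure Compiler, by contrast, only bundles and minifies the JavaScript itself; it still needs a Node runtime on the target machine to run the result.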
If you've got the itching feeling that you have too much free time, V8 also includes tooling for targeting code emitted from CrankShaft for arbitrary instruction sets with the d8 debugger. I've heard stories of sufficiently "sophisticated" (read: unhinged) programmers wrapping this in a PE/COFF for Windows à la SBCL Common Lisp 'core dumps'.
SSH into their system and install it. Not hard at all. The exact steps depend on their operating system. That's the only proper way to do it. I do it for clients daily.
Related
I have just started my Node.js course. The lecture was recorded at the time of Node.js version 10 (on a Mac); I'm on Windows, and it is now version 16. The lecture does not show this page of the installer (the step about tools for native modules and Chocolatey):
Summary: I do not know if I want native modules, or what they are, but I do not want Chocolatey.
I have done my research, yet I still cannot find anything anywhere that clears up the following questions for me.
1. My question:
How important are these native modules? Do I need them? Or do you recommend them, and why?
2. Chocolatey:
Out of interest, perhaps you could tell me why Node.js has bundled native modules and Chocolatey together?
I have decided I do not want Chocolatey (no problem; if I decide to install the 'tools', I will go to GitHub and install them manually, as it says in the screenshot).
The reason I do not want Chocolatey is that, from my research, I do not think I need it, and I have seen that uninstalling it can potentially cause me one or two problems, so I will avoid it altogether. I mention it here on the side because maybe somebody knows a very valid reason why they are bundled together, and it will change my mind.
A big thank you to the Stack Overflow community.
Native modules need to be compiled, most often (but not exclusively) from C/C++ source, in order to function. Some folks avoid them like cancer, as they need to be compiled on installation, which can be a deployment risk. Others (like me) embrace native modules because of the performance benefits they can bring.
Note that this is not a concept unique to JavaScript or Node.js. Other languages like Ruby and Python also have "modules" (by other names) that involve compiling native code in order to function.
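For context, a native module usually ships its C/C++ sources together with a small build description that node-gyp compiles when you run npm install. A minimal, purely illustrative binding.gyp looks roughly like this (the target and file names are made up):

{
  "targets": [
    {
      "target_name": "addon",
      "sources": ["addon.cc"]
    }
  ]
}

This compile step is what requires a working compiler toolchain on the machine doing the install, which is exactly what the installer screen is offering to set up.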
As to why Node.js uses Chocolatey to manage its native toolchain, it's because Chocolatey already has packages available for the tools it needs. It doesn't make sense to maintain separate NPM packages of these tools, and relying on existing packages removes a lot of the overhead in getting a wide range of tools and utilities installed. In addition, Chocolatey can be installed system-wide or for only a specific application's use. I'm not sure which technique Node.js uses, but since it's asking, I assume it wants to use a system-wide config.
If you don't want to use Chocolatey, you'll have to manage the native toolchain on your own. If you tell it to use Chocolatey, you can manage the toolchain upgrades with the choco upgrade command.
That said, I would consider exploring Chocolatey if I were you. It makes package management so much easier on Windows. It's about as close to a standard as a third-party solution can get, in part because it builds on NuGet, and you can technically manage Chocolatey packages with PowerShell without installing Chocolatey (though I don't recommend this; just use Chocolatey).
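For example, once Chocolatey is managing the toolchain, keeping everything up to date is a single command (the -y flag just skips the confirmation prompts; this assumes Chocolatey itself is already installed):

choco upgrade all -y

You can swap all for an individual package name to upgrade just that package.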
Somebody raised this question with the Node.js developers through a GitHub issue: https://github.com/nodejs/node/issues/30242.
tl;dr: it is not essential for Node.js.
Currently, I have a few (unpublished) Python packages in local use, which I install (for development purposes) with a Bash script on Linux into an activated (otherwise "empty") virtual environment in the following manner:
cd /root/of/python/package                         # root of the Python package, next to setup.py
pip install -r requirements_python.txt             # Python dependencies; includes "nodeenv"
nodeenv -p                                         # pulls node.js and integrates it into my virtual environment
npm i -g npm                                       # update npm itself ...
cat requirements_node.txt | xargs npm install -g   # install the listed Node.js dependencies into the virtual environment
pip install -e .                                   # finally, install the Python package itself (editable mode)
The background is that I have a number of node.js dependencies, JavaScript CLI scripts, which are called by my Python code.
Pros of current approach:
dead simple: relies on nodeenv for all required plumbing
can theoretically be implemented within setup.py with subprocess.Popen etc
Cons of current approach:
Unix-like platforms with Bash only
"hard" to distribute my packages, say on PyPI
requires a virtual environment
has potentially "interesting" side effects if a package is installed globally
potentially interferes with a pre-existing configuration / "deployment" of nodeenv in the current virtual environment
What is the canonical (if there is any), or just a sane, potentially cross-platform, approach to defining Node.js dependencies for a Python package, making it publishable?
Why is this question even relevant? JavaScript is not just for web development (any more). There are also interesting (relevant) data processing tools out there. If you do not want to miss / ignore them, well, welcome to this particular form of hell.
I recently came across calmjs, which appears to be what I am looking for. I have not experimented much with it yet and it also appears to be a relatively young project.
I started an issue there asking a similar question.
EDIT (1): Interesting resource: JavaScript versus Research Computing - A Brief Guide for Those Who Regret That This Has Become Necessary
EDIT (2): I started an issue against nodeenv, asking how I could make a project depend on it.
(Disclaimer: I am the author of calmjs)
After mulling over this particular issue for another few days, I think this question actually encapsulates multiple problems which may or may not be orthogonal to each other, depending on one's point of view. Among them (the list is not exhaustive):
1. How can a developer ensure that they have all the information required to install the package when they are given one?
2. How does a project ensure that the ground it is standing on is solid (i.e. that it has all the dependencies it requires)?
3. How easy is it for the user to install the given project?
4. How easy is it to reproduce a given build?
For a single-language, single-platform project, the first question posed is trivially answered: just use whatever package management solution is implemented for that language (i.e. Python - PyPI, Node.js - npm). The other questions generally fall into place.
For a multi-language, multi-platform project, this is where it completely falls apart. Long story short, this is why projects generally have multiple sets of instructions for whatever version of Windows, Mac or Linux (of various mainstream distros) for the installation of their software, especially in binary form, to address the third question so that it's easy for the end user (which usually ends up being doable, but not necessarily easy).
For developers and system integrators, who are definitely more interested in questions 2 and 4, they likely want an automation script for whatever platform they are on. This is kind of what you already have, except it only works on Linux, or wherever Bash is available. Now this also raises the question: how does one ensure Bash is available on the system? Some system administrators may prefer some other form of shell, so we are again back to the same problem, but instead of asking if Node.js is there, we have to ask if Bash is there. So this problem is basically unsolvable unless a line is drawn.
The first question hasn't really been mentioned yet, and I am going to make this fun by asking it in this manner: given a package from npm that requires a Python package, how does one specify a dependency on PyPI? It turns out such a project exists: nopy. I have not used it before, but at a casual glance it provides a specific way to record dependency information in the package.json file, which is the standard method for Node.js packages to convey information about themselves. Do note that it has a non-standard way of managing Python packages; however, given that it uses whatever Python is available, it will probably do the right thing if a Python virtual environment is activated. Doing it this way also means that dependents of a Node.js package may have a way to figure out the required Python dependencies declared by their Node.js dependencies, but note that without something else on top of it (or some other ground/line), there is no way to assert from within the environment that it is guaranteed to do what needs to be done.
Naturally, coming back to Python, this question has been asked before (but not necessarily in a way that is specifically useful to you, as the contexts are all different):
javascript dependencies in python project
How to install npm package from python script?
Django, recommended way to declare and solve JavaScript dependencies in blocks
pip: dependency on javascript library
Anyway, calmjs only solves problem 1 (i.e. it lets developers figure out the Node.js packages they need from a given Python package), and to a lesser extent assists with problem 4, but without the guarantees of 2 and 3 it is not exactly solved.
From the Python dependency management point of view, there is no way to guarantee that the required external tools are available until their use is attempted (it will either work or not work; likewise from Node.js, as explained earlier, and thank you for your question on the issue tracker, by the way). If this particular guarantee is required, many system integrators would make use of their favourite operating-system-level package manager (i.e. dpkg/apt, rpm/yum, or whatever else on Linux, Homebrew on OS X, perhaps Chocolatey on Windows), but again this requires further dependencies to install. Hence, if multiple platforms are to be supported, there is no general solution unless one were to reduce the scope, or have some kind of standard continuous integration that generates working installation images that one would then deploy onto whatever virtualisation service the organisation uses (just an example).
Without all the specific baselines, this question is very difficult to provide a satisfactory answer for all parties involved.
What you describe is certainly not the simplest problem. For Python alone, companies came up with all kinds of packaging methods (e.g. Twitter's pex, Spotify's dh-virtualenv, or even grocker, which shifts Python deployments into container space) - (plug: I did a presentation at PyCon Balkan '18 on Packaging Python applications).
That said, one very hacky way I could think of would be:
Find a way to compile your Node apps into a single binary. There is pkg (a blogpost about it), which
[...] enables you to package your Node.js project into an executable that can be run even on devices without Node.js installed.
This way the Node tools would be taken care of.
Next, take these binary blobs and add them (somehow) as scripts to your Python package, so that they get distributed along with your package and end up somewhere your actual Python code can pick them up and execute them.
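A rough sketch of the pkg step (the entry file, target triple and output path are illustrative; check the pkg README for the options your version supports):

npm install -g pkg
pkg cli.js --targets node16-linux-x64 --output dist/mytool    # one self-contained binary for the given target

The resulting dist/mytool runs without Node.js installed, which is what makes it suitable for shipping inside a Python package.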
Upsides:
Users do not need any Node.js on their machine (which is probably expected when you just want to pip install something).
Your package gets more self-contained by including binaries.
Downsides:
Your Python package will include binaries, which is less common.
Containing binaries means that you will have to prepare versions for all platforms. Not impossible, but more work.
You will have to expand your package creation pipeline (Makefile, setup.py, or other) a bit to make this simple and repeatable.
Your package gets significantly larger (which is probably the least of the problems today).
I know this might be a silly question to some of you, but I am a beginner in React and I wish to create a really simple application.
I found a sample in which every component is saved in a separate .js file, which looks very good for modularity and re-use.
The only thing I need to take care of now is using export/require. However, I don't want to be dependent on Node.js. I just need a simple HTML/JS application that can run on any cheap web server.
I read somewhere that I can use "Browserify", but after looking at it, it seems to be a Node library.
Is there any library that I can use from a web page (via a CDN, for example) that allows me to use require? If not, does that mean I cannot separate React components into different files?
However, I don't want to be dependent on Node.js.
Use Node.js. It is how React applications are designed to be built.
I just need a simple HTML/JS application that can run on any cheap web server
NodeJS is only required at build time. You run it on your development workstation. The output is static files that you can upload to any webserver.
(NB: React applications are often designed to make HTTP requests to get dynamic data. Some tutorials cover using Node to build a server to listen for and respond to those requests. Make sure you don't conflate the program that transpiles the React application to ES5 (which runs at build time) with the program that runs a webserver (at runtime), even if both are written using Node.)
If you don't want to use Node as a runtime, you can use webpack: https://webpack.github.io/
You will generate a static/bundle.js. If you want to learn more about it, I suggest http://survivejs.com/
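A minimal sketch of such a configuration (file names and paths are illustrative, and webpack itself still has to be installed with npm, e.g. npm install --save-dev webpack):

// webpack.config.js
const path = require('path');

module.exports = {
  entry: './src/index.js',            // the root component file that require()s the others
  output: {
    path: path.resolve(__dirname, 'static'),
    filename: 'bundle.js',            // the single file you include with a <script> tag
  },
};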
What you need is a build step that packs the separate files into one or more packages that can be loaded in the browser.
Browserify can be used to do that, but WebPack is also popular.
These tools require some configuration, so I think that the best way to start is with a tool like create-react-app which is easy to install and has commands for developing as well as packing your app for deployment.
It uses webpack internally (along with some other tools) but saves you the hassle of configuring it yourself. If at any time you need advanced configurations beyond what create-react-app provides, it has an 'eject' command that exposes the raw configuration files.
Getting started is simple (taken from their readme):
npm install -g create-react-app
create-react-app my-app
cd my-app/
npm start
In the past I made some websites with Notepad, for example: you create a folder tree, put a .htm file in it, and add some folders with stuff like JavaScript and CSS.
Maybe I don't understand what npm really brings, because it seems to do the same thing but automated. Is it just that?
For example, why not just unpack a framework (e.g. Bootstrap or Kube) without using npm and so have folders ready to use?
Please help me understand, because I'm close to going crazy with all this stuff.
npm is a package manager for Node.js with hundreds of thousands of packages. Although it does create some of your directory structure/organization, this is not the main purpose.
The main goal, as you touched upon, is automated dependency and package management. This means that you can specify all of your project's dependencies inside your package.json file, then any time you (or anyone else) needs to get started with your project they can just run npm install and immediately have all of the dependencies installed. On top of this, it is also possible to specify what versions your project depends upon to prevent updates from breaking your project.
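As a purely illustrative sketch (the package names and versions are made up for the example), a minimal package.json recording two dependencies might look like this:

{
  "name": "my-project",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.18.0",
    "lodash": "^4.17.21"
  }
}

Anyone who gets a copy of the project then just runs npm install and receives those packages (within the declared version ranges) into node_modules.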
It is definitely possible to manually download your libraries, copy them into the correct directories, and use them that way. However, as your project (and list of dependencies) grows, this will quickly become time-consuming and messy. It also makes collaborating and sharing your project that much more difficult.
Hopefully this makes it more clear what the purpose of npm is. As a Javascript developer (both client-side and server-side), npm is an indispensable tool in my workflow.
NPM basically is the package manager for Node. It helps with installing various packages and resolving their dependencies, which greatly helps with your Node development. NPM helps you install the modules you need for your web development, rather than giving you a whole bunch of features you might never need.
NPM is the Node Package Manager, and it is used for two things:
it is an online repository for the publishing of open-source Node.js projects;
it is a command-line utility to install Node.js packages and do version management and dependency management of Node.js packages.
NPM is a Node package manager. It is basically used for managing the various server-side dependencies of a project.
We can manage our server-side dependencies manually as well, but once a project's dependencies grow they become difficult to install and manage.
By using NPM this becomes easy; we just need to install NPM once and it takes care of all the dependencies.
npm is Node's package manager. It's a repository of hundreds of thousands of useful pieces of code that you may want to integrate with your Node project.
npm also has a command line tool that lets us easily install, manage and run projects.
Use npm to ...
Adapt packages of code for your apps, or incorporate packages as they are.
Download standalone tools you can use right away.
Run packages without downloading using npx.
Share code with any npm user, anywhere.
Restrict code to specific developers.
Create Orgs (organizations) to coordinate package maintenance, coding, and developers.
Form virtual teams by using Orgs.
Manage multiple versions of code and code dependencies.
Update applications easily when underlying code is updated.
Discover multiple ways to solve the same puzzle.
Find other developers who are working on similar problems and projects.
READ MORE here
It stands for Node Package Manager
I'm building a web application using a MEAN stack: MongoDB, Express, Angular, and Node.js, based on Daftmonk's angular-fullstack Yeoman generator.
Because most of my work is Java, I'm using IntelliJ IDEA; however, I'd like optimal introspection and workflow for this JavaScript module.
In order to achieve the most possible introspection, and the least possible confusion, what plugins and project configuration should I use?
Here's the best I've been able to do so far.
There are some crucial IntelliJ plugins to install:
.gitignore support
AngularJS
Base64 for IDEA and Storm
BashSupport
Bootstrap
CSS Support
Database Support
ddescriber for jasmine
Git Integration
GitHub
HAML
Heroku integration
HTML Tools
Jade
JavaScript Debugger
JavaScript Intention Power Pack
JavaScript Support
JS Toolbox
JUnit
Karma
LESS CSS Compiler
LESS support
Markdown
Mongo Plugin
NodeJS
Require.js plugin
REST Client
Spy-js
SvgViewer 2
Terminal
W3C Validators
YAML
As a peace offering to the mighty IntelliJ, use Java as project SDK:
I prefer to configure four separate modules, to help separate back-end vs. front-end JavaScript dependencies:
Add the bower_components library to the client module, and the node_modules library to the server module:
And be sure to enable JavaScript libraries in the editor.
Per best practices, we do not commit the local IntelliJ IDEA configuration folder (/.idea/) to the repository, instead adding it to the .gitignore file like so:
# IntelliJ IDEA local workspace
.idea
Happy coding!