How to run a node.js application in vert.x?

I'm totally new to vert.x and I'm trying to see if it's possible to bring up an existing nodejs application in vert.x. Following the instructions at http://vertx.io/blog/vert-x3-says-hello-to-npm-users/, I used npm to install vert.x. I can run a simple hello-world app, but running our existing app is proving to be a little challenging. All the vert.x docs I've found talk about writing new apps, not porting existing code.
Oh, and the same code base needs to continue running on existing nodejs systems.
The trouble that I'm seeing is that vert.x won't load nodejs native modules correctly. For example, Vert.x choked on this require:
var fs = require("fs");
After a little searching I found the vert.x equivalent:
var fs = require("vertx-js/file_system");
Perhaps we could create a shim/abstraction layer to wrap the differences. I did a quick one for the file system API and it seems to load correctly. Writing an entire abstraction layer would be a fair bit of work, but it would solve the compatibility issue for APIs used within our own source.
The real trouble is how to intercept all the require statements in the node_modules directories. Those modules are also going to be requiring lots of other native APIs like the file-system. This seems like a problem that others may have encountered and solved already. Better not to re-invent the wheel.
I could roll my own solution. I don't really want to sed/replace the node_modules source except as a last resort. The only other alternative I have thought of is creating a directory of abstractions and inserting that directory name at the head of the NODE_PATH, roughly as sketched below. This solution seems like it might work, but as I mentioned I'm a vert.x noob, so I cannot foresee what kinds of pitfalls lie along that approach.
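Roughly what I have in mind (just a sketch; I'm assuming vertx-js callbacks pass (result, error) and that the global vertx object is available in JS verticles, so the real API would need checking case by case):
// shims/fs.js -- a hypothetical drop-in placed in a directory that sits ahead of
// the real modules on NODE_PATH
var fileSystem = vertx.fileSystem(); // `vertx` is the global injected into JS verticles

module.exports = {
  readFile: function (path, callback) {
    fileSystem.readFile(path, function (result, err) {
      // vertx-js hands back (result, error); Node callers expect (error, data)
      callback(err, result);
    });
  }
  // ...wrap whichever other fs functions our code and its dependencies actually use
};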
Does vert.x support a shim layer for running nodejs applications?

Short version TLDR:
You can't!
Long version:
Vert.x is not a Node.js replacement or runtime. Although there are quite a few similarities and common design choices, such as support for CommonJS modules and for NPM, the native libraries are not present. All I/O operations in Vert.x are done using the Vert.x API, and they do not always map to their Node counterparts.
You should also be aware that the JavaScript language version is not the same either. Node relies on V8, which nowadays is quite close to fully supporting ECMAScript 2015 (ES6 for short), while Vert.x, as a framework running on the JVM, relies on Nashorn (the JavaScript runtime from the JDK itself), which is still on ES5.
The idea of supporting NPM in Vert.x was not to emulate Node but to allow the use of many of its modules (those that do not depend on Node native modules). For that reason there is a warning in the documentation, but I guess it is not clear enough.
There are some ways to get the most out of NPM and Vert.x. One option is to go 100% ES6 and use a transpiler such as Babel to transpile back to ES5, which will run fine both with Node and with Vert.x (until the moment you use a native module).
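For example (just a sketch assuming the Babel 6-era CLI and preset names; check the docs for the Babel version you actually install):
npm install --save-dev babel-cli babel-preset-es2015
echo '{ "presets": ["es2015"] }' > .babelrc
./node_modules/.bin/babel src --out-dir lib
The files written to lib are plain ES5 and can be loaded by either runtime.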
If you must use Node, say because you already have an application built on Node and the port is not worth it (in terms of resources/time/etc.), then I'd suggest looking into the TCP event bus bridge. This bridge will allow your existing application to produce and consume messages on an existing cluster of Vert.x applications.
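A rough sketch of the Node side of that, assuming a vertx-tcp-eventbus-bridge listening on localhost:7000 with an address "news" permitted by its bridge options (both illustrative); the framing follows the bridge's length-prefixed JSON protocol:
var net = require("net");

function frame(json) {
  // bridge protocol: 4-byte big-endian length prefix followed by a JSON payload
  var body = Buffer.from(JSON.stringify(json), "utf8");
  var len = Buffer.alloc(4);
  len.writeUInt32BE(body.length, 0);
  return Buffer.concat([len, body]);
}

var socket = net.connect(7000, "localhost", function () {
  socket.write(frame({ type: "publish", address: "news", body: { msg: "hello from node" } }));
});

socket.on("data", function (chunk) {
  // replies and pushed messages arrive in the same framing
  console.log(chunk.toString("utf8", 4));
});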

Related

What are 'tools for native modules'? Help a newbie decide if they are necessary (NodeJS installation)

I have just started my NodeJS course; the lecture was recorded at the time of NodeJS version 10 (on a Mac). I'm on Windows, and it is now version 16. The lecture does not cover this page of the installation screen:
Summary: I do not know if I want native modules, or what they are - but I do not want chocolatey.
I have done my research, yet still I cannot find anything to clear up the following question for me anywhere.
1. My question:
How important are these native modules? Do I need them? Or do you recommend them, and why?
2. Chocolatey:
Out of interest, perhaps you could tell me why NodeJS has bundled together native modules and Chocolatey?
I have decided I do not want Chocolatey (no problem; if I decide to install the 'tools' then I will go onto GitHub and install them manually, as it says in the screenshot).
The reason I do not want Chocolatey: from my research I do not think I need it, and I have seen that uninstalling it can potentially cause one or two problems, so I'll avoid it altogether. But I thought I'd mention this here on the side, because maybe somebody knows a very valid reason why they are bundled together, and it will change my mind.
A big thank you to the Stack Overflow community.
Native modules need to be compiled, most often (but not exclusively) from C/C++ source, in order to function. Some folks avoid them like the plague, as they need to be compiled on installation, which can be a deployment risk. Others (like me) embrace native modules because of the performance benefits they can bring.
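For illustration (bcrypt is just one example of a package that ships a native addon):
npm install bcrypt
During that install npm invokes node-gyp to compile the addon, which needs Python and a C/C++ compiler; on Windows without the "tools for native modules" (or an equivalent manually installed toolchain) that step typically fails with a gyp/MSBuild error.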
Note that this is not a concept unique to Javascript or Node.js. Other languages like Ruby and Python also have "modules" (by other names) that involve compiling native code in order to function as well.
As to why Node.js uses Chocolatey to manage its native toolchain, it's because Chocolatey already has packages available for the tools it needs. It doesn't make sense to maintain separate NPM packages of these tools, and relying on existing packages removes a lot of the overhead in getting a broad set of tools and utilities installed. In addition, Chocolatey can be installed system-wide or for only a specific application's use. I'm not sure which technique Node.js uses, but since it's asking, I assume it wants to use a system-wide config.
If you don't want to use Chocolatey, you'll have to manage the native toolchain on your own. If you tell it to use Chocolatey, you can manage the toolchain upgrades with the choco upgrade command.
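For example, once Chocolatey manages the toolchain, keeping it current is a one-liner (illustrative; the exact package names depend on what the installer set up):
choco upgrade all -y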
That said, I would consider exploring Chocolatey if I were you. It makes package management so much easier on Windows. It's about as close to a standard as a 3rd party solution can get, in part because it builds off of nuget, and you can technically manage Chocolatey packages with PowerShell without installing Chocolatey (though I don't recommend this, just use Chocolatey).
Somebody raised this question with the Node.js developers in a GitHub issue: https://github.com/nodejs/node/issues/30242.
tl;dr It is not essential for node.js

Why do people install Node.js on their PC, if it's a back-end runtime environment?

What are the reasons people install Node.js on their PC? Can a Node.js website be developed without installing Node.js locally, and if so, what are the disadvantages?
Thanks
There can be many reasons, but some common ones:
When developing for any language, you'll need to be able to run and test your code, and running it locally makes it easier to do so.
Additionally, although you can use remote debugging, debugging is faster and easier to set up locally.
Many of the tools used by web developers are also developed in JavaScript and to run those locally, an engine capable of doing so is required. It makes sense that these tools are developed in JavaScript, not just because their primary user group will be able to understand and extend their code, but also because many of these tools need to be able to perform tasks and integrate with components that require something like a JavaScript engine to begin with.
In many situations, running a local environment on your development system will be cheaper and easier to maintain than having a separate test server; and you don't want to run your untested code on a production server.
Not directly related, but npm and Node.js have many uses beyond serving as a back-end to JavaScript-driven websites. Many people that have Node.js installed have nothing to do with web development, but have it for one of the many other reasons.
To answer your 2nd question: "can a Node.js website be developed without installing Node.js on PC?" Yes, but I can see very little reason to want to do so, unless you must. The advantage might be that you can avoid having a complicated piece of software with a large footprint and possibly some security concerns on your development machine. But the disadvantages for the average developer likely far outweigh that - more so if you just sandbox the entire development environment in case security is your main concern.
If you are writing everything from scratch, you don't need a package manager, but there is so much great stuff out there that you can use rather than writing it yourself! If you want to use it, you need a package manager that lets you download specific packages (optionally at specific versions), which may in turn use packages themselves. Your source code repository doesn't need to store a copy of every package you use, or of every package used by those packages: as long as your source code specifies which packages it uses in a way your package manager understands, all you need to keep is a manifest of the packages your code uses directly (again, optionally pinned to specific versions).
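With npm, for example, that manifest is a package.json; a minimal illustrative one:
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.17.1",
    "lodash": "4.17.21"
  }
}
Here "^4.17.1" accepts any compatible 4.x release while "4.17.21" pins an exact version; running npm install then fetches these packages (and the packages they use) into node_modules, so none of them need to live in your repository.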
"NPM" is one such package manager. "bower" is another but that uses NPM under the hood. Maven is a package manager I've seen in Java projects, and NuGet for MS projects, but for JavaScript projects it's usually NPM. And NPM uses node.

what are the most common use case scenarios for node.js?

It seems like npm is the most common use of node.js. Would you estimate that 80%+ of node.js users only use the npm functionality? What do you think are the most common use case scenarios for node.js outside of npm?
It seems like you might be confused about what Node.js and npm actually are; hopefully I can help clear this up.
Node.js is a JavaScript run-time environment. Originally, JavaScript was used mainly, if not exclusively, for client-side scripting in webpages. In 2009, Ryan Dahl released Node.js, which essentially allowed developers who were very comfortable with JavaScript (standardized as ECMAScript) to use JavaScript in a server environment. Prior to this, a developer would use other technologies (PHP, Java, etc.). Node.js allowed developers to work with JavaScript both server-side and client-side.
NPM, or Node Package Manager, is essentially an "add-on" to Node.js. It is an application that allows users to specify packages, that is, modules that someone wrote and published, to be used in their own projects.
So to try to answer your questions:
Would you estimate that 80%+ of node.js users only use the npm functionality?
No. Using solely the npm functionality would never really accomplish anything; all NPM does is download some files to your computer.
What do you think are the most common use case scenarios for node.js outside of npm?
Node is used for server-side scripting. Arguably the most popular use is to provide the back-end functionality for web apps.
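For instance, the classic minimal example of that role is a tiny HTTP server (the port is just illustrative):
var http = require("http");

http.createServer(function (req, res) {
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("Hello from the back-end\n");
}).listen(3000, function () {
  console.log("Listening on http://localhost:3000");
});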

how to create an environment agnostic javascript library

I'm creating a JavaScript library, and I want it to be environment agnostic (it will not use the DOM, AJAX, or the NodeJS API; it will be vanilla JavaScript). So it's supposed to run in any JavaScript environment (browsers, npm, Meteor smart packages, V8 C bindings...).
My current approach is creating a git repo with the library, with everything inside a single global variable, without thinking about patterns like CommonJS or AMD.
Later, I'll create another git repo, using my library as a git submodule, and create what is needed to release it as an npm module. I'm not sure whether this is a good approach; I haven't found anyone doing it this way.
Pros: the code will be vanilla JavaScript, without awareness of environment patterns. It will not bind itself to CommonJS. It will be repackageable (copy and paste, or git submodule) for any JavaScript environment. It will be as small as needed to be sent to browsers.
Cons: I'll have to maintain as many git repos as environments I want to support. At least a second git repo to deliver on npm.
Taking jQuery as an example, it runs in both the browser and Node.js with just one git repo. There is some code that checks for the "exports" variable so it can run on Node.js or another CommonJS-compatible environment (a simplified sketch of that pattern follows the pros/cons below).
Pros: just one git repo to maintain.
Cons: it will be bound to the CommonJS pattern (to achieve npm compatibility).
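For reference, a stripped-down version of that detection (a UMD-style wrapper; the library body itself stays environment-agnostic, and myLib is just a placeholder name):
(function (global, factory) {
  if (typeof module === "object" && typeof module.exports === "object") {
    // CommonJS / Node.js
    module.exports = factory();
  } else {
    // plain browser global
    global.myLib = factory();
  }
}(typeof window !== "undefined" ? window : this, function () {
  // the actual library code, unaware of the environment it runs in
  return {
    greet: function (name) { return "hello " + name; }
  };
}));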
My question is: am I following a correct (or acceptable) approach? Or should I follow jQuery's path and try to create a single git repo?
Update 1:
Browserify and other require() libraries are not valid answers. My question is not how to use require() in the browser; instead, it's about the architectural pattern to achieve environment agnosticism.
Update 2:
Creating a browser/Node.js module is not the question; that much is known. The question is: can I make a truly environment-agnostic library? The example given is bound to the CommonJS pattern used in Node.js.
If you are looking for a design recommendation for your future library work, then in my opinion you can "think future" and just use the usual object-oriented practices well proven in other languages, systems and libraries.
Mainly concentrate on the UML view of your design.
Forget the "one variable" requirement.
Use features proposed in the planned next version of JavaScript.
http://wiki.ecmascript.org/doku.php?id=strawman:maximally_minimal_classes
http://wiki.ecmascript.org/doku.php?id=harmony:modules_rationale
There is an experimental compiler available that allows you to write ES6-style code even today (see https://www.npmjs.org/package/es6-module-transpiler-rewrite).
Node.js has a --harmony command line switch that allows for the same (see What does `node --harmony` do?)
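As a small sketch of what that "next version" style looks like (ES6 modules and classes; this runs after transpiling, or natively on engines that already support it):
// my-lib.js
export class Greeter {
  constructor(name) {
    this.name = name;
  }
  greet() {
    return "hello " + this.name;
  }
}

// consumer.js
import { Greeter } from "./my-lib";
console.log(new Greeter("world").greet());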
So in my opinion the correct approach is to follow best practices and "think future".
"Use a build tool" is the answer to this question. With a build tool, you can develop with the best code practices, without coupling your code to some environment standard of today (AMD, CommonJS...), and still publish your code to those kinds of environments.
For example, I'm using Grunt.js to run some tasks, like build, lint, test, etc.
It performs tedious operations (minification, compilation...) like Make, Maven, Gulp.js, and various others.
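A minimal Gruntfile along those lines might look like this (the plugin and paths are illustrative; check each plugin's docs):
module.exports = function (grunt) {
  grunt.initConfig({
    uglify: {
      dist: {
        files: { "dist/my-lib.min.js": ["src/my-lib.js"] }
      }
    }
  });

  grunt.loadNpmTasks("grunt-contrib-uglify");
  grunt.registerTask("build", ["uglify"]);
};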
The build task can handle standards (like CommonJS) for the compiled code. So the library can be totally environment agnostic, and the build process handles the environment adaptations.
Note that I'm not talking about compiling to binaries. It's compiling source to another source, like CoffeeScript to JavaScript. In my case, it's compiling JavaScript without any environment standard to JavaScript with the CommonJS standard (to run as a Node.js module).
The final result is that I can compile my project to various standards without messing with my code.
Additionally, with a build phase I can "think future", as xmojmr answered, and use EcmaScript 6 features in my JavaScript code, using Grunt plugins like grunt-es6-transpiler or grunt-traceur to compile the JavaScript from ES6 to ES5 (so it can run in the environments of today).
As for modularizing your library (if you need modules), read this question: Relation between CommonJS, AMD and RequireJS?
Take Bootstrap for example: it uses npm to manage project dependencies and uses Bower to publish itself as static content for other web apps.
Take a look at Browserify as a reference; it's a little heavy because it provides the capability to bundle dependent npm modules as a resource for browsers.

How can I use npm for front-end dependencies?

I want to ask if it is possible (and generally a good idea) to use npm to handle front-end dependencies (Backbone, jQuery).
I have found that Backbone, jQuery and so on are all available through npm but I would have to set another extraction point (the default is node_modules) or symlink or something else...
Has somebody done this before?
Is it possible?
What do I have to change in package.json?
+1 for using Browserify. We use it here at diy.org and love it. The best introduction and reasoning behind Browserify can be found in the Browserify Handbook. Topics like CommonJS & AMD solutions, build pipelines and testing are covered there.
The main reason Browserify works so well is it transparently works w/ NPM. As long as a module can be required it can be Browserified (though not all modules are made to work in the browser).
Basics:
npm install jquery-browserify
main.js
var $ = require('jquery-browserify');
$("img[src$='png']").hide();
Then run:
browserify main.js > bundle.js
Then include bundle.js in your HTML doc and the code in main.js will execute.
Short answer: sort of.
It is largely up to the module author to support this, but it isn't common. Socket.io is an example of such a supporting module, as demonstrated on their landing page. There are other solutions however. These are the two I actually know anything about:
http://ender.no.de/ - Ender JS, self-described NPM analogue for client modules. A bit too involved for my tastes.
https://github.com/substack/node-browserify - Browserify, a utility that will walk your dependencies and allow you to output a single script by emulating the node.js module pattern. You can use a jake|cake|rake|make build script to spit out your application.js, and even automate it if you want to get fancy. I used this briefly, but decided it was a bit clunky, and became annoying to debug. Also, not all dual-environment npm modules like to be run through browserify.
Personally, I am currently opting for using RequireJS ( http://requirejs.org/ ) and manually managing my modules, similar to how Mozilla does with their BrowserQuest sample application ( https://github.com/mozilla/BrowserQuest ). Note that this comes with the challenge of having to potentially shim modules like backbone or underscore which removed support for AMD style module loaders. You can find an example of what is involved in shimming here: http://tbranyen.com/post/amdrequirejs-shim-plugin-for-loading-incompatible-javascript
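For what it's worth, newer RequireJS releases also ship a built-in shim config that covers the same need; a sketch (the paths are hypothetical, adjust to where your vendor files live):
requirejs.config({
  paths: {
    jquery: "vendor/jquery",
    underscore: "vendor/underscore",
    backbone: "vendor/backbone"
  },
  shim: {
    underscore: { exports: "_" },
    backbone: {
      deps: ["underscore", "jquery"],
      exports: "Backbone"
    }
  }
});

require(["backbone"], function (Backbone) {
  // Backbone is usable here even though it isn't an AMD module
});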
Really it seems like it is going to hurt no matter what, which is why native module support is such a hot topic.
Our team maintains a tool called Lineman for building front-end projects. The tool is node-based, so a project relies on a lot of npm modules that operate server-side to build your assets, but out of the box it expects to find your client-side dependencies copied and committed to vendor/js.
However, a bunch of folks (myself included) have tried integrating with browserify, and we've run into a lot of complexity and problems, ranging from (a) npm modules being maintained by a third party which are either out of date or add unwanted changes, to (b) actual libraries that start failing when loaded traditionally whenever a top-level function named require is even defined, due to AMD/Require.js baggage.
My short-term recommendation is to hold off and stick with good ol' fashioned script concatenation until the dust settles. Until you have problems big enough or complex enough to warrant it, I suspect you'll spend more time debugging and remediating your build than you otherwise would. And I think most of us agree the best use of your time is focusing on your application code, not its build tools.
You might want to take a look at http://jspm.io/ which is a browser package manager. Has nice ES6 support too.
I personally use webmake for my small projects. It is an alternative to browserify in the way it brings npm dependencies into your browser, and it's apparently lighter.
I didn't have the opportunity to compare in details browserify and webmake, but I noticed webmake doesn't work well with modules internally using global variables such as socket.io (which is full of bloat anyway IMO).
I would be cautious about RequireJS, which has been recommended above. Because it is an AMD loader, your browser will load your JS files asynchronously. That induces more exchanges between your client and the server and may degrade the UX for people browsing from mobile networks or under bad WiFi. Moreover, if you manage to keep your JS code simple and tiny, asynchronous loading is absolutely not needed!
