I want to publish a module to several component manager systems: npmjs, bower, etc., plus I want to create downloadable builds as well: for example one in AMD style for RequireJS, one in CommonJS style, one for the global namespace in the browser, a minified version of each, and so on. That adds up to more than 10 builds.
I currently have a single AMD build, and I wrote unit tests for it using Karma, Jasmine and RequireJS in an AMD style. What do you suggest: how should I generate the other builds and the tests for them?
I mean I cannot decide what I should use as the base of the transformations. There is a common part in every output package, and there is a package-dependent part as well.
AMD - requirejs (I am not sure about using the config options)
define(["module", "dependency"], function (module, dependency) {
    var m = {
        config: function (options) {
            //...
        },
        //...
        //do something with the dependency
    };
    m.config(module.config()); //load config options set by require.config()
    return m;
});
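For reference, the config options above would be supplied by the consuming application through require.config()'s config section; this is standard RequireJS behavior, and the module id "m" is just an assumption here:
require.config({
    config: {
        // keyed by module id; picked up via module.config() in the module above
        "m": {
            option: "value"
        }
    }
});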
commonJS
var dependency = require("dependency");
module.exports = {
    config: function (options) {
        //...
    },
    //...
    //do something with the dependency
};
global
var m = (function (dependency) {
    return {
        config: function (options) {
            //...
        },
        //...
        //do something with the dependency
    };
})(dependency);
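For reference, the UMD pattern combines all three of the above in a single wrapper; a minimal sketch:
(function (root, factory) {
    if (typeof define === 'function' && define.amd) {
        define(['dependency'], factory);                    // AMD
    } else if (typeof module === 'object' && module.exports) {
        module.exports = factory(require('dependency'));    // commonJS
    } else {
        root.m = factory(root.dependency);                  // global
    }
})(this, function (dependency) {
    return {
        config: function (options) {
            //...
        },
        //...
        //do something with the dependency
    };
});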
I don't know: should I develop the common code and build it before every test, or should I develop one of the packages, test it, and write a transformation from that into the other builds?
I intend to use gulp for creating the builds and to run the unit tests automatically for each of them before automatically publishing. Oh, and of course I need an automatic version number bump as well. By the way, is it necessary to run the unit tests after the build procedure, what do you think? I just want to be sure that buggy code is not published...
There are transformation libraries for gulp:
https://github.com/phated/gulp-wrap-amd (commonjs to amd)
https://github.com/gfranko/amdclean (amd to standard js)
https://github.com/phated/gulp-wrap-umd (commonjs to umd I guess)
https://github.com/adamayres/gulp-wrap (not sure of its capabilities yet, maybe universal with custom templates, maybe nothing)
So it is easy to transform one package format into the two others. This can be done with the tests as well as the code. The CommonJS build can be tested with node and jasmine-node, the standard (global) build with Karma and Jasmine, and the AMD build with Karma, RequireJS and Jasmine.
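A rough gulpfile sketch of that idea, using the CommonJS build as the base; the wrapAmd option names (deps, params) are taken from the plugin's README and should be treated as assumptions to double-check:
var gulp = require('gulp');
var wrapAmd = require('gulp-wrap-amd');   // commonjs -> amd
var uglify = require('gulp-uglify');
var rename = require('gulp-rename');

// generate the AMD build (plus its minified variant) from the commonjs source
gulp.task('build-amd', function () {
    return gulp.src('src/**/*.js')
        .pipe(wrapAmd({ deps: ['dependency'], params: ['dependency'] }))
        .pipe(gulp.dest('dist/amd'))
        .pipe(uglify())
        .pipe(rename({ suffix: '.min' }))
        .pipe(gulp.dest('dist/amd'));
});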
It would be a poor choice to create a common descriptor and convert that before every test. I don't want to create another language, like CoffeeScript, etc., so converting between package formats is okay.
Running the unit tests before publishing is okay. There are no other common package formats, so these 3 will be enough for every component manager I will use.
I am unsure about the versioning. It is not hard to set it manually, but maybe a system like Travis or Maven could help more... I'll figure it out and edit this answer.
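(One candidate for the version bump, at least for the npm target: npm's own version command, which updates package.json and creates a git tag:)
npm version patch   # or minor / major
npm publish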
Related
I have a library of Java code from Android development and I'd like to reuse that in a web-app version of the same. The library code is quite generic as it was always intended to be re-usable and has, in the past, been used to generate Android/Java, App-Engine/Java, iOS/ObjC, and GWT apps.
Looking around, I think the best framework for the web app would be Angular. Rewriting the library code to Kotlin should be a relatively minor task as there are tools to do most of the work. Then it can be compiled for the JVM (for native and backend apps) or JavaScript (for web apps).
While advice for/against this plan is welcome, my actual question is...
How do I set up an IntelliJ project to do this?
I thought the obvious answer would be two modules: one for the lib and one for the app, but IntelliJ doesn't allow creating a Kotlin module, only a Kotlin project.
Instead, I made a Kotlin/JS project and used Angular/CLI to create the app module beneath it (with a backend app to sit beside it sometime in the future). The library builds and the sample app runs, but I haven't been able to get the latter to include the generated JS (plus .d.ts) code of the former, which sits in some deep directory under build/. So maybe I'm going about it all wrong...
Wow, that was rough! After many hours of Google searches and attempts that failed, here is something that works. Perhaps I'll find cleaner methods down the road but this is acceptable for now.
Note: I wasted more time on Gradle plugins than I'd like. If they're not first-party, they tend to be unmaintained, poorly documented (largely assuming that the reader already knows what they're doing), and out of date. What follows uses only first-party tooling.
Create a top-level project with Angular/CLI followed by sub-projects for the app and library:
ng new MyGeneralProject --no-create-application
ng generate library klib
ng generate application MyApp
Now set up Gradle in the top-level directory (or wait and do this via the IDE):
gradle init
gradle wrapper
Open the top-level in IntelliJ. Create/edit the top-level build.gradle.kts file:
buildscript {}
plugins {}
repositories {}
More important is the top-level settings.gradle.kts file:
rootProject.name = "MyProject"
include("projects:klib")
This will now access the projects/klib/build.gradle.kts file:
plugins {
    kotlin("js") version "1.8.0"
}

repositories {
    mavenCentral()
}

dependencies {
    implementation(kotlin("stdlib-js"))
    testImplementation(kotlin("test-js"))
}

kotlin {
    sourceSets {
        all {
            // allow @JsExport without opt-in each time
            languageSettings.optIn("kotlin.js.ExperimentalJsExport")
        }
        val main by getting {
            kotlin.srcDir("src/main")
        }
        val test by getting {
            kotlin.srcDir("src/test")
            dependencies {
                implementation(kotlin("test"))
            }
        }
    }
    js(IR) {
        moduleName = "klib"
        browser {
            distribution {
                directory = File("$projectDir/../../dist/$moduleName")
            }
        }
        binaries.library()
    }
}
The above uses the kotlin("js") plugin for Gradle, which is provided by JetBrains (the maker of IntelliJ), who also maintain the Kotlin language libraries. Though its documentation likewise assumes you already know everything about Gradle, it nonetheless works just fine.
The built JS/TS library will be created under the top-level directory as "dist/klib". This naming could use some improvement but it's a reasonable starting point.
In the top-level tsconfig.json file, under "compilerOptions", look for "paths" and update as desired. I went with a leading "#" to indicate a top-level import location:
{
    ...
    "compilerOptions": {
        ...
        "paths": {
            "#klib": [
                "dist/klib"
            ]
        }
        ...
    },
    ...
}
In the projects/myapp/src/app/whatever.ts file, access the converted Kotlin library:
import * as klib from '#klib'
Then the entire library is available as klib.blah.blah.blah with IntelliJ providing full completion semantics.
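For example, assuming the Kotlin side exported something like @JsExport fun greet(name: String): String (a hypothetical function, purely for illustration), the app code could call it with full type checking:
import * as klib from '#klib'

// resolved against the generated .d.ts declarations in dist/klib
const greeting = klib.greet('world')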
2023-02-07: I've found that while this approach worked okay for creating the desired Angular project, it fell apart when trying to add a jvm-based backend with which it shared code. Now trying a different tactic...
I am just re-looking at Jest as it's been getting a lot of good reports. However, I'm struggling to find a good way to access internal functions to test them.
So if I have:
const Add2 = (n) => n + 2;

export default (list) => {
    return list.map(Add2);
}
Then if I was using Jasmine or Mocha I'd use rewire or babel-plugin-rewire to get the internal Add2 function like this:
var rewire = require('rewire');
var Add2 = rewire('./Adder').__get__('Add2');

it('Should add 2 to number', () => {
    let val = Add2(1);
    expect(val).toEqual(3);
});
However, neither of them seems to work with Jest, and while there looks to be an excellent mocking syntax, I can't see any way to get at internal functions.
Is there a good way to do this, something I'm missing on the jest api or set up?
You can actually achieve this if you are willing to use Babel to transform your files before each test. Here's what you need to do (I'll assume you know how to get Babel itself up and running; if not, there are multiple tutorials available for that):
First, we need to install the babel-jest plugin for jest, and babel-plugin-rewire for babel:
npm install --save-dev babel-jest babel-plugin-rewire
Then you need to add a .babelrc file to your root directory. It should look something like this:
{
    "plugins": ["rewire"]
}
And that should be it (assuming you have babel set up correctly). babel-jest will automatically pick up the .babelrc, so no additional config needed there unless you have other transforms in place already.
Babel will transform all the files before jest runs them, and speedskater's rewire plugin will take care of exposing the internals of your modules via the rewire API.
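With that in place, the test can use the __get__ accessor that babel-plugin-rewire attaches to a module's exports (treat the exact accessor name as an assumption to verify against the plugin's README):
const Add2 = require('./Adder').__get__('Add2');

it('should add 2 to a number', () => {
    expect(Add2(1)).toEqual(3);
});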
I was struggling with this problem for some time, and I don't think the problem is specific to Jest.
I know it's not ideal, but in many situations, I actually just decided to export the internal function, just for testing purposes:
export const Add2 = x => x + 2;
Previously, I would hate the idea of changing my code just to make testing possible/easier. This was until I learned that this is an important practice in hardware design: they add certain connection points to the piece of hardware they're designing just so they can test whether it works properly. They are changing their design to facilitate testing.
Yes, you could totally do this with something like rewire. In my opinion, the additional complexity (and with it, mental overhead) that you introduce with such tools is not worth the payoff of having "more correct" code.
It's a trade-off, I value testing and simplicity, so for me, exporting private functions for testing purposes is fine.
This is not possible with Jest. Also, you should not test the internals of a module, but only its public API, because that is what other modules consume. They don't care how Add2 is implemented as long as yourModule([1,2,3]) returns [3,4,5].
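In other words, a test against the public surface only needs the default export; a minimal sketch (the import name is arbitrary):
import add2ToAll from './Adder';

it('adds 2 to every element', () => {
    expect(add2ToAll([1, 2, 3])).toEqual([3, 4, 5]);
});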
I have come across a few modules I would like to use in my Angular app but am at a crossroads on how to make them work in my Angular app, as I will need to require() them in my factory file.
Here's the node module I'm interested in: https://github.com/TimNZ/node-xero
On this current project I am using Gulp Angular in Yeoman to generate my boilerplate and am having a hard time figuring out how I should make this work if I need to modify any of the gulp scripts.
I was thinking I can just "Browserify" the single file that will use require(), but is this the wrong approach? Should I just browserify all the project files? Is this standard practice?
Any advice is appreciated; this currently has me at a standstill.
All the modules I want to use in relation to Xero seem to be node modules.
The simplest starting point would be to use Browserify to build a standalone bundle that uses a global of your choice.
To do this, you could create a JS file that requires the node module(s) you want to use. You could create a file named bundle-index.js with this content:
exports.xero = require('node-xero');
You could then run this command to build a standalone module:
browserify --standalone the_global bundle-index.js > bundle.js
Where the_global is a name you find appropriate for the global object that will contain the exports. With the bundle.js file included in a script element, you would be able to use it like this:
var privateApp = new window.the_global.xero.PrivateApplication({ ... });
Doing things this way would involve the least amount of disruption to your current project. And if you are only needing to use Browserify to require third party libraries that don't change frequently, you could start with a simple manual process for building the standalone bundle.
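For example, that manual process could just be an npm script (the script name "bundle" is arbitrary) wrapping the command above:
{
    "scripts": {
        "bundle": "browserify --standalone the_global bundle-index.js > bundle.js"
    }
}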
Note that you can export other required modules by adding additional exports to bundle-index.js:
exports.xero = require('node-xero');
exports.someOtherModule = require('some-other-module');
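And if you'd rather Browserify your own source files instead of a standalone bundle, the factory file itself can require the module directly: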
(function () {
    var xero = require('node-xero');

    angular
        .module('app')
        .factory('myFactory', myFactory);

    function myFactory() {
        var helper = {
            myMethod: myMethod,
        };

        return helper;

        function myMethod() {
            xero.doStuff();
        }
    }
})();
I'm writing a javascript library that contains a core module and several optional submodules which extend the core module. My target is the browser environment (using Browserify), where I expect a user of my module will only want to use some of my optional submodules and not have to download the rest to the client, much like custom builds work in lodash.
The way I imagine this working:
// Require the core library
var Tasks = require('mymodule');
// We need yaks
require('mymodule/yaks');
// We need razors
require('mymodule/razors');
var tasks = new Tasks(); // Core mymodule functionality
var yak = tasks.find_yak(); // Provided by mymodule/yaks
tasks.shave(yak); // Provided by mymodule/razors
Now, imagine that the mymodule/* namespace has tens of these submodules. The user of the mymodule library only needs to incur the bandwidth cost of the submodules that she uses, but there's no need for an offline build process like lodash uses: a tool like Browserify solves the dependency graph for us and only includes the required code.
Is it possible to package something this way using Node/npm? Am I delusional?
Update: An answer over here seems to suggest that this is possible, but I can't figure out from the npm documentation how to actually structure the files and package.json.
Say that I have these files:
./lib/mymodule.js
./lib/yaks.js
./lib/razors.js
./lib/sharks.js
./lib/jets.js
In my package.json, I'll have:
"main": "./lib/mymodule.js"
But how will node know about the other files under ./lib/?
It's simpler than it seems: when you require a package by its name, you get the "main" file. So require('mymodule') returns "./lib/mymodule.js" (per your package.json "main" prop). To require optional submodules directly, simply require them via their file path.
So to get the yaks submodule: require('mymodule/lib/yaks'). If you wanted to do require('mymodule/yaks'), you would need to either change your file structure to match that (move yaks.js to the root folder) or do something tricky where there's a yaks.js at the root that just does something like: module.exports = require('./lib/yaks');.
Good luck with this yak lib. Sounds hairy :)
I've been researching CommonJS, AMD, module loading, and related issues for over a week. I feel like nothing out there does what I need. My basic need is to share code seamlessly between frontend and backend. There are various issues around this, including module formats for the client side, script loading, and module format conversions/wrapping. The piece I've been struggling with recently is how to use both CommonJS and AMD (or something AMD-like) in node.js.
You can't get away from CommonJS in node.js, so my thinking is that if I want to use AMD, it has to work alongside CommonJS. What tools, libraries, or techniques can I use to get something AMD-like working?
For example, I would like to be able to write a module like this:
var x = require('x')

module.exports = function (a, callback) {
    if (a) {
        require(['y', 'z'], function (y, z) {
            callback(x, y.o + z.k)
        })
    } else {
        callback(x, "ok")
    }
}
Ideally:
Both node.js and the amd-like modules will have paths interpreted in the node.js way (paying attention to node_modules unless the module path starts with "/", "./", or "../")
doesn't require source conversion for the server side in a build step (i.e. modules will run in node.js without each one being programmatically converted)
module or require don't need to be explicitly passed into the amd-like require function
uRequire is the perfect tool for this requirement; it's all about interoperability between the module formats and their incompatibilities.
Essentially uRequire converts or translates modules from nodejs to AMD and vice versa, plus the UMD format that runs on both nodejs and the browser, or a combined .js that requires no AMD loader on the browser.
It will require a build step though, but that is a minor concern in contrast to what it offers.
You could check out http://dojotoolkit.org/documentation/tutorials/1.9/node/
I've only played with it a little, but it has worked with what I've tried. I got it working with node-orm and remember that being a pain to get going, but it might have just been me making a mess while playing with it.
Essentially you end up with AMD on the server, like:
require(["dojo/node!orm","other/amd/module"], function(orm){
//use third party commonjs module and your own amd modules here
}
It looks like you've already investigated RequireJS's suggestion to wrap CommonJS modules in an AMD require (automatically during build, most likely using r.js).
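For reference, that suggestion is RequireJS's "simplified CommonJS wrapper", which r.js can apply automatically during a build; your example module would end up looking roughly like this (dynamic requires aside):
define(function (require, exports, module) {
    // commonjs-style body, unchanged apart from the wrapper
    var x = require('x');

    module.exports = function (a, callback) {
        callback(x, "ok");
    };
});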