Releasing a JavaScript library for client use - javascript

I have a JavaScript library that I wrote for a client. It is written in TypeScript and "compiles" into JavaScript using webpack. I want to give the client access to the distribution files but not the whole source. Ideally they could install from the command line to make installing updates easy.
The library provides some JavaScript functions. The client would install the script in one location on their server and could then include the JavaScript files in their web surveys as needed.
+project
  +dist
    -main.js
    -vendor.js
    -index.html
    -README.md
    -LICENSE.md
  +src
    -index.js
    -index.html
    ...
My initial thought is to give them access to a private git repository that contains only the distribution files. My project would be one git repository that only I have access to. I would then copy the contents of the dist directory into a release directory, which would be a second git repo I could supply to the client.
I'm not sure this is the best approach.
It was suggested that GitHub releases may be an option - but I don't use GitHub, I use GitLab and would like to continue to do so.
npm also doesn't seem like a good choice. It installs files into the node_modules directory and creates a package.json file, which would be confusing to my client and isn't "clean".

It sounds like a second git repository used as a submodule could work for you. On your side it would receive the built files, and on the client's side they could consume them.
I'd suggest making use of tags to mark significant versions in the submodule.
By using a separate repository there is no risk of leaking the original source files.
Alternatively, you could package the files as a zip and upload them somewhere like S3 as part of your CI process, then write a script for the client that automatically downloads the distribution files - but this seems more complex than just using a package manager like npm.
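As a minimal sketch of the two-repository approach (assuming the client-facing repo is already cloned next to the project at ../release and the version comes from package.json; all names and paths here are placeholders for your own setup):

// release.js - copy dist/ into the client-facing repo, then commit and tag it there
const { execSync } = require('child_process');
const fs = require('fs');
const path = require('path');

const version = require('./package.json').version;
const releaseDir = path.resolve(__dirname, '../release'); // assumed clone of the client repo

// copy the built files over the previous release (fs.cpSync needs Node 16.7+)
fs.cpSync(path.join(__dirname, 'dist'), releaseDir, { recursive: true });

// commit and tag in the release repo so the client can pin or update to a version
execSync(`git add -A && git commit -m "Release v${version}"`, { cwd: releaseDir });
execSync(`git tag v${version} && git push --follow-tags`, { cwd: releaseDir });

The client then only ever sees the release repo; a git pull (or checking out a tag) is their whole update process.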

Related

How to package grpc-web generated code into npm package

I have proto files defined inside a Go module and publish this module so servers and Go clients can reference the generated code. Now I also want a grpc-web client to communicate with the server. grpc-web generates .ts and .js files, but in order to use these in the browser I need to package them as an npm module or copy the generated code into the browser repo manually, which I do not want to do. What is the standard practice here?
Is there a good solution for packaging both an npm module and a Go module from proto files?
Update: I ended up copying the generated ts/js files into the desired location as part of my standard build. Since I have a monorepo this is sufficient right now. It's fairly straightforward and also source controlled. Not sure if there is a better, idiomatic way.
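That copy step can be wired into the frontend's npm build, for example (a sketch only; the ../goapi/gen/web and src/generated paths are placeholders for wherever your protoc output and browser code actually live):

"scripts": {
  "copy:gen": "cp -R ../goapi/gen/web/. src/generated/",
  "prebuild": "npm run copy:gen",
  "build": "webpack"
}

Since the monorepo keeps the generated files under version control, the copy stays reproducible for anyone building the frontend.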

Why are frontend frameworks in NPM?

When I look at some GitHub projects and tutorials, I often see frontend frameworks listed as dependencies in the package.json file. I don't get it. I thought Node was backend? My understanding is that to install frontend frameworks you download them directly from their website or GitHub, or use a CDN, and then link them in your pages - all of which has nothing to do with Node.
Even if I did install a framework through Node, doesn't it just save it to the node_modules folder? There must be a reason for it, as I've seen a lot of projects list them in their package.json file. Can anyone explain this to me?
Node.js is not only a "server" in the sense of a programmable webserver; it is a JavaScript runtime. You can use it to serve webpages, but you can also use Node.js as a parser / generator for JavaScript (meaning: reading and writing files on the system). If you use one of the frontend frameworks like React or Angular, you install the packages just to get their source code, not to actually run that code on the server. Then you use a bundler like webpack to turn the code you've written and the code from the modules into one (or multiple) large chunks of minified frontend code. You can usually find those generated files inside the /dist or /build folder. To get these files to clients, you can use Node.js as a server too, serving the files to the browser. That way, the packages "installed" on your server end up on your client.
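A minimal picture of that flow (the React example and file names are just illustrative; any frontend package works the same way):

// src/index.js - the import pulls React's source out of node_modules at build time,
// not at runtime on a server
import React from 'react';
import { createRoot } from 'react-dom/client';

createRoot(document.getElementById('app')).render(React.createElement('h1', null, 'Hello'));

// webpack.config.js - webpack follows the imports and bundles your code plus the
// node_modules code into a single minified file under dist/
module.exports = {
  mode: 'production',
  entry: './src/index.js',
  output: { filename: 'main.js', path: __dirname + '/dist' },
};

The dist/main.js it produces is what the browser actually loads; Node and node_modules are only involved during the build.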

Including JS plugin files directly in Github repository?

I am relatively new to using Git and GitHub, and I am working on my personal website. I am using the JS plugin Slick, a responsive carousel. I downloaded all of the necessary files and stored them within my local repo. The size and content of the Slick zip folder is much larger than the files for my site at the moment, so when syncing with GitHub this makes my project appear as 75% JavaScript, which doesn't reflect the actual website.
Am I doing this correctly, storing the files for my JS plugin directly within my repository folder? Or should I be using some other method to implement Slick on my site? Or is this just something I should not be worried about? Thanks
If you're just using one library, manually storing it in your Git repo is fine. You'd have to manually update the files if a new version is released, but that's not a big deal for one library. (And you might not even care about updates to this library).
However if you're using more than one library, I'd highly recommend using Node Package Manager (NPM) and a build tool like Webpack.
Here's an article that introduces these tools (plus a few others): https://medium.com/front-end-hacking/what-are-npm-yarn-babel-and-webpack-and-how-to-properly-use-them-d835a758f987
When using git, you should store your dependencies in a folder that is listed in your .gitignore. If you install Browserify or another similar tool like webpack, you can use the npm package manager to create a dependency list file with npm init, which allows anyone to install the packages easily with npm install. You can install packages like Slick with npm install --save slick-carousel and use them with require() in your main JS file. Then, take your JS file and run browserify jsfile.js -o outputfile.js and it will package your JS and your dependencies together to be used by the browser.
When uploading to your git repo, add a standard Node .gitignore (one that ignores node_modules). This prevents your dependencies from being uploaded to the repo; instead, when someone wants to run your project, they must run npm install to get all the dependencies.
Browserify produces an output JS file that you add to your web server; the name of this file should be put in your .gitignore as well. Your code is stored in the JS file you pass to Browserify, so other people can still access it without the output file, but they need to run the browserify command to package your code.
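In this setup the entry file might look something like the following (a sketch; slick-carousel is Slick's npm package name and it registers itself as a jQuery plugin, so jQuery is required alongside it - the selector and options are just examples):

// main.js - bundle with: browserify main.js -o bundle.js
const $ = require('jquery');     // Slick is a jQuery plugin
require('slick-carousel');       // registers $(...).slick()

$(function () {
  $('.carousel').slick({ dots: true, autoplay: true });
});

Only main.js would be committed; node_modules/ and bundle.js stay in .gitignore.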

My build server lives behind a very strict firewall, so can I store my Bower dependencies inside the project?

I work in a high-security corporate environment.
I'm not allowed to access Bower's index or GitHub from any of the build servers. I can access both of these things from my workstation (for now).
Is there a way to manage all of my project's dependencies some other way - e.g. on an internal index, or possibly just dumping the project's dependencies into a directory of the project's source code? Clearly none of these options is as nice as just connecting to the Internet, but I need to be able to do a "bower install" without actual access to the Internet at build time.
Previously we solved access to Python modules on PyPI simply by constructing a static website whose content was structured like PyPI's "simple" format. I was hoping that we might be able to do something similar for Bower.
There are a few options:
a) set a Bower cache folder of your preference and dump all the packages you need there (see "How to change bower's default cache folder?")
b) simply commit the project's bower packages folder into your source control
c) use a local git server to host the packages and install from there
Set up a git repository within your organization, with the bower modules you need, and then do:
bower install '<git-url>#<git-commit-sha>'
Any web-accessible endpoint will also work fine; see http://bower.io/docs/api/#install
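If you want builds to resolve those internal locations automatically, the dependencies in bower.json can point straight at Git URLs or archive URLs pinned to a ref (a sketch; the host, package names, and versions below are placeholders):

{
  "name": "my-app",
  "dependencies": {
    "jquery": "https://git.internal.example.com/mirrors/jquery.git#3.4.1",
    "bootstrap": "https://git.internal.example.com/mirrors/bootstrap.git#v3.3.7"
  }
}

With that in place, bower install on the build server never needs to reach the public registry or GitHub.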

How do I obtain the path of a file in a Meteor package?

I know how to get the current directory from a Meteor package, but how do I get the path of a specific file in the project?
node's __dirname and __filename don't work in Meteor.
It is complicated.
meteor run copies your project files to a tree of directories inside <project-dir>/.meteor/local/build, reorganizes them in non-obvious ways (e.g. the private subdirectory in the original tree becomes the assets subdirectory) and mixes them in with various npm modules to create a bundle that can be executed as a Node.js project. Indeed, to avoid duplication, a .gitignore file is automatically set up in the .meteor directory that tells git, if you use it for version control, not to track the .meteor/local directory.
The original project directory gets watched in case you change a file. The change then gets copied into the current project build directory and the project is rebuilt.
If you deploy to a remote system, the build gets copied to a server and then run.
process is a global object defined on the server side and works according to the Node.js API, because the Meteor server code ultimately runs in Node.js.
So you can run console.log(process.cwd()); in your server-side to obtain the current working directory for the server process, usually something like:
~/<meteor project directory>/.meteor/local/build/programs/server
This suggests that when meteor run is done locally, the original project files are in ../../../../../, but don't rely on that, as it may change in the future.
Instead, for the directory containing the original project files, you could use:
baseDir = process.cwd().replace(/\/\.meteor.*$/, '');
This gets the working directory and truncates everything beginning with /.meteor.
This won't work for a server deploy, though, because the original project tree is not needed on the server, only the build. Files that aren't intended to be client or server code can be put in the private subdirectory, which, as mentioned above, becomes the assets subdirectory in the build. The current ways to find files in the build are either manual inspection of .meteor/local in a local run, or a JS library that calls or imitates GNU find.
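If what you actually need is a file's contents rather than its path, and you can place that file under private/, Meteor's Assets API reads it on the server without any path reconstruction (a sketch; private/config.json is a hypothetical file):

// server-side only: reads <project-dir>/private/config.json from the build's assets directory
const raw = Assets.getText('config.json');
const config = JSON.parse(raw);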
Since you mentioned packages, I note that in the build, server-side package code finally ends up in:
~/<project-dir>/.meteor/local/build/programs/server/packages
and client side in:
~/<project-dir>/.meteor/local/build/programs/web.browser/packages
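Putting the pieces together, a server-side helper for locating those built package directories could look like this (a sketch only; as noted above these paths are build internals and may change between Meteor releases):

const path = require('path'); // on older Meteor versions, Npm.require('path')

// locally, process.cwd() is .../.meteor/local/build/programs/server
const programsDir = path.resolve(process.cwd(), '..');
const serverPackagesDir = path.join(programsDir, 'server', 'packages');
const clientPackagesDir = path.join(programsDir, 'web.browser', 'packages');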
