My app has a client and an API in the same repo. The API side is built with slc build --npm, while the client is built with npm run build as triggered within the slc build command. I am trying to build the deployable .tgz file and exclude the client/ and build/ artifacts (among a bunch of other stuff that I don't necessarily want on my API servers).
Am I missing a configuration setting?
My .npmignore clearly has entries for
client/
devops/
dist/
yet after building, if I untar the archive, everything listed in .npmignore is still included. Is this just not supported?
I guess slc build --npm is not running the actual npm pack command internally and is just doing something custom instead.
You are correct that it is not running npm pack. It is actually using a module that was explicitly created to ignore the .npmignore file in your module as well as in any of its dependencies.
The reason this was done is that many modules with binary add-ons are configured not to publish their compiled parts, and honoring those ignore files would make it impossible to bundle such packages with their binaries pre-compiled so that they can be deployed to an environment that doesn't have a compiler.
It sounds like the behavior could use some refinement, like only ignoring the dependencies' ignore files.
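For reference, with a plain npm pack workflow (not slc build), the contents of the tarball can also be controlled with a "files" whitelist in package.json instead of .npmignore. A minimal sketch, where the lib/ and server/ paths are only placeholders:
{
  "name": "my-api",
  "version": "1.0.0",
  "files": [
    "lib/",
    "server/"
  ]
}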
Related
I am working on a VueJS web app. One of the modules it uses is a wrapper for a JavaScript library installed through npm and packaged to integrate into VueJS. It does not quite fit our needs; we need to add some functionality to it, so I want to fork it and edit it.
The repo has two folders: src and dist.
As far as I understand, src is the actual source code while dist is a minified version for distribution. Questions:
If I want to edit it, how do I deal with the contents of /dist? Do I delete it?
Do components installed through npm use the /src/ version or the /dist/ one?
If I delete /dist and work on the /src code, how do I recreate /dist based on the modified /src files?
Thank you.
Based on your questions, I would suggest you get a bit more familiar with your stack and how to actually build your application.
Generally speaking, the /dist folder contains automatically generated files, which may be uploaded to a web server of your choice. Since you are using VueJS, you should be able to generate these files by running npm run build.
If I want to edit it, how do I deal with the contents of /dist? Do I delete it?
As I already mentioned, these files are automatically generated by running npm run build. Therefore, every time you run this command, everything in /dist will be automatically updated.
Do components installed through npm use the /src/ version or the /dist/ one?
Your working directory is always /src. Dependencies can be used like in any other application (this example uses Axios, which is an HTTP client):
import axios from 'axios';
const httpClient = axios.create({ baseURL: 'https://api.something.com' });
httpClient.get(/* ... */);
If you are a beginner and are not 100% sure about how to use dependencies, I highly encourage you to read this article: An Absolute Beginner's Guide to Using npm
If I delete /dist and work on the /src code, how do I recreate /dist based on the modified /src files?
You do not have to delete anything in /dist. Simply running npm run build will automatically pick up your latest changes.
Please keep in mind that running npm run build is only relevant for your production environment. For your development environment, you always want to use a dev server, which can be started with npm run serve.
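For reference, in a project scaffolded with Vue CLI these commands are wired up in the scripts section of package.json, roughly like this (a sketch; the library you forked may use a different build setup):
"scripts": {
  "serve": "vue-cli-service serve",
  "build": "vue-cli-service build",
  "lint": "vue-cli-service lint"
}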
I am doing my first Angular project and I have to upload it to a development test server. The problem is that an Angular project (even in its default state) has so many files that it takes a lot of time to upload.
I investigated, and the .gitignore file seems only to prevent the files or folders listed there from being committed.
Could you please tell me if there is a way to minimize the number of files I need to upload, and then install or use them later locally in a safe way, without risking corrupting the project?
If you are using Angular CLI you can get a production build by doing
npm run -- ng build --prod
in the project directory. This will create a minified, bundled version of your app in the dist folder, ready for upload. You will then need an http-server running to serve these files.
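If you just need to get that build running on a test server, any static file server pointed at the output will do. For example, with the http-server package from npm (note that, depending on your Angular CLI version, the output lands in dist/ or dist/<project-name>):
npm run -- ng build --prod
npx http-server ./dist -p 8080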
As #yarz-tech suggested, the solution was to remove node_modules each time I upload the project. Then, when someone downloads it and wants to work with it again, they must execute npm install. That allows npm to install all the dependencies our project needs, based on what is specified in the package.json file, which exists not only in Angular projects but in any Node project.
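That workflow boils down to something like this (a sketch for a Unix-like shell; use the equivalent delete command on Windows):
rm -rf node_modules     # shrink the upload; the dependencies stay described in package.json
# ...upload / download the project...
npm install             # recreates node_modules from package.json (and package-lock.json, if present)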
Thanks to everyone.
To be clear, I am not asking if I need the node_modules folder on the live host server. That question & answer exists on Stack Overflow already. The consensus answer, in general, is YES - I still need the node_modules directory during runtime.
I am also not asking about running npm init or npm install. I understand how that works.
I am specifically asking - do I still need the node_modules directory on the live/host server if I use webpack during my build process? Doesn't webpack bundle all the necessary JS, etc. into a folder? Can I delete the node_modules folder if I use webpack? Or will I still need that directory during runtime?
This is for a basic front-end, client-side web application only. This front end calls other APIs only for backend services. This front-end web application is being hosted on Windows/IIS.
The site's published code includes static references like this:
<link rel="stylesheet" href="/css/app.css?id=f243e9c6546d420fec1f">
<script src="/js/app.js?id=bf7be8f179cc272c0190"></script>
Ignore the id= part, as I think that's part of the web framework for cache busting.
No, everything is in the bundle after you build. You can take the files defined as output (usually whatever is in the "dist" folder) and stick them on whatever static server you want, without needing the supporting node_modules.
During your webpack build process you do need the node_modules folder, because when you import a file from node_modules, webpack fetches it from the corresponding package folder and follows its imports recursively.
Once the build has completed successfully, you will get a dist folder with all the bundles for deployment; it will not contain a node_modules folder.
You can test it by using
npm run build
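As a rough illustration, a minimal webpack config along these lines (a sketch, not your project's actual config) emits self-contained bundles into dist/, so only that folder needs to be copied to the IIS site:
// webpack.config.js (sketch)
const path = require('path');

module.exports = {
  mode: 'production',
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'app.bundle.js'   // everything imported from node_modules ends up inside this file
  }
};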
Is there an article that describes how a deployed Aurelia web application differs from an application that runs locally on gulp? This must be a general question that applies not only to Aurelia. There is a JS library I'm using that hangs the browser. That never happens when I run the app locally, which makes me think that there is something really different that the deployed app does not have.
You are correct, this is not so much Aurelia specific, but build-tool specific. When you run your app locally, you will be using npm dependencies installed in the /node_modules directory and resources from the local file system (CSS, images, etc.). When you bundle your application for deployment, you need to bundle everything needed to run the app (including dependencies and resources).
For every bundler you can configure what to bundle and create different bundles. There are good explanations on how to bundle for both Aurelia CLI projects (bundle configuration is in aurelia_project/aurelia.json) and JSPM projects (bundle configuration is in bundle.js).
Just make sure that all necessary files and modules are bundled. Often the problem is not with bundling itself, but with stuff that cannot be bundled. There are some very stubborn libraries (e.g. some assets of Bootstrap or some jQuery-based plugins) that will not work when bundled. Then you need to include them in the deployment separately. In a JSPM configuration this means you will have to export them together with the bundles. Export basically means "select all files that will be used to run the app in production"; in the JSPM case those files are copied to the /export directory. In a CLI installation you will need to add a copyFiles section to aurelia.json to export extra files.
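That copyFiles section looks roughly like this in aurelia_project/aurelia.json (the glob and destination are only examples, not your actual assets):
"build": {
  "copyFiles": {
    "node_modules/font-awesome/fonts/*": "font-awesome/fonts"
  }
}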
Check this article on how bundling exactly works, and this one to understand what the role of aurelia-bundler is in the process (hint: aurelia-bundler is the part of the framework that creates ready-to-use Gulp tasks for you).
I'm working on a Node.js app that is written in TypeScript, which means it needs to be compiled to JS before running. As I'm coming from a Java/JVM background, where you ship a prebuilt package to the server and it gets run there, I'm a bit afraid of the deployment style where you push code to git and it gets built/compiled on the server first and then run.
I don't like it for two main reasons:
dev dependencies need to be installed on the server
deployment depends on the availability of external resources (npm etc.)
I found NAR https://github.com/h2non/nar which is more or less what I wanted but it has some drawbacks (doesn't work with some deps that have native extensions).
My question is: is there any other "sane" way of doing Node.js deployment than this risky combination of npm install and tsc on the server? Or should I let that sink in and do it that way?
To be honest, I do believe there are more sane/reliable options for that.
What you can do (but there are probably other perfectly valid approaches) is build your project locally (or on a CI service), and only deploy that built version once you consider it valid (tests, etc.).
This way, if something bad happens, like npm failing or a compilation error, you don't deploy anything, and you have time to resolve the situation.
For example, I used to have a gulp task (but it can be anything else: Grunt, a simple npm script...) that clones a production repository and builds the project into that directory.
That way, I can check that my build is valid. If it is, I make a new commit and push it to the production repo, which is served the way you need (on a Heroku instance, for example).
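As a rough sketch of that flow (the repository URL and paths here are made up):
# build and verify locally or on CI
npm install
npm run build                      # e.g. tsc / bundling into ./dist
npm test

# copy the built output into a clone of the production repo and push it
git clone git@example.com:my-app-production.git ../my-app-production
cp -r dist package.json package-lock.json ../my-app-production/
cd ../my-app-production && git add -A && git commit -m "Release build" && git push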
Pros
Clear separation of dev and non-dev dependencies
Deployment only when you know that the build is valid
No built files on source control on the development repository
No "live" dependency on external tasks like npm install or tsc build
Cons
You have two separate git repositories (one with the source code, one with the built version of your project)
The production process is a little heavier than simply committing to master
(From a comment) Doesn't properly handle the case of an npm package that relies on native extensions that have to be (re)built
is there any other "sane" way of doing Node.js deployment than this risky combination of npm install and tsc on the server
package.json + npm install + tsc is the way to do it. Nothing risky about it.
More
Just use an npm script: https://github.com/TypeStrong/ntypescript#npm-scripts
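For example, a scripts setup along these lines (a sketch; the file names and the TypeScript version are placeholders, and it assumes tsconfig.json points outDir at dist/):
"scripts": {
  "build": "tsc -p .",
  "start": "node dist/index.js"
},
"devDependencies": {
  "typescript": "^4.9.0"
}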