My folder structure:
package.json
src
├── backend
│ ├── routes
│ │ ├── myRoute.js
├── index.js
├── .babelrc
The command I run from package.json: "build:server": "babel src/backend/ -d dist"
The resulting dist folder structure:
dist
├── src
│ ├── backend
│ ├── index.js
How can I get it to transpile all subfolders within backend as well? I tried babel src/backend/** -d dist, and that didn't work either.
You should move your .babelrc file to the root of your project directory, next to package.json, so Babel picks it up for every file it compiles.
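A minimal sketch of what that can look like, assuming Babel 7 with @babel/cli installed (the script name is taken from the question): keep .babelrc at the project root next to package.json, and point the build script at the backend folder. Babel's CLI compiles directories recursively, so routes/myRoute.js should end up in dist/routes/myRoute.js.

```json
{
  "scripts": {
    "build:server": "babel src/backend -d dist"
  }
}
```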
I have a TypeScript monorepo setup that uses npm workspaces, and I want to be able to import subpaths down to any individual TypeScript file of #package/shared in #package/client and #package/server while skipping the src folder. One solution is simply removing the src folder and moving every subfolder up the hierarchy, but that feels like a hacky solution. I tried using wildcards in the exports field (e.g. "./*": "./src/*", "./": "./src/*.ts"), but they either don't work or only work partially (moduleResolution is set to "NodeNext").
For example, with the following folder structure of the monorepo
.
├── package.json
└── packages/
├── shared/
│ ├── package.json
│ └── src/
│ ├── index.ts
│ ├── types/
│ │ ├── index.ts
│ │ ├── foo-type.ts
│ │ └── bar-type.ts
│ └── constants/
│ ├── index.ts
│ ├── foo-const.ts
│ └── bar-const.ts
├── client/
│ └── ...
└── server/
└── ...
I want to be able to import any TypeScript file in the shared package while skipping src:
import { ... } from "#package/shared"
import { ... } from "#package/shared/types"
import { ... } from "#package/shared/types/foo-type"
import { ... } from "#package/shared/types/bar-type"
import { ... } from "#package/shared"
import { ... } from "#package/shared/constants"
import { ... } from "#package/shared/constants/foo-const"
import { ... } from "#package/shared/constants/bar-const"
// shouldn't work!
import { ... } from "#package/src/shared/..."
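One approach that works with moduleResolution "NodeNext" (a sketch; the directory names come from the tree above, and the explicit per-directory entries are my addition): Node treats * in exports patterns as a plain substring substitution rather than a glob, and exact keys take precedence over patterns, so a wildcard for leaf files can be combined with explicit entries for each directory index. In packages/shared/package.json:

```json
{
  "name": "#package/shared",
  "exports": {
    ".": "./src/index.ts",
    "./types": "./src/types/index.ts",
    "./constants": "./src/constants/index.ts",
    "./*": "./src/*.ts"
  }
}
```

With this map, #package/shared/types/foo-type resolves through "./*" to ./src/types/foo-type.ts, while nothing maps a #package/shared/src/... request to an existing file, so the last import correctly fails.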
I'm using version 2.7 of Parcel to bundle my client-side JavaScript. I have an index.ts where I group all my code. In some cases I have to use dynamic import statements, for example:
const { Menu } = await import('./Menu');
The issue that I can't solve: after each update to Menu.ts, Parcel creates a newly hashed Menu.[hash].js file instead of updating the existing one.
npm run watch:js:
"watch:js": "parcel watch --no-hmr ./public/ts/index.ts --dist-dir ./public/js --public-url ./"
public folder structure:
.
└── public/
├── [...]
├── js/
│ ├── index.js
│ ├── index.js.map
│ ├── Menu.[hash-1].js **! that's an issue !**
│ └── Menu.[hash-2].js **! that's an issue !**
└── ts/
├── [...]
├── index.ts
└── Menu.ts
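One thing worth checking (an assumption, since the full setup isn't shown): parcel watch content-hashes output names by default and does not delete stale bundles from --dist-dir, so every edit to Menu.ts produces a new Menu.[hash].js alongside the old one. Parcel 2's --no-content-hash flag keeps output names stable across rebuilds, so the watcher overwrites the file in place:

```json
{
  "scripts": {
    "watch:js": "parcel watch --no-hmr --no-content-hash ./public/ts/index.ts --dist-dir ./public/js --public-url ./"
  }
}
```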
I sometimes see TypeScript files import modules from post-compiled JavaScript files. For example, consider the following monorepo project. Each file in presentation refers to infrastructure/lib or library/lib, which means the ts files in presentation refer to the post-compiled js files of each project. On the other hand, files within presentation refer to ts files in presentation.
.
├── infrastructure
│ ├── README.md
│ ├── lib
│ ├── node_modules
│ ├── package.json
│ ├── src
│ ├── tsconfig.json
│ └── typeorm
├── library
│ ├── lib
│ ├── node_modules
│ ├── package-lock.json
│ ├── package.json
│ ├── src
│ └── tsconfig.json
└── presentation
├── batch
├── rest-api
└── web
infrastructure/tsconfig.json:
{
  "extends": "../../tsconfig.json",
  "compilerOptions": {
    "baseUrl": "./src/",
    "outDir": "./lib",
    "moduleResolution": "node",
    "strictPropertyInitialization": false
  }
}
Are there any rules or conventions about this? Is there any merit in referring to post-compiled files?
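For context, TypeScript has a first-class mechanism for exactly this pattern: project references. A hedged sketch (the paths are assumed from the tree above, and the referenced projects would also need "composite": true) of a tsconfig.json under presentation/rest-api that depends on the compiled lib output of the other packages while still getting full type information from their declarations:

```json
{
  "compilerOptions": {
    "composite": true,
    "outDir": "./lib"
  },
  "references": [
    { "path": "../../infrastructure" },
    { "path": "../../library" }
  ]
}
```

With references, tsc --build compiles dependencies in order and resolves imports against the emitted .js plus .d.ts files, which is the sanctioned version of "referring to post-compiled files".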
If one wants to jump-start a Node.js project with Express, one would use express-generator. After creating a new project, your file tree will look like this:
.
├── app.js
├── bin
│ └── www
├── package.json
├── public
│ ├── images
│ ├── javascripts
│ └── stylesheets
│ └── style.css
├── routes
│ ├── index.js
│ └── users.js
└── views
├── error.pug
├── index.pug
└── layout.pug
One thing that stood out for me is that to run the app you need to do node bin/www, or use the predefined shortcut npm start. My question is: why would one use www the way it is, rather than adding a .js extension and removing #!/usr/bin/env node from the top of the file? Are there any benefits to doing it this way, or is it personal preference?
Let's look at the first line of the bin/www file:
#!/usr/bin/env node
This shebang tells the *nix operating system how to interpret the file if you try to run it as a program.
So this file can be started as a program directly. And on Linux, executable files traditionally do not have an extension.
I used the Express generator to create a Node application. The directory structure looks like the following:
.
├── app.js
├── bin
│ └── www
├── package.json
├── public
│ ├── images
│ ├── javascripts
│ └── stylesheets
│ └── style.css
├── node_modules
│ └── jquery
│ └── dist
│ └── jquery.min.js
├── routes
│ ├── index.js
│ └── users.js
└── views
├── error.jade
├── index.jade
└── layout.jade
In my index.jade file I try to reuse jquery.min.js from node_modules instead of using a URL to a web source:
doctype html
html
head
link(rel='stylesheet', href= '/stylesheets/style_monit.css')
body
#container
.row
.col-sm-4(style='background-color:lavender;') .col-sm-4
.col-sm-4(style='background-color:lavenderblush;') .col-sm-4
.col-sm-4(style='background-color:lavender;') .col-sm-4
.col-md-4
textarea#inData.form-control(style='background:#222; color:#00ff00;', rows='8')
script(type='text/javascript' src='../node_modules/jquery/dist/jquery.min.js')
The CSS file loads fine, but in the Chrome console I get the error:
GET http://localhost:3000/node_modules/jquery/dist/jquery.min.js
NOT FOUND
I believe the problem is that the node_modules directory is private and shouldn't be exposed to the client; Express only serves static files from the directories you register, which in the generated app is just public. See this Stack Overflow answer for more information.
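A common fix (a sketch, assuming the generator's default app.js; the /vendor mount path is my own choice, not an Express convention): add a second express.static mount that exposes only jQuery's dist folder, not all of node_modules.

```javascript
// app.js (excerpt) -- hypothetical extra static mount for jQuery.
const express = require('express');
const path = require('path');

const app = express();

// Default generator setup: everything in public/ is served at the site root.
app.use(express.static(path.join(__dirname, 'public')));

// Additionally expose ONLY jquery/dist, keeping the rest of node_modules private.
app.use('/vendor/jquery',
  express.static(path.join(__dirname, 'node_modules/jquery/dist')));
```

The template would then use an absolute URL, e.g. script(src='/vendor/jquery/jquery.min.js'), instead of a relative ../node_modules path.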