I want to reverse or unbundle a bundle.js.
Currently I am loading the bundle.js in my browser (Chrome).
Chrome detects the sourcemap and shows me a nice structure of the
full application based on the bundle. The application is bundled
using webpack and is a Flux/React application.
Is there a way to generate all these files out of the bundle so I can
easily browse the bundle based on the application structure?
This is for a reverse engineering project to get the application source from
an existing bundle.
So in Chrome I can nicely browse the whole application using the DevTools
Sources panel and see all the individual files. But I would like to create
that exact same structure on my local drive.
I tried a tool like debundle, but I cannot find a way to feed the sourcemap into that conversion.
So can I easily unbundle an existing bundle.js if:
Sourcemap is available
Chrome can easily show me the structure and individual files
Bundle is not minified or scrambled.
Bundle is created using webpack
I found shuji to be a good option: you just provide it the path to the sourcemap. It unbundled an example bundle I made with Babel and webpack perfectly. It doesn't preserve folder structure, though; all of the files just end up in one folder.
Related
I am building a typescript/javascript package that will contain several JSON files. I do not want those JSON files included in the bundle that webpack outputs. I do want those files included in the output folder of the bundled javascript (copied from the node_module directory). This would be similar to including images.
I would like to create directives that explain to webpack what to do vs writing documentation in hopes that somebody reads it and does it correctly.
I know that copy-webpack-plugin will do what I need to do, but not sure how to set up this directive.
Is it even possible?
So
MyPackage has JSON files
Another developer uses my npm package
The developer uses webpack in their project
The developer's webpack bundles the JavaScript, excludes my JSON files from the bundle, but copies them to the output directory.
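For the copy-webpack-plugin route mentioned above, a sketch of the consumer-side config might look like this. The package name `my-package` and the `data` output folder are hypothetical, and the `patterns` object syntax is from copy-webpack-plugin v6+ (older versions take an array directly):

```javascript
// webpack.config.js (in the consuming project)
const path = require('path');
const CopyPlugin = require('copy-webpack-plugin');

module.exports = {
  // ...entry/output as usual...
  plugins: [
    new CopyPlugin({
      patterns: [
        {
          // copy the package's JSON files into the output dir,
          // without them ever entering the JS bundle
          from: path.resolve(__dirname, 'node_modules/my-package/bin'),
          to: 'data',
        },
      ],
    }),
  ],
};
```

The limitation is exactly the one the question raises: this directive lives in the consumer's config, so the package author can only document it, not enforce it.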
Figured it out using package.json.
Create a folder called bin (or whatever) at the same level as src.
Copy in the contents that need to be included in the package but not compiled or bundled into JavaScript, such as JSON files.
Update package.json and add the following entry:
"files": [
"bin"
],
Now when publishing, the npm package will contain the bin directory. When a consuming project builds with webpack, those files get picked up and deployed as part of the deployment, but not into the JavaScript bundle.
From there, your JavaScript should reference the files the same way it would read any file, whether on the server or client side.
The standard approach when packaging code for npm is to place transpiled or bundled code in a separate lib, dist or build directory and to set the package.main property (and module/browser) to point to the built files, which makes the existence of the build directory transparent when doing bare imports (without a path). However, if deep imports are needed (importing specific modules not referenced in package.json properties), it means that the build directory needs to be included in the path like so: require('package/lib/some-module'), which looks bad.
Are there any examples of an approach that would at the same time:
Support deep imports without the build directory in the path
Support npm-link and tools like Lerna
Still place the built files in a separate directory for tidiness
Every approach I've tried or seen fails to fit one of the criteria; for example, copying package.json into the build directory (lib/dist) allows publishing or linking the package from the build directory, but tools like Lerna don't support this approach since they expect the package.json to be in the root of the package instead of the build directory, so only points 1 and 3 are met.
Placing sources in the root of the package with the transpiled or built files next to them allows deep imports and is also supported by Lerna, but it doesn't meet the third criterion (tidiness).
Having import maps would solve this issue, but I'm not sure if any tool supports import maps yet.
The question is: is there some approach that I've missed that could meet all three listed points?
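One approach worth noting: newer Node.js versions (and bundlers that honor it) support subpath patterns in the package.json `exports` field, which can hide the build directory from deep-import paths while keeping package.json at the package root. A sketch, with `my-package` and `lib` as hypothetical names:

```json
{
  "name": "my-package",
  "main": "./lib/index.js",
  "exports": {
    ".": "./lib/index.js",
    "./*": "./lib/*.js"
  }
}
```

With this, `require('my-package/some-module')` resolves to `lib/some-module.js`, so all three criteria could be met: deep imports without `lib` in the path, built files kept in a separate directory, and package.json staying at the root where Lerna and npm-link expect it. Whether this works for you depends on your minimum supported Node version and tooling.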
I have a project that uses source files external to the project. Effectively, there is the actual project source code (a TypeScript/Angular 2 application; let's call it the 'core' stuff), a generic web application that is meant to be the base code consuming these external source files.
The external files include additional stuff: SCSS files, images, even additional JS. The way I want this to work is that webpack copies these external files from any source directory (this is critical; it is not part of the core project) to a local .tmp directory. The files in the .tmp directory are worked on along with the core src files to generate the prod output.
I can't figure out how to add these additional external source files to the watch list. Effectively what I'm looking to do is watch that directory and as things change, it re-copies the affected files to the local .tmp directory and triggers a recompile.
Presently I have to restart webpack, and I have a very ugly workaround using Grunt to watch the additional files. It's nasty, but these kinds of workarounds have historically been what I've had to do with webpack.
Does anyone have a better solution? Ideally I'd like to not have to mix grunt with webpack. Webpack should be able to do this, but its hard to know whether there's an existing plugin for this or what the best approach would be.
Also, please spare the "look for it on google" or "read the docs" comments. I've combed through it all, hard, and have not found anything.
Thanks in advance.
As of now, webpack doesn't watch external files out of the box; you need a plugin for that.
The basic idea is to have a file-watcher module such as chokidar or watch listening for file changes, and when a change occurs, re-trigger the webpack compilation. Webpack plugins can access the compilation object, and you need to hook into a compiler phase such as 'emit' or 'after-emit'.
This webpack plugin solves exactly that problem: https://www.npmjs.com/package/filewatcher-webpack-plugin
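A minimal sketch of the hook-based idea, assuming webpack 4+ where `compilation.fileDependencies` is a Set: registering the external files as file dependencies makes webpack's own watcher re-trigger compilation when they change. The plugin name and watched paths are hypothetical; copying changed files into .tmp would still happen in your build step or a copy plugin.

```javascript
// Register external files as dependencies of the compilation so that
// webpack's watch mode picks up changes to them (webpack 4+ API).
class ExtraWatchPlugin {
  constructor(files) {
    this.files = files; // absolute paths to external files
  }
  apply(compiler) {
    compiler.hooks.afterCompile.tap('ExtraWatchPlugin', (compilation) => {
      for (const file of this.files) {
        compilation.fileDependencies.add(file);
      }
    });
  }
}

module.exports = ExtraWatchPlugin;
```

In webpack.config.js you would then add something like `plugins: [new ExtraWatchPlugin([path.resolve('../external/theme.scss')])]`, with no Grunt involved.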
My application is using JSPM and SystemJS for module loading and is using angular.
My config.js file has angular map like:
"angular": "github:angular/bower-angular#1.5.8"
So when I do import angular from 'angular', I am getting the angular.js file from the Path specified in config.js file. That's good.
Now the requirement is that I want to use minified third-party JavaScript files (angular.min.js) in the app. But there are no minified files in the jspm registry.
So the initial loading time of my application is high because of the many large files (e.g. angular.js, browser.js) that take too much time to load.
I know, I can do a jspm bundle to minify all dependency files recursively which includes vendors' files also. But my questions are:
1 - Is it possible to use the vendor's minified file (angular.min.js) directly with JSPM? It is better to use the vendor's minified file rather than minifying it ourselves, isn't it?
2 - If the above is not possible, then how can I bundle only my application-specific files and still be able to use (and bundle separately) the minified angular.js file?
What is the recommended approach here?
In your config.js file you can map any file to a name and have it imported by SystemJS in the browser.
So you could potentially do something like
"angular": "jspm_packages/...../angular.min" (The path to your file)
However, the recommended approach would be to bundle and minify all your vendor files and, as you mentioned, bundle your application-specific files separately.
You can do this with something like Gulp or Grunt to generate the two files.
There are lots of examples online of how to do this but here is one to get you started
https://blog.dmbcllc.com/using-gulp-to-bundle-minify-and-cache-bust/
Once you have generated both files, you add them to your HTML page via a script tag:
<script src="Your File Path"></script>
Option 2 is the preferred approach, and I would recommend spending the time to get it set up; you only have to do it once to get your head around it.
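For what it's worth, jspm itself can also produce the two bundles directly, using its bundle arithmetic to subtract the vendor modules from the app bundle. A sketch, where the module names are hypothetical and should be checked against your config.js mappings (see `jspm bundle --help` for the exact syntax of your jspm version):

```shell
# vendor bundle: angular (and any other vendors), minified
jspm bundle angular vendor-bundle.js --minify

# app bundle: your entry point minus the vendor modules, minified
jspm bundle app/main - angular app-bundle.js --minify
```

This keeps everything inside the jspm toolchain instead of adding Gulp or Grunt, at the cost of minifying angular yourself rather than using angular.min.js.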
I'm working on a project which uses js source files from multiple directories and compiles them into a common dist/ directory which is used in production. One way I can test my changes to the js code would be to make the changes into the source code and reinstall the entire project to generate the new dist/ directory. Is there an easier and more practical way to do this?
As I like my production and development environments to be (mostly) equal, I use source maps for this issue. This is the way I usually build my JS:
JS Hint
Generate Source maps
Concat JS into one file
Uglify
I do this by using gulp and some plugins. Those shouldn't be hard to find.
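A sketch of those four steps as a gulpfile, assuming gulp 4 and the gulp-jshint, gulp-sourcemaps, gulp-concat, and gulp-uglify plugins; the paths and task name are hypothetical:

```javascript
// gulpfile.js: lint, collect source maps, concat, uglify
const gulp = require('gulp');
const jshint = require('gulp-jshint');
const sourcemaps = require('gulp-sourcemaps');
const concat = require('gulp-concat');
const uglify = require('gulp-uglify');

function scripts() {
  return gulp
    .src('src/**/*.js')
    .pipe(jshint())                    // 1. JS Hint
    .pipe(jshint.reporter('default'))
    .pipe(sourcemaps.init())           // 2. start recording source maps
    .pipe(concat('app.min.js'))        // 3. concat into one file
    .pipe(uglify())                    // 4. uglify
    .pipe(sourcemaps.write('.'))       // emit app.min.js.map next to the bundle
    .pipe(gulp.dest('dist'));
}

exports.scripts = scripts;
```

The emitted .map file is what lets the browser show the readable sources in production while only the small minified file is served.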
The benefits of this approach are:
Serving a small JS file
No difference between dev and prod
Readable JS source for debugging
No redeploy needed