My web project serves static web pages and scripts. There is no preprocessing done at all. All changes are done in the client.
It has a main page that lists some other pages. When the user clicks a link, jQuery UI will load the associated HTML page and any linked JavaScript/CSS files.
This works great, and gives us the flexibility to add and remove pages with ease. The problem is when we want to debug the loaded JS and the browser appears not to know about it.
It took me a while to find out about source maps, and then to find out that the tooling is all geared towards framework projects like Angular and React.
We don't want that in this project. Just basic HTML & JS that we can plug in and reload. I realize we may need to run an external command to generate the source maps, but it must be a free-standing tool - no NPM or frameworks.
It's an internal web project, so security/privacy is not a concern. We want the clients to see the source code if they need to.
I know there are a lot of questions about JS source maps, but every single one that I've found assumes the use of some framework's tooling. We have no framework and do not want one in this project.
Any suggestions on how we can generate the source maps we need, or do you know of any alternative to debug simple JS loaded via jQuery?
First and foremost, you do not need to use Angular/React for sourcemaps to work. These are just a common use case.
Secondly, NPM is exactly what it says it is: a package manager. So you don't need NPM either.
What you need is a build process. You're quite clear that you don't want to minify the JS, but you do want source maps. This is a common configuration used to debug JS, and is typically accomplished by "building" or "uglifying" the code with all of the optimizations disabled.
You could likely avoid NPM entirely if you were willing to use the Closure Compiler, but that is a can of worms I'd suggest you avoid.
Instead, I suggest installing Uglify* globally** (per dev machine) with NPM. This is a "once per machine" step.
npm install uglify-js -g
**: Hopefully this sidesteps your NPM-less requirement. I did experiment with cloning the Uglify repo directly, but even then you'd need to get it running, and to do that, at a minimum, you'd want to install its dependencies with NPM. I'd love to be proven wrong about this, but I figured it was very unrelated to this post.
Then write a build script that uses it. I've attempted to gather the parts for you here:
File: gen-map.sh
#!/usr/bin/env bash
uglifyjs file1.js --config-file gen-map.json \
    -o file1.min.js \
    --source-map "root='http://foo.com/src',url='file1.min.js.map'"
cat file1.min.js
File: gen-map.json
{
    "compress": false,
    "output": {
        "beautify": true
    },
    "sourceMap": {
        "content": "content from file1.js.map",
        "url": "file1.js.map"
    }
}
File: file1.js
var b = function() {
    console.log('b');
};
function c() {
    console.log('c');
};
console.log('a');
b();
c();
(function() {
    console.log('d');
})();
File: file1.min.js
var b = function() {
    console.log("b");
};
function c() {
    console.log("c");
}
console.log("a");
b();
c();
(function() {
    console.log("d");
})();
//# sourceMappingURL=file1.min.js.map
File: file1.min.js.map
{"version":3,"sources":["file1.js"],"names":["b","console","log","c"],"mappings":"AAAA,IAAIA,IAAI;IACNC,QAAQC,IAAI;;;AAGd,SAASC;IACPF,QAAQC,IAAI;;;AAGdD,QAAQC,IAAI;;AACZF;;AACAG;;CACA;IACEF,QAAQC,IAAI;EADd","sourceRoot":"http://foo.com/src"}
*: Uglify-es if you're using ES6 features.
After that, the only thing left to do would be to update the paths, filenames, and actual script tags. With this config you must still serve the .min.js file, although it seems possible that manually tagging your JS file to point to the map might work...
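For illustration, here's a minimal sketch of how a jQuery page loader might point at the built file; the loadPage helper and paths are hypothetical, not from the question:
// hypothetical loader: fetch a page fragment, then inject its built script
// through a real <script> tag so the browser picks up its sourceMappingURL
function loadPage(name) {
    $('#content').load('/pages/' + name + '.html', function() {
        var script = document.createElement('script');
        script.src = '/pages/' + name + '.min.js'; // the built, mapped file
        document.body.appendChild(script);
    });
}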
With this config, you'll need to keep your built files up to date by running:
🐚 ./gen-map.sh
Doing this with npm and gulp would be simpler but, if you don't mind another package, there are two generic file-change watchers that I can suggest:
Nodemon:
🐚 nodemon --exec ./gen-map.sh
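Alternatively, a small nodemon.json next to the script keeps the command line short. A sketch; the watch and ignore globs are assumptions for this layout:
{
  "watch": ["."],
  "ext": "js",
  "ignore": ["*.min.js", "*.map"],
  "exec": "./gen-map.sh"
}
Ignoring the generated .min.js and .map files keeps nodemon from restarting on its own output.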
entr:
🐚 ls *.js | entr ./gen-map.sh
(entr takes the list of files to watch on stdin.)
Related
I have been trying to use Closure Compiler to optimize and bundle a project for two weeks now.
The project is originally written in TypeScript, so I wanted to use Tsickle to transpile to JS that could easily be fed to the Closure Compiler Java app. When I was finally able to do that, I stumbled on problems with external Node modules. I tried all the solutions I was able to find in Google Groups, SO and in the Closure Compiler repo. Nothing worked.
Not wanting to let this go, I decided to use gulp. This is my gulpfile.js; I tried keeping it as simple as possible.
const gulp = require("gulp");
const closureCompiler = require("google-closure-compiler").gulp();

gulp.task("js-compile", function () {
    return gulp
        .src("./src/**/*.js", { base: "./" })
        .pipe(
            closureCompiler(
                {
                    compilation_level: "ADVANCED",
                    warning_level: "VERBOSE",
                    jscomp_off: "checkVars",
                    js_output_file: "output.min.js"
                },
                {
                    platform: ["native", "java", "javascript"]
                }
            )
        )
        .pipe(gulp.dest("./dist/js"));
});
There are way too many input files for me to post them. This time I used TSC instead of Tsickle to transpile.
The error I get when running gulp is:
[JSC_REDECLARED_VARIABLE_ERROR] Illegal redeclared variable: *nameofvariable*
I have this for almost every file in my project, even if the name is not repeated twice in the same file.
You may try tscc. It uses tsickle under the hood and provides some solutions to external node_modules problems.
Try --env CUSTOM or --isolation_mode IIFE. I ran into this when compiling code that conflicted with browser interfaces like the DOM's Node.
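In the gulpfile above, those would be passed like the other flags; a sketch, where only the two new options differ from the posted config:
closureCompiler(
    {
        compilation_level: "ADVANCED",
        warning_level: "VERBOSE",
        jscomp_off: "checkVars",
        env: "CUSTOM",          // skip the default browser externs (Node, etc.)
        isolation_mode: "IIFE", // wrap the output so globals can't collide
        js_output_file: "output.min.js"
    },
    {
        platform: ["native", "java", "javascript"]
    }
)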
I'm trying to build a new project with ES6 modules without bundling. I still want to use Babel 7 to translate TypeScript and JSX to JS. I find it hard to figure out how to set up a development server for it. I couldn't find any kind of "babel-dev-server" that works similar to webpack-dev-server (hot module reloading, browser-sync, file watcher).
One possibility would be to use Browsersync as a static server on e.g. dist and run something like babel src --out-dir dist --watch in parallel. But this excludes hot reloading and seems a bit clumsy to me. Besides, it would still be useful for build and dev steps if you could give the JS files a hash to control caching better. Or can I configure a build tool like webpack so that it doesn't perform bundling but still performs some transformations (like putting the hashes in the filenames in imports)?
Prototyping way
A very simple way to do this is to treat the server and the transpiling as separate steps.
You could use a standalone version of Babel as the first script that you load, so you can write JSX inside your HTML document or JavaScript files without precompiling them.
Simply add one of the CDN links from https://cdnjs.com/libraries/babel-standalone/ as a script, and mark the scripts you want transpiled with type="text/babel", like so:
<html>
  <head>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/babel-standalone/7.0.0-beta.3/babel.min.js"></script>
    <script type="text/babel" src="/your/jsx/here.js"></script>
    <script type="text/babel">
      // or here
    </script>
  </head>
  <body>
    <div id="application"></div>
    <noscript>This app needs javascript enabled in order to run.</noscript>
  </body>
</html>
This would allow you to really quickly prototype things using any web server that watches files. You can do this using any task-runner plugin (e.g. for Grunt or Gulp), or, if you are using Visual Studio Code, have a look at the Live Server extension.
When you are moving to production grade you might not want to include the entire babel library. See the other two approaches.
Webpack way
You're asking how to use webpack without bundling, which can be done using the file-loader plugin to process every file separately, using a glob pattern. Do make sure this is really what you need, though. If all you want is to debug your code and relate it back to the original file after compiling, all you need is a standard webpack configuration with bundling and source maps.
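For reference, that standard configuration is small; a minimal sketch, with the entry and output names as placeholders:
// webpack.config.js
module.exports = {
    entry: './src/main.js',
    output: {
        path: __dirname + '/dist',
        filename: 'bundle.js'
    },
    devtool: 'source-map', // emit bundle.js.map so the debugger shows the original files
    module: {
        loaders: [
            { test: /\.jsx?$/, exclude: /node_modules/, loader: 'babel-loader' }
        ]
    }
};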
Taskrunner way
If you want even more control over how each file is processed, you can use a task runner to do the compile step for you. Below is a simplified example configuration for the task runner Gulp (https://gulpjs.com/).
gulpfile.js
const gulp = require('gulp');
const watch = require('gulp-watch');
const webpackStream = require('webpack-stream');
const webpack = require('webpack');
const eslint = require('gulp-eslint');

gulp.task('watch', function() {
    return watch('src/**.js', ['compile']);
});

gulp.task('lint', function() {
    return gulp.src(['src/*.js', 'src/*/*.js'])
        .pipe(eslint({
            parser: 'babel-eslint',
            parserOptions: {
                ecmaFeatures: {
                    jsx: true
                },
                sourceType: 'module'
            }
        }))
        .pipe(eslint.format())
        .pipe(eslint.failAfterError());
});

gulp.task('compile', ['lint'], function() {
    return gulp.src('src/main.js')
        .pipe(webpackStream({
            output: {
                filename: 'main.js',
                libraryTarget: 'commonjs2',
                sourceMapFilename: 'main.js.map',
            },
            plugins: [],
            module: {
                loaders: [
                    {
                        test: /\.js$/,
                        loader: 'babel-loader',
                        query: {
                            presets: [
                                require.resolve('babel-preset-es2015'),
                                require.resolve('babel-preset-stage-0'),
                            ],
                        },
                    },
                ],
            },
        }), webpack)
        .pipe(gulp.dest('dist/'));
});
This example file can be run using gulp watch. It'll watch the files for a change and, when one occurs, trigger the other tasks.
I only had an example with webpack, but you can replace it with any other compiler component, or even write your own compile step if you want (you probably don't).
This way you have exact control over every step your files go through. Most of this (and more) can also be achieved the Webpack way. However, that approach has the downside of inserting all of webpack's boilerplate on top of each processed file when each file is built as a separate bundle. Ultimately, something could probably be done about that with the CommonsChunkPlugin.
With the latest release of Snowpack (formerly @pika/web) this should be possible now!
From their website:
TL;DR - With Snowpack you can build modern web apps (using React, Vue, etc.) without a bundler (like Webpack, Parcel, Rollup). No more waiting for your bundler to rebuild your site every time you hit save. Instead, every change is reflected in the browser instantly.
And their "How it Works":
Instead of bundling on every change, just run Snowpack once right after npm install.
Snowpack re-installs your dependencies as single JS files to a new web_modules/ directory. It never touches your source code.
Write code, import those dependencies via an ESM import, and then run it all in the browser.
Skip the bundle step and see your changes reflected in the browser immediately after hitting save.
Keep using your favorite web frameworks and build tools! Babel & TypeScript supported.
Check https://www.snowpack.dev/ for more information; they have done a great job with their documentation, and it looks really promising!
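As a small illustration of that workflow, an import before and after Snowpack might look like this (assuming React is the dependency in question):
// before: a bare specifier that only a bundler can resolve
// import React from 'react';

// after running snowpack: a URL the browser can fetch as an ES module
import React from '/web_modules/react.js';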
With webpack and source maps, it shouldn't matter that it changes your code. While this can be a challenge to set up initially, once you get it working you can look at your original source code in the browser debugging tools exactly as they appear to you on disk. The VS Code editor also does a good job of supporting this feature, allowing you to set breakpoints and look at the values of variables directly in your editor without having to use the browser developer tools.
However, if you are still set on trying to get this to work with your original source files, then you are right that your ES6 code should just work in most modern browsers.
For live reload you could check out the npm livereload package.
Or you could roll your own and figure out how webpack-dev-server does it. They use the chokidar npm package to watch the file system for changes and then notify the browser via web sockets. You could probably throw something together that's similar with a little effort.
Here is how webpack-dev-server initiates it:
const watcher = chokidar.watch(watchPath, options);
watcher.on('change', () => {
    this.sockWrite(this.sockets, 'content-changed');
});
Obviously there is some JavaScript that runs in the browser waiting on a websocket for that message.
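A rough sketch of such a DIY reloader, assuming the chokidar and ws packages; the watched path and port are placeholders:
const chokidar = require('chokidar');
const WebSocket = require('ws');

// browser tabs connect here; a tiny client script reloads on any message
const wss = new WebSocket.Server({ port: 35729 });

chokidar.watch('dist/**/*.js').on('change', (path) => {
    console.log(path + ' changed, notifying browsers');
    wss.clients.forEach((client) => {
        if (client.readyState === WebSocket.OPEN) client.send('content-changed');
    });
});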
You could use a Webpack plugin like Emit All.
I want to minimize the number of HTTP requests from the client to load scripts in the browser. This is going to be a pretty general question but I still hope I can get some answers because module management in javascript has been a pain so far.
Current situation
Right now, in development, each module is requested individually from the main html template, like this:
<script src="/libraries/jquery.js"></script>
<script src="/controllers/controllername.js"></script>
...
The server runs on Node.js and sends the scripts as they are requested.
Obviously this is the least optimal way of doing so, since all the models, collections, etc. are also separated into their own files which translates into numerous different requests.
As far as research goes
The libraries I have come across (RequireJS using AMD and CommonJS) can request modules from within the main .js file sent to the client, but require a lot of additional work to make each module compliant with each library:
;(function(factory){
  if (typeof define === 'function' && define.amd) define([], factory);
  else window.moduleName = factory();
}(function(){
  // Module code
  return moduleName;
}));
My goal
I'd like to create a single file on the server that 'concatenates' all the modules together. If I can do so without having to add more code to the already existing modules that would be perfect. Then I can simply serve that single file to the client when it is requested.
Is this possible?
Additionally, if I do manage to build a single file, should I include the open source libraries in it (jQuery, Angular.js, etc.) or request them from an external cdn on the client side?
What you are asking to do, from what I can tell, is to concatenate your JS files into one file, and then in your main.html you would have this:
<script src="/pathLocation/allMyJSFiles.js"></script>
If my assumption is correct, then the answer would be to use one of the following two tools:
Gulp (https://gulpjs.com/) or Grunt (https://gruntjs.com/).
I use Gulp.
You can either use gulp on a case-by-case basis, which means calling gulp from the command line to execute gulp code, or use a watch to do it automatically on save.
Setting aside getting gulp to work and including the gulp modules you need, I will only show the small part of what I use that answers your question.
In my gulp file I would have something like this
var gulp = require('gulp');
var concat = require('gulp-concat');
// ...maybe more.
Then I list the file paths that need to be reduced into one file.
var onlyProductionJS = [
    'public/application.js',
    'public/directives/**/*.js',
    'public/controllers/**/*.js',
    'public/factories/**/*.js',
    'public/filters/**/*.js',
    'public/services/**/*.js',
    'public/routes.js'
];
and I use this info in a gulp task like the one below
gulp.task('makeOneFileToRuleThemAll', function(){
    return gulp.src(onlyProductionJS)
        .pipe(concat('weHaveTheRing.js'))
        .pipe(gulp.dest('public/'));
});
I then run the task in my command line by calling
gulp makeOneFileToRuleThemAll
This call runs the associated gulp task, which uses 'gulp-concat' to gather all the files into one new file called 'weHaveTheRing.js', and writes that file to the destination 'public/'.
Then just include that new file in your main.html:
<script src="/pathLocation/weHaveTheRing.js"></script>
As for including all your files into one file, including your vendor files, just make sure that your vendor code runs first. It's probably best to keep those separate unless you have a sure-fire way of getting your vendor code to load first without any issues.
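If you do bundle vendors in, one hedged way to guarantee they load first is to prepend them to the source list; the vendor paths here are made up:
// vendors first, then the app files from onlyProductionJS above
var allProductionJS = [
    'public/vendor/jquery.js',
    'public/vendor/angular.js'
].concat(onlyProductionJS);

gulp.task('makeOneFileIncludingVendors', function(){
    return gulp.src(allProductionJS)
        .pipe(concat('weHaveTheRing.js'))
        .pipe(gulp.dest('public/'));
});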
UPDATE
Here is my gulp watch task.
gulp.task('startTheWatchingEye', function () {
    gulp.watch(onlyProductionJS, ['makeOneFileToRuleThemAll']);
});
Then I start up my server like this (yours may differ)
npm start
// in a different terminal window I then type
gulp startTheWatchingEye
NOTE: you can use ANY movie or show reference you wish! :)
Now just code it up, every time you make a change in the specified files GULP will run your task(s).
If you want to, say, run Karma as your test runner...
add the following to your gulp file
var karma = require('karma').server;

gulp.task('karma', function(done){
    karma.start({
        configFile: __dirname + '/karma.conf.js'
    }, done);
});
Then add the karma task to the watch I stated above, like this...
gulp.task('startTheWatchingEye', function(){
    gulp.watch(onlyProductionJS, ['makeOneFileToRuleThemAll', 'karma']);
});
ALSO
Your specific setup may require a few more gulp modules. Usually, you install Gulp globally, as well as each module, and then use them in your various projects. Just make sure that your project's package.json lists the gulp modules you need under devDependencies.
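For example, a package.json section for the tasks above might look something like this (the versions are illustrative, not prescriptive):
{
  "devDependencies": {
    "gulp": "^3.9.1",
    "gulp-concat": "^2.6.1",
    "karma": "^0.13.22"
  }
}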
There are different articles on whether to use Gulp or Grunt. Gulp was made after Grunt with a few additions that Grunt was lacking. I don't know if Grunt lacks them anymore. I like Gulp a lot though and find it very useful with a lot of documentation.
Good luck!
I'd like to create a single file on the server that 'concatenates' all the modules together. If I can do so without having to add more code to the already existing modules that would be perfect.
Sure you can. You can use Grunt or Gulp to do that, more specifically grunt-contrib-concat or gulp-concat
Here's an example of a Gruntfile.js configuration to concat every file under a js directory:
grunt.initConfig({
  concat: {
    dist: {
      files: {
        'dist/built.js': ['js/**/*.js'],
      },
    },
  },
});
Also, you can minify everything after concatenating, using grunt-contrib-uglify.
Both libraries support source maps so, in the case a bug gets to production, you can easily debug.
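For instance, grunt-contrib-concat can emit a map alongside the bundle via its sourceMap option; a minimal sketch:
grunt.initConfig({
  concat: {
    options: {
      sourceMap: true // writes dist/built.js.map next to the bundle
    },
    dist: {
      files: {
        'dist/built.js': ['js/**/*.js'],
      },
    },
  },
});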
You can also minify your HTML files using grunt-contrib-htmlmin.
There's also an extremely useful library called grunt-usemin. Usemin lets you use HTML comments to "control" which files get minified (so you don't have to add them manually).
The drawback is that you have to explicitly include them in your HTML via script tags, so no async loading via JavaScript (with RequireJS, for instance).
Additionally, if I do manage to build a single file, should I include the open source libraries in it (jQuery, Angular.js, etc.) or request them from an external cdn on the client side?
That's debatable. Both have pros and cons. Concatenating vendors ensures that if, for some reason, the CDN isn't available, your page works as intended. However, the file served is bigger, so you consume more bandwidth.
In my personal experience, I tend to include vendor libraries that are absolutely essential for the page to run such as AngularJS for instance.
If I understand you correctly, you could use a task runner such as Grunt to concatenate the files for you.
Have a look at the Grunt Concat plugin.
Example configuration from the docs:
// Project configuration.
grunt.initConfig({
  concat: {
    dist: {
      src: ['src/intro.js', 'src/project.js', 'src/outro.js'],
      dest: 'dist/built.js',
    }
  }
});
Otherwise, as you have stated, a 'module loader' system such as RequireJS or Browserify may be the way to go.
I've come across an interesting problem in NodeJS that pops up now and then at work, and I haven't been able to figure it out.
I've written a back-end in CoffeeScript, which I then compile with grunt-contrib-coffee into JavaScript in a ~/bin directory. I also include a library that I privately host on Bitbucket with the appropriate private keys, and install through npm. This library too is written in CoffeeScript.
Usually I'm able to include this library in JavaScript without any headaches, using a simple require just like I would for any other library. However, occasionally one of the servers that uses the back-end gets updates, and it stops working. When I go check the error, it's always the same - 'require' passes, but instead of loading the actual library in JavaScript, it returns an empty object ({}). Running the code in CoffeeScript still works, but regardless of what I do - recompile, reinstall all dependencies, remove and clone the repository - the problem persists.
I've run out of ideas of what it might be myself, so I'm hoping that someone here has come across the problem before and might be able to point me in the right direction.
In the library package.json:
{
  "name": "graph-engine",
  "main": "./lib/graph"
}
In the library's graph.coffee
class Graph
  constructor: () ->
    # Perform init

module.exports = Graph
Then in the app's package.json:
{
  "dependencies": {
    "graph-engine": "git+ssh://git@bitbucket.org:<team>/graph-engine.git"
  }
}
Finally, in the app itself:
GraphEngine = require "graph-engine"
engineInstance = new GraphEngine()
This works fine in coffeescript, but when compiling the app using grunt with the following setup for grunt-contrib-coffee:
coffee:
  glob_to_multiple:
    expand: true
    flatten: false
    cwd: 'src'
    src: ['**/*.coffee']
    dest: 'bin'
    ext: '.js'
It fails to load the library correctly when running the compiled application, instead returning an empty object. Again, I'd like to emphasise that this does not always happen, and as such I didn't include any code or json files as I believed that it was unrelated.
Although the exact reason for the randomness of the behaviour eludes me to this day, I have found that it is related to these libraries being written in CoffeeScript.
It seems that, depending on the load order, occasionally third-party libraries would register CoffeeScript and my own libraries would load correctly, whereas other times these libraries would load after my own libraries had loaded, resulting in an inability to load CoffeeScript. The solution therefore turned out to be fairly straightforward - register CoffeeScript yourself. This can be done by putting the following at the start of the Node.js app:
require('coffee-script/register')
Refer to the documentation for further information.
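Applied to the app above, that means registering before the first require of the CoffeeScript library; a sketch of the compiled app's entry point:
// register the CoffeeScript compiler before requiring any .coffee modules
require('coffee-script/register');

var GraphEngine = require('graph-engine'); // now resolves lib/graph.coffee correctly
var engineInstance = new GraphEngine();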
I'm using MVC5/Durandal and wondering what the recommended approach to bundling/minifying a Durandal application would be. I've seen docs on using Weyland but will be deploying to an Azure Website and don't see how to leverage this in my .NET-based deployment process. How can I go about configuring automated bundling/minification of my Durandal application when deploying to Azure?
I've spent a bit of time trying to optimize an AngularJS application for one of the biggest banks in Holland. Although it's no Durandal, this might still give you some ideas.
So what did we use for bundling and minification? Out-of-the-box bundling and minification from ASP.NET MVC (which is from the System.Web.Optimization namespace).
You need to get a couple of things in order to leverage this:
Organize your files
Organize your code files in a way that they can easily be bundled. We had a large tree structure under the /app folder in the web project. Something like:
- App
  |- Modules
  |  |- Common
  |  |  |- Directives
  |  |  |- Templates
  |  |  |- Filters
  |  |- User
  |  ...
  |- app.js
So the application skeleton was inside app.js and all the other JS files were required by the application. Point being: all SPA code is separated from vendor JavaScript files and the rest, of course.
Set up the bundling inside the bundle configuration
That's a breeze now, just do regular-old-bundling from your Global.asax.cs:
Make sure there's a line in the Application_Start() with:
BundleConfig.RegisterBundles(BundleTable.Bundles);
That calls into your BundleConfig class which only needs 1 bundle to pack up the whole /app folder:
bundles.Add(new ScriptBundle("~/bundles/app")
    .Include("~/app/app.js")
    .IncludeDirectory("~/app", "*.js", true));
We needed the app.js to load first - therefore we put it explicitly at the top. Don't worry, it will not be requested twice.
For bundling, only the sequence of files can be important. However, by including that file explicitly, we could control that, and it worked like a charm.
Minification
Now for minification we had to do some code changes. AngularJS can be used with different types of syntax - some of which can be minified, others give problems.
Example:
angular.module('myapp').controller(function($http,$scope) { ... });
cannot be minified safely. The minifier will change the name of $http to something shorter, after which the injector cannot do dependency injection anymore, since it only knows about things called $http and $scope, not the minified variable names.
So for Angular you need to use a different syntax:
angular.module('myapp').controller(['$http', '$scope', function($http,$scope) { ... }]);
With this, the injector will know that the first argument of the function is '$http' because that's the first string variable in the array. OK, but that's Angular and you're looking for Durandal.
I've heard that Durandal uses AMD, right? So within a module, minification shouldn't be a problem, because it should be smart enough. However, if you're using external things, you want to make sure everything still works. I've read here that you'll want to use the following syntax for your AMDs:
define("someModule", ["jquery", "ko"], function($,ko) { ... });
And that gave us a reduction of 80% of the requests and around the same number for the Javascript payload.
Added AngularJS bonus
This might not be of interest to you, but maybe for other readers. The reason we didn't get a 99% reduction of requests is because AngularJS uses something called 'directives'. These are like HTML templates. Those HTML templates still needed to be downloaded every time they were used.
They were also included in our /app folder - hence we had to add an IgnoreRoute in the RouteConfig:
routes.IgnoreRoute("app/{*pathInfo}");
I Googled, but couldn't find anything similar for Durandal. So Angular will go and get all of the small HTML files, but will first check its $templateCache. In case the HTML content is not in the cache, it goes out, downloads it, and places it in the cache, so it needs to be downloaded only once.
We, well I, wrote a T4 generator that outputs a JS file in which all the HTML files in the /app folder are added to the $templateCache. So the output would look like:
angular.module('myapp').run(function($templateCache) {
    // For all *.html files in the /app folder and its children
    $templateCache.put('...filename...', '...content of html file ...');
});
Because this JS file was inside the /app folder, it would immediately get bundled with the application, no more configuration required. This got our requests for the whole application down to just 1. Since the amount of HTML was quite small, it seemed to be faster to do one larger request than multiple smaller ones.
Point is: if Durandal has something similar and it will look for some templates, find the caching mechanism (because it will have one) and try to tap into that.
Controlling bundling and minification
I'll quote this site: http://www.asp.net/mvc/overview/performance/bundling-and-minification
Bundling and minification is enabled or disabled by setting the value of the debug attribute in the compilation element in the Web.config file. In the following XML, debug is set to true so bundling and minification is disabled.
<system.web>
    <compilation debug="true" />
</system.web>
So for your release build - this flag shouldn't be set and thus bundling + minification should happen.
But of course, you will want to test locally - you can either remove this from your web.config or override it with BundleTable.EnableOptimizations = true;
Deployment to Azure
Then you mention deployment to Azure. I don't know how this would be any different from deploying to a local server. We used web-deploy. Bundling and minification doesn't happen at build time, so there are no changes in the build process. Also, the optimization framework is deployed with the site - so nothing difficult for deployment either.
Maybe one thing though: you could consider adding the CDN for the libraries you are using:
bundles.Add(new ScriptBundle("~/bundles/jquery", "http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.7.1.min.js")
In case the CDN location of jQuery was already cached by the client's browser, you'll save another request.
Measuring the performance was easy: just open up the network tab in Chrome and reload the page (make sure it's not caching). Check the total number of requests and the total amount of data downloaded. Like I said: we saw a huge improvement.
Well, hope it helps or points you in the right direction.
The other answers are pretty complicated. I've just gone through this with a simple(r) approach here:
https://lifelibertycode.wordpress.com/2015/04/14/how-to-bundle-up-your-mvc-durandal-app/
The steps below:
Step 1: Install Node
Step 2: Install Gulp
$ npm install --global gulp
$ npm install --save-dev gulp
Step 3: Create your gulpfile.js
This should be at the root of your project, and should initially contain this:
var gulp = require('gulp');

gulp.task('default', function() {
    // place code for your default task here
});
Step 4: Install gulp-durandal
npm install gulp-durandal --save-dev
Step 5: Update your gulpfile.js
var durandal = require('gulp-durandal');

gulp.task('durandal', function(){
    durandal({
        baseDir: 'app',    // same as default, so not really required.
        main: 'main.js',   // same as default, so not really required.
        output: 'main.js', // same as default, so not really required.
        almond: true,
        minify: true
    })
    .pipe(gulp.dest('dir/to/save/the/output'));
});
Step 6: Add a post-build event to your project
if '$(Configuration)'=='Release' (
    cd $(ProjectDir)
    gulp durandal
)
Step 7: Add a pre-build event to your project
I needed this because occasionally gulp would hang when generating the new main-built.js on top of an existing version. So I just delete the old version before the build begins:
if '$(Configuration)'=='Release' (
    cd $(ProjectDir)/app
    del main-built.js
    del main-built.js.map
)
Now, when you build your project, you’ll generate a new main-built.js file each time that can be served down to your clients. Sweet.
At this point, you probably have some concerns.
How do I keep my files un-bundled when I’m debugging?
@if (HttpContext.Current.IsDebuggingEnabled) {
    <script type="text/javascript" src="~/Scripts/require.js" data-main="/App/main"></script>
} else {
    @Scripts.Render("~/Scripts/main-built")
}
Where ‘main-built’ is defined in your BundleConfig:
bundles.Add(new ScriptBundle("~/Scripts/main-built").Include(
"~/app/main-built.js"));
How do I bust cache when I have new stuff to ship?
If you’re using the above approach, bundling will take care of this for you. ASP.NET will detect a change to your main-built.js file and append a unique identifier to your bundles to bust the cache.
What if my client has downloaded my SPA, and then I ship an update. Won’t the (outdated) client-side code stick around until they refresh?
Yup. Unless you leverage build versioning to tell the client when it’s out of date, and then tell the user.
I happen to have written a blog post about this:
https://javascriptkicks.com/articles/4230
Hopefully that helps you out.