Has anyone here used Mark Story's Asset Compress (https://github.com/markstory/asset_compress/) plugin?
I've followed the installation instructions to the letter and have the plugin up and running, but it simply won't generate the combined JS files in the specified cache (cache_js) folder.
I'm using Cake 1.3 and v0.2 of AssetCompress (the latest available download from github).
The plugin has been placed in the app/plugins/asset_compress folder
Cache folders - cache_js and cache_css created in WEBROOT
JsMin and CssMin filters downloaded and added to app/vendors/JsMin and app/vendors/CssMin respectively
Config file setup to point to the cache folders and filters
Routes configured as per requirements
Debug mode set to 1
My config.ini:
[Javascript]
searchPaths[] = WEBROOT/js/
searchPaths[] = WEBROOT/js/jquery/
searchPaths[] = WEBROOT/js/jquery/plugins/
stripComments = true
cacheFilePath = WEBROOT/cache_js/
cacheFiles = false
filters[] = JsMin
[Css]
searchPaths[] = WEBROOT/css/
stripComments = true
cacheFilePath = WEBROOT/cache_css/
cacheFiles = false
filters[] = CssMin
Still no output in the cache folders.
Any ideas why?
Thanks,
m^e
I haven't used this plugin yet, but just a few comments:
- Sometimes minification can lead to hell (JavaScript errors), especially if you minify an already-minified version; see the sketch below.
- Minification strips comments and license headers, which can put you in violation of a library's license terms.
Personally, I prefer not to compress assets using plugins. A plugin in CakePHP is by definition a semi-application, not just a utility class (helper, component, behavior, or any vendor utility class).
Currently I compress the concatenated JavaScript files (and CSS files, respectively) in the AppController using just one function.
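To illustrate that first point, here is the classic concatenation/minification pitfall (a minimal sketch; the file names are hypothetical):
// a.js - relies on automatic semicolon insertion (no trailing semicolon)
var logA = function () { console.log('a') }

// b.js - starts with an IIFE
(function () { console.log('b') })()

// Concatenated (ASI inserts no semicolon before an opening parenthesis),
// the two files fuse into a single statement:
//   var logA = function () { ... }(function () { ... })()
// The function from a.js is invoked immediately with b's function as its
// argument: 'a' is logged, 'b' never runs, and calling the undefined
// return value throws a TypeError.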
Finally got it to work.
Turns out I was messing around with v0.2, which is what you get by default when you hit the DOWNLOAD button on the Asset Compress GitHub repository.
You have to check out the latest version from GitHub using a git client like msysGit (if you are on Windows), and then be extra careful about where you place the asset inclusion commands.
Here are the steps you need to take:
Place the contents of the download in a folder named asset_compress under your app's plugins folder.
Include the plugin as a helper (preferably in your app_controller.php)
public $helpers = array(
    'AssetCompress.AssetCompress',
);
In your layout file, place the asset inclusion commands, e.g.
$this->AssetCompress->script( filename );
Just before the point where you place echo $scripts_for_layout in your layout, place the statement,
echo $this->AssetCompress->includeJs();
...and you're good to go.
Cheers,
m^e
Related
I'm using MVC5/Durandal and wondering what the recommended approach to bundling/minifying a Durandal application would be. I've seen docs on using Weyland but will be deploying to an Azure Website and don't see how to leverage this in my .NET-based deployment process. How can I go about configuring automated bundling/minification of my Durandal application when deploying to Azure?
I've spent a bit of time trying to optimize an AngularJS application for one of the biggest banks in Holland. Although it's no Durandal, this might still give you some ideas.
So what did we use for bundling and minification? The out-of-the-box bundling and minification from ASP.NET MVC (which is in the System.Web.Optimization namespace).
You need to get a couple of things in order to leverage this:
Organize your files
Organize your code files in a way that they can easily be bundled. We had a large tree structure under the /app folder in the web project. Something like:
- App
  |- Modules
  |  |- Common
  |  |  |- Directives
  |  |  |- Templates
  |  |  |- Filters
  |  |- User
  |  ...
  |- app.js
So the application skeleton was inside app.js, and all the other JS files were required by the application. Point being: all SPA code is separated from the vendor JavaScript files and the rest, of course.
Set up the bundling inside the bundle configuration
That's a breeze now: just do regular old bundling from your Global.asax.cs.
Make sure there's a line in Application_Start() with:
BundleConfig.RegisterBundles(BundleTable.Bundles);
That calls into your BundleConfig class which only needs 1 bundle to pack up the whole /app folder:
bundles.Add(new ScriptBundle("~/bundles/app")
.Include("~/app/*.js")
.IncludeDirectory("~/app", "*.js", true));
We needed app.js to load first, therefore we put it explicitly at the top. Don't worry, it will not be requested twice.
For bundling, only the sequence of the files really matters; by including that file explicitly, we could control the order, and it worked like a charm.
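To see why the sequence matters here, a minimal Angular sketch (names hypothetical): Angular distinguishes between creating a module and retrieving it.
// app.js - *creates* the module; note the second argument (the dependency array)
angular.module('myapp', []);

// any other file - *retrieves* the existing module (no second argument)
angular.module('myapp').controller('SomeController', ['$scope', function ($scope) {
    // controller logic here
}]);

// If a controller file is bundled ahead of app.js, the retrieval call runs
// before the module exists and Angular throws a "module is not available" error.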
Minification
Now for minification we had to make some code changes. AngularJS can be written with different styles of syntax, some of which minify safely while others cause problems.
Example:
angular.module('myapp').controller('MyCtrl', function($http, $scope) { ... });
cannot be minified. The minifier will change the name of $http to something shorter, after which the injector cannot do dependency injection anymore, since it only knows about things called $http and $scope, not the minified variable names.
So for Angular you need to use a different syntax:
angular.module('myapp').controller('MyCtrl', ['$http', '$scope', function($http, $scope) { ... }]);
With this, the injector will know that the first argument of the function is '$http', because that's the first string in the array. OK, but that's Angular and you're looking for Durandal.
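As a side note on the same Angular issue (a standard Angular 1.x alternative, not necessarily what the project above used): the dependencies can also be annotated with an explicit $inject property, which likewise survives minification:
// Name the constructor, annotate it once, then register it.
function MyController($http, $scope) { /* ... */ }
MyController.$inject = ['$http', '$scope']; // string names are not renamed by the minifier
angular.module('myapp').controller('MyController', MyController);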
I've heard that Durandal uses AMD, right? So within a module, minification shouldn't be a problem, because the loader should be smart enough. However, if you're using external things, you want to make sure everything still works. I've read here that you'll want to use the following syntax for your AMD modules:
define("someModule", ["jquery", "ko"], function($,ko) { ... });
And that gave us a reduction of 80% in the number of requests, and around the same figure for the JavaScript payload.
Added AngularJS bonus
This might not be of interest to you, but maybe for other readers. The reason we didn't get a 99% reduction in requests is that AngularJS uses something called 'directives'. These are like HTML templates, and those HTML templates still needed to be downloaded every time they were used.
Those HTML templates were also included in our /app folder - hence we had to add an IgnoreRoute in the route config:
routes.IgnoreRoute("app/{*pathInfo}");
I Googled, but couldn't find anything similar for Durandal. So Angular will go and get all of the small HTML files, but it will first check its $templateCache. If the HTML content is not in the cache, it goes out, downloads it, and places it in the cache, so each template needs to be downloaded only once.
We, well I, wrote a T4 generator that outputs a JS file in which all the HTML files in the /app folder are added to the $templateCache. The output looks like:
angular.module('myapp').run(function($templateCache) {
/// For all *.html files in the /app folder and its children
$templateCache.put('...filename...', '...content of html file ...');
});
Because this .js file was inside the /app folder, it would immediately get bundled with the application, no extra configuration required. This got the number of requests for the whole application down to just one. Since the amount of HTML was quite small, it seemed faster to do one larger request than multiple smaller ones.
Point is: if Durandal has something similar and it goes looking for templates, find the caching mechanism (because it will have one) and try to tap into that.
Controlling bundling and minification
I'll quote this site: http://www.asp.net/mvc/overview/performance/bundling-and-minification
Bundling and minification is enabled or disabled by setting the value of the debug attribute in the compilation element in the Web.config file. In the following XML, debug is set to true, so bundling and minification is disabled.
<system.web>
<compilation debug="true" />
</system.web>
So for your release build - this flag shouldn't be set and thus bundling + minification should happen.
But of course, you will want to test locally - you can either remove this from your web.config or override it with BundleTable.EnableOptimizations = true;
Deployment to Azure
Then you mention deployment to Azure. I don't know how this would be any different from deploying to a local server. We used web deploy. Bundling and minification don't happen at build time, so there are no changes to the build process. Also, the optimization framework is deployed with the site, so there's nothing difficult on the deployment side either.
Maybe one thing though: you could consider adding the CDN for the libraries you are using:
bundles.Add(new ScriptBundle("~/bundles/jquery", "http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.7.1.min.js")
In case the CDN location of jQuery was already cached by the client's browser, you'll save another request.
Measuring the performance was easy: just open up the network tab on Chrome and reload the page ( make sure it's not caching ). Check the total number of requests and the total amount of data downloaded. Like I said: we saw a huge improvement.
Well, hope it helps or points you in the right direction.
The answers below are pretty complicated. I've just gone through this with a simpler approach, described here:
https://lifelibertycode.wordpress.com/2015/04/14/how-to-bundle-up-your-mvc-durandal-app/
The steps below:
Step 1: Install Node
Step 2: Install Gulp
$ npm install --global gulp
$ npm install --save-dev gulp
Step 3: Create your gulpfile.js
This should be at the root of your project, and should initially contain this:
var gulp = require('gulp');
gulp.task('default', function() {
// place code for your default task here
});
Step 4: Install gulp-durandal
npm install gulp-durandal --save-dev
Step 5: Update your gulpfile.js
var durandal = require('gulp-durandal');
gulp.task('durandal', function(){
durandal({
baseDir: 'app', //same as default, so not really required.
main: 'main.js', //same as default, so not really required.
output: 'main.js', //same as default, so not really required.
almond: true,
minify: true
})
.pipe(gulp.dest('dir/to/save/the/output'));
});
Step 6: Add a post-build event to your project
if '$(Configuration)'=='Release' (
cd $(ProjectDir)
gulp durandal
)
Step 7: Add a pre-build event to your project
I needed this because occasionally gulp would hang when generating the new main-built.js on top of an existing version. So I just delete the old version before the build begins:
if '$(Configuration)'=='Release' (
cd $(ProjectDir)/app
del main-built.js
del main-built.js.map
)
Now, when you build your project, you’ll generate a new main-built.js file each time that can be served down to your clients. Sweet.
At this point, you probably have some concerns.
How do I keep my files un-bundled when I’m debugging?
@if (HttpContext.Current.IsDebuggingEnabled) {
    <script type="text/javascript" src="~/Scripts/require.js" data-main="/App/main"></script>
} else {
    @Scripts.Render("~/Scripts/main-built")
}
Where ‘main-built’ is defined in your BundleConfig:
bundles.Add(new ScriptBundle("~/Scripts/main-built").Include(
"~/app/main-built.js"));
How do I bust cache when I have new stuff to ship?
If you’re using the above approach, bundling will take care of this for you. ASP.NET will detect a change to your main-built.js file and append a unique identifier to your bundles to bust the cache.
What if my client has downloaded my SPA, and then I ship an update. Won’t the (outdated) client-side code stick around until they refresh?
Yup. Unless you leverage build versioning to tell the client when it’s out of date, and then tell the user.
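As a rough sketch of that idea (not from the post linked below; the /version.json endpoint and the polling interval are hypothetical, and jQuery is assumed to be available), the client can poll a small version file that changes on every deploy and prompt the user when it differs from the version the page booted with:
var bootVersion = null;
setInterval(function () {
    // query-string cache-buster so we hit the server, not the browser cache
    $.getJSON('/version.json?_=' + Date.now(), function (data) {
        if (bootVersion === null) {
            bootVersion = data.version; // remember the version we started with
        } else if (data.version !== bootVersion) {
            // a new build has shipped; offer the user a refresh
            if (window.confirm('A new version is available. Reload now?')) {
                window.location.reload();
            }
        }
    });
}, 60000); // check once a minute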
I happen to have written a blog post about this:
https://javascriptkicks.com/articles/4230
Hopefully that helps you out
Hi, I wanted to know the advantage of registering an asset bundle following the process described in the docs, like:
Process one
in AppAsset.php
public $js = [
'js/myjsfile.js'
];
then in the view file
adding the namespace like
namespace app\assets;
and then adding the use statement like
use app\assets\AppAsset;
AppAsset::register($this);
Instead of doing all this if I use
Process Two
$this->registerJs('js/myjsfile.js', $this::POS_READY);
it works fine.
So why should I use Process One?
Any advantage and reason for this will be greatly appreciated.
If I follow Process One, do I need to add all the JS files in AppAsset.php individually?
Thanks.
Asset Bundles have some advantages over normal registering. Apart from what @deacs said in their answer, here are some others:
An asset bundle can publish the files to the web-accessible assets directory if they are not in one already.
An asset bundle can compile LESS files (in the case of CSS) as well as compress the assets.
It makes code elegant, especially in resolving dependencies, and hence helps reusability.
All the features that make bundles shine are covered in the docs.
One of the main reasons for using an Asset Bundle is that your assets' paths will always be correct. Consider:
$this->registerJsFile('js/myjsfile.js', ['position'=>$this::POS_READY]);
will generate something like:
<script src="js/myjsfile.js"></script>
This works great for non-urlManager-enabled URLs, e.g. http://localhost/yiiproject/index.php?r=user/update&id=8, because your browser looks for the JS file at /yiiproject/js/myjsfile.js.
But if you enable urlManager, your URL will look like http://localhost/yiiproject/user/update/8, which means your browser resolves the relative path against it and looks for your JS file at /yiiproject/user/update/js/myjsfile.js.
You could overcome this problem by using:
$this->registerJsFile(Yii::$app->request->baseUrl.'/js/myjsfile.js', ['position'=>$this::POS_READY]);
But the Asset Bundle basically does that for you.
Using Asset Bundles, you also get the latest version from the 'vendor' folder, so if you need to update some lib you don't have to do it manually; Composer already does it.
Is there any way to list the files of a directory on a static webpage, with links to view the files?
I would like to upload some PDF files to a certain directory of my static website (it uses HTML and JS) and show the list of files on a page, with links to view the PDF files. That way, if I upload new files, I don't have to modify the HTML page every time. Is there any way to do that?
If you are using Apache as a web server and have configured mod_autoindex for the directory where you upload the PDF files, then you should probably see something like this when navigating to that URL:
This auto-generated page can easily be parsed with jQuery:
var pdfFilesDirectory = '/uploads/pdf/';
// get the auto-generated index page
$.ajax({url: pdfFilesDirectory}).then(function(html) {
    // create a temporary DOM element ($listing avoids shadowing window.document)
    var $listing = $(html);
    // find all links ending with .pdf (note the quoted attribute value)
    $listing.find('a[href$=".pdf"]').each(function() {
        var pdfName = $(this).text();
        var pdfUrl = $(this).attr('href');
        // do what you want here
    });
});
You need a server-side implementation; you cannot do this with client-side JavaScript alone, because it cannot list the files on the server. You could do it using PHP, for example, or with server-side JavaScript, as sketched below.
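If you'd rather stay in JavaScript end to end, the server-side part can also be done in Node. A minimal sketch (the folder and URL are hypothetical) that returns the PDF list as JSON, which the jQuery snippets elsewhere in this thread could consume:
var fs = require('fs');
var http = require('http');
var path = require('path');

http.createServer(function (req, res) {
    if (req.url === '/pdfs.json') {
        // read the upload folder and keep only the .pdf files
        fs.readdir('./uploads/pdf', function (err, files) {
            if (err) { res.writeHead(500); res.end(); return; }
            var pdfs = files.filter(function (f) {
                return path.extname(f).toLowerCase() === '.pdf';
            });
            res.writeHead(200, { 'Content-Type': 'application/json' });
            res.end(JSON.stringify(pdfs));
        });
    } else {
        res.writeHead(404);
        res.end();
    }
}).listen(8080);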
I made a node module to automate the task of getting all files and folders: mddir
Usage
node mddir "../relative/path/"
To install: npm install mddir -g
To generate markdown for current directory: mddir
To generate for any absolute path: mddir /absolute/path
To generate for a relative path: mddir ~/Documents/whatever.
The md file gets generated in your working directory.
Currently ignores node_modules, and .git folders.
Troubleshooting
If you receive the error 'node\r: No such file or directory', the issue is that your operating system uses different line endings and mddir can't parse them without you explicitly setting the line ending style to Unix. This usually affects Windows, but also some versions of Linux. Setting line endings to Unix style has to be performed within the mddir npm global bin folder.
Line endings fix
Get npm bin folder path with:
npm config get prefix
cd into that folder, then install dos2unix (via Homebrew on macOS, or your distro's package manager):
brew install dos2unix
dos2unix lib/node_modules/mddir/src/mddir.js
This converts the line endings to Unix instead of DOS.
Then run as normal with: node mddir "../relative/path/".
I made another node module called agd to generate a tree view based on the other module: https://github.com/JohnByrneRepo/agd.
Auto generated documentation (Alpha)
Functionality so far:
Generates a tree folder structure in Node that is rendered as a treegrid in the browser. Click on a file (non-root level) to populate the main view.
Coming soon:
Generates a documentation guide including function names and parameters, function dependencies, and more. Initially compatible with jQuery and plain JavaScript function namespacing, soon to be compatible with React, Redux, Angular 1, Angular 2 and other frameworks on request.
Usage
node agd relativePath
e.g. node agd '../../'
This generates code.json.
Run 'node http-server', then open the browser to view the file structure rendered in the sidebar. Larger projects can take a minute or two to render.
See code.json for example generated data.
To-do: Add code content for top level files. Move tree html generation into node.
Contact html5css3@outlook.com
MIT License
Example generated tree structure
Open a browser
Navigate to the folder you want listed
If you see a list of the files, continue
If you don't see a list of the files, take a look at this: Is it possible to get a list of files under a directory of a website? How? to figure out how to do that.
Make an ajax call to that folder (example below)
The response will be the HTML of the listing page
You can parse that out to get the file listing
Example:
// This is in angular, but you can use whatever
$http.get('folder_to_list/').success(function(response) {
console.log(response);
});
Instead of JavaScript, which runs only on the client side, you should consider using PHP or another server-side language to crawl your directory of files and list them inside an HTML file/template. PHP, for example, has the scandir function, which can list the files in a directory.
You need a combination of JavaScript and PHP: you can call the PHP file from JavaScript using an AJAX call.
Try this PHP file, which should return a JSON object:
$directory = "/directory/path";
$pdffiles = glob($directory . "/*.pdf");
$files = array();
foreach($pdffiles as $pdffile)
{
$files[] = "<a href='$pdffile'>".basename($pdffile)."</a>";
}
echo json_encode($files);
Now you just need to loop through the JSON object to list the URLs.
Something like:
$.getJSON( "file.php", function( data ) {
var items = [];
$.each( data, function(val ) {
items.push(val);
});
$( "body" ).append( items );
});
I didn't test it, but something like this should work.
Keep it simple: put all your files in a directory with no index page of its own, then add an iframe to the page you want that points at that directory. The iframe will show the list of files you uploaded as hyperlinks, and when you click a link, the iframe will display the PDF file.
I have the same problem.
I used to use an Apache web server with its 'fancy directory listings' and had everything set up that way, complete with headers, footers, color schemes, etc.
Then I migrated to the GitLab web servers, which serve pure static pages only. NO directory listings. Arrggghh...
My solution...
I continue to have the pages served by a local Apache server (not world-accessible), then I download the "index.html" file it generates before uploading that index page to GitLab.
An example page generated from an Apache fancy directory listing:
https://antofthy.gitlab.io/info/www/
I do the same for a set of pages that used Server-Side Includes (SHTML), having Apache expand the pages to static HTML.
An example Apache SSI-generated page:
https://antofthy.gitlab.io/graphics/polyhedra/
Of course this does not work for pages that rely on executable output or CGI scripts, but for directory listings it is just fine.
I would prefer to find an SSG that understands Apache fancy directory listings, SSI, or even basic directory listings, without being overkill. But that is an ongoing search.
I'm using Play framework 2.3.6 and Webjars for web lib dependencies.
That is, my build.sbt contains something like "org.webjars" % "angularjs" % "1.2.26".
To uglify my Javascript sources I added pipelineStages := Seq(rjs, uglify, digest, gzip) to my build.sbt.
Now, when running Play's 'stage' or 'dist' tasks, it looks like all JavaScript files get uglified, including the files from the WebJar libraries.
[info] Uglify2 file: /target/web/rjs/build/lib/angularjs/angular.js
I would have expected the sources from the external WebJar libraries to be left untouched, as there is already a minified version.
One problem with that is that the uglify process takes way too long.
How can I speed up the uglification process?
There are two ways to speed up the JavaScript build steps:
Install node.js and set export SBT_OPTS="$SBT_OPTS -Dsbt.jse.engineType=Node" before running the activator. Using node.js instead of the default JavaScript engine gives a very significant speedup. More details can be found here: Migration to play 2.3, see the section about sbt-web.
Customize the build steps, e.g.
disable minification by adding to build.sbt: RjsKey.optimize := "none"
limit uglification by adding to build.sbt, e.g. includeFilter in uglify := GlobFilter("myjs/*.js")
You can find more details about the options on the GitHub sites of these plugins:
sbt-uglify
sbt-rjs
Even though the sbt-uglify documentation says that excludeFilter should exclude WebJars and the public folder, it doesn't.
Follow the customization part of Martin's response, except for one typo: add an s to RjsKey, making it RjsKeys:
RjsKeys.optimize := "none"
I'm currently working on a big JavaScript project for which we want to define our own API. I'm using RequireJS as my dependency loader and it suits me just fine, allowing me to define modules in their respective file. I do not make use of my own namespace, a module returns an instance, which can be used in other modules, i.e.:
define(
['imported_module'],
function(module){
module.doSomething();
}
)
However, as the number of files grows, I'd like to decide how to structure these files in folders. Currently I use the following scheme to name my files:
[projectname].[packagename].[ModuleName]
An example could be stackoverflow.util.HashMap.js. I would like to introduce a project folder and a folder per package, and rename the files to just the module name, like:
stackoverflow/util/HashMap.js
This structures my code quite neatly into folders; however, the filename now reflects only the module name. I'd like to define some kind of routing that tells RequireJS how to look for files. Example:
The file
stackoverflow/util/stackoverflow.util.HashMap.js
Should be importable by the statement
define(['stackoverflow.util.HashMap'],function(HashMap){});
Does anyone have experience with structuring large JavaScript projects, and if so, could you share your approach?
You shouldn't encode routing info in your JS file names; that's the job of the namespace and the folder path. So stackoverflow/util/HashMap.js is just fine, and you can use define("stackoverflow/util/HashMap", ....) to declare the module's name.
If you need to put your modules in different folders, you can configure paths for your loader; see this manual from the RequireJS API.
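A minimal sketch of such a paths configuration (the folder names are hypothetical):
require.config({
    // all module IDs resolve relative to this folder
    baseUrl: 'src',
    paths: {
        // the module ID 'jquery' now resolves to src/lib/jquery.min.js
        'jquery': 'lib/jquery.min'
    }
});

// 'stackoverflow/util/HashMap' resolves to src/stackoverflow/util/HashMap.js
require(['stackoverflow/util/HashMap'], function (HashMap) {
    var map = new HashMap();
});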
There's no single best way to structure your JS files, but putting the root namespace in a src folder is always good practice. You can look at the Dojo source code and the YUI source code and use similar approaches for your project. Both are large-scale JavaScript projects.
Actually, it's better to add routing for your JS libraries so that everything loads through a standard interface like "js.yoursite.com/lib-0.2.js". Behind that there should be a router (PHP or otherwise, able to cache queries), so you can determine and control every path you use; a common jQuery plugin should stay in one directory together with jQuery, while your own custom plugins live separately.
That way you control each project by its own rules:
jquery/
plugins/
jquery.prettyPhoto.js
jquery.min.js
mySuperJS/
stable.0/ -- your production version for the 0.* branch
module.js
0.1/
module.js
0.2/
module.js
0.3/
module.js
myOtherlib/
stable.0/ -- production version for all 0.* versions
stable.1/ -- production version for all 1.* versions
0.1/
0.2/
0.3/
0.4/
0.4.1/
0.4.1.18/
We've been using this structure for around a year and it works best for us. Sometimes, though, we use a more complex solution and separate all modules into libs, plugins, tools, components, and apps.