Here's a typical workflow:
1. Edit a JS file.
2. Save the file; watchify automatically starts rebuilding it for me.
3. Alt-tab to the browser.
4. Ctrl+R to reload the page.
That's great, except that when watchify takes longer than steps 3 and 4, you get either stale code or an error.
Is there an easy way to guarantee that doesn't ever happen? Like a way for watchify to signal to my server that it should wait another split second before trying to load the requested page? If such a thing doesn't exist, how do people deal with this problem in practice?
I must suck at Googling because I can't even find people talking about this problem except this which says "Add a simple (Node-based) server that will block on requests until the watch is done running: this would avoid the always-frustrating phenomenom of reloading the page only to find the watch hasn't quite run yet." -- but unfortunately that's an entry in a todo list, not something that exists in that repo.
If you are using Grunt or Gulp you can use the live reload plugin.
Or you can play a beep when the task is complete, so that you know when to reload the page.
Also it may be worth looking at livereloadify.
Related
I recently configured Webpack (version 4) to split the bundle into three chunks (bundle, runtime, and vendor) and to prepend a hash to each of their filenames so that the browser can detect changes in them. I'm also updating the HTML references with the HtmlWebpackPlugin.
This mostly works, but not fully; let me explain. Before this change, a hard reload was needed after each deployment to see the new changes; that is no longer necessary.
The remaining problem: when you load the web app for the first time after a deployment, it still shows the old version. Only after a soft reload does it show the new changes.
Is there any way to get rid of this behavior, so that after a deployment the changes show up on the very first load, without refreshing at all?
Thanks in advance!
You could append a random querystring to the import of your bundles, like src="bundle.js?nocache=12345", generated every time.
This prevents the browser from caching your code and doesn't require you to change the bundle names (which is convenient).
If you don't want the bundle fetched fresh on every load even when nothing was released, keep something like a version number somewhere and append that to the querystring instead.
That said, your current implementation basically does the same thing: it should already give you a fresh, non-cached bundle when you open the web app. I don't think there is a way to make the browser detect changes in JavaScript and refresh the imported scripts, unless you are using something like webpack-dev-server (hot reload), which does that for you, but that's a development convenience only.
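The version-querystring variant could look like this. `APP_VERSION` is a made-up placeholder you would bump per release (or inject at build time):

```javascript
// Sketch: build a cache-busting src for the bundle import.
// APP_VERSION is an assumed constant, e.g. injected at build time.
const APP_VERSION = '1.4.2';

function bundleSrc(name, version) {
  return name + '?v=' + encodeURIComponent(version);
}

// Your HTML-generating code would then emit something like:
// <script src="bundle.js?v=1.4.2"></script>
```

As long as the version changes only on release, the browser caches the bundle between deployments but fetches it fresh the first time after one.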
We have a .NET Web Forms application, running on IIS on our own server, that has shown some strange behaviour in the last 24 hours.
Rollbar notified me of multiple errors from a host of users, all saying certain JS functions/variables can't be found, essentially breaking the app. I came into work today and loaded the site in Chrome DevTools, only to find no source files shown, and therefore no JS files to load or step through (CSS and image files were there, though).
To make matters more confusing, after refreshing the page everything is there as it should be?!
An updated build of the app was released yesterday, so I'm guessing that has something to do with it?
Honestly any speculative pointers on things we can look into to prevent it happening again would be appreciated.
If you have multiple JavaScript files and your code runs before its dependencies have loaded, you get 'undefined' errors.
After a page refresh the dependency files are already cached, so they load immediately, almost synchronously. That's why you don't get errors on subsequent loads.
Try disabling the cache in DevTools and reloading a few times to check whether later attempts still work.
If that is the problem, consider modularizing your JavaScript code and loading it as asynchronous dependencies, for example with browserify, webpack, or even require.js. You can find more on the subject by searching for "javascript load order".
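Without going to a full module bundler, the core idea of loading dependencies strictly in order can be sketched like this. The loader is passed in as a parameter so the ordering logic stands alone; `domLoader` is what it might look like in a browser, and the URLs are placeholders:

```javascript
// Sketch: run a loader for each script strictly in order, so dependent
// code never executes before its dependencies have loaded.
async function loadAll(urls, loader) {
  for (const url of urls) {
    await loader(url); // wait for each script before starting the next
  }
}

// Browser-side loader sketch (assumed DOM environment):
function domLoader(src) {
  return new Promise((resolve, reject) => {
    const s = document.createElement('script');
    s.src = src;
    s.onload = resolve;
    s.onerror = () => reject(new Error('failed to load ' + src));
    document.head.appendChild(s);
  });
}

// Usage in the browser: loadAll(['vendor.js', 'app.js'], domLoader);
```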
I have an AngularJS SPA that is running into problems in IE9. The app works great in Chrome and Firefox. There is quite a bit of data to download on the initial load, and many dependencies are managed through requirejs.
When I view the page in a fresh IE session, only part of the webpage renders; some parts just don't display on the screen. Checking the network requests, some items are missing. If I reload at this point, the page loads fine, because some of the files are already cached and don't need to be downloaded again. However, if I clear the cache and retry, the same thing happens.
The missing files are not consistent; it's not always the same ones.
Additionally, in console I can see the following error:
$digest already in progress
This app is an open source project that I am intimately familiar with, however I am not sure even where to start troubleshooting this.
Any ideas?
Thanks in advance
You've got something trying to start a digest cycle while one is already running. Check your directives and any other code that calls either $apply() or $digest() directly.
I would try commenting out all of those $apply() and $digest() calls and see what happens. My guess is that the page will load successfully, but not everything on it will be updated. Then you can open another question and we can work out how to get rid of the call that is trying to start another digest cycle.
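While hunting for the culprit, a guard people often use is the "safe apply" pattern: only start a digest if one isn't already in progress. It leans on the semi-private `$$phase` flag of AngularJS 1.x, so treat it as a diagnostic aid rather than a clean fix:

```javascript
// Sketch of the common "safe apply" guard for AngularJS 1.x.
// Relies on the internal $$phase flag; a band-aid, not a proper fix.
function safeApply(scope, fn) {
  const phase = scope.$root.$$phase;
  if (phase === '$apply' || phase === '$digest') {
    fn();              // a digest is already running; just execute
  } else {
    scope.$apply(fn);  // otherwise trigger a digest ourselves
  }
}
```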
I have a website made with WordPress. I installed the "cache quick" plugin to optimize it.
Now it loads faster.
But I have a little issue:
I make small changes and upload them to the production environment.
I clear the cache.
Now, when each WordPress page or post is visited for the first time, it loads very slowly (since it's the first, uncached visit).
So I try to visit every link on the website to cache the pages, so users are served quickly with the latest changes.
I have thought about making a script to do this for me :-)
Can anyone help me, please?
One way to rebuild a cache is to write a command-line crawler script. It reads all the URLs from your database and then uses curl to hit them. You can have the script pause between hits to save server capacity, and schedule it with cron to run every hour or so.
If you prefer doing it manually, you can create a plugin that reads all the URLs and hits them one after another. The functionality is essentially the same.
(I would write this in a comment, unfortunately I don't have enough reputation.)
If I understand correctly you want to visit your main page and all pages 'below' in the hierarchy.
Then assuming you have some unixoid system available I would suggest you use something like this on the command line:
wget -r -np http://www.yoursite.com
Read the man pages of wget to see what those flags do and which flags you actually need. You can also follow links in your domain and stuff like that.
If you want to do this on a regular basis, you can use cron. And after downloading everything, you should probably rm all the files you downloaded.
You could try a website grabber and then check whether your cache has been built.
For example, http://www.httrack.com/ will request every link on your website.
I hope this helps.
I have a recurring issue whenever I make changes to my AngularJS app: I have to refresh the page to see the changes I've made. This is a problem, because I don't want my users to have to refresh the page (they may not know to do this), and they may think the site is broken. Is there some kind of Angular directive or process I can follow that will allow my UI changes to be reflected on my production server without users having to refresh the page?
There are a couple of solutions you could look at,
There is a similar answer to your question here:
AngularJS disable partial caching on dev machine
To quote the accepted answer on that post:
"here's one way to always automatically clear the cache whenever the ng-view content changes"
myApp.run(function($rootScope, $templateCache) {
    $rootScope.$on('$viewContentLoaded', function() {
        $templateCache.removeAll();
    });
});
A non angular solution may also be found here:
Force browser to clear cache
To quote the best answer on that post (Fermin's answer):
If it's to view css or js changes, one way is to append _versionNo to the css/js file for each release. E.g.
script_1.0.css, script_1.1.css, script_1.2.css, etc.
You can check out this link to see how it could work.
I hope these help.
EDIT.
To respond to your comment, I'm sorry the above hasn't worked.
Perhaps you could try implementing a websockets based approach, similar to Trello, whereby after a certain amount of time you ask the user to update their page to receive new updates to the system or to refresh if their socket connection has timed out? Depending on your setup you may find this tutorial useful: http://www.html5rocks.com/en/tutorials/frameworks/angular-websockets/
Using html2js you can manage template versioning more easily. It is a Grunt plugin that loads all of your templates into a single js file. The downside is that every template is loaded regardless of need; the upside is that all templates are ready to use after one HTTP call, with a single point of caching. That way you can rename the js file on each new release and bust the template cache in production. As for dev, the other answer here is sufficient.
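A Gruntfile fragment for that approach might look like the following; the paths, option names, and version templating are illustrative, so check the grunt-html2js docs for the exact configuration:

```javascript
// Illustrative Gruntfile fragment (grunt-html2js): compile all templates
// into one versioned js file so each release busts the template cache.
// Paths and the <%= pkg.version %> templating are assumptions.
module.exports = function (grunt) {
  grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),
    html2js: {
      main: {
        src: ['app/templates/**/*.html'],
        dest: 'dist/templates-<%= pkg.version %>.js'
      }
    }
  });
  grunt.loadNpmTasks('grunt-html2js');
};
```

Bumping the version in package.json then produces a new filename, so browsers fetch the fresh template bundle after each release.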