How to speed up Drupal in China - javascript

My goal is to use Drupal in China (or any other country that throttles internet access).
In particular, Drupal is not China-friendly because it attempts to load some resources from servers that are blocked there, or that run extremely slowly because of the filtering policy.
In the most common Drupal cases these are googleapis.com and bootstrapcdn.com.
As a consequence, sites are really slow: the main content loads quickly enough, but then (depending on the browser configuration) the screen remains blank until the requests time out, which usually takes tens of seconds.
I did some research on the topic, but the solutions are often localized to a specific module or theme. I would like to find a general solution I can apply without having to patch jQuery or Bootstrap or whatever every time.
Finding a general solution is not easy because many calls are made at runtime from JavaScript or CSS imports, so solving the problem from JavaScript doesn't seem viable (or does it?).
The best solution I have come up with so far is to edit the hosts file on the server, redirecting some of the calls to localhost, e.g.,
127.0.0.1 fonts.googleapis.com
::1 fonts.googleapis.com
127.0.0.1 maxcdn.bootstrapcdn.com
::1 maxcdn.bootstrapcdn.com
[...]
But this doesn't seem to work; I'm still seeing them among the downloaded resources.
Has anybody managed to solve this problem for the Drupal CMS?

The jQuery Update module has means to serve a local jQuery version.
For the font library it depends on how the font is added: either by the theme (then you will have to modify your theme's code or hook_library_alter the library, which is probably quite hard to maintain), or by another module, which usually comes with an option to serve a local version (like Font Awesome).
After all, are you trying to change the server-side output, or to have a local browser extension modify it?
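As a side note, editing the hosts file on the server cannot fix this: those requests are issued by the visitor's browser, which never consults the server's hosts file. If you do serve local copies, a client-side fallback is one general pattern. A minimal sketch, assuming a self-hosted copy of jQuery (the local path and version here are assumptions, not Drupal defaults):
<!-- Try the CDN first; if the request was blocked or failed,
     window.jQuery is undefined and we fall back to a local copy. -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<script>
window.jQuery || document.write('<script src="/sites/all/libraries/jquery/jquery.min.js"><\/script>');
</script>
Note that this only helps once the CDN request actually fails; for hosts that hang rather than refuse the connection, you still pay the timeout, so pointing the library definitions at local files in the first place remains the better fix.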

Related

referencing or hosting javascript on HTTP/2

For a modern website, what is the best way to load javascript libraries (in my case jQuery and D3)?
Let's assume:
everyone accesses the site using HTTP/2
self hosting means hosting on GitHub (i.e. for bl.ocks)
referencing could mean:
Google for jQuery and cdnjs or D3.org for D3
cdnjs for jQuery and D3
cdnjs for jQuery and D3.org for D3
Since everyone is using HTTP/2, the parallelism argument no longer applies (right?).
In order to maximize the chance of a cache hit, I would assume Google is the best bet for jQuery, but they do not provide D3, so I would have to use cdnjs or D3.org for that. Is there an advantage to using cdnjs for both?
EDIT: Let me say that the audience is global, so ideally a solution would work well from e.g. Africa and China. The latter is important here, because China blocks access to Google servers, meaning a local fallback would be needed.
The audience is also not limited to D3 designers / bl.ocks users (in case that would be relevant to the cache hit chances).
Using a CDN version might mean it's already cached, so you save a tiny amount of time by not downloading another copy of it. However, if it's not cached, it can actually slow down your site: you need to make a connection to the CDN site, including a DNS lookup, a TCP three-way handshake, TCP slow start (meaning the connection is initially slow), and a TLS setup (assuming it's over HTTPS), before finally requesting the resource. After that you never use that CDN for anything else, so all that setup cost is wasted.
This cost is doubled up for two different CDNs.
Personally, if it's one or two libraries, then I just self-host for this reason. Even over HTTP/1.1.
If you really want the benefits of a CDN, then consider putting the whole site behind a CDN like Cloudflare rather than just loading one or two libraries from one. And that might not be a bad idea for a global service like you say this is.
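If you do keep a third-party CDN reference despite the setup cost, resource hints can overlap some of it with page parsing. A small sketch (the cdnjs host and path are just an example, use whichever host you settle on):
<!-- preconnect tells the browser to do the DNS lookup, TCP handshake and
     TLS setup for the CDN host early, instead of when the script tag is hit. -->
<link rel="preconnect" href="https://cdnjs.cloudflare.com">
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>
This doesn't remove the extra connection; it just starts paying for it sooner.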

Load libraries, scripts, etc for the website from external or internal sources?

I just started my adventure with frontend development, most likely web design. I've been struggling with one technical question and I haven't yet found a reasonable answer.
There are so many libraries you can load or download to make your web development faster. Hence my question:
Is it better to link to these libraries (e.g. Bootstrap, jQuery, Angular, fonts from Google and so on) externally from the official source, or to download them, upload them to your server, and link to the file location (internal source) on your server?
My intuition tells me that downloading them and hosting them on my own server would make the whole website load quicker. Is that good thinking?
Pros of linking to external resources (be they JS libraries, images or whatever):
Spreading the load: your server doesn't have to serve all the content; it can concentrate on its main functionality and serve the application itself.
Spreading the HTTP connections: since applications work more and more asynchronously, it is good to use the maximum number of parallel HTTP connections per site/subdomain to deliver application data, and to load all the necessary additional resources from other servers.
As Rafael mentioned above, CDNs scale very well and seldom go offline.
Cons:
Even with fast internet connections there is a high chance that resources will be served faster when they are located on the same intranet. That's why some companies run their own "micro-CDNs" inside their local networks, combining the advantages of multiple servers and local availability.
External dependency: as soon as an internet connection becomes unavailable or a proxy server goes down, all external resources become unavailable, leaving the application in a broken state.
Sometimes it may actually be faster to link to an external source. That's because the browser caches recently accessed data, and many sites use Bootstrap, jQuery and the like. It might not happen as often with less popular libraries.
Keep in mind, though, that since you're downloading from external sources, you're at the mercy of their servers. If for one reason or another a source goes offline, your page won't work correctly. CDNs are not supposed to go offline for that very reason, but it's good to be aware of it. Also, when/if you're offline while working on your page, you won't be able to load those resources during development.
It is always better to host these files locally if you are developing an application where security matters, so that you do not have to depend on a third-party server hosting the CDN.
Talking about performance, using a CDN might be beneficial because the libraries you require might already be cached in the browser, so the time to fetch the file is saved. If the file is hosted locally instead, loading it will still take time, and it takes up space on your server.
https://halfelf.org/2015/cdn-vs-local/
https://www.sitepoint.com/7-reasons-not-to-use-a-cdn/
I agree with Rafael's answer above, but wanted to note a few benefits of serving these libraries locally that he or she omitted.
It is still considered best practice (until HTTP/2 becomes widespread) to minimize the number of downloads your site makes by concatenating many files into a single file. So, if you are using three JavaScript libraries/frameworks (for instance Angular, jQuery and Moment.js), with a CDN that is three separate script elements pulling down three separate .js files. If you host them locally, however, your build process can include a step that bundles the three libraries into a single file called "vendor.js" or something along those lines. This has the added bonus of simplifying dependency loading to some degree, as you can concatenate them in a specific order should the need arise.
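As a rough illustration, a concatenation step with Grunt might look like this (a sketch only; grunt-contrib-concat is a real package, but the file paths are assumptions):
// Gruntfile.js: bundle vendor libraries into a single dist/vendor.js
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      vendor: {
        // Order matters: put jQuery first if the others depend on it.
        src: [
          'vendor/jquery/jquery.js',
          'vendor/angular/angular.js',
          'vendor/moment/moment.js'
        ],
        dest: 'dist/vendor.js'
      }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.registerTask('default', ['concat']);
};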
Finally, although it is a little advanced if you are just getting started, if you are considering hosting your library files with your project it would definitely be worth looking into Bower (https://bower.io/docs/api/). It is a Node-based package manager that lets you define which packages your project needs and install them all with a single command, which is particularly useful for keeping unnecessary library files out of your version control. Good luck!
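For instance, a minimal bower.json along these lines (the versions are illustrative assumptions) lets a single bower install fetch everything:
{
  "name": "my-project",
  "dependencies": {
    "jquery": "^3.1.0",
    "angular": "^1.5.8",
    "moment": "^2.15.0"
  }
}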

How to execute external JS file blocked by users' adblockers

We use an external service (Monetate) to serve JS to our site so that we can perform ad-hoc presentation-layer site updates without going through the process of a site re-deploy, which in our case is a time-consuming, monolithic process we can only afford to do about once per month.
However, users who use adblockers in the browser do not see some of these presentation-layer updates. This can negatively affect their experience of the site as we sometimes include time-sensitive promotions that those users may not be aware of.
To work around this, I was thinking of duplicating the JavaScript file that Monetate serves and hosting it on infrastructure separate from the site. That way, if we needed to make updates to it, we could do so as needed without doing a full site re-deploy.
However, I'm wondering if there is some way to work around the blocking of the Monetate JS file and somehow execute the remote Monetate JS file from our own JS code in such a way that adblockers would not be able to block it? This would avoid the need to duplicate the file.
If that file is blocked by adblockers, chances are that it is used to serve ads. In fact, your description of time-sensitive promotions sounds an awful lot like ads, just not for an external provider, but for your own site.
Since adblockers usually match the URL, the easiest solution would indeed be to rehost this file, if possible under a different name. Instead of hosting a static copy, you can also implement a simple proxy, with the equivalent of <?php readfile('http://monetate.com/file.js'); or Apache's mod_rewrite. While this will increase load times and can fail if the remote host goes down, it means the client will always get the newest version of the file.
Apart from using a different URL, there is no client-side solution: adblockers are part of the browser (or an extension of it), and you cannot modify that code, for good reasons.
Beware that adblockers may decide to block your URL too, if the script is indeed used to serve ads.
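If your stack is Node rather than PHP, the same rehosting-proxy idea looks roughly like this (a sketch; the route name and upstream URL are placeholders, and as noted above a neutral route name matters because filter lists match URL patterns):
const express = require('express');
const https = require('https');
const app = express();
// Serve the vendor script from our own origin under a neutral name,
// streaming the upstream copy so clients always get the newest version.
app.get('/assets/site-enhancements.js', (req, res) => {
  https.get('https://monetate.example/file.js', (upstream) => {
    res.type('application/javascript');
    upstream.pipe(res);
  }).on('error', () => res.status(502).end());
});
app.listen(3000);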
Monetate is probably blacklisted in Adblock, so there is nothing you can do about that.
I think that self-hosting the Monetate script would require you to keep it updated by checking for new versions from time to time (maintaining it could become a pain in the ass).
A good solution in my opinion is to inform your users about that limitation with a clear message.
Or, you can get in touch with Monetate and ask for a solution.

stop caching views in Angular.js

For some reason all of my HTML seems to be 100% cached in Chrome.
I am using Angular 1.2 with a .NET Web API 2 project, and my content is served from index.html.
I have not made any cache policy changes yet, but it seems to be caching everything very heavily.
None of the changes I make (to the views) are reflected until I clear the browser cache.
I don't see new changes after pressing F5, or after publishing my site to the server and pressing F5 there. I have to either explicitly clear the browser cache, or keep the console open with the "no caching while dev tools are open" setting on.
I want to avoid asking users to clear their browser cache when new versions are deployed.
Since no one is chiming in, I'll post the solution I implemented; it works, but could be better.
IIS:
HTTP Features > HTTP Response Headers > Set Common Headers > Expire Web content: Immediately
(If you don't see HTTP Features, you'll have to add the feature to your IIS server.)
index.html:
<meta http-equiv="CACHE-CONTROL" content="NO-CACHE">
<meta http-equiv="CACHE-CONTROL" content="NO-STORE">
(These may cause issues in older versions of IE.)
This works, but I'd prefer to strike a better balance: full caching within a release, fresh files between releases.
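Incidentally, the IIS expiration setting above can also be kept in source control rather than clicked through the UI. A web.config sketch, as far as I know the staticContent schema (verify against your IIS version):
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Send Cache-Control: no-cache for static files such as index.html -->
      <clientCache cacheControlMode="DisableCache" />
    </staticContent>
  </system.webServer>
</configuration>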
An alternative approach would be to use a build tool such as Grunt to generate unique filenames for your scripts in production and update the links in index.html. I believe this would be the best approach, because full caching stays enabled, and the browser always requests the new version of a file since its name is unique. (I've also seen people append ?v=666 to files, but from what I've read this is not a reliable approach.)
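A sketch of that idea with grunt-filerev, which fingerprints files with a content hash (the paths are assumptions; a companion task such as grunt-usemin would rewrite the references in index.html):
// Gruntfile.js: give production scripts content-hashed names
module.exports = function (grunt) {
  grunt.initConfig({
    filerev: {
      options: { algorithm: 'md5', length: 8 },
      // app.js becomes e.g. app.5a73f2c1.js, so a new build
      // means a new URL and the stale cached copy is never used.
      scripts: { src: ['dist/js/*.js'] }
    }
  });
  grunt.loadNpmTasks('grunt-filerev');
  grunt.registerTask('default', ['filerev']);
};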
Also, if you're serving content from .NET pages (instead of basic .html) you have the option of using the bundler, which manages the script tags between releases.
bundles.Add(new ScriptBundle("~/bundles/angular").Include("~/Scripts/angular/angular.js")
.Include("~/Scripts/angular/angular-route.js"));
This will result in:
<script src="/bundles/angular?v=20LMOimhsS0zmHdsbxeBlJ05XNyVXwPaS4nt3KuzemE1"></script>
Update:
Additionally, I've been adding a version parameter to template includes. This busts the cache on new releases, but it may prevent caching completely, so do some testing if you go that route:
<div data-ng-include=" 'app/import/import-summary.html?v=' + appVersion "></div>
appVersion can be a global variable set in app.js
app.run(['$route', '$rootScope', function ($route, $rootScope) {
$rootScope.appVersion = getBuildVersion(); // this can be read from assembly file version or any other place where versions are generated by your build server
}]);
I have found I have the same issue intermittently with my production server as well.
In most instances, 80% of the code that I push to the production server has no problems. However, a new module reference, a new Angular binding, or even the smallest template edit can take several refreshes before the cache actually clears and the latest updates show.
I actually wrote a seven-paragraph question-slash-statement to air my grievances, hoping someone had the same issue as me. I ended up deleting the question, as I feared it was just my experience.
I have found a weird workaround: if I host the site from the terminal by running either 'npm start' or 'http-server' on my machine, constantly switching between my development and production servers forces the browser to acknowledge both environments, and the change appears in one environment or both.
In my opinion, there are two types of developers: modular and line-by-line asynchronous.
Modular programmers code in large chunks and then upload the entire code effort which minimizes heavy caching.
Asynchronous programmers like to view every single change they make by coding a couple of lines, uploading the small changes and then refreshing the browser to see their latest nuance of progression.
I am definitely an asynchronous programmer. I find it allows me to catch smaller mistakes as well as understand the inner workings of whichever language I am using.
Is it safe to say that asynchronous programming leads to heavy browser caching, which is leading to the problem we are all having on this particular question?
Glad to share in your blind frustration, as this has been a huge problem of mine, ever since I've been dabbling in Angular.
It may at first glance appear to be a small issue; however, as you are learning Angular, you may spend hours or even a couple of days recoding the same 30 lines because there was no change in your browser. This is a very dangerous, invisible issue which can chip away large amounts of development time.

Link to external source or store locally

When using third-party libraries such as jquery, yui-reset, swfobject and so on, do you link to the hosted versions, or download and host your own versions?
Pros and cons either way?
Hosted versions are apparently the way to go. For three main reasons (edit: I've added a fourth reason, but it's sort of a moot point):
Google/jQuery/etc servers are likely to be faster than your own
A lot of these servers use content distribution networks, so files will be served from a server geographically close to the requester
If every site used the hosted versions, users are more likely to have the files cached in their browsers, so a trip to the server might not even be necessary
They are possibly more reliable than your own server (however if your own server goes down, this point is moot, as you probably won't be able to serve the main page, so there won't be a request to the js file anyway)
Cons would be
You have no control over the uptime/reliability of the servers (though they are more likely more reliable than your own)
Cannot make any custom mods/patches to these files (though most good frameworks allow you to extend them without needing to modify the original code)
If the hosted file does not allow you to specify the version as part of the filename (eg "jquery-1.3.2.js" rather than "jquery.js"), you probably don't want to use the hosted version, as any updates could possibly break your code
I would say the pros generally outweigh the cons.
These are all JavaScript libraries: you want to put a copy on your own server.
If the host somehow serves a different version than the one you tested against, the newer version might break your code.
I always download and host them locally, just because I'm worried about their server downtime; there's no real guarantee that their servers will be up for the rest of time. There is usually a note in the script about who it belongs to anyway.
I guess the only con would be if the person who made the script didn't actually want it downloaded... but I don't see that ever happening.
Plus the request times are much faster: instead of requesting a script hosted at Google, just request it from your own server.
For production, use hosted.
For development, use local, because if you're offline then your dev site is broken.
