I'm building an AngularJS app on top of Node/ExpressJS. I have a list of images which are hosted externally (and I have no access to compress them at the source).
The issue is that these images are often quite large: ~200 KB for a 600x600 image. I don't want to serve such large files to my users, especially those on mobile with data caps and whatnot.
Is there any service (or Node module) that would allow a middleman-style way of compressing the images my AngularJS app serves up to the user? Something like Google PageSpeed Service (surprisingly few people have heard of it; check it out, it's awesome) would be absolutely perfect, except it doesn't work with AJAX-loaded images/AngularJS.
There are services like http://kraken.io/ where it's just a matter of hooking a URL pattern up to an API call for the optimized image. The problem with such services is that they don't scale cheaply, since you are using third-party bandwidth and processing power.
I would strongly advise caching the files somehow on your side of things, though. Or even do it the other way around: hook the optimization into changes to the image list, and serve the optimized files from your end.
Doing this from Angular means doing it from each user's computer: with a limit of 50 files/day, lasting (apparently) 1 hour on their server, you'll quickly run out of API calls.
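Along those lines, here is a minimal sketch of the "optimize and cache on your own end" idea, assuming the `sharp` image library, an https source URL, and a local `cache/` directory (all assumptions, not part of the original setup); URL validation and error handling are omitted:

```js
// Hypothetical Express route: fetch an external image, recompress it, cache it, serve it.
const express = require('express');
const fs = require('fs');
const path = require('path');
const https = require('https');
const sharp = require('sharp'); // assumed dependency for the recompression step

const app = express();
const CACHE_DIR = path.join(__dirname, 'cache');
fs.mkdirSync(CACHE_DIR, { recursive: true });

app.get('/img', (req, res) => {
  const src = req.query.src; // external image URL, e.g. /img?src=https://example.com/big.jpg
  const cached = path.join(CACHE_DIR, encodeURIComponent(src) + '.jpg');

  // Serve the already-optimized copy if we have one.
  if (fs.existsSync(cached)) return res.sendFile(cached);

  https.get(src, (upstream) => {
    const chunks = [];
    upstream.on('data', (c) => chunks.push(c));
    upstream.on('end', () => {
      sharp(Buffer.concat(chunks))
        .resize({ width: 600, withoutEnlargement: true })
        .jpeg({ quality: 70 })
        .toBuffer()
        .then((out) => {
          fs.writeFile(cached, out, () => {}); // best-effort cache for next time
          res.type('jpeg').send(out);
        })
        .catch(() => res.sendStatus(502));
    });
  });
});

app.listen(3000);
```

The AngularJS templates would then point their ng-src at something like /img?src=... instead of the original external URL.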
I've just started my adventure with frontend development, most likely web design. I've been struggling with one technical question and haven't yet found a reasonable answer.
There are so many libraries you can load or download to make your web development faster. Hence my question:
Is it better to link to these libraries (e.g. Bootstrap, jQuery, Angular, fonts from Google and so on) externally from the official source, or to download them, upload them to your server, and link to the local file (internal source) on your server?
My intuition tells me that downloading them, uploading them to my server, and linking to that would make the whole website load quicker. Is that good thinking?
Pros of linking to external resources (be it JS libraries, images or whatever):
Spreading the load: your server doesn't have to serve all content; it can concentrate on its main functionality and serve the application itself.
Spreading the HTTP connections: since applications work more and more asynchronously, it is a good thing to use the maximum number of parallel HTTP connections per site/subdomain to deliver application data, and to load additional resources from other servers.
As Rafael mentioned above, CDNs scale very well and seldom go offline.
Cons
Even with fast Internet connections, there is a high chance that resources will be served faster when they are located on the same intranet. That's why some companies run their own "micro-CDNs" inside their local networks, combining the advantages of multiple servers with local availability.
External dependency: as soon as the Internet connection becomes unavailable or a proxy server goes down, all external resources become unavailable, leaving the application in a broken state.
Sometimes it may actually be faster to link to an external source. That's because the browser caches data it has recently accessed, and many sites use Bootstrap, jQuery and the like, so there's a good chance the user already has the file. This is less likely with less popular libraries.
Keep in mind, though, that since you're downloading from external sources, you're at the mercy of their servers: if for one reason or another they go offline, your page won't work correctly. CDNs are not supposed to go offline for that very reason, but it's good to be aware of it. Also, when you're offline and working on your page during development, you won't be able to load those resources.
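One common way to hedge against that risk is to try the CDN first and fall back to a locally hosted copy if it fails. A rough sketch (the CDN URL and local path are just examples):

```js
// Try the CDN first; if the request fails, load the same library from our own server.
(function () {
  var cdn = document.createElement('script');
  cdn.src = 'https://code.jquery.com/jquery-3.7.1.min.js'; // example CDN URL
  cdn.onerror = function () {
    var local = document.createElement('script');
    local.src = '/vendor/jquery.min.js'; // assumed local copy
    document.head.appendChild(local);
  };
  document.head.appendChild(cdn);
})();
```

The classic variant of the same idea is an inline window.jQuery || document.write(...) check placed right after the CDN script tag.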
For security, it is better to host these files locally when you are developing an application, so that you do not have to depend on the third-party server hosting the CDN.
Talking about performance, using a CDN can be beneficial because the libraries you require may already be cached in the user's browser, so the time to fetch the file is saved. If the file is only hosted locally, loading it will definitely take additional time and use space on your server.
https://halfelf.org/2015/cdn-vs-local/
https://www.sitepoint.com/7-reasons-not-to-use-a-cdn/
I agree with Rafael's answer above, but wanted to note a few benefits of serving up these libraries locally that he or she omitted.
It is still considered best practice (until HTTP/2 becomes widespread) to minimize the number of downloads your site requires by concatenating many files into a single file. So, if you are using three JavaScript libraries/frameworks (for instance Angular, jQuery and Moment.js), with a CDN that means three separate script elements pulling down three separate .js files. If you host them locally, however, your build process can include a step that bundles the three libraries into a single file called "vendor.js" or something along those lines, as in the sketch below. This has the added bonus of simplifying dependency loading to some degree, since you can concatenate them in a specific order should the need arise.
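As a rough illustration of that bundling step (the file names and output path are assumptions about a hypothetical project layout):

```js
// build-vendor.js: concatenate third-party libraries into a single vendor.js.
const fs = require('fs');
const path = require('path');

// Order matters: e.g. jQuery has to come before anything that depends on it.
const libs = [
  'node_modules/jquery/dist/jquery.min.js',
  'node_modules/angular/angular.min.js',
  'node_modules/moment/min/moment.min.js'
];

const bundle = libs
  .map((file) => fs.readFileSync(path.resolve(file), 'utf8'))
  .join('\n;\n'); // the extra semicolon guards against files that don't end in one

fs.writeFileSync('public/js/vendor.js', bundle);
console.log('Wrote public/js/vendor.js (' + bundle.length + ' bytes)');
```

In practice you would hang this off whatever build tool you already use (Grunt, Gulp, r.js and so on) rather than a standalone script.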
Finally, although it is a little advanced if you are just getting started, if you are considering hosting your library files with your project it is definitely worth looking into Bower (https://bower.io/docs/api/). It is a Node-based package manager that lets you define which packages your project needs and install them with a single command, which is particularly useful for keeping unnecessary library files out of your version control. Good luck!
I want to create a mobile application for an already existing website. It will allow users to log into their accounts, upload files, view messages, etc. The app will be built with Phonegap and as much JS/CSS/HTML5 as possible. My question is: most apps I have seen so far use JSON responses (or similar) and work with them right in the app. But what is the disadvantage of generating the necessary HTML on the server and simply loading it into a div using jQuery? (Is that even possible, given the Same-Origin Policy?) When you want to change minor things in the application, that approach is easier, but there must be a reason why I haven't seen it so far...
The HTML and templates are already cached locally, so by passing JSON data instead of HTML fragments you're transferring less data, and network traffic is usually a limiting factor in mobile apps.
From a design point of view, asking the server only for data (and not presentation logic) also makes for better separation of concerns and, very often, a cleaner architecture. It's also easier to cache data than presentation, because of user preferences and the like.
The upside of sending ready-made HTML is that presenting it is slightly faster, since less work has to happen on the client side.
As for the Same-Origin Policy, it does not apply to mobile apps, so don't worry about it.
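For the JSON approach, the app ships its own templates and only asks the server for data; a minimal AngularJS sketch (the endpoint URL and data shape are made up):

```js
angular.module('app', [])
  .controller('MessagesCtrl', ['$scope', '$http', function ($scope, $http) {
    $scope.messages = [];
    // The packaged app only downloads data; the markup lives in the local template.
    $http.get('https://example.com/api/messages').then(function (response) {
      $scope.messages = response.data; // e.g. [{ from: 'alice', text: 'hi' }, ...]
    });
  }]);
```

The corresponding template would just be an ng-repeat over messages, bundled with the Phonegap app itself.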
Say somebody were to create a large application based on WebGL. Let's say it's a 3D micro-management game which by itself takes approximately 700 megabytes of files to run.
How would one deal with loading the assets? I assume it would have to be done asynchronously, but I am unsure how exactly it would work.
P.S. I am thinking of RollerCoaster Tycoon as an example, but really this is about loading large assets from the server to the browser.
Well, first off, you don't want your users to download 700 megabytes of data, at least not all at once.
One should try to keep as many resources (geometry, textures) as possible procedural.
All data that does need to be downloaded should be loaded progressively/on demand, using multiple web workers, since you will probably still need to process the data with JavaScript, which can become quite CPU-heavy when there are many resources.
Packing the data into larger packages may also be advisable, to prevent request overhead.
Naturally, one would gzip all resources and try to preload data as soon as the user hits the website. When using image textures and/or text content, embedding them in the HTML (using <img> and <script> tags) lets you exploit the browser cache to some extent.
WebSQL/IndexedDB/LocalStorage could be used, but due to the currently very low quotas and the flaky or non-existent implementations of the quota management API, it's not a feasible solution right now.
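A rough sketch of the progressive/on-demand idea, assuming a per-scene manifest of asset URLs (the names and the worker hand-off are placeholders, not from the question):

```js
// Download assets lazily, keep them in memory, and only fetch what the current scene needs.
const assetCache = new Map();

async function loadAsset(url) {
  if (assetCache.has(url)) return assetCache.get(url); // already downloaded
  const response = await fetch(url);                   // served gzipped by the web server
  const buffer = await response.arrayBuffer();
  assetCache.set(url, buffer);
  return buffer;
}

async function loadScene(manifestUrls) {
  // Fetch in parallel while the rest of the game keeps running.
  const buffers = await Promise.all(manifestUrls.map(loadAsset));
  // From here, heavy parsing/decompression would be handed off to web workers,
  // and the resulting geometry/textures uploaded to the GPU on the main thread.
  return buffers;
}
```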
I'm using the excellent requirejs optimizer to compress the code of a web application.
The application uses a lot of third-party libs. I have several options:
Let the user download all the third party libs separately from my server
Let the user download all the third party libs from a CDN, if available
Use requirejs to produce a 'compressed' version of all those libs in a single file
Now, I know that caching and/or a CDN would help with how long it takes to fetch each individual library; however, if I have 15 libs, I'm still going to end up with 15 HTTP requests, which is all the more annoying if the actual code for my application ends up being served in one or two relatively small files.
So what are the pros and cons of each method? Also, I suppose I would actually be 'redistributing' (in the sense of common FOSS licenses) the libraries if I were to bundle them inside my app, rather than pointing to a CDN?
Any experience / ideas welcome.
Thanks.
You could take a look at the Why should I use Google's CDN for jQuery? question to see why a CDN is the better solution.
It increases the parallelism available. (Most browsers will only download 3 or 4 files at a time from any given site.)
It increases the chance that there will be a cache hit. (As more sites follow this practice, more users already have the file ready.)
It ensures that the payload will be as small as possible. (Google can pre-compress the file in a wide array of formats, like GZIP or DEFLATE. This makes the time-to-download very small, because it is super compressed and it isn't compressed on the fly.)
It reduces the amount of bandwidth used by your server. (Google is basically offering free bandwidth.)
It ensures that the user will get a geographically close response. (Google has servers all over the world, further decreasing the latency.)
(Optional) They will automatically keep your scripts up to date. (If you like to "fly by the seat of your pants," you can always use the latest version of any script that they offer. These could fix security holes, but generally just break your stuff.)
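If you go the CDN route with RequireJS, you can still keep a local copy as a fallback; a sketch (the paths are examples, not taken from the question):

```js
requirejs.config({
  paths: {
    // Try the CDN first; if it is unreachable, RequireJS falls back to the local copy.
    jquery: [
      'https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min',
      'lib/jquery.min'
    ]
  }
});

require(['jquery'], function ($) {
  // application code goes here
});
```

The r.js optimizer also lets you map CDN-hosted modules to empty: in the build config, so they are not inlined into your compressed bundle while the rest of your code is concatenated into one file.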
Is it really possible to achieve grade "A" in YSlow across the board for a dynamic, CMS-based (PHP/ASP.NET) website, using the same server?
[screenshot omitted; source: haacked.com]
http://developer.yahoo.com/yslow/help/index.html#performance_view
Yes, I guess it is possible to achieve this on one server, except of course for the CDN part, which relies on an external service. You'll probably need full control over your server to configure things like ETags.
I think it's rarely worth the effort to fulfil all of this literally down to the last percent, unless you're a huge site like Google or Yahoo themselves, where every saved byte can mean tens or hundreds of thousands in savings. Just get a decent grade so things work fast and reliably, much like in school :)
Yes. First of all, try making all of your JS external and load it on demand, only preloading the components that you really need. Then monitor when, and in what order, each JavaScript file is loaded, and run that through JSBuilder (a JavaScript packaging and compression tool).
Turn on gzip on your server. Gzip compression reduced my static file sizes (CSS, JS, etc.) by 73.43%.
Cache, cache, cache. Anything that doesn't change between application deployments needs a far-future Expires header (see the sketch at the end of this answer).
If you can afford it, serve your files from a cdn. They are distributed networks that make delivering content easier.
Get rid of your cookies, or combine them by encoding their values in JSON, or use a server-side caching service to hold the values and store only the cache key in the cookie. That way you have one cookie instead of hundreds.
Put your CSS at the top, and optimize it by stripping out any unused selectors and properties.
Oh, and consider switching to a thin client... reloading the web page is so 1999. A thin client lets you try different page-download optimization techniques and decouples your view (the web client) from your server and middleware API, allowing you to develop a front end in just about any RIA environment of your choice. You can go extremely lightweight with jQuery, or go with the more robust pre-built UIs of Ext or Dojo.
Reduce the amount of unused HTML. Tables are evil unless absolutely necessary or inserted into the DOM after page load.
I'm sure some of this will require major rework, which your application architecture and developer skill set may not be geared for at this time. The good news is that you can improve the user experience just by caching cookie values server-side (as mentioned above), gzipping your static components, combining and minifying all of your JS, and optimizing your CSS and layout, without restructuring your web application.
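To make the gzip and far-future-expires points concrete, here is a small sketch in Express terms (the stack used elsewhere on this page; the paths and max-age are assumptions, and other servers expose the same settings through their own configuration):

```js
const express = require('express');
const compression = require('compression'); // assumed middleware for gzip

const app = express();
app.use(compression()); // gzip every compressible response

// Far-future caching for static assets; safe as long as file names change when content changes.
app.use('/static', express.static('public', { maxAge: '365d' }));

app.listen(3000);
```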
Sure, why not?
Each item links through to more details on how to achieve a higher grade.
Yes, it is possible.
Here is a worked example of a successful optimisation of a Typo3 installation. Take a look at this Yahoo! page for optimisation goodies.
Is it worth trying to optimise for bandwidth reduction and server responsiveness? Sure!
A lot of people connect through hand-held devices with expensive mobile phone plans. The school system where I live even has limited and expensive broadband plans. Waiting for a website to load is a waste of time.