We are currently trying to optimize page load times on a memory- and CPU-constrained Windows Mobile device (the MC9090: 624 MHz CPU, 64 MB RAM, running WinMo 6 and Opera Mobile 9.5). As one of the potential optimizations, I wanted to combine external JS/CSS files, but we are now observing that page load times actually INCREASE after concatenation.
That flies in the face of everything I have seen and read about client-side performance. Has anyone seen this before? Also, are there different rules for embedded-device web optimization than for desktop browsers?
Browsers can download across multiple parallel connections (4, sometimes 6 per host; I'm not sure what this version of Opera allows). Putting everything in one file forces the browser to download all the JavaScript in a single stream instead of several, which could be slowing it down.
Images and other items also compete for these connections. You really need a tool similar to Firebug's "Net" tab to view the traffic and see how the bandwidth is allocated.
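If you can reproduce the page on a modern desktop browser, the Resource Timing API gives you a scriptable version of that view (it did not exist in Opera Mobile 9.5, so treat this as a diagnostic sketch for a newer browser):

```js
// Log when each resource started and how long it took, so you can see
// whether requests overlapped (parallel) or queued behind each other.
window.addEventListener('load', function () {
  performance.getEntriesByType('resource').forEach(function (entry) {
    console.log(
      entry.name,
      'start:', Math.round(entry.startTime) + 'ms',
      'duration:', Math.round(entry.duration) + 'ms'
    );
  });
});
```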
Also, see: http://loadimpact.com/pageanalyzer.php
Out of date but relevant: http://www.stevesouders.com/blog/2008/03/20/roundup-on-parallel-connections/
Working with AngularJS, it's very easy to run through LOTS of data on the client.
Is there a rule of thumb for how much data I should work with at once? I'm transferring files with a few MB of text data and don't seem to be running into much trouble (granted, I'm not displaying all my records at once).
Is there a point where you should still be working off the server? How much is too much? Is it browser/OS/device dependent?
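For context, a minimal AngularJS sketch of "not displaying all records at once" — keeping the full dataset in memory but paging what the view renders (module, controller, and page size are illustrative):

```js
// Keep the full dataset in memory but expose only one page of it to
// the view, so ng-repeat never renders everything at once.
angular.module('app', []).controller('FeedCtrl', function ($scope) {
  var PAGE_SIZE = 50;
  $scope.allItems = [];   // the few-MB dataset lives here
  $scope.visible = [];

  $scope.showPage = function (page) {
    var start = page * PAGE_SIZE;
    $scope.visible = $scope.allItems.slice(start, start + PAGE_SIZE);
  };
});
```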
I think this strongly depends on which browsers you are targeting. If you are targeting IE8, you will have nowhere near the same performance as a user on the latest Chrome release.
As a rule of thumb, I want the experience (speed, smoothness, usability, etc.) on sites I develop to be the same across all browsers and versions. To do this, I do most of the heavy lifting on the server and send down mostly pre-packaged, pre-parsed data sets for the client to display.
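A minimal sketch of that approach with AngularJS's $http, assuming a hypothetical endpoint that returns data already filtered, sorted, and paginated on the server:

```js
// The server does the heavy lifting; the client just displays the
// pre-packaged page. The /api/reports endpoint is hypothetical.
angular.module('app', []).controller('ReportCtrl', function ($scope, $http) {
  $http.get('/api/reports', { params: { page: 1, pageSize: 50 } })
    .success(function (data) {
      $scope.rows = data.rows;   // already parsed and sorted server-side
      $scope.total = data.total;
    });
});
```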
You should test your application across the full spectrum of browsers you support; you will find areas where each browser excels and where it falls behind its rivals, and you can then tweak accordingly.
However, some would argue that if you are targeting the latest browsers, you should push processing to the client to reduce CPU cycles on the server. I would agree, provided you can ensure your users are on the latest browsers, on fairly decent machines, with good internet connections.
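By contrast, a sketch of pushing the work to the client, inside a controller like the one above — fetch the raw data once and let the browser's CPU do the filtering and sorting (the endpoint and record fields are illustrative):

```js
// The client burns its own CPU cycles instead of the server's.
$http.get('/api/reports/raw').success(function (data) {
  $scope.rows = data.rows
    .filter(function (r) { return r.active; })
    .sort(function (a, b) { return b.score - a.score; });
});
```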
I've discovered a pretty cool feature in Opera that changes how it fetches website data: it shows lower-resolution images and the like, and the page loads fast. This is great for slow connections.
I'm interested in the background of this little feature; with my basic knowledge of CSS, HTML, and JavaScript, I don't understand how this can be done. Can anyone explain how it works?
Take images, for example: the browser needs to download the image first and then convert it to a lower-resolution one, so where do we "win" time here? The image still has to be downloaded, right?
Sad to say, it's non-trivial to achieve what you are trying to do. Take a look at how Opera describes Opera Turbo:
How we squeeze out all that speed
When Opera Turbo is enabled, webpages are compressed via Opera's servers so that they use much less data than the originals. This means that there is less to download, so you can see your webpages more quickly.
Enabling Opera Turbo is as simple as clicking the Opera Turbo icon at the bottom-left of the Opera browser window. When you are on a fast connection again and Opera Turbo is not needed, the Opera browser will automatically disable it.
Your best bet is to follow up on How do I check connection type (WiFi/LAN/WWAN) using HTML5/JavaScript? and load your images according to the connection type, but be aware that connection type alone cannot accurately tell you the network speed: a device can be on 3G or LTE and still get poor speeds from its provider.
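For completeness, the Network Information API exposes a connection object in some browsers; support is patchy and the property names have changed over time, so feature-detect before trusting it:

```js
// Feature-detect the Network Information API (prefixed in older builds).
var conn = navigator.connection ||
           navigator.mozConnection ||
           navigator.webkitConnection;
if (conn) {
  // 'effectiveType' is e.g. '4g', '3g', '2g' in newer implementations;
  // older ones exposed a coarser 'type' instead.
  console.log('connection type:', conn.effectiveType || conn.type);
} else {
  console.log('Network Information API not available');
}
```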
If you really want to implement this feature and have it work safely across browsers and devices, I can suggest lazy-load plugins like Unveil, which help with the amount of data fetched on load. Or include a button within your page that lets the user select a low-bandwidth option, something like what Gmail does.
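If you'd rather not pull in a plugin, the core of what Unveil-style lazy loaders do is small: keep the real URL in a data attribute and swap it in when the image nears the viewport. A bare-bones sketch (the data-src convention and look-ahead distance are illustrative):

```js
// Markup assumed: <img data-src="photo-large.jpg" src="placeholder.gif">
function lazyLoad() {
  var images = document.querySelectorAll('img[data-src]');
  for (var i = 0; i < images.length; i++) {
    var img = images[i];
    var rect = img.getBoundingClientRect();
    if (rect.top < window.innerHeight + 200) {  // 200px look-ahead
      img.src = img.getAttribute('data-src');   // trigger the real download
      img.removeAttribute('data-src');
    }
  }
}
window.addEventListener('scroll', lazyLoad);
window.addEventListener('load', lazyLoad);
```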
Turbo mode is a really great feature of Opera.
In short, Opera's servers fetch the page's code and images, compress them, and send the compressed version to the user; the browser then decompresses the data on the fly while answering the user's request. Turbo mode is a big traffic saver (up to 80%). Admittedly, images in Turbo mode look almost useless, but the mode really comes in handy on extremely slow connections.
You can check out the official documentation for more info. Also, check out my old post where I wrote about Turbo mode; there you can find more info and useful links on this topic.
Additionally, look at the info under the opera-turbo tag here on Stack Overflow.
I have an app in which the user uploads images from their computer and then draws them on the canvas.
In Chrome and Firefox I'm using FileReader. But if the user uploads a very large image, it doesn't load properly, or doesn't load at all and can't be drawn onto the canvas.
I have tried accessing the same file directly from the computer and it works fine. So is there any way to increase the browser's memory so that large images load properly?
Or is there some other problem?
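For reference, a minimal sketch of the FileReader-to-canvas path in question, with an illustrative downscaling step that can reduce memory pressure for very large images (the size cap is an assumption, not a browser requirement):

```js
// Read the uploaded file, then draw a downscaled copy onto the canvas
// so the full-resolution bitmap can be garbage-collected sooner.
function drawUpload(file, targetCanvas) {
  var reader = new FileReader();
  reader.onload = function (e) {
    var img = new Image();
    img.onload = function () {
      var MAX = 1024;  // illustrative cap on the longest edge
      var scale = Math.min(1, MAX / Math.max(img.width, img.height));
      targetCanvas.width = img.width * scale;
      targetCanvas.height = img.height * scale;
      targetCanvas.getContext('2d')
        .drawImage(img, 0, 0, targetCanvas.width, targetCanvas.height);
    };
    img.src = e.target.result;  // data URL produced by the reader
  };
  reader.readAsDataURL(file);
}
```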
In my experience you can reliably bet on a 5 MB minimum for the platforms you mention above. Keep your data below that level and you should be pretty safe.
Read this article: http://diveintohtml5.ep.io/storage.html. It has some nice nuggets of info, but it's not all accurate, especially the part that says you can't raise the limit.
I know for a fact that on iPhone, once you reach the limit, the phone will ask the user if they want to allow more space. (So that part is sort of accurate, but not entirely.)
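You can't raise the limit from script, but you can at least fail gracefully when a write hits it — the iPhone prompt mentioned above fires around this same write (a minimal sketch):

```js
// Writing past the quota throws (commonly QuotaExceededError /
// QUOTA_EXCEEDED_ERR), so wrap storage writes in try/catch.
function saveSafely(key, value) {
  try {
    localStorage.setItem(key, value);
    return true;
  } catch (e) {
    // Quota exceeded: evict something or fall back to the server.
    return false;
  }
}
```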
On Android the heap memory limit is set at 12 MB; I'm not sure about the other platforms. Since you are going to be running in some kind of web container (WebKit or other), I wouldn't worry too much about it. The containers themselves are pretty good at managing memory and implementing file caches to minimize their footprint.
I recommend you leave the memory optimizations for last. Who knows, you might not even need them. Don't optimize prematurely.
PS: Look at PhoneGap: http://phonegap.com/
As a web developer I also have to take the Android and iOS web browsers into account. The rendering engines of these browsers, and the devices' lack of power and memory, bring a lot of complications.
So I was wondering: is there a comprehensive guide on performance tuning (HTML/CSS/JavaScript) for these browsers?
I haven't found an actual guide focused on mobile development yet. However, my practice is to do everything you'd do for a desktop browser and put extra effort into the following:
maximize use of caching, via a CDN, ETags, proper expiry dates, etc.
minimize reflows/repaints; they are CPU intensive (see the sketch after this list)
optimize images aggressively to minimize download size.
minimize the number of server round-trips and included JS/CSS files, since most mobiles are used on 3G/4G and other wireless networks, which tend to have higher latencies than wired broadband (cable/DSL)
do not use animated GIFs, 3D transforms, etc.
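On the reflow point, the main trick is to batch DOM reads and writes instead of interleaving them; a minimal sketch (the .row selector is illustrative):

```js
// Interleaving reads (offsetHeight) and writes (style changes) forces
// a reflow on every iteration. Read everything first, then write.
var items = document.querySelectorAll('.row');
var heights = [];
for (var i = 0; i < items.length; i++) {
  heights.push(items[i].offsetHeight);        // reads, batched together
}
for (var j = 0; j < items.length; j++) {
  items[j].style.height = heights[j] + 'px';  // writes, batched together
}
```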
Here is some reading material: https://developers.google.com/speed/docs/best-practices/rules_intro
I'm currently working on a web app and have been inspired by a couple of apps out there (mainly Cloud9 IDE) in how they hold a large majority of their interface state in JavaScript objects. This makes it incredibly easy to add features, and also allows for extensibility in the future.
The question is: at what point does storing data in memory (via JavaScript) become rude? I'm building a social network (think Twitter), and essentially I would be storing an object for every "tweet", as well as some broader objects for interface items.
Are there hard limits enforced by browsers on how much memory I can use? Will my website crash if I go over, or will the entire browser crash? Will it slow down the user? If so, is there a general rule for how much memory will bother the average user?
Absolutely, positively don't use anywhere close to 4 GB of memory. Most people use 32-bit browsers, so the browser couldn't address 4 GB anyway :)
On a more practical note, remember that the more memory you take up, the slower your app will usually run. Today's Intel/AMD processors (I don't know about ARM) access registers about 100 times faster than memory that isn't in cache, so if you use a lot of memory you will cause cache thrashing, which will slow down your application dramatically.
So, assuming that you want users for your social network, you should design your website to work well on as many machines as possible. Millions and millions of people are still using Windows XP machines that are 5+ years old. These machines might have as little as 512 MB of RAM, so if your site uses a few hundred megabytes, you can thrash all of memory rather than just the processor cache, as the kernel keeps swapping out pages you want to use. As a rule of thumb, I would recommend staying below 150-200 MB of memory. Gmail takes up ~100 MB of memory in Chrome for Linux, so keeping up with Gmail is a reasonable goal.
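If you want to see where you stand against a budget like that, Chrome exposes a non-standard performance.memory object (Chrome-only and not part of any spec, so strictly a development aid):

```js
// Non-standard, Chrome-only: a rough view of the JS heap in use.
if (window.performance && performance.memory) {
  var mb = performance.memory.usedJSHeapSize / (1024 * 1024);
  console.log('JS heap in use: ~' + mb.toFixed(1) + ' MB');
}
```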
Another benefit of keeping memory usage relatively low is that your users can more easily view your site on a smartphone. An iPhone 3GS (there are still a lot of them in use) has only 256 MB of RAM, so staying below 200 MB makes it easier for a smartphone user to load your site without having to kill processes indiscriminately.