I've discovered a pretty cool feature in Opera that changes how it fetches website data: it shows lower-resolution images and things like that, and pages load fast. This is great for slow connections.
I'm interested in the background of this little feature; with my basic knowledge of CSS, HTML and JavaScript, I don't understand how it can be done. Can anyone explain how it works?
I mean, take images, for example: the image still needs to be downloaded before it can be converted to a lower-resolution one, so where do we "win" time here? The image is still being downloaded, right?
Sad to say, it's non-trivial for you to achieve what you are trying to do yourself. Take a look at how Opera Turbo describes it:
How we squeeze out all that speed
When Opera Turbo is enabled, webpages are compressed via Opera's servers so that they use much less data than the originals. This means that there is less to download, so you can see your webpages more quickly.
Enabling Opera Turbo is as simple as clicking the Opera Turbo icon at the bottom-left of the Opera browser window. When you are on a fast connection again and Opera Turbo is not needed, the Opera browser will automatically disable it.
Your best bet is to follow up on How do I check connection type (WiFi/LAN/WWAN) using HTML5/JavaScript? and load your images according to the connection type, but be aware that connection type alone cannot accurately tell you the user's network speed. A device can be on 3G or LTE and still get crappy speeds from its provider.
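As a rough sketch of that approach (the Network Information API is not supported everywhere, and the data-src-low attribute here is a made-up convention, not a standard):

// Pick low-resolution image sources on slow connections.
// navigator.connection is the Network Information API; support varies.
const connection = navigator.connection
  || navigator.mozConnection
  || navigator.webkitConnection;

const isSlow = connection
  && (connection.saveData                           // user asked for reduced data
      || /(^|-)2g$/.test(connection.effectiveType)); // "slow-2g" or "2g"

if (isSlow) {
  // "data-src-low" is a hypothetical attribute you would add to your markup
  document.querySelectorAll("img[data-src-low]").forEach((img) => {
    img.src = img.dataset.srcLow;
  });
}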
If you really want to implement this feature and have it work safely across browsers and devices, I can suggest lazy-load plugins like Unveil, which reduce the amount of data fetched on load. Or include a button within your page that allows the user to select a low-bandwidth option, something like what Gmail does.
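For the lazy-loading route, here is a minimal plain-JS sketch of the same idea Unveil implements, using the standard IntersectionObserver API (the data-src attribute is a common convention, nothing magical):

// Start downloading each image only once it scrolls into view.
const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (!entry.isIntersecting) return;
    const img = entry.target;
    img.src = img.dataset.src;  // trigger the real download now
    obs.unobserve(img);         // each image only needs this once
  });
});

document.querySelectorAll("img[data-src]").forEach((img) => observer.observe(img));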
Turbo mode is a really great feature of Opera.
In short, Opera's servers download the complete code and images, compress them, and send them on to the user; the browser then decompresses the data on the fly while answering the user's request. Turbo mode is a big traffic-saver (up to 80%). Granted, images in Turbo mode are almost useless, but the mode really comes in handy when your internet connection is extremely slow.
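To make the mechanism concrete, here is a rough Node.js sketch of the idea behind such a compressing proxy. This is nothing like Opera's actual code, just the shape of it: the proxy fetches the page on the user's behalf, recompresses it, and serves the smaller payload.

const http = require("http");
const zlib = require("zlib");

// Usage: node proxy.js, then request http://localhost:8080/?url=https://example.com
http.createServer(async (req, res) => {
  const target = new URL(req.url, "http://localhost").searchParams.get("url");
  if (!target) {
    res.writeHead(400);
    res.end("missing ?url= parameter");
    return;
  }

  // The proxy, not the slow client, talks to the origin server.
  const upstream = await fetch(target); // global fetch, Node 18+
  const body = Buffer.from(await upstream.arrayBuffer());

  // Recompress before passing it on. A real service would also transcode
  // images to lower resolution, which is where most of the savings come from.
  zlib.gzip(body, (err, compressed) => {
    if (err) {
      res.writeHead(502);
      res.end();
      return;
    }
    res.writeHead(200, { "Content-Encoding": "gzip" });
    res.end(compressed);
  });
}).listen(8080);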
You can check out the official documentation for more info. Also check out my old post where I wrote about Turbo mode; there you can find more details and useful links on this topic.
Additionally, look at the opera-turbo tag here on Stack Overflow.
Related
I tried to take a screenshot of a movie in the Disney+ web app and realised that the video turns black as soon as I try to take a new screenshot with the Snipping Tool. When I tried to do the same thing with OBS and Discord streams, I saw the same effect.
Interestingly, this only happens in Chrome on my machine (I also tried Firefox and Edge, and they just let me record my screen).
When I saw this, I became really curious about how they achieved this.
Does anyone have any idea how I can recreate this for my own web projects?
I became really curious about how they achieved this.
They use Widevine.
Widevine homepage.
https://ottverse.com/widevine-drm-how-does-it-work/
https://en.wikipedia.org/wiki/Widevine
News reports:
https://www.cordcuttersnews.com/sadly-disney-wont-work-on-chromebooks-linux-some-android-devices-because-of-drm/
https://www.tomsguide.com/news/disney-plus-will-work-on-chromebooks
https://www.androidpolice.com/2019/10/22/disney-will-only-work-on-devices-that-support-the-strictest-widevine-l3-drm/
It's also used by Netflix, Hulu and others.
Widevine is Google's DRM system that's baked into Chrome.
All other major browsers have adopted it as well, because no one will use a browser that can't access Netflix.
Mozilla's and Microsoft's implementations are less user-hostile, though, as you noticed: Firefox and Edge don't black out screen captures.
It's just a standard HTML5 <video> element: when the browser downloads the video stream, it sees that it's encrypted with Widevine, which engages the Widevine client-side code that does all the DRM biz.
Though there are HTML and DOM features that facilitate DRM, I'm unsure to what extent any JavaScript is required to use it, as theoretically everything the browser needs to know to load the DRM system should be embedded in the raw media stream.
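For what it's worth, in practice pages do use a bit of JavaScript via the Encrypted Media Extensions (EME) API to ask the browser for a Widevine CDM. A minimal sketch of the first step of that handshake (the codec string is just an example, and this only runs in a secure context):

// Ask the browser whether the Widevine key system is available.
const config = [{
  initDataTypes: ["cenc"],
  videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
}];

navigator.requestMediaKeySystemAccess("com.widevine.alpha", config)
  .then((access) => access.createMediaKeys())
  .then((mediaKeys) => document.querySelector("video").setMediaKeys(mediaKeys))
  .then(() => console.log("Widevine CDM attached"))
  .catch(() => console.log("Widevine not available in this browser"));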
On Windows, I understand (though unconfirmed) that Widevine makes use of SetWindowDisplayAffinity to block screenshots.
Nothing stops you from doing this in your own native code (e.g. if you had an Electron fork), but please don't, because it's a real dick move towards your users, in addition to not working at all if the user has the DWM disabled (e.g. they're running Windows 7 with Aero disabled).
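In fact, stock Electron (no fork needed) exposes the same OS facility as BrowserWindow.setContentProtection(), which on Windows is documented to use SetWindowDisplayAffinity. A sketch, purely for illustration (the URL is a placeholder):

const { app, BrowserWindow } = require("electron");

app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 1280, height: 720 });
  // Ask the OS to exclude this window from screenshots and screen capture.
  // (Again: please don't do this to your users.)
  win.setContentProtection(true);
  win.loadURL("https://example.com/player");
});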
Does anyone have any idea how I can recreate this for my own web projects?
You'll need to license Widevine yourself. This is a complicated process intended only for large media production companies and content rightsholders, not individuals or small businesses.
Anyway, even if you could, please don't. Why would you want to make it harder for users to share and appreciate your media? Just stick it up on YouTube instead.
I have an ASP.NET MVC application that makes pretty heavy use of JavaScript and jQuery for both administrative functions and customer-facing functions. Recently I reorganized the administrative screens to more cleanly fit administrative controls for some new features.
I tested using IE and Chrome and found that there was a slight, but acceptable hang in one of the busier pages. However, the main person who uses the admin pages uses Firefox and kept reporting an unacceptable hang. I finally checked it out and found that what hangs in Chrome and IE for 2-3 seconds hangs in Firefox for 10-12 seconds, which is no good.
Not knowing where to turn, I wound up installing Glimpse and got it configured and running just fine, but I'm still having trouble figuring out how to drill into it to find out what area of the page is causing trouble. All I can tell so far is that it is definitely something with how the client (Firefox) is rendering. To be clear, it happens on all browsers, but for some reason it is way more pronounced in Firefox.
Can someone please give me some pointers on how to get started on diagnosing the issue? I'm not married to the idea of using Glimpse, but it seems like a pretty decent tool from what I can tell.
Thanks for your help.
Based on what you're describing, the problem appears to be client-side. That said, Glimpse may not be as well suited here as Firefox's own profiler.
SHIFT+F5 will bring up the web developer performance screen. From there, you can begin/end a performance analysis and gain more insight into what may be taking longer than expected.
It may also be worthwhile to look at the network tab and make sure assets are loading in a timely manner.
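If the profiler alone doesn't pinpoint the slow spot, you can bracket suspect sections of your own script with the User Timing API; the measures show up in the console and alongside the profiler's recordings. (renderAdminGrid here is a hypothetical stand-in for whatever your busy admin page does.)

performance.mark("grid-start");
renderAdminGrid();                 // the code you suspect of causing the hang
performance.mark("grid-end");
performance.measure("admin-grid", "grid-start", "grid-end");

const [measure] = performance.getEntriesByName("admin-grid");
console.log("admin grid took " + measure.duration.toFixed(1) + " ms");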
Keep in mind as well that add-ons could play into the latency. If the end user has a setup that performs post-page processing (such as Greasemonkey scripts, or, recalling an earlier add-on, a Skype plugin that used to transform phone numbers on the page into direct-dial links), that would also play a part in the performance. A good way to rule these out is to hold down SHIFT while starting Firefox (effectively running it in Safe Mode), which will tell you whether it's Firefox itself or an add-on that's to blame.
We are currently trying to optimize page load times on a memory- and CPU-constrained Windows Mobile device (the MC9090: 624 MHz CPU, 64 MB of RAM, running WinMo 6 and Opera Mobile 9.5). As one of the potential optimizations, I wanted to combine external JS/CSS files, but now we are observing that page load times actually INCREASE after concatenation.
That flies in the face of everything I have seen and read about client-side performance. Has anyone ever seen this before? Also, are there different rules for embedded-device web optimization than for desktop browsers?
Browsers can download across multiple connections (4, sometimes 6; not sure what this version of Opera allows). By putting everything in one file, you force the browser to download all the JavaScript in one stream instead of several, which could be slowing it down.
Images and other items also compete for these connections. You really need a tool similar to Firebug's "net" tab to view the traffic and see how the bandwidth resources are allocated.
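To illustrate, serving the scripts from separate hostnames ("domain sharding") was the era-typical way to get more parallel connections, at the cost of extra DNS lookups. The hostnames here are hypothetical:

<!-- One big file: a single connection does all the work. -->
<script src="http://static.example.com/all.js"></script>

<!-- Split and sharded: the browser can fetch these in parallel. -->
<script src="http://static1.example.com/core.js"></script>
<script src="http://static2.example.com/admin.js"></script>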
Also, see: http://loadimpact.com/pageanalyzer.php
Out of date, but relevant: http://www.stevesouders.com/blog/2008/03/20/roundup-on-parallel-connections/
This is an often-used piece of HTML on websites.
<noscript>
Please enable JavaScript or use a JavaScript capable device to get the maximum benefit of this site.
</noscript>
I want to link it to some directions, or similar, on how to enable JavaScript. I don't want to make my own list, as it would require me to keep it updated.
I have found the Google link before, which was pretty good, but I was wondering if there is any de facto link that developers use to give users step-by-step instructions on how to enable JavaScript.
I realise that most people with it off probably do know how to re-enable it; I just thought that, for completeness, a link couldn't hurt (maybe their more web-savvy brother disabled it on a shared computer).
I looked around a bit and found http://www.enable-javascript.com/.
It seems to be a bit more up to date (i.e. it includes Chrome, etc.) and has screenshots as well, for those who prefer the visual route.
Also, it doesn't seem to have a lot of pesky ads. Hope it helps!
Caveat: I must add that I have never used this before and am not sure how frequently it will be updated, but it looks promising!
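So the snippet from the question could become something like:

<noscript>
    Please <a href="https://www.enable-javascript.com/">enable JavaScript</a>
    or use a JavaScript-capable device to get the maximum benefit of this site.
</noscript>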
Is this still a major concern in 2010? In my experience, people who see the <noscript> content have either:
disabled JS themselves, and therefore would know how to enable it (e.g. NoScript users)
or don't have JS capabilities (e.g. text-only or low-end mobile browsers)
Beyond that, the browser landscape is varied enough that it's rather hard to keep up with the various browser versions and their JS settings.
I'd say "just display 'this works better with JS' and degrade gracefully".
I have found activatejavascript.org. It is a little outdated, e.g. it doesn't provide instructions for Google Chrome.
I am also wary of sites that look like just a quick-and-dirty platform for advertisements (I got that impression here).
There is probably something better.
Why don't you do it yourself?
It only takes a little browser sniffing and browser screenshots.
Recently I have been having issues with Firefox 3 on Ubuntu Hardy Heron.
I will click on a link and it will hang for a while. I don't know if it's a bug in Firefox 3 or a page running too much client-side JavaScript, but I would like to try to debug it a bit.
So, my question is "is there a way to have some kind of process explorer, or task manager sort of thing for Firefox 3?"
I would like to be able to see what percentage of my processor each tab is using via the JavaScript on that page (or anything else in the page that is causing CPU/memory usage).
Does anybody know of a plugin that does this, or something similar? Has anyone else done this kind of inspection another way?
I know about Firebug, but I can't imagine how I would use it to pinpoint which tab is using a lot of resources.
Any suggestions or insights?
It's probably the awesome Firefox 3 fsync "bug", which is a giant pile of fail.
In summary:
Firefox 3 saves its bookmarks and history in an SQLite database.
Every time you load a page, it writes to this database several times.
SQLite cares deeply that you don't lose your bookmarks, so each time it writes, it instructs the kernel to flush its database file to disk and ensure that it's fully written.
Many variants of Linux, when told to flush like that, flush EVERY file. This may take up to a minute or more if you have background tasks doing any kind of disk-intensive work.
The kernel makes Firefox wait while this flush happens, which locks up the UI.
So, my question is, is there a way to have some kind of process explorer, or task manager sort of thing for Firefox 3?
Because of the way Firefox is built this is not possible at the moment. But the new Internet Explorer 8 Beta 2 and the just announced Google Chrome browser are heading in that direction, so I suppose Firefox will be heading there too.
Here is a post (Google Chrome Process Manager) on the subject by John Resig, of Mozilla and jQuery fame.
There's a thorough discussion of this that explains all of the fsync-related problems that affected pre-3.0 versions of FF. In general, I have not seen the behaviour since then either, and really it shouldn't be a problem at all if your system isn't also doing IO-intensive tasks. Firebug/Venkman make for nice debuggers, but they would be painful for figuring out these kinds of problems in someone else's code, IMO.
I also wish that there was an easy way to look at CPU utilization in Firefox by tab, though, as I often find myself with FF eating 100% CPU, but no clue which part is causing the problem.
XUL Profiler is an awesome extension that can point out extensions and client side JS gone bananas CPU-wise. It does not work on a per-tab basis, but per-script (or so). You can normally relate those .js scripts to your tabs or extensions by hand.
It is also worth mentioning that Google Chrome has built-in a really good task manager that gives memory and CPU usage per tab, extension and plugin.
[XUL Profiler] is a JavaScript profiler. It shows elapsed time in each method as a graph, as well as browser canvas zone redraws, to help track down CPU-consuming chunks of code. It traces all JS calls and paint events in XUL and page contexts, and builds an animation dynamically showing the canvas zones being redrawn.
As of FF 3.6.10 it is not up to date, in the sense that it is no longer marked as compatible. But it still works, and you can override the incompatibility with the equally awesome MR Tech Toolkit extension.
There's no "process explorer" kind of tool for Firefox; but there's https://developer.mozilla.org/en-US/docs/Archive/Mozilla/Venkman with profiling mode, which you could use to see the time spent by chrome (meaning non-content, that is not web-page) scripts.
From what I've read about it, DTrace might also be useful for this sort of thing, but it requires creating a custom build and possibly adding additional probes to the source. I haven't played with it myself yet.