javascript RAM memory usage [duplicate]

This question already has answers here:
jQuery or javascript to find memory usage of page
(10 answers)
Closed 8 years ago.
I have a website that can upload images.
I crop each image on the client side before uploading; then, on the server side, a new optimized version is created.
On mobile devices, when there is no free RAM, this fails.
How can I get the RAM usage from JavaScript, so that I can skip the CROP if there is not enough memory?
I am looking for a JavaScript-only solution!
PLEASE note: I do NOT have a memory leak!
If I open many apps and not much RAM is left, my strategy stops working.
I need JavaScript CODE that gets the free RAM; if it is below some amount, I skip the CROP.
------------ OK, defining "fails": --------------
People take photos on mobile devices and upload them...
I perform the CROP from JavaScript:
1. an image of around 2 MB goes down to 300 KB
2. I upload only the 300 KB; then, on the server side, 300 KB --> 30 KB, which is what I store
If there is no RAM, this FAILs.
I do not want to tell the user "try again";
I would like to skip the CROP instead.
Thank you very much for the comments.
I do handle the errors, but I would like to avoid making the client wait 40-60 seconds only to see an error message.
If I go with NO CROP it works, but I'd be giving up nearly 1.7 MB of bandwidth savings per image... GREEDY :-)
window.performance looks good, I will use it, thanks.
I will do some research on measuring a round trip from the server side, and whether that can work for mobile devices.

In Development
Use Chrome's DevTools for pretty comprehensive diagnostics. You can get JavaScript run-time diagnostics, request information, and basically anything you might need.
Client-side Testing
As far as testing how much RAM is available from your code itself, there isn't really a "correct" or "recommended" strategy (source). Obviously the best solution is to just optimize your code; varying your site's display/interactions based on how many other apps a client is running could confuse the user (e.g. they expect some tool to be displayed and it never is; they think the page isn't loading properly, so they leave; etc.).
Some highlights from the source above:
"Counting DOM elements or document size might be a good estimation, but it could be quite inaccurate since it wouldn't include event binding, data(), plugins, and other in-memory data structures."
You'll need to monitor both the DOM and the memory you're using for your code for accurate results.
You could try using window.performance (source), but this isn't well-supported across different browsers.
Note: As Danny mentions in a comment below, showing an advisory message when a feature is disabled could clear up some confusion, but then why use the resources on a feature that is so optional that you could just not use it? Just my two cents... :)
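For completeness, here is a minimal sketch of such a check, assuming the non-standard Chromium-only performance.memory and navigator.deviceMemory APIs (the 50 MB headroom threshold is arbitrary; most other browsers expose neither, in which case this just falls back to attempting the crop):

function shouldCropClientSide() {
  // navigator.deviceMemory reports approximate device RAM in GiB (Chromium only).
  if (navigator.deviceMemory !== undefined && navigator.deviceMemory < 1) {
    return false; // low-memory device: skip the crop
  }
  // performance.memory exposes JS heap statistics in Chrome; absent elsewhere.
  var mem = window.performance && performance.memory;
  if (mem) {
    var headroom = mem.jsHeapSizeLimit - mem.usedJSHeapSize;
    return headroom > 50 * 1024 * 1024; // require ~50 MB of heap headroom
  }
  return true; // no signal available: attempt the crop and handle failures
}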


Client-side vs. server-side templating (which one?)

I've been reading some very interesting articles about the whole client vs. server rendering lately.
http://www.onebigfluke.com/2015/01/experimentally-verified-why-client-side.html
http://www.quirksmode.org/blog/archives/2015/01/angular_and_tem.html
http://tomdale.net/2015/02/youre-missing-the-point-of-server-side-rendered-javascript-apps/
Now I've been a bit of a fanboy when it comes to the client side, but after reading these articles some points started to stack up in favor of server-side rendering, much to my surprise... The main points were:
1) You can upgrade your server, but not your users' devices - This means, well, yes... you are in control of the server, so if it's underperforming you may opt to upgrade/scale. You can't force users to upgrade their devices.
2) First paint vs. last paint - The experimentally verified... link above shows when users first see the page (first paint) and when they can fully use the page (last paint). From what I can tell, once the user sees the page, it takes the brain some time to process the signals from the visual cortex to the frontal cortex and then to the premotor cortex, where the user actually starts clicking. That is, of course, only if the HTML is rendered first, so the brain has something to process while loading happens in the background (js files, binding, etc.).
What really got me was the bit where Twitter reported loading times of up to 10 seconds for client-side rendering; no one should ever experience that! It's kind of like saying, "Well, if you don't have a good enough device, sorry, you'll just have to wait."
I've been thinking: isn't there a good way of combining client-side and server-side templating, where both client and server use the same template engine and code? In that case, the only question is whether it's more beneficial to supply the client with the rendered page or to let the client render it itself.
In any case, share your thoughts on my sayings and the articles if you want. I'm all ears!
UPD: do it only if you really need it
(4 years and 2 isomorphic enterprise apps later)
If you're required to do SSR, fine. If you can go with a simple SPA - go with it.
Why so? SPAs are easier to develop, easier to debug and cheaper and easier to run.
The development and debugging complications are evident. What do I mean by "cheaper and easier to run", though? Well, guess what: if 10K users try to open your static HTML website (i.e. a built SPA) at the same time, you won't even feel it. If you're running an isomorphic webapp, though, the TTFB will go up, RAM usage will go up, and eventually you'll have to run a cluster of those.
So, unless you are required to show some super-low TTFB times (which will likely come through aggressive caching), don't overcomplicate your life.
Original answer from 2015:
Basically you're looking for an isomorphic web app that shares the same code for frontend and backend.
Isomorphic JavaScript
JavaScript applications which run both client-side and server-side. Isomorphic JavaScript frameworks are the next step in the evolution of JavaScript frameworks. These new libraries and frameworks are solving the problems associated with traditional JavaScript frameworks.
I bet this guy explains it much better than me.
So, when a user comes to the page, the server renders the full page with contents. So it loads faster and requires no extra ajax requests to load data, etc. Then, when a user navigates to another page, the usual techniques for single page applications are used.
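As a rough illustration of the idea (this is not any particular framework's API; the file layout, names, and endpoint below are made up), the same template function can be used by Node on the first request and by the browser afterwards:

// shared/template.js - one template, used on both sides
function renderItem(item) {
  return '<li class="item">' + item.name + '</li>';
}
if (typeof module !== 'undefined') module.exports = { renderItem };

// server.js - first request: the server sends fully rendered HTML
var express = require('express');
var renderItem = require('./shared/template').renderItem;
var app = express();
app.get('/', function (req, res) {
  var items = [{ name: 'one' }, { name: 'two' }]; // would come from a database
  res.send('<ul id="list">' + items.map(renderItem).join('') + '</ul>');
});
app.listen(3000);

// client.js - subsequent navigation: the client renders with the same template
// fetch('/api/items').then(function (r) { return r.json(); }).then(function (items) {
//   document.getElementById('list').innerHTML = items.map(renderItem).join('');
// });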
So, WHY WOULD I CARE?
Old browsers / Weak devices / Disabled Javascript
SEO
Some page load improvements
Old browsers / Weak devices / Disabled Javascript
For example, IE9 does not support the History API. So in old browsers (and if the user disables JavaScript), navigation would simply happen through full page loads, just like it did in the good old days.
SEO
Google says it supports SPAs, but SPAs aren't likely to appear in the top results of a Google search, are they?
Page speed
As it was stated, the first page loads with one HTTP request, and that's all.
OK, so
There are lots of articles on that:
http://nerds.airbnb.com/isomorphic-javascript-future-web-apps/
http://www.sitepoint.com/isomorphic-javascript-applications/
https://www.lullabot.com/articles/what-is-an-isomorphic-application
But SHOULD I CARE?
It's up to you, of course.
Yeah, that's cool, but it takes a lot of work to rewrite/adapt an existing app. And if your backend is in PHP/Ruby/Python/Java/Whatever, I've got bad news for you (it's not necessarily impossible, but close to it).
It depends on the website: you can try to collect some stats, and if the percentage of users with old devices is small, it's not worth the trouble. So why not...
LET THEM SUFFER
If you care only about users with old devices, then c'mon, it's 2015, and it's your user's problem if they're using IE8 or browsing websites on an iPod Touch 2. For example, Angular dropped IE8 support in 1.3 approximately a year ago, so why wouldn't you just alert the users that they need to upgrade ;)
Cheers!
All of the conversations on this topic miss one point: bytes sent to the client. Pages rendered as HTML on the server are a lot smaller. Fewer bytes transmitted is better for everyone, both server and client. I've seen the bandwidth costs on cloud sites, and even a 10% reduction can be a huge saving. Client-side JS pages are always fat.

Client-side image resizing. Any known issues?

I would like to start resizing images on the client side to avoid running into any memory issues on my server. I found what looks like a good example at http://www.shift8creative.com/projects/agile-uploader/index.html
Can anyone think of any issues that might occur from doing the resizing on the client side (not including the dangers of uploading files to a server)?
For me it seems like the perfect solution but I thought I would check to see if anyone has any thoughts on the matter first.
Any thoughts are appreciated.
Thanks,
cs1h
There are several downsides to the client side:
You have to support multiple browsers
Therefore you don't have a controlled environment and might not support all of them
Due to the two points above, the results may differ (different scaling algorithms)
Users can disable JavaScript
But the fact that you'll save loads of bandwidth, which is especially good for mobile users, might mitigate the downsides.
No issues that I know of (except for browser support). The server of course doesn't care what data you send to it or where that data comes from. If the client is capable of dealing with binary data, why not?
Of course you still need to leave server-side resizing functionality as a backup solution for clients that can't do it on their own or have JavaScript disabled altogether.
And you still need to perform all the necessary checks on the server (make sure the uploaded file is an image and does not exceed file size and/or dimension limits), regardless of any client-side logic; that's the golden rule.
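For what it's worth, in browsers with canvas support the client-side resize itself can look roughly like this minimal sketch (the input id, the 1024px cap, the JPEG quality, and the /upload endpoint are all illustrative, and browsers without canvas.toBlob would need a fallback):

document.getElementById('photo').addEventListener('change', function (e) {
  var file = e.target.files[0];
  var img = new Image();
  img.onload = function () {
    var scale = Math.min(1, 1024 / img.width); // cap the width at 1024px
    var canvas = document.createElement('canvas');
    canvas.width = Math.round(img.width * scale);
    canvas.height = Math.round(img.height * scale);
    canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
    canvas.toBlob(function (blob) {
      var form = new FormData();
      form.append('image', blob, file.name);
      fetch('/upload', { method: 'POST', body: form }); // hypothetical endpoint
    }, 'image/jpeg', 0.8); // 80% JPEG quality
    URL.revokeObjectURL(img.src);
  };
  img.src = URL.createObjectURL(file);
});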
Well, if the user has JavaScript disabled, your plan fails. Not to mention the security vulnerability you already mentioned.

Pushing javascript too hard with Script# and javascript?

So I have been playing around with a home project that includes a lot of JS. I have been using Script# to write my own library, etc. Personally, I wouldn't write a lot of JS if I didn't have a tool like Script# or GWT to help maintain it.
So far it includes these external libraries:
– ASP.NET AJAX
– ExtJS
– Google Maps
– Google Visualisations
– My own library to wrap the above libraries and add extra functionality...
So that works out to be a heap of JS. It runs fine on my PC. However, I have little faith in JS/browsers, and I am concerned that loading too much JS will cause the browser to die or perform poorly.
Is this a valid concern?
Does anyone have any experience with loading a lot of JS into the browser that resulted in performance issues? I know there are a lot of variables here, for example the browser type (I assume IE is worse than others), the client PC's RAM, etc., but it would be good to hear other people's experiences. I would hate to invest a lot of time into JS only to find that I am painting myself into a corner.
The more I use Script#, the more client classes I have, as I move more processing onto the client. At what point would this start becoming an issue? I'm sure the browser could easily handle 100 MS Ajax classes, but what would be too far for a browser?
NOTE: I am not concerned about the actual js file sizes but more the runtime environment that gets loaded.
There is nothing wrong with having a large number of JS files or big JS files; the project I am currently working on has more than 60 core framework libraries, and each of its 30 modules has an average of 5 to 6 JS files.
So the only concern is how you design your website to make use of JS best practices and optimization techniques, like:
Minify the JS using YUI Compressor or other compression libraries to address the download size issues.
Enable proper caching in your webserver to reduce the file downloads.
Put your JavaScript at the bottom of the page, or make it a separate file.
Make your AJAX responses cacheable.
And finally, design your page to handle on-demand script loading (sketched at the end of this answer).
- Microsoft Doloto is a good example of this. Download it here.
And check out High Performance Web Sites and the more recent Even Faster Web Sites by Steve Souders. They're a must-read for web developers; these books address all the common problems web developers face today.
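A minimal sketch of that on-demand loading, using the common script-injection technique (the URL and the initMap callback are placeholders):

function loadScript(url, callback) {
  // Injecting a <script> element downloads it without blocking rendering.
  var script = document.createElement('script');
  script.src = url;
  script.async = true;
  script.onload = callback; // fires once the script has executed
  document.head.appendChild(script);
}

// Usage: pull in a heavy library only when it is actually needed.
loadScript('/js/maps-bundle.js', function () {
  initMap(); // whatever the lazy-loaded file defines
});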
With modern browsers routinely occupying 250 MB of RAM or more, plus script caching and optimized JavaScript engines, keeping the script library resident would probably add negligible load in most reasonable scenarios.
The biggest bottleneck would probably be the initial load time of the scripts - downloading and parsing them. But once that's done, the scripts are cached, and the per-page initialization isn't very noticeable.
I highly doubt a browser would ever crash running your JS scripts, but it may become really slow and not do what you want. Most people are more concerned about how fast it runs, not whether it will run!
I agree with jspcal: you should be able to load quite a lot of JavaScript with no problems. The JavaScript engines in all the modern browsers are a lot faster than they were a few years ago. The initial load will be the biggest issue. If possible, I'd suggest lazy loading scripts that aren't needed for the page to render.
Also, Steve Souders has a lot of great material about improving page load times, such as this article, which gives several techniques for loading scripts without blocking.
http://www.stevesouders.com/blog/2009/04/27/loading-scripts-without-blocking/
If you're really concerned about performance then I would take a look at your target audience. If you think you'll have a relatively high number of IE6 users then test it out in IE6 - on an older machine if possible. IE Tester is great for this.

What is a reasonable size of JavaScript?

What is the maximum size of JavaScript that would be reasonable for a web page? I have a JavaScript program with a data segment of about 130,000 bytes. There is virtually no whitespace, and there are no comments or variable names in this file that could be minified away. The file looks something like this:
"a":[0],
"b":[0,5],
"c":[3,4,24],
"d":[0,1,3],
going on for several thousand lines.
Google Analytics gives the following info on the connection speed of the current users:
Rank Type Visitors
1. DSL 428
2. Unknown 398
3. Cable 374
4. T1 225
5. Dialup 29
6. ISDN 1
Is the file size too much?
The alternative is using a server-side program with Ajax.
The smaller the size, the better the load time. If you are concerned about the file size, try gzipping it. You can also minify the JS file.
Minifying js and css files is one of the performance rules that Yahoo suggests. For more detailed reading check this out.
Best Practices for Speeding Up Your Web Site
Edit
Check this one
How To Optimize Your Site With GZIP Compression
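To get a feel for what gzip buys you, here is a quick check using Node's built-in zlib (the filename is illustrative; repetitive JSON-like data such as the sample above usually compresses many times over):

var fs = require('fs');
var zlib = require('zlib');

var raw = fs.readFileSync('data.js');   // the ~130 KB data segment
var gz = zlib.gzipSync(raw);            // what the server would send with gzip enabled
console.log(raw.length + ' -> ' + gz.length + ' bytes');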
It depends on your users and what sort of connection speeds they have. With a 1 Mb/s connection or faster it probably wouldn't be too noticeable, but with an older modem it would be very irritating to have to wait 10 seconds or more.
You could try Minify to compress your script: http://code.google.com/p/minify/
You can also load your scripts in the background using AJAX: http://betterexplained.com/articles/speed-up-your-javascript-load-time/
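A minimal sketch of that background-loading idea, assuming the data segment is served as JSON from a hypothetical /data.json endpoint (using the modern fetch API; older browsers would need XMLHttpRequest):

// Kick off the download immediately, but don't block rendering on it.
var dataPromise = fetch('/data.json').then(function (res) {
  return res.json();
});

// Code that needs the data awaits the promise.
dataPromise.then(function (data) {
  console.log(data.a); // e.g. [0], per the sample above
});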
Whatever your users will tolerate given their connection speed... how long can they wait vs. the benefit they gain by waiting?
A download calculator might help you.
130k would take about 25-35 seconds to download on dialup.
As someone who is forced to use dialup two or three times a year, I'll tell you - if you're programming a web application that I wanted to use, I might stick around to use it. If it's just a site that I'm surfing to randomly, I'd be outta there :)
You should definitely look into minifying the script. It looks like others found the links before I did, so definitely check them out.
It is very important for web page load time to keep JavaScript files small. A few points:
Use an external JavaScript file.
Put all your JavaScript just before the closing body tag.
Try to minimize the file size using the tools mentioned above.
There are many more tips regarding this here.

Is it possible to optimize/shrink images before uploading?

I am working on a web application that will deal with many image uploads. It's quite likely that the users will be in areas with slow internet connections, and I'm hoping to save them upload time by compressing the images before uploading.
I have seen that Aurigma Image Uploader achieves this using a Java applet or ActiveX, but it's expensive and I'd rather use something open source, or at least a little cheaper. Ideally, I'd like to roll my own if at all possible.
I'm developing on Ruby on Rails if that makes any difference..
Thanks!
Edit, just to clarify: I don't mind if the solution uses ActiveX or an applet (although JS is ideal) - I'm just looking for something a little cheaper than Aurigma at this stage of development.
Also, it may not be feasible for users to shrink the images themselves, as in many instances they will be uploading directly from an internet cafe or other public internet spot.
Generally, it isn't possible to write an image compressor in JavaScript. Sorry.
You'll need to use a plugin of some sort, and as you mention, other sites use Java.
It appears to be possible to write something to encode a JPEG in ActionScript (i.e. Flash), which will reach a much larger audience than the Java plugin you mention. Here's a link to a blog post talking about PNG & JPEG encoders in ActionScript.
Here's another blog post with a demo of an inlined JPEG encoder in ActionScript.
Only if you use Flash or Silverlight (only way to be cross-platform)
http://www.jeff.wilcox.name/2008/07/fjcore-source/ may be worth a read.
Without using applets or ActiveX (Windows only), you can't execute anything on a client PC.
Probably not, but you can always insist that image uploads over x size will not succeed.
Is this an application where you can force users to upload a smaller image? In that case you could grab the size first to verify it fits your standards. This is what Facebook used to do with profile pictures: if it was too big, they wouldn't take it.
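That up-front size check is straightforward in browsers with the File API; a minimal sketch (the input id and the 2 MB cap are illustrative, and the server must still enforce the same limit):

var MAX_BYTES = 2 * 1024 * 1024; // 2 MB cap, purely illustrative
document.getElementById('upload').addEventListener('change', function (e) {
  var file = e.target.files[0];
  if (file && file.size > MAX_BYTES) {
    alert('Please choose an image under 2 MB.');
    e.target.value = ''; // clear the selection
  }
});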
