Webpack require really slow, debugging JS profile - javascript

From Chrome DevTools Timeline profiling, my Angular 2 app's load time is really huge: for a single bundled file it takes up to 2.66 seconds. The call stack snapshot looks like this (zoomed for readability):
Several calls to __webpack_require__ take about 1.27 seconds, 855.05 milliseconds, and so on, which drastically affects app load time. Are there any known webpack problems related to this? I only found https://github.com/webpack/webpack/issues/2219, an issue that is still open without a general solution.
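One standard mitigation for heavy up-front module evaluation is splitting vendor code into its own chunk, so the main bundle has fewer modules to walk through __webpack_require__ at startup. A minimal config sketch for the webpack 2/3 era (matching Angular 2); the entry paths and package names here are assumptions, not taken from the question:

```javascript
// webpack.config.js — minimal sketch; splits vendor modules out of the
// main bundle so less code is evaluated synchronously at startup.
var webpack = require('webpack');

module.exports = {
  entry: {
    app: './src/main.ts',                               // hypothetical app entry
    vendor: ['@angular/core', '@angular/common', 'rxjs'] // hypothetical vendor list
  },
  output: { filename: '[name].bundle.js' },
  plugins: [
    // keep modules shared with the vendor entry out of app.bundle.js
    new webpack.optimize.CommonsChunkPlugin({ name: 'vendor' })
  ]
};
```

On webpack 4+ the same idea is expressed with `optimization.splitChunks` instead of CommonsChunkPlugin.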

Related

The Nuxt.js (SPA) application crashes when I keep taking photos in iOS Safari. (Caused by a memory leak?)

I have created a Nuxt.js (SPA) application that takes up to 20 photos with an iPhone and registers all 20 at once.
When the images are taken, they are compressed to about 500KB using a library called browser-image-compression and retained.
The compressed images are then displayed in a preview.
The problem is that after registering 20 photos several times, the Nuxt.js application suddenly crashes partway through the process (when taking a photo, the captured image loads and then suddenly disappears, as if the page were reloading).
Suspecting a memory leak, I measured it on the Safari timeline on a Mac: memory usage rises with each shot and never goes down.
I tried to free memory with location.reload(true), but the memory usage did not go down.
Please let me know the solution to refresh the memory, or anything that will stop the application from crashing.
Memory Timeline just before the crash
Versions
iPhone 11
iOS 14.6
#nuxt/cli v2.14.12
Vue v2.6.14
browser-image-compression v1.0.14
There is a PR that aimed to fix this exact bug, but it may not have actually fixed it.
Please double-check the actual memory usage in another browser such as Chrome or Firefox and see if it behaves the same way. If it doesn't, it may be time to give some feedback to browser-image-compression or switch to another package.
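Independent of the library bug, object URLs created for previews keep their image data alive until explicitly revoked, which matches the "memory only goes up" symptom. A minimal sketch of capping retained previews, assuming the previews are shown via URL.createObjectURL; the revoke callback is injectable, so in the browser you would pass URL.revokeObjectURL:

```javascript
// Sketch: keep at most maxPreviews object URLs alive; revoke the oldest
// ones so the browser can free the underlying image memory.
// In the browser, pass URL.revokeObjectURL as `revoke`.
function makePreviewStore(maxPreviews, revoke) {
  var urls = [];
  return {
    add: function (url) {
      urls.push(url);
      while (urls.length > maxPreviews) {
        revoke(urls.shift()); // free the oldest preview
      }
    },
    clear: function () {
      while (urls.length) revoke(urls.shift()); // call after registering a batch
    },
    size: function () { return urls.length; }
  };
}
```

Calling clear() right after each batch of 20 is registered would release the previews before the next batch starts.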

Why does testmysite claim that my website is slow?

The website testmysite claims that the website I'm working on is very slow: 9.8 seconds on 4G, which is ridiculous.
My web app requests geolocation first, but the request is denied immediately, so this is not where the slowness comes from.
The site is server-rendered, and sends scripts to the client to optimise.
The bundle analyzer result is also dubious.
It claims that my total bundle size is 750 kB, but I strongly doubt that this is the case.
Even if it turns out that testmysite is not a reliable source, I'd like to know why it says that my website is so slow, and what I can actually do to improve.
My vendor node modules are chunk split, because I want the browser to cache them individually.
My website is deployed here: https://zwoop-website-v001.herokuapp.com
Note that loading may take some time because I use the free service and the server falls asleep often.
Edit: my performance tab shows the following in Chrome:
I have really zero idea why that website says that my site is slow...
But if the measurement is used by search engines, then I do care.
Answer:
I just voted to re-open this question, as @slebetman helped me troubleshoot some issues and I'd like to formulate an answer.
First thing to note is that the free Heroku server I run in production is located in Europe (this was an option you could choose), and it is unclear where testmysite's server is located. @slebetman was in East Asia when running his test.
@slebetman mentioned that the network tab showed him a very slow load time for fonts (woff2), about 2 seconds. This didn't occur for me, but as it turns out, these are Font Awesome icons loaded from a CDN.
So while it is logical to look at performance in terms of script loading, interpretation, and rendering, there may additionally be a latency issue with third-party resources downloaded from another server. Even when your website is fast, you don't always know whether some module you imported requests additional resources.
Either of these can be tracked through the Performance or Network tab of Google Chrome. An additional tip is to mimic a slow network (although I don't think this would surface all network problems, such as redirects, DNS resolution, etc., which @slebetman mentions could be a factor as well).
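Slow third-party downloads like the font case above can also be caught programmatically with the Resource Timing API. A hedged sketch over plain entry objects; in the browser you would pass performance.getEntriesByType('resource') and your own origin:

```javascript
// Sketch: list resources slower than a threshold that come from another
// origin. Entries need only { name, duration }, matching Resource Timing.
function slowThirdParty(entries, ownOrigin, thresholdMs) {
  return entries
    .filter(function (e) {
      // keep slow resources not served from our own origin
      return e.duration >= thresholdMs && e.name.indexOf(ownOrigin) !== 0;
    })
    .map(function (e) {
      return { name: e.name, ms: Math.round(e.duration) };
    });
}
```

Running this in the console after a load makes CDN stragglers visible even when your own server responds quickly.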

Loading issue with IOS 9 with cordova 4.2

I have a problem with Cordova 4.2 while using Xcode 7.0.1: when I run my application on iOS 9, loading the JS files takes around 25 seconds, which is very bad compared to Android.
I tried to locate the problem by adding flags; loading gets stuck on fetching the Handlebars templates (since I am using Ember.js).
I tried minifying the .hbs files with no improvement.
function onDeviceReady() {
    Helpers.getScript('app/app.js');
    Helpers.getScript('app/helpers.js');
    Helpers.getScript('app/init.js');
    Helpers.getScript('app/router.js');
}
Thanks in advance
Okay. If all you are doing is loading files, then there is no need to delay loading them. The only time you want to delay loading a file *is* if it contains code that runs as soon as it loads.
To be clear, some libraries become active as soon as they are loaded: they set variables or access the system in some way. Those libraries need to be loaded AFTER the deviceready event. Otherwise, you are free to load libraries as soon as you can, even before deviceready.
As for your app, I cannot say why it is slow to start. You say 20+ seconds, and I can only think it might be the delayed library loading.
NOTE: not all apps are built the same. On Android there are libraries that come with the system, plus libraries that Cordova adds. It might be (might be) that you are loading a larger app on iOS, and that might be the delay.
In any case, let me know what you have tried, and what worked and what did not.
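The split this answer describes can be sketched like this. loadScript stands in for Helpers.getScript from the question, and which files count as "passive" is an assumption you would verify per library:

```javascript
// Sketch: start fetching passive libraries immediately; defer only the
// ones with load-time side effects until deviceready has fired.
function scheduleLoads(passiveFiles, activeFiles, loadScript, onDeviceReady) {
  passiveFiles.forEach(loadScript);     // no startup side effects: load now
  onDeviceReady(function () {
    activeFiles.forEach(loadScript);    // touch device APIs on load: wait
  });
}

// In a Cordova app (hypothetical file split):
// scheduleLoads(
//   ['app/helpers.js', 'app/router.js'],
//   ['app/app.js', 'app/init.js'],
//   Helpers.getScript,
//   function (cb) { document.addEventListener('deviceready', cb, false); }
// );
```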

Multiple PhantomJS instances hanging

I am having a problem when running multiple instances of PhantomJS on Ubuntu 14. After a few minutes, the processes become unresponsive.
Brief background:
Using PhantomJS 2.0 to render a web page that ultimately gets saved as a PDF using wkhtmltopdf.
PhantomJS is only responsible for loading the page, making ajax requests, and waiting for a response after the PDF is saved on the server. It does not generate the PDF itself.
There are hundreds of web pages that need to be generated into PDF, so I want to run as many PhantomJS instances in parallel as the system allows.
Each PhantomJS process is started by a shell script like so:
{path to phantomjs} {path to js file} --data {some argument} >> {path to log file} 2>&1 &
The problem occurs after a couple of minutes where I stop getting any output from the PhantomJS processes, and looking at top I can see they are just laying there not doing anything. The JS script has timers that retry to load a page if it takes longer than a minute, and eventually call phantom.exit() if the page can't load / PDF generation fails. So even if something goes wrong, the process should still exit - but it doesn't.
I tried changing the number of PhantomJS instances running in parallel. Tried 20 -> 10 -> 5 -> 3, but it doesn't seem to matter. I can actually get many more jobs to execute successfully when maintaining 20 instances at a time.
When running with --debug=true I can see that at some point it gets stuck at
[DEBUG] WebPage - updateLoadingProgress:
Also going through the output I see several of these warnings:
[WARNING] QIODevice::write: device not open
which makes me believe that is the source of the problem.
I thought there might be some contention for file resources so I tried without redirecting output to a log file, and not using --local-storage-path, but that didn't help.
As a side note, I have been using PhantomJS for several years now doing the same procedure, only sequentially (run a single PhantomJS process at a time). And although there were a few snags to overcome, it worked great.
Any idea what's causing this?
Anyone faced with a similar problem?
Any advice on running multiple PhantomJS instances in parallel?
Thanks!
I faced the exact same issue both locally and on my CI server (which was also Ubuntu). Uninstalling 2.0.0 and upgrading to 2.1.1 resolved the problem for me.
I was facing the same issue. Use driver.quit() instead of driver.close(). That solved the issue for me.
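A defensive pattern for batch runs like this, regardless of the root cause, is a hard watchdog inside the PhantomJS script so a wedged page can never keep the process alive forever. A sketch with the exit function and timer injectable (in PhantomJS you would pass phantom.exit and the global setTimeout; the 2-minute ceiling is an arbitrary assumption):

```javascript
// Sketch: force an exit after limitMs no matter what the page is doing.
function armWatchdog(limitMs, exitFn, schedule) {
  return schedule(function () {
    exitFn(2); // non-zero code so the shell wrapper can see the job was killed
  }, limitMs);
}

// In the PhantomJS script:
// armWatchdog(120000, phantom.exit, setTimeout);
```

Unlike the per-page retry timers described in the question, this timer is never re-armed or cleared, so it fires even if the page-load callbacks themselves hang.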

Large browser parsing time after loading CSS's and JS's

I'm trying to decrease the page loading time of a project I'm working on. At first the team I'm working with tried to optimize Apache/Symfony2/Postgres, but finally, in the Network panel of Chrome's DevTools, I found a large "gap" after all CSS and JS files have loaded. The "gap" disappears when the script files are removed. If I leave only jquery.min.js (no other libraries or scripts loaded), the page still takes approx. 1.5 s to load.
Developer Tools doesn't say what happens during this period. I think this might be when the browser parses and interprets the CSS and JS, but I need to decrease the DOMContentLoaded time. Any suggestions?
This is my first post and I need 10 reputation to post images, so I've uploaded screenshot of what I'm talking about here:
http://tinypic.com/m/imqw0n/1
P.S.: The screenshot was taken on an "average" PC. The same test on my personal laptop (Asus G750JZ, 8-core CPU, 16 GB RAM, NVIDIA 880M) shows very different results: load time is about 1.5 s instead of 5.2 s. Unfortunately I can't make everyone use the web app on high-end hardware ;)
P.S.2: Async loading of JS is not an option. I've tried RequireJS but I didn't like the results; it was clumsy because of all the script dependencies.
In DevTools, go to the 'Timeline' tab. https://developer.chrome.com/devtools/docs/timeline
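To put numbers on that "gap", the Timeline can be complemented with the Navigation Timing fields. A sketch over a plain timing object, so in the browser you would pass performance.timing; the phase names are my own rough labels, not official terms:

```javascript
// Sketch: split load time into rough phases from Navigation Timing fields.
function loadPhases(t) {
  return {
    network: t.responseEnd - t.navigationStart,        // fetching the HTML
    parseAndExecute: t.domInteractive - t.responseEnd, // parse + sync CSS/JS work
    contentLoaded: t.domContentLoadedEventStart - t.navigationStart
  };
}
```

A large parseAndExecute value relative to network would support the theory that the gap is script parsing/interpretation rather than downloading.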
