This is more of a concept question, as I am trying to learn more about long-polling, in JavaScript/jQuery specifically. I have a web app where I am long-polling (websockets are not an option right now) to a JSON file. I have run some tests, and after leaving the app open for some time, it starts to slow down and later seems to get stuck. Using Chrome's developer tools, I can see the memory usage going through the roof, as does the listener count (>5000) after about an hour and a half of uptime. I've searched and searched but can't find forums that pinpoint this problem and its solution. In my code, I am using setInterval every 30 seconds. What do I need to do to keep the memory and listener counts low and make sure the app does not overload and slow down? The function of the app requires it to stay up for long periods of time.
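For reference, the polling loop is essentially this shape (a stripped-down sketch, not my actual code; status.json and updatePage are placeholders):

// stripped-down sketch, not the actual code
function updatePage(data) {
  $('#status').text(data.message);        // placeholder for the real rendering
}

setInterval(function () {
  $.getJSON('status.json', updatePage);   // fetch the JSON file
}, 30000);                                // every 30 seconds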
Thank you for your help.
Related
On a server running Node.js, I am using multiple setInterval functions, each with a relatively long interval (24 hours, 36 hours, etc.).
setInterval(funcOne, 86400000)   // every 24 hours
...
setInterval(funcTwo, 172800000)  // every 48 hours
While they seem to work fine for now, I am a bit concerned about their reliability and performance. The server, of course, is running all the time. I am okay with restarting the functions if the server fails.
I tried to investigate how they interact with the event loop, but it is still a bit vague to me.
I want to make sure that these multiple functions are not blocking the event loop, no matter how many of them there are.
I am looking for possible failures or crashes due to them.
I am keen to understand any performance overhead they might cause, if any.
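To make the question concrete, each of these jobs is essentially the following shape (simplified; doTheWork is a stand-in for the real task):

// simplified shape of one of these jobs; doTheWork stands in for the real task
const DAY_MS = 24 * 60 * 60 * 1000;

async function funcOne() {
  try {
    await doTheWork();                       // mostly async work, nothing CPU-heavy
  } catch (err) {
    console.error('funcOne failed:', err);   // log instead of letting the timer throw
  }
}

setInterval(funcOne, DAY_MS);                // fires once every 24 hours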
PS: I am not willing to migrate those functionalities from the server to a cloud function or a serverless app at the moment.
All,
A company I'm working with outsourced some development to a third-party company. The developers at this third-party company are "using" Ng 7 (using in quotes, because much of the functionality is hand-written instead of using what's out of the box).
The app is VERY sluggish.
Inside the root component, I call a service's init method. The init method records a start time and then subscribes to the applicationRef.isStable observable. When the application becomes stable, control returns to the subscription callback, which records the end time and calculates the delta between the end and start times. This delta is then printed to the console.
The difference in time ranges anywhere from 60 seconds to 200+ seconds.
Obviously, the code has issues as it shouldn't take 3+ minutes for the application to become stable.
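For reference, the timing logic is roughly the following (reconstructed and simplified; StartupTimingService and the details are placeholders, not the actual code):

import { ApplicationRef, Injectable } from '@angular/core';
import { first } from 'rxjs/operators';

@Injectable({ providedIn: 'root' })
export class StartupTimingService {   // placeholder name, not the real service
  constructor(private appRef: ApplicationRef) {}

  init(): void {
    const start = performance.now();
    // isStable emits true once Angular has no pending macro-/microtasks
    this.appRef.isStable
      .pipe(first(stable => stable === true))
      .subscribe(() => {
        console.log(`App became stable after ${performance.now() - start} ms`);
      });
  }
}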
Question:
Is there a tool that can help me troubleshoot what's taking so long for the application to become stable?
Thanks.
I would go with Chrome's Performance tab:
Check Google's guide for guidance.
Checking the Network tab could also help, to figure out whether XHR requests are causing the problem.
I'm working on a pretty simple web application that displays the top users hitting a web page. I've written some code in d3 to convert the data to a bar graph and a simple PHP script to pull the data from a database. I use an AJAX request to pull the data once every 5 seconds and update the bar graph. Without fail, if I leave the page open in the background, it eventually gets to the point of the old
aw snap google chrome has run out of memory
I've gone through a bunch of sites on memory leaks and done what I could to prevent them, but there is a decent enough chance I messed something up. The problem is that, other than this error coming without fail on this page (and I've written plenty of JavaScript applications where this doesn't happen), there is absolutely no evidence the leak is happening. The data I'm retrieving every 5 seconds is 212kb, and when I do a heap recording, heap memory peaks at 25mb and doesn't really increase (it generally bounces between 10 and 25mb), so it seems like that 212kb is being garbage collected and is not accumulating in the heap. Similarly, I've looked at the task manager, and when I make this the only tab open there is a good amount of fluctuation but definitely not a trend upwards. I've taken heap snapshots, which tend to be in the 10-15mb range, and I don't really understand how to read them, but here's what it looks like after running for 15 minutes or so:
It's just getting extremely frustrating. I've spent 20-30 hours on this, but it seems like a case of the watched pot that never boils: if I look for evidence of it happening, I can never find any, but if I leave the page open while I'm away from my computer for a few hours, without fail it crashes. It's almost like the garbage collector just waits for me to leave my computer before deciding to stop running.
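For context, the 5-second update cycle is roughly this shape (heavily simplified; the endpoint, element ids, and field names are placeholders, and it assumes d3 v4 or later):

var svg = d3.select('#chart svg');          // created once, outside the polling loop

function update(data) {
  var bars = svg.selectAll('rect').data(data);
  bars.exit().remove();                     // drop bars that no longer have data
  bars.enter().append('rect')               // add bars for new users
    .merge(bars)                            // then size new and existing bars together
      .attr('y', function (d, i) { return i * 22; })
      .attr('height', 20)
      .attr('width', function (d) { return d.hits; });
}

setInterval(function () {
  $.getJSON('top-users.php', update);       // ~212kb of JSON every 5 seconds
}, 5000);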
Any thoughts on what could be the culprit here or next steps to attempt to identify the culprit? This is my first experience with a memory leak and it's driving me crazy. Thanks.
Testing is really slow going for me in Eclipse. Deploy -> log into my site -> navigate to the appropriate screen, etc. While running locally, I am connecting to an external dev database which doesn't have much horsepower. Total turnaround can be 5-15 minutes depending on the day. Find out a quote is in the wrong place, rinse and repeat...
My question is this - Is there a way to replace a page on the fly while running locally? This way I can fix the error, replace the single page, hit refresh in the browser and continue with the updated page? If so, where is the file being stored on the hard drive when Apache is running?
Sorry if this is a silly question; back in the old days I could do this all the time. I'm just getting frustrated over here because I spend 5% of my time writing code and 95% going back and forth testing it because of the slow turnaround.
Hello Stack Overflow community,
I'm a rather novice coder, but I have a project I've been devising that looks more and more complicated every day, and I don't know where to start.
With inspiration taken from Synchtube and Phonoblaster, I'm looking to create something for my website that will allow visitors to watch YouTube videos and playlists I have curated, together, in real time and in sync.
Because I want to be able to put this in the context of my own website, I can't use the services listed above that already do this - so I wanted to figure out how to roll my own.
Some things have been written about this topic on Stack Overflow, and other blogs:
HERE
and HERE.
Because I still consider myself a novice programmer, and a lot of the information I've found on Google and Stack Overflow tends to be more than a year or two old, I'm still unsure where to begin, or whether that information is outdated; specifically, which languages and tools I should be learning.
From what I've gathered so far, things like JavaScript, Node.js, and the YouTube API would form the crux of it. I've not used any of these before, but I would be interested to hear whether other, more experienced coders have their own suggestions or ideas they could point me towards.
I appreciate you taking time out to read this post!
Hope to hear from some of you soon :)
Many thanks.
It partially sounds like you need a live stream from YouTube. You can find more info here: https://support.google.com/youtube/bin/answer.py?hl=en&answer=2474026
If you can get that going, then syncing playback between any number of users is as simple as putting a regular YouTube embed of your stream in a browser.
Looking past that, if you want to sync video playback amongst any number of users, the first big problem is learning how to set the time on a video. Luckily, that's easy with the URL fragment #t=seconds.
E.g., http://www.youtube.com/watch?v=m38RdUGqBPM&feature=g-high-rec#t=619s will start this HuskyStarcraft video 619 seconds in.
The next step is to have some backend server that keeps track of what the current time is. Node.js with Socket.io is incredibly easy to get set up. Socket.io is a wonderful library that gracefully handles concurrent connections, falling back from WebSockets all the way to long polling and more, and it works well even on very old browsers. Note that WebSockets aren't even required, but they are the most modern and foolproof method for you. Otherwise it's hacks and stuff.
One way this could work would be as follows.
User1 visits your site and starts playing the video first. A script on your page sends an XHR request to your server that says, "video started at time X". X then gets stored as the start time.
At this point, you could go one of two routes. You could have a client-side script use the YouTube API to poll the video and get its current status every second. If the status or time changes, it sends another request back to the server to update the state.
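Something along these lines (an untested sketch; it assumes the IFrame Player API is loaded, player is a YT.Player instance, and /state is an endpoint you would write yourself):

// untested sketch: report seeks and play/pause changes to the server once a second
var lastTime = 0;
var lastState = -1;

setInterval(function () {
  var time = player.getCurrentTime();
  var state = player.getPlayerState();       // YT.PlayerState.PLAYING, PAUSED, ...

  if (state !== lastState || Math.abs(time - lastTime) > 2) {
    $.post('/state', { time: time, state: state });   // '/state' is an endpoint you'd write
  }

  lastTime = time;
  lastState = state;
}, 1000);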
Another simple route would be to have the page load for User2+ and then send an XHR request asking for the video play time. The server sends back the difference between now and User1's start time, and the client script sets the 't' fragment on the YouTube player for User2+. This lets you sync start times, but if any user pauses or rewinds the video, those states don't get updated. A subsequent page refresh might fix that, though.
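A very small sketch of that second route (Express on the server; the endpoint names are made up, and the video id is just the one from the example above):

// server.js -- made-up endpoint names, just to show the idea
var express = require('express');
var app = express();
var startedAt = null;                        // wall-clock ms when User1 started playback

app.post('/started', function (req, res) {
  if (startedAt === null) startedAt = Date.now();
  res.sendStatus(204);
});

app.get('/offset', function (req, res) {
  var seconds = startedAt ? Math.floor((Date.now() - startedAt) / 1000) : 0;
  res.json({ seconds: seconds });
});

app.listen(3000);

// client (User2+): ask how far in we are, then load the embed at that point
$.getJSON('/offset', function (data) {
  $('#player').attr('src',
    'https://www.youtube.com/embed/m38RdUGqBPM?start=' + data.seconds + '&autoplay=1');
});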
The overall application complexity depends on exactly what requirements you have. If it's just synchronized start times, then route #2 should work well enough. It doesn't require sockets and is easy to do with jQuery or just straight JavaScript.
If you need a really synchronized experience where any user can start/stop/pause/fast forward/rewind the video, then you're looking at either using an established library solution or writing your own.
Sorry this answer is kind of open ended, but so was your question. =)