Website crashes after longer usage - javascript

I am building a website on a free hosting provider (000webhost) and I am currently working on a chat. I have set an interval of 500 ms which reads a file and checks whether a new message was posted. If there is a new one, I load all messages from a .txt file into an element in the HTML.
It is nearly finished, but after chatting for a while or just staying on the chat page (3 minutes or more), my site crashes and I have to wait about an hour until I can access it again. I am refreshing the chat using JavaScript and AJAX every half second.
Does anybody know what I could have done wrong?
I already searched Google for this issue but couldn't find any help.
Edit:
I changed the refresh interval for the chat to 2.5 seconds and the website didn't crash. I think that solved the problem.

Sounds like the host is blocking you, maybe due to excessive requests. One request every 500 milliseconds from the same IP can easily be mistaken for a DoS attack or similar.
There are more performant and suitable ways to build a chat - have a look at WebSockets or Node.js, for instance.
NodeJS Chat
Web Socket chat
Update
As Tom points out in his comment, it might be that a free web host doesn't provide or allow you to set up a Node server. In that case, I guess you could experiment with an increased request interval and see if that helps, or check with the host whether they have such a limit. An increased request interval would probably make the chat feel less responsive, but it is tough to get everything on a free host.
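For what it's worth, here is a minimal sketch of a throttled poll, assuming jQuery for the AJAX call; the 2.5-second interval matches your edit, and the '#chat-box' element and chat.txt path are just placeholders:
// Hypothetical sketch: poll the chat file less aggressively and only touch the DOM when something changed.
var POLL_INTERVAL = 2500; // 2.5 seconds instead of 500 ms
var lastMessages = '';
setInterval(function () {
    $.get('chat.txt', function (data) {
        if (data !== lastMessages) {   // skip the DOM update if nothing is new
            lastMessages = data;
            $('#chat-box').text(data); // '#chat-box' stands in for your message element
        }
    });
}, POLL_INTERVAL);
This also cuts down on DOM work, but the main point is simply sending fewer requests per minute to the host.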


Architecture: Javascript event tracking in browser

I want to make a custom tracking system for web events. I have looked into multiple pre-existing systems, but I want something terribly simple - yet very accurate.
I want to be able to track the following:
Page view event
Time on that page
or:
Video started playing event
Time of video watched
My initial thought was to do simple JavaScript reporting back to the server, but what happens if the user closes the window? How do I know they stopped viewing? And how can I get accurate measurements down to 1/10th of a second? So I thought of a WebSocket solution, as it knows when a user has disconnected. I ended up with Socket.io, but I want to make sure there is no better or smarter way to achieve this.
How would you approach this challenge? What is the smartest way to engineer this?
A WebSocket connection which reports back to the server frequently was my first thought as well, but if you send 10 messages every second, even that might be too much for a WebSocket, especially when connectivity isn't top-notch.
Since the server doesn't require the information immediately, consider batching requests instead: save/update the information in Local Storage every 0.1 seconds, but don't send it to the server then. Instead, every 30 or 60 seconds, or on page load, take the current data in Local Storage, send it to the server, and clear Local Storage so that the next request a minute from now doesn't send duplicate data.
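A rough sketch of that batching idea, assuming jQuery and a hypothetical /track endpoint; the storage key and payload shape are made up:
// Buffer measurements locally every 100 ms, flush to the server once a minute and on page load.
var BUFFER_KEY = 'trackingBuffer';
function record(event) {
    var buffer = JSON.parse(localStorage.getItem(BUFFER_KEY) || '[]');
    buffer.push(event);
    localStorage.setItem(BUFFER_KEY, JSON.stringify(buffer));
}
function flush() {
    var buffer = JSON.parse(localStorage.getItem(BUFFER_KEY) || '[]');
    if (buffer.length === 0) return;
    localStorage.removeItem(BUFFER_KEY); // clear first so the next flush doesn't resend the same data
    $.post('/track', { events: JSON.stringify(buffer) });
}
setInterval(flush, 60000);               // periodic flush while the page stays open
window.addEventListener('load', flush);  // pick up whatever the previous page view left behind
Because the buffer survives in Local Storage, data recorded just before the window was closed is sent on the next page load rather than being lost.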

Can a javascript function take down a VPS?

Can a badly coded loop in JavaScript on a website consume so many server resources that it takes the server down?
We have a VPS at our company. I added a small piece of code to scroll to a certain div in the theme and tied it to a click event. However, the day I added that, everything worked fine until the night, when the server dropped, and it has been going down almost every day since.
The hosting company says it was a space issue, but while tracking it down I found the code I added that day. Is it possible that a simple scroll-to, triggered when someone clicks, can put so much pressure on the VPS that it takes it down?
Apparently I wrongly put it inside another piece of code tied to another event.
I really want to know if this is possible.
$("#close-icon").click(function() {
if ($(".cresta-facebook-messenger-container").hasClass("open")) {
$('.cresta-facebook-messenger-box').hide('swift');
$('#com-opt').show('swift');
}else{
//nothing
}
$("#show-mail-form").click(function() {
var scroll = accordion.top - 350 + (element * 90);
jQuery('body,html').animate({ scrollTop: scroll });
}, 310);
});
});
If and only if the JavaScript loop makes a request to the backend for data.
The code sample you've given does not make requests to the backend, so no, it will never take down any backend resource, such as a dedicated server or VPS. It will freeze the user's browser, which, at worst, could also freeze the user's local machine.
If the JavaScript does, however, make AJAX or API calls to the backend, then yes, badly written front-end code can certainly overwhelm and take down a server.
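As an illustration of that last point (this is not your actual code, and the endpoint is made up), the nested binding in your snippet would only become a server problem if a request were added inside it:
// Hypothetical: each click on #close-icon binds ANOTHER handler to #show-mail-form.
$("#close-icon").click(function () {
    $("#show-mail-form").click(function () {
        $.get('/api/mail-form'); // after N clicks on #close-icon, one click here fires N requests
    });
});
Even then, a single visitor is unlikely to hurt a VPS; it takes many users, or a tight loop firing requests, before the backend feels it.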
Yes. It can even take down a dedicated server, not just a VPS. A few years back, we had an auth-related JS bug on our work's Moodle login screen (it wasn't my code), and it caused an infinite login loop that made Moodle spin up session after session until it exploded. The problem didn't surface on the dev or test servers, only with real users. That machine had to be rebooted, but a VPS could easily have been made practically unavailable even if it didn't go fully down...

PHP, MySql, JavaScript - Pushing data from server to client (Live chat)

I am trying to create a social network with a live chat system, so that users get a notification that they have a new message, or receive a message in real time after it is sent by another user.
I am new to this. I have made the front end (a div that will hold the messages fetched from the DB, in the form of paragraphs) and the DB design, but I am not sure what to use for the back end. My best solution so far is to make an Ajax call for every user every few seconds, but this looks like an inefficient solution for many registered users.
I have searched the web and haven't found any good and up-to-date solutions and I would appreciate if someone could share some experience or point me in the right direction.
A few ways to do it:
WebSocket (with Socket.IO it's the best)
Server-Sent Events
Long polling
Polling (Ajax)
The best right now is WebSocket, but you can run into problems if your chat needs to work from behind certain firewalls. Overall, though, if you use WebSocket you will use something like 80% fewer resources.
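For reference, a bare-bones Socket.IO relay looks roughly like this; the port, event name, and element IDs are arbitrary placeholders rather than anything your existing front end already has:
// server.js - minimal Socket.IO relay (sketch, not production code)
var io = require('socket.io')(3000);
io.on('connection', function (socket) {
    socket.on('chat message', function (msg) {
        io.emit('chat message', msg); // broadcast to every connected client
    });
});
// client side (with the socket.io client script and jQuery loaded) - receive pushed messages instead of polling
var socket = io('http://localhost:3000');
socket.on('chat message', function (msg) {
    $('#messages').append($('<p>').text(msg));
});
$('#send').click(function () {
    socket.emit('chat message', $('#input').val());
});
You would still persist each message to MySQL on the server side; the point of the socket is just that browsers no longer have to poll for new ones.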

Reload div content for everyone that is online

This site does specifically what I want to do: http://en.lichess.org/. Users join the chess game, and after a few seconds the lobby refreshes and new games are added. Does anybody know how I can do that, or at least give me a starting point?
In the traditional method (and for older clients) you would do 'long polling', which fires every few seconds asking for updates. The new way is through WebSockets. The page you reference uses WebSockets.
So if you have a modern browser, you can use WebSockets and the server can push data to the browser (the same goes for EventStream). That message is then read in the browser and the proper view is updated.
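As a sketch of the client side of that push model (the URL, the JSON shape, and the 'lobby' element are assumptions, since the actual lichess protocol isn't documented here):
// Assumed endpoint: a WebSocket that sends the current list of open games whenever the lobby changes.
var ws = new WebSocket('wss://example.com/lobby');
ws.onmessage = function (event) {
    var games = JSON.parse(event.data); // e.g. [{player: "...", timeControl: "..."}, ...]
    var html = games.map(function (g) {
        return '<li>' + g.player + ' - ' + g.timeControl + '</li>';
    }).join('');
    document.getElementById('lobby').innerHTML = html; // every connected client re-renders the same list
};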
There are some frameworks which do this stuff for you; the biggest one I can think of is Meteor
See the dev console:

Background job on Heroku - how does the web know it's finished?

So, I'm creating this application that sometimes requires pulling a feed, and it always times out on Heroku because the XML parser takes time. So I changed it to load asynchronously via Ajax every time the page is loaded, but I still get an H12 error from my Ajax call. Now I'm thinking of using Resque to run the job in the background. I can do that, no problem, but how would I know that the job is finished, so I can pull the processed feed onto the HTML page via AJAX?
Not sure if my question is clear, but how would the web layer know that the job is done, so that it can signal (e.g. onComplete in JavaScript) to populate the content on the page?
There are a number of ways to do this:
The JavaScript can use AJAX to poll the server asking for the results, and the server can respond with 'not yet' or the results. You keep asking until you get the results (see the sketch after this list).
You could take a look at Juggernaut (http://juggernaut.rubyforge.org/) which lets your server push to the client
Web Sockets are the HTML5 way to deal with the problem. There are a few gems around to get you started: Best Ruby on Rails WebSocket tool
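A rough sketch of the polling option, assuming jQuery and a hypothetical /jobs/:id/status endpoint that your Rails app would expose:
// Ask the server every few seconds whether the Resque job has finished.
function pollJob(jobId) {
    $.getJSON('/jobs/' + jobId + '/status', function (res) {
        if (res.done) {
            onComplete(res); // e.g. fetch the processed feed and render it on the page
        } else {
            setTimeout(function () { pollJob(jobId); }, 3000); // not done yet - try again in 3 seconds
        }
    });
}
The Resque worker just has to record the job's status somewhere the web dyno can read it (the database, Redis, or a cache key).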
You have an architecture problem here. The reason for the H12 is so that the user is not left waiting for more than 30 seconds.
By moving the long-running task into a Resque queue, you are disconnecting it from the front-end web process - there is no way for the two to communicate directly, due to process isolation.
Therefore you need to look at what you are doing and how. For instance, if you are pulling a feed, can you do this at some point before the user needs to see the output and cache the results in some way? Or can you take the request for the feed from the user and then email them when you have the data for them to look at, and so on?
The problem you have here is that your users are asking for something which takes longer than a reasonable amount of time to complete, so therefore you need to have a good look at what you are doing and how.
