Can a badly coded loop in JavaScript on a website consume so many server resources that it takes the server down?
We run a VPS at our company. I added a small piece of code to the theme that scrolls to a certain div, tied to a click event. Everything worked fine the day I added it, but that night the server went down, and it has kept going down roughly every day since.
The hosting company says it was a disk-space issue, but while tracking things down I found the code I added that day. Is it possible that a simple scroll-to when someone clicks can put so much pressure on the VPS that it takes it down?
Apparently I mistakenly put it inside another snippet tied to a different event.
I really want to know if this is possible.
$("#close-icon").click(function() {
    if ($(".cresta-facebook-messenger-container").hasClass("open")) {
        $('.cresta-facebook-messenger-box').hide('swift');
        $('#com-opt').show('swift');
    }
    // this binds a fresh handler on every #close-icon click
    $("#show-mail-form").click(function() {
        var scroll = accordion.top - 350 + (element * 90);
        jQuery('body,html').animate({ scrollTop: scroll });
    }, 310);
});
If and only if the JavaScript loop makes a request to the backend for data.
The code sample you've given does not make requests to the backend, so no, it will never take down a backend resource such as a dedicated server or VPS. At worst it will freeze the user's browser, which could also freeze the user's local machine.
If the JavaScript does, however, make AJAX or other API calls to the backend, then yes, badly written front-end code can certainly overwhelm and take down a server.
Yes. It can even take down a dedicated server, not just a VPS. A few years back we had an auth-related JS bug on our company's Moodle login screen (not my code) that caused an infinite login loop; Moodle spun up session after session until it fell over. The problem never surfaced on the dev or test servers, only with real users. That machine had to be rebooted, and a VPS could easily have been made practically unavailable even without going fully down...
I have a variable in JavaScript that is created by reading a number from the HTML, adding a number to it, and then writing it back to the HTML.
I want any user, in any browser, to see the latest value of this variable. Currently, if I refresh the page the number resets to 0 (the default value). I want it so that if I update the number to 1, someone viewing the page from another browser also sees 1, not 0.
I've seen that cookies are an option, but I thought cookies were client-side only? That would mean only I would see the latest value of the variable.
I've seen that sessions are another option; are sessions server-side, and would they do the job I am after?
Is there another way of doing this I haven't considered?
Thanks in advance
I want to make it so that no matter what browser/what user you are, you are seeing the latest version of the variable.
You need to send your updates from the browser to a server, and then have that server relay your updates to all the other clients. There are many choices for how to do this, with various tradeoffs and complexity.
One method is simply to send that number to the server. Then, on the next page load, the server injects the new number into the page it outputs (or serves it via an API call, over AJAX, the Fetch API, server-sent events, WebSocket, etc.). If you do this, though, you will need to decide how to handle concurrency: what happens if two people load the page at the same time?
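To make the concurrency question concrete, here is an in-memory sketch of the classic lost-update problem (the variables stand in for the server's stored number and two clients; all names are illustrative):

```javascript
// Stand-in for the number stored on the server.
let stored = 0;

// Naive approach: each client reads the value, adds 1 locally, writes it back.
function clientWrites(value) { stored = value; }

const a = stored + 1; // client A reads 0, computes 1
const b = stored + 1; // client B reads 0 as well, also computes 1
clientWrites(a);
clientWrites(b);
console.log(stored); // 1, not 2: one of the updates was silently lost

// Safer: let the server own the increment, so updates can't clobber each other.
let counter = 0;
function serverIncrement() { return ++counter; }
serverIncrement(); // client A asks the server to add 1
serverIncrement(); // client B asks the server to add 1
console.log(counter); // 2
```

The second pattern is why "send the operation, not the resulting value" is the usual advice for shared counters.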
The general system you're describing is called Operational Transform, and this is a rabbit hole you probably don't want to go down right now. Just understand that there's no magic that synchronizes things across the planet perfectly and at the same time. Your system has to account for inherent delays in some way.
I've seen that cookies are an option, however I thought cookies were client side only?
Yes, cookies are client-side. They are sent to the server with every request, but beyond session identification that doesn't help you here.
I've seen that sessions are another option, are sessions server side?
They can be, but you still need a way to identify the user across browsers. Normally, the session ID is stored in a cookie.
Web development is completely new to me, so this may be easy to find online, but I may lack the technical jargon for this area...
I need to display some data on a linux-device that also runs a webserver, so I figured the easiest way would probably be to do this in a browser. The data might change due to (physical) interaction with the device: it has external push-buttons attached. I need the data on the webpage to change instantly when a button is pressed, so that the user sees the values change immediately when he presses a button.
This might be complete and utter nonsense, but is it possible to have the program that watches for button presses pipe its output somewhere, and have a piece of PHP respond to this?
A sub-optimal solution would be a piece of client-side JavaScript with a timer that periodically polls a PHP script. I don't like this solution because you either poll ad nauseam to minimize delays, or you notice lag in the response to the button presses.
You can use socket programming. It is commonly used in chat servers to push data to the client without refreshing.
http://php.net/manual/en/sockets.examples.php
This should help
In this question the asker tries to do the same thing you are trying: push data to the page on some external event.
Python Socket Programming - need to do something while listening for connections
I have a web server that generates questions for students of a particular subject. The web server needs to keep track of how much time each student has spent on a particular set of questions.
The web pages have a "Finished" button which, when pressed, sends the statistics to the server.
However, I also want the web browser to send statistics if the student navigates away from the page or closes the browser window without pressing "Finished".
For this purpose, I planned to have "onunload" or "onbeforeunload" send an Ajax request to the server with the relevant information. But apparently different browsers do not fully support these events, and there are restrictions on what can be done in their handlers. And, of course, I don't want the browser to freeze if communication with the server fails.
So, I need some advice on the best way to do this.
If I wanted to be sure to handle all the "special events", I would send periodic tick requests from the page to the server. The granularity depends on the tracking requirements, the load, and whether it is an intranet or internet application; it can be a few seconds or even a minute. That way you are tracking the time spent on the page even if the browser, OS, or network crashes.
The best way to implement this is to use periodic updates. That pretty much guarantees you have relevant data when the user disconnects in any way.
The implementation is trivial, although you might have to refactor some of your logic to send out periodic updates instead of everything at once.
function sendStatistics()
{
    // send the accumulated stats to the server via Ajax
}

// report once per second; tune the interval to your tracking needs
setInterval(sendStatistics, 1000);
Another way to make it work is to make your Ajax call in beforeunload and make it synchronous. This freezes the browser for the duration of the call, and it only works when navigating away or closing the browser; I don't recommend it.
I am building a website on a free host (000webhost) and am currently working on a chat. I set an interval of 500 ms that reads a file and checks whether a new message was posted. If there is one, I load all messages from a .txt file into an HTML element.
It is nearly finished, but after chatting for a while or just sitting on the chat page (3 minutes or more), my site crashes and I have to wait about an hour before I can access it again. I refresh the chat with JavaScript and Ajax every half second.
Does anybody know what I could have done wrong?
I already searched Google for this issue but couldn't find any help.
Edit:
I changed the chat refresh interval to 2.5 seconds and the website didn't crash. I think that solved the problem.
Sounds like the host is blocking you, probably due to excessive requests. One request every 500 milliseconds from the same IP can easily be mistaken for a DoS attack or similar.
There are more performant and suitable ways to build a chat: have a look at WebSockets or Node.js, for instance.
NodeJS Chat
Web Socket chat
Update
As Tom points out in his comment, a free web host might not provide or allow you to set up a Node server. In that case you could experiment with an increased request interval and see if that helps, or check with the host whether they enforce such a limit. A longer interval will make the chat feel less responsive, but it is hard to get everything on a free host.
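If you do stay with polling, one compromise is to back off the interval when nothing new arrives, rather than hammering the host at a fixed rate. A sketch of the idea (the function name, starting interval, and cap are assumptions to illustrate the pattern):

```javascript
// Double the polling interval up to a cap; reset to the minimum
// whenever a new message actually arrives.
function nextInterval(current, max = 10000) {
  return Math.min(current * 2, max);
}

let interval = 500; // start responsive
const schedule = [];
for (let i = 0; i < 6; i++) {
  schedule.push(interval);       // poll now...
  interval = nextInterval(interval); // ...then wait longer next time
}
console.log(schedule); // [500, 1000, 2000, 4000, 8000, 10000]
```

In the real chat you would call `nextInterval` after each empty response and reset `interval` to 500 after a response containing new messages, so an idle page quickly stops generating ten requests per second.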
So, I'm building an application that sometimes needs to pull a feed, and it always times out on Heroku because the XML parsing takes too long. I changed it to load asynchronously via Ajax every time the page loads, but I still get an H12 error from the Ajax call. Now I'm thinking of using Resque to run the job in the background. I can do that, no problem, but how would I know that the job is finished, so I can pull the processed feed into the HTML page via Ajax?
Not sure if my question is clear, so: how would the web layer know that the job is done, so it can signal (e.g. onComplete in JavaScript) to populate the content on the page?
There are a number of ways to do this:
The JavaScript can use AJAX to poll the server asking for the results and the server can respond with 'not yet' or the results. You keep asking until you get the results.
You could take a look at Juggernaut (http://juggernaut.rubyforge.org/) which lets your server push to the client
Web Sockets are the HTML5 way to deal with the problem. There are a few gems around to get you started: Best Ruby on Rails WebSocket tool
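The first option, polling, is the simplest to reason about. A sketch of the poll-until-done loop (checkJob stands in for the AJAX call to a hypothetical job-status endpoint; in a browser you would re-schedule with setTimeout rather than looping, so the page stays responsive):

```javascript
// Keep asking for the result until the "server" has one.
// checkJob is a stand-in for an AJAX call; null means "not yet".
function pollForResult(checkJob) {
  let result = checkJob();
  while (result === null) {
    result = checkJob();
  }
  return result;
}

// Fake status endpoint: ready on the third check.
let attempts = 0;
const fakeCheck = () => (++attempts < 3 ? null : 'feed parsed');

const result = pollForResult(fakeCheck);
console.log(attempts, result); // 3 'feed parsed'
```

The Resque worker's job is then just to write its output somewhere the status endpoint can see (a database row, a cache key), so "not yet" naturally turns into the result once the job finishes.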
You have an architecture problem here. The reason for the H12 timeout is to ensure the user is not left waiting for more than 30 seconds.
By moving the long-running task into a Resque queue, you disconnect it from the front-end web process: because of process isolation, the two cannot communicate directly.
Therefore you need to look at what you are doing and how. For instance, if you are pulling a feed, can you do it at some point before the user needs the output and cache the results in some way? Or can you take the user's request for the feed and email them when the data is ready for them to look at, and so on?
The underlying problem is that your users are asking for something that takes longer than a reasonable amount of time to complete, so you need to take a good look at what you are doing and how.