My question is obvious. Can spams (I mean blog spams) make use of javascript on my site?
For example "Laravel" needs csrf code on forms. If I pass csrf code as a javascript variable and append the code using jquery append method into the form, will spams be able to submit the form?
Of course, since the source code is on the client side, they can either run it, alter it before running it, or reverse engineer it and populate the form field with the required value directly. That's why any spam protection mechanism always needs a server-side part. Otherwise you're just making it harder to break... That said, consider whether the result is worth the effort it takes to reverse engineer your spam protection.
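To make that concrete, here is a minimal plain-PHP sketch of what the server-side part might look like, assuming a token stored in the session; the field and session key names are placeholders, not Laravel's actual implementation (Laravel handles this for you in its CSRF middleware):

```php
<?php
// Hypothetical form handler: refuse the POST unless the token submitted with
// the form matches the one stored in the session. Field and session key names
// are placeholders for illustration.
session_start();

$submitted = $_POST['_token'] ?? '';
$expected  = $_SESSION['form_token'] ?? '';

if ($expected === '' || !hash_equals($expected, $submitted)) {
    http_response_code(419); // missing or forged token: treat as spam
    exit('Invalid or missing form token.');
}

// Token is valid: continue processing the submission (and still validate
// the individual fields server-side).
```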
So, our company has been developing anti-spam for websites and I can say that many spambots (about 30%) are able to use JavaScript.
I have a form users fill out and JavaScript is used to validate the input (e.g. makes sure the password field isn't left blank). Since JavaScript is client side and not compiled anyone can easily mess around with it. Does this mean it's necessary to validate data from the user again on the server? If yes, is there any way it can be made more efficient since JavaScript (theoretically) already did it?
Yes, it is necessary to validate data on the server because it can be messed with by end users client-side.
If yes, is there any way it can be made more efficient since JavaScript (theoretically) already did it?
It is already more efficient than having only server-side validation, because you avoid a lot of round-trips for validation by having client-side validation (you only need to submit the data once, and unless validation was incomplete or disabled, it will go through straightaway). Provides a better user experience, too.
You cannot do away with server-side validation (if you care about the data). If the data only ever goes back to the same user and is not shown or used anywhere else (and has no potential to break anything on your system), you could relax this a little. As an extreme example, Dropbox probably does not care what files you upload, so they don't validate whether the HTML you upload contains malicious JavaScript.
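As an illustration, here is a hedged PHP sketch of repeating on the server the same checks the JavaScript already performed; the field names are made up for the example:

```php
<?php
// Re-run the checks the client-side JavaScript already did.
// Field names ('email', 'password') are illustrative only.
$errors = [];

$email    = trim($_POST['email'] ?? '');
$password = $_POST['password'] ?? '';

if (filter_var($email, FILTER_VALIDATE_EMAIL) === false) {
    $errors[] = 'A valid email address is required.';
}
if ($password === '') {
    $errors[] = 'The password field must not be left blank.';
}

if ($errors) {
    // With client-side validation enabled this branch is rarely reached,
    // but it is the only check a malicious client cannot skip.
    http_response_code(422);
    exit(implode("\n", $errors));
}
// ...safe to continue processing the form...
```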
I can disable any javascript on your page just with a click of the mouse. I can even totally bypass an HTML form and send data directly to your server.
For example, if you retrieve data with $_GET I can bypass your form (and the JavaScript validation) just by messing with the address bar. Don't think that using $_POST would change this: it's just a matter of writing an HTTP request.
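To show how little effort that takes, here is a hedged sketch of such a request written with PHP's curl functions; the URL and field names are invented, and the point is simply that the page's JavaScript never runs:

```php
<?php
// Post arbitrary values straight to the form handler, skipping the page,
// its <form>, and every line of its JavaScript. URL and fields are fictional.
$ch = curl_init('https://example.com/register.php');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => http_build_query([
        'username' => str_repeat('x', 100000), // far beyond any maxlength
        'email'    => 'not-an-email',          // would fail the JS validation
    ]),
]);
echo curl_exec($ch);
curl_close($ch);
```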
So, yes... Never trust user input, even if sanitized with javascript.
As somebody posted above, JavaScript validation can prevent legitimate user errors (saving the round trip the invalid data would otherwise have made to your server and back), but malicious users will still be able to bypass it VERY easily.
short answer: yes and always!
Read about PDO, SQL injection, UUIDs, tokens, MD5, SHA, cross-site request forgery... You have a whole new world to discover! :) I mean it in a good way. Learn about this and you'll build more secure websites.
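To give a taste of the PDO part of that list, here is a minimal hedged sketch of a parameterised query, which is the standard defence against SQL injection; the DSN, credentials and table are placeholders:

```php
<?php
// Parameterised query with PDO: user input is bound as data, never
// concatenated into the SQL string. DSN, credentials and table are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'secret', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$stmt = $pdo->prepare('SELECT id, name FROM users WHERE email = :email');
$stmt->execute([':email' => $_POST['email'] ?? '']);
$user = $stmt->fetch(PDO::FETCH_ASSOC);
```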
You always need to keep this in mind: never trust user input data. Never. So you have to perform an extra validation pass on the server side.
Yes absolutely. It is still possible for someone to intercept the form and modify values before re-posting to the server.
The user certainly can disable JavaScript. It is also very easy to mess with it as the source code is right there. The user can also run arbitrary JS, making it even easier to mess with your stuff.
Therefore, you should always do server side validation as well. Client side validation should only be used as convenient information for the user. Never trust it as your only security source.
Yes, you MUST validate both in client side and server side.
You must think in terms of progressive enhancement. Think of JavaScript as just a layer of enhancement and not a necessity, because it is always at the user's discretion to disable JavaScript in their browser, rendering your JavaScript code useless.
A plus of client-side validation is that you save round trips to the server for checks that can easily be done in JavaScript, such as whether the username or password field is empty.
Yes, to be safe you will need to add server side validation.
Nothing that is expected to have been done on the client side is guaranteed so you will need to repeat anything that is important.
Additionally there are things that are likely to be evaluated on the server side but not on the client side. Things like checks for SQL injection fall into this category.
One argument for using both client side validation (JavaScript) and server side validation using a validator is that if the client browser does not support JavaScript or JavaScript has been turned off deliberately, then client side validation is rendered useless.
My question is how good is this argument in practice? In theory it makes sense, but in practice, if JavaScript is disabled in the browser, then most website features will not even work. The user probably cannot even load the page without JavaScript, let alone submit a form.
Client-side validation just keeps the user from going "but I filled this all in and it didn't tell me anything!". It's not actually mandatory, and in reality client-side validation is a very new thing (read: 5 years old or less). In practice, all it does is let your client (with JS enabled) know whether the form is okay before reloading the page.
If AJAX is in the game, it is different - it allows you to save bandwidth as well as to provide user with feedback before submission.
Finally, if you're building strictly client-side, peer-to-peer exchange apps (think games), you'll want client-side validation to keep the clients from cheating.
Server-side validation is also crucial due to the fact that client-side validation can be completely bypassed by turning off JavaScript. In a way, JS-driven validation is a convenience and an aesthetic/cosmetic improvement and should not be relied upon. Furthermore, it is trivial to edit the source of a page locally in order to disable or bypass even the most complex of JS validation.
What could a user do if you do not server-side validate? Anything, depending on how you use their data. You could be allowing users to drop entire databases (or worse, leak them), modify anything they like (or worse, read anything they like. Directory traversal flaws are extremely common entrance points for naughty people), and elevate their privileges at will. Do you want to run this risk? Not validating user input is like trusting people and not installing locks on your house.
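For the directory traversal point specifically, here is a hedged sketch of the kind of server-side check that stops ../ tricks; the base directory and parameter name are placeholders:

```php
<?php
// Resolve the requested file and make sure it still lives inside the
// directory we intend to serve from. Base path and parameter are placeholders.
$base      = realpath('/var/www/app/uploads');
$requested = realpath($base . '/' . ($_GET['file'] ?? ''));

$allowed = $requested !== false
    && strpos($requested, $base . DIRECTORY_SEPARATOR) === 0;

if (!$allowed) {
    http_response_code(404); // outside the allowed directory, or missing
    exit('Not found.');
}
readfile($requested);
```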
Validation should always be performed server-side - you can never trust client-side validation.
Client-side validation is always in the sense of providing a better User Experience (UX), so the user doesn't have to submit and reload a page simply because a value in a form isn't valid - it makes things more dynamic.
As you don't even need a browser to make requests, and regardless of whether your website relies on JS to work properly, you will need server-side validation and to sanitize all user input if you care about not having your databases pwned.
Now it is up to you whether you want to provide a UI with dynamic client-side validation hints or not.
Always protect your inputs on the server. It's not only that users may have JavaScript disabled, but also that they could break the server.
For example, if a site has a JavaScript max-length check on an <input>, a user could disable that check and send more data than your server and/or database is expecting. This could overload the server (a large POST occupying a server thread for a long time), or expose a weakness in the database, for example by violating a database constraint and revealing details about the persistence layer. Worse, if there is no constraint, a user might be able to perform injection attacks.
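A hedged sketch of enforcing that limit again on the server, before the data goes anywhere near the database; the field name and limit are placeholders:

```php
<?php
// Enforce the same limit the <input> maxlength advertised, because the
// client-side attribute can simply be removed. Field name and limit are
// placeholders for illustration.
const MAX_COMMENT_LENGTH = 500;

$comment = $_POST['comment'] ?? '';

if (mb_strlen($comment) > MAX_COMMENT_LENGTH) {
    http_response_code(422);
    exit('Comment must be at most ' . MAX_COMMENT_LENGTH . ' characters.');
}
// Only now hand $comment to the database, ideally via a prepared statement.
```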
Another example is someone using an external HTTP tool to send requests to your server, completely bypassing any JavaScript. I use the Advanced REST Client for Chrome all the time in development for testing JSON APIs.
Client-side validation through JavaScript is just a way of giving the person using the site quicker feedback about their interaction with it. In traditional client-server communication it should not be the only validation, for the reasons outlined above.
If a user has disabled JavaScript, that is their own problem; they alone decided to disable it for a reason. Even so, when you build a website you must always keep in mind that it should work for users both with and without JavaScript. Validation on both sides is needed for a number of reasons, some of which are:
User has disabled javascript
A malicious user has deliberately removed the JavaScript in order to exploit the system
With JavaScript validation you reduce the data traffic between the website and the client.
And of course, with server-side validation you make sure once and for all that the data is correct
It is possible to have a website that uses both JavaScript and "older" technologies and works for every user and every browser.
Client-side validation is a solution for highly interactive forms with on-the-fly field validation, but it will not prevent an ill-intentioned user from injecting and posting invalid or malformed data to the server. It is important that your server-side script validates everything the user is doing; otherwise you will expose your site to SQL injection attacks, XSS attacks, users doing stuff they are not supposed to, etc.
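As a hedged sketch of the server-side half: validate the value when it arrives and escape it on output, so that injected markup is rendered harmless; the field name and length limit are placeholders:

```php
<?php
// Validate the value when it arrives...
$name = trim($_POST['name'] ?? '');
if ($name === '' || mb_strlen($name) > 50) {
    http_response_code(422);
    exit('Name is required and must be 50 characters or fewer.');
}

// ...and escape it whenever it is echoed back into HTML, so that a value
// like <script>alert(1)</script> is shown as text instead of executed.
echo 'Hello, ' . htmlspecialchars($name, ENT_QUOTES, 'UTF-8') . '!';
```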
I use some jQuery and JS functions to validate forms, for example to check whether a field is empty or the password is shorter than 6 characters, and I noticed that if someone disables JS these functions become useless and the forms are no longer protected, which can be done better using PHP. So should I avoid them now, because this could be a weak point in my website?
JavaScript is very useful for improving user-interaction, along with reducing the number of requests made on the server; but, as you've pointed out, it can be disabled. To answer your question: No, I wouldn't recommend avoiding the use of JavaScript, but the key is not to rely on it for critical actions, such as validation. You'll want to implement the validation on both the client (JavaScript) and server (PHP) sides. This will ensure that there is no way for the user to completely disable the validation.
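As a rough sketch of the "both sides" idea for the password rule from your question, the same minimum length is simply checked again in PHP (the field name is a placeholder):

```php
<?php
// The jQuery check ("password shorter than 6 characters") repeated on the
// server, where it cannot be switched off. Field name is a placeholder.
$password = $_POST['password'] ?? '';

if (strlen($password) < 6) {
    http_response_code(422);
    exit('Password must be at least 6 characters long.');
}
// The JavaScript gives instant feedback; this check is what actually enforces it.
```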
In short:
JavaScript = Good
JavaScript Validation = Nice for the user, but not reliable
Server Side Validation = Essential
Side note:
With regard to relying on JavaScript for the overall user interaction, as other answers have suggested, use JavaScript to enhance the overall experience, but don't rely on the user having it turned on. I would recommend a bit of bedtime reading on Progressive Enhancement: it's the approach of making something better without being reliant on it, and it can be applied to more than just JavaScript (CSS3, etc.).
You should use javascript and jQuery to enhance the functionality of your site. If someone has javascript turned off you should ensure that the site still works, and they can access all areas - even though there may be some things they cannot see, such as transitions and effects.
With regard to validation, you should ALWAYS validate server side. Client side validation is optional, but a very nice feature to improve the user experience for your visitors.
I don't mean this as a complaint, more of an observation.
Web developers often grab Javascript libraries as the latest shiny toys with no thought for the consequences. As a Data Governance person, I'm regularly the bringer of bad news, when we do audits on the data collected, and find users who get past the Javascript validation.
"Go back and start again".
JavaScript libraries are great for the presentation of data, but should not and must not be relied on for data validation or the integrity of user profiles.
Validation should always happen at:
Client side - using JavaScript to enhance the user experience.
Server side - using the preferred server-side programming language, for security reasons.
Service side - if you are following SOA / Web API, as part of defensive programming practice. This can also be at the DB level along with the service layer.
You should not avoid using JS and jQuery in your website, but you should avoid using them for validation purposes or business-logic purposes. These should be done in the back-end of the website, not in the UI level.
You shouldn't really avoid them. What you should do is to implement both client and server validation. Don't rely just on client validation, for the reasons you just mentioned. You should always validate the data when it arrives to server.
Adding client-side validation gives your pages a dynamic look and feel. A user does not have to wait for the form to go to the server and, in case of an error, come back. It is verified automatically without a postback.
As I said above, don't rely upon client side validation. Always implement server-side validation also.
Remember, my friend: JavaScript is client-side scripting. Its purpose is to validate the form on the client side itself so we can avoid overhead. You should use JavaScript on the client side, because PHP is a server-side language and a round trip to the server takes time to reply.
That said, server-side validation is equally important; for that you use a server-side language. (See halfer's comment below my post.) Server-side validation is important for security and many other purposes.
Whether somebody has disabled JS is a different matter; you can check for that and show a proper message asking them to enable it.
Possible Duplicate: How to prevent your JavaScript code from being stolen, copied, and viewed?
I have a user form that has a submit button. Once clicked, a certain JS function is called and the code does something. I want to hide this so that nobody can see it. What is the best way to do this? I am not using any JS libraries, it is just code that I wrote myself.
Thanks!
You can always obfuscate and minify your code so that it's reduced to single letters and such. There is no real way to stop someone from stealing your JavaScript, but that is the best way to "hide" it so people can't really read your variable names, etc.
Instead of using JavaScript, you could submit the form to the server, and run the same code using PHP or something. If you're manipulating something on the page based on what happens in the JS function, you could do the following:
When the form is submitted, cancel with JS (you're probably already doing this)
Get all the information from the form, do everything that you're comfortable doing on the client side
Make an AJAX call, sending the necessary data to the server, where your now-confidential function can run (see the sketch after this list)
Return whatever information you need to return back to the client and make whatever changes you need to make
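For step 3, a hedged sketch of what the server-side endpoint might look like in PHP; the endpoint name, fields and calculation are all invented for illustration:

```php
<?php
// secret-calc.php -- hypothetical endpoint for the AJAX call in step 3.
// It reads the posted form data, runs the logic you no longer ship to the
// browser, and returns the result as JSON.
header('Content-Type: application/json');

$quantity = (int) ($_POST['quantity'] ?? 0);
$code     = (string) ($_POST['discount_code'] ?? '');

// The "confidential" rule now lives only on the server.
$unitPrice = 20.0;
$discount  = ($code === 'SPRING' && $quantity >= 10) ? 0.15 : 0.0;

echo json_encode(['total' => round($quantity * $unitPrice * (1 - $discount), 2)]);
```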
You can't. This is anathema to the entirety of the web. For the same reason you can't stop picture theft.
What you could do to help prevent it is not only obfuscation, as mentioned before, but also preventing outright downloading, requiring referrer headers, and the like.
In the end though, if you expose it via HTTP, someone else will be able to steal it. End of story.
You can't entirely.
The best option is to obfuscate it, but then it is still delivered to the client's browser, and they can read it (albeit obfuscated).
There are other ways to obscure it, such as serving it via a custom handler that requires some kind of authentication, but ultimately it's security through obscurity, which is no security at all.
Web-Applications these days make extensive use of Javascript, for example various Google Products like Gmail and Calendar.
I'm struggling with how NOT to have duplicated logic on the server and client side.
When requesting a page or state of the application, I would prefer to send the complete UI, meaning: not just some JavaScript which in turn makes a dozen AJAX requests and builds the user interface.
But here lies the problem: the logic deciding what to show or not has to be written once in the server-side language and once in the client-side language.
Then I was wondering if it is somehow possible to process your JavaScript logic server-side and send the whole thing to the client, who in turn can continue using the application with all the advantages of a responsive UI, but without the disadvantage of the initial loading/building of the user interface depending on background AJAX requests.
I hope the explanation of my problem is reasonably clear, because I'm not the most fluent English writer. If you understand what I mean and can describe the problem a little better, please do... thanks!
So my question is:
Is something like this possible and/or realistic?
What is your opinion on how to tackle this problem?
;-)
When we started our web app, we had the same kind of questions.
It may help you to know how we ended:
The backend (business logic, security) is totally separated from the frontend (gui)
frontend and backend communicate through JSON services exclusively
the JSON is rendered client-side with the PURE templating library
and the backend is Erlang (anything streaming JSON would be ok too, but we liked its power)
And for your question, you have to consider the browser as totally unsafe.
All the security logic must come from the backend.
Hiding or showing some parts of the screen client side is ok, but for sure the backend decides which data is sent to the browser.
It seems you are describing Jaxer. You can write everything in JS. There is also GWT, which allows you to write the whole thing in Java.
Then I was wondering if it is somehow possible to process your JavaScript logic server-side and send the whole thing to the client, who in turn can continue using the application with all the advantages of a responsive UI, but without the disadvantage of the initial loading/building of the user interface depending on background AJAX requests.
Maybe the apps you're looking at just use Ajax poorly.
The only content you can pre-process on the server is the content you already know the user wants. For example, in an email app, you could send them a complete view of their inbox, pre-processed on the server and fetched with a single request, as soon as they log in. But you might use AJAX to fetch a particular message once they click on it. Sending them all the messages up front would be too slow.
Used correctly, AJAX should make your pages faster, because it can request tiny updates or changes of content without reloading the whole page.
But here lies the problem: the logic deciding what to show or not has to be written once in the server-side language and once in the client-side language.
Not necessarily. For example, in PHP, you might write a function like displayWidgetInfo(). You could use that function to send the initial widget information at page load. If the user clicks the widget to change something, send an AJAX request to a PHP script that also uses displayWidgetInfo() to send back new results. Almost all your logic stays in that single function.
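A hedged sketch of that idea, keeping the displayWidgetInfo() name from the answer; the data lookup and markup are invented for illustration:

```php
<?php
// One function renders the widget markup; both the initial page load and the
// AJAX endpoint call it, so the logic exists in exactly one place.
function displayWidgetInfo(int $widgetId): string
{
    // Stand-in for a real database fetch.
    $widget = ['id' => $widgetId, 'name' => 'Widget #' . $widgetId];

    return '<div class="widget" data-id="' . $widget['id'] . '">'
         . htmlspecialchars($widget['name'], ENT_QUOTES, 'UTF-8')
         . '</div>';
}

if (isset($_GET['ajax'])) {
    // AJAX update: return just the refreshed fragment.
    echo displayWidgetInfo((int) ($_GET['id'] ?? 0));
    exit;
}

// Initial page load: embed the same fragment in the full page.
echo '<html><body>' . displayWidgetInfo(1) . '</body></html>';
```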
Your instincts are correct: it's bad to duplicate code, and it's bad to make too many requests for one page. But I think you can fix those problems with some refactoring.
I understand what you're saying.
But I don't think you should have much 'logic' about what to build on the client side. If you did want to go with a model like the one you're proposing (not my cup of tea, but why not), I don't see why you'd end up with much duplication.
Where you would normally show a table or div, you would just output JavaScript that builds the relevant components on the client side.
I would consider it just as another 'View' into your data/business logic model.
Have you got a small example of a problem you're coming up against?
I understand your question in this way:
Suppose we have an HTML form on a web page with fields for name and surname. We have to check them for validity both on the client side (with JS) and on the server side (in the PHP script that processes the form input). So here is the duplication: a regex check on both sides. So what is the way to prevent it and combine these two pieces of logic?