Anyone have suggestions for creating an extremely simple form verification field using jQuery? I need to block basic form spam. I'd love to have some kind of 1+1= field that is used to make sure it's a human submitting the form. I don't have the ability to put .php or .asp on the site, so it would need to rely on jQuery or some other method.
Any suggestions??
It's quite possible the bot script doesn't even run JavaScript, in which case a JavaScript check would be useless. I don't think this is feasible without a server-side solution.
You cannot solve this issue on the client side (with jQuery). In a spam-submitting situation, the client cannot be trusted.
To solve this issue you will need to modify things server side.
A couple of suggestions that are probably enough:
Send a hash of some session variables unique to this user with the form, and validate it on reply. This prevents replay attacks by detecting bad hashes.
Add fields that should be left blank, and hide them using CSS. Spam bots will fill them out. Filter out submissions that contain data in these fields (see the sketch below).
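A rough sketch of that second suggestion, assuming a Node/Express backend purely for illustration (the /contact route and the "website" field name are made up, not from the original post):

```js
// Server-side filter for the honeypot field. A real user never sees the
// "website" input because CSS hides it, so anything in it means a bot.
const express = require('express');
const app = express();

app.use(express.urlencoded({ extended: false }));

app.post('/contact', (req, res) => {
  if (req.body.website && req.body.website.trim() !== '') {
    return res.status(400).send('Submission rejected.');
  }
  // ... handle the legitimate submission here ...
  res.send('Thanks!');
});

app.listen(3000);
```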
I've been searching for something like that too, and I think I've found something useful that also looks nice:
http://www.webdesignbeach.com/beachbar/ajax-fancy-captcha-jquery-plugin
I haven't tried this yet though.
Regards, Sigersted
Related
My question is obvious: can spam bots (I mean blog-comment spam bots) make use of JavaScript on my site?
For example "Laravel" needs csrf code on forms. If I pass csrf code as a javascript variable and append the code using jquery append method into the form, will spams be able to submit the form?
Of course. Since the source code is on the client side, they can run it, alter it before running it, or reverse engineer it and populate the form field with the required value directly. That's why any spam protection mechanism always needs a server-side part; otherwise you're just making it harder to break. That said, consider whether the result is worth the effort of reverse engineering your spam protection.
Our company has been developing anti-spam for websites, and I can say that many spam bots (about 30%) are able to use JavaScript.
I have a form users fill out, and JavaScript is used to validate the input (e.g. it makes sure the password field isn't left blank). Since JavaScript is client-side and not compiled, anyone can easily mess around with it. Does this mean it's necessary to validate the data from the user again on the server? If yes, is there any way it can be made more efficient, since JavaScript (theoretically) already did it?
Yes, it is necessary to validate data on the server because it can be messed with by end users client-side.
If yes, is there any way it can be made more efficient, since JavaScript (theoretically) already did it?
It is already more efficient than having only server-side validation, because you avoid a lot of round-trips for validation by having client-side validation (you only need to submit the data once, and unless validation was incomplete or disabled, it will go through straightaway). Provides a better user experience, too.
You cannot do away with server-side validation (if you care about the data). If the data only ever goes back to the same user and is not shown or used anywhere else (and has no potential to break anything on your system), you could relax this a little. As an extreme example, Dropbox probably does not care what files you upload, so they don't check whether the HTML you upload contains malicious JavaScript.
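To make that split concrete, here is a minimal jQuery sketch of the kind of check that belongs on the client (the selectors are made up); the server still has to repeat the same rule:

```js
// Client-side convenience check: stops an obviously empty password before
// a round trip. A disabled or modified script skips this entirely, so the
// server must enforce the same rule again.
$('#signup-form').on('submit', function (e) {
  if ($('#password').val().trim() === '') {
    e.preventDefault();
    alert('Password must not be blank.');
  }
});
```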
I can disable any javascript on your page just with a click of the mouse. I can even totally bypass an HTML form and send data directly to your server.
For example, if you retrieve data with $_GET, I can bypass your form (and the JavaScript validation) just by messing with the address bar. Don't think that using $_POST would change this: it's just a matter of writing an HTTP request.
So, yes... never trust user input, even if it has been validated with JavaScript.
As somebody posted above, JavaScript validation can prevent legitimate user errors (and thus save the round trip the bad data would otherwise make to your server and back), but malicious users will still be able to bypass it VERY easily.
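To see how little effort a bypass takes, something like this is all it needs (URL and field names are made up); no form, no page, no validation:

```js
// A direct POST that never touches your form or your JavaScript.
fetch('https://example.com/signup', {
  method: 'POST',
  body: new URLSearchParams({ username: 'bot', password: '' })
});
```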
Short answer: yes, and always!
Read about PDO, SQL injection, UUIDs, tokens, MD5, SHA, cross-site request forgery... You have a whole new world to discover! :) I mean it in a good way. Learn about these and you'll build more secure websites.
You always need to keep this in mind: never trust user input. Never. So you have to perform the validation again on the server side.
Yes, absolutely. It is still possible for someone to intercept the form and modify values before re-posting them to the server.
The user certainly can disable JavaScript. It is also very easy to mess with it as the source code is right there. The user can also run arbitrary JS, making it even easier to mess with your stuff.
Therefore, you should always do server side validation as well. Client side validation should only be used as convenient information for the user. Never trust it as your only security source.
Yes, you MUST validate on both the client side and the server side.
You must think in terms of progressive enhancement. Think of JavaScript as just an enhancement layer, not a necessity, because it is always at the user's discretion to disable JavaScript in their browser, rendering your JavaScript code useless.
A plus of client-side validation is that you save round trips to the server for checks like whether the username or password is empty, which can easily be done in JavaScript.
Yes, to be safe you will need to add server side validation.
Nothing that is expected to have been done on the client side is guaranteed, so you will need to repeat anything that is important.
Additionally there are things that are likely to be evaluated on the server side but not on the client side. Things like checks for SQL injection fall into this category.
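For example, the usual defence against SQL injection lives entirely on the server. A sketch using the mysql2 package for Node (the connection details and the query are placeholders; any driver with parameter placeholders works the same way):

```js
const mysql = require('mysql2/promise');

// Never interpolate user input into SQL; let the driver bind it safely.
async function findUser(username) {
  const conn = await mysql.createConnection({ host: 'localhost', user: 'app', database: 'site' });
  const [rows] = await conn.execute('SELECT id, name FROM users WHERE name = ?', [username]);
  await conn.end();
  return rows;
}
```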
I have a website with a simple form that posts input as tweets to a Twitter account.
There is no captcha or any kind of anti-spam system installed, so it's only logical that spam bots have been posting their bogus content like crazy.
Now, I want to build a validation that does not require any sort of captcha.
This is what I've been thinking and I need your honest opinion.
If robots are posting their stuff without using a mouse or keyboard (probably by using some sort of generated script), then that means they never give any sort of focus to the form itself.
My idea is to add a simple JS condition so that the form cannot be submitted without having gained some sort of focus.
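Roughly like this (the selector is a placeholder):

```js
// Only allow the submit after the form has actually received focus.
var formWasFocused = false;

$('#tweet-form').on('focusin', function () {
  formWasFocused = true;
});

$('#tweet-form').on('submit', function (e) {
  if (!formWasFocused) {
    e.preventDefault(); // no focus ever happened, so treat it as a bot
  }
});
```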
What do you think about this?
Any other sort of similar ideas?
thank you!
Tweeting messages that have not been validated sounds pretty insecure on its own. Still, there are a few ways to minimize the spam.
For instance, you could implement a honeypot variation that adds another field to the submission form, but in a way that is hidden from a live user. A bot will mistakenly fill it out, and you can decline the submission on the server side. Believe it or not, most comment spam comes from crawlers, so even a basic approach like this can drastically decrease the amount of malicious content submitted.
You are correct in your assumption that most robots post "without using mouse or keyboard"; it is far easier to do that than to simulate client-side activity. Again, you could use this to your advantage: add a field that is populated by JS and validate its content on the server side.
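A minimal sketch of that last suggestion (jQuery; the field name and value are placeholders, and the server simply rejects submissions where the field is missing or wrong):

```js
// A field that only exists if JavaScript actually ran on the page.
$(function () {
  $('<input>', {
    type: 'hidden',
    name: 'js_check',
    value: 'human-ish' // could also be a per-session token printed into the page
  }).appendTo('#tweet-form');
});
```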
Possible duplicate: How to prevent your JavaScript code from being stolen, copied, and viewed?
I have a user form that has a submit button. Once clicked, a certain JS function is called and the code does something. I want to hide this so that nobody can see it. What is the best way to do this? I am not using any JS libraries, it is just code that I wrote myself.
Thanks!
You can always obfuscate and minify your code so that it's only single letters and such. There is no real way to stop someone from stealing your JavaScript, but that is the best way you can "hide" it so people can't really read your variable names, etc.
Instead of using JavaScript, you could submit the form to the server, and run the same code using PHP or something. If you're manipulating something on the page based on what happens in the JS function, you could do the following:
When the form is submitted, cancel it with JS (you're probably already doing this)
Get all the information from the form, do everything that you're comfortable doing on the client side
Make an AJAX call, sending the necessary data to the server, where your now-confidential function can run
Return whatever information you need to return back to the client and make whatever changes you need to make
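Putting those steps together, a rough sketch (the URL and element IDs are made up):

```js
$('#my-form').on('submit', function (e) {
  e.preventDefault(); // cancel the normal submit

  $.ajax({
    url: '/secret-calculation',   // the confidential logic now lives on the server
    type: 'POST',
    data: $(this).serialize()     // send the form data
  }).done(function (result) {
    $('#result').text(result);    // apply whatever the server sent back
  });
});
```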
You can't. This is anathema to the entirety of the web. For the same reason you can't stop picture theft.
What you could do to help prevent it, besides the obfuscation mentioned before, is to discourage outright downloading: require referrer headers and the like.
In the end, though, if you expose it via HTTP, someone else will be able to steal it. End of story.
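For what it's worth, a referrer check might look something like this (sketched with Node/Express; the path and origin are placeholders), though the header is trivial to forge, so it only raises the bar slightly:

```js
const express = require('express');
const app = express();

// Refuse to serve the script unless the request appears to come from our own pages.
app.get('/app.js', (req, res, next) => {
  const referer = req.get('Referer') || '';
  if (!referer.startsWith('https://example.com/')) {
    return res.status(403).end();
  }
  next(); // fall through to the static file handler below
});

app.use(express.static('public'));
app.listen(3000);
```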
You can't entirely.
The best option is to obfuscate it, but then it is still delivered to the client's browser, and they can read it (albeit obfuscated).
There are other ways to obscure it, such as serving it via a custom handler that requires some kind of authentication, but ultimately it's security through obscurity, which is no security at all.
I'm working on an ajax application that makes extensive use of jQuery. I'm not worried about whether or not the application degrades gracefully.
So far I have been using Malsup's excellent jQuery form plugin to create forms that submit ajax requests. (For example, to submit updated record information.)
However I am considering dispensing with form tags altogether, and instead manually constructing $.post() statements when needed.
I'm wondering: what are people's thoughts on the best way to submit a large amount of information to the server, given that graceful degradation is not a requirement? Are there perils with just using $.post()?
Thanks in advance
Nope, not at all. That's all the plugin is doing anyway, under the hood.
The form tag does at least provide you with a nice structural grouping of your form fields, so that you can query for them more easily, though.
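For comparison, going without a form tag is basically just building the data object yourself (the URL and field names are made up):

```js
var payload = {
  id: $('#record-id').val(),
  name: $('#name').val(),
  notes: $('#notes').val()
};

$.post('/records/update', payload)
  .done(function () {
    // update the page with whatever came back
  })
  .fail(function () {
    // surface the error to the user
  });
```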
You've said it yourself: the peril is that it won't degrade gracefully!
Have jQuery add an extra field called UsingjQuery, then output your results based on whether this field is set or not.
This way users with javascript turned off (mobile clients, etc) will still be able to submit.
Edit: I saw you mentioned "degrades gracefully" but somehow missed that it said "not worried about" first!
Having a form tag does allow one trick that you otherwise lose: resetting the fields via the native reset method, e.g. $('form')[0].reset().
I stopped using FORM tags some time ago, but I also have a set of captured users that I know exactly what platform they are using.
I agree with David Pfeffer; however, I would also make the point that, on occasion, form tags can get in your way. I've specifically had problems where I wanted multiple forms inside a table, but that caused really ugly positioning problems. So I wound up dropping just the input elements in, copying them into a form that was elsewhere on the page, and then submitting that form. It was a bit of a pain in the butt.
If you can do away with forms, and aren't worried about degradation, then I would highly consider it.