Restrict user from tampering with the address URL - JavaScript

I am creating a small web application using ASP.NET MVC3.
http://localhost:1871/ManageMember/Admin/MemberDetail/3
For example, I have the above URL. As you can see, the parameter 3 is easily visible; a user can change it and view the details for that particular id, e.g.:
http://localhost:1871/ManageMember/Admin/MemberDetail/5
I want to stop the user from tampering with the URL.

Impossible, unless you somehow mangle the 3 (i.e. have some mapping on the back end where af320f32la0gw -> 3 and bof320afj2fw -> 5). Then it would at least be very hard to guess the ID.
Other than that you would need some sort of authentication for viewing 3 or 5 initially.
Finally, why do you want to prevent people from doing this? Seems like it could be handy for power users.

IE allows you to lock the address bar (not sure if other browsers support this).
But regardless of how easy, difficult, restricted or "mangled" the URL is, as E. Pills points out, server-side-enforced authorization is mandatory (your code and URL structure imply that you already have an authentication mechanism in place).
One way you could achieve this is to store a server-side session token on initial authentication and verify all incoming requests for proper authorization.
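The question is ASP.NET MVC, but the pattern is stack-agnostic; here is a minimal sketch of that session-plus-authorization idea in Express-style JavaScript (the route, session fields, and loadMember helper are hypothetical, not taken from the question's code):

const express = require('express');
const session = require('express-session');

const app = express();
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));

// Hypothetical lookup; replace with a real database query.
function loadMember(id) {
    return Promise.resolve({ id: id, ownerId: 'user-42' });
}

app.get('/ManageMember/Admin/MemberDetail/:id', function (req, res) {
    // The id in the URL is user input; authorize it on every request.
    loadMember(req.params.id).then(function (member) {
        if (!member || member.ownerId !== req.session.userId) {
            return res.status(403).send('Not authorized');
        }
        res.json({ member: member });
    });
});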

If it is MVC, use Request.ServerVariables["HTTP_REFERER"]; if it is WebForms, use HttpRequest.UrlReferrer to check where the request came from.
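As a sketch, the same referrer check in Express-style JavaScript (hypothetical route; keep in mind the Referer header is sent by the client, so it is easily forged or stripped):

app.get('/ManageMember/Admin/MemberDetail/:id', function (req, res, next) {
    // Heuristic only: the client controls this header.
    const referer = req.get('Referer') || '';
    if (referer.indexOf('http://localhost:1871/') !== 0) {
        return res.status(403).send('Direct access not allowed');
    }
    next();   // fall through to the real handler
});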


HTML hidden input shouldn't be editable

I just discovered a bug I couldn't find any solution for, and I would like your advice on it. There are a few hidden input fields which store the IDs of already-saved data, such as a person id if it has already been saved.
I tried changing the value of one of those hidden variables manually, using Google Chrome, and submitted the form. Surprisingly, I did not get the ID that should have been there; instead I received the ID that I had changed. For instance, where the value was 22, I changed it to 263 and received 263, whereas I should have been receiving 22. I want that 22 to come through, not the 263.
It's hard to explain, I know, but I have tried my level best to convey my issue. Please help and advise me on how I should store hidden values so that they are not editable.
Any idea?
Rule of Web Development #1: Never trust the client
Rule of Web Development #2: Never trust the client
Rule of Web Development #3: You can't make the client trustworthy
If the user shouldn't be able to edit it, never give it to them.
As others have said, there are a few ways to handle the situation. The most common is to use a SESSION variable on the server, available in almost every web framework.
Store the "secret" values in the SESSION. They will be available when the user posts back.
You cannot control what data users put in HTTP requests to your server.
Instead, use authentication and authorization, on the server, when the request is received, to make sure that the user is allowed to submit the values they submit.
If you want to keep track of data from one page to another, I would use sessions. This is data that is tracked on the server.

// page one.php
session_start();          // must come before any output
$_SESSION['id'] = 22;

// page two.php
session_start();
echo $_SESSION['id'];     // 22
This is basic to how browsers work: essentially, someone could POST data pretending to be your form, with whatever values they wanted in the fields, or even with extra fields added.
If it's a problem, consider moving that data from hidden fields to session variables.
If it's important for your hidden fields to be secure, don't keep them on the client side. Client-side values are pretty easy to modify.
You should probably store them in your session, so they're not output to the client. If they're required on the page, use AJAX to grab them instead (a sketch follows below).
It kind of depends on the domain of your application; if it's in-house software then I wouldn't worry about it particularly.
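A sketch of the AJAX suggestion (the /api/current-person-id endpoint is made up; the server reads the value from its session rather than trusting anything echoed back by the page):

// Fetch the sensitive id from the server instead of embedding it
// in a hidden input that anyone can edit in dev tools.
fetch('/api/current-person-id', { credentials: 'same-origin' })
    .then(function (response) { return response.json(); })
    .then(function (data) {
        // Display only; the server must never trust an id POSTed back.
        document.getElementById('person-id').textContent = data.id;
    });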
It does not look like a bug.
What scares you about this? These fields are not going to be accessed and changed by your ordinary visitors. If you're afraid someone is going to intercept a visitor's HTTP request and change his order (for example), then an HTTPS connection should help.

Preventing bot form submission

I'm trying to figure out a good way to prevent bots from submitting my form, while keeping the process simple. I've read several great ideas, but I thought about adding a confirm option when the form is submitted: the user clicks submit, and a JavaScript confirm prompt pops up which requires user interaction.
Would this prevent bots, or could a bot figure this out too easily? Below is the code to demonstrate my idea:
$('button').click(function () {
    if (Confirm()) {
        alert('Form submitted');
        /* perform a $.post() to php */
    } else {
        alert('Form not submitted');
    }
});

function Confirm() {
    // confirm() already returns a boolean, so no ternary is needed
    return confirm('Are you sure about this?');
}
This is one problem that a lot of people have encountered. As user166390 points out in the comments, the bot can just submit information directly to the server, bypassing the javascript (see simple utilities like cURL and Postman). Many bots are capable of consuming and interacting with the javascript now. Hari krishnan points out the use of captcha, the most prevalent and successful of which (to my knowledge) is reCaptcha. But captchas have their problems and are discouraged by the World Wide Web Consortium, mostly for reasons of ineffectiveness and inaccessibility.
And lest we forget, an attacker can always deploy human intelligence to defeat a captcha. There are stories of attackers paying people to crack captchas for spamming purposes without the workers realizing they're participating in illegal activities. Amazon offers a service called Mechanical Turk that tackles things like this. Amazon would strenuously object if you were to use their service for malicious purposes, and it has the downside of costing money and creating a paper trail. However, there are more, erhm, flexible providers out there who would harbor no such objections.
So what can you do?
My favorite mechanism is a hidden checkbox. Give it a label like 'Do you agree to the terms and conditions of using our services?', perhaps even with a link to some serious-looking terms. But default it to unchecked and hide it through CSS: position it off-page, put it in a container with a zero height or zero width, position a div over top of it with a higher z-index. Roll your own mechanism here and be creative.
The secret is that no human will see the checkbox, but most bots fill forms by inspecting the page and manipulating it directly, not through actual vision. Therefore, any form that comes in with that checkbox value set allows you to know it wasn't filled by a human. This technique is called a bot trap. The rule of thumb for the type of auto-form filling bots is that if a human has to intercede to overcome an individual site, then they've lost all the money (in the form of their time) they would have made by spreading their spam advertisements.
(The previous rule of thumb assumes you're protecting a forum or comment form. If actual money or personal information is on the line, then you need more security than just one heuristic. This is still security through obscurity, it just turns out that obscurity is enough to protect you from casual, scripted attacks. Don't deceive yourself into thinking this secures your website against all attacks.)
The other half of the secret is keeping it. Do not alter the response in any way if the box is checked. Show the same confirmation, thank you, or whatever message or page afterwards. That will prevent the bot from knowing it has been rejected.
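Here is a minimal sketch of the checkbox trap as an Express-style handler (the accept_terms field name, route, and saveComment helper are made up for illustration; the hiding itself happens in your CSS as described above):

const express = require('express');
const app = express();
app.use(express.urlencoded({ extended: false }));   // parse form posts

function saveComment(body) { /* hypothetical: persist to the database */ }

app.post('/comment', function (req, res) {
    // 'accept_terms' is the hidden checkbox no human ever sees.
    // Browsers omit unchecked checkboxes, so a value here means a bot
    // filled the form by inspecting the markup.
    if (req.body.accept_terms) {
        // Discard silently, but send the normal thank-you response
        // so the bot never learns it was rejected.
        return res.send('Thanks for your comment!');
    }
    saveComment(req.body);
    res.send('Thanks for your comment!');
});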
I am also a fan of the timing method. You have to implement it entirely on the server side. Track the time the page was served in a persistent way (essentially the session) and compare it against the time the form submission comes in. This prevents forgery or even letting the bot know it's being timed - if you make the served time a part of the form or javascript, then you've let them know you're on to them, inviting a more sophisticated approach.
Again though, just silently discard the request while serving the same thank you page (or introduce a delay in responding to the spam form, if you want to be vindictive - this may not keep them from overwhelming your server and it may even let them overwhelm you faster, by keeping more connections open longer. At that point, you need a hardware solution, a firewall on a load balancer setup).
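And a sketch of the timing method in the same Express-style JavaScript (assumes the session middleware from the earlier sketch; the five-second threshold and saveMessage helper are arbitrary examples):

function saveMessage(body) { /* hypothetical: persist to the database */ }

app.get('/contact', function (req, res) {
    // Record the serve time server-side only; embedding it in the form
    // or the javascript would tell the bot it is being timed.
    req.session.formServedAt = Date.now();
    res.send('form page goes here');
});

app.post('/contact', function (req, res) {
    const elapsed = Date.now() - (req.session.formServedAt || 0);
    if (elapsed < 5000) {
        // Faster than any human: silently discard, same thank-you page.
        return res.send('Thanks for your message!');
    }
    saveMessage(req.body);
    res.send('Thanks for your message!');
});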
There are a lot of resources out there about delaying server responses to slow down attackers, frequently in the form of brute-force password attempts. This IT Security question looks like a good starting point.
Update regarding Captchas
I had been thinking about updating this question for a while regarding the topic of computer vision and form submission. An article surfaced recently that pointed me to this blog post by Steve Hickson, a computer vision enthusiast. Snapchat (apparently some social media platform? I've never used it, feeling older every day...) launched a new captcha-like system where you have to identify pictures (cartoons, really) which contain a ghost. Steve proved that this doesn't verify squat about the submitter, because in typical fashion, computers are better and faster at identifying this simple type of image.
It's not hard to imagine extending a similar approach to other Captcha types. I did a search and found these links interesting as well:
Is reCaptcha broken?
Practical, non-image based Captchas
If we know CAPTCHA can be beat, why are we still using them?
Is there a true alternative to using CAPTCHA images?
How a trio of Hackers brought Google's reCaptcha to its knees - extra interesting because it is about the audio Captchas.
Oh, and we'd hardly be complete without an obligatory XKCD comic.
Today I successfully stopped a continuous spamming of my form. This method might not always work of course, but it was simple and worked well for this particular case.
I did the following:
I set the action property of the form to mustusejavascript.asp which just shows a message that the submission did not work and that the visitor must have javascript enabled.
I set the form's onsubmit property to a javascript function that sets the action property of the form to the real receiving page, like receivemessage.asp
The bot in question apparently does not handle javascript so I no longer see any spam from it. And for a human (who has javascript turned on) it works without any inconvenience or extra interaction at all. If the visitor has javascript turned off, he will get a clear message about that if he makes a submission.
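A minimal sketch of that trick (assuming a form with id="contactForm" whose static action attribute points at the decoy page, mustusejavascript.asp):

// Runs only when JavaScript is enabled, which is exactly the point:
// bots that never execute scripts keep posting to the decoy page.
document.getElementById('contactForm').addEventListener('submit', function () {
    this.action = 'receivemessage.asp';   // swap in the real endpoint
});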
Your code would not prevent bot submission, but that's not because of how your code is written. The typical bot out there will more likely do an external/automated POST request to the URL (the form's action attribute). Typical bots aren't rendering HTML, CSS, or JavaScript; they are reading the HTML and acting upon it, so any client-side logic will not be executed. For example, cURLing a URL will get the markup without loading or evaluating any JavaScript. One could create a simple script that looks for a <form> and then does a cURL POST to that URL with the matching keys.
With that in mind, a server-side solution to prevent bot submission is necessary. Captcha + CSRF should suffice. (http://en.wikipedia.org/wiki/Cross-site_request_forgery)
No, really, are you still thinking that Captcha or reCaptcha are safe?
Bots nowadays are smart and can easily recognise letters in images using OCR tools (search for them to understand).
I say the best way to protect yourself from automated form submission is to add a hidden hash, generated (and stored in the session of the current client on your server) every time you display the form.
That's all: when a bot or any zombie submits the form, you check whether the given hash equals the hash stored in the session ;)
For more info, read about CSRF!
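A sketch of that hash-per-form idea in Express-style JavaScript (session middleware assumed as in the earlier sketches; the route names and handleSubmission helper are made up):

const crypto = require('crypto');

function handleSubmission(body) { /* hypothetical: process the form */ }

app.get('/form', function (req, res) {
    // Fresh random token per render, kept server-side in the session.
    req.session.formToken = crypto.randomBytes(32).toString('hex');
    res.send(
        '<form method="post" action="/form">' +
        '<input type="hidden" name="token" value="' + req.session.formToken + '">' +
        '<input type="text" name="message"><button>Send</button></form>');
});

app.post('/form', function (req, res) {
    const sent = String(req.body.token || '');
    const kept = req.session.formToken;
    if (!kept || sent !== kept) {
        return res.status(403).send('Invalid token');
    }
    delete req.session.formToken;   // single use
    handleSubmission(req.body);
    res.send('Thanks!');
});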
You could simply add a captcha to your form. Since captchas are always different, and are images, bots cannot decode them. This is one of the most widely used protections on all websites...
You cannot achieve your goal with JavaScript, because a client can parse your JavaScript and bypass your methods. You have to do validation on the server side, via captchas. The main idea is that you store a secret on the server side and validate the form submitted from the client against that secret.
You could measure the time the registration took; nobody fills in the text boxes in an instant!
I ran across a form input validation that prevented programmatic input from registering.
My initial tactic was to grab the element and set it to the option I wanted. I triggered focus on the input fields and simulated clicks on each element to get the drop-downs to show up, then set the value, firing the change events. But when I tried to click save, the inputs were not registered as having changed.
; failed automation attempt because the window doesn't register changes.
;$iUse = _IEGetObjById($nIE,"InternalUseOnly_id")
;_IEAction($iUse,"focus")
;_IEAction($iUse,"click")
;_IEFormElementOptionSelect($iUse,1,1,"byIndex")
;$iEdit = _IEGetObjById($nIE,"canEdit_id")
;_IEAction($iEdit,"focus")
;_IEAction($iEdit,"click")
;_IEFormElementOptionSelect($iEdit,1,1,"byIndex")
;$iTalent = _IEGetObjById($nIE,"TalentReleaseFile_id")
;_IEAction($iTalent,"focus")
;_IEAction($iTalent,"click")
;_IEFormElementOptionSelect($iTalent,2,1,"byIndex")
;Sleep(1000)
;_IEAction(_IETagNameGetCollection($nIE,"button",1),"click")
This caused me to rethink how input could be entered: by directly manipulating the mouse's actions to simulate more mouse-like selection behavior. Needless to say, I won't have to manually upload images one by one to update product images for companies.
I used numbers before letters in the file names so that my script's own file sorts to the end of the directory. When the image-upload window pops up, I use Active Accessibility to get the SysListView from the window and select the second element, which is a picture (the first element is a folder; a FindFirstFile call, by contrast, returns only files first). I use the name to search for the item in a database of items, then access those items and update a few attributes after uploading the images. Then I move the file from that folder to another folder so it doesn't get processed again, move on to the next first file in the list, and loop until the script's name is found at the end of the update.
Just sharing how a lowly data-entry person saves time and fights all these evil form validation checks.
Regards.
This is a very short version that hasn't failed since it was implemented on my sites 4 years ago, with variances added as needed over time. It can be built up with all the variables and if/else statements that you require.
function spamChk() {
    var ent1 = document.MyForm.Email.value;
    var str1 = ent1.toLowerCase();
    if (str1.includes("noreply")) {
        document.MyForm.reset();   // wipe the form instead of submitting
    }
}

<input type="text" name="Email" oninput="spamChk()">
I had actually come here today to find out how to redirect particular spam bot IP addresses to H E L L .. just for fun
Great ideas.
I removed reCaptcha a while back, converted my contactform.html to contactform.asp, and added this to the top (obviously with some code in between to fulfil a few functions like sendmail, verifying the form is filled out completely, etc.).
<%
If Request.Form("Text") = "8" Then
    ' process the submission (sendmail etc.)
Else
    Response.Redirect "http://www.google.com"
End If
%>
On the form I stuck a basic text field with the name Text, so it just looks like any other field, not specifying what it's for at all. Two lines above it, I then stuck some text in red stating: enter what 2 + 6 equals in the box below to submit your request.

Hide the URL in a Grails application

Is there a way to hide the URL in the address bar of a Grails application? Right now, users of the web application can see and change the request parameter values in the address bar, and they see the record id on the show page.
Is there a way in JavaScript, Groovy (URL mapping), Grails (.gsp), HTML, or Tomcat (server.xml, conf.xml, or web.xml inside the application in webapps)?
For example, with a URL like http://www.example.com/hide/show/ followed by the record id, I want to avoid that URL and always see (http://www.example.com) or (http://www.example.com/hide/show), without the record id.
Is there a way to prevent this?
No, most browsers don't let you hide the address field, even if you open a new window using window.open. This is a security feature, so that one site can't easily pretend to be another.
Your application should have security checks so that one user can't access data that only another user should see. Just hiding the URL would not be safe anyway; you can easily get around that using tools built into the browser, or readily available add-ons.
It's part of the RESTful URL pattern implemented by Grails.
Your best bet to hide the URL would be using an iframe within the page you want the user to see in their address bar.
Not quite sure what you mean, but I would change the default root URL mapping in UrlMappings.groovy so it looks a bit like this:
static mappings = {
    "/$controller/$action?/$id?" {
        constraints {
            // apply constraints here
        }
    }

    // Change it here!!!!
    "/"(controller: 'controllerName', action: 'actionName')
}
Then pass all parameters via a POST instead of a GET; just change the <g:form> method.
You will still obviously need to implement any security checking required in the controller as stated by other posters.
Thanks,
Jim.
You can probably handle this using a variation of Post/Redirect/Get:
http://en.wikipedia.org/wiki/Post/Redirect/Get
At our Grails site we have a lot of search fields. When a user clicked a pagination link, all those search fields ended up in the URL, which created ugly URLs, with a higher risk that users bookmarked those addresses, which could mean future problems.
We solved this by saving not only all POST but also GET parameters in the session, redirecting to a GET without parameters, and appending the saved parameters again in the controller. This not only creates nice URLs but also acts as a memory, so that if a user goes back to an earlier menu, the previously selected details within that menu are redisplayed.
For your specific request to hide the id in "show/42" you can probably handle that likewise or possibly configure Grails to use "show?id=42" instead, but we don't have that requirement so I haven't looked further into that issue. Good luck!
Forgot to mention: this won't add much to security since links will still contain ids, it will only clean up the address bar.
Here's some sample code that should work. If show?id=42 is called, it saves id=42 in the session, then redirects to just show and id=42 is added to params before further processing. It does what you want, but as commented it might not always be a wise thing to do.
def show = {
    if (request.method == 'GET' && !request.queryString) {
        if (session[controllerName]) {
            params.putAll(session[controllerName])
        }
        // Add the typical code for show here...
    } else {
        session[controllerName] = extractParams(params)   // helper that picks out the params to keep
        redirect(action: 'show')
        return
    }
}

How to cross-domain redirect a user with Google Analytics

After a user fills in my "new" user form on "example-one.com", the "create" controller creates the record in the db. Then it does a redirect_to to an external site, "payment-checkout.com". I have set up the Google Analytics code on both sites.
Google provides two functions, _link and _linkByPost, for you to use in any links or forms that go to your external domains. The problem is that the user is being redirected by the controller action outside of the view, and I can't use those two JavaScript functions to pass on the relevant GA info. What do I do?
Can anyone help?
The way _link works is by passing the Google Analytics cookies from your first domain via a query string to your second domain. The second domain, if configured correctly, will accept those URL parameters and apply them as cookie values for the purposes of tracking.
So, it shouldn't be difficult for you to apply your own version of the _link function.
Specifically, the _link function passes the following cookies:
__utma, __utmb, __utmc, __utmx, __utmz, __utmv and __utmk
Into a query string as such: ?__utma=87278922.614105561.1288923931.1294376393.1298325957.6&__utmb=87278922.1.10.1298325957&__utmc=87278922&__utmx=-&__utmz=87278922.1288923931.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none)&__utmv=-&__utmk=72493274
So, all you need to do to replicate the _link function is, before you apply the server side redirect, grab the cookie values, and apply them as a query string on the URL you're redirecting to.
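The question is Rails, but as an illustration, here is roughly what that server-side redirect looks like in an Express-style sketch (cookie-parser middleware assumed; the checkout URL and route are placeholders based on the question):

const GA_COOKIES = ['__utma', '__utmb', '__utmc', '__utmx', '__utmz', '__utmv', '__utmk'];

app.use(require('cookie-parser')());   // populates req.cookies

app.post('/checkout', function (req, res) {
    // Rebuild what _link would have done client-side: copy the GA
    // cookies into the query string of the cross-domain redirect.
    const pairs = GA_COOKIES
        .filter(function (name) { return req.cookies[name]; })
        .map(function (name) {
            return name + '=' + encodeURIComponent(req.cookies[name]);
        });
    res.redirect('http://payment-checkout.com/start?' + pairs.join('&'));
});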
Now, that's not the only thing you'll need to do to get this working. The Google Analytics configuration on the payment site will need to be configured with _setAllowLinker set to true, as well as potentially disabling the domain hash and setting a particular domain name for the tracking cookies; it depends on your configuration. You can find out more about that in Google Analytics Cross Domain Tracking Guide.
#yc's approach looks like the best bet, but if that doesn't work, I would suggest having your controller redirect the user to a "temp" page on your site itself, showing some text like "Checking out... Please wait...", and using JavaScript to trigger the call to the _link function to redirect the user to "payment-checkout.com" (again using JavaScript).
I assume you're also tracking the page the user returns to and want to measure how many users you lose in the process in between?
My knowledge of the Google Analytics API is fairly limited, so maybe there's a better solution, but you could consider rendering a page containing the GA code and triggering the _link() function from there?
It might also be possible to perform an AJAX call on submitting the form (maybe using remote_form_for) and handle the GA redirect in an RJS response:
page << "_gaq.push(['_link', 'http://example.com/test.html']);"
However, I'm not sure how well that would fit into your application.

Persist javascript variables between GET requests?

ASP.NET is allowed
Storing the values in hidden input fields is allowed
Query string is not allowed
POST request is not allowed
Is it possible to store JS variables between GET requests?
I want to reinitialize them on the client using ClientScript.RegisterStartupScript.
Can I use cookies for this?
Are there other possibilities?
Where are cookies stored when a request is made?
Can I use cookies for this?
Yes, see this tutorial on using cookies in JavaScript.
Are there other possibilities?
If you are not allowed to append anything to the URL of your requests, I can't come up with any.
Where are cookies stored when a request is made?
In the HTTP request header. The aforementioned tutorial will tell you how to read their values from JavaScript. On the server side with ASP.NET, you can read cookie values using Request.Cookies["cookieName"], which returns an instance of HttpCookie.
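A minimal sketch of the round trip with plain document.cookie, using JSON so whole objects survive (the jsState cookie name is arbitrary):

// Save before the next GET request.
function saveState(obj) {
    document.cookie = 'jsState=' +
        encodeURIComponent(JSON.stringify(obj)) + '; path=/';
}

// Restore on the next page load.
function loadState() {
    var match = document.cookie.match(/(?:^|;\s*)jsState=([^;]*)/);
    return match ? JSON.parse(decodeURIComponent(match[1])) : null;
}

saveState({ id: 22, filter: 'active' });
// ...after the next GET request...
var state = loadState();   // { id: 22, filter: 'active' }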
I wouldn't highly recommend this, but the other option is to alter the window.name property.
You can save some minor bits of data here, then retrieve them on the next page load.
Pros:
Quick-n-dirty, but works
Cons:
Messes up any window references for popups/child iframes
Since it's a "hack", browser vendors may break this "feature" in the future
Of course, if you can exclude all the old browsers, then use Global/Client Session Storage!
At the moment, using cookies is your best bet. You can serialize the JavaScript objects to strings and unserialize them back into objects later. A good choice of format is JSON, since it is a subset of JavaScript.
There is also storing objects in Flash.
Storing in Google Gears.
DomStorage
See this library that has an interface to each:
http://pablotron.org/?cid=1557
If you are in control of all aspects of the page, then you can also wrap the page in a top-level frame and only refresh the child frame; you can then store content in the parent frame.
You can see this used on sites like GMail, where the only thing that changes in the URL is the part after the #.
You don't even have to change the URL; that part is just put in for human-friendly URLs (so you can actually copy and paste URLs as is).
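A sketch of the frame idea (same-origin frames assumed; appState is a made-up property name):

// In the child frame: stash values on the parent, which never reloads.
parent.appState = parent.appState || {};
parent.appState.id = 22;

// Later, after the child frame has navigated to another page:
var id = parent.appState && parent.appState.id;   // still 22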
