I want to disable the URL so that no one can edit the data by changing the URL.
I am passing the data in the URL through the GET method in PHP.
How can I disable the URL?
You can't "disable the URL"; you've stumbled upon one of the most fundamental issues of web programming: don't trust the client!
That is why web programmers have to check data to ensure it meets certain criteria before moving on...
<?php
// Validate server-side: reject the request unless 'example' is present
// and matches one of the allowed values
if (!isset($_GET['example']) || ($_GET['example'] != 'foo' && $_GET['example'] != 'bar'))
{
    // error: missing or not in the whitelist
}
else
{
    // proceed
}
?>
I am looking for a solution to the following:
I have a piece of JS code that performs a redirect to a URL constructed with PHP; the redirect only happens when the user presses a button on a confirmation dialog.
The code is as follows:
function one() {
    window.location.replace("<?php
        if ($new_redir == "1") {
            echo "$new_second_redirect_URL/?token=$hash";
        } else {
            echo "$second_redirect_URL/?token=$hash";
        }
    ?>");
}
It works perfectly fine. What I want to do is conceal the URL that is displayed in the source code when a user opens the page.
What would be the best way to do that?
You're reading too much into this, to be honest.
If they want to avoid the confirmation screen and pull the URL from the source, there's not really much you can do.
The best option is probably to perform an AJAX request on confirmation and get a CSRF-token-based URL from the response, but that could end up being overkill as well.
You could also make it into an actual <form></form> with a few hidden fields (again, such as a CSRF token), and perform the POST validation on click. If it succeeds, redirect them.
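A minimal sketch of that hidden-field approach, assuming session-based tokens (the file and field names here are illustrative, not from your code):

<?php
session_start();
// Generate a one-time CSRF token and remember it in the session
$_SESSION['csrf_token'] = bin2hex(random_bytes(32));
?>
<form method="post" action="confirm.php">
    <input type="hidden" name="csrf_token" value="<?php echo $_SESSION['csrf_token']; ?>">
    <button type="submit">Confirm</button>
</form>

Then confirm.php validates the token before redirecting to the real URL:

<?php
session_start();
if (isset($_POST['csrf_token'], $_SESSION['csrf_token'])
    && hash_equals($_SESSION['csrf_token'], $_POST['csrf_token'])) {
    unset($_SESSION['csrf_token']); // one-time use
    header("Location: $second_redirect_URL/?token=$hash");
    exit;
}
http_response_code(403); // token missing or wrong: refuse to redirect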
UPDATE:
Use robots.txt to stop bots
Build the query string with JS to stop most bots, something like:
var csrftoken='XJIWHEOU324uipHFOFUHR';
var url="http://url.com/page.php?token=";
url=url+csrftoken;
What you could also do is something like we do ourselves, although for your use case it could be too much.
Log every single page load into the DB, and check whether they're a first-time visitor to the page after confirmation.
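A rough sketch of that logging idea, assuming a PDO connection in $pdo and a hypothetical page_loads table (MySQL syntax):

<?php
// Record this page load for the current user
$stmt = $pdo->prepare('INSERT INTO page_loads (user_id, page, loaded_at) VALUES (?, ?, NOW())');
$stmt->execute([$userId, $_SERVER['REQUEST_URI']]);

// Is this the user's first visit to this page after confirmation?
$stmt = $pdo->prepare('SELECT COUNT(*) FROM page_loads WHERE user_id = ? AND page = ?');
$stmt->execute([$userId, $_SERVER['REQUEST_URI']]);
$isFirstVisit = ((int) $stmt->fetchColumn() === 1);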
AJAX call (jQuery example):
$.post("url_to_backend_page_to_get_url", {hasSubmittedForm: "true"}, function(data) {
    window.location.href = data;
});
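On the server, url_to_backend_page_to_get_url could be as simple as this sketch (the session check and the URL variables are assumptions based on your snippet):

<?php
session_start();
// Only hand out the redirect target to a confirmed, logged-in session
if (isset($_POST['hasSubmittedForm']) && $_POST['hasSubmittedForm'] === 'true'
    && isset($_SESSION['userId'])) {
    echo $second_redirect_URL . '/?token=' . $hash;
    exit;
}
http_response_code(403); // not confirmed: return nothing useful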
All:
I have an issue with a project I am working on using C# MVC4.
In the project, I accept a URL and other parameters from a user, do some processing, and send the result of the processing to the URL provided by the user.
The result is being sent using the following code:
var context = HttpContext.Current;
context.Response.Write("<html><head>");
context.Response.Write("</head><body>");
context.Response.Write(string.Format("<form name=\"myform\" method=\"post\" action=\"{0}\" >", postUrl));
context.Response.Write("</form>");
context.Response.Write("<script type=\"text/javascript\">document.myform.submit();</script></body></html>");
context.Response.Flush();
context.Response.Clear();
context.ApplicationInstance.CompleteRequest();
Whenever a user attempts an XSS attack, like passing a url value of javascript%3aalert('xss')%2f%2f, the JavaScript runs and the pop-up shows up.
I've tried Antixss.HtmlEncode() to encode the URL before passing it into string.Format, but it still doesn't work. I've tried Antixss.UrlEncode() as well, but this gives an error, as the form doesn't submit to the URL.
Please help me out. Is there something I am missing? What else can I do?
Thanks in advance.
You will need a three-pronged approach to solve this issue.
Preventing XSS injection:
Note that if a user injected the url value
" /> <script>alert('xss')</script>
this would also leave you vulnerable:
<form name="myform" method="post" action="" /> <script>alert('xss')</script>" >
Therefore you should use the HttpUtility.HtmlAttributeEncode function to solve this one.
However, don't stop there. As noted, you should also protect against javascript: style URLs. For this I would ensure that the URL begins with http:// or https://. If not, throw a SecurityException, which you should be logging and handling server-side while showing the user a custom error page.
Finally, you want to protect against Open Redirect Vulnerabilities. This stops phishing attacks that redirect users to other domains. Again, use a whitelist approach and ensure that the domain redirected to is one of your own. Be careful with the parsing here, as it is easy to get wrong: a URL of http://example.org?http://example.com will pass the validation filter for example.com in many badly written validation routines. I recommend using the Uri object in .NET and retrieving the domain through that rather than rolling your own string functions.
You could also check whether the URL is a relative URL, and allow it if acceptable. Use something like this function, which uses a built-in .NET library to determine whether or not it is relative.
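To illustrate the parse-then-whitelist idea (sketched here in PHP for brevity; in .NET you would use the Uri class as described, and the host list is an assumption):

<?php
$allowedHosts = array('example.com', 'www.example.com'); // assumed whitelist

$parts = parse_url($url);
if ($parts === false
    || !isset($parts['scheme'], $parts['host'])
    || !in_array(strtolower($parts['scheme']), array('http', 'https'), true)
    || !in_array(strtolower($parts['host']), $allowedHosts, true)) {
    // Unsafe target: log it and show a custom error page
    throw new Exception('Unsafe redirect target');
}
// http://example.org?http://example.com fails this check, because the
// parsed host is example.org regardless of what the query string contains.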
Just a thought - try putting this script in rather than just document.myform.submit (and remove the form's action attribute):
if("{0}".indexOf('http') !== 0) {
//this is some sort of injection, or it's pointing at something on your server. Should cover http and https.
//Specifically, it makes sure that the url starts with 'http' - so a 'javascript' url isn't going to work.
} else {
document.myform.action="{0}"
document.myform.submit();
}
There is more you could do, but this should help.
Since you are adding the postUrl as the form tag's action attribute, you can try using the HtmlAttributeEncode method in HttpUtility:
[ValidateInput(false)]
public ActionResult Test(string url)
{
    var context = System.Web.HttpContext.Current;
    context.Response.Write("<html><head>");
    context.Response.Write("</head><body>");
    // HtmlAttributeEncode prevents the value from breaking out of the action attribute
    context.Response.Write(string.Format("<form name=\"myform\" method=\"post\" action=\"{0}\" >", HttpUtility.HtmlAttributeEncode(url)));
    context.Response.Write("</form>");
    context.Response.Write("<script type=\"text/javascript\">document.myform.submit();</script></body></html>");
    context.Response.Flush();
    context.Response.Clear();
    context.ApplicationInstance.CompleteRequest();
    return null;
}
http://localhost:39200/home/test?url=https%3A%2F%2Fwww.google.com - Worked
http://localhost:39200/home/test?url=%3Cscript%3Ealert(%27test%27)%3C%2Fscript%3E - Worked (did not show the alert)
It is always good practice to validate user input against a whitelist of inputs, to prevent XSS exploits.
Try using HttpUtility.UrlEncode,
something like Response.Write(HttpUtility.UrlEncode(urlString));
see How To: Prevent Cross-Site Scripting in ASP.NET for more steps =)
If the user navigates off the webpage, is it possible to execute a PHP script?
I know that JavaScript can be executed...
$(window).bind('beforeunload', function(){
    return 'DataTest';
});
Cookies might work, but I am not sure how a listener could track an expired cookie, and then delete the correct webpage.
A sample file system is like this:
user0814HIFA9032RHBFAP3RU.php
user9IB83BFI19Y298RYBFWOF.php
index.php
listener.py
data.txt
Typically, to create the website, PHP writes to data.txt and the Python listener picks up this change and creates the user[numbers] file. As you might expect, these files stack up over time and need to be deleted.
The HTTP protocol is stateless, so users simply cannot "navigate away" as far as the server is concerned.
The browser requests a page, the server returns it, and the communication stops.
The server has no reliable way to know what the client will do with that page.
Disclaimer: I'm not sure, as Fox pointed out, that this is the right way to go in your case. I actually upvoted Fox's answer.
However, if you absolutely need to delete each page right after the user leaves it, use this:
$(window).bind('beforeunload', function() {
    $.ajax('yourscript.php?currentUser=0814HIFA9032RHBFAP3RU');
});
Then in yourscript.php, put something like the following:
<?php
// load your userId (for example, with $_SESSION, but do what you want here)
$actualUser = $_SESSION['userId'];
// check that the requested id to delete matches your actual current user's id
if (isset($_GET['currentUser']) && $_GET['currentUser'] == $actualUser)
{
    $user = $_GET['currentUser'];
    $file = 'user'.$user.'.php';
    unlink($file);
}
I'm not sure if the way to do this is to check Google Analytics cookies or to otherwise track where a user came to my site from. Basically I have a form with a hidden field code="XY1". Now I need to be able to insert a different preset code for people who came from, say, Facebook, so the script would have to check where the visitor came from and then assign a code XF1 to anyone from FB, a code XT1 to anyone from Twitter, etc.
Would something like this PHP work for the capture?
$referringPage = parse_url( $_SERVER['HTTP_REFERER'] );
if ( stristr( $referringPage['host'], 'facebook.com' ) ) {
    // set the Facebook code here
}
Or this JS
var ref = document.referrer;
if (ref.indexOf("facebook.com") != -1) {
document.write(...)
}
I'm not sure what the best way to do this is, or what methods can reliably check the source of a visitor, so any help would be greatly appreciated.
You can use $_SERVER['HTTP_REFERER'], but it's not guaranteed to be accurate, or even present. Not all browsers will necessarily set it, and some allow you to set it yourself. Google cookies won't contain any site history, and you can't examine the browser history, so there's no guaranteed way to do what you're asking.
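With that caveat in mind, here's a minimal sketch of the server-side capture (the hidden-field name and the host-to-code map are assumptions):

<?php
$code = 'XY1'; // default code
if (isset($_SERVER['HTTP_REFERER'])) {
    $host = parse_url($_SERVER['HTTP_REFERER'], PHP_URL_HOST);
    if (!empty($host)) {
        if (stripos($host, 'facebook.com') !== false) {
            $code = 'XF1';
        } elseif (stripos($host, 'twitter.com') !== false) {
            $code = 'XT1';
        }
    }
}
?>
<input type="hidden" name="code" value="<?php echo htmlspecialchars($code); ?>">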
You can try this option using a regular expression's test() method.
$(function(){
    var referer = document.referrer,                          // option 1
        //referer = "<?php echo $_SERVER['HTTP_REFERER'];?>", // option 2
        XFB = /facebook\.com/,  // dots escaped; the g flag is not needed for test()
        XFT = /twitter\.com/,
        code = '';
    if (XFB.test(referer)) {
        code = 'XF1';
    }
    else if (XFT.test(referer)) {
        code = 'XT1';
    }
    if (code) {
        $('#hiddenInput').val(code); // store the preset code, not the raw referrer
    }
});
I am wondering how to capture all links on a page using jQuery. The idea being similar to Facebook. In Facebook, if you click on a link it captures the link and loads the same link using ajax. Only when you open a link in new tab etc. will it load the page using regular call.
Any clue on how to achieve this kind of functionality? I'm sure capturing links should not be a problem, but what about capturing form submissions, submitting the entire data via AJAX, and then displaying the results?
Is there any plugin which already exists?
Thank you for your time.
Alec,
You can definitely do this.
I have a form that is handled in just this way. It uses the jQuery form plugin kgiannakakis mentioned above. The example JavaScript below shows how it might work.
$("form").ajaxForm({
beforeSubmit: function(){
//optional: startup a throbber to indicate form is being processed
var _valid = true;
var _msg = '';
//optional: validation code goes here. Example below checks all input
//elements with rel attribute set to required to make sure they are not empty
$(":input [rel='required']").each(function(i){
if (this.value == '') {
_valid = false;
_msg += this.name + " may not be empty.\n";
$(this).addClass("error");
}
});
alert(_msg);
return _valid;
},
success: function(response){
//success here means that the HTTP response code indicated success
//process response: example assumes JSON response
$("body").prepend('<div id="message" class="' + response.status + '"></div>');
$("#message").text(response.message).fadeIn("slow", function(){
$(this).fadeOut("slow").remove();
});
}
});
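The success handler above assumes a JSON response; a matching server-side endpoint might look like this sketch in PHP (the status/message shape is taken from the handler, and the 'name' field is an assumption):

<?php
header('Content-Type: application/json');

// Validate the posted form server-side as well; never rely on the
// client-side check alone ('name' is an assumed required field)
if (!empty($_POST['name'])) {
    echo json_encode(array('status' => 'success', 'message' => 'Form saved.'));
} else {
    echo json_encode(array('status' => 'error', 'message' => 'name may not be empty.'));
}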
The Form plug-in can transform a regular form into an Ajax one:
$("#myForm").ajaxForm({ beforeSubmit: validate, success: showResponse });
It would be difficult to do what you want for an arbitrary form, however. What if the form uses validation, or is submitted by Ajax to begin with? The same applies to links. What if there are JavaScript navigation scripts (window.location = url)? If you don't have full control of the page, it will be difficult to do what you want.
Pages like Facebook usually have each event and each form coded separately, as the server-side files are usually set up for each single operation or group of operations. I doubt there is a clean way to convert a page with just a plug-in, and if there is, I see a lot of overhead.
You can do it by hand, but again, that's an abuse of Ajax. This isn't Flash, and by using Ajax for all server communication you run into a lot of problems:
Lack of history tracking.
Watching out for concurrent events and the results thereof.
Communicating to the user that the page is changing.
Users with javascript turned off.
And much more...