I was reading this article
http://msdn.microsoft.com/en-us/magazine/hh708755.aspx
related to securing ASP.NET applications, but there is one thing I am not able to understand: if I am browsing the URL http://www.abc.com/XSS.aspx?test=ok and I replace it with http://www.abc.com/XSS.aspx?test=alert('hacked')... how is the site unsafe or hacked? The point I am trying to make is that this does not seem to impact or affect the site at all.
The example I mentioned above appears in many places wherever security is discussed, but I still don't understand it.
Just imagine this: if you are going to output the value of "test" (without escaping it properly for HTML usage) on your HTML page, then someone could inject arbitrary JavaScript into your page! Possible exploits range from changing the background to something obscene, to redirecting your page to a scam website, in effect making you an accessory to fraud of some kind!
ALWAYS USE PROPER ESCAPING WHEN STORING OR USING USER-SUBMITTED INFORMATION!!
Edit: The escaping I am talking about is also useful so that people don't inject HTML or JS into your database. A stored injection would eventually serve the injected HTML/JS to every user (if the injected variable is the same for everyone), not just to the user who injected it!
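As a rough client-side sketch of what "escaping" means (in ASP.NET itself you would use the built-in encoder, e.g. Server.HtmlEncode, rather than rolling your own; the element and test names below are just stand-ins):

    // Minimal escaping helper: turn the characters that are special in HTML into
    // entities, so a value like <script>alert('hacked')</script> renders as text
    // instead of executing.
    function htmlEscape(str) {
      return String(str)
        .replace(/&/g, "&amp;")
        .replace(/</g, "&lt;")
        .replace(/>/g, "&gt;")
        .replace(/"/g, "&quot;")
        .replace(/'/g, "&#39;");
    }

    // Unsafe: element.innerHTML = test;              // whatever came in via ?test=...
    // Safer:  element.innerHTML = htmlEscape(test);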
I have a template structure in which there is a single HTML file, inside which related HTML and JS files are loaded (using AJAX).
Sections are loaded according to the user's activity (the page never reloads, which is good for the user experience).
i.e.
The user clicks a menu item, say "Profile", which causes:
the jQuery.load method to be used to load a file "/some/path/profile.html";
jQuery.getScript to be used in the .load() callback to include JS files like "some/path/profile.js". The profile JS has the event handlers for the profile page along with the related business logic.
This happens for each menu item/section of the application, like "Profile", "Files", "Dashboard", etc.
It works fast, but I am not sure if this is the optimal way to carry this out.
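Roughly, each menu handler looks something like this (the #menu-profile and #content selectors are just stand-ins for my actual markup):

    // Stand-in menu handler: load the section's HTML into a placeholder,
    // then pull in the matching script from the .load() callback.
    $("#menu-profile").on("click", function () {
      $("#content").load("/some/path/profile.html", function () {
        $.getScript("/some/path/profile.js", function () {
          // profile.js has now run; its event handlers are attached.
        });
      });
    });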
If a user clicks the "Profile" button twice in a row, would the browser clear up the earlier loaded resources (profile.html, profile.js) before loading them afresh?
When the user visits a new section, say "Dashboard", after visiting "Profile", would the browser again clear out the resources of "Profile" before loading those for "Dashboard"?
If not, could this cause memory-related issues in the browser? I searched for this but did not find any related scenarios.
P.S.: In this structure, some HTML is often stored in a JS variable to be used later. I read somewhere on SO that this is bad practice, but I was not able to find details about it. I assume it should not be a negative point if the developer is careful, and that storing HTML in a JS variable should not be a problem.
Here's my understanding of this:
You have to make sure on your end that you don't send a request when the same button is clicked again.
(I forgot we are dealing with scripts/HTML, so there is no caching in the picture.)
Clearing out resources? Yes, the markup will be removed from the DOM if it is appended into the same section, and I guess that's necessary anyway if the same placeholder is used for each section's content.
If you know that each section will return the same template every time, you can build a local cache on the client side, just like memoization, to check whether the template already exists (see the sketch below).
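For example, a rough memoization sketch (the loadSection name and the #content placeholder are made up):

    // Rough client-side template cache: fetch each section's HTML once,
    // reuse the stored copy on later clicks.
    var templateCache = {};

    function loadSection(url, done) {
      if (templateCache[url]) {
        $("#content").html(templateCache[url]); // reuse cached markup
        done();
      } else {
        $.get(url, function (html) {
          templateCache[url] = html;            // remember it for next time
          $("#content").html(html);
          done();
        });
      }
    }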
Hope this helps.
I am currently working on an HTML presentation that works well, but I need the presentation to be followed simultaneously by a NAO robot that reads a special HTML tag. I somehow need to let it know which slide I am on, so that it can choose the correct tag.
I use Beautiful Soup for scraping the HTML, but it does so from a file and not from a browser. The problem is that there is JavaScript running behind the scenes, assigning various classes to specific slides, which indicate the current state of the presentation. I need to be able to access those, but in the default state of the presentation they are not present; they are added asynchronously as the presentation progresses.
Hopefully, my request is clear.
Thank you for your time
http://www.seleniumhq.org/ (probably WebDriver) is your friend. Initialize a browser and call something like browser.html (or page_source in the Python bindings) to get the document in its current state.
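A sketch only (shown here with the JavaScript selenium-webdriver bindings purely as an illustration; the Python bindings expose the same thing as driver.page_source, and the URL below is made up):

    // Drive a real browser so the classes added by the presentation's
    // JavaScript are present, then grab the rendered DOM.
    const { Builder } = require("selenium-webdriver");

    (async function () {
      const driver = await new Builder().forBrowser("chrome").build();
      try {
        await driver.get("http://localhost/presentation/index.html"); // made-up URL
        const html = await driver.getPageSource(); // the current DOM, not the original file
        // ...hand `html` to whatever parses the slide tags
      } finally {
        await driver.quit();
      }
    })();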
There's wget on the robot; you could use it... (though I'm not sure I understand where the problem really is...)
Is it possible to use jQuery/JavaScript to see if a webpage's source code has been altered by a visitor and, if so, redirect them?
And by altered, I mean if they open Firebug or something and edit anything on the page once it's finished loading.
This seems like a hack to prevent people from messing with your forms.
This is most definitely not the right way to make your site more secure; security must always come from the server-side or, if everything is done via the front-end, in a way that can only hurt the user who is currently signed in.
Even if you did succeed in implementing this using JavaScript, the first thing I would do is disable exactly that :) or just disable JavaScript, use wget, inspect the code first, then write a curl work-around, etc.
Even if there is a way to do that, the visitor can still edit this verification, so this is pointless.
Yes. On load, store the innerHTML of the html element in a string.
Then set an interval to check every second whether the current HTML matches the stored variable.
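Something like this (purely illustrative; anyone can defeat it from the console, and any legitimate script that changes the DOM will also trip it; the redirect URL is made up):

    // Take a snapshot of the markup once the page has finished loading,
    // then poll every second and compare against it.
    var originalHtml;

    window.addEventListener("load", function () {
      originalHtml = document.documentElement.innerHTML;

      setInterval(function () {
        if (document.documentElement.innerHTML !== originalHtml) {
          // DOM no longer matches the snapshot: redirect (or log, warn, etc.).
          window.location.href = "/tampering-detected";
        }
      }, 1000);
    });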
I have a website with a lot of language subfolders (/de/, /fr/, /es/, etc.) and I am using a combination of redirected/cached translation and uncached on-the-fly translation. I need the Cart areas of my site to be handled by the on-the-fly translation rather than redirected, which is what happens at the moment. However, I'd like to make the foreign-language experience seamless so the customer doesn't have to click on the flag again once the translation type has changed. They'll be coming into the cart through /de/cart. As an example, if I were a German customer I would be happily in the /de/ German subfolder with everything in my language, but when I go to the cart I get knocked back to English and have to click the flag again because the redirect has changed. My problem really is that I don't know how to trigger the translation without having a URL to direct it to. I thought something like this might work:
redirect 301 /de/cart http://www.my-site.com/cart?lang=de
But this obviously does nothing, because I don't know how to tell Google Translate that I want to trigger this language. Thanks for taking the time to read this; any help would be greatly appreciated. It might in fact be impossible to achieve this the way I am trying, so a simple "not possible" would also help :-)
In the JavaScript you have to parse window.location to find lang=de and then take action based on the result.
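For example (a rough sketch; how you then trigger the translation depends on how Google Translate is embedded on your page):

    // Pull ?lang=de (or whichever code) out of the query string.
    var match = window.location.search.match(/[?&]lang=([^&]+)/);
    if (match) {
      var lang = decodeURIComponent(match[1]); // e.g. "de"
      // ...then trigger the translation for `lang` however your Google Translate
      // integration expects (widget call, cookie, etc.) -- that part is up to you.
    }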
However, it seems strange to me that you are using Google Translate on your page -- if you want localized versions, write the code to do so. Google Translate is an external tool and not a very good one compared to a true translation.
I'll be inserting content from remote sources into a web app. The sources should be limited/trusted, but there are still a couple of problems:
The remote sources could:
1) be hacked and inject bad things
2) overwrite objects in my global namespace
3) I might eventually open it up for users to enter their own remote source. (It would be up to the user not to get into trouble, but I could still reduce the risk.)
So I want to neutralize any/all injected content just to be safe.
Here's my plan so far:
1) find and neutralize all inline event handlers
str.replace(/(<[^>]+\bon\w+\s*=\s*["']?)/gi,"$1return;"); // untested
Ex.
<a onclick="doSomethingBad()" ...
would become
<a onclick="return;doSomethingBad()" ...
2) remove all occurrences of these tags:
script, embed, object, form, iframe, or applet
3) find all occurrences of the word "script" within a tag
and replace it with the HTML entities for it
str.replace(/(<[^>]+)(script)/gi, toHTMLEntitiesFunc);
which would take care of cases like
<a href="javascript: ..."
4) lastly, any src or href attribute that doesn't start with http should have the domain name of the remote source prepended to it
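Something like this (untested; remoteBase is a made-up stand-in for wherever the content came from) is roughly what I have in mind for 4):

    // Make relative src/href attributes absolute by prepending the remote source's base URL.
    var remoteBase = "http://remote.example.com";

    str = str.replace(/\b(src|href)\s*=\s*(["'])(?!https?:)([^"']*)\2/gi,
      function (whole, attr, quote, path) {
        var sep = path.charAt(0) === "/" ? "" : "/";
        return attr + "=" + quote + remoteBase + sep + path + quote;
      });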
My question: Am I missing anything else? Other things that I should definitely do or not do?
Edit: I have a feeling that responses are going to fall into a couple of camps.
1) The "Don't do it!" response
Okay, if someone wants to be 100% safe, they need to disconnect the computer.
It's a balance between usability and safety.
There's nothing to stop a user from just going to a site directly and being exposed. If I open it up, users will be entering content at their own risk. They could just as easily enter a given URL into their address bar as into my form. So unless there's a particular risk to my server, I'm okay with those risks.
2) The "I'm aware of common exploits and you need to account for this ..." response ... or You can prevent another kind of attack by doing this ... or What about this attack ...?
I'm looking for the second type, unless someone can provide specific reasons why my approach would be more dangerous than what the user could do on their own.
Instead of sanitizing (blacklisting), I'd suggest you set up a whitelist and ONLY allow those very specific things.
The reason is that you will never, never, never catch all the variations of malicious script. There are just too many of them.
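A very rough sketch of what I mean, done in the browser (the allowed tag and attribute lists here are only examples, and this is nowhere near production-grade):

    // Walk a parsed copy of the untrusted markup and keep only whitelisted
    // elements and attributes; everything else is dropped.
    var ALLOWED_TAGS = { P: 1, A: 1, B: 1, I: 1, EM: 1, STRONG: 1, UL: 1, OL: 1, LI: 1, BR: 1 };
    var ALLOWED_ATTRS = { href: 1, title: 1 };

    function sanitize(html) {
      // Parse into a detached document so nothing executes while we inspect it.
      var doc = new DOMParser().parseFromString(html, "text/html");

      function clean(node) {
        // Copy the child list first, because we remove nodes while iterating.
        var children = Array.prototype.slice.call(node.children);
        children.forEach(function (child) {
          if (!ALLOWED_TAGS[child.tagName]) {
            child.parentNode.removeChild(child); // drop the element and its subtree
            return;
          }
          Array.prototype.slice.call(child.attributes).forEach(function (attr) {
            // Drop attributes that are not whitelisted or that smuggle in javascript: URLs.
            if (!ALLOWED_ATTRS[attr.name] || /^\s*javascript:/i.test(attr.value)) {
              child.removeAttribute(attr.name);
            }
          });
          clean(child);
        });
      }

      clean(doc.body);
      return doc.body.innerHTML;
    }

The point is that anything not explicitly listed simply never makes it into the page.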
don't forget to also include <frame> and <frameset> along with <iframe>
For the sanitization thing, are you looking for this?
If not, perhaps you could learn a few tips from this code snippet.
But it goes without saying that prevention is better than cure: you had better allow only trusted sources than allow everything and then sanitize.
On a related note, you may want to take a look at this article and its Slashdot discussion.
It sounds like you want to do the following:
Insert snippets of static HTML into your web page
These snippets are requested via AJAX from a remote site.
You want to sanitise the HTML before injecting it into the site, as otherwise this could lead to security problems like XSS.
If this is the case, then there are no easy ways to strip out 'bad' content in JavaScript. A whitelist solution is best, but it can get very complex. I would suggest proxying requests for the remote content through your own server and sanitizing the HTML server-side. There are various libraries that can do this; I would recommend either AntiSamy or HTMLPurifier.
For a completely browser-based way of doing this, you can use IE8's toStaticHTML method. However no other browser currently implements this.
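For what it's worth, the call itself is just a one-liner (IE8+ only; the exact output may vary):

    // toStaticHTML strips script blocks, event-handler attributes and other
    // dynamic content from a string of markup.
    var safe = window.toStaticHTML('<div onmouseover="alert(1)">hello</div>');
    // `safe` should come back as something like '<div>hello</div>'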