Conventional HTML lets you submit data via forms to a website, and the destination (target) is usually visible in the source code. With JavaScript it's more difficult: you have to go through the JS files and work out what happens and where all the data is being sent.
Is there an easier way to pin down where my data ends up (i.e. what file/address it is being sent to)?
Something like: I click the button and it shows me what type of action was used and the destination URL?
I'm on Ubuntu.
Are you trying to prevent phishing attacks, or merely to capture redirects/AJAX calls on button clicks? What's the purpose of this experiment?
Because Firebug (a Firefox add-on) and the Chrome Inspector (built into Chrome, not quite as robust as Firebug) will do the things you're asking for, and give you lots more information to boot.
Additionally, you might consider setting up a local proxy on your machine and capturing all http traffic. It really depends on what you're trying to accomplish.
$("form").live("submit", function(){
alert("The page will be going to: "+this.action);
});
Of course, you will also need to watch whether the window actually unloads, because the submit could be handled and cancelled by the JavaScript...
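If you'd rather not rely on jQuery, here is a plain-DOM sketch of the same idea that you can paste into the developer console (just an illustration, not a guaranteed catch-all):

// log the method and destination of every form submission on the page
document.addEventListener("submit", function (e) {
    console.log("method:", e.target.method, "->", e.target.action);
}, true); // use the capture phase so this runs before page handlers can stop the event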
My problem is a bit complex and hard to explain in words, so I broke it down into steps.
1. Select a single date from these boxes and hit submit.
2. I land on a page with a table. Copy the <tbody> element from the developer console.
3. Paste it into a text file. Save the text file with the date that was selected.
4. Repeat steps 1-3 as many times as needed, selecting a new date each time (01-15-2018, 01-14-2018, 01-13-2018, and so on...).
Is it even possible to build a bot that does this? If yes, what tools would I use?
I know a fair amount of JavaScript and Python, so I'd prefer to use those 2 if possible.
We'd need to know the URL you're looking at, or to look at the page source. If the date is supplied as any part of a request, and the response contains the data you're looking for, it should be simple to farm and analyze that data from a Python script.
Walk through your clicks with the network tab of your browser's developer tools and you should see a request go out when you hit submit. Expedia just uses query parameters, and so the entire URL that you'll need pops up in the URL bar of your browser after hitting submit...
Tools:
If request-based:
Python
Requests module
If something cached/more complicated, there are tools for automating clicks and saving the results...I would guess that this won't be necessary though...
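To make the request-based route concrete: the list above names Python's Requests module, but since you also know JavaScript, here is a rough sketch of the same idea in Node.js (18+, where fetch is built in). The URL and the date parameter name are placeholders; copy the real ones from the network tab.

// save_table.mjs -- run with: node save_table.mjs
import { writeFile } from "node:fs/promises";

const date = "01-15-2018";
const url = `https://example.com/schedule?date=${date}`; // hypothetical endpoint

const res = await fetch(url);
const html = await res.text();
// crude <tbody> extraction; a proper parser (e.g. cheerio) would be sturdier
const tbody = html.match(/<tbody[\s\S]*?<\/tbody>/i)?.[0] ?? html;
await writeFile(`${date}.txt`, tbody);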
Update:
AJAX calls are HTTP requests and responses, so you should be able to observe them in the network tab of your web browser's developer tools, and then mimic that request from a script rather than from your browser.
The readability of the requests/responses, and any measures the organization has implemented to stop applications other than a browser from getting the same response, are potential impediments, but even those should be imitable. If your browser can make the request, there is no reason your Python script can't make the same one.
The method you seem to be interested in, although it sounds more complicated to me, is possible with automation tools like Selenium, as the other poster answered. Best of luck.
It is possible:
Take a look at the Selenium library for Python (it's commonly used for automated testing). It should be able to select single dates and hit the submit button, then go through the HTML and grab the data in the <tbody> tag. After that you can use plain Python to store this data in a text file with a name of your choice in a location of your choice.
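The answer above points at Selenium's Python bindings; Selenium also ships JavaScript bindings (the selenium-webdriver package), so the same flow might look roughly like the sketch below. The page URL and all selectors are made-up assumptions you'd replace with the real ones.

// npm i selenium-webdriver; run with Node 18+ as an .mjs file
import { writeFile } from "node:fs/promises";
import { Builder, By, until } from "selenium-webdriver";

const date = "01-15-2018";
const driver = await new Builder().forBrowser("chrome").build();
try {
    await driver.get("https://example.com/search");           // hypothetical page
    await driver.findElement(By.css("#date")).sendKeys(date); // hypothetical date box
    await driver.findElement(By.css("#submit")).click();      // hypothetical submit button
    const tbody = await driver.wait(until.elementLocated(By.css("tbody")), 10000);
    await writeFile(`${date}.txt`, await tbody.getAttribute("outerHTML"));
} finally {
    await driver.quit();
}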
Here's the situation: I'm redesigning our company's public-facing website using ASP.NET, VB.NET and some JavaScript/jQuery. Some of the features I'm adding require page reloads (which register as popups) and cookies. It works great if everything is enabled, but I've noticed that on some browsers (such as Firefox) I still get prompted to OK these actions, retain these cookies, etc.

I can code contingencies into the simpler pages for users who will not or cannot enable these features, but I'd like to find a way to make it as simple as possible to enable the full feature set. From what I've read, there's no way to actually force it the way you can force a browser's document mode with settings in web.config, but I'm hoping there is some way to give users a button to click (or something similar) that enables what I need.

Is there a way to do this programmatically? What I'm looking for is some code that will make the changes, instead of directing users into, e.g., Internet Explorer's security settings, which most end users find tedious if not incomprehensible.
Advice?
You can avoid using cookies. Use Session or a database backend for things you would normally use cookies for. For popups, use overlaid divs such as Ajax Control Toolkit Modal Popup Extender or jQuery UI Dialog instead of starting a new browser window.
But really, ASP.NET is designed to function with cookies. If your users aren't using them, tell them they're penalizing themselves.
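For the popup half, here is a tiny sketch of the overlaid-div approach using jQuery UI's dialog widget (this assumes jQuery and jQuery UI are already on the page, and that #details is a div holding the content you'd otherwise open in a new window):

// show an in-page modal overlay instead of window.open(),
// so no popup blocker ever gets involved
$("#details").dialog({
    modal: true,
    title: "Details",
    width: 400
});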
Certain browser features are ONLY user-configurable, for security reasons. You cannot provide a button to change these settings, because then they would no longer be user-configurable.
All you can do is warn the user.
JavaScript cannot change a client browser's settings, for security reasons. Otherwise, all hell would break loose.
Note: you can if you create an executable program that the user runs on his/her computer.
However, you should never change a user's browser settings.
Instead, you should give the user a warning and instructions, which is the proper way of doing it:
Disable JavaScript
Disable cookies
I'm sorry if this is a newbie question but I don't really know what to search for either. How do you keep content from a previous page when navigating through a web site? For example, the right side Activity/Chat bar on facebook. It doesn't appear to refresh when going to different profiles; it's not an iframe and doesn't appear to be ajax (I could be wrong).
Thanks,
I believe what you're seeing in Facebook is not actual "page loads", but clever use of AJAX or AHAH.
So ... imagine you've got a web page. It contains links. Each of those links has a "hook" -- a chunk of JavaScript that gets executed when the link gets clicked.
If your browser doesn't support JavaScript, the link works as it normally would on an old-fashioned page, and loads another page.
But if JavaScript is turned on, then instead of navigating to an HREF, the code run by the hook causes a request to be placed to a different URL that spits out just the HTML that should be used to replace a DIV that's already showing somewhere on the page.
There's still a real link in the HTML just in case JS doesn't work, so the HTML you're seeing looks as it should. Try disabling JavaScript in your browser and see how Facebook works.
Live updates like this are all over the place in Web 2.0 applications, from Facebook to Google Docs to Workflowy to Basecamp, etc. The "better" tools provide the underlying HTML links where possible so that users without JavaScript can still get full use of the applications. (This is called Progressive Enhancement or Graceful degradation, depending on your perspective.) Of course, nobody would expect Google Docs to work without JavaScript.
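In modern terms, the hook described above boils down to something like the sketch below; the link class, fragment URL and target div are invented for illustration.

// intercept the click, fetch just an HTML fragment, and swap it into a div;
// the href stays in the markup as a fallback for browsers without JavaScript
document.querySelector("a.profile-link").addEventListener("click", async (e) => {
    e.preventDefault();                               // skip the full page load
    const url = e.currentTarget.href + "?fragment=1"; // hypothetical fragment endpoint
    const res = await fetch(url);
    document.querySelector("#content").innerHTML = await res.text();
});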
In the case of a chat like Facebook's, you must save the entire conversation on the server side (for example, in a database). Then, when the user changes pages, you can restore the state of the conversation on the server side (with PHP) or by querying your server the way you do for the chat itself (JavaScript + AJAX).
This isn't done in JavaScript. It needs to be done using your back-end scripting language.
In PHP, for example, you use Sessions. The variables set by server-side scripts can be maintained on the server and tied together (between multiple requests/hits) using a cookie.
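The same pattern in Node, purely for illustration, using the express-session middleware (the route, the stored variable and the secret are placeholders):

import express from "express";
import session from "express-session";

const app = express();
app.use(session({ secret: "change-me", resave: false, saveUninitialized: false }));

app.get("/chat", (req, res) => {
    // anything stored on req.session survives across page loads for this visitor,
    // tied together behind the scenes by a session cookie
    req.session.history = req.session.history || [];
    res.json(req.session.history);
});

app.listen(3000);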
One really helpful trick is to run HTTPFox in Firefox so you can actually monitor what's happening as you browse from one page to the next. You can check out the POST/Cookies/Response tabs and watch for which web methods are being called by the AJAX-like behaviors on the page. In doing this you can generally deduce how data is flowing to and from the pages, even though you don't have access to the server side code per se.
As for the answer to your specific question, there are too many approaches to list (cookies, server side persistence such as session or database writes, a simple form POST, VIEWSTATE in .net, etc..)
You can reopen your last closed web page by pressing Ctrl+Shift+T, and then save its content as you like. For example: if I closed a web page about document sharing and am now on a travel web page, pressing Ctrl+Shift+T will automatically reopen my last web page. This works in Firefox, Internet Explorer, Opera and more. Hope this answer is helpful to you.
I have a window.prompt in my code, which should stop the flow of the code until the user puts in some value.
Of course, IE7 tries to protect me from myself: instead of showing the prompt, it shows the security bar (at the top of the page, where it alerts the user that a script wants to open a window).
What is even worse is that the prompt is ignored and the rest of the flow runs anyway. Needless to say, this can't work, as the script is missing data from the user.
What should I do to avoid that security bar, given that this script is part of the page/domain?
window.prompt() is no longer usable on the general web for this very reason, since there's no way to get round the security banner in your script. You'll have to use some other mechanism, like any of the many JavaScript UI components that emulate modal dialogs.
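For instance, one stand-in in today's browsers is the native <dialog> element. Unlike prompt(), it cannot pause the script, so the surrounding code has to be restructured around a promise or callback; the element ids below are made up.

// assumed markup:
// <dialog id="ask"><form method="dialog"><input id="answer"><button>OK</button></form></dialog>
function askUser() {
    return new Promise((resolve) => {
        const dlg = document.getElementById("ask");
        dlg.addEventListener("close", () => {
            resolve(document.getElementById("answer").value);
        }, { once: true });
        dlg.showModal(); // modal for the user, but the script keeps running -- hence the Promise
    });
}

askUser().then((value) => console.log("user entered:", value));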
I'm building a webapp that contains an IFrame in design mode so my users can "tart up" their content and paste in content to be displayed on their page, like the WYSIWYG editor on most blog engines or forums.
I'm trying to think of all potential security holes I need to plug, one of which is a user pasting in Javascript:
<script type="text/javascript">
// Do some nasty stuff
</script>
Now I know I can strip this out at the server end, before saving it and/or serving it back, but I'm worried about the possibility of someone being able to paste some script in and run it there and then, without even sending it back to the server for processing.
Am I worrying over nothing?
Any advice would be great, couldn't find much searching Google.
Anthony
...I'm worried about the possibility of someone being able to paste some script in and run it there and then, without even sending it back to the server for processing.
Am I worrying over nothing?
Firefox has a plug-in called Greasemonkey that allows users to arbitrarily run JavaScript against any page that loads into their browser, and there is nothing you can do about it. Firebug allows you to modify web pages as well as run arbitrary JavaScript.
AFAIK, you really only need to worry once it gets to your server, and then potentially hits other users.
As Jason said, I would focus more on cleaning the data on the server side. You don't really have any real control on the client side unless you're using Silverlight / Flex and even then you'd need to check the server.
That said, here are some tips from "A List Apart" that you may find helpful regarding server-side data cleaning.
http://www.alistapart.com/articles/secureyourcode
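As a convenience on the client side (never as the security boundary, which must stay on the server), one widely used option is the DOMPurify library; a minimal sketch:

// npm i dompurify -- strips <script> tags, event-handler attributes, javascript: URLs, etc.
import DOMPurify from "dompurify";

const pasted = "<p>hello</p><img src=x onerror=alert(1)>";
const clean = DOMPurify.sanitize(pasted); // the onerror handler is removed
document.querySelector("#editor").innerHTML = clean;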