Ajax to Filter Results Without Reloading - javascript

I have created a webpage that displays product details (15, 30, or 45 per page).
The user can reorder the results by reviews, by price, etc., and can filter by brand, weight, and so on.
I'm using Ajax to send the query to an external page. Is this dangerous?
What else can I do to make sure I'm not open to attacks?

From your API, make sure you're auditing the request coming from the client. Check the bounds, make sure it's not malformed, make sure it fits what you're expecting. If it's invalid, send back a proper response to the client stating so (403/404/etc.); if it's valid, send back the filtered results.
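In PHP, such a check might look roughly like this (a sketch only; the per_page and sort parameter names are made up for illustration):

<?php
// filter.php - reject anything outside the expected bounds before querying.
// The parameter names (per_page, sort) are illustrative, not prescriptive.
$allowedPerPage = [15, 30, 45];          // the page sizes the UI actually offers
$allowedSorts   = ['price', 'reviews']; // whitelist of sortable fields

$perPage = (int) ($_GET['per_page'] ?? 15);
$sort    = $_GET['sort'] ?? 'price';

if (!in_array($perPage, $allowedPerPage, true) || !in_array($sort, $allowedSorts, true)) {
    http_response_code(400); // malformed or out-of-bounds request
    exit(json_encode(['error' => 'Invalid filter parameters']));
}

// ...request is valid: run the (parameterized) query and return the results.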

You shouldn't send raw SQL from the client. Just send the values of the form fields, and use them in the PHP script on the server to construct the SQL query.
See https://stackoverflow.com/a/28909923/1491895 for a technique to generate the WHERE clause dynamically from form fields. Similar methods can be used for the ORDER BY clause.
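As a sketch of that technique (assuming a products table and illustrative column names; adjust to your schema), the WHERE clause is built only from whitelisted field names while the values are bound through a prepared statement:

<?php
// Build the WHERE clause only from whitelisted field names; the values are
// bound as parameters, so the client never supplies any SQL text.
$filterable = ['brand', 'weight'];        // columns the client may filter on
$sortable   = ['price', 'review_count']; // columns the client may sort on

$where  = [];
$params = [];
foreach ($filterable as $field) {
    if (isset($_GET[$field]) && $_GET[$field] !== '') {
        $where[] = "$field = :$field";
        $params[":$field"] = $_GET[$field];
    }
}

// ORDER BY cannot take a bound parameter, so the column name is whitelisted.
$orderBy = in_array($_GET['sort'] ?? '', $sortable, true) ? $_GET['sort'] : 'price';

$sql = 'SELECT * FROM products'
     . ($where ? ' WHERE ' . implode(' AND ', $where) : '')
     . " ORDER BY $orderBy";

$pdo  = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass'); // your credentials
$stmt = $pdo->prepare($sql);
$stmt->execute($params);
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

Because the column names come from server-side whitelists and only the values are bound, the client can influence which filters apply but never the SQL itself.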

Related

How to send data to another page without using cookies or JS in PHP?

So, I have a PHP file that runs a SQL (MySQL) query and gets the result. I need that page to send the result to another page automatically (I can use PHP in the receiving page). However, I want to make my website accessible to privacy-sensitive people who disable cookies and JavaScript in their browsers. I'm not using any PHP frameworks.
The initial page that runs the query only runs in the backend, like a controller, so it does not have any HTML in it. This means I cannot use a hidden form and make the user submit it with a button. I thought about sessions, but they need cookies to work, and JSON needs JavaScript. I thought about sending the data in the URL, but the query result is quite big and I was afraid it could exceed some kind of URL length limit (if such a thing exists). Is there a way to achieve this reliably?
EDIT: To clarify, the data I am sending is a search result, so the query changes depending on the user's input (which was provided via a form to the page that runs the query).
Store the data somewhere, send an ID through the query string, and use that ID to fetch the data from the database on the receiving page. Passing large amounts of data through cookies, the session, or even the query string is not advisable; just pass a unique ID and grab the data from the DB.
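A minimal sketch of that approach, assuming a scratch table called search_results (the table, column, and script names are all illustrative):

<?php
// page_a.php - the "controller" page: store the result, hand off only an ID.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$payload = json_encode($rows); // $rows = the MySQL result produced earlier

$stmt = $pdo->prepare('INSERT INTO search_results (payload, created_at) VALUES (?, NOW())');
$stmt->execute([$payload]);
$id = $pdo->lastInsertId();

// No cookies, no JavaScript: just a redirect with the ID in the query string.
header('Location: page_b.php?result_id=' . $id);
exit;

<?php
// page_b.php - the receiving page: fetch the stored result back by its ID.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT payload FROM search_results WHERE id = ?');
$stmt->execute([(int) ($_GET['result_id'] ?? 0)]);
$rows = json_decode($stmt->fetchColumn(), true);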
Happy coding.

How to secure data or row ids that represent something in PHP

In my project a user can make a post (post a photo or some text). Technically I identify each post with a unique id in the posts table, where I store the user_id (owner of the post); the id of the post is set to auto-increment. Knowing the identification of each post, I fetch all rows in the posts table and put these posts and the relevant details (post_id, user_id, data, etc.) inside the HTML. There are more things a user can do with a post, like discussing it, rating it, etc.
These things are done via an Ajax POST, since I store the post_id in an HTML element attribute (like data-p="52"). Sometimes I use the PHP base64_encode function to encode the post_id.
Mostly, events in my application act on the post_id and user_id that are stored in these HTML custom attributes.
Now I am thinking about security issues, and my question is: is there a proper method or way I can hold this info in JavaScript, or a proper way I can encrypt this information about the post?
It is good that you are thinking about the possible security vulnerabilities within your system. However, at the moment, from what I can tell, you are not quite worrying about the right thing. Data like a user's ID or a post's ID is not sensitive in itself. If you look at the URLs of social networks and the like, you will very likely see user ID information. What you need to think about is: how can I make sure that it doesn't matter that this data is public? In other words, how can I prevent an attacker from using this data? Because this data, on its own, is just a bunch of numbers. Now, if there is a reason why these IDs are actually sensitive in your system, you should think about a slight structural rearrangement.
Take the common (or less so these days) SQL injection technique, where an attacker enters SQL code into a user input field, and that input is then concatenated/substituted right into a SQL query, giving unwanted access to the database (see http://www.w3schools.com/sql/sql_injection.asp). The problem is not that the attacker knows a post's ID and can therefore delete that particular post; it's that he can delete any post he wants. So the problem is the SQL vulnerability, not the knowledge of the ID. And to fix the SQL vulnerability, you need to make sure user input is never interpreted as SQL code, by sanitizing it or, better, by using parameterized queries.
Of course I am not saying you shouldn't take care of what data is available to users. But the system itself needs to be robust.
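For example, a parameterized query in PHP/PDO keeps the ID as pure data (a sketch; the schema and credentials are assumed):

<?php
// The post ID travels as a bound parameter, separate from the SQL text, so
// whatever the client puts in data-p can never change the query's shape.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT * FROM posts WHERE id = ?');
$stmt->execute([(int) $_POST['post_id']]);
$post = $stmt->fetch(PDO::FETCH_ASSOC);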
The scenario you're worried about (that an attacker can use these IDs to send requests to your system to manipulate data in a way that you don't want) is independent of whether you expose your IDs to the client or not.
In some fashion, requests from your client will need to be handled by your server. If you have a feature where authors of posts can delete their own posts, hopefully you are validating, for each delete request, that the user initiating it is actually the owner of that post, and not just blindly deleting data whenever someone asks your system to. Even if you introduced a surrogate key to prevent the actual primary key from leaking to the public, whatever route endpoint is used to delete posts will still need to handle that data in a robust fashion that maintains the integrity of your data.
As I stated in a comment, really the only thing you should worry about is people doing math on your IDs (such as adding or subtracting 1) to see whether they can game your system and receive data back that maybe they shouldn't have. But even in that scenario, whatever endpoint responds to the request should validate that request before returning anything. You can prevent this entirely by a) validating that the user requesting the data actually owns the data, and b) not using auto-incremented integers as primary keys, relying on UUIDs instead.
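A sketch of such an ownership check in PHP (the table, column, and session key names are assumptions):

<?php
// delete_post.php - ownership check: the user ID comes from the server-side
// session, never from the client, so knowing someone else's post_id is useless.
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$stmt = $pdo->prepare('DELETE FROM posts WHERE id = ? AND user_id = ?');
$stmt->execute([(int) $_POST['post_id'], $_SESSION['user_id']]);

if ($stmt->rowCount() === 0) {
    http_response_code(403); // not the owner, or no such post
}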

Passing large amounts of data from one page to another without POST?

I'm using a web server framework that works with only GET requests. At the moment I'm trying to pass a large amount of data, namely the text content of a textarea (which comes from user input), to another page that echoes the user's input.
I've attempted query strings, but I end up receiving the error "Requested URL too long".
Any suggestions as to which method I should use?
If you can only send data encoded in GET requests, then you will have to break up the request and send it in multiple parts.
You could either use Ajax or store the entire set of data in localStorage and fetch each chunk in turn as the page reloads.
One approach would be to make a request to an endpoint that allocates you a unique ID. Then send a series of requests in the form ?id=XXX&page=1&data=... before closing it with ?id=XXX&total_pages=27, at which point you assemble the different pieces on the server.
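The server side of that scheme could assemble the chunks per ID; a rough PHP sketch using the session (the parameter names match the made-up scheme above):

<?php
// assemble.php - collect GET-sized chunks under an ID and stitch them together.
// Parameter names follow the ?id=...&page=...&data=... scheme sketched above.
session_start();

$id = preg_replace('/\W/', '', $_GET['id'] ?? ''); // keep the array key safe

if (isset($_GET['page'], $_GET['data'])) {
    // Store each chunk as it arrives, keyed by its page number.
    $_SESSION['chunks'][$id][(int) $_GET['page']] = $_GET['data'];
} elseif (isset($_GET['total_pages'])) {
    // Closing request: reassemble the chunks in page order.
    $chunks = $_SESSION['chunks'][$id] ?? [];
    ksort($chunks);
    if (count($chunks) === (int) $_GET['total_pages']) {
        $fullText = implode('', $chunks);
        // ...hand $fullText to whatever needs it.
    }
}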
This way lies madness. It would be much better to add POST support to your framework.
Try using JavaScript cookies.
You can store the textarea value there and then read it on another page (or wherever you want). Note, though, that cookies are limited to roughly 4 KB each, so this won't help with very large texts.
Here's a tutorial:
http://www.w3schools.com/js/js_cookies.asp

How to keep the filters from DataTables after a Back/Forward or Refresh

We're using DataTables for our tables, and we're having a problem/disagreement about how to keep the history of filters that were applied to the table, so that users can go back/forward and refresh through them.
Now, one proposed solution was to keep the filter string in the URL and pass it around as a GET request, which would work well with back/forward and refresh. But as I have very customized filtering options (nested groups of filters), the filter string gets quite long, actually too long to pass in a GET request because of the length limit.
So, as GET is out of the question, the obvious solution would be a POST request, and this is what we can't agree upon.
The first solution is to use the POST request and accept the "annoying" popup every time we go back/forward or refresh. We would also break the POST/Redirect/GET pattern that we use throughout the site, since there would be no GET.
Pros:
Simple solution
No second requests to the server
No additional database request
No additional database data
Only save the filter to the database when you choose to, so that you can re-apply it whenever you want
Cons:
Breaks the POST/Redirect/GET pattern
Having to push POST data with pushState (history.js)
How to get refresh to work?
The second solution is to use the POST request: the server side saves the data in the DB, generates an ID for requesting the saved data, and returns it; the client then does a GET request with this ID, which the server side matches back to the data, returning the right filter, thus retaining the POST/Redirect/GET pattern. This solution makes two requests and saves every filter the users apply to the database. Each user would have only a limited number of 'history' filters saved in the database, the older ones being removed as new ones are applied. Basically, the server side would shorten your URL by saving the long data to the database, like a URL-shortening site does.
Pros:
Keeps the POST/Redirect/GET pattern
No popup messages when going back/forth and refreshing the page due to the post data being sent again
Cons:
Complicated solution
Additional request to the server
Additional request to the database
A lot of data in the database that will not be used unless the user goes back/forth or refreshes the page
A third solution would be very welcome, or pick one of the above and ideally explain why.
This is a fleeting thought I just had... you can save the state of length, filtering, pagination, and sorting by using bStateSave: http://datatables.net/examples/basic_init/state_save.html
My thought was: theoretically you could save the cookie generated by datatables.js into a database table, like you mention in the second solution, but the request would only have to happen each time you want to overwrite the current filter, replacing the current cookie with the previous "history" cookie.
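Server-side, that could be as small as one endpoint that stores and returns the state blob; a rough PHP sketch, assuming a datatable_states table with a unique key on user_id and a client that posts the serialized state (e.g. from DataTables' stateSaveCallback):

<?php
// save_state.php - persist/restore the DataTables state blob per user.
// The datatable_states table (unique key on user_id) is an assumption.
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // Overwrite the stored state only when the client explicitly asks to.
    $stmt = $pdo->prepare('REPLACE INTO datatable_states (user_id, state) VALUES (?, ?)');
    $stmt->execute([$_SESSION['user_id'], $_POST['state']]);
} else {
    // GET: hand the previously saved "history" state back to the client.
    $stmt = $pdo->prepare('SELECT state FROM datatable_states WHERE user_id = ?');
    $stmt->execute([$_SESSION['user_id']]);
    header('Content-Type: application/json');
    echo $stmt->fetchColumn() ?: '{}';
}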

Shall I use javascript for page submission?

I am working on a big site that has a search module. Searching is done using a lot of user-submitted values, so for pagination I must pass all of this data to the next page, and appending the values to the URL makes the URL very long.
So how can I solve this issue? I am planning to use a JavaScript-based page submission (POST) with all the values in hidden fields, and then read all the values on the next page.
Will it cause any problems? Or should I use the database to keep the search criteria?
I would create a server-side object, possibly with a database backend, which is updated by the different pages.
In my opinion it is the clearest and easiest solution. Passing parameters from page to page, whether by POST, JavaScript, or cookies, will work too, but in my experience it's more of a quirk.
Also, if a search query is so complex that it needs multiple pages to create it, it might be helpful for the user to have all the data stored on the server, so he can change it more easily by switching back and forth between the different pages.
I would store all the search criteria in some kind of session store on the server when the initial search is triggered.
For pagination I would retrieve the criteria from the session store and then just show the appropriate results. I would also append some kind of key to the pagination links (so this would be the only hidden POST field) under which the search criteria can be found.
Even though the session is per user, you might have several search windows open within the same session, and you don't want to mix them up in the pagination.
In order to make a reliable search with pagination, we need to do a bit more than normal.
We need to handle the following cases.
Once a search is done, the user may use the browser's back and forward buttons. If you do a form submission on every page, that is unnecessary overhead. Also, if the user presses the browser's refresh button, it will unnecessarily warn him that data is being resubmitted.
Searching on a large database with lots of criteria is costly. Hence, optimization is important.
So:
1. Do NOT submit data on every page change.
2. Do NOT store data in cookies (this is not secure and not even reliable).
3. For a large database with a complex query, cache the result in the session.
4. If you need very up-to-date, real-time results, ignore point (3) and try doing a partial search for every page.
Thus, for your case, you can do the following:
1. When the user searches for the first time, make the form POST its data to a search page.
2. This search page stores the search query in the session and generates a unique ID for it.
3. Now render the result page. The result page is passed the search ID (generated in point 2) and the page number, for example: result.aspx?searchId=5372947645&page=2
4. The result page picks up the query from the session using the searchId and then provides results based on the page number sent.
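In PHP, that flow might look roughly like this (a sketch; the script names, parameter names, and session layout are illustrative):

<?php
// search.php - steps 1 and 2: store the criteria, redirect with only an ID.
session_start();
$searchId = bin2hex(random_bytes(8));      // unique ID for this search
$_SESSION['searches'][$searchId] = $_POST; // the full set of criteria

header("Location: result.php?searchId=$searchId&page=1");
exit;

<?php
// result.php - steps 3 and 4: look the criteria up again and paginate.
session_start();
$criteria = $_SESSION['searches'][$_GET['searchId'] ?? ''] ?? null;
if ($criteria === null) {
    http_response_code(404);
    exit('Unknown or expired search');
}
$page = max(1, (int) ($_GET['page'] ?? 1));
// ...build the query from $criteria and apply LIMIT/OFFSET for $page.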
Using hidden fields and the POST method should be fine too, as long as you can read them correctly on the next page.
To supplement Sarfraz's answer...
It's not necessary to use JavaScript to make a POST.
<form action="destination_url" method="POST">
    <!-- e.g. hidden fields holding the values to pass along -->
    <input type="hidden" name="some_field" value="some_value">
    ...
</form>
