fswrite with full 777 server protocol. Secure? - javascript

I have a question about the security of a server when 777 permissions are enabled.
This is the code:
<?php
// collect the cookie - save the data
if (!isset($_COOKIE["markertype"])) {
    echo "Cookie named markertype is not set!";
} else {
    echo "Cookie markertype is set!<br>";
    echo "Value is: " . $_COOKIE["markertype"];
    $file = 'newfile.txt';
    // Open the file to get existing content
    $current = file_get_contents($file);
    // Append the new value to the file
    $current .= $_COOKIE["markertype"];
    // Write the contents back to the file
    file_put_contents($file, $current);
}
?>
Basically, the system lets users do things in JavaScript; I set a cookie in JS containing some user information they enter, then send that cookie to PHP to store the information on the server. The problem was that the server didn't have permission to write to a file, so I proceeded to give full 777 access to the entire directory.
I think this opens me up to full XSS attacks and more. What is my alternative, or can I secure the server in a different way? The server is NOT mine, so I only have certain access.

Yes, 777 does open some security holes, but it matters which directory you apply it to. If, for instance, you make a publicly accessible folder such as public_html fully 777, that is a HUGE issue; instead, make a non-publicly-accessible folder 777 and save your data to that.
This still isn't 100% secure, because ideally your data should be stored off-site or in a database, but if it's just for research then that is fine.
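A minimal sketch of that idea, writing to a directory outside the web root; the path, filename, and character whitelist below are illustrative assumptions, not part of the original setup:

```php
<?php
// Sketch: persist the cookie value in a directory that is NOT web-accessible,
// and sanitize the untrusted cookie input before writing it.
// The path and whitelist are assumptions for illustration.
$dataDir = __DIR__ . '/../private_data';   // outside public_html
$value   = isset($_COOKIE['markertype']) ? $_COOKIE['markertype'] : '';
$value   = preg_replace('/[^A-Za-z0-9_-]/', '', $value); // strip anything unexpected
if ($value !== '') {
    // FILE_APPEND avoids the read-modify-write cycle; LOCK_EX guards concurrent writes
    file_put_contents($dataDir . '/newfile.txt', $value . PHP_EOL, FILE_APPEND | LOCK_EX);
}
?>
```

Sanitizing the cookie value also addresses the XSS concern: whatever lands in the file can no longer contain markup.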
Hope this helped.
H

Related

How can I test a javascript caching solution?

I'm looking into some of the established methods for forcing a reload of cached JavaScript files. The problem is, I can't get a script to cache locally in the first place. I checked Chrome's Network tab to confirm the "Disable cache" option is off. I'm using MVC to add this to a site homepage:
@section scripts
{
    @Scripts.Render("~/Scripts/app/test.js")
}
And here is the test.js content:
alert('Welcome to 2');
If I change the alert text and refresh the page, regardless of whether the project is restarted or not, it is always fresh...
It's normally a server configuration issue.
https://medium.com/pixelpoint/best-practices-for-cache-control-settings-for-your-website-ff262b38c5a2
If you're a client-side dev who doesn't want to mess with server configs, you could also rename your test.js file to test.php (literally just change the extension), call it the exact same way you're calling it now (again, with the extension changed to .php), and add something like this to the very top of the PHP file:
<?php
$seconds_to_cache = 3600;
$ts = gmdate("D, d M Y H:i:s", time() + $seconds_to_cache) . " GMT";
header("Expires: $ts");
header("Pragma: cache");
header("Cache-Control: max-age=$seconds_to_cache");
?>
Then upload it to your server. As long as your server handles PHP (it probably does), this code should force the browser to cache the new file, which will function just like any JS file.
(Taken from https://electrictoolbox.com/php-caching-headers/)

Create json data file with extension '.json' [duplicate]

So I want to write a REST API in PHP that serves JSON, for consumption on the iPhone but also by a lot of websites and devices. The code below, when accessed via a GET request, returns a file named like:
1mdi2o3.part
How do I return something like: users.json
$db->setQuery("SELECT * FROM users");
$db->query() or die($queryError);
$numRows = $db->getNumRows();
$row = $db->loadObjectList();
// PRINT JSON FILE
header("Content-type: application/json");
for ($i = 0; $i < $numRows; $i++) {
    $json_output[$i] = $row[$i];
}
$MYjson_output = json_encode($json_output);
echo $MYjson_output;
Not entirely sure what your goal is, but here are 3-4 solutions that might work:
Common Solutions
Rewrite
This is probably the most conventional way of getting clean URIs for your API. If you want the user part of users.json to be dynamic, you can use mod_rewrite (again, assuming Apache) to rewrite your URLs to a PHP handler script. If you are going the conventional RESTful-style URL route, you will probably want to use this anyway to achieve clean, separated URLs.
# For grabbing all users
RewriteEngine on
RewriteRule ^users\.json rest.php [L]
# For grabbing one particular user
RewriteEngine on
RewriteRule ^([a-z0-9]+)\.json rest.php?user=$1 [L]
Where rest.php is your PHP handler script.
URL Rewriting without mod-rewrite
If you don't want to use mod_rewrite, you can also do something like
example.com/rest.php/users.json
example.com/rest.php/user-id.json
or even
example.com/rest.php/user-id
example.com/rest.php/user/user-id
Where rest.php is your PHP handler script. You can grab the user-id from the URL (or URI, if we're talking in RESTful terms) using the $_SERVER superglobal: $_SERVER['REQUEST_URI'].
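As a rough sketch of that approach (the URL shape and variable names are assumptions based on the examples above):

```php
<?php
// rest.php - sketch: extract the user id from the extra path segment,
// e.g. /rest.php/user-id.json or /rest.php/user/user-id
$path     = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$segments = explode('/', trim($path, '/'));  // e.g. ['rest.php', 'user-id.json']
$last     = end($segments);
$userId   = basename($last, '.json');        // 'user-id', with or without the extension
?>
```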
Other solutions
Changing download name via Content-Disposition:
I believe you want to add the Content-Disposition header...
<?php
header('Content-Disposition: attachment; filename="users.json"');
?>
This will force the file to be downloaded as users.json. This usually isn't the behavior expected from a REST API, but from the way your question was worded, I figured I'd throw it out there.
AddHandler (or AddType)
Assuming you're running an Apache server, you can also just use the AddHandler directive to make files with the .json extension be treated as PHP.
AddHandler application/x-httpd-php .json
Warning: other plain .json files will be treated as PHP, so you'd probably want to set this in a .htaccess file in the directory containing the PHP script. This works fine for this one URI, but it isn't ideal if you are exposing many URIs.
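One way to limit that exposure (a sketch, assuming Apache with AllowOverride enabled for the directory) is to scope the handler to a single file in the .htaccess:

```
# .htaccess - only treat this one file as PHP, not every .json in the directory
<Files "users.json">
    SetHandler application/x-httpd-php
</Files>
```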
Welcome to SO. Have you considered using the Zend framework for this application? I've written about this topic before, and if you do use Zend I could probably be of additional help. It would certainly help get you past the basics.
HTH,
-aj
The problem is this line:
header("Content-type: application/json");
I know this looks like the right way to do it (after all, application/json is the official MIME type for JSON content), but it causes most browsers to present you with a file download dialog when you just want to see the text. You can use the text/plain content type to avoid this. Note that your AJAX/iPhone/... app probably doesn't care about the content type, so your API will work in both cases.
Also see this blog post which provides some more context.

PHP scripts writes to file on localhost but not on IIS server

I've made a simple website that works fine on localhost, so I've put it on an IIS Windows Server 2008 R2 machine, and now my PHP scripts don't write to my JSON files anymore. I've checked the server and PHP is installed on it, so I don't really know what's wrong or where to look.
I'm still not getting it to work so thought I'd explain the situation in more detail.
So this script works on localhost but not on IIS server.
<?php
$myFile = "../json/countries.json";
$fh = fopen($myFile, 'w') or die("can't open file");
$stringData = json_encode($_POST["data"]);
fwrite($fh, $stringData);
fclose($fh)
?>
I've tried adding:
error_reporting(E_ALL);
ini_set('display_errors', '1');
and
chmod("../json/countries.json", 0644);
to the PHP, but I'm not seeing any different results or any errors.
Here's the JavaScript function that starts the process; outputting the object to the console does show the correct data to be saved.
function saveJson(object, file) {
    console.log("Saving JSON data: " + JSON.stringify(object));
    $.ajax({
        type: "POST",
        dataType: 'json',
        async: false,
        url: file,
        data: { data: object },
        success: function () { console.log("Thanks!"); },
        error: function () { console.log("Error!"); } // jQuery's option is 'error', not 'failure'
    });
}
Still the json files are not being changed.
Anyone good with Windows Server and php that might know why?
Thanks
This kind of problem occurs for three reasons only:
1. The directory doesn't have the proper owner. E.g., on Apache the default user is www-data. First check the directory owner.
2. You don't have sufficient write permission on the directory (e.g. it is 755). Check the directory permissions.
3. The path to the directory you are writing or uploading the file to is incorrect. Check the directory path where you are writing the file.
According to the PHP documentation, on Windows installations the default user is IUSR.
So set the directory owner to 'IUSR' ('IUSR' is part of the IIS_IUSRS group). If that doesn't work, try granting access to 'IIS AppPool\{YourApplicationPoolName}' instead of IIS AppPool\DefaultAppPool, e.g. IIS AppPool\yourdomain.com. This setting is required for IIS 8.
First try changing the owner to IUSR; according to the PHP documentation, that should work.
Then set write permission on the directory.
To change directory permissions and user/group on Windows: right-click the directory -> Properties -> Security -> Edit.
Please check the attached screenshot.
What version of PHP are you using? If it is less than 5.2, the json_encode function may not be available. Use phpinfo() to find out.
It may also be that $_POST['data'] does not exist. Try passing a literal array instead to see if that's the problem, e.g. array(1, 2, 3);
Also, if no warnings are being shown, put error_reporting(E_ALL); or error_reporting(-1); at the top of your script. It may be that your host is turning these errors off; this will let you see what the problem is.
The host also has the ability to deactivate certain PHP functions. Contact their customer service about this, although I find it very unlikely that they have deactivated the ones you are using.
Make sure the file exists relative to where the PHP code is being executed. If this is the problem, the die command should run. You can check whether the file exists using:
if (file_exists ( $filename ) ) echo "file exists";
else echo "file does not exist";
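Along the same lines, a small diagnostic sketch (using only standard PHP functions; the path is the one from the question) that also distinguishes a missing file from a permissions problem:

```php
<?php
// Sketch: separate "file missing" from "file not writable by the IIS user"
$myFile = "../json/countries.json";
if (!file_exists($myFile)) {
    echo "file does not exist";
} elseif (!is_writable($myFile)) {
    echo "file exists but is not writable"; // points to an ownership/permission issue
} else {
    echo "file exists and is writable";
}
?>
```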
If none of the above work then I haven't got a clue! Sorry.
Add IIS_IUSRS with "execute, list, read" access on the Security tab of the directory's Properties.

PHP Ajax jQuery cross domain file loading

So before you say this can't be done: these are all files that I have on one server; I just have some of them listed under different domains.
My PHP script can access all the files, but when I try to do an AJAX request to load a file I will often get an error (because the site I am accessing is secure and the one I am accessing it through isn't).
What I need is a way to have PHP grab the file, but I need AJAX to retrieve the file and render it for me. I am also using the ACE editor to edit the file.
The bit of code I have here actually errors out as well: it loads and prints out the file at the line where $page is defined, but nothing is loaded where htmlspecialchars is.
<script>
var e = ace.edit("editor");
<?php
$page = readfile($_SERVER['DOCUMENT_ROOT'].$_GET['dir']);
echo 'e.setValue('.htmlspecialchars($page, ENT_QUOTES).');';
?>
</script>
I have an AJAX GET request working, but it doesn't work when I go to a directory with a special .htaccess file. I can't change the .htaccess file (unless there is a way for me to confirm that it is my script running and not someone else's).
The question is: how can I access those other files without getting that error? Mind you, those files could have any extension; it is not limited to just scripts or CSS. Mostly they will be HTML or PHP files.
After an hour of searching the deep, dark depths of the php.net site, I was able to put together a solution that works.
<?php
echo htmlspecialchars(
    addslashes(
        file_get_contents(
            $_SERVER['DOCUMENT_ROOT'] . $_GET['dir']
        )
    )
);
?>
The addslashes call is the extra part that I needed. Then I also had to put the output between the editor's div tags; I couldn't use the editor.setValue() function.
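For comparison, json_encode can produce a valid JavaScript string literal directly, which is a common alternative to hand-combining htmlspecialchars and addslashes (a sketch using the same $_GET['dir'] input; in real use the path should be validated against directory traversal):

```php
<script>
var e = ace.edit("editor");
<?php
// Sketch: json_encode escapes quotes, newlines, and forward slashes
// (so "</script>" becomes "<\/script>"), yielding a string that is
// safe to embed in inline JS.
$page = file_get_contents($_SERVER['DOCUMENT_ROOT'] . $_GET['dir']);
echo 'e.setValue(' . json_encode($page) . ');';
?>
</script>
```

With this approach the editor.setValue() call works, because the embedded value is a well-formed JS string rather than raw file content.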

HTTP File Download: Monitoring Download Progress

I am in a situation where I have to implement downloading of large files (up to 4 GB) from a web server (Apache 2.4.4) over HTTP. I have tried several approaches, but the best solution looks to be the use of the X-SendFile module.
As I offer a progress bar for file uploads, I would need the same feature for file downloads. So here are my questions:
Is there any way, including a workaround, to monitor file download progress?
Is there any way, including a workaround, to calculate the file download transfer speed?
Is there a better way to provide efficient file downloads from a web server than the X-Sendfile module?
Is there a better file download option in general that would allow me to monitor download progress? It can be a client-side (JavaScript) or server-side (PHP) solution. Is there any particular web server that allows this?
Currently I use:
Apache 2.4.4
Ubuntu
Many thanks.
Two ideas (not verified):
First:
Instead of placing regular links to the files you want to download on your page, place links like .../download.php, which might look something like this:
<?php
// download.php file
session_start(); // if needed
$filename = $_GET['filename'];
header('Content-type: text/plain'); // use any MIME you want here
header('Content-Disposition: attachment; filename="' . htmlspecialchars($filename) . '"');
header('Pragma: no-cache');
// of course add some error handling
$filename = 'c:/php/php.ini';
$handle = fopen($filename, 'rb');
// do not use file_get_contents, as you've said files are up to 4GB - so read in chunks
while ($chunk = fread($handle, 1000)) { // chunk size may depend on your file size
    echo $chunk;
    flush();
    // write progress info to the DB or a session variable in order to update the progress bar
}
fclose($handle);
?>
This way you can keep an eye on your download process. In the meantime you can write progress info to a DB/session variable and update the progress bar by reading the status from the DB/session variable, using AJAX to poll a script that reads the progress info.
That is very simplified, but I think it might work as you want.
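The polled script mentioned above could be a tiny endpoint like this (a sketch; the file name and session key are assumptions, and the download script would need to call session_write_close() after each session update, or this polling request will block on PHP's session lock):

```php
<?php
// progress.php - sketch of the AJAX-polled endpoint.
// Assumes the download loop stores the bytes sent so far
// in $_SESSION['dl_progress'].
session_start();
header('Content-Type: application/json');
echo json_encode(array(
    'bytes_sent' => isset($_SESSION['dl_progress']) ? $_SESSION['dl_progress'] : 0
));
?>
```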
Second:
Apache 2.4 has the Lua language built in:
mod_lua
Creating hooks and scripts with mod_lua
I bet you could write a Lua Apache handler that monitors your download - sending progress to the DB, with PHP/AJAX updating the progress bar from the DB.
Similarly, there are modules for Perl and even Python (though not for Windows).
I see the main problem as this: in a PHP + Apache solution, output buffering can happen in several places:
Browser <= 1 => Apache <= 2 => PHP handler <= 3 => PHP interpreter process
You need to control the first buffer, but that is impossible to do directly from PHP.
Possible solutions:
1) You can write your own mini daemon whose primary function is only to send files, and run it on a port other than 80 (8880, for example), then process the file downloads and monitor the output buffer from there. Your output buffer will then be the only one, and you can control it:
Browser <= 1 => PHP interpreter process
2) You can also use mod_lua and control the output buffers directly from Apache.
3) You can also use nginx and control its output buffers using the built-in Perl module (it is stable).
4) Try the PHP built-in web server and control the PHP output buffer directly. I can't say anything about how stable it is, sorry, but you can try. ;)
I think that nginx + PHP + built-in Perl is the most stable and powerful solution, but you can choose, and maybe use another solution not on this list. I will follow this topic and await your final solution with interest.
Reading from and writing to the database at short intervals will kill performance.
I would suggest using sessions instead (incrementing the value of sent data in the loop); the progress can then safely be read by a completely separate PHP file, which can return the data as JSON for use by a JavaScript function/plugin.
