PHP script writes to file on localhost but not on IIS server - javascript

I've made a simple website that works fine on localhost. I've put it on an IIS server (Windows Server 2008 R2), and now my PHP scripts don't write to my JSON files anymore. I've checked the server and PHP is installed on it, so I don't really know what's wrong or where to look.
I'm still not getting it to work, so I thought I'd explain the situation in more detail.
This script works on localhost but not on the IIS server:
<?php
// Overwrite the JSON file with the POSTed data
$myFile = "../json/countries.json";
$fh = fopen($myFile, 'w') or die("can't open file");
$stringData = json_encode($_POST["data"]);
fwrite($fh, $stringData);
fclose($fh);
?>
I've tried adding:
error_reporting(E_ALL);
ini_set('display_errors', '1');
and
chmod("../json/countries.json", 0644);
to the PHP, but I'm not seeing any different results or any errors.
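To narrow it down, a minimal diagnostic sketch like this (same path as above; realpath(), is_writable() and error_get_last() are standard PHP) should show whether the path resolves and whether the IIS worker has write access:
<?php
$myFile = "../json/countries.json";
echo "Resolved dir: " . realpath(dirname($myFile)) . "<br>";
echo "Dir writable: " . (is_writable(dirname($myFile)) ? "yes" : "no") . "<br>";
// Suppress the warning and report it manually instead
if (@file_put_contents($myFile, "test") === false) {
    print_r(error_get_last()); // e.g. "Permission denied" or "No such file or directory"
}
?>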
Here's the JavaScript function that starts the process; logging the object to the console does show the correct data to be saved.
function saveJson(object, file) {
    console.log("Saving JSON data: " + JSON.stringify(object));
    $.ajax({
        type: "POST",
        dataType: 'json',
        async: false,
        url: file,
        data: { data: object },
        success: function () { console.log("Thanks!"); },
        // jQuery's $.ajax has no "failure" option; the callback is named "error"
        error: function () { console.log("Error!"); }
    });
}
Still, the JSON files are not being changed.
Anyone good with Windows Server and PHP who might know why?
Thanks

This kind of problem occurs for three reasons only:
1. The directory doesn't have the proper owner (e.g. on Apache the default user is www-data). First, check the directory owner.
2. You don't have sufficient write permission on the directory (e.g. 755). Check the directory permissions.
3. The path to the directory you're writing or uploading the file to is incorrect. Check the directory path where you are writing the file.
According to the PHP documentation, on Windows installations the default user is IUSR.
Hence you have to set the directory owner to IUSR, which is part of the IIS_IUSRS group. If that doesn't work, then try setting the owner to 'IIS AppPool\{YourApplicationPoolName}' instead of IIS AppPool\DefaultAppPool, e.g. IIS AppPool\yourdomain.com. This setting is required for IIS 8.
First try changing the owner to IUSR; it should work according to the PHP documentation. Then set the write permission for the directory.
To change a directory's permissions and user/group on Windows: right-click the directory -> Properties -> Security -> Edit.
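If you're not sure which Windows account PHP actually runs under (and therefore which account needs the write permission), a quick check, assuming shell_exec() isn't disabled on the host, is to let PHP ask Windows itself:
<?php
// Prints the account the PHP process runs as,
// e.g. "iis apppool\defaultapppool" under an IIS application pool identity
echo shell_exec('whoami');
?>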

What version of PHP are you using? If it is less than 5.2, the json_encode function may not be available. Use phpinfo() to find out.
It may also be that $_POST['data'] does not exist. Try passing an array instead to see if that's the problem, like: array(1,2,3);
Also, if no warnings are being shown, put error_reporting(E_ALL); or error_reporting(-1); at the top of your script. It may be that your host is turning these errors off; this will let you see what the problem is.
The host also has the ability to deactivate certain PHP functions. Contact your customer service regarding this problem, although I find it very unlikely that they have deactivated the ones you are using.
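Putting those checks together, a debug-only sketch of the script might look like this (the hardcoded array replaces $_POST['data'] purely to rule out the POST data as the cause):
<?php
error_reporting(E_ALL);
ini_set('display_errors', '1');
// Hardcoded test data instead of $_POST['data']
$stringData = json_encode(array(1, 2, 3));
$fh = fopen("../json/countries.json", 'w') or die("can't open file");
fwrite($fh, $stringData);
fclose($fh);
?>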
Make sure the file exists relative to where the PHP code is being executed. If this is the problem, the die command should run. You can check whether the file exists using:
if (file_exists($filename)) echo "file exists";
else echo "file does not exist";
If none of the above work, then I haven't got a clue! Sorry.

Add IIS_IUSRS with "execute, list, read" access on the Security tab of the directory's Properties.


Retrieve a remote .csv file using Fetch API in JS [duplicate]

Are we allowed to link files directly from GitHub?
<link rel="stylesheet" href="https://raw.github.com/username/project/master/style.css"/>
<script src="https://raw.github.com/username/project/master/script.js"></script>
I know this is allowed on Google Code. This way I don't have to worry about updating a local file.
The great service RawGit was already mentioned, but I'll throw another into the ring: GitCDN.link
Benefits:
Lets you link to specific commits, as well as auto-get the latest (aka master)
Incurs no damage from high traffic volumes; RawGit asks that its dev.rawgit.com links be used only during development, whereas GitCDN gives you access to the latest version without the danger of the servers exploding
Gives you the option of auto-minifying your HTML, CSS and JavaScript, or serving it as written (https://min.gitcdn.link).
Adds compression (GZip)
Adds all the correct headers (Content-Type, cache-control, e-tag, etc)
Full disclosure, I'm a project maintainer at GitCDN.link
You can use the external server rawgithub.com. Just remove the dot between the words 'raw' and 'github' (https://raw.github.com/... => https://rawgithub.com/...) and use it. You can find more info in this question.
However, according to the rawgithub website it will be shutting down at the end of October 2019.
You can link directly to raw files, but it's best not to, since the raw files always get sent with a text/plain header and can cause loading problems.
You need to carry out the following steps:
1. Get the raw URL of the file from GitHub, which is something like https://raw.githubusercontent.com/username/folder/example.css
2. Visit http://rawgit.com/ and paste the raw URL above into the input box. It will generate two URLs, one for development and the other for production use.
3. Copy either one of them and you are done.
The file will act as a CDN. You can also use gist URLs.
GitHub Pages: https://yourusername.github.io/script.js
GitHub repo raw files: https://github.com/yourusername/yourusername.github.io/blob/master/script.js
Use GitHub Pages, DO NOT use raw files.
Reason:
GitHub Pages are served from a CDN, raw files are not. Accessing raw files will directly hit the GitHub servers and increase server load.
Add a branch to your project named "gh-pages" and then, shortly after branching, you'll be able to use a direct URL such as https://username.github.io/project/master/style.css (using your own URL, and assuming "style.css" is a file in the "master" folder in the root of your "project" repository, and that your GitHub account is "username").
For those who ended up in this post and just want to get the raw link for an image on GitHub:
If it's an image, you can just add '?raw=true' to the end of the link to the file.
E.g.
Original link:
https://github.com/githubusername/repo_name/blob/master/20160309_212617-1.png
Raw link:
https://github.com/githubusername/repo_name/blob/master/20160309_212617-1.png?raw=true
Use jsdelivr.com
Copied directly from https://www.jsdelivr.com/?docs=gh:
load any GitHub release, commit, or branch
note: we recommend using npm for projects that support it
https://cdn.jsdelivr.net/gh/user/repo@version/file
load jQuery v3.2.1
https://cdn.jsdelivr.net/gh/jquery/jquery@3.2.1/dist/jquery.min.js
use a version range instead of a specific version
https://cdn.jsdelivr.net/gh/jquery/jquery@3.2/dist/jquery.min.js
https://cdn.jsdelivr.net/gh/jquery/jquery@3/dist/jquery.min.js
omit the version completely to get the latest one
you should NOT use this in production
https://cdn.jsdelivr.net/gh/jquery/jquery/dist/jquery.min.js
add ".min" to any JS/CSS file to get a minified version
if one doesn't exist, we'll generate it for you
https://cdn.jsdelivr.net/gh/jquery/jquery@3.2.1/src/core.min.js
add / at the end to get a directory listing
https://cdn.jsdelivr.net/gh/jquery/jquery/
After searching for this same functionality, I ended up writing my own PHP script to act as a proxy. The trouble I kept running into is that even when you get the RAW version/link from GitHub and link to it in your own page, the header sent over is 'text/plain', so Chrome was not executing my JavaScript file from GitHub. I also didn't like the other links posted for using third-party services because of the obvious security/tampering issues possible.
So using this script, I can pass over the RAW link from Github, have the script set the correct headers, and then output the file as if it were coming from my own server. This script can also be used with a secure application to pull in non-secure scripts without throwing SSL errors warning of "Non-secure links used".
Linking:
<script src="proxy.php?link=https://raw.githubusercontent.com/UserName/repo/master/my_script.js"></script>
proxy.php
<?php
###################################################################################################################
#
# This script can take two URL variables
#
# "type"
# OPTIONAL
# STRING
# Sets the type of file that is output
#
# "link"
# REQUIRED
# STRING
# The link to grab and output through this proxy script
#
###################################################################################################################
# First we need to set the headers for the output file
# So check to see if the type is specified first and if so, then set according to what is being requested
if(isset($_GET['type']) && $_GET['type'] != ''){
    switch($_GET['type']){
        case 'css':
            header('Content-Type: text/css');
            break;
        case 'js':
            header('Content-Type: text/javascript');
            break;
        case 'json':
            header('Content-Type: application/json');
            break;
        case 'rss':
            header('Content-Type: application/rss+xml; charset=ISO-8859-1');
            break;
        case 'xml':
            header('Content-Type: text/xml');
            break;
        default:
            header('Content-Type: text/plain');
            break;
    }
# Otherwise, try and determine what file type should be output by the file extension from the link
}else{
    # See if we can find a file type in the link specified and set the headers accordingly
    # If a css file extension is found, then set the headers to css format
    if(strstr($_GET['link'], '.css') != FALSE){
        header('Content-Type: text/css');
    # If a javascript file extension is found, then set the headers to javascript format
    }elseif(strstr($_GET['link'], '.js') != FALSE){
        header('Content-Type: text/javascript');
    # If a json file extension is found, then set the headers to json format
    }elseif(strstr($_GET['link'], '.json') != FALSE){
        header('Content-Type: application/json');
    # If an rss file extension is found, then set the headers to rss format
    }elseif(strstr($_GET['link'], '.rss') != FALSE){
        header('Content-Type: application/rss+xml; charset=ISO-8859-1');
    # If an xml file extension is found, then set the headers to xml format
    }elseif(strstr($_GET['link'], '.xml') != FALSE){
        header('Content-Type: text/xml');
    # If we still haven't found a suitable file extension, then just set the headers to plain text format
    }else{
        header('Content-Type: text/plain');
    }
}
# Now get the contents of our page we're wanting
$contents = file_get_contents($_GET['link']);
# And finally, spit everything out
echo $contents;
?>
If your webserver has allow_url_include active, GitHub serving the files as raw text/plain is not a problem, since you can include the file first in a PHP script and set its headers to the proper MIME type.
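As a minimal sketch of that approach (note: fetching over HTTP with readfile() actually requires allow_url_fopen rather than allow_url_include; the URL here is illustrative):
<?php
// Serve a raw GitHub file with a JavaScript MIME type instead of text/plain
header('Content-Type: text/javascript');
readfile('https://raw.githubusercontent.com/username/project/master/script.js');
?>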

How can I test a javascript caching solution?

I'm looking into some of the established methods for forcing a reload of cached JavaScript files. The problem is, I can't get a script to cache locally. I checked the Network tab in Chrome and the "Disable cache" option is off. I'm using MVC to add this to a site homepage:
@section scripts
{
    @Scripts.Render("~/Scripts/app/test.js")
}
And here is the test.js content:
alert('Welcome to 2');
If I change the alert text and refresh the page, regardless of whether the project is restarted or not, it is always fresh...
It's normally a server configuration issue.
https://medium.com/pixelpoint/best-practices-for-cache-control-settings-for-your-website-ff262b38c5a2
For a client-side dev who doesn't want to mess with server configs, you could also change your test.js file to a test.php file (literally just change the extension), call it the exact same way you're calling it now (again with the extension changed to .php), and add something like this to the very top of the PHP file:
<?php
// Cache for one hour
$seconds_to_cache = 3600;
// Build an HTTP-date for the Expires header
$ts = gmdate("D, d M Y H:i:s", time() + $seconds_to_cache) . " GMT";
header("Expires: $ts");
header("Pragma: cache");
header("Cache-Control: max-age=$seconds_to_cache");
?>
Then upload it to your server. As long as your server handles PHP (it probably does), this code should force the browser to cache the new file, which will function just like any JS file.
(Taken from https://electrictoolbox.com/php-caching-headers/)
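So the whole test.php might look like this sketch (the alert line is the original test.js content from the question; everything after the closing PHP tag is sent verbatim as the JS body):
<?php
$seconds_to_cache = 3600;
$ts = gmdate("D, d M Y H:i:s", time() + $seconds_to_cache) . " GMT";
header("Expires: $ts");
header("Pragma: cache");
header("Cache-Control: max-age=$seconds_to_cache");
?>
alert('Welcome to 2');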

Can create a file using PHP, but it won't load in the browser for half an hour

So I'm making a browser-based game, and in order to create an account for this game I have the JS file call a PHP file (POST) to write an XML file.
This works: I get the file in cPanel, in the right directory, with the right content, meaning I can open it, but only in cPanel. When I try to access it via the browser I get a 404, but only for about 30 minutes; then it'll just magically start working.
This same PHP file is called later on in the game to update XML files, and the same thing happens. I can confirm that the PHP works exactly as it should, because I can see that the file/directory is perfect.
Here's the interesting bit: if I create an XML file manually or update it manually, it works instantly. It's only the XMLs created by the PHP file that take forever to load.
It's like the server doesn't realize that there was a change on it until half an hour after the fact. That is, unless I did it manually.
My PHP:
<?php
// Write the POSTed content to the POSTed filename
$filename = $_POST['fileTo'];
$newfile = fopen($filename, "w") or die('Can not open');
$string = $_POST['stuff'];
fwrite($newfile, $string) or die('Could not write');
fclose($newfile);
?>
My AJAX call:
$.ajax({
    type: 'GET',
    url: writeDirect,
    dataType: 'xml',
    cache: false,
    success: function(result) {
    },
    error: function(error) {
        // File doesn't exist yet, so create the account via POST
        $.post('PHP/Accounts/creatAcc.php', { fileTo: userWrite, stuff: writeStuff }, function() {
            signIn(userATFS, passCe);
        });
    }
});
Update:
I've decided to access the game's directory directly from the browser, and this gets even more interesting.
First thing I did was create a new account called testFile; I get the standard error on the GET because the game can't access the newly created account.
Then I opened the directory in Chrome. The directory index clearly shows that testFile.xml exists, but when I try clicking on it, it breaks: the file 404s despite clearly existing.
And no, changing the permissions on testFile.xml did not change anything.
I believe I've found the answer: I think it was just that server being weird like that. I was using x10 basic and decided to switch over to another service, and now it works.

fswrite with full 777 server protocol. Secure?

I have a question about the security of a server when full 777 permissions are set.
This is the code:
<?php
// Collect the cookie and save the data
if(!isset($_COOKIE["markertype"])) {
    echo "Cookie named markertype is not set!";
} else {
    echo "Cookie markertype is set!<br>";
    echo "Value is: " . $_COOKIE["markertype"];
    $file = 'newfile.txt';
    // Open the file to get the existing content
    $current = file_get_contents($file);
    // Append the cookie value to the existing content
    $current .= $_COOKIE["markertype"];
    // Write the contents back to the file
    file_put_contents($file, $current);
}
?>
Basically the system allows users to do stuff in JavaScript. I then set a cookie in JS with some user information that they put into the system, and send that to PHP via the cookie to store the information on the server. The problem was that the server didn't have permission to write to a file, so I proceeded to give full 777 access to the entire directory.
I think this opens me up to full XSS attacks and more. What is my alternative, or can I secure the server in a different way? The server is NOT mine, so I only have certain access.
Yeah, 777 does cause some security holes; however, it can be used on a different directory on your server. For instance, if you have a public_html folder and make that 777, that is a HUGE issue, but instead you could give full 777 to a non-publicly-accessible folder and save your data to that.
This still isn't 100% secure, because really your data should be off-site or remotely stored in a database, but if it's just for research then that is fine.
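A minimal sketch of that suggestion (the private_data name is hypothetical; the point is that the folder sits outside the public web root, so a restrictive mode works instead of 777):
<?php
// Hypothetical data directory outside the public web root
$dataDir = __DIR__ . '/../private_data';
if (!is_dir($dataDir)) {
    mkdir($dataDir, 0700, true); // owner-only access instead of 777
}
// Append the cookie value; LOCK_EX avoids interleaved concurrent writes
file_put_contents($dataDir . '/markers.txt', ($_COOKIE['markertype'] ?? '') . "\n", FILE_APPEND | LOCK_EX);
?>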
Hope this helped.
H

PHP Ajax jQuery cross domain file loading

So before you say this can't be done: these are all files that I have on one server; I just have some of them listed under different domains.
My PHP script can access all the files, but when I try to do an AJAX request to load a file I will often get an error (because the site I am accessing is secure and the one I am accessing it through isn't).
What I need is a way to have PHP grab the file, but I need AJAX to retrieve the file and render it for me. I am also using the ACE editor to edit the file.
The bit of code I have here will actually error out as well, because it will load and print out the file where $page is defined, but won't load where htmlspecialchars is.
<script>
    var e = ace.edit("editor");
    <?php
    $page = readfile($_SERVER['DOCUMENT_ROOT'].$_GET['dir']);
    echo 'e.setValue('.htmlspecialchars($page, ENT_QUOTES).');';
    ?>
</script>
I have an AJAX GET request working, but it doesn't work when I go to a directory with a special htaccess file. I can't change the htaccess file (unless there is a way for me to confirm that it is my script running and not someone else's).
The question is: how can I access those other files without getting that error? Mind you, those files could be any extension. It is not limited to just scripts or CSS; mostly they will be HTML or PHP files.
After an hour of searching the deep dark depths of the php.net site I was able to put together a solution that works.
<?php
echo htmlspecialchars(
    addslashes(
        file_get_contents(
            $_SERVER['DOCUMENT_ROOT'] . $_GET['dir']
        )
    )
);
?>
The addslashes() is the extra part that I needed. Then I also had to put the output between the div tags for the editor; I couldn't use the editor.setValue() function.
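In other words, the markup would look something like this sketch (the "editor" id matches the ACE setup above; ACE picks up the div's existing text content as the editor's initial value):
<div id="editor"><?php
echo htmlspecialchars(
    addslashes(
        file_get_contents($_SERVER['DOCUMENT_ROOT'] . $_GET['dir'])
    )
);
?></div>
<script>
    var e = ace.edit("editor");
</script>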
