I have a site where I want a user to be able to download some data as a text file, without permanently saving the file on my server. The approach I'm trying: JavaScript sends a POST, and PHP generates and saves a text file. On success, JavaScript opens that file in a separate window; after a few seconds' delay, it sends another POST naming the file to delete.
I have most of it working, but for some reason, when I try to delete the file, I keep getting an error: No such file or directory. I don't know why, especially since deleting a test file in the same directory works fine. Here's what I'm using on the JavaScript side:
////CREATE FILE
function exportGroup() {
    $.post("../Modules/Export_Mod/export_mod.php",
        {
            submit: 'export',
            groupIndex: groupSelect.value,
            userRole: 'admin',
            serial: <?php echo $serial;?>
        },
        function(data, status) {
            //open created file in new window
            window.open("../Modules/Export_Mod/" + data);
            removeExport(data);
        });
}
//////REMOVE FILE
function removeExport(filename) {
    ///After 1 second, send post to delete file
    setTimeout(function() {
        $.post("../Modules/Export_Mod/export_mod.php",
            {
                submit: 'removeExport',
                file: filename
            },
            function(data, status) {
                data;
            });
    }, 1000);
}
and my PHP:
//I'm creating the file successfully with this
...
$filename = $groupName."_group_export.txt";
$content = $header.$dataStr;
$strlength = strlen($content);
$create = fopen($filename, "w");
$write = fwrite($create, $content, $strlength);
$close = fclose($create);
But when I try to delete it a second (or more) later with this:
if (($_POST) && ($_POST['submit'] == 'removeExport')) {
    $file = $_POST['file'];
    unlink($file); // works when using an already-existing file in the same directory ... unlink('test.txt');
}
I get the error. The first thing I'm wondering is whether I'm approaching this the right way. If not, is there a better way to do it? The second thing I'm wondering is why I'm getting this error and what I need to change to make it work.
I would check the server permissions to see if the PHP script that is running is able to delete the files that are created. If you are on a Unix-based system, I would run the
ls -l /usr/var/
command on the directory that you are storing the files in, to see what permissions have been assigned to them.
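Beyond permissions, it's worth logging exactly what path PHP is actually trying to delete. A minimal debugging sketch (the trim() is an assumption - stray whitespace echoed back along with the filename is a common cause of this exact error):
<?php
// Debugging sketch for the removeExport branch: show exactly what PHP
// sees before deleting. var_dump reveals hidden whitespace/newlines
// that may have been echoed along with the filename.
if ($_POST && $_POST['submit'] == 'removeExport') {
    $file = trim($_POST['file']); // strip stray whitespace, if any
    var_dump($file, getcwd(), file_exists($file));
    if (file_exists($file)) {
        unlink($file);
    }
}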
I've done similar sorts of things where a file is created and then deleted some time later - in my case, 24 hours. But what I did was set up a cron job to find the files older than a certain age and delete them. That way I don't have to depend on the browser to post back a delete request.
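For example, a small cleanup script run from cron (the directory and filename pattern below are assumptions based on the question):
<?php
// cleanup.php - run periodically from cron, e.g.:
//   0 * * * * php /path/to/cleanup.php
// Deletes export files older than 24 hours.
$dir    = __DIR__;        // the directory the exports are written to
$maxAge = 24 * 60 * 60;   // 24 hours, in seconds

foreach (glob($dir . '/*_group_export.txt') as $file) {
    if (filemtime($file) < time() - $maxAge) {
        unlink($file);
    }
}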
Another option is to set up a custom session handler that deletes files associated with the session when the session closes. Though that may leave things lying about if the user doesn't officially log out.
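A minimal sketch of that idea (the naming convention tying export files to the session id is hypothetical):
<?php
// Sketch: extend the default file-based handler and hook destroy(),
// so a user's exports are removed when their session is destroyed.
class CleanupSessionHandler extends SessionHandler
{
    public function destroy($sessionId): bool
    {
        // Hypothetical convention: export files are prefixed with the session id.
        foreach (glob(__DIR__ . '/' . $sessionId . '_*.txt') as $file) {
            unlink($file);
        }
        return parent::destroy($sessionId);
    }
}

session_set_save_handler(new CleanupSessionHandler(), true);
session_start();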
Or you could keep the file data in TEXT/BLOB columns in MySQL (its equivalent of CLOBs) and then set up a query that purges them after a period.
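A sketch of that variant with PDO (the table and column names are assumptions):
<?php
// Sketch: store export contents in a table instead of on disk,
// then purge old rows periodically (e.g. from cron).
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// store an export
$stmt = $pdo->prepare(
    'INSERT INTO exports (name, content, created_at) VALUES (?, ?, NOW())'
);
$stmt->execute([$filename, $content]);

// purge anything older than a day
$pdo->exec('DELETE FROM exports WHERE created_at < NOW() - INTERVAL 1 DAY');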
Or if you feel like using Cassandra, you can set a TTL on a row and it will magically disappear.
Don't get locked into operating system files. You control what data your pages provide, so if you want to send back data and call it a "file", the user will never know that it is actually a database entry.
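In fact, for this particular use case you may not need to store anything at all - PHP can stream the generated text straight to the browser as a download. A minimal sketch, reusing $groupName, $header and $dataStr from the question:
<?php
// Sketch: send the export directly as a download, without ever
// writing it to disk. $groupName, $header and $dataStr as in the question.
$content = $header . $dataStr;

header('Content-Type: text/plain');
header('Content-Disposition: attachment; filename="' . $groupName . '_group_export.txt"');
header('Content-Length: ' . strlen($content));
echo $content;
exit;
The JavaScript side would then simply point window.open() at this script's URL with the group index as a query parameter, instead of posting and cleaning up afterwards.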
Related
So I have a script that organises an unformatted CSV file and presents an output.
One of the pieces of data we receive, and must return, is a link to an image stored on Google Drive. The problem with this is that Google Drive doesn't like to present you with a direct link to a file.
You can get the ID of a file (e.g. abc123DEFz) and view it online at https://drive.google.com/open?id=abc123DEFz. We need a direct link for another service to be able to process the file, not a redirect or some fancy website.
After poking around I discovered that https://drive.google.com/uc?export=view&id=abc123DEFz would redirect you directly to the file, and was what I somehow had to obtain inside the script.
The URL it redirected to didn't seem to have any relation to the ID, though, so I couldn't just swap the ID in; for each file I would have to resolve the uc?export link into the link that sends me directly to the file. (Where the redirect sent me: http://doc-0c-2s-docs.googleusercontent.com/docs/securesc/32-char-long-alphanumeric-thing/another-32-char-long-alphanumeric-thing/1234567891234/12345678901234567890/12345678901234567890/abc123DEFz?e=view&authuser=0&nonce=abcdefgh12345&user=12345678901234567890&hash=32-char-long-alphanumeric-hash)
No authentication is required to access the file, it is public.
My script works like this:
const csv = require('csv-parser'),
    fs = require('fs'),
    request = require('request');

let final = [],
    spuSet = [];

fs.createReadStream('data.csv')
    .pipe(csv())
    .on('data', (row) => {
        // >> data processing stuff, very boring so you don't care
        console.log(`
I'm now going to save this information and tell you about the row I'm processing
so you can see why something went wrong`);
        final.push(`[{"yes":"there is something here"},{"anditinvolves":${thatDataIJustGot}}]`);
        spuSet.push(`[{"morethings":123}]`);
    })
    .on('end', () => {
        console.log('CSV file successfully processed');
        console.log(`
COMPLETED! Check the output below and verify:
[${String(final).replace(/\r?\n|\r/g, " ")}]
COMPLETED! Check the output below and verify:
[${String(spuSet).replace(/\r?\n|\r/g, " ")}]`);
        // >> some more boring stuff where I upload the data somewhere and create a file containing said data
    });
I tried using request, but it takes a callback, so using the data outside of the function would be difficult, and wrapping everything inside the callback would remove my ability to push to the array.
The URL I get from the redirect would be included in the data I push to the array, for use later on.
I'm pretty bad at explaining crap, so if you have any questions please ask.
Thanks in advance for any help you can give.
Try requesting the webContentLink field with the files.get API call:
var webLink = drive.files.get({
    fileId: 'fileid',
    fields: 'webContentLink'
});
This will return the object:
{
    "webContentLink": "https://drive.google.com/a/google.com/uc?id=fileId&export=download"
}
Then you can use split() to remove &export=download from the link, as we don't want to download it.
As for fileId, you can get the IDs of your files by using the files.list API call, and then loop through the resulting array, calling the files.get from the first step.
My apologies if I misunderstood your issue.
In case you need help with authentication to the Google services, you can take a look at the Quickstart.
I'm building an extension that extracts DOM elements from a website (not mine) and automates a button click after filling in some inputs.
I made a local database from which the extension extracts values to fill the inputs. I got that working with an XMLHttpRequest that reads my PHP file from my content-script JS file.
Now I want to tell my PHP file that the button was clicked, so it updates the database with new values. I tried $.post(), but I can't read the value with $_POST[].
content-script.js
setTimeout(
    function() {
        var finsih_div_found = $(dialog_div_found).find("div").get(12);
        var finish_button_found = $(finsih_div_found).find("button").get(2);
        finish_button_found.dispatchEvent(new Event('click', {bubbles: true}));
        $.post('http://localhost:8012/extension-Oasis/php/getIntervenant.php', {button: 'Clicked'}, function(e){
            console.log("posted");
        });
    }, 2000);
PHP file
$status = $_POST["button"]; //Gives an error of 'Undefined index: button'.
Please note that the website is not mine; I have neither its back end nor its front end nor its APIs. I just want to automate a button click that's done regularly.
Have you called your PHP file directly? Then it is an HTTP GET request, and therefore $_POST['button'] throws an 'Undefined index: button' error. In other words, you can only access the button value via $_POST[] if you use your jQuery $.post to send it.
$.post('http://localhost:8012/extension-Oasis/php/getIntervenant.php', {button: 'Clicked'}, function(e){
    console.log(e); // log everything that is echoed in PHP file to browser's console
});
In your PHP file, add an echo $status; as the last line.
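Putting both suggestions together, the PHP side could look roughly like this (a sketch - the guard against direct GET access is an addition):
<?php
// getIntervenant.php - minimal sketch of the POST handler.
if (isset($_POST['button'])) {
    $status = $_POST['button']; // 'Clicked'
    // ... update the database with the new values here ...
    echo $status; // echoed back to the $.post callback above
} else {
    // Opening this file directly in the browser is a GET request,
    // so $_POST is empty - hence the 'Undefined index' notice.
    echo 'No POST data received';
}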
I am working on a PHP-based web app (that I didn't build).
I am running this ajax request:
$.ajax({
    type: 'POST',
    url: "/potato/ajax.php?module=test_module",
    dataType: 'json',
    async: true,
    data: {
        start_ts: that.start_date,
        stop_ts: that.end_date,
        submitted: true
    },
    beforeSend: function() {
        console.log('Start: ' + new Date().toLocaleString());
        // Show chart loading
        that.qwChart.showLoading({
            color: '#00b0f0',
            // text: that.returnNumWithPrecent(that.progress)
            text: that.qwChartProgress
        });
        // If data div isn't displayed
        if (!that.dataDisplayed) {
            // Show divs loading
            that.showMainDiv();
        } else {
            that.$qwTbody.slideUp('fast');
            that.$qwTbody.html('');
        }
    },
    complete: function() {},
    success: function(result) {
        console.log('End: ' + new Date().toLocaleString());
        // Clear timer
        clearInterval(timer);
        // Set progress bar to 100%
        that.setProgressBarTo100();
        // Show download button
        that.downloadBtn.style.display = 'inline-block';
        // Insert chart data
        that.insertChartData(result);
        // Insert table data
        that.insertTableData(result);
    }
});
And for some reason it gets my whole web app stuck until it returns the data. I know that ajax requests are asynchronous by default, but I added async: true anyway just to make sure.
If it is async, it should do the job without getting my web app stuck, am I right? What can the problem be? Is it a server-side problem? How do I debug this situation?
Edit: By saying "stuck" I mean that while I wait for the response after submitting the ajax call, refreshing the page or opening other pages in parallel (within my web app only) displays a white loading screen. Whenever the ajax call returns the data, the white page loads to the requested page.
Data is returned from the PHP file:
<?php
require_once("/www/common/api/db.php");

if (!empty($_POST['submitted'])) {
    // error_reporting(-1);
    // Users array:
    $users = get_qw_data($start_ts_to_date, $stop_ts_to_date);
    // Summary array:
    $summary = get_qw_summary($users);
    // QW score array:
    $qws = get_qw_score($users);

    // Generate CSV report files
    /* Remove old: */
    if (!is_file_dir_exist($customer))
        create_qw_directory($customer);
    /* Report #1: */ users_apps_google_macros_ma($users['users'], $customer);
    /* Report #2: */ usage_and_qw_summary($summary, $customer);
    /* Report #3: */ qw_score($qws, $customer);
    /* Zip files: */ zip_qw_files($customer);

    echo json_encode($qws);
}
PHP sessions are a prime candidate for other requests getting “stuck”: the session file gets write-locked, so as long as one running script instance has the session open, all others have to wait.
The solution is to call session_write_close as soon as possible.
A little extended explanation:
The default storage mechanism for session data is simply the file system. For every active session, PHP simply puts a file into the configured session directory, and writes the contents of $_SESSION to it, so that it can be read back from there on the next request that needs to access it.
Now if several PHP script instances tried to write changed session data to that file “simultaneously”, that would quite obviously have great conflict/error potential.
Therefore PHP puts a write lock on the session file as soon as one script instance accesses the session - everybody else (other requests to the same script, or to a different one that also uses the session) will have to wait until the first script is done with the session and the write lock is released again.
By default, that happens when the script finishes running. But if you have longer-running scripts, this can easily lead to the kind of “blocking” effects you are experiencing here. The solution is to explicitly tell PHP (via session_write_close): “I’m done with the session here, and I won’t write any new/changed data to it from this point on - so feel free to release the lock, so that the next script can start reading the session data.”
The important thing is that you only do this after your script is done manipulating any session data. You can still read from $_SESSION during the rest of the script - but you cannot write to it any more (anything like $_SESSION['foo'] = 'bar'; would simply no longer be persisted after you have released the session).
If the only purpose the session serves at this point (in this specific script) is to check user authentication, then you can close the session directly after that. The rest of the script can then run as long as it wants to, without blocking other scripts from accessing the same session any more.
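A minimal sketch of that pattern (the auth check and the long-running work below are placeholders, assuming the session is only needed to identify the user):
<?php
// Sketch: release the session write lock early in a long-running endpoint.
session_start();

// Use the session only for the auth check...
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit;
}

// ...then release the lock so parallel requests from the same
// browser are no longer blocked.
session_write_close();

// Long-running work goes here (reports, CSV generation, zipping, ...).
// $_SESSION can still be read below, but changes would no longer be persisted.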
This isn’t limited to AJAX requests - those are just one of the places where you usually notice it first, because otherwise you rarely have that many session-using requests running in “parallel”. But if you were to, e.g., open a long-running script in multiple browser tabs, you would notice the same effect there: in the first tab the script runs and does its business, whereas in the following tabs the requests “hang” for as long as the previous script instance holds the write lock on the session.
If the user navigates off the webpage, is it possible to execute a PHP script?
I know that JavaScript can be executed:
$(window).bind('beforeunload', function(){
    return 'DataTest';
});
Cookies might work, but I am not sure how a listener could track an expired cookie, and then delete the correct webpage.
A sample file system is like this:
user0814HIFA9032RHBFAP3RU.php
user9IB83BFI19Y298RYBFWOF.php
index.php
listener.py
data.txt
Typically, to create the website, PHP writes to data.txt and the Python listener picks up this change and creates the user[numbers] file. As you might expect, these files stack up over time and need to be deleted.
The HTTP protocol is stateless, so the server never really sees users "navigate away".
The browser requests a page, the server returns it, and the communication stops.
The server has no reliable way of knowing what the client will do with that page afterwards.
Disclaimer: I'm not sure, as Fox pointed out, that this is the right way to go in your case. I actually upvoted Fox's answer.
However, if you absolutely need to delete each page right after the user left it, use this:
$(window).bind('beforeunload', function() {
    $.ajax('yourscript.php?currentUser=0814HIFA9032RHBFAP3RU');
});
Then in yourscript.php, put something like the following:
<?php
// load your userId (for example, with $_SESSION, but do what you want here)
$actualUser = $_SESSION['userId'];

// check if the requested id to delete matches your actual current user's id
if (isset($_GET['currentUser']) && $_GET['currentUser'] == $actualUser)
{
    $user = $_GET['currentUser'];
    $file = 'user'.$user.'.php';
    unlink($file);
}