I am using this great jquery.fileDownload.js plugin to allow users to download self-generated CSV files from my website.
From what I understand, based on the PHP code from this site (suggested by the fileDownload creator), I need to set a cookie to indicate whether the download was successful or not:
try {
    $page = file_get_contents($filepath);
    echo $page;
    header('Set-Cookie: fileDownload=true; path=/');
} catch (Exception $e) {
    header('Set-Cookie: fileDownload=false; path=/');
}
It works as expected when fileDownload=true, but the fail event is never fired when fileDownload=false.
Here is my JavaScript code:
$.fileDownload('export.php')
    .done(function() {
        alert('Done!');
    })
    .fail(function() {
        alert('Fail!');
    });
And here is my export.php code:
header('Pragma: no-cache');
header('Cache-Control: no-cache, must-revalidate');
header('Content-type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename="file.csv"');
try {
    // Code for generating the CSV file
    header('Set-Cookie: fileDownload=true; path=/');
} catch (Exception $e) {
    header('Set-Cookie: fileDownload=false; path=/');
}
exit;
When the file is generated correctly I get the success message, but I never get the fail message, even when simulating an issue.
What am I missing? Thank you!
It looks like the cookie is only used to detect when a download has completed, not to signal failure. The plugin does, however, catch HTTP errors, so the solution is simply to send a 500 from the server:
try {
    $page = file_get_contents($filepath);
    // Send the cookie before any output, otherwise the header is ignored
    header('Set-Cookie: fileDownload=true; path=/');
    echo $page;
} catch (Exception $e) {
    // A non-2xx status makes the plugin fire its fail handler
    http_response_code(500);
}
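Applied to the export.php from the question, it would look roughly like this (a sketch; generate_csv() is a hypothetical stand-in for whatever produces the CSV, assumed to throw on error):

header('Pragma: no-cache');
header('Cache-Control: no-cache, must-revalidate');
try {
    $csv = generate_csv(); // hypothetical generator, assumed to throw on failure
    header('Content-Type: text/csv; charset=utf-8');
    header('Content-Disposition: attachment; filename="file.csv"');
    header('Set-Cookie: fileDownload=true; path=/'); // must precede any output
    echo $csv;
} catch (Exception $e) {
    http_response_code(500); // triggers the plugin's fail callback
}
exit;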
Related
I'm trying to hide the src URL for a PDF file in an iframe/embed, and I'm not sure how.
I tried all the previously existing answers, but none of them are working.
<?php
$url = $_GET['url'];
?>
<embed id="renderedPrint" style="height:calc(100% - 4px);width:calc(100% - 4px);padding:0;margin:0;border:0;"></embed>
<script>
$(document).ready(function() {
    var encryptedString = "assets/labels/" + "<?php echo $url; ?>" + ".pdf";
    $("#renderedPrint").attr("src", encodeURIComponent(encryptedString));
});
</script>
But no matter which method I use (an obfuscator, PHP openssl_encrypt/decrypt), the resulting URL is always visible.
I don't want users to find the iframe/embed URL. I want to make it difficult, or even impossible, to see the URL from the front end.
The point is that I don't want users to have direct access to the generated PDF file. They may copy the iframe src URL and send it to someone else. We can't stop them from downloading the PDF, but I don't want them to copy the source URL from the server.
Check this code. You should store the file path in the database and serve the file through a proxy script, so the real path never reaches the browser:
<?php
// get the id, then look up the real file path in the database
$id = $_REQUEST['id'];
try {
    $conn = new PDO("pgsql:host=$host;port=5432;dbname=$dbname", $username, $password);
    // set the PDO error mode to exception
    $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    //echo "Connected successfully";
} catch (PDOException $e) {
    echo "Connection failed: " . $e->getMessage();
}
$stmt = $conn->prepare("SELECT url FROM mytable WHERE id=? LIMIT 1");
$stmt->execute([$id]);
$row = $stmt->fetch();
// the path of the file on the server
$path = $row['url'];
$filename = basename($path);
if (file_exists($path) && is_readable($path)) {
    // get the file size and send the HTTP headers
    $size = filesize($path);
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . $size);
    header('Content-Disposition: attachment; filename=' . $filename);
    header('Content-Transfer-Encoding: binary');
    // open the file in binary read-only mode,
    // suppressing the PHP warning if it can't be opened
    $file = @fopen($path, 'rb');
    if ($file) {
        // stream the file and exit the script when complete
        fpassthru($file);
        exit;
    } else {
        echo 'could not open the file';
    }
} else {
    echo 'check that the file exists and is readable';
}
?>
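With a proxy like this, the front end only ever sees an opaque id instead of the real path. A sketch of the client side, assuming the script above is saved as download.php and recordId is whatever id your page knows for the label (both names are placeholders); note that for the PDF to render inside the embed rather than download, you would also change Content-Disposition: attachment to inline and Content-Type to application/pdf:

// point the embed at the proxy, never at the file itself
$("#renderedPrint").attr("src", "download.php?id=" + encodeURIComponent(recordId));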
I need to export data from a database through a webapp. I call the export function with an ajax POST. The file is created successfully on the server, but after that it does not download; somehow the download function is not called, or it is not working. I don't know the reason, and I don't get any error messages.
Here is my jQuery function that sets the POST data for the PHP:
$(document).ready(function(){
    $('#exportCSV').click(function(){
        var ajaxurl = 'admin.php?page=site&edit=false',
            data = {'export': 'csv'};
        $.post(ajaxurl, data, function (response) {
        });
    });
});
A switch that runs the selected file-export function, then tries to download it:
switch ($_POST['export']) {
    case 'csv':
        $query = $_SESSION['query'];
        $FName = exportToCSV($conn, $query, $fName);
        download($FName);
        break;
}
And the download function that is not called, or is not working correctly:
function download($file){
    header("Cache-Control: public");
    header("Content-Description: File Transfer");
    header("Content-Disposition: attachment; filename=$file");
    header("Content-Length: " . filesize($file));
    header("Content-Type: application/csv");
    header("Content-Transfer-Encoding: binary");
    readfile($file);
}
And this is how this "site" is called; everything above is in the "somesite.php" file:
if (isset($_GET['page'])) {
    switch ($_GET['page']) {
        case 'site':
            include("somesite.php");
            break;
    }
}
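For what it's worth, a response to $.post() is handed to the JavaScript callback, not to the browser's download machinery, so the attachment headers sent by download() are simply discarded. A common workaround is to navigate to a download URL once the file exists. A sketch (it assumes the PHP side echoes the generated file name and that a download=... parameter is wired up to serve it; both are assumptions, not existing parts of this code):

$.post(ajaxurl, data, function (response) {
    // response is assumed to contain the generated file name
    window.location = 'admin.php?page=site&download=' + encodeURIComponent(response);
});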
I am writing code where there are two user types: a normal user and an admin user.
A normal user submits data to the admin users, and both the admins (there is more than one admin in the database) and the normal user should receive an email about the submission.
The submission and retrieval of the data works fine. But for the email part I reused the code from my registration part, which works, in the submission code; the result is that the $mail part does not work.
Both my registration and submission files are in the same folder (the paths should be working fine).
The logic also seems fine. Maybe I forgot or missed something? I could use some help checking my code.
...//
if ($conn->query($sqlDeleteSelected) === TRUE)
{
    require_once "../assets/inc/phpmailer/PHPMailerAutoload.php";
    $mail = new PHPMailer(true);
    try
    {
        $sqleMail = "SELECT * FROM users_details WHERE users_Admin_University_Name = '$basket_UniCourse_UniName_1'";
        $resultSqleMail = $conn->query($sqleMail);
        while ($dataResultSqlMail = mysqli_fetch_assoc($resultSqleMail))
        {
            $mail->AddAddress($dataResultSqlMail['users_Email']);
        }
        $mail->From = "myemail#gmail.com";
        $mail->FromName = "MyName";
        $mail->isHTML(true);
        $mail->Subject = 'Application Registered';
        $mail->Body = 'Congratulations!';
        $mail->Send();
        if ($mail->Send())
        {
            // echo "Message has been sent successfully";
            ?>
            <script type="text/javascript">
                alert("success");
            </script>
            <?php
        }
        else
        {
            ?>
            <script type="text/javascript">
                alert("<?php echo $mail->ErrorInfo; ?>");
            </script>
            <?php
        }
    }
    catch (phpmailerException $e)
    {
        echo $e->errorMessage();
    }
    catch (Exception $e)
    {
        echo $e->getMessage();
    }
}
//..
Thank You So Much.
I'm not sure if I understand you 100% correctly, but I assume that the JavaScript "success" alert is not executed? This is because your if condition is not used properly. You are actually trying to send the email twice:
$mail->Send();
if($mail->Send())
{...
A better way to see if the email was sent successfully is to use a try/catch (this only works when PHPMailer is constructed with true, as in your code, so that send() throws on failure):
try {
    $mail->send();
    // print success message
} catch (Exception $e) {
    // print error message
}
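Applied to your code, that would look something like this (a minimal sketch for PHPMailer 5.x, which the PHPMailerAutoload.php in your question implies):

try {
    $mail->Send(); // call it once; a second call would send a duplicate email
    echo '<script type="text/javascript">alert("success");</script>';
} catch (phpmailerException $e) {
    // errorMessage() returns PHPMailer's formatted error string
    echo $e->errorMessage();
}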
Edit: I have a solution of sorts, but I do not understand why ajax is not working. Please see the edit below.
I want to save a txt file generated in javascript to the client machine.
I want the user to remain on the same webpage.
I am doing it via ajax with a call to a php page.
I want it to work in all major pc browsers; Chrome, Firefox, Internet Explorer and Edge.
Here is what I have so far:
htmlsave.js
var dataARR = {
    SaveFile: "This is the content.\nLine2.\nLine3"
};
var dataJSON = JSON.stringify(dataARR);
$.ajax({
    type: "POST",
    url: "htmlsave.php",
    async: true,
    cache: false,
    crossDomain: false,
    data: { myJson: dataJSON },
    success: function (data) {
        alert("success");
    },
    error: function (XMLHttpRequest, textStatus, errorThrown) {
        alert("failure");
    }
});
htmlsave.php
<?php
if (!isset($_POST) || !isset($_POST['myJson'])) {
    exit(0);
}
$data = json_decode($_POST['myJson'], true);
if (!array_key_exists('SaveFile', $data) || !isset($data['SaveFile'])) {
    exit(0);
}
$contents = strval($data['SaveFile']);
header("Pragma: public"); // required
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: private", false); // required for certain browsers
header("Content-Type: text/plain");
header("Content-Disposition: attachment; filename=\"mysavefile.txt\";");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . strlen($contents));
ob_clean();
flush();
echo $contents;
?>
What comes back (i.e. the data argument in success) is "This is the content.\nLine2.\nLine3".
There is no prompt to save the file with content "This is the content.\nLine2.\nLine3" as I had hoped.
Edit: A solution (but not a great one)
I have done a bit of tinkering and have something that works, in a fashion. Instead of calling through ajax I write in my javascript
document.location = "htmlsave.php?myJson=Line1Line2";
and in my PHP I have altered it to a GET with
if (!isset($_GET) || !isset($_GET['myJson'])) {
    exit(0);
}
$contents = $_GET['myJson'];
This works. It will be OK for small files, but I wanted to handle reasonably large text files, possibly zipped using LZW encoding. Does anyone know why the Ajax approach is not working?
Make sure your htmlsave.php starts with
<?php
If it does, check whether your webserver is configured to pass .php files to your PHP implementation (fpm, mod_php, etc.).
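It may also be worth noting why the Ajax route never prompts a save: the response to an XMLHttpRequest is delivered to your success callback, not to the browser's download machinery, so the Content-Disposition header is ignored and you just see the text in data. If staying on the page is the goal, one option is to build the file entirely client-side with no PHP round-trip. A sketch (works in modern browsers; older IE needs msSaveBlob instead of the download attribute):

var contents = "This is the content.\nLine2.\nLine3";
var blob = new Blob([contents], { type: "text/plain" });
var link = document.createElement("a");
link.href = URL.createObjectURL(blob);
link.download = "mysavefile.txt"; // suggested file name for the save dialog
document.body.appendChild(link);
link.click(); // triggers the save prompt without leaving the page
document.body.removeChild(link);
URL.revokeObjectURL(link.href);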
I'm trying to implement a progress bar for importing records into a database.
The import is started using jQuery's $.post(...) sent to a php script on the server.
I tried several approaches:
start the import and write the progress to a session variable, then poll that variable with a second call via EventSource
The PHP for the import is something like:
foreach ($importProduct as $ip) {
    $_SESSION['importedProducts'] += 1;
    // ... do the import-stuff
}
And then fetch the import-progress using EventSource
if (typeof(EventSource) === "undefined") {
    alert('browser doesn\'t support EventSource');
} else {
    console.log('fetching stream');
    // create the stream only once we know EventSource is supported
    var jsonStream = new EventSource('eventSource.php');
    jsonStream.onmessage = function (e) {
        console.log('stream: ' + e.data);
        $('#eventReturn').html(e.data);
        //var message = JSON.parse(e.data);
        // handle message
    };
}
and in the PHP script:
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
while (1) {
    echo 'data: {"imported":"'.$_SESSION['importedProducts'].'","total":"'.$_SESSION['totalProductsToImport'].'"}';
    echo "\n\n";
    ob_flush();
    flush();
    sleep(1);
}
Which obviously doesn't work, since the session is not updated between calls.
Writing the progress to a file and reading it out every second seems like a bit of an overhead...
Another thing I tried is a JS function that calls itself repeatedly and attempts to get the progress from a second script, but it hangs until the import script has finished:
function uploadProgress(){
    // Fetch the latest data
    $.get('progress.php', function(data){
        console.log(data);
    });
    setTimeout(uploadProgress, 5000);
}
Any ideas?
Notes: I'm starting the session on every call (session_start). I'm aware that the «while(1)» creates an endless loop... :)
I found a solution.
The trick was to open and close the session in between calls.
So in the import-script:
foreach ($importProduct as $ip) {
    session_start();
    $_SESSION['importedProducts'] += 1;
    session_write_close();
    // ... do the import-stuff
}
and in the progress-script:
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
while (1) {
    session_start();
    echo 'data: {"imported":"'.$_SESSION['importedProducts'].'","total":"'.$_SESSION['totalProductsToImport'].'"}';
    echo "\n\n";
    ob_flush();
    flush();
    session_write_close();
    sleep(1);
}
Now the session gets updated continuously and I can use the EventSource approach.
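One loose end worth noting: the progress script loops forever, so the client should close the stream once the import is done, otherwise the browser keeps the connection (and the server keeps the loop) alive. A small sketch, reusing the jsonStream handler from above and the JSON format the script emits:

jsonStream.onmessage = function (e) {
    var progress = JSON.parse(e.data);
    $('#eventReturn').html(progress.imported + ' / ' + progress.total);
    if (progress.imported === progress.total) {
        jsonStream.close(); // stop listening once all products are imported
    }
};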