I am in a situation where I have to implement downloading of large files (up to 4GB) from a web server (Apache 2.4.4) over HTTP. I have tried several approaches, but the best solution looks to be the use of the X-SendFile module.
As I offer a progress bar for file uploads, I would like to have the same feature for file downloads. So here are my questions:
Is there any way, including a workaround, to monitor file download progress?
Is there any way, including a workaround, to calculate the file download transfer speed?
Is there a better way to provide efficient file downloads from a web server than the X-SendFile module?
Is there a better file download option in general that would allow me to monitor download progress? It can be a client-side (JavaScript) or server-side (PHP) solution. Is there any particular web server that allows this?
Currently I use:
Apache 2.4.4
Ubuntu
Many thanks.
2 ideas (not verified):
First:
Instead of placing regular links to the files (that you want to download) on your page, place links like .../download.php, which may look something like this:
<?php
// download.php file
session_start(); // if needed

// of course add some error handling and sanitize the input
$filename = basename($_GET['filename']);
$filepath = '/path/to/files/' . $filename; // e.g. 'c:/php/php.ini'

header('Content-Type: application/octet-stream'); // use any MIME you want here
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Pragma: no-cache');

// do not use file_get_contents as you've said files are up to 4GB - read in chunks instead
$handle = fopen($filepath, 'rb');
while ($chunk = fread($handle, 8192)) { // chunk size may depend on your file size
    echo $chunk;
    flush();
    // write progress info to the DB, or a session variable, in order to update the progress bar
}
fclose($handle);
?>
This way you can keep an eye on the download process. In the meantime you can write progress info to the DB or a session variable, and update the progress bar by reading the status from the DB/session variable, using AJAX (of course) to poll a script that reads the progress info.
That is very simplified, but I think it might work the way you want.
Second:
Apache 2.4 has the Lua language built in:
mod_lua
Creating hooks and scripts with mod_lua
I bet you could write a Lua Apache handler that monitors your download: send progress to the DB and update the progress bar using PHP/AJAX, taking the progress info from the DB.
Similarly, there are modules for Perl and even Python (but not for Windows).
I see the main problem in this:
In a PHP + Apache solution, output buffering may happen in several places:
Browser <= 1 => Apache <= 2 => PHP handler <= 3 => PHP interpreter process
You need to control the first buffer, but that is impossible directly from PHP.
Possible solutions:
1) You can write your own mini daemon whose primary function is only to send files, and run it on a port other than 80 (8880, for example), processing the file downloads and monitoring the output buffer from there.
Your output buffer will then be the only one, and you can control it:
Browser <= 1 => PHP interpreter process
2) You can also take mod_lua and control the output buffers directly from Apache.
3) You can also take nginx and control the nginx output buffers using the built-in Perl (it is stable).
4) Try to use the PHP built-in web server and control the PHP output buffer directly. I can't say anything about how stable it is, sorry. But you can try. ;) A minimal sketch of this option follows below.
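For option 4, a minimal sketch, assuming a router script served by the PHP built-in web server (the file name, path and chunk size are placeholders of mine):
<?php
// router.php - hypothetical router script for option 4
// Launch with: php -S 127.0.0.1:8880 router.php
// With the built-in server there is only one buffer between the PHP process
// and the browser, so flushing here really pushes bytes to the client.
$file = '/path/to/large.iso'; // assumption: point this at your file

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($file)); // needs 64-bit PHP for files > 2GB
header('Content-Disposition: attachment; filename="' . basename($file) . '"');

$handle = fopen($file, 'rb');
$sent = 0;
while (!feof($handle)) {
    $chunk = fread($handle, 8192);
    echo $chunk;
    flush();                  // push this chunk out to the browser
    $sent += strlen($chunk);
    // record $sent (file/DB) so a separate script can report progress
}
fclose($handle);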
I think that nginx + PHP + built-in Perl is the more stable and powerful solution.
But you can choose, and maybe use another solution not on this list. I will follow this topic and await your final solution with interest.
Reading from and writing to the database at short intervals kills performance.
I would suggest using sessions instead (incrementing the value of the sent data in the loop), which you can then safely read from a separate PHP file; that file can return the data as JSON to be used by your JavaScript function/plugin.
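A minimal sketch of that idea (the file and variable names are my own; note one pitfall: the download script holds the session lock while it runs, so release it with session_write_close() and store the counter somewhere the polling script can read without blocking, e.g. a per-session temp file):
<?php
// --- in download.php, before streaming (sketch) ---
session_start();
$progressFile = sys_get_temp_dir() . '/progress_' . session_id(); // per-user key
session_write_close(); // release the session lock so polling is not blocked

// --- inside the streaming loop ---
// $bytesSent += strlen($chunk);
// file_put_contents($progressFile, $bytesSent);

// --- progress.php (hypothetical name), polled by JavaScript via AJAX ---
session_start();
$progressFile = sys_get_temp_dir() . '/progress_' . session_id();
session_write_close();
header('Content-Type: application/json');
echo json_encode(['bytes_sent' => (int) @file_get_contents($progressFile)]);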
Let's say I have 1000 links like
https://test123.sharepoint.com/teams/TechnicalRecordsManagement/Workshop/2014/578090.pdf
Is it possible to make the download automatic, without user interaction?
Some people say browsers no longer let you download files without user interaction, due to security issues.
I tried the simple approaches from here:
Auto download file from a link using javascript
"Let's say I have 1000 links like ..." Assuming you want to automate the downloading of such files for your own personal use, you can do so with file_get_contents(), without any reliance on an <a> tag. – GetSet
@GetSet Yes, these are my own personal links. I know a way using PHP cURL. – Tri
@Tri I will post a solution without cURL shortly. – GetSet
@Tri The following grabs the contents of the remote file and saves it to your local drive.
// Fetch the remote file into memory, then write it to the local disk
$fileToDownload = "https://dmv.ny.gov/brochure/mv21.pdf";
$contents = file_get_contents($fileToDownload);
$f = fopen("mv21.pdf", 'w');
fwrite($f, $contents);
fclose($f);
That's the basic concept.
Edited to add context.
So I want to write a REST API in PHP, returning JSON, for consumption on the iPhone but also by a lot of websites and devices. I have the code below which, when accessed via a GET request, returns a file like:
1mdi2o3.part
How do I return something like: users.json
$db->setQuery("SELECT * FROM users");
$db->query() or die($queryError);
$numRows = $db->getNumRows();
$row = $db->loadObjectList();

// PRINT JSON FILE
header("Content-type: application/json");
$json_output = array();
for ($i = 0; $i < $numRows; $i++) {
    $json_output[$i] = $row[$i];
}
$MYjson_output = json_encode($json_output);
echo $MYjson_output;
Not entirely sure what your goal is, but here are 3-4 solutions that might work:
Common Solutions
Rewrite
This is probably the most conventional way of getting clean URIs for your API. If you want the user part of user.json to be dynamic, you can use mod_rewrite (again, assuming Apache) to rewrite your URLs to a PHP handler script. If you are going the conventional RESTful-style URL route, you will probably want to use this anyway, to achieve clean, separated URLs.
# For grabbing all users
RewriteEngine on
RewriteRule ^users\.json rest.php [L]
# For grabbing one particular user
RewriteEngine on
RewriteRule ^([a-z0-9]+)\.json rest.php?user=$1 [L]
Where rest.php is your PHP handler script.
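A minimal sketch of what rest.php might then do (the database layer is elided; the response shapes are assumptions of mine):
<?php
// rest.php - sketch: serve users.json or a single user, based on the
// query string set by the rewrite rules above.
header('Content-Type: application/json');

if (isset($_GET['user'])) {
    // /user-id.json was rewritten to rest.php?user=user-id
    $userId = $_GET['user'];
    echo json_encode(array('id' => $userId)); // look the user up here instead
} else {
    // /users.json was rewritten to rest.php
    echo json_encode(array(array('id' => 'example-user'))); // fetch all users here
}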
URL Rewriting without mod-rewrite
If you don't want to use mod_rewrite, you can also do something like
example.com/rest.php/users.json
example.com/rest.php/user-id.json
or even
example.com/rest.php/user-id
example.com/rest.php/user/user-id
Where rest.php is your PHP handler script. You can grab the user-id from the URL (or URI, if we're talking in RESTful terms) using the $_SERVER superglobal, via $_SERVER['REQUEST_URI'].
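For example, a minimal sketch of pulling the user-id out of the URI inside rest.php (the path layout is an assumption based on the examples above):
<?php
// rest.php - sketch: extract the user-id from a URI like
// /rest.php/user/user-id or /rest.php/user-id.json
$uri = $_SERVER['REQUEST_URI'];              // e.g. "/rest.php/user/42"
$path = parse_url($uri, PHP_URL_PATH);       // strip any query string
$parts = explode('/', trim($path, '/'));     // ["rest.php", "user", "42"]
$userId = basename(end($parts), '.json');    // works for "42" or "42.json"
echo $userId;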
Other solutions
Changing download name via Content-Disposition:
I believe you want to add the Content-Disposition header...
<?php
header('Content-Disposition: attachment; filename="users.json"');
?>
This will force the file to be downloaded as users.json. This usually isn't the behavior expected from a REST API, but from the way your question was worded, I figured I'd throw it out there.
AddHandler (or AddType)
Assuming you're running an Apache server, you can also just use the AddHandler directive to make files with the .json extension be treated as PHP.
AddHandler application/x-httpd-php .json
Warning: other plain .json files will then be treated as PHP, so you'd probably want to set this in a .htaccess file in the directory with the PHP script. This works fine for this one URI, but it is not ideal if you are exposing many URIs.
Welcome to SO. Have you considered using the Zend framework for this application? I've written about this topic before, and if you do use Zend I could probably be of additional help. It would certainly help get you past the basics.
HTH,
-aj
The problem is this line:
header("Content-type: application/json");
I know this looks like the right way to do it (after all, application/json is the official MIME type for JSON content), but it causes most browsers to present you with a file download dialog when you just want to see the text. You can use the text/plain content type to avoid this. Note that your AJAX/iPhone/... app probably doesn't care about the content type, so your API will work in both cases.
Also see this blog post which provides some more context.
There's a question here (on Stack Overflow) that asks about streaming large files to the user in chunks. Code referred to in an answer to that question, originally from here, shows how. I'm looking for how to just save the file to the server instead.
Notes:
The script here aims to download a file to the server, given a URL to download the file from (this process is also called remote upload).
My hosting provider does not allow me to change the time limit, so downloads using this script take too long.
I am able to save the file contents to the server using file_put_contents("MyFile.iso", $buffer, FILE_APPEND), but not the whole file, mostly because the script runs for a long time and times out.
I think a solution may work like so: a JavaScript method requests PHP actions in the background via AJAX, multiple times. The first background request tells PHP to download the first 100MB of the file, the second request tells PHP to download the second 100MB, and so on, until PHP tells the JavaScript that we have reached the end of the file. So instead of downloading the file in one long-running process, we download it in multiple short-running ones.
A good start: How to partially download a remote file with cURL? (I will find time soon to develop the whole solution. Any help will be appreciated.)
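To illustrate the chunked idea, here is a minimal sketch of fetching one 100MB slice of the remote file with cURL's CURLOPT_RANGE and appending it to the local copy (the function name and offsets are mine; the remote server must support HTTP range requests):
<?php
// Sketch: fetch bytes [$start, $end] of the remote file and append them locally.
// Each AJAX-triggered request would call this with the next range.
function download_range($url, $start, $end, $localFile) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RANGE, $start . '-' . $end); // e.g. "0-104857599"
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $data = curl_exec($ch);
    curl_close($ch);
    file_put_contents($localFile, $data, FILE_APPEND);
    return strlen($data); // bytes received; 0 means we are past the end of the file
}

// First 100MB chunk
download_range('http://releases.ubuntu.com/16.04.2/ubuntu-16.04.2-desktop-amd64.iso',
               0, 100 * 1024 * 1024 - 1, 'MyFile.iso');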
Below is the mentioned code that I need to start with in order to save/remote-upload the file to the server (edited: it now saves the file to the server, but not the whole file, mostly because the script runs for too long):
<?php
define('CHUNK_SIZE', 1024*1024); // size (in bytes) of each chunk

// Read a remote file and save its content to the server chunk by chunk
function readfile_chunked($fileurl, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($fileurl, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        file_put_contents("MyFile.iso", $buffer, FILE_APPEND);
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered, like readfile() does
    }
    return $status;
}

$fileurl = 'http://releases.ubuntu.com/16.04.2/ubuntu-16.04.2-desktop-amd64.iso';
$mimetype = 'mime/type'; // placeholder; irrelevant while the script only saves to the server
header('Content-Type: '.$mimetype);
readfile_chunked($fileurl);
?>
I know this question is old, but I hope my answer helps others.
The question is not actually 'Download Large File To Server'; it should be 'Upload Large File To Server' instead.
To upload a large file to a server, there are many ways. Here is how I do it using FileReader and XMLHttpRequest:
https://stackoverflow.com/a/49808460/6348813
The idea is, you need to read the file as binary (readAsBinaryString()) or as an ArrayBuffer (readAsArrayBuffer()); then you can stream the file to the server. In PHP, to receive the streamed binary or ArrayBuffer data, simply read from php://input.
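On the PHP side, a minimal sketch of such a receiver (the script and target file names are my own placeholders):
<?php
// upload.php - sketch: receive a raw binary stream sent via XMLHttpRequest
// and write it to disk without loading the whole body into memory.
$in  = fopen('php://input', 'rb');
$out = fopen('upload.bin', 'wb'); // placeholder target name
while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}
fclose($in);
fclose($out);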
You need to do this over HTTPS, or your connection will be open to attackers.
As another method, you may try uploading the file using the slice() method. This method performs well, as it supports pausing, closing, and resuming the upload whenever the connection is unstable.
In my experience, the readAsArrayBuffer() method seems faster than the readAsBinaryString() and slice() methods, even when the connection is under 100kbps.
All the above methods share a common advantage: you don't need to change your PHP upload limit.
Note:
DO NOT USE THE readAsBinaryString() METHOD, AS IT HAS BEEN DEPRECATED IN MOZILLA AND DOES NOT SUPPORT FILES LARGER THAN 300MB.
This feature is non-standard and is not on a standards track. Do not use it on production sites facing the Web: it will not work for every user. There may also be large incompatibilities between implementations and the behavior may change in the future.
Original Article
https://developer.mozilla.org/en-US/docs/Web/API/FileReader
https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest
The organization that I'm working with uses Munin as its monitoring tool. We've written a service that streams realtime data, which can be displayed by a JavaScript component. Preferably, the operations team would like to show these metrics in Munin, to avoid having another system for realtime monitoring.
Is it possible and feasible to use Munin for displaying realtime data using JavaScript? Preferably I'd like to create this as a plugin, but we're fine with modifying some Munin HTML page or similar as well, and just adding the JavaScript component to the page.
Specifying alerts/alarms when certain properties of the streams go above a certain threshold would be nice as well. Given that (1) is feasible, one idea to integrate this would be to write an external app that reads the realtime stream and identifies when an alert should be triggered. When an error is detected, the external app could write this to a file on disk. The idea is then to write a Munin plugin that reads from this file and triggers alerts/alarms from within Munin, if applicable.
Munin "polls" machines for data every five minutes. In order to provice your streaming data points to the central munin server, you need to configure a munin node on the server which streams data, and write a shell script (probably involving curl and awk) to fetch the current data.
Creating a munin plugin on a node is really simple, it's just a shell script which outputs it's data in readable form to standard out.
Setting alarms is easy, for the values you return you need to set warn and critical values in the munin plugin config output. Please keep in mind that these warnings are also on a 5 minute schedule so it's not "immediate".
Read up on how munin works at http://guide.munin-monitoring.org/en/latest/
Example of a simple munin plugin (stripped version of the system load plugin):
#!/bin/sh
. $MUNIN_LIBDIR/plugins/plugin.sh
if [ "$1" = "autoconf" ]; then
echo yes
exit 0
fi
if [ "$1" = "config" ]; then
echo 'graph_title Load average'
echo 'graph_args --base 1000 -l 0'
echo 'graph_vlabel load'
echo 'graph_scale no'
echo 'graph_category system'
echo 'load.label load'
print_warning load
print_critical load
echo 'graph_info The load average of the machine describes how many processes are in the run-queue (scheduled to run "immediately").'
echo 'load.info 5 minute load average'
exit 0
fi
echo -n "load.value "
cut -f2 -d' ' < /proc/loadavg
Save the data you want to chart in the database. Write another script to make a chart from that data, and simply update your chart with AJAX requests.
The PHP code which makes the chart can use the GD library, or you can output SVG XML, which I recommend more.
Then, as time goes on, get the result of that script by just requesting it via AJAX.
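As a rough illustration of the GD approach (the sample data and sizes below are made up), a chart script could look something like this:
<?php
// chart.php - sketch: draw a simple line chart from data points with GD.
// In practice $points would come from your database query.
$points = array(10, 25, 18, 40, 32, 55); // made-up sample data

$w = 300; $h = 150;
$img = imagecreatetruecolor($w, $h);
$bg = imagecolorallocate($img, 255, 255, 255);
$fg = imagecolorallocate($img, 0, 102, 204);
imagefill($img, 0, 0, $bg);

$max = max($points);
$step = $w / (count($points) - 1);
for ($i = 1; $i < count($points); $i++) {
    imageline($img,
        (int)(($i - 1) * $step), (int)($h - $points[$i - 1] / $max * $h),
        (int)($i * $step), (int)($h - $points[$i] / $max * $h),
        $fg);
}

header('Content-Type: image/png');
imagepng($img);
imagedestroy($img);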
That's all I know.
I am trying to force the download of a remote image (a URL with the https protocol). My attempts to do it on the client failed (I cannot use HTML5, thanks to IE8), so I'm trying to do it server side (PHP).
The only way I found how to do it, thanks to the following answer, is using cURL. Other ways, like readfile($file_url), always return an empty file. The problem with using cURL is that the download starts only after the image has been downloaded to the server, which can take some time. Can we start loading directly from the source?
But if somebody knows a way to download, on the client side, an image that is already on the page, that would be great!
You can use fopen('http://server/img.jpg') and fread():
$handle = fopen("http://www.example.com/image.jpg", "rb");
while (!feof($handle)) {
echo fread($handle, 8192);
}
fclose($handle);
plus headers.
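For example, the headers might look like this (the filename is a placeholder of mine); they must be sent before the first echo:
<?php
// Sketch: send download headers, then stream the remote image through.
// Note: fopen() over http(s) requires allow_url_fopen (and OpenSSL for https).
header('Content-Type: image/jpeg');
header('Content-Disposition: attachment; filename="image.jpg"'); // placeholder name

$handle = fopen("https://www.example.com/image.jpg", "rb");
while (!feof($handle)) {
    echo fread($handle, 8192);
}
fclose($handle);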