Force a remote image to download immediately without HTML5 - javascript

I'm trying to force a download of a remote image (a URL with the https protocol). My attempts to do it on the client failed (I cannot use HTML5, thanks to IE8), so I'm trying to do it server side (PHP).
The only way I found to do it, thanks to the following answer, is using cURL. Other ways, like readfile($file_url), always return an empty file. The problem with using cURL is that the download only starts after the image has been fully downloaded to the server, which can take some time. Can we start streaming directly from the source?
But if somebody knows a way to download an image that is already on the page on the client side, that would be great!

You can use fopen('http://server/img.jpg') and fread(), plus the download headers:
// Send the download headers first, then stream the remote image through.
header('Content-Type: image/jpeg');
header('Content-Disposition: attachment; filename="image.jpg"');

$handle = fopen("http://www.example.com/image.jpg", "rb");
while (!feof($handle)) {
    echo fread($handle, 8192);
}
fclose($handle);
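If you must use cURL (for example, when URL fopen wrappers are disabled), you can still avoid buffering the whole image on the server first: a write callback forwards each chunk to the client as it arrives. A minimal sketch, assuming a hypothetical image URL:

<?php
// Stream a remote image through cURL as it arrives,
// instead of downloading the whole file to the server first.
$url = 'https://www.example.com/image.jpg'; // hypothetical source URL

header('Content-Type: image/jpeg');
header('Content-Disposition: attachment; filename="image.jpg"');

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $data) {
    echo $data;           // forward each chunk to the client immediately
    flush();
    return strlen($data); // tell cURL the chunk was consumed
});
curl_exec($ch);
curl_close($ch);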

Related

Creating a file that returns an image from an external server

As the title says, I want to create a file that returns an image from an external server without downloading / storing it. I tried something with PHP headers, but it didn't work as intended (I can't find the code right now, sorry). For example, say we have an image_displayer.php file:
image_displayer.php?url=https://example.com/images/epic_image.png
This URL should return the "https://example.com/images/epic_image.png" image. The image will be displayed from an HTML file on the same server. I couldn't get it working with PHP headers (it was causing problems in the HTML file).
It doesn't have to be in PHP; it doesn't matter to me.
Thanks in advance.
You could try something like this:
// Fetch the remote image and re-encode it as PNG with GD
// (requires the GD extension; IMAGE_URL is the remote image URL).
$image_data = file_get_contents(IMAGE_URL);
$image_resource = imagecreatefromstring($image_data);
header('Content-Type: image/png');
imagepng($image_resource);
imagedestroy($image_resource);
If you just want to quickly show the image without handling it, you could simply do a 301 redirect to it.
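If you don't need GD's pixel handling at all, a plain passthrough is lighter. A sketch, assuming the url parameter is validated against a whitelist first (proxying arbitrary user-supplied URLs invites SSRF):

<?php
// image_displayer.php sketch: stream the remote bytes through unchanged.
// Assumes $_GET['url'] has already been checked against a whitelist.
$url = $_GET['url'] ?? '';

header('Content-Type: image/png'); // adjust to the source image's real type
readfile($url);                    // requires allow_url_fopen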

Websockets file upload is corrupted (or wrongly encoded) - PHP and JS

I'm working on WebSocket scripts in PHP and JS and have an issue with saving a file (an image).
Sending from JS:
$('#frmChatFile').on("submit", function(event){
    event.preventDefault();
    var file = document.querySelector('input[type="file"]').files[0];
    websocket.send(file, Blob);
});
Saving in PHP:
socket_recv($newSocketArrayResource, $socketData, 61440, 0);
file_put_contents('test.jpg', $socketData);
It saves the file, but it is corrupted or wrongly encoded...
The uploaded picture is slightly smaller (by a few bytes), and there is nothing readable in a hex editor (while in the original I can read the header and so on).
What am I missing? Any flag or something? Thank you very much :)
(fopen (w/wb), fwrite, fclose behaves exactly the same.)
Most likely your data/image is encoded in a frame as defined by RFC 6455, and you are reading that raw frame in PHP with socket_recv. In fact, all data sent from JS via WebSocket is always encoded in frames.
You have to decode these frames in order to get your data back.
Have a look at https://github.com/napengam/phpWebSocketServer/blob/master/server/RFC6455.php
There you will find the decode function.
Good luck.
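For context, a minimal sketch of what that decoding involves, assuming a single unfragmented, masked binary frame that arrived in one read (the linked RFC6455.php handles fragmentation, control frames, and the general case):

<?php
// Sketch: unmask one complete RFC 6455 frame received from a browser client.
function decodeFrame(string $frame): string
{
    $len = ord($frame[1]) & 0x7F;        // low 7 bits of byte 1: payload length
    if ($len === 126) {                  // a 16-bit extended length follows
        $maskOffset = 4;
    } elseif ($len === 127) {            // a 64-bit extended length follows
        $maskOffset = 10;
    } else {
        $maskOffset = 2;
    }
    $mask    = substr($frame, $maskOffset, 4);  // client-to-server frames are always masked
    $payload = substr($frame, $maskOffset + 4);
    $out = '';
    for ($i = 0, $n = strlen($payload); $i < $n; $i++) {
        $out .= $payload[$i] ^ $mask[$i % 4];   // XOR each byte with the rotating mask
    }
    return $out;
}

// file_put_contents('test.jpg', decodeFrame($socketData));

Note also that a single socket_recv() call is not guaranteed to return the whole message: a large image arrives across several reads (and possibly several frames), so you must keep reading until the full payload length has been consumed.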

Download a large file to the server (remote upload) in chunks with PHP and AJAX (with a forced time limit setting)

There's a question here (on Stack Overflow) that asks about streaming large files to the user in chunks. Code referred to in an answer to that question (originally from here) shows how. I'm looking for how to just save the file to the server.
Notes:
The script here aims to download a file to the server, given a URL to download the file from (this process is also called remote upload).
My hosting provider doesn't allow me to change the time limit, so downloads using this script time out.
I am able to save the file contents to the server using file_put_contents("MyFile.iso", $buffer, FILE_APPEND), but not the whole file, mostly because the script runs for a long time and times out.
I think a solution may work like this: JavaScript requests PHP actions in the background via AJAX multiple times. The first background request tells PHP to download the first 100 MB of the file, the second request tells PHP to download the second 100 MB, and so on until PHP tells JavaScript that we have reached the end of the file. So instead of downloading the file in one long-running process, we download it in multiple short ones.
A good start: How to partially download a remote file with cURL? (I will find time soon to develop the whole solution; any help will be appreciated. A sketch of this ranged approach follows the code below.)
Below is the mentioned code that I need to start with in order to save / remote-upload the file to the server (edited: it now saves the file to the server, but not the whole file, mostly because the script runs for too long):
<?php
define('CHUNK_SIZE', 1024*1024); // size (in bytes) of each chunk
// Read a remote file and save it to the server chunk by chunk
function readfile_chunked($fileurl, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($fileurl, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        file_put_contents("MyFile.iso", $buffer, FILE_APPEND);
        ob_flush(); // left over from the streaming original; not needed when only saving to disk
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
$fileurl = 'http://releases.ubuntu.com/16.04.2/ubuntu-16.04.2-desktop-amd64.iso';
$mimetype = 'mime/type';
header('Content-Type: '.$mimetype);
readfile_chunked($fileurl);
?>
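The ranged-request idea from the question could look roughly like the sketch below: a hypothetical download_chunk.php endpoint, assuming the remote server honors HTTP Range headers, with each AJAX call passing the next slice index until the endpoint reports it is done.

<?php
// download_chunk.php (hypothetical): fetch one 100 MB slice per request, append to disk.
define('SLICE_SIZE', 100 * 1024 * 1024);

$slice = (int)($_GET['slice'] ?? 0);                    // which slice to fetch
$start = $slice * SLICE_SIZE;
$end   = $start + SLICE_SIZE - 1;

$out = fopen('MyFile.iso', $slice === 0 ? 'wb' : 'ab'); // first slice truncates the file
$ch  = curl_init('http://releases.ubuntu.com/16.04.2/ubuntu-16.04.2-desktop-amd64.iso');
curl_setopt($ch, CURLOPT_RANGE, $start . '-' . $end);   // request only this byte range
curl_setopt($ch, CURLOPT_FILE, $out);                   // write the response straight to disk
curl_exec($ch);
$bytes = curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);
curl_close($ch);
fclose($out);

// The JS side stops polling once a slice comes back short.
header('Content-Type: application/json');
echo json_encode(['done' => $bytes < SLICE_SIZE]);

Each request stays well under the time limit, since it only transfers one slice.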
I know this question is old, but I hope my answer helps others.
The question is not actually 'download large file to server'; it should be 'upload large file to server' instead.
To upload a large file to a server, there are many ways. Here's how I do it using FileReader and XMLHttpRequest:
https://stackoverflow.com/a/49808460/6348813
The idea is: you read the file as a binary string (readAsBinaryString()) or an ArrayBuffer (readAsArrayBuffer()), then stream it to the server. In PHP, you read the streamed binary data or ArrayBuffer simply from the php://input stream.
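The receiving side can be very small; a sketch, assuming each chunk is POSTed as the raw request body to a hypothetical receive_chunk.php (the target filename is illustrative):

<?php
// receive_chunk.php (hypothetical): append each raw POSTed chunk to the target file.
$chunk = file_get_contents('php://input');              // raw request body
file_put_contents('upload.bin', $chunk, FILE_APPEND | LOCK_EX);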
You need to do this over HTTPS, or your connection will be open to attackers.
As another method, you may try uploading the file using the slice() method. This method performs well, as it supports pausing, stopping, and resuming the upload whenever the connection is unstable.
In my experience, the readAsArrayBuffer() method seems faster than the readAsBinaryString() and slice() methods, even when the connection is under 100 kbps.
All the above methods have one feature in common: you don't need to touch your PHP upload limits.
Note:
DO NOT USE THE ReadAsBinaryString() METHOD, AS IT HAS BEEN DEPRECATED BY MOZILLA AND DOES NOT SUPPORT FILES LARGER THAN 300 MB.
This feature is non-standard and is not on a standards track. Do not use it on production sites facing the Web: it will not work for every user. There may also be large incompatibilities between implementations and the behavior may change in the future.
Original Article
https://developer.mozilla.org/en-US/docs/Web/API/FileReader
https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest

How can I download an MP4 file instead of playing it in the browser?

I have a .mp4 file that I need to download to my system when I click on the anchor tag.
HTML:
Download Here
Is there any way to download this instead of opening it in a browser?
I need this to run on IE as well, and the only options I have are JavaScript or jQuery. Anything else that is simpler can also be suggested.
I am aware of the HTML5 download attribute, but it is not supported in IE.
I found this: http://www.webdeveloper.com/forum/showthread.php?244007-RESOLVED-Force-download-of-MP3-file-instead-of-streaming and I do similar things with Excel files.
Directly from there:
Use a small PHP file to handle the download, placed in the same folder as the .mp3:
<?php
// Note: sanitize the filename; passing $_GET['file'] straight through
// would allow directory traversal.
$file = basename($_GET['file']);
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"".$file."\"");
header("Content-Length: ".filesize($file));
readfile($file);
exit;
?>
which can be accessed in any anchor like this:
Download the mp3
I've run into this with video files (I want to offer a download, but the browser always tries to open them).
If you can't use a server-side language as dgig mentions, the only thing I've been able to do is put the file in a ZIP file and let the user download that.
Apple does this with QuickTime samples, and since they have more money and resources to throw at this than I do, I figured that was probably how I'd have to do it.
If you are able to write an .htaccess file or edit the Apache config, you can also check this proposed response to force a download.
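For reference, the .htaccess approach usually amounts to a couple of lines like the following (a sketch, assuming mod_headers is enabled); it marks every .mp4 in that directory as an attachment:

<FilesMatch "\.mp4$">
    # Tell the browser to save the file instead of playing it
    Header set Content-Disposition attachment
</FilesMatch>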

HTTP File Download: Monitoring Download Progress

I am in a situation where I have to implement downloading of large files (up to 4 GB) from a web server (Apache 2.4.4) via HTTP. I have tried several approaches, but the best solution looks to be the use of the X-Sendfile module.
As I offer a progress bar for file uploads, I would like to have the same feature for file downloads. So here are my questions:
Is there any way, including workarounds, to monitor file download progress?
Is there any way, including workarounds, to calculate the file download transfer speed?
Is there a better way to provide efficient file downloads from a web server than the X-Sendfile module?
Is there a better file download option in general that would allow me to monitor download progress? It can be a client-side (JavaScript) or server-side (PHP) solution. Is there any particular web server that allows this?
Currently I use:
Apache 2.4.4
Ubuntu
Many thanks.
Two ideas (not verified):
First:
Instead of placing regular links to the files you want to download on your page, place links like .../download.php, which may look something like this:
<?php
// download.php file
session_start(); // if needed
$filename = $_GET['filename'];
header( 'Content-Type: text/plain' ); // use any MIME you want here
header( 'Content-Disposition: attachment; filename="' . htmlspecialchars($filename) . '"' );
header( 'Pragma: no-cache' );
// of course add some error handling
$filename = 'c:/php/php.ini';
$handle = fopen($filename, 'rb');
// do not use file_get_contents as you've said files are up to 4GB - so read in chunks
while ($chunk = fread($handle, 1000)) // chunk size may depend on your file size
{
    echo $chunk;
    flush();
    // write progress info to the DB, or a session variable, in order to update the progress bar
}
fclose($handle);
?>
This way you can keep an eye on the download process. In the meantime, write progress info to the DB or a session variable, and update the progress bar by polling (via AJAX, of course) a script that reads that progress info.
That is very simplified, but I think it might work as you want.
Second:
Apache 2.4 has the Lua language built in:
mod_lua
Creating hooks and scripts with mod_lua
I bet you can write a Lua Apache handler that will monitor your download: send progress to the DB, and update the progress bar using PHP/AJAX that reads the progress info from the DB.
Similarly, there are modules for Perl and even Python (but not for Windows).
I see the main problem here: in a PHP + Apache solution, output buffering may happen in several places:
Browser <= 1 => Apache <= 2 => PHP handler <= 3 => PHP interpreter process
You need to control the first buffer, but directly from PHP that is impossible.
Possible solutions:
1) You can write your own mini daemon whose primary function is only to send files, and run it on a port other than 80 (8880, for example), then handle the file downloads and monitor the output buffer from there. Your output buffer will then be the only one, and you can control it:
Browser <= 1 => PHP interpreter process
2) You can also take mod_lua and control the output buffers directly from Apache.
3) You can also take nginx and control the nginx output buffers using the built-in Perl (it is stable).
4) Try the PHP built-in web server and control the PHP output buffer directly. I can't say anything about how stable it is, sorry, but you can try. ;)
I think that nginx + PHP + built-in Perl is the most stable and powerful solution, but you may choose another solution not on this list. I will follow this topic and await your final solution with interest.
Reading from and writing to the database at short intervals will kill performance.
I would suggest using sessions instead (incrementing the value of sent data in the loop), which you can then safely read from a separate PHP file; that file can return the data as JSON for use by a JavaScript function/plugin.
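A sketch of that polling endpoint, assuming the download loop updates a hypothetical $_SESSION['bytes_sent'] counter (the sending script must call session_write_close() after each update, otherwise this request blocks on the session lock):

<?php
// progress.php (hypothetical): report bytes sent so far for the AJAX progress bar.
session_start();
$sent = $_SESSION['bytes_sent'] ?? 0;
session_write_close(); // release the session lock immediately

header('Content-Type: application/json');
echo json_encode(['bytes_sent' => $sent]);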
