Laravel with Resumable.js failed uploading large file and slow server response - javascript

From the start:
I'm using XAMPP as a server stack for a test project that is something like Google Drive. What I want to achieve is that a user can upload a file of any size and type.
I'm using Resumable.js, which splits the file into chunks (1024 KB) and sends them to the server (Apache). I assume this makes changes to php.ini (upload_max_filesize, post_max_size) unnecessary, since each request stays small?
For testing I used a 9.6 GB Linux image. There were two problems:
The file took 4.5 hours to upload (TTFB grew every minute, from 300 ms to 10 s near the end of the upload). What fix is possible?
When the upload "finishes" it returns Error 500 and does not merge the chunks together. Is it possible to change something in the Apache config? Probably a response error?
Files smaller than ~1 GB upload fine. That was a stress test. XAMPP is a blank install; nothing was changed in any config file.
JS function handling the upload:
let resumable = new Resumable({
    target: '{{ route('upload.large') }}',
    query: { _token: '{{ csrf_token() }}' }, // CSRF token
    fileType: [],
    headers: {
        'Accept': 'application/json'
    },
    testChunks: false,
    throttleProgressCallbacks: 1,
});
Upload function in Controller:
public function upload(Request $request)
{
    $receiver = new FileReceiver('file', $request, HandlerFactory::classFromRequest($request));

    if (!$receiver->isUploaded()) {
        // file not uploaded; bail out instead of falling through
        abort(400, 'File not uploaded');
    }

    $fileReceived = $receiver->receive(); // receive file

    if ($fileReceived->isFinished()) { // file uploading is complete / all chunks are uploaded
        $file = $fileReceived->getFile(); // get the assembled file
        $extension = $file->getClientOriginalExtension();
        $fileName = $file->getClientOriginalName(); // original file name (including extension)

        $disk = Storage::disk(config('filesystems.default'));
        $path = $disk->putFileAs('videos', $file, $fileName);

        // delete the temporary chunked file
        unlink($file->getPathname());

        return [
            'path' => asset('storage/' . $path),
            'filename' => $fileName
        ];
    }

    // otherwise return percentage information
    $handler = $fileReceived->handler();

    return [
        'done' => $handler->getPercentageDone(),
        'status' => true
    ];
}

I'm getting close to solving this problem (which I'm having myself). The failure seems to occur when the chunks are merged back together to recreate the original file: I get a "Failed to open input stream" error once the file size hits 4.4 GB. I suspect the file append is hitting a size limit or the memory isn't being released. Hopefully this helps narrow it down.

Related

how to upload video in chunks to google drive using vanilla Javascript

I have a Google Chrome extension that records the current tab and, when the recording stops, successfully uploads the video to a specific folder in Google Drive.
I'm looking for how to upload the video while the recording is still in progress, i.e. the blobs I get from the recording are uploaded to Google Drive as they arrive.
var superBuffer = new Blob(recordedBlobs, {
    type: 'video/mp4',
});
var metadata = {
    name: Date.now() + '.mp4',
    mimeType: 'video/mp4',
    parents: [folderId],
};
var form = new FormData();
form.append('metadata', new Blob([JSON.stringify(metadata)], { type: 'application/json' }));
form.append('file', superBuffer);
var xhrDriveRequest = new XMLHttpRequest();
xhrDriveRequest.open('POST', 'https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart&fields=id');
xhrDriveRequest.setRequestHeader('Authorization', 'Bearer ' + accessToken);
xhrDriveRequest.responseType = 'json';
xhrDriveRequest.send(form);
I implemented something similar for webcam recording using RecordRTC. The library accepts a timeSlice value in milliseconds and an ondataavailable(blob) callback that is called after every timeSlice period. You can then simply post each received blob to the server and stitch the blobs back together into a single file.
For example, in PHP you would simply append or write the blob data to a file:
$filePath = '/path-to-file'; // path for each recording process, created from an id or something unique

if (isset($_FILES["blob"])) {
    // If the path already exists we are receiving further blobs => append, else write a new file
    $fp = fopen($filePath, file_exists($filePath) ? "a" : "w");
    fwrite($fp, file_get_contents($_FILES["blob"]["tmp_name"]));
    fclose($fp);
}
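The browser side of that loop could be sketched like this. The '/upload-blob.php' endpoint is hypothetical, and the request function is injected (in the browser you would pass window.fetch) so the snippet has no hidden dependencies:

```javascript
// Build an ondataavailable handler that posts each recorded blob to the
// server under the same 'blob' field the PHP above reads from $_FILES.
// postFn(url, options) is a fetch-style function supplied by the caller.
function makeBlobUploader(endpoint, postFn) {
  let seq = 0;
  return function ondataavailable(blob) {
    const form = new FormData();
    form.append('blob', blob, 'chunk-' + (seq++));
    return postFn(endpoint, { method: 'POST', body: form });
  };
}
```

Passing the resulting function as the RecordRTC ondataavailable callback posts each timeSlice's blob as it is produced.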
Google Drive API can perform a resumable upload. It will allow you to keep uploading while your screen is being recorded.
As a summary of how resumable uploads work:
Create a POST request with the uploadType=resumable parameter and read the resumable session URI from the Location header of the response. Since you don't know the final length of the file, the X-Upload-Content-Length header should not be set.
Keep sending PUT requests to that URI, with Content-Length set to the number of bytes in each chunk.
If something goes wrong, resume the upload. Remember that you can set Content-Range to */* when you don't know the total file size or the range of the new chunk.
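The first two steps above can be sketched as follows. The session-initiation call uses the real Drive v3 endpoint; accessToken and the metadata object are assumed to exist, and the Content-Range helper is a hypothetical convenience:

```javascript
// Build the Content-Range header for a chunk starting at `offset` with
// `length` bytes; the total is '*' while the final size is still unknown.
function contentRange(offset, length, total) {
  return 'bytes ' + offset + '-' + (offset + length - 1) + '/' +
    (total == null ? '*' : total);
}

// Step 1 (sketch): open a resumable session and return the session URI
// from the Location header of the response.
async function openSession(accessToken, metadata) {
  const res = await fetch(
    'https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable',
    {
      method: 'POST',
      headers: {
        'Authorization': 'Bearer ' + accessToken,
        'Content-Type': 'application/json; charset=UTF-8',
      },
      body: JSON.stringify(metadata),
    }
  );
  return res.headers.get('Location');
}
```

Each subsequent chunk would then be sent with a PUT to the returned session URI, carrying a header such as contentRange(0, chunk.size, null).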

Unable to upload pdf file: Php

The issue I'm facing is that I get the error your upload file is not PDF file while trying to upload some PDFs. However, the error doesn't show up for all PDFs; I only get it for some PDF files.
<?php
$error = $_FILES['fileToUpload']['error'];
// get upload file type
$type = $_FILES['fileToUpload']['type'];
$action = "upload";
// get file name
$picname = $_FILES['fileToUpload']['name'];
$nameArray = explode(".", $picname);
if ($error === UPLOAD_ERR_OK) {
    // check files
    // filetoUpload code
}
?>
The issue is that, in the URL ../controller/uploadFile.php, even if the file is a PDF, $type = $_FILES['fileToUpload']['type']; returns empty, so the code falls into the else if ($type != "application/pdf") branch and pops up the alert your upload file is not PDF file.. As I said, this happens with most of the PDF files. However, some PDF files manage to get uploaded without any issue, and when a PDF does upload, $type is application/pdf.
Your input will be highly appreciated.
---UPDATE---
The issue is with $_FILES; it's not fetching the PDF file details for some reason.
The issue has been resolved. I checked $error = $_FILES['fileToUpload']['error']; and the value was 1:
Value: 1; The uploaded file exceeds the upload_max_filesize directive in php.ini.
You would do better to check the extension; this also prevents malicious users from uploading exe or zip files while supplying a forged Content-Type: application/pdf header. Also, not all browsers/API libraries send a Content-Type at all.
To make sure the filename does not contain a path, check it with a regex so people cannot write files to directories they shouldn't (e.g. ../../cache/exe). For example, use
preg_match("/^[a-zA-Z0-9_+\\- ]+\\.pdf$/", $filename) to check that it is a pdf.
Never do unlink('files/' . $filename); when $filename could be anything submitted by the user. Deleting ../index.php could destroy your server.
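The same whitelist check can be sketched in JavaScript for a client-side pre-check (the server-side check above remains the one that matters for security):

```javascript
// Accept only a bare filename of safe characters ending in .pdf,
// mirroring the preg_match() pattern above. Anything containing a
// path separator or other extension is rejected.
const PDF_NAME = /^[a-zA-Z0-9_+\- ]+\.pdf$/;

function isSafePdfName(filename) {
  return PDF_NAME.test(filename);
}
```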

Send filepath to PHP using AJAX and jQuery, and get back file contents

I am trying to:
-Send a filepath to a server with POST
-Have PHP on the server receive the POST and get the file contents
-Have the server respond with the file contents
I am aware that this poses some security risks. There will eventually be a system in place to prevent access to certain files depending on who is logged in. Right now I'm just trying to get the system to work.
The JS:
function getFileContentsP(path, contents) {
    $.post("/os/php/file_get_contents.php",
        { "path": path },
        function (data, status, jqXHR) {
            contents = data;
        },
    );
    return contents;
}
The PHP:
<?php
if ($_SERVER["REQUEST_METHOD"] == "POST") {
    $filePath = testInput($_POST["path"]);
    // returns file contents
    echo file_get_contents($filePath);
}

function testInput($input) {
    $input = trim($input);
    //$input = stripslashes($input); filepaths have slashes
    $input = htmlspecialchars($input);
    return $input;
}
?>
There is a file in the same directory as the PHP file.
Calling the JS function from the console to fetch this file returns undefined.
It works so badly that the file path does not even seem to reach the PHP (discovered by using a log file that the PHP could write to).
Am I missing something?
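One thing worth noting about the JS above: $.post is asynchronous, so return contents executes before the success callback assigns to it, which by itself explains the undefined. A sketch of the usual fix, with the request function injected so the snippet stands alone (in the page you would pass a thin $.post wrapper):

```javascript
// Wrap a callback-style request in a Promise so callers can await the
// response instead of reading a variable that hasn't been set yet.
// requestFn(path, cb) stands in for the jQuery call and invokes
// cb(err, data) when the request completes.
function getFileContents(path, requestFn) {
  return new Promise((resolve, reject) => {
    requestFn(path, (err, data) => (err ? reject(err) : resolve(data)));
  });
}
```

A caller would then use getFileContents(path, fn).then(contents => ...) rather than expecting a synchronous return value.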

Unable to transfer large size file using request module in Node.js

In a Node.js project I have to transfer a file from a computer to a server. I can send the file if it is small, i.e. 2 MB, but I am unable to send it if it is larger than that. Here is my code:
var url1 = 'http://beta.xxxxx.com/Xbox/xxxx/index.php/info/xxxxx';
var csvenriched = APPDATApath + '/xxxx/users/' + userId + '/programs/' + programName + '/' + foldername + '/Data_' + tmpstmp + '.csv';

var req = request.post(url1, function (err, resp, body1) {
    if (err) {
        console.log('REQUEST RESULTS:' + err + resp.statusCode + body1);
        res.send(err);
        return false;
    } else {
        res.send(body1);
        return false;
    }
});
var form = req.form();
form.append('file', fs.createReadStream(csvenriched));
On the PHP side, where the data is received, the code is as follows:
public function actionSavetestvideo() {
    if (!empty($_FILES)) {
        $path = Yii::$app->basePath . '/testfiles/' . $_FILES['file']['name'];
        if (move_uploaded_file($_FILES['file']['tmp_name'], $path)) {
            return 'uploaded';
        } else {
            return 'error' . $_FILES["file"]["error"];
        }
    } else {
        return $_FILES;
    }
}
I know there are answers on the internet for uploading a file to a Node.js server, but in my case I have to transfer the file from Node.js to a PHP server using the request module.
It works fine if the file is small, but not if the CSV file is large.
One thing I have noticed is that with a large file, the if (!empty($_FILES)) {} check on the PHP side fails. So I don't think there is an issue on the PHP side. Please suggest what I should modify.
The problem is on the PHP side. The default upload_max_filesize configuration value for PHP is 2 MB. You will need to increase that value to accept larger file uploads.
I agree with mscdex that the problem is on the PHP side. After increasing upload_max_filesize from 2 MB to 50 MB it still did not work until I restarted the Apache server.
After restarting Apache, it works perfectly.
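The relevant php.ini directives, with example values (post_max_size must be at least as large as upload_max_filesize, since the whole POST body includes the file):

```ini
; php.ini - example values for accepting ~50 MB uploads
upload_max_filesize = 50M
post_max_size = 60M        ; must be >= upload_max_filesize
max_execution_time = 300   ; give large transfers time to complete
```

As noted above, Apache (or PHP-FPM) must be restarted before these values take effect.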

Uploadify Success but No Files Uploaded?

I am attempting to implement Uploadify on a site.
It says the files are uploaded, but when I look in the upload folder nothing is there.
I have read other posts similar to this without luck.
I did read this answer to another question:
I had similar problems on a Linux machine. It turned out that the PHP configuration on my server was the culprit. PHP was running in SAFE MODE. As I had uploaded the Uploadify scripts via FTP, the script files were stored in the file system under my FTP user. Since PHP's temp folder was owned by the server root, I had a UID mismatch, i.e. the temporary upload file was attributed to root while the upload script that tried to move it was owned by the FTP user. That fragged it.
To resolve this I changed the ownership of the Uploadify PHP script to root, and from there on it worked.
I know little about server-side coding as I am more of a front-end person. How do I change permissions? I am using 1&1 Hosting.
Here is a screenshot of the files on the server in FileZilla:
EDIT
I tried to upload a ZIP file and it said the upload was successful, but nothing was uploaded. However, I wonder if there is an error in my script, because the ZIP file should have been rejected by this line in the PHP script:
// Validate the file type
$fileTypes = array('jpg','jpeg','gif','png'); // File extensions
Shouldn't the script reject the ZIP file?
Below is my code I am using in case there is an error with the scripts and not my server:
JS
$(function() {
    $('#file_upload').uploadify({
        'swf'             : 'uploadify.swf',
        'uploader'        : 'uploadify.php',
        'onUploadSuccess' : function(file, data, response) {
            alert('The file ' + file.name + ' was successfully uploaded with a response of ' + response + ':' + data);
        }
    });
});
PHP
<?php
$targetFolder = '/uploads/'; // Relative to the root

$verifyToken = md5('unique_salt' . $_POST['timestamp']);

if (!empty($_FILES) && $_POST['token'] == $verifyToken) {
    $tempFile = $_FILES['Filedata']['tmp_name'];
    $targetPath = $_SERVER['DOCUMENT_ROOT'] . $targetFolder;
    $targetFile = rtrim($targetPath, '/') . '/' . $_FILES['Filedata']['name'];

    // Validate the file type
    $fileTypes = array('jpg','jpeg','gif','png'); // File extensions
    $fileParts = pathinfo($_FILES['Filedata']['name']);

    if (in_array($fileParts['extension'], $fileTypes)) {
        move_uploaded_file($tempFile, $targetFile);
        echo '1';
    } else {
        echo 'Invalid file type.';
    }
}
?>
It looks as though the token verification code is the problem. If you remove that functionality, the upload should go through.
Can you comment out that if() comparison? The line
if (!empty($_FILES) && $_POST['token'] == $verifyToken) {
changes to:
if (!empty($_FILES) /* && $_POST['token'] == $verifyToken */) {
It seems that the $fileTypes check is case sensitive on my Linux PHP install:
'image.jpg' uploads but 'image.JPG' does not.
Change
$fileTypes = array('jpg','jpeg','gif','png');
to
$fileTypes = array('jpg','JPG','jpeg','JPEG','gif','GIF','png','PNG');
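A more robust alternative to listing both cases is to lower-case the extension before comparing (in PHP that would be strtolower($fileParts['extension'])); the same idea sketched in JavaScript:

```javascript
const allowed = ['jpg', 'jpeg', 'gif', 'png'];

// Extract the extension, normalize its case, and compare against the
// lowercase whitelist, so 'image.JPG' passes the same check as 'image.jpg'.
function isAllowedImage(filename) {
  const dot = filename.lastIndexOf('.');
  if (dot < 0) return false;
  const ext = filename.slice(dot + 1).toLowerCase();
  return allowed.includes(ext);
}
```

This keeps the whitelist short and avoids missing a case combination such as 'Jpg'.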
