I need to send image data (data:image/png;base64) from the client side to my PHP server using AJAX. My AJAX call looks like this (form_data contains the image):
$.ajax({
url: global_siteurl+'/save_image',
data: form_data,
dataType: 'json',
type: 'post',
contentType: "application/x-www-form-urlencoded; charset=UTF-8",
success: function (retval) {
process_save_image(retval);
}
});
Then I store the encoded image data as a blob in the database (yes - long story behind that!). When I retrieve the image data it seems to be corrupted and does not display correctly. Almost as if there are line breaks and spaces introduced into the image data. Am I missing any parameters in my ajax call? Any ideas as to what may be going wrong? Is there a size limit for the image data I can send across?
It's been a long 4 days of chasing this one.
Mmiz
The problem turned out to be the same one described (and solved) in this post:
Blob data replace '+' with space
It turns out I needed to make the blob data URL-safe when I GET/POST it. On the PHP server side I used the function described in the post above. On the JavaScript side, I used the functions from:
http://notepad2.blogspot.com/2012/08/javascript-make-base64-encoded-string.html
It took a lot of staring at the encoded image data to notice that the + and = characters were being replaced.
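For reference, this is roughly the shape of the helpers involved (a minimal sketch with my own function names; the idea is just to swap the +, / and = characters out of the base64 string before transport and restore them afterwards):
// Make a base64 string safe to pass in a urlencoded GET/POST body
// by replacing the characters that get mangled in transit
function base64EncodeUrl(str) {
    return str.replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
}

// Restore a URL-safe string to standard base64, re-padding with '='
// so its length is a multiple of 4 again
function base64DecodeUrl(str) {
    str = str.replace(/-/g, '+').replace(/_/g, '/');
    while (str.length % 4) {
        str += '=';
    }
    return str;
}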
Try this when displaying the image:
echo '<img src="data:image/png;base64,' . base64_encode($blob_data) . '"/>';
I have written some jQuery + PHP code that takes the HTML from an element on a webpage and saves it on the server. Here is the code that I am using:
var page = {
'html': document.querySelector("article").innerHTML,
'url': 'path/current/webpage.php' // or <?php echo "'$current_page'"; ?>; both give the same 'url' value, this is not an issue
};
$.ajax({
url:'https://example.com/update.php',
type:'post',
data: page,
success:function(data){
window.location.reload();
}
});
Here is my code for update.php:
$content = $_REQUEST['html'];
$page = $_REQUEST['url'];
file_put_contents($page, $content, LOCK_EX);
I am not very comfortable with dataType and contentType, so I skipped them initially. However, the request succeeded sometimes but gave a 403 error other times. I did a little research and found that this might be due to the missing dataType and contentType, so I tried the following values:
contentType: 'text/plain; charset=utf-8',
dataType: 'html'
I no longer get any errors but the pages are not actually updating. I also tried setting the values to:
contentType:'application/json',
dataType: 'html'
This time too, I did not get any 403 errors, but the page would not actually update.
Does the POST data need to be accessed differently on the PHP side depending on the contentType, such as 'application/json' or 'text/plain; charset=utf-8'? The updates don't show up on the webpage even with a 200 response code.
Using application/x-www-form-urlencoded; charset=UTF-8 updates some pages but gives a 403 error for others.
As Rory said (as did I, in an answer I wrote and then deleted when I saw his comment; he was right to comment instead), a 403 response code probably doesn't mean there's a problem with either dataType or contentType. You should look for other reasons the server would refuse to satisfy the request. For instance, since you're posting HTML, perhaps you (or your web host) have some kind of anti-script-injection protection in place. You'll have to track that down, perhaps with your hosting company.
But two things: Some info for completeness, and a potential workaround:
dataType is the type you expect back from the server. contentType is the type of data you're sending to the server.
For the request you're sending, leaving off contentType is correct, because the default jQuery will use is what PHP will expect to see.
You shouldn't have to specify dataType at all; instead, you should ensure the response carries the correct Content-Type header. That means ensuring that your server is configured correctly (for static content) and that your PHP code sets the correct header if necessary via header("Content-Type: data/type-here"). The only reason to specify dataType is if you don't control the server and you know it sends back the wrong type.
If you need to try to work around it, first ask: What if someone sends me malicious HTML directly, not through my web page? The answer is: You need to be careful with what you do with the HTML. For example: If you are going to store this HTML and then display it (as HTML) to a user, that's a Cross-Site Scripting vulnerability and you have to rigorously sanitize that HTML before doing that.
Do not proceed with any workaround until you've answered that question for yourself.
Okay, so in terms of working around it (once you have robust safeguards in place): You might send JSON rather than a standard form, in hopes that whatever is rejecting the forms won't look at it. To do that, you'd change your ajax call:
var page = {
html: document.querySelector("article").innerHTML,
url: <?php echo "'$current_page'"; ?>
};
$.ajax({
url:'https://example.com/update.php',
type:'post',
data: JSON.stringify(page),
contentType: 'application/json; charset=UTF-8',
success:function(data){
window.location.reload();
}
});
Then on the PHP side, you'd read that JSON and parse it (reading code below taken from this answer):
$entityBody = json_decode(file_get_contents('php://input'), true);
$content = $entityBody['html'];
$page = $entityBody['url'];
file_put_contents($page, $content, LOCK_EX);
Again: Please do not use this unless you have robust anti-XSS safeguards in place. And again, if you do have robust anti-XSS safeguards in place, you might be able to just use a normal form by changing your server config.
I need to receive a big amount of data via an AJAX response, and it seems like some kind of data size limit on the live server (nginx) is causing trouble. On the local test server (WAMP) and on live shared hosting with Apache I have no limits or issues. The data is currently 2.7 MB and it could get bigger. The problem is that the script is going to be used by multiple users and I can't force anyone to increase post_max_size.
The script is fairly simple:
$.ajax({
type: "post",
dataType: "json",
url: ajaxurl,
data: {
'action': 'get_data',
},
success: function(response, status, xhr) {
if (response.data.html !== null) {
//process html 2.7MB
}
}
});
The reason the data is so big is that it goes through JSON validation and the HTML data must have special characters escaped, so the response is almost three times the size of the actual processed HTML. This is just a small example of it:
<\/div>\r\n\t\t\t<\/div>\r\n\t\t<\/div>\r\n\t\t
Right now I have no idea how to approach this. Should I break the response into chunks (see the rough sketch after the update below), or something else? If you have any suggestions, please share them.
Update: we figured out that it is some kind of data size limit, but we can't nail it down.
If we do
wp_send_json_success(array(
'html' => substr($html, 0, 1830000),
));
the data passes through,
but with 1840000
it fails.
Our test server is nginx, and in the LiteSpeed config client_max_body_size is set to 500M, so we are not sure where this ~1.8 MB limit comes from.
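For what it's worth, this is roughly what I mean by requesting the data in chunks; it is only a sketch, and the offset/length parameters and has_more flag are hypothetical since the get_data action does not support them yet:
// Fetch the payload in fixed-size pieces and reassemble it on the client.
// Assumes the server-side action accepts an offset/length pair and reports
// whether more data remains, which it currently does not.
function fetchChunk(offset, chunkSize, html, done) {
    $.ajax({
        type: "post",
        dataType: "json",
        url: ajaxurl,
        data: {
            'action': 'get_data',
            'offset': offset,
            'length': chunkSize
        },
        success: function(response) {
            html += response.data.html;
            if (response.data.has_more) {
                fetchChunk(offset + chunkSize, chunkSize, html, done);
            } else {
                done(html); // full payload reassembled
            }
        }
    });
}

fetchChunk(0, 500000, '', function(html) {
    // process the complete html here
});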
I am sending an AJAX POST request using jQuery from a Chrome extension, but the data doesn't arrive as expected: accented characters turn out malformed.
The text "HÄGERSTEN" becomes "HÄGERSTEN".
The text appears fine in the console and elsewhere; only when sent via AJAX to this other page does it appear as mentioned. My AJAX call is basic: I send a data object via jQuery $.ajax. I've tried both with and without contentType, UTF-8 and ISO-8859-1. No difference.
This is how I make my AJAX call:
var newValues = {name: 'HÄGERSTEN'}
$.ajax({
url: POST_URL,
type: 'POST',
data: newValues,
success: function() ...
});
The newValues object has more values, which I retrieve from a form. However, I have tried specifying these values manually, e.g. newValues['name'] = 'ÄÄÄÄ';, and it still causes the same problem.
The original form element of the page that I am sending the AJAX to contains attribute accept-charset="iso-8859-1". Maybe this matters.
The target website is using Servlet/2.5 JSP/2.1, just in case it makes a difference.
I assume this is an encoding issue. As I understand it, Chrome extensions require the script files to be UTF-8 encoded, which probably conflicts with the website the plugin runs on and the target AJAX page (same website), which uses ISO-8859-1. However, I have no idea how to deal with it. I have tried several methods of decoding/encoding to and from UTF-8 and ISO-8859-1, and other tricks, with no success.
I have tried using encodeURIComponent on my values, which only makes them show up exactly that way on the form that displays the values I sent via POST, e.g. H%C3%84GERSTEN.
I have no access to the websites server and cannot tell you whether this is a problem from their side, however I would not suppose so.
UPDATE
Now I have understood that POST data must be sent as UTF-8! So a conversion is not the issue?
Seems like the data is UTF-8 encoded when it is sent and not properly decoded on the server side. It has to be decoded on the server side. Test it out with the following encode and decode functions:
// Percent-encodes the string as UTF-8, then collapses the escapes back
// into raw bytes (one character per byte)
function encode_utf8(s) {
    return unescape(encodeURIComponent(s));
}

// Reverses the trick: treats each character as a byte and decodes as UTF-8
function decode_utf8(s) {
    return decodeURIComponent(escape(s));
}
var encoded_string = encode_utf8("HÄGERSTEN");
var decoded_string = decode_utf8(encoded_string);
document.getElementById("encoded").innerText = encoded_string;
document.getElementById("decoded").innerText = decoded_string;
<div>
Encoded string: <span id="encoded"></span>
</div>
<div>
Decoded string: <span id="decoded"></span>
</div>
We faced the same situation, but in our case we always sent the parameters using JSON.stringify.
For this you have to make a couple of changes. 1) When making the AJAX call to the page, add a contentType option defining the encoding in which the data is sent:
$.ajax({
    type: "POST",
    url: POST_URL,
    dataType: 'json', // in our case the expected response type is JSON
    contentType: "application/json; charset=utf-8",
    data: JSON.stringify(newValues), // I always send the parameters in JSON format
    // ...
});
EDIT
After reading your question more carefully, I see that your server-side JSP uses ISO-8859-1 encoding, and from reading some posts I learned that all POST data is transmitted using UTF-8, as mentioned here:
POST data will always be transmitted to the server using UTF-8 charset, per the W3C XMLHTTPRequest standard
But in the post jquery-ignores-encoding-iso-8859-1 there is a workaround posted by iappwebdev which might be useful and help you:
$.ajax({
url: '...',
contentType: 'text/plain; charset=iso-8859-1',
// This is the important part!!!
beforeSend: function(jqXHR) {
jqXHR.overrideMimeType('text/html;charset=iso-8859-1');
}
});
The code above is taken from the code posted by iappwebdev.
I don't know if it could have been solved using POST data and AJAX. Perhaps if I had made a pure JavaScript XHR AJAX call, I might have been able to send POST data encoded the way I wanted. I have no idea.
However, in my desperation I tried what seemed like my final option: sending the request as GET data. I was lucky, and the target page accepted GET data.
Obviously the problem still persisted, since I was sending the data the same way, UTF-8 encoded. So instead of sending the data as an object, I parsed it into a URL-friendly string with my own function using escape, making sure the values are ISO-8859-1 friendly (encodeURIComponent encodes the URI as UTF-8, while escape encodes strings in a way that is compatible with ISO-8859-1).
The simple function that cured my headaches:
function URLEncodeISO(values) {
var params = [];
for(var k in values) params[params.length] = escape(k) + '=' + escape(values[k]);
return params.join('&');
}
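Used roughly like this (just a sketch; since the data is already a urlencoded string, jQuery appends it to the URL as-is):
var newValues = { name: 'HÄGERSTEN' };

$.ajax({
    url: POST_URL,
    type: 'GET',
    // pass the pre-encoded ISO-8859-1 friendly query string instead of the object
    data: URLEncodeISO(newValues),
    success: function() {
        // ...
    }
});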
The client-side character encoding is not completely up to you (imagine the page being used by different users all around the world: Chinese, Italian...), while on the server side you need to handle the encoding for your purposes.
So the data in the AJAX POST can remain UTF-8 encoded, but on the server side you can do the following:
PHP:
$name = utf8_decode($_POST['name']);
JSP:
request.setCharacterEncoding("UTF-8");
String name = request.getParameter("name");
When I try to upload an MP4 video of 16.9 MB using an AJAX async POST to a PHP file, the console shows an error: POST http://website.com/proc_vids.php net::ERR_EMPTY_RESPONSE
I know for a fact that this problem is related to PHP's memory_limit, because when I set it to 200 MB everything is fine, but when I change it back to 100 MB the error comes back.
I can't even get the POST into a PHP variable, because the error is triggered as soon as the AJAX POST call is made, without anything even happening on the server side (PHP). Here is the AJAX POST code:
var proc = 1;
video = document.getElementById('preview_video').src;
$.ajax({
'async': true,
'type': "POST",
'global': false,
'dataType': 'json',
'url': "proc_vids.php",
'data': {proc: proc, video: video}
}).done(function () {
//Do something
});
PHP code:
$proc = $_POST['proc'];
if ($proc == 1){
//$video = $_POST['video'];
}
As you can see, I commented out the line where I assign the POST to a variable, and it still triggers the error.
What can I do so that the video variable containing the base64 data doesn't expand and consume such high memory levels?
Are there any alternatives that don't involve raising memory_limit?
Problem solved thanks to cmorrissey!
I used the same method as described in this thread: Convert HTML5 Canvas into File to be uploaded?
Sending the AJAX POST as FormData and converting the base64 data via a Uint8Array into a Blob is the key to not allocating PHP memory when the POST is made. Be careful though, because older browsers may not support Blob.
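For anyone curious, this is roughly what that conversion looks like; it is a sketch based on my code above (same preview_video element and proc_vids.php endpoint), not the exact code from the linked thread:
// Split the data URL from the preview element into MIME type and base64 payload
var dataUrl = document.getElementById('preview_video').src;
var parts = dataUrl.split(',');
var mime = parts[0].match(/:(.*?);/)[1];
var binary = atob(parts[1]);

// Decode the base64 payload into raw bytes
var bytes = new Uint8Array(binary.length);
for (var i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
}

// Wrap the bytes in a Blob and send multipart/form-data, so PHP receives
// a normal file upload ($_FILES) instead of a huge urlencoded string
var formData = new FormData();
formData.append('proc', 1);
formData.append('video', new Blob([bytes], { type: mime }), 'video.mp4');

$.ajax({
    type: "POST",
    url: "proc_vids.php",
    data: formData,
    processData: false, // let the browser build the multipart body
    contentType: false
}).done(function () {
    //Do something
});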
Thank you guys ;)
I am trying to use the BusinessObject RESTful API to download a generated (PDF or XLS) document.
I am using the following request:
$.ajax({
url: server + "/biprws/raylight/v1/documents/" + documentId,
type: "GET",
contentType: "application/xml",
dataType: "text",
headers: {"X-SAP-LogonToken": token, "Accept": "application/pdf" },
success: function(mypdf) {
// some content to execute
}
});
I receive this data as a response:
%PDF-1.7
%äãÏÒ
5 0 obj
<</Length 6 0 R/Filter/FlateDecode>>
//data
//data
//data
%%EOF
I first assumed that it was base64 content, so to allow users to download the file I added these lines in the success function:
var uriContent = "data:application/pdf; base64," + encodeURIComponent(mypdf);
var newWindow=window.open(uriContent, 'generated');
But all I get is ERR_INVALID_URL, or a failure when opening the generated file if I remove "base64" from the uriContent.
Does anyone have any idea how I could use the response data? I went here but it wasn't helpful.
Thank you!
. bjorge .
Nothing much can be done from the client side, i.e. JavaScript.
The server-side code has to be changed so that a URL link is generated (pointing to the PDF file) and sent as part of the response. The user can then download the PDF from that link.
You cannot create a file using JavaScript; JavaScript doesn't have access to write files, as that would be a huge security risk, to say the least.
To achieve your functionality, you can implement a click event that targets the required file, and the browser will prompt the user to save that file.
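A minimal sketch of that idea (the /download/report.pdf URL is made up; it would be whatever link your server-side code returns):
// Create a temporary link pointing at the server-generated file and click it,
// so the browser shows its normal save/download prompt
var link = document.createElement('a');
link.href = '/download/report.pdf'; // hypothetical URL returned by the server
link.download = 'report.pdf';
document.body.appendChild(link);
link.click();
document.body.removeChild(link);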