Google Chrome: XMLHttpRequest.send() not working while doing POST

I'm working on an application that allows the user to send a file using a form (a POST request), and that executes a series of GET requests while that file is being uploaded to gather information about the state of the upload.
It works fine in IE and Firefox, but not so much in Chrome and Safari.
The problem is that even though send() is called on the XMLHttpRequest object, no request actually goes out, as can be verified in Fiddler.
To be more specific, an event handler is attached to the form's "submit" event that schedules a timeout on the window:
window.setTimeout(startPolling, 10);
In this "startPolling" function, a sequence is started that keeps firing GET requests to receive status updates from a web service returning text/JSON, which is used to update the UI.
Is this a limitation (perhaps security-related?) of WebKit-based browsers? Is this a Chrome bug? (I'm seeing the same behaviour in Safari, though.)
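For reference, the submit-then-poll flow described above can be sketched as a small helper (a hypothetical illustration, not the asker's actual code; `fetchStatus` stands in for the GET request to the status web service, and `schedule` defaults to a 500 ms setTimeout):

```javascript
// Hypothetical sketch of the polling loop: keep issuing status requests until
// the service reports the upload as finished. `fetchStatus(cb)` stands in for
// the GET request (e.g. a $.getJSON wrapper); `schedule` defaults to setTimeout.
function pollUploadStatus(fetchStatus, onUpdate, schedule) {
    schedule = schedule || function(fn) { setTimeout(fn, 500); };
    function tick() {
        fetchStatus(function(status) {
            onUpdate(status);       // e.g. update a progress bar
            if (!status.done) {
                schedule(tick);     // keep polling while the upload runs
            }
        });
    }
    schedule(tick);                 // mirrors window.setTimeout(startPolling, 10)
}
```

Injecting `schedule` also makes the loop testable without real timers.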

I am having the exact same problem. At the moment I use an iframe, which is targeted by the form. That allows the XHR requests to be executed while posting. While that does work, it doesn't degrade gracefully if someone disables JavaScript (I couldn't load the next page outside the iframe without JS). So if someone has a nicer solution, I would be grateful to hear it.
Here is the jQuery script for reference:
$(function() {
    $('form[enctype="multipart/form-data"]').submit(function() {
        // Prevent multiple submits
        if ($.data(this, 'submitted')) return false;

        var freq = 500; // frequency of updates in ms
        var progress_url = '{% url checker_progress %}'; // ajax view serving progress info

        $("#progressbar").progressbar({
            value: 0
        });
        $("#progress").overlay({
            top: 272,
            api: true,
            closeOnClick: false,
            closeOnEsc: false,
            expose: {
                color: '#333',
                loadSpeed: 1000,
                opacity: 0.9
            }
        }).load();

        // Update the progress bar
        function update_progress_info() {
            $.getJSON(progress_url, {}, function(data, status) {
                if (data) {
                    var progress = parseInt(data.progress, 10);
                    $('#progressbar div').animate(
                        { width: progress + '%' },
                        { queue: false, duration: 500, easing: "swing" }
                    );
                }
                window.setTimeout(update_progress_info, freq);
            });
        }

        window.setTimeout(update_progress_info, freq);
        $.data(this, 'submitted', true); // mark form as submitted
    });
});
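The hidden-iframe workaround mentioned in the answer above can be sketched roughly like this (an illustration with made-up names, not the answerer's actual code): the form posts into an invisible iframe, so the main page, and its XHR polling, keeps running during the upload.

```javascript
// Hypothetical sketch of the hidden-iframe trick: point the form's target at
// an invisible iframe so the POST lands there instead of navigating the page.
function targetFormAtHiddenIframe(form) {
    var iframe = document.createElement('iframe');
    iframe.name = 'upload_target';   // the form will post into this frame
    iframe.style.display = 'none';
    document.body.appendChild(iframe);
    form.target = 'upload_target';
    return iframe;
}
```

As the answer notes, this only helps users with JavaScript enabled; without JS the form still posts into the (now nonexistent) frame unless the target is only set from script, as here.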

Related

'No Internet' detection delay using AJAX

I have a Web App that runs in a WebView on both Android and iOS. I understand it is possible to implement network detection within the App itself, but I would prefer to do it with jQuery and an Ajax call.
The behaviour is slightly strange, and maybe not possible to rectify, but it's interesting all the same and I'd like to find out why it's happening.
The Ajax call works and applies an overlay to the screen to block any activity while no internet is available, mainly because any request made while offline takes ages to load once the connection returns. It works absolutely fine when data/wifi is manually turned off, but when the phone loses signal there is a definite delay of about a minute before this is detected; there is no delay when data comes back. Here is the script...
var refreshTime = 1000;
window.setInterval(function() {
    $.ajax({
        cache: false,
        type: "GET",
        url: "./offline.php",
        success: function() {
            document.getElementById("overlay").style.display = "none";
        },
        error: function() {
            document.getElementById("overlay").style.display = "block";
        }
    });
}, refreshTime);
My question is: what is the reason for this delay? Is it something controlled by the mobile OS as it keeps checking for mobile data, or is it something to do with the way the script works? Currently offline.php is a blank file and the Ajax call just checks that the file is accessible. Is there something I can add to get rid of the delay?
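Not part of the original post, but relevant to bounding the delay: jQuery.ajax accepts a `timeout` option (in milliseconds) that forces the error callback rather than waiting for the OS to give up on a dead link. The underlying race between probe and timer can be sketched as a small helper (names are illustrative; `probe` stands in for the $.ajax call to offline.php):

```javascript
// Hypothetical sketch: race the connectivity probe against an explicit timer
// so a dead radio link is treated as offline quickly. `probe(cb)` stands in
// for the $.ajax call to offline.php; `timer` defaults to setTimeout.
function checkConnectivity(probe, timeoutMs, done, timer) {
    timer = timer || setTimeout;
    var settled = false;
    function settle(online) {
        if (!settled) {
            settled = true;
            done(online);
        }
    }
    timer(function() { settle(false); }, timeoutMs); // assume offline after timeoutMs
    probe(function(ok) { settle(ok); });             // first answer wins
}
```

In practice, simply adding `timeout: 3000` to the existing $.ajax options achieves the same effect without extra code.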

Internet Explorer does not recognise keyup handler

I've been working on a website for a client for a while now. The website uses an instant-search box: when a user starts typing in a text input, search results are fetched via AJAX and immediately displayed on the same page. The setup works flawlessly in all major browsers except Internet Explorer (at least 9-11; Edge seems exempt).
The core code is as follows:
(function($) {
    // Tracking variable for AJAX requests
    var request = null;

    function searchQuery(query) {
        console.log("Search query received");
        request = $.ajax({
            method: "get",
            url: icl_home + "wp-json/query/v1/search/" + query,
            beforeSend: function() {
                if (request != null) {
                    // If a request exists already, cancel it
                    request.abort();
                } else {
                    // Set loading state to signify loading
                    // This will output a spinner on the object
                    $("nav .search").addClass("loading");
                }
            },
            success: function(data) {
                console.log("Query returned");
                // On data get, push it to the right function
                formatResults(data);
                // Reset loading state
                $("nav .search").removeClass("loading");
                // Reset request
                request = null;
            }
        });
    }

    $("input#search").keyup(function(e) {
        console.log("Trigger keyup");
        searchQuery($(this).val());
    });
})(jQuery); // Fully reference jQuery after this point.
As you can see, the intent is simple. The problem is that Internet Explorer appears not to recognise our keyup handler, which is imperative to the functioning of the feature. I've tried replacing the handler with keydown/keypress, and I've tried ruling out jQuery by registering the event with pure JavaScript, but nothing has worked yet.
Would you have any suggestions as to where we could look for possible problems?
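Not from the original post, but one classic IE pitfall worth ruling out: in older IE (notably IE9), window.console is undefined until the F12 developer tools are opened, so console.log calls inside handlers throw and the handler appears to do nothing. A minimal no-op shim (a sketch with an illustrative name) eliminates that variable:

```javascript
// Hypothetical shim: give `win` a no-op console so stray console.log calls
// cannot throw in browsers where the console object does not exist yet.
function ensureConsole(win) {
    if (!win.console) {
        win.console = {
            log: function() {},
            error: function() {}
        };
    }
    return win.console;
}
```

Run ensureConsole(window) before the handler code, or strip the console.log calls in production.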

Ajax call to localhost from a site loaded over HTTPS fails in Chrome

-------------------- UPDATE 2 ------------------------
I see now that what I am trying to accomplish is not possible with Chrome. But I am still curious: why is the policy stricter in Chrome than in, for example, Firefox? Or is it perhaps that Firefox doesn't actually make the call either, but JavaScript-wise it deems the call failed instead of blocked altogether?
---------------- UPDATE 1 ----------------------
The issue indeed seems to be about calling http from an https site; this error is produced in the Chrome console:
Mixed Content: The page at 'https://login.mysite.com/mp/quickstore1' was loaded over HTTPS, but requested an insecure XMLHttpRequest endpoint 'http://localhost/biztv_local/video/video_check.php?video=253d01cb490c1cbaaa2b7dc031eaa9f5.mov&fullscreen=on'. This request has been blocked; the content must be served over HTTPS.
The question then is why Firefox allows it, and whether there is a way to make Chrome allow it. It had indeed worked fine until just a few months ago.
Original question:
I have some jQuery making an ajax call over http (the site making the call is loaded over https).
Moreover, the call from my https site goes to a script on localhost on the client's machine, but the file starts with:
<?php header('Access-Control-Allow-Origin: *'); ?>
So that's fine. A peculiar setup, you might say, but the client is actually a media player.
It has always worked fine before, and it still works fine in Firefox, but since about two months back it isn't working in Chrome.
Has there been a revision to Chrome's policies regarding this type of call? Or is there an error in my code below that Firefox manages to parse but Chrome doesn't?
The error only occurs when the file is NOT present on localhost (i.e. a regular web user visiting the site with their own browser naturally won't have the file on their localhost; most won't even have a localhost running). So one theory might be that since the file isn't there, the Access-Control-Allow-Origin: * header is never encountered, and Chrome therefore deems the call in its entirety insecure or not allowed and never completes it.
If so, is there an event handler I can attach to my jQuery.ajax call to catch that outcome instead? As of now, complete is never run if the file on localhost isn't there.
before: function(self) {
    var myself = this;
    var data = self.slides[self.nextSlide - 1].data;
    var html = myself.getHtml(data);
    $('#module_' + self.moduleId + '-slide_' + self.slideToCreate).html(html);

    // This is the fullscreen-always version of the video template
    var fullscreen = 'on';
    //console.log('runnin beforeSlide method for a video template');

    // Call the media player's localhost
    var videoCallStringBase = "http://localhost/biztv_local/video/video_check.php?";
    var videoContent = 'video=' + data['filename_machine'] + '&fullscreen=' + fullscreen;
    var videoCallString = videoCallStringBase + videoContent;

    // TODO: works when video_check.php is found, but if it isn't, it will wait
    // for a video to play. It should skip then as well...
    // UPDATE: Isn't this fixed already? Debug once the environment is set up.
    console.log('checking for ' + videoCallString);

    jQuery.ajax({
        url: videoCallString,
        success: function(result) {
            // ...if it isn't, we can't play back the video, so skip to the next slide
            if (result != 1) {
                console.log('found no video_check on localhost so skip slide ' + self.nextSlide);
                self.skip();
            } else {
                // success, proceed as normal
                self.beforeComplete();
            }
        },
        complete: function(xhr, data) {
            if (xhr.status != 200) {
                // we could not find the check-video file on localhost, so skip the next slide
                console.log('found no video_check on localhost so skip slide ' + self.nextSlide);
                self.skip();
            } else {
                // success, proceed as normal
                self.beforeComplete();
            }
        }, // the above would cause a double slide-skip, I think; that should be trapped by the fail clause anyway
        async: true
    });
}
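On the asker's last question (a hedged sketch, not from the post): a request the browser blocks, like one that simply fails, surfaces in jQuery's error callback with xhr.status 0, so the blocked and file-missing cases can be funneled into one skip path. The decision logic can be isolated in a small, testable helper (names are illustrative):

```javascript
// Hypothetical helper: decide whether to play or skip based on the
// video_check response. Status 0 covers network errors and requests the
// browser blocked (e.g. mixed content); anything but a 200 with body "1" skips.
function classifyVideoCheck(status, result) {
    if (status === 200 && result == 1) {
        return 'play';
    }
    return 'skip';
}
```

An `error:` (or `.fail()`) handler calling this with `xhr.status` would then catch the Chrome-blocked outcome that `complete` currently misses.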

XMLHttpRequest progress event advances much faster than the actual upload

I'm trying to implement an upload form that returns the upload status to the user using XHR. Everything seems to be implemented correctly; however, when uploading, the callbacks fire far too quickly and report a much higher percentage than has actually been uploaded.
With files under ~20 MB, I get a callback immediately showing over 99% while the upload continues to churn away in the background for some time.
See the screengrab below showing the console for a 74 MB file. It was taken a couple of seconds after the upload was initialised, and the upload continued for another ~60 seconds (notice just three callbacks registering (loaded totalsize) (calculatedpercentage), with the ajax upload continuing with the throbber).
Has anyone experienced this and managed to get an accurate representation of upload status?
(the 'load' event triggers correctly after the upload process)
Here's my code:
$(this).ajaxSubmit({
    target: '#output',
    beforeSubmit: showRequest,
    xhr: function() {
        var myXhr = $.ajaxSettings.xhr();
        if (myXhr.upload) {
            console.log('have xhr');
            myXhr.upload.addEventListener('progress', function(ev) {
                if (ev.lengthComputable) {
                    console.log(ev.loaded + " " + ev.total);
                    console.log((ev.loaded / ev.total) * 100 + "%");
                }
            }, false);
        }
        return myXhr;
    },
    dataType: 'json',
    success: afterSuccess
});
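As an aside, the percentage arithmetic inside the progress listener can be isolated into a small helper (a sketch with an illustrative name, not part of the original form code), which also guards the case where the total is unknown:

```javascript
// Hypothetical helper: turn a ProgressEvent's loaded/total pair into a
// clamped integer percentage; returns 0 when the total is unknown.
function uploadPercent(loaded, total) {
    if (!total) {
        return 0; // e.g. lengthComputable was false
    }
    return Math.min(100, Math.round((loaded / total) * 100));
}
```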
There are several reports of the same behavior (incorrect progress reporting on file upload) caused by antivirus software scanning the files to be uploaded. My guess is that some part of the antivirus attempts to compensate for the possible delay caused by the scan, and fails to do it properly.
I had the same issue recently. I think your ajax call simply returns before your file finishes uploading. To work around this, load back what you uploaded and listen for its load event. For example, if you are uploading an image (using jQuery):
var loadCheck = $('<img src="' + uploadedUrl + '">').hide().appendTo('body');
loadCheck.on('load', updateProgressBar); // pass the function reference; calling it with () would run it immediately
Of course you can apply the same idea to other file types, and incorporate an $.each iteration.

Check response time of a website with ajax

Hi, I want to check the response times of a website. Here is my code; I get some values, but they don't reflect reality. What is the problem with this code? Is it something related to caching? Furthermore, how do I detect that a page doesn't exist or is unavailable?
<script type="text/javascript" src="jquery.min.js"></script>
<script type="text/javascript">
var start = new Date();
$.ajax({
    url: 'http://www.example.com',
    complete: function() {
        alert(new Date() - start);
    }
});
</script>
The code itself is fine assuming that the code is running on the same origin as the one it's checking; you can't use ajax cross-origin unless both ends (client and server) support and are using CORS.
It could be caching, yes, you'd have to refer to the browser tools (any decent browser has a Network tab or similar in its developer tools) to know for sure. You can also disable caching by setting cache: false in the ajax call (see the ajax documentation for details), although that's a somewhat synthetic way to do it. A better way would be to ensure that whatever URL you're using for this timing responds with cache headers telling the browser (and proxies) not to cache it.
You can tell if the page doesn't exist or is "unavailable" (whatever that means) by hooking the error function and looking at the information it gives you:
var start = new Date();
$.ajax({
    url: 'http://www.example.com',
    error: function(jqxhr, status, ex) {
        // Look at status here
    },
    complete: function() {
        alert(new Date() - start);
    }
});
The arguments given to error are also described in the docs linked above.
You can't do this due to the same origin policy.
One trick would be to create an image and measure the time until the onerror event fires.
var start = new Date();
var img = new Image();
img.onerror = function() {
    alert(new Date() - start);
};
img.src = 'http://www.example.com/';
Append a random number to the url to prevent caching.
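The cache-busting suggestion can be sketched as a tiny helper (illustrative, not from the answer): append a throwaway query parameter so each probe bypasses the browser and proxy caches.

```javascript
// Hypothetical helper: add a unique query parameter to defeat caching.
function cacheBust(url) {
    var sep = url.indexOf('?') === -1 ? '?' : '&';
    return url + sep + '_=' + Date.now();
}
```

Usage: img.src = cacheBust('http://www.example.com/');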
I'd use browser extensions such as Firebug or Chrome Developer Tools to measure Ajax response times.