I've been working on a website for a client for a while now. The website uses an instant-search box: when a user starts typing characters into a text input, search results are fetched via AJAX and displayed immediately on the same page. The setup works flawlessly in all major browsers except Internet Explorer (at least 9-11; Edge seems exempt).
The core code is as follows:
(function($) {
    // Tracking variable for the in-flight AJAX request
    var request = null;

    function searchQuery(query) {
        console.log("Search query received");
        request = $.ajax({
            method: "get",
            url: icl_home + "wp-json/query/v1/search/" + query,
            beforeSend: function() {
                if (request != null) {
                    // If a request is already in flight, cancel it
                    request.abort();
                } else {
                    // Set the loading state to signify loading
                    // This will output a spinner on the element
                    $("nav .search").addClass("loading");
                }
            },
            success: function(data) {
                console.log("Query returned");
                // Pass the returned data to the formatting function
                formatResults(data);
                // Reset the loading state
                $("nav .search").removeClass("loading");
                // Reset the request tracker
                request = null;
            }
        });
    }

    $("input#search").keyup(function(e) {
        console.log("Trigger keyup");
        searchQuery($(this).val());
    });
})(jQuery); // Fully reference jQuery after this point.
As you can see, the intent is simple. The problem is that Internet Explorer does not appear to recognise our keyup handler, which is essential to the functioning of the feature. I've tried replacing the handler with keydown / keypress, and I've tried ruling out jQuery by registering the event with pure JavaScript, but nothing has worked so far.
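A jQuery-free registration of that sort might look like the following (a sketch only; `bindKeyup` is just a hypothetical helper name, and the `attachEvent` branch only matters for IE8 and below, since IE9-11 do support `addEventListener`):

```javascript
// Hypothetical jQuery-free keyup registration, as a cross-check that the
// problem is not jQuery itself. IE9-11 support addEventListener; the
// attachEvent branch is only needed for IE8 and below.
function bindKeyup(el, handler) {
  if (el.addEventListener) {
    el.addEventListener("keyup", handler, false);
  } else if (el.attachEvent) {
    // legacy IE event model; note the "on" prefix
    el.attachEvent("onkeyup", handler);
  }
}
```

Wired up as `bindKeyup(document.getElementById("search"), function () { searchQuery(this.value); })`. If even this handler never fires in IE, the binding itself may be the problem: for instance the script running before `#search` exists, or the input being replaced after binding.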
Would you have any suggestions as to where we could look for possible problems?
I am using the code below to open a link on a button click. The link points to a Controller method responsible for downloading an Excel file.
// Button to download table data
$("#btnDownloadCIRResults").click(function (e) {
    var All_Recs = $("#cbShowAllRecords").prop("checked") ? "YES" : "NO";
    DisplayStatusMessageWind("Downloading report, please wait...", MessageType.Info, true, false);
    // DownloadCIRemediationTable(string AllRecords)
    window.location = '/AskNow/DownloadCIRemediationTable?AllRecords=' + All_Recs;
    DisplayStatusMessageWind("Report downloaded successfully.", MessageType.Success, false, true, 1000);
    e.preventDefault();
});
The Controller method queries a DB table, converts it to an Excel workbook, and returns the file as a download result. All is working fine and as expected, except that, since this is a time-consuming process, I want to improve the user experience by showing a wait message while the file is being downloaded.
The DisplayStatusMessageWind() method shows a wait message. However, it doesn't know or care about the load-complete event of the window.location = '/AskNow/DownloadCIRemediationTable?AllRecords=' + All_Recs; navigation.
How can I make the completion message appear only after the file download has completed:
DisplayStatusMessageWind("Report downloaded successfully.", MessageType.Success, false, true, 1000);
By assigning a new location with window.location = "<NEWURL>"; you're asynchronously requesting that the current page be replaced. What happens is that the next line (DisplayStatusMessageWind()) is executed immediately. Once all events have been handled, the page is finally replaced; the new page (URL) loads and you have no control whatsoever over what happens next.
What you should do instead is use window.open("<NEWURL>", '_blank') and then have the new page send a signal via localStorage, which can be read and written by all pages of the same origin. These are some hints; writing the actual code is your job.
On this page, in the on("click") handler:
// local scope
var ukey;

// polling function
function waitOtherIsReady() {
    // localStorage stores strings, so compare against "true", not the boolean
    if (localStorage.getItem(ukey) === "true") {
        // other page experienced its ready event
        localStorage.removeItem(ukey); // clean-up
        // TODO: do your stuff
    } else {
        setTimeout(waitOtherIsReady, 500);
    }
}

// create a unique key and deposit it in localStorage
ukey = "report_" + Math.random().toString(16);
localStorage.setItem(ukey, "false");

// pass the key to the other page
window.open("URL?ukey=" + ukey, "_blank");

// start polling until the flag is flipped to "true"
setTimeout(waitOtherIsReady, 500);
On the other page:
$(() => {
    // get ukey from the URL
    var ukey = new URL(window.location.href).searchParams.get("ukey");
    // page is now ready; flip the flag (as a string) to signal the ready event
    localStorage.setItem(ukey, "true");
});
I have a Web App that runs in both Android and iOS via a WebView. I understand it is possible to implement network detection within the app itself, but I would prefer to do it using jQuery and an Ajax call.
The behaviour is slightly strange and maybe not possible to rectify, but it is interesting all the same and I'd like to find out why it's happening.
The Ajax call works and applies an overlay to the screen to block any activity while no internet is available, mainly because any request made while offline takes ages to load once the connection returns. It works absolutely fine when data/wifi is manually turned off, but when the phone loses signal there is a definite delay of about a minute before this is detected; there is no delay when data comes back. Here is the script...
var refreshTime = 1000;
window.setInterval(function() {
    $.ajax({
        cache: false,
        type: "GET",
        url: "./offline.php",
        success: function off() {
            document.getElementById("overlay").style.display = "none";
        },
        error: function on() {
            document.getElementById("overlay").style.display = "block";
        }
    });
}, refreshTime);
My question is: what is the reason for this delay? Is it something controlled by the mobile OS to keep checking for mobile data, or is it something to do with the way the script works? Currently offline.php is a blank file, and the Ajax call just checks that the file is accessible. Is there something I can add to get rid of the delay?
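One variable I have not ruled out is the request timeout: without an explicit one, a request over a dead cellular link can sit waiting on the OS-level TCP timeout before jQuery fires the `error` callback, which would explain a delay on the order of a minute. `$.ajax` accepts a `timeout` option in milliseconds (e.g. `timeout: 3000`), which forces the `error` handler once the deadline passes. The bookkeeping that option provides can be sketched jQuery-free, with the timer injected so it stays self-contained (`makeConnectivityTracker` is a hypothetical name, not part of any library):

```javascript
// Sketch of deadline bookkeeping for a connectivity poll: each tick starts
// a probe; if the deadline fires before the probe reports back, we flip to
// "offline" without waiting for the OS-level TCP timeout.
function makeConnectivityTracker(deadlineMs, scheduleTimeout) {
  var state = { status: "online", pending: null };
  return {
    startProbe: function () {
      var probeId = {};          // unique token for this probe
      state.pending = probeId;
      scheduleTimeout(function () {
        // probe still unanswered after deadlineMs -> treat as offline;
        // a late answer can still flip us back online below
        if (state.pending === probeId) state.status = "offline";
      }, deadlineMs);
      return probeId;
    },
    probeFinished: function (probeId, ok) {
      if (state.pending === probeId) {
        state.pending = null;
        state.status = ok ? "online" : "offline";
      }
    },
    status: function () { return state.status; }
  };
}
```

In the real script, `status()` flipping to "offline" corresponds to the `error` handler showing the overlay.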
-------------------- UPDATE 2 ------------------------
I see now that what I am trying to accomplish is not possible with Chrome. But I am still curious: why is the policy stricter in Chrome than in, for example, Firefox? Or is it perhaps that Firefox doesn't actually make the call either, but JavaScript-wise deems the call failed instead of blocked altogether?
---------------- UPDATE 1 ----------------------
The issue does indeed seem to be about calling http from an https site; this error is produced in the Chrome console:
Mixed Content: The page at 'https://login.mysite.com/mp/quickstore1' was loaded over HTTPS, but requested an insecure XMLHttpRequest endpoint 'http://localhost/biztv_local/video/video_check.php?video=253d01cb490c1cbaaa2b7dc031eaa9f5.mov&fullscreen=on'. This request has been blocked; the content must be served over HTTPS.
Then the question is why Firefox allows it, and whether there is a way to make Chrome allow it. It had indeed worked fine until just a few months ago.
Original question:
I have some jQuery making an Ajax call to http (the site making the call is loaded over https).
Moreover, the call from my https site goes to a script on localhost on the client's machine, but the file starts with
<?php header('Access-Control-Allow-Origin: *'); ?>
So that's fine. A peculiar setup, you might say, but the client is actually a media player.
It has always worked fine before, and it still works fine in Firefox, but since about two months ago it isn't working in Chrome.
Has there been a revision to Chrome's policies regarding this type of call? Or is there an error in my code below that Firefox manages to handle but Chrome doesn't?
The error only occurs when the file is NOT present on localhost (i.e. if a regular web user visits this site with their own browser, they naturally won't have the file on their localhost; most won't even have a localhost running). So one theory might be that since the file isn't there, the Access-Control-Allow-Origin: * header is never encountered, and therefore Chrome deems the call in its entirety insecure or not allowed, and never completes it?
If so, is there an event handler I can attach to my jQuery.ajax method to catch that outcome instead? As of now, complete is never run if the file on localhost isn't there.
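For context, my understanding (which may be wrong) is that jQuery routes blocked or network-failed requests to the `error` callback (or `.fail()` on the returned jqXHR) with `xhr.status === 0`, as opposed to an HTTP status like 404. The outcome routing I'm after can be sketched without jQuery like this (`routeCheckResult` is a hypothetical name, pulled out of the callbacks so it is self-contained):

```javascript
// Minimal sketch of routing the three outcomes that matter here:
// a found check-file, a missing one (404), and a request blocked by the
// browser before it leaves (status 0, as with Chrome mixed-content blocking).
function routeCheckResult(status, result) {
  if (status === 0) return "skip";      // blocked before leaving the browser
  if (status !== 200) return "skip";    // localhost has no video_check.php
  return result == 1 ? "play" : "skip"; // check-file answered: 1 means video exists
}
```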
before: function(self) {
    var myself = this;
    var data = self.slides[self.nextSlide - 1].data;
    var html = myself.getHtml(data);
    $('#module_' + self.moduleId + '-slide_' + self.slideToCreate).html(html);

    // This is the fullscreen-always version of the video template
    var fullscreen = 'on';
    //console.log('runnin beforeSlide method for a video template');

    var videoCallStringBase = "http://localhost/biztv_local/video/video_check.php?"; // to call the mediaplayer's localhost
    var videoContent = 'video=' + data['filename_machine'] + '&fullscreen=' + fullscreen;
    var videoCallString = videoCallStringBase + videoContent;

    // TODO: works when video_check.php is found, but if it isn't, it will wait for a video to play. It should skip then as well...
    // UPDATE: Isn't this fixed already? Debug once env. is set up
    console.log('checking for ' + videoCallString);

    jQuery.ajax({
        url: videoCallString,
        success: function(result) {
            // ...if it isn't 1, we can't play back the video, so skip to the next slide
            if (result != 1) {
                console.log('found no video_check on localhost so skip slide ' + self.nextSlide);
                self.skip();
            } else {
                // success, proceed as normal
                self.beforeComplete();
            }
        },
        complete: function(xhr, data) {
            if (xhr.status != 200) {
                // we could not find the check-video file on localhost, so skip to the next slide
                console.log('found no video_check on localhost so skip slide ' + self.nextSlide);
                self.skip();
            } else {
                // success, proceed as normal
                self.beforeComplete();
            }
        }, // the above could cause a double slide-skip, I think. Removed for now; that should be trapped by the fail clause anyway.
        async: true
    });
}
I am working on a simple Firefox extension that tracks the URL requested and calls a web service in the background which detects whether the URL is suspicious or not. Based on the result returned by the service, the extension decides whether to stop the page load and alert the user about possible forgery; if the user still wishes to visit the page, he can be redirected to the page he originally requested.
I have added an http-on-modify-request observer:
var observerService = Components.classes["@mozilla.org/observer-service;1"]
    .getService(Components.interfaces.nsIObserverService);
observerService.addObserver(requestObserverListener.observe, "http-on-modify-request", false);
and the observer
var requestObserverListener = {
    observe: function(subject, topic, data) {
        //alert("Inside observe");
        if (topic == "http-on-modify-request") {
            subject.QueryInterface(Components.interfaces.nsIHttpChannel);
            var url = subject.URI.spec; // url being requested. you might want this for something else
            //alert("inside modify request");
            var urlbarvalue = document.getElementById("urlbar").value;
            urlbarvalue = processUrl(urlbarvalue, url);
            //alert("url bar: " + urlbarvalue);
            //alert("url: " + url);
            document.getElementById("urlbar").style.backgroundColor = "white";
            if (urlbarvalue == url && url != "") {
                var browser = getWindowForRequest(subject);
                if (browser != null) {
                    //alert("" + browser.contentDocument.body.innerHTML);
                    alert("inside browser: " + url);
                    getXmlHttpRequest(url);
                }
            }
        }
    },
};
So when the URL in the URL bar and the requested URL match, the REST service is called through Ajax via the getXmlHttpRequest(url) method.
Now when I run this extension, the call is made to the service, but the page gets loaded before the service returns any response. This is not appropriate, because the user might enter his credentials in the meantime and be compromised.
I want to first display a warning message in the browser tab, and if the user still wants to visit the page, he can then be redirected to it by clicking a link in the warning message window.
I haven't tried this code out, so I'm not sure that suspend and resume will work well, but here's what I would try. You're working with an nsIRequest object as your subject, so you can call subject.suspend() on it. From there, use callbacks from your XHR call to either cancel() or resume() the nsIRequest.
Here's the relevant (untested) snippet of code. My XHR assumes some kind of promise .then() return, but hopefully you understand the intention:
if (urlbarvalue == url && url != "") {
    var browser = getWindowForRequest(subject);
    if (browser != null) {
        // suspend the pending request
        subject.suspend();
        getXmlHttpRequest(url).then(
            function success() { subject.resume(); },
            function failure() { subject.cancel(Components.results.NS_BINDING_ABORTED); });
    }
}
Just some fair warning that you actually don't want to implement an add-on in this way.
It's going to be extremely slow to do a remote call for every HTTP request. The Safe Browsing module makes a single call to download a database of sites considered 'unsafe'; it can then quickly check the requested page against that database, so it doesn't have to make an individual call every time.
Here's some more info on this kind of intercepting worth reading: https://developer.mozilla.org/en-US/docs/XUL/School_tutorial/Intercepting_Page_Loads#HTTP_Observers
Also, I'd worry that your XHR request will actually loop: XHR calls themselves create an http-on-modify-request event, so your code might end up checking its own XHR request before it can check the current URL. You probably want a safety check for your URL-checking domain.
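One way to break that loop, assuming the checking service lives on a known host (`SERVICE_HOST` below is a placeholder, not taken from your code), is to bail out of the observer for requests aimed at the service itself:

```javascript
// Hypothetical guard against the observer re-entering on its own service
// calls: skip empty URLs and anything addressed to the checking service.
var SERVICE_HOST = "check.example.com"; // placeholder for the REST service's host

function shouldCheck(url) {
  if (url === "") return false;
  // crude substring match; a real implementation would parse the URL properly
  return url.indexOf("://" + SERVICE_HOST) === -1;
}
```

In the observer, `if (!shouldCheck(url)) return;` before calling getXmlHttpRequest(url) would keep the extension from checking its own traffic.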
And here's another stackoverflow similar question to yours that might be useful: How to block HTTP request on a particular tab?
Good luck!
I'm working on an application that allows the user to send a file using a form (a POST request), and that executes a series of GET requests while that file is being uploaded to gather information about the state of the upload.
It works fine in IE and Firefox, but not so much in Chrome and Safari.
The problem is that even though send() is called on the XMLHttpRequest object, nothing is being requested as can be seen in Fiddler.
To be more specific, an event handler is placed on the "submit" event of the form, which schedules a timeout call on the window:
window.setTimeout(startPolling, 10);
In this startPolling function, a sequence is started that keeps firing GET requests to receive status updates from a web service that returns text/json, which is used to update the UI.
Is this a limitation (perhaps security-wise?) on WebKit based browsers? Is this a Chrome bug? (I'm seeing the same behaviour in Safari though).
I am having the exact same problem. At the moment I use an iframe, which is targeted by the form. That allows the XHR requests to be executed while posting. While that does work, it doesn't degrade gracefully if someone disables JavaScript (I couldn't load the next page outside the iframe without JS). So if someone has a nicer solution, I would be grateful to hear it.
Here the jQuery script for reference:
$(function() {
    $('form[enctype="multipart/form-data"]').submit(function() {
        // Prevent multiple submits
        if ($.data(this, 'submitted')) return false;

        var freq = 500; // frequency of updates in ms
        var progress_url = '{% url checker_progress %}'; // ajax view serving progress info

        $("#progressbar").progressbar({
            value: 0
        });
        $("#progress").overlay({
            top: 272,
            api: true,
            closeOnClick: false,
            closeOnEsc: false,
            expose: {
                color: '#333',
                loadSpeed: 1000,
                opacity: 0.9
            }
        }).load();

        // Update the progress bar
        function update_progress_info() {
            $.getJSON(progress_url, {}, function(data, status) {
                if (data) {
                    var progress = parseInt(data.progress, 10);
                    $('#progressbar div').animate({ width: progress + '%' }, { queue: false, duration: 500, easing: "swing" });
                }
                window.setTimeout(update_progress_info, freq);
            });
        }

        window.setTimeout(update_progress_info, freq);
        $.data(this, 'submitted', true); // mark form as submitted.
    });
});