I'm really curious whether someone can explain the internal workings of Excel's add-in caching of JavaScript functions. I'm running a Flask app on my own internal website behind an SSL cert. My add-in pulls in this functionfile.html and makes an AJAX call back to my site:
<script>
    // After the Office library loads, it looks for Office.initialize,
    // which must be assigned a function. It doesn't have to do anything, though.
    Office.initialize = function (reason) {
        $(document).ready(function () {
            registerBindings();
        });
    };

    function getData() {
        return Excel.run(function (context) {
            // request bindings
            var state_nbr = context.workbook.bindings.getItem("state_nbr").getRange().load("values");
            // and the rest for the ajax call
            return context.sync().then(function () {
                $.ajax({
                    type: "GET",
                    url: "https://example.com/excel/function",
                    data: {
                        "state_nbr": state_nbr.values[0][0]
                    }
                }).done(function (data) {
                    context.workbook.bindings.getItem("test").getRange().values = [["done"]];
                    return context.sync();
                }).fail(function (result) {
                    context.workbook.bindings.getItem("test").getRange().values = [["fail"]];
                    return context.sync();
                });
            });
        });
    }
</script>
When I click my button, I can see the request with the right payload going to example.com/excel/function, which is a Flask route that pumps out a bunch of CLI junk (hundreds of logging commands).
What gets weird, though, is that after that first click, every time I click the button I don't get any new AJAX requests; I only get a request for functionfile.html. But SheetA1 still pops up "done".
I thought this was just the results being cached, but even with Flask running in debug mode, if I change functionfile.html, say [["done"]] to [["finished"]], no new AJAX call is detected in my logs. BUT THE SHEET UPDATES?!
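If browser or proxy caching of the AJAX response is suspected (an assumption; the question doesn't pin down where the caching happens), a common first diagnostic is to make every request URL unique with a cache-busting parameter, which is essentially what jQuery's `cache: false` option does for `$.ajax`:

```javascript
// Illustrative helper: append a unique query parameter so each request
// bypasses any HTTP cache sitting between the add-in and the Flask route.
function bustCache(url) {
    var sep = url.indexOf("?") === -1 ? "?" : "&";
    return url + sep + "_=" + Date.now();
}

// e.g. $.ajax({ type: "GET", url: bustCache("https://example.com/excel/function"), ... })
```

If the sheet still updates without any request reaching Flask, the caching is happening somewhere other than HTTP.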
As a little side project I created a Django-based web application. So far I have created my webpage with basic JavaScript. I can successfully get data from the database and create an AJAX POST request via JavaScript. Everything is working, BUT there is something that really bothers me:
Every second POST request takes significantly longer to reach the server. For example: request 1 returns successfully after 29ms, while request 2 needs (for the exact same task!) over 300ms. This goes on and is 100% reproducible. I have no idea what the reason for this issue is. I hope someone can guess at the root of the problem.
Image of "request-waterfall" from the developer tool.
Used Code:
//THIS IS THE POST REQUEST IN THE JS-SCRIPT (CLIENT-SIDE)
$.ajax({
    type: "POST",
    url: '/update_database',
    data: {
        "habit_name": habits[h].habit_name,
        "meta_data": meta_data
    },
    success: function(result) {
        update();
        console.log('Successful');
    }
});
Server-side handling of the POST request:
import six
from django.http import JsonResponse

# the two app-specific imports below are assumptions, added for completeness
from .models import Habit
from .utils import change_meta_data

def update_database(request):
    post_dict = dict(six.iterlists(request.POST))
    habit_name = post_dict["habit_name"][0]
    meta_data = post_dict["meta_data"][0]
    change_meta_data(habit_name, meta_data)
    data = {
        "Habits": Habit.objects.all().values()
    }
    return JsonResponse(list(data["Habits"]), safe=False)
Update: The problem only occurs when I launch the server on localhost. If I run the server on the local IP address, it works fine.
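One plausible explanation for the localhost-only slowdown (an assumption, not something the post confirms) is that `localhost` resolves to the IPv6 loopback `::1` first while the dev server listens only on IPv4, so some requests pay a fallback delay. A quick way to test this theory is to target the IPv4 loopback directly; a tiny illustrative helper:

```javascript
// Illustrative: rewrite "localhost" in a URL to the IPv4 loopback so the
// browser skips any IPv6-first name resolution. The port is an example.
function forceIPv4Loopback(url) {
    return url.replace("//localhost", "//127.0.0.1");
}

// e.g. $.ajax({ type: "POST", url: forceIPv4Loopback("http://localhost:8000/update_database"), ... })
```

If the alternating delay disappears with `127.0.0.1`, the name-resolution theory holds.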
I've built a JavaScript application which has a graph with some dropdown filters that users are able to change. The dropdowns all have event listeners which submit a server request to get the data (via a jQuery AJAX call) and then graph the data. The issue is if the user uses the keyboard to quickly go through many different elements of the dropdown.
The server call takes roughly a second, so if they quickly scroll through, say, 20 options, this can lead to a buildup: 20 different requests to the server are created, and there's not even a guarantee that the last success callback to execute will correspond to the most current filter.
So my question is: what is the best way, when a filter is changed, to kill all other asynchronous processes? Here's the relevant code:
$("#filter").change(function(d) {
    getData();
});
function getData() {
    ...
    $.ajax({
        url: myUrl,
        type: "GET",
        dataType: "json",
        success: function(d) {
            [do some stuff]
        } //end of success function
    }); //end of ajax
} //end of getData
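Independently of aborting or queueing, a framework-agnostic safeguard (a sketch, not from the question) is to tag each request with a token and ignore completions that are no longer the latest, so a slow early response can never overwrite a newer one:

```javascript
// Sketch: "latest wins" guard. Each request takes a token; a response is
// applied only if its token is still the most recent one issued.
var latestToken = 0;

function nextToken() {
    return ++latestToken;
}

function applyIfCurrent(token, apply) {
    if (token === latestToken) {
        apply();
    }
}
```

In practice you would capture `var token = nextToken();` just before firing `$.ajax` and wrap the graphing code in the success callback with `applyIfCurrent(token, ...)`.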
Save all the AJAX calls you wish to abort into some variable; then you can abort the last call.
This is a very common practice when an AJAX call might fire many times before the previous ones have had a chance to finish.
var $filter = $("#filter");

function getData() {
    // abort the last request before creating a new one
    $filter.data('REQ') && $filter.data('REQ').abort();
    return $.ajax({
        url: myUrl,
        type: "GET",
        dataType: "json",
        success: function(d) {
            [do some stuff]
        }
    });
}

$filter.on('change', function(d) {
    $filter.data('REQ', getData());
});
Of course this is a very simplified piece of code, and you should rewrite it in a more structured way, but it gives you an idea of how to cache the last AJAX request so you have the power to abort it before sending a new call.
By the way, your title has nothing to do with the question. You are asking how to handle a sequence of AJAX calls, but the title asks about events. This question has nothing to do with events, so you should change the title to fit the problem.
Update regarding what #t.niese said in the comments:
Throttling the requests on the client side is also good practice, since the server cannot really know whether the client has aborted the request, and a resource-demanding request should be throttled.
BUT, I would suggest throttling the requests on the server side and not only on the client side, if possible, because client-side throttling can be bypassed and is not 100% reliable, and it "costs" the same amount of time to do it on the server side.
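For completeness, a minimal client-side throttle sketch (illustrative only; as noted above, server-side throttling should back it up):

```javascript
// Sketch: allow at most one call per `ms` window; calls arriving inside
// the window are simply dropped.
function throttle(fn, ms) {
    var last = 0;
    return function () {
        var now = Date.now();
        if (now - last >= ms) {
            last = now;
            return fn.apply(this, arguments);
        }
    };
}
```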
Can an HTTP server detect that a client has cancelled its request?
I would drop all attempts to initiate a new request by calling getData while a current request is running, and only send the last getData attempt once the current request has finished. This ensures the server load won't become unnecessarily high, because only one request runs at a time.
var currentRequest;
var resubmitRequest;

function getData() {
    // only start a new request if no current request is running
    if (!currentRequest) {
        resubmitRequest = false;
        currentRequest = Promise.resolve($.ajax({
            url: myUrl,
            type: "GET",
            dataType: "json"
        })); //end of ajax
        currentRequest
            .then((d) => {
                [do some stuff]
            })
            .finally(() => {
                // clear the slot so the next call can start a fresh request
                currentRequest = undefined;
                // if another attempt to request data happened while this
                // request was running, call getData again
                if (resubmitRequest) {
                    getData();
                }
            });
    } else {
        // remember that the data has to be requested again
        // after the currently running request finishes
        resubmitRequest = true;
    }
    return currentRequest;
} //end of getData
You can simply debounce it: pause for, say, 500 milliseconds, then run only the 'last' change and do the AJAX call:
$("#filter").change(function(d) {
    // do not run a new AJAX call if the last one hasn't finished
    if ($.active > 0) return false;
    // if a new 'change' event is raised in less than 500 ms,
    // clear the pending timer and run only the 'last' one
    clearTimeout(window.timer);
    window.timer = setTimeout(function() {
        getData();
    }, 500);
});
In case the user changes the filter while an AJAX call is already running, the change is simply ignored (return false). :)
With a little effort you can code your own handler for these cases:
function handler(ms, fn) {
    var eventId;
    return function () {
        // if there is a pending call, kill it
        clearTimeout(eventId);
        // and schedule a new one (replacing the old)
        eventId = setTimeout(fn, ms);
        // if no new event arrives within `ms`, setTimeout runs the function
    };
}
// execute getData if no event is fired in a 1000ms interval (the user stopped typing)
// getData is the same that you have
var fn = handler(1000, getData);
$("#filter").change(fn);
I saw that it's possible to do AJAX requests inside a web worker, but I want to do the AJAX call via jQuery (outside the worker, of course) and, after that, pass the result of the callback to the worker.
I made some tests and this works, but I want to know if there's anything wrong with it (memory leaks, incompatibility, instability):
$.ajax({
    type: 'GET',
    url: 'http://192.168.0.2/json.php',
    data: requestDataObj,
    dataType: 'json'
}).success(function(jsonResult) {
    var jSonWorker = new Worker('http://localhost/webmail/responsive/js/workers.js');
    jSonWorker.postMessage(jsonResult);
    jSonWorker.onmessage = function(event) {
        alert(event.data);
    };
});
As you can see, I pass the jsonResult to the worker, where I do something with it and post a message back to the main thread. Anything wrong with this?
The only problem I see is that you're assuming every browser supports window.Worker, which is not the case.
If blocking the application is feasible (say the computation you're expecting the worker to do is light), you could paste the entire code of the worker inside the AJAX callback [1].
Thus
...success(function (res) {
if (window.Worker) {
var worker = new Worker(...);
worker.onmessage = function (e) { /* ... */ };
worker.postMessage(res);
}
else {
// Write here the same code the worker
// is supposed to execute "on" res.
}
});
Of course, you will lose the performance improvement you gained from having two threads.
[1] As done here by @afshinm.
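The feature-detect branch above can be folded into a small reusable wrapper; this is a sketch (the names are illustrative), where `process` is the synchronous fallback used when `Worker` is unavailable:

```javascript
// Sketch: run `payload` through a worker when supported, otherwise run
// the same computation inline; both paths deliver the result to `done`.
function runInWorkerOrInline(workerUrl, payload, process, done) {
    if (typeof Worker !== "undefined") {
        var w = new Worker(workerUrl);
        w.onmessage = function (e) { done(e.data); };
        w.postMessage(payload);
    } else {
        // no Worker support: block briefly and compute inline
        done(process(payload));
    }
}
```

The caller then doesn't need to know which path was taken, as long as the worker script and `process` implement the same computation.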
I have a Pyramid application that runs perfectly on a local server, but when I move it over to a web server (Dreamhost), I get the following error:
400 Bad Request:
Bad request (GET and HEAD requests may not contain a request body)
The code in question is the following AJAX call in JavaScript:
function summary_ajax(sName) {
    $.ajax({
        type: "POST",
        url: "summary",
        dataType: "json",
        data: {
            'ccg_name': sName,
        },
        async: false,
        success: function(data) {
            //alert("In ajax success function") <----------- This never executes
            lValues = data.lValues;
            lLabels = data.lLabels;
        },
    });
    return (lValues, lLabels);
}
And is handled in views.py:
@view_config(route_name="ccg_map_summary_ajax", renderer="json")
def ccg_map_summary_ajax(self):
    sCCG = self.request.POST.get('ccg_name')
    fData = open('pyramidapp/static/view_specific_js/ajax_summary_data.js')
    dData = json.load(fData)
    lLabels = dData[sCCG].keys()
    lValues = dData[sCCG].values()
    return {
        'lLabels': lLabels,
        'lValues': lValues,
    }
I did some testing by placing alert() calls (it's slow, because the server only reloads the script every few minutes), and everything executes fine except for the alerts in the AJAX success callback. So it seems that either the POST fails or something goes wrong in the view. Any ideas?
So is there something in this code that works on my local server (in Pyramid) but breaks down on the web server (Dreamhost)?
The file structure is the same on the local and web server. I don't see why it shouldn't, but will fData still open the file for reading?
For anyone else out there, I found the problem:
The path I specified above was a relative path that worked on my system but not on the server, because the working directories are obviously different. So instead of using a relative path, I changed the script to use the correct absolute path.
To find the current working directory, just enter pwd in the terminal.
This question might seem a bit odd; the problem arose when the page went through web tests.
The page uses an AJAX call (async set to true) to gather some data. For some reason it won't swap pages before the AJAX call has returned. Consider the following code:
console.log("firing ajax call");
$.ajax({
    type: "POST",
    url: "requestedService",
    data: { mode: "requestedMethod" },
    cache: false,
    dataType: "json",
    success: function() { console.log("ajax response received"); },
    error: null,
    complete: null
});
console.log("changing window location");
window.location = "http://www.google.com"
The location only changes after AJAX returns the response. I have tested the call; it is in fact asynchronous, and the page isn't blocked. It should load the new page even if the AJAX call hasn't completed, but it doesn't. I can see the page trying to load, but it only happens once I get the response. Any ideas?
The console output is:
firing ajax call
changing window location
ajax response received
This seems to work fine for me. The location is changed before the code in the async handler executes. Maybe you should post some real code and not a simplified version, so that we can help better.
Here is a demonstration that works as you expect: http://jsfiddle.net/BSg9P/
$(document).ready(function() {
var result;
$("#btn").on('click', function(sender, args) {
setInterval(function() {
result = "some result";
console.log("Just returned a result");
}, 5000);
window.location = "http://www.google.com";
});
});
And here is a screenshot of the result: http://screencast.com/t/VbxMCxxyIbB
I have clicked the button 2 times, and you can see in the JS console that the message about the location change is printed before the result each time. (The error is related to CORS, if it was the same domain, it would navigate).
Bit late, but maybe someone else will have the same issue.
This answer by @todd-menier might help: https://stackoverflow.com/questions/941889#answer-970843
So the issue might be server-side. For example, if you're using PHP sessions, the user's session is locked by default while the server processes the AJAX request, so the next request (for the new page) can't be processed until the AJAX handler completes and releases the lock. You can release the lock early (e.g. with session_write_close()) if your AJAX code no longer needs the session, so the next page load can happen simultaneously.