Web Workers creation inside Ajax callback - javascript

I saw that it's possible to do Ajax requests inside a web worker, but I want to do the Ajax call via jQuery (outside the worker, of course) and then pass the result of the callback to the worker.
I made some tests and this works, but I want to know if there's something wrong with it (memory leaks, incompatibility, instability):
$.ajax({
    type: 'GET',
    url: 'http://192.168.0.2/json.php',
    data: requestDataObj,
    dataType: 'json'
}).success(function (jsonResult) {
    var jSonWorker = new Worker('http://localhost/webmail/responsive/js/workers.js');
    jSonWorker.postMessage(jsonResult);
    jSonWorker.onmessage = function (event) {
        alert(event.data);
    };
});
As you can see, I pass the jsonResult to the worker, where I do something with it and post a message back to the main thread. Anything wrong with this?

The only problem I see is that you're assuming every browser supports window.Worker, which is not the case.
In case blocking the application is feasible — say the computation you're expecting the worker to do is light — you could paste the entire code of the worker inside the ajax callback [1].
Thus
...success(function (res) {
    if (window.Worker) {
        var worker = new Worker(...);
        worker.onmessage = function (e) { /* ... */ };
        worker.postMessage(res);
    } else {
        // Write here the same code the worker
        // is supposed to execute "on" res.
    }
});
Of course, you will lose the performance improvement you would have gained with two threads.
[1] As done here by @afshinm.

Related

Asp.Net Service method Is Not Executing or Calling Inner Method or Statement on BeforeUnload event

I have a beforeunload event in JS which hits the .asmx service method provided below.
.js event
$(window).on("beforeunload", function () {
    var d, str;
    str = '{Id:"' + $('#hdnId').val() + '"}';
    d = str;
    $.ajax({
        type: "POST", // GET or POST or PUT or DELETE verb
        url: "../POC.asmx/fUpdateTimeInterval",
        data: d,
        contentType: "application/json; charset=utf-8",
        dataType: "json", // Expected data format from server
        async: true,
        beforeSend: function () {
            // BlockUI();
        },
        success: function (data, Type, xhr) { // On successful service call
        },
        error: function (XMLHttpRequest, textStatus, errorThrown) {
            alert(errorThrown);
        },
        complete: function () {
        },
        failure: function () {
        }
    });
});
.asmx (Web Service)
[WebMethod(true)]
public int fUpdateTimeInterval(String Id)
{
    return new MY_POC.DLL.Survey().fUpdateTimeInterval(Id);
}
The above service will then call the below mentioned method defined in DLL class file.
public int fUpdateTimeInterval(Int32 Id)
{
    List<SqlParameter> objParam = new List<SqlParameter>()
    {
        new SqlParameter { ParameterName = "@Id", Direction = ParameterDirection.Input, DbType = DbType.Int32, Value = Id },
    };
    MY_POC.DLL.SqlHelper.ExecuteNonQuery("MY_UpdateTimeInterval", System.Data.CommandType.StoredProcedure, objParam.ToArray());
    return 0;
}
Now the problem is: when the page loads in the browser for the first time, I get the current auto ID of the inserted row. If I refresh the browser, the beforeunload event fires and updates the row for the received ID. But if I close the tab or the browser, execution hits the service method and stops right after the opening brace; it does not execute further and does not even show any error.
After execution stops, I get the following message in the VS Output window, but no error is shown.
It sounds like execution of the request is being aborted because the browser is closing the connection.
You should consider using the Beacon API. It's well supported by almost all browsers and it's made for this purpose. From Mozilla's documentation:
The main use case for the Beacon API is to send analytics such as client-side events or session data to the server. Historically, websites have used XMLHttpRequest for this, but browsers do not guarantee to send these asynchronous requests in some circumstances (for example, if the page is about to be unloaded). To combat this, websites have resorted to various techniques, such as making the request synchronous, that have a bad effect on responsiveness. Because beacon requests are both asynchronous and guaranteed to be sent, they combine good performance characteristics and reliability.
You can also make your Ajax request synchronous to prevent the connection from closing but that will have an impact on your GUI as it will block until the request completes.
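For illustration, a rough sketch of the beacon approach could look like the following. The URL and payload mirror the code above; whether the .asmx method can read a beacon body as-is depends on your service, so treat this as a starting point rather than a drop-in replacement:
$(window).on("beforeunload", function () {
    // Build the same payload as before; sendBeacon accepts strings, Blobs and FormData.
    var payload = new Blob(
        [JSON.stringify({ Id: $('#hdnId').val() })],
        { type: "application/json; charset=utf-8" }
    );
    // The browser queues the request and lets the page unload immediately;
    // sendBeacon returns false if it could not queue the data.
    navigator.sendBeacon("../POC.asmx/fUpdateTimeInterval", payload);
});
Listening on pagehide instead of beforeunload tends to be more reliable on mobile browsers as well.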

OfficeJS Caching

I'm really curious if someone can better explain the internal workings of Excel's add-in caching of JavaScript functions. I'm running a Flask app on my own internal website behind an SSL cert. My add-in pulls in this functionfile.html and makes an Ajax call back to my site:
<script>
    // After the Office library loads, it will look for Office.initialize,
    // which must be assigned a function. It doesn't have to do anything though.
    Office.initialize = function (reason) {
        $(document).ready(function () {
            registerBindings();
        });
    };

    function getData() {
        return Excel.run(function (context) {
            // request bindings
            var state_nbr = context.workbook.bindings.getItem("state_nbr").getRange().load("values");
            // and the rest for the ajax call
            return context.sync().then(function () {
                $.ajax({
                    type: "GET",
                    url: "https://example.com/excel/function",
                    data: {
                        "state_nbr": state_nbr.values[0][0]
                    }
                }).done(function (data) {
                    context.workbook.bindings.getItem("test").getRange().values = [["done"]];
                    return context.sync();
                }).fail(function (result) {
                    context.workbook.bindings.getItem("test").getRange().values = [["fail"]];
                    return context.sync();
                });
            });
        });
    }
</script>
When I click my button, I can see the request with the right payload going to example.com/excel/function, which is a Flask route that dumps a bunch of junk to the CLI (hundreds of logging statements).
What gets weird, though, is that after that first click, every time I click the button I don't see any new Ajax requests; I only get a request for functionfile.html. But SheetA1 still pops up "done".
I thought this was just storing the results in a cache, but even with Flask running in debug mode, if I change functionfile.html, say [["done"]] to [["finished"]], no new Ajax call is detected in my logs. BUT THE SHEET UPDATES?!

Kill/terminate ajax request if a new one is fired

I've built a JavaScript application which has a graph with some dropdown filters that users can change. The dropdowns all have event listeners which submit a server request to get the data (via a jQuery Ajax call) and then graph the data. The issue arises if the user uses the keyboard to quickly go through many different elements of the dropdown.
The server call takes roughly a second, so if they quickly scroll through, say, 20 options, this can lead to a buildup: 20 different requests to the server are created, and there's not even a guarantee that the last success callback to execute corresponds to the most current filter.
So my question is: when a filter is changed, what is the best way to kill all the other asynchronous requests? Here's the relevant code:
$("#filter").change(function(d) {
getData();
} //end of if
});
function getData() {
    ...
    $.ajax({
        url: myUrl,
        type: "GET",
        dataType: "json",
        success: function (d) {
            [do some stuff]
        } //end of success function
    }); //end of ajax
} //end of getData
Save the Ajax call you may wish to abort into some variable; then you can abort the last call before firing a new one.
This is a very common practice when an Ajax call might fire many times before the previous calls have had a chance to finish.
function getData() {
    $filter.data('REQ') && $filter.data('REQ').abort(); // abort the last request before creating a new one
    return $.ajax({
        url: myUrl,
        type: "GET",
        dataType: "json",
        success: function (d) {
            [do some stuff]
        }
    });
}
var $filter = $("#filter");
$filter.on('change', function (d) {
    $filter.data('REQ', getData());
});
Of course this is very simplified code and you should rewrite it in a more structured way, but it gives you an idea of how to cache the last Ajax request so you have the power to abort it before sending a new call.
By the way, your title has nothing to do with the question. You are asking how to handle a sequence of Ajax calls, but the title talks about events. This question has nothing to do with events, so you should change the title to fit the problem.
Update regarding what @t.niese said in the comments:
Throttling the requests on the client side is also a good thing to do, since the server cannot really know whether the client has aborted the request, and if it's a resource-demanding request it should be throttled.
BUT, I would suggest throttling the requests on the server side rather than the client side if possible, because client-side throttling can be bypassed and is not 100% reliable, and it "costs" the same amount of time to do it on the server side.
Can a http server detect that a client has cancelled their request?
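If you also want a simple first line of defense on the client, a minimal throttle sketch could look like this (MIN_INTERVAL_MS is an arbitrary placeholder and getData is the function from the question):
var MIN_INTERVAL_MS = 1000;
var lastRequestTime = 0;

$("#filter").on("change", function () {
    var now = Date.now();
    // ignore changes that arrive too soon after the previous request was sent
    if (now - lastRequestTime < MIN_INTERVAL_MS) {
        return;
    }
    lastRequestTime = now;
    getData();
});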
I would drop all attempts to initiate a new request via getData while a current request is running, and only send the last getData attempt as soon as the current request has finished. This ensures that the server load won't become unnecessarily high, because only one request will run at a time.
var currentRequest;
var resubmitRequest;

function getData() {
    // only start a new request if no current request is running
    if (!currentRequest) {
        resubmitRequest = false;
        currentRequest = Promise.resolve($.ajax({
            url: myUrl,
            type: "GET",
            dataType: "json"
        })); //end of ajax
        currentRequest
            .then((d) => {
                [do some stuff]
            })
            .finally(() => {
                // forget the finished request so a new one can be started, and
                // if another attempt to request data happened while this
                // request was running then call getData again
                currentRequest = undefined;
                if (resubmitRequest) {
                    getData();
                }
            });
    } else {
        // store the information that the data has to be requested
        // another time after the currently running request has finished
        resubmitRequest = true;
    }
    return currentRequest;
} //end of getData
You can simply debounce it, for example by 500 milliseconds, and then run only the 'last' change and do that Ajax call:
$("#filter").change(function(d) {
if( $.active > 0 ) return false;
// do not run new ajax if last one is not finished
clearTimeout( window.timer );
// if new 'change' evt is raised in less then 500 mils
// then clear it and run 'last' one
window.timer = setTimeout( function(){
getData();
}, 500 );
});
In case the user changes it while an Ajax call is still running, it just returns false :)
With little effort you can code your own handler for these cases:
function handler(ms, fn) {
    var eventId;
    return function () {
        // if there is an event programmed, kill it
        clearTimeout(eventId);
        // and program the new one (replace the event)
        eventId = setTimeout(fn, ms);
        // if no new event is programmed within that time, setTimeout executes the function
    };
}
// execute getData if no event is fired in a 1000ms interval (the user stopped typing)
// getData is the same that you have
var fn = handler(1000, getData);
$("#filter").change(fn);

Ajax file upload returns status code 0 ready state 0 (only sometimes)

I have looked at the following thread
jQuery Ajax - Status Code 0?
However I could not find a definitive answer and I am having serious trouble trying to find the source of my issue so I am posting here in the hopes that someone can point me in the right direction.
In my code I am performing an Angular HTTP post which just sends basic JSON data; then, within the on-success callback, I am using Ajax to upload files to the same server. (I know I shouldn't be mixing jQuery and Angular, however I can't change this for the moment.)
It looks something like this
var deferred = $q.defer();

// first post
$http.post(url, payload, { params: params, headers: headers })
    .then(function (response) {
        uploadFiles(response, deferred);
        // I am also sending google analytics events here
    }, function (error) {
        // do error stuff
    });

return deferred.promise;

// upload files function
function uploadFiles(response, deferred) {
    $.ajax({
        type: 'POST',
        processData: false,
        contentType: false,
        data: data, // this is new FormData() with the files appended to it
        url: 'the-endpoint-for-the-upload',
        dataType: 'json',
        success: function (data) {
            // do success stuff here
            deferred.resolve(data);
        },
        error: function (jqXHR, textStatus, errorThrown) {
            var message = {};
            if (jqXHR.status === 0) {
                message.jqXHRStatusIsZero = "true";
            }
            if (jqXHR.readyState === 0) {
                message.jqXHRReadyStateIsZero = "true";
            }
            if (jqXHR.status === '') {
                message.jqXHRStatusIsEmptyString = "true";
            }
            if (jqXHR.status) {
                message.jqXHRStatus = jqXHR.status;
            }
            if (jqXHR.readyState) {
                message.jqXHRReadyState = jqXHR.readyState;
            }
            if (jqXHR.responseText) {
                message.jqXHR = jqXHR.responseText;
            }
            if (textStatus) {
                message.textStatus = textStatus;
            }
            if (errorThrown) {
                message.errorThrown = errorThrown;
            }
            message.error = 'HTTP file upload failed';
            logError(message);
            deferred.resolve(message);
        }
    });
}
Not my exact code but almost the exact same.
The issue is that it works almost all of the time, but maybe three or four requests in every few hundred fail. By fail I mean the error handler of the file upload function is called and the files are not uploaded.
I get jqXHRStatus 0 and jqXHRReadyState 0 when this occurs.
The only way I am able to replicate the issue is by hitting refresh in the browser while the request is being processed; however, users have advised they are not doing this (although I have yet to confirm this 100%).
Is there perhaps a serious flaw in my code which I am not seeing? Maybe passing the deferred variable around isn't good practice? Or is there another way the Ajax request could be getting cancelled that I am not considering? Could sending Google Analytics events at the same time be interfering?
Any advice would be great and please let me know if you would like more information on the issue.
This means the request has been canceled.
There could be many different reasons for that, but be aware: it could also be due to a browser bug or issue, so I believe (IMHO) there is no way to prevent this kind of issue entirely.
Think, for example, of getting a 503 (Service Unavailable) response. What would you do in such a case? That is also a sporadic, unpredictable issue. Just live with it, and try to repost your data.
Without reinventing the wheel, I suggest you implement:
Retrying ajax calls using the deferred api
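A rough sketch of that idea, assuming a helper called uploadWithRetry with an arbitrary retry count and delay (neither comes from the linked answer):
function uploadWithRetry(ajaxOptions, retriesLeft) {
    var deferred = $.Deferred();

    function attempt(remaining) {
        $.ajax(ajaxOptions)
            .done(deferred.resolve)
            .fail(function (jqXHR, textStatus, errorThrown) {
                if (remaining > 0) {
                    // wait a moment, then try the upload again
                    setTimeout(function () {
                        attempt(remaining - 1);
                    }, 1000);
                } else {
                    deferred.reject(jqXHR, textStatus, errorThrown);
                }
            });
    }

    attempt(retriesLeft);
    return deferred.promise();
}

// e.g. uploadWithRetry({ type: 'POST', url: uploadUrl, data: formData,
//                        processData: false, contentType: false }, 2)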
My guess is that your code is executing before it actually gets back from the call, i.e. the callback runs when nothing has been returned yet and it gives a 0 error. This would make sense, as the error is intermittent: most of the time it returns fine because the backend executed fast enough, but sometimes it doesn't because it took especially long or something else happened, etc. JavaScript doesn't ever REALLY stop execution. It says it does, but especially when passing between Angular and jQuery with multiple Ajax requests, it wouldn't be surprising if the second Ajax call were executing before your Angular post had actually completed. That's why a refresh replicates the error: it clears your variables.
Some things you can do to test this:
On the backend make a timer that goes for a few seconds before it returns anything. This will probably make your code fail more consistently.
Set breakpoints and see when they are being hit and the values they contain in the javascript.
Good luck!

Busy Dialog not showing during server requests

I'm working on a UI5 application. I want to block the UI from users during Ajax requests. I added the following code:
var dialog = new sap.m.BusyDialog({
    text: this.languageModel.getModelProperty("BUSY_DIALOG_FETCHING_DATA"),
    title: this.languageModel.getModelProperty("BUSY_DIALOG_WAIT")
});

var that = this;
jQuery.ajaxSetup({
    beforeSend: function () {
        dialog = dialog ? dialog : new sap.m.BusyDialog({
            text: that.languageModel.getModelProperty("BUSY_DIALOG_FETCHING_DATA"),
            title: that.languageModel.getModelProperty("BUSY_DIALOG_WAIT")
        });
        dialog.open();
    },
    complete: function () {
        console.log("close");
        dialog.close();
    }
});
In the console I get the expected output, and while debugging I can see the dialogs, but in normal mode (without any breakpoints) the busy dialogs don't appear at all. Any clue?
Why not use the busy indicator?
var oController = this;
oController.getView().setBusy(true);
$.ajax({
    // your ajax call here
    success: function (data) {
        // do whatever needs to be done after success
        oController.getView().setBusy(false);
    },
    error: function (...) {
        oController.getView().setBusy(false);
    }
});
You're probably sending some requests synchronously, so the browser isn't able to update the DOM [1] while trying to open the dialog.
When calling low-level APIs (such as jQuery.ajax()):
NEVER set async: false.
NEVER call jQuery.sap.sjax as it's deprecated due to sending sync XHRs only.
Migrate to asynchronous APIs. Follow the best-practices mentioned below:
Is Your Application Ready for Asynchronous Loading?
Use Asynchronous Loading
In case of OData: DO NOT use the deprecated sap.ui.model.odata.ODataModel! It sends many requests with sync XHRs by default. Consider using v2.ODataModel or v4.ODataModel depending on the OData service.
[1] Sending requests synchronously blocks the browser's main thread (which is also the UI thread), resulting in bad UX. Synchronous XHR is also deprecated by the web platform.
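For example, once the request is sent asynchronously (which is jQuery's default), the BusyDialog has a chance to render before the response arrives. A minimal sketch with a placeholder URL:
var oBusyDialog = new sap.m.BusyDialog({
    text: this.languageModel.getModelProperty("BUSY_DIALOG_FETCHING_DATA"),
    title: this.languageModel.getModelProperty("BUSY_DIALOG_WAIT")
});

oBusyDialog.open();
jQuery.ajax({
    url: "/some/endpoint", // placeholder
    dataType: "json"       // async is true by default; do not override it
}).always(function () {
    // close the dialog whether the request succeeded or failed
    oBusyDialog.close();
});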
