This is an interesting problem.
I am doing an asynchronous AJAX PUT:
return $.ajax({
    url: url,
    type: 'PUT',
    contentType: 'application/json; charset=utf-8',
    async: true, // default
    success: function (result, textStatus, xhr) { /* ... */ }
});
This works as expected, unless the user issues another PUT before the previous call returns (even though the call is async, it takes about 0.5 seconds to complete).
If the user presses the button a few times (executing multiple PUTs), the following happens:
I see only one server call in Fiddler;
success gets fired for every click;
all callbacks get the same new row ID (returned by the server).
This leads me to the inevitable conclusion that the first server response triggers all outstanding callbacks.
I could disable the button until the callback returns, but is it possible to handle multiple outstanding calls? Is this a browser limitation? What is the best way to handle this?
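(For reference, the disable-until-done workaround I have in mind looks something like this; the #saveButton selector and the doPut() wrapper around the $.ajax call above are just placeholders:)

$('#saveButton').on('click', function () {
    var $btn = $(this).prop('disabled', true); // block further clicks while the PUT is in flight
    doPut().always(function () {               // doPut() returns the jqXHR from the $.ajax call above
        $btn.prop('disabled', false);          // re-enable once the request settles
    });
});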
UPDATE
As a test I switched to using POST instead of PUT: I changed type: 'POST' on the JS side and [HttpPost] on the Web API (server) side.
The behavior did not change.
UPDATE
Looking at posts like this one, this really should work. I don't see any specific reason why the rest of the concurrent requests are not making it out to the server.
Shouldn't PUT requests be idempotent? That is, submitting the same request multiple times should produce the same response. If so, the code may simply be trying to coalesce your identical PUT requests, since they should all end up with the same result. If you're incrementing some ID for every request (i.e. changing server state), then you should be using POST instead of PUT.
This may not fix your issue; it's just a thought.
You can't wait for an async callback in JavaScript. You have to restructure your code to do all further work based on the async response, from inside the actual callback.
If you need to make multiple consecutive ajax calls, issue the first one, and in the success (or response) handler for the first call issue the second one; then, in the response handler for the second, carry out whatever you want to do with the data.
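A minimal sketch of that chaining pattern (the /api/first and /api/second endpoints are hypothetical):

$.ajax({
    url: '/api/first',
    type: 'GET'
}).done(function (firstResult) {
    // Only issue the second request once the first has responded
    $.ajax({
        url: '/api/second',
        type: 'GET',
        data: { id: firstResult.id } // assumes the first response carries an id
    }).done(function (secondResult) {
        // Do whatever you want with the final data here
        console.log(secondResult);
    });
});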
Related
I have two script tags on the page, each containing a document.ready(), and each making an ajax call to a page method.
The first one loads the values into the select list. The second one loads the tree into the DOM.
<script>
    $(document).ready(function() {
        $.ajax({
            url: 'PageMethods.aspx/GetTop50',
            async: true,
            success: function(data) {
                // loads the values into the select list
            }
            // rest of stuff...
        });
    });
</script>
<script>
    $(document).ready(function() {
        $.ajax({
            url: 'Default.aspx/GetTree',
            async: true,
            success: function(data) {
                // loads the tree into the DOM
            }
            // rest of stuff...
        });
    });
</script>
Why does my GetTree page method keep executing only AFTER the success callback of GetTop50? I set a breakpoint in the GetTree method server-side, and it is only hit AFTER the select list is loaded.
The client will start both ajax calls one after the other so that they are both "in-flight" at the same time. Which one completes first is then up to the server; it depends on how the server is configured (whether it processes multiple requests at once) and on what each request is doing.
If your server only handles one request at a time, or if it blocks on some shared resource such as a database, then it will likely return the result of the first request it received before returning the result of the second - though that's only likely, certainly not guaranteed. For example, if the first request pretty much always takes longer to process than the second and they aren't both contending for the same shared resource, then the second request might finish first and return its results first.
It is also possible that the requests will return in a random order such that sometimes one will return first and sometimes the other will return first. As I said earlier, it all depends upon how the server processes each request and which one it finishes first.
Also, keep in mind that client-side JS is single threaded so when you are sitting at a client-side JS breakpoint somewhere, other JS cannot run until the current JS thread of execution has finished. As for the behavior in breakpoints on the server-side, that all depends upon what the server execution environment is and how it works during a breakpoint.
If you want to debug timing-related things, a breakpoint is NOT a reliable way to test things because hitting the breakpoint itself can very easily affect the behavior. Instead, you should use logging with accurate timestamps to study the exact sequence of events.
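For example, a minimal sketch of timestamped logging on the client side, reusing the two page-method URLs from the question:

// Log each event with a millisecond timestamp to see the real ordering
function log(msg) {
    console.log(Date.now() + ' ms: ' + msg);
}

log('starting GetTop50');
$.ajax({ url: 'PageMethods.aspx/GetTop50' }).always(function () {
    log('GetTop50 finished');
});

log('starting GetTree');
$.ajax({ url: 'Default.aspx/GetTree' }).always(function () {
    log('GetTree finished');
});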
I want to thank everyone for the input, especially jfriend00, and post the exact solution to my problem.
The problem was that Default.aspx accesses Session and had the EnableSessionState="True" page directive.
When this directive is set, requests from the same session are processed sequentially on the server side.
I solved it by moving my method to another page that doesn't use session state.
I'm having an issue with AngularJS 1.2 POST requests using the $http service.
On a button click, I trigger a POST request like so:
$http.post('my/url').success(function(responseText) {
    // Do something
    console.log(responseText);
});
My problem is, if I click twice on the button and the first callback hasn't fired yet, only one HTTP request is issued, but my callback is fired twice (with the same data as a parameter).
If I explicitly add cache: false:
$http.post('my/url', {cache: false}).success(function(responseText) {
    // Do something
    console.log(responseText);
});
Then two requests are issued as I expected.
It looks like a bug to me. Why would anybody want to cache a POST request? Moreover, the AngularJS documentation specifies that we have to pass cache: true if we want to activate the cache for GET requests, which suggests that it is inactive by default and not active at all for POST requests (which would make sense to me).
Is there a way to deactivate the cache once and for all, for every request, on the 1.2 branch? I didn't find one.
EDIT
I misused $http.post. The second parameter is not the options map, but rather the data to send to the server. According to the documentation, this parameter is mandatory. So I can execute this:
$http.post('my/url', {'anything': 'anything'}).success(function(responseText) {
    // Do something
    console.log(responseText);
});
And it works as expected.
So the real question is: why is the data parameter mandatory for a $http.post call to work properly? I feel like I am missing something about HTTP. I already have everything I need in the URL (something like company/company_id/employee) and I don't need any additional data.
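For what it's worth, the config object is the third argument of the $http.post shortcut, so a couple of sketches that avoid sending meaningful data (same placeholder URL as above):

// Shortcut form: $http.post(url, data, config) - pass null (or {}) as the data
$http.post('my/url', null, {cache: false}).success(function(responseText) {
    console.log(responseText);
});

// Longhand form: no data property required at all
$http({method: 'POST', url: 'my/url'}).success(function(responseText) {
    console.log(responseText);
});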
I'm having a really strange issue interacting with a RESTful JSON interface and can't seem to get anywhere diagnosing it.
Using jQuery, I send a DELETE request to the JSON API and receive a 204 response. The success method is called. Everything is happy.
$.ajax({
    type: "DELETE",
    url: "/content_blocks/" + $(this).parent().attr('id'),
    dataType: "json",
    success: deleteItem($(this).parent())
});
From Firebug console:
> DELETE http://localhost:3000/content_blocks/228 204 No Content 92ms
Now I do the exact same thing with another item and get this:
> DELETE http://localhost:3000/content_blocks/231 (spinning circle loading forever)
The server sends the same response, jQuery fires the success callback. But the response is apparently never received.
If I send a third DELETE request, it works normally again, and apparently alternates this way forever.
I have another ajax call that sends a POST request. I can do any number of these in a row with no issue. But if I do a successful DELETE first and then a POST, the POST behaves like the second, unsuccessful DELETE, except that the success method doesn't fire. If I then do a second POST, that one works, just like the third DELETE.
To me it seems like something about the first DELETE request, even though it's successful, is broken or not getting properly reset, and somehow a second request resets it.
I'm using Rails 4.1.0.rc1 and jQuery 1.11.0, if that makes any difference.
Has anyone seen anything like this before?
Could this be a jQuery bug?
Can I make two or more Ajax requests in one hit in JavaScript or jQuery?
I know it seems crazy to ask this, but I was asked this question in an interview earlier. After the interview I searched a lot for an answer but found nothing.
Somewhere I found that you can put another Ajax request in the callback of the first one, but that is not the whole story.
I also have a doubt: does sync or async play some role in this?
If somebody has a solution, a POC on jsfiddle or plunkr would be appreciated.
JavaScript experts, please help.
Thanks in advance!
If you are using jQuery you can make use of the deferred objects. Basically you can perform multiple ajax requests, and when all are done, one callback is executed.
Have a look at http://api.jquery.com/jquery.when/ for more information. There's also a simple example:
$.when( $.ajax( "/page1.php" ), $.ajax( "/page2.php" ) )
.then( myFunc, myFailure );
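A slightly fuller sketch: when $.when() is passed multiple jqXHR objects, each argument handed to the .then() success callback is an array of [data, statusText, jqXHR] for the corresponding request (the two page URLs are just placeholders):

$.when($.ajax("/page1.php"), $.ajax("/page2.php"))
    .then(function (result1, result2) {
        // Each result is [data, statusText, jqXHR]
        var data1 = result1[0];
        var data2 = result2[0];
        console.log(data1, data2);
    }, function () {
        // Runs if either request fails
        console.log("one of the requests failed");
    });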
In short -- no, you cannot.
You can chain the calls through their callbacks to run them in order, or you can fire all the requests off to their respective endpoints on the server side.
You can always do two ajax requests in a row, but there is no guarantee in what order they will return to their respective callbacks.
Look, if you want to send two URLs in a single XMLHttpRequest, I don't think that is possible.
And even if it were possible, how would we be able to tell which part of the server's response belongs to which URL's request?
I'm not sure what you mean by "hit", but put plainly:
async - doesn't wait for the response to the ajax request
sync - waits until the ajax response arrives
It depends on what you want to do.
EDIT: after reading the other responses, to clarify: if you want multiple simultaneous ajax requests, you need a concurrent connection for each. That is how it is possible.
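To illustrate the difference (just a sketch; the URLs are placeholders, and note that synchronous requests block the UI and are generally discouraged):

// async (the default): execution continues immediately, the callback runs later
$.ajax({ url: 'foo.php', async: true, success: function (data) { console.log('foo done'); } });
console.log('this logs before "foo done"');

// sync: execution blocks here until the response arrives
$.ajax({ url: 'bar.php', async: false, success: function (data) { console.log('bar done'); } });
console.log('this logs after "bar done"');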
Call two requests on the same event and make them async:
$("#btn").click(function () {
    $.get("foo.php");
    $.get("bar.php");
});
You can do two ajax hits with when and then, but if you want more than two ajax hits I would recommend chaining them with always:
$.ajax({
    url: '/Job/multipleajax/',
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    type: "POST",
    data: JSON.stringify(id),
    success: function (response) {
        alert("first ajax hit");
    }
}).always(function () {
    // second ajax call here
}).always(function () {
    // third ajax call code here
});
I have a WCF service which takes a long time to process the first time it is called, and then caches those results in HttpRuntime.Cache. In order to initialize this cache I'd like to trigger a fire-and-forget ajax call from javascript.
Right now I have this JavaScript in the page:
$.ajax({
    type: 'GET',
    url: getServiceURL() + 'PrimeCacheAjax',
    contentType: 'application/json; charset=utf-8'
});
Where the PrimeCacheAjax function just performs a dummy call to populate the cache.
The only issue with this approach is that the page making the ajax call is a kind of landing page: it executes some JavaScript, opens another window, and closes itself. When the window closes before the server responds, I see a cancelled request in Fiddler. I am concerned that this may lead to situations where the ajax call never reaches the server at all. Is this possible?
Is there a way to specify (using $.ajax()) that no response will be coming, or does it not really matter?
The only time it will matter is if the request to the server has not yet been sent. You should check that the call has reached a readyState value of at least 2 (i.e. the request has been sent) before exiting.
I would just perform the call with no callback; I don't believe there is an option that gives you a true fire-and-forget mode.
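A rough sketch of that check using the native XMLHttpRequest API (closeLandingPage() is a hypothetical stand-in for whatever opens the other window and closes this page):

// Fire the cache-priming request and only close the page once readyState
// reaches 2, i.e. the request has been sent and the response headers are back.
var xhr = new XMLHttpRequest();
xhr.open('GET', getServiceURL() + 'PrimeCacheAjax', true);

var closed = false;
xhr.onreadystatechange = function () {
    if (!closed && xhr.readyState >= 2) {
        closed = true;
        closeLandingPage(); // hypothetical: opens the new window and closes this one
    }
};

xhr.send();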
For a very short ajax call you could also try the following as an alternative, if you wanted:
$.get('URL');