I have two script tags on the page, each containing a document.ready() handler, and each of them makes an AJAX call to a page method.
The first loads values into a select list. The second loads a tree into the DOM.
<script>
$(document).ready(function() {
    $.ajax({
        url: 'PageMethods.aspx/GetTop50',
        async: true,
        success: function(data) {
            // loads the values into the select list
        }
        // rest of stuff...
    });
});
</script>
<script>
$(document).ready(function() {
    $.ajax({
        url: 'Default.aspx/GetTree',
        async: true,
        success: function(data) {
            // loads the tree into the DOM
        }
        // rest of stuff...
    });
});
</script>
Why does my GetTree page method keep executing only AFTER the success callback of GetTop50? I set a breakpoint in the GetTree method server-side, and it is only hit AFTER the select list is loaded.
The client will start both AJAX calls one after the other, so that they are both "in flight" at the same time. Which one completes first is then up to the server: it depends upon how the server is configured (will it process multiple requests at once?) and upon what each request is doing.
If your server only handles one request at a time, or if it blocks on some shared resource such as a database, then it will likely return the result of the first request it received before returning the second - though that's just the likely outcome, certainly not a guaranteed one. For example, if the first request pretty much always takes longer to process than the second and they aren't both contending for the same shared resource, then the second request might finish first and return its results first.
It is also possible that the requests will return in a random order such that sometimes one will return first and sometimes the other will return first. As I said earlier, it all depends upon how the server processes each request and which one it finishes first.
Also, keep in mind that client-side JS is single-threaded, so when you are sitting at a client-side JS breakpoint somewhere, other JS cannot run until the current thread of execution has finished. As for the behavior of breakpoints on the server side, that all depends upon what the server execution environment is and how it works during a breakpoint.
If you want to debug timing-related things, a breakpoint is NOT a reliable way to test things because hitting the breakpoint itself can very easily affect the behavior. Instead, you should use logging with accurate timestamps to study the exact sequence of events.
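For example, a minimal logging sketch using the two calls from the question:

console.log(Date.now(), 'starting GetTop50');
$.ajax({ url: 'PageMethods.aspx/GetTop50' }).done(function () {
    console.log(Date.now(), 'GetTop50 returned');
});

console.log(Date.now(), 'starting GetTree');
$.ajax({ url: 'Default.aspx/GetTree' }).done(function () {
    console.log(Date.now(), 'GetTree returned');
});

The "starting" timestamps should be nearly identical; the gap between the two "returned" timestamps tells you how the server is actually sequencing the work.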
I want to thank everyone for the input, especially @jfriend00, and post the exact solution to my problem.
So the problem was that Default.aspx accesses Session, and had the EnableSessionState="True" page directive.
When this directive is set, ASP.NET serializes requests from the same session on the server side.
I solved it by moving my method into another page that doesn't use session state. (Setting EnableSessionState="ReadOnly" would also have let the requests run concurrently.)
Suppose I am using a free API service with a limit of c calls per m minutes.
I am using a tiny bit of JavaScript, linked from the main HTML of my very basic site, which contains something like the following:
$(function () {
    // stuff
    function getSomething() {
        return $.ajax({
            type: 'GET',
            url: targetURL,
            data: dataObject
        });
    }

    getSomething().done(function (returnedStuff) {
        // process returnedStuff
    });
    // more stuff
});
I have two questions:
Will there be an api call each time the page is reloaded?
If the answer to above is YES, then how does one prevent/limit the user/some other event from overshooting the api limit by their repeated reloading of the page.
Thanks for the help.
tl;dr: Making some assumptions from your high-level example:
Will there be an api call each time the page is reloaded? Yes.
If the answer to above is YES, then how does one prevent/limit the user/some other event from overshooting the api limit by their repeated reloading of the page? Read below.
Further explanation:
It depends on whether your example code hits your own server-side code, which then makes the API call... or whether you're calling the API directly from the client. If you call the function on reload (document ready or whatever), then it will execute on every reload. Otherwise, it will obviously run only when you call the method (e.g. via a button click).
Remember, client-side code is visible to the client - so if that's your architecture, then you're exposing your API to the client. I could then, for example, write my own JavaScript to loop and call your API repeatedly...
My assumption is that the data does not need to refresh on every reload. With that in mind, I suggest you do the following:
Suggested way to limit API calls:
Use an ajax call to your own server.
On the server side, persist the data via caching of your choice and build in your own logic to test whether data needs to be refreshed (first call, after timeout, etc).
This way you do not expose the API url and details to the client side, and you have control over the amount of calls made to the API.
For optimization purposes, you can also cache data client-side... but keep the logic and the API call server-side.
Hope this helps!
ps. If you need an example, please just provide what platform you're coding in and I'll be more than willing to whip up a quick example for you!
pps. You can simply cache client-side and make the API call from there, with some logic built in to test the cache - but obviously anyone can then still call your API directly.
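For instance, a rough client-side caching sketch, reusing targetURL and dataObject from the question (the cache key and timeout are illustrative):

// Cache the API response in localStorage and only refresh after a timeout.
var CACHE_KEY = 'apiData';       // illustrative key
var MAX_AGE_MS = 5 * 60 * 1000;  // e.g. refresh at most every 5 minutes

function getSomethingCached() {
    var cached = JSON.parse(localStorage.getItem(CACHE_KEY) || 'null');
    if (cached && Date.now() - cached.savedAt < MAX_AGE_MS) {
        // Fresh enough: serve from the cache without touching the API.
        return $.Deferred().resolve(cached.data).promise();
    }
    return $.ajax({ type: 'GET', url: targetURL, data: dataObject })
        .done(function (data) {
            localStorage.setItem(CACHE_KEY,
                JSON.stringify({ savedAt: Date.now(), data: data }));
        });
}

As noted above, though, only a server-side cache actually protects the limit, since anyone can bypass client-side logic.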
I have a server-side function like this:
function very_long_task($data) {}
This function is called client-side using $.ajax().
The problem is that while my server-side function very_long_task() is executing, the site is locked down. Meaning that if I try to view another page of the website from a different tab or window, the website will not load until the very_long_task() function has completed.
Is there any way to get around this, either server-side or client-side?
UPDATED: 2015-11-3
The AJAX call is actually made many times, because the code loops through all the elements in a list and performs an action on each of them. The very_long_task() function is then called once per element.
For example, if there were a list of 20 elements, the very_long_task() function would be called 20 times. This does help a little bit with the overall responsiveness of that page, but not of other pages.
UPDATED: 2015-11-3
Also, this is built with WordPress, so I can leverage some of its functions, but I have had no luck with wp_schedule_single_event, since I need a return value.
https://codex.wordpress.org/Function_Reference/wp_schedule_single_event
UPDATED: 2015-11-3
Here is an updated view of my function:
function very_long_task($data) {
    session_write_close();
    // Very long task...
    return $data;
}
You'll want to call session_write_close() as soon as possible.
This is because while one page has called session_start(), the session file will be locked until the page finishes execution, or until the session is closed.
If this is not done, any page calling session_start() will wait for the lock to be lifted.
UPDATE
I think I know what's going on:
your browser limits the number of simultaneous connections to a server, typically somewhere between 2 and 10.
If you're making 20 asynchronous AJAX calls, and you open the Developer Console (F12 / control-shift-I), you'll probably find that not all of them are executing simultaneously. This would certainly leave no room for additional connections.
Note that session_write_close() is still necessary; otherwise the AJAX calls will execute serially.
SUGGESTION
So, it is best to only make one AJAX call.
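A rough sketch of that idea, with illustrative names - instead of invoking very_long_task() once per element, send all the element IDs in a single request:

// Collect the IDs of every list element and submit them together.
var ids = $('#element-list li').map(function () {
    return this.id;
}).get();

$.ajax({
    url: '/process_elements.php', // hypothetical endpoint wrapping very_long_task()
    type: 'POST',
    data: { ids: ids },
    success: function (results) {
        // Results for all elements arrive together.
    }
});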
If you want parallelism, you can fork child processes server-side.
You probably won't be able to use jQuery for this, because you'll want to send data from the server and flush() it as it becomes available (HTTP streaming).
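A minimal sketch of consuming a streamed response with plain XMLHttpRequest, assuming the server flush()es partial output (the URL and UI helper are illustrative):

var xhr = new XMLHttpRequest();
var seen = 0;
xhr.open('GET', '/long_task_stream.php'); // hypothetical streaming endpoint
xhr.onprogress = function () {
    // responseText grows as the server flushes; process only the new part.
    var chunk = xhr.responseText.substring(seen);
    seen = xhr.responseText.length;
    updateProgressUI(chunk); // hypothetical UI update helper
};
xhr.send();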
One solution I used in a WP importer plugin is not to use AJAX at all, but to perform the long-running operation while pushing out HTML and <script> tags to update the UI.
I'm not entirely sure what you mean by "locked down" but below are some things to try:
Make sure that your AJAX is asynchronous
$.ajax({
    url: '/start_very_long_task.php',
    async: true
});
Make sure your PHP accommodates the expected behavior
// start_very_long_task.php
function start_very_long_task()
{
    ini_set('ignore_user_abort', 'on');
    ini_set('max_execution_time', 0);
    session_write_close();
    do_very_long_task();
}

function do_very_long_task()
{
    // Very long task stuff.
    // This can recursively call itself without
    // making multiple calls to session_write_close(), etc...
}

start_very_long_task();
This is an interesting problem.
I am doing an asynchronous AJAX PUT:
return $.ajax({
    url: url,
    type: 'PUT',
    contentType: 'application/json; charset=utf-8',
    async: true, // default
    success: function (result, textStatus, xhr) { ... }
});
This works as expected, unless a user does a PUT before the previous call returns (even though it's async, the call does take about 0.5 seconds to complete).
If a user presses the button a few times (executing multiple PUTs), the following happens:
I see only one server call in Fiddler
success gets fired for every click
all callbacks get the same new row ID (returned by the server)
This leads me to the inevitable conclusion that the first server response triggers all outstanding callbacks...
I could disable the button until the callback returns, but is it possible to handle multiple outstanding calls? Is this a browser limitation? What's the best way to handle this?
UPDATE
As a test I switched to using POST instead of PUT: I adjusted type: 'POST' on the JS side and [HttpPost] on the Web API (server) side.
The behavior did not change.
UPDATE
Looking at posts like this one... this really should work. I don't see any specific reason why the rest of the concurrent requests are not making it out to the server.
Shouldn't PUT requests be idempotent? That is, submitting the same request multiple times should generate the same response. If so, the browser may simply be coalescing your identical PUT requests, since they should all end up with the same result. If you're incrementing some ID for every request (i.e. changing server state), then you should be using POST instead of PUT.
This may not fix your issue; it's just a thought.
You can't wait for an async callback in JavaScript. You have to restructure your code to do all future work based on the response from the actual callback.
If you need to make multiple consecutive AJAX calls, then you issue the first one and, in the success handler for the first call, you issue the second AJAX call; in the response handler for the second one, you carry out whatever you want to do with the data.
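A rough sketch of that pattern with PUTs like the one in the question (the URLs and response shape are illustrative):

$.ajax({ url: '/api/items', type: 'PUT' })
    .then(function (firstResult) {
        // Only issue the second call once the first has completed,
        // so the row ID it returned can be used safely.
        return $.ajax({ url: '/api/items/' + firstResult.id, type: 'PUT' });
    })
    .then(function (secondResult) {
        // Carry out whatever depends on both responses here.
    });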
I noticed that when a link on the page has a JS onclick function and also an MVC action method, the JS function fires before the action.
Wondering, is this always the case?
Why/how do browsers decide to run the JS and then the backend method?
Can I run the backend method first, but still fire the JS function?
Regards
Wondering, is this always the case?
Why/how do browsers decide to run the JS and then the backend method?
Client-side JavaScript runs on the client, inside a page. Server-side .NET code runs on the server and generates an HTML document (or other resource).
To run server side code, the browser has to make an HTTP request.
The easiest way to make an HTTP request is to leave the current page and load a new one from the server (by following a link or submitting a form).
Since client-side JavaScript runs in a page, it can't run after the browser has left the page it runs in.
Can I run the backend method first, but still fire the JS function?
You can make an HTTP request from JavaScript (before doing other JS actions) instead of leaving the current page. This is usually done with the XMLHttpRequest object and is known as Ajax.
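For instance, a minimal sketch with XMLHttpRequest (the URL and follow-up function are illustrative):

var xhr = new XMLHttpRequest();
xhr.open('POST', '/Controller/Action'); // hypothetical backend method
xhr.onload = function () {
    // The backend method has run; now fire the client-side function.
    myJsFunction(xhr.responseText); // hypothetical JS function
};
xhr.send();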
Why/how do browsers decide to run the JS and then the backend method?
Server-side code needs a request in order to return a response. HTTP works on a request/response architecture: the client makes requests in order to get a response (i.e. the results or desired data).
Whenever you do a postback or return true to the server, it will execute the server-side methods.
<a href="your-action-url" onclick="return SomeFunction();">ClickHereToSee</a>
Here, the value returned by the function matters: if you return true, the browser will go on to the server method; if you return false, it will prevent the default action.
Can I run the backend method first, but still fire the JS function?
You can: use Ajax. Ajax requests are basically XMLHttpRequests, which are used to update partial portions of the page.
Can I run the backend method first, but still fire the JS function?
The first two questions are already well answered. For the third one, you can try jQuery Ajax:
function SomeFunction() {
    $.ajax({
        type: 'POST',
        url: '@Url.Content("Controller/ActionResult")',
        data: {
            yourparameter: value // your data
        },
        dataType: 'json',
        success: function (result) {
            // javascript stuff
        }
    });
}
<a href="your-action-url" onclick="SomeFunction(); return false;">ClickHereToSee</a>
I'm not sure if what I'm about to ask is possible or the right way of doing about things, but here goes.
I have a webpage which loads some data from a server using AJAX and displays it visually. The user has the option of using one of two buttons on the page to "scroll" through the data which is filtered by week.
The code for these buttons is something like:
$("#leftButton").click(function () {
clearCurrentlyDisplayedData();
changeFilter(1); //Or -1, or whatever.
loadAndDisplayData();
}
In this (simplified) example, loadAndDisplayData() would use AJAX calls to fetch this data and then display it on completion of the request, like:
$.get(
    "web/service/address",
    function (data, textStatus, jqXHR) {
        // Display the data here
    });
However, there is a problem when the user clicks the arrows to scroll through the data too quickly. If the buttons are clicked twice in quick succession, the data for two weeks is displayed on top of each other.
I don't want to disable the buttons until the data is collected - since the data collection and displaying does take a little bit of time, this would kill the ability of the user to navigate through the site quickly, and would quickly become irritating.
Is it possible to kill any currently executing scripts or AJAX calls (or functions called as a result of these) when the user clicks on one of the buttons in order to prevent the loading of two sets of data? Is there any other way I can go about solving this problem?
The jqXHR object has an abort() method, which you can call to cancel an AJAX request.
However, this requires you to keep a reference to the object returned by $.get().
A perhaps easier approach would be to increment a global counter when making a request and decrement it when a request completes. In your success handler, only show the results if the counter === 0 (i.e. no other requests are pending).
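A minimal sketch of the abort() approach, reusing the click handler from the question:

var currentRequest = null;

$("#leftButton").click(function () {
    if (currentRequest) {
        currentRequest.abort(); // cancel any request still in flight
    }
    clearCurrentlyDisplayedData();
    changeFilter(1);
    currentRequest = $.get("web/service/address", function (data) {
        currentRequest = null;
        // Display the data here.
    });
});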
Is there any other way I can go about solving this problem?
Rather than aborting the requests, it might be better to construct the callback function inside $.get in a way that clears the data and displays the new data as a single operation - i.e., doesn't clear the data until the new data is ready.
Javascript only processes a single event/thread at a time, so each AJAX response will be processed serially as they arrive.
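For example, a hedged sketch of that single-operation callback, where displayData() stands in for the question's display logic:

$.get("web/service/address", function (data) {
    // Don't touch the DOM until the new data has arrived,
    // then clear and redraw in one step.
    clearCurrentlyDisplayedData();
    displayData(data); // hypothetical display helper
});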