I want to update a shell command's output every second using AJAX.
However, Chrome's CPU usage is very high and the output seems to update much faster than once per second.
Here is the HTML document:
<script src='jquery-2.2.4.js'></script>
<script>
setInterval(function() {
    $.ajax({
        url: "test.php",
        success: function(data) {
            $("body").html(data);
        },
        async: true
    });
}, 1000);
</script>
</body>
And here is the shell command I'm actually using:
system("dir C:");
It would be better to use a setTimeout that is scheduled again after every successful AJAX completion.
You could also set up an error handler in the $.ajax call, since a network failure might happen, and call setTimeout(function(){ myajaxfunction(); }, 1000); there as well (see the sketch after the code below).
var myajaxfunction = function() {
    $.ajax({
        url: "test.php",
        success: function(data) {
            $("body").html(data);
            setTimeout(function() { myajaxfunction(); }, 1000);
        },
        async: true
    });
};
myajaxfunction();
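A variant with that error handler could look like this (a sketch; the error callback that simply retries after the same 1-second delay is an assumption, not part of the original answer):
var myajaxfunction = function() {
    $.ajax({
        url: "test.php",
        success: function(data) {
            $("body").html(data);
            setTimeout(myajaxfunction, 1000);
        },
        error: function() {
            // on a network failure, try again after the same delay
            setTimeout(myajaxfunction, 1000);
        }
    });
};
myajaxfunction();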
I found the solution. The AJAX request URL was the same as the page making the request, which caused an infinite recursive loop.
So what I did was request another PHP page that contains only the data I actually need.
I want to open a window, make an AJAX call and close the window.
I want to do it as fast as possible, and I was wondering if I have to wait for the AJAX response before closing the window?
Currently I'm doing it like this:
$.ajax({
    url: requestURL,
    type: 'GET',
    dataType: "json",
    cache: false,
    timeout: 30000,
    always: function () {
        closeWindow();
    }
});
However, I was wondering whether the AJAX request will reach the server 100% of the time on all browsers if I do it like this:
$.ajax({
    url: requestURL,
    type: 'GET',
    dataType: "json",
    cache: false,
    timeout: 30000,
    always: function () {
    }
});
closeWindow();
// THIS HAS BEEN CONFIRMED NOT TO WORK AND MISSES SOME REQUESTS
The closeWindow() implementation is irrelevant.
The full use case is as follows:
I send a user a link on Whatsapp/Telegram/Messenger
The user clicks the link
The browser opens -> it issues an AJAX call -> the window closes.
EDIT
To clarify, I don't care what the server's response to the call is. I just want to make sure that the browser issued the HTTP GET to the server, and then close the window.
EDIT 2
AJAX is not a must; in fact, vanilla JS would be better.
You can do it the way you said: send the AJAX request and close the window. Since the call is asynchronous, the request will still be made. If you want to be absolutely SURE, write a little PHP script that writes something to a file (with a short sleep at the top) so you can check that the call was made; it will be.
Edit: or use this method and close the browser in the done function; it will only take a millisecond or so.
The following code can close your tab after 1 second, without waiting for the AJAX response:
$.ajax({
    url: requestURL,
    type: 'GET',
    dataType: "json",
    cache: false
});
setTimeout(function() {
    close();
}, 1000);
I believe that if you want to know when your request has been launched, you could use .ajaxStart(), as explained here.
$(document).ajaxStart(function() {
    console.log('Ajax call started');
    closeWindow();
});
Or you could try some raw JS, like the solution explained here.
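For instance, a bare XMLHttpRequest version of the same idea might look like this (a sketch, not necessarily the solution behind that link; requestURL and closeWindow are the placeholders used above):
var xhr = new XMLHttpRequest();
xhr.open('GET', requestURL, true); // true = asynchronous
xhr.onloadend = closeWindow;       // close once the request has settled (success or error)
xhr.send();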
I guess the answer was at the back of my head: use an Image object to send the request and close the window right after:
var img = new Image();
img.src = urlWithParamters;
img.onload = function () {
    closeWindow();
};
img.onerror = function () {
    closeWindow();
};
// safety:
setTimeout(closeWindow, 5000);
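Another vanilla option (a sketch, assuming a browser that supports fetch's keepalive flag, which allows the request to outlive the page that issued it) would be:
fetch(requestURL, { method: 'GET', keepalive: true })
    .catch(function () { /* the response is irrelevant here */ });
closeWindow();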
Since you are using jQuery, you can use the "done" callback and put your function inside it, so the window will close if the request was successful.
$.ajax({
    url: requestURL,
    type: 'GET',
    dataType: "json",
    cache: false
}).done(function() {
    closeWindow();
});
You can also use the "fail" callback to handle potential errors, as in the sketch below.
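Putting the two together, a minimal sketch that closes the window on success and also closes it (after logging) on failure:
$.ajax({
    url: requestURL,
    type: 'GET',
    dataType: "json",
    cache: false
}).done(function() {
    closeWindow();
}).fail(function(jqXHR, textStatus) {
    // log (or ignore) the error, then close anyway
    console.log('Request failed: ' + textStatus);
    closeWindow();
});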
GOAL: What I'm after is to get data from the database and refresh main.php (most visibly through draw_polygon) every time something is added to the database (after the $.ajax call to submit_to_db.php).
So basically I have a main.php that makes an AJAX call to one PHP script to submit an array that will be saved to the database, and a JSON call to another PHP script to return an array that will be used by main.php.
$(document).ready(function() {
    get_from_db();
    $('#button_cancel').click(function(){
        $.ajax({
            url: 'submit_to_db.php',
            type: 'POST',
            data: {list_item: selected_from_list},
            success: function(result){
                ...
                get_from_db();
            }
        });
    });

    function get_from_db(){
        $.getJSON('get_from_db.php', function(data) {
            ...
            draw_polygon(data);
        });
    }
});
In my case, what I did was wrap the getJSON call in a get_from_db function to actually get the data from the database, with the data then passed to draw_polygon. But is that how it should be done? I'm a complete newbie, and to be honest this is my first time trying getJSON and AJAX. So my questions: How does asynchronous actually work? Is there another way to do this instead of having to call get_from_db (with getJSON inside) all the time? It isn't synchronous, is it? Is that why it doesn't update the page when it isn't inside a function? I was thinking of something like $.ajax with async: false (which I couldn't get to work, by the way). My approach is working, but I thought there might be better ways to do it. I'd love to learn how.
To make it clearer, here's what I want to achieve:
At the start of the page, get data from the database (currently through getJSON)
Paint or draw in canvas using the data
When I click the done button it will update the database
I want to AUTOMATICALLY get the data again to repaint the changes in canvas.
Since $.getJSON() uses the global AJAX configuration, you can just set the global AJAX configs:
// Set the global configs to synchronous
$.ajaxSetup({
    async: false
});

// Your $.getJSON() request is now synchronous...

// Set the global configs back to asynchronous
$.ajaxSetup({
    async: true
});
Asynchronous means the request runs in the background and calls your function back when it gets a response. This method is best if you want a result but still want the app to stay usable while the request is in flight. If you want a direct response, look at a synchronous request instead: it will pause script execution until a response arrives, and the user cannot do anything until the response has been received. You can toggle it via:
async: false,
So for example:
$.ajax({
    url: "myurl",
    async: false,
    ...
})
$.getJSON() doesn't accept a configuration object; as the docs say, it's a shorthand version of:
$.ajax({
    dataType: "json",
    url: url,
    data: data,
    success: success
});
So just rewrite your request in terms of that and async:false will work just as you expect.
$.getJSON() is a shorthand notation for $.ajax(), which can be configured to be synchronous (see jQuery.getJSON and jQuery.ajax):
$.ajax({
    dataType: "json",
    url: url,
    data: data,
    async: false,
    success: function(data) {
        ...
        draw_polygon(data);
    }
});
Try to avoid synchronous calls, though. A quote from the jQuery docs (see the async property):
Cross-domain requests and dataType: "jsonp" requests do not support
synchronous operation. Note that synchronous requests may temporarily
lock the browser, disabling any actions while the request is active.
You might want to try jQuery Deferreds like this:
var jqxhr = $.getJSON(url);
jqxhr.done(function(data) {
    ...
    draw_polygon(data);
});
I use jQuery AJAX on a web page with the async: false option, as shown below. My client's network is very slow. When I try to load the web page from the server, the page is slow and all the controls freeze. Does "async: false" matter here? Here is my code:
function ajaxRequestWithNoArguments(url) {
    return $.ajax({
        url: urlForPhp + '/' + url,
        data: '',
        dataType: 'JSON',
        async: false,
        method: 'POST'
    });
}
When I try to load the web page from the server, the page is slow and all the controls freeze. Does "async: false" matter here?
Yes, and this is exactly why you should not use async: false; it is only needed in very specific cases, and it sounds like you don't need it. Making the request synchronous means the browser pauses program execution (and freezes the whole UI) until the request is done and the data has been loaded back and processed. You don't want that in most cases, which is why you should use the default async: true.
function ajaxRequestWithNoArguments(url) {
    return $.ajax({
        url: urlForPhp + '/' + url,
        data: '',
        dataType: 'JSON',
        method: 'POST'
    });
}
Returning a promise object is a convenient way to deal with an asynchronous function. In this case you would use ajaxRequestWithNoArguments as follows:
ajaxRequestWithNoArguments('/some/url').then(function(response) {
    console.log('Data loaded', response);
});
function OpenAjax(link, form_id)
{
    document.getElementById("page-wrapper").innerHTML = "";
    $.ajaxSetup({ async: true });
    $("#page-wrapper").load(link, function(){
        $("#" + form_id).validator();
    });
}
This is my code. I had the same issue, and setting async to true fixed it. When you set it to true, another problem may occur: your JavaScript code will continue to run, and if you depend on the response you must tell jQuery to run your code after the response arrives, as in my example:
$("#" + form_id).validator();
This code runs after the response, but if I write my code this way:
function OpenAjax(link, form_id)
{
    document.getElementById("page-wrapper").innerHTML = "";
    $.ajaxSetup({ async: true });
    $("#page-wrapper").load(link, function(){
        // Code moved from this line
    });
    // Here
    $("#" + form_id).validator();
}
$("#" + form_id).validator(); - code will work before Ajax response
I have a method that checks for notifications and executes a js.erb in response.
The JS that triggers this method is given below:
setInterval(function () {
    $.ajax({
        url: "http://localhost:3000/checkNotification",
        type: "GET"
    });
}, 15000);
This is working great, but I am worried about the site's performance once it is in production.
Will this cause a performance issue?
If yes, is there another way to solve this problem?
One suggestion I would make is to take advantage of the AJAX success callback, so that if a request takes longer than expected, the next one will not fire until the last one completes:
function checkNotifications(){
    $.ajax({
        url: "http://localhost:3000/checkNotification",
        type: "GET",
        success: function(){
            setTimeout(checkNotifications, 15000);
        }
    });
}
checkNotifications(); // kick off the first check
Also, if for some reason the server were to respond with a 500 error, that would stop the polling from continuing.
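If you would rather have the polling continue even after an error, one option (a sketch, not part of the original suggestion) is to reschedule from the complete callback, which jQuery fires after both success and failure:
function checkNotifications() {
    $.ajax({
        url: "http://localhost:3000/checkNotification",
        type: "GET",
        complete: function() {
            // runs after success *or* error, so the loop never stalls
            setTimeout(checkNotifications, 15000);
        }
    });
}
checkNotifications();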
UPDATE Following @Ryan Olds' suggestion to include the setTimeout in the callback, I must clarify that in my production code I'm calling multiple URLs to get JSON data from several sites. (I have updated the JavaScript code below.)
Is the only option to have multiple timeouts scattered throughout this function?
I have a self-invoking update function as follows:
(function update() {
    $.ajax({
        type: 'GET',
        url: "http://myexample.com/jsondata",
        dataType: 'json',
        success: function (data) {
            // do some callback stuff
        },
        async: false
    });
    $.ajax({
        type: 'GET',
        url: "http://myexample2.com/jsondata2",
        dataType: 'json',
        success: function (data) {
            // do some further callback stuff
        },
        async: false
    });
    setTimeout(update, 2000);
})();
What I expected this code to do
I hoped that this function would go off to the target URL and wait for the result, then deal with the success callback. Then (and only then) would it fall through to set a 2 second timeout to call the function again.
What appears to be happening instead
Instead, the GET request goes out, and before the response has been dealt with, the timeout has already been set.
What am I missing? How can I make this entirely synchronous?
If I were you, I'd make use of jQuery's support for deferred action.
(function update() {
    $.when($.ajax({
        type: 'GET',
        url: "http://myexample.com/jsondata",
        dataType: 'json',
        success: function (data) {
            // do some callback stuff
        }
    }), $.ajax({
        type: 'GET',
        url: "http://myexample2.com/jsondata2",
        dataType: 'json',
        success: function (data) {
            // do some further callback stuff
        }
    }), $.ajax({
        // more requests as you like
    })).then(function() {
        // when all the requests are complete
        setTimeout(update, 2000);
    });
}());
Much nicer, IMHO, than mucking around with synchronous requests. Indeed, if the requests are cross-domain, this is pretty much your only option.
See $.when and deferred.then.
Move the timeout into the success callback. The request is synchronous, but it would appear that the callback is not.
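Applied to the updated code from the question, that suggestion would look something like this (a sketch; it keeps the async: false flags from the question, although, as noted elsewhere in this thread, async: false is ignored for cross-domain and JSONP requests):
(function update() {
    $.ajax({
        type: 'GET',
        url: "http://myexample.com/jsondata",
        dataType: 'json',
        async: false,
        success: function (data) {
            // do some callback stuff
        }
    });
    $.ajax({
        type: 'GET',
        url: "http://myexample2.com/jsondata2",
        dataType: 'json',
        async: false,
        success: function (data) {
            // do some further callback stuff, then reschedule
            setTimeout(update, 2000);
        }
    });
})();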
I would modify the setup like so:
function update() {
    $.ajax({
        type: 'GET',
        url: "http://myexample.com/jsondata",
        dataType: 'json',
        success: function (data) {
            // do some callback stuff
        },
        async: false
    });
    $.ajax({
        type: 'GET',
        url: "http://myexample2.com/jsondata2",
        dataType: 'json',
        success: function (data) {
            // do some further callback stuff
        },
        async: false
    });
}

setInterval(update, 2000);
update(); // only necessary if you can't wait 2 seconds before 1st load.
You cannot make it entirely synchronous because you're making calls to other domains. Those calls are made (internally, by jQuery) by creating <script> tags and adding them to the document. Browsers perform those calls asynchronously, and that's that. You can't make ordinary XHR requests to domains different from your own.
I can't imagine why you'd want something like that to be synchronous, especially since you're doing many of these operations.
I don't think async: false works on cross-domain requests.
From the docs:
async Boolean
Default: true
By default, all requests are sent asynchronously (i.e. this is set to true by default). If you need synchronous requests, set this option to false. Cross-domain requests and dataType: "jsonp" requests do not support synchronous operation. Note that synchronous requests may temporarily lock the browser, disabling any actions while the request is active.
In any case, maybe you can add some logic so that the requests fire in the order that you want.
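For example, one way to enforce the order without blocking the browser is to start each request from the previous one's success callback and only reschedule once the last one has finished (a sketch reusing the placeholder URLs from the question):
(function update() {
    $.ajax({
        type: 'GET',
        url: "http://myexample.com/jsondata",
        dataType: 'json',
        success: function (data) {
            // do some callback stuff, then fire the second request
            $.ajax({
                type: 'GET',
                url: "http://myexample2.com/jsondata2",
                dataType: 'json',
                success: function (data2) {
                    // do some further callback stuff, then reschedule
                    setTimeout(update, 2000);
                }
            });
        }
    });
})();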