I'm experimenting a bit with AJAX and have successfully deployed a simple asynchronous AJAX function, yet when I change it to use a callback method it suddenly takes ages to load (about 10-15 minutes...).
Here's the function that executes right away:
function ajaxf() {
    var xmlhttp;
    xmlhttp = new XMLHttpRequest();
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200 && document.getElementById("icons") == null) {
            document.getElementById("text-12").innerHTML = xmlhttp.responseText;
        }
    }
    xmlhttp.open("GET", "http://some-url/ajax.php", true);
    xmlhttp.send();
}
And here's the much slower iteration using a callback function:
function ajaxf(url, cfunc) {
    xmlhttp = new XMLHttpRequest();
    xmlhttp.onreadystatechange = cfunc;
    xmlhttp.open("GET", url, true);
    xmlhttp.send();
}
document.body.onscroll = function ajaxb() {
    ajaxf("http://some-url/ajax.php", function() {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200 && document.getElementById("icons") == null) {
            document.getElementById("text-4").innerHTML = xmlhttp.responseText;
        }
    });
}
Other (perhaps) relevant details: the ajax.php file weighs merely 532 B, on my local test server both versions run roughly the same, and the first function is wired up with onscroll="ajaxf()" inside the body tag...
I was under the impression AJAX would be a little more snappy???
I solved it. Thanks to #jasen's tip I put in a console.log() and could see the scroll handler fired a gazillion times, just like #jfriend00 said.
Initially I thought that using document.getElementById("icons")==null as a condition would make the function fire only once, but of course I was wrong.
So the solution was/is:
to reset the onscroll action after the first execution by adding document.body.onscroll = null; at the end of the function.
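For reference, here is a minimal sketch of what that might look like put together (the element IDs and URL are the placeholders from the snippets above):
document.body.onscroll = function() {
    var xmlhttp = new XMLHttpRequest();
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200 && document.getElementById("icons") == null) {
            document.getElementById("text-4").innerHTML = xmlhttp.responseText;
        }
    };
    xmlhttp.open("GET", "http://some-url/ajax.php", true);
    xmlhttp.send();
    // Detach the handler so scrolling triggers the request only once.
    document.body.onscroll = null;
};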
Related
I have an async HTTP request, seen below:
function httpGet(URL, type, daily, weekly, monthly)
{
    if (window.XMLHttpRequest)
    {
        xmlhttp = new XMLHttpRequest();
    }
    else
    {
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }
    xmlhttp.onreadystatechange = function()
    {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200)
        {
            if (type == 'table')
            {
                createTable(xmlhttp.responseText, daily, weekly, monthly);
            }
        }
    }
    xmlhttp.open("GET", URL, true);
    xmlhttp.send();
}
When I call it twice, for example:
httpGet('example', 'table', 1, 2, 3);
httpGet('example2', 'table', 4, 5, 6);
They will both return the results from the second URL. I'm sure making the function synchronous would fix this, but that's not very user friendly either.
Is there any way to tie the returned results to the URL passed in the call that originally made the request, instead of the one that was called last?
You aren't declaring xmlhttp with const, let, or var, which means it becomes a global variable, so each call to httpGet reassigns that single global rather than each call getting its own binding for xmlhttp. So, when this line eventually runs:
createTable(xmlhttp.responseText, daily, weekly, monthly);
the xmlhttp there will always refer to the request object created by the final call.
Change your code to:
function httpGet(URL, type, daily, weekly, monthly) {
const xmlhttp = window.XMLHttpRequest
? new XMLHttpRequest()
: new ActiveXObject("Microsoft.XMLHTTP");
xmlhttp.onreadystatechange = function() {
// ...
to ensure that each call gets a separate binding of xmlhttp.
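Filled out, the corrected function might look roughly like this (a sketch based on the code above; createTable and the parameters are unchanged from the question):
function httpGet(URL, type, daily, weekly, monthly) {
    // Each call now gets its own request object instead of sharing one global.
    const xmlhttp = window.XMLHttpRequest
        ? new XMLHttpRequest()
        : new ActiveXObject("Microsoft.XMLHTTP");
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
            if (type == 'table') {
                // daily, weekly and monthly are captured per call by this closure.
                createTable(xmlhttp.responseText, daily, weekly, monthly);
            }
        }
    };
    xmlhttp.open("GET", URL, true);
    xmlhttp.send();
}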
Also, you might consider using fetch and ditching ActiveXObject - ActiveXObject should only be needed for supporting IE6 and below, which are broken browsers that probably shouldn't be considered at all.
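If you do drop ActiveXObject, a fetch-based version of the same call could look something like the sketch below (same hypothetical createTable and parameters as above):
function httpGet(URL, type, daily, weekly, monthly) {
    fetch(URL)
        .then(function(response) {
            if (!response.ok) throw new Error("HTTP " + response.status);
            return response.text();
        })
        .then(function(text) {
            if (type == 'table') {
                createTable(text, daily, weekly, monthly);
            }
        })
        .catch(function(err) {
            // Network errors and non-2xx statuses end up here.
            console.error(err);
        });
}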
When I try to draw data from a database, my javascript is returning the JSON twice. Why would my JSON return this way?
PHP:
<?php
require ('functions.inc');
dbconn(); //establish my db connection
mysql_selectdb("acts");
$query = mysql_query("SELECT * from actInfo");
while ($row = mysql_fetch_assoc($query)){
    $name[] = $row['ActName'];
}
$json=json_encode($name);
echo $json;
?>
Javascript:
function getActNames(){
    if (window.XMLHttpRequest)
    {
        xmlhttp = new XMLHttpRequest();
    }
    else
    {
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }
    xmlhttp.onreadystatechange = function(){
        var json = xmlhttp.responseText;
        var parseV = JSON.parse(json);
        $("#somediv").append(parseV);
    }
    xmlhttp.open("POST", "PHP/actMgmt.php", true);
    xmlhttp.send();
}
And I'm calling it in HTML via the following:
<p class="button" onclick="getActNames();return false;">Some Button</p>
My JSON call is creating twice the requested records. So instead of getting the following:
["act1","act2","act3"]
I am getting:
["act1,"act2","act3"]["act1","act2","act3"]
It seems that every time, its called twice.
ALSO, when I just go to the PHP page, it only returns the following like I expect:
["act1","act2","act3"]
**EDIT
var_dump($name) outputs:
array(6)=>{ [0]=>string(4)"act1" [1]=> string(4)"act2" [2]=> string(4)"act3"}
**EDIT
console.log(xmlhttp.responseText) gives me:
JSON.parse: unexpected end of data
["act1","act2","act3"]
I see it now. I would, in the future, HIGHLY suggest you use a framework like jQuery to avoid this. Writing your own AJAX function tends to lead to problems like this.
Your AJAX call isn't checking for a proper readyState or status. It's just reacting to every state change. As such, each time the readyState changes, it kicks off your anonymous function.
Change your code to this and you should get only one array:
xmlhttp.onreadystatechange=function() {
if (xmlhttp.readyState==4 && xmlhttp.status==200) {
var json = xmlhttp.responseText;
var parseV = JSON.parse(json);
$("#somediv").append(parseV);
}
}
It is due to multiple AJAX calls firing on the same page (you can observe that this may happen even on the first run). It's not a real solution, but you can escape the current scenario by splitting the PHP and the HTML (containing the AJAX) into separate pages.
According to w3schools, the onreadystatechange event is triggered every time the readyState changes,
which means that in every request cycle it is called more than once, with the possible states being:
0: request not initialized
1: server connection established
2: request received
3: processing request
4: request finished and response is ready
So what you need to do is:
function getActNames() {
    if (window.XMLHttpRequest) {
        xmlhttp = new XMLHttpRequest();
    }
    else {
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }
    xmlhttp.onreadystatechange = function() {
        // Called when the request finished successfully and the response is ready.
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
            var json = xmlhttp.responseText;
            var parseV = JSON.parse(json);
            $("#somediv").append(parseV);
        }
    }
    xmlhttp.open("POST", "PHP/actMgmt.php", true);
    xmlhttp.send();
}
Seeing as you are using jQuery, I strongly advise using the jQuery AJAX helpers, as shown:
$.post("PHP/actMgmt.php", { pram1: "value1" }, function(data) {
    $("#somediv").append(data);
}, "json");
Hope this helps.
I've searched SO for similar issues (e.g. Chrome does not redraw <div> after it is hidden and Force DOM redraw/refresh on Chrome/Mac) but none of the questions gave me the solution to my problem. I am writing a modem configuration panel, a webpage with "tabs". On every tab there are some settings, just like the configuration panel of any router.
Saving the configuration (done when the user clicks the Save button) takes a few seconds (my embedded platform is not a speed king), so I decided to put up a special PLEASE WAIT window (a div, to be precise) which is usually hidden but is shown when needed, to calm the user down :-).
Everything works fine on Firefox: after clicking Save, the PLEASE WAIT div shows and then the configuration is saved using the POST method. However, on Chrome 26 and Chromium 25 the div does not show until the configuration has been saved. As you can see in the SaveConfiguration function, an alert is shown after executing the PHP script that saves the configuration; this is where the PLEASE WAIT div finally shows up on Chrome. It looks like Chrome is not redrawing the page but immediately starts running the POST script. Has anyone had similar issues, and do you know how to fix this problem?
Below are fragments of my code; I have only supplied the functions that might give a clue as to what I'm doing. I can post more code if that helps.
function showLoadingScreen(yes)
{
    if (yes)
    {
        document.getElementById("loadingtext").innerHTML = "Please wait...";
        document.getElementById("loading_overlay").style.display = "block";
        document.getElementById("loading_window").style.display = "block";
    }
    else
    {
        document.getElementById("loading_overlay").style.display = "none";
        document.getElementById("loading_window").style.display = "none";
    }
}
function postDataSync(url, params)
{
    var XMLHttpRequestObject = false;
    if (window.XMLHttpRequest)
    {
        XMLHttpRequestObject = new XMLHttpRequest();
    }
    else if (window.ActiveXObject)
    {
        XMLHttpRequestObject = new ActiveXObject("Microsoft.XMLHttp");
    }
    if (XMLHttpRequestObject)
    {
        XMLHttpRequestObject.open("POST", url, false);
        XMLHttpRequestObject.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
        XMLHttpRequestObject.send(params);
        if (XMLHttpRequestObject.readyState == 4 && XMLHttpRequestObject.status == 200)
        {
            var result = XMLHttpRequestObject.responseText;
            delete XMLHttpRequestObject;
            XMLHttpRequestObject = null;
            return result;
        }
    }
    return '';
}
function SaveConfiguration()
{
    var errors = checkForm();
    if (errors != "")
    {
        printError("Can't save configuration because there are errors in current tab:<br><br>" + errors);
        return;
    }
    showLoadingScreen(true);
    saveTab();
    var retval = postDataSync('actions/saveconf3.php', '');
    alert("Settings saved. The modem is now being reconfigured.");
    document.location = "http://" + retval;
}
You are using AJAX synchronously rather than asynchronously, meaning JavaScript execution (and with it, rendering) halts during the request. To fix it, make the following change:
XMLHttpRequestObject.open("POST", url, true);
You need to use a callback for the behaviour after the request is complete. Something like this:
function postDataSync(url, params, success)
{
    var XMLHttpRequestObject = false;
    if (window.XMLHttpRequest)
    {
        XMLHttpRequestObject = new XMLHttpRequest();
    }
    else if (window.ActiveXObject)
    {
        XMLHttpRequestObject = new ActiveXObject("Microsoft.XMLHttp");
    }
    if (XMLHttpRequestObject)
    {
        XMLHttpRequestObject.open("POST", url, true);
        XMLHttpRequestObject.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
        // Register the callback before sending so no state change is missed.
        XMLHttpRequestObject.onreadystatechange = function() {
            if (XMLHttpRequestObject.readyState == 4 && XMLHttpRequestObject.status == 200)
            {
                var result = XMLHttpRequestObject.responseText;
                XMLHttpRequestObject = null;
                if (typeof success === 'function') success(result);
            }
        }
        XMLHttpRequestObject.send(params);
    }
}
function SaveConfiguration()
{
    var errors = checkForm();
    if (errors != "")
    {
        printError("Can't save configuration because there are errors in current tab:<br><br>" + errors);
        return;
    }
    showLoadingScreen(true);
    saveTab();
    postDataSync('actions/saveconf3.php', '', saveComplete);
}
function saveComplete(result) {
    showLoadingScreen(false);
    alert("Settings saved. The modem is now being reconfigured.");
    document.location = "http://" + result;
}
If you have heavy synchronous code (in practice, operations on hundreds or thousands of objects that are already in memory, or calculating pi to a gazillion digits) you can use setTimeout to give the browser time to catch up with any rendering tasks. You'd either need to call setTimeout for each task, or if you have a long-running task, split it up in batches first. This requires quite a bit of refactoring though, since every task needs to be represented as a function that can be passed to setTimeout.
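As a rough illustration of the batching idea (processItem and the batch size are made up for this sketch, not part of the original code):
function processInBatches(items, batchSize) {
    var i = 0;
    function runBatch() {
        var end = Math.min(i + batchSize, items.length);
        for (; i < end; i++) {
            processItem(items[i]); // hypothetical per-item work
        }
        if (i < items.length) {
            // Yield to the browser so it can repaint before the next batch.
            setTimeout(runBatch, 0);
        }
    }
    runBatch();
}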
I wouldn't use XMLHTTPRequest synchronously ever.
If setTimeout(fn, 0) does not trigger the "incremental" rendering, try a higher value, until it works. I think I needed to use a value of 100ms between jobs in some cases, for some browsers (I don't recall which).
You may need to yield to the browser even quicker if you want to achieve 60fps, or 30fps. Then you need to stay under 16ms or 33ms for each task. That gets very tight on slow hardware, such as (older types of) smartphones. Then, instead of setTimeout, you can best use requestAnimationFrame, if available.
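A small scheduling helper along those lines might look like this (a sketch only; it prefers requestAnimationFrame and falls back to setTimeout where it is unavailable):
function scheduleNextChunk(work) {
    if (window.requestAnimationFrame) {
        // Runs the next chunk right before the next repaint (~16ms at 60fps).
        window.requestAnimationFrame(work);
    } else {
        setTimeout(work, 16);
    }
}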
For asynchronous quick-checks of a URL, I use AJAX with method=HEAD:
function ajax_quickresponse(url) {
    var xmlhttp = null;
    if (window.XMLHttpRequest)
        xmlhttp = new XMLHttpRequest();
    else if (window.ActiveXObject)
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    if (xmlhttp) {
        xmlhttp.open("HEAD", url, true);
        xmlhttp.onreadystatechange = function () {
            if (xmlhttp.readyState == 4) {
                if (xmlhttp.status > 199 && xmlhttp.status < 400)
                    ; //OK
                else
                    ; //ERR
            }
        };
        xmlhttp.send(null);
    }
}
This is enough for checking the HTTP status, but it seems to abort the script on the server side (url). E.g. I can test with a simple PHP script (url="http://my-host.er/test/script.php") which does sleep(2); and logs a success message afterwards.
With xmlhttp.open("HEAD", url, true);, there is no success entry in the log.
With xmlhttp.open("GET", url, true);, there is a success entry in the log.
However, with GET/POST the JavaScript waits the full 2 seconds, whereas with HEAD the status is known instantly and the JavaScript does not need to wait for the final response.
How can I take advantage of both methods? First I need the status instantly, as soon as the header comes in, and then, once the external url/script has finished, I'd like another listener.
E.g. first alert('http status='+xmlhttp.status); and maybe delayed, depending on the url/script, alert('finally completed');
Do you have a tip on how to achieve this with one single call?
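One possible approach (just a sketch, not from the original post): keep a single GET request and branch on readyState inside the handler, since the HTTP status is already readable once the headers arrive at readyState 2, while readyState 4 signals that the server-side script has finished:
function ajax_status_then_complete(url) {
    var xmlhttp = new XMLHttpRequest();
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 2) {
            // Headers received: the status code is available right away.
            alert('http status=' + xmlhttp.status);
        } else if (xmlhttp.readyState == 4) {
            // Full response received: the server-side script has completed.
            alert('finally completed');
        }
    };
    xmlhttp.open("GET", url, true);
    xmlhttp.send(null);
}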
It starts with a shuffle function (it just shuffles arrays). It works.
Then I define 2 global variables that will determine the random order for images to be displayed on the page.
picOrder will be a simple array from 0 to picCount, with picCount determined by AJAX on load. The picCount is being retrieved, but the picOrder array is not being set! If I manually run "arrangePics();" in the console it works: it fills the picOrder array and then shuffles it. But it does not work when I place the calls to both functions inside the body's onLoad attribute, or when I put a "doStuff()" function in there.
Array.prototype.shuffle = function() {
    var s = [];
    while (this.length) s.push(this.splice(Math.random() * this.length, 1)[0]);
    while (s.length) this.push(s.pop());
    return this;
}
var picOrder = new Array();
var picCount;
function getPicCount() {
    // picCount = array(10);
    if (window.XMLHttpRequest) { // code for IE7+, Firefox, Chrome, Opera, Safari
        xmlhttp = new XMLHttpRequest();
    } else { // code for IE6, IE5
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
            picCount = xmlhttp.responseText;
        }
    }
    xmlhttp.open("GET", "/example.com/images.php?count=hello", true);
    xmlhttp.send();
    //picCount.shuffle;
}
function arrangePics() {
    for (var i = 0; i < picCount; i++) {
        picOrder[i] = i;
    }
    picOrder.shuffle();
    //alert(picOrder);
}
HTML
<body onLoad="getPicCount();arrangePics();">
or
<body onLoad="doStuff();">
You need to call arrangePics() after the asynchronous AJAX call has returned, i.e. you can only call it inside the if (xmlhttp.readyState==4 && xmlhttp.status==200) {} (callback) block; otherwise you cannot be sure that the data has been fully received.
What is currently happening is that JavaScript calls getPicCount();arrangePics(); in order: the first method initiates the AJAX call and returns immediately, and the second method then tries to arrange 0 pics. Executing arrangePics() manually in the console introduced enough delay for the AJAX call to complete, so picCount was set as expected.
So if you change the callback function to:
if (xmlhttp.readyState==4 && xmlhttp.status==200) {
    picCount = xmlhttp.responseText;
    for (var i = 0; i < picCount; i++) {
        picOrder[i] = i;
    }
    picOrder.shuffle();
}
it should shuffle the pics after the count has been received.
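Equivalently, you could keep arrangePics separate and pass it in as a callback; a rough sketch (onReady is a name introduced here, not from the original code):
function getPicCount(onReady) {
    var xmlhttp = window.XMLHttpRequest
        ? new XMLHttpRequest()
        : new ActiveXObject("Microsoft.XMLHTTP");
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
            picCount = xmlhttp.responseText;
            onReady(); // arrange only once the count is actually known
        }
    };
    xmlhttp.open("GET", "/example.com/images.php?count=hello", true);
    xmlhttp.send();
}
// and in the HTML: <body onLoad="getPicCount(arrangePics);">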