What alternative is best for interval page reloading? - javascript

I'll admit I'm being lazy here: I've written code that fetches data from the database and refreshes part of the index page every second whenever something changes, so the change is visible without reloading the page.
Which alternative is faster?
function getLowPlayers() {
    var httpd;
    if (window.XMLHttpRequest) {
        httpd = new XMLHttpRequest();
    } else if (window.ActiveXObject) {
        httpd = new ActiveXObject("Microsoft.XMLHTTP");
    }
    httpd.onreadystatechange = function () {
        if (httpd.readyState == 4 && httpd.status == 200) {
            document.getElementById("lowplayers").innerHTML = httpd.responseText;
            setTimeout(getLowPlayers, 1000);
        }
    };
    httpd.open("GET", "includes/ajax.php?noCache=" + Math.random(), true);
    httpd.send();
}
or this one:
function ajaxCall() {
    $.ajax({
        url: "ajax.php",
        success: function (result) {
            var res = $.parseJSON(result);
            var str1 = "<center><img src='https://steamcommunity-a.akamaihd.net/economy/image/";
            var str3 = " ' height='60' width='70'/></br>";
            var mes = [];
            var div = [];
        }
    });
}
I know it's a silly solution to do it like this; I could set up a socket.io server, but that feels like too much work.
I understand that with many visitors, the ajax.php file would fire far too many queries per second. Is that healthy for the database? Or is it fine with today's internet speeds and hosting services?
Which one is faster? Or do you have a better solution?

The two snippets are more or less the same: the first is written in vanilla JavaScript, while the second uses jQuery and is therefore shorter and simpler.
If I had to decide, I would choose the second one, since it is easier to read and maintain.
Performance-wise the two are practically identical, although with many concurrent connections the AJAX-polling option would not be a good choice: each request to the server occupies some amount of memory, and at large numbers the page would load slowly because of the waiting queues.
So the best idea, as you say, is to configure socket.io and have the server push updates to the clients.
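For reference, a minimal socket.io sketch of that server-push approach; the "lowplayers" event name and the getLowPlayersFromDb() helper are hypothetical placeholders for your own query:
// Server (Node.js), assuming socket.io is installed:
const io = require('socket.io')(3000);
setInterval(async () => {
    const players = await getLowPlayersFromDb(); // hypothetical DB helper
    io.emit('lowplayers', players);              // push to every connected client
}, 1000);

// Client, using the socket.io client script:
const socket = io('http://localhost:3000');
socket.on('lowplayers', (players) => {
    document.getElementById('lowplayers').innerHTML = players;
});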

Related

Loading the same file via ajax 12 times at once

I hope this makes sense...
I have a page that loads the same external file 1-12 times depending on usage. At the larger counts the page takes up to a full minute to load, so I'm trying to load each file via ajax, but using a loop to fire the requests completely hangs the server.
Here's the code I'm using so far:
function getSchedule(startday, scheduleID, scheduleView) {
    $.ajax({
        type: 'POST',
        url: siteURL + '/includes/schedule-' + scheduleView,
        data: { startday: startday, scheduleID: scheduleID },
        success: function (data) {
            $('.scheduleHolder' + scheduleID).html(data).removeClass('loading');
        }
    });
}
var loadSchedules = [];
var startday = $('#the_start_day').text();
var totalSchedules = $('.scheduleHolder').length;
var i = 0;
$('.scheduleHolder').each(function () {
    var currentHolder = $(this);
    var scheduleView = currentHolder.attr('rel');
    var scheduleID = currentHolder.attr('id');
    loadSchedules.push(getSchedule(startday, scheduleID, scheduleView));
    if (totalSchedules == i) {
        $.when.apply($, loadSchedules);
    }
    i++;
});
Each file should only take 2-5 seconds to load individually, so I was really hoping to bring the total load time down from 60 seconds to 10 or so.
So, my question is: how can I load the same file multiple times, at the same time, without killing the server? Does that make sense?
I believe you need to use synchronous (i.e. sequential) requests; this will hopefully help:
jQuery: Performing synchronous AJAX requests
(the wording is a bit misleading; read here: Asynchronous and Synchronous Terms)
but I can't vouch for what will happen to your server with 12 simultaneous requests; if the other end is written well, nothing.
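As a sketch of what "one request at a time" could look like, assuming jQuery 1.8+ (where .then() chains) and that getSchedule() is changed to return the $.ajax() promise so each call starts only after the previous one finishes:
function getSchedule(startday, scheduleID, scheduleView) {
    return $.ajax({ // note the added return
        type: 'POST',
        url: siteURL + '/includes/schedule-' + scheduleView,
        data: { startday: startday, scheduleID: scheduleID },
        success: function (data) {
            $('.scheduleHolder' + scheduleID).html(data).removeClass('loading');
        }
    });
}

var chain = $.Deferred().resolve(); // start from an already-resolved promise
$('.scheduleHolder').each(function () {
    var holder = $(this);
    chain = chain.then(function () {
        return getSchedule(startday, holder.attr('id'), holder.attr('rel'));
    });
});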

JavaScript: Writing an output file within a limited & secure scenario

I would like to add a function in my javascript to write to a text file in the local directory where the javascript file is located. This means I'm not looking for some insecure way of accessing the user's file system in any way. All I care about is extracting the user's input into an html page that is accessed by my javascript then using that input as data externally. I just need a simple text file. This user input isn't actually text by the way, but rather a bunch of actions using my online game's components that the underlying javascript turns into a text string (so this particular string is what I want to save, not really even anything direct from the user).
I don't want to write to a user's file system, but rather to the directory where the JavaScript (and HTML) files are located (a folder hosted on a server). Is there any simple way to get some file I/O going?
I know JavaScript has a FileReader; is there any way to get it to do this in reverse, like a FileWriter? Google Closure looks like it has a FileWriter, but it doesn't seem to quite work, and I can't find any decent examples of how to get it to do this.
If this requires a different language, is there any way I can just get the relevant snippet and insert it into my JavaScript file?
(the folder is hosted on a Linux system, if that helps)
ADDENDUM: Elias Van Ootegem's solution below is excellent and I would highly recommend looking into it; it's a great example of client-server interaction and of getting your system to provide the data you're looking to extract. Workers are pretty interesting.
But for those of you looking at this post with the same question I initially had about JavaScript I/O, I found one other workaround, depending on your case. My team's project site used a database service, MongoDB, that stored some of the user's interaction data when the user hit a "Save" button. MongoDB, like other online database systems, provides a "dumping" function/script that you can call from your local machine/server to put that data into an output file (I was able to put the JSON data into a text file). From that output you can write a parser to extract and format the data you need, since databases like MongoDB are quite clear about what format the text will be in (very structured and organized). I wrote a parser in C (with a few libraries I had written to extend the language) to do what I needed, but the idea generalizes to other programming/scripting languages.
I also looked at leaving cookies as an option, and used a test program to try it out (it works, too!). However, one tradeoff is that cookies are generally meant to hold small amounts of data (usually things like username, creation date, and the cookie's expiration date) and they live on the user's local machine. Further, while you can extract the data in those cookies from JavaScript, you are back to the initial problem: the data still exists on the web, not in an output file on your server's file system. If you need to extract data and want some guarantee that it will exist on your machine, use Elias Van Ootegem's solution.
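For completeness, a minimal sketch of that cookie approach; dataString stands in for the string your game's JavaScript generates:
// Store the generated string client-side (size limits apply, roughly 4 KB per cookie):
document.cookie = 'gameData=' + encodeURIComponent(dataString) +
                  '; max-age=' + 60 * 60 * 24 * 30; // keep for ~30 days

// Read it back later:
function readCookie(name) {
    var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
    return match ? decodeURIComponent(match[1]) : null;
}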
JavaScript code running client-side cannot access the server's filesystem, let alone write a file there. People often say that if client-side JS had file I/O capabilities it would be rather insecure; just imagine how dangerous that would be.
What you could do is simply build your string using a Worker that, on closing, returns the full data string, which is then sent to the server in an AJAX call.
The server-side script (Perl, PHP, .NET, Ruby...) can receive this data, parse it and write the file to disk as you want it.
All in all, not very hard, but quite an interesting project anyway. Oh, and when using a worker, seeing as it's an online game and everything, a setInterval that sends (a part of) the data every 5000 ms might not be a bad idea, either.
As requested - some basic code snippets.
A simple AJAX-setup function:
function getAjax(url, method, callback)
{
    var ret;
    method = method || 'POST';
    url = url || 'default.php';
    callback = callback || success; // assuming you have a default function called "success"
    try
    {
        ret = new XMLHttpRequest();
    }
    catch (error)
    {
        try
        {
            ret = new ActiveXObject('Msxml2.XMLHTTP');
        }
        catch (error)
        {
            try
            {
                ret = new ActiveXObject('Microsoft.XMLHTTP');
            }
            catch (error)
            {
                throw new Error('no Ajax support?');
            }
        }
    }
    ret.open(method, url, true);
    ret.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
    ret.setRequestHeader('Content-type', 'application/x-www-form-urlencoded');
    ret.onreadystatechange = callback;
    return ret;
}
var getRequest = getAjax('script.php?some=Get&params=inURL', 'GET');
getRequest.send(null);
var postRequest = getAjax('script.php', 'POST', function ()
{ // passing an anonymous function here, but this could just as well have been a named function reference, obviously...
    if (this.readyState === 4 && this.status === 200)
    {
        console.log('Post request complete, answer was: ' + this.response);
    }
});
postRequest.send('foo=bar'); // set different headers to post JSON.stringified data
MDN is a good place to read up on whatever you don't get from the code above. This is pretty much copy-paste code, but if you find yourself wanting to learn just a bit more, it's a great place to do just that.
WebWorkers
Now these are pretty new, so using them means not being able to support older browsers (you could support those by using event listeners to send each morsel of data to the server, but a worker allows you to bundle, pre-process and structure the data without blocking the "normal" flow of your script). Workers are often presented as a means of sort-of multi-threading JavaScript code. Here's a good intro to them.
Basically, you'll need to add something like this to your script:
var worker = new Worker('preprocess.js'); // or whatever you've called the worker
worker.addEventListener('message', function (e)
{
    var xhr = getAjax('script.php', 'post'); // using default callback
    xhr.send('data=' + e.data);
    //worker.postMessage(null); // clear state
}, false);
Your worker, then, could start off like so:
var time, txt = '';
//entry point:
onmessage = function (e)
{
    if (e.data === null)
    {
        clearInterval(time);
        txt = '';
        return;
    }
    if (txt === '' && !time)
    {
        time = setInterval(function ()
        {
            postMessage(txt);
        }, 5000); // set postMessage to be called every 5 seconds
    }
    txt += e.data; // add new text to current string...
}
Server-side, things couldn't be easier:
if ($_POST && $_POST['data'])
{
    $file = $_SESSION['filename'] ? $_SESSION['filename'] : 'File'.session_id();
    $fh = fopen($file, 'a+');
    fwrite($fh, $_POST['data']);
    fclose($fh);
}
echo 'ok';
Now all of this code is a bit crude, and most of it cannot be used in its current form, but it should be enough to get you started. If you don't know what something is, google it.
But do keep in mind that, when it comes to JS, MDN is easily the best reference out there, and as far as PHP goes, their own site (php.net/{functionName}) is pretty ugly but contains a lot of info, too...

Memory Leak When Pulling JSON from WEB

I've spent days on this and hit it from every angle I can think of. I'm working on a simple Windows 7 gadget. The script pulls JSON data from a remote web server and puts it on the page. I'm using jQuery 1.6.2 for the $.getJSON. The script consumes more memory on each loop.
var count = 1;
$(document).ready(function () {
    updateView();
});
function updateView() {
    $("#junk").html(count);
    count++;
    $.getJSON(URL + "&callback=?", populateView);
    setTimeout(updateView, 1000);
}
function populateView(status) {
    $("#debug").html(status.queue.mbleft + " MB Remaining<br>" + status.queue.mb + " MB Total");
}
Any help would be greatly appreciated....Thank you!
EDIT: Add JSON data sample
?({"queue":{"active_lang":"en","paused":true,"session":"39ad74939e89e6408f98998adfbae1e2","restart_req":false,"power_options":true,"slots":[{"status":"Queued","index":0,"eta":"unknown","missing":0,"avg_age":"2d","script":"None","msgid":"","verbosity":"","mb":"8949.88","sizeleft":"976 MB","filename":"TestFile#1","priority":"Normal","cat":"*","mbleft":"975.75","timeleft":"0:00:00","percentage":"89","nzo_id":"-n3c6z","unpackopts":"3","size":"8.7 GB"}],"speed":"0 ","helpuri":"","size":"8.7 GB","uptime":"2d","refresh_rate":"","limit":0,"isverbose":false,"start":0,"version":"0.6.5","new_rel_url":"","diskspacetotal2":"931.51","color_scheme":"gold","diskspacetotal1":"931.51","nt":true,"status":"Paused","last_warning":"","have_warnings":"0","cache_art":"0","sizeleft":"976 MB","finishaction":null,"paused_all":false,"cache_size":"0 B","finish":0,"new_release":"","pause_int":"0","mbleft":"975.75","diskspace1":"668.52","scripts":[],"categories":["*"],"darwin":false,"timeleft":"0:00:00","mb":"8949.88","noofslots":1,"nbDetails":false,"eta":"unknown","quota":"","loadavg":"","cache_max":"0","kbpersec":"0.00","speedlimit":"","webdir":"","queue_details":"0","diskspace2":"668.52"}})
EDIT 2: Stripped the code down to this and it still leaks. I think that eliminates DOM traversal as a contributor.
$(document).ready(function () {
    setInterval(updateView, 1000);
});
function updateView() {
    $.getJSON(URL + "&callback=?", populateView);
}
function populateView(status) {
}
EDIT 3: It's not jQuery. I removed jQuery and did it with straight js. Still leaks.
function init() {
    setInterval(updateView, 1000);
}
function updateView() {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", URL, false);
    xhr.setRequestHeader("If-Modified-Since", "0");
    xhr.send('');
}
So... it's not jQuery, and it's not just IE (Chrome leaks too). What the heck?! Ideas?
Thank you!
Edit 2:
If it's actually Task Manager showing the leak, then I think the next step is to investigate IE, as I believe IE is the engine used to host Windows gadgets.
If you can recreate your script in a little HTML file, you can run this tool and have a look at whether it's IE that's doing it:
http://blogs.msdn.com/b/gpde/archive/2009/08/03/javascript-memory-leak-detector-v2.aspx
Also, are you running IE8 or 9?
Edit:
Based on the JSON string in the OP, the problem is misleading here:
the bit of JavaScript posted is working perfectly fine.
The server producing the JSON is the one showing a difference in memory usage; I would investigate the website/endpoint that's creating that JSON and see what the issue is.
Just had a thought:
$.getJSON is just a shorthand for jQuery's $.ajax call.
I wonder if it makes a difference if you change your code to use $.ajax directly and explicitly disable its cache mechanism:
$.ajax({
    url: URL + "&callback=?",
    dataType: 'json',
    cache: false,
    success: populateView
});
That might stop it from trying to store responses in memory, and depending on your browser, it might be showing more memory simply because garbage collection hasn't run yet, so to speak.
I have the feeling that the setTimeout function within the updateView is causing this behaviour. To test this you can modify your code to:
$(document).ready(function () {
    setInterval(updateView, 1000);
});
function updateView() {
    $("#junk").html(count);
    count++;
    $.getJSON(URL + "&callback=?", populateView);
}
function populateView(status) {
    $("#debug").html(status.queue.mbleft + " MB Remaining<br>" + status.queue.mb + " MB Total");
}
EDIT: The setInterval function will execute the passed-in function over and over, every x milliseconds. Here's a link to the docs.
EDIT 2:
Another performance loss (although it might not be critical to the issue) is that you are traversing the DOM every second to find the $('#debug') element. You could look it up once and pass it in:
$(document).ready(function () {
    var debug = $('#debug');
    var junk = $('#junk');
    setInterval(function () { updateView(debug, junk); }, 1000);
});
function updateView(debug, junk) {
    junk.html(count);
    count++;
    $.getJSON(URL + "&callback=?", function (status) { populateView(status, debug); });
}
function populateView(status, debug) {
    debug.html(status.queue.mbleft + " MB Remaining<br>" + status.queue.mb + " MB Total");
}
Edit 3: I have changed the code above because I forgot to take the server's response into account. Assuming that queue is a property of the returned JSON, the code should be as above.
Edit 4: This is a very interesting issue. Another approach, then. Let's assume there is still some client-side script clogging the memory. What could it be? As far as I understand, the only two things left are the setInterval and the $.getJSON function. $.getJSON is a simple ajax request wrapper which fires a request and waits for the response from the server. The setInterval function is a bit more peculiar, because it sets up timers, fires functions, etc.
I think if you manage to mimic this on your server, or even just refresh this webpage in your browser every second or every 5 seconds, you will be able to see whether it is the client or the server that is at fault.
Found this thread while trying to track down the underlying reason for this problem, because I recently had a similar one, although my memory would increase by about 1 MB per minute... I pretty much isolated it to JSON parsing. Run the ajax command with type: 'text' and you should see that the memory gets cleaned up.
I found a library, json_parse.js, which recursively parses the JSON data using the JS engine (not eval). I manually parse the text data to JSON in the success callback, and this works well.
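A minimal sketch of that workaround, assuming the endpoint returns plain JSON (the JSONP-wrapped sample above would need its wrapper stripped first); JSON.parse stands in here for the json_parse.js library:
$.ajax({
    url: URL,
    dataType: 'text', // fetch as plain text so jQuery does no JSON handling of its own
    cache: false,
    success: function (raw) {
        var status = JSON.parse(raw); // parse manually in the callback
        populateView(status);
    }
});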

How to ping IP addresses using JavaScript

I want to run JavaScript code that pings 4 different IP addresses, retrieves the packet loss and latency of those ping requests, and displays the results on the page.
How do I do this?
You can't do this from JS. What you could do is this:
client --AJAX--> your server --ICMP ping--> target servers
Make an AJAX request to your server, which will then ping the target servers for you and return the result in the AJAX response (a client-side sketch follows the caveats below).
Possible caveats:
- this tells you whether the target servers are pingable from your server, not from the user's client
  - so the client won't be able to test hosts on its own LAN
  - and you shouldn't let clients check hosts on your server's internal network, if one exists
- some hosts may block traffic from certain sources and not others
- you need to limit the ping count per machine:
  - to keep the AJAX request from timing out
  - some site operators can get very upset when you keep pinging their sites all the time
  - resources: long-running HTTP requests could run into your server's maximum connection limit (check how high it is), and many users trying to ping at once might generate suspicious-looking traffic (all ICMP and nothing else)
- concurrency: you may wish to pool/cache the up/down status for a few seconds at least, so that multiple clients wishing to ping the same target won't launch a flood of pings
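Here is the promised client-side sketch of that relay; the /ping endpoint and its JSON response shape are hypothetical, and the server side would run the actual ICMP pings:
function pingViaServer(host, callback) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/ping?host=' + encodeURIComponent(host), true); // hypothetical endpoint
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            // hypothetical response shape, e.g. {"loss": 0, "avgMs": 23}
            callback(JSON.parse(xhr.responseText));
        }
    };
    xhr.send();
}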
The only method I can think of is loading e.g. an image file from the external server. When that load fails, you "know" the server isn't responding (you don't actually know, because the server could just be blocking you).
Take a look at this example code to see what I mean:
/* note that this is not an ICMP ping, but a simple HTTP request
   giving you an idea what you could do. In this simple implementation it has flaws,
   as Piskvor correctly points out below */
function ping(extServer) {
    var ImageObject = new Image();
    ImageObject.src = "http://" + extServer + "/URL/to-a-known-image.jpg"; // e.g. logo; mind the caching, maybe use a dynamic querystring
    if (ImageObject.height > 0) {
        alert("Ping worked!");
    } else {
        alert("Ping failed :(");
    }
}
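One of those flaws is that the height check runs before the image has had a chance to load. A sketch of the same idea using the asynchronous onload/onerror handlers instead (the image URL is still an assumption about the target server):
function ping(extServer, callback) {
    var img = new Image();
    img.onload = function () { callback(true); };   // got a response that parsed as an image
    img.onerror = function () { callback(false); }; // blocked, down, or simply not an image
    img.src = "http://" + extServer + "/URL/to-a-known-image.jpg?noCache=" + Date.now();
}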
I was inspired by the latest comment, so I wrote this quick piece of code.
This is a kind of "HTTP ping" which I think can be quite useful alongside XMLHttpRequest calls, for instance to figure out which is the fastest server to use in some case, or to collect rough statistics about the user's internet connection speed.
This small function simply connects to an HTTP server on a non-existent URL (which is expected to return a 404), measures the time until the server answers the HTTP request, and averages the cumulated time over the number of iterations.
The requested URL is modified randomly on each call, since I've noticed that (probably) some transparent proxies or caching mechanisms were faking results in some cases, giving extra-fast answers (faster than ICMP, actually, which is somewhat weird).
Beware to use FQDNs that point at a real HTTP server!
Results are displayed in an element with id "result", for instance:
<div id="result"></div>
Function code:
function http_ping(fqdn) {
    var NB_ITERATIONS = 4;  // number of loop iterations
    var MAX_ITERATIONS = 5; // beware: the number of simultaneous XMLHttpRequests is limited by the browser!
    var TIME_PERIOD = 1000; // 1000 ms between each ping
    var i = 0;
    var over_flag = 0;
    var time_cumul = 0;
    var REQUEST_TIMEOUT = 9000;
    var TIMEOUT_ERROR = 0;
    document.getElementById('result').innerHTML = "HTTP ping for " + fqdn + "</br>";
    var ping_loop = setInterval(function () {
        // let's change the non-existent URL each time to avoid possible side effects with web proxy-cache software on the line
        var url = "http://" + fqdn + "/a30Fkezt_77" + Math.random().toString(36).substring(7);
        if (i < MAX_ITERATIONS) {
            var ping = new XMLHttpRequest();
            i++;
            ping.seq = i;
            over_flag++;
            ping.date1 = Date.now();
            ping.timeout = REQUEST_TIMEOUT; // it could happen that the request takes a very long time
            ping.onreadystatechange = function () { // the request has returned something, let's log it (starting after the first one)
                if (ping.readyState == 4 && TIMEOUT_ERROR == 0) {
                    over_flag--;
                    if (ping.seq > 1) {
                        var delta_time = Date.now() - ping.date1;
                        time_cumul += delta_time;
                        document.getElementById('result').innerHTML += "</br>http_seq=" + (ping.seq - 1) + " time=" + delta_time + " ms</br>";
                    }
                }
            };
            ping.ontimeout = function () {
                TIMEOUT_ERROR = 1;
            };
            ping.open("GET", url, true);
            ping.send();
        }
        if ((i > NB_ITERATIONS) && (over_flag < 1)) { // all requests have been sent and have returned
            clearInterval(ping_loop);
            var avg_time = Math.round(time_cumul / (i - 1));
            document.getElementById('result').innerHTML += "</br> Average ping latency on " + (i - 1) + " iterations: " + avg_time + "ms </br>";
        }
        if (TIMEOUT_ERROR == 1) { // timeout: data cannot be accurate
            clearInterval(ping_loop);
            document.getElementById('result').innerHTML += "<br/> THERE WAS A TIMEOUT ERROR <br/>";
            return;
        }
    }, TIME_PERIOD);
}
For instance, launch with:
fp = new http_ping("www.linux.com.au");
Note that I couldn't find a simple correlation between the figures from this script and ICMP ping times on the same servers, though HTTP response time seems to grow roughly exponentially with ICMP response time. This may be explained by the amount of data transferred through the HTTP request, which can vary with the web server's flavour and configuration, the speed of the server itself, and probably other factors.
This is not very good code, but I thought it could help and possibly inspire others.
The closest you're going to get to a ping in JS is using AJAX, and retrieving the readystates, status, and headers. Something like this:
url = "<whatever you want to ping>";
ping = new XMLHttpRequest();
ping.onreadystatechange = function () {
    document.body.innerHTML += "</br>" + ping.readyState;
    if (ping.readyState == 4) {
        if (ping.status == 200) {
            result = ping.getAllResponseHeaders();
            document.body.innerHTML += "</br>" + result + "</br>";
        }
    }
};
ping.open("GET", url, true);
ping.send();
Of course you can also add conditions for different HTTP statuses, and make the output display however you want, with descriptions etc., to make it look nicer. More of an HTTP URL status checker than a ping, but it's the same idea really. You can always loop it a few times to make it feel more like a ping, too :)
I came up with something because I was bored of searching for hours for something that everyone said was "impossible"; the only thing I found used jQuery.
I came up with a new, simple way using vanilla JS (nothing other than base JavaScript).
Here's my JSFiddle: https://jsfiddle.net/TheNolle/5qjpmrxg/74/
Basically, I create a variable called "start" and give it the current timestamp. Then I set an invisible image's source to my website (which isn't an image) [this can be changed to any website]. Because it's not an image, an error is thrown, which I use to execute the second part of the code: I create a variable called "end" and give it the timestamp at that point (which differs from "start"). Afterward, I simply subtract "start" from "end", which gives me the latency it took to ping the website.
After that you have a choice: you can store the value, show it on your webpage, log it to the console, etc.
let pingSpan = document.getElementById('pingSpan');
// Remove all the way to ...
let run;
function start() {
    run = true;
    pingTest();
}
function stop() {
    run = false;
    setTimeout(() => {
        pingSpan.innerHTML = "Stopped !";
    }, 500);
}
// ... here
function pingTest() {
    if (run == true) { // Remove this line
        let pinger = document.getElementById('pingTester');
        let start = new Date().getTime();
        pinger.setAttribute('src', 'https://www.google.com/');
        pinger.onerror = () => {
            let end = new Date().getTime();
            // Change this to whatever you want; here it displays on the page directly. Keep the "end - start + 'ms'"
            pingSpan.innerHTML = end - start + "ms";
        };
        setTimeout(() => {
            pingTest();
        }, 1000);
    } // Remove this line too
}
body {
    background: #1A1A1A;
    color: white
}
img {
    display: none
}
Ping:
<el id="pingSpan">Waiting</el>
<img id="pingTester">
<br> <br>
<button onclick="start()">Start Ping Test</button>
<button onclick="stop()">Stop</button>
function ping(url) {
    new Image().src = url;
}
The function above pings the given URL.
It's generally used for counters/analytics.
Failed responses won't be reported back to the client (JavaScript).
I suggest using 'HEAD' to request only the headers:
xhr.open('HEAD', 'assets/imgPlain/pixel.txt' + cacheBuster(), true);
and then checking for readyState 2 (HEADERS_RECEIVED: send() has been called, and headers and status are available):
xhr.onreadystatechange = function () {
    if (xhr.readyState === 2) {
        // headers received: the server answered
    }
};
Is it possible to ping a server from Javascript?
Check out the solution above; it's pretty slick.
Not mine, obviously, but I wanted to make that clear.
You can't PING with JavaScript. I created a Java servlet that returns a 10x10-pixel green image if the host is alive and a red image if it's dead. https://github.com/pla1/Misc/blob/master/README.md

Using JavaScript to perform a GET request without AJAX

Out of curiosity, I'm wondering about the best (easiest, fastest, shortest, etc; make your pick) way to perform a GET request in JavaScript without using AJAX or any external libraries.
It must work cross-browser, and it's not allowed to distort the hosting web page visually or affect its functionality in any way.
I don't care about headers in the request, just the url-part. I also don't care about the result of the request. I just want the server to do something as a side effect when it receives this request, so firing it is all that matters. If your solution requires the servers to return something in particular, that's ok as well.
I'll post my own suggestion as a possible answer, but I would love it if someone could find a better way!
Have you tried using an Image object? Something like:
var req = new Image();
req.onload = function () {
    // Probably not required if you're only interested in
    // making the request and don't need a callback function
};
req.src = 'http://example.com/foo/bar';
function GET(url) {
    var head = document.getElementsByTagName('head')[0];
    var n = document.createElement('script');
    n.src = url;
    n.type = 'text/javascript';
    n.onload = function () { // this is not really mandatory, but removes the tag when finished
        head.removeChild(n);
    };
    head.appendChild(n);
}
I would go with Pekka's idea and use a hidden iframe. The advantage is that no further parsing will be done: for an image the browser tries to parse the result as an image, and for a dynamically created script tag it tries to parse the result as JavaScript code. An iframe is "hit and run": the browser doesn't care what's in there.
Changing your own solution a bit:
function GET(url) {
    var oFrame = document.getElementById("MyAjaxFrame");
    if (!oFrame) {
        oFrame = document.createElement("iframe");
        oFrame.style.display = "none";
        oFrame.id = "MyAjaxFrame";
        document.body.appendChild(oFrame);
    }
    oFrame.src = url;
}
