Multiple AJAX requests on page load are slow. Bad method? - javascript

I should start by saying that I'm running my site from PhpStorm's built-in web server, but I think the problem is in my method, not the environment.
I'm redesigning my outdated games website. I thought it would be a good idea to have my PHP-dependent content loaded by AJAX instead of by includes at the top of the page, which can only run once.
In this example, the page itself loads very quickly, but there are 2 sections with independent AJAX-loaded content that take unusually long to appear. The 3 blue lines inside the boxes animate to indicate that the content is loading.
The relevant jQuery and JavaScript code is in the HTML body, as the site uses a template (containing the header, footer and basically everything that is required for each page) and simply includes the unique page content in the middle, e.g. include 'main.php';. index.php doesn't have a head or body; it just contains JS and HTML.
However, I wait until $(document).ready to run any JavaScript, so I don't think this is important.
The first thing to load is the news section, because it is called first in the JavaScript. This simply grabs a single row from a MySQL table. This alone takes up to 2 seconds.
Next come the last 3 tweets on my timeline. I run a very efficient, simplified Twitter-timeline-retrieving PHP script, and this also takes around 2 seconds.
Also consider that I've yet to implement a getGames.php for retrieving all the necessary game info from the database; the wide thumbnails are currently just placeholders, so in total my site will take even longer to load all of its content.
Anyway, I realised that if I just do include 'getTweets.php' and include 'getNews.php', the raw content appears to load much, much quicker, no AJAX necessary. However, the echoed data gets printed on the page.
The thing is, I'm likely never to call those AJAX functions again on this particular page, so the reason for using AJAX here is simply consistency and thinking ahead; but there are other pages (the news section, for instance) where I will have to use AJAX to get news articles via getNews.php.
Obviously I don't want to have 2 files with almost exactly the same PHP code, but I can't see any elegant solution for supporting both AJAX and includes.
Perhaps I could have one great big ajax.php which returns all the necessary page data in a single call, instead of calling AJAX multiple times with different PHP scripts?
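Something like this is what I have in mind, a rough sketch where ajax.php, the sections parameter and the render helpers are all hypothetical at this point:

ajax("/ajax/ajax.php", { sections: "news,tweets", tweetCount: 3 }, function (data) {
    // one round trip instead of two; these helpers would wrap the
    // two success callbacks shown further down
    renderNews(data.news);
    renderTweets(data.tweets);
});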
I don't know...
I'm sure there's nothing wrong with the PHP code itself (getting tweets or news from the database was coded with efficiency in mind), just my method.
I wasn't going to include any code, as I don't think it makes my problem any easier to understand, but anyway:
index.php, JavaScript section:
function ajax(page, inputData, outputHandler, errorHandler) {
    $.ajax({
        type: "POST",
        url: page,
        data: inputData,
        success: function (data) {
            outputHandler(JSON.parse(data));
        },
        error: errorHandler
    });
}
$(function () {
    $("#tweet-container").append($.parseHTML(loadingIcon));
    $("#news-container").append($.parseHTML(loadingIcon));
    ajax("/ajax/getNews.php", {}, function (data) {
        $("#news-container").empty();
        for (var i = 0; i < data.length; i++) {
            var date = data[i].dateadded;
            date = $.timeago(date);
            var text = data[i].news;
            text = text.replace(/<br\s*\/?>/gi, "\n");
            text = $.trim($(text).text());
            //'+ data[i].author+'
            text = '<img src="/images/Frosty.png" alt="%s" align="left" style="padding:6px 5px; background:pink; margin-right:5px;"/>' + text;
            var title = data[i].title;
            var newsData = { header: date, title: title, content: text };
            var article = Mustache.render(newsTemplate, newsData);
            $("#news-container").append(article);
        }
        $(".news-content").dotdotdot({ height: 200 });
        $(Mustache.render(footerTemplate, { link: "/news", target: "_self", content: "Read More" })).appendTo($("#news-container"));
    });
    ajax("/ajax/getTweets.php", { count: 3 }, function (data) {
        $("#tweet-container").empty();
        for (var i = 0; i < data.length; i++) {
            var text = processTweetURLs(data[i].text);
            var date = $.timeago(data[i].date);
            var tweetData = { header: date, content: text };
            var tweet = Mustache.render(tweetTemplate, tweetData);
            $(tweet).appendTo($("#tweet-container"));
        }
        $(".twitter-content").dotdotdot();
        $(Mustache.render(footerTemplate, { link: "https://twitter.com/gp_studios", target: "_blank", content: "More Tweets" })).appendTo($("#tweet-container"));
    });
    createGameBoxGrid(GAME_BOX_WIDE, ".featured-games-list", 3, [{}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}]);
    $(".main-left").height($(".main-center").height());
    createGameBoxGrid(GAME_BOX_SMALL, ".main-category-games-list", 9, [{}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}]);
});
getNews.php
<?php
include($_SERVER["DOCUMENT_ROOT"]."/class/Connection.php");
$mysql = new Connection();
$articles = [];
//SUBSTRING(news,1,1000)
if ($result = $mysql->query("SELECT id, title, news, author, dateadded FROM news ORDER BY dateadded DESC LIMIT 1")) {
    while ($row = $result->fetch_assoc()) {
        array_push($articles, [
            "id" => $row["id"],
            "title" => $row["title"],
            "news" => $row["news"],
            "author" => $row["author"],
            "dateadded" => $row["dateadded"]
        ]);
    }
}
echo json_encode($articles);
?>
What I've written above is a bit waffley, but it was necessary to get across all the relevant info. So, in simpler terms and phrased more like questions:
Why is AJAX so slow compared to just running the PHP before the page has fully loaded?
My AJAX PHP scripts echo long and often complex JSON; how can I use the same script for both AJAX and includes? I've tried surrounding the include in div tags (a method I don't like at all), but if the JSON contains HTML it gets interpreted as such and gets somewhat mangled.
Any general hints for making this work a lot better?

The problem you are probably running into is the fact that browsers limit the number of concurrent connections to the same server. See also this question on SO.
So if you put your code in a $(document).ready() function, what actually happens is that the DOM is ready, so all images start loading, and your AJAX call will not go through until a connection to the same server becomes available.
So your AJAX request can be fast, but it starts late, giving the impression of a long-running call.
You can solve this, for example, by putting all static assets like images on a different sub-domain.
However, you can also use the same script for your AJAX call and as an include, so that you don't have to duplicate code. You can then include it or call it via AJAX and get the same results.
You could use something like:
if (!isset($some_environment_variable)) {
    // we were called directly via AJAX rather than included:
    // set up the environment here
}
// do your PHP controller stuff
// output the results (e.g. echo json_encode(...) for the AJAX case)

Related

Don't take the GET parameters from the URL

Recently I made a small refactor of my application, and the paginated tables have stopped working; I think it is because the code is no longer picking up the current page that is passed as a GET parameter in the URL.
Before the refactor I had an index.php that contained all the HTML along with the PHP code, but to optimize and avoid duplicating code, I started to separate the navigation menu, the top navbar, and the content into different .php files, so that when I open a page, this code loads the entire page:
$(function () {
    $.get("../navbar.php", function (data) {
        $("#navbar-menu").append(data);
    });
    $.get("../menu.php", function (data) {
        $("#sidebar-wrapper").append(data);
    });
    $.get("./content.php", function (data) {
        $("#divPersonal").append(data);
    });
});
When I visit, for example, one of these pages where I have a table with pagination and links of the type:
http://localhost/app/modules/users/index.php?page=2
and index.php loads content.php (where I have the PHP call to getUsers()) via JavaScript, it should pick up the URL and the page parameter, but it is not doing so, and it seems to be due to the way my index.php is assembled. I cannot find an optimal solution to this problem. I take the parameters directly when I call the function, with an if:
if (empty($_GET['page'])) {
    $firstPage = 0;
    $actualPage = 1;
} else {
    $actualPage = $_GET["page"];
    $firstPage = ($actualPage - 1) * $SIZE;
}
If anyone can help me, thank you.
Name the pages with their titles and do something like this to switch between pages:
if (isset($_POST["page"])) {
    switch ($_POST["page"]) {
        case "login":
            include '/folder/page.php';
            break;
    }
}
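The more direct cause here, though, is probably that content.php is fetched without the parent page's query string, so its PHP never sees $_GET['page']. A minimal sketch of forwarding it (my assumption, not part of the original answer):

$(function () {
    // Forward ?page=2 (and any other parameters) from the parent
    // URL to the fragment so its PHP can read $_GET['page'].
    $.get("./content.php" + window.location.search, function (data) {
        $("#divPersonal").append(data);
    });
});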

How do I parse data from a JSON API that has multiple pages

Objective: Parse JSON from an API where results are listed across multiple pages.
I am new to JSON and to working with data in general. I want to know how to write a function that will update the URL, output the results for each page, and stop when it reaches a page that is empty.
This problem uses a Shopify URL displaying JSON data, used for trivial purposes and not as part of a real application.
https://shopicruit.myshopify.com/admin/orders.json?page=1&access_token=c32313df0d0ef512ca64d5b336a0d7c6
Each page has 50 objects. I'm making an $.ajax request to the URL, but the URL has page=1 as a query parameter:
$.ajax({
    url: "https://shopicruit.myshopify.com/admin/orders.json?page=1&access_token=c32313df0d0ef512ca64d5b336a0d7c6",
    method: 'get',
    dataType: 'JSON'
}).done(function (response) {
so the response I am getting back is only for the results of page one (obviously). I know there are more pages, because if I manually put a 2 in place of the 1 I can see different data, and this goes on for multiple pages. I have tried removing the page option and setting it to all and any, but these just display page 1. I thought maybe leaving the page option out would cure the problem, but it does not.
How do I get all the pages of data with an ajax call?
Is it a function that takes the $.ajax call inside of it, adds page++, and makes a new call for each page? I still don't know how to write that, sadly.
The Shopify API docs do give some examples of how to display "all data", but I tried what they suggested and it did not work, so I'm not sure it's applicable to the problem; just in case it is:
https://help.shopify.com/api/reference/order
Here is a simplistic answer: it will get pages until there is clearly no more data, i.e. once a page returns fewer than limit orders.
function getAllData(page) {
    return $.ajax({
        url: "https://shopicruit.myshopify.com/admin/orders.json?page=" + (page || 1) + "&limit=250&access_token=c32313df0d0ef512ca64d5b336a0d7c6",
        method: 'get',
        dataType: 'JSON'
    }).then(function (response) {
        if (page && response.orders.length == 250) {
            return getAllData(page + 1)
                .then(function (more) {
                    return response.orders.concat(more);
                });
        }
        return response.orders;
    });
}
getAllData(1).then(function (orders) {
    // orders is an array of orders
});
Note I've used 250 for limit to get 250 at a time
I say this is simplistic because, while it does get all the data, you need to wait until all of it has been retrieved before you can use it; this may take too long for your "user experience", but it should get you to a place where you can start.
There's logic in the code such that if page is omitted (or 0), only the first page will be retrieved regardless of how many items are in it, so you could do something like:
getAllData().then(function (page1data) {
    // do something with page 1
}).then(function () {
    return getAllData(2); // start at page 2
}).then(function (restOfData) {
    // do something with restOfData, pages 2+
});
One thing to note about
.then(function (response) {
is that for a single $.ajax call, jQuery passes the parsed data as the first argument to the .then callback, so this works as written. You would only need to unpack the arguments, e.g.
.then(function (r) {
    var response = r[0];
if you were combining several requests with $.when().

Instagram grabbing photos based on hashtag

setInterval(function () {
    $("ul").empty();
    var tag = $("#add").data('tag'),
        maxid = $("#add").data('maxid');
    $.ajax({
        type: 'GET',
        url: 'ajax.php',
        data: {
            tag: tag,
            max_id: maxid
        },
        dataType: 'json',
        cache: false,
        success: function (data) {
            // Output data
            $.each(data.images, function (i, src) {
                $('ul#photos').append('<li><img src="' + src + '"></li>');
            });
            // Store new maxid
            $('#add').data('maxid', data.next_id);
        }
    });
}, 20000);
I'm trying to load 1 picture at a time at an interval of 20s. However, for a certain hashtag with only 27 photos, it loads well until the last 20, which all load at once even though I'm limiting it to just one. It's always the last 20.
How do I load them 1 at a time for the last 20?
It's difficult to say exactly without looking at your PHP script, but what I can say is that you are iterating over an array of returned photos (using $.each) and appending each photo from that array to your DOM.
So one thing would be: don't iterate over the array of photos, and just access the first index of it (data.images[0]). If you can't figure out why your server-side code is returning more photos than you want (which you should investigate), just grab all the photos and set a timeout that adds one of the returned photos every 20s after you've made the network request for all of them, as sketched below. This would mean fewer network requests as well, so maybe it would be a good solution for you.
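A rough sketch of that second idea (assuming the same ajax.php and markup as in the question): request the photos once, then drip-feed them into the DOM one at a time, 20 seconds apart.

$.getJSON('ajax.php', { tag: $('#add').data('tag') }, function (data) {
    $.each(data.images, function (i, src) {
        setTimeout(function () {
            // the i-th photo appears after i * 20 seconds
            $('ul#photos').append('<li><img src="' + src + '"></li>');
        }, i * 20000);
    });
    $('#add').data('maxid', data.next_id); // keep the pagination cursor
});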
If you want to make up to 20 AJAX requests (maybe not an optimal solution) and you are getting more images back than you want, then your PHP needs to be investigated, and right now you're only showing us the client-side code.

Remote json and Javascript

I'm currently developing a web application for the school where I work. We have a program called FROG on one of our dedicated servers. Unfortunately this is very locked down, and you create websites using a GUI. The most coding you can do on it is HTML and JavaScript.
I want to be able to retrieve information from a remote server which we also own. I can't use AJAX due to cross-domain restrictions. However, I have come up with a workaround.
I have this function call on my remote server within a file called xrequest.js:
loadNotices({[{title: 'this is a test'},{title: 'this is another test'}]});
This is simply a function call with a JSON object passed as an argument (the argument will ultimately be generated from data retrieved from a database).
On my other, restricted server, I have this JavaScript:
<script type="text/javascript">
    function loadNotices(data) {
        alert(data);
    }
    var url = "http://somedomain.com/tests/xrequest.js";
    var script = document.createElement('script');
    script.setAttribute('src', url);
    document.getElementsByTagName('head')[0].appendChild(script);
</script>
<div id="notices"></div>
What I want to do is loop through each of the titles in the xrequest.js file and display them as a list.
I'm unsure how to loop through the titles.
If you need any more information, please leave a comment. Any help is appreciated.
Many thanks
Phil
To loop over the titles, you first need to remove the curly braces around your array. Afterwards, loop through the titles like below:
function loadNotices(arr) {
    var title, i = 0;
    for (; i < arr.length; i++) {
        title = arr[i].title; // use each title here
    }
}
Also, look into changing:
document.getElementsByTagName('head')[0].appendChild(script);
to
document.head.appendChild(script);
Your implementation looks like a JSONP call. With jQuery you can make it easy:
$.getJSON('url?callback=?', { /* data */ }, function (data) {
});
With ?callback=? at the end of the URL, jQuery auto-creates a random callback function name. On your server, instead of returning plain JSON, you add a wrapper callback function around it.
Example with PHP:
$callback = $_GET['callback'];
echo $callback.'('.json_encode($obj).');';
which will become
callback(<your return data>);
and your script will receive that.
loadNotices({[{title: 'this is a test'},{title: 'this is another test'}]});
This function call is not correct; do this instead:
loadNotices([{title: 'this is a test'},{title: 'this is another test'}]);
Then you can loop through your titles like this:
for (var i = 0; i < titles.length; i++) {
    alert(titles[i].title);
}
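Putting the pieces together, a minimal sketch (my own, building on the answers above) that renders the titles into the #notices div from the question:

function loadNotices(notices) {
    // build a <ul> of titles and attach it to the #notices div
    var list = document.createElement('ul');
    for (var i = 0; i < notices.length; i++) {
        var item = document.createElement('li');
        item.appendChild(document.createTextNode(notices[i].title));
        list.appendChild(item);
    }
    document.getElementById('notices').appendChild(list);
}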

Caching AJAX query results with prototype

I'm looking at putting together a good way of caching the results of AJAX queries so that the same user doesn't have to repeat the same query on the same page twice. I've put something together using a Hash, which works fine, but I'm not sure if there's a better method I could be using. This is a rough snippet of what I've come up with, which should give you a general idea:
var ajaxresults;
document.observe("dom:loaded", function() {
    ajaxresults = new Hash();
    doAjaxQuery();
});

function doAjaxQuery() {
    var qs = '?mode=getSomething&id=' + $('something').value;
    if (ajaxresults.get(qs)) {
        // cache hit: reuse the stored response
        var vals = (ajaxresults.get(qs)).evalJSON();
        doSomething(vals);
    } else {
        new Ajax.Request('/ajaxfile.php' + qs, {
            evalJSON: true,
            onSuccess: function(transport) {
                var vals = transport.responseText.evalJSON();
                ajaxresults.set(qs, transport.responseText);
                doSomething(vals); // moved here from onComplete, where vals was out of scope
            }
        });
    }
}
Did you try caching the AJAX requests by setting cache-control headers in the response? That's another way, and your browser will take care of the caching; you don't have to maintain cached data in a hash inside your library.
High Performance Web Sites discusses this at length. I don't know much about PHP, but in the .NET world there is a way to set cache headers before writing the response to the stream; I'm sure there is a similar way in PHP too.
If you start building a results tree with JSON, you can check whether a particular branch (or entry) exists in the tree. If it doesn't, you can go and fetch it from the server.
You can then serialize your JSON data and store it in window.name, which lets you persist the data from page to page as well.
Edit:
Here's a simple way to use JSON for this type of task:
var clientData = {};
clientData.dataset1 = [
    { name: 'Dave', age: '41', userid: 2345 },
    { name: 'Vera', age: '32', userid: 9856 }
];

if (clientData.dataset2) {
    alert("dataset 2 loaded");
} else {
    alert("dataset 2 must be loaded from server");
}

if (clientData.dataset1) {
    alert(clientData.dataset1[0].name);
} else {
    alert("dataset 1 must be loaded from server");
}
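The window.name part of the suggestion might look something like this (a sketch of my own, not from the original answer; assumes a browser with native JSON support):

// restore the cache if a previous page stored it in window.name
if (window.name) {
    try { clientData = JSON.parse(window.name); } catch (e) { clientData = {}; }
}
// persist the cache across same-tab navigations
window.onbeforeunload = function () {
    window.name = JSON.stringify(clientData);
};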
Well, I guess you could abstract it some more (e.g. extend Ajax with a cachedRequest() method that hashes a combination of all parameters, making it universally usable in any AJAX request), but the general approach looks fine to me, and I can't think of a better/faster solution.
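Such a cachedRequest() abstraction might look roughly like this (a Prototype-style sketch under my own naming; Object.toQueryString and Hash are Prototype APIs):

var cachedRequest = (function () {
    var cache = new Hash();
    return function (url, params, onResult) {
        // hash the URL plus all parameters into one cache key
        var key = url + '?' + Object.toQueryString(params);
        if (cache.get(key)) {
            onResult(cache.get(key).evalJSON());
            return;
        }
        new Ajax.Request(url, {
            method: 'get',
            parameters: params,
            onSuccess: function (transport) {
                cache.set(key, transport.responseText);
                onResult(transport.responseText.evalJSON());
            }
        });
    };
})();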
