I'm making a page that should get images from a database with AJAX and then display them with a carousel plugin. Here is how it should work:
An image URL is saved to the database from the admin area.
The frontend script makes an AJAX call to a PHP file, which returns all the URLs from the database.
Once I have all the images, the carousel plugin can be applied.
The images are then displayed one by one.
The trick is that I want to make another AJAX call when the last image is displayed and repopulate the container div with fresh images from the database. Here is my code:
function get_images(){
var data = {
"function_name" : "get_images",
"args" : ""
};
$.ajax({
type : "POST",
url : "http://localhost/.../functions.php",
data : data,
dataType : "JSON",
success : function(ret){
$(".container").html(ret);
$(".container").carousel(//When last call - function again);
}
});
}
And here is the problem. On AJAX success the carousel plugin starts rotating the images, and when it finishes, the get_images function should be called again. But that call has to happen inside the success callback of the previous one, so every call ends up nested one level deeper. Is there any way of doing this?
The best approach would be for the carousel to fire an event when it needs more images, and for you to listen for that event. You can then retrieve more images and hand the new list to the carousel. There are carousel plugins that have all this built in. I often use this one: http://caroufredsel.dev7studios.com/
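If you roll it yourself, the general shape could be a custom jQuery event. This is only a rough sketch: the "needMoreImages" event name is made up, and the trigger would be wired into whatever "last slide shown" callback your carousel plugin exposes.
$(".container").on("needMoreImages", function () {
    $.post("http://localhost/.../functions.php",
        { "function_name": "get_images", "args": "" },
        function (urls) {
            var html = $.map(urls, function (url) {
                return '<img src="' + url + '" alt="">';
            }).join("");
            $(".container").html(html); // repopulate with fresh images
            // re-initialise / refresh the carousel here
        },
        "json");
});
// inside the carousel's "last image shown" callback:
// $(".container").trigger("needMoreImages");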
Maybe in your PHP script:
<?php
if ( isset($_POST['function_name']) ) {
if ( $_POST['function_name'] == 'get_images' ) {
// Get some URLs into an array...
// For the sake of the example we'll manually fill the array
$urls = ['img/image1.png', 'img/image2.png'];
echo json_encode($urls);
}
}
And in your JS:
function Carousel() {
this.getMoreImages();
}
Carousel.prototype.nextImage = function () {
if (this.i < this.images.length) {
this.showImage(); // Implement this.
this.i += 1;
this.spin();
}
else {
this.getMoreImages();
}
}
Carousel.prototype.spin = function () {
var self = this;
setTimeout(function () {
self.nextImage();
}, 5000);
}
Carousel.prototype.getMoreImages = function () {
var self = this;
$.ajax({
type : 'POST',
url : 'http://localhost/.../functions.php',
data : { 'function_name' : 'get_images', 'args' : '' },
dataType : 'JSON',
success : function (ret) {
self.images = ret; // with dataType 'JSON', jQuery has already parsed the response
self.i = 0;
self.spin();
}
});
}
var myCarousel = new Carousel();
This Carousel object will request an array of images on instantiation, and show each image on a 5-second interval. When all the images have been exhausted, it will automatically make another AJAX call, retrieving images in the same manner as it did originally, and then continue looping through the new images. It will continue in this cycle forever.
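showImage is left as an exercise above; a minimal sketch, assuming the container holds a single <img> element and this.images is the array of URLs returned by the PHP script, could be:
Carousel.prototype.showImage = function () {
    var url = this.images[this.i];
    $(".container img").fadeOut("fast", function () {
        $(this).attr("src", url).fadeIn("fast");
    });
};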
Here's the problem. I'm making a callback to the server that receives an MVC partial page. It's been working great: it calls the success function and all that. However, after that I'm calling a function which iterates through specific elements:
$(".tool-fields.in div.collapse, .common-fields div.collapse").each(...)
Inside this, I'm checking for a specific attribute (a custom one using data-), which is also working great; however, the iterator never finishes. No error messages are given and the program doesn't hang. It just quits.
Here's the function with the iterator
function HideShow() {
$(".tool-fields.in div.collapse, .common-fields div.collapse").each(function () {
if (IsDataYesNoHide(this)) {
$(this).collapse("show");
}
else
$(this).collapse("hide");
});
alert("test");
}
Here's the function called in that, "IsDataYesNoHide":
function IsDataYesNoHide(element) {
var $element = $(element);
var datayesnohide = $element.attr("data-yes-no-hide");
if (datayesnohide !== undefined) {
var array = datayesnohide.split(";");
var returnAnswer = true;
for (var i in array) {
var answer = array[i].split("=")[1];
returnAnswer = returnAnswer && (answer.toLowerCase() === "true");
}
return returnAnswer;
}
else {
return false;
}
}
This is the way the attribute appears
data-yes-no-hide="pKanban_Val=true;pTwoBoxSystem_Val=true;"
EDIT: Per request, here is the jQuery $.post:
$.post(path + conPath + '/GrabDetails', $.param({ data: dataArr }, true), function (data) {
ToggleLoader(false); //Page load finished so the spinner should stop
if (data !== "") { //if we got anything back of if there wasn't a ghost record
$container.find(".container").first().append(data); //add the content
var $changes = $("#Changes"); //grab the changes
var $details = $("#details"); //grab the current
SplitPage($container, $details, $changes); //Just CSS changes
MoveApproveReject($changes); //Moves buttons to the left of the screen
MarkAsDifferent($changes, $details); //Adds the data- attribute and colors differences
}
else {
$(".Details .modal-content").removeClass("extra-wide"); //Normal page
$(".Details input[type=radio]").each(function () {
CheckOptionalFields(this);
});
}
HideShow(); //Hide or show fields by business logic
});
For a while, I thought the jQuery collapse was breaking, but putting in the simple alert('test') showed me what was happening: the iterator just never finishes.
Is there a specific length of time after which a callback from a jQuery post can no longer run? I'm loading everything in modal views, which would suggest "maybe jQuery is included twice", but I've already had that problem with other things and have made sure it is only included once: the include appears once in the entire app and the layout is only applied to the main page.
I'm open to any possibilities.
Thanks!
~Brandon
Found the problem. I had a variable that was sometimes being set to undefined, causing it to silently crash. I have no idea why there was no error message.
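For what it's worth, one way a loop like this can die on an undefined value: with a trailing ";" in data-yes-no-hide, split(";") produces an empty last entry, its split("=")[1] is undefined, and undefined.toLowerCase() throws inside the .each() callback, which may not surface as a visible error depending on how the callback is invoked. A defensive version of IsDataYesNoHide (just an illustrative sketch, not necessarily the exact variable that was undefined here) would be:
function IsDataYesNoHide(element) {
    var datayesnohide = $(element).attr("data-yes-no-hide");
    if (datayesnohide === undefined) {
        return false;
    }
    var parts = datayesnohide.split(";");
    var returnAnswer = true;
    for (var i = 0; i < parts.length; i++) {
        if (parts[i] === "") continue;             // skip the empty entry from a trailing ";"
        var answer = parts[i].split("=")[1] || ""; // guard against a missing "="
        returnAnswer = returnAnswer && (answer.toLowerCase() === "true");
    }
    return returnAnswer;
}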
I am having some ordering issues. I have some code that does the following:
on page load, loop through 3 tables, grab content from the server, and populate each table with that content
make the tables responsive
I am having issues making this work. I can do it fine through inspect element (calling the functions manually), but that's not user friendly. I want to know if there's a way I can control the order in which the functions are executed.
What I have so far is:
$(document).ready(function() {
if (dateCounter == null){ //start calendar from today's date
var current = new Date();
dateChange(current, "", 0); //function one to get grab all contents
//make table responsive
var switched = false;
var updateTables = function() {
if (($(window).width() < 992) && !switched ){
console.log("window width < 992px");
switched = true;
$("table.responsive").each(function(i, element) {
console.log("splitting table up");
splitTable($(element));
});
return true;
}
else if (switched && ($(window).width() > 992)) {
switched = false;
$("table.responsive").each(function(i, element) {
unsplitTable($(element));
});
}
};
function splitTable(original){...}
function unsplitTable(original){...}
}
});
In theory, on page load, it should populate the tables first and then make them responsive, but that's not the case. Everything seems to run concurrently, so I get lots of missing/hidden content in my tables. I don't know if the AJAX call in my dateChange function has anything to do with preventing my table from displaying content correctly.
Following is a code snippet of the dateChange function:
function dateChange(dateInput, nGuests, vName){
//format dates
//For each table (3 tables)
$(".title").each(function(index, element) {
//prepare HTML for grabbing content from server
//Grab content from server and pop into table
$.ajax({
url: "/grab_Content.asp?boatType="+boatName+"&date="+dateInput+"&guests="+guests+"&boatName="+vName+"",
dataType:"html",
success: function(data){
table.html(data);
}
});
});
}
Yes, AJAX calls are asynchronous. $.ajax returns a promise that you can use to control sequence. First, return the promise from dateChange:
function dateChange(dateInput, nGuests, vName){
return $.ajax({
//...
});
}
Then when you call it:
dateChange(current, "", 0).then(function() {
//make table responsive
var switched = false;
//....
});
That will make sure the AJAX call completes before you make the table responsive.
If you have multiple AJAX calls, you'll have to store the promises in an array and use $.when:
var promises = [];
$('.whatever').each(function() {
var promise = $.ajax({ /*...*/ });
promises.push(promise);
});
$.when.apply($, promises).done(function() {
console.log('all done');
// do work....
});
Note that we have to use Function.prototype.apply because $.when expects each promise as a separate argument; if you pass it an array, it treats the array as a single plain value and resolves immediately.
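Applied to the dateChange above, which fires one request per .title element, that could look roughly like this (a sketch: boatName, guests and table come from the parts of the original function that were elided):
function dateChange(dateInput, nGuests, vName) {
    var promises = [];
    $(".title").each(function (index, element) {
        // ...build boatName, guests and the target `table` as before...
        var promise = $.ajax({
            url: "/grab_Content.asp?boatType=" + boatName + "&date=" + dateInput +
                 "&guests=" + guests + "&boatName=" + vName,
            dataType: "html",
            success: function (data) {
                table.html(data);
            }
        });
        promises.push(promise);
    });
    // resolves once every table's request has finished
    return $.when.apply($, promises);
}

dateChange(current, "", 0).then(function () {
    // make the tables responsive here
});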
This topic is covered in a few other questions, but I had some difficulty applying the suggested approaches to this use case. I have a checkbox list where a user can select n sub-sites to publish their post to. Since this list could grow to 100+, I need an efficient way to perform an expensive task for each one. It's okay if it takes a while, as long as I'm providing visual feedback, so I planned to apply an "in progress" style to each checkbox item as it's being worked on, then move to the next item in the list once it is successfully published. Also note: I'm working with the WordPress wp_ajax_ hook, but the PHP side of things is working well; this question is focused on the JS solution.
This code is working right now (console.logs left in for debugging), but I've seen multiple warnings against using async: false. How can I achieve a waterfall AJAX loop in a more efficient way?
//Starts when user clicks a button
$("a#as_network_syndicate").click( function(e) {
e.preventDefault(); //stop the button from loading the page
//Get the checklist values that are checked (option value = site_id)
$('.as-network-list').first().find('input[type="checkbox"]').each(function(){
if($(this).is(':checked')){
blog_id = $(this).val();
console.log(blog_id+' started');
$(this).parent().addClass('synd-in-progress'); //add visual feedback of 'in-progress'
var process = as_process_syndication_to_blog(blog_id);
console.log('finished'+blog_id);
$(this).parent().removeClass('synd-in-progress');
}
});
});
function as_process_syndication_to_blog(blog_id){
var data = {
"post_id": $('#as-syndicate_data-attr').attr("data-post_id"), //these values are stored in hidden html elements
"nonce": $('#as-syndicate_data-attr').attr("data-nonce"),
"blog_id": blog_id
};
var result = as_syndicate_to_blog(data);
console.log('end 2nd func');
return true;
}
function as_syndicate_to_blog(data){
$.ajax({
type : "post",
dataType : "json",
async: false,
url : ASpub.ajaxurl, //reference localized script to trigger wp_ajax PHP function
data : {action: "as_syndicate_post", post_id : data.post_id, nonce: data.nonce, blog_id: data.blog_id},
success: function(response) {
if(response.type == "success") {
console.log(response);
return response;
} else {
}
},
error: function() {
}
});
}
Indeed, doing synchronous AJAX requests is bad because they block the browser for the whole duration of the call. This means the user cannot interact with your page during that time. In your case, if you're doing, say, 30 AJAX calls that each take 0.5 seconds, the browser will be blocked for 15 whole seconds; that's a lot.
In any case, you could do something following this pattern:
// some huge list
var allOptions = [];
function doIntensiveWork (option, callback) {
// do what ever you want
// then call 'callback' when work is done
callback();
}
function processNextOption () {
if (allOptions.length === 0)
{
// list is empty, so you're done
return;
}
// get the next item
var option = allOptions.shift();
// process this item, and call "processNextOption" when done.
// If "doIntensiveWork" is asynchronous (using AJAX for example),
// the call below is fine as it is:
doIntensiveWork(option, processNextOption);
// But if "doIntensiveWork" is synchronous, you should let the browser
// breathe a bit between items, like this (instead of the call above):
// doIntensiveWork(option, function () {
//     setTimeout(processNextOption, 0);
// });
}
processNextOption();
Notice: as said by Karl-André Gagnon, you should avoid doing many AJAX requests using this technique. Try combining them if you can; it will be better and faster.
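If combining is an option, one request could carry all the checked blog IDs and let PHP loop over them server-side. A rough sketch; the "as_syndicate_posts_bulk" action name is an assumption, so a matching wp_ajax_ handler would have to be written for it:
// collect every checked blog id in one go
var blog_ids = $('.as-network-list').first()
    .find('input[type="checkbox"]:checked')
    .map(function () { return $(this).val(); })
    .get();

$.post(ASpub.ajaxurl, {
    action: "as_syndicate_posts_bulk",   // hypothetical bulk action name
    post_id: $('#as-syndicate_data-attr').attr("data-post_id"),
    nonce: $('#as-syndicate_data-attr').attr("data-nonce"),
    blog_ids: blog_ids
}, function (response) {
    console.log(response);               // update the UI for all blogs at once
}, "json");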
If you can't pass the whole block to the server to be processed in bulk, you could use a jQuery queue. This is using your sample code as a base:
var $container = $('.as-network-list').first();
$container.find('input[type="checkbox"]:checked').each(function(){
var $input = $(this);
$container.queue('publish', function(next) {
var blog_id = $input.val(),
$parent = $input.parent();
console.log(blog_id+' started');
$parent.addClass('synd-in-progress'); //add visual feedback of 'in-progress'
as_process_syndication_to_blog(blog_id).done(function(response) {
console.log(response);
console.log('finished'+blog_id);
$parent.removeClass('synd-in-progress');
next();
});
});
});
$container.dequeue('publish');
function as_process_syndication_to_blog(blog_id){
var data = {
"post_id": $('#as-syndicate_data-attr').attr("data-post_id"), //these values are stored in hidden html elements
"nonce": $('#as-syndicate_data-attr').attr("data-nonce"),
"blog_id": blog_id
};
return as_syndicate_to_blog(data).done(function(){ console.log('end 2nd func'); });
}
function as_syndicate_to_blog(data){
return $.ajax({
type : "post",
dataType : "json",
url : ASpub.ajaxurl, //reference localized script to trigger wp_ajax PHP function
data : {action: "as_syndicate_post", post_id : data.post_id, nonce: data.nonce, blog_id: data.blog_id}
});
}
I don't have a test environment for this so you may need to tweak it for your use case.
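One caveat with the queue sketch above: next() only runs from .done(), so a single failed request will stall the rest of the queue. If that matters, advance the queue from .always() instead and keep the response handling in .done(); for example:
as_process_syndication_to_blog(blog_id)
    .done(function (response) {
        console.log(response);
    })
    .always(function () {
        console.log('finished ' + blog_id);
        $parent.removeClass('synd-in-progress');
        next();   // advance the queue even if the request failed
    });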
I'm auto-refreshing the content on a site using AJAX/JSON. Using jkey, I trigger some posting to the document, and during this action I want to cancel the original setTimeout that's running ("reloading", every 2 minutes) and then trigger the entire "content" function again 5 seconds later to start over. However, I can't seem to stop "reloading" properly, nor call "content" after the given delay. Can anybody spot the error?
<script>
$(function content(){
function reloading(){
$.ajax({
url: 'api.php',
data: "",
dataType: 'json',
success: function(data)
{
var id = data[0];
_id = id;
var vname = data[1];
var message = data[2];
var timestamp = data[3];
var field1 = data[4];
_field1 = field1;
var val2 = parseInt(field1, 10) ;
_val2 = val2;
$('#output').hide().html( message ).fadeIn("slow");
$('#username').hide().html( vname +":" ).fadeIn("slow");
setTimeout(reloading,120000);
}
});
}
reloading();
});
$(document).jkey('a',function() {
$.post("update.php", { "id": _id} )
$('#output').hide().html( "<i>thx" ).fadeIn("slow");
$('#username').fadeOut("fast");
$('#valg1').fadeOut("fast");
$('#valg2').fadeOut("fast");
clearTimeout(reloading);
setTimeout(content,5000);
});
</script>
clearTimeout should be passed the unique ID that is returned by setTimeout. So when setting the timer, assign the return value to a global variable:
window["reload_timer"] = setTimeout(reloading,120000);
Then have this:
clearTimeout(window["reload_timer"]);
You must save the setTimeout() id in order to later clear it with clearTimeout(). Like this:
var timeoutID = setTimeout(function(){someFunction()}, 5000);//this will set time out
//do stuff here ...
clearTimeout(timeoutID);//this will clear the timeout - you must pass the timeoutID
Besides saving the timeout ID as mentioned in the other posts, your function reloading is created inside function content, and since nothing outside content keeps a reference to it, it's unreachable from the rest of the program.
$(function content(){
function reloading(){
console.log('RELOADING');
}
reloading();
});
// Can't reach `content` or `reloading` from here
You have to do something like this:
var reloading, content, reloadingTimeoutId, contentTimeoutId;
reloading = function () {
console.log('RELOADING');
$.ajax({
// CODE
success : function (data) {
// CODE
reloadingTimeoutId = setTimeout(reloading, 120000);
}
});
};
content = function () {
reloading();
};
$(document).jkey('a',function() {
// CODE
clearTimeout(contentTimeoutId);
contentTimeoutId = setTimeout(content,5000);
});
It's kind of difficult to write this better without knowing the bigger picture. With this, content will be called after 5 seconds, and as long as reloading succeeds it will call itself again every 120 seconds. Please note that the reloading timer is never cleared this way.
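If the two-minute cycle should also stop while the "thx" message is showing, the jkey handler can clear the reloading timer as well; a sketch using the variables from the snippet above:
$(document).jkey('a', function () {
    $.post("update.php", { "id": _id });
    // ...fade the elements in and out as before...
    clearTimeout(reloadingTimeoutId);             // stop the 2-minute reload cycle
    clearTimeout(contentTimeoutId);
    contentTimeoutId = setTimeout(content, 5000); // start everything over in 5 seconds
});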
On my portfolio website, I am using a jQuery .ajax() call to pull in my portfolio pieces via XML.
My issue is that after a fresh page load, if the "portfolio" link is clicked first, then the portfolio pieces are pulled in normally. If, after a fresh page load, the "portfolio" link is clicked after any of the other links, then the portfolio pieces are pulled in twice.
You can see the issue for yourself on my site: Transhuman Creative
Here is the code that figures out which navigation link is clicked based on its rel attribute:
$("#nav a").click( function () {
if($(this).attr("rel") == "blog") {
return false;
}else{
$("#nav a").removeClass("selected");
$(this).addClass("selected");
setBlock($(this).attr("rel"));
}
});
After a link is clicked, it is processed by the setBlock() function, which hides existing content and calls the processBlock() function to load content.
function setBlock(block) {
if(firstNav) {
processBlock(block);
firstNav = false;
}
else
{
if($(".tab").length > 0 && $(".tab").is(":hidden") == false) {
$(".hidable").fadeOut();
$(".tab").fadeOut(function(){
processBlock(block);
});
}
else {
$(".hidable").fadeOut(function (){
processBlock(block);
});
}
}
}
The processBlock() function waits 500ms to let the animation finish, then either shows the block of content or calls the loadItems() function to load the portfolio data.
function processBlock(block) {
var s = setInterval( function () {
if (block == "portfolio") {
loadItems();
}else{
$("." + block).fadeIn();
}
clearInterval(s);
}, 500);
}
And finally, the .ajax() call is in the loadItems() function. After loading the portfolio data from the XML file, it calls the tabFade() function to parse the data and generate the HTML for the portfolio pieces. The variable firstCall is initially set to true, and it is meant to prevent the portfolio data from being reloaded if it's already in memory:
function loadItems() {
if (firstCall) {
$.ajax({
type: "GET",
url: "data/portfolio.xml?ver=1.11",
cache: false,
dataType: "xml",
success: function(xml){
$(xml).find('item').each(function(){
$("#main").append(addItem($(this)));
});
tabFade();
firstCall = false;
}
});
}else{
tabFade();
}
}
Any thoughts on what might be causing the double load issue? Thanks for your help.
I believe it would be better to set the firstCall variable right inside the if condition. Otherwise it isn't set until the AJAX request completes, 500+ milliseconds later, and a second pass in the meantime can start another request.
function loadItems() {
if (firstCall) {
firstCall = false; // Put the assignment here before waiting.
$.ajax({
type: "GET",
url: "data/portfolio.xml?ver=1.11",
cache: false,
dataType: "xml",
success: function(xml){
$(xml).find('item').each(function(){
$("#main").append(addItem($(this)));
});
tabFade();
//firstCall = false;
}
});
}else{
tabFade();
}
}
Try using setTimeout instead of setInterval. You probably want setTimeout anyway, as I don't think you want to run the code more than once.
It could be that the interval is running that code twice and making two AJAX calls because the first call hasn't responded within 500ms.
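A processBlock along those lines, using setTimeout so the 500ms delay fires exactly once per call, might look like:
function processBlock(block) {
    setTimeout(function () {
        if (block == "portfolio") {
            loadItems();
        } else {
            $("." + block).fadeIn();
        }
    }, 500);
}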