AJAX page fetch ignores javascript on page - javascript

I am calling an AJAX script to fetch the contents of a page and then replace the current contents with the fetched ones. The problem I am facing is that there is JavaScript on the fetched page which doesn't get executed and is treated as plain text. How do I make sure that the JavaScript gets loaded?

Write the fetched page's JavaScript code in the main page instead. For example:
$.ajax({
    url: "your fetched page url",
    dataType: "html",
    type: "POST",
    data: "your data",
    success: function (data) {
        $("#result").html(data); // callback div
        // put your JavaScript code from the fetched page here
    }
});
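If duplicating the script in the main page is impractical, one hedged sketch of another option is to pull the inline script blocks out of the fetched HTML and evaluate them yourself. The regex below only handles simple inline scripts (not src= attributes or edge cases), and extractInlineScripts is a made-up helper name:

```javascript
// Collect the bodies of inline <script> tags from an HTML string.
// Naive regex approach: good enough for simple fetched fragments only.
function extractInlineScripts(html) {
  const scripts = [];
  const re = /<script\b[^>]*>([\s\S]*?)<\/script>/gi;
  let m;
  while ((m = re.exec(html)) !== null) {
    scripts.push(m[1]); // the code between the tags
  }
  return scripts;
}

// In the success callback it would be used like:
// $("#result").html(data);
// extractInlineScripts(data).forEach(src => $.globalEval(src));
```

jQuery's $.globalEval runs the code in the global scope, which matches what an ordinary script tag would have done.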

Related

Passing Data from ajax in js file to php file and print php

I am trying to send data from one PHP page to another through JavaScript with AJAX, then get the new page to open with the data that was sent so that the user can print it. Here is the JS snippet; processDateP and sourceID2 are entered by the user.
$("#PrintShelf").click(function () {
    $.ajax({
        type: "POST",
        url: "print2.php",
        dataType: "json",
        data: { processDate: $('#processDateP').val(), sourceID: $('#sourceID2').val() },
        success: function (data) {
            if ($('#sourceID2').val() != null) { // was '#sourceID2' != null, which is always true
                window.open('print2.php');
            }
        }
    });
});
I want this to open the PHP file print2.php, but it either opens the file with no data if I move window.open outside of the success function, or it sends the data and the page populates correctly (I can see it using Firebug) but does not open the window. I have run the print2.php file alone with dummy values and it works and prints.
Thanks in advance
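One hedged sketch of a workaround: window.open issues a fresh GET request, so the values POSTed via AJAX never reach the newly opened window. Passing them in the URL instead (and having print2.php also read $_GET) avoids the second, empty request entirely; printUrl is a made-up helper name:

```javascript
// Build a GET URL carrying the same fields that were POSTed,
// so the popup window receives the data directly.
function printUrl(base, params) {
  const qs = Object.entries(params)
    .map(([k, v]) => `${encodeURIComponent(k)}=${encodeURIComponent(v)}`)
    .join('&');
  return `${base}?${qs}`;
}

// Usage in the click handler (instead of POSTing first, then opening):
// window.open(printUrl('print2.php', {
//   processDate: $('#processDateP').val(),
//   sourceID: $('#sourceID2').val()
// }));
```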

Nodejs jquery reload div part of page ejs

I am trying to reload only part of an HTML page using jQuery after an AJAX call, and everything I've tried gets me a 404 Not Found. Does jQuery's .load() not work with Node.js, and if so, what is a workaround to reload one div without reloading the whole page?
$.ajax({
    url: "http://localhost:3000/high",
    type: "GET",
    dataType: "jsonp", // note: jQuery's option is "dataType"; lowercase "datatype" is ignored
    success: function (data) {
        console.log(data);
        var highbtn = JSON.parse(data);
        console.log(highbtn);
        $('#high').load('/Public/userphotoview.ejs' + ' #high');
    },
    error: function () {
        console.log("error fetching data");
    }
});

table positioning with AJAX to PHP post

I have a jQuery script that sends POST data via AJAX to a PHP file that then creates a table. I have used Firebug to check the console, and everything gets created properly. In the success callback I reload the window, and the table isn't displayed.
I know the table HTML was created properly by the PHP file because I commented out the reload so I could see exactly what it created. I have lots of things displayed on my page and would like the table in a particular spot. How can I get the table to actually display where I want it to?
<script>
$(document).ready(function () {
    $("#stayinfobutton").click(function () {
        var id = $('#id').val();
        var dataString = {
            id: id
        };
        console.log(dataString);
        $.ajax({
            type: "POST",
            url: "classes/table_auto_guests.php",
            data: dataString,
            cache: false,
            /*success: function (html) {
                window.location.reload(true);
            }*/
        });
    });
});
</script>
The window.location call reloads the page in the browser window, which loses the data the server returned to the AJAX call.
Usually the response to an AJAX call is loaded directly into your page, something like:
success: function (html) {
    $('#guestsTable').html(html);
    $('#userForm').hide();
}
I made up the guestsTable div and userForm names, ;).
The return data may need some coercing to turn it into HTML (I'm assuming you're using jQuery); hopefully the PHP script sets the Content-Type header to text/html. You can also pass dataType: 'html' in the $.ajax({...}) call.
Alternatively, if classes/table_auto_guests.php returns a full page which you want to load in the browser, AJAX may not be what you are looking for. In that case you can submit the data as a regular form and load the result as a new page.
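A hedged sketch of that form-submit alternative: build a real POST form and navigate with it, so the browser loads the returned table page directly. The field name id is taken from the question; buildHiddenForm is a made-up helper:

```javascript
// Build the markup for a hidden POST form carrying the given fields.
// In the browser you would append it to the body and call submit().
function buildHiddenForm(action, fields) {
  const inputs = Object.entries(fields)
    .map(([k, v]) => `<input type="hidden" name="${k}" value="${v}">`)
    .join('');
  return `<form method="POST" action="${action}">${inputs}</form>`;
}

// In the click handler, something like:
// $(buildHiddenForm('classes/table_auto_guests.php', { id: $('#id').val() }))
//   .appendTo('body')[0].submit();
```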

Dynamic webpage retrieves data from database slowly

I'm making a dynamic webpage which retrieves lots of data from a database very frequently, like at least every 3 seconds.
I tested my webpage and database locally using XAMPP, and it works perfectly. However, it becomes very slow after I upload everything to 000webhost (my free account). My webpage even freezes when retrieving the data (I cannot scroll the page or do anything but wait for the data to be transferred).
I used a setTimeout function which calls several AJAX commands to read data from my database. I have already optimised the amount of data, but the page still freezes. I also tried disabling all of the AJAX commands but one; the freeze then lasts only a blink, but it still freezes.
Most of my AJAX commands are like the one below: it simply retrieves data from my database and updates the related field on my webpage. Some AJAX commands use $.parseJSON() because I need a whole row from a table.
$.ajax({
    type: "GET",
    url: "get_balance.php",
    data: {wherematch: localStorage.login_user},
    dataType: "html", // expect html to be returned
    async: false,
    success: function (response) {
        document.getElementById('balance').innerHTML = response;
    }
});
Can anyone provide some suggestions how to solve this issue? Should I pay and get a better account?
Thanks.
To have AJAX refresh every 3 seconds, your JavaScript should look like this:
function get_data() {
    $.ajax({
        type: "GET",
        url: "get_balance.php",
        data: {wherematch: localStorage.login_user},
        dataType: "html", // expect html to be returned
        success: function (response) {
            document.getElementById('balance').innerHTML = response;
            setTimeout(get_data, 3000); // pass the function itself, don't call it
        }
    });
}
get_data();
Put the setTimeout() call inside the success callback, so the next request is only scheduled after the previous one finishes. The page will no longer freeze because async is not set to false; async: false blocks the browser's UI thread until the request completes.
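As an aside, a subtle pitfall in polling code like this is writing setTimeout(get_data(), 3000), which calls the function immediately and schedules its (undefined) return value, instead of setTimeout(get_data, 3000). A minimal sketch with a fake scheduler makes the difference visible without waiting on real timers (fakeSetTimeout, poll, and queue are made-up names):

```javascript
// Fake scheduler: just records the callbacks it is given.
const queue = [];
const fakeSetTimeout = (cb) => queue.push(cb);

let runs = 0;
const poll = () => { runs += 1; };

fakeSetTimeout(poll);        // correct: the function itself is queued
// fakeSetTimeout(poll()) would call poll() right now and queue `undefined`,
// so the "scheduled" callback would not be callable and the poll fires early

queue.forEach((cb) => cb()); // drain the queue: poll runs exactly once
```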

Redirect in the middle of an ajax request

I have a site where users can publish links. Users fill a form with 2 fields:
Title
URL
When the user clicks "submit" I have a crawler that looks for an image of the link provided and makes a thumbnail.
The problem is that the crawler usually takes about 5-10 seconds to finish loading and cropping the thumb.
I thought I could do an AJAX call like the one below. As you can see, when the user submits a link, we first check whether it is valid (first AJAX call); then, if successful, we do another AJAX call to find and save an image for the link.
My idea was to do that while moving the user to the links.php page; however, if I do it like this, the AJAX call breaks and the function in save_image.php doesn't run.
What can I do to avoid making my users wait for the save_image.php process? I need this process to run, but I don't need any data returned.
$.ajax({
    url: 'publish/submit_link.php',
    type: 'POST',
    dataType: 'JSON',
    data: {
        link: link,
        title: title
    },
    success: function (data) {
        if (data) {
            $.ajax({
                url: 'publish/save_image.php',
                type: 'POST',
                data: {
                    id: data.id,
                    type: data.type,
                    url: url,
                    csrf_test_name: csrf
                }
            });
        }
        // THIS NEXT LINE BREAKS THE SECOND AJAX CALL
        window.location = 'links.php';
    }
});
Thanks in advance!
SUMMING UP: I want the user to submit a link and redirect the user to the links page while the thumbnail for that link is being generated. I don't want to show the thumbnail to the user.
The AJAX request fails because when you navigate away, the request is aborted, and the execution of save_image.php is interrupted along with it.
You can use PHP's ignore_user_abort to force the PHP process to continue in the background. Put it at the top of save_image.php:
<?php
ignore_user_abort(true);
// ... save image, etc.
?>
For this to work, you have to send (and flush) some output to the client:
PHP will not detect that the user has aborted the connection until an attempt is made to send information to the client. Simply using an echo statement does not guarantee that information is sent, see flush().
Any output should work (e.g. "OK"). This might be a bit of a challenge considering you're using a framework, but it shouldn't be impossible. This might work: Flushing with CodeIgniter
You can read more about PHP connection handling here.
Alternatively, force the user to fill in the URL first and then the title; when the user moves to the title field, start crawling the data. By the time they finish the title and press submit, you will have gained some time, making the process appear faster.
Why use XHR at all if you don't need the data returned? Just let your form submit the link to links.php and let it save the image there!
To understand the problem, we need to understand how JavaScript works.
Your code is as follows:
$.ajax({
    url: 'publish/submit_link.php',
    type: 'POST',
    dataType: 'JSON',
    data: {
        link: link,
        title: title
    },
    success: function (data) {
        if (data) {
            $.ajax({
                url: 'publish/save_image.php',
                type: 'POST',
                data: {
                    id: data.id,
                    type: data.type,
                    url: url,
                    csrf_test_name: csrf
                }
            });
        }
        // THIS NEXT LINE BREAKS THE SECOND AJAX CALL
        window.location = 'links.php';
    }
});
From the above: as soon as the inner AJAX request is made, JavaScript continues to the next line without waiting for the response, so window.location runs before save_image.php has finished.
Take the following example:
$.ajax({
    url: 'publish/submit_link.php',
    type: 'POST',
    dataType: 'JSON',
    data: {
        link: link,
        title: title
    },
    success: function (data) {
        console.log(data);
    }
});
for (var i = 0; i < 15000000; i++) {
    console.log(i);
}
You may see output like the following:
1
2
3
.
.
.
1000
data//response of ajax
.
.
14999999
So to avoid that, put the dependent code inside the AJAX success callback, or chain the requests with jQuery.when().
Hopefully this helps.
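A hedged sketch of that approach: redirect only after the second request settles, by chaining on it instead of redirecting unconditionally. submitLink, saveImage, and redirect are stand-ins for the two $.ajax calls (which return promise-like objects in jQuery 1.5+) and the window.location assignment:

```javascript
// Chain: submit the link, then save the image, then (and only then) redirect.
function submitThenSave(submitLink, saveImage, redirect) {
  return submitLink()
    .then((data) => (data ? saveImage(data) : null))
    .then(() => redirect())   // navigate once save_image.php has finished
    .catch(() => redirect()); // still navigate if saving fails
}

// With jQuery this would look like:
// submitThenSave(
//   () => $.ajax({ url: 'publish/submit_link.php', type: 'POST', ... }),
//   (data) => $.ajax({ url: 'publish/save_image.php', type: 'POST',
//                      data: { id: data.id, type: data.type } }),
//   () => { window.location = 'links.php'; }
// );
```

The trade-off is that the user waits for the thumbnail to finish, which the ignore_user_abort suggestion above avoids.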
