Ok, so this is my first post, so I'll try not to sound too noobish here.....
I am working on a project on my corporate site and am having issues with some video. I am grabbing some video through an AJAX call and placing it into a YUI panel to create my own video lightbox. Everything is working fine in all browsers except, of course, IE (8 specifically, since we just gave up supporting 7). I can get the panel to open and display the Flash player, but it won't load the .flv or the player controls. Like I said, it's fine in all other browsers. Here is the main script I am working with:
/**
* Function to lazy load, then show the video panel with the content of the link passed in inside the panel
*/
var showVideoPanel = function(e, linkEl){
    Event.preventDefault(e);
    if(!YAHOO.env.getVersion("videoPanel")) {
        var successHandler = function() {
            videoPanel = new COUNTRY.widget.VideoPanel("videoPanel", " ");
            showVideoPanel(e, linkEl);
        };
        //this is not likely to go off (404 is not considered an error)
        var failureHandler = function() {
            window.location = linkEl.href;
            return;
        };
        COUNTRY.loadComponent("videoPanel", successHandler, failureHandler);
    }
    else {
        COUNTRY.util.Ajax.getRemoteContent('GET', linkEl.href, videoPanel.body, {
            success: function(o){
                var start, end, el;
                el = Dom.get(videoPanel.body);
                start = o.responseText.indexOf('<object');
                end = o.responseText.indexOf('</object>', start);
                el.innerHTML = o.responseText.substring(start, end);
            },
            failure: function(o){
                var el = Dom.get(videoPanel.body);
                el.innerHTML = "The requested content is currently unavailable. Please try again later.";
            }
        });
        var bod = Dom.get(videoPanel.body);
        COUNTRY.util.Flash.flashControl(bod.getElementsByTagName("FORM")[0]);
        videoPanel.show(linkEl);
    }
};
This part of the code looks like it probably doesn't do what you intended:
start = o.responseText.indexOf('<object');
end = o.responseText.indexOf('</object>', start);
el.innerHTML = o.responseText.substring(start, end);
This code will include the opening <object> tag, but not the closing </object> tag. I suspect you're trying to either get both tags or neither tag.
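If the goal is to include both tags, a minimal sketch (keeping the rest of the success handler from the question unchanged) would push the end index past the closing tag:

    // Include the closing </object> tag as well by advancing the end index past it.
    start = o.responseText.indexOf('<object');
    end = o.responseText.indexOf('</object>', start) + '</object>'.length;
    el.innerHTML = o.responseText.substring(start, end);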
I solved this originally by dumping the data from the AJAX call into an object tag, which solved the problem and worked perfectly cross-browser.
We have since changed how we serve media, so I ended up using Vimeo to host the videos, which was very simple and works wonderfully.
I have encountered a problem with my JavaScript/jQuery code.
I want to write a piece of code that fulfills the following requirements:
1. Make the browser save my remote binary file, let's say http://192.168.0.100/system/diagdata
2. Since preparing the file on the server side takes some time (usually around 40s), I need a callback to let me know when the data is ready to download (the file itself is very small, so let's ignore the actual transfer time), so that I can display some kind of loading page to tell the user the download is in progress.
At first, I wrote a piece of code like this, without a callback:
var elemIF = document.createElement("iframe");
elemIF.src = 'http://192.168.0.100/system/diagdata';
elemIF.style.display = "none";
document.body.appendChild(elemIF);
It works well (but without a callback).
Then, in order to make a callback possible, I added some code like this:
var deferred = jQuery.Deferred();
var elemIF = document.createElement("iframe");
elemIF.src = 'http://192.168.0.100/system/diagdata';
elemIF.style.display = "none";
document.body.appendChild(elemIF);
elemIF.defer = 'defer';
if (window.ActiveXObject) { // IE
    elemIF.onreadystatechange = function() {
        if (this.readyState == 'loaded'
            || this.readyState == 'complete') {
        }
    };
}
else { // Chrome, Safari, Firefox
    elemIF.onload = function() {
        alert("onload");
    };
    elemIF.onerror = function(e) {
        alert("onerror");
    };
}
deferred.promise();
After I run this piece of code, the "onload" handler is called, but the browser does not save the file "diagdata"; instead it tries to load it and reports a parsing error.
Does anyone have an alternative solution that not only makes the browser save the binary file, but also provides a callback to report when the data is ready?
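One possible approach (a sketch, not from the original post; it assumes the browser supports Blob responses and the <a download> attribute, so it won't cover old IE): fetch the file with XMLHttpRequest as a Blob, which gives you onload/onerror callbacks, then hand the Blob to the browser via a temporary object URL and an anchor with the download attribute.

    function downloadWithCallback(url, filename, onReady, onError) {
        // Fetch the binary with XHR so we get load/error callbacks,
        // then trigger a save via a temporary object URL.
        var xhr = new XMLHttpRequest();
        xhr.open('GET', url, true);
        xhr.responseType = 'blob'; // receive the binary data as a Blob

        xhr.onload = function () {
            if (xhr.status === 200) {
                onReady(); // data is ready; hide the loading page here
                var blobUrl = URL.createObjectURL(xhr.response);
                var a = document.createElement('a');
                a.href = blobUrl;
                a.download = filename; // ask the browser to save rather than render
                document.body.appendChild(a);
                a.click();
                document.body.removeChild(a);
                setTimeout(function () { URL.revokeObjectURL(blobUrl); }, 1000);
            } else {
                onError(xhr.status);
            }
        };
        xhr.onerror = function () { onError(0); };
        xhr.send();
    }

    // Usage (URL and filename taken from the question):
    // downloadWithCallback('http://192.168.0.100/system/diagdata', 'diagdata',
    //     function () { /* hide the loading page */ },
    //     function (status) { /* show an error */ });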
I've been debugging this for a long time and it has me completely baffled. I need to save ads to my computer for a work project. Here is an example ad that I got from CNN.com:
http://ads.cnn.com/html.ng/site=cnn&cnn_pagetype=main&cnn_position=300x250_rgt&cnn_rollup=homepage&page.allowcompete=no&params.styles=fs&Params.User.UserID=5372450203c5be0a3c695e599b05d821&transactionID=13999976982075532128681984&tile=2897967999935&domId=6f4501668a5e9d58&kxid=&kxseg=
When I visit this link in Google Chrome and Firefox, I see an ad (if the link stops working, simply go to CNN.com and grab the iframe URL for one of the ads). I developed a PhantomJS script that will save a screenshot and the HTML of any page. It works on any website, but it doesn't seem to work on these ads. The screenshot is blank and the HTML contains a tracking pixel (a 1x1 transparent gif used to track the ad). I thought that it would give me what I see in my normal browser.
The only thing that I can think of is that the AJAX calls are somehow messing up PhantomJS, so I hard-coded a delay but I got the same results.
Here is the most basic piece of test code that reproduces my problem:
var fs = require('fs');
var page = require('webpage').create();
var url = phantom.args[0];

page.open(url, function (status) {
    if (status !== 'success') {
        console.log('Unable to load the address!');
        phantom.exit();
    }
    else {
        // Output Results Immediately
        var html = page.evaluate(function () {
            return document.getElementsByTagName('html')[0].innerHTML;
        });
        fs.write("HtmlBeforeTimeout.htm", html, 'w');
        page.render('RenderBeforeTimeout.png');

        // Output Results After Delay (for AJAX)
        window.setTimeout(function () {
            var html = page.evaluate(function () {
                return document.getElementsByTagName('html')[0].innerHTML;
            });
            fs.write("HtmlAfterTimeout.htm", html, 'w');
            page.render('RenderAfterTimeout.png');
            phantom.exit();
        }, 9000); // 9 Second Delay
    }
});
You can run this code using this command in your terminal:
phantomjs getHtml.js 'http://www.google.com/'
The above command works well. When you replace the Google URL with an ad URL (like the one at the top of this post), it gives me the unexpected results that I explained.
Thanks so much for your help! This is my first question that I've ever posted on here, because I can almost always find the answer by searching Stack Overflow. This one, however, has me completely stumped! :)
EDIT: I'm running PhantomJS 1.9.7 on Ubuntu 14.04 (Trusty Tahr)
EDIT: Okay, I've been working on it for a while now and I think it has something to do with cookies. If I clear all of my history and view the link in my browser, it also comes up blank. If I then refresh the page, it displays fine. It also displays fine if I open it in a new tab. The only time it doesn't is when I try to view it directly after clearing my cookies.
EDIT: I've tried loading the link twice in PhantomJS without exiting (manually requesting it twice in my script before calling phantom.exit()). It doesn't work. In the PhantomJS documentation it says that the cookie jar is enabled by default. Any ideas? :)
You should try using the onLoadFinished callback instead of checking for status in page.open. Something like this should work:
var fs = require('fs');
var page = require('webpage').create();
var url = phantom.args[0];

page.open(url);

page.onLoadFinished = function()
{
    // Output Results Immediately
    var html = page.evaluate(function () {
        return document.getElementsByTagName('html')[0].innerHTML;
    });
    fs.write("HtmlBeforeTimeout.htm", html, 'w');
    page.render('RenderBeforeTimeout.png');

    // Output Results After Delay (for AJAX)
    window.setTimeout(function () {
        var html = page.evaluate(function () {
            return document.getElementsByTagName('html')[0].innerHTML;
        });
        fs.write("HtmlAfterTimeout.htm", html, 'w');
        page.render('RenderAfterTimeout.png');
        phantom.exit();
    }, 9000); // 9 Second Delay
};
I have an answer here that loops through all files in a local folder and saves images of the resulting pages: Using Phantom JS to convert all HTML files in a folder to PNG
The same principle applies to remote HTML pages.
Here is what I have from the output:
Before Timeout:
http://i.stack.imgur.com/GmsH9.jpg
After Timeout:
http://i.stack.imgur.com/mo6Ax.jpg
I have pixel tracking on a webpage. It has been working fine for a long time, but yesterday I found that it is not working well on the mobile page m.url.com, as opposed to www.url.com.
The mobile page only reloads the header the first time you load the page, so I changed the tracking to be called on a live event.
That is working, and it calls the JS where I calculate the browser, user_agent, etc. to pass as a query string in the pixel.
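For context, the delegated ("live") binding described above might look something like this, assuming jQuery is in use (the selector and the query string are placeholders, not from the original post):

    // Hypothetical delegated binding; '.tracked-link' and the query string are placeholders.
    $(document).on('click', '.tracked-link', function () {
        getImage('user_agent=' + encodeURIComponent(navigator.userAgent));
    });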
This is the function on my Tracking JS class:
/**
* Web beacon with query string
*/
function getImage(query_string) {
    var image = new Image(1, 1);
    image.onload = function () {
        console.log(image);
    };
    image.onerror = function () { // note: 'onerror', not 'error'
        console.log('error');
    };
    image.onabort = function () {
        console.log('abort');
    };
    image.src = 'http://<myurl>/picture.gif?' + query_string;
}
When I call this function on m.url.com (the mobile site), the pixel works the first time the page loads. After that, as I navigate, the pixel only fires in Chrome; in other browsers like Firefox and Safari it doesn't, even though everything appears to work and setting src calls back the .onload function with the URL of the image, no errors, no abort...
Maybe src only works once in the same class? I don't know.
Thanks in advance for the help.
It seems that the image is cached once and not requested again. Just add a timestamp to the image URL:
image.src = 'http://<myurl>/picture.gif?_=' + (new Date()).getTime() + '&' + query_string;
I need to export data to an Excel file. I am using a JSON post for this. It's working fine in every browser except IE. Have a look at my JavaScript code as follows:
function ExportQueryData() {
    var Qry = $("#txtQueryInput").val();
    if ($.trim(Qry) == "") {
        $("#txtQueryInput").addClass("error");
        return false;
    }
    else
        $("#txtQueryInput").removeClass("error");

    var url = "/Reports/ExportQueryData";
    var frmserialize = $("#frmQuery").serialize();
    $.post(url, frmserialize, function(data) {
        data = eval("(" + data + ")");
        if (data.Success) {
            url = "/Reports/Export";
            var win = window.open(url, "DownloadWin", "resizable=0,status=0,toolbar=0,width=600px,height=300px");
            win.focus();
            win.moveTo(100, 100);
            return false;
        }
        else {
            Notify("DB Query", data.Message);
        }
    });
}
As per the above code, I am calling the /Reports/Export action using window.open. This popup opens in every browser except IE. In IE the popup closes on its own within 1 or 2 seconds.
I need to use a JSON post because it validates the input and then returns a success status. I can only validate my data on the server side.
Let me know if any information is missing here.
Your suggestion will be valuable for me.
Thanks.
I think you've got a pop-up blocker situation... Frankly, I don't think opening stuff up in a pop-up is a good idea anymore. Browsers seem to hate it.
You could instead show the user a link to the file, asking them to click it to download.
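A rough sketch of that idea, assuming /Reports/Export serves the file for download; the #downloadArea element is hypothetical and not from the original post. The link would replace the window.open call in the success handler:

    if (data.Success) {
        // Let the user trigger the download themselves; a user-initiated click
        // is not blocked the way a scripted window.open can be.
        // The #downloadArea element is hypothetical.
        $("#downloadArea").html(
            '<a href="/Reports/Export" target="_blank">Click here to download your Excel file</a>'
        );
    }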
Out of curiosity, I'm wondering about the best (easiest, fastest, shortest, etc; make your pick) way to perform a GET request in JavaScript without using AJAX or any external libraries.
It must work cross-browser, and it's not allowed to distort the hosting web page visually or affect its functionality in any way.
I don't care about the headers in the request, just the URL part. I also don't care about the result of the request. I just want the server to do something as a side effect when it receives this request, so firing it is all that matters. If your solution requires the server to return something in particular, that's OK as well.
I'll post my own suggestion as a possible answer, but I would love it if someone could find a better way!
Have you tried using an Image object? Something like:
var req = new Image();
req.onload = function() {
    // Probably not required if you're only interested in
    // making the request and don't need a callback function
};
req.src = 'http://example.com/foo/bar';
function GET(url) {
    var head = document.getElementsByTagName('head')[0];
    var n = document.createElement('script');
    n.src = url;
    n.type = 'text/javascript';
    n.onload = function() { // this is not really mandatory, but removes the tag when finished.
        head.removeChild(n);
    };
    head.appendChild(n);
}
I would go with Pekka's idea and use a hidden iframe; the advantage is that no further parsing will be done. For an image, the browser will try to parse the result as an image, and for a dynamically created script tag the browser will try to parse the result as JavaScript code. An iframe is "hit and run": the browser doesn't care what's in there.
Changing your own solution a bit:
function GET(url) {
    var oFrame = document.getElementById("MyAjaxFrame");
    if (!oFrame) {
        oFrame = document.createElement("iframe");
        oFrame.style.display = "none";
        oFrame.id = "MyAjaxFrame";
        document.body.appendChild(oFrame);
    }
    oFrame.src = url;
}