Using pushState to load specific pages on my AJAX website - javascript

I am installing a social network sharing plugin on my website and I notice that I can only share one link: the main index link. How would I go about using pushState to change the URL in my browser so that each specific page on my website can be shared? Because of my AJAX code I only have one URL, and that's https://trillumonopoly.com. I'm not familiar with pushState, but I've read that it is what I'm looking for.
For example, say I want to share the music page on my website: how do I make the shared link take people to my index page with that specific page loaded into the div?
Here's my jQuery / AJAX code:
$(document).ready(function () {
    loadMainContent('main');
    $('body').delegate('.navMenu', 'click', function (event) {
        event.preventDefault();
        loadMainContent($(this).attr('href'));
    });
});
function loadMainContent(page) {
    $('#main').load('pages/' + page + '.php');
}

If you have a URL like yoursite/someroute you'll get a "not found" error, so you'll need to tell Apache to serve index.php for those routes using a rewrite rule.
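A minimal .htaccess sketch of such a rewrite (assuming mod_rewrite is enabled; adjust to your own server layout):
RewriteEngine On
# serve real files and directories as-is
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# everything else falls through to index.php
RewriteRule ^ index.php [L]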
In document.ready you need to check the URL (document.location.pathname) and load the content according to that path.
The index page should always show a loading spinner that is then replaced with the actual content based on the path name in the URL.
Here is some simple sample code of how you could do this:
//if path is /contact then return object with the url to php content page
// and title of the page
const pathToPage = path => {
    switch (path) {
        case "/contact":
            return {
                url: "/contact.php",
                title: "Contact",
                path: path
            };
    }
};
//when user navigates back and forward
window.addEventListener(
    "popstate",
    event =>
        //only set content, do not mess with history
        onlyLoadmain(
            pathToPage(location.pathname)
        )
);
//load main content with or without messing with history
const loadMainBuilder = (push) => (page) => {
    //if page is undefined then the path is not known content
    if (page !== undefined) {
        if (push) {
            //set the url shown in the address bar
            history.pushState(
                {},
                page.title,
                page.path
            );
        }
        document.title = page.title;
        //#todo: should show loading here
        //$('#main').html(loading);
        //page.url already contains the .php extension, so only prefix the folder
        $('#main').load(
            'pages' + page.url
        );
    }
};
const loadMainAndPush = loadMainBuilder(true);
const onlyLoadmain = loadMainBuilder(false);
//... other code
$(document).ready(function () {
    //load the content of the current page
    // when a user gets a url in their email like yoursite/contact the
    // apache rewrite rule will serve index.php but the main content
    // shows loading until your code actually replaces it with something
    onlyLoadmain(pathToPage(location.pathname));
    $('body').delegate('.navMenu', 'click', function (event) {
        event.preventDefault();
        //the clicked element should have an href with the path, not the php file path
        //so /contact, not /pages/contact.php
        loadMainAndPush(pathToPage($(this).attr('href')));
    });
});
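For instance, to make the music page from the question shareable, pathToPage just needs another case (the /music path, the pages/music.php file and the "Music" title are assumptions based on the question's loadMainContent code):
const pathToPage = path => {
    switch (path) {
        case "/contact":
            return { url: "/contact.php", title: "Contact", path: path };
        case "/music":
            return { url: "/music.php", title: "Music", path: path };
    }
};
Give the corresponding nav link href="/music" so clicks, shared links and the back button all resolve to the same path.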

Related

Inject javascript through URL

I have an HTML page that opens a webpage such as:
https://mywebsite.com/index.html&audio=disabled
I have a javascript function that triggers a button in the webpage:
document.querySelector('iframe').contentWindow.document.querySelector('.Pan-Button').click();
I want to trigger this via the URL. Since I am already disabling the audio via the URL, is it possible to trigger the button as well?
I'm just looking for an alternative way to trigger it without calling the JavaScript function in code.
Adding a script to a URL that is then executed in the other page is called cross-site scripting (XSS), and it's bad.
Instead (I assume from your code that you can edit the target page), check for a query parameter:
const url = new URL(document.location);
const pan = url.searchParams.get("pan");
window.addEventListener("DOMContentLoaded", () => {
    if (pan === "yes") {
        document.querySelector('iframe').contentWindow.document.querySelector('.Pan-Button').click();
    }
});
and use https://mywebsite.com/index.html?audio=disabled&pan=yes

History replaceState problem in AJAXified Wordpress Theme

I made an Ajax-enabled Wordpress theme whose main feature is that internal links don't reload the whole page, only the new content. So when the URL's hash changes, the new content is put into the #primary section and the map in the background stays untouched:
var $mainContent = $("#primary");
$(window).bind('hashchange', function () {
    var url = window.location.hash.substring(1);
    url = url + " #content";
    $mainContent.animate({opacity: "0.1"}).html('<p>Please wait...</p>').load(url, function () {
        $mainContent.animate({opacity: "1"});
    });
});
Everything works fine, including history back and forward navigation. You can check out the basic functionality here.
Now I want nice URLs that hide the hashes, so that for example a link like http://geraldkogler.com/places/#/places/place/stwst/ gets changed to http://geraldkogler.com/places/place/stwst/. I do this by adding the following code at line 47 of ajax.js:
var oPageInfo = {
    title: "places",
    url: window.location.origin + window.location.hash.substring(1)
};
window.history.replaceState(oPageInfo, oPageInfo.title, oPageInfo.url);
Now the URL gets rewritten - but history doesn't work any more.
So I think I should listen to popstate events. I tried the following, and back works once, but not more than that:
window.onpopstate = function (event) {
    if (event.state) {
        var host = "http://" + location.hostname;
        location.hash = event.state.url.substring(host.length);
    }
};
This (wrong) behaviour with the mentioned code is shown on this page. Any idea what I'm doing wrong in ajax.js?

What is the correct way to load a linked page, execute an event, and then unload it in a Chrome extension?

I have a Chrome extension that allows you to download data from certain sites.
For the past few days, I have been trying to add a feature that downloads data from those sites when they are linked from other sites, without having to visit the page and click the extension's download button, IF the link to the site ends with #ndslink.
I figured out a solution, but it is INCREDIBLY sloppy and I am looking for a better way to implement this.
Here's the behavior:
Site A links Site B with an href ending in #ndslink. The link is clicked.
The extension disables the default action (opening a new window), and instead creates an iframe on Site A and loads the linked URL into the frame.
The extension now runs Site B's content script, which looks for #ndslink in the URL and, when found, downloads the data that would normally be downloaded through the manual "Download" button the extension adds to the page.
Here is my code.
Site B's content script:
var decklist = [];
$('.col-name').each(function (i, el) {
    // IRRELEVANT CODE TRIMMED
});
var data = decklist.join("\r\n");
var saveData = (function () {
    // IRRELEVANT CODE TRIMMED
}());
$(document).ready(function () {
    var html = $('.t-deck-title').html();
    fileName = $('.t-deck-title').text() + '.txt';
    //html = html.replace(/hearthstonedeckdl/, '</br><a class="download" href="#download">DOWNLOAD</a>');
    $('.t-deck-title').html(html);
    if (window.location.href.indexOf("#ndslink") > -1) {
        saveData(data, fileName);
    }
    $(document).on('click', 'a[href="#download"]', function () {
        saveData(data, fileName);
    });
});
Script loaded on all URLs (incl Site A):
var $jg = jQuery.noConflict();
$jg(document).ready(function () {
    $jg('a[href$="#ndslink"]').click(function (e) {
        e.preventDefault();
    });
    $jg(document).on('click', 'a[href$="#ndslink"]', function () {
        //saveData(data, fileName);
        var frameurl = $jg(this).text();
        $jg('body').prepend('<iframe id="nddownload" />');
        $jg("#nddownload").attr("src", frameurl);
        //e.preventDefault();
        // Send link to background and download.
    });
});
I feel as if I am committing a horrible sin by going this route, but being new to Chrome extensions and generally inexperienced, I could not find a proper way to handle this. I've probably also broken my extension ten different ways while attempting it, so a real solution would be much appreciated.
I'm also actually unsure how to go about properly destroying the iframe once the saveData function completes.
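One way the helper frame could be torn down afterwards is a postMessage handshake between the two content scripts; a rough sketch, where the "ndslink-done" message name is made up for illustration:
// Site B's content script, right after the download is triggered:
if (window.location.href.indexOf("#ndslink") > -1) {
    saveData(data, fileName);
    window.parent.postMessage("ndslink-done", "*"); // tell the embedding page we're finished
}
// Script loaded on all URLs (incl Site A):
window.addEventListener("message", function (event) {
    if (event.data === "ndslink-done") {
        $jg("#nddownload").remove(); // drop the hidden helper iframe again
    }
});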

Embed a webpage and run a javascript file on it

I am basically trying to show end users a demonstration of how their webpage will look after they use my service (the service is a JavaScript file that runs on their webpage).
I have tried the following:
Used an iframe to embed the webpage - the problem here is that I can't access the iframe content and run my JS functions on it.
Used jQuery load(), HTML embed, and object, but the same problem persists: I am unable to run my JavaScript on the embedded webpage.
Basically, what I want to do can be seen here: http://www.luminate.com/publisher/. Just type a website URL into the Preview Luminate section and see what happens. The page loads in a new tab and they have run a JavaScript file on page load.
Can someone suggest a way to do this, or explain what these guys have done (http://www.luminate.com/publisher/)?
Here is the script they use to call the page:
<script type="text/javascript">
function pop_demo(event) {
    event.preventDefault();
    var url = $("#preview-site-url").val();
    if (!url) {
        alert("Please enter a URL to preview.");
        return false;
    }
    if (!/^(http|https):\/\//.test(url)) {
        url = "http://" + url;
    }
    url = "/publisher/demo/?url=" + encodeURIComponent(url);
    window.open(url, "");
}
</script>
Here you are calling the URL /publisher/demo/?url=..., and internally that page loads the target site in a frame and passes the result on to the next page.
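The iframe attempts fail because of the browser's same-origin policy: a page can't script a frame from another origin. A preview service like this typically proxies the target page through its own domain and injects its script server-side before serving it. A minimal sketch of that idea in Node/Express (the route, the demo-widget.js script name and the Express choice are illustrative assumptions, not Luminate's actual implementation; requires Node 18+ for the built-in fetch):
// server.js - proxy the target page through our own origin and inject the demo script
const express = require('express');
const app = express();

app.get('/publisher/demo/', async (req, res) => {
    const target = req.query.url;                     // URL the visitor typed in
    const html = await (await fetch(target)).text();  // fetch the page server-side
    // inject the demo script just before </body> so it runs on the proxied copy
    res.send(html.replace('</body>', '<script src="/demo-widget.js"></script></body>'));
});

app.listen(3000);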

Auto-download behavior and back button issue

Instead of linking directly to files for download on our website, we link to a page that says "thank you for downloading". The page has tracking codes on it so we know how many people have downloaded the file. The page launches the file download using the jQuery code shown below, which adds a short delay after the page loads before the download begins. The download location has a Content-Disposition header, so the file always downloads properly in the browser, leaving the "thank you for downloading" page visible. This all works well.
The problem comes if the user carries on browsing past this page and then hits back. The download fires again.
Using window.location.replace(href); didn't seem to fix it.
The issue is further complicated by the fact that the CMS delivering the page has set it to expire immediately so it's not being cached.
Suggestions for (i) ways to avoid this problem; (ii) any better ways to handle file download / thank you pages?
jQuery Code
$(document).ready(function () {
    $('a.autoDownload').each(function () {
        setTimeout('navigateToDownload("' + $(this).attr('href') + '")', 4000);
    });
});
function navigateToDownload(href) {
    document.location.href = href;
}
One possible approach would be to set a cookie via JavaScript when the page first loads. Then, if that page is ever loaded again, you can check for the presence of the cookie and, if it's present, skip the auto download.
Using the Cookie plugin for jQuery as an example:
$(document).ready(function () {
    $('a.autoDownload').each(function () {
        var hasDownloadedThisLink = $.cookie("site." + $(this).attr('id'));
        if (!hasDownloadedThisLink) {
            $.cookie("site." + $(this).attr('id'), "true");
            setTimeout('navigateToDownload("' + $(this).attr('href') + '")', 4000);
        }
    });
});
This is just an example. If you went this way, you'd have to consider how many possible download links there might be, as there is a limit on how many cookies you can set. Also notice that I used the links' id attribute to identify them in the cookie - I figured this would be more suitable than using some form of the href attribute. I also prefixed the cookie name with site..
Well there are a couple of solutions to this. Here's an example:
function navigateToDownload(href) {
    var e = document.createElement("iframe");
    e.src = href;
    e.style.display = 'none';
    document.body.appendChild(e);
}
Other implementations might exist, but I doubt they're less "hacky".
