My objective is simple. I'm kind of surprised that I can't find an answer to this in other posts; I apologize in advance if I missed it.
I have a textarea input. I want to make sure users don't post links to external sites.
I prefer to do this client side and simply prevent submission if an external link is present.
I have seen numerous posts on how to find URLs and turn them into hrefs. I suppose I could use the same logic to simply remove them, but that would not stop the user from proceeding; what I want is to specifically block submission if external links are present.
I also need to allow links within the same domain, so the actual URL of each detected link must be compared as well. I've seen ways to do this once I have the URL in a string on its own, but here it is somewhere in the middle of a paragraph of text.
<script type="text/javascript">
function validate() {
    var noExternalURLs = true;
    var currentDomain = window.location.protocol + '//' + window.location.host;
    var re = new RegExp('^' + currentDomain, 'i');
    // Get protocol & domain of URLs in the text (match() returns null when there are none)
    var links = document.getElementById('mytextbox').value.match(/https?:\/\/[\w.-]+/gi) || [];
    // Loop over links. If a URL's protocol & domain don't match the current protocol & domain, flag it
    for (var i = 0; i < links.length; i++) {
        if (links[i].match(re) == null) {
            noExternalURLs = false;
        }
    }
    return noExternalURLs; // prevents submit when noExternalURLs == false
}
</script>
<form id="myform" action="index.php" onsubmit="return validate();">
<textarea id="mytextbox"></textarea>
<input type="submit" />
</form>
Do not do this client side only. You must check server side too. Client-side checks are only for improving the user experience; any business logic MUST be handled on the server side.
It is trivial to bypass a client-only check by submitting the data directly as a POST or GET request:
http://url.com?textareainput=http://urltoverybadsite
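For example, the onsubmit handler can be skipped entirely from the browser's developer console (a rough sketch; the index.php action and the textareainput field name are assumed from the markup elsewhere in this thread):
// Run from the console: this sends the field straight to the server,
// so the client-side validation never executes.
fetch('index.php', {
    method: 'POST',
    body: new URLSearchParams({ textareainput: 'http://urltoverybadsite' })
});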
You can make it nicer for your users by doing a quick regex:
<script>
function checkLinks()
{
    // match() needs a real regex here; a quoted "/http:\/\//" is treated as a literal string
    var links = document.getElementById('textareainput').value.match(/https?:\/\//);
    if (links)
    {
        window.alert("Links not allowed!");
        return false;
    }
    return true;
}
</script>
<form onsubmit="return checkLinks();"><textarea id='textareainput'></textarea></form>
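If you also need to allow links to your own domain (as the original question asks), a hedged variation is to compare each detected link's host against window.location.host before rejecting:
function checkLinks()
{
    var text = document.getElementById('textareainput').value;
    // Pull out anything that looks like an absolute http(s) URL
    var links = text.match(/https?:\/\/[\w.-]+/gi) || [];
    for (var i = 0; i < links.length; i++)
    {
        // Strip the protocol to get the host, then compare it to our own host
        var host = links[i].replace(/^https?:\/\//i, '');
        if (host.toLowerCase() !== window.location.host.toLowerCase())
        {
            window.alert("External links not allowed!");
            return false;
        }
    }
    return true;
}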
Related
Hello, my question is: what is the best approach to restrict access to some URLs of a WordPress website to a single referrer domain?
Since I am somewhat familiar with JavaScript, I found a way to do it, but I don't think a JavaScript approach is good, because the source code of the page does not change.
I wrote this code:
function getCookie(name) {
    const value = `; ${document.cookie}`;
    const parts = value.split(`; ${name}=`);
    if (parts.length === 2) return parts.pop().split(';').shift();
}

document.body.style.display = "none";
var url = document.referrer;
var domainname;
var referal_code = getCookie("protect_faq_pages");
console.log(url);
if (url) {
    var anchor = document.createElement("a");
    anchor.href = url;
    domainname = anchor.host;
    console.log(domainname);
    if (domainname == "softwareservicetech.com") {
        var cookieString = "protect_faq_pages=cWs#fgf$a1fD#FsC-)";
        document.cookie = cookieString;
    }
} else if (!(referal_code == "cWs#fgf$a1fD#FsC-)")) {
    document.getElementById("page").innerHTML = "<p>Sorry you do not have permission to view the content</p>";
}
console.log(referal_code);
document.body.style.display = "block";
The site itself can be accessed here:
https://health-unity.com/
You can see that the page below is restricted from view:
https://health-unity.com/help-centre/videos/
and so are these pages:
https://health-unity.com/help-centre/videos/video-number-2/
https://health-unity.com/help-centre/videos/video-number-1/
But after clicking the link on the site below (a link to the health-unity videos page):
https://softwareservicetech.com/testpage/
the archive page becomes accessible, and the user can then go directly to the pages below:
https://health-unity.com/help-centre/videos/video-number-2/
https://health-unity.com/help-centre/videos/video-number-1/
These were restricted before and can now be accessed because of the cookie that gets set.
The problem is that the page source still exists and is not changed by the JavaScript, so the user can view it. I also want the cookie value to be hidden. Because of these two problems I think JavaScript is not a good fit here.
Please share if there is a way to achieve this with JavaScript, PHP, or by editing functions.php or the .htaccess file.
Thank you in advance for your response.
You can use $_SERVER['HTTP_REFERER'] in functions.php
For example:
<?php
add_action('init', 'check_referrer');
function check_referrer(){
    // str_contains() requires PHP 8+; HTTP_REFERER may not be set, so check it first
    if( isset($_SERVER['HTTP_REFERER']) && str_contains($_SERVER['HTTP_REFERER'], 'https://example-domain.com/') ){
        // do something
    }else{
        // do something else
    }
}
?>
I am trying to track Facebook ad results using the Facebook Pixel during appropriate events (page views, lead generation, order form view, purchase). I can do all of this for GA using GTM with no problem, but on Facebook I only have partial success.
The main issue is I have a cross domain setup as shown below:
domain1.com/offer - landing page (FB Page View Pixel should fire)
domain1.com/ordergate - request email before showing order form page (FB Page View Pixel should fire)
crm.com/formsubmission - the actual form submits to my crm (FB Lead Pixel should fire)
crm.com/orderform - order form (FB order form view pixel should fire)
domain1.com/thankyou - the thank you page (FB order pixel should fire)
So my trigger on GTM to fire FB pixel was the "referrer" containing "facebook". However, because of the multi-step process, the referrer is lost by the time the order form or sale is completed.
I have since then learned I need to do the following (a sketch of steps 1 and 3 follows the list):
1. User lands from Facebook: write a cookie with an appropriately short expiration time that stores this information on domain1.com.
2. When the user clicks a link and is redirected to crm.com, check if the user has the cookie, and if they do, add something like ?reffacebook=true to the redirect URL.
3. On crm.com, if the URL has ?reffacebook=true, write the same cookie you wrote in (1) with an equally short expiration time.
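A minimal sketch of steps 1 and 3 (untested; the reffacebook cookie name and the 30-minute lifetime are assumptions, not values from the original setup):
// Runs on both domain1.com and crm.com pages.
(function () {
    var THIRTY_MINUTES = 30 * 60 * 1000; // assumed "appropriately short" lifetime
    var cameFromFacebook =
        document.referrer.indexOf('facebook.com') !== -1 ||        // step 1
        window.location.search.indexOf('reffacebook=true') !== -1; // step 3
    if (cameFromFacebook) {
        var expires = new Date(Date.now() + THIRTY_MINUTES).toUTCString();
        document.cookie = 'reffacebook=true; expires=' + expires + '; path=/';
    }
})();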
UPDATE
So I have figured out step 2 using the following script on page view when the Facebook cookie is set:
function updateLinks(parameter, value)
{
var links = document.getElementsByTagName('a');
var includeDomains = self.location.host;
for (var i=0;i<links.length;i++)
{
if(links[i].href != "#" && links[i].href != "/" && links[i].href != "" && links[i].href != window.location) //Ignore links with an empty href attribute, links to the site root, and anchor tags (#)
{
var updateLink = true;
if(links[i].href.toLowerCase().indexOf(includeDomains.toLowerCase()) != -1) //Domain of current link matches the current host (includeDomains), so no update is required
{
updateLink = false;
}
if(!updateLink)
{
//Do nothing - link is internal
}
else
{
var queryStringComplete = "";
var paramCount = 0;
var linkParts = links[i].href.split("?");
if(linkParts.length > 1) // Has Query String Params
{
queryStringComplete = "?";
var fullQString = linkParts[1];
var paramArray = fullQString.split("&");
var found = false;
for (var j=0; j<paramArray.length; j++)
{
var currentParameter = paramArray[j].split("=");
if(paramCount > 0)
queryStringComplete = queryStringComplete + "&";
if(currentParameter[0] == parameter) //Parameter exists in url, refresh value
{
queryStringComplete = queryStringComplete + parameter + "=" + value;
found = true;
}
else
{
queryStringComplete = queryStringComplete + paramArray[j]; //Not related parameter - re-include in url
}
paramCount++;
}
if(!found) //Add new param to end of query string
queryStringComplete = queryStringComplete + "&" + parameter + "=" + value;
}
else
{
queryStringComplete = "?" + parameter + "=" + value;
}
links[i].href = links[i].href.split("?")[0] + queryStringComplete;
}
}
else
{
//Do nothing
}
}
}
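For completeness, a hedged sketch of how this might be invoked on page view (readCookie is a hypothetical helper, and the reffacebook cookie name is an assumption):
// On page load, decorate outbound links only when the visitor originally came from Facebook.
document.addEventListener('DOMContentLoaded', function () {
    // readCookie() is assumed to return the cookie's value, or "" when it is not set.
    if (readCookie('reffacebook') === 'true') {
        updateLinks('reffacebook', 'true');
    }
});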
So with this code I can now properly attribute people with the facebook referral across domains...
...but I still have a problem with form submits.
So when the contact gets to step 4, it is a redirect from the form submission. It does not carry any cookie or query string, so neither of the FB pixels (order form view or order) is being fired.
I'm not sure how I would handle this. My first thought is to pass a hidden field into the form submission (say reffacebook=true), and then somehow expose that in the URL in the form of a query string so that it can be detected by GTM.
This seems somewhat complicated though, as I would have to edit all my forms to include this variable, edit my CRM so it knows to receive it, and then edit the form landing page to expose that variable in the URL.
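A rough sketch of the hidden-field idea (the reffacebook field name is hypothetical, and the CRM would still need to echo the value back on its redirect):
// Before submission, copy the cookie flag into every form as a hidden field
// so the CRM receives it with the rest of the submission.
document.addEventListener('DOMContentLoaded', function () {
    if (document.cookie.indexOf('reffacebook=true') === -1) return;
    var forms = document.querySelectorAll('form');
    for (var i = 0; i < forms.length; i++) {
        var input = document.createElement('input');
        input.type = 'hidden';
        input.name = 'reffacebook';
        input.value = 'true';
        forms[i].appendChild(input);
    }
});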
Hey, I hope I understood what this is all about. You want to track traffic across domains, right? I don't do any coding to achieve this kind of tracking, because honestly I don't know any (I apologize for not even trying to learn; I realize that knowing JavaScript has a lot of benefits in advanced marketing). OK, here is my point: if we want to track traffic between domains and retarget those visitors later, wouldn't Facebook handle that itself just by using the same pixel on both domains? That is what I have always believed for multiple domains when running Facebook ads. The important thing is that the audience should be the same from domain A to domain B (in your case it looks like it is, so I don't think that's an issue). But I'm not sure whether Facebook will successfully track traffic between domains just by placing the same FB pixel on both of them.
Thank you.
@SalihKp, I think you have a point; however, the issue is that I believe Facebook does cross-domain tracking with third-party cookies, which are not working optimally nowadays.
@David Avellan, actually, since the user returns to the landing domain for the thank-you page, the final conversion should work using first-party cookies, but what you want in between might be an issue.
I am now looking at a case where the user lands on a.com and converts.
So my situation is as follows:
I wrote a submission system in PHP that writes to a text file rather than a database. The idea is that people submit their URL to the text file, and then, when the script is called on a page, it redirects to a random address from that file. The problem is that I don't know how to make JavaScript read from the text file and then pick a line to redirect to.
Actually, just to clarify: I know how to make JavaScript read from the text file, but I have NO idea how I'd write a function to pick a URL from the file and forward to it.
Since I hit this roadblock a couple of days ago, the only way I have been handling submissions is to check the text file every 12 hours for new entries and then manually add them to this code:
setTimeout(function() {
var howMany = 38;
var page = new Array(howMany+1);
page[0]="http://gproxy.nl/";
page[1]="http://homeproxy.me/";
page[2]="http://proxyturbo.com/";
page[3]="http://www.lblocker.info/";
page[4]="http://goprivate.eu/";
page[5]="http://jsproxy.com/";
page[6]="http://openthis.eu/";
page[7]="http://proxy4home.info/";
page[8]="http://dedicatedipaddress.net/";
page[9]="https://www.4everproxy.com/";
page[10]="http://www.surfsearch.info/";
page[11]="http://www.leaveproxy.com/";
page[12]="http://proxyecole.fr/";
page[13]="http://newipnow.com/";
page[14]="http://www.hiddenmode.info/";
page[15]="https://europrox.org/";
page[16]="https://www.4everproxy.com/";
page[17]="https://goingthere.org/";
page[18]="http://xuxor.com/";
page[19]="http://033b.com/";
page[20]="http://thewebtunnel.com/";
page[21]="http://prox.phanteye.com/";
page[22]="http://www.hiddenall.info/";
page[23]="http://www.5966.info/";
page[24]="http://hideyoself.com/";
page[25]="http://prox.phanteye.com/";
page[26]="http://freevideoproxy.com/";
page[27]="http://thewebtunnel.com/";
page[28]="http://openthis.eu/";
page[29]="https://europrox.org/";
page[30]="http://xuxor.com/";
page[31]="https://incloak.com/";
page[32]="http://www.leaveproxy.com/";
page[33]="http://www.openunblocker.com/";
page[34]="http://post48.com";
page[35]="http://post48.com";
page[36]="http://inteproxy.com";
page[37]="http://208.73.23.59";
page[38]="http://hidemetoday.com/";
function rndnumber(){
var randscript = -1;
while (randscript < 0 || randscript > howMany || isNaN(randscript)){
randscript = parseInt(Math.random()*(howMany+1));
}
return randscript;
}
quo = rndnumber();
quox = page[quo];
window.location=(quox);
}, 1500);
I would be very grateful if someone could help me write the script or tell me what kind of function I should be Googling; searching for "How to make javascript read from a textfile and redirect" doesn't really turn up much :(
Many thanks!
If I understand correctly, first, you'll need a regex to find the URLs in the file. I would refer to this SO post for that: regular expression for url
Once you have that, you can go to any URL with window.location.href = 'http://google.com';
So, you'll do something like this...
var urlPattern = /((([A-Za-z]{3,9}:(?:\/\/)?)(?:[-;:&=\+\$,\w]+#)?[A-Za-z0-9.-]+|(?:www.|[-;:&=\+\$,\w]+#)[A-Za-z0-9.-]+)((?:\/[\+~%\/.\w-_]*)?\??(?:[-\+=&;%#.\w_]*)#?(?:[\w]*))?)/g;
var urls = data.match(urlPattern); // "data" is assumed to hold the contents of the text file
if (urls) {
    // pick a random match rather than a fixed index
    window.location.href = urls[Math.floor(Math.random() * urls.length)];
}
Is that what you're looking for?
Or you can use a simpler regex like var urlPat = /https?:\/\/[^'"]+/g
Remember to use the /g flag with your regex to get all occurrences of the urls.
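Putting it together, a minimal sketch (assuming the submissions file is served from the same origin at a hypothetical path like /submissions.txt, one URL per line):
// Fetch the text file, pick one line at random, and redirect to it.
fetch('/submissions.txt')
    .then(function (response) { return response.text(); })
    .then(function (data) {
        var urls = data.match(/https?:\/\/[^'"\s]+/g);
        if (urls && urls.length) {
            window.location.href = urls[Math.floor(Math.random() * urls.length)];
        }
    });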
I'm updating a wordpress site for a customer. I've had to turn on server side caching because the site is so slow due to so many plugins. There is nothing I can do about this and can't disable the plugins.
One of the plugins requires the user to enter an email to download a file. If the plugin sees a valid email, it sets a session variable. Then the page reloads. If that session variable is set, the site displays a download button rather than the enter email form field and button.
Since server side caching is enabled, this functionality is lost. So, what I want to do is set a cookie instead, then check that cookie client side, and swap the content of the div ( id="download" ). I don't do much with Javascript and am having trouble getting this working.
I've set the cookie like this in PHP:
setcookie( 'show_download', 1 );
I've set the new content of the div ( dynamically generated ) like this:
setcookie( 'new_content', '<div class="btn-ss-downloads"><a target="_blank" href="/wp-content/plugins/ss-downloads/services/getfile.php?file=4v-zR6fc6f/9vaROBr/dTJd/Tg/D 0-dT.vBx">Download Series 20 Datasheet</a></div>' );
I've got a function to read the cookie in Javascript that I got from another post:
<script type="text/javascript">
function readCookie(name) {
var ca = document.cookie.split(';');
var nameEQ = name + "=";
for(var i=0; i < ca.length; i++) {
var c = ca[i];
while (c.charAt(0)==' ') c = c.substring(1, c.length); //delete spaces
if (c.indexOf(nameEQ) == 0) return c.substring(nameEQ.length, c.length);
}
return "";
}
</script>
The approach that seems correct is to call readCookie( 'show_download' );. If the value is 1, swap the content of <div id="download"></div> with the content stored in the new_content cookie.
How do I make this work in JavaScript given the pieces I have presented here? I'd like to just add a JavaScript function to my WordPress header if that will work. It seems that I could just run this function after the page has rendered / loaded. I'm not terribly familiar with the order of operations between the browser and JavaScript. Please help.
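A minimal sketch of that idea (untested; it assumes the HTML stored in the new_content cookie was URL-encoded by PHP, so it is decoded here before use):
<script type="text/javascript">
// After the page has loaded, check the cookie and swap in the download button.
window.addEventListener('load', function() {
    if (readCookie('show_download') == 1) {
        var el = document.getElementById('download');
        // decodeURIComponent() assumes the cookie value was URL-encoded when it was set
        el.innerHTML = decodeURIComponent(readCookie('new_content'));
    }
});
</script>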
Is this what you're looking for?
(<head>)
<script type='text/javascript'>
var showDownload = localStorage.getItem("showDownload") //or readCookie()
if (showDownload == 1) {
$(document).ready(function(){
$("div.btn-ss-downloads").html("<a target='_blank'...</a>")
});
}
</script>
(</head>)
See http://api.jquery.com/html/ and https://developer.mozilla.org/en/DOM/Storage#localStorage
This uses jQuery, which you can include using:
<script type="text/javascript" src="http://code.jquery.com/jquery-latest.pack.js"></script>
(This is one of the few times I'd advocate using jQuery for something so simple, because $(document).ready is so nice!)
Why are you not considering AJAX?
If the requirement is to check validity of an address and then do certain server actions to display a button back, it looks like a good place to use AJAX.
$.post("downloader.php", { eml: mail_id })
    .done(function(data){
        if(data != "")
            $("div.btn-ss-downloads").html('Download');
    });
This assumes the downloader.php does the necessary checks and returns the download URL. You can separate this logic (which seems to be part of the main page now) into downloader.php
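For example, a hedged sketch of wiring that call to the existing email form (the #email-form and #email selectors, and the response format of downloader.php, are hypothetical placeholders for the plugin's actual markup):
$("#email-form").on("submit", function(e){
    e.preventDefault(); // stay on the cached page; only the div changes
    var mail_id = $("#email").val();
    $.post("downloader.php", { eml: mail_id })
        .done(function(data){
            if (data != "") {
                // downloader.php is assumed to return the download URL for a valid email
                $("div.btn-ss-downloads").html('<a target="_blank" href="' + data + '">Download</a>');
            }
        });
});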
My goal is to redirect my website to (/2012/index.php)
ONLY IF the user goes to ( http://www.neonblackmag.com )
ELSE IF
the user goes to ( http://neonblackmag.com.s73231.gridserver.com ), they will not be redirected... (this way I can still work on my website and view it from this URL, the temp URL).
I have tried the following script and variations; I have been unsuccessful in getting this to work thus far.
<script language="javascript">
if (document.URL.match("http://www.neonblackmag.com/")); {
location.replace("http://www.neonblackmag.com/2012"); }
</script>
This should work:
<script type="text/javascript">
if(location.href.match(/www.neonblackmag.com/)){
location.replace("http://www.neonblackmag.com/2012");
}
</script>
You should use a regular expression as the argument of match (if you're not using https you can drop the match for http://...).
In your solution the semicolon after the if should be removed, and I think that's it; mine uses location.href instead of document.URL.
You can also match subfolders using location.href.match(/www.neonblackmag.com\/subfolder/) etc
Cheers
G.
document.url doesn't appear to be settable, afaict. You probably want window.location
<script type="text/javascript">
if (window.location.hostname === "www.neonblackmag.com") {
window.location.pathname = '/2012';
}
</script>
(Don't use language="javascript". It's deprecated.)
Anyone at any time can disable JavaScript and continue viewing your site. There are better ways to do this, mostly on the server side.
To directly answer your questions, this code will do what you want. Here's a fiddle for it.
var the_url = window.location.href;
document.write(the_url);
// This is our pretend URL
// Remove this next line in production
var the_url = 'http://www.neonblackmag.com/';
if (the_url.indexOf('http://www.neonblackmag.com/') !== -1)
    window.location.href = 'http://www.neonblackmag.com/2012/index.php';
else
    alert('Welcome');
As I said, this can be easily bypassed. It'd be enough to stop a person who can check email and do basic Google searches.
The server side is where you really have power. In your PHP code you can limit requests to coming only from your IP, or on any other factor you choose, and no one can get in. If you don't like the request, send them somewhere else instead of giving them the page.
header('Location: /2012/index.php'); // PHP code for a redirect
There are plenty of other ways to do it, but this is one of the simpler ones. Others include redirecting the entire domain, or creating a test subdomain and only allowing requests to that.