Keep track of everybody who uses my code - javascript

I made a theme for Tumblr and I want to release the code to the public. The thing is, I want to know how many people are using my theme. Is there any JavaScript or other solution to this other than Google Analytics?

Yes, add some JavaScript to your code that sends a GET request to a PHP page that logs the request. Grab the Tumblr page's document.URL when you do this.
<script type="text/javascript">
var xmlHttp = new XMLHttpRequest();
xmlHttp.open( "GET", '//mywebsite.com/tumblr_code_check.php?referrer=' + encodeURIComponent(document.URL), true );
xmlHttp.send( null );
</script>
PHP:
<?php
$referrer = $_GET['referrer'];
$filename = 'tumblr_refferalls.txt';
// Append to the log so earlier entries are kept instead of overwritten.
file_put_contents($filename, $referrer . "\n", FILE_APPEND);
?>
Then check the file tumblr_refferalls.txt. You can add some code to the PHP page to ignore requests from your own website, to prevent the log from filling up.
Please note that anyone can remove the JavaScript from your code.
You can get more detailed: add a timestamp, filter so each page URL is only appended to your log file once, have it send you a text message or email when someone new uses your code, and so on.
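As a client-side variant of that filtering idea (an assumption, not part of the original answer), the theme's script could remember in localStorage that it has already pinged from this browser, so repeat visits don't keep appending to the log:
<script type="text/javascript">
// Only send the logging request once per browser,
// so the server-side log file doesn't fill up with repeat visits.
if (!localStorage.getItem('tumblr_theme_pinged')) {
    var xmlHttp = new XMLHttpRequest();
    xmlHttp.open('GET', '//mywebsite.com/tumblr_code_check.php?referrer=' + encodeURIComponent(document.URL), true);
    xmlHttp.send(null);
    localStorage.setItem('tumblr_theme_pinged', '1');
}
</script>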

Use a tracking pixel: a 1x1 image that you just include in the markup. Of course, any end user is free to remove it. I urge you to refrain from other methods, as you will not only raise red flags with anybody who looks into your theme, but you may also negatively impact their PHP or JS execution should your host ever fail to connect.
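For illustration, a minimal sketch of such a pixel fired from JavaScript (the pixel.php URL and the referrer parameter are assumptions; a static <img src="//mywebsite.com/pixel.php" width="1" height="1"> in the markup does the same job):
<script type="text/javascript">
// Creating an Image and setting its src sends the request; the pixel never needs to be visible.
var pixel = new Image(1, 1);
pixel.src = '//mywebsite.com/pixel.php?referrer=' + encodeURIComponent(document.URL);
</script>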

Related

File download html with error handling FileNotFound

I want to download a file when I click on a link. The file is served dynamically.
<a href="/server/myfunc?id=XXX">Download!</a>
But I need to be able to display an error to the user if the /server/myfunc?id=XXX link does not provide a file. I can return a boolean false or any HTTP status code through this link if the file isn't present.
What would be the most appropriate method for this implementation? Note that I do not wish to alter the current state of the page in any manner, just trigger the file download without affecting the page, and display an error if the file cannot be downloaded.
You can't do that with HTML alone, since it is a static language. I would suggest using an if/else clause written in PHP (or another server-side language). Here is a sample that could fit your case (assuming you are embedding the PHP in the HTML).
<?php
// $fileAvailable is a hypothetical flag: set it from whatever server-side check tells you the file exists.
if (isset($fileAvailable) && $fileAvailable === true) { ?>
<a href="/server/myfunc?id=XXX">Download!</a>
<?php } else { ?>
<p>Hey, there is no URL!</p>
<?php } ?>
I know I am replying very late, but it may be useful for someone facing this kind of issue.
With a plain synchronous request (just following the link), it's very difficult to show an error message, because the error response would replace the whole web page, which we don't want.
So it's better to use AJAX; a rough sketch is shown below.
See the JohnCulviner blog for more details on how to use AJAX for downloads.
Also, go through the following SO link: Download a file by query Ajax
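A minimal sketch of that AJAX approach using the Fetch API (the /server/myfunc?id=XXX endpoint is taken from the question; the error handling and filename are assumptions):
<script>
async function downloadFile(id) {
    // Ask the server for the file; nothing on the page changes yet.
    const response = await fetch('/server/myfunc?id=' + encodeURIComponent(id));
    if (!response.ok) {
        // The server signalled an error (e.g. 404), so show a message instead of navigating away.
        alert('Sorry, that file could not be found.');
        return;
    }
    // Turn the response into a Blob and trigger the download without leaving the page.
    const blob = await response.blob();
    const url = URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = url;
    a.download = 'myfile'; // assumed name; parse Content-Disposition if you need the server's filename
    document.body.appendChild(a);
    a.click();
    a.remove();
    URL.revokeObjectURL(url);
}
</script>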

Execute JavaScript before redirection without damaging preview generated by facebook URL scraper

Aim: I have a small shortener service. I use links generated by this service as part of the content I post on Facebook. Facebook's URL scraper tries to generate a preview by locating the end-point URL and pulling relevant information.
Right now this works because I simply redirect in PHP the way mentioned below, which does not let me execute JS before that redirection. I want to find out if there is a way to execute that JS.
What can I do? Well, redirect from JS using window.location, like this solution here...
What's the problem? If I do that, Facebook's URL scraper can no longer find the final destination URL, resulting in a blank preview of my site (specifically the page where the JS executes) instead of the destination site. JS execution is a must, so I can't omit it.
Question: Is there a way I can execute some piece of JavaScript before I do a redirection via header location and still be able to get correct previews for the URLs?
Reference Code - in Laravel Controller
if ($row->save()) {
    // I was first doing this and redirecting from the view:
    // return View::make('pages/redirect')->with('originalUrl', $row->url);
    // but then I realized it wasn't fetching the previews, so I do this now:
    return Redirect::to($row->url);
    // Assuming it internally uses PHP's header location. I want to run the JS before this redirect.
}
I finally got around the problem by following what @Austin mentioned in the comments.
I put the script after the header location statement in the view and returned that view from the controller. That way Facebook's scraper found both my script and the content from the destination URL as merged markup (as seen in the Facebook URL debugger).
I find it kinda funny and weird but it works! :D
I tried out the idea I mentioned in the comments.
It looks like it works.
<?php
if (strpos($_SERVER["HTTP_USER_AGENT"], "facebookexternalhit") !== false) {
    // This is the Facebook crawler: give it a plain redirect so it scrapes the destination.
    header('Location: http://stackoverflow.com/');
} else {
    // Other clients: run the JavaScript first, then redirect from JS.
    echo '
    <script>
        window.location = "http://stackoverflow.com/";
    </script>
    Hello World
    ';
}
?>

PHP Ajax jQuery cross domain file loading

So before you say this can't be done: these are all files that I have on one server. I just have some of them listed under different domains.
My PHP script can access all of the files, but when I try to do an AJAX request to load a file I will often get an error (because the site I am accessing is secure and the one I am accessing it through isn't).
What I need is a way to have PHP grab the file, but I need AJAX to retrieve the file and render it for me. I am also using the ACE editor to edit the file.
The bit of code I have here will actually error out as well, because it loads and prints out the file where $page is defined but won't load where htmlspecialchars is.
<script>
var e = ace.edit("editor");
<?php
$page = readfile($_SERVER['DOCUMENT_ROOT'].$_GET['dir']);
echo 'e.setValue('.htmlspecialchars($page, ENT_QUOTES).');';
?>
</script>
I have an AJAX GET request working, but it doesn't work when I go to a directory with a special .htaccess file. I can't change the .htaccess file (unless there is a way for me to confirm that it is my script running and not someone else's).
The question is, how can I access those other files without getting that error? Mind you, those files could be of any extension. It is not limited to just scripts or CSS; mostly they will be HTML or PHP files.
After an hour of searching the deep dark depths of the php.net site I was able to put together a solution that works.
<?php
echo htmlspecialchars(
    addslashes(
        file_get_contents(
            $_SERVER['DOCUMENT_ROOT'] . $_GET['dir']
        )
    )
);
?>
The addslashes() is the extra part that I needed. Then I also had to put the output between the div tags for the editor; I couldn't use the editor.setValue() function.

Inserting Headers into a Data URI?

I'm trying to dynamically create data in JavaScript that downloads with a .csv extension, but am unable to associate a filename with the download. Here's a code snippet:
var data = '1,2,3';
window.location =
"data:application/csv;charset=UTF-8;content-disposition:attachment;filename=export.csv,"
+ encodeURIComponent(data);
This downloads, but the file name is generic, without a .csv extension.
I can do this server-side, with the following PHP code:
<?php
header('Content-type: text/csv');
header('Content-disposition: attachment;filename=export.csv');
echo "1,2,3";
?>
Additionally, if my back is to the wall, I can modify my JavaScript to send an AJAX request to the server and get this in return, but I'd prefer a pure JavaScript solution. Is there a browser-independent, Flash-free way of doing this?
EDIT
Someone pointed out this question as a possible duplicate. I did goo-diligence and checked previous SO articles. I am still asking this because some time has passed since those questions, so I am hoping:
Browser features may have changed favorably.
Older browsers are less of a constraint, so certain solutions may be more viable now than when the previous questions were asked.
Additionally, if it makes the question more palatable, I'm specifically asking about inserting a type of content header into a data URI (as per the subject), which is a twist on the other questions.
Thanks in advance.
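As an aside to the question above: in browsers that support the HTML5 download attribute, a Blob gives a pure-JavaScript way to control the filename. A minimal sketch (export.csv is taken from the question; everything else is an assumption):
<script>
var data = '1,2,3';
// Wrap the CSV text in a Blob so the browser treats it as a file.
var blob = new Blob([data], { type: 'text/csv;charset=utf-8' });
var url = URL.createObjectURL(blob);
// An anchor with a download attribute lets us choose the filename.
var a = document.createElement('a');
a.href = url;
a.download = 'export.csv';
document.body.appendChild(a);
a.click();
a.remove();
URL.revokeObjectURL(url);
</script>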

Get HTML of URL without XMLHttpRequest

Thanks for reading. I'm trying to come up with a JavaScript function that would load the HTML source of a page at an external URL into a variable, so that the whole thing becomes editable. The complication is that the URL does not end with an "html, htm, aspx" extension, but instead with a string of input form variables (i.e. ?type=AAA&color=BBB...). Hence the XMLHttpRequest method is out of the question.
Is this doable in JS/jQuery at all? I've heard about the same-origin policy, but the following tool manages to do just that, although in PHP: http://www.iwebtool.com/code_viewer
The same-origin policy does apply in this case; however, you can do it with a combination of server-side code (PHP) and jQuery. Here's a little example.
PHP
<?php
$url = $_REQUEST['url'];
$curl_handle = curl_init();
curl_setopt($curl_handle, CURLOPT_URL, $url);
curl_setopt($curl_handle, CURLOPT_CONNECTTIMEOUT, 2);
curl_setopt($curl_handle, CURLOPT_RETURNTRANSFER, 1);
$buffer = curl_exec($curl_handle);
curl_close($curl_handle);
echo($buffer);
?>
jQuery / HTML
<script type="text/javascript" src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<script type="text/javascript">
$.ajax({
    type: "POST",
    url: "yourPhpScript.php",
    data: "url=http://stackoverflow.com"
}).done(function( content ) {
    // content is your variable containing the source
    $('#content').html(content);
});
</script>
<div id="content"></div>
XMLHttpRequest works with any valid URL; just give it the appropriate URL and you can get the response as text.
However, there is the restriction of the same-origin policy. There are different workarounds for different situations, but if you want to be able to manipulate the text you receive, there is really only one option. Use the same JavaScript you currently have; just add this as the first line of getUrl:
url='/path/to/proxy.php?url='+encodeURIComponent(url);
Then, on your server (the same one that's serving the page and its javascript), write proxy.php:
<?php
// Fetch the requested URL server-side and relay its body back to the browser.
echo file_get_contents($_GET['url']);
?>
This will make every AJAX request you make go to your own server, which does not have the restriction of loading from only one domain. The server loads the URL you asked for and replies with the response it got from that page. Note that the above script only gives you the content body (what you see when you view source); if you need access to the HTTP headers, you can relay those too, it will just be more complicated.
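For illustration, a minimal sketch of what a getUrl helper along those lines might look like (the asker's original getUrl isn't shown in the thread, so the name and callback style here are assumptions):
<script>
function getUrl(url, callback) {
    // Route the request through the same-origin proxy described above.
    url = '/path/to/proxy.php?url=' + encodeURIComponent(url);
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true);
    xhr.onload = function () {
        if (xhr.status === 200) {
            // responseText now holds the HTML source of the external page.
            callback(xhr.responseText);
        }
    };
    xhr.send();
}

// Usage: fetch an external page (query-string URLs are fine) as text.
getUrl('http://example.com/page?type=AAA&color=BBB', function (html) {
    console.log(html);
});
</script>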
