Manage ads across multiple sites in JavaScript

Is there a way to manage ads so that if I choose to show a certain ad, it appears on every site that publishes my ads? I'm also looking into adding Google Analytics for tracking.
I've looked into iframes but there are security issues.
Is there a way to accomplish this with JavaScript, similar to how Google AdSense works? I was thinking of writing some JavaScript and giving my publishers a remote script to access the data.
Or is there a better way?
I don't want to use PHP, because the publishers are probably not running PHP, and a PHP script that pulls the data won't execute on plain HTML sites.
SUMMARY:
I'm looking for a solution that's in HTML and JS. Any help is appreciated, and pseudocode is always helpful if you can provide some.
Thanks in advance

I worked at a place that used JavaScript to render ads. This is how I accomplished it.
First, the server. I was running an NGINX web server with a PHP-FPM backend and Varnish (a cache) in front of it. The server was basically serving up static files, so this kept server load tremendously low. There are decent tutorials for setting that up on CentOS.
I used Amazon Elastic Beanstalk, with the small instance type, to run this. It never needed to spin up more than one small instance to serve somewhere in the neighborhood of 2.5 million JavaScript requests per hour. Keep in mind, it's just serving small text snippets.
The tags looked like this:
<script type="text/javascript" src="http://ads.someserver.com/1234"></script>
where 1234 is the tag ID number. Each publisher could have multiple tags; the ID keeps track of which publisher it belongs to, what ad size it is, and so on.
Second, the JavaScript. You use an nginx rewrite to direct that request to a JavaScript file, which in turn loads the ad. The JavaScript file has to dynamically create an HTML element (without libraries of any kind, since load time is at a premium) and then fill it with your ad.
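For illustration, here is a rough sketch of what the served file might look like; the click URL, creative path, and dimensions are placeholders rather than our actual code:

// Returned for a request like /1234 - a sketch, not the production code.
// No libraries: find the <script> tag that loaded this file, create a
// container next to it, and fill it with the ad markup.
(function () {
  var scripts = document.getElementsByTagName('script');
  var current = scripts[scripts.length - 1]; // the tag currently loading us
  var ad = document.createElement('div');
  ad.className = 'ad-unit';
  // A real server would choose the creative based on the tag ID (1234 here).
  ad.innerHTML =
    '<a href="http://ads.someserver.com/click?tag=1234">' +
    '<img src="http://ads.someserver.com/creative/1234.png" width="300" height="250" alt=""></a>';
  current.parentNode.insertBefore(ad, current);
})();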
You need to have another service of some kind choose what ad gets shown. This was not my department, but loading something up once you have the ID should not be difficult.
Figuring out where to serve the assets (the ads) from is a tough call. Wherever you serve them from had better be very fast, because the ad servers you'll be competing against are very fast, and publishers will be very annoyed if your ads delay their site's loading.
Good luck - you have a lot of challenges ahead if you want to pull this off, not least of which will be paying for the servers to run it.

Related

Any way to create an application with the local web page as an interface?

A few days ago I decided to make my own "interface" to make it easier to organize (and work with) some of my personal files. When a lot of related data, pictures, and links are right in front of you and you can change them in a couple of clicks, it's very convenient.
I started by studying HTML, CSS, and JS, because I thought that changes made to the local page would be saved somewhere on my PC, so I could just open Index.html and do whatever I wanted. But they weren't: refreshing the page erased all changes.
Using the browser's localStorage doesn't suit me, because if I change browsers, the data will be lost. I want it to just open from Index.html and keep working even if I change browsers or move the site folder to another computer.
Then I decided to learn more about server-side languages (such as PHP or Node.js), because they are directly related to databases, so I was hoping to save the changes through them. But these languages required me to actually run a server, with an IP and a port. I just wanted to open a local page through one file, without any ports or console commands, so this approach scared me off quickly.
So is there an easy way to make a local page like this? Maybe I haven't studied one of the above methods well enough and it actually has this capability?
Or is the best I can hope for a simple application that uses that local page as an interface to interact with the data? I heard about this possibility in passing a long time ago. If so, please give me at least a hint as to which language to choose for it.
I don't understand much of what lies outside vanilla HTML, CSS, and JS, so a full study of a complex language like Java or Python would be too difficult for me, and the goal isn't worth that much effort.
I hope I understand correctly what you are trying to do.
If your goal is to make an application to manage your files, I think the simplest solution will be, as you said, to look into Node.js and its File System API, which will let you interact with your files through JavaScript code.
Your program will have to be in two parts that interact:
the "front" HTML page
the "back" Node.js script
The downside is that you'll have to go deeper into your study of the language to learn how to create the interactions you want between your HTML file and your Node.js application.
However, there is no need to open your server to the web to make it work. The Node.js application can be set to listen only for requests from the computer that runs it (localhost).
I obviously can't get too much into the details without knowing precisely what you want to do, but you'll probably have to learn to make a local server with Node (search "nodejs http" or "nodejs express"), then make requests to it from the HTML page's scripts (search "ajax request").
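As a very rough sketch (the file name, port, and routes are made up, and this is not production code), the "back" part could look something like this, listening only on localhost and saving whatever the page sends into a JSON file:

// save-server.js - a minimal sketch; assumes the data lives in data.json
// next to this script. Run with: node save-server.js
const http = require('http');
const fs = require('fs');

const server = http.createServer((req, res) => {
  if (req.method === 'GET' && req.url === '/data') {
    // Hand the saved data back to the page.
    fs.readFile('data.json', 'utf8', (err, text) => {
      res.writeHead(200, { 'Content-Type': 'application/json' });
      res.end(err ? '{}' : text);
    });
  } else if (req.method === 'POST' && req.url === '/data') {
    // Collect the request body and write it to disk.
    let body = '';
    req.on('data', chunk => { body += chunk; });
    req.on('end', () => {
      fs.writeFile('data.json', body, () => res.end('saved'));
    });
  } else {
    res.writeHead(404);
    res.end();
  }
});

// '127.0.0.1' means only this computer can reach the server.
server.listen(3000, '127.0.0.1');

The "front" page would then read the data with something like fetch('http://localhost:3000/data').then(r => r.json()) and save it with a POST request to the same URL; nothing here is reachable from outside your machine.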
What you need to look into are (web-based) content management systems, like Strapi or the "grand old dame" WordPress.

How to execute an external JS file blocked by users' adblockers

We use an external service (Monetate) to serve JS to our site so that we can perform ad hoc presentation-layer updates without going through a full site re-deploy, which in our case is a time-consuming, monolithic process we can only afford to do about once per month.
However, users who use adblockers in the browser do not see some of these presentation-layer updates. This can negatively affect their experience of the site as we sometimes include time-sensitive promotions that those users may not be aware of.
To work around this, I was thinking of duplicating the JavaScript file that Monetate serves and hosting it on infrastructure separate from the site. That way, if we needed to make updates to it, we could do so as needed without doing a full site re-deploy.
However, I'm wondering if there is some way to work around the blocking of the Monetate JS file and somehow execute the remote Monetate JS file from our own JS code in such a way that adblockers would not be able to block it? This would avoid the need to duplicate the file.
If that file is blocked by adblockers, chances are that it is used to serve ads. In fact, your description of time-sensitive promotions sounds an awful lot like ads, just not for an external provider, but for your own site.
Since adblockers usually match the URL, the easiest solution would indeed be to rehost this file, if possible under a different name. Instead of hosting a static copy, you can also implement a simple proxy with the equivalent of <?php readfile('http://monetate.com/file.js'); ?> or Apache's mod_rewrite. While this will increase load times and can fail if the remote host goes down, it means the client will always get the newest version of the file.
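If PHP isn't available on your servers, a rough Node.js equivalent of that proxy might look like this (the upstream URL is a placeholder, and this is a sketch rather than a hardened proxy):

// proxy.js - a sketch of the rehosting proxy, not production code.
// Each request re-fetches the upstream file, so clients always get the
// newest version; the upstream URL below is a placeholder.
const http = require('http');
const https = require('https');

const UPSTREAM = 'https://example-tag-host.com/file.js'; // placeholder URL

http.createServer((req, res) => {
  https.get(UPSTREAM, upstream => {
    res.writeHead(200, { 'Content-Type': 'application/javascript' });
    upstream.pipe(res); // stream the remote file straight through
  }).on('error', () => {
    res.writeHead(502);
    res.end('// upstream unavailable');
  });
}).listen(8080);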
Apart from using a different URL, there is no client-side solution - adblockers are included in the browser (or an extension thereof), and you cannot modify that code for good reasons.
Beware that adblockers may decide to block your URL too, if the script is indeed used to serve ads.
Monetate is probably blacklisted in Adblock, so there is nothing you can do about that.
I think that self-hosting the Monetate script would require keeping it updated by checking for new versions from time to time (maintaining it could become a pain in the ass).
A good solution in my opinion is to inform your users about that limitation with a clear message.
Or, you can get in touch with Monetate and ask for a solution.

How to minimize traffic and load time for http request for external scripts?

So I was pretty interested in the idea of head.js; however, the idea doesn't play nicely with the way I have developed my backend.
Basically, what I'm trying to do is decide how I should serve my scripts (JS and CSS) to give my service the most performance and the least traffic. I feel like I can replicate a more integrated version of the head.js idea using more of my backend. I'm using Node.js, without any frameworks around it, to serve everything.
Moreover, I have several JavaScript and CSS files. Basically, there's one for the entire site covering the header, footer, and reused methods and styles, and then there's one for each page.
The original idea was to have the page send a GET request to "/js?src=" asking for a single name like "index"; all the JavaScript needed for that page would then be sent back as a single concatenated script in one response. The problem with this is that I'm using Closure Compiler, and it seems like I would run into a lot of issues with this approach, except for the CSS.
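Roughly, I imagine that first idea looking something like this (the bundles map and file paths are invented just to show the shape of it):

// Sketch of the single-request idea (/js?src=index); the bundles map
// and file paths are made up, not my real layout.
const http = require('http');
const fs = require('fs');
const url = require('url');

const bundles = {
  index: ['js/site.js', 'js/index.js'] // site-wide script plus the page script
};

http.createServer((req, res) => {
  const parsed = url.parse(req.url, true);
  if (parsed.pathname === '/js' && bundles[parsed.query.src]) {
    res.writeHead(200, { 'Content-Type': 'application/javascript' });
    // Join every file in the bundle into one response body.
    res.end(bundles[parsed.query.src]
      .map(file => fs.readFileSync(file, 'utf8'))
      .join('\n;\n'));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);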
The second idea is to create a loop in the template that issues separate requests for each script, utilizing more of the head.js idea but moving it to the backend.
The third idea (maybe I'm overthinking at this point) is to create a Closure-compiled version of the scripts for each page which includes the JavaScript for not only that page but also the header and footer, so the scripts never conflict. This creates redundant data in each of the files and poses the issue of not pipelining my assets.
The basic idea of my service is a website that serves social media content, images, and music in real time. So initial page loading isn't extremely important; however, I want the server to be able to handle a large number of requests quickly. I'm more focused on the big picture of serving many users than on the individual user's experience. What's my best approach?

Scripts folder a vulnerability?

In my .NET web applications I usually have a Scripts folder that contains all of my JavaScript files - jQuery mostly these days, with the occasional JavaScript library of some sort or another.
I'm running vulnerability scans against one of my websites through a scanner called Nexpose, and it informed me that the Scripts folder is open to the world - meaning unauthenticated users can download the JavaScript files contained in the folder and that this is a critical vulnerability. According to Nexpose, the Scripts folder should be restricted to only allow authenticated users to access it. Which leads me to my first question.
How do I restrict the Scripts folder to only authenticated users? I tried placing a web.config file in the Scripts folder and denying access to all unauthenticated users that way, but it didn't work. I was able to determine this myself by going to my website's login page without logging in, then typing https://mywebsite/scripts/menubar.js, and sure enough it let me download the menubar.js file.
Second question - Why is this considered a vulnerability? I've tried to reason my way through the possibilities here, but I've failed to come up with much at all. Is it a vulnerability simply because Joe the l33t h4x0r could figure out the various libraries that I'm using and then maybe use known exploits against them?
Update
Overwhelmingly, the answer seems to be that a .js file being openable and readable in the client's browser is in no way a vulnerability by itself. The only vulnerability that might exist would be if the developer were using the .js file in an insecure fashion of some sort (which I'm not).
Logically, you wouldn't want to disallow access to the actual files, because then you couldn't use them in your web page. The web server makes no distinction between a browser requesting a file as part of rendering a page and someone manually downloading the file.
As a result, the answer to your first question is: you can't, and you wouldn't want to. If you don't want users to access a file, take it out of the web folder. If it's required to render your site, then you want anyone to have access to it so your site can render properly.
As to why it's considered a vulnerability, who's saying it is? I can go pull any JavaScript Facebook uses right now. Or, more to the point, I could go to Bank of America or Chase's website and start looking through their JavaScript. If I had an account, I could even take a look at the JavaScript used once the user is logged in.
The only thing that you might need to worry about is the same thing you always need to worry about: exposing details that shouldn't be exposed. I'm not sure why you would, but it obviously wouldn't be a good idea to put your database password in a JavaScript file, for example. Other than things like that, there's nothing to worry about.
In most cases it's not a vulnerability. Consider all of the massive public sites with anonymous traffic and/or where it's very easy to become an authenticated user (Google, eBay, Amazon, etc.) These sites also have some of the most elaborate scripts.
The more subtle thing to watch out for is other files which you DO want protected. For example, if users must log in to your site and purchase a document, video, image, etc. before viewing it, it certainly should not be in a publicly accessible folder.
Yes. You should keep most of your processing server-side, as most (if not all) client-side scripts can be viewed and edited. Most sites use JavaScript, so simply using it isn't dangerous; you just have to be careful about what you do with it.
Also, to answer your first question, don't protect them if unauthenticated users need them too.
Sounds like some security suite has an itchy trigger finger. The only two problems I could see are that you could end up lending your server out as a CDN if someone chooses to point at your jQuery or your -insert library name here-, or (now this is a real security risk) that you are serving some kind of dynamic .js files out of there that could pose a potential threat. The only other thing I can think of is that if you have your "custom" app JS in the mix with all the libraries, someone could potentially discover your endpoints (web services and such) and try to see if they're secure... but that's it! Nothing more... (unless you did something really dumb like hard-code a password in there... lol)
So the attack isn't that people can edit the script; the attack is getting the web server to arbitrarily write to a directory. What you need to do is make sure they are read-only files (chmod 400, or read-only on Windows). In terms of defense in depth (DiD), you want to make sure the web server runs as a non-privileged user that cannot log into the system. Further, you need to do all data sanitization on the server, irrespective of what you do on the client side, because you do not control the client side. Typically this involves making sure that you cleanse all data coming from the web, as well as from the database, before it gets served. One of my favorite things to do is insert arbitrary JavaScript into the database and watch it do things in the UI because the development team assumed everything would be fine since they already cleaned it once.
I can provide more details around securing the system if it is warranted.
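As a tiny illustration of that last point, here is the kind of escaping that has to happen on the server before database content reaches the page (a sketch, not a complete sanitization layer):

// Escape user-supplied text before it is rendered, so script that was
// stored in the database cannot execute in the UI.
function escapeHtml(value) {
  return String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}

// '<script>alert(1)</script>' comes out as inert text, not running code.
console.log(escapeHtml('<script>alert(1)</script>'));
// -> &lt;script&gt;alert(1)&lt;/script&gt;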

Dynamic loading: pass back a URL to the script, or pass back the script itself?

I've been wondering if there is a right way to do this: when I'm dynamically loading a script using AJAX, I have the option of passing back a URL to the script on the server and then running <script src="response.url"></script>, or just passing back the script itself.
I went with the approach of passing back the contents of the script itself, because I figured I'd make only one round trip instead of two, and also because I want to pass back some CSS or other assets as well. I noticed, however, that Facebook passes back URLs to CDN resources, so I'm wondering if my approach has consequences I'm overlooking.
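For reference, the two options I'm weighing look roughly like this (the endpoint names are made up, and I'm using fetch just to keep the sketch short):

// Option 1: the server responds with a URL, and the page adds a <script>
// tag pointing at it (two round trips: the AJAX call, then the script).
fetch('/api/next-module')                // made-up endpoint, returns {"url": "..."}
  .then(r => r.json())
  .then(({ url }) => {
    const tag = document.createElement('script');
    tag.src = url;
    document.head.appendChild(tag);
  });

// Option 2: the server responds with the script text itself
// (one round trip; the code runs as soon as the tag is inserted).
fetch('/api/next-module-source')         // made-up endpoint, returns raw JS
  .then(r => r.text())
  .then(source => {
    const tag = document.createElement('script');
    tag.textContent = source;
    document.head.appendChild(tag);
  });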
Thanks!
Matt
When you're Facebook, you're looking at some rather unusual traffic patterns. Sending back 20 KB of script versus sending 30 characters from dynamic servers can translate into a lot more load on those servers. Additionally, they might not be able to serve large-ish content all that fast.
In contrast, the CDN servers are glorified static proxies, designed for speed and scale. So from Facebook's point of view, the additional round trip makes sense: it can still improve the overall page speed, and it certainly improves their server traffic patterns.
Now back to you. This approach won't make sense if you're going to load the script from the same servers as the rest of your site. If you do have access to a CDN as well, then you have to do the math using various assumptions about your users (latency, location), facts about your site (size of scripts, timing of script loads), and compare the effect of having your main servers serve those scripts, versus the extra round-trip and your CDN servers handing out those scripts.
One additional thought about round trips: if I were Facebook, I'd probably be handing out those CDN URLs early on, before the page actually needs to load the scripts. Ideally, I'd piggyback on another request to sneak that little bit of extra data in. That would make the extra round-trip issue mostly moot.
Hm, well, I'm fairly sure there are some cross-domain security issues with AJAX, meaning that if you were trying to dynamically load a script's content from an external CDN, you'd need to work around that issue.
