Android - document.write() blocks Google Ads - Doubleclick - javascript

I've found numerous questions about document.write() but none that responds to my specific problem.
Short explanation
The request to receive an ad from DoubleClick finishes successfully, but the ad doesn't load (a blank space appears instead of the ad). These are the warnings provided by logcat in Android Studio:
A parser-blocking, cross site (i.e. different eTLD+1) script, SCRIPT_PATH, is invoked via document.write. The network request for this script MAY be blocked by the browser in this or a future page load due to poor network connectivity. If blocked in this page load, it will be confirmed in a subsequent console message. See https://www.chromestatus.com/feature/5718547946799104 for more details.
The SSL certificate used to load resources from WEBSITE_PATH will be distrusted in the future. Once distrusted, users will be prevented from loading these resources. See https://g.co/chrome/symantecpkicerts for more information. SCRIPT_PATH
Longer Version
I have an Android app in which we include Google Ads with the DoubleClick library. Those ads can't load on some newer devices (e.g. Android 7.0). I suspect that newer versions of Android ship newer versions of Chromium (55+), in which the handling of document.write() has changed.
https://developers.google.com/web/updates/2016/08/removing-document-write
Thus, when loading some ads (in my case all of them) I always receive the warning about document.write() and the ad doesn't load.
I don't have any error from double click (request is successful).
I don't have any message saying that no ads are available to be served (the ad is received correctly in the request).
The only problem is that the ad doesn't load because of the JavaScript it's trying to execute.
Changing the JavaScript directly doesn't sound like a good idea, as it is loaded in a WebView provided by an external library and would require some ugly reflection.
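For context, the article linked above describes the intervention that produces the first warning: on slow connections, Chrome 55+ may simply skip fetching a parser-blocking, cross-origin script that was injected through document.write(), and it recommends injecting such scripts asynchronously instead. A minimal sketch of the two patterns, purely to illustrate the mechanism (the ad script URL is a placeholder, and as noted above the creative's JavaScript isn't something the app controls):

<!-- Pattern the intervention targets: a cross-origin, parser-blocking script
     injected synchronously via document.write() -->
<script>
  document.write('<script src="https://ads.example.com/ad.js"><\/script>');
</script>

<!-- Pattern recommended in the linked article: create the element and append it,
     so the parser is never blocked and the request cannot be dropped -->
<script>
  var adScript = document.createElement('script');
  adScript.src = 'https://ads.example.com/ad.js'; // placeholder ad script URL
  document.head.appendChild(adScript);
</script>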
Classes used:
com.google.android.gms.ads.doubleclick.PublisherAdView
com.google.android.gms.ads.doubleclick.PublisherAdRequest
com.google.android.gms.ads.doubleclick.PublisherAdRequest.Builder
I repeat that it works on older devices (Android 4.4, I think).

Related

Delphi TWebBrowser JavaScript Errors and Cookies

I have been stuck on this for a couple of weeks now, and this is a follow-on from the SO question Delphi REST Debugger Returns Error 429 Too Many Requests but Browser Returns JSON as Expected.
I wanted to get the content of a URL response using the TNetHTTPRequest and TNetHTTPClient components. I was continually getting 429 “too many requests” errors. When using Firefox's Inspect Element to look at network and storage, I discovered that I needed to receive cookies and then send those cookies with my request. Unfortunately, one of the cookies essential to the website content seems to be dependent (I think) on the execution of JavaScript. I went back to first principles and dropped a TWebBrowser on a form (VCL), and sure enough the browser shows a JavaScript error “Expected Identifier”.
When I use the TWebBrowser in FMX it does not throw an error; it just does not return the website contents at all and remains blank. I need FMX as I will be in a cross-platform mobile environment.
The URL is https://shop.coles.com.au/a/national/home
I use Delphi Community Edition 10.3.3 Rio.
The URL returns perfectly in the commercial browsers Firefox, Safari and Chrome, and even in CEF4Delphi. Unfortunately, I can't use CEF as I need cross-platform support.
I would like to know how to get the website content returned to the browser (or even better, the NetHTTPClient) without script errors, and how to access the browser's current cookies.
Any help will be most appreciated.
Thanks,
John.
URL returns perfectly in commercial browsers ... without script errors and how to access the browsers current cookies
If you'd inspect the network traffic (F12 > Network, then requesting your URL) or use uMatrix (to block everything that doesn't belong to the domain by default) you'd see the JS does at least one XHR to amazonaws.com. Your HTTP transfer alone (as done by TNetHTTP*) works fine and you get the same resource that each internet browser gets.
However, you don't do anything with what you got (in contrast to the internet browser, which also automatically parses the HTML, sees JS resources, and executes them). TWebBrowser most likely doesn't do what you take for granted due to security settings (try to get an error console in there, preferably F12 again). You need to do the same: parse the HTML resource for JS URIs, request those, and execute what you get, while still providing the same cookie environment.
For executing JS you could use Chakra or mORMot or BESEN. It's challenging at first, but the more you understand about HTTP (including cookies) and a JS engine, the more you'll see why "things work" in one situation and not in another. There's a reason why an internet browser is a very complex software and not just a downloader.
As per this, forcing IE11 Quirks mode might already cure your problem when using TWebBrowser:
TBrowserEmulationAdjuster.SetBrowserEmulationDWORD(TBrowserEmulationAdjuster.IE11_Quirks);

Firefox Enhanced Privacy Protection is blocking Datalayer push to Google Tag Manager

Over the past few weeks I've realised that the conversion tracking in Google Analytics of a website we built and maintain has been off by about 20% - 40% each day.
When testing in any browser but Firefox, everything works fine and you can see conversions pushing into Analytics straight away.
However, in Firefox, when you have Enhanced Privacy Protection turned ON (it comes switched on by default now), you get the following error:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://www.googleadservices.com/pagead/conversion/957837126/wcm?cc=ZZ&dn=01858439338&cl=ITVOCP2S_34Qxt7dyAM&ct_eid=2. (Reason: CORS request did not succeed).
As soon as you switch off Enhanced Privacy Protection it works perfectly.
The code I am using to push to the dataLayer, if it's of any relevance, is:
<script type="text/javascript">
document.addEventListener( 'wpcf7mailsent', function( event ) {
    window.dataLayer.push({
        "event" : "cf7submission",
        "eventAction": "FormSubmission",
        "eventCategory": "Contact Form Submission",
        "eventCallback" : function() {
            // Firefox never gets to run this callback to redirect the page - which is what triggered further investigation.
            window.location.href = "https://www.domain.co.uk/thank-you/";
            return false;
        },
        "eventTimeout" : 2000 // I had to add this in so that it still redirects to thank-you when the dataLayer push fails.
    });
}, false );
</script>
The event listener just checks when the email has been sent by the site; the rest pushes into the data layer for tracking and then redirects to the thank-you page upon completion.
In my opinion this is definitely not a CORS-related error, in the sense that the request is coming from our local script with the correct headers. The code works in all other browsers with no issue.
Firefox has this page https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS/Errors/CORSDidNotSucceed to try to explain why we're getting the error:
Reason 1:
It's a certificate error: it's Google, it's not a cert error.
Reason 2:
HTTP to HTTPS request: the site is HTTPS with a Let's Encrypt SSL certificate.
Reason 3:
Not permitted to access localhost: this isn't localhost, it's a live site.
Reason 4:
Server didn't respond: again, it's Google, it responds to everything.
TLDR:
Firefox is blocking the dataLayer push when Enhanced Privacy Protection is turned on, but it should be allowing a standard conversion tracking script to run, in line with their own docs. Why is it blocking us, and what code do I need to get around it?
UPDATE
I've found this link https://developer.mozilla.org/en-US/docs/Mozilla/Firefox/Privacy/Tracking_Protection which says:
How does Firefox choose what to block?
Content is blocked based on the domain from which it is to be loaded.
Firefox ships with a list of sites which have been identified as
engaging in cross-site tracking of users. When tracking protection is
enabled, Firefox blocks content from sites in the list.
Sites that track users are most commonly third-party advertising and
analytics sites.
Is Firefox seriously blocking Google Analytics on standard conversion tracking now?
It looks like I was correct with my original assumptions, to a certain degree.
Firefox isn't blocking all analytics access by default now, but it is blocking anything ad-related that tries to send conversion or tracking data related to ads.
So, if you're trying to fire a goal upon completion of an ad-related activity, it's going to get blocked, whilst other tracking-related scripts will still run.
Firefox has chosen its own list of what it believes to be third-party tracking scripts, and by default it's blocking them all now.
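If you want to confirm in code that it is this block (rather than a genuine CORS failure) stopping the eventCallback from ever firing, a small probe against the blocked host can tell you. A minimal sketch, assuming fetch() is available; the script URL is only an illustrative resource on the blocked host, and this detects the block rather than getting around it:

// Probe the host that the privacy protection blocks, so the page can tell
// "tracker blocked" apart from a real network problem. Illustrative URL only.
function probeTracker() {
    return fetch('https://www.googleadservices.com/pagead/conversion_async.js',
                 { mode: 'no-cors', cache: 'no-store' })
        .then(function () { return 'reachable'; })  // opaque response: the request went out
        .catch(function () { return 'blocked'; });  // rejected at network level: likely blocked
}

probeTracker().then(function (state) {
    if (state === 'blocked') {
        // eventCallback will never fire, so redirect straight away
        // (the thank-you URL from the snippet above).
        window.location.href = 'https://www.domain.co.uk/thank-you/';
    }
});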
Interesting Points
Google obviously relies on this conversion tracking data, and as such Chrome is quite far behind in implementing anything to block ad-related traffic; it's where they make their money, so it wouldn't make sense to block themselves. Chrome currently has over 60% market share in terms of usage (https://en.wikipedia.org/wiki/Usage_share_of_web_browsers), so your tracking is going to be OK for now.
However, both Safari and Firefox, neither of which rely upon ad revenue, have implemented strict measures against tracking.
Safari & Firefox
Firefox goes all out and blocks tracking scripts related to third-party sources. Take note of 'third party': it's when an advertiser embeds their script on your site.
Safari, on the other hand, has gone a step further and will auto-delete ALL tracking-related cookies after 7 days of the user not visiting the site. This is going to knock your data way off: although it will still show visitors, they'll show as new visitors instead of returning visitors.
Conclusion
Right now, I feel like this is the beginning of the end of traditional conversion and ad tracking for website owners, and something is due to change in the near future as these browser changes start to bite.
I don't know of any way to get around this for now. I explored trying to use a proxy to get around the tracking embeds, but without knowing how and what Google tracks on each script call, it was impossible to spoof the submissions to analytics.

IE11 javascript fails unless debugger is launched

I have a situation where some JavaScript on a web page works fine in Safari and Chrome, but fails in IE11. Unfortunately, due to confidentiality issues, I cannot post the JavaScript here.
In IE11 the web page's JavaScript fails to operate correctly. By that I mean some of the JavaScript works and some doesn't, with no errors displayed or any other indication of what's wrong.
If I try to debug the page using IE's developer tools, all the JavaScript works perfectly without any errors or issues.
Searching on the net, I found many people with the same IE problem: fails normally, works when debugging. The main issue they talk about is the console.log(...) statement. I checked my JavaScript and don't have any console.log(...) statements.
I then saw a Stack Overflow thread where adding cache: false to the $.ajax({... calls solved the issue. I added the same flag, but the problem still persists.
Are there any other bugs I've not found?
The web page is using jQuery to handle most of its manipulation of the DOM, with a single $.ajax... call and a series of $.get(... calls polling the server.
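For reference, this is roughly what that flag looks like in place; a generic sketch only, since the actual calls are confidential and aren't shown in the question (the polling URL is made up). cache: false makes jQuery append a timestamp parameter to GET requests so IE11 can't serve a stale cached response to the polling calls:

// Global form: applies to the $.ajax call and every $.get poll that follows.
$.ajaxSetup({ cache: false });

// Per-call form for the single $.ajax call (hypothetical endpoint).
$.ajax({
    url: '/status',
    cache: false,
    success: function (data) {
        // update the page with the polled data
    }
});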
How to debug your web pages... IE11 tips.
All modern web browsers suppress scripting error messages and warnings by default. (In the early days, web browsers would halt page loading/rendering and display a script error message with an alert statement.) This gives the best experience for the user, who isn't concerned with the internal workings of website code.
So, scripting errors will only BREAK execution if:
1. The browser debug tool is opened, and
2. The developer tools' Debug tab setting for Break on Exceptions has been turned on.
So, to debug your web pages:
1. Navigate to about:blank to start a testing cycle... press F12 to display the dev tool, select "Break on all exceptions" from the dropdown (it looks like a stop sign), and pin the dev tool to the bottom of the browser.
2. Return to the browser address bar and navigate to your test site (type the address or paste and go).
The dev tool will now break on ALL exceptions, and they will be listed in the Console tab.
IE has built-in content blocking and ActiveX filtering (ad blocking), which can affect outcomes. You need to configure Internet Options so that the IE dev tool console will record any blocked content or security (XSS) errors.
Tools>Internet Options>Advanced tab, check "Always record developer console messages".
Also, on the Emulation tab of the IE dev tool you will find the emulation mode (aka documentMode) that IE is using and how it was established, e.g. x-ua meta tag, Enterprise Site Mode list, user Compatibility View list, etc.
If you are developing an internal company website, the emulation mode used by IE may be for an earlier version of IE (e.g. IE8 on XP); you should include this information with your questions.
You should also include the IE security zone that your site has been mapped to (File>Properties menu in IE), e.g. the Intranet zone, as this can have different security and blocked-content outcomes.
Finally, the first step in troubleshooting web browser issues is to test in no-addons mode (for IE, Win+R > iexplore.exe -extoff). IE has built-in form fillers and popup blockers, and third-party add-ons can affect the expected outcomes.
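One classic cause of the "fails normally, works when debugging" symptom, especially when the Emulation tab shows an older document mode, is that window.console (or some of its methods) only exists while the dev tools are attached, so a stray logging call throws and silently kills the rest of the script. A generic defensive guard, not taken from the page in question, looks like this:

// Make sure console and the common methods always exist, so a logging call
// cannot throw when the dev tools are closed (older IE document modes leave
// window.console undefined until F12 is opened).
(function (win) {
    if (!win.console) { win.console = {}; }
    var methods = ['log', 'info', 'warn', 'error', 'debug'];
    for (var i = 0; i < methods.length; i++) {
        if (typeof win.console[methods[i]] !== 'function') {
            win.console[methods[i]] = function () { /* no-op when unavailable */ };
        }
    }
}(window));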

How can I determine what code or elements are blocked by non-IE browsers?

I have a self-hosted ASP.NET web service and a small HTML5 application that communicates with that service.
When I try to open this page in IE, the browser shows me a notification that it has blocked ActiveX components and scripts. After allowing the browser to execute dynamic components, the HTML/JS code works fine - queries reach the web service, it generates the needed data and sends the data the page needs.
But when I open the page in other browsers (Chrome, FF, etc.), the page does not react to any data that the server sends back as a callback. Queries reach the web service, but all communication between the page and the host stops.
The only JS libraries that I include on the page are 'jquery-2.1.0.min' and my own 'common.js'.
So, how can I determine what element is not working correctly in non-IE browsers?
Here are some things you can try to avoid these kinds of issues:
Use jQuery 1.11.1 instead of jQuery 2.1.0 to improve compatibility between browsers.
In Chrome, right click anywhere on the page and select Inspect element. You should see a debug window open at the bottom half of the page. Navigate to the Console tab in this window to see any JavaScript errors (a small error-reporting sketch follows this list).
In Firefox, install the Firebug add-in and use that to debug.
Make sure that you are not using any ActiveX. That is only compatible with IE.
You can use sites such as browsershots.org to test browser compatibility.
Don't use browser / platform specific technologies such as Flash, Silverlight etc.
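If the console shows nothing useful, a global error hook can capture whatever goes wrong in the non-IE browsers and report it somewhere visible. A minimal sketch; the /jsErrorLog reporting endpoint is purely an assumption, not part of the service described above:

// Register this before jquery-2.1.0.min.js and common.js so that any error they
// raise is reported rather than silently swallowed.
window.onerror = function (message, source, line, column) {
    var report = 'JS error: ' + message + ' at ' + source + ':' + line +
                 (column ? ':' + column : '');
    // Show it in the console while debugging...
    if (window.console && window.console.error) { window.console.error(report); }
    // ...and/or post it back to the self-hosted service (hypothetical endpoint).
    try {
        var xhr = new XMLHttpRequest();
        xhr.open('POST', '/jsErrorLog', true);
        xhr.setRequestHeader('Content-Type', 'text/plain');
        xhr.send(report);
    } catch (e) { /* never let the reporter itself break the page */ }
    return false; // let the browser log the error as usual too
};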

Chrome blocking iframe requests as cross-origin request even when origins are the same

This one has me stumped.
I have a web app that has a file upload/download area. Files are downloaded in the background via a temporary iframe element. This is a single-page AJAX application; the UI is written in JavaScript and jQuery and uses jQuery.FileDownloader.js to manage the iframe. The application runs over HTTPS, and the site and download URL are on the exact same domain. The back-end is a RESTful application. This has worked great for months. Until today.
All of a sudden, when attempting to download a file in Chrome, the browser reports an error of "Blocked a frame with origin https://example.com from accessing a cross-origin frame."
The problem is that the origin of the main site and that of the iframe are the exact same domain. I have ensured that the domains are the same as well as the protocol. Chrome is the only browser that throws up the cross-origin error. IE, Firefox, Opera, Safari... all work as expected. It's only in Chrome and it's only as of today. To make things worse, no updates were made to the browser. It truly is spontaneous. I've also ruled out plugins as the cause by running in Incognito mode, where none are allowed to run by my settings, as well as disabling my anti-virus software. This problem is being exhibited on other computers, in other locations (not on our LAN or subnet), all running Chrome.
And, again, both domains of the parent frame and the embedded iframe are identical. This only happens against the production server which runs over HTTPS. Other non-HTTPS sites (e.g. our dev environment, localhost) don't have the problem. Our SSL is valid. Since this is a single-paged AJAX application, we're trying to avoid popping up another window for the download.
Hopefully, someone can offer some advice. Thanks in advance.
Update: After additional research, I have found that the solution to this problem is to enclose the filename in the response header in double quotes.
I have found the cause of the problem. It turns out that Google Chrome has problems with files that have commas in their filename. When downloading the file via a direct link, Chrome will report that duplicate headers were returned from the server. This has been a long-standing problem with Chrome that has gone unaddressed. Other browsers are not susceptible to this problem. Still, it's a fairly easy problem to troubleshoot and, indeed, when I searched on this error, the first search result had the solution: remove commas from filenames when handling a request from Google Chrome.
However, this wasn't a direct link, it was an AJAX-request, which results in a different exception. In this case, the error provided by Chrome is the cross-origin request exception and this is what made it so difficult to troubleshoot.
So, the tl;dr of it all is to strip out commas in the names of uploaded files.
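Both fixes are applied on the server when building the download response. A minimal sketch of what that can look like, assuming a Node/Express-style handler (an assumption, since the back-end is only described as RESTful; lookupFile is a hypothetical helper):

// Quote the filename in Content-Disposition and strip commas, so Chrome does not
// treat the comma as a header separator and report duplicate headers (surfacing
// as the cross-origin error when the download goes through the iframe).
app.get('/download/:id', function (req, res) {
    var file = lookupFile(req.params.id);          // hypothetical lookup helper
    var safeName = file.name.replace(/,/g, '');    // strip commas from the filename
    res.setHeader('Content-Type', 'application/octet-stream');
    res.setHeader('Content-Disposition', 'attachment; filename="' + safeName + '"');
    res.send(file.contents);
});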
Another instance I found where this issue occurred is after executing code similar to:
document.domain = '[the exact same domain that the iframe originally had]'
Removing this line of code got rid of this error for me.
