I am making an extension that needs to capture POST data directed at a site and, once the site's response confirms success, change some local data to reflect it.
The issue is that the POST data is located in requestBody from the onBeforeRequest event, while the success confirmation is in the onCompleted event. I understand that the lifetime of a webRequest should be managed using its unique requestId, but I am using an Event Page and am therefore trying to avoid global variables.
eventPage.js:
function continueListening(requestDetails) {
    // finishListening closes over requestDetails, so the requestIds can be compared.
    function finishListening(completeDetails) {
        if (completeDetails.requestId === requestDetails.requestId) {
            doStuff(requestDetails, completeDetails);
            chrome.webRequest.onErrorOccurred.removeListener(finishListening);
            chrome.webRequest.onCompleted.removeListener(finishListening);
        }
    }
    chrome.webRequest.onErrorOccurred.addListener(finishListening, {urls: ["*://site*"]});
    chrome.webRequest.onCompleted.addListener(finishListening, {urls: ["*://site*"]});
}

// "requestBody" must be listed in extraInfoSpec, or requestDetails.requestBody is undefined.
chrome.webRequest.onBeforeRequest.addListener(continueListening, {urls: ["*://site*"]}, ["requestBody"]);
I decided to try nesting the listener registrations for the finished request, to give them the scope to compare requestIds with the initial request containing the form data. This appears to work, but I am concerned about a potential race condition between the resolution of the webRequest and the registration of the nested listener intended to catch it, which could leave behind any number of useless, unremoved listeners.
The other option I see is to store the requestDetails in chrome.storage.local and check them against the completeDetails once they arrive. My main hesitation there is that, if execution is interrupted for whatever reason, local disk space could be polluted with unresolved requests.
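For concreteness, a minimal sketch of that storage-based approach could look like this (doStuff and the match pattern are from the code above; note that chrome.storage JSON-serializes values, so only the serializable parts of requestBody, such as formData, can be stored):

// Persist each request's details keyed by its requestId...
chrome.webRequest.onBeforeRequest.addListener(function (details) {
    var entry = {};
    entry[details.requestId] = {
        url: details.url,
        formData: details.requestBody && details.requestBody.formData
    };
    chrome.storage.local.set(entry);
}, {urls: ["*://site*"]}, ["requestBody"]);

// ...then look them up, and clean up, when the request finishes.
chrome.webRequest.onCompleted.addListener(function (completeDetails) {
    chrome.storage.local.get(completeDetails.requestId, function (items) {
        var requestDetails = items[completeDetails.requestId];
        if (requestDetails) {
            doStuff(requestDetails, completeDetails);
            chrome.storage.local.remove(completeDetails.requestId);
        }
    });
}, {urls: ["*://site*"]});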
Is there a better way of doing this?
EDIT: Unfortunately, although I believed I was making an Event Page, I did not have persistent: false in my manifest. As I learned when I added it, Event Pages do not even support the webRequest API. The Event Page equivalent, declarativeWebRequest, seems to have died in the beta channel. So making it a Background Page seems to be the necessary solution.
I've set up a fiddle to verify whether the beforeunload event is triggered when the page is used in an iframe.
Since the fiddle shows its result in an iframe, I figured it would be easy to verify by just closing the page. I've set up a request bin at pipedream just to see if any requests get sent, but it doesn't seem to trigger in Chrome.
window.onbeforeunload = function() {
    fetch('https://eoaczcjrpegb7wv.m.pipedream.net');
};
Is it possible to use this event from an iframe, or do I need to look into a different approach?
After a closer look, it seems to capture some of the requests. Is this prone to race conditions? If so, are there any more robust alternatives?
Yes, there is a race condition. Since your event handler does nothing to stop the page unload in any way (it does not even trigger a confirmation prompt to delay it), immediately after your event handler is processed, the browser will proceed to unload the page. This aborts most pending requests; if a request did not manage to be submitted at that point, it will not be sent to the server at all.
Sending a request from a beforeunload event handler is a poor idea anyway. For starters, you are not even guaranteed that the event will fire at all; the browser may be unable or unwilling to trigger it. MDN warns that it will only fire after the user has interacted with the page, and that it may fail when the session is terminated outside the browser's control. The only legitimate purpose of beforeunload is to check whether the page contains any unsaved state that the user may lose, and to trigger a confirmation prompt; even that should be understood to work on a best-effort basis. Anything else is prone to abuse and therefore suspect; I would not be surprised if a future browser plug-in or vendor ‘intervention’ were to block all web requests when a page is about to unload.
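For illustration, that legitimate use looks roughly like the following sketch, where hasUnsavedChanges is an assumed application-specific check:

window.onbeforeunload = function (e) {
    if (hasUnsavedChanges()) {  // assumed application-specific check
        e.preventDefault();     // spec-compliant way to request the prompt
        e.returnValue = '';     // legacy fallback; custom text is ignored by modern browsers
    }
};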
However, if you insist, there are ways to make the request survive unload. You can use navigator.sendBeacon:
window.onbeforeunload = function () {
navigator.sendBeacon('https://example.net', '');
};
or the keepalive fetching option:
window.onbeforeunload = function () {
fetch('https://example.net', { keepalive: true });
};
Chrome provides both, but Firefox, as of version 106, only implements the former. Using those APIs comes with some restrictions: at any given moment, the total amount of data sent by active keep-alive requests must fit within 64 KiB, as per the Fetch specification.
You may notice that using those APIs in a beforeunload handler is still not recommended, as it worsens the performance of navigating back to the page with the back button (such handlers can disqualify the page from the back/forward cache). MDN suggests listening for the visibilitychange event instead, but that is of course not the same thing.
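For completeness, the visibilitychange variant would look roughly like this (same assumed example endpoint):

document.addEventListener('visibilitychange', function () {
    // Note: 'hidden' also fires when the user merely switches tabs, not only on unload.
    if (document.visibilityState === 'hidden') {
        navigator.sendBeacon('https://example.net', '');
    }
});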
Last but not least, nothing stops the user from having the browser lie to you that the request has been sent, like with this uBlock filter:
##+js(no-fetch-if, keepalive:true)
##+js(set, navigator.sendBeacon, trueFunc)
So try not to be too obnoxious with your spyware ‘analytics’.
I am getting an NS_BINDING_ABORTED error while passing data to SiteCatalyst on a button click. An scAdd event is fired, but after that the tag is not able to access all the data: some data is sent while some is not. I tried adding a timeout after firing the event, but it did not work. I've seen s.tl() suggested as the cause of this error, but what change needs to be made in that function to avoid it? Thanks!
In short, it's nothing to worry about. Adobe's servers respond with a 1x1 transparent pixel, so the user's browser does not need to wait for that to render.
You can refer to this KB article for more details on NS_BINDING_ABORTED and how it relates to Adobe Analytics:
https://marketing.adobe.com/resources/help/en_US/sc/implement/debugger_ns_binding.html
You can also check the Analytics Community for any follow up questions.
I am attempting to track events when links are clicked on my site, using a method similar to the following.
Example
<script type="text/javascript">
jQuery(function($) {
    // track clicks on all anchor tags that require it
    $('a.track').live('click', function(e) {
        // send an AJAX request to our event tracking URL for the server to track it
        $.get('/events/track', {
            url: $(this).attr('href'),
            text: $(this).text()
        });
    });
});
</script>
The problem I'm having is that a new page load interrupts the AJAX request, so sometimes these events aren't tracked. I know Google Analytics has a _trackPageview function that can be attached to onclick events, though, and this doesn't seem to be an issue for it. I'm wondering what's different about their call vs. mine, such that I'm seeing this race condition and GA isn't.
Note that I'm not worried about the result of the AJAX request...I simply want it to ping the server with the fact that an event happened.
(Also, I expect I'll get at least one answer that says to simply track the new page load from the server side, not the client side. This is not an acceptable answer for this question. I'm looking for something like how Google Analytics' trackPageview function works on the click event of anchor tags regardless of a new page being loaded.)
Running Google's trackPageview method through a proxy like Charles shows that calls to trackPageview() request a pixel from Google's servers with various parameters set, which is how most analytics packages wind up implementing such functionality (Omniture does the same).
Basically, to get around asynchronous requests not completing, they have the client request an image and crunch the parameters passed in those requests on the server side.
For your end, you'd need to implement the same thing: write a utility method that requests an image from your server, passing along the information you're interested in via URL parameters (something like /track.gif?page=foo.html&link=Click%20Me&bar=baz); the server would then log those parameters in the database and send back the gif.
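A minimal sketch of such a helper, assuming the hypothetical /track.gif endpoint above, might look like:

// Fire-and-forget tracking pixel: the browser starts the image request
// immediately, and the server logs the query parameters.
function trackEvent(params) {
    var img = new Image(1, 1);
    img.src = '/track.gif?' + $.param(params) + '&_=' + new Date().getTime(); // cache-buster
}

$('a.track').live('click', function() {
    trackEvent({ url: $(this).attr('href'), text: $(this).text() });
});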
After that, it's merely slicing and dicing the data you've collected to generate reports.
Matt,
If you just want to make sure that the tracking-pixel request is made, and you don't depend on the response, then simply writing out the tracking-pixel image with document.write will do the job.
And you can do the document.write in your onclick handler.
As far as a race condition between the href and the onclick handler of an anchor element is concerned, the order is well defined:
the event handler script is executed first;
the default action takes place afterwards (in this case, following the href).
(Source : Href and onclick issue in anchor link)
But yes, if you depend upon the response of the tracking request from the server, then you will have to make it synchronous.
One option would be to wrap the already-defined onclick handlers in a JavaScript function and then make the calls in order. Make sure that your tracking request is not asynchronous.
Though, as noted, you should not be dependent upon the response of the tracking-pixel request.
I'm implementing click tracking from various pages in our corporate intranet in order to add some sorely needed crowd-sourced popular link features ("most popular links in your department in the last 24 hours", etc.)
I'm using jQuery's .live() to bind to the mousedown event for all link elements on the page, filter the event, and then fire off a pseudo-ajax request with various data to a back-end server before returning true so that the link action fires:
$("#contentarea a").live("mousedown", function(ev) {
//
// detect event, find closest link, process it here
//
$.ajax({
url: 'my-url',
cache: false,
dataType: 'jsonp',
jsonp: 'cb',
data: myDataString,
success: function() {
// silence is golden -- server does send success JSONP but
// regardless of success or failure, we allow the user to continue
}
});
return true; // allow event to continue, user leaves the page.
}
As you can probably guess from the above, I have several constraints:
The back-end tracking server is on a different sub-domain from the calling page. I can't get round this. That's why I am using JSONP (and GET) as opposed to proper AJAX with POST. I can't implement an AJAX proxy as the web servers do not have outbound network access for scripts.
This is probably not relevant, but in the interest of full disclosure, the content and script are inside a "main content" iframe (and this is not going to change; I will likely eventually move the event listener to the parent frame to monitor its links and all child content, but step 1 is getting it to work properly in the simplified case of "1 child window"). Parent and child are on the same domain.
The back-end is IIS/ASP (again, a constraint -- don't ask!), so I can't immediately fork the back-end process or otherwise terminate the response but keep processing, as I could on a better platform.
Despite all this, for the most part, the system works -- I click links on the page, and they appear in the database pretty seamlessly.
However, it isn't reliable: a large number of links don't appear, particularly off-site links that have their target set to "_top". If the link is opened in a new tab or window, it registers OK.
I have ruled out script errors -- it seems that either:
(a) the request is never making it to the back-end in time; or
(b) the request is making it, but ASP is detecting that the client is disconnecting shortly afterwards, and as it is a GET request, is not processing it.
I suspect (b), since latency to the server is very fast and many links register OK. If I put in an alert pop-up after the event fires, or set the return value to false, the click is registered OK.
Any advice on how I can solve this (in the context that I cannot change my constraints)? I can't make the GET request synchronous as it is not true AJAX.
Q: Would it work better if I were making a POST request to ASP? If (b) is the culprit, would it behave differently for POST vs. GET? If so, I could use a hidden iframe/form to POST the data. However, I suspect this would be slower and clunkier, and might still not make it in time. I wouldn't be able to listen to see if the request completes, because it is cross-domain.
Q: Can I just add a delay to the script after the GET request is fired off? How do I do this in a single-threaded way? I need to return true from my function, to ensure the default event eventually fires, so I can't use setTimeout(). Would a tight loop waiting for 'success' to fire and set some variable work? I'm worried that this would freeze up things too much and the response would be slowed down. I assume the jQuery delay() plugin is just a loop too?
Or is something else I haven't thought of likely to be the culprit?
I don't need bullet-proof reliability. If all links are equally catchable 95% of the time it is fine. However right now, some links are catchable 100% of the time, while others are uncatchable -- which isn't going to cut it for what I want to achieve.
Thanks in advance.
I would try a different approach. You can bind to a different event like:
$(window).unload(function(event) {
    // tracking code here
});
I would try returning false from the link's event handler, remembering the URL, and navigating away only when the JSONP request succeeds. Hopefully it shouldn't add too much latency; considering you are on an intranet, it might be OK.
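A rough sketch of that idea, reusing the JSONP parameters from the question (myDataString is assumed defined; link targets are ignored for brevity). It binds click rather than mousedown so the default navigation can be cancelled:

$("#contentarea a").live("click", function(ev) {
    ev.preventDefault();
    var href = this.href;
    var go = function() { window.location = href; };
    setTimeout(go, 500); // fallback: navigate anyway if the response is slow or lost
    $.ajax({
        url: 'my-url',
        cache: false,
        dataType: 'jsonp',
        jsonp: 'cb',
        data: myDataString,
        success: go
    });
});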
Solved!
The short answer is: there is no reliable way to do this cross-domain with a GET request. I tried all sorts, including storing the event and trying to replay the event later, and all manner of hacks to try to get that to work.
I then tried tight loops, and they weren't reliable either.
Finally, I just gave in and used a dynamically created form that POSTed the results, with the target set to a hidden iFrame.
That works reliably -- it seems the browser pauses to finish its POST request before moving on, and ASP honours the POST. Turns out it's not 'clunky' at all. Sure, due to the browser security model I can't see the result... but it doesn't matter in this case.
I am now kicking myself that I didn't try that option first.
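For reference, a minimal sketch of that dynamically created form (the endpoint and field names are placeholders, and the clicked link element is assumed):

function trackViaHiddenIframe(data) {
    // The hidden iframe swallows the response; the POST itself is what matters.
    $('<iframe name="track_sink" style="display:none"></iframe>').appendTo('body');
    var form = $('<form method="post" action="my-url" target="track_sink" style="display:none"></form>');
    $.each(data, function(name, value) {
        $('<input type="hidden">').attr('name', name).val(value).appendTo(form);
    });
    form.appendTo('body').submit();
}

// link: the clicked anchor element
trackViaHiddenIframe({ url: link.href, text: $(link).text() });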
I want an event handler that fires when the user hits reload. Is onrefresh or onreload the correct handler to add to the <body> tag? Also, will this even fire before or after onunload? Are there any browser inconsistencies? Thanks.
I don't think there are events called onrefresh or onreload. You can know when the page is unloading, but knowing why (i.e. where the user is going next) is outside JavaScript's security sandbox. The only way to know whether the page has been reloaded is to know where the user was on the last page request, which is also outside the scope of JavaScript. You can sometimes get that via document.referrer, but it relies on the browser's security settings to permit access to that information.
The WindowEventHandlers.onbeforeunload event handler property contains the code executed when the beforeunload event is sent. This event fires when a window is about to unload its resources.
window.onbeforeunload = function () {
    return 'Are you sure you want to leave?';
};
This will show a confirmation dialog to the user with the message you returned from your function, giving them the option to leave the page or to cancel.
There is no way around the confirmation, as it could otherwise be used for malicious purposes.
https://developer.mozilla.org/en-US/docs/Web/API/WindowEventHandlers/onbeforeunload
If you combine setting a cookie for the specific page with a check in the onload event, you can simulate the nonexistent event you seek. You might adjust the cookie expiration so that a reload is counted only if the initial onload was a certain time interval ago.
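A sketch of that technique (the 30-second window is an arbitrary choice):

window.onload = function () {
    // If the marker cookie is still present, this load is a reload
    // (or a revisit within the expiry window).
    if (document.cookie.indexOf('loaded=1') !== -1) {
        console.log('page was reloaded');
    }
    document.cookie = 'loaded=1; max-age=30'; // expiry tunes what counts as a reload
};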
There are no onreload or onrefresh events that I'm aware of. Certainly, for JavaScript running in a browser this makes little sense: the existing window and all its contents are effectively discarded. Hence you either need to use onunload in the existing context, or the load event of the new context that is created as a result of the reload.
I do believe artlung may have indeed found a way, actually... His version, however, relies on cookies, and those can be cut off from use in numerous ways. The solution, then, is to use a server-side language of your choice to save a timestamp when the page is unloaded via JavaScript (still a vulnerability, yes, but why not throw another idea out there?) and then test it again upon every page load; if you detect a difference of less than a few seconds, the user probably just reloaded your page. : )
You could use a session: easier than a cookie, with no need to worry about expiration or a database. That would cover you for any page except the first one; I don't think the session superglobal is available until the second page. If that's a problem, you could start a session and reload the page immediately if there is no active session.
It's onunload, because when you hit refresh the browser "unloads" and then loads again.