I'm trying to build a website to track my run distance (I have this working) while also tracking the path I take (like Google Maps does when giving directions).
I'm not worried about storing it in a database yet, but if that is required to get the movement tracked on a map, then I will.
I've looked at Google and see they have asset tracking, and at Runtastic (which doesn't appear to have an API). I also checked PubNub, but it doesn't seem to map the track.
I want to do it with HTML5 so that it runs in a browser. Has anyone managed to get this working and could share a guide on how to do it? (I've spent hours looking.)
OwnTracks might do the job as a backend for storing and managing your recorded tracks.
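For the in-browser part, here is a minimal sketch of the tracking itself, combining the HTML5 Geolocation API with a Google Maps polyline. It assumes the Maps JavaScript API is already loaded and that the page has a div with id="map" (both assumptions, adjust to your setup):

// Draw the runner's path live as the browser reports positions.
var map = new google.maps.Map(document.getElementById('map'), {
  center: { lat: 0, lng: 0 },
  zoom: 16
});
var track = new google.maps.Polyline({ map: map, strokeColor: '#FF0000' });

navigator.geolocation.watchPosition(function (position) {
  var point = new google.maps.LatLng(
    position.coords.latitude,
    position.coords.longitude
  );
  track.getPath().push(point); // extend the drawn track
  map.setCenter(point);        // keep the map centred on the runner
}, function (error) {
  console.error('Geolocation error:', error);
}, { enableHighAccuracy: true });

Note that geolocation only works over HTTPS in current browsers and accuracy varies by device; each point could later be POSTed to a backend such as OwnTracks if you decide to persist the tracks.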
I've been wrestling with this for a few days and can't really find any good information on it. I am a novice with respect to Google Analytics and Google Tag Manager.
I have a client website which is a Single Page Application using Marionette. The client would like timings on how long it takes to load different pages in the application; specifically, they want to know which pages to focus on for optimization. They also want more analysis of user interaction in GA.
I've installed Google Tag Manager and set up the page view tag to show the individual page paths properly via the History trigger, as described here:
https://www.pmg.com/blog/tracking-single-page-web-apps-google-tag-manager-analytics/
That is working fine; I can see the history fragments/page paths in GA.
The problem is when I go to Page Timings in the Behavior reports in GA all the timings are 0. Avg page load, server load, etc.
I've tried installing a page load timing recipe from LunaMetrics:
https://www.lunametrics.com/labs/recipes/page-load-timing/
This did not work, as the custom JavaScript in it depends on window.performance, which after some research does not appear to work well for SPAs:
// GTM custom JavaScript variable from the LunaMetrics recipe:
// page load time derived from the Navigation Timing API.
function() {
  var timing = window.performance.timing;
  // Milliseconds from the start of navigation to the load event.
  var ms = timing.loadEventStart - timing.navigationStart;
  // Convert to seconds, rounded to one decimal place.
  return Math.round(ms / 100) / 10;
}
I've also tried setting siteSpeedSampleRate to 100 in the Page View tag for Google Analytics. This appears to have had no effect either.
I've also been experimenting with the tags using GTM Preview and GA Debug, and I can see the custom timings being set, but with the same value on every subsequent link click (I think it is reusing the page load value from the initial page load), which is why I don't think window.performance is an option here.
I've not been able to find any definitive way to get Google Analytics to track page timings for an SPA and would love any suggestions!
I guess I just needed to be a little patient after the siteSpeedSampleRate change. That seems to have resolved the timing issues for me.
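If anyone still needs per-route timings beyond the built-in Site Speed reports, one alternative sketch is to time each route change yourself with performance.now() and push the result to the dataLayer for a GTM Timing tag to pick up. The hook names and dataLayer keys below are hypothetical; wire them to whatever your router (e.g. Marionette/Backbone history) exposes:

var navStart;

// Call this when a route change begins.
function onRouteStart() {
  navStart = performance.now();
}

// Call this once the new view has rendered.
function onRouteRendered(pagePath) {
  var elapsed = Math.round(performance.now() - navStart);
  // 'spaTiming', 'timingValue' and 'timingPage' are made-up names;
  // a GTM tag would read them via Data Layer variables.
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: 'spaTiming',
    timingValue: elapsed,
    timingPage: pagePath
  });
}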
So basically, we are having issues with a website (https://thesoundshop.com) where the traffic source for PPC visits (it only seems to be PPC traffic that is affected) is being changed by the pageviews that we are pushing through to Analytics.
The website runs on Ajax, so we have to use JavaScript to emulate pageviews in Analytics whenever a link is clicked on the website. We are not using Google Tag Manager to implement the analytics JavaScript because of this, so we are using the gtag method exactly as Google's documentation recommends:
gtag('config', 'GA_TRACKING_ID');
I have tested this by visiting the website through a PPC ad and watching the Real-Time reports in Analytics. The first page load attributes the correct source (cpc) to the traffic, as expected, but when I click on a link to go to a different page, the traffic source changes to Google organic search. I then complete an action that I know will trigger an event or a goal, and when it appears in the Goal Reports, the goal is attributed to organic, too.
We know that this has to be down to the JavaScript pageview being pushed to Analytics to simulate that a new page has loaded, but we can't work out why it would change the traffic source. The gtag calls are implemented exactly as Google recommends; I'm just wondering if anyone else has had this problem and, if so, how they went about fixing it?
We had the same issue and after some more research found the problem along with the solution, thanks to Simo.
Basically, you need to capture document.referrer from the first page load and set it explicitly on every subsequent virtual pageview, so that the original source persists and does not get overwritten halfway through the session. It's a known issue with GTM and single-page applications.
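A rough sketch of that fix with gtag.js, assuming the page_referrer field is honoured for your property (GA_TRACKING_ID and the path handling are placeholders):

// Remember the real referrer from the initial page load.
var initialReferrer = document.referrer;

// Call this on every Ajax navigation instead of a bare gtag('config', ...).
function sendVirtualPageview(path) {
  gtag('config', 'GA_TRACKING_ID', {
    'page_path': path,                 // the virtual page being viewed
    'page_referrer': initialReferrer   // keep the session's original source
  });
}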
I want to develop a mobile app with the Google Maps JavaScript API, and it needs to support an offline mode so users can view the map without an internet connection.
So, is it possible to cache the Google Maps JavaScript API (the .js file) and the map data?
If so, could you please give me some advice?
Thank you in advance.
No caching or storage. You will not pre-fetch, cache, index, or store any Content to be used outside the Service, except that you may store limited amounts of Content solely for the purpose of improving the performance of your Maps API Implementation due to network latency (and not for the purpose of preventing Google from accurately tracking usage), and only if such storage: […]
Source: https://developers.google.com/maps/terms#section_10_5
As part of a weekend project, I'm making a little website that draws users' running tracks on a (Google) map. I would like users to be able to upload a snapshot of the map using a Facebook share button. The catch is, I would like to avoid hosting the images myself, to reduce bandwidth usage.
I can use html2canvas to turn the map into a canvas, and that into a .png using toDataURL(). The PNG would then be contained in a JavaScript variable in the user's browser, and not stored (or hosted) anywhere. So, with that in mind:
Can anyone think of a way to make Facebook scrape that image for the entry in the user's timeline?
Would facebook store the image permanently, or would it try to refresh it periodically (and fail)?
I understand that following the link in the post would also not lead to the image (which doesn't exist), but let's assume that's not an issue for now.
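For reference, the capture step I have in mind looks roughly like this (html2canvas's promise API; the element id is just an example):

// Turn the rendered map element into a PNG data URL, entirely client-side.
html2canvas(document.querySelector('#map')).then(function (canvas) {
  var pngDataUrl = canvas.toDataURL('image/png');
  // pngDataUrl now holds the image as a base64 string in the browser,
  // with nothing uploaded or hosted anywhere.
});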
Any ideas or alternatives would be very welcome! Thanks!
According to How long is Facebook caching the sharing thumbnails?, Facebook caches share images for 3-5 years, so if you can get it in there...
Perhaps you DO save the image, and then delete it with a cron task that runs every minute?
* * * * * /home/me/scripts/deleteAllMyShareThumbs.sh
I have a small and low-key website that features Google Maps using the v3 JavaScript API.
Not that I am expecting to get over 25,000 loads per day, but how does Google detect people loading the map on my site? My site loads the API from http://maps.google.com/maps/api/js?sensor=false without any API key, and as the map is rendered in the client's browser, how does Google relate it to my site?
Are the loads worked out through the HTTP headers/referrers, or are they based on how many times each client/IP loads the map?
In essence, the code for the map runs in my clients'/users' browsers, so how does Google know how many people are using the map on my site?
Finally, although I have Google Webmaster Tools, is it worth creating and using an API key, or will it just make it possible for Google to track how many people are using the map on my site and thus apply the 25,000 limit?
The API sends a request to Google which contains the URI of the page that contains the map (when you inspect the network traffic inside the dev tools you'll see a request to https://maps.googleapis.com/maps/api/js/QuotaService.RecordEvent). This request is sent once a Maps instance has been created successfully.
The benefit of using a key: Google is able to contact you when there are issues (e.g. it may send a notice when you have reached a limit and give you a chance to react/solve the issue before they restrict API access for your domain/account).
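If you do decide to use a key, the include line just gains a key parameter (YOUR_API_KEY is a placeholder you would get from the Google API Console):

<script src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY&sensor=false"></script>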