Given that it takes around 24 hours for Google Analytics data to update, it is hard to see how data rolls up and is displayed within Google Analytics.
Is there some sort of instant tester, or fast-turnaround application, that lets me quickly set up custom dimensions/metrics within my app and see how they appear?
Options for real-time analytics:
Google Realtime Analytics - Built into Google Analytics. To use it, log in to your Google Analytics account and select your web property. Then, on the left-hand side, click Real-Time -> Overview.
GoAccess - If you have access to your web server log files, this will give you a real-time view of traffic and does not require anything more than console access. The really nice thing about GoAccess is that it does not rely on any third-party services and can either run in real time or generate reports. To use GoAccess, first install it on your server, either with the package manager or into a local directory from the official Git repository. Then, if you are running a standard Apache configuration, just run the executable with:
# goaccess -f /var/log/apache2/your-website-access.log -a
If you are running a non-standard Apache log configuration (or another web server entirely), then you have to give GoAccess a description of your log file. This can be done in the ~/.goaccessrc file. Refer to the GoAccess documentation for the specific descriptors used to build a string that describes your log file lines; a sketch is shown below.
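As a reference point, Apache's standard "combined" format would be described like this (mirroring the predefined COMBINED format from the GoAccess docs; directive names have varied between GoAccess versions, so check man goaccess for your release - a custom format follows the same specifier scheme):

date-format %d/%b/%Y
time-format %H:%M:%S
log-format %h %^[%d:%t %^] "%r" %s %b "%R" "%u"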
There are also a host of other SaaS options like Clicky, GoSquared, or Piwik (which is open source).
I have the code for my Chrome extension on GitHub, and I want to publish it on the Chrome Web Store. Doing it manually once is fine, but I want to build an automated flow where, as soon as any commit lands on a release branch, the extension on the Chrome Web Store is also updated. Is there any documentation, by any developer or by Google, that explains how to set this up for my Chrome extension?
There are a few ways to do this:
by using an npm module (there is an article about that) - this one suits you: you can set up a script using the module, then add a hook for Travis CI
by using the Web Store API (for additional reading)
by using Docker (for additional reading)
I suggest using GitHub Actions to automate publishing. High-level steps are:
Building and packing your extension into a zip file.
Obtaining an access token for the Google API (action), using a clientId, clientSecret, and refreshToken (how to get them: Docs, Article).
Uploading the zip as a new version to the Web Store using the API (action).
Once the uploaded version has been reviewed, publishing it (action).
However, there are some pitfalls in this process, such as undocumented responses from the Google API, the need to repeat the upload if it happens shortly after the previous one, and refresh token expiration. If you want to build a convenient and robust workflow based on GitHub Actions that handles all these cases, I can recommend reading this series of articles. A rough sketch of the API calls is shown below.
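To make the moving parts concrete, here is a minimal sketch in plain Node.js (18+, using the built-in fetch) against the documented Chrome Web Store API v1.1 endpoints. The environment variables are placeholders you would store as repository secrets, and real code needs the retry and error handling mentioned above:

import { readFile } from 'node:fs/promises';

// Exchange the long-lived refresh token for a short-lived access token.
async function getAccessToken() {
  const res = await fetch('https://oauth2.googleapis.com/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      client_id: process.env.CLIENT_ID,
      client_secret: process.env.CLIENT_SECRET,
      refresh_token: process.env.REFRESH_TOKEN,
      grant_type: 'refresh_token',
    }),
  });
  return (await res.json()).access_token;
}

async function uploadAndPublish(zipPath) {
  const token = await getAccessToken();
  const headers = { Authorization: `Bearer ${token}`, 'x-goog-api-version': '2' };
  const id = process.env.EXTENSION_ID;

  // Upload the zip as a new draft version of the existing item.
  await fetch(`https://www.googleapis.com/upload/chromewebstore/v1.1/items/${id}`, {
    method: 'PUT',
    headers,
    body: await readFile(zipPath),
  });

  // Ask the Web Store to publish the draft (it still goes through review).
  await fetch(`https://www.googleapis.com/chromewebstore/v1.1/items/${id}/publish`, {
    method: 'POST',
    headers,
  });
}

uploadAndPublish('extension.zip').catch(console.error);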
I am trying to build a simple "sync" feature. Let me describe my problem with a simple example:
Let's say there is a file named lorem.txt on Google Drive (in the appFolder). What happens if, at the same time, two different devices (belonging to the same user with the same Google account) access the file and try to read or change its contents or name? Let's say device A is supposed to make a change to the contents while device B is trying to do the same at the same time. The file is accessed through its unique ID.
How do I get around this problem? Is there built-in functionality, or measures in place, to avoid simultaneous write access to a file (like some sort of file lock)?
I am specifically looking for a solution using the Google JS SDK or plain JS (with authentication tokens, of course). Does the Drive server return some sort of error status code (like 404) when such a scenario happens?
UPDATE:
Based on these previously asked questions (asked in 2017 and 2012 respectively):
Android Drive Api Lock File while in use
and
Can I set a lock on files with google api?
I think it is fair to assume that there is no such functionality built into Google Drive.
I will leave this question as a reference for others.
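Since there is no built-in lock, the closest client-side workaround is an optimistic check. Below is a minimal sketch in plain JS against the Drive v3 REST API; fileId, accessToken, and the conflict handling are placeholders, and note that this only narrows the race window between two devices, it cannot close it:

const API = 'https://www.googleapis.com/drive/v3/files';
const UPLOAD = 'https://www.googleapis.com/upload/drive/v3/files';

// Drive keeps a monotonically increasing version counter for every file.
async function getVersion(fileId, accessToken) {
  const res = await fetch(`${API}/${fileId}?fields=version`, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  return (await res.json()).version;
}

async function saveIfUnchanged(fileId, accessToken, expectedVersion, newText) {
  // Re-check right before writing; a concurrent writer bumps the version.
  if (await getVersion(fileId, accessToken) !== expectedVersion) {
    throw new Error('Conflict: the file was changed on another device');
  }
  await fetch(`${UPLOAD}/${fileId}?uploadType=media`, {
    method: 'PATCH',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'text/plain',
    },
    body: newText,
  });
}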
I would like to automate web tasks that would normally be done in a web browser, for example uploading a video to a certain video-sharing website. I would like to make a pure command-line program built on simple cURL calls, because those calls are easy to capture, for example in Firebug or the Chrome/Chromium dev console's Network pane. So I would rather not use libcurl or similar libraries. I would prefer to program in Ruby.
The task is straightforward: I upload a video while watching the dev tools' Network pane, tracking the communication between the browser and the server. I copy POST and GET requests via the "copy as cURL" menu, apply some modifications to the copied cURL command, e.g. removing some header lines that send cookies and substituting them with the cookies in a "cookie jar" text file (the -c option in cURL), and later send the needed cookies by applying that text file again (the -b option in cURL). In the past I managed to make such Ruby scripts, and they simply work: I can use those websites' services from the pure command line, so I can upload files from my VPS, which is very fast, unlike uploading from my home machine.
Unfortunately, the website I want to automate applies a lot of redirects even at the login stage (for example, 4 consecutive redirects), which aren't tracked by the Chrome dev tools, so I can't see what really happens, when the needed cookies are stored, and which request is responsible for getting those cookies. Sometimes tricky JavaScript calls are used by the website to store a cookie that is needed for the video upload and even for exporting the video.
So my question is: besides the Chrome dev tools and Firebug, is there any automated and handy tool that can help with achieving similar tasks?
Maybe BrowserAutomationStudio will help:
https://bablosoft.com/shop/BrowserAutomationStudio
This program can record your browser actions and replay them as a standalone bot.
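Alternatively, if the main obstacle is just seeing which hop in the redirect chain sets which cookie, a small script can follow the redirects manually and log the headers. A minimal sketch using Node 18.17+ and its built-in fetch (the login URL is a placeholder):

async function traceRedirects(url, maxHops = 10) {
  for (let hop = 0; hop < maxHops; hop++) {
    // redirect: 'manual' returns the raw 3xx response instead of following it.
    const res = await fetch(url, { redirect: 'manual' });
    console.log(`[${hop}] ${res.status} ${url}`);
    // getSetCookie() yields each Set-Cookie header separately.
    for (const cookie of res.headers.getSetCookie()) {
      console.log('    Set-Cookie:', cookie);
    }
    const next = res.headers.get('location');
    if (!next) break; // no Location header, so this is the final response
    url = new URL(next, url).toString(); // resolve relative redirect targets
  }
}

traceRedirects('https://example.com/login').catch(console.error);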
I have an issue.
I use Google Analytics in a small project, and I develop on localhost. Of course I call my site again and again on my local machine, but this pollutes my analytics stats.
What can I do?
That's quite simple.
You could filter it out in Analytics, but the option jungle in Analytics is a bit confusing, in my opinion.
Alternatively, you can just do it in your JavaScript:
if (window.location.hostname != 'localhost' && window.location.hostname != '127.0.0.1') {
    // put your analytics here
}
Instead of hard-coding your Google Analytics key inside the rendered HTML, I suggest putting it in a configuration file.
It's typical to have several config files - at least one for the production app and one for a developer setup. Then in the developer settings you can disable GA altogether or just put a different key (for a testing site, so that you can e.g. check if your event tracking works properly).
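For instance, a minimal sketch of the client side, assuming the server injects a per-environment value such as window.GA_TRACKING_ID into the rendered page (the name is hypothetical; in the developer config it would simply be left unset):

if (window.GA_TRACKING_ID) {
    // Only boot analytics.js when the current environment provides a key.
    ga('create', window.GA_TRACKING_ID, 'auto');
    ga('send', 'pageview');
}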
Another option is to use the Google Analytics debug version and disable the network part, so that you can preview the hits in the console without actually registering them:
// Nulling out sendHitTask prevents hits from actually being sent to Google.
ga('set', 'sendHitTask', null);
I'd rather avoid checking for localhost in JavaScript. What if another developer uses localhost:8000? What if you deploy to a staging server? Config files are more reliable and flexible.
I have a Chrome packaged app which saves files with the chrome.syncFileSystem API.
Now I want to develop a web app and fetch these files from the web. Is there a way to do this?
Thanks.
You can use the Google Drive API to access your Google Drive, and you will find the files there. You can discover the location empirically but, as it is not documented, it could change at any time. Google and others (e.g., Apple) have a habit of moving formerly easily visible files into obfuscated places.
To get you started, simply access your Google Drive with their UI (web page or Chrome app), and you'll find the synced files. You may have to hunt around a bit with the links on the left of the Google Drive web page.
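Once you have spotted where the synced files live, a plain Drive v3 files.list query is enough to fetch them programmatically. A small sketch, assuming an OAuth token with a Drive scope (the file name and token are placeholders):

async function findSyncedFiles(accessToken) {
  const q = encodeURIComponent("name contains 'myfile'");
  const res = await fetch(
    `https://www.googleapis.com/drive/v3/files?q=${q}&fields=files(id,name,parents)`,
    { headers: { Authorization: `Bearer ${accessToken}` } },
  );
  return (await res.json()).files;
}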