As much as we would like to have precise data on how users interact with devices, we are not there yet. This is not an in-depth study of analytics methodology, but rather a statement about how I see web analytics being used and some of the apples-to-oranges comparisons I have seen.
To those who practice measuring user behavior online, this may seem obvious. To those who rely on reports about user behavior online, it may at first be an unsettling opening statement. I have recently come across a few instances where the definitions and nomenclature used by technical people and marketing people simply do not match. This has led to confusion about what we are measuring, why we are measuring, and how we measure.
The Business of Data Collection Has Changed
Online behavior analytics has matured greatly over the past ten years. Web architectures have also evolved, from mostly three-tiered models to n-tiered service-oriented architecture (SOA) models. In addition to the shift from dedicated servers to more distributed cloud models, we have rapidly evolving user interfaces (UI) that incorporate more complex code. These are not the typical a-then-b-then-c pages, but rather a single page with JavaScript (or whatever your code preference is) containers that load content into the page, making it very hard to track what the end user is doing.
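To make that concrete, here is a minimal sketch of the pattern, assuming a hypothetical /fragments/ endpoint that returns HTML snippets. Unless the page explicitly sends a "virtual" pageview (the gtag('event', 'page_view', ...) call below follows Google Analytics' documented pattern, but treat the exact parameters as an assumption), a tag that only fires on full page loads sees a single view for the whole session.

```javascript
// Minimal single-page pattern: content is swapped into one container,
// so the browser never requests a new page and no new pageview is logged.
async function loadSection(name) {
  const response = await fetch(`/fragments/${name}.html`); // hypothetical fragment endpoint
  document.querySelector('#content').innerHTML = await response.text();

  // Without an explicit "virtual pageview" like this, the analytics tag
  // never learns that the user moved on to another section.
  if (typeof gtag === 'function') {
    gtag('event', 'page_view', {
      page_location: `${location.origin}/${name}`, // assumed Google Analytics parameters
      page_title: name,
    });
  }
  history.pushState({ section: name }, '', `/${name}`);
}

// Intercept in-page navigation links instead of letting the browser reload.
document.querySelectorAll('a[data-section]').forEach((link) => {
  link.addEventListener('click', (event) => {
    event.preventDefault();
    loadSection(link.dataset.section);
  });
});
```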
Fuzzy Numbers: Uncertainty and the Observer Effect
Measuring user behavior online is not as easy as pasting a tag into a webpage. Google Analytics and WebTrends are two of the tools I use on a regular basis, and I frequently see both tags on the same page. This creates two problems borrowed from quantum mechanics (QM): the uncertainty principle and the observer effect.
The uncertainty principle in QM applies to particle physics and is also known as the Heisenberg principle. Basically, it states that the more precisely we measure a particle's position, the less precisely we can measure its momentum, and vice versa. The analog I see in measuring online user behavior is that, depending on the technique used, the more we measure one aspect, the less we capture of others. Analysis using a click-tracking service will certainly capture granular data about users that server logs might miss, but we will also be sampling users rather than capturing 100% of them. The reasons might seem obvious if we observe our own experience. I often use "incognito" mode in Chrome to avoid being bombarded by ads related to my browsing activity, and I am fairly certain it works, since I only get ads in content when I don't use incognito mode.
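One rough way to put a number on that sampling gap is to compare what the tag reported with what the web server's own access log recorded over the same period. The sketch below is a minimal Node.js illustration under assumed inputs: access.log is a standard combined-format log and tagPageviews is a figure exported from the analytics tool, so the filter is deliberately crude.

```javascript
// Rough estimate of how many pageviews the JavaScript tag never saw,
// by comparing tag-reported counts against the server's own access log.
// The file name, log format, and exported tag count are all assumptions.
const fs = require('fs');

const tagPageviews = 8421; // hypothetical: pageviews reported by the analytics tag

// Crude filter: successful GET requests in a combined-format access log.
const logLines = fs.readFileSync('access.log', 'utf8').split('\n');
const logPageviews = logLines.filter(
  (line) => line.includes('"GET ') && / 200 /.test(line)
).length;

const missedShare = 1 - tagPageviews / logPageviews;
console.log(`Log pageviews: ${logPageviews}`);
console.log(`Tag pageviews: ${tagPageviews}`);
console.log(`Share the tag never observed: ${(missedShare * 100).toFixed(1)}%`);
```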
The observer effect is much easier to explain. Using the same click-tracking methodology mentioned above, we can make some assumptions: adding code to a page increases latency, and adding links to external code for tracking or font APIs increases latency.
These two effects can be measured by simple sampling of pages that have tracking code and pages that don't, as in the sketch below. One would hope that the added latency is not noticeable to the end user. Of course we need click-tracking as part of our toolkit when analyzing online user behavior, but it should be seen as one tool serving a larger purpose.
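A simple way to take that sample in the browser is the Navigation Timing API: record the load timings for the current page and compare them across otherwise identical pages served with and without the tracking snippet. The /timing endpoint and the script URL patterns used to flag a tagged page are assumptions for illustration.

```javascript
// Sample page timing so pages with and without tracking code can be compared.
// Uses the standard Navigation Timing entry; the /timing endpoint and the
// script URL patterns used to detect tags are assumptions.
window.addEventListener('load', () => {
  // Wait one tick so loadEventEnd has been recorded.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType('navigation');
    if (!nav) return;

    const sample = {
      page: location.pathname,
      domContentLoadedMs: Math.round(nav.domContentLoadedEventEnd),
      loadMs: Math.round(nav.loadEventEnd),
      hasTrackingTag: Boolean(
        document.querySelector('script[src*="googletagmanager"], script[src*="webtrends"]')
      ),
    };

    // sendBeacon queues the request even if the user navigates away.
    navigator.sendBeacon('/timing', JSON.stringify(sample));
  }, 0);
});
```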
Why Are We Measuring Things?
This is one of the questions I find myself asking quite a bit. One of my favorite books on anything to do with the web is Jesse James Garrett's The Elements of User Experience. The illustration in one of the early chapters states pretty clearly that the first question in any web-based project should be about user needs and site objectives. Is that question consistently part of every engagement between web analysts and their business customers? In my fifteen years of experience, the answer is "no".
Most of the time I deliver a report on the analytics of a page or pages and no one asks me why. I often offer an opinion backed up by multiple sources, including script-based, alert-based, and log-based data. I am hoping to blog more specifically on my new adventures in measuring online user behavior. The market is mature and the tools are light years from where we started.
The fuzzy numbers we see are the result of the accuracy of our tools and the analyst's skill in building a defensible methodology. All of this needs to begin with the question: "Why are we measuring this, and what outcome do you, the stakeholder, want from the end user?"
More later...
Original article: https://www.linkedin.com/pulse/fuzzy-numbers-observations-measuring-user-behavior-hare-gaiq-apm-?trk=mp-reader-card