4: Avoiding Stumbling Points

A significant amount of investment has been made in web analytics tools in recent years, but the author notes that not all tools are good tools, and there's little value in performing analysis if you're measuring the wrong things, in the wrong way, and taking the wrong action (or none at all). As such, even firms that have extensive analytical capabilities in place should periodically pause to assess whether those capabilities are creating value for the firm - and, if the answer is "no," to address the problem rather than abandon analytics.

Analytics Intervention

The author borrows the psychological term "intervention," which involves identifying and changing patterns of behavior that are destructive (even non-productive is destructive in the business world, as it consumes resources), largely by identifying the behaviors and getting the subject to realize a need to change.

The necessary first step is to identify the problem and get a company to acknowledge it. This is generally obvious when a company seems indifferent to analytics and dismissive of it: they indicate that the tools don't deliver the data needed, or the technology they are using makes tracking impossible to do properly. The author goes into quite some detail about both of these:

A lack of faith in the tool is often the fault of vendors, each of whom insists that what you are using is wrong, what they are selling is better, and the last version isn't good enough. A very high number of firms (67%) have switched tools in the past five years - 48% have used three or more, and 11% have even used more than five. Ironically, most of the top-tier tools offer the same basic data analysis, with different reporting capabilities - the functional differences are negligible.

(EN: The root problem here is not the tool, but the way that the data is used. Unless someone within the organization understands the output and how to apply it to the business, the tool used is pointless. In less flattering terms, someone who can't play a piano won't become a virtuoso if you buy them a different piano.)

The notion of the "untrackable site" is also a myth, but one that is widely accepted. There are specific technologies that stymie traditional analytical tools (a Flash-based application is one "file," and the user's interactions within it are not noted in the server logs), but this does not make it impossible to track (the Flash application can be built to create a log that enables you to analyze user data). This is often met with the most common of IT excuses - "we can't do that" (security, performance, or other "issues" that prevent them from doing what is needed) - but those, too, are excuses. The ability to analyze user data should be a core business requirement, and a solution that cannot yield meaningful data should be rejected or replaced as unsuitable for business use.
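(EN: To make the point concrete, here is a minimal sketch - in Python, with invented paths and event names - of the kind of collection endpoint an interactive application can post its own events to, so that in-application behavior ends up in an analyzable log rather than appearing as a single file request.)

    # Minimal event-collection endpoint (hypothetical): an interactive client
    # (a Flash app then, a single-page app today) POSTs its own interaction
    # events here, since the server log otherwise records only the initial load.
    import json
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    LOG_PATH = "interaction_events.log"  # assumed log destination

    class EventCollector(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            try:
                event = json.loads(self.rfile.read(length) or b"{}")
            except json.JSONDecodeError:
                self.send_response(400)
                self.end_headers()
                return
            # Record the event with a server-side timestamp so interactions
            # inside the application become analyzable, just like page views.
            record = {"received_at": time.time(), **event}
            with open(LOG_PATH, "a") as log:
                log.write(json.dumps(record) + "\n")
            self.send_response(204)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), EventCollector).serve_forever()

(EN: The client would post something like {"event": "checkout_step", "step": 2} at each meaningful interaction; the names here are illustrative, not any particular vendor's API.)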

The root cause of analytics "problems" is usually not related to technology, but to organizational culture. The organization is not clear on what activities produce value, whether through lack of objective analysis of operations (they have the wrong idea about what is important) or clinging to an inaccurate or outdated mode of doing business (they know what used to work, but it doesn't work anymore). Instead of admitting this, decision-makers will blame the tools, the customers, the employees, or anyone but themselves for their failure to produce meaningful results.

The author returns to the question-mode: Do you have success metrics? Are you confident in the accuracy of data? Does your analysis identify specific and actionable problems? Do you develop models to test new ideas? Are these metrics used to drive employee behavior? (EN: a few he left out: Do your metrics have a causal connection to profitability? Can you demonstrate a correlation between performance "by the numbers" and success at your company's goals?)

Overcoming Issues

The author provides a list of twelve common issues and suggests ways to address them.

Issue #1: Lack of Established Processes and Methodology

In many companies, data is collected and compiled into a report. The report is distributed, and it might be read. But that's all: the data doesn't indicate anything about the business, nor is it used to drive any decisions about the business. It's just trivia, and it doesn't really matter.

(EN: My sense is that many companies that buy off-the-rack solutions fall into this category: they sense they should be measuring something, but don't know what, and rely on the software to give them an answer. As a result, they get a rich report full of granular details that have no impact.)

If this is the case, the problem is cultural: the business does not operate "by the numbers" and considers metrics to be a largely meaningless waste of time. In this instance, the analyst must sell the notion of metrics by demonstrating a connection to something the company values (increased revenue, decreased expense) so that decision-makers can make the connection.

Issue #2: Failure to Establish Proper KPIs and Metrics

Another common issue is that metrics are defined, but they are not "proper" - specifically, the company is measuring the wrong things and driving the wrong behaviors. For example, a firm might seek to assess the number of visitors to their site and conduct an advertising campaign to improve that number, but doing so leads to no increase in revenue.

This is a bit different from the preceding issue, in that the company does pay attention and take action, but it pays attention to the wrong things and takes the wrong actions as a result. The solution is somewhat similar: the analyst must demonstrate that the current metrics are not suitable, suggest different methods, and prove their accuracy.

Issue #3: Inaccuracy

Another problem is that the reports produced are inaccurate, either as a result of garbage data or bad analytical methods. The author likens this to trying to sail with a bad GPS device - and you'd be right to ignore it and instead navigate by the stars. (EN: an interesting point, but people have developed an almost religious reverence for technology and tend to cling to it, insisting the technology is right in spite of the evidence before their very eyes, and the results can be damaging.)

The problem is more widespread than might be expected: the author gives examples of respected companies whose metrics were ludicrous: overestimating site visitors by 400% (assuming anyone who loads a file has visited the site), counting only 30% of the actual visitors (by excluding those who don't wait for pages to completely load), and grossly underestimating engagement (going with an average number of pages viewed, including the large number of users who bounce out immediately). And in the end, bad analytics lead to bad reporting, which leads to bad decisions based on false information.

The issues aren't failures of technology, but failures of the analyst to develop an accurate method of measurement in spite of technology. There are a number of issues that complicate data collection - privacy software that deletes cookies, internal visitors to a company site, caching servers and dynamic proxies, bad HTML/JavaScript code, etc. - that an analyst cannot eliminate, but should at least take into consideration when determining how to analyze data.

The solution is largely within the analyst's demesne: he must understand how the figures in the reports are calculated, identify factors that could be corrupting or skewing the output, and take measures to improve the data-collection and data-analysis tools. (EN: And never put complete trust in an off-the-rack solution to do this right - but use the same process as you would for any home-grown solution: how is it calculated and what might skew the output. If you get good answers, it's a good tool. If you don't know, then you're taking quite a risk.)
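(EN: A minimal sketch of that kind of sanity-checking, with an assumed log format and invented internal address ranges: strip out asset requests, internal traffic, and obvious bots before counting, and group page views into visits rather than treating every file load as a visitor.)

    # Sketch of the filtering that keeps raw hits from inflating visitor counts.
    # Assumed input: access-log lines already parsed into dicts elsewhere, e.g.
    # {"ip": "203.0.113.7", "path": "/cart", "time": datetime(...), "user_agent": "..."}
    from datetime import timedelta

    ASSET_SUFFIXES = (".css", ".js", ".png", ".jpg", ".gif", ".ico")
    INTERNAL_PREFIXES = ("10.", "192.168.")      # assumed internal address ranges
    VISIT_TIMEOUT = timedelta(minutes=30)        # conventional session gap

    def is_page_view(hit):
        """Count only genuine page requests, not every file the page pulls in."""
        if hit["path"].lower().endswith(ASSET_SUFFIXES):
            return False
        if hit["ip"].startswith(INTERNAL_PREFIXES):   # employees browsing the site
            return False
        if "bot" in hit.get("user_agent", "").lower():
            return False
        return True

    def count_visits(hits):
        """Group page views by visitor, splitting on 30 minutes of inactivity."""
        last_seen, visits = {}, 0
        for hit in sorted(filter(is_page_view, hits), key=lambda h: h["time"]):
            previous = last_seen.get(hit["ip"])
            if previous is None or hit["time"] - previous > VISIT_TIMEOUT:
                visits += 1
            last_seen[hit["ip"]] = hit["time"]
        return visits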

Issue #4: Data Overload

The author refers to David Shenk's concept of "data smog" - a situation in which so much data is available that it's difficult to tell what, among a great volume of information, actually deserves attention. It's a serious problem in "by the numbers" organizations, in that they seek to grab all the information they can for fear of missing something important, which ironically has the opposite effect (it's not missed, just obscured).

This problem is only going to get worse, as more data becomes available, and the marketing emphasis in analytical tools is on how much data is produced: if a solution spits out more numbers than the competition, or more than its previous version, then the salesmen will try to convince buyers that it is "better" and worth buying - even though the additional features contribute no real value.

Data smog can be avoided, or significantly decreased, by focusing on real-world phenomena to identify the data that is actually relevant. If you want to get more users through the check-out flow, chances are the amount of traffic to the executive biographies is of little importance.

Second, focus on actionable data. There's a lot of trivia in numbers and things that are "interesting" to observe - but a figure is not useful unless some action can (and likely will) be taken in response to the metric or in order to improve it.
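(EN: As an illustration of "actionable over interesting," a small sketch - step names and event format invented - that reports only where users abandon the checkout flow, a number someone can actually act on.)

    # One actionable report instead of "data smog": where do users abandon the
    # checkout flow? Input: (visitor_id, step_name) tuples from the tracking log.
    CHECKOUT_STEPS = ["cart", "shipping", "payment", "confirmation"]

    def funnel_report(events):
        reached = {step: set() for step in CHECKOUT_STEPS}
        for visitor, step in events:
            if step in reached:
                reached[step].add(visitor)
        report = []
        for prev, step in zip(CHECKOUT_STEPS, CHECKOUT_STEPS[1:]):
            entered = len(reached[prev])
            continued = len(reached[step] & reached[prev])
            drop = 1 - (continued / entered) if entered else 0.0
            report.append((prev, step, f"{drop:.0%} drop-off"))
        return report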

Issue #5: Inability to Monetize the Impact of Changes

One of the chief problems of analysis is that it is backward-facing. It analyzes historical data (even if "history" is ten seconds ago) which, by its nature, is more useful in identifying problems with existing operations than in identifying opportunities to innovate. And as a result, the reliance on simple analysis leads attention and budget to cost reduction rather than service improvement.

To managers who think in terms of money, it's helpful for analysts to assign a dollar-value to interactions and explain the effect of changes on revenue and expenses to those who can't make the connection for themselves. This is considered in greater detail later (chapter six).
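(EN: A back-of-the-envelope sketch of that kind of monetization, with all figures invented: translate traffic and conversion into revenue per visit, then express a proposed change as dollars per month.)

    # Back-of-the-envelope monetization (all figures illustrative): translate a
    # proposed change into dollars so decision-makers can weigh it.
    monthly_visits = 200_000
    conversion_rate = 0.02            # orders per visit
    average_order_value = 85.00       # dollars

    revenue_per_visit = conversion_rate * average_order_value     # $1.70
    baseline_revenue = monthly_visits * revenue_per_visit         # $340,000/month

    expected_lift = 0.05              # assume a 5% conversion improvement
    projected_gain = baseline_revenue * expected_lift             # $17,000/month
    print(f"Projected monthly gain: ${projected_gain:,.0f}")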

Issue #6: Inability to Prioritize Opportunities

Difficulty in setting priorities among multiple opportunities may stem from the difficulty of translating online behavior into monetary results. As such, all opportunities and problems are regarded as being equal, and decisions on what to do are based on arbitrary criteria (first-in first-out, whatever costs least to do, or whatever tickles the decision-maker's fancy).

If time and money were no object and everything could be done, there would be no issue. But the reality of constraints means that some things can be pursued and others can't - and the decision is critical: devoting resources to things that matter less, or denying them to things that matter more, is a serious concern.

Good analysis can also suggest the magnitude of problems. Especially if problems can be monetized, the dollar-impact can identify what is most valuable, and help to draw the line where the benefit of solving a problem is less than the cost of the solution.
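(EN: A simple sketch of prioritizing by dollar-impact, with hypothetical numbers: rank candidate fixes by estimated benefit minus cost, and drop anything where the solution costs more than the problem.)

    # Prioritizing by monetized impact (all numbers hypothetical): rank candidate
    # fixes by estimated annual benefit minus cost, and flag anything underwater.
    candidates = [
        {"name": "fix checkout error message", "annual_benefit": 120_000, "cost": 15_000},
        {"name": "redesign executive bios page", "annual_benefit": 2_000, "cost": 30_000},
        {"name": "speed up product search", "annual_benefit": 60_000, "cost": 25_000},
    ]

    ranked = sorted(
        (dict(c, net=c["annual_benefit"] - c["cost"]) for c in candidates),
        key=lambda c: c["net"],
        reverse=True,
    )
    for c in ranked:
        verdict = "worth doing" if c["net"] > 0 else "not worth the cost"
        print(f'{c["name"]}: net ${c["net"]:,} -> {verdict}')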

(EN: This is sensible, but returns to the former problem: resources are devoted to things that are easy to measure, rather than to things that matter more but do not yield to analysis - hence efficiency smothers innovation because it yields good data and is easy to analyze.)

Issue #7: Limited Access to Data

Many companies still hope to control access to information as a method of protecting "secrets," regardless of whether the information is actually sensitive. The need-to-know approach keeps information out of the hands of employees who need it and often grants it to those who don't.

The example given is of an unnamed company that restricted information by requiring users to have an "account" to see Web site analytics. 85% of the users who were given an account hadn't bothered to log in during the previous six months. (EN: There is no indication of the number of employees who could have benefitted from the information but were denied access, as that does not yield to statistical analysis.)

Certain positions within the organization have a need for information related to Web site usage (product managers, designers and copywriters, executives, operations and marketing managers, etc.) and it is worthwhile to give them not only access, but also training to extract useful information.

But beyond that, the information should be open to anyone. Rather than having a need-to-know policy, have an open-door policy, with the burden of proof falling on those who would restrict, rather than permit, access to any information.

(EN: No indication of how to address this problem; having worked at a few organizations where information was kept under lock and key from anyone who couldn't prove their need for it - or even from many who could, but were not in political favor with the gatekeepers - I will attest that changing the culture of deprivation is extremely difficult.)

Issue #8: Inadequate Data Integration

The notion of systems integration has been around for a while, and the notion of data warehousing is a component of the trend - but it has made very slow progress. As a result, much data is still frozen in silos, and it's often impossible to get a complete portrait of the customer experience (not just cross-channel, but even across the systems that exist within a given channel - such as knowing how many customers who ordered products that were out of stock called back a month later to try to order them again).

To the author's thinking, all data pertaining to any individual who interacts with the company in any way should be consolidated and available to cross-reference. Without this capability, analysis is limited to granular activities (the behavior on the Web site) that cannot be correlated with any other activity (whether the customer received a brochure, returned a product, responded favorably to a survey, etc.)
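(EN: A minimal sketch of what such cross-referencing looks like once data is consolidated - schemas and event names invented - joining Web-site events to order records on a shared customer id, e.g. to find customers who hit an out-of-stock page and later placed an order.)

    # Sketch of cross-referencing siloed data (schemas assumed): join web-analytics
    # events with order-system records on a shared customer id, so site behavior
    # can be related to what the customer actually did afterward.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE web_events (customer_id TEXT, event TEXT, occurred_at TEXT);
        CREATE TABLE orders     (customer_id TEXT, sku TEXT, status TEXT, ordered_at TEXT);
    """)
    # ... each silo's export would be loaded into these tables here ...

    # Customers who viewed an out-of-stock page and later placed an order:
    query = """
        SELECT DISTINCT w.customer_id
        FROM web_events w
        JOIN orders o ON o.customer_id = w.customer_id
        WHERE w.event = 'out_of_stock_view'
          AND o.ordered_at > w.occurred_at
    """
    returning_customers = [row[0] for row in conn.execute(query)]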

The author suggests that companies should undertake the effort of identifying all data that can be gathered and moving it to a central location where it can be analyzed. He concedes that it's easier said than done. (EN: I recall reading a different book about data mining - and it is indeed a massive concept, and companies have a very long way to go to get their act together.)

Issue #9: Starting Too Big

It's not uncommon for executives to hear of a trend (data analysis, ERP, SCM, etc.), then go into a panic and attempt to buy a ready-made solution that will instantly catch them up with where their competitors allegedly already are. It's media sensationalism and marketing scare-tactics, and it remains very effective. The problem, of course, is that it's a massive expense undertaken in a mad panic, a combination that generally results in huge disasters (that need to be blamed on someone else).

Web analytics is subject to bouts of the same mania, and it's especially difficult for companies that are seeking to "jump into the pool" all at once. There may be no defense against executive whim, but the analyst can resist the urge to measure everything, all at once, by taking a more incremental, moderate, and sane approach to defining meaningful metrics.

Issue #10: Failure to Tie Goals to KPIs

The key performance indicators may be well-defined, and the solution may be developed to measure phenomena that are closely related - but unless the information is used in the context of strategy (to drive or record progress against a goal), it's merely a report card. In effect, the company knows what is important, but does not pursue it.

To sell the value of analysis, the analyst should seek to help decision-makers define metrics that can be associated with their plans, with reporting that will assist them in providing proof-of-progress (or at least feedback on action, even if it doesn't constitute progress) so that reports become a tool used to measure progress and (in time) to set goals.
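(EN: An illustrative sketch - goals, KPIs, and targets all invented - of tying each goal to a measurable indicator and a target, so the report reads as progress against a plan rather than as a report card.)

    # Illustrative goal-to-KPI mapping: each business goal gets a measurable
    # indicator and a target, so reporting shows progress against the plan.
    goals = {
        "grow online revenue":  {"kpi": "revenue_per_visit", "target": 2.00, "current": 1.70},
        "reduce support costs": {"kpi": "self_service_rate", "target": 0.60, "current": 0.52},
        "improve checkout":     {"kpi": "cart_completion",   "target": 0.35, "current": 0.28},
    }

    for goal, m in goals.items():
        progress = m["current"] / m["target"]
        print(f'{goal}: {m["kpi"]} at {m["current"]} vs. target {m["target"]} ({progress:.0%} of target)')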

Issue #11: No Plan for Acting on Insight

A similar problem is failure to act on discoveries, because there was no plan to do so. The author cites a research survey from 2005 that indicated 53% of companies report that "acting on the findings" of analysis was a problem. In such companies, plans are in place and budgets allocated, and if something "comes up" from Web analytics, there are no resources to take action until the next budgeting cycle, by which time it may be too late.

The first step toward being more responsive is to obtain for the Web department an ad-hoc budget that is earmarked for "optimization" based on findings. It can be a relatively small amount at first, and as you are able to demonstrate your ability to make productive use of it (costs saved or revenue earned as a result of being able to act in real time), it will be easier to "sell" the notion of getting more budget and, eventually, to tap into the discretionary sources of funds reserved for "contingencies" or "emergencies" that never seem to actually occur.

Issue #12: Lack of Sustained Support

Especially in companies that adopted analytics because it seemed trendy at the time, there is the tendency for support to decrease when the novelty wears off, and those who are fascinated with trends have moved on to other things.

If possible, this can be addressed in the project that establishes or makes significant enhancements to web analytics: insist on at least one full-time employee devoted to the task, who can remain interested in keeping it alive and growing its use. (EN: This can be a tough sell when the salesman is pitching his product as self-sustaining and needing no staff resources.) Not only does analytics need staff to perform the tasks, but it also needs ongoing executive support to champion it as needed.

You can't force people to pay attention to analytics, but if you can show its value and relevance to business operations, it then becomes a resource that people become accustomed to having at their disposal. They may not fight to get it, but once it's in place and proves its mettle, some at least will fight against the prospect of having it taken away.