3: What "Culture of Analysis" Means
Analytics are of limited value to a company that puts little credence in data. While contemporary schools of management tend to focus on managing "by the numbers," there are still individuals and organizations that run on gut feel and top-down decisions based on the intuition of experienced people. This is a cultural problem, and a difficult one to overcome.
Based on an examination of a number of businesses that have adopted performance marketing, and a comparison with companies that have resisted the idea, the author has identified a handful of distinguishing factors. Companies that adopt a culture of analysis tend to use specific goals to drive decision-making, allocate budget based on the projected return of projects, use data-driven standards to measure success and monitor performance, and are more likely to segment their markets.
The author goes into greater detail on some of these qualities:
Data-driven decision-making stands in stark contrast to more authoritarian decision-making, where management rules by whim and seniority, rather than the merit of the work, determines priority. This is a difficult cycle to break, because it depends on those who currently hold power (typically, high-level employees who have invested considerable effort in rising to a position of power) being willing to abdicate their considerable influence and allow smaller fish to hold sway when they have the numbers to back their plans.
There are also subtler ways of running an organization by arbitrary means. For example, a unit that handles projects in the order received (FIFO) may be overlooking that the financial impact of a "later" project may be greater than an earlier one, hence it will have a greater ROI if worked sooner. (The principle of "dynamic prioritization" involves assessing the value of projects as they arrive and reordering the queue accordingly, so that those with the best potential aren't made to wait in line.)
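(EN: As a rough illustration of dynamic prioritization, not something from the book: the sketch below keeps a backlog ordered by projected ROI rather than by arrival order. The project names and ROI figures are hypothetical.)

    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Project:
        # Sort key is the negated projected ROI so the min-heap pops the highest-ROI project first.
        sort_key: float = field(init=False, repr=False)
        name: str = field(compare=False)
        projected_roi: float = field(compare=False)  # e.g. 0.40 = 40% projected return

        def __post_init__(self):
            self.sort_key = -self.projected_roi

    backlog = []  # a FIFO shop would simply work a plain list in arrival order

    # Projects are re-ranked against the backlog as they arrive, not queued at the back.
    for name, roi in [("site redesign", 0.12), ("checkout fix", 0.40), ("newsletter", 0.08)]:
        heapq.heappush(backlog, Project(name, roi))

    while backlog:
        nxt = heapq.heappop(backlog)
        print(f"Work next: {nxt.name} (projected ROI {nxt.projected_roi:.0%})")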
Likewise, resource planning tends to be flexible: budget, personnel, and capital resources are not tied to a specific department, but can be shifted to where they are needed, when they are needed.
This also requires the business cycle to be short. If budgets are set once a year, they can only be changed once a year; if a project is funded for five years, it is unlikely to be cancelled or change course. In order to be reactive, companies must be nimble and decisions must be made as-needed.
Finally, there is the notion of financial accountability. If a project is to be accepted, funded, and prioritized based on its potential return, the organization must review the project's performance after its completion to assess whether the estimate was accurate, both as a means of making more accurate predictions in the future and to determine whether further work is needed to address any variance.
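(EN: A minimal sketch of such a post-completion review, with made-up figures: compare projected to actual return and flag projects whose variance exceeds a threshold for follow-up.)

    # Hypothetical post-completion review: projected vs. actual return per project.
    projects = [
        {"name": "checkout fix",  "projected_return": 120_000, "actual_return": 95_000},
        {"name": "site redesign", "projected_return": 60_000,  "actual_return": 70_000},
    ]

    REVIEW_THRESHOLD = 0.20  # flag anything more than 20% off forecast

    for p in projects:
        variance = (p["actual_return"] - p["projected_return"]) / p["projected_return"]
        flag = "REVIEW" if abs(variance) > REVIEW_THRESHOLD else "ok"
        print(f"{p['name']}: projected {p['projected_return']:,}, "
              f"actual {p['actual_return']:,}, variance {variance:+.0%} [{flag}]")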
(EN: This can also go very wrong for an organization that is too rigid in its demand for accuracy, and misguided in the actions it undertakes to "improve" accuracy. The net result is an organizational focus on short-term projects with measurable outcomes rather than long-term projects whose outcomes are more strategically beneficial, but less subject to quantitative analysis, and more difficult to predict with accuracy.)
Perking up Interest in Web Analytics
The author provides a number of suggestions for convincing an organization of the value of analytics and encouraging their use throughout the enterprise.
Establishing a "steering committee" for analytics is helpful in getting input and participation from various departments within a company. This should involve, at the very least, members of marketing, IT, and e-business. (EN: This is from a marketing perspective - if other areas of the business can benefit from analytics, loop them in. My sense is that the more impact the digital channel has on an organization, the more areas should participate.)
As with any committee, the web analytics steering committee can languish without a meaningful mission. The committee should be tasked with determining where analytics will be helpful, overseeing the operations that generate analytics, and monitoring existing analyses to ensure they remain accurate and relevant.
If analytics is a new notion, begin by seeking out modest victories: where analytics identifies a problem, it is recognized; but where it identifies a solution, it is valued, key people take interest, and the organization begins to consider what other problems it might help to solve.
Analytics also needs to trickle down: if it is discussed only by executives behind closed doors, it will be dismissed by the rank and file. Analytics must relate to their daily work in order to matter: they must be able to see how it reflects, and is affected by, real performance. Ideally, rewards and incentives are tied to actions that lead to improved performance, measured in terms of these same metrics.
In instances where analytics is being introduced at a lower level, the analytics department will need to "sell" the notion by less formal means. The author's advice for doing so is fairly standard for selling any technical solution: consider the perspective of the listener. If you speak in jargon and dwell on the details of the analysis, you will get nowhere. If you can frame solutions in terms of business value, speak in plain English, and communicate the benefits to the organization (leaving out the nuts-and-bolts details), you will have better success in getting people to pay attention and grant their support.
Key Roles beyond the Analytics Team
The author considers some of the personnel outside the analytics team who can have a role or an interest in the operation. This will differ among enterprises, but the following are typical players:
- CMO - The marketing silverback will seek analytics as a method to plan marketing efforts and measure their success.
- Web Producer - The manager of the Web site will seek analytics to plan design initiatives and measure the performance of the site.
- Web Designer - Web site designers will use analytics to support more granular design and usability decisions.
- Information Architect - Seeks analytics to trace the user's click-stream through the site, remove bottlenecks, and improve attention given to high-priority flows.
- Usability Expert - Analytics will provide field evidence of usability principles and identify areas of the site where usability work can improve performance.
- Copywriter - Analytics will identify areas where the text content of a site can be improved to drive user behavior.
- Developers - Analytics identify areas where performance may be an issue.
- Channel Manager - Seeks analytics to measure the performance of the online channel and gauge the impact of initiatives.
(EN: The list goes on. In some instances, the role is so clear that it probably doesn't need explaining; in others, it seems a bit of a stretch. But it's largely a brainstorming exercise.)
Cross-Channel Implications
The application of analytics to the Web often creates myopia, to the exclusion of other channels. The Web does not operate in a vacuum, and the online experience can have a significant impact on call centers and physical locations. It is therefore important to extend metrics beyond the Web.
The example is given of a company that launched a Web site and got few online orders, but saw orders to their call center increase - they had a vague sense that one drove the other, but could not derive an accurate estimate. In this example, the solution was to provide a different toll-free number on the site (or even different numbers for different sections of the site) to trace the origin of a call - a technique borrowed from direct marketing.
Comparisons among channels can yield additional information. In the example above, the company wanted to shift customer behavior toward the Web site to save costs - but it determined that customers who closed on the phone bought more products than those who closed on the Web, hence the decision to "cut costs" would come at the loss of cross-selling and up-selling revenue. But neither was the Web without value: customers who called a number found on the site closed more often and placed larger orders. So closing either channel, or dissuading customers from using it, would be counterproductive.
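(EN: A toy illustration of this kind of cross-channel attribution and comparison, with hypothetical phone numbers and figures: each toll-free number is displayed in only one place, so the number dialed reveals where a call originated, and close rate and average order value can then be compared by channel.)

    from collections import defaultdict

    # Each toll-free number appears in only one place, so the number dialed
    # tells us where the call came from (a direct-marketing response-code trick).
    NUMBER_TO_CHANNEL = {
        "800-555-0101": "phone via web site",   # hypothetical numbers
        "800-555-0199": "phone via catalog",
    }

    # Made-up interaction records: (channel, order value; 0 means no sale)
    interactions = [
        ("web checkout", 80), ("web checkout", 0), ("web checkout", 95),
        (NUMBER_TO_CHANNEL["800-555-0101"], 140), (NUMBER_TO_CHANNEL["800-555-0101"], 160),
        (NUMBER_TO_CHANNEL["800-555-0101"], 0),
        (NUMBER_TO_CHANNEL["800-555-0199"], 110), (NUMBER_TO_CHANNEL["800-555-0199"], 0),
    ]

    stats = defaultdict(lambda: {"contacts": 0, "orders": 0, "revenue": 0})
    for channel, value in interactions:
        stats[channel]["contacts"] += 1
        if value:
            stats[channel]["orders"] += 1
            stats[channel]["revenue"] += value

    for channel, s in stats.items():
        close_rate = s["orders"] / s["contacts"]
        avg_order = s["revenue"] / s["orders"] if s["orders"] else 0
        print(f"{channel}: close rate {close_rate:.0%}, average order ${avg_order:.0f}")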
The lesson in the example is that focusing on one channel, and not considering the confluence of multiple channels, can lead companies to make poor decisions.