Friday, May 21, 2010

Data Quality for Social CRM

We just read a recent blog post from Harish Kotadia on Social CRM. He emphasizes the need for data quality. As Harish notes, the volume of social CRM data can be huge, which makes upstream data quality especially important.

Thursday, May 20, 2010

Tactical Data Quality

Today we listened to a podcast interview with Rob Karel of Forrester Research on "Tactical Data Quality Projects". Rob's comments were congruent with our previous blog "Data Quality on a Budget". Rob suggested small, easy-to-accomplish projects that provide quick ROI. He also recommended targeting chronic problems that business units cite and that can have a quick payback, such as getting junk entries out of marketing databases.

Friday, May 14, 2010

Benefits of Upstream Data Quality

We liked Liliendahl's recent blog post on upstream data quality. We agree with Liliendahl that data quality is best dealt with at the point of entry, before the data is posted to a database. Upstream data quality solutions can be easier to use and are usually far less costly than downstream solutions. Upstream solutions can be so inexpensive that they bypass the usual budgeting process. That can be important to getting data quality initiatives launched, as we discussed in our blog earlier today.
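To make the point-of-entry idea concrete, here is a minimal sketch of upstream validation in Python. The field names and rules are our own invented examples, not any particular product's checks; real upstream tools do far more (standardization, verification against reference data, and so on).

```python
import re

def validate_contact(record: dict) -> list:
    """Check a contact record at the point of entry, before it is
    posted to the database. Returns a list of problems found."""
    problems = []
    if not record.get("name", "").strip():
        problems.append("missing name")
    # Very rough email shape check -- illustration only.
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", record.get("email", "")):
        problems.append("invalid email")
    if not record.get("postal_code", "").strip():
        problems.append("missing postal code")
    return problems

# Reject or flag the record before it ever reaches the database.
entry = {"name": "Jane Doe", "email": "jane@@example", "postal_code": "12345"}
print(validate_contact(entry))  # ['invalid email']
```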

Data Quality on a Budget

We read an interesting blog by Graham Rhind yesterday about the frustration of watching organizations fail to appreciate the value of data quality. He makes some excellent points about how important data quality is to all organizations.

Unfortunately, not many organizations currently share Graham's sense of urgency regarding data quality, and it costs them dearly. Thus, data quality will continue to need evangelists such as Graham.

In addition, many organizations expect demonstrable ROI for investments, including data quality. (See our blogs on ROI for data quality here and here.) While Graham finds the need to demonstrate ROI frustrating because he believes data quality will always prove beneficial, it is unlikely that organizations will move away from this sort of capital budgeting exercise any time soon.

However, there may be ways to move an organization into data quality without getting bogged down in a time-consuming exercise to demonstrate ROI. Not long ago, we listened in on a webinar with Steve Sarsfield discussing "Affordable Data Governance". Steve advocated a strategy he called "land and expand": pick low-cost data quality solutions that can still deliver a measurable improvement in data quality.

While low-cost solutions may not be the comprehensive solutions that some organizations need, they can often be implemented without a lengthy capital budgeting exercise, since many organizations have approval thresholds below which individuals or departments are free to make their own spending decisions. Moreover, if the solution demonstrates the value of data quality, it will strengthen the case for more comprehensive solutions and help drive adoption of data quality within the organization.

Cloud computing services for data quality are one example of a low-cost solution. Some cost as little as a few hundred dollars per month and yet provide very high returns. An expenditure of that size is often well within a department's budget and will not require separate approval.

Wednesday, May 5, 2010

Estimating Benefits of Data Quality Initiatives

We liked this piece by David Loshin on estimating the cost of junk data in a company's database. This is the kind of analysis that will lead to a compelling business case based on ROI for a data quality initiative. See our earlier blog for a frame of reference and some other examples.

Tuesday, April 27, 2010

Interesting New Gartner Survey

We read about a new Gartner survey in CIO that suggests data quality problems are growing rapidly. The problem is particularly acute given that 85% of the businesses surveyed said they consider data a strategic asset.

Easy to use upstream data quality solutions, such as Ikhana's EDM Web Services, can greatly reduce junk data at a very low cost and can be installed in as little as one hour.

Wednesday, April 21, 2010

Cloud Computing for Data Quality

We liked this blog from Liliendahl: http://liliendahl.wordpress.com/2010/04/19/data-quality-from-the-cloud/

We think that web-based, upstream data quality tools give users a lower-cost data quality solution that is easier to install. Those attributes should produce superior ROI and make it easier to win approval from management and acceptance from IT.

Monday, April 19, 2010

ROI for Data Quality Initiatives, Part 2

In our last blog we discussed the financial analysis tools for building a business case to gain management approval for data quality projects. Employing those tools to support an investment in data quality is perhaps the easier part of the exercise; the common financial analysis tools are generally well understood and accepted by corporate managers. The harder part is marshalling the data to be presented and analyzed. Nonetheless, there are methods for estimating the costs and benefits of a data quality project that will make its business case more compelling.

The costs of a data quality initiative present the easier estimation challenge. Quotes for the necessary software and hardware are available from vendors. The organizational cost (i.e., employee and management time) to implement the initiative can be harder to estimate. However, many data quality projects have been implemented, so there is data available, and references that can be researched, to make a reasonable estimate of this cost.

Estimating the dollar value of the benefits of data quality improvements will be the most difficult task. What is the dollar value of better quality data? The important thing to recognize is that the estimate need not be exact, but it does need to be credible. The way to provide that credibility is to consider how bad data causes the business to operate at a suboptimal level, and then use analytical tools, such as sampling, to build a case for your estimate of the value of better data quality.

The following is a hypothetical example of how data quality benefits might be estimated for a database used for sales leads. Clearly, there is a cost to the company of giving its salespeople poor leads: time spent on a lead that will not result in a sale is the opportunity cost of not making a call that could have. To estimate this opportunity cost, a survey could be conducted on a sample of the sales staff large enough to be statistically significant, with the objective of estimating how much time salespeople spend on bad leads. From the survey data, an estimate can be made of the total time the sales staff spends on bad leads. The next step is to determine how much time a salesperson spends on a sales call and the success rate of those calls. With this information, the lost revenue due to bad data quality can be estimated; that lost revenue is the benefit of a data quality initiative.

While this analysis does not produce an exact figure with 100% certainty, it does create a reasonably compelling case that better data quality will benefit the company, and management can have a reasonable level of confidence in the project, assuming the projected benefits sufficiently outweigh the costs.
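A rough sketch of that arithmetic in Python, with every input invented purely for illustration (in practice the survey and your sales records would supply them):

```python
# Hypothetical inputs; the survey of the sales staff would supply the real ones.
reps = 200                          # size of the sales staff
hours_per_rep_per_week = 40.0
share_of_time_on_bad_leads = 0.15   # estimated from the survey sample
weeks_per_year = 48
hours_per_sales_call = 2.0
close_rate = 0.10                   # share of calls that result in a sale
revenue_per_sale = 5_000.0          # average revenue per closed sale

# Total hours the whole sales staff spends on bad leads each year.
wasted_hours = reps * hours_per_rep_per_week * share_of_time_on_bad_leads * weeks_per_year

# Calls that could have been made with that time, and the revenue they imply.
missed_calls = wasted_hours / hours_per_sales_call
lost_revenue = missed_calls * close_rate * revenue_per_sale

print(f"Estimated annual revenue lost to bad leads: ${lost_revenue:,.0f}")
# Estimated annual revenue lost to bad leads: $14,400,000
```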

Our blog does not allow for an in-depth discussion of this important issue. If you have any questions, or would like to discuss this topic with us, please fill out our contact form and we will get back to you.

Subscribe to our blog here.

Wednesday, March 24, 2010

ROI for Data Quality Initiatives

Gaining the support of senior management for spending on data quality is an essential but sometimes difficult task. Almost all well-run companies use some form of capital budgeting for their investment decisions. Thus, a sound financial justification is the key to building a compelling case for spending on data quality. However, the financial benefits of data quality improvements can be difficult to measure. Nonetheless, if your proposal is factual and backed by hard data, it stands a far better chance of being approved: a proposal based on facts will be perceived as less risky, and its projected returns will be assigned a higher probability. The greater the projected financial return of a data quality project, the greater the likelihood that it will be approved.

If you are considering an upstream data quality tool and are currently spending on downstream data quality solutions, your task can be considerably easier than for a de novo project. If you already have a downstream data quality system, you likely have data on how much employee time is required to cleanse your databases; if you use an outside service, you have invoices that document the cost of that effort. Upstream data quality tools will eliminate much of that downstream cost. Thus, estimating the benefit of an upstream tool can be relatively easy, and the estimate will rest on hard facts, not supposition. You can use Ikhana's ROI calculator to create a quick return on investment (ROI) estimate for your data quality needs.

Return on investment is a quick, but somewhat crude, measure of the financial worthiness of a project. It is simply the benefit gained from an expenditure, minus the expenditure, divided by the expenditure. In the above case, the ROI is the cost savings from an upstream data quality tool, less the cost of the tool, divided by the cost of the tool.
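In code, that calculation is one line (the dollar figures below are hypothetical; the benefit here is the downstream cleansing cost you would no longer pay):

```python
def roi(benefit: float, cost: float) -> float:
    """ROI = (benefit - cost) / cost."""
    return (benefit - cost) / cost

# Hypothetical: an upstream tool costing $5,000/year that eliminates
# $30,000/year of downstream cleansing cost.
print(f"ROI: {roi(30_000, 5_000):.0%}")  # ROI: 500%
```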

Another simple financial analysis tool is the payback period: the amount of time it takes for an investment to pay for itself. If an investment has a payback period of only a few months, it will be very attractive to company management for a couple of reasons. One is that the internal rate of return (IRR) (more about this later) will probably be very high because the investment is recouped so quickly. Another is that, because the investment is recouped quickly, the risk is lower: the more time that passes before the returns pay for an investment, the more time there is for something to go wrong. If an investment pays for itself almost immediately, it is a "no brainer"; its IRR is effectively infinite. There are some who argue that the payback period is an essential metric for evaluating IT projects.
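As a sketch, with hypothetical figures:

```python
def payback_period_months(cost: float, monthly_savings: float) -> float:
    """Months until cumulative savings equal the up-front cost."""
    return cost / monthly_savings

# Hypothetical: a $6,000 tool that saves $2,500 per month pays for
# itself in under three months.
print(payback_period_months(6_000, 2_500))  # 2.4
```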

The IRR and net present value (NPV) are perhaps the most common and most trusted capital budgeting tools. For projects that require multiple periods to implement, or multiple periods to recoup the initial investment, they are the most appropriate analytical tools. The reason is the time value of money: a dollar today is worth more than a dollar a year (or any other period) from now. The rate at which value declines over time is known as the discount rate. For a company, the appropriate discount rate is its weighted average cost of capital (WACC); many companies add a risk premium to the WACC to arrive at their discount rate.

To calculate the NPV of an investment proposal, the future benefits from the investment are discounted back to today using the company's discount rate (WACC). Subtracting the investment costs from the discounted benefits gives the NPV. A positive NPV indicates that the project could be a good investment and should increase the value of the enterprise, because the future benefits exceed the project's costs even when discounted at the company's cost of capital.
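A minimal NPV sketch, assuming annual cash flows and an illustrative 10% WACC:

```python
def npv(rate: float, cash_flows: list) -> float:
    """Net present value: cash_flows[0] occurs today (typically the
    negative investment); cash_flows[t] arrives t periods from now."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical: invest $50,000 today, receive $20,000/year for four years,
# discounted at a 10% WACC.
print(f"NPV: ${npv(0.10, [-50_000, 20_000, 20_000, 20_000, 20_000]):,.0f}")
# NPV: $13,397
```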

The IRR gives financial analysts another metric for judging the merits of an investment. It uses the same concepts as NPV, but the IRR is itself a discount rate: the rate that, when used to discount the future cash flows of an investment, results in a discounted value equal to the cost of the investment. It is the return on the investment over time. (No wonder they call it the internal rate of return.) If the IRR of an investment is greater than the company's WACC, the investment will have a positive NPV, and the greater the spread of the IRR over the WACC, the more attractive the investment.
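The IRR is typically found numerically. A sketch using bisection, run on the same hypothetical project as in the NPV example:

```python
def npv(rate: float, cash_flows: list) -> float:
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows: list, lo: float = 0.0, hi: float = 10.0) -> float:
    """Bisect for the rate at which NPV is zero. Assumes NPV is
    positive at `lo` and negative at `hi` (one sign change)."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid   # NPV still positive, so the IRR is higher
        else:
            hi = mid
    return (lo + hi) / 2

# Roughly 22% -- comfortably above the 10% WACC, which is why the
# project's NPV was positive.
print(f"IRR: {irr([-50_000, 20_000, 20_000, 20_000, 20_000]):.1%}")
```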

This is a very cursory overview of financial analysis for capital budgeting decisions. Please refer to the hyperlinks for more detailed explanations.

In a future blog we will discuss how you can estimate the benefits of a data quality initiative.