
Assessing the impact of poor data on retail: The 1-10-100 rule

As we approach the holidays, and particularly the new year, our thoughts turn to how we can improve our personal lives, for example with cleaner living, better food and more exercise. But how many of us are planning to make improvements to our business? Maybe we should be thinking about our New Year’s work resolutions as well as our personal ones.

For example, what vital improvements can you make to your business to ensure that your deliveries and communications make it to the right customer at the right time? Regardless of the time of year, data is one of the most valuable assets a company owns. This is particularly important as globalization becomes more prevalent and more and more companies have international data in their systems.

So, why do so many businesses neglect to focus on keeping their data clean? When you really think about it, how much does inaccurate data cost your business? And is this just a financial issue, or does it have other consequences?

Too many companies overestimate the quality of their data, believing it to be just fine, when in reality it’s costing them more than they imagine. And by the same token, many companies underestimate the power of validating data at the point of entry, worrying about cost — when in fact, in the long run, this initial cost is minor in comparison to the alternative.

Much like a personal resolution to lead a cleaner lifestyle, businesses need to follow the same mantra: clean data in, clean data out.

Make clean data your New Year’s resolution

To help you put this “clean data” resolution into practice, a useful rule to follow when assessing the impact of your address data quality is the 1-10-100 rule, which was first proposed by George Labovitz and Yu Sang Chang in the early ’90s. Here, I’ll explain exactly what that means for businesses.

In the 1-10-100 rule, there are three phases, each of which explains the cost of maintaining data quality. In the first phase, the “prevention” stage, $1 represents the amount it costs to verify accurate data at the point of capture. This is the simplest and least expensive way of gathering data and ensuring its accuracy and validity.

This is the best possible solution for businesses storing data, as it means communication and deliveries are received the first time, and better business decisions can be made, leading to increased efficiency and growth. Tools like address or email verification ensure you don’t have to worry about dirty data in the future.
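As a rough illustration of what point-of-capture checks look like, the sketch below rejects obviously malformed input before it reaches the database. This is a hypothetical minimal example: real address or email verification would call a dedicated verification service, not a simple format check.

```python
import re

# Hypothetical point-of-capture format checks. These only catch
# obviously malformed input; a production system would verify the
# values against an external address/email verification service.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def looks_like_email(value: str) -> bool:
    """Loose structural check: one '@', a dot in the domain, no spaces."""
    return bool(EMAIL_RE.match(value))

def looks_like_postcode(value: str) -> bool:
    """Very loose check: 3-10 alphanumeric characters, spaces or hyphens."""
    return bool(re.fullmatch(r"[A-Za-z0-9][A-Za-z0-9 -]{2,9}", value))

print(looks_like_email("jane@example.com"))  # True
print(looks_like_email("jane@@example"))     # False
print(looks_like_postcode("90210"))          # True
```

Rejecting bad input at the form, with a clear error message so the customer can correct it on the spot, is exactly the $1 "prevention" spend the next section describes.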

However, as we move into the second phase, the “correction” phase, we see that the initial $1 rises tenfold to $10. This $10 represents the increased cost that an incorrect address will have on your business down the line. Without an effective prevention method, dirty data can start to have a serious impact on business efficiency, and companies wanting to resolve the issue will find themselves paying much more than if they had validated their data in the beginning.

And as we head into the third phase, the “failure” phase, we see that the amount increases tenfold again to $100. This $100 represents the amount companies will pay, per record, for doing nothing about their poor data.
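The tenfold escalation across the three phases can be sketched as a simple back-of-the-envelope estimate. The per-record dollar figures below are the rule’s illustrative values, not measured costs for any real business:

```python
# Illustrative cost model for the 1-10-100 rule.
# The per-record figures are the rule's canonical values, not real data.
COST_PER_RECORD = {
    "prevention": 1,    # verify at the point of capture
    "correction": 10,   # clean the data up later
    "failure": 100,     # do nothing and absorb the consequences
}

def estimated_cost(bad_records: int, phase: str) -> int:
    """Rough dollar cost of handling bad_records at the given phase."""
    return bad_records * COST_PER_RECORD[phase]

# 1,000 bad addresses caught at capture vs. left to fail:
print(estimated_cost(1000, "prevention"))  # 1000
print(estimated_cost(1000, "failure"))     # 100000
```

Even at this crude level of approximation, the gap between a $1,000 prevention spend and a $100,000 failure cost makes the case for validating at capture.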

Unfortunately, it isn’t just a simple financial issue, either. Bad data too often leads to more bad data, which eventually leads to bad business decisions and poor efficiency. This can also cause failed deliveries, missed marketing opportunities, a poor level of customer service and lack of compliance.

We all know the golden rule of customer service: “A customer may forget what you said, but they’ll never forget how you made them feel.” This goes for bad experiences too — fail to deliver that all-important Christmas gift and the reputational damage is magnified. On average, consumers tell approximately eight people about their good experiences and more than twice as many people about their bad experiences, according to the American Express Global Customer Service Barometer.

According to our recent research, two-thirds (66 percent) of retailers say the accuracy of address details is critical to their business, but 80 percent of retailers say that customers often don’t realize that failed deliveries happen because they mistyped their address. If customers are entering incorrect address data into your forms and you aren’t aware of it, this can only lead to issues.

This emphasizes the importance of validating your customer data from the very beginning, rather than leaving it to chance or having to clean it at a later date. Though the costs mentioned in the 1-10-100 rule are not exact, they are a great indication of how much costs increase the longer you leave dirty data in your database.

So rather than resorting to an expensive and time-consuming process of trying to correct poor data in your system, it really does pay to validate at the point of capture. Not only will this ensure that you won’t be spending more than necessary on cleaning data, it also helps you ensure that relationships with customers and brand reputation remain strong. Remember, quality data is for life, not just for the holidays.

via Marketing Land

