5 Reasons Why You Should Never Use “Black Box” Lead Scoring

June 30

We see it constantly: analytics vendors who claim to offer lead scoring services that will increase your conversion rates enormously and leave you begging for more. These companies typically promise a quick and easy lead scoring deployment that will yield a 20%+ increase in conversion rates. How the vendors plan to achieve those results, however, isn't part of their pitch, and the assurances they provide may be enough to convince you to hire them. But when it's time to measure the effectiveness of their work, you are disappointed by lackluster results. And rightfully so.

The term “black box” refers to having no visibility into how an analytics vendor derives the lead scores they put in place for you. Which variables are factored into a lead's score, and how those variables are weighted, are complete unknowns. What isn't a mystery is that the nuances of your business were not considered when the lead scoring model was built.

Without this crucial understanding of how your business works, the model fails to predict which leads are worth your time. As a result, you're left looking just about as bad as your lead scoring outcome. By understanding what makes “black box” lead scoring under-perform expectations, you can avoid wasting your budget and your sales reps' time on dead-end leads.

Here are five reasons not to use “black box” lead scoring vendors:

1. Everything (including the kitchen sink) ends up in your scoring model

Many lead scoring vendors are confident they can churn out a quick lead scoring model without being accountable for their models’ performance. They plot the easiest path to the end of the engagement. “Why not throw all of their data into a model and see what sticks?” the vendors ask themselves. After all, that is the quickest way to deliver scores (any old scores) to the client and be done with the engagement. Minimal work for them, minimal results for you.

The obvious problem with this approach is that the scores are, in all likelihood, inaccurate, because the business rationale for the model's inputs hasn't been given a second thought. Anybody can slap together a regression analysis and hope for the best. But true lead scoring success comes from careful consideration of model inputs.

2. Derived variables don’t make the cut

Just as “black box” vendors throw every variable into the model to see what sticks, they also skip a vital modeling step: deriving new and meaningful variables that have significant predictive power. As a result, they miss the hidden relationships among variables, which leads the scoring model astray.

Take, for example, a hypothetical software company that offers its customers an advanced version of its flagship software. The company is hungry for incremental revenue and wants a lead scoring model to determine which leads are most likely to upgrade to the advanced product. A “black box” vendor may design a lead scoring model that uses all available fields in the software company's database, but it would miss a crucial relationship by not deriving a new variable: the amount of time existing customers have used the original product. Customers with longer “product tenure” may be more likely to upgrade to the advanced product. Without this new predictive variable, the lead scoring model will most likely miss those customers and be less powerful and less accurate.
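To make the idea concrete, here is a minimal sketch of deriving the “product tenure” variable described above. The customer records, field names, and dates are hypothetical, invented purely for illustration:

```python
from datetime import date

# Hypothetical customer records; field names and dates are illustrative only.
customers = [
    {"id": 1, "first_purchase": date(2014, 3, 1)},
    {"id": 2, "first_purchase": date(2016, 1, 15)},
]

def product_tenure_days(record, as_of=date(2016, 6, 30)):
    """Derive 'product tenure': days since the customer first bought the original product."""
    return (as_of - record["first_purchase"]).days

# Attach the derived variable so a scoring model can use it as an input feature.
for c in customers:
    c["tenure_days"] = product_tenure_days(c)
```

The point isn't the arithmetic; it's that this feature doesn't exist as a raw field in the database, so a model built only from raw fields can never learn from it.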

3. Industry-specific third-party data is overlooked

In addition to company engagement data, which is incredibly important to lead scoring, third-party data such as company “firmographics” (company size, number of employees, etc.) and intent data help to create much more robust and accurate lead scoring models. Firmographics are often great indicators of customer preferences and likelihood to purchase. Likewise, intent data allows companies to zero in on those customers who are genuinely interested in purchasing soon.

Unfortunately, the typical lead scoring vendor may not add the industry-specific third-party data that is often invaluable. By skipping this data, companies will experience sub-par lead scoring performance, which will, over time, add up to large amounts of lost incremental revenue.
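Enriching leads with firmographics is usually just a join between your in-house records and the purchased data. A minimal sketch, where the leads, domains, and firmographic fields are all hypothetical:

```python
# Hypothetical in-house leads and purchased firmographic records, keyed by email domain.
leads = [
    {"email": "ana@acme.com", "score_inputs": {}},
    {"email": "bo@initech.io", "score_inputs": {}},
]
firmographics = {
    "acme.com": {"employees": 5000, "industry": "manufacturing"},
    "initech.io": {"employees": 40, "industry": "software"},
}

# Enrich each lead before modeling; unmatched domains simply gain no extra fields.
for lead in leads:
    domain = lead["email"].split("@", 1)[1]
    lead["score_inputs"].update(firmographics.get(domain, {}))
```

After enrichment, company size and industry are available to the scoring model alongside the engagement data you already collect.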

4. Reliance on simple business rules produces unreliable scores

Sometimes, vendors may not consider leveraging the power of predictive analytics; they will simply rely on quick data analyses to inform a few business rules in their effort to build you a “best practices” lead scoring model. Although many business-rules based lead scoring models lead to higher conversion rates and subsequent incremental revenue, it’s vital to use them in tandem with predictive models to ensure a high degree of accuracy.

With a predictive model designed for lead scoring, your scores will typically be much more accurate at predicting which customers are most likely to purchase. A popular analogy that compares the two methods goes something like this: predictive models are to business rules as a scalpel is to a chainsaw. You get a more accurate and precise picture of your customers' likelihood to purchase with predictive analytics.
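The scalpel-versus-chainsaw contrast can be sketched in code. Below, a rule-based scorer buckets a lead into coarse point tiers, while a logistic model produces a graded probability. Every attribute name and weight here is made up for illustration; a real predictive model learns its weights from your historical conversion data:

```python
import math

# A toy lead with hypothetical attributes (names and values are illustrative).
lead = {"pages_viewed": 12, "company_size": 450, "opened_pricing_page": True}

def rule_based_score(lead):
    """Chainsaw: coarse business rules lump leads into a few point tiers."""
    score = 0
    if lead["pages_viewed"] > 10:
        score += 50
    if lead["opened_pricing_page"]:
        score += 50
    return score

def predictive_score(lead):
    """Scalpel: a logistic model yields a graded conversion probability.
    These weights are invented; a fitted model estimates them from data."""
    w = {"pages_viewed": 0.15, "company_size": 0.002,
         "opened_pricing_page": 1.2, "bias": -3.0}
    z = (w["bias"]
         + w["pages_viewed"] * lead["pages_viewed"]
         + w["company_size"] * lead["company_size"]
         + w["opened_pricing_page"] * lead["opened_pricing_page"])
    return 1 / (1 + math.exp(-z))  # logistic (sigmoid) function
```

The rule-based score jumps in 50-point chunks, while the predictive score moves smoothly as each input changes, which is what lets you rank leads precisely rather than in broad buckets.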

5. Your three-year-old could have designed a better quality model

Model design is one of the most important, if not the most important, elements of a successful lead scoring model. One of the main problems we see is a lack of sensitivity to factors such as seasonality bias and survivorship bias. Without an understanding of the biases in the data and careful consideration of how to deal with them, a “black box” lead scoring vendor will quickly create a model that puts you on the path toward mediocrity.
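One simple guard against seasonality bias is to validate the model on a held-out later time window instead of a random split, so seasonal conversion spikes in the training data can't inflate measured accuracy. A minimal sketch, using invented lead records with a hypothetical holiday-season spike:

```python
# Hypothetical leads, one per month of 2015, with a conversion spike in Nov/Dec.
leads = [
    {"created": "2015-%02d-01" % m, "converted": m in (11, 12)}
    for m in range(1, 13)
]

cutoff = "2015-10-01"  # ISO-format date strings compare correctly lexicographically
train = [l for l in leads if l["created"] < cutoff]     # fit the model on earlier leads
holdout = [l for l in leads if l["created"] >= cutoff]  # evaluate on later, unseen months
```

A random split would scatter the November/December converters across both sets and flatter the model; the out-of-time split forces it to prove itself on a season it has never seen.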

It is, of course, impossible to know in advance whether a vendor will be sensitive to proper model design and avoid these bias issues; however, one step you can take to gauge this is simply to ask. Determine whether the vendor adequately describes their methodology for dealing with bias. If it sounds reasonable and carefully considered, you may have found yourself a good match for a lead scoring vendor.

Have you had any “black box” experiences like this? Tell us about them below!

Learn more about successful lead scoring here >

