Dial M for Measurement
“Is real-time customer feedback really that new? Adam Dorrell puts it into perspective and gives us a short history of customer experience measurement, from film focus groups to online customer feedback”
How gauging customer experience has roots in movie history
In the darkness at the back of a Century City movie theater, Alfred E. Green turns away from the screen to look intently at the red line from a pen recording machine as it edges towards the right-hand edge of the paper roll. On the screen, Larry Parks mimes to “Mammy” at the climax of a rough cut of “The Jolson Story”.
And in the laps of the hand-picked preview audience is an electrical gadget about the size of a flashlight with a dial that can be easily turned by the fingers. A turn to the right means “Like”. Far right “Like Very Much”. A twist to the left registers as “Dull”, or further, “Very Dull”. The emotional reactions from the audience flow into the central machine which combines them in a single wavy line. At this stage in the movie, it seems that every audience member has excitedly dialed the indicators all the way to the end stops. The pen lurches to the right as the orchestra builds to a crescendo, and Green permits himself to smile, now certain that his film will be a hit.
It is 1946, and The Jolson Story goes on to be a critical and commercial success. It is also another hit for Audience Research Inc, one of the companies started by George Gallup, pioneer of measuring public opinion.
That was sixty years ago, and even then Hollywood was not leaving anything to chance. Audience Research Inc (ARI) was testing hundreds of movies a year with sophisticated techniques. Often the results were distilled into a few simple words for the movie moguls. “It’s a flop – Slash the ad budget” or “We’ve got a blockbuster – Rush out more prints!”
Gallup was a pioneer of polling, practically inventing the quota system. In 1936, he made a name for himself by predicting the result of that year’s presidential election from the replies of only 5,000 respondents. Gallup’s prediction contradicted the respected Literary Digest magazine, which had sampled over two million returned questionnaires – and got the eventual result wrong.
The art of prediction was rapidly elevated into a science. This type of quota survey was ideal for new products, or one-off events. A small focus group could be exposed to a new product (or concept), provide excellent feedback – and be representative of an entire market or nation.
The Usual Suspects
Gallup’s legacy paved the way for long, complicated marketing surveys that could only be conducted once a year (because of expense), had many questions (because if you didn’t ask a question now you had to wait a year) and then had to be interpreted by high priests of mathematics (to see if X correlated to Y). Each survey company jealously guarded its own techniques, and for business people it was impossible to easily compare results across industries – or even internally.
But unlike holding a presidential election or releasing a blockbuster movie, few businesses host one-off events – instead conducting many similar transactions every day, with individual customers.
Over the last twenty years, almost all major companies have embarked on a yearly customer satisfaction survey to see how well the company performed. It has become ritualized, with a tradition that goes something like this:
- Reminder from CEO: “Customer Satisfaction is very important to us. All senior managers are bonused on this metric. So everyone, please focus on it”.
- Marketing department to sales: “Make sure your customer records are updated so we can survey them”
- Sales: “Let’s find our tamest customers”
- In-depth interviews with a very few (happy) customers
- “Gaming” of the results – coaching some customers, dropping poor responses
- Survey result interpretation: Lots of questions + answers. No one really knows what it all means: An average of 0.71 – is that good? Or bad?
- Customer comments or suggestions: One month after the survey the comments get passed around – quantity too high to react to, and anyway, they are out of date – best ignored.
- Survey goes on shelf, gathers dust for a year.
In the end, these surveys are rarely used to change anything. Part of the reason is that employees don’t feel connected to the survey. After all, the surveys cover only a small percentage of customers, and they take place a long time after purchase.
One solution would be to survey a customer after EVERY transaction. Once impossible due to technical difficulty and cost, this is now becoming practical. It is even the norm in the e-commerce business.
One of the best examples is eBay. Each transaction ends with an email for both buyer and seller: “Please rate the seller” and “please rate the buyer”. The eBay community relies on the outcome as an indication of trust. It works so well because of three basic concepts:
- Simplicity: choose a Positive/Negative rating and add a short comment.
- Participation: There is a high proportion of comments (above 90%) – probably due to the simplicity and ease of use.
- Transparency: Comments are on show to all – and are believable.
What counts is the number of positives: “98% positive!” shout the powersellers. And with good reason: it’s a system that can’t easily be rigged, so a high rating is a good benchmark of trust and customer service.
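The arithmetic behind that “98% positive” badge is simple. As an illustrative sketch (the function name and figures are my own, not eBay’s), the calculation looks like this in Python:

```python
def positive_percentage(positives, negatives):
    """eBay-style feedback score: the share of positive ratings
    among all rated transactions, as a percentage."""
    total = positives + negatives
    if total == 0:
        return None  # seller has no feedback yet
    return 100.0 * positives / total

# A seller with 490 positive and 10 negative ratings
# displays as "98% positive"
print(positive_percentage(490, 10))  # -> 98.0
```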
Other vendors also email, or even send SMS to ask each customer to rate the experience. Technically this is not difficult, but what matters is asking the right questions and then how the information is interpreted.
Luckily there is an excellent template for this, which uses a single question.
Back to the Future
The roots of the idea of simplifying and standardizing customer surveys come from Fred Reichheld of Bain & Company. In his seminal paper “The One Number You Need to Grow”, published in the Harvard Business Review in December 2003, he suggested that you could measure customer loyalty by asking a single question: “How likely is it that you would recommend company X to a friend?” Scored on a 0–10 scale, customers answering 9–10 count as promoters and 0–6 as detractors; by subtracting the percentage of detractors from the percentage of promoters, Reichheld found an indicator of future growth in profits. He called it the “Net Promoter Score®”.
This simple idea – using one question, and a single number – is changing the customer survey industry. Companies are rallying around the concept of Net Promoter, finding it easy to explain to staff and easy to implement. Net Promoter even has the potential to become an industry benchmark, given that the same question and methodology can be used everywhere.
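The calculation itself is straightforward. As a minimal sketch (the function and sample ratings are mine, not from Reichheld’s paper), the Net Promoter arithmetic in Python:

```python
def net_promoter_score(ratings):
    """Compute a Net Promoter Score from 0-10 'would you recommend?' ratings.

    Promoters score 9-10, passives 7-8, detractors 0-6;
    the score is the percentage of promoters minus the
    percentage of detractors (range -100 to +100).
    """
    if not ratings:
        raise ValueError("no ratings to score")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# 5 promoters, 3 passives, 2 detractors out of 10 responses -> NPS of 30
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 8, 6, 3]))  # -> 30.0
```

Note that passives (7–8) drop out of the score entirely – they dilute it by adding to the total without counting either way.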
Using electronic communications (email and, increasingly, SMS), organisations can now get almost real-time feedback on customer experience using dashboards, and see which segments are performing better than others.
The Sixth Sense
Asking customers to rate companies and producing a numerical score is a good start on improvement. It allows companies to track and see how they are improving over time, and to compare to other organizations.
In addition, most companies already know where their weak spots are, and can work to change them. But true change rarely comes unless the input of customers is taken into account.
Customers provide input every day: on the phone to customer support, complaining about poor service, making suggestions to staff, even writing letters to the CEO with praise. Rarely however are these brought together in one place to make sense of the “Voice of the Customer”.
Some companies are confused about talking with customers. When I asked the Sales Director of a major company who his customers were, he listed the major retailers who were buying more than 80% of his product offering: Wrong answer! He had named the Channel Partners who have the selling relationship with the real customers. In fact, many companies are not organized to have a dialogue with consumers, preferring to leave it in the hands of a variable-quality channel. Result: frustrated consumers who are familiar with email, web and call-centres but can’t reach a real person in their brand of choice.
Some successful organizations are now actively soliciting the “Voice of the Customer” – organizing the various touch-points into an integrated set of messages. Urgent issues are handled in a timely way by support staff. Praise is routed to those who deserve it.
By categorizing comments and organizing them by number and value, a good picture of customer sentiment can be produced. In companies that are truly customer focused, customer feedback is reviewed by senior managers monthly and forms a constantly updated list of priorities that can change the business.
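Even the tallying step can be automated once comments have been tagged. As an illustrative sketch (the categories and counts are invented, not from any real dataset), a frequency-ordered priority list in Python:

```python
from collections import Counter

# Hypothetical comment categories, one entry per tagged customer comment
comments = ["delivery", "price", "delivery", "support", "delivery", "support"]

# Count comments per category; most_common() returns them
# sorted by frequency, highest first - a rough priority list
by_category = Counter(comments)
for category, count in by_category.most_common():
    print(category, count)
# delivery 3
# support 2
# price 1
```

In practice the tagging itself is the hard part; the sorting is trivial once the categories exist.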
Some Like it Hot
Becoming a customer focused organization is difficult for many companies, and requires a great deal of culture change. But there are some simple tips to help the process for companies of all sizes.
- Survey all transactions if possible. Keep it simple for you and the customers.
- Get a numerical rating, based on Net Promoter Score. Publish it internally. Make sure everyone knows about this.
- Ask customers for comments. Read every one. Classify and sort them. Prioritise what has to be changed.
- Publish the good comments on your website as Testimonials.
- Try to personally answer any critical comments.
Adam Dorrell is the founder of Directness, which produces the software CustomerGauge to help organisations measure customer loyalty, understand customer sentiment, and respond to customer comments. Compatible with the Net Promoter Score®, CustomerGauge allows organisations to improve customer relationships and encourage loyalty, most immediately in e-commerce functions.
His blog is on www.engaugement.net
Source: Time magazine, “Cinema: A. P. & Want-to-See”, July 1946
This comment article appeared online at Internet Retailing in March 2008. (Update, 6 May 2010: the article is now offline; the text is reproduced above.)