
Why Monitor A Problem If You Don’t Fix It?

 

Every time I see a LifeLock commercial, I think about our mystery shopping and customer feedback programs. There is one line that stands out:

“I’m not a security guard. I’m a security monitor. I only notify people if there’s a robbery…”

And their tagline: “Why monitor a problem if you’re not going to fix it?”

Sometimes companies will start a mystery shopping or customer feedback program with the best of intentions – they are excited about designing a program that will monitor and measure the customer experience, either from an operational or subjective perspective.

And then the first results come in, and key staff read every word, share them with their employees, and wait expectantly for the next one. Then the program runs for a while, and…

Now what?

There have been times when a client’s program seems to take a turn for the worse: all of a sudden the overall performance scores are lower, or a particular location has consistent complaints of slow service, incorrect orders, or some other issue. If it’s not improving, it’s time to figure out why.

There have been times when clients admit they realize there’s an issue but haven’t directly addressed it for a variety of reasons. Perhaps they’re understaffed, not sure how to handle the issue, or not really doing much with the data they’re getting. In that case, they are a lot like the security monitor noted above: they are alerted to an issue, but are not actively doing anything about it.

Below are some examples of ways customer experience data may not be used effectively, and some ways to overcome these challenges.

 

“Oh, we use the data. Every time a low score comes in the staff get in big trouble.” When I hear this, I want to cry a little. This is the absolute WORST way to deal with lower-than-anticipated performance on a mystery shop, especially if you are only focusing on the weaker-performing evaluations. A consistent pattern of low performance certainly signals something to dive deeper into. By consistently using analytical reporting portals, you will be able to identify these areas for improvement and action. By only singling out the poor scores, you are setting staff up for failure. You are also setting the tone that any customer experience measurement is “the enemy,” and that leaves you with a staff that has no interest in hearing the feedback or wanting to improve.

Take a different approach: instead of calling out the staff for a poor score, celebrate the good ones. Recognize the staff behind the best shops or surveys each month. For the weaker evaluations, compile enough data to pinpoint the issue(s) and create an action plan to improve.

 

“We are supposed to have meetings on a monthly basis to discuss the data, but business has gotten really busy lately, so…” Sometimes it takes a village, but often it helps to have one point person who is solely responsible for aggregating the data from all customer measurement programs and providing regular reporting to key staff.

It is important to have regular meetings to discuss company-wide issues as time allows, but that doesn’t mean nothing should be done in the meantime. Assign a point person who is responsible not only for distributing individual evaluations or feedback surveys, but also for reviewing the back-end analytics and providing key metric reports so that managers have a starting point for making improvements.

 

“I know District Manager A is on top of the program for his/her stores. I asked District Manager B about the program, and he/she said they may have seen some shops come in but hasn’t really looked closely.” When staff are not on board with a program, they tend not to take it as seriously. The fact is, whether you like it or not, the program will go on, so you may as well make use of it. If you are in a position to oversee District Managers, for example, talk with them on a regular basis and give them some guidance on how best to use the data. Remind key staff that it’s less about the individual results and more about the aggregated data across all programs. Show them how improvements in the customer experience translate directly into increased sales and better overall performance for their stores.

 

“I saw the surveys coming in last night and noticed that several customers were requesting contact. Sounds like it was a bad night.” Yikes. Thanks to technology, managers can be alerted to issues in almost real time, and taking quick action can sometimes keep an issue from snowballing into something bigger. I recall a customer feedback program in which text alerts would be sent if a customer requested contact from a manager. One evening, the alerts were coming in in quick succession. On closer inspection, the majority were for one location, and, in reading through the surveys, it appeared that the restaurant’s drive-thru wait was long enough to cause customers to leave mid-line, while inside the restaurant, the dining area was not being maintained and significantly slow service was being reported.

In this instance, in a perfect world, a manager could do a quick check-in with the store as the feedback comes in to see what quick fixes can be put into place. Then, as soon as possible following the shift, talk with the store manager in further detail to learn more about the issue and create an action plan to ensure it doesn’t happen again or, if it does, to resolve it as quickly as possible.

 

Data is valuable, and not using it can be detrimental. Hindsight is 20/20; don’t be the one to look back and think, “If only we had paid attention to the data coming in….” Take advantage of your monitoring programs and act when needed – your customers will thank you.

 

 
