As you build or refine your organization’s reporting capabilities around Key Performance Indicators (KPIs), one tried and true report format should be a part of your arsenal: the ranking report.
What are the hallmarks of a ranking report? To me, a ranking is more than just a view of the spectrum of performance results. A real ranking report acknowledges the best in class at the top of the list, calls out those at the bottom as needing improvement, and places everyone's performance in order from best to worst, top to bottom.
Want to make a ranking report even more effective? Try these format upgrades:
- Don’t just list the store, list the store manager’s name
- Call out the top 10, highlight them in green and celebrate their success
- Call out the bottom 10, highlight them in red and call them Opportunity Stores
- List the district manager’s (DM) name alongside the store manager’s name
- Rank the districts in their own list after the stores
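The upgrades above can be sketched in a few lines of Python. Every store name, manager name, and KPI value here is made up for illustration, and the green/red highlighting is reduced to text tags:

```python
from collections import defaultdict

TOP_N = 2      # a real chain would use 10; 2 keeps this example short
BOTTOM_N = 2

# (store, store manager, district manager (DM), KPI value; higher is better)
stores = [
    ("Store 101", "A. Rivera", "J. Chen",  96.2),
    ("Store 102", "B. Okafor", "J. Chen",  88.5),
    ("Store 103", "C. Dubois", "M. Patel", 91.0),
    ("Store 104", "D. Novak",  "M. Patel", 79.4),
    ("Store 105", "E. Sato",   "J. Chen",  84.7),
    ("Store 106", "F. Haddad", "M. Patel", 93.8),
]

# Best to worst, top to bottom.
ranked = sorted(stores, key=lambda s: s[3], reverse=True)

for rank, (store, mgr, dm, value) in enumerate(ranked, start=1):
    if rank <= TOP_N:
        tag = "TOP"            # highlight green, celebrate
    elif rank > len(ranked) - BOTTOM_N:
        tag = "OPPORTUNITY"    # highlight red: an Opportunity Store
    else:
        tag = ""
    print(f"{rank:>2}. {store}  {mgr} (DM: {dm})  {value:5.1f}  {tag}")

# Rank the districts in their own list, by average store result.
by_dm = defaultdict(list)
for _, _, dm, value in stores:
    by_dm[dm].append(value)
district_ranked = sorted(by_dm, key=lambda d: sum(by_dm[d]) / len(by_dm[d]),
                         reverse=True)
```

Publishing the district list right after the store list ties each DM's standing to their stores' results, which feeds the peer dynamic the rest of this piece relies on.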
Simple stuff. But it's a natural human reaction to want to look good, or at least to avoid looking bad. I will admit that not everyone craves the top of the list; in fact, a fair number prefer the anonymity of the middle. But no one, no one, wants to be found in the bottom 10 or at the very bottom. Not if the report means anything, not if it gets published to their peers, and not when their performance reflects on their DM too.
So the magic of a ranking report is the dynamic of motivating change from the bottom of the list. Whether that takes help, encouragement, extra resources, DM coaching or like-store learning, the ranking report can provide the fuel for change management for those on the bottom of the list. And if the bottom of the list keeps moving up, so does the collective average. And thus, the organization improves.
Of course, not everything is suitable for performance ranking, top to bottom, best in class at the top and worst in class at the bottom. Some KPIs are entirely inappropriate for this, in fact. The litmus test is whether the metric you are sharing puts every store on a level playing field, with reasonably equal potential to land anywhere on the spectrum of performance.
- Sales per Hour across a chain of varying volumes, mixes and demographics? No way.
- Sales per Transaction across a chain of diverse locations and product mix? No way.
- Overtime hours across a landscape of varying store volumes and situations? Not really.
- ✓ Earned Hours Operating Ratio, where every store’s actual hours are compared to that store’s own earned hours? Now that makes sense.
- ✓ Percent of Overtime Hours used? Much better than just overtime hours.
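The difference between the two groups comes down to normalization, which a couple of made-up stores can make concrete. The formulas below assume the common definitions (operating ratio = actual hours divided by that store's own earned hours; overtime percent = overtime hours divided by actual hours), so store size drops out of the comparison:

```python
stores = {
    # store: (actual hours, earned hours, overtime hours) -- illustrative numbers
    "Big Box":    (4200, 4000, 180),
    "Small Shop": ( 510,  500,  18),
}

results = {}
for name, (actual, earned, overtime) in stores.items():
    results[name] = {
        # Actual hours vs. this store's OWN earned hours: size drops out.
        "operating_ratio": actual / earned,
        # Percent of overtime, not raw overtime hours.
        "overtime_pct": overtime / actual,
    }

# Rank ascending: the store closest to (or below) its earned hours wins.
ranking = sorted(results, key=lambda n: results[n]["operating_ratio"])
```

On raw overtime hours, the small shop "wins" automatically (18 vs. 180) just by being small; on the ratios, both stores compete on how well they ran against their own workload, which is the level playing field the litmus test asks for.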
You get the point. If you want to compare metric values using a ranking report, be sure you are casting light on a metric that matters, and one where every store, through your organization's tools, best practices, hard work, and your support, can reach any point on that list.
Your support is critical. Why? Because a good labor manager has to believe that their store managers want to do a good job. Some folks have better training and experience than others. Some need more motivation. Others may not understand the priority or know how to get there. Many need your help and guidance.
Well, if it’s worth ranking, then it’s worth helping with. If your goal is to light a fire under the feet of those managing key performance metrics, then be prepared to help them when they get serious about the fire you set.