The Yellow Brick Road of Leading/Lagging Service Manager Reports, Oh My!

Key article takeaways:

  • Quick series recap

  • The #1 weapon against MSP chaos

  • Leading indicators of tech performance

  • The lagging indicator of Tech performance

  • Looking ahead along the Yellow Brick Road

 

Welcome back, fellow Autotask Adventurers! As you know, we are in a series going through the 3 Service Manager responsibilities. We are on the downside of responsibility #2 – Leading and Lagging Tech Performance Indicators – or for those who prefer a WBS structure, 2.3.

 

Now, I know the last few weeks may have seemed very much like we were trying to tame a jungle full of animals and bad guys with a hat and a whip but bear with me. By this week, your knowledge and comfort with the responsibilities of our hero (the Service Managers) and our guide (Advanced LiveReports) will make this week seem more like a jaunt down the Yellow Brick Road.

 

Speaking of brick roads…reading the title above, you may have started thinking about a little girl named Dorothy and her adventure with some friends. If so, good – you already know that it takes heart, courage, brains, AND trusted friends to manage a great MSP. If not, you might want to do some Googling about this 1939 classic movie and plan a popcorn night – you’ll thank me for it.

 

I know you’re anxious to start our adventure, but I owe it to those just joining us to provide a quick recap of the purpose of this series:

 

Your Service Manager should be taking ownership of these three things. If they are not, you need to fire them (or at least get them some guidance on their Role and Responsibilities).

 

A Service Manager is responsible for:

1) Holding the Service Coordinator accountable for doing their job as expected

2) Holding the Techs accountable for doing their job as expected

3) Holding the Team accountable to meet profitability expectations

a. Resource Utilization above 80%

b. SLA Performance above these thresholds:

 i. Triage – 97%

ii. Tech Engagement – 95%

iii. Completion – 90%

c. Reactive Hours per Endpoint per Month below 0.25 (less than 1 hour of work for every 4 Endpoints)
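If you export your Team's monthly numbers out of Autotask (via LiveReports or the API), these profitability targets reduce to simple threshold checks. Here is a minimal sketch; the function and field names are illustrative assumptions, not actual Autotask columns:

```python
# Hypothetical monthly summary check; names are illustrative, not Autotask fields.
def check_profitability(billable_hours, available_hours,
                        reactive_hours, endpoints,
                        sla_triage, sla_engagement, sla_completion):
    """Return each KPI's value and whether it meets the target above."""
    utilization = billable_hours / available_hours
    hours_per_endpoint = reactive_hours / endpoints
    return {
        "Resource Utilization": (utilization, utilization >= 0.80),
        "SLA Triage": (sla_triage, sla_triage >= 0.97),
        "SLA Tech Engagement": (sla_engagement, sla_engagement >= 0.95),
        "SLA Completion": (sla_completion, sla_completion >= 0.90),
        # Target: under 1 hour of reactive work per 4 Endpoints (0.25 hr)
        "Reactive Hours/Endpoint/Month": (hours_per_endpoint,
                                          hours_per_endpoint < 0.25),
    }

# Example month: 1360 billable of 1600 available hours, 110 reactive
# hours across 500 endpoints, and healthy SLA scores.
results = check_profitability(
    billable_hours=1360, available_hours=1600,
    reactive_hours=110, endpoints=500,
    sla_triage=0.98, sla_engagement=0.96, sla_completion=0.91)
for kpi, (value, ok) in results.items():
    print(f"{kpi}: {value:.2f} {'PASS' if ok else 'MISS'}")
```

The point is not the code itself but that every target above is a single number a Service Manager can put on a dashboard and review weekly.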

 

“WOW,” I imagine you are saying, “That’s a tall order.” 

 

Yes, it is. But – if a Service Manager did step up and take responsibility for these three things, my goodness, the Service Delivery would:

1) Meet expectations

2) Maximize profitability

3) Provide a Zen work-environment

 

“Okay, great,” I hear you saying. “I want my Service Manager to rock it out, but I’m still trying to figure out what a story about a girl and her dog has to do with Autotask?”

 

Well, Dorothy couldn’t defeat the wicked witch (Yes, I know…the musical “Wicked” tells a slightly different story) or outsmart the wizard without help from her friends.

 

Likewise, most people can’t properly set up the extremely complex Autotask software, let alone weather the tornadoes and pointy-haired adversaries in their lives without some friends to share in the adventure. That’s what my Team at Advanced Global does: we drop houses on MSP Chaos, and we have heart, brains, and no lack of courage.  

 

Our #1 weapon against MSP Chaos is the Advanced Autotask Live Report. We use these to find the issues in an MSP and get the numbers where they should be. After all, if you can’t measure it, you can’t figure out what’s wrong, and you can’t fix it. 

 

To that end, I want to do another deep dive into two Advanced Live Reports that go to the core of Tech performance, and six that support either leading or lagging profitability KPIs.

 

The two core Tech Performance reports already covered in the series are:

1) Advanced Resource Utilization – Manage Your Techs Right to Improve Service Delivery (agmspcoaching.com)

2) Advanced Real-Time Time Entry – Why Autotask Reports Are a Service Manager’s Best Friend (agmspcoaching.com)

 

Today we take on understanding the Leading and Lagging Tech Performance indicators:

 

The leading indicators of Tech performance are:

1)     Advanced SLA Performance

2)     Advanced Estimated vs Actual Time per Tech

3)     Escalation Rate

4)     Reopen Rate

5)     First Contact Resolution (notice we did not say First Call Resolution – which is a huge difference)

 

The lagging indicator of Tech Performance is:

1)     Client Survey Score per Tech

 

The leading indicators of Tech performance are:

Advanced SLA Performance:

As you may know from the Metric, Metric, Who Has the Metric article, one report can be analyzed in multiple ways depending on your role.  Like the Resource Utilization report used in a past article, the SLA Performance report is one of those reports.

 

We mentioned the SLA Performance report when discussing Service Coordinator performance, as they are totally responsible for all aspects of SLA performance at the Team or Company level.  Here we are using SLA Performance by Tech to grade the Techs individually.

 

By comparing individual performance against the Company’s, a Service Manager can see who is dragging the show down.

 

From here, the conversations are easy:

  • 1-on-1 coaching/mentoring conversations will uncover why a Tech is lagging behind.

  • As that Tech is coached to get with the program, the overall Team or Company score will improve.

 

Advanced Estimated vs Actual Time per Tech

Like the SLA Performance by Tech report, reviewing the Advanced Estimated vs Actual Time per Tech report will show which Techs are taking longer than others to complete engagements.  The reasons for the longer engagement may be:

1)     One unusual Ticket that just was the bad luck of the draw and took a while – stuff happens

2)     The Tech does not have the right skillset and should not have been assigned the Ticket in the first place, and once they saw what was required, humbled themselves and set it to “Return to Triage”

3)     They need additional or refresher training to keep up with industry changes and advances

 

Escalation Rate:

A reasonable Escalation Rate, based on the 187 MSPs from around the world we have provided a FREE No-Obligation PSA Configuration Evaluation for, is less than 5%.  When more than 5% of a Tech’s assignments are getting escalated (different than expedited), then either the process is wrong or the Tech assignment is wrong.
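That 5% threshold is easy to check per Tech once you have a list of assignments and whether each was escalated. A minimal sketch, using a hypothetical record layout (not an Autotask export format):

```python
# Per-Tech escalation rate from (tech, was_escalated) pairs.
# The data shape is illustrative, not an Autotask schema.
from collections import defaultdict

def escalation_rates(tickets):
    """tickets: iterable of (tech, was_escalated) pairs.
    Returns {tech: fraction of assignments that were escalated}."""
    assigned = defaultdict(int)
    escalated = defaultdict(int)
    for tech, was_escalated in tickets:
        assigned[tech] += 1
        if was_escalated:
            escalated[tech] += 1
    return {t: escalated[t] / assigned[t] for t in assigned}

# Ann: 2 of 40 escalated (5%, at the line); Bob: 10 of 40 (25%, a problem)
sample = ([("Ann", False)] * 38 + [("Ann", True)] * 2 +
          [("Bob", False)] * 30 + [("Bob", True)] * 10)
rates = escalation_rates(sample)
flagged = [t for t, r in rates.items() if r > 0.05]
print(rates, flagged)
```

A Tech showing up in the flagged list is the trigger for the process-or-assignment conversation described above.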

 

We see high escalation rates when a help desk staffed with Level 1 Techs is also the intake team.  Any request requiring more than End User skills needs to be escalated.  This delays the engagement, delivers a poor Client experience, and wastes the Level 1 Techs’ billable hours. Techs should not be at the front end of the intake process.  There is too much non-billable work within the intake process, and saddling Techs with it reduces their real value and contributions to the Team.

 

Besides, it provides a better Client experience if the path to a Tech is streamlined.  By that, I mean Managed Services Clients’ End User requests should go straight to a Level 1 help desk person, who is better equipped to handle End User requests than the more Sr. Technicians.

 

Managed Service Liaisons, on the other hand, should go through a Service Coordinator, as their types of requests require a more Sr. Tech, and one that you do not want hanging around waiting for something to do.

 

Non-Managed Service Clients should always go through a Service Coordinator as they are not paying for expedited service.

 

**Note:** Should a Service Coordinator be a Billable or a Non-Billable person? That depends on their career goal.

 

Non-Billable people will learn the jargon, how to do the job, and be great at Client Relationships, empowering them to shepherd all Client requests from “New” to “Complete.”  But they will not know as much as a Tech would. 

 

However, a Tech who is focused on being on the front lines, or on becoming a Project Engineer someday, will always be focused on engaging rather than shepherding. A Tech who is late in their career and looking to slow down, or at least get out of carrying the pager, makes the best Service Coordinator.

 

Reopen Rate:

A high reopen rate, above 5%, is a sign that the Tech is either in over their head or cutting corners.  At heart, Techs want to do a good job:

1)     Meet Clients’ Expectations

2)     Fix it right the first time and provide high quality solutions

3)     Not be bothered with rework or reopens

 

If a Tech’s reopen rate is above 5%, this is a great opportunity for a positive, collaborative coaching/mentoring conversation.

 

First Contact Resolution:

First Contact Resolution is very different from First Call Resolution.  First Call Resolution is a KPI that comes from Enterprise IT, where Agents sit around all day in a call center answering the phone and reading scripted solutions.  If they do not have a script that resolves the person’s problem in one call, they need to escalate it to someone who thinks.  Now tell me, does that sound like anyone who works for an MSP? I think not (which is why I have never managed a Call Center).

 

First Contact Resolution allows the Tech to think.  By that, I mean not declaring victory without checking back to validate the solution, or without scheduling the reboot/update for after-hours work.

 

So, First Contact Resolution allows up to two Time Entries by the same Tech to still count as a good job.  More than two Time Entries, or escalating to a more Sr. Technician, means the engagement was not resolved on First Contact.  I know you are dying to ask: the industry average is around 85%, compared to almost 99% for First Call Resolution in an Enterprise IT Call Center.  My goodness, are these two different worlds!
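The two-Time-Entry rule above can be sketched in a few lines. This is an illustration of the rule as described, not an Autotask query; the ticket records here are just lists of the Tech names on each Time Entry:

```python
# First Contact Resolution rule from the article: up to two Time Entries
# by the SAME Tech still counts as resolved on first contact; a third
# entry, or a second Tech, does not.  Data shape is illustrative.
def first_contact_resolved(time_entries):
    """time_entries: list of Tech names, one per Time Entry on a ticket."""
    return len(time_entries) <= 2 and len(set(time_entries)) == 1

def fcr_rate(tickets):
    """Fraction of tickets resolved on first contact."""
    return sum(first_contact_resolved(t) for t in tickets) / len(tickets)

tickets = [
    ["Ann"],                # fixed in one entry: FCR
    ["Ann", "Ann"],         # check-back entry by the same Tech: still FCR
    ["Ann", "Ann", "Ann"],  # three entries: not FCR
    ["Ann", "Bob"],         # escalated to a second Tech: not FCR
]
print(f"FCR rate: {fcr_rate(tickets):.0%}")
```

Note how the rule deliberately permits the second entry for the validation check-back or the after-hours reboot, which is exactly what separates First Contact from First Call Resolution.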

 

That wraps up the 5 leading indicators.  They are called leading indicators because if you get out in front of Tech performance by finding, benchmarking, tracking, and leveraging these KPIs, the other performance indicators, such as Resource Utilization and Real-Time Time Entry, will improve once the Culture embraces these Tech KPIs.

 

The Lagging Indicator:

The lagging indicator of Tech Performance is Client Survey Score per Tech. We all know the only vote that counts is the Client’s. Having them weigh in on Tech performance is a logical step.  Keep in mind that the Client Surveys are their response to a single engagement.

 

A good example of this comes from an MSP owner who pulled me aside one time and told me about a Net Promoter Score interview with a Client who gave them a Raving Review and referred the MSP to several prospects.  During the interview, the MSP owner asked the Client why, if they were raving fans, they were so disappointed in the Survey Score.  The Client explained that on that particular request the MSP pooched the engagement, but 99% of the time the MSP knocked it out of the park.

 

There you go: Survey Scores are more about the Tech’s performance, while the Net Promoter Score is about the MSP Delivering Superior Service to the Client.

 

Next week, we will wrap up Phase II of this series by discussing how to roll all of these KPIs into one Tech Performance Score using (what else?) the Tech Balanced Scorecard report.

 

I hope our little trip down that yellow brick road has been helpful. Every day the life of an MSP brings challenges and opportunities (and lions, tigers, and bears), and we’re here to help you through it all. If you have any questions, concerns, or buckets of water, feel free to email us at info@AGMSPCoaching.com.

 

The elephant in the room:  

Who is Advanced Global, and why should we listen to them?

1.     Advanced Global MSP Coaching is the Autotask Service Delivery Authority.

2.     We Guide MSPs to use more of the Autotask Software to drive Operational Improvements. 

3.     The first step is a FREE AUTOTASK PSA CONFIGURATION EVALUATION

 

Get started today.  Waiting is costing you $600 per Tech per Week, and we can help you recoup that loss, Guaranteed or your Money Back.  For more information, email us at info@AGMSPCoaching.com

 

Steve & Co 

Stephen Buyze

President of Advanced Global MSP Coaching
