
Sunday, August 7, 2016

It's Time to Solve “The Last Mile Problem” in L&D


The telecommunications industry has had to confront a challenge referred to as “The Last Mile Problem.” The phrase describes the disproportionate effort and cost required to connect the broader infrastructure and communications network to the customer who resides in the “last mile.”


Learning Measurement also has a “Last Mile” problem. We have a wealth of data, automated reporting, and data analysts who attempt to make sense of the data. Yet, as I observe and work with L&D organizations, I continually witness “Last Mile Problems” where the data doesn’t create new insights, enable wise decisions, or drive meaningful action. If we want our investments in measurement to pay off, we first need to recognize that we have a problem and then fix it.


Do you have a “Last Mile” problem? Here are six indicators:
1.    One-size-fits-all reporting
2.    Mismatch between reporting and decision-making cadence
3.    Lack of context for assessing performance
4.    Poor data visualizations
5.    Little attention to the variability of the data
6.    Insufficient insight that answers the “so what?” question

Let’s explore each indicator with a brief discussion of the problem and its consequences for learning measurement.

1.    One-size-fits-all reporting

While L&D organizations generate a plethora of reports, too few tailor those reports to the needs of their various audiences. Each audience, from senior leaders to program managers to instructors, has a unique perspective and requires different data to inform decisions.

In an age of personally targeted ads on Facebook (creepy as they may be), we each want a customized, “made for me” experience. A single report that attempts to address the needs of all users will frustrate everyone and meet the needs of no one. Ultimately, the one-size-fits-all approach leads users to request ad hoc reports, create their own shadow processes, or simply fall back on gut feel because the data provided isn’t very useful.

2.    Mismatch between reporting and decision-making cadence

Every organization has a cadence for making decisions, and that cadence varies by stakeholder and by the decisions each one makes. Instructors and course owners need data as soon as a course is completed. Course owners also need monthly data to compare performance across the courses in their portfolio. The CLO needs data monthly but may also want a report whenever a particular measure rises above or falls below a specific threshold.

In a world of high-velocity decision making, decision makers need the right information at the right time. When the reporting cycle is mismatched with the timing of decision making, reports become ineffective as decision-making tools. The result? See #1: shadow processes and ad hoc reports spring up to address decision needs.

3.    Lack of context to assess performance

Many L&D reports present data without any context for what ‘good’ looks like.  The reports display the data, perhaps comparing across business units or showing historical performance. However, too many reports do not include a goal, a performance threshold or a benchmark.

Leaders don’t manage to trends, nor do they manage to comparative data. Without specific performance goals or thresholds, L&D leaders lose the ability to motivate performance on the front end and have no basis for corrective action on the back end. These reports may be interesting, but they ultimately produce little action.

4.    Poor data visualization

Data visualization is a hot topic and there is no shortage of books, blogs and training courses available. Despite the abundance of information, L&D’s charts and graphs appear not to have received the memo on the importance of following data display best practices.

Look at the reports you receive (or create). Do your visuals include unnecessary flourishes (also known as chart junk)? Do they contain 3-D charts? Do they present pie charts with more than five slices? Do your annotations simply describe the data on the chart or graph? If you answered "yes" to any of these questions, then you are making the last mile rockier and more challenging for your clients than it needs to be.

No one wants to work hard to figure out what your chart means. They don’t want to struggle to discern if that data point on the 3D chart is hovering near 3.2 (which is how it looks) or if the value is really 3.0.  They won’t pull out a magnifying glass to see which slice of the pie represents their Business Unit.

Poor visualizations fail to illuminate the story and make it more difficult for your users to gain new insights or make decisions. Over time, your readers shut down and opt to get their information elsewhere.
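
To make this concrete, here is a minimal sketch in Python with matplotlib of one common fix: replacing a many-slice pie chart with a sorted horizontal bar chart so readers can compare values at a glance. The business unit names and satisfaction scores are hypothetical.

```python
# A minimal sketch: swap a crowded pie chart for a sorted horizontal bar
# chart. The business units and scores below are hypothetical examples.
import matplotlib.pyplot as plt

# Hypothetical average course-satisfaction score per business unit.
scores = {
    "Sales": 4.1, "Engineering": 3.6, "Finance": 4.4,
    "Operations": 3.9, "Marketing": 4.2, "HR": 3.8, "Legal": 3.5,
}

# Sort ascending so the largest value lands on top of the chart;
# readers scan a ranked list far faster than they decode pie slices.
units, values = zip(*sorted(scores.items(), key=lambda kv: kv[1]))

fig, ax = plt.subplots(figsize=(6, 4))
ax.barh(units, values, color="steelblue")
ax.set_xlabel("Average satisfaction (1-5)")
ax.set_title("Course satisfaction by business unit")

# Strip chart junk: no 3-D effects, no gratuitous borders.
ax.spines["top"].set_visible(False)
ax.spines["right"].set_visible(False)

plt.tight_layout()
plt.show()
```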

5.    Little attention to the variability of the data

As the L&D community increasingly embraces measurement and evaluation, it has also ratcheted up its reporting. Most L&D organizations publish reports with activity data, effectiveness scores, test results, cost data and often business results.

"The Signal and the Noise"

What we rarely consider, however, is the underlying variability of the data. Without quantifying that variability, we may over-react to changes that are noise or under-react to changes that look like noise but are in fact signals. We do our users a disservice by not revealing the normal variability of the data they use to guide their decisions.
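
As a concrete illustration, here is a minimal sketch in Python that quantifies normal variability with the classic mean plus-or-minus three standard deviations and classifies new results as likely noise or a signal worth investigating. The monthly effectiveness scores are hypothetical.

```python
# A minimal sketch: quantify normal variability from historical data, then
# judge new results against it instead of reacting to every movement.
# The scores are hypothetical monthly course-effectiveness averages.
from statistics import mean, stdev

baseline = [4.2, 4.1, 4.3, 4.0, 4.2, 4.4, 4.1, 4.2, 4.3]

center = mean(baseline)        # 4.20
sigma = stdev(baseline)        # ~0.12
upper = center + 3 * sigma     # ~4.57
lower = center - 3 * sigma     # ~3.83

def classify(score: float) -> str:
    """Label a new score as likely noise or a signal worth investigating."""
    if lower <= score <= upper:
        return "within normal variation - likely noise"
    return "outside control limits - a signal, investigate"

for score in (4.0, 4.5, 3.4):
    print(f"{score}: {classify(score)}")
```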

6.    No answer to the “so what?” question

This last mile problem is perhaps the most frustrating. Assume you have addressed the other five issues: you have role-relevant reports delivered on your stakeholders’ decision cycles, goals and exception thresholds grounded in data variability, and great visualizations.

Unfortunately, we all too often present data “as is” without any insights, context, or meaningful recommendations. In an attempt to add something that looks intelligent, we occasionally append text that merely describes the graph or chart (e.g., “This chart shows that our learning performance declined in Q2.”). Perhaps we think our audience knows more than we do about what is driving the result. Often, we are rushed to publish the report and don’t have time to investigate underlying causes. We rationalize that it’s better to get the data out there as is than to publish it late or not at all.

Amanda Cox, the New York Times Graphics Editor, once said: “Nothing really important is headlined, ‘Here is some data. Hope you find something interesting.’”

If we are going to shorten the last mile, then we have an obligation to highlight the insights for our users. Otherwise, we have left them standing on a lonely road hoping to hitch a ride home.

Recommendations to solve the “Last Mile” problem

There is a lot you can do to address the “Last Mile Problem” in L&D. Here are six suggestions that can help:
  • Create a reporting strategy. Consider your audiences, the decisions they need to make and how to present the data to speed time to insight. Include in your reporting strategy the frequency of reports and data to match the decision-making cadence of each role.
  • Identify a performance threshold for every measure on your report or dashboard that you intend to actively manage.  In the beginning, use historical data to set a goal or use benchmarks if they are available. Over time, set stretch goals to incent your team to improve its performance.
  • Learn about data visualization best practices and apply them.  Start with Stephen Few’s books. They are fun to read and even if you only pick up a few tips, you will see substantive improvements in your graphs and charts.
  • Periodically review the variability of your data to see if its behavior has changed. Use control charts (they are easy to create in Excel) for highly variable data to discern when your data has moved above or below your control limits; a minimal code sketch follows this list.
  • Add meaningful annotations to your graphs and charts. Do your homework before you present your data. If a result looks odd, follow up and try to uncover the cause of an anomaly.
  • For senior leaders, in particular, discuss the data in a meeting (virtual or face-to-face). Make the meeting about exploring and understanding the findings and agreeing on actions and accountability for follow up. Use these meetings to create a deeper understanding of how you can continually improve the quality of your reporting.
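
As a companion to the control chart suggestion above, here is a minimal sketch of a control chart in Python with matplotlib rather than Excel. The monthly scores are hypothetical; the limits are the mean plus-or-minus three standard deviations of the first nine “baseline” months, and any later point outside the limits is highlighted for follow-up.

```python
# A minimal control-chart sketch (hypothetical monthly effectiveness scores).
# Limits come from a baseline period; later points are judged against them.
import matplotlib.pyplot as plt
from statistics import mean, stdev

scores = [4.2, 4.1, 4.3, 4.0, 4.2, 4.4, 4.1, 4.2, 4.3, 4.0, 3.4, 4.2]
months = list(range(1, len(scores) + 1))

baseline = scores[:9]                  # first nine months set the limits
center = mean(baseline)
sigma = stdev(baseline)
upper, lower = center + 3 * sigma, center - 3 * sigma

fig, ax = plt.subplots(figsize=(7, 4))
ax.plot(months, scores, marker="o", color="steelblue")
ax.axhline(center, color="gray", label=f"Mean {center:.2f}")
ax.axhline(upper, color="firebrick", linestyle="--", label=f"UCL {upper:.2f}")
ax.axhline(lower, color="firebrick", linestyle="--", label=f"LCL {lower:.2f}")

# Flag out-of-limit points so the eye goes straight to the signal.
flagged = [(m, s) for m, s in zip(months, scores) if not lower <= s <= upper]
if flagged:
    xs, ys = zip(*flagged)
    ax.scatter(xs, ys, color="firebrick", zorder=3)

ax.set_xlabel("Month")
ax.set_ylabel("Effectiveness score")
ax.set_title("Course effectiveness with control limits")
ax.legend(loc="lower left")
plt.tight_layout()
plt.show()
```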

If you take these recommendations to heart, your last mile could well shrink to a short, manageable distance.

Have you experienced a “Last Mile” problem? I'd love to hear from you. Follow me on Twitter @peggyparskey or connect with me on LinkedIn at www.linkedin.com/in/peggy-parskey