Using Data for Better Learning Programs

As learning and development professionals, we look for confirmation that our learning programs make a positive difference, but we’re also answerable to our business to show that L&D resources are a good investment. And so, we turn to data to demonstrate the impact of what we’ve built and discover ways to improve our learning programs.

Sometimes, when trying to use data to improve learning programs, we may think that we don’t have the right data. Not everyone has highly sophisticated systems for metric tracking and analysis. How can those of us with simpler systems use our data to glean insights? At other times, we react to one piece of compelling data before confirming that it tells the whole story. We need to make sure we are getting a complete picture before we use data to strategize improvements.

Here are three suggestions for using data to make strategic improvements to your learning programs:

Know what you want to achieve

Know what problem you are trying to solve before you turn to the data. Without defining the scope of your work upfront, sorting through raw data soon becomes unwieldy and counterproductive. Be very clear about what you want to accomplish: beyond understanding the issue, you need to interpret the data to answer a question. Defining the right questions can be a tough, iterative process. Look at the problem from multiple angles and come up with a collection of questions to bring to the data.

Sometimes organizations face frustratingly low enrolment. Trying to understand the cause, we may pore over LMS engagement and completion data, but this may not yield an explanation. We may need to ask a more basic question: “How are the people who enrol finding us?” It often takes precise questions to get precise answers out of raw data. New learners sometimes enrol in waves after all-hands meetings where leaders stress the importance of specific upskilling, and that emphasis can translate directly and immediately into program enrolment. It is therefore worth coordinating with leaders and structuring programs around the priorities set at those meetings. With this insight, we can evolve from cancelling sessions because of low enrolment to adding sessions to accommodate long waiting lists, making the program more relevant to the business.
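To make this concrete, here is a minimal sketch of how that question might be put to the data, assuming a hypothetical LMS export named enrolments.csv with an enrolled_on date column and a hand-maintained list of all-hands meeting dates; your own export will almost certainly look different.

    # Minimal sketch: do new enrolments arrive in weekly waves, and do those
    # waves line up with all-hands meetings? The file name, column name, and
    # meeting dates below are hypothetical placeholders.
    import pandas as pd

    enrolments = pd.read_csv("enrolments.csv", parse_dates=["enrolled_on"])
    all_hands = pd.to_datetime(["2024-01-15", "2024-04-08", "2024-07-01"])

    # Count new enrolments per calendar week.
    weekly = (
        enrolments.set_index("enrolled_on")
        .resample("W")
        .size()
        .rename("new_enrolments")
        .to_frame()
    )

    # Flag the weeks that contain an all-hands meeting for easy comparison.
    meeting_weeks = set(all_hands.to_period("W"))
    weekly["all_hands_week"] = weekly.index.to_period("W").isin(meeting_weeks)

    print(weekly.tail(12))

A simple week-by-week view like this is often enough to see whether enrolment spikes follow leadership communications, without any sophisticated analytics tooling.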

Get creative with your data sources

Once you are clear about what you want to accomplish, the next step is to inventory your data sources. What quantitative (LMS data, training hours, assessment scores, and so forth) and qualitative (surveys, interviews, focus group notes, observations, and the like) data are at your disposal? Get innovative. Look beyond traditional learning metrics and beyond how you currently collect data. There may be opportunities to create fresh data sources or to talk to other departments that learners work with to find out how they are applying their learning in the organization.

Think creatively about social media. The polling features of social media sites can be used to create knowledge assessments and reaction surveys, and users’ social comments provide a wealth of qualitative data about their learning experience. Using this data, we can discern learning gaps and confidence gaps and address them with program improvements. Certainly, social media isn’t suitable for every learning program, but don’t be afraid to use new resources to capture data, even if it means a new way of doing things.
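As an illustration, here is a minimal sketch of how exported social comments could be tallied against a few themes, assuming a hypothetical comments.csv file with a free-text comment column; the keyword lists are placeholders you would tune to your own program and audience.

    # Minimal sketch: count how often exported social comments touch on a few
    # themes (for example, learning gaps vs. confidence gaps). The file,
    # column, and keyword lists are hypothetical and would need tuning.
    import pandas as pd

    comments = pd.read_csv("comments.csv")
    text = comments["comment"].fillna("").str.lower()

    themes = {
        "learning_gap": ["confused", "unclear", "didn't understand", "lost me"],
        "confidence_gap": ["nervous", "hesitant", "not confident", "worried"],
        "application": ["on the job", "with my team", "in practice"],
    }

    counts = {
        theme: int(text.str.contains("|".join(words)).sum())
        for theme, words in themes.items()
    }

    print(pd.Series(counts).sort_values(ascending=False))

Even a rough tally like this can suggest whether comments point to gaps in understanding or gaps in confidence, which you can then confirm with more targeted questions.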

Dig until your data is complete

It’s easy to react to limited data before we get the whole story. When your data suggests something, dig deeper until you fully understand what’s happening and its context before jumping to solutions. The goal is to make meaningful improvements, not ineffective changes.

Sometimes learner surveys indicate that more videos would improve the curriculum. We might speculate that learners found there was too much text and needed videos to stay engaged. But the reality could be different: it may turn out that they simply wanted more examples of how to apply what they learned on the job and assumed videos would be the most effective way to deliver them. Once we know this, we can incorporate the examples learners need in low-tech ways. By digging until we understand the real problem, we avoid costly mistakes. And, more importantly, we can use what we learn to improve the program in a meaningful way that supports learners in applying their learning on the job.
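A minimal sketch of that kind of digging, assuming a hypothetical survey export named survey.csv with a free-text feedback column, might look like this; the file name, column name, and keywords are illustrative only.

    # Minimal sketch: of the learners who asked for videos, how many also
    # mentioned examples or on-the-job application? The file name, column
    # name, and keywords are hypothetical placeholders.
    import pandas as pd

    survey = pd.read_csv("survey.csv")
    feedback = survey["feedback"].fillna("").str.lower()

    wants_videos = feedback.str.contains("video")
    mentions_application = feedback.str.contains("example|apply|on the job")

    print("Asked for videos:", int(wants_videos.sum()))
    print("Also mentioned examples or application:",
          int((wants_videos & mentions_application).sum()))

If most of the video requests also mention examples or application, that points to the underlying need rather than the stated fix, and it can often be met without producing any video at all.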

Conclusion

Access to different types of data (LMS engagement and completion data, social media engagement data, and learner survey data) can point you toward many solutions. Data comes in an abundance of useful forms, and we rarely need to feel that our data is inadequate to provide insight. However, if you aren’t getting the information you need, take the opportunity to get creative.

Do you use data to improve your learning programs? What types of data do you find most useful? What unconventional learning metrics do you use?
