Digital surveys. Online polls. Course data and evaluations. They all use numbers to paint the most accurate picture of course effectiveness, right? But while course evaluation can be a bit of a numbers game, relying only on digital sources might mean missing out on some of the most valuable information available: the stuff that comes straight from your audience’s mouths.

Consider the people when you’re building and designing your next custom eLearning program, and you might find that spending some one-on-one time with learners gives you priceless data you’ll never get from numbers alone.

Crowdsourcing Data Mining

The idea of going straight to the source – the actual learners – for evaluation data can seem overly simple. After all, you probably have spreadsheets full of data on everything from login times to completion rates and even pass-through rates. But while that numerical data can be extremely helpful in structuring future courses, only the learners themselves can offer a subjective opinion, telling you how your eLearning methods fared, what worked, what didn’t and how to improve in the future.

When you talk to the people actually experiencing your eLearning initiatives, you can uncover valuable information that is practically impossible to extract from digital sources, like:

  • Each learner’s preferred learning style
  • Existing knowledge base
  • Desired delivery methods
  • Level of engagement
  • Opinions and ideas for future courses

Look beyond the hard exterior of the numbers and you’ll probably find that learners are your best resource for information on how you’re doing – and how to improve.

Gathering Info

If you’re going to talk to actual human beings about your eLearning initiatives, where can you start? Chances are, you won’t need to go very far to find people willing to talk to you about L&D.

Soft launches or pilot groups can be effective before you push a new program out to the masses. Assemble a group of users who reflect the type of learner your program targets, and that pilot group can help weed out errors, identify potential problems and offer suggestions for building a better module.

Another great source of information? Department supervisors. A supervisor has her finger on the pulse of her department, so she may hear inside information that learners are less inclined to share with upper management. Ask department supervisors what they’re hearing and you can adjust accordingly.

Finally, don’t discount the age-old technique of simply talking to people. Most are willing to share opinions and ideas; you just have to ask. Put the question to some of your core user group and you’ll mine information that numbers simply cannot offer.

Sure, number-based data and evaluation can give you an impartial view of who’s completing courses and how they’re doing. But data alone can’t provide the big-picture information you’ll need to build the best training possible. Giving users a forum to offer feedback – and then taking that eLearning feedback into consideration – could be the difference between lifeless efforts and powerful training.