Data Makes Continuing Education Relevant and Valuable


For many continuing education (CE) providers, offering CE to learners isn’t just a member benefit or an ancillary service – it’s a major investment. If you’re in that camp, achieving a positive return on education (ROE) depends heavily on your learners’ experience. In particular, it hinges on whether they find your educational content relevant and valuable.

How do you know you’re meeting those benchmarks?

To get a sense of the learner’s perception of your CE activities, you need to collect data related to their participation and performance. You also need to gauge their subjective assessment of every activity, using their responses to inform your efforts going forward.

The result? Happier, more engaged learners who recognize you as a go-to source for continuing education.

Ask your learners what they need

You might already ask learners to evaluate your CE activities, but how do you go about providing, collecting, and collating those evaluations?

The bottom line is that you have to ask learners what was valuable (or not valuable) about their experience. You also need to ask them what they need to get out of CE activities like the one they just completed. After that, you need an efficient way to collect every response from every learner and use them to plan future activities and modify existing ones.

Manual processes can turn the seemingly simple task of obtaining and using learner evaluation responses into an administrative hassle. Even if you manage to collect evaluations from every learner – and you might not get everyone to fill them out – you’ve still got to organize the responses and analyze them. It takes time.

A better approach is to automate the evaluation process within your learning management system (LMS). The software should prompt a learner to complete an evaluation prior to requesting credit. That way, you don’t have to send them an evaluation at a later date.

By automating evaluations, you gain a better overall picture of what all your learners need. You can also access their responses right away, eliminating lag time between requesting evaluations and generating usable response data.

Identify areas where learners excel

Which CE activities have the highest test scores? Which ones have the lowest scores – and what do the numbers mean?

High test scores and a high participation rate might mean that an activity’s content is engaging and valuable. Learners excel because the material is interesting and challenging. On the other hand, scores might be uncharacteristically high because a test isn’t challenging enough. Analyzing participation data alongside learners’ evaluation responses can help you know for sure.

The best way to collect and analyze participation data is through reporting tools tied into your learners’ CE experience. You can view granular data at the learner or activity level and uncover insights about the relevance and value of a given CE activity.

Needless to say, this approach works great for online CE activities like journal articles and enduring materials. But it works for live events, too. Draw on data from integrated eCommerce tools to see which types of learners register for specific types of events and how often they’ve attended similar events hosted by your organization.

The more tightly integrated your reporting tools are with your learner experience, the easier it is to identify strengths and weaknesses in your CE activities.

Automate your outcomes assessments

If you’re a CME provider, outcomes assessments aren’t just important for verifying whether you’ve reached Levels 6 and 7 of Moore’s Outcomes Taxonomy; they’re also valuable for assessing the relevance and usefulness of an activity for an individual learner.

But how do you assess outcomes long after a CE activity is complete? If you handle things the way the Boston University School of Medicine does, you’ll ask learners to “commit to change” following a CE activity. Then you’ll send them an outcomes survey weeks or months after the activity is complete.

You can automate all of this, of course. Encourage learners to commit to change when they complete their evaluations – just before they request credit. Inside your LMS, you can create a post-activity outcomes survey that learners receive via automated email at a specified time after they participate in the activity.

Did they follow through on their commitment to change? The survey will tell you.

Providing valuable CE activities is good business

It can be tough to be sure your CE activities offer the relevance and value your learners demand. By making a commitment to data collection and analysis (and investing in technologies that make it possible), you can start basing decisions related to CE content on real participation data and subjective learner evaluations.

It’s just good business. You need learners to find value in your educational content so they continue participating. The more you know about the learner experience, the better you can make it.