The Rievent Blog

Exploring the latest developments in continuing professional
education and LMS technologies.

CME Outcomes Assessment is Easier Than You Think

by The Rievent Team

Outcomes Reporting

Outcomes are a perennial topic of discussion among CME providers. As the ACCME continues to emphasize outcomes reporting, you may feel increasing pressure to measure outcomes consistently and accurately. That probably sounds challenging, and it can be.

But when you’ve got the right CME technologies in place, it’s actually pretty simple.

How to automate CME outcomes reporting

You can’t shadow your learners to see if they’re applying the knowledge they’ve acquired through CME, but you can follow up with them to see how things are going. By adding an automated post-activity outcomes survey to CME activities at the outset, you can gather data on:

  • How learners are applying the insights they gained through CME
  • To what extent the information in a given activity has benefited patients
  • Whether learners have changed their practice in an impactful way
  • Patient health improvements resulting from the learner’s CME participation

How easy will it be to add these questions in survey form? It all depends on the capabilities of your CME learning management system (LMS). Ideally, you’ll be working with a system that supports surveys and/or assessments both before and after a learner’s participation in the activity.

These surveys shouldn’t be confused with activity post-tests, which learners complete immediately after finishing an activity. Instead, they serve as a pre-participation diagnostic and a post-participation outcomes assessment. The process breaks down as follows:

1. Create the activity in your LMS

Add your content, create the post-test, and assign credit types and amounts to the activity. Outcomes surveys are part of the activity building process, so you’ll want to prepare the outcomes questions at the same time.

2. Add a pre-test or pre-survey to the activity

This is the diagnostic tool you’ll give learners before they begin an activity. Make sure it covers all of the major takeaways from the activity and gauges learners’ current understanding of the subject matter.

Ultimately, you’ll be able to compare these responses to the ones learners provide during the outcomes assessment. While you can also compare them to an activity’s post-test results, our main concern here is with what a diagnostic survey reveals long after learners have completed an activity. We want to see whether they’re compelled to apply their newfound knowledge in a clinical setting – and whether doing so improves patient health.

3. Add an outcomes survey to the activity

This survey can be completely different from your pre-test or survey, somewhat different, or identical. As long as it helps you gauge whether patient health has improved as a result of learners’ CME participation, it will serve its purpose.

4. Automate distribution of the outcomes survey

Here’s where having a comprehensive LMS really counts. You shouldn’t need to distribute your outcomes survey manually. Instead, you should be able to set it for automatic distribution a specified number of days after each learner completes the activity.

In other words, outcomes surveys should be automated. You should be able to create them and specify distribution variables at the same time you create the activity.
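The scheduling logic behind that kind of automation is straightforward. Here’s a minimal sketch in Python of how an LMS might decide when each learner’s outcomes survey goes out; the function and variable names are purely illustrative, not a real LMS API, and the 60-day delay is just an example value.

```python
from datetime import date, timedelta

# Illustrative only: distribute the outcomes survey a fixed number of
# days after each learner completes the activity.
OUTCOMES_SURVEY_DELAY_DAYS = 60  # e.g., follow up two months later

def outcomes_survey_send_date(completion_date: date,
                              delay_days: int = OUTCOMES_SURVEY_DELAY_DAYS) -> date:
    """Return the date the outcomes survey should be distributed."""
    return completion_date + timedelta(days=delay_days)

def surveys_due(completions: dict[str, date], today: date) -> list[str]:
    """Learners whose outcomes survey is due on or before `today`."""
    return [learner for learner, completed in completions.items()
            if outcomes_survey_send_date(completed) <= today]

# Example: two learners who completed the activity on different dates.
completions = {"learner_a": date(2024, 1, 10), "learner_b": date(2024, 3, 1)}
due = surveys_due(completions, today=date(2024, 3, 15))
```

In this sketch, only `learner_a` would receive the survey by March 15, since their 60-day follow-up window has elapsed; `learner_b`’s has not. In a real system, the LMS would run this check daily and email each learner whose survey comes due.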

Hurdles to clear

When it comes to gauging outcomes, you’re almost always in a challenging position. You’ve got to measure clinician performance that:

  • Occurs after learners have completed the CME activity – sometimes a long time afterward
  • Relates to the content of a particular CME activity
  • Is subject to the hazards of self-reporting, including learners’ subjectivity and (in the absence of concrete health data) individual perception
  • Can, in theory, be influenced by factors other than your CME activity; these factors are almost invariably outside your control

What’s more, you’ve got to start with CME technology that allows for outcomes reporting in the first place. And let’s not forget about response rates: you’ll need enough learners to actually respond to the outcomes survey. Otherwise, it will be tough to build a data set that informs your efforts to develop effective CME activities.

But when the dust settles and the responses start coming in…

The benefits of outcomes surveys speak for themselves.

It’s true. By creating quality, comprehensive outcomes surveys with automatic distribution following activity completion…

  • You’ll know a whole lot. Without outcomes data, it can be impossible to know whether a CME activity is having a real-world impact. Outcomes surveys fill that knowledge gap.
  • You’ll get new ideas. When the content of an activity really clicks – when an overwhelming majority of learners are drawing from your CME to improve patient health on a significant scale – you’ll know about it. And you can use the insights from that activity to develop new activities that achieve similar results.
  • You might qualify for accreditation with commendation. Criteria 36-38 of the ACCME’s “Getting Started With Commendation” literature specify that a CME provider must “achieve” outcomes to a certain extent to be eligible for this level of accreditation. With the right data in hand, you may have what you need to qualify.

Of course, the most important benefit of outcomes surveys doesn’t go to you. It doesn’t go to your learners either. It goes to the patients whose lives are improved – or perhaps even saved! – by the education you’re providing to clinician learners.

All it takes is a simple revamp of your activity design procedures, one or two additional surveys, and a commitment to CME technologies that serve your learners.