Question-level data: Are you leveraging your assessment insights effectively?
For your learners, eLearning assessments conclude the moment they triumphantly enter, select, or drag and drop their final answer. For your learning designers, on the other hand, the end of an assessment yields a collection of score-related data that has the power to take your learning design to the next level.
In an immediate and practical sense, this kind of data can provide a handy insight into the technical health of your assessments themselves. If everyone’s pass rate looks eerily similar, for example, down to the exact percentage of correct answers, you might need to ensure your assessment (and the question banks that power it) is behaving as it should.
That doesn’t mean score and assessment data are only useful from a diagnostic point of view—far from it! Once you’ve established that your eLearning authoring tool is firing on all cylinders, you can turn to the question of whether your learners benefit from your quizzes and the eLearning courses that underpin them. Read on for a whistle-stop tour of assessment data and reporting.
Why pay attention to assessment data?
When it comes to creating your assessments, good infrastructure and a robust set of authoring options are only part of the story.
After all, if you’ve gone to the trouble of providing your learners with a healthy mix of randomized questions, engaging question types, and insightful inline feedback, you’ll want to make sure your assessments are actually helping users to learn! That’s where scoring data comes in.
An authoring tool like Gomo will provide you with a huge amount of assessment-related data by reporting back to your LMS. The level of detail you receive, however, depends on the standard your LMS supports.
Assessments involve a lot of moving parts. Why not take a look at our breakdown of the most important elements?
13 things you should know about eLearning assessments and quizzes
High standards: What kind of data can your LMS receive?
The ever-popular SCORM 1.2 allows a course to report a simple (but still highly useful) set of data to the effect of “this person did this course and got this score”. While this isn’t the most detailed information in the world, it’s still a great insight into the broad strokes of your learners’ progress.
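If you’re curious what that report looks like under the hood, here’s a minimal TypeScript sketch of the kind of SCORM 1.2 runtime calls a course might make once a learner finishes a quiz. It assumes the course has already located the LMS-provided API object (declared here as a constant), and the score and status values are purely illustrative.

```typescript
// Minimal sketch of a SCORM 1.2 score report, assuming the course has already
// located the LMS-provided runtime object (exposed here as a declared constant).
declare const API: {
  LMSInitialize(param: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(param: string): string;
  LMSFinish(param: string): string;
};

function reportScore(rawScore: number, passed: boolean): void {
  API.LMSInitialize("");

  // SCORM 1.2 only has room for course-level results: a raw score
  // and an overall status, not per-question detail.
  API.LMSSetValue("cmi.core.score.raw", String(rawScore));
  API.LMSSetValue("cmi.core.lesson_status", passed ? "passed" : "failed");

  API.LMSCommit("");
  API.LMSFinish("");
}

// e.g. called after the learner submits their final answer
reportScore(80, true);
```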
More advanced standards like SCORM 2004 and xAPI, meanwhile, can take you much deeper.
If your LMS supports these standards, it’ll receive in-depth, question-level data. That means every learner’s answer to every question (and every occasion in which a score is triggered) gets recorded and neatly packaged for your inspection.
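As a rough illustration, here’s what a single question-level record might look like as an xAPI statement. The field names follow the xAPI specification, but the learner, activity IDs, question, and response shown here are invented for the example; your LMS or LRS will record real equivalents for every answer your learners submit.

```typescript
// A hypothetical xAPI "answered" statement for one quiz question.
// The structure follows the xAPI spec; the specific values are invented.
const answeredStatement = {
  actor: {
    objectType: "Agent",
    name: "Example Learner",
    mbox: "mailto:learner@example.com",
  },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/answered",
    display: { "en-US": "answered" },
  },
  object: {
    objectType: "Activity",
    id: "https://example.com/courses/fire-safety/quiz/question-3",
    definition: {
      type: "http://adlnet.gov/expapi/activities/cmi.interaction",
      interactionType: "choice",
      description: { "en-US": "Which extinguisher is safe for electrical fires?" },
    },
  },
  result: {
    success: false,          // the learner picked a wrong option
    response: "water",       // the option they actually chose
    score: { raw: 0, min: 0, max: 1 },
  },
  timestamp: "2024-05-01T10:15:00Z",
};

// In practice, the course's xAPI layer sends statements like this to your
// LRS's statements endpoint, where they're stored for reporting.
```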
How to interpret question-level data
It’s one thing to receive assessment data reports, but what kind of insights can they reveal?
Let’s suppose that a given course (or, if your LMS supports xAPI statements, a given question) frequently attracts wrong answers. There are a few different possibilities here:
- The question (or assessment) is too difficult
- The course materials that precede your assessment don’t provide enough memorable information
- Your learners are just guessing!
Identifying the root cause of these issues will inform the requirements for future course revisions, leading—if you play your cards right—to improved learner outcomes. Whatever the cause may be, it all begins with the assessment data that kickstarts your investigation.
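To make that investigation concrete, here’s a minimal sketch, assuming you’ve already exported question-level results from your LMS into a flat list, of how you might flag questions that learners get wrong unusually often. The QuestionResult shape, the 50% threshold, and the sample data are all illustrative assumptions rather than anything your reporting tools prescribe.

```typescript
// Flag questions whose success rate falls below a chosen threshold.
// QuestionResult is an assumed shape for exported question-level results.
interface QuestionResult {
  questionId: string;
  success: boolean;
}

function flagDifficultQuestions(
  results: QuestionResult[],
  maxSuccessRate = 0.5,
): { questionId: string; successRate: number }[] {
  const totals = new Map<string, { correct: number; attempts: number }>();

  // Tally attempts and correct answers per question.
  for (const r of results) {
    const entry = totals.get(r.questionId) ?? { correct: 0, attempts: 0 };
    entry.attempts += 1;
    if (r.success) entry.correct += 1;
    totals.set(r.questionId, entry);
  }

  // Keep only the questions at or below the threshold, worst first.
  return [...totals.entries()]
    .map(([questionId, { correct, attempts }]) => ({
      questionId,
      successRate: correct / attempts,
    }))
    .filter((q) => q.successRate <= maxSuccessRate)
    .sort((a, b) => a.successRate - b.successRate);
}

// Example: question-2 would be flagged for review (1 out of 3 correct).
const sample: QuestionResult[] = [
  { questionId: "question-1", success: true },
  { questionId: "question-1", success: true },
  { questionId: "question-2", success: false },
  { questionId: "question-2", success: false },
  { questionId: "question-2", success: true },
];
console.log(flagDifficultQuestions(sample));
```

A question flagged this way isn’t automatically broken, of course; it’s simply a prompt to revisit the wording, the preceding course content, or both.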
Deciphering your analytics might require a closer look. Check out our advice for getting the most out of your data:
6 practical tips for applying assessment and quiz data to your eLearning content
Creating your own insights
Of course, if an assessment doesn’t seem to be hitting the spot with your learners, the root of the problem might be easy to track down—you don’t necessarily need to get your magnifying glass out whenever your people struggle with a given quiz!
Instead, you can always ask your learners for their feedback as part of your assessments, perhaps through a Likert-style rating system or an open input option.
That way, your reports will be enhanced by quick and easy access to the feelings of your learners, and your improvements will be informed by tips and insights taken straight from the horse’s mouth.
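If your setup is xAPI-enabled, that feedback can even travel through the same question-level reporting channel as your scores. The sketch below is hypothetical: the “responded” verb and “likert” interaction type come from the xAPI specification, but the activity ID, question, and response value are invented to show one way a Likert-style rating could sit alongside your assessment data.

```typescript
// A hypothetical statement recording a learner's Likert-style course feedback.
const feedbackStatement = {
  actor: { objectType: "Agent", name: "Example Learner", mbox: "mailto:learner@example.com" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/responded",
    display: { "en-US": "responded" },
  },
  object: {
    objectType: "Activity",
    id: "https://example.com/courses/fire-safety/feedback/clarity",
    definition: {
      type: "http://adlnet.gov/expapi/activities/cmi.interaction",
      interactionType: "likert",
      description: { "en-US": "How clear were the questions in this quiz?" },
    },
  },
  result: {
    response: "likert_4", // e.g. 4 out of 5 on the rating scale
  },
};
```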
If you want to know more about assessments, you’ve come to the right place
This blog post is just a taste of our ebook, ‘7 assessment best practices: the dos and don’ts of creating eLearning quizzes’. The full ebook covers a wide array of assessment-related topics, from the many advantages of question banks and randomization to the benefits of providing clear and detailed inline feedback. If you want to know your Likert scales from your dropdown lists or discover the connection between question randomization and regulatory compliance, download our free ebook today.