Venturing into Analytics
Traditional LMS metrics are essentially variations of the “have/have not” report: which learners completed which courses in what time frame. Most LMS implementations exist to satisfy compliance tracking requirements, so making sure the right boxes are checked tends to be the primary driver of what gets reported. On occasion a stakeholder takes a slight interest in data points such as scores or time on task, but most LMS reporting is of the “burgers served” variety. This is a shame.
Some LMSs can collect and report data for learning analytics: insights into what is occurring in the learning process and how it contributes to success. These systems can give stakeholders data about your employees’ ability to execute company strategy. As part of the DotNetNuke ecosystem, the Accord LMS provides more deeply integrated analytics than many other LMS platforms.
Looking Beyond the Scores
The first entry point into deeper learning analytics is item analysis: a breakdown of how learners performed on each specific question on a test. It can provide a significant amount of insight and value for stakeholders.
Item analysis only truly provides insight into business performance capability if the questions are written to test skills as they are applied on the job, rather than fact recall. For performance-based assessment questions, the insights item analysis provides are extremely valuable.
Item analysis doesn’t just tell you that employees scored 83% on average; it can show that the most-missed questions were 3, 12, and 16, with a percentage breakdown for each. Drilling into question 3, you can see that only 58% of employees answered correctly and that the most common incorrect answer was selected 36% of the time. This type of insight identifies gaps in key performance areas and helps zero in on the specific challenges the organization needs to address to close critical skills gaps. The Accord LMS provides very detailed Item Analysis reports.
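To make the mechanics concrete, here is a minimal sketch of that kind of report in TypeScript, assuming you can export a flat list of answer records from your LMS; the field and function names are illustrative, not the Accord LMS schema.

```typescript
// Minimal item-analysis sketch. Assumes a plain array of answer records
// exported from the LMS; field names here are illustrative.
interface AnswerRecord {
  questionId: string;   // e.g. "Q3"
  response: string;     // the option the learner chose
  correct: boolean;     // whether that response was scored correct
}

interface ItemStats {
  questionId: string;
  attempts: number;
  percentCorrect: number;         // share of learners answering correctly
  topDistractor?: string;         // most common incorrect response
  topDistractorPercent?: number;
}

function itemAnalysis(records: AnswerRecord[]): ItemStats[] {
  // Group all responses by question.
  const byQuestion = new Map<string, AnswerRecord[]>();
  for (const r of records) {
    const list = byQuestion.get(r.questionId) ?? [];
    list.push(r);
    byQuestion.set(r.questionId, list);
  }

  const stats: ItemStats[] = [];
  for (const [questionId, answers] of byQuestion) {
    const attempts = answers.length;
    const correct = answers.filter(a => a.correct).length;

    // Tally incorrect responses to surface the most common distractor.
    const distractorCounts = new Map<string, number>();
    for (const a of answers) {
      if (!a.correct) {
        distractorCounts.set(a.response, (distractorCounts.get(a.response) ?? 0) + 1);
      }
    }
    const top = [...distractorCounts.entries()].sort((a, b) => b[1] - a[1])[0];

    stats.push({
      questionId,
      attempts,
      percentCorrect: Math.round((correct / attempts) * 100),
      topDistractor: top?.[0],
      topDistractorPercent: top ? Math.round((top[1] / attempts) * 100) : undefined,
    });
  }

  // Sort worst-performing items first, so the short list of most-missed questions surfaces.
  return stats.sort((a, b) => a.percentCorrect - b.percentCorrect);
}
```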
Item analysis is extremely valuable in a multi-step simulation. I recently deployed simulation training built in Adobe Captivate that required many steps to perform tasks in a new software system. Item analysis let us pinpoint exactly which steps were the failure points for employees attempting to perform those tasks with the new software. We shared these findings with the software development team and improved the supporting materials and training to raise performance in those areas.
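To show the same idea at the step level, here is a short sketch that ranks simulation steps by failure rate; the record shape and field names are assumptions made for the example, not Captivate’s or any LMS’s actual export format.

```typescript
// Hedged sketch: given interaction results recorded per simulation attempt,
// rank the steps by failure rate to find where learners get stuck.
interface StepResult {
  stepId: string;                    // e.g. "Step_04_SaveRecord" (illustrative)
  result: "correct" | "incorrect";   // SCORM-style interaction result
}

function failurePoints(results: StepResult[]): { stepId: string; failureRate: number }[] {
  const totals = new Map<string, { attempts: number; failures: number }>();
  for (const r of results) {
    const t = totals.get(r.stepId) ?? { attempts: 0, failures: 0 };
    t.attempts += 1;
    if (r.result === "incorrect") t.failures += 1;
    totals.set(r.stepId, t);
  }
  return [...totals.entries()]
    .map(([stepId, t]) => ({ stepId, failureRate: t.failures / t.attempts }))
    .sort((a, b) => b.failureRate - a.failureRate);  // worst step first
}
```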
To produce an item analysis, check how your authoring tool makes this question-level data available to the LMS. In general, I have found it requires publishing to SCORM 2004.
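For reference, this is roughly what that looks like at run time: a SCORM 2004 course reports each question attempt to the LMS through the cmi.interactions elements of the SCORM 2004 data model. The API lookup below is simplified, and the question and response values are invented for the example.

```typescript
// Minimal interface for the parts of the SCORM 2004 run-time API used here.
interface Scorm2004API {
  SetValue(element: string, value: string): string;
  Commit(param: string): string;
}

// Real content walks the frame hierarchy to find API_1484_11; simplified for this sketch.
const api: Scorm2004API = (window.parent as any).API_1484_11;

// Report one question attempt using the cmi.interactions data model elements.
function reportInteraction(index: number, questionId: string,
                           learnerResponse: string, correctPattern: string): void {
  const prefix = `cmi.interactions.${index}`;
  api.SetValue(`${prefix}.id`, questionId);
  api.SetValue(`${prefix}.type`, "choice");
  api.SetValue(`${prefix}.learner_response`, learnerResponse);
  api.SetValue(`${prefix}.correct_responses.0.pattern`, correctPattern);
  api.SetValue(`${prefix}.result`, learnerResponse === correctPattern ? "correct" : "incorrect");
  api.Commit("");
}

// Example (invented values): the learner chose option "b" on question 3; "c" was correct.
reportInteraction(2, "Q3", "b", "c");
```

SCORM 1.2 treats interaction data as an optional, write-only part of its data model, which is why publishing to SCORM 2004 is generally the safer route when you need question-level reporting.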
Other Data Improvements
A deeper understanding of employee capability gained through item analysis can drive improvements in other metrics. By understanding the key failure points, you can improve the content or other support materials so that performance improves not just on the test, but in workplace application.
Additionally, efficiency metrics such as time-on-task for completion (or time-to-competence) can improve once you understand and address the biggest challenges employees have with a learning asset. For training departments that track efficiency measures such as content development or maintenance time, or quality measures, those measures improve sharply with a data-driven improvement cycle fueled by item analysis.
Item analysis is a great entry point for learning and development professionals to venture into learning analytics. It is a tool directly under your control, and proper analysis can provide benefits that ripple to other measures across the organization.