A fundamental instructional design process model followed in L&D is Analysis, Design, Development, Implementation, and Evaluation (ADDIE). The “E” stands for formative and summative Evaluation.
In this stage we are supposed to be able to ask a few questions, such as: “What are the results of the training?”, “Based on what we now know from evaluation data, how can we better improve or sustain the training program?”, and “Can we prove that specific instructionally designed material affected behavior or competency levels as intended?”
This could easily be an article challenging whether the “E” in ADDIE has ever really been practiced industry-wide. Rather than dwell on the past, let’s focus on how we can make it to the future… because after all, the future is now.

For many readers, comprehensive evaluation data is simply not imaginable in their current situation. The purpose of this article is to say that comprehensive evaluation data is available in your current situation. If the blocker to comprehensive training data were removed, most would want to start with the basics, and the basics of learning analytics would probably fit into the engagement category. Engagement metrics include things like: the average time to complete a course or activity, average time spent per page, the number of sessions it takes to complete a course or module, the most common time of day for learning, the average length of a learning session, and total time spent. Of course, the scores and answers on assessments, quizzes, and attestations should also be included in the basics. Nobody wants to be locked into a vendor’s proprietary solution, either. So often today I speak to learning professionals who are using a modern learning content delivery approach but have no insight into whether it is even being used, i.e., the basics.
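The engagement basics above are simple aggregations over learning records. Here is a minimal sketch in JavaScript, assuming a simplified, illustrative record shape (`sessionId`, `durationSeconds`) rather than any particular LRS schema:

```javascript
// Sketch: computing basic engagement metrics from learning records.
// The record shape here is an illustrative assumption, not a fixed schema.
function engagementMetrics(records) {
  const totalSeconds = records.reduce((sum, r) => sum + r.durationSeconds, 0);
  const sessions = new Set(records.map((r) => r.sessionId)).size;
  return {
    totalSeconds,                               // total time spent
    sessions,                                   // number of distinct sessions
    avgSessionSeconds: totalSeconds / sessions, // average session length
  };
}

// Example: three records across two learning sessions
const metrics = engagementMetrics([
  { sessionId: "s1", durationSeconds: 120 },
  { sessionId: "s1", durationSeconds: 60 },
  { sessionId: "s2", durationSeconds: 180 },
]);
// metrics: { totalSeconds: 360, sessions: 2, avgSessionSeconds: 180 }
```

In practice these records would be queried from an LRS; the point is that once the raw activity data exists, the “basics” are a few lines of arithmetic.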
Anybody who follows Riptide’s communications knows that we continue to make the case for the Experience API (xAPI) and the Learning Record Store (LRS). This is critical to understand: any event that happens digitally can become part of an xAPI statement. In corporate training today we are mostly talking about what happens in a web view or the web browser (the DOM), and that includes mobile applications. But the same is true of video games, VR, AR, or any other digital event. In corporate training we have been transitioning all of our training out of the Flash player and into HTML5, which means all of those interactions are now available to report somewhere. It really doesn’t matter which LMS you are using; any activity that happens in your training can be reported with xAPI.
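To make “any digital event can become an xAPI statement” concrete, here is a minimal statement in JavaScript. The verb IRI is a standard ADL verb; the learner email, activity URL, and LRS endpoint are placeholders, not values from any real system:

```javascript
// A minimal xAPI statement: actor, verb, object, timestamp.
// Actor email, activity URL, and endpoint below are placeholders.
const statement = {
  actor: {
    objectType: "Agent",
    name: "Example Learner",
    mbox: "mailto:learner@example.com",
  },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/experienced",
    display: { "en-US": "experienced" },
  },
  object: {
    objectType: "Activity",
    id: "https://example.com/courses/safety-101/module-2",
    definition: { name: { "en-US": "Safety 101, Module 2" } },
  },
  timestamp: new Date().toISOString(),
};

// Sending it is a single authenticated POST to the LRS statements endpoint,
// shown here as a comment because the endpoint and credentials are assumptions:
// fetch("https://lrs.example.com/xapi/statements", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     "X-Experience-API-Version": "1.0.3",
//     Authorization: "Basic <credentials>",
//   },
//   body: JSON.stringify(statement),
// });
```

Every digital event, from a page view to a VR interaction, reduces to this same actor–verb–object shape.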
We are starting to compile and consider some of the common use cases today that really magnify these issues and justify the specific features of our learning technology solutions. The obstacles that emerge in these common cases can be readily addressed with xAPI/LRS technology that has been available since 2015 and is certainly vetted and ready to use in 2018. Each of the brief use case descriptions below represents real words from multiple conversations we have had at conferences and demos. Do any of these resonate with you? Each could be developed into a solid use case (business case) and become a future article unto itself.
Considering External Content, Proprietary Services, Authoring Tools, and Your LMS
Use Case Description: “In order to get metrics and analytics out of assessments, we have to use a third-party assessment provider with a proprietary reporting capability. This is delivered as a final link within the content published by our authoring tool. So, the training and the assessment are being delivered by two different paid systems, because we cannot get any meaningful reporting from the assessments generated in the authoring tool and/or the LMS.” And also: “We use Storyline (Captivate, or another tool) to wrap our activities/simulations so we can host them on the LMS, and we get zero reporting out of the embedded activities/simulations.”
The above use case is a bit more complex than described, has turned into customer work, and is well on the way to becoming a powerful xAPI case study. We are working with this customer to greatly simplify their training program and capture learning data they didn’t even know was available.
Use Case Description: “Our LMS is being used to deliver a link to external training content. The learner clicks on the link to access the content, and we cannot verify that the person even completed or made it through that course, module, or piece of learning content.”
We could call this use case the “black hole link.” It is often handled with one of various workarounds to verify that the learner experienced the content: a self-certification, an assessment, an attestation that the learner completes apart from the black hole link, or some other means that does not provide evaluation data. Very often the results of these verification methods are not captured but are merely implied by the LMS. For instance, if the learner can only reach the assessment via a link on the last page of the training content, a completion statement is sent to the LMS with the last question of the assessment. It is that single completion statement, sent after a chain of technology redirections, that implies all of the content was experienced; the data does not come from the technology sources themselves.
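The alternative to implied completion is direct evidence: record each page as the learner experiences it, and derive completion from that record. A small sketch of the idea in JavaScript, with page IDs and totals as illustrative assumptions (in a real implementation, each `recordPage` call would also POST an xAPI statement to the LRS):

```javascript
// Sketch: derive completion from direct per-page evidence instead of
// inferring it from a single redirect at the end of the content.
function makePageTracker(totalPages) {
  const seen = new Set();
  return {
    recordPage(pageId) {
      seen.add(pageId); // in practice, also send an "experienced" xAPI statement here
      return seen.size;
    },
    isComplete() {
      // Completion is backed by evidence for every page, not implication.
      return seen.size >= totalPages;
    },
  };
}

// Hypothetical three-page module:
const tracker = makePageTracker(3);
tracker.recordPage("page-1");
tracker.recordPage("page-2");
// tracker.isComplete() is still false here
tracker.recordPage("page-3");
// tracker.isComplete() is now true
```

The design point: the LRS ends up holding one statement per page actually viewed, so “did the learner make it through?” becomes a query, not a guess.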
Use Case Description: “Our training programs use primarily video-based instruction, and the LMS reporting cannot even tell me whether users are watching the videos.”
We could call this use case the “black hole video.” The image below shows the type of data easily available from video, such as play, pause, seek, and the time signature of the content.
Activity stream data, easily captured from video
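Capturing that video activity stream is a matter of listening to the standard HTML5 video events and mapping each one to a statement. The sketch below uses verb IRIs in the style of the community xAPI video profile; treat the exact IRIs and the `sendStatement()` function as assumptions to verify against your own LRS setup:

```javascript
// Sketch: mapping HTML5 video events to xAPI-style statements.
// Verb and extension IRIs follow the community xAPI video profile
// (treat them as assumptions to verify for your deployment).
const VIDEO_VERBS = {
  play:   "https://w3id.org/xapi/video/verbs/played",
  pause:  "https://w3id.org/xapi/video/verbs/paused",
  seeked: "https://w3id.org/xapi/video/verbs/seeked",
};

// Pure helper: turn a video event into a statement fragment,
// including the time signature at which the event occurred.
function videoEventToStatement(eventType, currentTimeSeconds, videoUrl) {
  return {
    verb: { id: VIDEO_VERBS[eventType] },
    object: { objectType: "Activity", id: videoUrl },
    result: {
      extensions: {
        "https://w3id.org/xapi/video/extensions/time": currentTimeSeconds,
      },
    },
  };
}

// In the browser, wiring this to a <video> element looks like the
// following (commented out here, since it requires the DOM and a
// hypothetical sendStatement() that POSTs to your LRS):
// const video = document.querySelector("video");
// for (const type of ["play", "pause", "seeked"]) {
//   video.addEventListener(type, () =>
//     sendStatement(videoEventToStatement(type, video.currentTime, video.src)));
// }
```

With a few lines like this per event type, “are users watching the videos?” stops being unanswerable.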
Considering SCORM and Your LMS
Use Case Description: “All of our interactive multimedia instruction reporting is in SCORM, and SCORM does not provide what we need. Basically, all we can get are completions.”
Recently, while speaking at the xAPI Camp for DevLearn 2017, I asked, “How many people here use SCORM for reporting?” Every single person in the room raised a hand. Many believe that they will need to re-create all of their training to get into learning analytics, and this is not the case. You can get more data, and you don’t have to re-publish all the content.
Use Case Description: “We design our content with SCORM reporting, and our LMS captures the data but does not support SCORM reporting, so we have to contact the LMS vendor regularly to get a spreadsheet of SCORM interaction data.”
The two SCORM use case descriptions directly above are very interesting. Most of the LMSs on the market support playing SCORM content, and some even collect the data, but most do not support complete reporting of the SCORM data. You are reading this article in 2017-18; SCORM 1.2 (the most widely adopted version of SCORM) dates from 2001. When do you think the LMSs are going to get around to supporting all of the SCORM data?
An Open Source Tool to Help…
Incidentally, we recently released an open source tool called the SCORM Interceptor. Add it to the first page of your SCORM package, and it will intercept every SCORM call and send it to an LRS as xAPI. We will likely write more about this tool, but it seems very appropriate here to offer a practical, easy-to-use tool that can get you started in 2018. Your SCORM packages are probably reporting more than what you are able to get from your LMS. The engagement metrics (the basics to start with in analytics) and assessment data may already be there; you just can’t see them due to LMS reporting limitations. All you need is an LRS endpoint to use the SCORM Interceptor.
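To show why this kind of interception is possible at all, here is a sketch of the underlying technique: wrap the SCORM 1.2 API object the LMS exposes so that every `LMSSetValue` call is forwarded to the LMS unchanged and also handed to a callback that can translate it to xAPI. This is an illustration of the pattern, not the SCORM Interceptor’s actual code:

```javascript
// Sketch of SCORM-call interception: wrap LMSSetValue so each call is
// both forwarded to the LMS and handed to a callback (which could
// translate it to an xAPI statement and POST it to an LRS).
function interceptScormApi(api, onSetValue) {
  const originalSetValue = api.LMSSetValue.bind(api);
  api.LMSSetValue = function (element, value) {
    onSetValue(element, value);              // e.g. convert to xAPI, send to LRS
    return originalSetValue(element, value); // the LMS still receives every call
  };
  return api;
}

// Demo with a stand-in API object; in a real SCORM 1.2 course this
// would be the window.API object located by the content.
const captured = [];
const fakeApi = { LMSSetValue: (element, value) => "true" };
interceptScormApi(fakeApi, (element, value) => captured.push([element, value]));
fakeApi.LMSSetValue("cmi.core.lesson_status", "completed");
// captured now holds ["cmi.core.lesson_status", "completed"],
// and the original LMS call still returned "true".
```

Because the wrapper forwards every call, the LMS behaves exactly as before; the only difference is that the interaction data also lands somewhere you can actually report on it.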
More and more companies struggling with the above use cases are realizing there is a better way. We are proving that the days of clever workarounds to verify completions are over. You can “turn the lights on” in all of your current training content rapidly, easily, and relatively inexpensively, and it does not matter which LMS you are using. Experience is a key word here, because everyone can agree that the best evaluation data we could get is learning experience data. Even if you have the most common piece of data reported today, “the completion,” you still do not have much in the way of direct insight into what the learner experienced. Contextual learning experience data is the first step toward evaluating, adapting, and mapping training to performance. The big takeaway in this article is that with xAPI, engagement data is the low-hanging fruit. Using xAPI to get the basic data you can’t get from SCORM is a super easy way to start with xAPI, and it will get the attention of your leadership. What if you could start with all of your legacy SCORM content and get into learning analytics in 2018?
The above use cases represent the needs our products help to fulfill. We would like to write more about this, and we would love it if you would share your learning analytics pain points with us. What is blocking you from gaining even basic engagement metrics in a way that is comprehensive across all of your training programs?