Education, Education, Education. With the proliferation of e-learning startups, fuelled by the trend towards on-demand remote learning, there is increasing demand for analytical approaches that quantify the efficacy of teaching methods and materials.
One such case was our client Viva Tuition, who have built a disruptive learning product for the CIMA management accounting qualification. They have developed a huge library of expert-taught materials, as well as practice tests and mock exams, and they wanted to better understand the behaviour of their users and how it affected their learning outcomes.
We defined and implemented product analytics using Amplitude, creating in-depth tracking of learning interactions and offering a level of insight deeper than their existing pageview-level tracking in Google Analytics.
To measure learning outcomes, we added rich event tracking to their objective test mock exams, giving us a measure of each student's knowledge and how it developed over time.
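As an illustration of what that event tracking might look like, here is a minimal TypeScript sketch using the Amplitude Browser SDK's init and track calls; the event name, property names, and data shape are hypothetical placeholders rather than the exact schema implemented for Viva.

```typescript
import * as amplitude from '@amplitude/analytics-browser';

// Initialise Amplitude once on page load (the API key is a placeholder).
amplitude.init('AMPLITUDE_API_KEY');

// Hypothetical shape of a single mock-exam answer.
interface MockExamAnswer {
  examId: string;
  questionId: string;
  topic: string;        // e.g. the syllabus area the question covers
  correct: boolean;
  secondsTaken: number;
}

// Fire one event per answered question so knowledge can be
// tracked over time and broken down by topic in Amplitude.
function trackMockExamAnswer(answer: MockExamAnswer): void {
  amplitude.track('Mock Exam Question Answered', {
    exam_id: answer.examId,
    question_id: answer.questionId,
    topic: answer.topic,
    correct: answer.correct,
    seconds_taken: answer.secondsTaken,
  });
}
```

Sending one event per question, rather than only a final score, is what makes it possible to see how knowledge develops over time and by topic.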
The next question was whether the activity data contained clear predictors of successful learning, which we could validate against the test scores. The most basic approach would be to compare general site activity (e.g. active days or time on site) to learning outcomes, but this would not let us attribute learning to individual learning materials. For that, we needed to tackle the challenge of adding tracking to the video lectures.
Tracking learning via videos is non-trivial: even if a user loads a page with a video, do they really consume the content? Fortunately, Vimeo's developer tools make it possible to track lower-level playback details. We settled on tracking video plays and video progress at 25%, 50%, and 100%. With that, we could build some pretty fascinating funnels in Amplitude, like the one below showing the progression of users through each video.
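To give a concrete sense of how this kind of tracking can be wired up, the sketch below uses the Vimeo Player SDK's 'play', 'timeupdate', and 'ended' events to emit Amplitude events at each milestone. The element id, lesson identifier, and event names are illustrative assumptions, not the exact implementation.

```typescript
import Player from '@vimeo/player';
import * as amplitude from '@amplitude/analytics-browser';

// Attach to an embedded Vimeo player (the element id is a placeholder).
const player = new Player('lecture-video');
const videoId = 'lesson-42'; // hypothetical lesson identifier

// Fire a "Video Played" event the first time playback starts.
let playTracked = false;
player.on('play', () => {
  if (!playTracked) {
    playTracked = true;
    amplitude.track('Video Played', { video_id: videoId });
  }
});

// Fire progress events once per milestone (25%, 50%, 100%),
// so funnels reflect real consumption rather than pageviews.
const milestones = [0.25, 0.5, 1.0];
const fired = new Set<number>();

player.on('timeupdate', ({ percent }) => {
  for (const m of milestones) {
    if (percent >= m && !fired.has(m)) {
      fired.add(m);
      amplitude.track('Video Progress', {
        video_id: videoId,
        progress: Math.round(m * 100), // 25, 50 or 100
      });
    }
  }
});

// 'timeupdate' may not fire at exactly 100%, so also listen for 'ended'.
player.on('ended', () => {
  if (!fired.has(1.0)) {
    fired.add(1.0);
    amplitude.track('Video Progress', { video_id: videoId, progress: 100 });
  }
});
```

Deduplicating milestones client-side keeps the funnel clean: each user contributes at most one event per milestone per viewing.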
From here we could compare the video engagement data of different users or courses with their results in the objective test mock exams.
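One simple version of that comparison is sketched below, assuming per-user video completion rates and mock exam scores have been exported into a summary table (the field names are hypothetical); in practice much of this can also be explored directly within Amplitude.

```typescript
// Hypothetical per-user summary exported from the tracking and exam data.
interface UserSummary {
  userId: string;
  videoCompletionRate: number; // share of started videos reaching 100%
  mockExamScore: number;       // latest objective-test mock score, 0-100
}

// Pearson correlation between two equal-length series.
function pearson(xs: number[], ys: number[]): number {
  const n = xs.length;
  const meanX = xs.reduce((a, b) => a + b, 0) / n;
  const meanY = ys.reduce((a, b) => a + b, 0) / n;
  let cov = 0, varX = 0, varY = 0;
  for (let i = 0; i < n; i++) {
    const dx = xs[i] - meanX;
    const dy = ys[i] - meanY;
    cov += dx * dy;
    varX += dx * dx;
    varY += dy * dy;
  }
  return cov / Math.sqrt(varX * varY);
}

// How strongly does video completion relate to exam performance?
function engagementVsOutcome(users: UserSummary[]): number {
  return pearson(
    users.map(u => u.videoCompletionRate),
    users.map(u => u.mockExamScore),
  );
}
```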
With these insights in hand, the Viva team were able to arm their content producers with a wealth of information about their learning materials, and will be able to evaluate improvements as they iterate.
This same set of simple tracking events enables a plethora of further analyses: how often are certain lessons repeated? Are longer or shorter lessons more effective? Are there particular learning sequences or habits that users fall into?