Last Friday, I attended NERLA, the first Northeast Regional Learning Analytics symposium. It was organized as a NERCOMP SIG and held in Southbridge, MA. One of the greatest benefits of SIGs is the face-to-face interaction and networking that occurs. The overarching goal of the meeting was twofold: a) to build focus and collaboration among all campus stakeholders, and b) to get a “jump start” regionally on this emerging topic. I’ve been hearing more and more about learning analytics lately on Twitter and elsewhere, especially on blogs and other social media.
While my first exposure to analytics at work came mostly from Blackboard’s Performance Dashboard and SafeAssign, I learned early on, at an event held a week prior (ELI’s Learning Analytics webinar), that a vast emerging discourse on analytics is expanding beyond the confines of the LMS walled garden. I will clean up my notes from that webinar and post them in the next day or two. It’s important to bear in mind that learning analytics is *not* actually new; we’ve been engaged in parts of it for a long time. But concerns about privacy need to be addressed by the community in order for this work to advance.
One of the biggest takeaways is the notion of early intervention. Learning analytics helps us make real-time adjustments in the classroom to enable appropriate interventions. It helps inform where students are in their mastery of course content. Described as a “peek” under the hood, it is especially useful when students are headed down the wrong path and could greatly benefit from intervening guidance from the instructor. It has the potential to address key developmental moments before it’s too late. To me, this has great potential value as an innovative tool in blended and online courses.
There are different approaches. At the core, they aim to build a predictive model of successful learning. One approach comes from academic analytics. It takes a longitudinal, historical approach to data analysis and incorporates large volumes of existing data from libraries and academic assessment units. Often a labor-intensive venture, the intent is to mine and extract potentially useful trends about how learning has (or has not) occurred in the curriculum, and what can be done to improve it in the future. This approach can be lengthy and costly. There are also challenges in relying on students’ self-reporting of what they think they did vs. what they *actually* did. A graduate student speaker from Brandeis described it this way: “Don’t focus on gathering data in the present, but use what you already have to plan for tomorrow.”
More recent approaches treat learning analytics as a short-term, real-time view into a class, offering a predictive glimpse at what’s going on and what lies ahead in an individual’s learning. What do you want the LMS to do for you as an instructor? What can it do for students to make the experience better? Ultimately, learning analytics provide numbers that reassure us and can help us make more informed decisions. A key challenge is to graphically represent what something is (or is not). One example from Australia shows what such a graph could look like. Another example involves EnquiryBlogger, a WordPress plugin with which students tag their posts on the basis of originality and other criteria. Understanding data, and why and how it is collected, has become a life skill; perhaps even a literacy. A central question is what rubrics we can apply to data to inform our analysis.
The SIG leadership group also planned a very clever activity and game.