Learning Analytics is a hot topic in educational technology, with principals, vice chancellors and government education departments taking an increasing interest in the insight that can be gained from student data. But what exactly is Learning Analytics?
In recent years researchers and educators have begun to explore how we can use new sources of data on students and their learning, together with predictive analytics techniques, to improve many aspects of the educational experience.
I’ve been immersed in this area for the past few years. In an attempt to understand the many aspects and potential benefits of this fascinating and rapidly evolving field, and to explain them to others, I’ve spent much of the last year writing a book about it.
Learning Analytics uses data about students and their learning to help understand and improve educational processes, and – crucially – to assist the learners themselves. Students are increasingly carrying out their learning online, using devices such as laptops, tablets and smartphones. This leaves a “digital footprint”, which can be automatically analysed and combined with data about their backgrounds, past academic performance or career aspirations. Educational content, activities or processes can then be adapted to provide an enhanced or more personalised experience for the learner. This promises to be better for students than the “one size fits all” approach to education, which has traditionally been deployed across much of higher and further education.
I’ve found four main uses of learning analytics in my discussions with experts in the field, and in the more than 600 articles and academic papers which I reviewed for my book:
All universities, colleges and schools have an interest in ensuring that their learners are learning effectively. Many institutions have problems with high rates of student attrition, and this has been responsible for much of the interest in learning analytics, particularly in the United States.
Early alert systems enable the automated identification of learners at risk of failure or withdrawal, sometimes very early on in their studies. Dashboards and automated alerts sent to teachers and personal tutors allow them to view the students who could most benefit from their input. Interventions at an early stage can result in students being retained who might otherwise have dropped out. Better retention has financial and reputational benefits for institutions. There can also, of course, be a huge impact on the self-esteem and career prospects of those individuals who ultimately succeed in their studies.
Other institutions have much less of a problem with students withdrawing. However, they’re aware that many of their students could perform better: with a clearer idea of how they’re learning, good students could become excellent. Are you on target for the degree classification you want to achieve? Models can be produced from historic data about previous students, and mapped onto your own data to show you what aspects of your engagement you need to improve on.
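The process described above – building a predictive model from historic student data and mapping it onto a current student's engagement – can be sketched in a few lines. This is an illustrative toy only: the features (VLE logins, assignments submitted, forum posts), the numbers and the risk threshold are all invented for demonstration, and real early alert systems draw on far richer data.

```python
# A toy "early alert" model: train on hypothetical historic engagement
# data, then apply the model to a current student's data. All figures
# here are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per past student:
# [VLE logins per week, assignments submitted, forum posts]
# Label: 1 = completed their studies, 0 = withdrew.
historic_X = np.array([
    [12, 5, 8], [10, 4, 3], [2, 1, 0], [1, 0, 1],
    [9, 5, 6], [3, 2, 1], [11, 4, 7], [0, 1, 0],
])
historic_y = np.array([1, 1, 0, 0, 1, 0, 1, 0])

# Produce a model from the historic data...
model = LogisticRegression().fit(historic_X, historic_y)

# ...then map it onto a current student's engagement to estimate
# their likelihood of completing, and flag them if it is low.
current_student = np.array([[2, 1, 1]])
p_complete = model.predict_proba(current_student)[0, 1]
at_risk = p_complete < 0.5
```

In a real deployment the flagged students would surface on a dashboard or in an automated alert to a tutor, as described above, rather than being acted on by the software directly.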
Giving learners data about their own learning through dashboards and apps is an increasingly promising area for learning analytics. Building on the popularity of fitness apps, the software can help to motivate students, help them feel less isolated, and make them aware of patterns of activity in their learning which could be enhanced.
A second use for learning analytics is recommending to students what courses or modules they should study next. We’ve become used to recommender systems when we buy items online: products are suggested to us by vendors such as Amazon, based on our previous purchases. Companies also increasingly use data about other customers “like you” to recommend items you might want to buy.
Similar underlying technologies are increasingly being deployed in many institutions, particularly universities in the United States, where students can be faced with a huge choice of courses. Recommender systems predict the ones where they’re most likely to succeed or which may help meet their career aspirations.
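The “students like you” idea can be illustrated with a minimal collaborative filtering sketch: find the past student whose results most resemble yours, and recommend the course they did best in among those you haven't yet taken. The course names and grades below are invented, and real systems use far larger datasets and more sophisticated similarity measures.

```python
# A minimal "students like you" course recommender using user-based
# collaborative filtering. Courses and grades are invented examples.
import numpy as np

courses = ["Stats", "Databases", "Ethics", "Networks"]

# Rows = past students, columns = their grade in each course
# (0 = course not taken).
grades = np.array([
    [90, 85,  0, 70],
    [88,  0, 60, 75],
    [40, 50, 95,  0],
])
you = np.array([92, 80, 0, 0])  # your results so far

def cosine(a, b):
    """Cosine similarity between two grade vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Find the past student most similar to you...
nearest = int(np.argmax([cosine(you, row) for row in grades]))

# ...and recommend the untaken course where they did best.
untaken = [i for i, g in enumerate(you) if g == 0]
best = max(untaken, key=lambda i: grades[nearest][i])
recommendation = courses[best]
```

A career-oriented recommender would weight courses by their relevance to a stated career goal rather than (or as well as) by predicted success, but the underlying neighbour-finding logic is similar.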
A third use, also set to have a big impact across education, is adaptive learning systems, which tailor the material presented to students based on how they interact with it. Thus a student who is struggling with a particular topic can be directed automatically to additional materials before moving on to the next topic.
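At its simplest, the adaptive behaviour described here is a branching rule driven by the student's interaction data. The sketch below shows the bare idea; the topic names and the 60% threshold are invented for illustration, and real adaptive systems model mastery far more finely.

```python
# The simplest possible adaptive rule: route a student to extra
# material when a quiz score suggests they are struggling.
# Topic names and the 0.6 threshold are invented examples.

def next_activity(topic: str, quiz_score: float) -> str:
    """Decide what the student sees next, based on their interaction."""
    if quiz_score < 0.6:
        return f"remedial-{topic}"   # additional materials, same topic
    return f"next-after-{topic}"     # move on to the next topic

struggling = next_activity("fractions", 0.45)
confident = next_activity("fractions", 0.85)
```

In practice the routing decision would draw on many interaction signals (time on task, attempts, hints used) rather than a single score, but the tailoring principle is the same.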
Some of the experts I’ve interviewed see the future as being a much more personalised approach to education, where data on learners and their activity is constantly used to update what is presented to and expected of them. This doesn’t need to be a solitary learning experience, where the student simply interacts with an intelligent system. The algorithms can also help them to connect with others who are encountering similar issues, or who are prepared to offer their expertise.
The fourth and final significant area in which data on learners’ activities is increasingly being used is the design and enhancement of the learning and assessment content and activities provided to students. The effectiveness of different aspects of the curriculum can be analysed using data to enable “on the fly” enhancements, or more significant alterations which can benefit future cohorts.
For instance, the data might demonstrate that a key piece of learning content is not being accessed by your students. You might then try to discover why not. Was it because the content was too difficult, not easy to find, or sequenced at the wrong time, or because you didn’t effectively communicate to the students the importance of accessing it?
There are also examples of institutions discovering from the analytics that a particular minority group is underperforming in an aspect of the curriculum. This can then lead the institution to attempt to identify whether this is due to linguistic or cultural issues, or perhaps a lack of prerequisite knowledge. Additional support can then be targeted at the group to bring their performance up to the standards of the rest of the cohort.
Anyone thinking about deploying a learning analytics system at their institution is likely to encounter ethical objections from some of their colleagues, and perhaps some of their students too. There is no doubt that there are many possibilities for the misuse of student data: I’ve identified 86 separate ethical, legal and logistical issues which occur in the growing number of articles and research papers written about learning analytics.
Predictions can of course be wrong, and students are complex individuals, not simply labels such as “at risk” or “not at risk”. Analytics can’t tell us whether they haven’t turned up to lectures because they’re struggling academically, because they find our lectures boring, or because they’re having to look after a sick relative.
I worked with the UK higher and further education communities and the National Union of Students to capture such issues in Jisc’s Code of Practice for Learning Analytics, which can help institutions to develop their analytics capabilities legally and ethically.
One of the key ethical issues that crops up is the “obligation of knowing”. As institutions assemble ever greater quantities of data, is there not a moral requirement on us to use the insight that can be provided from it to help our students? If the analytics suggest there is a strong likelihood that a student is at risk of dropping out, for example, shouldn’t we be trying to find out if we can help that student? Is it justified, as students incur increasing amounts of debt to fund their studies, for us to continue to provide learning content and activities without doing everything we can to assess whether they’re proving effective?
As industry and government increasingly use “big data” to analyse their customers’ and citizens’ behaviours to enhance the effectiveness of their operations, education is in danger of being left behind. Learning Analytics, carried out strategically across an institution, with attention to the ethical issues and the needs of users, promises to enable a much better understanding of a wide range of educational processes. This should lead to decision making based on evidence rather than intuition, with multiple benefits for institutions, and – most importantly – for our students.
Niall Sclater is consultant and director of Sclater Digital, an educational technology consultancy. As Director of Learning and Teaching at the Open University, he led its institutional learning analytics project. He has recently written a book, “Learning Analytics Explained”.