
Aston University finds early measures of engagement predictive of future outcomes

Aston University has published a study examining student engagement and its impact on attainment, using ‘real-time’ learning analytics data from StREAM.

We asked research contributors Professor Helen Higson, Provost and Deputy Vice-Chancellor; Professor Liz Moores, Deputy Dean of Aston’s College of Health and Life Sciences; and Rob Summers, postdoctoral research fellow, to share more about the study, Solutionpath and the impact on student support in this case study interview.

Aston University and Solutionpath

How has StREAM been implemented at the University to date?

We first introduced StREAM’s student engagement platform for Foundation and first-year undergraduates in September 2018, and in September 2020 we rolled it out to all on-campus taught programmes (UG and PGT). Students are the principal users. We believe this is important because only they can change their engagement behaviour. It is also used by Personal Tutors, Programme Directors and Student Services, including the Wellbeing and Enabling Team. Overall, the implementation of StREAM has been a significant driver for change in personal tutoring.


Why did you choose StREAM?

We had heard about it being used to good effect at other Higher Education Institutions. One of the positive things about StREAM is that it doesn’t attempt to predict outcomes from student demographics; instead it uses each individual student’s behaviour to give an engagement rating. This ethos aligns well with Aston’s values. We did not want to presume that students from particular backgrounds were more likely to behave in a particular way and need particular interventions; rather, we wanted to base our interventions on individual need and individual behaviour.

The Study

How did the research come about?

There were two factors which led to this research. Firstly, this work was supported by the Centre for Innovation in Learning and Education (CILE), an OfS Catalyst-funded project. The joint Aston/Cranfield virtual Centre aims to develop new knowledge in innovative education, business-engaged educational design and innovative delivery modes in undergraduate provision within UK higher education. As part of this project, we were interested to learn more about the effectiveness of learning analytics generally.

Secondly, Aston’s own learning analytics system is relatively new, and we wanted to understand more about the potential impact it could have on the retention and attainment of our students, and to ensure that the data feeds we had chosen for the system were appropriate and optimal. This piece of research focuses on attainment; we hope to investigate retention in the future.

What were your aims for the research?

Our overall aim was to try to produce some useful information for the HE sector and for Aston University on the potential utility of the learning analytics system. Student retention and success are central to our mission at Aston University, and we wanted to understand what predicted different outcomes so that the University had accurate knowledge and understanding of student behaviour to further support student success. We try to make all our interventions and initiatives research-informed so that they have the best possible outcomes.

Can you take us through the process you went through to conduct the study?

At first, the intention was to provide a literature review of existing research, but then we thought it might be interesting and helpful to look at our own initial data from the system. Solutionpath were very helpful with this. Lots of previous research has looked at data at the end of an academic year, but few researchers have looked at data on a ‘live’ week-by-week basis. The system was not initially set up for our specific purpose, but Solutionpath helped us to output the data in a format where we were able to do this. This was important to us because predicting students’ outcomes after it was already too late to change them was not of interest.


The study reports an analysis of week-by-week data from 1,602 first-year undergraduates. Results showed that students who obtained the highest end-of-year marks were more likely to be in a higher engagement quintile as early as the first 3–4 weeks. It also highlighted that students who started in a higher engagement quintile but whose engagement later decreased were still more likely to have higher marks than those who started in a lower quintile and then increased their engagement. The study concluded that early measures of engagement are predictive of future behaviour and of future outcomes.
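As an illustration of the quintile approach described above, the sketch below buckets a cohort's weekly engagement scores into five equal-probability bands using simple percentile cut points. This is a hypothetical minimal example, not Solutionpath's actual engagement model; the scoring scale and cohort data are invented for demonstration.

```python
from statistics import quantiles

def engagement_quintile(score: float, cohort_scores: list[float]) -> int:
    """Return the engagement quintile (1 = lowest, 5 = highest) of `score`
    within the distribution of `cohort_scores`."""
    # Four cut points that split the cohort into five equal-probability bands.
    cuts = quantiles(cohort_scores, n=5)
    band = 1
    for cut in cuts:
        if score > cut:
            band += 1
    return band

# Illustrative weekly engagement scores (0-100) for a small cohort -- not real data.
cohort = [12, 25, 33, 41, 48, 55, 62, 70, 81, 95]
print(engagement_quintile(95, cohort))  # highest scorer lands in quintile 5
print(engagement_quintile(12, cohort))  # lowest scorer lands in quintile 1
```

Computing a student's quintile each week in this way, rather than once at year end, is what makes the 'live' week-by-week comparison in the study possible.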


How will this research help to support the strategic aims of the University and the sector?

This understanding affords us various opportunities to reduce attainment gaps and provide a more inclusive educational experience. Whilst we accept that some students may be in a better position than others to engage with their studies early and often, we hypothesise that at least part of the difference may result from a lack of understanding of the process of studying and the nature of higher education. In other words, some students, for example those with little family experience of higher education, may not know what it is they are supposed to be doing.

It seems reasonable to speculate that the risks of these students falling through the net in this way are even greater if a large proportion of students’ studies is being provided online, rather than face-to-face. Even when online learning opportunities are designed well, there are fewer physical opportunities to make connections, clarify understanding and address personal challenges. This is where the analytics system can come into its own, by encouraging students to track their own behaviour against the behaviour of others in their cohort and to discuss their progress and activities at regular intervals with their personal tutors. We are anticipating that these processes should help to reduce the differences between those students with and without the ‘social capital’ and understanding about requirements in higher education.

What’s next for the use of learning analytics?

The use case at Aston University is continuously evolving; one example is the recent introduction of StREAM’s alerting capability, which helps identify when student engagement falls below defined thresholds. We intend to measure its impact on retention and attainment after the end of the year (although this year is likely to be a bit different in a number of ways). We would also be interested in further exploring the interplay between the data and student demographics in the future.


Solutionpath looks forward to continuing to follow the great work by the team at Aston and offering ongoing support in their learning analytics journey.

The full report can be found here.

You can also listen to our interview with Professor Helen Higson, Provost and Deputy Vice-Chancellor, Aston University on our podcast – In Conversation with.