As the noise around Learning Analytics increases, more universities are coming under external and internal pressure to evaluate their options for engaging with Learning Analytics solutions.

Where to start?

With multiple methodologies and an ever-increasing number of suppliers entering the market, how do universities cut through the complexity to ensure they gain real value from their Learning Analytics investments?

We want to share with you the must-haves our partner customers have told us they require from a Learning Analytics vendor.

Checklist: Learning Analytics Must-Haves

Identifying risk – Does the system include a proven predictive component that can automate risk stratification, organising students into cohorts that represent their likelihood of withdrawal or success?
Daily insights – Does the system include a cumulative, real-time or near real-time algorithm that provides daily feedback to staff and students?
Alert triggering – Can business logic be created that triggers an alert when a student presents as an outlier and requires support?
Impact and intervention management – What case management capabilities, workflows and multi-role impact analysis does the system support over time?
Utilising historical data for future insights – Does the system support longitudinal data analysis across multiple years?
Data transparency supports adoption – Is the data transparent to both staff and students, with no information that would be opaque to the student?
Clean data – Can the system identify erroneous data before it enters the application?
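To make the alert-triggering item concrete, a minimal outlier rule might flag any student whose engagement score falls well below the cohort average. This is a hypothetical sketch only; the function, threshold and scores are invented for illustration and are not StREAM's actual logic:

```python
# Hypothetical outlier rule: flag students whose engagement score falls more
# than 1.5 standard deviations below the cohort mean.
from statistics import mean, stdev

def flag_outliers(scores, threshold_sd=1.5):
    """scores: dict of student ID -> engagement score. Returns flagged IDs."""
    values = list(scores.values())
    cutoff = mean(values) - threshold_sd * stdev(values)
    return sorted(sid for sid, score in scores.items() if score < cutoff)

cohort = {"s1": 72, "s2": 68, "s3": 75, "s4": 70, "s5": 12, "s6": 69}
alerts = flag_outliers(cohort)  # only s5 sits far below the cohort
```

In practice the business logic would be configurable per institution (thresholds, which scores feed in, who is notified), which is why the checklist asks how such rules are created rather than assuming a fixed formula.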

To see why StREAM is ahead of the curve in the Learning Analytics sector, read case studies from our community showing how StREAM has been implemented at other universities.



Want to Know More?

Fill in your details and we’ll be in touch.
  • To learn more about our Privacy Policy click here.



Implementing Learning Analytics can appear a daunting task for some institutions. We have created a six-part guide covering the key areas to consider when implementing technology to support Learning Analytics in your institution.

Part 1 – What data gives results

It’s important that any data collected and processed for developing Learning Analytics (LA) is used to drive action. Data must reflect the students’ current learning path and be actionable. Aggregating data and applying analysis retrospectively will not enable pre-emptive interventions. It’s not just about which data you use, but how you use it.

Using live behavioural data from a range of the institution's operational systems is key to deriving insight into how a student is engaging with their studies. To support students when necessary, it is vital to detect changes in their behaviour at a point when something can still be done.

Solutionpath's StREAM algorithm aggregates institutional data from a range of sources, provides a daily engagement score and alerts staff to behavioural changes. This enables university staff to pre-empt issues affecting a student's welfare or learning journey and act accordingly.
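By way of illustration only (StREAM's actual sources and weightings are not public), a daily engagement score can be pictured as a weighted sum of event counts from each connected source, capped at a fixed maximum. The source names and weights below are invented for the example:

```python
# Illustrative daily engagement score: weight event counts per source and cap
# the total so no single source can dominate the reading.
def daily_score(events, weights, cap=100):
    """events/weights: dicts keyed by source name, e.g. 'vle', 'library'."""
    raw = sum(events.get(source, 0) * weight for source, weight in weights.items())
    return min(raw, cap)

weights = {"vle": 5, "library": 10, "attendance": 20}
today = {"vle": 6, "library": 1, "attendance": 2}
score = daily_score(today, weights)  # 6*5 + 1*10 + 2*20 = 80
```

A real system would of course tune the weights per institution and per data source, which is one reason the connectivity and calibration work described later in this guide matters.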

We are now working with 15 Higher Education Institutions (and counting), and for each implementation we have created proven 'out of the box' connectivity for most systems. Each institution we work with has variations in the data sources used to drive insight and action. Many sources can be fed into an algorithm to support LA activity; it's not a one-size-fits-all approach. If you'd like to know more about the different data sources and inputs we work with, contact us and we'll be happy to discuss.

Part 2 – What functionality is key?

To support students in their attainment objectives, it's key that any Learning Analytics technology supports the student's learning journey and motivates them to progress.

Student Learning Journey

Below is a quick summary of the functionality that would be considered key to any successful Learning Analytics technology.

  • Empowers the student – allows them access to their own data through their mobile device
  • Enables fast action – uses alerts, notifications and messaging to highlight changes in engagement behaviour
  • Provides a streamlined student experience – ensures interoperability with other core systems for effective case management
  • Maintains consistent dialogue – notes, reminders and student insight functionality enable coherent and consistent interactions


Part 3 – How predictive is data in identifying students at risk?

When data is transformed in the right way, Learning Analytics is an invaluable tool for identifying students who are at risk of leaving their course. The use of near real-time behavioural data means that flags can be raised at the very point a student's behaviour changes. This enables university staff to act immediately based on data, rather than on traditional methods of observing behaviour. It also enables universities to focus effort where and when necessary, generating efficiencies in pastoral activities.

Solutionpath's StREAM technology has a predictive component: it is unique in being able to categorise students by their propensity for persistence based on their engagement with the university.

Our experience shows that a strong LA solution can identify students at risk eight weeks earlier, on average, than traditional methods. Supported by a strong framework to manage the interventions, this can lead to decreases in student attrition.

Part 4 – How easy is it to implement Learning Analytics technology?

Many institutions embarking on the Learning Analytics journey encounter obstacles when implementing the technology, owing to problems with existing legacy systems and the difficulty of extracting and transforming the data in order to make it useful.

No university's estate is codified for LA: an LA solution sits downstream from a range of business systems and processes that are not maintained for the purpose of supporting it. This aspect requires working with a partner with extensive experience of delivering against a commercial-level Service Level Agreement (SLA).

Implementing Learning Analytics

Ultimately, to gain the most actionable insight from an LA implementation, data must be aggregated across a number of organisational systems, which can take time and resource. The data then needs to be analysed and algorithms generated to enable action. With no prior experience, the time involved could be extensive.

At Solutionpath we have implemented this process with a large number of institutions and developed a wide range of pre-integrated, out-of-the-box connectors that enable us to easily ingest and transform data. This reduces the technical burden on IT departments, removing any requirement to deploy internal development resource on an LA project involving Solutionpath's StREAM technology.

Part 5 – What if the input data is no good?

Many institutions are held up in their implementation of Learning Analytics by concerns over data quality and security. Projects often stall over fears that data quality will impede the results, and a view is taken that value cannot be driven from existing data assets.

Due to the unique way in which Solutionpath's StREAM technology works, data quality issues are overcome through data source calibration, with an algorithm that adapts to the quality of the input data to provide actionable insight.
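One simple way to picture calibration that adapts to input quality is to normalise each source's counts by that source's own observed range, so sparse or noisy feeds still contribute a comparable signal rather than being discarded. This is a generic min-max rescaling sketch, not Solutionpath's actual method:

```python
# Generic per-source calibration sketch: rescale each source's raw counts to
# a common 0..1 range using that source's own minimum and maximum.
def calibrate(per_source_counts):
    """per_source_counts: dict of source -> list of raw daily counts."""
    scaled = {}
    for source, counts in per_source_counts.items():
        lo, hi = min(counts), max(counts)
        span = (hi - lo) or 1  # constant feeds map to 0.0 instead of dividing by zero
        scaled[source] = [(c - lo) / span for c in counts]
    return scaled

raw = {"vle": [0, 5, 10], "turnstile": [100, 150, 200]}
scaled = calibrate(raw)  # both sources now comparable on a 0..1 scale
```

The point of the sketch is that a source with low absolute volumes can still carry as much relative signal as a high-volume one once each is measured against its own baseline.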

StREAM Time to Value

Part 6 – How quickly should value be delivered from Learning Analytics?

If the process of implementing Learning Analytics is approached in the right way, the data and technology can be used immediately to support institutions in their pastoral support processes.

The StREAM algorithm is designed to aggregate data and give an immediate reading of a student's engagement. This is then calibrated against their cohort to determine how well they are engaging relative to their peer group. The technology then uses the data to trigger alerts where students are disengaging. Ultimately, this means immediate interventions can be actioned to engage with students and understand their current situation.
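The "calibrated against their cohort" step can be pictured with a simple percentile rank followed by banding for triage. This is an assumed mechanism for illustration; StREAM's actual calibration is not shown here:

```python
# Illustration of cohort calibration: express a student's score as a
# percentile rank within their cohort, then band the result for triage.
def percentile_rank(score, cohort_scores):
    below = sum(1 for s in cohort_scores if s < score)
    return 100.0 * below / len(cohort_scores)

def band(pct):
    if pct < 10:
        return "high risk"
    if pct < 30:
        return "at risk"
    return "on track"

cohort = [40, 55, 60, 62, 70, 75, 80, 85, 90, 95]
pct = percentile_rank(58, cohort)  # 20.0: two of ten cohort scores are lower
status = band(pct)                 # "at risk"
```

Banding relative to the peer group, rather than against an absolute threshold, is what allows the same mechanism to work across courses with very different baseline activity levels.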

To learn more about how to implement Learning Analytics, we’ll be happy to share our experience with you. Please get in touch to speak to us.





How does the system work?

StREAM models a student's engagement by counting their use of the various proxies and resources that represent participation in their course, then assigning each student an engagement score. At an institutional level the score can be used to identify risk and mount new lines of enquiry; the individual student can use it for self-reflection and calibration, much like a fitness app for their education.
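The counting described above can be sketched as a simple tally over an interaction log. The event shape and proxy names here are invented for illustration:

```python
# Tally each student's interactions with engagement proxies from an event log.
from collections import Counter

def engagement_counts(event_log):
    """event_log: iterable of (student_id, proxy_name) tuples."""
    return Counter(student for student, _proxy in event_log)

log = [
    ("s1", "vle_login"),
    ("s1", "library_loan"),
    ("s2", "vle_login"),
    ("s1", "lecture_capture_view"),
]
counts = engagement_counts(log)  # s1 -> 3 events, s2 -> 1
```

A production system would weight and calibrate these raw tallies rather than use them directly, but the passive-counting principle is the same: the score is derived entirely from interactions the student already makes.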

Why is engagement so important?

Empirical evidence from our customers shows that, as might be expected, student engagement is the strongest predictor of progression and attainment. Studies have shown that while factors such as demographics and entry tariffs are useful, by far the greatest predictor of success is what the student does, not who they are: so we measure the "do", not the "who".

Do you include demographics factors including ethnicity & socio economic background in the analytical model?

No. We recognise that demographics in Higher Education are important, and they are used in a variety of ways by universities for planning and reporting. In the context of learning analytics, however, they can become a toxic component that creates bias in application and risks overlooking students in crisis who do not fit the profile. A major factor behind the design decision was also to ensure that universities could share the data with the primary protagonist, the student themselves: we wanted to let students measure themselves against a metric they can influence, rather than against factors they cannot change, such as their demographic markers.

Why do you share the data with the students?

We democratise the data because we believe the data belongs to the student. By excluding toxic factors such as demographic profiles, we enable the institution to empower the students themselves, providing a tool for self-reflection and calibration and significantly increasing the responder community in an environment where university resource is often stretched.

Do you encounter issues with privacy and monitoring? Is the system considered a little Orwellian?

StREAM does not create any new data: the system builds the engagement model passively (which ensures accuracy and consistency and helps to drive ubiquity) from the high-frequency digital interactions the student makes on a regular basis. By democratising the data to students, including them in the steering group and ensuring the information is never used punitively, these very legitimate concerns can be addressed.

Can the information provided be demotivating for students who have fallen behind their peers?

This is an important factor to consider when providing students with their engagement data, and much thought and discussion (with the NTU students' union and ethics committee) went into the design principles of the system for this reason. As well as the cumulative view, we include a daily score; this allows the student (in much the same way as a fitness app) to see the value of various actions reflected in their daily score, encouraging them to do more of the things that can improve their outcome. In a recent survey NTU asked their students, "If we knew you were likely to drop out of university, would you want us to tell you?", and 94% of students replied that they would. We therefore believe that, used appropriately and delivered in an easily consumable way, StREAM will be an important support tool for every student.
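The daily-plus-cumulative view described above can be sketched as follows. This is an assumed presentation for illustration, not the actual StREAM implementation:

```python
# Track both the day's score and the running total, so a student can see the
# immediate effect of each action, fitness-app style.
def daily_and_cumulative(daily_scores):
    running, rows = 0, []
    for day, score in enumerate(daily_scores, start=1):
        running += score
        rows.append((day, score, running))
    return rows

view = daily_and_cumulative([10, 0, 25, 15])
# day 3 shows both that day's score (25) and the running total so far (35)
```

Showing the two numbers side by side is what lets a student who is behind on the cumulative view still see today's actions rewarded immediately, which is the motivational point being made above.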

What systems do you typically connect to?

Having delivered multiple times into the UK HE sector, we can connect to a wide range of systems out of the box. These include, but are not limited to, Tribal SITS, Ellucian Banner, Unit4, Blackboard, Canvas, D2L, Moodle, Turnitin, Talis, Panopto and Ex Libris/CampusM, among many others. The StREAM application also exposes an API that customers can build against, allowing a rich seam of data to be extracted into other campus systems such as case management and CRM systems, or enabling third parties to create integrations for new systems. We are certified with IMS and are working towards the Caliper standard.

How many data points would you require as a minimum to create a valuable asset for the university? Are there any must-haves?

We don't have a minimum requirement; however, diversity of data is key. Most universities have similar systems, such as online learning environments (VLE/LMS) or ePortfolio products, which offer rich data points, as well as library systems that may offer book loan history, reading lists or access to online journals. The student information system is a prerequisite, as it provides the key data on which everything else is built.

What are the training overheads? Our tutors are already very busy; to what extent does this increase their workload?

The application has been designed to accommodate a range of digital literacy: it is simple and intuitive by design. The premise of the solution is to collate information so that staff have a richer view of a student's learning journey; it is therefore intended to save time, not add to it.

What is the single biggest problem related to Learner Analytics for universities to overcome?

LA is not a technology project; it is a business transformation project. Understanding all the upstream business processes, and the impact they have when materialised in an application like StREAM, requires a whole-university approach.

You seem to reference NTU a lot. What about your other clients? Why is NTU so heavily referenced?

Nottingham Trent University was our first customer and has supported ongoing development through what has become an iterative process. NTU's maturity in their use of learner analytics, the extent to which they have embedded the tools into standard business practice, and the research they have been involved in through projects such as ABLE provide a rich seam of insight for anyone wanting to explore the use of Learner Analytics technologies in their institution.

How long does it typically take from project initiation to completion to go into production with a system?

We have a proven, gated programmatic delivery process that supports governance steps and controls and accommodates risk management. If the university can provide the appropriate resources (technical or business) at the appropriate times, we expect to deliver in under 60 days. However, we recommend an appropriate level of flexibility, and any deployment should be followed by thorough user acceptance testing, a warranty period and some form of pilot before a full, university-wide production roll-out.

Is GDPR a factor in using the software?

Of course: whenever you use personally identifiable data you should consider GDPR. All of the data used in StREAM is data already available to the university and used as part of the student's educational contract. Many universities have used the 'legitimate interests' basis under the GDPR to avoid the need to seek consent. However, we urge universities to seek their own guidance, especially if using other tools that draw on protected data fields, and particularly in the context of automated decision-making.

If you use any personal data in automated decision-making, you must declare this when making an intervention.

