
Many institutions that have not yet dipped a toe into Learning Analytics have fears around its use. This installment aims to clear up some commonly held misconceptions about the use of data in this field.


  1. “Collecting data about our students for Learning Analytics feels a bit ‘Big Brother’”

The data used by StREAM already exists within the university – StREAM collates student activity data from historical and current digital interactions at centralised data points, such as campus swipes, tutorial attendance and e-learning portals, and uses it to support staff and students on the student’s learning journey. A key part of the StREAM design is the transparency of the data model (we only measure engagement – we don’t include more contentious areas such as demographics or socio-economic background), which allows us to share the data with both staff and students, with no ‘black box’ approach to calculating engagement scores.

  2. “Our student services resources are already overwhelmed – surely this will increase the demand on them?”

Sharing the data with students has resulted in them taking control of their own learning journey, and the likes of Nottingham Trent University have seen students adjust their behaviour: in the 2017 Student Transition Survey, 74% of students who had logged on to the Dashboard reported changing their behaviour to raise or maintain their engagement score. Empowering students enables them to change their levels of engagement without relying on university resources to make an intervention. Other universities are using the dashboard and its alerts framework to identify students that support services aren’t yet aware of, getting upstream to offer support before those students reach crisis point. We have also seen customers identify seasonal changes in resource requirements, which helps with planning.

“Two thirds of students who take their own life are unknown to the university support services.”

  3. “We don’t have enough resource to implement Learning Analytics”

Implementing Learning Analytics (LA) can appear a daunting task for some institutions. At Solutionpath, we have developed solutions with a wide range of pre-integrated, out-of-the-box connectors, enabling us to easily ingest and transform data. This reduces the technical burden on IT departments and removes any requirement for internal development resource to be deployed to any LA project involving Solutionpath StREAM technology.

Take a look at our York St John University case study to see how easy it was for them.

  4. “Won’t this make students more anxious?”

The system doesn’t predict grades – it is designed to promote positive behaviours to help achieve positive outcomes. As the data is taken on a daily basis, a student can make adjustments to their daily engagement without the need to wait for the next assessment milestone. Nottingham Trent University research has shown that students wanted to be told that they were at risk of dropping out (94%) or know if the university could improve their chances of progression (97%).

  5. “I’m not sure the academics will like this”

Typically, academics have limited time to spend with students (3–5 meetings in the first year). The system helps them prioritise that time by identifying the students most at risk, while also making the limited contact time they do have more productive, as conversations are focused on the student’s engagement. In the Nottingham Trent University pilot survey, 80% of tutors felt that the data provided by the StREAM Dashboard changed how they worked with students.

 


Richard Gascoigne, CEO of Solutionpath, discusses how efficient use of data can help universities support struggling students


Article first published by Education Technology, 28/10/2019, written and contributed by Richard Gascoigne, CEO of Solutionpath

Mental health

With World Mental Health Day having recently taken place, people across the globe have spent the last few weeks reflecting on an important societal issue and the steps that can be taken to support the most vulnerable in our society. At the same time, university students have been settling into their courses and learning to adapt to new, unfamiliar environments. This is a good time to reflect on the wellbeing of these students, especially as stories surrounding mental health and student suicide have unfortunately become commonplace in the news. Last year, figures from the Office for National Statistics (ONS) recorded 95 student suicides over the course of a year, demonstrating that this is still very much an issue that needs attention.

This is a sentiment shared by many, and steps are being taken by public bodies, universities and industry with the aim of reducing, if not eradicating, student suicide. The Office for Students (OfS) announced a programme earlier this year aimed at helping to reduce student suicides, and some universities have begun to explore early-indicator systems that identify those in need of support and direct them to appropriate services.

Reliable information

However, with the safety of our students as the goal, we need to ensure these efforts are as immediate and considered as possible. While many approaches are well-intentioned, the ways in which they are being implemented are fundamentally flawed. The most common mistake is using unreliable information to identify students. For example, some identification systems or initiatives rely on demographic profiles, personal attributes, self-declared data, or even data from student social media profiles to identify at-risk individuals. These data types may correlate with a risk, but they rarely indicate when to intervene, and they risk stigmatising or stereotyping students.

“While many approaches are well-intentioned, the ways in which they are being implemented are fundamentally flawed.” 

Furthermore, these are unproven approaches that could result in action being taken where help may not be needed, creating more burden on stretched resources – or worse, a student in need not being identified, since mental health can, and does, affect anyone. Instead, insights and actions need to be based on real-time engagement data that is specific to each person, not a ‘type’ of person or how they present themselves online. Data about a student’s real-life engagement with their university environment – whether they are still going to the library, or have stopped attending lectures – provides objective insight while respecting a reasonable degree of privacy.

Nottingham Trent University

Many universities have already taken this approach and seen measurable impact, rather than waiting for experimental approaches that may or may not work. For example, Nottingham Trent University have produced a ‘whole university’ dashboard to generate and display real-time engagement data to both students and staff. If a student registers no engagement activity for a number of days, an email is sent to a relevant staff member asking them to make contact with that individual. This means that any intervention is based on facts and can be made quickly, and the approach has proven more accurate and timely than using the data points identified above. To learn more about how Nottingham Trent University use StREAM, click here.

While universities shouldn’t be relied upon to solve the mental health crisis alone, they have a significant role to play as a route to help at arguably one of the most vulnerable times in a young person’s life. An approach that can pinpoint at-risk students accurately and quickly using engagement data can help universities support students both academically and personally, ensuring they fulfil one of their core duties: to nurture young learners on their educational journeys without profiling or stigmatising them.


Learning and sharing are an integral part of Solutionpath as we work to improve the learning journey and Higher Education experience for students. We host, attend, and present at events and conferences throughout the year to better collaborate with the Higher Education community. Take a look at the upcoming events you can find us at.


Wednesday 16th October – Higher Education Conference, QEII, London

We’ll be at the Higher Education Conference with our partner Campus M talking about how our joint offering supports the Higher Education market agenda.

https://heconference.co.uk/

 

Wednesday 13th – Thursday 14th November – PVC Network, Hallam Conference Centre, London

We are proud to sponsor the PVC Network event hosted by Advance HE. We hope to see you there.

https://www.advance-he.ac.uk/programmes-events/events/pvc-network

 

Wednesday 20th – Friday 22nd November – UCISA CISG, Radisson Blu Edwardian Heathrow

We will be joining our in-market partner Campus M at UCISA CISG to present our joint offering for Higher Education.

https://www.ucisa.ac.uk/groups/cisg/Events/2019/cisg19

 

Monday 25th November – EdTech for Good Conference, Coventry University London

Join our ‘in-market’ partners at the EdTech for Good conference to hear how Learning Analytics and engagement data are delivering outcomes and transformation in Higher Education, with speakers from Coventry University, York St John University and Nottingham Trent University.

https://www.edtechforgood.co.uk/

 

Wednesday 27th November – Higher Education Data Conference, Congress Centre, London

Hear Ed Foster from Nottingham Trent University, one of the leaders in Student Engagement and part of the StREAM community, talk about how data is used to improve Student Experience.

https://highereducationdata.co.uk/

 

Visit us on the road, where our team can answer questions you might have about StREAM, Learning Analytics and how our solution works – or get in touch to book a demo.


As the noise around Learning Analytics grows, more universities are coming under external and internal pressure to evaluate their options for engaging with Learning Analytics solutions.


Where to start?

With multiple methodologies and an ever-increasing number of suppliers entering the market, how do universities cut through the complexity to ensure they gain real value from their Learning Analytics investments?

We want to share with you the must-haves our partner customers have told us they require from a Learning Analytics vendor.

Checklist: Learning Analytics Must-Haves

  • Identifying risk – Does the system contain a proven predictive component that allows the university to automate risk stratification and organise students into cohorts representing their likelihood of withdrawal or success?
  • Daily insights – Does the system include an accumulative, real- or near-real-time algorithm that provides daily feedback to staff and students?
  • Alert triggering – Can business logic be created to trigger an alert when a student presents as an outlier requiring support?
  • Impact and intervention management – What case-management capabilities and workflow does the system offer, and can multiple staff roles undertake impact analysis over time?
  • Utilising historical data for future insights – Does the system allow longitudinal data analysis across multiple years?
  • Data transparency supports adoption – Is the data transparent to both staff and students, with nothing in the system that would be opaque to the student?
  • Clean data – Is the system capable of identifying erroneous data before it enters the application?

See why StREAM is ahead of the curve in the Learning Analytics sector: read case studies from our community to see how StREAM has been implemented at other universities.


Solutionpath’s customer Nottingham Trent University (NTU), winner of The Guardian’s University of the Year award, have now won a prestigious Collaborative Award for Teaching Excellence (CATE) 2019 from Advance HE.

NTU’s Student Engagement team have been recognised for their use of technology in helping students succeed in their studies through their adoption of Solutionpath’s StREAM, a Digital Engagement Learner Analytics platform, which powers NTU’s Student Dashboard.

The Student Dashboard has helped drive student success at NTU through its use of various indicators, such as use of library facilities, to create a digital footprint showing how students are engaging with their studies. This enables constructive conversations between students and staff.

The NTU Student Engagement team, led by Ed Foster, have developed the Student Dashboard with technology provider Solutionpath, embracing the views of staff and students to create a platform that can help every student realise their full potential and inform how NTU delivers great individualised learning.

NTU’s expertise in the growing field of Learning Analytics is now widely recognised across the international Higher Education sector. Their use of StREAM was recently highlighted as a best practice case study in the Department for Education’s executive paper “Realising the potential of technology in education: A strategy for education providers and the technology industry”.

“There is no doubt that NTU’s award-winning Student Dashboard is changing the learning and teaching landscape. Our focus on engagement, not demographics or risk of failure, and the sharing of data between students and staff has created a positive, nuanced application of technology to support learning.”

Professor Eunice Simmons, Deputy Vice-Chancellor Academic and Student Affairs

The Learning Analytics data continues to show a strong correlation between engagement measured in the dashboard and student success, with students who use the Dashboard more frequently being more likely to succeed in their studies.

Furthermore, alerts raised by the system are very strong indicators that a student may be considering leaving their course. Since its launch in 2014, NTU’s cohort usage has been fantastic, with student log-ins exceeding 1.7 million.

Want to know more? Read the case studies from our community.


One of the most-pressing challenges facing the UK higher education sector is how it responds to the increasing prevalence of mental health conditions amongst students.


Article first published by WonkHE, 17/07/2019, written and contributed by Ed Foster, Student Engagement Manager, Nottingham Trent University.

Both official statistics and student surveys show often dramatic increases in the incidence of mental health conditions. Anxieties have been further raised by incidences of student suicide. Whilst these remain low and relatively stable compared to previous decades (Office for National Statistics, 2018), each one is one too many.

A range of organisations have conducted research and produced practical guidance (Student Minds, UUK, the Institute for Public Policy Research). However, significant challenges remain, particularly around resourcing student support at the start of a crisis, or at other times when it can be most effective. Finding the balance between supporting and ‘infantilising’ students is also a real consideration.

Setting aside arguments around the appropriateness of proactively seeking to support students, this piece explores the question of whether or not existing learning analytics resources (perhaps with some modification) could be used to identify students most in need of mental health support at a time that is most likely to lead to successful outcomes.

Learning Analytics

Nottingham Trent University has embedded learning analytics into institutional practice to help students manage their own success, to help staff support them, to improve student/staff working relationships and to improve institutional insights into the student experience.

The Dashboard generates daily engagement ratings based only on a student’s learning interactions with university resources, such as the library or online tools, avoiding more contentious areas such as socio-economic background. In 2016/17 (the year used for this analysis) the engagement ratings were ‘Low’, ‘Partial’, ‘Good’, and ‘High’.

Engagement data is generated and displayed to both students and relevant university staff, alongside other contextual information, using Solutionpath’s StREAM tool. If a student has no engagement for 14 days, the Dashboard sends tutors an email asking them to make contact with the student. As might be expected, there is a strong correlation between overall patterns of engagement and student success. Students with high average engagement are far more likely to progress and achieve higher grades than less engaged peers, and the ‘no engagement’ alerts are a strong early indicator that a student is at risk of unfavourable outcomes.
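The ‘no engagement’ alert described above amounts to a simple rule, which can be restated as a sketch. This is purely illustrative and not NTU’s or Solutionpath’s actual implementation; the 14-day window is the only detail taken from the text.

```python
# Illustrative only: flag a student after `window` consecutive days
# with zero recorded engagement, as described in the article.

def needs_contact(daily_engagement: list[float], window: int = 14) -> bool:
    """True when the most recent `window` days all show zero engagement."""
    recent = daily_engagement[-window:]
    return len(recent) == window and all(e == 0 for e in recent)

# A student silent for two weeks would trigger a tutor email; a single
# re-engaged day within the window resets the alert.
flag = needs_contact([0.0] * 14)
```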

Daily data can act as an early warning based on either low engagement or unexpected changes in engagement behaviour. Further to supporting the needs of the whole student population, we posit there is particular potential to provide support to students with mental health conditions at the point when a mental health incident may be starting.

Can ‘engagement’ act as a proxy?

At NTU, the majority of students with mental health conditions progress from the first to second year. For example, in 2016/17, 82% of NTU first years with mental health conditions progressed compared with 84% of first year students with no reported disabilities. However, there are differences in engagement patterns between the two groups from the very start of the year. For example,

  • In the first term, a lower proportion of students with reported mental health conditions had ‘Good’ or ‘High’ engagement than their peers with no reported disability (63% and 74% of students respectively)
  • A higher proportion of first year students with reported mental health conditions generated 14-day no-engagement alerts than their peers with no reported disability (7% and 4% of students respectively)

The data demonstrated that students with mental health conditions are less likely to be highly engaged with their studies. This is important because engagement is such a strong predictor of the likelihood of success:

  • First years with mental health conditions and ‘Good’/‘High’ average engagement = 91% progression
  • First years with mental health conditions and ‘Low’/‘Partial’ average engagement = 72% progression
[Figure: Undergraduates with Mental Health Conditions]

This 19-percentage-point difference dwarfs the 2-point progression gap between students with mental health conditions and their peers with no disabilities. Importantly, engagement data has the potential to target support to the students most in need during a particular period, and to do so more effectively than using group characteristics alone.


Insights to act on

Institutions need to develop strategies for supporting students with mental health conditions at the point where they need it most. A range of tools already exist based on real-world interaction, including tutor relationships with students and campaigns encouraging students to look out for their friends. Nonetheless, learning analytics gives a different set of insights to be used alongside existing options to spot students with mental health conditions at the point where they most need help.

Average engagement is perhaps most useful to provide contextual information to support tutors’ observations or improve the quality of a tutorial conversation. The ‘no engagement’ alerts appear to work as an effective early warning system at the point when a student with mental health conditions may be in most need of support. Clearly, challenges remain about the most effective way of supporting those students, but we feel there is potential to be explored.

To learn more about how Learning Analytics can be used, please get in touch to speak to us.


Implementing Learning Analytics can appear a daunting task for some institutions. We have created a six-part guide that covers some of the key areas to consider when thinking about implementing technology to support Learning Analytics in your institution.


Part 1 – What data gives results

It’s important that any data collected and processed for developing Learning Analytics (LA) is used to drive action. Data must reflect the students’ current learning path and be actionable. Aggregating data and applying analysis retrospectively will not enable pre-emptive interventions. It’s not just about which data you use, but how you use it.

Using live behavioural data from a range of the institution’s operational systems is key to deriving insight into how a student is engaging with their studies. To support students when necessary, it is vital to understand when their behaviour changes, at a point in time when something can be done.

Solutionpath’s StREAM algorithm aggregates institutional data from a range of sources, provides a daily engagement score, and raises alerts on behavioural changes. This enables university staff to pre-empt issues affecting a student’s welfare or learning journey and act accordingly.

We are now working with 15 Higher Education institutions (and counting), and for each implementation we have created proven, out-of-the-box connectivity for most systems. Each institution we work with has variations in the data sources used to drive insight and action. There are many sources that can be fed into an algorithm to support LA activity; it’s not a one-size-fits-all approach. If you’d like to know more about the different data sources and inputs we work with, contact us and we’ll be happy to discuss.
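As an illustration only, a crude version of this kind of multi-source aggregation might look like the sketch below. The source names, weights and normalisation are entirely hypothetical and are not Solutionpath’s actual model.

```python
# Hypothetical sketch: combine normalised activity from several
# institutional systems into one daily engagement score (0-1).
# Source names and weights are invented for illustration.

DAILY_WEIGHTS = {
    "vle_logins": 0.4,      # e-learning portal activity
    "library_visits": 0.3,  # campus/library swipe data
    "attendance": 0.3,      # tutorial/lecture attendance
}

def daily_engagement_score(activity: dict) -> float:
    """Weighted sum of capped activity counts, one per source."""
    return sum(DAILY_WEIGHTS[src] * min(activity.get(src, 0), 1)
               for src in DAILY_WEIGHTS)

# Example: a student who logged in to the VLE and attended a tutorial,
# but did not visit the library.
score = daily_engagement_score({"vle_logins": 1, "attendance": 1})
```

In practice each institution would feed in a different mix of sources, which is why the weighting table here is just a stand-in.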


Part 2 – What functionality is key?

To deliver the objective of supporting students in their attainment goals, it’s key that any Learning Analytics technology supports the student’s learning journey and motivates them to progress.

[Figure: Student Learning Journey]

Below is a quick summary of the functionality that would be considered key to any successful Learning Analytics technology.

  • Empowers the student – allow them access to their own data through their mobile device
  • Enables fast action – use alerts, notifications and messaging to highlight changes in engagement behaviour
  • Provides a streamlined student experience – ensure interoperability with other core systems for effective case management
  • Maintains consistent dialogue – notes, reminders and student insight functionality should enable coherent and consistent interactions

 


Part 3 – How predictive is data in identifying students at risk?

When data is transformed in the right way, Learning Analytics is an invaluable tool for identifying students who are at risk of leaving their course. The use of near-real-time behavioural data means that flags can be raised at the very point a student’s behaviour changes. This enables university staff to act immediately on data, rather than waiting for traditionally observable behaviour, and allows universities to focus effort where and when necessary, generating efficiencies in pastoral activities.

Solutionpath StREAM technology has a predictive component – it is unique in being able to categorise students by their propensity for persistence based on their engagement with the university.

Our experience highlights that a strong LA solution can identify students at risk eight weeks earlier, on average, than traditional methods; supported by a strong framework to manage interventions, this can lead to decreases in student attrition.
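A minimal sketch of stratifying students into risk cohorts by engagement might look like this. The thresholds and band labels are invented for illustration and do not describe StREAM’s predictive component.

```python
# Hypothetical sketch: map an average engagement score (0-1) to a
# risk cohort. Thresholds and labels are illustrative only.

def risk_band(avg_engagement: float) -> str:
    """Assign a student to a cohort by average engagement."""
    if avg_engagement >= 0.75:
        return "High engagement - low risk"
    if avg_engagement >= 0.5:
        return "Good engagement"
    if avg_engagement >= 0.25:
        return "Partial engagement - monitor"
    return "Low engagement - at risk"

# Stratify a (toy) cohort so support effort can be prioritised.
cohorts = {s: risk_band(e) for s, e in
           {"A": 0.9, "B": 0.6, "C": 0.1}.items()}
```

The point of the sketch is the automation: once scores exist, cohorting for prioritisation is a mechanical step rather than a manual review.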


Part 4 – How easy is it to implement Learning Analytics technology?

Many institutions that embark on the Learning Analytics journey encounter obstacles when implementing the technology, owing to legacy systems and the difficulty of extracting and transforming their data to make it useful.

No university is codified for LA: these systems sit downstream from a range of business systems and processes that are not maintained for the purpose of supporting an LA solution. This is why it pays to work with a partner with extensive experience of delivering against a commercial-level Service Level Agreement (SLA).

[Figure: Implementing Learning Analytics]

Ultimately, to gain the most actionable insight from an LA implementation, data must be aggregated across a number of organisational systems, which can take time and resource. The data then needs to be analysed, and algorithms generated, to enable action. With no prior experience, the time involved could be extensive.

At Solutionpath we have now implemented this process with a large number of institutions and developed solutions with a wide range of pre-integrated, out-of-the-box connectors, enabling us to easily ingest and transform data. This reduces the technical burden on IT departments, removing any requirement for internal development resource to be deployed to an LA project involving Solutionpath StREAM technology.


Part 5 – What if the input data is no good?

Many institutions are held up in their implementation of Learning Analytics by concerns over data quality and security. Projects often stall over fears that data quality will impede the results, and a view is taken that value cannot be driven from existing data assets.

Due to the unique way in which Solutionpath’s StREAM technology works, issues in data quality are overcome through data source calibration with an algorithm that adapts to the data input quality to provide actionable insight.


[Figure: StREAM Time to Value]

Part 6 – How quickly should value be delivered from Learning Analytics?

If the process of implementing Learning Analytics is approached in the right way, the data and technology can be used immediately to support institutions in their pastoral support processes.

The StREAM algorithm is designed to aggregate data and give an immediate reading of a student’s engagement. This is then calibrated against their cohort to determine how well they are engaging relative to their peer group. The technology then uses the data to trigger alerts where students are disengaging. Ultimately, this means that immediate interventions can be actioned to engage with students and understand their current situation.
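A toy version of cohort calibration might look like the following. The z-score approach and the alert threshold are our illustrative assumptions, not a description of the StREAM algorithm.

```python
# Illustrative only: calibrate a student's engagement against their
# cohort and flag sharp disengagement. The threshold is hypothetical.
from statistics import mean, stdev

def cohort_zscore(student_score: float, cohort_scores: list[float]) -> float:
    """How far a student's score sits from the cohort average, in SDs."""
    return (student_score - mean(cohort_scores)) / stdev(cohort_scores)

def should_alert(z: float, threshold: float = -1.5) -> bool:
    """Trigger an alert when a student falls well below their peer group."""
    return z < threshold

# A student scoring 0.2 against a cohort averaging 0.7 sits far below
# their peers, so an alert would be raised.
cohort = [0.7, 0.8, 0.6, 0.75, 0.65]
z = cohort_zscore(0.2, cohort)
alert = should_alert(z)
```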

To learn more about how to implement Learning Analytics, we’ll be happy to share our experience with you. Please get in touch to speak to us.


Demographics in Higher Education are important. They can be used to identify disparity in academic achievement, support targeted provision for those less privileged and track demographic changes in student populations – aiding institutional planning. But what place do they have in Learning Analytics?

Demographics feed into a number of key strategy issues – like recruitment and resource planning, Higher Education Statistics Agency reporting and access and participation targets. It’s therefore no surprise that in our work supporting sector planning teams, demographic splits are regarded as crucial data points for supporting university decision making.


Learning analytics

With the rise of data-led machine learning techniques, it is now possible to use data to enhance our understanding of learners and their learning environments. The result has been the rapid emergence of ‘Learning Analytics’ within the sector: algorithms built on data that support decision making and offer new areas of insight. These developments mean we can now measure and codify student activity to support progress, attainment and retention at both institutional and individual level – often through targeted, real-time interactions specific to an individual student’s needs.

There are clear attainment differences across many demographic groups, so the starting point for many universities is to consider ‘demographic factors’ as an influence within any analytical model. But what do demographics do in this context? Typically, they reinforce what we already know, and they can predict a student ‘outcome’ – students from postcode X might do less well, so a system predicts underperformance.

If nothing changes along the students’ learning pathway, then the algorithm will be predictive – a factory production line with the same inputs and processes will deliver the same outputs every time. However, to change a student’s trajectory, either the university or the student has to do something different. As demographics do not necessarily change, we should focus on those factors that are within each individual student’s gift to manage.


The dangers in execution

In the work we do with universities, a danger we have identified is that data analysis algorithms that embed demographic factors hold the potential for bias. Whenever a mathematical model is created, weight is attributed to certain conditions, which then indicates an outcome. Gender and racial bias in algorithms are topics of heated debate in the sector, and latent bias exists in many Learning Analytics approaches that we have seen.

Algorithms that rank a student’s ‘potential’ on the basis of a demographic attribute showing that a particular group hasn’t fared as well are biased. What if a BAME student was written off by a tutor because, in their view, it was a pointless endeavour, and the ‘system’ allows them to confirm that view? What if a white student was having mental health issues and was overlooked just because they came from a privileged background?

If the start-point (demographics) is an issue, then the end-point may also need further consideration. Blanket assumptions about what constitutes ‘success’ also influence the goals set in an algorithm. If we only measure success with a grade outcome, this fails to recognise the full value of higher education by simply providing a classification at the end of the process. Focussing on ‘predicting’ a grade award can create negative perceptions among students: why should a Black and Minority Ethnic student be potentially downgraded in their ‘outcome predictions’ because of their heritage?


Solutions to problems

There are solutions to these problems. By working with KPMG, Solutionpath offers the benefit of experience in managing transformation and student journey projects, helping providers with the challenges associated with driving service adoption. This includes the ethical as well as the operational aspects of data.

In our work with Nottingham Trent University (NTU), the university gives students access to their own data, and has done so right from the start. The project team (including students’ union officers and a representative from the equality & diversity unit) was worried about bias in scoring, and resolved that it would be profoundly demotivating to use student demographics in the engagement algorithm. As Ed Foster, NTU’s Student Engagement Manager, makes clear:

“Two students engaged in precisely the same way (took out the same textbooks, logged in to the VLE the same amount) where one student was from a disadvantaged background would be at risk of ending up with a lower score.” For NTU, what their students do is more important than who they are.

The General Data Protection Regulation (GDPR) obligates institutions to share how automated decisions are derived, and to offer students a means to challenge these. This could have huge consequences – when a tutor is asked for the reason they have requested an academic review with a student, are they ready to justify the “at risk” flag? And does the institution even understand how the calculation has been reached to be able to explain it?
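A transparent scoring model makes that explanation possible. The following is a minimal sketch only – the feature names, weights and threshold below are entirely hypothetical, not StREAM’s actual model – showing how a simple weighted-sum engagement score can be itemised factor by factor, giving a tutor something concrete to point to when a student challenges an automated “at risk” flag:

```python
# A minimal sketch of a transparent weighted-sum engagement score.
# All feature names, weights and the threshold are hypothetical,
# not StREAM's actual model; the point is that every contribution
# can be itemised when an automated "at risk" flag is challenged.

def explain_score(activity, weights, threshold=0.5):
    """Return the total engagement score plus a per-factor breakdown."""
    contributions = {name: weights[name] * activity.get(name, 0.0)
                     for name in weights}
    total = sum(contributions.values())
    return {
        "score": total,
        "at_risk": total < threshold,
        "breakdown": contributions,  # the explanation a tutor can give
    }

# Example: a student with low recent activity across three data points.
report = explain_score(
    activity={"campus_swipes": 0.2, "vle_logins": 0.1, "tutorial_attendance": 0.4},
    weights={"campus_swipes": 0.3, "vle_logins": 0.3, "tutorial_attendance": 0.4},
)
```

Because the breakdown is just the weighted inputs, there is no ‘black box’: the same numbers that drive the flag are the numbers used to explain it.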

No university wants bad press, and this simple factor alone could limit Learning Analytics to research and closed-door planning. This would be a shame because, in our experience, Learning Analytics done well can become a truly democratic tool that offers students a means to successful change.

Want to know more? Read our blog ‘What effect does demographic data have on the science behind the algorithm?’


WHAT EFFECT DOES DEMOGRAPHIC DATA HAVE ON THE SCIENCE BEHIND THE ALGORITHM?

Learning Analytics enable universities to use algorithms to support decision making. But what effect does student demographic data have on the science behind the algorithm?

Demographic data undoubtedly plays a major part in Higher Education. We understand that some student groups do less well than others and strategies to support these groups should be put in place to improve their likelihood of success. However, in adding demographic filters to Learning Analytics projects, institutions run the risk of demotivating their students and missing valuable opportunities to provide assistance to those in need of academic development or wellbeing support.

Demographics in Higher Education are important; analysis of them can:

  1. Identify disparity in academic achievement
  2. Support targeted provision of inclusive and positive action for those less privileged
  3. Track demographic changes in student populations, aiding institutional planning

Demographics can also feed into a number of key aspects of university life, such as recruitment and resource planning, reporting for the Higher Education Statistics Agency (HESA) and meeting the fair-access obligations of the new regulator, the Office for Students. Planning teams are therefore familiar with demographics as important data points for supporting university activities – and this is unlikely to change.

But with the rise of data-led, machine-learning techniques, it is now possible to use student data to enhance our understanding of learners and their learning environments. This has resulted in the rapid emergence within the sector of learning analytics, which use algorithms to support decision making and offer new areas of insight.

Learning analytics provide the ability to measure and codify students to support progress, attainment and retention at institutional and individual-student levels, automatically allowing targeted, real-time interactions specific to individual students’ needs.


Demographic factors within the analytics model

Naturally, the starting point for many universities is to consider demographic factors as an influence within any analytical model. For one, there are clear attainment differences between many demographic groups – but what do demographics do in this context? Typically, they reinforce what we already know and they can predict a student outcome: students from a certain postcode do less well, so a system predicts underperformance for them.

But here is the rub: if nothing changes along a student’s learning pathway, then the algorithm will be predictive. A factory production line with the same inputs and processes will deliver the same outputs every time. To change a student’s trajectory, you need to do something different – or they need to be encouraged to do something different.

More importantly, demographics do not necessarily change. No one can change their background, where they come from or their ethnicity. So surely we need to focus on those factors that are within each individual student’s gift to manage?

My issue is that organisations embed demographic factors into data algorithms that hold the potential to operate with bias. We must acknowledge that there is no free pass when it comes to an algorithm: there is inherent bias in creating a mathematical model, and weight will be attributed to certain conditions, which then indicates an outcome. Moreover, no algorithm is universal in nature; some are good for certain conditions, others are not.
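To make the weighting point concrete, here is an illustrative sketch – with entirely hypothetical feature names and weights, not any real institution’s model – of how attributing weight to a demographic condition biases a simple linear score: two students with identical engagement behaviour receive different outcomes the moment the demographic attribute carries any weight.

```python
# Illustrative sketch only: a toy linear model showing how a weighted
# demographic condition biases the outcome. Feature names and weights
# are hypothetical, chosen purely to demonstrate the mechanism.

def risk_score(features, weights):
    """Weighted sum of conditions -> an 'outcome' indicator."""
    return sum(weights[name] * value for name, value in features.items())

# Two students with identical engagement behaviour.
student_a = {"vle_logins": 0.8, "library_loans": 0.7, "low_income_postcode": 0}
student_b = {"vle_logins": 0.8, "library_loans": 0.7, "low_income_postcode": 1}

# Behaviour-only model: the demographic condition carries no weight.
behaviour_only = {"vle_logins": 0.5, "library_loans": 0.5, "low_income_postcode": 0.0}

# Demographic-aware model: the postcode condition is weighted, so
# identical behaviour now yields different scores.
with_demographics = {"vle_logins": 0.5, "library_loans": 0.5, "low_income_postcode": -0.3}

# Same behaviour, same score under the behaviour-only model...
assert risk_score(student_a, behaviour_only) == risk_score(student_b, behaviour_only)
# ...but a lower score for student B once demographics carry weight.
assert risk_score(student_a, with_demographics) > risk_score(student_b, with_demographics)
```

The bias is not in the arithmetic – both models compute the same weighted sum – it is in the decision to give the demographic condition a non-zero weight in the first place.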

We have seen high-profile cases (including Facebook) where algorithms influence social media news feeds by promoting stories that the algorithm suggests are most relevant to the user. This creates individual echo chambers where a user continually hears and sees the same curated content, so their world view becomes a list of what the system sees as their interests. Worryingly, opposing views are extinguished (or curated out).

This is not a new argument; gender and racial bias in algorithms are topics of heated debate, but my argument is that latent bias exists in many Learning Analytics approaches. If we give a machine biased student data, then the machine will be biased in its outputs.

Consider ranking a student’s potential on the basis of a demographic attribute showing that, historically, a particular group has not fared as well. Or planning support for particular students because of inherent bias within the algorithm.

Selecting a Black, Asian and Minority Ethnic (BAME) student over a white student when deciding whom to offer outreach support is still bias. A system that represents students fairly and equally needs to be at the heart of any initiative.

For instance, what if a white student had mental health issues yet they were overlooked just because they came from a privileged background? Judging the assignment of resources on a familial history trait is fundamentally wrong. Similarly, what if a BAME student was written off by a tutor because in their view, offering additional support was a pointless endeavour and the system allows them to confirm that view?

If the start-point (demographics) is an issue, does the end-point also need further consideration? Blanket assumptions about what constitutes success also influence the goals set in an algorithm. For example, if we only measure success with a grade outcome, this fails to recognise the full value of higher education by simply providing a classification at the end of the process.

Furthermore, focussing on predicting a grade creates unwanted consequences in the form of negative perceptions among students. Why should a Black and Minority Ethnic student be potentially downgraded in their ‘outcome predictions’ because of their heritage? Yet we assign significance to a single, all-encompassing demographic attribute. Demographics do not represent who we are; we are the product of a complex picture of many influences – society, belief, motivations, culture and so on.

Learning Analytics

Learning Analytics offers Higher Education a means to change: unpacking the learning process in a way that allows for more rapid and responsive institutions, offering better, more personalised support when it is actually required, and allowing students to derive the very best value from their fees.

Institutions can gain new and significant insights from Learning Analytics if the start and endpoints are defined with an awareness of what they are there to achieve.

The new General Data Protection Regulation (GDPR) means institutions are obligated to share how automated decisions are derived and offer students a means to challenge these. This has huge consequences when considering an academic tutor discussing the reason they have asked for an academic review with a student because a system has identified them as ‘at risk’.

Could an academic tutor respond? Moreover, does the institution even understand how the calculation has been reached to be able to explain it? This simple factor alone could limit learning analytics to research and closed-door planning rather than becoming a truly democratic tool that offers students a means to be successful and change.

Universities are investing in ways to maximise the return from their resources and are becoming more reliant on tools to support better decision making. But at what cost? Ignoring the fact that algorithms have consequences could cost more than red faces and a legal case.

See how Solutionpath’s StREAM platform works


Mental health conditions within the student community are rising, yet just under half of those affected still choose not to disclose it to their university.

In September last year, the Institute for Public Policy Research (IPPR) published their findings from a student wellbeing project in the Improving Student Mental Health in the UK’s Universities report. Almost 50% of students with a mental health condition are choosing not to disclose this information to their university, and less than one third of Higher Education Institutions have designed an explicit mental health and wellbeing strategy.

Over the past 10 years, there has been a fivefold increase in the number of students who disclose a mental health condition to their institution (IPPR, 2017).

The research, which was funded by Universities UK and the Mental Health and Wellbeing In Higher Education (MHWBHE) Group, acknowledges that levels of poor mental health and wellbeing amongst UK Higher Education students have risen recently and are high in comparison to their wider peer group. The IPPR, a progressive policy think tank, suggests that this is due to a combination of academic and financial factors, and social pressures.

Just under half of students who report experiencing a mental health condition still choose not to disclose it to their university.

The report highlighted that in 2015/16, 15,395 first year students in the UK disclosed a mental health condition – a figure five times greater than in 2006/07. It also highlighted that almost 50% of students with a mental health condition are choosing not to disclose this to their university, preventing them from accessing the help they need at a time when they could be at their most vulnerable. Poor mental health and wellbeing can affect students’ academic performance and desire to remain in higher education. In the most severe and tragic circumstances, it can contribute to death by suicide – levels of which have also increased among students in recent years.

In 2015, the number of students who experienced mental health problems and dropped out of university increased by 210 per cent in comparison to data from 2010.

Findings extracted from the report support this claim: in 2015, the number of students who experienced mental health problems and dropped out of university increased by 210 per cent in comparison to data from 2010. In the same time period, the number of student suicides also increased by 79 per cent.

So, what can universities do to meet the challenge? And what more can be done?

The final chapters of the report set out a number of recommendations for the sector to consider.

The IPPR (2017) found that 71 per cent of UK universities do not have an explicit mental health and wellbeing strategy.

The first advises that student mental health and wellbeing should become a strategic priority for the sector. Variation exists across the board when it comes to the delivery of a strategic response to wellbeing. No two institutions are alike in their approach to mental health challenges, and there is little in terms of guidance around sector best practice on how best to support the student community. The IPPR (2017) found that 71 per cent of UK universities do not have an explicit mental health and wellbeing strategy.

Many of our customers cite organisational culture and processes as the main barriers to a successful engagement project. An approved strategy could seek to address challenges around university-wide buy-in for the adoption of a digital software solution.

The second point for consideration is to focus on early intervention, risk management and specialist care referrals through use of a university-wide digital platform. The IPPR claims the use of software to monitor attendance promotes self-determined learners and can be invaluable in the identification of disengaged students; encouragingly, the study found that only 29 per cent of UK institutions do not monitor the attendance of all students.

It is widely recognised that institutions embark upon Learning Analytics projects for a variety of reasons; often to enhance the student experience, to support student retention or to improve academic attainment – or sometimes, to achieve a blend of objectives. However, for some, the benefits that engagement analytics can bring to the wellbeing agenda have yet to be considered.

Many of our customers have found that their digital engagement platform is well positioned to spot behavioural changes in individual students. It is this information that enables the university to initiate the pastoral care conversation, and offer advice and guidance to support the student during their time at the institution.

Irrespective of the direction a university chooses to take, mental health and wellbeing is attracting considerable interest within the Higher Education sector. Get in touch to learn more.
