ARTICLE TITLE LOREM IPSUM DOLOR SIT AMET, ADIPISCING ELIT SED DO EIUSMOD TEMPOR.

Implementing Learning Analytics can appear a daunting task for some institutions, so we have created a six-part guide covering the key areas to consider when thinking about implementing technology to support Learning Analytics at your institution.

Part 1 – What data gives results?

It’s important that any data collected and processed for developing Learning Analytics (LA) is used to drive action. Data must reflect the students’ current learning path and be actionable. Aggregating data and applying analysis retrospectively will not enable pre-emptive interventions. It’s not just about which data you use, but how you use it.

Using live behavioural data from a range of the institution’s operational systems is key to deriving insight into how a student is engaging with their studies. To support students when necessary, you need to understand when their behaviour changes, at a point in time when something can still be done.

Solutionpath’s StREAM algorithm aggregates institutional data from a range of sources, provides a daily engagement score and raises alerts when behaviour changes. This enables university staff to pre-empt issues affecting a student’s welfare or learning journey and act accordingly.
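
To make this concrete, here is a minimal sketch of the kind of aggregation described above. It is purely illustrative and not the StREAM algorithm itself; the data sources, weights and thresholds are assumptions.

```python
from datetime import date

# Hypothetical daily activity counts for one student. In practice these would
# come from the institution's operational systems (VLE, library, attendance, etc.).
daily_activity = {"vle_logins": 3, "library_loans": 1, "attendance_events": 2}

# Illustrative weights per source; a real solution would calibrate these
# against each institution's own data.
weights = {"vle_logins": 0.5, "library_loans": 0.2, "attendance_events": 0.3}

def engagement_score(activity, weights):
    """Combine per-source activity counts into a single daily score."""
    return sum(weights[source] * count for source, count in activity.items())

def has_disengaged(today, recent_average, drop_threshold=0.5):
    """Flag a behavioural change when today's score falls well below the recent average."""
    return today < recent_average * drop_threshold

score = engagement_score(daily_activity, weights)
if has_disengaged(score, recent_average=3.0):
    print(f"{date.today()}: engagement dropped to {score:.1f} - alert staff")
else:
    print(f"{date.today()}: engagement score {score:.1f}")
```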

We are now working with 15 Higher Education institutions (and counting), and across these implementations we have created proven, ‘out of the box’ connectivity for most systems. Each institution we work with has variations in the data sources used to drive insight and action. Many sources can be fed into an algorithm to support LA activity; it’s not a one-size-fits-all approach. If you’d like to know more about the different data sources and inputs we work with, contact us and we’ll be happy to discuss.

 

Part 2 – What functionality is key?

To support students in meeting their attainment objectives, it’s key that any Learning Analytics technology supports the student’s learning journey and motivates them to progress.

Figure: Student Learning Journey

Below is a quick summary of the functionality that would be considered key to any successful Learning Analytics technology.

  • Empowers the student – allow them access to their own data through their mobile device
  • Enables fast action – use alerts, notifications and messaging to highlight changes in engagement behaviour
  • Provides a streamlined student experience – ensure interoperability with other core systems for effective case management
  • Maintains consistent dialogue – notes, reminders and student insight functionality should enable coherent and consistent interactions

 

Part 3 – How predictive is data in identifying students at risk?

When data is transformed in the right way, Learning Analytics is an invaluable tool for identifying students who are at risk of leaving their course. The use of near-real-time behavioural data means that flags can be raised at the very point a student’s behaviour changes. This enables university staff to act immediately on the data, rather than waiting for behaviour to become observable through traditional methods. It also enables universities to focus effort where and when necessary, generating efficiencies in pastoral activities.

Solutionpath StREAM technology has a predictive component – it is unique in being able to categorise students by their propensity for persistence based on their engagement with the university.

Our experience shows that a strong LA solution can identify students at risk eight weeks earlier, on average, than traditional methods. Supported by a strong framework for managing the interventions, this can lead to decreases in student attrition.

 

Part 4 – How easy is it to implement Learning Analytics technology?

Many institutions that embark on the Learning Analytics journey encounter obstacles when implementing the technology, typically because existing legacy systems make it difficult to extract and transform the data into something useful.

No university is set up with LA in mind: an LA solution sits downstream from a range of business systems and processes that are not maintained for the purpose of supporting it. This makes it important to work with a partner that has extensive experience of delivering against a commercial-level Service Level Agreement (SLA).

Figure: Implementing Learning Analytics

Ultimately, gaining the most actionable insight from an LA implementation requires aggregating data across a number of organisational systems, which takes time and resource. The data then needs to be analysed and algorithms generated to enable action from the data. Without prior experience, the time involved can be extensive.

At Solutionpath we have now implemented this process with a large number of institutions and developed a wide range of pre-integrated, out-of-the-box connectors that enable us to easily ingest and transform data. This reduces the technical burden on IT departments, removing the requirement for internal development resource to be deployed on any LA project involving Solutionpath StREAM technology.

 

Part 5 – What if the input data is no good?

Many institutions are held up in their implementation of Learning Analytics by concerns over data quality and security. Projects often stall because of fears that data quality will impede the results, and a view takes hold that value cannot be driven from existing data assets.

Due to the way Solutionpath’s StREAM technology works, issues in data quality are overcome through data source calibration: the algorithm adapts to the quality of the input data and still provides actionable insight.
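
As a rough illustration of what ‘data source calibration’ can mean in practice (a sketch of the general idea rather than Solutionpath’s implementation; the coverage-based weighting below is an assumption), incomplete feeds can simply be down-weighted so that gaps degrade the output gracefully rather than breaking it:

```python
def calibrated_weights(base_weights, coverage):
    """Scale each source's base weight by the share of records it actually delivers,
    then renormalise so the weights still sum to one."""
    scaled = {src: base_weights[src] * coverage.get(src, 0.0) for src in base_weights}
    total = sum(scaled.values()) or 1.0
    return {src: weight / total for src, weight in scaled.items()}

# Hypothetical inputs: the door-access feed at this institution is only half complete.
base = {"vle": 0.5, "attendance": 0.3, "door_access": 0.2}
coverage = {"vle": 1.0, "attendance": 0.9, "door_access": 0.5}
print(calibrated_weights(base, coverage))
# The patchy source contributes less to the overall engagement score.
```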

Figure: StREAM Time to Value

Part 6 – How quickly should value be delivered from Learning Analytics?

If the process of implementing Learning Analytics is approached in the right way, the data and technology can be used immediately to support institutions in their pastoral support processes.

The StREAM algorithm is designed to aggregate data and give an immediate reading of a student’s engagement. This is then calibrated against their cohort to determine how well they are engaging relative to their peer group. The technology then uses the data to trigger alerts where students are disengaging. Ultimately this means that immediate interventions can be actioned to engage with students and understand their current situation.
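
A minimal sketch of what calibrating against the cohort could look like (illustrative only; the percentile cut-off is an assumption, not a description of StREAM’s actual method):

```python
def cohort_percentile(student_score, cohort_scores):
    """Return the fraction of the cohort scoring at or below this student."""
    return sum(s <= student_score for s in cohort_scores) / len(cohort_scores)

def should_alert(student_score, cohort_scores, cutoff=0.1):
    """Trigger an alert when the student sits in the bottom slice of their peer group."""
    return cohort_percentile(student_score, cohort_scores) <= cutoff

# Invented engagement scores for a small cohort and one student.
cohort = [4.2, 3.8, 5.1, 2.9, 4.7, 3.3, 4.0, 4.9, 3.6, 4.4]
student = 2.5
print(f"Percentile: {cohort_percentile(student, cohort):.0%}, alert: {should_alert(student, cohort)}")
```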

If you’d like to learn more about how to implement Learning Analytics, we’ll be happy to share our experience with you. Please get in touch to speak to us.


ARTICLE TITLE LOREM IPSUM DOLOR SIT AMET, ADIPISCING ELIT SED DO EIUSMOD TEMPOR.

Demographics in Higher Education are important. They can be used to identify disparity in academic achievement, support targeted provision for those less privileged and track demographic changes in student populations – aiding institutional planning. But what place do they have in Learning Analytics?

Demographics feed into a number of key strategy issues – like recruitment and resource planning, Higher Education Statistics Agency reporting and access and participation targets. It’s therefore no surprise that in our work supporting sector planning teams, demographic splits are regarded as crucial data points for supporting university decision making.

Learning analytics

With the rise of data-led machine learning techniques, it is now possible to use data to enhance our understanding of learners and their learning environments – resulting in the rapid emergence of ‘Learning Analytics’ within the sector: analytics that use algorithms to support decision making and offer new areas of insight. These developments mean that we are now able to measure and codify students to support progress, attainment and retention at institutional and individual student levels – often through targeted, real-time interactions that are specific to an individual student’s needs.

There are clear attainment differences across many demographic groups, so the starting point for many universities is to consider ‘demographic factors’ as an influence within any analytical model. But what do demographics do in this context? Typically, they reinforce what we already know and they can predict a student ‘outcome’ – students from postcode X might do less well, so a system predicts underperformance.

If nothing changes along the students’ learning pathway, then the algorithm will be predictive – a factory production line with the same inputs and processes will deliver the same outputs every time. However, to change a student’s trajectory, either the university or the student has to do something different. As demographics do not necessarily change, we should focus on those factors that are within each individual student’s gift to manage.

The dangers in execution

In the work we do with universities, a danger we have identified is that data analysis algorithms that embed demographic factors hold the potential for bias. Whenever a mathematical model is created, weight is attributed to certain conditions, which then indicates an outcome. Gender and racial bias in algorithms are topics of heated debate in the sector, and latent bias exists in many Learning Analytics approaches that we have seen.

Algorithms that rank a student’s ‘potential’ on the basis of a demographic attribute which shows that a particular group hasn’t fared as well are biased. What if a Black, Asian and Minority Ethnic (BAME) student was written off by a tutor because, in their view, it was a pointless endeavour, and the ‘system’ allowed them to confirm that view? What if a white student was having mental health issues and was overlooked just because they came from a privileged background?

If the start-point (demographics) is an issue, then the end-point may also need further consideration. Blanket assumptions about what constitutes ‘success’ also influence goals in an algorithm. If we only measure success with a grade outcome, this fails to recognise the full value of higher education by simply providing a classification at the end of the process. Focussing on ‘predicting’ a grade award can create negative perceptions from students; why should a Black and Minority Ethnic student be potentially downgraded in their ‘outcome predictions’ because of their heritage?

Solutions to problems

There are solutions to these problems. By working with KPMG, Solutionpath offers the benefit of experience in managing transformation and student journey projects, helping providers with the challenges associated with driving service adoption. This includes the ethical as well as the operational aspects of data.

In our work with Nottingham Trent University (NTU), the university has given students access to their own data right from the start. The project team (including students’ union officers and a representative from the equality & diversity unit) was worried about bias in scoring, and resolved that it would be profoundly demotivating to use student demographics in the engagement algorithm. As Ed Foster, NTU’s Student Engagement Manager, makes clear:

“Two students engaged in precisely the same way (took out the same textbooks, logged in to the VLE the same amount) where one student was from a disadvantaged background would be at risk of ending up with a lower score – and for them what their students do was more important than who they are”

The General Data Protection Regulation (GDPR) obliges institutions to share how automated decisions are derived and to offer students a means to challenge them. This could have huge consequences – when a tutor is asked why they have requested an academic review with a student, are they ready to justify the “at risk” flag? And does the institution even understand how the calculation has been reached well enough to explain it?

No university wants bad press, and this simple factor alone could limit Learning Analytics to research and closed-door planning. This would be a shame, because in our experience, when done well, Learning Analytics can become a truly democratic tool that offers students a means to successful change.

Want to know more? Read our blog ‘What effect does demographic data have on the science behind the algorithm?’


ARTICLE TITLE LOREM IPSUM DOLOR SIT AMET, ADIPISCING ELIT SED DO EIUSMOD TEMPOR.

Learning Analytics enables universities to use algorithms to support decision making. But what effect does demographic data have on the science behind the algorithm?

Demographic data undoubtedly plays a major part in Higher Education. We understand that some student groups do less well than others and strategies to support these groups should be put in place to improve their likelihood of success. However, in adding demographic filters to Learning Analytics projects, institutions run the risk of demotivating their students and missing valuable opportunities to provide assistance to those in need of academic development or wellbeing support.

Demographics in Higher Education are important as analysis of them can:

  1. Identify disparity in academic achievement
  2. Support targeted provision of inclusive and positive action for those less privileged
  3. Track demographic changes in student populations, aiding institutional planning

Demographics also feed into a number of key aspects of university life, such as recruitment and resource planning, reporting to the Higher Education Statistics Agency (HESA) and meeting the fair-access obligations of the new regulator, the Office for Students. Planning teams are therefore familiar with demographics as important data points for supporting university activities – and this is unlikely to change.

But with the rise of data-led, machine-learning techniques, it is now possible to use data to enhance our understanding of learners and their learning environments. This has resulted in the rapid emergence within the sector of learning analytics which use algorithms to support decision making and offer new areas of insight.

Learning analytics provide the ability to measure and codify students to support progress, attainment and retention at institutional and individual-student levels. They allow targeted, real-time interactions, specific to individual students’ needs, to happen automatically.

Demographic factors within the analytics model

Naturally, the starting point for many universities is to consider demographic factors as an influence within any analytical model. For one, there are clear attainment differences between many demographic groups. But what do demographics do in this context? Typically, they reinforce what we already know and they can predict a student outcome: students from a certain postcode do less well, therefore a system predicts underperformance for them.

But here is the rub: if nothing changes along the students’ learning pathway, then the algorithm will be predictive. A factory production line with the same inputs and processes will deliver the same outputs every time. To change a student’s trajectory, you need to do something different – or they need to be encouraged to do something different.

More importantly, demographics do not necessarily change. No one can change their background, where they come from or their ethnicity. So surely we need to focus on those factors that are within each individual student’s gift to manage?

My issue is that organisations embed demographic factors into data algorithms that hold the potential to operate with bias. We must acknowledge that there is no free pass when it comes to an algorithm. There is inherent bias when creating a mathematical model and there will be a weight attributed to certain conditions which then indicates an outcome. Moreover, no algorithm is universal in nature. Some are good for certain conditions; others are not.

We have seen high-profile cases (including Facebook) where algorithms influence social media news feeds by promoting stories that the algorithm suggests are most relevant to the user. This creates individual echo chambers where a user continually hears and sees the same curated content, so their world view becomes a list of what the system sees as their interests. Worryingly, opposing views are extinguished (or curated out).

This is not a new argument; gender and racial bias in algorithms are topics of heated debate but my argument is latent bias exists in many Learning Analytics approaches. If we give a machine biased data, then the machine will be biased in its outputs.
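
One practical safeguard, shown below as a sketch of the principle rather than a prescribed method (the field names are invented), is to keep demographic attributes out of the feature set the model ever sees, so that scores are driven by behaviour rather than background:

```python
# Hypothetical student record mixing behavioural and demographic fields.
record = {
    "vle_logins_per_week": 12,
    "library_visits_per_week": 3,
    "attendance_rate": 0.85,
    "ethnicity": "...",         # deliberately never passed to the model
    "home_postcode": "...",     # deliberately never passed to the model
    "household_income": "...",  # deliberately never passed to the model
}

BEHAVIOURAL_FEATURES = ["vle_logins_per_week", "library_visits_per_week", "attendance_rate"]

def model_inputs(record):
    """Select only behavioural fields; demographic attributes are excluded by design."""
    return {key: record[key] for key in BEHAVIOURAL_FEATURES}

print(model_inputs(record))
```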

Consider ranking a student’s potential on the basis of a demographic attribute that historically shows a particular group has not fared as well. Or planning support for particular students because of inherent bias within the algorithm.

Selecting a Black, Asian and Minority Ethnic (BAME) student over a white student when deciding whom to provide outreach support to is still bias. Having a system that represents students fairly and equally needs to be at the heart of any initiative.

For instance, what if a white student had mental health issues yet they were overlooked just because they came from a privileged background? Judging the assignment of resources on a familial history trait is fundamentally wrong. Similarly, what if a BAME student was written off by a tutor because in their view, offering additional support was a pointless endeavour and the system allows them to confirm that view?

If the start-point (demographics) is an issue, does the end-point also need further consideration? Blanket assumptions about what constitutes success also influence set goals in an algorithm. For example, if we only measure success with a grade outcome, this fails to recognise the full value of higher education by simply providing a classification at the end of the process.

Furthermore, focussing on predicting a grade creates unwanted consequences in the form of negative perceptions among students. Why should a Black and Minority Ethnic student be potentially downgraded in their ‘outcome predictions’ because of their heritage? Yet we assign significance to a single, all-encompassing demographic attribute. Demographics do not represent what we are. Each of us is the product of a complex picture of many influences: society, belief, motivations, culture and so on.

Learning Analytics offers Higher Education a means to change; unpacking the learning process in such a way that allows for more rapid and responsive institutions, offering better, more personalised support when it is actually required; and allowing a student to derive the very best value from their fees.

Institutions can gain new and significant insights from Learning Analytics if the start and end points are defined with awareness of what they are there to achieve.

The new General Data Protection Regulation (GDPR) means institutions are obligated to share how automated decisions are derived and offer students a means to challenge these. This has huge consequences when considering an academic tutor discussing the reason they have asked for an academic review with a student because a system has identified them as ‘at risk’.

Could an academic tutor respond? Moreover, does the institution even understand how the calculation has been reached to be able to explain it? This simple factor alone could limit learning analytics to research and closed-door planning rather than becoming a truly democratic tool that offers students a means to be successful and change.
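
One way to meet that obligation, sketched below as a general approach rather than a description of any particular product, is to store alongside each flag the behavioural factors that drove it, so a tutor has something concrete to point to (the fields and figures are invented):

```python
def explain_flag(student_features, cohort_averages):
    """List the behavioural factors where the student falls well below the cohort average."""
    reasons = []
    for feature, value in student_features.items():
        average = cohort_averages[feature]
        if average and value < 0.5 * average:
            reasons.append(f"{feature}: {value} vs cohort average {average}")
    return reasons

student = {"vle_logins_per_week": 1, "attendance_rate": 0.3, "library_visits_per_week": 2}
cohort = {"vle_logins_per_week": 8, "attendance_rate": 0.8, "library_visits_per_week": 2}
for reason in explain_flag(student, cohort):
    print(reason)
```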

Universities are investing in ways to maximise the return from their resources and are becoming more reliant on tools to support better decision making. But at what cost? Ignoring the fact that algorithms have consequences could cost more than red faces and a legal case.


ARTICLE TITLE LOREM IPSUM DOLOR SIT AMET, ADIPISCING ELIT SED DO EIUSMOD TEMPOR.

Mental health conditions within the student community are rising, yet just under half of those affected still choose not to disclose their condition to their university.

In September last year, the Institute for Public Policy Research (IPPR) published its findings from a student wellbeing project in the Improving Student Mental Health in the UK’s Universities report. Almost 50% of students with a mental health condition are choosing not to disclose this information to their university, and fewer than one third of Higher Education institutions have designed an explicit mental health and wellbeing strategy.

Over the past 10 years, there has been a fivefold increase in the number of students who disclose a mental health condition to their institution (IPPR, 2017).

The research, which was funded by Universities UK and the Mental Health and Wellbeing in Higher Education (MHWBHE) Group, acknowledges that levels of poor mental health and low wellbeing amongst UK Higher Education students have risen in recent years and are high in comparison to their wider peer group. The IPPR, a progressive policy think tank, suggests that this is due to a combination of academic and financial factors, and social pressures.

The report highlighted that in 2015/16, 15,395 first year students in the UK disclosed a mental health condition – a figure five times greater than in 2006/07. It also highlighted that almost 50% of students with a mental health condition are choosing not to disclose this to their university, preventing them from accessing the help they need at a time when they could be at their most vulnerable. Poor mental health and wellbeing can affect students’ academic performance and desire to remain in higher education. In the most severe and tragic circumstances, it can contribute to death by suicide – levels of which have also increased among students in recent years.

It is generally accepted that poor mental wellbeing can affect academic performance and the number of students leaving Higher Education early and, in the most extreme and tragic circumstances, can contribute to death by suicide. Findings from the report support this claim: in 2015, the number of students who experienced mental health problems and dropped out of university had increased by 210 per cent in comparison to 2010. Over the same period, the number of student suicides increased by 79 per cent.
So, what can universities do to meet the challenge? And what more can be done?

The final chapters of the report set out a number of recommendations for the sector to consider.

The first of these advises that student mental health and wellbeing should become a strategic priority for the sector. Variation exists across the board when it comes to delivering a strategic response to wellbeing: no two institutions are alike in their approach to mental health challenges, and there is little guidance on sector best practice for supporting the student community. The IPPR (2017) found that 71 per cent of UK universities do not have an explicit mental health and wellbeing strategy.

Many of our customers cite organisational culture and processes as the main barriers to a successful engagement project. An approved strategy could seek to address challenges around university-wide buy-in for the adoption of a digital software solution.

The second point for consideration is to focus on early intervention, risk management and specialist care referrals through the use of a university-wide digital platform. The IPPR claims that the use of software to monitor attendance promotes self-determined learners and can be invaluable in identifying disengaged students; notably, the study found that only 29 per cent of UK institutions do not monitor the attendance of all students.

It is widely recognised that institutions embark upon Learning Analytics projects for a variety of reasons; often to enhance the student experience, to support student retention or to improve academic attainment – or sometimes, to achieve a blend of objectives. However, for some, the benefits that engagement analytics can bring to the wellbeing agenda have yet to be considered.

Many of our customers have found that their digital engagement platform is well positioned to spot behavioural changes in individual students. It is this information that enables the university to initiate the pastoral care conversation, and offer advice and guidance to support the student during their time at the institution.

Irrespective of the direction a university chooses to take, mental health and wellbeing is attracting considerable interest within the Higher Education sector. Get in touch to learn more.


ARTICLE TITLE LOREM IPSUM DOLOR SIT AMET, ADIPISCING ELIT SED DO EIUSMOD TEMPOR.

In recent years researchers and educators have begun to explore how we can use new sources of data on students and their learning, together with predictive analytics techniques, to improve many aspects of the educational experience.

Learning Analytics is a hot topic in educational technology, with principals, vice chancellors and government education departments taking an increasing interest in the insight that can be gained from student data. But what exactly is Learning Analytics?

I’ve been immersed in this area for the past few years. In an attempt to understand the many aspects and potential benefits of this fascinating and rapidly evolving field, and to explain them to others, I’ve spent much of the last year writing a book about it.

Learning Analytics uses data about students and their learning to help understand and improve educational processes, and – crucially – to assist the learners themselves. Students are increasingly carrying out their learning online, using devices such as laptops, tablets and smartphones. This leaves a “digital footprint”, which can be automatically analysed, and combined with data about their backgrounds, past academic performance or their career aspirations. Educational content, activities or processes can then be adapted to provide an enhanced or more personalised experience for the learner. This promises to be better for students than the “one size fits all” approach to education, which has been deployed traditionally across much of higher and further education.

I’ve found four main uses of learning analytics in my discussions with experts in the field, and in the more than 600 articles and academic papers which I reviewed for my book:

 

Early alert and student success

All universities, colleges and schools have an interest in ensuring that their learners are learning effectively. Many institutions have problems with high rates of student attrition, and this has been responsible for much of the interest in learning analytics, particularly in the United States.

Early alert systems enable the automated identification of learners at risk of failure or withdrawal, sometimes very early on in their studies. Dashboards and automated alerts sent to teachers and personal tutors allow them to view the students who could most benefit from their input. Interventions at an early stage can result in students being retained who might otherwise have dropped out. Better retention has financial and reputational benefits for institutions. There can also, of course, be a huge impact on the self-esteem and career prospects of those individuals who ultimately succeed in their studies.

Other institutions have much less of a problem with students withdrawing. However, they’re aware that many of their students could perform better: with a clearer idea of how they’re learning, good students could become excellent. Are you on target for the degree classification you want to achieve? Models can be produced from historic data about previous students, and mapped onto your own data to show you what aspects of your engagement you need to improve on.
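
To give a flavour of how such a model might be built, here is a toy example (assuming scikit-learn is available; the features and figures are invented, not drawn from any real institution):

```python
from sklearn.linear_model import LogisticRegression

# Toy historic data: [weekly VLE logins, attendance rate] for past students,
# with 1 = completed the year and 0 = withdrew. All figures are invented.
X_history = [[2, 0.4], [10, 0.9], [1, 0.3], [8, 0.8], [3, 0.5], [12, 0.95], [0, 0.2], [9, 0.85]]
y_history = [0, 1, 0, 1, 0, 1, 0, 1]

model = LogisticRegression().fit(X_history, y_history)

# Map the model onto a current student's own engagement data.
current_student = [[4, 0.6]]
probability_of_persisting = model.predict_proba(current_student)[0][1]
print(f"Estimated chance of persisting: {probability_of_persisting:.0%}")
```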

Giving learners data about their own learning through dashboards and apps is an increasingly promising area for learning analytics. Building on the popularity of fitness apps, such software can help to motivate students, help them feel less isolated, and make them aware of patterns of activity in their learning which could be enhanced.

 

Course recommendation

A second use for learning analytics is recommending to students what courses or modules they should study next. We’ve become used to recommender systems when we buy items online: products are suggested to us by vendors such as Amazon, based on our previous purchases. Companies are increasingly using data about other customers “like you”, as well, to recommend items you might want to buy.

Similar underlying technologies are increasingly being deployed in many institutions, particularly universities in the United States, where students can be faced with a huge choice of courses. Recommender systems predict the ones where they’re most likely to succeed or which may help meet their career aspirations.
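
The underlying idea can be as simple as the following sketch, which suggests modules taken by the past students whose histories most resemble the current one (invented data, not any particular vendor’s system):

```python
# Invented records: modules taken by past students, and the current student's history.
past_students = {
    "student_a": {"Statistics", "Databases", "Machine Learning"},
    "student_b": {"Statistics", "Databases", "Data Visualisation"},
    "student_c": {"Philosophy", "Ethics", "Logic"},
}
current = {"Statistics", "Databases"}

def recommend(current, past):
    """Suggest modules taken by the past students whose histories overlap most with this one."""
    ranked = sorted(past.values(), key=lambda taken: len(taken & current), reverse=True)
    suggestions = set().union(*ranked[:2]) - current
    return sorted(suggestions)

print(recommend(current, past_students))  # ['Data Visualisation', 'Machine Learning']
```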

 

Adaptive learning

Also increasingly set to have a big impact across education are adaptive learning systems, which tailor the material presented to students based on how they interact with it. Thus a student who is struggling with a particular topic can be directed to additional materials automatically, before moving on to the next topic.
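
In its simplest form, the adaptation described here is just a rule applied after each assessment; the sketch below is illustrative, with invented thresholds and content names:

```python
def next_activity(topic, quiz_score, pass_mark=0.6):
    """Route a struggling student to extra material before allowing progression."""
    if quiz_score < pass_mark:
        return f"Revision materials for '{topic}'"
    return f"Next topic after '{topic}'"

print(next_activity("Hypothesis testing", quiz_score=0.45))  # extra material first
print(next_activity("Hypothesis testing", quiz_score=0.80))  # straight on to the next topic
```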

Some of the experts I’ve interviewed see the future as being a much more personalised approach to education, where data on learners and their activity is constantly used to update what is presented to and expected of them. This doesn’t need to be a solitary learning experience, where the student simply interacts with an intelligent system. The algorithms can also help them to connect with others who are encountering similar issues, or who are prepared to offer their expertise.

 

Curriculum design

There is a final significant area in which data on learners’ activities is increasingly being used. This is in the design and enhancement of the learning and assessment content and activities that are provided to students. The effectiveness of different aspects of the curriculum can be analysed using data to enable “on the fly” enhancements or more significant alterations which can benefit future cohorts.

For instance, the data might demonstrate that a key piece of learning content is not being accessed by your students. You might then try to discover why not. Was it because the content was too difficult, not easy to find, sequenced at the wrong time, or you didn’t effectively communicate to the students the importance of accessing it?

There are also examples of institutions discovering from the analytics that a particular minority group is underperforming in an aspect of the curriculum. This can then lead the institution to attempt to identify whether this is due to linguistic or cultural issues, or perhaps a lack of prerequisite knowledge. Additional support can then be targeted at the group to bring their performance up to the standards of the rest of the cohort.

 

Conclusion

Anyone thinking about deploying learning analytics at their institution is likely to encounter ethical objections from some of their colleagues, and perhaps some of their students too. There is no doubt that there are many possibilities for the misuse of student data: I’ve identified 86 separate ethical, legal and logistical issues which occur in the growing number of articles and research papers written about learning analytics.

Predictions can of course be wrong, and students are complex individuals, not simply labels such as “at risk” or “not at risk”. Analytics can’t tell us whether they haven’t turned up to lectures because they’re struggling academically, because they find our lectures boring, or because they’re having to look after a sick relative.

I worked with the UK higher and further education communities and the National Union of Students to capture such issues in Jisc’s Code of Practice for Learning Analytics, which can help institutions to develop their analytics capabilities legally and ethically.

One of the key ethical issues that crops up is the “obligation of knowing”. As institutions assemble ever greater quantities of data, is there not a moral requirement on us to use the insight that can be provided from it to help our students? If the analytics suggest there is a strong likelihood that a student is at risk of dropping out, for example, shouldn’t we be trying to find out if we can help that student? Is it justified, as students incur increasing amounts of debt to fund their studies, for us to continue to provide learning content and activities without doing everything we can to assess whether they’re proving effective?

As industry and government increasingly use “big data” to analyse their customers’ and citizens’ behaviours to enhance the effectiveness of their operations, education is in danger of being left behind. Learning Analytics, carried out strategically across an institution, with attention to the ethical issues and the needs of users, promises to enable a much better understanding of a wide range of educational processes. This should lead to decision making based on evidence rather than intuition, with multiple benefits for institutions, and – most importantly – for our students.

 

About the author

Niall Sclater is a consultant and director of Sclater Digital, an educational technology consultancy. As Director of Learning and Teaching at the Open University, he led its institutional learning analytics project. More recently, he has written the book “Learning Analytics Explained”.


ARTICLE TITLE LOREM IPSUM DOLOR SIT AMET, ADIPISCING ELIT SED DO EIUSMOD TEMPOR.

Some people have asked if Learning Analytics breaches the privacy of the student. The simple answer is no, but let me explain why…

The internet can be a terrible echo chamber, especially on perceived contentious points, so I write this with some trepidation. However, I would like to state from the outset that my opinion has been formed by students and users of Learning Analytics solutions and not formed by those that sit on the outside looking in.

Scaremongering about a Big Brother state, poorly conceived initiatives and foolhardy experiments by corporations have made many in society sceptical about the use of data.

Our Learning Analytics tool StREAM only uses information and data already owned by the university. Using that existing data, whether historical or as it is created, the software applies algorithms to map what a successful student looks like – and what a student who may be at risk looks like.

 

Intelligent information

Using centralised data points to collect behavioural information (attendance, system logs, door access records, library use, VLE/LMS data), StREAM analyses the individual’s digital interactions with the university. It then performs the same analysis for students who have failed or left the university early.

StREAM then builds intelligence around its findings and calculates the likelihood of success and failure, modifying the student’s risk status with each interaction. By monitoring these patterns of behaviour and making existing data more readily accessible to the organisation, StREAM identifies the behaviours that may lead an individual to terminate their time at university much earlier than current methods can, giving institutions the time to react and help modify student behaviour.
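
As a hedged sketch of what updating a risk status with each interaction might look like in principle (not StREAM’s actual logic; the decay rate and threshold are assumptions):

```python
class RiskTracker:
    """Keep a rolling engagement level that rises with each interaction and decays daily."""

    def __init__(self, level=1.0, daily_decay=0.9, boost=0.2):
        self.level = level
        self.daily_decay = daily_decay
        self.boost = boost

    def record_interaction(self):
        self.level = min(1.0, self.level + self.boost)

    def end_of_day(self):
        self.level *= self.daily_decay

    @property
    def status(self):
        return "at risk" if self.level < 0.5 else "engaged"

tracker = RiskTracker()
for _ in range(7):            # a week with no recorded interactions
    tracker.end_of_day()
print(round(tracker.level, 2), tracker.status)   # decayed below the threshold -> "at risk"
tracker.record_interaction()
print(round(tracker.level, 2), tracker.status)   # one interaction lifts it back to "engaged"
```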

The information StREAM exposes is purely objective, not a subjective judgement based on a single person’s view of a few interactions. Inaccurate judgements about an individual, or demographically biased information such as gender, ethnicity or family income, become a thing of the past, and because rationale and forecasting are applied, the learning process is enhanced. Essentially, StREAM uses data to make positive, not negative, changes.

 

Opinions matter

From the student’s perspective, most people in the internet age take a different view of the value of their data and are prepared to share it more readily, often disclosing information in exchange for access to services or resources.

A recent survey posed the following question to a group of learners; “If there was a problem, would you want to know?” A staggering 93% said yes they would.

Many students simply don’t know what a good learner looks like and providing them with a tool that allows them to track their own daily progress is invaluable in making them independent self-determined learners. Through access to the StREAM app, students gain an understanding of their development and a demonstration of negative activity is often the motivator needed to trigger a positive change in their behaviour.

 

Enhanced delivery of education

From an internal personnel point of view, in many cases the technology chimes with tutor intuition and presents them with the opportunity to improve learning outcomes and student engagement. If a learner ceases to participate with the university, the tutor is notified and can then begin to foster a positive conversation, offering support, advice and guidance. Moreover, this objective ‘evidence’ means tutors can be very specific in their guidance on how a student could improve, making every tutor–student interaction more valuable – whether by saving the tutor time or by ensuring that time is spent on the conversation rather than on researching how the student is doing.

Ultimately, it’s not about trying to find out how long someone has spent in the Student Union bar; it’s a tool to recognise when a student needs help.

 

Support when it’s needed the most

If negative activity is exposed, it’s up to the university to decide if a support intervention needs to be made. And interventions aren’t designed to penalise; they’re designed to offer encouragement and reassurance – making the experience better, not punitive.

The beauty of StREAM is that it doesn’t just focus on the students who stand out at either end of the success spectrum. Instead, it seeks to help all, including those in the middle, who are often overlooked, achieve the best they can. For that reason, if you take into account the positive value StREAM can deliver, the privacy argument almost becomes obsolete.

 
