Grab bag of papers on Student Success Early Warning Alerts

I’m working on a research project with one of our academic Schools. We already have a process where we use data from our learning analytics resource to conduct mid-term reviews. The idea of the review is that we identify the students most at risk of dropping out and the School contacts them. We’ve always run this activity as a service, not as a research activity, and so we’ve hit a number of problems with it.

  1. We provide the data and share it with the School
    • We provide average engagement and percentage attendance as a quality assurance check (see the sketch below)
  2. The programme teams conduct a manual check and filter out some students
    • This is a particularly important step – they are concerned about contacting students who are already being supported, who may have mental health concerns, and so on
  3. The School office contacts the students

And in a busy School, they move on to the next task.
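
To picture step 1 concretely, here’s a minimal sketch of the kind of flagging that produces the shared list. The column names and cut-off values are illustrative assumptions, not our actual criteria:

```python
# Minimal sketch: flag students whose engagement and attendance both fall
# below illustrative cut-offs. Column names and thresholds are assumptions,
# not the real criteria used in the mid-term reviews.
import pandas as pd

ENGAGEMENT_CUTOFF = 2.0   # assumed mean engagement score
ATTENDANCE_CUTOFF = 50.0  # assumed percentage attendance

def flag_at_risk(cohort: pd.DataFrame) -> pd.DataFrame:
    """Return students below both cut-offs, ready for the programme
    team's manual review (step 2)."""
    below = (
        (cohort["avg_engagement"] < ENGAGEMENT_CUTOFF)
        & (cohort["pct_attendance"] < ATTENDANCE_CUTOFF)
    )
    return cohort[below].sort_values("avg_engagement")

# Toy example
cohort = pd.DataFrame({
    "student_id": ["s001", "s002", "s003"],
    "avg_engagement": [4.1, 1.2, 0.8],
    "pct_attendance": [92.0, 45.0, 30.0],
})
print(flag_at_risk(cohort))  # s003 and s002 are flagged
```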

So we’re never in a position to show the impact, and really we need to.

[Image: double yellow lines looking like an equals sign]

What are we doing differently this time?

This year we are working more closely with the School as part of our OfLA work. This is relatively unambitious, but we want to get a much clearer picture of the effect of communication, as universities must spend millions each year sending out emails to students they are concerned about.

We are seeking to understand which medium best prompts students to act. To begin with, we are starting with the mid-term reviews during the first term and simply testing whether a letter or an email is more likely to prompt students to get in touch with their tutor. Ideally, we’d like to conduct further A/B testing to see whether we can generate better responses to differently worded communications – do assertively worded communications work better than supportive, encouraging ones, for example?
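
To make the first comparison concrete, here’s a minimal sketch of how the letter-versus-email test might be analysed: a two-proportion z-test on whether students got in touch with their tutor. All the counts are invented for illustration, and the statsmodels call is one reasonable choice rather than our settled analysis plan:

```python
# Minimal sketch: compare response rates between the letter arm and the
# email arm with a two-proportion z-test. All counts are invented.
from statsmodels.stats.proportion import proportions_ztest

responded = [18, 11]    # students who got in touch, per arm (letter, email)
contacted = [120, 118]  # students contacted, per arm (letter, email)

z_stat, p_value = proportions_ztest(responded, contacted)
print(f"letter: {responded[0] / contacted[0]:.1%} responded")
print(f"email:  {responded[1] / contacted[1]:.1%} responded")
print(f"two-proportion z-test: z = {z_stat:.2f}, p = {p_value:.3f}")
```

With response rates this low, the number of students per arm matters a great deal; a power calculation before the first mailing would tell us whether a single term’s cohort is big enough to detect a plausible difference.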

So, as part of the process, this is just a quick grab bag of research on the efficacy of communicating with students to try to change their behaviour.

Early warnings in dashboards

Arnold & Pistilli (2012) – possibly the first paper to show the impact of early warnings in dashboards

Jayaprakash et al. (2014) – showed that early warnings within a dashboard changed students’ behaviours: module pass rates increased, but this appears to be partly because ‘weaker’ students selectively dropped out early (if the weakest students withdraw before assessment, the pass rate among those who remain rises even if nobody learns more)

Early warning interventions

Nelson, Quinn & Bennett (2012) – this is an example of a student-led call centre that contacted students using various data sources – just hidden away in the text is a 10% improvement in retention. My favourite example of “oh, this little thing” in a report

Oreopoulos & Petronijevic (2019) – provide a really good starting point on student early warning alerts. They carried out a stack of work, but even where there appears to be an effect, they argue that the impact is very small.

Wong (2017) – has a collection of case studies exploring the impact of learning analytics on interventions (see Table 1)

Sclater & Mullan (2017) – cite five case studies of the impact of learning analytics on success

Rienties et al. (2016) – provide three case studies of how staff use the UK Open University’s learning analytics resource. The OU have been doing this for a while and are always worth listening to.

Bettinger & Baker (2014) – (requires Shibboleth log-in) not quite evidence of early warning systems, but a sustained programme of telephone interventions. An interestingly different approach.

