Learning Cycles & Feedback Loops

One of my first jobs after graduating was training students in transferable skills. I read an enormous amount of training theory: textbooks on time management, systems thinking, NLP, public speaking etc. Certain keystone ideas stuck. One was, of course, David A. Kolb’s (1984) experiential learning cycle. I can remember explaining it by modelling the process of learning how to open one of the doors to the training room. I’m certain that I’ll have used the simple version: act – reflect – make rules – plan, rather than the fuller, more sophisticated version. I’ve used it a lot with students and there’s often an ‘of course!’ revelation.

[Image: screen grab of some of the range of learning analytics cycles and associated theories found online]

Learning cycles, feedback loops and systems thinking are all ideas that inherently feel right. I think it’s partly because there are some rock-solid examples of these systems working (from the very simple, e.g. the way a thermostat works, to the very complex, for example the balance of predators and prey in an ecosystem). There’s something very logical about them all. What learner wouldn’t want to know their goal and to know how far they’ve missed it?
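The thermostat example above can be sketched in a few lines of code. This is a minimal, illustrative sketch of a negative feedback loop (measure, compare against a goal, act on the gap); all names and the crude ‘room physics’ are my own assumptions, not from any real system.

```python
def thermostat_step(temperature, target, heater_on):
    """One pass around the loop: compare the reading to the goal, adjust."""
    if temperature < target - 0.5:      # too cold: switch heating on
        heater_on = True
    elif temperature > target + 0.5:    # too warm: switch heating off
        heater_on = False
    return heater_on                    # otherwise leave it as it is

def simulate(target=20.0, start=15.0, steps=30):
    """Run the loop repeatedly; the room drifts towards the target."""
    temperature, heater_on = start, False
    for _ in range(steps):
        heater_on = thermostat_step(temperature, target, heater_on)
        temperature += 0.8 if heater_on else -0.3   # crude room physics
    return temperature
```

The point of the sketch is the shape, not the physics: goal, measurement, comparison, action, and round again. It is exactly this shape that makes learning cycles feel so right.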

I’ve spent a lot of time thinking about learning cycles for a learning analytics paper we’re working on. I’ve looked at Kolb, Graham Gibbs’s cycle and, the daddy of them all, Argyris and Schön’s single- and double-loop learning, but I’ve mostly been thinking about Doug Clow’s (2012) learning analytics cycle. Learning analytics and learning cycles are natural bedfellows: learning analytics should provide data that can be measured at different points in time. In theory, we ought to be able to deliver and evaluate interventions using the technology.


Clow’s model feels right.* We start with learners; they create data through their interactions with the programme and institution. However, the data only has meaning when we attach metrics to it, analysing it against a model or through an algorithm. Finally we have the intervention, for example automated feedback, or contact from a tutor, learning developers or student support services. In theory, this leads to a change in learner behaviour, new data, new ratings in our analytics system and so on.
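The cycle described above can be sketched as plain data flow: learners → data → metrics → intervention → (changed) learners. The stage names follow the post; the `missed_classes` field, the threshold and the notion of being ‘contacted’ are invented here purely for illustration, not part of Clow’s model.

```python
def capture_data(learner):
    """Learners generate data through their interactions."""
    return {"missed_classes": learner["missed_classes"]}

def compute_metric(data):
    """Data only has meaning once a metric or model is applied to it."""
    return "at risk" if data["missed_classes"] > 3 else "on track"

def intervene(learner, metric):
    """e.g. automated feedback or tutor contact, intended to change behaviour."""
    if metric == "at risk":
        learner = {**learner, "contacted": True}
    return learner

# One turn of the cycle for a hypothetical learner.
learner = {"missed_classes": 5, "contacted": False}
metric = compute_metric(capture_data(learner))
learner = intervene(learner, metric)
# The cycle then repeats with the (hopefully changed) learner's new data.
```

Written out like this, the model looks reassuringly mechanical, which is rather the point of the rest of this post: the hard part is the step the last function waves away.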

Job done, we can all go home.


So of course, this is just a framework. All of the stages are complicated. Capturing data requires cables, servers, computer programmes and programmers (I probably should have put that the other way around; programmers are lovely, hard-working people). Metrics take time and need careful consideration. However, I personally think that Intervention > Learner is the tricky step. So if we assume that the agent of change is the institution, not the student, that step requires at least the following:

  1. Comprehensible early warnings through a dashboard or other early warning system (automated report, emailed alert, etc.)
  2. Agreed policy & sense of ownership about acting
  3. Agreed strategy dependent upon the nature of the early warning (e.g. missed classes probably requires a different approach compared to a failed assessment)
  4. Time and means to communicate to students
  5. Time and means to communicate to the students who didn’t respond to the first communication
  6. Sufficiently skilled and confident staff to engage students in meaningful dialogue, or some form of pro-forma script
  7. Space and time for any interventions (real or virtual) to take place
  8. System for goal-setting or making referrals to learning developers, student support services etc.

I like the idea of learning cycles, I do.

However, I also think that their inherent ‘rightness’ is a problem. How many senior managers have looked at models like this and thought “that’s it, that’s the answer, just get me one of those”? (Actually, probably none – this is dialogue from a really bad movie, but you get my point.)

There’s another issue: learning analytics looks technical. It’s all about computers, data, charts and stuff. It encourages us to think like engineers. That’s all very well, but learners are complicated, contradictory and often confused; we need to think like psychologists.



* To be absolutely fair to Doug, he’s never claimed to have the answers as to how learning analytics changes behaviour; he’s just offering a useful model to conceptualise the process.

2 thoughts on “Learning Cycles & Feedback Loops”

  1. Reblogged this on Becoming An Educationalist and commented:
    #Becomingeducational A very very belated Happy New Year!!

    But it’s been worth the wait – for we have brought you this brilliant blogpost by Ed Foster.

    Ed relates the learning cycle to the points of student disengagement or struggle that should trigger our interventions…


  2. Pingback: A modified learning analytics cycle (I really need to write more exciting titles) – Living Learning Analytics Blog
