
Learning Analytics meets sports

Starting this season, physical performance data of German Premier League footballers is being collected and published.

Video sensors track players from both teams, allowing a detailed analysis of their physical movements. Thirty-five times per second (the video frame rate), the coordinates of each player are stored. Among the things that can be analysed are spatial and movement profiles, sprints, speed, and heat maps. It seems that coaches and fans no longer want to trust their own judgment alone, but prefer to see it in figures and stats. Football already employed quite detailed number crunching at team level, such as fouls committed, ball possession, time in the opponent’s half, shots on target, etc. This takes analytics to a personal level where, I guess, it is hoped to help players learn from their behaviour on the pitch.
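To illustrate the kind of processing involved, here is a minimal sketch of how distance covered and sprints could be derived from 35 Hz coordinate samples. The function name, the assumption that coordinates are in metres, and the 7 m/s sprint threshold are all invented for illustration:

```python
# Hypothetical sketch: deriving distance and sprints from per-frame
# player coordinates. Units (metres) and the sprint threshold are
# assumptions, not the tracking company's actual method.
import math

FRAME_RATE = 35  # coordinate samples per second (the video frame rate)

def analyse_track(positions, sprint_speed=7.0):
    """positions: list of (x, y) tuples, one per video frame.
    Returns (total distance in metres, seconds spent sprinting)."""
    total_distance = 0.0
    sprint_frames = 0
    for (x1, y1), (x2, y2) in zip(positions, positions[1:]):
        step = math.hypot(x2 - x1, y2 - y1)  # metres moved in one frame
        speed = step * FRAME_RATE            # metres per second
        total_distance += step
        if speed >= sprint_speed:
            sprint_frames += 1
    return total_distance, sprint_frames / FRAME_RATE

# A player moving 0.25 m per frame (8.75 m/s) for one second:
distance, sprint_secs = analyse_track([(i * 0.25, 0.0) for i in range(36)])
```

Heat maps would follow the same pattern, binning the raw coordinates into a grid over the pitch rather than differencing them.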

tracker graph
Criticism has come from some clubs that the tracking company does not restrict this information to club managers and coaches but also sells it to the media and to the general public. It is bemoaned that such analyses may give the wrong impression of being able to single out individual players and make them scientifically responsible for a lost game. It is clear, though, that the analysis cannot come to contextual or qualitative conclusions about why a player performed the way he did. The danger pointed out is that figures, like mileage run, may be taken at face value by the public. Publishing performance data of players may even – it is feared – affect the market value of some players or clubs.

What this will do to the sport is as yet unknown, but let’s hope it remains fun to watch!

Notes to Learning Analytics

The recent two-day seminar on Learning Analytics, organised by the Dutch SURF academy, brought together interested parties from different education institutions and vendors. While stimulating in its presentation, the seminar mainly offered technical showcases. What somehow got left behind were relevant pedagogic showcases and a feeling for how receptive the teaching practitioner community is to this kind of innovation. Are we running into the old pattern of being technology-driven again?

Some interesting showpieces included tools to elicit what I would call educational business analytics (as opposed to learning analytics). To some extent these were not really new, as business reporting systems on student grades, drop-out figures, and the like have existed for many years, albeit mainly available to university registrars. It is not yet clear what these figures do to teaching and learning when presented to teaching staff instead of administrators, but this would be a novel approach.

Here are some notes that came to my mind while listening to the presentations:

  • LA tools are a bit like a cooking thermometer or oven thermostat. They don’t give you an indication of what meal a person is preparing or whether it will taste good, but they may be a vital (on-demand) instrument for determining the right action at the right time to get it done.

  • How do we avoid teachers being turned into controllers, sitting like Homer Simpson in front of a dashboard and control panel, looking at visualisations of their students’ datasets? Does an increase in such activities reduce their contact time with students?
  • One common assumption I noted is the belief that all students are ambitious and aim only for top grades and the best learning experience. Being a father and having seen a few student generations, I contest this assumption. Many, if not most, students just want to pass; studying isn’t in fact what they perceive as the prime focus of their lives. The tactical calculations students are used to making (how often can I still be absent; what’s the minimum mark I need to pass, etc.) may be ‘prehistoric’ forms of Learning Analytics that have existed for as long as assessments have been around!
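That ‘prehistoric’ minimum-mark calculation is itself a tiny analytic. A sketch, with an invented weighted grading scheme and a Dutch-style pass mark of 5.5 purely for illustration:

```python
# Hypothetical sketch of the student's own "minimum mark to pass" analytic.
# The grading scheme, weights, and pass mark are invented for illustration.
def minimum_final_mark(current_marks, weights, final_weight, pass_mark=5.5):
    """Return the mark needed on the final exam to pass overall.
    current_marks/weights: marks already earned and their course weights.
    final_weight: weight of the remaining final exam."""
    earned = sum(m * w for m, w in zip(current_marks, weights))
    return (pass_mark - earned) / final_weight

# Two assignments at 25% each, marked 6.0 and 5.0; the final exam counts 50%.
needed = minimum_final_mark([6.0, 5.0], [0.25, 0.25], 0.5)
```

Students run this kind of calculation in their heads all the time, long before anyone called it analytics.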

Pedagogy and the Learning Analytics model

I received valuable feedback on the proposed design framework for Learning Analytics. A key question people asked was where pedagogy was in the model. Here is how I see it:

LA pedagogy model

Pedagogic strategies and learning activities as such are not part of the analytics process but are implicitly contained in the input datasets that encapsulate the pedagogic behaviour of users. As we know, this behaviour depends a great deal on the platform and the pedagogic vision the developers built in (cf. Dron & Anderson, 2011). For example, data from a content-sharing platform will have a behaviourist/cognitivist pedagogy attached to the learner behaviour, since this is the pedagogic model underlying the technology. In any case, only the pedagogic patterns exhibited in the dataset can be analysed, and these will vary.

Additionally, pedagogy can be explicitly addressed in the goals and objectives that the LA designer sets. The LA method will determine the outcome of the analysis and together with the interpretation applied may lead to a large variety of options for consequences and interventions. If such pedagogic interventions are applied they lead to new behaviours which, once again, can be analysed through the available data.

A simple analogy would be boiling water in a pan. At any time (or continuously) you can stick a thermometer in and measure its temperature. The goal would be to determine whether you need to turn up the heat or not. The result of the analysis can then lead to the actions you want to take. The thermometer is only one method for such an analysis. An alternative would be to observe and wait until the water bubbles. Setting a threshold expectation (in the goals design) can inform you when it is time for the teabag to go in.

The model takes note that pedagogic success and performance are not the only things that Learning Analytics can measure. Learning Analytics are snapshots taken from educational datasets. These snapshots can be used to reflect or predict, in order to make adjustments and interventions (either by a human or by a system). Through connecting the cornerstones of the design model in different ways, different use cases can be constructed.

A key element of the Learning Analytics process that is not explicitly present in the model is that the outcome of any analysis needs to lead to a decision process which determines the consequences. Whether these are pedagogic or not depends very much on the goals specified. Decision making can be stimulated and executed through the method applied and the algorithms chosen, for example in recommender systems. But decisions can also be taken by a human (e.g. a teacher or self-directed learner). In any case they lead to consequences, and through a feedback loop the process can be made iterative.
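In its simplest algorithmic form, such a decision step is just a rule mapping analysed indicators onto an intervention. A minimal sketch, where the indicator names, thresholds, and interventions are all invented for illustration:

```python
# Hypothetical sketch of the analysis -> decision -> consequence step.
# Indicator names, thresholds, and interventions are assumptions.
def decide(indicators):
    """Map analysed indicators to a suggested intervention."""
    if indicators["logins_last_week"] == 0 and indicators["forum_posts"] == 0:
        return "contact student (possible drop-out risk)"
    if indicators["quiz_average"] < 0.5:
        return "recommend revision material"
    return "no intervention"

action = decide({"logins_last_week": 2, "forum_posts": 3, "quiz_average": 0.4})
```

Applying the intervention changes the learner’s subsequent behaviour, which feeds new data back into the analysis – the iterative loop described above.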

Decisions based on Learning Analytics are a critical issue, because they determine the usefulness and consequences for the stakeholders. It is here that ethics play an enormously important role. Imagine an educational dataset showing that children of immigrants perform worse in reading tasks. Several options present themselves, and in all likelihood will be exploited by political parties or others: (1) more support could be offered to immigrant children; (2) schools could be segregated into immigrant and non-immigrant; (3) right-wing politicians will not hesitate to point to the deteriorating quality of schools due to immigration. Hence, data analysis could have dramatic (and unwanted) consequences. We need to be aware of this danger!

Learning Analytics framework

Slowly, a common understanding of Learning Analytics is evolving. George Siemens’ definition is the following:

Learning analytics is the use of intelligent data, learner-produced data, and analysis models to discover information and social connections, and to predict and advise on learning.

My view is generally in accordance with this. To me, Learning Analytics is a way to take advantage of available educational datasets for the discovery of new insights into educational practice and the development of new educational services that support the goals and objectives of learners, teachers, and institutions. Unlike other people, I feel that Learning Analytics serves reflection as much as prediction.

The careful design of Learning Analytics approaches needs to take a number of different perspectives into account, so I started to draw up a first framework diagram covering these. This is only a first draft, but I hope to develop it further as our knowledge and experience increase. Feel free to elaborate on it further; I am, of course, happy to receive feedback on this.


Critical soft issues that I perceive in the exploitation of educational datasets for Learning Analytics are the competences needed to interpret, critically evaluate, and derive conclusions (and pedagogic actions) from the analysis. A holistic perspective is essential, because what’s left out of the data coverage is as important as what’s in it.

Privacy issues aside, which are a legal constraint, I see a number of ethical issues connected with accessing learner data. There’s the danger that some teachers might abuse Learning Analytics as a means of policing and surveillance rather than as a support tool. The same could perhaps be said when it comes to institutions gaining insights into teacher performance. The danger is that innovative and creative teaching might be ousted because it does not show up as falling into line with ‘traditional’ algorithmic performance measures. As such, the data might easily be abused to exert pressure upon the data constituency.