#H809 Learning Analytics: The Arnold and Pistilli (2012) paper

In a paper for the Learning Analytics and Knowledge Conference of 2012, Arnold and Pistilli explore the value of learning analytics in Course Signals, a pioneering learning analytics programme at Purdue University. The researchers used three years of data from a variety of modules. For some modules, learning analytics was used to identify students ‘at risk of failing’, based on a proprietary algorithm that took into account course-related factors such as login data, as well as prior study results and demographic factors. Students ‘at risk’ were confronted with a yellow or red traffic light on their LMS dashboard. Based on this information, tutors could decide to contact the student by e-mail or phone. The researchers compared retention rates for cohorts of students who entered university from 2007 to 2009, and complemented this analysis with feedback from students and instructors.
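Since the actual Course Signals algorithm is proprietary, the mechanism can only be illustrated with a hypothetical sketch. The factors, thresholds, and weights below are my own assumptions, chosen to match the kinds of inputs the paper mentions (login data, prior study results, coursework), not the real algorithm:

```python
# Hypothetical sketch of a Course Signals-style risk indicator.
# The real Purdue algorithm is a proprietary black box; every factor,
# weight, and threshold here is an illustrative assumption.

def risk_light(lms_logins_per_week, prior_gpa, assignments_submitted_ratio):
    """Map a few assumed risk factors to a traffic-light signal."""
    score = 0.0
    if lms_logins_per_week < 2:
        score += 0.4   # low engagement with the LMS
    if prior_gpa < 2.0:
        score += 0.4   # weak prior study results
    if assignments_submitted_ratio < 0.5:
        score += 0.2   # missing coursework
    if score >= 0.6:
        return "red"     # high risk: tutor may contact the student
    if score >= 0.3:
        return "yellow"  # moderate risk
    return "green"

print(risk_light(1, 1.8, 0.9))  # prints "red"
print(risk_light(5, 3.5, 1.0))  # prints "green"
```

The point of the sketch is simply that a small set of weighted rules can already produce the dashboard behaviour the paper describes, which makes the opacity of the real weights all the more relevant to the critique below.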

Modules that used CS showed increased retention rates, which the authors attribute to CS. These courses also showed lower than average test results, possibly a consequence of the higher retention. Student feedback indicated that 58% of students wanted to use CS in every course, which is not an entirely convincing number.

The research paper raises the following issues and questions:

  • The correlation doesn’t necessarily point to a causal link (although the relation seems quite intuitive)
  • It’s unclear how courses were selected to be used with CS or not. Possibility of bias?
  • The qualitative side of the research seems neglected. Interesting findings, such as the large group of students who are apparently not eager to use CS in every course, are not explored further.
  • The underlying algorithm is proprietary and thus a black box for outsiders, which severely limits its applicability and relevance for others.
  • It’s unclear what exactly the use of CS is compared with. If students in non-CS modules get little personal learner support, CS may look like a real improvement.
  • The previous point relates to the need for a clear articulation of the objective(s) of CS, or of learning analytics in general. Including an analysis of tutor time saved, or of money saved through improved retention, would have given a more honest and complete overview of the benefits that are likely perceived as important, instead of a rather naive focus on retention rates alone.

Ethical issues

  • It’s unclear if and how informed consent of students is obtained. Is it part of the ‘small print’ that comes with enrolment?
  • How about false positives and negatives? Some students may get a continuous red light or face a bombardment of e-mails if they belong to a demographic or socio-economic group ‘at risk’. Others may complain when they don’t receive any warnings despite struggling to stay in the course.
  • The authors have been closely involved in the development of the learning analytics programme at Purdue University.  This raises questions about objectivity and underlying motives of the paper.


Arnold, K.E. and Pistilli, M.D. (2012) ‘Course signals at Purdue: using learning analytics to increase student success’, in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK ’12), New York, NY, USA, ACM, pp. 267–270 [online]. Available from: http://doi.acm.org/10.1145/2330601.2330666.
