Thursday, July 3, 2014

Learning Analytics Summer Institute (LASI)

This week I attended the Learning Analytics Summer Institute (LASI), a hybrid event that involved three half-day streaming sessions from Harvard and two half-days of face-to-face sessions in Madison.  There are local events all over the globe that center around the Harvard streams.  If you are completely unfamiliar with Learning Analytics, this information from Educause is a good foundation.  It's been very exciting to learn about Learning Analytics (LA) on the big scale from Harvard and the smaller scale from the UWs - mostly Madison.  Below are a few major general points. I'll also write a post on ethics and one on two specific tools used in the UW system, MAP-Works and the D2L Student Success System (S3).

Definitions 

There are a lot of terms used in this area and I realized that I wasn't 100% sure what they all meant, so here's some info for those of you who may be in the same boat:

Analytics: "the use of data, statistical analysis, and explanatory and predictive models to gain insights and act on complex issues." (Educause, 2012, p. 1).

Learning Analytics (LA): "a genre of analytics that entails the collection and analysis of data about learners" (Educause, 2012, p. 1). LA can consist solely of data generated by learners as they work on a course, or it can be supplemented by information about the learner like demographics, previous course work, high school information, standardized test scores, self-report data, etc.

Predictive Learning Analytics: obviously, LA that helps predict things like student success/risk, course recommendations, paths through courses, etc.

Educational Data Mining (EDM): "EDM develops methods and applies techniques from statistics, machine learning, and data mining to analyze data collected during teaching and learning. EDM tests learning theories and informs educational practice" (US Dept of Ed, 2012, p. 9).

EDM vs LA: "Learning analytics draws on a broader array of academic disciplines than educational data mining, incorporating concepts and techniques from information science and sociology, in addition to computer science, statistics, psychology, and the learning sciences. Unlike educational data mining, learning analytics generally does not emphasize reducing learning into components but instead seeks to understand entire systems and to support human decision making" (US Dept of Ed, 2012).

Basically, my interpretation is that EDM is more granular and LA is bigger picture. 

Big Data: Well, obviously big data refers to lots and lots of data - the type of data that Amazon has, for instance. What we are working with in D2L is "little data."

Open Learning Analytics: Similar to open source software, open LA provides access to source code, algorithms, and whatever other back end info is there (I have no idea).  The opposite would be proprietary or commercial systems, like D2L's or MAP-Works.  The presenters on the D2L tool said that they don't know exactly how the D2L tool comes to the conclusions it does - they set up a model, it's fed data, and it spits out judgments. Not open.

What Learning Analytics is not: One or a few "surfacey" measures, like test statistics or number of logins to the LMS.  Although that information can be useful, it's not big picture enough to really be considered LA: it isn't predicting anything against a baseline, and it isn't combining data sources to look at a bigger picture.
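For contrast, here's a minimal sketch of the "combining data" idea. Everything in it - the column names, weights, and data - is invented for illustration and doesn't come from any real LA system like D2L's or MAP-Works:

```python
# Hypothetical example: a single "surfacey" measure (login count) vs. a
# score that combines several data sources, which is closer to what
# predictive LA does. All names, weights, and data are made up.

def risk_score(logins_per_week, quiz_avg, prior_gpa):
    """Combine LMS activity with background data into a 0-1 risk score.
    Higher = more at risk. Weights are invented, not from a real model."""
    activity = min(logins_per_week / 10, 1.0)   # normalize to 0-1
    performance = quiz_avg / 100                # percent to 0-1
    background = prior_gpa / 4.0                # 4.0 scale to 0-1
    # Low activity, low performance, and low prior GPA all raise risk.
    return round(1 - (0.4 * activity + 0.4 * performance + 0.2 * background), 2)

# A low-activity, low-performing student vs. an engaged, strong one.
print(risk_score(logins_per_week=2, quiz_avg=55, prior_gpa=2.1))
print(risk_score(logins_per_week=9, quiz_avg=92, prior_gpa=3.8))
```

The point isn't the formula (a real system would fit a statistical model to historical outcomes); it's that the score draws on activity, performance, and background data together instead of one surface measure.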

Learning analytics is a legitimate, emerging field on its own

Maybe that's an odd thing to say, but I didn't realize that LA was so big. I guess I thought it was a part of educational technology, but it's more aligned with computer science and data analytics.  Educational technologists would probably be the intermediaries who set up and support a system and get its information to faculty, who provide the interventions based on the data's conclusions.  Either a proprietary/commercial LA system or an LA professional (ideally, both) would be needed in addition to an educational technologist.

LASI was a big conference for the field of LA.  The Harvard sessions focused on the field in general, other professional development opportunities, and journals/publications.  A big part of it seemed to be building a community of LA professionals. I did not feel like the intended audience for the Harvard sessions (I got kind of an awareness-level absorption - "oh, that's a thing?") but I learned a lot from the Madison sessions and it was interesting to get a peek at the more hardcore aspects of the field.

Learning Analytics Professionals

There was an entire session on LA professionals, many of whom work in private industry (rather than academia) making excellent salaries.  The owner of Instructure (Canvas) described the main skills he looks for in a learning analyst:
  • Project management
  • Agile/rapid prototyping (create something quickly to start playing with it - also a desired skill of an instructional designer)
  • Communication 
  • Education theory (varying degrees of specialization depending on the specific application)
  • Statistics
  • Data retrieval (SQL, CSV, JSON)
  • Rudimentary/functional scripting
  • Visualization ("Make it pretty")
  • Data storage & management
  • Knowledge management ("How did we do that thing we did?") 
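As a toy illustration of the "data retrieval" bullet - CSV in, a SQL query, JSON out - here's a self-contained sketch; the table, column names, and thresholds are all made up for the example:

```python
# Toy data-retrieval pipeline: parse CSV, query it with SQL (in-memory
# SQLite), and emit JSON. All data and names are invented.
import csv, io, json, sqlite3

csv_text = """student_id,logins,quiz_avg
s1,2,55
s2,9,92
s3,4,71
"""

rows = list(csv.DictReader(io.StringIO(csv_text)))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE activity (student_id TEXT, logins INT, quiz_avg REAL)")
conn.executemany(
    "INSERT INTO activity VALUES (?, ?, ?)",
    [(r["student_id"], int(r["logins"]), float(r["quiz_avg"])) for r in rows],
)

# SQL: flag students with low activity AND low quiz scores.
flagged = conn.execute(
    "SELECT student_id FROM activity WHERE logins < 5 AND quiz_avg < 60"
).fetchall()

print(json.dumps({"flagged": [sid for (sid,) in flagged]}))
```

Nothing fancy, but it touches all three formats the speaker named, which seems to be the level of "rudimentary/functional scripting" he was describing.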
There was another Harvard session in which 30 doctoral students in LA introduced themselves and some shared their research, very briefly.  I think it's interesting that there are at least 30 doctoral students doing research in this field.

Emerging Field

So the emerging part is important. There's not a lot of info or structure to data governance, for instance.  There is a LA pilot happening in the UW-System with D2L that has been pretty rocky.  A master's program specifically in LA is in the works (Penn State, I think?).  A journal just started this year.

Kimberly Arnold, UW-System LA guru, shared the Gartner Hype Cycle, which is just fascinating. Here's the one from 2013:

[Image: Gartner Hype Cycle, 2013]

Big data is at the peak of inflated expectations, while predictive analytics is expected to reach its plateau of productivity in less than two years.  Wikipedia tells me that the plateau of productivity means that "mainstream adoption starts to take off." Starts to take off. I'd agree that it's starting to take off.

Then she shared the diffusion of innovation figure. You've probably seen this before:

[Image: diffusion of innovations curve]

She said that people "in the know" about LA believe that LA is in the innovators to early-early majority area, which might be a stretch.  It's new - we're just figuring this out.  So the thing to think about here is what kind of school are you in - an innovator, early adopter, early majority, late majority, or laggard?  Is the culture welcoming of bleeding-edge technologies like this?  Probably more importantly, are there people who will do something with the data that's collected?  Do you want to spend time figuring things out, or latch on once the bugs have been worked out?  Oh, there are some bugs... Next up I'll share what I learned about the D2L Student Success Pilot.  To be continued!

References: 

Educause. (2012). Learning Analytics: A Report on the ELI Focus Session. http://www.educause.edu/library/resources/learning-analytics-report-eli-focus-session

US Department of Education. (2012). Enhancing Teaching and Learning Through Educational Data Mining and Learning Analytics. https://www.ed.gov/edblogs/technology/files/2012/03/edm-la-brief.pdf