Grand Challenges from Learning Analytics and Knowledge ‘16
(c) 2016 Jeremy Roschelle

It’s not that often that presidents of 5 different research societies come together to reflect at the close of a conference. But that’s exactly what happened at the close of Learning Analytics and Knowledge ’16 in Edinburgh, Scotland. Leaders of the International Society of the Learning Sciences, AI & Ed, SOLAR, EU-TEL, and Educational Data Mining participated in a panel, each with comments on the importance of learning analytics to the future of their field.

In a previous post, I reflected on keynotes at the Learning @ Scale conference (which occurred just before LAK '16). The three further keynote talks at LAK in Edinburgh also offered rich opportunities to advance the field. Although each talk was richly multi-dimensional, I take this opportunity to focus on just a single aspect: the grand challenge in each of the three keynotes, explored below in three questions:

1) How can we ensure that learning environments are effective, efficient and enjoyable for students? 

Paul Kirschner contrasted utopian and dystopian possibilities of large-scale digital learning environments (see Paul's slides). Many presentations at LAK share methodological and analytical techniques. Paul observed that the novel techniques appearing in the papers, posters, demos and conversations at LAK could support both utopian and dystopian futures. Paul also noted that it's easier to get only one of the desirable qualities (effectiveness, efficiency, or enjoyableness) from a learning design than to get all three at once, and thus his challenge sets a high bar.

2) How can measurements of students at the heart of learning analytics be reliable, valid, generalizable, comparable, and fair?

Bob Mislevy shared parallels from the history of psychometrics, which he characterized as a century-old "big data" movement. Today's digital learning environments collect types of data that go beyond standardized testing data, and can capture orders of magnitude more of it. But at their essence, both traditional testing and learning analytics seek to tackle challenges of measuring students' behavior, traits, skills and knowledge. As those measurements come to have consequences for students, they must be defended. Hence, learning analytics cannot narrowly be an enterprise of seeking useful correlations between system traces and student outcomes; it faces the grand challenge of making valid inferences.

3) How can students access, challenge, and control their own learning data?

Mireille Hildebrandt is a legal scholar and her keynote raised issues of data privacy as well as emerging progress towards legal principles for addressing the issues. At the heart of much of her discussion was the tension between individual and institutional rights with regard to the data generated as learners interact in digital educational environments. The current tendency is for institutions to have most of the rights, but this may result in a balance of power that harms individuals. Although legal issues were not at the heart of the scientific work at the conference, important papers were presented with regard to ethical standards for research — and awareness of these issues is important to avoid some possible dystopian scenarios. For example, see the DELICATE framework proposed by the LACE project.

The sold-out attendance at LAK '16 and standing-room-only discussions of the keynote talks indicate broad community interest in these topics. The coming together of leadership from multiple societies shows the broad enthusiasm of scientists for the future of learning analytics, but also the desire to work together to take on challenges that are bigger than any individual, project, or society can tackle alone.

Nice memo. But who will develop a framework for collecting education analytics for research data comparison, measuring best digitalisation practices, and international collaboration?

Thanks for sharing these thoughts, Jeremy. It will definitely take close collaboration between scientific, educator and tech communities to deploy solutions that reach all 3 goals of accuracy of measurement, authenticity of learning and cost-effective scalability. It's so encouraging that this portion of the dialog applies to Higher Ed as well as K-12, as it so seldom does well.

Thank you for sharing this.
