Presented by the Assessment Research Centre
and the Melbourne Centre for the Study of Higher Education
Teaching, Assessment and Learning Analytics: Time to Question Assumptions
Simon Buckingham Shum
Professor of Learning Informatics, and Director of the Connected Intelligence Centre (CIC)
University of Technology Sydney
When: 11.30-12.30 pm, Wed. 13 Sep 2017
Where: Frank Tate Room, Level 9, 100 Leicester St, Carlton
This will be a non-technical talk accessible to a broad range of educational practitioners and researchers, designed to provoke a conversation that provides time to question assumptions. The field of Learning Analytics sits at the convergence of two fields: Learning (including learning technology, educational research and learning/assessment sciences) and Analytics (statistics; visualisation; computer science; data science; AI). Many would add Human-Computer Interaction (e.g. participatory design; user experience; usability evaluation) as a differentiator from related fields such as Educational Data Mining, since the Learning Analytics community attracts many with a concern for the sociotechnical implications of designing and embedding analytics in educational organisations.
Learning Analytics is viewed by many educators with the same suspicion they reserve for AI or “learning management systems”. While in some cases this suspicion is justified, I will question other assumptions using some learning analytics examples that can serve as objects for us to think with. I am curious to know what connections and questions arise when these are shared.
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he was appointed in August 2014 to direct the new Connected Intelligence Centre. Previously he was Professor of Learning Informatics and an Associate Director at The UK Open University’s Knowledge Media Institute. He is active in the field of Learning Analytics as a co-founder and former Vice President of the Society for Learning Analytics Research, and Program Co-Chair of LAK18, the International Learning Analytics and Knowledge Conference. Previously he co-founded the Compendium Institute and Learning Emergence networks. Simon brings a Human-Centred Informatics (HCI) approach to his work, with a background in Psychology (BSc, York), Ergonomics (MSc, London) and HCI Design Argumentation (PhD, York). He co-edited Visualizing Argumentation (2003) followed by Knowledge Cartography (2008, 2nd Edn. 2014), and with Al Selvin, wrote Constructing Knowledge Art (2015). He was recently appointed as a Fellow of The RSA. http://Simon.BuckinghamShum.net
2. Simon Buckingham Shum
Connected Intelligence Centre • University of Technology Sydney
@sbuckshum • http://utscic.edu.au • http://Simon.BuckinghamShum.net
Teaching, Assessment and Learning Analytics:
Time to Question Assumptions
University of Melbourne • Assessment Research Centre & Centre for the Study of Higher Education, 13th Sept. 2017
9. Learning Analytics: a form of computational social science
[Venn diagram: Computing/Data Sciences • Education & Learning Sciences • Human-Computer Interaction]
10. What do we mean by “Learning Analytics”?
“the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs”
Society for Learning Analytics Research
Er, isn’t this what educational researchers have always done?
11. What do we mean by “Learning Analytics”?
“the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data”
UK Joint Information Systems Committee
For details and more definitions see http://www.laceproject.eu/faqs/learning-analytics
Er, so Learning Analytics = statistical power tools for educational researchers/learning scientists, or institutional analysts?
12. What do we mean by “Learning Analytics”?
“The potential of learning analytics is arguably far more significant than as an enabler of data-intensive educational research, exciting as this is. The new possibility is that educators and learners — the stakeholders who constitute the learning system studied for so long by researchers — are for the first time able to see their own processes and progress rendered in ways that until now were the preserve of researchers outside the system.” (p.17)
Knight, S. & Buckingham Shum, S. (2017). Theory and Learning Analytics. Handbook of Learning Analytics (Chapter 1). Society for Learning Analytics Research. https://solaresearch.org/hla-17
14. Learning Analytics are here, but can promote different educational futures: let’s invent good ones
[Diagram: learning analytics occupy the middle space between epistemology, pedagogy and assessment]
Knight, S., Buckingham Shum, S. and Littleton, K. (2014). Epistemology, Assessment, Pedagogy: Where Learning Meets Analytics in the Middle Space. Journal of Learning Analytics, 1(2), pp. 23-47. http://epress.lib.uts.edu.au/journals/index.php/JLA/article/download/3538/4156
What kinds of learner activity do the analytics value by tracking? (so what is not valued?)
Do the analytics value the same things as the assessment regime? (if not, why will educators or learners care?)
Do learners see the analytics? What does this say about the pedagogy? (is it desirable that they change their process mid-course in response to their own and others’ feedback?)
15. Concerns around Learning Analytics
Schools and universities are outsourcing their core business — e.g. resource selection, instructional design, feedback and grading — to black box algorithms.
16. A student warning system, somewhere near you…
Student: “Being told by the LMS every time I login that I’m at grave risk of failing is stressful. I already informed Student Services about my disability and recent bereavement, and I’m working with my tutor to catch up…”
17. A student warning system, somewhere near you…
University: “Don’t worry, it’s nothing personal. It’s just the algorithm.”
Student: “Being told by the LMS every time I login that I’m at grave risk of failing is stressful. I already informed Student Services about my disability and recent bereavement, and I’m working with my tutor to catch up…”
18. A social network visualisation, somewhere near you…
Student: “Being picked out like this as some sort of loner makes me feel uncomfortable… I chat with peers all the time in the cafes.”
19. A social network visualisation, somewhere near you…
Student: “Being picked out like this as some sort of loner makes me feel uncomfortable… I chat with peers all the time in the cafes.”
University: “Don’t worry, it’s nothing personal. It’s just the algorithm.”
20. Algorithms are generating huge interest in the media, policy, social justice, and academia
http://governingalgorithms.org
http://datasociety.net
21. Algorithmic accountability in learning?
http://simon.buckinghamshum.net/2016/03/algorithmic-accountability-for-learning-analytics
22. Concerns around Learning Analytics
LMS data often has little to do with learning — why all the attention on those dashboards when our most skilled educators are typically using other platforms creatively and to great effect?
23. We can now aggregate activity from diverse platforms (with student consent)
24. Connected Learning Analytics Toolkit
Kitto, K., Bakharia, A., Lupton, M., Mallet, D., Banks, J., Bruza, P., Pardo, A., Buckingham Shum, S., Dawson, S., Gašević, D., Siemens, G., & Lynch, G. (2016). The connected learning analytics toolkit. Proc. 6th Int. Conf. Learning Analytics & Knowledge (LAK '16). ACM, New York, NY, USA, 548-549. https://doi.org/10.1145/2883851.2883881
[Screenshots: student dashboard and educator dashboard]
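To make the aggregation step concrete, here is a minimal sketch of the general pattern: activity on an external platform is normalised into an xAPI statement and stored in a Learning Record Store. This is illustrative only, not the CLA Toolkit's code; the LRS endpoint, credentials and verb choice are placeholders.

# Minimal sketch of cross-platform activity aggregation (illustrative only,
# not the CLA Toolkit's actual code): a raw platform event is normalised
# into an xAPI statement and stored in a Learning Record Store (LRS).
# The endpoint, credentials and verb choice below are placeholders.
import requests

LRS_ENDPOINT = "https://lrs.example.edu/xapi/statements"   # hypothetical LRS
LRS_AUTH = ("client_key", "client_secret")                  # hypothetical credentials

def to_xapi_statement(student_email, verb_id, verb_name, object_id, object_name):
    """Map a raw platform event onto the xAPI actor/verb/object structure."""
    return {
        "actor": {"mbox": f"mailto:{student_email}", "objectType": "Agent"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "id": object_id,
            "definition": {"name": {"en-US": object_name}},
            "objectType": "Activity",
        },
    }

def store_statement(statement):
    """POST one statement to the LRS; xAPI requires the version header."""
    resp = requests.post(
        LRS_ENDPOINT,
        json=statement,
        auth=LRS_AUTH,
        headers={"X-Experience-API-Version": "1.0.3"},
    )
    resp.raise_for_status()
    return resp.json()

# Example: a student commenting in a discussion forum outside the LMS.
stmt = to_xapi_statement(
    "student@example.edu",
    "http://adlnet.gov/expapi/verbs/commented", "commented",
    "https://forum.example.edu/thread/42#post-7", "Week 3 discussion post",
)
# store_statement(stmt)  # uncomment when pointed at a real LRS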
25. CLA Toolkit: Groupwork dashboard
We could use this to grade summatively, or as formative feedback to provoke reflection.
Kitto, K. et al. (2017). Designing for student-facing learning analytics. Australasian Journal of Educational Technology, 33(5).
26. Concerns around Learning Analytics
There’s far more to learning than having students tethered to screens.
28. Posture analysis of fieldwork students
Okada, M. and Tada, M. (2014). Formative assessment method of real-world learning by integrating heterogeneous elements of behavior, knowledge, and the environment. Proc. 4th Int. Conf. on Learning Analytics and Knowledge. ACM, New York, NY, USA, 1-10. http://dx.doi.org/10.1145/2567574.2567579
29. Sensing co-located teamwork
Martinez-Maldonado, R., Power, T., Hayes, C., Abdipranoto, A., Vo, T., Axisa, C., and Buckingham Shum, S. (2017). Analytics Meet Patient Manikins: Challenges in an Authentic Small-Group Healthcare Simulation Classroom. International Conference on Learning Analytics and Knowledge, LAK 2017, 90-94.
30. Tracking nurses’ movement around the patient
Martinez-Maldonado, R., Pechenizkiy, M., Power, T., Buckingham Shum, S., Hayes, C. and Axisa, C. (2017). Modelling Embodied Mobility Teamwork Strategies in a Simulation-Based Healthcare Classroom. International Conference on User Modelling, Adaptation and Personalization, UMAP 2017.
31. Movement heatmaps → sequence analysis
Martinez-Maldonado, R., Pechenizkiy, M., Power, T., Buckingham Shum, S., Hayes, C. and Axisa, C. (2017). Modelling Embodied Mobility Teamwork Strategies in a Simulation-Based Healthcare Classroom. International Conference on User Modelling, Adaptation and Personalization, UMAP 2017.
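To give a feel for what the heatmap stage involves, here is a minimal sketch of binning indoor-positioning logs into a 2D occupancy grid. It is illustrative only, not the published pipeline; the room dimensions, grid resolution and sample track are assumptions.

# Minimal sketch of building a movement heatmap from positioning logs
# (illustrative only, not the published pipeline). Each log row gives a
# nurse's (x, y) position in metres, sampled at a fixed rate, so cell
# counts approximate the proportion of time spent in each region.
import numpy as np

ROOM_WIDTH_M, ROOM_DEPTH_M = 6.0, 4.0      # assumed room dimensions
GRID_X, GRID_Y = 30, 20                     # assumed heatmap resolution

def movement_heatmap(positions):
    """positions: iterable of (x, y) samples taken at a fixed rate."""
    xs, ys = zip(*positions)
    heat, _, _ = np.histogram2d(
        xs, ys,
        bins=[GRID_X, GRID_Y],
        range=[[0, ROOM_WIDTH_M], [0, ROOM_DEPTH_M]],
    )
    return heat / heat.sum()                # normalise to proportion of time

# A heatmap like this can then be discretised into named zones
# (e.g. "left of bed", "foot of bed") to produce the symbol sequences
# that sequence analysis works on.
demo_track = [(1.0 + 0.01 * t, 2.0) for t in range(300)]   # fabricated demo data
print(movement_heatmap(demo_track).shape)                   # (30, 20)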
32. Concerns around Learning Analytics
The vision of students being continually prodded or reassured by AI agents is antithetical to cultivating student qualities like self-regulation, agency, curiosity, resilience…
33. Assessing learning dispositions: Crick Learning for Resilient Agency survey (CLARA)
Deakin Crick, R., Huang, S., Ahmed Shafi, A. and Goldspink, C. (2015). Developing Resilient Agency in Learning: The Internal Structure of Learning Power. British Journal of Educational Studies, 62(2), 121-160. http://dx.doi.org/10.1080/00071005.2015.1006574
https://utscic.edu.au/tools/clara • http://clara.learningemergence.com
34. Structural Equation Model underpinning CLARA
Deakin Crick, R., Huang, S., Ahmed Shafi, A. and Goldspink, C. (2015). Developing Resilient Agency in Learning: The Internal Structure of Learning Power. British Journal of Educational Studies, 62(2), 121-160. http://dx.doi.org/10.1080/00071005.2015.1006574
35. Immediate visual analytic generated by CLARA
Feedback to Stimulate Self-Directed Change • A framework for reflection and coaching • 40-item survey
Deakin Crick, R., Huang, S., Ahmed Shafi, A. and Goldspink, C. (2015). Developing Resilient Agency in Learning: The Internal Structure of Learning Power. British Journal of Educational Studies, 62(2), 121-160. http://dx.doi.org/10.1080/00071005.2015.1006574
36. Scaling CLARA at UTS
[Four cohort profile charts: n=876, n=957, n=548, n=602]
§ Approx. 3000 student profiles
§ For the 921 students with both pre- and post-subject profiles, there were significant positive changes on all 8 dimensions.
§ We can also derive significantly different cohort profiles through cluster analysis: 4 examples (a sketch of this step follows below).
§ Next step: explore the relationships of these self-reported profiles to student outcomes.
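As an illustration of the clustering step flagged above, here is a minimal sketch that groups 8-dimension dispositional profiles into cohort profiles with k-means. The published analysis may use a different method; the survey scale and the data here are placeholders.

# Minimal sketch of deriving cohort profiles by clustering dispositional
# survey scores (illustrative only; the actual analysis may differ).
# Each row is one student's scores on the 8 CLARA-style dimensions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

DIMENSIONS = 8          # CLARA reports 8 learning-power dimensions
N_PROFILES = 4          # e.g. the "4 examples" of cohort profiles

rng = np.random.default_rng(0)
scores = rng.uniform(1, 7, size=(3000, DIMENSIONS))   # placeholder survey data

# Standardise each dimension, then cluster students into profile groups.
scaler = StandardScaler().fit(scores)
kmeans = KMeans(n_clusters=N_PROFILES, n_init=10, random_state=0)
kmeans.fit(scaler.transform(scores))

# Cluster centres (mapped back to the original scale) describe each cohort profile.
centres = scaler.inverse_transform(kmeans.cluster_centers_)
for i, centre in enumerate(centres):
    print(f"Profile {i}: " + ", ".join(f"{v:.1f}" for v in centre))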
37. Student Effort (x) vs Grade (y), from teacher observational assessments over the semester: basis for a conversation
Nagy, R. (2016). Tracking and visualizing student effort: Evolution of a practical analytics tool for staff and student engagement. Journal of Learning Analytics, 3(2), 165-193. http://dx.doi.org/10.18608/jla.2016.32.8
UTS:CIC seminar: https://utscic.edu.au/events/niccic-redlands-school-8-june-2016
https://vimeo.com/168306314
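For readers who want to build this kind of view from their own marks, here is a minimal sketch of an effort-vs-grade scatter plot. It is illustrative only, not Nagy's tool; the effort ratings and grades are invented placeholders.

# Minimal sketch of an effort (x) vs grade (y) scatter of the kind described
# above (illustrative only, not Nagy's actual tool). Effort is a teacher's
# observational rating; the data below are invented.
import matplotlib.pyplot as plt

effort = [2.0, 3.5, 4.0, 4.5, 1.5, 3.0, 5.0, 2.5]   # teacher-rated effort, 1-5
grade = [55, 62, 74, 81, 48, 70, 90, 52]             # end-of-semester grade, %

fig, ax = plt.subplots()
ax.scatter(effort, grade)
ax.set_xlabel("Teacher-assessed effort (1-5)")
ax.set_ylabel("Grade (%)")
ax.set_title("Effort vs grade: a prompt for conversation, not a verdict")
plt.show()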
38. From clicks to constructs in MOOCs: defining a C21 capability of Crowd-Sourced Learning (part of a larger map)
Milligan, S. and Griffin, P. (2016). Understanding learning and learning design in MOOCs: A measurement-based interpretation. Journal of Learning Analytics, 3(2), 88-115. http://dx.doi.org/10.18608/jla.2016.32.5
42. Buckingham Shum, S. & Deakin Crick, R. (Eds.) (2016). Learning Analytics for 21st Century Competencies. Journal of Learning Analytics (Special Section), 3(2), pp. 6-212. http://dx.doi.org/10.18608/jla.2016.32.2
More examples: https://utscic.edu.au/lasi-asia-keynote2016
43. Concerns around Learning Analytics
Higher education – at its best – teaches students to craft arguments, and reflect deeply on how they’re developing as learners and professionals.
(Analytics seem to have nothing to say about such qualities.)
45. Academic Writing Analytics: feedback on analytical/argumentative or reflective writing
Info: https://utscic.edu.au/tools/awa
46. CIC’s automated feedback tool: analytical writing
Highlighted sentences are colour-coded according to their broad type. Sentences with Function Keys have more precise functions (e.g. Novelty).
Info: https://utscic.edu.au/tools/awa
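To convey the idea behind this kind of feedback, here is a toy sketch that matches sentences against cue patterns for broad rhetorical moves, the sort of match that could drive colour-coding. AWA's actual engine uses much richer linguistic analysis; the patterns and labels below are hypothetical.

# Toy sketch of sentence-level writing feedback (illustrative only; AWA's
# engine uses far richer linguistic analysis). Each sentence is matched
# against hypothetical cue patterns for broad rhetorical moves; a match is
# what could drive highlighting/colour-coding in a feedback interface.
import re

MOVE_PATTERNS = {
    "Contrast": r"\b(however|in contrast|on the other hand|although)\b",
    "Emphasis": r"\b(importantly|significantly|notably|crucially)\b",
    "Novelty": r"\b(new|novel|for the first time|unexplored)\b",
    "Position": r"\b(we argue|I contend|this essay claims)\b",
}

def tag_sentences(text):
    """Split text into sentences and attach any matching move labels."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    tagged = []
    for sentence in sentences:
        moves = [move for move, pattern in MOVE_PATTERNS.items()
                 if re.search(pattern, sentence, flags=re.IGNORECASE)]
        tagged.append((sentence, moves))
    return tagged

sample = ("Prior studies treat feedback as one-way. However, we argue that "
          "analytics open a novel two-way channel between learners and tutors.")
for sentence, moves in tag_sentences(sample):
    print(moves or ["(no move detected)"], "-", sentence)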
47. AWA: ANALYTICAL ACADEMIC WRITING (UTS CIVIL LAW)
Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á., & Wang, X. (Forthcoming). Academic Writing Analytics for Civil Law: Participatory Design through Academic and Student Engagement. International Journal of Artificial Intelligence in Education.
48. AWA: ANALYTICAL ACADEMIC WRITING
Roll over sentences with Function Keys for a popup reminding you of their meaning.
Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á., & Wang, X. (Forthcoming). Academic Writing Analytics for Civil Law: Participatory Design through Academic and Student Engagement. International Journal of Artificial Intelligence in Education.
49. New UI to promote re-drafting in response to feedback
Shibani, A., Knight, S., Buckingham Shum, S., & Ryan, P. (2017, Forthcoming). Design and Implementation of a Pedagogic Intervention Using Writing Analytics. In W. Chen et al. (Eds.), Proceedings of the 25th International Conference on Computers in Education, New Zealand.