Presentation about Learning Analytics for a JISC network event; discussion of research findings and implications for individuals and institutions considering a Learning Analytics project. Also discusses implications for my work with Blackboard on "Platform Analytics."
Using Learning Analytics to Assess Innovation & Improve Student Achievement
1. Using Learning Analytics to Assess Innovation & Improve Student Achievement
John Whitmer, Ed.D.
john.whitmer@blackboard.com
@johncwhitmer
UK Learning Analytics Network Event (JISC)
March 5, 2015
http://bit.ly/jwhitmer-jisc
2. Quick bio
15 years managing academic technology at public higher ed institutions (R1, 4-year, CCs)
• Always multi-campus projects, innovative uses of academic technologies
• Most recently: California State University, Chancellor's Office, Academic Technology Services
Doctorate in Education from UC Davis (2013), with a Learning Analytics study on a hybrid, large-enrollment course
Active academic research practice (San Diego State Learning Analytics, MOOC Research Initiative, Udacity SJSU Study…)
3. Meta-questions driving my research
1. How can we provide students with immediate, real-time feedback? (especially to identify students at risk of failing a course)
2. How can we design effective interventions for these students?
3. How can we assess innovations (or status quo deployments) of academic technologies?
4. Do these findings apply equally to students 'at promise' due to their background (e.g. race, class, family education, geography)?
4. Outline
1. Defining & Positioning Learning Analytics
2. A Few Empirical Research Findings
• Understanding Contradictory Outcomes in a Redesigned Hybrid Course (Chico State)
• Creating Accurate Learning Analytics Triggers & Effective Interventions (SDSU)
3. How we're Applying this Research @ Blackboard
4. Discussion
5. Economist. (2010, November 4). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy. The Economist.
6. 200MB of data emissions annually
Economist. (2010, November 4). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy. The Economist.
7. Logged into course within 24 hours
Interacts frequently in discussion boards
Failed first exam
Hasn't taken college-level math
No declared major
8. What is learning analytics?
Learning and Knowledge Analytics Conference, 2011:
"...measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs."
9. Strong interest by faculty & students
From Eden Dahlstrom, D. Christopher Brooks, and Jacqueline Bichsel. The Current Ecosystem of Learning Management Systems in Higher Education: Student, Faculty, and IT Perspectives. Research report. Louisville, CO: ECAR, September 2014. Available from http://www.educause.edu/ecar.
13. Study Overview
Course redesigned for hybrid delivery in a year-long program
Enrollment: 373 students (54% increase in the largest section)
Highest LMS usage on the entire campus, Fall 2010 (>250k hits)
Bimodal outcomes:
• 10% increase in SLO mastery
• 7% & 11% increase in DWF (54 F's)
Why? Can't tell with aggregated reporting data.
14. Grades Significantly Related to Access
Course: "Introduction to Religious Studies," CSU Chico, Fall 2010 (n=373)
Variable: % Variance
Total hits: 23%
Assessment activity hits: 22%
Content activity hits: 17%
Engagement activity hits: 16%
Administrative activity hits: 12%
Mean value, all significant variables: 18%
15. LMS Activity a Better Predictor than Demographic/Educational Variables
Variable: % Variance
HS GPA: 9%
URM and Pell-Eligibility Interaction: 7%
Under-Represented Minority: 4%
Enrollment Status: 3%
URM and Gender Interaction: 2%
Pell Eligible: 2%
First in Family to Attend College: 1%
Mean value, all significant variables: 4%
Not statistically significant: Gender, Major-College
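The "% Variance" figures in the two tables above are the share of variation in course grade explained by each variable in a regression. As a rough illustration of that computation only (not the study's actual code; the file and column names below are hypothetical), a bivariate version in Python:

```python
# Illustrative sketch: r-squared (% of grade variance explained) from a
# bivariate regression of final grade on one predictor at a time.
# "course_activity.csv" and its column names are assumptions.
import pandas as pd
from scipy import stats

def pct_variance_explained(df, predictor, outcome="final_grade"):
    """Squared Pearson correlation = share of outcome variance
    explained by a single predictor in a simple linear regression."""
    fit = stats.linregress(df[predictor], df[outcome])
    return fit.rvalue ** 2

df = pd.read_csv("course_activity.csv")  # one row per student (assumed)
for var in ("total_hits", "assessment_hits", "content_hits"):
    print(f"{var}: {pct_variance_explained(df, var):.0%} of grade variance")
```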
22. Study Protocol
1. Identify courses and recruit instructors
2. Prior to course start, review the syllabus and schedule meaningful "triggers" for each course (e.g. attendance, graded items, Blackboard use, etc.)
3. Run reports in Blackboard and the online homework/quiz software to identify students with low activity or performance (~weekly); a minimal sketch of this step follows below
4. Send "flagged" students in the experimental group a notification/intervention
5. Aggregate data, add demographic data. Analyze.
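Step 3 is essentially a weekly batch query over activity exports. A minimal sketch of that flagging step, assuming a hypothetical export layout and thresholds (the actual reports were run inside Blackboard and the homework/quiz software):

```python
# Hypothetical sketch of the weekly trigger report (step 3), assuming an
# export with one row per student per week. Column names and thresholds
# are illustrative, not the study's actual values.
import pandas as pd

MIN_WEEKLY_HITS = 5     # assumed activity floor
MIN_QUIZ_SCORE = 0.60   # assumed performance floor (proportion correct)

def flag_students(activity_csv, week):
    """Return students whose LMS activity or quiz performance fell
    below the trigger thresholds for the given week."""
    df = pd.read_csv(activity_csv)
    wk = df[df["week"] == week].copy()
    wk["low_activity"] = wk["lms_hits"] < MIN_WEEKLY_HITS
    wk["low_performance"] = wk["quiz_score"] < MIN_QUIZ_SCORE
    flagged = wk[wk["low_activity"] | wk["low_performance"]]
    return flagged[["student_id", "low_activity", "low_performance"]]

if __name__ == "__main__":
    # assumed export format: student_id, week, lms_hits, quiz_score
    print(flag_students("weekly_activity.csv", week=3))
```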
23. Key Questions
1. Are triggers accurate predictors of course grade?
2. Do interventions (based on triggers) improve student grades?
3. Do these relationships vary based on student background characteristics?
27. A Typical Intervention: "Concerned Friend" tone
… data that I've gathered over the years via clickers indicates that students who attend every face-to-face class meeting reduce their chances of getting a D or an F in the class from almost 30% down to approximately 8%.
So, please take my friendly advice and attend class and participate in our classroom activities via your clicker. You'll be happy you did!
Let me know if you have any questions.
Good luck,
Dr. Laumakis
28. Poll question
Did triggers predict achievement? At what level of significance? How much variation in student grade was explained?
A. Not significant
B. <10%, significant at .05 level
C. 20%, significant at .01 level
D. 30%, significant at .001 level
E. 50%+, significant at .0001 level
29. Poll question
Did triggers predict achievement? At what level of significance? How much variation in student grade was explained?
A. Not significant
B. <10%, significant at .05 level
C. 20%, significant at .01 level
D. 30%, significant at .001 level
E. 50%+, significant at .0001 level (Spring 2014, Fall 2014)
30. Statistics
Learning analytics triggers vs. final course points
Spring 2014: 4 sections, 2 courses, 882 students
Psychology: p<0.0001, r²=0.4828; Statistics: p<0.0001, r²=0.6558
31. Fall 2014 results: Almost identical
5 Sections, 3 Courses, N=1,220 students
p<0.00001; r²=0.4836
32. Spring 2015 Results (tentative): lower relationship
8 Sections, 5 Courses, N=1,390 students
p<0.00001; r²=0.28
34. Explained by differences between courses
(Spring 2015 Results by Course)
Course: R² (all triggers) / R² (no grades)
Anthro1 (Online): 0.54 / 0.65
Anthro3 (In Person): not significant / not significant
Comp Eng: not significant / not significant
Econ4: not significant / not significant
Psych1: 0.58 / 0.40
Psych2: 0.41 / 0.23
Stat3: 0.20 / 0.11
Stat4: 0.33 / 0.21
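The two R² columns above imply fitting each course twice: once with all triggers as predictors and once with grade-based triggers excluded. A hedged sketch of that comparison, with hypothetical trigger names and file layout (statsmodels OLS is one standard way to obtain the R² values; the slide doesn't specify the actual tooling):

```python
# Illustrative per-course comparison: R-squared from a model with all
# triggers vs. one excluding grade-based triggers. File layout and
# trigger column names are assumptions.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("triggers_by_student.csv")  # assumed per-student counts

ALL_TRIGGERS = ["attendance", "homework", "quiz_grade", "lms_use"]
NO_GRADES = ["attendance", "homework", "lms_use"]  # grade triggers dropped

for course, grp in df.groupby("course"):
    for label, cols in [("all triggers", ALL_TRIGGERS),
                        ("no grades", NO_GRADES)]:
        X = sm.add_constant(grp[cols])           # add intercept term
        model = sm.OLS(grp["final_points"], X).fit()
        print(f"{course:12s} {label:12s} R2={model.rsquared:.2f}")
```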
35. So did the interventions make a difference in learning outcomes?
37. Experimental Participation vs. Repeatable Grade (Pell-Eligible)
(n=168, Spring 2014, PSY 101)
No Interventions (n=87, PSY, Pell-eligible): 77% passing grade / 23% repeatable grade
Interventions (n=81, PSY, Pell-eligible): 91% passing grade / 9% repeatable grade
24 additional Pell-eligible students would have passed the class if the intervention had been applied to all participating students.
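Whether a gap like 23% vs. 9% repeatable grades is statistically reliable can be checked with a standard test of independence. A sketch using the counts implied by the slide's percentages; the slide itself doesn't report which test was used, so the chi-square below is an assumption:

```python
# Check the control/intervention difference above using the counts
# implied by the reported rates (87 control at 23% repeatable;
# 81 intervention at 9% repeatable). Chi-square test of independence
# is one standard choice; the study's actual test is not stated.
from scipy.stats import chi2_contingency

#                 [repeatable, passing]
control      = [round(0.23 * 87), 87 - round(0.23 * 87)]  # [20, 67]
intervention = [round(0.09 * 81), 81 - round(0.09 * 81)]  # [ 7, 74]

chi2, p, dof, expected = chi2_contingency([control, intervention])
print(f"chi2={chi2:.2f}, p={p:.4f}")  # small p -> groups likely differ
```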
38. Fall 2014 / Spring 2015 Intervention Results:
No Significant Difference Between Experimental/Control Groups
39. One Explanation: Low Reach
Fall 2014 (n = 1,220)
Course: # Triggers / Message Open Rate / Clickthrough Rate
Econ1: 8 / 76% / 36%
Psych1: 6 / 70% / 29%
Psych2: 7 / 69% / 35%
Stat3: 9 / 62% / 25%
Stat4: 8 / 65% / 27%
Grand Total: 38 / 68% / 30%
Spring 2015 (n = 1,138)
Course: # Triggers / Message Open Rate / Clickthrough Rate
Anthro-In Person: 17 / 57% / 10%
Anthro-Online: 7 / 71% / 35%
Comp Engineering: 15 / 52% / 14%
Econ: 15 / 44% / 13%
Psych1: 17 / 60% / 13%
Psych2: 17 / 63% / 13%
Stat3: 21 / 64% / 9%
Stat4: 20 / 55% / 5%
Grand Total: 129 / 58% / 12%
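Reach tables like these are simple aggregations over the intervention message log. A minimal pandas sketch, assuming a hypothetical per-message export with open/click flags (not the study's actual data pipeline):

```python
# Illustrative aggregation behind a reach table: open and clickthrough
# rates per course from a per-message log. File and column names
# ("course", "opened", "clicked") are assumptions.
import pandas as pd

log = pd.read_csv("intervention_messages.csv")
reach = log.groupby("course").agg(
    messages=("opened", "size"),          # messages sent per course
    open_rate=("opened", "mean"),         # share of messages opened
    clickthrough_rate=("clicked", "mean"), # share of messages clicked
)
print(reach.to_string(float_format="{:.0%}".format))
```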
40. Proposed Next Steps
• Add interventions that move "beyond informing" students to address underlying study skills and behaviors
– Supplemental Instruction <http://www.umkc.edu/asm/si/>
– Adaptive Release within online courses (content, activities)
41. Conclusions and Implications
1. Data from academic technology use predicts student achievement; diverse sources provide better predictions.
2. Tech use is a stronger predictor of course success than demographic data; adding demographic data provides nuanced understandings and identifies trends not otherwise visible.
3. Academic technology use is not a "cause" in itself, but reveals underlying study habits and behaviors (e.g. effort, time on task, massed vs. distributed activity).
4. Predictions are necessary, but not sufficient, to change academic outcomes. Research into interventions is promising.
5. We're at an early stage in Learning Analytics; expect quantum leaps in the near future.
42. 4. How we're Applying this Research @ Blackboard
43. Blackboard's "Platform Analytics" Project
A new effort to enhance our analytics offerings across our academic technology applications that includes:
• Improved instrumentation for learning activity within applications
• Applied findings from analysis (incl. inferential statistics and data mining)
• Analytics integrated into user experiences (incl. student and faculty)
• Aggregated usage data across cloud applications (anonymized, rolled-up, privacy-compliant)
48. Factors affecting growth of learning analytics
(Chart: factors plotted along two axes, Enabler ↔ Constraint and Widespread ↔ Rare)
• New education models
• Resources ($$$, talent)
• Data governance (privacy, security, ownership)
• Clear goals and linked actions
• Data valued in academic decisions
• Tools/systems for data co-mingling and analysis
• Academic technology adoption
• Low data quality (fidelity with meaningful learning)
• Difficulty of data preparation
• "Not invented here" syndrome
49. Call to action [with amendments]
(from a May 2012 Keynote Presentation @ San Diego State U)
You're not behind the curve; this is a rapidly emerging area that we can (and should) lead... [together with interested partners]
Metrics reporting is the foundation for analytics [don't under- or over-estimate its importance]
Start with what you have! Don't wait for student characteristics and detailed database information; LMS data can provide significant insights.
If there are any ed tech software folks in the audience, please help us with better reporting! [we're working on it and feel your pain!]
2013 SP – Study began with two courses (N ~2,000) in Spring 2014
Weekly reports; triggered students sent email and multimedia “interventions” (low/high intensity)
Our hypothesis was that, as the number of trigger events increased, so would the likelihood of having to repeat the course.
We focused on high needs courses.
16-38% “Repeatable grades”
James
Weekly reports; triggered students sent email "interventions" (low intensity). Did people show up, i.e., clicker (participation/attendance points)?
Talking points:
Almost ¾ of students got at least one trigger in each course
More PSY students got interventions than Stat students (b/c not completing homework)
The pattern of the # of interventions in both courses is about the same – high up to 2-3, then trails off.
Interesting findings when you consider that the triggers were very different between courses (e.g. PSY had only 2 graded items; PSY: online homework, Stat: online quizzes, etc.).
James: most recent semester: different trajectory by course, although see trail-off pattern over time, most of the triggers in small range (1-3)
But significant differences between courses in number of triggers and number of students “activating” trigger.
Show differences as we have expanded the study over time.
James
For the experimental students, we have expanded the interventions over time to use different modalities, but the focus of all interventions has been on increasing students' awareness of their at-risk behaviors/status, so that they can do something about it (self-regulate).
Message written by instructor, with strong attention to tone: a “concerned friend”, which we thought would be more effective with students.
Also include serious results that might resonate with them and they would take notice about.
John
John
These graphs illustrate that DECREASES in triggers are related to INCREASES in student grade.
(explanation: Each dot is a student; Y axis is the total points (lower to higher), and X axis is the total # of triggers (higher to lower))
Statistically significant results for both courses; probability of occurring by chance less than 1 in 10,000.
Size of effect different: PSY: triggers explain 48% variation in final grade
STAT: triggers explain 66% of variation in final grade (if remove graded items from Stat, triggers explains 49%)
John
John
John: Million dollar question: did this reduce student repeatable grade rates?
John
James
Definition: Supplemental Instruction (SI) is an academic assistance program that utilizes peer-assisted study sessions. SI sessions are regularly scheduled, informal review sessions in which students compare notes, discuss readings, develop organizational tools, and predict test items. Students learn how to integrate course content and study skills while working together. The sessions are facilitated by "SI leaders", students who have previously done well in the course and who attend all class lectures, take notes, and act as model students.
Purpose: To increase retention within targeted historically difficult courses; to improve student grades in targeted historically difficult courses; to increase the graduation rates of students.
Participants: SI is a "free service" offered to all students in a targeted course. SI is a non-remedial approach to learning, as the program targets high-risk courses rather than high-risk students. All students are encouraged to attend SI sessions, as it is a voluntary program. Students with varying levels of academic preparedness and diverse ethnicities participate. There is no remedial stigma attached to SI since the program targets high-risk courses rather than high-risk students.
John – first two points (provide examples of academic technology use when explaining first bullet)
James – last two points (because so much of a student's participation and performance in a class is logged via academic technology tools, it provides us with evidence of student behavior, which reveals patterns that can predict struggling or successful students)
More and better data: More data sources, less structured data, integrated data, potential for expansion of data sources in future. Unlock meaning from data that’s meaningful/available.
Intended for multiple audiences: Blackboard internal development, CIOs/administrative customers interested in metrics, faculty/student customers interested in learning/teaching, researchers interested in underlying relationships.