Using Learning Analytics to Assess
Innovation & Improve Student
Achievement
John Whitmer, Ed.D.
john.whitmer@blackboard.com
@johncwhitmer
UK Learning Analytics Network Event (JISC)
March 5, 2015
http://bit.ly/jwhitmer-jisc
Quick bio
15 years managing academic technology
at public higher ed institutions
(R1, 4-year, CC’s)
• Always multi-campus projects, innovative uses
of academic technologies
• Most recently: California State University,
Chancellor’s Office, Academic Technology Services
Doctorate in Education from UC Davis (2013),
with a learning analytics study of a hybrid,
large-enrollment course
Active academic research practice
(San Diego State Learning Analytics, MOOC
Research Initiative, Udacity SJSU Study…)
Meta-questions driving my research
1. How can we provide students with immediate, real-time feedback? (esp. identifying students at risk of failing a course)
2. How can we design effective interventions for these students?
3. How can we assess innovations (or status quo deployments) of academic technologies?
4. Do these findings apply equally to students 'at promise' due to their background (e.g. race, class, family education, geography)?
Outline
1. Defining & Positioning Learning Analytics
2. A Few Empirical Research Findings
• Understanding Contradictory Outcomes in a Redesigned Hybrid Course
(Chico State)
• Creating Accurate Learning Analytics Triggers & Effective Interventions
(SDSU)
3. How we’re Applying this Research @ Blackboard
4. Discussion
200 MB of data emissions annually
Economist. (2010, 11/4/2010). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy. The Economist.
• Logged into course within 24 hours
• Interacts frequently in discussion boards
• Failed first exam
• Hasn't taken college-level math
• No declared major
What is learning analytics?
Learning and Knowledge
Analytics Conference, 2011
“ ...measurement, collection,
analysis and reporting of data about
learners and their contexts,
for purposes of understanding
and optimizing learning
and the environments
in which it occurs.”
Strong interest by faculty & students
From Eden Dahlstrom, D. Christopher Brooks, and Jacqueline Bichsel. The Current Ecosystem of Learning Management Systems in Higher Education: Student, Faculty,
and IT Perspectives. Research report. Louisville, CO: ECAR, September 2014. Available from http://www.educause.edu/ecar.
Source: Educause and AIR (2012), http://goo.gl/337mA
2. A Few Empirical
Research Findings
Study 1: Understanding
Contradictory Outcomes in a
Redesigned Hybrid Course
(Chico State)
Study Overview
• Course redesigned for hybrid delivery in a year-long program
• Enrollment: 373 students (54% increase in the largest section)
• Highest LMS usage on the entire campus, Fall 2010 (>250k hits)
• Bimodal outcomes: 10% increase in SLO mastery, but 7% & 11% increases in DWF rates (54 F's)
• Why? Can't tell with aggregated reporting data
Grades Significantly Related to Access
Course: “Introduction to Religious Studies”
CSU Chico, Fall 2013 (n=373)
Variable % Variance
Total Hits 23%
Assessment activity hits 22%
Content activity hits 17%
Engagement activity hits 16%
Administrative activity hits 12%
Mean value, all significant variables 18%
LMS Activity a Better Predictor than Demographic/Educational Variables
Variable % Var.
HS GPA 9%
URM and Pell-Eligibility Interaction 7%
Under-Represented Minority 4%
Enrollment Status 3%
URM and Gender Interaction 2%
Pell Eligible 2%
First in Family to Attend College 1%
Mean value all significant variables 4%
Not statistically significant: Gender, Major-College
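The "% variance" figures in these two tables are univariate: the share of variation in final grade that each predictor explains on its own. A minimal sketch of how such figures could be computed from a per-student course export is below; the file name and column names are hypothetical illustrations, not the study's actual data.

```python
import pandas as pd
from scipy import stats

# Hypothetical per-student export: final grade, LMS hit counts by
# category, and demographic/educational variables.
df = pd.read_csv("course_export.csv")

predictors = [
    "total_hits", "assessment_hits", "content_hits",
    "engagement_hits", "admin_hits",
    "hs_gpa", "pell_eligible", "urm", "first_generation",
]

# Univariate "% variance explained" = squared Pearson correlation
# between a single predictor and the final course grade.
for col in predictors:
    r, p = stats.pearsonr(df[col], df["final_grade"])
    print(f"{col:20s} r^2 = {r * r:.2f}   p = {p:.4f}")
```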
At-risk students: “Over-working gap”
[Chart: LMS activities (Admin, Assess, Engage, Content) by final grade (A, B+, C, C-) for Pell-eligible vs. not-Pell-eligible students; annotation highlights extra effort in content-related activities among Pell-eligible students]
Study 2: Creating Accurate Learning
Analytics Triggers & Effective
Interventions (SDSU)
Study Overview
• President-level initiative
• Goal: identify effective
interventions driven by Learning
Analytics “triggers”
• Multiple “triggers” (e.g., LMS
access, Grade, Online
Homework/Quiz, Clicker use)
• At scale & over time: conducted
for 3 terms, 5 unique courses,
3,529 students
• “Gold standard” experimental
design (control / treatment)
Focus on High Need Courses
Course       Non-Repeatable Grades   Repeatable Grades
ANTH 101     84%                     16%
COMPE 270    62%                     38%
ECON 101     69%                     31%
PSY 101      78.5%                   21.5%
STAT 119     70.8%                   29.2%
Study Protocol
1. Identify courses and recruit instructors
2. Prior to course start, review the syllabus and schedule meaningful "triggers" for each course (e.g. attendance, graded items, Blackboard use, etc.)
3. Run reports in Blackboard and the online homework/quiz software to identify students with low activity or performance (~ weekly)
4. Send "flagged" students in the experimental group a notification/intervention (see the sketch below)
5. Aggregate data, add demographic data. Analyze.
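A minimal sketch of what one weekly flagging pass (steps 3-4) might look like. The thresholds, column names, file name, and notification helper are hypothetical illustrations, not the SDSU implementation.

```python
import pandas as pd

def send_notification(email: str) -> None:
    # Placeholder: in practice this would send the "concerned friend"
    # email shown later in the deck.
    print(f"Would notify {email}")

# Hypothetical weekly export merged from Blackboard and the online
# homework/quiz system: one row per student.
activity = pd.read_csv("week_05_activity.csv")

# Illustrative trigger thresholds; the real triggers were scheduled
# per course from the syllabus review in step 2.
LOW_SESSIONS = 2       # fewer than 2 LMS sessions this week
LOW_QUIZ_SCORE = 0.60  # below 60% on the week's online quiz

flagged = activity[
    (activity["lms_sessions"] < LOW_SESSIONS)
    | (activity["quiz_score"] < LOW_QUIZ_SCORE)
]

# Only students randomized into the experimental group are contacted;
# the control group is flagged but receives no message.
for _, student in flagged[flagged["group"] == "experimental"].iterrows():
    send_notification(student["email"])
```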
Key Questions
1. Are triggers accurate predictors of course grade?
2. Do interventions (based on triggers) improve student grades?
3. Do these relationships vary based on student background
characteristics?
Frequency of interventions (Spring 2014)
# Students receiving >0 interventions: PSY 177 (84%), STAT 165 (70%)
[Chart: distribution of the number of interventions received per student (0 to >10), PSY vs. STAT]
Frequency of interventions (Spring 2015)
[Chart: "Triggers Activated per Student (Spring 2015)", distribution of triggers activated per student (0 to 14) for Anthro1, Anthro3, Comp Engr, Econ4, Psych1, Psych2, Stats3, and Stats4]
Interventions: Spring 2014, Fall 2014, Spring 2015
A Typical Intervention: “Concerned Friend” tone
… data that I've gathered over the years via clickers indicates
that students who attend every face-to-face class meeting reduce
their chances of getting a D or an F in the class from
almost 30% down to approximately 8%.
So, please take my friendly advice and attend class and
participate in our classroom activities via your clicker. You'll be
happy you did!
Let me know if you have any questions.
Good luck,
Dr. Laumakis
Poll question: Did triggers predict achievement? At what level of significance? How much variation in student grade was explained?
A. Not significant
B. <10%, significant at the .05 level
C. 20%, significant at the .01 level
D. 30%, significant at the .001 level
E. 50%+, significant at the .0001 level
Answer: E. 50%+, significant at the .0001 level (Spring 2014, Fall 2014)
Learning analytics triggers vs. final course points
Spring 2014: 4 sections, 2 courses, 882 students
Statistics: p<0.0001; r2=0.4828
Psychology: p<0.0001; r2=0.6558
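These figures are consistent with an ordinary least squares regression of final course points on the accumulated trigger measures. A minimal sketch, assuming hypothetical file and column names rather than the actual SDSU data:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical per-student records: final course points plus the
# trigger measures accumulated over the term.
df = pd.read_csv("spring2014_triggers.csv")

X = sm.add_constant(
    df[["lms_access", "gradebook_points", "online_homework", "clicker_use"]]
)
y = df["final_course_points"]

model = sm.OLS(y, X).fit()
print(f"r^2 = {model.rsquared:.4f}, p = {model.f_pvalue:.2e}")
```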
Fall 2014 results: Almost identical
5 Sections, 3 Courses, N=1,220 students
p<0.00001; r2=0.4836
Spring 2015 Results (tentative): lower relationship
8 Sections, 5 Courses, N=1,390 students
p<0.00001; r2=0.28
Explained by differences between courses (Spring 2015 results by course)

Course                R2 (all triggers)   R2 (no grades)
Anthro1 (Online)      0.54                0.65
Anthro3 (In Person)   not significant     not significant
Comp Eng              not significant     not significant
Econ4                 not significant     not significant
Psych1                0.58                0.40
Psych2                0.41                0.23
Stat3                 0.20                0.11
Stat4                 0.33                0.21
So did the interventions make a
difference in learning outcomes?
Experimental Participation vs. Repeatable Grade (Spring 2014)

Group            Passing Grade   Repeatable Grade
Control (STAT)   79%             21%
Exp. (STAT)      79%             21%
Exp. (PSY)       89%             11%
Control (PSY)    82%             18%
Experimental Participation vs. Repeatable Grade, Pell-Eligible Students (n=168, Spring 2014, PSY 101)

Group                     Passing Grade   Repeatable Grade
No Interventions (n=87)   77%             23%
Interventions (n=81)      91%             9%

24 additional Pell-eligible students would have passed the class if the intervention had been applied to all participating students.
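(The figure of 24 additional students appears to come from extrapolating the 14-point difference in pass rates across all 168 Pell-eligible students: 168 × (0.91 − 0.77) ≈ 24, assuming the effect observed in the treated group would have held for the untreated group as well.)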
Fall 2014 / Spring 2015 Intervention
Results:
No Significant Difference Between
Experimental/Control Groups.
One Explanation: Low Reach
Fall 2014 (n = 1,220)

Course    # Triggers   Message Open Rate   Clickthrough Rate
Econ1     8            76%                 36%
Psych1    6            70%                 29%
Psych2    7            69%                 35%
Stat3     9            62%                 25%
Stat4     8            65%                 27%
Total     38           68%                 30%

Spring 2015 (n = 1,138)

Course              # Triggers   Message Open Rate   Clickthrough Rate
Anthro (In Person)  17           57%                 10%
Anthro (Online)     7            71%                 35%
Comp Engineering    15           52%                 14%
Econ                15           44%                 13%
Psych1              17           60%                 13%
Psych2              17           63%                 13%
Stat3               21           64%                 9%
Stat4               20           55%                 5%
Total               129          58%                 12%
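(Reading the totals: message open rates fell from 68% in Fall 2014 to 58% in Spring 2015, and clickthrough fell from 30% to 12%. Assuming the clickthrough rate is measured against all messages sent, only about one flagged student in eight engaged with the Spring 2015 intervention content, which is consistent with the low-reach explanation.)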
Proposed Next Steps
• Add interventions that move “beyond informing”
students to address underlying study skills and behaviors
– Supplemental Instruction <http://www.umkc.edu/asm/si/>
– Adaptive Release within online courses (content, activities)
Conclusions and Implications
1. Data from academic technology use predicts student achievement; diverse sources provide better predictions.
2. Technology use is a better predictor of course success than demographic data; adding demographic data provides nuanced understandings and identifies trends not otherwise visible.
3. Academic technology use is not a "cause" in itself, but reveals underlying study habits and behaviors (e.g. effort, time on task, massed vs. distributed activity).
4. Predictions are necessary, but not sufficient, to change academic outcomes. Research into interventions is promising.
5. We're at an early stage in Learning Analytics; expect quantum leaps in the near future.
3. How we're Applying this Research @ Blackboard
Blackboard's "Platform Analytics" Project
A new effort to enhance our analytics offerings across our academic technology applications that includes:
• Improved instrumentation for learning activity within applications
• Aggregated usage data across cloud applications (anonymized, rolled-up, privacy-compliant)
• Applied findings from analysis (inc. inferential statistics and data mining)
• Integrated analytics into user experiences (inc. student and faculty)
Blackboard Analytics for Learn
4. Wrap-Up and Discussion
Factors affecting growth of learning analytics
[2x2 diagram, axes: Enabler vs. Constraint and Widespread vs. Rare. Factors plotted: new education models; resources ($$$, talent); data governance (privacy, security, ownership); clear goals and linked actions; data valued in academic decisions; tools/systems for data co-mingling and analysis; academic technology adoption; low data quality (fidelity with meaningful learning); difficulty of data preparation; "not invented here" syndrome]
Call to action [with amendments]
(from a May 2012 Keynote Presentation @ San Diego State U)
You're not behind the curve; this is a rapidly emerging area that we can (should) lead... [together with interested partners]
Metrics reporting is the foundation for analytics
[don’t under or over-estimate the importance]
Start with what you have! Don’t wait for student characteristics and
detailed database information; LMS data can provide significant insights
If there are any ed tech software folks in the audience, please help us with better reporting!
[we’re working on it and feel your pain!]
Discussion /
Questions
John Whitmer, Ed.D.
john.whitmer@blackboard.com
@johncwhitmer
http://bit.ly/jwhitmer-jisc

Contenu connexe

Tendances

Students First 2020 - Usage and impact of academic support
Students First 2020 - Usage and impact of academic supportStudents First 2020 - Usage and impact of academic support
Students First 2020 - Usage and impact of academic supportStudiosity.com
 
Affective behaviour cognition learning gains project presentation
Affective behaviour cognition learning gains project presentationAffective behaviour cognition learning gains project presentation
Affective behaviour cognition learning gains project presentationBart Rienties
 
Learning Analytics: What is it? Why do it? And how?
Learning Analytics: What is it? Why do it?  And how?Learning Analytics: What is it? Why do it?  And how?
Learning Analytics: What is it? Why do it? And how?Timothy Harfield
 
Students First 2020 - Embracing and effectively leveraging online student sup...
Students First 2020 - Embracing and effectively leveraging online student sup...Students First 2020 - Embracing and effectively leveraging online student sup...
Students First 2020 - Embracing and effectively leveraging online student sup...Studiosity.com
 
Personalized Online Practice Systems for Learning Programming
Personalized Online Practice Systems for Learning ProgrammingPersonalized Online Practice Systems for Learning Programming
Personalized Online Practice Systems for Learning ProgrammingPeter Brusilovsky
 
Learning design meets learning analytics: Dr Bart Rienties, Open University
Learning design meets learning analytics: Dr Bart Rienties, Open UniversityLearning design meets learning analytics: Dr Bart Rienties, Open University
Learning design meets learning analytics: Dr Bart Rienties, Open UniversityBart Rienties
 
Using learning analytics to improve student transition into and support throu...
Using learning analytics to improve student transition into and support throu...Using learning analytics to improve student transition into and support throu...
Using learning analytics to improve student transition into and support throu...Tinne De Laet
 
ABLE - the NTU Student Dashboard - University of Derby
ABLE - the NTU Student Dashboard - University of DerbyABLE - the NTU Student Dashboard - University of Derby
ABLE - the NTU Student Dashboard - University of DerbyEd Foster
 
Online writing feedback: A national study exploring the service and learning ...
Online writing feedback: A national study exploring the service and learning ...Online writing feedback: A national study exploring the service and learning ...
Online writing feedback: A national study exploring the service and learning ...Studiosity.com
 
Educational Technologies: Learning Analytics and Artificial Intelligence
Educational Technologies: Learning Analytics and Artificial IntelligenceEducational Technologies: Learning Analytics and Artificial Intelligence
Educational Technologies: Learning Analytics and Artificial IntelligenceXavier Ochoa
 
Introduction to Learning Analytics - Framework and Implementation Concerns
Introduction to Learning Analytics - Framework and Implementation ConcernsIntroduction to Learning Analytics - Framework and Implementation Concerns
Introduction to Learning Analytics - Framework and Implementation ConcernsTore Hoel
 
Keynote H818 The Power of (In)formal learning: a learning analytics approach
Keynote H818 The Power of (In)formal learning: a learning analytics approachKeynote H818 The Power of (In)formal learning: a learning analytics approach
Keynote H818 The Power of (In)formal learning: a learning analytics approachBart Rienties
 
An Infrastructure for Sustainable Innovation and Research in Computer Scienc...
An Infrastructure for Sustainable Innovation and Research in Computer Scienc...An Infrastructure for Sustainable Innovation and Research in Computer Scienc...
An Infrastructure for Sustainable Innovation and Research in Computer Scienc...Peter Brusilovsky
 
State and Directions of Learning Analytics Adoption (Second edition)
State and Directions of Learning Analytics Adoption (Second edition)State and Directions of Learning Analytics Adoption (Second edition)
State and Directions of Learning Analytics Adoption (Second edition)Dragan Gasevic
 
Toward an automated student feedback system for text based assignments - Pete...
Toward an automated student feedback system for text based assignments - Pete...Toward an automated student feedback system for text based assignments - Pete...
Toward an automated student feedback system for text based assignments - Pete...Blackboard APAC
 
Course-Adaptive Content Recommender for Course Authoring
Course-Adaptive Content Recommender for Course AuthoringCourse-Adaptive Content Recommender for Course Authoring
Course-Adaptive Content Recommender for Course AuthoringPeter Brusilovsky
 
Conducting Research on Blended and Online Education, Workshop
Conducting Research on Blended and Online Education, WorkshopConducting Research on Blended and Online Education, Workshop
Conducting Research on Blended and Online Education, WorkshopTanya Joosten
 
Land of The Learning Giants: The Rise of MOOCs
Land of The Learning Giants: The Rise of MOOCsLand of The Learning Giants: The Rise of MOOCs
Land of The Learning Giants: The Rise of MOOCsEamon Costello
 
Educational Data Mining in Program Evaluation: Lessons Learned
Educational Data Mining in Program Evaluation: Lessons LearnedEducational Data Mining in Program Evaluation: Lessons Learned
Educational Data Mining in Program Evaluation: Lessons LearnedKerry Rice
 

Tendances (20)

Students First 2020 - Usage and impact of academic support
Students First 2020 - Usage and impact of academic supportStudents First 2020 - Usage and impact of academic support
Students First 2020 - Usage and impact of academic support
 
Affective behaviour cognition learning gains project presentation
Affective behaviour cognition learning gains project presentationAffective behaviour cognition learning gains project presentation
Affective behaviour cognition learning gains project presentation
 
Learning Analytics: What is it? Why do it? And how?
Learning Analytics: What is it? Why do it?  And how?Learning Analytics: What is it? Why do it?  And how?
Learning Analytics: What is it? Why do it? And how?
 
Students First 2020 - Embracing and effectively leveraging online student sup...
Students First 2020 - Embracing and effectively leveraging online student sup...Students First 2020 - Embracing and effectively leveraging online student sup...
Students First 2020 - Embracing and effectively leveraging online student sup...
 
Personalized Online Practice Systems for Learning Programming
Personalized Online Practice Systems for Learning ProgrammingPersonalized Online Practice Systems for Learning Programming
Personalized Online Practice Systems for Learning Programming
 
Learning design meets learning analytics: Dr Bart Rienties, Open University
Learning design meets learning analytics: Dr Bart Rienties, Open UniversityLearning design meets learning analytics: Dr Bart Rienties, Open University
Learning design meets learning analytics: Dr Bart Rienties, Open University
 
Using learning analytics to improve student transition into and support throu...
Using learning analytics to improve student transition into and support throu...Using learning analytics to improve student transition into and support throu...
Using learning analytics to improve student transition into and support throu...
 
ABLE - the NTU Student Dashboard - University of Derby
ABLE - the NTU Student Dashboard - University of DerbyABLE - the NTU Student Dashboard - University of Derby
ABLE - the NTU Student Dashboard - University of Derby
 
Online writing feedback: A national study exploring the service and learning ...
Online writing feedback: A national study exploring the service and learning ...Online writing feedback: A national study exploring the service and learning ...
Online writing feedback: A national study exploring the service and learning ...
 
Educational Technologies: Learning Analytics and Artificial Intelligence
Educational Technologies: Learning Analytics and Artificial IntelligenceEducational Technologies: Learning Analytics and Artificial Intelligence
Educational Technologies: Learning Analytics and Artificial Intelligence
 
Introduction to Learning Analytics - Framework and Implementation Concerns
Introduction to Learning Analytics - Framework and Implementation ConcernsIntroduction to Learning Analytics - Framework and Implementation Concerns
Introduction to Learning Analytics - Framework and Implementation Concerns
 
Robin Smyth
Robin SmythRobin Smyth
Robin Smyth
 
Keynote H818 The Power of (In)formal learning: a learning analytics approach
Keynote H818 The Power of (In)formal learning: a learning analytics approachKeynote H818 The Power of (In)formal learning: a learning analytics approach
Keynote H818 The Power of (In)formal learning: a learning analytics approach
 
An Infrastructure for Sustainable Innovation and Research in Computer Scienc...
An Infrastructure for Sustainable Innovation and Research in Computer Scienc...An Infrastructure for Sustainable Innovation and Research in Computer Scienc...
An Infrastructure for Sustainable Innovation and Research in Computer Scienc...
 
State and Directions of Learning Analytics Adoption (Second edition)
State and Directions of Learning Analytics Adoption (Second edition)State and Directions of Learning Analytics Adoption (Second edition)
State and Directions of Learning Analytics Adoption (Second edition)
 
Toward an automated student feedback system for text based assignments - Pete...
Toward an automated student feedback system for text based assignments - Pete...Toward an automated student feedback system for text based assignments - Pete...
Toward an automated student feedback system for text based assignments - Pete...
 
Course-Adaptive Content Recommender for Course Authoring
Course-Adaptive Content Recommender for Course AuthoringCourse-Adaptive Content Recommender for Course Authoring
Course-Adaptive Content Recommender for Course Authoring
 
Conducting Research on Blended and Online Education, Workshop
Conducting Research on Blended and Online Education, WorkshopConducting Research on Blended and Online Education, Workshop
Conducting Research on Blended and Online Education, Workshop
 
Land of The Learning Giants: The Rise of MOOCs
Land of The Learning Giants: The Rise of MOOCsLand of The Learning Giants: The Rise of MOOCs
Land of The Learning Giants: The Rise of MOOCs
 
Educational Data Mining in Program Evaluation: Lessons Learned
Educational Data Mining in Program Evaluation: Lessons LearnedEducational Data Mining in Program Evaluation: Lessons Learned
Educational Data Mining in Program Evaluation: Lessons Learned
 

Similaire à Using Learning Analytics to Assess Innovation & Improve Student Achievement

Presentations morning session 22 January 2018 HEFCE open event “Using data to...
Presentations morning session 22 January 2018 HEFCE open event “Using data to...Presentations morning session 22 January 2018 HEFCE open event “Using data to...
Presentations morning session 22 January 2018 HEFCE open event “Using data to...Bart Rienties
 
CypherWorx OST Effiacy Study Results 2015
CypherWorx OST Effiacy Study Results 2015CypherWorx OST Effiacy Study Results 2015
CypherWorx OST Effiacy Study Results 2015Steve Stookey
 
MedU Plenary Session on Analytics
MedU Plenary Session on AnalyticsMedU Plenary Session on Analytics
MedU Plenary Session on AnalyticsJanet Corral
 
Conducting Research on Blended and Online Education: A Research Toolkit
Conducting Research on Blended and Online Education: A Research ToolkitConducting Research on Blended and Online Education: A Research Toolkit
Conducting Research on Blended and Online Education: A Research ToolkitTanya Joosten
 
Developmental evaluations for institutional impact
Developmental evaluations for institutional impactDevelopmental evaluations for institutional impact
Developmental evaluations for institutional impactRhona Sharpe
 
The power of learning analytics to measure learning gains: an OU, Surrey and ...
The power of learning analytics to measure learning gains: an OU, Surrey and ...The power of learning analytics to measure learning gains: an OU, Surrey and ...
The power of learning analytics to measure learning gains: an OU, Surrey and ...Bart Rienties
 
Using Learning Analytics to Understand Student Achievement
Using Learning Analytics to Understand Student AchievementUsing Learning Analytics to Understand Student Achievement
Using Learning Analytics to Understand Student AchievementJohn Whitmer, Ed.D.
 
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...Tanya Joosten
 
Response to Intervention
Response to Intervention Response to Intervention
Response to Intervention Scot Headley
 
At d & data presentation
At d & data presentationAt d & data presentation
At d & data presentationharrindl
 
Dr. Nasrin Nazemzadeh, PhD Dissertation Defense, Dr. William Allan Kritsonis,...
Dr. Nasrin Nazemzadeh, PhD Dissertation Defense, Dr. William Allan Kritsonis,...Dr. Nasrin Nazemzadeh, PhD Dissertation Defense, Dr. William Allan Kritsonis,...
Dr. Nasrin Nazemzadeh, PhD Dissertation Defense, Dr. William Allan Kritsonis,...William Kritsonis
 
Nasrin Nazemzadeh, Dissertation, Dr. William Allan Kritsonis, Dissertation Ch...
Nasrin Nazemzadeh, Dissertation, Dr. William Allan Kritsonis, Dissertation Ch...Nasrin Nazemzadeh, Dissertation, Dr. William Allan Kritsonis, Dissertation Ch...
Nasrin Nazemzadeh, Dissertation, Dr. William Allan Kritsonis, Dissertation Ch...William Kritsonis
 
Society for Research into Higher Education conference presentation
Society for Research into Higher Education conference presentationSociety for Research into Higher Education conference presentation
Society for Research into Higher Education conference presentationChristian Bokhove
 
Lessons learned from 200K students and 2 GB of learning gains data.
Lessons learned from 200K students and 2 GB of learning gains data.Lessons learned from 200K students and 2 GB of learning gains data.
Lessons learned from 200K students and 2 GB of learning gains data.Bart Rienties
 
Talis Insight Asia-Pacific 2017: Simon Bedford, University of Wollongong
Talis Insight Asia-Pacific 2017: Simon Bedford, University of WollongongTalis Insight Asia-Pacific 2017: Simon Bedford, University of Wollongong
Talis Insight Asia-Pacific 2017: Simon Bedford, University of WollongongTalis
 
Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...
Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...
Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...Blackboard APAC
 
2021_06_30 «Small Interventions to Support Student Self-regulation Online »
2021_06_30 «Small Interventions to Support Student Self-regulation Online »2021_06_30 «Small Interventions to Support Student Self-regulation Online »
2021_06_30 «Small Interventions to Support Student Self-regulation Online »eMadrid network
 

Similaire à Using Learning Analytics to Assess Innovation & Improve Student Achievement (20)

Presentations morning session 22 January 2018 HEFCE open event “Using data to...
Presentations morning session 22 January 2018 HEFCE open event “Using data to...Presentations morning session 22 January 2018 HEFCE open event “Using data to...
Presentations morning session 22 January 2018 HEFCE open event “Using data to...
 
CypherWorx OST Effiacy Study Results 2015
CypherWorx OST Effiacy Study Results 2015CypherWorx OST Effiacy Study Results 2015
CypherWorx OST Effiacy Study Results 2015
 
MedU Plenary Session on Analytics
MedU Plenary Session on AnalyticsMedU Plenary Session on Analytics
MedU Plenary Session on Analytics
 
Conducting Research on Blended and Online Education: A Research Toolkit
Conducting Research on Blended and Online Education: A Research ToolkitConducting Research on Blended and Online Education: A Research Toolkit
Conducting Research on Blended and Online Education: A Research Toolkit
 
Developmental evaluations for institutional impact
Developmental evaluations for institutional impactDevelopmental evaluations for institutional impact
Developmental evaluations for institutional impact
 
The power of learning analytics to measure learning gains: an OU, Surrey and ...
The power of learning analytics to measure learning gains: an OU, Surrey and ...The power of learning analytics to measure learning gains: an OU, Surrey and ...
The power of learning analytics to measure learning gains: an OU, Surrey and ...
 
Using Learning Analytics to Understand Student Achievement
Using Learning Analytics to Understand Student AchievementUsing Learning Analytics to Understand Student Achievement
Using Learning Analytics to Understand Student Achievement
 
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...
Promoting Effective Teaching and Learning Ecosystems via Research Proven Prac...
 
Response to Intervention
Response to Intervention Response to Intervention
Response to Intervention
 
At d & data presentation
At d & data presentationAt d & data presentation
At d & data presentation
 
Leading with Data: Impacting Change on Scale featuring Belinda Tynan
Leading with Data: Impacting Change on Scale featuring Belinda TynanLeading with Data: Impacting Change on Scale featuring Belinda Tynan
Leading with Data: Impacting Change on Scale featuring Belinda Tynan
 
Dr. Nasrin Nazemzadeh, PhD Dissertation Defense, Dr. William Allan Kritsonis,...
Dr. Nasrin Nazemzadeh, PhD Dissertation Defense, Dr. William Allan Kritsonis,...Dr. Nasrin Nazemzadeh, PhD Dissertation Defense, Dr. William Allan Kritsonis,...
Dr. Nasrin Nazemzadeh, PhD Dissertation Defense, Dr. William Allan Kritsonis,...
 
Nasrin Nazemzadeh, Dissertation, Dr. William Allan Kritsonis, Dissertation Ch...
Nasrin Nazemzadeh, Dissertation, Dr. William Allan Kritsonis, Dissertation Ch...Nasrin Nazemzadeh, Dissertation, Dr. William Allan Kritsonis, Dissertation Ch...
Nasrin Nazemzadeh, Dissertation, Dr. William Allan Kritsonis, Dissertation Ch...
 
Continuous Improvement in Teaching and Learning – The Community College Open ...
Continuous Improvement in Teaching and Learning – The Community College Open ...Continuous Improvement in Teaching and Learning – The Community College Open ...
Continuous Improvement in Teaching and Learning – The Community College Open ...
 
A Quiet Crisis
A Quiet CrisisA Quiet Crisis
A Quiet Crisis
 
Society for Research into Higher Education conference presentation
Society for Research into Higher Education conference presentationSociety for Research into Higher Education conference presentation
Society for Research into Higher Education conference presentation
 
Lessons learned from 200K students and 2 GB of learning gains data.
Lessons learned from 200K students and 2 GB of learning gains data.Lessons learned from 200K students and 2 GB of learning gains data.
Lessons learned from 200K students and 2 GB of learning gains data.
 
Talis Insight Asia-Pacific 2017: Simon Bedford, University of Wollongong
Talis Insight Asia-Pacific 2017: Simon Bedford, University of WollongongTalis Insight Asia-Pacific 2017: Simon Bedford, University of Wollongong
Talis Insight Asia-Pacific 2017: Simon Bedford, University of Wollongong
 
Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...
Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...
Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...
 
2021_06_30 «Small Interventions to Support Student Self-regulation Online »
2021_06_30 «Small Interventions to Support Student Self-regulation Online »2021_06_30 «Small Interventions to Support Student Self-regulation Online »
2021_06_30 «Small Interventions to Support Student Self-regulation Online »
 

Plus de John Whitmer, Ed.D.

Integrating Research & Production Learning Analytics
Integrating Research & Production Learning AnalyticsIntegrating Research & Production Learning Analytics
Integrating Research & Production Learning AnalyticsJohn Whitmer, Ed.D.
 
Collaborative Research: Stealth Assessment of SE Skills w/Learning Analytics
Collaborative Research: Stealth Assessment of SE Skills w/Learning AnalyticsCollaborative Research: Stealth Assessment of SE Skills w/Learning Analytics
Collaborative Research: Stealth Assessment of SE Skills w/Learning AnalyticsJohn Whitmer, Ed.D.
 
Lessons Learned from Moodle VLE/LMS Data in the Field
Lessons Learned from Moodle VLE/LMS Data in the FieldLessons Learned from Moodle VLE/LMS Data in the Field
Lessons Learned from Moodle VLE/LMS Data in the FieldJohn Whitmer, Ed.D.
 
Learner Analytics Presentation to ATSC Committee
Learner Analytics Presentation to ATSC CommitteeLearner Analytics Presentation to ATSC Committee
Learner Analytics Presentation to ATSC CommitteeJohn Whitmer, Ed.D.
 
CSU System-wide Learning Analytics Projects
CSU System-wide Learning Analytics ProjectsCSU System-wide Learning Analytics Projects
CSU System-wide Learning Analytics ProjectsJohn Whitmer, Ed.D.
 
Learner Analytics Panel Session: Deja-Vu all over again?
Learner Analytics Panel Session:  Deja-Vu all over again? Learner Analytics Panel Session:  Deja-Vu all over again?
Learner Analytics Panel Session: Deja-Vu all over again? John Whitmer, Ed.D.
 
Logging on to Improve Achievement
Logging on to Improve AchievementLogging on to Improve Achievement
Logging on to Improve AchievementJohn Whitmer, Ed.D.
 
Many Hands Makes Light Work: Collaborating on Moodle Services and Development
Many Hands Makes Light Work:  Collaborating on Moodle Services and DevelopmentMany Hands Makes Light Work:  Collaborating on Moodle Services and Development
Many Hands Makes Light Work: Collaborating on Moodle Services and DevelopmentJohn Whitmer, Ed.D.
 
Learner Analytics: Hype, Research and Practice in moodle
Learner Analytics:  Hype, Research and Practice in moodleLearner Analytics:  Hype, Research and Practice in moodle
Learner Analytics: Hype, Research and Practice in moodleJohn Whitmer, Ed.D.
 
Learner Analytics and the “Big Data” Promise for Course & Program Assessment
Learner Analytics and the “Big Data” Promise for Course & Program AssessmentLearner Analytics and the “Big Data” Promise for Course & Program Assessment
Learner Analytics and the “Big Data” Promise for Course & Program AssessmentJohn Whitmer, Ed.D.
 
Learning Analytics: Realizing the Big Data Promise in the CSU
Learning Analytics:  Realizing the Big Data Promise in the CSULearning Analytics:  Realizing the Big Data Promise in the CSU
Learning Analytics: Realizing the Big Data Promise in the CSUJohn Whitmer, Ed.D.
 
The State and Future of Learning Management Systems Panel Presentation
The State and Future of Learning Management Systems Panel PresentationThe State and Future of Learning Management Systems Panel Presentation
The State and Future of Learning Management Systems Panel PresentationJohn Whitmer, Ed.D.
 
Learning Analytics: Realizing their Promise in the California State University
Learning Analytics:  Realizing their Promise in the California State UniversityLearning Analytics:  Realizing their Promise in the California State University
Learning Analytics: Realizing their Promise in the California State UniversityJohn Whitmer, Ed.D.
 
Learner Analytics: from Buzz to Strategic Role Academic Technologists
Learner Analytics:  from Buzz to Strategic Role Academic TechnologistsLearner Analytics:  from Buzz to Strategic Role Academic Technologists
Learner Analytics: from Buzz to Strategic Role Academic TechnologistsJohn Whitmer, Ed.D.
 
State of the CSU Learning Management Systems and Services
State of the CSU Learning Management Systems and ServicesState of the CSU Learning Management Systems and Services
State of the CSU Learning Management Systems and ServicesJohn Whitmer, Ed.D.
 
Current CSU LMS Activities:  Campus and Systemwide Strategies
Current CSU LMS Activities:  Campus and Systemwide StrategiesCurrent CSU LMS Activities:  Campus and Systemwide Strategies
Current CSU LMS Activities:  Campus and Systemwide StrategiesJohn Whitmer, Ed.D.
 
CSU Systemwide Learning Management Systems and Services
CSU Systemwide Learning Management Systems and ServicesCSU Systemwide Learning Management Systems and Services
CSU Systemwide Learning Management Systems and ServicesJohn Whitmer, Ed.D.
 
Faculty Development across the California State University System
Faculty Development across the California State University SystemFaculty Development across the California State University System
Faculty Development across the California State University SystemJohn Whitmer, Ed.D.
 
Partnership & Collaboration in Moodle Development: Making it Work
Partnership & Collaboration in Moodle Development: Making it WorkPartnership & Collaboration in Moodle Development: Making it Work
Partnership & Collaboration in Moodle Development: Making it WorkJohn Whitmer, Ed.D.
 
Migrating to Moodle: Lessons Learned from Recent CSU Migrations
Migrating to Moodle: Lessons Learned from Recent CSU MigrationsMigrating to Moodle: Lessons Learned from Recent CSU Migrations
Migrating to Moodle: Lessons Learned from Recent CSU MigrationsJohn Whitmer, Ed.D.
 

Plus de John Whitmer, Ed.D. (20)

Integrating Research & Production Learning Analytics
Integrating Research & Production Learning AnalyticsIntegrating Research & Production Learning Analytics
Integrating Research & Production Learning Analytics
 
Collaborative Research: Stealth Assessment of SE Skills w/Learning Analytics
Collaborative Research: Stealth Assessment of SE Skills w/Learning AnalyticsCollaborative Research: Stealth Assessment of SE Skills w/Learning Analytics
Collaborative Research: Stealth Assessment of SE Skills w/Learning Analytics
 
Lessons Learned from Moodle VLE/LMS Data in the Field
Lessons Learned from Moodle VLE/LMS Data in the FieldLessons Learned from Moodle VLE/LMS Data in the Field
Lessons Learned from Moodle VLE/LMS Data in the Field
 
Learner Analytics Presentation to ATSC Committee
Learner Analytics Presentation to ATSC CommitteeLearner Analytics Presentation to ATSC Committee
Learner Analytics Presentation to ATSC Committee
 
CSU System-wide Learning Analytics Projects
CSU System-wide Learning Analytics ProjectsCSU System-wide Learning Analytics Projects
CSU System-wide Learning Analytics Projects
 
Learner Analytics Panel Session: Deja-Vu all over again?
Learner Analytics Panel Session:  Deja-Vu all over again? Learner Analytics Panel Session:  Deja-Vu all over again?
Learner Analytics Panel Session: Deja-Vu all over again?
 
Logging on to Improve Achievement
Logging on to Improve AchievementLogging on to Improve Achievement
Logging on to Improve Achievement
 
Many Hands Makes Light Work: Collaborating on Moodle Services and Development
Many Hands Makes Light Work:  Collaborating on Moodle Services and DevelopmentMany Hands Makes Light Work:  Collaborating on Moodle Services and Development
Many Hands Makes Light Work: Collaborating on Moodle Services and Development
 
Learner Analytics: Hype, Research and Practice in moodle
Learner Analytics:  Hype, Research and Practice in moodleLearner Analytics:  Hype, Research and Practice in moodle
Learner Analytics: Hype, Research and Practice in moodle
 
Learner Analytics and the “Big Data” Promise for Course & Program Assessment
Learner Analytics and the “Big Data” Promise for Course & Program AssessmentLearner Analytics and the “Big Data” Promise for Course & Program Assessment
Learner Analytics and the “Big Data” Promise for Course & Program Assessment
 
Learning Analytics: Realizing the Big Data Promise in the CSU
Learning Analytics:  Realizing the Big Data Promise in the CSULearning Analytics:  Realizing the Big Data Promise in the CSU
Learning Analytics: Realizing the Big Data Promise in the CSU
 
The State and Future of Learning Management Systems Panel Presentation
The State and Future of Learning Management Systems Panel PresentationThe State and Future of Learning Management Systems Panel Presentation
The State and Future of Learning Management Systems Panel Presentation
 
Learning Analytics: Realizing their Promise in the California State University
Learning Analytics:  Realizing their Promise in the California State UniversityLearning Analytics:  Realizing their Promise in the California State University
Learning Analytics: Realizing their Promise in the California State University
 
Learner Analytics: from Buzz to Strategic Role Academic Technologists
Learner Analytics:  from Buzz to Strategic Role Academic TechnologistsLearner Analytics:  from Buzz to Strategic Role Academic Technologists
Learner Analytics: from Buzz to Strategic Role Academic Technologists
 
State of the CSU Learning Management Systems and Services
State of the CSU Learning Management Systems and ServicesState of the CSU Learning Management Systems and Services
State of the CSU Learning Management Systems and Services
 
Current CSU LMS Activities:  Campus and Systemwide Strategies
Current CSU LMS Activities:  Campus and Systemwide StrategiesCurrent CSU LMS Activities:  Campus and Systemwide Strategies
Current CSU LMS Activities:  Campus and Systemwide Strategies
 
CSU Systemwide Learning Management Systems and Services
CSU Systemwide Learning Management Systems and ServicesCSU Systemwide Learning Management Systems and Services
CSU Systemwide Learning Management Systems and Services
 
Faculty Development across the California State University System
Faculty Development across the California State University SystemFaculty Development across the California State University System
Faculty Development across the California State University System
 
Partnership & Collaboration in Moodle Development: Making it Work
Partnership & Collaboration in Moodle Development: Making it WorkPartnership & Collaboration in Moodle Development: Making it Work
Partnership & Collaboration in Moodle Development: Making it Work
 
Migrating to Moodle: Lessons Learned from Recent CSU Migrations
Migrating to Moodle: Lessons Learned from Recent CSU MigrationsMigrating to Moodle: Lessons Learned from Recent CSU Migrations
Migrating to Moodle: Lessons Learned from Recent CSU Migrations
 

Dernier

4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptxmary850239
 
Scientific Writing :Research Discourse
Scientific  Writing :Research  DiscourseScientific  Writing :Research  Discourse
Scientific Writing :Research DiscourseAnita GoswamiGiri
 
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptx
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptxDecoding the Tweet _ Practical Criticism in the Age of Hashtag.pptx
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptxDhatriParmar
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)lakshayb543
 
MS4 level being good citizen -imperative- (1) (1).pdf
MS4 level   being good citizen -imperative- (1) (1).pdfMS4 level   being good citizen -imperative- (1) (1).pdf
MS4 level being good citizen -imperative- (1) (1).pdfMr Bounab Samir
 
Reading and Writing Skills 11 quarter 4 melc 1
Reading and Writing Skills 11 quarter 4 melc 1Reading and Writing Skills 11 quarter 4 melc 1
Reading and Writing Skills 11 quarter 4 melc 1GloryAnnCastre1
 
Mythology Quiz-4th April 2024, Quiz Club NITW
Mythology Quiz-4th April 2024, Quiz Club NITWMythology Quiz-4th April 2024, Quiz Club NITW
Mythology Quiz-4th April 2024, Quiz Club NITWQuiz Club NITW
 
Narcotic and Non Narcotic Analgesic..pdf
Narcotic and Non Narcotic Analgesic..pdfNarcotic and Non Narcotic Analgesic..pdf
Narcotic and Non Narcotic Analgesic..pdfPrerana Jadhav
 
Multi Domain Alias In the Odoo 17 ERP Module
Multi Domain Alias In the Odoo 17 ERP ModuleMulti Domain Alias In the Odoo 17 ERP Module
Multi Domain Alias In the Odoo 17 ERP ModuleCeline George
 
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnvESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnvRicaMaeCastro1
 
Active Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfActive Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfPatidar M
 
How to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 DatabaseHow to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 DatabaseCeline George
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management systemChristalin Nelson
 
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptxMan or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptxDhatriParmar
 
Using Grammatical Signals Suitable to Patterns of Idea Development
Using Grammatical Signals Suitable to Patterns of Idea DevelopmentUsing Grammatical Signals Suitable to Patterns of Idea Development
Using Grammatical Signals Suitable to Patterns of Idea Developmentchesterberbo7
 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfVanessa Camilleri
 
ClimART Action | eTwinning Project
ClimART Action    |    eTwinning ProjectClimART Action    |    eTwinning Project
ClimART Action | eTwinning Projectjordimapav
 

Dernier (20)

4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx
 
Scientific Writing :Research Discourse
Scientific  Writing :Research  DiscourseScientific  Writing :Research  Discourse
Scientific Writing :Research Discourse
 
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptx
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptxDecoding the Tweet _ Practical Criticism in the Age of Hashtag.pptx
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptx
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 
MS4 level being good citizen -imperative- (1) (1).pdf
MS4 level   being good citizen -imperative- (1) (1).pdfMS4 level   being good citizen -imperative- (1) (1).pdf
MS4 level being good citizen -imperative- (1) (1).pdf
 
INCLUSIVE EDUCATION PRACTICES FOR TEACHERS AND TRAINERS.pptx
INCLUSIVE EDUCATION PRACTICES FOR TEACHERS AND TRAINERS.pptxINCLUSIVE EDUCATION PRACTICES FOR TEACHERS AND TRAINERS.pptx
INCLUSIVE EDUCATION PRACTICES FOR TEACHERS AND TRAINERS.pptx
 
Reading and Writing Skills 11 quarter 4 melc 1
Reading and Writing Skills 11 quarter 4 melc 1Reading and Writing Skills 11 quarter 4 melc 1
Reading and Writing Skills 11 quarter 4 melc 1
 
Mythology Quiz-4th April 2024, Quiz Club NITW
Mythology Quiz-4th April 2024, Quiz Club NITWMythology Quiz-4th April 2024, Quiz Club NITW
Mythology Quiz-4th April 2024, Quiz Club NITW
 
Faculty Profile prashantha K EEE dept Sri Sairam college of Engineering
Faculty Profile prashantha K EEE dept Sri Sairam college of EngineeringFaculty Profile prashantha K EEE dept Sri Sairam college of Engineering
Faculty Profile prashantha K EEE dept Sri Sairam college of Engineering
 
Narcotic and Non Narcotic Analgesic..pdf
Narcotic and Non Narcotic Analgesic..pdfNarcotic and Non Narcotic Analgesic..pdf
Narcotic and Non Narcotic Analgesic..pdf
 
Multi Domain Alias In the Odoo 17 ERP Module
Multi Domain Alias In the Odoo 17 ERP ModuleMulti Domain Alias In the Odoo 17 ERP Module
Multi Domain Alias In the Odoo 17 ERP Module
 
Paradigm shift in nursing research by RS MEHTA
Paradigm shift in nursing research by RS MEHTAParadigm shift in nursing research by RS MEHTA
Paradigm shift in nursing research by RS MEHTA
 
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnvESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
 
Active Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfActive Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdf
 
How to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 DatabaseHow to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 Database
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management system
 
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptxMan or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
 
Using Grammatical Signals Suitable to Patterns of Idea Development
Using Grammatical Signals Suitable to Patterns of Idea DevelopmentUsing Grammatical Signals Suitable to Patterns of Idea Development
Using Grammatical Signals Suitable to Patterns of Idea Development
 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdf
 
ClimART Action | eTwinning Project
ClimART Action    |    eTwinning ProjectClimART Action    |    eTwinning Project
ClimART Action | eTwinning Project
 

Using Learning Analytics to Assess Innovation & Improve Student Achievement

  • 1. Using Learning Analytics to Assess Innovation & Improve Student Achievement John Whitmer, Ed.D. john.whitmer@blackboard.com @johncwhitmer UK Learning Analytics Network Event (JISC) March 5, 2015 http://bit.ly/jwhitmer-jisc
  • 2. Quick bio 15 years managing academic technology at public higher ed institutions (R1, 4-year, CC’s) • Always multi-campus projects, innovative uses of academic technologies • Most recently: California State University, Chancellor’s Office, Academic Technology Services Doctorate in Education from UC Davis (2013) with Learning Analytics study on Hybrid, Large Enrollment course Active academic research practice (San Diego State Learning Analytics, MOOC Research Initiative, Udacity SJSU Study…)
  • 3. Meta-questions driving my research 1. How can we provide students with immediate, real-time feedback? (esp identify students at-risk of failing a course) 2. How can we design effective interventions for these students? 3. How can we assess innovations (or status quo deployments) of academic technologies? 4. Do these findings apply equally to students ‘at promise’ due to their background (e.g. race, class, family education, geography)
  • 4. Outline 1. Defining & Positioning Learning Analytics 2. A Few Empirical Research Findings • Understanding Contradictory Outcomes in a Redesigned Hybrid Course (Chico State) • Creating Accurate Learning Analytics Triggers & Effective Interventions (SDSU) 3. How we’re Applying this Research @ Blackboard 4. Discussion 4
  • 5. Economist. (2010, 11/4/2010). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy. The Economist. 5
  • 6. 200MBof data emissions annually Economist. (2010, 11/4/2010). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy. The Economist. 6
  • 7. Logged into course within 24 hours Interacts frequently in discussion boards Failed first exam Hasn’t taken college-level math No declared major 7
  • 8. What is learning analytics? Learning and Knowledge Analytics Conference, 2011 “ ...measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”
  • 9. Strong interest by faculty & students From Eden Dahlstrom, D. Christopher Brooks, and Jacqueline Bichsel. The Current Ecosystem of Learning Management Systems in Higher Education: Student, Faculty, and IT Perspectives. Research report. Louisville, CO: ECAR, September 2014. Available from http://www.educause.edu/ecar.
  • 10. Source: Educause and AIR, 2012 (2012), http://goo.gl/337mA
  • 11. 2. A Few Empirical Research Findings
  • 12. Study 1: Understanding Contradictory Outcomes in a Redesigned Hybrid Course (Chico State)
  • 13. Course redesigned for hybrid delivery in year-long program Enrollment: 373 students (54% increase largest section) Highest LMS usage entire campus Fall 2010 (>250k hits) Bimodal outcomes: • 10% increased SLO mastery • 7% & 11% increase in DWF Why? Can’t tell with aggregated reporting data Study Overview 54 F’s
  • 14. Grades Significantly Related to Access Course: “Introduction to Religious Studies” CSU Chico, Fall 2013 (n=373) Variable % Variance Total Hits 23% Assessment activity hits 22% Content activity hits 17% Engagement activity hits 16% Administrative activity hits 12% Mean value all significant variables 18%
  • 15. LMS Activity better Predictor than Demographic/Educational Variables Variable % Var. HS GPA 9% URM and Pell-Eligibility Interaction 7% Under-Represented Minority 4% Enrollment Status 3% URM and Gender Interaction 2% Pell Eligible 2% First in Family to Attend College 1% Mean value all significant variables 4% Not Statistically Significant Gender Major-College
  • 17. Activities by Pell and grade Grade / Pell-Eligible A B+ C C- 0K 5K 10K 15K 20K 25K 30K 35K Measure Names Admin Assess Engage Content Not Pell- Eligible Pell- Eligible Not Pell- Eligible Pell- Eligible Not Pell- Eligible Pell- Eligible Not Pell- Eligible Pell- Eligible
  • 18. Activities by Pell and grade Grade / Pell-Eligible A B+ C C- 0K 5K 10K 15K 20K 25K 30K 35K Measure Names Admin Assess Engage Content Not Pell- Eligible Pell- Eligible Not Pell- Eligible Pell- Eligible Not Pell- Eligible Pell- Eligible Not Pell- Eligible Pell- Eligible Extra effort In content-related activities
  • 19. Study 2: Creating Accurate Learning Analytics Triggers & Effective Interventions (SDSU)
  • 20. Study Overview • President-level initiative • Goal: identify effective interventions driven by Learning Analytics “triggers” • Multiple “triggers” (e.g., LMS access, Grade, Online Homework/Quiz, Clicker use) • At scale & over time: conducted for 3 terms, 5 unique courses, 3,529 students • “Gold standard” experimental design (control / treatment) 20
  • 21. Focus on High Need Courses 21 84% 62% 69% 78.5% 70.8% 16% 38% 31% 21.5% 29.2% 0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% ANTH 101 COMPE 270 ECON 101 PSY 101 STAT 119 Repeatable Grades Non- Repeatable Grades
  • 22. 1. Identify courses and recruit instructors 2. Prior to course start, review syllabus, schedule meaningful “triggers” for each course (e.g. attendance, graded items, Blackboard use, etc.) 3. Run reports in Blackboard, Online Homework/Quiz software to identify students with low activity or performance (~ weekly) 4. Send “flagged” student in experimental group a notification/intervention 5. Aggregate data, add demographic data. Analyze. Study Protocol 22
  • 23. Key Questions 1. Are triggers accurate predictors of course grade? 2. Do interventions (based on triggers) improve student grades? 3. Do these relationships vary based on student background characteristics?
  • 24. Frequency of interventions (Spring 2014). # Students receiving >0 interventions: PSY: 177 (84%); STAT: 165 (70%). [Histogram: number of interventions per student (0 to >10) for PSY and STAT]
  • 25. Frequency of interventions (Spring 2015). [Histogram: triggers activated per student (0-14) by course: Anthro1, Anthro3, Comp Engr, Econ4, Psych1, Psych2, Stats3, Stats4]
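For context on how the per-student counts behind these two histograms could be tabulated, here is a hedged sketch; the roster/log files and column names are assumptions.

```python
# Hypothetical tabulation: interventions per student, and share of students
# receiving at least one intervention, by course.
import pandas as pd

roster = pd.read_csv("roster.csv")          # hypothetical: course, student_id
log = pd.read_csv("intervention_log.csv")   # hypothetical: one row per message sent

counts = log.groupby(["course", "student_id"]).size().rename("interventions")
per_student = (roster.set_index(["course", "student_id"])
                     .join(counts)
                     .fillna({"interventions": 0}))

# Share of students with >0 interventions (cf. PSY 84%, STAT 70% in Spring 2014)
print(per_student["interventions"].gt(0).groupby("course").mean())

# Histogram input: number of students at each intervention count (0, 1, 2, ...)
print(per_student.groupby("course")["interventions"].value_counts().sort_index())
```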
  • 27. A Typical Intervention: “Concerned Friend” tone 27 … data that I've gathered over the years via clickers indicates that students who attend every face-to-face class meeting reduce their chances of getting a D or an F in the class from almost 30% down to approximately 8%. So, please take my friendly advice and attend class and participate in our classroom activities via your clicker. You'll be happy you did! Let me know if you have any questions. Good luck, Dr. Laumakis
  • 28. Poll question: Did triggers predict achievement? At what level of significance? How much variation in student grade was explained? A: Not significant; B: <10%, significant at .05 level; C: 20%, significant at .01 level; D: 30%, significant at .001 level; E: 50%+, significant at .0001 level
  • 29. Poll question (Spring 2014, Fall 2014): Did triggers predict achievement? At what level of significance? How much variation in student grade was explained? A: Not significant; B: <10%, significant at .05 level; C: 20%, significant at .01 level; D: 30%, significant at .001 level; E: 50%+, significant at .0001 level
  • 30. Learning analytics triggers vs. final course points (Spring 2014: 4 sections, 2 courses, 882 students). Psychology: p<0.0001, r2=0.4828; Statistics: p<0.0001, r2=0.6558
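For readers less familiar with the statistic: the r2 reported on these slides is the usual coefficient of determination, i.e. the share of variation in final course points accounted for by the trigger count. As a reference (the standard definition, not notation taken from the slides):

$$ r^2 \;=\; 1 - \frac{\sum_i \left(y_i - \hat{y}_i\right)^2}{\sum_i \left(y_i - \bar{y}\right)^2} $$

where \(y_i\) is a student’s final course points, \(\hat{y}_i\) is the value predicted from that student’s trigger count, and \(\bar{y}\) is the class mean. An r2 of 0.48 therefore means roughly 48% of the variation in final points is accounted for by the triggers.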
  • 31. Fall 2014 results: almost identical. 5 Sections, 3 Courses, N=1,220 students; p<0.00001, r2=0.4836
  • 32. Spring 2015 Results (tentative): weaker relationship. 8 Sections, 5 Courses, N=1,390 students; p<0.00001, r2=0.28
  • 34. Explained by differences between courses (Spring 2015 Results by Course)
Course: R2 (all triggers) / R2 (no grades)
Anthro1 (Online): 0.54 / 0.65
Anthro3 (In Person): not significant / not significant
Comp Eng: not significant / not significant
Econ4: not significant / not significant
Psych1: 0.58 / 0.4
Psych2: 0.41 / 0.23
Stat3: 0.2 / 0.11
Stat4: 0.33 / 0.21
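A hedged sketch of how the two R2 columns above could be produced, fitting each course separately with and without the grade-based triggers; the input file, column names, and feature lists are assumptions, not the study’s code.

```python
# Illustrative per-course comparison: R^2 using all triggers vs. R^2 with the
# grade-based trigger removed. Feature names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("triggers_spring2015.csv")  # hypothetical: one row per student

ALL_TRIGGERS = ["lms_access", "homework", "quizzes", "clicker", "graded_items"]
NO_GRADES = ["lms_access", "homework", "quizzes", "clicker"]  # graded items dropped

for course, group in df.groupby("course"):
    y = group["final_points"]
    for label, features in [("all triggers", ALL_TRIGGERS), ("no grades", NO_GRADES)]:
        model = LinearRegression().fit(group[features], y)
        print(f"{course} ({label}): R^2 = {model.score(group[features], y):.2f}")
```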
  • 35. So did the interventions make a difference in learning outcomes?
  • 36. Experimental Participation vs. Repeatable Grade (Spring 2014) [bar chart: Control (STAT): 79% passing / 21% repeatable; Exp. (STAT): 79% / 21%; Exp. (PSY): 89% / 11%; Control (PSY): 82% / 18%]
  • 37. Experimental Participation vs. Repeatable Grade, Pell-Eligible students (n=168, Spring 2014, PSY 101) [bar chart: No interventions (n=87): 77% passing / 23% repeatable; Interventions (n=81): 91% passing / 9% repeatable]. 24 additional Pell-eligible students would have passed the class if the intervention had been applied to all participating students.
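A rough back-of-the-envelope check of the 24-student figure, assuming the two observed repeatable-grade rates generalize to all 168 Pell-eligible students in the course:

$$ (0.23 - 0.09) \times 168 \approx 23.5 \approx 24 \ \text{students} $$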
  • 38. Fall 2014 / Spring 2015 Intervention Results: No Significant Difference Between Experimental/Control Groups. 38
  • 39. One Explanation: Low Reach
Fall 2014 (n = 1,220)
Course: # Triggers / Message Open Rate / Clickthrough Rate
Econ1: 8 / 76% / 36%
Psych1: 6 / 70% / 29%
Psych2: 7 / 69% / 35%
Stat3: 9 / 62% / 25%
Stat4: 8 / 65% / 27%
Grand Total: 38 / 68% / 30%
Spring 2015 (n = 1,138)
Course: # Triggers / Message Open Rate / Clickthrough Rate
Anthro-In Person: 17 / 57% / 10%
Anthro-Online: 7 / 71% / 35%
Comp Engineering: 15 / 52% / 14%
Econ: 15 / 44% / 13%
Psych1: 17 / 60% / 13%
Psych2: 17 / 63% / 13%
Stat3: 21 / 64% / 9%
Stat4: 20 / 55% / 5%
Grand Total: 129 / 58% / 12%
  • 40. Proposed Next Steps • Add interventions that move “beyond informing” students to address underlying study skills and behaviors – Supplemental Instruction <http://www.umkc.edu/asm/si/> – Adaptive Release within online courses (content, activities) 40
  • 41. Conclusions and Implications
1. Data from academic technology use predicts student achievement; diverse sources provide better predictions.
2. Technology use is a better predictor of course success than demographic data; adding demographic data provides nuanced understandings and identifies trends not otherwise visible.
3. Academic technology use is not a “cause” in itself, but reveals underlying study habits and behaviors (e.g. effort, time on task, massed vs. distributed activity).
4. Predictions are necessary, but not sufficient, to change academic outcomes. Research into interventions is promising.
5. We’re at an early stage in Learning Analytics; expect quantum leaps in the near future.
  • 42. 3. How we’re Applying this Research @ Blackboard
  • 43. Blackboard’s “Platform Analytics” Project: a new effort to enhance our analytics offerings across our academic technology applications that includes:
- Improved instrumentation for learning activity within applications
- Aggregated usage data across cloud applications (anonymized, rolled-up, privacy-compliant)
- Applied findings from analysis (inc. inferential statistics and data mining)
- Integrated analytics into user experiences (inc. student and faculty)
  • 47. 4. Wrap-Up and Discussion
  • 48. Factors affecting growth of learning analytics [2x2 map; axes: Enabler vs. Constraint, Widespread vs. Rare]. Factors plotted include: new education models; resources ($$$, talent); data governance (privacy, security, ownership); clear goals and linked actions; data valued in academic decisions; tools/systems for data co-mingling and analysis; academic technology adoption; low data quality (fidelity with meaningful learning); difficulty of data preparation; not-invented-here syndrome
  • 49. Call to action [with amendments] (from a May 2012 Keynote Presentation @ San Diego State U)
- You’re not behind the curve; this is a rapidly emerging area that we can (and should) lead... [together with interested partners]
- Metrics reporting is the foundation for analytics [don’t under- or over-estimate its importance]
- Start with what you have! Don’t wait for student characteristics and detailed database information; LMS data can provide significant insights
- If there are any ed tech software folks in the audience, please help us with better reporting! [we’re working on it and feel your pain!]
  • 50. Discussion / Questions John Whitmer, Ed.D. john.whitmer@blackboard.com @johncwhitmer http://bit.ly/jwhitmer-jisc

Editor's notes

  1. John
  2. John
  3. 2013 SP – Study began with two courses (N=~2,000) in Spring 2014. Weekly reports; triggered students were sent email and multimedia “interventions” (low/high intensity). Our hypothesis was that, as you will see, as the number of trigger events increased, so would the likelihood of having to repeat the course.
  4. We focused on high needs courses. 16-38% “Repeatable grades”
  5. James. Weekly reports; triggered students were sent email “interventions” (low intensity). Did people show up? i.e., clicker (participation/attendance points).
  6. Talking points: Almost ¾ of students got at least one trigger in each course. More PSY students got interventions than Stat students (b/c they were not completing homework). The pattern of the # of interventions in both courses is about the same: high up to 2-3, then it trails off. Interesting findings when you consider that the triggers were very different between courses (e.g. PSY had only 2 graded items; PSY: Online Homework, Stat: Online Quizzes, etc.).
  7. James: most recent semester: different trajectory by course, although we see the trail-off pattern over time, with most of the triggers in a small range (1-3). But there are significant differences between courses in the number of triggers and the number of students “activating” a trigger. Show differences as we have expanded the study over time.
  8. James. For the experimental students, we have expanded the interventions over time to use different modalities, but the focus of all interventions has been on increasing students’ awareness of their at-risk behaviors/status, so that they can do something about it (self-regulate).
  9. Message written by the instructor, with strong attention to tone: a “concerned friend”, which we thought would be more effective with students. It also includes serious results that might resonate with students and make them take notice.
  10. John
  11. John
  12. These graphs illustrate that DECREASES in triggers are related to INCREASES in student grade. (Explanation: each dot is a student; the Y axis is total points (lower to higher), and the X axis is the total # of triggers (higher to lower).) Statistically significant results for both courses; the probability that the result is due to chance is less than 1 in 10,000. The size of the effect differs: PSY: triggers explain 48% of variation in final grade; STAT: triggers explain 66% of variation in final grade (if graded items are removed from Stat, triggers explain 49%).
  13. John
  14. John
  15. John: Million-dollar question: did this reduce students’ repeatable grade rates?
  16. John
  17. James. Definition: Supplemental Instruction (SI) is an academic assistance program that utilizes peer-assisted study sessions. SI sessions are regularly scheduled, informal review sessions in which students compare notes, discuss readings, develop organizational tools, and predict test items. Students learn how to integrate course content and study skills while working together. The sessions are facilitated by “SI leaders”, students who have previously done well in the course and who attend all class lectures, take notes, and act as model students.
Purpose: to increase retention within targeted historically difficult courses; to improve student grades in those courses; to increase students’ graduation rates.
Participants: SI is a free service offered to all students in a targeted course. It is a voluntary, non-remedial approach to learning: the program targets high-risk courses rather than high-risk students, so there is no remedial stigma attached. Students with varying levels of academic preparedness and diverse ethnicities participate, and all students are encouraged to attend SI sessions.
  18. John – first two points (provide examples of academic technology use when explaining the first bullet). James – last two points (because so much of a student’s participation and performance in a class is logged via academic technology tools, this provides us with evidence of student behavior that reveals patterns which can predict struggling or successful students).
  19. More and better data: More data sources, less structured data, integrated data, potential for expansion of data sources in future. Unlock meaning from data that’s meaningful/available. Intended for multiple audiences: Blackboard internal development, CIOs/administrative customers interested in metrics, faculty/student customers interested in learning/teaching, researchers interested in underlying relationships.