Using Analytics to Examine Course Design and Identify Effective Practices
Karin Readel, Director of Instructional Technology, University of Maryland, Baltimore County
UMBC makes its Blackboard (BB) usage reports public.
They wanted to know about the impact of their faculty training program.
About their training program:
* Alternate Delivery Program (ADP)
* for moving courses from face-to-face (f2f) to online/hybrid delivery
* one-time $2,500 stipend
* 1 day “hybrid workshop”
* self-reflection on pedagogical problem
* interview with IT & Faculty Development Center (FDC) staff
* create & present 2 online activities
* teach redesigned course
* evaluate the ADP process
Research questions:
* are there differences in course measures?
* does the training make a difference?
* can we use Blackboard Analytics for Learn (BA4L) data to identify effective practitioners?
* what other non-LMS data can be included in the analysis?
Measures looked at:
* “Average # of interactions” (in the LMS), for all faculty and for ADP-trained faculty, across f2f, hybrid, and online courses
* average amount of content, average course accesses (by students), average minutes per access (about the same for all faculty vs. trained)
* put it all together to see the effect – avg content / avg accesses / avg minutes per access (across f2f all, f2f trained, hybrid all, hybrid trained, online all, online trained)
* % of courses using particular features (content, % accesses, grade center tools, etc.)
BB defines an “interaction” as anything a student does in a course (click, page view, submission). Unusual interaction counts can flag poor course design – e.g. folders nested too deeply, or the grade center not being used.
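The kind of aggregation described above could be sketched as follows. This is not UMBC's actual pipeline; the course records, field names, and values are hypothetical, standing in for an export from the analytics system.

```python
from collections import defaultdict

# Hypothetical export: one row per course with its total LMS interaction
# count. Modality and ADP-training flags mirror the groupings in the notes.
courses = [
    {"course": "BIO100",  "modality": "f2f",    "adp_trained": False, "interactions": 1200},
    {"course": "BIO300",  "modality": "hybrid", "adp_trained": True,  "interactions": 4800},
    {"course": "CHEM101", "modality": "online", "adp_trained": True,  "interactions": 6400},
    {"course": "HIST210", "modality": "hybrid", "adp_trained": False, "interactions": 2400},
]

def avg_interactions(rows):
    """Average # of interactions per course, grouped by (modality, trained?)."""
    totals = defaultdict(lambda: [0, 0])  # key -> [sum, course count]
    for r in rows:
        key = (r["modality"], r["adp_trained"])
        totals[key][0] += r["interactions"]
        totals[key][1] += 1
    return {k: s / n for k, (s, n) in totals.items()}

print(avg_interactions(courses))
```

The same grouping pattern extends to the other measures (content items, accesses, minutes per access) by swapping the summed field.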
Cross-mapped projector-use data with course schedule data to better allocate the limited number of projector-enabled classrooms. (Some rooms are monitored by GVE, web-based AV management software.)
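The cross-mapping amounts to joining two datasets on room (and section): the course schedule and the AV system's usage log. A minimal sketch, assuming hypothetical field names and sample records rather than the actual GVE export format:

```python
# Hypothetical course schedule: which section meets in which room.
schedule = [
    {"section": "MATH101-01", "room": "SOND 109"},
    {"section": "ENGL200-02", "room": "SOND 109"},
    {"section": "PHYS121-01", "room": "ITE 241"},
]

# Hypothetical projector-usage events exported from the AV management system.
projector_events = [
    {"room": "SOND 109", "section": "MATH101-01", "minutes_on": 50},
]

# Join on (room, section): flag sections scheduled into a projector-enabled
# room that never actually turned the projector on - candidates for
# reallocation to an ordinary classroom.
used = {(e["room"], e["section"]) for e in projector_events}
no_projector_use = [s["section"] for s in schedule
                    if (s["room"], s["section"]) not in used]
print(no_projector_use)
```

In practice this join would run over a full term of events, but the reallocation logic is the same: scheduled-but-unused pairs free up scarce projector rooms.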
Next steps:
* look at the impact of training over time on specific faculty
* look at the impact of “redesign” on specific courses
* add clicker & projector-use data to the Bb Analytics cube so she can compare it all directly
Are All Mobile Devices Equal When It Comes to Teaching and Learning?
Robbie Kendall-Melton, Associate Vice Chancellor for Academic Affairs: eLearning, Tennessee Board of Regents