
Highlights:

  • Monday: Facilitated a half day iPad session that was a lot of fun
  • Tuesday: Facilitated a “Birds of a Feather” lunch table on Learning Outcomes Assessment. Had a great conversation and learned that our colleague’s CT community college was the only school at the table with anything like systematic or comprehensive learning outcomes & assessment criteria defined across the curriculum. Enjoyed a session on a three-nation marketing course, a session from UNH on the future of student computer labs, and an unconference session.
  • Wednesday: A great faculty story from BU about shifting to an active-learning stance, a WordPress ePortfolio poster from Granite State College, and “What the MOOC?”

Effective Practices in Teaching with Technology at Tufts

1. Trunk Forums to Enhance Reflective Learning:
Jonathan Garlick, D.D.S., Ph.D., Professor, Oral Pathology, Director, Division of Tissue Engineering and Cancer Biology, School of Dental Medicine – using Trunk forums to help students go deeper with their learning and to foster reflective learning. A pyramid integrating foundational science literacy, “real life” implications, broader impacts, and a reflective interpersonal perspective > Outcomes (Human potential, Human values, Informed citizenship). How: a weekly discussion topic – everyone does the same reflective reading, has a pre-class conversation, and brings it into class.

2. The Development of Video-Based Clinical Ultrasound Teaching Tools for Veterinary Students:
James Sutherland-Smith, BVSc., DACVR, Assistant Professor, Clinical Sciences, Cummings School of Veterinary Medicine – great use of video & illustration to improve instruction by creating dual-view videos (ultrasound image & external camera, with voiceover). Lovely use of Prezi for the presentation. Proving the educational value would be a future opportunity, but student feedback was very positive.

3. Implementing Technologies for Blended Learning:
Libby Bradshaw, DD, MS, Academic Director, Master of Science/Certificate Program, Pain Research, Education and Policy Program (PREP), Public Health and Professional Programs, Public Health and Community Medicine, School of Medicine – a program-level presentation on how they have moved forward with their blend: half of the overall coursework delivered face to face, half online, using Echo capture, Jabber video, Trunk & TUSK.

 

Feb 25, 2013:

As usual, my “real” life has dashed my high hopes for my own active participation in this MOOC. HOWEVER, I always get a lot out of just seeing how the course is designed & delivered, and in this case George also generously referred us to an intro stats course (on a competing MOOC platform, edX). The referral came before I had even realized that I needed the additional background, which I consider teaching genius, and it turned out to be spot on, so I’ve been devoting my leisure time to that course instead, hoping that the stats background will make my eventual Learning Analytics work both more enjoyable & more productive. Because of this, I’m eliminating the separate page on my blog for this course & putting this into a post along with the rest of my explorations.

I have also noticed that I’m starting to have trouble remembering where each MOOC is – edX, Coursera, Canvas, oh my! A student portal & portfolio to aggregate & display all MOOC accomplishments would be a great tool for someone to develop (or to explain good ways to create with existing tools).
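Here is a minimal sketch of the kind of record such an aggregator might keep – the class name, fields, and sample data are all invented for illustration, not from any existing tool:

```python
# Hypothetical sketch of a cross-platform MOOC portfolio record;
# every field and value here is invented for illustration.
from dataclasses import dataclass

@dataclass
class MoocRecord:
    platform: str  # "edX", "Coursera", "Canvas Network", ...
    course: str
    status: str    # "enrolled", "completed", "certificate"
    url: str

portfolio = [
    MoocRecord("Canvas Network", "Learning Analytics and Knowledge 2013",
               "enrolled", "https://learn.canvas.net/courses/33"),
    MoocRecord("edX", "Intro Stats", "enrolled", "https://www.edx.org/"),
]

# One page listing every accomplishment, grouped by platform.
for rec in sorted(portfolio, key=lambda r: r.platform):
    print(f"{rec.platform}: {rec.course} ({rec.status}) – {rec.url}")
```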

Feb 11, 2013:

Well, this is a big step…the “Learning Analytics and Knowledge 2013” MOOC looks so promising from today (first day!) that I’ve decided to create a WHOLE PAGE for it!

What I like so far is:

  • Nice video intro from George, which I can listen to in less than 5 minutes while
  • Browsing the course website: https://learn.canvas.net/courses/33/wiki/front-page
  • Course site includes great intro materials, including:
    • an overview of the course that I can quickly understand
    • course outline
    • expectation setting: total time PLUS explanation of centralized & decentralized elements.
    • clear info & links to related elements (Twitter hashtag #LAK13, Diigo group, a few others) – quite a variety, but not too many.

Overall, pretty exciting start – I got my first 10 mins worth & hope I can come back to it soon for more!

Great talk this morning at MIT:

Using Big Data to discover tacit knowledge & improve learning

Ken Koedinger, Prof of Human-computer interaction & psychology, Carnegie Mellon University
CMU Director of LearnLab, Pittsburgh Science of Learning Center

http://www.learnlab.org

Recent PowerPoint (from a similar talk) with good notes:

“Why is it that science & tech have not improved education as they have medicine and transportation? A root cause is that we, the general public, educators, policy makers, do not fully appreciate the complexity of learning and instruction.   Learning is really much more complex than our conscious experience of it would suggest.

Our over-estimated sense that we understand our own learning leads us astray in making educational decisions — it yields a tendency toward the quick fix or one size fits all solution. … The good news is that there is so much S&T can do to better understand learning and to greatly improve instruction.  We first need to accept that we do not know what we know!”

Introduced by Lori Breslow, Director of the Teaching and Learning Laboratory at MIT. Ken’s lecture was recommended to her as one of the best lectures about online learning; it was offered as part of the MIT DUET Seminar Series.

—–

Ken’s talk:

Full notes in PDF format, including some pretty bad photos (but taking notes on the iPad & inserting photos into them is so fun!).

Most of what we know is tacit > learning based on intuition is flawed (great argument overview slide, see if I can get a copy).

Chick sexing – experts can sex chicks but can’t explain how, and it takes a while to learn (Biederman & Shiffrar, “Sexing Day-Old Chicks”); a half page of instruction improved the learning curve. E.g., we know English, but we don’t know what we know. Experts can describe less than 30% of what they know > major design implications. Cognitive Task Analysis (Lee 2004 meta-analysis) improves instruction.

Teachers don’t know what they know (e.g., story problem vs. word problem vs. equation) – math teachers (and the rest of us) think the story problem is hardest for students, but the equation is (for beginning algebra students). Students have trouble with the symbolic language of the equation. >> Expert Blind Spot – not the teachers’ fault, but problematic for instructional design. (Difficulty Factors Assessment)

From a textbook model to an inductive support model. Algebra Cognitive Tutor: interactive support for learning by doing – within an activity: authentic problems, feedback within complex solutions, challenging questions, personalized instruction (specific to the student’s need at the moment); between activities: progress & an individualized pointer to the next problem.
Model tracing (like AI plan recognition): represent all correct paths; everything else is an error. Knowledge tracing: assess knowledge growth, driving activity selection & pacing. Use the data you are collecting to measure the effectiveness of the model. Some verifiable results (some null too); 600k students using it per year, 80 mins per week. (Used in Algebra, Chemistry, English, Games)
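For reference, a minimal sketch of the standard Bayesian Knowledge Tracing update that usually underlies “knowledge tracing” (Corbett & Anderson’s formulation) – the parameter values below are illustrative placeholders, not numbers from the talk:

```python
# Minimal Bayesian Knowledge Tracing (BKT) sketch.
# Parameter values are illustrative placeholders, not from the talk.

def bkt_update(p_known, correct,
               p_slip=0.10,   # P(wrong answer | skill known)
               p_guess=0.20,  # P(right answer | skill unknown)
               p_learn=0.15): # P(acquiring the skill on this step)
    """Update P(skill known) after observing one student response."""
    if correct:
        evidence = p_known * (1 - p_slip)
        total = evidence + (1 - p_known) * p_guess
    else:
        evidence = p_known * p_slip
        total = evidence + (1 - p_known) * (1 - p_guess)
    posterior = evidence / total
    # Allow for the chance the student learned the skill on this step.
    return posterior + (1 - posterior) * p_learn

# Example: trace estimated mastery over a sequence of responses (1 = correct).
p = 0.3  # initial P(known)
for outcome in [1, 0, 1, 1, 1]:
    p = bkt_update(p, outcome)
    print(f"P(known) = {p:.2f}")
```

The running estimate of P(known) is exactly what can drive activity selection & pacing: once it crosses a mastery threshold (0.95 is a common choice), the tutor moves the student on.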
EdTech + wide use = Research in practice
Studies run for a couple weeks, designed to test a specific item/ change.
Interaction data is surprisingly revealing (Worcester Polytechnic) – see the sketch after this list:
  • accurate prediction of MCAS scores
  • detect student work ethic & engagement
  • discover better models of what’s hard to learn
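To make the idea concrete, a toy sketch of score prediction from interaction features – the feature names, numbers, and model choice are all invented for illustration; the WPI work used much richer data and models:

```python
# Toy sketch: predicting a standardized test score from per-student
# interaction features. All features and numbers are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

# Rows = students; columns = hints requested, attempts per problem,
# seconds per problem, problems completed.
X = np.array([
    [12, 1.4, 45, 210],
    [ 3, 1.1, 60, 340],
    [25, 2.3, 30, 150],
    [ 8, 1.2, 55, 290],
])
y = np.array([228, 252, 214, 246])  # MCAS scaled scores (invented)

model = LinearRegression().fit(X, y)
print(model.predict([[10, 1.5, 50, 250]]))  # predicted score for a new student
```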

Analysis of the Open Learning Initiative data set from a stats course. In algebra, the real challenge is learning to model the problem, not solving equations.

“Sciences of the Artificial.” The task, not the cognitive process, drives the learning. The inherent difficulty in the task is not obvious, so instruction is misguided.

Need to design the instruction to get at the underlying tacit knowledge: Learning Factors Analysis.
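Learning Factors Analysis is usually built on the Additive Factors Model – a logistic regression predicting P(correct) from student proficiency, per-skill difficulty, and practice opportunities. A minimal sketch, with all parameter values invented:

```python
# Sketch of the Additive Factors Model (AFM) behind Learning Factors
# Analysis. All parameter values here are invented for illustration.
import math

def afm_p_correct(theta, skills, beta, gamma, opportunities):
    """P(correct) for one step.

    theta:         student proficiency
    skills:        skill ids this step exercises
    beta:          per-skill easiness
    gamma:         per-skill learning rate
    opportunities: per-skill count of prior practice
    """
    logit = theta
    for k in skills:
        logit += beta[k] + gamma[k] * opportunities[k]
    return 1 / (1 + math.exp(-logit))

# Example: one student, one skill, after 5 practice opportunities.
print(afm_p_correct(0.2, ["slope"], {"slope": -1.0},
                    {"slope": 0.3}, {"slope": 5}))
```

Fitting the betas and gammas across many students is what surfaces which skills are actually hard – i.e., the tacit difficulty the task hides.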

  • Traditional college course: ≥100 hours, ~3% learning gain
  • Adaptive data-driven course: ≤50 hours, ~18% learning gain (roughly 12× the gain per hour)

Experts in the domain/field need to agree in order to make this all possible.

 

ELI day 3

Using Analytics to Examine Course Design and Identify Effective Practices
Karin Readel, Director of Instructional Technology, University of Maryland, Baltimore County
readel@umbc.edu

Their public BB reports:
http://www.umbc.edu/oit/newmedia/blackboard/stats/

They wanted to know about the impact of their faculty training.

About their training program:
* Alternate Delivery Program (ADP)
* for moving from f2f to online/hybrid
* one time $2500 stipend
* 1 day “hybrid workshop”
* self-reflection on pedagogical problem
* interview with IT & FDC staff
* Create & present 2 online activities
* teach redesigned course
* evaluate the ADP process

Questions:
* are there differences in course measures?
* does the training make a difference?
* can we use BA4L (Blackboard Analytics for Learn) data to identify effective practitioners?
* what other non-LMS data can be included in the analysis?

Measures looked at:
* “Average # of interactions” (in the LMS) – for all faculty & for trained faculty, across f2f, hybrid, and online
* avg. amount of content, avg. course accesses (by students), avg. min/access (about the same for all faculty & trained)
* put it all together to see the effect – avg. content / avg. accesses / avg. min/access (across f2f all, f2f trained, hybrid all, hybrid trained, online all, online trained)
* % of courses using particular features (content, % accesses, grade center tools, …)

BB defines an “interaction” as anything a student does in a class (clicks/page views/submissions). (A high count could also indicate poor design – e.g., bad folder depth, or not using the grade center, etc.)
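For the curious, a sketch of the kind of aggregation behind these measures, assuming a flat export of LMS click events – the file and column names are invented, not UMBC’s actual schema:

```python
# Sketch of aggregating "average # of interactions" from LMS event
# logs. File and column names are invented for illustration.
import pandas as pd

events = pd.read_csv("bb_events.csv")  # one row per click/page view/submission
courses = pd.read_csv("courses.csv")   # course_id, mode (f2f/hybrid/online), trained

interactions = (events.groupby("course_id")
                      .size()
                      .rename("interactions")
                      .reset_index())
df = interactions.merge(courses, on="course_id")

# Average interactions per course by delivery mode,
# for all faculty vs. ADP-trained faculty.
print(df.groupby("mode")["interactions"].mean())
print(df[df["trained"]].groupby("mode")["interactions"].mean())
```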

Cross-mapped projector-use data with course schedule data to better allocate the limited number of projector-enabled classrooms. (Some rooms are monitored by GVE, web-based AV management software.)
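A hypothetical sketch of that cross-mapping – join the AV usage log against the room schedule and flag courses that occupy projector rooms without using one (all file and column names invented):

```python
# Hypothetical sketch: find courses booked into projector-enabled
# rooms that never turn the projector on. All names are invented.
import pandas as pd

projector_use = pd.read_csv("gve_projector_log.csv")  # room, date, minutes_on
schedule = pd.read_csv("course_schedule.csv")         # room, meeting_date, course_id

merged = schedule.merge(projector_use,
                        left_on=["room", "meeting_date"],
                        right_on=["room", "date"],
                        how="left")

# Total projector minutes per course; courses at zero are candidates
# to move, freeing the room for a course that needs it.
minutes = merged.groupby("course_id")["minutes_on"].sum()
print(minutes[minutes == 0])
```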

Next steps:
* look at impact of training over time on specific faculty
* look at impact of “redesign” on specific courses
* add clicker & projector use data to the BB Analytics cube so she can compare it all directly

Are All Mobile Devices Equal When It Comes to Teaching and Learning?
Robbie Kendall-Melton, Associate Vice Chancellor for Academic Affairs: eLearning, Tennessee Board of Regents

MyScript Calculator app, Toca Boca

http://emergingtech.tbr.edu/

ELI 2013, day 2

Learning Sciences and Learning Analytics: Time for a Marriage – Sponsored by Starin
Roy Pea: http://www.stanford.edu/~roypea/index.html
David Jacks Professor of Education and the Learning Sciences

Informed by the learning sciences: Bransford; “The Cambridge Handbook of the Learning Sciences”; “Learning Science in Informal Environments”; Science magazine

Journal of the Learning Sciences (+ another journal)
Int’l Society of the Learning Sciences: http://www.isls.org

National Tech Plan (Pea, David Rose, others…)
http://www.ed.gov/technology/netp-2010

Calls for connected, personalized learning.

Priorities:

  • Develop interconnected learning maps
  • Design for social learning
  • Educate for broader competencies
  • Use richer pedagogical models
  • Better understand learner goals
  • Forge interdisciplinary teams

An urgent priority: build learning maps & competencies to allow personalized learning, like the Common Core initiative: http://www.corestandards.org/ (Pea showed a great visual map, but I can’t seem to find it…)

lots of folks are working on learning maps, but they are unconnected

Shared Learning Collaborative (recently renamed “inBloom”): https://www.inbloom.org/

most fundamental transformation is that teachers can draw on shared, interoperable platforms.

we need interdisciplinary teams > a collaboratory for online learning

improvement in post-secondary ed will require converting teaching from a “solo sport” into a community-based research activity

Enhancing Teaching and Learning through Educational Data Mining and Learning Analytics
http://www.ed.gov/edblogs/technology/files/2012/03/edm-la-brief.pdf

Recent NSF Dear Colleague letter

Int’l Educational Data Mining Society
Society for Learning Analytics Research

—-

Student Use of Digital Resources: Implications for Learning and Technology Support – a nice story about how their research developed. A good presentation to possibly revisit.
Glenda Morgan, Dir. ATS & eLearning Strategist, University of Illinois at Urbana-Champaign

Annual grant program,
http://oia.arizona.edu/content/online-education-project-rfp
funded out of the student tech fee
$10,000 to faculty to develop individual online & hybrid courses
Based on student priorities (bottleneck classes, summer, winter, etc. – whatever classes students want to see more of, maybe the most over-subscribed classes)
Anyone can apply – any instructor (including adjuncts and grad students)

Project leaders (aka faculty) complete a “readiness to teach” diagnostic (theirs comes from Penn State)
Assigned ID consultant (out of 5 in total, they split the 27 projects)
PL required to attend a min of 3 prof dev activities (fairly flexible)
3-day summer intensive (want opportunities to share with & learn from each other)
Course is assessed with Quality Matters criteria & rubric
Mid-semester evaluations are completed
Final course evaluations (faculty & student) are conducted
Final reports are submitted (with budget component)

Leveraging similarities to overcome differences by creating communities.

If faculty are highly rated on these three factors, they always get a high rating overall (zero probability of a low rating):

  • Facilitate learning
  • Ability to communicate
  • Respect & concern for students

10 commandments (but there are only 8 below) – Chuck Dziuban, rite.ucf.edu, www.if.ucf.edu

  1. good, fast, cheap – you can have 2, but not 3
  2. don’t let tech drive the evaluation
  3. stats & measurement are great, but are not everything – “if you want people to believe something that is really, really stupid, stick a number on it”
  4. push assessment & flexibility out (don’t sit on your data)
  5. a great product makes a huge difference (it’s hard to evaluate junk)
  6. just because you can, doesn’t mean you should (e.g., studying class size – variables can bite!)
  7. progress is best made in simple steps
  8. you have to make what you’re talking about relevant to others’ worlds

Scholarship of Teaching & Learning – faculty in specific disciplines build evaluations for their areas.

« Newer Posts - Older Posts »