Thursday, April 30, 2015

"TBLC Growing Around the World"

TBL is growing worldwide at a remarkable rate. Educators and administrators are realizing the tremendous benefits that TBL offers in terms of student engagement and proven learning outcomes. As a result, since I officially retired last May, I have been traveling widely to deliver TBL workshops. In just the last six months I've worked with faculty in the US, Chile, Colombia, the Netherlands, Denmark, Norway, Singapore, the UK and Australia. During these visits, I have delivered workshops to mixed-discipline faculty groups as well as to specific faculties, including engineering, pharmacy, medicine and education, and even to the national training staff of the Boy Scouts of America and to primary and secondary teachers at the Singapore Ministry of Education.

One of the most memorable experiences from these visits was observing the implementation of TBL in a seventh-grade math class at a Singapore normal school. What struck me in particular was seeing a student with ADHD in that TBL class become engaged with math to the point where he was able to tutor other students. Another example of the power of the TBL process was the observation by the University of Bradford (UK) that, after the school-wide introduction of TBL, student requests for two kinds of support had dropped dramatically. Requests to the disability office for special accommodations (e.g. extra time for exams, a quiet test-taking environment, etc.) had almost completely disappeared; students with disabilities no longer needed the extra help. Requests for individual tutoring support related to coursework had also almost vanished.

Another very interesting project I have been working on is the introduction of TBL across the University of South Alabama. This is being carried out as a carefully planned and well-implemented change process that started in 2011 and is meeting its objectives of improving student retention and learning. The process and its outcomes are being documented in a paper that will soon be available. For example, dropout rates in the more than 150 classes taught with TBL were less than half of the rates in comparable non-TBL courses, there were 38% fewer D's and F's, and critical thinking scores were higher. I think the experience documented in the paper will provide a very good model for other institutions that want to address the universal challenges of improving student learning and the classroom experience.

-Larry Michaelsen, PhD

Wednesday, April 29, 2015

2015 TBLC Research Grant Award Recipients


Dear TBLC Members,
The TBLC supports and encourages research and scholarship in Team-Based Learning. To help its members pursue opportunities for educational scholarship, the TBLC will provide funding to initiate new educational research or evaluation proposals in 2015-2016. Several applications were received for this year's research grant, and we would like to recognize this year's recipients.

David Rodenbaugh
Oakland University William Beaumont School of Medicine
“A structured faculty review process within Team-Based Learning Peer Feedback/Evaluation: Impact on student team dynamics and quality of narrative feedback”
 

Wei-Hsin Lu
Stony Brook University Medical Center
“A Hybrid Approach: Effectiveness of an Interprofessional Patient Safety/Quality Care Team-Based Learning Simulation Experience on In-Training Healthcare Professionals”
 

Preman Rajalingam
Lee Kong Chian School of Medicine
“Impact of Teamwork Skills Training on Teamwork Quality in Team-Based Learning (TBL) Setting”
 

Each applicant and their research team will receive funding to carry out their project.

We congratulate all the teams and look forward to hearing their findings at our TBLC conference in years to come.

Thank you,
Dr Judy Currey
Chair, TBLC Scholarship Committee

Thursday, April 23, 2015

Peer Evaluation: Development of an Efficient Tool for Effective Feedback

Holding individual students accountable for their learning is critical in collaborative, cooperative or team-based learning. Though not all group learning is alike, peer evaluation is the accountability method used in team-based learning. However, the peer assessment process can be cumbersome for students and faculty alike.

After more than fifteen years of using paper-based student peer evaluation forms, we found the paper process increasingly unwieldy as class sizes grew. Time constraints limited our ability to record and analyze student peer evaluations and communicate results to individuals in an efficient and timely manner. Yet peer evaluation is critical to maintaining the integrity of the team-based learning process. Consequently, we searched for an alternative that would improve the efficiency of the process and, we hoped, its effectiveness. Using Google Drive™, a readily available and inexpensive platform, we developed an electronic survey (Google Form™) based on our original paper-based peer evaluation form.

Students access the survey from a secure online platform to maintain peer-to-peer anonymity and to meet FERPA guidelines. Students rate each peer's performance in eight categories using a 1-5 Likert scale, and provide written comments and an overall performance grade for each peer on a scale of 1-10. The data are downloaded as an Excel file, eliminating transcription errors, then analyzed, and an individual report is returned to each student via Blackboard.
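
As an illustration of the analysis step, here is a minimal sketch in Python/pandas of how the exported spreadsheet might be rolled up into a per-student summary. The column names (evaluatee, category_1 through category_8, overall_grade, comments) and the file name are hypothetical placeholders rather than the actual layout of our form export, and the sketch is illustrative only, not the tool itself.

# Hypothetical sketch only: aggregate exported peer-evaluation responses into
# one summary per student. Column names and the file name are assumptions,
# not the actual layout of the Google Form export.
import pandas as pd

CATEGORY_COLS = [f"category_{i}" for i in range(1, 9)]  # the eight 1-5 Likert items

def summarize_peer_evaluations(xlsx_path: str) -> pd.DataFrame:
    """Return one row per evaluated student with mean ratings and pooled comments."""
    responses = pd.read_excel(xlsx_path)  # file downloaded from the Google Form
    grouped = responses.groupby("evaluatee")
    summary = grouped[CATEGORY_COLS + ["overall_grade"]].mean().round(2)
    # Pool the anonymous written comments into a single block per student.
    summary["comments"] = grouped["comments"].apply(
        lambda s: "\n".join(c for c in s.dropna() if c.strip())
    )
    return summary.reset_index()

if __name__ == "__main__":
    report = summarize_peer_evaluations("peer_evaluations.xlsx")
    # Each row could then be formatted as an individual report for upload to Blackboard.
    for _, row in report.iterrows():
        print(row["evaluatee"], row["overall_grade"])

Each row of the resulting table corresponds to one student's feedback and can be dropped into whatever report template is returned through Blackboard.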

A time and motion study comparing the paper-based and electronic evaluation tools was conducted in AGRI 2317 (Principles of Agricultural Economics) during the Fall 2013 semester. The time required to analyze the data and respond to 70 students was reduced by 72% with the electronic tool, which allowed more timely feedback to students. Ninety-six percent of students reported a preference for the electronic format over the paper instrument. Adjusting and redeploying the instrument for the same course in Spring 2014 took less than 5% of the time required to develop the original electronic instrument.

By implementing an electronic-based student peer evaluation instrument, instructors using team-based learning can more efficiently provide quantitative and qualitative feedback to students. More specifically:
Data output from the electronic peer evaluation format provides a much more manageable data set for analysis.

Once the initial Google Form™ has been developed, subsequent forms can easily be created for future offerings and other classes.

Efficiencies gained through the electronic peer evaluation process allow much faster turnaround in returning feedback to students.

Student feedback indicates a preference for this type of private assessment of their peers.

We continue to fine-tune the instrument to increase user-friendliness. If you are interested in gaining access to the instrument and general instructions on how to use the electronic-based peer evaluation tool, please contact Kyle Ferrell (kwf@shsu.edu) or Foy Mills (foymills@shsu.edu).

Kyle Ferrell, Foy Mills Jr.
Sam Houston State University

Thursday, April 16, 2015

TBL vs. Lecture: An Exploration of Student Test Data

In Fall 2012 the first author converted a large (100-150 student) general education course from a lecture-based class to a TBL-formatted course. This shift entailed restructuring how class time was used and how content was delivered, but the content itself remained the same. Students in both the lecture and TBL classes were assigned the same readings. Additionally, material that had been covered during class in the lecture format (lectures and videos) was delivered outside of class in the TBL format using podcasts, lessons in Moodle, and online access to videos. Students in both formats took the same exams with the same objective (multiple choice, true/false) questions. The TBL sections followed best practices: students were placed on permanent teams, the Readiness Assessment Process was completed for each unit and followed by application exercises, and peer feedback was given throughout the semester.

A comparison of test question data after three TBL semesters revealed that students in lecture sections answered 78% of questions correctly while students in TBL sections answered 81% correctly, a statistically significant difference. The first author wanted to explore where this improvement came from and wondered which methods of content delivery in the lecture-based class were more or less effective than in the TBL class. For example, did students in TBL sections have a better grasp of material that had previously only been assigned outside of class (e.g. a reading not covered in lecture)? Did they fare worse on material that used to be delivered via lecture because they now reviewed it on their own?

Working with two colleagues, Lisa Walker and Angela Ferrara, the first author explored this issue by comparing the percentage of students who correctly answered objective (multiple choice/true-false) exam questions in the two learning environments (lecture vs. TBL). We analyzed student exam data collected over five semesters: two taught in the lecture format (340 students total) and three taught using TBL (356 students total). One hundred and one test questions remained constant across these five semesters and were included in our analysis. The first author determined the correct answers and developed a key that was used to machine-score the questions.

Questions were categorized based on how students in the lecture based course received the information contained in the question. A test question was categorized as:

"In-class content delivery" if students were only exposed to the information during face-to-face class time.

"Self-directed/outside of class" if students were only exposed to the information outside of class.

"Both" if the information in the test question was covered during face-to-face class time and in a resource students were expected to review outside of class.
Note, questions were not classified based on how content was delivered in TBL sections because all content was initially delivered outside of class. Class time was then spent engaging in the Readiness Assessment Process and application exercises.

The analysis revealed that regardless of how content was delivered in the lecture sections, TBL students had higher scores, and all differences were statistically significant. Students exposed to TBL scored higher (81%) on in-class content delivery questions than students in lecture (79%) (t = 1.819, p = 0.035). Students exposed to TBL answered 78% of the self-directed/outside-of-class questions correctly, while students in the lecture sections answered 72% correctly (t = 5.749, p < 0.001). Finally, TBL students scored higher (86%) than lecture students (83%) on questions categorized as both (t = 2.261, p = 0.012). These results suggest TBL can be as effective as, if not more effective than, lecture-based pedagogy.
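
For readers who want to run a similar comparison on their own exam data, here is a minimal sketch of the kind of question-level comparison described above: a t-test on per-question percent-correct values, lecture vs. TBL, within each delivery category. The file name, column names, and exact test specification (an independent-samples, two-sided test) are assumptions for illustration and may differ from the analysis we actually ran.

# Hypothetical sketch of a question-level comparison, not the authors' actual code.
# Assumes a CSV with one row per test question and columns:
# category, pct_correct_lecture, pct_correct_tbl.
import pandas as pd
from scipy import stats

def compare_by_category(csv_path: str) -> None:
    questions = pd.read_csv(csv_path)
    for category, group in questions.groupby("category"):
        # Two-sided independent-samples t-test on percent-correct values.
        t_stat, p_value = stats.ttest_ind(
            group["pct_correct_tbl"], group["pct_correct_lecture"]
        )
        print(
            f"{category}: TBL mean {group['pct_correct_tbl'].mean():.1f}%, "
            f"lecture mean {group['pct_correct_lecture'].mean():.1f}%, "
            f"t = {t_stat:.3f}, p = {p_value:.3f}"
        )

if __name__ == "__main__":
    compare_by_category("question_level_scores.csv")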

Coral Wayland
Angie Ferrara
Lisa Walker 
University of North Carolina at Charlotte

Wednesday, April 15, 2015

TBL 101 is Being Offered at IAMSE

Dear Colleagues,

The TBL 101 and Creating an Effective TBL Module workshops will be offered at the 19th Annual IAMSE (International Association of Medical Science Educators) Meeting this summer on Saturday, June 13th, 2015, from 8:30 am to 3:15 pm Eastern Time. The workshops will be co-presented by Sandy Cook and Kevin Krane, and registration for them is independent of the regular conference registration. For more information, please visit http://www.iamseconference.org.

Thank you,

TBLC Admin Team

Monday, April 13, 2015

Team-Based Learning and Undergraduate Anatomy: A Good Match

Team-Based Learning has often been used to teach anatomy and physiology courses in medical school and in other graduate health professions programs, but TBL-taught courses are still uncommon in undergraduate programs. I have found TBL to be a perfect fit for this subject and level.

The Anatomy and Physiology courses at UMBC, as at many institutions, form a two-semester, sophomore-level biology sequence taken primarily by pre-health students. The 100 or so students who enroll each semester are equally distributed among those planning careers in nursing, physical therapy, pharmacy and medicine. The TBL version of the course is divided into seven units, each of which begins with a Readiness Assessment Test (IRAT and TRAT) and ends with an application exercise. Teams are fixed, and students are given required readings before class that form the basis for the RAT.

Application activities, an integral part of each module, are structured using "4S" principles and emphasize the most challenging goals for each unit. Each application activity is projected on slides for the teams to review. I have learned the following lessons about designing effective application activities, which may be helpful to other faculty teaching undergraduate TBL courses:

Use relatively simple clinical applications to provide significant, specific-choice problems. For example: You're in the dentist's chair, awaiting your root canal. The dentist pauses, needle in hand. "Which cranial nerve should I block?" she asks. 

After teams use voting cards to indicate their choices, have them place the cards on stands (like place cards at a wedding) so everyone can see and remember each team's choice.

When facilitating team discussions, call on individuals at random to report their team's reasoning. This prevents one or two students from becoming the team's spokespeople and pushes weaker or less focused students to work to understand the team's answer and reasoning.

Application exercises should be kept short. A long, complex problem can be broken up into smaller pieces. This keeps the teams on track and minimizes chatter about their weekends, as students usually have only 2-3 minutes for each problem or piece of the problem.

"Significant" doesn't always mean "Realistic." Some of my best applications-meaning, those that produce the best within-team and between-team discussion-are my "Alien Ray Gun" questions, in which a pictured cartoon alien threatens to vaporize one of various cell types within a tissue, tissues within an organ, or organs within an organ system, but kindly allows the victim to choose which. Choosing between types of T lymphocytes, for example, or whether to vaporize their stomach or colon, teams become highly invested in their answers while at the same time learning the nitty gritty functional relevance of each cell type or organ - which is, of course, the point! 

Direct competition among teams is a useful tool. Often, the times of highest enthusiasm and participation in my classroom are the ten-question "For the Pie" competitions, usually held shortly before individual exams, in which student teams compete on a series of short, tough questions to win a homemade pie baked by the instructor. Although the "For the Pie" sessions are not advertised, students learn to predict them and often seem to study harder for them than for the actual test!

This approach has, over three years, been highly successful. The DFW rate (the percentage of students receiving a D or F or withdrawing from the course) has been very low since TBL was implemented, less than 5% per year, versus 15-30% before TBL. Student satisfaction is high, based on both official student evaluations and anonymous mid-course surveys. Compared to the lecture version, students give the TBL course significantly higher ratings both for how much they learned and for the instructor's overall teaching effectiveness.

A pre-/post-test is given each semester in the TBL course. Results reveal that students more than double their knowledge each semester. Additionally, students retain more material from A&P I than students who took the course in a lecture format the same year.

The major benefit that TBL provides is the time to explore complex and interesting problems in class with your students. By keeping the organization and flow tight and focused while allowing misconceptions and messiness to surface freely during facilitation, the course lets students get confused (a state that is all too lacking in lectures) in an environment that transforms that confusion into new understanding: learning!

Sarah Leupen, PhD
University of Maryland, Baltimore County

Thursday, April 2, 2015

TBLC Board of Directors Nominations

Dear TBLC members,

This year the TBLC has two positions open on the Steering Committee (Board of Directors): the K-12 Member-at-Large and the President-Elect. LaTonya Amboree, the current K-12 Member-at-Large, has been nominated for a second term, and Michael Nelson has been nominated for President-Elect. The slate was approved by the nominations committee on March 15th, 2015. According to the TBLC bylaws (http://www.teambasedlearning.org/page-1033379), members have until April 25th to petition for additional candidates to be added to the ballot. If additional nominations are received, an electronic ballot will go out to members on May 1st, with voting open until May 31st. If no additional nominations are received, no election will be necessary, as the nominees will be running unopposed.

We would also like to thank the current Board of Directors for their service to the TBLC.

President: Wayne McCormack
Past President: Ruth Levine
President-Elect: Karla Kubitz
Treasurer: Ed McKee
Health Sciences MAL: Michael Nelson
Higher Education MAL: Peter Balan
K-12 MAL: LaTonya Amboree
Expert Adviser: Larry Michaelsen
Executive Editor of Publications: Michael Sweet
2015 Program Chair: Karla Kubitz
2015 Program Co-Chair: Tatyana Pashnyak
Collaborative Manager: Julie Hewett

Thank you,
TBLC Admin Team