Previous studies have suggested that, although orthodontic residents prefer participating live and interactively in a seminar, they learn almost as much when watching a previously recorded interactive seminar and following up with live discussion. Our objective was to test the effectiveness and acceptability of previously recorded interactive seminars paired with different types of live follow-up discussion.
Residents at schools participating from a distance completed preseminar readings and, at their convenience, watched streaming video of some or all of 4 recorded interactive seminar sequences consisting of 6 seminars each. Afterward, distant residents participated in 1 of 4 types of interaction: local follow-up discussion, videoconference, teleconference, or no discussion. The effectiveness of the seminar sequences was assessed with pretest and posttest scores. Acceptability was evaluated from ratings of aspects of the seminar and discussion experience. Open-ended questions allowed residents to express what they liked and to suggest changes in their experiences.
In each seminar sequence, test scores improved more at schools participating through recordings and follow-up discussions than at those participating live and interactively. After viewing, residents preferred local follow-up discussion, which was not rated statistically differently from live, interactive participation either locally or from a distance. Videoconference and teleconference discussions were both more acceptable to residents than no follow-up discussion, which was rated significantly below all other methods tested.
When residents participate live and interactively in a seminar, there appears to be no significant difference between participating locally and at a distance. Recorded interactive seminars with follow-up discussions are also an effective and acceptable method of distance learning. Residents preferred local follow-up discussion, but, at a distance, they preferred videoconference to both teleconference and no discussion.
In a series of distant seminar sequences with orthodontic residents, Bednar et al showed that, although residents preferred live participation from a distance, later viewing of a recorded interactive seminar produced similar pretest-to-posttest score improvements. The use of recorded seminars was also well accepted by distant residents, especially when they participated in a short follow-up discussion with faculty after viewing. Positive evaluations from the 3 participating orthodontic programs led to the conclusion that “further development of recorded seminars with live follow-up discussions has the potential to supplement instruction in graduate orthodontic programs.” Recorded interactive seminars can be created by properly equipped schools as they progress through their normal curricula, and they offer flexibility to distant residents and faculty while reducing the costs and complexity of distance education.
In this experiment involving second-year residents at the University of North Carolina (UNC), Ohio State University (OSU), Tri-Service Orthodontic Residency (TORP) at Lackland Air Force Base (San Antonio, Tex), West Virginia University, University of Louisville, University of Minnesota, University of Manitoba (Canada), University of Sydney (Australia), and other residents from programs in Australia and New Zealand, we sought to evaluate the effectiveness and acceptability of various forms of postseminar feedback after distant residents viewed recorded interactive seminars.
Material and methods
Between UNC and OSU, recordings of 25 seminars organized into 4 sequences were created. Sequences 1 and 3 originated at UNC and covered topics pertaining to growth and development, and biomechanics, respectively. Sequences 2 and 4 originated at OSU and covered topics pertaining to advanced diagnosis and treatment planning, and treatment sequelae, respectively. The recordings were digitized, edited, and uploaded to a Web site accessed by user name and password. Before live participation or accessing and viewing a recorded seminar, residents were given a seminar outline and preparatory reading list. If not involved in the live seminar, distant residents accessed the seminars, watched them at their convenience, and evaluated 4 methods of postseminar interaction: local follow-up discussion, videoconference, teleconference, and no discussion. Effectiveness was measured by pretest and posttest data. Acceptance was measured by rating statements on a 7-point Likert scale. Open-ended questions allowed participants to share specifics of what they liked or disliked and to suggest improvements.
Although the origin of the seminars rotated between UNC and OSU, all recordings were made and processed at UNC. For final assembly, incoming video and audio signals, obtained from cameras and microphones dispersed throughout the room at UNC or from a Polycom (Pleasanton, Calif) videoconferencing system at OSU, first passed through separate interfaces that digitized the signals and placed them into their respective video-editing and audio-editing software. The digitized video signal was routed and recorded through Wirecast 3.0, a program that allows recording of multiple feeds, via cameras or data in the form of a PowerPoint file, with the benefit of live-on-tape editing, essentially doing preliminary editing “on the fly.” The digitized audio signal was routed and recorded through Soundtrack Pro. The processed audio and video files were then imported into Final Cut Pro, where they were synchronized, edited further, and combined into a single QuickTime file. The QuickTime file was then converted to a Flash file and uploaded to the project’s Web site.
Acceptance of the recorded seminars and the methods of interaction was measured with evaluation forms containing 12 short statements describing the experience, which residents rated on a 7-point Likert scale on which 7 meant “strongly agree,” 1 meant “strongly disagree,” and 4 meant “neutral.” The statements were further grouped into the following descriptive categories: resident’s opinion, learning experience, and descriptions of the discussion with each method of interaction.
A 1-way analysis of variance (ANOVA) was used to test the null hypothesis that there were no differences in acceptability between types of seminar interactions or follow-up discussions, with the Tukey-Kramer test used to adjust for multiple comparisons.
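The statistical procedure above can be sketched as follows. This is an illustration only, not the study's actual analysis: the Likert ratings below are hypothetical values, and `scipy.stats.tukey_hsd` applies the Tukey-Kramer adjustment when group sizes are unequal.

```python
# Illustrative 1-way ANOVA with Tukey-Kramer pairwise follow-up.
# Ratings are hypothetical 7-point Likert scores, not the study's data.
from scipy import stats

local_discussion = [6, 7, 6, 5, 7, 6]  # local follow-up discussion
videoconference = [5, 6, 5, 6, 5, 5]   # videoconference discussion
no_discussion = [3, 2, 4, 3, 2, 3]     # no follow-up discussion

# Null hypothesis: mean acceptability is the same for all follow-up methods.
f_stat, p_val = stats.f_oneway(local_discussion, videoconference, no_discussion)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

# Tukey's HSD (Tukey-Kramer when group sizes differ) adjusts the
# pairwise comparisons for multiple testing.
tukey = stats.tukey_hsd(local_discussion, videoconference, no_discussion)
print(tukey)  # table of pairwise mean differences and adjusted p-values
```

A significant omnibus F test followed by the adjusted pairwise comparisons identifies which follow-up methods differ, which is how the "significantly below all other methods" type of conclusion is reached.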
Follow-ups for most distant schools consisted of videoconference discussion, teleconference discussion, and no discussion; the specifics of participation are shown in Table I.
| Sequence | School | Pretest average (%) | Posttest average (%) | Difference |
|---|---|---|---|---|
| Sequence 1: growth and development | Minnesota | 51.82 | 68.79 | +16.97 |
| Sequence 2: advanced diagnosis and treatment planning | TORP | 52.92 | 66.67 | +13.75 |
| Sequence 3: biomechanics | TORP | 48.84 | 50.23 | +1.40 |
| Sequence 4: treatment sequelae | TORP | 56.44 | 57.33 | +0.89 |
As Table I shows, posttest scores improved over pretest scores in each seminar sequence. Although some improvements were relatively small, in each seminar sequence the greatest improvement was seen at a distant school.
The acceptability of the recorded seminars is shown in Table II. Based on the acceptability data pooled across the participating schools, residents agreed (6.1) that the videos helped them understand the material better than the readings alone. They also agreed that the videos improved their educational experience (5.6) and provided a better learning experience than they would have had without them (5.6). They slightly agreed (5.3) that they learned more by watching the instructor interact with other residents than they would have from a video of a lecture alone, and they were close to neutral (4.7) as to whether they would have learned more if present in person.