Introduction
Distance learning studies involving orthodontic residents have shown that, although residents prefer live, interactive seminars with an instructor, they learn almost as much from watching a recorded interactive seminar followed by a live discussion. Our objective in this study was to test the acceptability and perceived effectiveness of using recorded interactive seminars and video conference follow-up discussions for in-office continuing education.
Methods
Four small groups of practitioners (total, n = 23) were asked to prepare for, view, and then discuss previously recorded interactive seminars on a variety of subjects; a fifth group (5 previous participants) had live discussions of 3 topics without viewing a prerecorded seminar. All discussions were via video conference through typical broadband Internet connections, by using either WebEx (Cisco, Santa Clara, Calif) or Elluminate (Pleasanton, Calif) software. The participants evaluated their experiences by rating presented statements on a 7-point Likert scale and by providing open-ended responses.
Results
Twenty-two of the 23 participants agreed (with varying degrees of enthusiasm) that this was an enjoyable, effective way to learn and that they would like to participate in this type of learning in the future. All agreed that they would recommend this method of learning to others. The age and experience of the participants had only minor effects on their perceptions of its effectiveness and acceptability.
Conclusions
The use of recorded seminars followed by live interaction through videoconferencing can be an acceptable and effective method of providing continuing education to the home or office of orthodontists in private practice, potentially saving them both time and travel expenses.
Recent studies have shown that distance learning can be an effective supplement to graduate orthodontic education. Although live interaction in a seminar with a distant leader might be the ideal way to use distance learning, it is expensive, requires extremely high-speed connections (Internet2) if both data and video must be transferred, and can pose scheduling problems.
The results of previous studies with orthodontic residents indicate that participating in live interactive seminars from a distance produces test score improvements similar to those of typical classroom instruction, and that participants can learn almost as much by watching a previously recorded interactive seminar, especially when the recording is supplemented with a live follow-up discussion. Between the preparatory readings and the seminar recordings, students can gain considerable exposure to a topic at any convenient time and at their preferred pace. Faculty members, local or distant, can then schedule follow-up discussions, generally shorter than typical seminars, to address questions and comments. Scheduling flexibility and lower equipment costs also make this method of distance learning an inexpensive and efficient supplement to a program’s educational resources.
Current efforts have created an online library of interactive seminar recordings, used thus far only by orthodontic residents. A question remains: can recorded seminars bring continuing education (CE) to practitioners in their offices? Traditional CE courses typically offer a lecture and limited discussion with the presenter, along with the possibility of useful discussion outside the lecture room with colleagues who attended. The time and monetary costs of attending, however, tend to limit the number of CE courses that practitioners take. Access to experts and discussion with colleagues are key components of an orthodontic residency; recorded interactive seminars with follow-up discussions could offer both benefits for CE without requiring practitioners to leave the office or home.
The purpose of this study was to evaluate the perceived effectiveness and acceptability to orthodontists of interactive distance learning in their offices, examining 4 components: preparatory readings, recorded interactive seminars, live follow-up discussions via video conference, and live discussions without previous viewing of a recorded seminar on that subject.
Material and methods
Twenty-three orthodontists in private practice who had typical broadband Internet connections were recruited to evaluate this distance learning model as a means of in-office CE. It is generally acknowledged that interactive small-group seminars are preferred for postprofessional learning because they lead to better discussion and development of problem-solving skills than traditional lectures do. For this reason, the participants were divided into 5 small groups (Table).
| Group | Participants | Geographic location | Years in practice | Seminars (n) | Surveyed |
|---|---|---|---|---|---|
| Group 1 | 5 men | North Carolina, Virginia | 8-38; mean, 21.8 | 6 | After each seminar and after sequence completed |
| Group 2 | 5 men, 2 women | California, Washington | 2-20; mean, 13.5 | 6 | After each seminar and after sequence completed |
| Group 3 | 5 men, 1 woman | North Carolina, Virginia | 0.5-4; mean, 1.5 | 3 | After sequence completed |
| Group 4 | 5 men | North Carolina, Virginia | 1-40; mean, 30.5 | 3 | After sequence completed |
| Group 5 | 5 men | North Carolina | 1-40; mean, 15.3 | 3 | After sequence completed |
Group 1 consisted of 5 male orthodontists in North Carolina and Virginia with an average of 21.8 years in practice. Group 2 included 2 female and 5 male orthodontists in California and Washington with an average of 13.5 years in practice. Both groups prepared for, viewed, and then discussed, from their own offices or homes, 6 seminar topics selected from the 25 seminars recently produced for dissemination to university departments.
Group 3 consisted of 6 practitioners (5 men, 1 woman) with experience ranging from 6 months to 4 years, and group 4 contained 5 male practitioners with 15 to 40 years of practice experience. These groups prepared for 3 consecutive seminars in the University of North Carolina Current Topics series being conducted at that time, viewed a recording of each seminar (held with the residents) within a few days after it occurred, and then joined a videoconference discussion. Finally, group 5, composed of 3 older and 2 younger practitioners who had been members of group 3 or 4, prepared for and then participated in live video discussions of 3 Current Topics seminars without previously viewing recordings of those seminars.
For all seminars, preparation required reading 7 to 10 assigned articles from the orthodontic literature. The same readings that the residents used to prepare for the recorded seminars were sent to each distant participant in advance, initially as mailed reprints or PDF files and later as e-mail attachments that they could print if they wished. The participants were asked to complete the assigned readings before watching the recorded seminars, which were posted on a Web site created for this study. Each video showed an approximately 60-minute, small-group, interactive seminar in which faculty members and residents shared ideas, questions, and opinions about the selected readings and their experiences with the topic. The participants could view the recorded seminars at any time.
Each participant was sent a Web camera (Logitech, Fremont, Calif) to connect to his or her computer for follow-up discussions, unless the participant’s computer had a built-in camera. Discussions were scheduled to be as convenient as possible for the participants, taking into account both their time zones and work schedules. Almost all were set to begin at 6:00 pm local time, so that the orthodontists could finish their office day and stay at the office for the discussion.
For groups 1 and 2, the original seminars were led by 6 faculty members from the University of North Carolina or Ohio State University, and all discussions were led by William R. Proffit except for the temporary anchorage device seminar for group 2, which was led by Robert Scholz. Participants in groups 1 through 4 were told before the experiment that the discussions would last approximately 30 minutes, but the discussions frequently ran over the allotted time because the group wanted to continue exploring points of interest. As with the residents' seminars, group 5's live discussions lasted 1 hour.
WebEx (Cisco, Santa Clara, Calif), the videoconferencing program used for groups 1 through 4, allows real-time sharing of desktop information such as Word documents (Microsoft, Redmond, Wash) and slides. Slides were included in the discussions with groups 1 and 2, as they had been in the recorded seminars those groups saw; because slides were not part of the Current Topics seminars, they were neither viewed in the recorded seminars for groups 3 through 5 nor used in those follow-up discussions. To minimize bandwidth problems and to maintain communication with any participant who had computer trouble, WebEx was used solely as a video feed; for audio, the participants used their telephones to join a conference call, identified by a dial-in number, through a standard conference-calling service. For group 5, Elluminate software (Pleasanton, Calif), which transmits both audio and video over the Internet, was used, eliminating the need for a conference call. Both WebEx and Elluminate display 6 participants' images in windows on the computer screen; if there are more than 6 participants, each person can select whose images appear on his or her screen at any moment.
Along with the readings and video cameras, each participant was sent evaluation forms, initially by mail and later via the SurveyMonkey program (Palo Alto, Calif). For groups 1 and 2, an initial evaluation, completed before installation of the camera and any videoconferencing, assessed each participant's self-perceived computer abilities. Individual evaluations of each videoconference session focused on the participants' perceptions of the effectiveness and acceptability of the readings, recorded seminars, and discussions. At the end of each sequence, the participants in all groups completed a summary evaluation in which they reflected on the experience as a whole (this was the only evaluation for groups 3-5).
Each evaluation form consisted of statements that the participants rated on a 7-point Likert scale indicating their level of agreement: a rating of 1 meant strong disagreement, 4 was neutral, and 7 meant strong agreement. Because of the small sample size, only descriptive statistics were used. The summary evaluation also contained open-ended questions so that the participants could share their thoughts about the experience and critique what was good or what could be changed.