Purpose. To develop an instrument for measuring medical educators' responses to learners' lapses in professional behavior.

Method. In 1999, at the Indiana University School of Medicine, a 22-item checklist of behaviors was developed to describe common responses educators use when addressing learners' lapses in professional behavior. Four medical students were trained to portray lapses in professional behavior, and seven clinical observers were trained to categorize the checklist behaviors as present or absent. Interrater reliability was assessed during 18 objective structured teaching evaluations (OSTEs). Videotaped OSTEs were coded twice at a one-month interval to assess test-retest reliability. Items were classified as low-, moderate-, or high-inference behaviors. Script realism and educator effectiveness were also assessed.

Results. Educators rated the OSTE scripts as realistic. Raters observed an average of 6 ± 2 educator behaviors in reaction to learners' lapses in professional behavior. Educators' responses were rated as moderately effective; more experienced educators attempted more interventions and were more effective. Agreement among raters was high (86% ± 7%), while intraclass correlation coefficients decreased with increasing inference level. On the videotaped OSTEs, raters scored each behavior identically 86% of the time.

Conclusions. Accurate feedback on educators' interactions in addressing learners' professionalism is essential for faculty development. Traditionally, educators have felt that faculty responses to learners' lapses in professional behavior are difficult to observe and categorize. These data suggest that such responses can be defined and reliably coded. This work will help provide objective feedback to faculty who engage learners about lapses in professional behavior.