Teaching, Learning, and Research Hub

Using Simulations in Teaching

Panelists
Lauren Mann, AuD, PhD, CCC-A, University of Kansas Medical Center
Tiffany Johnson, PhD, CCC-A, University of Kansas Medical Center
Amanda Stead, PhD, CCC-SLP, CHSE, Pacific University, Oregon


Facilitators
Joan Besing, PhD, CCC-A, Montclair State University
Abdullah Jamos, AuD, PhD, CCC-A, Missouri State University

Description
As part of the year-long teaching symposium, seven synchronous online peer discussions were held. In the sixth session, the panelists were asked to address using simulations in teaching, and attendees were encouraged to engage with the panelists. Below is a summary of the discussion.

QUESTIONS/TOPICS DISCUSSED

  • How did you start using simulations in your programs? How do you use simulations in your
    program?
  • What type of courses or mechanisms have you used to implement simulations in your
    program?
  • How did you go about deciding what type(s) of simulations to use?
  • Can you talk a bit about how you were able to get funding for your simulation programs?
  • What do you consider when you evaluate simulation programs? Please share any
    successes or challenges you have had.
  • Question from the chat: Can the presenters describe the pre-brief and debrief of learners?
  • Can you discuss the use of simulation in interprofessional practice?

How did you start using simulations in your programs? How do you use simulations in your program?

Dr. Stead

  • We were doing experiential and active learning in our classrooms and
    have come to refer to this as “simulation light.” We felt we needed
    to introduce more rigor and fidelity than we had been. As our faculty got
    deeper into the literature and thinking about pedagogy, we realized
    that we needed to exert more control over the experiential learning
    strategies.
  • We are at a small liberal arts university, so we don’t have a simulation
    center and didn’t have a big budget. So, we started early with
    standardized patients. After we focused on simulation and got
    credentialed, our approach became more rigorous—and we were doing
    many things well and building bridges between the classroom and the
    clinic. But we weren’t doing a very good job with assessment.
  • We have a curriculum-long simulation program. Students engage
    with simulations from the first term until the externship term. Some
    examples are (1) a professionalism simulation in which students learn
    to interact with their supervisors, (2) high-fidelity medical hospital
    simulations, and (3) a care conference for end-of-life care, among others.
  • I think what I’ve realized is that it didn’t take a lot of change for us to be
    doing a better job of simulation. We could add a bit more rigor to our
    experiential learning and have a whole new experience.

Dr. Mann

  • It was very similar for us because we were doing simulations but
    not calling it that. We did not have a systematic method, and the
    simulations were scattered across courses. We are lucky because we
    do have simulations in our department and at the school level.
  • I’d say our entry into simulations was because we were looking for a
    way to standardize clinical experiences for our audiology students so
    they could enter clinical practice with the same foundation.
  • We had been doing this for a while pre-pandemic and now are at [the]
    point where we are looking at improving and evaluating the program.

What type of courses or mechanisms have you used to implement simulations in your program?

Dr. Stead

  • As a faculty, we discussed whether simulations should be in courses
    every term or in a specific term, or whether there should be a
    competency check-off or a grade associated with clinical registration.
    Our current practice is to have simulations integrated within the
    academic classes. Students earn grades for their knowledge, and we can
    check off competencies within our courses. So, I’d say simulations are
    across the curriculum. Some of the simulation activities are integrated
    within class time, and sometimes they are in a scheduled lab time. Or
    often they are scheduled during finals week—so there is a more robust
    demonstration of their [students’] learning.
  • The educator responsible for the content is responsible for grading
    or evaluating performance using the appropriate rubrics to evaluate
    outcomes. Then clinical hours are submitted into CALIPSO under a
    unique column, and the educator makes sure that the hours end up in
    the correct place.
  • We don’t do simulations to supplement clock hours. Rather, we do
    them because there is wide variety in the clinical placements. We don’t
    have an on-campus clinic, so our faculty really wanted to have their
    eyes on our students, and [by doing that] we can see how a student
    integrates (knowledge and skills) across courses.

Dr. Mann

  • We have an on-campus clinic, and we have our simulations occur
    within coursework. We have simulations at the school level that involve
    multiple disciplines, and we now have a course in which students enroll.
  • Before students enroll in clinic, they are doing simulations every
    week. All our students in a class are in the lab working on the same
    simulation, doing the same pre-reading, and all three clinical faculty
    are participating—and we have a graduate teaching assistant (GTA)
    helping as well.

Dr. Johnson

  • I can speak to the campus-wide activities where students are
    engaged in interprofessional approaches, and I can address, from the
    chair’s perspective, the money associated with this. We do have a
    dedicated enrollment for this, and it has been nice to have the
    resources so that we can expand the fidelity and level of work and
    infuse it into other curricular elements. I’d say many of our courses
    in the first 2 years of the 4-year program take advantage of these
    resources.

How did you go about deciding what type(s) of simulations to use?

Dr. Stead

  • You must think about what you want to teach your students or
    what you want them to demonstrate—and so programs have to make
    hard decisions about what they want.
  • Our approach has been a combination of thinking about what was
    mission critical—like being sure the students could do a bedside
    swallowing exam and manage in a hospital room—to having
    interprofessional care conferences and representing themselves and
    their profession. As a faculty, we argued about favorite topics and
    what would be a good use of time.
  • The next thing we did was to look at the students’ clinical hours and
    try to identify where there were gaps (e.g., voice and hearing). We are creating simulations so we can be absolutely sure that every student has the experiences needed.
  • This summer, we will run a simulation that is not part of the Big Nine but
    is needed. The simulation will focus on working with an interpreter.
  • We are big proponents of developing soft skills because we think it
    makes a big difference in therapy.
  • We have also done a lot on clinical communication and how students
    talk about their role as a speech-language pathologist and how to
    present scientific evidence. We also work on how to build empathy
    for a patient or how to calm a patient down during an emotional
    encounter.
  • We know that a student can follow the steps to complete a procedure,
    but when you ask them to communicate and be nuanced on top of
    completing the procedure, they can’t/don’t do it. Giving feedback that
    supports the development of the communication piece is helpful.
  • I serve as the director of our simulation education program, so it is my
    role to help the educators figure out how to implement their simulation—
    and I handle the logistics.

Dr. Johnson

  • I can address the high-level perspective. Our motivation was to get
    the foundational skills equalized across the students. We were really
    looking for things that would support those foundational diagnostic
    skills and foundational hearing aid amplification skills.
  • We looked to see what was out there and brought people in for demos.
    There is a fair amount of background investigation into what
    opportunities/options are out there.

Dr. Mann

  • We took a shotgun approach and tried them all. The list we generated
    was expensive, and we were very well supported.
  • Technology has evolved since then. We had students try them—which
    is not always the best approach—but we got a lot of student feedback,
    and they told us what was really helpful and what was a waste of time.
  • Because of the lack of options when we started, we tried all of it and
    plugged it all in across the classes. We took student feedback very
    seriously, and over the years, we have been more systematic and have
    evolved how we evaluate students.
  • We have been doing comprehensive exams, practical exams, and
    classes that they are taking in conjunction with their simulations—and
    we have definitely seen changes. But selecting individual simulators is
    difficult. We wanted the simulations to do more than just (teach and
    assess) a clinical skill. We wanted students to be able to do that but
    didn’t want to spend precious clinical time with a real patient figuring
    out how to mask. So, we infused counseling into cases early on:
    students were asked to make it more human, record a counseling
    session with that “person,” and then go back, critique their counseling
    by themselves, and do peer feedback.
  • As a preceptor, when I noticed a nervous tic or anxious energy and
    tried to provide feedback, it was sometimes poorly received even if I
    created a safe environment. By taking these discussions out of the
    clinic—out of the higher-risk space—and moving them into a simulated
    environment, the students felt much safer talking about what happened
    and [figuring out] what to work on. Having a student identify
    something they did or said was better. We saw faster implementation
    than when we told the student not to say something to the patient. I
    noticed right away that if students pick up on something themselves,
    they don’t do it again.
  • Regardless of what simulator or software you use, I really
    encourage recording a counseling session and getting student feedback
    on it—there is very little financial investment.

Can you talk a bit about how you were able to get funding for your simulation programs?

Dr. Johnson

  • I’ll talk on a couple of levels, and what I am going to share is probably
    specific to our university, but it might help you think out of the box.
    We imagined how space could be configured to house the simulation
    space.
  • We were in the middle of reevaluating a number of things about our
    curriculum—looking at credits, at what was still meeting the
    needs of our students, and at what may not be needed anymore. We
    made some judgments about how many credits were needed and
    created an enrollment so that, given the way funding flows into the
    university, we were able to support the space and make it cost-neutral
    to the students, and we could capture some of the funds as they came
    back to the department, so the program is sustainable.
  • We also went through a lengthy and involved process to get a fee
    that students pay to support the simulation lab. The fee is attached
    to specific courses, so only the students using the space are
    charged for it. It is important to remember that if you go down
    this path, money is needed to purchase simulations and then to
    maintain annual subscriptions, repair broken equipment, and keep the
    computers current.

What do you consider when you evaluate simulation programs? Please share any successes or challenges you have had.

Dr. Stead

  • One of the things I learned through the literature and the credentialing
    process is that we were setting way too many objectives for students to
    achieve in any one simulation. We would want to see their soft skills,
    their procedure, their transitions, and their management of material
    when we put them in a clinical environment. It wasn’t very focused,
    and so we have learned to be much more intentional in having three to
    five objectives and [to] be very transparent with the student about what
    we assess.
  • The way we’ve done our assessment is to do pre- and post-tests on
    knowledge or on something like self-efficacy or confidence. We’ve done
    a lot of rubrics. The head educator or the standardized patient completes
    a rubric on very straightforward skills that address the objective. We just
    had one on clinical communication that used a five-point scale; the
    standardized patient, the head educator, and the student would rate
    the student when they were finished, and this was carried into the debrief.
  • We’ve also adapted scales and tools that people in the medical
    disciplines have been using for eons. There is a lot of good stuff out
    there.
  • You can email us. We have some great stuff that we have hijacked
    or created on our own. We also do a quality survey at the end of the
    simulation to get feedback on the simulation itself. We get the added
    benefit of learning about the experience from the students’ perspective.
  • You set the assessment up to address the outcomes you want from
    the simulation, and the best advice I can give you is to keep the
    objectives tight. I know this is where we were screwing up before.
    Don’t do more than three to five objectives, because you can’t assess
    much more than that, and it keeps the students hyper-focused.

Dr. Mann

  • I like what Amanda said about a very focused skill or objective. We
    spent so much money and had all these new gadgets and a new
    dedicated space, so we wanted to do everything with it.
  • If the objective for this session is to take an ear impression, then we
    are going to ask questions, have a reading that goes along with it,
    and then there’s going to be some kind of skill assessment where they
    demonstrate that one narrow skill.
  • It was hard for me when we first set it up because I was so excited
    to get the baby ABR simulator that I wanted to use it for everything.
    But because we manage our NICU screening, and because we are on
    our hospital campus, we realized that maybe this is not something we
    need to spend time simulating. We decided to spend time on the
    things that we do not get organically at our facility.
  • When you are starting, I would look for holes in the students’
    experiences. Then I would think about interprofessional practice, but
    I would lean on other departments and the resources they have.
    Even before it was systematic, I was sending students down
    with physical therapists to talk about balance rehab, because I don’t
    have any other way to get a student that experience. Just pull other
    colleagues in—they already have a plan for their students. You may not
    have to reinvent the wheel. You may be able to tap into something that
    is already going on and just tweak it for your simulation needs.

Audience Question: Can the presenters describe the pre-brief and debrief of learners?

Dr. Stead

  • This is something we were kind of doing, but probably not in a
    very rigorous way. Our approach now is to do pre-learning—so,
    whatever class the simulation is attached to, there is dedicated class
    time to explain the simulation: the objectives and what the students
    will be doing. We give them an opportunity to do reading, practice
    procedures, or see a model of what we’re going to have them do. Then,
    immediately ahead of the simulation, we do a pre-brief that is very
    scripted, because we often have multiple educators and want to be
    standardized. We tell them: this is what you are here for; these are the
    objectives of the simulation. A way to achieve this is to tell the students,
    “We ask that you enter into the spirit of the simulation and treat this as
    a clinical experience.” We have our students sign a fiction contract,
    which asks them to enter into the space in good faith. We remind them
    that it is a confidential learning space. We go through that for 5–10
    minutes ahead of the simulation. We talk about [the fact] that when you
    enter the room, the simulation starts. We go to debrief right after the
    session. To Lauren’s point, when we give them feedback, they
    don’t like it. But when you go to debrief and, instead of trying to teach
    them, you actually let them explore their experience, they totally
    know what they did wrong. A lot of times they say, “Wow, I was
    really awkward,” or “I actually never did that part of the procedure,”
    or “I said this thing that was not factually accurate.” If you serve up the
    opportunity, they definitely get it. Our debriefs are about twice as long as
    the simulation.
  • We try to keep the groups as small as possible because we want every
    student to participate, but it really depends on the pedagogy or the
    setup.
  • We guide them using the PEARLS method (Promoting Excellence and
    Reflective Learning in Simulation*), which is a way to address both
    emotions and content. This is the one we gravitate to, but there are
    lots of evidence-based debriefing methods you can use.
  • My advice would be to almost script it for the person who is taking the
    lead because the educator is tempted to teach and take up all the
    talking space, and you want the students doing the talking during the
    debrief.

Dr. Mann

  • I come back to Amanda’s point about entering the space and that
    this is a simulation. We’ve got both experiences, so students have
    a dedicated time, and they do Code Blue and interdisciplinary
    simulations where there is a pre-brief and a debrief.
  • We have created a space where the students have the opportunity to
    start the simulation whenever they want. In their dedicated time, they
    meet with a GTA who does the pre-brief and gets everyone oriented.
    They do the simulation, and then the GTA meets with everyone. This
    individual is not on the faculty.
  • The GTA then gives the faculty feedback. We felt that helped create
    safety in the students’ exploration of how they did, and it lowered the
    stakes because they are not meeting with me right after they try to fit
    a hearing aid. They can talk about what went wrong with an
    upperclassman, so the debriefs are a lot more informal than Amanda
    described.
  • I also encourage you to think about the dedicated time and place
    and the type of simulation—and then to infuse simulations that can
    happen at the student’s pace. Maybe only a small portion of the cases
    are used for the formal simulation. We have set up additional
    opportunities for students to come practice on their own, so it doesn’t
    have to be formalized either.

Dr. Johnson

  • The simulations that we have used are more campus-wide and
    use spaces of our own. Lauren mentioned Code Blue, in which our
    speech pathology students participate with PT, OT, and athletic training
    in a high-fidelity simulated hospital space and the campus-wide
    simulation space—so that is a pretty high bar in terms of equipment.
    It’s been lovely to have, but we are also running an interprofessional
    simulation on responding to identity-based patient bias—helping
    students learn how to respond to microaggressions and racist comments
    based on all kinds of identities. That is really low-fidelity. Those of us
    working on it have come up with scripts, and we’re in the phase where
    we’ve got standardized patients who are trained to be the instigator in
    a clinical space. There is no equipment needed for that, and it’s real
    low-tech, and I think it is having a pretty big impact.
  • For the final piece with the standardized patient, we have a brief
    pre-learning, practice, and then a debrief at the end. It’s low-fidelity but
    has a pretty heavy impact on the students and their experience—so there
    is a really big continuum in how these could work.
*Eppich, W., & Cheng, A. (2015, April). Promoting Excellence and Reflective
Learning in Simulation (PEARLS). Simulation in Healthcare: The Journal of
the Society for Simulation in Healthcare, 10(2), 106–115. doi.org/10.1097/
SIH.0000000000000072

Can you discuss the use of simulation in interprofessional practice?

Dr. Stead

  • I think Tiffany can probably speak more to this because she is doing
    more dedicated work. We are working with our colleagues to do
    interprofessional activities. We also like standardized patients.

Dr. Johnson

  • We have mentioned our Code Blue activity and our identity-based
    patient bias training that we are doing with nursing and several School
    of Health Professions disciplines.
  • The other thing we do on our campus—and we’ve been doing this for
    probably 8 to 10 years—is a campus-wide interprofessional
    activity. This brings together students in medicine, nursing, and the
    health professions. It includes some simulations—and, again, it is a
    multistep curriculum for learning to work in interprofessional teams.
    The students learn about the language to use, what their role is, how
    to step in, and how to stay in their lane while working toward good
    outcomes. We have an individual in the School of Nursing who runs
    this. It spans a couple of days across three campuses, where students
    learn vocabulary, learn about other disciplines, and have a real
    low-tech sort of simulated patient experience. The “patient” has certain
    characteristics, and students learn their role on the team and learn to
    work with fellow students.
  • Again, this interprofessional activity doesn’t require any equipment—
    just someone who can manage it. Other folks in nursing are ahead of
    us by a long shot.
