Impact of the Objective Evaluation of Clinical and Surgical Basic Skills (CSBS) On Medicine Students (Spain): An Experimental Design

Marcelo F. Jimenez, Maria Jose Rodriguez-Conde, Susana Olmos-Miguelañez, Gonzalo Varela, Francisco S. Lozano, Francisco J. Garcia, Fernando Martinez-Abad
Copyright: © 2014 |Pages: 11
DOI: 10.4018/jitr.2014040105

Abstract

This work presents the design, implementation and results of a medical education research project conducted from 2007 to 2011 on the learning of basic clinical and surgical skills in Medicine (University of Salamanca, Spain). The project was carried out in collaboration with the Educational Research Institute of the same university. The hypothesis of the study is that teaching methods based on a learning process guided by direct practice on models, together with modeling through an online teaching platform, produce a high level of learning, as measured by direct observation of behavior. The learning results are positive, although some methodological drawbacks were found in the evaluation process, related to the causes of variability between skills and between evaluators.
Article Preview

Method

Variables and Instruments

The variables and instruments derived from the study hypotheses were the following:

  • 1.

    Dependent variables

    • a.

      Skill level demonstrated by the student in: wound healing, venipuncture, airway management, IUCs, and thoracic exploration and puncture. The instruments used in this case are: direct observation, purpose-built instruments, and checklists / estimation scales.

    • b.

      Student satisfaction level with the teaching action. The instruments used in this case are: electronic satisfaction surveys (posttest).

    • c.

      Teacher satisfaction level with the procedure developed. The instruments used in this case are: electronic satisfaction surveys (posttest).

  • 2.

    Independent variable

    • a.

      Teaching methodology designed for the acquisition of clinical and surgical skills, in accordance with the design of the program of practices.

  • 3.

    Moderating or control variables

    • a.

      Sex, age, students' prior performance, etc., measured through electronic surveys administered to the students (prior to the training program).

For the evaluation of the clinical skills, we took into account the type of knowledge acquired and used a valid instrument for recording evidence on the results of direct observation through checklists (Bradley & Humphris, 1999; Alves de Lima, 2005; Martínez Carretero, 2005). Student evaluation through direct observation of basic clinical and surgical skills on a model is an instrument of considerable educational value, following Brailowsky's model of competence evaluation in the field of medical education (2009: 111).
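Checklist ratings collected by two observers of this kind are commonly summarized with a chance-corrected agreement statistic such as Cohen's kappa. The sketch below is purely illustrative and is not taken from the study's data or tooling: the two rating vectors are invented pass (1) / fail (0) judgments for the same ten students on one checklist item.

```python
# Illustrative sketch: Cohen's kappa for two hypothetical observers rating
# the same students pass (1) / fail (0) on one checklist item.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same cases."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of cases on which the raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented ratings for ten students (not data from the study).
observer_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
observer_2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
print(round(cohens_kappa(observer_1, observer_2), 2))  # → 0.52
```

Here the raw agreement is 80%, but kappa drops to about 0.52 once chance agreement is discounted, which is why such a statistic is a sharper lens on inter-observer variability than simple percent agreement.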

Population and Samples

This study extended over several academic years (from 2007-08 to 2012-13) in the Faculty of Medicine of the University of Salamanca, with a sample of third-year students. Each year the number of students ranged between 90 and 115; the cumulative sample of this research therefore consisted of 450-575 students.

Advances in the Study per Academic Year:

The milestones of the project over the years of implementation were as follows:

  • Phase 1: 2007-08

    • 1.

      Evaluation of prior expectations and of student and teacher satisfaction levels.

    • 2.

      Implementation of teaching material in Moodle.

    • 3.

      Skills evaluation: reliability (inter-observer variability).

  • Phase 2: 2008-09

    • 4.

      Workshop and evaluation guidelines to unify evaluation criteria.

    • 5.

      Skills assessment: the mean score decreased; four skills were evaluated and high inter-observer variability was detected.

  • Phase 3: 2009-10

    • 6.

      Educational evaluation seminar (January 2010).

    • 7.

      We analyze and discuss the results (May 2010).

  • Phase 4: 2010-11

    • 8.

      Student-mentors are selected and trained by expert teachers in each skill (October 2010).

    • 9.

      Student-mentors act as a driving force for the practices (October-December 2010).

    • 10.

      We analyze and discuss the results (March 2011).

  • Phase 5: 2011-12

    • 11.

      The assessment systems are implemented and validated.

    • 12.

      We analyze and discuss the results (June 2012).

  • Phase 6: 2012-13

    • 13.

      We implemented an educational program targeting teachers, using material in streaming video format to model the process of assessing the skills acquired by students.

    • 14.

      A concordance study with the data obtained through immediate response systems (May 2013).

    • 15.

      We analyze and discuss the results (July 2013).
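A concordance study of the kind mentioned for 2012-13 compares scores for the same students recorded through two channels. As a purely hypothetical illustration (the `observer` and `clicker` score vectors below are invented, and the study does not specify which statistic it used), agreement between an observer's checklist totals and scores captured with an immediate response ("clicker") system could be quantified with a correlation coefficient:

```python
# Illustrative sketch: correlation between hypothetical checklist totals
# recorded by a direct observer and scores captured with an immediate
# response (clicker) system for the same eight students.
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    assert len(x) == len(y) and len(x) > 1
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Invented scores for eight students (not data from the study).
observer = [8, 6, 9, 7, 5, 10, 8, 7]
clicker = [7, 6, 9, 8, 5, 9, 8, 6]
print(round(pearson_r(observer, clicker), 2))  # → 0.9
```

A high coefficient would indicate that the two recording channels rank students consistently; in practice a chance-corrected agreement index would complement this, since correlation alone ignores systematic score offsets between the channels.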
