

Article citations


Livingston, S., & Zieky, M. (1982). Passing scores: A manual for setting standards of performance on educational and occupational tests. Princeton, NJ: Educational Testing Service.

has been cited by the following article:

  • TITLE: OSCE Feedback: A Randomized Trial of Effectiveness, Cost-Effectiveness and Student Satisfaction

    AUTHORS: Celia A. Taylor, Kathryn E. Green

    KEYWORDS: Assessment; Medical Students; Objective Structured Clinical Examination; Feedback; Randomized Trial

    JOURNAL NAME: Creative Education, Vol.4 No.6A, June 13, 2013

    ABSTRACT: Purpose: To develop two new types of clinical feedback for final-year medical students using OSCE mark sheets and to evaluate their effectiveness, cost-effectiveness and student satisfaction in a randomized trial. Methods: A randomized trial was conducted with two groups (Cohorts A and B) of students (n = 350) at the University of Birmingham (UK) participating in a two-stage Objective Structured Clinical Examination (OSCE) (November 2011 and April 2012). Students were randomly assigned to receive one of three feedback interventions (skills-based, station-based, or both) after the November OSCE. Multivariate regression analysis was used to test whether the feedback intervention was a significant predictor of April OSCE score, while controlling for November OSCE score. Secondary outcomes were cost-effectiveness and student satisfaction. Results: Feedback group was not a significant predictor of April scores for Cohort B. In Cohort A, the station-based group performed better than the group that received both types of feedback (2.8%, 95% CI 0.4% to 5.2%, p = 0.022). There was no difference between the skills-based and station-based groups. The cost of providing the station-based feedback was double that of the skills-based feedback. Questionnaires were received from 245 students (70%). Students who received both types of feedback were the most satisfied, followed by those in the station-based group. Conclusion: There was no consistent difference in effectiveness across the three trial groups. Students tended to prefer station-based feedback over skills-based feedback, but found elements of the standard feedback more helpful than the feedback evaluated in this trial.