#WCE2014 - Crowd-sourced assessment of technical skills (C-SATS): Validation through the basic laparoscopic urologic surgery (BLUS) curriculum - Interview

TAIPEI, TAIWAN (UroToday.com) - Introduction and Objectives: Crowdsourcing is the practice of obtaining services from a large group of people, typically an online community such as the Amazon.com Mechanical Turk project. We hypothesized that the ‘crowd’ could score videotaped dry-lab laparoscopic skill task performances from the AUA BLUS curriculum validation project comparably to expert surgeons.

Methods: 24 candidate videos of laparoscopic skill tasks performed by surgeons with varying levels of laparoscopic case experience (12 suturing and 12 pegboard transfer performances) were evaluated by 5 faculty experts and at least 60 Amazon.com Mechanical Turk crowd-workers. Each rater completed the same multi-domain rating scale from the Global Objective Assessment of Laparoscopic Skills (GOALS) tool. We compared mean global performance scores provided by experts and crowd-workers using Cronbach’s alpha and estimated performance-specific passing probabilities by cut-offs established with receiver operating characteristic (ROC) curves.

Results: Within 48 hours we received 1,840 crowd-worker ratings, of which 1,438 (78.2%) passed analysis eligibility criteria based on discrimination questions used to assess the integrity of the scorer’s responses. Faculty experts completed the reviews in 10 days. C-SATS ratings provided excellent discrimination between passing and failing video performances as defined by faculty experts (area under ROC curve = 96.9%; 95% CI: 90.3%–100%).
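The discrimination statistic reported above (area under the ROC curve) can be illustrated with a minimal sketch. All scores and pass/fail labels below are invented for illustration, not study data; the AUC is computed with the rank-sum (Mann-Whitney) formulation, which is mathematically equivalent to the area under the ROC curve.

```python
# Illustrative sketch of the ROC analysis described above: compare mean
# crowd scores against expert-assigned pass/fail labels per video.
# All numbers are hypothetical, not data from the C-SATS/BLUS study.

def roc_auc(scores, labels):
    """AUC via the rank-sum (Mann-Whitney U) formulation: the probability
    that a randomly chosen passing video outscores a failing one, with
    ties counted as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]  # expert "pass"
    neg = [s for s, y in zip(scores, labels) if y == 0]  # expert "fail"
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical mean crowd GOALS scores per video, and expert pass/fail labels.
crowd_means = [4.1, 3.8, 2.2, 4.5, 2.9, 1.7, 2.8, 2.4]
expert_pass = [1,   1,   0,   1,   0,   0,   1,   0]

print(f"AUC = {roc_auc(crowd_means, expert_pass):.3f}")  # → AUC = 0.938
```

An AUC near 1.0 means the crowd's mean scores almost perfectly rank passing videos above failing ones, which is the sense in which the study's 96.9% figure indicates excellent discrimination.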

Conclusions: A properly sized and qualified crowd can accurately score laparoscopic skill performances on par with faculty experts. Crowd-based ratings may be an efficient method for assessing passing/failing performances, and for measuring change in performance after training.

Source of Funding: None

 
Listen to an interview with Thomas Lendvay, one of the authors of this study.

 

Presented by Thomas Lendvay,1 Bryan Comstock,1 Timothy Averch,2 Geoffrey Box, Bodo Knudsen,3 Timothy Brand,4 Michael Ferrandino,5 Jihad Kaouk,6 Jaime Landman,7 Benjamin Lee,9 Elspeth McDougall,9 Ashleigh Menhadji,8 Bradley Schwartz,10 Robert Sweet, Timothy Kowalewski11 at the 32nd World Congress of Endourology & SWL - September 3 - 7, 2014 - Taipei, Taiwan

1University of Washington, USA
2University of Pittsburgh, USA
3Ohio State University, USA
4Madigan Army Medical Center, USA
5Duke University, USA
6Cleveland Clinic Foundation, USA
7University of California, Irvine, USA
8Tulane University, USA
9University of British Columbia, Canada
10Southern Illinois University, USA
11University of Minnesota, USA

 
