Video-based assessment of practical operative skills for undergraduate dental students


Wälter A
Möltner A
Böckers A
Rüttermann S
Gerhardt-Szép S*

Abstract



Introduction: The aim of this study was to evaluate, within an experimental design, to what extent assessments of prepared cavities carried out in two different settings, based on video sequences incorporating the digital analysis tools of the prepCheck software, deviate from one another and how reliable they are.
Materials and Methods: In this prospective, single-centre, experimental study, 60 examination cavities prepared for a ceramic inlay were assessed by four trainers in two different settings (A: video film versus B: video film plus an analogue model assessment) using a standard checklist. The examined parameters were: 1. preparation / outer edges, 2. surface & smoothness / inner edges, 3. width & depth, 4. slide-in direction, 5. outer contact positioning and 6. overall grade, each rated on a Likert scale from 1 = ‘excellent’, 2 = ‘very good’, 3 = ‘good’, 4 = ‘satisfactory’ to 5 = ‘unsatisfactory’. An evaluation questionnaire with 33 items additionally addressed the concept of applying digital-analytic software. The statistical analysis, performed with SAS 9.2 (SAS Institute Inc., Cary, USA; PROC MIXED) and R (version 2.15, package lme4), addressed reliability, inter-rater correlation and significant factors at p = 0.05.
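As an illustration of the kind of analysis named above, the following R sketch fits a linear mixed model in the spirit of SAS PROC MIXED / the R package lme4. It is a minimal sketch only, not the authors' actual script: the data frame and the variable names (grade, mode, rater, cavity) are hypothetical placeholders, and the simulated grades are invented.

# Minimal sketch, not the authors' analysis script: assessment mode as a
# fixed effect, with crossed random intercepts for rater and cavity.
library(lme4)

# Hypothetical long-format data: one row per (cavity, rater, assessment mode).
long_ratings <- expand.grid(
  cavity = factor(1:60),
  rater  = factor(1:4),
  mode   = factor(c("A", "B"))
)
set.seed(1)
long_ratings$grade <- round(runif(nrow(long_ratings), min = 1, max = 5))

# Fixed effect of mode; random intercepts for rater and cavity.
fit <- lmer(grade ~ mode + (1 | rater) + (1 | cavity), data = long_ratings)
summary(fit)  # fixed-effect estimate for mode and variance components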
Results: The assessments of the individual criteria and the overall grade in the control group (A) were, on average, lower (i.e. better) than in the study group (B), but, with the exception of ‘outer contact positioning’, without conclusive statistical significance. The reliability averaged α = 0.83 (A) and α = 0.79 (B). The highest reliabilities, found for the criteria ‘preparation edge’, ‘surface’, ‘width & depth’ and ‘overall grade’, were reasonable in both assessment modes, with α > 0.7. The inter-rater correlation, at 0.43 < r < 0.74, was on average higher in assessment mode A than in mode B, where it ranged from 0.35 < r < 0.60.
Conclusion: The current examination shows an average reliability in assessment mode A that exceeds the requirements for practical examinations (α ≥ 0.6) and also fulfils the general requirement for ‘high-stakes’ examinations of α ≥ 0.8.
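For readers unfamiliar with the reliability coefficient quoted above, the short R sketch below shows how a rater-based Cronbach's alpha and pairwise inter-rater (Pearson) correlations can be computed. The six-cavity rating matrix is invented purely for illustration and is not the study data; whether the authors computed alpha exactly this way is an assumption.

# Cronbach's alpha over the four raters:
# k/(k-1) * (1 - sum of per-rater variances / variance of the summed scores)
cronbach_alpha <- function(x) {
  k <- ncol(x)                        # number of raters
  item_var  <- sum(apply(x, 2, var))  # sum of per-rater variances
  total_var <- var(rowSums(x))        # variance of the summed scores
  k / (k - 1) * (1 - item_var / total_var)
}

# Hypothetical ratings: rows = cavities, columns = the four raters (grades 1-5).
ratings <- data.frame(
  r1 = c(2, 3, 1, 4, 2, 5),
  r2 = c(2, 4, 2, 4, 3, 5),
  r3 = c(1, 3, 2, 5, 2, 4),
  r4 = c(2, 3, 1, 4, 3, 5)
)

cronbach_alpha(ratings)               # compare against the α ≥ 0.6 / α ≥ 0.8 thresholds
cor(ratings, method = "pearson")      # pairwise inter-rater correlations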




Article Details

Wälter, A., Möltner, A., Böckers, A., Rüttermann, S., & Gerhardt-Szép, S. (2018). Video-based assessment of practical operative skills for undergraduate dental students. Trends in Computer Science and Information Technology, 3(1), 005–014. https://doi.org/10.17352/tcsit.000007
Research Articles

Copyright (c) 2018 Wälter A, et al.


This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Licensing and protecting author rights are the central aim and core of the publishing business. Peertechz dedicates itself to making it easier for people to share and build upon the work of others while remaining consistent with the rules of copyright. Peertechz licensing terms are formulated to facilitate the reuse of manuscripts published in its journals, to take maximum advantage of Open Access publication, and to disseminate knowledge.

We support 'libre' open access, which defines Open Access in its true sense as free-of-charge online access together with usage rights. The usage rights are granted through the use of specific Creative Commons licenses.

Peertechz complies with [CC BY 4.0].

Explanation

'CC' stands for Creative Commons license. 'BY' signifies that users must give attribution to the creator when the published manuscripts are used or shared. This license allows for redistribution, commercial and non-commercial, as long as the work is passed along unchanged and in whole, with credit to the author.

Please note that Creative Commons user licenses are non-revocable. We recommend that authors check whether their funding body requires a specific license.

Under this license, after publishing with Peertechz, authors may share their research by posting a free draft copy of their article to any repository or website.
The 'CC BY' license grants the following permissions:

License Name: CC BY 4.0
Permission to read and download: Yes
Permission to display in a repository: Yes
Permission to translate: Yes
Commercial uses of manuscript: Yes

Authors should note that the Creative Commons license is focused on making creative works available for discovery and reuse. Creative Commons licenses provide an alternative to standard copyright, allowing authors to specify the ways in which their works can be used without having to grant permission for each individual request. Those who want to reserve all of their rights under copyright law should not use CC licenses.
