Do OSCEs test little more than acting?
A paper published in BioMed Central has examined the validity of Objective Structured Clinical Examinations (OSCEs) for assessing communicative and interactional aspects of clinical performance. The debate review critically considers the evidence for the simulated consultation as a proxy for the real, as performance, as a context for assessing talk, and as potentially disadvantaging candidates trained overseas.
The review suggests that while simulated assessment is ‘realistic’, it differs in crucial ways from the communicative competences found in real-life practice. Candidates who can handle the social and linguistic complexity of the standardised situation score highly, yet what is being assessed is not real communication. If communication skills are assessed purely through simulated patients, the results may not reflect candidates’ real consulting abilities.
The paper then introduces three core sociolinguistic concepts relevant to the assessment of communication in medicine:
- The particular variety of talk in simulated consultations sets it apart from the talk in real consultations
- The notion of ‘frame’ is used to understand how we relate to each other and make our talk real, and how this reality breaks down in institutionally assessed communication
- Micro-features of talk feed constantly into our evaluation of others and, in high-stakes assessments, can have large consequences for the trajectory of an interaction.
My Health Career asked some medical students for their thoughts on OSCEs. Sun Woo Kim, Publications Officer at the New South Wales Medical Students’ Council responded:
“I believe the concerns raised by this article surrounding the nature of simulation as a form of assessment are both important and relevant for consideration in striving to improve our medical teaching programs. Multiple points raised in this paper mirror discussions and experiences shared among our peers, particularly the influence of time pressures in altering the structure of consultation and the underlying need to ‘display’ empathy. Many students have echoed the sentiment that approaching simulated cases in an assessment often feels like a performance, with a checklist of formulaic responses and questions needing to be addressed.
As the article states, it is undoubtedly true that simulated patients provide a great platform for standardised assessment of procedural and communication skills, with little to no risk involved. The concern lies in their ability to ascertain an individual’s capacity to adapt and respond appropriately to situations that are emotionally charged or culturally diverse, or to build rapport and connect with a patient’s needs. Simulated patients and settings are also further detached from real-life clinical situations because they limit collaboration with hospital staff and the use of other resources normally available in a clinical setting. For these reasons I believe that while simulated patients have their place in the assessment of procedural skills, alternatives should be considered for assessing communication with patients and hospital staff within a clinical context.
It is understandably difficult to design a standardised assessment of these skills, which are approached differently by everyone and inevitably interpreted variably by individual assessors based on personal experience. Analysis and discussion such as that presented in this article may help shed light on key issues to address in improving the structure of our medical training and assessment.”
Click here to read the full paper on BioMed Central.
More articles on My Health Career:
- Appreciating Rheumatology – now that’s a phrase you don’t often hear! – by Dr Maxine Szramka
- Things change – by Dr Judith O’Malley-Ford
Image of medical students: Greater Louisville Medical Society – flickr
Image of OSCE logo: Wikimedia Commons