<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet type="text/css" href="https://imcjms.com/public/assets/rss.css" ?><rss version="2.0">
<channel>
    <title>IMC Journal of Medical Science</title>
    <link>https://imcjms.com/public</link>
    <description>Ibrahim Medical College Journal of Medical Science</description>

                        <item>
                <title><![CDATA[Evaluation of structured oral examination format used in the assessment of undergraduate medical course (MBBS) of the University of Dhaka]]></title>

                                    <author><![CDATA[Md Shah Alam]]></author>
                                    <author><![CDATA[Tahmina Begum]]></author>
                
                <link data-url="https://imcjms.com/public/registration/journal_full_text/71">
    https://imcjms.com/public/registration/journal_full_text/71
</link>
                <pubDate>Tue, 02 Aug 2016 12:10:29 +0000</pubDate>
                <category><![CDATA[Original Article]]></category>
                <comments><![CDATA[Ibrahim Med. Coll. J. 2015; 9(1): 1-10]]></comments>
<description>The objective of this cross-sectional descriptive study was to critically evaluate the current status of the structured oral examination (SOE) format as practiced in the professional examinations of the undergraduate medical course (MBBS), together with the views of the faculties regarding the concept of SOE as an assessment tool.
Analysis of the questions revealed that the majority (97%) were of the recall type; only 3% were of the interpretation and problem-solving types. The questions for 119 (97%) examinees did not address 10%-50% of the content area. About 38% of examiners responded that they had no clear idea regarding learning objectives, and none had any idea regarding the test blueprint. The examiners marked the domain of learning measured by SOE in favor of cognitive skill (61%), communication skill (38.5%), motor skill (11.5%), and behavior and attitude (19%). No examiner prepared model answers to the SOE questions by consensus with other examiners; although more than 80% of examiners agreed with the statement that pre-selection of accepted model answers is an important element for the success of SOE, no examiners of any SOE board practiced it. Similarly, none of the examiners of the SOE boards kept records of the individual questions and the examinees' answers. No board maintained equal time for each candidate during the SOE by using a timer or stopwatch. The examiners of 8 boards (44%) did not use the recommended rating scale to score individual responses of examinees but rather scored in the traditional consolidated way at the end of the candidate's examination. The majority (94%) of boards scored prompted answers and allowed another question when a candidate failed to answer. During the conduct of the SOE, 22% of examiners were absent from the board for a prolonged period and 3% were engaged in marking written scripts. About 56% of the examiners arrived later than the scheduled time. The behavior of 14% of examiners was abusive toward the candidates.
Introduction
The goal of assessment is to provide direction and motivation for future learning, and to protect the public by upholding high professional standards and screening out trainees and physicians who are incompetent. Learning abilities must be assessed in multiple modes and contexts. Educational content is the stimulus for learning and also provides a context in which to demonstrate one's ability. Any instrument used to assess different learning outcomes should have four attributes: validity, reliability, objectivity and practicality. The preferred learning style may be modified depending on the student's perception of the task and motivation towards it.2 Students' learning is influenced greatly by the assessment method used.3
Cognitive ability is assessed by written examinations such as essay questions, modified essay questions, short answer questions (SAQ) and MCQs, while skills are assessed by practical demonstration (OSPE/OSCE). The oral examination is still used in all subject-centered medical curricula. Compared to essay questions, it is considered to probe more deeply a student's ability to think and to express more or less clearly his knowledge of the isolated facts, or groups of facts, that he ought to remember. The measurement of reasoning and deductive processes, problem-solving skill, capacity to defend a decision, evaluation of competing choices and ability to prioritize still makes the oral examination a popular tool in summative assessment. The oral examination has unique characteristics: face-to-face interaction, flexibility to concentrate on one area, and exploration of the student's viewpoints.
In view of its shortcomings, the oral examination should only be used to test the qualities that cannot be assessed by other methods of evaluation. The qualities needed as a medical professional include alertness, confidence, decisiveness and the ability to discuss logically.
In the unstructured oral examination, the examinees are liable to be asked whatever the examiner chooses, and there is a risk that the examiner may concentrate on his pet interests.17 Assessment of medical students using the traditional oral system has been marred by being highly subjective, non-structured and biased; it has therefore been suggested that the traditional oral examination be replaced by the ViPSCE for testing knowledge, problem solving and management abilities.18
In the 2002 undergraduate medical curriculum of Bangladesh, the assessment system was extensively modified. In this new curriculum the written examination format was changed to SAQ and MCQ, with 10% of the marks contributed by formative assessment. The traditional practical and oral assessments were modified to OSPE/OSCE and SOE. The curriculum recommends that, in constructing the questions for the SOE, the proportions of recall, interpretative and problem-solving questions should be 50-60%, 20% and 10-20%, respectively. Questions should be constructed by the examiners and typed on cards, and each candidate should pick cards randomly from a box. Two boards, each consisting of four examiners, should conduct the examination. Each candidate is allotted fifteen minutes to answer.
Methods
This cross-sectional descriptive study was carried out over a one-year period from July 2007 to June 2008. The study was carried out in nine medical college centers under the University of Dhaka during the SOE in forensic medicine. Forensic medicine was chosen for evaluation as a reference and representative discipline among the eleven other subjects of the MBBS course.
Prior permission to observe was taken from the appropriate authority. The conveners of the examinations were informed of the intention to observe the sessions. The examiners were assured that there would be no interference other than observation, documentation and collection of the questions asked to each candidate.
Analysis was done using descriptive statistics.

Results
Characterization of a total of 2544 SOE questions asked to 123 examinees, in relation to their learning hierarchy and content area in forensic medicine, showed that the majority (97%) of the questions were of the recall type, with very negligible numbers of the interpretation (2%) and problem-solving (1%) types (Table 2).
Table-2: Distribution of the SOE questions by their content area in Forensic Medicine (n=2544)
The highest percentage of questions was from Forensic Pathology (24%), followed by Forensic Toxicology (22.5%), Forensic Gynecology (12%) and Forensic Thanatology (11%). About 7% of the questions were from Introduction and Legal Procedure and 6% from Medical Ethics. The detailed distribution is given in Table 2.
Fig-1: Distribution of the candidates by the SOE questions they were asked from the total content areas of Forensic Medicine (n=123)
A significant number of examiners (10; 38%) mentioned that they had no idea regarding learning objectives.
None of the examiners of the 18 boards prepared model answers, or even recorded the questions asked to the candidates and their answers. The examiners of 10 boards (56%) scored every answer using rating scales, but those of 8 boards (44%) scored in the traditional consolidated way at the end, when the candidates had completed answering. The examiners of the majority of boards (94%) scored prompted answers, and an equal number shifted to another question when candidates failed to answer. Equal time for each candidate was not maintained by stopwatch in any of the 18 boards (Table 4).
Table-4: Distribution of SOE boards by their procedure of conduction by the examiners
Other aspects of the examiners' behavior during the conduct of the SOE revealed that 6-28% were involved in other activities during the examination procedure (Table 5). Only 28% of internal examiners arrived one hour before the scheduled time. Both internal and external examiners talked over cell phones and ate food during the SOE (Table 5).
Table-5: Distribution of examiner behavior (atmosphere of the SOE) (n=36)
The question on the examiners' opinion regarding the domain of learning outcome measured by the SOE was of the multiple-response type; hence 11 examiners marked more than one area. The opinions were 61% in favor of cognitive skill, 38.5% in favor of communication skill, 11.5% in favor of motor skill and 19% in favor of behavior and attitude. Only 11.5% of teachers had no idea regarding measurement of the learning domain (Table 7).
Table-7: Teachers' opinion on the domain of learning outcome they want to measure by SOE (n=26, multiple responses)
About 31% of examiners agreed that a test blueprint provides ground rules for the construction of questions across the learning hierarchy, while 62% could not decide on any one of the options (Table 8).
Table-8: Distribution of teachers by their opinion about the test blueprint and model answers for the SOE (n=26)
The highest net priority score (149) was in favor of advance construction of the SOE questions, followed by advance preparation of model answers (127). The next priority scores were for a non-threatening environment (106), use of a rating scale (95) and equal time for each candidate (93). The priority score for recording of questions and answers was negative (-36) (Table 9).
Table-9: Distribution of the elements of SOE by their net priority score
Table-10: Distribution of the respondents by their identified advantages of SOE
Discussion
The low taxonomic level of questions (recall of factual knowledge rather than problem solving) found in this study indicates that the students were adopting a surface approach to learning. A study of the learning styles of medical students found high scores on reproducing orientation, evidence of a surface approach to learning.21 The preferred learning style may be modified depending on the student's perception of the task and motivation towards it. Student learning is influenced greatly by the assessment method used.3 Assessment strategies that focus predominantly on recall of knowledge will likely promote superficial learning, whereas assessment strategies that demand critical thinking or creative problem solving will promote higher levels of student performance or achievement. Higher education institutions have been responding to a growing concern about the adequacy of professional and career preparation by specifying the outcomes or abilities critical for future professional performance. Recent developments in assessment methodology have focused on performance assessment, and good assessment can help students become more effective self-directed learners.4
Mainly internal examiners prepared the questions for the current SOE sessions. This finding suggests reluctance on the part of the examiners of the SOE boards to shoulder the responsibility. Wide variations in content and hierarchy discrimination could be avoided if the questions were constructed with the aid of a previously prepared test specification (blueprint), framing all 10 questions on a card and distributing the learning hierarchy and core content across all topics. Only 15% of examiners responded that they used a test blueprint in the construction of questions. Interestingly, even those examiners did not have proper knowledge about the test blueprint. All assessments should ensure that they are appropriate for the learning objectives (knowledge, skills and attitudes) being tested. A conceptual framework against which to plan assessment is essential, and it is the test blueprint that provides a representative sample of instructionally relevant tasks. The test blueprint helps to achieve validity of content, response and consequence evidence.
Scoring of students' responses demands marking of every answer on a rating scale. Scoring in the traditional way at the end introduces subjectivity and bias. Examiners of 44% of SOE boards did not use a rating scale to score individual responses but rather scored in the traditional consolidated way at the end. It is quite impossible for an examiner to remember all the responses after a prolonged time without subjective bias; therefore, scoring all the answers traditionally at the end definitely invites bias in scoring.22 The factors that influence rating are the errors of leniency, central tendency and the halo effect, which should be avoided in rating scale construction and use.23 In this study, however, no examiners prepared any rating scale, and a significant portion of examiners scored traditionally at the end.
The respondents prioritized the advance preparation of model answers, in consultation with other examiners, as the second highest essential element of SOE, and the majority of examiners (80%) agreed that pre-selection of model answers was important for the success of SOE. The net priority score for preparation of accepted model answers was 127. But none of the examiners prepared any model answer to the questions. Specified answers and a specific marking scheme in an SOE for surgical residents in Canada produced an overall reliability of 0.75.25 Criteria for answers can provide the examiner with clear guidelines on what is and is not an acceptable answer to the question.
Activities such as talking on mobile phones, eating food, staying outside for prolonged periods and marking scripts during the conduct of the SOE are unsuitable for the establishment of a cordial environment and unbiased scoring. The majority of external examiners arrived later than the university's scheduled time. Examiners should arrive one hour before the scheduled starting time of the SOE for selection of questions and for preparation, by consensus, of accepted answers and the rating scale. However, it was interesting to note that all the examiners prioritized the establishment of a non-threatening environment as an essential element of SOE.
Education is a process whose chief goal is to bring about change in human behavior. This behavior is explicitly defined in the form of educational objectives, which are the guiding principles for planning educational activities and assessment. A significant percentage of examiners (38%) had no clear idea regarding learning objectives. This lack of knowledge regarding learning objectives indicates a basic defect in overcoming the barriers to an effective medical education system. Definition of educational objectives is an essential step before choosing a teaching method and a system of evaluation. In the present study, about 12% of respondents could not decide which learning domain is measured by SOE, 12% marked motor skill, 19% responded in favor of attitude and 10% in favor of the behavioral domain of learning. These findings suggest that a significant number of examiners did not perceive the elements measured by SOE, and they indicate an urgent need for faculty development training.
The elements of SOE were not properly followed during the assessment of students in forensic medicine. Without using a test blueprint in the construction and framing of questions, it is quite impossible to assess the candidate's learning hierarchy and coverage of essential content. Advance preparation of accepted model answers, though essential for scoring without bias, was not practiced by the examiners of any board. The medical colleges were selected purposively, and therefore all medical colleges could not be included. The examiners of different subjects could not be interviewed, and all SOE boards could not be observed. Moreover, the reasons for not implementing or following the attributes of SOE were not explored. The study was done only in forensic medicine, but similar situations may exist in other subjects as well. The study revealed that the SOE, introduced as an assessment tool in the undergraduate medical curriculum, was not properly implemented and its desired objectives were not fully achieved.
Recommendation
References
1. Joughin G. Dimensions of oral assessment and student approaches to learning. In: Brown S, Glasner A, eds. Assessment Matters. Buckingham: The Society for Research into Higher Education and Open University Press; 1999: 146-156.
3. Wood DF. ABC of learning and teaching in medicine: problem based learning. British Medical Journal 2003; 326: 328-330.
5. Bull GM. Examinations. Journal of Medical Education 1959; 34: 1154-1158.
7. Swanson DB. A measurement framework for performance-based tests. In: Hart IR, Harden RM, eds. Further Developments in Assessing Clinical Competence. Montreal: CanHeal; 1987.
9. Colton T, Paterson OL. An assay of medical students' ability by oral examination. Journal of Medical Education 1967; 42: 1005-1014.
11. Newble DI, Hoare J, Elmslie RG. The validity and reliability of a new examination of the clinical competence of medical students. Medical Education 1981; 15: 46-52.
13. Ferdousi S, Latif SA, Ahmed MM, Nessa A. Summative assessment of undergraduate medical students' performance in physiology by structured oral examination. Mymensingh Medical Journal 2007; 16(1): 64-69.
15. Kearney RA, Puchalski SA, Yang HYH, Skakun EN. The inter-rater and intra-rater reliability of a new Canadian oral examination format in anaesthesia is fair to good. Canadian Journal of Anaesthesia 2002; 49: 232-236.
17. Oyebode F, George F, Math V, Haque S. Inter-examiner reliability of the clinical parts of the MRCPsych Part II examination. Psychiatric Bulletin 2007; 31: 342-344.
19. van der Vleuten CPM, Scherpbier AJJA, Dolmans DHJ, Schuwirth LWT, Verwijnen GM, Wolfhagen HAP. Clerkship assessment assessed. Medical Teacher 2000; 22(6): 592-600.
21. Newble DI, Gordon MI. The learning style of medical students. Medical Education 1985; 19: 3-8.
23. Guilbert JJ. Comparison of advantages and disadvantages of different types of test. In: Educational Handbook for Health Personnel. Geneva: WHO; 1977: 416.
</description>

            </item>
            
    <copyright>2026 Ibrahim Medical College. All rights reserved.</copyright>
</channel>
</rss>
