There was a mix-up in the secondary schools national exam results, with the grades that were awarded to boys labelled as girls’ scores and vice versa, a review of the exam data by Nation Newsplex reveals.
While releasing the 2016 Kenya Certificate of Secondary Education (KCSE) examination results, Education Cabinet Secretary Dr Fred Matiang’i said girls had performed better than boys.
The claim has since been repeated many times by different stakeholders in the education sector, including the Kenya National Union of Teachers (Knut). But the numbers do not add up.
A review of the table on overall performance by grade and gender that was given to the media by the Kenya National Examination Council (Knec) suggests a switch, with the grades awarded to boys labelled as those for girls, and vice versa.
From the table provided to the media, a total of 571,161 candidates were graded, with 299,268 or 52 per cent being female and 271,893 or 48 per cent being male.
These totals differ from the numbers of candidates who sat for the examination. Knec statistics show that 574,125 candidates sat the examination, of which 273,130 (48 per cent) were female and 300,995 (52 per cent) male.
Going by these numbers, between sitting the exam and grading, the number of boys dropped by 29,102 (10 per cent), while the number of girls increased by 26,138 (nine per cent), changes that could not have gone unnoticed if they actually happened.
Furthermore, the girls who were graded outnumbered those who sat for the exam, which is practically impossible.
Switching around the columns makes much more sense. With this modification, the number of girls graded is only 1,237 less than those who were registered, while the number of boys graded is only 1,727 less than those who were registered.
The difference between the number of students who were graded and those who sat the examination is 2,964 (less than one per cent) and the Cabinet Secretary’s explanation that a few candidates were not graded on account of not sitting all the minimum seven subjects as required could account for it. KCSE candidates were required to sit for a minimum of seven and a maximum of nine subjects.
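The arithmetic behind the label-switch conclusion can be verified directly from the figures quoted above. The short Python check below uses only the numbers reported in this article:

```python
# Figures from the article: candidates who sat vs candidates in the grading table.
sat = {"female": 273_130, "male": 300_995}        # candidates who sat the exam
graded = {"female": 299_268, "male": 271_893}     # grading table as released

# As released, the changes between sitting and grading are implausibly large.
assert sat["male"] - graded["male"] == 29_102      # boys "lost": about 10 per cent
assert graded["female"] - sat["female"] == 26_138  # girls "gained": about 9 per cent

# Swapping the gender labels in the grading table makes the data consistent.
swapped = {"female": graded["male"], "male": graded["female"]}
assert sat["female"] - swapped["female"] == 1_237  # girls not graded
assert sat["male"] - swapped["male"] == 1_727      # boys not graded

# The overall shortfall matches the small number of ungraded candidates.
assert sum(sat.values()) - sum(graded.values()) == 2_964
print("All figures are consistent with a label switch.")
```

Every assertion passes, which is what makes the switch explanation far more plausible than candidate numbers shifting by ten per cent between sitting and grading.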
Last week, Knut claimed there were glaring shortcomings in the examination marking process. Among the complaints was that the step of normalising grades was missing in this year’s exam, a claim given credence by the fact that the results were released just four days after the marking of the examination papers ended.
According to Dr Matiang’i, the marking exercise ended only on December 24. He released the results on December 29.
“Raw marks were graded and the grading system used was not known at all. The same grading system was used for all subjects, Humanities and Sciences. This explains the many A’s in the Humanities and hardly any in English and Science,” a report to the media by Knut states.
“There was no standardisation and moderation done at all. It is also evident that the exams were hurriedly marked and released.”
CLUSTERED AROUND THE MEAN
Dr John Mugo, an education expert at Twaweza East Africa, explains that standardisation and moderation are critical in national examinations like KCSE.
He says moderation, which is done after marking of the examination and is also popularly known as grading, basically looks at the performance of candidates in an examination.
“You come up with the minimum and discuss how to set the pass mark per subject depending on performance,” explains Dr Mugo.
He adds that after moderation, results are standardised, a statistical procedure that shows how candidates are distributed across the grades.
“It is also called normalising and it’s done by a statistical software and it’s clustered around the mean plus or minus the standard deviation,” explains Dr Mugo.
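As an illustration of the standardisation Dr Mugo describes, the sketch below converts a set of hypothetical raw marks into z-scores, that is, distances from the mean measured in standard deviations. The marks are invented for illustration; the exact procedure Knec uses is not public in this detail:

```python
from statistics import mean, stdev

def standardise(raw_marks):
    """Convert raw marks to z-scores (standard deviations from the mean)."""
    mu, sigma = mean(raw_marks), stdev(raw_marks)
    return [(m - mu) / sigma for m in raw_marks]

# Hypothetical raw marks for one subject (illustrative only).
marks = [34, 41, 47, 52, 58, 63, 69, 75, 81, 90]
z = standardise(marks)

# After standardising, the scores cluster around a mean of 0 with a
# standard deviation of 1, so grade boundaries can be set on a common
# scale across subjects with very different raw-mark distributions.
```

This is what allows, say, a hard Physics paper and an easier Humanities paper to be graded comparably, the gap Knut says was skipped this year.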
In KCSE, a procedure known as criterion referencing is used, in which candidates compete against a set standard, which is why moderation is important.
The final event, Dr Mugo said, is called the award ceremony, presided over by the Chief Examiner of every paper. The examiners look at the performance in their papers across the country and then propose a grading system. This is meant to normalise the grades.
Knut appears not to have noticed the anomaly in the data of overall grades, and blamed the reported better scores by girls on the grading system.
20 OUT OF 30 SUBJECTS
More evidence that girls may not have outperformed boys lies in the fact that female candidates performed better than male candidates in only eight out of the 30 subjects examined.
These were Home Science, Christian Religious Education, Art and Design, Electricity, English, Kiswahili, Mathematics Alternative B and Physics.
That would mean male candidates performed better than female candidates in 20 out of the 30 subjects, including Biology and Biology for the blind, Chemistry, Mathematics Alternative A, History, Geography, Business Studies, Computer Studies, Agriculture, Music and languages such as French, German and Arabic.
Other subjects in which boys outshone girls were Aviation Technology, Building Construction, Islamic Religious Education, General Science, Kenya Sign Language, Power Mechanics and Drawing and Design. Female candidates did not register for Wood Work and Metalwork.
How is it possible that boys outsmarted girls across many more subjects but were beaten in the overall grades?
BOYS OUTPERFORMED GIRLS
This is not to say that girls did not make history, if indeed the overall grade distribution by gender was switched.
The real picture is that more girls than boys scored grade A. In the 27-year history of the examination, boys had always attained more As than girls. Female candidates made up 59 per cent of those who scored an A (83 candidates) this year, compared to a third last year.
Boys constituted 41 per cent (48 candidates) of those who made the grade. But that is where the good news for girls ends.
More boys than girls scored C+ and above. Boys made up about 57 per cent (50,415 candidates) of the candidates who scored the university cut-off grade, compared to 43 per cent (38,514 candidates) of girls.
This means the share of boys who got C+ and above (57 per cent) was five percentage points higher than their share of those who sat the examination (52 per cent), while the share of girls who made the grade (43 per cent) was five points lower than their share of those who sat (48 per cent).
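The shares quoted above can be re-derived from the raw counts. A quick check in Python, using only the figures reported in this article:

```python
# C+ and above counts, as reported in the article.
boys_cplus, girls_cplus = 50_415, 38_514
total_cplus = boys_cplus + girls_cplus            # 88,929 candidates in total

boys_share = round(100 * boys_cplus / total_cplus)   # boys' share of C+ and above
girls_share = round(100 * girls_cplus / total_cplus) # girls' share of C+ and above

print(boys_share, girls_share)  # 57 43
```

Against their shares of candidates who sat the exam (52 per cent boys, 48 per cent girls), the gap of roughly five percentage points in each direction follows directly.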
Put another way, boys outperformed girls, as has happened in previous years.
CONVERSION OF MARKS
Newsplex shared its concerns with Knec and asked for clarification from various Education ministry officials for days. While some insisted their data was correct, others promised to get back to us.
Knut says other glaring mistakes included deliberate down-marking of students and posting of raw marks without cross-checking.
“Apart from the errors in the examination questions, there was no verification, validation and poor conversion of marks,” the Knut report states.
Knut recommended that, for fairness and justice to the candidates, the KCSE results be recalled to allow for moderation by the Chief Examiners, in order to ensure credible grades for all 2016 KCSE candidates.
It called for a thorough and comprehensive audit of the whole handling of the 2016 KCPE and KCSE administration, marking and processing of exams. However, Dr Matiang’i insisted the exams would not be remarked.
Speaking to NTV in December after the release of the examination results, Knec Chairman Prof George Magoha said the results were released last month in order to beat cartels who wanted to interfere with them.