DIFFERENTIAL ITEM FUNCTIONING IN NATIONAL EXAMINATIONS COUNCIL TESTS
Keywords:
Differential Item Functioning, NECO, Tests, and Academic Achievement

Abstract
Educational test results serve as important yardsticks through which society evaluates the outcomes of its educational system. Consequently, it is essential that educational institutions and examination bodies design assessments (tests) that accurately capture the intended characteristics of their examinees. This study examined Differential Item Functioning (DIF) in tests administered by the National Examinations Council (NECO) and assessed its impact on the tests' reliability. The study was guided by two research questions and tested two hypotheses, employing an ex post facto research design. The population comprised all candidates who took the 2023 NECO Mathematics and Economics examinations in Nigeria. Sampling was conducted in two stages: first selecting the subjects and then the participants. Data were collected using the 2023 NECO past questions for these subjects, which had already been validated by NECO's Examinations and Standard Unit in Nigeria. Since NECO provides these standardized instruments, the
researcher did not independently determine their reliability. Candidate responses to the selected test items were obtained directly from NECO for analysis. The research questions were answered using an item response theory parameter logistic model with marginal maximum likelihood estimation, together with means and standard deviations. Hypothesis 1 was tested with Thissen's Likelihood Ratio Test (LRT) for DIF, and Hypothesis 2 was examined with a test of equality of alpha coefficients at the 0.05 significance level. Findings indicated, among others, that on average the NECO Mathematics and Economics test items favored male students more than their female counterparts. The study recommended that NECO conduct DIF analyses on its assessments to ensure the quality and fairness of test items.
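
The sketch below illustrates, under stated assumptions, the two inferential ideas named in the abstract. Because the NECO data layout and the IRT software used in the study are not reported here, the sketch substitutes a logistic-regression DIF screen (Swaminathan and Rogers' approach) for Thissen's IRT-based likelihood ratio test; both compare a compact model with an augmented, group-specific model through a likelihood ratio statistic. The Feldt-type comparison of two alpha coefficients mirrors the test of equality of alpha coefficients used for Hypothesis 2. All variable names (responses, group, item) and the data themselves are hypothetical, not drawn from the NECO files.

# Minimal illustrative sketch only; not the study's actual analysis code.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import chi2, f

def lr_dif_test(responses: pd.DataFrame, group: pd.Series, item: str):
    """Likelihood-ratio DIF screen for one dichotomous item.

    Compact model:   item ~ rest score
    Augmented model: item ~ rest score + group + rest score x group
    """
    y = responses[item].values
    rest = responses.drop(columns=[item]).sum(axis=1).values  # matching score
    g = group.values.astype(float)

    x_compact = sm.add_constant(np.column_stack([rest]))
    x_augment = sm.add_constant(np.column_stack([rest, g, rest * g]))

    ll_c = sm.Logit(y, x_compact).fit(disp=0).llf
    ll_a = sm.Logit(y, x_augment).fit(disp=0).llf

    g2 = -2.0 * (ll_c - ll_a)      # likelihood-ratio statistic
    p = chi2.sf(g2, df=2)          # 2 extra parameters in the augmented model
    return g2, p

def cronbach_alpha(scores: pd.DataFrame) -> float:
    """Cronbach's alpha for a person-by-item score matrix."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

def feldt_alpha_equality(scores_a: pd.DataFrame, scores_b: pd.DataFrame):
    """Feldt-type test of equality of two independent alpha coefficients."""
    a1, a2 = cronbach_alpha(scores_a), cronbach_alpha(scores_b)
    w = (1.0 - a1) / (1.0 - a2)
    df1, df2 = len(scores_a) - 1, len(scores_b) - 1
    p = 2 * min(f.cdf(w, df1, df2), f.sf(w, df1, df2))  # two-sided
    return w, p

In the study's own procedure, the logistic regressions would be replaced by compact and augmented parameter logistic IRT models fitted by marginal maximum likelihood, with the same minus-twice-log-likelihood difference referred to a chi-square distribution.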