Mogapi, Molefhe (2019) Using Factor Analysis Procedures to Validate Score Reporting Practice of Large Scale Examinations: Establishing the Baseline. Journal of Education, Society and Behavioural Science, 32 (4). pp. 1-10. ISSN 2456-981X
Mogapi3242019JESBS51009.pdf - Published Version
Abstract
Performance of candidates in large-scale examinations is often reported using a composite score that aggregates several components of a subject. The components reflect the fact that subjects are made up of different topics or modalities, each assessed by means of a subset of items. Each subset of items measures a candidate's knowledge of a specific domain. More often than not, however, the construct validity or psychometric independence of each specific domain has not been empirically established, even though the domain has intuitive meaning. Factor analysis can be used to verify that the score reporting practice, as indicated by the number of domains, is supported by the underlying factor structure. In this paper, Social Studies and Science final examination test scores were used as dependent variables to extract underlying dimensions. The covariance matrix for each of the two subjects was submitted to a principal component analysis with Varimax rotation to produce factor loadings. The results indicated a unidimensional factor structure for Social Studies and a three-component model for Science. The findings were used to evaluate the adopted score reporting structure for each of the two subjects.
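The procedure the abstract describes — eigendecomposing a covariance matrix of component scores and applying an orthogonal Varimax rotation to the resulting loadings — can be sketched as follows. This is a minimal numpy illustration of the general technique, not the authors' code; the simulated two-cluster covariance matrix and all variable names are hypothetical and unrelated to the paper's data.

```python
import numpy as np

def pca_loadings(cov, n_components):
    """Principal-component loadings from a covariance matrix:
    eigenvectors scaled by the square root of their eigenvalues."""
    eigvals, eigvecs = np.linalg.eigh(cov)            # ascending order
    order = np.argsort(eigvals)[::-1][:n_components]  # largest first
    return eigvecs[:, order] * np.sqrt(eigvals[order])

def varimax(loadings, max_iter=500, tol=1e-8):
    """Orthogonal Varimax rotation (Kaiser's criterion) via iterative SVD."""
    p, k = loadings.shape
    R = np.eye(k)
    d_old = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - L @ np.diag((L ** 2).sum(axis=0)) / p)
        )
        R = u @ vt
        d_new = s.sum()
        if d_new - d_old < tol:
            break
        d_old = d_new
    return loadings @ R

# Hypothetical data: six component scores driven by two latent factors.
rng = np.random.default_rng(0)
f = rng.normal(size=(200, 2))
scores = np.hstack([f[:, [0]] + 0.3 * rng.normal(size=(200, 3)),
                    f[:, [1]] + 0.3 * rng.normal(size=(200, 3))])
cov = np.cov(scores, rowvar=False)

raw = pca_loadings(cov, n_components=2)
rotated = varimax(raw)
print(np.round(rotated, 2))  # after rotation, each variable loads
                             # predominantly on a single component
```

A reporting structure with two domain scores would be supported here because the rotated solution shows "simple structure": each variable loads heavily on exactly one component. A unidimensional result, as the paper found for Social Studies, would instead show one dominant eigenvalue and argue for a single composite score.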
| Item Type: | Article |
|---|---|
| Subjects: | OA Digital Library > Social Sciences and Humanities |
| Depositing User: | Unnamed user with email support@oadigitallib.org |
| Date Deposited: | 06 Apr 2023 05:38 |
| Last Modified: | 05 Sep 2024 10:58 |
| URI: | http://library.thepustakas.com/id/eprint/896 |