'pisa mathematics assessment framework' Search Results
Evaluating the Results of PISA Assessment: Are There Gaps Between the Teaching of Mathematical Literacy at Schools and in PISA Assessment?
education gaps, mathematical competence, mathematical literacy, pisa assessment...
The problems in education in the countries of the Organisation for Economic Co-operation and Development (OECD) vary from country to country. The differences in PISA assessment results between "upper class" and "lower class" countries have revealed a research gap. The purpose of this study was to (a) test students' mathematical literacy skills on the Programme for International Student Assessment (PISA) test and compare the results with the mean scores across OECD countries; (b) examine the relationship between students' mathematical competence, precision, and self-perception of mathematical literacy skills on the PISA test; and (c) analyze the gaps between the implementation of mathematics instruction in school and the mathematical literacy measured by the PISA test. This study used a mixed-methods approach with an explanatory sequential design. The data collection methods included test procedures, questionnaires, and interviews. The results of this study showed that the overall mean score obtained was below the OECD average. In general, the respondents achieved only Level 2 mathematics proficiency. A significant relationship was found between mathematical competence, precision, and self-perception of mathematical skills. However, a gap was found at the implementation level: the mathematical literacy measured by PISA differed from the mathematical learning achievement measured by teachers in school. The results also showed that teaching that emphasizes only problem-solving procedures contributes to low mathematical competence and does not adequately prepare students for the PISA mathematics test.
Writing PISA-Like Mathematics Items: The Case of Tertiary Mathematics Instructors from a State University in the Philippines
context-based math items, item-writing, mathematical literacy, pisa mathematics assessment framework, pisa-like mathematics items...
Mathematics test items in International Large-Scale Assessments (ILSAs) such as the Programme for International Student Assessment (PISA) and the Trends in International Mathematics and Science Study (TIMSS) are nested in contexts defined in their assessment frameworks (e.g., the Personal, Occupational, Societal, and Scientific contexts in PISA). This study followed the item-writing activities of four tertiary mathematics instructors in the Philippines as they constructed context-based mathematics items. They were tasked to write four items each, following a set of specifications for PISA content and context categories. The data consisted of transcripts from a focus-group discussion conducted days after the task. The transcripts were then analyzed using thematic analysis. The results of this study showed that the phenomenon of item-writing in the context of writing PISA-like mathematics items had two themes: the phases of item-writing and the dimensions of item-writing. Findings showed that the respondents struggled to find realistic contexts and that they engaged in a problem-solving task, likened to solving a puzzle, as they attempted to satisfy the content, context, and process categories in the table of specifications (TOS). This study contributes to filling the research gap on item-writing activities, particularly those of mathematics teachers in the Philippines, a country whose recent mathematical performance in PISA 2018, TIMSS 2019, and PISA 2022 was nothing short of dismal.