European Journal of Educational Research

EU-JER is a peer-reviewed, online academic research journal.


Publisher (HQ)

Eurasian Society of Educational Research
Christiaan Huygensstraat 44, 7533 XB Enschede, The Netherlands
Research Article

Implementation of Four-Tier Multiple-Choice Instruments Based on the Partial Credit Model in Evaluating Students’ Learning Progress

Lukman Abdul Rauf Laliyo , Syukrul Hamdi , Masrid Pikoli , Romario Abdullah , Citra Panigoro

One of the issues that hinder students’ learning progress is their inability to construct an epistemological explanation of a scientific phenomenon. A four-tier multiple-choice (hereinafter, 4TMC) instrument and the Partial Credit Model were employed to elaborate the diagnosis of this problem. This study aimed to develop and implement a 4TMC instrument with the Partial Credit Model to evaluate students’ learning progress in explaining the concept of change of state of matter. The research applied a development design based on Wilson’s test development model. Data were obtained through the development and validation of 20 4TMC items administered to 427 students. Each item combined a diagnostic-summative assessment with a certainty of response index. Students’ levels of conceptual understanding were categorized from the combinations of their answer choices, and the measurements were analyzed with the Partial Credit Model, a one-parameter logistic (1PL) model. Differences across grade levels were examined using one-way analysis of variance (ANOVA). The study yielded 20 valid and reliable 4TMC items. The results revealed that the integration of the 4TMC test and the Partial Credit Model was effective as an instrument for measuring students’ learning progress. The one-way ANOVA indicated differences in students’ competence across academic levels. Moreover, low-ability students showed slow progress due to lack of knowledge as well as misconceptions in explaining the concept of change of state of matter. Overall, such diagnostic information is necessary for teachers in the prospective development of learning strategies and the evaluation of science learning.
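The measurement model named in the abstract can be stated explicitly. In the standard formulation of the Partial Credit Model (the abstract's exact parameterization is not reproduced on this page), the probability that a student with ability θ scores in category x of a polytomous item i with categories 0, …, m_i and step difficulties δ_ij is:

```latex
% Partial Credit Model: probability of scoring in category x of item i
% for a student with ability \theta, given step difficulties \delta_{ij}.
% The empty sum (h = 0 or x = 0) is defined to be 0.
P(X_i = x \mid \theta) \;=\;
  \frac{\exp\!\left[\sum_{j=1}^{x} \left(\theta - \delta_{ij}\right)\right]}
       {\sum_{h=0}^{m_i} \exp\!\left[\sum_{j=1}^{h} \left(\theta - \delta_{ij}\right)\right]}
```

With only two categories (m_i = 1) this reduces to the dichotomous 1PL Rasch model, which is why the abstract describes the analysis as a 1PL Partial Credit Model.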

Keywords: Learning progress, four-tier, change of state of matter, partial-credit model.
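The two analytic steps the abstract describes — collapsing a four-tier response (answer, answer confidence, reason, reason confidence) into a partial-credit category, then comparing total scores across grade levels with one-way ANOVA — can be sketched as follows. The scoring rubric and the group data below are hypothetical illustrations, not the paper's actual rubric or dataset:

```python
from statistics import mean

def score_4tmc(answer_ok, answer_sure, reason_ok, reason_sure):
    """Assign a partial-credit score (0-3) to one four-tier item.

    Hypothetical rubric: full credit requires a correct answer and a
    correct reason, both held with confidence; a confident wrong
    response signals a misconception rather than mere guessing.
    """
    if answer_ok and reason_ok and answer_sure and reason_sure:
        return 3  # sound understanding
    if answer_ok and reason_ok:
        return 2  # correct, but lacking confidence
    if answer_ok or reason_ok:
        return 1  # partial understanding
    return 0      # wrong: misconception if confident, else lack of knowledge

def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: between-group vs. within-group variance."""
    all_scores = [x for g in groups for x in g]
    grand = mean(all_scores)
    k, n = len(groups), len(all_scores)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Made-up total scores for three grade-level groups
grade_7 = [12, 15, 11, 14, 13, 10, 16]
grade_8 = [18, 20, 17, 19, 21, 16, 22]
grade_9 = [25, 27, 24, 26, 28, 23, 29]

f_value = one_way_anova_f(grade_7, grade_8, grade_9)
print(f"F = {f_value:.2f}")
```

A large F relative to the critical value for (k−1, n−k) degrees of freedom would, as in the study, indicate competence differences across academic levels.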

Article metrics: 425 views · 1,018 downloads · 6 Crossref citations · 5 Scopus citations.
References

Aktan, D. C. (2013). Investigation of students’ intermediate conceptual understanding levels: The case of direct current electricity concepts. European Journal of Physics, 34(1), 33–43. https://doi.org/10.1088/0143-0807/34/1/33

Arslan, H. O., Cigdemoglu, C., & Moseley, C. (2012). A three-tier diagnostic test to assess pre-service teachers’ misconceptions about global warming, greenhouse effect, ozone layer depletion, and acid rain. International Journal of Science Education, 34(11), 1667–1686. https://doi.org/10.1080/09500693.2012.680618

Bond, T. G., & Fox, C. M. (2007). Applying the Rasch Model: Fundamental measurement in the human sciences (2nd ed.). Routledge.

Caleon, I. S., & Subramaniam, R. (2010). Do students know what they know and what they don’t know? Using a four-tier diagnostic test to assess the nature of students’ alternative conceptions. Research in Science Education, 40(3), 313–337. https://doi.org/10.1007/s11165-009-9122-4

Chandrasegaran, A. L., Treagust, D. F., & Mocerino, M. (2007). The development of a two-tier multiple-choice diagnostic instrument for evaluating secondary school students’ ability to describe and explain chemical reactions using multiple levels of representation. Chemistry Education Research and Practice, 8(3), 293–307.

Chi, S., Wang, Z., Luo, M., Yang, Y., & Huang, M. (2018). Student progression on chemical symbol representation abilities at different grade levels (Grades 10–12) across gender. Chemistry Education Research and Practice, 19(4), 1055–1064. https://doi.org/10.1039/c8rp00010g

Claesgens, J., Scalise, K., Wilson, M., & Stacy, A. (2009). Mapping student understanding in chemistry: The perspectives of chemists. Science Education, 93(1), 56–85. https://doi.org/10.1002/sce.20292

Djiks, M. A., Brummer, L., & Kostons, D. (2018). The anonymous reviewer: The relationship between perceived expertise and the perceptions of peer feedback in higher education. Assessment & Evaluation in Higher Education, 43(8), 1258–1271. https://doi.org/10.1080/02602938.2018.1447645

Duncan, R. G., & Hmelo-Silver, C. E. (2009). Learning progressions: Aligning curriculum, instruction, and assessment. Journal of Research in Science Teaching, 46(6), 606–609. https://doi.org/10.1002/tea.20316

Duschl, R., Maeng, S., & Sezen, A. (2011). Learning progressions and teaching sequences: A review and analysis. Studies in Science Education, 47(2), 123–182. https://doi.org/10.1080/03057267.2011.604476

Emden, M., Weber, K., & Sumfleth, E. (2018). Evaluating a learning progression on “Transformation of Matter” on the lower secondary level. Chemistry Education Research and Practice, 19(4), 1096–1116. https://doi.org/10.1039/c8rp00137e

Goulas, S., & Megalokonomou, R. (2021). Knowing who you actually are: The effect of feedback on short- and longer-term outcomes. Journal of Economic Behavior & Organization, 183, 589–615. https://doi.org/10.1016/j.jebo.2021.01.013

Guo, M., & Leung, F. K. S. (2021). Achievement goal orientations, learning strategies, and mathematics achievement: A comparison of Chinese Miao and Han students. Psychology in the Schools, 58(1), 107–123. https://doi.org/10.1002/pits.22424

Haarala-Muhonen, A., Ruohoniemi, M., Parpala, A., Komulainen, E., & Lindblom-Ylänne, S. (2016). How do the different study profiles of first-year students predict their study success, study progress and the completion of degrees? Higher Education, 74(6), 949–962. https://doi.org/10.1007/s10734-016-0087-8

Habiddin, & Page, E. M. (2019). Development and validation of a four-tier diagnostic instrument for chemical kinetics (FTDICK). Indonesian Journal of Chemistry, 19(3), 720–736. https://doi.org/10.22146/ijc.39218

Hadenfeldt, J. C., Bernholt, S., Liu, X., Neumann, K., & Parchmann, I. (2013). Using ordered multiple-choice items to assess students’ understanding of the structure and composition of matter. Journal of Chemical Education, 90(12), 1602–1608. https://doi.org/10.1021/ed3006192

Hasan, S., Bagayoko, D., & Kelley, E. L. (1999). Misconceptions and the certainty of response index (CRI). Physics Education, 34(5), 294–299. https://doi.org/10.1088/0031-9120/34/5/304

Herrmann-Abell, C. F., & Deboer, G. E. (2016). Using Rasch modeling and option probability curves to diagnose students’ misconceptions. American Educational Research Association, 8(12), 1–12.

Hoe, K. Y., & Subramaniam, R. (2016). On the prevalence of alternative conceptions on acid-base chemistry among secondary students: Insights from cognitive and confidence measures. Chemistry Education Research and Practice, 17(2), 263–282. https://doi.org/10.1039/c5rp00146c

Jin, H., Mikeska, J. N., Hokayem, H., & Mavronikolas, E. (2019). Toward coherence in curriculum, instruction, and assessment: A review of learning progression literature. Science Education, 103(5), 1206–1234. https://doi.org/10.1002/sce.21525

Karagiannopoulou, E., Milienos, F. S., & Rentzios, C. (2020). Grouping learning approaches and emotional factors to predict students’ academic progress. International Journal of School & Educational Psychology, 9(1), 1–18. https://doi.org/10.1080/2168363.2020.183241

Klassen, S. (2006). Contextual assessment in science education: Background, issues, and policy. Science Education, 90(5), 820–851. https://doi.org/10.1002/sce.20150

Latifi, S., Noroozi, O., & Talaee, E. (2021). Peer feedback or peer feedforward? Enhancing students’ argumentative peer learning processes and outcomes. British Journal of Educational Technology, 52(2), 768-784. https://doi.org/10.1111/bjet.13054

Lee, K., & Keller, J. M. (2021). Use of the ARCS model in education: A literature review. Computers & Education, 122(1), 54-62. https://doi.org/10.1016/j.compedu.2018.03.019

Lin, P. Y., Chai, C. S., Jong, M. S. Y., Dai, Y., Guo, Y., & Qin, J. (2021). Modeling the structural relationship among primary students’ motivation to learn artificial intelligence. Computers and Education: Artificial Intelligence, 2(1), 1–7.

Laliyo, Botutihe, & Panigoro. (2019). The development of two-tier instrument based on distractor to assess conceptual understanding level and student misconceptions in explaining redox reactions. International Journal of Learning, Teaching and Educational Research, 18(9), 216–237. https://doi.org/10.26803/ijlter.18.9.12

Linacre, J. M. (2012). A user’s guide to WINSTEPS® MINISTEP Rasch-model computer program: Program manual 3.75.0. Winsteps.com.

Linacre, J. M. (2020). A user’s guide to WINSTEPS® MINISTEP Rasch-model computer programs: Program manual 4.5.1. Winsteps.com.

Ling Lee, W., Chinna, K., & Sumintono, B. (2020). Psychometrics assessment of HeartQoL questionnaire: A Rasch analysis. European Journal of Preventive Cardiology. Advance online publication. https://doi.org/10.1177/2047487320902322

Liu, X. (2012). Developing measurement instruments for science education research. In B. Fraser, K. G. Tobin, & C. J. McRobbie (Eds.), Second international handbook of science education (pp. 651–665). Springer Netherlands.

Lu, S., & Bi, H. (2016). Development of a measurement instrument to assess students’ electrolyte conceptual understanding. Chemistry Education Research and Practice, 17(4), 1030–1040. https://doi.org/10.1039/c6rp00137h

Morell, L., Collier, T., Black, P., & Wilson, M. (2017). A construct-modeling approach to develop a learning progression of how students understand the structure of matter. Journal of Research in Science Teaching, 54(8), 1024–1048. https://doi.org/10.1002/tea.21397

Neumann, K., Viering, T., Boone, W. J., & Fischer, H. E. (2013). Towards a learning progression of energy. Journal of Research in Science Teaching, 50(2), 162–188. https://doi.org/10.1002/tea.21061

Park, M., Liu, X., & Waight, N. (2017). Development of the connected chemistry as formative assessment pedagogy for high school chemistry teaching. Journal of Chemical Education, 94(3), 273–281. https://doi.org/10.1021/acs.jchemed.6b00299

Peterson, R. F., Treagust, D. F., & Garnett, P. (1989). Development and application of a diagnostic instrument to evaluate grade‐11 and ‐12 students’ concepts of covalent bonding and structure following a course of instruction. Journal of Research in Science Teaching, 26(4), 301–314. https://doi.org/10.1002/tea.3660260404

Rogat, A. (2011). Developing learning progressions in support of the new science standards: A RAPID workshop series. CPRE Research Reports. http://repository.upenn.edu/cpre_researchreports/66

Smith, C. L., Wiser, M., Anderson, C. W., & Krajcik, J. (2006). Implications of research on children’s learning for standards and assessment: A proposed learning progression for matter and the atomic-molecular theory. Measurement: Interdisciplinary Research & Perspective, 4(1–2), 1–98. https://doi.org/10.1080/15366367.2006.9678570

Sumintono, B., & Widhiarso, W. (2014). Aplikasi model Rasch untuk penelitian ilmu-ilmu sosial [Application of Rasch model in social science research]. Trim Komunikata.

Sutiani, A., Situmorang, M., & Silalahi, A. (2021). Implementation of an inquiry learning model with science literacy to improve student critical thinking skills. International Journal of Instruction, 14(2), 117–138.

Sreenivasulu, B., & Subramaniam, R. (2013). University students’ understanding of chemical thermodynamics. International Journal of Science Education, 35(4), 601–635.

Testa, I., Capasso, G., Colantonio, A., Galano, S., Marzoli, I., Scotti di Uccio, U., Trani, F., & Zappia, A. (2019). Development and validation of a university students’ progression in learning quantum mechanics through exploratory factor analysis and Rasch analysis. International Journal of Science Education, 41(3), 388–417. https://doi.org/10.1080/09500693.2018.1556414

Treagust, D. F. (1988). Development and use of diagnostic tests to evaluate students’ misconceptions in science. International Journal of Science Education, 10(2), 159–169. https://doi.org/10.1080/0950069880100204

Tyson, L., Treagust, D. F., & Bucat, R. B. (1999). The complexity of teaching and learning chemical equilibrium. Journal of Chemical Education, 76(2–4), 554–558. https://doi.org/10.1021/ed077p1560.1

Wilson, M. (2005). Constructing measures: An item response modeling approach. Lawrence Erlbaum Associates, Inc. https://doi.org/10.4324/9781410611697

Wilson, M. (2008). Cognitive diagnosis using item response models. Zeitschrift für Psychologie / Journal of Psychology, 216(2), 74–88. https://doi.org/10.1027/0044-3409.216.2.74

Wilson, M. (2009). Measuring progressions: Assessment structures underlying a learning progression. Journal of Research in Science Teaching, 46(6), 716–730. https://doi.org/10.1002/tea.20318

Wilson, M. (2012). Responding to a challenge that learning progressions pose to measurement practice. In A. C. Alonzo & A. W. Gotwals (Eds.), Learning progression in science (pp. 317–344). Sense Publishers. https://doi.org/10.1007/978-94-6091-824-7

...