European Journal of Educational Research

EU-JER is a peer-reviewed, online academic research journal.


Publisher (HQ)

Eurasian Society of Educational Research
7321 Parkway Drive South, Hanover, MD 21076, USA

Measurement of Students' Chemistry Practicum Skills Using Many Facets Rasch Model

Melly Elvira, Heri Retnawati, Eli Rohaeti, Syamsir Sainuddin


Accurately assessing process and product capabilities in chemistry practicum activities requires appropriate measurement procedures. It is crucial to identify the components that can introduce bias when measuring student abilities. This study aims to identify the components or criteria teachers use to assess student performance in practicum activities and to analyze the quality of the rubrics developed. The study involved three raters, 27 high school students, and nine assessment criteria. A quantitative descriptive approach was employed, using many-facet Rasch model (MFRM) analysis for measurement. The MFRM results show no significant measurement bias, with all measurement facets fitting the model. The reliability of all facets meets the criteria, and the scale predictor functions appropriately. All students could easily pass four of the nine items, while the remaining five items were only partially passed. The assessment criteria requiring special attention include communication skills, tools and assembly, interpretation, cleanliness, and accuracy during practicums. These criteria provide feedback to teachers and students to help ensure successful practicum activities. The Discussion section examines the findings and their implications.

Keywords: Chemistry practicum, MFRM, performance assessment, process assessment, product assessment.
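To illustrate the model underlying the analysis: in a rating-scale many-facet Rasch model, the log-odds of a rating in category k versus k−1 combine person ability, item difficulty, rater severity, and a category threshold. The sketch below is a minimal, hypothetical illustration of that structure in plain Python (the study itself used dedicated MFRM software; all parameter values here are invented for demonstration).

```python
import math

def mfrm_category_probs(theta, delta, alpha, taus):
    """Category probabilities for one person-item-rater encounter under a
    rating-scale MFRM: log(P_k / P_{k-1}) = theta - delta - alpha - tau_k.

    theta: person ability (logits)
    delta: item difficulty (logits)
    alpha: rater severity (logits)
    taus:  m category thresholds, giving categories 0..m
    """
    # Cumulative sums of (theta - delta - alpha - tau_k) give the
    # unnormalised log-probability of each category; category 0 is 0.
    logits = [0.0]
    total = 0.0
    for tau in taus:
        total += theta - delta - alpha - tau
        logits.append(total)
    exps = [math.exp(v) for v in logits]
    z = sum(exps)
    return [e / z for e in exps]

def expected_score(theta, delta, alpha, taus):
    """Model-expected rating for the encounter."""
    probs = mfrm_category_probs(theta, delta, alpha, taus)
    return sum(k * p for k, p in enumerate(probs))

# Illustrative values only: a 4-category rubric item scored by one rater.
probs = mfrm_category_probs(theta=0.5, delta=0.0, alpha=0.2,
                            taus=[-1.0, 0.0, 1.0])
```

Under this formulation, a more severe rater (larger alpha) shifts the expected rating downward for the same student, which is exactly the kind of rater effect MFRM is designed to separate from student ability.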



...