Journal directory listing - Journal of Research in Education Sciences, Vol. 63, No. 1 (March 2018)

Developing and Validating a Constructed-Response Assessment of Scientific Abilities: A Case of the Optics Unit
Authors: Hsiao-Hui Lin (Graduate Institute of Science Education, National Taiwan Normal University), Sieh-Hwa Lin (Department of Educational Psychology & Counseling, National Taiwan Normal University), Hsin-Kai Wu (Graduate Institute of Science Education, National Taiwan Normal University)

Vol. & No.: Vol. 63, No. 1
Date: March 2018
Pages: 173-205
DOI: 10.6209/JORIES.2018.63(1).06

Abstract:
This study aimed to develop and validate a constructed-response assessment of scientific abilities (CRASA) and an accompanying scoring rubric. The assessment included 32 open-ended test items categorized into four subscales: remembering and understanding scientific knowledge, application and analysis of scientific procedures, argumentation and expression of scientific logic, and evaluation and innovation during problem solving. The analysis revealed the following results. First, the Cronbach’s α values were higher than .90, indicating high intra-rater consistency. Second, Kendall’s coefficient of concordance was higher than .90 with a p value less than .001, denoting a consistent scoring pattern among raters. In addition, many-facet Rasch measurement (MFRM) analysis revealed no significant difference in rater severity, and the rating scale model (RSM) was compared with the partial credit model (PCM) to examine whether each rater had a unique rating scale structure. The infit and outfit mean squares of the MFRM fell within 1 ± 0.5, suggesting that both severe and lenient raters could effectively distinguish high- and low-ability students. The deviance values estimated by the RSM and the PCM were converted to Bayesian information criterion (BIC) values, and the RSM fit the empirical data better than the PCM; therefore, the raters’ rating scale thresholds were the same. Third, the Cronbach’s α coefficients of the four subassessments and of the full assessment were higher than .85, indicating that the CRASA has high internal-consistency reliability. Finally, confirmatory factor analysis revealed acceptable goodness of fit for the CRASA. These results suggest that the CRASA is a useful tool for accurately measuring scientific abilities.
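As a point of reference for the model comparison described above, the deviance-to-BIC conversion is typically the standard one shown below; the abstract does not spell out the computation, so the symbols (deviance $D$, number of estimated parameters $k$, sample size $N$) are generic rather than taken from the article:

\[ \mathrm{BIC} = D + k \ln N, \qquad \mathrm{BIC}_{\mathrm{RSM}} < \mathrm{BIC}_{\mathrm{PCM}} \;\Rightarrow\; \text{RSM preferred.} \]

Because the RSM estimates a single set of category thresholds shared by all raters, whereas the PCM estimates a separate set for each rater, a smaller BIC for the RSM favors the more parsimonious, common-threshold structure.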

Keywords: confirmatory factor analysis, constructed-response assessment, many-facet Rasch measurement, rater consistency


APA Format: Lin, H.-H., Lin, S.-H., & Wu, H.-K. (2018). Developing and validating a constructed-response assessment of scientific abilities: A case of the optics unit. Journal of Research in Education Sciences, 63(1), 173-205. doi:10.6209/JORIES.2018.63(1).06