What do you want to know about your students' chemical thinking? Categorizing chemistry formative assessments

To teach chemistry effectively, we must ascertain what our students are thinking about chemistry and decide what to do with what we learn. Formative assessment questions provide a useful lens into what students are thinking about chemistry. Let us consider, then, how categorizing formative assessment questions could help us plan our classes more deliberately and design purposeful written formative assessments that align with our curricular goals. 

The members of ACCT presented a ChemEd X Talk on this topic on May 26, 2021. View the recording and access resources: What do we want to know about our students' thinking? Categorizing chemistry formative assessments with ACCT.

Skilled teachers commonly use formative assessment to support their students in learning practices and content in science. When formative assessment is effective, teachers can elicit and support their students' thinking in a manner that helps them make decisions, build scientific explanations, and generate arguments in relevant scientific contexts.1 It is useful for us as teachers to consider how the design of our written formative assessment questions may yield different types of information about the substance of our students' thinking. For example, a formative assessment question that we write may be useful for determining whether students learned the appropriate science content in the previous day's lesson. Alternatively, a formative assessment may open up to us our students' thinking about a subject they had never considered before, which may help us choose new paths in our lesson design and classroom discourse. We the members of ACCT have found it useful to consider formative assessments in light of two qualities: their accessibility to students and how revelatory they are of students' chemical thinking. The four examples provided below vary according to these criteria and highlight the different utility of each category of formative assessment.  

Example 1: Not accessible, not revealing

As a formative assessment, Example 1 could reveal students' content knowledge regarding mole ratios in a chemical reaction and how to interpret a concentration versus time graph. The question probes no further insight into what students believe is happening at the molecular scale or why the reactants form products. A student who did not know the relationship between a change in concentration and the mole ratio in the balanced chemical equation would have no access to this question; it would create a barrier that the student could not overcome. This type of formative assessment question would be good for reviewing for a standardized test that we would predict to have similar questions. Giving this question to our class would enable us to check whether students had learned this particular piece of knowledge in a review session.

Example 2: Accessible, not revealing

Example 2 is more accessible to students than Example 1. This formative assessment has helpful visuals and provides the formula to solve for density from mass and volume. Both of these components make the question accessible to students trying to figure out what it is asking and how to answer it. Students are clearly led to divide the mass by the volume in order to solve for the density of the rock using the formula density = mass/volume. Despite the higher degree of accessibility, this question still does not reveal anything about what students think about chemistry. A student who answers this question correctly does not do so by sharing their concept of what it means for the structure of a substance to have a high or low density, or why the water displacement method for determining the volume of an irregularly shaped object is effective. This would be a good question to give our students while reviewing for a quiz or test, assuming that our other quiz questions have a similarly high degree of accessibility for students.

Example 3: Not accessible, revealing

Example 3 requires that students have prior knowledge in order to answer the question. Students would need to know about weather fronts, the relationship between temperature and snowfall, and that a body of water can influence the amount of precipitation that occurs in a coastal area. This prior knowledge can act as a barrier, making it difficult for students to communicate their thinking if they are not aware of what influences the weather map. Despite this barrier, Example 3 is an open-ended question that invites students to explain everything that THEY believe to be important, in contrast to solving for a prescribed correct answer. A question like Example 3 would be good to give students after they had learned about the aforementioned weather concepts, as a way to learn what they would forecast based upon a weather map.

Example 4: Accessible, revealing

Example 4 is a formative assessment question that students should be able to access and answer without much prior knowledge, providing useful data regarding what students think about the concepts of phase change and dissolution. A student would be able to show their teacher what they were thinking without having formally learned the related science concepts, and could easily express their reasoning in their own words. 

A useful way to visualize the similarities and differences among these formative assessments collectively is on a grid divided into four quadrants, as shown below. Formative assessments are placed on the grid according to their accessibility to a student on the x-axis and how revelatory they are of students' thinking on the y-axis. Notice how the example formative assessments map neatly onto the chart: Example 1 (chemical reaction graph) falls in quadrant A, Example 2 (rock density calculation) in quadrant B, Example 3 (weather map) in quadrant C, and Example 4 (candy question) in quadrant D. Although every category of formative assessment has utility at one time or another in our teaching, we at ACCT propose that teachers strive to revise formative assessments toward quadrant D, so that the assessments are accessible and teachers gain insight into what students are thinking.

This is Part 4 of a multipart series from the Assessing for Change in Chemical Thinking (ACCT) project. We, the members of ACCT (Becca, Greg, Hannah, Michael, Rob, Scott), represent an NSF-funded collaboration (NSF awards DRL-1222624 and DRL-1221494) between university researchers, graduate and postdoctoral students, and high school and middle school teachers. ACCT focuses on fostering chemical thinking in middle school, high school, and undergraduate classrooms through strategic use of formative assessment. To accomplish this we develop resources, tools, and professional development for teachers of chemistry to foster students' chemical thinking. We also study how chemistry teachers' reasoning about formative assessment changes and how chemistry teachers shift to emphasize formative assessment as a lever for change. By working with teachers nationwide, we believe that we can help teachers reimagine the way that they think about chemistry, and develop more purposeful, productive, and engaging ways of interacting with their students to help them learn. (If you would like to learn more about us, check out our prior ChemEd X blog posts, our ChemEd X conference page, our collection at ChemEd X, and a JChemEd article about how we collaborate.) Our work focuses on Chemical Thinking and Formative Assessment as two major frameworks for professional development and research.

ACCT would love to connect with you!

As a group, we at ACCT are looking to connect with teachers nationwide to build an educator community around formative assessment and chemical thinking. We seek to share the resources that we have designed and are continuing to develop. You can explore more about the ACCT project at https://www.chemedx.org/ACCT, and we welcome you to reach out to us at ACCTProject@umb.edu. We will soon be offering our professional development resources online through ChemEd X, so that teachers and facilitators can deliver the ACCT professional development program to teachers in their district. Please reach out to ACCTProject@umb.edu with inquiries regarding how to get started and to request access to professional learning session materials.

 

References

1. Coffey, J. E., Hammer, D., Levin, D. M., & Grant, T. (2011). The missing disciplinary substance of formative assessment. Journal of Research in Science Teaching, 48(10), 1109–1136.

NGSS

Engaging in argument from evidence in 9–12 builds on K–8 experiences and progresses to using appropriate and sufficient evidence and scientific reasoning to defend and critique claims and explanations about natural and designed worlds. Arguments may also come from current scientific or historical episodes in science.

Summary:

Evaluate the claims, evidence, and reasoning behind currently accepted explanations or solutions to determine the merits of arguments.
