There are a variety of types of assessment, including diagnostic, formative, and summative. The results from assessments have the potential to give teachers and students valuable feedback about progress toward learning objectives. A number of contributors to ChemEd X have written about assessment, including Ben Meacham, Stephanie O'Brien, and Deborah Herrington. One of the most challenging tasks associated with evaluating student progress is the design of the assessment itself. Although teachers frequently use assessments to guide their instruction, many teachers have not received formal training about how to write good questions.
I have taught high school chemistry since 1998. During my first few years of teaching, I spent most of my time focusing on what I wanted to teach. I did not spend much time planning how I would teach the content or how I would know if my students had learned it. After I had been teaching for several years, I was able to find a good balance between curriculum, instruction, and assessment. I remember feeling especially overwhelmed by the challenges associated with teaching AP Chemistry. I used to spend countless hours searching for good AP Chemistry questions. Gradually my confidence increased to the point where I felt comfortable creating my own assessments. Since 2014, I have been fortunate to work as an outside item writer for the AP Chemistry exam. This experience has taught me about the process of item development and helped me to improve my skills in this area. (Note: In the world of assessment, a question is also known as an "item".)
Before a teacher writes a chemistry question, they probably think about what they want their students to know and/or be able to do. This may involve a combination of chemistry content and science practices. In some cases (such as AP or IB Chemistry), the curriculum is well-established. Once teachers have decided on the specific content that they plan to assess, they can start the process of writing items. These could be either multiple-choice (MC) or constructed response (CR). No matter which type of item they plan to write, it is helpful for teachers to consider student misconceptions throughout the writing process. Anyone who has taught chemistry for a few years has seen several examples of student misconceptions. The ability to anticipate the errors that students tend to make should serve as a guiding principle when designing assessment items. In addition, a well-written question can uncover student misconceptions.
For my first example, I will write a MC item that addresses a misconception about intermolecular forces (IMFs). The boiling point (BP) of a substance is often mentioned in questions about IMFs. Students could be asked to predict which of two substances has the higher BP and to give a justification for their prediction. Alternatively, students could be given BP data for two different substances and be asked to explain the data. Recent examples of questions from the AP Chemistry exam that involve IMFs are 2019 #2(c), 2018 #4(a), and 2017 #1(d).* A common source of confusion is the difference between breaking the covalent bonds within a molecule and overcoming the intermolecular attractive forces between molecules.
When writing an item, it is often desirable (but not necessary) to generate a stimulus. This may be a table, a diagram, or a chemical reaction. The information presented in the stimulus gives students something to focus on and serves as the entry point for the item. In this example, I have created a stimulus that gives students the Lewis electron-dot diagrams for two different compounds. Compound 1 is propane (C3H8) and Compound 2 is ethanol (CH3CH2OH) (see Figure 1). Below the stimulus is the stem, followed by several choices. The stem is the question that students are asked to answer, and the choices include the correct answer along with several incorrect choices, also known as distractors.
Figure 1: Item that addresses misconceptions related to intermolecular forces.
Some assume that a MC item can only measure recall of factual information. One strategy that I use to encourage students to apply higher-order cognitive skills when answering a MC item is to include claims, predictions, and justifications in the choices. In this item I will ask students to decide which of the compounds shown in the stimulus has the higher boiling point. Choices (A) and (B) feature Compound 1 (C3H8), and choices (C) and (D) feature Compound 2 (CH3CH2OH). Students also have to pick the statement that provides the correct justification for their choice.
The distractors that I have written incorporate some of the misconceptions associated with IMFs and covalent bonds. Choice (A) is based on the misconception that covalent bonds are broken during the evaporation of a liquid. Choice (B) is based on the misconception that a molecule with C–H bonds can form hydrogen bonding attractions. Choice (C) is based on the misconception that the –OH group featured in the structural formula of Compound 2 is a hydroxide ion. Students might therefore see this –OH group and assume that Compound 2 is an ionic substance rather than a molecular one. The correct answer to the item in Figure 1 is (D).
For my second example, I will write a MC item that addresses common student errors when calculating the standard enthalpy change (ΔH°) for a chemical reaction. When they learn thermochemistry, students are usually presented with two different methods for calculating the value of ΔH° for a chemical reaction. They often get confused about which method to use in a particular problem.
METHOD #1 uses the data for standard molar enthalpies of formation (ΔH°f) and the following equation.

ΔH° = Σ nΔH°f(products) – Σ nΔH°f(reactants)
METHOD #2 uses the data for bond enthalpies, and often features the following equation.
ΔH° = (sum of bond enthalpies for bonds that are broken) – (sum of bond enthalpies for bonds that are formed)
In my stimulus, I will give students a balanced chemical equation and a table of bond enthalpy data. In my question stem, I will ask students to use the bond enthalpy data to calculate the value of ΔH° for the reaction. The stimulus and stem are shown in Figure 2 below.
Figure 2: Item that addresses misconceptions related to reaction enthalpy calculations.
Two common student errors are worth anticipating. First, students could set up the calculation backwards. If they confuse the bond enthalpy method with the enthalpy of formation method, they might set up a "products minus reactants" calculation like the one below. This is why I created choice (C) as a distractor.
[(2)(430)] – [(440) + (240)] = 860 – 680 = +180
Second, students might understand that the value of ΔH° involves the sum of the bond enthalpies for the bonds broken (one H–H bond and one Cl–Cl bond) minus the sum for the bonds formed (two H–Cl bonds). However, if they forget to use the coefficient of "2" with the bond enthalpy of H–Cl, they would get the wrong answer shown below. This is why I created choice (D) as a distractor.
[(440) + (240)] – (430) = 680 – 430 = +250
If a student happened to make both of the errors that I have described above, their calculation would look like the following. This is why I created choice (A) as a distractor.
(430) – [(440) + (240)] = 430 – 680 = –250
The correct answer involves the following calculation, which explains why choice (B) is correct.
[(440) + (240)] – [(2)(430)] = 680 – 860 = –180
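For teachers who like to double-check distractor arithmetic, the four calculations above can be sketched in a few lines of Python. The helper function and bond labels here are illustrative, not part of the item; the bond enthalpy values (in kJ/mol) are the ones used in the worked calculations.

```python
def delta_h(broken, formed, enthalpies):
    """Bond enthalpy method: ΔH° = Σ(bonds broken) − Σ(bonds formed).

    broken/formed map a bond label to how many of that bond are involved.
    """
    return (sum(n * enthalpies[b] for b, n in broken.items())
            - sum(n * enthalpies[b] for b, n in formed.items()))

# Bond enthalpies (kJ/mol) for H2(g) + Cl2(g) -> 2 HCl(g)
bonds = {"H-H": 440, "Cl-Cl": 240, "H-Cl": 430}

# Correct setup: break one H-H and one Cl-Cl, form two H-Cl -> choice (B)
correct = delta_h({"H-H": 1, "Cl-Cl": 1}, {"H-Cl": 2}, bonds)       # -180

# Distractors reproduce the anticipated student errors:
reversed_setup = delta_h({"H-Cl": 2}, {"H-H": 1, "Cl-Cl": 1}, bonds)  # (C): +180
missing_coeff = delta_h({"H-H": 1, "Cl-Cl": 1}, {"H-Cl": 1}, bonds)   # (D): +250
both_errors = delta_h({"H-Cl": 1}, {"H-H": 1, "Cl-Cl": 1}, bonds)     # (A): -250
```

Writing the distractors this way makes it easy to confirm that each wrong answer corresponds to exactly one plausible error path, rather than to an arithmetic slip.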
In this brief article, I have shared a few ideas about writing good questions. There is much more about assessment design that I could talk about, but I wanted to keep this article short. I have included a list of references below that are related to assessment and item development. I plan to re-visit this topic in the future and share more of my ideas, suggestions, and examples. I am optimistic that there are teachers who are willing to share their own suggestions and resources related to writing good chemistry questions. I encourage you to add your comments to the conversation below. I would love to hear from you.
References & Resources
Towns, M. H., Guide To Developing High-Quality, Reliable, and Valid Multiple-Choice Assessments. J. Chem. Educ. 2014, 91 (9), 1426–1431. (This is an Editors' Choice article. It is open access to all.) (accessed 1/25/20)
Domyancich, J. M., The Development of Multiple-Choice Items Consistent with the AP Chemistry Curriculum Framework To More Accurately Assess Deeper Understanding. J. Chem. Educ. 2014, 91 (9), 1347–1351. (Available to subscribers of JCE. Members of ACS & AACT can use their complimentary downloads to access.) (accessed 1/25/20)
Underwood, S. M.; Posey, L. A.; Herrington, D. G.; Carmel, J. H.; Cooper, M. M., Adapting Assessment Tasks To Support Three-Dimensional Learning. J. Chem. Educ. 2018, 95 (2), 207–217. (This is an Editors' Choice article. It is open access to all.) (accessed 1/25/20)
Cheung, D., A Test Construction Support System for Chemistry Teachers. J. Chem. Educ. 2006, 83 (9), 1399–1405. (This is an Editors' Choice article. It is open access to all.) (accessed 1/25/20)
Mulford, D. R.; Robinson, W. R., An Inventory for Alternate Conceptions among First-Semester General Chemistry Students. J. Chem. Educ. 2002, 79 (6), 738–739. (This is an Editors' Choice article. It is open access to all.) (accessed 1/25/20)
Bretz, S. L., Designing Assessment Tools to Measure Students' Conceptual Knowledge of Chemistry. In D. M. Bunce & R. S. Cole (Eds.), Tools of Chemistry Education Research. Vol. 1166. Publication Date (Web): July 31, 2014. (Available to subscribers of JCE. Members of ACS & AACT can use their complimentary downloads to access.) (accessed 1/25/20)
*For more information about the AP Chemistry Exam, visit the following web site. https://apcentral.collegeboard.org/courses/ap-chemistry/exam (accessed 1/25/20)