It's been a few days since my summer break began, which has given me time to decompress, relax, and think about my next post. I have been planning to write about concept mapping since the end of our first semester, when I first recognized its effects in the classroom after reading Shannon Bowen's blog post last December.
When I read Shannon's post and commented, she mentioned that her students' test scores increased, as did their understanding and comprehension of the content. I was excited to try this approach to final exam review. So, last semester I conducted an experiment with my 4th hour Honors class as the control (standard packet review) and my 6th hour Honors class as the test group (concept mapping). My 6th hour class was divided into groups, and each group was given 4 days and a 6' x 3' sheet of white paper. In the end, my 6th hour class demonstrated a deeper understanding of the content and asked better questions about it; they scored 5% higher on the multiple choice portion and 10% higher on the written portion. That was enough to convince me to have all of my freshman chemistry classes concept map their review the next semester, with one addition. Besides creating a group concept map on paper (over 2 days), students took advantage of my Chromebook cart (courtesy of the blended learning pilot I participated in this past year) and developed individual digital concept maps using either Lucidchart Diagrams or Google Drawing. The goal of this approach was for students to develop a substantial piece of work collaboratively and then further study and become familiar with the content as they developed their own individual maps. Below are some examples of analog and digital concept maps.
Upon conclusion of this most recent semester, I found mixed results. One of my honors classes saw a 4% jump in multiple choice and a 9% rise in the written portion (as compared to the fall semester's control group). My other honors class saw a 1% decrease in multiple choice and a 1% increase in the written portion (seemingly negligible results). When comparing this semester's regular chemistry classes to last semester's, the data shows a concerning decrease in scores.
What can the drop in scores in regular chemistry be attributed to? A number of factors cross my mind. Among my observations was a lack of motivation to actively work in a group or contribute to the review process; many of the students were not accustomed to thinking and working proactively on projects like this. Some students really struggled with connecting the dots, so to speak, between various concepts. My pre-intern and I did our best to guide students through the process of creating a concept map and to question students who said they were "done." These student groups quickly realized they were far from finished and consequently may have struggled both to stay motivated and to find value in this type of review. Others were simply absent during the review period.
Nonetheless, I still recognize value in this type of exam review and plan to continue working with it in future semesters. For what it's worth, a biology teacher in my department has begun incorporating more concept mapping into his unit reviews after watching my classes work through it at final exam time.