A Simple Tool to Help Make the Retake Process Less Chaotic


Part of placing value on the process of learning means giving students multiple opportunities to demonstrate understanding. As a result, retakes are an inevitable part of the process. For many teachers, especially those at larger schools, allowing students to retake assessments is not a philosophical problem, but a logistical one. While creating a whole new assessment has its own baggage, the process of re-learning and the scheduling of who will retake what and when can be overwhelming. To help streamline the entire process, I would like to share a simple strategy that anyone can replicate in a short amount of time and that has brought a bit of sanity to organizing retakes for me.

When I first started to allow retakes, my system had dramatic flaws. In fact, I am hesitant to even give it the courtesy of calling it a system. I simply told students to let me know when, what, and where they wanted to retake. One student might tell me this information in person while another would send me a message using our LMS. Sometimes the information they provided was incomplete and I would have to chase them down to confirm. Regardless, I would typically write this down somewhere or, even worse, leave it to my memory. This all started to quickly spiral out of control as I was bombarded with messages late at night or approached by students during random times throughout the school day. Each retake request required me to divert my attention from whatever I was doing in that moment so I could write myself a little reminder. To make matters worse, sometimes students would completely change the information they had previously given me and I would have to go back and edit whatever I had originally written down. All of this required the additional step of me generating little reminders to myself on top of the list of obligations and tasks that are a natural part of our profession. My system was designed to fail from the start and, eventually, it did. Something had to change.

After reading a bit about how others implemented their retake policies, I eventually came across a strategy that involved using Google Forms to generate a Reassessment Request Form. By using a specific Add-On within the form, all of a student's answers would be automatically emailed to the teacher in a clean, simplified format. This was exactly what I was looking for—basically a personal assistant to handle the grunt work of scheduling and identifying important information. With a little bit of tweaking to make it fit my needs, here is what I eventually started to do.
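The Add-On is not named above, but for readers who prefer to avoid add-ons entirely, the same "email me each request" behavior can be sketched with a script bound to the form in Google Apps Script. This is a minimal sketch, not the author's actual setup: the question titles and the teacher's address are hypothetical, and `MailApp` only exists when the script runs inside Apps Script.

```javascript
// Builds a readable email body from a form submission.
// `namedValues` is the object an Apps Script "On form submit" trigger
// provides: each question title maps to an array of that student's answers.
function formatRequestEmail(namedValues) {
  const lines = [];
  for (const question in namedValues) {
    lines.push(question + ": " + namedValues[question].join(", "));
  }
  return lines.join("\n");
}

// Trigger handler (runs only inside Google Apps Script).
// Install via Triggers > Add Trigger > From form > On form submit.
function onFormSubmit(e) {
  const body = formatRequestEmail(e.namedValues);
  // Hypothetical recipient address for illustration.
  MailApp.sendEmail("teacher@example.com", "New Reassessment Request", body);
}
```

Each submission then arrives as one short email listing the student's name, the learning target, and the requested retake time, which removes the need for handwritten reminders.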


Creating a Reassessment Request Form from ChemEd Xchange on Vimeo.


Compared to the lack of structure I had in place before, implementing this simple tool has helped me allocate my attention, time, and cognitive load to things higher on my priority list. I spend less time worrying about the logistics of the retake and more time focusing on helping my students better understand chemistry.

If you have your own retake policy, how do you go about actually executing it? Feel free to share any tips or tools that you think would be useful!

Supporting Information: 

Comments (13)

Chad Husting | Tue, 05/22/2018 - 12:50

Ben - I totally love this!  I plan on doing a version of this for make up work.  It is quick, fast, easy and cheap.  Just my style.  Thanks again.

Donna Engel | Wed, 05/23/2018 - 07:28


We implemented a protocol to assist with our retakes.  Students must have the protocol sheet signed by parents.  Students must attend two tutoring sessions either during their study hall with a resource teacher or after school with their classroom teacher, and students must complete a review packet prior to taking the reassessment.

Ben Meacham | Wed, 05/23/2018 - 14:14

Hey Donna,

Thank you for sharing.  I have a couple questions about the protocol that I'm hoping you can clarify since I'm trying to develop a better protocol of my own.

1) Do you check the review packet for accuracy?  I keep struggling to try and find an efficient way for checking that re-learning has occurred prior to the reassessment.

2) Do you require them to retake the entire assessment and therefore the entire review packet?  

3) Do you have a system in place for helping students identify the area(s) of weakness that they need to focus on while re-learning?


Elizabeth Hamann | Fri, 05/25/2018 - 09:29

Retakes are a hot button topic in my building.  Teachers don't want to feel like they are doing more work than the students.  This year I implemented a "Resource Packet" that must be completed and checked by me if students want to make up a test.  In addition, they have to score below a 70% on the original exam.  While every student was given a resource packet for every unit (a large amount of paper), only a handful of students took advantage of the opportunity.  However, I did find that those students who completed the resource packet didn't score below a 70%!  (It might be working!)  I find my more confident students don't need or want to do the extra work, and my lower level students are not motivated to do extra work (let alone the required work).  The resource packet seemed to help the mid-level student who needed more face time with me.  This gave them a great reason to see me before or after school!

Kathe Hetter | Fri, 05/25/2018 - 13:07

I do standards-based grading/learning objectives.  Each question/problem on a summative assessment has the learning objective(s) next to it.  Students have a bubble sheet that I fill out.  It is created using Illuminate in the rubric format.  Let's say I decide one problem is worth a total of 4 points.  I may then have four bubbles (A, B, C, D).  If the problem was done correctly, I bubble in "D".  If not completely correct because of a minor error, I would bubble in a B or C.  Since each question is put in question groups by learning objective, I can then print a report that gives them their overall percentage grade for the entire assessment.

The best part is that it breaks it down into the scores for each learning objective.  Students have to achieve an 80% on each learning objective.  If they do this for 4 out of the 5 learning objectives, then we both know exactly what they need to work on.  It gives the students very positive feedback.  I can tell them that yes, you may not have mastered that learning objective, but look, you did for all the others.  In our grading system, PowerSchool, I then enter each learning objective with its total points instead of one test grade.

This is my second year doing this and I have had only very positive feedback from both students and parents.  I have had students who have had an overall percentage of greater than 90% but may not have mastered one standard.  They have the option of going back and retesting that standard.  It makes things a lot simpler when you break down your assessments into the learning objectives (outcomes).  It also gives me feedback if there is one learning objective that the majority of students did not master.  Obviously, that lets me know that I need to reflect on my approach to presenting this objective and reteach.

Donna Engel | Tue, 05/29/2018 - 07:57

Hi Ben,

We do check the review packet for accuracy; however, it is not for a grade.  We also created short reviews based on skills.  Our assessments are written linked to "I can" statements, so students can review and determine which areas need to be improved/reassessed with their classroom teacher.  One concern when we initially started was that students would purposely fail or not prepare because of the reassessment opportunity.  Since we have a protocol in place that requires more work, that concern has been negated.  The only issue we have is students not wanting to be reassessed, especially at the ninth grade level.

Doug Ragan | Wed, 05/30/2018 - 06:00

While looking for some good MC question websites for chemistry for varying assessments, I came across http://www.problem-attic.com/. I am using the free creator and am really impressed so far with its list of questions by topic. I saw an AP section but haven't checked it out yet.

Catalina Mejia | Sat, 06/02/2018 - 10:15

Hi. I was wondering how you structure your tests so that you and your students can easily identify the learning targets. Does each target have a separate section of the test? Are your tests online? I'm interested in applying this request form next school year; it's a wonderful way to organize retakes!! Thank you!

Ben Meacham | Mon, 06/04/2018 - 20:02


Yes, each learning target is given a separate section of the test. I have added a sample test in the supporting information. It will show you an example test I give during our reactions unit and how a test can be structured according to learning targets.  Our tests are not online.

Glad to hear you liked the idea!  Hope it helps serve a purpose.

Lauren Stewart | Mon, 06/04/2018 - 10:28

When I switched to SBG, reassessments were a nightmare. I just ended up having piles of post-it notes on my desk! I also switched to a Google Form and add-on but I take it one step further with Autocrat. Once students enter their information into the form, Autocrat then generates a Google Doc with all of their information in it. I just click through the list of Google Docs every morning, type in questions, print them out and my reassessments are good to go for the day. Since I use SBG, I write around 20 reassessments a day but it only takes me 10-15 minutes every morning with this system. I wrote a quick walk-through on my blog a few years ago: https://modelsofar.wordpress.com/2015/07/15/autocrat-for-reassessments/ 
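For anyone curious what Lauren's Autocrat step looks like under the hood, a rough approximation can be written directly in Google Apps Script without the add-on. This is a sketch, not her actual configuration: the question titles ("Name", "Learning Target", "Retake Day") are hypothetical, and `DocumentApp` only exists inside Apps Script.

```javascript
// Builds the title for the generated Google Doc from a form submission,
// e.g. "Reassessment - Ava - Thursday". `namedValues` maps each question
// title to an array of answers, as in an "On form submit" trigger event.
function docTitle(namedValues) {
  return "Reassessment - " + namedValues["Name"][0] + " - " + namedValues["Retake Day"][0];
}

// Trigger handler (runs only inside Google Apps Script): creates one
// pre-filled Doc per request, ready for questions to be typed in each morning.
function onFormSubmit(e) {
  const doc = DocumentApp.create(docTitle(e.namedValues));
  const body = doc.getBody();
  body.appendParagraph("Student: " + e.namedValues["Name"][0]);
  body.appendParagraph("Learning target: " + e.namedValues["Learning Target"][0]);
  body.appendParagraph("Questions:"); // blank section for the teacher to fill in
}
```

Autocrat adds niceties on top of this (merge templates, PDF export, sharing), so the add-on is still the easier route; the sketch just shows that the core of the workflow is a small amount of scripting.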

Suzanne Irwin | Wed, 06/27/2018 - 07:26

Hi Ben,

I also used Autocrat and Google Forms, as Lauren mentioned in a post.  In fact, I got the idea from her blog.  I found it to be very helpful.  Having said that, there are a couple of things I love about your system, and I have a question as well.

(1) I love your spot for them to rate understanding, you to rate understanding, and you to give feedback.  My students really liked SBG this year, but one of the end-of-year comments I got was that they wanted more feedback, instead of just a number that corresponded to "Got It," "Almost," etc.  Your system makes that explicit! Also, we had a hard time with student reflection this year; student reflections and goal setting seemed to be very superficial. I think your spot for student self-rating would help with that. I am definitely going to use this!

2) I also really like the explicit way you incorporate particulate understanding in your questions on your sample test.  I had my students draw diagrams, but you assess their particulate understanding in multiple ways.

3) I had a minor issue with students who needed retakes not taking advantage of the system, and some students who took advantage of the system not preparing for retakes appropriately (which was difficult for me, because it created more grading).  Because I allowed them to retake at any time and was almost too flexible, this created a lot of retakes at the end of the quarter.  I don't know if you had that problem.  I am struggling with how to address this next year, and I think I might try using Lauren Stewart's idea of electronic portfolios as a way of showing preparation for the retake. I allow them to retake only the learning targets they need to reassess.

4) Question -- I see you use a 6, 5, 4, 3 scale -- which effectively makes a 3, for no understanding, equal to a 50%.  How does this work for you?  The main motivator for retakes for most of my students remains their final average -- even though on qualitative surveys, they appreciated and understood that this system helped their overall understanding.  I switched from a 0, 1, 2 scale to a 0, 1, 2, 3, 4 scale at the semester break because I needed a way to differentiate "almost there" from "making progress".  My students would like it if I adopted a scale like yours, but I am concerned that they would then be able to have no understanding of half the learning targets, master the other half, and still get a 75% (which would be fine for quite a few of them, but I am looking for them to master more targets).  Anyway, I'm curious how your scale works for you.


Thanks for sharing!!

Ben Meacham | Thu, 07/05/2018 - 22:54


Thanks for the feedback, much appreciated!  It's definitely been a journey but it's FAR from over.  All we can do is continue to try and make the system more efficient and effective.  I'll try my best to respond to your questions/comments.

1) The intentional space left for student self-evaluation and teacher feedback was a nice change this year.  It was really useful for scenarios where there was a clear gap between how well a student thought he/she understood the material and how well I thought he/she did (the student thinks he is at a "6" when I thought he was at a "4").  The available space for feedback was helpful too since it allowed nearly all feedback to be in one centralized place for each learning target.  As the year progressed, we actually started to include little "codes" in the feedback section.  For example, one of the codes might be "FR," and in the feedback box it would indicate that "FR" stands for "Faulty Reasoning."  That way, all I really had to do while grading was write "FR" instead of explicitly describing the faulty reasoning within the question itself, though I did choose to provide more detailed feedback when I thought it was necessary.  A more specific code would be something like "SE," which stood for "Stoich Error."  Having specific codes for each unit made providing feedback more efficient.  Not a perfect solution, but it was helpful.

2) I'm glad you liked the particulate models!  I pretty much attribute my obsession with particulate drawings to my experience with Modeling Instruction.  I've been happy to see that a particulate understanding is becoming more of a requirement when demonstrating understanding, even for high-stakes tests like the AP exam.

3) While I have had my own fair share of logistical retake chaos, I found that the only real way to decrease the chances for a messy situation are to have boundaries that are clearly communicated.  Here are a few boundaries and requirements that we had in place last year that helped provide a bit of sanity to us:

  • Retakes MUST be done no later than 5 school days from when the test was handed back to them.  For example, if they received their test back on a Wednesday, they would have until the following Wednesday to retake ANY learning target from the specific test they wish.  This prevented the classic bottleneck of retakes from different units throughout the quarter that tends to happen the week before the quarter ends.
  • Throughout a given unit, I provide Practice worksheets.  Each Practice tends to focus on 1 or 2 of the learning targets from that unit.  In order to even qualify for an opportunity to retake, students must have completed, and turned in, every single Practice from that unit by the day of the test.  Though we spent some time discussing this as a department, the eventual consensus that drove this policy revolved around the idea that retakes are a privilege and not a right.  Therefore, the opportunity to retake can be viewed as something that is earned rather than simply given to anyone who spent the entire unit doing nothing to adequately prepare themselves for the assessment.  By doing the Practices ahead of time, students should perform better on the assessment.  Doing better on the assessment results in far fewer retakes.  So it's not like the policy was designed with the intention of denying students the opportunity to retake; it was our way of encouraging students to do the appropriate things throughout the unit that would most likely put them in a position for success.  Once I really stuck to this part of the policy, the number of retakes went down significantly, both because more students were better prepared and because, no matter what, there will always be students who don't want to put in the work.
  • With respect to preparing for the retake, we still haven't fully figured out the best way to do this but it always involved students providing SOME kind of evidence of re-learning.  This could take the form of test corrections, notes that were taken from online videos, student-created tutorials, 1-on-1 convos with me, completing additional problems from an online source or the textbook, etc.   

I REALLY liked Lauren's idea of electronic portfolios too and am thinking about implementing that idea this year.

4) When we switched to SBG, we recognized that we were making this change within a computerized grading system that wasn't designed for SBG.  Everything was still on a 100-point scale.  In order to compensate for this, and to make the grading system more mathematically accurate (and fair), we essentially eliminated 0 - 50%.  Consequently, the lowest score one could earn was a 50%.  Unsurprisingly, this decreased the number of F's by quite a bit.  However, it's not like those who would've normally earned an F at a 32% all of a sudden got bumped up to a C or anything.  Instead, some still remained in the F range of 50% - 60%, while many of those same students stayed within 60% - 65% and ended up with a D.  The more I've learned while researching and reading about SBG, the more I've found myself in agreement with the idea that a D is essentially the same thing as an F.  However, not only are these kids no longer being held back, but they actually have some glimmer of hope to bounce back from a poor test that would otherwise put them in a hole that is nearly inescapable under a traditional grading system (100-point scale).

Our 3, 4, 5, 6 scale is really no different from a 1, 2, 3, 4 scale.  It's identifying 4 levels of competency.  Unfortunately, because of the computerized grading systems most of us have to work with, these different levels can far too easily still be seen as "points"--which is not our intent.  There are multiple options you could pursue, like having a more logic-based mastery requirement that defines what is required to earn an A, B, C, etc.  For example, I know multiple teachers who subscribe to a system (on a 0 - 4 scale) that says something along the lines of "to earn an A in this class, you must earn a 4 on at least 80% of the learning targets and have no score lower than a 3 on any learning target."  You can imagine creating some kind of requirement for all letter grades if you choose to go down that path.
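The logic-based mastery rule quoted above is concrete enough to write out. A small sketch of just the A-requirement as stated (the function name is mine; rules for B, C, etc. would be defined analogously by each department):

```javascript
// Checks the quoted A-requirement on a 0-4 scale: a 4 on at least 80%
// of the learning targets, and no target scored below a 3.
function earnsA(scores) {
  const fractionMastered = scores.filter(s => s === 4).length / scores.length;
  const noneBelowThree = scores.every(s => s >= 3);
  return fractionMastered >= 0.8 && noneBelowThree;
}
```

A rule like this sidesteps the averaging problem entirely: a student cannot offset an unmastered target with extra "points" elsewhere, because the grade is defined by the pattern of mastery rather than a mean.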

Regarding your concern about students only mastering half the targets and not understanding the other half, resulting in a 75%: my experience has shown that it's very rare for a student to master a number of learning targets and then all of a sudden start demonstrating little to no understanding of multiple targets down the road.  If a student consciously chooses to stop demonstrating any understanding after already mastering half the learning targets, then he is accepting a 75%.  If that happened, I would recommend a conversation between myself, the student, and the parents to make clear how counterproductive this approach is, given the level of understanding the student has already demonstrated he is capable of attaining.  At the end of the day, it all comes down to buy-in from the students.  If we can effectively convince them of the merits of our grading system and how it is designed to help them learn and achieve success, the results will be a natural consequence: more students learning and developing more intrinsic motivation.

I know that was probably more reading than you had hoped for but I love discussing this stuff with other teachers.  I hope I was able to answer your questions!  Let me know if I can provide more clarity.  

Thanks again for the post!  

Chad Husting | Mon, 09/10/2018 - 10:52

Ben - Fascinating journey and discussion.  I am working with a "school within a school" during a duty bell.  They are working on SBG.  Here is the question and struggle: how do labs play into this?  Any thoughts or ideas?  Any help would be great.  Here is my email in case you do not want to post: hustingc@sycamoreschools.org