Teaching With Instant Feedback in Automated Assessments
Assessments are core components of course design, forming one of the three points of the course design triangle. They measure student knowledge, understanding, and skill, but they can also provide insight into the success of various teaching strategies. The efficacy of an assessment, however, is measured not just by its design but also by the feedback that instructors provide. When instructors offer little more than a score or grade, students may remain bewildered by concepts that haven’t quite clicked, with little guidance on what was incorrect or how to improve.
By offering meaningful feedback for each question in an electronic assessment, you can instantly and effectively seize asynchronous teaching moments to explain rationale, expound on perspectives and techniques, and empower students to take control of their learning. This article offers a three-pronged framework for extending your teaching with prewritten feedback in automated assessments.
The Possibilities of Feedback
Feedback that is more than a grade is crucial to quality teaching and learning. Espasa and Meneses (2010) showed that students who receive written feedback on assignments earn higher grades and report greater satisfaction with their courses than students who do not receive such feedback. Students prefer feedback that is helpful to their learning (Debuse, Lawley, & Shibi, 2007), and they want that feedback promptly (Bridge, Appleyard, & Wilson, 2007; English & English, 2015). The instantaneous nature of feedback in an automatically graded assessment is one of the advantages of electronic delivery. For you, writing general feedback for each question takes less time than writing feedback for every possible answer. That general feedback can also enhance your lecture presentation and reduce future grading time by addressing and resolving student questions before later assessments.
Automated assessments allow you to create a variety of question types (e.g., multiple choice, true or false, matching, short answer) and deliver feedback tailored to a student’s correct or incorrect submission. Depending on your learning management system, you may have the option to display general question feedback after students submit each individual response or after they submit the entire assessment. Unlike handwritten comments on student papers, which respond to answers after the fact, feedback on an automated assessment must be written in advance, drawing on your expert knowledge of where students are most likely to err. For each assessment question, that prewritten feedback then arrives at the moment students are primed for and concentrating on the material, becoming just-in-time teaching, guidance, and direction.
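The mechanics differ by platform, but the underlying idea is the same: each question stores its prewritten feedback alongside the answer options, and the system displays that feedback as soon as the response is scored. The sketch below illustrates the concept only; the Python structure and field names are invented for this article and do not reflect any particular LMS’s format.

```python
# Hypothetical question-bank entry: the general feedback is written once,
# in advance, and shown after submission whether the answer was right or wrong.
question = {
    "prompt": "Which gas do plants primarily absorb during photosynthesis?",
    "options": {"A": "Carbon dioxide", "B": "Oxygen", "C": "Nitrogen"},
    "correct": "A",
    "general_feedback": (
        "Plants take in carbon dioxide and release oxygen; animals do the "
        "reverse. We will build on this exchange when we cover respiration."
    ),
}

def instant_feedback(choice: str) -> str:
    """Return a verdict plus the prewritten general feedback for the question."""
    verdict = "Correct." if choice == question["correct"] else "Not quite."
    return f"{verdict} {question['general_feedback']}"

print(instant_feedback("B"))
# Not quite. Plants take in carbon dioxide and release oxygen; ...
```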
For the strategies we’ll discuss below, consider the following multiple-choice question and feedback in an introductory linguistics course exam:
Question: Tommy and Gina are making a scene in the grocery store with a loud argument over their choice of beverages. Tommy insists that he wants to buy “pop”; Gina retorts that she will drink only “soda.” You restore peace by informing them that they are, in fact, talking about the same item with different linguistic labels. You explain:
- A. They are expressing lexical variants characteristic of a regional dialect.
- B. Their communicative breakdown is the result of contextual variation.
- C. Prior to the Southern Shift, discourse rarely reflected semantic differences.
- D. “Pop” and “soda” are derivational morphemes of the same part of speech.
Answer: A. They are expressing lexical variants characteristic of a regional dialect.
We’ll use the three steps below to create a brief paragraph (approximately three to six sentences) of question feedback that explains, expounds, and empowers student learning.
Step 1: Explain
Explain the answer and walk students through the necessary knowledge and logic (one to two sentences).
Explanatory feedback, in its most basic form, supplies students with the correct answer. But when students “get their tests back,” they want more than just a correct answer; they want a reason. Reasons identify faulty logic. Reasons motivate and drive deeper learning. When writing your assessment questions with accompanying feedback, think also about other possible answers students might consider correct. Then write the what and why of the question in a sentence or two. Ask yourself:
- How might I explain why the correct answer is correct and why the other possible answers will not do?
- What important considerations do beginners tend to ignore?
Returning to our example question, explanatory feedback could be:
We can identify Tommy and Gina’s linguistic differences as a matter of dialect because contextual variation is a variety of language within an individual, the Southern Shift refers to pronunciation of vowels, and derivational morphemes make new words from old ones. A dialect is a variety of language differing from standard vocabulary or speech patterns, usually by geographical or social distribution.
Step 2: Expound
Expound on the answer’s implications or potential in alternate contexts (one to two sentences).
Exponential feedback directs a student’s thinking in ways that reinforce learning. Students are already forming new neural connections around new knowledge and skills, and you can assist that learning by joining the new knowledge to prior knowledge. Make analogies. Suggest applications. This connective kind of feedback is appropriate whether the student answered the question correctly or incorrectly. When constructing feedback, think also about the present and future implications the material holds for students, and ask yourself:
- How is this knowledge similar to something students may already be familiar with?
- Where might students see or use this information in the future?
Returning to our example question, explanatory, exponential feedback could be:
We can identify Tommy and Gina’s linguistic differences as a matter of dialect because contextual variation is a variety of language within an individual, the Southern Shift refers to pronunciation of vowels, and derivational morphemes make new words from old ones. A dialect is a variety of language differing from standard vocabulary or speech patterns, usually by geographical or social distribution. Recall that in Week 3, we discussed regional dialects of English and discovered that most of y’all are from the Southern United States. Before you travel, it might be helpful to understand the regional dialect of your destination so that you can order pop or soda appropriately.
Step 3: Empower
Empower students to verify their own answers and equip them with resources to take their learning to the next level (one to two sentences).
Feedback can also empower students to solidify their learning by redirecting them to resources already provided. Remind students about discussions or readings that introduced the topic of the assessment question. But more than a look back, feedback can also be a look forward that extends their learning and leads them to resources that continue the conversation. Not all students will be deeply interested in every topic, but when they are, they can rely on you as the expert to direct them to top-notch resources that they may not find on their own. When constructing this type of feedback, ask yourself:
- What course resources can I suggest students revisit?
- Where would I suggest the motivated, ambitious student turn next for deeper learning?
Returning to our example question, explanatory, exponential, empowering feedback could be:
We can identify Tommy and Gina’s linguistic differences as a matter of dialect because contextual variation is a variety of language within an individual, the Southern Shift refers to pronunciation of vowels, and derivational morphemes make new words from old ones. A dialect is a variety of language differing from standard vocabulary or speech patterns, usually by geographical or social distribution. Recall that in Week 3, we discussed regional dialects of English and discovered that most of y’all are from the Southern United States. Before you travel, it might be helpful to understand the regional dialect of your destination so that you can order pop or soda appropriately. See again pages 35–43 of your textbook. To listen to sample English dialects from across the world, visit the International Dialects of English Archive (IDEA) on the Web, a collection founded and directed by dialectician Paul Meier.
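If you maintain a large question bank, it may help to draft the three components separately and assemble them into the single paragraph you paste into your assessment tool. The sketch below is a minimal illustration of that workflow; the Python class is hypothetical and not tied to any LMS.

```python
# Minimal sketch: keep explain, expound, and empower as separate fields in a
# question bank, then join them into the one paragraph the LMS displays.
from dataclasses import dataclass

@dataclass
class QuestionFeedback:
    explain: str  # Step 1: why the answer is the answer
    expound: str  # Step 2: connections, analogies, applications
    empower: str  # Step 3: resources for review and deeper learning

    def assemble(self) -> str:
        """Combine the three parts into one general-feedback paragraph."""
        return " ".join(part.strip() for part in (self.explain, self.expound, self.empower))

fb = QuestionFeedback(
    explain="A dialect is a variety of language differing from standard "
            "vocabulary or speech patterns, usually by geographical or "
            "social distribution.",
    expound="Recall that in Week 3 we discussed regional dialects of English.",
    empower="See again pages 35–43 of your textbook.",
)
print(fb.assemble())
```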
Conclusion
Though a grade can be considered one type of feedback, more helpful for students is feedback that alerts them to the logic, connections, and implications of their growing knowledge with an eye toward continued growth. When constructed in advance by an expert who knows how and why an answer is derived and where the beginner can best find additional information, the instant feedback features of automated assessments become an additional strategy for teaching and learning. The “explain, expound, and empower” framework discussed here offers a useful tool for seizing the teaching moments of automated assessments and creating the sort of thoughtfully written communication to the student that only the instructor can provide.
References
Bridge, P., Appleyard, R., & Wilson, R. (2007, May). Automated multiple-choice testing for summative assessment: What do students think? Paper presented at the International Educational Technology (IETC) Conference, Nicosia, Turkish Republic of Northern Cyprus. Retrieved from http://files.eric.ed.gov/fulltext/ED500077.pdf
Debuse, J., Lawley, M., & Shibi, R. (2007). The implementation of an automated assessment feedback and quality assurance system for ICT courses. Journal of Information Systems Education, 18(4), 491–502.
English, J., & English, T. (2015). Experiences of using automated assessment in computer science courses. Journal of Information Technology Education: Innovations in Practice, 14, 237–254. Retrieved from http://www.jite.org/documents/Vol14/JITEv14IIPp237-254English1997.pdf
Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education, 59(3), 277–292. Retrieved from http://www.jstor.org/stable/25622183