Teachers around the country have a lot of questions this fall. How will the lack of summative assessment data from last spring impact the school year? How quickly can I determine what students may have missed in the chaotic close of the 2019–2020 school year? Are remote assessments accurate? How can I parse the interim and formative assessment data of incoming students and focus on the areas that will provide the greatest return?
The answers will vary from school to school, but across the board, assessment is going to be critical in getting students back on track.
Missing and Remote Assessments: Do We Have the Data We Need?
The majority of schools closed in the spring before they had a chance to perform their standard end-of-year summative assessments. That’s one source of data that teachers didn’t have as they planned for the new academic year.
Compounding this issue, students’ abilities are likely to be far more varied than they are at the beginning of a typical school year. Again, there are many unanswered questions: What material did students still need to cover when school buildings closed? How much new instruction was provided via distance learning? Did students have internet access? Did students stay engaged, or did they disconnect from school completely? Did students have family members who were able to step in and support their progress, or were they struggling on their own?
Teachers will have to rely more heavily on fall assessments to understand where their students are and what learning gaps exist within their classrooms. Of course, many schools are starting the fall with virtual instruction, raising the question of whether remote testing is as effective or accurate as in-person assessment.
The folks at Imagine Schools, a charter network with 30,000 students at schools in seven states and the District of Columbia, answered this last question by 1) conducting remote assessments in the spring; and 2) commissioning a study of that data. Dr. Bill Younkin of the Biscayne Research Group examined the scores of approximately 5,000 students at 16 of Imagine’s schools and found that remote assessment was as effective as in-person assessment, with a couple of minor exceptions.
“Particularly low scores were a little less common among students being assessed remotely,” noted Younkin, “while exceptionally high scores were slightly more common. These effects were both observed at the lower grades, but virtually disappeared at the higher grades.”