‘Good’ online assessments could bring much-needed credibility to online learning

As K-12 schools across the country begin to implement online learning, cheating and a lack of credibility are among the main reasons skeptics hesitate to support online learning, especially MOOCs.

The answer to reducing cheating, and to lending credibility to less traditional forms of online learning, lies in good assessments, say supporters.

“Assessments are the lynchpins of [online learning],” said David Smetters, CEO of Respondus, maker of a Windows-based exam creation tool. “If you go to a [class], it’s certainly possible to learn things. But when you actually take the assessments designed for it, you can demonstrate mastery of the content. An instructor then feels comfortable providing a grade…or some type of badge.”

“Assessments are the engine of this credibility cycle,” he continued.

Smetters also argues that when students are effectively assessed during a MOOC and cheating is reduced as a result, the credibility of MOOCs rises.

Of course, now the question becomes, ‘What makes a good online assessment?’


“The top issue relates to integrity of online assessments,” explained Smetters. “Is the right student taking the exam? Is the student searching the internet for answers or accessing other applications during the exam? Is the student using a second computer or accessing other devices or materials during the online exam? Is the student receiving help from another person, or working in a group during the exam?”

Attention to these issues is currently being driven by accreditation mandates, said Smetters, but the challenges have been around for a while.

Other concerns with online assessments involve FERPA compliance and student privacy during exam proctoring.

However, some of the leading providers of online assessments to institutions consider the following features ‘must-haves’ for ‘good’ online assessments:

1. Auto-proctoring: A fancy way of saying ‘no human proctors.’

“To ensure that a learner does not simply Google his way to a certification, a good assessment partner should be able to provide a cheating-proof environment,” said Mettl, an online assessment provider. “Photo ID verification, signatures, typing styles, screen-sharing, and webcams are known and tested ways to avoid cheating in exams. The use of a webcam and microphone enables learners to take an exam at home, while the camera does its work recording the associated screen activity.”

Auto-proctoring also helps cut down the cost of hiring human proctors.

“If you need to rely on human beings to grade and proctor the exams, the costs can skyrocket,” said Smetters.

One of Respondus’ solutions, Respondus Monitor, uses a student’s webcam to record the assessment session, allowing exams to be taken in a non-proctored environment and deterring students from accessing other resources during an exam, such as a phone or second computer.

It also helps ensure that the right student is taking the exam and that the student isn’t getting help from others.

Because the monitoring is automated, the company can price the application using FTE or seat pricing rather than charging per exam.

“There are no restrictions to the number of exams that use Respondus Monitor in a course—the cost is the same,” said Smetters. “In fact, the typical instructor uses Respondus Monitor with over six assessments each term. That would be too cost prohibitive with live proctoring solutions. If an institution is doing live proctoring with over 500 exams a year, Respondus Monitor is almost always a huge cost savings.”

2. Scalability

According to Mettl, scalability of assessment software is critical for smooth functioning. For example, a course on the ancient Greeks—a relatively obscure topic—may have around 2,000 tests per annum, while a popular course like coding will amount to around 20,000 tests in the same time. The assessment platform should be able to scale seamlessly.

A highly scalable solution shouldn’t require students to schedule their exam sessions days or weeks in advance, said Smetters.

“During busy testing periods, our system can handle more users simultaneously than the live proctoring services can support in an entire day,” he emphasized. “Students like not having to schedule exam sessions in advance; and if students have to pay for the proctoring cost themselves, they like that our solution is $10 for unlimited use during a term, compared to $20 to $35 charged for each exam by live proctoring services.”
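To put those figures in rough perspective: a student taking the typical six proctored exams in a term would pay somewhere between $120 and $210 under per-exam live proctoring ($20 to $35 times six exams), compared with a flat $10 for unlimited use of the automated tool that term. This is an illustrative comparison based only on the prices quoted above; actual pricing will vary by institution and vendor.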


3. Intelligent Q & A options.

Assessment tools must offer good flexibility in terms of the types of questions or files that can be submitted, said Smetters, as “the needs of a computer science course are very different than those of history courses.”

“Some learners have to be assessed on specialized skills such as coding, spoken language skills, and other knowledge,” said Mettl. “A good assessment platform should be able to seal the gap between theory and practice.”

EdX recently introduced artificial intelligence software to grade essays and short answers, made available free on the web. The software also gives students instant feedback and allows them to revise and resubmit their essays.

4. Good service.

“Network up-time is absolutely critical,” said Mettl. “Interruption of tests and loss of data owing to electricity outages, software crashes, server trouble, et cetera, are absolutely off-limits.”

Since everything is online, a great end-user experience is also one that involves no downloads, no installations and smooth access to learning via any internet-enabled device.

“Assessments should also integrate seamlessly into the online course rather than being conducted offline in separate test centers,” said Mettl.

5. Credible analysis.

“Good online assessments should be able to provide a comprehensive assessment with proper analysis and insight into the score reports,” said Mettl. “The insights should be clear, actionable results, ranging from micro analysis, like average time spent on a question, to macro analysis, like comparison of pass/fail rates among different batches.”

The one downside to good assessments, however, even with auto-proctoring, is the cost.

“There is a cost to have effective online assessments,” said Smetters. “Grading an assessment can be time intensive if it isn’t automated. Implementing a system that prevents cheating can add to the cost, too.”

However, the cost may be worth it for any school or institution that would like its online learning offerings to be taken seriously.

“The biggest pitfall of disorganized assessment is, of course, its impact on the brand of the [school or institution],” said Mettl. “A certificate from an institute or school is more a testimony to the institute than to the candidate.”

Meris Stansbury
