New software can help ‘proofread’ Wikipedia


A new tool might help fight malicious editing that introduces incorrect or misleading information on sites such as Wikipedia, UPI reports. University of Iowa researchers are developing software that can detect potential vandalism and improve the accuracy of Wikipedia entries, a university release says. The tool is an algorithm that compares new edits to a page against the existing words in the rest of the entry, then alerts an editor or page manager if it senses a problem.

Existing tools can spot obscenities or vulgarities, or flag major edits such as the deletion of entire sections or sweeping changes throughout a document. But those tools are built manually, with prohibited words and phrases entered by hand, so they are time-consuming to maintain and easy to evade, the UI researchers say. Their automatic statistical language model algorithm instead flags words or vocabulary patterns that it cannot find anywhere else in the entry at any point since it was first written. For instance, when someone wrote “Pete loves PANCAKES” into the Wikipedia entry for Abraham Lincoln, the algorithm recognized the graffiti as potential vandalism after scanning the rest of the entry and finding no mention of pancakes…
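The release doesn’t spell out how the researchers’ statistical language model actually scores an edit, but the basic idea it describes can be illustrated with a short sketch. The Python snippet below (all names hypothetical) simply flags words in a new edit that never appear in any earlier revision of the entry; it is a much cruder check than a real language model, but it captures the “pancakes” example.

```python
# Rough sketch of the idea described above -- not the Iowa team's actual model.
# It flags edit tokens that never appear anywhere in the entry's revision history,
# a crude stand-in for a statistical language model over the entry's vocabulary.
import re

def tokenize(text):
    """Lowercase word tokens, ignoring punctuation and digits."""
    return re.findall(r"[a-z']+", text.lower())

def flag_suspicious_words(new_edit, revision_history):
    """Return words in the new edit that never appeared in any prior revision."""
    known_vocab = set()
    for revision in revision_history:          # every version since the entry was written
        known_vocab.update(tokenize(revision))
    return [w for w in tokenize(new_edit) if w not in known_vocab]

# Hypothetical example mirroring the article's "Pete loves PANCAKES" graffiti
history = ["Abraham Lincoln was the 16th president of the United States.",
           "Lincoln led the nation through the American Civil War."]
edit = "Abraham Lincoln loved pancakes. Pete loves PANCAKES."
print(flag_suspicious_words(edit, history))
# -> ['loved', 'pancakes', 'pete', 'loves', 'pancakes']
```

In practice a reviewer-facing tool would weight such out-of-vocabulary words rather than flag them outright, since legitimate edits also introduce new terms.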


Laura Ascione
