Aiming to learn as we do, a machine teaches itself

Researchers are fine-tuning a computer system that is trying to master semantics by learning more like a human, reports the New York Times.

Give a computer a task that can be crisply defined—win at chess, predict the weather—and the machine bests humans nearly every time. Yet when problems are nuanced or ambiguous, or require combining varied sources of information, computers are no match for human intelligence. Few challenges in computing loom larger than unraveling semantics, or understanding the meaning of language. One reason is that the meaning of words and phrases hinges not only on their context, but also on background knowledge that humans learn over years, day after day.

Now, a team of researchers at Carnegie Mellon University—supported by grants from the Defense Advanced Research Projects Agency (DARPA) and Google, and tapping into a supercomputing cluster provided by Yahoo—is trying to change that. The researchers’ computer was primed with some basic knowledge in various categories and set loose on the web with a mission to teach itself.

The Never-Ending Language Learning system, or NELL, has made an impressive showing so far. NELL scans hundreds of millions of web pages for text patterns that it uses to learn facts—390,000 to date—with an estimated accuracy of 87 percent. These facts are grouped into semantic categories: cities, companies, sports teams, actors, universities, plants, and 274 others. NELL also learns facts that are relations between members of two categories…
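The core idea—scanning text for patterns that signal category membership—can be sketched roughly as follows. This is a minimal, hypothetical illustration only; the regex patterns and categories below are invented for the example, and NELL's actual pattern-learning machinery is far more sophisticated (it bootstraps new patterns from facts it has already learned, rather than using a fixed list).

```python
import re

# Hypothetical lexical patterns (NOT NELL's actual pattern set):
# each regex has one capture group naming a candidate entity,
# mapped to the semantic category the pattern suggests.
PATTERNS = {
    r"cities such as ([A-Z][a-z]+)": "city",
    r"companies like ([A-Z][a-z]+)": "company",
    r"([A-Z][a-z]+) University": "university",
}

def extract_facts(text):
    """Return a set of (entity, category) facts found by pattern matching."""
    facts = set()
    for pattern, category in PATTERNS.items():
        for match in re.finditer(pattern, text):
            facts.add((match.group(1), category))
    return facts

sample = ("The team toured cities such as Pittsburgh, met researchers "
          "from Carnegie University, and visited companies like Yahoo.")
print(sorted(extract_facts(sample)))
```

A real never-ending learner would then use high-confidence facts to propose new patterns ("X, a city in Y"), gradually expanding both its fact base and its extraction rules—which is also where errors can compound if a bad fact slips in.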

Laura Ascione
