Artificial Intelligence students unaware that their teaching assistant is an AI
Ashok Goel, a professor of computing at Georgia Tech, deployed an AI teaching assistant in the online Q&A forums for one of his courses last semester. Young academics breaking their backs in hopes of one day snagging a TA job will be alarmed to learn that the A.I., named Jill, performed so well that most of the course’s students couldn’t tell her apart from the eight human TAs performing the same duties.
Every time Professor Goel teaches Knowledge Based Artificial Intelligence, a course whose goal is to “build AI agents capable of human-level intelligence and gain insights into human cognition,” about 300 students post roughly 10,000 messages in the online forum. That is too much for Goel and his eight teaching assistants (TAs) to handle. “One of the main reasons many students drop out is because they don’t receive enough teaching support. We created Jill as a way to provide faster answers and feedback,” he said.
The A.I.’s full name is Jill Watson; she is built on the same IBM Watson platform that beat humans at Jeopardy! over five years ago, so it goes without saying that she is smart. Still, during the first few weeks in January, Jill struggled and needed some coaching. To train the system to answer questions correctly, Goel fed it forum posts from previous semesters of the class. This gave Jill an extensive background in common questions and how they should be answered. Goel tested the system privately for months, having his teaching assistants check whether Jill’s answers were correct.
“Initially her answers weren’t good enough because she would get stuck on keywords,” Lalith Polepeddi, a graduate student who helped build Jill, told the Georgia Tech News Center. “For example, a student asked about organizing a meet-up to go over video lessons with others, and Jill gave an answer referencing a textbook that could supplement the video lessons — same keywords — but different context. So we learned from mistakes like this one, and gradually made Jill smarter.”
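The failure Polepeddi describes can be illustrated with a toy sketch. The article does not describe Jill's actual matching method, so the function below is purely hypothetical: it scores two questions by keyword overlap, which rewards the shared phrase "the video lessons" even though one question asks about a textbook and the other about a meet-up.

```python
# Illustrative only — NOT Jill Watson's implementation. A naive
# keyword-overlap matcher confuses questions that share words but
# differ in intent, the exact mistake described in the article.

def keyword_overlap(question_a, question_b):
    """Jaccard similarity over the words of two questions."""
    words_a = set(question_a.lower().split())
    words_b = set(question_b.lower().split())
    if not words_a or not words_b:
        return 0.0
    return len(words_a & words_b) / len(words_a | words_b)

past_question = "is there a textbook that could supplement the video lessons"
new_question = "can we organize a meet-up to go over the video lessons together"

# Overlap is well above zero purely because of shared keywords,
# so a matcher using only this score would pick the wrong answer.
print(keyword_overlap(past_question, new_question))
```

Going beyond bare keyword counts toward context, as Goel's team did, is what "adding more layers of decision-making" addresses in the next paragraph.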
Goel tweaked the software, adding more layers of decision-making. Eventually Jill’s answers were good enough. The system is allowed to answer a question only if it calculates that it is at least 97 percent confident in its answer; Goel found that this was the threshold at which he could guarantee the system was accurate. There are still many questions Jill can’t handle, and those are reserved for the human teaching assistants.
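The routing rule Goel describes can be sketched as follows. The names and data structures here are assumptions for illustration; the article only states the rule itself: answer automatically at 97 percent confidence or above, otherwise defer to a human TA.

```python
# Hypothetical sketch of the confidence gate described in the article.
# Jill answers only when her best candidate clears the 97% threshold;
# everything else is routed to a human teaching assistant.

CONFIDENCE_THRESHOLD = 0.97

def route_question(candidate_answers):
    """Return ('auto_answer', text) or ('human_ta', None).

    candidate_answers: list of (answer_text, confidence) pairs,
    where confidence is a float in [0, 1].
    """
    if not candidate_answers:
        return ("human_ta", None)
    best_text, best_conf = max(candidate_answers, key=lambda pair: pair[1])
    if best_conf >= CONFIDENCE_THRESHOLD:
        return ("auto_answer", best_text)
    return ("human_ta", None)

# One candidate clears the bar, so the question is answered automatically.
print(route_question([("It is due March 3.", 0.99),
                      ("See the syllabus.", 0.80)]))
```

A strict threshold like this trades coverage for precision: Jill stays silent on most questions but is almost never wrong when she speaks, which is what allowed her to pass as human.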
On April 26, when the professor informed his class that Jill was a virtual TA and not a real person, the student response was “uniformly positive,” with at least one class-goer’s mind being “blown,” the university news reports. Most students never doubted that Jill Watson was human, just like the other TAs. The ruse, however, wasn’t perfect: a few students suspected early on that Jill might be a computer (although some of the human TAs fell under the same suspicion).
Goel plans to use Jill again in a class this fall, but will likely change her name so students face the challenge of guessing which teaching assistant isn’t human. While he doesn’t foresee the chatbot replacing teaching assistants or professors, he expects its question-answering abilities to be an invaluable asset for massive open online courses, where students often drop out and rarely get the chance to engage with a human instructor. With more human-like interaction, Goel expects online learning could become more appealing to students and lead to better educational outcomes.
“To me this is a grand challenge,” Goel said. “Education is such a huge priority for the entire human race.”