To help with his class this spring, a Georgia Tech professor hired Jill Watson, a teaching assistant unlike any other in the world. Throughout the semester, she answered questions online for students, relieving the professor’s overworked teaching staff.
But, in fact, Jill Watson was an artificial intelligence bot.
Ashok Goel, a computer science professor, did not reveal Watson’s true identity to students until after they’d turned in their final exams.
Students were amazed. “I feel like I am part of history because of Jill and this class!” wrote one in the class’s online forum. “Just when I wanted to nominate Jill Watson as an outstanding TA in the CIOS survey!” said another.
Now Goel is forming a business to bring the chatbot to the wider world of education. While he doesn’t foresee the chatbot replacing teaching assistants or professors, he expects its question-answering abilities to be an invaluable asset for massive open online courses, where students often drop out and generally don’t get the chance to engage with a human instructor. With more human-like interaction, Goel expects online learning could become more appealing to students and lead to better educational outcomes.
“To me this is a grand challenge,” Goel said. “Education is such a huge priority for the entire human race.”
At the start of this semester, Goel provided his students with a list of nine teaching assistants, including Jill, an automated question-answering service Goel developed with the help of some of his students and IBM.
Goel and his teaching assistants receive more than 10,000 questions a semester from students on the course’s online forum. Sometimes the same questions are asked again and again. Last spring he began to wonder if he could automate the burden of answering so many repetitive questions.
As Goel looked for a technology that could help, he settled on IBM Watson, which he had used for several other projects. Watson, an artificial intelligence system, was designed to answer questions, so it seemed like a strong fit.
To train the system to answer questions correctly, Goel fed it forum posts from the class’s previous semesters. This gave Jill an extensive background in common questions and how they should be answered.
Goel tested the system privately for months, having his teaching assistants examine whether Jill’s answers were correct. Initially the system struggled with similar questions such as “Where can I find assignment two?” and “When is assignment two due?” Goel tweaked the software, adding more layers of decision-making to it. Eventually Jill reached the point where its answers were good enough.
“I cannot create chaos in my classroom. Jill had to be almost perfect: as good as a human TA, or as good as I am,” Goel said.
The system is only allowed to answer questions if it calculates that it is 97 percent or more confident in its answer. Goel found that was the threshold at which he could guarantee the system was accurate.
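The gating rule Goel describes — answer only when confidence clears a 97 percent threshold, otherwise defer to a human — can be sketched roughly as follows. This is a hypothetical illustration, not Goel’s actual code; the function names and the shape of the candidate-answer data are assumptions.

```python
CONFIDENCE_THRESHOLD = 0.97  # Goel's reported cutoff

def route_question(question, candidate_answers):
    """Pick the best candidate answer, or defer to a human TA.

    `candidate_answers` is a list of (answer_text, confidence) pairs,
    e.g. as scored by a question-answering model. Hypothetical interface.
    """
    if not candidate_answers:
        return ("human_ta", None)
    # Take the candidate the model is most confident about.
    best_answer, confidence = max(candidate_answers, key=lambda pair: pair[1])
    if confidence >= CONFIDENCE_THRESHOLD:
        return ("bot", best_answer)   # the bot answers directly
    return ("human_ta", None)         # escalate to a human TA

# A confident match is answered by the bot...
print(route_question("When is assignment two due?",
                     [("Assignment 2 is due March 15.", 0.99)]))
# ...while a low-confidence question is escalated to a person.
print(route_question("Can I get an extension?",
                     [("Assignment 2 is due March 15.", 0.41)]))
```

The design choice here is conservative by construction: a wrong answer from a supposed TA is far more damaging in a classroom than a slow one, so everything below the threshold falls through to humans.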
There are still many questions Jill can’t handle. Those questions are reserved for human teaching assistants.
Goel plans to use Jill again in a class this fall, but will likely change its name so students have the challenge of guessing which teaching assistant isn’t human.
“A really fun thing in this class has been once students knew about Jill they were so motivated, so engaged. I’ve never seen this kind of motivation and engagement,” Goel said. “What a beautiful way of teaching artificial intelligence.”