Jiannan Li1,
Maurício Sousa1,
Karthik Mahadevan1,
Bryan Wang1,
Paula Akemi Aoyagui1,
Nicole Yu1,
Angela Yang1,
Ravin Balakrishnan1,
Anthony Tang2,
Tovi Grossman1
1University of Toronto
2Singapore Management University
Stargazer autonomously tracks regions of interest and changes its behaviours according to subtle cues, including instructors' body movements, gestures, and speech.
By default, Stargazer follows the instructor's hands. Instructors can use gestures to change what the robot tracks; for example, a pointing gesture directs the robot to focus on the object being pointed at.
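A minimal sketch of how such cue-driven retargeting could work, assuming hypothetical hand-tracking and gesture-recognition components (the names `SubjectSelector`, `locate_hands`, `pointed_location`, and the gesture labels are illustrative, not Stargazer's actual implementation):

```python
from typing import Optional, Tuple

Target = Tuple[float, float]  # normalized image coordinates

class SubjectSelector:
    """Decides, frame by frame, what the camera robot should track.

    Default policy: follow the instructor's hands. A recognized
    pointing gesture overrides this and retargets the camera to
    the object being pointed at, until the override is cleared.
    """

    def __init__(self, hand_tracker, gesture_recognizer):
        self.hand_tracker = hand_tracker            # e.g. a skeletal hand tracker
        self.gesture_recognizer = gesture_recognizer
        self.override: Optional[Target] = None      # set by a pointing gesture

    def update(self, frame) -> Optional[Target]:
        gesture = self.gesture_recognizer.classify(frame)
        if gesture == "pointing":
            # Retarget to where the pointing ray meets the workspace.
            self.override = self.gesture_recognizer.pointed_location(frame)
        elif gesture == "dismiss":                  # hypothetical reset cue
            self.override = None
        if self.override is not None:
            return self.override
        # Fall back to the default hand-following behaviour.
        return self.hand_tracker.locate_hands(frame)
```

Modelling the pointed-at object as a clearable override keeps hand-following as the fallback, so the camera would return to its default framing once the instructor stops directing it.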
They can further use speech to change camera parameters, such as angle and zoom levels.
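As a rough illustration, recognized utterances could be mapped to camera-parameter updates along these lines; the command phrases, zoom bounds, and angle names are illustrative assumptions, not the system's actual vocabulary:

```python
import re

def apply_speech_command(transcript: str, camera: dict) -> dict:
    """Map a recognized utterance to camera-parameter changes."""
    text = transcript.lower()
    if re.search(r"\bzoom in\b", text):
        camera["zoom"] = min(camera["zoom"] * 1.5, 4.0)   # clamp to a max zoom
    elif re.search(r"\bzoom out\b", text):
        camera["zoom"] = max(camera["zoom"] / 1.5, 1.0)   # never below 1x
    else:
        angle = re.search(r"\b(low|high|overhead) angle\b", text)
        if angle:
            camera["angle"] = angle.group(1)              # e.g. "low", "high"
    return camera

# Example: a spoken "zoom in" request raises the zoom level.
camera = {"zoom": 1.0, "angle": "neutral"}
apply_speech_command("Stargazer, zoom in on this part", camera)
assert camera["zoom"] == 1.5
```

In practice these utterances would come from a speech recognizer running alongside the gesture cues above; the sketch covers only the parameter-update step.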
Our participants were able to create how-to videos on a diverse range of topics using Stargazer. Please see our paper for more details, and the supplemental materials for more video results.