Principal Investigator Pattie Maes
Project Website http://www.media.mit.edu/projects/wordsense-learning-language-in-the-wild/overv…
As more powerful and spatially aware Augmented Reality devices become available, we can leverage the user's context to augment reality with audio-visual content that enables learning in the wild. Second-language learners can explore their environment to acquire new vocabulary relevant to their current location. Items are identified, "labeled," and spoken aloud, allowing users to make meaningful connections between objects and words. Over time, word groups and sentences can be tailored to the user's current level of competence. When desired, a remote expert can join in real time for a more interactive "tag-along" learning experience.
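The identify-label-speak flow described above could be sketched as follows. This is a minimal illustration only: the lexicon, function names, and the stubbed-out detection step are all hypothetical, standing in for the device's actual vision, AR-anchoring, and text-to-speech APIs.

```python
# Hypothetical sketch of the labeling step: map detected object names to
# target-language vocabulary the learner should see and hear.
# Real object detection, AR anchoring, and speech synthesis are omitted.

LEXICON = {  # tiny illustrative English -> Spanish vocabulary set
    "chair": "la silla",
    "window": "la ventana",
    "cup": "la taza",
}


def label_objects(detected, lexicon=LEXICON):
    """Return (object, translated_label) pairs for objects the lexicon covers.

    Objects without an entry (e.g. words beyond the learner's current level)
    are simply skipped, mirroring how word groups could be customized.
    """
    return [(obj, lexicon[obj]) for obj in detected if obj in lexicon]


# Example: a detection pass found three objects; only two are in the lexicon.
labels = label_objects(["chair", "cup", "laptop"])
print(labels)  # [('chair', 'la silla'), ('cup', 'la taza')]
```

Each returned pair would then be rendered as a spatial label next to the object and passed to a text-to-speech engine so the word is also spoken aloud.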