Paepcke, Andreas and Kairam, Sanjay (2012) EchoTree: Engaged Conversation when Capabilities are Limited. Technical Report. Stanford InfoLab.
Abstract
We describe the collaborative use of word tree visualizations, EchoTrees, to facilitate face-to-face and remote communication with speech- and movement-impaired individuals. EchoTree is designed to bridge the inevitable conversational dead space while the impaired person uses assistive technologies to generate written or artificially spoken sentences. Visualizations that guess multiple conversational directions in which the impaired person might be headed keep conversation partners engaged. Partners may call out possibilities, which the impaired person can confirm or dismiss. Correct guesses accelerate conversational progress. EchoTrees are browser-based and interactive. Multiple parties may view and interact with the same stream of EchoTrees from desktops, tablets, or smartphones. The application may also be used for collaborative, geographically remote storytelling games, whose participants may or may not be physically impaired. We describe our implementation and provide preliminary measurements of how effectively EchoTrees predict conversation flow for one particular underlying set of text materials.
Item Type: Techreport (Technical Report)
Uncontrolled Keywords: disabilities, collaboration, word tree, visualization
Projects: Digital Libraries
ID Code: 1054
Deposited By: Andreas Paepcke
Deposited On: 24 Sep 2012 15:49
Last Modified: 24 Sep 2012 15:49