The eVokability Interactive Costume Project
The eVokability Project is an interactive performance system in which wearable devices, built from customized sensing hardware and software, control projected media. eVokability has been developed in collaboration between performers with disabilities and interactive media designers, as a means to extend expressive capabilities in a performance space.
The performer wears a Costume that senses body and voice dynamics and translates them into moving images and sound, reflecting the unique contours of her body's capabilities, concentrated in a particular expressive action. The Costume allows the performer to use interactive media both to make herself more visible to the viewer and to project her own viewpoint. The patterns of movement and voice become the dynamic controllers of a multimedia narrative, a layering of multiple channels of expression: spoken, textual, filmic, sonic, kinesthetic, live and mediated.
The eVokability Project has developed two Costumes thus far: one uses the voice to navigate a virtual space, and the other uses movements of the hands and arms to animate text and control multimedia. Both are designed for wheelchair users.
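The core pattern described above, a sensor stream on the body driving a media parameter in the projection, can be sketched in outline. The following is a minimal illustrative sketch, not the project's actual software: the function and class names, the sensor range, and the choice of exponential smoothing are all assumptions introduced here to show how a jittery body or voice signal might be steadied into a usable media control value.

```python
def normalize(value, lo, hi):
    """Clamp a raw sensor reading into the 0.0-1.0 control range."""
    if hi <= lo:
        raise ValueError("hi must exceed lo")
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

class SmoothedControl:
    """Exponential moving average to steady a jittery sensor stream
    before it drives a media parameter (hypothetical helper)."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha  # smoothing factor: higher = more responsive
        self.level = 0.0
    def update(self, raw, lo, hi):
        target = normalize(raw, lo, hi)
        self.level += self.alpha * (target - self.level)
        return self.level

# Illustrative use: map a voice-amplitude reading (assumed 0-1023 range)
# to, say, the opacity of a projected image.
ctrl = SmoothedControl(alpha=0.5)
opacity = ctrl.update(512, lo=0, hi=1023)
```

In a real Costume the `update` call would run once per sensor frame, and the resulting value would be sent on to the projection software; the smoothing step matters because raw body and voice signals are noisy, and unfiltered values make projected media flicker rather than flow with the gesture.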
The development of the eVokability Interactive Costume Prototype was made possible by generous support from the Temple University Junior Faculty Research Initiative.