Thrii

Nicole Lehrer, David Tinapple, Tatyana Koziupa, Meng Chen, Assegid Kidane, Stjepan Rajko, Isaac Wallis, Michael Baran, David Lorig, Diana Siwiak, Loren Olson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citations

Abstract

Thrii is a multimodal interactive installation that explores levels of movement similarity among its participants. Each of the three participants manipulates a large spherical object whose movement is tracked via an embedded accelerometer. An analysis engine computes the similarity of movement for each possible pair of objects, as well as self-similarity (e.g., repetition of movement over time) for each object. The extent of similarity among the movements of the objects is communicated through a visualization projected on a three-sided pyramid, a non-directional audio environment, and lighting produced by the spherical objects. The installation is intended to examine notions of collaboration among its participants. We have found that participants engage with Thrii through exploration of collaborative gestures.
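The paper's keywords list dynamic time warping as the basis for the movement-similarity analysis. As a minimal sketch (not the authors' implementation; the function and signal names here are hypothetical), pairwise similarity between two accelerometer-magnitude streams could be computed with a classic DTW recurrence, where lower distances indicate more similar gestures even when they are offset in time:

```python
import math

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two 1-D sequences,
    e.g. accelerometer-magnitude streams from two of the spheres."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = DTW distance between the prefixes a[:i] and b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]

# Two copies of the same gesture, one delayed, score closer together
# than a gesture paired with an unrelated movement.
wave_a = [math.sin(t / 5.0) for t in range(50)]
wave_b = [math.sin((t - 3) / 5.0) for t in range(50)]  # same gesture, delayed
noise  = [((t * 37) % 11) / 11.0 for t in range(50)]   # unrelated movement

print(dtw_distance(wave_a, wave_b) < dtw_distance(wave_a, noise))
```

Because DTW aligns sequences nonlinearly in time, a repeated gesture performed slightly out of phase by two participants still registers as similar, which matches the installation's focus on collaborative, loosely synchronized movement.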

Original language: English (US)
Title of host publication: MM'10 - Proceedings of the ACM Multimedia 2010 International Conference
Pages: 1425-1428
Number of pages: 4
DOIs
State: Published - 2010
Event: 18th ACM International Conference on Multimedia, ACM Multimedia 2010, MM'10 - Firenze, Italy
Duration: Oct 25 2010 - Oct 29 2010

Publication series

Name: MM'10 - Proceedings of the ACM Multimedia 2010 International Conference

Conference

Conference: 18th ACM International Conference on Multimedia, ACM Multimedia 2010, MM'10
Country/Territory: Italy
City: Firenze
Period: 10/25/10 - 10/29/10

Keywords

  • dynamic time warping
  • generative audio
  • generative video
  • movement similarity
  • tangible objects

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Computer Vision and Pattern Recognition
  • Human-Computer Interaction
  • Software
