Cross-modal terrains: Navigating sonic space through haptic feedback

Gabriella Isaac, Lauren Hayes, Todd Ingalls

Research output: Contribution to journal · Conference article · peer-review

Abstract

This paper explores the idea of using virtual textural terrains as a means of generating haptic profiles for force-feedback controllers. This approach breaks from the paradigm established within audio-haptic research over the last few decades, where physical models within virtual environments are designed to transduce gesture into sonic output. We outline a method for generating multimodal terrains using basis functions, which are rendered into monochromatic visual representations for inspection. This visual terrain is traversed using a haptic controller, the NovInt Falcon, which in turn receives force information based on the grayscale value at its location in this virtual space. As the image is traversed by a performer, the levels of resistance vary, and the image is realized as a physical terrain. We discuss the potential of this approach to afford engaging musical experiences for both the performer and the audience, as iterated through numerous performances.
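To make the pipeline in the abstract concrete, the following is a minimal sketch, not code from the paper: a grayscale terrain is synthesized by summing basis functions (Gaussian bumps are an assumed choice here, as are the image size, bump count, and force range), and the grayscale value under the controller's normalized position is scaled into a resistive force of the kind a device such as the Falcon could render.

```python
import numpy as np

def gaussian_basis(x, y, cx, cy, sigma):
    """One 2-D Gaussian bump centered at (cx, cy); an assumed basis choice."""
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

def make_terrain(size=512, n_bumps=12, seed=0):
    """Sum randomly placed basis functions, then normalize to a
    grayscale image with values in [0, 1]."""
    rng = np.random.default_rng(seed)
    ys, xs = np.mgrid[0:size, 0:size]
    terrain = np.zeros((size, size))
    for _ in range(n_bumps):
        cx, cy = rng.uniform(0, size, 2)
        sigma = rng.uniform(size * 0.02, size * 0.15)
        terrain += rng.uniform(-1.0, 1.0) * gaussian_basis(xs, ys, cx, cy, sigma)
    t_min, t_max = terrain.min(), terrain.max()
    return (terrain - t_min) / (t_max - t_min)

def resistance_at(terrain, x_norm, y_norm, max_force=8.0):
    """Map a normalized (x, y) controller position to an opposing force
    proportional to the grayscale value at that point in the image.
    max_force is an illustrative scaling, not a value from the paper."""
    h, w = terrain.shape
    i = min(int(y_norm * h), h - 1)
    j = min(int(x_norm * w), w - 1)
    return terrain[i, j] * max_force

terrain = make_terrain()
print(resistance_at(terrain, 0.5, 0.5))  # force felt at the image center
```

Dark and light regions of the rendered image thus correspond directly to low and high resistance, which is what lets the performer inspect the haptic profile visually before traversing it.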

Original language: English (US)
Pages (from-to): 38-41
Number of pages: 4
Journal: Proceedings of the International Conference on New Interfaces for Musical Expression
State: Published - 2017
Event: International Conference on New Interfaces for Musical Expression, NIME 2017 - Copenhagen, Denmark
Duration: May 15, 2017 - May 19, 2017

Keywords

  • Cross-modal mapping
  • Haptic interfaces
  • Multimodal interaction
  • Performance
  • Terrain

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Signal Processing
  • Instrumentation
  • Music
  • Human-Computer Interaction
  • Hardware and Architecture
  • Computer Science Applications
