The dialogue of this research is between humans, design, and tools. The product targets a digital-age audience: designers and music and art lovers who live under high pressure or daily anxiety and would appreciate a light generative interface for their resting time. This research proposes an XR MIDI tool that can be used not only in large-stage live music performance but also for individuals' daily use. The broader goal of this research falls within the vision of providing convenient, playful digital products for emotion regulation and anxiety relief. The theoretical approach rests on two parts: synesthesia simulation and emotion regulation strategy.
Synesthesia refers to the production of a sense impression relating to one sense or part of the body by stimulation of another sense or part of the body. According to multiple studies, a lower, unconscious degree of synesthesia is widely found in non-synesthetes during music perception [1]. Vision, hearing, and the other senses are strongly connected. Many musicians also claim that they can hear the color of a melody and can describe details such as thick, dark, sweet, or white. Based on the theory of synesthesia, sound cues could be highly effective in enhancing the act of touching and the perception of motion, volume, and acceleration.
Games with immediate outputs and visual rewards are considered to have significant psychological impacts on the user. During the game experience, users often engage in various types of cognitive emotion regulation, such as reappraisal and acceptance. The experience also supports the building and fulfillment of self-control and self-confidence. Thus, a highly multi-sensory game can be effective for emotion regulation toward specific mental health and well-being goals. The main purpose of this research is to examine new possibilities for perception enhancement within the design of digital platforms, specifically how the experience of performing can be enhanced by transforming tangible notions into an intangible medium [2].
The initial concept of digitizing sculpting into an audio interface comes from the breakdown of four aspects: cross-cultural appreciation, the creation of individual imagination, materiality as medium and methodology, and focus as a mental aspect. The concept engages multi-perception in a closed-loop interface by preserving the user's attachment to art practice while transforming the materiality and medium of the creation process. During play, the loop constantly cycles among "listen", "sculpt", and "art generation feedback".
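The closed loop above can be sketched in a few lines. This is an illustrative Python sketch, not the Unity implementation; all function names and the pitch-to-deformation constants are hypothetical placeholders.

```python
# Sketch of the "listen" -> "sculpt" -> "art generation feedback" loop.
# All names and constants are hypothetical, for illustration only.

def listen(step):
    """Return a sound cue for this step (placeholder: pitch rises per step)."""
    return {"pitch": 1.0 + 0.1 * step}

def sculpt(cue):
    """Turn a sound cue into a surface deformation amount."""
    return cue["pitch"] * 0.5  # stronger pitch -> stronger deformation

def feedback(deformation):
    """Generate the visual feedback that feeds the next iteration."""
    return {"surface_offset": deformation}

def play_loop(steps):
    state = []
    for step in range(steps):
        cue = listen(step)            # listen
        deformation = sculpt(cue)     # sculpt
        state.append(feedback(deformation))  # art generation feedback
    return state

states = play_loop(3)
```

The point of the sketch is only the structure: each iteration's feedback is what the performer hears and sees going into the next "listen" step.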
In this game, the performer sculpts a virtual bubble from the inside out according to sound cues. At the initial stage, the performer is placed inside a small translucent bubble, which is visible in the app, although the performer cannot see the visuals during the performance. When the game starts, the performer can push out the bubble surface by pressing the sculpt button on the app and tilting the phone; the bubble then displays a manipulated surface with pattern gradients according to the live soundtrack playing on the Bose Frames. As the performer navigates the space and manipulates the bubble, live music is generated according to the action.
Action Design:
The surface manipulation modeling (Table 3) was studied in Rhino with Grasshopper and C#. Several forms of visual output were generated using mesh, vertex, and move commands. These studies are deployed as simple mouse-click interactions. In Table 3, each interaction's visuals are captured at 20 ms/frame.
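The core of the vertex-move study can be sketched as a brush that pushes vertices near a contact point along a direction, with falloff toward the brush rim. This is an illustrative Python stand-in, not the original Grasshopper/C# definition; the linear falloff and all parameter names are assumptions.

```python
# Minimal sketch of a vertex-move sculpt operation: vertices within a brush
# radius are displaced along `direction`, weighted by distance from `center`.
import math

def move_vertices(vertices, center, direction, radius, strength):
    """Push vertices near `center` along `direction` with linear falloff."""
    moved = []
    for v in vertices:
        d = math.dist(v, center)
        if d < radius:
            w = strength * (1.0 - d / radius)  # full push at center, zero at rim
            moved.append(tuple(v[i] + w * direction[i] for i in range(3)))
        else:
            moved.append(v)  # outside the brush: unchanged
    return moved

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
out = move_vertices(verts, center=(0.0, 0.0, 0.0),
                    direction=(0.0, 0.0, 1.0), radius=2.0, strength=1.0)
```

In the Rhino studies the same idea would operate on a mesh's vertex list, with the displaced vertices written back before the mesh is redrawn.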
User Scenarios for app daily use:
• daily lightweight play on a smartphone, between working sessions or during a coffee break
• effective for positive reappraisal as emotion regulation, reducing anxiety before returning to work
• refreshing body and brain with multi-sensory activities in the space, which also doubles as body-stretching time
User Scenarios for performance use:
• indoor live music performance with projection of live visual art generated by the performer
• cross-location and cross-time viewing of the performance in virtual reality
• revisiting recorded performance art pieces
Sound Testing:
A group of five designers and three non-designers was presented with a list of 55 sound cues and three simple surface-manipulation visuals. Testers were asked to pick five sound cues to represent the surface-manipulation visuals.
• Clothes Friction (weakest)
• Balloon Inflate
• Bubble Pop
• Water Drop
• Squeeze (strongest)
The outcome reveals a roughly linear match between sound pitch and surface deformation: sounds with a stronger pitch and clearer uniqueness were matched with stronger deformation.
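One simple way to aggregate such picks is to have each tester assign a deformation level to each cue and rank cues by their mean level. The sketch below illustrates that aggregation; the vote numbers are invented, not the study's data.

```python
# Hedged sketch of aggregating sound-test picks: each tester assigns a
# deformation level (1 = weakest, 5 = strongest) per cue; cues are then
# ranked by mean assigned level. All vote values below are invented.
from statistics import mean

votes = {
    "Clothes Friction": [1, 1, 2, 1],
    "Balloon Inflate":  [2, 3, 2, 2],
    "Bubble Pop":       [3, 3, 3, 4],
    "Water Drop":       [4, 4, 3, 4],
    "Squeeze":          [5, 5, 5, 5],
}

# Weakest-to-strongest ordering by mean assigned deformation level.
ranking = sorted(votes, key=lambda cue: mean(votes[cue]))
```

With data shaped like this, a near-monotonic ranking is what the reported "linear-like match" would look like numerically.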
Layer A presents a constantly playing soundtrack (Figure 1). The Bose Frames are used to orient the performer using head-rotation data, which comes from the gyro sensors. The coordinates of the performer's head rotation are mapped to sound pitch and volume with C# scripts in Unity: [Head Up & Down] is mapped to [Pitch Up & Down], and [Head Left & Right] is mapped to [Volume/Stereo Up & Down]. On the horizontal rotation plane, true north corresponds to the fastest and loudest sound cue.
The initial design mapped pitch to the horizontal plane, but during testing, testers tended to forget to move their heads up and down, because pitch is more playful to navigate. Thus, the final design maps pitch to the vertical plane and volume/stereo/beats to the horizontal plane.
Frames Orientation: Final Decision
•Head Up & Down: Pitch Up & Down
•Head Left & Right: Volume/Stereo Up & Down
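The final orientation mapping can be expressed as two simple functions of head pitch and yaw. This is a plain-Python sketch outside Unity; the angle ranges, the 1.0-centered pitch scale, and the pan normalization are assumptions, not the project's C# values.

```python
# Sketch of the final mapping: head pitch -> sound pitch, head yaw ->
# volume/stereo, loudest at true north (yaw = 0). Ranges are assumptions.
def map_orientation(pitch_deg, yaw_deg):
    """pitch_deg in [-90, 90]; yaw_deg in [-180, 180], 0 = true north."""
    sound_pitch = 1.0 + pitch_deg / 90.0              # 0.0 .. 2.0, 1.0 = level head
    volume = 1.0 - abs(yaw_deg) / 180.0               # loudest facing north
    stereo_pan = max(-1.0, min(1.0, yaw_deg / 90.0))  # left/right balance
    return sound_pitch, volume, stereo_pan

p, v, s = map_orientation(0.0, 0.0)  # level head, facing north
```

In Unity, values like these would typically be written to an audio source's pitch, volume, and stereo-pan properties each frame as new gyro readings arrive.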
The play-testing prototype was built using the Bose Frames and mouse left/right clicks on a web display, with Unity and C#. Three scripts control the sculpting manipulation of the virtual bubble: mousebeat (C#) [A], mouseinput (C#) [B], and sculpt (C#) [C] (Table 5).
Proposed Sculpt Behavior with smartphone based interface:
•Behavior 1 - slide over (location) - phone vertical
•Behavior 2 - stay pushing (size) - press button
•Behavior 3 - push harder (depth) - phone horizontal
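The three proposed behaviors amount to a small dispatch over phone orientation and the sculpt button. The sketch below illustrates one possible selection logic; the 45-degree threshold and the priority given to the button press are assumptions for illustration.

```python
# Hedged sketch of the proposed smartphone sculpt behaviors: the sculpt
# button and phone tilt select among the three behaviors above.
def select_behavior(phone_tilt_deg, button_pressed):
    """phone_tilt_deg: 0 = phone vertical, 90 = phone horizontal."""
    if button_pressed:
        return "stay pushing (size)"       # Behavior 2: press button
    if phone_tilt_deg >= 45:
        return "push harder (depth)"       # Behavior 3: phone horizontal
    return "slide over (location)"         # Behavior 1: phone vertical
```

A real implementation would read the tilt from the phone's accelerometer/gyro and re-evaluate the selection every frame while the performer moves.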
Play testing interface based on Web/VR:
•On Mouse Left Down - /bubble inflation/
•On Mouse Right Down - /rocket beats/
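The web play-testing input layer reduces to a two-way dispatch on the mouse button. This Python sketch stands in for the behavior of the mouseinput (C#) script; the 1.1 inflation factor and the returned labels are hypothetical.

```python
# Sketch of the web play-testing dispatch: left click inflates the bubble,
# right click triggers the rocket beats. Scale factor is an assumption.
def handle_click(button, bubble_scale):
    """Return (effect, new_scale) for a single mouse-down event."""
    if button == "left":                  # On Mouse Left Down
        return "bubble inflation", bubble_scale * 1.1
    if button == "right":                 # On Mouse Right Down
        return "rocket beats", bubble_scale
    return None, bubble_scale             # ignore other buttons

effect, scale = handle_click("left", 1.0)
```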
Four participants took part in play testing based on a web interface displayed on a projector, with a Bluetooth mouse as the push-input tool. While one participant sculpted, the other three evaluated the viewing experience. All four participants indicated that a patterned bubble better matched the sound. Participants also indicated that they would like to manipulate more bubbles or more complex visual materials, so as to be rewarded with a more complex visual outcome at the end.
Camera 1: rotating around the outside of the bubble
Camera 2: tracking the center of manipulation, displaying from inside (Figure 5)
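The two camera setups can be sketched with basic vector math. This is plain Python standing in for the Unity camera scripts; the circular orbit path and the look-vector convention are assumptions.

```python
# Sketch of the two cameras: Camera 1 orbits the bubble from outside on a
# horizontal circle; Camera 2 sits at the center, looking at the
# manipulation point. Coordinates are (x, y, z).
import math

def orbit_camera(center, radius, angle_rad):
    """Camera 1: position on a horizontal circle around the bubble center."""
    x = center[0] + radius * math.cos(angle_rad)
    z = center[2] + radius * math.sin(angle_rad)
    return (x, center[1], z)

def inside_camera(center, manipulation_point):
    """Camera 2: stays at the center; returns (position, look_direction)."""
    look = tuple(m - c for m, c in zip(manipulation_point, center))
    return center, look

pos = orbit_camera((0.0, 0.0, 0.0), 5.0, 0.0)
```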
The virtual bubble display can be viewed in VR or recorded together with the soundtrack generated by the user, which combines visual sharing, performing, and performer-viewer interaction (Figure 6).
In summary, the engagement of multi-sense performance in a closed-loop interface with audio augmented reality can successfully enhance people's perception of spatial volume, sound, motion, and orientation. The experiment also indicates that users find it easier to orient themselves on the horizontal plane than on the vertical plane, and easier to associate material deformation with sound pitch. Future play testing may measure the playful quality and sound-form association with data monitoring. The testing methodology for determining emotional activity might include before-and-after biometric measurement using pulse sensors, GSR sensors, or an EEG headband.
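The proposed before-and-after measurement could start with a per-participant change score, as in the sketch below. The pulse numbers are invented for illustration; a real study would apply a paired statistical test to actual sensor recordings.

```python
# Hedged sketch of the proposed before-after biometric analysis: mean
# pulse-rate change across participants. All sample values are invented.
from statistics import mean

before = [78, 85, 80, 90]   # hypothetical resting pulse before play (bpm)
after  = [72, 80, 76, 84]   # hypothetical pulse after play (bpm)

deltas = [a - b for a, b in zip(after, before)]  # per-participant change
mean_change = mean(deltas)  # negative = pulse decreased after play
```

GSR and EEG data would be handled analogously: a per-participant summary statistic before and after play, then a paired comparison across participants.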
1. Bragança, Guilherme Francisco F., et al. “Synesthesia and Music Perception.” Dementia & Neuropsychologia, vol. 9, no. 1, 2015, pp. 16–23., doi:10.1590/s1980-57642015dn91000004.
2. Jerčić, Petar, and Veronica Sundstedt. “Practicing Emotion-Regulation through Biofeedback on the Decision-Making Performance in the Context of Serious Games: A Systematic Review.” Entertainment Computing, vol. 29, 2019, pp. 75–86., doi: 10.1016/j.entcom.2019.01.001.
3. Stanger, Nicholas, et al. “The Role of Preperformance and In-Game Emotions in Cognitive Interference During Sport Performance: The Moderating Role of Self-Confidence and Reappraisal.” The Sport Psychologist, vol. 32, no. 2, 2018, pp. 114–124., doi:10.1123/tsp.2017-0001.
4. Troy, Allison S., et al. “Cognitive Reappraisal and Acceptance: Effects on Emotion, Physiology, and Perceived Cognitive Costs.” Emotion, vol. 18, no. 1, 2018, pp. 58–74., doi:10.1037/emo0000371.
5. Aldao, Amelia, Susan Nolen-Hoeksema, and Susanne Schweizer. “Emotion-Regulation Strategies across Psychopathology: A Meta-Analytic Review.” Clinical Psychology Review, 2010. Retrieved May 17, 2019 from https://www.ncbi.nlm.nih.gov/pubmed/20015584.
6. Domaradzka, Ewa, and Małgorzata Fajkowska. “Cognitive Emotion Regulation Strategies in Anxiety and Depression Understood as Types of Personality.” Frontiers in Psychology, vol. 9, 2018, p. 856.