Week 8 – March 14, 2022

User Experience

Gillian and Constanza spent this week editing and iterating on the script for our meditation. As a team, and with Equa’s consent, we decided to cut the script down to focus on Seeing and Hearing Out (that is, seeing and hearing what is around you), since those are the senses best manipulated by VR. With this shift toward solely “outward” senses, we mapped out a visual scene centered on a dragonfly as a dynamic focal point and created a list of sounds the meditator should hear in the environment.

We also spent time discussing the sense of “Feel,” the third sense addressed in Equa’s meditations. We discussed what it means for a meditator to feel something in VR and decided one option was to draw attention to the headset the user is wearing. We’re currently planning to move forward with this approach to “Feel” in VR, but may change or remove it based on playtesting responses.

As we continue to adjust our hand gesture system, we’ve found that the Quest 2 often mistakes people’s relaxed hand poses for the gestures we’ve been using to choose responses in our meditation. To solve this problem, Faris has built floating objects into the scene that the meditator can use to select options.
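One common way to make this kind of floating-object selection robust against accidental poses is a dwell timer: the hand has to stay on an object for a moment before the choice is confirmed. Below is a minimal sketch of that idea in Python; the names, the dwell duration, and the frame loop are all illustrative (our actual implementation lives in Unity), not the project’s real code.

```python
# Illustrative dwell-based selection for floating response objects.
# All names and values here are hypothetical; the real system is built in Unity.

DWELL_SECONDS = 1.5  # hand must hover this long before a choice is confirmed


class FloatingOption:
    def __init__(self, label):
        self.label = label
        self.hover_time = 0.0

    def update(self, is_hovered, dt):
        """Accumulate hover time each frame; return the label once dwell completes."""
        if is_hovered:
            self.hover_time += dt
            if self.hover_time >= DWELL_SECONDS:
                return self.label
        else:
            self.hover_time = 0.0  # reset if the hand drifts away
        return None


# Simulated frame loop: ~2 seconds of continuous hover at 0.1 s per frame.
option = FloatingOption("Continue")
selected = None
for _ in range(20):
    selected = option.update(True, 0.1) or selected
print(selected)  # "Continue" once the dwell threshold is crossed
```

The reset-on-exit behavior is what filters out relaxed hands that only brush past an object for a frame or two.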

Environment Building

Lots of progress was made on our environment this week as more plants and features were added. The first plant Faris made this week was a giant hogweed. This will be replaced in the future, but is currently acting as a stand-in for another flower that grows in bunches. Faris also made some cattails, which sway according to a “relaxation level” derived from the meditator’s breath rate and breath rate variability.
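To give a sense of how breath biometrics might be condensed into a single sway-driving value, here is a hedged Python sketch. The thresholds, weights, and amplitude range are invented for illustration; they are not Equa’s or our team’s actual numbers, and the real mapping runs inside the Unity environment.

```python
# Illustrative mapping from breath biometrics to a 0-1 "relaxation level"
# that could drive cattail sway. All constants below are assumptions.

def relaxation_level(breath_rate, breath_rate_variability,
                     calm_rate=6.0, stressed_rate=20.0, max_variability=5.0):
    """Lower, steadier breathing -> higher relaxation, clamped to [0, 1]."""
    rate_score = (stressed_rate - breath_rate) / (stressed_rate - calm_rate)
    variability_score = 1.0 - breath_rate_variability / max_variability
    score = 0.5 * rate_score + 0.5 * variability_score
    return max(0.0, min(1.0, score))


def sway_amplitude(level, min_amp=0.02, max_amp=0.10):
    """Calmer meditators see gentler cattail sway (amplitude in arbitrary units)."""
    return max_amp - level * (max_amp - min_amp)
```

Combining rate and variability into one scalar keeps the environment code simple: every reactive element (cattails, and later anything else) can read the same relaxation level rather than raw sensor channels.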

Giant Hogweed

Lauren started designing a dragonfly to act as a dynamic focal point for the meditators.

Dragonfly Iteration 1

Lauren also spent this week iterating on the water in our environment, making it less transparent and more in line with our art style.

New Water

Biometrics and Sensor Integration

This week, Zibo’s main goal was connecting the Hexoskin to our data server so that breath biometrics could become biofeedback in our environment. This was complicated by Equa Health having the Hexoskin for two days this week, but on Friday Zibo was able to establish a Bluetooth connection between his phone, the Hexoskin, and our application. With the Hexoskin now connected, Zibo will shift his focus to stabilizing the connection while Faris begins optimizing the in-game biofeedback to feel more natural.
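Part of making biofeedback feel natural is smoothing the raw sensor stream so that noisy readings or brief Bluetooth hiccups don’t make the scene jitter. One simple option is an exponential moving average, sketched below in Python; the class name and the smoothing factor are illustrative assumptions, not the project’s actual pipeline.

```python
# Illustrative smoothing of raw breath-rate samples before they drive
# biofeedback. An exponential moving average is one simple choice;
# the alpha value here is an assumption, not a tuned project parameter.

class BreathSmoother:
    def __init__(self, alpha=0.2):
        self.alpha = alpha  # higher alpha = faster response, less smoothing
        self.value = None

    def update(self, sample):
        """Blend each new sample into the running estimate."""
        if self.value is None:
            self.value = float(sample)  # seed with the first reading
        else:
            self.value += self.alpha * (sample - self.value)
        return self.value
```

A single outlier reading then nudges the displayed value only slightly instead of snapping the environment to a new state, which matters when the value is driving something as visible as plant motion.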

Production

We had several meetings with faculty this week to discuss the state of our project and what we should focus on moving forward. After meeting with our faculty instructors, faculty consultants, and Carl, we’ve hammered out a short list of priority assets and given them hard deadlines for implementation. We’ve also created a playtesting schedule for the rest of the semester, which will be vital for optimizing how guests interact with biofeedback and the environment as a whole.

We also recorded and edited a 30-second draft of our project trailer, as seen below: