Author: Winnie Tsai

  • Week 15 (12/12/25) – Post Mortem

    TACIT Post Mortem

    Introduction and overview of the project

    TACIT started with a dream of adding tactile feedback to Jing's BVW VR monkey-climbing game. As a pitch project, we recruited more members: Jack, Alex, Yufei (and Michael, who did not immediately join). After discussions and outreach to SMEs and faculty, which yielded the initial assumption that haptics cannot work without contextual support, we decided to do a tech exploration of how haptics, along with context (audio and visual), can elicit emotional responses: experimenting from the bottom up with individual haptic moments, then stringing them together to create an emotional journey.

    Deliverable

    Our deliverable converged to two main things: 

    1. Documentation of everything we discovered in the exploration phase – our TACIT documentation, and
    2. The main experience we built in the second half of the semester – In The End, a 10-minute cohesive VR narrative experience.

    Technology

    We used the bHaptics gloves, chosen for their comprehensive developer support, easily fine-tuned haptic actuators, and low price compared to all the alternatives. We knew from the start that they could not produce force feedback or temperature sensations. Still, we were determined to experiment and see whether that was actually impossible.

    Exploration phase

    These constraints informed our exploration in the first half of the semester. In the experimentation phase, we came up with individual haptic patterns in bHaptics Designer and tested how the patterns alone sparked playtesters' imaginations. After that, we augmented the patterns with different audio and visual contexts in VR to test how changing context changes the emotional response to the same pattern. The following diagram summarizes our conclusions.

    Final experience development phase

    In the final experience development phase, we created a narrative experience where players become Death, releasing people's souls by crushing their hearts, and arrive at an emotional moment where they must take the life of the companion who has accompanied their journey. We zoomed in on the heartbeat + crushing haptic patterns to discover which contexts (animations, sound, and VFX) were needed to support the delivery of this haptic moment. We also tried to convey different final moments of death through distinct heartbeat patterns: weak, arrhythmic, steady, or intensifying, which reveal who is destined to die.

    During testing, we utilized the PANAS scale along with the emotional wheel to test whether this haptic moment (crushing a heart with different heartbeat patterns), in the context of the entire narrative experience, resulted in our intended emotions of remorse and sadness. I have attached the comparison between having only 1 scene vs. the entire experience.

    The experience received highly positive feedback and is being submitted to conferences post-semester. We also created comprehensive documentation on haptic experience design to serve as a reference for future designers.

    Team members

    • Winnie Tsai – Producer, UX Research, Prototyper
    • Michael Wong – Sound Designer, Narrative Design, Rigging Artist
    • Jack Chou – Gameplay Programmer, Game Designer
    • Jing Chung – Gameplay Programmer, System Architecture, Technical Artist
    • Alex Hall – Environment Art, Narrative Design
    • Yufei Chen – 3D Character Art, 2D Art

    What went well

    Comprehensive Bottom-Up Exploration and Foundational Contribution

    Our commitment to the bottom-up approach defined the project’s success. We started with the smallest possible unit (individual haptic parameters) and built upwards (adding context, then narrative structure). This process not only led to our final experience but also yielded significant foundational deliverables:

    • We created a library of haptic patterns and an analysis of their potential.
    • We arrived at what we consider our greatest design insight: the relationship between haptics and emotions lies in leveraging individual sense memories.
    • We compiled all our prototypes into a Haptics Playground, serving as a valuable resource and reference for future designers and ETC projects working with these gloves.

    Sustained and Iterative Playtesting Schedule

    Our discipline regarding playtesting was essential to achieving a playable, polished game by the end of the semester. We participated in every available ETC playtest opportunity, with only weeks 1, 2, and 10 as exceptions. This rapid, constant iteration allowed us to:

    • Understand how players perceive haptics regardless of their gaming or VR experience.
    • Immediately incorporate user feedback into the game design, programming, and art assets.
    • Rapidly refine the complex interaction moment (the heart crushing) and calibrate the supporting audio/visual contexts.

    Successful Applied Case Study

    While the cognitive and emotional connection to haptics is an already researched area, our project successfully served as a case study in the application of these principles to an emotionally rich VR game context. We defined a pipeline for using haptics as the primary vehicle for accumulating emotional weight, validating the hypothesis that carefully designed haptic moments, repeated in a narrative structure, can deliver a complex and moving emotional climax.

    Achieving Complex Emotional Nuance

    We successfully delivered on the ambitious goal of evoking complex emotions (remorse, sadness, bittersweet acceptance) during the climax, with granularly designed haptic heart moments supported by audio and visuals. Furthermore, we demonstrated effective emotional storytelling through art direction, as the choice of a frozen diorama style (initially for technical expediency) inadvertently created a feeling of distance that reinforced the narrative themes of Death.

    What could have been better

    Interdisciplinary Communication and Critical Asset Dependency

    The biggest challenge we faced was the breakdown of interdisciplinary communication and differing expectations between roles, which led to a severe bottleneck specifically around the delivery of 3D models for the dioramic scenes.

    • Early Phase Misalignment: In the exploration phase, we initially struggled with role definition. Programmers expected artists to provide conceptual direction for haptic prototypes, while artists expected programmers to define the technical requirements and constraints first. We eventually solved this by establishing a unified team vision before delegating tasks, an adjustment that cost us early time.
    • Late Phase Drag and Dependency: In the second half of the semester, the production schedule stalled due to asset dependency. While programming (gameplay logic and systems), narrative design, and sound design were largely completed or on schedule, the team faced a substantial drag waiting for many 3D model assets due to individual life events impacting the character/environment art pipeline. This was a huge problem because these models were essential for building the static dioramic scenes that frame the interaction. We failed to be proactive enough in identifying this critical bottleneck and reallocating resources, which resulted in:
      • Slower testing and iteration cycles in the second half.
      • A final experience where the 3D models were not as polished as desired, as the rest of the team had to scramble late in the process to assist, instead of the original artists having the necessary time for refinement.

    Sacrificed Agency for Narrative Flow

    Due to time constraints resulting from the art asset bottleneck and difficulties in naturally conveying the companion’s emotional connection within the limited scene count, we compromised on player agency. We had to rely on direct instructions instead of subtle indirect control for certain player actions. This decision resulted in player feedback regarding a lack of agency and a disruptive narrative flow, preventing the emotional climax from reaching its full potential.

    Conclusion

    This semester’s project reinforced the importance of establishing clear guidelines and research foundations for creating meaningful experiences. Starting from the ground level with haptic exploration, we gained deep domain knowledge and discovered the fundamental truth that haptics require context. This insight prompted us to investigate the minimal viable context needed, working from the most basic, pure starting point.

    By building incrementally upon this core foundation without losing sight of our primary focus, we successfully maintained design coherence throughout iteration, which was a critical factor in completing our final experience. Equally important was effective team communication. When everyone shares a common goal and stays on the same page, each team member can leverage their expertise most efficiently, ultimately elevating the final product.

    Next step plan

    The primary goal of the next steps is to address the critical feedback regarding player agency and the emotional connection with the companion, which were constrained by the semester’s timeline and asset dependency issues. This plan involves expanding the experience and formalizing the project’s documentation and external presentation.

    Experience Expansion and Redesign

    This phase focuses on addressing the narrative and design flaws identified in the final build.

    • Expand Scene Count to Five (Original Plan):
      • Increase the number of scenarios from three to five. This will provide the necessary space to build the player-companion relationship incrementally and allow the climax to feel more earned.
    • Strengthen Companion Relationship:
      • Integrate new narrative moments and subtle dialogue throughout the expanded scenes to develop the companion’s personality, goals, and relationship with the player prior to the emotional climax.
    • Improve Player Agency and Tutorial:
      • Redesign the initial tutorial scene to rely on indirect control and environmental cues rather than direct, instructional dialogue. The player should learn the mechanics (heartbeat perception and crushing) through subtle discovery, thereby increasing their feeling of agency and narrative immersion.
    • Asset Polish and Integration:
      • Dedicate focused time to complete and polish the missing or rushed 3D models that were the primary bottleneck in the initial build. Full polish of these assets is required before the final five-scene experience can be fully integrated and tested.

    Post-Semester Submission and Presentation

    This phase leverages the current state of the project for professional exposure.

    • Conference Submission:
      • Formally submit the current 10-minute VR experience, In The End, along with the comprehensive TACIT Documentation, to relevant conferences (e.g., SONA, SIGGRAPH, Laval Virtual) and academic venues focused on interactive media and haptics.
    • ETC Documentation Finalization:
      • Ensure the Haptics Playground environment and the technical/design documentation are fully compiled and deposited as a resource for future ETC student teams.

    SONA Festival Submission

    Speaking of festivals, in the final week we submitted our experience, now titled In The End, to SONA, an immersive film festival.

    Synopsis

    In The End is an immersive VR narrative structured around haptic interaction as its primary storytelling language. The participant inhabits the role of Death, guided through a sequence of still, diorama-like scenes depicting the final moments of someone’s life, culminating in the loss of a trusted companion.

    Using haptic gloves, participants feel distinct heartbeat patterns: weak, arrhythmic, steady, or intensifying, which reveal who is destined to die. Dialogue and audiovisual context frame each scene, while touch carries emotional meaning. The act of releasing a soul is felt as resistance or surrender, followed by lingering residual sensations that communicate consequence.

    The experience invites reflection on presence, responsibility, and what it means to feel the end of a life—one heartbeat at a time.

    Poster

  • Week 14 (12/5/25) – Interactions Come with Consequences

    With the feedback from Softs and afterward on our full experience, we decided to add consequences to enhance the feeling of choice and impact. The player experience was already emotional, but conveying the consequences of actions through dialogue alone did not feel satisfying enough, so we decided to add something using haptics!

    Choice of consequence type

    We came up with two options.

    • Option 1: spirit-like figures that players send off in the void. Dialogue teaches the send-off in the void after the hospital scene, then transitions to the companion void afterwards. The final companion yes/no moment would work the same way.
    • Option 2: a spirit-like figure appears directly above the body after the crush, and the crushed pattern becomes the consequence pattern, with a dialogue line playing at the same time. Then the transition to the companion void. The final companion gets a residual pattern.

    We decided that even though Option 1 was interesting and seemed impactful, we did not have enough time to test it, and teaching people to send off souls would be a whole new mechanic to tackle. In the end, Option 2, using just the haptic patterns, was enough.

    Last Hunt Playtest

    We tested the consequences using the following haptic patterns:

    • Correct choice: stroking or breathing. Lingering for the last companion.
    • Incorrect choice: chainsaw or tension.

    We assigned these to each character's heart; the assigned pattern plays after crushing, replacing the original residual impact.

    Feedback and decision

    We realized that the pattern playing just as the soul flies up is too quick, and people still cannot register the impact. We decided to change it: still play the residual impact rather than replacing it, have the soul appear in front of the player, and play the consequence pattern while the soul fades away.
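
    To make the revised sequencing concrete, here is a minimal sketch of how that beat could be staged in Unity C#. The IHapticPlayer wrapper and all pattern names and timings are hypothetical stand-ins for our bHaptics event calls, not the actual implementation:

    ```csharp
    using System.Collections;
    using UnityEngine;

    // Hypothetical wrapper around the bHaptics event calls used elsewhere.
    public interface IHapticPlayer
    {
        void Play(string patternName);
    }

    public class CrushConsequence : MonoBehaviour
    {
        public GameObject soulPrefab;      // soul that appears in front of the player
        public Transform playerHead;       // used to place the soul in view
        public float soulFadeSeconds = 2f; // illustrative fade duration

        private IHapticPlayer haptics;     // injected elsewhere in the real project

        // Called right after a crush is confirmed.
        public IEnumerator PlayConsequence(bool correctChoice)
        {
            // 1. Still play the original residual impact instead of replacing it.
            haptics.Play("ResidualImpact");

            // 2. The soul appears in front of the player so the moment registers.
            Vector3 spawnPos = playerHead.position + playerHead.forward * 1.5f;
            GameObject soul = Instantiate(soulPrefab, spawnPos, Quaternion.identity);

            // 3. While the soul fades away, play the consequence pattern.
            haptics.Play(correctChoice ? "Stroking" : "Chainsaw");
            yield return new WaitForSeconds(soulFadeSeconds);
            Destroy(soul); // a real version would animate the fade before destroying
        }
    }
    ```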

    Heart moment

    a photo here

    Festival

    And then… it’s ETC 25th Anniversary Festival!!

    We had 15 players complete the full experience, while over 20 additional audience members explored the haptic effects by wearing the gloves and watching the casting monitor as others played. And we got great reactions from those people.

    (Rework the GIF soon)

    Observation of Player Types

    From all the testing at Softs, playtest nights, and now the festival, we confirmed our hypothesis that the experience is played as intended, with haptics used as a puzzle-solving mechanic, across different player types.

    Player Types

    • VR naivety
      • Naive VR guests have a great experience with Meta Quest 3’s hand tracking, so it works to our advantage that our feedback is on the hand and that the interactions we designed are the most easily observable ones.
      • Non-naive guests, especially those with hand-tracking experience, actually have a harder time grabbing and crushing a heart. We eventually decided to “teach” players by showing them how to interact with a heart before they try the experience.
    • Gamer vs. non-gamers
      • The most important factor in whether our haptic perception experience IS perceived as intended is whether the player consciously views this as a game or as an experience.
      • Gamers register haptics as information for puzzle solving, while non-gamers view haptics as feedback and still rely on visuals.

    Next steps

    Next week is finals presentation. I will be back with a postmortem of our final and our next step plans! Stay tuned.

  • Week 13 (11/26/25) – Are We Pushing Narrative?

    This week was Thanksgiving week, and we only had 2 work days, Monday and Tuesday. Time is running fast, but we were determined to get more iterations on our final experience.

    We planned for a Tuesday playtest, and executed it!

    This time, many new naive guests signed up. In particular, there were more narrative-focused creators from ETC who tried our experience for the first time.

    Feedback given on narrative

    Samantha, a second-year student with a background in screenwriting and film, gave some feedback on the narrative.

    Here’s what she said: “The reason agency feels low is that the three scenes in sequential order feel very mechanical. Just like delivering an Uber Eats order: you know nothing about your clients, but you fulfill the requests.”

    She suggested three main refinements to tie everything together more:

    • Think of CROW for the characters and develop them in the story.
    • Use indirect control instead of explicit instructions during the tutorial. Using rhetorical questions as “hooks” and making players “guess” will give the illusion of choice.
    • Three potential situations that add nuance to the end result of “crush”:
      • Want to crush it and can crush it.
      • Want to crush and cannot crush it.
      • Don’t want to crush it but heart still dies.

    Our decision

    We totally agreed with Samantha’s suggestions, and similar feedback has been coming up ever since the experience was first built. However, because the scope of our project is technical exploration rather than experience design, we took the fact that feedback now centers on experience design as a sign that our technical part is good enough.

    Without making the game longer, there isn’t really a way to tie everything together and make the story more nuanced. And since we only have 1 week left (and one of our programmers is off at a conference), we decided to leave this out for now and focus simply on making the consequences of interactions more obvious.

  • Week 12 (11/21/25) – Polishing for Haptics to Work Seamlessly with Context

    Week 12 Friday is Soft Opening! But before that, we had Hunt Library Playtest Night on Tuesday. Our team members were also individually working on project pitches for next semester, which happened on Wednesday. We were extremely busy this week, but our goal was to finish strong: a complete final experience ready for opening, plus a documentation outline of everything we discovered and learned, the two deliverables of the project we pitched 7 months ago.

    Playtest Night

    First off, Playtest Night, the second-to-last one this semester, was our opportunity to playtest everything we wanted to show at Softs, or what we like to call, playtesting our Softs. Working through the weekend, we got the model of the companion’s physical form ready. In addition, having written the dialogue for the “failure states” of each heart-crushing choice, we recorded the additional voice lines on Monday. We also fine-tuned the haptic patterns so that each one is distinct enough within the same scene for a decision to be made. Finally, we got a car model that completely fit our requirements from fellow ETC student Emily Zhang’s BVW world from a year ago, and by Tuesday evening we had integrated every new asset into the experience, making it complete along the actual timeline.

    It went like this:

    • Tutorial scene with 3 hearts to practice heart crushing
    • 1st Hospital scene with 2 characters to feel hearts from (the choice should be informed by both context and haptics, but it can work without haptics, because a sick person on the bed already makes it obvious who might be going)
    • The void: players meet the companion for the first time as she monologues about death
    • Car accident scene with 3 characters, 2 of whom are equally likely to be the one dying, judging from the scene alone.
    • The void: companion responds to the “correct” and “incorrect” observations the players made from the 1st hospital scene and the car accident scene, and states that it is now their turn
    • 2nd Hospital scene where there is only the companion’s physical form, a young girl, lying on the bed peacefully. Far away you can see the 1st hospital scene a curtain away (but unreachable)

    We playtested with the same interview structure as before: we first determined their understanding of the player goals, asked why they chose certain people’s hearts to crush, and finally surveyed their emotions while crushing the heart.

    Successes
    • Huge success: haptic patterns were distinctly different.
    • Most importantly, our tutorial scene was effective. Now people know how to crush after practicing it three times. In addition, they are aware that the “choices” weren’t moral, but puzzle solving by finding the faintest heartbeat.
    • Even with our unposed models, players felt sad for them, and most people were emotionally connected to the context.
    Failures
    • Most gravely, many people were not able to access the driver in the car. Most of them did not see her because the window was blocking the view.
    • Players were not emotionally connected with the companion; the sadness came only from her being a little girl, not from having known her through the journey.

    To address these, our refinement list came down to:

    • Re-pose the characters so they are complete and with more details of their emotions.
    • Make sure the driver is visible.
    • Make environment sufficient and not distracting.

    Emotional Wheel Analysis

    Comparing playtesters who experienced our full game against earlier sessions with only 1 tutorial and 1 hospital scene, even though the numbers of playtesters were drastically different, the playtesters’ VR experience was actually similarly, roughly normally distributed.

    11/1 Playtest Day. N = 26
    11/11 Hunt Library Playtest. N = 6
    11/18 Hunt Library Playtest. N = 5

    We can see that, even with the heartbeat separated from crushing between the 11/1 Playtest Day and the 11/11 Playtest Night, and with very different playtester counts, the emotional wheel frequencies were quite similar, with hesitance and powerful leading the emotional experience. On the other hand, after creating the full experience, even with the huge difference in participant numbers (and, as with any survey, discounting individual differences we cannot yet account for), the added narrative immediately spiked the broader categories of afraid and helpless, and took away the happy category. Of course, this hypothesis, that narrative genuinely changes the emotional experience, still has to be reevaluated with more playtest groups, but at least for now, our team was happy with this difference.

    Playtest Day with only hospital 1 scene and 1 heart tutorial. Heart moment has crush and heartbeat played simultaneously.
    Tutorial scene has 3 hearts to practice crushing + refined heart moment where heartbeat is separated from crush haptics.
    Complete experience with the car accident, final scene, and dialogue with companion model.

    Some discussions for project directions in the week

    One of the faculty at ETC, Mike Christel, gave us valuable suggestions on gathering more playtesting data that could bloom into interesting research directions we could use for a future HCI paper submission.

    Specifically, he recommended adding in-game metrics to the heart moment, so that we might be able to track hesitance as an emotion with quantitative data.
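
    A metric hook like that could be very small. Purely as an illustration (names and logging are hypothetical), it might time the gap between first touching a heart and crushing it:

    ```csharp
    using UnityEngine;

    // Hypothetical sketch of the suggested in-game metric: log the gap between a
    // player's first touch of a heart and the moment they crush it, as a rough
    // quantitative proxy for hesitance.
    public class HesitanceMetric : MonoBehaviour
    {
        private float firstTouchTime = -1f;

        public void OnHeartTouched()
        {
            if (firstTouchTime < 0f) firstTouchTime = Time.time;
        }

        public void OnHeartCrushed(string characterName)
        {
            if (firstTouchTime < 0f) return;
            float hesitation = Time.time - firstTouchTime;
            Debug.Log($"[Metric] {characterName}: {hesitation:F2}s from first touch to crush");
            firstTouchTime = -1f; // reset for the next heart
        }
    }
    ```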

    Our team was excited but cautious about this idea: quite a few of us are indeed interested in turning the project into a short paper, but we were not sure whether, with the Meta SDK constantly misreading players’ interactions, we could actually collect good data.

    In the end, after consulting with more faculty and learning that our data would not be usable unless we went through the IRB and had all our playtesters sign consent forms, we set aside this data collection step and focused on experience design.

    Iterations for Experience Design + Documentation

    We added poses and environment design elements: the car is now visible, plus smoke, a tree, and lighting.

    Soft Opening

    Friday’s Soft Opening finally happened. Our team prepared an intro to our design insights through our documentation outline, followed by a try-out of the full experience, which we had iterated on even more after Tuesday.

  • Week 11 (11/14/25) – Assembling the Full Player Experience

    This week, it’s building and building. We recorded the dialogue, added the new models, retextured old models, and fit all the pieces together so we can start polishing in the coming weeks. We are planning to finish an iteration of this assembled version by next Friday, which is Soft Opening day. Our plan was to use this week’s Hunt Library Playtest Night to make sure our refined heartbeat is indeed effective (along with the refined Hospital 1 scene), and next week’s Playtest Night to playtest the full experience.

    Playtesting at Hunt Library again

    Throughout week 10, we refined the heartbeat moment itself and the first hospital scene. We also wrote the full dialogue. Therefore, we built until the end of the first hospital scene, and added the dialogue.

    Because we were testing the same things we wanted to test on Playtest Day, we decided to generally follow the same playtest procedure, but without the groups of different haptic patterns.

    Again with the help of Claude AI analyzing the recurring words in describing the experience, here’s a word cloud, along with the emotional perception analysis.

    Understanding of the overall experience
    Emotional wheel of the moment of crushing the heart

    Theme 1: Haptics Convey Health, Not Emotion

    • Clear consensus that heartbeats indicate physical state, not feelings:
    • “Haptic gave frequency… only tell you about health”
    • “Context gives the emotion”
    • Interpretations focus on: faster = nervous, weaker = dying, stronger = adrenaline
    • Haptics = medical data, not emotional data

    Theme 2: Visual/Contextual Cues Carry Emotion

    • Players read feelings from visual presentation:
    • “Woman was upset and knew this was happening”
    • “Lady sitting beside, looks sad”
    • “Man looks like sleeping”
    • “Old man feels dead”
    • Body language and staging communicate emotion, not haptics

    It is clear that the most prominent emotions from the heart-crushing moment were: Hesitance, Powerful, Distant, Amazement, and Worry.

    • Hesitance: came from the narrative of taking people’s hearts.
    • Powerful: the action and the satisfaction of haptic feedback.
    • Distant: the still scene design and narrative disposition of being a grim reaper.
    • Amazement: from the haptic feedback.
    • Worry: about taking people’s hearts and about possibly not finding the correct heart.

    By this point, we knew that haptics would not make a difference here either way, because players could figure out who should die simply from the visual context of an old man dying on a bed; that would be our next challenge. By design, players should stop and analyze the heartbeats before making a decision, and they should feel the graveness of their role through the heart’s haptic feedback.

    The narrative framing was effective: players understood their role and goal in each scene, felt connected to that role, and thus felt distant from the scene because of their role and the environment, which was part of our design so the story would not be too emotionally unbearable to continue. On the other hand, it is still difficult to take away the “wow” factor of a novel technology like haptic gloves, which gave rise to the feeling of amazement.

    More design decisions

    Failure states of not detecting heartbeats

    Because crushing the wrong person’s heart currently had no consequences at that moment, we added failure states where the dialogue changes.

    If the player successfully crushes the right hearts for the first Hospital scene and the car accident scene, it is full success, and when the companion sees you the final time, she will be ready to go.

    If the player crushes one correct and one incorrect one, it is a partial success, and the companion will be more uncertain about what she learned from you during this journey.

    If the player crushes incorrect hearts for both scenes, the companion will be more pessimistic about death.
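
    The branching itself is simple. Here is a minimal sketch (type and method names are hypothetical), assuming we track whether the player crushed the correct heart in Hospital 1 and in the car accident scene:

    ```csharp
    // Sketch of the ending-dialogue branch over the two judged scenes.
    public enum EndingTone { FullSuccess, PartialSuccess, Pessimistic }

    public static class EndingSelector
    {
        public static EndingTone Select(bool hospitalCorrect, bool accidentCorrect)
        {
            int correct = (hospitalCorrect ? 1 : 0) + (accidentCorrect ? 1 : 0);
            switch (correct)
            {
                case 2:  return EndingTone.FullSuccess;    // companion is ready to go
                case 1:  return EndingTone.PartialSuccess; // companion is uncertain
                default: return EndingTone.Pessimistic;    // companion doubts death
            }
        }
    }
    ```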

    Act 2 + Act 3 assembled

    Simply put, with continued development, we now have Act 2 and Act 3 assembled with placeholder assets, so we can finally test the whole thing next week.

    Changed spawn point and retextured Act 1, and placeholder for Act 3 (since it’s the same hospital)
    Act 2 assembled with placeholder assets
    The companion ghost form is finished, and the physical form is on the way.

    Next steps

    Next week is Soft Opening, so we’re trying our best to get assets in and polish the experience. On top of that, we have Playtest Night at Hunt Library on Tuesday, and 4 of our team members have Project Pitching on Wednesday. It will be a hectic week!!

  • Week 10 (11/7/25) – Iterating on the moment of heart crushing

    First, we reflected on our Playtest Day results, and continued developing the details of our other scenes of the experience.

    Unfortunately, we weren’t able to learn how different patterns changed people’s emotions, as even the difference between patterns was not perceived correctly. However, this determined our most important design refinements.

    Post-Playtest

    Analysis

    Basic screening data: our playtesters’ VR naivety follows a normal distribution, and most of the people who have used VR have also used hand tracking. There were unexpectedly many people who had used haptic gloves before (though, to be frank, they were ETC community members who had playtested our project previously).

    The data we got from the technical difficulty portion of the survey gave us the following important suggestions:

    • Hospital needed more items to fill the scene for immersion.
    • The heart pattern needs to be more clear.
    • Crushing needs to be taught better (it was extremely difficult to crush)
    • The cartoonish and not realistic style is good so it doesn’t feel too scary.
    • The void-ish emptiness of the title tutorial scene is good because it lets player focus on the heart.

    Analyzing interview qualitative data

    With the help from Claude AI, here are some summaries.

    The understanding of the overall experience: “It’s a VR experience where you play Death and squeeze hearts with really good haptic feedback—it’s interesting but dark/sad.”

    Pedestal Scene vs. Hospital Scene Player Perceptions

    • Both scenes rely on stillness – “nothing is reacting to your presence”
    • Haptic feedback worked well in both (weight, squeezing felt realistic)
    • Both were “approximately the same” mechanically
    • Both created a “mouthwashing vibe” (unsettling, atmospheric)
    • Emotional Impact
      • Pedestal: Removed, abstract, “mythological,” epic/cinematic (Indiana Jones, Doom 2016)
      • Hospital: Personal, emotional, “way more impactful,” sadder, “evil feeling”

    Player Perception: Owner of the Heart’s Feelings

    • Players are treating the body as an object (like the pedestal) rather than a person with subjective experience.
    • “Couldn’t tell if they were dead or not. Assumed they were dead. So assumed nothing”
    • “Probably in grief, feeling like sorry for the soul”
    • “They look like they were sleeping, no energy, sick”

    How Players Decided Whose Heart to Take

    • Visual/Physical Cues
      • Bed/lying down position as primary indicator
      • One person clearly “more deadish” or lifeless
    • Emotional Weight
      • Task described as “daunting,” “heavy,” “weighty”
      • “Taking a kid’s life is super sad for the mom”
      • Not wanting person to suffer
    • Lack of exploration – Many didn’t look around or try both options

    The Emotions Survey

    After transcribing the physical surveys into a spreadsheet, we got the following.

    While doing the interviews, we also asked why the playtesters filled in the survey the way they did:

    • Hesitance, worry – “Don’t want to crush it”
      • “Worried about the consequence”
      • “Questioning morals”
    • Powerful – “because crushing the heart is satisfying”
    • “Understood role as ‘grim reaper’, and therefore detached himself from the sadness”
      • “They don’t know what I’m doing, lonely sad about it. I was right there with them but distant”
      • “I didn’t notice any change in the heart beat or the rhythm, so distant”
    • “Saw two living people didn’t want to do it. Sad because knew that had to. Remorseful after I did”
    • Shock and surprise – “not something that a normal game will do”
      • “Shock was from bursting the heart”

    And then we did a simple data visualization after reordering the survey data according to the emotional wheel, and got this:

    Refinements

    Redesigning the heart moment from feedback

    First of all, the reason the haptic patterns were not clear is that they were played at the same time as crushing. Our original rationale was that the pattern at the moment of crushing symbolizes what the character feels about death as Death takes their soul. However, two haptic patterns played simultaneously on vibrotactile gloves are not differentiable. Therefore, we decided to separate the two. Our first refinement: heartbeat pattern first, then crushing (escalating from low to high intensity).

    The second refinement: since we are separating the two patterns, we should make more haptic patterns. This week, we mapped out all the characters with their assigned heartbeat patterns.
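
    To make both refinements concrete, here is a minimal sketch: the heartbeat loops alone until the crush begins, then hands over to the staged crush patterns, with each character carrying their own assigned pattern. All names and timings are illustrative, not our tuned values:

    ```csharp
    using System.Collections;
    using UnityEngine;

    // Same hypothetical wrapper as in the Week 14 sketch above,
    // redeclared so this snippet stands alone.
    public interface IHapticPlayer { void Play(string patternName); }

    public class HeartMoment : MonoBehaviour
    {
        [System.Serializable]
        public struct CharacterHeart
        {
            public string characterName;    // e.g. "Patient", "Relative"
            public string heartbeatPattern; // assigned per character, e.g. "Heartbeat_Weak"
            public float beatInterval;      // seconds between lub-dub beats
        }

        public CharacterHeart heart;   // set per character in the scene
        private bool crushing;
        private IHapticPlayer haptics; // injected elsewhere in the real project

        // Refinement 1: the heartbeat plays alone until the crush begins.
        public IEnumerator RunHeartMoment()
        {
            while (!crushing)
            {
                haptics.Play(heart.heartbeatPattern);
                yield return new WaitForSeconds(heart.beatInterval);
            }
            // Then crushing takes over: low intensity, high intensity, crushed.
            haptics.Play("Crush_Low");
            yield return new WaitForSeconds(0.5f);
            haptics.Play("Crush_High");
            yield return new WaitForSeconds(0.5f);
            haptics.Play("ResidualImpact");
        }

        public void BeginCrush() => crushing = true;
    }
    ```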

    Contextual content refinements

    Apart from the heart itself, the environment and contextual details that feed into users’ perception of the heart have to be refined so they don’t break immersion. We already knew that; we just had to get to work!

    On the other hand, for a better tutorial, we decided to have not 1 but 3 pedestals with hearts of different heartbeat patterns, so that players can practice sensing the different haptics as well as practice crushing, which was the gravest difficulty during Playtest Day.

    Going granular on player experience

    Aside from all the refinements, it’s also time to start making the assets for the entire experience. With all the feedback, we know what to watch out for in the next scenes, as well as how to talk to and interact with players. The biggest change: instead of 3-5 scenes, we decided to scope down to just 3.

    Player experience doc

    We created a very detailed player experience doc, where each player interaction and scene change is listed in order, accompanied by the dialogue details. It is presented here. It is the central reference that programming, art, and design all look at; it is constantly updated and shared with faculty for regular feedback.

    Script

    The complete script has also been written.

    For each scene, our goal is clear.

    • Tutorial: tell players their goal as a grim reaper, and onboard them to feel the heartbeat and crush the heart, all within the narrative characterization of Death. We teach them carefully to 1. hold the heart facing themselves (so the Meta SDK can detect their hands), and 2. close their fists slowly with all fingers closing in (so the crush can be detected correctly). Also, we don’t want players to feel that this is a moral decision, but to understand that death is predetermined. This is extremely important: moral implications would be too emotional, yet hard to design around an emphasis on haptics (the emotions would not come from the haptics).
    • Hospital 1 (with 1 old man lying on bed and a relative by their side): teach the player to look for hearts in the scene and choose just one. This is a scene where death is foreseen.
    • Car accident scene (with 1 driver, 1 victim hit by the car, and 1 bystander): this is a more difficult level, so to speak. It will not be apparent which of the victim and the driver is dying. The bystander is there to showcase a different heartbeat: they are filled with horror from witnessing the scene. The scene conveys the randomness and suddenness of death.
    • The final scene, Hospital 2 (with the companion – or the narrator – a spirit who has been helping and teaching you): this conveys a graver sadness, an emptiness from sending away someone who has accompanied you on your journey of grim reaper training. She is at peace with going, but the emptiness should linger longer. In addition, it shares the same hospital as the first scene, which is narratively why she was able to talk with you in the first place.
    Tutorial + Hospital 1
    Road scene
    Last hospital scene

    The road scene

    In order to showcase a scene that is clearly an accident that could not have been prevented, we placed a fallen tree branch on the road, indicating that the driver swerved to avoid it and accidentally hit a person.

    Concept art
    Scene layout

    Next steps

    Having a clear direction for iteration, our next step is simply build, build, build!! Please look forward to our assembled final experience.

  • Week 9 (10/31/25) – How to Playtest for Emotions?

    This week, preparing for Playtest Day on Saturday, we reevaluated how our final experience answers the core question of our project goals: “How does our design of haptics and context influence how emotion is perceived in a full narrative arc of an experience?” We first discussed within the team, then had a talk with Dave Culyba, one of the ETC faculty, and discussed with our project consultant and advisors. Finally, we executed the elaborate testing plan we came up with for the artifacts we had developed up until this week: the Tutorial and Act 1 of our experience.

    Haptics’ role in an experience

    After guidance from Dave Culyba (a teaching professor who teaches the rapid iteration experience design course Building Virtual Worlds at ETC), we identified the three core guidelines to think about our final experience design related to our goals as a haptics project.

    What is the moment of haptics we are focusing on?

    Knowing that feeling the heart itself is the core of the emotional climax, it is crucial for us to zoom in on that heart moment down to the finest details and refine it.

    There are three main phases of the heart:

    • Feeling the heartbeat
    • Crushing it
    • Consequence of having crushed it

    Each phase is accompanied by a different haptic pattern, and designing them to be differentiable, with adequate visual and audio context, is important. Our design guideline: first design a pattern, add context according to it, then refine the moment, either amplifying the pattern or fine-tuning the context details, based on how clearly the phases can be differentiated from each other.

    How does haptics affect our understanding of an experience

    But the first design guideline simply ensures haptics are “perceivable”; it does not equate to an emotional response. We recalled our insight from all the pre-halves prototypes: “haptics understanding is based upon sensed memories, whether from realistic experience, cognitive understanding, or interaction of hands, and that’s what changes people’s emotional response”. Therefore, to employ a haptics-first design, where haptics informs every other part of the design, the design goal for each haptic experience comes from the sensed memories we want players to be reminded of. This includes having sufficient (but not distracting) context, haptics that match expectations and reinforce understanding (ideally changing how players interact), and, if needed, elements that help change players’ interaction intentions.

    Each of haptics, audio, and visual can be categorized as helpful, neutral, or distracting. For our haptics-first approach, ensuring haptics are only ever helpful or neutral is crucial; context design should support, not lead.

    What is our emotional goal for every haptics we put in the experience

    Since we are an exploration project on haptics and emotional design, our final experience should include as many different ways haptics can change emotional responses as possible. Therefore, we need to ensure the design goal for each haptic is to lead to a different emotion. This requires designing different sets of sensed memories that lead to different emotions. And this is what we will ultimately playtest for in our complete experience: making sure each haptic experience matches our intended emotion.

    Playtest Planning

    With our design directions and insights gained, preparing for our playtest on Saturday became purposeful: we are playtesting for the effectiveness of the heart moment to the overall experience. We would test different sets of patterns to see how that changes perception of a sensed memory, and therefore the emotion, as well as the effectiveness of haptics itself.

    Since we are focusing on the heart moment, we identified 2 parameters that change the heartbeat pattern: intensity (strong, weak) and heart rate (steady, increasing, decreasing, arrhythmic). This leads to 8 different patterns. Treating each scene as a small investigation, where the player (Death) investigates different hearts to determine who is closest to dying, we speculate that a weak, arrhythmic, or decreasing heart reads as definitely close to death. Any strong heart is associated with life, an increasing one suggests nervousness, and steady/fast/slow can change meaning depending on context.
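
    Enumerated concretely, the parameter space is just two intensities crossed with four rates. A small sketch, with interval values that are illustrative rather than our tuned bHaptics Designer numbers:

    ```csharp
    using UnityEngine;

    // The 2 x 4 heartbeat parameter space (8 patterns total).
    public enum BeatIntensity { Strong, Weak }
    public enum BeatRate { Steady, Increasing, Decreasing, Arrhythmic }

    public struct HeartbeatPattern
    {
        public BeatIntensity Intensity;
        public BeatRate Rate;

        // Seconds until the next beat, with t in [0, 1] across the listening window.
        public float IntervalAt(float t)
        {
            switch (Rate)
            {
                case BeatRate.Increasing: return Mathf.Lerp(1.0f, 0.4f, t); // speeds up
                case BeatRate.Decreasing: return Mathf.Lerp(0.8f, 1.6f, t); // slows down
                case BeatRate.Arrhythmic: return Random.Range(0.4f, 1.4f);  // irregular
                default:                  return 0.9f;                      // steady
            }
        }
    }
    ```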

    For this playtest, our artifact was the tutorial scene plus the hospital scene, where the patient has a different haptic pattern depending on the playtest group. We came up with three playtest groups:

    • Group A: increasing heartbeat (test perception to emotion)
    • Group B: decreasing heartbeat (test perception to emotion)
    • Group C: no haptics at all (test haptic effectiveness)

    On the other hand, with our dialogue still in development, to still ensure player-character embodiment, we as playtest administrators decided to simply “speak out” the narrator’s dialogue: 1. to teach the players how to interact with the heart, and 2. to teach them what to look for (a heart pattern closer to death) in the hospital scene.

    Testing methodologies

    Our experience has a strong emotional space, so it’s crucial to navigate the nuances that might arise, which influenced our testing methodologies.

    • content warning about depiction of death, reaching into character bodies, hospital scene, and blood on the heart.
    • Likert scales for VR experience
      • A Likert scale is a survey question technique that presents a statement and asks the playtester to choose, on a 1-5 or 1-7 scale, how much they agree with it. For our experience, we decided to utilize Likert scales to gauge how much prior VR, hand-tracking, and haptics experience playtesters had, to easily screen their VR naivety.
    • interview questions for overall experience
      • Knowing that playtesters might not have time to finish a long survey, we decided to use interviews to help them reflect on their experience, and us developers will record the answers in a survey ourselves for more convenient future data analysis.
      • The interview questions were split into sections:
        • VR naivety (Likert scale)
        • Technical difficulties encountered (asking this here ensures it gets out of the way when asking about the actual experience)
        • Their understanding of player goals and therefore interaction intentions in each scene.
        • What they perceived about the heart moment, and how that influenced their decision in choosing to crush it.
        • Give them the PANAS survey
        • Ask about the extreme values in the survey.
    • PANAS for emotions
      • PANAS is a technique for asking which emotions contribute to an experience. We adapted the emotional wheel, choosing the 25 most relevant emotions, and created a survey that prompts playtesters to check the emotions they felt at the moment of crushing a heart and, for each emotion, circle the intensity with which it contributed to that feeling.
    Likert scale for VR familiarity screening
    Asking technical difficulties
    Checking understanding
    Understanding perception of heart moment
    Emotion wheel with the emotions we selected
    Emotional survey

    Playtest Procedure

    Whew, that was a lot. Recap: for each playtest group, we do an intro (including the content warning); in the experience, we read from the script to teach players, and give different playtesters different haptic pattern groups for the patient; finally, we survey them through an interview.

    1. Introduction to the team and content warning for our currently bloody heart. Onboard the playtesters to listen carefully to us talking to them. Give them sanitary gloves in their hand size.
    2. Start the experience. Note which group they are in. A, B, C. Read from a script to guide them.
    3. Interview their experience
      • Screening for VR naivety
      • Overall understanding of the experience, including technical difficulties
      • How they decided who to take life away
      • A survey for their specific emotion at that moment of crushing the heart

    Next steps

    And that’s all! On Playtest Day, we executed the plan. There were difficulties: Group C playtesters would have been very unhappy not being able to test haptics, so we only tested Groups A and B; our script had to change dynamically for each playtester as we iterated on its contents; and our haptic patterns were not distinct enough, making pattern identification very difficult. Still, we got 27 survey results, which helped us understand how our design of haptics, context, and dialogue (onboarding) could improve, and we’ll walk you through it next week!

  • Week 8 (10/24/25) – Gold Spike for Final Experience!

    Welcome back to school! To prepare for a Playtest Night at Hunt Library on our second day back, we immediately got the MVP we had discussed before fall break working. In addition, we started setting up the asset templates for the experience and making a detailed design of our first scene.

    Integrating heartbeat + crush

    With our previous prototypes, combining the two was simple. There was only 1 catch: Meta SDK hand tracking. We realized that, for the heartbeat, it is crucial to have players face their palms toward the camera so that the SDK can detect a grab and therefore play haptics at the correct moments. Crushing was more complicated: the timing of fingers closing into a fist, i.e., the moment transitioning from crushing to crushed, needed to be carefully detected, and preventing players from accidentally crushing prematurely, or being unable to crush at all, was exceedingly crucial.
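
    This is not our exact implementation, but a minimal sketch of the kind of guarded detection this calls for, assuming a normalized 0-1 finger-curl value from hand tracking (the GetAverageFingerCurl helper is a hypothetical stand-in for the Meta SDK query):

    ```csharp
    using UnityEngine;

    // Sketch: detect the crushing -> crushed transition without premature triggers,
    // using two thresholds (hysteresis) plus a minimum hold time.
    public class CrushDetector : MonoBehaviour
    {
        public float startCurl = 0.6f; // above this, the player counts as "crushing"
        public float crushCurl = 0.9f; // above this, the fist counts as closed
        public float minHold = 0.3f;   // seconds the fist must stay closed

        private float closedSince = -1f;
        public bool Crushed { get; private set; }

        void Update()
        {
            if (Crushed) return;
            float curl = GetAverageFingerCurl(); // hypothetical hand-tracking query

            if (curl >= crushCurl)
            {
                if (closedSince < 0f) closedSince = Time.time;
                if (Time.time - closedSince >= minHold) Crushed = true;
            }
            else if (curl < startCurl)
            {
                closedSince = -1f; // fist reopened; reset so we never trigger early
            }
        }

        private float GetAverageFingerCurl() { return 0f; /* read from the SDK */ }
    }
    ```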

    Combining individual haptic patterns

    For the haptic design, we had the heart play the original heartbeat pattern, a lub-dub beat, along with sound and a heart animation, with the heart-rate interval exposed as a variable that other scripts can control. We initially thought the moment of touching the heart should have its own “select” haptic pattern, but we eventually abandoned this, as the heartbeat is strong enough.

    For the crushing haptic pattern, since we noticed that people naturally close their fists as vibration intensity goes up, and to balance the granularly designed bHaptics events against the purely code-driven PlayMotors() function that the bHaptics SDK provides, we divided crushing into low- and high-intensity stages, plus a final “crushed” moment, which used the “residual impact” pattern.
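
    A sketch of that balance, wrapping the SDK behind hypothetical helpers rather than quoting exact bHaptics signatures: code drives the motor intensity while the fist closes, and the authored “residual impact” event fires once at the crushed moment.

    ```csharp
    using UnityEngine;

    // Sketch of the crush haptics split: code-driven low-to-high intensity while
    // the fist closes (in the spirit of the SDK's PlayMotors()), then the designed
    // "residual impact" event at the crushed moment. Thresholds are illustrative.
    public class CrushHaptics : MonoBehaviour
    {
        private bool crushedFired;

        // curl: normalized 0-1 fist closure from hand tracking.
        public void OnCrushProgress(float curl)
        {
            if (crushedFired) return;

            if (curl < 0.9f)
            {
                // Code-driven stage: vibration scales with how far the fist closed.
                float intensity = Mathf.InverseLerp(0.4f, 0.9f, curl);
                SetAllMotors(intensity); // hypothetical wrapper over PlayMotors()
            }
            else
            {
                PlayDesignedEvent("ResidualImpact"); // hypothetical wrapper
                crushedFired = true;
            }
        }

        private void SetAllMotors(float intensity) { /* forward to bHaptics SDK */ }
        private void PlayDesignedEvent(string name) { /* forward to bHaptics SDK */ }
    }
    ```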

    Adding sufficient context

    What we noticed about the heart-crushing moment, from heartbeat to crushing, is that with pure haptics and no visual or sound feedback, the heartbeat is indistinguishable from crushing, which causes player confusion. As a project focusing on haptics, our goal was to add only sufficient audio and visual context, just enough to convey our intended meaning without breaking immersion. Therefore, we identified a few moments where audio and visuals were crucially needed:

    • Heart beating pattern: beating sound, beating animation
    • Transitioning beating to crushing: sound changing from beating to a squeezing sound. Visual is not as needed to convey the meaning, but very crucial for immersion. Our main programmer Jing transferred into our official technical artist, and created a VFX for blood being spurted out.
    • Crushed moment: for now, we did not add anything.

    And we ended up with this very bloody heart that spurts blood when being crushed.

    Minimal scene diorama

    We knew that the structure of our final experience includes a title scene that teaches the heart-grabbing action, and a hospital scene that teaches the player’s goal, followed by 1-3 scenes of posed, stop-motion humanoids, each representing a person’s last moment before Death takes their life. To test this flow in the playtest and see if this minimal transition makes sense to players, we created two scenes: the 1st, a heart on a pedestal that needs to be crushed, and the 2nd, a body lying on the ground with a heart inside that is revealed through a shader activated when your hand reaches into the person’s body.
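
    For the reveal itself, a minimal sketch of how such a shader toggle can be driven from hand proximity; “_Reveal” is a hypothetical material property and the threshold values are illustrative:

    ```csharp
    using UnityEngine;

    // Sketch: drive a reveal amount on the heart's material from how deep the
    // player's hand has reached toward the chest. The shader itself (sampling
    // "_Reveal" to fade the heart in) lives on the art side.
    public class HeartReveal : MonoBehaviour
    {
        public Transform hand;            // tracked hand position
        public Transform chestCenter;     // inside the body, where the heart sits
        public float revealRadius = 0.3f; // meters; illustrative
        public Renderer heartRenderer;

        void Update()
        {
            float dist = Vector3.Distance(hand.position, chestCenter.position);
            // 1 when the hand is at the heart, 0 at or beyond the reveal radius.
            float reveal = Mathf.Clamp01(1f - dist / revealRadius);
            heartRenderer.material.SetFloat("_Reveal", reveal);
        }
    }
    ```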

    Playtesting this bloody heart and a scene transition

    Since it was the start of the second mini semester, no one went to the Hunt Library Playtest (neither did pizza arrive), so we started walking around the library to recruit playtesters randomly. We eventually got 3 (+ Anthony, the ETC playtest coordinator)!

    For the heart moment, we were interested in players’ reactions to the heart-crushing moment and what emotions it elicited. We found that, because of the bloodiness, most people felt “disgusted”. The action of reaching into a person’s body for a heart not only enhanced the “disgust” but also added a sense of “powerfulness”. In addition, the haptics were clear and were perceived as a very important part of the experience (especially for VR-naive guests, which was a surprise, since we had initially set our target audience as VR non-naive guests). However, the specific sensation was not paid close attention to; whether it was heartbeat or crushing did not matter much to the experience.

    For the final experience flow from tutorial to diorama scenes, we were interested in seeing whether players intuitively look for the heart in a person’s body, whether the flow makes sense, and what players might do that we would want to design to encourage or discourage. We found that players immediately knew their goal, though some were reluctant to reach into a person’s body. On the other hand, the heart not being affected by gravity gave a sense of relief, as players felt more confident playing with the heart itself: some started throwing it, which is something we will want to discourage, but it was interesting to observe.

    Overall Experience

    Building on the gold spike, we started planning out the scenes in detail. We created a scene document that lists out an overview of the scene, the player’s goal, the haptic patterns used, and assets needed along with reference pictures as a central hub for the team to align on the vision of each scene.

    Narrative goal (+ how haptic enhances that)

    What made our team excited about players being Death, taking people’s lives by feeling their heartbeats and crushing their hearts, is that this is an inherently emotional premise, giving us a huge emotional space to work with. And by having players figure out whose life to take by feeling heartbeats, we also had space to explore haptic design where haptics is crucial to progressing the experience (without haptics, the experience would be unplayable).

    Our player is a Death in training, guided by a companion. They go through a hospital scene at the beginning to learn that their goal is to feel people’s hearts and determine who is dying, then a few more scenes with more complicated heartbeats (where haptic design comes into play), and in the final scene they realize the last person in their training is that companion, a wise soul who has been following Death around before her own time ends. The final scene is our emotional climax, and our goal is to make the impact of feeling the heartbeat and crushing the heart as filled with emptiness and loss as possible.

    Hospital scene design

    For our first scene after the title/tutorial scene of grabbing and crushing the heart, we are making the hospital. As our goal is to have players experience the complex emotions of facing death, a hospital is a great first scene: it is simple to decide who is on their deathbed (usually the person on the bed).

    Our minimal scene design includes 2 humans: a patient who is dying and 1 relative by their side; plus a hospital room, bed, and a curtain (p.s. the companion actually came from the same hospital, just on the other side of the curtain – this makes sense story-wise, and we’d be able to reuse assets, yay).

    We determined that the humanoids should not be as realistic as the ones we tested at Hunt Library, as the scene itself would then be too emotional, so we went for a stylized, cartoonish look.

    Concept art
    Reference photos

    Next steps

    Our goal for next week is to continue making the hospital scene, determining the crucial questions we would be asking during Playtest Day next Saturday, and developing a testing plan and artifacts for that.

  • Week 7 (10/10/25) – Final Experience Design

    This week we tested with our second batch of context prototypes, Stroke, Crushing, and Breathing (also called energy transfer). On Wednesday, we did the halves presentation. On Thursday and Friday, we finished up more playtests along with coming up with a final experience design from the three candidates we had from last week.

    Second batch of context prototypes + Playtest

    As with our first batch of slap, pinch, and heartbeat, we tested stroke, crushing, and breathing with different contexts while observing playtesters’ natural behaviors.

    Stroke

    For stroke, we had a cute creature (from another ETC project, Spleunx), along with a cold hard cube. We had playtesters try stroking the two objects and observed how that changed their perception.

    Results: Because of how un-strokable the cube “looks”, no one thought they could stroke it, and even when stroking it, they found it difficult to figure out the right way to interact with it. We therefore concluded that for a haptic pattern that traces a path on the hand, the object’s appearance has to match expectations down to shape and texture.

    Crushing

    For crushing, we believe that this action is already a powerful one, so we wanted to see how to make this interaction most satisfying. We gave the players a planet to crush, and have it play a sound building up to the debris flying moment.

    Results: With this interaction, we did not do any A/B testing but focused on understanding how people crushed a planet. When people were able to crush it, they felt “powerful”. However, a lot of people found it difficult because of Meta hand tracking. For this interaction to be effective, we should design for hand poses that Meta hand tracking supports well; that is what ensures the feeling of power.

    Breathing / Energy Transfer

    For breathing, we created two kinds of water energy that players can absorb and release, one clear and magical, the other disgusting, and turned on and off haptics, to test if the visual and audio context really influenced the haptic feedback impact.

    Results: We found that, in fact, the more disgusting water worked better with haptics, while the clear magical water with haptics was actually distracting. This partly stemmed from our gloves’ inability to produce the temperature sensation the clear water might suggest, but the finding made us realize that haptics inherently works best with strong visual and audio context, as an external vibration from haptic gloves is already a strong signal.

    Halves Presentation

    Playtesting focused strategy

    Our Halves presentation was quite well received, and faculty really liked the playtesting and experimentation strategy that we explained with a timeline. We were encouraged to think more about how emotions can be tested, beyond the haptic sensation part of the exploration.

    Design insights

    We compiled our design insights from all the testing, along with the research that informed our insights and prototyping strategy, into a documentation section of our current website. While we plan to migrate everything to a more easily formattable Google Doc in the future, the documentation currently lives here.

    Final Experience MVP Planning

    With the playtest of our final explorational context prototypes done, it was time to discuss which final experience idea we chose, and what the Minimal Viable Product (MVP) plan looks like, so we can think through it throughout fall break, and start with full energy when we come back to school.

    Eventually, out of the three ideas of main interaction + haptic pattern + emotion (deckhand pulling + tension + powerlessness; deity creating a world + blast + powerful; grim reaper reaping hearts + heartbeat with crushing + mournful), the first two proved either too programming-heavy or too art-heavy, so all our votes went to the last idea!

    Heart reaping it is!!

    We then used our Trello board to list out all initial work that needed to be done.

    Next Steps

    With the first half of the semester filled with experimentation on haptics and contexts, it’s now time for our final experience design and implementation! See you after fall break!

  • Week 6 (10/3/25) – Prototyping Stroke, Crushing, and Breathing!

    This week we’re testing our first batch of “context prototypes”: the haptic patterns from our pattern testing that we deemed to have high interaction potential, each placed in a “testing environment” set up to tell us which contexts work best with each haptic pattern and what we should look out for with each one.

    This week, we had a Hunt Library Playtest Night on Tuesday, testing with slap, pinch, and heartbeat, while continuing developing the next set of prototypes: stroke, crushing, and breathing.

    Playtest Night

    The main goal of the context prototypes is to see which context changes have emotional potential. Therefore, for each pattern, we made a different change and asked for emotional responses using the emotional wheel.

    Slap

    For slap, the chosen context change was sound: we had a button on the side that switches the feedback sound from one that matches what slapping a balloon sounds like to one that is comical and funny, while the haptic feedback remains the same.

    Result: Interestingly, many people felt that the comical sound made things unserious, and that the haptic feedback itself became lighter. This matches our expectation that context really influences your perception of reality: players expect certain feedback based on their sensed memories, and that expectation can be shifted with just a sound change.

    Pinch

    For pinch, we changed the text on the wall indicating who the character you are pinching up is: your mom, your ex, a king, a prisoner, etc.

    Result: Less about the haptic feedback, the annoying voice made a lot of players want to pick the boy up and throw him to the ground. On the other hand, when the label was “your mom”, players were more reluctant, a drastic contrast with “a prisoner”. This showcases how context as simple as a character label can change a player’s interaction intention, something a glove-based interaction project can work with.

    Heartbeat

    For heartbeat, we tested with different intervals of heart rates: slow / steady, fast, increasing, and decreasing.

    Result: People immediately associated an increasing heartbeat with nervousness and excitement, and a decreasing heartbeat with sleepiness and dying. The steady slow and fast ones did not convey much difference, though an extremely fast one made people nervous. It is apparent that heartbeat is a pattern that can clearly convey meaning and even influence emotions.

    Final Design Brainstorming

    Even though we are still testing the context prototypes, Halves Presentation is near, and we wanted an outline for our final experience. We decided to discuss what we have learned and what we believe should make up the final experience.

    At minimum, we know that the final experience will be structured as many haptic patterns (whether the same or different ones) accumulating toward an emotional climax.

    Design Guiding Questions

    To aid our brainstorming, we compiled 3 main questions to ask ourselves when coming up with an idea.

    1. What is the climax?
      1. What is the small-form interaction?
      2. What patterns? What actions?
      3. What emotion does it stem from and build up to?
    2. What’s the theming?
      1. Where do the interaction and haptics happen?
      2. What style?
    3. What is the main emotion?

    Three Ideas

    … and we came up with three ideas!

    Rope-pulling tension as a deckhand: from pulling up treasures to being unable to pull up a fallen crewmate, for an emotion of desperation.

    God simulator: merging creations and blasting them into the world, feeling the different textures of the creations, with the emotion shifting between powerful and careful.

    Grim reaper: feeling people’s hearts and crushing them. The climax comes from crushing the heart of someone you know, a complicated emotion.

    Next Steps

    While preparing for Halves and coming up with design plans, we’re still continuously developing our second batch of context prototypes, which is set to be playtested at ETC the day before the presentation.