This week, we prepared for the ETC’s Playtest Day, where we demoed our experience to dozens of guests and received their feedback. Moreover, we have some updates on the project’s final design directions after another meeting with Jesse.
Playtest Day
Playtest Day began early Saturday morning, with the first guests trying out our experience at 9:30 a.m. We initially had some technical issues getting guests set up in the Apple Vision Pro, but we quickly smoothed those out and guests could start interacting with Luceal.
The 20 minutes we had with each group of guests was rarely enough; guests had a lot of fun, and we enjoyed helping them and talking with them about the experience. We decided early on to skip the AVP’s calibration, cutting setup time so guests could jump straight into our game. Even then, guests spent so long browsing the options in the customization scene that we often had to hurry them along to give the next guest in the group a turn. A few guests who really wanted to try the experience unfortunately didn’t get the chance before time ran out. We wish we could have given everyone who wanted one the opportunity to experience Luceal, but we were fortunate to showcase it to as many people as we did, and we received a lot of helpful feedback.
(We will upload pictures and videos of playtesters once we finish editing the footage from that day.)
Playtest Day Takeaways:
1. Ear sensors are needed: Almost all our playtesters wanted to touch the plush’s ears, so we plan to add sensors to the ears and remove the head sensor, which was too easily triggered by accident. The ear sensors will trigger the head sensor’s animations. Meanwhile, almost no one noticed that the plush has no hands, so we will focus on the ears for now and leave hands as a future nice-to-have.
2. Animations and other feedback could be clearer: One current animation (head-shaking with tail-wagging) was misinterpreted as negative, which made users cautious after seeing it. We could also differentiate the animations more by pressure level, and we will add sound and VFX throughout the experience to give clear positive and negative feedback.
3. Hardware needs a case: Some playtesters squeezed harder than we expected, which made us worry about the embedded electronics, so we plan to 3D print a protective case for them.
4. A bed and product packaging for our pet: Playtesters held the pet in many different ways. To make its starting position clear, we’ll make a bed for it to sleep on. For finals, we would also like to have a product packaging box for the pet, as if it were on a store shelf, ready to be taken home.
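To make the sensor changes in takeaways 1 and 2 concrete, here is a minimal sketch of how touch zones and pressure levels could map to distinct animations. All names here (zone labels, thresholds, animation labels) are hypothetical illustrations, not our actual implementation:

```python
# Hypothetical sketch: routing touch-sensor readings to distinct plush
# animations, so light pats and firm squeezes feel clearly different.
# Zone names, thresholds, and animation labels are illustrative only.

LIGHT_TOUCH = 0.2    # normalized pressure thresholds (0.0-1.0)
FIRM_SQUEEZE = 0.7

def pick_animation(zone: str, pressure: float) -> str:
    """Return an animation label for a touch on a given sensor zone."""
    if zone == "ear":
        # Ear touches reuse the former head sensor's animations.
        return "ear_perk" if pressure < FIRM_SQUEEZE else "happy_wiggle"
    if zone == "body":
        if pressure < LIGHT_TOUCH:
            return "idle"        # too light to register as a pat
        if pressure < FIRM_SQUEEZE:
            return "tail_wag"    # gentle pat, unambiguously positive cue
        return "startled"        # hard squeeze gets its own reaction
    return "idle"                # unknown zone: stay in idle
```

The key design choice is that every pressure band maps to a visibly different animation, which addresses the feedback that responses like the head-shake/tail-wag combination were ambiguous.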
Meeting with Jesse
We met with Jesse again before Playtest Day to get guidance on what we should focus on testing.
Key questions guided our design process: How do people naturally want to touch the pet? Do they intuitively understand our interaction methods? And what do they expect when their touch aligns with the pet’s responses—creating that magical “peak moment”? Our goal is to design an experience that supports players’ natural touch instincts while gently guiding them toward the interactions we’ve built.
Jesse suggested taking inspiration from toys like Tickle Me Elmo, where clear cues (like the word “tickle”) shape how users engage. Progression—like Elmo’s laughter escalating to shakes—also enhances connection. Another insight: weight matters. Living things feel substantial because of their water content, so a lightweight pet might feel unnatural. One future idea is to playtest different weights to find the right balance between realism and comfort.
We’ll see you again next week, when we start implementing the changes informed by Playtest Day feedback as we prepare for our Soft Opening in less than two weeks.