We are team PPP (Physical Presence Pet), a group of 6 students at Carnegie Mellon University’s Entertainment Technology Center creating a tactile XR virtual pet.
Read about our development journey!
Why?
Touch is foundational to our relationship with companion animals, and even touching inanimate objects provides emotional and stress-relieving benefits.
However, fully virtual pet apps offer no touch interaction at all, and VR pets controlled through hand gestures give you nothing physical to touch. Meanwhile, robotic pets (even those with soft, realistic fur) remain stuck in the uncanny valley, lacking the convincing liveliness of animated virtual pets.
We aim to fill this gap. By combining research in soft technologies with the latest XR headsets, we want to create an immersive virtual pet experience with satisfying physical touch interactions.
How?
On the physical side, we are building a soft plush with embedded touch sensors. On the virtual side, we are developing an animated pet experience that translates the touch interactions into lively, fun animations of the plush in the Apple Vision Pro.
We are consulting with experts in Soft Technologies at CMU’s IDeATe, as well as with developers at Apple, to guide our project across both sides of the pipeline.
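To give a flavor of what that translation might look like, here is a minimal Swift sketch of how raw readings from the plush’s embedded sensors could be classified into touch gestures that the pet reacts to. The sensor layout, thresholds, and gesture names here are illustrative assumptions, not our actual implementation.

```swift
import Foundation

// Hypothetical touch gestures the virtual pet could respond to.
enum TouchGesture {
    case pet    // sustained, gentle contact spread across sensors
    case poke   // brief, sharp pressure on a single sensor
    case idle   // no meaningful contact
}

// One frame of readings from the plush's embedded pressure sensors,
// normalized to 0.0 (no touch) ... 1.0 (firm press).
struct SensorFrame {
    let pressures: [Double]
    let timestamp: TimeInterval
}

// Very rough classifier: looks at how many sensors are active and how hard
// they are pressed. Thresholds are placeholders, not tuned values.
func classify(_ frame: SensorFrame,
              pokeThreshold: Double = 0.7,
              petThreshold: Double = 0.2) -> TouchGesture {
    let active = frame.pressures.filter { $0 > petThreshold }
    if active.isEmpty {
        return .idle
    }
    // A single strong reading looks like a poke; broad, gentle contact looks like petting.
    if active.count == 1, let peak = active.first, peak > pokeThreshold {
        return .poke
    }
    return .pet
}

// Example: several sensors register light pressure, so this reads as petting.
let frame = SensorFrame(pressures: [0.3, 0.25, 0.4, 0.0],
                        timestamp: Date().timeIntervalSince1970)
switch classify(frame) {
case .pet:  print("Trigger a happy purring animation")
case .poke: print("Trigger a startled animation")
case .idle: print("Keep playing the idle animation")
}
```

In a pipeline like this, the classified gesture would then drive the pet’s animation state on the Apple Vision Pro side.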
Read more about our design!
Why the Apple Vision Pro?
We chose the AVP as our primary platform because it is designed as an immersive tool with high-resolution passthrough. Users don’t put on the AVP just to play a game; they wear it to do tasks in their physical environment.
This is the perfect setting for a more realistic XR pet. The pet can accompany you in your everyday life, still present for you to touch while you’re doing other tasks. It isn’t confined to just one app on your device that you open for a few minutes a day.
As future AVP generations come out and the device becomes more accessible and widespread, a tactile XR virtual pet can serve as a holistic digital companion for everyday tech users.