Week 7 (Halves)

Bon Appétit! Now at the halfway point of the semester, the team is gathering all of our best work and laying it out on an appetizing platter, like a delicious apple toast.

On the tech side, things aren’t quite smooth sailing just yet. The main issue the programmers are wrestling with is that transforms don’t always behave the same way on the Spectacles hardware as they do in Lens Studio on our PCs.

One problem that only occurs on the Spectacles is that moving any object in the colocated space causes it to rapidly stretch and scale. It turns out this is a common headache in mixed reality: a child object’s transform compounds with its parent’s, so scaling something nested under a scaled parent produces weird relative scaling shenanigans. In our case, though, we can’t simply unparent these objects, since anything that multiple people should see and interact with needs to live under the ColocatedWorld object in the scene, per the Spectacles Sync Kit. For now, our only workaround is to disable scaling entirely in the interactable manipulation components on these objects.
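The compounding is easier to see with the math written out. Below is a generic sketch (plain TypeScript, not the actual Lens Studio API): a child’s world scale is its local scale multiplied componentwise by its parent’s world scale, so a child sitting under a non-uniformly scaled parent ends up stretched even when its own local scale is uniform.

```typescript
// Generic sketch of transform-scale compounding (not Lens Studio API).
type Vec3 = [number, number, number];

// A child's world scale is its local scale multiplied componentwise
// by its parent's world scale.
function worldScale(parentWorld: Vec3, childLocal: Vec3): Vec3 {
  return [
    parentWorld[0] * childLocal[0],
    parentWorld[1] * childLocal[1],
    parentWorld[2] * childLocal[2],
  ];
}

// To keep a child at a desired world scale under an arbitrary parent,
// divide the target componentwise by the parent's world scale.
function compensatingLocalScale(parentWorld: Vec3, targetWorld: Vec3): Vec3 {
  return [
    targetWorld[0] / parentWorld[0],
    targetWorld[1] / parentWorld[1],
    targetWorld[2] / parentWorld[2],
  ];
}

// A parent scaled to [2, 1, 1] stretches a uniformly scaled child on x:
const parent: Vec3 = [2, 1, 1];
console.log(worldScale(parent, [1, 1, 1]));             // [2, 1, 1]
console.log(compensatingLocalScale(parent, [1, 1, 1])); // [0.5, 1, 1]
```

The `compensatingLocalScale` helper is one way an engine could keep a child visually unchanged while reparented; disabling scale in the manipulation component sidesteps the problem altogether, which is the route we took.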

The other major obstacle was unpredictable behavior in how the glasses established the colocated world space, and where objects would spawn within it. If we placed something at position (0, 0, 0), sometimes it would spawn across the room, sometimes somewhere behind us, and sometimes we couldn’t find it at all.
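A common mitigation in AR when the world origin can’t be trusted is to place objects relative to the user’s head pose instead of at absolute coordinates. Here is a generic sketch of that math (again not the actual Spectacles API; the vector helpers are our own):

```typescript
// Generic head-relative spawn math (not Spectacles Sync Kit API):
// place an object a fixed distance in front of the camera rather than
// at the shared world origin, so it is always somewhere findable.
type Vec3 = [number, number, number];

function add(a: Vec3, b: Vec3): Vec3 {
  return [a[0] + b[0], a[1] + b[1], a[2] + b[2]];
}

function scale(v: Vec3, s: number): Vec3 {
  return [v[0] * s, v[1] * s, v[2] * s];
}

// spawnPoint = cameraPosition + cameraForward * distance
function spawnInFront(cameraPos: Vec3, cameraForward: Vec3, distance: number): Vec3 {
  return add(cameraPos, scale(cameraForward, distance));
}

// With the camera at eye height looking down -z (a common convention),
// an object spawned 0.5 m ahead lands at z = -0.5:
console.log(spawnInFront([0, 1.6, 0], [0, 0, -1], 0.5)); // [0, 1.6, -0.5]
```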

All of these challenges made the experience extremely difficult to play through, which was especially concerning this week, since we needed to record and showcase our project at Halves. We decided to quickly build out the simplest demo we could reliably play: spawn a few non-synced food items in front of each player, have them add the ingredients to the soup, and display some text above the soup showing that it knows what was added, with that information reflected accurately on each person’s glasses.
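The synced soup label boils down to one authoritative ingredient list mirrored to every client. Here is a toy model of that idea (not the Spectacles Sync Kit API; `SoupLabel` and its listener scheme are ours, standing in for whatever synced storage the kit provides):

```typescript
// Toy model of the synced soup label (not the Spectacles Sync Kit API):
// one authoritative ingredient list, broadcast to every connected
// client so each pair of glasses renders the same text.
class SoupLabel {
  private ingredients: string[] = [];
  private listeners: Array<(text: string) => void> = [];

  // Each client subscribes to label updates and gets the current text.
  onChange(listener: (text: string) => void): void {
    this.listeners.push(listener);
    listener(this.text());
  }

  // Adding an ingredient notifies every subscribed client.
  addIngredient(name: string): void {
    this.ingredients.push(name);
    const text = this.text();
    for (const l of this.listeners) l(text);
  }

  text(): string {
    return this.ingredients.length === 0
      ? "Soup: (empty)"
      : "Soup: " + this.ingredients.join(", ");
  }
}

const soup = new SoupLabel();
const glassesA: string[] = [];
const glassesB: string[] = [];
soup.onChange((t) => glassesA.push(t));
soup.onChange((t) => glassesB.push(t));
soup.addIngredient("carrot");
soup.addIngredient("potato");
// Both players' latest label text matches:
console.log(glassesA[glassesA.length - 1]); // "Soup: carrot, potato"
```

The point of the sketch is the single source of truth: as long as only one copy of the list exists and everyone renders from it, the label can’t diverge between players’ glasses.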