Week 6

Production:

At the beginning of this week, we came up with 10 pitches for a face detection game that uses the face as a controller. We generated interesting ideas and also analysed the possible challenges each would bring: either technical challenges or a mismatch with the client's needs.

Face Detection Game Pitches

In conclusion, summing up all the ideas, we narrowed them down to 2 approaches that will work for the client and also match our needs:

1- A rhythm/dancing game in which the player follows a path and reacts at given points

2- A racing game in which players are limited to following the one and only correct track, which we can iterate from last week's prototype. Given time constraints, we are more inclined to go in this direction to pursue an outcome that meets the client's needs.

Game Design:

We have our first prototype ready: an STG (shoot 'em up) game in which players use their face to control a spaceship traveling through space. We implemented a couple of the face controls: in this prototype, the player uses their face to:

  • Control the movement of the spaceship, dodging meteors:
  • Nodding: moves the spaceship back and forth
  • Shaking the head: moves the spaceship left and right
  • Special mechanic: opening the mouth fires at UFOs

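The mapping above could be sketched roughly like this (a hypothetical sketch, not our actual implementation: it assumes an upstream face tracker already supplies head pitch/yaw angles and a mouth-openness value, and all thresholds are illustrative guesses):

```python
# Hypothetical mapping from face-tracking signals to ship commands.
# Assumes a face tracker supplies head pitch/yaw in degrees and
# mouth openness in [0, 1]; the threshold values are made up.

PITCH_THRESHOLD = 10.0        # nod forward/back
YAW_THRESHOLD = 15.0          # shake head left/right
MOUTH_OPEN_THRESHOLD = 0.5    # open mouth to fire

def face_to_controls(pitch: float, yaw: float, mouth_open: float) -> dict:
    """Translate raw face signals into discrete ship commands."""
    return {
        "forward": pitch < -PITCH_THRESHOLD,  # nod down -> move forward
        "back": pitch > PITCH_THRESHOLD,      # nod up -> move back
        "left": yaw < -YAW_THRESHOLD,         # head turned left
        "right": yaw > YAW_THRESHOLD,         # head turned right
        "fire": mouth_open > MOUTH_OPEN_THRESHOLD,
    }
```

Keeping the mapping in one pure function like this makes it easy to tweak thresholds during playtests without touching the rest of the game loop.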
Playtest Observation:

Based on playtest feedback and client needs, we found some things that work and some that do not:

Working:

Based on feedback from our client and some playtesters, we found that the control mapping we designed is easy to pick up. Moreover, some playtesters found our shooting mechanic satisfying, as it provides instant, tight feedback between facial expressions and game interactions.

Not Working:

1- Too much freedom: a clear goal, but no clear PATH

Although we give players only limited freedom, the game is still open-ended, which allows players to solve problems in ambiguous ways. For example, when encountering an enemy, depending on its position, players can choose to evade or shoot it, which makes the outcome of each play different. According to our client's needs and their research purposes, this is hard to calibrate compared to a predefined path where performance can be measured against one and only one correct solution.

2- When the player uses their head for directional controls, rotating the head to move left and right makes it difficult to look at the screen, while nodding up and down sometimes tampers with detection accuracy.

3- We found that controlling a spaceship with facial movements alone lacks a tight connection. Since players are constantly thinking about their facial expressions instead of immersing themselves in the game, this approach makes the gameplay less engaging than controlling something with a more visual and intuitive connection.

2nd Prototype Iteration:

We made two key changes to limit player freedom.
First, we constrained the navigation path, guiding players along a fixed route with a clear direction. Second, we removed the back-and-forth controls: the ship now moves forward automatically, while players steer by tilting their heads left or right.
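A rough sketch of that auto-forward, tilt-to-steer scheme (again hypothetical: it assumes a face tracker supplies head roll in degrees, and the speed, turn-rate, and tilt-range values are made up for illustration):

```python
import math

# Illustrative constants, not our tuned values.
FORWARD_SPEED = 5.0   # units per second of constant auto-forward motion
TURN_RATE = 90.0      # degrees of heading change per second at full tilt
MAX_TILT = 30.0       # head roll (degrees) that maps to full steering input

def step(x, y, heading_deg, head_roll_deg, dt):
    """Advance the ship one frame: auto-forward, steered by head tilt."""
    tilt = max(-1.0, min(1.0, head_roll_deg / MAX_TILT))  # normalize to [-1, 1]
    heading_deg += TURN_RATE * tilt * dt
    x += FORWARD_SPEED * math.cos(math.radians(heading_deg)) * dt
    y += FORWARD_SPEED * math.sin(math.radians(heading_deg)) * dt
    return x, y, heading_deg
```

Because forward motion is constant, the only skill being exercised (and measured) is the steering input itself, which is exactly the constraint the client's research needs.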

Here is a demo video of a player's early attempt at playing.

demo1.mov

After a couple of attempts, they get much further and can handle some sudden turns.

demo2.mp4

Feedback:

First, it better fulfills the client's needs by shifting from allowing diverse player strategies to training an unfamiliar motor skill. Progression is tied to getting used to the head-based controls and memorizing the level. Second, tilting the head is much more accessible while still requiring progressive learning, though we need more playtesting to fine-tune the parameters. Lastly, this version enables scalable level design: we can gradually add reaction points.
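To illustrate that scalable structure (a hypothetical sketch, not our actual level format), a level could simply be an ordered list of reaction points along the track, and difficulty scales by appending more of them:

```python
# Hypothetical level format: each reaction point pairs a distance along
# the fixed track with the head gesture required there. Scaling the
# difficulty is just appending more points to the list.
EASY_LEVEL = [(100, "tilt_left"), (250, "tilt_right")]
HARD_LEVEL = EASY_LEVEL + [(320, "tilt_left"), (400, "tilt_right"), (450, "tilt_left")]

def next_reaction(level, ship_distance):
    """Return the first upcoming reaction point, or None past the last one."""
    for point in level:
        if point[0] > ship_distance:
            return point
    return None
```

A flat list like this also keeps performance easy to calibrate for the client: every player faces the same sequence of reaction points in the same order.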