Project Overview & Client Description
Team StoryStudio created an interactive storytelling toolkit, including a digital tool prototype, physical interactive prototypes, storytelling guides, demo stories, an audio sound effects library, and design & playtesting documentation, to facilitate interactive storytelling at the Children’s Museum of Pittsburgh (CMP). These events were expected to host between 50 and 100 people and take place in Assembly Hall in CMP’s MuseumLab building. MuseumLab is a separate space from the main Children’s Museum, but is right next door and caters to youth, tweens, and teens (in contrast to the main building, which is geared towards ages 3-8). Our team was tasked with investigating what it means to become more immersed in a story and examining how to enhance that experience using mixed-reality interactives.
Our goal was to develop a toolkit and interface that could be used by high-school students and adults to generate meaningfully interactive storytelling events for children. Ultimately, we hope it will support CMP’s initiative to blend digital and analog interfaces in novel and entertaining ways, and specifically spark further interest in and support for interactive storytelling events.
Our delivered prototype toolkit included the following:
- A three-part guide on how to tell a good interactive story (Pixar in a Box, Interest Curves, and Interactions)
- Three Arduino-based physical prototype modules (arcade button, RFID, and servo)
- Two versions of the StoryStudio Engine, a digital tool prototype akin to PowerPoint
- An audio SFX library
- PowerPoint-based demo stories
- Tutorials on how to prepare image assets from scanned book pages
- Event playtest results, pictures, and videos (kids ages 6-16)
- Digital tool playtest results, pictures, and videos
- Design & development documentation
What went well
Overall, our team is very happy to say that we had a solid team dynamic. There was constant laughter in our project room, which, as industry professionals will attest, is critical in the entertainment industry. We also took good care of each other – supporting one another through stressful electives, prioritizing each other’s health (especially when one teammate had Covid), striking a generally healthy work-life balance, and encouraging attendance at professional development events (GDC, IAAPA, and the ETC’s East Coast Trip).
We also worked hard to create an environment that was welcoming and fun to work in. Our team spent a meaningful amount of time in the first week of the semester decorating our project room and organizing a “play” space. Our emphasis on child-like fun attracted several visitors to our project room, and we ended up hosting school tours, professional guests, external playtesters, alumni, and prospective students nearly every week. We also built and maintained a very healthy relationship with CMU’s K-12 outreach department, which led to our external playtest at Highlands Elementary School and continued involvement with Anthony Palyszeski throughout the semester.
Additionally, we were strong in admitting and learning from our mistakes. For example, at our 1/2s we received more feedback about how our presentation was confusing than about our actual work, despite having redesigned and reorganized our PowerPoint several times before the presentation in response to instructor feedback. We were also open about our challenges and the mistakes we made in overscoping our project, and our transparency was received well by both the faculty and our client.
Our team also took great care to document all of our work thoroughly throughout the semester in written form, in pictorial form, and in version control. As is the nature of exploratory projects, our team was constantly iterating on ideas, prototypes, and UI, and we were careful to document those versions as we created them, as well as our meeting agendas and our discussions with our client and instructor. This emphasis on documentation also led to useful playtests, as we always went into major playtests with clear playtesting goals and a playtest plan.
Although we took some time to choose our final direction, we are very happy with the quantity of work that we were able to accomplish and deliver at the end of the semester. During our Soft Opening playtests with faculty and alumni, we received very positive feedback on our design and demo. Even though the final digital tool wasn’t quite what we were hoping to deliver, the other elements of our toolkit, as well as our concept for an interactive storytelling event, were solid and well-received. We feel we did a good job of exploring potential directions over the first half of the semester and communicating the challenges we foresaw with each direction to our client (who then supported the team in changing the planned space for such an event).
Finally, The Children’s Museum of Pittsburgh was an absolute joy to work with; they were very supportive of our endeavors and very understanding of our mistakes. We would be happy to work with them again.
What could have been better
As alluded to above, our team also made some mistakes this semester, which led to us delivering two versions of our digital tool prototype rather than the single, full version we had intended. In reflecting on those mistakes, we would have benefited from scoping down our tool somewhat, setting a specific internal feature-lock deadline, and sticking to it. Particularly early in the semester, we were lax about holding to specific deadlines, which came back to bite us at the end when we didn’t have enough time to fix everything. We also failed to re-adjust our scope late in the semester when the team was running low on time, an issue that came down to internal team communication.
Although our team was very amicable and our team dynamic was strong from a relational perspective, we did not adequately hold each other accountable for deadlines, nor did we fully communicate looming issues to everyone on the team. There was a lot of trust among teammates that things were “just going to get done” or that “this is the right thing to do” without actually getting full team input or instructor feedback on those ideas. At times, the team wasn’t fully on the same page, and there was a disconnect between what some members thought was going on and what other members were actually doing. We addressed this late in the semester by holding more frequent semi-formal team meetings, but by then it was too late.
We also realized towards the end of the semester that the team was working with different definitions of “explore”, “prototype”, and “beta-level” than our instructor. What the team felt was an adequate prototype, our instructor saw as B- or C-level work; that disconnect could have been prevented if we had defined those terms earlier. In a similar vein, we should have settled on our end product earlier in the semester – we spent 5 or 6 weeks playing with different potential directions, but then didn’t leave ourselves enough time to actually bring our final design to fruition.
In short, our team ended up over-scoped, especially for operating with a single programmer. We would have benefited from fitting our project’s deliverables more closely to the skill sets of the team, especially given that this was meant to be a predominantly exploratory project anyway. We made the mistake of over-promising and under-delivering as a result of that scope problem, which could have been largely prevented with better internal communication about the status of different elements and better communication with our instructor.
Finally, we would have liked to playtest with a full audience of 50-100 people, but we never pushed hard enough for the opportunity. While we did conduct a playtest with a full class of about 30 first graders, we never ran a final playtest at our expected audience size.