This week, we finished the user test for the garden sprint and compared the results of the garden and painting sprints. We had a lot of interesting findings to bring into sprint 2. After wrapping up sprint 1, we also reflected on it and started brainstorming sprint 2.
User Test For Garden Sprint
On Wednesday, we conducted our user test for the garden sprint. We divided our users into two groups. The first is the Preferred Audience: ETC students who have prior knowledge of ML. The second is the General Audience: ETC students who have no prior experience with ML. In total, we ran 10 playtests, 4 with the general audience and 6 with the preferred audience. Each playtester had 10 minutes to play the garden prototype and 10 minutes to fill out a questionnaire covering understandability, immersion, and a comparison between the garden sprint and the painting sprint. Here is what we got.
Insight
- Step-by-Step Process
- By breaking the explanation into smaller paragraphs and providing step-by-step instructions, people understood both the data and the algorithm better.
- Understanding Data
- Everyone who played the garden prototype had a much better understanding of the data and ML.
- Animation
- The flower animation helped people enjoy learning about the process.
Feedback
- Literal Representation
- Some people mentioned that even though the animation was engaging, it did not help them understand the data better.
- 2D vs 3D
- Some playtesters felt that the experience still reads as 2D, which raises the question: how does a 3D representation help people understand traditional datasets?
- More Freedom
- Playtesters wanted more freedom; the garden prototype offered less than the painting example.
Here is a comparison between the two sprints.
After the user test, we analyzed the results of the two prototypes and summarized the factors that affect engagement.
Brainstorm For Sprint 2
We have finally finished sprint 1, and we really found our way through it, so it is time to start sprint 2. Based on the results from sprint 1, we found that non-traditional data such as images and sound attracts people much more than traditional data, and that people are willing to interact with ML in VR or AR spaces. Therefore, for sprint 2 we decided to focus on VR and music. However, brainstorming has not been smooth: having so much freedom to decide what to do also makes it harder to converge, and we still have not settled on an idea this week. We did come up with a lot of cool ideas, though, and we will discuss them further next week. Here are our ideas!
- Regenerate a song using real-life sounds
- Inspired by Google's The Infinite Drum Machine project.
- Divide a song into one-second chunks and, for each chunk, find the most similar real-life sound.
- Re-generate the song using those real-life sounds (a rough sketch of the matching step is included after this list).
- Image filter based on the features of a song
- We found a dataset provided by Spotify. For each song, it reports danceability, energy, loudness, speechiness, and other features.
- An image has features such as saturation, highlights, and hue.
- We want to create an image filter that maps the features of a song onto the features of an image (see the second sketch after this list).
- VR Visualization of a Song
- Visualize a song in VR space based on its beats, rhythm, etc.
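To make the first idea more concrete, here is a minimal sketch of the matching step, assuming we have a song file and a folder of real-life sound clips. The file names, the one-second chunk length, and the choice of MFCCs as the similarity feature are all placeholder assumptions on our part, not a final design:

```python
import glob
import numpy as np
import librosa
import soundfile as sf

CHUNK_SECONDS = 1.0  # split the song into one-second pieces

def mfcc_vector(y, sr):
    # Summarize a clip as the mean of its MFCC frames (a simple timbre fingerprint).
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20).mean(axis=1)

# Load the library of real-life sound clips (paths are placeholders).
clips = []
for path in glob.glob("real_life_sounds/*.wav"):
    y, sr = librosa.load(path, sr=22050)
    clips.append((y, mfcc_vector(y, sr)))

# Split the target song into chunks and replace each chunk with its nearest clip.
song, sr = librosa.load("song.mp3", sr=22050)
chunk_len = int(CHUNK_SECONDS * sr)
output = []
for start in range(0, len(song) - chunk_len, chunk_len):
    chunk = song[start:start + chunk_len]
    target = mfcc_vector(chunk, sr)
    # Nearest neighbour by Euclidean distance in MFCC space.
    best_clip, _ = min(clips, key=lambda c: np.linalg.norm(c[1] - target))
    # Trim or repeat the clip to the chunk length so the timeline is preserved.
    output.append(np.resize(best_clip, chunk_len))

sf.write("regenerated_song.wav", np.concatenate(output), sr)
```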
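And here is a similarly rough sketch of the second idea, mapping Spotify-style audio features onto image adjustments with Pillow. The specific feature-to-filter mappings here (energy to saturation, loudness to brightness, valence to contrast) and the example feature values are just one possible choice we might explore, not something we have decided on:

```python
from PIL import Image, ImageEnhance

# Example Spotify-style audio features for one song (values made up for illustration;
# the real dataset reports them in roughly these ranges).
song_features = {
    "energy": 0.82,     # 0..1
    "valence": 0.35,    # 0..1, how "positive" the track sounds
    "loudness": -7.0,   # typically about -60..0 dB
}

def apply_song_filter(image_path, features, out_path):
    img = Image.open(image_path).convert("RGB")

    # Map energy to colour saturation: high-energy songs get more vivid colours.
    saturation = 0.5 + features["energy"]                      # 0.5 .. 1.5
    img = ImageEnhance.Color(img).enhance(saturation)

    # Map loudness (-60..0 dB) to brightness: louder songs get a brighter image.
    brightness = 0.7 + 0.6 * (features["loudness"] + 60) / 60  # 0.7 .. 1.3
    img = ImageEnhance.Brightness(img).enhance(brightness)

    # Map valence to contrast: happier songs get a punchier, higher-contrast look.
    contrast = 0.8 + 0.4 * features["valence"]                 # 0.8 .. 1.2
    img = ImageEnhance.Contrast(img).enhance(contrast)

    img.save(out_path)

apply_song_filter("photo.jpg", song_features, "photo_filtered.jpg")
```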