With our main MVPs for the project finished, Week 11 sees the team shifting priorities to figure out what really makes our product shine, how we want to make the most out of the project, and how whoever ends up using it can truly benefit from what we were able to accomplish. We also witnessed Help-A-Peer’s debut in a classroom setting, which is super exciting!

Feature Development

On top of the MVP features we completed last week, we implemented a few more features prior to our playtest on Thursday:

  • When teachers click on “results” in quiz sharing, they now get a QuestionResultsWindow that shows them what their students answered to the questions they asked.
  • The ability to mark a quiz response as correct if it isn’t recognized by HAP (teacher override)
  • The ability to export data such as breakout room formations, quiz results, and notes taken on students into a .txt file (see the sketch right after this list)
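
To give a feel for the export feature, here is a minimal sketch in Python. None of this is our actual code, and the data structures (breakout_rooms, quiz_results, student_notes) are made up for illustration; it just shows what dumping session data into a plain .txt file could look like:

```python
from datetime import datetime

# Hypothetical session data; the real extension's structures will differ.
breakout_rooms = {"Room 1": ["Ana", "Ben"], "Room 2": ["Cara", "Dev"]}
quiz_results = {"What is 7 x 8?": {"Ana": "56", "Ben": "54"}}
student_notes = {"Ana": "Participated a lot today.", "Ben": "Needed a nudge to unmute."}

def export_session(path: str) -> None:
    """Write breakout rooms, quiz results, and notes to a plain .txt file."""
    with open(path, "w", encoding="utf-8") as f:
        f.write(f"HAP session export - {datetime.now():%Y-%m-%d %H:%M}\n\n")

        f.write("Breakout rooms\n")
        for room, members in breakout_rooms.items():
            f.write(f"  {room}: {', '.join(members)}\n")

        f.write("\nQuiz results\n")
        for question, answers in quiz_results.items():
            f.write(f"  Q: {question}\n")
            for student, answer in answers.items():
                f.write(f"    {student}: {answer}\n")

        f.write("\nNotes\n")
        for student, note in student_notes.items():
            f.write(f"  {student}: {note}\n")

export_session("hap_session.txt")
```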

We also had an unexpected update from Zoom regarding getting the wrapper to create breakout rooms. Previously, we had to resort to having HAP generate the breakout room groupings and trust students to move themselves into the right rooms. Having the wrapper means we may be able to carry on with what was previously our Plan A. More updates to come!
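For context, the grouping HAP generates on its own is conceptually simple. Here is a rough sketch (not our actual implementation) of shuffling a roster into rooms of a given size:

```python
import random

def make_breakout_groups(students: list[str], room_size: int) -> list[list[str]]:
    """Shuffle the roster and slice it into rooms of roughly room_size students."""
    roster = students[:]          # copy so we don't mutate the caller's list
    random.shuffle(roster)
    return [roster[i:i + room_size] for i in range(0, len(roster), room_size)]

# Example: 7 students into rooms of 3 -> two rooms of 3 and one room of 1
print(make_breakout_groups(["A", "B", "C", "D", "E", "F", "G"], 3))
```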

Interview w/ CMU Researchers

We were able to connect with Lu Lawrence and Jonathan Sewall, whom we had reached out to at the beginning of the semester, to discuss the possibility of handing off our project to them for research and discovery. From the interview we learned that almost any type of data we can gather might actually be interesting to researchers, but that we should prioritize talking time between students, since it reflects their degree of collaboration.

Playtest & Interview w/ Nativity School of Worcester

We conducted a playtest with the Nativity School of Worcester on Thursday, after showcasing our extension to them last Friday. Three teachers were in the class session: one lectured the students while the other two supported. One of the supporting teachers had a Windows 10 computer and was therefore able to run HAP on her machine.

The playtest ran mostly smoothly, with a couple of small hiccups. Despite that, she was able to do everything she wanted to do within HAP’s supported features. She experienced the question window dialog minimizing and disappearing, as well as some lag on her end, which made us even more aware that teachers’ machine specs are a real issue we have to take into consideration in future development.

We had a 1-on-1 conversation with Brittany, the co-teacher who ran the class, after it concluded. She offered us a lot of valuable thoughts on our extension. She liked HAP’s way of sharing questions compared to their current use of Google Slides, and how she was able to create and share questions while in breakout rooms with their students. For some tools, they did prefer how things work in vanilla Zoom, such as creating breakout rooms. She thought taking notes on kids is useful, but harder than usual because with HAP the Zoom window where the teacher sees students’ faces is a bit smaller.

From the playtest we also learned that we should make the process of answering questions as straightforward for students as possible, since many of them asked why they needed to add !~ in front of their answers. They also kept forgetting that they had to direct message their answers to HAP instead of the whole class, which took a while for them to fully grasp. We also discovered a couple of cases where HAP would crash that we hadn’t previously encountered, which we will fix in our next iterations.
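For readers wondering what the !~ marker is for: HAP picks answers out of the chat by looking for that prefix. A simplified sketch of that kind of parsing (not our exact implementation) might look like this:

```python
from typing import Optional

ANSWER_PREFIX = "!~"

def parse_answer(message: str) -> Optional[str]:
    """Return the answer text if the chat message is marked as an answer, else None."""
    text = message.strip()
    if text.startswith(ANSWER_PREFIX):
        return text[len(ANSWER_PREFIX):].strip()
    return None

# A direct message of "!~ 56" is treated as the answer "56";
# ordinary chat like "can you repeat the question?" is ignored.
print(parse_answer("!~ 56"))                          # -> "56"
print(parse_answer("can you repeat the question?"))   # -> None
```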

Last week we aggregated a list of Zoom APIs and the data we can gather from them. We asked Kate, the main teacher for the math class, as well as Brittany, what types of data they would find useful, and we got some really interesting insights. For them, knowing who has video on and who doesn’t would be really useful, since having video on is actually mandatory and that kind of thing can be hard to keep track of. They also thought knowing how many times a student raised their hand would be helpful for the “effort grades” they give out to students in combination with academic grades.
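To make that concrete: if we do end up pulling hand-raise and video-state events (which exact Zoom APIs we use is still to be confirmed), the teacher-facing summary could be as simple as a per-student tally. A hypothetical sketch, with made-up event fields:

```python
from collections import Counter

# Hypothetical event stream; field names are made up, not Zoom's actual payloads.
events = [
    {"student": "Ana", "type": "hand_raised"},
    {"student": "Ben", "type": "video_off"},
    {"student": "Ana", "type": "hand_raised"},
    {"student": "Ben", "type": "video_on"},
]

# How many times each student raised their hand (could feed into effort grades).
hand_raises = Counter(e["student"] for e in events if e["type"] == "hand_raised")

# Track the most recent video state per student.
video_on = {}
for e in events:
    if e["type"] in ("video_on", "video_off"):
        video_on[e["student"]] = e["type"] == "video_on"

print(hand_raises)   # Counter({'Ana': 2})
print(video_on)      # {'Ben': True}
```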

Concluding Thoughts

This was a fruitful week, especially considering it was shorter (we had Thursday and Friday off for Carnival break). Next week we are going to focus on these small but achievable features and get the project ready for Softs!
