Usability Producer

Over the course of six months, I served as the Usability Producer for three teams: Tex-Mechs, Rhome, and Goofballs.

As Usability Producer, I supported the teams by organizing local and remote playtesting, developing testing plans tailored to each team, and assisting with their market research. I worked in Qualtrics, developing surveys and templates, and presented findings to the Game Designers and the teams themselves. I also developed a market research projection tool to support the financial analysis of our games.


Projection tab for the Financial Projection Model Spreadsheet

Overall Responsibilities

  • Created survey templates in Qualtrics for quick iteration of surveys across sprints and changing needs

  • Organized local playtesting and conducted the sessions, using think-aloud and other methodologies

  • Coordinated with the Software Design (SD) leads to create a tool enabling remote playtesting, with heuristic data returned to our databases

  • Supported other teams with a Financial Projection Model tool

Skills Used and Developed

As Usability Producer for three teams, I developed the skills I needed very quickly by doing each task multiple times per sprint. Along the way, I learned a great deal about how to create surveys, how to use Qualtrics and its capabilities, and how to communicate findings back to the teams.


You can download a sample Qualtrics survey from Rhome’s Alpha testing.

You can download a sample usability presentation here: Goofballs’ Vertical Slice Playtesting Report.

Survey Creation and Data Presentation

Understanding the needs of each team was a critical component of my role. Through communication with each team’s Game Designer and Producer, as well as my own familiarity with their games, I was able to write relevant questions for our playtesting sessions and gather the critical feedback the team needed to move forward.

Once the data was gathered, it was fed into a spreadsheet I built to automate the interpretation and analysis of specific question types. From there, I presented the data to the team’s Game Designer and Producer, and in some cases to the team as a whole. Depending on the audience, I adjusted the language in my slides to ensure the right message was delivered and nothing was misinterpreted.
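As a rough illustration of the kind of automation that spreadsheet handled, the sketch below aggregates 1–5 Likert responses from a CSV export. The column names and file name are hypothetical stand-ins; real Qualtrics exports label columns by the question IDs defined in the survey.

```python
import csv
from collections import Counter
from statistics import mean

# Hypothetical question columns -- actual Qualtrics exports use the
# question IDs (e.g. Q1, Q2) defined when the survey was built.
LIKERT_QUESTIONS = ["Q1_fun", "Q2_controls", "Q3_clarity"]

def summarize_likert(path):
    """Aggregate 1-5 Likert responses per question from a CSV export."""
    scores = {q: [] for q in LIKERT_QUESTIONS}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for q in LIKERT_QUESTIONS:
                value = row.get(q, "").strip()
                if value.isdigit():  # skip blanks and free-text rows
                    scores[q].append(int(value))
    for q, values in scores.items():
        if values:
            dist = Counter(values)
            print(f"{q}: n={len(values)}, mean={mean(values):.2f}, "
                  f"distribution={dict(sorted(dist.items()))}")

summarize_likert("alpha_playtest_responses.csv")  # hypothetical file name
```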


Qualtrics Templates

Working with Qualtrics across multiple surveys, I quickly realized that templates could eliminate much of the repetitive work each time I needed a new survey. To that end, I generated template blocks containing demographic questions, standard questions shared between all games, and questions likely to come up every testing session.

The templates were also optimized for both mobile and desktop. By identifying the device the survey was being taken on, I could ensure that graphics, scales, and even question types were displayed in the optimal configuration and appearance for each device.

You can see an example of the Qualtrics template here, with the mobile and desktop versions displayed side by side. It distinguishes between devices to decide how question choices are oriented, and uses different graphics depending on the screen type.

Qualtrics template previews: mobile and desktop versions side by side
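Qualtrics performs this device detection itself through display logic; purely as an illustration, the branching the templates encode looks roughly like the sketch below. The user-agent hints and layout options are my own stand-ins, not Qualtrics settings.

```python
# Illustrative only: mirrors the per-device branching the templates encode.
MOBILE_HINTS = ("Android", "iPhone", "iPad", "Mobile")

def layout_for(user_agent: str) -> dict:
    """Pick a question presentation based on the respondent's device."""
    is_mobile = any(hint in user_agent for hint in MOBILE_HINTS)
    return {
        "choice_orientation": "vertical" if is_mobile else "horizontal",
        "scale_widget": "dropdown" if is_mobile else "slider",
        "graphics": "small" if is_mobile else "full",
    }

print(layout_for("Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X)"))
print(layout_for("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))
```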

The Financial Projection Model tool

Market Research Tool

Towards the end of the projects, the other producers were working to determine whether it would make financial sense to sell their games on Steam. To assist, I created a model that simulates potential sales conversions based on first-week sales.

Using data gathered from SteamSpy, we generated average baseline conversion ratios, and used direct comparisons to similar titles to further inform the model. More information on this model can be found on the Production Tools page.
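As a sketch of the model’s core arithmetic, the function below projects lifetime units and net revenue from first-week sales using a baseline conversion ratio. The ratio and sales figures here are placeholder values, not the numbers we actually derived from SteamSpy; the 30% store cut reflects Steam’s standard revenue share.

```python
# Placeholder ratio for illustration; the real model derived its baseline
# from SteamSpy data on comparable titles.
BASELINE_WEEK1_TO_LIFETIME = 3.2  # lifetime units ~= 3.2x first-week units

def project_revenue(week1_units: int, price: float,
                    store_cut: float = 0.30,
                    ratio: float = BASELINE_WEEK1_TO_LIFETIME) -> dict:
    """Project lifetime units and net revenue from first-week sales."""
    lifetime_units = round(week1_units * ratio)
    gross = lifetime_units * price
    net = gross * (1 - store_cut)  # Steam's standard 30% revenue share
    return {"lifetime_units": lifetime_units,
            "gross": round(gross, 2),
            "net_after_store_cut": round(net, 2)}

# Hypothetical inputs: 1,500 first-week units at a $9.99 price point.
print(project_revenue(week1_units=1500, price=9.99))
```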


Highlighted Work: Automated Reporting Tool

As Usability Producer, the biggest challenge I faced was getting enough test subjects for each survey. Our location was isolated and we had no budget to entice people to test for us, so we needed to work smarter rather than harder. Since we could only devote so much project time to travelling to other locations to run testing there, one challenge I set out to solve was building a tool that let us gather information from remote playtesters.

While remote playtesters had been used in the past, the only information they contributed was survey responses. Without observational data or gameplay statistics, that data has limited use. I therefore identified a potential solution: an automated way of returning gameplay data to our server from remote playtesting sessions, without requiring the tester to dig up a file and email or send it to us.

Coordinating with the Software Design leads for each of the three teams, we designed the initial concept of how the tool would work. From that point, I took on the role of facilitating and enabling their work. While I did not have the answers regarding our server or network calls, I made sure the leads could meet with the people who did, and I cleared other communication and scheduling blockers. The tool was functional by the end of the project, but it was finished too late for the statistics tracking to be properly implemented and integrated into the games.
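The tool itself was built by the SD leads, so the following is only a sketch of the client-side idea under discussion: serialize the team-defined flags and gameplay events to JSON and POST them to our server, so the tester never has to touch a file. The endpoint URL and payload shape here are hypothetical.

```python
import json
import platform
import time
import urllib.request

# Hypothetical endpoint; the real tool targeted our own server.
REPORT_URL = "https://example.com/playtest/report"

def send_report(session_id: str, flags: dict, events: list) -> int:
    """POST one session's tracked flags and gameplay events as JSON."""
    payload = {
        "session_id": session_id,
        "timestamp": time.time(),
        "platform": platform.platform(),
        "flags": flags,    # team-defined flags the tool is asked to track
        "events": events,  # raw gameplay events from the session
    }
    request = urllib.request.Request(
        REPORT_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example: a team marks 'deaths' and 'level_reached' as tracked flags.
send_report("tester-042", {"deaths": 7, "level_reached": 3},
            [{"t": 12.5, "event": "checkpoint"}])
```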

More details about the tool in particular can be found on the Production Tools page.

Mini Post-Mortem

What Went Well

  • There was no strict deadline for the tool, so we could accommodate important developments on the teams the Software Design leads belonged to

  • The tool not only automates sending gameplay data, but lets any team mark specific flags for it to track, making it project-agnostic

  • Having a producer bring the SD leads together and pull them out of direct focus on their own projects was very helpful

What Went Wrong

  • We didn’t start on the tool early enough, so the projects were not built with its integration in mind

  • Since there was no strict deadline, we kept pushing back work on the tool; as a result, it could not be used on this project

  • We spent too much time at the start trying to answer questions before we even knew how we were going to log files; we targeted tier 5 without ever creating tier 1

What We Learned

  • Building a usability tool should start during pre-production, to ensure the project is structured to accommodate it

  • Don’t blue-sky your tool design; start by building a tier 1 implementation and work up from there

  • Having a producer working directly with a team of SDs is like having a living rubber duck to talk through issues and complications with