Skills: Usability Testing, Qualitative Research, Quantitative Research, Data Analysis, Insight Synthesis, Dovetail, AI-assisted Research, Heuristic Evaluation, Cross-functional Collaboration, Project Management, Clear Reporting & Communication, Workshop/Meeting Facilitation, AI Ethics.
During my time as Lead Associate UX Researcher at User Behavioristics, Inc., one of my main projects was a comparative AI study. Because we had run multiple mock studies, we had access to high-quality research data that let us compare professional researcher feedback with Maze AI insights. The study's purpose was to examine the strengths and weaknesses of the AI output and determine where it would be most useful to human researchers. Both the AI and the human team started from the same initial data: 20 playtest recordings and survey results. As Lead Researcher, I was in charge of study design and project management, and also served as the main writer and editor. I attended weekly collaborative sessions and assigned tasks based on each participant's experience and availability.
After the testing phase, we outlined the strengths and weaknesses of the AI agent. The most glaring issue was the software's inability to analyze the footage and detect non-verbal behavior, since it relies solely on transcripts for its analysis. The AI captured player behavior only when speech matched on-screen actions, such as a player saying, "I am about to go through the gate." Another high-priority issue was the AI's lack of focus: because it used the entire transcript as input, it treated all player feedback as equally important, which prevented it from surfacing critical issues.
Still, this software can be used in ways that are extremely helpful to a UX researcher. Although not always 100 percent accurate, its transcriptions provide an easily searchable script of each playtest, allowing a researcher to locate specific player interactions in significantly less time. The AI is also significantly faster than human researchers, completing its analysis in under an hour rather than the week or more a standard User Behavioristics study requires. This suggests that AI is best applied to time-consuming, monotonous work, freeing researchers to focus on the creative insights that AI handles less well.
Research Insights
○  The AI struggles to detect observable player behavior.
○  The AI may focus on irrelevant details.
○  The AI is unable to provide industry-level insights and recommendations.
○  The AI's ability to automatically create transcripts provides researchers with an easy way to reference playtest data and user feedback.
○  AI is fast but inaccurate. It is better to delegate monotonous, non-creative work to it than to rely on its usability analysis and suggestions.

Sample professional insights provided by User Behavioristics, Inc.

Sample AI insight generated by MazeAI
