STUDIO HEURISTICS: REIMAGINING A DIGITAL MAKERSPACE FOR EDUCATORS
As the lead designer for Studio, a digital makerspace with a diverse user base, I recognized that our existing interface posed challenges, particularly for our teacher users. Serving multiple types of users had led to mismatched mental models, creating friction in the teacher experience. To set a clear vision and guide Studio’s evolution, I initiated a comprehensive evaluation of the product.
Instead of a traditional heuristic evaluation, we adopted a "bug bash" approach to align with Studio’s dynamic, non-linear structure. This style allowed us to uncover a broader range of insights without the constraints of a single end-to-end flow. Working alongside a UX Researcher who provided process guidance, I crafted detailed user stories and tasks to capture actionable feedback and ensure a path forward in our product vision.
WHAT IS STUDIO?
Studio is a digital makerspace tool used by DE’s internal teams to create content for teachers and by teachers to modify existing content. Students can also use Studio for assignments, making it a versatile tool.
THE PROBLEM
Studio’s original design lacked scalability, making the introduction of new features a consistent challenge.
I identified several fundamental heuristic violations that hindered usability and adaptability, particularly when new updates were needed. Studio’s user interface lacked the ease and intuitive structure essential to a digital makerspace, qualities in which competitors excelled. Without them, our users faced a steep learning curve; if teachers couldn’t recognize how to navigate Studio with ease, they might default to more familiar tools like Google Slides.
GOALS OF THE STUDY
I initiated a heuristic evaluation designed to catalyze a first wave of UX/UI improvements and to identify immediate enhancements.
This evaluation served as a critical starting point, providing actionable insights for refining Studio’s interface while fostering discussions with the product team and leadership. My goal was to establish a shared vision for Studio’s evolution—one that prioritized user-centered improvements and reoriented the product’s direction to better meet the needs of its diverse user groups.
MY ROLE
- Product Designer for Studio Product
- UX Strategist
- Prototyping
TEAM
- UX Researcher
- Product Owner
- Studio Lead Engineer
TEST DETAILS
- 6 participants – internal employees who volunteered
- Prototype built in Figma
- Test conducted on UserZoom
STUDIO - TAKE A QUICK TOUR OF CORE FUNCTIONALITY
To provide context for our findings, here’s a snapshot of Studio as it appeared during the study. Each interface element was evaluated with attention to usability, task flow, and clarity.
Note: Call-to-action (CTA) labels within Studio adjust dynamically based on user permissions.
HOME PAGE

Studio Home image
ADDING CONTENT TO CANVAS TOOLBAR

Adding content to canvas image
SUPPORTING FEATURES

Naming the creation image

Slide Drawer image
TEST STRUCTURE
To gather valuable feedback on Studio’s usability, we conducted tests with six internal participants who had minimal prior experience with the platform. This group allowed us to capture fresh insights, closely aligned with what new users might experience.
The test was structured into two parts to evaluate both individual and collaborative use cases:
- Individual Tasks – Participants completed a set of tasks independently to gauge usability and intuitive navigation within the platform.
- Collaborative Editing – Participants paired up to assess the collaborative editing process, enabling us to understand how Studio supports, or hinders, team-based workflows.
Together, these two phases provided a comprehensive view of both solo and group interactions within Studio, revealing critical areas for improvement.
Image of the user stories document used to create tasks in UserZoom
FROM USER STORIES TO CREATING TASKS
These tasks were crafted to align closely with our goals for both individual and collaborative evaluations, ensuring that each task provided insights relevant to the usability improvements we sought.
By structuring tasks around these user stories, we could systematically assess how well Studio supported real-world use cases, both for solo activities and collaborative workflows. This approach enabled us to identify precise areas for refinement in both phases of testing.
Task Documentation
By: Product Designer, Michelle Phanthongphay

Tasks associated with the user stories
TAILORING HEURISTIC EVALUATION TO TASK-SPECIFIC NEEDS
In collaboration with our UX Researcher, we decided to refine the heuristic evaluation for clarity and focus. Rather than presenting the full set of heuristics for each task, we limited the selection to only those most relevant to the specific task being evaluated. This streamlined approach ensured that participants could concentrate on the usability factors directly impacting their experience with each task.
For each task, we customized the evaluation rubrics, removing unrelated heuristics from the evaluation table. This task-specific curation allowed us to capture targeted insights, making the feedback more actionable and directly aligned with our UX improvement goals.

Example of the heuristic evaluation rubric, minimized for each task
COMPLETION AND SYNTHESIS OF FINDINGS
All six participants successfully completed Phase 1, allowing us to transition to the synthesis stage. Together with the UX Researcher, I reviewed and compared our notes, examining feedback to identify patterns and key insights. Below are some of the significant findings from this initial phase of the study.
1. WORKFLOW CHALLENGES AND DEFINING STUDIO'S CORE IDENTITY
We validated the challenges in accommodating multiple users with varying goals and objectives within the same workflow. These insights highlighted the need for a cohesive vision: Should Studio position itself as an “editing and publishing tool” or as a “makerspace tool with robust editing capabilities”?
Establishing this core identity is crucial to aligning the product’s features with user expectations and reducing friction across diverse use cases.


Excerpts from the report
2. ABSENCE OF A TOOLBAR MENTAL MODEL IN STUDIO
Studio lacks a familiar toolbar, making it challenging for users to add items to the canvas. This gap led to confusion and slowed down the workflow, highlighting the need for an intuitive, centralized toolbar to enhance usability.


3. CHALLENGES WITH COLLABORATIVE EDITING
Edits made during a collaborative session aren’t live; the editing user must choose to share changes at the end. This presented ongoing issues, and now, with research in hand, we’re positioned to discuss solutions for real-time collaborative editing.


CONCLUSION
The study wrapped up in June 2021, allowing us to address quick, high-impact fixes while documenting larger improvements for collaborative problem-solving with product leadership (PM, Lead Engineer, and Product Designer). However, as discussions on significant changes began, we were informed that all Studio enhancements were on hold for the foreseeable future.
This work remains paused.