Works

SPATIAL COMPUTING EXPERIENCE DESIGN

2023 – Present

Driving the vision for spatial input and interaction quality across Meta Quest and Horizon OS platforms. I lead the definition and productization of input systems — including hands, eyes, voice, controllers, stylus, keyboard, and microgestures — and establish design principles that ensure consistent, intuitive, and extensible interaction models across the ecosystem.

Focus Areas
• Spatial input & interactions for Meta Horizon OS / Quest products
• Interaction SDK / Presence Platform
• Peripherals / third-party input modalities
• Meta Horizon OS UI Set
• MR / Input & Interaction Design Guidelines
• Experimental Input Modalities

Note: Links point to public pages reflecting shipped outcomes. Detailed design work is confidential and not shown here.

Input Ranking and Interaction Hierarchy

Established foundational principles for how multiple input modalities should be prioritized and combined across the Meta Quest platform, enabling more predictable and usable interaction experiences in spatial contexts.

Sep 2024

Microgestures Input Interactions

Designed and articulated interaction models for microgestures to extend expressivity and precision across spatial scenarios, supporting a broader range of use cases with elegant, human-centric inputs.

Jul 2024

Stylus Input Device – Logitech MX Ink

Led experience design for the integration of a new stylus input device on Meta Quest, collaborating with partners to align interaction behavior and feedback patterns with existing spatial input standards.

Sep 2024

Meta Horizon OS UI Set

Initiated and shaped Meta’s first official UI component set and design resources for the Meta XR Interaction SDK and Spatial SDK. This work provided foundational UI elements that support consistent experience construction across applications and platforms.

Sep 2024

Mixed Reality Design Guidelines

Authored comprehensive spatial computing and interaction design guidelines for the Meta Horizon developer ecosystem, capturing both core principles and practical patterns that help developers and designers build high-quality mixed reality experiences.

May 2023 – Present

Text Entry and Editing

Driving the design of spatial input and interaction experiences for text entry and editing, including virtual keyboard behaviors and voice dictation, to ensure clarity and usability across mixed reality workflows.

May 2025

2015 – 2023

As part of the Mixed Reality Design team, I led design efforts for the developer and creator ecosystem, with a focus on the open-source Mixed Reality Toolkit (MRTK). Collaborating closely with technical designers and engineers, I helped define and deliver core spatial interaction patterns and UI controls for Mixed Reality platforms.

With a background in engineering and hands-on experience publishing apps across multiple platforms, I bring a deep empathy for both developers and creators. I supported the team’s design vision by refining visual and interaction designs, creating illustrative example scenes from foundational building blocks, and authoring comprehensive design and technical guidelines.

Focus Areas
• Mixed Reality Toolkit (MRTK)
• Mixed Reality Design Labs
• Mixed Reality Documentation
• Common UX building blocks
• Mixed Reality Tutorials
• Design/Developer Evangelism
• Partner Company Engagement
• Internal & External Tools Design

HoloLens 2 Developer Experience

Designed a coherent developer and creator ecosystem for HoloLens 2 and mixed reality platforms. I defined the HoloLens 2 Developer Experience Framework, aligned cross-disciplinary teams, and guided multiple initiatives to help designers and developers adopt spatial computing with confidence.

2018 – 2022

Mixed Reality Toolkit (MRTK) v2

Led the design of a foundational spatial UX system providing cross-platform interaction patterns and UI controls for mixed reality. MRTK v2 became a widely adopted open-source toolkit and earned the AWE 2021 Auggie Award for Best Developer Tool.

2018

MRTK3 Public Preview

Shaped the next-generation mixed reality toolkit built on modern Unity systems, advancing modular UX building blocks, theming, and pattern reuse. This work provided a scalable foundation for future spatial experiences.

May 2022

MRTK Figma Toolkit & Bridge for Unity

Created a design-time tooling system that translates core spatial UI components into Figma and bridges layouts into Unity. This toolkit enables designers and developers to explore, collaborate, and prototype spatial interfaces earlier in the workflow.

May 2021

Mixed Reality Documentation

Authored and structured comprehensive documentation covering spatial interaction fundamentals, best practices, case studies, and technical guidelines. Designed modular layouts and visual assets to support varied learning paths and reduce onboarding friction in a nascent design space.

May 2016

Shipped H/W Products

HoloLens

Oct 2015

Windows Mixed Reality HMDs

Oct 2017

HoloLens 2

Feb 2019

PERSONAL PROJECTS

Spatial Computing Interaction Design (Book)

Spring 2026

Cosmic XR

“Every once in a while I stumble across an experience that makes me pause and say ‘Wow’. Cosmic XR is one of those rare gems.” – Review article by UploadVR

May 2025 | Available on Meta Quest Store

Type In Space

An experimental spatial text layout app for HoloLens 2, rebuilt from the ground up using MRTK v2 to explore fully articulated hand-tracking interactions. The project investigates how typing and text manipulation can adapt to direct, spatial input in mixed reality.

Dec 2019, Aug 2017 | Available on the Microsoft Store for HoloLens