Bootcamp

From idea to product, one lesson at a time. To submit your story: https://tinyurl.com/bootspub1

Unity editor with model inside

I created a prototype in Unity for XR design

Arthur Lee
Published in Bootcamp
4 min read · May 22, 2024


Introduction

In my previous project, I designed an interior design app for spatial computing and extended reality called Envision. Although it was the best I could do at the time, I wasn’t satisfied with how the idea was communicated. I wanted to do better. So I decided to focus on one specific interaction in Envision and learn enough Unity to prototype it.

Before and after

Old moving furniture interaction

This was what the interaction looked like before. It was supposed to be a hand-grab interaction where people could move furniture around to see how it looked. I prototyped it with Bezi, which is a great tool but has its limitations. When I tested it on a friend’s Quest 2 headset, performance issues arose because of the many assets. To optimise, I removed textures and moved the demo to desktop. Sadly, the idea got lost in translation.

New moving furniture interaction done on Unity

Here’s the current version. I used Unity and Meta’s Interaction SDK for the prototype, which now runs natively without compromising on textures and quality. Meta’s tools also made adding hand tracking, grabbing, and mixed reality easy. This version clearly communicated the idea, and I was much happier with it.
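To give a feel for what the SDK is doing under the hood, here is a heavily simplified sketch in plain Unity C#. It is a hypothetical stand-in, not Meta’s actual implementation: the Interaction SDK provides ready-made components (such as Grabbable and HandGrabInteractable) that handle hand-tracking input and pose maths for you, so you rarely write this by hand.

```csharp
using UnityEngine;

// Hypothetical, simplified grab behaviour: while "grabbed", the object
// follows the grabbing hand's transform, keeping its relative offset.
// Meta's Interaction SDK components wrap this kind of logic, plus the
// actual hand-tracking input, for you.
public class SimpleGrab : MonoBehaviour
{
    private Transform _hand;       // set while a grab is active
    private Vector3 _localOffset;  // object's position in the hand's local space

    public void BeginGrab(Transform hand)
    {
        _hand = hand;
        // Record where the object sits relative to the hand at grab time.
        _localOffset = hand.InverseTransformPoint(transform.position);
    }

    public void EndGrab()
    {
        _hand = null;
    }

    private void Update()
    {
        if (_hand != null)
        {
            // Keep the object at the same offset as the hand moves.
            transform.position = _hand.TransformPoint(_localOffset);
        }
    }
}
```

In practice you would attach the SDK’s components to the furniture prefab instead, which is exactly what made this prototype so much faster to build than the first version.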

What I learnt along the way

As I got deeper into the Unity rabbit hole, a question started to pop up in my mind. As an XR designer, where does my work end and the XR developer’s one begin? It’s not like with 2D UI design, where you pretty much design out every single screen and can have the entire flow done. Doing all that in Unity would take forever.

Insightful discussion on the XR Design Discord

I asked for advice in the XR Design Discord, and an experienced XR engineer, Pip, came to my aid. The key was to communicate my idea clearly so developers could build on it. As the discussion above shows, it’s perfectly okay to compromise on implementation quality. Instead, we should focus on the interaction details themselves.

To show an application’s overall flow, immersive storyboards are more effective. They are like PowerPoint slides but in 3D, allowing us to walk through different scenes and user flows in virtual space. This gives a more accurate representation of the idea.

A simple immersive storyboard made with ShapesXR; if you have an account, you can play with it here

In fact, I tried my hand at making a simple one for the XR interior design project with ShapesXR. It walks through the flow from importing the model to opening it on the headset and scaling it to life size.

So here’s how I would classify the tools:

  • Overall user flow: create immersive storyboards with ShapesXR
  • Detailed interactions: use Bezi if the interaction is simple; otherwise use Unity (with prototyping SDKs like those from Meta’s Presence Platform)

How you can get started

Here’s a list of resources and how I used them:

  1. The Unity Tutorial For Complete Beginners: Having used Unity before, this tutorial served as a quick refresher for me. The video is a manageable 45 minutes long. If you’re new to Unity, this would serve as a nice introduction too.
  2. Valem tutorials on Unity and the Meta Interaction SDK: These tutorials are great for getting started with Meta’s Presence Platform. Specifically, they focus on the Interaction SDK; there is a three-part series on it (here’s part 1). Another great tutorial is the Building Blocks tutorial. Meta’s SDK includes building blocks for key functionality like hand tracking, allowing you to set it up in a Unity project with one click. I recommend going through the three-part series followed by the Building Blocks tutorial. Afterwards, play around with the different building blocks to get a feel for how they work.
  3. Build Your Mixed Reality Game & Publish it on Meta’s App Lab Udemy course: Meta’s Presence Platform has multiple SDKs. The Valem tutorials cover the Interaction SDK, while this course touches on the rest. It is very guided, with many of the steps done for you already. Treat it as an introduction to these other SDKs.
  4. Bezi tutorials: Bezi is essentially the Figma of XR design. It’s great for getting started and is far faster for prototyping simpler interactions, for which Unity would be overkill. They even have a tutorial playlist. You can also import your Bezi work into Unity to continue from there.

Ultimately, the most important thing to do is to tinker and try things out. It allows you to get an understanding of what works and what doesn’t.

That’s it! Thanks for reading, and all the best on your XR journey! There’s still a long way to go for me, but I will keep moving forward. Special thanks to the people who helped me along the way, especially Pip (here’s his LinkedIn), who answered my question in great detail on the Discord.



Written by Arthur Lee

Product Designer. Has a passion for all things design and tech. Personal website: arthurleeyk.com
