Lessons learned from creating digital space in Spatial
What does a good space in VR look like?

This weekend I designed and built VR experiences in Spatial. With the resources on Spatial's Discord channel organized by Jake Steinerman and tons of sharing on YouTube and Blender Stack Exchange, it's been fun exploring and learning what Spatial and the metaverse are capable of. Below are some lessons I learned along the way, from both a user-experience and an implementation perspective.
Feel free to visit my Dome Gallery v2 and the two other test projects, Dome Gallery v1 and Pairing in Space, that I use as examples below.
User experience
Spatial puzzle
How do we simplify the spatial puzzle? Users get lost in VR because fast movement fragments their spatial memory, so they never assemble a mental map. A clear spatial organization helps users form that map right after entering VR and makes for a good experience. Francis D.K. Ching's Architecture: Form, Space, and Order offers great examples of spatial organization.


-
Active and passive space
In the real world, space is passive: it waits for users to come in and create spatial memories. In the digital world, space is active: it is filled with attractive elements that spark curiosity.
Acting on an earlier belief, I created numerous chat rooms in Pairing in Space, hoping users would feel comfortable having conversations in their own domain. Further experimentation, however, showed that in the virtual world, color is one of the key drivers of spatial stimulation.

In Dome Gallery, I focused on creating an intuitive experience. The spatial organization, lighting, mysterious structure, and wide arched door all welcome users in. Space in the digital world speaks for itself; everyone who has tried this project has navigated it successfully.

-
Scale
Scale plays a significant role in how desirable a space feels. I test-played two projects to find the optimal dimensions of spaces; however, I found no set standard. Still, I documented the measurements I used in digital space and hope they can serve as a helpful reference.




-
Moving method
Teleporting horizontally mimics the feeling of zooming in. I consider this the most comfortable way of moving in VR because users can easily anticipate the next scene.

In my view, moving vertically by stairs and ramps in VR has many issues, including possible motion sickness and constant manual rotation. I am excited to see whether an animated floor could act as an elevator in VR.

A portal (an anywhere door) pushes designers to rethink the potential and capability of space. It solves the problem of teleporting too far, although its loading time needs to be considered. Still, it is a good element for moving between major activity centers.

-
Gravity and Fall
Floor boundaries stop avatars from walking into a floor hole; however, they can still enter it by jumping forward. This prevents accidental falls while turning the fall into a game mechanic. To take advantage of this, I think it would be interesting to make the falls part of the narrative.

-
Thickness of walls
I have found that users can pass through a wall if it is thinner than the avatar (1 m). Making the wall thick (>2 m) or doubling it avoids this. However, it could be interesting to use this glitch to my advantage and create a secret area with it, as in Mario Kart.


Implementation
My pipeline: model in Rhino -> apply textures and export glTF in Blender -> upload to Spatial. Below are some issues I came across and how I solved them.
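For reference, the Blender step of this pipeline can also be scripted. A minimal sketch using Blender's Python API (the output path is illustrative, and this must run inside Blender):

```python
import bpy

# Export the whole scene as a single binary glTF (.glb) for upload to Spatial.
bpy.ops.export_scene.gltf(
    filepath="//dome_gallery.glb",   # "//" means relative to the .blend file
    export_format='GLB',             # single-file binary glTF
    use_selection=False,             # export everything, not just selected objects
)
```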


Basic restrictions
Spatial document — Custom Environment
Spatial document — Spatial 3D Model Preparation Guide
-
Textures
I mostly used PBR textures (albedo, roughness, and normal maps) to build Dome Gallery v1, and I found Jake's material examples super helpful. Still, below are some issues I ran into.

1. Failed to export textures: It turns out the UV Map node is required, which means we can't export a texture with only an Image Texture node. Also, no other Image Texture nodes can exist in the material at the same time, or the exporter will fail to find the right texture.


2. Weird tiling textures: Although the UV editor in Blender can improve this somewhat, doing so is time-consuming, and the quality still leaves much room for improvement.

3. Emission has no glowing effect: This article pointed out that "the core glTF 2.0 format has a clamp on emission. It can't go higher than 1.0." So it is impossible to create a lighting effect using emission nodes alone.

4. The scene lighting doesn't match the skybox: I wanted to create a night environment, but found that scene lighting, which I have no control over, lit up the space.
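The first issue can be avoided by wiring the material explicitly. A minimal sketch with Blender's Python API (the material name, UV layer name, and texture path are placeholders) of a glTF-friendly setup with exactly one Image Texture node, driven by a UV Map node:

```python
import bpy

# Build a material the glTF exporter can resolve: one Image Texture per map,
# fed by an explicit UV Map node into the Principled BSDF.
mat = bpy.data.materials.new("PBR_Wall")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

bsdf = nodes["Principled BSDF"]

uv = nodes.new("ShaderNodeUVMap")
uv.uv_map = "UVMap"                   # the mesh's UV layer name

albedo = nodes.new("ShaderNodeTexImage")
albedo.image = bpy.data.images.load("//textures/wall_albedo.png")  # placeholder path

links.new(uv.outputs["UV"], albedo.inputs["Vector"])
links.new(albedo.outputs["Color"], bsdf.inputs["Base Color"])
```

Roughness and normal maps would follow the same pattern, each with its own Image Texture node sharing the same UV Map node.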

Blender Manual — how to export glTF in Blender
Youtube — Part 3: Creating a glTF-compatible Blender material from a set of PBR textures
babylon.js — Blender to GLTF/GLB — Where are the textures?
Blender Manual — How do I export my model with some light (in .gltf)?
Reddit — How to export a GLB model with emission? -> bake lighting into texture
Blender Manual — Getting just white light instead of actual colors by exporting emission based models
-
Dome and Skybox
First, I tried using different textures on the skybox to fix the scene lighting, but nothing changed.
Later, I built a sphere as a dome and applied a night-sky texture to it to change the scene lighting. That didn't work either. Even worse, when I spawn on top of the dome to enter the space, I can't get inside it!

Spatial document — Creating a skybox
Artistic Render — Setup a skybox using the sky texture in Blender
Youtube — Blender 2.8 How to setup an hdri environment background
-
Baking
Finally, I found some articles about baking lighting into the textures to replace the default lighting in the glTF, and it works! This solves the emission, texture repetition, and scene lighting issues simultaneously. However, I still ran into some problems along the way:
1. Failed to bake: Double-check whether the UV islands overlap.

2. Failed to bake: Only Cycles has the bake feature. Make sure to switch to this engine before starting to play with nodes.

3. Bad quality: A higher sample count won't enlarge the output file size but will significantly improve the mapping quality. A value between 100 and 400 is fine; the higher the number, the longer rendering takes.

4. Texture doesn't include the lighting: Set the bake type to "Combined" and remember to check every pass option below it so the bake matches the render view.

5. Some of the meshes didn't light up: the light needs to be placed inside an object, such as a sphere, for surfaces to reflect it; otherwise, a surface lit by a bare direct light will show no reflection.
6. Different from the render view: baked textures still differ slightly from the render.
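The bake settings from the list above can be sketched with Blender's Python API (the sample count and pass flags reflect my notes; this must run inside Blender with an object and image texture already prepared):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'        # bake only exists in Cycles (issue 2)
scene.cycles.samples = 256            # 100-400 is usually enough (issue 3)
scene.cycles.bake_type = 'COMBINED'   # include lighting in the texture (issue 4)

# Enable the direct/indirect pass contributions so the bake
# matches what the render view shows (issues 4 and 6).
scene.render.bake.use_pass_direct = True
scene.render.bake.use_pass_indirect = True

# Bake the active object's material into its assigned image texture.
bpy.ops.object.bake(type='COMBINED')
```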

Youtube — How to Bake Lighting, Shadows, Texture and Reflection in Blender-2.90
Youtube — Texture Baking in Blender for Beginners (Tutorial)
Youtube — how to bake ambient occlussion in blender 2 8 tutorial
Youtube — UV Unwrapping
Blenderartists — Baked textures is very different from the render, what gives?
Stackexchange — Cycle Combined bake only adds lighting on side of mesh
With many new features being released, I'm excited to continue exploring and designing VR experiences in Spatial. My next step will be animating the water.
I am interested in designing and prototyping good user experiences in XR. To see more of my projects, visit my website or email me via my bio.