How I designed an interior design tool with AI and VR integration

Let me take you through my process of designing VRIA, a virtual reality interior assistant

Manikant Mudgil
Bootcamp

--

VRIA — Virtual Reality Interior Assistant

Context

As a product designer and developer, I’ve been trying to renovate a living/dining space for some time now. Fueled by a passion for DIY projects, I decided to try my hand at interior design to gather some inspiration for the space.

I had the following goals for the room:

  1. Try different furniture styles.
  2. Experiment with different layouts like a living area, home office, etc.
  3. Remove unnecessary items and add new decorations.

I tried interior design tools and even a few AI interior tools to help me with some ideas, but the overwhelming process produced underwhelming results.

Results from Interior Design Tool vs AI Interior Tool

I tried recreating and changing layouts to the best of my ability, but the outcomes from both tools had the following issues:

  • Interior design tools helped me achieve most of my goals, but I could only use assets from the tool’s library, and matching them to a specific theme was a challenge in itself.
  • AI interior tools were on point with the themes I selected (vintage, modern, etc.) but struggled to showcase furniture suited to the type of room (dining, living, office, etc.). Also, I wasn’t able to add, edit, or remove any furniture, decor, or other specific items from the results.

I shared my progress and findings with my friends, who are interior designers themselves, and we discussed the gaps in these tools and possible solutions for them.

Understanding the problem

I dived deep into understanding the workflow, pain points, and goals of interior designers when working with their tools, which surfaced the following insights.

Pain Points and Goals of Interior Designers

With this overview, I knew the kinds of problems I needed to solve, so I defined a clear problem statement to narrow my focus and scope for this conceptual tool.

How might we help interior designers generate, iterate, collaborate, and visualise their work in real time?

My scope for this tool would be to design an MVP (Minimum Viable Product) that has the following features:

  • Generate or Remove furniture and decor items from a space.
  • Iterate over the generated items for a wider range of selections.
  • Collaborate with teams and clients to showcase designs.
  • Visualise and make changes in the design in real time.

Research and Inspiration

I looked for existing tools that have these features, with the intent of combining the best of each implementation into an MVP.

While doing the research, I found that IKEA has been working on something similar: a virtual reality showroom and a virtual reality interior designer that help customers decide the color, texture, and material of furniture, cabinets, and most other IKEA products inside a 3D room.

IKEA VR Showroom Experience

My biggest inspiration for designing this conceptual tool comes from this execution by IKEA. But here are a few reasons why this approach won’t work for us:

  1. This only works in a 3D room, and only with 3D models of their own products.
  2. We won’t be able to add or remove items from this space.
  3. We can’t customize the room to try different layouts.

There are similar VR and AR applications that rely on either creating a room from scratch or scanning the room and then placing 3D furniture assets or decor items. However, the biggest issue with this approach is matching the 3D assets to a real environment or to other 3D assets.

Ideating Solution

It is difficult to create a 3D scan of the room to replicate IKEA’s VR experience, but we’ve often seen VR work well with 360° photos, and to project a flat view we can simply zoom in enough to get a flat image. This would work well in a browser as well as in a VR headset.

By using a 360° photo of the space, we can move around and resize the view to preview each section, and we can fully expand it into a 360° immersive experience, as shown below.

VR experience using 360° photo
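
To give a sense of how this could work under the hood, here is a minimal sketch (my own assumption of one possible implementation, not part of VRIA) that maps an equirectangular 360° photo onto the inside of a sphere with Three.js. Narrowing the camera’s field of view approximates the flat, zoomed-in section view, while widening it gives the immersive one; the file name room-360.jpg is a placeholder.

```ts
import * as THREE from 'three';

// Render an equirectangular 360° photo from inside a sphere.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.1, 1100,
);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Invert the sphere so the photo faces the camera sitting at its center.
const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1);
const texture = new THREE.TextureLoader().load('room-360.jpg'); // placeholder asset
scene.add(new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture })));

// A narrow field of view approximates the flat "section" preview;
// a wide one gives the fully expanded, immersive 360° experience.
function setSectionView() { camera.fov = 40; camera.updateProjectionMatrix(); }
function setImmersiveView() { camera.fov = 90; camera.updateProjectionMatrix(); }

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```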

Another strength of generative AI is inpainting and outpainting, which fits our needs perfectly: we can select a portion of the photo and manipulate it with prompts to generate and iterate over results. This can be achieved with tools like Dall-E, Stable Diffusion, and Photoshop’s Generative Fill.

Dall-E Inpainting Iterations in a 360° Room

I used Dall-E to inpaint and iterate over the existing furniture, remove objects, and even generate a wall painting that never existed in the photo. I got some realistic results, but it took over 200 iterations.
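
For reference, one such inpainting iteration looks roughly like this through OpenAI’s image edit endpoint, which is how Dall-E exposes inpainting. This is a hedged sketch rather than VRIA’s implementation: the file names and prompt are placeholders, and the mask is a PNG whose transparent pixels mark the region the model may repaint.

```ts
import fs from 'node:fs';
import OpenAI from 'openai';

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// One inpainting iteration: the transparent area of mask.png tells the model
// which part of the room photo it is allowed to repaint.
const result = await client.images.edit({
  model: 'dall-e-2',
  image: fs.createReadStream('room-section.png'), // placeholder: a square crop of the 360° photo
  mask: fs.createReadStream('mask.png'),          // placeholder: transparent where edits are allowed
  prompt: 'a mid-century modern armchair next to the window', // placeholder prompt
  n: 4,              // generate four variants to iterate over
  size: '1024x1024',
});

console.log((result.data ?? []).map((d) => d.url)); // URLs of the generated variants
```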

Design principle of VRIA

The user flow of interior design tools helps designers make the best use of their skills and knowledge to get the desired result. What becomes challenging is iterating on and testing different ideas, individually or in a team, as this step can often take a lot of time.

The current approach of AI interior tools works more or less like a Snapchat filter for a space, leaving little room for human intervention to control the tool’s outcomes.

User Flow of Interior Design and Interior AI Tools

This is what VRIA tries to solve: it uses generative AI inpainting to give designers more control over creating ideas and having discussions, and to help them iterate on and test designs faster.

VRIA introduces the persona of an interior assistant that helps designers along their journey by taking their ideas as prompts and outputting different results to select from or edit. Here’s a brief user flow of how VRIA works.

User Flow of VRIA

The feedback loop of giving prompts as input helps establish a context that gets closer and closer to the desired result one prompt at a time. It also helps train the AI, with each prompt, to understand the kind of outcome an interior designer is actually looking for.
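
As a rough illustration of that feedback loop (every name here is hypothetical, not a real VRIA API), each refinement can simply be appended to a running context so the next generation starts from everything the designer has said so far:

```ts
// Hypothetical sketch of the prompt feedback loop: the running history of
// refinements is folded into the next prompt, so each iteration builds on
// the accumulated context rather than starting from scratch.
interface DesignContext {
  promptHistory: string[];
}

function refine(context: DesignContext, newPrompt: string): DesignContext {
  return { promptHistory: [...context.promptHistory, newPrompt] };
}

function toCombinedPrompt(context: DesignContext): string {
  return context.promptHistory.join('; ');
}

// e.g. "a vintage dining table; walnut finish; seats six"
let context: DesignContext = { promptHistory: ['a vintage dining table'] };
context = refine(context, 'walnut finish');
context = refine(context, 'seats six');
console.log(toCombinedPrompt(context));
```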

Design Process

For the MVP, I focused on adapting Apple’s spatial design principles and sticking to the scope of an experience that showcases the features that help interior designers solve their problems.

The windows and layouts should adapt to the designer’s needs. A 360° photo adapts easily: designers can view and edit it in sections, or test the results by expanding it into 360° space for an immersive VR experience.

Single, Shared, and 360° View

I. Home Screen Window

The home screen shows a designer’s recent work, with sub-navigation options in the sidebar. The window uses the default aspect ratio to keep the designer’s focus in the center without too much moving and panning, and a tab bar on the side helps them navigate to the different sections of the tool.

VRIA Home Screen

II. Project/Image Window

This is where the editing and designing starts. The designer gets a wider window to work with, and the controls in the bottom toolbar cover basic functions like generating designs, adding resources, adding comments, undoing and redoing changes, and reviewing all generated designs.

Project/Image window with toolbar controls

III. Generating Designs

A designer can place the generative frame over the portion of the image they wish to add, edit, or remove items from, and the toolbar turns into an input field that takes prompts to initiate the inpainting process.

The significance of the generative frame is not only to define an area for the AI to work in, but also to let other team members know which space is assigned, avoiding conflicts and overlaps in the design.

Generating designs with the help of prompts
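
One plausible way to hand the generative frame to the inpainting model is to rasterise it as a mask, leaving the frame’s area transparent, the convention Dall-E’s edit endpoint uses for editable regions. The helper below is a hypothetical browser-side sketch, not VRIA’s actual code:

```ts
// Hypothetical helper: convert the generative frame's rectangle into an
// inpainting mask. Opaque pixels are preserved; the cleared (transparent)
// rectangle marks the region the AI is allowed to repaint.
interface Frame { x: number; y: number; width: number; height: number; }

function frameToMask(imageWidth: number, imageHeight: number, frame: Frame): HTMLCanvasElement {
  const canvas = document.createElement('canvas');
  canvas.width = imageWidth;
  canvas.height = imageHeight;
  const ctx = canvas.getContext('2d')!;
  ctx.fillStyle = '#000';
  ctx.fillRect(0, 0, imageWidth, imageHeight);                 // keep everything outside the frame
  ctx.clearRect(frame.x, frame.y, frame.width, frame.height);  // editable region
  return canvas;
}

// The canvas can then be exported as a PNG and sent along with the prompt,
// e.g. canvas.toBlob((blob) => { /* upload the mask */ }, 'image/png');
```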

IV. Iterating Designs

Based on the prompt, VRIA outputs multiple variants for the designer to pick from. They can then perform the following actions:

  1. Select a variant to preview it in the environment.
  2. Accept a variant that matches their expectation.
  3. Edit the prompt for a more specific result or in case of misinterpreted results.
  4. Regenerate variants to view more options.
  5. Edit or erase any discrepancy in the variants.
  6. Delete unusable variants that cannot be edited into a desired outcome.

Iterating over the generated variants

V. Collaborating with the Team

To work together as a team or to showcase results to a client, a SharePlay experience can be initiated in VRIA. Depending on the level of access assigned by the host, a participant can either edit or only view the project. This will help in managing different team members across multiple projects.

Collaborating with a team member

VI. 360° Immersive View

The biggest advantage of starting with a 360° photo is that a designer can experience the transformation and test design ideas against the surroundings in real time. By simply moving around or dragging through the space, one can literally see the big picture of the results.

360° Immersive View

With this approach, VRIA can solve most of the problems an interior designer faces with current interior design tools and make better use of AI in their workflow.

Prototyping

Using Figma’s smart animate, I made this basic prototype to showcase the user flow of all the major features. Here’s a video of me interacting with the prototype, and if you wish to try your hand at it, use this link.

Interacting with the VRIA prototype

Conclusion

Since spatial design is still in its early stages, there’s a lot to learn in this space. My biggest takeaway from this challenge is that if AI is to be integrated into interior design, it should leave room for human intervention. This will improve the results for both the AI and the interior designers.

And bringing it all into one VR tool, where everyone gets a great user experience, generated ideas can be tested efficiently, a team can iterate over them faster, and the results can be showcased to the client as demos, does sound like an ideal tool for interior designers.

Bonus

As an assignment to design a landing page for a product, I decided to design and develop two landing pages for VRIA, in Framer and Dora.

Framer is a no-code tool that lets its users develop a fully responsive website from scratch. I first designed the landing page in Figma and then imported it into Framer using the Figma to HTML with Framer plugin. Check out my progress in this video.

VRIA Landing Page in Framer

Similar to Framer, Dora is a no-code tool with keyframe features for building 3D websites, letting users customize what the viewport shows based on the window’s scroll position. I used a 3D Apple Vision Pro asset along with my prototype assets, and with a bit of scrollytelling I was able to achieve this.

VRIA Landing Page in Dora

Both of these websites are works in progress, as I’m trying to fix a few bugs and make them fully responsive. If you still wish to check them out, you can click here for the Framer landing page and here for the Dora landing page. If you want to read about the design process of the landing pages, click here for the case study.

Thank you for reading and being part of this conceptual tool’s journey. I’ll be diving deeper into spatial design and exploring more such ideas in the future. I appreciate and welcome your feedback and views on my approach. You can connect with me on X (Twitter) and LinkedIn.

Now, I’ll get back to designing the interior of my room :P

Peace ✌🏻

--

A product designer and a developer, passionate about inclusive design.