Case Study

3D Modelling

Project Overview

For my final year project, I wanted to create a concept for a future experience unlike anything that currently exists, combining my passion for space exploration with next-generation artificial intelligence.


My vision was an exhibition combining these elements into an immersive, interactive experience: one where visitors could learn about what might lie beyond our solar system, and which captured that feeling of looking up at the night sky and wondering what might be out there.

Concept Generation

To create a mockup of the exhibition, I first drew a schematic of how visitors would flow through the space. A ring surrounding a central sphere let me control the flow of visitor traffic, so I could build the experience around how users traverse the ring, finishing their journey in the center.


Each segment of the ring would act as a zone, with each zone inviting users to create a different part of the prompt. For example, zone two has users create a star, which influences what worlds might develop around it: a bright star might make planets too warm, while a small, dim star could leave them with a colder climate.
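
To illustrate how these per-zone choices could combine into a single generation prompt, here is a minimal sketch. The zone names, options, and prompt wording are all hypothetical, since the exhibition's logic was only ever specified at concept level.

```python
# Hypothetical sketch: assembling one generation prompt from per-zone choices.
# Zone names, options, and wording are illustrative, not the exhibition's real logic.

ZONE_FRAGMENTS = {
    "star": {
        "bright": "orbiting a large, bright star with a warm climate",
        "dim": "orbiting a small, dim star with a cold climate",
    },
    "planet": {
        "rocky": "a rocky terrestrial planet",
        "ocean": "an ocean world covered in deep seas",
    },
}

def build_prompt(choices):
    """Combine each zone's choice into a single image-generation prompt."""
    planet = ZONE_FRAGMENTS["planet"][choices["planet"]]
    star = ZONE_FRAGMENTS["star"][choices["star"]]
    return f"A distant exoplanet: {planet}, {star}, seen from orbit."

print(build_prompt({"star": "dim", "planet": "ocean"}))
# -> A distant exoplanet: an ocean world covered in deep seas,
#    orbiting a small, dim star with a cold climate, seen from orbit.
```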

Concept Creation

Created in Blender

After creating the exhibition plan, I started mocking up the zones themselves and how they would look and feel in the actual exhibition. To do this, I used the 3D modelling tool Blender to create blockouts of the different zones based on the low-fidelity plan. I then added detail in the form of lighting, textures, and zone-specific visual elements such as signage, planets, and, for the later zones, plants and animals.


I finished off the zones by adding shading modifiers and volumetric lighting to give them a more realistic appearance. Each zone is designed to look similar, both for consistency and to save time, as creating a unique structure for every zone wouldn't have been possible within the timeframe.
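
For anyone curious what that finishing pass looks like when scripted, here is a small Blender Python (bpy) sketch that smooth-shades the selected meshes and adds volume scatter to the world so lights cast visible beams. I did this work through Blender's UI, so the script is an illustrative equivalent rather than the project's actual setup, and the density value is an assumed starting point.

```python
import bpy

# Smooth-shade every selected mesh (equivalent to Object > Shade Smooth).
for obj in bpy.context.selected_objects:
    if obj.type == 'MESH':
        for poly in obj.data.polygons:
            poly.use_smooth = True

# Add a Volume Scatter shader to the world so lights produce visible beams.
world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links

scatter = nodes.new("ShaderNodeVolumeScatter")
scatter.inputs["Density"].default_value = 0.02  # keep the haze subtle

output = nodes["World Output"]
links.new(scatter.outputs["Volume"], output.inputs["Volume"])
```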

AI Integration

Powered by DALL-E AI

I wanted to integrate an AI capable of generating the different environments, planets, and stars that users would experience, based on what the prompt contained. Using DALL-E, I began creating concept images from prompts that might be generated during the exhibition. I was blown away by the results DALL-E produced; they captured the feeling of wonder and excitement that I wanted users to feel.


I decided to use the DALL-E API to produce concept art for the exhibition, with the plan to later integrate it into the exhibition itself as the system generating each zone's elements.
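
As a rough illustration of what that integration could look like, here is a minimal sketch using OpenAI's Python library. The model name, image size, and surrounding code are assumptions for the sketch, not the project's actual settings.

```python
# Minimal sketch of calling the DALL-E API with a prompt built from zone choices.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set in the
# environment; model and size here are illustrative choices.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_zone_image(prompt: str) -> str:
    """Generate one concept image for a zone and return its URL."""
    response = client.images.generate(
        model="dall-e-3",
        prompt=prompt,
        n=1,
        size="1024x1024",
    )
    return response.data[0].url

url = generate_zone_image(
    "An ocean world orbiting a small, dim star, seen from orbit, "
    "painted in a hopeful, wondrous style"
)
print(url)
```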

Showcasing the Exhibition

Powered by DraftXR

To showcase the concept, I wanted to put the exhibition renders into virtual reality so that users could demo the experience before physically attending the exhibition. To do this, I used a Figma plugin called DraftXR, which lets you design on a flat 1920x1080 frame that is then layered over a background image in WebVR. Designing in Figma let me set a standard for my user interface, focusing on how the UI would be seen at such a close distance and making it a comfortable viewing experience.


The main goal of the interface was to give users a way to navigate around the demo. It avoids animations and moving components to reduce motion sickness for users wearing a VR headset, and it sits in the center of the user's vision so that all key elements stay within view.


Building the UI in Figma also helped maintain consistency throughout the prototype, keeping it minimal and unobtrusive to the user's vision.

View the Demo

https://app.draftxr.com/vr/sVInJY

Ethan Fearon

2023