Magic Leap Studios has launched Undersea, a room-scale, Spatial Computing experience for Magic Leap One made with Unreal Engine 4.
Undersea transforms your space into a dynamically generated coral reef biome. Distinct vistas and creatures, presented in a photo-real art style, create a sense of presence and connection between the creatures and the environment. At SIGGRAPH, we talked about some of the technical challenges we solved bringing this vibrant underwater world to life, and we wanted to share those findings here.
Working at Magic Leap means we get to dream about creating formerly impossible computing experiences and then work to turn those dreams into a reality. What if we could turn your room, any room, into a vibrant reef, where fish and sea creatures interact with you as if you were a diver, no water required?
If you’ve not had a chance yet, you can check out the behind the scenes video where we talk about how and why we created the experience. Better still, download it from Magic Leap World, and turn your living room into a coral reef.
If you’re unfamiliar, Magic Leap Studios is a content creation studio that sits within Magic Leap. Its goal is to create high-caliber content that’s realized and experienced in the new medium of Spatial Computing. Put simply, we make cool stuff so you can make cool stuff too. Undersea was largely about pushing the graphical boundaries of spatial computing using UE4 and Vulkan on Magic Leap One.
In that spirit, we’re releasing a series of art and engineering posts to share some of the problems we solved during our development process in the hope that you’ll dream and create experiences we haven’t thought of yet.
How did Undersea come about?
The Undersea project, as we see it today, started from a simple yet enjoyable demo created at Studios. It featured a single fish that interacted with your hand and then led the user to a human-sized spherical aquarium containing a coral reef and many varieties of fish. Though that demo was fairly static, it resonated with everyone who saw it. It also gave us a glimpse into the possibilities of building an experience that truly pushed the limits of our technology and graphics hardware, while finding interesting ways to show the true potential of spatial computing.
Animation, Rigging and AI pathfinding
With Undersea, we had to rethink our animation techniques and rigs to work with AI pathfinding, which can change states and navigate meshed spaces. Our animation and rigging team invested time in developing a standard fish rig and animation pipeline with:
- Standard animation and UI controls
- Auto LOD
- Fish eye look-at
- Customizable tentacle and fin rig
- Blendspaces that trigger behaviors
- Vertex animation texture baking
These features afforded us great flexibility and, most notably, the ability to reuse animation and tune it to a creature’s behaviors. We originally explored using a shader-driven fish swim solution, but found that it lacked the fidelity and control we needed for our project.
Though our internal rig system also had built-in LOD support, we actually ended up using UE4’s auto LOD. This helped reduce our bone and poly counts dynamically as our performance budgets changed.
One of the greatest challenges our animation team faced was building and testing animation sets to work with our AI pathfinding solution. Each fish had a set of blend spaces with specific animations triggered by the AI system’s capsule, based on our various logic triggers and specific states such as swim speed, wander, seek, flee, and feed. Blend spaces let you specify the inputs, the animations, and how the inputs are used to blend between those animations.
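As a rough illustration of the blend space idea (a standalone sketch, not UE4’s actual UBlendSpace API; the sample animations and speed values below are hypothetical), a 1D blend space maps a single input such as swim speed to weights over the anchored animations:

```cpp
#include <cassert>
#include <string>
#include <vector>

// One animation sample anchored at a given input value (e.g. swim speed).
struct BlendSample {
    std::string anim;
    float speed;   // input value where this animation plays at full weight
};

// Returns per-sample weights for a 1D blend space: the input is clamped
// to the sample range, and the two surrounding samples are linearly blended.
std::vector<float> BlendWeights(const std::vector<BlendSample>& samples, float input) {
    std::vector<float> w(samples.size(), 0.0f);
    if (input <= samples.front().speed) { w.front() = 1.0f; return w; }
    if (input >= samples.back().speed)  { w.back()  = 1.0f; return w; }
    for (size_t i = 0; i + 1 < samples.size(); ++i) {
        if (input <= samples[i + 1].speed) {
            float t = (input - samples[i].speed) /
                      (samples[i + 1].speed - samples[i].speed);
            w[i]     = 1.0f - t;
            w[i + 1] = t;
            break;
        }
    }
    return w;
}
```

UE4 handles this per axis (and in 2D) internally; the point is simply that the AI only has to feed in scalar state like speed, and the blend space resolves which animations play and how they mix.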
One of the state machines
The left panel shows the various locomotion states; the middle panel holds the audio anim notifies and the overall anim notify triggers for feed, turns, etc.
Environment and background schooling creatures
Though we ended up relying on rigs and hand-keyed animations, we discovered a valuable solution for certain environment-based creatures that let us bake animation data into textures, using a pipeline that leveraged Houdini’s game tools, Maya, and UE4. The end result gave us the performance leeway we needed, especially as we tried to balance CPU and GPU load (including texture streaming) using Vulkan on mobile.
The schools of fish in the midground and background used the baking technique
We knew we would not be able to run a real-time flocking simulation, given the sheer amount of content we wanted to author for the experience. For large groups of fish, we arrived at a solution that reduced our baked texture sizes. For each member fish in a large group, we exported its swim cycle from Maya to an FBX file. We then took each of those caches into Houdini and generated individual VATs, or Vertex Animation Textures, which we in turn plugged into the Houdini soft-body frame-blending vertex shader in UE4 to deform the mesh. Lastly, our FX artist generated a simulation in Houdini and attached each individual particle to a joint, which drove each vertex-animated fish through the water.
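Conceptually, the frame-blending side of VAT playback works like the sketch below. This is a CPU-side illustration of what the soft-body vertex shader does on the GPU, with a hypothetical data layout rather than the actual texture format:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Baked vertex-animation data: frames[f][v] is the position of vertex v at
// frame f -- the same information a VAT stores in texture rows and columns.
using VATData = std::vector<std::vector<Vec3>>;

// Frame-blended lookup, mirroring the soft-body vertex shader: pick the two
// frames surrounding the normalized time and lerp between them.
Vec3 SampleVAT(const VATData& frames, size_t vertex, float normalizedTime) {
    float f = normalizedTime * static_cast<float>(frames.size() - 1);
    size_t f0 = static_cast<size_t>(std::floor(f));
    size_t f1 = std::min(f0 + 1, frames.size() - 1);
    float t = f - static_cast<float>(f0);
    const Vec3& a = frames[f0][vertex];
    const Vec3& b = frames[f1][vertex];
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}
```

Without this blend between adjacent frames, baked animation visibly steps at low frame counts; with it, the bake can use far fewer frames (and thus smaller textures) while staying smooth.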
- Fish and creature swimming
  - Standardized rig and animation controls across all fish
  - Auto-swim (procedural sinusoidal swim functionality)
  - Although we had our own traditional LOD rig system, we ended up using UE4’s auto LOD
  - Fish eye look-at
- Specialized environment creatures (Eel, Crab, Octopus, Sea Horse, Sea Turtle, Manta, Shark)
- Blendspaces and in-place rotation offsets used in conjunction with AI to trigger behaviors (speed, state)
- Animations and blend spaces set up for the tech team to integrate with AI pathfinding
- Vertex-animated fish pipeline (gave us flexibility and power within our graphics and mobile spec budgets)
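The auto-swim mentioned above refers to a procedural sinusoidal swim. A minimal sketch of the idea (parameter names and values are illustrative, not our shipped tuning) offsets each point along the spine with a traveling sine wave whose amplitude grows toward the tail, so the head stays stable while the tail beats:

```cpp
#include <cmath>

// Procedural "auto-swim": laterally offset a point on the fish's spine with a
// sine wave that travels from head to tail. `spinePos` is 0 at the head and 1
// at the tail; amplitude ramps up toward the tail so the head stays stable.
float AutoSwimOffset(float spinePos, float time,
                     float frequency,   // tail beats per second
                     float wavelength,  // number of waves along the body
                     float amplitude)   // max lateral offset at the tail
{
    const float kTwoPi = 6.28318530718f;
    float phase = kTwoPi * (frequency * time - spinePos * wavelength);
    return amplitude * spinePos * std::sin(phase);
}
```

A shader-driven version of exactly this kind of function was our first exploration; as noted above, it lacked the fidelity and control we needed on hero fish, which is why it survived only as a secondary tool alongside the rigged pipeline.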
Coral Clusters - Procedural spawning and placement
One of our original goals was to give users a dynamically placed, procedurally driven undersea reef experience. Our challenge, then, was to build a variable-driven coral and rock spawning and placement system, using Blueprints, that integrates with the real world.
In the end, we opted for a more performance-friendly “hybrid” approach: we loaded and placed pre-built rock formations and sand bases, then used our coral spawning and placement system to spawn coral and seaweed on them. Before we could do that, the art team first had to establish a cohesive set of coral rock formations that would serve as the foundation for “kit-bashing” different layouts.
Samples of our “building block” rock assets
The system for dynamically spawning and placing corals was developed using Blueprints. Although Blueprints were initially considered more of an early-prototyping and gameplay proof-of-concept tool, our technical art team squeezed every bit of power out of them.
It looks like the pink sprinkles you get at your favorite ice cream place, but they’re actually vertices being validated for potential coral growth!
We wanted the coral clusters to be laid out differently every time the user returned to the experience.
Images of our coral spawner object system and the data tables that control growth.
The system employed some notable features:
- A data table approach to set individual population and species max counts, spawn type, height-based percentages, and other classification variables.
- A vertex crawling and raycast algorithm to check for placement viability, growth, and caching of valid points to file.
- Run-time coral static mesh loop that first calculates the shadow pass and then creates instanced static meshes of all corals.
- Collision-based exclusion allowed us to simply place box colliders where we didn’t want certain species to grow.
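To make the interplay of those features concrete, here is a hedged sketch of the species-filtering step: given candidate points already validated by the vertex crawl and raycasts, it applies a data-table row’s height band, the exclusion volumes, and the population cap. The struct fields and function names are illustrative, not our actual Blueprint graph:

```cpp
#include <vector>

struct Vec3 { float x, y, z; };

// Axis-aligned exclusion volume: corals are simply not allowed to grow inside.
struct ExclusionBox {
    Vec3 min, max;
    bool Contains(const Vec3& p) const {
        return p.x >= min.x && p.x <= max.x &&
               p.y >= min.y && p.y <= max.y &&
               p.z >= min.z && p.z <= max.z;
    }
};

// A data-table row in the spirit of the post: a population cap plus a height
// band the species may occupy (field names are illustrative).
struct SpeciesRow {
    int   maxCount;
    float minHeight, maxHeight;
};

// Filter candidate points (already found viable by the vertex crawl/raycast
// pass) down to the ones this species may occupy, honoring the height band,
// the exclusion volumes, and the population cap.
std::vector<Vec3> FilterSpawnPoints(const std::vector<Vec3>& candidates,
                                    const SpeciesRow& row,
                                    const std::vector<ExclusionBox>& exclusions) {
    std::vector<Vec3> out;
    for (const Vec3& p : candidates) {
        if (static_cast<int>(out.size()) >= row.maxCount) break;
        if (p.z < row.minHeight || p.z > row.maxHeight) continue;
        bool excluded = false;
        for (const ExclusionBox& box : exclusions)
            if (box.Contains(p)) { excluded = true; break; }
        if (!excluded) out.push_back(p);
    }
    return out;
}
```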
To help integrate and ground the coral clusters in the user’s space, we built a set of smaller “supporting” rock and sand elements with dynamically moving seaweed on them. The spawner manager (on the C++ core game side) then placed those supporting elements around the main coral clusters, based on the floor planes the system detects and the space constraints of each user’s room. This also helped ensure a unique layout for every play space.
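A simplified sketch of that supporting-element placement, assuming a rectangular detected floor plane (the ring layout, knob values, and bounds check are illustrative; the real spawner manager is more involved):

```cpp
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };

// Scatter supporting rock/seaweed elements on a ring around a coral cluster,
// keeping only positions that land inside the rectangular bounds of a
// detected floor plane. Radius and count are illustrative tuning knobs.
std::vector<Vec2> PlaceSupportingElements(Vec2 clusterCenter, float radius, int count,
                                          Vec2 floorMin, Vec2 floorMax) {
    const float kTwoPi = 6.28318530718f;
    std::vector<Vec2> placed;
    for (int i = 0; i < count; ++i) {
        float angle = kTwoPi * static_cast<float>(i) / static_cast<float>(count);
        Vec2 p { clusterCenter.x + radius * std::cos(angle),
                 clusterCenter.y + radius * std::sin(angle) };
        // Only keep positions that actually land on the meshed floor plane.
        if (p.x >= floorMin.x && p.x <= floorMax.x &&
            p.y >= floorMin.y && p.y <= floorMax.y)
            placed.push_back(p);
    }
    return placed;
}
```

Because the detected floor bounds differ per room, a smaller play space simply culls more of the ring, which is one simple way a layout ends up unique to each space.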
Although each of the three core vista environments was pre-built, the swappable frames and the dynamic AI- and FX-driven creatures made each one feel completely unique.
Procedurally spawned coral clusters surrounded by smaller rock pieces and seaweed. Each layout is dynamic and dependent on the meshed room size.
- Our system used a run-time coral static mesh loop that first calculated the shadow pass using actor tags, then immediately created instanced static meshes of all corals to reduce draw calls and overhead
- All growth was data table driven and included:
  - Vertex-based growth, raycast search, and caching of valid points
  - Height-based and collision-based inclusion/exclusion of species
- We built reusable and procedurally spawnable rocks and coral reef elements that could be rotated and snapped into random configurations
- We grounded and integrated the content by using a subtle falloff on sand and adding randomly placed supporting rocks with seaweed around the clusters
- The team used Maya, ZBrush, Substance Designer, and Substance Painter to create unique, tileable, procedural, and hand-crafted assets (cross-covered in the Portals section)
In this first post, we’ve looked at how we addressed animation, rigging and AI pathfinding problems for our fish and how we dynamically spawned coral in the user’s space. In our next post, we’ll look at how we built portal Vistas that dynamically extend and transform the user’s space and then consider the lessons learned by the entire team. If you have questions or feedback, come join the conversation on Twitter or in our forums.
CG Supervisor, Magic Leap Studios