Introducing Create 1.3: Can you build empathy with a robot?

The latest update to Create introduces a new character, the Robot, who guides players through some of Create's lesser-known features. The goal of this update was to design an accessible way for people to engage with the subtler possibilities of the experience. A cross-functional team of designers, animators, VFX artists, engineers, and audio designers from our Studios team came together to build an interactive approach to a new character, and we want to share with other spatial computing developers some of the behind-the-scenes challenges the team confronted in bringing Create 1.3 to life.

When we released Create last year alongside Magic Leap’s first device, we wanted people to experience a world where the digital and the physical interact, showing off how spatial computing has the power to delight and amaze. The name of the app says it all. With Magic Leap One, you can create and transform the world around you. From slapping stickers on your wall to releasing jellyfish and sea turtles into your bedroom, Create offers a glimpse of how the impossible is now reality.

Magic Leap Studios’ mission is to make compelling apps that push the boundaries of what’s possible in spatial computing. This requires iteration and evolution at every level of the design and development process. Just because we’ve launched Create doesn’t mean we’re done.

Our third update introduces a new character, the Robot, who guides players through some of Create’s lesser-known features. Oh, and we added butterflies.

Great software is designed to seem effortless, to appear as if it just works. But a careful and deliberate process went into creating the Robot. We love the results and we hope you do too.

[GIF: the Robot does a dance spin]

From Jeremy Vanhoozer, VP, Creative Content at Studios

We developed the Robot as a solution to a problem we’d noticed. In the past, to guide players towards possibilities in the Create world, we used contextual prompts that appeared when the player carried out certain actions. For example, in the initial release a 2D text box appeared in the player’s view when they first engaged with an art brush. We discovered through player testing, however, that some players didn’t notice the prompts or understand what they meant. We wanted to add an accessible way for people to engage with subtler possibilities in Create, and a fun character felt like the best way to do that. We needed a more interactive approach.

Create’s animated characters have proven to be one of the most engaging parts of the experience. People relate to them, devising backstories and interactions we never imagined. When it came to giving players a guide, it seemed only natural that it be a friendly character.

What emerged was an all-new character, the Robot. This plucky, well-intentioned character’s main goal is to help players discover the cool things in Create. It doesn’t always get things right the first time, but that doesn’t stop it from trying. The Robot interacts with players to highlight features and tips on getting more out of Create. The Robot shows players how to snap objects together, turn on gadgets, feed the T-Rex, and more. Most characters’ interactions are with other characters. The Robot's interactions are with you.

Jonathan Mangagil (Animation)

With tight deadlines and a small team, we had to understand this new character and its goals very quickly. We decided to take a personality assessment as the Robot and discuss our findings. This exercise let us flesh out its personality, establishing its wants, needs, and fears. These became the basis of its AI behavior.

The Robot’s four strongest traits were: creative, enthusiastic, energetic, and fun-loving. Furthermore, we decided that our Robot tended to:

  • ignore fine print and details
  • rarely think about consequences
  • find creative ways to do things
  • be innovative and spontaneous

Knowing this meant we could start animating sooner.

While fleshing out this character, we asked ourselves whether the Robot should fly or walk. Both have pros and cons. A walking Robot requires more animation assets but gives the animator more opportunity to showcase personality as the character moves. A flying character needs fewer animations, but programming an intelligent path from point A to point B with personality takes more time. Either way, we had to bring to life how the character turns, accelerates, slows, looks around, and interacts with other characters.

To showcase personality quickly and clearly, the best route was to have it walk, or in this case, roll along the ground.

[GIF: the Robot rolls up to a ledge]

The next question was how to ground our character in the player’s world. Shadows show contact with the player’s ground plane, but they require awareness of the area’s lighting. We decided to use audio and FX instead. FX leave a light trail as the Robot rolls along the ground, showing contact, while audio adds texture, conveying how the Robot sounds as it rolls over the ground.

Spatial computing changes how we animate. Traditional video games portray characters theatrically, incorporating clear, broad poses readable from multiple angles at different sizes, regardless of the player’s screen and their distance from it. Spatial computing, while still theatrical, redefines the kind of theater we model.

Instead of traditional theater, we think of the burgeoning forms of immersive and interactive theater. Interactive theater breaks the fourth wall separating the performers from the audience. Immersive theater removes the stage, surrounding the audience with the performance. Spatial computing is potentially both interactive and immersive, depending on the type of experience. Create is immersive by its nature. The Robot adds an interactive element as it seeks to help the player.

We wanted players to feel appreciated when they completed tasks, encouraging them to go deeper into the experience. The Robot let us do this. If the player pulls an object from the menu, the Robot tracks the object to see what the player would do with it. If a construction block is placed in the scene, the Robot provides another object and demonstrates how players can snap them together. When the player snaps two objects together, the Robot celebrates by jumping up and down. We guide the player by engaging with their actions and encouraging further actions, then rewarding them with reactions.

[GIF: the Robot looks at the player]

One of the fastest ways to have two people build a strong bond is to put them in a situation where they depend on each other. It doesn’t matter if they come from different backgrounds or speak different languages; if they have a common goal, they can build a relationship. We gave the Robot a fear of heights and animated it to ask for the player’s help whenever it arrives at a steep ledge. Our hope was that when players saw this, they would feel enough empathy to stop and help the Robot.

Kris Whitney (Design)

One design challenge was handling multiple Robots in a scene. Players can add as many Robots as they want, and every Robot could potentially request an action from the player at the same time. It was funny at first to see an army of Robot minions begging for your attention, but it became obvious we needed to limit the requests. We handled this by writing a “bot manager” script that tracks Robot requests to the player. Before a Robot attempts a player interaction, it checks with the manager to see whether it is allowed. If not, it performs its next-highest-priority AI task.
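
As a rough sketch of that idea, a minimal “bot manager” in Unity C# might look like this; the class and method names are our illustration, not the shipping code:

    using UnityEngine;

    // A minimal sketch of a "bot manager": one gatekeeper that decides which
    // Robot may address the player at any moment. Names are illustrative.
    public class BotManager : MonoBehaviour
    {
        private MonoBehaviour activeRequester;   // the one Robot allowed to make a request

        // A Robot calls this before starting a "player request" task.
        public bool TryClaimPlayerAttention(MonoBehaviour robot)
        {
            if (activeRequester != null && activeRequester != robot)
                return false;                    // denied: do the next-highest-priority AI task
            activeRequester = robot;             // granted: this Robot may address the player
            return true;
        }

        // Called when the request completes, times out, or is cancelled.
        public void ReleasePlayerAttention(MonoBehaviour robot)
        {
            if (activeRequester == robot)
                activeRequester = null;
        }
    }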

Another challenge was handling the Robot’s ability to spawn new objects, like summoning a block of cheese. We have to confirm that the target area is empty to avoid spawning inside another character or object. Blocked spawns happen quite frequently and must be handled gracefully. For the Robot, we perform two checks and handle the failure cases differently.

[GIF: the Robot with the T-Rex]

The first check occurs when the Robot arrives at an area of interest where an object could be spawned. If the area has room for a spawned object, the Robot shoots out the object. If the area is obstructed, the Robot performs an ‘inspect’ animation instead, so a blocked spawn becomes a little scene in which the Robot studies the thing that got in its way.

The second check is done when the materialization shot is fired. If the shot hits something that has moved into the line of fire, the Robot must respond. Instead of waiting for the player to perform the action it requested, the Robot plays a ‘disappointed’ animation and the rest of the task is cancelled.
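
Sketched in Unity C#, the two checks might look like this, with hypothetical trigger names and helper methods standing in for the real task system:

    using UnityEngine;

    // A sketch of the two spawn checks described above. Trigger names and
    // helpers are placeholders, not Create's actual code.
    public class MaterializationChecks : MonoBehaviour
    {
        public Animator animator;

        // Check 1: runs when the Robot arrives at the area of interest.
        public void TrySpawnAt(Vector3 spawnPoint, float objectRadius)
        {
            if (Physics.OverlapSphere(spawnPoint, objectRadius).Length == 0)
                FireMaterializationShot(spawnPoint); // room to spawn: shoot the object out
            else
                animator.SetTrigger("Inspect");      // blocked: study whatever is in the way
        }

        // Check 2: runs when the materialization shot lands; something may
        // have moved into the line of fire after check 1 passed.
        public void OnShotHit(bool hitIntendedTarget)
        {
            if (hitIntendedTarget)
                return;
            animator.SetTrigger("Disappointed");     // react, then abandon the task
            CancelCurrentTask();
        }

        void FireMaterializationShot(Vector3 at) { /* launch the spawn projectile */ }
        void CancelCurrentTask()                 { /* unwind the rest of the AI task */ }
    }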

One problem when developing for XR is that you can’t control where the player is looking. Making sure the player doesn’t miss a special event becomes a critical challenge. In the case of the Robot, we realized that it frequently wasn’t in the player’s field of view. For “player requests” to be noticed, we needed audio cues to attract the player’s attention. After the audible cue, the Robot waits for the player to look before performing the rest of its “player request.” This avoids a situation where the player finds the Robot too late, catches the end of the request, and doesn’t have enough time left to perform the request before the Robot “gets bored and wanders off.”
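
A minimal sketch of that gaze gating, assuming a simple dot-product field-of-view test against the headpose; the attention cue, patience timeout, and method names are our own illustration:

    using System.Collections;
    using UnityEngine;

    // A sketch of gating a "player request" on the player's gaze.
    public class RequestGate : MonoBehaviour
    {
        public AudioSource attentionCue;            // voice-like "over here!" blip
        [Range(0f, 1f)] public float fovDot = 0.8f; // ~37 degree half-angle

        bool RobotIsInView(Transform head)
        {
            Vector3 toRobot = (transform.position - head.position).normalized;
            return Vector3.Dot(head.forward, toRobot) > fovDot;
        }

        // Usage: StartCoroutine(PlayRequest(headposeTransform));
        public IEnumerator PlayRequest(Transform head, float patience = 8f)
        {
            attentionCue.Play();                    // audible cue draws attention first
            float waited = 0f;
            while (!RobotIsInView(head) && waited < patience)
            {
                waited += Time.deltaTime;
                yield return null;                  // hold until the player looks over
            }
            if (RobotIsInView(head))
                StartRequestAnimation();            // the player sees the whole request
            else
                WanderOff();                        // "gets bored and wanders off"
        }

        void StartRequestAnimation() { /* play the full request performance */ }
        void WanderOff()             { /* abandon the request for now */ }
    }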

One thing we learned when creating characters that request actions from the player: don’t be repetitive. For the Robot, we had to ensure it didn’t request the same thing multiple times in a row. In our case, the “player request” AI task has a cooldown range of 20-40 seconds that limits how often it fires, and each individual “player request” also has its own cooldown. So if the Robot asked the player to snap some blocks together 30-ish seconds ago, the next request will be something different, because “snap some blocks together” is still on cooldown while other requests aren’t.
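
Here is one way to sketch those layered cooldowns; only the 20-40 second global window comes from the team, and the 60-second per-request default is our assumption for illustration:

    using System.Collections.Generic;
    using UnityEngine;

    // A sketch of a global gate on any request plus a per-request cooldown.
    public class RequestScheduler : MonoBehaviour
    {
        private float nextGlobalRequestTime;
        private readonly Dictionary<string, float> perRequestReadyTime =
            new Dictionary<string, float>();

        public bool TryStartRequest(string requestId, float requestCooldown = 60f)
        {
            float now = Time.time;
            if (now < nextGlobalRequestTime)
                return false;                       // too soon for any request at all
            float ready;
            if (perRequestReadyTime.TryGetValue(requestId, out ready) && now < ready)
                return false;                       // e.g. "snap blocks" still cooling down

            nextGlobalRequestTime = now + Random.Range(20f, 40f);  // 20-40 s window
            perRequestReadyTime[requestId] = now + requestCooldown;
            return true;                            // a different request can fire instead
        }
    }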

Andrew Moran (Tech)

Create 1.3 brought its share of programming challenges. For locomotion and physics, we knew the Robot would need pathfinding over graphs generated at runtime from nav meshes so it could navigate the contours of the world mesh. As a physics-based agent, the Robot uses customized parameters such as maximum velocity and acceleration to move smoothly with the desired ease-in and ease-out. A stopping distance can also be set so the Robot slows as it approaches a target.
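
In Unity this maps naturally onto a runtime-built nav mesh plus a NavMeshAgent. The sketch below assumes the NavMeshComponents package for runtime baking, and the parameter values are illustrative, not the Robot’s actual tuning:

    using UnityEngine;
    using UnityEngine.AI;   // NavMeshAgent; NavMeshSurface ships in NavMeshComponents

    // A sketch of runtime nav-mesh locomotion with smooth agent parameters.
    public class RobotLocomotion : MonoBehaviour
    {
        public NavMeshSurface meshedRoom;   // surface built over the scanned world mesh
        private NavMeshAgent agent;

        void Start()
        {
            meshedRoom.BuildNavMesh();      // generate the graph at runtime

            agent = GetComponent<NavMeshAgent>();
            agent.speed = 1.2f;             // max velocity (illustrative)
            agent.acceleration = 2.5f;      // low acceleration => smooth ease-in/out
            agent.stoppingDistance = 0.4f;  // slow down before reaching the target
        }

        public void RollTo(Vector3 target)
        {
            agent.SetDestination(target);   // path along the mesh contours
        }
    }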

Our AI includes a prioritized list of objectives that dictate what the Robot should "do" when objects are nearby or when the player performs certain actions. We extended these sub-tasks in the Robot's behavior tree to incorporate player responsiveness, so a task is not complete until the player completes the “suggestion”. This made the Robot both a character with agency and an instructional tool that extends the player’s understanding of the experience.
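
A sketch of such a leaf task, in a generic behavior-tree style; the Status enum and wiring follow common behavior-tree conventions rather than any specific library:

    // A behavior-tree leaf that stays Running until the player acts on the
    // suggestion, so the tree cannot move on when the animation ends.
    public enum Status { Running, Success, Failure }

    public class SuggestSnapTask
    {
        private readonly System.Func<bool> playerSnappedBlocks; // polled from game state
        private bool suggestionPlayed;

        public SuggestSnapTask(System.Func<bool> playerSnappedBlocks)
        {
            this.playerSnappedBlocks = playerSnappedBlocks;
        }

        public Status Tick()
        {
            if (!suggestionPlayed)
            {
                // Spawn the demo block and play the "snap these!" performance.
                suggestionPlayed = true;
                return Status.Running;
            }
            // The task doesn't succeed when the animation ends -- only when
            // the player actually snaps the blocks together.
            return playerSnappedBlocks() ? Status.Success : Status.Running;
        }
    }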

For animation and VFX, we established subsystems that break the Robot’s visuals into a series of steps. This is most visible in the Robot’s holograms, materialization, and joint overrides. Holograms are data-driven, storing their textures in ScriptableObjects, analogous to a traditional inventory system. Object materialization spawns a temporary object for the player to interact with; building scalable systems let us create a zero-gravity “temporary” state for an existing object with minimal impact on objects in other states. Lastly, object-driven animation overrides redirect the Robot’s joints toward the direction of interest -- such as looking at the player or at an object the player is holding.
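
A sketch of the data-driven hologram idea, with illustrative names: textures live in a ScriptableObject asset, like an inventory of projectable images, so designers can add holograms without code changes:

    using UnityEngine;

    // Hologram textures authored as an asset, not hard-coded in scripts.
    [CreateAssetMenu(menuName = "Robot/Hologram Set")]
    public class HologramSet : ScriptableObject
    {
        public Texture2D[] hologramTextures;    // one entry per projectable image
    }

    public class HologramProjector : MonoBehaviour
    {
        public HologramSet holograms;           // assigned in the inspector
        public Renderer projectionQuad;         // the quad the hologram appears on

        public void Show(int index)
        {
            // Swap the projected image without touching the projector prefab.
            projectionQuad.material.mainTexture = holograms.hologramTextures[index];
            projectionQuad.enabled = true;
        }
    }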

Minal Kalkute (VFX)

Visual effects lend a sense of life and spontaneity to video games, adding character to objects and providing focus. Manipulating opacity is a common trick for visual effects artists, but in spatial computing we have to balance an object’s glow against keeping it opaque enough to see: the more transparent a glow effect is rendered, the more it simply disappears on the device. Black doesn’t work in spatial computing either, because the additive display renders pure black as transparent. To compensate, we imply black with bright surrounding textures or implied shadows. When designing FX for spatial computing, consider how opacity behaves, how bright the effect must be on the device, and how building an effect around an object affects testing, review, and change time.

Unity’s Animator tool is essential for FX artists. While you can control timing and sequencing with the particle system or scripts, Animator and Timeline let you time effects and control when they are triggered. When an effect has multiple dependencies, like when the Robot materializes an object, it’s easier to set timing and duration with keyframes. Programmers can time most visual effects in code, but using Animator simplifies tasks for both programmers and artists, and it doesn’t hamper the review-feedback cycle when parameters change, which streamlines the workflow. During production, having debug builds and test levels for effects accelerates turnaround and simplifies feedback.
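
For example, an Animation Event on the materialization clip can call a method that fires the particle effect, so artists retime it by sliding a keyframe instead of requesting a code change; the names here are our illustration:

    using UnityEngine;

    // Keyframe-driven FX timing: an Animation Event placed on the Robot's
    // materialize clip calls PlayBeamFX() at exactly the frame artists choose.
    public class MaterializeFX : MonoBehaviour
    {
        public ParticleSystem beam;

        // Hooked up as an Animation Event on the materialization clip.
        public void PlayBeamFX()
        {
            beam.Play();
        }
    }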

Effects are not always visible through Magic Leap One, and they can appear different on device than in the editor. Enabling GPU instancing lets the shader keep a per-instance copy of each parameter internally while sharing the same texture across objects. This reduces the compute required to draw the shader, which matters in spatial computing where you don’t want too many overdraws. A shader editor like Amplify Shader Editor mitigates this with its Force Enable Instancing parameter. Instancing is essential when shaders change at runtime: parameter values are often set at runtime by scripts or by the shader itself, and instancing enables smoother transitions.
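
A sketch of runtime per-object parameters that keep instancing intact, using Unity’s MaterialPropertyBlock; the _GlowStrength property is an assumed shader parameter, not one from the article:

    using UnityEngine;

    // Per-object shader values without breaking GPU instancing or batching.
    public class InstancedGlow : MonoBehaviour
    {
        private static readonly int GlowId = Shader.PropertyToID("_GlowStrength");
        private MaterialPropertyBlock block;
        private Renderer rend;

        void Awake()
        {
            rend = GetComponent<Renderer>();
            rend.sharedMaterial.enableInstancing = true; // same effect as Force Enable Instancing
            block = new MaterialPropertyBlock();
        }

        public void SetGlow(float strength)
        {
            // Per-instance value; every Robot still shares one material and texture.
            block.SetFloat(GlowId, strength);
            rend.SetPropertyBlock(block);
        }
    }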

Optimizing every particle effect helps performance. We never want the frame rate to drop, especially in an app like Create where players build an entire playground; multiple Robots shouldn’t hamper FPS. The cause of an FPS hit is usually poor management of character effects. Use meshes for heavy effects: rendering them through the particle system’s mesh mode, or as plain game objects, lightens the load on the engine and reduces draw calls. For shaders driven by world or object position, place the mesh’s pivot at its center in your 3D package, such as Maya, instead of at the origin. When using mesh particles in Unity, try setting the particle renderer’s alignment to Local. Adding meshes to a particle effect also helps when using shaders, for example when you’re manipulating UVs to fake motion or disintegration effects.

When scaling an effect with scripts, ensure the parent prefab is normalized: the scale parameter should be 1 and the other Transform values zeroed out. This ensures none of the effects are scaled unexpectedly, which creates defects in dependent effects. Inside a particle system, set Scaling Mode to Hierarchy so dependent game objects scale together. For effects that play once, setting Stop Action to Destroy clears the effect from the scene, preventing unused effects from hogging memory. Shader effects are usually more efficient than particles, and using flip-books inside shaders for sprite sheets also helps performance.
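
Both settings are plain inspector options, but as a sketch they can also be set from code when effects are spawned dynamically:

    using UnityEngine;

    // Configures the two settings above for a dynamically spawned effect.
    public class OneShotEffect : MonoBehaviour
    {
        void Awake()
        {
            var ps = GetComponent<ParticleSystem>();
            var main = ps.main;
            main.stopAction = ParticleSystemStopAction.Destroy;      // one-shots clean themselves up
            main.scalingMode = ParticleSystemScalingMode.Hierarchy;  // scale with the parent prefab
        }
    }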

[GIF: the Robot’s repair effect]

The most important part of an FX pipeline is feedback from other artists and art directors. Get reviews and feedback early and often; this leaves time for changes and polish later in production. It can be hard to get feedback, and waiting is frustrating, so invest in communicating with your team: post your work as small GIFs in forums or chat apps like Slack. As feedback comes in, you’ll know how you’re doing and can keep the workflow moving.

Danielle Price (Audio)

We wanted the Robot’s personality to feel more human than machine-like. We wanted to give it human vocal qualities, but we didn’t want it to actually speak. Instead, we used voice-like synthesized sound effects to call for the player’s attention and to tell the story of its actions. Its sound palette consists of organic mechanical elements with hints of sci-fi-esque synthesized sounds. We recorded original source material from a vintage Macintosh Plus computer, keyboard, and mouse.

These organic-mechanical sounds gave it a more realistic feel and helped build the character as a little guy with some miles on it. Each animation consisted of multiple sound layers at different frequencies for its body and movements. We isolated all the sound layers in Wwise blend containers to add greater variety and flexibility to the mix.

While working with animation and FX, it was important that we stayed in lockstep. We “sketched in the sounds” but never fully colored in the lines until the visuals were done. Changes in a project can incur risk up until final release. To avoid a last-minute audio design crunch, we built a library of sounds for the Robot’s different movement layers. At the end, when the visuals were done, we could quickly finalize sound timings for the animations by drawing from our library.

The Robot’s hero animations and vocal attention calls are louder than its idle and turn sound effects. Our final mix on the device balances sound to support a variety of interaction scenarios. For example, when multiple characters are active in Create, we mix dynamically by adjusting audio levels based on what the user chooses to focus on: the volume of objects the user isn’t focusing on is attenuated. This helps us avoid a muddy, unpleasant mix of too many sounds playing at once.
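
The shipping mix runs through Wwise, but the idea reduces to something like this sketch with plain Unity AudioSources and a dot-product focus test; the thresholds and ducked level are illustrative:

    using UnityEngine;

    // Focus-based ducking: full volume for the character the player looks at,
    // attenuated volume for everything else. Treat this as the idea only.
    public class FocusMixer : MonoBehaviour
    {
        public Transform head;                  // player's headpose
        public AudioSource[] characterVoices;   // one per active character
        [Range(0f, 1f)] public float duckedVolume = 0.35f; // illustrative

        void Update()
        {
            foreach (var voice in characterVoices)
            {
                Vector3 to = (voice.transform.position - head.position).normalized;
                bool focused = Vector3.Dot(head.forward, to) > 0.85f; // ~32 degrees
                // Ease toward the target level to avoid audible volume pops.
                voice.volume = Mathf.MoveTowards(
                    voice.volume, focused ? 1f : duckedVolume, Time.deltaTime);
            }
        }
    }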

Conclusion

This third update to Create allowed us to polish and refine an already beloved experience and to highlight some of our favorite effects and possibilities. The Robot, a new character who actively seeks a personal connection with the player, gave us an opportunity to create interactive experiences and guide the player to new discoveries. We hope the team’s work bringing the Robot to life has illuminated opportunities across the design and development spectrum of spatial computing. Take these ideas and go further, discovering new vistas we haven’t imagined yet.

Safe travels. We can’t wait to see what you make.
