For Developers

Tips and tricks from bringing Spotify to Magic Leap 1

Today, many apps for Magic Leap 1 are developed with Unity, the popular real-time 3D development platform. The team behind Spotify wanted to use Unity to build a non-game experience, and what they created is the first-ever Spotify app for spatial computing. This leap forward demanded careful thinking about workflow and design principles, and we want to share those lessons with our developer community.

Last July, Magic Leap released its Background Music Service (BMS), which lets you play music in the background while other apps are in use. Pairing this with a music provider like Spotify seemed an obvious next step, and last year, we released Spotify on Magic Leap World. Now you can stream music from Spotify while browsing the web and using other landscape apps on Magic Leap 1.

Using Unity to make a non-game

Magic Leap 1’s BMS allows any developer to create a streaming audio application with Unity, Unreal, or Lumin Runtime. The team chose Unity for rapid iteration on 3D interfaces and control input. Even though Unity and similar engines have traditionally been used for developing games, the environment is well suited to creating this type of immersive utility in an efficient and familiar way. In Unity, artists can develop the style and layout of the interface with custom models, shaders, and effects, right down to the curvature of invisible elements poised to catch the cursor as it moves between buttons, allowing it to swoop at just the right angle. Programmers don’t need to wait for teammates to finish final-draft content: because Unity and Magic Leap's Lumin SDK make prototyping a breeze, code can bring these elements to life from day one, and cross-disciplinary teams can move forward without bottlenecks.

We hope plenty of others begin working with Magic Leap 1 tools like BMS to push the bounds of what’s possible in spatial computing using Unity, so we wanted to share some of the learnings that emerged along the way.


What Worked

Know your data model

Starting on day one, we did a lot of legwork to map Spotify's well-documented classes into C# for the quick serialization we’d need in Unity. We unit-tested back and forth between web, system memory, and hard disk with a variety of instances of each class in isolation. Having the data model buttoned up made a world of difference when it came time to integrate everything into the main scene.
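As a rough illustration of that round-trip pattern, here's a minimal sketch using Unity's built-in JsonUtility. The class and field names are hypothetical, not Spotify's actual schema, and the shipping app's data model isn't published.

```csharp
using System;
using UnityEngine;

// Hypothetical data class; field names are illustrative only.
// JsonUtility requires [Serializable] types with public fields.
[Serializable]
public class TrackData
{
    public string id;
    public string name;
    public int durationMs;
    public string[] artistNames;
}

public static class TrackSerialization
{
    // Web/memory direction: parse a JSON payload into a C# instance.
    public static TrackData FromJson(string json) =>
        JsonUtility.FromJson<TrackData>(json);

    // Memory/disk direction: write the instance back out as JSON text.
    public static string ToJson(TrackData track) =>
        JsonUtility.ToJson(track, prettyPrint: true);
}
```

The serializer itself matters less than the habit: exercise each class's web, memory, and disk round-trips in isolated unit tests before any scene integration.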

Define rules of engagement for all component communication

Unity offers dozens of ways for components to reference each other. Without a plan, we were going to get lost in the complexity, so we established guidelines defining which components could reference each other and why (a minimal code sketch follows the list):

Class Authority: Managers > Controllers > Helpers

  • Manager classes were singletons and could reference each other at will.
  • Manager classes could only create hard references to controller classes in rare cases.
  • Controller classes could only create hard references to helper classes.
  • Controller classes were always at the root, parent level of their own prefab.
  • Helper classes had to be descendants of their controller class and could only hard-reference their parent controller and/or fellow helpers within their prefab.
  • All other communications used a custom event system.
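As a minimal sketch of the manager rule above, a singleton manager might look like this; the class name is invented, not one of the app's actual managers.

```csharp
using UnityEngine;

// Hypothetical manager-singleton; managers reference each other at will
// through static accessors like this one.
public class PlaybackManager : MonoBehaviour
{
    public static PlaybackManager Instance { get; private set; }

    void Awake()
    {
        // Enforce a single instance per scene.
        if (Instance != null && Instance != this)
        {
            Destroy(gameObject);
            return;
        }
        Instance = this;
    }
}
```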

Defining these communication patterns allowed us to minimize hard and serialized references across the app. Most Unity devs agree that hard references are fragile, and the usual alternative, serialized references, is still a hard reference underneath and problematic in its own ways: serialized references are more resilient in a shifting hierarchy, but they don't hold up across modules when chunks of the implementation get swapped out.
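To make the distinction concrete, here's an illustrative contrast between the two reference styles (both class names are invented):

```csharp
using UnityEngine;

public class AlbumArtHelper : MonoBehaviour { }

public class NowPlayingController : MonoBehaviour
{
    // Hard reference: resolved in code, so it's searchable, but it breaks
    // the moment the helper moves elsewhere in the hierarchy.
    AlbumArtHelper hardRef;
    void Awake() => hardRef = GetComponentInChildren<AlbumArtHelper>();

    // Serialized reference: wired in the Inspector, so it survives hierarchy
    // moves, but it silently goes missing when the referenced module is
    // swapped out for another implementation.
    [SerializeField] AlbumArtHelper serializedRef;
}
```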

The Challenges

Utilize searchable events

A custom event system was our answer to the hard-reference woes. Beyond a certain scale, every app needs a nervous system, and the Spotify app’s nervous system is built for fast traversal during debugging. Event triggers mapped in Unity's Inspector proved a roadblock: artists often prefer to control behaviors from the Inspector without writing source code or asking programmers for help, but the result, serialized triggers, slowed down the whole team.

Programmers trying to debug were flummoxed when the event system trail abruptly stopped. "This method’s obviously executing, but what’s calling it?"

Without fail, the culprit was a serialized trigger, usually hidden somewhere in the hierarchy, often two levels removed from the class in question via hard references. During prototyping it seemed a simple choice, with no hint of the pain it would cause the team later.

The bottleneck was a lack of searchable trigger names. With a mix of procedural and serialized events, only the procedural events were searchable. Programmers would chain-search the event system, tracing an error back until the trail stopped. Then they had to stop debugging, find the artist who built the prefab, and interrupt their work. The artist usually referred them to another artist who had created an adjacent prefab containing the hard reference and the mapped trigger hidden behind it... not exactly the ideal workflow.

The team agreed that end-to-end source code trails were faster and less stressful for everyone. With Inspector serialization disallowed in our custom event system, team members could step through trigger chains with clear trails and quick word searches. In turn, artists learned to script triggers rather than linking them in the Inspector; events could be triggered with a single line of code, and our artists were equipped with example snippets. We all got to work more quickly. We plan to improve on this system to enable Inspector serialization again, without losing searchability this time. Meanwhile, we have a system that doesn’t give the team whiplash.
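Our event system itself isn't published, so the following is only a sketch of the general idea, assuming a string-keyed bus (EventBus, Subscribe, and Raise are invented names). Because every trigger is a literal string in source code, a plain text search for the event name finds both ends of any chain.

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of a searchable, string-keyed event bus.
public static class EventBus
{
    static readonly Dictionary<string, Action> listeners =
        new Dictionary<string, Action>();

    public static void Subscribe(string eventName, Action handler)
    {
        if (listeners.TryGetValue(eventName, out var existing))
            listeners[eventName] = existing + handler;
        else
            listeners[eventName] = handler;
    }

    public static void Raise(string eventName)
    {
        if (listeners.TryGetValue(eventName, out var handler))
            handler?.Invoke();
    }
}

// Usage: a single, searchable line of code fires the event.
//   EventBus.Raise("TrackChanged");
```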

Prefab everything!

Modularity is key. In our workflow, a primary scene file was locked for changes for no more than a few minutes at a time. Our rule was to prefab all controller and manager classes. As a result, we rarely changed the primary scenes that were shared across the team: a team member could merge a prefab into its appropriate place in the scene, check the scene in right away, and then work on the prefab outside the scene without blocking everyone else’s contributions.

In Unity, version control choices are either authoritative (checkouts with file locks) or distributed (frequent merge conflicts). Either way, it proved much faster to prevent simultaneous file changes by modularizing scene hierarchies than to use version control tools to manage the fallout.

Zero Iteration is heavenly


Via Zero Iteration (ZI), the Lumin SDK's Unity integration lets devs press Play and instantly see their scene on Magic Leap 1. This workflow is highly effective for iterative changes because it eliminates the need to build an MPK file to load onto the headset with every change. Unfortunately, as of the release of Spotify v1.0, the Background Music Service does not communicate with Unity through the Zero Iteration server, so certain parts of the Spotify app could only be tested with a build. The portions of the interface that functioned in ZI saw the fastest improvements.

What we learned...

Better organization during pre-production


The transition from prototype to final-draft scene components was a bumpy road. It's important to retain agile workflows even as the deliverable version of the app takes shape, but staying agile led to a hodgepodge main scene of finished components sitting beside pre-production components. We finished in the end, but progress was slow until most of the prototype elements had been replaced. You have to make a few messes and learn during pre-production; for the sake of team sanity, though, we should have separated the production-ready components from the in-development mess.

Even though we vigilantly prefabbed everything and distinguished prototype from production components, we wound up integrating them all into one scene. In retrospect, admitting only production-ready components into the production version of the scene would have been faster. Starting the production version of the main scene from scratch would have established a clean slate and prevented a lot of refactoring work.

Establish a universal interaction model up-front

We knew the UI should eventually conform to a single interaction model, but the Control was very young when prototyping started. It was the first 6DOF controller on the market for spatial computing, and it required a lot of experimentation.

Spotify for Magic Leap 1 uses controller raycasting, presenting a white cursor as the hover element. The cursor morphs into different modes based on the targeted element, and in general either the cursor or the hovered element provides a visual cue when the trigger is pressed.
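As a rough sketch of that pattern (not the shipping implementation; the component and tag names here are invented), a raycast from the Control's pose can position the cursor and morph it based on what it hits:

```csharp
using UnityEngine;

// Hypothetical raycast cursor; assumes controlTransform tracks the 6DOF
// Control's pose and that interactive UI elements are tagged "UIButton".
public class RaycastCursor : MonoBehaviour
{
    public Transform controlTransform; // pose of the Control
    public Transform cursorVisual;     // the white cursor object

    void Update()
    {
        // Cast a ray from the Control into the scene.
        var ray = new Ray(controlTransform.position, controlTransform.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, 10f))
        {
            cursorVisual.position = hit.point;

            // Morph the cursor based on the targeted element.
            bool overButton = hit.collider.CompareTag("UIButton");
            cursorVisual.localScale = overButton ? Vector3.one * 1.5f : Vector3.one;
        }
    }
}
```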


Early on, we attempted to match the look and feel of core interactions (hover, activation, drag, etc.) to Spotify's other platforms. It didn’t work: the nuances of spatial computing demand a unique UI. We had to put each element in its own sandbox to learn what felt right, which shrank the window we had at the end to integrate a cohesive model. If we could do it over, we would start by establishing a set of cursor modes that behave consistently across the app.

Where we go from here

Unity is an exciting development platform, and with the Spotify app, we've tested the waters of using a high-end game engine as a toolchain for building something other than games. Our conclusion: Unity is not just a game engine; it's a spatial computing app engine, too. And Spotify is no longer just an app for playing your favorite music; it's now a spatial computing experience.
