Alongside our enterprise and SDK announcements, we’re releasing LuminOS 0.98, the latest version of our spatial computing operating system, and announcing internationalization and localization for Spain and Italy.
Today, we announced Magic Leap 1, an updated version of Magic Leap One Creator Edition, and launched our first suite of software and services designed specifically for our enterprise partners. Our Enterprise Suite brings Magic Leap 1 together with features like Device Manager, Rapid Replace, enterprise-level support, and an extended warranty for enterprise customers and developers. Alongside this announcement, we’re releasing LuminOS 0.98, our seventh software update since launching Magic Leap One Creator Edition in August 2018. Customers who purchased our Professional Developer package will receive early access to the LuminOS 0.98 update. The release is also available for direct download on our Developer Portal. Over-the-air (OTA) updates for LuminOS 0.98 start rolling out Thursday, December 12th and will continue through the remainder of the week.
We’re also releasing LuminSDK 0.23, along with “The Lab,” a consolidated set of developer tools that replaces our package manager. If you’re a developer and interested in what’s now possible with the latest version of our SDK, check out our SDK blog post here.
Our community of developers, consumers, and enterprise customers is growing globally. Earlier this year, we expanded internationally to the UK, Germany, and France. Now, we’re announcing internationalization and localization for Spain and Italy. If you haven’t yet, we recommend you take the time to update your home country on your Magic Leap ID. This ensures you’ll see the most relevant apps available in your region. You can find instructions on how to do that here.
New User Features in LuminOS 0.98
With this update to LuminOS, we’ve focused on adding new features and improving the overall user experience for Magic Leap. Historically, we’ve seen some confusion around entering and exiting an application, so we’ve created a new set of design requirements to make this more straightforward for developers to implement and for users to understand. Similarly, six degrees of freedom (6DoF: the ability to use the Control to freely move and rotate digital content in space) is now consistent across the OS, with new design guidelines.
We’ve also developed Landscape Manager, a central location for users to manage their running applications, content, and maps. With Landscape Manager, you can see which applications are running in the background, share landscape apps with others in a multi-user session, summon apps closer to you, and close apps you no longer need.
Map creation and management helps guide users in mapping a new space for the first time, allowing them to use apps with persistent content and multi-user sharing experiences. Relocalization for Content Persistence guides users in quickly and efficiently relocalizing into an existing map to restore landscape content.
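To make the relocalization idea concrete, here is a deliberately simplified sketch. The real system matches visual features against stored spatial maps and estimates a full device pose; this toy version just compares a descriptor vector for the current view against stored map descriptors by cosine similarity and picks the best match above a threshold. All names (`best_map`, `min_score`, the map IDs) are illustrative inventions, not Lumin APIs.

```python
import math

def best_map(current_descriptor, stored_maps, min_score=0.8):
    """Pick the stored map whose descriptor best matches the current view.

    current_descriptor: list of floats summarizing what the device sees now.
    stored_maps: dict mapping map_id -> descriptor (list of floats).
    Returns the best-matching map_id, or None if nothing clears min_score.
    """
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    best_id, best_score = None, min_score
    for map_id, descriptor in stored_maps.items():
        score = cosine(current_descriptor, descriptor)
        if score > best_score:
            best_id, best_score = map_id, score
    return best_id

# Illustrative usage: two stored rooms, then a view that resembles one of them.
rooms = {"living_room": [1.0, 0.0], "office": [0.0, 1.0]}
print(best_map([0.9, 0.1], rooms))  # matches "living_room"
```

Once a map is selected, a real pipeline would refine the device pose within it so that previously placed landscape content reappears in the right physical spot.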
We’ve also added tips and hints to help you understand unknown elements and features that aren’t explicitly described in the UI. These are intended to provide guidance and information when you need it, in the contexts in which they’re most useful.
You can now stream directly from your Magic Leap device to Twitch. After logging into your Twitch account in your device settings, press the Home and Bumper buttons at the same time to bring up your capture options, then select Twitch to start streaming.
As spatial computing evolves, we’ve introduced new modalities for controlling and manipulating digital content, from our 6DoF controller to hand tracking. We’re adding Speech input to LuminOS so you can use natural, conversational language to perform certain tasks (U.S. English only, for now). Speech is turned off by default, but you can enable it on the Settings > System > Speech page. Just say “Hey, Lumin” to get started, then ask to open an app, for example: “Open Helio.” You can also move and place prisms by looking at the prism you’d like to move and saying “Move this…”, then looking at the location you’d like to put it and saying “Put it there.” You can see the full list of voice commands in this forum post.
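The “Move this… / Put it there” flow above is a classic two-step multimodal interaction: speech supplies the verb, gaze supplies the object and destination. The sketch below models only that control flow, with hypothetical stand-ins for gaze data; it is not the Lumin implementation or API.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class GazeTarget:
    """Hypothetical stand-in for real gaze data at the moment speech lands."""
    prism_id: Optional[str]               # prism under the gaze cursor, if any
    position: Tuple[float, float, float]  # gaze hit point in world space

class PutThatThere:
    """Tiny state machine for the two-step move command."""
    def __init__(self):
        self.pending_prism = None  # prism selected by "Move this"
        self.moves = []            # (prism_id, destination) pairs, for illustration

    def on_speech(self, phrase: str, gaze: GazeTarget) -> str:
        phrase = phrase.strip().lower()
        if phrase == "move this":
            if gaze.prism_id is None:
                return "no prism under gaze"
            self.pending_prism = gaze.prism_id
            return "selected " + gaze.prism_id
        if phrase == "put it there":
            if self.pending_prism is None:
                return "nothing selected"
            self.moves.append((self.pending_prism, gaze.position))
            moved, self.pending_prism = self.pending_prism, None
            return "moved " + moved
        return "unrecognized command"

# Illustrative usage: select a prism by gaze, then drop it at a new gaze point.
pt = PutThatThere()
pt.on_speech("Move this", GazeTarget("helio", (0.0, 0.0, 1.0)))
pt.on_speech("Put it there", GazeTarget(None, (1.0, 0.0, 2.0)))
print(pt.moves)  # [('helio', (1.0, 0.0, 2.0))]
```

The key design point is that each utterance is interpreted against the gaze state captured at that instant, which is what lets two short phrases replace a drag gesture.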
Perception technologies are absolutely crucial to creating a true spatial computing platform. That’s why we’re constantly working on ways to make our device interact more seamlessly with you and the world around you. With this update, we’ve enabled Iris ID, which means you can unlock your device simply by putting it on. We’ve introduced new algorithms that improve head-pose tracking by 30%, making it more robust and accurate. Overall, we’ve radically improved the performance of our perception stack, with better thermal and compute utilization for improved comfort and features.
Over the past year, we’ve focused on improving the way gestures (meaning the way that our device tracks your hands) allow you to interact with digital content and your environment at the same time. We’ve increased the number of keypoints we’re tracking, so our device can now recognize the full hand skeleton and occlude, or hide, digital content behind your hands and fingers when you interact with it. And in this release, we’ve made further improvements and added a new feature that detects when a hand is holding an item, like the Control, so you can use the Control and your hand at the same time. This means that, when playing games like Dr. Grordbort’s Invaders, you’ll have a better experience because you can use the Control as a grappling hook and your hand as a shield. We’ve also reduced the latency of our gesture pipeline.
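Occluding digital content behind your hand boils down to a per-pixel depth comparison: if a tracked part of the hand is closer to the headset than the content along the same line of sight, the content is hidden there. This minimal sketch reduces that idea to comparing keypoint depths against a single content depth; the function name, the margin parameter, and the scalar-depth simplification are all illustrative assumptions, not the actual rendering pipeline.

```python
def hand_occludes(keypoint_depths_m, content_depth_m, margin_m=0.01):
    """Return True if any tracked hand keypoint sits in front of the content.

    keypoint_depths_m: distances (meters) from the headset to each tracked
        hand keypoint that projects onto the content's screen region.
    content_depth_m: distance (meters) from the headset to the digital content.
    margin_m: small tolerance so depth noise near the surface doesn't flicker.
    """
    return any(d + margin_m < content_depth_m for d in keypoint_depths_m)

# Illustrative usage: a hand at ~0.4-0.5 m reaching in front of content at 0.6 m.
print(hand_occludes([0.4, 0.5], 0.6))  # True: the hand should hide the content
print(hand_occludes([0.7, 0.8], 0.6))  # False: the hand is behind the content
```

In practice this test runs per fragment against a hand mesh fitted to the full skeleton, but the closer-wins depth comparison is the same.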
In order for spatial computing to truly merge the physical and the digital, gestures must be seamless. You need to be able to reach for a digital object and feel that it responds just as a real-world object would. By making constant improvements to the way we handle gestures, and by introducing speech input so you can say “put that there” and watch digital content move, we’re working towards a platform where you eventually won’t need a Control at all. We’ve been listening to user and developer feedback as we evolve the way gestures work, so that we’re building for the way people actually use the device rather than delivering arbitrary features. With this feedback, we’re inventing the right multi-modal interactions for spatial computing, and we’ll have more gesture improvements in the next update.
Since introducing the concept of the Magicverse at L.E.A.P. in 2018, we’ve been working to develop a system of systems that brings the physical and digital worlds together as one. The possibilities for developers, enterprise, and consumers are almost unimaginable, and we’re currently testing the first set of Magicverse features with partners that will enable developers to create multi-user XR experiences on Magic Leap devices, Android, and iOS. We’ve already deployed a Magicverse instance at our headquarters where you can be taken on a virtual tour through your mobile or Magic Leap device, and some of the mapping functionality needed for these experiences is now live in 0.98. This foundational infrastructure will empower developers to create massive, multi-user, online experiences for consumers and businesses, where you can create and publish digital content and experiences to physical spaces. You’ll be able to engage with that content whether you’re physically present, digitally co-present, or both—across an array of devices.
Next year, we plan on delivering two separate updates to LuminOS. We’re also looking forward to showing you advancements in our enterprise offering; our collaboration, copresence, and communication features; and the beginnings of the Magicverse.
Last but not least, I would like to thank our community of users and developers for all their feedback and input. Please keep it coming.
Yannick Pellet, SVP of Software