First Person Slingshot has landed and this time, it's spatial.
Resolution Games and Rovio Entertainment made the feathered franchise crash through our screens and bounce around living rooms with Angry Birds FPS: First Person Slingshot, available now through Magic Leap World.
To discover the magic behind the mayhem, we asked Resolution Games CTO Martin Vilcans ten questions about their design and development process.
What was the creative process in developing the Angry Birds experience for Magic Leap?
Some of the game design was already set when we decided on making a slingshot game. Of course we'd have Red and other well-known characters, as well as the player's goal (destroy the pigs). The challenge for us was how to make it in 3D, and not only that, a 3D game where the player can move around freely in the world and shoot from any angle.
With that, we first discussed different ways we could use the Angry Birds characters, structures and world so that they would fit naturally into someone's environment, whether a bedroom, a living room or even a kitchen. We thought about it as something tangible, right in front of you, but also part of that environment. We then expanded on that thinking to include the limitless possibilities of augmented reality gameplay and visuals, to give players an experience they could never have on a 2D screen or in the real world. After we made a prototype of a slingshot game for a VR headset, we saw that the classic Angry Birds gameplay transferred well to 3D. Allowing the player to move around freely in the world adds another dimension to the game, and we expected augmented reality to add even more. At that point we had not yet received a Magic Leap headset, but we were confident that we could make a good slingshot game for it and dove right in.
Describe the user experience for Angry Birds. How will Magic Leap One enhance the experience for such a notable franchise?
The basics of the gameplay should be familiar to anyone who has played Angry Birds before. The difference in this version is that you can put the game on your kitchen table, for example, and see the building blocks fall to the floor when you hit the pigs' constructions. The platform truly makes it feel like the overlaid content is there. You can also use the environment around you, bouncing your shots off a wall next to you, say, or earning more points for shooting from further back. You can move around the pigs' city to see it from different angles, and even get close to the pigs and the birds, who will acknowledge you're there by looking at you and waving. All of this makes the experience feel less like a video game and more like something that is actually happening right there in your home. It's an entirely new and more engaging way to interact with the game and characters Angry Birds fans love.
What was your approach to developing for mixed reality/spatial computing?
In the beginning, we didn't have a Magic Leap headset, so we started developing the game in VR. To simulate reality we used a scan of a real room (by 9of9, see Sketchfab). We could use the Magic Leap SDK to get to know the APIs, but without a real headset, we postponed some decisions until we could try out how well our ideas would work on the actual device. Once we got the headset, we integrated elements so that the virtual objects could collide with your physical furniture, walls, floor, etc. We wanted to integrate the physical environment just as much as the overlaid content. It was a balancing act that took lots of testing and iterating to find the right amount of each.
How did you approach the process of taking a 2D, scrolling game experience and translate it into a 3D, fully immersive, spatial computing experience? What were a few creative challenges?
Not all game concepts translate well from 2D to 3D, but for this game, it was surprisingly straightforward. One difference from the 2D game is that in the Magic Leap version you can place the slingshot wherever you want. We did this to encourage the player to move around, which makes the game feel more real, more "augmented" so to speak, and achieves that balance of using the physical space within the gameplay. But that created a challenge for us, as the levels had to be designed with this in mind, which took fine-tuning to feel just right.
Were there any design changes you had to make to take into account the environment around you (versus VR experiences that plant the user into a new environment)?
This was one of the most incredible and yet challenging parts of working with Magic Leap. For instance, the graphics design was challenging. Artists have to give up control of the environment. You don't know if the game will be played in a concrete garage or a ballroom. We had to imagine the game as a physical set of blocks that you could play with anywhere and still feel natural and true to the IP.
On another level, we want the objects in the game to interact with the real world. For example, you can shoot a bird into one of your real walls and it will bounce off. But we don't want the level to collapse if you place it on an uneven surface, and placing it under something protruding from your wall, or near other real-world objects, would ruin the game by destroying the level for you. We solved this by only letting the player put the game on an even surface, and by ignoring real-world objects that are too close to the level.
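The placement rules described above can be sketched in a few lines. This is an illustrative Python sketch, not the game's actual code: `is_flat_enough` checks that sampled surface points lie close to a horizontal plane, and `clear_of_obstacles` rejects placements where real-world geometry intrudes into the play volume. All function names and tolerances here are assumptions for the example.

```python
import math

def is_flat_enough(points, max_deviation=0.02):
    """Check whether sampled surface points (x, y, z with y up) lie
    close enough to a horizontal plane to host the level.
    The 2 cm tolerance is illustrative."""
    heights = [p[1] for p in points]
    avg = sum(heights) / len(heights)
    return max(abs(h - avg) for h in heights) <= max_deviation

def clear_of_obstacles(level_center, level_radius, obstacles, margin=0.1):
    """Reject placements where a detected real-world object sits inside
    the level's bounding sphere plus a safety margin (in meters)."""
    for obstacle in obstacles:
        if math.dist(level_center, obstacle) < level_radius + margin:
            return False
    return True
```

In the game itself, obstacles too close to the level are simply ignored for physics purposes, which the interview notes is a deliberate choice to keep distant walls interactive while protecting the level.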
What is the difference you’ve seen in developing for Magic Leap vs. other AR/MR platforms you might have used in the past?
With a new platform you have to take some time to get to know the device and its tools, but we use Unity, which supports most AR/MR platforms, including Magic Leap, so we did not have to learn a completely new engine to develop the game, which was incredibly helpful. That said, it's clear that Magic Leap has put a lot of thought into making the platform accessible to all creators.
Tell us more about how you used Unity’s platform to make the transition from other mediums to mixed reality.
One strength of Unity is how it provides a common development environment for many different target platforms. Before we received access to the Magic Leap SDK, we could play the game on VR devices. To simulate what it would look like on the Magic Leap, we added a virtual room and also changed the field of view of the virtual display. Once we got the real device, we changed the build settings to make a version for Magic Leap. Of course, for any new platform we have to spend time making sure the game feels right on that specific platform, but Unity lets us focus on fine-tuning instead of worrying too much about technicalities such as SDKs and build environments.
What potential do you see in developing other games for the Magic Leap platform?
It is an exciting new world that is opening up for game developers. As everything is new, we have to experiment a lot to find what works in AR. However, we are super excited about the potential of how this can change the way people look at gameplay and add a whole new dimension for not only developers but players. We are thrilled to be a part of that discovery process.
What do you think about immersive technologies as a whole? Are other immersive technologies an important part of your business moving forward?
It's great that computers are getting more integrated with the real world. We are not virtual beings, after all. There has been a progression with computers moving from being stationary, through laptops, to being something you carry in your pocket. Each step integrates them further into the world around us. Head-mounted AR will at some point replace the mobile phone as the device just about everybody uses for hours every day. Before that, we think handheld AR on mobile phones is an interesting step in that direction. We also work with traditional VR (to the extent that "traditional" can describe something still taking huge strides towards its full potential). Each of these technologies captures some part of the possibilities of visual computing, and in the end we will probably have a single device that can do everything.
What are some recommendations for other developers looking to create gaming experiences for mixed reality?
Maybe you have had an idea for a game that you at first thought would be great, but later realized that the limitations of the input device and the display didn't do the idea justice? Mixed reality opens many of those ideas back up: for example, many ideas for puzzle games in 3D are finally possible to realize in a user-friendly way. You can also consider how you could use the physical space around the player as an essential part of the game. Maybe let the player design levels by moving physical objects around? But don't let your imagination run completely wild: know the characteristics of the device you are developing for.