Unity MARS provides augmented reality (AR) creators everywhere with specialized tools and a streamlined workflow to deliver responsive, location-aware AR experiences into users’ hands.
Unity MARS is the world’s first authoring solution that brings real-world environment and sensor data into the creative workflow. That means you can quickly build mixed and augmented reality experiences that are context-aware, responsive to physical space, and able to work in any location with any type of data.
From the beginning, we designed Unity MARS to solve the most common pain points across the entire AR development cycle: defining variables, testing scenarios, and delivering AR experiences that intelligently interact with the real world.
1. It solves the “infinite number of variables” problem
AR apps are made to be used in real-world conditions, but it’s notoriously difficult – if not impossible – to manually define all the potential variables your user might encounter when using your app. What physical objects will be in their environment, and where will they be placed? How will the user hold their phone? Will they be sitting or standing?
And even if you know the exact physical site where they’ll be using the app, rooms can be rearranged, and there’s still a multitude of human factors to consider. Unity MARS is unique among AR authoring tools because it lets you account for all of these variables while providing a visual workflow that moves you through the prototyping phase quickly and with very little coding.
To build your app, you begin with proxies that represent real-world objects. With that framework in place, you set conditions and actions on your proxies to tell the app how to respond when matching real-world objects are found.
With visual aids for “fuzzy” authoring, you define minimum and maximum measurements for real-world objects rather than coding precise values.
With Simulation View, you can preview your app as it will run in the real world. Instead of coding, you simply drag your content directly into the view, and Unity MARS creates the appropriate proxies and conditions for you.
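The proxy-and-condition idea above can be sketched in plain code. The following is a minimal, hypothetical Python illustration of “fuzzy” authoring, not the Unity MARS API: a proxy carries a min/max size range rather than an exact value, and is matched against whatever surfaces are detected at runtime.

```python
from dataclasses import dataclass

# Hypothetical stand-ins for MARS concepts: a "proxy" with a fuzzy
# size condition, matched against detected real-world surfaces.
# All names here are illustrative, not Unity MARS API types.

@dataclass
class SizeCondition:
    """Fuzzy condition: a min/max range instead of a precise value."""
    min_size: float  # meters
    max_size: float  # meters

    def matches(self, size: float) -> bool:
        return self.min_size <= size <= self.max_size

@dataclass
class Proxy:
    """Placeholder for a real-world object the app should find."""
    name: str
    condition: SizeCondition

    def try_match(self, detected_sizes):
        # Return the first detected surface that satisfies the condition,
        # or None if nothing in the environment qualifies.
        for size in detected_sizes:
            if self.condition.matches(size):
                return size
        return None

# A "table" proxy that accepts any surface between 0.5 m and 2.0 m wide.
table = Proxy("table", SizeCondition(min_size=0.5, max_size=2.0))
print(table.try_match([0.3, 1.2, 3.0]))  # → 1.2
```

Because the condition is a range, the same proxy matches a coffee table in one user’s home and a desk in another’s, which is the point of authoring fuzzily instead of hard-coding exact measurements.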
Customizable building blocks
To help you get started, we’ve provided Starter Templates, which cover popular AR use cases including a training tutorial application that works with all of our indoor and outdoor environment templates. And we’ll be adding more soon.
2. It seriously reduces the amount of time required to test your app
If you’ve built an AR app before, you know how difficult it is to test on a wide range of devices and in a multitude of locations. Even if you have a specific place in mind, like an event space, you may not be able to test it thoroughly beforehand, given variables like crowds and weather. In short, it’s impossible to test an AR app for every possible user reality. Since we’re not able to bend the laws of time and space, we went for the next best thing – the ability to fully test your AR experience without leaving Unity MARS.
Full testing in the authoring environment
The Simulation View provides environment templates that simulate data so you can test your AR experiences against a variety of indoor and outdoor spaces. That means you don’t need real-world data on hand, and you don’t have to physically test the experience everywhere you want it to work. You can also model your own simulation environments or use photogrammetry scan data.
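The testing idea described above can be sketched simply: the same app logic runs unchanged against synthetic environment data instead of live device data. This is a hypothetical Python illustration of the concept, not the Unity MARS simulation system; the environment names and surface records are invented for the example.

```python
# App logic under test: pick the first surface large enough for content.
def find_placement(surfaces, min_area=1.0):
    for s in surfaces:
        if s["area"] >= min_area:
            return s["name"]
    return None

# Synthetic "environment templates": lists of surfaces that stand in
# for scan data from different rooms and spaces (illustrative values).
environments = {
    "small_office": [{"name": "desk", "area": 0.8}],
    "warehouse": [{"name": "crate", "area": 0.5},
                  {"name": "floor", "area": 40.0}],
}

# Exercise the same logic against every simulated environment.
for name, surfaces in environments.items():
    print(name, "->", find_placement(surfaces))
# → small_office -> None
#   warehouse -> floor
```

Swapping in a new environment template exercises the experience against a different space without leaving the editor, which is what makes simulation-based testing faster than visiting each location.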
3. It ensures digital content reacts to the physical world in a believable way
Once you’ve built and tested your AR experience in the Unity MARS authoring environment, you need to make sure it will react intelligently whenever and wherever the end-user interacts with it. Unity MARS enables that. Its runtime logic adapts responsively to the real world, which is especially important for training and remote-guidance apps that must “understand” where physical objects are located.
It intelligently responds to the physical world
You can use any type of real-world data in your app, including surfaces, images, body-tracking (coming soon), and more. The always-on query system gives your app contextually relevant behavior based on the user’s surroundings.
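The always-on query idea can be sketched as a loop that re-evaluates registered queries each frame as new real-world data arrives, firing a callback when a match appears. This is a hedged conceptual sketch in Python; the class and parameter names are hypothetical, not the Unity MARS query API.

```python
# Hypothetical always-on query system: queries are registered once,
# then re-evaluated every update against the latest world data.

class QuerySystem:
    def __init__(self):
        self.queries = []  # list of (predicate, on_match) pairs

    def register(self, predicate, on_match):
        """Register a query: a predicate plus a callback to run on matches."""
        self.queries.append((predicate, on_match))

    def update(self, world_data):
        """Called every frame with the latest sensor/environment data."""
        for predicate, on_match in self.queries:
            for item in world_data:
                if predicate(item):
                    on_match(item)

matches = []
qs = QuerySystem()
# Query: react whenever a horizontal plane larger than 1 m² is found.
qs.register(
    predicate=lambda item: item["type"] == "plane" and item["area"] > 1.0,
    on_match=matches.append,
)
qs.update([{"type": "plane", "area": 0.4},
           {"type": "plane", "area": 2.5}])
print(matches)  # → [{'type': 'plane', 'area': 2.5}]
```

Because queries stay registered across updates, content can appear, move, or adapt as the user’s surroundings change, which is what gives the app contextually relevant behavior.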
By addressing the toughest challenges in each phase of AR app development, Unity MARS gives creators the ability to finally deliver AR experiences that live up to the end user’s expectations: digital content that seems to live in and react to the real world.
Early adopters of Unity MARS
While developing Unity MARS, we engaged with a number of innovative studios eager to get their hands on this new technology. One of them is Sugar Creative, a leading UK-based studio recognized for their cutting-edge AR and VR experiences.
In partnership with Dr. Seuss Enterprises, Sugar Creative used Unity MARS to create Dr. Seuss’s ABC AR, an app that enhances how children learn to read by bringing the Dr. Seuss characters to life. Will Humphrey, their lead creative and studio manager, had this to say: “Unity MARS has been the toolkit that has allowed us to realize a new horizon, a shift in the potential of immersive experiences by enabling them to become truly dynamic. Put simply, Unity MARS is adding intelligence to AR.”
Other developers working with early versions of Unity MARS have created a variety of AR applications such as sales and marketing experiences for an auto showroom, and a training application for factory workers. Regardless of the use case, they all concur that Unity MARS is ushering in the next generation of AR content by giving them more creative freedom and flexibility.
Get started on your own responsive AR experiences
In addition to the features and benefits described above, Unity MARS leverages our Auggie-award-winning AR Foundation framework to enable you to build an experience once in Unity and deploy it across multiple mobile and wearable AR devices. This authoring workflow not only fundamentally changes how you create AR experiences, but also elevates the quality of the experiences you deliver.
You can try Unity MARS for free for 45 days.
For more information about Unity MARS
Source: Unity Technologies Blog