Based in Silicon Valley, automation technology company Ike is working in partnership with the trucking industry to automate long-distance freight transportation. The team, made up of veterans of the self-driving industry—alumni of companies including Uber, Waymo, Tesla, Apple, and Cruise—shares a common belief that moving goods instead of people massively simplifies the technical challenges for automated vehicles. With decades of experience in robotics, the team is confident that it can bring a safe, scalable solution to market. The key remaining challenge is proving that safety, as Simulation Lead Pete Melick explains. “The rates of difficult events that you encounter driving many miles over the highway are very low,” he says. “To prove with statistical confidence that your truck is going to be resilient in any different combination of extreme things that could happen—but don’t happen very often to normal trucks—is quite difficult. It would require driving literally tens of millions of miles.”

A hybrid approach to simulation

Instead, the team elected to use simulation as its primary validation tool. There are two types of simulation for autonomous driving, each with its own pros and cons. By using both, Ike gets the best of both worlds.

Log simulation means feeding data from real driving into the automation system. The advantage of using real driving data is that it carries all of the nuanced imperfections of real sensors and real interactions; the disadvantage is that closed-loop control is not possible. In other words, the scene recorded in a real-world log can’t change in response to the behavior of the simulated vehicle. If the automated truck decides to drive slower, other vehicles in the simulation will not react to that change, and the simulation loses much of its value.

A log sim where the simulated vehicle’s behavior (blue) diverges from that of the logged vehicle (red)
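
In code terms, a log sim is essentially an open-loop replay. The following minimal sketch uses hypothetical stand-in types rather than Ike’s actual stack, but it shows where the limitation comes from:

```cpp
// Minimal sketch of open-loop log replay; all types here are hypothetical
// stand-ins, not Ike's actual stack.
#include <vector>

struct LoggedMessage
{
    double Timestamp = 0.0; // recorded sensor/actor payload would live here
};

struct AutomationStack
{
    void Consume(const LoggedMessage& Msg) { (void)Msg; /* feed perception */ }
    void Step() { /* plan and emit steering/throttle/brake commands */ }
};

// Messages are replayed in recorded order. The stack may command a slower
// speed than the logged truck drove, but the message stream is fixed, so
// the recorded traffic can never react: this is the closed-loop gap.
void ReplayLog(const std::vector<LoggedMessage>& Log, AutomationStack& Stack)
{
    for (const LoggedMessage& Msg : Log)
    {
        Stack.Consume(Msg);
        Stack.Step();
    }
}
```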

Virtual simulation uses fabricated scenarios and responsive Actors, like a video game. Unlike log simulation, other vehicles in a virtual scenario can respond to behavior from the automation system. The unit of a virtual sim is called a scenario, a specific event that the truck might encounter while driving on the highway. Virtual simulation unlocks the possibility of procedurally varying the parameters that define a scenario to create many similar but distinct scenarios, each of which is known as a variation. Variations enable Ike to test its automation software in a vast array of possible circumstances. 
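As a concrete illustration, variations might be generated by sweeping the handful of parameters that define a scenario. The scenario type, parameter names, and ranges below are illustrative assumptions, not Ike’s definitions:

```cpp
// Hedged sketch of generating variations by sweeping the parameters that
// define one scenario. Scenario type, parameters, and ranges are illustrative.
#include <random>
#include <vector>

struct CutInScenario
{
    double InitialGapMeters;  // gap when the car begins its cut-in
    double CarSpeedMps;       // cutting-in vehicle's speed
    double TruckSpeedMps;     // simulated truck's starting speed
};

std::vector<CutInScenario> MakeVariations(int Count, unsigned Seed)
{
    std::mt19937 Rng(Seed); // seeded, so the same sweep is reproducible
    std::uniform_real_distribution<double> Gap(5.0, 60.0);
    std::uniform_real_distribution<double> CarV(15.0, 30.0);
    std::uniform_real_distribution<double> TruckV(20.0, 28.0);

    std::vector<CutInScenario> Variations;
    Variations.reserve(Count);
    for (int i = 0; i < Count; ++i)
    {
        Variations.push_back({Gap(Rng), CarV(Rng), TruckV(Rng)});
    }
    return Variations;
}
```

Each element of the returned list is one variation: the same scenario, run under a slightly different combination of circumstances.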

Ike had a head start in its simulation development, having previously acquired a code base from another automated driving company, Nuro. The result of two years of development effort, the code base constituted what Melick describes as a world-class infrastructure for robotics, including essential technologies like visualization, mapping, on-board infrastructure, cloud infrastructure, machine learning, labeling pipelines, and log simulation. The team took this as its starting point and then added its members’ own expertise in the domain.

Designing scenarios in Unreal Engine

To develop its virtual simulation tool, the team turned to Unreal Engine, spending a year extending the many relevant out-of-the-box features for its specific needs. 

Melick explains that while everyone working on virtual simulation recognizes the benefits of closed-loop control and variations, not everyone is implementing them in the same way. 

“The way that we have differentiated ourselves is the level to which we have embraced the game engine as a key component of our virtual simulation tool, as opposed to some other companies who are using a game engine—maybe even using Unreal Engine—but are kind of keeping it at arm’s length and using it as just an image generator in conjunction with another simulation tool,” he says.

As a first step in that process, Ike customized the Unreal Engine Level Editor to be its scenario design tool. 
Ike’s trucks calculate their position in the world using high-definition maps, consisting of LiDAR intensity and elevation data, which is collected and processed into lane centers and boundaries. That same map data is streamed into Unreal Engine using the Landscape API, so that the team can design and run their scenarios on it.

Designing a scenario in the Unreal Editor on Ike’s maps
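
One concrete piece of such a pipeline can be sketched: Unreal stores landscape height as 16-bit samples centered at 32768, where one unit is 1/128 of the landscape’s Z scale (see LandscapeDataAccess.h). The article doesn’t describe Ike’s streaming plumbing, so this hedged sketch shows only the unit conversion a pipeline like theirs would need:

```cpp
// Hedged sketch: packing elevation samples into the 16-bit heightmap format
// Unreal landscapes use (32768 is local Z = 0; one unit is 1/128 of the
// landscape's Z scale). How Ike streams tiles through the Landscape API is
// not public; only the conversion itself is shown here.
#include <algorithm>
#include <cstdint>
#include <vector>

uint16_t PackHeight(float LocalZ, float ZScale)
{
    const float Units = (LocalZ / ZScale) * 128.0f + 32768.0f;
    return static_cast<uint16_t>(std::clamp(Units, 0.0f, 65535.0f));
}

std::vector<uint16_t> PackTile(const std::vector<float>& Elevations, float ZScale)
{
    std::vector<uint16_t> Heightmap;
    Heightmap.reserve(Elevations.size());
    for (float Z : Elevations)
    {
        Heightmap.push_back(PackHeight(Z, ZScale));
    }
    return Heightmap;
}
```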

The automation system requires higher-resolution map data than is easily found in open-source data formats; to capture the necessary data, Ike uses a special mapping vehicle fitted out with two LiDAR scanners and physically drives it down the highway. This makes the company completely self-sufficient, giving it the power to simulate anywhere it can drive its mapping vehicle.

Once the maps are imported, most of the building blocks for scenario design are available out of the box: triggers based on time or distance, splines for Actors to follow, an efficient environmental query system, a fully featured and customizable GUI, and a scripting language for designing arbitrarily complex choreographies.
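
As a hedged illustration of one of those building blocks, a distance-based trigger can be written as an ordinary Unreal Actor. The class, properties, and event below are hypothetical, not Ike’s:

```cpp
// ScenarioDistanceTrigger.h -- hedged sketch of a distance-based trigger,
// in the spirit of the out-of-the-box pieces above. Names are hypothetical.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "ScenarioDistanceTrigger.generated.h"

DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnTriggered);

UCLASS()
class AScenarioDistanceTrigger : public AActor
{
    GENERATED_BODY()

public:
    AScenarioDistanceTrigger() { PrimaryActorTick.bCanEverTick = true; }

    UPROPERTY(EditAnywhere) AActor* Truck = nullptr; // the simulated ego truck
    UPROPERTY(EditAnywhere) float RangeMeters = 50.f;

    // Designers bind choreography (e.g. "start the cut-in") to this event.
    UPROPERTY(BlueprintAssignable) FOnTriggered OnTriggered;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        // Unreal units are centimeters, hence the factor of 100.
        if (!bFired && Truck &&
            FVector::Dist(Truck->GetActorLocation(), GetActorLocation()) <= RangeMeters * 100.f)
        {
            bFired = true;
            OnTriggered.Broadcast();
        }
    }

private:
    bool bFired = false;
};
```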

“Game engines are tools for building living, breathing worlds—levels in which the player makes decisions and the world reacts realistically,” says Melick. “That’s all a scenario is, except the player is a robot.”

How should that robot control its avatar in the simulator? And how should the simulator provide the necessary inputs to the robot? Ike’s automation software consists of code modules, which communicate with each other using Google Protocol Buffer messages. The team has made some small modifications to the engine to enable it to send and receive those same messages. Any Actor or Component in the simulator can publish or subscribe to any message, just like the onboard software modules. 

A scenario playing in Ike’s log viewer (left) and simulator (right). Animation courtesy of Ike.

In the current setup, the simulator publishes mock object detections, which are fed to the tracking software, and subscribes to steering, throttle, and brake commands, which control the motion of the simulated vehicle.
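
A minimal sketch of that publish/subscribe wiring is shown below. The protobuf serialization call is the library’s real API; the bus itself and how topics map to modules are illustrative assumptions:

```cpp
// Hedged sketch of the publish/subscribe idea. Msg.SerializeToString() is
// the real protobuf API; the bus and topic scheme are hypothetical
// stand-ins for Ike's infrastructure.
#include <functional>
#include <map>
#include <string>
#include <vector>

#include <google/protobuf/message.h>

class MessageBus
{
public:
    using Handler = std::function<void(const std::string& /*WireBytes*/)>;

    void Subscribe(const std::string& Topic, Handler Fn)
    {
        Subscribers[Topic].push_back(std::move(Fn));
    }

    // Any simulator Actor or Component can publish a protobuf message;
    // subscribers parse the wire bytes back into their own typed copy,
    // exactly as the onboard software modules would.
    void Publish(const std::string& Topic, const google::protobuf::Message& Msg)
    {
        std::string WireBytes;
        Msg.SerializeToString(&WireBytes);
        for (const Handler& Fn : Subscribers[Topic])
        {
            Fn(WireBytes);
        }
    }

private:
    std::map<std::string, std::vector<Handler>> Subscribers;
};
```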

Ike has also used Unreal Engine’s AI Perception system to add intelligent behaviors to its simulated agents. For example, while following a mapped lane or a predetermined spline, an agent can detect an obstacle in its path and use a speed controller based on the Intelligent Driver Model (IDM) to avoid a collision.
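
IDM is a published car-following model, so its standard form can be shown directly. This is a textbook implementation with typical default parameters, not Ike’s tuned controller:

```cpp
// Standard Intelligent Driver Model (IDM) acceleration. Parameter values
// are typical defaults from the literature, not Ike's tuning.
#include <algorithm>
#include <cmath>

struct IdmParams
{
    double DesiredSpeed = 26.0; // v0, m/s (roughly highway speed)
    double TimeHeadway  = 1.5;  // T, s
    double MinGap       = 2.0;  // s0, m
    double MaxAccel     = 1.0;  // a, m/s^2
    double ComfortBrake = 1.5;  // b, m/s^2
    double Exponent     = 4.0;  // delta
};

// Commanded acceleration given the agent's speed, the gap to the obstacle
// ahead, and the closing speed (agent speed minus obstacle speed).
double IdmAcceleration(const IdmParams& P, double Speed, double Gap, double ClosingSpeed)
{
    const double Dynamic = Speed * P.TimeHeadway
        + (Speed * ClosingSpeed) / (2.0 * std::sqrt(P.MaxAccel * P.ComfortBrake));
    const double DesiredGap = P.MinGap + std::max(0.0, Dynamic);

    const double FreeRoadTerm    = std::pow(Speed / P.DesiredSpeed, P.Exponent);
    const double InteractionTerm = std::pow(DesiredGap / std::max(Gap, 0.1), 2.0);
    return P.MaxAccel * (1.0 - FreeRoadTerm - InteractionTerm);
}
```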

Harnessing the power of Blueprint

To enable designers to extend the range of scenarios, the team exposes functionality to Blueprint, Unreal Engine’s visual scripting system.

“We know that as fast as we work, we can never outpace the imaginations of our scenario designers,” says Melick. “We need to help them create scenarios in efficient and repeatable ways.” 

Using Blueprint, designers create new behaviors and choreography. For example, they can make an Actor weave left and right about its lane with a parameterized period and amplitude. Or they can add simulated noise to the detections fed to the autonomy software to test its sensitivity to imperfect inputs. They can even use Blueprint to create a keyboard-controlled Actor to interact with the simulated truck—all without writing a line of C++.

Adding noise to simulated autonomy inputs in Blueprint
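
Exposing such helpers follows standard Unreal practice. A hedged sketch of the C++ side, using the weaving example; the library and function names are hypothetical:

```cpp
// ScenarioHelpers.h -- hedged sketch of exposing a weave helper to
// Blueprint. The class and function are hypothetical, modeled on the
// weaving example above.
#pragma once

#include "CoreMinimal.h"
#include "Kismet/BlueprintFunctionLibrary.h"
#include "ScenarioHelpers.generated.h"

UCLASS()
class UScenarioHelpers : public UBlueprintFunctionLibrary
{
    GENERATED_BODY()

public:
    // Sinusoidal lateral offset from lane center: a designer drives an
    // Actor's position with this each tick to weave about its lane.
    UFUNCTION(BlueprintPure, Category = "Scenario")
    static float LaneWeaveOffset(float TimeSeconds, float PeriodSeconds, float AmplitudeMeters)
    {
        const float SafePeriod = FMath::Max(PeriodSeconds, KINDA_SMALL_NUMBER);
        return AmplitudeMeters * FMath::Sin(2.f * PI * TimeSeconds / SafePeriod);
    }
};
```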

Blueprint is also the key to Ike’s variations system. A designer adds a variable parameter by adding a Component to an Actor and implementing a Blueprint function that defines the effect of varying that parameter. “In this way, we can vary just about any property of a scenario,” says Melick. “We can vary the position, orientation, speed, or size of any Actor. We can vary the target an Actor wants to drive towards, their desired following distance, or how aggressively they’ll change lanes. If it can be expressed in Blueprint, it can be varied.”
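
Modeled on that description, a variation Component could look something like the following hedged sketch, where the sampled value is handed to a Blueprint-implemented event; all names are hypothetical:

```cpp
// VariationComponent.h -- hedged sketch modeled on the description above;
// class and function names are hypothetical. C++ samples a value for each
// variation; what that value does to the Actor is defined in Blueprint.
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Math/RandomStream.h"
#include "VariationComponent.generated.h"

UCLASS(ClassGroup = "Scenario", meta = (BlueprintSpawnableComponent))
class UVariationComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    UPROPERTY(EditAnywhere) float MinValue = 0.f;
    UPROPERTY(EditAnywhere) float MaxValue = 1.f;

    // Implemented in Blueprint: set a speed, a following distance, a
    // lane-change aggressiveness -- anything expressible in Blueprint.
    UFUNCTION(BlueprintImplementableEvent, Category = "Scenario")
    void ApplyVariation(float Value);

    // Called once per generated variation with a deterministic stream.
    void ApplyFromStream(const FRandomStream& Stream)
    {
        ApplyVariation(Stream.FRandRange(MinValue, MaxValue));
    }
};
```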

Melick is aware that some people criticize the use of game engines as simulators, claiming that they don’t allow for determinism, the property where repeated executions of the same simulation produce identical results. 

“That assumption is wrong,” he says. “Ike’s simulator is deterministic, enabling us to benefit from fast, repeatable offline testing. We’ve only scratched the surface of Unreal’s potential for autonomy simulation. We have many exciting projects underway, including using virtual sim for repeatable hardware-in-the-loop testing and making it possible to automatically generate virtual scenarios from real driving events.” 

Manually driving a virtual vehicle in Ike’s simulation. Animation courtesy of Ike.
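
On the determinism point, the article doesn’t spell out Ike’s recipe, but two standard ingredients are a fixed timestep and seeded randomness. A sketch of that loop shape, with a stand-in Simulation type:

```cpp
// Illustrative only: two common ingredients of a deterministic simulator,
// a fixed timestep and seeded randomness. 'Simulation' is a stand-in type,
// not Ike's API.
#include <cstdint>

struct Simulation
{
    void SeedAllRandomStreams(uint32_t Seed) { (void)Seed; /* noise, variations, agents */ }
    void Step(double FixedDt) { (void)FixedDt; /* advance physics, agents, autonomy I/O */ }
};

void RunScenario(Simulation& Sim, uint32_t Seed, int Steps)
{
    Sim.SeedAllRandomStreams(Seed); // identical seed => identical random draws
    const double FixedDt = 0.01;    // 100 Hz; never tied to wall-clock time
    for (int i = 0; i < Steps; ++i)
    {
        Sim.Step(FixedDt);          // same inputs in the same order => same outputs
    }
}
```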
 
While we may still be some time away from seeing driverless trucks safely navigating our highways, every scenario Ike creates in Unreal Engine brings that goal one step closer. It’s a reassuring thought.

Interested in finding out how you could unleash Unreal Engine’s potential for training and simulation? Get in touch to start that conversation.
 

Source: Unreal Engine Blog