To reduce the need for costly physical prototypes while speeding up the design process, the Ansys AVxcelerate Sensors solution lets you experience sensors virtually to test and analyze their performance. In the virtual world, use realistic driving scenarios to investigate radar, lidar and camera sensor perception in a MiL, SiL or HiL context.
Discover high-fidelity physics-based sensor simulation with ground truth information for autonomous vehicles.
Run multiple sensor simulations within realistic driving scenarios to ensure the physical prototype meets expectations.
AVxcelerate Sensors readily integrates simulation of ground-truth, camera, radar, lidar and ultrasonic sensor types. Its accurate outputs let you assess complex ADAS systems and autonomous vehicles virtually by connecting perception, fusion and control functions to the driving simulator of your choice, such as IPG Automotive CarMaker or Carla.
Benefit from powerful ray-tracing capabilities to recreate sensor behavior and easily retrieve sensor outputs through a dedicated interface.
Co-simulation capabilities with the driving simulator of your choice allow OEMs and automotive suppliers to ensure the safety of ADAS features and autonomous vehicles by validating perception, fusion and control algorithms using Ansys physics-based sensor simulation (camera, radar, lidar) in sync with IPG Automotive CarMaker or Carla vehicle dynamics and scenarios. To test sensors in scenarios, add several cars to create complex situations, such as following a car while simultaneously monitoring the path of a crossing car. Each vehicle in a scenario can be either static or automated, enabling evaluation at a specific point of interest or along a predefined trajectory. Sensor simulation follows the ego vehicle's dynamic motion.
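The co-simulation pattern described above can be sketched as a stepped loop: the driving simulator advances vehicle dynamics, and the sensor simulation renders from the resulting ego pose each tick. The classes and method names below are illustrative placeholders, not the actual AVxcelerate or CarMaker APIs.

```python
import math

class VehicleDynamics:
    """Stand-in for the driving-simulator side (e.g. CarMaker or Carla)."""
    def __init__(self):
        self.t = 0.0

    def step(self, dt):
        self.t += dt
        # Toy ego motion: 10 m/s along x with slight lateral sway.
        return {"x": 10.0 * self.t, "y": 0.5 * math.sin(self.t), "yaw": 0.0}

class SensorSim:
    """Stand-in for the sensor simulator: poses in, sensor frames out."""
    def render(self, ego_pose):
        return {"camera_frame": f"frame@x={ego_pose['x']:.1f}",
                "lidar_points": []}

def run(steps=5, dt=0.05):
    dyn, sensors = VehicleDynamics(), SensorSim()
    frames = []
    for _ in range(steps):
        pose = dyn.step(dt)                  # dynamics advance first
        frames.append(sensors.render(pose))  # sensors follow ego motion
    return frames
```

The key design point the loop illustrates is ordering: the sensor render always consumes the pose produced by the same simulation tick, keeping sensor outputs synchronized with vehicle dynamics.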
Ideal models of ultrasonic, camera, radar, flashing-lidar and rotating-lidar sensors help define a sensor's specifications or develop ADAS features at early development stages. Sensor data outputs, such as ray hit points and material properties, and vehicle parameter inputs and outputs, such as position, orientation, speed and steering wheel angle, are all available. Thanks to the consistent combination of data from multiple sensors, the simulation lets you validate the model of a smart sensor's behavior or its fusion algorithms. Deterministic and real-time modes are both supported.
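The inputs and outputs listed above could be represented with data shapes like the following. The field names are invented for illustration and are not the actual AVxcelerate output schema.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    """Vehicle parameters exchanged with the simulation (illustrative)."""
    position: tuple        # (x, y, z) in metres
    orientation: tuple     # (roll, pitch, yaw) in radians
    speed_mps: float
    steering_angle_rad: float

@dataclass
class IdealSensorHit:
    """One ideal-sensor return: where a ray hit and what it hit."""
    point: tuple           # world-space hit point
    material: str          # material property at the hit
```

A fusion test harness could then combine lists of `IdealSensorHit` from several sensors against a shared `VehicleState` timeline.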
A high-fidelity, real-time, physics-based camera model is defined by a set of standardized parameters taken from a camera datasheet. The software lets you simulate the actual camera model in edge-case driving situations. It simulates all components of a camera, such as the lens system, imager and pre-processor. For automotive front-facing cameras, the windshield can also be included in the simulation. The optical and spectral properties of the environment in the visible range are considered, along with the optical properties of the lens system and the optoelectronic properties of the imager. With the addition of plugins, the simulation can manage dynamic adaptation. Camera simulation creates raw images, which are used to test and validate perception algorithms as model-in-the-loop, software-in-the-loop or hardware-in-the-loop.
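The staged structure of the camera model (lens system, then imager, then pre-processor) can be sketched as a toy pipeline. The parameters below are invented for illustration and are far simpler than the datasheet-derived parameters the actual model uses.

```python
def lens(irradiance, transmission=0.9):
    """Lens stage (toy): attenuate scene irradiance by lens transmission."""
    return [e * transmission for e in irradiance]

def imager(irradiance, gain=100.0, full_well=255):
    """Imager stage (toy): convert irradiance to clipped, quantized
    digital numbers, producing the raw image."""
    return [min(full_well, int(e * gain)) for e in irradiance]

def preprocessor(raw, black_level=4):
    """Pre-processor stage (toy): subtract the sensor's black level."""
    return [max(0, dn - black_level) for dn in raw]

def simulate_camera(scene):
    """Chain the stages in the order the camera model simulates them."""
    return preprocessor(imager(lens(scene)))
```

The point of the sketch is the composition order: each stage only sees the previous stage's output, which is also why the windshield can be prepended as one more optical stage ahead of the lens.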
Physics-based lidar models accurately reproduce the behavior of the IR emitter and receiver. All types of lidar technology can be parameterized in the software. Powerful ray-tracing capabilities recreate sensor behavior, and sensor results are easily retrieved through a dedicated interface. The IR properties of the emitter model and the receiver electronics are both considered in the simulation, which can output anything from raw signals (waveforms) to point clouds. This solution provides a unique way to collect virtual sensor information during real-time drives and use it to develop autopilot code.
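The span from raw waveforms to point clouds can be illustrated with the basic time-of-flight relation: a range is recovered from a return pulse as r = c·t/2. This is a hypothetical post-processing helper, not an AVxcelerate function.

```python
C = 299_792_458.0  # speed of light in m/s

def waveform_to_range(samples, sample_period_s):
    """Pick the strongest return in a sampled waveform and convert its
    round-trip time to a one-way range (r = c * t / 2)."""
    peak_idx = max(range(len(samples)), key=lambda i: samples[i])
    round_trip_s = peak_idx * sample_period_s
    return C * round_trip_s / 2.0
```

Applying such a conversion per emitted beam, with each beam's known direction, is what turns a set of raw waveforms into a point cloud.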
Radar is prevalent in driver assistance systems thanks to its high precision and exceptional scalability. Ansys VRXPERIENCE Sensors provides a unique radar model based on modern GPU ray-tracing techniques. It is valid for automotive applications and eases the virtual design, testing and validation of radar systems. VRXPERIENCE's GPU Radar feature can perform full physics-based radar scenario simulations in real time at frame rates greater than 30 frames per second. The simulations consider multi-bounce reflections and transmissions from dielectric surfaces. Multichannel and MIMO radars can be simulated using the linear scalability of GPU Radar. With the addition of GPU Radar, VRXPERIENCE provides the ability to perform ADAS and autonomy scenario simulation with full-physics models of all key sensors: cameras, radars and lidars. The data collected from the radar model is used to stimulate the radar ECU's digital signal processing algorithms, quickly improving the accuracy and robustness of automotive radars in edge cases. Ansys VRXPERIENCE comes with a library of objects with predefined dielectric properties.
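Two of the claims above lend themselves to small worked checks: a MIMO array with n_tx transmitters and n_rx receivers synthesizes n_tx × n_rx virtual channels (the quantity that scales linearly on the GPU), and "real time at greater than 30 fps" means each simulated frame must fit a budget of 1000/30 ≈ 33.3 ms. These helpers are illustrative, not Ansys APIs.

```python
def virtual_channels(n_tx: int, n_rx: int) -> int:
    """Virtual receive channels formed by a MIMO radar array."""
    return n_tx * n_rx

def is_real_time(frame_time_ms: float, target_fps: float = 30.0) -> bool:
    """True if a simulated frame fits the per-frame real-time budget."""
    return frame_time_ms <= 1000.0 / target_fps
```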
Hardware-in-the-Loop (HiL): Ansys AVxcelerate Sensors uses raw sensor output to feed actual smart sensors being tested on a hardware-in-the-loop test bench. For cameras, the solution connects to an image injection box, which replaces the actual imager with the virtual one. The injection box also manages high-speed connections between the imager and the processing chip (I2C). Model- and Software-in-the-Loop (MiL, SiL): Perform massive scenario variation by leveraging cutting-edge testing, on premises and in the cloud. Assess perception performance by varying parameters across countless driving scenarios.
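Massive scenario variation for MiL/SiL testing amounts to sweeping a Cartesian product of scenario parameters. The parameter names below are invented for illustration and are not AVxcelerate's scenario schema.

```python
from itertools import product

def scenario_matrix(speeds_kph, times_of_day, weather):
    """Enumerate every combination of the given scenario parameters
    (hypothetical parameter names), one dict per scenario variant."""
    return [{"ego_speed_kph": s, "time_of_day": t, "weather": w}
            for s, t, w in product(speeds_kph, times_of_day, weather)]
```

Even a few values per axis multiplies quickly (2 speeds × 2 lighting conditions × 2 weather states already yields 8 scenarios), which is why this kind of sweep is typically distributed across on-premises and cloud compute.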
Please let us know how we can help you meet your simulation product needs. We want to assign your request to the right engineer, so please provide as much information as possible.
If this is urgent, please email productinfo@padtinc.com or call 480.813.4884.
Please fill out the form with as much information as possible, and someone will get back to you shortly.
Or reach out directly: