
Ansys AVxcelerate Sensors
Test and Validate Sensor Perception for Autonomous Vehicles

Ansys AVxcelerate provides accurate sensor simulation capabilities, enabling you to test your autonomous systems, including sensor perception, faster than relying only on actual driving or recorded data.


Realistic Driving Scenarios using the Driving Simulator of Your Choice

Ansys AVxcelerate Sensors provides physically accurate sensor simulation for autonomous system testing with sensor perception. Save testing time and cost while increasing perception performance for camera, LiDAR, radar, and thermal camera sensors. Leveraging AVxcelerate's real-time capabilities, perform virtual testing in a Software-in-the-Loop or Hardware-in-the-Loop context as your design cycle progresses.

  • Camera, LiDAR, radar and thermal sensors
  • Accurate physics-based real-time sensors
  • SiL, HiL testing
  • Open interoperability with any workflow
  • Driving simulator agnostic

Quick Specs

Enrich your sensor perception test cases and coverage with accurate real-time synthetic data. Set up multi-sensor simulation in a SiL or HiL context to guarantee your ADAS/AV system performance under any operating condition.

  • Radar Sensor
  • LiDAR Sensor
  • Camera Sensor
  • Thermal Camera Sensor
  • Ground-Truth Sensors
  • Multi-Sensor Simulation
  • SiL, HiL connectivity
  • Open Architecture/APIs
  • Driving Simulator agnostic
  • Multi-GPU/HPC scalability

July 2024

What's New

Ansys 2024 R2 introduces capabilities such as radar digital twin IP protection and adaptive grid sampling, along with numerous ODD updates that accommodate larger, more complex simulations and ease-of-use UI improvements.

Radar Sensor Digital Twin Encryption

Users can now build an IP-protected sensor model using the AVxcelerate Sensors Lab UI. The password-protected model locks all parts of the radar sensor, including design, waveform, and beam pattern information, allowing Tier 1 and 2 suppliers to confidently share their models with OEMs for accurate system simulations. 
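
As a conceptual illustration of password-based model protection (not the mechanism behind the AVxcelerate Sensors Lab feature), the sketch below derives an encryption key from a supplier-chosen password and locks a serialized sensor model before it is shared; the function names and file layout are assumptions, and the third-party cryptography package is used.

# Conceptual sketch only: password-protecting a serialized sensor model before sharing.
# This is NOT the AVxcelerate Sensors Lab mechanism; names and layout are hypothetical.
import base64, os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def encrypt_model(model_bytes: bytes, password: str) -> bytes:
    """Derive a key from the password and encrypt the model; prepend the salt for later decryption."""
    salt = os.urandom(16)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=480_000)
    key = base64.urlsafe_b64encode(kdf.derive(password.encode()))
    return salt + Fernet(key).encrypt(model_bytes)

locked = encrypt_model(b"<radar design, waveform, beam pattern>", "supplier-chosen password")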

Radar Sensor Adaptive Grid Sampling

AVxcelerate Sensors simulation now includes adaptive grid sampling. This method serves long-range (200-500 m) object detection, which requires high ray density and sampling. Users can now define the objects in the simulation on which dedicated rays are focused, resulting in 3x faster simulation with 6.8x less GPU memory consumption at the same or better level of accuracy.
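
To make the idea concrete, here is a minimal sketch, assuming a fixed total ray budget and user-tagged regions of interest, of how rays can be concentrated on distant objects while the background keeps a coarser grid; the names and numbers are illustrative, not the product's API or its benchmark figures.

# Illustrative ray-budget allocation for adaptive grid sampling (hypothetical names and numbers).
from dataclasses import dataclass

@dataclass
class RegionOfInterest:
    name: str
    solid_angle_sr: float  # angular size of the tagged object as seen from the radar
    weight: float          # user priority, e.g. high for long-range targets

def allocate_rays(total_rays: int, background_sr: float, rois: list, background_weight: float = 1.0) -> dict:
    """Split a fixed ray budget between the coarse background grid and the tagged objects,
    proportionally to weighted solid angle, so regions of interest get a denser grid."""
    weighted = {roi.name: roi.weight * roi.solid_angle_sr for roi in rois}
    weighted["background"] = background_weight * background_sr
    total_weight = sum(weighted.values())
    return {name: round(total_rays * w / total_weight) for name, w in weighted.items()}

rois = [RegionOfInterest("lead_vehicle_400m", solid_angle_sr=5e-4, weight=200.0),
        RegionOfInterest("pedestrian_250m", solid_angle_sr=2e-4, weight=400.0)]
print(allocate_rays(total_rays=2_000_000, background_sr=2.0, rois=rois))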

Enhanced Scenario Exploration

Ansys 2024 R2 brings increased performance, scalability, and cloud processing (Azure & AWS) for AVxcelerate Autonomy users to simulate larger, more complicated scenarios (>500k samples per job), along with multiple ease-of-use UI improvements.

Continental Tests Real-Life Scenarios During ADAS/AD Testing and Validation

 

Continental Automatic Emergency Braking

“Our camera sensor technology is critical to the work we are doing in supporting autonomous function for our customers. Using Ansys AVxcelerate Sensors during ADAS/AD testing and validation, we were able to confidently test real-life scenarios that were previously off-limits to us with simulation, with complete confidence in the accuracy of our results. Even though the work to develop a well-rounded solution is still ongoing, the collaboration between Ansys AVxcelerate Sensors and Continental camera sensor solutions is already delivering promising results.”

— Dr. Martin Punke, Head of Camera Product Technology, Continental


To achieve a high level of accuracy, advanced driver assistance systems/autonomous driving (ADAS/AD) technology requires Continental to target its camera sensors for simulation. Continental engineers perform real-world driving on test tracks or roads to train, test, and validate ADAS/AD systems. They also perform component-level testing and simulation; however, only limited engineering simulation solutions are available to tackle this problem. Even though the effort to develop a well-rounded solution is ongoing, the collaboration between Ansys AVxcelerate Sensors and Continental camera sensor solutions is delivering promising results.

Applications


Autonomy System Development

Model-based safety and cybersecurity assessments using Ansys simulation help to accelerate autonomous system development and certification.


Autonomy Sensors

Ansys provides a comprehensive autonomous vehicle sensor simulation capability that includes lidar, radar and camera design and development.


Physically Accurate Simulation Solutions for Sensor Testing and Validation

Ansys AVxcelerate provides physically accurate sensor simulation for autonomous system testing with sensor perception in the loop. Save on testing time and cost while increasing perception performance for camera, lidar, radar, and thermal sensors.


 

Key Features

Benefit from powerful ray-tracing capabilities to recreate sensor behavior and easily retrieve sensor outputs through a dedicated interface.

  • Driving Scenarios using the driving simulator of your choice
  • Camera Sensor
  • Lidar Sensor
  • Radar Sensor
  • HiL, SiL and MiL Connectivity 

To test sensors in scenarios, add several cars to create complex situations, such as following a car and monitoring the path of a crossing car simultaneously. Each vehicle in a scenario can be either static or automatic, enabling evaluation at a specific point of interest or along a predefined trajectory. Sensor simulation follows ego-vehicle dynamic motion.
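
As a concrete illustration of such a scenario, the sketch below describes an ego vehicle following a lead car while a crossing car cuts through the path; the data structures and waypoints are hypothetical and do not correspond to any driving-simulator file format.

# Hypothetical scenario description: an ego vehicle, a lead car, a crossing car, and a static actor.
from dataclasses import dataclass, field

@dataclass
class Waypoint:
    t: float  # time (s)
    x: float  # longitudinal position (m)
    y: float  # lateral position (m)

@dataclass
class Actor:
    name: str
    static: bool = False
    trajectory: list = field(default_factory=list)  # Waypoints; empty if the actor is static

@dataclass
class Scenario:
    ego: Actor
    actors: list

scenario = Scenario(
    ego=Actor("ego", trajectory=[Waypoint(0, 0, 0), Waypoint(5, 70, 0)]),
    actors=[
        Actor("lead_car", trajectory=[Waypoint(0, 30, 0), Waypoint(5, 95, 0)]),
        Actor("crossing_car", trajectory=[Waypoint(0, 60, -20), Waypoint(5, 60, 20)]),
        Actor("parked_van", static=True),
    ],
)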

The software allows you to simulate the actual camera model in edge-case driving situations. It simulates all components of a camera, such as the lens system, imager, and pre-processor. For automotive front-facing cameras, the windshield can also be considered in the simulation. The optical and spectral properties of the environment in the visible range are taken into account, along with the optical properties of the lens system and the optoelectronic properties of the imager. With the addition of plugins, the simulation can manage dynamic adaptation. Camera simulation creates raw images, which are used to test and validate perception algorithms in model-in-the-loop, software-in-the-loop, or hardware-in-the-loop setups.
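
The following minimal sketch mimics that lens-imager-pre-processor chain with simple stand-in models (radial lens falloff, shot noise and quantization, gamma tone mapping); it is illustrative only and is not the AVxcelerate camera model.

# Conceptual camera pipeline: lens -> imager -> pre-processor (stand-in physics, hypothetical names).
import numpy as np

def lens_stage(scene_irradiance, vignetting=0.2):
    """Apply a simple radial falloff standing in for the lens system."""
    h, w = scene_irradiance.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2) / np.hypot(h / 2, w / 2)
    return scene_irradiance * (1.0 - vignetting * r**2)

def imager_stage(irradiance, full_well=1e4, rng=None):
    """Convert irradiance to noisy 12-bit digital counts (shot noise plus quantization)."""
    rng = rng or np.random.default_rng(0)
    electrons = rng.poisson(np.clip(irradiance, 0, None) * full_well)
    return np.clip(electrons / full_well * 4095, 0, 4095).astype(np.uint16)

def preprocessor_stage(raw):
    """Simple gamma tone mapping standing in for the camera pre-processor."""
    return (255 * (raw / 4095.0) ** (1 / 2.2)).astype(np.uint8)

raw = imager_stage(lens_stage(np.random.default_rng(1).random((480, 640))))
frame = preprocessor_stage(raw)  # this frame would feed the perception stack under test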

You will benefit from powerful ray-tracing capabilities to recreate sensor behavior and easily retrieve sensor results through a dedicated interface. The IR emitter, the IR properties of the world model, and the receiver electronics are considered in the simulation, which can output anything from raw signals (waveforms) to point clouds. This solution provides a unique way to collect virtual sensor information during real-time drives and use that information to develop autopilot code.

Learn more about lidar.
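
To illustrate the path from raw signal to point cloud, the sketch below turns a single simulated return waveform into a range and then a 3D point; the sampling rate, threshold, and function names are assumptions rather than the product's interface.

# From a simulated lidar return waveform to a 3D point (illustrative constants and names).
import numpy as np

C = 299_792_458.0    # speed of light (m/s)
SAMPLE_RATE = 1e9    # assumed receiver sampling rate (Hz)

def waveform_to_range(waveform, threshold=0.5):
    """Return the range of the first echo above threshold, or None if no echo is found."""
    hits = np.flatnonzero(waveform >= threshold)
    if hits.size == 0:
        return None
    time_of_flight = hits[0] / SAMPLE_RATE
    return C * time_of_flight / 2.0  # round trip -> one-way distance

def range_to_point(r, azimuth_rad, elevation_rad):
    """Convert a range and beam direction into a Cartesian point in the sensor frame."""
    return (r * np.cos(elevation_rad) * np.cos(azimuth_rad),
            r * np.cos(elevation_rad) * np.sin(azimuth_rad),
            r * np.sin(elevation_rad))

wf = np.zeros(1024)
wf[300] = 1.0  # synthetic echo at sample 300, i.e. roughly 45 m away
print(range_to_point(waveform_to_range(wf), azimuth_rad=0.1, elevation_rad=0.0))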

VRXPERIENCE's GPU Radar feature provides the capability to perform full physics-based radar scenario simulations in real time at frame rates greater than 30 frames per second. The simulations consider multi-bounce reflections and transmissions from dielectric surfaces. Multichannel and MIMO radars can be simulated using the linear scalability of GPU Radar. With the addition of GPU Radar, VRXPERIENCE now provides the ability to perform ADAS and autonomy scenario simulation with full-physics models of all key sensors: cameras, radars, and lidars. The data collected from the radar model is used to efficiently stimulate the radar ECU's digital signal processing algorithms, quickly improving the accuracy and robustness of automotive radars in edge cases. Ansys VRXPERIENCE comes with a library of objects with predefined dielectric properties.
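
As a back-of-the-envelope illustration of what real-time MIMO radar simulation implies, the sketch below computes the virtual channel count of a small assumed array and the per-frame time budget at 30 frames per second; the array size is an example, not a product specification.

# Simple arithmetic: MIMO virtual channels and the real-time frame budget (assumed array size).
n_tx, n_rx = 3, 4
virtual_channels = n_tx * n_rx          # MIMO virtual array = 12 channels, each needing ray returns

frame_rate_hz = 30                      # the real-time target quoted above
frame_budget_ms = 1000 / frame_rate_hz  # ~33.3 ms to trace and process every channel of one frame
print(virtual_channels, f"{frame_budget_ms:.1f} ms per frame")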

Model- and Software-in-the-Loop (MiL, SiL): Perform massive scenario variation by leveraging cutting-edge testing, on-premises and in the cloud. Assess perception performance by varying parameters across countless driving scenarios.
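
A minimal sketch of such a parameter sweep is shown below; the varied parameters and the run_scenario() stub are assumptions for illustration, not a product API.

# Hypothetical massive scenario variation: sweep weather, speed, and lighting parameters.
from itertools import product

def run_scenario(fog_density, lead_speed_kph, time_of_day):
    """Stand-in for launching one simulated drive and scoring the perception output."""
    return {"fog": fog_density, "speed": lead_speed_kph, "tod": time_of_day, "detected": True}

fog_densities = [0.0, 0.05, 0.1]
lead_speeds = range(20, 121, 20)
times_of_day = ["dawn", "noon", "dusk", "night"]

results = [run_scenario(f, v, t) for f, v, t in product(fog_densities, lead_speeds, times_of_day)]
misses = [r for r in results if not r["detected"]]
print(f"{len(results)} variations run, {len(misses)} perception misses")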

AVXCELERATE SENSOR RESOURCES & EVENTS

Featured Webinars

Ansys AV Simulation
Ansys 2023 R2: Ansys AV Simulation (AVxcelerate) What’s New

Join us to hear about new capabilities and features in the 2023 R2 release of Ansys AV Simulation (AVxcelerate). With this release, we introduce improvements to the camera, thermal camera, and radar models and to the simulation ecosystem, allowing users to perform more accurate, higher-fidelity simulations.

On Demand Webinar
Ansys Webinar
Sensor and HMI Development & System Validation

Developing AD systems around a sustainable business model requires an intensive trade-off between performance and safety. Learn how Ansys solutions address critical technical challenges in areas such as sensor and HMI development and system validation.

On Demand Webinar
Ansys Webinar
Physics-Based Sensor Simulation for In-Cabin Sensing Systems Development

In-cabin sensing system requirements are increasingly becoming an essential part of government policies and car safety rating organizations. Learn about in-cabin sensing system requirements and watch physics-based sensor simulation applied to the in-cabin monitoring system development and validation process.


White Papers & Articles


Physics-Based Radar Modeling: Driving Toward Increased Safety

An innovative solution from Ansys, AVxcelerate, is designed to speed up the sensor development process — without compromising safety — by simulating product performance under varying road and environmental conditions.


Ansys software is accessible

It's vital to Ansys that all users, including those with disabilities, can access our products. As such, we endeavor to follow accessibility requirements based on the US Access Board (Section 508), Web Content Accessibility Guidelines (WCAG), and the current format of the Voluntary Product Accessibility Template (VPAT).

See What Ansys Can Do For You

Contact us today
