
Ansys AVxcelerate Sensors Simulator

Course Overview

To reduce the need for costly physical prototypes while speeding up the design process, the Ansys AVxcelerate Sensors solution lets you test and analyze sensor performance virtually. In the virtual world, use realistic driving scenarios to investigate camera, LiDAR, and radar sensor perception in a MiL, SiL, or HiL context.

The Sensors Simulation course starts with an introduction to sensor setup, covering model parametrization and the configuration of a multi-sensor layout (Camera, LiDAR, Radar) on a vehicle.

Then, you will be guided through the data preparation process and the launch of a driving scenario: preparing a track, configuring the ego car equipped with sensors, and setting up static and dynamic objects for the traffic.

Advanced sensor training modules (Camera, LiDAR, Radar) take a deep dive into sensor model settings and their simulation output options (displayed, saved to disk, or retrieved through APIs).
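As a purely illustrative picture of the multi-sensor layout mentioned above, the following Python sketch describes a camera, a LiDAR, and a radar mounted on an ego vehicle. The class and field names are assumptions made for this example only; they are not the AVxcelerate configuration format, which is covered in the course itself.

    # Purely illustrative sketch of a multi-sensor layout description.
    # All class and field names are hypothetical; they do not reflect the
    # actual AVxcelerate Sensors configuration schema.
    from dataclasses import dataclass, field

    @dataclass
    class SensorMount:
        """Position (m) and orientation (deg) of one sensor in the ego-vehicle frame."""
        name: str
        sensor_type: str                          # "camera", "lidar", or "radar"
        position: tuple[float, float, float]      # x, y, z
        rotation_deg: tuple[float, float, float]  # roll, pitch, yaw

    @dataclass
    class EgoSensorLayout:
        """A multi-sensor layout attached to the ego car."""
        vehicle: str
        sensors: list[SensorMount] = field(default_factory=list)

    layout = EgoSensorLayout(
        vehicle="ego_car",
        sensors=[
            SensorMount("front_camera", "camera", (1.8, 0.0, 1.4), (0.0, 0.0, 0.0)),
            SensorMount("roof_lidar",   "lidar",  (1.0, 0.0, 1.9), (0.0, 0.0, 0.0)),
            SensorMount("front_radar",  "radar",  (2.3, 0.0, 0.5), (0.0, 0.0, 0.0)),
        ],
    )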

Prerequisites

  • There are no prerequisites for this course.

Teaching Method

Lectures and computer practical sessions to validate acquired knowledge.

Learning Outcomes

Following the completion of this course, you will be able to:

  • Perform physics-based data preparation for tracks and assets (optical and electromagnetic properties).
  • Use the existing asset library and add custom tracks, vehicles, and objects.
  • Configure sensor models such as cameras, LiDARs, and radars.
  • Prepare sensor layouts.
  • Integrate perception algorithms or control laws.
  • Run simulations to test your SuT (System under Test) in a SiL or HiL context.
  • Retrieve the simulation output of physics-based sensor simulations.
  • Define campaigns of scenarios and perform simulation batching (see the sketch after this list).
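
The batching outcome above can be pictured with a short, hypothetical Python loop: it iterates over a folder of scenario definitions and launches one simulation process per scenario. The "avx_simulator" executable and its command-line flags are placeholders for this sketch, not the actual product command line.

    # Hypothetical batching sketch: one simulation run per scenario file.
    # "avx_simulator" and its flags are placeholders, not the real CLI.
    import subprocess
    from pathlib import Path

    SCENARIO_DIR = Path("scenarios")   # folder of scenario definitions (assumption)
    RESULTS_DIR = Path("results")
    RESULTS_DIR.mkdir(exist_ok=True)

    for scenario in sorted(SCENARIO_DIR.glob("*.json")):
        run_dir = RESULTS_DIR / scenario.stem
        run_dir.mkdir(exist_ok=True)
        # Each run is an independent process, so a campaign can also be
        # spread across machines or GPUs by a job scheduler.
        subprocess.run(
            ["avx_simulator", "--scenario", str(scenario), "--output", str(run_dir)],
            check=True,
        )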


Learning Options

Training materials for this course are available with an Ansys Learning Hub subscription. If there is no active public schedule available, private training can be arranged. Please contact us.

Agenda:

This is a 3-day classroom course covering both lectures and workshops. For virtual training, this course is delivered as 7 x 2-hour sessions (lectures only).

Virtual Classroom Session 1

  • Module 01: Getting Started
  • Module 02: Camera
  • Workshop 02.1: Defining Camera Parameters
  • Workshop 02.2: Defining a Camera Simulation
  • Workshop 02.3: CAM Ground Truth Sensor
  • Workshop 02.4: Camera Lens Output
  • Workshop 02.5: Customizable Pixel Segmentation

Virtual Classroom Session 2

  • Module 03: LiDAR
  • Workshop 03.1: Prepare Ego Car Embedding Physics-based LiDAR
  • Workshop 03.2: Defining a LiDAR Simulation
  • Workshop 03.3: Animated NCAP Pedestrian for LiDAR

Virtual Classroom Session 3

  • Module 04: Real-time Radar
  • Workshop 04.1: Animated NCAP Pedestrian for Radar
  • Workshop 04.2: Creating Far Field Data from HFSS and Import in AVX

Virtual Classroom Session 4

  • Module 05: Data Preparation
  • Workshop 05.1: Prepare Ego Car Embedding Physics-based LiDAR
  • Workshop 05.2: Streetlights Preparation
  • Workshop 05.3: Data Preparation in Blender for AVX

Virtual Classroom Session 5

  • Module 06.1: gRPC and Shared Memory Data Access APIs (see the sketch after this list)
  • Module 06.2: Feedback Control APIs
  • Module 06.3: Simulation Control APIs
  • Module 06.4: Lighting System Control APIs
  • Module 06.5: Using the API Samples
  • Module 06.6: Lighting Systems REST APIs
  • Module 06.7: Sensor Labs REST APIs
  • Module 06.8: Asset Preparation API Python Sample
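
To give a feel for the data access APIs listed above, here is a minimal gRPC client sketch in Python. Only the standard grpc channel calls are real; the service, stub, and message names are assumptions standing in for code that would be generated from the product's own .proto files, so this shows the general pattern rather than the actual AVxcelerate API.

    # Hypothetical gRPC data-access sketch. Only the standard grpc channel
    # calls are real; the service, stub, and message names below stand in
    # for code generated from an assumed camera_service.proto and are not
    # the actual AVxcelerate API.
    import grpc

    # Assumed to be generated with grpcio-tools from the hypothetical proto file.
    import camera_service_pb2 as pb2
    import camera_service_pb2_grpc as pb2_grpc

    def fetch_latest_frame(host: str = "localhost", port: int = 50051) -> bytes:
        """Connect to the assumed camera service and pull one rendered frame."""
        with grpc.insecure_channel(f"{host}:{port}") as channel:
            stub = pb2_grpc.CameraServiceStub(channel)
            response = stub.GetLatestFrame(pb2.FrameRequest(sensor_name="front_camera"))
            return response.image_data  # raw bytes to feed a perception algorithm

    if __name__ == "__main__":
        frame = fetch_latest_frame()
        print(f"Received {len(frame)} bytes of camera data")

The shared-memory data access covered in the same module addresses the same need but exchanges sensor output through memory on the local machine instead of over a network connection.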

Virtual Classroom Session 6

  • Workshop 07: Animated Character for In-Cabin Sensing System Use Case
  • Module 08: Thermal Simulator
  • Workshop 08: Thermal Simulation

Virtual Classroom Session 7

  • Module 09: Scalability and Deployment
  • Workshop 09: Scalable Sensor Simulation across GPUs