ADAS Vehicle Architectures - Smart Sensors versus Centralized Platforms

3Q 2016 | Technology Analysis Report | AN-2238 | 21 pages | 2 tables | 4 charts | 12 figures | PDF

Certain ADAS functions are becoming increasingly common on consumer vehicles, with uptake driven by falling costs and by safety-rating schemes that reward standard fitment. To date, these systems have largely been delivered through smart sensors, which combine object detection, sensor processing, and actuation commands within a single module, lowering system cost and speeding implementation.

This research report will explain why this approach is unsustainable going forward, and will analyze the centralized system architectures that future autonomous and driverless vehicles will require.

Table of Contents

  • 1. INTRODUCTION TO ADAS ARCHITECTURES
  • 2. SMART SENSOR MODULES
    • 2.1. DISTANCE CALCULATIONS
    • 2.2. ROBUSTNESS AND FUNCTIONAL SAFETY
  • 3. CENTRALIZED ADAS AND AUTONOMOUS VEHICLE PLATFORMS
  • 4. HIGH PROCESSING POWER
    • 4.1. SENSOR PROCESSING AND FUSION
    • 4.2. DEEP LEARNING
    • 4.3. FUNCTIONAL SAFETY
  • 5. IN-VEHICLE CONNECTIVITY
    • 5.1. 1000BASE-T1 - MARVELL SEMICONDUCTOR
    • 5.2. HDBaseT - VALENS SEMICONDUCTOR
  • 6. CENTRALIZED PLATFORMS FOR AUTONOMOUS DRIVING: NVIDIA, NXP, MOBILEYE, AND AUDI
    • 6.1. NVIDIA - DRIVE PX 2
    • 6.2. NXP - BlueBox
    • 6.3. MOBILEYE - EYEQ5
    • 6.4. AUDI - zFAS