ADAS/AV Sensors Overview
ADAS (Advanced Driver-Assistance Systems) and AV (Autonomous Vehicle) sensors provide the raw observations used for perception, sensor fusion, and safety decisions. Most modern architectures use a multi-sensor suite: cameras for rich scene understanding, radar for robust ranging and velocity, and (in some systems) LiDAR for high-precision 3D depth. Additional sensors such as ultrasonics, IMUs (Inertial Measurement Units), and vehicle motion sensors support close-range awareness, localization, and control stability.
Sensor suite at a glance
Most ADAS systems rely on complementary sensing modalities to reduce blind spots and handle diverse lighting and weather conditions.
| Sensor type | Primary contribution | Strengths | Constraints | Typical use |
|---|---|---|---|---|
| Cameras | Visual scene understanding | High detail, classification, lane and object recognition | Lighting and glare sensitivity; occlusion | SAE Level 2 (L2) ADAS baseline; essential for many stacks |
| Radar | Range and velocity | Works in many weather/lighting conditions; strong for relative speed | Lower spatial resolution than vision; multipath artifacts | ACC/AEB; long-range perception; redundancy |
| LiDAR (where used) | 3D depth and geometry | High-precision depth; strong spatial structure | Cost and packaging; cleaning requirements; performance varies in adverse conditions | Higher-autonomy stacks; premium ADAS in some OEMs |
| Ultrasonics | Short-range proximity | Low cost; good for parking-range distances | Short range; environmental sensitivity | Parking assist; low-speed maneuvers |
| IMU (Inertial Measurement Unit) | Acceleration and rotation | High-rate motion estimation; supports stability and localization | Bias drift; requires fusion with other sensors | Localization support; stability; sensor fusion |
| Vehicle motion sensors | Wheel speed, steering angle, yaw rate | Direct motion-state inputs for control | Not environment perception | All ADAS control loops |
| GNSS (Global Navigation Satellite System) | Absolute positioning and timing | Wide-area positioning; time base | Urban canyon multipath; outages | Navigation; fleet location; localization augmentation |
Camera systems
Cameras are typically the highest-volume, highest-bandwidth ADAS sensor. A modern ADAS vehicle may use multiple cameras for forward, rear, and surround coverage.
- Common roles: forward perception, lane keeping, traffic sign recognition, surround view, driver monitoring (DMS) where equipped
- Key hardware blocks: image sensor, lens, ISP (Image Signal Processor), serializer/PHY, enclosure and heater (where used)
- Supply-chain drivers: sensor resolution, HDR performance, low-light capability, and automotive qualification
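To make the bandwidth point concrete, here is a minimal back-of-the-envelope sketch in Python. The resolution, frame rate, and bit depth are illustrative assumptions, not figures for any specific module:

```python
# Minimal sketch: raw (uncompressed) data rate of one camera stream.
# The 1080p30, 16 bits/pixel figures are illustrative assumptions.

def raw_stream_mbps(width: int, height: int, fps: int, bits_per_px: int) -> float:
    """Raw video data rate in Mb/s."""
    return width * height * fps * bits_per_px / 1e6

# One 1080p30 stream at 16 bpp nearly saturates a 1 Gb/s link, which is
# why serializer links and/or compression sit between sensor and compute.
print(f"{raw_stream_mbps(1920, 1080, 30, 16):.0f} Mb/s")  # ~995 Mb/s
```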
Radar
Radar provides range and velocity information and is robust across many lighting conditions.
- Common roles: adaptive cruise control, automatic emergency braking, blind-spot monitoring
- Key hardware blocks: radar transceiver, antenna array, radar processing silicon, enclosure and radome
- Integration note: radar may connect via Ethernet or CAN depending on bandwidth and architecture
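Radar's strength in relative speed comes from the Doppler effect. A minimal sketch of the underlying relationship, assuming a 77 GHz carrier (a typical automotive radar band; the shift value is illustrative):

```python
# Minimal sketch: relative radial velocity from a measured Doppler shift.
# v = f_d * c / (2 * f_c); the factor of 2 reflects the round trip.

C = 299_792_458.0  # speed of light, m/s

def doppler_velocity_mps(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# A ~5.1 kHz shift at 77 GHz corresponds to ~10 m/s (36 km/h) closing speed.
print(f"{doppler_velocity_mps(5.1e3):.2f} m/s")
```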
LiDAR (where used)
LiDAR provides high-fidelity depth and 3D structure. It appears in some higher-autonomy stacks and some premium ADAS implementations.
- Common roles: precise depth perception, redundancy and structure for localization
- Key hardware blocks: emitter/receiver, optics, scanning mechanism (if used), processing, cleaning/heating provisions
- Integration note: typically high-bandwidth; Ethernet is common
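The depth precision comes from time-of-flight measurement. A minimal sketch of the principle, assuming a pulsed design (the timing value is illustrative):

```python
# Minimal sketch: time-of-flight ranging, the principle behind pulsed LiDAR.
# Note the timing precision required: ~1 ns of error is ~15 cm of range error.

C = 299_792_458.0  # speed of light, m/s

def tof_range_m(round_trip_s: float) -> float:
    """One-way distance from a measured round-trip time."""
    return C * round_trip_s / 2.0

# A ~667 ns round trip corresponds to ~100 m of range.
print(f"{tof_range_m(667e-9):.1f} m")
```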
Ultrasonics
Ultrasonics are low-cost, short-range sensors used for low-speed maneuvers.
- Common roles: parking assist, obstacle proximity sensing
- Constraints: limited range and resolution; sensitive to surface and environmental conditions
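The environmental sensitivity shows up directly in the ranging math: ultrasonic distance depends on the speed of sound, which varies with air temperature. A minimal sketch using the common dry-air approximation (the echo time is illustrative):

```python
# Minimal sketch: ultrasonic ranging and its temperature dependence.
# 331.3 + 0.606*T is the common dry-air approximation for speed of sound.

def speed_of_sound_mps(temp_c: float) -> float:
    return 331.3 + 0.606 * temp_c

def ultrasonic_range_m(round_trip_s: float, temp_c: float = 20.0) -> float:
    return speed_of_sound_mps(temp_c) * round_trip_s / 2.0

# The same 6 ms echo reads ~1.03 m at 20 C but ~0.99 m at 0 C,
# so uncompensated readings drift with ambient temperature.
print(f"{ultrasonic_range_m(6e-3, 20.0):.2f} m vs {ultrasonic_range_m(6e-3, 0.0):.2f} m")
```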
Localization and motion sensors
ADAS/AV stacks rely on localization support beyond environment perception.
- IMU: high-rate acceleration and rotation inputs for stability and localization support
- Wheel speed and steering angle: core signals for motion control and state estimation
- GNSS: absolute position and time base; often fused with inertial and wheel sensors
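How these sources complement each other can be sketched with a toy 1-D complementary filter: dead-reckon at high rate from wheel speed, then pull toward each GNSS fix to bound drift. Production stacks use Kalman-family estimators; the gain, rates, and ideal measurements here are illustrative assumptions:

```python
# Minimal sketch: blending high-rate dead reckoning with low-rate GNSS fixes.
# Real stacks use Kalman-family filters; gain and rates are illustrative.

def fuse_position(pos_est: float, wheel_speed_mps: float, dt_s: float,
                  gnss_pos_m: float | None, gain: float = 0.05) -> float:
    pos_est += wheel_speed_mps * dt_s   # predict from wheel speed (100 Hz class)
    if gnss_pos_m is not None:          # correct when a GNSS fix arrives (1-10 Hz)
        pos_est += gain * (gnss_pos_m - pos_est)
    return pos_est

# 100 Hz wheel updates at 10 m/s, with an (ideal) GNSS fix every 10th step.
pos = 0.0
for step in range(100):
    fix = 10.0 * (step + 1) * 0.01 if step % 10 == 9 else None
    pos = fuse_position(pos, wheel_speed_mps=10.0, dt_s=0.01, gnss_pos_m=fix)
print(f"{pos:.2f} m after 1 s")  # ~10 m
```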
Sensor interfaces to compute and the IVN
Sensor data must move to compute with adequate bandwidth and predictable latency. Interface choices strongly shape wiring, cost, and scalability.
| Interface | Typical sensor fit | Why used | Notes |
|---|---|---|---|
| Automotive Ethernet | Cameras, LiDAR, high-content radar, zonal aggregation | High bandwidth; scalable topologies | Requires PHYs/switches and attention to signal integrity |
| CAN / CAN-FD | Radar modules (in some designs), status/control messaging | Deterministic control and diagnostics | Not suitable for high-rate camera streaming |
| Direct point-to-point (varies) | Certain camera/radar topologies | Simple dedicated links | Often evolves toward Ethernet as systems scale |
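As a rough illustration of how bandwidth drives this choice, the sketch below maps a required data rate to an interface class. The thresholds are order-of-magnitude assumptions, not standardized limits:

```python
# Minimal sketch: first-pass interface selection by required bandwidth.
# Thresholds are illustrative order-of-magnitude figures.

def pick_interface(required_mbps: float) -> str:
    if required_mbps <= 0.5:
        return "CAN (classic, up to ~1 Mb/s)"
    if required_mbps <= 5.0:
        return "CAN-FD (~2-8 Mb/s data phase in practice)"
    if required_mbps <= 100.0:
        return "100BASE-T1 Ethernet"
    return "1000BASE-T1 Ethernet (or multiple links)"

# Status messages, radar object lists, compressed camera, raw camera:
for need in (0.1, 3.0, 60.0, 995.0):
    print(f"{need:>7.1f} Mb/s -> {pick_interface(need)}")
```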
Synchronization and time-stamping
Sensor fusion quality depends on timing alignment between sensors and compute.
- Time synchronization: common time base across sensors and compute nodes
- Accurate time-stamping: required for associating measurements across modalities
- Hardware implication: clocks and synchronization support in sensors, switches, and compute
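A minimal sketch of the association step, assuming all sensors already stamp measurements against a shared clock (for example, one distributed via gPTP/IEEE 802.1AS). The rates and skew threshold are illustrative:

```python
# Minimal sketch: matching a radar measurement to the nearest camera frame
# by timestamp, assuming a common time base. Threshold is illustrative.

import bisect

def match_nearest(camera_ts: list[float], radar_t: float,
                  max_skew_s: float = 0.005) -> float | None:
    """Nearest camera timestamp, or None if skew exceeds the budget."""
    i = bisect.bisect_left(camera_ts, radar_t)
    candidates = camera_ts[max(0, i - 1):i + 1]
    best = min(candidates, key=lambda t: abs(t - radar_t))
    return best if abs(best - radar_t) <= max_skew_s else None

cam_frames = [k / 30.0 for k in range(30)]   # 30 Hz camera timestamps
print(match_nearest(cam_frames, 0.101))      # -> 0.1 (frame 3)
```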
Cleaning, heating, and packaging support
Sensors must operate in real-world environments. Hardware support systems often determine reliability as much as the sensors themselves.
- Heaters and defogging: keep lenses and radomes clear
- Washers and air jets (where used): remove debris from critical sensors
- Mounting and calibration stability: vibration, thermal expansion, and alignment retention
- Environmental sealing: water ingress and corrosion control
Sensor redundancy (hardware perspective)
Redundancy is achieved by overlapping modalities and fields of view.
- Modality redundancy: cameras plus radar; LiDAR where used
- Coverage redundancy: multiple sensors covering critical forward and side zones
- Power and data path considerations: independent power rails and network paths where required
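Coverage redundancy can be audited mechanically: for each critical bearing, count how many sensors' fields of view contain it. A sketch with hypothetical sensor placements and FoV figures, not any specific vehicle:

```python
# Minimal sketch: counting which sensors cover a given bearing.
# Sensor names, azimuths, and FoV half-angles are hypothetical.

SENSORS = {                           # name: (center azimuth deg, half-FoV deg)
    "front_camera":      (0.0, 60.0),
    "front_radar":       (0.0, 45.0),
    "left_corner_radar": (90.0, 75.0),
}

def ang_diff_deg(a: float, b: float) -> float:
    """Smallest absolute angle between two bearings."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def covering(bearing_deg: float) -> list[str]:
    return [name for name, (center, half) in SENSORS.items()
            if ang_diff_deg(bearing_deg, center) <= half]

for bearing in (0.0, 50.0, 90.0):
    hits = covering(bearing)
    status = "redundant" if len(hits) >= 2 else "single-sensor"
    print(f"{bearing:>5.1f} deg: {hits} ({status})")
```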
Supply-chain notes
Sensor content is a major driver of ADAS BOM cost and supplier differentiation.
- Cameras: image sensors, lenses, ISPs, and packaging quality drive capability and reliability
- Radar: RF silicon, antenna design, and radome packaging drive performance
- LiDAR: optics, emitter/receiver technology, cleaning/heating needs drive cost and packaging complexity
- Interconnect: Ethernet PHYs, switches, connectors, and harness design become critical at scale
