Autonomy Hub


Autonomy describes machines that can perceive their surroundings, interpret context, and act with minimal oversight. It spans autonomous vehicles, mobile robots, and legged systems, all built on similar sensing, compute, and AI control foundations.

At its core, autonomy is a decision-making engine. It turns machines into self-directed operators: recognizing what is happening, forecasting what will happen next, choosing a safe and efficient action, and executing that action with precision. This capability is rapidly transforming mobility, logistics, and industrial workflows.
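
The loop below is a minimal Python sketch of that decision cycle. All of the object and method names (sensors, perception, predictor, planner, controller) are hypothetical placeholders for illustration, not any real platform's API.

```python
def autonomy_step(sensors, perception, predictor, planner, controller):
    """One tick of the perceive -> predict -> plan -> act cycle."""
    frames = sensors.read()                     # gather raw sensor data
    scene = perception.update(frames)           # recognize what is happening
    forecast = predictor.rollout(scene)         # forecast what happens next
    trajectory = planner.plan(scene, forecast)  # choose a safe, efficient action
    controller.execute(trajectory)              # execute that action with precision
```

In practice this loop runs many times per second, with each stage held to a strict latency budget.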

Autonomy goes far beyond driver assistance. Advanced driver-assistance systems (ADAS) support a human driver; true autonomy performs the full task: navigating routes, handling obstacles and uncertainty, completing missions such as deliveries or inspections, docking for charging, or manipulating objects, while the human shifts into a supervisory role rather than an operational one.


Site Node | Primary Focus | Autonomy-Related Examples
Vehicles | Product view of EVs and autonomous machines | Autonomous cars, robotaxis, autonomous trucks, delivery robots, drones, humanoid and quadruped model pages
Autonomy | Technology and capability view | Autonomous Vehicle Directory, Sensors & Compute, Autonomy Stack, Robotics & Humanoids
Fleets | Operations and control of vehicles and robots at scale | Autonomous fleets, robotic fleets, depot integration, charging and routing for autonomous assets
Systems Hub | Deep systems architectures | SDS loops, OTA pipelines, digital twins, energy-aware scheduling, facility-scale automation

Vehicles tells you what machines exist, Fleets shows how they are operated, Autonomy explains how the technology works, and the Systems Hub shows how everything fits together at system scale.


Autonomous Vehicles

This directory covers autonomous form factors across land, sea, and air, connecting platform-level autonomy capabilities to the individual models and families under the Vehicles node.

Domain | Examples | Primary Uses
Passenger Mobility | Robotaxis, autonomous shuttles, future L4/L5 ride-hail EVs | On-demand ride services, robo-shuttles, urban and campus transport
Freight & Logistics | Autonomous delivery vans, robotrucks, yard and port tractors, sidewalk delivery bots | Middle-mile freight, last-mile delivery, yard and terminal operations, logistics hubs
Heavy Equipment | Autonomous mining trucks, autonomous construction and agricultural equipment | Mining haulage, earthmoving, field operations, industrial sites
Aviation & Maritime | Cargo drones, inspection UAVs, autonomous tugs and workboats | Logistics, infrastructure inspection, offshore operations, port support

Legged Robots & Mobile Robotics

Robotics and humanoids extend the autonomy story beyond road vehicles. They bring autonomy into factories, warehouses, depots, and built environments designed for people.

  • Legged Robots (Humanoids and Quadrupeds): Tesla Optimus, Agility Digit, Fourier GR-1, Unitree humanoids and quadrupeds, and similar platforms that combine locomotion, perception, and manipulation.
  • Mobile Robotics: warehouse AMRs, sidewalk delivery bots, yard and port robots, security and inspection robots that share autonomy concepts with vehicles but operate at lower speeds and in constrained environments.
  • Multi-Agent Robotics: coordinated fleets of robots and vehicles operating under shared missions, such as depot operations where trucks, delivery bots, and humanoids work together.

These systems run variants of the same autonomy stack as AVs but add legged locomotion, multi-contact planning, and manipulation of objects in close proximity to humans.
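
As a rough illustration, the sketch below shows one way a shared autonomy core might be specialized for wheeled and legged platforms. The class and method names are invented for this example and do not correspond to any real product's code.

```python
class AutonomyCore:
    """Perception, prediction, and planning shared across platforms."""
    def plan(self, frames):
        return {"path": [], "task": None}   # placeholder high-level plan

class WheeledVehicle(AutonomyCore):
    def act(self, plan):
        pass   # wheeled control: steering, acceleration, braking

class LeggedRobot(AutonomyCore):
    def act(self, plan):
        # Layers that legged platforms add on top of the shared stack:
        contacts = self.plan_contacts(plan)   # multi-contact footstep planning
        self.balance(contacts)                # whole-body balance and recovery
        self.manipulate(plan)                 # object handling close to humans

    def plan_contacts(self, plan):
        return []   # stub: sequence of foot placements

    def balance(self, contacts):
        pass        # stub: joint-level stabilization

    def manipulate(self, plan):
        pass        # stub: arm and gripper control
```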


Sensors & Compute

Sensors and compute form the physical substrate of autonomy. This dimension covers the perception hardware and on-board processing required to run advanced autonomy stacks.

  • Cameras: forward, surround, fisheye, interior
  • Radar: conventional, imaging, and 4D radar for adverse weather and redundancy
  • LiDAR: spinning, solid-state, and FMCW LiDAR in select autonomy stacks
  • Other Sensors: ultrasonic, time-of-flight, inertial measurement units, GPS and GNSS
  • Autonomy Compute: Tesla HW4/HW5, NVIDIA Drive platforms, Qualcomm Ride, Mobileye EyeQ Ultra, Horizon Journey, and other domain-specific accelerators

These components define what an autonomous platform can perceive, how quickly it can react, and how efficiently it can run complex neural networks within the constraints of EV power and thermal envelopes.
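
To make that trade-off concrete, here is a hypothetical sensor-suite description with a crude power-budget check. Every name and figure is an illustrative assumption, not a real platform's specification.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    name: str
    rate_hz: float   # update rate bounds how quickly the stack can react
    power_w: float   # draw against the EV's power and thermal envelope

# Hypothetical suite; real platforms vary widely in sensor count and mix.
SUITE = [
    Sensor("surround_camera", rate_hz=30.0, power_w=4.0),
    Sensor("imaging_radar", rate_hz=20.0, power_w=8.0),
    Sensor("solid_state_lidar", rate_hz=10.0, power_w=15.0),
    Sensor("imu", rate_hz=200.0, power_w=1.0),
]

COMPUTE_W = 250.0   # assumed draw of the autonomy computer itself

def within_envelope(suite, budget_w=400.0):
    """Check total draw against an assumed vehicle-level power budget."""
    total = COMPUTE_W + sum(s.power_w for s in suite)
    return total <= budget_w
```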


Autonomy Stack

The autonomy stack is the software and AI architecture that transforms sensor data into safe, useful action. At a high level, most stacks share the same core layers, whether they run in a robotaxi, a delivery bot, or a legged humanoid.

Layer | Role | Vehicles | Robotics
Perception | Turn raw sensor data into objects, lanes, free space, and scene understanding | Detect lanes, traffic actors, signs, road edges, and obstacles | Detect people, shelves, pallets, stairs, tools, and task-relevant objects
Prediction | Forecast how other agents and the environment will evolve over time | Predict vehicle and pedestrian motion, traffic light phases, merging behavior | Predict human motion, forklift paths, robot interactions, object dynamics
Planning | Choose safe, efficient trajectories and high-level behaviors | Route selection, lane changes, merges, unprotected turns, docking for charging | Path planning in cluttered spaces, task sequencing, approach and retreat behaviors
Control | Translate plans into smooth, stable motion | Steering, acceleration, braking, traction and stability management | Leg joint control, balance and recovery, manipulation, fine motion control
Learning & OTA | Close the loop between field data and model updates | Fleet data collection, training on clusters such as Dojo or GPU clouds, OTA rollouts | Task and environment data, simulation-driven training, OTA updates to robot fleets
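
The Learning & OTA row is what makes the stack improve over time. A minimal sketch of that loop, with invented function names and a staged (canary) rollout pattern, might look like this:

```python
def collect_field_data(fleet):
    """Fleet data collection: each unit uploads interesting episodes."""
    return [clip for unit in fleet for clip in unit.upload_clips()]

def train_model(dataset):
    """Training on a cluster (GPU cloud or similar); returns a new model."""
    return {"version": "next", "weights": None}   # placeholder artifact

def ota_rollout(fleet, model, canary_fraction=0.05):
    """Stage the update: a small canary group first, then the whole fleet."""
    n_canary = max(1, int(len(fleet) * canary_fraction))
    for unit in fleet[:n_canary]:
        unit.install(model)   # watch canary metrics before continuing
    for unit in fleet[n_canary:]:
        unit.install(model)
```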

Autonomy levels (L2 through L4/L5) describe how much of the driving task the system handles and under which conditions. Higher levels demand stronger perception, more robust prediction, and tighter integration between planning, control, safety, and human supervision.
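
For reference, the SAE J3016 levels mentioned above can be summarized as a simple lookup (descriptions abbreviated; ODD means operational design domain, and the standard holds the authoritative definitions):

```python
AUTONOMY_LEVELS = {
    "L2": "Partial automation: system steers and accelerates; driver supervises continuously",
    "L3": "Conditional automation: system drives within its ODD; driver must take over on request",
    "L4": "High automation: system drives and handles fallback within its ODD; no driver needed there",
    "L5": "Full automation: system drives under all conditions a human could; no ODD restriction",
}

def requires_continuous_supervision(level: str) -> bool:
    """At L2 and below, a human must monitor the driving task at all times."""
    return level in ("L0", "L1", "L2")
```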


Market Outlook for Autonomy

Autonomy adoption will not progress evenly across all segments. Some use cases are poised for rapid scale; others will grow steadily in niche but high-value roles. The table below ranks major autonomy segments by expected adoption through 2030.

Rank | Segment | Adoption Outlook | Notes
1 | Robotaxis and Autonomous Ride Services | Very High | High utilization, strong unit economics, and dense urban demand make robotaxis one of the most transformative autonomy applications.
2 | Autonomous Delivery and Logistics | High | Middle-mile robotrucks, last-mile delivery vans, and sidewalk robots benefit from repeatable routes and strong cost pressure in logistics.
3 | Autonomous Heavy Equipment | High | Mining, agriculture, and construction offer controlled environments and strong safety and productivity gains, supporting early autonomy at scale.
4 | Logistics Robots and Mobile Robotics | High | Warehouse AMRs, yard and port robotics, and industrial mobile robots will continue to expand as e-commerce and automation investments grow.
5 | Humanoids and Quadrupeds | Medium-High | Legged robots are earlier in the adoption curve but have large potential in factories, logistics, and service roles, particularly where environments are already designed for humans.