Autonomous control systems for heavy construction equipment
Bedrock Robotics builds autonomy software for construction machinery, running a hardware-forward stack (NVIDIA Jetson, CUDA, lidar, GNSS) paired with ML infrastructure (PyTorch, ONNX, TensorRT), and is now experimenting with NeRF for perception. The hiring mix is heavily weighted toward embedded and systems engineers (39 of 45 roles), with meaningful data and ML investment (4 data roles), signaling they're solving perception and edge-deployment problems at scale, not just in simulation.
Bedrock Robotics develops autonomy software that retrofits onto existing heavy construction equipment, enabling autonomous operation without replacing machinery. Founded in 2024, the company addresses an acute structural gap: demand for construction labor is outpacing workforce growth, leaving a persistent shortage of skilled operators. They deploy autonomous excavators and other equipment on job sites, which requires real-time ML inference on resource-constrained hardware, integrated fleet management, and field-hardened data pipelines for continuous model improvement. The business operates in the United States with an engineering-focused organization of 51–200 people.
Bedrock uses NVIDIA Jetson, CUDA, and lidar/GNSS for onboard autonomy hardware; Rust, C, C++, and Python for core software; PyTorch, ONNX, and TensorRT for ML inference; and SolidWorks, Fusion 360, and CATIA for CAD. They are also adopting NeRF for perception tasks.
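A PyTorch → ONNX → TensorRT pipeline typically leans on post-training int8 quantization to hit real-time inference budgets on Jetson-class hardware. As a minimal sketch of the underlying idea (symmetric per-tensor quantization; pure Python, all names hypothetical and not Bedrock's code):

```python
def quantize_int8(values, scale=None):
    """Symmetric per-tensor int8 quantization: real ≈ scale * int8."""
    if scale is None:
        # Calibration: pick the scale so the largest magnitude maps to 127.
        scale = max(abs(v) for v in values) / 127.0 or 1.0
    q = [max(-128, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate real values from the int8 representation."""
    return [scale * x for x in q]

# Toy weight tensor: the int8 version costs 1 byte per value instead of 4.
weights = [0.5, -1.27, 0.031, 1.0]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
```

In production the calibration pass runs over representative activations rather than a single tensor, and TensorRT handles the per-layer bookkeeping; the sketch only shows why an 8-bit representation preserves accuracy when the scale is chosen from the data.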
Core projects include autonomous excavator deployment, onboard autonomy software, simulation environments, real-time ML optimization for edge hardware, data ingestion pipelines, fleet management systems, and sensor fusion architecture for autonomous construction equipment.
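Sensor fusion for equipment of this kind commonly combines GNSS fixes with a motion model via some Kalman-filter variant. A minimal 1-D sketch of that pattern, fusing noisy position measurements with a constant-velocity prediction (illustrative assumptions throughout; this is not Bedrock's design):

```python
class Kalman1D:
    """Minimal 1-D Kalman filter: scalar position state, known velocity input."""

    def __init__(self, x0, p0, q, r):
        self.x = x0  # position estimate
        self.p = p0  # estimate variance
        self.q = q   # process noise added per prediction step
        self.r = r   # measurement noise variance (e.g. GNSS fix quality)

    def step(self, velocity, dt, z):
        # Predict: advance the position by the motion model; uncertainty grows.
        self.x += velocity * dt
        self.p += self.q
        # Update: blend in the measurement z, weighted by the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

# Hypothetical run: truth advances 1 m per step; GNSS fixes are noisy.
kf = Kalman1D(x0=0.0, p0=1.0, q=0.01, r=0.25)
for z in [0.9, 2.1, 2.9, 4.2]:
    est = kf.step(velocity=1.0, dt=1.0, z=z)
```

A real stack would track position, velocity, and heading as a vector state and fuse lidar odometry and IMU data as additional measurement channels, but the predict/update structure is the same.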