AI perception and planning stack for autonomous driving, from ADAS to Level 5
Autobrains builds deep-learning perception and motion-planning software for autonomous vehicles, from driver-assistance systems through full autonomy. The tech stack (PyTorch, TensorRT, ONNX Runtime, CUDA, QNX, AUTOSAR) reflects a mature embedded-AI company shipping to automotive hardware. Active hiring is concentrated in engineering and research (20 of 29 roles), dominated by senior and leadership positions; the emphasis on "GPU inference pipelines for embedded platforms" and "edge device deployment" signals aggressive optimization of model inference for cost and latency on production vehicles.
Notable leadership hires: Path Planning Team Lead
Autobrains develops modular autonomous-driving software spanning all levels of autonomy, from advanced driver-assistance to full driverless capability. The company sells to automotive OEMs and Tier 1 suppliers; strategic investors include BMW i Ventures, VinFast, Continental AG, and Knorr-Bremse AG. Core product areas cover perception, motion planning, trajectory optimization, and behavioral control, with recent work on fleet-feedback validation loops and integration with digital-twin environments. The organization operates across Israel, Germany, Vietnam, and the United States, with 51–200 employees and active hiring concentrated in engineering and research roles.
The core stack spans Python, PyTorch, C++, ROS 2, TensorRT, ONNX Runtime, CUDA, NVIDIA platforms, QNX, AUTOSAR, and Linux. The mix reflects deep-learning model development (PyTorch, CUDA) integrated with embedded automotive software standards (AUTOSAR, QNX).
Key focus areas: motion planning and control algorithms spanning ADAS through driverless autonomy, GPU inference optimization for embedded platforms, scenario-based validation with fleet-feedback loops, and integration of planning modules with world models and edge-AI stacks.
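To make the "trajectory optimization" focus concrete, here is a minimal sketch of one classic motion-planning primitive: a quintic polynomial segment that meets position, velocity, and acceleration boundary conditions. This is a textbook technique, not a description of Autobrains' planner; all numbers are illustrative.

```python
# Hypothetical illustration: quintic polynomial trajectory segment with
# position/velocity/acceleration constraints at both endpoints.
import numpy as np


def quintic_coeffs(s0, v0, a0, sT, vT, aT, T):
    """Coefficients c[0..5] of s(t) = sum(c[i] * t**i) meeting both boundaries."""
    A = np.array([
        [1, 0, 0,      0,       0,        0],       # s(0)
        [0, 1, 0,      0,       0,        0],       # s'(0)
        [0, 0, 2,      0,       0,        0],       # s''(0)
        [1, T, T**2,   T**3,    T**4,     T**5],    # s(T)
        [0, 1, 2*T,    3*T**2,  4*T**3,   5*T**4],  # s'(T)
        [0, 0, 2,      6*T,     12*T**2,  20*T**3], # s''(T)
    ], dtype=float)
    b = np.array([s0, v0, a0, sT, vT, aT], dtype=float)
    return np.linalg.solve(A, b)


# Plan a 3 s longitudinal segment: start at rest, arrive at 10 m doing 5 m/s.
c = quintic_coeffs(0.0, 0.0, 0.0, 10.0, 5.0, 0.0, 3.0)


def position(t):
    return sum(ci * t**i for i, ci in enumerate(c))
```

Real planners layer sampling, cost functions, and collision checks on top of primitives like this; the polynomial form just guarantees smooth (jerk-bounded) motion between states.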