Industrial memory and storage for edge AI and embedded systems
Innodisk manufactures industrial-grade DRAM and flash storage and is now pivoting toward edge AI with NVIDIA and TensorFlow Lite integration. The hiring mix, weighted toward sales and senior roles, reflects a GTM acceleration phase, while active projects around AI inference pipelines and framework integration signal the company is moving beyond commodity memory into packaged AI solutions. Its adoption of RDMA and its focus on system-level debugging suggest it is tackling real-time performance bottlenecks in edge deployments.
Innodisk is a Taiwan-headquartered public manufacturer of industrial memory, storage, and embedded computing solutions. The company serves aerospace, defense, in-vehicle, IoT, and server markets with customized hardware components—primarily industrial DRAM and NVMe flash. Recent product work centers on edge AI inference, with deployments on ARM and x86 platforms running TensorFlow Lite and ONNX Runtime. Current pain points include memory compatibility in mixed-vendor environments, distributor channel performance, and stability of AI frameworks on resource-constrained edge devices.
Innodisk builds on NVIDIA hardware, Intel/ARM processors, Linux (Ubuntu/CentOS/RHEL), TensorFlow Lite, ONNX Runtime, and PCIe/NVMe for AI inference pipelines. It has recently adopted RDMA for low-latency memory access.
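Inference pipelines like those described above typically wrap a framework session behind a thin pre/post-processing layer. Below is a minimal pure-Python sketch of that shape; the `EdgeClassifier`-style functions, the normalization constants, and the stubbed `_run_session` (standing in for an ONNX Runtime or TensorFlow Lite call) are illustrative assumptions, not Innodisk code:

```python
# Hypothetical sketch of an edge AI inference pipeline: normalize raw
# 8-bit pixel data into the float range most frameworks expect, call a
# (stubbed) runtime session, then take the argmax over the logits.
from typing import List, Sequence

MEAN, SCALE = 127.5, 127.5  # assumed normalization constants


def preprocess(pixels: Sequence[int]) -> List[float]:
    """Map raw 0-255 pixel values to the [-1, 1] float range."""
    return [(p - MEAN) / SCALE for p in pixels]


def _run_session(inputs: List[float]) -> List[float]:
    # Stand-in for a real runtime call, e.g. with ONNX Runtime:
    #   session = onnxruntime.InferenceSession("model.onnx")
    #   logits = session.run(None, {"input": batch})[0]
    # Here we return fake logits so the sketch stays runnable.
    return [sum(inputs), 0.0, 1.0]


def classify(pixels: Sequence[int]) -> int:
    """Full pipeline: preprocess, infer, pick the top class index."""
    logits = _run_session(preprocess(pixels))
    return max(range(len(logits)), key=logits.__getitem__)
```

On a real device the stub would be replaced by the framework session, but keeping pre/post-processing separate from the session call is what makes it practical to swap ONNX Runtime for TensorFlow Lite across ARM and x86 targets.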
Innodisk is actively hiring in the United States and Taiwan, with current openings split between sales, engineering, and marketing functions.