WEKA builds storage infrastructure purpose-built for AI workloads, with a tech stack rooted in low-level systems programming (C/C++, kernel drivers, RDMA, NVMe, DPDK, SPDK) that minimizes latency and maximizes throughput. The company is actively addressing performance bottlenecks in kernel-level I/O and scaling to petabyte-scale distributed systems. Sales hiring (22 open roles) significantly outpaces engineering (17), indicating a shift from a product-driven to a sales-led motion as the platform matures.
WEKA provides a containerized, microservices-based storage system designed for enterprises running AI, machine learning, and high-performance computing workloads. The platform targets neoclouds and exascale AI environments where traditional storage architectures create bottlenecks. The company operates across multiple geographies (11 countries including US, India, Israel, UK, Australia, Singapore) and is focused on accelerating customer adoption and expanding sales coverage. Active development spans distributed file system components, kernel drivers, and multi-cloud proof-of-value implementations on OCI and other platforms.
WEKA uses C/C++, Rust, and kernel-level technologies (RDMA, DPDK, SPDK, NVMe) running on Linux. Infrastructure partners include AWS, GCP, and OCI, alongside hardware vendors Dell and Supermicro. Orchestration tools: Docker, Ansible.
WEKA hires across 11 countries: United States, India, Israel, United Kingdom, United Arab Emirates, Australia, Singapore, South Korea, China, Indonesia, and Germany.
Other companies in the same industry, closest in size: