Japanese-language LLM research and development with production ML infrastructure
SB Intuitions is a public R&D company building large language models optimized for Japanese. The tech stack reveals deep ML-infrastructure maturity: vLLM, SGLang, and TensorRT-LLM for inference optimization; Kubernetes, Docker, Terraform, and CloudFormation for orchestration; and multi-cloud deployment across GCP, AWS, and Azure. The project backlog (Sarashina model deployment, petabyte-scale data management, multi-datacenter integration, and incident response) maps directly to the stated pain points (LLM scaling, data governance, network security), indicating an organization moving from research prototypes to production systems.
SB Intuitions develops large language models with a focus on Japanese-language capability. A public company based in Tokyo with 51–200 employees, it is structured around engineering, data, security, and research functions. The project portfolio spans model optimization and inference deployment (the Sarashina system), petabyte-scale distributed data infrastructure using S3 and multi-datacenter storage, CI/CD pipeline construction, and compliance governance. Hiring is active and accelerating in Japan, concentrated in engineering and senior technical roles.
Multi-cloud (GCP, AWS, Azure) with specialized ML tooling: vLLM, SGLang, TensorRT-LLM for LLM inference; Kubernetes, Docker, Terraform for orchestration; Apache Spark and Hadoop for data processing; Go, Rust, Python, Java for services.
Systems for converting model capability into user-facing value, petabyte-scale distributed data management, multi-datacenter integration, LLM performance optimization, and infrastructure for incident response and security governance.