AI infrastructure and deployment platform for Southeast Asian enterprises
MTAI operates an AI compute and deployment stack (vLLM, SGLang, Ollama, FastAPI, PostgreSQL, and the vector databases Milvus, Qdrant, and pgvector), targeting multi-sector adoption across healthcare, finance, retail, manufacturing, and telecommunications. The hiring profile is heavily finance-skewed (4 finance roles, 1 engineering, 1 HR, 1 legal) and shows accelerating M&A and post-transaction integration activity, suggesting growth through acquisition or portfolio consolidation rather than organic engineering velocity.
MTAI is an AI solutions provider serving enterprises across Southeast Asia, with a particular focus on healthcare, finance, retail, manufacturing, and telecommunications. The company delivers AI compute infrastructure, software, and deployment services designed for high-performance, industry-specific use cases. Based in Kuala Lumpur with 51–200 employees, MTAI operates a technical stack centered on open-source LLM orchestration (vLLM, Ollama, SGLang) layered over vector search (Milvus, Qdrant) and traditional data infrastructure (PostgreSQL, Redis). Current priorities span LLM deployment optimization, data pipeline engineering for model training, and internal compliance scaling, suggesting both customer-facing AI product maturity and operational complexity driven by expansion.
MTAI builds on vLLM, SGLang, and Ollama for LLM inference, FastAPI for application serving, PostgreSQL and Redis for data storage and caching, and Milvus/Qdrant for vector search: a stack optimized for open-source model deployment and multi-tenant inference.
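How the multi-tenant inference piece of such a stack might fit together can be sketched in a few lines. The sketch below assumes an OpenAI-compatible chat endpoint, which vLLM, SGLang, and Ollama each expose, behind a thin per-tenant router; every host name, port, tenant label, and the routing policy itself are hypothetical, not details confirmed about MTAI's deployment.

```python
import json

# Hypothetical mapping of tenants to inference backends. Because vLLM,
# SGLang, and Ollama all serve an OpenAI-compatible
# /v1/chat/completions route, a router can dispatch per tenant without
# backend-specific request code.
BACKENDS = {
    "healthcare": "http://vllm.internal:8000",     # assumed vLLM host
    "finance":    "http://sglang.internal:30000",  # assumed SGLang host
    "default":    "http://ollama.internal:11434",  # assumed Ollama host
}

def pick_backend(tenant: str) -> str:
    """Return the base URL of the inference backend for a tenant."""
    return BACKENDS.get(tenant, BACKENDS["default"])

def build_chat_request(tenant: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for an OpenAI-compatible chat call."""
    url = f"{pick_backend(tenant)}/v1/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, body

url, body = build_chat_request("finance", "llama-3.1-8b", "Summarise this filing.")
print(url)  # http://sglang.internal:30000/v1/chat/completions
```

In practice this routing function would sit inside the FastAPI layer mentioned above, with unknown tenants falling through to the default backend.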
MTAI targets healthcare, finance, retail, manufacturing, and telecommunications across Southeast Asia, with AI solutions tailored to sector-specific compliance, performance, and scalability needs.
Other companies in the same industry, closest in size