Xinobi AI is a Tokyo-based AI startup (founded 2024) building personal AI agents through a stack centered on LangChain, LangGraph, and vector databases (Faiss, Milvus, Weaviate, Pinecone). The tech choices—RAG pipelines, LLM evaluation frameworks, and multi-cloud deployment (GCP, AWS, Azure)—signal a focus on retrieval-augmented agent capabilities rather than foundational model work. Hiring is balanced across engineering, product, and marketing with manager-level roles present, reflecting early-stage scaling beyond pure R&D into go-to-market execution.
Xinobi AI develops personal AI agents designed to think, learn, and grow alongside users. The company operates across Tokyo, Seoul, and Singapore, with a co-founding team that includes former Silicon Valley executives and a serial entrepreneur. The product roadmap centers on 2025 launches in Japan, Korea, and the U.S., supported by ongoing work on RAG memory systems, LLM benchmarking, and global GTM strategy. Current hiring focuses on expanding the core engineering and product teams while building out marketing capacity for international expansion.
Core stack: LangChain, LangGraph, FastAPI, and Python, with vector databases (Faiss, Milvus, Weaviate, Pinecone). Model access: OpenAI, Anthropic, Mistral, Google, Hugging Face. Infrastructure: GCP, AWS, Azure, Kubernetes, Docker. Experiment tracking: Weights & Biases, MLflow. Model serving: BentoML.
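The vector databases in this stack (Faiss, Milvus, Weaviate, Pinecone) all serve the same role in a RAG pipeline: nearest-neighbor search over embeddings. A minimal sketch of that retrieval step in pure Python, with toy vectors standing in for real embeddings (all names and data here are illustrative, not Xinobi code; production systems use approximate indexes rather than this exhaustive scan):

```python
import math

def cosine(a, b):
    # Cosine similarity between two dense embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, index, k=2):
    # Rank stored (doc_id, vector) pairs by similarity to the query --
    # the step a vector database performs at scale with ANN indexes.
    scored = sorted(index, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

# Toy 3-dimensional "embeddings" standing in for real model output.
index = [
    ("doc_a", [1.0, 0.1, 0.0]),
    ("doc_b", [0.0, 1.0, 0.2]),
    ("doc_c", [0.9, 0.2, 0.1]),
]
print(retrieve([1.0, 0.0, 0.0], index, k=2))  # → ['doc_a', 'doc_c']
```

The retrieved document IDs would then feed the generation step, with the chosen passages injected into the LLM prompt as context.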
Current workstreams: the personal AI product launch across Japan, Korea, and the U.S.; memory and retrieval pipelines for RAG; LLM evaluation frameworks; global GTM strategy; and market localization, particularly for Korea. The team is also scaling the agent backend infrastructure and building cross-border execution capability.
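The LLM evaluation work mentioned above typically reduces to scoring model outputs against references over a benchmark set. A minimal exact-match harness sketch, under the assumption of a prompt-in/answer-out model interface (the `toy_model` stub and the eval set are hypothetical placeholders for a real LLM call and dataset):

```python
def exact_match(prediction: str, reference: str) -> bool:
    # Normalize whitespace and case before comparing -- one of the
    # simplest metrics used in LLM benchmark harnesses.
    return prediction.strip().lower() == reference.strip().lower()

def evaluate(model_fn, eval_set):
    # model_fn: callable prompt -> answer; eval_set: (prompt, reference) pairs.
    # Returns the fraction of prompts answered correctly.
    hits = sum(exact_match(model_fn(p), ref) for p, ref in eval_set)
    return hits / len(eval_set)

# Stub "model" standing in for a real LLM API call.
def toy_model(prompt):
    return {"capital of Japan?": "Tokyo", "2 + 2?": "4"}.get(prompt, "")

eval_set = [
    ("capital of Japan?", "tokyo"),
    ("2 + 2?", "4"),
    ("capital of Korea?", "Seoul"),
]
print(evaluate(toy_model, eval_set))  # fraction answered correctly
```

Real frameworks add semantic metrics (embedding similarity, LLM-as-judge) on top of this loop, but the harness shape stays the same.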