AI orchestration platform for enterprise workflow automation
Thread AI builds infrastructure for orchestrating AI workflows across enterprise operations. The stack (React, TypeScript, Python, Spark, Flink, Airflow, plus Kubernetes and multi-cloud across AWS, GCP, and Azure) is engineered for data-intensive, event-driven processes at scale. The hiring mix is heavily senior-skewed (9 of 12 open roles) and engineering-focused, suggesting the company is past the early MVP stage and scaling toward production reliability; the concurrent emphasis on sales infrastructure (playbooks, funnel, personas) signals a transition from founder-led sales to a repeatable GTM motion.
Thread AI provides an AI orchestration platform called Lemma designed to help enterprises build and run mission-critical AI workflows and agents. The product sits between models, APIs, and data sources, embedding business logic and governance into automated processes. They target mid-market and enterprise operations teams looking to automate complex, event-driven workflows. The platform includes low-code workflow design, SDK support for custom logic, and multi-tenant governance. The company is 11–50 people, based in New York, and is privately held.
The core product is Lemma, an AI orchestration platform that connects models, APIs, and data sources to build and run enterprise AI workflows with embedded business logic and enterprise-grade governance.
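To make the orchestration pattern described above concrete, here is a minimal sketch of an event-driven workflow with a governance gate between steps. This is illustrative only: the names (`Workflow`, `Event`, `classify_ticket`, etc.) are hypothetical and do not reflect Lemma's actual SDK or API; the point is the general shape of "event triggers a pipeline of steps, with policy checks embedded in the run loop."

```python
# Hypothetical sketch of event-driven workflow orchestration with an
# embedded governance check. Not Lemma's real API; names are invented.
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class Event:
    kind: str
    payload: dict


@dataclass
class Workflow:
    steps: list = field(default_factory=list)

    def step(self, fn: Callable[[dict], dict]) -> "Workflow":
        """Append a processing step; returns self for chaining."""
        self.steps.append(fn)
        return self

    def run(self, event: Event, policy: Callable[[str, dict], bool]) -> dict:
        """Run each step in order, consulting the policy before each one."""
        data = event.payload
        for fn in self.steps:
            if not policy(fn.__name__, data):  # governance gate per step
                raise PermissionError(f"policy blocked step {fn.__name__}")
            data = fn(data)
        return data


# Stand-ins for a model call and a downstream routing API.
def classify_ticket(data: dict) -> dict:
    data["label"] = "refund" if "refund" in data["text"].lower() else "other"
    return data


def route_ticket(data: dict) -> dict:
    data["queue"] = {"refund": "billing"}.get(data["label"], "general")
    return data


def allow_all(step_name: str, data: dict) -> bool:
    return True  # a real policy would enforce tenant/role rules here


wf = Workflow().step(classify_ticket).step(route_ticket)
result = wf.run(Event("ticket.created", {"text": "Please refund my order"}),
                allow_all)
# result routes the ticket to the "billing" queue
```

The per-step policy callback is where multi-tenant governance would plug in; swapping `allow_all` for a tenant-aware check changes behavior without touching the workflow definition.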
The stack spans React and TypeScript on the frontend; Python, Java, and Go on the backend; Apache Spark, Flink, and Airflow for data pipelines; Kubernetes and Docker for container orchestration; and AWS, GCP, and Azure for cloud infrastructure.
Comparable companies: other companies in the same industry, closest in size.