Intric builds AI infrastructure for public-sector and enterprise customers operating under strict data residency and security constraints. Their stack—Python, FastAPI, Kubernetes, PostgreSQL, pgvector, vLLM across AWS/GCP/Azure—reveals a multi-cloud, self-hosted architecture designed for regulated deployments. Active hiring skews heavily toward sales (9 roles) relative to engineering (6), paired with simultaneous expansion into Spain, DACH, and Finland, indicating a transition from product validation toward market-driven growth.
Intric, founded in 2021 and based in Stockholm, develops an open AI platform aimed at governments and large enterprises that cannot use public cloud AI services due to sovereignty or compliance requirements. The company operates across three core areas: platform infrastructure (Kubernetes, multi-cloud orchestration), AI model serving and retrieval (vLLM, pgvector), and customer integration (FastAPI). Current pain points center on managing cloud infrastructure under security constraints, market expansion (Spain, DACH, Finland), and customer onboarding, reflected in active projects around market-entry strategy, partnership programs, and use-case ideation with pilot customers.
The backend runs Python, FastAPI, Docker, Kubernetes, PostgreSQL with pgvector, vLLM, and Redis across AWS, GCP, and Azure. The frontend uses Svelte, SvelteKit, and Tailwind CSS; infrastructure tooling includes Terraform and Helm.
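The pairing of PostgreSQL/pgvector with vLLM suggests a self-hosted retrieval-plus-generation pipeline, a common pattern for sovereignty-constrained deployments. As a minimal, hypothetical sketch (not Intric's actual code), the retrieval step typically ranks stored embeddings by pgvector's `<=>` cosine-distance operator; the equivalent computation in plain Python:

```python
import math

def cosine_distance(a: list[float], b: list[float]) -> float:
    """Cosine distance as computed by pgvector's <=> operator: 1 - cos(a, b)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

def top_k(query: list[float], rows: list[tuple[str, list[float]]], k: int = 3):
    """Rank (doc_id, embedding) rows by ascending distance to the query vector,
    mirroring the SQL form:
        SELECT doc_id FROM documents ORDER BY embedding <=> %(query)s LIMIT %(k)s;
    Table and column names here are illustrative, not from the source."""
    return sorted(rows, key=lambda r: cosine_distance(query, r[1]))[:k]
```

In a real deployment the ranking would run inside PostgreSQL (with an HNSW or IVFFlat index) rather than in application code; the Python version is only meant to make the operator's semantics concrete.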
Intric is actively hiring in Sweden and Germany, with ongoing market-expansion initiatives in Spain, the DACH region, and Finland.