
Tecton Tech Stack

Feature store for real-time ML inference at scale

Software Development · San Francisco, California · 51–200 employees · Founded 2019 · Privately Held

Tecton operates a feature store—infrastructure that transforms raw data into ML-ready features and serves them for real-time predictions. The tech stack (Python, Java, Kotlin, Go, Spark, Ray, DuckDB, Kafka, Flink, Kubernetes, gRPC) reflects a systems-heavy, polyglot engineering org built for sub-millisecond latency and millions of requests per second. Active projects center on query execution (DAG-based multi-stage queries, optimization), serving platform scaling, and distributed compute—pain points that map directly to their core technical challenge: maintaining tight SLAs on availability and latency across a several-million-line monorepo.
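The serving path described above can be sketched in miniature. This is an illustrative sketch only, not Tecton's actual SDK: the function names and the in-memory dict (standing in for an online store such as Redis or DynamoDB) are hypothetical. It shows the core split between a batch materialization step that turns raw events into ML-ready features and a request-time lookup that stays fast because it is a single key-value read.

```python
from collections import defaultdict

# Illustrative online store: a plain dict stands in for Redis/DynamoDB.
online_store = {}

def materialize(raw_events):
    """Batch step: aggregate raw (user_id, amount) events into features."""
    counts = defaultdict(int)
    totals = defaultdict(float)
    for user_id, amount in raw_events:
        counts[user_id] += 1
        totals[user_id] += amount
    for user_id in counts:
        online_store[user_id] = {
            "txn_count_7d": counts[user_id],
            "avg_amount_7d": totals[user_id] / counts[user_id],
        }

def get_features(user_id):
    """Serving step: one key-value lookup keeps request latency low."""
    return online_store.get(user_id, {"txn_count_7d": 0, "avg_amount_7d": 0.0})

materialize([("u1", 10.0), ("u1", 30.0), ("u2", 5.0)])
print(get_features("u1"))  # {'txn_count_7d': 2, 'avg_amount_7d': 20.0}
```

The design point this illustrates is that all expensive computation happens ahead of request time; the online path only reads precomputed values.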

Tech Stack 29 technologies

Core Stack

Python, Java, Kotlin, Go, Apache Spark, Databricks, Snowflake, BigQuery, Redshift, AWS DynamoDB, Redis, Kubernetes, gRPC, PostgreSQL, Apache Flink, Kafka, Ray, Arrow, DuckDB, AWS EMR, Google Cloud Dataproc, GCP Bigtable, SQL, Bazel, Azure

What Tecton Is Building

Challenges

  • Complex data problems in production machine learning
  • Tight SLAs on availability and latency
  • Scaling the serving platform to millions of requests per second
  • Supporting a several-million-LoC polyglot monorepo
  • Maintaining multiple well-tested Python SDK versions
  • Friction points in building software
  • Scalability of CI infrastructure
  • Low-latency serving infrastructure
  • Ensuring compliance with data residency requirements

Active Projects

  • Evolving the query execution engine to support complex multi-stage queries with DAGs
  • Bazel monorepo ecosystem
  • Build features
  • Scaling the serving platform to handle millions of requests per second
  • Building an integrated observability solution
  • Product workflows for feature development and production
  • Local development environment for cloud orchestration
  • Implement Tecton deployments
  • Query optimizer
  • Distributed compute and resource management
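The first project above, DAG-based multi-stage query execution, can be illustrated with a minimal sketch. The stage names (`scan`, `filter`, `aggregate`) and the `run_dag` helper are hypothetical, not Tecton's engine: the point is only that a multi-stage query plan forms a directed acyclic graph, so stages can be executed in topological order, each consuming the outputs of its prerequisites.

```python
from graphlib import TopologicalSorter

def run_dag(stages, deps):
    """Execute stages in dependency order.

    stages: stage name -> callable taking its prerequisites' outputs.
    deps:   stage name -> set of prerequisite stage names.
    """
    results = {}
    for name in TopologicalSorter(deps).static_order():
        inputs = [results[d] for d in deps.get(name, ())]
        results[name] = stages[name](*inputs)
    return results

# A toy three-stage plan: scan -> filter -> aggregate.
stages = {
    "scan": lambda: [1, 2, 3, 4],
    "filter": lambda rows: [r for r in rows if r % 2 == 0],
    "aggregate": lambda rows: sum(rows),
}
deps = {"filter": {"scan"}, "aggregate": {"filter"}}
print(run_dag(stages, deps)["aggregate"])  # 6
```

A real engine would add optimization passes over the same graph structure (e.g. pushing filters closer to scans) before execution, which is where a query optimizer fits in.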

Hiring Activity

Minimal · 30 roles · 0 in last 30 days

Department

  • Engineering: 13
  • Data: 6
  • Sales: 3
  • Product: 2

Seniority

  • Senior: 19
  • Staff: 3
  • Mid: 2

About Tecton

Tecton is a feature store platform founded in 2019 by the creators of Uber's Michelangelo ML platform. The product transforms raw data into ML-ready features and serves them for real-time predictions—used for fraud detection, credit decisions, and personalization. The company's hiring profile is senior-heavy (13 of its 30 listed roles are in engineering, predominantly at senior/staff level) with minimal recent hiring velocity, suggesting a stable, focused product roadmap rather than rapid scaling. They serve mid-market to enterprise ML teams running production inference workloads that demand high availability and sub-millisecond response times.

Headquarters: San Francisco, California
Company Size: 51–200 employees
Founded: 2019
Hiring Markets: United States, Canada

Frequently Asked Questions

What tech stack does Tecton use?

Python, Java, Kotlin, Go, Apache Spark, Ray, Arrow, DuckDB, Kafka, Flink, Kubernetes, gRPC, PostgreSQL, Snowflake, BigQuery, Redshift, DynamoDB, Redis, and cloud infrastructure across AWS, GCP, and Azure.

What is Tecton working on?

Query execution engines with DAG support, query optimization, scaling serving to millions of requests per second, distributed compute, observability, and improved developer workflows for feature development.
