AI-powered sign language translation for digital accessibility
Hand Talk by Sorenson uses AI to translate digital content into sign language, with an inference stack built on PyTorch, ONNX, and OpenVINO for efficiency. Its tech stack and active projects point to a company scaling ML inference pipelines: it is hiring across data, engineering, and sales while tackling production deployment, latency, and cost optimization. This is a business-impact company doing hard infrastructure work.
Hand Talk by Sorenson builds AI-driven accessibility solutions that automatically translate websites, platforms, and digital experiences into sign language. Founded in 2012 and based in Brazil, the company employs a 51–200-person team and uses virtual sign-language avatars (Hugo and Maya) to connect brands with deaf and hard-of-hearing audiences. The product combines real-time inference, data platform infrastructure, and assistive features to make digital content inclusive. Current hiring focuses on data engineering and sales roles across Brazil.
The stack spans FastAPI, Flask, Node.js, PyTorch, ONNX, OpenVINO, Docker, Kubernetes, AWS, Terraform, React, and SQL. The ML inference pipeline (PyTorch → ONNX → OpenVINO) is optimized for low-latency sign-language translation.
Active projects include ML inference optimization, real-time inference pipelines, a data lakehouse implementation on AWS, and infrastructure as code with Terraform. The core challenge is scaling inference to production with low latency and cost efficiency.
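Latency work of this kind is typically validated with percentile measurements rather than averages, since tail latency is what users feel. A minimal stdlib sketch, with `fake_translate` as a hypothetical stand-in for a real model call:

```python
import time
import statistics

def benchmark(infer, payloads, warmup=3):
    """Measure per-call latency in milliseconds and report p50/p95/mean."""
    # Warm up to exclude one-time costs (model load, cache fill).
    for p in payloads[:warmup]:
        infer(p)

    latencies = []
    for p in payloads:
        t0 = time.perf_counter()
        infer(p)
        latencies.append((time.perf_counter() - t0) * 1000.0)  # ms

    latencies.sort()
    return {
        "p50_ms": statistics.median(latencies),
        "p95_ms": latencies[int(len(latencies) * 0.95) - 1],
        "mean_ms": statistics.fmean(latencies),
    }

# Hypothetical stand-in for an inference call into the pipeline.
def fake_translate(text):
    return text[::-1]

stats = benchmark(fake_translate, ["hello"] * 100)
```

Tracking p95 alongside mean latency makes regressions in the slow tail visible even when the average looks healthy.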