Metaverse studio for live events and transmedia production
ZeroSpace operates a metaverse lab and studio space for live events and media production. The tech stack reveals a production-heavy operation: Unreal Engine and C++ for real-time experiences; volumetric capture pipelines built on FBX, Maya, and Houdini; and an emerging AI/ML layer (PyTorch, TensorFlow, Stable Diffusion, ComfyUI) applied to petabyte-scale media datasets. Active project work spans real-time virtual production, motion capture processing, and ML infrastructure deployment, indicating a shift from pure venue operations toward in-house generative and volumetric content creation.
Notable leadership hires: Technical Director
ZeroSpace is a metaverse studio and live-events venue in Brooklyn, functioning as both a physical/digital space for immersive experiences and a production studio, with in-house capabilities across virtual production, volumetric capture, motion capture, and generative AI workflows. The team is engineering-focused with dedicated technical leadership, reflecting the infrastructure-heavy nature of real-time virtual production and media ML systems. Operations span immersive experience design, technology infrastructure, and live-event production.
Unreal Engine, C++, Python, Maya, Blender, Houdini for 3D production; PyTorch, TensorFlow, Stable Diffusion for ML workloads; Terraform, CloudFormation, Apache Airflow, Kubeflow for infrastructure; ffmpeg, ImageMagick for media processing.
Real-time virtual production pipelines, volumetric capture integration, motion capture data processing, ML infrastructure for petabyte-scale media datasets, and generative workflows using ComfyUI.
11–50 employees. Current open roles span engineering (4), product (1), and production (1), with a seniority mix skewed toward senior- and director-level positions.