Tyle is building the visual biomarker layer for human health.
Computer vision and multimodal AI now make it possible to extract meaningful biological signals from visual data at scale. Skin and other observable traits contain far more information about systemic health than is currently captured or utilized. We are building the models and infrastructure to capture, interpret, and continuously learn from those signals.
Our ambition is for that infrastructure to become foundational: turning high-quality visual input into continuous, preventative insight at scale.
Founded by engineers and product leaders from Meta and Google, Tyle is operating in stealth, closing a sizable pre-seed round, and launching its first few pilots.
This is a rare opportunity to define a new layer of health infrastructure from first principles.
The role
We are hiring a Founding Engineer to take ownership of and scale Tyle’s core technology stack, spanning proprietary imaging hardware, capture software, data infrastructure, multimodal models, and live product interfaces.
Within this system, you will improve signal quality at capture, strengthen data and training loops, advance model performance in real-world conditions, and build the infrastructure required to deploy and operate reliably at scale. You will make the foundational architectural decisions that determine how the system evolves and how new capabilities are introduced over time.
A core part of the role is rapidly evaluating and integrating emerging computer vision research, translating new techniques and published work into production-grade improvements that create measurable impact.
This role is for an engineer who wants both depth and breadth, solving hard computer vision problems while directly shaping product direction and long-term technical strategy.
Core responsibilities
- Own and evolve the end-to-end computer vision system, from signal capture and data curation through model training, evaluation, and deployment
- Improve real-world model performance by refining data pipelines, feedback loops, and evaluation methodology
- Architect and scale the infrastructure that supports reliable inference, monitoring, and iteration in production
- Contribute across the stack where needed, including capture systems, backend services, and user-facing product surfaces
- Set technical direction and engineering standards as the company grows
What we’re looking for
You have deep experience in computer vision and have built, trained, and deployed models that perform in real-world conditions. You understand data quality, labeling strategy, evaluation methodology, and the constraints that affect model performance in production.
You are comfortable reading research papers, reproducing results, and adapting state-of-the-art techniques to practical constraints. You can distinguish between promising ideas and production-ready approaches, and you know how to turn experiments into reliable systems.
You operate comfortably as an AI generalist, moving between model development, experimentation infrastructure, and production deployment. You design clear systems, write maintainable code, and make sound architectural decisions under ambiguity.
You effectively leverage modern AI-assisted tools and workflows to accelerate development, automate repetitive tasks, and improve system design. You maintain a strong bar for correctness, performance, and reliability, and have shipped systems that meet it.
This is a hands-on role. You will spend the majority of your time building, experimenting, and shipping. We value clear thinking, written design, measurable progress, and fast iteration grounded in data.
We operate in person in the Bay Area for tight collaboration across hardware, software, and AI. Remote within the US is possible for exceptional candidates with the right experience.
The base salary range for this role is $120,000 – $200,000, depending on experience and scope. The role also includes meaningful equity participation, with total compensation targeted in the $210,000 – $260,000 range.