Perception Engineer – Autonomous Heavy Machinery
Base Salary Range: $200,000–$300,000 + Equity
Location: On-site in San Francisco
Travel: Frequent (~50% at customer job sites)
About the Opportunity
Our client is developing autonomous and tele-operated systems for large-scale industrial vehicles, bringing modern robotics and AI into one of the most complex real-world environments imaginable. Their technology is designed to operate reliably in unstructured, high-variability outdoor settings where safety, robustness, and real-time performance matter.
This role is ideal for someone who is deeply motivated by physical systems, enjoys working close to hardware, and wants to see their perception stack directly influence the motion of multi-ton machines in production environments.
What You’ll Work On
- Designing and implementing 3D perception pipelines for tele-operated and semi-autonomous industrial vehicles
- Developing strong classical perception baselines and incrementally advancing them with deep learning–based models where appropriate
- Building real-time object detection, classification, and tracking systems to improve operator awareness and system safety
- Performing multi-sensor calibration and fusion across modalities, maintaining spatial and temporal alignment in challenging field conditions
- Optimizing perception algorithms for real-time execution on embedded compute, balancing latency, stability, and robustness
- Debugging and validating systems directly on machines operating in real-world job sites
What Our Client Is Looking For
- 2–10 years of experience working on perception systems for robotics or autonomous platforms
- Strong fundamentals in computer vision, 3D perception, and sensor fusion
- Experience building systems that operate outside of clean lab environments
- Comfort working close to hardware and debugging in the field
- A builder mindset — someone who has created systems from the ground up (professionally or personally)
- Willingness to travel frequently and spend significant time on-site with customers
- Ability to work on-site or relocate if needed
Bonus Experience
- Perception experience on heavy machinery, autonomous vehicles, or mobile robots
- Familiarity with LiDAR, radar, and multi-camera systems
- Experience deploying perception models to embedded or edge compute platforms
- Prior exposure to teleoperation or human-in-the-loop autonomy systems
Why This Role Is Different
- Your work directly controls real, large-scale machines, not simulations or demos
- Heavy exposure to field deployments and customer environments
- Opportunity to build foundational perception systems at an early stage
- Tight feedback loop between software, hardware, and real-world behavior