
Blue River Technology

We’re Blue River, a team of innovators driven to create intelligent machinery that solves monumental problems for our customers. We empower our customers – farmers, construction crews, and foresters – to implement safer, more sustainable solutions that increase profitability with less reliance on scarce labor. We believe that focusing on the small stuff – pixel by pixel and task by task – leads to big gains. With our partners at John Deere, we bring innovative computer vision, machine learning, robotics, and product management solutions to production at scale, maximizing their potential impact.
Summary
We are looking for a highly skilled and innovative Principal Machine Learning Tech Lead to join the team building the perception stack for Blue River Technology’s autonomous tractor initiative. You will lead development of the ML models and the training and testing pipelines that enhance system safety and performance in challenging production environments. The role spans everything from developing initial prototypes to scaling for production. We are looking for an experienced, motivated, and collaborative technical leader to help build the future of autonomous agriculture.
Job Responsibilities
The primary job responsibilities are noted below.
- Lead the design, development, and deployment of machine learning and computer vision models that power real-time perception for autonomous agricultural robots.
- Own the full ML lifecycle—from dataset design and data collection through training, evaluation, model optimization, and deployment on embedded platforms.
- Collaborate cross-functionally with robotics engineers, software developers, and hardware teams to integrate perception systems into field-ready machines.
- Lead a team of CVML engineers, fostering technical growth, strong engineering practices, and a culture of experimentation and delivery.
- Define and execute the ML roadmap, balancing research exploration with production needs to meet performance, safety, and robustness targets in challenging environments.
- Prototype and validate new approaches quickly, using both real-world and synthetic data to improve detection, segmentation, and scene understanding under variable conditions (lighting, occlusion, dust, etc.).
- Continuously evaluate model performance in the field and refine approaches to maximize reliability and generalization across crop types, terrains, and climates.
Required Experience and Skills
- 8-12 years of experience developing high-performance ML systems.
- Experience shaping the technical vision for larger projects and providing guidance while contributing to core capabilities.
- Strong leadership and communication skills, and demonstrated experience in leading and mentoring teams.
- Hands-on work experience with GPU-based computing on large data sets.
- Extensive experience with deep learning frameworks (e.g., PyTorch).
- Bachelor’s or Master’s degree in Computer Science or a related field; graduate degree preferred.
Preferred Experience and Skills
- Production experience with developing hardened machine learning perception algorithms.
- Experience building and deploying full-stack ML pipelines, from data ingestion to model training, testing, and deployment.
- Experience in working with CVML & Robotics projects.
- Experience in working with the vehicle autonomy space.
- Familiarity with TPUs, GPUs, and FPGAs.
- Experience in metrics implementation, analysis, and dashboarding.
- Knowledge of robotics libraries such as ROS / ROS2.
- Only individual applicants will be considered. We do not work with third-party agencies or proxy interview services. Submissions from individuals misrepresenting their identity or experience will be immediately disqualified.
- All interviews are live and interactive. We assess real-time problem solving and communication skills; no pre-recorded responses or external assistance is permitted.
- We verify identity before final interview rounds. This may include live ID verification and technical screening on camera.
- Applicants must have direct, hands-on experience with the technologies listed. Candidates should be prepared to speak in depth about recent and relevant projects.
At Blue River, we’re passionate about creating an inclusive workplace that promotes and values diversity. While we have more work to do to advance diversity and inclusion, we’re investing in our programs, including recruiting, mentorship, career development, and learning & development to ensure they support our Diversity, Equity, and Inclusion goals. We support each employee in living a full life, enabling a thriving career, and accomplishing a meaningful, challenging mission while collaborating with incredible people. We are dedicated to building a diverse and inclusive workplace, so if you’re excited about this role but your experience doesn’t align completely with the job description, we encourage you to apply anyway.
We are an equal-opportunity employer and do not discriminate based on race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, perform essential job functions, and receive other benefits and privileges of employment. Please contact us to request an accommodation.
The US annual base salary for this position is up to $312,000, along with eligibility for Blue River’s bonus and benefit programs. Our salary ranges are determined by role, level, and location. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Your recruiter can share more about the specific salary range for your location during the hiring process. During the recruitment process, we may identify an alternative role or level to which you are more suited. If your ideal role at Blue River differs from the advertised position, we will provide an updated pay range as soon as possible during the hiring process.
#LI-AN1