When John Deere’s artificial intelligence (AI) misidentifies a weed, it’s not a bad product recommendation. It’s a farmer’s entire season gone wrong, thousands of dollars lost, and a livelihood at stake.

That existential pressure has shaped how the 187-year-old equipment giant approaches AI differently than almost anyone in Silicon Valley.

Fahad Khan, head of product management for Blue River, John Deere’s platform handling data, machine learning, and robotics, described the 2017 acquisition as “a match made in heaven” during a recent panel at the TechCrunch Disrupt conference in San Francisco.

Blue River brought software expertise and talent to complement John Deere’s hardware capabilities, creating what Khan called a critical intersection of “mission, machine, and model.”

The company’s flagship product, See and Spray, identifies weeds and sprays only them, dramatically reducing pesticide use through computer vision-based machine learning. But the simplicity of the concept belies the complexity of execution in real-world agriculture.
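The core loop is conceptually simple: classify each plant, then actuate a nozzle only for weeds. Here is a toy sketch in Python; the class, function names, threshold, and nozzle interface are all hypothetical illustrations, not John Deere’s actual API:

```python
# Toy sketch of a See-and-Spray-style decision loop. All names and
# thresholds here are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # "weed" or "crop", from a vision model
    confidence: float  # model confidence in [0, 1]
    nozzle_id: int     # which spray nozzle covers this plant

def nozzles_to_fire(detections, threshold=0.9):
    """Fire only for plants the model confidently labels as weeds.

    A high threshold biases toward sparing crops: a missed weed costs a
    little herbicide on a later pass, but spraying a crop kills it.
    """
    return [d.nozzle_id for d in detections
            if d.label == "weed" and d.confidence >= threshold]

frame = [
    Detection("weed", 0.97, nozzle_id=3),
    Detection("crop", 0.99, nozzle_id=4),
    Detection("weed", 0.62, nozzle_id=5),  # too uncertain: skip it
]
print(nozzles_to_fire(frame))  # only nozzle 3 fires
```

The asymmetry in the threshold choice mirrors the stakes Khan describes: the cost of a false positive (killing a crop) far exceeds the cost of a false negative (missing a weed).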

“The stakes in agriculture for AI and autonomy are almost existential,” Khan said. Unlike mistakes in digital environments that might result in “a few lost clicks,” errors in agricultural AI translate directly to lost yield and threatened livelihoods.

“We get maybe one chance with each farmer. If we mess that up, there’s no going back,” he said.

The global market for AI agricultural technology, which reached $4.7 billion in 2024, is on pace for a compound annual growth rate of more than 25% through the end of the decade, industry analysts project.

Precision farming has emerged as the dominant application of AI in agriculture, commanding 46% of the market share in 2024, while machine learning technologies account for 41% of AI deployments. Such technologies are helping farmers make data-driven decisions about everything from irrigation and fertilization to pest management and harvest timing.

The adoption curve shows significant momentum, with more than 70% of U.S. farmers now implementing at least one precision agriculture technique.

The payoff is substantial: Advanced farming practices using AI have demonstrated the potential to boost crop yields by as much as 30%, making the technology increasingly essential for farmers looking to maximize productivity while managing resources more efficiently in an era of climate uncertainty and rising production costs.

This reality has shaped John Deere’s approach to AI development, treating every system as mission-critical and building extensive resiliency and redundancy into its models. The company validates each step with human experts, recognizing that trust in agriculture “is earned, not assumed just because you are a technology company.”

A key differentiator in John Deere’s approach is its reliance on agronomists — experts with PhDs in agronomy — rather than general crowdsourced annotators. These specialists provide crucial guidance for training models to operate in extraordinarily variable conditions, from changing weather and lighting to different crop growth stages.

“Sometimes it’s impossible to identify a weed from a crop,” Khan said. “If you ask my 50-plus ML engineers, they wouldn’t know.” This expert knowledge integrates into every stage of the machine learning pipeline, from data collection and processing to identifying edge cases and evaluating model performance against real field conditions.

Jeff Mills, president of iMerit, emphasized that this expert-in-the-loop approach differs fundamentally from basic human-in-the-loop systems, with specialized knowledge deeply integrated into the technology stack rather than operating as an external component.

John Deere categorizes its models into three tiers: generic models like large language models for research summarization, vertical models specific to individual crops, and custom fine-grained models for autonomy applications. When Khan joined Blue River, the company used just two models for automation — perception and depth. Today, the complexity has expanded dramatically.

On tractors moving at 20 mph to 30 mph, the system has mere milliseconds to decide whether to spray, making latency and accuracy paramount. A missed pass can cost thousands of dollars, underscoring why the company maintains separate models for different crops and regions while consolidating them into unified perception and depth systems.
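That time budget is easy to make concrete with back-of-the-envelope arithmetic. The speeds come from the piece; the camera-to-nozzle distance is an assumed figure for illustration, not a Deere specification:

```python
# Rough time budget for a spray decision. Speeds are from the article;
# the 1.2 m camera-to-nozzle spacing is an assumed illustrative figure.
MPH_TO_MPS = 0.44704  # exact conversion factor

def decision_window_ms(speed_mph, camera_to_nozzle_m=1.2):
    """Time between a plant entering the camera's view and reaching
    the nozzle -- the window for capture, inference, and actuation."""
    speed_mps = speed_mph * MPH_TO_MPS
    return camera_to_nozzle_m / speed_mps * 1000.0

for mph in (20, 30):
    print(f"{mph} mph: {decision_window_ms(mph):.0f} ms to see, decide, and spray")
```

At 30 mph the whole pipeline, from image capture through inference to valve actuation, has to fit in well under a tenth of a second, which is why latency is treated as a first-class constraint rather than an optimization.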

The company’s evolution from camera-only systems to multimodal approaches reflects the harsh realities of field operations. Khan described the challenge vividly: tractors create massive dust clouds that obscure vision, requiring specialized cameras and additional sensors like radar and lidar.

“AI in the dust,” Khan called it, noting that multimodality provides essential resilience for identifying objects in challenging conditions. The company also leverages generative AI for synthetic data generation, though Khan emphasized its limitations in explaining model decisions.
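One simple way to get that resilience is late fusion: let a second modality carry the decision when the camera is unreliable. A hypothetical sketch, with illustrative thresholds and weights that are not Deere’s actual fusion logic:

```python
def fused_confidence(camera_conf, radar_conf, dust_level, dust_cutoff=0.7):
    """Toy late-fusion rule: trust the camera in clear air, lean on
    radar as dust obscures the view. All thresholds are illustrative.

    dust_level is a normalized obscuration estimate in [0, 1].
    """
    if dust_level >= dust_cutoff:
        # Camera is likely blinded; radar (or lidar) carries the call.
        return radar_conf
    # Otherwise blend, down-weighting the camera as dust increases.
    camera_weight = 1.0 - dust_level
    return camera_weight * camera_conf + (1.0 - camera_weight) * radar_conf

# Clear air: the camera dominates. Heavy dust: radar takes over.
print(fused_confidence(0.9, 0.6, dust_level=0.2))
print(fused_confidence(0.1, 0.8, dust_level=0.9))
```

The point of the sketch is the failure mode, not the formula: a camera-only system degrades silently in dust, while a fused system has an independent signal to fall back on.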

John Deere has established an AI operations center where farmers can examine exactly what their machines were “thinking.” The company breaks down operations into “see, sense, and act,” allowing farmers to review what the machine observed, its decision-making process, and the probabilities behind its actions.

This transparency proved especially valuable after an early setback when models trained in January and February failed when deployed in June and July. The team hadn’t accounted for how dramatically crops change throughout their growth cycle, a lesson that fundamentally altered their approach to data collection and model iteration.
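In machine learning terms, that setback was a train/serve distribution shift: winter imagery did not represent summer crops. One standard remedy is to stratify training data across growth stages, sketched here with hypothetical stage labels (this is a general technique, not Deere’s documented pipeline):

```python
import random

def stratified_sample(images_by_stage, n_per_stage, seed=0):
    """Draw up to n_per_stage training images from each growth stage, so
    a model trained in one season still sees every stage it will meet in
    the field. Stage names are hypothetical placeholders."""
    rng = random.Random(seed)  # fixed seed for reproducible sampling
    sample = []
    for stage in sorted(images_by_stage):
        images = images_by_stage[stage]
        sample.extend(rng.sample(images, min(n_per_stage, len(images))))
    return sample

corpus = {
    "seedling":  [f"seedling_{i}.jpg" for i in range(10)],
    "vegetative": [f"veg_{i}.jpg" for i in range(10)],
    "mature":    ["mature_0.jpg"],  # scarce stage: take what exists
}
print(len(stratified_sample(corpus, n_per_stage=3)))
```

A stratified draw also makes gaps visible: a stage with almost no data, like “mature” above, shows up as an undersized stratum rather than vanishing into an aggregate count.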

Looking ahead, Khan expects a significant percentage of John Deere’s tractors to be autonomous by 2030, with the technology expanding beyond agriculture into forestry and construction. The company is also developing “prescriptive AI” to advise farmers on what to plant, when to plant, and which techniques to use.

To that end, John Deere has worked closely with Carnegie Mellon University and other robotics experts to develop models for crop management — particularly bots with sophisticated hands to pick and sort.

“Hopefully,” Khan said, the goal is “becoming even a closer friend with the farmers, with the operators that we actually have”—augmenting farmer intuition with intelligence rather than replacing it.