Ag Robotics Technology

Illinois' crop-counting robot earns top recognition

Today's crop breeders are trying to boost yields while also preparing crops to withstand severe weather and changing climates. To succeed, they must locate the genes for high-yielding, hardy traits in crop plants' DNA. A robot developed by the University of Illinois to find these proverbial needles in the haystack received the Best Systems Paper Award at Robotics: Science and Systems, the preeminent robotics conference, held last week in Pittsburgh.

"There's a real need to accelerate breeding to meet global food demand," said principal investigator Girish Chowdhary, an assistant professor of field robotics in the Department of Agricultural and Biological Engineering and the Coordinated Science Lab at Illinois. "In Africa, the population will more than double by 2050, but today the yields are only a quarter of their potential."

Crop breeders run massive experiments comparing thousands of different cultivars, or varieties, of crops over hundreds of acres and measure key traits, like plant emergence or height, by hand. The task is expensive, time-consuming, inaccurate, and ultimately inadequate: a team can only measure a fraction of the plants in a field by hand.

"The lack of automation for measuring plant traits is a bottleneck to progress," said first author Erkan Kayacan, now a postdoctoral researcher at the Massachusetts Institute of Technology. "But it's hard to make robotic systems that can count plants autonomously: the fields are vast, the data can be noisy (unlike benchmark datasets), and the robot has to stay within the tight rows in the challenging under-canopy environment."

Illinois' 13-inch-wide, 24-pound TerraSentia robot is compact, transportable, and autonomous. It captures each plant from top to bottom using a suite of camera sensors, algorithms, and deep learning. Using a transfer learning method, the researchers taught TerraSentia to count corn plants with just 300 images, as reported at the conference (DOI: 10.1002/rob.21794).
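
The article does not detail the training pipeline, so the following is only a minimal sketch of generic transfer learning for image-based counting, assuming a PyTorch/torchvision setup: an ImageNet-pretrained backbone is frozen and a small regression head is fine-tuned on a few hundred labeled frames. All names, shapes, and hyperparameters are illustrative assumptions, not the TerraSentia implementation.

```python
# Illustrative sketch only: generic transfer learning for per-frame plant
# counting. Nothing here is taken from the TerraSentia codebase.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from torchvision import models

def build_count_model():
    # Reuse an ImageNet-pretrained backbone and freeze its features.
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    for param in backbone.parameters():
        param.requires_grad = False
    # Replace the classifier with a single-output head that regresses
    # the number of plants visible in a frame.
    backbone.fc = nn.Linear(backbone.fc.in_features, 1)
    return backbone

def fine_tune(model, images, counts, epochs=10, lr=1e-3):
    # images: (N, 3, 224, 224) tensor; counts: (N, 1) tensor of labels.
    loader = DataLoader(TensorDataset(images, counts), batch_size=16, shuffle=True)
    optimizer = torch.optim.Adam(model.fc.parameters(), lr=lr)  # head only
    loss_fn = nn.MSELoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
    return model

if __name__ == "__main__":
    # Tiny synthetic stand-in for a small labeled field-image set.
    dummy_images = torch.randn(32, 3, 224, 224)
    dummy_counts = torch.randint(0, 10, (32, 1)).float()
    model = fine_tune(build_count_model(), dummy_images, dummy_counts, epochs=1)
```

Freezing the backbone is what lets a few hundred labeled images suffice: only the small output head is learned from the field data, while the general visual features come from pretraining.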

"One challenge is that plants aren't equally spaced, so just assuming that a single plant is in the camera frame is not good enough," said co-author ZhongZhong Zhang, a graduate student in the College of Agricultural Consumer and Environmental Science (ACES). "We developed a method that uses the camera motion to adjust to varying inter-plant spacing, which has led to a fairly robust system for counting plants in different fields, with different and varying spacing, and at different speeds."

Source: https://www.igb.illinois.edu
