Corn is a very valuable crop, but also an intensive crop requiring high amounts of water and nutrients to attain high yields.
That’s why for many decades, seed industry and government scientists have been working hard on ways to make rapid breeding improvements to every aspect of the plant, from larger root mass to better disease resistance. It’s not easy to efficiently measure differences in some traits among corn hybrid lines, but a new combined robotic and imaging technology is tackling that problem with regard to the trait of leaf angle.
That is, the more upright the leaves of a corn plant, the more sunlight they can capture, especially morning and afternoon sun. The result is more photosynthesis, along with a higher rate of plant growth and eventual grain production. Upright leaves also help reduce the shade any particular plant casts onto its neighbours, and can allow more plants to be grown in less field space while keeping grain yield similar.
This was shown as far back as 1968, when US scientists found that certain corn hybrids with more upright leaves gave more grain than others with larger (more horizontal) leaf angles, a finding that "strongly supports corn breeding programs in the area of plant geometry and crop canopy."
How do we measure plant leaf angle? Until now, there has been only one time-consuming way: measurement by hand. But now there's AngleNet, a wheeled robot carrying four tiers of cameras plus advanced modelling software, proven effective at getting leaf angle information to breeders much more efficiently than manual measurement, and just about as accurately. It's the culmination of years of work by a team at several universities: Iowa State, North Carolina State, Auburn, Cornell, Saint Michael's College and University of Missouri-Columbia.
As the robot moves (steered remotely using GPS) between crop rows, a pair of cameras at each of four heights captures images of the leaves on each plant. The customized dual-camera module (coined 'PhenoStereo,' with phenotype referring to a plant's visible traits) features "integrated strobe lighting for high-speed stereoscopic image acquisition." That is, the consistent supplemental light from the strobes and the cameras' fast shutter speeds overcome motion blur from the robot moving along (and from potential breezes moving the plant leaves). These features also overcome variable lighting among the corn plants caused by leaf shadows, which would otherwise introduce image inaccuracies.
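To illustrate the principle behind a stereoscopic camera pair like PhenoStereo, the sketch below shows how depth can be recovered from the horizontal shift (disparity) of the same feature between the left and right images. This is not the AngleNet implementation, and the focal length, baseline and disparity values are invented for the example.

```python
# Illustrative sketch only: depth recovery from a rectified stereo pair.
# All numbers below are hypothetical, not PhenoStereo's actual parameters.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for a calibrated, rectified stereo camera pair."""
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the two views")
    return focal_px * baseline_m / disparity_px

# Example: 1400 px focal length, 10 cm camera baseline, 35 px disparity
z = depth_from_disparity(1400.0, 0.10, 35.0)
print(f"estimated distance to leaf point: {z:.2f} m")  # prints 4.00 m
```

Repeating this for many matched points across the two views yields the cloud of 3D leaf points that the modelling step works from.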
The stereoscopic images from each level are then analysed to build a 3D model that determines the angle of each leaf photographed. 3D modelling is a necessity because leaves from neighbouring plants often block a particular leaf. That is, without reconstructing each leaf in 3D on the fly from rapid photos, its angle could not be determined accurately.
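Once 3D points along a leaf have been reconstructed, the final geometry step is straightforward: take the angle between the leaf's direction and the vertical stalk axis. The sketch below assumes hypothetical base and tip coordinates; AngleNet's actual pipeline is more involved.

```python
# Hypothetical sketch of the leaf-angle geometry step.
# Point coordinates are invented for illustration.
import math

def leaf_angle_deg(base, tip):
    """Angle (degrees) between the base->tip vector and the vertical z axis."""
    vx, vy, vz = (tip[0] - base[0], tip[1] - base[1], tip[2] - base[2])
    norm = math.sqrt(vx * vx + vy * vy + vz * vz)
    # cos(theta) = v . z_hat / |v|
    return math.degrees(math.acos(vz / norm))

# An upright leaf rises steeply, so its angle from vertical is small
print(round(leaf_angle_deg((0, 0, 1.0), (0.1, 0, 1.5)), 1))  # prints 11.3
```

A leaf angle near 0° would be perfectly upright; one near 90° would be horizontal.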
This automated image-processing system, AngleNet, uses 'convolutional neural networks' to analyze the two-dimensional images from the cameras and create 3D models. Convolutional neural networks are a type of artificial neural network used widely in the analysis of visual imagery, designed specifically to process pixel data.
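The core operation these networks repeat, layer after layer, is sliding a small filter over an image. The toy, pure-Python convolution below shows just that building block; the edge-detecting kernel is a textbook example, not anything from AngleNet.

```python
# Toy demonstration of the convolution operation at the heart of CNNs.
# No deep-learning library; the kernel is a generic vertical-edge detector.

def conv2d(image, kernel):
    """'Valid' 2D convolution (cross-correlation, as implemented in most CNNs)."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A 4x4 image whose right half is bright; the kernel fires at the edge.
img = [[0, 0, 1, 1]] * 4
edge_kernel = [[-1, 1]] * 2   # responds to left-to-right brightness increase
print(conv2d(img, edge_kernel))  # [[0, 2, 0], [0, 2, 0], [0, 2, 0]]
```

A real network learns the values in such kernels from training data, stacking many of them to recognise progressively more complex visual features.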
As lead scientist Dr Lirong Xiang explains in a news release, "for plant breeders, it's important to know not only what the leaf angle is, but how far those leaves are above the ground. This gives them the information they need to assess the leaf angle distribution for each row of plants." She added that "we're already working with some crop scientists to make use of this technology, and we're optimistic that more researchers will be interested in adopting the technology to inform their work."
Xiang is referring in part to two members of Iowa State's Plant Sciences Institute, who two years earlier published a study mapping trait loci and predicting candidate genes for leaf angle in maize.