Autonomous perception systems for horticulture tree crops (AH11009)
What was it all about?
There is considerable scope for the use of robotics and mechanisation in horticulture, but all applications rely on the development of an effective perception capability. Drones, robotic spray systems or harvesters need to be able to ‘see’ in order to operate effectively.
The aim of this project was to design, develop and test a series of algorithms that can identify individual trees as well as the fruit, nuts and flowers on those trees.
Researchers conducted a series of trials that successfully demonstrated the use of these algorithms in apple, almond, avocado, lychee, mango and banana crops.
Two autonomous perception ground vehicles (Shrimp and Mantis) were used to gather data while trialling a number of sensors. Existing technology proved effective at separating out the data corresponding to the ground surface and at segmenting most trees on either side of the traversed row. Extensions to these algorithms were then developed, including near real-time tree segmentation, crop segmentation and detection, and flower segmentation and detection.
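The project summary does not specify which algorithms were used to separate the ground surface from the rest of the point cloud. A common approach in field robotics is RANSAC plane fitting: repeatedly fit a plane to random point triples, keep the plane with the most inliers, and treat those inliers as ground. The sketch below (with a synthetic point cloud, not project data) illustrates the idea; all names and parameters are illustrative assumptions.

```python
import numpy as np

def ransac_ground_plane(points, n_iters=200, dist_thresh=0.05, seed=None):
    """Fit a ground plane to an (N, 3) point cloud with RANSAC.

    Returns (plane, ground_mask): plane is (a, b, c, d) for
    a*x + b*y + c*z + d = 0 with a unit normal; ground_mask is a
    boolean array marking inlier (ground) points.
    """
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(n_iters):
        # Sample three points and derive the plane through them.
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:  # degenerate (collinear) sample, skip it
            continue
        normal /= norm
        d = -normal.dot(sample[0])
        # Inliers: points within dist_thresh of the candidate plane.
        mask = np.abs(points @ normal + d) < dist_thresh
        if mask.sum() > best_mask.sum():
            best_mask, best_plane = mask, (*normal, d)
    return best_plane, best_mask

# Synthetic scene: a flat ground patch plus a "tree" cluster above it.
rng = np.random.default_rng(0)
ground = np.column_stack([rng.uniform(0, 10, 500),
                          rng.uniform(0, 10, 500),
                          rng.normal(0.0, 0.01, 500)])   # z near 0
tree = np.column_stack([rng.normal(5, 0.3, 200),
                        rng.normal(5, 0.3, 200),
                        rng.uniform(1.0, 3.0, 200)])     # well above ground
cloud = np.vstack([ground, tree])

plane, is_ground = ransac_ground_plane(cloud, seed=1)
canopy = cloud[~is_ground]  # points left over for tree/crop segmentation
```

Once ground points are removed, the remaining canopy points can be clustered into individual trees, which is the kind of per-tree segmentation the project extended toward near real-time operation.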
The project progressed from an initial design and requirements phase to the design and build of a system capable of identifying and capturing an image of a crop target.
The outcomes of these experiments show great promise for future use of the system in tree crops for automated tasks such as pollination and pest management.
Learn more about the Australian Centre for Field Robotics at the University of Sydney, whose agricultural robotics work includes Shrimp, Mantis and other technologies developed in partnership with Hort Innovation.