Abstract:
This research was part of a project that developed mobile robots to perform targeted pollen spraying and automated harvesting in pergola-structured kiwifruit orchards. The project, the “MBIE Multipurpose Orchard Robotics Project”, was a collaboration between the University of Auckland, the University of Waikato, Robotics Plus Ltd and Plant & Food Research. Many of the contributions presented here relate specifically to automated harvesting, pollination and navigation in kiwifruit orchards. The contributions relating to harvesting include multiple fruit detection methods, testing of multiple sensors for harvesting, a study of how to perform kiwifruit detachment, a calibration method for a kiwifruit harvesting robot and a path planning method for picking fruit that could not be reached by previous kiwifruit harvesting robots. In addition, multiple kiwifruit detachment mechanisms were designed, and field testing of one of the concepts showed that the mechanism could reliably pick kiwifruit. Furthermore, this detachment mechanism was able to reach over 80 percent of fruit in the cluttered kiwifruit canopy, whereas the previous state-of-the-art mechanism could reach less than 70 percent of the fruit. Artificial pollination was performed by detecting flowers and then spraying pollen in solution onto the detected flowers from a line of sprayers on a boom, while driving at up to 1.4 m/s. In addition, the height of the canopy was measured and the spray boom was moved up and down to keep it close enough to the flowers for the spray to reach them, while minimising collisions with the canopy. The pollination system contributions described here include the design of flower detection systems, the calibration method used, a method for removing noise from stereo-matching algorithms and multiple generations of the boom height control algorithms.
In addition, a dry pollination system was created, which produced fruit with statistics similar to those of commercial crops. Mobile robot navigation was performed using a 2D lidar in apple orchards and vineyards. Lidar navigation in kiwifruit orchards was more challenging because the pergola structure provides only a small amount of data defining the direction of rows, compared with the amount of data from the overhead canopy, the undulating ground and other objects in the orchards. Multiple methods are presented here for extracting structure-defining features from 3D lidar data in kiwifruit orchards. In addition, a 3D lidar navigation system, which performed row following, row end detection and row end turns, was tested over more than 30 km of autonomous driving in kiwifruit orchards. The row detection component of this navigation system performed well compared with an existing kiwifruit row detection method. Computer vision algorithms for row detection and row following were also tested; in testing, the computer vision algorithm performed as well as the 3D lidar row following method. The design of robust safety systems will be critical to the deployment of large mobile orchard robots. A risk assessment was performed for autonomous navigation with a large mobile orchard robot, and found that the required Performance Level (PLr) was d or e, depending on the degree of interaction between the robot and untrained people. These are very demanding requirements, and so an architecture was proposed that included multiple sensors with multiple layers of redundancy and diversity in order to reduce the risk of Common Cause Failures. Methods for pedestrian detection in camera and lidar data are presented here, including methods to detect pedestrians that are in hazardous situations.
The most robust of the pedestrian detection methods tested was a high visibility vest detector using the intensity channel of a 3D lidar; in testing, this method achieved 100 percent true positives, with no false negatives and no false positives, within a range of 4 m for people wearing high visibility vests.