Cambridge University researchers have developed what is believed to be the first robotic lettuce leaf peeling system of its kind
Cambridge, United Kingdom
September 24, 2018
“There is a growing need to develop automated robotic solutions for agriculture due to increasing demand for food, changing climate conditions and decreasing availability of manual human labour,” said Cambridge PhD student Luca Scimeca. “Our lettuce and stem detection algorithm demonstrates a robot which is robust to clutter, varying lighting conditions, and camera distance, as well as to variations in produce size, shape and orientation.”
The novel machine vision pipeline and vacuum-based suction removal system, developed in the Department’s Machine Intelligence Laboratory, achieve the full peeling process, with complete removal of the leaf, 50% of the time, taking an average of 27 seconds to complete.
Sorting crops such as lettuces and removing the outer leaves after harvesting are tasks currently performed by farm workers. For people this is very easy, but for robots it is a challenging vision and manipulation problem that has so far proved difficult for robotic technologies to handle.
But now the automated removal of lettuce leaves has moved a step closer to reality, after the research team, led by Dr Fumiya Iida, Lecturer in Mechatronics, addressed the challenges of handling this soft, fragile produce. They developed a 3D-printed circular nozzle, mounted on the end of a robotic arm and tested with a suction system, which acts as a single vacuum suction point. It is designed to grip a leaf and remove it from the main body of the lettuce with a tearing action, without damaging the produce.
Crucial to the accuracy of the leaf tearing is the use of computer vision to locate the lettuce and determine its position and orientation. The system first detects the lettuce stem with the aid of a 2D web camera placed directly above the produce, within the assumed field of vision. If the stem cannot be found, the robot flips the lettuce over by applying a horizontal force and rolling it with a soft pad attached to the robot arm. The lettuce can then be better positioned, with the outer leaf on top and with minimal risk of damage.
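The article does not describe the detection method in detail. Purely as an illustration, the Python sketch below assumes a simple OpenCV colour-thresholding pipeline that estimates the lettuce centre, checks whether the stem is visible, and falls back to a roll-over action when it is not. The HSV thresholds and the `roll_lettuce` and `peel_at` robot calls are assumptions for illustration, not details of the published system.

```python
import cv2
import numpy as np

# Assumed HSV ranges -- placeholders, not values from the Cambridge system.
LETTUCE_HSV_LO, LETTUCE_HSV_HI = np.array([30, 40, 40]), np.array([90, 255, 255])  # green leaf mass
STEM_HSV_LO, STEM_HSV_HI = np.array([15, 30, 80]), np.array([35, 160, 255])        # paler stem region

def largest_contour_centre(mask):
    """Return the centroid (x, y) of the largest blob in a binary mask, or None."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    biggest = max(contours, key=cv2.contourArea)
    m = cv2.moments(biggest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

def locate_lettuce_and_stem(frame_bgr):
    """Estimate lettuce centre and stem position from an overhead 2D image."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lettuce_centre = largest_contour_centre(cv2.inRange(hsv, LETTUCE_HSV_LO, LETTUCE_HSV_HI))
    stem_centre = largest_contour_centre(cv2.inRange(hsv, STEM_HSV_LO, STEM_HSV_HI))
    return lettuce_centre, stem_centre

def peel_step(frame_bgr, robot):
    """One decision step: peel at the stem if it is visible, otherwise roll the lettuce."""
    lettuce, stem = locate_lettuce_and_stem(frame_bgr)
    if lettuce is None:
        return "no_lettuce"
    if stem is None:
        robot.roll_lettuce(lettuce)   # hypothetical soft-pad roll-over routine
        return "rolled"
    robot.peel_at(stem)               # hypothetical suction-nozzle tearing action
    return "peeled"
```

The published pipeline is reported to be robust to clutter, lighting and camera distance, which a fixed colour threshold alone would not achieve; the sketch only illustrates the decision flow between stem detection and the roll-over action.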
The lettuce and stem detection algorithm was tested on 180 pictures of individual lettuces taken with the web camera at heights between 70cm and 100cm. A total of 10 different iceberg lettuces were used, placed in different positions and under varying light direction and intensity, with some set alongside background objects arranged to represent clutter. In addition, 30 frames were taken after storing the produce for three days, resulting in changes in stem colour. The lettuce detection algorithm located the centre of the lettuce with 100% accuracy, and the stem detection algorithm achieved a detection accuracy of 81.01%. From these findings, the research team were able to identify the optimum leaf removal point.
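As a rough illustration of how such accuracy figures could be tallied, the sketch below assumes a set of frames with hand-labelled lettuce-centre and stem positions, and counts a prediction as correct when it falls within a fixed pixel radius of the label. The tolerance, data layout and helper names are assumptions rather than details from the study.

```python
import cv2
from dataclasses import dataclass
from typing import Optional, Tuple

PIXEL_TOLERANCE = 20  # assumed acceptance radius in pixels, not a value from the study

@dataclass
class LabelledFrame:
    image_path: str
    true_centre: Tuple[int, int]           # hand-labelled lettuce centre (x, y)
    true_stem: Optional[Tuple[int, int]]   # hand-labelled stem (x, y), None if not visible

def within_tolerance(pred, truth, tol=PIXEL_TOLERANCE):
    """A prediction counts as correct if it lies within `tol` pixels of the label."""
    if pred is None or truth is None:
        return pred is None and truth is None
    return (pred[0] - truth[0]) ** 2 + (pred[1] - truth[1]) ** 2 <= tol ** 2

def evaluate(frames, detector):
    """Tally centre- and stem-detection accuracy over a labelled image set.

    `detector` takes a BGR image and returns (centre, stem), for example the
    locate_lettuce_and_stem sketch above.
    """
    centre_hits = stem_hits = 0
    for frame in frames:
        image = cv2.imread(frame.image_path)
        centre, stem = detector(image)
        centre_hits += within_tolerance(centre, frame.true_centre)
        stem_hits += within_tolerance(stem, frame.true_stem)
    return centre_hits / len(frames), stem_hits / len(frames)
```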
Luca Scimeca, from the Biologically Inspired Robotics Laboratory (BIRL), worked on the vision system. He said the robot could be applied to many other crops, such as cauliflower, which is far less fragile and poses fewer challenges when it comes to computer vision analysing its orientation.
“Lettuce leaf peeling is an interesting robotics problem from an engineering perspective because the leaves are soft, they tear easily and the shape of the lettuce is never a given,” he said. “The computer vision we have developed, which lies at the heart of our lettuce peeling robot, can be applied to many other crops, such as cauliflower, where similar information would be required for the post-processing of the produce.
"However, further work is needed to integrate the three stages: vision detection, rolling system and leaf tearing/removal, into one single end-to-end solution. We propose an approach using a two arm Baxter robot, where the pose estimation and peeling process is combined.”