Design of an eye-in-hand sensing and servo control framework for harvesting robotics in dense vegetation

Cited by: 80
Authors
Barth, Ruud [1 ,2 ]
Hemming, Jochen [2 ]
van Henten, Eldert J. [3 ]
Affiliations
[1] Harvard Univ, 60 Oxford St, Cambridge, MA 02138 USA
[2] Univ Wageningen & Res Ctr, Greenhouse Hort, POB 644, NL-6700 AP Wageningen, Netherlands
[3] Univ Wageningen & Res Ctr, Farm Technol Grp, Droevendaalsesteeg 1, NL-6708 PB Wageningen, Netherlands
Keywords
Framework; Harvest robots; Visual servo control; ROS; SLAM; Localization
DOI
10.1016/j.biosystemseng.2015.12.001
Chinese Library Classification
S2 [Agricultural Engineering]
Subject Classification Code
0828
Abstract
A modular software framework design that allows flexible implementation of eye-in-hand sensing and motion control for agricultural robotics in dense vegetation is reported. Harvesting robots in cultivars with dense vegetation require multiple viewpoints and on-line trajectory adjustments in order to reduce the number of false negatives and to correct for fruit movement. In contrast to specialised software, the proposed framework aims to support a wide variety of agricultural use cases, hardware and extensions. A set of Robot Operating System (ROS) nodes was created to ensure modularity and separation of concerns, implementing functionalities for application control, robot motion control, image acquisition, fruit detection, visual servo control and simultaneous localisation and mapping (SLAM) for monocular relative depth estimation and scene reconstruction. Coordination functionality was implemented by the application control node with a finite state machine. To provide the visual servo control and SLAM functionalities, the off-the-shelf Visual Servoing Platform library (ViSP) and Large Scale Direct SLAM (LSD-SLAM) were wrapped in ROS nodes. The capabilities of the framework are demonstrated by an example implementation for a sweet-pepper crop, combined with hardware consisting of a Baxter robot and a colour camera placed on its end-effector. Qualitative tests were performed under laboratory conditions using an artificial dense-vegetation sweet-pepper crop. Results indicated that the framework can be implemented for sensing and robot motion control in sweet-pepper using visual information from the end-effector. Future research to apply the framework to other use cases and to validate the performance of its components in servo applications under real greenhouse conditions is suggested. (C) 2015 The Authors. Published by Elsevier Ltd on behalf of IAgrE.
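The abstract states that the application control node coordinates the other ROS nodes through a finite state machine, but this record does not list the states or their transitions. The sketch below is a minimal, hypothetical illustration of such a coordination loop, written with the SMACH library commonly used alongside ROS; the state names (Detect, Servo, Grasp), outcomes and transitions are assumptions for illustration only, and the stubbed execute() methods stand in for calls to the framework's fruit detection, visual servo control and robot motion control nodes.

# Hypothetical application-control state machine (not the paper's actual design).
# SMACH is a pure-Python library, so this sketch runs without a ROS master;
# in the real framework each state would call the corresponding ROS node.
import smach

class Detect(smach.State):
    """Acquire an image and look for a ripe fruit (stubbed)."""
    def __init__(self):
        smach.State.__init__(self, outcomes=['fruit_found', 'no_fruit'])

    def execute(self, userdata):
        # Placeholder: a real implementation would query the fruit
        # detection node, possibly from multiple viewpoints.
        return 'fruit_found'

class Servo(smach.State):
    """Visually servo the eye-in-hand camera towards the detected fruit (stubbed)."""
    def __init__(self):
        smach.State.__init__(self, outcomes=['reached', 'lost'])

    def execute(self, userdata):
        # Placeholder for an image-based visual servo loop (e.g. via ViSP),
        # adjusting the trajectory on-line as the fruit moves.
        return 'reached'

class Grasp(smach.State):
    """Trigger the end-effector once the fruit is within reach (stubbed)."""
    def __init__(self):
        smach.State.__init__(self, outcomes=['harvested', 'failed'])

    def execute(self, userdata):
        return 'harvested'

def build_state_machine():
    # Container outcomes are the terminal results of the whole application.
    sm = smach.StateMachine(outcomes=['done', 'aborted'])
    with sm:
        smach.StateMachine.add('DETECT', Detect(),
                               transitions={'fruit_found': 'SERVO',
                                            'no_fruit': 'aborted'})
        smach.StateMachine.add('SERVO', Servo(),
                               transitions={'reached': 'GRASP',
                                            'lost': 'DETECT'})
        smach.StateMachine.add('GRASP', Grasp(),
                               transitions={'harvested': 'done',
                                            'failed': 'DETECT'})
    return sm

if __name__ == '__main__':
    outcome = build_state_machine().execute()
    print('Application control finished with outcome:', outcome)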
Pages: 71-84
Number of pages: 14
References
44 items in total (items [31] to [40] shown below)
  • [31] Marchand, É., Spindler, F., Chaumette, F. ViSP for visual servoing. IEEE Robotics & Automation Magazine, 2005, 12(4): 40-52.
  • [32] Marchand, É. ICRA '99: IEEE International Conference on Robotics and Automation, Vols 1-4, Proceedings, 1999, p. 3224. DOI: 10.1109/ROBOT.1999.774089.
  • [33] Mehta, S. S., Burks, T. F. Vision-based control of robotic manipulator for citrus harvesting. Computers and Electronics in Agriculture, 2014, 102: 146-158.
  • [34] Murphy, R. R. Introduction to AI Robotics. 2019.
  • [35] Nguyen, H. ICRA, 2013.
  • [36] Quigley, M. IEEE International Conference on Robotics and Automation, 2009, p. 3604.
  • [37] Song, Y., Glasbey, C. A., Horgan, G. W., Polder, G., Dieleman, J. A., van der Heijden, G. W. A. M. Automatic fruit recognition and counting from multiple images. Biosystems Engineering, 2014, 118: 203-215.
  • [38] Steder, B., Rusu, R. B., Konolige, K., Burgard, W. Point feature extraction on 3D range scans taking into account object boundaries. 2011 IEEE International Conference on Robotics and Automation (ICRA), 2011: 2601-2608.
  • [39] Van Henten, E. J., Van Tuijl, B. A. J., Hemming, J., Kornet, J. G., Bontsema, J., Van Os, E. A. Field test of an autonomous cucumber picking robot. Biosystems Engineering, 2003, 86(3): 305-313.
  • [40] van Henten, E. J., Hemming, J., van Tuijl, B. A. J., Kornet, J. G., Meuleman, J., Bontsema, J., van Os, E. A. An autonomous robot for harvesting cucumbers in greenhouses. Autonomous Robots, 2002, 13(3): 241-258.