Determining pose of 3D objects with curved surfaces

Cited by: 12
Authors
Chen, JL
Stockman, GC
Affiliation
[1] Pattern Recognition and Image Processing Laboratory, Michigan State University, East Lansing, MI
Keywords
pose determination; 3D objects; object tracking; object modeling; image matching; recognition by alignment;
DOI
10.1109/34.476010
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
A method is presented for computing the pose of rigid 3D objects with arbitrary curved surfaces. Given an input image and a candidate object model and aspect, the method verifies whether or not the object is present and, if so, reports its pose parameters. The curvature method of Basri and Ullman is used to model points on the object rim, while stereo matching is used for internal edge points. The model allows an object edgemap to be predicted from pose parameters. Pose is computed via an iterative search for the best pose parameters. Heuristics are used so that matching can succeed in the presence of occlusion and artifacts, without resorting to corresponding salient feature points. Bench tests and simulations show that the method almost always converges to ground-truth pose parameters for a variety of objects and for a broad set of starting parameters in the same aspect.
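As the abstract outlines, pose is recovered by predicting an object edgemap from candidate pose parameters and iteratively searching for the parameters that best align the prediction with the observed image edges. The Python sketch below only illustrates that general idea under assumed simplifications (Euler-angle rotation, a pinhole projection with a fixed focal length, and a capped nearest-edge-pixel cost standing in for the paper's occlusion heuristics); it is not the authors' implementation and omits the Basri-Ullman rim model.

# Illustrative sketch: iterative pose search that minimizes the distance between
# model points projected under a candidate pose and observed image edge pixels.
# The parameterization, projection model, and cost are assumptions, not the paper's code.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial import cKDTree

def rotation_matrix(rx, ry, rz):
    # Assumed XYZ Euler-angle parameterization (radians).
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def project(points_3d, pose, focal=500.0):
    # Rigidly transform model points and apply a pinhole projection.
    rx, ry, rz, tx, ty, tz = pose
    p = points_3d @ rotation_matrix(rx, ry, rz).T + np.array([tx, ty, tz])
    return focal * p[:, :2] / p[:, 2:3]

def pose_cost(pose, model_points, edge_tree):
    # Mean distance from each predicted point to its nearest image edge pixel;
    # capping the per-point distance limits the influence of occluded points.
    d, _ = edge_tree.query(project(model_points, pose))
    return np.mean(np.minimum(d, 20.0))

def estimate_pose(model_points, image_edge_points, initial_pose):
    # Iteratively refine the six pose parameters from a starting estimate.
    edge_tree = cKDTree(image_edge_points)
    result = minimize(pose_cost, initial_pose,
                      args=(model_points, edge_tree),
                      method="Nelder-Mead",
                      options={"maxiter": 2000, "xatol": 1e-4, "fatol": 1e-4})
    return result.x, result.fun

if __name__ == "__main__":
    # Synthetic check: recover a perturbed pose from self-generated edge points.
    rng = np.random.default_rng(0)
    model = rng.uniform(-1.0, 1.0, size=(200, 3))
    true_pose = np.array([0.1, -0.2, 0.3, 0.0, 0.0, 6.0])
    edges = project(model, true_pose)
    start = true_pose + np.array([0.05, 0.05, -0.05, 0.1, -0.1, 0.2])
    pose, cost = estimate_pose(model, edges, start)
    print("recovered pose:", np.round(pose, 3), "cost:", round(cost, 4))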
Pages: 52-57
Page count: 6
References (17 in total)
[1] Bajcsy R, 1987, Proc. 1st Int. Conf. on Computer Vision, p. 231.
[2] Basri R, 1988, Proc. 2nd Int. Conf. on Computer Vision, p. 482. DOI: 10.1109/CCV.1988.590027.
[3] Bowyer K, 1989, Proc. Image Understanding Workshop (May), p. 831.
[4] Chen JL, 1993, Proc. IEEE Conf. Comput. Vis. (June), p. 233.
[5] Goad C, 1983, Proc. DARPA Image Understanding Workshop.
[6] Gross AD, 1988, Proc. 2nd Int. Conf. on Computer Vision, p. 690. DOI: 10.1109/CCV.1988.590052.
[7] Gupta A, 1991, Thesis, University of Pennsylvania.
[8] Higuchi K, 1994, Proc. 2nd IEEE CAD-Based Vision Workshop, p. 124.
[9] Keren D, 1991, IEEE Trans. Pattern Analysis and Machine Intelligence, 16: 38.
[10] Koenderink JJ, van Doorn AJ, 1979, Internal representation of solid shape with respect to vision, Biological Cybernetics, 32(4): 211-216.