Vision-Based Online Trajectory Generation and Its Application to Catching
In this paper, a method for vision-based online trajectory generation is proposed. The method is based on a nonlinear mapping from visual information to the desired trajectory, and this mapping is learned subject to dynamic and kinematic constraints. The method is applied to a catching task, where real-time high-speed visual feedback yields reactive and flexible motion. Experimental results on catching a moving object using a high-speed vision chip system are presented.
Keywords: Visual Information · Joint Angle · Joint Torque · Trajectory Generation · Target Trajectory
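The paper defines the visual-information-to-trajectory mapping by learning under dynamic and kinematic constraints. As a rough illustration only, the sketch below is hypothetical and not the authors' method: it replaces the learned mapping with a hand-coded pipeline that predicts a ballistic catch point from high-speed visual measurements and then generates a smooth minimum-jerk reach toward it.

```python
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]

def predict_catch_point(p, v, catch_height=0.0):
    """Predict where a ballistic target crosses the catch plane.

    p, v: 3-D position and velocity of the target, as estimated from
    high-speed vision. Returns the interception point on the plane
    z = catch_height, or None if the target never descends to it.
    """
    # Solve z(t) = p_z + v_z*t - 0.5*G*t^2 = catch_height for t > 0.
    a, b, c = -0.5 * G, v[2], p[2] - catch_height
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / (2.0 * a)  # descending (later) root
    if t <= 0:
        return None
    return np.array([p[0] + v[0] * t, p[1] + v[1] * t, catch_height])

def min_jerk(q0, qf, T, t):
    """Minimum-jerk interpolation from q0 to qf over duration T,
    giving zero velocity and acceleration at both endpoints."""
    s = np.clip(t / T, 0.0, 1.0)
    return q0 + (qf - q0) * (10 * s**3 - 15 * s**4 + 6 * s**5)
```

With 1 ms visual feedback, such a pipeline would be re-run every frame so the commanded trajectory tracks updated target estimates; the paper's learned nonlinear mapping serves this role while also respecting the arm's dynamics and kinematics.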