Perceptive User Interface, a Generic Approach

  • Michael Van den Bergh
  • Ward Servaes
  • Geert Caenen
  • Stefaan De Roeck
  • Luc Van Gool
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3766)


This paper describes the development of a real-time perceptive user interface. Two cameras detect a user's head, eyes, hand, fingers and gestures, and these cues are interpreted to control a user interface on a large screen. The result is a fully functional integrated system that processes roughly 7.5 frames per second on a Pentium IV. Calibration is carried out through a few simple and intuitive routines, making the system adaptive and accessible to non-expert users. The minimal hardware requirements are two webcams and a computer. The paper describes how the user is observed (head, eye, hand and finger detection; gesture recognition), the 3D geometry involved, and the calibration steps needed to set up the system.
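The abstract does not spell out the 3D geometry it refers to, but a common formulation for screen-pointing interfaces is to cast a ray from the user's eye through the fingertip (both triangulated from the two cameras) and intersect it with the screen plane. The sketch below illustrates that ray-plane intersection only; the point names, coordinate frame, and function are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def intersect_ray_with_plane(eye, fingertip, plane_point, plane_normal):
    """Intersect the eye->fingertip pointing ray with the screen plane.

    All arguments are 3D NumPy vectors in one common world frame
    (an assumed setup; the paper's calibration defines its own frame).
    Returns the 3D intersection point, or None if the ray is
    (near-)parallel to the plane.
    """
    direction = fingertip - eye
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None  # pointing ray parallel to the screen plane
    t = np.dot(plane_normal, plane_point - eye) / denom
    return eye + t * direction

# Hypothetical example: screen plane z = 0, eye one metre in front of it.
eye = np.array([0.2, 0.3, 1.0])
fingertip = np.array([0.1, 0.2, 0.5])
hit = intersect_ray_with_plane(eye, fingertip,
                               plane_point=np.array([0.0, 0.0, 0.0]),
                               plane_normal=np.array([0.0, 0.0, 1.0]))
# hit -> [0.0, 0.1, 0.0], the 3D point on the screen being pointed at
```

Mapping `hit` to pixel coordinates would then only require the 2D transform recovered during the screen-plane calibration step.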


Keywords: Gesture Recognition, White Balance, Connected Component Analysis, Gesture Recognition System, Screen Plane





Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Michael Van den Bergh (1, 3)
  • Ward Servaes (2)
  • Geert Caenen (1)
  • Stefaan De Roeck (1)
  • Luc Van Gool (1, 3)
  1. ESAT-PSI, University of Leuven, Belgium
  2. Philips
  3. Computer Vision Group (BIWI), ETH Zürich, Switzerland
