Publication detail

Dynamic Tracking in Meeting Room Scenarios Using Omnidirectional View

POTÚČEK, I., WALLHOFF, F., ZOBL, M., RIGOLL, G.

Original Title

Dynamic Tracking in Meeting Room Scenarios Using Omnidirectional View

English Title

Dynamic Tracking in Meeting Room Scenarios Using Omnidirectional View

Type

conference paper

Language

en

Original Abstract

The robust localization and tracking of faces in video streams is a fundamental concern for many subsequent multi-modal recognition approaches. Especially in meeting scenarios, several independent processing queues often exist that use the position and gaze of faces, such as group-action and face recognizers. The cost of multiple-camera recordings of meeting scenarios is obviously higher than that of a single omnidirectional camera setup; it would therefore be desirable to use these easier-to-acquire omnidirectional recordings. The present work presents an implementation of a robust particle-filter-based face tracker using omnidirectional views. It is shown how omnidirectional images have to be unwarped before they can be processed by localization and tracking systems designed for undistorted material. The performance of the system is evaluated on a part of the PETS-ICVS 2003 Smart Meeting Room dataset.

English abstract

The robust localization and tracking of faces in video streams is a fundamental concern for many subsequent multi-modal recognition approaches. Especially in meeting scenarios, several independent processing queues often exist that use the position and gaze of faces, such as group-action and face recognizers. The cost of multiple-camera recordings of meeting scenarios is obviously higher than that of a single omnidirectional camera setup; it would therefore be desirable to use these easier-to-acquire omnidirectional recordings. The present work presents an implementation of a robust particle-filter-based face tracker using omnidirectional views. It is shown how omnidirectional images have to be unwarped before they can be processed by localization and tracking systems designed for undistorted material. The performance of the system is evaluated on a part of the PETS-ICVS 2003 Smart Meeting Room dataset.
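The unwarping step mentioned in the abstract maps the circular omnidirectional image onto a panoramic strip so that standard (upright-face) detectors can be applied. A minimal sketch of such a polar-to-panoramic unwarp is shown below; it is a hypothetical illustration, not the authors' implementation, and the mirror center `(cx, cy)` and ring radii `r_inner`/`r_outer` are assumed calibration parameters.

```python
import numpy as np

def unwarp_omnidirectional(img, cx, cy, r_inner, r_outer, out_w=360, out_h=64):
    """Unwarp a circular omnidirectional image into a panoramic strip.

    Hypothetical sketch: each panorama column corresponds to a viewing
    angle around the mirror axis, each row to a radius between the
    inner and outer mirror rings; nearest-neighbour sampling.
    """
    h, w = img.shape[:2]
    # One angle per output column, one radius per output row.
    theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    radius = np.linspace(r_inner, r_outer, out_h)
    # Polar -> Cartesian source coordinates, broadcast to (out_h, out_w).
    xs = (cx + radius[:, None] * np.cos(theta[None, :])).round().astype(int)
    ys = (cy + radius[:, None] * np.sin(theta[None, :])).round().astype(int)
    xs = np.clip(xs, 0, w - 1)
    ys = np.clip(ys, 0, h - 1)
    return img[ys, xs]

# Usage on a synthetic 200x200 single-channel frame:
frame = np.arange(200 * 200, dtype=np.float32).reshape(200, 200)
pano = unwarp_omnidirectional(frame, cx=100, cy=100, r_inner=30, r_outer=95)
```

A practical system would use bilinear rather than nearest-neighbour interpolation to reduce sampling artifacts, but the coordinate mapping is the same.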

Keywords

computer vision, skin detection, omnidirectional image, geometrical corrections, face detection, recognition

RIV year

2004

Released

28.04.2004

Publisher

IEEE Computer Society

Location

Cambridge

ISBN

0-7695-2128-2

Book

17th International Conference on Pattern Recognition (ICPR 2004)

Pages from

933

Pages to

936

Pages count

4

BibTeX


@inproceedings{BUT16932,
  author="Igor {Potúček} and Frank {Wallhoff} and Martin {Zobl} and Gerhard {Rigoll}",
  title="Dynamic Tracking in Meeting Room Scenarios Using Omnidirectional View",
  annote="The robust localization and tracking of faces in video streams is a fundamental concern for many subsequent multi-modal recognition approaches. Especially in meeting scenarios, several independent processing queues often exist that use the position and gaze of faces, such as group-action and face recognizers. The cost of multiple-camera recordings of meeting scenarios is obviously higher than that of a single omnidirectional camera setup; it would therefore be desirable to use these easier-to-acquire omnidirectional recordings. The present work presents an implementation of a robust particle-filter-based face tracker using omnidirectional views. It is shown how omnidirectional images have to be unwarped before they can be processed by localization and tracking systems designed for undistorted material. The performance of the system is evaluated on a part of the PETS-ICVS 2003 Smart Meeting Room dataset.",
  address="Cambridge",
  booktitle="17th International Conference on Pattern Recognition (ICPR 2004)",
  chapter="16932",
  institution="IEEE Computer Society",
  year="2004",
  month="april",
  pages="933--936",
  publisher="IEEE Computer Society",
  isbn="0-7695-2128-2",
  type="conference paper"
}