Publication details

Dynamic Tracking in Meeting Room Scenarios Using Omnidirectional View

Original title

Dynamic Tracking in Meeting Room Scenarios Using Omnidirectional View

English title

Dynamic Tracking in Meeting Room Scenarios Using Omnidirectional View

Language

en

Original abstract

The robust localization and tracking of faces in video streams is a fundamental concern for many subsequent multi-modal recognition approaches. Especially in meeting scenarios, several independent processing queues often exist that use the position and gaze of faces, such as group-action and face recognizers. The costs of multiple-camera recordings of meeting scenarios are obviously higher than those of a single omnidirectional camera setup. Therefore, it would be desirable to use these easier-to-acquire omnidirectional recordings. The present work presents an implementation of a robust particle-filter-based face tracker using omnidirectional views. It is shown how omnidirectional images have to be unwarped before they can be processed by localization and tracking systems designed for undistorted material. The performance of the system is evaluated on a part of the PETS-ICVS 2003 Smart Meeting Room dataset.
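The unwarping step mentioned in the abstract maps the circular omnidirectional image onto a rectangular panoramic strip, so that detectors built for undistorted images can run on it. A minimal polar-to-panorama sketch follows; the mirror center `(cx, cy)`, the annulus radii `r_inner`/`r_outer`, and nearest-neighbor sampling are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def unwarp_omni(img, cx, cy, r_inner, r_outer, out_w=720):
    """Unwarp the annulus of an omnidirectional image into a panoramic strip.

    cx, cy      -- assumed mirror/lens center in the source image
    r_inner,
    r_outer     -- radii bounding the useful annulus (assumed known)
    out_w       -- width of the panorama (angular resolution)
    Uses nearest-neighbor sampling for brevity.
    """
    out_h = int(r_outer - r_inner)
    # One column per viewing angle, one row per radius.
    theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    radius = np.linspace(r_inner, r_outer, out_h)
    r_grid, t_grid = np.meshgrid(radius, theta, indexing="ij")
    # Back-project each panorama pixel to its polar source coordinate.
    xs = np.clip((cx + r_grid * np.cos(t_grid)).astype(int), 0, img.shape[1] - 1)
    ys = np.clip((cy + r_grid * np.sin(t_grid)).astype(int), 0, img.shape[0] - 1)
    return img[ys, xs]
```

In practice a bilinear or bicubic interpolation would replace the nearest-neighbor lookup to reduce aliasing in the unwarped strip.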

English abstract

The robust localization and tracking of faces in video streams is a fundamental concern for many subsequent multi-modal recognition approaches. Especially in meeting scenarios, several independent processing queues often exist that use the position and gaze of faces, such as group-action and face recognizers. The costs of multiple-camera recordings of meeting scenarios are obviously higher than those of a single omnidirectional camera setup. Therefore, it would be desirable to use these easier-to-acquire omnidirectional recordings. The present work presents an implementation of a robust particle-filter-based face tracker using omnidirectional views. It is shown how omnidirectional images have to be unwarped before they can be processed by localization and tracking systems designed for undistorted material. The performance of the system is evaluated on a part of the PETS-ICVS 2003 Smart Meeting Room dataset.
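The particle-filter tracking the abstract refers to can be illustrated with one generic predict-update-resample step of a bootstrap (CONDENSATION-style) filter over a 2D face position. This is a sketch under simplifying assumptions: a random-walk motion model and a Gaussian likelihood around a point measurement stand in for the paper's actual appearance-based observation model.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, measurement,
                         motion_std=5.0, meas_std=10.0):
    """One step of a bootstrap particle filter tracking a 2D position.

    particles   -- (N, 2) array of hypothesized face positions
    measurement -- observed 2D position (stand-in for a real detector score)
    Returns resampled particles and reset uniform weights.
    """
    # Predict: diffuse particles with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: weight each particle by a Gaussian likelihood of the measurement.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = np.exp(-d2 / (2.0 * meas_std ** 2))
    weights /= weights.sum()
    # Resample: multinomial resampling proportional to the weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```

Iterating this step concentrates the particle cloud around the measured face position; the state estimate at each frame is simply the weighted (here uniform) mean of the particles.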

BibTeX


@inproceedings{BUT16932,
  author="Igor {Potúček} and Frank {Wallhoff} and Martin {Zobl} and Ronald {Tetzlaff}",
  title="Dynamic Tracking in Meeting Room Scenarios Using Omnidirectional View",
  annote="The robust localization and tracking of faces in video streams is a fundamental concern for many subsequent multi-modal recognition approaches. Especially in meeting scenarios, several independent processing queues often exist that use the position and gaze of faces, such as group-action and face recognizers. The costs of multiple-camera recordings of meeting scenarios are obviously higher than those of a single omnidirectional camera setup. Therefore, it would be desirable to use these easier-to-acquire omnidirectional recordings. The present work presents an implementation of a robust particle-filter-based face tracker using omnidirectional views. It is shown how omnidirectional images have to be unwarped before they can be processed by localization and tracking systems designed for undistorted material. The performance of the system is evaluated on a part of the PETS-ICVS 2003 Smart Meeting Room dataset.",
  address="IEEE Computer Society",
  booktitle="17th International Conference on Pattern Recognition (ICPR 2004)",
  chapter="16932",
  institution="IEEE Computer Society",
  year="2004",
  month="april",
  pages="933--936",
  publisher="IEEE Computer Society",
  type="conference paper"
}