Abstract:
Vision-based driver assistance in modern cars has to perform
automated real-time understanding or modeling of traffic environments
based on multiple sensor inputs, using `normal' or specialized (such as
night vision) stereo cameras as default input devices. Distance measurement,
lane-departure warning, traffic sign recognition, or trajectory calculation
are examples of current developments in the field, contributing
to the design of intelligent vehicles.
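Distance measurement with a rectified stereo pair reduces to triangulation: depth is inversely proportional to pixel disparity. A minimal sketch of this relation, assuming a rectified camera pair with known focal length (in pixels) and baseline (in metres) — the function name and parameter values are illustrative, not from the report:

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth Z (metres) from disparity d via Z = f * B / d,
    assuming a rectified stereo pair, focal length f in pixels,
    and baseline B in metres."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Illustrative values: f = 1000 px, B = 0.3 m, d = 20 px
print(depth_from_disparity(20.0, 1000.0, 0.3))  # 15.0 m
```

Note that depth resolution degrades quadratically with distance: a one-pixel disparity error at 20 px matters far more than at 100 px, which is one reason standardized accuracy tests for stereo components are of interest.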
The considered application scenario is as follows: two or more cameras
are installed in a vehicle (typically a car, but possibly also a boat, a
wheelchair, a forklift, and so forth), and the operation of this vehicle (by
a driver) is supported by real-time analysis of the video sequences recorded
by those cameras. Possibly, further sensor data (e.g., GPS, radar) are also
analyzed in an integrated system.
Performance evaluation is of eminent importance in car production. Crash
tests follow international standards, defining exactly the conditions under
which a test has to take place. Camera technology has recently become an
integral part of modern cars. In consequence, perfectly specified and
standardized tests (`camera crash tests') will soon be needed by the
international car industry to identify parameters of stereo or motion
analysis, or of further vision-based components.
This paper reports on current performance evaluation activities in
the .enpeda.. project at The University of Auckland. Test data are, so far,
rectified stereo sequences (provided by Daimler A.G., Germany, in 2007),
and stereo sequences recorded with a test vehicle on New Zealand's roads.
Description:
You are granted permission for the non-commercial reproduction, distribution, display, and performance of this technical report in any format, BUT this permission is only for a period of 45 (forty-five) days from the most recent time that you verified that this technical report is still available from the original MI_tech website http://www.mi.auckland.ac.nz/index.php?option=com_content&view=article&id=91&Itemid=76. All other rights are reserved by the author(s).