ViF-GTAD: A new Automotive Dataset with Ground Truth for ADAS/AD Development, Testing and Validation
Sarah Haas, Selim Solmaz, Jakob Reckenzaun, Simon Genser
Abstract: This paper presents a new dataset for automated driving that identifies and addresses a gap in existing perception datasets. While most state-of-the-art perception datasets focus on providing various onboard sensor measurements together with semantic information under diverse driving conditions, this information is often insufficient because the provided object lists and position data contain unknown and time-varying errors. The present paper and the associated dataset describe the first publicly available perception measurement data that include not only onboard sensor information from camera, lidar, and radar with semantically classified objects, but also high-precision ground-truth position measurements enabled by accurate RTK-assisted GPS localization systems on both the ego vehicle and the dynamic target objects. The paper provides insight into how the data were captured, explicitly explaining the metadata structure and content, as well as application examples in which the dataset has been, and can potentially be, applied in the development, testing, and validation of automated driving and environmental perception systems.