IRLTrees3D: A 3D reconstruction dataset of trees
Date
2025
Authors
Chai, Joseph
O’Sullivan, Barry
Nguyen, Hoang D.
Publisher
Computer Vision Foundation
Abstract
Measuring tree dimensions is fundamentally important to effective forestry and environmental monitoring, particularly for estimating investable carbon stock. Three-dimensional (3D) reconstruction has therefore become a promising means of enabling high-precision measurements and AI-driven applications in this domain, but it remains challenging due to environmental uncertainty and the limitations of existing methods and datasets. This paper presents a novel, high-resolution dataset of 3D vegetation models comprising trees, logs, and plants captured from diverse locations across Ireland. The reconstruction pipeline, with high-quality 2D views and 3D artifacts, is provided to ensure methodological transparency and reproducibility, bridging multidimensional AI and computer vision tasks. To evaluate the geometric accuracy of the reconstructed models, we employed diameter at breast height (DBH) as a validation metric. Our evaluation yielded a mean absolute error of 0.53 cm with a standard deviation of 0.28 cm, demonstrating the reliability of our cost-effective photogrammetry approach. The release of this dataset constitutes a significant contribution to the domain of 3D tree reconstruction, providing a robust framework for future research in the field.
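As a point of clarification, the reported validation metrics (mean absolute error and its standard deviation) follow from comparing model-derived DBH against field-measured DBH for each tree. The sketch below is illustrative only, not the authors' code, and the DBH values in it are hypothetical placeholders:

```python
# Illustrative sketch: computing the MAE and standard deviation of
# DBH errors between field measurements and 3D-reconstructed models.
# All DBH values below are hypothetical placeholders, not dataset values.

field_dbh = [32.1, 18.4, 25.0, 41.7, 29.3]  # tape-measured DBH (cm)
model_dbh = [32.6, 18.0, 25.4, 41.1, 29.9]  # DBH from the 3D model (cm)

# Per-tree absolute error between model and field measurement.
abs_errors = [abs(m - f) for m, f in zip(model_dbh, field_dbh)]

# Mean absolute error (MAE) across all trees.
mae = sum(abs_errors) / len(abs_errors)

# Population standard deviation of the absolute errors.
sd = (sum((e - mae) ** 2 for e in abs_errors) / len(abs_errors)) ** 0.5

print(f"MAE: {mae:.2f} cm, SD: {sd:.2f} cm")
```

With paired measurements like these, the same two numbers reported in the abstract (0.53 cm and 0.28 cm) would be produced by this procedure over the full set of validation trees.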
Keywords
Measuring tree dimensions, Three-dimensional (3D) reconstruction
Citation
Chai, J., O'Sullivan, B. and Nguyen, H. D. (2025) 'IRLTrees3D: A 3D reconstruction dataset of trees', Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, Honolulu, Hawai'i, 19-23 October 2025, pp. 2876-2881. Available at: https://openaccess.thecvf.com/content/ICCV2025W/SEA/html/Chai_IRLTrees3D_A_3D_Reconstruction_Dataset_of_Trees_ICCVW_2025_paper.html (Accessed: 23 December 2025)
Copyright
© 2025, the Authors.
