Many algorithms exist for generating virtual trees, but the visual realism of the resulting 3D models is largely unknown. This problem is usually addressed by limited user studies or by side-by-side visual comparison. We introduce an automated system that assesses the realism of virtual tree models based on their perception. We conducted a user study in which 4,000 participants compared over one million pairs of images to collect subjective perceptual scores for a large dataset of virtual trees. The scores were used to train two neural-network-based predictors. The first, ICTreeF, is view-independent and uses geometric features that are easy to extract from any tree model. The second, ICTreeI, estimates the perceived visual realism of a tree from its image. Moreover, to provide insight into the problem, we deduce intrinsic attributes and evaluate which features make trees look realistic. In particular, we show that branching angles, branch lengths, and branch widths are critical for perceived realism. We also provide three datasets: carefully curated 3D tree geometries and tree skeletons with their perceptual scores, multiple views of the tree geometries with their scores, and a large dataset of images with scores suitable for training deep neural networks.
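For context, the subjective perceptual scores described above are collected from pairwise image comparisons. A common way to aggregate such pairwise preferences into per-model scores is a Bradley-Terry-style maximum-likelihood fit; the minimal sketch below illustrates that idea on toy data. It is an assumption-laden illustration, not the scaling procedure used in the paper, and the function `bradley_terry_scores` and its inputs are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def bradley_terry_scores(wins, n_items, reg=1e-3):
    """Fit latent realism scores from pairwise outcomes (illustrative only).

    wins: iterable of (winner_idx, loser_idx) pairs, one per comparison.
    Returns an array of n_items scores (higher = preferred more often).
    """
    wins = np.asarray(wins, dtype=int)

    def neg_log_likelihood(s):
        # Model P(i preferred over j) = sigmoid(s_i - s_j);
        # a small L2 term pins down the otherwise arbitrary scale.
        diff = s[wins[:, 0]] - s[wins[:, 1]]
        return np.sum(np.logaddexp(0.0, -diff)) + reg * np.sum(s ** 2)

    result = minimize(neg_log_likelihood, np.zeros(n_items), method="L-BFGS-B")
    return result.x

# Toy usage: three hypothetical tree models; model 0 wins most comparisons
# and therefore receives the highest fitted score.
pairs = [(0, 1), (0, 2), (0, 1), (1, 2), (0, 2)]
print(bradley_terry_scores(pairs, n_items=3))
```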
@article{polasek21ictree,
  title     = {ICTree: Automatic Perceptual Metrics for Tree Models},
  author    = {Polasek, Tomas and Hrusa, David and Benes, Bedrich and \v{C}ad\'{\i}k, Martin},
  year      = {2021},
  volume    = {40},
  number    = {6},
  issn      = {0730-0301},
  url       = {https://doi.org/10.1145/3478513.3480519},
  doi       = {10.1145/3478513.3480519},
  journal   = {ACM Trans. Graph.},
  month     = {dec},
  articleno = {230},
  numpages  = {15},
}
Acknowledgements and Credits: The presented code and dataset should not be used for commercial purposes without our explicit permission. Please acknowledge the use of the dataset by citing the publication.