Effects of the physical environment on data visualization effectiveness in augmented reality

Author(s): Lytovchenko O. V., Mozil B. I., Dumanskyi I. I.
Collection number: № 2 (90)
Pages: 156-164

This article analyzes the influence of real-world environments on the effectiveness of augmented reality (AR) data visualizations. AR content must compete with dynamic, cluttered backgrounds, creating significant cognitive challenges for users. This review systematizes recent scientific advancements addressing these core issues.

A primary focus is the shift from subjective self-reports to objective, real-time physiological measurements of cognitive load, particularly using eye-tracking metrics such as fixation duration and pupil dilation. These indicators are foundational for creating bio-adaptive interfaces that dynamically adjust visualizations to prevent user overload.
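To make the bio-adaptive idea concrete, the sketch below (an illustration only, not a method from the reviewed studies) shows how two hypothetical eye-tracking signals, mean fixation duration and baseline-corrected pupil dilation, might be combined into a single load estimate that throttles visualization detail; the weights, thresholds, and field names are assumptions.

```python
# Illustrative sketch only: weights, thresholds, and signal names are assumptions,
# not values reported in the reviewed studies.

from dataclasses import dataclass

@dataclass
class GazeSample:
    fixation_duration_ms: float   # mean fixation duration over a sliding window
    pupil_dilation_mm: float      # pupil diameter change relative to a resting baseline

def cognitive_load_estimate(sample: GazeSample,
                            baseline_fixation_ms: float = 250.0) -> float:
    """Combine two eye-tracking indicators into a rough 0..1 load score."""
    # Longer fixations and stronger pupil dilation are treated as signs of higher load.
    fixation_term = min(max((sample.fixation_duration_ms - baseline_fixation_ms) / 400.0, 0.0), 1.0)
    pupil_term = min(max(sample.pupil_dilation_mm / 0.8, 0.0), 1.0)
    return 0.5 * fixation_term + 0.5 * pupil_term

def adapt_visualization(load: float) -> dict:
    """Map the load estimate to a level-of-detail setting for the AR overlay."""
    if load > 0.7:        # user appears overloaded: strip the overlay down
        return {"labels_visible": 3, "show_gridlines": False, "animation": False}
    if load > 0.4:        # moderate load: drop secondary detail
        return {"labels_visible": 8, "show_gridlines": False, "animation": True}
    return {"labels_visible": 20, "show_gridlines": True, "animation": True}

if __name__ == "__main__":
    sample = GazeSample(fixation_duration_ms=410.0, pupil_dilation_mm=0.5)
    load = cognitive_load_estimate(sample)
    print(load, adapt_visualization(load))
```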

The article also analyzes strategies for mitigating visual clutter, highlighting effective label management techniques, such as grouping. It examines the influence of background characteristics on legibility, underscoring the trend toward AI-driven systems that automatically adjust visualization styles (e.g., contrast, outlines) to maintain readability and perceptual coherence.
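The following sketch illustrates one possible form of such background-aware styling (assumed for exposition, not drawn from a specific cited system): given the mean luminance and variance of the scene patch behind a label, it picks the text color with the higher WCAG-style contrast ratio and adds an outline or backing plate when the background is busy or contrast remains weak.

```python
# Illustrative sketch only: the sampling scheme and thresholds are assumptions
# used to demonstrate background-aware label styling.

def contrast_ratio(l1: float, l2: float) -> float:
    """WCAG-style contrast ratio between two relative luminances in [0, 1]."""
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

def style_label(background_luminance: float, background_variance: float) -> dict:
    """Choose text color and reinforcement for an AR label over a background patch."""
    white, black = 1.0, 0.0
    # Pick whichever text color contrasts more strongly with the sampled background.
    use_white = contrast_ratio(white, background_luminance) >= contrast_ratio(black, background_luminance)
    text_lum = white if use_white else black
    style = {"text_color": "white" if use_white else "black",
             "outline": False, "backing_plate": False}
    # A busy background (high variance) gets an outline; weak contrast gets a plate.
    if background_variance > 0.05:
        style["outline"] = True
    if contrast_ratio(text_lum, background_luminance) < 4.5:  # common legibility threshold
        style["backing_plate"] = True
    return style

if __name__ == "__main__":
    # Bright, cluttered background patch behind the label.
    print(style_label(background_luminance=0.8, background_variance=0.09))
```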

Finally, the analysis covers innovative approaches to spatial content placement and ergonomics. It discusses moving beyond head-locked displays to paradigms like object-anchoring and utilizing unconventional surfaces (such as the floor, ceiling, or periphery) to free the user’s central vision. The ergonomic implications for mobile scenarios are explored, particularly the speed-accuracy trade-off associated with different content anchoring methods (e.g., head vs. hand).
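The anchoring distinction can be summarized with a toy model such as the one below (a simplified 2D sketch, not an API from any cited system): head-locked content follows every head movement, hand-anchored content rides on the hand, and world-anchored content stays pinned to the environment while the user walks.

```python
# Illustrative sketch only: a toy 2D model of AR content anchoring strategies.

from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float

def content_position(mode: str, head: Pose, hand: Pose, world_anchor: Pose) -> Pose:
    """Resolve where an AR panel is drawn, depending on its anchoring strategy."""
    if mode == "head-locked":
        # Fixed offset in front of the head: always visible, follows every head movement.
        return Pose(head.x + 0.5, head.y)
    if mode == "hand-anchored":
        # Rides on the hand: quick to glance at, but moves while walking.
        return Pose(hand.x, hand.y + 0.1)
    if mode == "world-anchored":
        # Pinned to an object or surface: stable while the user moves through the scene.
        return world_anchor
    raise ValueError(f"unknown anchoring mode: {mode}")

if __name__ == "__main__":
    head, hand, anchor = Pose(0.0, 1.7), Pose(0.3, 1.2), Pose(2.0, 1.5)
    for mode in ("head-locked", "hand-anchored", "world-anchored"):
        print(mode, content_position(mode, head, hand, anchor))
```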

In conclusion, the article argues that the future of effective AR lies in holistic, context-aware systems that dynamically adapt to the environment, the user’s real-time cognitive state, individual traits, and task demands.

Keywords: data visualization, augmented reality, AR, cognitive load, visual clutter, situated analytics, adaptive interfaces, human–computer interaction.

doi: 10.32403/0554-4866-2025-2-90-156-164


  • 1. Suryani, M., Santoso, H. B., Schrepp, M., Aji, R. F., Hadi, S., Sensuse, D. I., & Suryono, R. R. (2024). Role, methodology, and measurement of cognitive load in computer science and information systems research. IEEE Access.
  • 2. Suzuki, Y., Wild, F., & Scanlon, E. (2024). Measuring cognitive load in augmented reality with physiological methods: A systematic review. Journal of Computer Assisted Learning, 40(2), 375-393.
  • 3. Suzuki, Y., Wild, F., & Scanlon, E. (2024, June). Measuring Cognitive Load with Eye-Tracking During Mental Rotation with 2D and 3D Visualization in AR. In International Conference on Immersive Learning (pp. 34-48). Cham: Springer Nature Switzerland.
  • 4. Chiossi, F., Trautmannsheimer, I., Ou, C., Gruenefeld, U., & Mayer, S. (2024). Searching across realities: Investigating ERPs and eye-tracking correlates of visual search in mixed reality. IEEE Transactions on Visualization and Computer Graphics.
  • 5. Park, J. H., Roper, B., Arezoumand, A., & Tran, T. (2025). Exploring AR Label Placements in Visually Cluttered Scenarios. arXiv preprint arXiv:2507.00198.
  • 6. Lee, B., Sedlmair, M., & Schmalstieg, D. (2023). Design patterns for situated visualization in augmented reality. IEEE Transactions on Visualization and Computer Graphics, 30(1), 1324-1335.
  • 7. Tandon, S., Abdul-Rahman, A., & Borgo, R. (2022). Measuring effects of spatial visualization and domain on visualization task performance: a comparative study. IEEE Transactions on Visualization and Computer Graphics, 29(1), 668-678.
  • 8. Eh Phon, D. N., Murli, N., Che Lah, N. H., Hashim, S. (2025). The Impact of Augmented Reality on Spatial Visualization Ability and Science Achievement: Exploring their Correlation. International Journal on Advanced Science, Engineering and Information Technology, 15(4), 1221–1228.
  • 9. Patel, D. B. (2025). Incorporating Augmented Reality into Data Visualization for Real-Time Analytics. Utilitas Mathematica, 122(1), 3216-3230.
  • 10. Guo, F., Chen, J., Li, M., Lyu, W., & Zhang, J. (2022). Effects of visual complexity on user search behavior and satisfaction: an eye-tracking study of mobile news apps. Universal Access in the Information Society, 21(4), 795-808.
  • 11. Rasch, J., Müller, F., & Chiossi, F. (2025). A Vision for AI-Driven Adaptation of Dynamic AR Content to Users and Environments. arXiv preprint arXiv:2504.16562.
  • 12. Tomkins, A., & Lange, E. (2023). Planning and designing natural and urban environments with an adaptive visualization framework: the case of Pazhou Island, Guangzhou, Pearl River Delta. Land, 12(2), 377.
  • 13. Davari-Najafabadi, S. (2024). Intelligent Augmented Reality (iAR): Context-aware Inference and Adaptation in AR.
  • 14. Satkowski, M., Rzayev, R., Goebel, E., & Dachselt, R. (2022, October). ABOVE & BELOW: Investigating ceiling and floor for augmented reality content placement. In 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 518-527). IEEE.
  • 15. Rasch, J., Wilhalm, M., Müller, F., & Chiossi, F. (2025, April). AR You on Track? Investigating Effects of Augmented Reality Anchoring on Dual-Task Performance While Walking. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (pp. 1-21).
  • 16. Xiao, R., & Benko, H. (2016, May). Augmenting the field-of-view of head-mounted displays with sparse peripheral displays. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 1221-1232).