| Name | Description | Size | Format |
|---|---|---|---|
| | | 5.64 MB | Adobe PDF |
Abstract(s)
The visual fidelity of virtual reality (VR) and augmented reality (AR) environments is
essential for user immersion and comfort. Dynamic lighting often leads to chromatic distortions
and reduced clarity, causing discomfort and disrupting user experience. This paper introduces
an AI-driven chromatic adjustment system based on a modified U-Net architecture, optimized for
real-time applications in VR/AR. This system adapts to dynamic lighting conditions, addressing
the shortcomings of traditional methods like histogram equalization and gamma correction, which
struggle with rapid lighting changes and real-time user interactions. We compared our approach
with state-of-the-art color constancy algorithms, including Barron’s Convolutional Color Constancy
and STAR, demonstrating superior performance. Experimental results from 60 participants show
significant improvements, with up to 41% better color accuracy and 39% enhanced clarity under
dynamic lighting conditions. The study also included eye-tracking data, which confirmed increased
user engagement with AI-enhanced images. Our system provides a practical solution for developers
aiming to improve image quality, reduce visual discomfort, and enhance overall user satisfaction in
immersive environments. Future work will focus on extending the model’s capability to handle more
complex lighting scenarios.
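The article itself describes the modified U-Net in detail; as a rough illustration of the kind of image-to-image architecture the abstract refers to, a minimal encoder-decoder sketch might look as follows. All layer widths, names, and the residual RGB formulation are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a U-Net-style network for chromatic adjustment,
# assuming an RGB-in / RGB-out formulation. Sizes are illustrative only.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(3, 32)
        self.enc2 = conv_block(32, 64)
        self.bottleneck = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.out = nn.Conv2d(32, 3, 1)  # predict an RGB adjustment

    def forward(self, x):
        e1 = self.enc1(x)                   # full-resolution features
        e2 = self.enc2(self.pool(e1))       # 1/2 resolution
        b = self.bottleneck(self.pool(e2))  # 1/4 resolution
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        # Residual output keeps the adjusted frame close to the input.
        return torch.clamp(x + self.out(d1), 0.0, 1.0)

if __name__ == "__main__":
    model = TinyUNet().eval()
    frame = torch.rand(1, 3, 256, 256)  # one normalized RGB frame
    with torch.no_grad():
        adjusted = model(frame)
    print(adjusted.shape)  # torch.Size([1, 3, 256, 256])
```

The skip connections preserve fine spatial detail while the bottleneck captures scene-level lighting context, which is the general property that makes U-Net-style models attractive for per-frame color and clarity adjustment.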
Keywords
AI-driven image enhancement; virtual reality; augmented reality; image quality; deep learning; lighting conditions
Citation
Abbasi, M., Váz, P., Silva, J., & Martins, P. (2024). Enhancing Visual Perception in Immersive VR and AR Environments: AI-Driven Color and Clarity Adjustments Under Dynamic Lighting Conditions. Technologies, 12(11), 216. https://doi.org/10.3390/technologies12110216
Publisher
MDPI