Optimising Virtual Environments Using Perceptually-Driven Rendering

Student thesis: PhD

Abstract

Rendering realistic virtual environments requires intense computation, and the demand scales with each generation's expectations of what constitutes high quality. With renewed interest in Virtual Reality (VR) and increasingly capable display technologies, computational requirements will only grow. Human perception, however, is limited: beyond a certain level, higher-quality rendering no longer affects the user's experience. Perceptually-driven rendering techniques have emerged as a way to reduce computational load by redirecting resources according to these perceptual limitations whilst keeping the visual experience intact.

The aim of this thesis is to explore aspects of human perception that have traditionally received less attention from the graphics community. An extensive framework of perceptual rendering is introduced which reveals promising areas for computational savings. To this end, the relation between the perception of causality in collision events and factors that limit visual acuity, such as retinal eccentricity, velocity, shape and visual crowding, is disentangled.

The remainder of the thesis is concerned with how self-induced movement modulates visual sensitivity in VR. Six-degree-of-freedom (6DOF) tracking enables users to produce realistic movements (e.g. head rotations or walking) within a virtual environment (VE), which was previously impossible with conventional display technologies. The thesis therefore explores how head rotations mask sensitivity to changes in geometrical Level-of-Detail (LOD), and shows that the rotational velocity of the user's head can reliably drive an LOD strategy. Next, the impact of active movement on Foveated Rendering (FR) algorithms is examined: evidence is provided that active movement permits significantly more aggressive FR configurations than passive movement. Finally, the findings suggest that readily available information about the user's behaviour, such as the type of movement and the nature of the task the viewer is engaged in, enables significant computational savings under FR.
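To illustrate the general idea of driving rendering decisions from behavioural signals, the sketch below maps head angular velocity to an LOD index and scales a foveated renderer's full-resolution radius by movement mode. It is a minimal illustrative sketch only: the function names, thresholds and scaling factors are assumptions made for this example, not the strategies or parameters evaluated in the thesis.

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical mapping: faster head rotations mask loss of geometric detail,
// so coarser LODs (higher index) can be used. Thresholds are illustrative.
int lodFromHeadVelocity(double headAngularVelocityDegPerSec) {
    const double v = std::abs(headAngularVelocityDegPerSec);
    if (v < 30.0)  return 0;  // near-static view: full detail
    if (v < 90.0)  return 1;  // moderate rotation: one LOD step down
    if (v < 180.0) return 2;
    return 3;                 // rapid rotation: coarsest LOD
}

// Hypothetical rule: active, self-induced movement tolerates a smaller
// full-resolution (foveal) region than passive viewing.
double fovealRadiusDeg(bool activeMovement, double baseRadiusDeg = 20.0) {
    return activeMovement ? baseRadiusDeg * 0.6 : baseRadiusDeg;
}

int main() {
    std::printf("LOD at 120 deg/s: %d\n", lodFromHeadVelocity(120.0));
    std::printf("Foveal radius (active movement): %.1f deg\n", fovealRadiusDeg(true));
    return 0;
}
```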
Date of Award: 1 Aug 2024
Original language: English
Awarding Institution
  • The University of Manchester
Supervisors: Stephen Pettifer (Supervisor) & Paul Warren (Supervisor)

Keywords

  • Psychophysics
  • Perception
  • Perceptually-Driven Rendering
  • Virtual Reality
  • Computer Graphics
