
Psychometric evaluations are generally used to understand the Quality of Experience (QoE) of immersive environments produced using augmented/mixed/virtual reality. Typically, these subjective evaluations are done from an end-user point of view, but they are limited by several sources of subjectivity: (i) a user's bias in grading their experience (some users are more critical than others); (ii) the user's interest and concentration throughout the task; (iii) the ease of use and comfort level of the interaction interfaces; (iv) the task duration; (v) user fatigue when tested across different scenarios, such as different network conditions; and (vi) the perceived importance of the application. The most commonly used subjective method for quality measurement is the Mean Opinion Score (MOS). MOS is standardized in the ITU-T (International Telecommunication Union – Telecommunication Standardization Sector) recommendations and is defined as a numeric value ranging from 1 to 5 (i.e., bad to excellent).
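As a minimal illustration (not part of the talk), the sketch below computes a MOS as the arithmetic mean of 1-to-5 ratings, together with a simple normal-approximation confidence interval that makes the rater-to-rater variability discussed above visible; the ratings shown are invented for the example.

```python
# Illustrative sketch: MOS as the mean of 1-5 subjective ratings,
# with a rough 95% confidence interval to expose rater variability.
from math import sqrt
from statistics import mean, stdev

def mos(ratings):
    """Return (MOS, 95% CI half-width) for integer ratings on the 1-5 scale."""
    if not all(1 <= r <= 5 for r in ratings):
        raise ValueError("ratings must lie on the 1-5 MOS scale")
    m = mean(ratings)
    # Normal approximation; few raters -> wide interval (assumed, for illustration).
    ci = 1.96 * stdev(ratings) / sqrt(len(ratings)) if len(ratings) > 1 else 0.0
    return m, ci

# Example: ten (hypothetical) users grade the same immersive session.
score, ci95 = mos([4, 5, 3, 4, 4, 2, 5, 4, 3, 4])
print(f"MOS = {score:.2f} +/- {ci95:.2f}")
```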

The objective approach consists of measuring QoE by monitoring technical network parameters, or network Quality of Service (QoS), such as throughput, delay, and packet loss. Most of the research on objective QoS-QoE mapping has focused on video streaming. For instance, video QoE is assumed to be affected by three key network parameters: loss, delay, and jitter. High jitter leads to playback discontinuity and additional packet loss, whereas packet delays translate into buffering time. Hence, video streaming QoE is considered a function of two application-specific metrics: buffering time (BT) and streaming video discontinuity (SVD). Clearly, such objective QoS-QoE mapping strategies cannot be directly applied to immersive environments.
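To make the idea concrete, the following sketch shows the general shape of such a QoS-to-QoE mapping for video streaming. The functional form, weights, and thresholds are illustrative assumptions rather than a model from the talk: delay, jitter, and loss are first mapped to the two application-level metrics (BT and SVD), which are then combined into an estimated MOS.

```python
# Minimal sketch of a QoS -> (BT, SVD) -> QoE mapping for video streaming.
# All coefficients and thresholds are assumed values for illustration only.
def estimate_video_qoe(delay_ms, jitter_ms, loss_pct, w_bt=0.6, w_svd=0.4):
    # Hypothetical mappings: buffering time grows with delay and jitter,
    # discontinuity grows with loss and jitter-induced loss.
    bt = 0.01 * delay_ms + 0.05 * jitter_ms    # seconds of stalling (assumed)
    svd = 0.5 * loss_pct + 0.02 * jitter_ms    # % of playback disrupted (assumed)

    # Clamp each impairment to [0, 1] and map linearly onto the 1-5 MOS scale.
    bt_penalty = min(bt / 10.0, 1.0)    # 10 s of buffering -> worst case (assumed)
    svd_penalty = min(svd / 20.0, 1.0)  # 20% discontinuity -> worst case (assumed)
    return 5.0 - 4.0 * (w_bt * bt_penalty + w_svd * svd_penalty)

# Example: moderate delay and jitter with 1% packet loss.
print(f"Estimated QoE: {estimate_video_qoe(80, 20, 1.0):.2f}")
```

Such a mapping is tuned to buffering and stalling artifacts of streamed video; immersive environments would need different application-level metrics, which motivates the questions below.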

Hence, in this talk, we address two related questions: (1) Can we identify metrics that objectively quantify the performance of an immersive environment? (2) Can we use these objective performance metrics to understand the likely user QoE without a subjective user study, or with only a minimal one?

We start with different examples of immersive environments, such as haptic-enabled applications, mirror therapy, and games. We discuss which metrics are influenced by different system parameters, such as processing power and network QoS. We then present some of our preliminary work on understanding users' QoE through these metrics.

Short bio

B. Prabhakaran is a Professor in the Computer Science Department at the University of Texas at Dallas. Dr. Prabhakaran received the prestigious NSF CAREER Award in FY 2003 for his proposal on Animation Databases. He was selected as an ACM Distinguished Scientist in 2011 and is currently an IEEE Senior Member. He is an Associate Editor of IEEE Transactions on Multimedia and a member of the editorial boards of the Multimedia Systems journal (Springer), the Multimedia Tools and Applications journal (Springer), and other multimedia systems journals. He received the Best Associate Editor award for 2015 from Springer's Multimedia Systems journal. Dr. Prabhakaran is a member of the Executive Council of the ACM Special Interest Group on Multimedia (SIGMM) and is the Co-Chair of the IEEE Technical Committee on Multimedia Computing (TCMC) Special Interest Group on Video Analytics (SIGVA). He is the General Co-Chair of the IEEE International Conference on Health Informatics (ICHI) 2015, and was a General Co-Chair of the ACM International Conference on Multimedia Retrieval 2013 (ICMR 2013), IEEE Haptic, Audio, and Visual Environments (HAVE) 2014, ACM Multimedia 2011, and ACM Multimedia and Security (MM&Sec) 2007. Prof. Prabhakaran's research has been funded by federal agencies such as the National Science Foundation (NSF), the US Army Research Office (ARO), and the US-IGNITE Program.

More information: http://utdallas.edu/~praba/cv.pdf