Publications (10 of 36)
Dima, E., Brunnstrom, K., Sjöström, M., Andersson, M., Edlund, J., Johansson, M. & Qureshi, T. (2020). Joint effects of depth-aiding augmentations and viewing positions on the quality of experience in augmented telepresence. Quality and User Experience, 5, Article ID 2.
Joint effects of depth-aiding augmentations and viewing positions on the quality of experience in augmented telepresence
2020 (English). In: Quality and User Experience, ISSN 2366-0139, Vol. 5, article id 2. Article in journal (Refereed), Published
Abstract [en]

Virtual and augmented reality is increasingly prevalent in industrial applications, such as remote control of industrial machinery, due to recent advances in head-mounted display technologies and low-latency communications via 5G. However, the influence of augmentations and camera-placement-based viewing positions on operator performance in telepresence systems remains unknown. In this paper, we investigate the joint effects of depth-aiding augmentations and viewing positions on the quality of experience for operators in augmented telepresence systems. A study was conducted with 27 non-expert participants using a real-time augmented telepresence system to perform a remote-controlled navigation and positioning task, with varied depth-aiding augmentations and viewing positions. The resulting quality of experience was analyzed via Likert opinion scales, task performance measurements, and simulator sickness evaluation. Results suggest that reducing the reliance on stereoscopic depth perception via camera placement has a significant benefit to operator performance and quality of experience. Conversely, the depth-aiding augmentations can partly mitigate the negative effects of inferior viewing positions. However, the viewing-position-based monoscopic and stereoscopic depth cues tend to dominate over cues based on augmentations. There is also a discrepancy between the participants' subjective opinions on augmentation helpfulness and its observed effects on positioning task performance.

Place, publisher, year, edition, pages
Switzerland, 2020
Keywords
Quality of experience, Augmented reality, Telepresence, Head mounted displays, Depth perception
National Category
Telecommunications; Communication Systems; Media Engineering
Identifiers
urn:nbn:se:ri:diva-43883 (URN)
10.1007/s41233-020-0031-7 (DOI)
Projects
Quality of Experience for Augmented Telepresence (KK-foundation 20160194)
Funder
Knowledge Foundation, 20160194
European Regional Development Fund (ERDF), 20201888
Available from: 2020-02-13. Created: 2020-02-13. Last updated: 2020-02-24. Bibliographically approved
van Kasteren, A., Brunnstrom, K., Hedlund, J. & Snijders, C. (2020). Quality of Experience Assessment of 360-degree video. In: Chandler, D.M., McCourt, M. & Mulligan, J.B. (Eds.), Human Vision and Electronic Imaging 2020. Paper presented at Human Vision and Electronic Imaging 2020. Burlingame, CA, USA, Article ID HVEI-091.
Quality of Experience Assessment of 360-degree video
2020 (English). In: Human Vision and Electronic Imaging 2020 / [ed] Chandler, D.M., McCourt, M. & Mulligan, J.B., Burlingame, CA, USA, 2020, article id HVEI-091. Conference paper, Published paper (Refereed)
Abstract [en]

The research domain on the Quality of Experience (QoE) of 2D video streaming has been well established. However, a new video format is emerging and gaining popularity and availability: VR 360-degree video. The processing and transmission of 360-degree videos bring along new challenges, such as large bandwidth requirements and the occurrence of different distortions. The viewing experience is also substantially different from 2D video: it offers more interactive freedom on the viewing angle but can also be more demanding and cause cybersickness. Further research on the QoE of 360-videos specifically is thus required. The goal of this study is to complement earlier research by Tran, Ngoc, Pham, Jung, and Thank (2017) testing the effects of quality degradation, freezing, and content on the QoE of 360-videos. Data will be gathered through subjective tests where participants watch degraded versions of 360-videos through an HMD. After each video they will answer questions regarding their quality perception, experience, perceptual load, and cybersickness. Results of the first part show overall rather low QoE ratings, which decrease even more as quality is degraded and freezing events are added. Cybersickness was found not to be an issue.

Place, publisher, year, edition, pages
Burlingame, CA, USA, 2020
Series
Electronic Imaging, ISSN 2470-1173
Keywords
Quality of Experience, 360 video, eye-tracking
National Category
Telecommunications; Communication Systems; Media Engineering
Identifiers
urn:nbn:se:ri:diva-43884 (URN)
Conference
Human Vision and Electronic Imaging 2020
Funder
Vinnova, 2018-00735
Available from: 2020-02-13. Created: 2020-02-13. Last updated: 2020-02-17. Bibliographically approved
Bosse, S., Brunnstrom, K., Arndt, S., Martini, M. G., Ramzan, N. & Engelke, U. (2019). A common framework for the evaluation of psychophysiological visual quality assessment. Quality and User Experience, 4(3).
A common framework for the evaluation of psychophysiological visual quality assessment
2019 (English). In: Quality and User Experience, ISSN 2366-0139, E-ISSN 2366-0147, Vol. 4, no 3. Article in journal (Refereed), Published
Abstract [en]

The assessment of perceived quality based on psychophysiological methods has recently gained traction, as it potentially overcomes certain flaws of psychophysical approaches. Although studies report promising results, it is not possible to arrive at decisive and comparable conclusions that recommend the use of one or another method for a specific application or research question. The Video Quality Experts Group started a project on psychophysiological quality assessment to study these novel approaches and to develop a test plan that enables more systematic research. This test plan comprises a specifically designed set of quality-annotated video sequences, suggestions for psychophysiological methods to be studied in quality assessment, and recommendations for the documentation and publication of test results. The test plan is presented in this article.

Keywords
Video quality, Psychophysiology, Quality assessment, Subjective tests, Electroencephalography, Video Quality Experts Group, VQEG
National Category
Communication Systems; Computer Systems; Telecommunications; Media Engineering
Identifiers
urn:nbn:se:ri:diva-39617 (URN)
10.1007/s41233-019-0025-5 (DOI)
Projects
Celtic-Next 5G Perfecta (2018-00735)
Available from: 2019-08-01. Created: 2019-08-01. Last updated: 2019-08-05. Bibliographically approved
Brunnstrom, K., Dima, E., Andersson, M., Sjöström, M., Qureshi, T. & Johanson, M. (2019). Quality of Experience of hand controller latency in a Virtual Reality simulator. In: Damon Chandler, Mark McCourt and Jeffrey Mulligan (Eds.), Human Vision and Electronic Imaging 2019. Paper presented at Human Vision and Electronic Imaging 2019, Article ID 3068450.
Quality of Experience of hand controller latency in a Virtual Reality simulator
2019 (English). In: Human Vision and Electronic Imaging 2019 / [ed] Damon Chandler, Mark McCourt and Jeffrey Mulligan, 2019, article id 3068450. Conference paper, Published paper (Refereed)
Abstract [en]

In this study, we investigate a VR simulator of a forestry crane used for loading logs onto a truck, mainly looking at Quality of Experience (QoE) aspects that may be relevant for task completion, but also whether there are any discomfort-related symptoms experienced during task execution. A QoE test has been designed to capture both the general subjective experience of using the simulator and to study task performance. Moreover, a specific focus has been to study the effects of latency on the subjective experience with regard to delays in the crane control interface. A formal subjective study has been performed in which we added controlled delays to the hand controller (joystick) signals, ranging from 0 ms to 800 ms. We found no significant effects on task performance or on any of the rating scales for added delays up to 200 ms. A significant negative effect was found for 800 ms of added delay. The symptoms reported in the Simulator Sickness Questionnaire (SSQ) were significantly higher for all the symptom groups, but a majority of the participants reported only slight symptoms. Two out of thirty test persons stopped the test before finishing due to their symptoms.

Series
Electronic Imaging, ISSN 2470-1173
Keywords
Quality of Experience, Virtual Reality, Simulator, QoE, Delay
National Category
Communication Systems; Telecommunications; Media Engineering
Identifiers
urn:nbn:se:ri:diva-37738 (URN)
Conference
Human Vision and Electronic Imaging 2019
Funder
Knowledge Foundation, 20160194
Available from: 2019-02-07. Created: 2019-02-07. Last updated: 2019-02-08. Bibliographically approved
Dima, E., Brunnstrom, K., Sjöström, M., Andersson, M., Edlund, J., Johanson, M. & Qureshi, T. (2019). View Position Impact on QoE in an Immersive Telepresence System for Remote Operation. In: 2019 Eleventh International Conference on Quality of Multimedia Experience (QoMEX). Paper presented at 2019 Eleventh International Conference on Quality of Multimedia Experience (QoMEX). Berlin, Germany: IEEE.
View Position Impact on QoE in an Immersive Telepresence System for Remote Operation
2019 (English). In: 2019 Eleventh International Conference on Quality of Multimedia Experience (QoMEX), Berlin, Germany: IEEE, 2019. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper, we investigate how different viewing positions affect a user's Quality of Experience (QoE) and performance in an immersive telepresence system. A QoE experiment has been conducted with 27 participants to assess the general subjective experience and the performance of remotely operating a toy excavator. Two view positions have been tested, an overhead and a ground-level view, respectively, which encourage reliance on stereoscopic depth cues to different extents for accurate operation. Results demonstrate a significant difference between ground and overhead views: the ground view increased the perceived difficulty of the task, whereas the overhead view increased the perceived accomplishment as well as the objective performance of the task. The perceived helpfulness of the overhead view was also significant according to the participants.

Place, publisher, year, edition, pages
Berlin, Germany: IEEE, 2019
Keywords
quality of experience, augmented telepresence, head mounted display, viewpoint, remote operation, camera view
National Category
Communication Systems; Computer Systems; Telecommunications; Media Engineering
Identifiers
urn:nbn:se:ri:diva-39618 (URN)
10.1109/QoMEX.2019.8743147 (DOI)
2-s2.0-85068638935 (Scopus ID)
Conference
2019 Eleventh International Conference on Quality of Multimedia Experience (QoMEX)
Projects
KK-project Quality of Experience for Augmented Telepresence (20160194)
Funder
Knowledge Foundation, 20160194
Note

© 2019 IEEE.  Personal use of this material is permitted.  Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Available from: 2019-08-01. Created: 2019-08-01. Last updated: 2020-02-04. Bibliographically approved
Allison, R., Brunnstrom, K., Chandler, D., Colett, H., Corriveau, P., Daly, S., . . . Zhang, Y. (2018). Perspectives on the definition of visually lossless quality for mobile and large format displays. Journal of Electronic Imaging, 27(5), Article ID 053035.
Perspectives on the definition of visually lossless quality for mobile and large format displays
2018 (English). In: Journal of Electronic Imaging, ISSN 1017-9909, Vol. 27, no 5, article id 053035. Article in journal (Refereed), Published
Abstract [en]

Advances in imaging and display engineering have given rise to new and improved image and video applications that aim to maximize visual quality under given resource constraints (e.g., power, bandwidth). Because the human visual system is an imperfect sensor, the images/videos can be represented in a mathematically lossy fashion but with enough fidelity that the losses are visually imperceptible, commonly termed "visually lossless." Although a great deal of research has focused on gaining a better understanding of the limits of human vision when viewing natural images/video, a universally or even largely accepted definition of visually lossless remains elusive. Differences in testing methodologies, research objectives, and target applications have led to multiple ad hoc definitions that are often difficult to compare to or otherwise employ in other settings. We present a compendium of technical experiments relating to both vision science and visual quality testing that together explore the research and business perspectives of visually lossless image quality, as well as review recent scientific advances. Together, the studies presented in this paper suggest that a single definition of visually lossless quality might not be appropriate; rather, a better goal would be to establish varying levels of visually lossless quality that can be quantified in terms of the testing paradigm.

Keywords
visual lossless, visual lossy, image quality, industrial perspective, mobile screen, large format displays
National Category
Communication Systems; Telecommunications; Media Engineering
Identifiers
urn:nbn:se:ri:diva-35314 (URN)
10.1117/1.JEI.27.5.053035 (DOI)
2-s2.0-85054964928 (Scopus ID)
Note

Copyright (2018) Society of Photo-Optical Instrumentation Engineers. One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited.

Available from: 2018-10-15. Created: 2018-10-15. Last updated: 2018-12-21. Bibliographically approved
Brunnström, K., Sjöström, M., Muhammad, I., Magnus, P. & Johanson, M. (2018). Quality of Experience for a Virtual Reality simulator. In: Rogowitz, B., Pappas, T. & De Ridder, H. (Eds.), Human Vision and Electronic Imaging 2018. Paper presented at IS&T Human Vision and Electronic Imaging 2018, Burlingame, California, USA, 28 Jan. - 2 Feb. 2018. The Society for Imaging Science and Technology.
Quality of Experience for a Virtual Reality simulator
2018 (English). In: Human Vision and Electronic Imaging 2018 / [ed] Rogowitz, B., Pappas, T. & De Ridder, H., The Society for Imaging Science and Technology, 2018. Conference paper, Published paper (Refereed)
Abstract [en]

In this study, we investigate a VR simulator of a forestry crane used for loading logs onto a truck, mainly looking at Quality of Experience (QoE) aspects that may be relevant for task completion, but also whether there are any discomfort-related symptoms experienced during task execution. The QoE test has been designed to capture both the general subjective experience of using the simulator and to study the task completion rate. Moreover, a specific focus has been to study the effects of latency on the subjective experience, with regard both to delays in the crane control interface and to lag in the visual scene rendering in the head-mounted display (HMD). Two larger formal subjective studies have been performed: one with the VR system as it is, and one where we added controlled delay to the display update and to the joystick signals. The baseline study shows that most people are more or less happy with the VR system and that it does not have strong effects on any of the symptoms listed in the Simulator Sickness Questionnaire (SSQ). In the delay study we found significant effects on Comfort Quality and Immersion Quality for the higher display delay (30 ms), but very small impact of joystick delay. Furthermore, the display delay had a strong influence on the SSQ symptoms, causing some test subjects to decide not to continue with the complete experiment. We found that this was especially connected to the longer added display delays (≥ 20 ms).

Place, publisher, year, edition, pages
The Society for Imaging Science and Technology, 2018
Keywords
Quality of Experience, Virtual Reality, Simulator, QoE, Delay
National Category
Engineering and Technology; Communication Systems; Computer Systems
Identifiers
urn:nbn:se:ri:diva-35187 (URN)
10.2352/ISSN.2470-1173.2018.14.HVEI-526 (DOI)
2-s2.0-85064043234 (Scopus ID)
Conference
IS&T Human Vision and Electronic Imaging 2018, Burlingame, California USA, 28 Jan. - 2 Feb, 2018
Funder
Knowledge Foundation, 20160194
Available from: 2018-09-18. Created: 2018-09-18. Last updated: 2019-08-13. Bibliographically approved
Brunnstrom, K. & Barkowsky, M. (2018). Statistical quality of experience analysis: on planning the sample size and statistical significance testing. Journal of Electronic Imaging (JEI), 27(5), Article ID 053013.
Statistical quality of experience analysis: on planning the sample size and statistical significance testing
2018 (English). In: Journal of Electronic Imaging (JEI), ISSN 1017-9909, E-ISSN 1560-229X, Vol. 27, no 5, article id 053013. Article in journal (Refereed), Published
Abstract [en]

This paper analyzes how an experimenter can balance errors in subjective video quality tests between the statistical power of finding an effect if it is there and not claiming that an effect is there if it is not, i.e., balancing Type I and Type II errors. The risk of committing Type I errors increases with the number of comparisons that are performed in statistical tests. We will show that when controlling for this, and at the same time keeping the power of the experiment at a reasonably high level, it is unlikely that the number of test subjects normally used and recommended by the International Telecommunication Union (ITU), i.e., 15, is sufficient, whereas the number used by the Video Quality Experts Group (VQEG), i.e., 24, is more likely to be sufficient. Examples will also be given for the influence of Type I errors on the statistical significance of comparing objective metrics by correlation. We also present a comparison between parametric and nonparametric statistics. The comparison targets the question of whether we would reach different conclusions on the statistical difference between the video quality ratings of different video clips in a subjective test, based on the comparison between the Student t-test and the Mann–Whitney U-test. We found that there was hardly any difference when few comparisons are compensated for; almost the same conclusions are reached. When the number of comparisons is increased, larger and larger differences between the two methods are revealed. In these cases, the parametric t-test gives clearly more significant cases than the nonparametric test, which makes it more important to investigate whether the assumptions for performing a certain test are met.
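The two analyses named in the abstract — a parametric versus nonparametric comparison of two rating sets, and a Bonferroni-style control of the family-wise Type I error over multiple comparisons — can be sketched in a few lines of Python. The rating data below is synthetic, and the number of comparisons `m` is an illustrative assumption, not a value from the paper; only the panel size of 24 subjects comes from the abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic 1-5 opinion scores for two video clips, 24 subjects each
# (24 is the VQEG-recommended panel size cited in the abstract).
clip_a = rng.normal(3.8, 0.8, 24)
clip_b = rng.normal(3.2, 0.8, 24)

# Parametric vs. nonparametric test of the same difference.
t_p = stats.ttest_ind(clip_a, clip_b).pvalue
u_p = stats.mannwhitneyu(clip_a, clip_b, alternative="two-sided").pvalue

# Bonferroni correction: with m comparisons in the experiment, each
# individual test is judged at alpha/m so the family-wise Type I
# error rate stays at alpha.
m, alpha = 10, 0.05          # m is an assumed comparison count
threshold = alpha / m

print(f"t-test p = {t_p:.4f}, Mann-Whitney p = {u_p:.4f}")
print(f"per-comparison significance threshold = {threshold}")
```

With few comparisons the two tests usually agree, as the abstract reports; shrinking the per-comparison threshold as `m` grows is where their conclusions start to diverge.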

Place, publisher, year, edition, pages
SPIE/IS&T, 2018
Keywords
Type-I error, video quality, statistical significance, quality of experience, Student t-test, Bonferroni, Mann–Whitney U-test, parametric versus nonparametric test
National Category
Telecommunications; Communication Systems; Media Engineering
Identifiers
urn:nbn:se:ri:diva-35233 (URN)
10.1117/1.JEI.27.5.053013 (DOI)
2-s2.0-85054069504 (Scopus ID)
Funder
Knowledge Foundation, 20160194
Note

Copyright (2018) Society of Photo-Optical Instrumentation Engineers. One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited.

Available from: 2018-10-03. Created: 2018-10-03. Last updated: 2018-12-21. Bibliographically approved
Sedano, I., Prieto, G., Brunnstrom, K., Kihl, M. & Montalban, J. (2017). Application of full-reference video quality metrics in IPTV. In: IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB. Paper presented at 12th IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB 2017, 7 June 2017 through 9 June 2017.
Application of full-reference video quality metrics in IPTV
2017 (English). In: IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB, 2017. Conference paper, Published paper (Refereed)
Abstract [en]

Executing an accurate full-reference metric such as VQM can take minutes on an average computer for just one user. It can therefore be infeasible to analyze all the videos received by users in an IPTV network consisting of, for example, 10,000 users with a single computer running the VQM metric. One solution is to use lightweight no-reference metrics in addition to the full-reference metric: the no-reference metrics are accurate enough to screen out situations that do not require evaluation, and the full-reference VQM metric can then be applied when more accuracy is needed. The work in this paper focuses on determining the maximum number of situations/users that can be analyzed simultaneously using the VQM metric on a computer with good performance. The full-reference metric is applied at the transmitter using a method specified in Recommendation ITU-R BT.1789. The best performance achieved was 112.8 seconds per process.
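The scaling argument in the abstract can be made concrete with a back-of-envelope sketch. Only the 112.8-second best-case runtime comes from the abstract; the process count and per-user sampling interval below are illustrative assumptions.

```python
# Back-of-envelope capacity estimate for full-reference (VQM-style)
# monitoring on one machine.
runtime_per_eval_s = 112.8   # reported best-case runtime per process
parallel_processes = 8       # assumed: concurrent VQM processes
window_s = 3600.0            # assumed: each user is checked once per hour

# How many user streams one machine can evaluate per hour.
evals_per_hour = parallel_processes * window_s / runtime_per_eval_s
print(f"users coverable per hour: {evals_per_hour:.0f}")  # ~255
```

Even under these generous assumptions, one machine covers only a few hundred users per hour — far short of a 10,000-user network, which is why the paper combines VQM with lightweight no-reference metrics as a pre-screen.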

Keywords
IPTV & Internet TV, Objective evaluation techniques, Performance evaluation, QoE, Broadband networks, Multimedia systems, Television broadcasting, Full references, Internet tv, IPTV networks, No-reference metrics, Single computer, Video quality, IPTV
National Category
Natural Sciences
Identifiers
urn:nbn:se:ri:diva-30833 (URN)
10.1109/BMSB.2017.7986191 (DOI)
2-s2.0-85027248799 (Scopus ID)
9781509049370 (ISBN)
Conference
12th IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB 2017, 7 June 2017 through 9 June 2017
Available from: 2017-09-07. Created: 2017-09-07. Last updated: 2019-02-04. Bibliographically approved
Brunnström, K. & Barkowsky, M. (2017). Balancing type I errors and statistical power in video quality assessment. In: IS and T International Symposium on Electronic Imaging Science and Technology. Paper presented at Human Vision and Electronic Imaging 2017, HVEI 2017, 29 January 2017 through 2 February 2017 (pp. 91-96).
Balancing type I errors and statistical power in video quality assessment
2017 (English). In: IS and T International Symposium on Electronic Imaging Science and Technology, 2017, p. 91-96. Conference paper, Published paper (Refereed)
Abstract [en]

This paper analyzes how an experimenter can balance errors in subjective video quality tests between the statistical power of finding an effect if it is there and not claiming that an effect is there if it is not, i.e., balancing Type I and Type II errors. The risk of committing Type I errors increases with the number of comparisons that are performed in statistical tests. We will show that when controlling for this, and at the same time keeping the power of the experiment at a reasonably high level, more test subjects will be required than are normally used and recommended by international standardization bodies like the ITU. Examples will also be given for the influence of Type I errors on the statistical significance of comparing objective metrics by correlation.

Keywords
International standardization, Objective metrics, Statistical power, Statistical significance, Subjective video quality, Time keeping, Type I and type II errors, Video quality assessment, Errors
National Category
Natural Sciences
Identifiers
urn:nbn:se:ri:diva-35363 (URN)
10.2352/ISSN.2470-1173.2017.14.HVEI-122 (DOI)
2-s2.0-85041524573 (Scopus ID)
Conference
Human Vision and Electronic Imaging 2017, HVEI 2017, 29 January 2017 through 2 February 2017
Available from: 2018-10-23. Created: 2018-10-23. Last updated: 2018-10-23. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0001-5060-9402