Publications (10 of 21)
Göreke, H., Djupsjöbacka, A., Schenkman, B., Andrén, B., Hermann, D. S. & Brunnström, K. (2023). Perceptual Judgments of Simulated Low Temperatures in LCD based Vehicle Displays. In: Digest of Technical Papers - SID International Symposium: . Paper presented at SID International Symposium Digest of Technical Papers, 2023. Los Angeles, USA. 21 May 2023 through 26 May 2023 (pp. 595-598). John Wiley and Sons Inc, 54(1)
2023 (English) In: Digest of Technical Papers - SID International Symposium, John Wiley and Sons Inc, 2023, Vol. 54, no 1, p. 595-598. Conference paper, Published paper (Refereed)
Abstract [en]

A well-known drawback of LCD displays in cold conditions is slow pixel response, which leads to poor picture quality. Low temperatures can therefore constitute a hazard when viewing important displays in cars. Perceptual experiments with 20 test persons were conducted to find the ranges of clear and acceptable image quality on screens simulating the distortions that arise at low temperatures. The results showed that both clear and acceptable image quality were impaired beyond -20°C for the LCD screen used in the experiments.

Place, publisher, year, edition, pages
John Wiley and Sons Inc, 2023
Keywords
CMS; Cold screen; LCD displays; LCD screens; Low temperatures; Perceptual judgements; Picture quality; Psychophysics; Vehicle display; Video quality; Liquid crystal displays
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:ri:diva-68033 (URN)10.1002/sdtp.16628 (DOI)2-s2.0-85175305976 (Scopus ID)
Conference
SID International Symposium Digest of Technical Papers, 2023. Los Angeles, USA. 21 May 2023 through 26 May 2023
Note

This research has been supported by Sweden's Innovation Agency (VINNOVA, dnr. 2020-05129 and 2021-02107) through the project SCREENS and the Celtic-Next project IMMINENCE (C2020/2-2).

Available from: 2023-11-23 Created: 2023-11-23 Last updated: 2023-11-23. Bibliographically approved
Brunnström, K., Djupsjöbacka, A., Billingham, J., Wistel, K., Andrén, B., Ozolins, O. & Evans, N. (2023). Video expert assessment of high quality video for Video Assistant Referee (VAR): A comparative study. Multimedia tools and applications
2023 (English) In: Multimedia tools and applications, ISSN 1380-7501, E-ISSN 1573-7721. Article in journal (Refereed). Epub ahead of print
Abstract [en]

The International Football Association Board decided to introduce the Video Assistant Referee (VAR) in 2018. This created a need for methods to control the quality of VAR systems. This article focuses on one important aspect: evaluating video quality. Video quality assessment has matured in the sense that there are standardized and commercial products, as well as established open-source solutions, for measuring it with objective methods. Previous research has primarily focused on end-user quality assessment; how to assess video in the contribution phase of the chain is less studied. The novelties of this study are two-fold: 1) the user study specifically targets video experts, i.e., it assesses the perceived quality reported by professionals who work with video production; 2) six video quality models have been independently benchmarked against the user data and evaluated to show which of the models best predict perceived quality. Independent evaluation is important for unbiased results, as shown by the Video Quality Experts Group. An experiment was performed in which 25 video experts rated perceived quality. The video formats tested were High-Definition TV, both progressive and interlaced, as well as a quarter-size format scaled down to half the size in both width and height. The full-size videos were encoded with both H.264 and Motion JPEG, the quarter-size videos with H.264 only. Bitrates ranged from 80 Mbit/s down to 10 Mbit/s. For H.264 the quality was overall very good but dropped somewhat at 10 Mbit/s. For Motion JPEG the quality dropped over the whole range. For the interlaced format, the degradation based on a simple deinterlacing method received overall low ratings. For the quarter-size format three different scaling algorithms were evaluated; Lanczos performed the best and bilinear the worst.
The performance of six different video quality models was evaluated for 1080p and 1080i. The Video Quality Metric for Variable Frame Delay had the best performance for both formats, followed by the Video Multimethod Assessment Fusion method and the Video Quality Metric General model.

Place, publisher, year, edition, pages
Springer, 2023
Keywords
Digital television; High definition television; Sports; Value engineering; Contribution; Objective video quality; PSNR; SSIM; Subjective video quality; Video assistant referee; Video quality; VIF; VMAF; VQM general; VQM_VFD; Quality control
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:ri:diva-69335 (URN)10.1007/s11042-023-17741-4 (DOI)2-s2.0-85180664528 (Scopus ID)
Note

This work was funded by Fédération Internationale de Football Association (FIFA) and Sweden's Innovation Agency (VINNOVA, dnr. 2021-02107) through the Celtic-Next project IMMINENCE (C2020/2-2), which is hereby gratefully acknowledged.

Available from: 2024-01-15 Created: 2024-01-15 Last updated: 2024-05-27. Bibliographically approved
Brunnström, K., Djupsjöbacka, A., Billingham, J., Wistel, K., Andrén, B., Ozolins, O. & Evans, N. (2023). Video quality of video professionals for Video Assisted Referee (VAR). Paper presented at IS and T International Symposium on Electronic Imaging: Human Vision and Electronic Imaging, HVEI 2023. San Francisco, USA. 15 January 2023 through 19 January 2023. IS and T International Symposium on Electronic Imaging Science and Technology, 35(10), Article ID 259.
2023 (English) In: IS and T International Symposium on Electronic Imaging Science and Technology, ISSN 2470-1173, Vol. 35, no 10, article id 259. Article in journal (Refereed). Published
Abstract [en]

Changes in the footballing world's approach to technology and innovation contributed to the decision by the International Football Association Board to introduce Video Assistant Referees (VAR). The change meant that, under strict protocols, referees could use video replays to review decisions in the event of a "clear and obvious error" or a "serious missed incident". This led to the need for the Fédération Internationale de Football Association (FIFA) to develop methods for quality control of VAR systems, which was done in collaboration with RISE Research Institutes of Sweden AB. One of the important aspects is video quality. The novelty of this study is a user study specifically targeting video experts, i.e., measuring the perceived quality reported by professionals who work with video production as their main occupation. An experiment was performed involving 25 video experts. In addition, six video quality models have been benchmarked against the user data and evaluated to show which of the models best predict perceived quality for this application. The Video Quality Metric for variable frame delay (VQM_VFD) had the best performance for both formats, followed by the Video Multimethod Assessment Fusion (VMAF) and the VQM General model.

Place, publisher, year, edition, pages
Society for Imaging Science and Technology, 2023
Keywords
Sports; Value engineering; Federation internationale de football associations; Perceived quality; Quality modeling; Quality of videos; Research institutes; User data; User study; Video production; Video quality; Video quality metric; Quality control
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:ri:diva-66714 (URN)10.2352/EI.2023.35.10.HVEI-259 (DOI)2-s2.0-85169574220 (Scopus ID)
Conference
IS and T International Symposium on Electronic Imaging: Human Vision and Electronic Imaging, HVEI 2023. San Francisco, USA. 15 January 2023 through 19 January 2023
Note

This work was mainly funded by Fédération Internationale de Football Association (FIFA) and partly supported by Sweden's Innovation Agency (VINNOVA, dnr. 2021-02107) through the Celtic-Next project IMMINENCE (C2020/2-2), as well as by RISE internal funding.

Available from: 2023-09-20 Created: 2023-09-20 Last updated: 2023-12-04. Bibliographically approved
Eksvärd, S. & Falk, J. (2021). Evaluating Speech-to-Text Systems and AR-glasses: A study to develop a potential assistive device for people with hearing impairments.
2021 (English) Report (Other academic)
Abstract [en]

Having a hearing impairment or being deaf has several consequences for an individual's quality of life. It affects everyday life to a great extent and makes it difficult to take part in conversations. Several aids already exist to ease everyday life for people with hearing impairments, such as hearing aids, but these come with certain challenges and problems. One possible solution is to use Augmented Reality together with a speech-to-text system, where speech is converted into text that is presented, for example, in AR glasses. Augmented Reality (AR) is a technology that augments reality by presenting computer-generated information, such as visual objects, on top of the real world. One form of AR technology is AR glasses, in which computer-generated objects are presented in the glasses and thereby augment the user's reality. Various kinds of AR and AR glasses have been studied for a long time, but only recently has the quality become good enough for everyday use. Today there are several different AR glasses, with different technical, ergonomic, visual and optical characteristics, and certain glasses may be more or less suited to a given area of use. The characteristics of the glasses also affect the user experience. Although the technology is already on the market, some problems remain, such as lighting, background, and the fact that the visibility of the computer-generated objects is affected by ambient conditions. Since the goal is to present text in the AR glasses, it is important that the text is visible, legible and readable under several different conditions, such as varying lighting and backgrounds. Consideration should also be given to how the text is presented, where factors such as size, text colour, background colour behind the text, and number of lines should be taken into account. Speech-to-text is also a technology that has made a breakthrough in recent years, thanks in part to the great advances made in speech recognition.
On the English-speaking market the technology is widespread and there are several existing systems, such as Google Cloud Speech API, IBM Watson and Microsoft Azure. On the Swedish market, however, there are few systems and few evaluations of them. When evaluating speech-to-text systems, factors such as accuracy, latency and robustness under different noise levels, speech rates and dialects should be considered. Accordingly, one should evaluate which Swedish speech-to-text system is most suitable for converting speech to text with respect to the aspects above. This thesis investigates how the characteristics of two different AR glasses affect the user experience, focusing on comfort, design, and the optical and visual characteristics of the glasses. It further investigates how lighting and background affect the legibility and readability of text, and how the text should be presented to achieve the best legibility and readability. This is studied through user tests in which different text formattings are presented under different lighting conditions and against different backgrounds, followed by a short questionnaire in which the participants answer questions about the text formattings. The questionnaire also includes questions related to the characteristics of the glasses and how these affect the user experience, in order to examine which glasses would be most suitable for the proposed system. Finally, a market survey and literature review of existing speech-to-text systems on the Swedish market is carried out. The results of the study show that lighting and background affect the visibility, and in turn the legibility, of the text presented in the glasses. They further show that the formatting of the text, with respect to text colour, text background, number of lines and size, affects legibility. The results also show that the characteristics of the AR glasses affect the user experience, legibility and readability; which glasses are best suited, however, appears to depend on individual preferences.
Regarding speech-to-text systems, four are identified as available on the Swedish market. Based on the evaluation of the available systems, Google Cloud Speech API is recommended, based on its technical characteristics, robustness and availability.

Publisher
p. 111
Series
RISE Rapport ; 2021:31
National Category
Engineering and Technology
Identifiers
urn:nbn:se:ri:diva-52579 (URN)10.23699/yedh-qn68 (DOI)978-91-89385-16-0 (ISBN)
Note

Uppsala University. Master of Science Thesis, Faculty of Science and Technology, UTH Division.

Available from: 2021-03-09 Created: 2021-03-09 Last updated: 2023-05-25. Bibliographically approved
Falk, J., Eksvärd, S., Schenkman, B., Andrén, B. & Brunnström, K. (2021). Legibility and readability in Augmented Reality. In: 2021 13th International Conference on Quality of Multimedia Experience (QoMEX): . Paper presented at 2021 13th International Conference on Quality of Multimedia Experience (QoMEX). 14-17 June 2021. Montreal, Canada. (pp. 231-236).
2021 (English) In: 2021 13th International Conference on Quality of Multimedia Experience (QoMEX), 2021, p. 231-236. Conference paper, Published paper (Refereed)
Abstract [en]

Digital technology offers multimodal presentation of information that can be used for translating foreign languages or for alleviating the communication problems of hearing impairment or deafness. Today, there exist various aids that can be used for speech-to-text translation, but they come with some challenges. One potential solution is to combine Augmented Reality (AR) and speech-to-text systems, where speech is converted into text that is then presented in AR glasses. In AR, one crucial problem is the legibility and readability of text under different environmental conditions. Different types of AR glasses have different ergonomic characteristics, which implies that certain glasses might be more suitable for such a system than others. In this investigation, two different AR glasses were evaluated based on, among other things, their optical, visual, and ergonomic characteristics. User tests were conducted to evaluate the legibility and readability of text in different environmental contexts. The results indicate that legibility and readability are affected by factors such as ambient illuminance, background properties, and how the text is presented with respect to polarity, opacity, size, and number of lines. The characteristics of the glasses impact the user experience, but which glasses are preferred depends on the individual.
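The abstract notes that polarity and background properties affect legibility. The paper does not specify a contrast metric, but as a hypothetical illustration of how a text/background colour pair can be quantified, the WCAG 2.x contrast-ratio formula is one common choice (the colours below are illustrative, not from the study):

```python
# Hypothetical sketch (not the paper's method): quantifying a
# text/background combination with the WCAG 2.x contrast ratio.

def _linearize(c):
    """Linearize one sRGB channel value (0-255) per the WCAG definition."""
    c /= 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an sRGB colour, 0.0 (black) to 1.0 (white)."""
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    lo, hi = sorted((relative_luminance(fg), relative_luminance(bg)))
    return (hi + 0.05) / (lo + 0.05)

print(round(contrast_ratio((255, 255, 255), (0, 0, 0)), 1))  # → 21.0
print(round(contrast_ratio((255, 255, 255), (128, 128, 128)), 1))
```

White text on black reaches the maximum 21:1 ratio, which is why dark-polarity presentation against bright backgrounds is a recurring concern in optical see-through AR.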

Keywords
Image quality, Visualization, Deafness, Glass, Auditory system, Optics, Solids, Augmented Reality, AR, legibility, readability
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:ri:diva-55406 (URN)10.1109/QoMEX51781.2021.9465455 (DOI)
Conference
2021 13th International Conference on Quality of Multimedia Experience (QoMEX). 14-17 June 2021. Montreal, Canada.
Available from: 2021-07-08 Created: 2021-07-08 Last updated: 2023-05-25. Bibliographically approved
Brunnström, K., Djupsjöbacka, A. & Andrén, B. (2021). Objective video quality assessment methods for Video assistant refereeing (VAR) Systems.
2021 (English) Report (Other academic)
Abstract [en]

This report describes the work and conclusions drawn after phase 4 of the project "Assessment methods for Video assistant refereeing (VAR) System".

The performance of the six video quality models identified during phase 1 has been evaluated against the subjective video quality database created during phase 3. The results differ slightly between 1080p and 1080i. For 1080p the models VQM_VFD, SSIM and VMAF perform the best, with Pearson Correlation Coefficients (PCC) above 0.9. For 1080i the PCC drops somewhat overall; VMAF and VQM_VFD are then close in performance and perform the best. Across both formats, VMAF and VQM_VFD stand out as the best models. In this comparison, VQM_VFD has the added advantage of performing its own registration, i.e. correcting any misalignment between the reference video and the distorted one.
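The benchmarking step described above, comparing model predictions with subjective scores via the Pearson Correlation Coefficient, can be sketched as follows; the score lists are made-up illustrative numbers, not data from the study:

```python
# Sketch of PCC-based model benchmarking: correlate subjective mean
# opinion scores (MOS) with a quality model's predictions.
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical MOS and model predictions for five test clips:
mos = [4.6, 4.2, 3.1, 2.0, 1.4]
model_pred = [4.5, 4.0, 3.3, 2.2, 1.5]
print(round(pearson(mos, model_pred), 3))
```

A model whose PCC against the subjective database exceeds 0.9, as VQM_VFD, SSIM and VMAF do for 1080p in the report, tracks perceived quality closely over the tested conditions.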

Publisher
p. 14
Series
RISE Rapport ; 2021:30
National Category
Signal Processing
Identifiers
urn:nbn:se:ri:diva-52578 (URN)10.23699/0x71-2570 (DOI)978-91-89385-15-3 (ISBN)
Available from: 2021-03-09 Created: 2021-03-09 Last updated: 2023-05-25. Bibliographically approved
Brunnström, K., Djupsjöbacka, A. & Andrén, B. (2021). Objective video quality assessment methods for Video assistant refereeing (VAR) System: Phase 4 report on synchronizationand latency measurements. Kista
2021 (English) Report (Other academic)
Abstract [en]

This report describes the work and conclusions drawn after phase 4 for the synchronization and latency measurements in the project "Assessment methods for Video Assistant Refereeing (VAR) System". In phase 4, we have focused on updating the software so that it automatically detects p (progressive), i (interlaced), and PsF (progressive segmented frame) formats and analyses the different video types. We have also performed a field test with an OB van at an arena.

Place, publisher, year, edition, pages
Kista, 2021. p. 13
Series
RISE Rapport ; 2021:30
Keywords
Fotboll, Video Assistant Referee (VAR), Latency, Synchronicity
National Category
Communication Systems Media Engineering Telecommunications
Identifiers
urn:nbn:se:ri:diva-59807 (URN)10.23699/0x71-2570 (DOI)978-91-89385-15-3 (ISBN)
Available from: 2022-07-15 Created: 2022-07-15 Last updated: 2023-05-25. Bibliographically approved
Brunnström, K., Djupsjöbacka, A. & Andrén, B. (2021). Video quality based on a user study with video professionals for Video Assisted Refereeing (VAR) Systems.
2021 (English) Report (Other academic)
Abstract [en]

This document describes a user experiment with the purpose of finding a baseline quality suitable for the system, and of creating a database for training and evaluating objective quality measurement methods suitable for assessing the video quality of VAR systems.

A user experiment was performed involving 25 Swedish video experts. Three different video formats were included: 1080p, 1080i and 540i. The degradations were in most cases produced by encoding with Motion JPEG (MJPEG) and H.264 in the bitrate range from 80 Mbit/s down to 10 Mbit/s.

MJPEG loses quality very fast: already at 80 Mbit/s it has significantly lower quality than the uncompressed reference, and at even lower bitrates the quality quickly falls to bad. H.264, on the other hand, was not found to be significantly different from the uncompressed reference until the bitrate had dropped to 10 Mbit/s for 1080p. For 1080i, 20 Mbit/s was also weakly significant, and for 540i, 20 Mbit/s was significantly lower for some of the scaling methods. For 1080i the deinterlacing requires careful consideration, since the deinterlacing scheme introduced received very low quality scores. Among the scaling schemes, Lanczos was the best and bilinear the worst.

Requirement levels on bitrate for the encoders MJPEG and H.264 based on this experiment:
•    MJPEG requires more than 120 Mbit/s
•    H.264 requires more than 50 Mbit/s
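The finding that Lanczos scaling outperforms bilinear comes down to the resampling kernels. As a hypothetical illustration, not code from the study, the two kernels and a tiny 1-D downscaler can be sketched like this (the signal is made up):

```python
# Sketch: bilinear vs. Lanczos resampling kernels, applied by a
# minimal normalized 1-D downscaler. Illustrative only.
import math

def bilinear_kernel(x):
    """Triangle kernel of bilinear interpolation (support 1)."""
    x = abs(x)
    return 1.0 - x if x < 1.0 else 0.0

def lanczos_kernel(x, a=3):
    """Lanczos windowed-sinc kernel (support a); its wider support and
    negative lobes preserve edges better than the triangle kernel."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def downscale_1d(samples, factor, kernel, support):
    """Resample a 1-D signal by `factor` (< 1) with a normalized kernel."""
    out = []
    for i in range(int(len(samples) * factor)):
        center = (i + 0.5) / factor - 0.5            # position in source space
        lo = int(math.floor(center - support / factor))
        hi = int(math.ceil(center + support / factor))
        acc = wsum = 0.0
        for j in range(lo, hi + 1):
            if 0 <= j < len(samples):
                w = kernel((j - center) * factor)    # kernel stretched for downscale
                acc += w * samples[j]
                wsum += w
        out.append(acc / wsum if wsum else 0.0)
    return out

edge = [0.0] * 8 + [1.0] * 8                          # a step edge, e.g. a pixel row
half_bilinear = downscale_1d(edge, 0.5, bilinear_kernel, 1)
half_lanczos = downscale_1d(edge, 0.5, lanczos_kernel, 3)
```

On a step edge the triangle kernel simply smears the transition, while the Lanczos kernel's negative lobes keep it sharper, which is consistent with the subjective ranking reported above.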

Publisher
p. 33
Series
RISE Rapport ; 2021:29
National Category
Signal Processing
Identifiers
urn:nbn:se:ri:diva-52577 (URN)10.23699/79ja-gj68 (DOI)978-91-89385-14-6 (ISBN)
Available from: 2021-03-09 Created: 2021-03-09 Last updated: 2023-05-25. Bibliographically approved
Brunnström, K., Andrén, B., Schenkman, B., Djupsjöbacka, A. & Hamsis, O. (2020). Recommended precautions because of Covid-19 for perceptual, behavioural,quality and user experience experimentswith test persons in indoor labs.
2020 (English) Report (Other academic)
Abstract [en]

Based on the recommendations from the Public Health Agency of Sweden (Folkhälsomyndigheten; FHM) and a set of internal rules from RISE, the following rules are published for how to conduct experiments involving test persons in the times of the Covid-19 pandemic. The recommendations are for non-invasive and non-medical tests, e.g. perceptual, consumer, ergonomic and human-computer interaction tests taking place in an indoor laboratory.

Specifically, this document specifies how experiments with test persons targeting audio and visual presentations should be done, considering the necessary precautions imposed by the Covid-19 pandemic. Laboratory experiments with test persons, since they involve inviting people to the lab, require particular planning and careful consideration if they are to be carried out safely given the risks imposed by the Covid-19 pandemic. The safety aspects apply to the invited test persons and are equally important for the health of the test leaders.

Publisher
p. 21
Series
RISE Rapport ; 2020:84
Keywords
Covid-19, pre-caution, test persons, experiments, perceptual, behavioural, quality of experience, user experience, video quality
National Category
Engineering and Technology
Identifiers
urn:nbn:se:ri:diva-50132 (URN)10.23699/j865-cz77 (DOI)978-91-89167-69-8 (ISBN)
Available from: 2020-10-23 Created: 2020-10-23 Last updated: 2023-05-25
Hermann, D., Djupsjöbacka, A., Andrén, B., Brunnström, K. & Rydell, N. (2017). Display panel certification system for the vehicle industry. In: Digest of Technical Papers - SID International Symposium: . Paper presented at SID Symposium, Seminar, and Exhibition 2017, Display Week 2017, 21 May 2017 through 26 May 2017 (pp. 471-474). , 48(1)
2017 (English) In: Digest of Technical Papers - SID International Symposium, 2017, Vol. 48, no 1, p. 471-474. Conference paper, Published paper (Refereed)
Abstract [en]

The ever-increasing need for displaying in-vehicle visual information in a non-distracting way requires a high visual performance of automotive displays. For their procurement, deep technical and supply-chain knowledge is required. Therefore, based on our comparisons of in-vehicle and laboratory visual-performance measurements, we propose a certification system for automotive display panels.

Keywords
Car displays, Certification, Display, Measurements, Panels, Requirements, Safety, Visual performance, Accident prevention, Display devices, Measurement, Supply chains, Automotive displays, Certification systems, Supply chain knowledge, Visual information, Vehicle performance
National Category
Natural Sciences
Identifiers
urn:nbn:se:ri:diva-35368 (URN)10.1002/sdtp.11653 (DOI)2-s2.0-85044459309 (Scopus ID)
Conference
SID Symposium, Seminar, and Exhibition 2017, Display Week 2017, 21 May 2017 through 26 May 2017
Available from: 2018-10-23 Created: 2018-10-23 Last updated: 2023-05-25. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0001-9964-7792
