Publications


The search for "B. Eskofier" found 4 publications
    DigitalF: Applied Informatics

    Contribution (edited volume or conference proceedings)

    J. Paulus, G. Michelson, Marcus Barkowsky, J. Hornegger, B. Eskofier, M. Schmidt

    Measurement of Individual Changes in the Performance of Human Stereoscopic Vision for Disparities at the Limits of the Zone of Comfortable Viewing

    2013 International Conference on 3D Vision (3DV 2013)

    2013

    Abstract

    3D displays enable immersive visual impressions, but their impact on human perception is still not fully understood. Viewing conditions such as the convergence-accommodation (C-A) conflict exert an unnatural influence on the visual system and may even lead to visual discomfort. As visual perception is individual, we assumed that the impact of simulated 3D content on the visual system is as well. In this study we analyzed the stereoscopic visual performance of 17 subjects for disparities inside and outside the zone of comfortable viewing defined in the literature, in order to provide an individual evaluation of the impact of increased disparities on the performance of the visual system. Stereoscopic stimuli were presented at different disparities in a four-alternative forced choice (4AFC) setup. The response times as well as the correct-decision rates indicated the performance of stereoscopic vision. The results showed that increased disparities lead to a decline in performance. Furthermore, the impact of the presented disparities depends on the difficulty of the task. Both the decline in performance and the disparities at which the decline set in were subject dependent.
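The abstract above aggregates 4AFC trials into per-disparity correct-decision rates and mean response times. A minimal illustrative sketch of that aggregation step (trial records and field names are hypothetical, not taken from the paper):

```python
from collections import defaultdict

def summarize_4afc(trials):
    """Group 4AFC trials by disparity level and compute, per level,
    the correct-decision rate and the mean response time.
    Each trial is a dict with keys: disparity, correct (bool), rt (seconds)."""
    groups = defaultdict(list)
    for t in trials:
        groups[t["disparity"]].append(t)
    summary = {}
    for disparity, ts in groups.items():
        n = len(ts)
        summary[disparity] = {
            "correct_rate": sum(t["correct"] for t in ts) / n,
            "mean_rt": sum(t["rt"] for t in ts) / n,
        }
    return summary

# Hypothetical trial data for two disparity levels.
trials = [
    {"disparity": 0.5, "correct": True,  "rt": 0.8},
    {"disparity": 0.5, "correct": True,  "rt": 1.0},
    {"disparity": 2.0, "correct": False, "rt": 1.6},
    {"disparity": 2.0, "correct": True,  "rt": 1.4},
]
summary = summarize_4afc(trials)
```

The paper's finding corresponds to the correct rate dropping and response times rising as the disparity key grows past a subject-dependent threshold.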

    DigitalF: Applied Informatics

    Journal article

    Marcus Barkowsky, J. Bialkowski, B. Eskofier, R. Bitto, A. Kaup

    Temporal Trajectory Aware Video Quality Measure

    IEEE Journal of Selected Topics in Signal Processing, vol. 3, no. 2, pp. 266-279

    2009

    Abstract

    The measurement of video quality for lossy and low-bitrate network transmissions is a challenging topic. In particular, the temporal artifacts introduced by video transmission systems, and their effects on the viewer's satisfaction, have to be addressed. This paper focuses on a framework that adds temporal distortion awareness to typical video quality measurement algorithms. Motion estimation is used to track image areas over time. Based on the motion vectors and the motion prediction error, the appearance of new image areas and the display time of objects are evaluated. Additionally, degradations that stick to moving objects can be judged more accurately. An implementation of this framework for multimedia sequences, e.g., in QCIF, CIF, or VGA resolution, is presented in detail. It shows that the processing steps and the signal representations generated by the algorithm follow the reasoning of a human observer in a subjective experiment. The improvements that can be achieved with the newly proposed algorithm are demonstrated using the results of the Multimedia Phase I database of the Video Quality Experts Group.
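The core building block the abstract relies on, tracking image areas via motion vectors and a motion prediction error, can be sketched with exhaustive block matching. This is a generic textbook version, not the paper's implementation; block size, search range, and the SSD criterion are assumptions:

```python
import numpy as np

def block_motion(prev, curr, block=4, search=2):
    """Exhaustive-search block matching: for each block of `curr`, find the
    displacement into `prev` minimizing the sum of squared differences (SSD).
    Returns per-block motion vectors (dy, dx) and prediction errors."""
    h, w = curr.shape
    vectors, errors = {}, {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = curr[by:by + block, bx:bx + block].astype(float)
            best, best_err = (0, 0), float("inf")
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue  # candidate block leaves the frame
                    cand = prev[y:y + block, x:x + block].astype(float)
                    err = float(((target - cand) ** 2).sum())
                    if err < best_err:
                        best_err, best = err, (dy, dx)
            vectors[(by, bx)] = best
            errors[(by, bx)] = best_err
    return vectors, errors

# Shift a synthetic frame one pixel to the right and recover the motion.
prev = np.arange(64, dtype=float).reshape(8, 8)
curr = np.roll(prev, 1, axis=1)
vectors, errors = block_motion(prev, curr)
```

In the paper's framework, a large prediction error signals newly appearing image areas, while stable vectors let degradations be tracked along with the objects they stick to.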

    DigitalF: Applied Informatics

    Contribution (edited volume or conference proceedings)

    Marcus Barkowsky, B. Eskofier, R. Bitto, J. Bialkowski, A. Kaup

    Perceptually motivated spatial and temporal integration of pixel based video quality measures

    Mobile Content Quality of Experience 2007 (MobConQoE '07): Fourth International Conference on Heterogeneous Networking for Quality, Reliability, Security and Robustness

    2007

    Abstract

    In the evaluation of video quality, a full-reference approach is often used: some measure of difference between the reference frames and the distorted frames is calculated. This measure often returns one value per pixel, in the simplest case the squared difference. Conventionally, this pixel-based measure is averaged over space and time. This paper introduces a psychophysically derived algorithm for this step. It uses the distribution of the cells in the fovea and the assumption that in a subjective test the part with the highest distortion is most important. Additionally, a temporal integration step is proposed that models the recency and forgiveness effects. Different video quality measures are enhanced with these two steps, and their performance is evaluated using the results of a subjective test.
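The two pooling steps the abstract describes, spatial pooling that emphasizes the most distorted region and temporal integration with a recency effect, can be sketched as follows. This is an illustrative simplification under assumed parameters (worst-fraction pooling and exponential recency weights), not the paper's fovea-based model:

```python
import numpy as np

def spatial_pool(err_map, worst_fraction=0.1):
    """Pool a per-pixel distortion map by averaging only the worst
    `worst_fraction` of pixel errors, reflecting the assumption that
    the most distorted region dominates the subjective judgment."""
    flat = np.sort(err_map.ravel())[::-1]  # descending
    k = max(1, int(len(flat) * worst_fraction))
    return float(flat[:k].mean())

def temporal_pool(frame_scores, recency=0.9):
    """Combine per-frame scores with exponentially larger weights for
    recent frames, a simple stand-in for the recency effect."""
    n = len(frame_scores)
    weights = np.array([recency ** (n - 1 - i) for i in range(n)])
    return float(np.dot(frame_scores, weights) / weights.sum())

# A frame with one small but severe artifact scores much worse than
# plain averaging would suggest.
err_map = np.zeros((10, 10))
err_map[0, 0] = 100.0
```

A "forgiveness effect" would additionally let the weight of old high-distortion frames decay over time; the exponential weighting above captures only the recency side.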

    DigitalF: Applied Informatics

    Contribution (edited volume or conference proceedings)

    Marcus Barkowsky, B. Eskofier, J. Bialkowski, A. Kaup

    Influence of the Presentation Time on Subjective Votings of Coded Still Images

    Proceedings of the International Conference on Image Processing

    2006

    Abstract

    The quality of coded images is often assessed by a subjective test. Usually the viewers get as much time as they need to reach a stable result. In video sequences, however, the viewer has to judge the quality in a shorter time that is defined by the changing content or a following scene cut. It is therefore desirable to know the influence of a shorter presentation time on the perceptibility of distortions. In this paper we present the results of a suitable subjective test on coded still images. The images were presented for six different durations, ranging from 200 ms to 3 s. Special care was taken to avoid the memorization effect usually present after short presentations. The results show that the viewers tend to avoid extreme votings at short durations. The variance of the votings is also discussed in detail. Based on the result of the voting for the longest presentation time, we propose a prediction model for the votings at the shorter durations using a logistic curve fit. This presentation time model (PTM) is presented and analysed in detail.
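The shape of such a presentation time model can be sketched with a logistic function: short presentations pull votes toward the scale midpoint (viewers avoid extreme votings), and the long-duration vote is recovered as the duration grows. All parameter values here are hypothetical; the paper fits its logistic curve to measured data:

```python
import math

def ptm_predict(vote_long, duration_s, midpoint=5.0, k=2.0, t0=0.5):
    """Illustrative presentation-time model (not the paper's fitted one):
    predict the vote at a shorter presentation time from the vote given
    unlimited viewing time (`vote_long`, on a scale with the given midpoint).
    The fraction of the long-duration vote that survives follows a logistic
    function of the presentation time, with assumed steepness k and center t0."""
    alpha = 1.0 / (1.0 + math.exp(-k * (duration_s - t0)))  # in (0, 1)
    return midpoint + alpha * (vote_long - midpoint)
```

For example, a vote of 9 at unlimited time is predicted to shrink toward the midpoint of 5 at a 200 ms presentation, while at 3 s it stays close to 9, matching the observed avoidance of extreme votings at short durations.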