DigitalS: TC Grafenau
Dietmar Jakob, Sebastian Wilhelm
Process Flow of Automated Data Collection in Private Households: An Experience Report
Data2Day 2020 - Konferenz für Big Data, Data Science und Machine Learning, Heidelberg (online)
Smart technologies generate data streams that can be used, among other things, for personalized value-added services. Developing such services and intelligent algorithms frequently requires collecting labeled raw data. Besides the technical challenges, data protection regulations and ethical questions have to be considered. In our talk we present an experience report on the collection of electricity consumption data in 20 private households, from the selection of the test households and the assurance of informed consent, through the installation of the technical components, to the anonymization and publication of the data.
DigitalF: Angewandte Informatik
Subjective and Objective Video Quality Measurement in Low-Bitrate Multimedia Scenarios
In recent years, many distribution channels for low-bitrate video transmissions have been set up. The parameter settings for the encoder, the transmission channel, the decoder, and the playback device are manifold. To maintain customer satisfaction, it is necessary to carefully select and continuously tune those parameters and to monitor the resulting video quality at the receiver. This thesis considers quality measurement both by a human observer and by an automated algorithm.
In the first part of the thesis, several subjective tests are performed in order to draw conclusions about the choice of transmission parameters. The experience gained from those experiments led to three psychophysical experiments that focus on isolated aspects of video quality in lossless or lossy low-bitrate transmissions. Three distinct algorithms dealing with temporal aspects are deduced from the subjective experiments. First, the visibility of artifacts is modeled when the viewer only has a short period of time for the examination. Second, the influence of transmission outages is modeled: the video playback may pause, and content may be skipped if retransmission is not possible. Third, the visual degradation introduced by a reduction of the frame rate is modeled.
The second part of the thesis is dedicated to objective measurement. It is assumed that the reference video sequence is available for comparison with the degraded sequence. Because the performance of the automated measurement depends strongly on the correct alignment of the degraded signal to the reference signal, various algorithms that locate the corresponding reference frame for a given degraded frame are reviewed, enhanced, and compared.
So far, many algorithms have been published that reliably predict the visual quality of still images or temporally undistorted video sequences. In this thesis, a new framework is presented that makes it possible to evaluate the performance of these algorithms for temporally distorted video transmissions. The processing steps and the signal representations follow the reasoning of a human observer in a subjective experiment, as observed in the first part of the thesis. The improvements achievable with the newly proposed framework are demonstrated by comparing the objective scores with the subjective results of the comprehensive Multimedia Phase I database of the Video Quality Experts Group.
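The frame-alignment step described in the abstract (locating, for each degraded frame, the best-matching reference frame) can be illustrated with a minimal sketch based on a sum-of-absolute-differences search. This is an illustration only, not the thesis's actual algorithms: frames are modeled as flat lists of pixel values, and real aligners additionally handle spatial shifts, gain, and offset.

```python
def align_frame(degraded_frame, reference_frames):
    """Return the index of the reference frame with the smallest
    sum of absolute differences (SAD) to the degraded frame."""
    def sad(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    return min(range(len(reference_frames)),
               key=lambda i: sad(degraded_frame, reference_frames[i]))

# A playback pause repeats a frame and a skip drops frames, so the
# sequence of aligned indices reveals the temporal distortion pattern.
ref = [[0, 0], [10, 10], [20, 20], [30, 30]]          # reference frames
deg = [[0, 1], [10, 9], [10, 11], [31, 29]]           # frame 1 repeated, frame 2 skipped
alignment = [align_frame(f, ref) for f in deg]        # -> [0, 1, 1, 3]
```

The resulting index sequence [0, 1, 1, 3] exposes both the repeated and the skipped frame, which is exactly the information an objective metric needs before comparing degraded and reference signals.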
DigitalF: Angewandte Informatik
Parity-based Error Detection with Recomputation for Fault-tolerant Spaceborne Computing
In radiation environments (e.g., space or nuclear reactors), electronics can fail due to bit flips in the flip-flops of integrated circuits. A common solution is to triplicate the flip-flops and connect their outputs to a voter. If one of the three bits is flipped, the voter outputs the majority value and thus tolerates the error. This method is called triple modular redundancy (TMR). TMR can cause about 300% area redundancy. An alternative is error detection with on-demand recomputation, where the recomputation is done by repeating the failed processing request to the processing circuit. The computation is done in consecutive transactions, which we call transaction-based processing. We implemented and evaluated this alternative approach using parity checking on the Microsemi ProASIC3 FPGA, which is often used in space applications. The results show that parity-based error detection with our system recovery approach can save up to 54% of the area overhead that TMR would cause, while achieving slightly better timing results than TMR on the ProASIC3 for most circuits. This area saving can be the key to fitting an application onto a space-constrained chip.
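The two mechanisms contrasted in this abstract, majority voting and parity-based detection with recomputation, can be sketched in software for illustration. The thesis targets FPGA hardware; all function names and the transaction interface below are hypothetical stand-ins.

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    # TMR voter: the majority of three redundant bits masks a single flip.
    return (a & b) | (a & c) | (b & c)

def parity(word: list[int]) -> int:
    # Even-parity bit computed over the bits of a word.
    p = 0
    for bit in word:
        p ^= bit
    return p

def transact(compute, inputs, max_retries=3):
    # Parity-based error detection with on-demand recomputation:
    # instead of triplicating storage, each transaction's result carries
    # a parity bit; on a mismatch the whole transaction is repeated.
    # 'compute' is a hypothetical stand-in for the processing circuit
    # and returns (data_bits, stored_parity).
    for _ in range(max_retries):
        data, stored = compute(inputs)
        if parity(data) == stored:  # no bit flip detected
            return data
    raise RuntimeError("persistent fault: recomputation exhausted")
```

The voter tolerates a single flipped copy at the cost of triplicated state, whereas the transactional variant stores only one copy plus a parity bit and pays with a retry when a flip is detected, which mirrors the area-versus-recovery trade-off the abstract evaluates.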