Search for "[U.] [Frese]" returned 2 publications
    Digital | F: Angewandte Informatik | S: TC Plattling MoMo

    Contribution (edited volume or conference proceedings)

    R. Wagner, U. Frese, B. Bäuml

    Unified treatment of sparse and dense data in graph-based least squares

    Proceedings of the 2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids 2016) [November 15-17, 2016; Cancún, Mexico]


    DOI: 10.1109/HUMANOIDS.2016.7803393

    Abstract

    In this paper, we present a novel method of incorporating dense (e.g., depth, RGB-D) data in a general-purpose least-squares graph optimization framework. Rather than employing a loosely coupled, layered design where dense data is first used to estimate a compact SE(3) transform which then forms a link in the optimization graph as in previous approaches [28, 10, 26], we use a tightly coupled approach that jointly optimizes over each individual (i.e., per-pixel) dense measurement (on the GPU) and all other traditional sparse measurements (on the CPU). Concretely, we use Kinect depth data and KinectFusion-style point-to-plane ICP measurements. In particular, this allows our approach to handle cases where neither dense nor sparse measurements separately define all degrees of freedom (DoF), while taken together they complement each other and yield the overall maximum-likelihood solution. Nowadays it is common practice to flexibly model various sensors, measurements, and to-be-estimated variables in least-squares frameworks. Our intention is to extend this flexibility to applications with dense data. Computationally, the key is to combine the many dense measurements on the GPU efficiently and communicate only the results to the sparse framework on the CPU in a way that is mathematically equivalent to the full least-squares system. This results in < 20 ms for a full optimization run. We evaluate our approach on a humanoid robot: in a first experiment we fuse Kinect data and odometry in a laboratory setting, and in a second experiment we fuse with an unusual “sensor”: using the embodiedness of the robot, we estimate elasticities in the kinematic chain, modeled as unknown, time-varying joint offsets, while it moves its arms in front of a tabletop manipulation workspace. In both experiments, only tightly coupled optimization localizes the robot correctly.
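The key computational claim of the abstract — condensing many dense measurements into a compact contribution that is mathematically equivalent to the full least-squares system — can be illustrated with a toy numpy sketch. This is not the paper's GPU implementation; all dimensions, data, and variable names are invented for illustration. The point is only that summing the normal-equation blocks of the dense and sparse measurements gives exactly the same system as stacking all residuals at once:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy state: 3 parameters (think of a small pose vector; purely illustrative).
n = 3

# Many "dense" measurements: stacked per-pixel linearized residual rows.
J_dense = rng.normal(size=(10000, n))
z_dense = rng.normal(size=10000)

# A few "sparse" measurements (e.g. odometry links).
J_sparse = rng.normal(size=(5, n))
z_sparse = rng.normal(size=5)

# Route 1: full least-squares normal equations over ALL measurements at once.
J_full = np.vstack([J_dense, J_sparse])
z_full = np.concatenate([z_dense, z_sparse])
H_full = J_full.T @ J_full
b_full = J_full.T @ z_full

# Route 2: the dense block is condensed to (H, b) first (the part a GPU
# would do), then added to the sparse contributions on the "CPU side".
H_red = J_dense.T @ J_dense + J_sparse.T @ J_sparse
b_red = J_dense.T @ z_dense + J_sparse.T @ z_sparse

# Both routes yield identical normal equations, hence the same solution.
assert np.allclose(H_full, H_red) and np.allclose(b_full, b_red)
x = np.linalg.solve(H_red, b_red)
```

Only the small n×n matrix `H_red` and vector `b_red` need to cross the GPU/CPU boundary, which is what makes the tight coupling cheap regardless of how many per-pixel measurements there are.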

    Digital | F: Angewandte Informatik | S: TC Plattling MoMo


    O. Birbach, U. Frese, B. Bäuml

    Rapid calibration of a multi-sensorial humanoid’s upper body: An automatic and self-contained approach

    The International Journal of Robotics Research, vol. 34, no. 4-5, pp. 420-436


    DOI: 10.1177/0278364914548201

    Abstract

    This paper addresses the problem of calibrating a pair of cameras, a Microsoft Kinect sensor and an inertial measurement unit (IMU) mounted at the head of a humanoid robot with respect to its kinematic chain. As complex manipulation tasks require an accurate interplay of all involved sensors, the quality of calibration is crucial for the outcome of the intended tasks. Typical calibration procedures are often time-consuming, involve multiple people overseeing a series of subsequent calibration steps, and require external tools. We therefore propose to auto-calibrate all sensors in a single, completely automatic and self-contained procedure, i.e. without a calibration plate. By automatically detecting a single point feature on each wrist while moving the robot’s head, the intrinsic and extrinsic parameters of the stereo cameras and the Kinect’s infrared camera, as well as the extrinsic parameters of the IMU, are calibrated while considering the arm joint elasticities and joint angle offsets. All parameters are obtained by formulating the calibration problem as a single least-squares batch-optimization problem. The procedure is integrated on DLR’s humanoid robot Agile Justin, allowing an accurate calibration to be obtained in around 5 minutes by simply “pushing a button”. The proposed approach is experimentally validated by means of standard metrics of the calibration errors.
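The core idea of formulating calibration as a single least-squares batch-optimization problem can be sketched with a deliberately tiny example: recovering an unknown joint-angle offset of a single planar link from noisy observations of a wrist point. This is a hedged illustration, not the paper's method — the real system jointly estimates camera intrinsics/extrinsics, IMU extrinsics, elasticities, and offsets; here everything (link length, angles, noise level) is made up, and a plain Gauss-Newton loop stands in for the full batch optimizer:

```python
import numpy as np

rng = np.random.default_rng(1)

true_offset = 0.03          # rad; the unknown the calibrator must recover
arm_len = 0.5               # link length (assumed known here)
thetas = np.linspace(0.0, 1.5, 40)  # commanded joint angles over the batch

# Observed wrist positions, generated with the true offset plus sensor noise.
obs = np.stack([arm_len * np.cos(thetas + true_offset),
                arm_len * np.sin(thetas + true_offset)], axis=1)
obs += rng.normal(scale=1e-4, size=obs.shape)

# Gauss-Newton on the batch residual r(offset) = obs - predicted(offset).
offset = 0.0
for _ in range(10):
    pred = np.stack([arm_len * np.cos(thetas + offset),
                     arm_len * np.sin(thetas + offset)], axis=1)
    r = (obs - pred).ravel()
    # d(pred)/d(offset); the residual Jacobian is its negative.
    dpred = np.stack([-arm_len * np.sin(thetas + offset),
                      arm_len * np.cos(thetas + offset)], axis=1)
    J = -dpred.ravel()
    offset += -(J @ r) / (J @ J)   # scalar Gauss-Newton step

# offset now matches true_offset up to the noise level
```

All observations enter one joint cost function, so a single batch solve recovers the parameter — the same principle the paper applies to the full multi-sensor calibration problem.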