Publications


Search for "A. Christmann" returned 4 publications
    TC Grafenau

    Journal article

    A. Christmann, Robert Hable

    Estimation of Scale Functions to Model Heteroscedasticity by Kernel Based Quantile Methods

    Journal of Nonparametric Statistics, vol. 26, no. 2, pp. 219-239

    2014

    DOI: 10.1080/10485252.2013.875547

    Abstract:

    A main goal of regression is to derive statistical conclusions on the conditional distribution of the output variable Y given the input values x. Two of the most important characteristics of a distribution are location and scale. Regularised kernel methods (RKMs) – also called support vector machines in a wide sense – are well established for estimating location functions such as the conditional median or the conditional mean. We investigate the estimation of scale functions by RKMs when the conditional median is also unknown. Estimation of scale functions is important, e.g. for estimating volatility in finance. We consider the median absolute deviation (MAD) and the interquantile range as measures of scale. Our main result shows the consistency of MAD-type RKMs.
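
    A minimal sketch of the two-stage MAD-type idea the abstract describes: first estimate the conditional median with a pinball-loss regularised kernel method, then apply the same estimator to the absolute residuals. The Gaussian kernel, the subgradient-descent solver, and all tuning values (gamma, lam, lr) are illustrative assumptions, not the paper's construction.

    ```python
    import numpy as np

    def gaussian_kernel(X, Z, gamma=1.0):
        # Gram matrix of the Gaussian (RBF) kernel
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def kernel_quantile_fit(X, y, tau=0.5, lam=1e-2, gamma=1.0, lr=0.1, n_iter=2000):
        """Pinball-loss RKM via subgradient descent on the representer
        coefficients alpha, where f(x) = sum_i alpha_i k(x_i, x)."""
        K = gaussian_kernel(X, X, gamma)
        n = len(y)
        alpha = np.zeros(n)
        for _ in range(n_iter):
            r = y - K @ alpha                       # residuals y_i - f(x_i)
            g = np.where(r > 0, -tau, 1.0 - tau)    # subgradient of the pinball loss w.r.t. f(x_i)
            grad = K @ g / n + 2 * lam * K @ alpha  # empirical risk + RKHS penalty
            alpha -= lr * grad
        return alpha

    # Toy heteroscedastic data: the noise scale grows with |x|
    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, size=(200, 1))
    y = np.sin(X[:, 0]) + (0.2 + 0.5 * X[:, 0] ** 2) * rng.normal(size=200)

    # Stage 1: conditional median; Stage 2: conditional median of |residuals|
    a_med = kernel_quantile_fit(X, y, tau=0.5)
    med = gaussian_kernel(X, X) @ a_med
    a_mad = kernel_quantile_fit(X, np.abs(y - med), tau=0.5)
    scale = gaussian_kernel(X, X) @ a_mad           # MAD-type scale estimate at the sample points
    ```

    An interquantile-range variant would instead fit two quantile levels (say tau = 0.25 and 0.75) in stage 1 and take their difference.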

    TC Grafenau

    Contribution (edited volume or conference proceedings)

    A. Christmann, Robert Hable

    On the Bootstrap Approach for Support Vector Machines and Related Kernel Based Methods

    Proceedings of the 59th World Statistics Congress of the International Statistical Institute (ISI), August 25-30, 2013, Hong Kong

    2013

    TC Grafenau

    Journal article

    A. Christmann, Robert Hable

    Consistency of support vector machines using additive kernels for additive models

    Computational Statistics & Data Analysis, vol. 56, no. 4, pp. 854-873

    2012

    DOI: 10.1016/j.csda.2011.04.006

    Abstract:

    Support vector machines (SVMs) are special kernel based methods and have been among the most successful learning methods for more than a decade. SVMs can informally be described as kinds of regularized M-estimators for functions and have demonstrated their usefulness in many complicated real-life problems. During the last few years a great part of the statistical research on SVMs has concentrated on the question of how to design SVMs such that they are universally consistent and statistically robust for nonparametric classification or nonparametric regression purposes. In many applications, some qualitative prior knowledge of the distribution P or of the unknown function f to be estimated is present, or a prediction function with good interpretability is desired, such that a semiparametric model or an additive model is of interest. The question of how to design SVMs by choosing the reproducing kernel Hilbert space (RKHS) or its corresponding kernel to obtain consistent and statistically robust estimators in additive models is addressed. An explicit construction of such RKHSs and their kernels, which will be called additive kernels, is given. SVMs based on additive kernels will be called additive support vector machines. The use of such additive kernels leads, in combination with a Lipschitz continuous loss function, to SVMs with the desired properties for additive models. Examples include quantile regression based on the pinball loss function, regression based on the ε-insensitive loss function, and classification based on the hinge loss function.
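
    A brief hedged sketch of the additive-kernel construction the abstract describes: a kernel on R^d built as a sum of one-dimensional kernels, one per input coordinate, so the resulting RKHS contains additive functions. Pairing it with scikit-learn's SVR (ε-insensitive loss, precomputed Gram matrix) is an illustrative choice, not the authors' implementation; the kernel type and the values of gamma, C and epsilon are assumptions.

    ```python
    import numpy as np
    from sklearn.svm import SVR

    def additive_rbf_gram(X, Z, gamma=1.0):
        # k(x, z) = sum_j exp(-gamma * (x_j - z_j)^2): one 1-D Gaussian kernel per coordinate
        G = np.zeros((X.shape[0], Z.shape[0]))
        for j in range(X.shape[1]):
            G += np.exp(-gamma * (X[:, j, None] - Z[None, :, j]) ** 2)
        return G

    # Toy data with an additive ground truth
    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, size=(300, 2))
    y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=300)

    # Epsilon-insensitive regression on the precomputed additive Gram matrix
    svr = SVR(kernel="precomputed", C=10.0, epsilon=0.05)
    svr.fit(additive_rbf_gram(X, X), y)
    y_hat = svr.predict(additive_rbf_gram(X, X))  # in-sample predictions
    ```

    Because the Gram matrix is a sum of coordinate-wise kernels, the fitted prediction function is itself a sum of univariate functions, which is what makes it interpretable in the additive-model sense.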

    TC Grafenau

    Journal article

    A. Christmann, Robert Hable

    On Qualitative Robustness of Support Vector Machines

    Journal of Multivariate Analysis, vol. 102, no. 6, pp. 993-1007

    2011

    DOI: 10.1016/j.jmva.2011.01.009

    Abstract:

    Support vector machines (SVMs) have attracted much attention in theoretical and in applied statistics. The main topics of recent interest are consistency, learning rates and robustness. We address the open problem of whether SVMs are qualitatively robust. Our results show that SVMs are qualitatively robust for any fixed regularization parameter λ. However, under extremely mild conditions on the SVM, it turns out that SVMs are no longer qualitatively robust for any null sequence λ_n, the classical choice of sequence needed to obtain universal consistency. This lack of qualitative robustness is of a rather theoretical nature because we show that, in any case, SVMs fulfil a finite sample qualitative robustness property. For a fixed regularization parameter, SVMs can be represented by a functional on the set of all probability measures. Qualitative robustness is proven by showing that this functional is continuous with respect to the topology generated by weak convergence of probability measures. Combined with the existence and uniqueness of SVMs, our results show that SVMs are the solutions of a well-posed mathematical problem in Hadamard's sense.
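
    An illustrative numeric sketch, not the paper's formal argument: qualitative robustness concerns stability of the estimator under small perturbations of the underlying distribution. The toy comparison below only mirrors the direction of the result, contrasting a fixed regularization parameter with a near-zero one, and uses kernel ridge regression as a stand-in regularised kernel method; all data and parameter values are assumptions.

    ```python
    import numpy as np
    from sklearn.kernel_ridge import KernelRidge

    rng = np.random.default_rng(2)
    X = rng.uniform(-2, 2, size=(200, 1))
    y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=200)

    y_bad = y.copy()
    y_bad[:5] += 10.0                   # contaminate a small fraction of the outputs

    grid = np.linspace(-2, 2, 100)[:, None]
    for lam in (1.0, 1e-6):             # fixed vs. nearly vanishing regularization
        f = KernelRidge(alpha=lam, kernel="rbf", gamma=1.0)
        f_clean = f.fit(X, y).predict(grid)
        f_dirty = f.fit(X, y_bad).predict(grid)
        print(f"lambda={lam:g}: max |shift| = {np.abs(f_clean - f_dirty).max():.3f}")
    ```

    With the fixed parameter the fitted function barely moves under the contamination, while the near-zero parameter lets the few perturbed points shift the fit substantially, matching the tension between robustness and consistency-oriented null sequences that the abstract highlights.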