
ISBN 978-3-8439-0249-6

72.00 € incl. VAT, plus shipping


978-3-8439-0249-6, Informatik series

Erik Rodner
Learning from Few Examples for Visual Recognition Problems

266 pages, dissertation, Friedrich-Schiller-Universität Jena (2011), softcover, A5

Abstract

The lack of training data is one of the core problems and limiting factors in many industrial and high-level vision tasks. Despite the human ability to generalize quickly from often just a single example of an object category, current computer vision systems require a large number of examples to learn from. This work addresses the problem with two paradigms: transfer learning and one-class classification.

The idea of transfer learning is that prior information from related tasks can be used to improve learning of a new task with few training examples. In this work, several transfer learning approaches are presented, which concentrate on transferring different types of knowledge from related tasks. The presented regularized tree method extends decision tree methods by incorporating prior knowledge from previously built trees of other tasks. Another proposed algorithm transfers feature relevance information to guide the search for suitable base classifiers during learning of a random decision forest.
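As a rough illustration of what transferring feature relevance to a random decision forest can look like, the sketch below biases the random feature choice at each split toward features that were relevant in related tasks. It is a minimal sketch under assumed names and weights (relevance_prior, the 50/50 mixing with a uniform distribution), not the dissertation's actual algorithm.

import numpy as np

# Hypothetical sketch: bias the candidate-feature sampling of a forest split
# using a per-feature relevance prior estimated from related tasks.
def sample_candidate_features(n_features, n_candidates, relevance_prior, rng):
    # Mix the transferred relevance with a uniform distribution so that
    # features unused in the related tasks can still be explored.
    prior = relevance_prior / relevance_prior.sum()
    uniform = np.full(n_features, 1.0 / n_features)
    probs = 0.5 * prior + 0.5 * uniform
    return rng.choice(n_features, size=n_candidates, replace=False, p=probs)

rng = np.random.default_rng(0)
relevance_prior = np.array([0.05, 0.40, 0.30, 0.05, 0.20])  # assumed weights
print(sample_candidate_features(5, 2, relevance_prior, rng))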

In addition, a third approach utilizes dependent Gaussian processes and allows for non-parametric transfer learning. Furthermore, a technique is presented that automatically selects a related task from a large set of tasks and estimates the required degree of transfer. The method is able to adapt to heterogeneous environments and is based on efficient leave-one-out estimates and semantic similarities.
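A common way to couple tasks with dependent Gaussian processes is a product kernel in which a task correlation term scales an input kernel. The sketch below follows that general pattern with a fixed, assumed transfer strength rho_related; the dissertation, by contrast, estimates the degree of transfer automatically.

import numpy as np

# Sketch of GP transfer with a product kernel
# k((x, s), (x', s')) = rho(s, s') * k_x(x, x'),
# where rho is 1 within a task and rho_related across tasks (assumed value).
def rbf(a, b, length_scale=1.0):
    d = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-0.5 * d / length_scale**2)

def gp_transfer_predict(X_rel, y_rel, X_tgt, y_tgt, X_test, rho_related=0.7, noise=1e-2):
    X = np.vstack([X_rel, X_tgt])
    y = np.concatenate([y_rel, y_tgt])
    tasks = np.concatenate([np.zeros(len(X_rel)), np.ones(len(X_tgt))])
    rho = np.where(tasks[:, None] == tasks[None, :], 1.0, rho_related)
    K = rho * rbf(X, X) + noise * np.eye(len(X))
    # Test points belong to the target task (task id 1).
    rho_star = np.where(tasks == 1.0, 1.0, rho_related)
    K_star = rho_star[None, :] * rbf(X_test, X)
    return K_star @ np.linalg.solve(K, y)

rng = np.random.default_rng(0)
X_rel = rng.uniform(-3, 3, (30, 1)); y_rel = np.sin(X_rel[:, 0])   # related task
X_tgt = rng.uniform(-3, 3, (3, 1));  y_tgt = np.sin(X_tgt[:, 0]) + 0.1  # few target examples
X_test = np.linspace(-3, 3, 5)[:, None]
print(gp_transfer_predict(X_rel, y_rel, X_tgt, y_tgt, X_test))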

All methods are analyzed quantitatively in different scenarios (binary and multi-class transfer learning) and are applied to image categorization. Significant performance benefits are shown in comparison with current transfer learning methods and with state-of-the-art classifiers that do not exploit knowledge transfer.

Another very difficult problem occurs when training data is available only for a single class. To solve this problem, new efficient one-class classification methods are presented, which are directly derived from the Gaussian process framework and allow flexible learning with kernels. The suitability of the proposed algorithms for a wide range of applications is demonstrated for image categorization and action detection tasks, as well as the demanding application of defect localization in wire ropes.
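One standard way to obtain a one-class classifier directly from the Gaussian process framework is to run GP regression with constant targets y = 1 on the examples of the known class and use the predictive mean at a test point as a membership score (the predictive variance can serve as an alternative novelty measure). The sketch below shows this idea; the kernel and noise settings are illustrative assumptions, not the thesis configuration.

import numpy as np

# One-class scoring via GP regression on constant targets.
def rbf(a, b, length_scale=1.0):
    d = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-0.5 * d / length_scale**2)

def gp_one_class_score(X_train, X_test, noise=1e-2, length_scale=1.0):
    K = rbf(X_train, X_train, length_scale) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, np.ones(len(X_train)))      # targets are all 1
    return rbf(X_test, X_train, length_scale) @ alpha      # high score = inlier

rng = np.random.default_rng(0)
X_pos = rng.normal(0.0, 1.0, (50, 2))          # examples of the known class
X_query = np.array([[0.0, 0.0], [6.0, 6.0]])   # inlier-like vs. outlier-like point
print(gp_one_class_score(X_pos, X_query))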

The experimental results show that the proposed methods are able to outperform other one-class classification methods.