Physical description |
xxiv, 736 pages : illustrations ; 25 cm. |
Series |
Adaptive and learning systems for signal processing, communications, and control. |
Notes |
"A Wiley-Interscience publication." |
Bibliography |
Includes bibliographical references (pages 723-732) and index. |
Contents |
Introduction: The Problem of Induction and Statistical Inference -- I. Theory of Learning and Generalization. 1. Two Approaches to the Learning Problem -- App. to Ch. 1. Methods for Solving Ill-Posed Problems. 2. Estimation of the Probability Measure and Problem of Learning. 3. Conditions for Consistency of Empirical Risk Minimization Principle. 4. Bounds on the Risk for Indicator Loss Functions -- App. to Ch. 4. Lower Bounds on the Risk of the ERM Principle. 5. Bounds on the Risk for Real-Valued Loss Functions. 6. The Structural Risk Minimization Principle -- App. to Ch. 6. Estimating Functions on the Basis of Indirect Measurements. 7. Stochastic Ill-Posed Problems. 8. Estimating the Values of Function at Given Points -- II. Support Vector Estimation of Functions. 9. Perceptrons and Their Generalizations. 10. The Support Vector Method for Estimating Indicator Functions. 11. The Support Vector Method for Estimating Real-Valued Functions. 12. SV Machines for Pattern Recognition. 13. SV Machines for Function Approximations, Regression Estimation, and Signal Processing -- III. Statistical Foundation of Learning Theory. 14. Necessary and Sufficient Conditions for Uniform Convergence of Frequencies to Their Probabilities. 15. Necessary and Sufficient Conditions for Uniform Convergence of Means to Their Expectations. 16. Necessary and Sufficient Conditions for Uniform One-Sided Convergence of Means to Their Expectations. |
Summary |
A comprehensive look at learning and generalization theory. The statistical theory of learning and generalization concerns the problem of choosing desired functions on the basis of empirical data. Highly applicable to a variety of computer science and robotics fields, this book offers lucid coverage of the theory as a whole. Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more. |
Subject |
Computational learning theory. |
ISBN |
0471030031 (acid-free paper) |
|