Language: English
204,95 €*
-17 % off RRP 246,09 €
Free shipping via Post / DHL
Delivery time: 2-4 working days
Categories:
Description
The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. These include:

* the setting of learning problems based on the model of minimizing the risk functional from empirical data
* a comprehensive analysis of the empirical risk minimization principle, including necessary and sufficient conditions for its consistency
* non-asymptotic bounds on the risk achieved using the empirical risk minimization principle
* principles for controlling the generalization ability of learning machines with small sample sizes, based on these bounds
* the Support Vector methods that control the generalization ability when estimating functions from small sample sizes.

The second edition of the book contains three new chapters devoted to further developments of learning theory and SVM techniques. These include:

* the theory of direct methods of learning based on solving multidimensional integral equations for density, conditional probability, and conditional density estimation
* a new inductive principle of learning.

Written in a readable and concise style, the book is intended for statisticians, mathematicians, physicists, and computer scientists. Vladimir N. Vapnik is Technology Leader at AT&T Labs-Research and Professor at London University. He is one of the founders of statistical learning theory.
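For orientation, here is a minimal sketch of the risk-minimization setting the description refers to, written in standard notation rather than quoted from this listing; the symbols (loss L, parameter α, sample size ℓ) are the conventional ones and are assumptions of this sketch.

```latex
% Standard risk-minimization setting (common notation; a sketch, not quoted from the book blurb).
% Expected risk of a function f(x, \alpha) under the unknown distribution P(x, y):
R(\alpha) = \int L\bigl(y, f(x, \alpha)\bigr)\, dP(x, y)

% Empirical risk over an observed sample (x_1, y_1), \dots, (x_\ell, y_\ell):
R_{\mathrm{emp}}(\alpha) = \frac{1}{\ell} \sum_{i=1}^{\ell} L\bigl(y_i, f(x_i, \alpha)\bigr)

% The empirical risk minimization (ERM) principle selects the parameter \alpha_\ell
% minimizing R_{\mathrm{emp}}(\alpha), since R(\alpha) cannot be computed directly.
```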
Summary
The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. Written in a readable and concise style and devoted to key learning problems, the book is intended for statisticians, mathematicians, physicists, and computer scientists.
Table of Contents
* Informal Reasoning and Comments
* Consistency of Learning Processes
* Bounds on the Rate of Convergence of Learning Processes
* Controlling the Generalization Ability of Learning Processes
* Methods of Pattern Recognition
* Methods of Function Estimation
* Direct Methods in Statistical Learning Theory
* The Vicinal Risk Minimization Principle and the SVMs
Details
Year of publication: | 2010 |
---|---|
Subject area: | Probability theory |
Genre: | Imports, Mathematics |
Category: | Science & Technology |
Medium: | Paperback |
Contents: | xx, 314 pp., 48 b/w illustrations |
ISBN-13: | 9781441931603 |
ISBN-10: | 1441931600 |
Language: | English |
Binding: | Paperback / softcover |
Author: | Vapnik, Vladimir |
Edition: | Second Edition 2000 |
Publisher: | Springer US, Springer New York, New York, N.Y. |
Responsible person for the EU: | Springer Verlag GmbH, Tiergartenstr. 17, D-69121 Heidelberg, juergen.hartmann@springer.com |
Dimensions: | 235 x 155 x 19 mm |
By: | Vladimir Vapnik |
Publication date: | 01.12.2010 |
Weight: | 0.511 kg |