Language:
English
€252.50*
Free shipping via Post / DHL
Currently unavailable
Description
This book is devoted to the statistical theory of learning and generalization, that is, the problem of selecting the desired function on the basis of empirical data. The theory finds application in many different fields, such as neural networks, fuzzy-logic systems, and artificial intelligence, as well as in psychology and information science. (8/98)
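The learning problem described above — selecting a function from empirical data — can be illustrated with a minimal sketch of empirical risk minimization. This example is not from the book; the sample data and the three-function hypothesis class are invented for illustration only:

```python
# Minimal sketch of empirical risk minimization (ERM):
# from a small set of candidate functions, choose the one that
# minimizes the average squared loss on the observed sample.

data = [(0.0, 0.1), (1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # (x, y) samples

candidates = {
    "f(x) = x":     lambda x: x,
    "f(x) = 2x":    lambda x: 2 * x,
    "f(x) = x + 1": lambda x: x + 1,
}

def empirical_risk(f, sample):
    """Mean squared loss of f on the sample."""
    return sum((f(x) - y) ** 2 for x, y in sample) / len(sample)

best = min(candidates, key=lambda name: empirical_risk(candidates[name], data))
print(best)  # → f(x) = 2x, since the data lie close to y = 2x
```

With more candidate functions than data can distinguish, ERM alone overfits; controlling the capacity of the candidate set is what the book's structural risk minimization principle addresses.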
About the Author
Vladimir Naumovich Vapnik is one of the main developers of the Vapnik-Chervonenkis theory of statistical learning, and co-inventor of the support vector machine method and the support vector clustering algorithm.
Table of Contents
Partial table of contents:
THEORY OF LEARNING AND GENERALIZATION.
Two Approaches to the Learning Problem.
Estimation of the Probability Measure and Problem of Learning.
Conditions for Consistency of Empirical Risk Minimization Principle.
The Structural Risk Minimization Principle.
Stochastic Ill-Posed Problems.
SUPPORT VECTOR ESTIMATION OF FUNCTIONS.
Perceptrons and Their Generalizations.
SV Machines for Function Approximations, Regression Estimation, and Signal Processing.
STATISTICAL FOUNDATION OF LEARNING THEORY.
Necessary and Sufficient Conditions for Uniform Convergence of Frequencies to Their Probabilities.
Necessary and Sufficient Conditions for Uniform One-Sided Convergence of Means to Their Expectations.
Comments and Bibliographical Remarks.
References.
Index.
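The chapter "Perceptrons and Their Generalizations" starts from the classical perceptron, whose learning rule can be sketched in a few lines. This is an illustrative sketch only, with toy data and parameters invented here, not taken from the book:

```python
# Sketch of the classical perceptron learning rule: on each
# misclassified point, move the weight vector toward (or away
# from) that point according to its label.

def train_perceptron(samples, epochs=20):
    """samples: list of ((x1, x2), label) pairs with label in {-1, +1}."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            if label * (w[0] * x1 + w[1] * x2 + b) <= 0:  # misclassified
                w[0] += label * x1
                w[1] += label * x2
                b += label
    return w, b

# Linearly separable toy data (AND-like labels)
samples = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w, b = train_perceptron(samples)
predictions = [1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
               for (x1, x2), _ in samples]
print(predictions)  # → [-1, -1, -1, 1]
```

The support vector machine chapters generalize this idea: instead of any separating hyperplane, they seek the one with maximal margin, and kernels extend it to nonlinear decision boundaries.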
Details
| Year of publication: | 1998 |
|---|---|
| Subject area: | Probability theory |
| Genre: | Mathematics |
| Section: | Science & Technology |
| Medium: | Book |
| Pages: | 768 |
| Contents: | 768 pp. |
| ISBN-13: | 9780471030034 |
| ISBN-10: | 0471030031 |
| Language: | English |
| Binding: | Hardcover |
| Author: | Vapnik, Vladimir N |
| Publisher: | Wiley / John Wiley & Sons |
| Dimensions: | 240 x 161 x 45 mm |
| By: | Vladimir N Vapnik |
| Publication date: | 30 Sep 1998 |
| Weight: | 1.296 kg |