Language: English
39,55 €
Free shipping via Post / DHL
In stock, delivery time 1-2 working days
Description
"Teaches a range of machine learning methods, from simple to complex. Includes dozens of illustrative examples using the R programming language and real datasets. Covers not only how to use machine learning methods but also why these methods work and advice on how to avoid common pitfalls"--
"Teaches a range of machine learning methods, from simple to complex. Includes dozens of illustrative examples using the R programming language and real datasets. Covers not only how to use machine learning methods but also why these methods work and advice on how to avoid common pitfalls"--
About the Author
Norman Matloff is an award-winning professor at the University of California, Davis. Matloff has a PhD in mathematics from UCLA and is the author of The Art of Debugging with GDB, DDD, and Eclipse and The Art of R Programming (both from No Starch Press).
Table of Contents
Acknowledgments
Introduction
PART I: PROLOGUE, AND NEIGHBORHOOD-BASED METHODS
Chapter 1: Regression Models
Chapter 2: Classification Models
Chapter 3: Bias, Variance, Overfitting, and Cross-Validation
Chapter 4: Dealing with Large Numbers of Features
PART II: TREE-BASED METHODS
Chapter 5: A Step Beyond k-NN: Decision Trees
Chapter 6: Tweaking the Trees
Chapter 7: Finding a Good Set of Hyperparameters
PART III: METHODS BASED ON LINEAR RELATIONSHIPS
Chapter 8: Parametric Methods
Chapter 9: Cutting Things Down to Size: Regularization
PART IV: METHODS BASED ON SEPARATING LINES AND PLANES
Chapter 10: A Boundary Approach: Support Vector Machines
Chapter 11: Linear Models on Steroids: Neural Networks
PART V: APPLICATIONS
Chapter 12: Image Classification
Chapter 13: Handling Time Series and Text Data
Appendix A: List of Acronyms and Symbols
Appendix B: Statistics and ML Terminology Correspondence
Appendix C: Matrices, Data Frames, and Factor Conversions
Appendix D: Pitfall: Beware of “p-Hacking”!
Details
Year of publication: 2024
Genre: Computer Science
Category: Science & Technology
Medium: Paperback
ISBN-13: 9781718502109
ISBN-10: 1718502109
Language: English
Binding: Paperback / softcover
Author: Matloff, Norman
Publisher: Random House LLC US / No Starch Press
Dimensions: 235 x 180 x 20 mm
By: Norman Matloff
Publication date: January 9, 2024
Weight: 0.534 kg