Hyperparameter Optimization in Machine Learning
Make Your Machine Learning and Deep Learning Models More Efficient
Paperback by Tanay Agrawal
Language: English

47,90 €*

incl. VAT

Free shipping via Post / DHL

Delivery time: 4-7 working days

Categories:
Description
Dive into hyperparameter tuning of machine learning models and focus on what hyperparameters are and how they work. This book discusses different techniques of hyperparameter tuning, from the basics to advanced methods.

This is a step-by-step guide to hyperparameter optimization, starting with what hyperparameters are and how they affect different aspects of machine learning models. It then works through some basic (brute-force) algorithms of hyperparameter optimization. Further, the author addresses the problem of time and memory constraints, using distributed optimization methods. Next, you'll explore Bayesian optimization for hyperparameter search, a method that learns from its previous history.
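The brute-force approach mentioned above can be sketched with scikit-learn's `GridSearchCV`, which simply trains and cross-validates every combination in the search space (the dataset, model, and grid values here are illustrative, not taken from the book):

```python
# Minimal brute-force (grid search) sketch: every hyperparameter
# combination is trained and scored with cross-validation.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# The search space: 3 x 3 = 9 combinations, each cross-validated.
param_grid = {
    "n_estimators": [10, 50, 100],
    "max_depth": [2, 4, None],
}

search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

The cost grows multiplicatively with each added hyperparameter, which is exactly the time/memory problem that motivates the distributed and Bayesian methods covered later.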

The book discusses different frameworks, such as Hyperopt and Optuna, which implement sequential model-based global optimization (SMBO) algorithms. During these discussions, you'll focus on different aspects, such as the creation of search spaces and distributed optimization with these libraries.

Hyperparameter Optimization in Machine Learning creates an understanding of how these algorithms work and how you can use them in real-life data science problems. The final chapter summarizes the role of hyperparameter optimization in automated machine learning and ends with a tutorial to create your own AutoML script.

Hyperparameter optimization is a tedious task, so sit back and let these algorithms do your work.
What You Will Learn
Discover how changes in hyperparameters affect the model's performance
Apply different hyperparameter tuning algorithms to data science problems
Work with Bayesian optimization methods to create efficient machine learning and deep learning models
Distribute hyperparameter optimization using a cluster of machines
Approach automated machine learning using hyperparameter optimization
Who This Book Is For

Professionals and students working with machine learning.
About the Author

Tanay is a deep learning engineer and researcher who graduated in 2019 with a Bachelor of Technology degree from SMVDU, J&K. He is currently working at Curl Hg on SARA, an OCR platform. He is also an advisor to Witooth Dental Services and Technologies. He started his career at MateLabs, working on Mateverse, an AutoML platform. He has worked extensively on hyperparameter optimization and has delivered talks on the topic at conferences including PyData Delhi and PyCon India.

Summary

Covers state-of-the-art techniques for hyperparameter tuning

Covers implementation of advanced Bayesian optimization techniques, from machine learning algorithms to complex deep learning frameworks

Explains distributed optimization of hyperparameters, which significantly improves the time efficiency of tuning

Table of Contents
Chapter 1: Hyperparameters
Chapter 2: Brute Force Hyperparameter Tuning
Chapter 3: Distributed Hyperparameter Optimization
Chapter 4: Sequential Model-Based Global Optimization and Its Hierarchical
Chapter 5: Using HyperOpt
Chapter 6: Hyperparameter Generating Condition Generative Adversarial Neural
Details
Year of publication: 2020
Genre: Computer Science
Category: Science & Technology
Format: Paperback
Pages: 188
Contents: xix, 166 pp., 53 illustrations (49 b/w, 4 in color)
ISBN-13: 9781484265789
ISBN-10: 1484265785
Language: English
Finish / supplement: Paperback
Binding: Softcover
Author: Agrawal, Tanay
Edition: 1st ed.
Publisher: Apress (Apress L.P.)
Dimensions: 235 x 155 x 11 mm
By: Tanay Agrawal
Publication date: 29.11.2020
Weight: 0.295 kg
preigu-id: 118959090