Models of Neural Networks I
Paperback by Eytan Domany (et al.)
Language: English

€50.95*

incl. VAT

Free shipping by post / DHL

Delivery time: 4-7 working days

Description
"One of the great intellectual challenges for the next few decades is the question of brain organization. What is the basic mechanism for storage of memory? What are the processes that serve as the interphase between the basically chemical processes of the body and the very specific and nonstatistical operations in the brain? Above all, how is concept formation achieved in the human brain? I wonder whether the spirit of the physics that will be involved in these studies will not be akin to that which moved the founders of the 'rational foundation of thermodynamics'." (C. N. Yang)

The human brain is said to have roughly 10^10 neurons connected through about 10^14 synapses. Each neuron is itself a complex device which compares and integrates incoming electrical signals and relays a nonlinear response to other neurons. The brain certainly exceeds in complexity any system which physicists have studied in the past. Nevertheless, there do exist many analogies of the brain to simpler physical systems. We have witnessed during the last decade some surprising contributions of physics to the study of the brain. The most significant parallel between biological brains and many physical systems is that both are made of many tightly interacting components.
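The Hopfield model that opens the table of contents below treats exactly such a network of interacting threshold units as a spin system with Hebbian couplings. The following Python sketch is only an illustration of that idea under assumed parameters (100 neurons, 5 random patterns, a hypothetical helper `recall`); it is not code from the book.

```python
import numpy as np

# Minimal sketch of a Hopfield-style associative memory.
# Patterns are stored with a Hebbian outer-product rule; retrieval runs
# asynchronous threshold updates. Illustrative only, not from the book.

rng = np.random.default_rng(0)
N, P = 100, 5                                # neurons, stored patterns (assumed values)
patterns = rng.choice([-1, 1], size=(P, N))  # random +/-1 patterns

# Hebbian storage: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def recall(state, sweeps=10):
    """Asynchronous deterministic dynamics: S_i <- sign(sum_j J_ij S_j)."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = J[i] @ s
            s[i] = 1 if h >= 0 else -1
    return s

# Corrupt a stored pattern by flipping 20% of its bits, then retrieve it.
noisy = patterns[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)
noisy[flip] *= -1
overlap = recall(noisy) @ patterns[0] / N
print(f"overlap with stored pattern after recall: {overlap:.2f}")
```

With only 5 patterns in 100 neurons the network is far below its storage capacity, so the overlap printed at the end should be close to 1, i.e. the corrupted input is pulled back to the stored memory.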
Table of Contents
1. Collective Phenomena in Neural Networks.- 1.1 Introduction and Overview.- 1.2 Prerequisites.- 1.3 The Hopfield Model.- 1.4 Nonlinear Neural Networks.- 1.5 Learning, Unlearning, and Forgetting.- 1.6 Hierarchically Structured Information.- 1.7 Outlook.- References.
2. Information from Structure: A Sketch of Neuroanatomy.- 2.1 Development of the Brain.- 2.2 Neuroanatomy Related to Information Handling in the Brain.- 2.3 The Idea of Electronic Circuitry.- 2.4 The Projection from the Compound Eye onto the First Ganglion (Lamina) of the Fly.- 2.5 Statistical Wiring.- 2.6 Symmetry of Neural Nets.- 2.7 The Cerebellum.- 2.8 Variations in Size of the Elements.- 2.9 The Cerebral Cortex.- 2.10 Inborn Knowledge.- References.
3. Storage Capacity and Learning in Ising-Spin Neural Networks.- 3.1 Introduction.- 3.2 Content-addressability: A Dynamics Problem.- 3.3 Learning.- 3.4 Discussion.- References.
4. Dynamics of Learning.- 4.1 Introduction.- 4.2 Definition of Supervised Learning.- 4.3 Adaline Learning.- 4.4 Perceptron Learning.- 4.5 Binary Synapses.- 4.6 Basins of Attraction.- 4.7 Forgetting.- 4.8 Outlook.- References.
5. Hierarchical Organization of Memory.- 5.1 Introduction.- 5.2 Models: The Problem.- 5.3 A Toy Problem: Patterns with Low Activity.- 5.4 Models with Hierarchically Structured Information.- 5.5 Extensions.- 5.6 The Enhancement of Storage Capacity: Multineuron Interactions.- 5.7 Conclusion.- References.
6. Asymmetrically Diluted Neural Networks.- 6.1 Introduction.- 6.2 Solvability and Retrieval Properties.- 6.3 Exact Solution with Dynamic Functionals.- 6.4 Extensions and Related Work.- Appendix A.- Appendix B.- Appendix C.- References.
7. Temporal Association.- 7.1 Introduction.- 7.2 Fast Synaptic Plasticity.- 7.3 Noise-Driven Sequences of Biased Patterns.- 7.4 Stabilizing Sequences by Delays.- 7.5 Applications: Sequence Recognition, Counting, and the Generation of Complex Sequences.- 7.6 Hebbian Learning with Delays.- 7.7 Epilogue.- References.
8. Self-organizing Maps and Adaptive Filters.- 8.1 Introduction.- 8.2 Self-organizing Maps and Optimal Representation of Data.- 8.3 Learning Dynamics in the Vicinity of a Stationary State.- 8.4 Relation to Brain Modeling.- 8.5 Formation of a "Somatotopic Map".- 8.6 Adaptive Orientation and Spatial Frequency Filters.- 8.7 Conclusion.- References.
9. Layered Neural Networks.- 9.1 Introduction.- 9.2 Dynamics of Feed-Forward Networks.- 9.3 Unsupervised Learning in Layered Networks.- 9.4 Supervised Learning in Layered Networks.- 9.5 Summary and Discussion.- References.
Elizabeth Gardner: An Appreciation.
Details
Publication year: 2012
Subject area: Theoretical Physics
Genre: Physics
Category: Science & Technology
Medium: Paperback
Pages: 380
Series: Physics of Neural Networks
Contents: xviii, 355 pages, 3 illustrations in color
ISBN-13: 9783642798160
ISBN-10: 3642798160
Language: English
Binding: Paperback (softcover)
Editors: Eytan Domany, J. Leo van Hemmen, Klaus Schulten
Edition: 2nd ed. 1995. Softcover reprint of the original 2nd ed. 1995
Publisher: Springer Berlin Heidelberg (Physics of Neural Networks)
Dimensions: 235 x 155 x 21 mm
By: Eytan Domany (et al.)
Publication date: 19 January 2012
Weight: 0.575 kg
preigu-id: 106331301