€72.95
incl. VAT
Available immediately via download
- Format: PDF

This book presents a general theory of neural nets, starting from general laws, models, and biology. With numerous examples, illustrations, and references, it is suitable for readers who seek an overview of the field and as a basis for university courses.
- Devices: PC
- No copy protection
- Size: 46.89 MB
Other customers were also interested in
- Fuzzy Logic and Applications (eBook, PDF), €40.95
- Ke-Lin Du: Neural Networks in a Softcomputing Framework (eBook, PDF), €40.95
- Zhong Li: Fuzzy Chaotic Systems (eBook, PDF), €112.95
- Colin Fyfe: Hebbian Learning and Negative Feedback Networks (eBook, PDF), €112.95
- ICANN '94 (eBook, PDF), €40.95
- From Statistics to Neural Networks (eBook, PDF), €72.95
- Artificial Neural Networks - ICANN 2007 (eBook, PDF), €72.95
For legal reasons, this download can only be delivered to billing addresses in A, B, BG, CY, CZ, D, DK, EW, E, FIN, F, GR, HR, H, IRL, I, LT, L, LR, M, NL, PL, P, R, S, SLO, SK.
Product details
- Publisher: Springer Berlin Heidelberg
- Number of pages: 502
- Publication date: June 29, 2013
- Language: English
- ISBN-13: 9783642610684
- Item no.: 53396940
1. The Biological Paradigm.- 1.1 Neural computation.- 1.2 Networks of neurons.- 1.3 Artificial neural networks.- 1.4 Historical and bibliographical remarks.
2. Threshold Logic.- 2.1 Networks of functions.- 2.2 Synthesis of Boolean functions.- 2.3 Equivalent networks.- 2.4 Recurrent networks.- 2.5 Harmonic analysis of logical functions.- 2.6 Historical and bibliographical remarks.
3. Weighted Networks - The Perceptron.- 3.1 Perceptrons and parallel processing.- 3.2 Implementation of logical functions.- 3.3 Linearly separable functions.- 3.4 Applications and biological analogy.- 3.5 Historical and bibliographical remarks.
4. Perceptron Learning.- 4.1 Learning algorithms for neural networks.- 4.2 Algorithmic learning.- 4.3 Linear programming.- 4.4 Historical and bibliographical remarks.
5. Unsupervised Learning and Clustering Algorithms.- 5.1 Competitive learning.- 5.2 Convergence analysis.- 5.3 Principal component analysis.- 5.4 Some applications.- 5.5 Historical and bibliographical remarks.
6. One and Two Layered Networks.- 6.1 Structure and geometric visualization.- 6.2 Counting regions in input and weight space.- 6.3 Regions for two layered networks.- 6.4 Historical and bibliographical remarks.
7. The Backpropagation Algorithm.- 7.1 Learning as gradient descent.- 7.2 General feed-forward networks.- 7.3 The case of layered networks.- 7.4 Recurrent networks.- 7.5 Historical and bibliographical remarks.
8. Fast Learning Algorithms.- 8.1 Introduction - classical backpropagation.- 8.2 Some simple improvements to backpropagation.- 8.3 Adaptive step algorithms.- 8.4 Second-order algorithms.- 8.5 Relaxation methods.- 8.6 Historical and bibliographical remarks.
9. Statistics and Neural Networks.- 9.1 Linear and nonlinear regression.- 9.2 Multiple regression.- 9.3 Classification networks.- 9.4 Historical and bibliographical remarks.
10. The Complexity of Learning.- 10.1 Network functions.- 10.2 Function approximation.- 10.3 Complexity of learning problems.- 10.4 Historical and bibliographical remarks.
11. Fuzzy Logic.- 11.1 Fuzzy sets and fuzzy logic.- 11.2 Fuzzy inferences.- 11.3 Control with fuzzy logic.- 11.4 Historical and bibliographical remarks.
12. Associative Networks.- 12.1 Associative pattern recognition.- 12.2 Associative learning.- 12.3 The capacity problem.- 12.4 The pseudoinverse.- 12.5 Historical and bibliographical remarks.
13. The Hopfield Model.- 13.1 Synchronous and asynchronous networks.- 13.2 Definition of Hopfield networks.- 13.3 Convergence to stable states.- 13.4 Equivalence of Hopfield and perceptron learning.- 13.5 Parallel combinatorics.- 13.6 Implementation of Hopfield networks.- 13.7 Historical and bibliographical remarks.
14. Stochastic Networks.- 14.1 Variations of the Hopfield model.- 14.2 Stochastic systems.- 14.3 Learning algorithms and applications.- 14.4 Historical and bibliographical remarks.
15. Kohonen Networks.- 15.1 Self-organization.- 15.2 Kohonen's model.- 15.3 Analysis of convergence.- 15.4 Applications.- 15.5 Historical and bibliographical remarks.
16. Modular Neural Networks.- 16.1 Constructive algorithms for modular networks.- 16.2 Hybrid networks.- 16.3 Historical and bibliographical remarks.
17. Genetic Algorithms.- 17.1 Coding and operators.- 17.2 Properties of genetic algorithms.- 17.3 Neural networks and genetic algorithms.- 17.4 Historical and bibliographical remarks.
18. Hardware for Neural Networks.- 18.1 Taxonomy of neural hardware.- 18.2 Analog neural networks.- 18.3 Digital networks.- 18.4 Innovative computer architectures.- 18.5 Historical and bibliographical remarks.







