The Handbook of Brain Theory and Neural Networks: Second Edition by Michael A. Arbib (PDF)


Ebook Info

  • Published: 2011
  • Number of pages: 1308
  • Format: PDF
  • File Size: 33.61 MB
  • Authors: Michael A. Arbib

Description

A new, dramatically updated edition of the classic resource on the constantly evolving fields of brain theory and neural networks.

Dramatically updating and extending the first edition, published in 1995, the second edition of The Handbook of Brain Theory and Neural Networks presents the enormous progress made in recent years in the many subfields related to the two great questions: How does the brain work? And how can we build intelligent machines?

Once again, the heart of the book is a set of almost 300 articles covering the whole spectrum of topics in brain theory and neural networks. The first two parts of the book, prepared by Michael Arbib, are designed to help readers orient themselves in this wealth of material. Part I provides general background on brain modeling and on both biological and artificial neural networks. Part II consists of “Road Maps” to help readers steer through the articles in Part III on specific topics of interest. The articles in Part III are written so as to be accessible to readers of diverse backgrounds. They are cross-referenced and provide pointers to Road Maps, background material, and related reading.

The second edition greatly increases the coverage of models of fundamental neurobiology, cognitive neuroscience, and neural network approaches to language. It contains 287 articles, compared with the 266 in the first edition. Articles on topics from the first edition have been updated by the original authors or written anew by new authors, and there are 106 articles on new topics.

User’s Reviews

Editorial Reviews

Review: “This revised Handbook of Brain Theory provides useful new data and updates key concepts in neuroscience. It will be an indispensable guide for exploring the essentials of brain science.” ―Masao Ito, RIKEN Brain Science Institute

About the Author: Michael Arbib has played a leading role at the interface of neuroscience and computer science ever since his first book, Brains, Machines, and Mathematics. From Neuron to Cognition provides a worthy pedagogical sequel to his widely acclaimed Handbook of Brain Theory and Neural Networks. After thirty years at the University of Southern California, he is now pursuing interests in “how the brain got language” and “neuroscience for architecture” in San Diego.

Reviews from Amazon users, collected at the time this book was published on the website:

⭐Review of Second Edition (January 2008): This sizable collection of articles updates the first volume with many discoveries and conceptual developments that were unknown at the time. The book is, of course, meant for reference, and a typical reader, such as this reviewer, will probably not read every article in the collection but will instead concentrate on the ones of primary interest. The editor does offer advice on “how to use this book” at the beginning, both for readers who intend to use it as their primary source of information and for instructors who will use it as a supplement to classes in brain theory, artificial intelligence, computational neuroscience, and cognitive neuroscience. All of these topics are represented, with emphasis naturally on those the editor finds important. Time constraints will play a role in any sampling of the articles, but every article this reviewer studied was well worth the time spent.

One of these articles, written by the editor, gives an overview of his work on the “mirror system hypothesis” (MSH). This work has been widely discussed in the literature on evolutionary linguistics since the first edition of this book, and when encountered for the first time it may seem like a radical hypothesis. Such skepticism is aggravated by the lack of any historical record of brain structure, so theories of language evolution will remain more tentative than other scientific theories. The editor, though, asks the reader to consider evidence for the mirror system hypothesis drawn from existing life forms. He proposes that we examine the “mirror system” for grasping in monkeys, which he asserts contains “mirror neurons” that are activated both when the monkey performs a specific hand action and when it merely observes a human or another monkey performing a similar action. The MSH asserts that this matching in the neural code between observation and execution was already present in the common ancestor of monkeys and humans. Further, this matching explains the notion of language “parity,” the claim that a spoken utterance carries essentially the same meaning for speaker and listener. The editor reviews his ideas on what brain mechanisms are responsible for language and grasping, and on whether a mirror system is indeed present in humans. Experiments using positron emission tomography support his thesis to some extent, but he cautions that a lot more work needs to be done before definitive conclusions can be drawn. His thesis, though, is plausible on the surface, and interesting in that it proposes that language originally evolved not from a need for communication but from a need to recognize a set of actions. “Language readiness,” then, resulted from an extension of the mirror system from recognizing single actions to imitating compound actions. A natural question to ask is why sophisticated grammatical constructions, some of them semantically awkward and of no practical value, would evolve from the mere need to imitate, which is itself not very complex by any reasonable measure. The editor is aware of such objections, for in the article he addresses them under the heading of “protospeech,” for which he postulates two evolutionary stages. His assertions in this regard are interesting, for they involve the need for cooperation between two or more areas of the brain.
Along these same lines, and even more fascinating, is the editor’s discussion of neuronal models for the mirror system, for when he proposes a canonical structuring for sentences he is actually asserting a kind of “entanglement” (he does not use this terminology in the article) between the F5 area and its mirror.

Review of First Edition: This compilation of articles by leading experts in the field gives an excellent overview of studies in cognitive theory and of the theory and applications of neural networks. The first two parts of the book give an overview and background on the properties of neurons and guide the reader on the sequence in which the articles may be read. This reviewer did not read all of the articles, but only those that piqued his interest, such as the following, which are particularly well written and informative:

1. “Applications of Neural Networks”: Outlines the diverse applications of neural networks to signal processing, time series, imaging, etc.
2. “Astronomy”: Neural network applications in astronomy, such as adaptive optics and telescope guidance.
3. “Chains of Coupled Oscillators”: Their connection with the lamprey central pattern generator.
4. “Chaos in Axons”: An excellent review of chaos, experimentally in squid axons and numerically with nerve equations.
5. “Collective Behavior of Coupled Oscillators”: A study of phase models and the complex Ginzburg-Landau model.
6. “Computer Modeling Methods for Neurons”: Good overview of numerical modeling of neurons.
7. “Computing with Attractors”: Overview of computing and feedback networks with attractors, and a fascinating discussion of the possible existence of attractors in the brain (a minimal illustrative sketch follows this review).
8. “Constrained Optimization and the Elastic Net”: Useful discussion of the application of neural networks to optimization problems.
9. “Data Clustering and Learning”: Good discussion of parameter estimation of mixture models by parametric statistics and vector quantization of a data set by combinatorial optimization.
10. “Diffusion Models of Neuron Activity”: Discusses one-dimensional stochastic diffusion models for the neuron membrane potential.
11. “Disease: Neural Network Models”: Interesting overview of neural network computational models of various mental illnesses.
12. “Dynamics and Bifurcation of Neural Networks”: Discussion of neural networks and their behavior as dynamical systems.
13. “Emotion and Computational Neuroscience”: Fascinating discussion of computational models of emotion.
14. “Investment Management”: A discussion of tactical asset allocation neural network methods in asset management.
15. “Learning and Generalization: Theoretical Bounds”: Overview of computational learning theory.
16. “Locust Flight”: Interesting neural network study of the locust flight system.
17. “Neural Optimization”: Discussion of combinatorial optimization using Ising and Potts neural networks.
18. “PAC Learning and Neural Networks”: Overview of Valiant’s “probably approximately correct” learning paradigm in neural networks.
19. “Protein Structure Prediction”: Neural network applications to the prediction of protein secondary structure.
20. “Schema Theory”: Extremely interesting overview of schemas.
21. “Speech Recognition: Pattern Matching”: Excellent discussion of the application of hidden Markov models to speech recognition.
22. “Statistical Mechanics of Neural Networks”: Discussion of the use of the Hopfield model in neural networks.
23. “Vapnik-Chervonenkis Dimension of Neural Networks”: Very interesting discussion of the VC-dimension of neural networks.
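Several of the articles singled out above, notably 7 (“Computing with Attractors”) and 22 (“Statistical Mechanics of Neural Networks”), revolve around the Hopfield model of associative memory: patterns are stored in a symmetric weight matrix by Hebbian learning, and recall proceeds by updating units until the network settles into an attractor. The Python sketch below is a minimal illustration of that idea only; it does not reproduce any code or example from the handbook, and the function names and toy patterns are illustrative choices.

import numpy as np

def train_hopfield(patterns):
    # Hebbian learning: W_ij = (1/N) * sum_p x_i^p * x_j^p, with zero diagonal.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_sweeps=100, rng=None):
    # Asynchronous updates: set each unit to the sign of its local field,
    # in random order, until a full sweep produces no change (an attractor).
    rng = np.random.default_rng() if rng is None else rng
    s = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s

# Store two orthogonal 8-unit patterns and recover one from a corrupted cue.
pats = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                 [1, 1, 1, 1, -1, -1, -1, -1]])
W = train_hopfield(pats)
cue = pats[0].copy()
cue[:2] *= -1                     # flip two bits of the first pattern
print(recall(W, cue))             # should print the first stored pattern

Because the weight matrix is symmetric with a zero diagonal, asynchronous updates never increase the network’s energy, which is why the dynamics settle into a fixed point rather than cycling.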

⭐As expected

⭐The articles in this work are written by a who’s who list of authors from the cognitive and computational neuroscience community. Each article is useful for getting an initial bearing on a topic from this dynamic field. The references for each article serve as useful “jumping off points” for further learning. It should be noted that this text is not a typical college textbook — it is a reference work. As such, a beginner to the field should consider one of the other introductory textbooks (perhaps “The Cognitive Neurosciences”).

⭐For those fascinated with neural network theory, this book is a comprehensive compendium of some of the best papers published on the subject. So far it is one of the best volumes on neural networks that I have seen, and a well-thought-out compilation of papers.

⭐This is THE neural network and brain theory reference. Owning it is like owning an entire library, though much more compact. If you take a look at the table of contents, you’ll see the massive value in this book. If you’re into neural nets and brain theory, or want to be, you need this book.

