Perceptrons: An Introduction to Computational Geometry, Expanded Edition by Marvin Minsky (PDF)


Ebook Info

  • Published: 1987
  • Number of pages: 308 pages
  • Format: PDF
  • File Size: 13.34 MB
  • Authors: Marvin Minsky, Seymour A. Papert

Description

Perceptrons – the first systematic study of parallelism in computation – has remained a classical work on threshold automata networks for nearly two decades. It marked a historical turn in artificial intelligence, and it is required reading for anyone who wants to understand the connectionist counterrevolution that is going on today. Artificial-intelligence research, which for a time concentrated on the programming of von Neumann computers, is swinging back to the idea that intelligence might emerge from the activity of networks of neuronlike entities. Minsky and Papert’s book was the first example of a mathematical analysis carried far enough to show the exact limitations of a class of computing machines that could seriously be considered as models of the brain. Now the new developments in mathematical tools, the recent interest of physicists in the theory of disordered matter, the new insights into and psychological models of how the brain works, and the evolution of fast computers that can simulate networks of automata have given Perceptrons new importance.

Witnessing the swing of the intellectual pendulum, Minsky and Papert have added a new chapter in which they discuss the current state of parallel computers, review developments since the appearance of the 1972 edition, and identify new research directions related to connectionism. They note a central theoretical challenge facing connectionism: the challenge to reach a deeper understanding of how “objects” or “agents” with individuality can emerge in a network. Progress in this area would link connectionism with what the authors have called “society theories of mind.”

Marvin L. Minsky is Donner Professor of Science in M.I.T.’s Electrical Engineering and Computer Science Department. Seymour A. Papert is Professor of Media Technology at M.I.T.

User’s Reviews

Editorial Reviews: About the Author Marvin Minsky (1927–2016) was Toshiba Professor of Media Arts and Sciences and Donner Professor of Electrical Engineering and Computer Science at MIT. He was a cofounder of the MIT Media Lab and a consultant for the One Laptop Per Child project. The late Seymour A. Papert was a Professor in MIT’s AI Lab (1960–1980s) and MIT’s Media Lab (1985–2000) and the author of Mindstorms: Children, Computers, and Powerful Ideas.

Reviews from Amazon users, collected from the website at the time this book was published there:

⭐Even though this is an old book, I enjoyed reading through it. There is a lot of material that isn’t referenced much elsewhere; for example, there is a short breakout section on toroidal maps.

⭐Book is great. But the copy I received is made of cheap paper. I know that there is a better-quality version of this book.

⭐As expected

⭐Big book about these specialized computer science topics, great if you’re looking for research material. I used it as a reference for my thesis.

⭐Shipping is fast. Book is nice and clean. Can’t wait to read it!

⭐In 1958, Cornell psychologist Frank Rosenblatt proposed the ‘perceptron’, one of the first neural networks to become widely known. A retina sensory layer projected to an association layer made up of threshold logic units, which in turn connected to the third layer, the response layer. If two groups of patterns are linearly separable, then the perceptron network works well in learning to classify them into separate classes. In this reference, Minsky and Papert show that, assuming a diameter-limited sensory retina, a perceptron network could not always compute connectedness, i.e., determine whether a line figure is one connected line or two separate lines. Extrapolating the conclusions of this reference to other sorts of neural networks was a big setback to the field at the time. However, it was subsequently shown that adding a ‘hidden’ layer to the neural network overcame many of the limitations. This reference figures prominently in the field of neural networks and is often cited in modern works. But of even greater significance, the history of the perceptron demonstrates the complexity of analyzing neural networks. Before this reference, artificial neural networks were considered terrific; after this reference, limited; and then in the 1980s, terrific again. But at the time of this writing, it is realized that despite physiological plausibility, artificial neural networks do not scale well to large or complex problems that brains can easily handle, and artificial neural networks as we know them may actually be not so terrific.
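The linear-separability point in the review above can be made concrete with a short sketch (not from the book; the function names and the XOR example are illustrative assumptions, standing in for the connectedness predicate the book actually analyzes). A single Rosenblatt-style threshold unit trained with the classic perceptron update rule learns AND, which is linearly separable, but no choice of weights lets it represent XOR, which is not:

```python
# Minimal sketch of a single threshold logic unit (a one-layer perceptron).
# Illustrative only: shows it learns the linearly separable AND function
# but cannot represent XOR, which is not linearly separable.

def train_perceptron(samples, epochs=20, lr=1.0):
    """Classic perceptron rule: nudge weights by (target - prediction)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x):
    """Fire (1) when the weighted sum exceeds the threshold."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w, b = train_perceptron(AND)
print([predict(w, b, x) for x, _ in AND])  # learns AND: [0, 0, 0, 1]

w, b = train_perceptron(XOR)
print([predict(w, b, x) for x, _ in XOR])  # never matches XOR's [0, 1, 1, 0]
```

The XOR failure is structural, not a matter of training longer: any weights define a single separating line in the plane, and no line puts (0,1) and (1,0) on one side with (0,0) and (1,1) on the other. A hidden layer, as the review notes, removes this limitation.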

Keywords

Free Download Perceptrons: An Introduction to Computational Geometry, Expanded Edition in PDF format
Perceptrons: An Introduction to Computational Geometry, Expanded Edition PDF Free Download
Download Perceptrons: An Introduction to Computational Geometry, Expanded Edition 1987 PDF Free
Perceptrons: An Introduction to Computational Geometry, Expanded Edition 1987 PDF Free Download
Download Perceptrons: An Introduction to Computational Geometry, Expanded Edition PDF
Free Download Ebook Perceptrons: An Introduction to Computational Geometry, Expanded Edition
