The Universal Computer: The Road from Leibniz to Turing 1st Edition by Martin Davis (PDF)

Ebook Info

  • Published: 2000
  • Number of pages: 256 pages
  • Format: PDF
  • File Size: 2.24 MB
  • Authors: Martin Davis

Description

One of the world’s pioneers in the development of computer science offers a mesmerizing history of computers. Computers are everywhere today–at work, in the bank, in artists’ studios, sometimes even in our pockets–yet they remain to many of us objects of irreducible mystery. How can today’s computers perform such a bewildering variety of tasks if computing is just glorified arithmetic? The answer, as Martin Davis lucidly illustrates, lies in the fact that computers are essentially engines of logic. Their hardware and software embody concepts developed over centuries by logicians such as Leibniz, Boole, and Gödel, culminating in the amazing insights of Alan Turing. The Universal Computer traces the development of these concepts by exploring in captivating detail the lives and work of the geniuses who first formulated them. Readers will come away with a revelatory understanding of how and why computers work and how the algorithms within them came to be.

User’s Reviews

Editorial Reviews

Amazon.com Review
Computers rely on such things as semiconductors, memory chips, and electricity. But they also rely on a hard-won body of scientific knowledge that has enabled the now-ubiquitous devices to perform complex calculations, multitask, and even play a game of solitaire. Martin Davis, a fluent interpreter of mathematics and philosophy, locates the source of this knowledge in the work of the remarkable German thinker G. W. Leibniz, who, among other accomplishments, was a distinguished jurist, mining engineer, and diplomat but found time to invent a contraption called the “Leibniz wheel,” a sort of calculator that could carry out the four basic operations of arithmetic. Leibniz subsequently developed a method of calculation called the calculus ratiocinator, an innovation his successor George Boole extended by, in Davis’s words, “turning logic into algebra.” (Boole emerges as a deeply sympathetic character in Davis’s pages, rather than as the dry-as-dust figure of other histories. He explained, Davis reports, that he had turned to mathematics because he had so little money as a student to buy books, and mathematics books provided more value for the money because they took so long to work through.) Davis traces the development of this logic, essential to the advent of “thinking machines,” through the workshops and studies of such thinkers as Georg Cantor, Kurt Gödel, and Alan Turing, each of whom puzzled out just a little bit more of the workings of the world–and who, in the bargain, made the present possible. –Gregory McNamee

From Publishers Weekly
This thoroughly enjoyable mix of biographical portraits and theoretical mathematics reveals how a sequence of logicians posed the conceptual questions and contributed the crucial insights that resulted in the development of computers long before the technology was available to build even the simplest machines. An intriguing portrait of the great 17th-century mathematician G. W. Leibniz, a pivotal figure in the history of the search for human knowledge, launches this account by New York University professor emeritus Davis (Computability and Unsolvability). Steeped in Aristotelian ideas of perfection but trained in modern engineering, Leibniz conceived the idea of a universal system for determining truth. His contributions to this system are as diverse as the ingenious Leibniz Wheel (an early calculating machine) and the notation used today for calculus. His ideas, in particular his recognition of the deep connection between systems of notation and actual physical devices for performing computation, inspired mathematicians and logicians including George Boole, Gottlob Frege, Georg Cantor, David Hilbert, and Kurt Gödel, until Alan Turing used them to develop the powerful mathematical tools that underlie modern computers as well as some of the earliest computer prototypes. After Leibniz, people thought about the problem of building computational systems; after Turing, people got busy building the machines. Davis has told the fascinating story in between. Full of well-honed anecdotes and telling detail, the book reads like a masterful lecture. Presenting key mathematical ideas in moderate depth, it also offers a solid introduction to the field of computer science that will captivate motivated readers. Agent, Alex Hoyt. Copyright 2000 Reed Business Information, Inc.
From Scientific American
“As computers have evolved from the room-filling behemoths that were the computers of the 1950s to the small powerful machines of today that perform a bewildering variety of tasks, their underlying logic has remained the same,” Davis says. “These logical concepts have developed out of the work of a number of gifted thinkers over a period of centuries. In this book I tell the story of the lives of these people and explain some of their thought.” Davis, professor emeritus of mathematics at New York University, has devoted his career to “this relationship between the abstract logical concepts underlying modern computers and their physical realization.” His tale encompasses seven mathematicians who contributed to that relationship: Gottfried Leibniz, George Boole, Gottlob Frege, Georg Cantor, David Hilbert, Kurt Gödel, and Alan Turing. Leibniz, one reads, dreamed of “machines capable of carrying out calculations.” Boole put forward an algebra of logic. And on to Turing, who envisioned a “universal machine” that could play games like chess, be induced to learn much as a child does, and ultimately “could be made to exhibit behavior one would be led to call intelligent.” Davis believes that the story he tells “underscores the power of ideas and the futility of predicting where they will lead.” –Editors of Scientific American

Review
Anyone who works with computers today, who seeks to look into the electronic future, can profit greatly from reading [this]. — John McCarthy, Stanford University
Delightfully entertaining and most instructive! — Raymond Smullyan, author of The Riddle of Scheherazade and First-Order Logic
Erudite, gripping and humane, Martin Davis shows the extraordinary individuals through whom the groundwork of the computer came into being. — Andrew Hodges, author of Alan Turing: The Enigma
Martin Davis speaks about logic with the love and touch of a sculptor speaking about stone. — Dennis Shasha, New York University
[A]n elegant history of the search for the boundaries of logic and the machines that live within them. — Wired, Peter Wayner, December 2000

About the Author
Martin Davis’s other books include Computability and Unsolvability. A professor emeritus at New York University, he is currently a visiting scholar at the University of California-Berkeley.

Reviews from Amazon users, collected at the time this book was published on the website:

⭐Dr. Martin Davis’ book is excellent! It starts out as biographies of seven great mathematicians and logicians, but it is so much more. You get inside the heads of these great men, but you also learn about their lives, the world in which they lived, world history, and the mathematics they developed. Dr. Davis has a way of presenting the math in a style that practically anyone can understand with a little effort. He also adds anecdotes about some of the men in his book because he knew them or heard them speak, and as a math and logic professor he brings many years of experience to the material. The book presents the material so the reader feels like he’s reading an exciting story; it is exciting and energetic. For anyone with an interest in science, math, computers, or technology, this is a worthwhile read. Students entering college would especially benefit from the book, as it might give their course of study some new meaning and provide additional motivation to learn and achieve.

⭐Very good book. It explains in simple and clear words the most complex ideas that led to the invention of the modern computer. An informative book that should be on the reading list of everybody interested in the development of artificial intelligence and the basics of computer science. Complicated ideas such as Cantor’s multiple infinities and Gödel’s incompleteness theorem are explained to the interested layman without the use of convoluted mathematical formulas. Martin Davis enables the reader to understand how it was possible to create machines that could simulate human intelligence and how men like Alan Turing laid the foundations for the world we live in today.

⭐I read this for a 300-level computer science class and was surprised at how enjoyable it was. It has some heavy math and logic in places, but overall it is not outside the realm of understanding of a college student. Even if the logic parts are too deep, you can still enjoy it for its historical accounts of how people react to new paradigms of understanding and philosophy.

⭐Worth the read if you enjoy computational history. Well written and not too challenging given the mathematical references. Davis was there at the beginning and knows his computer history.

⭐Great timeless work focusing on historical development of logic theory.

⭐The book that arrived did not match the dimensions given in the description.

⭐Great, thanks!

⭐Awesome review of logic, semantics, and semiotics.

⭐Somewhat disjointed study of the evolution of the universal computer. I think this started off as a series of lecture notes.
