pdf | 21.42 MB | English | ISBN: 9780262534772 | Author: Marvin Minsky, Seymour A. Papert, Léon Bottou | Year: 2017
About ebook: Perceptrons: An Introduction to Computational Geometry (Reissue of the 1988 Expanded Edition with a new foreword by Léon Bottou)
The first systematic study of parallelism in computation by two pioneers in the field.
Reissue of the 1988 Expanded Edition with a new foreword by Léon Bottou
In 1969, ten years after the discovery of the perceptron, which showed that a machine could be taught to perform certain tasks using examples, Marvin Minsky and Seymour Papert published Perceptrons, their analysis of the computational capabilities of perceptrons for specific tasks. As Léon Bottou writes in his foreword to this edition, "Their rigorous work and brilliant technique does not make the perceptron look very good." Perhaps as a result, research turned away from the perceptron. Then the pendulum swung back, and machine learning became the fastest-growing field in computer science. Minsky and Papert's insistence on its theoretical foundations is newly relevant.
Perceptrons, the first systematic study of parallelism in computation, marked a historic turn in artificial intelligence, returning to the idea that intelligence might emerge from the activity of networks of neuron-like entities. Minsky and Papert provided mathematical analysis that showed the limitations of a class of computing machines that could be considered as models of the brain. Minsky and Papert added a new chapter in 1987 in which they discuss the state of parallel computers, and note a central theoretical challenge: reaching a deeper understanding of how "objects" or "agents" with individuality can emerge in a network. Progress in this area would link connectionism with what the authors have called "society theories of mind."
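For a concrete sense of the kind of limitation Minsky and Papert analyzed, here is a minimal Python sketch (illustrative only, not code from the book) of the classic single-layer perceptron learning rule. It converges on a linearly separable predicate such as AND, but cannot fit XOR, the textbook example of a function no single linear threshold unit can compute.

```python
# Minimal single-layer perceptron sketch (illustrative only; not code from the book).
# It learns linearly separable AND but fails on XOR, the classic limitation
# analyzed in Perceptrons.

def train_perceptron(samples, epochs=50, lr=0.1):
    """Classic perceptron learning rule on two binary inputs plus a bias weight."""
    w = [0.0, 0.0]  # input weights
    b = 0.0         # bias
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def accuracy(samples, w, b):
    correct = sum(
        (1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0) == t
        for (x1, x2), t in samples
    )
    return correct / len(samples)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

for name, data in [("AND", AND), ("XOR", XOR)]:
    w, b = train_perceptron(data)
    print(name, "accuracy:", accuracy(data, w, b))
# Expected: AND reaches 1.0, while XOR stays below 1.0 no matter how long it trains,
# because no single linear threshold separates its classes.
```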
Category:Science & Technology, Computers, Engineering, Technology, Mathematics, Artificial Intelligence (AI), Robotics & Artificial Intelligence, Computers - General & Miscellaneous, Geometry, Computer Mathematics, Geometry - General & Miscellaneous, Neural Networks, Parallel, Distributed, and Supercomputing