Algorithmic Learning in a Random World 2005th Edition by Vladimir Vovk (PDF)


Ebook Info

  • Published: 2005
  • Number of pages: 340 pages
  • Format: PDF
  • File Size: 3.62 MB
  • Authors: Vladimir Vovk

Description

Algorithmic Learning in a Random World describes recent theoretical and experimental developments in building computable approximations to Kolmogorov’s algorithmic notion of randomness. Based on these approximations, a new set of machine learning algorithms has been developed that can be used to make predictions and to estimate their confidence and credibility in high-dimensional spaces under the usual assumption that the data are independent and identically distributed (the assumption of randomness). Another aim of this unique monograph is to outline some limits of prediction: the approach based on the algorithmic theory of randomness allows one to prove the impossibility of prediction in certain situations. The book describes how several important machine learning problems, such as density estimation in high-dimensional spaces, cannot be solved if the only assumption is randomness.
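
To make the confidence and credibility mentioned above concrete, here is a minimal sketch of a transductive conformal classifier with a 1-nearest-neighbour nonconformity measure. It is an illustration under simplifying assumptions, not the book’s full treatment; the toy data and helper names are invented for the example.

```python
import numpy as np

def knn_nonconformity(x, y, X, Y):
    """1-NN nonconformity score of example (x, y) within the bag (X, Y):
    distance to the nearest example with the same label divided by the
    distance to the nearest example with a different label."""
    d = np.linalg.norm(X - x, axis=1)
    same = d[(Y == y) & (d > 0)]          # exclude the example itself
    other = d[Y != y]
    nearest_same = same.min() if same.size else np.inf
    nearest_other = other.min() if other.size else np.inf
    if not np.isfinite(nearest_other):
        return 0.0 if np.isfinite(nearest_same) else np.inf
    return nearest_same / nearest_other

def conformal_classify(X_train, y_train, x_new, labels):
    """Try every candidate label for x_new and return its p-values,
    plus the point prediction with confidence and credibility."""
    p_values = {}
    for lab in labels:
        X = np.vstack([X_train, x_new])
        Y = np.append(y_train, lab)
        scores = np.array([knn_nonconformity(X[i], Y[i], X, Y)
                           for i in range(len(Y))])
        # p-value: fraction of examples at least as nonconforming as the new one
        p_values[lab] = np.mean(scores >= scores[-1])
    ranked = sorted(p_values.items(), key=lambda kv: -kv[1])
    prediction, credibility = ranked[0]   # credibility = largest p-value
    confidence = 1.0 - ranked[1][1]       # 1 - second-largest p-value (needs >= 2 labels)
    return p_values, prediction, confidence, credibility

# Toy usage with two small clusters (hypothetical data, for illustration only).
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(conformal_classify(X_train, y_train, np.array([0.95, 0.9]), labels=[0, 1]))
```

For each candidate label the p-value is the fraction of examples at least as nonconforming as the new one; the predicted label is the one with the largest p-value, credibility is that p-value, and confidence is one minus the second-largest p-value.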

User’s Reviews

Editorial Reviews

From the reviews:

“Algorithmic Learning in a Random World has ten chapters, three appendices, and extensive references. Each chapter ends with a section containing comments, historical discussion, and bibliographical remarks. … The material is developed well and reasonably easy to follow … . the text is very readable. … is doubtless an important reference summarizing a large body of work by the authors and their graduate students. Academics involved with new implementations and empirical studies of machine learning techniques may find it useful too.” (James Law, SIGACT News, Vol. 37 (4), 2006)

From the Back Cover:

Conformal prediction is a valuable new method of machine learning. Conformal predictors are among the most accurate methods of machine learning, and unlike other state-of-the-art methods, they provide information about their own accuracy and reliability. This new monograph integrates mathematical theory and revealing experimental work. It demonstrates mathematically the validity of the reliability claimed by conformal predictors when they are applied to independent and identically distributed data, and it confirms experimentally that the accuracy is sufficient for many practical problems. Later chapters generalize these results to models called repetitive structures, which originate in the algorithmic theory of randomness and statistical physics. The approach is flexible enough to incorporate most existing methods of machine learning, including newer methods such as boosting and support vector machines and older methods such as nearest neighbors and the bootstrap.

Topics and Features:

  • Describes how conformal predictors yield accurate and reliable predictions, complemented with quantitative measures of their accuracy and reliability
  • Handles both classification and regression problems
  • Explains how to apply the new algorithms to real-world data sets
  • Demonstrates the infeasibility of some standard prediction tasks
  • Explains connections with Kolmogorov’s algorithmic randomness, recent work in machine learning, and older work in statistics
  • Develops new methods of probability forecasting and shows how to use them for prediction in causal networks

Researchers in computer science, statistics, and artificial intelligence will find the book an authoritative and rigorous treatment of some of the most promising new developments in machine learning. Practitioners and students in all areas of research that use quantitative prediction or machine learning will learn about important new methods.

Reviews from Amazon users, collected at the time this book was published on the website:

⭐Most people who are concerned with probabilistic forecasts model them using a Bayesian approach, or model class probabilities separately from the classifier (e.g., fitting logistic models to the output of an SVM, a.k.a. Platt’s method). The authors of this book have a different approach, which is more fundamental and more powerful than Bayesian ideas. In fact, this basket of ideas can be used to troubleshoot and “de-Bayes” your Bayesian learner, and let you know when your prior is broken. This book is the first exposition of the ideas of Vovk, Shafer and Gammerman on this subject, generally known as “conformal prediction.” I’ve written a long blog post on the topic. The ideas are useful in all areas of probabilistic prediction; anything from anomaly detection to semi-supervised learning can benefit from these ideas and techniques. It’s kind of a travesty that so few people seem to know anything about them; there are only a few open source code projects using the ideas, and I find I am the first person to review the book. The book requires some work; it’s mathematical and far from a cookbook for the average practitioner.

⭐Machine learning is introduced like in no other book. It’s elementary; the basics are covered. Not an expensive book.

Keywords

Free Download Algorithmic Learning in a Random World 2005th Edition in PDF format
Algorithmic Learning in a Random World 2005th Edition PDF Free Download
Download Algorithmic Learning in a Random World 2005th Edition 2005 PDF Free
Algorithmic Learning in a Random World 2005th Edition 2005 PDF Free Download
Download Algorithmic Learning in a Random World 2005th Edition PDF
Free Download Ebook Algorithmic Learning in a Random World 2005th Edition
