Information Theory: A Tutorial Introduction (Tutorial Introductions) by James V Stone | (PDF) Free Download


Ebook Info

  • Published: 2018
  • Number of pages: 260 pages
  • Format: PDF
  • File Size: 6.51 MB
  • Authors: James V Stone

Description

Originally developed by Claude Shannon in the 1940s, information theory laid the foundations for the digital revolution, and is now an essential tool in telecommunications, genetics, linguistics, brain sciences, and deep space communication. In this richly illustrated book, accessible examples are used to introduce information theory in terms of everyday games like ‘20 questions’ before more advanced topics are explored. Online MATLAB and Python computer programs provide hands-on experience of information theory in action, and PowerPoint slides give support for teaching. Written in an informal style, with a comprehensive glossary and tutorial appendices, this text is an ideal primer for novices who wish to learn the essential principles and applications of information theory.
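
To make the ‘20 questions’ framing concrete, here is a minimal Python sketch (not taken from the book’s companion programs; the function names are illustrative only) showing why identifying one of N equally likely items takes about log2(N) yes/no questions, and how the surprise of an outcome with probability p is measured in bits.

```python
import math

def bits_to_identify(n_items: int) -> int:
    """Yes/no questions needed to single out one of n_items equally likely possibilities."""
    return math.ceil(math.log2(n_items))

def surprisal(p: float) -> float:
    """Shannon information (in bits) of an outcome with probability p."""
    return -math.log2(p)

if __name__ == "__main__":
    # 20 well-chosen yes/no questions can distinguish 2**20 (about a million) items.
    print(bits_to_identify(2**20))  # -> 20
    # A fair coin flip carries exactly 1 bit of information.
    print(surprisal(0.5))           # -> 1.0
```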

User’s Reviews

Reviews from Amazon users, collected at the time this book was published on the website:

⭐Information Theory: A Tutorial Introduction is a highly readable first account of Shannon’s mathematical theory of communication, now known as information theory. It assumes little prior knowledge and discusses information with respect to both discrete and continuous random variables. It is a gentle introduction to the theory: intuitive rather than rigorous, with very clear examples of how to use information and coding theory to frame certain problems.
The book starts out by defining the unit of information, the bit, and then shows how to quantify information so that it has the properties we would like it to have, namely additivity, continuity, symmetry, and maximality with respect to certain probability distributions. He works through various examples to give a feel for how information can be thought about and for how many means of communication include a great deal of redundancy. The author first tackles discrete random variables and their entropy and information. Basic ideas, such as the less likely a message is, the more information it carries, lead to the construction of a measure of information that fits Shannon’s conditions. From that construction of entropy the author explores its repercussions with examples such as dice. The explanations are clear, and it is hard not to feel that you are really making progress on what is supposed to be a very difficult subject.
The author moves from information to coding theory, the practical application of the subject, and introduces ideas such as channel capacity (how much information can be transmitted through a noiseless channel), conditional expectations, and coding schemes that can deliver results arbitrarily close to the channel capacity. The author spends some time discussing redundancy in English and coding schemes for creating efficient blocks for the language, which is really interesting. One learns quickly through the tutorials that making blocks of code equiprobable is the way to achieve the best coding efficiency and maximize information transmission. The author then moves on to noisy channels, which are far more practical and further develop the reader’s intuition. One learns how to add redundancy to codes to make them resilient to noise contamination.
The author then discusses continuous random variables and their information. The information of a continuous variable, a more difficult subject, can be considered infinite since the possible events are uncountable, but noise avoids this problem, and the author explains the ideas well so that the content remains clear. He discusses mutual information in the continuous case, giving the reader a strong understanding of joint and marginal probability distributions. With these tools the author returns to the capacity of a noisy channel for continuous variables and revisits the ideas from the discrete case in this new light. The author then moves from communication-related information theory to entropy in physics. These chapters give a feel for the similarity, and topics like thermodynamics and quantum information are touched on lightly.
Information Theory is a highly readable account of what is usually a very technical subject. The reader will come away with an intuitive feel for information, for how it is transmitted across various channels, and for the coding schemes that do this effectively.
The only thing is that there are no exercises, so the reader’s confidence will be to some extent false, as reading and implementing are very different things. That being said, as an introductory text before tackling more mathematical works, I think this is very helpful. I have both this and the author’s work on Bayes, and I prefer the content of this one. Definitely recommended.
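
The review above touches on the entropy of discrete variables (the dice examples), why equiprobable symbols maximise coding efficiency, and the capacity of noisy channels. The following is a small, self-contained Python sketch of those standard quantities; it is not code from the book, and the names and numbers are illustrative only.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

if __name__ == "__main__":
    fair_die   = [1/6] * 6
    loaded_die = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]
    # Equiprobable outcomes maximise entropy: log2(6) ≈ 2.585 bits per roll.
    print(entropy(fair_die))     # ≈ 2.585
    print(entropy(loaded_die))   # ≈ 2.161 (less surprise on average)

    # Capacity of a binary symmetric channel that flips each bit with
    # probability p: C = 1 - H(p) bits per transmitted bit.
    p = 0.1
    print(1 - entropy([p, 1 - p]))  # ≈ 0.531
```

The fair (equiprobable) die attains the maximum possible entropy of log2(6) ≈ 2.585 bits, which is why efficient codes aim for equiprobable blocks, while the last line shows how noise (here a 10% bit-flip rate) pushes the usable capacity below 1 bit per transmitted bit.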

⭐Whether you are just beginning your relationship with mathematics or you are a theoretical mathematician working on the applications of knot theory to 4D topology, you have something to gain from this book.
Mathematics is hard. The language in which mathematicians describe their work only compounds the difficulty of learning math, for few are fluent in this succinct language. Unfortunately, it is this language that almost all introductory mathematical textbooks conform to, at the cost of the reader’s comprehension. Dr. Stone overcomes this language rift by explaining the math in a friendly, familiar way. He further takes care to ensure appropriate time is spent clarifying each topic in a variety of ways (in case one explanation does not make much sense). Stone also provides appendices as a reference for the reader who may need more explanation or a refresher. This kind of guidance through mathematical theory is inherently absent from the mathematical language, whose core is precision, brevity, and the removal of all redundancy.
It is this very thoughtful explanation and walkthrough that makes me confident in saying that Dr. James V Stone’s introduction to information theory is conceivably the best book I have read; not just with regard to information theory, but with regard to mathematics (applied or otherwise) as a whole. The reader is guided through Shannon’s seminal work in a way that is applicable regardless of the reader’s background (mathematics, art, biology, journalism, etc.). Dr. Stone helps the reader develop an intuition for information theory. The feeling of such a clear and well-expounded grasp of a mathematical field is so rare that it is difficult to describe other than to say you’ll “just get it.” If you have had minimal exposure to math, are helplessly confused by proofs, or feel like you just never understand, this book is for you. It is equally applicable to those versed in mathematics, as it provides an understanding that is often disjoint from a theoretical approach.
Consider how many people use “basic” mathematics intuitively to approach and solve questions in their daily life: if I make $x per month, how much do I earn per week? Of my weekly earnings, if I set aside $y for groceries and $z for savings, how much do I have left to spend? Applying math in this way, by understanding the concepts behind it rather than just plugging numbers into a formula, is exactly what you can expect to gain from reading Dr. Stone’s book. By the end of even the first section of the first chapter, the reader may find that they are already grasping this intuitive understanding and applying it to the world around them. Dr. Stone helps the reader integrate the core concepts of information theory so that the math behind it becomes a tool for the reader to use, rather than something to be perplexed by. The book is a joy to read, and a privilege to learn from.
For those who have read the review this far: I study mathematics and neuroscience. As such, I have read my fair share of mathematical textbooks, mathematical introductions, and mathematical books for “dummies” (as I often feel myself). In my experience math never stops being complex, and try as many might to simplify it, none have succeeded as well as Dr. Stone. I was versed in information theory before reading this book. The intuition and the deep understanding and appreciation for this field that I have gained from it are unheard of.
It is intuition that makes a great mathematician, and this book will teach you to think intuitively. The clarity of Dr. Stone’s work is so profound that I have little other way to describe how accessible it is to readers from all walks of life. While many may not consider this a “mathematical textbook,” let Dr. Stone’s style be an example of how math should initially be taught. This book works well as a standalone text and as a supplement to more intricate texts on information theory. Do not let the word ‘introduction’ in the title fool you; Dr. Stone manages to preserve the intricacies of the field in a way that is often overlooked.

⭐Two questions came up as I read this book. How can information resolve ambiguity if redundancy is eliminated as inefficiency? Shannon equated information with surprise value, and redundancy is boring. But who puts the form in information? Also, he defined communication as going from input to output, whereas in linguistics it goes from output to input. Communication implies production reliably identical to the input. This makes no sense to me.

⭐At last, a book which is pragmatic, and properly so. So many other books are written by people whose entire life is about the subject at hand; they obviously find it easy and run through the basics at 100 mph. Here is a book where the author is clearly still a professional but understands completely that “baby steps”, appealing first to intuition, are the initial way forward. It has the perfectly sloped learning curve for the person who is not a post-grad doctoral candidate in maths or physics (read: I’m a computer scientist and Greek-letter overload is just not my thing), yet it develops the concepts into rigour as one goes along. Would that all scientific textbooks were written this way.

⭐Great introductory book on this topic. One star docked because the physical quality of the book was mediocre. This book is printed by Amazon UK (it says so in the back matter), and the print and paper quality is, unfortunately, not as good as you would get with an ‘off the shelf’ book. If buying again, I’d try to get a publisher’s edition secondhand. I have given feedback directly to Amazon customer care about this.

⭐This is easily the best and most concise introduction to information theory that I’ve read, and I’ve read a lot. The author takes time to elaborate on the detail and anchors it around the insight that the detail is meant to express. I spent the whole of last Christmas reading it, and it was a pleasure to finally get to grips with the theory.

⭐Good introduction, with some interesting examples. The writing style is easy to follow, but some parts are a bit too dry for my taste. I still recommend it over other more complex and boring books on this subject.

⭐A clear, well presented introduction to the main ideas of information theory. A number of illustrative applications are presented.

Keywords

Free Download Information Theory: A Tutorial Introduction (Tutorial Introductions) in PDF format
Information Theory: A Tutorial Introduction (Tutorial Introductions) PDF Free Download
Download Information Theory: A Tutorial Introduction (Tutorial Introductions) 2018 PDF Free
Information Theory: A Tutorial Introduction (Tutorial Introductions) 2018 PDF Free Download
Download Information Theory: A Tutorial Introduction (Tutorial Introductions) PDF
Free Download Ebook Information Theory: A Tutorial Introduction (Tutorial Introductions)
