File name: Information Theory, Inference, and Learning Algorithms
File size: 12.1 MB
File format: PDF
Updated: 2013-04-04 08:43:45
Machine learning, Information theory
David J. C. MacKay's book. It is the textbook used for the information theory course at the University of Cambridge, and it is also an excellent book on machine learning. Its distinctive feature is that it reveals the relationship between information theory and machine learning; in the author's words, "information and learning are two sides of the same coin." Strongly recommended for anyone studying machine learning and/or information theory. It is genuinely thought-provoking.

Contents

Preface
1 Introduction to Information Theory
2 Probability, Entropy, and Inference
3 More about Inference

I Data Compression
4 The Source Coding Theorem
5 Symbol Codes
6 Stream Codes
7 Codes for Integers

II Noisy-Channel Coding
8 Dependent Random Variables
9 Communication over a Noisy Channel
10 The Noisy-Channel Coding Theorem
11 Error-Correcting Codes and Real Channels

III Further Topics in Information Theory
12 Hash Codes: Codes for Efficient Information Retrieval
13 Binary Codes
14 Very Good Linear Codes Exist
15 Further Exercises on Information Theory
16 Message Passing
17 Communication over Constrained Noiseless Channels
18 Crosswords and Codebreaking
19 Why have Sex? Information Acquisition and Evolution

IV Probabilities and Inference
20 An Example Inference Task: Clustering
21 Exact Inference by Complete Enumeration
22 Maximum Likelihood and Clustering
23 Useful Probability Distributions
24 Exact Marginalization
25 Exact Marginalization in Trellises
26 Exact Marginalization in Graphs
27 Laplace's Method
28 Model Comparison and Occam's Razor
29 Monte Carlo Methods
30 Efficient Monte Carlo Methods
31 Ising Models
32 Exact Monte Carlo Sampling
33 Variational Methods
34 Independent Component Analysis and Latent Variable Modelling
35 Random Inference Topics
36 Decision Theory
37 Bayesian Inference and Sampling Theory

V Neural Networks
38 Introduction to Neural Networks
39 The Single Neuron as a Classifier
40 Capacity of a Single Neuron
41 Learning as Inference
42 Hopfield Networks
43 Boltzmann Machines
44 Supervised Learning in Multilayer Networks
45 Gaussian Processes
46 Deconvolution

VI Sparse Graph Codes
47 Low-Density Parity-Check Codes
48 Convolutional Codes and Turbo Codes
49 Repeat-Accumulate Codes
50 Digital Fountain Codes

VII Appendices
A Notation
B Some Physics
C Some Mathematics
Bibliography
Index