Digital Communications and Signal Processing
Tuesday, March 29, 2022
Basics of Information Theory
Entropy
Entropy definition
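For a discrete random variable X with probability mass function p(x), the usual definition (in bits, taking logarithms base 2) is

H(X) = -\sum_{x} p(x) \log_2 p(x)

For example, a fair coin flip has H(X) = 1 bit.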
Entropy as expectation
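Equivalently, entropy is the expected value of the self-information \log_2 (1/p(X)) under p:

H(X) = \mathbb{E}_p\left[\log_2 \frac{1}{p(X)}\right] = -\mathbb{E}_p\left[\log_2 p(X)\right]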
Joint Entropy
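For a pair (X, Y) with joint pmf p(x, y):

H(X, Y) = -\sum_{x}\sum_{y} p(x, y) \log_2 p(x, y)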
Conditional Entropy
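Conditional entropy is the entropy of Y averaged over the outcomes of X, and it connects joint and marginal entropies through the chain rule:

H(Y \mid X) = -\sum_{x}\sum_{y} p(x, y) \log_2 p(y \mid x)

H(X, Y) = H(X) + H(Y \mid X)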
Relative Entropy or Kullback-Leibler Distance
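For two pmfs p and q defined on the same alphabet:

D(p \| q) = \sum_{x} p(x) \log_2 \frac{p(x)}{q(x)}

Here D(p \| q) \ge 0, with equality if and only if p = q. It is not symmetric, so despite the name it is not a true distance.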
Mutual Information
Mutual information is the relative entropy between the joint distribution p(x,y) and the product distribution p(x)p(y); intuitively, it measures how far X and Y are from being independent.
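In formulas:

I(X; Y) = D\big(p(x, y) \,\|\, p(x)\,p(y)\big) = \sum_{x, y} p(x, y) \log_2 \frac{p(x, y)}{p(x)\,p(y)} = H(X) - H(X \mid Y)

So I(X; Y) = 0 exactly when X and Y are independent.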
Capacity
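Channel capacity is the mutual information between the channel input and output, maximized over the input distribution:

C = \max_{p(x)} I(X; Y)

For the binary symmetric channel with crossover probability p this gives C = 1 - H_b(p) bits per channel use, where H_b is the binary entropy function.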