Digital Communications and Signal Processing
Tuesday, March 29, 2022
Basics of Information Theory
Entropy
Entropy definition
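For a discrete random variable X taking values in an alphabet \mathcal{X} with pmf p(x), the standard definition (with base-2 logarithms, so the unit is bits) is

H(X) = -\sum_{x \in \mathcal{X}} p(x) \log_2 p(x)

with the convention 0 \log 0 = 0.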
Entropy as expectation
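Equivalently, entropy is the expected value of the self-information \log_2(1/p(X)):

H(X) = E\left[\log_2 \frac{1}{p(X)}\right] = -E[\log_2 p(X)]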
Joint Entropy
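For a pair of discrete random variables (X, Y) with joint pmf p(x,y):

H(X,Y) = -\sum_{x,y} p(x,y) \log_2 p(x,y)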
Conditional Entropy
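Conditional entropy is the uncertainty left in Y once X is known, averaged over X; combined with joint entropy it gives the chain rule:

H(Y|X) = -\sum_{x,y} p(x,y) \log_2 p(y|x), \qquad H(X,Y) = H(X) + H(Y|X)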
Relative Entropy or Kullback-Leibler Distance
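The relative entropy (Kullback-Leibler distance) between two pmfs p and q on the same alphabet is

D(p \| q) = \sum_{x} p(x) \log_2 \frac{p(x)}{q(x)}

It is nonnegative and zero if and only if p = q, but it is not a true metric: it is asymmetric and does not satisfy the triangle inequality.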
Mutual Information
Mutual information is the relative entropy between the joint distribution p(x,y) and the product distribution p(x)p(y); it measures how far X and Y are from being independent, and is zero exactly when they are independent.
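In formulas, combining the definitions above:

I(X;Y) = D(p(x,y) \| p(x)p(y)) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X|Y)

As a minimal sketch of how these quantities are computed in practice, here is a small Python version; the function names entropy and mutual_information and the example joint pmf are illustrative, not from the original post:

from math import log2

def entropy(p):
    # Shannon entropy in bits of a pmf given as an iterable of probabilities.
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    # I(X;Y) in bits from a joint pmf given as a dict {(x, y): probability}.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p  # marginal of X
        py[y] = py.get(y, 0.0) + p  # marginal of Y
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Example: a noisy binary channel, so X and Y are dependent but not equal.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(entropy([0.5, 0.5]))        # 1.0 bit
print(mutual_information(joint))  # about 0.278 bits

Because the example channel is noisy, the mutual information falls strictly between 0 (independent X and Y) and 1 bit (a noiseless binary channel).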
Capacity
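Channel capacity is the maximum of the mutual information over all input distributions p(x):

C = \max_{p(x)} I(X;Y)

For example, for a binary symmetric channel with crossover probability p the maximum is attained by a uniform input and equals C = 1 - H_b(p), where H_b(p) = -p \log_2 p - (1-p) \log_2 (1-p) is the binary entropy function.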