Friday, January 24, 2025

Some Facts About CRC


  Cyclic Redundancy Checks (CRCs) are the most widely used error detection mechanism in everyday life. Whenever you transmit and receive data, or store and retrieve data, you need to make sure errors are not introduced, or at least know when they happen. CRCs are an effective mechanism for achieving this and are easy to implement and use. They are used in network communications (wired networks such as Ethernet, wireless networks such as WLAN and cellular), storage devices (hard drives, SSDs, CDs, DVDs), file compression formats such as zip and rar, and many more applications too numerous to list here. Literally trillions of CRCs are computed on earth every second. As I implemented CRC at my job, I got curious about some aspects of it, which led to writing this blog. While it is very easy to describe CRC as a simple remainder generation scheme using a linear feedback shift register (LFSR), I couldn't help but notice that CRC is a linear block code and that an LFSR is an IIR filter, which is basically an LTI system.

  • So if CRC is considered a linear block code (LBC), how does it fit in the LBC theory?
  • What is its generator matrix (GM)? Where does the polynomial fit in the picture? 
  • Why is a feedback structure used for code generation? 
  • Why is it used only for error detection and not error correction? 
  • What is its error detection capability? 
  • What kind of polynomials are used in CRC generation?
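To make the remainder-via-LFSR view concrete, here is a minimal sketch (my own illustration, not code from the post) of the common bit-by-bit reflected CRC-32 computation; each inner-loop step corresponds to one shift of the LFSR:

```python
def crc32(data: bytes) -> int:
    """Bit-by-bit reflected CRC-32 (polynomial 0xEDB88320)."""
    crc = 0xFFFFFFFF                     # standard initial register value
    for byte in data:
        crc ^= byte                      # feed the next 8 message bits in
        for _ in range(8):
            if crc & 1:                  # feedback tap: LSB decides whether
                crc = (crc >> 1) ^ 0xEDB88320  # to XOR in the polynomial
            else:
                crc >>= 1
    return crc ^ 0xFFFFFFFF              # standard final XOR

# Well-known check value: CRC-32 of b"123456789" is 0xCBF43926.
```

The same structure works for any CRC polynomial; only the width, polynomial, reflection, and initial/final XOR parameters change.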

Get the answers here

Saturday, November 2, 2024

Generator Matrices for Popular LDPC Codes


 LDPC codes are the most important Shannon-limit-approaching error correction codes in use today, the most popular deployments being cellular (5G NR) and WiFi (5 and later). When I examined the LDPC codes in these standards, the first thing that caught my eye was that they were specified using H (the parity check matrix, or PCM) instead of G (the generator matrix, or GM). Moreover, these are systematic codes, yet they do not fit the PCM format of a standard systematic linear block code. But as per the technical specifications, they are systematic codes. So how is this possible? What do their GMs look like?
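For reference, in a standard systematic linear block code the PCM and GM come in matched forms H = [Pᵀ | I] and G = [I | P] over GF(2), which is exactly the format the standardized LDPC PCMs do not follow. A toy sketch of that standard pairing (my own illustration using a small (7,4) code, not a matrix from the standards):

```python
import numpy as np

# Standard systematic pairing over GF(2): G = [I_k | P], H = [P^T | I_{n-k}].
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])      # 4 x 7 generator matrix
H = np.hstack([P.T, np.eye(3, dtype=int)])    # 3 x 7 parity check matrix

# Every row of G is a valid codeword, so G H^T must vanish mod 2.
assert np.all((G @ H.T) % 2 == 0)

msg = np.array([1, 0, 1, 1])
codeword = (msg @ G) % 2      # systematic: first 4 bits are the message itself
assert np.all((H @ codeword) % 2 == 0)
```

Given H in this form, G can be read off directly; the question the post explores is what happens when H is specified in a different (non-systematic-looking) structure, as in 5G NR and WiFi.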


Get the answers here

Tuesday, March 29, 2022

Basics of Information Theory

 

Entropy
Entropy definition

Entropy as expectation
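For reference, the definition and its expectation form in standard textbook notation (my addition, not a formula from the post):

```latex
H(X) = -\sum_{x} p(x) \log_2 p(x)
     = \mathbb{E}\left[ \log_2 \frac{1}{p(X)} \right]
```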


Joint Entropy


Conditional Entropy


Relative Entropy or Kullback-Leibler Distance


Mutual Information

Mutual information is the relative entropy between the joint distribution p(x,y) and the product distribution p(x)p(y); intuitively, it measures how far X and Y are from being independent.
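In symbols (standard textbook form, my addition):

```latex
I(X;Y) = D\bigl(p(x,y)\,\|\,p(x)\,p(y)\bigr)
       = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}
```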



Capacity