Entropy basics and entropy of Markoff sources

This page describes entropy basics and presents the entropy equation for Markoff sources. It also explains the difference between bits, decits, and nats.

• The information content of a message is high when its probability of occurrence is low.
• If the probability of occurrence of a message is p, then its information content I is given by the equations below.

Difference between Bits vs Decit vs Nat

The following equations express the information content I in bits, decits, and nats; the unit depends on the base of the logarithm.
I = log2(1/p) ... bits
I = log10(1/p) ... decits (Hartleys)
I = ln(1/p) ... nats
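As a quick illustration, here is a minimal Python sketch of these three formulas; the probability p = 1/8 is an arbitrary example value.

import math

p = 1/8  # example message probability (assumed value)

I_bits  = math.log2(1/p)   # 3.0 bits
I_decit = math.log10(1/p)  # ~0.903 decits (Hartleys)
I_nat   = math.log(1/p)    # ~2.079 nats

print(I_bits, I_decit, I_nat)

Since the three measures differ only in logarithm base, 1 nat = log2(e) ~ 1.443 bits and 1 decit = log2(10) ~ 3.322 bits.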

Entropy basics

• The average information content per symbol in a group of symbols is known as entropy and is represented by H.
• If there are M symbols with probabilities of occurrence p1, p2, ..., pi, ..., pM, then the entropy H is expressed as follows:
H = SUM(from i=1 to M) pi * log2(1/pi) bits/symbol

Entropy is maximum when all M symbols are equally likely (pi = 1/M for every i), giving Hmax = log2(M) bits/symbol.
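The following Python sketch evaluates this definition; the four-symbol probability distribution is an arbitrary example.

import math

def entropy(probs):
    # H = SUM pi * log2(1/pi), in bits/symbol; symbols with pi = 0 contribute nothing
    return sum(p * math.log2(1/p) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol

# Equiprobable symbols give the maximum entropy Hmax = log2(M)
M = 4
print(entropy([1/M] * M), math.log2(M))    # both print 2.0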

If a data source emits symbols at a rate rs (symbols/sec), then the source information rate R is given by R = rs * H (bits/sec).
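For example, assuming a source with H = 2 bits/symbol emitting 1000 symbols/sec (both example values):

H = 2.0     # entropy in bits/symbol (assumed value)
rs = 1000   # symbol rate in symbols/sec (assumed value)
R = rs * H
print(R)    # 2000.0 bits/sec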

Entropy of Markoff Sources

The entropy of a Markoff source is expressed as follows:

H = SUM(from i=1 to M) pi * SUM(from j=1 to M) pij * log2(1/pij) bits/symbol

where pi is the probability that the source is in state i, and pij is the probability of a transition from state i to state j.

A Markoff source is a source that emits symbols dependently, i.e., the probability of each emitted symbol depends on the current state (the previously emitted symbols) of the source.
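A minimal Python sketch of the Markoff-source entropy formula; the two-state transition matrix P and its stationary state probabilities pi are assumed example values.

import math

# Example two-state transition matrix (row i holds pij for all j); assumed values
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Stationary state probabilities satisfying pi = pi * P for this particular P:
# 0.8*0.9 + 0.2*0.4 = 0.8 and 0.8*0.1 + 0.2*0.6 = 0.2
pi = [0.8, 0.2]

# H = SUM_i pi * SUM_j pij * log2(1/pij), in bits/symbol
H = sum(pi[i] * sum(P[i][j] * math.log2(1/P[i][j])
                    for j in range(len(P[i])) if P[i][j] > 0)
        for i in range(len(P)))
print(H)  # ~0.569 bits/symbol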

Useful Links

echo canceller
exchange signaling types
EPABX basics
PDH vs SDH
CAS vs CCS
Erlang/Grade of Service
