
Suppose that $\{X_n, Y_n\}$ is a random process with a discrete alphabet, that is, taking on values in a discrete set, where $n$ runs over the data samples. The two components correspond to the input and output of a communication process. Assuming $Y$ to be the discretized output, what is meant by Shannon's source entropy for a given discretization bin $k$: $H_{\text{source}} = \lim_{k \rightarrow \infty} \frac{1}{k} H_k$, where $H_k$ stands for the Shannon entropy?
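Here is a minimal sketch of how I currently read that definition (my own illustration, not code from either paper; the function and variable names are mine): estimate the Shannon entropy $H_k$ of length-$k$ blocks of the discretized sequence and look at $H_k / k$ as $k$ grows.

```python
import numpy as np
from collections import Counter

def block_entropy(symbols, k):
    """Shannon entropy H_k (in bits) of the empirical distribution of length-k blocks."""
    blocks = [tuple(symbols[i:i + k]) for i in range(len(symbols) - k + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# A fair-coin i.i.d. sequence as a stand-in for the discretized output Y;
# H_k / k should approach the single-symbol entropy H(Y) = 1 bit.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=100_000)
for k in (1, 2, 4, 8):
    print(k, block_entropy(y, k) / k)
```

For this i.i.d. example $H_k / k$ stays near 1 bit per symbol, but I am not sure this is what the paper means.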

I found this definition in the paper by Stojanovski et al., "Applications of Symbolic Dynamics in Chaos Synchronization." Jun Chen also mentions source entropy, but the formula is very different: it is the same as Shannon's entropy.

So, what exactly is source entropy? Is it the entropy of $X$ or $Y$ or is it the same as Shannon's entropy?

  • I haven't read the sources you cite, but the definition you quoted appears to be the definition of [entropy rate](https://en.wikipedia.org/wiki/Entropy_rate). For a process where the random variables are i.i.d., the entropy rate equals the entropy of any of the random variables in the process.
    – Néstor
    Commented Jul 30, 2012 at 21:19
  • In the link you provided, does $n$ mean the number of data samples or the number of quantization levels? Also, will the source entropy of $X$ not be equal to the source entropy of $Y$, or would they be the same, since $Y$ is just a discretization of $X$?
    – chk
    Commented Jul 30, 2012 at 22:15
  • The idea in the definition of entropy rate is that the random variables $X_n$ form a stochastic process. Therefore, for example, $H(X_1,X_2,\ldots,X_n)$ is the entropy of the joint distribution $p(X_1,X_2,\ldots,X_n)$ (i.e., the so-called joint entropy). For i.i.d. random variables, $H(X_1,X_2,\ldots,X_n)=H(X_1)+H(X_2)+\cdots+H(X_n)=nH(X_i)$, because the joint distribution factors into a product of the individual distributions and the logarithm turns that product into a sum; here $i$ can be any of $1,2,\ldots,n$ (that's the result I claimed in my first comment). A short numerical check of this additivity appears after these comments.
    – Néstor
    Commented Jul 30, 2012 at 23:12
  • So, when implementing entropy, should I use the joint probability or the probability of occurrence of each symbol? Also, it is still unclear to me whether I need to calculate the entropy of the discretized variable $Y$ or of $X$ (the raw signal) when the task is to determine the information capacity, i.e., the number of bits and hence the number of quantization bins the channel can accommodate without error.
    – chk
    Commented Jul 31, 2012 at 0:11
  • The link to Applications of Symbolic Dynamics in Chaos Synchronization did not work for me.
    – Galen
    Commented Jun 29, 2021 at 14:17
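
As a quick numerical check of the additivity mentioned in the comments above (my own sketch, not from the cited papers): for independent variables the joint entropy equals the sum of the marginal entropies, which is what makes $H(X_1,\ldots,X_n) = nH(X_1)$ for an i.i.d. process and lets the entropy rate collapse to the single-variable entropy.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Two independent marginal distributions; their joint pmf is the outer product.
px = np.array([0.5, 0.25, 0.25])
py = np.array([0.7, 0.3])
joint = np.outer(px, py)

print(entropy(px) + entropy(py))  # sum of the marginal entropies
print(entropy(joint.ravel()))     # joint entropy -- the two agree for independent variables
```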
