Questions tagged [information-theory]
The science of compressing and communicating information. It is a branch of applied mathematics and electrical engineering. Though originally the focus was on digital communications and computing, it now finds wide use in biology, physics and other sciences.
2,580 questions
4 votes · 0 answers · 50 views
Optimal signaling to recover an element’s position in a k-set
Given integers $n$ and $k$, Alice is given $k$ numbers $1 \le a_1 < a_2 < \cdots < a_k \le n$. She then writes down a message $x\ (1 \le x \le m)$. Bob is given the message $x$ and one ...
0 votes · 0 answers · 40 views
Generator matrix / linear block code [closed]
Question
A $(6,3)$ systematic block code has the following generator matrix:
$$
A =
\begin{pmatrix}
1 & 1 & 0 & 1 & 0 & 0 \\
0 & 1 & 1 & 0 & 1 & 0 \\
0 & ...
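The matrix in the excerpt is cut off, so here is a minimal sketch of how a $(6,3)$ code in the form $[P \mid I_3]$ can be explored: only the first two rows below come from the question; the third row is a hypothetical stand-in added purely for illustration.

```python
import itertools
import numpy as np

# Hypothetical (6,3) generator matrix in the form [P | I3]; the first two rows
# match the excerpt above, the third row is made up for illustration only.
G = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])   # third row is an assumption

codewords = []
for m in itertools.product([0, 1], repeat=3):   # all 2^3 message vectors
    c = np.mod(np.array(m) @ G, 2)              # encode over GF(2)
    codewords.append(c)
    print(m, '->', c)

# For a linear code, minimum distance = smallest Hamming weight of a nonzero codeword.
d_min = min(int(c.sum()) for c in codewords if c.any())
print('minimum distance:', d_min)
```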
0 votes · 1 answer · 51 views
Can mutual information be defined between a random variable and a conditional distribution?
Quantities like mutual information $I$, entropy $H$, etc. are typically defined as taking random variables as input. However, they are actually just functions on probability distributions - e.g. the ...
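A minimal numerical sketch of the point being made: mutual information only needs a probability table, not random-variable objects. The 2x3 joint distribution below is made up for illustration.

```python
import numpy as np

# A made-up 2x3 joint distribution P(x, y); any nonnegative table summing to 1 works.
P = np.array([[0.20, 0.10, 0.05],
              [0.05, 0.25, 0.35]])

px = P.sum(axis=1, keepdims=True)   # marginal of X, shape (2, 1)
py = P.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, 3)

# I(X;Y) = sum_{x,y} P(x,y) * log2( P(x,y) / (P(x)P(y)) ), with 0 log 0 := 0
mask = P > 0
I = np.sum(P[mask] * np.log2(P[mask] / (px @ py)[mask]))
print('I(X;Y) =', I, 'bits')
```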
5 votes · 0 answers · 83 views
How can we rigorously bound the performance of the Fano code?
There are estimates on the expected codeword length of the Fano symbol code (not to be confused with the Shannon code), but I don't know where they come from. Some definitions:
Let $\mathcal{X}$ be a ...
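For readers who want to experiment with the quantity being bounded, here is a sketch of Fano's binary-splitting construction (assuming that is the "Fano code" meant, as the question's contrast with the Shannon code suggests), so the expected codeword length can be compared against the entropy on concrete distributions. The distribution at the bottom is made up; this is not a proof of any bound.

```python
import numpy as np

def fano_code(probs):
    """Fano's binary splitting: returns {symbol index: codeword string}."""
    code = {}

    def split(idx, prefix):
        if len(idx) == 1:
            code[idx[0]] = prefix or '0'
            return
        total = sum(probs[i] for i in idx)
        run, best_k, best_gap = 0.0, 1, float('inf')
        for k in range(1, len(idx)):
            run += probs[idx[k - 1]]
            gap = abs(2 * run - total)    # how unbalanced this split is
            if gap < best_gap:
                best_gap, best_k = gap, k
        split(idx[:best_k], prefix + '0')   # higher-probability half gets prefix 0
        split(idx[best_k:], prefix + '1')

    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    split(order, '')
    return code

p = [0.35, 0.25, 0.15, 0.15, 0.10]           # made-up source distribution
code = fano_code(p)
L = sum(p[i] * len(code[i]) for i in code)   # expected codeword length
H = -sum(q * np.log2(q) for q in p)          # source entropy in bits
print(code, 'E[L] =', L, 'H =', H)
```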
0 votes · 0 answers · 70 views
Where can I find more information about 'specific information'?
I am looking to borrow a concept from information theory. Namely, the mutual information between two random variables $X, Y$ when one of the random variables is fixed, e.g. $Y = y$:
$$
I(X ; Y = y)
$$...
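Several related quantities go by the name "specific information" in the literature. The sketch below computes one common version, $I(X; Y{=}y) = D_{\mathrm{KL}}(P_{X\mid Y=y} \,\|\, P_X)$, on a made-up joint table, and checks that its average over $y$ recovers the ordinary mutual information.

```python
import numpy as np

# Made-up joint distribution P(x, y): 3 values of X (rows), 2 values of Y (columns).
P = np.array([[0.30, 0.05],
              [0.10, 0.20],
              [0.10, 0.25]])

px = P.sum(axis=1)   # P(X)
py = P.sum(axis=0)   # P(Y)

specific = []
for j, y_prob in enumerate(py):
    p_x_given_y = P[:, j] / y_prob
    # One common convention: I(X; Y=y) = KL( P(X|Y=y) || P(X) ), always >= 0
    i_y = np.sum(p_x_given_y * np.log2(p_x_given_y / px))
    specific.append(i_y)
    print(f'I(X; Y=y{j}) = {i_y:.4f} bits')

# Averaging over y recovers the ordinary mutual information I(X;Y).
print('I(X;Y) =', sum(py[j] * specific[j] for j in range(len(py))))
```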
0 votes · 1 answer · 65 views
What is the formal definition of information, in the context of information theory? [closed]
I have asked what information is in both the physics and philosophy stack exchange, but now I am asking it here, in hopes of getting a mathematical perspective on the question. In the mathematical ...
-1 votes · 1 answer · 89 views
In the mathematical field of information theory, are there rigorous definitions of signal and noise?
In the mathematical field of information theory, there are the notions of signal and noise. Informally, signal is useful or meaningful information, and noise is unwanted or useless information. But I ...
0 votes · 0 answers · 102 views
Computing $P$ from $H$ in $H = - P \log_2 P$
I am currently working through an example of a smoothing technique for information. I have finished applying the technique to my data and now want to figure out the probability of the item after it’s ...
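Inverting $H = -P\log_2 P$ has no elementary closed form, but writing $u = \ln P$ gives $u e^{u} = -H\ln 2$, so $P = e^{W(-H\ln 2)}$ with $W$ the Lambert W function; for $0 < H < 1/(e\ln 2) \approx 0.531$ bits there are two real solutions, one on each side of $1/e$. A minimal sketch using scipy.special.lambertw:

```python
import numpy as np
from scipy.special import lambertw

def p_from_h(h_bits):
    """Return the two real solutions P of H = -P*log2(P),
    valid for 0 < H <= 1/(e*ln 2) ~ 0.5307 bits."""
    z = -h_bits * np.log(2.0)                # u*e^u = z, where u = ln P
    p_low  = np.exp(lambertw(z, k=-1).real)  # lower branch  -> P <= 1/e
    p_high = np.exp(lambertw(z, k=0).real)   # principal branch -> P >= 1/e
    return p_low, p_high

for p in p_from_h(0.3):
    print(p, -p * np.log2(p))   # both solutions reproduce H ~ 0.3 bits
```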
2 votes · 0 answers · 42 views
Is conditional mutual information $I(X;Y\mid Z)$ lower semicontinuous for general $X,Y,Z$?
Assume $(X_n,Y_n,Z_n)\Rightarrow (X,Y,Z)$ weakly on standard Borel spaces. Is it always true that
$$I(X;Y\mid Z)\ \le\ \liminf_{n\to\infty} I(X_n;Y_n\mid Z_n)?$$
It is classical that relative entropy $...
3 votes · 2 answers · 223 views
Isn't Entropy Functional Continuous?
We know that, if $P_n$ is the (discrete) uniform distribution on $\{0,\frac{1}{n},\frac{2}{n},\dots, 1\}$, and if $f$ is the (continuous) uniform distribution on $[0,1]$, then $P_n\stackrel{d}{\to} f$....
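A quick numerical illustration of the discontinuity this question is pointing at: the discrete entropy $H(P_n) = \log_2(n+1)$ diverges, even though $P_n \Rightarrow \mathrm{Uniform}[0,1]$, whose differential entropy is $0$ (note the two are different functionals).

```python
import numpy as np

# P_n is uniform on the n+1 points {0, 1/n, ..., 1}; its Shannon entropy is
# log2(n+1), which grows without bound even though P_n converges in
# distribution to Uniform[0,1], whose differential entropy is 0.
for n in (10, 100, 1000, 10_000):
    p = np.full(n + 1, 1.0 / (n + 1))
    H = -np.sum(p * np.log2(p))
    print(n, H, np.log2(n + 1))
```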
1 vote · 1 answer · 82 views
Regarding an axiom of the entropy function
I'm currently attending an introductory course on information theory given by a very famous mathematician who is undeniably an expert in the field. When explaining the axioms of said entropy function, ...
2 votes · 0 answers · 42 views
Can KL divergence be written as mutual information under some special construction?
Mutual information is itself a Kullback–Leibler (KL) divergence:
$$
I(X;Y) = D_{\mathrm{KL}}\big(P_{XY} \,\big\|\, P_X P_Y\big),
$$
that is, the KL divergence between the joint distribution and the product ...
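A small numerical check of the displayed identity on a made-up 2x2 joint table, using scipy.stats.entropy (which returns a KL divergence when given two distributions):

```python
import numpy as np
from scipy.stats import entropy

# Made-up joint distribution of (X, Y) as a 2x2 table.
P = np.array([[0.40, 0.10],
              [0.15, 0.35]])

px = P.sum(axis=1)
py = P.sum(axis=0)
prod = np.outer(px, py)                 # product of the marginals

# KL between the joint and the product of marginals ...
kl = entropy(P.ravel(), prod.ravel(), base=2)

# ... equals the mutual information computed directly from the table.
I = sum(P[i, j] * np.log2(P[i, j] / (px[i] * py[j]))
        for i in range(2) for j in range(2))
print(kl, I)   # the two numbers agree
```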
0 votes · 0 answers · 38 views
Entropy of observed counts given expected probability
Suppose I have a set of symbols with expected probabilities for each, and a set of $n$ observed sequences of these symbols, each of length $m$. From simply looking over the array of observed counts of X ...
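The excerpt is cut off, so this is only one plausible reading: a sketch that scores how far the observed counts sit from the expected probabilities via the KL divergence of the empirical frequencies from the expected distribution (up to a factor, the G-test statistic). The counts and probabilities below are made up.

```python
import numpy as np

# Hypothetical expected symbol probabilities and observed counts from one sequence.
expected = np.array([0.5, 0.3, 0.2])
counts   = np.array([140, 80, 30])     # a sequence of length m = 250

freq = counts / counts.sum()           # empirical frequencies
# KL(empirical || expected) is 0 exactly when the counts match the expected proportions.
kl = np.sum(freq * np.log2(freq / expected))
print('KL =', kl, 'bits;  G statistic =', 2 * counts.sum() * kl * np.log(2))
```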
12 votes · 2 answers · 419 views
Why does this process produce the binary entropy distribution? Random point from randomly chosen interval
Draw two independent random variables $A$ and $B$ each from ${\rm Uniform}(0,1)$. Then draw a sample $x$ from ${\rm Uniform}(A,B)$ or ${\rm Uniform}(B,A)$, depending on which of $A$ and $B$ is larger. ...
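A quick Monte Carlo check of the claim in the title: the histogram of $x$ matches the density $f(t) = -2\big(t\ln t + (1-t)\ln(1-t)\big)$, i.e. twice the binary entropy function in nats.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
a = rng.uniform(size=N)
b = rng.uniform(size=N)
x = rng.uniform(np.minimum(a, b), np.maximum(a, b))   # point from the random interval

# Compare the empirical density with f(t) = -2*( t*ln t + (1-t)*ln(1-t) ),
# i.e. twice the binary entropy function measured in nats.
hist, edges = np.histogram(x, bins=20, range=(0.0, 1.0), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])                  # bin centers
pred = -2.0 * (mid * np.log(mid) + (1.0 - mid) * np.log(1.0 - mid))
for t, emp, f in zip(mid, hist, pred):
    print(f'{t:.3f}  empirical {emp:.3f}  predicted {f:.3f}')
```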
0 votes · 1 answer · 71 views
Bound on 'absolute' mutual information.
I have random variables $X = (X_1,...,X_n)$, and $Y = (Y_1,...,Y_m)$, where $X_i \in \{0,1\}$ and $Y_i \in [0,1]$. The entries of $Y$ are independent, but the entries of $X$ are not, although $P(X|Y) =...