
Questions tagged [information-theory]

The science of compressing and communicating information. It is a branch of applied mathematics and electrical engineering. Though originally the focus was on digital communications and computing, it now finds wide use in biology, physics and other sciences.

4 votes
0 answers
50 views

Given integers $n$ and $k$, Alice is given $k$ numbers $1 \le a_1 < a_2 < \cdots < a_k \le n$. She then writes down a message $x\ (1 \le x \le m)$. Bob is given the message $x$ and one ...
Dinshey • 595
0 votes
0 answers
40 views

A $(6,3)$ systematic block code has the following generator matrix: $$ A = \begin{pmatrix} 1 & 1 & 0 & 1 & 0 & 0 \\ 0 & 1 & 1 & 0 & 1 & 0 \\ 0 & ...
qwerty • 1
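
A quick way to sanity-check exercises like this is to enumerate all $2^3$ codewords and read off the minimum distance. A minimal Python sketch; since the third row of the question's matrix is truncated above, the `G` below is a hypothetical systematic stand-in, not the question's matrix:

```python
import numpy as np
from itertools import product

# Hypothetical (6,3) systematic generator matrix (identity in the last
# three columns); only the first two rows match the excerpt above.
G = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

# All 2^3 messages and their codewords over GF(2).
codewords = [(np.array(m) @ G) % 2 for m in product([0, 1], repeat=3)]

# For a linear code, minimum distance = minimum weight of a nonzero codeword.
d_min = min(int(c.sum()) for c in codewords if c.any())
print(d_min)
```
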
0 votes
1 answer
51 views

Quantities like mutual information $I$, entropy $H$, etc. are typically defined as taking random variables as input. However, they are actually just functions on probability distributions - e.g. the ...
lilsquirrel
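
That view is easy to make concrete: a sketch of $I(X;Y)$ computed purely as a functional of a joint pmf given as a matrix (the function name and the convention $0 \log 0 = 0$ are my own choices):

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in nats, as a functional of the joint pmf alone."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)  # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)  # marginal of Y
    mask = p_xy > 0                        # 0 log 0 = 0 convention
    return float((p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])).sum())

# Perfectly correlated bits: I = log 2 nats.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))
```
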
5 votes
0 answers
83 views

There are estimates on the expected codeword length of the Fano symbol code (not to be confused with the Shannon code), but I don't know where they come from. Some definitions: Let $\mathcal{X}$ be a ...
FShrike • 48.2k
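
For readers unfamiliar with it, Fano's procedure is short to state: sort the probabilities in decreasing order, recursively cut the list where the two parts' probability masses are closest to equal, and append 0/1 to the two sides. A minimal sketch of that textbook description (no claim about which expected-length estimate it matches):

```python
def fano_code(probs):
    """Fano's binary splitting: map each symbol index to a codeword string."""
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    codes = {}

    def split(group, prefix):
        if len(group) == 1:
            codes[group[0]] = prefix or "0"
            return
        total = sum(probs[i] for i in group)
        acc, best_k, best_gap = 0.0, 1, float("inf")
        for k in range(1, len(group)):  # cut making the halves most balanced
            acc += probs[group[k - 1]]
            if abs(total - 2 * acc) < best_gap:
                best_k, best_gap = k, abs(total - 2 * acc)
        split(group[:best_k], prefix + "0")
        split(group[best_k:], prefix + "1")

    split(order, "")
    return codes

p = [0.35, 0.25, 0.2, 0.15, 0.05]          # illustrative probabilities
c = fano_code(p)
print(c, sum(p[i] * len(c[i]) for i in c))  # codewords, expected length
```
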
0 votes
0 answers
70 views

I am looking to borrow a concept from information theory. Namely, the mutual information between two random variables $X, Y$ when one of the random variables is fixed, e.g. $Y = y$: $$ I(X ; Y = y) $$...
lilsquirrel
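
Two non-equivalent definitions of this pointwise quantity circulate in the literature (e.g. under the name "specific information"), so it is worth stating both; which one is intended depends on the application:
$$ I_1(X;\,Y=y) = D_{\mathrm{KL}}\big(P_{X\mid Y=y} \,\big\|\, P_X\big), \qquad I_2(X;\,Y=y) = H(X) - H(X \mid Y=y). $$
Both average to $I(X;Y)$ under $y \sim P_Y$; $I_1$ is always nonnegative, while $I_2$ can be negative for particular values of $y$.
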
0 votes
1 answer
65 views

I have asked what information is on both the Physics and Philosophy Stack Exchanges, but now I am asking it here, in hopes of getting a mathematical perspective on the question. In the mathematical ...
user107952 • 24.8k
-1 votes
1 answer
89 views

In the mathematical field of information theory, there are the notions of signal and noise. Informally, signal is useful or meaningful information, and noise is unwanted or useless information. But I ...
user107952 • 24.8k
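
One standard setting where these words get a precise meaning is the additive-noise channel, where the signal is the transmitted variable and the noise is an independent corrupting one:
$$ Y = X + Z, \qquad Z \text{ independent of } X, $$
and how much of the signal survives is measured by the mutual information $I(X;Y)$, with channel capacity $C = \max_{P_X} I(X;Y)$. This is the classical Shannon formalization; other fields use the two words more loosely.
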
0 votes
0 answers
102 views

I am currently working through an example of a smoothing technique for information. I have finished applying the technique to my data and now want to figure out the probability of the item after it’s ...
Roger • 11
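
The excerpt does not say which smoothing technique is in play, but for the common additive (Laplace) case the post-smoothing probability is a one-liner. A minimal sketch, assuming counts over a vocabulary of size $V$ and pseudocount $\alpha$ (all names here are my own):

```python
def smoothed_prob(count, total, vocab_size, alpha=1.0):
    """Additive (Laplace) smoothing: pretend every item was seen
    alpha extra times, then renormalize."""
    return (count + alpha) / (total + alpha * vocab_size)

# Item seen 3 times out of 10 observations, 5 possible items:
print(smoothed_prob(3, 10, 5))  # (3 + 1) / (10 + 5) = 0.2667
```
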
2 votes
0 answers
42 views

Assume $(X_n,Y_n,Z_n)\Rightarrow (X,Y,Z)$ weakly on standard Borel spaces. Is it always true that $$I(X;Y\mid Z)\ \le\ \liminf_{n\to\infty} I(X_n;Y_n\mid Z_n)?$$ It is classical that relative entropy $...
June Kalicharan
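
The classical fact cut off at the end of the excerpt is joint lower semicontinuity of relative entropy: if $\mu_n \Rightarrow \mu$ and $\nu_n \Rightarrow \nu$ weakly, then
$$ D(\mu \,\|\, \nu) \ \le\ \liminf_{n\to\infty} D(\mu_n \,\|\, \nu_n). $$
Since $I(X;Y) = D(P_{XY} \,\|\, P_X \otimes P_Y)$ and products of weakly convergent marginals converge weakly, the unconditional inequality $I(X;Y) \le \liminf_n I(X_n;Y_n)$ follows; the conditional version is what the question asks about.
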
3 votes
2 answers
223 views

We know that, if $P_n$ is the (discrete) uniform distribution on $\{0,\frac{1}{n},\frac{2}{n},\dots, 1\}$, and if $f$ is the (continuous) uniform distribution on $[0,1]$, then $P_n\stackrel{d}{\to} f$....
Ashok • 2,023
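
A detail worth keeping in mind alongside this convergence: the discrete entropies do not converge to the differential entropy. Here
$$ H(P_n) = \log(n+1) \xrightarrow[n\to\infty]{} \infty, \qquad h(f) = -\int_0^1 f(x)\log f(x)\,dx = 0, $$
consistent with the general quantization relation $H(X_{\Delta}) \approx h(X) - \log \Delta$ at bin width $\Delta = 1/n$.
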
1 vote
1 answer
82 views

I'm currently attending an introductory course on information theory given by a very famous mathematician who is undeniably an expert in the field. When explaining the axioms of said entropy function, ...
J.J.T • 1,087
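
For context, the axioms in question are usually some variant of the Shannon–Khinchin or Faddeev list: symmetry, continuity, and the grouping rule
$$ H(p_1,\dots,p_{n-1},q_1,q_2) = H(p_1,\dots,p_{n-1},p_n) + p_n\, H\!\left(\tfrac{q_1}{p_n},\tfrac{q_2}{p_n}\right), \qquad q_1 + q_2 = p_n, $$
which, together with a normalization such as $H(\tfrac12,\tfrac12)=1$, force $H(p_1,\dots,p_n) = -\sum_i p_i \log_2 p_i$ (Faddeev's theorem).
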
2 votes
0 answers
42 views

Mutual information is itself a Kullback–Leibler (KL) divergence: $$ I(X;Y) = D_{\mathrm{KL}}\big(P_{XY} \,\big\|\, P_X P_Y\big), $$ that is, the KL divergence between the joint distribution and the product ...
Kasi • 83
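
Written out for discrete $X, Y$, that divergence is
$$ I(X;Y) = \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}, $$
which makes nonnegativity, symmetry in $X$ and $Y$, and the equivalence $I(X;Y)=0 \iff X \perp Y$ immediate.
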
0 votes
0 answers
38 views

Suppose I have a set of symbols with expected probabilities for each, and a set of $n$ observed sequences of these symbols, each of length $m$. From simply looking over the array of observed counts of X ...
biohacker • 133
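
One standard way to turn "looking over the array of counts" into a single number is the empirical KL divergence between observed frequencies and the expected probabilities (up to a factor of $2N$, this is the $G$-test statistic). A minimal sketch with illustrative data:

```python
import numpy as np

def empirical_kl(counts, expected_probs):
    """D(empirical || expected) in nats; 2 * N * value is the G-test
    statistic, where N is the total count."""
    counts = np.asarray(counts, dtype=float)
    p_hat = counts / counts.sum()
    q = np.asarray(expected_probs, dtype=float)
    mask = p_hat > 0                  # 0 log 0 = 0 convention
    return float((p_hat[mask] * np.log(p_hat[mask] / q[mask])).sum())

# Four symbols, expected uniform, counts pooled from one observed sequence:
print(empirical_kl([30, 25, 25, 20], [0.25, 0.25, 0.25, 0.25]))
```
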
12 votes
2 answers
419 views

Draw two independent random variables $A$ and $B$ each from ${\rm Uniform}(0,1)$. Then draw a sample $x$ from ${\rm Uniform}(A,B)$ or ${\rm Uniform}(B,A)$, depending on which of $A$ and $B$ is larger. ...
Akiva Weinberger
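
The generative process in the excerpt is easy to simulate, which is often the fastest way to build intuition before attacking such a problem analytically. A minimal sketch of exactly that process (the question's actual goal is cut off above):

```python
import random

def draw():
    """A, B ~ Uniform(0,1) independent; x uniform between them,
    i.e. from Uniform(min(A, B), max(A, B))."""
    a, b = random.random(), random.random()
    x = random.uniform(min(a, b), max(a, b))
    return a, b, x

for a, b, x in (draw() for _ in range(5)):
    print(f"A={a:.3f}  B={b:.3f}  x={x:.3f}")
```
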
0 votes
1 answer
71 views

I have random variables $X = (X_1,...,X_n)$, and $Y = (Y_1,...,Y_m)$, where $X_i \in \{0,1\}$ and $Y_i \in [0,1]$. The entries of $Y$ are independent, but the entries of $X$ are not, although $P(X|Y) =...
Ollie • 119
