!split
===== Entropies and density matrices =====

We discuss first the classical information entropy.
Thereafter we define the quantum-mechanical quantity, commonly known as the von Neumann entropy.

!split
===== Shannon information entropy =====
!split
===== Mathematics of entropy =====

What is the basic idea of the entropy as it is used in information
theory (we leave out the standard description from statistical physics
here)?

We want to have a measure of unlikelihood. Consider a simple binary
system with two outcomes, true and false, with true given by a
probability $p$ and false given by $1-p$. Since $p$ represents a
probability, and assuming the probabilities are properly normalized,
we have $p\in [0,1]$.

Suppose now that we know $p$ precisely, for example we could set
$p=1$. This means that there is only one outcome, namely the true one,
and the probability of false is zero. We thus know (exactly) which state the
system is in. If we set $p=0$, then we know that the outcome is
false, without doubt. Can we find a way, through a specific
function, to say with a given certainty that the system is in a
specific state? Let us call this function $S$.

!split
===== More on the mathematics of entropy =====

We could for example use
!bt
\[
S(x)= -\log_2{p(x)},
\]
!et
as a potential function. Note that if we change the base of the logarithm, we only change the result by a constant scaling factor.

When $p(x)=1$, then $S(x)=0$ and we could say that the
probability of being in one specific state is uniquely defined since
$S$ is then zero. This function is also continuous, it is additive
(which is very useful if we have independent and identically
distributed stochastic variables), and it is large for unlikely events,
since it grows without bound as $p$ goes to zero (an unlikely event).
It is also a non-negative function.
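
A small numerical sketch (not part of the original notes; the function name surprisal is our own illustrative choice) makes these properties concrete: the quantity $-\log_2{p}$ vanishes at $p=1$, grows without bound as $p\to 0$, and is additive for independent events since $-\log_2{(pq)}=-\log_2{p}-\log_2{q}$.
!bc pycod
import numpy as np

def surprisal(p):
    """Return -log2(p), the information content of an outcome with probability p."""
    return -np.log2(p)

# Zero surprisal for a certain outcome
print(surprisal(1.0))            # 0.0
# Large surprisal for a very unlikely outcome
print(surprisal(1e-6))           # about 19.93 bits
# Additivity for independent events with probabilities p and q
p, q = 0.5, 0.25
print(surprisal(p * q), surprisal(p) + surprisal(q))  # both equal 3.0
!ec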
!split
===== Changing the expression =====

We note however that if we go back to our binary model and wish to
describe the entropy in terms of the variable $p$, we run into a problem when the false outcome is certain:
the latter having probability one means $p=0$, and $-\log_2{p}$ then diverges even though we know the state exactly.
Using that
!bt
\[
\lim_{x\to 0^+} -x \log_2{x}=0,
\]
!et
we can define the classical information entropy as
!bt
\[
S(x) = -p(x) \log_2{p(x)}.
\]
!et
If we now consider a series of stochastic variables
$X=\{x_0,x_1,\dots,x_{n-1}\}$ with the probability for an outcome $x\in X$ given by $p_X(x)$, the entropy becomes
!bt
\[
S_X = -\sum_{x\in X}p_X(x) \log_2{p_X(x)}.
\]
!et

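The general expression above translates directly into a short function. The sketch below is an illustration rather than part of the original notes (the name shannon_entropy is our own); it computes $S_X$ in bits for a normalized probability distribution, using the convention $0\log_2{0}=0$ that follows from the limit above.
!bc pycod
import numpy as np

def shannon_entropy(probabilities):
    """Shannon entropy in bits of a normalized probability distribution."""
    p = np.asarray(probabilities, dtype=float)
    # The limit p*log2(p) -> 0 as p -> 0+ lets us simply skip zero entries
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A uniform distribution over four outcomes carries two bits of entropy
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A certain outcome carries no entropy
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0
!ec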
!split
===== Binary example =====

If we now use the above binary example, we have (using that $p_1=p$ and $p_2=1-p$)
!bt
\[
S = -\sum_{i=1}^{2}p_i \log_2{p_i}=-p\log_2{p}-(1-p)\log_2{(1-p)}.
\]
!et
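As a quick numerical illustration (a sketch that is not part of the original notes; the helper name binary_entropy is our own), we can evaluate this expression at a few values of $p$, anticipating the discussion below.
!bc pycod
import numpy as np

def binary_entropy(p):
    """Entropy in bits of a binary variable with P(true)=p and P(false)=1-p."""
    terms = [q * np.log2(q) for q in (p, 1.0 - p) if q > 0.0]
    return -sum(terms)

for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"p = {p:4.2f}  S = {binary_entropy(p):.4f} bits")
# S vanishes at p=0 and p=1, and peaks at one bit for p=0.5
!ec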
For $p=0$ we have $S=0$ and for $p=1$ we get $S=0$. Thus, for the two
uniquely defined states we have entropy zero and precise knowledge of the state. The maximum value is at
$p=0.5$, a situation where our level of knowledge is no
better than tossing a coin. This simple example demonstrates how we
can use the above mathematical expression as a way to say whether we
have a well-defined outcome or not. Our lack of knowledge is expressed