Commit 21f45e4

Update week5.do.txt
1 parent 2083841 commit 21f45e4

1 file changed: +102 -3 lines

doc/src/week5/week5.do.txt

Lines changed: 102 additions & 3 deletions
@@ -300,7 +300,8 @@ $\rho_B$, that is we have for a given probability distribution $p_i$
!split
===== Entropies and density matrices =====

We discuss first the classical information entropy.
Thereafter we define its quantum-mechanical analogue, commonly known as the Von Neumann entropy.

!split
===== Shannon information entropy =====
@@ -320,9 +321,89 @@ Why this expression? What does it mean?

!split
===== Mathematics of entropy =====

What is the basic idea of the entropy as it is used in information
theory (we leave out the standard description from statistical physics
here)?

We want to have a measure of unlikelihood. Consider a simple binary
system with two outcomes, true and false, with true given by a
probability $p$ and false given by $1-p$. Since $p$ represents a
probability, and assuming the probabilities are properly normalized,
we have $p\in [0,1]$.

Suppose now that we know $p$ precisely; for example, we could set
$p=1$. This means that there is only one outcome, namely the true one,
and the false outcome has probability zero. We thus know (exactly)
which state the system is in. If we set $p=0$, then we know that the
outcome is false, without doubt. Can we find a specific function that
tells us, with a given certainty, whether the system is in a specific
state? Let us call this function $S$.

!split
===== More on the mathematics of entropy =====

We could for example use
!bt
\[
S(x)= -\log_2{p(x)},
\]
!et
as a potential function. Note that if we change the base of the
logarithm, we only change the result by a scaling constant.

When $p(x)=1$, then $S(x)=0$ and we could say that the probability of
being in one specific state is uniquely defined since $S$ is then
zero. This function is also continuous, and it is additive (which is
very useful if we have independent and identically distributed
stochastic variables). It is also large for unlikely events, since
when $p$ goes to zero (an unlikely event) it becomes very large, and
it is a non-negative function.
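
These properties are easy to check numerically. Below is a minimal
sketch in Python (assuming NumPy is available; the name _surprisal_ is
our own label for $S(x)=-\log_2{p(x)}$, not a name used in these notes):
!bc pycod
import numpy as np

def surprisal(p):
    """Information content S = -log2(p), in bits, of an outcome with probability p."""
    return -np.log2(p)

# S vanishes for a certain outcome and grows for unlikely ones
for p in [1.0, 0.5, 0.1, 1e-6]:
    print(f"p = {p:8.2g}   S = {surprisal(p):8.3f} bits")

# additivity for independent events: S(p1*p2) = S(p1) + S(p2)
p1, p2 = 0.5, 0.25
print(np.isclose(surprisal(p1 * p2), surprisal(p1) + surprisal(p2)))  # True
!ec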

!split
===== Changing the expression =====

We note however that if we go back to our binary model and would like
to describe the entropy in terms of the variable $p$, we end up with
$p=0$ when the false outcome is certain (it then has probability one),
and $-\log_2{p}$ diverges. Using that
!bt
\[
\lim_{x\to 0^+} -x \log_2{x}=0,
\]
!et
we can define the classical information entropy as
!bt
\[
S(x) = -p(x) \log_2{p(x)}.
\]
!et
If we now consider a stochastic variable $X$ with outcomes
$\{x_0,x_1,\dots,x_{n-1}\}$ and the probability for an outcome $x\in X$
given by $p_X(x)$, the entropy becomes
!bt
\[
S_X = -\sum_{x\in X}p_X(x) \log_2{p_X(x)}.
\]
!et
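
A compact numerical sketch of this definition (again assuming NumPy;
the helper name _shannon_entropy_ is our own choice). Terms with
$p_X(x)=0$ are dropped, which implements the convention $0\log_2{0}=0$
justified by the limit above:
!bc pycod
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete probability distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention 0*log2(0) = 0, from the limit above
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: four equally likely outcomes
print(shannon_entropy([0.5, 0.25, 0.25]))         # 1.5 bits
!ec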

!split
===== Binary example =====

If we now use the above binary example, we have (using $p_1=p$ and $p_2=1-p$)
!bt
\[
S = -\sum_{i=1}^{2}p_i \log_2{p_i}=-p\log_2{p}-(1-p)\log_2{(1-p)}.
\]
!et
For $p=0$ we have $S=0$ and for $p=1$ we get $S=0$. Thus, for the two
uniquely defined states we have zero entropy and precise knowledge of
the state. The maximum value is at $p=0.5$, a situation where our
level of knowledge is no better than tossing a coin. This simple
example demonstrates how we can use the above mathematical expression
to say whether or not we have a well-defined outcome: our lack of
knowledge is expressed by the largest entropy value.
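
These claims can be verified in a few lines of Python (a sketch
assuming NumPy; the grid size and function name are our own choices):
!bc pycod
import numpy as np

def binary_entropy(p):
    """Entropy in bits of a binary variable with P(true) = p."""
    p = np.asarray(p, dtype=float)
    s = np.zeros_like(p)
    inside = (p > 0) & (p < 1)        # S = 0 exactly at p = 0 and p = 1
    q = p[inside]
    s[inside] = -q * np.log2(q) - (1 - q) * np.log2(1 - q)
    return s

p = np.linspace(0.0, 1.0, 101)
s = binary_entropy(p)
print(s[0], s[-1])                    # 0.0 0.0 for the two certain outcomes
print(p[np.argmax(s)], s.max())       # maximum of 1.0 bit at p = 0.5
!ec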

!split
===== Von Neumann entropy =====

@@ -334,6 +415,24 @@ S=-\mathrm{Tr}[\rho\log_2{\rho}].
!et
This is the so-called Von Neumann entropy. How did we arrive at this expression?

!split
===== The density matrix again =====

Let us assume we are studying a specific system $A$. The density matrix,
using an orthonormal basis $\psi_j$, is defined as
!bt
\[
\rho_A=\sum_j\lambda_j\vert \psi_j\rangle\langle \psi_j\vert=\sum_j p_j\vert \psi_j\rangle\langle \psi_j\vert,
\]
!et
and the eigenvalues $\lambda_j$ are the probabilities (or overlap
coefficients) of being in a specific state $\psi_j$.

!split
===== Linking with a new expression for the entropy =====

Since the eigenvalues $\lambda_j$ form a probability distribution, we
can evaluate the trace in the eigenbasis of $\rho_A$ and write the Von
Neumann entropy as the Shannon entropy of the eigenvalues,
!bt
\[
S=-\mathrm{Tr}[\rho_A\log_2{\rho_A}]=-\sum_j\lambda_j\log_2{\lambda_j}.
\]
!et
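
A sketch of this link in Python (assuming NumPy; the examples and the
name _von_neumann_entropy_ are our own): diagonalize the density matrix
and feed the eigenvalues into the Shannon formula.
!bc pycod
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy in bits, S = -Tr[rho log2 rho], via the eigenvalues of rho."""
    lam = np.linalg.eigvalsh(rho)     # rho is Hermitian
    lam = lam[lam > 1e-12]            # drop zero eigenvalues: 0*log2(0) = 0
    return -np.sum(lam * np.log2(lam)) + 0.0   # + 0.0 turns -0.0 into 0.0

# a pure state |0><0| carries zero entropy
rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])
print(von_neumann_entropy(rho_pure))   # 0.0 bits

# the maximally mixed one-qubit state rho = I/2 carries one bit
rho_mixed = 0.5 * np.eye(2)
print(von_neumann_entropy(rho_mixed))  # 1.0 bit
!ec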
!split
===== Two-qubit system and calculation of density matrices and exercise =====
339438
