Monday 19 August 2019

Open Quantum Systems

Probability Theory

$\Omega$, the sample space, is the set of all possible outcomes; a general event is a subset of $\Omega$. The most basic, indivisible event $\omega \in \Omega$ is called an elementary event.
We are usually interested only in a collection of events $\mathcal{A} \subseteq P(\Omega)$, i.e. a set of subsets of $\Omega$. This $\mathcal{A}$ is called a $\sigma$-algebra, and by definition it must satisfy the following conditions:
  1. $\Omega , \emptyset \in \mathcal{A}$. 
  2. For all $A_1 ,A_2 \in \mathcal{A}$, the sets $A_1 \cap A_2$, $A_1 \cup A_2$ and the complement $\Omega \setminus A_1$ are also in $\mathcal{A}$.
  3. If $A_1,A_2,... \in \mathcal{A}$ is a countable family, then $\cup_{n=1} ^\infty A_n \in \mathcal{A}$. This does not follow automatically from the second condition: repeating pairwise unions only ever produces finite unions, and closure under countably infinite unions is genuinely stronger. (For example, the collection of all finite and cofinite subsets of the natural numbers satisfies the first two conditions but not this one.)
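A minimal concrete example (added here, not from the lecture): for a throw of a die one may take
$$\Omega = \{1,2,3,4,5,6\}, \qquad \mathcal{A} = \{\emptyset, \{1,3,5\}, \{2,4,6\}, \Omega\}.$$
This $\mathcal{A}$ contains $\Omega$ and $\emptyset$ and is closed under unions, intersections and complements, so it is a $\sigma$-algebra; it only resolves whether the outcome is even or odd. (Countable unions add nothing new here because $\mathcal{A}$ is finite.)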
A probability measure is a map $\mu : \mathcal{A} \rightarrow [0,1]$ with $\mu(\Omega)=1$, which is additive over countable families of mutually disjoint events, $\mu(\cup_n A_n) = \sum_n \mu(A_n)$.
$\mu (A|B)$ denotes the probability that A happens given that B has happened. If the occurrence of B does not change the likelihood of A, i.e. $\mu(A|B)=\mu(A)$, then A and B are called statistically independent of each other.
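Written out (standard definitions, added here for reference), for $\mu(B)>0$,
$$\mu(A|B)= \frac{\mu(A \cap B)}{\mu(B)}, \qquad A, B \text{ independent} \iff \mu(A \cap B)=\mu(A)\,\mu(B).$$
For two fair dice, for instance, the events "the first die shows 6" and "the second die shows 6" are independent: $\mu(A \cap B)= 1/36 = (1/6)\cdot(1/6)$.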
The $\sigma$-algebra of Borel sets of R is the smallest $\sigma$-algebra which contains all subsets of the form $(-\infty, x), x \in R$. In particular, the Borel $\sigma$-algebra contains all open and closed intervals of the real axis.
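For instance (a standard construction, spelled out here for clarity), open and closed intervals are built from the generating sets using countable intersections and differences:
$$(-\infty, a] = \bigcap_{n=1}^{\infty} \left(-\infty, a + \tfrac{1}{n}\right), \qquad (a,b) = (-\infty,b) \setminus (-\infty,a], \qquad [a,b] = (-\infty,b] \setminus (-\infty,a).$$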

Random variable 

is a map $X: \Omega \rightarrow R$, which assigns to each elementary event $\omega \in \Omega$ a real number $X(\omega)$.
A further condition on the function X is that it must be measurable: for every Borel set $B$, the preimage $X^{-1}(B) = \{\omega \in \Omega : X(\omega) \in B\}$ must belong to $\mathcal{A}$, so that a probability can be assigned to statements of the form $X \in B$.
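This is what makes the probability distribution of X well defined: the measure on $\Omega$ is transported to the real line (a standard fact, added for completeness),
$$\mu_X(B) \equiv \mu\left(X^{-1}(B)\right), \qquad B \text{ a Borel set},$$
so that, for example, the cumulative distribution function is $F_X(x)=\mu_X\big((-\infty,x]\big)=\mu(X \leq x)$.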

Stochastic process

It is a time-dependent random variable, i.e. a family of random variables $X(t)$ indexed by time $t$.
Practically, it is pictured as an ensemble of paths $t \mapsto X(t)$ on the real line, each occurring with some probability.
$P(B_1,t_1; B_2,t_2;...;B_m,t_m) \equiv \mu (X(t_1)\in B_1, X(t_2) \in B_2, ..., X(t_m) \in B_m)$
The LHS is shorthand for the joint probability that the process passes through the Borel sets $B_1, B_2,...,B_m$ at the discrete times $t_1,t_2,...,t_m$.
As further elucidation, note that
$$P(R,t)=1$$
$$P(B_1,t_1; B_2,t_2;...;B_m,t_m) \geq 0$$
$$P(B_1,t_1; B_2,t_2;...;B_m,t_m;R,t_{m+1})=P(B_1,t_1; B_2,t_2;...;B_m,t_m)$$
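These probabilities can be expressed through joint probability densities $p_m$ (the notation used again below for stationary processes), as
$$P(B_1,t_1; B_2,t_2;...;B_m,t_m)= \int_{B_1} dx_1 \int_{B_2} dx_2 \cdots \int_{B_m} dx_m \; p_m(x_m,t_m;...;x_1,t_1).$$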
Note that, in general, the probability for the process to take its next value may depend on its entire past, i.e. the jump probability is a function of the past values of the path. This is not the case in Markovian evolution.

Markov process

A stochastic process with the shortest possible memory: its future depends on the past only through the present value. Precisely,
$$\mu(X(t_{m+1}) \in B \mid X(t_m)=x_m, ... , X(t_1) = x_1)= \mu(X(t_{m+1}) \in B \mid X(t_m)=x_m)$$
where $t_1 < t_2 < ... < t_m < t_{m+1}$. This equation means that the probability for the jump into B depends only on the most recent value $x_m$, not on the earlier history.
Let $T(x,t|x',t') \equiv p_{1|1}(x,t|x',t')$ denote the conditional probability density of finding the process at x at time t, given that it was at x' at the earlier time t'. This object is called the conditional transition probability, or the propagator.
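As an illustration (a minimal sketch, not from the lecture; the two-state chain and its transition probabilities are invented for the example), a Markov process in discrete time: the next value is drawn using only the current value, and the m-step propagator is just the m-th power of the one-step transition matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-step transition matrix: T[i, j] = probability of jumping from state j to state i.
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])

def simulate(n_steps, x0=0):
    """Simulate a path; the next state depends only on the current one (Markov property)."""
    path = [x0]
    for _ in range(n_steps):
        path.append(rng.choice(2, p=T[:, path[-1]]))
    return path

def propagator(m):
    """m-step propagator = product of one-step matrices (discrete analogue of the propagator)."""
    return np.linalg.matrix_power(T, m)

# Compare the m-step propagator with an estimate from simulated paths.
m, n_paths = 5, 20000
counts = np.zeros(2)
for _ in range(n_paths):
    counts[simulate(m)[-1]] += 1
print("estimated  P(X_m = i | X_0 = 0):", counts / n_paths)
print("propagator P(X_m = i | X_0 = 0):", propagator(m)[:, 0])
```

The two printed lines should agree up to sampling noise.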

Chapman-Kolmogorov equation

$$T(x_3,t_3|x_1,t_1)= \int dx_2 T(x_3,t_3|x_2,t_2)T(x_2,t_2|x_1,t_1)$$
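As a check (worked out here; the Wiener process only appears later in these notes), the Gaussian propagator of the Wiener process with diffusion constant set to 1,
$$T(x,t|x',t')= \frac{1}{\sqrt{2\pi (t-t')}} \exp\left[-\frac{(x-x')^2}{2(t-t')}\right],$$
satisfies the Chapman-Kolmogorov equation, because the convolution of two Gaussians is again a Gaussian whose variance is the sum of the two variances:
$$\int dx_2\, T(x_3,t_3|x_2,t_2)\,T(x_2,t_2|x_1,t_1) = \frac{1}{\sqrt{2\pi (t_3-t_1)}} \exp\left[-\frac{(x_3-x_1)^2}{2(t_3-t_1)}\right] = T(x_3,t_3|x_1,t_1).$$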
In the differential form,
$$\frac{\partial}{\partial t}T(x,t|x',t')=A(t)T(x,t|x',t')$$
Here, A(t) is a linear operator: loosely speaking, a "matrix" with continuous indices, acting on functions of the real variable x. Explicitly,
$$A(t)T(x,t|x',t') \equiv \lim_{\Delta t\to 0} \frac {1}{\Delta t}\left[ \int dx'' \, T(x,t+ \Delta t|x'',t)\,T(x'',t|x',t') - T(x,t|x',t')\right]$$
Note that these equations are necessary but not sufficient conditions for a Markov process: there also exist non-Markovian processes whose propagators satisfy the Chapman-Kolmogorov equation.
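For the Wiener propagator written above, the operator A(t) is time independent and reduces to a second derivative (a standard result, quoted here for concreteness), so the differential form becomes the diffusion equation
$$\frac{\partial}{\partial t}T(x,t|x',t')= \frac{1}{2}\frac{\partial^2}{\partial x^2}T(x,t|x',t'),$$
which the Gaussian above indeed solves.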

Stationary and homogeneous stochastic processes

A stochastic process is stationary if all joint probability distributions of the process are invariant under time translation. Explicitly, for any time shift $\tau$,
$$p_m(x_m,t_m +\tau;...;x_1,t_1 + \tau)= p_m(x_m,t_m;...;x_1,t_1)$$
A homogeneous process is one in which the propagator $T(x,t|x',t')$ depends on the two times only through the difference t-t'. So homogeneity is a statement about the propagator alone, while stationarity requires every joint distribution to be invariant in time. An example of a process which is homogeneous but not stationary is the Wiener process, i.e. Brownian motion: its propagator depends only on t-t', but the distribution of X(t) itself keeps spreading, with a variance that grows with time, so it is not stationary.
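A minimal numerical sketch of this (not from the lecture, diffusion constant set to 1): for simulated Brownian paths, the statistics of an increment $X(t)-X(t')$ depend only on $t-t'$, while the variance of $X(t)$ itself grows linearly in t, so the process cannot be stationary.

```python
import numpy as np

rng = np.random.default_rng(1)

n_paths, n_steps, dt = 50000, 200, 0.01
# Brownian increments: independent Gaussians with variance dt.
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)  # X(t) with X(0) = 0, t = dt, 2*dt, ...

# Homogeneous: Var[X(t) - X(t')] depends only on t - t' (both values ~ 0.5 here).
print(np.var(paths[:, 99] - paths[:, 49]),     # lag 0.5, early in the path
      np.var(paths[:, 199] - paths[:, 149]))   # lag 0.5, late in the path

# Not stationary: Var[X(t)] itself grows linearly with t (~ 0.5 and ~ 2.0 here).
print(np.var(paths[:, 49]), np.var(paths[:, 199]))
```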