
11. Lévy-stable distributions
LAST TIME


Much emphasis on Gaussian processes, for which the variances of independent variables add under convolution:

\begin{displaymath}\sigma^2=\langle x^2\rangle-\langle x\rangle^2=\sigma_1^2+\sigma_2^2\end{displaymath}
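As a quick numerical reminder (a minimal NumPy sketch, not part of the original notes; the seed, sample size, and the values of $\sigma_1,\sigma_2$ are arbitrary choices), the variance of the sum of two independent Gaussian variables is indeed $\sigma_1^2+\sigma_2^2$:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)          # arbitrary seed
n = 1_000_000
sigma1, sigma2 = 1.0, 2.0               # arbitrary widths

x = rng.normal(0.0, sigma1, n) + rng.normal(0.0, sigma2, n)

print("sample variance     :", x.var())                  # close to 5.0
print("sigma1^2 + sigma2^2 :", sigma1**2 + sigma2**2)
\end{verbatim}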

Augustin Cauchy and Paul Lévy asked:
What distributions share the property that

\begin{displaymath}P(x)=\int dx_1P_1(x_1)P_2(x-x_1)\end{displaymath}

is of the same "type" as P1 and P2?
Non-Gaussian such distributions Lévy-stable.

Example:
CAUCHY DISTRIBUTION:

\begin{displaymath}P(x)=\frac{b}{\pi[(x-a)^2+b^2]}\end{displaymath}

Using contour integration one finds that

\begin{displaymath}\frac{b_1+b_2}{\pi[(x-a_1-a_2)^2+(b_1+b_2)^2]}\end{displaymath}


\begin{displaymath}=\int_{-\infty}^{\infty}dx_1\frac{b_1b_2}
{\pi^2[(x_1-a_1)^2+b_1^2][(x-x_1-a_2)^2+b_2^2]}\end{displaymath}

i.e. the parameters a and b add under
convolution! However, the Cauchy distribution does not have a finite variance:

\begin{displaymath}\int dx x^2P(x)=\infty\end{displaymath}
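Both properties can be illustrated numerically (a NumPy sketch, not from the notes; it uses the fact that for a Cauchy distribution the median equals a and half the interquartile range equals b; the parameter values and seed are arbitrary):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)          # arbitrary seed
n = 1_000_000

a1, b1 = 1.0, 2.0                       # arbitrary Cauchy parameters
a2, b2 = -0.5, 0.5

x = (a1 + b1 * rng.standard_cauchy(n)) + (a2 + b2 * rng.standard_cauchy(n))

# For a Cauchy(a, b): median = a and interquartile range = 2b
q25, q50, q75 = np.percentile(x, [25, 50, 75])
print("estimated a:", q50,              " expected:", a1 + a2)
print("estimated b:", (q75 - q25) / 2,  " expected:", b1 + b2)

# The sample variance never settles down: <x^2> diverges
print("sample variance (meaningless):", x.var())
\end{verbatim}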



On the other hand, the derivation of the Fokker-Planck equation from the jump moments

\begin{displaymath}a_n=\frac{1}{\delta t}\int P(q-r,t+\delta t\vert q,t)r^ndr\end{displaymath}

assumed that, in the limit $\delta t\rightarrow 0$, $a_1$ and $a_2$ remain finite, while $a_n\rightarrow 0$ for $n>2$.


In probability theory the Fourier transform of P(x) is called the characteristic function

\begin{displaymath}p(k)=\int_{-\infty}^{\infty} P(x) e^{ikx}dx=\langle e^{ikx}\rangle\end{displaymath}


\begin{displaymath}P(x)=\frac{1}{2\pi}\int_{-\infty}^{\infty}p(k)e^{-ikx}dk\end{displaymath}

The last line follows from the properties of the Dirac $\delta$-function:

\begin{displaymath}\int_{-\infty}^{\infty}f(x)\delta(x)dx=f(0)\end{displaymath}


\begin{displaymath}\delta(x)=\frac{1}{2\pi}\int_{-\infty}^{\infty}e^{ikx}dk\end{displaymath}

Not all functions p(k) can serve as characteristic functions.
The normalization condition imposes the constraint

p(0)=1

We must also have

\begin{displaymath}\vert p(k)\vert\leq\int_{-\infty}^{\infty}\vert P(x)\vert\vert e^{ikx}\vert dx=1\end{displaymath}

Requirement that $P(x)\geq 0$ imposes further restrictions on characteristic function.

EXAMPLES:
Characteristic function of Gaussian

\begin{displaymath}P(x)=\frac{\exp (-\frac{x^2}{2\sigma^2})}{\sqrt{2\pi\sigma^2}}\;;\;
p(k)=\exp(-\frac{k^2\sigma^2}{2})\end{displaymath}
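A minimal numerical check of this Gaussian pair (a NumPy sketch, not from the notes; the sample size, seed, and k grid are arbitrary choices): estimate $p(k)=\langle e^{ikx}\rangle$ by Monte Carlo and compare with $\exp(-k^2\sigma^2/2)$.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)          # arbitrary seed
sigma = 1.5                             # arbitrary width
x = rng.normal(0.0, sigma, 500_000)

k = np.linspace(-3.0, 3.0, 7)
p_est = np.exp(1j * np.outer(k, x)).mean(axis=1)   # Monte Carlo estimate of <e^{ikx}>
p_exact = np.exp(-k**2 * sigma**2 / 2)

for kk, pe, px in zip(k, p_est, p_exact):
    print(f"k={kk:+.1f}  estimate={pe.real:+.4f}  exact={px:+.4f}")
\end{verbatim}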

Cauchy distribution

\begin{displaymath}P(x)=\frac{b}{\pi[(x-a)^2+b^2]}\end{displaymath}


\begin{displaymath}p(k)=e^{-b\vert k\vert}\end{displaymath}

The Fourier transform of the convolution integral is

\begin{displaymath}\int_{-\infty}^{\infty}dx_1\int_{-\infty}^{\infty}dx_2P_1(x_1)P_2(x_2)\exp(ik(x_1+x_2))\end{displaymath}


\begin{displaymath}=p(k)=p_1(k)p_2(k)\end{displaymath}

Note that if

\begin{displaymath}p_1=p_2=e^{-b'\vert k\vert}\end{displaymath}

then p(k) will be of the same form, with $b=2b'$.
Here b may be complex, as long as $p^*(k)=p(-k)$, but we must have ${\rm Re}\,b>0$.
How does the Fourier transform of $e^{-b\vert k\vert}$ look?
\begin{figure}
\epsfxsize=420pt
\epsffile{ftlevy.eps}
\end{figure}
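The figure above shows the result; it can also be reproduced by direct numerical quadrature (a NumPy sketch, not from the notes; the truncation of the k integral and the test points are arbitrary). The inverse transform of $e^{-b\vert k\vert}$ comes out as the Cauchy form $b/[\pi(x^2+b^2)]$:

\begin{verbatim}
import numpy as np

b = 1.0
k = np.linspace(0.0, 60.0, 60_001)      # truncate the k integral; e^{-bk} is negligible beyond
dk = k[1] - k[0]

def inverse_ft(x):
    # P(x) = (1/pi) * integral_0^infty e^{-b k} cos(k x) dk, trapezoid rule
    f = np.exp(-b * k) * np.cos(k * x)
    return np.sum((f[1:] + f[:-1]) / 2) * dk / np.pi

for x in [0.0, 0.5, 1.0, 2.0, 5.0]:
    print(f"x={x:3.1f}  numerical={inverse_ft(x):.5f}"
          f"  Cauchy={b / (np.pi * (x**2 + b**2)):.5f}")
\end{verbatim}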
Consider translationally invariant random walks

\begin{displaymath}P(x_2,t_2\vert x_1,t_1)\equiv P(x_2-x_1,t_2-t_1)\end{displaymath}

What is the most general form of P(x,t) that is positive definite and normalized and satisfies the chain condition

\begin{displaymath}P(x,t)=\int dy P(x-y,t-t_1)P(y,t_1)\end{displaymath}

for all intermediate times $t_1$?

A necessary and sufficient condition on P(x,t),
t>0, is that its characteristic function can be written either in the form

\begin{displaymath}p(k,t)=\exp[ikt(v+bc\ln\vert k\vert)-b\vert k\vert t]\end{displaymath}

with $-\pi/2\leq c\leq\pi/2$ or

\begin{displaymath}p(k,t)=\exp(-ivkt-bt\vert k\vert^\alpha[1-i\omega k/\vert k\vert])\end{displaymath}

where v is real, $0\leq\alpha\leq2$, $\alpha\neq 1$, b>0, $\omega$ is real and $\vert\omega\vert\leq\vert\tan(\pi\alpha/2)\vert$.
Theorem due to Khintchine and Lévy, proof in Gnedenko and Kolmogorov [1954].
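In the symmetric case $v=0$, $\omega=0$ the density can be obtained by numerically inverting $p(k,t)=\exp(-bt\vert k\vert^\alpha)$, using $P(x,t)=\frac{1}{\pi}\int_0^\infty \exp(-bt k^\alpha)\cos(kx)dk$. A sketch (NumPy, not from the notes; the choice $\alpha=3/2$, $bt=1$ and the integration grid are illustrative only):

\begin{verbatim}
import numpy as np

alpha, bt = 1.5, 1.0                    # illustrative exponent and scale
k = np.linspace(0.0, 40.0, 40_001)
dk = k[1] - k[0]

def P(x):
    # symmetric Levy-stable density: (1/pi) * integral_0^infty exp(-bt k^alpha) cos(k x) dk
    f = np.exp(-bt * k**alpha) * np.cos(k * x)
    return np.sum((f[1:] + f[:-1]) / 2) * dk / np.pi

for x in [0.0, 1.0, 2.0, 5.0, 10.0]:
    print(f"x={x:4.1f}  P(x)={P(x):.5f}")
# alpha = 2 reproduces a Gaussian, alpha = 1 the Cauchy distribution
\end{verbatim}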

In most cases the inverse Fourier transforms cannot be carried out analytically. An interesting special case is $v=0,\;\alpha=1/2,\;bt=1,\;\omega=1$. One finds

\begin{displaymath}P(x)=\left\{\begin{array}{ll}\frac{\exp(-\frac{1}{2x})}{(2\pi)^{1/2}x^{3/2}}
&\mbox{for }x>0\\ 0&\mbox{for }x<0
\end{array}\right.\end{displaymath}

This function can be normalized, but it has
neither a finite mean nor a finite variance.
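A small numerical check (a NumPy sketch, not from the notes; the grid and the cutoffs X are arbitrary): the integral of P(x) up to a cutoff approaches 1 as the cutoff grows, while the partial first moment keeps growing, illustrating the divergent mean.

\begin{verbatim}
import numpy as np

def P(x):
    # density for v = 0, alpha = 1/2, bt = 1, omega = 1 (defined for x > 0)
    return np.exp(-1.0 / (2.0 * x)) / (np.sqrt(2.0 * np.pi) * x**1.5)

for X in [1e2, 1e4, 1e6, 1e8]:                  # increasing upper cutoffs
    x = np.logspace(-6, np.log10(X), 400_001)   # log-spaced grid, midpoint rule
    xm = (x[1:] + x[:-1]) / 2
    w = np.diff(x)
    norm = np.sum(P(xm) * w)            # approaches 1 slowly (tail falls off as x^{-3/2})
    mean = np.sum(xm * P(xm) * w)       # grows roughly like sqrt(X): the mean diverges
    print(f"cutoff X={X:.0e}  integral={norm:.4f}  partial mean={mean:.1f}")
\end{verbatim}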
\begin{figure}
\epsfxsize=300pt
\epsffile{levy5.eps}
\end{figure}



\begin{figure}
\epsfxsize=420pt
\epsffile{gwalk.eps}
\end{figure}



\begin{figure}
\epsfxsize=420pt
\epsffile{cauchyw.eps}
\end{figure}
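The qualitative difference between the two walks in the figures can be reproduced in a few lines (a NumPy sketch, not from the notes; step count and seed are arbitrary): the Gaussian walk diffuses steadily, while the Cauchy walk is dominated by a few very large jumps.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(2)          # arbitrary seed
n_steps = 10_000

gauss_steps  = rng.normal(size=n_steps)            # finite-variance increments
cauchy_steps = rng.standard_cauchy(size=n_steps)   # infinite-variance increments

gauss_walk  = np.cumsum(gauss_steps)
cauchy_walk = np.cumsum(cauchy_steps)

# The Cauchy walk is dominated by its few largest jumps ("Levy flights")
print("largest |step|, Gaussian:", np.abs(gauss_steps).max())
print("largest |step|, Cauchy  :", np.abs(cauchy_steps).max())
print("final position, Gaussian:", gauss_walk[-1])
print("final position, Cauchy  :", cauchy_walk[-1])
\end{verbatim}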


ASYMPTOTIC BEHAVIOR
One can show that the inverse Fourier transform of

\begin{displaymath}p(k)=\exp -b\vert k\vert^\alpha\end{displaymath}

approaches

\begin{displaymath}P(x)\propto \frac{1}{x^{\alpha+1}};\;x\rightarrow \infty\end{displaymath}

if $0<\alpha<2$
while the cumulative distribution C(z)
(the probability that $x\geq z$) satisfies

\begin{displaymath}C(z)\propto z^{-\alpha};\; z\rightarrow\infty\end{displaymath}

Power law implies scaling!
Rescaling $z\Rightarrow \lambda z$ rescales $C\Rightarrow \lambda^{-\alpha} C$:
the power law distribution does not change its form!
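This can be checked on samples of $\vert x\vert$ from a Cauchy distribution, for which $\alpha=1$ (a NumPy sketch, not from the notes; the thresholds z, sample size, and seed are arbitrary): the empirical C(z) falls off as $z^{-1}$, so $zC(z)$ is roughly constant.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(3)          # arbitrary seed
x = np.abs(rng.standard_cauchy(2_000_000))   # |x| from a Cauchy distribution, alpha = 1

for z in [10.0, 20.0, 40.0, 80.0]:
    C = np.mean(x >= z)                 # empirical C(z) = P(|x| >= z)
    print(f"z={z:5.0f}  C(z)={C:.5f}  z*C(z)={z*C:.3f}")   # z*C(z) roughly constant (~2/pi)
\end{verbatim}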


 
Birger Bergersen
1998-10-14