Lecture 1. Characteristic function of a random variable
The characteristic function of a random variable ξ is a non-random function Ψ_ξ(t) of the real parameter t, equal to the expected value of e^{itξ}, where i is the imaginary unit. In other words, Ψ_ξ is a function that takes real numbers as arguments and returns complex values. If our random variable ξ has a discrete distribution, i.e. it takes the value x_k with probability p_k, the characteristic function can be calculated as the sum shown on the slide: Ψ_ξ(t) = ∑_k p_k·e^{itx_k}. If the random variable is continuous, i.e. its distribution law is given by a distribution density f_ξ(x), the characteristic function is the Fourier transform of the density; please note that it is also shown on the slide: Ψ_ξ(t) = ∫_{−∞}^{+∞} e^{itx}·f_ξ(x) dx. We know that if the Fourier transform of a function is given, it is easy enough to reconstruct the function itself. That is, if the characteristic function of a continuous random variable is known, the distribution density is easily restored from it using the inverse Fourier transform. For those who are not familiar with the Fourier transform, I advise you to consult the literature; we will not consider its properties in detail here. At the same time, since the distribution of a random variable is uniquely restored from its characteristic function, random variables that have the same distribution have the same characteristic function. The converse is also true: if random variables have the same characteristic function, their distributions are the same.
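As an illustration (not part of the lecture slides), both formulas can be checked numerically; the helper names cf_discrete and cf_continuous below are my own. For the standard normal distribution the characteristic function has the well-known closed form e^{−t²/2}, which we can compare against direct numerical integration of the definition:

```python
import numpy as np

def cf_discrete(t, xs, ps):
    """Characteristic function of a discrete r.v.: sum_k p_k * e^{i t x_k}."""
    return np.sum(np.asarray(ps) * np.exp(1j * t * np.asarray(xs)))

def cf_continuous(t, pdf, lo, hi, n=200_001):
    """Characteristic function of a continuous r.v.: integral of
    e^{i t x} * f(x) over [lo, hi], approximated by a trapezoidal sum."""
    x = np.linspace(lo, hi, n)
    y = np.exp(1j * t * x) * pdf(x)
    dx = (hi - lo) / (n - 1)
    return (y[0] / 2 + y[1:-1].sum() + y[-1] / 2) * dx

# Standard normal density; its characteristic function is e^{-t^2/2}.
pdf = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
t = 1.3
approx = cf_continuous(t, pdf, -10, 10)
exact = np.exp(-t**2 / 2)
print(abs(approx - exact))  # very small
```

The truncation of the integral to [−10, 10] is harmless here because the normal density is negligible outside that interval.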
Let’s consider what properties the characteristic function of a random variable has. The first property says that for any random variable the characteristic function is defined at every point, and the modulus of its value never exceeds 1: |Ψ_ξ(t)| ≤ 1. Let me remind you that the values of the characteristic function are complex, and this property means that all of them lie inside or on the boundary of the unit circle. We prove this fact for continuous random variables; a similar proof applies to discrete ones. The modulus of an integral never exceeds the integral of the modulus. Under the modulus we have the distribution density, which is real-valued and non-negative, so it can be taken out from under the modulus, and only e^{itx} remains. According to Euler’s formula, e^{itx} can be represented as cos(tx) + i·sin(tx). The squared modulus of this complex number is cos²(tx) + sin²(tx), and we know that the value of this expression is always 1. Therefore |Ψ_ξ(t)| ≤ ∫ f_ξ(x) dx = 1, i.e. the modulus of our characteristic function is never greater than 1. So, the statement is proved.
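The key step of the proof, namely that |e^{itx}| = 1 by Euler’s formula, is easy to verify numerically; a small sketch:

```python
import cmath
import math

# Euler's formula: e^{itx} = cos(tx) + i*sin(tx), so the modulus is
# sqrt(cos^2(tx) + sin^2(tx)) = 1: every value lies on the unit circle.
for tx in (0.0, 0.3, -2.5, 10.0):
    z = cmath.exp(1j * tx)
    assert math.isclose(z.real, math.cos(tx), abs_tol=1e-12)
    assert math.isclose(z.imag, math.sin(tx), abs_tol=1e-12)
    assert math.isclose(abs(z), 1.0)
print("e^{itx} always lies on the unit circle")
```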
Let’s consider the following property. At the point 0 any characteristic function takes the value 1. This is easy to see: substituting t = 0, e^{itx} turns into 1, and we get the integral of the distribution density, which, by the normalization condition, is equal to 1: Ψ_ξ(0) = ∫ f_ξ(x) dx = 1.
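Both properties above also hold for the empirical characteristic function built from a sample, since it is an average of points on the unit circle. A sketch, with the helper name ecf being my own:

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.exponential(scale=2.0, size=100_000)

def ecf(t, xs):
    """Empirical characteristic function: the sample mean of e^{i t x}."""
    return np.mean(np.exp(1j * t * xs))

# Property: at t = 0 the value is exactly 1.
assert abs(ecf(0.0, samples) - 1.0) < 1e-12

# Property: the modulus never exceeds 1 (an average of unit-modulus numbers).
ts = np.linspace(-10, 10, 201)
assert all(abs(ecf(t, samples)) <= 1.0 + 1e-12 for t in ts)
print("both properties hold for the empirical characteristic function")
```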
The next property. For an even distribution density, the characteristic function turns out to be real-valued. To show this, we consider the complex conjugate of the characteristic function; again, we prove the statement for continuous random variables. The conjugate of Ψ_ξ(t) is ∫_{−∞}^{+∞} e^{−itx}·f_ξ(x) dx. We make a substitution in the integral: let y = −x, then dy = −dx. Note that if x tends to −∞, y tends to +∞, and vice versa. So we get minus the integral from +∞ to −∞ of e^{ity} multiplied by the distribution density at the point −y. When we swap the limits of integration, i.e. exchange the upper and lower limits, the minus disappears. Since the distribution density is even, f_ξ(−y) = f_ξ(y), and the integral becomes exactly the characteristic function of ξ. We get that the conjugate value at the point t coincides with the value of the function at the point t, which means that the value has no imaginary part. This is true for every t, so all values of the characteristic function have no imaginary part, i.e. the function is real-valued.
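As a numerical check (my own illustration), take the uniform distribution on [−1, 1]: its density f(x) = 1/2 is even, so the characteristic function should be real; in fact it has the well-known closed form sin(t)/t.

```python
import numpy as np

# Uniform density on [-1, 1] (f(x) = 1/2) is even, so its characteristic
# function should be real-valued; the known closed form is sin(t)/t.
def cf_uniform(t, n=200_001):
    x = np.linspace(-1.0, 1.0, n)
    y = np.exp(1j * t * x) * 0.5
    dx = 2.0 / (n - 1)
    return (y[0] / 2 + y[1:-1].sum() + y[-1] / 2) * dx  # trapezoidal sum

for t in (0.5, 1.0, 3.7):
    v = cf_uniform(t)
    assert abs(v.imag) < 1e-9                 # no imaginary part
    assert abs(v.real - np.sin(t) / t) < 1e-8  # matches sin(t)/t
print("even density -> real-valued characteristic function")
```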
The next property. To obtain it, we first differentiate the defining equality with respect to the variable t, and do this k times. Thus the k-th derivative of the characteristic function can be calculated in the way presented on the screen (see the video): Ψ_ξ^{(k)}(t) = ∫ (ix)^k·e^{itx}·f_ξ(x) dx. Note that if we substitute the value 0 for the variable t, e^{itx} on the right turns into 1, and we get nothing other than the expected value of ξ^k. Thus the k-th derivative of the characteristic function at 0 equals i^k multiplied by the expected value of the k-th power of the random variable: Ψ_ξ^{(k)}(0) = i^k·M(ξ^k). This means the expected value can be expressed from this equality; we get the formula given on the slide: M(ξ^k) = (−i)^k·Ψ_ξ^{(k)}(0). Let’s find the main numerical characteristics of a random variable; as we know, these are the expected value and the variance. For k = 1 we get the expected value: M(ξ) = −i·Ψ_ξ'(0), i.e. −i multiplied by the first derivative at 0. If we want to find the variance, we should remember that it can be calculated by the formula D(ξ) = M(ξ²) − (M(ξ))². We get M(ξ²) from the formula above by substituting k = 2: (−i)² is the same as i² = −1, so M(ξ²) = −Ψ_ξ''(0), i.e. minus the second derivative of the characteristic function at 0. When we square the expected value, (−i)² appears there as well and again gives −1; accordingly, minus multiplied by minus makes the sign disappear, and we subtract minus the square of the first derivative. We get that the variance can be calculated as the square of the first derivative of the characteristic function at 0 minus the second derivative of the characteristic function at 0: D(ξ) = (Ψ_ξ'(0))² − Ψ_ξ''(0).
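The two formulas M(ξ) = −i·Ψ_ξ'(0) and D(ξ) = (Ψ_ξ'(0))² − Ψ_ξ''(0) can be sanity-checked with finite differences (my own illustration, not from the slides). I take the normal distribution with mean 2 and standard deviation 3, whose characteristic function e^{iμt − σ²t²/2} is a standard closed form:

```python
import numpy as np

# Normal(mu=2, sigma=3): Psi(t) = exp(i*mu*t - sigma^2 * t^2 / 2).
mu, sigma = 2.0, 3.0
psi = lambda t: np.exp(1j * mu * t - sigma**2 * t**2 / 2)

h = 1e-5
d1 = (psi(h) - psi(-h)) / (2 * h)              # Psi'(0), central difference
d2 = (psi(h) - 2 * psi(0.0) + psi(-h)) / h**2  # Psi''(0), central difference

mean = (-1j * d1).real   # M(xi) = -i * Psi'(0)
var = (d1**2 - d2).real  # D(xi) = (Psi'(0))^2 - Psi''(0)
print(round(mean, 3), round(var, 3))  # should be close to 2.0 and 9.0
```

The result agrees with the mean 2 and variance 3² = 9 of the chosen distribution, up to finite-difference error.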
This concludes the properties discussed in the current lecture; in the next lecture we will speak about sequences of random variables and the characteristic functions connected with them.