Lecture 2. Properties of the characteristic function of a random variable

Link to YouTube

We continue our discussion of characteristic functions. In the last lecture, we introduced the concept of a characteristic function and considered several of its properties. Let us consider some more.

Property 1. Suppose we know the characteristic function of a random variable ξ, and the random variable η is a linear transformation of ξ, i.e. η = a + bξ with constants a and b. Then the characteristic function of η can be calculated as follows: Ψη(t) = M[e^{itη}] (the mathematical expectation of e to the power of itη). We substitute η = a + bξ. Since the exponent is a sum, the exponential factors as e^{ita}·e^{i(tb)ξ}. The factor e^{ita} is not a random variable, so we can take it outside the sign of mathematical expectation. Thus we get e^{ita}·M[e^{i(tb)ξ}], where the parameter is now tb instead of t. In other words, Ψη(t) = e^{ita}·Ψξ(bt): the factor e^{ita} multiplied by the characteristic function of ξ at the point bt. This is what we use when working with the normal distribution in the practical class.
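The following minimal sketch (not from the lecture; the distribution, sample size, and parameter values are chosen only for illustration) checks Property 1 numerically by Monte Carlo:

```python
import numpy as np

# Check Property 1 numerically: for eta = a + b*xi we expect
#     psi_eta(t) = e^{ita} * psi_xi(b*t).
# Both sides are estimated by averaging e^{it*...} over a large sample.
rng = np.random.default_rng(0)
xi = rng.standard_normal(200_000)   # xi ~ N(0, 1), chosen just as an example
a, b, t = 1.5, 2.0, 0.7

lhs = np.mean(np.exp(1j * t * (a + b * xi)))                   # psi_eta(t)
rhs = np.exp(1j * t * a) * np.mean(np.exp(1j * (t * b) * xi))  # e^{ita} * psi_xi(bt)
print(lhs, rhs)  # the two estimates coincide up to Monte Carlo error
```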

The next property concerns a set of m random variables that are independent in the aggregate (mutually independent). Independence in the aggregate means that the distribution law of the system is equal to the product of the marginal distribution laws; pairwise independence (where any two of the random variables are independent) is not enough here. Then the characteristic function of the sum is calculated as the product of the characteristic functions. This is not difficult to prove. We substitute by definition, write η as the sum ξ1 + ξ2 + … + ξm, and split the exponential into factors. Since the random variables ξ1, ξ2, …, ξm are independent in the aggregate, the random variables e^{itξ1}, e^{itξ2}, …, e^{itξm} are also independent in the aggregate, and hence the mathematical expectation of their product is equal to the product of the mathematical expectations.

Thus, we can split the expectation into a product of mathematical expectations, each of which is the characteristic function of the corresponding random variable; that is, Ψk(t) is the characteristic function of the value ξk. We can use this to prove the stability of certain distributions with respect to summation.
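Written out, the computation referenced on the slide is the following one-line chain (a reconstruction in the lecture's notation):

```latex
\Psi_\eta(t) = M\left[e^{it(\xi_1 + \xi_2 + \cdots + \xi_m)}\right]
             = M\left[\prod_{k=1}^{m} e^{it\xi_k}\right]
             = \prod_{k=1}^{m} M\left[e^{it\xi_k}\right]
             = \prod_{k=1}^{m} \Psi_k(t).
```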

A distribution is stable with respect to summation if, for any set of independent random variables ξ1, ξ2, …, ξm that all have this distribution, possibly with different parameters, their sum also has this distribution. That is, if random variables have a normal distribution, each with its own mathematical expectation and its own variance, then their sum necessarily has a normal distribution. We are going to prove this in a practical class.
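As a sketch of that proof (assuming the standard formula Ψ(t) = e^{iat - σ²t²/2} for the characteristic function of the normal distribution N(a, σ²), which is derived in the practical class):

```latex
\prod_{k=1}^{m} \Psi_{\xi_k}(t)
  = \prod_{k=1}^{m} e^{\,i a_k t - \sigma_k^2 t^2 / 2}
  = \exp\!\left( i t \sum_{k=1}^{m} a_k - \frac{t^2}{2} \sum_{k=1}^{m} \sigma_k^2 \right),
```

which is again the characteristic function of a normal distribution, with mathematical expectation equal to the sum of the a_k and variance equal to the sum of the σ_k².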

Now let's consider a few other examples.

Example 1. We are going to prove that the Bernoulli distribution is not stable with respect to summation.

To do this, we calculate the characteristic function. I won't describe it in detail here; let me remind you that the calculation of the characteristic function of the Bernoulli distribution was given in detail in practical lesson 1 of this section. If ξ1, ξ2, …, ξm is a set of independent random variables that have Bernoulli distributions, then by the previous statement the characteristic function of their sum looks like this (look at the slide). We see that this product cannot be represented in the form pe^{it} + q: expanding it produces different powers of e^{it}. So the Bernoulli distribution is not stable with respect to summation. However, for a common value of the parameter p we get the characteristic function of the binomial distribution; we are going to derive it in practical class 2. The sum of independent Bernoulli random variables has a binomial distribution if the parameters are the same. If the parameters are different, we do not get any standard distribution.
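A small symbolic sketch of this argument for m = 2 (the substitution z = e^{it} is mine, not from the slide):

```python
import sympy as sp

# Write z = e^{it}; the Bernoulli characteristic function is q + p*z.
# The sum of two independent Bernoulli variables then has characteristic
# function (q1 + p1*z)*(q2 + p2*z).
z, p1, p2 = sp.symbols('z p1 p2')
q1, q2 = 1 - p1, 1 - p2

product = sp.expand((q1 + p1 * z) * (q2 + p2 * z))
print(product)   # contains a z**2 term, so it is not of the form q + p*z

# With equal parameters p1 = p2 the product factors as (q1 + p1*z)**2,
# which is exactly the characteristic function of the binomial Bin(2, p).
print(sp.factor(product.subs(p2, p1)))
```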

Example 2. In practical lesson 1, we analyzed how to find the characteristic function of the uniform distribution. We are going to prove that this distribution is not stable either.

The slide shows the formula for the characteristic function. If we consider a system of m independent random variables with a uniform distribution, then the characteristic function of their sum looks like this (see the video). We cannot represent this product in the same form as a single factor (see the video). This means that the distribution is not stable. Try to prove this fact without using characteristic functions: you can use geometric probability by considering the sum of 2, 3, or more uniformly distributed random variables.
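For reference, here is how that product looks when written out (a reconstruction assuming the standard characteristic function of the uniform distribution on [a, b] from practical lesson 1):

```latex
\Psi_{\xi_k}(t) = \frac{e^{itb_k} - e^{ita_k}}{it\,(b_k - a_k)},
\qquad
\prod_{k=1}^{m} \Psi_{\xi_k}(t)
  = \frac{\prod_{k=1}^{m} \left( e^{itb_k} - e^{ita_k} \right)}
         {(it)^m \prod_{k=1}^{m} (b_k - a_k)} .
```

The factor (it)^m in the denominator already shows that the product cannot have the form of a single uniform characteristic function, where it appears only to the first power.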

Characteristic functions are also used to prove convergence in distribution.

A sequence of random variables ξn converges in distribution to a random variable ξ if its distribution function converges to the distribution function of ξ at every point where the latter is continuous; here Fn is the distribution function of ξn, and F(x) is the distribution function of ξ.
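In symbols (a standard formalization of the definition just stated):

```latex
\xi_n \xrightarrow{d} \xi
\quad \Longleftrightarrow \quad
F_n(x) \xrightarrow[n \to \infty]{} F(x)
\ \ \text{at every continuity point } x \text{ of } F.
```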

Next comes the continuity theorem (Lévy's continuity theorem). Let Ψ(t) be the characteristic function of the value ξ, where the function Ψ is continuous at zero, and let Ψn(t) be the sequence of characteristic functions of the values ξn.

Then ξn converges to the value ξ in distribution as n tends to infinity if and only if its characteristic function converges pointwise to the characteristic function of the value ξ. This theorem can be used to prove the law of large numbers and the central limit theorem; that is, it is quite often used to prove the convergence of a sequence of random variables. This is the end of the theory. You can start doing practical lesson 2.
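As an illustration of how the theorem is applied (a sketch with summands of my choosing, not from the lecture): take ξk = ±1 with probability 1/2 each. The normalized sum Sn = (ξ1 + … + ξn)/√n has characteristic function Ψn(t) = cos(t/√n)^n, and pointwise convergence to e^{-t^2/2} gives convergence in distribution to the standard normal law:

```python
import numpy as np

# Levy's continuity theorem in action: psi_n(t) = cos(t/sqrt(n))**n is the
# characteristic function of S_n = (xi_1 + ... + xi_n)/sqrt(n), where each
# xi_k = +/-1 with probability 1/2. It converges pointwise to exp(-t**2/2),
# the characteristic function of N(0, 1).
t = np.linspace(-3.0, 3.0, 7)
for n in (1, 10, 100, 10_000):
    psi_n = np.cos(t / np.sqrt(n)) ** n
    print(n, np.max(np.abs(psi_n - np.exp(-t ** 2 / 2))))
# The maximum deviation shrinks as n grows, so S_n converges in distribution
# to the standard normal law, a special case of the central limit theorem.
```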

