A Hoeffding-Azuma Type Inequality for Random Processes

Abstract. The subject of this paper is a Hoeffding-Azuma type estimate for the difference between an adapted random process and its conditional expectation given a related filtration.


1. Introduction
Hoeffding-Azuma type inequalities have important applications in probability theory, statistics and different branches of science. In this section we give a brief history of Hoeffding-Azuma type inequalities.
1.1. Classical Hoeffding-Azuma type inequalities. Let (Ω, F, P) be a probability triple, where Ω is a sample space, F is a σ-algebra on Ω and P is a σ-additive probability measure on F. Let us denote by B the Borel algebra on R. Note that B = σ(τ_{|x−y|}), the minimal σ-algebra containing the natural topology τ_{|x−y|} on R.
The classical Hoeffding inequality is about finding upper bounds for the probability that the sum of n independent random variables exceeds its mean by a positive number nt. The pioneering work is due to Hoeffding ([6], Theorem 2), who proved the following theorem.
Theorem 1.1 (Hoeffding Inequality). If X_1, X_2, ..., X_n are independent random variables and a_i ≤ X_i ≤ b_i almost surely for i = 1, ..., n, then for every t > 0,

P(S_n − E(S_n) ≥ nt) ≤ exp( −2n²t² / Σ_{i=1}^n (b_i − a_i)² ),

where S_n = X_1 + X_2 + ... + X_n.

Besides the Hoeffding inequality there are two more classical fundamental results. These are the Azuma inequality and the Chernoff inequality. Let us give both inequalities in the form expressed by T. Tao.

Theorem 1.2 (Chernoff Inequality). Let X_1, X_2, ..., X_n be independent scalar random variables with |X_i| ≤ K almost surely, with mean μ_i and variance σ_i². Then for any λ > 0, one has

P(|S_n − μ| ≥ λσ) ≤ C max( exp(−cλ²), exp(−cλσ/K) )

for some absolute constants C, c > 0, where μ = Σ_{i=1}^n μ_i and σ² = Σ_{i=1}^n σ_i².

Theorem 1.3 (Azuma Inequality). Let X_1, X_2, ..., X_n be a sequence of scalar random variables with |X_i| ≤ 1 almost surely. Assume also that we have the martingale difference property

E(X_i | X_1, ..., X_{i−1}) = 0

almost surely, for all i = 1, ..., n. Then for any λ > 0 the sum S_n = X_1 + X_2 + ... + X_n obeys the large deviation inequality

P(|S_n| ≥ λ√n) ≤ C exp(−cλ²)

for some absolute constants C, c > 0.
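Hoeffding's bound in Theorem 1.1 can be checked numerically. The following minimal Python sketch (the sample size n, trial count and deviation level t are chosen purely for illustration) compares the empirical tail probability for sums of independent Uniform[0, 1] variables, where a_i = 0 and b_i = 1, against the right-hand side of the inequality:

```python
import random
import math

# Monte Carlo check of Hoeffding's inequality (Theorem 1.1) for
# independent Uniform[0, 1] variables, so a_i = 0 and b_i = 1.
random.seed(0)
n, trials, t = 50, 20000, 0.1

exceed = 0
for _ in range(trials):
    s = sum(random.random() for _ in range(n))   # S_n
    if s - n * 0.5 >= n * t:                     # S_n - E(S_n) >= n t
        exceed += 1

empirical = exceed / trials
# Hoeffding bound: exp(-2 n^2 t^2 / sum (b_i - a_i)^2) = exp(-2 n t^2)
bound = math.exp(-2 * n**2 * t**2 / n)
print(f"empirical tail {empirical:.4f} <= bound {bound:.4f}")
```

With these parameters the bound equals exp(−1) ≈ 0.37, while the empirical tail is far smaller, as expected for a bound that holds uniformly over all distributions with the given support.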
1.2. Hoeffding-Azuma type inequalities for martingale differences. Martingales and Markov chains are well-known areas of application of Hoeffding-Azuma type inequalities.
Definition 1.2 (see [5], Section 12.1). A sequence of random variables Y = {Y_n : n ≥ 0} is a martingale with respect to a filtration {F_n}_{n=0}^∞ if, for all n ≥ 0, E|Y_n| < ∞, Y_n is F_n-measurable, and E(Y_{n+1} | F_n) = Y_n almost surely.

The Hoeffding inequality for martingale differences is of supreme importance in the theory of martingales (see [5], Section 12.2).

Theorem 1.4 (Hoeffding inequality for martingale differences). Let Y = {Y_n : n ≥ 0} be a martingale, and suppose that there exists a sequence K_1, K_2, ... of real numbers such that P(|Y_n − Y_{n−1}| ≤ K_n) = 1 for all n ≥ 1. Then for every x > 0,

(1.1) P(|Y_n − Y_0| ≥ x) ≤ 2 exp( −x² / (2 Σ_{i=1}^n K_i²) ).

Inequality (1.1) means that if the martingale differences are bounded then the probability of a large deviation of Y_n from its initial value Y_0 is small.
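Inequality (1.1) can also be checked empirically. The sketch below (the walk length n, trial count and threshold x are illustrative choices) uses the simple ±1 random walk, a martingale whose differences satisfy |Y_n − Y_{n−1}| = 1, so K_i = 1 for all i:

```python
import random
import math

# Empirical check of inequality (1.1) for the simple random walk,
# a martingale with |Y_n - Y_{n-1}| = 1, i.e. K_i = 1 for all i.
random.seed(1)
n, trials, x = 100, 20000, 25

exceed = 0
for _ in range(trials):
    y = sum(random.choice((-1, 1)) for _ in range(n))  # Y_n - Y_0
    if abs(y) >= x:
        exceed += 1

empirical = exceed / trials
bound = 2 * math.exp(-x**2 / (2 * n))   # 2 exp(-x^2 / (2 sum K_i^2))
print(f"empirical {empirical:.4f} <= bound {bound:.4f}")
```

Here the bound is about 0.09, while the empirical frequency of deviations of size 25 over 100 steps is roughly an order of magnitude smaller.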
The literature contains many new results on these issues, along with their applications.
A generalization of the Hoeffding inequality for dependent random variables was given in [10]. Optimal bounds for Hoeffding's inequalities were found in [8]. In [4] a Hoeffding type inequality was proved for partial sums derived from a uniformly ergodic Markov chain. New types of inequalities were introduced in [2] (see also the references therein). In [3] some inequalities were obtained for unbounded random variables.
In [7] the well-known Bennett-Hoeffding bound for sums of independent random variables was significantly improved by using, instead of the class of all increasing exponential functions, a much larger class of generalized moment functions. The resulting bounds have certain optimality properties.

2. Inequalities for Adapted Random Processes
A random process is a collection of random variables {X(t), t ∈ T}. In particular, a sequence X_0, X_1, X_2, ..., X_n, ... of random variables defined on the same probability triple (Ω, F, P) is a random process.
A nondecreasing sequence {F_n}_{n=0}^∞ of sub-σ-algebras of F is called a filtration of (Ω, F, P), and a sequence {X_n : n ≥ 0} of random variables is said to be adapted to the filtration {F_n}_{n=0}^∞ if X_n is F_n-measurable for all n.
We denote by E(X_n | F_{n−1}) the conditional expectation of X_n given F_{n−1} for all n ≥ 1.
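As a concrete illustration of this notation, the following minimal Python sketch computes E(X_n | F_{n−1}) by averaging over the atoms of F_{n−1}. The process used (a ±1 random walk S_n with X_n = S_n², and the identity E(S_n² | F_{n−1}) = S_{n−1}² + 1) is chosen purely for illustration:

```python
from itertools import product

# Compute E(X_n | F_{n-1}) by averaging over the atoms of F_{n-1},
# the sigma-algebra generated by the first n-1 steps of a +/-1 walk.
# Here X_n = S_n^2, and the known answer is E(S_n^2 | F_{n-1}) = S_{n-1}^2 + 1.
n = 4
for prefix in product((-1, 1), repeat=n - 1):      # one atom of F_{n-1}
    s_prev = sum(prefix)                           # S_{n-1} on this atom
    # average X_n over the two equally likely extensions of this atom
    cond_exp = sum((s_prev + step) ** 2 for step in (-1, 1)) / 2
    assert cond_exp == s_prev ** 2 + 1
print("E(S_n^2 | F_{n-1}) = S_{n-1}^2 + 1 verified on all atoms")
```

The point of the example is that on a finite filtration the conditional expectation is literally the atom-by-atom average, i.e. a piecewise-constant, F_{n−1}-measurable random variable.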
As pointed out in the introduction, the classical Hoeffding inequality is about finding upper bounds for the probability that the sum of n independent random variables exceeds its mean by nt.
Finding upper bounds for the probability that the sum of n terms of an adapted random process exceeds its conditional mean by x is also among the topics of interest.
As far as we know, no previous research has investigated such bounds for general adapted random processes, which is the main subject of this paper.
The basic result is the following theorem.
Theorem 2.1. Let {X_n : n ≥ 0} be a sequence of random variables which is adapted to a filtration {F_n}_{n=0}^∞ and satisfies |X_n − E(X_n | F_{n−1})| ≤ C_n almost surely for all n ≥ 1. Then for every x > 0,

(2.1) P( Σ_{i=1}^n (X_i − E(X_i | F_{i−1})) ≥ x ) ≤ exp( −x² / (2 Σ_{i=1}^n C_i²) ).

Proof. Let us define the following sequence:

(2.2) Y_0 = 0, Y_n = Σ_{i=1}^n (X_i − E(X_i | F_{i−1})), n ≥ 1.

We have

(2.3) E(Y_n − Y_{n−1} | F_{n−1}) = E(X_n | F_{n−1}) − E(X_n | F_{n−1}) = 0.

Hence, by Markov's inequality, for every θ > 0,

(2.4) P(Y_n − Y_0 ≥ x) = P( e^{θ(Y_n − Y_0)} ≥ e^{θx} ),

(2.5) P(Y_n − Y_0 ≥ x) ≤ e^{−θx} E e^{θ(Y_n − Y_0)}.

Next, we estimate E e^{θ(Y_n − Y_0)}. By the tower property we obtain that

(2.6) E e^{θ(Y_n − Y_0)} = E( e^{θ(Y_{n−1} − Y_0)} E( e^{θ(Y_n − Y_{n−1})} | F_{n−1} ) ).

We set f(y) = e^{θy}, |y| ≤ C_n. The function f(y) = e^{θy} is convex, whence it follows that

(2.7) e^{θy} ≤ ((C_n − y)/(2C_n)) e^{−θC_n} + ((C_n + y)/(2C_n)) e^{θC_n}, |y| ≤ C_n.

Thus, since by the condition of the theorem |Y_n − Y_{n−1}| = |X_n − E(X_n | F_{n−1})| ≤ C_n almost surely, we may take y = Y_n − Y_{n−1} in (2.7):

(2.8) e^{θ(Y_n − Y_{n−1})} ≤ ((C_n − (Y_n − Y_{n−1}))/(2C_n)) e^{−θC_n} + ((C_n + (Y_n − Y_{n−1}))/(2C_n)) e^{θC_n}.

Taking the conditional expectation of (2.8) and using the fact that E(Y_n − Y_{n−1} | F_{n−1}) = 0, we get

E( e^{θ(Y_n − Y_{n−1})} | F_{n−1} ) ≤ (e^{−θC_n} + e^{θC_n})/2 = cosh(θC_n) ≤ e^{θ²C_n²/2}.

Therefore,

E e^{θ(Y_n − Y_0)} ≤ e^{θ²C_n²/2} E e^{θ(Y_{n−1} − Y_0)}.

By using this inequality, (2.6) and iterations we get

E e^{θ(Y_n − Y_0)} ≤ exp( (θ²/2) Σ_{i=1}^n C_i² ).

It follows from (2.5) that

P(Y_n − Y_0 ≥ x) ≤ exp( −θx + (θ²/2) Σ_{i=1}^n C_i² ).

Finally, minimizing the right side of this inequality in θ (the minimum is attained at θ = x / Σ_{i=1}^n C_i²) and recalling the definition (2.2) we obtain the needed inequality (2.1).
In particular, if {X_n : n ≥ 0} is a martingale with respect to {F_n}_{n=0}^∞, then E(X_n | F_{n−1}) = X_{n−1} and Σ_{i=1}^n (X_i − E(X_i | F_{i−1})) = X_n − X_0, and in this case we get the martingale difference inequality (1.1).
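The bound (2.1) applies to adapted processes that are not martingales. The sketch below (the process X_i = e_i + 0.5·e_{i−1} with i.i.d. random signs e_i, and the parameters n, trials, x, are all illustrative choices) checks it numerically: here E(X_i | F_{i−1}) = 0.5·e_{i−1}, so X_i − E(X_i | F_{i−1}) = e_i and C_i = 1.

```python
import random
import math

# Numerical check of the bound (2.1) for an adapted process that is
# not a martingale: X_i = e_i + 0.5 e_{i-1} with i.i.d. signs e_i,
# so E(X_i | F_{i-1}) = 0.5 e_{i-1} and |X_i - E(X_i | F_{i-1})| <= C_i = 1.
random.seed(2)
n, trials, x = 100, 20000, 25

exceed = 0
for _ in range(trials):
    e = [random.choice((-1, 1)) for _ in range(n + 1)]  # e_0, ..., e_n
    dev = sum(e[1:])   # sum of X_i - E(X_i | F_{i-1}) = e_1 + ... + e_n
    if dev >= x:
        exceed += 1

empirical = exceed / trials
bound = math.exp(-x**2 / (2 * n))     # exp(-x^2 / (2 sum C_i^2))
print(f"empirical {empirical:.4f} <= bound {bound:.4f}")
```

Note that only the centered differences X_i − E(X_i | F_{i−1}) need to be bounded; the X_i themselves may depend on the past in an arbitrary adapted way.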
3. Random Processes in the Hilbert Space L²(Ω, F, P)

Let X = {X_i : i ≥ 0} be an adapted random process with X_i ∈ L²(Ω, F_i, P) for all i. As can be seen from the following theorem, if X_i ∈ L²(Ω, F_i, P) then the conditional expectation E(X_i | F_{i−1}) is a version of the orthogonal projection of X_i onto the subspace L²(Ω, F_{i−1}, P).

Theorem 3.1 (see [1]). Let (X, Z) be a bivariate random vector with E(X²) < ∞. Then E(X | Z) is a version of the orthogonal projection of X onto the subspace L²(Ω, σ(Z), P).

By using a Hilbert space property we can write

(3.1) X_i = E(X_i | F_{i−1}) + Y_i,

where Y_i ∈ L²(Ω, F_{i−1}, P)^⊥, the orthogonal complement of the subspace L²(Ω, F_{i−1}, P). By (3.1), Y_i is F_i-measurable and

(3.2) E(Y_i | F_{i−1}) = 0.

An immediate consequence of (3.2) is that {Y_i : i ≥ 1} is a sequence of martingale differences.

The main conclusion of the above given arguments is given in the following theorem.

Theorem 3.2. Let Y_n be a version of the orthogonal projection of X_n onto the subspace L²(Ω, F_{n−1}, P)^⊥, n ≥ 0. If |Y_n| ≤ C_n almost surely for n ≥ 0, then for every x > 0

P( Σ_{i=1}^n Y_i ≥ x ) ≤ exp( −x² / (2 Σ_{i=1}^n C_i²) ).

Conflicts of Interest: The author declares that there are no conflicts of interest regarding the publication of this paper.