Recall that throughout this paper, we use S to denote the solution set of the constrained convex minimization problem (1.1).
Let H be a real Hilbert space and C a nonempty closed convex subset of H. Let F be a k-Lipschitzian and η-strongly monotone operator with constants k > 0 and η > 0. Suppose that ∇f is L-Lipschitz continuous. We now consider a mapping on C defined by:
where , and is nonexpansive. Let and satisfy the following conditions:
- (i) and ;
- (ii) ;
- (iii) is continuous with respect to s and .
It is easy to see that this mapping is a contraction. Indeed, we have, for each ,
where . Hence, the mapping has a unique fixed point in C, denoted by , which uniquely solves the fixed-point equation
(3.1)
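Since the mapping is a contraction for each fixed s, its unique fixed point in (3.1) can be approximated by simple Picard iteration, by the Banach fixed-point theorem. The map g below is a hypothetical one-dimensional contraction chosen purely for illustration; it is not the paper's mapping:

```python
import math

# Picard iteration for a contraction: iterate x <- g(x) until
# successive iterates agree to within tol.  The Banach fixed-point
# theorem guarantees convergence to the unique fixed point.
def picard_fixed_point(g, x0, tol=1e-12, max_iter=10_000):
    x = x0
    for _ in range(max_iter):
        x_next = g(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# g(x) = 0.5*cos(x) is a contraction on R with Lipschitz constant 0.5 < 1.
g = lambda x: 0.5 * math.cos(x)
x_star = picard_fixed_point(g, x0=1.0)

# The limit satisfies the fixed-point equation x = g(x).
assert abs(x_star - g(x_star)) < 1e-10
```

Starting from any other initial point produces the same limit, reflecting the uniqueness of the fixed point.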
The following proposition summarizes the properties of the net .
Proposition 3.1 Let be defined by (3.1). Then the net has the following properties:
- (a) is bounded for ;
- (b) ;
- (c) defines a continuous curve from into C.
Proof It is well known that solves the minimization problem (1.1) if and only if it solves the fixed-point equation
where is a constant. It is clear that , i.e., .
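This projection characterization, that a minimizer of f over C is exactly a fixed point of the projected-gradient map, can be checked numerically. The sketch below is a hypothetical example (a quadratic f and a box constraint set, neither taken from the paper), for which both the projection and the minimizer have closed forms:

```python
import numpy as np

# Metric projection P_C onto the box C = [0, 1]^n.
def proj_box(x, lo=0.0, hi=1.0):
    return np.clip(x, lo, hi)

# Hypothetical objective f(x) = 0.5*||x - b||^2, so grad f(x) = x - b
# and the gradient is Lipschitz with L = 1.
b = np.array([1.5, -0.3, 0.4])
grad_f = lambda x: x - b

# Projected-gradient iteration x <- P_C(x - lam * grad f(x)), 0 < lam < 2/L.
lam = 0.8
x = np.zeros_like(b)
for _ in range(200):
    x = proj_box(x - lam * grad_f(x))

# The limit is the constrained minimizer P_C(b), and it satisfies the
# fixed-point equation for this (indeed for any) lam > 0.
assert np.allclose(x, proj_box(b))
assert np.allclose(x, proj_box(x - lam * grad_f(x)))
```

Note that the fixed-point identity holds for every positive step size at the minimizer, even though convergence of the iteration itself requires the step to stay below 2/L.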
(a) Taking a fixed , we obtain that
It follows that
(3.2)
For , note that
and
where and .
Then we get
Since and , there exists a real positive number such that
(3.3)
It follows from (3.2) and (3.3) that
Since , there exists a real positive number such that , and
Hence, is bounded.
(b) Note that the boundedness of implies that is also bounded. Hence, by the definition of , we have
(c) For , there exists
and
where and .
So for , we get
for some appropriate constant such that
Now take and calculate
It follows that
Since is bounded, and is continuous with respect to s, defines a continuous curve from into C. □
The following theorem shows that the net converges strongly as to a minimizer of (1.1), which also solves a certain variational inequality.
Theorem 3.2 Let H be a real Hilbert space and C a nonempty, closed and convex subset of H. Let F be a k-Lipschitzian and η-strongly monotone operator with constants k > 0 and η > 0. Suppose that the minimization problem (1.1) is consistent, and let S denote its solution set. Assume that the gradient ∇f is Lipschitzian with constant . Let be defined by (3.1), where the parameter and is nonexpansive. Let and satisfy the following conditions:
- (i) and ;
- (ii) ;
- (iii) is continuous with respect to s and .
Then the net converges strongly as to a minimizer of (1.1), which solves the variational inequality
(3.4)
Equivalently, we have .
Proof We first show that the solution of the variational inequality (3.4) is unique. Indeed, suppose that both and are solutions to (3.4); then
(3.5)
and
(3.6)
Adding (3.5) and (3.6) yields
The strong monotonicity of F implies that and the uniqueness is proved. Below we use to denote the unique solution of the variational inequality (3.4).
Let us show that as . Set
Then we have . For any given , we get
Since is the metric projection from H onto C, we have
Note that and , so we get , i.e., .
It follows from (3.7) that
By (3.3), we obtain that
Since is bounded, we can take a sequence in such that and .
By Proposition 3.1(b) and (3.3), we have
So, by Lemma 2.3, we get .
Since , we obtain from (3.8) that .
Next, we show that solves the variational inequality (3.4). Observe that
Hence, we conclude that
Since is nonexpansive, is monotone. Note that, for any given , and .
By (3.3), it follows that
Since , by Proposition 3.1(b), we obtain from (3.9) that
So is a solution of the variational inequality (3.4). We get by uniqueness. Therefore, as .
The variational inequality (3.4) can be rewritten as
So, in terms of Lemma 2.4, it is equivalent to the following fixed-point equation:
Next, we study the following iterative method. Given an arbitrary initial guess , we propose the following explicit scheme, which generates a sequence :
(3.10)
where the parameters . Let and satisfy the following conditions:
- (i) and ;
- (ii) ;
- (iii) .
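A full implementation of (3.10) depends on the parameter sequences prescribed by conditions (i)–(iii) above. As a loose, hypothetical sketch of an explicit projection–gradient scheme of this general type (a plain projected-gradient iteration with step sizes λ_n kept in (0, 2/L), not the paper's exact scheme (3.10)):

```python
import numpy as np

# Metric projection P_C onto the box C = [-1, 1]^n.
def proj_box(x):
    return np.clip(x, -1.0, 1.0)

# Hypothetical objective f(x) = 0.5*x^T A x - b^T x with A symmetric
# positive definite; grad f(x) = A x - b, Lipschitz with L = max eigenvalue.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([3.0, -2.0])
L = np.linalg.eigvalsh(A).max()

# Explicit scheme: x_{n+1} = P_C(x_n - lam_n * grad f(x_n)),
# with step sizes lam_n staying inside (0, 2/L).
x = np.zeros(2)
for n in range(1, 2001):
    lam_n = (1.0 / L) * (1.0 + 1.0 / (n + 1))
    x = proj_box(x - lam_n * (A @ x - b))

# The limit satisfies the fixed-point/optimality condition.
lam = 1.0 / L
assert np.allclose(x, proj_box(x - lam * (A @ x - b)), atol=1e-6)
```

Here the unconstrained minimizer lies outside the box, so the iterates settle on a boundary point of C that satisfies the variational-inequality optimality condition.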
We will show that the sequence converges strongly to a minimizer of (1.1), which also solves the variational inequality (3.4). □
Theorem 3.3 Let H be a real Hilbert space and C a nonempty, closed and convex subset of H. Let F be a k-Lipschitzian and η-strongly monotone operator with constants k > 0 and η > 0. Suppose that the minimization problem (1.1) is consistent and let S denote its solution set. Assume that the gradient ∇f is Lipschitzian with constant . Let be generated by algorithm (3.10) with the parameters . Let , and satisfy the following conditions:
- (C1) and ;
- (C2) for all n;
- (C3) and ;
- (C4) ;
- (C5) and .
Then the sequence generated by the explicit scheme (3.10) converges strongly to a minimizer of (1.1), which is also a solution of the variational inequality (3.4).
Proof It is well known that:
- (a) solves the minimization problem (1.1) if and only if it solves the fixed-point equation , where is a constant. It is clear that , i.e., ;
- (b) the gradient ∇f is -ism;
- (c) is averaged for ; in particular, the following relation holds:
We observe that is bounded. Indeed, taking a fixed , we get
It follows that
Note that, by using the same argument as in the proof of (3.3), there exists a real positive number such that
(3.11)
Since , there exists a real positive number such that and by (3.11) we get
It follows from induction that
(3.12)
Consequently, is bounded. This implies that is also bounded.
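Fact (b) above, that the gradient of a convex function with L-Lipschitz gradient is 1/L-inverse strongly monotone (ism, also called 1/L-cocoercive; this is the Baillon–Haddad theorem), can be illustrated numerically. The quadratic f below is a hypothetical example, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

# Gradient of f(x) = 0.5*||A x - b||^2; it is Lipschitz with L = ||A^T A||.
def grad(x):
    return A.T @ (A @ x - b)

L = np.linalg.norm(A.T @ A, 2)  # spectral norm

# Check the 1/L-ism (cocoercivity) inequality
#   <grad(x) - grad(y), x - y> >= (1/L) * ||grad(x) - grad(y)||^2
# on many random pairs (x, y).
def ism_holds(trials=1000):
    for _ in range(trials):
        x, y = rng.standard_normal(3), rng.standard_normal(3)
        g = grad(x) - grad(y)
        if g @ (x - y) < (1.0 / L) * (g @ g) - 1e-9:
            return False
    return True

assert ism_holds()
```

For this quadratic the inequality reduces to the fact that every eigenvalue of AᵀA is at most its spectral norm, which is why the check passes for all sampled pairs.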
We claim that
(3.13)
Indeed, since
we obtain that
By using the same argument as in the proof of Proposition 3.1(c), we obtain that
for some appropriate constant such that
Thus, we get
for some appropriate constant such that
Consequently, we get
By Lemma 2.5, we obtain .
Next, we show that
(3.14)
Indeed, it follows from (3.13) that
Now we show that
(3.15)
where is a solution of the variational inequality (3.4).
Indeed, take a subsequence of such that
(3.16)
Without loss of generality, we may assume that .
We observe that
It follows from (3.11) that
By (3.14), we get .
In terms of Lemma 2.3, we get .
Consequently, from (3.16) and the variational inequality (3.4), it follows that
Finally, we show that .
As a matter of fact, set
Then, .
In terms of Lemma 2.4 and (3.11), we obtain
It follows that
since is bounded, we can take a constant such that
It then follows that
(3.17)
where .
By (3.15) and , we get . Applying Lemma 2.5 to (3.17), we conclude that as . □