
Strong and Weak Convergence of the Modified Proximal Point Algorithms in Hilbert Space

Abstract

For a monotone operator $A$ on a Hilbert space $H$, we show weak convergence of Rockafellar's proximal point algorithm to some zero of $A$ and strong convergence of the perturbed version of Rockafellar's algorithm to $P_{A^{-1}0}u$ under some relaxed conditions, where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$. Moreover, our proof techniques are simpler than those of some existing results.

1. Introduction

Throughout this paper, let $H$ be a real Hilbert space with inner product $\langle\cdot,\cdot\rangle$ and norm $\|\cdot\|$, and let $I$ be the identity operator on $H$. We shall denote by $\mathbb{N}$ the set of all positive integers, by $A^{-1}0$ the set of all zeros of $A$, that is, $A^{-1}0 = \{x \in D(A) : 0 \in Ax\}$, and by $F(T)$ the set of all fixed points of $T$, that is, $F(T) = \{x : Tx = x\}$. When $\{x_n\}$ is a sequence in $H$, $x_n \to x$ (resp., $x_n \rightharpoonup x$, $x_n \stackrel{*}{\rightharpoonup} x$) will denote strong (resp., weak, weak*) convergence of the sequence $\{x_n\}$ to $x$.

Let $A$ be an operator with domain $D(A)$ and range $R(A)$ in $H$. Recall that $A$ is said to be monotone if

\[ \langle x - y, \, u - v \rangle \ge 0 \quad \text{whenever } u \in Ax, \ v \in Ay. \tag{1.1} \]

A monotone operator $A$ is said to be maximal monotone if $A$ is monotone and $R(I + rA) = H$ for all $r > 0$.

In fact, the theory of monotone operators is very important in nonlinear analysis and is connected with the theory of differential equations. It is well known (see [1]) that many physically significant problems can be modeled by initial-value problems of the form

\[ 0 \in u'(t) + Au(t), \quad u(0) = u_0, \tag{1.2} \]

where $A$ is a monotone operator in an appropriate space. Typical examples where such evolution equations occur can be found in the heat and wave equations or Schrödinger equations. On the other hand, a variety of problems, including convex programming and variational inequalities, can be formulated as finding a zero of a monotone operator. The problem of finding a solution $x$ with $0 \in Ax$ has been investigated by many researchers; see, for example, Bruck [2], Rockafellar [3], Brézis and Lions [4], Reich [5, 6], Nevanlinna and Reich [7], Bruck and Reich [8], Jung and Takahashi [9], Khang [10], Minty [11], Xu [12], and others. Some of them dealt with the weak convergence of (1.3) and others proved strong convergence theorems by imposing strong assumptions on $A$.

One popular method of solving $0 \in Ax$ is the proximal point algorithm of Rockafellar [3], which is recognized as a powerful and successful algorithm for finding a zero of monotone operators. Starting from any initial guess $x_0 \in H$, this proximal point algorithm generates a sequence $\{x_n\}$ given by

\[ x_{n+1} = J_{c_n}(x_n + e_n), \tag{1.3} \]

where $J_{c_n} = (I + c_nA)^{-1}$ for all $n \ge 0$ is the resolvent of $A$ on the space $H$. Rockafellar [3] proved the weak convergence of his algorithm (1.3) provided that the regularization sequence $\{c_n\}$ remains bounded away from zero and the error sequence $\{e_n\}$ satisfies the condition $\sum_{n=0}^{\infty}\|e_n\| < \infty$. Güler's example [13], however, shows that in an infinite-dimensional Hilbert space, Rockafellar's algorithm (1.3) has only weak convergence. Recently, several authors have proposed modifications of Rockafellar's proximal point algorithm (1.3) to obtain strong convergence. For example, Solodov and Svaiter [14] and Kamimura and Takahashi [15] studied a modified proximal point algorithm with an additional projection at each step of the iteration. Lehdili and Moudafi [16] obtained the convergence of the sequence $\{x_n\}$ generated by the algorithm

\[ x_{n+1} = J^{A_n}_{c_n} x_n, \tag{1.4} \]

where $A_n = \mu_n I + A$ is viewed as a Tikhonov regularization of $A$ and $J^{A_n}_{c_n} = (I + c_nA_n)^{-1}$. Using the technique of variational distance, Lehdili and Moudafi [16] were able to prove convergence theorems for the algorithm (1.4) and its perturbed version, under certain conditions imposed upon the sequences $\{c_n\}$ and $\{\mu_n\}$. For a maximal monotone operator $A$, Xu [12] and Song and Yang [17] used the technique of nonexpansive mappings to get convergence theorems for $\{x_n\}$ defined by the perturbed version of the algorithm (1.4):

\[ x_{n+1} = J_{c_n}\bigl(\lambda_n u + (1 - \lambda_n)x_n + e_n\bigr). \tag{1.5} \]
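For readers who want to experiment, the following is a minimal numerical sketch of the two iterations, assuming the standard forms of (1.3) and (1.5) reconstructed above. The test operator $Ax = x$ on the real line (whose resolvent is $J_c x = x/(1+c)$), the parameter sequences, and the error terms are all illustrative choices, not taken from the paper.

```python
def resolvent(x, c):
    # Resolvent J_c = (I + cA)^{-1} of the toy monotone operator Ax = x on R,
    # i.e. J_c x = x / (1 + c).  Replace with the resolvent of your own operator.
    return x / (1.0 + c)

def rockafellar_ppa(x0, steps=60):
    # Inexact proximal point algorithm (1.3): x_{n+1} = J_{c_n}(x_n + e_n).
    x = x0
    for n in range(steps):
        c_n = 1.0                   # illustrative: bounded away from zero
        e_n = 1.0 / (n + 1) ** 2    # illustrative: summable errors
        x = resolvent(x + e_n, c_n)
    return x

def regularized_ppa(x0, u, steps=60):
    # Regularized algorithm (1.5): x_{n+1} = J_{c_n}(lambda_n*u + (1-lambda_n)*x_n + e_n),
    # here with e_n = 0 for simplicity.
    x = x0
    for n in range(steps):
        lam_n = 1.0 / (n + 2)       # illustrative: lambda_n -> 0, sum lambda_n = infinity
        c_n = n + 1.0               # illustrative regularization parameters
        x = resolvent(lam_n * u + (1.0 - lam_n) * x, c_n)
    return x

print(rockafellar_ppa(5.0))        # tends to 0, the unique zero of A
print(regularized_ppa(5.0, 2.0))   # also tends to 0 = P_{A^{-1}0} u in this example
```

Since $A^{-1}0 = \{0\}$ in this toy example, the two limits coincide and the distinction between weak and strong convergence is invisible; the sketch is only meant to make the shape of the two iterations concrete.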

In this paper, under more relaxed conditions on the sequences $\{c_n\}$ and $\{\lambda_n\}$, we shall show that the sequence $\{x_n\}$ generated by (1.5) converges strongly to $P_{A^{-1}0}u$ (where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$) and that the sequence generated by (1.3) converges weakly to some element of $A^{-1}0$. Moreover, our proof techniques are simpler than those of Lehdili and Moudafi [16], Xu [12], and Song and Yang [17].

2. Preliminaries and Basic Results

Let $A$ be a monotone operator with domain and range in $H$. We use $J_r$ and $A_r$ to denote the resolvent and the Yosida approximation of $A$, respectively. Namely,

\[ J_r x = (I + rA)^{-1}x, \qquad A_r x = \frac{x - J_r x}{r}, \qquad r > 0. \tag{2.1} \]

For $r > 0$ and $x \in H$, the following facts are well known. For more details, see [18, pages 369–400] or [3, 19].

(i) $A_r x \in A(J_r x)$ for all $x \in R(I + rA)$;

(ii) $\|A_r x\| \le \inf\{\|y\| : y \in Ax\}$ for all $x \in D(A) \cap R(I + rA)$;

(iii) $J_r$ is a single-valued nonexpansive mapping for each $r > 0$ (i.e., $\|J_r x - J_r y\| \le \|x - y\|$ for all $x, y \in R(I + rA)$);

(iv) $A^{-1}0$ is closed and convex;

(v) (The Resolvent Identity) For $\lambda > 0$, $\mu > 0$, and $x \in R(I + \lambda A)$,

\[ J_\lambda x = J_\mu\Bigl(\frac{\mu}{\lambda}\,x + \Bigl(1 - \frac{\mu}{\lambda}\Bigr)J_\lambda x\Bigr) \tag{2.2} \]

(a small numerical check of (2.1) and (2.2) is sketched right after this list).
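As a quick sanity check of the definitions in (2.1) and of the resolvent identity (2.2), here is a small numerical sketch; the operator $A = \partial|\cdot|$ on the real line, whose resolvent is the soft-thresholding map, is an illustrative choice and not part of the paper.

```python
import math

def J(r, x):
    # Resolvent of the maximal monotone operator A = subdifferential of |.|:
    # J_r x = (I + rA)^{-1} x is soft thresholding (illustrative choice).
    return math.copysign(max(abs(x) - r, 0.0), x)

def yosida(r, x):
    # Yosida approximation A_r x = (x - J_r x) / r, as in (2.1).
    return (x - J(r, x)) / r

lam, mu, x = 2.0, 0.5, 3.7
lhs = J(lam, x)
rhs = J(mu, (mu / lam) * x + (1.0 - mu / lam) * J(lam, x))   # resolvent identity (2.2)
print(abs(lhs - rhs) < 1e-12)        # True: the identity holds
print(abs(yosida(lam, x)) <= 1.0)    # True: |A_r x| <= inf{|y| : y in Ax} = 1 here
```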

In the rest of this paper, it is always assumed that $A^{-1}0$ is nonempty, so that the metric projection $P_{A^{-1}0}$ from $H$ onto $A^{-1}0$ is well defined. It is known that $P_{A^{-1}0}$ is nonexpansive and is characterized by the following inequality: given $x \in H$ and $z \in A^{-1}0$, then $z = P_{A^{-1}0}x$ if and only if

\[ \langle x - z, \, z - y \rangle \ge 0 \quad \text{for all } y \in A^{-1}0. \tag{2.3} \]
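The characterization (2.3) is easy to check numerically; in the sketch below the closed convex set is a closed interval of the real line, an illustrative stand-in for $A^{-1}0$.

```python
def project(x, a, b):
    # Metric projection of x onto the closed convex set K = [a, b].
    return min(max(x, a), b)

a, b, x = -1.0, 2.0, 5.0
z = project(x, a, b)
# Characterization (2.3): z = P_K x  iff  <x - z, z - y> >= 0 for every y in K.
ys = [a + (b - a) * k / 100.0 for k in range(101)]
print(all((x - z) * (z - y) >= 0 for y in ys))   # True
```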

In order to facilitate our investigation in the next section, we list a useful lemma.

Lemma 2.1 (see Xu [20]).

Let $\{a_n\}$ be a sequence of nonnegative real numbers satisfying the property

\[ a_{n+1} \le (1 - \gamma_n)a_n + \gamma_n\delta_n + \sigma_n, \quad n \ge 0, \tag{2.4} \]

where $\{\gamma_n\} \subset (0, 1)$, $\{\delta_n\}$, and $\{\sigma_n\}$ satisfy the conditions: (i) $\sum_{n=0}^{\infty}\gamma_n = \infty$; (ii) either $\limsup_{n\to\infty}\delta_n \le 0$ or $\sum_{n=0}^{\infty}|\gamma_n\delta_n| < \infty$; (iii) $\sigma_n \ge 0$ for all $n \ge 0$ and $\sum_{n=0}^{\infty}\sigma_n < \infty$. Then $\{a_n\}$ converges to zero as $n \to \infty$.
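The lemma is easy to experiment with numerically. In the sketch below, the recursion (2.4) is run with equality (the worst case the lemma allows) for the illustrative choices $\gamma_n = 1/(n+2)$, $\delta_n = 1/(n+1)$, and $\sigma_n = 1/(n+1)^2$, which satisfy (i)–(iii).

```python
def xu_recursion(a0, steps):
    # Worst case of (2.4) taken with equality:
    # a_{n+1} = (1 - gamma_n) a_n + gamma_n * delta_n + sigma_n.
    a = a0
    for n in range(steps):
        gamma = 1.0 / (n + 2)          # (i)  sum gamma_n diverges
        delta = 1.0 / (n + 1)          # (ii) delta_n -> 0, so limsup delta_n <= 0
        sigma = 1.0 / (n + 1) ** 2     # (iii) sigma_n >= 0 and summable
        a = (1.0 - gamma) * a + gamma * delta + sigma
    return a

for steps in (10, 100, 1000, 10000):
    print(steps, xu_recursion(5.0, steps))   # values decrease toward 0
```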

3. Strong Convergence Theorems

Let $A$ be a monotone operator on a Hilbert space $H$. Then $J_r = (I + rA)^{-1}$ is a single-valued nonexpansive mapping from $R(I + rA)$ to $D(A)$. When $C$ is a nonempty closed convex subset of $H$ such that $\overline{D(A)} \subset C \subset R(I + rA)$ for all $r > 0$ (here $\overline{D(A)}$ is the closure of $D(A)$), then we have $J_r x \in D(A) \subset C$ for each $x \in C$ and all $r > 0$, and hence, for $u, x_0 \in C$, the following iteration is well defined:

\[ x_{n+1} = J_{c_n}\bigl(\lambda_n u + (1 - \lambda_n)x_n\bigr), \quad n \ge 0. \tag{3.1} \]

Next we will show strong convergence of $\{x_n\}$ defined by (3.1) in order to find a zero of $A$. For reaching this objective, we always assume $A^{-1}0 \neq \emptyset$ in the sequel.
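Before turning to the theorems, here is a small numerical sketch of the iteration (3.1) in $\mathbb{R}^2$ for the monotone operator $A(x, y) = (x, 0)$, whose zero set is the $y$-axis; both the operator and the parameter sequences $\lambda_n = 1/(n+1)$, $c_n = n + 1$ are illustrative choices of the kind used in [12], not the precise hypotheses of Theorem 3.1 below.

```python
def resolvent(point, c):
    # Resolvent of the monotone operator A(x, y) = (x, 0) on R^2:
    # (I + cA)^{-1}(x, y) = (x / (1 + c), y).  The zero set A^{-1}0 is the y-axis.
    x, y = point
    return (x / (1.0 + c), y)

def anchored_ppa(u, x0, steps=2000):
    # Iteration (3.1): x_{n+1} = J_{c_n}(lambda_n * u + (1 - lambda_n) * x_n).
    x = x0
    for n in range(steps):
        lam = 1.0 / (n + 1)      # illustrative: lambda_n -> 0, sum lambda_n = infinity
        c = n + 1.0              # illustrative: c_n growing without bound
        y = tuple(lam * u_i + (1.0 - lam) * x_i for u_i, x_i in zip(u, x))
        x = resolvent(y, c)
    return x

u, x0 = (3.0, 2.0), (10.0, -7.0)
print(anchored_ppa(u, x0))   # approaches (0.0, 2.0) = P_{A^{-1}0} u
```

The iterates approach $(0, 2)$, which is exactly the metric projection of the anchor $u = (3, 2)$ onto $A^{-1}0$; this is the behavior asserted by Theorem 3.1.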

Theorem 3.1.

Let $A$ be a monotone operator on a Hilbert space $H$ with $A^{-1}0 \neq \emptyset$. Assume that $C$ is a nonempty closed convex subset of $H$ such that $\overline{D(A)} \subset C \subset R(I + rA)$ for all $r > 0$, and that, for an anchor point $u \in C$ and an initial value $x_0 \in C$, $\{x_n\}$ is iteratively defined by (3.1). If $\{\lambda_n\} \subset (0, 1)$ and $\{c_n\} \subset (0, \infty)$ satisfy

(i)

(ii)

(iii)

then the sequence $\{x_n\}$ converges strongly to $P_{A^{-1}0}u$, where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$.

Proof.

The proof consists of the following steps:

Step 1.

The sequence $\{x_n\}$ is bounded. Let $p \in A^{-1}0$; then $J_{c_n}p = p$ and, for some constant, we have

(3.2)

So, the sequences $\{x_n\}$, $\{J_{c_n}x_n\}$, and $\{\lambda_n u + (1 - \lambda_n)x_n\}$ are bounded.
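A sketch of the standard estimate behind this step, under the assumption that (3.1) has the form given above, so that $J_{c_n}p = p$ and $J_{c_n}$ is nonexpansive:

\[
\begin{aligned}
\|x_{n+1} - p\| &= \bigl\|J_{c_n}\bigl(\lambda_n u + (1 - \lambda_n)x_n\bigr) - J_{c_n}p\bigr\| \\
&\le \bigl\|\lambda_n(u - p) + (1 - \lambda_n)(x_n - p)\bigr\| \\
&\le \lambda_n\|u - p\| + (1 - \lambda_n)\|x_n - p\| \le \max\bigl\{\|u - p\|, \|x_n - p\|\bigr\}.
\end{aligned}
\]

By induction, $\|x_n - p\| \le \max\{\|u - p\|, \|x_0 - p\|\}$ for all $n$, which gives the boundedness claimed in Step 1.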

Step 2.

for each . Since

(3.3)

we have

(3.4)

Step 3.

Indeed, we can take a subsequence $\{x_{n_k}\}$ of $\{x_n\}$ such that

(3.5)

We may assume that $x_{n_k} \rightharpoonup q$ by the reflexivity of $H$ and the boundedness of $\{x_n\}$. Then $q \in A^{-1}0$. In fact, since

(3.6)

then, for some constant $M > 0$, we have

(3.7)

Thus,

(3.8)

Taking limits on both sides of the above equation by means of (3.4), we must have $0 \in Aq$. So, $q \in A^{-1}0$. Hence, noting the projection inequality (2.3), we obtain

(3.9)

Step 4.

$\lim_{n\to\infty}\|x_n - P_{A^{-1}0}u\| = 0$. Indeed,

(3.10)

Therefore,

(3.11)

where the coefficient sequences satisfy the hypotheses of Lemma 2.1. So, an application of Lemma 2.1 to (3.11) yields the desired result.
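For the reader's convenience, here is a sketch of the standard computation that produces a recursion of the type (3.10)–(3.11), again under the assumption that (3.1) has the form given above. Write $y_n = \lambda_n u + (1 - \lambda_n)x_n$ and $v = P_{A^{-1}0}u$, and use $J_{c_n}v = v$, the nonexpansiveness of $J_{c_n}$, and the elementary inequality $\|a + b\|^2 \le \|a\|^2 + 2\langle b, a + b\rangle$:

\[
\begin{aligned}
\|x_{n+1} - v\|^2 &\le \|y_n - v\|^2 = \bigl\|(1 - \lambda_n)(x_n - v) + \lambda_n(u - v)\bigr\|^2 \\
&\le (1 - \lambda_n)^2\|x_n - v\|^2 + 2\lambda_n\langle u - v, \, y_n - v\rangle \\
&\le (1 - \lambda_n)\|x_n - v\|^2 + 2\lambda_n\langle u - v, \, y_n - v\rangle.
\end{aligned}
\]

Lemma 2.1 then applies with $a_n = \|x_n - v\|^2$, $\gamma_n = \lambda_n$, $\delta_n = 2\langle u - v, y_n - v\rangle$, and $\sigma_n = 0$, since Step 3 (together with $\lambda_n \to 0$ and the boundedness of $\{x_n\}$) yields $\limsup_{n\to\infty}\delta_n \le 0$.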

Theorem 3.2.

Let the assumptions be as in Theorem 3.1, except that the condition (iii) is replaced by the following condition:

(3.12)

Then the sequence $\{x_n\}$ converges strongly to $P_{A^{-1}0}u$, where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$.

Proof.

From the proof of Theorem 3.1, we can observe that Steps 1, 3, and 4 still hold. So we only need to verify the claim of Step 2.

We first derive an estimate from the resolvent identity (2.2):

(3.13)

Therefore, for a suitable constant,

(3.14)

It follows from Lemma 2.1 that

(3.15)

It follows that

(3.16)

By condition (3.12), there exist a constant and a positive integer $N$ such that the corresponding bound holds for all $n \ge N$. Thus, for each $n \ge N$, we also have

(3.17)

and hence the claim of Step 2 follows.

Corollary 3.3.

Let the sequences $\{\lambda_n\}$ and $\{c_n\}$ be as in Theorem 3.1 or 3.2. Suppose that $A$ is a maximal monotone operator on $H$ and that, for $u, x_0 \in H$, $\{x_n\}$ is defined by (3.1). Then the sequence $\{x_n\}$ converges strongly to $P_{A^{-1}0}u$, where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$.

Proof.

Since $A$ is maximal monotone, $A$ is monotone and satisfies the condition $R(I + rA) = H$ for all $r > 0$. Putting $C = H$, the desired result is reached.

Corollary 3.4.

Let the sequences $\{\lambda_n\}$ and $\{c_n\}$ be as in Theorem 3.1 or 3.2. Suppose that $A$ is a monotone operator on $H$ satisfying the condition $\overline{D(A)} \subset R(I + rA)$ for all $r > 0$ and that, for $u, x_0 \in \overline{D(A)}$, $\{x_n\}$ is defined by (3.1). If $\overline{D(A)}$ is convex, then the sequence $\{x_n\}$ converges strongly to $P_{A^{-1}0}u$, where $P_{A^{-1}0}$ is the metric projection from $H$ onto $A^{-1}0$.

Proof.

Taking $C = \overline{D(A)}$ and following Theorem 3.1 or 3.2, we easily obtain the desired result.

4. Weak Convergence Theorems

For a monotone operator $A$, if $D(A) \subset R(I + rA)$ for all $r > 0$ and $x_0 \in D(A)$, then the iteration $x_{n+1} = J_{c_n}x_n$ ($n \ge 0$) is well defined. Next we will show weak convergence of $\{x_n\}$ under some assumptions.

Theorem 4.1.

Let $A$ be a monotone operator on a Hilbert space $H$ with $A^{-1}0 \neq \emptyset$. Assume that $D(A) \subset R(I + rA)$ for all $r > 0$ and that, for an initial value $x_0 \in D(A)$, $\{x_n\}$ is iteratively defined by

\[ x_{n+1} = J_{c_n}x_n, \quad n \ge 0. \tag{4.1} \]

If $\{c_n\}$ satisfies

(4.2)

then the sequence $\{x_n\}$ converges weakly to some $v \in A^{-1}0$.

Proof.

Take $p \in A^{-1}0$; we have

\[ \|x_{n+1} - p\| = \|J_{c_n}x_n - J_{c_n}p\| \le \|x_n - p\|. \tag{4.3} \]

Therefore, $\{\|x_n - p\|\}$ is nonincreasing and bounded below, and hence the limit $\lim_{n\to\infty}\|x_n - p\|$ exists for each $p \in A^{-1}0$. Further, $\{x_n\}$ is bounded. So we have

(4.4)

Hence,

(4.5)

As $\{x_n\}$ is bounded, it is weakly sequentially compact by the reflexivity of $H$, and hence we may assume that there exists a subsequence $\{x_{n_k}\}$ of $\{x_n\}$ such that $x_{n_k} \rightharpoonup v$. Using the proof technique of Step 3 in Theorem 3.1, we must have that $v \in A^{-1}0$.

Now we prove that $\{x_n\}$ converges weakly to $v$. Suppose that there exists another subsequence $\{x_{m_j}\}$ of $\{x_n\}$ which converges weakly to some $w$. We also have $w \in A^{-1}0$. Because $\lim_{n\to\infty}\|x_n - p\|$ exists for each $p \in A^{-1}0$ and

(4.6)

thus,

(4.7)

Similarly, we also have

(4.8)

Adding up the above two equations, we must have $\|v - w\|^2 = 0$. So, $v = w$.

In summary, we have proved that the set $\{x_n\}$ is weakly sequentially compact and that each of its cluster points in the weak topology equals $v$. Hence, $\{x_n\}$ converges weakly to $v$. The proof is complete.
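As a small illustration of the iteration (4.1) studied in Theorem 4.1, the sketch below uses as $A$ the subdifferential of $f(x) = \max(|x| - 1, 0)$, an illustrative choice whose zero set is the whole interval $[-1, 1]$, together with the constant sequence $c_n = 1$ (also illustrative). In agreement with the theorem, the iterates settle at a point of $A^{-1}0$; of course, weak and strong convergence coincide in this one-dimensional example.

```python
def J(c, x):
    # Resolvent (proximal map) of A = subdifferential of f(x) = max(|x| - 1, 0);
    # here A^{-1}0 is the whole interval [-1, 1].  Illustrative choice only.
    if x > 1.0 + c:
        return x - c
    if x < -1.0 - c:
        return x + c
    return min(max(x, -1.0), 1.0)

x = 7.3
for n in range(20):
    c_n = 1.0            # illustrative regularization sequence
    x = J(c_n, x)        # exact proximal point iteration (4.1): x_{n+1} = J_{c_n} x_n
print(x)                 # 1.0, a zero of A (an element of [-1, 1])
```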

Theorem 4.2.

Let $A$ be a maximal monotone operator on a Hilbert space $H$ with $A^{-1}0 \neq \emptyset$. For an initial value $x_0 \in H$, iteratively define

\[ x_{n+1} = J_{c_n}(x_n + e_n), \quad n \ge 0. \tag{4.9} \]

If $\{c_n\}$ and $\{e_n\}$ satisfy

(4.10)

then the sequence $\{x_n\}$ converges weakly to some $v \in A^{-1}0$.

Proof.

Take $p \in A^{-1}0$ and $n \ge 0$; we have

(4.11)

It follows from Liu [21] that the limit $\lim_{n\to\infty}\|x_n - p\|$ exists for each $p \in A^{-1}0$, and hence both $\{\|x_n - p\|\}$ and $\{x_n\}$ are bounded. So we have

(4.12)

Hence,

(4.13)

The remainder of the proof is the same as that of Theorem 4.1; we omit it.

References

  1. Zeidler E: Nonlinear Functional Analysis and Its Applications, Part II: Monotone Operators. Springer, Berlin, Germany; 1985.


  2. Bruck RE Jr.: A strongly convergent iterative solution of $0 \in U(x)$ for a maximal monotone operator $U$ in Hilbert space. Journal of Mathematical Analysis and Applications 1974, 48: 114–126. 10.1016/0022-247X(74)90219-4


  3. Rockafellar RT: Monotone operators and the proximal point algorithm. SIAM Journal on Control and Optimization 1976,14(5):877–898. 10.1137/0314056


  4. Brézis H, Lions P-L: Produits infinis de résolvantes. Israel Journal of Mathematics 1978,29(4):329–345. 10.1007/BF02761171


  5. Reich S: Weak convergence theorems for nonexpansive mappings in Banach spaces. Journal of Mathematical Analysis and Applications 1979,67(2):274–276. 10.1016/0022-247X(79)90024-6


  6. Reich S: Strong convergence theorems for resolvents of accretive operators in Banach spaces. Journal of Mathematical Analysis and Applications 1980,75(1):287–292. 10.1016/0022-247X(80)90323-6


  7. Nevanlinna O, Reich S: Strong convergence of contraction semigroups and of iterative methods for accretive operators in Banach spaces. Israel Journal of Mathematics 1979,32(1):44–58. 10.1007/BF02761184


  8. Bruck RE, Reich S: A general convergence principle in nonlinear functional analysis. Nonlinear Analysis: Theory, Methods & Applications 1980,4(5):939–950. 10.1016/0362-546X(80)90006-1


  9. Jung JS, Takahashi W: Dual convergence theorems for the infinite products of resolvents in Banach spaces. Kodai Mathematical Journal 1991,14(3):358–365. 10.2996/kmj/1138039461


  10. Khang DB: On a class of accretive operators. Analysis 1990,10(1):1–16.


  11. Minty GJ: On the monotonicity of the gradient of a convex function. Pacific Journal of Mathematics 1964, 14: 243–247.


  12. Xu H-K: A regularization method for the proximal point algorithm. Journal of Global Optimization 2006,36(1):115–125. 10.1007/s10898-006-9002-7


  13. Güler O: On the convergence of the proximal point algorithm for convex minimization. SIAM Journal on Control and Optimization 1991,29(2):403–419. 10.1137/0329022


  14. Solodov MV, Svaiter BF: Forcing strong convergence of proximal point iterations in a Hilbert space. Mathematical Programming. Series A 2000,87(1):189–202.


  15. Kamimura S, Takahashi W: Strong convergence of a proximal-type algorithm in a Banach space. SIAM Journal on Optimization 2002,13(3):938–945. 10.1137/S105262340139611X


  16. Lehdili N, Moudafi A: Combining the proximal algorithm and Tikhonov regularization. Optimization 1996,37(3):239–252. 10.1080/02331939608844217


  17. Song Y, Yang C: A note on a paper "A regularization method for the proximal point algorithm". Journal of Global Optimization 2009,43(1):171–174. 10.1007/s10898-008-9279-9


  18. Aubin J-P, Ekeland I: Applied Nonlinear Analysis, Pure and Applied Mathematics (New York). John Wiley & Sons, New York, NY, USA; 1984:xi+518.


  19. Takahashi W: Nonlinear Functional Analysis—Fixed Point Theory and Its Applications. Yokohama Publishers, Yokohama, Japan; 2000:iv+276.


  20. Xu H-K: Strong convergence of an iterative method for nonexpansive and accretive operators. Journal of Mathematical Analysis and Applications 2006,314(2):631–643. 10.1016/j.jmaa.2005.04.082


  21. Liu Q: Iterative sequences for asymptotically quasi-nonexpansive mappings with error member. Journal of Mathematical Analysis and Applications 2001,259(1):18–24. 10.1006/jmaa.2000.7353



Acknowledgments

The authors are grateful to the anonymous referee for his/her valuable suggestions, which helped to improve this manuscript. This work is supported by the Youth Science Foundation of Henan Normal University (2008qk02) and by the Natural Science Research Projects (Basic Research Project) of the Education Department of Henan Province (2009B110011, 2009B110001).

Author information


Corresponding author

Correspondence to Yisheng Song.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License ( https://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Chai, X., Li, B. & Song, Y. Strong and Weak Convergence of the Modified Proximal Point Algorithms in Hilbert Space. Fixed Point Theory Appl 2010, 240450 (2010). https://doi.org/10.1155/2010/240450
