In this section, we introduce our implicit extragradient algorithm and show its strong convergence to the unique solution of the hierarchical variational inequality.
Algorithm 1.
Let $C$ be a closed convex subset of a real Hilbert space $H$. Let $A\colon C\to H$ be an $\alpha$-inverse strongly monotone mapping. Let $f\colon C\to H$ be a (possibly nonself) contraction with coefficient $\rho\in[0,1)$. For any $t\in(0,1)$, define a net $\{x_t\}$ as follows:
$$\begin{cases} y_t = P_C(x_t-\lambda Ax_t),\\ x_t = P_C\bigl[tf(x_t)+(1-t)y_t\bigr],\end{cases}\tag{3.1}$$
where $\lambda\in(0,2\alpha)$ is a constant.
Note that $f$ is possibly a nonself mapping. Hence, if we take $f=0$, then (3.1) reduces to
$$x_t = P_C\bigl[(1-t)P_C(x_t-\lambda Ax_t)\bigr].\tag{3.2}$$
Remark 3.1.
We notice that the net $\{x_t\}$ defined by (3.1) is well defined. In fact, we can define a self-mapping $W_t\colon C\to C$ as follows:
$$W_t x := P_C\bigl[tf(x)+(1-t)P_C(x-\lambda Ax)\bigr],\quad x\in C.$$
From Lemma 2.1, we know that if $\lambda\in(0,2\alpha)$, the mapping $P_C(I-\lambda A)$ is nonexpansive.
For any $x,y\in C$, we have
$$\begin{aligned}
\|W_t x - W_t y\| &\le t\|f(x)-f(y)\| + (1-t)\|P_C(I-\lambda A)x - P_C(I-\lambda A)y\|\\
&\le t\rho\|x-y\| + (1-t)\|x-y\|\\
&= \bigl[1-(1-\rho)t\bigr]\|x-y\|.
\end{aligned}$$
This shows that the mapping $W_t$ is a contraction. By the Banach contraction mapping principle, we immediately deduce that the net defined by (3.1) is well defined.
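The well-definedness argument above is constructive: for each fixed $t$, the point $x_t$ is the unique fixed point of the contraction $W_t$ and can be found by Banach iteration. The sketch below does exactly that on a concrete instance; every concrete choice here (the set $C$, the matrix $M$ defining $A$, the contraction $f$, and all helper names) is our own illustrative assumption, not taken from the text.

```python
import numpy as np

# Illustrative instance (our assumptions, not from the text):
# H = R^2, C = closed unit ball, A x = M x with M symmetric positive
# semidefinite, so A is alpha-inverse strongly monotone with
# alpha = 1/||M||; f is a contraction with coefficient rho = 1/2.
M = np.array([[2.0, 0.0],
              [0.0, 1.0]])
alpha = 0.5                # = 1 / largest eigenvalue of M
lam = 0.5                  # lambda chosen in (0, 2*alpha) = (0, 1)
rho = 0.5

def f(x):
    """A rho-contraction with rho = 1/2 (illustrative choice)."""
    return 0.5 * x + np.array([0.2, 0.1])

def P_C(x):
    """Metric projection onto the closed unit ball."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def W(x, t):
    """The self-mapping W_t x = P_C[t f(x) + (1 - t) P_C(x - lam A x)]."""
    y = P_C(x - lam * (M @ x))           # extragradient step
    return P_C(t * f(x) + (1.0 - t) * y)

def solve_net(t, iters=4000):
    """x_t is the unique fixed point of the contraction W_t
    (contraction factor 1 - (1 - rho) t); Banach iteration finds it."""
    x = np.zeros(2)
    for _ in range(iters):
        x = W(x, t)
    return x

x_t = solve_net(0.1)
residual = np.linalg.norm(x_t - W(x_t, 0.1))   # ~0: x_t solves (3.1)
```

The generous iteration count reflects that the contraction factor $1-(1-\rho)t$ approaches $1$ as $t\to 0^+$, so computing $x_t$ for very small $t$ by plain Banach iteration becomes slow.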
Theorem 3.2.
Suppose the solution set $\Omega$ of the variational inequality $VI(C,A)$ is nonempty. Then the net $\{x_t\}$ generated by the implicit extragradient method (3.1) converges in norm, as $t\to 0^+$, to the unique solution $\tilde{x}$ of the hierarchical variational inequality
$$\tilde{x}\in\Omega,\qquad \langle (I-f)\tilde{x}, x-\tilde{x}\rangle \ge 0,\quad x\in\Omega.$$
In particular, if one takes $f=0$, then the net $\{x_t\}$ defined by (3.2) converges in norm, as $t\to 0^+$, to the minimum-norm solution of the variational inequality $VI(C,A)$.
Proof.
Take $z\in\Omega$. Since $z\in\Omega$, using the relation (2.4), we have $z = P_C(z-\mu Az)$ for all $\mu>0$. In particular, if we take $\mu=\lambda$, we obtain $z = P_C(z-\lambda Az)$.
From (3.1), we have
$$\|x_t-z\| \le t\|f(x_t)-z\| + (1-t)\|y_t-z\|,$$
where $y_t = P_C(x_t-\lambda Ax_t)$. Noting that $P_C(I-\lambda A)$ is nonexpansive, thus,
$$\|x_t-z\| \le t\|f(x_t)-f(z)\| + t\|f(z)-z\| + (1-t)\|x_t-z\| \le t\rho\|x_t-z\| + t\|f(z)-z\| + (1-t)\|x_t-z\|.$$
That is,
$$\|x_t-z\| \le \frac{\|f(z)-z\|}{1-\rho}.$$
Therefore, $\{x_t\}$ is bounded, and so are $\{y_t\}$ and $\{f(x_t)\}$. Since $A$ is $\alpha$-inverse strongly monotone, it is $\tfrac{1}{\alpha}$-Lipschitz continuous. Consequently, $\{Ax_t\}$ and $\{Ay_t\}$ are also bounded.
From (3.6), (2.5), and the convexity of the norm, we deduce
$$\begin{aligned}
\|x_t-z\|^2 &\le t\|f(x_t)-z\|^2 + (1-t)\|y_t-z\|^2\\
&\le t\|f(x_t)-z\|^2 + (1-t)\bigl[\|x_t-z\|^2 + \lambda(\lambda-2\alpha)\|Ax_t-Az\|^2\bigr].
\end{aligned}$$
Therefore, we have
$$(1-t)\lambda(2\alpha-\lambda)\|Ax_t-Az\|^2 \le t\|f(x_t)-z\|^2.$$
Hence
$$\lim_{t\to 0^+}\|Ax_t-Az\| = 0.$$
By the property (ii) of the metric projection $P_C$, we have
$$\begin{aligned}
\|y_t-z\|^2 &\le \langle x_t-\lambda Ax_t-(z-\lambda Az),\, y_t-z\rangle\\
&= \tfrac12\bigl(\|x_t-\lambda Ax_t-(z-\lambda Az)\|^2 + \|y_t-z\|^2 - \|x_t-y_t-\lambda(Ax_t-Az)\|^2\bigr)\\
&\le \tfrac12\bigl(\|x_t-z\|^2 + \|y_t-z\|^2 - \|x_t-y_t\|^2 + 2\lambda\langle x_t-y_t, Ax_t-Az\rangle - \lambda^2\|Ax_t-Az\|^2\bigr),
\end{aligned}$$
and therefore
$$\|y_t-z\|^2 \le \|x_t-z\|^2 - \|x_t-y_t\|^2 + \lambda M\|Ax_t-Az\|,$$
where $M$ is some appropriate constant. It follows that
$$\|x_t-z\|^2 \le t\|f(x_t)-z\|^2 + (1-t)\bigl(\|x_t-z\|^2 - \|x_t-y_t\|^2 + \lambda M\|Ax_t-Az\|\bigr),$$
and hence (by (3.7))
$$(1-t)\|x_t-y_t\|^2 \le t\|f(x_t)-z\|^2 + \lambda M\|Ax_t-Az\|,$$
which implies that
$$\|x_t-y_t\|^2 \le \frac{t}{1-t}\|f(x_t)-z\|^2 + \frac{\lambda M}{1-t}\|Ax_t-Az\|.$$
Since $\|Ax_t-Az\|\to 0$ as $t\to 0^+$, we derive
$$\lim_{t\to 0^+}\|x_t-y_t\| = 0.$$
Next, we show that the net $\{x_t\}$ is relatively norm-compact as $t\to 0^+$. Assume that $\{t_n\}\subset(0,1)$ is such that $t_n\to 0^+$ as $n\to\infty$. Put $x_n := x_{t_n}$ and $y_n := y_{t_n}$.
By the property (ii) of the metric projection $P_C$, we have
$$\|x_t-z\|^2 \le \langle tf(x_t)+(1-t)y_t - z,\, x_t-z\rangle = t\langle f(x_t)-z, x_t-z\rangle + (1-t)\langle y_t-z, x_t-z\rangle.$$
Hence
$$\begin{aligned}
\|x_t-z\|^2 &\le t\langle f(x_t)-f(z), x_t-z\rangle + t\langle f(z)-z, x_t-z\rangle + (1-t)\|y_t-z\|\,\|x_t-z\|\\
&\le \bigl[1-(1-\rho)t\bigr]\|x_t-z\|^2 + t\langle f(z)-z, x_t-z\rangle.
\end{aligned}$$
Therefore,
$$\|x_t-z\|^2 \le \frac{1}{1-\rho}\langle f(z)-z, x_t-z\rangle,\quad z\in\Omega.$$
In particular,
$$\|x_n-z\|^2 \le \frac{1}{1-\rho}\langle f(z)-z, x_n-z\rangle,\quad z\in\Omega.\tag{3.20}$$
Since $\{x_n\}$ is bounded, without loss of generality, we may assume that $\{x_n\}$ converges weakly to a point $\tilde{x}$. Since $\|x_n-y_n\|\to 0$, we have $y_n-x_n\to 0$. Hence, $\{y_n\}$ also converges weakly to the same point $\tilde{x}$.
Next we show that $\tilde{x}\in\Omega$. We define a mapping $T$ by
$$Tv = \begin{cases} Av + N_C v, & v\in C,\\ \emptyset, & v\notin C,\end{cases}$$
where $N_C v$ is the normal cone of $C$ at $v$. Then $T$ is maximal monotone (see [33]). Let $(v,w)\in G(T)$. Since $w-Av\in N_C v$ and $y_n\in C$, we have $\langle v-y_n, w-Av\rangle \ge 0$. On the other hand, from $y_n = P_C(x_n-\lambda Ax_n)$, we have
$$\langle v-y_n,\, y_n-(x_n-\lambda Ax_n)\rangle \ge 0,$$
that is,
$$\Bigl\langle v-y_n,\, \frac{y_n-x_n}{\lambda}+Ax_n\Bigr\rangle \ge 0.$$
Therefore, we have
$$\begin{aligned}
\langle v-y_n, w\rangle &\ge \langle v-y_n, Av\rangle\\
&\ge \langle v-y_n, Av\rangle - \Bigl\langle v-y_n, \frac{y_n-x_n}{\lambda}+Ax_n\Bigr\rangle\\
&= \langle v-y_n, Av-Ay_n\rangle + \langle v-y_n, Ay_n-Ax_n\rangle - \Bigl\langle v-y_n, \frac{y_n-x_n}{\lambda}\Bigr\rangle\\
&\ge \langle v-y_n, Ay_n-Ax_n\rangle - \Bigl\langle v-y_n, \frac{y_n-x_n}{\lambda}\Bigr\rangle.
\end{aligned}$$
Noting that $\|x_n-y_n\|\to 0$, $y_n\rightharpoonup\tilde{x}$, and $A$ is Lipschitz continuous, we obtain $\langle v-\tilde{x}, w\rangle \ge 0$. Since $T$ is maximal monotone, we have $0\in T\tilde{x}$ and hence $\tilde{x}\in\Omega$.
Therefore, we can substitute $\tilde{x}$ for $z$ in (3.20) to get
$$\|x_n-\tilde{x}\|^2 \le \frac{1}{1-\rho}\langle f(\tilde{x})-\tilde{x}, x_n-\tilde{x}\rangle.$$
Consequently, the weak convergence of $\{x_n\}$ and $\{y_n\}$ to $\tilde{x}$ actually implies that $x_n\to\tilde{x}$ strongly. This proves the relative norm-compactness of the net $\{x_t\}$ as $t\to 0^+$.
Now we return to (3.20) and take the limit as $n\to\infty$ to get
$$\|\tilde{x}-z\|^2 \le \frac{1}{1-\rho}\langle f(z)-z, \tilde{x}-z\rangle,\quad z\in\Omega.$$
In particular, $\tilde{x}$ solves the following variational inequality
$$\tilde{x}\in\Omega,\qquad \langle (I-f)z, z-\tilde{x}\rangle \ge 0,\quad z\in\Omega,$$
or the equivalent dual variational inequality (see Lemma 2.2)
$$\tilde{x}\in\Omega,\qquad \langle (I-f)\tilde{x}, z-\tilde{x}\rangle \ge 0,\quad z\in\Omega.\tag{3.28}$$
Therefore, $\tilde{x} = P_\Omega f(\tilde{x})$; that is, $\tilde{x}$ is the unique fixed point in $\Omega$ of the contraction $P_\Omega f$. Clearly, this suffices to conclude that the entire net $\{x_t\}$ converges in norm to $\tilde{x}$ as $t\to 0^+$.
Finally, if we take $f=0$, then VI (3.28) reduces to
$$\tilde{x}\in\Omega,\qquad \langle \tilde{x}, z-\tilde{x}\rangle \ge 0,\quad z\in\Omega.$$
Equivalently,
$$\tilde{x} = P_\Omega(0).$$
This clearly implies that
$$\|\tilde{x}\| \le \|z\|,\quad z\in\Omega.$$
Therefore, $\tilde{x}$ is the minimum-norm solution of $VI(C,A)$. This completes the proof.
Remark 3.3.

(1) Note that our implicit extragradient algorithms (3.1) and (3.2) converge strongly in an infinite-dimensional Hilbert space.

(2) In many problems, one needs to find a solution with minimum norm; see [34–38]. Our algorithm (3.2) finds the minimum-norm solution of $VI(C,A)$.
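To illustrate the minimum-norm property of the $f=0$ net (3.2), the sketch below runs it on a small instance of our own choosing (the set $C$, the matrix $M$, and all names are illustrative assumptions, not from the text): $C$ is the closed ball of radius $1$ centered at $(0,2)$ and $Ax = Mx$ with $M=\operatorname{diag}(1,0)$, which is $1$-inverse strongly monotone. The solution set of $VI(C,A)$ is then the whole segment $\{0\}\times[1,3]$, and its minimum-norm element is $(0,1)$.

```python
import numpy as np

# Illustrative instance (our assumptions, not from the text):
# C = closed ball of radius 1 centered at c = (0, 2),
# A x = M x with M = diag(1, 0), which is 1-inverse strongly monotone.
# The solution set Omega of VI(C, A) is the segment {0} x [1, 3];
# its minimum-norm element is (0, 1).
c = np.array([0.0, 2.0])
M = np.array([[1.0, 0.0],
              [0.0, 0.0]])
lam = 1.0                  # lambda in (0, 2*alpha) with alpha = 1

def P_C(x):
    """Metric projection onto the ball {x : ||x - c|| <= 1}."""
    d = x - c
    n = np.linalg.norm(d)
    return x if n <= 1.0 else c + d / n

def solve_net(t, iters=4000):
    """Net (3.2): x_t = P_C[(1 - t) P_C(x_t - lam A x_t)], computed
    by Banach iteration (contraction factor 1 - t)."""
    x = c.copy()
    for _ in range(iters):
        x = P_C((1.0 - t) * P_C(x - lam * (M @ x)))
    return x

for t in (0.2, 0.05, 0.01):
    x_t = solve_net(t)
    # each computed x_t sits at the minimum-norm solution (0, 1)
```

In this particular instance one can check that $(0,1)$ is the exact fixed point of (3.2) for every $t$, so the net is constant at the minimum-norm solution; in general, the theorem only guarantees convergence to it as $t\to 0^+$.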