Abstract:
A general projection neural network is considered, and its stability and convergence are analyzed. When the sum of the underlying mappings is asymmetric, global convergence and exponential stability of the general projection neural network are rigorously established under mild conditions by constructing suitable energy functions. Compared with existing results for this network, the given stability conditions require neither the differentiability of the mappings nor the symmetry of their sum. Theoretical analysis and illustrative examples show that the obtained results apply to some non-monotone problems and that the given conditions can be easily checked. Since this network can be used to solve a broad class of optimization and equilibrium problems, the obtained results are significant in both theory and application.
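As a concrete illustration (not taken from the paper), projection neural networks for a variational inequality VI(F, Ω) are often written in the form dx/dt = λ(P_Ω(x − αF(x)) − x), where P_Ω is the projection onto the feasible set Ω. The sketch below simulates this dynamics by explicit Euler integration for a hypothetical affine mapping F(x) = Mx + q with a deliberately asymmetric matrix M and the box Ω = [0, 1]²; all numerical values are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Hypothetical problem data: M is asymmetric (M != M^T), yet its
# symmetric part is positive definite, so the dynamics still converge.
M = np.array([[3.0, 1.0],
              [-1.0, 2.0]])
q = np.array([-1.0, -1.0])

def F(x):
    """Affine mapping F(x) = Mx + q defining the variational inequality."""
    return M @ x + q

def proj_box(y, lo=0.0, hi=1.0):
    """Projection P_Omega onto the box [lo, hi]^n."""
    return np.clip(y, lo, hi)

def simulate(x0, alpha=0.1, lam=1.0, dt=0.01, steps=5000):
    """Euler integration of dx/dt = lam * (P_Omega(x - alpha*F(x)) - x)."""
    x = x0.astype(float)
    for _ in range(steps):
        dx = lam * (proj_box(x - alpha * F(x)) - x)
        x = x + dt * dx
    return x

x_star = simulate(np.array([0.9, 0.1]))
# At an equilibrium, x* is a fixed point: x* = P_Omega(x* - alpha*F(x*)),
# which is exactly the VI solution; the residual below should be near zero.
residual = np.linalg.norm(x_star - proj_box(x_star - 0.1 * F(x_star)))
print(x_star, residual)
```

For this particular M and q the unconstrained solution of Mx + q = 0 lies inside the box, so the trajectory settles at that point; starting from other initial states in the box yields the same equilibrium, consistent with global convergence.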