New Results on Stability for a Class of Fractional-Order Static Neural Networks



Xiangqian Yao1 · Meilan Tang1 · Fengxian Wang2 · Zhijian Ye1 · Xinge Liu1

Received: 28 November 2019 / Revised: 6 May 2020 / Accepted: 9 May 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020

Abstract
This paper investigates the stability of a class of fractional-order static neural networks. Two new Lyapunov functions containing suitable integral terms are constructed; these integrals with variable upper limits are convex functions. Based on the fractional-order Lyapunov direct method and several inequality techniques, novel sufficient stability conditions guaranteeing the global Mittag–Leffler stability of fractional-order projection neural networks (FPNNs) are presented in the form of linear matrix inequalities (LMIs). Two less conservative LMI-based Mittag–Leffler stability criteria are given for a special class of FPNNs. Finally, the effectiveness of the proposed method is demonstrated via four numerical examples.

Keywords Fractional-order · Projection neural networks · Convex Lyapunov function · Mittag–Leffler stability · Linear matrix inequality
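For orientation, the standard definitions behind the abstract's terminology can be sketched as follows; the precise system model and constants used in this paper appear in its preliminaries, so the display below is only a generic reference, not the paper's own formulation.

```latex
% Caputo fractional derivative of order 0 < \alpha < 1:
D^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)}
  \int_{0}^{t} \frac{f'(s)}{(t-s)^{\alpha}} \, ds,

% one-parameter Mittag--Leffler function:
E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)},

% Mittag--Leffler stability of the equilibrium x^{*}=0: there exist
% \lambda > 0, b > 0 and a locally Lipschitz m(\cdot)\ge 0 with m(0)=0 such that
\|x(t)\| \le \bigl[\, m\bigl(x(0)\bigr)\, E_{\alpha}\bigl(-\lambda t^{\alpha}\bigr) \bigr]^{b}.
```

Since $E_{1}(z)=e^{z}$, Mittag–Leffler stability reduces to the familiar exponential stability bound when $\alpha = 1$.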

This work is partly supported by the National Science Foundation of China under Grants 61773404 and 11601104 and Fundamental Research Funds for the Central Universities of Central South University 2018zzts316.


Meilan Tang [email protected]
Xinge Liu [email protected]
Xiangqian Yao [email protected]
Fengxian Wang [email protected]
Zhijian Ye [email protected]

1 School of Mathematics and Statistics, Central South University, Changsha 410083, China

2 School of Electrical and Information Engineering, Zhengzhou University of Light Industry, Zhengzhou 450002, China

Circuits, Systems, and Signal Processing

1 Introduction

In recent years, neural networks have been successfully applied in many areas such as image and signal processing, associative memory, structural and material optimization, and even mechanics [22,24,25,33]. It is well known that stability is the primary prerequisite for most applications of neural networks, so the stability of neural networks has been extensively studied [12,14,31]. In 2009, Syed Ali et al. [37] generalized the Hopfield neural network to an uncertain Takagi–Sugeno (TS) fuzzy Hopfield neural network with time delays and addressed its stability using Lyapunov functional theory, obtaining a novel LMI-based asymptotic stability criterion. Markovian jump neural networks have been an interesting subject of recent research on stability analysis because they can model dynamic systems whose structures undergo random abrupt parameter changes. Based on the properties of the infinitesimal generator and inequality techniques, some novel delay-dependent mean-square asymptotic stability conditions for Markovian jump generalized neural networks with interval time-varying delays are presented in [36]. By employing the average dwell time approach, the stochastic finite-time control problem for a class of Markovian jump-switched neural networks