
Fuzzy Neural Intelligent Systems: Latest Publications

Application of Neuro-Fuzzy Systems: Development of a Fuzzy Learning Decision Tree and Application to Tactile Recognition
Pub Date : 2000-09-21 DOI: 10.1201/9781420057997.CH17
Hongxing Li, C. L. P. Chen, Han-Pang Huang
Citations: 0
Fuzzy Assessment Systems of Rehabilitative Process for CVA Patients
Pub Date : 2000-09-21 DOI: 10.1201/9781420057997.CH18
Hongxing Li, C. L. P. Chen, Han-Pang Huang
In recent years, cerebrovascular accidents (CVA) have become a serious health problem in our society, so assessing the state of CVA patients and guiding their rehabilitation is correspondingly important. Therapists train CVA patients according to the functional activities they need in their daily lives, and during these rehabilitative therapeutic activities the assessment of the patients' motor control ability is essential. In this chapter, a fuzzy diagnostic system is developed to evaluate the motor control ability of CVA patients. Patients are analyzed according to motor control abilities defined by kinetic signals; these signals are fed into the proposed fuzzy diagnostic system to assess global control ability, and the result is compared with the FIM (Functional Independence Measure) score, a clinical index used in hospitals to assess the state of CVA patients. It is shown that the proposed fuzzy diagnostic system can accurately assess the motor control ability of CVA patients.
Citations: 0
A DSP-based Neural Controller for a Multi-degree Prosthetic Hand
Pub Date : 2000-09-21 DOI: 10.1201/9781420057997.CH19
Hongxing Li, C. L. P. Chen, Han-Pang Huang
Citations: 0
Mathematical Essence and Structures of Feedforward Artificial Neural Networks
Pub Date : 2000-09-21 DOI: 10.1201/9781420057997.ch3
Hongxing Li, C. L. P. Chen, Han-Pang Huang
In this chapter, we first introduce the mathematical model and structure of artificial neurons. We then consider several artificial neural networks that assemble these neurons. This chapter does not attempt to explain the details of biological neurons; instead, we focus only on artificial neurons, which simply abstract the operation of biological neurons into mathematical models. We start with an introduction in Section 1, followed by a discussion of neuron models in Section
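As a minimal sketch of the neuron model the abstract describes — inputs synthesized by a weighted sum, then passed through an activation function — the following illustrates one artificial neuron. The function name `neuron`, the choice of `tanh` as activation, and the example values are ours, not the chapter's:

```python
import numpy as np

def neuron(x, w, theta, phi=np.tanh):
    """One artificial neuron: synthesize the inputs (weighted sum
    minus a threshold), then apply the activation function phi."""
    return phi(np.dot(w, x) - theta)

# Hypothetical example with two inputs
x = np.array([0.5, -1.0])
w = np.array([0.8, 0.3])
out = neuron(x, w, theta=0.1)
```

With these values the synthesized input is 0.4 - 0.3 - 0.1 = 0, so the tanh activation returns 0.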
Citations: 0
Foundation of Neuro-Fuzzy Systems and an Engineering Application
Pub Date : 2000-09-21 DOI: 10.1201/9781420057997.CH14
Hongxing Li, C. L. P. Chen, Han-Pang Huang
This chapter discusses the foundation of neuro-fuzzy systems. First, we introduce the Takagi, Sugeno, and Kang (TSK) fuzzy model [1,2] and its differences from the Mamdani model. Building on the TSK fuzzy model, we discuss a neuro-fuzzy system architecture, the Adaptive Network-based Fuzzy Inference System (ANFIS) developed by Jang [3]. This model allows a fuzzy system to learn its parameters adaptively. Using a hybrid learning algorithm, ANFIS can construct an input-output mapping based on both human knowledge and numerical data. Finally, the ANFIS architecture is applied to an engineering example: IC fabrication time estimation. The result is compared with two other algorithms, the Gauss-Newton-based Levenberg-Marquardt (GN) algorithm and the backpropagation neural network (BPNN) algorithm. Compared with these two methods, ANFIS gives the most accurate prediction at the expense of the highest computation cost. Moreover, the adaptation of the fuzzy inference system gives engineers more physical insight into the relationships between the parameters.
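To make the TSK idea behind ANFIS concrete, here is a hedged sketch of first-order TSK (Sugeno-type) inference for a single input: each rule has a Gaussian membership function as premise and a linear function as consequent, and the output is the firing-strength-weighted average of the consequents. The rule values and names (`gauss`, `tsk_infer`) are illustrative assumptions, not the chapter's actual system:

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership function with center c and width s."""
    return np.exp(-((x - c) ** 2) / (2 * s ** 2))

def tsk_infer(x, rules):
    """First-order TSK inference for a scalar input x.
    Each rule is (center, sigma, (p, q)) with consequent y = p*x + q.
    Output: firing-strength-weighted average of rule consequents."""
    w = np.array([gauss(x, c, s) for c, s, _ in rules])
    y = np.array([p * x + q for _, _, (p, q) in rules])
    return float(np.dot(w, y) / np.sum(w))

# Two hypothetical rules: "x is LOW" -> y = 0.2x + 1, "x is HIGH" -> y = 1.5x - 2
rules = [(0.0, 1.0, (0.2, 1.0)), (5.0, 1.0, (1.5, -2.0))]
result = tsk_infer(2.5, rules)
```

In ANFIS the premise parameters (centers, widths) and consequent parameters (p, q) would be learned by the hybrid algorithm; here they are fixed for illustration. At x = 2.5 both rules fire equally, so the output is the plain average of the two consequents, 1.625.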
Citations: 5
Flat Neural Networks and Rapid Learning Algorithms
Pub Date : 2000-09-21 DOI: 10.1201/9781420057997.CH5
Hongxing Li, C. L. P. Chen, Han-Pang Huang
In this chapter, we will introduce the flat neural network architecture. The system equations of flat neural networks can be formulated as a linear system, so the performance index is a quadratic form in the weights, and the network weights can be solved easily by a linear least-squares method. Even though their system equations are linear in the weights, flat neural networks are also well suited to approximating non-linear functions. A fast learning algorithm is given to find the optimal weights of a flat neural network. This formulation makes it easy to update the weights instantly for both a newly added input and a newly added node, and a dynamic stepwise updating algorithm is given to update the weights of the system instantly. Finally, we give several application examples of flat neural networks: an infrared laser data set, a chaotic time series, a monthly flour price data set, and a non-linear system identification problem. The simulation results are compared with existing models that need more complex architectures and more costly training. The results indicate that flat neural networks are very attractive for real-time processes.

5.1 Introduction

Feedforward artificial neural networks have recently been a popular research subject. Research topics range from the theory of learning algorithms, such as the learning and generalization properties of the networks, to a variety of applications in control, classification, biomedicine, manufacturing, business forecasting, and so on. The backpropagation (BP) supervised learning algorithm is one of the most popular learning algorithms developed for layered networks [1,2]. Improving the learning speed of BP and increasing the generalization capability of the networks have played a central role in neural network research [3-9]. Apart from multi-layer network architectures and the BP algorithm, various simplified architectures and different non-linear activation functions have been devised. Among those, so-called flat networks, including functional-link neural networks and radial basis function networks, have been proposed [10-15]. These flat networks remove the drawback
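The key property claimed above — that flat-network weights reduce to a linear least-squares problem — can be sketched as follows: fixed random non-linear enhancement nodes expand the input, and the output weights are solved in one step with no backpropagation. This is our own minimal functional-link-style illustration (the data, node count, and seed are assumptions), not the book's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-linear target on scalar inputs
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(np.pi * X).ravel()

# Flat network: fixed random tanh enhancement nodes; only the
# output weights are trained, via one linear least-squares solve.
n_nodes = 30
W_in = rng.normal(size=(1, n_nodes))      # random, never trained
b = rng.normal(size=n_nodes)
H = np.tanh(X @ W_in + b)                 # hidden feature matrix (200 x 30)
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # optimal output weights

mse = np.mean((H @ beta - y) ** 2)
```

Because the performance index is quadratic in `beta`, the solve is closed-form; the stepwise updating the abstract mentions would extend this with rank-one updates when nodes or inputs are added.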
Citations: 0
Mathematical Essence and Structures of Feedback Neural Networks and Weight Matrix Design
Pub Date : 2000-09-21 DOI: 10.1201/9781420057997.CH7
Hongxing Li, C. L. P. Chen, Han-Pang Huang
This chapter focuses on the mathematical essence and structures of neural networks and fuzzy neural networks, especially discrete feedback neural networks. We begin with a review of Hopfield networks and discuss the mathematical essence and structures of discrete feedback neural networks. First, we discuss a general criterion for the stability of networks and show that the commonly used energy function can be regarded as a special case of this criterion. Second, we show that the stable points of a network can be converted into the fixed points of some function, and that the weight matrix of a feedback neural network can be solved from a group of systems of linear equations. Last, we point out the mathematical basis of the outer-product learning method and give several examples of designing weight matrices based on multifactorial functions. In previous chapters, we discussed in detail the mathematical essence and structures of feedforward neural networks. Here, we study the mathematical essence and structures of feedback neural networks, namely the Hopfield networks [1]. A figure in the chapter illustrates a single-layer Hopfield net with n neurons, whose outer input variables are usually treated as "the first impetus": after being applied they are removed, and the network continues to evolve by itself. The connection weights satisfy w_ij = w_ji and w_ii = 0. The activation functions of the neurons are denoted by φ_i, with threshold values θ_i.
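The outer-product learning method mentioned above can be sketched in a few lines: the weight matrix is the sum of outer products of the stored patterns, with the diagonal zeroed so that w_ii = 0 and symmetry w_ij = w_ji holds by construction. This is a generic Hebbian/Hopfield sketch under our own naming (`hopfield_weights`, `recall`), not the chapter's specific design:

```python
import numpy as np

def hopfield_weights(patterns):
    """Outer-product (Hebbian) weight design: W = sum_k x_k x_k^T,
    with zero diagonal; symmetry w_ij = w_ji holds by construction."""
    W = sum(np.outer(p, p) for p in patterns)
    np.fill_diagonal(W, 0)
    return W

def recall(W, state, steps=10):
    """Synchronous update with sign activation and zero thresholds."""
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1, -1)
    return state

# Store one bipolar pattern, then recover it from a corrupted version
pattern = np.array([1, -1, 1, -1, 1])
W = hopfield_weights(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -1                      # flip one bit ("the first impetus")
restored = recall(W, noisy)
```

With a single stored pattern the corrupted state converges back to the pattern in one synchronous step, illustrating the stored pattern as a stable (fixed) point of the update map.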
Citations: 0
Basic Structure of Fuzzy Neural Networks
Pub Date : 2000-09-21 DOI: 10.1201/9781420057997.CH6
Hongxing Li, C. L. P. Chen, Han-Pang Huang
In this chapter we shall discuss the structure of fuzzy neural networks. We start with general definitions of multifactorial functions and show that a fuzzy neuron can be formulated by means of a standard multifactorial function. We also give definitions of a fuzzy neural network based on fuzzy relationships and fuzzy neurons. Finally, we describe a learning algorithm for a fuzzy neural network based on the ∨ and ∧ operations.

6.1 Definition of Fuzzy Neurons

Neural networks alone have demonstrated their ability to classify, recall, and associate information [1]. In this chapter, we shall incorporate fuzziness into the networks. The objective of including fuzziness is to extend the capability of neural networks to handle "vague" information rather than "crisp" information only. Previous work has shown that fuzzy neural networks have achieved some level of success both fundamentally and practically [1-10]. As indicated in reference [1], there are several ways to classify fuzzy neural networks: (1) a fuzzy neuron with crisp signals used to evaluate fuzzy weights, (2) a fuzzy neuron with fuzzy signals combined with fuzzy weights, and (3) a fuzzy neuron described by fuzzy logic equations. In this chapter, we shall discuss a fuzzy neural network in which both inputs and outputs can be either crisp values or fuzzy sets. To do this we shall first introduce multifactorial functions [11,12]. We pointed out in Chapter 4 that one of the basic functions of a neuron is that its input is synthesized first and then activated, where the basic operators used for synthesizing are "+" and "·", denoted by (+, ·) and called synthetic operators. However, there are various other styles of synthetic operators, which will be multifactorial functions, so we now briefly introduce the concept of multifactorial functions. In [0,1]^m, a natural partial ordering "≤" is defined as follows: a multifactorial function is actually a projective mapping from an m-ary space to a
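The ∨ and ∧ operations the abstract refers to are max and min over membership grades. A common fuzzy-neuron synthesis replaces the (+, ·) operators with (∨, ∧), giving a max-min composition. The sketch below is our own illustration of that replacement (function name and values assumed), not necessarily the chapter's exact neuron:

```python
import numpy as np

def fuzzy_neuron_maxmin(x, w):
    """Fuzzy neuron with (∨, ∧) synthesis: y = max_i min(w_i, x_i).
    Inputs x and weights w are membership grades in [0, 1]."""
    return float(np.max(np.minimum(w, x)))

# Hypothetical membership grades
x = np.array([0.2, 0.7, 0.5])
w = np.array([0.9, 0.4, 0.6])
y = fuzzy_neuron_maxmin(x, w)
```

Here min(w_i, x_i) gives (0.2, 0.4, 0.5) and the max selects 0.5; unlike the (+, ·) neuron, the output is always bounded by the largest input grade, which is what makes the operation a membership-grade synthesis rather than an arithmetic sum.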
Citations: 0