
Fuzzy associative memory network based on parameterized gathering operator (带参数聚合算子的模糊联想记忆网络)

【Author】 LI Ying (李鹰)1, XU Wei-hong (徐蔚鸿)1,2, TANG Liang-rong (唐良荣)1 (1. College of Computer and Communications Engineering, Changsha University of Science and Technology, Changsha, Hunan 410076, China; 2. College of Mathematics and Computer Science, Jishou University, Jishou, Hunan 416000, China)

【Abstract】 The neural network model Max-T FAM, based on the maximum operation Max and a t-norm T, is an important generalized form of the classical fuzzy associative memory (FAM) network proposed by B. Kosko, but it has several shortcomings. Using a parameterized aggregation operator ∨λ, we present a new generalized fuzzy associative memory (GFAM) network that is simple in computation and easy to implement in hardware; the connection operators of its interconnections are chosen from the family {∨λ | λ ∈ [0, 1]}. A strict theoretical study shows that GFAM is uniformly continuous and has much higher mapping ability and stronger storage capability than all Max-T FAMs. From the theory of fuzzy relational equations, we then derive and analyze a so-called Max-Min-λ learning algorithm for GFAM. Finally, the complete and reliable storage capabilities of GFAM and all Max-T FAMs are compared experimentally, and an application of GFAM to image association is illustrated.
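
The record's abstract does not define the operator ∨λ or the GFAM recall rule, so only the baseline model can be illustrated here. As background, the following is a minimal NumPy sketch (an illustrative assumption, not the paper's algorithm) of the recall step of a Max-T FAM, y_i = max_j T(w_ij, x_j), with T taken as min, which gives B. Kosko's classical FAM that the paper generalizes; the function name and the single-pair correlation-minimum encoding are illustrative choices.

import numpy as np

def max_t_fam_recall(W, x, t_norm=np.minimum):
    # Recall of a Max-T FAM: y[i] = max over j of T(W[i, j], x[j]),
    # here with T = min, i.e. Kosko's classical max-min FAM.
    return np.max(t_norm(W, x[np.newaxis, :]), axis=1)

# Correlation-minimum encoding of a single fuzzy pattern pair (x, y):
# W[i, j] = min(y[i], x[j]); several pairs would be superimposed by an
# elementwise max of their individual W matrices.
x = np.array([0.2, 0.8, 1.0])
y = np.array([0.5, 1.0])
W = np.minimum.outer(y, x)
print(max_t_fam_recall(W, x))   # recalls y = [0.5, 1.0] exactly here, since max(x) = 1

A Max-T FAM replaces min above with another t-norm T (for example, the product); the GFAM of this paper instead draws its connection operators from the parameterized family {∨λ | λ ∈ [0, 1]}, whose exact definition is given in the paper rather than in this record.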

【Fund】 Supported by the National Natural Science Foundation of China (6003302) and the Key Scientific Research Foundation of the Ministry of Education of China (208098)
  • 【Source】 Control Theory & Applications (控制理论与应用), No. 11, 2010
  • 【CLC Number】 TP183
  • 【Cited By】 1
  • 【Downloads】 98