Discriminative Dimensionality Reduction Based on Generalized LVQ

… recognition. GLVQ is a general framework for classifier design based on the minimum classification error criterion, and it is easy to apply to dimensionality reduction in feature extraction. Experimental results reveal that training both a feature transformation matrix and reference vectors …
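The excerpt describes learning a linear feature transformation jointly with GLVQ reference vectors under the minimum classification error (GLVQ) cost. Below is a minimal NumPy sketch of that idea in the spirit of matrix GLVQ: distances are squared Euclidean in the projected space, and both the projection matrix and the prototypes are updated by stochastic gradient descent on the per-sample cost mu = (d+ - d-)/(d+ + d-). The function name, learning rates and initialisation are illustrative assumptions, not the chapter's actual implementation.

```python
import numpy as np

def train_gmlvq(X, y, n_dims=2, prototypes_per_class=1,
                lr_w=0.05, lr_omega=0.005, epochs=30, seed=0):
    """Sketch: GLVQ with a trainable linear projection Omega (matrix-GLVQ style).

    Distances are squared Euclidean in the projected space:
        d_j(x) = || Omega @ (x - w_j) ||^2
    The GLVQ cost per sample is mu = (d+ - d-) / (d+ + d-), where d+ / d- are
    the distances to the closest correct / wrong prototype.
    """
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    n_features = X.shape[1]

    # Initialise prototypes at the class means plus a little noise.
    protos, proto_labels = [], []
    for c in classes:
        mean_c = X[y == c].mean(axis=0)
        for _ in range(prototypes_per_class):
            protos.append(mean_c + 0.01 * rng.standard_normal(n_features))
            proto_labels.append(c)
    W = np.array(protos)
    proto_labels = np.array(proto_labels)

    # Rectangular projection matrix: n_dims x n_features.
    Omega = 0.1 * rng.standard_normal((n_dims, n_features))

    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            x, c = X[i], y[i]
            diffs = x - W                              # (n_protos, n_features)
            proj = diffs @ Omega.T                     # (n_protos, n_dims)
            dists = np.sum(proj ** 2, axis=1)

            same = proto_labels == c
            jp = np.where(same)[0][np.argmin(dists[same])]   # closest correct
            jm = np.where(~same)[0][np.argmin(dists[~same])]  # closest wrong
            dp, dm = dists[jp], dists[jm]

            denom = (dp + dm) ** 2 + 1e-12
            gp, gm = 2 * dm / denom, -2 * dp / denom   # d mu / d d+, d mu / d d-

            Lam = Omega.T @ Omega
            # Prototype updates: correct prototype attracted, wrong one repelled.
            W[jp] += lr_w * gp * 2 * Lam @ diffs[jp]
            W[jm] += lr_w * gm * 2 * Lam @ diffs[jm]
            # Projection update (gradient of mu w.r.t. Omega).
            grad_Omega = 2 * (gp * np.outer(Omega @ diffs[jp], diffs[jp])
                              + gm * np.outer(Omega @ diffs[jm], diffs[jm]))
            Omega -= lr_omega * grad_Omega

    return W, proto_labels, Omega
```

A new point is then classified by the label of its nearest prototype in the projected space, and the rows of Omega span the learned discriminative subspace.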
Clustering Gene Expression Data by Mutual Information with Gene Function

… space and become local there, while within-cluster differences between the associated, implicitly estimated conditional distributions of the discrete variable are minimized. The discrete variable can be seen as an indicator of relevance or importance guiding the clustering. Minimization of the Kullback-Leibler …
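The excerpt suggests that clusters stay local in the expression space while the Kullback-Leibler divergence between each gene's conditional distribution over the discrete function classes and the distribution implicitly estimated for its cluster is minimized. The sketch below only illustrates that within-cluster KL cost; the actual estimator, weighting and optimisation used in the chapter are not visible in the excerpt, so the helper name and the mean-based cluster distribution are assumptions.

```python
import numpy as np

def within_cluster_kl_cost(p_func_given_gene, assignments, eps=1e-12):
    """Average KL divergence between each gene's conditional distribution over
    discrete function classes and the distribution of its assigned cluster.

    p_func_given_gene : (n_genes, n_classes) array, rows sum to 1
    assignments       : (n_genes,) cluster index per gene
    """
    n_genes, _ = p_func_given_gene.shape
    cost = 0.0
    for k in np.unique(assignments):
        members = p_func_given_gene[assignments == k]
        # Cluster distribution implicitly estimated as the mean of its members.
        q = members.mean(axis=0) + eps
        q /= q.sum()
        p = members + eps
        p /= p.sum(axis=1, keepdims=True)
        cost += np.sum(p * np.log(p / q))
    return cost / n_genes
```

In a full method, a cost of this form would be combined with a locality constraint in the expression space and minimized over the cluster assignments.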
Approximation of Bayesian Discriminant Function by Neural Networks in Terms of Kullback-Leibler Information

… network, having rather a small number of hidden layer units, can approximate the Bayesian discriminant function for the two-category classification if the log ratio of the a posteriori probabilities is a polynomial. The accuracy of approximation is measured by the Kullback-Leibler information. An extension …
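The excerpt measures how well a small network approximates the Bayesian discriminant function by the Kullback-Leibler information between the true a posteriori distribution and the one induced by the network. A rough Monte-Carlo version of that measure is sketched below for a two-category problem whose true log posterior ratio is a polynomial; the function kl_information, the cubic example ratio and the hand-picked tanh stand-in for a trained network are illustrative assumptions only.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def kl_information(x_samples, true_log_ratio, approx_log_ratio, eps=1e-12):
    """Monte-Carlo estimate of the Kullback-Leibler information between the true
    posterior P(C1|x) = sigmoid(true_log_ratio(x)) and the approximation
    sigmoid(approx_log_ratio(x)), averaged over the samples of x.
    """
    p1 = np.clip(sigmoid(true_log_ratio(x_samples)), eps, 1 - eps)
    q1 = np.clip(sigmoid(approx_log_ratio(x_samples)), eps, 1 - eps)
    kl = p1 * np.log(p1 / q1) + (1 - p1) * np.log((1 - p1) / (1 - q1))
    return kl.mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(-2, 2, size=10_000)
    # True log posterior ratio: a cubic polynomial in one dimension.
    true_ratio = lambda t: 0.5 * t ** 3 - t
    # Stand-in for a trained two-hidden-unit network's output log ratio;
    # in practice this would come from an actual fitted network.
    net_ratio = lambda t: 3.0 * np.tanh(0.9 * t) - 2.6 * np.tanh(0.5 * t)
    print("KL information:", kl_information(x, true_ratio, net_ratio))
```

A smaller KL information indicates that the network's induced posterior is closer to the Bayesian one over the sampled input distribution.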