同来核对 posted on 2025-3-28 16:06:18
Metric Learning: A Support Vector Approach
…semi-definite programming problem (QSDP) with local neighborhood constraints, which is based on the Support Vector Machine (SVM) framework. The local neighborhood constraints ensure that examples of the same class are separated from examples of different classes by a margin. In addition to providing an…
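The margin constraint described above (same-class pairs closer than different-class pairs by a fixed margin) can be illustrated with a toy Mahalanobis metric learner: a hinge loss over triplets, optimized by subgradient steps with a projection back onto the PSD cone. This is a minimal sketch under my own assumptions (the triplet form, step size, and function name are mine), not the paper's QSDP formulation:

```python
import numpy as np

def metric_hinge_step(M, xi, xj, xl, margin=1.0, lr=0.01):
    """One subgradient step on a hinge loss: push the same-class pair
    (xi, xj) to be closer than the different-class pair (xi, xl) by
    `margin` under the metric M. Illustrative only, not the paper's
    QSDP solver."""
    dij = (xi - xj) @ M @ (xi - xj)   # same-class squared distance
    dil = (xi - xl) @ M @ (xi - xl)   # different-class squared distance
    if dij + margin > dil:            # margin violated: hinge is active
        g = np.outer(xi - xj, xi - xj) - np.outer(xi - xl, xi - xl)
        M = M - lr * g
    # project back onto the PSD cone so M remains a valid metric
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.clip(w, 0.0, None)) @ V.T
```

The eigendecomposition keeps M positive semi-definite after every step; the paper instead handles the constraints inside a single QSDP.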
Support Vector Machines, Data Reduction, and Approximate Kernel Matrices
…such as distributed networking systems are often prohibitively high, resulting in practitioners of SVM learning algorithms having to apply the algorithm to approximate versions of the kernel matrix induced by a certain degree of data reduction. In this paper, we study the tradeoffs between data reduction…
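One concrete data-reduction scheme of the kind such a tradeoff analysis covers is the Nyström approximation, which builds a low-rank surrogate of the kernel matrix from a random subsample of landmark rows. The sketch below is illustrative (the landmark sampling, RBF kernel, and names are my choices, not necessarily the paper's setup):

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    """RBF kernel matrix between the rows of X and the rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom(X, m, gamma=0.5, seed=0):
    """Rank-m Nystrom approximation K ~= C W^+ C^T from m random
    landmark rows: a standard way to trade data reduction for
    kernel-matrix accuracy."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    C = rbf(X, X[idx], gamma)          # n x m cross-kernel
    W = C[idx]                         # m x m landmark kernel
    return C @ np.linalg.pinv(W) @ C.T
```

With m equal to the number of points the approximation recovers the exact kernel matrix; smaller m reduces cost at the price of approximation error.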
Hierarchical, Parameter-Free Community Discovery
…to look for community hierarchies, with communities-within-communities. Our proposed method, the ., finds such communities at multiple levels, with no user intervention, based on information-theoretic principles (MDL). More specifically, it partitions the graph into progressively more refined sub…
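The MDL idea can be made concrete with a toy two-part code for a graph partition: model bits to state each block's edge count, plus data bits to encode the block's edges at its empirical density. A partition that matches the community structure compresses the adjacency matrix better. This is a simplified score in the spirit of the abstract, not the paper's actual encoding (whose method name is elided above):

```python
import math

def h2(p):
    """Binary entropy in bits, with the 0*log(0) = 0 convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def partition_cost(adj, labels):
    """Toy MDL cost of an undirected adjacency matrix under a node
    partition: for every block of node-pairs, log2(pairs + 1) model
    bits for its edge count plus pairs * H(density) data bits."""
    n = len(adj)
    groups = sorted(set(labels))
    total = 0.0
    for a, g in enumerate(groups):
        for h in groups[a:]:
            Ig = [i for i in range(n) if labels[i] == g]
            Ih = [i for i in range(n) if labels[i] == h]
            if g == h:
                pairs = [(i, j) for i in Ig for j in Ig if i < j]
            else:
                pairs = [(i, j) for i in Ig for j in Ih]
            if not pairs:
                continue
            e = sum(adj[i][j] for i, j in pairs)
            p = e / len(pairs)
            total += math.log2(len(pairs) + 1)   # model: edge count
            total += len(pairs) * h2(p)          # data: edges at density p
    return total
```

On a graph made of two cliques, labeling the cliques as two communities yields a strictly lower description length than lumping all nodes together, which is the signal a parameter-free MDL search exploits.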
Kernel-Based Inductive Transfer
…learning, the task is to find a suitable bias for a new dataset, given a set of known datasets. In this paper, we take a kernel-based approach to inductive transfer; that is, we aim at finding a suitable kernel for the new data. In our setup, the kernel is taken from the linear span of a set of predefined…
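Taking the kernel from the linear span of predefined base kernels can be sketched directly: with nonnegative weights, any such combination is again a valid (positive semi-definite) kernel. The helper below is illustrative; how the weights are chosen for the new data is exactly what the paper addresses and is not reproduced here:

```python
import numpy as np

def combined_kernel(kernels, weights):
    """Build a kernel from the linear span of base kernels. `kernels`
    is a list of functions k(x, y); nonnegative weights keep the
    combination positive semi-definite. Illustrative helper only."""
    def k(x, y):
        return sum(w * ki(x, y) for w, ki in zip(weights, kernels))
    return k
```

A quick sanity check is that the Gram matrix of the combined kernel stays symmetric and PSD whenever the base kernels are valid and the weights are nonnegative.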
Client-Friendly Classification over Random Hyperplane Hashes
We are addressing the problem of centrally learning (linear) classification models from data that is distributed over a number of clients, and subsequently deploying these models on the same clients. Our main goal is to balance the accuracy of individual classifiers against different kinds of costs related…
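Random hyperplane hashes (Charikar-style sign hashes) give each client a compact bit representation: every bit is the sign of a projection onto a random hyperplane, and the Hamming distance between two hashes estimates the angle between the original vectors. A minimal sketch, assuming Gaussian hyperplanes and a seed shared across clients:

```python
import numpy as np

def rhp_hash(X, n_bits=64, seed=0):
    """Random-hyperplane hashes of the rows of X: each bit is the sign
    of a projection onto a shared random hyperplane, so Hamming
    distance between hashes approximates angular distance."""
    rng = np.random.default_rng(seed)
    H = rng.standard_normal((X.shape[1], n_bits))  # shared hyperplanes
    return (X @ H > 0).astype(np.uint8)            # one sign bit each
```

The hashes are invariant to positive rescaling of the input and flip entirely under negation, which is what makes them a reasonable stand-in for angular similarity on the client side.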
Large-Scale Clustering through Functional Embedding
…optimize over discrete labels using stochastic gradient descent. Compared to methods like spectral clustering, our approach solves a single optimization problem rather than an ad hoc two-stage optimization, does not require a matrix inversion, and can easily encode prior knowledge in the set of imp…
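Treating clustering as a single optimization problem solved with stochastic gradient steps, with no eigendecomposition or matrix inversion, can be illustrated with direct SGD on the k-means objective. This is a toy stand-in under my own assumptions; the paper's functional embedding and label handling are not reproduced:

```python
import numpy as np

def sgd_cluster(X, k=2, lr=0.1, epochs=20, seed=0):
    """Minimize sum_i min_c ||x_i - mu_c||^2 by stochastic gradient
    steps on the centroids: one optimization problem, no two-stage
    pipeline, no matrix inversion. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    mu = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            c = np.argmin(((X[i] - mu) ** 2).sum(1))  # nearest centroid
            mu[c] += lr * (X[i] - mu[c])              # gradient step
    return mu
```

Each point moves only its nearest centroid, so one pass costs O(nk) and the whole procedure streams over the data.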
Clustering Distributed Sensor Data Streams
…maintain a cluster structure over the data points generated by the entire network. Usual techniques operate by forwarding and concentrating the entire data in a central server, processing it as a multivariate stream. In this paper, we propose ., a new distributed algorithm which reduces both the dimensi…
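The reduce-at-the-node idea can be sketched as each sensor summarizing its local stream into a few weighted centroids in one pass, with the server merging only these summaries instead of concentrating the raw multivariate stream. The function names and the one-pass running-mean update are my own illustrative choices, not the paper's algorithm (whose name is elided above):

```python
import numpy as np

def local_sketch(stream, k=2, seed=0):
    """Node-side reduction: summarize a local stream by k weighted
    centroids in one pass with O(k) memory, instead of forwarding
    every reading to the server."""
    rng = np.random.default_rng(seed)
    mu = stream[rng.choice(len(stream), k, replace=False)].astype(float)
    counts = np.zeros(k)
    for x in stream:
        c = np.argmin(((x - mu) ** 2).sum(1))   # nearest local centroid
        counts[c] += 1
        mu[c] += (x - mu[c]) / counts[c]        # running-mean update
    return mu, counts

def merge_sketches(sketches):
    """Server-side: assemble the network-wide cluster summary from the
    nodes' weighted centroids only."""
    mus = np.vstack([m for m, _ in sketches])
    w = np.concatenate([c for _, c in sketches])
    return mus, w
```

Communication per node drops from the full stream to k centroid-weight pairs; the server can then cluster these weighted summaries however it likes.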